Duration: Permanent/Full Time
Location: 100% Remote 

*** U.S. Citizens and those authorized to work in the U.S. are encouraged to apply. We are unable to sponsor at this time. ***

Overview: You will design, build, and manage the architecture used to analyze and process data as required by Data Factory initiatives, and ensure those systems perform smoothly. You will also prepare the "big data" infrastructure for analysis by Data Scientists.

Qualifications

  • Bachelor's Degree
  • 5+ years of experience designing and developing data pipelines for data ingestion or transformation using Java, Scala, or Python
  • 5+ years of experience with SQL and shell scripting
  • 3+ years of experience with Microsoft Azure or another public cloud service
  • Intermediate-level experience/knowledge in at least one scripting language (Python, Perl, JavaScript)
  • Basic understanding of ETL standards and best practices