Duration: 9+ months
Location: 100% REMOTE
- Possess extensive analysis, design, and development experience on Hadoop and AWS Big Data platforms.
- Able to critically inspect and analyze large, complex, multi-dimensional data sets in Big Data platforms.
- Experience with Big Data technologies and distributed file systems, including Hadoop, HDFS, Hive, and HBase.
- Define and execute appropriate steps to validate various data feeds to and from the organization using Sqoop.
- Create scripts to extract, transfer, transform, load, and analyze data residing in Hadoop and in RDBMSs, including Oracle and Teradata.
- Design, implement, and load table structures in Hadoop and in RDBMSs, including Oracle and Teradata, to facilitate detailed data analysis.
- Participate in user acceptance testing in a fast-paced Agile development environment.
- Must have very strong experience with Hadoop, Sqoop, Hive, Oozie, and shell scripting.
- Highly proficient, with extensive experience working with relational databases, particularly Oracle and Teradata.
- Excellent working knowledge of UNIX-based systems.
- Experience with Spark, Scala, R, and Python is a strong plus.
- Bachelor's or Master's degree in Computer Science, or equivalent work experience.
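To give a sense of the day-to-day work described above, here is a minimal sketch of a Sqoop ingest from an RDBMS followed by a Hive load step. The JDBC connection string, table names, and HDFS paths are illustrative placeholders, not values from any actual environment; the script echoes the commands rather than executing them, so it can be reviewed without a live cluster.

```shell
#!/bin/sh
# Sketch of the kind of data-feed validation and load scripting the role involves.
# All connection details and paths below are hypothetical examples.
JDBC_URL="jdbc:oracle:thin:@//db.example.com:1521/ORCL"
SRC_TABLE="SALES.ORDERS"
TARGET_DIR="/data/raw/orders"

# Build the Sqoop import command to pull the source table into HDFS.
SQOOP_CMD="sqoop import --connect $JDBC_URL --table $SRC_TABLE --target-dir $TARGET_DIR --num-mappers 4 --as-textfile"
echo "$SQOOP_CMD"

# A follow-on Hive step would load the staged files into an analysis table.
HIVE_CMD="hive -e \"LOAD DATA INPATH '$TARGET_DIR' INTO TABLE staging.orders\""
echo "$HIVE_CMD"
```

In practice a script like this would also validate row counts between source and target after the load, and would typically be scheduled and sequenced as actions in an Oozie workflow.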