
April 5, 2021

Sr AWS/Big Data Engineer // 100% remote


Hello,
Please find the requirement details below.

Position: Sr AWS/Big Data Engineer
Location: Can be 100% remote (Eden Prairie, MN)
Duration: 6 Months
Client: UHG
Vendor: Apex
Visas: USC, GC, and H4 EAD only


Required Skills
What are the top must-have technologies/required experience for the position?
1. Hadoop (Apache Spark/Hive)
2. Scala
3. AWS Elastic MapReduce (EMR)
4. AWS general services (Lambda, EC2, automation, etc.)

Preferred qualifications:
1. AWS Athena
2. AWS Glue
3. Additional DevOps tools experience (Terraform, Jenkins)

 
Job Description
We have an urgent opening for a Sr AWS/Big Data Engineer (details below). This is a 6+ month contract-to-hire opportunity.

Project Details – Size, Scale, Scope?
· Data pipeline work: pulling data in from clients and running it through 2-3 processing steps. The team is Optum Analytics Data Warehouse, the big data platform they are building; this is the final database consumed by many Optum applications.
· Reporting and analytics applications pull from this platform/data.
· Looking for someone with heavy cluster-computing experience (Spark and EMR) who has worked extensively with big data in the Apache ecosystem and has more recent experience with elastic computing in the cloud (AWS); see the sketch after this list.
· Currently on-prem Hadoop, moving to EMR in AWS (Hadoop clusters on AWS).
· Want someone with AWS experience in EMR, automation, user permissions, etc.
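For candidates gauging fit, here is a minimal, hypothetical sketch (not from the posting) of the kind of Spark-on-EMR ETL step described above, written in Scala. The bucket, table, and column names are placeholder assumptions, not details from the client.

```scala
// Minimal sketch of a Spark ETL step of the kind described above.
// All paths, table names, and columns are hypothetical placeholders.
import org.apache.spark.sql.{SparkSession, functions => F}

object ClientIngestSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("client-ingest-sketch") // would be submitted to an EMR cluster via spark-submit
      .getOrCreate()

    // Step 1: read raw client data (hypothetical S3 location)
    val raw = spark.read.parquet("s3://example-bucket/raw/client_data/")

    // Step 2: light cleanup/standardization
    val cleaned = raw
      .filter(F.col("record_id").isNotNull)
      .withColumn("load_date", F.current_date())

    // Step 3: write to the warehouse layer (hypothetical Hive table)
    cleaned.write.mode("overwrite").saveAsTable("analytics_dw.client_data")

    spark.stop()
  }
}
```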

Day-to-Day Responsibilities / what a typical day looks like:
· 80% or more coding/data-focused work; 20% design, meetings, and other work decisions.
· 50% in the cloud (AWS) and 50% in big data ETL (Hadoop/Scala).

Education or Certification Requirements:
· Certifications in AWS or other technologies are a plus.
· Email over copies of certifications if the candidate has them.