
Genpact || AWS Data Engineer || Remote

Hi Member,

One of our clients is looking for a Data Engineer.

 

Title: AWS Data Engineer

Location: Remote

Experience: 10+ years

 

Job description:

• Strong understanding of data structures and algorithms

• Strong understanding of solution and technical design

• Has a strong problem-solving and analytical mindset

• Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders

• Able to quickly pick up new programming languages, technologies, and frameworks

• Advanced experience building scalable, real-time, high-performance cloud data lake solutions

• In-depth understanding of microservice architecture

• Strong understanding of developing complex data solutions

• Experience working on end-to-end solution design

• Able to lead others in solving complex problems by taking a broad perspective to identify innovative solutions

• Willing to learn new skills and technologies

• Has a passion for data solutions

 

Required and Preferred Skill Sets:

 

• 1-2 years of hands-on experience in AWS - EMR [Hive, PySpark], S3, Athena, or any other equivalent cloud; ability to solve complex problems

• 1-2 years of hands-on experience with Spark batch processing and some familiarity with Spark Structured Streaming; ability to solve complex issues

• 1-2 years' working experience with the Hadoop stack, dealing with huge volumes of data in a scalable fashion

• 2-3 years of hands-on experience with SQL, ETL, data transformation, and analytics functions; ability to solve complex problems

• 2-3 years of hands-on Python experience, including batch scripting, data manipulation, and distributable packages; ability to solve complex problems

• 2-3 years' experience working with batch orchestration tools such as Apache Airflow or equivalent (Airflow preferred)

• 2-3 years working with code versioning tools such as GitHub or Bitbucket; expert-level understanding of repo design and best practices

• 2-3 years working with deployment automation tools such as Jenkins and familiarity with containerization concepts such as Docker and Kubernetes

• 2-3 years of hands-on experience designing and building ETL pipelines; expert in data ingestion, change data capture, and data quality; hands-on experience with API development; some exposure to NiFi or Kafka

• 2-3 years designing and developing relational database objects; knowledgeable in logical and physical data modeling concepts; some experience with Snowflake

• Preferred: 1+ years of experience supporting Tableau or Cognos use cases

• Familiarity with Agile; working experience preferred

Regards,


Yash Prakash Srivastava

VBeyond Corporation

Associate Recruiter – TAG USA

Call/Text: (908) 345-2054

E: Yashs@Vbeyond.com | www.vbeyond.com

390 Amwell Road, Suite # 107, Hillsborough, NJ 08844

 



Note – VBeyond is fully committed to Diversity and Equal Employment Opportunity.

