
July 25, 2022

Contract job opportunity for the role of Data Engineer with Genpact, Remote.

Hi,


Hope you are doing well. Our client is looking for a Data Engineer on a contract basis; the role is 100% remote. Please go through the job description below and share the profiles of relevant candidates.


Looking forward to your response.



Title: Data Engineer (Senior/Lead)

Location: Remote

 

Job description:

 

• Strong understanding of data structures and algorithms

• Strong understanding of solution and technical design

• Has a strong problem-solving and analytical mindset

• Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders

• Able to quickly pick up new programming languages, technologies, and frameworks

• Advanced experience building scalable, real-time, high-performance cloud data lake solutions

• In-depth understanding of microservice architecture

• Strong understanding of developing complex data solutions

• Experience working on end-to-end solution design

• Able to lead others in solving complex problems by taking a broad perspective to identify innovative solutions

• Willing to learn new skills and technologies

• Has a passion for data solutions

 

Required and Preferred Skill Sets:

 

• 1-2 years of hands-on experience with AWS EMR (Hive, PySpark), S3, Athena, or an equivalent cloud platform; ability to solve complex problems

• 1-2 years of hands-on experience with Spark batch processing and some familiarity with Spark Structured Streaming; ability to solve complex issues

• 1-2 years of working experience with the Hadoop stack, dealing with huge volumes of data in a scalable fashion

• 2-3 years of hands-on experience with SQL, ETL, data transformation, and analytics functions; ability to solve complex problems

• 2-3 years of hands-on Python experience, including batch scripting, data manipulation, and distributable packages; ability to solve complex problems

• 2-3 years of experience working with batch orchestration tools such as Apache Airflow or an equivalent, preferably Airflow

• 2-3 years working with code versioning tools such as GitHub or Bitbucket; expert-level understanding of repo design and best practices

• 2-3 years working with deployment automation tools such as Jenkins, and familiarity with containerization concepts such as Docker and Kubernetes

• 2-3 years of hands-on experience designing and building ETL pipelines; expertise in data ingestion, change data capture, and data quality; hands-on experience with API development; some exposure to NiFi or Kafka

• 2-3 years designing and developing relational database objects; knowledgeable about logical and physical data modeling concepts; some experience with Snowflake

• Preferred 1+ years of experience supporting Tableau or Cognos use cases

• Familiarity with Agile; working experience preferred



Regards,

Rajat Diwan Vishwakarma

Talent Acquisition

VBeyond Corporation

390 Amwell Road, Suite #107, Hillsborough, NJ 08844

(908) 356-0193

RajatV@vbeyond.com

http://www.vbeyond.com