Hi,
I have the below opening. Please let me know if you are interested.
Title: Senior Data Engineer
Location: Remote
Job Description:
- Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders
- Able to quickly pick up new programming languages, technologies, and frameworks
- Advanced experience building cloud-scalable, real-time, and high-performance data lake solutions
- In-depth understanding of microservice architecture
- Strong understanding of developing complex data solutions
- Experience working on end-to-end solution design
- Able to lead others in solving complex problems by taking a broad perspective to identify innovative solutions
- Willing to learn new skills and technologies
- Has a passion for data solutions
- Strong understanding of data structures and algorithms
Required and Preferred Skill Sets:
- 1-2 years of hands-on experience in AWS - EMR [Hive, PySpark], S3, Athena, or an equivalent cloud; ability to solve complex problems
- 1-2 years of hands-on experience with Spark batch processing and some familiarity with Spark Structured Streaming; ability to solve complex issues
- 1-2 years' working experience with the Hadoop stack, dealing with huge volumes of data in a scalable fashion
- 2-3 years of hands-on experience with SQL, ETL, data transformation, and analytics functions; ability to solve complex problems
- 2-3 years of hands-on Python experience, including batch scripting, data manipulation, and distributable packages; ability to solve complex problems
- 2-3 years' experience working with batch orchestration tools such as Apache Airflow or equivalent, preferably Airflow
- 2-3 years working with code versioning tools such as GitHub or Bitbucket; expert-level understanding of repository design and best practices
- 2-3 years working with deployment automation tools such as Jenkins and familiarity with containerization concepts such as Docker and Kubernetes
- 2-3 years of hands-on experience designing and building ETL pipelines; expert in data ingestion, change data capture, and data quality; hands-on experience with API development; some exposure to NiFi or Kafka
- 2-3 years designing and developing relational database objects; knowledgeable in logical and physical data modeling concepts; some experience with Snowflake
- Preferred: 1+ years of experience supporting Tableau or Cognos use cases
- Familiarity with Agile; working experience preferred
Thanks & Regards,
Priyanka Sharma | IT Recruitment
VBeyond Corporation | PARTNERING FOR GROWTH
Lucknow | Mumbai | Gurgaon | Pune | New Jersey, USA
Email: Priyankas@vbeyond.com
Website: www.vbeyond.com