Hello All,
Please find the requirement details below.
Position: Azure Data Engineer
Location: Minneapolis, MN - Remote to Start
Duration: 6 Months
Client: Optum UHG
Rate: $60/Hr. on C2C
Visa: USC, GC, H4/L2/GC EADs only
Position Description:
• Provide technical leadership in the architecture, design, and engineering of the modernization of legacy Data Ingestion, ETL, and Database systems to new technologies in the Public Cloud (AWS/Azure), Big Data, and API space
• Lead the technology transformation from our legacy Data and Analytics platforms to a Big Data, Cloud-based Modern Software paradigm. Be innovative in solution design and development to meet the needs of the business.
• Create a common framework and repository for the development of Modern APIs available to the Optum Product Engineering Team
• Demonstrate leadership in the context of software engineering and be an evangelist for engineering best practices
• Create next-generation streaming applications that use our Data as a strategic platform to grow top-line revenue
• Stay abreast of leading-edge technologies in the industry by evaluating emerging software technologies
• Work collaboratively with all business areas to assess unmet/new business needs and solutions
• Encourage the growth of junior engineers on the team through skills development, mentoring and technical guidance
• Create a "startup mentality" to accelerate the introduction of new capabilities and transform teams
• Be an avid supporter of the Open Source software community
• Demonstrate excellent time management, communication, decision-making, and presentation skills
• Display a strong desire to achieve and attain high levels of both internal and external customer satisfaction
Required:
• BS / BA in Engineering, Computer Science or equivalent experience
• 8+ years of software product development experience working on commercially available software and / or healthcare platforms
• 6+ months of experience developing solutions hosted with public cloud providers such as Azure and AWS, or private cloud/container-based systems using Mesos, Kubernetes, or OpenShift
• Java/Scala development experience
• Experience with Big Data technologies like HDFS, Hive, Spark, Kafka
• Experience with Kubernetes and Big Data technologies like Spark, HBase, MapReduce, Storm, Flume, Sqoop, Pig, Apache Drill, Oozie, Zeppelin
• Experience managing, leading, and / or mentoring teams in using Big Data technologies
• Experience using modern software engineering and product development tools and practices, including Agile / SAFe, Continuous Integration, Continuous Delivery, DevOps, etc.
• Proven track record of acting as an advocate for driving new technology across the organization via the creation of communities and publishing SDK / libraries for reuse across the organization
• Strong experience operating in a quickly changing environment and driving technological innovation to meet business requirements
• Proven track record of building relationships across cross-functional teams
Preferred:
• Experience building Big Data solutions on public cloud (Azure)
• Experience building data pipelines with Kafka and Kafka Streams
• Experience developing Java RESTful services using Spring Boot
• Good understanding of Microservices architecture
• Experience with automation frameworks – Selenium WebDriver, REST Assured, SoapUI
• Ability to establish repeatable processes and best practices and to implement version control software in a Cloud team environment
• Experience developing cloud-based API gateways is highly desirable
• Experience / exposure to API integration frameworks
• Experience in the healthcare industry
Thanks & Regards, Ranjith Dandabathini Account Manager - Apex Account |
Phone: (209) 392-5335 Email: Ranjith@imcsgroup.net 9901 East Valley Ranch Parkway Suite 3020 Irving, Texas – 75063