October 5, 2022

Urgent Need: Azure Data Engineer @ Bentonville, AR (Initially Remote)

Job Title: Azure Data Engineer
Job Location: Bentonville, AR (Initially Remote)

Rate: $62/hr on C2C

Primary Skills:

ETL tools, RESTful services, ZooKeeper, HBase, data warehousing, Apache Kafka, Scala, JIRA, data modelling, Spark, Java, NoSQL, ELT, Jenkins, RDBMS, MapReduce, Redis, Python, CI/CD, Azure, Linux, Informatica, JavaScript frameworks, Hadoop, Microsoft Azure

Client Job Description: UST Global® is looking for a highly energetic and collaborative Data Engineer with experience in enterprise data solutions around business and IT operations. The ideal candidate should have relevant experience in the Big Data space, using traditional, new, and emerging technologies, and good experience in defining the overall technical architecture for enterprise and customer data platforms. A good understanding of enterprise data architecture and systems is expected, along with excellent written and verbal communication skills and the ability to work as a team member.

As a Data Engineer at UST Global, this is your opportunity to:
· Drive customer conversations to define the requirements and overall technical architecture for data and related solutions
· Exercise strong skills in data analysis across large data infrastructure
· Work on technical solutions and architecture, artifacts, work products, and presentation of the solution to the customer; deliver based on requirements and industry best practices
· Review functional and technical requirements, raising potential issues and participating actively in design discussions
· Develop reusable artifacts/frameworks, reusable assets, industry solutions, reference architectures, and design, development, and QA best practices
· Collaborate with customers and Product Marketing/Management teams to identify opportunities that can be exploited through great software features; work closely with Product Owners, Scrum Masters, Senior Business Analysts, and other client stakeholders
· Work as a member of a design, build, and test agile team continuously releasing new features
· Mentor people on the team

Key Responsibilities: Requirements and design, software architecture and coding, integration, testing, feature definition

You bring:
· Bachelor's degree in Computer Science, Computer Engineering, or a software-related discipline; a Master's degree in a related field is a plus
· 6+ years of hands-on experience developing a distributed data processing platform with GCP/Azure, Hive or Spark, and Airflow or another workflow orchestration solution
· 2+ years of hands-on experience in modelling and designing schemas for data lakes or RDBMS platforms
· Awareness of schema data modelling
· Experience in data quality engineering, metadata consolidation and integration, metadata model development and maintenance, repository management, data warehouse design and data mining, and data security
· Experience in leveraging enterprise data warehouse modeling constructs, methodologies, and practices to ensure flexible, scalable, maintainable, and high-performing physical databases
· 2+ years of hands-on experience integrating OLTP and ERP systems with big data/data lake repositories using ELT/ETL tools, preferably Informatica and Oracle
· 1+ year of hands-on experience implementing and using CI/CD framework technologies such as Jenkins or Bamboo
· Experience with technologies and frameworks such as Hadoop, MapReduce, Pig, Hive, HBase, Flume, ZooKeeper, NoSQL, and Cassandra
· Experience with data warehousing and data mining is a must
· Strong experience in programming languages and current technologies such as Java, JavaScript frameworks, RESTful services, Spark, Scala, Python, Linux, Hive, Kafka, Redis, and Hortonworks; SQL is a must-have
· 3+ years of experience working as a member of an Agile team

For this role, we value:
· The ability to adapt quickly to a fast-paced environment
· Excellent written and oral communication skills
· A critical thinker who challenges assumptions and seeks new ideas
· Proactive sharing of accomplishments, knowledge, lessons, and updates across the organization
· Experience building, testing, and releasing software solutions in a complex, large organization
· Knowledge of JIRA and GitHub
· Practice working with, processing, and managing large data sets (multi-TB/PB scale)
· Ba
Comments for Suppliers: Azure Data Lake, GCP, Data Migration
Additional Skills: Microsoft Azure, Apache Kafka, PySpark, Data Modelling


Warm Regards,

Bhaskar Kumar | Senior Recruiter

3S Business Corporation

kumar.koppisetti@3sbc.com

Richmond Avenue | Houston, TX – 77082

An E-Verified Company 
