ETL Architect || Dallas/Denver || (Day 1 Onsite)
Greetings from Nityo Infotech Corp.
We have an excellent opportunity with Tech Mahindra. This is a super-urgent requirement with immediate interviews.
We have the below position open with us; kindly share your resume at Anuj.Rana@nityo.com
Role: ETL Architect
Location: Dallas/Denver (Day 1 Onsite)
Interview mode: WebEx video
Start Date: Immediate
JOB DESCRIPTION:
Required skills
1. Define core ETL architectural standards and guidelines.
2. Supervise and guide ETL architecture implementations.
3. Coordinate with other IT teams to gather design requirements and produce technical design documents based on operational needs.
4. Support the structural and technical design of solutions to ensure decisions align with current and future business strategies and opportunities.
5. Own the architecture deliverables for ETL projects.
6. Analyze the requirements of large systems and break them down into smaller, manageable parts.
7. Plan and design the structure of technology systems, and review these designs with the client.
8. Communicate system requirements to software designers and developers; explain system structure to them and provide assistance throughout the assembly process.
9. Choose suitable software and hardware, and suggest integration methods.
10. Help resolve technical problems as and when they arise.
11. Ensure that systems satisfy quality standards and procedures.
What You Must Have
· 12+ years of IT experience in data engineering, data quality, data migrations, data architecture, data lake formation, and data analytics.
· 5+ years of solid hands-on experience with AWS services such as S3, EMR, VPC, EC2, IAM, EBS, RDS, Glue, Lambda, Lake Formation, etc.
· Must have experience producing architecture documents for small to large solution implementations.
· In-depth understanding of Spark architecture, including Spark Core, Spark SQL, DataFrames, Spark Streaming, Spark MLlib, etc. Experience handling very high-volume streaming data in various formats such as JSON, XML, Avro, and Snappy.
· Good exposure to Kafka, including future capacity planning, partition planning, and read/write throughput design.
· Must have worked with Big Data and should have good knowledge of MapReduce and Spark.
· Must have very good working exposure to different kinds of databases: RDBMS, NoSQL (columnar, document, and distributed), cloud databases, in-memory databases, etc.
· Python exposure is an added advantage.
Thanks & Regards,
Anuj Rana
Technical Recruiter
Desk No : 609-853-0818 *2529
Email id : Anuj.rana@nityo.com
Nityo Infotech Corp.
666 Plainsboro Road, Suite 1285, Plainsboro, NJ 08536
URL : www.nityo.com