Job Location: NYC
Job Description:
The ETL Hadoop data engineer will be responsible for analyzing business requirements and designing, developing, and implementing highly efficient, highly scalable ETL processes.
The candidate is required to perform daily project functions with a focus on meeting business objectives on time in a rapidly changing work environment, and should be able to lead and drive a globally distributed team to achieve business objectives.
Required Skills:
- 5 to 10 years of hands-on experience working with Informatica PowerCenter and Hadoop
- Knowledge of various components of Hadoop ecosystem and experience in applying them to practical problems
- Strong knowledge of working with relational databases such as Teradata, DB2, Oracle, and SQL Server
- Hands-on experience writing shell scripts on Unix platforms
- Experience with data warehousing, ETL tools, and MPP database systems
- Understanding of Data Models
- Experience with conceptual, logical, and physical dimensional and relational data models
- Design and analyze functional specifications and assist in designing potential technical solutions
- Identify data sources and work with source system teams and data analysts to define data extraction methodologies
- Good knowledge of writing complex queries in Teradata, DB2, and Oracle PL/SQL
- Maintain batch processing jobs and respond to critical production issues
- Communicate well with stakeholders on proposals and recommendations
- Keep stakeholders informed of status and risks regarding delivering the solution on time
- Strong in data analysis, data profiling, and root-cause analysis
- Should be able to understand banking system processes and data flow
- Can work independently as well as lead and mentor the team
Thanks & Regards
Abhishek Verma | Technical Recruiter
Emonics LLC
Piscataway NJ 08854
Direct: 201 336 0511
Email: abhishek.v@emonics.com