Hi Folks,
Hope you are doing great today!
Below is the requirement; please let me know if you are interested.
Role: Big Data Consultant
Location: Remote (must work the first week onsite in Tampa, FL; expenses are paid for this)
Duration: 12 Months
Interview: Phone and Skype
Responsibilities:
• Design, develop, and implement microservices-based data lake solutions on the Hadoop platform
• Acquire data from primary or secondary data sources
• Identify, analyze, and interpret trends or patterns in complex data sets
• Transform existing ETL logic from the DataStage application to the Hadoop platform
• Develop new ways of managing, transforming, and validating data
• Establish and enforce guidelines to ensure consistency, quality, and completeness of data assets
• Apply quality assurance best practices to all work products
• Analyze, design, and code business-related solutions, as well as core architectural changes, using an Agile programming approach, delivering software on time and within budget
• Experience working in development teams using Agile techniques, object-oriented development, and scripting languages is preferred
• Comfortable learning cutting-edge technologies and applying them to greenfield projects
• Challenge the status quo and mentor development staff on efficient design and reusable development best practices to minimize unfavorable work variances
• Proactively communicate risks or issues stemming from project or ticket work to core teams and to the assigned technical delivery managers
Qualifications:
• Bachelor’s degree in Computer Science, Computer Engineering, Information Technology, or another related field, in addition to the relevant work experience required for the job
• 3+ years of development experience writing SQL and working with Hadoop, YARN, Sqoop, Spark SQL, Hive, Impala, and other ETL tools (Informatica, Talend, etc.)
• 2+ years of development experience delivering DataStage-based ETL solutions
• 2–5 years of experience in Python, Java, or Scala development
• Experience developing Spark processes and tuning their performance
• Experience performing data analytics on Hadoop-based platforms and implementing complex ETL transformations.
• Strong experience with UNIX shell scripting to automate file preparation and database loads
• Experience in data quality testing; adept at writing test cases and scripts, presenting and resolving data issues
• Familiarity with relational database environments (Oracle, DB2, etc.), including tables/views, stored procedures, and agent jobs, and with integrating against a Salesforce backend
• Experience analyzing and designing deliverables in an Agile environment is required.
• Demonstrated independent problem-solving skills and the ability to develop solutions to complex analytical/data-driven problems
Jeetendra Pandey | Sr. Recruiter | Apetan Consulting LLC
Phone: 201-620-9700 x160
Mailing Address: 72 Van Reipen Avenue, PMB #255, Jersey City, NJ 07306
Corp. Office: 15 Union Avenue, Office #6, Rutherford, New Jersey 07070