
Big Data Hadoop Admin

Job Title: Big Data Hadoop Admin
Location: Tarrytown / Sleepy Hollow, NY
Type: CTH (Contract to Hire)

Job requirements/Skill Set:
·             Bachelor's degree in Computer Science or related disciplines
·             Required: 3 to 5+ years of hands-on experience with on-prem or cloud Hadoop/Spark environments.
·             Required: 5+ years of big data or high-volume data management experience
·             Responsible for implementation and ongoing administration of Hadoop infrastructure.
·             Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.
·             Cluster maintenance, including creation and removal of nodes, using tools such as Ambari, Ganglia, Nagios, Cloudera Manager Enterprise, and Dell OpenManage.
·             Capacity planning, performance tuning, and troubleshooting of Hadoop clusters (see the capacity-planning sketch after this list).
·             Monitor Hadoop cluster connectivity and security
·             Manage and monitor Hadoop clusters.
·             Responsible for scripting/automation of environments using tools such as Chef, Puppet, or Ansible (see the automation sketch after this list)
·             Work with scripting languages such as Python, Unix shell, or Ruby
·             Knowledge of AWS CloudFormation or Azure ARM templates
·             HDFS support and maintenance.
·             Diligently teaming with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability.
·             Ability to work in a fast-paced environment is a must.
·             Pharmaceutical experience preferred but not required; consumer, retail, technology, or insurance experience also preferred.
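
To make the capacity-planning bullet concrete, here is a minimal back-of-the-envelope sketch in Python. All numbers are hypothetical placeholders, not figures from this role; the model is simply logical data volume times HDFS replication, plus operating headroom.

    # Back-of-the-envelope HDFS capacity planning (all inputs are hypothetical).
    DAILY_INGEST_TB = 0.5      # raw data landed per day, in TB (assumed)
    RETENTION_DAYS = 365       # how long data must stay queryable (assumed)
    REPLICATION_FACTOR = 3     # HDFS default block replication
    HEADROOM = 0.25            # ~25% spare for temp/shuffle data and growth (assumed)

    def required_raw_capacity_tb() -> float:
        logical = DAILY_INGEST_TB * RETENTION_DAYS    # data as users see it
        physical = logical * REPLICATION_FACTOR       # bytes actually on datanode disks
        return physical * (1 + HEADROOM)

    if __name__ == "__main__":
        # 0.5 TB/day x 365 days x 3 replicas x 1.25 headroom is roughly 684 TB
        print(f"Plan for roughly {required_raw_capacity_tb():,.0f} TB of raw datanode storage")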
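
And for the scripting/automation and HDFS maintenance bullets, here is a minimal sketch of the kind of health check an admin might automate. It assumes the script runs on a cluster node with the hdfs CLI on the PATH and sufficient HDFS privileges; the output parsing is deliberately simplified.

    import subprocess

    def run(cmd):
        """Run a command and return its stdout, raising if it exits non-zero."""
        return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout

    def hdfs_health_summary():
        # 'hdfs dfsadmin -report' prints capacity, usage, and live/dead datanode counts.
        report = run(["hdfs", "dfsadmin", "-report"])
        for line in report.splitlines():
            if line.startswith(("Configured Capacity", "DFS Used",
                                "Live datanodes", "Dead datanodes")):
                print(line.strip())
        # 'hdfs fsck /' walks the namespace; note this is expensive on large clusters.
        fsck = run(["hdfs", "fsck", "/"])
        healthy = "is HEALTHY" in fsck
        print("Namespace status:", "HEALTHY" if healthy else "CHECK FSCK OUTPUT")

    if __name__ == "__main__":
        hdfs_health_summary()

In practice a check like this would feed an Ambari or Nagios alert rather than be run by hand.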
 
Must have for this role:
·             Required: Hands-on experience in Hadoop Installation, Configuration, Administration, Security, Development, and Designing.
·             Experience with Hortonworks Data Platform and Hortonworks DataFlow (Apache NiFi)
·             Experience with Hortonworks Cloudbreak
·             Apache Ranger, Atlas, and Kerberos (see the Kerberos sketch after this list)
·             AWS Cloud Services
·             Responsible for troubleshooting and development on Hadoop ecosystem technologies such as HDFS, Hive, Pig, Flume, MongoDB, Accumulo, Sqoop, ZooKeeper, Spark, MapReduce2, YARN, HBase, Tez, Kafka, and Storm.
·             Fine-tune applications and systems for high performance and higher-volume throughput.
·             Collaborate with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.
·             Excellent planning, communication, documentation, organizational, analytical, and problem-solving abilities. Experience working with minimal direction and a keen ability to apply business sense to guide decision-making.
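
As a companion to the Kerberos bullet above, here is a minimal Python sketch of the standard pattern for automation against a Kerberized cluster: obtain a ticket from a keytab before touching HDFS. The keytab path and principal below are hypothetical placeholders; use whatever the cluster's KDC admin provisions.

    import subprocess

    # Hypothetical keytab and principal; substitute your cluster's actual values.
    KEYTAB = "/etc/security/keytabs/hdfs-ops.keytab"
    PRINCIPAL = "hdfs-ops@EXAMPLE.COM"

    def kinit():
        """Obtain a Kerberos TGT non-interactively from a keytab."""
        subprocess.run(["kinit", "-kt", KEYTAB, PRINCIPAL], check=True)

    def list_hdfs_path(path="/user"):
        """List an HDFS path; on a Kerberized cluster this fails without a valid ticket."""
        out = subprocess.run(["hdfs", "dfs", "-ls", path],
                             capture_output=True, text=True, check=True)
        return out.stdout

    if __name__ == "__main__":
        kinit()   # must precede any HDFS access on a secured cluster
        print(list_hdfs_path())

Note that authentication via kinit is only half the story on HDP: Apache Ranger still has to authorize the principal against its HDFS policies.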
 
 
Thanks
Divya
divya@levanture.com

