
Urgent to Fill - Systems Engineer or Hadoop Admin @ Temple Terrace-FL

Greetings!

Hi,
This is Narasimha B from RNR IT Solutions.
Hope you are doing well. We have an urgent requirement; please find the job description below and let me know your interest, along with an updated resume.

Job Posting Title: CIO - Systems Engineer or Hadoop Admin
Temple Terrace-FL-USA.
Only independent consultants.

Description: As part of the VZ Platform Engineering Team, the candidate (Hadoop Admin) will be responsible for the implementation and ongoing administration of Hadoop big data infrastructure. The Hadoop Admin will support, implement, and maintain big data infrastructure at Verizon and will be responsible for end-to-end Hadoop cluster administration.
JOB DUTIES:
• Responsible for the implementation and ongoing administration of Hadoop infrastructure initiatives.
• Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.
• Working with data delivery teams to set up new Hadoop users. This includes setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, Pig, Spark, and MapReduce access for the new users.
• Cluster maintenance, as well as creation and removal of nodes, using Hadoop administration tools such as Ambari, Cloudera Manager, etc.
• Sound knowledge of Ranger, NiFi, Kafka, Atlas, Hive, Storm, Pig, Spark, Elasticsearch, Splunk, Solr, Kyvos, HBase, and other big data tools.
• Performance tuning of Hadoop clusters and Hadoop MapReduce routines.
• Screening Hadoop cluster job performance and capacity planning.
• Monitoring Hadoop cluster connectivity and security.
• Managing and reviewing Hadoop log files; file system management and monitoring.
• HDFS support and maintenance.
• Diligently teaming with the infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability.
• Collaborating with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.
• Implementing automation tools and frameworks (CI/CD pipelines); knowledge of Ansible, Jenkins, Jira, Artifactory, Git, etc.
•            Design, develop, and implement software integrations based on user feedback.
•            Troubleshoot production issues and coordinate with the development team to streamline code deployment.
•            Analyze code and communicate detailed reviews to development teams to ensure a marked improvement in applications and the timely completion of projects.
•            Collaborate with team members to improve the company’s engineering tools, systems and procedures, and data security.
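Several of the duties above (screening job performance, managing and reviewing log files) reduce to parsing runtime metrics and flagging outliers. A minimal illustrative sketch follows; the log format and job IDs are made up for this example, and a real cluster would pull these figures from the YARN ResourceManager or Ambari/Cloudera Manager metrics instead:

```python
def flag_slow_jobs(log_lines, threshold_secs=3600):
    """Return job IDs whose runtime exceeds threshold_secs.

    Each line is assumed (for this sketch) to look like:
    "<job_id> <runtime_seconds>".
    """
    slow = []
    for line in log_lines:
        parts = line.split()
        if len(parts) != 2:
            continue  # skip malformed lines
        job_id, runtime = parts
        try:
            if float(runtime) > threshold_secs:
                slow.append(job_id)
        except ValueError:
            continue  # runtime field was not numeric
    return slow

# Hypothetical sample data
sample = [
    "job_202401_0001 5400",   # 1.5 h -> slow
    "job_202401_0002 900",    # 15 min -> fine
    "job_202401_0003 7200",   # 2 h -> slow
]
print(flag_slow_jobs(sample))  # ['job_202401_0001', 'job_202401_0003']
```

The same filter-and-threshold shape applies whether the input is a log file, a metrics API response, or a scheduler report.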
MUST HAVE SKILLS:
• Must know Hadoop and big data infrastructure.
• Expert in Hadoop administration, with knowledge of Hortonworks, Cloudera, or MapR big data management tools.
• Expert in developing/managing Java and web applications.
• Expert in implementing and troubleshooting Hive, Spark, Pig, Storm, Kafka, NiFi, Atlas, Kyvos, Elasticsearch, Solr, Splunk, and HBase applications.
• Strong command of software-automation production systems (Jenkins and Selenium) and code deployment tools (Puppet, Ansible, and Chef).
• Working knowledge of Ruby or Python and well-known DevOps tools such as Git and GitHub.
• Working knowledge of databases (Oracle/Teradata) and SQL (Structured Query Language).
• General operational expertise: good troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networking.
DESIRED SKILLS:
• Most essential: the ability to deploy a Hadoop cluster, add and remove nodes, keep track of jobs, monitor critical parts of the cluster, configure NameNode high availability, schedule and configure jobs, and take backups.
• Good knowledge of Linux, as Hadoop runs on Linux.
• Familiarity with open-source configuration management and deployment tools such as Puppet or Chef, and with Linux scripting.
• Knowledge of troubleshooting core Java applications is a plus.
• Problem-solving skills.
• A methodical and logical approach.
• The ability to plan work and meet deadlines.
• Accuracy and attention to detail.
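The capacity-planning side of cluster deployment mentioned above is largely back-of-envelope arithmetic: raw data times the HDFS replication factor, plus headroom for temporary/intermediate data, divided by usable storage per DataNode. A rough sketch, where the 25% headroom and 40 TB per node are illustrative assumptions rather than figures from this posting:

```python
import math

def nodes_needed(raw_tb, replication=3, headroom=0.25, usable_tb_per_node=40):
    """Estimate DataNode count for a given raw dataset size.

    total storage = raw data x replication x (1 + headroom),
    then divide by usable storage per node and round up.
    All default figures are illustrative assumptions.
    """
    total_tb = raw_tb * replication * (1 + headroom)
    return math.ceil(total_tb / usable_tb_per_node)

print(nodes_needed(500))  # 500 TB raw with the defaults -> 47 nodes
```

In practice the estimate would also account for compression ratios and projected data growth, but the shape of the calculation stays the same.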
EDUCATION/CERTIFICATIONS:
B.S. or equivalent engineering degree, with at least 5 years of big data work experience.
--  

Regards,
Narasimha B
RNR IT Solutions Inc.
7800 Preston Rd, #126, Plano, TX 75024
Email: narasimha.b@rnrits.com
Phone: 469-327-6044.

Confidentiality Note: This email and any attachments are confidential and may be protected by legal privilege. If you are not the intended recipient, be aware that any disclosure, copying, distribution, or use of this e-mail or any attachment is prohibited. If you have received this e-mail in error, please notify us immediately by returning it to the sender and delete this copy from your system. Thank you.
