Hadoop Administrator Consultant (F2F interview) – King of Prussia, PA – local candidates only
Hadoop Administrator Consultant
Must sit in King of Prussia, PA.
Face-to-face (F2F) interview required.
H-1B candidates are acceptable if their communication skills are strong.
"MUST HAVE EXPERIENCE WORKING WITH HADOOP IN THE CONTEXT OF WEB APPLICATIONS" – please ask this.
Description
The Hadoop Administrator will install, configure, monitor, troubleshoot, and maintain Hadoop environments running both in-house and on cloud infrastructure such as AWS. The administrator participates as a member of a team charged with overall systems administration of customer-facing production systems and development/test systems with high uptime expectations and rapid turnaround requirements, and will gather requirements, recommend solutions, and deliver and maintain solutions in line with formal and emerging requirements.
ESSENTIAL JOB FUNCTIONS AND RESPONSIBILITIES:
- Implement and support Hadoop environments, both on in-house bare metal and on AWS cloud infrastructure.
- Design clusters, plan capacity, set up clusters, fine-tune performance, monitor, plan infrastructure, scale, configure failover for disaster recovery, and handle day-to-day administration.
- Work closely with the infrastructure, network, database, business intelligence, and application development teams to ensure business applications are highly available and performing within agreed-upon service levels.
- Implement and support Hadoop ecosystem components such as YARN, MapReduce, HDFS, HBase, ZooKeeper, Pig, Hive, Ranger, Knox, and Presto.
- Install, administer, and support Windows and Linux operating systems in an enterprise environment.
- Handle storage, performance tuning, and volume management of Hadoop clusters and MapReduce routines.
- Set up, configure, and secure Hadoop clusters using Kerberos.
- Monitor Hadoop cluster connectivity and performance; monitor applications that use Hadoop services.
- Manage and analyze Hadoop log files.
- Manage and deploy HBase.
- Manage and monitor file systems.
- Develop and document best practices.
- Support and maintain HDFS.
- Set up new Hadoop users and Tomcat applications on a multi-tenant cluster (a sketch of user onboarding on a Kerberos-secured cluster follows this list).
- Administer new and existing Hadoop infrastructure, including system administration responsibilities such as design and implementation, software installation and configuration, database backup and recovery, database connectivity, and security.
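For context on the Kerberos and multi-tenant onboarding duties above, provisioning a new user typically looks something like the following minimal sketch. The user name, realm, keytab path, and quota values are illustrative assumptions, not details from this posting:

    # Create a Kerberos principal and keytab for the new user (run on the KDC;
    # the realm EXAMPLE.COM and user alice are hypothetical)
    kadmin.local -q "addprinc -randkey alice@EXAMPLE.COM"
    kadmin.local -q "ktadd -k /etc/security/keytabs/alice.keytab alice@EXAMPLE.COM"

    # Provision an HDFS home directory owned by the new user
    sudo -u hdfs hdfs dfs -mkdir /user/alice
    sudo -u hdfs hdfs dfs -chown alice:alice /user/alice

    # Cap the user's footprint on the shared, multi-tenant cluster
    sudo -u hdfs hdfs dfsadmin -setQuota 100000 /user/alice
    sudo -u hdfs hdfs dfsadmin -setSpaceQuota 1t /user/alice

On a Ranger-enabled cluster, access policies for the new user would typically be managed through the Ranger admin interface rather than with raw HDFS commands.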
KNOWLEDGE, SKILLS AND ABILITIES:
- Ability to apply professional concepts, experience, and company objectives to perform in-depth analysis of situations or data and resolve complex issues in creative ways.
- Ability to work without supervision, with latitude for independent decision-making.
- Ability to network with key contacts outside one's own area of expertise.
- Ability to listen to and understand information and communicate it clearly.
- Must possess strong interpersonal, organizational, presentation and facilitation skills.
- Must be results oriented and customer focused.
- Proficiency in Microsoft Office packages.
- Sufficient knowledge of business communication tools, including telephone, voicemail, and e-mail, and of office machines such as photocopiers, scanners, and fax machines.
EDUCATION, TRAINING AND EXPERIENCE:
- 5+ years of direct operating system administration, support and design responsibilities across technologies;
- 2+ years of Hadoop administration, support and design;
- Strong working experience with the Hortonworks Hadoop distribution (HDP) in a high-availability configuration is required. Specific areas include, but are not limited to: configuring the HDP repository; Ambari server and agent setup, including the install wizard; cluster management (node addition and decommissioning); rack topology and configuration; the Capacity Scheduler; DataNode files, home directories, and permissions; NameNode HA, ResourceManager HA, and HiveServer2 HA; DistCp and snapshotting (a sketch follows this list); Knox, Ranger, and HDFS ACLs for security; HDP service restarts; log file review; alert configuration, monitoring, and management; and job failure diagnosis and resolution.
- Experience working in a Linux (Red Hat) environment running on bare metal and within Amazon Web Services (AWS) infrastructure.
- Proficiency in shell scripting and automation tools is required.
- Experience running Hadoop on AWS is required.
- Experience with AWS S3 is required.
- Hortonworks experience and certifications are a plus.
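For context on the snapshotting and DistCp items above, a minimal disaster-recovery routine might look like the following shell sketch. Host names, ports, paths, and the snapshot name are illustrative assumptions, not details from this posting:

    #!/bin/bash
    # Take a point-in-time HDFS snapshot and replicate it to a backup cluster.
    SRC=/data/warehouse
    SNAP=backup-$(date +%Y%m%d)

    # Enable snapshots on the directory (one-time, as the HDFS superuser)
    sudo -u hdfs hdfs dfsadmin -allowSnapshot "$SRC"

    # Create a read-only, point-in-time snapshot of the source directory
    hdfs dfs -createSnapshot "$SRC" "$SNAP"

    # Copy the snapshot to the DR cluster; DistCp runs as a MapReduce job,
    # so the snapshot gives it a consistent source while writes continue
    hadoop distcp "hdfs://prod-nn:8020$SRC/.snapshot/$SNAP" \
                  "hdfs://dr-nn:8020/backups/$SNAP"

Snapshotting before the copy is the usual pattern because it decouples the long-running DistCp job from ongoing writes to the live directory.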
REQUIRED FOUNDATIONAL COMPETENCIES:
- Builds Relationships: Fosters open dialogue and obtains shared commitment to proposals; shares ideas and information to promote mutual understanding, respect, and effective decision-making.
- Drives for Results: Acts to create opportunities for Vertex or to avoid future problems; has the courage to act with incomplete information rather than simply thinking about it; maintains a focused commitment to achieving enterprise objectives.
- Knows the Business: Understands and applies knowledge of Vertex’s business and processes to accomplish goals.
- Anticipates Customer Needs (internal and external): Establishes and maintains productive relationships with customers and partners, anticipating their needs.
- Learns Continuously: Expands own knowledge base to enhance performance; seeks development to increase strengths for current and future needs.
Priyanka Katiyar
Technical Recruiter
Nityo Infotech Corp.
Phone: 609-853-0818, Ext. 2220
G talk: puja123hbti