
September 19, 2024

Hiring - Big Data Developer / Architect :: Atlanta - GA :: Contract

Hi Partner,

I hope you’re doing well.

Please review the highlighted points below and let me know if you have any consultants who would be a good fit for this role.

 

  • Must be local to Georgia with a local photo ID (no utility bills)
  • Only senior H-1B candidates
  • If you have any local Big Data profiles, please share them and I will review and get back to you.

 

 

Position 1:

 

Software Engineer - Big Data Developer (Spark/PySpark, 3-6 years' experience required)

Location: Atlanta; onsite at least 3 days a week

Rate: up to $65 per hour.

 

Experience with:

  • 10+ years of experience implementing high-end software products
  • Sound knowledge of Kubernetes and deployment methodologies
  • Big Data experience with either Spark or PySpark is mandatory, along with strong problem-solving skills and the ability to thrive with minimal supervision
  • Knowledge of database principles and SQL, and experience working with large databases
  • Basic Unix commands

 

Key responsibilities:

  • Perform development and support activities in the data warehousing domain using Big Data technologies
  • Understand high-level design and application interface design, and build low-level design; perform application analysis and propose technical solutions for application enhancements or production issues
  • Perform development and deployment; able to code, unit test, and deploy
  • Create the necessary documentation for all project deliverable phases
  • Handle production issues (Tier 2 support, weekend on-call rotation) to resolve them and ensure SLAs are met

 

Technical Skills:

Mandatory

  • Data Engineering:
    • Strong experience on one of the big data platforms (Hadoop, Snowflake, ADLS, or BigQuery)
    • Hands-on experience with Python or Java programming
    • Experience with Spark or Azure Databricks
    • Strong SQL analysis skills
    • Experience working with Kafka
  • Sound knowledge of Unix scripting

 

Good to have

  • Cloud skills – knowledge of AWS, Azure, or GCP
  • Sound knowledge of Kubernetes and deployment methodologies
  • Database skills – strong experience with Oracle and data replication concepts

 

Behavioral skills:

  • Eagerness and hunger to learn
  • Good problem-solving and decision-making skills
  • Good communication skills within the team, across sites, and with the customer
  • Willingness to stretch working hours when necessary to support business needs
  • Ability to work independently and drive issues to closure
  • Consults with relevant parties when necessary and raises risks in a timely manner
  • Effectively handles multiple and complex work assignments while consistently delivering high-quality work

 


Position 2:

 

Big Data Architect

Location: Atlanta; onsite at least 3 days a week

Rate: up to $72 per hour.

 

Role Description

  • Creates the high-level software design and sets technical standards, including software coding standards, tools, and platforms.
  • Responsible for building or defining the right processes and tools to establish non-functional governance of the data solution (including monitoring, performance, cost optimization, SLA compliance, security, scalability, and logging).
  • Acts as a strong partner to our internal and external stakeholders, providing effective solutions for architectural requirements and non-functional challenges.
  • Responsible for designing holistic technical architecture solutions with appropriate HW/SW specifications (i.e., sizing) in alignment with customer requirements, solution specifications, and cost constraints.
  • Makes cross-component design decisions. Defines the technical design, examines new technologies, and defines technical components.
  • In charge of the architectural roadmap. Provides architecture solutions during the software development lifecycle.

 

Key responsibilities

  1. Analyzing and designing effective, clear technical solutions for infrastructure and enterprise applications; providing high-level and/or detailed designs as needed; responsible for solution updates as the project evolves and changes
  2. Able to analyze performance and architecture issues in data processing framework components (e.g., application, infrastructure, network, compute)
  3. Interacting with a variety of internal and external stakeholders: presales leads, customer IT and business managers, Amdocs customer-facing managers, project managers, R&D managers, procurement, IT, and third-party software support teams
  4. Performance tuning of Cloudera/Hortonworks big data platforms
  5. Effectively engaging third-party software support teams for production non-functional issues caused by third-party components
  6. Performing capacity planning (including trending) and implementation design planning for all data center technical components (e.g., servers, storage, network)
  7. Serving as the point of contact for architecture governance with internal and external stakeholders
  8. Establishing the build, design, and operating model for HA, DR, security, and scalability across all data pipelines and solutions
  9. Constantly reviewing new technologies in the data and CI/CD domains and bringing new practices to the project
  10. Works with software engineers and other architects to define and refine the product structure to align with the business, development, deployment, and maintenance needs. Works with customers and product line management to identify, refine, and translate customer needs into concrete technical requirements. Understands and plans for architecture evolution including integration of new software technologies. Takes accountability for product and application architecture. Supports and leads architectural decisions and tasks within a product line or across multiple product lines (cross-portfolio).
  11. Works with the software engineering teams to establish best practice guidelines and reusable and production-ready solutions. Reviews technical designs, and provides guidance to software engineers on technical and architectural design decisions. Is an ambassador for the architecture, ensuring that it is being implemented correctly. Gives feedback and inputs to the product management team to consider and improve the product line. Leadership is focused on the technical aspects of the job, rather than the people management aspects.
  12. Standards and Best Practices - Researches, evaluates, and prototypes new methodologies, technologies, and products. Provides leadership in the use of processes and tools. Proposes and follows through with improvements in processes and tools. Ensures effective application of corporate standards and procedures. Proposes improvements, and tracks and provides updates about pertinent technology trends.
  13. Customer Engagement - Obtains a good understanding of the customer context while making technical decisions and choices. Interacts effectively with customers as required. Provides the development or support team with inputs and requirements related to the technical aspects of the solution/product.

 

Critical Experiences

  1. Extensive background in software engineering and proven experience in software design across many technologies and application architectures
  2. Experienced in building complex applications or products, taking various technical considerations into account
  3. Proven experience designing and operating high-volume data solutions from design through production
  4. Strong experience with Cloudera/Hortonworks big data platforms
  5. Demonstrated deep technical domain expertise
  6. Experience presenting ideas, influencing, and building consensus in a group setting
  7. Experience with production performance/capacity analysis
  8. Customer-facing experience (relevant for some of the jobs)

 

Technical Skills

  1. Key focus areas (10+ years' experience for Architect)
    1. Data Management Services
    2. Data Strategy, Data Governance, Data Architecture, Data Integration, Data Quality, Data Cataloging, Data Security, Data Operations
    3. Big Data Analytics, Self-Service BI, Data Visualization
    4. Data Science (AI/ML, Platform, Self Service)
    5. BI Platform Migrations
    6. DevOps, Automation
  2. Big Data Technologies (6+ years' experience; cloud experience is a must – AWS/Azure/GCP)
    1. Hadoop, Spark, MapReduce, Hive, HBase, Python, Java, Scala, Kafka, NiFi, Elasticsearch
    2. Performance tuning of big data Hadoop platforms (Cloudera, Hortonworks)
    3. Kubernetes
    4. Ansible, Jenkins, Docker
    5. Automation of cluster setup/configuration, code generation

 


Anurag Singh

EIT Professionals

17199 N Laurel Park Dr. Ste 402, Livonia, MI 48152, USA

Work: +1 (734) 772-9929 ext. 438

LinkedIn: linkedin.com/in/anurag-singh-it-recruiter
Email: anurag@eitprofessionals.com
Web: www.eitprofessionals.com