
September 13, 2021

Immediate Interview: Python/PySpark Developer / Morrisville, NC


Hi,

This is Sudheer from the EA team. I hope you're doing great today.

I have an opening for the requirement below. If you're interested, please let me know.

 

Title: Python/PySpark Developer

Location: Morrisville, NC

Duration: 6+ Months

 

Client Expectation:

1. All the profiles shared thus far seem to have REST and API build-out expertise, but lightweight expertise in building Spark applications using Python for data processing within the Data Lake. I would appreciate it if you could please confirm and validate this expertise for the candidates.

 

To reiterate, the minimum qualifying criteria that should be met by all candidates:

 

• Python with Spark (PySpark) for data processing and building out data pipelines (minimum of 2 years)

• SQL to wrangle, analyze, and transform data (minimum of 2 years)

• UNIX to orchestrate the pipeline build-out and to perform bash-based data processing (minimum of 2 years)

 

Python/PySpark Developer

Roles/Responsibilities (5-8 day-to-day candidate responsibilities):

Description:

• Develop, in Python and PySpark, enhancements to an internal model execution platform that utilizes a custom set of interfaces

• Communicate with and direct model development teams on the implementation of statistical models

• Interface with model owners and gather feedback on delivered analytical data

• Write specifications and documentation for Python interfaces

• Communicate needed changes to other members of the development team

• Work with the QA team on test plan reviews and assist in the QA testing process with requirements clarifications and questions

 

Required Qualifications (5-8 bullet points on must-have skills):

• 5+ years of Python/PySpark development

• Software development background required: an understanding of fundamental computer science concepts, including algorithms and data structures

• Experience developing and packaging your own functionality in Python that demonstrates an understanding of namespaces, environments, and data structures in Python

• Knowledge of and experience working within the Hadoop ecosystem

• Experience building and debugging PySpark jobs

• Comfortable working in a Unix environment

• Experience with source control tooling, preferably Git

• Knowledge of and experience building RESTful interfaces

• Experience working with relational databases

• Experience developing in R is a plus




This email is generated using CONREP software.
