AWS Data Engineer
Location – Portland, OR (Remote)
Job Description:
- Mandatory Skills: AWS, Databricks, Spark, Python, Pyspark
- Good to have: AtScale, Airflow
Skills / Qualifications:
• 12+ years of data engineering experience, with an emphasis on data analytics and reporting
• Strong experience developing with PySpark, leveraging Databricks managed service
• Experience developing with scripting languages such as Shell and Python
• Expert experience with advanced SQL, Python
• Expert experience with AWS services
• Experience with agile delivery methodologies such as Scrum, SAFe, and Extreme Programming
• Experience working with source-code management and CI/CD tools such as GitHub and Jenkins
• Ability to partner with business and technology team members to understand business requirements and translate them into value-add technology solutions
Additional preferences are:
• Experience developing solutions leveraging AtScale
• Experience with workload automation tools such as Airflow and AutoSys
• Knowledge of building solutions with data visualization and reporting tools (Tableau, Cognos)
• Knowledge of Finance / Sustainability / Manufacturing business processes and objectives
• Prior Nike experience
Thanks,
Rahul Srivastava
TekisHub® Consulting Services
Email: rahul.kumar@tekishub.com