Intuceo Requirement for Python Developer, Data Engineer, Cloud Data Architect, BI Designer
Hi,
I hope you are doing well. We have a few open requirements with us. Please take a look.
Job Description:
1. Job Title | Python Developer |
Location | Remote |
Duration | Contract |
Experience: 8+ years
> PySpark
> Python
> Database
> ETL
> Palantir
Job Description:
We are looking for experienced developer resources with the key skill set below.
The sooner the better. Looking forward to your response.
- PySpark for implementation in Databricks: 2 to 3 years of experience
- Python with exposure to OOP concepts: 3 years of experience
- Implementation knowledge covering Snowflake SQL and Teradata: a few years of experience
- Understanding of CDC for performance tuning, and data quality checks before and after the fact
3. Job Title | Cloud Data Architect |
Location | Los Angeles, CA - Remote |
Duration | Long Term Contract |
Client | IBM |
Qualifications:
Familiarity with Data Modeling techniques and Data Warehousing standard methodologies and practices
Experience driving large-scale cloud enterprise data warehouse projects and establishing the source of truth for key entities, with a focus on data quality metrics and connectedness of data across foundations
Strong experience collaborating with business stakeholders and users of multiple business domains such as Product, Sales, Operations, Finance, and Marketing to understand, document, and implement data & analytics needs
Hands-on experience with distributed systems such as Spark and Hadoop (HDFS, Hive, Presto, PySpark) to query and process data; advanced SQL, Python, and scripting skills; and experience with visualization tools such as Tableau
Experience with at least one major MPP or cloud database technology (Snowflake, Redshift, BigQuery)
Job Responsibilities:
Design and develop scalable data warehousing solutions, building ETL/ELT pipelines in Big Data environments (cloud, on-prem, hybrid)
Help architect data solutions/frameworks and define data models for the underlying data warehouse and data marts
Collaborate with Data Product Managers, Data Architects, and Data Engineers to design, implement, and deliver successful data solutions
Maintain detailed documentation of your work and changes to support data quality and data governance
Perform data analysis and profiling, and meet with application SMEs to understand the source data model and translate it to a dimensional model
Our tech stack includes Azure/AWS, Snowflake, Databricks, Spark.
4. Job Title | BI Designer |
Location | Remote |
Duration | Contract |
Job Description
Reverse-engineering skills across disparate data sources to create various data domains and architect solutions per requirement.
Design and documentation skills to produce performance-oriented dimensional models from new or existing sources.
Future technologies to focus on include a good understanding of Python and PySpark for distributed computing (Databricks), plus knowledge of Snowflake, which is closely related to other SQL databases but has its own cloud variations (to provide performance-oriented designs).
Understanding of cloud technologies (Azure, AWS) and their functionality for creating robust data pipelines (the future roadmap is to move to AWS).
Good, pleasant communication skills to communicate across multiple platforms.
Thanks & Regards,
Vinay Yadav
Recruiter
P. 904-204-0825
tyadav@intuceo.com | www.intuceo.com
4110 Southpoint Blvd. Suite 124 Jacksonville, FL 32216