Hi,
Hope you're doing well.
My name is Mayank, and I am a Technical Recruiter at Empower Professionals Inc. We are sourcing for a Data Engineer role located in Issaquah, WA; it is a long-term contract with possible extensions.
If you have any suitable profiles, please share their updated resumes along with each candidate's location, work authorization, and expected rate so that we can proceed.
Role: Data Engineer
Location: Issaquah, WA (hybrid, locals only)
Duration: 12+ Months
Must have:
- 10+ years of experience in Data Engineering
- 5+ years of experience in Python
- 4+ years of experience in GCP/Azure
- Strong understanding of database storage concepts (data lake, relational databases, NoSQL, Graph, data warehousing).
Overview:
The Data Engineer will be responsible for designing, building, and maintaining scalable, high-performance data pipelines and integration solutions using Python and Google Cloud Platform (GCP) services.
This role requires a hands-on engineer with strong expertise in data architecture, ETL/ELT development, and real-time/batch data processing, who can collaborate closely with analytics, development, and DevOps teams to ensure reliable, secure, and efficient data delivery across the organization.
Job Duties/Essential Functions:
- Builds data models and develops data pipelines to store data in defined data models and structures.
- Identifies ways to improve data reliability, efficiency and quality of data management.
- Conducts ad-hoc data retrieval for business reports and dashboards.
- Assesses the integrity of data from multiple sources.
- Manages database configuration including installing and upgrading software and maintaining relevant documentation.
- Develops and operationalizes data pipelines to create enterprise certified data sets that are made available for consumption (BI, Advanced analytics, APIs/Services).
- Works in tandem with Data Architects, Data Stewards and Data Quality Engineers to design data pipelines and recommends ongoing optimization of data storage, data ingestion, data quality and orchestration.
- Designs, develops, & implements ETL/ELT processes using Informatica Intelligent Cloud Services (IICS).
- Uses Google Cloud and Azure services such as Azure SQL DW (Synapse), ADLS, Azure Event Hub, Cosmos DB, Databricks, and Delta Lake to improve and speed up delivery of our data products and services.
- Implements big data and NoSQL solutions by developing scalable data processing platforms to drive high-value insights to the organization.
- Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery.
- Communicates technical concepts to non-technical audiences both in written and verbal form.
- Performs peer reviews of other data engineers' work.
- Regular and reliable workplace attendance at your assigned location.
Requirements:
- 5+ years' experience engineering and operationalizing data pipelines with large and complex datasets.
- 3+ years' hands-on experience with Informatica PowerCenter and/or IICS.
- 4+ years' experience working with cloud technologies such as Dataflow, Data Fusion, Pub/Sub, Dataform, dbt, GCS, BigQuery, Cloud SQL, Firestore/Datastore, Apigee, ADLS, Azure Databricks, Spark, Azure Synapse, Cosmos DB, and other big data technologies.
- Extensive experience working with various data sources (DB2, SQL, Oracle, flat files (CSV, delimited), APIs, XML, JSON).
- Advanced SQL skills required. Solid understanding of relational databases and business data; ability to write complex SQL queries against a variety of data sources.
- 3+ years' experience with Data Modeling, ETL, and Data Warehousing.
- Strong understanding of database storage concepts (data lake, relational databases, NoSQL, Graph, data warehousing).
Recommended:
- Azure or Google Cloud data certifications.
- Experience implementing data integration techniques such as event/message-based integration (Kafka, Azure Event Hub) and ETL.
- Experience with Git/Azure DevOps.
- Experience delivering data solutions through agile software development methodologies.
- Exposure to the retail industry.
- Excellent verbal and written communication skills.
- Experience with UC4 Job Scheduler.
- Proficient in Google Workspace applications, including Sheets, Docs, Slides, and Gmail.
- Successful internal candidates will have spent one year or more on their current team.
In compliance with the salary transparency law, the expected pay range for this role is $50 - $60/hr. Actual compensation depends on experience and interview evaluation.
Thanks,
Mayank Verma
Senior Technical Recruiter | Empower Professionals
......................................................................................................................................
mayank@empowerprofessionals.com | LinkedIn: https://www.linkedin.com/in/mayankdverma/
Fax: 732-356-8009 | 100 Franklin Square Drive – Suite 104 | Somerset, NJ 08873
Certified NJ and NY Minority Business Enterprise (NMSDC)
Empower Professionals firmly opposes e-mail "spamming". We apologize to those who do not wish to receive this e-mail and also to those who have accidentally received it again. Please reply with "REMOVE" in the subject listing, with all aliases email addresses that you would want removed and any inconvenience caused is highly regretted. We appreciate your patience and cooperation. This e-mail and any files transmitted with it are for the sole use of the intended recipient(s) and may contain confidential and privileged information. If you are not the intended recipient(s), please reply to the sender and destroy all copies of the original message. Any unauthorized review, use, disclosure, dissemination, forwarding, printing or copying of this email, and/or any action taken in reliance on the contents of this e-mail is strictly prohibited and may be unlawful.