
GCP Data Engineer :: Bentonville, AR (Onsite)

Job Title: GCP Data Engineer

Location: Bentonville, AR (Onsite)

Position Type: Long-term contract

 

Mandatory Skills:

• Spark

• Scala

• GCP

• Airflow

• DAG

• ETL

• PySpark

 

Job Description:

1. Design, develop, and automate data processing workflows using Airflow, PySpark, and Dataproc on GCP (a DAG sketch follows this list).

2. Develop ETL (Extract, Transform, Load) processes that handle diverse data sources and formats.

3. Manage and provision GCP resources including Dataproc clusters, serverless batches, Vertex AI instances, GCS buckets, and custom images.

4. Provide platform and pipeline support to analytics and product teams, troubleshooting issues related to Spark, BigQuery, Airflow DAGs, and serverless workflows.

5. Collaborate with data scientists and analysts to understand data needs and deliver robust solutions.

6. Provide timely and effective technical support to internal users (e.g., data analysts, data scientists), addressing their data-related queries and problems.

7. Optimize and fine-tune data systems for high performance, reliability, and cost efficiency.

8. Perform root cause analysis for recurring issues and collaborate with data analysts and scientists to implement preventative measures to minimize future occurrences.
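
For illustration, here is a minimal sketch of the workflow described in items 1 and 2: an Airflow DAG that submits a PySpark job to a Dataproc cluster. It assumes Airflow 2.x with the apache-airflow-providers-google package installed; the project, region, cluster name, and GCS paths are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch: a daily Airflow DAG submitting a PySpark ETL job to Dataproc.
# All names (my-project, etl-cluster, gs://my-bucket/...) are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocSubmitJobOperator,
)

PROJECT_ID = "my-project"      # hypothetical project
REGION = "us-central1"         # hypothetical region
CLUSTER_NAME = "etl-cluster"   # hypothetical Dataproc cluster

# PySpark job spec: Dataproc runs the driver script staged in GCS.
PYSPARK_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": CLUSTER_NAME},
    "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/etl_job.py"},
}

with DAG(
    dag_id="daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Submit the job and wait for Dataproc to report completion.
    run_etl = DataprocSubmitJobOperator(
        task_id="run_etl",
        project_id=PROJECT_ID,
        region=REGION,
        job=PYSPARK_JOB,
    )
```

For the serverless batches mentioned in item 3, the same provider package offers an analogous DataprocCreateBatchOperator.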

 

Required Skills:

• Strong programming skills in Python and SQL.

• Hands-on experience with cloud platforms, particularly GCP.

• Expertise in GCP data tools: BigQuery, Dataproc, Vertex AI, Pub/Sub, Cloud Functions.

• Strong hands-on experience with Apache Airflow (incl. Astronomer), PySpark, and Python.

• Familiarity with SQL, Spark SQL, Hive, PL/SQL, and data modeling.

• Comfortable supporting distributed data systems and large-scale batch/stream data processing.

• Experience optimizing and supporting Spark jobs and ETL pipelines running on Dataproc (see the driver sketch below).
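
As a rough sketch of that last bullet, the driver below shows the shape of an ETL job that might run on Dataproc: extract CSV files from GCS, transform with Spark, and load the result into BigQuery. It assumes the spark-bigquery connector is available on the cluster; the bucket, table, and column names are made up for illustration.

```python
# Minimal ETL driver sketch for Dataproc; all paths and names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV files staged in a GCS bucket.
orders = (
    spark.read.option("header", "true")
    .option("inferSchema", "true")
    .csv("gs://my-bucket/raw/orders/")  # hypothetical path
)

# Transform: keep completed orders and aggregate revenue per day.
daily_revenue = (
    orders.filter(F.col("status") == "COMPLETED")
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Load: write to BigQuery through the spark-bigquery connector,
# staging intermediate files in a temporary GCS bucket.
(
    daily_revenue.write.format("bigquery")
    .option("temporaryGcsBucket", "my-temp-bucket")  # hypothetical bucket
    .mode("overwrite")
    .save("my_dataset.daily_revenue")  # hypothetical dataset.table
)
```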


Thanks

Gigagiglet
gigagiglet.blogspot.com