Hi,
Hope everything is well.
Please check the following JD and let me know if you're interested.
Local candidates only; 11+ years of experience.
Role: Sr. GCP Data Engineer (Spark, Scala, GCP)
Location: Sunnyvale CA (Hybrid)
Duration: 12+ Months
Positions: 2
Need 12+ years of experience.
Please do not share GC/GC-EAD/TN/USC candidates on C2C.
Mandatory:
· Spark – 8+ Yrs. of Exp
· Scala – 8+ Yrs. of Exp
· GCP – 3+ Yrs. of Exp
· Hive – 8+ Yrs. of Exp
· SQL – 8+ Yrs. of Exp
· ETL Process / Data Pipelining – 8+ Yrs. of Exp
· Retail domain experience (preferred)
Requirements:
· 8+ years of hands-on experience with developing data warehouse solutions and data products.
· 4+ years of hands-on experience developing a distributed data processing platform with Hadoop, Hive, Scala, and Airflow or another workflow orchestration solution.
· 4+ years of experience with GCP, including GCS, Dataproc, and BigQuery.
· 2+ years of hands-on experience in data modeling (Erwin) and designing schemas for data lakes or RDBMS platforms.
· Experience with programming languages: Python, Java, Scala, etc.
· Experience with scripting languages: Perl, Shell, etc.
· Experience working with, processing, and managing large data sets (multi-TB/PB scale).
· Exposure to test-driven development and automated testing frameworks.
· Background in Scrum/Agile development methodologies.
· Capable of delivering on multiple competing priorities with little supervision.
· Excellent verbal and written communication skills.
· Bachelor's degree in Computer Science or equivalent experience.