
March 7, 2024

GCP Data Architect - Phoenix, AZ (Day 1 Onsite / Hybrid Model) - 15+ Yrs Exp Must

Hi,

 

Please find below the requirement for GCP Data Architect - Phoenix, AZ (Day 1 Onsite / Hybrid Model).

 

Title:  GCP Data Architect

Location: Phoenix, AZ (Day 1 Onsite / Hybrid Model)

Duration: 12+ months

 

Mandatory Skills:

  • Extensive experience working with GCP data services such as Cloud Storage, Dataflow, Dataproc, BigQuery, and Bigtable
  • Very strong experience with Google Cloud Composer and Apache Airflow; ability to set up, monitor, and debug a complex environment running a large number of concurrent tasks (a minimal DAG sketch follows this list)
  • Good exposure to RDBMS / SQL fundamentals
  • Exposure to Spark, Hive, GCP Data Fusion, Astronomer, Pub/Sub messaging, Vertex AI, and the Python programming language
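
For context on the Composer/Airflow item above, here is a minimal sketch of the kind of fan-out DAG such an environment runs at scale. The DAG id, schedule, task names, and concurrency cap are assumptions made for the example, not details taken from this requirement.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    default_args = {
        "retries": 2,                        # retry transient task failures
        "retry_delay": timedelta(minutes=5),
    }

    # Hypothetical fan-out DAG; on Cloud Composer this file is dropped into the
    # environment's dags/ folder in GCS and picked up by the scheduler.
    with DAG(
        dag_id="example_fanout",             # illustrative name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
        default_args=default_args,
        max_active_tasks=16,                 # cap concurrent tasks within this DAG
    ) as dag:
        start = BashOperator(task_id="start", bash_command="echo start")

        # Fan out many parallel tasks to exercise scheduler and worker concurrency.
        for i in range(20):
            start >> BashOperator(
                task_id=f"load_partition_{i}",
                bash_command=f"echo processing partition {i}",
            )

Monitoring and debugging then comes down to the Airflow UI and task logs that Composer exposes, together with the DAG- and environment-level concurrency settings illustrated above.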

 

Minimum Qualifications:

 

  • Bachelor's degree in Engineering or Computer Science (or equivalent), or Master's in Computer Applications (or equivalent).
  • Solid experience with, and understanding of, the considerations for large-scale architecting, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must.
  • Ability to create detailed target-state technical, security, data, and operational architecture and design blueprints that incorporate modern data technologies and cloud data services and demonstrate the modernization value proposition.
  • Minimum of 12 years of experience designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools - Spark, Hive, Cloud Dataproc, Cloud Dataflow, Apache Beam/Composer, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions, and GitHub.
  • Experience performing detailed assessments of current-state data platforms and creating an appropriate transition path to GCP.
  • Experience with data lake and data warehouse ETL design and build.
  • Experience with Google Cloud services such as streaming and batch processing, Cloud Storage, Cloud Dataflow, Dataproc, DFunc, BigQuery, and Bigtable (a minimal pipeline sketch follows this list).
  • Proven ability in one or more of the following programming or scripting languages: Python, JavaScript, Java.
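
To make the streaming/batch and BigQuery point above concrete, below is a minimal batch Apache Beam pipeline sketch targeting the Dataflow runner and writing to BigQuery. The project, bucket, dataset, table, and schema are placeholders assumed for the example, not values from this requirement.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Placeholder project/bucket/table names -- replace with real resources.
    options = PipelineOptions(
        runner="DataflowRunner",
        project="my-project",
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )

    def parse_line(line):
        # Assumes a simple CSV layout: user_id,event,ts
        user_id, event, ts = line.split(",")
        return {"user_id": user_id, "event": event, "ts": ts}

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadCsv" >> beam.io.ReadFromText("gs://my-bucket/raw/events.csv",
                                                skip_header_lines=1)
            | "Parse" >> beam.Map(parse_line)
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

Swapping ReadFromText for beam.io.ReadFromPubSub (and enabling streaming in the pipeline options) turns the same shape of pipeline into the streaming case.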
