Hi,

Hope you are doing well! This is Charan representing Tek Leaders Inc. Please review the position described below and send me your latest updated resume.

Role: GCP Lead Data Engineer
Location: Dublin, OH (Hybrid)
Duration: 12+ Months
Experience: 9+ Years
Work Authorization: H1-B, H4 EAD, L2 EAD, GC, GC EAD, and USC are fine; no OPT/CPT.
Tax Terms: C2C, W2, and 1099 available.

Job Description:

Responsibilities:
- Possess a deep functional and technical understanding of Machine Learning technologies on Google Cloud Platform and leverage it across the client's large, complex, and diverse landscape.
- Analyze performance and scalability characteristics to identify bottlenecks in large-scale distributed systems
- Perform root cause analysis of performance issues identified by internal testing and from customers and suggest corrective actions
- Business acumen (strong understanding of how business operates, and how to harness data and analytics to meet business needs)
- Expert at designing and developing all layers of an application and platform.
- Lead the development of technology transitions or architecture evolutions by creating foundational examples of working solutions, and coach teams on how to build on those examples.
- Engage early in project efforts to analyze current solutions, provide solution options and recommendations, understand business process impact, and provide accurate estimates.
- Design and architect applications across multiple domains, including Data Engineering, Data Science, Data Visualization, and App Development.
- Demonstrate a deep understanding of business processes and technology building blocks.
- Lead collaboration with project teams and with application, data, integration, infrastructure, and security enterprise architects to develop and deploy comprehensive solution architectures.
- Ensure projects are delivered in line with the design and roadmap, and to the defined standards and best practices.
- Skilled as a Data Architect who can design end-to-end data-driven solutions.
- Influence adoption by working with IT, Business, and Architecture groups.
- Create designs that account for data sources and dependencies, the development landscape, and the deployment landscape.
- Generate ideas and suggestions for process and technical improvements for platforms and processes supported by the team
- Architect the GCP data platform to meet the non-functional requirements of the consumption layer and the solutions built on top of the data platform (a pipeline sketch follows this list).
- Proactively identify potential performance and availability challenges, implement recommendations, and ensure that system capacity and availability exceed requirements while the platform achieves business results.
- Partner with solution architects, data engineers, platform engineers and other team roles to assess the platform’s needs, help design new capabilities, establish architectural roadmaps, design and run tests/proof-of-concepts, help troubleshoot problems, identify risks, and make recommendations.
- Communicate comfortably across all levels of the department and create artifacts that deliver the message clearly.
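
For illustration only (not part of the client's requirements): a minimal Apache Beam sketch in Python of the kind of GCP data pipeline this role would architect, runnable locally or on Dataflow. The project, region, bucket, and step names are placeholders invented for this example.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def run():
        # Placeholder options: swap in a real project, region, and bucket.
        # "DirectRunner" runs locally; "DataflowRunner" targets GCP Dataflow.
        options = PipelineOptions(
            runner="DirectRunner",
            project="example-project",
            region="us-east1",
            temp_location="gs://example-bucket/tmp",
        )
        with beam.Pipeline(options=options) as p:
            (
                p
                | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")
                | "Parse" >> beam.Map(lambda line: line.split(","))
                | "KeepValid" >> beam.Filter(lambda row: len(row) == 3)
                | "Rejoin" >> beam.Map(lambda row: ",".join(row))
                | "Write" >> beam.io.WriteToText("gs://example-bucket/output/part")
            )

    if __name__ == "__main__":
        run()
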
Qualifications:
- 10+ years of experience on large-scale implementation programs preferred.
- Advanced experience with data platforms including GCP, Teradata and SAP HANA.
- Expert in SQL, Data Engineering, Data Pipelines, Data Visualization, and Data Science.
- Experience with cloud computing preferred, including knowledge of the Google Cloud Platform (GCP) and Amazon Web Services (AWS) platforms.
- Proven ability to manage multiple tasks, respond quickly to emergent problems, and to focus both on long-range projects and immediate tasks required to maintain system functionality.
- Experience with designing, developing, and deploying Machine Learning models.
- Demonstrated expertise in database design and modeling.
- Expert knowledge of BI Reporting and Data Discovery tools.
- Experience with business-critical applications.
- Experience delivering related information software solutions, such as data warehouses and integration platforms.
- Agile development skills and experience.
- BigQuery (BQ) expert, including query performance tuning techniques (see the BigQuery sketch after this list).
- Familiarity with performance monitoring tools
- Familiarity with integration patterns for the consumption layer.
- Experience with real-time data streaming tools such as Kafka (see the consumer sketch after this list).
- Understanding of Tableau, Looker, AtScale, LookML
- Capacity modeling/planning processes and tools
- Understanding of high availability and resiliency planning.
- Excellent verbal and written communication skills; bias toward action; proactive, self-driven, and able to work independently.
- Experience architecting high-performance application data services against data stores holding very large amounts of data.
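
A hedged sketch of the BigQuery tuning point above, using the google-cloud-bigquery Python client: partitioning by date and clustering by a frequently filtered column lets BigQuery prune the bytes a query scans. The project, dataset, and table names are invented for this example.

    from google.cloud import bigquery

    client = bigquery.Client()  # picks up application-default credentials

    # Partitioning by date plus clustering by a common filter column lets
    # BigQuery prune partitions and blocks instead of scanning the full table.
    # `example-project.sales.orders` is a made-up table for this sketch.
    client.query("""
        CREATE TABLE IF NOT EXISTS `example-project.sales.orders` (
          order_id    STRING,
          customer_id STRING,
          order_date  DATE,
          amount      NUMERIC
        )
        PARTITION BY order_date
        CLUSTER BY customer_id
    """).result()

    # A filter on the partition column restricts the scan to matching
    # partitions; total_bytes_processed shows how much data was touched.
    job = client.query("""
        SELECT customer_id, SUM(amount) AS total
        FROM `example-project.sales.orders`
        WHERE order_date BETWEEN '2024-01-01' AND '2024-01-31'
        GROUP BY customer_id
    """)
    job.result()
    print(f"Bytes processed: {job.total_bytes_processed}")
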
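Similarly, a minimal Kafka consumer sketch (Python, confluent-kafka) for the streaming requirement; the broker address, group id, and topic name are placeholders, and a real pipeline would land each record in a sink such as BigQuery rather than print it.

    from confluent_kafka import Consumer

    # Placeholder connection settings for this sketch.
    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "orders-consumer",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["orders"])

    try:
        while True:
            msg = consumer.poll(timeout=1.0)  # block up to 1s for a message
            if msg is None:
                continue
            if msg.error():
                print(f"Consumer error: {msg.error()}")
                continue
            # Stand-in for real processing (transform, then write to a sink).
            print(f"{msg.key()}: {msg.value().decode('utf-8')}")
    except KeyboardInterrupt:
        pass
    finally:
        consumer.close()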