Client: Mizuho
Location: New York, NY
Role 1: Azure Data Architect
Our Challenge: We are seeking an experienced Azure Data Architect to design and implement scalable, secure, and efficient data solutions on Azure Cloud for our financial services client. The architect will lead the development of data platforms using Azure services and Databricks, ensuring robust data architecture that supports business objectives and regulatory compliance.
Responsibilities:
Design and develop end-to-end data architecture on Azure Cloud, including data ingestion, storage, processing, and analytics solutions.
Lead the deployment of Databricks environments and integrate them seamlessly with other Azure services (see the sketch after this list).
Collaborate with stakeholders to gather requirements and translate them into architectural designs.
Ensure data security, privacy, and compliance standards are met within the architecture.
Optimize data workflows and pipelines for performance and cost-efficiency.
Provide technical guidance and mentorship to development teams.
Keep abreast of the latest Azure and Databricks technologies and incorporate best practices.
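As a rough, illustrative sketch only (the storage account, container, and column names below are placeholders, not client details), a minimal Databricks/PySpark pipeline of the kind described above, ingesting raw files from ADLS Gen2 and writing a curated Delta table, might look like this:

    from pyspark.sql import SparkSession, functions as F

    # In a Databricks notebook a SparkSession named `spark` already exists; getOrCreate() reuses it.
    spark = SparkSession.builder.getOrCreate()

    # Placeholder ADLS Gen2 paths -- substitute the real storage account and containers.
    raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/trades/"
    curated_path = "abfss://curated@examplestorage.dfs.core.windows.net/trades/"

    # Ingest raw CSV files, deduplicate, and stamp each row with an ingestion timestamp.
    raw_df = spark.read.option("header", "true").csv(raw_path)
    curated_df = raw_df.dropDuplicates().withColumn("ingest_ts", F.current_timestamp())

    # Persist the curated data as a Delta table for downstream analytics.
    curated_df.write.format("delta").mode("overwrite").save(curated_path)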
Must Have:
Extensive experience designing and implementing data architectures on Azure Cloud.
Deep understanding of the Databricks platform and its integration with Azure services.
Strong knowledge of data warehousing, data lakes, and real-time streaming solutions.
Proficiency in SQL, Python, Scala, or Spark.
Experience with Azure Data Factory, Azure Data Lake, Azure SQL, and Azure Synapse Analytics.
Solid understanding of security, governance, and compliance in cloud data solutions.
Nice to Have:
Experience working in the financial services domain.
Knowledge of machine learning and AI integration within data platforms.
Familiarity with other cloud platforms like AWS or GCP.
Certifications such as Azure Solutions Architect Expert, Azure Data Engineer, or Databricks Certification.
Certifications:
Azure Solutions Architect Expert
Azure Data Engineer Associate
Databricks Certification (Certified Data Engineer or similar)
Domain: Financial Services
Role 2: Senior Data Engineer
Job Description:
We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team.
Responsibilities:
Understand technical specifications
Discuss business requirements with business analysts and business users
Develop applications and design systems using Python, SQL Server, Snowflake, and Databricks
Develop and maintain data models and schemas to support data integration and analysis.
Implement data quality and validation checks to ensure accuracy and consistency of data (see the sketch after this list).
Execute unit testing (UT) and system integration testing (SIT) with business analysts to ensure high-quality testing
Support user acceptance testing (UAT) with business users
Provide production support and maintenance for the application platform
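As one hedged example of the data quality and validation checks mentioned above (the table and column names are hypothetical), a simple PySpark check for null and duplicate business keys could look like this:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Hypothetical curated table; the real name would come from the project's data model.
    df = spark.table("curated.trades")

    # Validation checks: primary keys must be present and unique.
    null_keys = df.filter(F.col("trade_id").isNull()).count()
    duplicate_keys = df.groupBy("trade_id").count().filter(F.col("count") > 1).count()

    # Fail the pipeline run if either check is violated.
    if null_keys > 0 or duplicate_keys > 0:
        raise ValueError(
            f"Data quality check failed: {null_keys} null keys, {duplicate_keys} duplicate keys"
        )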
Qualifications:
General:
Around 15 years of IT industry experience
Strong communication skills
Experience with Agile methodology and SDLC processes
Design and Architecture experience
Experience working in global delivery model (onshore/offshore/nearshore)
Strong problem-solving and analytical skills
Self-starter and collaborative team player who works with minimal guidance
Technical Skills:
Mandatory (Strong):
Python, SQL Server and relational database concepts, Azure Databricks, Snowflake, job schedulers (Autosys/Control-M), ETL, CI/CD (see the sketch after this list)
Plus:
PySpark
Financial systems/capital markets/credit risk/regulatory application development experience
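As an illustration of how these mandatory pieces typically fit together (connection values below are placeholders, and the snippet assumes a Databricks notebook where spark and dbutils are predefined), reading from and writing to Snowflake via the Spark connector might look like this:

    # Placeholder Snowflake connection options -- not client settings.
    sf_options = {
        "sfUrl": "example_account.snowflakecomputing.com",
        "sfUser": "example_user",
        "sfPassword": dbutils.secrets.get("example-scope", "snowflake-password"),
        "sfDatabase": "EXAMPLE_DB",
        "sfSchema": "PUBLIC",
        "sfWarehouse": "EXAMPLE_WH",
    }

    # Read a Snowflake table into a Spark DataFrame for processing in Databricks ...
    positions_df = (
        spark.read.format("snowflake")
        .options(**sf_options)
        .option("dbtable", "POSITIONS")
        .load()
    )

    # ... and write the processed result back to Snowflake.
    (
        positions_df.write.format("snowflake")
        .options(**sf_options)
        .option("dbtable", "POSITIONS_ENRICHED")
        .mode("overwrite")
        .save()
    )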