IICS & Snowflake Lead Developer || REMOTE ||

Hello connections,

Hope you are doing well.

 

Job Title: IICS & Snowflake Lead Developer || REMOTE || Xebia / HAL

Location: 100% remote role with overlap during PST hours.

Client: Xebia Consulting / HAL USA


If interested, please share your resume with:

neelima@jpctechno.com

 

Job Summary

We are seeking a skilled IICS & Snowflake Lead Developer to design, develop, and optimize ETL/ELT pipelines with a strong focus on masked data ingestion, migration to higher environments, and testing. The ideal candidate will have expertise in Informatica Intelligent Cloud Services (IICS) for data integration and Snowflake for cloud-based data warehousing. You will work closely with data engineers, analysts, and DevOps teams to ensure secure, scalable, and efficient data movement across environments.

 

1. ETL Development Using IICS

  • Develop and maintain ETL/ELT pipelines using Informatica Intelligent Cloud Services (IICS).
  • Implement masked data ingestion using Informatica Cloud Data Masking (ICDM) for PII/PHI security.
  • Integrate IICS Cloud Data Integration (CDI) and Cloud Application Integration (CAI) for data transformation and workflow automation.
  • Develop error handling, logging, and monitoring mechanisms for reliable data pipelines.
  • Ensure data quality, cleansing, and enrichment during the ETL process.
  • Work with REST APIs, web services, and flat files for diverse data integrations.
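To illustrate the masked-ingestion requirement above, here is a minimal Python sketch of field-level PII masking applied before data is staged. This is purely illustrative of the concept; in the role itself, masking would be implemented with Informatica Cloud Data Masking (ICDM) rules, not hand-rolled code, and the field names here are hypothetical.

```python
import hashlib

def mask_record(record, pii_fields=("ssn", "email")):
    """Return a copy of the record with PII fields irreversibly masked."""
    masked = dict(record)
    for field in pii_fields:
        if field in masked and masked[field] is not None:
            # Deterministic, non-reversible token derived from the value
            digest = hashlib.sha256(str(masked[field]).encode()).hexdigest()
            masked[field] = digest[:12]
    return masked

row = {"id": 1, "email": "jane@example.com", "ssn": "123-45-6789", "city": "Austin"}
safe_row = mask_record(row)
```

Deterministic hashing is used so the same input always yields the same token, which preserves referential integrity when masked data must still join correctly across tables in lower environments.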

 

2. Snowflake Development & Optimization

  • Design and implement Snowflake data warehouse solutions, including schema design, data modeling, and performance tuning.
  • Optimize data ingestion, transformation, and query performance for large-scale data processing.
  • Implement data masking, encryption, and role-based access control (RBAC) to ensure secure data handling.
  • Automate data workflows using Snowflake Streams, Tasks, and Snowpipe.
  • Utilize SQL, Snowflake Snowpark, and Python for advanced data transformations and processing.
  • Ensure data governance, compliance, and security policies are followed as per industry standards (GDPR, HIPAA).

 

3. Migration to Higher Environments & Testing

  • Plan and execute migration of ETL jobs, data pipelines, and Snowflake objects from development to higher environments (QA, UAT, and Production).
  • Perform version control and release management for IICS workflows and Snowflake scripts.
  • Work with CI/CD pipelines to automate deployment of ETL jobs and data models.
  • Develop unit, integration, and performance testing strategies for ETL processes and Snowflake queries.
  • Implement automated testing frameworks for data pipelines to ensure accuracy and reliability.
  • Troubleshoot and optimize ETL workflows and Snowflake queries for better performance in production environments.
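The post-migration validation work described above often starts with simple reconciliation checks between source and target. Below is a minimal sketch, assuming in-memory row sets; in practice the counts and sums would come from SQL queries against the IICS source and the Snowflake target, and the column names are hypothetical.

```python
def reconcile(source_rows, target_rows, key="id", amount="amount"):
    """Return simple parity checks between source and target row sets."""
    return {
        # Same number of rows landed in the target
        "row_count_match": len(source_rows) == len(target_rows),
        # Same set of business keys on both sides
        "key_set_match": {r[key] for r in source_rows} == {r[key] for r in target_rows},
        # Control total on a numeric column matches
        "amount_sum_match": sum(r[amount] for r in source_rows)
                            == sum(r[amount] for r in target_rows),
    }

src = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
tgt = [{"id": 2, "amount": 5.5}, {"id": 1, "amount": 10.0}]
result = reconcile(src, tgt)
```

Checks like these are cheap to automate in a CI/CD pipeline as a gate before promoting a migration from QA or UAT into production.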

 

 

4. Data Modeling

  • Design and implement fact and dimension tables for optimized data warehousing.
  • Develop Star Schema, Snowflake Schema, and hybrid models based on business requirements.
  • Ensure proper grain definition for fact tables to maintain data consistency and accuracy.
  • Optimize slowly changing dimensions (SCD Types 1, 2, and 3) using IICS and Snowflake.
  • Implement surrogate keys, hierarchical relationships, and aggregation strategies for efficient querying.
  • Develop ETL workflows in IICS to handle dimension updates and fact table loads efficiently.
  • Perform data validation and reconciliation to maintain data integrity across facts and dimensions.
  • Optimize fact tables for partitioning, clustering, and indexing in Snowflake.
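The SCD handling mentioned above can be summarized with a small sketch of Type 2 logic: when a tracked attribute changes, expire the current dimension row and insert a new version. This mirrors what an IICS mapping or a Snowflake MERGE would do; the in-memory structures and column names here are an assumption for illustration only.

```python
from datetime import date

def apply_scd2(dimension, incoming, key="customer_id", tracked=("city",),
               today=None):
    """Apply SCD Type 2 updates: expire changed rows, append new versions."""
    today = today or date.today()
    for new in incoming:
        current = next((r for r in dimension
                        if r[key] == new[key] and r["is_current"]), None)
        if current is None:
            # Brand-new key: insert as the current version
            dimension.append({**new, "valid_from": today, "valid_to": None,
                              "is_current": True})
        elif any(current[c] != new[c] for c in tracked):
            # Tracked attribute changed: close out the old row, add a new one
            current["valid_to"] = today
            current["is_current"] = False
            dimension.append({**new, "valid_from": today, "valid_to": None,
                              "is_current": True})
    return dimension

dim = [{"customer_id": 7, "city": "Dallas", "valid_from": date(2023, 1, 1),
        "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, [{"customer_id": 7, "city": "Austin"}],
                 today=date(2024, 6, 1))
```

Surrogate keys would normally be assigned to each new version so fact tables can join to the dimension row that was current at load time.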

 

Required Skills & Qualifications:

  • 3-7 years of experience in IICS (Informatica Intelligent Cloud Services) for ETL/ELT development.
  • Strong expertise in masked data ingestion, data anonymization, and secure data handling.
  • Experience in Snowflake schema design, query optimization, and performance tuning.
  • Hands-on experience with IICS Cloud Data Integration (CDI), Cloud Application Integration (CAI), and Secure Agent setup.
  • Experience in migrating ETL jobs and data models across environments with version control.
  • Knowledge of SQL, Python, and Snowflake Snowpark for data transformation and processing.
  • Familiarity with testing frameworks for ETL pipelines and data validation techniques.
  • Strong understanding of RBAC, data masking policies, and encryption techniques in Snowflake.
  • Experience with DevOps practices, CI/CD pipelines, and cloud deployments.

 

Preferred Qualifications:

  • Experience in data warehousing projects involving Snowflake and Informatica.
  • Exposure to DataOps, CI/CD for ETL pipelines, and automation frameworks.
  • Experience with data validation, testing automation, and production monitoring.
  • Understanding of data compliance frameworks such as GDPR and HIPAA.

