
DevOps || Local || USC/GC

Job Title: DevOps Engineer

Location: Boston, MA (Hybrid) - Local candidates only

Duration: 12+ months

C2C - USC/GC only

 

DETAILED LIST OF JOB DUTIES AND RESPONSIBILITIES:

  • Build and maintain CI/CD (Continuous Integration / Continuous Delivery or Deployment) pipelines for Snowflake, Informatica (IICS), and Airflow DAG (Directed Acyclic Graph) deployments
  • Implement automated code promotion between development, test, and production environments
  • Integrate testing, linting, and security scanning into deployment processes
  • Develop IaC (Infrastructure as Code) using Terraform or similar tools to manage Snowflake objects, network, and cloud resources
  • Manage configuration and environment consistency across multi-region/multi-cloud setups
  • Maintain secure connectivity between cloud and on-prem systems (VPNs, private links, firewalls)
  • Implement logging and alerting for Airflow DAGs, Informatica workflows, and Snowflake performance (a minimal Airflow sketch follows this list)
  • Develop proactive monitoring dashboards for job failures, data quality triggers, and warehouse usage
  • Optimize pipeline performance, concurrency, and cost governance in Snowflake
  • Own deployment frameworks for ETL/ELT code, SQL scripts, and metadata updates
  • Support user access provisioning & RBAC alignment across Snowflake, Informatica, and Airflow
  • Troubleshoot platform and orchestration issues, lead incident response during outages
  • Enforce DevSecOps practices including encryption, secrets management, and key rotation
  • Implement audit, logging, compliance, and backup/restore strategies aligned with governance requirements
  • Participate in testing, deployment, and release management for new data workflows and enhancements.
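
As an illustration of the Airflow logging and alerting duty above, the following is a minimal sketch, assuming Airflow 2.4+; the DAG id, owner, schedule, and callback behaviour are illustrative placeholders rather than anything specified in this posting. A real callback would typically page or post to Slack/email instead of only logging.

# Minimal sketch: a DAG whose tasks report failures through a shared callback.
# Assumes Airflow 2.4+; dag_id, owner, and the ELT step are placeholders.
import logging
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

log = logging.getLogger(__name__)

def notify_on_failure(context):
    # Airflow passes the task context to failure callbacks; in practice this
    # would page or post to a channel; here it only emits a structured log line.
    ti = context["task_instance"]
    log.error("Task %s in DAG %s failed for %s", ti.task_id, ti.dag_id, context["ds"])

def load_to_snowflake():
    # Placeholder for the actual ELT step (e.g., kicking off an IICS task or SQL).
    pass

default_args = {
    "owner": "data-platform",            # illustrative
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": notify_on_failure,
}

with DAG(
    dag_id="example_elt_pipeline",       # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)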

 

 

Required Qualifications

  • 3–7+ years in DevOps, Cloud Engineering, or Data Platform Engineering roles

 

Hands-on experience with:

  • Snowflake (roles, warehouses, performance tuning, cost control)
  • Apache Airflow (DAG orchestration, monitoring, deployments)
  • Informatica (IICS pipeline deployment automation preferred)
  • Strong CI/CD skills using GitLab, GitHub Actions, Azure DevOps, Jenkins, or similar
  • Proficiency with Terraform, Python, and Shell scripting (a short Python deployment sketch follows this list)
  • Deep understanding of cloud platforms: AWS, Azure, or GCP
  • Experience with containerization (Docker, Kubernetes), especially for Airflow
  • Strong knowledge of networking concepts and security controls
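
To give a concrete flavour of the Python scripting and Snowflake deployment work listed above, here is a minimal sketch of promoting versioned SQL scripts into a target Snowflake environment from a CI/CD runner. It assumes the snowflake-connector-python package; the environment-variable names, the ANALYTICS_<ENV> database naming, and the migrations directory are hypothetical, not taken from this posting.

# Minimal sketch: apply versioned SQL scripts to a Snowflake environment.
# Assumes snowflake-connector-python; credentials come from CI/CD-provided
# environment variables (the SNOWFLAKE_* names here are illustrative).
import glob
import os

import snowflake.connector

def apply_scripts(script_dir: str, environment: str) -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role=os.environ.get("SNOWFLAKE_ROLE", "DEPLOY_ROLE"),
        warehouse=os.environ.get("SNOWFLAKE_WAREHOUSE", "DEPLOY_WH"),
        database=f"ANALYTICS_{environment.upper()}",  # e.g. ANALYTICS_DEV / ANALYTICS_PROD
    )
    try:
        # Apply scripts in lexical order so 001_*.sql, 002_*.sql run deterministically;
        # execute_string handles files that contain more than one SQL statement.
        for path in sorted(glob.glob(os.path.join(script_dir, "*.sql"))):
            with open(path) as f:
                conn.execute_string(f.read())
            print(f"applied {path} to {environment}")
    finally:
        conn.close()

if __name__ == "__main__":
    apply_scripts("migrations", os.environ.get("TARGET_ENV", "dev"))

In a pipeline, the same script would run against dev, test, and prod by changing only the injected credentials and TARGET_ENV, which is one way of keeping code promotion between environments automated and repeatable.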

 

Preferred Knowledge, Skills & Abilities:

  • Experience migrating from SQL Server or other legacy DW platforms
  • Knowledge of FinOps practices for Snowflake usage optimization
  • Background in healthcare, finance, or other regulated industries is a plus

 

Soft Skills

  • Effective communication with technical and non-technical stakeholders
  • Ability to troubleshoot complex distributed data workloads
  • Strong documentation and cross-team collaboration skills
  • Proactive and committed to process improvement and automation
  • Detail-oriented, with a strong focus on data accuracy

 

Education and Certification:

  • Bachelor's degree, or equivalent years of experience, in Computer Science, Information Systems, Data Engineering, Health Informatics, or a related field.

 

 

 
