Hi,
Hope you are doing well.
We have an urgent opening for the role below. Please share your updated resume if you are open to new positions.
Title - AWS Cloud Engineer
Location - Malvern, PA
Long Term
As discussed, please find the JD below.
There will be only one interview round.
Title: Senior Cloud Engineer
Responsibilities:
- Work with a team of AWS experts to implement and roll out the Collibra Data Governance System on cloud infrastructure
- Understand the various technology complexities and dependencies around installation, security, and automation, and provide solid solutions
- Install and set up a MuleSoft-based Collibra Connect server on AWS
- Integrate Collibra with enterprise AD for user authentication and authorization
- Secure all modes of data transport per Collibra's recommendations and the client's established standards
- Document all work, provide knowledge transfer, and build automated testing
- Provide guidance to other Cloud Engineers and Data Governance Analysts to design and implement a comprehensive solution
Required Skills:
- 10+ years of overall software development experience
- Must have 3+ years of AWS cloud infrastructure experience using AWS best practices
- Must have Python experience automating AWS infrastructure
- Working knowledge of Glue, EC2, S3, VPC, TLS, IAM, CloudFormation, MuleSoft, and cross-account access
- Experience developing CI/CD pipelines using Bamboo, Ansible, Jenkins, Maven, and Bitbucket
- Proven experience with cloud authentication, authorization, and Single Sign-On
- Ability to work independently, collaborate with other infrastructure teams, and navigate large organizations
Additional Skills:
- Exposure to Collibra or similar Data Governance system implementations is preferred
- Experience in the financial industry is a plus
- Exposure to Java/Python programming, Unix, RDBMS, JIRA, Confluence
Title: Mid-level Cloud Engineer
Responsibilities:
- Work with a team of AWS experts to implement and roll out the Collibra Data Governance System on cloud infrastructure
- Develop CI/CD pipelines to automate infrastructure deployments
- Using MuleSoft Anypoint Studio, develop connectors for AWS Glue, RDBMS, and similar sources
- Deploy and run connectors to populate Collibra catalogs
- Document all work, provide knowledge transfer, and build automated testing
- Collaborate with other Cloud Engineers and Data Governance Analysts to design and implement a comprehensive solution
Required Skills:
- 7+ years of overall software development experience
- Must have 3+ years of AWS cloud infrastructure experience using AWS best practices
- Must have Python experience automating AWS infrastructure
- Working knowledge of Glue, EC2, S3, CloudFormation, and MuleSoft Anypoint Studio
- Experience developing CI/CD pipelines using Bamboo, Ansible, Jenkins, Maven, and Bitbucket
- Ability to work independently, collaborate with other teams, and navigate large organizations
Additional Skills:
- Exposure to Collibra and MuleSoft is preferred
- Experience in the financial industry is a plus
- A Master's in Software Engineering or a related discipline is preferred
- Exposure to Java/Python programming, Unix, RDBMS, JIRA, Confluence
Title - Data Engineer
Location - Malvern, PA
Interview - Only one Round
Long Term Contract
Experience and Qualifications:
- 3+ years of hands-on experience designing and deploying an AWS-based application (native/re-factored)
- 8+ years with Python, Scala, Spark, Oozie, Big Data
- Expertise in the core AWS services, uses, automation, and architecture best practices
- Proficiency in designing, developing, and deploying cloud-based Big Data solutions using AWS
- Experience developing and maintaining applications written for Amazon Simple Storage Service, Amazon Simple Queue Service, Amazon Simple Notification Service, Amazon Simple Workflow, Amazon API Gateway, AWS Elastic Beanstalk, and AWS CloudFormation
- Proficiency with Amazon compute and storage instances
- Experience with S3 server-side encryption, IAM and policies, CloudTrail, and CloudWatch
- Experience with EMR and Glue
- Experience setting up Kinesis streams and integrating them with CDC tools (Attunity preferred)
- 5+ years working with Big Data (Hadoop, Cloudera, HBase)
- Proficiency in highly available, fault-tolerant, and DR architectures
- Good working knowledge of and experience with data stores such as DynamoDB, S3, and PostgreSQL
- Experience with DevOps CI/CD using Jenkins, Bamboo, or CodeDeploy
- AWS Developer or Solutions Architect certification is a plus but not required
- Experience with Atlassian stack highly preferred
- Extensive experience getting data ready for BI consumption and analyzing the data to build reports
- Experience migrating reports and queries across platforms is highly desired (e.g., migrating from Cognos to Tableau or Qlik)
Job responsibilities:
- Design, develop, and deliver scalable, automated data pipelines to ingest data from multiple sources, including but not limited to on-prem DB2, SaaS clouds, Genesys, and Tealeaf
- Familiarity with ingesting and loading data using the Oozie workflow manager and cloud-native ingestion services
- Code and enable data stores on S3
- Leverage IAM roles & policies for service authentication
- Build load, transformation, and validation logic in EMR (Spark/Scala) and Lambda (Python)
- Build the necessary infrastructure to provision query clusters using the existing architecture
- Migrate on-prem Hadoop data and queries to AWS
- Promote serverless code where appropriate
- Perform data quality evaluations based on the source data
- Build reports with BI tools such as Tableau, QuickSight, Qlik
Thanks,
Karan
925 310 7579