Hiring - Senior Data Engineer – Investments Data Platform (Snowflake) - New York, NY (Hybrid)

Hello Folks,

 

Hope you are doing great!

 

This is Himank Jani from ApTask.

 

We have an urgent requirement with one of our clients. Please review the job description below and, if you have any relevant candidates on your bench, kindly share their profiles.

 

Local candidates only.

Please share each candidate's work authorization status and current location along with the resume.

 

Role: Senior Data Engineer – Investments Data Platform (Snowflake)

Location: New York, NY (Hybrid). No remote

Rate: $63/hr on C2C.

Client: Mphasis

 

Note:

  • Expertise in data engineering with a focus on databases and ETL/ELT.
  • Data modeling and data governance.
  • Data quality and reconciliation.

 

Who are we looking for?

We are looking for an engineer with experience building enterprise‑scale data platforms, preferably within asset management, capital markets, or financial services, who will play a critical role in shaping the architecture, development standards, and long‑term scalability of this strategic investments data platform.

 

Key Responsibilities

Data Platform & Pipeline Engineering

  • Design, build, and maintain robust, scalable ELT pipelines on Snowflake to ingest data from IBOR/ABOR systems, custodians, market data providers, and internal platforms.
  • Implement end‑to‑end ingestion, standardization, and transformation logic using Airflow, dbt, Python, and Snowflake stored procedures.
  • Manage schema evolution, metadata, and versioning to support changing upstream data structures without disrupting downstream consumers.
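As a rough illustration of the standardization and schema-evolution bullets above, here is a minimal pure-Python sketch of drift-tolerant standardization. Column names are hypothetical; in the actual stack this logic would live in dbt models, Python tasks, or Snowflake stored procedures.

```python
# Minimal sketch: standardize raw upstream records against a target schema,
# tolerating schema drift (new columns are captured, not fatal).
# Column names here are hypothetical examples.

TARGET_SCHEMA = {"account_id": str, "security_id": str, "quantity": float}

def standardize(record: dict) -> dict:
    """Map a raw record onto the target schema; unknown fields go to _extras."""
    out = {}
    for col, cast in TARGET_SCHEMA.items():
        out[col] = cast(record[col]) if col in record else None
    # Preserve unexpected upstream columns instead of failing the pipeline,
    # so downstream consumers are not disrupted by upstream changes.
    out["_extras"] = {k: v for k, v in record.items() if k not in TARGET_SCHEMA}
    return out

row = standardize(
    {"account_id": "A1", "security_id": "XS123", "quantity": "100", "new_field": 1}
)
```

The point of the sketch is the contract: downstream consumers always see the target columns, while new upstream fields are quarantined rather than breaking the load.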

 

Data Modeling & Analytics Enablement

  • Design and implement investment‑focused data models to support holdings, transactions, cash flows, exposure analytics, and performance reporting.
  • Build analytics‑ready, well‑documented data structures optimized for portfolio management, finance, risk, and regulatory use cases.
  • Partner with reporting, analytics, and investment operations teams to translate business reporting requirements into scalable technical designs.

 

Data Quality, Governance & Controls

  • Establish and enforce data quality checks, reconciliation controls, and validation frameworks to ensure accuracy and completeness of financial data.
  • Implement governance standards including lineage, auditability, and data certification aligned with regulatory and internal control expectations.
  • Support operational readiness by enabling monitoring, alerting, and issue resolution across data pipelines.
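The reconciliation controls described above can be sketched as a small pure-Python check. Field names are hypothetical; in practice the two inputs would be Snowflake query results from source and target systems.

```python
# Minimal sketch: a source-vs-target reconciliation control that flags
# per-key differences in a summed measure exceeding a tolerance.

def reconcile(source_rows, target_rows, key, measure, tolerance=0.0):
    """Return {key: difference} for keys where source and target disagree."""
    def totals(rows):
        agg = {}
        for r in rows:
            agg[r[key]] = agg.get(r[key], 0.0) + r[measure]
        return agg
    src, tgt = totals(source_rows), totals(target_rows)
    breaks = {}
    for k in set(src) | set(tgt):
        diff = src.get(k, 0.0) - tgt.get(k, 0.0)
        if abs(diff) > tolerance:
            breaks[k] = diff
    return breaks

# P1 nets out (100 + 50 vs 150); P2 is missing from the target entirely.
breaks = reconcile(
    [{"pf": "P1", "mv": 100.0}, {"pf": "P1", "mv": 50.0}, {"pf": "P2", "mv": 10.0}],
    [{"pf": "P1", "mv": 150.0}],
    key="pf", measure="mv",
)
```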

 

Performance & Scalability

  • Optimize Snowflake performance and cost efficiency for large‑scale financial datasets through clustering, partitioning, and query tuning.
  • Contribute to platform architecture decisions, development standards, and best practices to ensure long‑term scalability and maintainability.

 

Collaboration & Leadership

  • Work closely with investment operations, finance, reporting, and analytics stakeholders to ensure platform alignment with business needs.
  • Act as a senior technical contributor in a lean, high‑impact environment, influencing design decisions and mentoring junior engineers as needed.

 

 

 

Required Qualifications

  • 10–15 years of experience building enterprise‑scale data platforms, preferably within asset management, capital markets, or financial services.
  • Deep hands‑on experience with Snowflake, including performance optimization and advanced SQL development.
  • Strong experience with modern ELT frameworks such as dbt and orchestration tools like Airflow.
  • Advanced proficiency in SQL and Python for data transformation, automation, and control frameworks.
  • Proven experience designing complex financial data models supporting holdings, transactions, cash flows, and exposure analytics.
  • Solid understanding of IBOR/ABOR concepts, investment operations, and downstream reporting requirements.
  • Experience implementing data quality, reconciliation, and governance controls in regulated environments.

 

Preferred Qualifications

  • Experience working with custodian data feeds, market data, and investment accounting platforms.
  • Familiarity with regulatory reporting and audit requirements in investment management.
  • Exposure to cloud‑native data architectures and cost optimization strategies.
  • Strong documentation and stakeholder communication skills.

 

Behavioral & Leadership Competencies

  • Strong analytical problem‑solving; able to resolve complex technical and operational issues.
  • Executive‑level communication and stakeholder management.
  • Self‑motivated, outcome‑oriented, with a strong transformation and delivery focus.
  • Collaborates effectively across matrixed teams; drives client satisfaction and trust.

 

 

Best Regards,

Himank Deepak Jani

ApTask | A global, diversity-certified workforce solutions provider.

Address: 120 Wood Ave South, Suite # 300, Iselin, NJ 08830

 

This e-mail and any attachments may be confidential, proprietary or legally privileged. Any review, use, disclosure, distribution or copying of this e-mail is prohibited except by or on behalf of the intended recipient. If you received this message in error or are not the intended recipient, please delete or destroy the e-mail message and any attachments or copies and notify the sender of the erroneous delivery by return e-mail. It shall not attach any liability on the sender or ApTask or its affiliates. Any views or opinions presented in this email are solely those of the sender and may not necessarily reflect the opinions of ApTask or its affiliates.

 

Candidate Data Collection Disclaimer:
At ApTask, we prioritize safeguarding your privacy. As part of our recruitment process, certain Personally Identifiable Information (PII) may be requested by our clients for verification and application purposes. Rest assured, we strictly adhere to confidentiality standards and comply with all relevant data protection laws. Please note that we only collect the necessary information as specified by each client and do not request sensitive details during the initial stages of recruitment.

If you have any concerns or queries about your personal information, please feel free to contact our compliance team at compliance@aptask.com.

Applicant Consent:
By submitting your application, you agree to ApTask's (www.aptask.com) Terms of Use and Privacy Policy, and provide your consent to receive SMS and voice call communications regarding employment opportunities that match your resume and qualifications. You understand that your personal information will be used solely for recruitment purposes and that you can withdraw your consent at any time by contacting us at 732-355-8000 or help@aptask.com. Message frequency may vary. Msg & data rates may apply.

 

 

Local to PA - Lead Performance Engineer - Philadelphia, PA (Hybrid Onsite)

Hi,


Hope you're doing well.

My name is Mayank, and I am a Technical Recruiter from Empower Professionals Inc. We are sourcing for a "Lead Performance Engineer" role located in Philadelphia, PA; it's a long-term contract with possible extensions.

 

If you have any suitable profiles, please share their updated resumes with the candidate's location, work authorization and expected rate so that we can proceed.

 

Role: Lead Performance Engineer (CGEMJP00335873)
Duration: 12 Months
Location: Philadelphia, PA (Hybrid Onsite)

Job Description:
Perf/SRE Skill Set:

  • Performance Engineering & RCA: Strong hands-on expertise in end-to-end performance engineering, including proactive monitoring, deep-dive root cause analysis, and optimization across frontend (UX) and backend layers, covering latency, throughput, and scalability issues.
  • APM, RUM & Observability Tools: Extensive experience with APM and RUM tools such as AppDynamics, along with log aggregation and analytics platforms like ELK Stack (Elasticsearch, Logstash, Kibana) and Grafana for real-time monitoring, troubleshooting, and performance insights.
  • Cloud & Distributed Systems: Solid understanding of AWS and Azure services, with experience analyzing cloud-native and hybrid architectures, identifying bottlenecks, and optimizing resource utilization, load balancing, and scalability.
  • Frontend Performance Optimization: Strong knowledge of Core Web Vitals (LCP, INP, CLS), browser performance metrics, and frontend optimization techniques including asset optimization, rendering performance, and network efficiency.
  • System Architecture & Design Optimization: Proven ability to quickly understand end-to-end application architecture and drive performance improvements at the system design level, including service interactions, data flows, caching strategies, and scalability patterns.
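As a small illustration of the Core Web Vitals bullet, here is a pure-Python classifier using the published good / needs-improvement / poor thresholds for LCP, INP, and CLS (thresholds as documented in Google's web.dev guidance; this is a sketch, not part of any client codebase).

```python
# Minimal sketch: classify Core Web Vitals samples against the published
# thresholds. Each metric has a (good, poor) boundary pair.

THRESHOLDS = {
    "LCP": (2.5, 4.0),   # Largest Contentful Paint, seconds
    "INP": (200, 500),   # Interaction to Next Paint, milliseconds
    "CLS": (0.1, 0.25),  # Cumulative Layout Shift, unitless score
}

def rate(metric: str, value: float) -> str:
    """Return 'good', 'needs improvement', or 'poor' for one sample."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    return "needs improvement" if value <= poor else "poor"
```

In RUM practice the same bucketing is applied at the 75th percentile of page loads, which is the level APM/RUM dashboards typically report.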

 

In compliance with the salary transparency law, the expected pay range for this role is $60 - 70/hr. Actual compensation depends on experience and interview evaluation.

Thanks

Mayank Verma

Senior Technical Recruiter | Empower Professionals

......................................................................................................................................

mayank@empowerprofessionals.com | LinkedIn: https://www.linkedin.com/in/mayankdverma/

Fax: 732-356-8009 | 100 Franklin Square Drive – Suite 104 | Somerset, NJ 08873

www.empowerprofessionals.com

Certified NJ and NY Minority Business Enterprise (NMSDC)

Empower Professionals firmly opposes e-mail "spamming". We apologize to those who do not wish to receive this e-mail and also to those who have accidentally received it again. Please reply with "REMOVE" in the subject listing, with all aliases email addresses that you would want removed and any inconvenience caused is highly regretted. We appreciate your patience and cooperation. This e-mail and any files transmitted with it are for the sole use of the intended recipient(s) and may contain confidential and privileged information. If you are not the intended recipient(s), please reply to the sender and destroy all copies of the original message. Any unauthorized review, use, disclosure, dissemination, forwarding, printing or copying of this email, and/or any action taken in reliance on the contents of this e-mail is strictly prohibited and may be unlawful. 

 

Hiring: Oracle CCB Testing Lead | New York (Day 1 Onsite)


📍 Location: New York, NY (Relocation OK)
💼 Experience: 8–10 Years


🔹 Must Have:

• Oracle Customer Care & Billing (CCB)
• Utilities Domain

🔹 Nice to Have:

• SQL, ADO
• Functional / Integration / System / Regression Testing


🔹 Role:

Lead end-to-end testing for Oracle CCB/MDM (CIS) implementations and upgrades


🔹 Top 3 Responsibilities:

• Drive test strategy, planning & execution across all testing phases
• Oversee test case design, execution & defect management
• Collaborate with business & technical teams to ensure quality delivery

Best Regards,
Email ID: teja.a@siriinfoinc.com 
LinkedIn: www.linkedin.com/in/sri-teja-reddy-ala-535621258


--
You received this message because you are subscribed to the Google Groups "C2Cbenchrecruiters" group.
To unsubscribe from this group and stop receiving emails from it, send an email to c2c_benchrecruiters+unsubscribe@googlegroups.com.
To view this discussion visit https://groups.google.com/d/msgid/c2c_benchrecruiters/bf854e85-6099-4b5d-99aa-edbfca14aa2fn%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Data Engineer (PySpark + AWS + Iceberg) :- Chicago, IL – Only Local – No Relocation

Hello Folks,

 

Hope you are doing great!

 

This is Himank Jani from ApTask.

 

We have an urgent requirement with one of our clients. Please review the job description below and, if you have any relevant candidates on your bench, kindly share their profiles.

 

Local candidates only.
(PySpark + AWS + Iceberg) experience is a must; please do not share irrelevant resumes.

Please ensure that all profiles shared include details such as current location and work authorization status. Profiles without this information may not receive a response.

Job Title: Data Engineer (PySpark + AWS + Iceberg)

Location: Chicago, IL – Only Local – No Relocation

Experience: 10+ years minimum

Rate: $63/hr on C2C

No. of Positions: 2

RTTO – 5 Days Onsite

 

Job Description:
Job Summary

We are looking for a skilled Data Engineer to design and build scalable data solutions using PySpark and AWS services. The ideal candidate will have hands-on experience in building modern data platforms using Apache Iceberg and implementing Medallion architecture on AWS.

 

Key Responsibilities

  • Design and implement end-to-end data solutions using PySpark, ensuring scalability and performance.
  • Build and manage data pipelines using AWS services such as AWS Glue, EMR, and Lambda.
  • Develop data products using PySpark + AWS Glue stack.
  • Implement Medallion Architecture (Bronze, Silver, Gold layers) for structured data processing.
  • Work with Apache Iceberg tables for efficient data storage, versioning, and schema evolution.
  • Ensure data quality, governance, and optimization across pipelines.
  • Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
  • Optimize data processing jobs and improve performance and cost-efficiency on AWS.
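For candidates less familiar with the term, the Medallion (Bronze → Silver → Gold) flow in the responsibilities above can be sketched in plain Python. A real implementation would use PySpark DataFrames on Glue/EMR writing to Iceberg tables; the row and field names below are invented for illustration.

```python
# Pure-Python sketch of the Medallion layering: raw Bronze rows are cleansed
# and typed into Silver, then aggregated into a business-level Gold summary.

bronze = [  # raw ingested records, stored as-received
    {"id": "1", "amount": "10.5", "region": "east"},
    {"id": "2", "amount": "bad", "region": "east"},
    {"id": "3", "amount": "4.5", "region": "west"},
]

def to_silver(rows):
    """Cleanse and type raw rows; drop records that fail validation."""
    silver = []
    for r in rows:
        try:
            silver.append({"id": r["id"], "amount": float(r["amount"]),
                           "region": r["region"]})
        except ValueError:
            continue  # would be quarantined in practice; dropped for brevity
    return silver

def to_gold(rows):
    """Aggregate cleansed rows into an analytics-ready summary by region."""
    gold = {}
    for r in rows:
        gold[r["region"]] = gold.get(r["region"], 0.0) + r["amount"]
    return gold

gold = to_gold(to_silver(bronze))
```

The design point is that each layer has a contract: Bronze is immutable raw history, Silver is validated and typed, Gold is shaped for consumption.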

 

Required Skills & Experience

  • Strong experience in PySpark for data processing and pipeline development.
  • Hands-on experience with AWS ecosystem (Glue, EMR, Lambda, S3).
  • Experience implementing Medallion Architecture.
  • Practical knowledge of Apache Iceberg or similar table formats.
  • Strong understanding of distributed data processing and big data frameworks.
  • Experience designing scalable and reliable data pipelines.
  • Good understanding of data modeling and ETL/ELT concepts.

 

Preferred Qualifications

  • Experience working outside of Databricks-only environments (ability to build solutions using native AWS stack).
  • Familiarity with modern data lake architectures and open table formats.
  • Knowledge of performance tuning and cost optimization in AWS.
  • Experience with CI/CD pipelines for data engineering workflows.

 

What the Client is Specifically Looking For

  • Engineers who can independently design solutions using PySpark (not limited to Databricks).
  • Strong expertise in AWS-native data engineering tools.
  • Hands-on implementation experience with Apache Iceberg (preferred over Delta).
  • Ability to build data products using Glue + PySpark stack.
  • Clear understanding and implementation of Medallion architecture using AWS services.

 

 

 

 

 

Best Regards,

Himank Deepak Jani

 

ApTask | A global, diversity-certified workforce solutions provider.

Address: 120 Wood Ave South, Suite # 300, Iselin, NJ 08830

 


 

 

Java Architect (Telecom Domain) – St. Louis, MO (Onsite)

Job Title: Java Architect (Telecom Domain) – St. Louis, MO (Onsite)
Experience: 8+ Years

Work Authorization: USC/GC
Job Description:
We are seeking an experienced Java Architect with a strong background in the Telecom domain to lead the design and development of scalable enterprise applications. The ideal candidate will play a key role in architecting robust solutions and guiding development teams.
Key Responsibilities:
Design and implement scalable, high-performance Java-based applications
Define architecture, frameworks, and best practices for development
Work closely with business stakeholders and technical teams
Lead code reviews and ensure adherence to coding standards
Integrate telecom systems (OSS/BSS) and APIs
Provide technical leadership and mentorship to the team
Required Skills:
Strong experience in Java, Spring Boot, Microservices architecture
Solid experience in Telecom domain (OSS/BSS systems)
Experience with REST APIs, cloud platforms (AWS/Azure), and distributed systems
Strong understanding of system design, scalability, and performance tuning
Experience with CI/CD pipelines and DevOps practices
Best Regards,
Email ID: teja.a@siriinfoinc.com 


Local to GA - Lead Data Engineer - Atlanta, GA (Onsite)

Hi,


Hope you're doing well.

My name is Mayank, and I am a Technical Recruiter from Empower Professionals Inc. We are sourcing for a "Lead Data Engineer" role located in Atlanta, GA; it's a long-term contract with possible extensions.

 

If you have any suitable profiles, please share their updated resumes with the candidate's location, work authorization and expected rate so that we can proceed.

 

Role: Lead Data Engineer

Duration: 12 Months
Location: Atlanta, GA (Onsite)

Job Description:

We are seeking a Lead Data Engineer to drive the ESP Migration initiative. This role requires deep expertise in modern data engineering practices, hands-on technical skills, and the ability to lead complex data transformation and migration efforts.

Key Responsibilities

  • Lead end-to-end data engineering tasks for the ESP Migration program.
  • Design, build, and optimize scalable data pipelines and workflows.
  • Collaborate with cross-functional teams to ensure seamless data movement and transformation.
  • Ensure data quality, performance, and reliability across all stages of migration.
  • Provide technical leadership and mentor team members throughout the project lifecycle.

 

Required Skills (Must-Have)

  • Strong foundation in core data engineering concepts and best practices.
  • Hands-on expertise in:
    • Python
    • SQL
    • Snowflake
    • Airflow
  • Proven experience leading data engineering initiatives in complex environments.

 

Preferred Skills (Nice-to-Have)

  • Exposure to Generative AI concepts and related tools/technologies.

 

 

In compliance with the salary transparency law, the expected pay range for this role is $60 - 70/hr. Actual compensation depends on experience and interview evaluation.

Thanks

Mayank Verma

Senior Technical Recruiter | Empower Professionals

......................................................................................................................................

mayank@empowerprofessionals.com | LinkedIn: https://www.linkedin.com/in/mayankdverma/

Fax: 732-356-8009 | 100 Franklin Square Drive – Suite 104 | Somerset, NJ 08873

www.empowerprofessionals.com

Certified NJ and NY Minority Business Enterprise (NMSDC)


 

Agentic AI Developer @ RTP, NC (Onsite)

 

We have the below requirement with our client. Kindly go through the job description and let me know your interest.

 

Position: Agentic AI Developer

Location: Research Triangle Park (RTP), North Carolina Onsite

Duration: Long Term Contract

 

LinkedIn and Passport Number are mandatory for submission.

 

LinkedIn profile must have been created before 2020.

 

Project: AI Analytics Platform

About the Role:
We are building an enterprise-grade AI-powered analytics platform that enables business users to query complex datasets through natural language. The platform uses a multi-agent architecture where specialized AI agents collaborate to route questions, generate SQL, execute queries against Snowflake, produce insights, and create dynamic visualizations - all autonomously.


We are looking for consultants who can contribute to the development of new domain agents, extend the platform's capabilities, and help scale the architecture to support additional business domains.


What You Will Work On

  • Develop and integrate new domain-specific AI agents into an existing multi-agent orchestration system built on LangGraph and LangChain
  • Design and implement SQL generation agents that translate natural language questions into precise Snowflake SQL, enforcing business rules, RBAC, and fiscal period logic
  • Build and consume MCP (Model Context Protocol) server integrations for secure, structured data access across enterprise data sources
  • Work with PromptQL and RAG/PageIndex patterns to improve query accuracy, context retrieval, and domain-specific grounding
  • Develop FastAPI endpoints and async workflows to support real-time query processing, background job execution, and frontend integration
  • Create intelligent data visualization pipelines that automatically select and generate the right chart type (heatmaps, bar charts, KPIs) based on query results and user intent
  • Write domain context configurations (JSON schemas) that define column mappings, business rules, valid values, and metric definitions for each data domain
  • Contribute to a plugin-based domain registry architecture that allows new agents to be added without modifying core orchestration code.
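The plugin-based domain registry mentioned in the last bullet can be sketched as follows. Class and function names are illustrative only, not the client's actual API; the idea is that new domain agents register themselves without modifying core orchestration code.

```python
# Minimal sketch of a plugin-style domain-agent registry.

DOMAIN_REGISTRY = {}

def register_domain(name):
    """Decorator: add an agent class to the registry under a domain name."""
    def wrap(cls):
        DOMAIN_REGISTRY[name] = cls
        return cls
    return wrap

@register_domain("finance")
class FinanceAgent:
    """A hypothetical domain agent; real ones would wrap LangGraph workflows."""
    def answer(self, question: str) -> str:
        return f"[finance] routed: {question}"

def route(domain: str, question: str) -> str:
    """Orchestrator looks up the agent by domain; no per-domain branching."""
    return DOMAIN_REGISTRY[domain]().answer(question)
```

Adding a new business domain then means shipping one new registered class plus its JSON domain-context config, with the router untouched.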

Required Technical Skills

  • Python - Expert-level proficiency; ability to write production-quality async code, work with complex class hierarchies, and debug multi-layer systems
  • Snowflake - Strong experience writing and optimizing analytical SQL; understanding of RBAC patterns, fiscal period logic, and aggregation queries
  • LangGraph / LangChain - Hands-on experience building stateful, multi-step AI agent workflows with tool calling, checkpointing, and conditional routing
  • RAG / PageIndex - Practical experience implementing retrieval-augmented generation pipelines for context-aware AI applications
  • MCP (Model Context Protocol) - Practical experience with MCP client/server patterns for structured tool-to-data communication
  • Snowflake Cortex - Experience with Snowflake Cortex AI functions (COMPLETE, EXTRACT_ANSWER, SUMMARIZE, SENTIMENT) and building AI/ML workflows natively within Snowflake
  • PromptQL - Experience with structured prompt engineering and query language patterns for grounding LLM outputs in enterprise data
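As a toy illustration of the fiscal-period logic referenced above, here is a sketch that stamps a fiscal-quarter filter into generated SQL. It assumes one common convention (fiscal year starting in February, labeled by its ending calendar year); the table and column names are hypothetical, and a real agent would produce the SQL via an LLM grounded in the domain context config.

```python
# Minimal sketch: derive a fiscal-quarter label and embed it in a
# Snowflake-style query. Convention and schema names are assumptions.

from datetime import date

def fiscal_quarter(d: date, fy_start_month: int = 2) -> str:
    """Fiscal quarter label, assuming the FY starts in fy_start_month and is
    named for the calendar year in which it ends."""
    offset = (d.month - fy_start_month) % 12
    q = offset // 3 + 1
    fy = d.year + (1 if d.month >= fy_start_month else 0)
    return f"FY{fy}Q{q}"

def build_sql(metric: str, d: date) -> str:
    """Hypothetical generated query over an invented sales_fact table."""
    return (
        f"SELECT SUM({metric}) FROM sales_fact "
        f"WHERE fiscal_quarter = '{fiscal_quarter(d)}'"
    )
```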

Preferred Skills

  • Experience with Plotly for programmatic chart generation
  • Pandas for data transformation and pivot operations
  • Docker and containerized deployment workflows
  • Enterprise authentication patterns (OAuth 2.0, token management)
  • Experience working within large enterprise codebases with multiple contributors

Minimum Qualifications

  • 5+ years of professional Python development experience
  • 2+ years building AI/ML or LLM-powered applications in production
  • Demonstrated experience with at least two of: LangChain, LangGraph, PromptQL, or equivalent agent orchestration frameworks, MCP
  • Strong SQL skills with experience on Snowflake and Snowflake Cortex
  • Ability to work independently, understand existing architecture quickly, and deliver production-ready code with minimal supervision

 

 

Thanks & Regards

Srujan Burra

Email: srujan@sourcemantra.com

Source Mantra Inc | www.sourcemantra.com

295 Durham Ave, Suite # 201, South Plainfield, NJ 07080

Certified Minority Business Enterprise (MBE)

Network Cloud Engineer : Alpharetta, GA - Need Local only

Hello Folks,

 

Hope you are doing great!

 

This is Himank Jani from ApTask.

 

We have an urgent requirement with one of our clients. Please review the job description below and, if you have any relevant candidates on your bench, kindly share their profiles.

 

Local candidates only.

Please ensure that all profiles shared include details such as current location and work authorization status. Profiles without this information may not receive a response.

Role: Network Cloud Engineer

Location: Alpharetta, GA (Hybrid – 3 days a week onsite) – Only Local

 

The final interview will be in person (mandatory).

Interview process: one screening call (same day), then an in-person client interview.

 

Role Overview

We are seeking a highly skilled Senior Network Cloud Testing Engineer with deep expertise in IPv6 networking and hands-on experience across public cloud infrastructures (AWS, Azure, GCP). The ideal candidate will design and implement test strategies, develop lab testing frameworks, and validate IPv6 and dual-stack networking in cloud-native and enterprise environments. This role involves close collaboration with architecture teams and business units to ensure compliance with IPv6 readiness standards and optimize network security and performance.

 

Key Responsibilities

  • Design and execute test strategies for IPv6 validation across multi-cloud and hybrid cloud infrastructures.
  • Develop test automation frameworks using Terraform, Ansible, and CloudFormation.
  • Build and manage lab environments for hardware/software certification and PoCs.
  • Validate network DMZ designs, certify underlying hardware/software features, and enhance network security posture.
  • Perform data gathering and analytics for test results and prepare deployment templates.
  • Troubleshoot complex hardware/software issues and provide solutions.
  • Collaborate with cross-functional teams (cloud, compute, security, telemetry, desktop engineering) and senior management.
  • Provide escalation support to network operations for bug validation and vulnerability assessments.

 

Required Qualifications

  • 8+ years of experience in LAN/WAN network design, engineering, and feature testing/certification.
  • CCIE certification is mandatory.
  • CCNP certification or equivalent experience.
  • Hands-on experience with AWS, Azure, and GCP networking:
    • AWS: VPC IPv6 addressing, Internet Gateway, Transit Gateway, Direct Connect, Route 53
    • Azure: VNet IPv6, Load Balancer, ExpressRoute, NSGs, Network Watcher
    • GCP: Cloud Routers, IPv6 Access, NAT64, Interconnect, VPC Flow Logs
  • 3+ years of IPv6 networking and validation experience.
  • Strong knowledge of routing and switching (Cisco IOS-XE/XR/NXOS), TCP/IP, BGP, OSPF, MP-BGP, MPLS VPNs, VXLAN, Multicast.
  • Experience with DMVPN, IPSec, and encryption standards.
  • Familiarity with Python network programming (preferred).
  • Experience with job scheduling tools and Agile methodologies.
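As one concrete example of the NAT64 validation mentioned in the GCP bullet above, the RFC 6052 well-known prefix 64:ff9b::/96 embeds an IPv4 address in the low 32 bits. The stdlib `ipaddress` module can express the mapping; this is a lab-validation sketch, not a client tool.

```python
# Minimal sketch: NAT64 well-known-prefix address synthesis and reversal
# per RFC 6052, using only the stdlib ipaddress module.

import ipaddress

NAT64_PREFIX = ipaddress.IPv6Network("64:ff9b::/96")

def ipv4_to_nat64(v4: str) -> ipaddress.IPv6Address:
    """Place the 32-bit IPv4 address in the low bits of the /96 prefix."""
    base = int(NAT64_PREFIX.network_address)
    return ipaddress.IPv6Address(base | int(ipaddress.IPv4Address(v4)))

def nat64_to_ipv4(v6: str) -> ipaddress.IPv4Address:
    """Recover the embedded IPv4 address from a NAT64-mapped IPv6 address."""
    return ipaddress.IPv4Address(int(ipaddress.IPv6Address(v6)) & 0xFFFFFFFF)
```

Checks like this are useful in test automation for confirming that DNS64-synthesized answers and NAT64 translations agree end to end.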

 

Desired Skills

  • Certifications in AWS Advanced Networking, Azure Network Engineer, or GCP Cloud Network Engineer.
  • Experience with network security hardening and best practices.
  • Knowledge of test-driven development and CI/CD pipelines.
  • Strong written and verbal communication skills for stakeholder engagement.

 

 

 

Best Regards,

Himank Deepak Jani

 

ApTask | A global, diversity-certified workforce solutions provider.

 

Connect: (732) 576-0557 | himankj@aptask.com

Address: 120 Wood Ave South, Suite # 300, Iselin, NJ 08830

 


 

 

Job Title: HBITS-07-14698 - Senior Oracle Architect/Developer


Duration: 24 months

Location: Albany, NY(Day 1 onsite)


We need the candidate's I-797A, DL, visa stamping, education certificate (provisional), passport copy, and 3 references. Without these documents, the client will not accept resumes.

 

Requested Qualifications

84 months of experience using Oracle development tools such as SQL*Plus, SQL Developer, and SQL*Loader for programming and maintaining applications.

60 months of experience in Oracle PL/SQL, developing procedures and packages.

48 months of experience designing and implementing logical and physical database models, including indexes, primary keys, foreign keys, and constraints.

48 months of experience optimizing, performance tuning, and debugging PL/SQL.

24 months of experience applying Extract, Transform, and Load (ETL) techniques.
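As a generic illustration of the ETL requirement, here is a minimal extract-transform-load sketch using the stdlib sqlite3 module. The actual role uses Oracle tooling (PL/SQL, SQL*Loader), and the table and column names here are invented; the shape of the work (stage raw text, cast and validate, load typed rows) is the same.

```python
# Minimal ETL sketch: extract raw text rows from a staging table,
# transform (cast/validate), and load into a typed target table.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (emp_id TEXT, salary TEXT)")
conn.execute("CREATE TABLE employees (emp_id INTEGER PRIMARY KEY, salary REAL)")
conn.executemany("INSERT INTO staging VALUES (?, ?)",
                 [("1", "50000"), ("2", "62500.50"), ("x", "oops")])

# Transform + load: skip rows that fail casting, the way a DML error
# handler or bad-file would capture rejects in an Oracle load.
loaded = 0
for emp_id, salary in conn.execute("SELECT emp_id, salary FROM staging"):
    try:
        conn.execute("INSERT INTO employees VALUES (?, ?)",
                     (int(emp_id), float(salary)))
        loaded += 1
    except ValueError:
        continue
conn.commit()
```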


