
Hiring: Oracle CCB Testing Lead | New York (Day 1 Onsite)


📍 Location: New York, NY (Relocation OK)
💼 Experience: 8–10 Years


🔹 Must Have:

• Oracle Customer Care & Billing (CCB)
• Utilities Domain

🔹 Nice to Have:

• SQL, ADO
• Functional / Integration / System / Regression Testing


🔹 Role:

Lead end-to-end testing for Oracle CCB/MDM (CIS) implementations and upgrades


🔹 Top 3 Responsibilities:

• Drive test strategy, planning & execution across all testing phases
• Oversee test case design, execution & defect management
• Collaborate with business & technical teams to ensure quality delivery

Best Regards,
Email ID: teja.a@siriinfoinc.com 
LinkedIn: www.linkedin.com/in/sri-teja-reddy-ala-535621258


--
You received this message because you are subscribed to the Google Groups "C2Cbenchrecruiters" group.
To unsubscribe from this group and stop receiving emails from it, send an email to c2c_benchrecruiters+unsubscribe@googlegroups.com.
To view this discussion visit https://groups.google.com/d/msgid/c2c_benchrecruiters/bf854e85-6099-4b5d-99aa-edbfca14aa2fn%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Data Engineer (PySpark + AWS + Iceberg) :- Chicago, IL – Only Local – No Relocation

Hello Folks,

 

Hope you are doing great!

 

This is Himank Jani from ApTask.

 

We have an urgent requirement with one of our clients. Please review the job description below and let me know if you have any relevant candidates on your bench.

 

Need only local candidates.
(PySpark + AWS + Iceberg) experience is a must; please don't share irrelevant resumes.

Please ensure that all profiles shared include details such as current location and work authorization status. Profiles without this information may not receive a response.

Job Title: Data Engineer (PySpark + AWS + Iceberg)

Location : Chicago, IL – Only Local – No Relocation

Exp: 10+ Yrs Minimum

Rate: $63/hr on C2C

No. of Positions: 2

RTTO – 5 Days Onsite

 

Job Description:
Job Summary

We are looking for a skilled Data Engineer to design and build scalable data solutions using PySpark and AWS services. The ideal candidate will have hands-on experience in building modern data platforms using Apache Iceberg and implementing Medallion architecture on AWS.

 

Key Responsibilities

  • Design and implement end-to-end data solutions using PySpark, ensuring scalability and performance.
  • Build and manage data pipelines using AWS services such as AWS Glue, EMR, and Lambda.
  • Develop data products using PySpark + AWS Glue stack.
  • Implement Medallion Architecture (Bronze, Silver, Gold layers) for structured data processing.
  • Work with Apache Iceberg tables for efficient data storage, versioning, and schema evolution.
  • Ensure data quality, governance, and optimization across pipelines.
  • Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
  • Optimize data processing jobs and improve performance and cost-efficiency on AWS.

 

Required Skills & Experience

  • Strong experience in PySpark for data processing and pipeline development.
  • Hands-on experience with AWS ecosystem (Glue, EMR, Lambda, S3).
  • Experience implementing Medallion Architecture.
  • Practical knowledge of Apache Iceberg or similar table formats.
  • Strong understanding of distributed data processing and big data frameworks.
  • Experience designing scalable and reliable data pipelines.
  • Good understanding of data modeling and ETL/ELT concepts.

 

Preferred Qualifications

  • Experience working outside of Databricks-only environments (ability to build solutions using native AWS stack).
  • Familiarity with modern data lake architectures and open table formats.
  • Knowledge of performance tuning and cost optimization in AWS.
  • Experience with CI/CD pipelines for data engineering workflows.

 

What the Client is Specifically Looking For

  • Engineers who can independently design solutions using PySpark (not limited to Databricks).
  • Strong expertise in AWS-native data engineering tools.
  • Hands-on implementation experience with Apache Iceberg (preferred over Delta).
  • Ability to build data products using Glue + PySpark stack.
  • Clear understanding and implementation of Medallion architecture using AWS services.

 

 

 

 

 

Best Regards,

Himank Deepak Jani

 

ApTask | A global, diversity-certified workforce solutions provider.

Address: 120 Wood Ave South, Suite # 300, Iselin, NJ 08830

 

This e-mail and any attachments may be confidential, proprietary or legally privileged. Any review, use, disclosure, distribution or copying of this e-mail is prohibited except by or on behalf of the intended recipient. If you received this message in error or are not the intended recipient, please delete or destroy the e-mail message and any attachments or copies and notify the sender of the erroneous delivery by return e-mail. It shall not attach any liability on the sender or ApTask or its affiliates. Any views or opinions presented in this email are solely those of the sender and may not necessarily reflect the opinions of ApTask or its affiliates.

 

Candidate Data Collection Disclaimer:
At ApTask, we prioritize safeguarding your privacy. As part of our recruitment process, certain Personally Identifiable Information (PII) may be requested by our clients for verification and application purposes. Rest assured, we strictly adhere to confidentiality standards and comply with all relevant data protection laws. Please note that we only collect the necessary information as specified by each client and do not request sensitive details during the initial stages of recruitment.

If you have any concerns or queries about your personal information, please feel free to contact our compliance team at compliance@aptask.com.

Applicant Consent:
By submitting your application, you agree to ApTask's (www.aptask.com) Terms of Use and Privacy Policy, and provide your consent to receive SMS and voice call communications regarding employment opportunities that match your resume and qualifications. You understand that your personal information will be used solely for recruitment purposes and that you can withdraw your consent at any time by contacting us at 732-355-8000 or help@aptask.com. Message frequency may vary. Msg & data rates may apply.

 

 

Java Architect (Telecom Domain) – St. Louis, MO (Onsite)

Job Title: Java Architect (Telecom Domain) – St. Louis, MO (Onsite)
Experience: 8+ Years

Work Authorization: USC/GC
Job Description:
We are seeking an experienced Java Architect with a strong background in the Telecom domain to lead the design and development of scalable enterprise applications. The ideal candidate will play a key role in architecting robust solutions and guiding development teams.
Key Responsibilities:
• Design and implement scalable, high-performance Java-based applications
• Define architecture, frameworks, and best practices for development
• Work closely with business stakeholders and technical teams
• Lead code reviews and ensure adherence to coding standards
• Integrate telecom systems (OSS/BSS) and APIs
• Provide technical leadership and mentorship to the team
Required Skills:
• Strong experience in Java, Spring Boot, Microservices architecture
• Solid experience in Telecom domain (OSS/BSS systems)
• Experience with REST APIs, cloud platforms (AWS/Azure), and distributed systems
• Strong understanding of system design, scalability, and performance tuning
• Experience with CI/CD pipelines and DevOps practices
Best Regards,
Email ID: teja.a@siriinfoinc.com 


Local to GA - Lead Data Engineer - Atlanta, GA (Onsite)

Hi,


Hope you're doing well.

My name is Mayank, and I am a Technical Recruiter from Empower Professionals Inc. We are sourcing for a "Lead Data Engineer" role located in Atlanta, GA; it is a long-term contract with possible extensions.

 

If you have any suitable profiles, please share their updated resumes along with each candidate's location, work authorization, and expected rate so that we can proceed.

 

Role: Lead Data Engineer

Duration: 12 Months
Location: Atlanta, GA (Onsite)

Job Description:

We are seeking a Lead Data Engineer to drive the ESP Migration initiative. This role requires deep expertise in modern data engineering practices, hands-on technical skills, and the ability to lead complex data transformation and migration efforts.

Key Responsibilities

  • Lead end-to-end data engineering tasks for the ESP Migration program.
  • Design, build, and optimize scalable data pipelines and workflows.
  • Collaborate with cross-functional teams to ensure seamless data movement and transformation.
  • Ensure data quality, performance, and reliability across all stages of migration.
  • Provide technical leadership and mentor team members throughout the project lifecycle.

 

Required Skills (Must-Have)

  • Strong foundation in core data engineering concepts and best practices.
  • Hands-on expertise in:
    • Python
    • SQL
    • Snowflake
    • Airflow
  • Proven experience leading data engineering initiatives in complex environments.

 

Preferred Skills (Nice-to-Have)

  • Exposure to Generative AI concepts and related tools/technologies.

 

 

In compliance with the salary transparency law, the expected pay range for this role is $60–$70/hr. Actual compensation depends on experience and interview evaluation.

Thanks

Mayank Verma

Senior Technical Recruiter | Empower Professionals

......................................................................................................................................

mayank@empowerprofessionals.com | LinkedIn: https://www.linkedin.com/in/mayankdverma/

Fax: 732-356-8009 | 100 Franklin Square Drive – Suite 104 | Somerset, NJ 08873

www.empowerprofessionals.com

Certified NJ and NY Minority Business Enterprise (NMSDC)

Empower Professionals firmly opposes e-mail "spamming". We apologize to those who do not wish to receive this e-mail and also to those who have accidentally received it again. Please reply with "REMOVE" in the subject listing, with all aliases email addresses that you would want removed and any inconvenience caused is highly regretted. We appreciate your patience and cooperation. This e-mail and any files transmitted with it are for the sole use of the intended recipient(s) and may contain confidential and privileged information. If you are not the intended recipient(s), please reply to the sender and destroy all copies of the original message. Any unauthorized review, use, disclosure, dissemination, forwarding, printing or copying of this email, and/or any action taken in reliance on the contents of this e-mail is strictly prohibited and may be unlawful. 

 

Agentic AI Developer @ RTP, NC (Onsite)

 

We have the below requirement with our client. Kindly go through the job description and let me know your interest.

 

Position: Agentic AI Developer

Location: Research Triangle Park (RTP), North Carolina (Onsite)

Duration: Long Term Contract

 

LinkedIn and Passport Number are mandatory for submission.

 

LinkedIn profile must have been created before 2020.

 

Project: AI Analytics Platform

About the Role:
We are building an enterprise-grade AI-powered analytics platform that enables business users to query complex datasets through natural language. The platform uses a multi-agent architecture where specialized AI agents collaborate to route questions, generate SQL, execute queries against Snowflake, produce insights, and create dynamic visualizations - all autonomously.


We are looking for consultants who can contribute to the development of new domain agents, extend the platform's capabilities, and help scale the architecture to support additional business domains.


What You Will Work On

  • Develop and integrate new domain-specific AI agents into an existing multi-agent orchestration system built on LangGraph and LangChain
  • Design and implement SQL generation agents that translate natural language questions into precise Snowflake SQL, enforcing business rules, RBAC, and fiscal period logic
  • Build and consume MCP (Model Context Protocol) server integrations for secure, structured data access across enterprise data sources
  • Work with PromptQL and RAG/PageIndex patterns to improve query accuracy, context retrieval, and domain-specific grounding
  • Develop FastAPI endpoints and async workflows to support real-time query processing, background job execution, and frontend integration
  • Create intelligent data visualization pipelines that automatically select and generate the right chart type (heatmaps, bar charts, KPIs) based on query results and user intent
  • Write domain context configurations (JSON schemas) that define column mappings, business rules, valid values, and metric definitions for each data domain
  • Contribute to a plugin-based domain registry architecture that allows new agents to be added without modifying core orchestration code.

Required Technical Skills

  • Python - Expert-level proficiency; ability to write production-quality async code, work with complex class hierarchies, and debug multi-layer systems
  • Snowflake - Strong experience writing and optimizing analytical SQL; understanding of RBAC patterns, fiscal period logic, and aggregation queries
  • LangGraph / LangChain - Hands-on experience building stateful, multi-step AI agent workflows with tool calling, checkpointing, and conditional routing
  • RAG / PageIndex - Practical experience implementing retrieval-augmented generation pipelines for context-aware AI applications
  • MCP (Model Context Protocol) - Practical experience with MCP client/server patterns for structured tool-to-data communication
  • Snowflake Cortex - Experience with Snowflake Cortex AI functions (COMPLETE, EXTRACT_ANSWER, SUMMARIZE, SENTIMENT) and building AI/ML workflows natively within Snowflake
  • PromptQL - Experience with structured prompt engineering and query language patterns for grounding LLM outputs in enterprise data

Preferred Skills

  • Experience with Plotly for programmatic chart generation
  • Pandas for data transformation and pivot operations
  • Docker and containerized deployment workflows
  • Enterprise authentication patterns (OAuth 2.0, token management)
  • Experience working within large enterprise codebases with multiple contributors

Minimum Qualifications

  • 5+ years of professional Python development experience
  • 2+ years building AI/ML or LLM-powered applications in production
  • Demonstrated experience with at least two of: LangChain, LangGraph, PromptQL, or equivalent agent orchestration frameworks, MCP
  • Strong SQL skills with experience on Snowflake and Snowflake Cortex
  • Ability to work independently, understand existing architecture quickly, and deliver production-ready code with minimal supervision

 

 

Thanks & Regards

Srujan Burra

Email: srujan@sourcemantra.com

Source Mantra Inc | www.sourcemantra.com

295 Durham Ave, Suite # 201, South Plainfield, NJ 07080

Certified Minority Business Enterprise (MBE)

Network Cloud Engineer : Alpharetta, GA - Need Local only

Hello Folks,

 

Hope you are doing great!

 

This is Himank Jani from ApTask.

 

We have an urgent requirement with one of our clients. Please review the job description below and let me know if you have any relevant candidates on your bench.

 

Need only local candidates.

Please ensure that all profiles shared include details such as current location and work authorization status. Profiles without this information may not receive a response.

Role: Network Cloud Engineer

Location: Alpharetta, GA (Hybrid – 3 days a week onsite) – Only Local

 

The final interview will be in person (mandatory).

Interview process: one screening call the same day, then an in-person interview with the client.

 

Role Overview

We are seeking a highly skilled Senior Network Cloud Testing Engineer with deep expertise in IPv6 networking and hands-on experience across public cloud infrastructures (AWS, Azure, GCP). The ideal candidate will design and implement test strategies, develop lab testing frameworks, and validate IPv6 and dual-stack networking in cloud-native and enterprise environments. This role involves close collaboration with architecture teams and business units to ensure compliance with IPv6 readiness standards and optimize network security and performance.

 

Key Responsibilities

  • Design and execute test strategies for IPv6 validation across multi-cloud and hybrid cloud infrastructures.
  • Develop test automation frameworks using Terraform, Ansible, and CloudFormation.
  • Build and manage lab environments for hardware/software certification and PoCs.
  • Validate network DMZ designs, certify underlying hardware/software features, and enhance network security posture.
  • Perform data gathering and analytics for test results and prepare deployment templates.
  • Troubleshoot complex hardware/software issues and provide solutions.
  • Collaborate with cross-functional teams (cloud, compute, security, telemetry, desktop engineering) and senior management.
  • Provide escalation support to network operations for bug validation and vulnerability assessments.
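As a flavor of the IPv6 and dual-stack validation work above, basic address checks can be automated with Python's standard ipaddress module. The prefixes and addresses below are made-up test values, not any real environment's addressing plan:

```python
# Minimal IPv6 / dual-stack sanity checks using only the standard library.
# Addresses and prefixes are illustrative (2001:db8::/32 is documentation space).
import ipaddress

def in_allocated_prefix(addr: str, prefix: str) -> bool:
    """True if addr falls inside the allocated IPv6 prefix."""
    return ipaddress.ip_address(addr) in ipaddress.ip_network(prefix)

def is_dual_stack(addrs: list) -> bool:
    """True if a host exposes at least one IPv4 and one IPv6 address."""
    versions = {ipaddress.ip_address(a).version for a in addrs}
    return versions == {4, 6}

assert in_allocated_prefix("2001:db8:10::5", "2001:db8:10::/48")
assert not in_allocated_prefix("2001:db8:20::5", "2001:db8:10::/48")
assert is_dual_stack(["10.1.2.3", "2001:db8::1"])
print("IPv6 validation checks passed")
```

A real test framework would drive these assertions against addresses harvested from cloud APIs (VPC/VNet interfaces) rather than hard-coded strings.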

 

Required Qualifications

  • 8+ years of experience in LAN/WAN network design, engineering, and feature testing/certification.
  • CCIE Certification is mandatory
  • CCNP certification or equivalent experience.
  • Hands-on experience with AWS, Azure, and GCP networking:
    • AWS: VPC IPv6 addressing, Internet Gateway, Transit Gateway, Direct Connect, Route 53
    • Azure: VNet IPv6, Load Balancer, ExpressRoute, NSGs, Network Watcher
    • GCP: Cloud Routers, IPv6 Access, NAT64, Interconnect, VPC Flow Logs
  • 3+ years of IPv6 networking and validation experience.
  • Strong knowledge of routing and switching (Cisco IOS-XE/XR/NXOS), TCP/IP, BGP, OSPF, MP-BGP, MPLS VPNs, VXLAN, Multicast.
  • Experience with DMVPN, IPSec, and encryption standards.
  • Familiarity with Python network programming (preferred).
  • Experience with job scheduling tools and Agile methodologies.

 

Desired Skills

  • Certifications in AWS Advanced Networking, Azure Network Engineer, or GCP Cloud Network Engineer.
  • Experience with network security hardening and best practices.
  • Knowledge of test-driven development and CI/CD pipelines.
  • Strong written and verbal communication skills for stakeholder engagement.

 

 

 

Best Regards,

Himank Deepak Jani

 

ApTask | A global, diversity-certified workforce solutions provider.

 

Connect: (732) 576-0557 | himankj@aptask.com

Address: 120 Wood Ave South, Suite # 300, Iselin, NJ 08830

 


 

 

Job Title: HBITS-07-14698 - Senior Oracle Architect/Developer


Duration: 24 months

Location: Albany, NY (Day 1 onsite)


We need the candidate's I-797A, DL, visa stamping, education certificate (provisional), passport copy, and 3 references. Without these documents, the client will not accept the resume.

 

Requested Qualifications

  • 84 months' experience using Oracle development tools such as SQL*Plus, SQL Developer, and SQL*Loader for programming and maintaining applications.
  • 60 months' experience in Oracle PL/SQL and developing procedures and packages.
  • 48 months' experience designing and implementing logical and physical database models, including indexes, primary keys, foreign keys, and constraints.
  • 48 months' experience optimizing, performance tuning, and debugging PL/SQL.
  • 24 months' experience performing Extract, Transform, and Load (ETL) techniques.



OVS Technologies – Consultant Hotlist for C2C Engagements

Dear Recruiters,

 

OVS Technologies is pleased to present our current hotlist of consultants who are actively seeking new C2C opportunities. We invite you to review the list and reach out if any candidates meet your staffing needs.

To ensure you receive future updates, please add pavan@ovstechnologies.com to your distribution list.

 

Thank you for considering our consultants—we look forward to supporting your hiring goals.

 

HOTLIST - OVS TECHNOLOGIES

| Sl. No | Candidate Name | Technology | Exp | Location | Relocation | Visa |
|---|---|---|---|---|---|---|
| 1 | Sreenivasulu | Performance Engineer | 12+ | TX | Remote | H1B |
| 2 | Prema Sai | QA Engineer | 8+ | NC | Open | H1B |
| 3 | Lokesh | QA Engineer / ETL Tester | 10+ | NJ | Open | H1B |
| 4 | Suresh Babu | QA SDET | 13+ | NC | Hybrid | H1B |
| 5 | Harinadha | Senior QA Engineer | 14+ | VA | Open | H1B |
| 6 | Harish | Java Developer | 8+ | TX | Open | H1B |
| 7 | Bindhu Madhavi | Java Full Stack Developer | 3+ | VA | Open | OPT |
| 8 | Phani | Java Full Stack Developer | 5+ | OH | Open | OPT |
| 9 | Nitish | Java Developer | 5+ | MI | Open | OPT |
| 10 | Venkatesh | Java Full Stack Developer | 5+ | AZ | Remote | OPT |
| 11 | Niharika | Java Full Stack Developer | 5+ | NC | Open | OPT |
| 12 | Uday Shree | Java Developer | 7+ | VA | VA | H1B |
| 13 | Aruna | Full Stack Developer | 12+ | NJ | Open | H1B |
| 14 | Sharmila | Lead Java Software Developer / AWS Cloud Engineer | 25+ | VA | Depends on Location | USC |
| 15 | Tejas | SAP FICO | 10+ | NY | Open | H1B |
| 16 | Sivaraj | SAP TM | 13+ | TX | Remote | H1B |
| 17 | Sivaiah | SAP HANA BW/BI | 18+ | TX | Remote | TN |
| 18 | Shakeel | Data Engineer | 11+ | NC | Open | H1B |
| 19 | Sri Lasya | Sr. Data Engineer | 12+ | VA | Remote | TN |
| 20 | Muralidhar | AWS Data Engineer | 16+ | VA | VA | H1B |
| 21 | Sasidhar | Data Engineer | 9+ | TX | Remote | H1B |
| 22 | Ramya | AWS Cloud Operations Engineer | 9+ | CA | Remote | H4EAD |
| 23 | Srinivasarao | AWS Cloud Engineer | 16+ | VA | EST | H1B |
| 24 | Raghavendra | Cloud Engineer / DevOps / SRE | 18+ | VA | Remote/VA | GC |
| 25 | Hari | Data Analyst | 6+ | TX | TX | OPT |
| 26 | Sai Mounik | Data | 6+ | CT | Open | OPT |
| 27 | Vaheeda | Data Analyst / Scrum | 10+ | IA | Remote | H4EAD |
| 28 | Mahantesh | Data Analyst | 15+ | IL | EST | USC |
| 29 | Ramana | Data Quality Engineer | 12+ | VA | Remote | H4EAD |
| 30 | Hiranmayee | Business Data Analyst | 12+ | VA | VA | H4EAD |
| 31 | Vudatha | DevOps | 5+ | IA | Open | OPT |
| 32 | Apoorva | DevOps Engineer | 5+ | MD | MD | H4EAD |
| 33 | Sagar | DevOps/Cloud Engineer | 8+ | VA | VA/Remote | H1B |
| 34 | Bargav | Sr. Cloud/DevOps Engineer | 8+ | VA | DMV | TN |
| 35 | Anil | Python Developer | 11+ | AZ | Open | H1B |
| 36 | Siva Prasad Konduru | Principal Python Engineer | 15+ | AZ | Remote | H1B |
| 37 | Alekhya | Python AWS | 8+ | VA | Open | H4EAD |
| 38 | Prasanth | .NET Full Stack Developer | 4+ | VA | Open | OPT |
| 39 | Arun | .NET Developer | 13+ | VA | VA/Remote | GCEAD |
| 40 | Phanindra | .NET Developer | 10+ | AL | Remote | H1B |
| 41 | Shalini | Mainframe Developer | 7+ | CA | CA/Remote | H4EAD |
| 42 | Chandrashekar | Security Analyst | 19+ | NJ | Open | H1B |
| 43 | Praveen | Workday HCM Consultant | 8+ | SC | Remote | H1B |
| 44 | Gowthami | Workday Consultant | 10+ | TX | Hybrid | H4EAD |
| 45 | Manmohan | Power BI | 9+ | AZ | Open | H1B |
| 46 | Amit | Program Manager / Project Manager | 18+ | WA | Open | H1B |
| 47 | Sai Saran | Project Manager | 25+ | VA | DMV | USC |
| 48 | Sai Saranya | SRE Engineer | 7+ | TX | Open | OPT |
| 49 | Shashidhar | Software Developer | 2+ | VA | Open | STEM OPT |
 

 

Regards, 

Pavankumar

Desk:  +1-202-979-0403

Email: pavan@ovstechnologies.com

Linkedin : https://www.linkedin.com/in/pavan-kumar-0a1879220/

OVS Technologies Inc (OVS) | http://www.ovstechnologies.com

||"Doors To Digital" || An E-Verified Company

Corporate & East Coast Office: 22375 Broderick Drive Suite 165, Sterling VA - 20166

     

 
