Intuceo Requirements for Business Analyst, IMS DBA, DB2 DBA, MemSQL DBA, Druid DBA, Duck Creek Policy Developer, Machine Learning Engineer/AWS Engineer/Data Engineer/Data Scientist, Databricks/Cloud Data Engineer
Dear Partner,
We have the below requirements open with us. Please take a look and share your best-fit candidates' resumes, along with the rate, DL, and a copy of work authorization.
Role : Business Analyst
Location: Jacksonville, FL (onsite)
Duration: 6+ Months
Job Description :
We have an urgent requirement for 2 strong Business Analysts.
Ideally, both would be located in Jacksonville, FL, and would need experience with Agile practices and business requirements gathering on platforms such as SFDC, SFMC, and Drupal (web solutions).
Job Title: IMS DBA
Job Location: Boston, Massachusetts (Remote)
Contract: 12 months
Job Description: IMS DBA
Job Description
Title : DB2 DBA
Location: Boston, MA (or 100% Remote)
Contract : 12 months
Summary:
Manage DB2 Mainframe databases adhering to company regulations and security standards. Assist with testing of new/improved systems, as well as product upgrades. Attend meetings and interact regularly with management, team, and clients to continuously improve services.
Bachelor's degree/technical certification in Computer Science or a related field, or 5-7+ years of Mainframe DB2 Database Administration experience.
Required:
• Manage database definitions, monitor free space, refresh test systems, resolve performance issues, assist with database and related DB2 product upgrades.
• Monitor the Backup, Reorg, and RUNSTATS jobs regularly.
• Participate in Disaster Recovery Exercises.
• Provide database support to client Application Development teams.
• Provide database support to applicable production control teams.
• Continuously monitor database performance and proactively take steps to ensure good performance.
• ISO/ITIL experience in the areas of Incident, Problem, Change, Request, and RCA management.
• Experience in working within an ITSM System to address and update Incidents, Changes, and Requests according to required timeframes.
• Strong incident/problem determination skills.
• Experience in documenting RCAs (Root Cause Analyses).
• Experience in working with automation teams to generate auto ticketing of DB2 subsystem alerts.
• Experience in working across internal tower teams such as Storage, Security, OS, Networking, Operations, and Solution Architect Teams.
• Participation in the On-Call rotation schedule as required.
• JCL, ISPF and 3rd party DBA tool experience.
• Experience in tuning DB2 system performance.
• Experience in tuning partitioned and indexed DB2 tablespaces.
• Experience in reorganizing DB2 database subsystems.
• Verbal and written proficiency in English.
• Experience in working closely with client account teams.
• Strong communications skills required for daily client interfacing.
• Required to work onsite with the client at the client location.
• Experience with DB2 version 10 and the new capabilities of version 11.
• Project Management skills are a plus.
Role: MemSQL DBA
Experience: 8+ Years
Rate: $85/hr C2C
Location: Remote
P1: Architect maximum availability cluster configuration and implement as feasible
P1: Configure and administer data replication between clusters and generate reports to capture replication status
P1: Deploy Rowstore, Columnstore and Hybrid architectures as required
P1: Set up and manage cluster resources & pools, conduct performance analysis, and fine-tune
P1: Discuss and implement best practices for partition management
P2: Set up and optimize database backups using S3 or S3-like devices; validate backups to ensure they can be used for recovery
P2: Migrate/Copy data across clusters or databases
P2: Analyze query plans using Studio and/or command line interfaces and recommend query rewrite
P2: Implement security best practices - least privileged access, data encryption, access monitoring/logging etc.
P2: Analyze inter-node traffic
P3: Troubleshoot issues for stability/availability/performance by opening service requests with database vendor support
P3: Patch and upgrade database clusters
P3: Generate operational playbook and review with the team
Role: Druid DBA
Experience: 8+ Years
Rate: $85/Hr C2C
Location: Remote
Design
P1: Design all of the required building blocks with flow of data/objects to support various sizing and shaping requirements.
P1: Design automated HA configuration for Druid with load balancers, and for the metadata store (MySQL/PostgreSQL) using replica / Active-Active setups
P3: Develop and conduct failover tests to meet the availability SLAs
Build
P1: Build using the features of Apache Druid to achieve the desired design to meet the business requirements
P1: Define "golden" standard for all database environments and create standard MOPS
Automation
P1: Expertise with orchestration and automation tools - Jenkins, Docker, Ansible, etc.
P2: K8s and Docker expertise to manage a druid cluster on K8s
P1: Deploy with Ansible automation to manage Apache Druid components and maintain their configurations along with Deep storage/Zookeeper/metadata database configurations
Security
P1: Druid and Zookeeper Security/User Access Control and DB hardening
P3: Create tool to automate user account creation and password management
Alerting
P3: Design and setup Monitoring and Alerting (varying priority levels)
P3: Deploy NewRelic
Optimization
P2: Tuning expertise with segment sizing, indexing, roll-ups, partitioning, compaction, compression and query tuning
P2: Build and maintain data retention and compaction schedules per audit and security requirements
Development
P2: DRUID native query development and tuning skills on top of SQL
Optimization
P2: Develop and deploy stress testing procedures covering various application functionalities
P3: Druid historical metrics repository to pull Oracle AWR/ASH type reports
Troubleshooting
P2: Druid troubleshooting graphical workflow; Develop tool to show Druid real-time performance similar to how Spotlight does for SQL Server
P3: Monitor, provide 24x7 support, and maintain performance of the transitioned production system to meet SLAs
Requisites: Druid Solutions Architecture certification; Working knowledge of Ansible
Role: Duck Creek Policy Developer
Location: Remote
Rate: Open, Market Best
JD
- Responsible for all configuration work
- Design Duck Creek table structure mapping
- Task allocation to developers
- Review of completed work
- Monitor progress and making sure timelines are adhered to as planned
- Conduct configuration option analysis to support business requirements
- Configure Duck Creek software application per agreed design
- Provide input on effort and impact associated with potential configurations / customizations
- Support the Business Lead and Business Analysts with the implementation approach for the medium to critical functional requirements
- Support unit testing, system testing, UAT, and implementation project phases by providing defect fixes for coding errors
Role: Machine Learning Engineer/AWS Engineer/Data Engineer/Data Scientist
Location: 100% Remote (NY location not required)
Work Authorization: Any status is OK as long as they are willing to come on our payroll
Candidates should have strong experience with AWS
JD: Looking for ML/Data Engineers to build an ML accelerator on top of AWS infrastructure. The ideal candidate has experience deploying ML models developed through Amazon SageMaker at scale. This is primarily an ML engineering role, not a typical Data Scientist role limited to building models or working with algorithms.
You need to be familiar with:
- AWS-specific SDKs and APIs used for model deployment and load balancing
- How to design the infrastructure within the AWS components, and which databases and storage services are needed and why
- Challenges faced when live-upgrading a model that is already deployed in AWS
- Machine Learning tools and technologies, specifically in the AWS cloud
- Coding in Python
- AWS Lambda and Step Functions
- AWS Glue
- Amazon EMR for SageMaker
- DynamoDB and its importance in ML space
- Feature store
- AWS Workbench
- Jupyter
Knowledge of Cloud Platform AWS
Role: Databricks Cloud Data Engineer
Location: Detroit, Michigan
Duration: Long term
Rate: Market best.
Skills:
Thanks & Regards.
Vinay
Recruiter,
Phone No : 9042041368
vthadichettu@intuceo.com | www.intuceo.com
4110 Southpoint Blvd. Suite 124 Jacksonville, FL 32216