Intuceo Requirements: Python Developer, Java Developer, Data Engineer, Cloud Data Architect

Hi,

  

I hope you are doing well. We have the following requirements open with us. Please take a look.

 

 

Job Description: 

1.

Job Title: Python Developer

Location: Remote

Duration: Contract

 

Experience: 8+ years

• PySpark

• Python

• Database

• ETL

• Palantir

 

 

2.

Job Title: Java Developer

Location: Remote

Duration: Contract

 

 

Must-have skills: Struts/Tiles, Struts 2, Servlets/JSP, and JPA/Hibernate (5+ years); XML and CSS; Oracle PL/SQL, including both DML and DDL; JavaScript frameworks such as AngularJS, NodeJS, and React; Bootstrap; Git; Jenkins; Information Systems Development Methodology; database query tools and security; Unified Modeling Language; Agile; GIS

 

Requirements:
• Experience with Java application development, leveraging frameworks such as Struts/Tiles, Struts 2, Servlets/JSP, and JPA/Hibernate (5+ years);
• Experience in web development leveraging XML and CSS (5+ years);
• Experience with Oracle databases with the capability to write complex queries and develop complex PL/SQL database objects, including both DML and DDL (5+ years);
• Experience with JavaScript frameworks, such as AngularJS, NodeJS, and React (2+ years);
• Experience with responsive design frameworks, preferably Bootstrap;
• Experience using code repositories, preferably Git;
• Experience using continuous integration tools, preferably Jenkins;
• Experience with relational and object-oriented database designs;
• Experience with object-oriented design methodologies;
• Experience with Information Systems Development Methodology (ISDM);
• Experience with database query tools (e.g., TOAD, SQL Developer, SQL Navigator);
• Experience with database security, including role-based security;
• Experience with Unified Modeling Language (UML);
• Experience with Agile development, specifically Scrum, Extreme Programming (XP), and Kanban (preferred);
• Experience with jQuery (JavaScript library) (preferred);
• Experience with Virtualization, preferably Docker (preferred);
• Experience implementing GIS (Geographic Information System) applications (preferred);
• Experience maintaining WebLogic Server (preferred);
• Experience developing web services, preferably RESTful web services (preferred);
• Experience with environmental regulatory business processes and practices (preferred);
• Knowledge and understanding of DEP's technical environment (preferred);
• Oracle certification(s) (preferred);
• Bachelor's Degree in Computer Science, Information Systems or other Information Technology major, or equivalent work experience

 

 

3.

 

Role: Data Engineer (8+ years of experience)

   Job Description:
 
   We are looking for experienced developers with the key skill set below; the sooner we can fill this role, the better.
 

  • PySpark for implementation in Databricks – 2 to 3 years of experience
  • Python with OOP concepts exposure – 3 years of experience
  • Implementation knowledge covering Snowflake SQL and Teradata – a few years of experience
  • Understanding of CDC for performance tuning, and data quality checks before and after the fact

 

 

 

4.

Job Title: Cloud Data Architect

Location: Los Angeles, CA - Remote

Duration: Long Term Contract

Client: IBM

 

Job Description


Qualifications:
Familiarity with Data Modeling techniques and Data Warehousing standard methodologies and practices
Experience driving large-scale cloud enterprise data warehouse projects & establishing the source of truths for key entities with a focus on data quality metrics and connectedness of data across foundations
Strong experience collaborating with business stakeholders and users of multiple business domains such as Product, Sales, Operations, Finance, and Marketing to understand, document, and implement data & analytics needs
Hands-on experience with distributed systems such as Spark and Hadoop (HDFS, Hive, Presto, PySpark) to query and process data; advanced SQL, Python, and scripting skills; and experience with visualization tools such as Tableau
Experience with at least one major MPP or cloud database technology (Snowflake, Redshift, BigQuery)

Job Responsibilities:
Design and develop scalable data warehousing solutions, building ETL/ELT pipelines in Big Data environments (cloud, on-prem, hybrid)
Help architect data solutions/frameworks and define data models for the underlying data warehouse and data marts
Collaborate with Data Product Managers, Data Architects, and Data Engineers to design, implement, and deliver successful data solutions
Maintain detailed documentation of your work and changes to support data quality and data governance
Perform data analysis and profiling; meet with application SMEs to understand the source data model and translate it to a dimensional model
Our tech stack includes Azure/AWS, Snowflake, Databricks, Spark.



Thanks & Regards,

Vinay Yadav   

Recruiter

P. 904-204-0825

tyadav@intuceo.com | www.intuceo.com 

4110 Southpoint Blvd. Suite 124 Jacksonville, FL 32216 


  
