Senior ETL Developer, Beloit, WI

Agility ETL Developer Job Description

Beloit, WI (local candidates only)

A local driver's license (DL) is required.

Experience: 10 years



Position: ETL Developer (Data Conversions, SSIS, SQL Server, Python, ERP Expertise)

We're seeking a dynamic and detail-oriented ETL Developer with hands-on experience in large-scale data conversion projects. This role requires a proactive professional who has led or contributed to two or more recent data migration initiatives, particularly involving Db2 and IBM mainframe systems transitioning into MS SQL Server. The ideal candidate will also have strong ERP expertise, with a comprehensive understanding of ERP data models and concepts, and the ability to efficiently migrate data into a custom ERP system.

Requirements:

  • Data Conversion Expertise: Must have led or been significantly involved in data conversion projects in the last two roles, specifically migrating data from Teradata and IBM mainframe systems into SQL Server.
  • SSIS & SQL Server Mastery: Advanced experience in developing and optimizing SSIS packages, alongside proficiency in MS SQL Server with a focus on complex SQL query writing and database performance tuning. A minimum of 8-10 years of SSIS, ETL, and SQL development; knowledge of MS SQL Server 2012 or later; and experience with SSRS, SSIS, and T-SQL, including developing SSIS packages.
  • Python Proficiency: Strong Python skills to automate ETL processes and handle intricate data transformations (see the sketch following this list).
  • Data Mapping & Modeling: Demonstrated experience in conducting detailed data mapping and creating precise data models for migration, along with strong data analysis and data migration scripting experience.
  • Data Flow Diagrams: Ability to craft clear and effective data flow diagrams to illustrate migration and transformation processes.
  • Technical Documentation: Proven track record of writing clear, comprehensive technical design documents that communicate complex technical solutions to both technical and non-technical audiences.
  • ERP Data Expertise: Extensive experience working with ERP data models, with a deep understanding of core ERP entities like customers, suppliers, items, and transactions, and the ability to load data into custom ERP systems.
  • Collaboration & Communication: Exceptional communication skills with the ability to work directly with business users, architects, and cross-functional teams, ensuring alignment and understanding of data solutions.
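
To illustrate the Python proficiency bullet above, here is a minimal sketch of a batched extract-transform-load step into SQL Server. It assumes a pyodbc connection is available; the DSNs, table names, and column mappings are illustrative placeholders, not details of this engagement.

```python
# Minimal sketch of a Python-driven transformation step feeding SQL Server.
# The DSNs, table names, and column layout below are illustrative assumptions only.
import pyodbc

SOURCE_DSN = "DSN=LegacySource"     # hypothetical ODBC DSN for the legacy system
TARGET_DSN = "DSN=TargetSqlServer"  # hypothetical ODBC DSN for MS SQL Server

def transform(row):
    """Example transformation: trim code fields and normalize a missing status."""
    customer_id, status, name = row
    return (customer_id.strip(), (status or "UNKNOWN").upper(), name.strip())

def run_conversion(batch_size=5000):
    with pyodbc.connect(SOURCE_DSN) as src, pyodbc.connect(TARGET_DSN) as tgt:
        src_cur = src.cursor()
        tgt_cur = tgt.cursor()
        tgt_cur.fast_executemany = True  # speeds up bulk inserts to SQL Server
        src_cur.execute("SELECT CUST_ID, STATUS, CUST_NAME FROM LEGACY.CUSTOMER")
        while True:
            rows = src_cur.fetchmany(batch_size)
            if not rows:
                break
            tgt_cur.executemany(
                "INSERT INTO stg.Customer (CustomerId, Status, Name) VALUES (?, ?, ?)",
                [transform(r) for r in rows],
            )
            tgt.commit()

if __name__ == "__main__":
    run_conversion()
```

Batching keeps memory use predictable on large conversions and lets each batch commit independently, which simplifies restarting a failed run.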

Responsibilities:

  • Design and implement SSIS packages and ETL processes based on business requirements.
  • Apply strong problem-solving and analytical skills to understand business requirements and develop SQL Server solutions that work within technical boundaries.
  • Conduct comprehensive data mapping, data analysis, and data modeling to support ETL processes.
  • Use Python scripting to automate data transformation processes and streamline ETL workflows.
  • Write SQL and T-SQL scripts/statements to analyze and translate legacy ERP data into the new ERP platform.
  • Collaborate closely with business stakeholders, architects, and development teams to translate complex business requirements into robust data engineering solutions.

  • Create, debug, and execute T-SQL scripts and stored procedures that match data mapping specifications (see the mapping-driven load sketch after this list).
  • Create and maintain detailed data flow diagrams, showcasing the logical flow of data throughout the migration process.
  • Present insights and technical details to both technical and non-technical teams, ensuring transparent communication across all levels.
  • Perform quality assurance and database validation on all test and live conversions (see the reconciliation sketch after this list).
  • Leverage a strong understanding of ERP concepts to ensure seamless data interpretation and migration strategies.
  • Work with ERP data models, including customers, suppliers, items, and open transactions, ensuring data is accurately migrated into custom ERP systems.
  • Ability to learn on the fly: quickly and effectively integrate new information and skills to enhance personal and organizational performance, learning from both successes and failures and treating every experience as an opportunity to improve.
  • Create jobs for batch and real-time processing of data from internal and external sources.
  • Support software applications/operating systems through data research and debugging.
  • Participate in the testing process through test review and analysis, test witnessing, and certification of data.
  • Able to communicate with application technical leads and business users.
  • Ability to roll up sleeves, dig in and figure out complex problems.
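
As a concrete illustration of the mapping-driven loads referenced above, the following Python sketch generates and runs a T-SQL INSERT ... SELECT from a mapping dictionary. The mapping, schema and table names, and DSN are hypothetical assumptions, not specifics from this project.

```python
# Minimal sketch: driving a mapping-specified T-SQL load from Python.
# The mapping dictionary, schema names, and DSN are illustrative assumptions.
import pyodbc

# Hypothetical mapping spec: target column -> source expression
CUSTOMER_MAPPING = {
    "CustomerId":  "LTRIM(RTRIM(CUST_ID))",
    "Name":        "UPPER(CUST_NAME)",
    "CreditLimit": "COALESCE(CREDIT_LIM, 0)",
}

def build_load_statement(target_table, source_table, mapping):
    """Generate an INSERT ... SELECT that mirrors the data mapping specification."""
    cols = ", ".join(mapping.keys())
    exprs = ", ".join(mapping.values())
    return f"INSERT INTO {target_table} ({cols}) SELECT {exprs} FROM {source_table};"

def run_load(conn_str="DSN=TargetSqlServer"):  # hypothetical DSN
    sql = build_load_statement("erp.Customer", "stg.Customer_Raw", CUSTOMER_MAPPING)
    with pyodbc.connect(conn_str) as conn:
        cur = conn.cursor()
        cur.execute(sql)
        conn.commit()
        print(f"Loaded {cur.rowcount} rows into erp.Customer")

if __name__ == "__main__":
    run_load()
```

The same pattern extends to stored procedures: once a mapping specification is finalized, the generated statement can be folded into a procedure that is created, debugged, and executed per target entity.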
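
For the quality assurance and validation responsibility above (the reconciliation sketch), a simple pass can compare row counts between staging and target tables after each test or live conversion. The table pairs and DSN below are assumptions for illustration only.

```python
# Minimal sketch of post-conversion validation: reconcile row counts between
# staging and target tables. Table pairs and the DSN are illustrative assumptions.
import pyodbc

TABLE_PAIRS = [
    ("stg.Customer_Raw", "erp.Customer"),
    ("stg.Item_Raw",     "erp.Item"),
]

def reconcile(conn_str="DSN=TargetSqlServer"):  # hypothetical DSN
    with pyodbc.connect(conn_str) as conn:
        cur = conn.cursor()
        failures = []
        for source, target in TABLE_PAIRS:
            src_count = cur.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
            tgt_count = cur.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
            status = "OK" if src_count == tgt_count else "MISMATCH"
            print(f"{source} -> {target}: {src_count} vs {tgt_count} [{status}]")
            if status == "MISMATCH":
                failures.append(target)
        return failures

if __name__ == "__main__":
    failed = reconcile()
    raise SystemExit(1 if failed else 0)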

Additional Skills (Good to Have):

  • Apache NiFi: Knowledge of Apache NiFi for building efficient data flows and processing pipelines.
  • Apache Airflow: Familiarity with Apache Airflow for orchestrating complex ETL workflows and managing data pipelines (a small orchestration sketch follows this list).
  • Splunk: Experience with Splunk for monitoring, searching, and analyzing large volumes of data.
  • Apache Spark: Familiarity with Apache Spark for large-scale data processing and distributed computing.
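
As a brief example of the Apache Airflow familiarity listed above, the sketch below wires the conversion steps into a daily DAG using PythonOperator, assuming an Airflow 2.x environment. The DAG id, schedule, and task callables are illustrative placeholders.

```python
# Minimal Apache Airflow sketch: orchestrating the extract/load/validate steps
# as a daily DAG. Callables, DAG id, and schedule are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_legacy_data():
    """Placeholder: pull data from the legacy source into staging."""
    ...

def load_erp_tables():
    """Placeholder: run the mapping-driven T-SQL loads."""
    ...

def validate_conversion():
    """Placeholder: reconcile counts between staging and ERP tables."""
    ...

with DAG(
    dag_id="erp_data_conversion",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_legacy_data", python_callable=extract_legacy_data)
    load = PythonOperator(task_id="load_erp_tables", python_callable=load_erp_tables)
    validate = PythonOperator(task_id="validate_conversion", python_callable=validate_conversion)

    extract >> load >> validate
```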
