Role: Data Engineer
Location: Salt Lake City, Utah (remote)
Job description:
Highlights: Experience with IBM DataStage, Python, and Linux scripting.
- Responsible for maintaining enterprise-grade platforms that enable data-driven solutions.
- Search for ways to automate and maintain scalable infrastructure.
- Ensure delivery of highly available and scalable systems.
- Monitor all systems and applications and ensure optimal performance.
- Analyze and design technical solutions to address production problems.
- Participate in troubleshooting applications and systems issues.
- Identify, investigate, and propose solutions to technical problems.
- Develop, test, and modify software to improve the efficiency of data platforms and applications while providing technical support.
- Monitor system performance to maintain consistent uptime.
- Prepare and maintain necessary documentation.
- Participate in daily standups, team backlog grooming, and iteration retrospectives.
- Coordinate with data operations teams to deploy changes into production.
- At the highest level, may function as a team lead.
- Other duties as assigned.
Qualifications:
- 5+ years of experience in ETL (IBM DataStage preferred), SQL, UNIX/Linux, big data distributed systems, programming languages such as Python, and orchestration tools and processes, or other directly related experience.
- Extensive experience in data migration, data analysis, data transformation, conversion, interfaces, large-volume data loading (ETL techniques), database modeling, and SQL performance tuning.
- Significant experience in leveraging database tools to develop DDL scripts, stored procedures, and functions to create and alter database objects.
- Good understanding of entity-relationship database design, including data normalization and ERD modeling.
- Experience working with Greenplum, Oracle, SQL Server, DB2, Teradata, and delimited text files is helpful.
- Basic knowledge of cloud computing, search technology, building real-time data pipelines, and scheduling tools and processes.
- Hands-on experience with Git version control processes, data streaming technologies such as Kafka, and unstructured data handling preferred.
- Serve in the on-call ("goalie") rotation to support the production environment.
- Good analytical, organizational, and problem-solving skills.
- Ability and desire to learn new technologies quickly.
- Ability to elicit, gather, and analyze user requirements.
- Ability to work independently and collaborate with others at all levels of technical understanding.
- Able to meet deadlines.
- Good judgment and project management skills.
- Ability to communicate verbally and in writing with both technical and non-technical staff.
- Ability to work in a team environment and have good interpersonal skills.
- Ability to adapt to changing technology and priorities.
- Must be able to interpret, validate and map business requirements to an appropriate solution.