Hello All,
As discussed, please find the job details below and let me know if you have any consultants available.
Job Title: Senior Cloud/Data Engineer
Location: McLean, VA 22102 or Plano, TX (Local candidates only – In-person interview is required)
Duration: 12 Months (Possible Extensions)
Client: Freddie Mac
Interview Information – In-person interview
Division: IT
Call Notes:
Looking for a Cloud/Data Engineer with expertise in Python, ETL/Informatica, IICS, Spark, PySpark and JSON.
Candidates with experience in both Python and Java will be given first preference.
Need someone with good communication skills (both written and verbal).
Candidates with prior Mortgage/Financial industry experience are highly preferred.
Top Skills:
Python
PySpark
ETL
IICS
Snowflake
APIs
Java (candidates with Java experience will be highly preferred)
Job Description:
The Senior Developer will be part of Freddie Mac's Enterprise Risk Business Technology Office. This team is responsible for partnering with the Enterprise Risk and Compliance organizations to define strategies, roadmaps, and objectives, and to deliver capabilities that transform the business. This role will be responsible for supporting the organization's data-driven initiatives, collaborating with business partners and multi-discipline technology teams, and designing and implementing data modeling solutions. This position requires strong experience in data analysis, modeling, and engineering, with the ability to translate complex technical issues into easily understood communications that influence executive audiences with varied technical backgrounds and capabilities.
• Business requirements gathering and solutioning in alignment with enterprise business data strategy
• Design, implement, and optimize end-to-end data pipelines for ingesting, processing, and transforming large volumes of structured and unstructured data.
• Analyze and resolve data issues
• Build and maintain integrations with internal and external data sources and APIs.
• Monitor system performance, troubleshoot issues, and implement optimizations to enhance reliability and efficiency.
Qualifications:
• Bachelor's degree in computer science, information technology, or a related field; advanced studies/degree preferred
• 5+ years' knowledge of and experience with data technologies for Data Analytics and Data Lake/Mart/Warehouse; SQL/NoSQL databases (DB2, Oracle); Big Data technologies (PySpark); ETL (Informatica); CDC (Attunity); REST APIs
• 3+ years' experience in Python
• 5+ years' experience with technologies including Web Service APIs, XML, JSON, JDBC, and Java
• 3+ years working with SaaS platforms such as Snowflake and Collibra
• Knowledge of enterprise data models, information classification, metadata models, taxonomies, and ontologies
• Exposure to full-stack enterprise application development (Angular, Spring Boot, automation testing using Selenium and Cucumber)
• Knowledge of data warehousing and business intelligence concepts including data mesh, data fabric, data lake, data warehouse, and data marts
• Experience with Risk and GRC products preferred
• Effective communication and interpersonal skills
• Ability to work independently and in a team environment
Thanks,
Gigagiglet
gigagiglet.blogspot.com