Hi Partners,
Hope you are doing well! Please see the job details below and let me know if you would be interested in this role. If interested, please send me your resume, your contact details, your availability, and a good time to connect with you.

Data Engineer
Location: CA, NY, OH

Key skill sets, in descending priority order:
· Datorama
· Python
· Data Warehouse, Data Modeling
· Snowflake (SQL)
· Airflow
· Databricks

Responsibilities:
· Partner with technical and non-technical colleagues to understand data and reporting requirements.
· Work with engineering teams to collect required data from internal and external systems.
· Design table structures and ETL strategies to build performant data solutions that are reliable and scalable in a fast-growing data ecosystem.
· Develop data quality checks for source and target data sets; develop UAT plans and conduct QA.
· Develop and maintain ETL routines using ETL and orchestration tools such as Airflow, Luigi, and Jenkins.
· Document and publish metadata and table designs to facilitate data adoption.
· Perform ad hoc analysis as necessary.
· Perform SQL and ETL tuning as necessary.
· Develop and maintain dashboards/reports using Tableau and Looker.
· Coach and mentor team members to improve their designs and ETL processes.
· Create and conduct project/architecture design reviews.
· Create POCs when necessary to test new approaches.
· Design and build modern data management solutions.
· Enforce common data design patterns to increase code maintainability.
· Conduct peer code reviews and provide constructive feedback.
· Partner with team leads to identify, design, and implement internal process improvements.
· Automate manual processes, optimize data delivery, and understand when to re-design architecture for greater scalability.

Basic Qualifications:
· Relevant professional experience.
· Work experience implementing and reporting on business key performance indicators in data warehousing environments.
· Strong understanding of data modeling principles, including dimensional modeling and data normalization.
· Work experience using analytic SQL with traditional relational databases and/or distributed systems such as Hadoop/Hive, BigQuery, or Redshift.
· Experience with programming languages (e.g., Python, R, bash).
· Experience with workflow management tools (Airflow, Oozie, Azkaban, UC4).
· Expert-level understanding of SQL engines and the ability to conduct advanced performance tuning.
· Experience with the Hadoop (or similar) ecosystem (MapReduce, YARN, HDFS, Hive, Spark, Presto, Pig, HBase).
· Familiarity with data exploration/data visualization tools such as Tableau, Chartio, etc.
· Ability to think strategically and to analyze and interpret market and consumer information.
· Strong communication skills, both written and verbal.
· Excellent conceptual and analytical reasoning competencies.
· A degree in an analytical field such as economics, mathematics, or computer science is desired.
· Comfortable working in a fast-paced and highly collaborative environment; a great team player who embraces collaboration and also works well individually while supporting multiple projects in parallel.
Thanks & Regards,
Harshit Srivastava
harshits@vbeyond.com
732-585-1251
The content of this email is confidential and intended only for the recipient specified in this message. It is strictly forbidden to share any part of this message with any third party without the written consent of the sender. If you received this message by mistake, please reply to this message and then delete it, so that we can ensure such a mistake does not occur in the future.