Required || Data Engineer with Python & PL/SQL || New York, NY
02/26/20 4:31 PM
Greetings,
We have an opening for a Data Engineer with Python & PL/SQL. The requirement details are below; if you are interested, please reply to me with your updated resume. Your earliest reply is highly appreciated.
Note: if you know of someone who may be a good fit, please forward this to them.
Role: Data Engineer with Python & PL/SQL
Location: New York, NY
Skill set: PL/SQL, Python, Spark, Scala, and Snowflake with AWS.
Primary skill set – Strong PL/SQL, with programming experience in Python or Scala for Spark.
Secondary – Experience in Snowflake is an advantage (or at least experience in Redshift). Cloud engineering experience in AWS is good to have.
• Deep understanding of EDW data modeling using star schema/snowflake schema, data architecture, capacity planning, and sizing
• Expertise in data modeling, ETL/ELT, and BI reporting tools
• Expertise in Python/Java scripting
• Hands-on experience implementing EDW/data lake applications on a cloud platform (AWS S3/Redshift/Snowflake, Azure, or Google Cloud)
• Relational and NoSQL databases; experience with MongoDB, HBase, or Cassandra is a plus
• Experience designing architectures for EDW/data lakes on public/private clouds
• Ability to develop ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake's SnowSQL
• Strong understanding of various data formats such as CSV, XML, JSON, Parquet, and Avro
• Ability to understand data pipelines and modern ways of automating them using cloud-based and on-premises technologies
• Experience with reporting technologies such as Tableau or Looker is good to have; some reporting background is required
• Expertise in Snowflake data modeling, ELT/ETL using Snowflake SQL, implementing stored procedures, and standard DWH/ETL concepts
• Expertise in advanced Snowflake concepts: setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy cloning, time travel, and table clustering/defining cluster keys
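The data formats named above (CSV, JSON, etc.) can all be handled from Python; CSV and JSON are covered by the standard library alone, while Parquet and Avro need third-party packages. A minimal sketch of round-tripping records through CSV and JSON, using made-up sample data for illustration:

```python
import csv
import io
import json

# Hypothetical warehouse-style records, invented for this sketch.
rows = [
    {"id": "1", "city": "New York", "load_ts": "2020-02-26"},
    {"id": "2", "city": "Johns Creek", "load_ts": "2020-02-26"},
]

# Serialize to CSV in memory, then parse it back into dicts.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "city", "load_ts"])
writer.writeheader()
writer.writerows(rows)
parsed_csv = list(csv.DictReader(io.StringIO(buf.getvalue())))

# Serialize the same records to JSON and back.
parsed_json = json.loads(json.dumps(rows))

assert parsed_csv == rows
assert parsed_json == rows
```

Parquet and Avro would typically go through pyarrow and fastavro respectively; the pattern (serialize, then parse back into Python objects) is the same.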
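The ETL-pipeline bullet above follows the standard extract/transform/load pattern. The sketch below is a hypothetical illustration only: it uses Python's built-in sqlite3 as a stand-in warehouse, since a real Snowflake load would go through the snowflake-connector-python package with actual credentials, and the function, table, and column names are invented for the example.

```python
import sqlite3

def extract():
    """Extract: pull raw source rows (hard-coded here; a real job
    would read files from S3 or query a source database)."""
    return [("1", "new york", 100), ("2", "johns creek", 250)]

def transform(rows):
    """Transform: normalize city names and drop rows under a threshold."""
    return [(rid, city.title(), amt) for rid, city, amt in rows if amt >= 150]

def load(rows, conn):
    """Load: write transformed rows into the (stand-in) warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_sales (id TEXT, city TEXT, amount INT)"
    )
    conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
loaded = conn.execute("SELECT id, city, amount FROM fact_sales").fetchall()
```

Against Snowflake itself the load step would instead issue COPY INTO or INSERT statements over a snowflake.connector connection, but the extract/transform structure stays the same.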
Preferred certifications
• AWS Certified Solutions Architect (CSA)
• Snowflake certification
Thanks,
Next Level Business Services, Inc.
Consulting | Analytics | Staff Augmentation
Vaishali Sharma
11340 Lakefield Drive Suite #200,
Johns Creek, GA, 30097
(904) 344-3291
vaishali.sharma@nlbservices.com
If you would prefer to no longer receive any emails from this Company, you may opt out at anytime by clicking here.