Job Location: Remote
Need 10+ years of experience
Job Description:
Our client is looking for a highly energetic and collaborative Senior Data Engineer with experience leading enterprise data projects around business and IT operations. The ideal candidate should be an expert in leading projects to develop and test data pipelines, drive data analytics efforts, proactively identify and resolve issues, and build alerting mechanisms using traditional, new, and emerging technologies. Excellent written and verbal communication skills and the ability to liaise with everyone from technologists to executives are key to success in this role.
Overall Experience level:
• 10+ years of GCP Data Engineering and Analytics experience
• 6+ years of experience in the Data Warehouse and Hadoop/Big Data ecosystem
• 3+ years of experience in strategic data planning, standards, procedures, and governance
• 4+ years of hands-on experience in Python or Scala
• 4+ years of experience in writing and tuning SQL and Spark queries
• 3+ years of experience working as a member of an Agile team
• Experience with Kubernetes and containers is a plus
• Experience in understanding and managing Hadoop log files
• Experience with Hadoop's multiple data processing engines (interactive SQL, real-time streaming, data science, and batch processing) handling data stored on a single platform under YARN
• Experience in Data Analysis, Data Cleaning (Scrubbing), Data Validation and Verification, Data Conversion, Data Migrations and Data Mining.
• Experience in all phases of the data warehouse life cycle, including Requirement Analysis, Design, Coding, Testing, Deployment, and ETL flow
• Experience in architecting, designing, installing, configuring, and managing Apache Hadoop clusters
• Experience in analyzing data in HDFS through MapReduce, Hive, and Pig
• Experience building and optimizing 'big data' data pipelines, architectures and data sets.
• Strong analytic skills related to working with unstructured datasets
• Experience in Migrating Big Data Workloads
• Experience with data pipeline and workflow management tools such as Airflow
• Experience with scripting languages such as Python and Scala
Comments for Suppliers: Big Data, Hadoop, Kubernetes, Spark, SQL tuning
Should have Google Cloud (GCP)/Azure experience.
Additional Skills: Spark, PySpark, Scala, SQL tuning, Hadoop, and GCP
Warm Regards,
Bhaskar kumar | Senior Recruiter
3S Business Corporation
kumar.koppisetti@3sbc.com
16700 HOUSE HAHL RD BLDG 6B, Cypress, TX-77433
An E-Verified Company