Hello Govardhan, Greetings!
This is Ubaid Rahman from VBeyond Corp. We are a global recruitment company specializing in hiring IT professionals. One of our clients is looking for a Data Architect.

Role: Sr. Data Architect with Databricks
Location: Appleton, Wisconsin (remote until COVID restrictions lift)
Requirements:
• 10+ years of experience in data pipeline engineering for both batch and streaming applications.
• Experience with data ingestion processes, creating data pipelines, and performance tuning with Snowflake and AWS.
• Experience implementing SQL query tuning, cache optimization, and parallel execution techniques.
• Must be hands-on coding capable in at least one core language (Python, Java, or Scala) with Spark.
• Expertise working with distributed data warehouses and cloud services (such as Snowflake, Redshift, and AWS) via scripted pipelines, leveraging frameworks and orchestration such as Airflow as required for ETL pipelines. This role intersects with the "big data" stack to enable varied analytics, ML, etc., not just data-warehouse-type workloads. (See the illustrative sketch below.)
• Experience handling large and complex sets of XML, JSON, Parquet, and CSV data from various sources and databases.
• Solid grasp of database engineering and design; able to identify bottlenecks and bugs in the system.
• Knowledge of highly scalable "big data" data stores, StreamSets, and Databricks.
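As a rough illustration of the kind of pipeline work described above, here is a minimal PySpark sketch: it reads raw JSON and CSV sources, applies a simple join and aggregation, and writes curated Parquet output. All paths, column names, and the bucket are hypothetical and not taken from the client's environment.

# Illustrative sketch only -- paths, columns, and bucket names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest-example").getOrCreate()

# Read semi-structured source data (JSON) and a CSV extract.
orders = spark.read.json("s3://example-bucket/raw/orders/")            # hypothetical path
customers = (spark.read
             .option("header", "true")
             .option("inferSchema", "true")
             .csv("s3://example-bucket/raw/customers/"))                # hypothetical path

# Simple transformation: join, filter, and aggregate daily totals per customer.
daily_totals = (orders
                .join(customers, on="customer_id", how="inner")
                .where(F.col("status") == "COMPLETED")
                .groupBy("customer_id", F.to_date("order_ts").alias("order_date"))
                .agg(F.sum("amount").alias("daily_amount")))

# Write the curated result as partitioned Parquet for downstream analytics.
(daily_totals.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("s3://example-bucket/curated/daily_totals/"))                 # hypothetical path

In practice this kind of job would typically be scheduled and monitored through an orchestrator such as Airflow, as noted in the requirements.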
Thanks,
Ubaid Rahman
VBeyond Corp