
Multiple Positions - Snowflake Admin or Snowflake Data Engineer or Architect @ Remote

Hi,
My name is Bhaskar. I am a Technical Recruiter with PamTen, headquartered in Princeton, New Jersey. PamTen has been offering technology services to Fortune 500 companies across the United States for over 15 years and has a global presence with offices in the USA, Canada, and India. The purpose of this email is to invite you to discuss the opportunities below with our direct customer, which I think you would be a great fit for.

Role: Snowflake Admin
Location: Remote
No OPT and CPT
 
  • Act as a technical specialist across all aspects of Snowflake
  • Define standard methodologies on Snowflake and implement them by working closely with SME architects and cloud specialists
  • Maintain a deep understanding of complementary technologies and help organizations use Snowflake as part of their larger technology stack
  • Work with product teams, helping them craft and onboard their applications to Snowflake and take them to production
  • Provide mentorship on resolving customer-specific technical challenges and work with the Snowflake vendor on resolving bugs or new requirements
  • Build, design, architect, and implement high-volume, high-scale data analytics and machine learning Snowflake solutions in the cloud
  • Provide technology leadership in driving large, complex Snowflake implementations
  • Tune Snowflake for better performance and optimize utilization of both storage and processing
  • Craft monitoring requirements, automate them using Python, and implement them (a minimal sketch follows this list)
  • Understand the data classification defined by security policies
  • Implement auditing measures to ensure strict controls on data
  • Provide Snowflake training on new features and how product teams can use them
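As a rough illustration of the monitoring-automation bullet above, the sketch below polls recent warehouse credit usage with the snowflake-connector-python package and flags heavy consumers. The connection details, role, credit threshold, and alerting hook are placeholders for illustration only; they are not part of the role description.

# Minimal monitoring sketch, assuming the snowflake-connector-python package
# and read access to the SNOWFLAKE.ACCOUNT_USAGE share.
# Connection values and the 10-credit threshold are illustrative placeholders.
import os
import snowflake.connector

CREDIT_THRESHOLD = 10.0  # hypothetical daily credit budget per warehouse

def check_warehouse_credits():
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role="ACCOUNTADMIN",  # or any role granted ACCOUNT_USAGE access
    )
    try:
        cur = conn.cursor()
        # Credits used per warehouse over the last 24 hours.
        cur.execute("""
            SELECT warehouse_name, SUM(credits_used) AS credits
            FROM snowflake.account_usage.warehouse_metering_history
            WHERE start_time >= DATEADD(day, -1, CURRENT_TIMESTAMP())
            GROUP BY warehouse_name
            ORDER BY credits DESC
        """)
        for warehouse, credits in cur.fetchall():
            if credits > CREDIT_THRESHOLD:
                # Replace with a real alerting hook (email, Slack, PagerDuty, ...).
                print(f"ALERT: {warehouse} used {credits:.2f} credits in the last 24h")
    finally:
        conn.close()

if __name__ == "__main__":
    check_warehouse_credits()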
 
Role: Snowflake Data Engineer
Location: Remote
No OPT and CPT
 
Job Description
  • At least 8 years of IT experience and 4 years or more of work experience in data management disciplines including data integration, modeling, optimization and data quality.
  • Strong experience with advanced analytics tools for object-oriented/functional scripting using languages such as [R, Python, Java, C++, Scala, others].
  • Strong ability to design, build and manage data pipelines for data structures encompassing data transformation, data models, schemas, metadata and workload management.
  • Strong experience with popular database programming languages including [SQL, Blob Storage and SAP HANA] for relational databases and certifications on upcoming [MS Snowflake HDInsights, Cosmos] for non-relational databases.
  • Strong experience in working with large, heterogeneous datasets in building and optimizing data pipelines, pipeline architectures and integrated datasets using traditional data integration technologies. These should include [ETL/ELT, data replication/CDC, message-oriented data movement, API design and access] and upcoming data ingestion and integration technologies such as [stream data integration, CEP and data virtualization].
  • Strong experience in working with and optimizing existing ETL processes and data integration and data preparation flows and helping to move them in production.
  • Strong experience in streaming and message queuing technologies [such as Snowflake Service Bus and Kafka].
  • Basic experience working with popular data discovery, analytics and BI software tools like [Tableau, Power BI and others] for semantic-layer-based data discovery.
  • Strong experience in working with data science teams in refining and optimizing data science and machine learning models and algorithms.
  • Demonstrated success in working with large, heterogeneous datasets to extract business value using popular data preparation tools.
  • Demonstrated ability to work across multiple deployment environments including [cloud, on-premises and hybrid], multiple operating systems and through containerization techniques such as [Docker, Kubernetes].
  • Interpersonal Skills and Characteristics:
  • Strong leadership, partnership and communication skills
  • Ability to coordinate with all levels of the firm to design and deliver technical solutions to business problems
  • Ability to influence without authority
  • Prioritization and time management
  • Data modelling with Enterprise Data Warehouse and DataMart, Snowflake Data Lake Gen2 & BLOB
  • Data engineering experience with Snowflake/Databricks
  • Hands-on experience in SQL, Python, NoSQL, JSON, XML, SSL, RESTful APIs, and other data formats, viz. Parquet, ORC, and Avro
  • Hands-on emphasis with a proven track record of building and evaluating data pipelines, and delivering systems for final production
  • Exposure to Big Data analytics (data and technologies) and in-memory data processing using Spark.
  • Working experience with various databases such as SAP HANA, Cassandra, and MongoDB
  • Strong understanding of DevOps, on-premises, and cloud deployments
Roles and responsibilities:
  • Build Data Pipelines (a minimal sketch follows this list)
  • Drive Automation through effective metadata management
  • Learning and applying modern data preparation, integration and AI-enabled metadata management tools and techniques.
  • Tracking data consumption patterns.
  • Performing intelligent sampling and caching.
  • Monitoring schema changes.
  • Recommending — or sometimes even automating — existing and future integration flows.
  • Collaborate across departments
  • Train counterparts in these data pipelining and preparation techniques, which make it easier for them to integrate and consume the data they need for their own use cases.
  • Participate in ensuring compliance and governance during data use
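To make the data-pipeline responsibilities above concrete, here is a minimal sketch, assuming the snowflake-connector-python package, of one ELT step: staging a local Parquet file and loading it into a Snowflake table with COPY INTO. The warehouse, database, table, stage, and file names are hypothetical, and in practice such a step would run under an orchestrator (Airflow, ADF, etc.).

# Minimal ELT sketch: stage a local Parquet file and COPY it into a Snowflake table.
# Object names (RAW_EVENTS, EVENTS_STAGE, events.parquet) are hypothetical.
import os
import snowflake.connector

def load_parquet_to_snowflake(local_path: str = "events.parquet"):
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        # Target table and internal stage (idempotent DDL).
        cur.execute("CREATE TABLE IF NOT EXISTS RAW_EVENTS (payload VARIANT)")
        cur.execute("CREATE STAGE IF NOT EXISTS EVENTS_STAGE FILE_FORMAT = (TYPE = PARQUET)")
        # Upload the file to the stage, then load it into the table.
        cur.execute(f"PUT file://{local_path} @EVENTS_STAGE OVERWRITE = TRUE")
        cur.execute("""
            COPY INTO RAW_EVENTS
            FROM (SELECT $1 FROM @EVENTS_STAGE)
            FILE_FORMAT = (TYPE = PARQUET)
        """)
    finally:
        conn.close()

if __name__ == "__main__":
    load_parquet_to_snowflake()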
 
Role: Snowflake Architect with strong Azure experience
Location: Roswell, GA (Remote till Covid)
No OPT and CPT 
  1. Experience in working with the Azure data ecosystem: ADF, Databricks, ADLS, HDInsight, Power BI, Synapse, or SQL DWH
  2. Must have designed and executed at least 2 projects using Azure data ecosystem
  3. Experience in designing architecture patterns for streaming, batch, near real-time using Azure
  4. Experience in connecting to Snowflake from Azure (see the sketch after this list)
  5. Experience in building a data warehouse on Snowflake
  6. Must have an understanding of Snowflake roles and deployments of virtual warehouses
  7. Must have knowledge of Snowflake architecture and implementation patterns
  8. Must have designed and executed at least 1 project of Snowflake implementation on the Azure cloud platform
  9. Good knowledge of Azure PaaS and IaaS
  10. Good knowledge of data modeling, integration and design techniques
  11. Must have ability to evaluate different Azure products to fit the requirements
  12. Must have excellent presentation skills to justify technical design decisions to the architecture board and senior leadership
  13. Must have experience in leading teams and driving execution
  14. Must have experience in creating high level and detailed design documents
  15. Must be hands-on, actively programming in PySpark or Snowflake and writing SQL queries
  16. Must be able to write complex SQL queries
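For point 4, here is a minimal sketch of connecting to Snowflake from Azure Databricks with the Snowflake Spark connector. The account URL, credentials, warehouse, and table names are placeholders (real secrets would come from a Databricks secret scope or Azure Key Vault), and the short "snowflake" data-source name assumes a Databricks runtime with the connector pre-installed.

# Minimal PySpark sketch: read from and write to Snowflake from Azure Databricks
# via the Snowflake Spark connector. All option values are placeholders.
from pyspark.sql import SparkSession

# On Databricks a SparkSession already exists; this makes the sketch self-contained.
spark = SparkSession.builder.appName("snowflake-from-azure").getOrCreate()

sf_options = {
    "sfURL": "xy12345.east-us-2.azure.snowflakecomputing.com",  # hypothetical account URL
    "sfUser": "SVC_DATABRICKS",
    "sfPassword": "<from-secret-scope>",  # never hard-code real credentials
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "REPORTING_WH",
    "sfRole": "ANALYST",
}

# Read a table (the "query" option can push down an arbitrary query instead).
orders = (
    spark.read.format("snowflake")
    .options(**sf_options)
    .option("dbtable", "ORDERS")  # hypothetical table and columns
    .load()
)
orders.show(5)

# Write an aggregate produced in Databricks back to Snowflake.
daily = orders.groupBy("ORDER_DATE").count()
(
    daily.write.format("snowflake")
    .options(**sf_options)
    .option("dbtable", "ORDERS_DAILY_COUNTS")
    .mode("overwrite")
    .save()
)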

Regards,
Bhaskar Koppisetti
bhaskar.kumar@pamten.com
609-212-0524 x 116 (Work)|609-759-2972 (Direct)
Technical Recruiter



www.pamten.com
New Address: 2 Research Way, Princeton, NJ 08540
Certified Minority & Women-Owned Business Enterprise - NJ, NY, Philadelphia
Inc 5000 Company
NJ Top 50 Fastest Growing Companies
Smart CEOs Future 50 Companies

20 Most promising Web Development & Design Solutions Provider
 
Build Your Career @ PamTen 
 

IMPORTANT CONFIDENTIALITY NOTICE. This message is intended exclusively for the individual or entity to which it is addressed. If you have received this communication in error, please notify the sender immediately and delete all copies of the communication. Any unauthorized disclosure, use, distribution, or reproduction of this message or any part of it, including attachments, is prohibited and may be unlawful.
 


If you would like to unsubscribe from PamTen Inc, please click here.
