Urgent need: Data Streaming Engineer for a remote role

Hi,

Please go through the JD below and let me know whether you are comfortable with the requirements:

Position: Data Streaming Engineer

Location: Remote


Responsibilities:

The client needs help building out a streaming data pipeline using the following stack (a minimal Kafka-to-Flink consumer sketch follows the list):

  • AWS services (Lambda, DynamoDB, SageMaker Endpoint, SageMaker Feature Store, S3, MSK, Aurora MySQL, Step Functions, Glue, API Gateway, Cloud9)
  • Kafka
  • Apache Flink
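
For orientation only, here is a minimal sketch of the Kafka-to-Flink leg of such a pipeline, assuming a PyFlink job; the broker address, topic name, and consumer group are hypothetical placeholders, and the Kafka connector jar would need to be on the job's classpath.

    from pyflink.common.serialization import SimpleStringSchema
    from pyflink.common.watermark_strategy import WatermarkStrategy
    from pyflink.datastream import StreamExecutionEnvironment
    from pyflink.datastream.connectors.kafka import KafkaOffsetsInitializer, KafkaSource

    env = StreamExecutionEnvironment.get_execution_environment()
    # The Kafka connector jar must be available to the job, e.g.:
    # env.add_jars("file:///path/to/flink-sql-connector-kafka.jar")

    # Hypothetical MSK bootstrap server, topic, and consumer group.
    source = (
        KafkaSource.builder()
        .set_bootstrap_servers("b-1.example-msk.amazonaws.com:9092")
        .set_topics("events")
        .set_group_id("streaming-pipeline")
        .set_starting_offsets(KafkaOffsetsInitializer.earliest())
        .set_value_only_deserializer(SimpleStringSchema())
        .build()
    )

    stream = env.from_source(source, WatermarkStrategy.no_watermarks(), "msk-source")
    stream.print()  # placeholder sink; a real pipeline would transform and write results out
    env.execute("kafka-flink-sketch")

In a stack like the one listed above, the transformed stream would typically land in S3, DynamoDB, or the SageMaker Feature Store rather than being printed.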

 

Desired skills:

  • Experience with “topics” within the Kafka environment.  
  • Knowledge of Apache Flink.
  • Working with Amazon SageMaker Feature Store would be a plus (see the write-record sketch after this list).
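
As a rough illustration of how a streaming job could write features into SageMaker Feature Store, here is a sketch using the boto3 runtime client; the feature group name and feature names are hypothetical, and the event-time feature name must match whatever the feature group actually defines.

    import time

    import boto3

    featurestore = boto3.client("sagemaker-featurestore-runtime")

    # Hypothetical feature group and feature names.
    featurestore.put_record(
        FeatureGroupName="customer-events",
        Record=[
            {"FeatureName": "customer_id", "ValueAsString": "12345"},
            {"FeatureName": "event_count", "ValueAsString": "7"},
            # Feature Store requires an event-time feature to version records.
            {"FeatureName": "EventTime", "ValueAsString": str(time.time())},
        ],
    )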

 

Data lake experience within AWS, specifically with data lakehouse services, would also be a plus (a brief Athena query sketch follows).
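
For context, an S3-backed lakehouse table registered in the Glue Data Catalog can be queried through Athena; below is a minimal sketch with boto3, where the database, table, and results bucket are hypothetical.

    import boto3

    athena = boto3.client("athena")

    # Hypothetical Glue database, table, and query-results location.
    response = athena.start_query_execution(
        QueryString="SELECT customer_id, COUNT(*) AS events "
                    "FROM events_table GROUP BY customer_id",
        QueryExecutionContext={"Database": "streaming_lake"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    print(response["QueryExecutionId"])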

 
Vaibhav Kumar | VBeyond Corporation

+1-678-892-8560 Ext. 211 | Mobile: 732-436-3975 | Fax: (866) 614-3884 | vaibhavk@vbeyond.com

Hangouts: vaibhavvbeyond@gmail.com


Note: VBeyond is fully committed to Diversity and Equal Employment Opportunity.

 

Disclaimer: We respect your online privacy. This is not an unsolicited mail. Under Bill S.1618 Title III passed by the 105th US Congress, this mail cannot be considered spam as long as we include contact information and a method to be removed from our mailing list. If you are not interested in receiving our e-mails, please reply to vaibhavk@vbeyond.com with the subject "Remove". Please also mention all the e-mail addresses to be removed, including any that might be diverting the e-mails to you. We are sorry for the inconvenience.

