Job Location: ATHENA, Florida
Primary Skills:
ENGINEERS, SPARK, JAVA, ECS, SCALA, EMR, ANALYTICAL SKILLS, ELASTIC SEARCH, HDFS, PYSPARK, PYTHON, EFS, APIS, AWS, AZURE, DYNAMO DB, HADOOP, HBASE, BIG DATA, RDBMS
Description:
Our client is seeking a Big Data Engineer to join its Product Delivery team. This is a great opportunity for someone who specializes in Big Data to enable a cloud-based financial services platform to access timely, accurate, and relevant data.
The ideal candidate will have built real-time software services platforms where large-volume messaging is core to the solution set.
SKILLSET SUMMARY – Must Haves:
· Big Data: Hadoop/Spark (preferably with Scala; PySpark okay), or Python, Athena, DynamoDB, RDS. Looking for engineers with hands-on experience.
· Hands-on engineer with good coding skills, preferably in Java, Scala, or Python
· Exposure to working with cloud infrastructure such as AWS and Azure
· Ability to program in Java (or at least open to working in Java if/when needed)
· Knowledge of database architectures for RDBMS and NoSQL systems
· Experience building data pipelines and data lakes
· Experience with or knowledge of data formats such as Parquet, CSV, etc.
· Data-storage architectures such as HDFS, HBase, S3, and/or Hive
· Data-transformation concepts including partitioning and shuffling
· Data-processing constructs such as joins and MapReduce
· AWS Cloud experience: S3, EFS, MSK, ECS, EMR, etc.
SKILLSET SUMMARY – Good to Have:
5+ years of experience in relevant Streaming/Queueing implementation roles
Bachelor's degree in a technical discipline; Master's preferred
Experience monitoring the health of a Kafka cluster (data loss and data lag), with a strategy for short TTD (time to detect) of broker failures and fast TTR (time to recover)
Strong coder who can implement Kafka producers and consumers in various programming languages, following common patterns and best practices
Experience with various Kafka integrations, such as Elasticsearch and databases (RDBMS or NoSQL)
Experience in Spark stream processing is a plus
Experience in RDBMS change log streaming is a plus
Systems integration experience, including design and development of APIs, adapters, and connectors, and integration with Hadoop/HDFS, real-time systems, data warehouses, and analytics solutions
Financial industry experience preferred
Number of years of experience is flexible if the candidate has the above-mentioned, proven hands-on technical skills
Proactive; ability to take initiative
Curious; learning new technologies and solving problems is critical
Critical thinking, analytical skills, communication, and a positive mental attitude
Quick learner who works with minimal guidance
Experienced Java Programmer.
Great attention to detail
Organizational skills
An analytical mind
HDFS, HBase, AWS EMR, Java
Additional Skills:
Hadoop, Python, Big Data, Spark
Warm Regards,
Bhaskar Kumar | Senior Recruiter
3S Business Corporation
Richmond Avenue | Houston, TX – 77082
An E-Verified Company