June 14, 2019

AWS Big Data Architect - Sleepy Hollow, NY

Job title: AWS Big Data Architect
Position ID: MCIVNI190027
Location: Sleepy Hollow, NY
Start date: ASAP
Duration: Long Term
Rate: $/hr on 1099/C2C
 
 
AWS Certifications are a PLUS for this one…
 

We are looking for a Senior AWS Big Data Architect to support our growing BI/Cloud/Data Practice division for our ever-expanding client base. You will be required to have expertise in architecting and solutioning large-scale enterprise cloud/big data platforms, as well as knowledge of the traditional solution stack, with a special focus on cloud deployment.
 
Job Requirements
·    Good domain understanding of the pharma industry, especially Commercial and Operations
·    Understanding of current trends in healthcare and life sciences infrastructure (including databases, enterprise integration buses, protocols, APIs, and formats), security, networking, and cloud-based delivery models
·    Work closely with Business Partners to design Enterprise Data Warehouse
·    Build the strategy, direction and roadmap for data warehousing, business intelligence and analytics for the client
·    Rapidly architect, design, prototype, and implement solutions to tackle traditional BI and big data needs for clients
·    Work with subject matter experts to understand client needs and ingest a variety of data sources such as social media, internal/external documents, financial data, and operational data
·    Research, experiment with, and utilize leading cloud services, traditional BI, and big data methodologies
·    Translate client business problems into technical approaches that yield actionable results across multiple, diverse domains; communicate results and educate others through design and build of insightful visualizations, reports, and presentations
·    Provide technical design leadership with the responsibility to ensure the efficient use of resources, the selection of appropriate technology and use of appropriate design methodologies
·    Experience with horizontally scalable file systems (or databases) such as S3 and HDFS; experience in technology areas such as big data appliances, in-memory processing, and NoSQL databases (such as HBase, MongoDB, Cassandra, Redis)
·    Knowledge of MPP and graph databases for network analysis
·    Knowledge of open-source tools and services such as Python, Apache Spark, Apache Airflow, Apache NiFi, Apache Flink, etc.
·    Experience with traditional relational databases (RDBMS) such as MySQL, DB2, PostgreSQL, MariaDB, and Oracle
·    Additional focus areas would be real-time analytics (such as Kafka, Spark Streaming, Flink), in-memory processing (Spark), Next Best Action, and Internet of Things (IoT)
·    Knowledge of designing security architectures on AWS and Hortonworks
·    Good to have: knowledge of teamwork and product management tools like Git and Jira
·    Good to have: knowledge of container technologies like Mesos, Docker, and Kubernetes
·    Key responsibilities in this role include data lake implementation, BI system migration/transformation, and tools/technology comparison and selection
·    Ability to work with globally distributed teams with flexible working hours around the clock

Company Name | Website