Hadoop Developer
Location : Charlotte, NC & Chicago, IL (Hybrid)
Contract : 12+ Months
Job Description :
Hadoop Engineer (SME) role supporting NextGen Platforms built around Big Data technologies (Hadoop, Spark, Kafka, Impala, HBase, Docker containers, Ansible, and more). Requires experience in cluster management of vendor-based Hadoop and Data Science (AI/ML) products such as Cloudera, Databricks, Snowflake, Talend, Greenfield, ELK, KPMG Ignite, etc. The Hadoop Engineer is involved in the full life cycle of an application and is part of an agile development process. The role requires the ability to interact, develop, engineer, and communicate collaboratively at the highest technical levels with clients, development teams, vendors, and other partners. The following section is intended to serve as a general guideline for each relevant dimension of project complexity.
Expert-level knowledge of Cloudera Hadoop components such as HDFS, Sentry, HBase, Kafka, Impala, Solr, Hue, Spark, Hive, YARN, ZooKeeper, and Postgres