Pune, India

Job Description

  • Work with Hadoop distribution platforms such as Hortonworks, Cloudera, MapR, and others
  • Take end-to-end responsibility for the Hadoop life cycle in the organization
  • Act as the bridge between data scientists, engineers, and organizational needs
  • Perform in-depth requirement analysis and select the appropriate work platform
  • Setup and development of Hadoop architecture and HDFS is a must
  • Take ownership of MapReduce, Spark, Scala, Storm, and Java-related jobs

Experience: 3–7 years


  • Knowledge of the MapReduce framework, Storm, and Kafka with Java
  • Working knowledge of the Linux operating system (Red Hat / CentOS 7)
  • Must have worked with ecosystem tools such as Hortonworks Ambari
  • Knowledge of Spark with Scala will be an advantage

Job Features

Job Category: Big Data
