Hadoop Developer - IT, New Pune

Job Description

We are searching for a Big Data / Hadoop developer to join our data ingestion team and develop pipelines that load terabytes of structured and unstructured data into Hadoop. You will also work with the data science team to help them analyze large volumes of data.

Required Skills

  • Experience working with programming languages such as Java and Python.
  • Knowledge of relational databases (RDBMS) such as Oracle, MySQL, and MS SQL Server.
  • Understanding of and working experience with Hadoop, Hive, Pig, Sqoop, MapReduce, Oozie, Spark, etc.
  • Understanding of NoSQL databases such as MongoDB and HBase.
  • Knowledge of scripting for process automation.
  • Proficiency in Linux and Windows environments.
  • Troubleshooting skills; understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networks.
  • Familiarity and experience with the different phases of the software development life cycle.
  • Good understanding of algorithms, data structures, performance optimization techniques, and object-oriented programming.
  • Excellent communication, interpersonal, and problem-solving skills.
  • BS in Computer Science or equivalent.

Required Education

• UG: B.Tech/B.E. (Any Specialization) or PG: MCA (Computers). A doctorate is not required.

Please email your resume to careers@gridedge.com