The Hadoop Developer will provide expertise across a wide range of technical areas, including but not limited to: the Cloudera Hadoop ecosystem, Java, collaboration-toolset integration using SSO, configuration management, hardware and software configuration and tuning, software design and development, and the adoption of new technologies and languages aligned with other internal client projects.
1. Design and develop data ingestion pipelines.
2. Perform data migration and conversion activities.
3. Develop and integrate software applications using suitable development methodologies and standards, applying standard architectural patterns, taking into account critical performance characteristics and security measures.
4. Collaborate with Business Analysts, Architects and Senior Developers to establish the physical application framework (e.g. libraries, modules, execution environments).
5. Perform end-to-end automation of ETL processes for the various datasets ingested into the big data platform.
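The ETL automation responsibility above can be illustrated with a minimal sketch in plain Python. This is not the client's actual pipeline; in practice such work would use Spark, Hive, and Oozie from the stack below, and the field names (`id`, `name`) and the newline-delimited JSON landing format are illustrative assumptions only.

```python
import csv
import io
import json

def transform(row):
    # Hypothetical transform step: cast types and normalize text fields.
    return {"id": int(row["id"]), "name": row["name"].strip().title()}

def run_pipeline(raw_csv):
    # Extract: parse CSV records from the source feed.
    rows = csv.DictReader(io.StringIO(raw_csv))
    # Transform: apply per-record cleanup.
    records = [transform(r) for r in rows]
    # Load: serialize to newline-delimited JSON, a common landing format
    # for ingestion into HDFS/Hive.
    return "\n".join(json.dumps(r) for r in records)

print(run_pipeline("id,name\n1, alice \n2, bob "))
```

The same extract/transform/load shape carries over to a Spark job, with `DictReader` replaced by a DataFrame read and the transform expressed as column operations.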
Required Skills:
1. Hadoop (Cloudera CDH), HDFS, Hive, Impala, Spark, Oozie, HBase
2. Python, Perl

Good to Have:
1. Strong database design skills
2. ETL tools
3. NoSQL databases (MongoDB, Couchbase, Cassandra)
4. Good understanding and working knowledge of Agile development
1. Lead technical design sessions and recommend solution options in line with industry standards.
2. Document and maintain project artifacts.
3. Suggest best practices and implementation strategies using Hadoop, Java, and ETL tools.
4. Maintain comprehensive knowledge of industry standards, methodologies, processes, and best practices.
5. Other duties as assigned.
| Experience | 5 - 11 Years |
| Salary | 5 Lac 50 Thousand to 18 Lac P.A. |
| Industry | IT Software - Application Programming / Maintenance |
| Qualification | Higher Secondary, Professional Degree, Other Bachelor Degree, B.Tech/B.E, M.C.A |
| Key Skills | Hadoop, HDFS, Hive, Impala, Spark, Oozie, HBase, Scala, SQL, Linux, JSON, Python, Perl, ETL tools, NoSQL (MongoDB, Couchbase, Cassandra) |