Job Role -
1. Strong understanding of technologies in the Hortonworks Data Platform:
- Data Lifecycle & Governance: Falcon, Atlas
- Data Workflow: Sqoop, Flume, Kafka
- Data Access: MapReduce, Pig, Hive, HBase, Storm, Spark
- Security: Ranger, Knox, HDFS encryption, Kerberos
- Managing & Monitoring: Ambari, ZooKeeper
2. Strong understanding of major programming/scripting languages such as Java and Python, along with Linux shell scripting
3. Must have worked on Agile projects implementing Big Data solutions
4. Experience in designing solutions for multiple large data warehouses, with a good understanding of cluster and parallel architectures, high-scale or distributed RDBMS, and/or NoSQL platforms
5. Excellent analytical and communication skills
6. Strong in BI processes & tools (ETL, OLAP, Relational Databases)
7. Working experience with Teradata/Oracle/Informatica
8. Knowledge of recent evolutions in open source technologies in the Data Analytics and Business Intelligence Domain
9. Performing requirements analysis, platform selection, technical architecture design, application design, and development of the proposed solution
10. Responsible for managing the full life-cycle of a Hadoop solution
11. Should be able to bridge the needs of the organization with those of its data scientists and big data engineers
12. Understanding emerging and evolving end-user usage models and requirements in Big Data and Analytics; documenting those usage models along with the business, technical, and user requirements; and designing a solution architecture to meet those requirements
Please Note - Academic records from Class X onwards (minimum 50%) and only full-time courses will be considered.