Role:
Responsible for teaching Big Data/Hadoop technologies and contributing to the development of Big Data/Hadoop applications.
Develop Big Data/Hadoop training content for working professionals.
Conduct online and classroom training sessions by providing practical use cases and assignments.
Design quality self-paced recorded training sessions on all the latest Big Data/Hadoop development technologies for students, working professionals, and corporates.
Continuously improve the teaching methodology to suit the online model and drive high student engagement.
Work in small teams where each team member has a lot of ownership and each individual can make a big impact.
Design mini and major real-time projects and guide trainees in developing them for practical exposure.
Experience / Skills:
Engineering degree, postgraduate degree, or Ph.D. in Statistics/Computer Science, or equivalent experience.
Minimum 9 years of experience developing and implementing Big Data solutions in a corporate environment, including a total of 4 years of leadership experience.
Experience in Hadoop (Cloudera or Hortonworks preferred), HDFS, MapReduce, Hive, Pig, HBase, R, Java, C/C++, Perl, or Python.
Experience in Cassandra, MongoDB, Jaspersoft, Oozie, and ZooKeeper.
Experience in developing Hadoop integrations for data ingestion, data mapping and data processing capabilities.
Past experience with optimized computing techniques, e.g. parallel processing, grid computing, etc.
Good interpersonal and excellent communication skills.
Preference:
Teaching Experience
Researchers / Ph.D. aspirants
About Job Profile:
It's a great working environment for those who wish to work with the world's top 50 professors.
Flexibility of work.
You should be comfortable travelling to deliver lectures.