iimjobs
Posted 18/10 by Anandh Shanmugaraj, CEO at Gladwin Analytics


Gladwin Analytics - Senior Data Scientist - Big Data Engineering - Aerospace/Airlines/Transportation (5-15 yrs)

Location: Middle East/Dubai
Job Code: 502847

Experience - 5-8 years, including 4-5 large Hadoop/Big Data implementations

Industry Domain - Aerospace, Airlines, Transportation

Skillset

- Ability to work with huge volumes of data to derive business intelligence.

- Analyze data, uncover information, derive insights and propose data-driven strategies.

- Strong grasp of database concepts, principles, structures and best practices.

- Hands-on experience with Hadoop distribution platforms such as Hortonworks, Cloudera and MapR.

- Full knowledge of Hadoop Architecture and HDFS is a must.

- Good knowledge of Data warehousing concepts and Business Intelligence, Data management & Data Architecture.

- Comprehensive understanding of Hadoop/MapReduce ecosystem and architecture.

- Experience building stream-processing systems using solutions such as Storm or Spark Streaming.

- Good knowledge of Big Data querying tools, such as Pig, Hive, and Impala.

- Experience with Spark and with NoSQL databases such as HBase, Cassandra and MongoDB.

- Knowledge of various ETL techniques and frameworks, such as Flume.

- Experience with various messaging systems, such as Kafka or RabbitMQ.

- Experience with Big Data ML toolkits, such as Mahout, SparkML, or H2O.

- Knowledge of Java & Web development.

- An analytical bent of mind and the ability to learn, unlearn and relearn.

Responsibilities -

- Design ETL Hubs, ETL Architecture for Data warehouse/BI implementations.

- Ensure systems meet business requirements and industry practices.

- Build high-performance algorithms, prototypes, predictive models and proof of concepts.

- Develop data set processes for data modeling, mining and production.

- Collaborate with data architects, modelers and IT team members on project goals.

- Select and integrate the Big Data tools and frameworks required to provide the requested capabilities.

- Implement ETL processes using SQL programming, database design/development and ETL tools.

- Monitor performance and advise on any necessary infrastructure changes.

- Define data retention policies.

- Work with Hive, Sqoop, Impala and Kudu components of the Hadoop ecosystem.

- Write complex scripts using Python/Linux scripting/Perl.

This job opening was posted a long time ago and may no longer be active, though the recruiter has not removed it. Please use your discretion.

Women-friendly workplace:

Maternity and Paternity Benefits
