iimjobs
Posted 31/07 by Rahul Malani
Head - Analytics at Grand Labs

Views: 937 | Applications: 13 | Recruiter Actions: 10

Grand Labs - Database Architect - Data Innovation & Management Team (5-10 yrs)

Bangalore Job Code: 477764

Title: Database Architect in Data Innovation & Management (DIM) Team @ Grand Hyper Labs

Description:

Grand Labs is an advanced analytics captive based in Bangalore, wholly owned by Regency Group, which owns and operates the GRAND brand of retail stores in the Gulf countries.

Role:

The candidate will be responsible for all aspects of data acquisition, data transformation, and analytics scheduling and operationalization to drive high-visibility, cross-division outcomes. Expected deliverables include developing Big Data ELT jobs using a mix of technologies, stitching together complex and seemingly unrelated data sets for mass consumption, and automating and scaling analytics into GRAND's Data Lake.

Key Responsibilities:

- Create a GRAND Data Lake and Warehouse that pools all the data from GRAND's different regions and stores in the GCC

- Measure source data quality, enrich data, and report on data quality

- Manage all ETL and data model update routines

- Integrate new data sources into DWH

- Manage DWH Cloud (AWS/AZURE/Google) and Infrastructure
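As a minimal sketch of the first responsibility, pooling per-region store data into one warehouse table might look like the following (illustrative only: sqlite3 stands in for the actual cloud DWH, and all table, column, and region names are hypothetical):

```python
import sqlite3

# Illustrative only: sqlite3 stands in for the real cloud DWH (AWS/Azure/GCP).
# Table, column, and region names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Per-region source tables, as data might arrive from GCC stores
cur.execute("CREATE TABLE sales_uae (store_id TEXT, sku TEXT, qty INTEGER)")
cur.execute("CREATE TABLE sales_ksa (store_id TEXT, sku TEXT, qty INTEGER)")
cur.executemany("INSERT INTO sales_uae VALUES (?,?,?)",
                [("UAE-01", "SKU-1", 5), ("UAE-02", "SKU-2", 3)])
cur.executemany("INSERT INTO sales_ksa VALUES (?,?,?)",
                [("KSA-01", "SKU-1", 7)])

# Pooled warehouse table, tagging each row with its source region
cur.execute("""
    CREATE TABLE dwh_sales AS
    SELECT 'UAE' AS region, store_id, sku, qty FROM sales_uae
    UNION ALL
    SELECT 'KSA' AS region, store_id, sku, qty FROM sales_ksa
""")
total = cur.execute("SELECT SUM(qty) FROM dwh_sales").fetchone()[0]
print(total)  # 15
```

In a real deployment, the UNION ALL consolidation would run as a scheduled job against the chosen cloud warehouse rather than an in-memory database.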

Skills Needed:

- Very strong SQL skills; demonstrated experience with RDBMS and NoSQL databases (e.g., PostgreSQL, MongoDB); Unix shell scripting preferred

- Experience with UNIX and comfort working in the shell (bash or Korn shell preferred)

- Good understanding of data warehousing concepts

- Big data systems: Hadoop, NoSQL, HBase, HDFS, MapReduce

- Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.

- Working with data delivery teams to set up new Hadoop users, including creating Linux accounts and setting up and testing HDFS, Hive, Pig, and MapReduce access for the new users

- Cluster maintenance, including adding and removing nodes, using tools such as Ganglia, Nagios, and Cloudera Manager Enterprise

- Performance tuning of Hadoop clusters and Hadoop MapReduce routines.

- Monitor Hadoop cluster job performance and plan capacity

- Monitor Hadoop cluster connectivity and security

- File system management and monitoring.

- HDFS support and maintenance.

- Collaborating with application teams to install operating system and Hadoop updates, patches, and version upgrades when required

- Define, develop, document, and maintain Hive-based ETL mappings and scripts
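The Hive-based ETL mappings mentioned above would in practice be HiveQL scripts running over HDFS. As a self-contained, runnable illustration of what such a staging-to-warehouse mapping does (sqlite3 is used in place of Hive here, and all table and column names are hypothetical):

```python
import sqlite3

# Illustrative sketch of a staging -> warehouse ETL mapping.
# Real deployments would use HiveQL over HDFS; sqlite3 is used only
# so the example is self-contained. All names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Raw staging data with messy keys, untyped prices, and duplicates
cur.execute("CREATE TABLE stg_products (sku TEXT, price TEXT)")
cur.executemany("INSERT INTO stg_products VALUES (?,?)",
                [("sku-1", " 19.99 "), ("SKU-2", "5"), ("sku-1", " 19.99 ")])

# Mapping: normalize keys, cast types, deduplicate -- typical
# data-quality enrichment on the way into the warehouse
cur.execute("""
    CREATE TABLE dim_products AS
    SELECT DISTINCT UPPER(sku) AS sku,
           CAST(TRIM(price) AS REAL) AS price
    FROM stg_products
""")
rows = cur.execute("SELECT sku, price FROM dim_products ORDER BY sku").fetchall()
print(rows)  # [('SKU-1', 19.99), ('SKU-2', 5.0)]
```

The same shape of transformation (normalize, cast, deduplicate) carries over directly to a HiveQL `INSERT OVERWRITE TABLE ... SELECT` statement.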

Experience: 5-10 years

Education Qualification: Bachelor's Degree in Computer Science/Engineering, IT, or Data Analytics

Expected Joining Date: Within 45-60 days

This job opening was posted some time ago. It may no longer be active, although it has not been removed by the recruiter. Please use your discretion.

Women-friendly workplace:

Maternity and Paternity Benefits
