iimjobs
05/07 Shalini
Manager - Talent Acquisition at BB works

Views: 106 | Applications: 23 | Recruiter Actions: 1

Data Modelling Role - IT (4-10 yrs)

Location: Greater Noida/Hyderabad/Bangalore | Job Code: 1283495

Responsibilities :

- Design and develop data models for the Enterprise Data Warehouse (EDW) using industry best practices and data modeling techniques.

- Build and maintain ETL/ELT pipelines to extract, transform, and load data from various sources into the EDW, using Spark with the Scala programming language.

- Optimize and fine-tune Spark applications to ensure efficient data processing and high-performance analytics.

- Collaborate with data scientists and analysts to understand their data requirements and implement solutions that enable effective data analysis and reporting.

- Identify and resolve data quality and data integration issues by implementing data cleansing, transformation, and validation processes.

- Monitor and maintain data pipelines, ensuring data availability, reliability, and scalability.

- Work closely with cross-functional teams to gather requirements, provide technical expertise, and contribute to the overall data strategy and architecture.

- Stay up to date with emerging technologies, tools, and best practices in the field of data engineering and apply them to improve existing data infrastructure and processes.

- Document data models, processes, and technical specifications to facilitate knowledge sharing and maintain data lineage and governance.
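The ETL/ELT duties above follow the usual extract-transform-load shape. As a rough sketch only (plain Scala on in-memory data rather than a Spark cluster; all record and field names here are invented for illustration, not taken from the posting):

```scala
// Minimal extract-transform-load sketch in plain Scala.
// A real EDW pipeline would use Spark DataFrames on a cluster;
// this only illustrates the shape of the three stages.

case class RawOrder(id: String, amountCents: String, country: String)
case class CleanOrder(id: String, amountCents: Long, country: String)

object EtlSketch {
  // Extract: a real pipeline would read from Kafka, files, or a database.
  def extract(): Seq[RawOrder] = Seq(
    RawOrder("o1", "1250", "IN"),
    RawOrder("o2", "not-a-number", "IN"), // bad record, dropped in transform
    RawOrder("o3", "990", "US")
  )

  // Transform: validate and cleanse, discarding rows that fail parsing.
  def transform(raw: Seq[RawOrder]): Seq[CleanOrder] =
    raw.flatMap { r =>
      r.amountCents.toLongOption.map(a => CleanOrder(r.id, a, r.country))
    }

  // Load: a real pipeline would write to warehouse tables; here we just
  // aggregate to show the end of the flow.
  def load(rows: Seq[CleanOrder]): Map[String, Long] =
    rows.groupBy(_.country).view.mapValues(_.map(_.amountCents).sum).toMap
}
```

Chaining the stages, `EtlSketch.load(EtlSketch.transform(EtlSketch.extract()))` yields per-country totals, with the unparseable record silently filtered out by the validation step.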

Qualifications and Skills :

- Bachelor's degree in Computer Science, Information Systems, or a related field. A master's degree is a plus.

- Strong proficiency in the Scala programming language and experience with Spark for big-data processing and analytics.

- Solid understanding of data modeling concepts, including relational, dimensional, and schema design principles.

- Hands-on experience with enterprise data warehouse (EDW) design and implementation.

- Proficiency in SQL and experience with database technologies (e.g., Oracle, MySQL, PostgreSQL) for data manipulation and retrieval.

- Experience with data integration and ETL/ELT tools and frameworks (e.g., Apache Kafka, Apache Airflow, Talend, Informatica).

- Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and related data services (e.g., Amazon Redshift, Azure Synapse Analytics).

- Strong problem-solving and analytical skills with the ability to translate business requirements into technical solutions.

- Excellent communication skills, with the ability to collaborate effectively with both technical and non-technical stakeholders.
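The dimensional-modeling skills listed above typically mean star schemas: a central fact table carrying foreign keys into dimension tables. A minimal illustration in Scala (case classes standing in for warehouse tables; every table and column name here is hypothetical):

```scala
// Star-schema sketch: one fact table resolved through two dimensions.

case class DimDate(dateKey: Int, year: Int, month: Int)
case class DimProduct(productKey: Int, name: String, category: String)
case class FactSale(dateKey: Int, productKey: Int, amountCents: Long)

object StarSchemaSketch {
  val dates = Seq(DimDate(20240101, 2024, 1), DimDate(20240201, 2024, 2))
  val products = Seq(
    DimProduct(1, "Widget", "Hardware"),
    DimProduct(2, "Gadget", "Hardware")
  )
  val sales = Seq(
    FactSale(20240101, 1, 500),
    FactSale(20240101, 2, 300),
    FactSale(20240201, 1, 700)
  )

  // A typical dimensional query: revenue by month, obtained by joining
  // the fact rows to the date dimension through the surrogate key.
  def revenueByMonth(): Map[Int, Long] = {
    val dateByKey = dates.map(d => d.dateKey -> d).toMap
    sales.groupBy(s => dateByKey(s.dateKey).month)
      .view.mapValues(_.map(_.amountCents).sum).toMap
  }
}
```

In SQL terms, `revenueByMonth()` corresponds to a fact-to-dimension join with a `GROUP BY` on a dimension attribute, which is the query pattern a star schema is designed to make cheap.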

This job opening was posted a long time ago and may no longer be active, although it has not been removed by the recruiter. Please use your discretion.

Women-friendly workplace:

Maternity and Paternity Benefits
