iimjobs
13/09 Kirti
HR Associate at THEMATHCOMPANY


TheMathCompany - Data Engineer Architect (8-11 yrs)

Bangalore Job Code: 1314002

REQUIRED QUALIFICATIONS:

- We are looking for individuals who are curious, excited about learning, and comfortable navigating the uncertainties and complexities associated with growing a company. Some qualifications that we think would help you thrive in this role are:

- BE/BS/MTech/MS in Computer Science, or equivalent work experience.

- 8+ years of experience building data processing applications using Hadoop, Spark, NoSQL databases, and Hadoop streaming.

ROLE DESCRIPTION:

- Responsible for designing, storing, processing, and maintaining large-scale data and related infrastructure

- Is a technical lead responsible for building data pipelines, implementing data quality checks, building data models, and monitoring and optimizing data pipelines to minimize runtime, while adhering to standards set by the customer and MathCo.

- Sets standards, drives customer conversations, builds end-to-end data flows, and optimizes for time and cost.

- Has a strong conceptual understanding of data warehousing and ETL, data governance and security, cloud computing, and batch and real-time data processing.

- Builds reusable frameworks that improve the efficiency of the overall data system.

- Can architect end-to-end data systems on at least one cloud platform, with the ability to adapt to other cloud platforms with minimal supervision.

- Has executed multiple projects spanning streaming, batch, and large-scale data pipelines.

- Manages conversations with client stakeholders to understand requirements and translate them into technical outcomes.

ROLE QUALIFICATIONS:

- Has strong working knowledge of data modeling, databases in general (SQL and NoSQL), the software development lifecycle and its practices, unit testing, functional programming, etc.

- Working knowledge of ETL and/or orchestration tools such as IICS, Matillion, Airflow, Azure Data Factory, AWS Glue, GCP Composer, etc.

- Working knowledge of one or more data warehouses such as Snowflake, Redshift, Hive, BigQuery, etc.

- Proficient in Spark and can optimize Spark jobs with ease

- Proficient in at least one programming language used in data engineering: Python (non-negotiable); Scala, Rust, or Java is a plus.

- Understanding of the Medallion architecture pattern.

- Has strong SQL knowledge along with query optimization skills.

Women-friendly workplace:

Maternity and Paternity Benefits
