iimjobs

29/11 Parnika
Consultant at Gi Group (Elixir Consulting)


Senior Manager/AVP/Senior AVP - AWS Architecture (8-15 yrs)

Gurgaon/Gurugram/Bangalore Job Code: 1343647

Hiring for an AWS Architect role at a top MNC for Gurgaon/Bangalore locations

Notice period - 30 Days max

Pointers:

1. Data architecture experience with design and implementation of solutions

2. Real-time/streaming experience is required

3. Mandatory: Python + SQL + PySpark + Airflow

4. Kafka/Kinesis - any one is mandatory

5. Some knowledge of CI/CD, data quality, security, etc. is preferred

Job Responsibilities:

1. Data Architecture and Cloud Strategy

- Develop and maintain a comprehensive data architecture and cloud strategy that aligns with the organization's goals and needs.

- Design, implement, and manage cloud-based data infrastructure on AWS, ensuring scalability, reliability, and cost-efficiency.

- Utilize AWS services (S3, Glue, EMR, Redshift, Lambda, Kinesis, etc.) to build and optimize data pipelines and storage solutions.

- Champion the use of data lakehouse architecture and optimize its performance for analytical and operational workloads.

2. Data Engineering

- Lead and guide data engineering teams to develop, maintain, and optimize ETL processes for data ingestion, transformation, and loading.

- Implement real-time data processing solutions using technologies such as Apache Kafka and AWS Kinesis.

- Collaborate with data scientists and analysts to ensure data availability and quality, enabling effective analytics and reporting.

- Leverage dbt for data modelling and transformation to support self-service analytics and data governance.
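For illustration, the real-time processing this role calls for boils down to windowed aggregation over an event stream. The sketch below is a toy, dependency-free stand-in for what a Kafka/Kinesis consumer would compute downstream (the event names and window size are illustrative, not part of the role's actual stack):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed-size time windows
    and count occurrences per key - a toy version of the aggregation
    a streaming consumer performs over Kafka/Kinesis records."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Align each event to the start of its tumbling window.
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in windows.items()}

events = [(0, "click"), (3, "view"), (7, "click"), (12, "click")]
print(tumbling_window_counts(events, 10))
# {0: {'click': 2, 'view': 1}, 10: {'click': 1}}
```

In production this logic would typically run as a Spark Structured Streaming or Kinesis Data Analytics job rather than plain Python.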

3. Data Integration & Ingestion

- Architect and implement data integration solutions for API ingestion, enabling data from diverse sources to be captured, transformed, and ingested into our data lakehouse.

- Utilize Airbyte and custom APIs to ensure efficient, reliable, and secure data transfers.

- Manage data integration pipelines to support real-time and batch data processing.
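Ingesting the same records from both real-time and batch paths relies on upsert/merge semantics at the lakehouse table, which formats like Hudi provide. A minimal sketch of that merge rule, assuming records keyed by `id` with an `updated_at` timestamp (both names are illustrative):

```python
def upsert(table, batch, key="id", ts="updated_at"):
    """Merge a batch of records into a table keyed by `key`, keeping
    the record with the latest `ts` - toy upsert semantics similar
    to what a lakehouse table format (e.g. Hudi) provides."""
    merged = {r[key]: r for r in table}
    for rec in batch:
        current = merged.get(rec[key])
        # Newer (or equally fresh) records win; stale ones are dropped.
        if current is None or rec[ts] >= current[ts]:
            merged[rec[key]] = rec
    return sorted(merged.values(), key=lambda r: r[key])

table = [{"id": 1, "v": "a", "updated_at": 1}]
batch = [{"id": 1, "v": "b", "updated_at": 2},
         {"id": 2, "v": "c", "updated_at": 1}]
print(upsert(table, batch))
# [{'id': 1, 'v': 'b', 'updated_at': 2}, {'id': 2, 'v': 'c', 'updated_at': 1}]
```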

4. Workflow Orchestration

- Design, configure, and maintain workflow orchestration using Apache Airflow to automate ETL processes and data pipeline executions.

- Monitor and optimize job scheduling, error handling, and performance of data workflows.
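At its core, the orchestration Airflow does here is scheduling tasks in dependency order over a DAG. A minimal sketch using only the standard library (the task names are illustrative; a real Airflow DAG would declare operators and a schedule):

```python
from graphlib import TopologicalSorter

# Task dependency graph: each task maps to the set of tasks it depends on.
pipeline = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

def run_order(dag):
    """Return one valid execution order respecting all dependencies,
    analogous to how an orchestrator sequences a pipeline run."""
    return list(TopologicalSorter(dag).static_order())

print(run_order(pipeline))
# ['extract', 'transform', 'quality_check', 'load']
```

Airflow adds scheduling, retries, and error handling on top of this ordering, which is what the monitoring and optimization duties above concern.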

5. Security and Compliance

- Implement data security protocols, access controls, and encryption to safeguard sensitive data.

- Ensure compliance with data privacy regulations and industry standards.

6. Collaboration and Documentation

- Collaborate with cross-functional teams to understand data requirements and provide data solutions to meet their needs.

- Maintain comprehensive documentation for data engineering and data architecture processes and solutions.

7. Infra & Operations

- Guide the team in setting up cloud infrastructure and automating it using tools like Terraform, CloudFormation, Jenkins, etc.

- Guide the operations team in setting up automated monitoring and alerting mechanisms

Qualifications:

- Bachelor's or higher degree in a relevant field.

- 8+ years of proven experience in data engineering, cloud architecture, and AWS services.

- Extensive knowledge of data lakehouse technologies such as Apache Hudi, dbt, Airbyte, Redshift, Glue, Kinesis, and Apache Airflow.

- Strong expertise in programming languages like SQL and Python, and processing frameworks like PySpark.

- Strong expertise in real-time data processing.

- Excellent problem-solving and analytical skills.

- Strong communication and teamwork abilities.

Women-friendly workplace:

Maternity and Paternity Benefits
