Job Views: 284
Applications: 18
Recruiter Actions: 0
Posted in: IT & Systems
Job Code: 1620031

Big Data Technical Architect

Experience: 10 - 16 years

Location: Bengaluru

Work Mode: Hybrid

Notice Period: Immediate joiners only (notice period served or currently serving)

Note: Apply only if you are an immediate joiner with 10+ years of relevant experience as per the JD.

Job Description :

- We are seeking a technically proficient Big Data Technical Architect with 10+ years of experience and deep expertise in cloud platforms (AWS, Azure) and Business Intelligence (BI) tools such as Tableau, Power BI, and Qlik.

- The ideal candidate will have a solid understanding of modern data architecture patterns, data lake and lakehouse strategies, robust ETL/ELT design, real-time data ingestion, and hands-on experience with tools such as Snowflake, AWS Glue, Microsoft Fabric, Spark, and Python.

- This role requires a strategic thinker with excellent communication skills to interface with clients and internal stakeholders and lead enterprise-level data initiatives.

Roles & Responsibilities:

- Architect and guide implementation of Data Lakes, Lakehouses, and Data Warehouses using tools such as Snowflake, Microsoft Fabric, and Delta Lake.

- Design and implement scalable, secure, and high-performing Big Data architectures across AWS and Azure.

- Develop robust ETL/ELT pipelines using modern data services like AWS Glue, Azure Data Factory, Spark, and custom scripts in Python/SQL/PL/SQL (a minimal illustrative sketch follows this list).

- Integrate structured and unstructured data sources using API integrations, event-driven pipelines, real-time data ingestion, and batch processing.

- Lead the BI and analytics layer strategy using tools such as Tableau, Power BI, and Qlik Sense for enterprise reporting and dashboarding.

- Design and implement data models (conceptual, logical, physical) that support both operational and analytical requirements.

- Establish and enforce data governance, data security, and data quality standards across platforms.

- Drive data observability initiatives: monitor data pipelines, identify issues, and ensure SLA adherence.

- Serve as a technical SME and advisor to both internal teams and clients, translating business needs into technical solutions.

- Lead architectural reviews and provide guidance on data best practices and cloud optimization.

- Develop and deliver technical presentations to executive and non-technical stakeholders.
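To give a flavour of the pipeline work described above, here is a minimal PySpark batch ETL sketch. It is illustrative only; the bucket names, paths, and column names are hypothetical and stand in for real project specifics.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-etl").getOrCreate()

# Extract: raw JSON files landed in the data lake (hypothetical bucket/path).
raw = spark.read.json("s3://example-raw-zone/orders/2024-01-01/")

# Transform: basic cleansing and conforming, standing in for real business rules.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("amount") > 0)
)

# Load: partitioned Parquet in the curated zone, ready for downstream BI tools.
(clean.write.mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-curated-zone/orders/"))
```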

Domain Experience (Good to Have):

- Exposure to BFSI domain, including understanding of risk management, regulatory compliance (Basel III, PCI DSS), fraud detection, and financial data workflows.

- Familiarity with Retail data challenges such as supply chain analytics, customer behavior tracking, inventory management, and omni-channel reporting.

- Experience with Pharma and Healthcare sectors, including clinical data management, regulatory compliance (HIPAA, GDPR), patient analytics, and drug safety data.

- Ability to adapt data architecture and BI solutions to domain-specific requirements across these industries, supporting both operational and strategic business goals.

Required Skills:

Strong hands-on experience with cloud platforms and related data technologies:

- AWS: S3, AWS Glue, Redshift, Lambda, Kinesis Data Streams & Firehose, Managed Kafka (MSK), EMR (Spark), Athena, IAM, KMS.

- Azure: Data Lake Storage Gen2, Synapse Analytics, Data Factory, Event Hubs, Stream Analytics, Managed Kafka, Databricks, Azure Functions, Active Directory, Key Vault.

- Proven expertise in building and optimizing ETL/ELT pipelines using AWS Glue, Azure Data Factory, Apache Spark, and scripting languages like Python, SQL, and PL/SQL.

- Solid experience with data lake and lakehouse strategies, and hands-on with modern data warehouse platforms such as Snowflake and Microsoft Fabric.

- Skilled in real-time data ingestion and streaming technologies like Apache Kafka, AWS Kinesis, Azure Event Hubs, and Spark Streaming (see the sketch after this list).

- Deep understanding of data modeling concepts (conceptual, logical, physical) and best practices for both OLTP and OLAP systems.

- Expertise in business intelligence tools such as Tableau, Power BI, and Qlik Sense for enterprise-grade dashboards and analytics.

- Strong grasp of data governance, data security (encryption, access control), data quality frameworks, and data observability tools like Monte Carlo, Datadog, or Great Expectations.

- Familiarity with relevant data privacy and regulatory compliance standards (GDPR, CCPA, HIPAA, PCI DSS).

- Excellent client-facing communication skills with the ability to explain complex technical concepts to non-technical stakeholders.

- Proven leadership and mentoring capabilities in guiding cross-functional teams.
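As a flavour of the real-time ingestion skills above, here is a minimal Spark Structured Streaming sketch that reads from Kafka. It is illustrative only: the broker address, topic, schema, and paths are hypothetical, and the spark-sql-kafka connector must be on the classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("events-stream-ingest").getOrCreate()

# Expected JSON payload shape (hypothetical).
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("value", DoubleType()),
])

# Source: a Kafka topic; the same reader works against Amazon MSK or the
# Kafka-compatible endpoint of Azure Event Hubs.
stream = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker-1:9092")
         .option("subscribe", "events")
         .load()
)

# Kafka delivers raw bytes; parse the JSON value against the expected schema.
parsed = stream.select(
    F.from_json(F.col("value").cast("string"), event_schema).alias("e")
).select("e.*")

# Sink: append to the lake with checkpointing so the stream can restart safely.
query = (
    parsed.writeStream.format("parquet")
          .option("path", "s3://example-stream-zone/events/")
          .option("checkpointLocation", "s3://example-stream-zone/_chk/events/")
          .outputMode("append")
          .start()
)
query.awaitTermination()
```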

Qualifications:

- Bachelor of Engineering (B.E./B.Tech) degree in Computer Science, Information Technology, Electronics, or related field.
