iimjobs
Job Views: 248
Applications: 43
Recruiter Actions: 0
Posted in: IT & Systems
Job Code: 1674511

Data Architect - Dremio Lakehouse

CREWKARMA NETWORKS PRIVATE LIMITED | 5-17 yrs | Gurgaon/Gurugram/Mumbai/Bangalore/Hyderabad
Posted 1 month ago

Description:

Data Architect - Dremio Lakehouse

Location: Mumbai / Bengaluru / Hyderabad / Gurugram

Experience: 5-17 Years

Work Mode: 3 Days Work From Office

Role Overview:

We are hiring a Data Architect with deep expertise in Dremio-based lakehouse architecture to lead enterprise-scale data modernization initiatives. The role requires strong hands-on Dremio experience combined with architectural ownership of ingestion, semantic modeling, governance, and performance optimization across cloud ecosystems.

This position is ideal for candidates currently working in enterprise analytics, cloud-native data platforms, consulting-led modernization programs, or advanced data engineering environments.

Key Responsibilities:
- Architect and implement Dremio-based lakehouse solutions on AWS/Azure ecosystems.

- Define data ingestion, curation, and semantic modeling strategies supporting analytics and AI workloads.

- Design and optimize reflections, caching strategies, and query performance within Dremio.

- Integrate data sources via APIs, JDBC, Delta/Parquet, and object storage (S3 / ADLS / GCS).

- Collaborate with data engineering teams working with Airflow, DBT, Kafka, Spark, and related pipeline tools.

- Establish enterprise-grade governance including lineage, RBAC-based access control, and metadata management.

- Enable governed self-service analytics and well-defined semantic layers.

- Develop reusable design patterns, documentation standards, and scalable deployment frameworks.

Mandatory Requirements:

- 5+ years of experience in Data Architecture / Data Engineering.

- At least 3 years of hands-on Dremio experience (configuration and setup experience must be clearly reflected in the CV).

- Strong expertise in SQL optimization, analytical schema design, and query performance tuning.

- Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg.

- Proven experience designing and implementing end-to-end lakehouse architecture.

- Strong understanding of data governance, lineage, security architecture, and RBAC frameworks.

- Experience working in enterprise data modernization, cloud-native, or analytics-driven environments.

- Strong stakeholder communication and documentation discipline.

Preferred:

- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker).

- Exposure to data catalogs (Collibra, Alation, Purview).

- Experience with Snowflake, Databricks, or BigQuery ecosystems.

