- Minimum 7 years of software development experience
- Bachelor's and/or Master's degree in computer science
- Strong consulting skills in data management, including data governance, data quality, security, data integration, processing, and provisioning
- Led and delivered data management projects in Azure Cloud
- Translated complex analytical requirements into technical designs including data models, ETL pipelines, and dashboards/reports
- Experience deploying dashboards and self-service analytics solutions on both relational and non-relational databases
- Experience with different computing paradigms in databases such as In-Memory, Distributed, Massively Parallel Processing
- Successfully delivered large scale data management initiatives covering Plan, Design, Build and Deploy phases leveraging different delivery methodologies including Agile
- Strong knowledge of continuous integration, static code analysis and test-driven development
- Knowledge of databases in Cloud platforms - AWS, Azure and GCP
- Experience delivering projects in a highly collaborative delivery model with onsite and offshore teams
- Excellent analytical and problem-solving skills
- Delivered change management initiatives focused on driving data platforms adoption across the enterprise
- Strong verbal and written communication skills, and the ability to work effectively across internal and external organizations

You Will:
- Translate functional requirements into technical design
- Interact with clients and internal stakeholders to understand the data and platform requirements in detail and determine the core Azure services needed to fulfill the technical design
- Architect on Azure platform to meet client specific functional and non-functional requirements
- Oversee design and solution architecture to ensure standards are followed and the codebase is modular and scalable
- Estimate effort using a parametric estimation model and align all stakeholders on the overall schedule and dependencies
- Design, Develop and Deliver data integration interfaces in ADF and Azure Databricks
- Design, Develop and Deliver data provisioning interfaces to fulfill consumption needs
- Deliver data models on the Azure platform, whether on Azure Cosmos DB, SQL DW / Synapse, or SQL
- Advise clients on ML Engineering and deploying ML Ops at Scale on AKS
- Automate core activities to minimize the delivery lead times and improve the overall quality
- Optimize platform cost by selecting the right platform services and architecting the solution in a cost-effective manner
- Deploy Azure DevOps and CI/CD processes
- Deploy logging and monitoring across the different integration points for critical alerts

Brief: This role requires experience in Azure core technologies - Azure Data Lake Storage, Azure Data Lake Analytics, Azure Cosmos DB, Azure Data Factory, Azure SQL Database, Azure HDInsight, Azure Databricks, and SQL Data Warehouse.
Skills required: Azure, Data Factory, Data Warehouse, Data Lake, Spark, Python, Kubernetes