Key Responsibilities
- Relevant experience on production-grade projects, with hands-on, end-to-end development.
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions
- Knowledge of system development lifecycle methodologies, such as Waterfall and Agile.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for operational, transactional, and operational reporting needs, including developing or interfacing with data analysis, data mapping, and data rationalization artifacts.
Proficiency in:
- Modern programming languages such as Java, Python, and Scala
- Hands-on experience with Big Data technologies: Hadoop, Spark, Hive, Kafka
- Hands-on experience with cloud services: AWS, Azure
- Orchestration and deployment tools such as Airflow, and Jenkins for CI/CD