Talent Acquisition at Thoucentric Technology Private Limited
Thoucentric - Data Architect (10-12 yrs)
- Should have experience in designing and architecting large-scale distributed applications
- Must have excellent data modelling skills (relational and dimensional) for Cloud or Big Data applications
- Excellent knowledge of relational database technologies, with experience in databases such as Oracle / SQL Server / Amazon Redshift, and working knowledge of NoSQL databases
- Hands-on experience with at least one ETL tool (Informatica / DataStage / Ab Initio / Alteryx / Talend / SSIS); Microsoft SSIS / Azure is preferred
- Data warehousing skills are mandatory
- Must be familiar with at least one scripting language (Unix shell, Perl, Python, etc.)
- Database architecture, data modelling and database design are mandatory
- Must have experience in database and performance tuning
- Exposure to data migration from on-premise to cloud is desirable, although not mandatory
- Experience with any one of the following stacks; Azure is preferred:
- The Hadoop stack (Hadoop, Hive, Pig, HBase, Sqoop, Flume, Spark, Shark, Oozie, etc.)
- Microsoft Azure platform (HDInsight, ADF (Azure Data Factory), Azure Cloud Services, Event Hub, SQL DW, Databricks)
- Proven experience in developing architecture blueprint, design specifications, data interfaces and integration approaches, infrastructure requirements, etc.
- Must be well versed in architecting a data-as-a-service layer for serving digital assets, decision support systems (BI), visualization/reporting and analytics
- Good understanding of architecting a data ingestion framework capable of processing structured, semi-structured and unstructured data sets in batch and real time, and of integrating them with internal data to develop real-time actionable insights
- Conduct prototyping with the solutions and drive requirements to closure with business users; able to use "Open Analytics" solutions for quick prototyping, then implement and operationalize them
- Candidate should be able to architect highly scalable distributed systems / big data solutions using various open-source tools; the Microsoft platform is preferred
- Candidate should be able to design, develop, load, maintain and test large-scale distributed systems.
- Should be able to focus on analysing and visualizing large data sets to turn information into insights, using multiple platforms
- Translate complex functional and technical requirements into detailed design.
- Should be able to install, configure and support Big Data tools.
- Maintain security and data privacy.
- Propose best practices/standards.
- Be part of POC efforts to help build new big data clusters
- Analytical mind with a problem-solving aptitude.
- Ability to cope in a complex and fast-changing business environment, and to respond calmly and rationally to changing aspirations in a deadline-driven situation.
- Good communication skills with a capacity to present, discuss and explain issues coherently and logically both in writing and verbally.
- Good team player, self-motivated and able to work on own initiative.
Mandatory Skills :
- Data Architecture, Big Data, Data Modelling, Design and Implement data solution
Desirable Skills :
- ETL Tools, Data Warehousing, Python, Hadoop Stack
Department : Analytics
Job Type : Permanent