Posted 16/09 by Manoj Sharma, Director at HuQuo


Data Engineer - KPO (3-10 yrs)

Location: Bangalore | Job Code: 1315355

About the client:


- Company Name: www.huquo.com


- Industries: IT Services & Consulting


- Company Type: Forbes Global 2000


- Funding Stage: Other


- Headquarters Location: US


- Nature of Offering: Service


- Founding Year: 2007


- No. of Employees: 501-1000


You have experience with client projects and with handling vast amounts of data: database design and development, data integration and ingestion, and designing ETL architectures using a variety of ETL tools and techniques. You are driven to implement the best possible solutions for clients and will work closely with a highly skilled Data Science team.

Responsibilities:

- Lead projects from a data engineering perspective, working with our clients to model their data landscape, obtain data extracts, and define secure data exchange approaches

- Plan and execute secure, good practice data integration strategies and approaches


- Acquire, ingest, and process data from multiple sources and systems into Big Data platforms


- Create and manage data environments in the Cloud


- Collaborate with our data scientists to map data fields to hypotheses and curate, wrangle, and prepare data for use in their advanced analytical models


- Have a strong understanding of Information Security principles to ensure compliant handling and management of client data


- This is a fantastic opportunity to be involved in end-to-end data management for cutting-edge Advanced Analytics and Data Science

Qualifications:

- Commercial experience leading client-facing projects, including working in close-knit teams


- 3+ years of experience and interest in Big Data technologies (Hadoop / Spark / Relational DBs)


- 3+ years of experience working on cloud projects, ideally on AWS or Azure


- Data Warehousing experience with cloud products like Snowflake, Azure DW, or Redshift


- Experience building operational data pipelines across a number of sources and constructing relational and dimensional data models using ETL/ELT tools (e.g. Talend, Informatica, Alteryx)


- ETL experience with Databricks


- Must possess a strong understanding of Data Quality concepts (e.g. profiling, cleansing, validation) and testing methods


- Strong development background with experience in SQL, along with knowledge of at least one scripting language (e.g. Python, Java, Scala, R)


- Good to have: experience with streaming architectures and patterns (e.g. Kafka)


- Good to have: an understanding of DevOps processes


- Comfortable in a client-facing role, with good consulting skills and knowledge of the SDLC process


- Excellent interpersonal skills, interacting with clients in a clear, timely, and professional manner


- A deep personal motivation to always produce outstanding work for your clients and colleagues


- Excel in team collaboration and in working with others from diverse skill sets and backgrounds

This job opening was posted a long time ago and may no longer be active, although it has not been removed by the recruiter. Please use your discretion.

Women-friendly workplace:

Maternity and Paternity Benefits
