Posted By

HR

HR - Talent Acquisition at Pioneer Financial & Management Services Ltd

Last Login: 23 April 2024

9571

JOB VIEWS

93

APPLICATIONS

59

RECRUITER ACTIONS

Posted in

IT & Systems

Job Code

750015

Big Data Solution Architect - IT

12 - 16 Years | Mumbai/Noida/Pune
Posted 4 years ago

Experience: 12+ years

Vertical experience: Government, Banking, Security, Telecom and Insurance projects.

Basic Function:

- The Big Data Architect works closely with the customer and the solutions architect to translate the customer's business requirements into a Big Data solution. This includes understanding the customer's data requirements, platform selection, design of the technical architecture, design of the application and interfaces, and development, testing, and deployment of the proposed solution.

- Has the ability to design enterprise-grade, large-scale data processing systems and help identify the best architecture options. The Big Data Architect also understands the complexity of data and can design systems and models to handle a wide variety of data with varying levels of volume, velocity and veracity.

- Should have independently proposed architecture, design and data ingestion concepts in a consultative mode. Leads client assessments, preparing current-state and future-state architectures along with go-forward recommendations. Will work with the practice leads and account management team to develop statements of work, implementation plans, resource plans and project estimates.

Role:

- Understand business requirements and convert them into solution designs; presales experience is expected.

- Architecture, design and development of a Big Data data lake platform.

- Understand the functional and non-functional requirements of the solution and mentor the team with technological expertise and decisions.

- Conduct peer reviews to ensure consistency, completeness and accuracy of the delivery.

- Hands-on experience working with Hadoop distribution platforms such as Hortonworks, Cloudera and MapR.

- Take end-to-end responsibility for the Hadoop life cycle in the organization.

- Full knowledge of the Hadoop architecture and HDFS is a must.

- Working knowledge of MapReduce, HBase, Pig, MongoDB, Cassandra, Impala, Oozie, Mahout, Flume, ZooKeeper, Sqoop and Hive.

- In addition to the above technologies, understanding of major programming/scripting languages such as Java, PHP, Ruby, Python and/or R, along with Linux.

- He or she should have experience designing solutions for multiple large data warehouses, with a good understanding of cluster and parallel architecture as well as high-scale or distributed RDBMS, and/or knowledge of NoSQL platforms.

- Must have at least 3 years' hands-on experience in one of the Big Data technologies (e.g. Apache Hadoop, HDP, Cloudera, MapR).

- MapReduce, HDFS, Hive, HBase, Impala, Pig, Tez, Oozie, Sqoop.

