
Big Data Architect

Keywords / Skills : Big Data Architect, Architect, Big Data, Big Data Analytics

9 - 15 years
Posted: 2019-07-30

Industry
IT/Computers - Software
Function
IT
Role
System Analyst/ Tech Architect
Job Description
Total of 10 – 15 years of experience in BI & DW (Business Intelligence & Data Warehousing), with at least 4 – 6 years of experience in Big Data implementations

• Understand business requirements and convert them into solution designs

• Architect, design, and develop the Big Data platform.

• Understand the functional and non-functional requirements of the solution and mentor the team on technology choices and decisions.

• Responsible for preparing, reviewing, and owning technical design documentation for Big Data applications according to system standards.

• Responsible for project support, mentoring, and training during the transition to the support team.

• Hands-on experience working with Big Data platforms such as Hadoop (Hortonworks, Cloudera, MapR, etc.) and NoSQL databases such as MongoDB, Cassandra, and MarkLogic

• Experience implementing Big Data architecture on AWS, Microsoft Azure, or GCP (Google Cloud Platform)

• Take end-to-end responsibility for the Big Data architecture life cycle in the organization

• Be the bridge between data scientists, engineers, and the organization's needs.

• Perform in-depth requirement analysis and select the most appropriate platform for the work.

• Full knowledge of Hadoop Architecture and HDFS is a must

• Working knowledge of MapReduce, HBase, Pig, MongoDB, Cassandra, Impala, Oozie, Mahout, Flume, ZooKeeper, Sqoop, and Hive

• In addition to the above technologies, an understanding of major programming/scripting languages such as Java, Linux shell scripting, PHP, Ruby, Python, and/or R

• Experience designing solutions for multiple large data warehouses, with a good understanding of cluster and parallel architectures as well as high-scale or distributed RDBMS and/or NoSQL platforms

• Must have a minimum of 3+ years of hands-on experience with at least one Big Data technology (e.g. Apache Hadoop, HDP, Cloudera, MapR)

• MapReduce, HDFS, Hive, HBase, Impala, Pig, Tez, Oozie, Sqoop

• Hands-on experience designing and developing BI applications

• Excellent knowledge of relational, NoSQL, and document databases, data lakes, and cloud storage

• Expertise in various connectors and pipelines for batch and real-time data collection/delivery

• Experience integrating with on-premises and public/private cloud platforms

• Good knowledge of handling and implementing secure data collection, processing, and delivery

• Good knowledge of Hadoop-ecosystem components such as Kafka, Spark, Solr, and Atlas

• Desirable: knowledge of an open-source data ingestion tool such as Talend, Pentaho, or Apache NiFi

• Desirable: knowledge of an open-source reporting tool such as BIRT, Pentaho, JasperReports, KNIME, Google Chart API, or D3



About Company

Tech Mahindra represents the connected world, offering innovative and customer-centric information technology experiences, enabling Enterprises, Associates and the Society to Rise™. We are a USD 4.7 billion company with 115,200+ professionals across 90 countries, helping over 903 global customers including Fortune 500 companies. Our convergent, digital, design experiences, innovation platforms and reusable assets connect across a number of technologies to deliver tangible business value and experiences to our stakeholders. Tech Mahindra is amongst the Fab 50 companies in Asia (Forbes 2016 list).

We are part of the USD 19 billion Mahindra Group that employs more than 200,000 people in over 100 countries. The Group operates in the key industries that drive economic growth, enjoying a leadership position in tractors, utility vehicles, after-market, information technology and vacation ownership.