Job Summary
Company Name
TATA Consultancy Services Ltd.
6 - 12 years
Key Skills
Hadoop Administrator, Hadoop, Hadoop Admin
System Administrator
IT/ Computers - Software
Posted On
27th Jun 2018
About Company

Just as an organization needs the right talent to drive its business objectives, people need the right environment to grow and achieve their career goals. The moment you step into TCS, you will be greeted with that unmistakable feeling of being in the right place. Along with that, working with TCS affords you a sense of certainty of a successful career, driven by boundless growth opportunities and exposure to cutting-edge technologies and learning possibilities.

The work environment at TCS is built around the belief of growth beyond boundaries. Some of the critical factors that define our work culture are global exposure, cross-domain experience, and work-life balance. Each of these elements goes much deeper than what it ostensibly conveys.
  1. A part of the Tata group, India's largest industrial conglomerate
  2. Over 198,000 of the world's best-trained IT consultants across 42 countries
  3. The 1st Company in the world to be assessed at Level 5 for integrated enterprise-wide CMMI and PCMM
  4. Serving over 900 clients in 55 countries with repeat business from more than 98.3% of the clients annually
  5. 49 of the Top 100 Fortune 500 U.S. Companies are TCS clients

Job Title:

Hadoop Administrator

Job Description

Role: Hadoop Administrator

Required Technical Skill Set: Hadoop

No. of Requirements: 1

Desired Experience Range: 6 - 12 Years

Location of Requirement: Chennai

Desired Competencies (Technical/Behavioral Competency)


1. Hadoop Administration (E3 or E2)

2. Unix Administration (E2)

3. Unix Shell Scripting (E3 or E2)

4. NoSQL Database Administration (E1)


1. Java


3. Agile

Responsibility of / Expectations from the Role

1. Design and deploy a Hadoop cluster environment that can scale to petabytes

2. Manage HDFS, Hive, Flume, YARN, and all related Hadoop tools

3. Design, configure, and manage backup and disaster recovery for Hadoop data

4. Install and configure monitoring tools for the Hadoop environment; optimize and tune the environment to meet performance requirements

5. Work with big data developers, designers, and scientists to troubleshoot MapReduce job failures and issues with Hive, Pig, HBase, Flume, etc.

Please apply with your updated CV.
