Job Summary
Company Name
TATA Consultancy Services Ltd.
Experience
12 - 15 years
Key Skills
Big Data, Hadoop MR, Hive, Oozie, NoSQL DBs, Sqoop
System Analyst/ Tech Architect
IT/ Computers - Software
Posted On
27th Jun 2018
About Company

Just as an organization needs the right talent to drive its business objectives, people need the right environment to grow and achieve their career goals. The moment you step into TCS, you will be greeted with that unmistakable feeling of being in the right place. Along with that, working with TCS affords you the certainty of a successful career, driven by boundless growth opportunities and exposure to cutting-edge technologies and learning possibilities.

The work environment at TCS is built around the belief of growth beyond boundaries. Some of the critical factors that define our work culture are global exposure, cross-domain experience, and work-life balance. Each of these elements goes much deeper than what it ostensibly conveys.
  1. A part of the Tata group, India's largest industrial conglomerate
  2. Over 198,000 of the world's best-trained IT consultants across 42 countries
  3. The 1st Company in the world to be assessed at Level 5 for integrated enterprise-wide CMMI and PCMM
  4. Serving over 900 clients in 55 countries with repeat business from more than 98.3% of the clients annually
  5. 49 of the Top 100 Fortune 500 U.S. Companies are TCS clients

Job Title:

Hadoop Solution Architect

Job Description

Role: Big Data Architect

Desired Experience Range: 12+ Years overall (5+ in Big Data)

Location of Requirement

Desired Competencies (Technical/Behavioral Competency)



1. Should have experience with multiple Big Data tools and technologies such as Hadoop (MR), Hive, Oozie, NoSQL DBs, Sqoop, etc.; should have in-depth knowledge of the commonly used Big Data stack

2. Needs hands-on experience with Big Data applications across the full life cycle: requirement analysis, comparing candidate technologies/tools and selecting the best-fitting technology stack, development, administration, configuration management, monitoring, debugging, integration with other technology systems/applications/application layers, testing, and performance tuning

3. Should be experienced in Big Data cluster hardware estimation, both on-premise and in the cloud; able to benchmark Big Data clusters, analyse system bottlenecks, and propose solutions to eliminate them

4. Should have at least an understanding of a variety of hardware platforms including mainframes, distributed platforms, desktops, and mobile devices as well as a deep understanding of databases, data in storage and data in motion

5. In addition to Big Data solutions, needs a firm understanding of the end-to-end solution stack: major programming/scripting languages such as Java, Linux shell, PHP, Ruby, Python and/or R; experience working with ETL tools such as Informatica, Talend and/or Pentaho; BI tools; data warehouses; parallel architectures; and high-scale or distributed RDBMS and/or knowledge of NoSQL platforms


1. Have experience with a large cloud-computing infrastructure solution such as Amazon Web Services or Elastic MapReduce

2. Have experience with data and related services: governance, MDM, security, privacy, etc.

3. Be able to document use cases, solutions and recommendations

Responsibility of / Expectations from the Role

Be able to describe the structure and behaviour of a Big Data solution and how it can be delivered using Big Data technologies

Be able to envision the overall Big Data roadmap for the client and provide solutions / extrapolate metrics accordingly

Please apply with your updated CV.

