
Platform Admins

Keywords / Skills: Linux, HDFS, YARN, Oozie, Hive, Spark, Kerberos, Cloud, HBase, Hadoop

2 - 10 years
Posted: 2019-10-10

Industry
IT/Computers - Software
Function
IT
Role
Software Engineer/ Programmer
Job Description

Core Role Responsibilities:

• Implementation and ongoing administration of Hadoop infrastructure: deploying a Hadoop cluster, maintaining it, adding and removing nodes using cluster-monitoring tools, configuring NameNode high availability, and keeping track of all running Hadoop jobs.

• Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments; taking care of the day-to-day running of Hadoop clusters.

• Work closely with the database, network, BI, and application teams to ensure that all big data applications are highly available and performing as expected.

• Responsible for capacity planning and estimating the requirements for scaling the Hadoop cluster up or down.

• Ensure that the Hadoop cluster is up and running at all times, adhering to the client's SLAs.

• Monitoring the cluster connectivity and performance.

• Manage and review Hadoop log files.

• Backup and recovery tasks

• Resource and security management (Kerberos, Knox, Ranger, etc.)

• Troubleshooting application errors and ensuring that they do not occur again.

• Working with client delivery teams to set up new Hadoop clusters and users; cluster maintenance, including creation and removal of nodes.

• Performance tuning of Hadoop clusters and Hadoop routines.

• Screen Hadoop cluster job performance and plan capacity.

• Monitor Hadoop cluster connectivity and security

• File system management and monitoring.

• Team up with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability.

• Collaborating with application teams to install operating-system and HDP updates, patches, and version upgrades when required.

• Point of contact for vendor escalations.

• Training and mentoring junior team members.

• Deploying, automating, maintaining, and managing Hadoop-based production systems to ensure their availability, performance, scalability, and security.

• Build, release and configuration management of production systems.

• Pre-production acceptance testing to help assure the quality of our products/services.

• System troubleshooting and problem solving across the platform, Hadoop, and application domains.

• Suggesting architecture improvements, recommending process improvements.

• Evaluate new technology options and vendor products.

• Ensuring critical system security using best-in-class cloud security solutions.
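In practice, several of the monitoring and capacity-planning duties above come down to routine checks of command-line output. A minimal sketch of one such check, computing how full the cluster is from an `hdfs dfsadmin -report`-style summary (the report text below is a made-up sample; on a live cluster it would come from running the command itself):

```shell
# Made-up sample of the summary lines `hdfs dfsadmin -report` prints;
# on a live cluster: report=$(hdfs dfsadmin -report)
report="Configured Capacity: 1000000000 (1 GB)
DFS Used: 250000000 (250 MB)
DFS Remaining: 750000000 (750 MB)"

# Pull the raw byte counts out of the report text
capacity=$(echo "$report" | awk -F': ' '/^Configured Capacity/ {print $2}' | awk '{print $1}')
used=$(echo "$report" | awk -F': ' '/^DFS Used/ {print $2}' | awk '{print $1}')

# Integer percentage of capacity in use -- a common input to capacity planning
pct=$(( 100 * used / capacity ))
echo "Cluster is ${pct}% full"
```

A check like this is typically wrapped in a cron job or monitoring probe that alerts when the percentage crosses a threshold.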



About Company

Aarvi Encon Limited