Big-Data Technical Leads/ Architects

Keywords / Skills: Hadoop, Big Data, Pig, Hive, Sqoop, Flume, HBase, MapReduce

7 - 15 years
Posted: 2019-10-23

IT/Computers - Software
Team Leader/ Technical Leader
Job Description
Experience: 7+ Years

Primary Skills: Pig, Hive, Sqoop, Flume, MapReduce, HBase, Hadoop

DataMetica is seeking a Big Data Architect with expert-level experience in Big Data technologies and in implementing large-scale distributed data-processing systems. This is a challenging role at DataMetica, with the opportunity to build innovative Hadoop/Big Data products and solutions. You will be responsible for building end-to-end solutions from scratch, as well as for improving the existing distributed-system architecture in our clients' environments. As an Architect you will also be part of our Center of Excellence (CoE), where you will solve complex business problems, run technology comparisons, and analyze and extend distributed-architecture patterns and solutions. You will work directly with clients' chief architects, CTOs, and CEOs.

  • Hands-on technical role; contribute to all phases of the software development lifecycle, including analysis, architecture, design, implementation, and QA 
  • Collaborate on requirements; work with the Engineering, Product Management, and Client Success teams to define features that improve results for our customers and for us alike 
  • Partner with our Data Mining and Analytics team to perform analysis, build predictive models and optimization algorithms, and run experiments 
  • Work closely with Operations/IT to assist with the requirements and design of Hadoop clusters that handle very large scale; help troubleshoot operational issues 
  • The Hadoop Architect should have a solid background in the fundamentals of computer science, distributed computing, and large-scale data processing, as well as mastery of database design and data warehousing. The ideal candidate has a high degree of self-motivation, an unwavering commitment to excellence, a strong work ethic, a positive attitude, and is fun to work with. 
  • Expertise in building massively scalable distributed data-processing solutions with Hadoop, Hive, and Pig 
  • Proficiency with Big Data processing technologies (Hadoop, HBase, Flume, Oozie) 
  • Deep experience with distributed systems, large-scale non-relational data stores, MapReduce systems, data modeling, database performance, and multi-terabyte data warehouses 
  • Experience in data analytics, data mining, and predictive modeling 
  • Experience building data pipelines and analysis tools using Java and Python 
  • Hands-on Java experience building scalable solutions 
  • Experience building large-scale server-side systems with distributed processing algorithms 
  • Aptitude to independently learn new technologies 
  • Strong problem-solving skills 
  • Experience designing or implementing systems which work with external vendors' interfaces 
  • Ability to communicate effectively with internal teams 
  • Exposure to ISMS policies and procedures 

About Company

DataMetica is dedicated to client success, providing a robust and scalable analytics framework that becomes the backbone of the business. We help clients move from conceptualization to realization of analytical solutions using Big Data, Business Intelligence, and Analytics capabilities.
We work with Fortune 500 companies and have clients in the Healthcare, Finance, and Retail industries.