

Opening for Big Data/Data Scientist.

Keywords / Skills: Statistical software tools (SQL, R, Python); business analytics; forecasting or business planning with an emphasis on analytical modeling; quantitative reasoning and metrics reporting; good data intuition and analysis skills; SQL and PL/SQL knowledge is a must.

5 - 9 years
Posted: 2019-03-27

Nationality
India
Industry
IT/Computers - Hardware & Networking
IT/Computers - Software
Function
IT
Role
Software Engineer/ Programmer
Education
Bachelor of Technology (BTech/BE), Masters in Technology (MTech/ME/MSc), MSc
Stream:
Computers, Statistics
Job Ref code
GH/IT/JC//27032019//82720
Job Description
· Produce a detailed functional design document to match customer requirements.

· Responsible for preparing, reviewing, and owning technical design documentation.

· Review and prepare documents for Big Data applications according to system standards.

· Conduct peer reviews to ensure consistency, completeness, and accuracy of the delivery.

· Detect, analyse, and remediate performance problems.

· Evaluate and recommend software and hardware solutions to meet user needs.

· Responsible for project support, mentoring, and training for transition to the support team.

· Share best practices and be consultative to clients throughout the duration of the project.

· Hands-on experience working with Hadoop distribution platforms such as Hortonworks, Cloudera, MapR, and others.

· Take end-to-end responsibility for the Hadoop life cycle in the organization.

· Be the bridge between data scientists, engineers, and the organization's needs.

· Conduct in-depth requirement analysis and choose the appropriate work platform.

· Full knowledge of Hadoop architecture and HDFS is a must.

· Working knowledge of MapReduce, HBase, Pig, MongoDB, Cassandra, Impala, Oozie, Mahout, Flume, ZooKeeper/Sqoop, and Hive.

· In addition to the above technologies, understanding of major programming/scripting languages like Java, Linux, PHP, Ruby, Python, and/or R.

· He or she should have experience in designing solutions for multiple large data warehouses, with a good understanding of cluster and parallel architecture, as well as high-scale or distributed RDBMS and/or knowledge of NoSQL platforms.

· Must have at least 3 years' hands-on experience in one of the Big Data technologies (e.g. Apache Hadoop, HDP, Cloudera, MapR).

· MapReduce, HDFS, Hive, HBase, Impala, Pig, Tez, Oozie, Sqoop.

· Hands-on experience in designing and developing BI applications.

· Excellent knowledge of relational, NoSQL, and document databases, data lakes, and cloud storage.

· Expertise in various connectors and pipelines for batch and real-time data collection/delivery

· Experience in integrating with on-premises and public/private cloud platforms.

· Good knowledge of handling and implementing secure data collection, processing, and delivery.

· Desirable: knowledge of Hadoop-ecosystem components like Kafka, Spark, Solr, and Atlas.

About Company

GlobalHunt India is a leading executive search & selection firm, in terms of services offered, professional team, mandates handled, and industry specializations, across the Asia-Pacific, US, and European markets. Currently we would like to touch base with you regarding an opening with one of our clients. Please get in touch and send across your CV in Word format to discuss it further. I am sending you some details about the opening that should give you a better perspective on the position as well as the organization. If you have any queries, do give me a buzz.

