Minimum Qualifications and Job Requirements:
• Must have 3+ years of automation experience with Selenium.
• Strong knowledge of the Java programming language required.
• 1+ years working with Hadoop, HBase, and Hive.
• Good understanding of file formats in
• Design and implement networking / EMS / NMS solutions with high optimization and performance.
• Develop proofs of concept for critical modules and new frameworks.
• Establish high-, mid-, and micro-level plans and estimates for project teams.
About the role:
We are looking for a Senior Software Engineer. As a Senior Software Engineer, you will:
Deliver high-quality, optimized code. Collaborate with project teams to identify relevant data for analysis and develop the Big Data s
We are looking for a Big Data Engineer who will work on collecting, storing, processing, and analyzing huge sets of data. Your primary focus will be choosing optimal solutions, then implementing, maintaining, and monitoring them.
• Should have a bachelor’s degree in Statistics, Business Intelligence, or an equivalent field.
• Should have relevant experience working with Hadoop and Spark environments and related tools such as Hive, Presto, and Ranger.
JD for Hadoop: Minimum 6+ years of relevant experience in Hadoop administration.
• The most essential requirements: the candidate should be able to deploy a Hadoop cluster, add and remove nodes, keep track of jobs, and monitor critical parts of the cluster.
The candidate should have strong experience in the development and deployment of containerized microservices platforms. This position requires working in distributed agile teams and experience developing low-latency, scalable, resilient
Bengaluru / Bangalore
Hyderabad / Secunderabad
About the role:
As a Data Engineer, you will build a variety of big data analytics solutions, including big data lakes. More specifically, you will:
• Design and build scalable data ingestion pipelines to handle real-time streams, CDC events, and
Experience designing and architecting large-scale, real-time data processing applications in distributed environments. End-to-end responsibility for designing and developing high-performance, scalable data solutions.