• Excellent knowledge of Core Java and Spring/Spring Boot.
• Working knowledge of web services.
• Experience working in a distributed Agile model with continuous integration.
• Should have knowledge of …
• 10+ years of experience in IT, with 7+ years in the Big Data ecosystem.
• Experience developing on Hadoop-ecosystem technologies such as Python, PySpark, HDFS, Hive, Pig, Flume, Sqoop, ZooKeeper, Spark, MapReduce2, YARN, HBase, Kafka, and Storm.
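As a quick illustration of the MapReduce programming model named above, here is a minimal pure-Python sketch of a word count: a map phase emitting (word, 1) pairs, a shuffle/sort grouping by key, and a reduce phase summing counts. This is only a conceptual sketch of how Hadoop MapReduce or PySpark process data, not production code; all function names here are illustrative.

```python
from itertools import groupby
from operator import itemgetter

def mapper(line):
    # Map phase: emit a (word, 1) pair for every word in the line.
    for word in line.lower().split():
        yield word, 1

def reducer(word, counts):
    # Reduce phase: sum all counts emitted for a single key.
    return word, sum(counts)

def map_reduce(lines):
    # Shuffle/sort: collect and sort intermediate pairs by key,
    # mimicking what the Hadoop framework does between phases.
    pairs = sorted(kv for line in lines for kv in mapper(line))
    return dict(
        reducer(key, (count for _, count in group))
        for key, group in groupby(pairs, key=itemgetter(0))
    )

print(map_reduce(["big data big"]))  # {'big': 2, 'data': 1}
```

In PySpark the same pipeline would typically be expressed with `flatMap`, `map`, and `reduceByKey` on an RDD, with the framework handling the shuffle across the cluster.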
• Work with developers to design algorithms and flowcharts.
• Produce clean, efficient code based on specifications.
• Integrate software components and third-party programs.
• Verify and deploy programs and systems.
• Troubleshoot, debug, and upgrade existing software.
Experience: 3 to 8 years.
• Strong development knowledge of Terraform with a deep Infrastructure as Code (IaC) automation background.
• Ability to write software that interacts with AWS services and/or is hosted in AWS, specifically using boto3 and the AWS SDK for Java.
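To illustrate the Terraform/IaC skill above, here is a minimal configuration fragment that declares the AWS provider and provisions a single S3 bucket. The region and bucket name are placeholder assumptions, not values from this posting; a real setup would also configure remote state and variables.

```hcl
# Minimal Terraform sketch: declares the AWS provider and one S3 bucket.
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  region = "us-east-1"  # assumed region, adjust as needed
}

resource "aws_s3_bucket" "data_lake" {
  bucket = "example-data-lake-bucket"  # hypothetical bucket name
}
```

Running `terraform init` followed by `terraform plan` and `terraform apply` would provision the bucket; the equivalent from application code would use boto3 (Python) or the AWS SDK for Java.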