This is a truly exciting time to join the team at Monster. Monster is building on its leadership position, investing in game-changing innovation, and creating a world-class organization.
Grab the opportunity for the post of Hadoop Developer or Software Engineer.
Big Data with AWS / Azure, fully work from home, long-term projects.
Skills: Jasper, Java
10+ years of experience managing and implementing high-end software products.
Strong experience as a Data Engineer and in handling a team. Amazon Web Services (AWS): EMR, Glue, S3, RDS, EC2, Lambda, SQS, SES; Apache Spark, Python. Experience: 5 to 10 years. Location: Noida, Gurugram, New Delhi.
Looking for an immediate joiner.
Proficiency in the Hadoop stack (Hadoop, HDFS, Hive, Spark, Scala) and/or other Big Data technologies.
Looking for a Sr. Hadoop Developer with experience in Hadoop, MapReduce, HDFS, and stream-processing systems using solutions such as Storm, Spark Streaming, and Kafka.
We need a technical resource who can work in Data Analytics or as a Backend Software Engineer.
The person will work as a Big Data Developer on the Azure Big Data platform using Azure Databricks and Azure Data Lake Storage and Analytics.
Immediate joiner - Java Developer - Noida. Excellent communication skills required.
Experience with Hybris 6.x and knowledge of Hybris - PAN India (remote).
A leading IT & consulting MNC requires a Sr. Data Engineer - WFH (permanent).
To work as a Linux/Unix admin for a US client.
Strong experience as a Data Engineer; strong expertise with data models, segmentation techniques, data systems, and pipelines. AWS (EMR, Glue, S3, RDS, EC2, Lambda, SQS, SES), Apache Spark, Python, Scala, PostgreSQL, Git, Linux.
Our client, a leading global bank, is looking for a Cloudera Administrator for Hadoop in Gurugram.
Working with stakeholders to understand their needs with respect to data structure, availability, scalability, and accessibility.
Job Description: Minimum experience of 3+ years for an Engineer and 7 years for an Analyst. Strong experience working in the Hadoop ecosystem, testing the various points of data entry/ingestion, transformation, processing, and final consumption by data analytics.
B.Tech/MS degree in Computer Science, Engineering, or a related subject. Strong experience in Java programming; minimum JDK 8 as the Java version.
1. Design and core architecture experience for highly scalable platforms.
2. Highly skilled in Java and related frameworks and technologies such as Spring, Hibernate, etc.
3. Highly skilled in database and schema design; in-depth understanding of MySQL.
Hands-on experience with Spark/Hive/Pig/Flume/Sqoop/Kafka.
A Data Architect is an expert in the definition, design, and implementation of data solutions adhering to enterprise architecture strategies, processes, and standards. Has expertise in big data technologies (Hadoop, NoSQL, Spark, Kafka) and cloud-compatible services.
Hands-on experience with AWS, Azure, and GCP (Google Cloud Platform).