We have an immediate requirement for a Big Data Developer.
Experience: 4-8 Years
Location: Pune (Initially Remote/WFH)
Notice: Immediate to 30 Days
Duration: Full-time/Permanent
Key skills: (Spark+Java+GCP) or (Spark+Python), (Java/Scala/Python)
Job Role
• Big Data Developer with experience working in the Hadoop ecosystem, including Spark, HDFS, and Hive
• Scala programming experience is mandatory
• Must be able to work without close supervision; driven and committed
1. Minimum 7-8 years of experience in Hadoop.
2. Good experience with Hortonworks ecosystem components (HDFS, HBase, Kafka, Hive, YARN).
3. Must have experience writing complex HQL and managing and maintaining Hive tables.
Lead a team and deliver the best output. Responsible for the product roadmap. Maintain code quality and provide the best solutions to meet client needs, anticipating their future needs based on an understanding of the market.
Immediate requirement for a Cloud Data Engineer.
Experience: 4-8 Years
Duration: Permanent/Full-time
Location: Pune/Hyderabad/Chennai/Bangalore/Mumbai
Key Skills: Java+Spark+AWS/GCP
Notice: Immediate to 30 Days
We have multiple positions for people with 4-15 years of experience working with the AWS stack as Data Engineers or Architects. This is a hands-on role, and hiring is for iLink Digital.
As a Staff Big Data Engineer (Data Science/ML), you will build and expand the testing framework and testing infrastructure of IAS's core ad verification, analytics, and anti-ad-fraud software products.
Credit and fraud risk management across the customer lifecycle, covering acquisition, underwriting, customer management, and collections. Development of fraud risk strategies through best-in-class analytics and data science for US clients.