Spark Hadoop with AWS Developer

Tata Group
7-10 years
Not Specified

Job Description

  • Strong knowledge of Spark, Scala, Python, and PySpark
  • Strong in Teradata SQL concepts
  • Good knowledge of big data technologies such as Hive, Sqoop, Pig, Kafka, and Flume
  • Knowledge of Hive Query Language, including performance tuning
  • Experience with version control tools such as SVN and GitHub, as well as experience with production deployment processes
  • Batch processing in the Hadoop ecosystem
  • Hands-on development with AWS services: Glue, Lake Formation, S3, DynamoDB, SQS, SNS, Aurora, Lambda, Step Functions, API Gateway, IAM, CloudWatch, Kinesis, ElasticSearch, Airflow, EventBridge, Grafana
  • Basic knowledge of the following:
      1. SQL Server
      2. Linux shell scripting
      3. Data warehousing concepts
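
As an illustration of the Spark/Hive aggregation skills listed above, here is a minimal sketch in plain Python of the group-by-and-aggregate pattern that the role exercises in PySpark and HiveQL. All table names, columns, and values here are hypothetical examples, not from the posting; a production job would express this via `df.groupBy(...).agg(...)` in PySpark or a `GROUP BY` in HiveQL.

```python
from collections import defaultdict

# Hypothetical sample rows, standing in for a Hive table or Spark DataFrame.
orders = [
    {"region": "East", "amount": 120.0},
    {"region": "West", "amount": 80.0},
    {"region": "East", "amount": 40.0},
]

def total_by_region(rows):
    """Single-pass group-by-and-sum: the same shape as PySpark's
    df.groupBy("region").agg(sum("amount")) or HiveQL's
    SELECT region, SUM(amount) FROM orders GROUP BY region."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

print(total_by_region(orders))  # {'East': 160.0, 'West': 80.0}
```

On a real Hive table, the performance aspects mentioned above would come in through partition pruning (filtering on the table's partition columns) so the query reads only the relevant partitions rather than the full dataset.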
Work Location: Kolkata
Experience Range: 7+ years
Roles & Responsibilities:
  • Interacting with IT Business Analysts and / or our business partners to acquire the information necessary for product design.
  • Following written requirements to help design databases and develop dashboards to support our customers.
  • Translating raw data from multiple sources (ETL) into a single dataset at aggregate levels necessary to meet project needs.
  • Suggesting alternative ways to group, filter and sort aggregated data for optimal presentation and performance possibilities.
  • Providing front-line production support by analyzing customer questions, evaluating data sets to explain end results or quickly adjust processes based on consumer demands.
  • Working independently and with groups while engaging in multiple initiatives simultaneously in a fast-paced IT environment.
  • Resolving business process conflicts as they relate to project specifications.
  • Reporting project statuses to IT Management staff as required.
  • Review the current environment used for product launch; design and recommend the go-forward architecture using Spark, Python, and Teradata.
  • Migrate to and implement the new rules and jobs to support Operational Intelligence
  • Set up and configure alerting and monitoring
  • Create STTMs, ERDs, data dictionaries and other knowledge artifacts for the NDW Base environment
  • Ability to leverage BTEQ scripts, familiarity with Informatica and UC4/Automic, and strong SQL skills.
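
The ETL responsibility above (translating raw data from multiple sources into a single aggregated dataset) can be sketched as follows. This is a plain-Python logic sketch with hypothetical source names and fields; on the job, the same join-then-aggregate step would be written in PySpark or Teradata SQL.

```python
# Two hypothetical raw sources: a customer source and a billing source.
crm_rows = [
    {"customer_id": 1, "segment": "retail"},
    {"customer_id": 2, "segment": "enterprise"},
]
billing_rows = [
    {"customer_id": 1, "charge": 10.0},
    {"customer_id": 1, "charge": 5.0},
    {"customer_id": 2, "charge": 99.0},
]

def revenue_by_segment(customers, charges):
    """Join charges to customers on customer_id, then aggregate revenue
    at the segment level -- producing the 'single dataset at aggregate
    levels' described in the responsibilities."""
    segment_of = {c["customer_id"]: c["segment"] for c in customers}
    totals = {}
    for charge in charges:
        seg = segment_of.get(charge["customer_id"], "unknown")
        totals[seg] = totals.get(seg, 0.0) + charge["charge"]
    return totals

print(revenue_by_segment(crm_rows, billing_rows))
# {'retail': 15.0, 'enterprise': 99.0}
```

Grouping and filtering choices at this stage (which keys to aggregate on, what to roll up versus keep detailed) are exactly the presentation/performance trade-offs the role is asked to advise on.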
