Big Data AWS Developer/Lead/Architect for Pune - PERM 8 to 12 Years

Arminus Software Private Limited
Pune

Job Description

Responsibilities:
- Customer's trusted advisor: collaborate with customer/AWS partner account, training and support teams to help customers learn and use AWS services such as Amazon Elastic MapReduce (EMR), Redshift, Kinesis, SageMaker, Glue, S3, DynamoDB, and the Relational Database Service (RDS)
- Business partner: serve as a key member of the business development and account management team in helping to ensure customer success in building and migrating applications, software and services on the AWS platform.
- Public engagement: provide thought leadership on Analytics solutions that benefit customers through the use of AWS services. This takes the form of contributions to external publications such as blogs, whitepapers and reference architectures, or public presentations at industry events.
- Community player: capture and share best-practices, participate, and contribute to Xebia technology events/sessions.
Key Skills (Technical):
- Experience building complex software systems that have been successfully delivered to customers
- Knowledge of professional software engineering practices & best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
- Ability to take a project from scoping requirements through to launch.
- Experience in communicating with users, other technical teams, and management to collect requirements, describe software product features, and technical designs.
- Implementation and tuning experience in the Big Data / Apache Hadoop ecosystem (including tools such as Hadoop Streaming, Spark, Pig and Hive), or in the Real-time ecosystem (with tools such as Amazon Kinesis, Apache Kafka, Flink, Storm)
- Proven experience on database and analytics technologies in the industry (such as Data Warehousing, MPP, Relational, OLTP, OLAP, and NoSQL databases), on commercial or open source solutions (Microsoft SQL Server, Oracle Databases, IBM DB2, MySQL, PostgreSQL, MongoDB, Apache Cassandra, HBase, Snowflake, or Amazon Databases), and on different stages of the lifecycle (Schema Design, Query Tuning and Optimisation, Data Migration and Integration).
- Must be hands-on with EMR, Redshift, QuickSight, Athena, RDS, DynamoDB, AWS Glue, S3, and Glacier
- Should have working experience using AWS core services: EC2, S3, VPC, ELB.
- Must have implemented several projects using the Big Data ecosystem, including but not limited to HDFS, Hive, YARN, MapReduce, HCatalog, Spark, HBase, Kafka and Cassandra
- Must be able to write Spark code (Java/Scala/Python)
- Must be able to implement Kinesis/Kafka streaming solutions
- Must have working knowledge of different file formats such as text, CSV, Parquet, and ORC
- Must have worked on Delta Lake implementations (cloud-based or on-premises). Experience with Microsoft services is a major plus.
- Must have experience in Python, Bash, Java, and SQL.
- Must have knowledge of AWS services and tools such as Kinesis, IAM, CloudWatch, CloudTrail, Elasticsearch, Lambda, Neptune, SageMaker, PowerShell, Elastic Beanstalk, and CodeDeploy
- Must have hands-on working experience with CI/CD tools such as Jenkins/CodePipeline, GitHub/GitLab/Bitbucket, Chef, Maven, etc.
- Experience with Terraform and Puppet is a major plus
- Must have good knowledge of code deployment
- Must have experience in multithreaded and distributed system design and architecture
- Experience with Java concurrency and threading is a major plus
- Testing and debugging skills
- Deep understanding of the technical and use case differences between OLTP, OLAP, in-memory, columnar, MPP, NoSQL, big data batch and streaming data stores and their impact on application design and performance
- Experience in designing, deploying or managing Data Lake platforms in complex environments
- Understanding of security practices related to data ingestion, storage, analysis and visualization
- Familiarity with modern container technologies (Docker, Kubernetes), a deep understanding of DevOps tasks and CI/CD, and an understanding of monitoring and security best practices.
- Track record of implementing AWS services in a variety of business environments, from large enterprises to start-ups. AWS certifications (e.g. AWS Solutions Architect Associate/Professional) are a plus.

Job Details

Function: IT
