Experience in data migration from on-premises databases to Snowflake. Hands-on experience with Snowflake utilities: SnowSQL, Snowpipe, and the Snowflake Python connector. Experience with cloud technologies (AWS, Azure, or GCP), ETL/ELT, and Matillion.
Experience in developing solutions focused on AWS, big data, and analytics. At least 2-3 years of solution-architect experience. Skills: AWS, data modeling, streaming and real-time data processing, big data, and data virtualization (Denodo, TIBCO).
Should have a bachelor's degree in Statistics, Business Intelligence, or an equivalent field. Should have relevant experience working with Hadoop and Spark environments and related tools such as Hive, Presto, and Ranger.
We have an immediate requirement for a Big Data Developer. Experience: 4-8 years. Location: Pune (initially remote/WFH). Notice period: immediate to 30 days. Duration: full-time/permanent. Key skills: (Spark + Java + GCP) or (Spark + Python); (Java/Scala/Python).
Responsibilities include defining the Altiplano architecture and design and playing a key role in the Business Unit's innovation strategy development. The Altiplano portfolio consists of multiple products in the fixed-access domain-controller space.
• Experience developing and administering large data systems. • Solid knowledge of CS fundamentals in algorithms and data structures. • Experience with Hadoop, Spark, and Kafka. • Experience with relational SQL and NoSQL databases, including SQL Server and Cosmos DB.