
Keywords / Skills: Dimensional data modeling, Matillion, Snowflake, programming and scripting, ETL development, ETL

4 - 6 years
Posted: 2019-11-29

Industry
IT/Computers - Software
Function
IT
Role
Software Engineer/ Programmer
Job Description
ETL Developer
Notice Period: 60 days

Budget: open

Experience: 4-6 years

Location: Pune



Specific Experience and Capabilities Required


● Sound knowledge of data warehousing concepts: star schema, snowflake schema, database architecture for OLTP and OLAP applications, data analysis, data requirement analysis, data mapping, and ELT processes

● Ability to create scalable ELT jobs

● Proven experience in dimensional data modeling and in implementing data warehouses/data marts

● Good experience with Snowflake, Matillion, Pentaho, slowly changing dimensions (SCDs), surrogate keys, normalization/denormalization, Hadoop, Hive, Amazon Redshift, PostgreSQL, MySQL, MongoDB, and others

● Experience with Snowflake encryption and Amazon KMS

● Good analytical, programming, problem-solving, and troubleshooting skills

● Good experience with Java, shell scripting, and Python

● Experience in setting up monitoring for ELT jobs

● Experience with event-driven processing and tools such as Apache Kafka or Amazon Kinesis

● Proven experience with, and constant attention to, evolving needs for sourcing, standardizing, and cleansing data

● Demonstrated understanding of data engineering and data integration

● Ability to identify and troubleshoot operational issues within a warehouse environment

● Ability to coordinate with QA to produce test plans and test automation that ensure the accuracy of ELT tasks and support System Integration Testing

● Ability to collaborate cross-functionally and work well in a distributed, team-oriented environment

● Experience working with DBAs on tuning queries and databases

● Understanding of best practices in managing enterprise data including master data, data quality, lineage and security

● Strong verbal and written communication skills

● Enthusiastic about evaluating new technologies to gain competitive advantages throughout Cyara’s data pipeline

● Experience working with high-volume non-relational event data, ideally across both event-stream capture and complex data correlation and analytics
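The SCD and surrogate-key items above can be illustrated with a minimal sketch. This is a hypothetical in-memory example, not the company's schema: an SCD Type 2 change closes out the current dimension row and inserts a new surrogate-keyed version instead of overwriting history.

```python
from datetime import date

def scd2_upsert(dim_rows, natural_key, new_attrs, today):
    """Apply an SCD Type 2 change to an in-memory dimension table.

    Each row carries a surrogate key ("sk") plus validity dates; the row
    with end_date=None is the current version for a natural key ("nk").
    """
    next_sk = max((r["sk"] for r in dim_rows), default=0) + 1
    for row in dim_rows:
        if row["nk"] == natural_key and row["end_date"] is None:
            if all(row.get(k) == v for k, v in new_attrs.items()):
                return dim_rows  # attributes unchanged: nothing to do
            row["end_date"] = today  # close out the current version
            break
    # Insert the new current version under a fresh surrogate key.
    dim_rows.append({"sk": next_sk, "nk": natural_key,
                     "start_date": today, "end_date": None, **new_attrs})
    return dim_rows

# Hypothetical customer dimension: the city attribute changes over time.
dim = [{"sk": 1, "nk": "C100", "city": "Pune",
        "start_date": date(2019, 1, 1), "end_date": None}]
dim = scd2_upsert(dim, "C100", {"city": "Mumbai"}, date(2019, 11, 29))
# The old row is closed and a new surrogate-keyed row holds the current city.
```

In a warehouse such as Snowflake the same pattern would typically be expressed as a `MERGE` or as a Matillion transformation component rather than application code.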

Responsibilities:



● Design and develop high-performing data warehouses and ELT processes

● Design and build data pipelines that pull data from multiple sources, including external feeds, push it to staging, and apply the necessary transformations to load it into the data warehouse/data marts

● Work with the DBA and NOC to troubleshoot issues related to warehouses and ELT processes/jobs

● Write SQL and stored procedures for reports and data aggregation

● Optimize and tune data-cleansing algorithms, queries, and databases

● Contribute to ensuring the security of the data while in motion and at rest

● Ability to work with technical precision under tight deadlines

● Personal initiative and independence in achieving outcomes

● Ability to pick up new technologies and trends and apply them effectively to current work

● Ability to work effectively within an agile team environment doing continuous testing

● Ability to produce high-quality, well-documented software artefacts

● Willingness to accept a wide range of responsibilities, including technical leadership of the corporate big data strategy
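The staging-and-aggregation responsibilities above can be sketched minimally. This is an illustrative example only: SQLite stands in for the target warehouse, and the table and column names are hypothetical. In an ELT flow, raw rows land in a staging table and the transformation runs as SQL inside the target database.

```python
import sqlite3

# In-memory SQLite as a stand-in warehouse (illustrative schema).
conn = sqlite3.connect(":memory:")

# "Extract/Load": raw source rows land in a staging table unchanged.
conn.execute("CREATE TABLE stg_orders (order_id INT, region TEXT, amount REAL)")
conn.executemany("INSERT INTO stg_orders VALUES (?, ?, ?)",
                 [(1, "west", 100.0), (2, "west", 50.0), (3, "east", 75.0)])

# "Transform": aggregation runs inside the target database, producing a
# reporting table from the staged rows.
conn.execute("""
    CREATE TABLE rpt_sales_by_region AS
    SELECT region, SUM(amount) AS total_amount, COUNT(*) AS order_count
    FROM stg_orders
    GROUP BY region
""")

rows = conn.execute(
    "SELECT region, total_amount, order_count FROM rpt_sales_by_region ORDER BY region"
).fetchall()
# rows -> [('east', 75.0, 1), ('west', 150.0, 2)]
```

In the stack named in this posting, the same transform-in-warehouse step would be orchestrated by Matillion and executed by Snowflake, with monitoring around the job rather than inline.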



About Company

Xangars Solutions Private Limited