We are looking for a dynamic, highly talented professional for our esteemed client in Pune.
Our client is an AWS Cloud Consulting company providing Cloud Consulting, Solutions & Managed Services on AWS Cloud. We make Cloud work for organizations. We are a clan of passionate people specializing in Amazon Web Services. We love working on and solving complex architectural and design problems on AWS Cloud; check out our Customer Success Stories to see what we have done. Our customer-focused approach ensures high-quality delivery at every step of our engagement, from Advisory and Solution Design through Implementation, Delivery, and Management of Cloud Deployments.
Please find the details of the requirement below.
Salary: Open for the right candidate
Job location: Pune
Notice period: 30 days / Immediate
Experience in Business Intelligence (BI) and Data Warehousing (DW)
Clear understanding of Cube, OLAP, OLTP concepts
Hands-on experience with schema design and data modeling
Clear understanding of DFDs and ERDs
Ability to translate complex data flows/transformations into sequences of ETL implementation tasks
Thorough knowledge of ETL, data quality, data cleansing, and data blending tools
Familiarity with AWS Data Pipeline a plus
Proven ability to write and tune SQL queries; understanding of columnar databases such as Redshift and Actian Vector, and conventional RDBMSs such as Microsoft SQL Server, MySQL, etc.
Knowledge of one or more OLAP engines (e.g. Mondrian) and reporting/visualization tools (e.g. Tableau, Spotfire, Jaspersoft, QlikView, Looker, Datawatch, Kibana, Apache Zeppelin, etc.)
Work in teams to perform ETL (Extract, Transform, and Load) of data from a variety of sources, including SQL databases, NoSQL stores, and HDFS.
Review data quality and data definitions, and perform data cleansing and data management tasks.
Support client engagements focused on Big Data and advanced business analytics in diverse domains such as product development. Interface with databases (SQL, NoSQL, HDFS) to extract, transform, and load data. Utilize tools such as ODI, Informatica, Pentaho, and Alteryx, in addition to command-line SQL, Python, and scripting.
Manage data quality by reviewing data for errors introduced through data input, data transfer, or storage limitations. Perform data management to ensure data definitions, lineage, and sources are suitable for analysis.
Proficiency in SQL, in addition to one or more modern programming languages such as Java.
Experience working with databases (SQL, NoSQL, HDFS). Experience with data validation, cleaning, and munging. Ability to work in a Unix/Linux environment and proficiency in command-line scripting. Experience with ETL tools such as ODI, Informatica, Pentaho, and Alteryx preferred.
Work in a multidisciplinary team to understand available data sources, needs, and downstream uses.
Ability to understand domain and gain business knowledge quickly
Be research-oriented and passionate about learning new technologies and adapting to them
Strong problem-solving skills and the ability to break down complex concepts
If interested, send your updated resume to [HIDDEN TEXT]
Please include the following details so we can process your application faster:
Current CTC -
Expected CTC -
Notice Period -
Current location -
Total Years of Exp.-
Relevant Years of Exp.-
Thanks and regards,
Headfitted Solutions Pvt Ltd