Data Hub Librarian / ETL

Keywords / Skills : Ab Initio Developer

5 - 8 years
Posted: 2019-07-21

Industry
ITES/BPO
Function
IT
Education
Bachelors
Degree
Posted On
21st Jul 2019
Job Ref code
122740
Job Description
Role - Data Hub Librarian
Background:-
Large-scale data onboarding project to create a Data Hub, involving extensive data profiling, implementation of data-cleansing rules, and data lineage for the business.
Environment-
The ETL tool is primarily Ab Initio, comprising Metadata Hub (highly preferred), GDE, EME, Conduct It, Control Center, ACE and BRE (highly preferred).
Job Description:-
The successful candidate will be assigned to the BI Shared Services team for the data onboarding application. This is a lead role that will also act as a liaison between the core development team and the application product support team. It requires enthusiastic and committed individuals with previous experience in developing business intelligence systems and in working in an agile environment. The IT team is currently based in Paris; it is proposed to create a similar capacity in India, and the teams in both locations must work together effectively to meet the project objectives.
Skills -
- 6+ years' experience in Ab Initio development and 2+ years on Metadata Hub, importing datasets and creating data lineage. This is a must.
- 2+ years' experience with ACE and BRE, working closely with the business to create rule sets and application configuration templates. This is a must.
- Experience implementing and setting up Ab Initio Metadata Hub, importing data sources, and identifying the data sets required to create data lineage.
- Strong understanding of, or hands-on experience with, data quality, data profiling and metadata management procedures.
- Relevant industry experience highly preferred (Banking/Securities experience preferred).
- Acquire data from internal or external data sources and maintain mapping documents for the files, databases and data systems.
- Perform data analysis and data modeling to create source-to-target mappings.
- Evaluate statistical information to determine risk of operational impact or non-compliance.
- Create system models, specifications, diagrams and charts to provide direction to the business and other IT teams.
- Interpret data, analyze results and work closely with Data Stewards to gain a good understanding of the data, and perform data profiling to create data-cleansing rules.
- Knowledge of data processing on Hadoop architecture is nice to have.
- Profile and analyze source data (mainframe extracts, Oracle tables and UNIX flat files) using the Ab Initio Data Quality Environment (DQE).
- Identify data migration themes and quality patterns in the data analysis results, and provide guidance to the ETL designers regarding the quality of source data.