Data Integration Engineer

Hexaware
4-7 years
Not Specified

Job Description


Responsibilities
- Front-end the delivery of processes for data extraction, transformation, and loading from disparate sources into a form consumable by analytics processes, for projects of moderate complexity, using strong technical capability and a sound sense of database performance
- Apply a sound understanding of dimensional data modelling standards and best practices to ensure high quality
- Batch Processing - Capability to design an efficient way of processing high volumes of data where a group of transactions is collected over a period of time
- Data Integration (Sourcing, Storage, and Migration) - Capability to design and implement models, capabilities, and solutions to manage data within the enterprise (structured and unstructured data, data archiving principles, data warehousing, data sourcing, etc.). This includes data models, storage requirements, and the migration of data from one system to another
- Data Quality, Profiling, and Cleansing - Capability to review (profile) a data set to establish its quality against a defined set of parameters and to highlight data where corrective action (cleansing) is required to remediate the data
- Stream Systems - Capability to discover, integrate, and ingest all available data from the machines that produce it, as fast as it is produced, in any format, and at any quality
- Excellent interpersonal skills to build a network across a variety of departments in the business to understand data and deliver business value; may interface and communicate with program teams, management, and stakeholders as required to deliver small to medium-sized projects
The Role Offers
- Great opportunities to learn the various tools and technologies used in a sophisticated data architecture within Business Intelligence and Analytics Data Services
- An opportunity to showcase the candidate's strong analytical skills and problem-solving ability
- An outstanding opportunity to re-imagine, redesign, and apply technology to add value to the business and operations
- Learning and growth opportunities in the cloud and big data engineering spaces
Essential Skills
- 3+ years of experience developing large-scale data pipelines in cloud or on-premises environments
- Highly proficient in one or more market-leading ETL tools such as Informatica, DataStage, SSIS, or Talend
- Fundamental knowledge of data warehouse/data mart architecture and modelling
- Ability to define and develop data ingestion, validation, and transformation pipelines
- Fundamental knowledge of distributed data processing and storage
- Fundamental knowledge of working with structured, unstructured, and semi-structured data
- For cloud data engineering roles, experience with ETL/ELT patterns, preferably using Azure Data Factory and Databricks jobs
- Extensive experience applying analytics, insights, and data mining to commercial, real-world problems
- Technical experience programming in Python and R, and with statistical packages (R, SAS)
Essential Qualification
- BE/BTech in Computer Science, Engineering, or a relevant field

About Hexaware

Job Source: careers.hexaware.com
