
Keywords / Skills: Kafka

Experience: 5 - 15 years
Posted: 2019-10-22

Industry: IT/Computers - Hardware & Networking
Function: IT
Role: Software Engineer / Programmer
Job Description
Responsibilities/Duties

Create and maintain optimal data pipeline architecture; assemble large, complex data solutions that meet functional and non-functional business requirements.

Identify, design, and implement data domains for a microservice environment (data synchronisation and data movement).

Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, cloud and ‘big data’ technologies.

Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.

Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs.

Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.

Work with data and analytics experts to strive for greater functionality in our data systems.

Technical Skills:

We are looking for a candidate with 3+ years of experience in a Data Engineer role.

Advanced working knowledge of SQL and experience with both relational and non-relational databases. Experience building and optimizing data pipelines, architectures and data sets.

Experience in solution design using appropriate design patterns.

Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.

Strong analytic skills related to working with unstructured datasets.

Experience building processes supporting data transformation, data structures, metadata, dependency and workload management, with a successful history of manipulating, processing and extracting value from large, disconnected datasets.

Working knowledge of message queuing, stream processing, and highly scalable data stores.

Strong project management and organizational skills, with experience supporting and working with cross-functional teams in a dynamic environment. You should also have experience using the following software/tools:

Working knowledge of Lambda and Kappa architectures

Experience with relational SQL and NoSQL (JSON, column-store, graph) databases

Experience with big data tools: Spark, Kafka, etc.

Experience with data integration patterns

Experience with cloud platforms – Azure

Experience with stream-processing systems: Kafka, Spark-Streaming, etc.

Experience with object-oriented and functional scripting languages: Python, Java, C++, Node.js, Scala, etc.
About Company

Neudesic is home to some very smart, talented and motivated people. People who want to work for an innovative company that values their skills and keeps their passions alive with new challenges and opportunities. We’ve created a culture of innovation that makes Neudesic not only an industry leader, but also a career destination for today’s brightest technologists. You can see it in our year-over-year growth, made possible by satisfied employees dedicated to delivering the right solutions to our clients.