Data Architect

Techwave Consulting India Private Limited
Hyderabad / Secunderabad

Job Description

Data Architect (www.techwave.net)
Should have 10+ years of relevant experience.
This is an individual contributor role.
Work with cross-functional teams to understand technical needs, design prototypes, demo systems, and establish POCs for cloud-based platforms (AWS, Azure, or GCP).
Provide leadership and hands-on contribution in areas such as Data Access, Data Integration, Data Visualization, Data Modelling, Data Quality, and Metadata Management.
Full understanding of data quality, data cleansing, and data transformation processes.
Define the appropriate use of data management tools, principles, and methods related to system development and operation.
Lead necessary technology proofs of concept in the data architecture domain and ensure alignment and communication across the organization.
Data modelling experience in a Data Warehouse / Data Mart environment.
Focus on increasing the efficiency and productivity of data systems by implementing sound data management practices and ensuring compliance with technology standards and established governance.
Responsibilities include performing data analysis, defining ETL architecture, data modelling, and implementing robust data pipelines in the cloud.
Provide expertise across Data Warehousing, ETL, Analytics, and Business Intelligence Modelling.
Leverage, and make recommendations on changes to, enterprise architectural frameworks for managing data and metadata, covering data quality, data transformation, data movement and transport, and lineage.
Consulting or IT experience supporting Enterprise Data Warehouse and Business Intelligence environments, including data-warehouse architecture and design. Responsible for defining the data strategy and for ensuring that programs and projects align to that strategy.
A hands-on architect who has implemented at least 3 full end-to-end ETL/ELT life cycles is mandatory (preferably at least one using Talend).
Cloud knowledge and experience, including cloud databases and data warehousing (e.g., Amazon RDS, Amazon DynamoDB, Amazon Redshift, Azure Cosmos DB).
Knowledge of Python/Spark is preferred.
Deep knowledge of big data technologies and proven expertise in applying them at large scale: Hadoop, MapReduce, Spark SQL, MongoDB, HBase, Elasticsearch, Kibana, and ETL tools.
Technology consulting experience is a plus.