Our client, a top-tier global investment bank, is building a new big data storage and analytics platform to support its strategic goals.
The platform's primary purpose is to centralize all transaction-cost data and to provide reporting that will help reduce operations overhead and automate manual processes.
The candidate will develop new functionality for the EQD project, working closely with other teams around the globe.
- Building a Big Data cluster to support the bank's current transaction model
- A lot of reference and historical data will eventually reside in Hadoop (part of this project may involve storing data in HDFS), but initially much of the data will be sourced from SQL databases and kdb+
- That data, combined with transactional data, forms the basis of pre-trade analytics
- Strong RDBMS skills are needed, as relational databases will initially be the source of much of that data
Requirements
- 5+ years of experience with Hive, HBase and Impala
- Solid understanding of, and fluency with, the architecture of the mandatory stack
- Hands-on ability to develop solutions and write code
- Knowledge of SQL and databases
Nice to have
- Ability to lead a Big Data team
- English: Upper-intermediate