· Design and set up infrastructure for transferring data from the legacy system to the data lake.
· Responsible for the performance, scalability, and extensibility of any application that uses the data lake.
· Determine database structural requirements by analyzing data-access patterns, reviewing objectives with data scientists, and evaluating current data within the data lake.
· Understand and communicate how data within the data model can interact and integrate with other applications, and coordinate with IT on the data requirements for data modeling.
· Lead the analysis of current data management technologies and platforms to detect critical deficiencies and recommend solutions for improvement. In addition, lead the impact analysis of new technologies and market trends on the current data architecture.
· Bachelor's degree required; advanced degree preferred
· Minimum 3-5 years of experience in a relevant business area
· Solid SQL/PLSQL skills and strong understanding of data quality and cleansing strategies
· Experience with ETL processes using ETL Tools such as SAS DI, Oracle Warehouse Builder, Microsoft SSIS, etc.
· Able to build large-scale distributed products
· Experience with various database technologies (e.g. time series/metrics databases, column-oriented datastores, key-value datastores)
· Innovative problem-solving skills with the ability to identify and resolve complex architectural issues
· Experience with Big Data technologies and their ecosystem is a plus.
· Experience with distributed computing frameworks, Apache Spark in particular, and their underlying internals is a plus.
· Experience with financial products such as deposits, loans, and investments is a plus.
For further discussion, please contact:
Tel : 02-658-6614 / 090-901-3821
Email : Pornpinit.email@example.com