- Master's or Bachelor's degree in Computer Science, Computer Engineering, Statistics, Information Technology, or a related field
- At least 3-5 years of experience developing data warehouses, data lakes, big data platforms, etc.
- Knowledge of big data tools: Hadoop, Spark, Kafka, SQL and NoSQL databases, Python, Java, C++, Scala, etc.
- Knowledge of message queuing and stream processing
- Ability to build processes supporting data transformation and data structures