Design and develop code for data extraction, transformation, and loading (ETL) into databases within your area of responsibility
Maintain and support ETL jobs for the project
Collaborate with developers and business users to gather data requirements and run ETL programs
Continuously learn about data warehousing, data lakes, and big data concepts, and stay current with the latest tools and best practices in Python, SQL, Snowflake, and GCP
Develop and update technical documentation
Conduct unit testing and troubleshooting
Collaborate with cross-functional teams, including Data Analysts (DA) and Data Scientists (DS), to understand data requirements and develop data pipelines to support analytics and reporting
Stay abreast of industry trends and emerging technologies to recommend and implement improvements to our data engineering processes
Qualifications:
1-2 years' experience in a data engineering or related role (fresh graduates are welcome)
Basic knowledge of container concepts (Docker)
Strong understanding of database concepts, data modeling, and SQL
Experience with Microsoft SQL Server; experience with BigQuery and Snowflake is a plus
Experience with Apache Airflow and SSIS
Experience with BI tools such as Tableau and Looker Studio
Proficiency in Python and SQL (DML, DDL, DQL)
Strong team player with good communication skills and a passion for learning and adapting to new technologies in a fast-paced environment