Collect data from various sources (databases, logs, APIs); clean and process it to prepare for analysis and modeling.
Design and maintain data pipelines to ensure data is clean and ready for analysis.
Set up and manage data warehouses and data lakes to store large volumes of data from different systems.
Set up appropriate data access authorization rules.
Develop APIs to retrieve, update, and delete data in the data warehouse.
Integrate BI tools (Tableau, Power BI, Metabase, Apache Superset) and data analytics systems with the backend to supply data for reports and dashboards.
Improve data processing and query performance.
Candidate requirements
At least 2 years of experience in Data Engineering or a similar role.
Experience on at least one of the following project types: Data Analytics, Business Intelligence, Data Warehouse, Data Lake, Big Data, etc.
Experience integrating and working with BI tools such as Tableau, Power BI, Metabase, or Apache Superset.
Knowledge of ETL tools such as Apache NiFi, Apache Airflow, or similar.
Proficiency in SQL and at least one of the following programming languages: Python, Java, or Scala.
Knowledge of Big Data, Data Lake, Data Warehouse, ETL, ELT, and EDA systems.
Benefits
Salary: negotiable based on ability (up to 35M).
Work in an educational technology environment on a platform of foreign-language learning applications for English, Japanese, Vietnamese, Korean, Spanish, German, French, and more.
Full statutory benefits, including social insurance, holiday bonuses, and company trips.
A young, people-focused working environment with many development opportunities and clear career roadmaps.
Participation in training courses to improve capabilities and skills.