Profile:
• Enrolled in a Bachelor's or Master's program in Computer Science, Data Science, or a related field.
• Strong interest in data engineering and data technologies.
• Strong SQL and Python skills (must-have).
• Familiarity with GCP (Google Cloud Platform).
• Good working knowledge of Looker and Terraform (required).
• Good problem-solving skills and attention to detail.
• Strong communication and teamwork abilities.
Missions:
• Collaborate with senior data engineers to design, build, and maintain data pipelines and ETL processes.
• Analyze functional and technical requirements.
• Implement solutions on GCP (BigQuery, Cloud Run, Pub/Sub, Cloud Functions, etc.).
• Assist in the integration and transformation of data from various sources.
• Contribute to the optimization and performance tuning of existing data pipelines.
• Assist in the development and maintenance of data warehouses and databases.
• Collaborate with data analysts and data scientists to ensure data availability and reliability.
• Document and communicate data engineering processes and best practices.