Start date negotiable
Are you looking for a Data Engineering internship? Decathlon wants to make sport accessible to everyone in the most sustainable way. As a Data intern, you will support the data team across a range of topics and help the business make more data-driven decisions by making data accessible.
We are seeking a Data Engineering Intern to support the design, development, and maintenance of data pipelines and data infrastructure used for analytics and reporting. In this role, you will work with engineers and analysts to ingest, transform, and validate structured and unstructured data from multiple sources. Responsibilities include assisting with ETL processes, writing and optimizing SQL or Python scripts, documenting workflows, and helping ensure data quality and reliability. The ideal candidate is pursuing a degree in computer science, data engineering, or a related field, and has a foundational understanding of databases, programming, and data processing concepts.
Your activities include:
Operational Health: Monitor daily Airflow DAGs and Airbyte syncs; act as the first line of defense for pipeline failures (a minimal example DAG is sketched after this list).
DBT Operations: Monitor daily DBT jobs, troubleshoot model failures, and ensure the data warehouse is up to date for the business.
Analyst Support: Act as a "bridge" for data analysts, assisting them with DBT documentation, testing, and minor model adjustments.
SOP Execution: Troubleshoot common issues (API timeouts, schema drifts) using our existing playbooks and document new fixes.
Cloud Maintenance: Manage permissions and perform basic resource monitoring within AWS and Google Cloud projects.
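To give a concrete, purely illustrative sense of what "monitoring daily Airflow DAGs" involves, below is a minimal sketch of the kind of scheduled DAG you would keep an eye on. The DAG name, task, and schedule are hypothetical and assume Airflow 2.x; this is not Decathlon's actual pipeline.

```python
# Hypothetical, minimal daily DAG of the kind the intern would monitor.
# Names, schedule, and the check itself are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def check_airbyte_sync():
    # Placeholder: in practice this step might query the warehouse or the
    # Airbyte API to confirm that last night's sync landed successfully.
    print("checking last night's Airbyte sync")


with DAG(
    dag_id="daily_sales_refresh",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="check_airbyte_sync",
        python_callable=check_airbyte_sync,
    )
```

When a run of a DAG like this fails, the first-line task is to read the task logs in the Airflow UI, match the error against the existing playbooks, and escalate or document a new fix.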
You are enthusiastic and driven, and you have a passion for sport
You are not afraid to approach people and to make clear what you want to learn
You are available for at least 32 hours per week for a minimum of 4 months (ideally 6 months) to do an internship (of which at least 24 hours are co-working)
You are a Master's student in Data Engineering
You are fluent in English
Strong SQL skills are a must.
Python for general scripting and Airflow logic.
Intermediate understanding of data modeling: how tables relate (joins, primary keys, and basic star schema concepts); see the sketch after this list.
Ability to read logs in Databricks/Airflow and trace data flow through the stack.
Ability to explain technical table structures to analysts in a clear way.
You have a passion for Data
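To make the data modeling expectation concrete, here is a small, self-contained sketch of a star-schema join using Python's built-in sqlite3 module. The fact and dimension tables, their columns, and the sample rows are hypothetical; they are not Decathlon's warehouse model.

```python
# Illustrative star schema: one fact table keyed to one dimension table.
# All table names and data are made up for this sketch.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_store (
        store_id   INTEGER PRIMARY KEY,
        store_name TEXT
    );
    CREATE TABLE fact_sales (
        sale_id  INTEGER PRIMARY KEY,
        store_id INTEGER REFERENCES dim_store(store_id),
        amount   REAL
    );
    INSERT INTO dim_store VALUES (1, 'Amsterdam'), (2, 'Rotterdam');
    INSERT INTO fact_sales VALUES (10, 1, 59.95), (11, 1, 12.50), (12, 2, 89.00);
""")

# Join the fact table to its dimension on the foreign key and aggregate,
# the same pattern a star-schema reporting query follows.
for store_name, revenue in conn.execute("""
    SELECT s.store_name, SUM(f.amount) AS revenue
    FROM fact_sales AS f
    JOIN dim_store AS s ON s.store_id = f.store_id
    GROUP BY s.store_name
"""):
    print(store_name, revenue)
```

Running the script prints revenue per store, the kind of aggregate an analyst would typically build on top of such a model.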