Job description:
For our customer, we are looking for an experienced **Data Engineer** to support their data team with the development and maintenance of ETL pipelines and data warehouse solutions. The consultant will extract, transform and load data from various sources, as well as develop and optimize data flows using Python.
**Responsibilities:**
* Design, develop and maintain ETL pipelines to ensure efficient and accurate data transfer
* Implement and optimize data warehouse solutions to support business analysis and reporting
* Write and maintain Python scripts to automate data flows and processes
* Collaborate with data teams and business stakeholders to understand data needs and requirements
**Qualifications:**
* Experience in designing and developing ETL pipelines using tools such as Apache Airflow or Dagster
* Knowledge of data warehousing and data modeling
* Advanced knowledge of Python and experience writing scripts for data management and automation; we primarily use Polars
* Experience working with SQL and database management
* Ability to work independently and in teams, and to manage multiple projects simultaneously
* Good communication skills and the ability to collaborate with different stakeholders
**Competence level:**
* Knowledge: Deep knowledge in their field
* Experience: 4-8 years as a consultant in the field. Serves as a role model for less experienced consultants.
* Independence: Can work independently
**Merits:**
* Experience with municipal and/or public-sector organizations
* Domain knowledge of data in specific systems (Raindance, Lifecare, Procapita, Edlevo)
* Experience working in agile development environments
**Expected Start Date:** 2025-07-01
**Expected End Date:** 2025-12-31
**Hours per week:** 40
**Remote work:** 75%
**Option for extension:** An option to extend the contract by up to 18 months is available. Options exercised under this agreement are always subject to the conditions in effect at the time.
_Accessibility requirements under the DOS Act (accessibility to digital public service) will apply to all contract requests. See the attachment for more info._