For our client, we are looking for a **Data Engineer**.
**Job description:**
You will be involved in one of the biggest data transformation journeys. As a data engineer, you will build data products within a Data Mesh concept, based on a defined target vision and requirements.
We appreciate a multitude of technical backgrounds, and we believe you will enjoy working with our client if you are passionate about data. In this role, you will implement data-intensive solutions for a data-driven organization.
You will join the Data Engineering Competence area within AI (Artificial Intelligence) and be an individual contributor in one of the data product teams. The area supports all our brands globally in creating, structuring and guarding data, and in ensuring that data is available, understandable and of high quality.
**Requirements:**
* Experience in data query languages (SQL or similar), BigQuery, and different data formats (Parquet, Avro)
* Take end-to-end responsibility for designing, developing and maintaining the large-scale data infrastructure required for machine learning projects
* Apply a DevOps mindset and principles to manage CI/CD pipelines, Terraform and cloud infrastructure; in our context, this is GCP (Google Cloud Platform)
* Leverage the understanding of software architecture and software design patterns to write scalable, maintainable, well-designed, and future-proof code
* Work in a cross-functional agile team of highly skilled engineers, data scientists and business stakeholders to build the AI ecosystem
**Required cloud certification:** GCP
**Tech skills:**
* GCP services (BigQuery, Cloud Run, Cloud Functions, Pub/Sub, Dataflow, Cloud Composer, etc.)
* SQL
* Python
* dbt
* Terraform
* Basics of Azure
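As a rough illustration of the day-to-day work this stack implies (not part of the client's codebase), the sketch below loads a Parquet file into BigQuery and runs a SQL aggregation using the official google-cloud-bigquery Python client; all project, bucket, dataset and table names are hypothetical placeholders.

```python
# Illustrative sketch only: the kind of task implied by the stack above.
# Project, bucket and table names are hypothetical, not the client's resources.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # uses default GCP credentials

# Load Parquet files from Cloud Storage into a BigQuery table.
load_job = client.load_table_from_uri(
    "gs://example-bucket/orders/2024/*.parquet",
    "example-project.sales.orders",
    job_config=bigquery.LoadJobConfig(source_format=bigquery.SourceFormat.PARQUET),
)
load_job.result()  # wait for the load job to finish

# Run a SQL aggregation over the loaded data.
query = """
    SELECT customer_id, SUM(amount) AS total_amount
    FROM `example-project.sales.orders`
    GROUP BY customer_id
    ORDER BY total_amount DESC
    LIMIT 10
"""
for row in client.query(query).result():
    print(row.customer_id, row.total_amount)
```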
Please make sure to send your CV and motivational letter in English.
Please note! We recruit on a rolling basis, which means assignments are sometimes removed before the deadline. If you are interested, we recommend that you apply immediately.