Data Streaming & Analytics Developer

Job description:

For one of our clients, we are looking for a **Data Streaming & Analytics Developer.**
**Description:**
At the Department of Autonomous Systems, we are developing transport solutions for the future. Autonomous transport environments, such as autonomous mining sites or autonomous transport hubs, are examples of cyber-physical systems that, amongst other things, produce large amounts of data. We are looking for you, a data engineer who is enthusiastic about making this big data useful via appropriate data streaming, modeling, visualization and analysis. You will work closely with a team of inspiring and engaged colleagues. Our main development task is to support operations in autonomous transport systems via effective utilization of data.
**Your Responsibilities**
* Helping the team to design and develop appropriate data pipelines, from different data sources to the end users.
* Creating and consuming Kafka topics in the cloud as well as on-premise (see the sketch after this list).
* Defining scenarios within autonomous transport systems to apply data-driven methods.
* Developing software and services.
* Designing algorithms for data analysis within the autonomous transport systems domain.
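As a rough illustration of the Kafka-related responsibility above (not part of the client's specification), here is a minimal sketch of producing to and consuming from a topic, assuming the confluent-kafka Python client; the broker address and topic name are hypothetical placeholders:

```python
# Minimal sketch: produce to and consume from a Kafka topic.
# Assumes the confluent-kafka client; broker and topic names are placeholders.
from confluent_kafka import Producer, Consumer

BROKER = "localhost:9092"    # placeholder; a cloud or on-premise bootstrap server in practice
TOPIC = "vehicle-telemetry"  # hypothetical topic name

# Produce a single event.
producer = Producer({"bootstrap.servers": BROKER})
producer.produce(TOPIC, key="vehicle-42", value='{"speed_kmh": 17.5}')
producer.flush()

# Consume events from the same topic.
consumer = Consumer({
    "bootstrap.servers": BROKER,
    "group.id": "analytics-sketch",
    "auto.offset.reset": "earliest",
})
consumer.subscribe([TOPIC])
try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        print(f"{msg.key()}: {msg.value().decode('utf-8')}")
finally:
    consumer.close()
```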
**Qualifications and experience. To be successful in this role, we see that you have:**
* Master’s or Bachelor’s degree in Data Analytics, Computer Science, Information Technology or another related field.
* Solid experience in Apache Kafka (Stream Processing Platform) and Event-Driven Systems
* Experience designing, developing, and maintaining scalable data pipelines and solutions using Databricks, integrated with AWS, to process and analyze large datasets for business insights (a minimal sketch follows this list).
* Demonstrated experience in Alarming, Monitoring and Logging for Event-driven Systems.
* Solid experience in AWS (Amazon Web Services) Cloud Computing Platform. Experience in other cloud computing platforms (e.g. Microsoft Azure) is a plus.
* Knowledge and experience in Terraform (an Infrastructure-as-Code tool for the cloud).
* Knowledge and experience in CI/CD and GitLab
* Solid experience in Java Programming Language.
* Solid experience in Python (with a focus on Machine Learning and Data Analytics).
* Knowledge and experience in Machine Learning and Data Analytics tools such as Databricks, Apache Spark and TensorFlow.
* Good theoretical background in Machine Learning, Big Data Analytics and Statistics.
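Purely as an illustration of the kind of Databricks/Spark pipeline work referred to above (not part of the client's specification), a minimal PySpark sketch that aggregates a hypothetical telemetry dataset; the storage paths and column names are assumptions:

```python
# Minimal PySpark sketch: read raw telemetry, aggregate, and write results.
# Paths, column names, and formats are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("telemetry-aggregation").getOrCreate()

# Read raw events (e.g. landed from Kafka into cloud storage).
raw = spark.read.json("s3://example-bucket/raw/telemetry/")  # placeholder path

# Average speed per vehicle per hour.
hourly = (
    raw
    .withColumn("hour", F.date_trunc("hour", F.col("event_time")))
    .groupBy("vehicle_id", "hour")
    .agg(F.avg("speed_kmh").alias("avg_speed_kmh"))
)

# Write the aggregate for downstream analysis and visualization.
hourly.write.mode("overwrite").format("delta").save("s3://example-bucket/curated/hourly_speed/")  # placeholder
```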
**Meriting:**
* Knowledge in other AI areas, such as AI Planning, Generative AI, Knowledge Representation or Autonomous Robotics, is a plus.
**Start:** 2024-12-02
**End:** 2025-12-31
**Workload:** 100%
**Location:** Södertälje. Minimum 50% onsite.
**Language:** Swedish and English
**To apply for the assignment:**
* Send in your up-to-date CV in Word format and a competence matrix.
* Availability.
* A targeted justification in which you describe why you are suitable for this assignment; refer to previous consulting assignments, employment, training and personal characteristics.
* We do not accept any applications through email; all applications must be submitted through the portal to be valid.
**Important info:**
Please disregard the information under 'Payment Option' as the customer has recently updated their payment terms from 60+5 to 90+5 days.
Note that all contracts have the Pay Express service activated by default. This service comes with a 3.25% fee and ensures payment within 3 business days. If you’re not interested, please let us know by adding that as a comment when you apply for the assignment.
_**We make offers continuously.**_
_That means that we sometimes remove assignments before the deadline. If you are interested, we recommend that you apply immediately._

Be a part of our community

Join us on Telegram or Discord to get instant notifications about the newest freelance projects and talk to some of the smartest software engineers in the world.