As a Data Engineer at STACC, you will ensure that day-to-day data flows run smoothly and collaborate with data scientists and application development teams to understand their needs, providing support when and where needed.
- Build and maintain RESTful APIs around our ML models.
- Build data pipelines and microservices (Docker).
- Build and scale services in a cloud environment.
- Orchestrate ML workflows (Apache Airflow) and automate service deployment.
- Evaluate and optimize the performance of infrastructure and applications.
- Create and manage databases.
You need to have hands-on experience:
- In designing, deploying and monitoring production environments.
- In building and optimizing data pipelines.
- In automating and integrating new solutions with existing infrastructure (e.g. Ansible).
- With cloud platforms (preferably AWS).
- With open-source web servers (e.g. Nginx or Apache) and databases (e.g. PostgreSQL, MongoDB, MySQL).
- With Linux environments.
- With one or more job scheduler systems, such as Airflow or Luigi.
- With one or more mainstream programming languages, such as Python, C, C++, Java, Go, etc.
Nice to haves
- Like to play board games. A lot. Oh, and Monopoly doesn’t count!
- Use rubber ducks when debugging and like pair programming.
- Be an outgoing team player and see the big picture!
- Think outside the box and dare to take initiative.
- Be open minded, reliable and witty.
- Be an AWSome Python Ninja!
- Have a business oriented, analytical, and problem-solving mindset.
- Have a Master's degree and a minimum of 2 years of related experience.
What we offer
- Flexible working hours.
- A work environment that fosters self-development and a chance to participate in training courses within your area of work.
- Frequent all-inclusive team events and sports compensation.
- Motivating performance-related pay.