We are currently looking for a “Data Operations Engineer” to join our team.
Main responsibilities:
- Support users of the Data Sharing platform by troubleshooting issues and providing technical assistance.
- Work closely with the data science and data engineering teams to ensure data is properly integrated and flows smoothly through the platform.
- Manage and maintain databases and data pipelines, ensuring they are optimized for performance and scalability.
- Monitor data quality and integrity, and implement processes to ensure data is accurate and complete.
- Support the preparation of data prior to sharing.
- Collaborate with cross-functional teams to identify and implement new features and enhancements to the platform.
- Stay up-to-date with industry developments and trends in data management and apply best practices to the platform.
Experience, Competencies and Skills Required:
- 3+ years of experience in operations or related fields.
- Bachelor’s degree in Computer Science, Information Technology, or a related field (preferred but not mandatory).
- Solid understanding of Linux OS.
- Proficiency in SQL and Python.
- Familiarity with data visualization tools and related technologies.
- Hands-on experience with Docker (docker-machine, docker-compose).
- Good understanding of Kubernetes / OpenShift.
- Experience with monitoring tools such as Grafana, Prometheus, Zabbix.
- Proficiency in at least one scripting language (Bash or Python).
- Basic knowledge of RDBMS and Data Engineering fundamentals.
- Strong analytical and problem-solving skills with attention to detail.
- Excellent communication and collaboration skills.
To apply, please send your CV to the e-mail address shown under the “Apply for job” button, specifying the job title in the “Subject” field. Only candidates who meet the requirements of the vacancy will be contacted for the next stage of the recruitment process.