Main duties
  • Architect and build a highly scalable, real-time data management platform
  • Manage data integrity in a complex landscape of applications and microservices
  • Design and implement data retention strategies and the associated data pipelines
  • Develop data components hands-on as part of the software engineering team, supporting the SDLC
  • Support test automation by ensuring adequate data management practices are in place
  • Support the automation of data-focused applications and their integration processes using OpenShift/Kubernetes
  • Advance a CI/CD approach at the platform level (Infrastructure as Code)
  • Document processes and solutions
  • Manage data in distributed systems based on Distributed Ledger Technologies
Your profile
  • At least 10 years relevant work experience in a similar capacity
  • University degree or equivalent, preferably in information technology
  • Polyglot engineering experience, ideally with Kotlin, Java, Python, or Scala
  • Experience with real-time and batch processing
  • Experience with the Hadoop/Spark/Hive/MapReduce/Kafka stack
  • Experience designing data-focused APIs
  • Experience with continuous integration tools and processes (Jenkins, OpenShift, Kubernetes, GitLab)
  • Independent, proactive working style and high quality standards
  • Communicative and team-oriented personality
  • Very good spoken and written English
  • Knowledge of agile methods (Lean, Kanban, Scrum, ...)
  • Experience in the financial services industry
  • Passion for working with emerging technologies
We offer you
  • Individual career path and development opportunities
  • The opportunity to be part of the world's first regulated digital exchange for investors, banks, and entrepreneurs, bridging the gap between traditional financial services and digital communities

If you have any questions, please send us an email.

We only accept online direct applications.