Are you passionate about technology and do you like taking on complex technical challenges? Do you want to work on exciting projects at renowned clients?
At ALTEN we believe that knowing and meeting the expectations of our consultants is a key factor in our success. Our consultants make the difference.
Do you want to make the difference too?
Improve the tooling for Monitoring, Reconciliation and QA for the data pipelines
Define the solution for the data catalogues and data management of the platform
Provide the Data Vault functionality in a public cloud setup such as Amazon Web Services (AWS)
Define the processes for the CI/CD of the data catalogues and data management in the platform
PhD, Master's, or Bachelor's degree in Computer Science, Mathematics, Physics, Engineering, Statistics, or another technical field
Firm understanding of major programming/scripting languages such as Java and Python
Know-how in Kafka and Data Pipelines is a must
Experience in ETL and Data Warehouse is an asset
Data Security Management is a plus
Practical experience with distributed systems, Big Data technologies, streaming technologies and SaaS-based architectures (e.g. Hadoop, Spark, Presto, Kafka, data lakes)
Practical experience with container platforms (OpenShift) and/or containerization software (Kubernetes, Docker)
Hands-on experience architecting data pipelines, including data collection, data storage and processing, and elastic data analysis at scale
Knowledge of data modelling and query optimization on different storage solutions such as RDBMS, document stores, graph databases, time-series databases and data warehouses
Familiarity with defining Data Governance concepts (incl. data lineage, data dictionary)
Knowledge of Gradle-based tooling for building polyglot CI/CD pipelines, DevOps, automation, agile methods, automated testing and code quality
We are looking forward to getting to know you and your ambitions!