Big Data Engineer
Zürich, Switzerland
6 days ago
Source: Experteer

Job Reference: #220765BR
City: Zürich
Job Type: Full Time

Your role
Are you fascinated by advanced analytics leveraging the latest big data and artificial intelligence technologies?

Are you innovative, analytical and deeply motivated by the fascinating and ever-growing world of cyber security? We are looking for someone to:

  • develop and enhance advanced data extraction components working on a big data platform
  • manage and develop multi-terabyte cyber data lakes utilizing big data technologies
  • engineer and prepare terabytes of security-related events processed by deep learning detection models
  • create new data pipelines that automate the flow and transformation of data between systems
  • design and implement complex ETL pipelines to integrate data coming from multiple sources
  • perform multi-terabyte worldwide acquisition of forensic data to detect unknown malware missed by other security products
  • collaborate within the broader CISO team to apply big data techniques to in-house available vendor technologies

Your team
You’ll be working in the Data Engineering unit of the Advanced Analytics team and be based within the Mission Control Center in Zurich. The Advanced Analytics team, part of the Chief Information Security Office (CISO) within Group Technology, strives to detect malicious activities within the UBS IT infrastructure as well as technically complex cyber-attacks in the early phases of the attack lifecycle. It is a young and empowered team that always seeks excellence and continuous improvement.

Your expertise

  • 1-3 years of experience as a data engineer, data scientist, software engineer, quantitative analyst and/or quantitative model developer
  • ideally, part of this experience would be in the field of IT security

  • Master's degree in a quantitative, scientific or technological area
  • excellent programming skills in Python and practical experience with big data frameworks such as PySpark
  • experience designing, implementing and maintaining modern ETL pipelines with a clean code base; practical knowledge of implementing technical APIs for data extraction
  • experience with big data technologies and tools: Hadoop, HDFS, Spark, Kafka and relational SQL databases; understanding of cloud-based services and platforms: Cloudera, Databricks, DSVM
  • preferably Information Security know-how and practical experience
  • fluent in English, with the ability to describe complex concepts and technologies with simple explanations

About us
Expert advice. Wealth management. Investment banking. Asset management. Retail banking in Switzerland. And all the support functions. That's what we do. And we do it for private and institutional clients as well as corporations around the world. We are about 60,000 employees in all major financial centers, in more than 50 countries.

Do you want to be one of us?

Join us
We're a truly global, collaborative and friendly group of people. Having a diverse, inclusive and respectful workplace is important to us. And we support your career development, internal mobility and work-life balance. If this sounds interesting, apply now.
