Big Data / Hadoop Administrator
Credit Suisse
Zürich, Region Zurich, Switzerland
Posted 7 days ago

We Offer

  • You the opportunity to become a member of a growing, high-visibility cross-Bank team that is developing and deploying solutions to some of Credit Suisse’s most challenging analytical problems.
  • As a member of this team, you will support big data clients with data spanning Credit Suisse’s global organization, solving emerging mission-critical challenges via technologies such as:

  • Distributed file systems and storage technologies (HDFS, HBase, Hive, Kudu)
  • Large-scale distributed data analytics platforms and compute environments (Spark, MapReduce)
  • Streaming technologies such as Kafka, Flume
  • The responsibility to manage, troubleshoot and review Hadoop log files as well as to monitor Hadoop cluster connectivity and security
  • The possibility to work with data delivery teams to set up new Hadoop clusters
  • The opportunity to collaborate with application teams to install Hadoop updates, patches and version upgrades when required, and to align with the engineering team to propose and deploy new hardware and software environments for Hadoop and to expand existing ones
  • Responsibility for enterprise-grade Linux servers, covering configuration, installation and deployment
  • A position with the ability to shape and drive architectural decisions to achieve high value results
  • Supporting users through technical issues
  • This role may require occasional out-of-hours work

You Offer

  • 7+ years’ experience in enterprise IT at a global organization, including 3+ recent years in Big Data ecosystem software development
  • Expert level knowledge of Cloudera Hadoop components such as HDFS, Sentry, HBase, Impala, Hue, Spark, Hive, Kafka, YARN, ZooKeeper and Postgres
  • Cluster maintenance as well as creation and removal of nodes using tools like Cloudera Manager
  • Monitoring Hadoop cluster connectivity and security; managing, troubleshooting and reviewing Hadoop log files
  • Eyes-on-glass cluster monitoring and responding to internal tickets and email alerts. File system management and monitoring
  • The ability to function within a multidisciplinary, global team. Be a self-starter with a strong curiosity for extracting knowledge from data and the ability to elicit technical requirements from a non-technical audience
  • Languages: Bash (essential), Python (essential). Understanding of security principles, e.g. SSL, Kerberos
  • Strong communication skills and the ability to present deep technical findings to a business audience.
  • Fluent written and spoken English
  • Ms T. Goop would be delighted to receive your application.

    Please apply via our career portal.
