Data Engineer
Logitech
Lausanne, Switzerland

Position at Logitech

Logitech is transforming into a connected company, where devices and cloud services work hand in hand to create new user experiences.

You will be a Data Engineer within the CTO Office, a transversal organisation developing a common data platform that enables big data and Internet of Things (IoT) analytics, as well as other advanced technologies such as machine learning, for Logitech’s Business Groups.

We will be leveraging public clouds such as AWS, Azure, and GCP, as well as tools such as Apache Spark, Snowflake, D3.js, and Tableau.

We will be developing the worldwide infrastructure and operational best practices serving several million customers and devices.

Due to the nature of the CTO Office team, this is a challenging role that requires the ability to anticipate business needs and focus on business success, as well as strong technical skills, a willingness to experiment with technology, and the ability to deliver on multiple projects under pressure.

Responsibilities

Be Yourself. Be Open. Stay Hungry and Humble. Collaborate. Challenge. Decide and just Do. These are the behaviors you’ll need for success at Logitech.

In this role you will:

Develop and maintain ETL flows for loading data into the warehouse from the systems collecting data from devices (a minimal sketch of such a flow follows this list)

Work with engineering teams and business users to define data schemas for device event data stored in a common data warehouse

Define and manage views in the warehouse to meet the requirements of data scientists using platforms like Spark and business users using visualization tools like Tableau

Work with data scientists to productize analytics and data models, developing new ETL flows for new applications driven by model-based analytics

As business needs grow, develop and maintain data stream processing workflows for device event data, supporting the needs of business users for up-to-date information and customer-facing services
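To give a sense of the kind of work involved, below is a minimal PySpark batch-ETL sketch of the flow mentioned in the first responsibility: reading raw device event data and loading it into a warehouse-friendly, partitioned store. The schema fields, S3 paths, and partitioning choices are hypothetical placeholders for illustration only, not a description of Logitech’s actual platform.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("device-event-etl").getOrCreate()

# Hypothetical schema for raw device event records.
event_schema = StructType([
    StructField("device_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
    StructField("payload", StringType()),
])

# Read raw events collected from devices (placeholder bucket and prefix).
raw = (
    spark.read
    .schema(event_schema)
    .json("s3://example-bucket/raw/device-events/")
)

# Light cleaning and derivation of a partition column before loading.
cleaned = (
    raw.dropDuplicates(["device_id", "event_time"])
       .withColumn("event_date", F.to_date("event_time"))
)

# Write to a date-partitioned, warehouse-friendly format (placeholder target).
(
    cleaned.write
    .mode("append")
    .partitionBy("event_date")
    .parquet("s3://example-bucket/warehouse/device_events/")
)

spark.stop()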

Qualifications

2 years of relevant work experience building pipelines for conventional, unstructured, streaming, or big data sets using tools such as Spark, Flink, or Hadoop.

Programming proficiency in at least one major language (Python, Scala or Java)

Experience building fault-tolerant distributed systems

Strong problem-solving skills

Desired Skills

A conceptual understanding of data science and machine learning applications like recommender systems, classification, predictive modeling and clustering.

Familiarity with consumer-oriented analytic techniques like segmentation and user profiling.

Understanding of parallelized data ingestion techniques (not essential but useful)

Pragmatic attitude and ability to rapidly iterate and evolve ideas into products

Experience assessing and anticipating business needs and focusing on delivering business-relevant results

Desire to collaborate in a team of researchers, data scientists and software developers.

Education

MSc degree in Computer Science, Data Science, Machine Learning, or a related technical field, or equivalent practical experience.

All qualified applicants will receive consideration for employment without regard to race, sex, color, religion, sexual orientation, gender identity, national origin, Veteran status, or on the basis of disability.
