EPAM is committed to providing our global team of 36,700+ EPAMers with inspiring careers from day one. EPAMers lead with passion and honesty and think creatively.
Our people are the source of our success and we value collaboration, try to always understand our customers’ business, and strive for the highest standards of excellence.
In today’s new market conditions, we continue to support operations for hundreds of clients around the world remotely, with the vast majority of our teams working from home.
No matter where you are located, you’ll join a dedicated, diverse community that will help you discover your fullest potential.
DESCRIPTION
Do you want to work on the cutting edge? Do you want to expand your knowledge and abilities? Are you passionate about data?
Have you worked in similar positions before? We are currently looking for an experienced Big Data Engineer to make our team in Manno even stronger.
We offer you a highly motivated and experienced team with in-depth knowledge of Advanced Analytics and of building models and algorithms.
You’ll operate and maintain the whole platform end to end and work closely with the development teams. This role includes significant responsibilities and development possibilities.
Responsibilities
- Engineer and integrate the platform from a technology point of view
- Engineer core Big Data platform capabilities

Requirements
- A passionate Big Data Engineer looking for new challenges
- Proactive, looking for creative and innovative solutions
- A flexible, open-minded, and cooperative person
- Interested in working in a fast-paced international environment as part of an international team
- Bachelor's degree in computer science, computer engineering, management information systems, a related discipline, or equivalent experience
- At least 3 years of experience in designing and operating Kafka clusters (Confluent and Apache Kafka) on-premises
- At least 5 years of experience in designing, sizing, implementing, and maintaining Hortonworks-based Hadoop clusters
- At least 3 years of experience in securing and protecting Hadoop clusters (Ranger, Atlas, Kerberos, Knox, SSL)
- At least 5 years of experience in designing Big Data architectures
- At least 5 years of demonstrated experience in gathering and understanding customer business requirements to introduce Big Data technologies
- At least 5 years of experience in configuring tools from the Hadoop ecosystem, such as Hadoop, Hive, Spark, Kafka, Solr, and NiFi
- Experience with IBM Watson Studio Local integration
- Experience with IBM DB2 is a plus
- Experience with IBM Power Systems is a plus
- Knowledge of security (e.g. 2FA, Ranger, etc.)
- Experience implementing complex security requirements in the financial industry
- Good abstraction and conceptual skills combined with a self-reliant, team-minded, communicative, proactive personality

We offer
- Competitive compensation depending on skills and experience
- Knowledge-sharing across EPAM's global Tech Communities
- Unlimited access to LinkedIn Learning solutions
- Relocation support as per EPAM relocation policies
- EPAM Community with regular corporate and social events
- Career growth, performance and compensation reviews

Additional
- Please note that any offers will be subject to appropriate background checks
- We do not accept CVs from recruiting or staffing agencies
- Due to Swiss labour legislation, we can only accept applicants who have a valid right to work in Switzerland