€80,000 - €90,000 per annum + benefits
Approx. 1 year
For one of our financial services clients in central Munich, we are currently looking for an experienced Hadoop Engineer (m/w/x):
You will work closely with architecture teams to gather requirements and drive improvements in system performance. Further tasks include:
- Administer a large-scale Hadoop infrastructure.
- Manage the full lifecycle of the Hadoop cluster: provide architectural guidance, analyse cluster capacity, and construct roadmaps for cluster deployment.
- Create and implement enterprise-level security.
- Develop Spark and Kafka applications in Scala.
- Implement machine learning algorithms within Spark applications.
- Provide DevOps support for business applications and use cases.
Your profile:
- Completed degree in Computer Science, Mathematics, Business Informatics or another relevant field.
- Very good knowledge of distributed computing and distributed system performance.
- Deep expertise in implementing large-scale Hadoop clusters.
- 2 years of relevant industry experience with Spark, Spark Streaming, Kafka, ZooKeeper, MapReduce, Flume and Hive, as well as Oracle and MySQL.
- Programming and scripting skills in Scala, Java, Ruby, Python or R.
- Proficiency with continuous integration tools.
- Solid Linux skills.
- Fluent English; German would be a bonus.
For more information, please call Tom Fernandez on +49 89 2109 3906 or send your CV to email@example.com for consideration.