## About the job:
– You will design, develop, and maintain our distributed data processing pipeline, which is the core of our products and platform.
– You will implement features on our web platform.
– You will measure the performance, quality, and reliability of our systems.
## Your profile:
– We expect you to embrace and uphold our culture.
– We also expect you to keep improving your working environment by researching and applying new technologies.
– You are eager to automate the boring stuff and save time for the exciting things.
– We work in a collaborative environment, so you should be comfortable with peer reviews.
– You adapt quickly to new challenges.
## Your skills:
– A Computer Science degree or equivalent work experience
– Experience designing and maintaining modular, service-oriented architectures
– Deep knowledge of at least one of the following languages: Ruby, Elixir, or a JVM language (Java, Scala, or Clojure)
– Strong technical and problem-solving skills
– Exceptional spoken and written communication skills
– Experience creating and supporting development, test, and production environments, and automating repetitive tasks, is a plus
– Experience with the Hadoop ecosystem (Hive, Spark, etc.) is a plus