Build the Revenue Science Revolution with Us!

Are you a life-long learner who is always asking deep questions? Are challenges exciting to you? Want to push the boundaries of marketing and innovate new Revenue Science products?

Our Values

At Dealtale, we thrive on the collaborative relationships we build with our fellow teammates. Our unique ecosystem is a delicate balance of support, transparency, collaboration, and results.

We thrive on teamwork as we empower marketers to boost revenue performance through remarkable technologies. Our mission is to deliver business value with deep insights—without the worry of coding, data silos, legacy data structures, data science, or engineering.

Our Culture And Values

Our culture is based on nine core values to inspire teamwork, personal excellence, and fun!

Available Positions

Building solutions to multi-faceted problems is a challenge, but it’s also explosively fun! If innovation is in your DNA, please take a look at our available jobs.

Senior Backend Engineer

Herzliya, Israel (Hybrid) · Full-time

About The Position

Meet Dealtale

We love marketers. We love the way they think, create, react, problem-solve, and build. And we know the tremendous impact that marketers can have on revenue. So, we’re innovating new tools just for them.

When we started Dealtale, we focused on getting specific about the exact journey a buyer takes down the funnel, so that marketers never have to rely on their gut when they build campaigns.

That’s how we got our name: Dealtale.

We discovered that by finding the root cause of customer behavior, marketing technology could go from predictive to prescriptive – not just forecasting future results, but actually suggesting specific actions to change undesirable outcomes.

Through these innovations, we have created a new category called Revenue Science, which fuses the data industry’s two most powerful methods—causal machine learning and prescriptive analytics. When these two elements are combined, the results are explosive.

Build the Revenue Science Revolution with Us

We are looking for a talented Backend Engineer to join our team and drive the development and delivery of our new Revenue Science platform. This position is a key part of our future growth, and we are looking for someone who can hit the ground running, with a can-do attitude and strong collaboration skills.


What you’ll do: 

  • Be a part of a vertical engineering team while taking ownership from requirements to production
  • Pursue the best, state-of-the-art solutions within dynamic requirements and timelines.
  • Design, plan and build all aspects of the platform’s data pipelines and backend components. 
  • Be goal and data oriented. Always strive to make decisions based on data.
  • Be quality- and efficiency-minded – invest time, when needed, to run better and faster.
  • Focus on continuous growth and improvement, in every aspect (personal, products, processes, tools, skills, etc.).
  • Be proud of what you do.


Who you are: 


  • 5+ years of track record developing SaaS products, data-driven pipelines, and/or distributed systems (on AWS or a similar cloud provider), working in production environments with live customers.
  • Experience with at least one of the following programming languages: Python (big advantage), NodeJS, Scala, Java.
  • Experience with data warehousing/storage technologies (such as Amazon Redshift, Snowflake, PostgreSQL, Redis, MongoDB) and with data pipeline development (ETL/ELT).
  • Experience with software engineering best practices and a keen eye for automation (e.g., unit testing, code reviews, design documentation, CI/CD).
  • Experience integrating and deploying solutions using Docker and container orchestration platforms (Kubernetes is a big advantage).
  • You hold a “get-things-done” approach. 
  • Willingness and ability to collaborate and overcome challenges in an open team environment.
  • You are comfortable working in an agile, dynamic, fast-paced working environment. 
  • B.Sc. or higher degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.


Nice to have:

  • Experience designing and building ML/AI-driven production infrastructure and pipelines – both runtime and training.
  • Experience with data pipeline/workflow management tools (such as Azkaban, Luigi, or Airflow).
  • Familiarity with big/distributed data concepts and experience with related tech stacks (e.g., Hadoop, Hive, Spark, Presto, Sqoop, data lakes).
  • Experience with messaging/pub-sub platforms (Kafka, RabbitMQ, etc.).
  • Experience with web frontend technologies and frameworks (such as React).

Apply for this position