Career

Junior Big Data Developer

Sagarmatha is looking for a junior big data developer to join our next-generation development team.

Our product is a targeting and recommendation platform with a cloud-based machine learning engine running on cutting-edge big data technologies and multiple user interfaces for both campaign management and analytics.

In this role, you will join our big data development team and help build a cloud-based big data platform powering an AI-based personalization and recommendation engine for both batch and online (e-commerce) use cases.

Requirements:

  • B.Sc. in Computer Science from a recognized university, with a grade average of 85 or above.
  • 0-2 years of experience with or knowledge of Python and Spark.
  • Curious, independent, and eager to learn.
Apply
Big Data & Cloud Technical Leader

Sagarmatha is looking for a Big Data & Cloud technical leader.

Our new product is a cloud-based scoring and targeting engine that provides recommendations and optimization in both batch and real-time modes. It incorporates machine learning models and optimization algorithms in the retail and marketing domain.

Sagarmatha provides personalization and analytics solutions for retailers around the world (North America, Israel). We receive the loyalty data of millions of shoppers daily and apply machine learning and optimization algorithms to extract insights and drive actions. Our product and professional services allow retailers and suppliers to:

  • Identify: Analyze shopper behavior, categories, and brands to find challenges and opportunities.
  • Activate: Target each shopper with the right content, at the right time and through the right touchpoint.
  • Analyze: Generate learnings from each retailer activity, shopper action, and customer response.

Job Description

  • Manage the big data team, including new recruits
  • Design and implement a cloud-based, big data scoring and targeting solution
  • Migrate current algorithms and capabilities into the new environment
  • Add new capabilities and features according to the product roadmap
  • Lay out the infrastructure to support the required data science flow, including a research and testing environment
  • Work closely with the data science and product teams

Requirements:

  • B.A./B.Sc. in Computer Science, Mathematics, Statistics, or a similar field from a recognized university – an advantage
  • Team/group management skills, with at least 3 years of experience – must
  • Software development: at least 8 years of experience – must
  • Experience with distributed big data environments (Spark) – must
  • Experience working with NoSQL databases – an advantage
  • Experience using orchestration solutions (e.g., Luigi, Azure Data Factory) – an advantage
  • Experience building cloud-based solutions, preferably on Azure – must
  • Experience working with a data science/recommendations team to deploy machine learning solutions – a big advantage
  • Experience building real-time marketing applications (e.g., ad tech, real-time recommendations) – an advantage
  • Creative, takes initiative, and a great person to work with!
Apply
Senior Big Data Developer

Sagarmatha is looking for a senior big data developer to join our next-generation development team.

Our product is a targeting and recommendation platform with a cloud-based machine learning engine running on cutting-edge big data technologies and multiple user interfaces for both campaign management and analytics.

In this role, you will join our big data development team and help build a cloud-based big data platform powering an AI-based personalization and recommendation engine for both batch and online (e-commerce) use cases.

Job Description

  • Be part of a small team of professionals building the company's next-generation big data and personalization engine
  • Build infrastructure for a cloud-based platform handling huge amounts of data and complex machine learning-based solutions
  • Develop in Python, Spark, and Scala
  • Work closely with data scientists to develop machine learning-based solutions
  • Work with modern big data stores (Delta Lake, Snowflake, Exasol, Cassandra) as well as relational databases (SQL Server, MySQL)
  • Work with serverless technologies

Requirements:

  • 3-4 years of experience in enterprise-grade back-end development (Java/Python/Scala)
  • At least 2 years of experience in big data development, preferably with Spark
  • Experience with big data projects and the Hadoop/big data ecosystem
  • Experience with data stores such as Delta Lake, Parquet files, Exasol, Vertica, and Cassandra
  • Experience with large-scale data processing
  • Experience with Azure – an advantage
  • Experience with serverless technologies – an advantage
  • Experience with Agile/Scrum methodologies
  • Excellence and strong motivation to learn and develop complex solutions
  • Can-do attitude and a make-it-happen approach – a must!
  • Passion for technology
  • Excellent written and verbal communication skills in Hebrew and English
  • Creative, takes initiative, and a great person to work with!

Apply