Data Engineer

Remote in South Africa

Applications have closed

Plum Guide

Plum Guide handpicks the world's best vacation rentals, holiday homes, short-term lets and Airbnbs from over 25 different sites.

About Plum Guide

Plum Guide is on a mission to build the definitive collection of the world’s most remarkable homestays. We are taking a systematic approach to vetting every single home on the planet and accepting only the top 3%. Like a Michelin Guide, but for homes. We do it by putting every home in a destination through a systematic vetting process, which includes identifying candidate homes through proprietary AI, interviewing hosts and sending our Home Critics to visit and test nominated homes in person.

We launched in 2016. Since then we have grown incredibly quickly: we have expanded to 345 locations in 29 countries, tested over 600,000 homes, and developed a customer experience that’s returning the highest NPS scores in the industry.

About the Data Team

We are a small but high-performing team who work with every team across the business as well as building our own projects. We are highly respected in the business and are taking on ever-increasing responsibility for delivering new growth opportunities and finding efficiencies.


Plum Guide and the Data Team’s focus for the next 12 months is on hyperscale and hypergrowth: taking Plum Guide truly global. This is where you come in.

The Role

We believe in democratising access to data – we want everyone in the company to be able to access data analytics and insight to make our guest and host experiences as great as they can be. We want to do this by creating self-service platforms for the team to use and drive. We want to maintain high standards of data quality but balance that against what is practical to build, maintain and operate within a small and lean team. We take a similar view towards technology – our platform is continually evolving to support the needs of our guests and hosts, and we have the freedom and flexibility to change any aspect of it, but only when it makes sense.

In response to growing demands for data and our passion to improve our technology stack, we are looking to recruit a talented, skilled and passionate Data Engineer to join our team. 

Responsibilities

The main responsibility of Plum Guide’s first Data Engineer is to develop and maintain our data infrastructure to support our scalability requirements, as we use data in a real-time environment and as data empowers our teams’ day-to-day activities.

This includes:

  • Building developer, staging and production environments using cloud architecture, containerisation and automated deployment pipelines
  • Ensuring our systems scale as we grow, with monitoring and alerting for resiliency
  • Owning our data warehouse in Snowflake.
  • Managing and improving our existing ETL jobs, which are a mixture of custom flows (in Python and Spark), Airflow, Azure, Stitch and Snowflake.
  • Assisting with building data pipelines for real-time personalisation for our new search. This will involve pushing data from Segment to an external source, replaying events and building user profiles so that we can run models in real time.
  • Building new data pipelines, and improving our current ones, with big data processing for scraping jobs. This will be a collaboration with Data Science to make these scraping jobs more intelligent.
  • Working with our Engineering Team to rebuild legacy data processes to handle scale and performance.
  • Developing APIs with our Data Scientists and Engineering Team for real-time usage.
  • Re-thinking our architecture: how we should push data back into tools (Salesforce, Campaign Monitor, SendGrid etc.). Should we adopt Census, or build pipelines in Airflow, for example?
  • Improving our monitoring and alerting on our existing models, ETL and services.
  • Creating elegant, simple, tested and reusable code.

Who we are looking for

You don’t need to have prior experience in hospitality or tech start-ups. The important thing is what you bring:

  • A solid grounding in engineering principles: coding practices, tools and development environments
  • Proven experience in building resilient platforms and environments
  • Experience with modern observability, monitoring and alerting tools
  • Extensive experience with Python and Spark, and ideally an additional server-side language.
  • 2–4 years of experience as a Data Engineer, delivering more than ETL jobs.
  • Extensive experience setting up event pipelines, processing that data and replaying it, all for real-time usage.
  • Extensive experience working with cloud providers and workflow software such as Airflow.
  • An informed opinion on what our data infrastructure should look like and how we should implement best practices.
  • Deep knowledge of data warehousing, ideally with experience of Snowflake or similar.
  • Ideally, experience with scraping and data collection, and the ability to process large volumes of data efficiently.
  • Experience deploying machine learning algorithms into production and understanding which deployment approach works best for each project: API vs. Airflow vs. Docker, etc.
  • Organised - able to project manage complex processes with multiple stakeholders.
  • The ability to collaborate: working with our data analysts to help them build better tables and pipelines in dbt, thinking with our data scientists about how to deploy code or improve a process, and working with our engineering team to improve how they process and use data.

Remote Benefits:

  • Competitive salary
  • 25 days paid holiday + bank holidays
  • Your birthday off (because who wants to work on their birthday?!)
  • Work from anywhere in the world for 4 weeks of the year
  • Discount on Plum stays for Family and Friends

Plum Guide is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.


