
Data Engineer | Lonely Planet

Remote

At a Glance:

This role is not open to visa sponsorship or transfer of visa sponsorship, including for those on OPT and STEM OPT extensions, nor is it available on a corp-to-corp basis.

About Us: For over 50 years, our travel brand has been the go-to resource for explorers seeking expert guidance through our trusted guidebooks. As we evolve our digital offerings to meet the modern traveler’s needs, we are looking to expand our digital presence and enhance the marketing and sales experiences of our products. We aim to create compelling digital campaigns that resonate with travelers and drive both customer engagement and business growth.

As a Data Engineer at Lonely Planet, you'll build data products that power machine learning and analytics for one of the world's most iconic travel brands. You'll work in a high-autonomy environment using AWS and Spark/Scala via Databricks.

 

We value diverse backgrounds and encourage you to apply even if you don't meet all qualifications!

 

What You'll Do:

- Build and maintain ETL pipelines using Spark and Scala/Python

- Collaborate with cross-functional teams to create end-to-end data solutions

- Contribute to evolving our data engineering stack, including evaluating data catalog products and automating ETL processes

- Participate in code reviews and mentor junior team members

- Iterate quickly, with a "Speed Trumps Perfection" mindset

- Design and implement data models to support travel-related analytics and recommendations

- Optimize data pipelines for performance and cost-efficiency

- Ensure data quality and integrity throughout our systems

 

What We're Looking For:

- 3+ years of experience in data engineering/ETL work

- Strong skills in Spark using Scala or Python

- Experience with AWS services such as S3, EC2, EMR, Glue, and Lambda

- Proficiency in SQL and experience with data warehousing solutions (e.g., Redshift, Snowflake)

- Knowledge of Git and CI/CD processes

- Ability to work independently and take ownership of projects

- Enthusiasm for learning new technologies and challenging the status quo

- Experience with data modeling and schema design

- Familiarity with NoSQL databases (e.g., DynamoDB, MongoDB)

- Understanding of data security and compliance requirements

 

Nice to Have:

- Experience with Airflow or other workflow management tools

- Familiarity with streaming data processing (e.g., Kafka, …
