Senior Software Engineer (Data)

Remote - London, England, United Kingdom

Full Time

About us

We are Zego - a commercial motor insurance provider that powers opportunities for businesses, from entire fleets of vehicles to self-employed drivers and riders. We combine best-in-class technology with sophisticated data sources to offer insurance products that save businesses time and money.

Since our inception, we have believed that the problem with traditional insurance is that it holds businesses back. It’s too expensive and time consuming, and it no longer suits businesses who use vehicles to earn money. Our products represent a solution to this problem for businesses based across the UK, Europe and beyond.

So far, we have raised over $200 million in funding and we were the first UK insurtech to be valued at over $1 billion. We were also the first to be awarded our own insurance licence, and we recently won Tech Company of the Year 2020.

At Zego, we are proud to say we have a diverse and inclusive team, unified by our shared values and mission. Our people are the most important part of our story and everybody at Zego, no matter their role, has an integral part to play.

Role overview

We are looking for a Software Engineer with data processing experience to help us build a data platform, with a focus on stream processing to enable a tiered data architecture.

At Zego, the Data Engineering team is integral to our data platform, working closely with Software Engineers, Data Scientists and Data Analysts, along with other areas of the business. We use a variety of internal and external tooling to maintain our data repositories. We are looking for people who have worked with streaming architectures and built interactive APIs over data streams, have a background in Scala, apply sound software engineering and data infrastructure principles, have spent time working with complex, fast-growing datasets, and communicate well.

Our stack currently includes, but is not limited to, Airflow, dbt (Data Build Tool), a multitude of AWS services, Stitch and Fivetran. As the team changes direction, you will, as a Senior Software Engineer and new addition to the team, have the opportunity to rebuild the data platform with a view to treating data as a first-class entity. You will also help promote emerging technologies where they can add value to the business, and champion better ways of working.

It is an exciting time to join, and you'll partner with world-class engineers, analysts and product managers to help make Zego the best-loved insurtech in the world.

About the role:

Over the next 12 months you will:

  • Support our data scientists in the development and implementation of our ML models and experiments.
  • Help evolve the architecture of our data ecosystem to support our long term vision by continuously iterating on our data infrastructure.
  • Work with the data team and the rest of engineering to develop and implement a clear data strategy.
  • Collaborate with product managers and across teams to bring new products and features to the market.
  • Own data as a product, building a data platform focusing on data structure, quality, usage and efficiency.
  • Build tailored data replication pipelines as our backend application is broken up into microservices.
  • Assist in developing and maintaining our ETL and ELT pipelines.

About you:

We are looking for somebody with strong working knowledge of building data pipelines and the underlying infrastructure. You should have participated in the design of tiered data architectures and data lakes, following best practices during implementation. You will have worked closely with Data Analysts, Data Scientists and Software Engineers.

Practical knowledge of the following:

  • Scala
  • Akka (Streams, clustering)
  • A wide variety of AWS services
  • Infrastructure-as-code tools (e.g. Terraform)
  • Kubernetes (EKS)
  • Docker
Otherwise, an interest in learning these technologies, with the support of the team, is essential. We're looking for people committed to building, nurturing and iterating on an ever-evolving data ecosystem.

Other beneficial skills include:

  • Python
  • Implementation of, or contribution to, a Data Lake or Data Mesh
  • Data Warehousing (Redshift / Snowflake)
  • SQL (We use DBT for modelling data in the warehouse)
Job region(s): Europe