FreshRemote.Work

Lead Data Engineer

Remote

Imagine having an enterprise-grade AppStore at work: one that lets you easily search for, request, and gain access to any app you need, precisely when you need it. No more long waits on outstanding IT requests. Lumos is solving the app and access management challenge for organizations of all sizes through a unified platform. Our fast-growing startup is untangling the complex web of app and access management by building the critical infrastructure that defines the relationships between apps, identities, and data.

Why Lumos?
  • Jump on a Rocketship: Since launching out of stealth mode just over 2 years ago, our team has grown from 20 to ~100 people and our customer base has 10x’ed with companies like GitHub, MongoDB and Major League Baseball!
  • Build with Renowned Investor Backing: Andreessen Horowitz (a16z) has backed us since the beginning, and we've raised over $65M from Scale, Neo, Greg Brockman (President at OpenAI), Phil Venables (CISO at Google), and others.
  • Thrive in a Unique Culture: You’ll join an early-stage company where you have actual influence on the trajectory of the company. We deeply care about our people and the philosophy we live by - check out our values here.

Lumos is making it a joy for companies to manage their apps and identities ✨. By integrating usage, spend, compliance, and access data, we provide a level of clarity and insight previously unimaginable. To deliver a best-in-class product, Lumos depends on state-of-the-art data pipelines that power our analytics and AI-driven solutions.

We are seeking a Lead Data Engineer to expand and enhance our existing data infrastructure, built around MySQL, Fivetran, Airbyte, and Snowflake. In this role, you will design and implement production-ready data pipelines with a strong emphasis on reliability, testing, and scalability. Your work will ensure that our AI products and in-product analytics perform flawlessly at scale, driving value for our customers.
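For illustration, here is a minimal sketch of the kind of post-load reliability check such pipelines typically include. It is not Lumos code: the table and column names are hypothetical, and the connection is assumed to be any DB-API-style connection (for example, one opened with the Snowflake Python connector).

  # Illustrative sketch only: hypothetical table/column names; "conn" is assumed
  # to expose a standard DB-API cursor (e.g. from the Snowflake Python connector).
  def check_table_health(conn, table: str, key_column: str, min_rows: int = 1) -> None:
      """Fail the pipeline run if a freshly loaded table looks wrong."""
      cur = conn.cursor()
      cur.execute(f"SELECT COUNT(*), COUNT({key_column}) FROM {table}")
      total_rows, non_null_keys = cur.fetchone()
      if total_rows < min_rows:
          raise ValueError(f"{table}: expected at least {min_rows} rows, found {total_rows}")
      if non_null_keys != total_rows:
          raise ValueError(f"{table}: {total_rows - non_null_keys} rows are missing {key_column}")

  # Usage (hypothetical): check_table_health(conn, "ANALYTICS.APP_USAGE_EVENTS", "EVENT_ID")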

✨ Your Responsibilities

  • Your mission is to architect, build, and maintain cutting-edge data pipelines that empower our AI products, in-product analytics, and internal reporting.
  • You will ensure the scalability, reliability, and quality of our analytics data infrastructure, enabling the seamless integration of usage, spend, compliance, and access data to drive business insights and deliver exceptional value to our customers.
  • By focusing on testing, automation, and best-in-class engineering practices, you will play a pivotal role in transforming complex data into actionable intelligence, fueling Lumos' growth and innovation.

🙌 What We Value

  • Extensive experience designing and implementing medallion architectures (bronze, silver, gold layers) or similar data warehouse paradigms. Skilled in optimizing data pipelines for both batch and real-time processing. (A minimal sketch of the layering idea follows this list.)
  • Proficiency in deploying data pipelines using CI/CD tools and integrating automated data quality checks, version control, and deployment automation to ensure reliable and repeatable data processes.
  • Expertise in advanced SQL, ETL processes, and data transformation techniques. Strong programming skills in Python.
  • Demonstrated ability to work closely with AI engineers, data scientists, product engineers, product managers, and other stakeholders to ensure that data pipelines meet the needs of all teams.
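
To make the medallion wording above concrete, the sketch below walks raw (bronze) records through a cleaned (silver) layer into an aggregated (gold) layer. It uses in-memory pandas DataFrames and made-up column names purely for illustration; in a warehouse such as Snowflake the same layering would typically be expressed as SQL or dbt models rather than Python.

  import pandas as pd

  # Bronze: raw events exactly as ingested (e.g. landed by Fivetran or Airbyte).
  bronze = pd.DataFrame({
      "app": ["GitHub", "GitHub", "Slack", None],
      "user_id": [1, 1, 2, 3],
      "event": ["login", "login", "login", "login"],
  })

  # Silver: cleaned and conformed -- drop malformed rows and exact duplicates.
  silver = bronze.dropna(subset=["app"]).drop_duplicates()

  # Gold: business-level aggregates ready for analytics and reporting.
  gold = silver.groupby("app", as_index=False).agg(active_users=("user_id", "nunique"))
  print(gold)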

💰 Pay Range

  • $190,000 - $245,000. Note that this range is a good faith estimate of likely pay for this role; upon hire, the pay may differ due to skill and/or level of experience.

 

💸 Benefits and Perks:

  • 💯 Remote work culture (+/-4 hours Pacific Time)
  • ⛑ Medical, vision, and dental coverage paid by Lumos
  • 🛩 Company and team bonding trips throughout the year fully covered by Lumos
  • 💻 Optimal WFH setup to set you up for success
  • 🌴 Unlimited PTO, with minimum time off to make sure you are rested and able to be at your best
  • 👶🏽 Up to 4 months off for both the birthing and non-birthing parent
  • 💰 Wellness stipend to keep you awesome and healthy
  • 🏦 401k matching plan 

Job Profile

Restrictions

Must work within +/-4 hours of Pacific Time

Benefits/Perks

Company bonding trips, Medical/vision/dental coverage, Optimal WFH setup, Remote work, Remote work culture, Unique culture, Unlimited PTO

Tasks
  • Architect data pipelines
  • Ensure data quality and reliability
  • Maintain data infrastructure
Skills

AI, Airbyte, Analytics, Automation, CI/CD, Compliance, Data engineering, Data pipelines, Data quality checks, Data transformation, Engineering, ETL, Fivetran, Infrastructure, IT, MySQL, Python, Scalability, Snowflake, SQL

Experience

5 years

Timezones

UTC-8