FreshRemote.Work

Data Engineer (Remote in California)

San Francisco, California

About Rocket Lawyer

We believe everyone deserves access to affordable and simple legal services. Founded in 2008, Rocket Lawyer is the largest and most widely used online legal service platform in the world. With offices in North America, South America, and Europe, Rocket Lawyer has helped over 30 million people create over 50 million legal documents and get their legal questions answered.

We are in a unique position to enhance and expand the Rocket Lawyer platform to a scale never seen before in the company’s history, to capture audiences worldwide. We are expanding our team to take on this challenge!

About the Role

Rocket Lawyer is seeking an experienced, passionate data engineer to join our data engineering team. In this role, you will build a highly reliable, trustworthy data lakehouse that is leveraged across the organization to derive insights and make real-time decisions. You will bring your skills to drive new data projects, build reliable data pipelines, set up strong monitoring and alerting systems, and build data solutions that support a diverse set of use cases.

We value a fun, collaborative, team-oriented work environment, where we celebrate our accomplishments. 

Responsibilities

  • Design and develop data pipelines using tools like Apache Airflow or by leveraging Snowflake's external tables functionality.
  • Translate HiveQL queries to Snowflake SQL, ensuring compatibility and efficient data processing.
  • Utilize dbt to model and transform data within Snowflake, adhering to best practices for data governance and maintainability.
  • Configure and manage data pipelines in GCP to orchestrate data movement and processing tasks.
  • Collaborate with data analysts and stakeholders to understand data requirements and ensure a successful migration outcome.
  • Monitor and optimize data pipelines for performance and scalability.
  • Develop and implement automated testing procedures to validate data quality and integrity after migration.
  • Document the migration process and provide ongoing support for the migrated data warehouse on Snowflake.
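To give a flavor of the HiveQL-to-Snowflake translation work described above, here is a minimal, hypothetical sketch (not Rocket Lawyer code) that rewrites two common dialect differences with regular expressions. Real migrations would use a proper SQL parser; the rules and query below are illustrative assumptions only.

```python
import re

# Two common HiveQL -> Snowflake SQL dialect differences (illustrative, not exhaustive):
#   - backtick-quoted identifiers become double-quoted identifiers
#   - Hive's from_unixtime() becomes Snowflake's TO_TIMESTAMP()
DIALECT_RULES = [
    (re.compile(r"`([^`]+)`"), r'"\1"'),
    (re.compile(r"\bfrom_unixtime\s*\(", re.IGNORECASE), "TO_TIMESTAMP("),
]

def hiveql_to_snowflake(query: str) -> str:
    """Apply simple regex-based rewrites; a production tool needs full parsing."""
    for pattern, replacement in DIALECT_RULES:
        query = pattern.sub(replacement, query)
    return query

print(hiveql_to_snowflake("SELECT `user_id`, from_unixtime(ts) FROM events"))
# -> SELECT "user_id", TO_TIMESTAMP(ts) FROM events
```

Regex-based rewriting like this only covers mechanical renames; constructs such as Hive's `LATERAL VIEW explode` (roughly Snowflake's `LATERAL FLATTEN`) need query-by-query attention.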

Requirements

  • 5+ years of experience as a Data Engineer with a proven track record of successful data warehouse/data lake implementation and management.
  • Deep understanding of leveraging Snowflake, or similar tools, to build highly performant and resilient data warehouse/lakehouse systems.
  • Strong expertise in HiveQL and SQL, and experience with data warehousing/lakehouse concepts (dimensional modeling, data quality, etc.).
  • Strong programming knowledge in Python.
  • Strong understanding of data architectures and patterns.
  • Experience with Apache Spark for large-scale data processing (a plus).
  • Proficiency in dbt for data modeling and transformation in Snowflake preferred.
  • Experience working with Google Cloud Platform (GCP) and its data storage services (GCS, BigQuery) a plus.
  • Excellent written and verbal communication skills with the ability to collaborate effectively with cross-functional teams.
  • Strong problem-solving skills and a passion for building efficient and scalable data solutions.
  • Snowflake certification preferred.
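As an illustration of the data quality work named in the requirements above, the sketch below shows one simple post-migration validation: comparing source and target row counts per table. It is a hypothetical example using hard-coded counts in place of live Hive and Snowflake connections; the table names and figures are made up.

```python
from dataclasses import dataclass

@dataclass
class TableCheck:
    table: str
    source_rows: int   # e.g. counted in the legacy Hive warehouse
    target_rows: int   # e.g. counted in Snowflake after migration

    @property
    def passed(self) -> bool:
        return self.source_rows == self.target_rows

def validate_migration(checks: list[TableCheck]) -> list[str]:
    """Return the names of tables whose row counts diverged."""
    return [c.table for c in checks if not c.passed]

checks = [
    TableCheck("documents", 50_000_000, 50_000_000),
    TableCheck("users", 30_000_000, 29_999_998),  # two rows lost in transit
]
print(validate_migration(checks))  # -> ['users']
```

Row-count parity is only a first gate; real validation suites typically add per-column checksums, null-rate comparisons, and spot checks of business metrics.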

Preferred Qualifications

  • Experience in DataOps implementation and support.
  • Experience in MLOps implementation and support.
  • Experience in building and supporting AI/ML platforms.

Benefits & Perks

  • Comprehensive health plans (including Medical, Dental and Vision insurance for full-time employees)
  • Unlimited PTO
  • Competitive salary packages
  • Life insurance
  • Disability benefits
  • Supplemental Optional Life Insurance Benefits
  • Optional FSA plans
  • HSA with Company Match
  • 401k program with Company Match
  • Fertility Assistance and Planning options
  • Wellhub & ClassPass fitness platforms
  • Comprehensive Pet Insurance options
  • Financial Wellbeing & Student Loan Program access
  • Access to additional Mental Health & Wellbeing resources
  • Pre-tax Commuter/Transit Benefits
  • Free Rocket Lawyer account with online access to an extensive legal documents library and brilliant licensed attorneys at discounted rates 

Rocket Lawyer is proudly committed to recruiting and retaining a diverse and inclusive workforce. As an Equal Opportunity Employer, we never discriminate based on race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, military or veteran status, status as an individual with a disability, or other applicable legally protected characteristics. We particularly welcome applications from veterans and military spouses.

All your information will be kept confidential according to EEO guidelines.

You may request reasonable accommodations by sending an email to hr@rocketlawyer.com.

Compensation
Base salary range by location:
  • San Francisco Bay Area, CA: $124,000 - $160,000
  • California (outside of San Francisco Bay Area) and Colorado: $106,000 - $139,000
  • Utah, Arizona, and North Carolina: $99,000 - $131,000


Actual compensation packages are determined by various factors unique to each candidate, including but not limited to skill set, depth of experience, certifications, specific work location, and performance during the interview process.

$100,000 - $160,000 USD

By applying for this position, you consent to your data being processed in accordance with the Rocket Lawyer Privacy Policy.
