
Data Engineer (Remote in Colorado)

Colorado

About Rocket Lawyer

We believe everyone deserves access to affordable and simple legal services. Founded in 2008, Rocket Lawyer is the largest and most widely used online legal service platform in the world. With offices in North America, South America, and Europe, Rocket Lawyer has helped over 30 million people create over 50 million legal documents and get their legal questions answered. We are in a unique position to enhance and expand the Rocket Lawyer platform to a scale never before seen in the company’s history and to capture audiences worldwide. We are expanding our team to take on this challenge!

About the Role

Rocket Lawyer is seeking an experienced, passionate Data Engineer to join our data engineering team. In this role, you will build a highly reliable, trustworthy data lakehouse that is leveraged across the organization to derive insights and make real-time decisions. You will bring your skills to drive new data projects, build reliable data pipelines, set up strong monitoring and alerting systems, and deliver data solutions that support a diverse set of use cases.

We value a fun, collaborative, team-oriented work environment, where we celebrate our accomplishments. 

Responsibilities

  • Design and develop data pipelines using tools like Apache Airflow or by leveraging Snowflake's external tables functionality (see the illustrative sketch after this list).
  • Translate HiveQL queries to Snowflake SQL, ensuring compatibility and efficient data processing.
  • Utilize dbt to model and transform data within Snowflake, adhering to best practices for data governance and maintainability.
  • Configure and manage data pipelines in GCP to orchestrate data movement and processing tasks.
  • Collaborate with data analysts and stakeholders to understand data requirements and ensure a successful migration outcome.
  • Monitor and optimize data pipelines for performance and scalability.
  • Develop and implement automated testing procedures to validate data quality and integrity after migration.
  • Document the migration process and provide ongoing support for the migrated data warehouse on Snowflake.
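
To make the pipeline-related bullets above concrete, here is a minimal, hypothetical sketch (assuming Airflow 2.4+ and the snowsql CLI) of the kind of workflow the role describes: an Airflow DAG that stages raw data into Snowflake and then runs dbt transformations. The DAG id, connection name, stage, table, and dbt project path are illustrative assumptions, not details from this posting.

    # Minimal sketch, not an actual Rocket Lawyer pipeline: an Airflow DAG that
    # loads raw files into Snowflake and then runs dbt models over them.
    # All names (connection, stage, tables, paths) are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="snowflake_lakehouse_refresh",  # hypothetical DAG name
        schedule="@daily",
        start_date=datetime(2024, 1, 1),
        catchup=False,
    ) as dag:
        # Stage raw data into Snowflake; in practice this could be a COPY INTO
        # statement or a query over an external table on cloud storage.
        load_raw = BashOperator(
            task_id="load_raw",
            bash_command=(
                "snowsql -c my_conn "                        # hypothetical connection
                "-q 'COPY INTO raw.events FROM @raw_stage'"  # hypothetical stage/table
            ),
        )

        # Transform the staged data with dbt models kept in a separate project.
        run_dbt = BashOperator(
            task_id="run_dbt_models",
            bash_command="cd /opt/dbt/analytics && dbt run --target prod",  # hypothetical path
        )

        load_raw >> run_dbt  # run transformations only after the raw load succeeds

A production version of this pipeline would also carry the retries, monitoring and alerting, and automated data-quality checks called out in the other bullets.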

Requirements

  • 5+ years of experience as a Data Engineer with a proven track record of successful data warehouse/data lake implementation and management.
  • Deep understanding of leveraging Snowflake, or similar tools, to build highly performant and resilient data warehouse/lakehouse systems.
  • Strong expertise in HiveQL and SQL, and experience with data warehousing/lakehouse concepts (dimensional modeling, data quality, etc.).
  • Strong programming knowledge in Python.
  • Strong understanding of data architectures and patterns.
  • Experience with Apache Spark …