
Data Engineer

Brazil (Remote)

About Rocket Lawyer

We believe everyone deserves access to affordable and simple legal services. Founded in 2008, Rocket Lawyer is the largest and most widely used online legal service platform in the world. With offices in North America, South America, and Europe, Rocket Lawyer has helped over 30 million people create more than 50 million legal documents and get their legal questions answered.

We are in a unique position to enhance and expand the Rocket Lawyer platform to a scale never before seen in the company's history and to capture audiences worldwide. We are expanding our team to take on this challenge!

About the Role

Rocket Lawyer is seeking a passionate data engineer for the Rocket Data Platform. In this role, you will build a highly reliable, trustworthy data lakehouse that teams across the organization rely on to derive insights and make real-time decisions. You will drive new data projects, build reliable data pipelines, set up strong monitoring and alerting, and deliver data solutions that support a diverse set of use cases.

We value a fun, collaborative, team-oriented work environment, where we celebrate our accomplishments.

Responsibilities

  • Design and develop data pipelines using tools like Apache Airflow or by leveraging Snowflake's external tables functionality (a minimal sketch follows this list).

  • Translate HiveQL queries to Snowflake SQL, ensuring compatibility and efficient data processing.

  • Utilize dbt to model and transform data within Snowflake, adhering to best practices for data governance and maintainability.

  • Configure and manage data pipelines in GCP to orchestrate data movement and processing tasks.

  • Collaborate with data analysts and stakeholders to understand data requirements and ensure a successful migration outcome.

  • Monitor and optimize data pipelines for performance and scalability.

  • Develop and implement automated testing procedures to validate data quality and integrity after migration.

  • Document the migration process and provide ongoing support for the migrated data warehouse on Snowflake.
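To make the pipeline-related responsibilities above concrete, here is a minimal sketch of an Airflow DAG that loads one day of data into Snowflake, with a HiveQL-style partition filter rewritten as an ordinary Snowflake date predicate. It assumes Airflow 2.4+ and the Snowflake Python connector; the raw_events and curated_events tables, the connection parameters, and the load_daily_events task are hypothetical illustrations, not Rocket Lawyer's actual pipeline.

from datetime import datetime

import snowflake.connector
from airflow import DAG
from airflow.operators.python import PythonOperator


def load_daily_events(ds, **_):
    """Copy one logical day of raw events into a curated Snowflake table."""
    # Placeholder credentials; a real deployment would pull these from an
    # Airflow connection or a secrets backend rather than inlining them.
    conn = snowflake.connector.connect(
        account="my_account",
        user="etl_user",
        password="...",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="CURATED",
    )
    try:
        # A HiveQL partition filter such as `WHERE ds = '2024-01-01'`
        # becomes a plain date predicate in Snowflake SQL.
        conn.cursor().execute(
            """
            INSERT INTO curated_events
            SELECT event_id, user_id, event_ts
            FROM raw_events
            WHERE event_ts::date = %(ds)s
            """,
            {"ds": ds},
        )
    finally:
        conn.close()


with DAG(
    dag_id="daily_events_to_snowflake",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="load_daily_events", python_callable=load_daily_events)

In a setup like the one described here, dbt would typically own the downstream transformation layer, with the DAG handling only ingestion and orchestration.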

Requirements

  • 5+ years of experience as a Data Engineer with a proven track record of successful data warehouse/data lake implementation and management.
  • Deep understanding of how to leverage Snowflake to build a highly performant, resilient data warehouse (or experience with similar platforms).
  • Strong expertise in HiveQL and SQL, and experience with data warehousing/lakehouse concepts (dimensional modeling, data quality, etc.).
  • Strong programming knowledge in Python.
  • Experience with Apache Spark for large-scale data processing (a plus).
  • Proficiency in dbt for data modeling and transformation in Snowflake preferred.
  • Experience working with Google Cloud Platform (GCP) and its data storage services (GCS; BigQuery a plus), or experience with similar platforms.
  • Excellent written and verbal communication skills with the ability to collaborate effectively with cross-functional teams.
  • Strong problem-solving skills and a passion for building efficient and scalable data solutions.

Preferred Qualifications

  • Strong understanding of data architectures and patterns.
  • Experience in DataOps implementation and support.
  • Experience in MLOps implementation and support.
  • Experience in building and supporting AI/ML platforms.

Benefits & Perks

  • Private health insurance 
  • Life insurance
  • Meal/Food voucher
  • Wellhub partnership
  • Mental health assistance
  • Birthday off
  • Daycare assistance
  • Financial support for those who have children with special needs and disabilities
  • Employee referral program
  • Free Rocket Lawyer account with online access to an extensive legal documents library and brilliant licensed attorneys at discounted rates

Contract type: CLT

Brazil Monthly Compensation: R$22.000–R$24.500 BRL

If you apply for this position, your data will be processed in accordance with the Rocket Lawyer Privacy Policy.



Timezones

America/Manaus, America/Rio_Branco, America/Sao_Paulo (UTC-3 to UTC-5)