Data Engineer (Remote in Colorado)
Colorado
About the Role
Rocket Lawyer is seeking an experienced, passionate Data Engineer to join its data engineering team. In this role, you will build a highly reliable, trustworthy data lakehouse that is leveraged across the organization to derive insights and make real-time decisions. You will bring your skills to drive new data projects, build reliable data pipelines, set up strong monitoring and alerting systems, and build data solutions that support a diverse set of use cases.
We value a fun, collaborative, team-oriented work environment, where we celebrate our accomplishments.
Responsibilities
- Design and develop data pipelines using tools like Apache Airflow or by leveraging Snowflake's external tables functionality.
- Translate HiveQL queries to Snowflake SQL, ensuring compatibility and efficient data processing.
- Utilize dbt to model and transform data within Snowflake, adhering to best practices for data governance and maintainability.
- Configure and manage data pipelines in GCP to orchestrate data movement and processing tasks.
- Collaborate with data analysts and stakeholders to understand data requirements and ensure a successful migration outcome.
- Monitor and optimize data pipelines for performance and scalability.
- Develop and implement automated testing procedures to validate data quality and integrity after migration.
- Document the migration process and provide ongoing support for the migrated data warehouse on Snowflake.
Requirements
- 5+ years of experience as a Data Engineer with a proven track record of successful data warehouse/data lake implementation and management.
- Deep understanding of leveraging Snowflake, or similar tools, to build highly performant and resilient data warehouse/lakehouse systems.
- Strong expertise in HiveQL and SQL, and experience with data warehousing/lakehouse concepts (dimensional modeling, data quality, etc.).
- Strong programming knowledge in Python.
- Strong understanding of data architectures and patterns.
- Experience with Apache Spark …
Job Profile
Remote only in Colorado
Benefits/Perks
- 401(k) with company match
- Competitive salary packages
- Comprehensive health plans
- Disability benefits
- Fertility assistance
- Fitness platforms
- Life insurance
- Pet insurance
- Unlimited PTO
Tasks
- Design and develop data pipelines
- Implement automated testing
Skills
AI, Apache Airflow, Apache Spark, automated testing, BigQuery, communication, data architectures, data engineering, data lakehouse, DataOps, data pipelines, data quality, dbt, dimensional modeling, Google Cloud Platform, ML, MLOps, organization, problem-solving, programming, Python, Snowflake, SQL, verbal communication
Experience: 5+ years
Timezones
America/Anchorage, America/Chicago, America/Denver, America/Los_Angeles, America/New_York, Pacific/Honolulu (UTC-10 to UTC-5)