Lead Data Engineer
Canada
Figment is the world’s leading provider of blockchain infrastructure. We provide the most comprehensive staking solution for our more than 500 institutional clients, including exchanges, wallets, foundations, custodians, and large token holders, enabling them to earn rewards on their crypto assets. These clients rely on Figment’s institutional staking service, including rewards optimization, rapid API development, rewards reporting, partner integrations, governance, and slashing protection. Figment is backed by industry experts, financial institutions, and our global team across twenty-three countries. This all leads to our mission: to support the adoption, growth, and long-term success of the Web3 ecosystem.
We are a growth-stage technology company looking for builders and doers: people who are comfortable plotting their course through ambiguity and uncertainty to drive impact, and who are excited to work in new ways and empower a generative company culture.
Responsibilities
- Lead the design and implementation of reliable data pipelines and data storage solutions.
- Lead the implementation of data modeling and integrate technologies according to project needs.
- Manage specific data pipelines and oversee the technical aspects of data operations.
- Ensure data processes are optimized and align with business requirements.
- Identify areas for process improvement and suggest tools and technologies to enhance efficiency.
- Continuously improve data infrastructure automation, ensuring reliable and efficient data processing.
- Lead the development and maintenance of data pipelines and ETL processes using technologies such as Dagster and DBT to ensure efficient data flow and processing.
- Automate data ingestion, transformation, and loading processes to support blockchain data analytics and reporting.
- Utilize Snowflake data warehousing solutions to manage and optimize data storage and retrieval.
- Collaborate with Engineering Leadership and Product teams to articulate data strategies and progress.
- Promote best practices in data engineering, cloud infrastructure, networking, and security.
Minimum Qualifications
- Experience with the data transformation tool DBT
- Having led the design and implementation of complex data transformations using advanced DBT models, materializations, and configurations to streamline data workflows and improve performance.
- Having led the optimization and troubleshooting of DBT pipelines at scale, ensuring that transformations run efficiently in production environments and handle large datasets reliably.
- Experience programming in Python
- Having led the design and implementation of scalable, high-performance applications by leveraging Python's advanced libraries and frameworks (e.g., Pandas, FastAPI, asyncio), ensuring clean code, modularity, and maintainability. …
Job Profile
100% remote-first environment, 4 weeks of PTO, base salary, generative company culture, home office stipend, Learning & Development budget, parental leave
Skills: Airflow, Analytics, API Development, Automation, AWS, BigQuery, Blockchain, CI/CD, CI/CD pipelines, Cloud Infrastructure, Communication, Crypto, Dagster, Databricks, Data Engineering, Data Modeling, Data Orchestration, Data Pipelines, Data Storage, Data Warehousing, DBT, ETL, FastAPI, Golang, Governance, Monitoring, Networking, Pandas, Python, Redshift, Security, Security Best Practices, Snowflake, Staking, Terraform, Web3
Experience: 5 years
Timezones: America/Edmonton, America/Moncton, America/Regina, America/St_Johns, America/Toronto, America/Vancouver (UTC-3 through UTC-8)