Senior Data Engineer
Toronto, Canada
You could work anywhere. Why Figment?
Figment powers the future of Web3 through industry-leading blockchain infrastructure. As the leading provider of staking solutions, we help 500+ institutional clients optimize their crypto rewards, including top exchanges, asset managers, wallets, foundations, custodians, and major token holders. Our clients trust Figment for a comprehensive suite of services, including reward optimization, cutting-edge API development, detailed rewards reporting, seamless partner integrations, governance support, and slashing protection.
Backed by a team of passionate and intelligent Figmates, with a 100% remote-first global presence across 12 countries, our company is on a mission to accelerate the adoption, growth, and long-term success of the Web3 ecosystem. We’re building the infrastructure that will power the decentralized future.
As a fast-growing tech company, we’re looking for builders and innovators — people who thrive in the face of uncertainty and are motivated to make an impact. We are also looking for true teammates: people who are genuine, humble, and driven to level up together. If you're excited to shape the future, contribute to an energetic company culture, and work at the cutting edge of blockchain technology, we want you to join our team and help us lead the charge!
About the opportunity
Join Figment and help us become the world’s leading staking services provider. Figment currently has over $15B in assets under stake, and that figure is growing. This role combines data engineering and software development, focusing on data pipelines and cloud infrastructure. The position requires building custom tools and automating data processes in a highly secure and scalable environment.
How you will make an impact
- Implement and maintain reliable data pipelines and data storage solutions.
- Implement data modeling and integrate technologies according to project needs.
- Manage specific data pipelines and oversee the technical aspects of data operations.
- Ensure data processes are optimized and aligned with business requirements.
- Identify areas for process improvement and suggest tools and technologies to enhance efficiency.
- Continuously improve data infrastructure automation, ensuring reliable and efficient data processing.
- Develop and maintain data pipelines and ETL processes using technologies such as Dagster and dbt to ensure efficient data flow and processing.
- Automate data ingestion, transformation, and loading processes to support blockchain data analytics and reporting.
- Utilize Snowflake data warehousing solutions to manage and optimize data storage …
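As a rough illustration of the ingest-transform-load automation described above, here is a minimal, self-contained Python sketch. It uses SQLite purely as a stand-in for a warehouse like Snowflake, and the table and field names are hypothetical; a production pipeline at this scale would typically run as orchestrated Dagster assets with dbt handling in-warehouse transformations.

```python
import sqlite3

# Minimal ETL sketch: ingest raw staking-reward events, transform, load.
# SQLite stands in for a real warehouse; all names here are illustrative.

RAW_EVENTS = [
    {"validator": "v1", "network": "ethereum", "reward_wei": "1500000000000000"},
    {"validator": "v2", "network": "ethereum", "reward_wei": "2500000000000000"},
]

def transform(event):
    # Normalize units (wei -> ETH) and flatten to a row tuple.
    return (event["validator"], event["network"], int(event["reward_wei"]) / 1e18)

def load(conn, rows):
    # Idempotent table creation plus a batched insert.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS staking_rewards "
        "(validator TEXT, network TEXT, reward_eth REAL)"
    )
    conn.executemany("INSERT INTO staking_rewards VALUES (?, ?, ?)", rows)
    conn.commit()

def run_pipeline(conn, raw_events):
    load(conn, [transform(e) for e in raw_events])

conn = sqlite3.connect(":memory:")
run_pipeline(conn, RAW_EVENTS)
total = conn.execute("SELECT SUM(reward_eth) FROM staking_rewards").fetchone()[0]
print(total)
```

The value of an orchestrator like Dagster on top of steps like these is scheduling, retries, and lineage; dbt would replace the hand-written SQL with versioned, testable models.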
Job Profile
Benefits
- 100% remote-first environment
- 4 weeks of PTO
- Base salary
- Competitive benefits
- Energetic company culture
- Health benefits
- Home office stipend
- Impactful work
- Learning & Development Budget
- Monthly wifi reimbursement
- Parental leave
- Stock options
Tasks
- Develop data infrastructure
- Implement data pipelines
- Optimize data processes
Skills
Airflow, Analytics, API Development, Automation, Blockchain, Blockchain Technology, CI/CD, CI/CD pipelines, Cloud Infrastructure, Collaboration, Communication, Crypto, Dagster, Data Engineering, Data Modeling, Data Operations, Data Orchestration, Data Pipelines, Data Science, Data Storage, Data Transformation, Data Warehousing, dbt, Engineering, ETL, Go, Governance, Microservices, Networking, Python, Sales, Security, Security Best Practices, Snowflake, Software Development, SQL, Staking, Web3
Timezones
America/Edmonton, America/Moncton, America/Regina, America/St_Johns, America/Toronto, America/Vancouver (UTC-3 to UTC-8)