Staff Software Engineer, Data - Level 4
Florida - Remote Office
This is a remote position; however, the candidate must reside within 30 miles of one of the following locations: Portland, ME; Boston, MA; Chicago, IL; or the San Francisco Bay Area, CA.
About the Team/Role
We are looking for a highly motivated, high-potential Staff Engineer to join our Data team, make a significant business impact, and grow your career.
This is an exciting time to be part of the Data team at WEX. WEX offers sophisticated business solutions that empower a diverse range of customers. The data generated from these systems, applications, and platforms is rich and complex. As one of the most valuable assets of WEX, this data holds immense potential to drive value for our customers and the business.
The Data team's mission is to build big data technologies, platforms, systems, and tools that clean, process, enrich, and optimize core company data, making it easy and efficient to use. This enables both our customers and internal teams to unlock business value. We also create value-added data products for WEX customers. Leveraging modern big data and AI technologies, we employ agile development practices, a combined engineering approach, and the product operating model to drive innovation and efficiency.
We provide challenging problems that have significant business impact, offering you opportunities to learn and grow. Our team consists of highly skilled engineers and leaders who will support, guide, and coach you throughout your journey.
If you're driven to become a strong engineer capable of solving complex problems, delivering impactful solutions, and growing quickly, this is the ideal opportunity for you!
How you’ll make an impact
Collaborate with partners and stakeholders to understand customers’ business needs and key challenges.
Design, test, code, and instrument new data products, systems, platforms, and pipelines of high complexity, ensuring simple and high-quality solutions.
Leverage data effectively to measure, analyze, and drive decisions.
Develop and maintain CI/CD automation using tools such as GitHub Actions.
Implement Infrastructure as Code (IaC) using tools like Terraform, managing cloud-based data infrastructure.
Perform software development using TDD, BDD, microservices, and event-driven architectures, with a focus on efficiency, reliability, quality, and scalability.
Support live data products, systems, and platforms, ensuring proactive monitoring, high data quality, rapid incident response, and continuous improvement.
Analyze data, systems, and processes independently to identify bottlenecks and opportunities for improvement.
Mentor peers and foster continuous learning of new technologies within the team and organization.
Attract top talent to the team; participate in interviews and provide timely, constructive feedback.
Act as a role model for team processes and best practices, ensuring assigned tasks solve customer and business problems effectively, reliably, and sustainably.
Collaborate with and lead peers in completing complex tasks and problem-solving efforts.
Lead Scrum teams with hands-on involvement in Agile practices, ensuring the timely and high-quality development of solutions.
Own large, complex components, systems, products, and platforms.
Lead and participate in technical discussions, driving high-quality and efficient system design.
Independently complete medium to large complexity tasks and proactively seek feedback from senior engineers to ensure quality.
Proactively identify and communicate project dependencies.
Review peer work and provide constructive feedback to enhance team collaboration and quality.
Build reliable, secure, high-quality, and scalable big data platforms and tools to support data transfer, ingestion, processing, serving, delivery, consumption, and data governance.
Design and implement scalable systems, platforms, pipelines, and tools for the end-to-end data lifecycle, including ingestion, cleaning, processing, enrichment, optimization, and serving, ensuring high-quality and easy-to-use data for both internal and external purposes.
Develop data quality measurement and monitoring systems, metadata management, data catalogs, and Master Data Management (MDM) solutions.
Use data modeling techniques to design and implement efficient and user-friendly data models and structures.
Become a subject matter expert in your functional area and best practices.
Apply creative problem-solving techniques to resolve issues or provide various approaches for unique situations.
Leverage data and AI technologies in design and development for high productivity and improved solution quality, influencing peers in these areas.
Lead team initiatives, applying your broad experience and technical knowledge to make informed decisions on solving complex issues.
Hold yourself and your team accountable for delivering high-quality results using defined OKRs.
Interact with senior managers to discuss plans, results, and provide advice on complex matters.
Experience you’ll bring
Bachelor's degree in Computer Science, Software Engineering, or a related field, OR equivalent deep understanding, experience, and capability.
A Master’s or PhD degree in Computer Science (or related field) and 5+ years of software engineering experience, or 7+ years of large-scale software engineering experience, with expertise in data system/platform development.
Technically deep, innovative, empathetic, and passionate about delivering business-oriented solutions.
Strong problem-solving skills, with excellent communication and collaboration abilities.
Highly self-motivated and eager to learn, continuously adopting new technologies to improve productivity and the quality of deliverables, with proficiency in leveraging GenAI technologies to enhance work productivity and build innovative products and systems for customers.
Extensive experience designing simple, high-quality, performant, and efficient solutions for large, complex problems.
Strong understanding and hands-on experience with CI/CD automation.
Proven experience in combined engineering practices and Agile development.
Extensive experience and strong, highly productive implementation skills in languages such as Java, C#, Golang, and Python, including coding, automated testing, performance measurement, and monitoring.
In-depth understanding of data processing techniques, such as data pipeline and platform development, SQL, and databases.
Extensive experience in data ingestion, cleaning, processing, enrichment, storage, serving, and quality assurance techniques and tools, including ELT, SQL, relational algebra, and databases.
Experience with cloud technologies, particularly AWS and Azure.
Good understanding of data warehousing, dimensional modeling, and related techniques.
Passion for understanding and solving customer and business problems.
Familiarity with data governance is a plus.
Preferred Qualifications:
Lakehouse Platform Development: Experience in building and maintaining Lakehouse platforms and resources, ideally using Snowflake or equivalent data technologies.
Cloud Infrastructure Management: Extensive hands-on experience with cloud infrastructures in AWS and Azure, with a strong focus on managing infrastructure as code (IaC) using Terraform and implementing CI/CD pipelines using GitHub Actions.
API Expertise: Skilled in building, managing, and consuming APIs, with a strong understanding of API technologies and integration best practices.
Benefits/Perks
Quarterly or annual bonus
Retirement savings plan
Paid time off
Health savings account
Flexible spending accounts
Life insurance
Disability insurance
Tuition reimbursement
Career growth, continuous improvement, and opportunities to learn
Supportive, collaborative team environment
Total compensation package