
Senior Data Engineer

United States - Remote

Our Story:
Crisis Prevention Institute Inc. is the worldwide leader in evidence-based de-escalation and crisis prevention training, and dementia care services. Since 1980, we’ve helped train more than 15 million people within service-oriented industries including education, healthcare, behavioral health, long-term care, human services, security, corrections, corporate, and retail.

At CPI, we are dedicated to changing behaviors and reducing conflict for the Care, Welfare, Safety, and Security℠ of everyone. We believe in the power of empathy, compassion, and meaningful connections. We believe personal safety and security are the antidotes to fear and anxiety. It’s a philosophy that is central to everything we do and traces back to our beginning. It is what defines and differentiates us, and it informs our core beliefs.

The Role:

The Senior Data Engineer will apply quality engineering best practices to meet and exceed internal and external client expectations. In this position, you will analyze, design, develop, test, and document solutions supporting data integration, performance tuning, and data modeling to drive organizational growth objectives. The Senior Data Engineer will define the standards for data architecture, platform architecture, and data quality and governance. This role is responsible for ensuring that the function is aligned with the overall CPI organization and continuously works to meet critical service levels in access, delivery, and security.

What You Get To Do Every Day:

  • Co-architect Crisis Prevention Institute’s (CPI) next-gen cloud data analytics platform.
  • Increase operating efficiency and adapt to new requirements.
  • Monitor and maintain the health of solutions generated.
  • Support and enhance our data-ops practices.
  • Provide task breakdowns, identify dependencies, and provide effort estimates.
  • Model data warehouse entities in erwin Data Modeler.
  • Build data transformation pipelines with dbt (data build tool).
  • Evaluate the latest technological trends and develop proof-of-concept prototypes that align with CPI opportunities.
  • Develop positive relationships with clients, stakeholders, and internal teams.
  • Understand business goals, drivers, context, and processes to suggest technology solutions that improve CPI.
  • Work collaboratively on creative solutions with engineers, product managers, and analysts in an agile-like environment.
  • Perform design and code reviews.
  • Perform other position-related duties as assigned.

You Need to Have:

  • Bachelor’s degree in computer engineering, computer science, data science, or related field
  • Seven or more years of experience working with data modeling, architecture, and engineering
  • Two or more years of experience designing and implementing data warehouses in Snowflake
  • Experience with all core software development activities, including requirements gathering, design, construction, and testing
  • Experience performing data transformation using dbt
  • Experience working with data quality (DQ) products such as Monte Carlo, Bigeye, or Great Expectations
  • Experience with Azure DevOps (Repos, Pipelines, Boards, Wiki, Test Plans)
  • Experience with formal software development methodologies, including the Software Development Life Cycle (SDLC), Agile, or Scrum
  • Experience building high-performance and highly reliable data pipelines
  • Knowledge of data warehouse design patterns (star schema, data vault)
  • Experience building dashboards with business intelligence tools
  • Knowledge of DataOps
  • Experience with cloud-based compute, storage, integration, and security patterns
  • Knowledge and understanding of RESTful APIs
  • Knowledge of current data engineering trends, best practices, and standards
  • Knowledge of SQL and Python
  • Ability to work in a collaborative environment
  • Ability to facilitate evaluation of technologies and achieve consensus on technical standards and solutions among a diverse group of information technology professionals
  • Ability to work in an organization driven by continuous improvement or with an equivalent focus on process improvement
  • Ability to manage multiple competing priorities and attain the best possible outcomes for the organization
  • Excellent verbal and written communication and effective listening skills

We'd Love to See:

  • Experience in delivering an end-to-end data analytics platform using modern data stack components
  • Experience working with artificial intelligence (AI) and machine learning (ML)
  • SnowPro Advanced Certification
  • dbt Analytics Engineering Certification

What We Offer:

  • $120,000 - $130,000 annual salary
  • Annual company performance bonus
  • Comprehensive benefits package
  • 401k
  • PTO
  • Health & Wellness Days
  • Paid Volunteer Time Off
  • Continuing education and training
  • Hybrid work schedule

Crisis Prevention Institute is an Equal Opportunity Employer that does not discriminate against any applicant or employee on the basis of age, race, color, ethnicity, national origin, citizenship, religion, diversity of thoughts and beliefs, creed, sex, sexual orientation, gender, gender identity, or expression (including against any individual that is transitioning, has transitioned, or is perceived to be transitioning), marital status or civil partnership/union status, physical or mental disability, medical condition, pregnancy, childbirth, genetic information, military and veteran status, or any other basis prohibited by applicable federal, state, or local law. The Company will consider for employment qualified applicants with criminal histories in a manner consistent with local and federal requirements. Our management team is dedicated to this policy with respect to recruitment, hiring, placement, promotion, transfer, training, compensation, benefits, employee activities, and general treatment during employment.
