Data Engineer
Ohio
Job Highlights
· Location: Remote; must be based in the United States
· Salary Range: $103,500-$143,500 per year, plus benefits. Individual salary offers will be based on experience and qualifications unique to each candidate.
· Position Type: Grant funded, limited-term opportunity
· Position End Date: June 30, 202
Overview
The Data Engineer will play a crucial role in advancing the CDC Foundation's mission by designing, building, and maintaining data infrastructure for a public health organization. This role is aligned to the Workforce Acceleration Initiative (WAI), a federally funded CDC Foundation program whose goal is to help the nation's public health agencies by providing them with the technology and data experts they need to accelerate their information system improvements. Working within the Cleveland Department of Public Health (CDPH), the Data Engineer is responsible for enabling data integration and data preparation pipelines for downstream analytics on behalf of the Office of Epidemiology and Population Health. This role requires business intuition and the ability to apply a variety of technical and soft skills to collaborate across departments.
The Data Engineer will be hired by the CDC Foundation and assigned to the Epidemiologist responsible for informatics in the Office of Epidemiology and Population Health (OEPH). The Data Engineer will additionally cooperate with the Office of Urban Analytics & Innovation (Urban AI) at the City of Cleveland for alignment on data infrastructure requirements and best practices for the enterprise. This position is eligible for a fully remote work arrangement for U.S.-based candidates.
Responsibilities
· Utilize software engineering methods and tools on a common data analytic platform to integrate, process, and prepare multiple sources of data for downstream public health surveillance analyses.
· Collaborate with the Data Analyst and Epidemiologists to understand data requirements, and develop and maintain data pipelines that automate data transformation tasks.
· Perform data linkages between public health surveillance data and geospatial data assets (see the sketch following this list).
· Document data transformation processes and maintain comprehensive records for reproducibility.
· Test data and/or applications to validate data accuracy and quality.
· Track projects from conceptualization to completion, including helping to create project roadmaps, project plans, and requirements documentation.
· Create and manage the systems and pipelines that enable efficient and reliable flow of data, including ingestion, processing, and storage.
· Collect data from various sources, transforming and cleaning it to ensure accuracy and consistency. Load data into storage systems or data warehouses.
· Optimize data pipelines, infrastructure, and workflows for performance and scalability.
· Monitor data pipelines and systems for performance issues, errors, and anomalies, and implement solutions to address them.
· Implement security measures to protect sensitive information.
· Collaborate with data scientists, analysts, and other partners to understand their data needs and requirements, and to ensure that the data infrastructure supports the organization's goals and objectives.
· Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs.
· Implement and maintain ETL processes to ensure the accuracy, completeness, and consistency of data.
· Design and manage data storage systems, including relational databases, NoSQL databases, and data warehouses.
· Stay current on industry trends, best practices, and emerging technologies in data engineering, and incorporate them into the organization's data infrastructure.
· Provide technical guidance to other staff.
· Communicate effectively with partners at all levels of the organization to gather requirements, provide updates, and present findings.
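As an illustration only (not part of the posting itself), below is a minimal sketch of the kind of data linkage described in the responsibilities above: joining case-level surveillance records to neighborhood boundaries using Python and geopandas. The file names, column names, and coordinate reference system are hypothetical assumptions.

import pandas as pd
import geopandas as gpd

# Hypothetical inputs: a case-level surveillance extract with latitude/longitude
# columns, and a neighborhood boundary file. File and column names are illustrative.
cases = pd.read_csv("surveillance_cases.csv")            # case_id, lat, lon, report_date
neighborhoods = gpd.read_file("neighborhoods.geojson")   # polygons with a 'name' column

# Convert the tabular cases to points, then match the boundary file's CRS.
cases_gdf = gpd.GeoDataFrame(
    cases,
    geometry=gpd.points_from_xy(cases["lon"], cases["lat"]),
    crs="EPSG:4326",
).to_crs(neighborhoods.crs)

# Spatial join: attach the containing neighborhood to each case record.
linked = gpd.sjoin(
    cases_gdf,
    neighborhoods[["name", "geometry"]],
    how="left",
    predicate="within",
)

# Aggregate for downstream analysis, e.g. case counts per neighborhood.
counts = (
    linked.groupby("name", dropna=False)
    .size()
    .rename("case_count")
    .reset_index()
)
counts.to_csv("cases_by_neighborhood.csv", index=False)

A left join with the "within" predicate keeps case records that fall outside every boundary, so geocoding gaps remain visible in the output rather than being silently dropped.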
Qualifications
· Bachelor's degree in computer science or information systems, or equivalent experience
· Demonstrated ability in complex data management and data preparation, including but not limited to data storage, data standardization, and data operations, for data warehousing efforts
· Experience working with data integration frameworks
· Experience working with cloud services & infrastructure (Microsoft Azure Databricks preferred)
· Experience designing, writing, and delivering code in a team environment, using source code control, unit testing, and other software engineering principles (e.g., Java, Python, R); a brief sketch follows this list
· Ability to thrive in a project-based, team environment
Preferred Skills
· Spatial data experience, e.g. geopandas or ArcGIS
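Also as an illustration only, here is a minimal sketch of delivering code with unit testing, as the qualifications above call for: a small data-standardization helper paired with a pytest-style test. The function name, column name, and test values are hypothetical.

import pandas as pd

def standardize_dates(df: pd.DataFrame, column: str) -> pd.DataFrame:
    """Parse a date column to ISO dates; unparseable values become missing (NaT)."""
    out = df.copy()
    out[column] = pd.to_datetime(out[column], errors="coerce").dt.date
    return out

def test_standardize_dates_coerces_invalid_values():
    # Valid dates are standardized; junk values are coerced to missing rather than failing.
    raw = pd.DataFrame({"report_date": ["2024-01-05", "not a date"]})
    result = standardize_dates(raw, "report_date")
    assert result["report_date"].iloc[0].isoformat() == "2024-01-05"
    assert pd.isna(result["report_date"].iloc[1])

Running pytest against a file containing this code discovers and executes the test automatically, which is the kind of lightweight, repeatable validation the role's unit-testing expectation implies.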
Special Notes
This role is involved in a dynamic public health program. As such, roles and responsibilities are subject to change as situations evolve. Roles and responsibilities listed above may be expanded upon or updated to match priorities and needs, once written approval is received by the CDC Foundation, in order to best support the public health programming.
All qualified applicants will receive consideration for employment and will not be discriminated against on the basis of race, color, religion, sex, national origin, age, mental or physical disabilities, veteran status, and all other characteristics protected by law.
We comply with all applicable laws including E.O. 11246 and the Vietnam Era Readjustment Assistance Act of 1974 governing employment practices and do not discriminate on the basis of any unlawful criteria in accordance with 41 C.F.R. §§ 60-300.5(a)(12) and 60-741.5(a)(7). As a federal government contractor, we take affirmative action on behalf of protected veterans.
The CDC Foundation is a smoke-free environment. Relocation expenses are not included.
Job Profile
Fully remote; must be based in the United States.
Benefits/Perks: Fully remote work arrangement, grant funded opportunity, health benefits
Tasks
- Collaborate across departments
- Collaborate with analysts and epidemiologists
- Design and maintain data infrastructure
- Develop data pipelines
- Document processes
- Gather requirements
- Implement security measures
- Optimize data pipelines
- Optimize workflows
- Present findings
- Provide technical guidance
- Test data accuracy
- Track project progress
Skills: Analytics, ArcGIS, Azure, Best Practices, C, Cloud, Collaboration, Data & Analytics, Data Engineering, Data Infrastructure, Data Integration, Data Management, Data Pipelines, Data Preparation, Data Quality, Data Storage, Data Transformation, Data Warehousing, Documentation, Engineering, Epidemiology, ETL, ETL Processes, Geospatial Data, Informatics, Information Systems, Infrastructure, Integration, Java, Microsoft Azure, NoSQL, NoSQL Databases, Project Management, Public Health, Python, R, Relational Databases, Security, Security Measures, Software Engineering, Testing
Education: Bachelor's degree, Bachelor's degree in Computer Science, Epidemiology, Information Systems, Public Health, Software Engineering
Timezones: America/Anchorage, America/Chicago, America/Denver, America/Los_Angeles, America/New_York, Pacific/Honolulu (UTC-5 through UTC-10)