FreshRemote.Work

Senior Data Engineer

Virginia Remote Office (VA99)

Job Description 

ICF is a mission-driven company filled with people who care deeply about improving the lives of others and making the world a better place. Our core values include Embracing Difference; we seek candidates who are passionate about building a culture that encourages, embraces, and hires dimensions of difference.  

  

Our Health Engineering Systems (HES) team works side by side with customers to articulate a vision for success and then make it happen. We know success doesn't happen by accident; it takes the right team of people working together on the right solutions for the customer. We are looking for a seasoned Senior Data Engineer to be a key driver in making that happen. 

 

Responsibilities:  

Design, develop, and maintain scalable data pipelines using Spark, Hive, and Airflow (see the illustrative sketch following this list) 

Develop and deploy data processing workflows on the Databricks platform 

Develop API services to facilitate data access and integration 

Create interactive data visualizations and reports using AWS QuickSight 

Build the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using AWS and SQL technologies 

Monitor and optimize the performance of data infrastructure and processes 

Develop data quality and validation jobs 

Assemble large, complex data sets that meet functional and non-functional business requirements 

Write unit and integration tests for all data processing code 

Work with DevOps engineers on CI/CD and infrastructure as code (IaC) 

Read specs and translate them into code and design documents 

Perform code reviews and develop processes for improving code quality 

Improve data availability and timeliness by implementing more frequent refreshes, tiered data storage, and optimizations of existing datasets 

Maintain security and privacy for data at rest and while in transit 

Other duties as assigned 
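
To make the orchestration and data quality responsibilities above concrete, here is a minimal, purely illustrative sketch of a daily Airflow pipeline with a validation step. It assumes a generic Airflow 2.4+ environment; the DAG id, schedule, and task bodies are hypothetical placeholders and are not taken from this posting.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Hypothetical example: the DAG id, schedule, and task bodies are
    # placeholders, not an actual ICF pipeline.

    def extract():
        # In practice: pull raw data from an API or an S3 landing zone.
        print("extracting source data")

    def transform():
        # In practice: launch a Spark or Databricks job to transform the data.
        print("transforming data")

    def validate():
        # In practice: run row-count, null, and schema checks before publishing.
        print("validating data quality")

    with DAG(
        dag_id="example_daily_refresh",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        validate_task = PythonOperator(task_id="validate", python_callable=validate)

        extract_task >> transform_task >> validate_task

In a real deployment the transform step would more likely hand off to Spark on EMR or a Databricks job, and the validation step would gate publication of the refreshed data.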

 

Minimum Qualifications:  

Bachelor's degree in computer science, engineering, or a related field 

7+ years of hands-on software development experience 

4+ years of data pipeline experience using Python, Java, and cloud technologies 

Candidate must be able to obtain and maintain a Public Trust clearance 

Candidate must reside in the US, be authorized to work in the US, and work must be performed in the US 

Must have lived in the US 3 full years out of the last 5 years 

 

Preferred Qualifications:  

 

Experience with Spark and Hive for big data processing (see the illustrative sketch following this list) 

Experience building job workflows with the Databricks platform 

Strong understanding of AWS products including S3, Redshift, RDS, EMR, AWS Glue, AWS Glue DataBrew, Jupyter Notebooks, Athena, QuickSight, …
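
As a rough illustration of the Spark and Hive processing named in the preferred qualifications, the sketch below reads Parquet data from S3, aggregates it with PySpark, and writes a partitioned Hive-managed table that tools such as Athena or QuickSight could query. Every bucket, column, and table name is invented for the example, and it assumes a Spark environment with Hive support (for instance EMR or Databricks).

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Hypothetical example: bucket, column, and table names are placeholders.
    spark = (
        SparkSession.builder
        .appName("example_claims_rollup")
        .enableHiveSupport()  # allow writing to a Hive-managed table
        .getOrCreate()
    )

    # Read raw events from an S3 landing zone.
    raw = spark.read.parquet("s3://example-bucket/landing/claims/")

    # Aggregate to daily counts per status.
    daily = (
        raw.withColumn("event_date", F.to_date("event_ts"))
           .groupBy("event_date", "status")
           .count()
    )

    # Persist as a partitioned table for downstream querying.
    (
        daily.write
             .mode("overwrite")
             .partitionBy("event_date")
             .saveAsTable("analytics.claims_daily_counts")
    )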


Job Profile

Regions

North America

Countries

United States

Restrictions

Must reside in the US; must be authorized to work in the US; work must be performed in the US; personal VPN connections are prohibited

Benefits/Perks

Cutting-edge technology, Equal opportunity employer, Inclusive workplace, Reasonable accommodations, Remote, Remote-first company

Tasks
  • Create data visualizations
  • Design and develop data pipelines
  • Development
  • Develop processes
  • Integration
  • Navigate change
  • Other duties as assigned
  • Perform code reviews
  • Shape the future
  • Write unit and integration tests
Skills

Access, Agile, Agile Methodology, Airflow, API, API Development, AWS, AWS Glue, AWS QuickSight, Big Data, Business, Business Requirements, C, C++, Cassandra, CI/CD, Cloud, Cloud Technologies, Databases, Databricks, Data Engineering, Data Governance, Data Infrastructure, Data Pipeline, Data Processing, Data Quality, Data Transformation, Data Visualization, Design, Development, DevOps, Education, EMR, Engineering, GitHub, GitHub Actions, Hive, IaC, Infrastructure, Integration, IT, Java, Jupyter, Management, NoSQL, Policy, Postgres, Python, Redshift, Reports, REST, Scala, Science, Scripting, Security, Software Development, Spark, SQL, Storage, Technology Services, Terraform, Test Driven Development, Workflow Management, Workflows

Experience

7 years

Education

Bachelor's degree, Business, Computer Science, Education, Engineering, IT, Management, Policy, Science, Technology, or related field

Timezones

America/Anchorage, America/Chicago, America/Denver, America/Los_Angeles, America/New_York, Pacific/Honolulu, UTC-10, UTC-5, UTC-6, UTC-7, UTC-8, UTC-9