FreshRemote.Work

Senior Data Engineer

Remote, US

As a Data Engineer at TSPi, you will implement data engineering solutions for a variety of government agencies, including but not limited to clients with conservation and environmental missions. You'll develop and deploy pipelines and platforms that organize disparate data and make it meaningful. At TSPi you will work with a multi-disciplinary team of analysts, data engineers, developers, and data consumers in a fast-paced, Agile environment. You'll draw on your experience in analytical exploration and data examination as you manage the assessment, design, building, and maintenance of scalable data platforms for your clients.

Job Duties

  • Optimize queries and data processing to improve performance and reduce costs.
  • Perform data transformations and data cleansing to prepare data for analysis and reporting.
  • Monitor data pipelines and systems for performance, reliability, and data quality issues.
  • Document data pipelines, data models, and data infrastructure components.
  • Share knowledge with team members and stakeholders to ensure a clear understanding of data processes.
  • Implement data validation and data enrichment processes.
  • Design and develop robust and scalable data pipelines to extract, transform, and load (ETL) data from various sources into data warehouses or data lakes.

Experience:

  • Ability to work in a fast-paced, dynamic environment with tight deadlines and competing requirements. Strong organizational, planning, and time management skills.
  • 3+ years of professional and technical experience as a Data Architect, Data Engineer, or related role developing data models and architectural solutions.
  • Knowledge of developing and using data standards, including common formats, representation, definition, structuring, manipulation, tagging, transmission, use, and management of data.
  • 3+ years of experience with data programming languages (Python, Java, SQL, etc.) and with data orchestration and integration pipelines.
  • Knowledge of cloud computing platforms such as Google Cloud Platform (GCP), Amazon Web Services (AWS), Azure, or equivalent.
  • Understanding of the software development lifecycle, including Agile and traditional project management and delivery methodologies.
  • Strong written and oral communications skills. 
  • Strong math and analytical skills.

Required Skills:

  • 3+ years of technical or other relevant industry experience
  • 2+ years of experience with SQL, database development, and administration
  • 2+ years of experience with AWS, Azure, or GCP cloud services (AWS preferred)
  • Experience building dashboards with data visualization tools such as Tableau, Qlik, Power BI, or QuickSight
  • Excellent oral and written communication skills
  • The successful candidate is subject to a background investigation by the government and must meet the requirements to hold a position of public trust

Education:

  • Bachelor’s degree in Computer Science or related field

Job Profile

Regions

North America

Countries

United States

Restrictions

Must meet public trust requirements; subject to background investigation

Tasks
  • Design data pipelines
  • Document data processes
  • Implement data solutions
  • Monitor data pipelines
  • Optimize queries
Skills

Agile Analytical AWS Azure Cloud Computing Communication Data Cleansing Data Engineering Data Modeling Data Pipelines Data Quality Data Transformation Data Visualization ETL GCP Java Power BI Project Management Python Qlik QuickSight SQL Tableau

Experience

3 years

Education

Bachelor's degree in Computer Science or a related field

Timezones

America/Anchorage America/Chicago America/Denver America/Los_Angeles America/New_York Pacific/Honolulu UTC-10 UTC-5 UTC-6 UTC-7 UTC-8 UTC-9