Senior Data Engineer - People & Finance Technology (REMOTE) - MD Chevy Chase (Office) - JPS

GEICO is seeking an experienced Senior Data Engineer with a passion for transforming existing technology to new open-source technologies. You will help drive our insurance business transformation as we transition from a traditional IT model to a tech organization with engineering excellence as its mission, while co-creating a culture of psychological safety and continuous improvement.

Our Senior Data Engineer is a key member of the engineering staff, working across the organization to provide a frictionless experience to our customers and maintain the highest standards of protection and availability. Our team thrives on delivering high-quality technology products and services in a hyper-growth environment where priorities shift quickly.

The ideal candidate is an experienced Data Engineer with a background in ETL or ELT processing with SQL/NoSQL databases, experience transforming existing tech to new open-source technologies (ideally Postgres), and a strong development background in Spark, Scala, Java, and/or Python.

Position Responsibilities

As a Senior Data Engineer, you will:

  • Design and implement a data ingestion platform

  • Scope, design, and build scalable, resilient distributed systems

  • Build product definition and leverage your technical skills to drive towards the right solution

  • Engage in cross-functional collaboration throughout the entire software lifecycle

  • Lead in design sessions and code reviews with peers to elevate the quality of engineering across the organization

  • Define, create, and support reusable application components/patterns from a business and technology perspective

  • Build the processes required for optimal extraction, transformation, and loading of data

  • Work with other teams to design, develop, test, implement, and support technical solutions in full-stack development tools and technologies

  • Perform unit tests and conduct reviews with other team members to make sure code is rigorously designed, elegantly coded, and effectively tuned for performance

  • Share your passion for staying on top of tech trends, experimenting with, and learning recent technologies, participating in internal and external technology communities, and mentoring other members of the engineering community

  • Mentor other engineers

  • Consistently share best practices and improve processes within and across teams

Qualifications

  • Experience developing new and enhancing existing data processing components (Data Ingest, Data Transformation, Data Store, Data Management, Data Quality)

  • Advanced programming experience and big data experience

  • Understanding of data warehouse concepts including data modeling and OLAP

  • Experience working with cloud data solutions (Delta Lake, Iceberg, Hudi, Snowflake, Redshift, or equivalent)

  • Experience with data formats such as Parquet, Avro, ORC, XML, JSON

  • Experience with designing, developing, implementing, and maintaining solutions for data ingestion and transformation projects

  • Experience working with streaming applications (Spark Streaming, Flink, Kafka, or equivalent)

  • Experience with data processing/transformation using ETL/ELT tools such as dbt (data build tool) or Databricks

  • Experience with programming languages such as Python, Scala, and Java, and frameworks such as Spark

  • Experience with containers and container orchestration services such as Docker and Kubernetes

  • Strong working knowledge of SQL and the ability to write, debug, and optimize SQL queries and ETL jobs to reduce the execution window or resource utilization

  • Experience with cloud computing (AWS, Microsoft Azure, Google Cloud)

  • Exposure to messaging technologies such as Kafka, ActiveMQ, RabbitMQ, or similar

  • Experience with REST and microservices is a big plus

  • Experience with developing systems that are scalable, resilient, and highly available

  • Experience with Infrastructure as Code

  • Experience with CI/CD deployment and test automation (ADO, Jenkins, Gradle, Artifactory, or equivalents)

  • Experience with version control systems such as Git

  • Experience with load testing and load testing tools

  • Advanced understanding of monitoring concepts and tooling

  • Experience with Elasticsearch, Dynatrace, ThousandEyes, InfluxDB, Prometheus, Grafana, or equivalents

  • Experience architecting and designing new and existing systems

  • Advanced understanding of DevOps concepts

  • Strong problem-solving ability

  • Ability to excel in a fast-paced environment

  • Knowledge of developer tooling across the software development life cycle (task management, source code, building, deployment, operations, real-time communication)

Experience

  • 4+ years of professional software development experience in at least one of the following: Java, Spark, Scala, Python

  • 3+ years of experience with architecture and design

  • 3+ years of experience with AWS, GCP, Azure, or another cloud service

  • 2+ years of experience with open-source frameworks

Education

  • Bachelor’s degree in Computer Science, Information Systems, or equivalent education or work experience



Annual Salary

$82,000.00 - $185,000.00

The above annual salary range is a general guideline. Multiple factors are taken into consideration to arrive at the final hourly rate/annual salary to be offered to the selected candidate. Factors include, but are not limited to, the scope and responsibilities of the role, the selected candidate’s work experience, education and training, the work location, as well as market and business considerations.


GEICO will consider sponsoring a new qualified applicant for employment authorization for this position.



As an Associate, you’ll enjoy our Total Rewards Program* to help secure your financial future and preserve your health and well-being, including:

  • Premier Medical, Dental and Vision Insurance with no waiting period**
  • Paid Vacation, Sick and Parental Leave
  • 401(k) Plan
  • Tuition Reimbursement
  • Paid Training and Licensures

*Benefits may be different by location. Benefit eligibility requirements vary and may include length of service.

**Coverage begins on the date of hire. Must enroll in New Hire Benefits within 30 days of the date of hire for coverage to take effect.

The equal employment opportunity policy of the GEICO Companies provides for a fair and equal employment opportunity for all associates and job applicants regardless of race, color, religious creed, national origin, ancestry, age, gender, pregnancy, sexual orientation, gender identity, marital status, familial status, disability or genetic information, in compliance with applicable federal, state and local law. GEICO hires and promotes individuals solely on the basis of their qualifications for the job to be filled.

GEICO reasonably accommodates qualified individuals with disabilities to enable them to receive equal employment opportunity and/or perform the essential functions of the job, unless the accommodation would impose an undue hardship to the Company. This applies to all applicants and associates. GEICO also provides a work environment in which each associate is able to be productive and work to the best of their ability. We do not condone or tolerate an atmosphere of intimidation or harassment. We expect and require the cooperation of all associates in maintaining an atmosphere free from discrimination and harassment with mutual respect by and for all associates and applicants.

