Senior Data Engineer (REMOTE)

United States

SailPoint is the leader in identity security for the cloud enterprise. Our identity security solutions secure and enable thousands of companies worldwide, giving our customers unmatched visibility into the entirety of their digital workforce, ensuring workers have the right access to do their job – no more, no less.  

Want to be on a team full of results-driven individuals who are constantly seeking to innovate?

Want to make an impact?

At SailPoint, our Data Platform team does just that. SailPoint is seeking a Senior Data Engineer to help build robust data ingestion and processing systems to power our data platform. We are looking for well-rounded engineers who are passionate about building and delivering reliable, scalable data pipelines. This is a unique opportunity to build something from scratch while having the backing of an organization with the muscle to take it to market quickly, and a very satisfied customer base.

Responsibilities:

  • Spearhead the design and implementation of ELT processes, especially focused on extracting data from and loading data into various endpoints, including RDBMS, NoSQL databases, and data warehouses.

  • Develop and maintain scalable data pipelines for both stream and batch processing, leveraging JVM-based languages and frameworks.

  • Collaborate with cross-functional teams to understand diverse data sources and environment contexts, ensuring seamless integration into our data ecosystem.

  • Utilize the AWS service stack wherever possible to implement lean design solutions for data storage, data integration, and data streaming problems.

  • Develop and maintain workflow orchestration using tools like Apache Airflow.

  • Stay abreast of emerging technologies in the data engineering space, proactively incorporating them into our ETL processes.

  • Thrive in an environment with ambiguity, demonstrating adaptability and problem-solving skills.

Qualifications:

  • Due to FedRAMP requirements, US citizenship is required to be considered for this role.

  • BS in computer science or a related field.

  • 5+ years of experience in data engineering or a related field.

  • Demonstrated system-design experience orchestrating ELT processes.

  • Hands-on experience with at least one streaming or batch processing framework, such as Flink or Spark.

  • Hands-on experience with containerization platforms such as Docker and container orchestration tools like Kubernetes.

  • Proficiency in the AWS service stack.

  • Familiarity with workflow orchestration tools such as Airflow.

  • Experience with dbt, Kafka, Jenkins, and Snowflake.

  • Experience leveraging tools such as Kustomize, Helm, and Terraform to implement infrastructure as code.

  • Strong interest in staying ahead of new technologies in the data engineering space.

  • Comfortable working in ambiguous team-situations, showcasing adaptability and drive in …
