Staff Architect
Remote (United States)
Astronomer designed Astro, an industry-leading, orchestration-first DataOps platform for data teams. Powered by Airflow, Astro accelerates building reliable data products that unlock insights, unleash AI value, and drive data-driven applications.
We’re a globally-distributed and rapidly growing venture-backed team of learners, innovators and collaborators. Our mission is to empower data teams to bring mission-critical analytics, AI, and software to life. As a member of our team, you will be at the forefront of the industry as we strive to deliver the world's data.
Your background may be unconventional; as long as you have the essential qualifications, we encourage you to apply. While having "bonus" qualifications makes for a strong candidate, Astronomer values diverse experiences. Many of us at Astronomer haven't followed traditional career paths, and we welcome it if yours hasn't either.
About this role:
As a Staff Architect, you will be a key member of our professional services team and work directly with Astronomer’s most important customers, assisting them in a technical leadership capacity with their data ecosystem modernization and DataOps transformation initiatives.
In this role, you will be exposed to a wide variety of data processing use cases, primarily orchestrated by Apache Airflow. The impact you make will enable customers to rapidly grow their businesses by adopting Astronomer products to meet their goals.
What you get to do:
Work directly with Astronomer customers, acting as technical lead for high-stakes professional services engagements
Participate in pre-sales motions where a specialized technical solution is required
Build architecture, data flow, and operational diagrams and documents
Provide reference implementations of various activities, including composing data pipelines in Airflow, implementing new Airflow features, or integrating Airflow with third-party solutions (see the illustrative sketch after this list)
Collaborate to build reusable assets, automation tools, and documented best practices
Interact with Product and Engineering stakeholders to relay product feedback and inform requirements discussions
Work with Global Service Delivery team members to ensure clients are realizing value in their Airflow and Astronomer journeys
Establish strong relationships with key customer stakeholders
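As a purely illustrative example of the kind of reference implementation mentioned above, the sketch below composes a minimal ETL-style pipeline with Airflow's TaskFlow API. The DAG id, task names, and data are placeholders invented for this sketch, and the schedule argument assumes Airflow 2.4 or later.

# Minimal illustrative sketch of an Airflow TaskFlow pipeline.
# All names and data here are placeholders, not part of the job description.
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="example_reference_pipeline",  # hypothetical DAG id
    schedule=None,                        # assumes Airflow 2.4+; older versions use schedule_interval
    start_date=datetime(2024, 1, 1),
    catchup=False,
)
def example_reference_pipeline():
    @task
    def extract():
        # Placeholder for pulling records from a source system
        return [1, 2, 3]

    @task
    def transform(records):
        # Placeholder for a transformation step
        return [r * 2 for r in records]

    @task
    def load(records):
        # Placeholder for writing to a warehouse or downstream system
        print(f"Loaded {len(records)} records")

    # TaskFlow infers task dependencies from these function calls
    load(transform(extract()))


example_reference_pipeline()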
What you bring to the role:
Experience with Apache Airflow in production environments
Experience in designing and implementing ETL, Data Warehousing, and ML/AI/analytics use cases
Experience with OpenLineage and data quality and observability tools
Proficiency in Python and ideally other programming languages
Knowledge of cloud-native data architecture
Demonstrated technical leadership on team projects
Strong oral and written communication skills
Customer empathy
Willingness to learn new technologies and build reference implementations
Bonus points if you have:
7+ years of experience in data engineering or a similar role
2+ years in a customer-facing role
Consulting experience
Experience in migrating workflows from legacy schedulers (Control-M, Autosys, Oozie, Cron, etc.) to Apache Airflow
Snowflake experience
Databricks or Spark experience
Kubernetes experience, either on-premise or in the cloud
Enterprise data experience in regulated environments
The estimated salary for this role ranges from $175,000 to $200,000, along with an equity component. This range is an estimate; its breadth reflects our willingness to consider candidates across a broad range of prior seniority. Actual compensation may vary based on skills, experience, and qualifications.
#LI-Remote
At Astronomer, we value diversity. We are an equal opportunity employer: we do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. Astronomer is a remote-first company.