Senior Solution Architect, Data Engineering
US-CA-Bay Area-Remote
Build the future of the AI Data Cloud. Join the Snowflake team.
Enterprises are modernizing their data platforms and processes at a growing rate to meet the demands of their customers. Much of this journey requires not just technical expertise but also the ability to drive predictability and manage complexity. Snowflake’s Solution Innovation Team offers our customers a market-leading set of technical capabilities as well as best practices for modernization and implementation, grounded in experienced leadership. Our portfolio of modernization solutions spans data migrations, validation, application development, data sharing, and data science. Capabilities like Snowpark and Snowpipe Streaming, along with upcoming features from the recent Datavolo acquisition, extend the Snowflake Data Platform to data engineering, data science, and machine learning.
As a Senior Solution Architect on our team, you will be responsible for delivering exceptional outcomes for our teams and customers. You will understand and document product requirements, own the product roadmap, and drive product delivery.
This role will report to the Director of Data Engineering within the Solution Innovation Team (SIT) organization.
AS A SENIOR SOLUTIONS ARCHITECT AT SNOWFLAKE, YOU WILL:
Be a technical expert on all aspects of Snowflake
Provide informal leadership on optimizing migration delivery for SQL and/or Spark workloads
Be well versed in both leading and executing migrations of applications and data onto cloud platforms
Outline modern application and data architectures so that customers can build solutions and services on Snowflake
Run workshops and design sessions with stakeholders, consumers, and product teams
Create repeatable processes and documentation as a result of customer engagement
Develop Python and shell scripts for ETL workflows (see the illustrative sketch after this list)
Develop best practices, enablement, and knowledge transfer so that customers fully understand and are able to extend the capabilities of Snowflake on their own
Provide guidance on how to resolve customer-specific technical challenges
Work hands-on with customers to demonstrate and communicate implementation best practices on Snowflake technology
Maintain deep understanding of competitive and complementary technologies and vendors and how to position Snowflake in relation to them
Collaborate cross-functionally with Product Management, Engineering, Professional Services, Sales, and Marketing to continuously improve Snowflake’s products and marketing.
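To give a flavor of the hands-on ETL and Snowpark work this role involves, here is a minimal, illustrative Snowpark for Python sketch. The table names and connection parameters are placeholders for illustration only, not part of any actual customer engagement.

from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, upper

# Placeholder connection parameters; real values come from your Snowflake account configuration.
session = Session.builder.configs({
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}).create()

# Read a (hypothetical) raw table, apply a simple transformation, and write the result.
orders = session.table("RAW_ORDERS")
cleaned = (
    orders.filter(col("ORDER_STATUS").is_not_null())
          .with_column("ORDER_STATUS", upper(col("ORDER_STATUS")))
)
cleaned.write.mode("overwrite").save_as_table("CURATED_ORDERS")

session.close()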
OUR IDEAL SOLUTIONS ARCHITECT WILL HAVE:
Minimum of 10 years of experience working with customers in a pre-sales or post-sales technical role
Ability to communicate requirements for converting Java-, Scala-, and Spark-based back-end software modules to Snowpark
Ability to define requirements for the design and development of back-end big data frameworks built on Spark, including capabilities such as Spark-as-a-service
Ability to build strategies for a comprehensive big data platform for data science and engineering
Ability to develop and advise on frameworks for distributed computing, Apache Spark, PySpark, Python, HBase, REST-based APIs, artificial intelligence, and machine learning
Experience with Apache NiFi, Kafka, Informatica, dbt, AWS Glue, Azure Data Factory, or other ELT tools
Every Snowflake employee is expected to follow the company’s confidentiality and security standards for handling sensitive data. Snowflake employees must abide by the company’s data security plan as an essential part of their duties. It is every employee's duty to keep customer information secure and confidential.
Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake.
How do you want to make your impact?
For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
Experience: 10 years
Timezones: America/Anchorage, America/Chicago, America/Denver, America/Los_Angeles, America/New_York, Pacific/Honolulu (UTC-10 through UTC-5)