AWS Data Architect with Databricks
Remote job
Hello, let’s meet!
We are Xebia - a place where experts grow. For nearly two decades now, we've been developing digital solutions for clients from many industries and places across the globe. Among the brands we’ve worked with are UPS, McLaren, Aviva, Deloitte, and many, many more.
We're passionate about Cloud-based solutions. So much so that we have partnerships with three of the largest Cloud providers in the business – Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP). We even became the first AWS Premier Consulting Partner in Poland.
Formerly we were known as PGS Software. In 2021, we joined Xebia Group – a family of interlinked companies driven by the desire to make a difference in the world of technology.
Xebia stands for innovation, talented team members, and technological excellence. Xebia means worldwide recognition and thought leadership. This regularly provides us with the opportunity to work on global, innovative projects.
Our mission can be captured in one word: Authority. We want to be recognized as the authority in our field of expertise.
What makes us stand out? It's the little details, like our attitude, dedication to knowledge, and belief in people's potential – with an emphasis on every team member's development. Obviously, these things are not easy to present on paper – so make sure to visit us and see them with your own eyes!
Now, we've talked a lot about ourselves – but we'd love to hear more about you.
Send us your resume to start the conversation and join #Xebia.
You will be:
validating our proposed Data Platform Architecture (ML Platform, DWH), which includes Databricks usage and integration with external Airflow (AWS MWAA) and other AWS-native services,
helping develop governance policies around target organisation and usage patterns (workspace organisation, IAM including programmatic access to S3 buckets, deeper understanding of data cataloguing with Unity Catalog or similar),
helping define granular FinOps practices on top of the above-mentioned structure,
defining architectural best practices for Databricks and data platforms in general, together with our DF team,
providing best practices and directional thinking around workspace and infrastructure creation and isolation guidelines.
Requirements
Your profile:
ready to start immediately,
proven experience building cloud-native, data-intensive applications; Amazon Web Services experience is a must, and GCP experience is a plus,
ability to design and implement scalable, efficient, and secure data architectures, including data lakes, data warehouses, and data marts,
experience with data mesh, data fabric, and other data architecture methodologies,
proficiency in defining and enforcing data architecture principles, standards, and best practices,
familiarity with the modern cloud-native data stack,
hands-on experience building and maintaining Spark applications,
a fundamental understanding of file formats such as Parquet, and of Delta Lake and other open table formats (OTFs),
strong written and verbal English communication skills and proficiency in communicating with non-technical stakeholders.
Working from Poland and a permit to work in Poland are required.
Recruitment Process:
CV review – HR call – Interview – Client Interview – Decision
Job Profile
Restrictions: a work permit for Poland is required.
Benefits/Perks: innovative projects, professional development, remote work.
Experience: 5 years.