Senior Data Engineer (Databricks)
Lithuania - Remote
Softeta is looking for an experienced Data Engineer to join its growing team. As a Data Engineer, you will have the opportunity to work on cutting-edge data projects and contribute to the development of innovative data solutions.
In this role, you will work with big data, data migration, and ELT pipeline creation.
Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes;
- Perform data modeling and data cleansing;
- Automate data processing workflows using tools such as Airflow or other workflow management tools;
- Optimize database performance, including designing and implementing data structures and using indexes appropriately;
- Implement data quality and data governance processes;
- Act as a data advocate and help unlock business value through data.
Technologies - MS SQL, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Airflow, Python.
Requirements
- 4+ years of experience as a Data Engineer;
- Experience with Azure (certifications are a plus);
- Experience with Databricks, Azure Data Lake, Data Factory, and Apache Airflow;
- Experience with CI/CD or infrastructure as code;
- Knowledge of the Medallion (multi-hop) architecture;
- Experience developing and administering ETL processes in a cloud environment (Azure, AWS, or GCP);
- Strong programming skills in Python and SQL;
- Strong problem-solving and analytical skills.
Benefits
- Diverse and technically challenging projects;
- Remote work model;
- Flexible schedule and an Agile/Scrum environment;
- Your choice of technical equipment;
- Salary range: from 4000 EUR to 6000 EUR before taxes, or from 30 EUR/h to 40 EUR/h for contractors.