FreshRemote.Work

Senior Data Engineer

Remote

This is a remote position.

At Softgic we work with the coolest: those who build, those who love what they do, and those who bring 100% attitude, because that's our #Cooltura. Join our purpose of making life easier with technology and be part of our team as a Data Engineer.

Compensation:
USD 20 - 28/hour.

Location:
Remote (for residents of Mexico, Guatemala, Colombia, Peru, Chile, Argentina, Paraguay, Brazil, Honduras, Jamaica, Dominican Republic, Belize, United States, Panama, Canada, Spain, South Africa, Kenya, India, and the Philippines).

Mission of Softgic:
At Softgic S.A.S. we work for the digital and cognitive transformation of our clients. Because quality is an essential factor for us, we incorporate the following principles into our policy:
  • Deliver quality products and services.
  • Achieve the satisfaction of our internal and external clients.
  • Encourage in our team the importance of training to grow professionally and personally through development plans.
  • Comply with the applicable legal and regulatory requirements.
  • Promote continuous improvement of the quality management system.
What makes you a strong candidate:
  • You are an expert in data architecture.
  • You are proficient in Azure Data Lake, Databricks, ELT (Extract, load, transform), and ETL (Extract, transform, load).
  • English: Native or fully fluent.
Responsibilities and more:
This vacancy is 100% remote for residents of: Colombia, Guatemala, Mexico, Peru, Chile, Belize, United States, Canada, Spain, Dominican Republic, Jamaica, Honduras, Brazil, Paraguay, Argentina, South Africa, Kenya, India, Philippines.

Job Responsibilities:
  • Design, develop, and maintain scalable data architectures using SQL Server, Azure SQL Database, and Snowflake on Azure.
  • Implement and manage data pipelines using Azure Data Factory, supporting ETL and ELT processes.
  • Work with SQL Change Data Capture (CDC) along with Debezium to enable real-time and incremental data processing.
  • Work with streaming technologies such as Kafka and Azure Event Hubs to deliver near-real-time analytics and reporting.
  • Manage Azure Data Lake to store and process structured and unstructured data efficiently.
  • Design and optimize Data Vault and Star Schema models for data warehousing solutions.
  • Develop and maintain ETL/ELT workflows using Python and SQL-based tools.
  • Leverage Databricks for big data processing, machine learning, and advanced analytics.
  • Ensure data quality, governance, and security across multiple data environments.
  • Build and maintain analytical reports using Sigma.
  • Collaborate with business stakeholders and data analysts to ensure data solutions align with business needs.
  • Monitor and troubleshoot data pipelines to ensure reliability, accuracy, and efficiency.
  • Support disaster recovery planning and high-availability data strategies.
  • Stay up to date with emerging data engineering technologies and best practices.


Requirements

Abilities:
  • 5-7 years of experience as a data architect or senior-level data engineer.
  • Expertise in SQL Server (SSMS, T-SQL, SSIS, SSRS, SSAS) and Azure SQL Database.
  • Strong experience in data modeling, including Data Vault and Star Schema methodologies.
  • Proficiency in ETL/ELT development and data pipeline management.
  • Hands-on experience with Snowflake on Azure and Databricks for big data processing.
  • Experience working with streaming technologies (e.g., Kafka, Flink, Event Hubs).
  • Strong analytical and problem-solving skills with a focus on data integrity and scalability.
  • Knowledge of Python for data transformation, automation, and analytics is a bonus.

Requirements:
  • Ability to sit or stand for extended periods of time as required.
  • Ability to work in a fast-paced, deadline-driven environment with minimal supervision.

Benefits

  • We're certified as a Great Place to Work.
  • Opportunities for advancement and growth.
  • Paid time off.
  • Formal education and certifications support.
  • Benefits with partner companies.
  • Referral program.
  • Flexible working hours.



Job Profile

Restrictions

Position is remote for specific countries only

Benefits/Perks

  • Certified as a Great Place to Work
  • Flexible working hours
  • Opportunities for advancement and professional growth
  • Paid time off
  • Referral program

Tasks
  • Collaborate with stakeholders
  • Design and maintain data architectures
  • Develop workflows
  • Ensure data quality
  • Implement data pipelines
  • Manage data lake
  • Monitor data pipelines
  • Optimize data models
Skills

Analytical skills, Analytics, Automation, Azure, Azure Data Factory, Azure Data Lake, Azure Event Hubs, Azure SQL Database, Big Data, Continuous Improvement, Data Architecture, Databricks, Data Governance, Data Modeling, Data Quality, Data Vault, Data Warehousing, Debezium, ELT, ETL, Kafka, Problem-solving, Python, Reporting, Security, Snowflake, SQL, SQL Change Data Capture, SQL Server, SSAS, SSIS, SSMS, SSRS, Star Schema, T-SQL

Experience

5-7 years

Education

Design Engineering