
Data Engineer

Salary undisclosed

About us


Headquartered in Singapore, SATS Ltd. is one of the world’s largest providers of air cargo handling services and Asia’s leading airline caterer. SATS Gateway Services provides airfreight and ground handling services including passenger services, ramp and baggage handling, aviation security services, aircraft cleaning and aviation laundry. SATS Food Solutions serves airlines and institutions, and operates central kitchens with large-scale food production and distribution capabilities for a wide range of cuisines.

SATS is present in the Asia-Pacific, the Americas, Europe, the Middle East and Africa, powering an interconnected world of trade, travel and taste. Following the acquisition of Worldwide Flight Services (WFS) in 2023, the combined SATS and WFS network operates over 215 stations in 27 countries. These cover trade routes responsible for more than 50% of global air cargo volume. SATS has been listed on the Singapore Exchange since May 2000. For more information, please visit www.sats.com.sg.


Key Responsibilities


Objectives of this role

Work with teams from concept to operations, providing technical subject-matter expertise for the successful implementation of enterprise data solutions using modern data technologies. This individual will be responsible for the planning, execution, and delivery of data initiatives, and will work with the team to expand and optimise data pipelines and architecture. This is a hands-on development role focused primarily on Microsoft Azure data engineering and Databricks, with data integration using Python and Java.

  • Develop and maintain scalable data pipelines to ingest, process, and store data from various sources into the operational and analytical data platforms
  • Optimise data processing and storage infrastructure for mission-critical, high-volume, near-real-time & batch data pipelines
  • Implement data quality checks, monitoring, and alerting to ensure data accuracy and availability
  • Troubleshoot and resolve data pipeline issues in a timely manner to minimise impact on business operations
  • Work collaboratively with relevant teams to define functional and technical requirements
  • Document technical specifications, processes, and workflows for data pipelines and related systems
  • Manage stakeholder expectations and ensure clear communication

#LI-ST2


Key Requirements


Required skills and qualifications

  • 3 or more years of data engineering experience, with proficiency in Python, Java, and SQL
  • Familiar with cloud computing platforms, data engineering tools and services
  • Experience in data lake and data warehouse pipelines
  • Familiar with structured and unstructured data, database management and transformation methodologies
  • Familiar with technical integrations using microservices, API, message queue, stream processing, etc.
  • Exposure to CI/CD pipeline, Azure DevOps or GitHub
  • Strong communication skills, with the ability to explain technical concepts to non-technical stakeholders

Preferred skills and qualifications

  • Tertiary qualifications in computer science, information technology, engineering, or related discipline
  • Certifications on cloud technology and data engineering



Job Reference: SATS01757