Architecture

Overview

Sailfish OS is a mobile operating system based on GNU/Linux.

The Sailfish OS architecture is primarily made up of three areas, namely the hardware adaptation layer, the middleware layer and the application/UI layer:

  • Applications
  • Lipstick (Homescreen and UI management)
  • Qt application framework
  • Other middleware libraries and services
  • Hardware adaptation components
  • Kernel

Hardware Adaptation layer

In the hardware adaptation layer, Sailfish OS uses a Linux kernel with hardware-specific additions.

Sailfish OS can run on top of standard Linux hardware with native drivers, or it can utilize drivers for Android-compatible hardware via libhybris, which bridges Linux libraries based on the GNU C library with those based on Android's Bionic C library. Building an adaptation for Android-compatible hardware is described in the HADK documentation.

Middleware layer

The middleware layer contains the core system components used to build services on top of the hardware adaptation layer.

The Qt C++ application development framework provides the primary development libraries. Aside from the main Qt modules, Sailfish OS uses add-on modules such as Qt Maps, Qt Sensors and Qt Contacts. In addition, all Sailfish applications are written with QML, a Qt technology for building user interfaces on top of C++ application code.
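
As a small illustration of how such an add-on module is consumed, the sketch below uses the Qt Sensors QML API to read accelerometer values declaratively. It assumes the QtSensors QML module is installed on the device; the exact import version depends on the Qt release in use.

import QtQuick 2.0
import QtSensors 5.2

// Minimal sketch: reading a hardware sensor through the Qt Sensors QML API.
// On a device the values come from the sensor middleware and the hardware
// adaptation layer (see the Sensors call chain below).
Item {
    Accelerometer {
        active: true
        onReadingChanged: {
            // reading.x/y/z are acceleration values in m/s^2
            console.log("acceleration:", reading.x, reading.y, reading.z)
        }
    }
}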

Sailfish OS also includes a wide range of middleware libraries and frameworks that serve the application layer; for the location of their sources, refer to Sailfish OS Source. They are written in C/C++, and libraries that are accessed directly by the UI layer provide QML modules so that QML-based applications can use them without additional QML/C++ bindings.
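
As one example of such a module, the hedged sketch below assumes the Nemo.Notifications QML plugin (nemo-qml-plugin-notifications) is available on the device; it publishes a system notification directly from QML without any application-side C++ binding code.

import QtQuick 2.0
import Nemo.Notifications 1.0

// Minimal sketch: using a middleware service straight from QML.
Item {
    Notification {
        id: notification
        summary: "Example"
        body: "Published through the notification middleware"
    }

    Component.onCompleted: notification.publish()
}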

Application and UI layer

Sailfish OS applications are written in a combination of C++ and QML/Qt Quick. QML is a Qt technology primarily used to declaratively assemble application user interfaces and connect them to C++ backend code, and Qt Quick is a core part of the QML framework for UI creation. A Sailfish OS app typically defines its UI in QML and, where necessary, includes C++ code for functionality that is not available from the QML layer.
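
The sketch below shows the typical shape of such an application: the UI is declared in QML using the Sailfish Silica components (assuming the Sailfish.Silica module shipped on Sailfish OS devices), and any heavier logic would be exposed from C++, for example via qmlRegisterType() or a context property.

import QtQuick 2.0
import Sailfish.Silica 1.0

// Minimal sketch of a Sailfish application UI declared in QML.
ApplicationWindow {
    initialPage: Component {
        Page {
            PageHeader { title: "Example" }

            Label {
                anchors.centerIn: parent
                // In a real application this value would typically come from
                // a C++ backend object exposed to QML.
                text: "Hello from QML"
            }
        }
    }
}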

Application launching and lifetime are controlled by Lipstick, which provides the essential user-session UI, including the application launcher and other main screens, and also acts as the window manager.
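
As a hedged illustration of what this means for an application, the sketch below uses the standard Qt.application object to keep periodic work running only while the app is the active, foreground window; the cover behaviour itself is Sailfish-specific, but the state reporting is plain Qt.

import QtQuick 2.0

// Minimal sketch: pausing periodic work while the app is in the background.
Item {
    Timer {
        interval: 1000
        repeat: true
        // Qt reports Qt.ApplicationActive only while the compositor shows
        // the app as the foreground window.
        running: Qt.application.state === Qt.ApplicationActive
        onTriggered: console.log("doing periodic work")
    }
}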

Call chains

Here are a few call/usage chains of components in Sailfish OS, from the UX/middleware level down to the hardware adaptation driver. The parts that mention droid/libhybris/binder/Android HAL depend somewhat on the Android BSP driver version of the device; for native adaptations the chain looks different.

It should be noted that Sailfish OS does not come with a kernel of its own; the kernel is provided by the hardware adaptation layer. Currently the lowest supported kernel version is 3.4 (which needs some patches), and kernel 4.4 or newer is recommended. A configuration check script is used to verify that the kernel provides all the required functionality.

Notes, such as the required Android BSP version, are given in parentheses after each chain.

  • Audio: pulseaudio <> pulseaudio-modules-droid <> libhybris <> audioflingerglue <> Android BSP libbinder <> miniaf <> Android BSP HAL: audio
  • Bluetooth: Bluez5 <> kernel VHCI <> bluebinder <> libgbinder <> Android BSP HAL: android.hardware.bluetooth (Android BSP >= 8)
  • Camera/Multimedia: gst-droid <> libhybris <> droidmedia <> Android BSP libbinder <> minimedia/minisf <> Android BSP HAL
  • Display: mce <> mce-plugins-libhybris <> libhybris <> Android BSP HAL: gralloc or hwcomposer
  • Fingerprint: sailfish-fpd <> sailfish-fpd-slave <> libgbinder <> Android BSP HAL: android.hardware.fingerprint (Android BSP >= 8)
  • Graphics: qtbase <> qt5-qpa-hwcomposer-plugin <> libhybris-compat-library: libhwc2_compat_layer <> libgbinder <> Android BSP HAL: android.hardware.graphics.composer (Android BSP >= 8)
  • Graphics: qtbase <> qt5-qpa-hwcomposer-plugin <> libhybris <> Android BSP HAL: hwcomposer (Android BSP <= 7)
  • LED: mce <> kernel
  • LED: mce <> mce-plugins-libhybris <> libhybris <> Android BSP HAL: lights
  • Location (GPS): geoclue <> geoclue-providers-hybris <> libgbinder <> Android BSP HAL: android.hardware.gnss (Android BSP >= 8)
  • Location (GPS): geoclue <> geoclue-providers-hybris <> libhybris <> Android BSP HAL: gps (Android BSP <= 7)
  • Modem: oFono (ril driver) <> libgrilio <> ofono-ril-binder-plugin <> libgbinder-radio <> libgbinder <> Android BSP HAL: android.hardware.radio (Android BSP >= 8)
  • Modem: oFono (ril driver) <> libgrilio <> socket <> Android BSP: rild (Android BSP <= 7)
  • NFC: nfcd <> nfcd-binder-plugin <> libgbinder <> Android BSP HAL: android.hardware.nfc (Android BSP >= 8)
  • Sensors: sensorfw-qt5 <> sensorfw-qt5-hybris <> libgbinder <> Android BSP HAL: android.hardware.sensors (Android BSP >= 8)
  • Sensors: sensorfw-qt5 <> sensorfw-qt5-hybris <> libhybris <> Android BSP HAL: sensors (Android BSP <= 7)
  • Storage (eMMC & sdcard): udisks2 <> kernel
  • Touch: app <> lipstick <> qtbase <> evdev <> kernel (see the Touch section below)
  • USB: usb_moded <> kernel
  • MTP: buteo-mtp <> usb_moded <> kernel (see the MTP section below)
  • WiFi: connman (sailfish_wifi plugin) <> libgsupplicant <> wpa_supplicant <> kernel
  • Mobile data: connman (sailfish_ofono plugin) <> libgofono <> oFono
  • Volume keys: pulseaudio <> lipstick <> mce <> kernel
  • Power key: call-ui or alarm-ui <> mce <> kernel (short keypress)
  • Power key: systemd <> dsme <> kernel (5s+ keypress)

For more information on the areas covered by middleware libraries and services, see Core Areas and APIs.

MTP

usb-moded detects the USB connection based on a udev notification from the kernel, initiates the USB gadget configuration, and starts buteo-mtp. buteo-mtp then finalizes the gadget configuration and handles the data transfer.

Touch

The mce-tools package provides the evdev_trace command. Use the --show-readers option to figure out which device handles touch input (e.g. reports ABS_MT_TRACKING_ID). See the kernel documentation for more information about the multi-touch protocol.

$ /usr/sbin/evdev_trace --show-readers

Then start tracing your touch screen (event2 here is just an example):

$ /usr/sbin/evdev_trace -t /dev/input/event2

Qt handles evdev touch events via its evdev plugin (/usr/lib64/qt5/plugins/generic/libqevdevtouchplugin.so on a 64-bit device). The related Qt logging category is “qt.qpa.input”. QEvdevTouchScreenHandler auto-detects the touchscreen.
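
To observe the application end of the touch chain above, the hedged sketch below uses Qt Quick's MultiPointTouchArea to log the touch points that qtbase has decoded from the evdev device:

import QtQuick 2.0

// Minimal sketch: consuming the touch events that travel the
// app <> lipstick <> qtbase <> evdev <> kernel chain.
Item {
    MultiPointTouchArea {
        anchors.fill: parent
        touchPoints: [
            TouchPoint { id: firstPoint }
        ]
        onPressed: console.log("touch down at", firstPoint.x, firstPoint.y)
        onReleased: console.log("touch released")
    }
}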

Key Architectural Areas

An overview of some architectural areas and the APIs which expose the related functionality can be found in the page describing Core Areas and APIs. More in-depth documentation about key architectural areas follows:

  • Cellular Telephony Architecture talks in detail about the mobile phone functionality
  • Audio Architecture describes audio routing and sharing
  • Screen display and application compositing is described in the Graphical Architecture
  • Multimedia Architecture covers the camera and video subsystems
  • The Qt Framework explains which Qt components applications should use to access features
  • Android compatibility is enabled by the Android Emulation Framework