This page describes an older version of Sailfish OS
Touch gestures are the backbone of any touch screen experience. Harmonized use of common gestures enhances the efficiency and ease of use of an interface, making it much easier to get up to speed with a new app. Sailfish OS divides touch gestures into two groups: screen edge swipes and content interactions.
Screen edge swipes
Use these gestures to navigate on the OS level. As the name implies, an edge swipe gesture is related to the screen edge: it starts from outside the screen edge and moves towards the center. At any time during the gesture, the user can reverse the movement to cancel it. This makes it possible to peek into Home or Events without really leaving the current app, creating a liberating user experience. Edge swipes fit OS-level control so well because the user can locate their starting points blindly: all four are right next to the physical device edges, which have a very strong tactile feel. Here's how edge swipes work in various situations.
Content interactions
These common gestures are used to interact on the application level. Moving content around and touching to select are the bread and butter of our daily tasks. That's why application page navigation is built around the former. The following images show how they're used across Sailfish OS core applications.
In addition to moving between application pages, the same gesture can be used to accept or cancel a dialog. See Navigation architecture for more about dialogs.
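As a rough illustration, a Sailfish app implements this with Silica's Dialog component: the forward page swipe triggers its accepted() signal, while swiping back rejects it. This is a minimal sketch; the identifiers saveDialog and nameField and the "Save" label are illustrative, not part of any real app.

```qml
import QtQuick 2.0
import Sailfish.Silica 1.0

Dialog {
    id: saveDialog

    // Swiping the page forward (or touching the header hint)
    // accepts the dialog; swiping back cancels it.
    onAccepted: console.log("accepted")
    onRejected: console.log("cancelled")

    Column {
        width: parent.width

        DialogHeader {
            // Shown as the forward-swipe hint at the top of the page
            acceptText: "Save"
        }

        TextField {
            id: nameField
            width: parent.width
            placeholderText: "Name"
        }
    }
}
```

Such a dialog would typically be opened with pageStack.push() from another page, so it participates in the same swipe-based page navigation described above.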
Double touch is also used for waking up the display when the device is idle. This gives the user the freedom to always interact with the display, without needing physical buttons.