From Qt Wiki

Gestures and Scroll

Unsolved issues

Qt does not reflect the native system behavior with respect to gestures on scrollable views (widget classes inheriting QAbstractItemView, and the corresponding QML types) very well.

Typically one expects that a single-finger panning gesture scrolls, as long as there is no selection. If there is a selection, the lowest common denominator is that the gesture should extend it (text edit classes). Tap-and-hold gestures should start a drag-and-drop operation (item views). In list views, tapping can select or deselect an item, whereas tap-and-hold can initiate a lasso mode in which it is possible to select a range of items.

However, on most touch platforms there is normally a handle for dragging the cursor after text editing has begun, or a pair of handles whenever there is a selection. This has been implemented in Qt Quick Controls, but not in Widgets, so it would be nice to get it working the same way in widgets too. That would be more precise than simply extending the text selection with one finger, or tapping to set the cursor position.

In widgets, the pan recognizer is currently hard-coded to use two touch points. For touchscreens, it should be changed to one; but that can't be done as long as single-finger panning is reserved for selecting text.

When using a touchscreen, the selection in widgets is driven by mouse events synthesized from touch, either by the system (on Windows) or by Qt itself (on other platforms). The same touch events drive QGestureManager.

Windows Specifics

Windows provides native recognizers for gestures, for example a pan recognizer that works with one touch point, as expected. The downside is that it suppresses touch events (and thus the mouse events synthesized from them) and requires native child widgets to be used. In Qt 4, the native pan recognizer was used for pan scrolling in QTextEdit and QPlainTextEdit, enforcing native widgets for them.

So Windows should probably start sending QNativeGestureEvents at some point; Qt Quick will be able to make use of them eventually.

Native behaviour on Windows:

  • Notepad and other 'old' applications: a horizontal single-finger pan starts/extends the selection (and can be continued vertically); pans in other directions scroll.
  • MS Office 2013: Shows selection handles on tap.

iOS Specifics

iOS has a UIScrollView class which can emit native scroll events with the correct physics. These should be mapped to Qt scroll events (QScrollEvent).

There may be a possibility of using native text handles too, so maybe the handles could be "handled" below the QPA interface; but it might be difficult to take the same approach on other platforms.

macOS Specifics

The OS can recognize all the common 2-finger gestures. The reason that panning takes 2 fingers on the trackpad is that the single finger is reserved for mouse cursor movement, clicking and dragging. A trackpad cannot behave the same as a touchscreen, and we should not consider the trackpad as a direct precedent for how touch should work on touchscreens.

We are moving towards using QNativeGestureEvent for the pinch gesture in Qt Quick, but the pan/flick gesture is already being converted to QWheelEvent for compatibility with the widgets that expect wheel events. So direct handling of touch to do flicking and panning is not relevant on macOS, unless we begin to support touchscreens. But macOS does not have multi-touch touchscreen support anyway, AFAIK.

Linux specifics

Touchscreens are commonplace, and we should try to have feature parity with the other touch platforms.