Next-generation pointing device and gesture support
Revision as of 08:48, 13 July 2015

Shawn is having a session on touch and pointer events in Qt Quick

Shawn has been working on making MouseArea handle touch events. It's a tricky problem, and patches always end up breaking some autotest.

QTouchEvent and QMouseEvent are parallel classes, which end up having parallel delivery paths.

A lot of the time, you don't really care whether you have a touch or a mouse event. We don't support tablets.

Velocity synthesis is done in Flickable, but maybe this should be moved elsewhere.

QPointingEvent: a proposed new event class.

Idea: instead of MouseArea being an item, have mouse handling as an attached property. MouseArea just keeps increasing in size.

Fine granularity with attached handlers.

But you cannot have multiple attached handlers.

This means excessive repetition of the attached handler name, e.g.:

   Pointer.enabled:...
   Pointer.onPressed
   Pointer.onReleased

The point here is not to have a visual item. This will ease the event propagation.

Shawn proposes a possible new input event hierarchy.

Shawn has some slides he will share.

A lot of talk about whether we should have a new event type propagated through the object hierarchy, and a lot of discussion about the semantics.

Prototype: Touch attached property. Seems to be working, but not perfect. Implementing gesture recognition (e.g. pinch) may become difficult.

Can we implement Flickable purely in QML? Currently we can't, but we should get to that stage in the future. (Not that we should do it, but as an exercise/example.)

Experiences with doing gesture recognition in C++: not very easy or maintainable. Perhaps push some of the gesture recognition to lower levels in QPA to get better consistency between platforms. On Apple platforms there is no choice anyway; on Embedded, however, we do all recognition ourselves. (QNativeGesture is only for OS X currently, not anything else.)

It's hard to support both native gestures and the ones recognized "manually" in apps, and there is no easy way to solve this.

Shawn is experimenting with the Box2D QML bindings, which could be an inspiration for us:

   Rectangle { Body { Box { } FrictionJoint { } } }

Sidenote: real physics (Box2D) could be used for Flickable and friends. Concerns about this being overkill.

A QQuickPointerEvent prototype, inspired by the W3C pointer event, is presented. Dispatch these events to the items; if an item rejects the pointer event, it gets a touch event, and if that is rejected too, it gets the synthesized mouse events. So like today, but with the new QQuickPointerEvent added on top.

This is a transitional path to Qt 6: there the touch/mouse events can go away, but what we implement now in Qt 5 Quick will still work; we just drop the legacy mouse/touch path.

Why not make it an opt-in solution, either the old events or the new ones? But then we cannot mix code using both. (A different import? But then it's Qt Quick 3, which may be separate from Qt 6, but still new.)

Notes from Tor Arne presented: alternatives to attached handlers. (PointerArea, grouped properties, ...)

Hover. We are inconsistent: hover cannot be rejected. Trying to fix this. Why is it like this? Not sure. Example: rects in a Flickable rejecting onPressed (by setting accepted to false) with preventStealing set to true. This does not work for hover, since we cannot reject by setting accepted to false. It might have been fixed in a series of patches for hover event propagation in Quick earlier this year; not sure if we still have a problem or not.

How do we feel about the idea of attached handlers conceptually? That is the main question here. Waiting for feedback: "solves problems", "like it", "like it, but a lot of work, not sure if it's reasonable". Based on the prototype it does not seem that complicated.

What about multiple pointers (mouse and touch at the same time)? Can we solve it now?

An event-based system is probably still desirable, with higher-level constructs built on top of it. Should generic gesture recognition be possible from QML? We would like that, but people should use C++.