The problem is that some components handle both mouse and touch while others handle only one or the other. This discussion is about which direction we want to take in the future and how to unify "pointer" input handling.
- MouseArea and Flickable handle only mouse
- MultiPointTouchArea handles touch but "eats" mouse events
This is problematic because several MouseAreas/Flickables next to each other don't work (on touch devices). Synthesis of mouse events from touch is generally fast enough, but far from perfect.
It seems like there are 2.5 approaches…
- Make everything handle both
- What about pointer/touch-specific properties (touch point radius, right mouse button, wheel)?
- Make everything handle only one of them
- App developers have to add two areas to support both touch and mouse
- Have a simple area that handles mouse and touch events as compromise
What about handling both simultaneously?
- Do we even want to support that?
- Are there problems?
- Multiple mice?
It became clear that we face two independent issues:
- 1. Several MouseAreas when using touch (only one will get events)
- Piano app example: with several MouseAreas you cannot press two keys at the same time
- 2. With setFiltersChildMouseEvents we only go "down" to parent elements, but never back
- Photo app example: a Flickable with a PinchArea; put down one finger to move the image, add a second finger to zoom, lift one finger -> moving no longer works…
While we did not reach a clear conclusion, we came up with two paths to pursue.
Short term it makes sense to make MouseArea and Flickable touch-aware and to fix
MultiPointTouchArea so that it does not simply swallow mouse clicks.
Longer term we might want mouse events synthesized for several touch points, since that would also make custom mouse-handling elements work.
We might need to keep track of not just the last mouse grabber but maintain a "stack" of all grabbers; then we can pop back to an area further up.
We might even need a vector of such "stacks" of event receivers, one per touch point.
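The grabber-stack idea can be sketched as follows (plain C++, hypothetical names, not Qt code): one stack of grabbers per touch point, so that when the current grabber gives up the grab we pop back to the item underneath, e.g. from PinchArea back to Flickable in the photo-app scenario.

```cpp
#include <cassert>
#include <map>
#include <string>
#include <vector>

// Hypothetical sketch of the "stack of grabbers" idea: instead of
// remembering only the last grabber, keep a stack per touch point,
// so that when the current grabber releases the grab, the previous
// one takes over again.
class GrabberStacks {
public:
    void grab(int pointId, const std::string& item) {
        stacks_[pointId].push_back(item);
    }
    // The current grabber gives up the grab; pop back to the previous one.
    void ungrab(int pointId) {
        auto& s = stacks_[pointId];
        if (!s.empty())
            s.pop_back();
    }
    std::string current(int pointId) const {
        auto it = stacks_.find(pointId);
        if (it == stacks_.end() || it->second.empty())
            return "";
        return it->second.back();
    }
private:
    // One stack per touch point: the "vector of stacks" from the notes.
    std::map<int, std::vector<std::string>> stacks_;
};
```

In the photo-app scenario: the first finger is grabbed by the Flickable; when the second finger arrives, the PinchArea grabs on top of it; when a finger lifts, popping the stack returns the grab to the Flickable, so moving works again instead of going dead.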