Devices such as Windows 8 touch-enabled laptops support both mouse and touch input at the same time. As a control developer, you have to take this into consideration when you develop your own controls:
With the introduction of touch-enabled Windows 8 devices, touch has become part of the expected desktop experience. In the past, UI5 statically detected whether the running environment supports touch events and then assumed that only touch (and not mouse) events need to be supported. This assumption no longer holds with the emergence of touch-enabled Windows 8 devices: the fact that touch events are supported doesn't mean that the user can't use an input device other than touch. Therefore, "supports touch" no longer equals "doesn't need mouse support". For these reasons, we don't switch between touch and mouse anymore - we now just support them both!
The following figure shows how this is implemented:
A desktop control is defined as a control that listens to mouse events, whereas a mobile control listens to touch events. To ensure that all events can be received, simulated touch events are created from mouse events, and simulated mouse events from touch events. This way, the UI Area, which acts as an event delegate, always receives the correct events. In detail:
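The bidirectional simulation described above can be sketched in plain, UI5-independent JavaScript. The mapping, the dispatcher, and the `_sapSimulated` flag below are illustrative assumptions, not UI5's internal API; they only demonstrate the idea that every native event is delivered together with a simulated counterpart, so controls listening to either kind are reached.

```javascript
// Illustrative mapping: each native event type produces a simulated
// counterpart of the other input family (names are assumptions).
const SIMULATION_MAP = {
  mousedown: "touchstart", // mouse event -> simulated touch event
  mouseup: "touchend",
  touchstart: "mousedown", // touch event -> simulated mouse event
  touchend: "mouseup"
};

function dispatchWithSimulation(handlers, event) {
  // Deliver the original event first ...
  (handlers[event.type] || []).forEach(fn => fn(event));
  // ... then a simulated counterpart, flagged so handlers can tell.
  const simType = SIMULATION_MAP[event.type];
  if (simType) {
    const simulated = { ...event, type: simType, _sapSimulated: true };
    (handlers[simType] || []).forEach(fn => fn(simulated));
  }
}

// Usage: a "desktop" handler set (mouse only) still sees a tap.
const seen = [];
const handlers = { mousedown: [e => seen.push([e.type, !!e._sapSimulated])] };
dispatchWithSimulation(handlers, { type: "touchstart" });
// seen now contains a flagged, simulated mousedown
```

The flag on the simulated event is what later allows duplicate handling to be detected and suppressed.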
Touch interfaces emulate mouse and click events because they need to work with applications that were previously written for mouse input only. This means that for a single tap on a touch interface, the following events are fired in the given order:
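This emulation can be made concrete with a small, browser-independent sketch. The sequence in `TAP_SEQUENCE` is an assumed, typical emulated order for a tap (the exact sequence varies by browser); replaying it against a mouse-only handler set shows why legacy applications keep working on touch devices.

```javascript
// Assumed typical event sequence emulated for a single tap
// (illustrative; real browsers may differ in detail).
const TAP_SEQUENCE = [
  "touchstart", "touchend", "mousemove", "mousedown", "mouseup", "click"
];

// Replay the tap sequence against a set of handlers and return
// the order in which events were fired.
function replayTap(handlers) {
  const fired = [];
  for (const type of TAP_SEQUENCE) {
    fired.push(type);
    (handlers[type] || []).forEach(fn => fn({ type }));
  }
  return fired;
}

// A mouse-only application still receives its click on a tap:
let clicks = 0;
const fired = replayTap({ click: [() => clicks++] });
// clicks is 1; the emulated sequence ends with "click"
```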
If we support mouse and touch together, the event handler is called twice for a single tap, because the browser fires both touchstart and mousedown. Fortunately, we have found a way to set a flag on the mouse events emulated from touch interfaces and to suppress those events when they reach the event handler.
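A minimal sketch of this suppression, assuming a flag is carried over onto the emulated mouse events (the flag name `_sapuiEmulated` and the handler names are illustrative, not UI5's internals): the touch handler marks the gesture, and mouse handlers skip events carrying the mark, so a single tap invokes the press logic exactly once.

```javascript
// Sketch of flag-based suppression of emulated mouse events.
function createPressHandler(onPress) {
  return {
    ontouchstart(event) {
      // Mark the gesture; emulated mouse events derived from this
      // touch carry the same flag.
      event._sapuiEmulated = true;
      onPress(event);
    },
    onmousedown(event) {
      if (event._sapuiEmulated) {
        return; // emulated from touch: already handled in ontouchstart
      }
      onPress(event); // genuine mouse input
    }
  };
}

// A single tap: touchstart fires first, then the emulated mousedown
// carrying the flag. onPress runs only once.
let presses = 0;
const handler = createPressHandler(() => presses++);
const tapEvent = { type: "touchstart" };
handler.ontouchstart(tapEvent);
handler.onmousedown({ ...tapEvent, type: "mousedown" });
// presses is 1, not 2
```

A genuine mousedown (without the flag) still reaches `onPress`, so mouse-only users are unaffected.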