Some window systems provide support for input devices that react to the user touching the screen and moving one or more fingers while they remain in contact with it. These input devices are known as touchscreens, and Emacs reports the events they generate as touchscreen events.
Most individual events generated by a touchscreen are meaningful only as part of a larger sequence of events: for instance, the simple operation of tapping the touchscreen involves the user placing and then raising a finger, while swiping the display to scroll it involves placing a finger, moving it upwards or downwards repeatedly, and then raising it.
While a simplistic model consisting of a single finger is adequate for taps and scrolling, more complicated gestures require keeping track of multiple fingers, where the position of each finger is represented by a touch point. For example, a “pinch to zoom” gesture might consist of the user placing two fingers and moving them individually in opposite directions, where the distance between the positions of their individual points determines the amount by which to zoom the display, and the midpoint of an imaginary line between those positions determines where to pan the display after zooming.
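To make that geometry concrete, here is a sketch of how a Lisp program might derive both values from the positions of two fingers. The function name and the use of bare (x . y) pairs for positions are illustrative assumptions for this example, not part of Emacs:

    (defun my-pinch-parameters (p1 p2 initial-distance)
      "Return a cons (SCALE . CENTER) for a pinch between P1 and P2."
      ;; P1 and P2 are the current (X . Y) positions of the two
      ;; fingers; INITIAL-DISTANCE is the distance between them when
      ;; the gesture began.
      (let* ((dx (- (car p2) (car p1)))
             (dy (- (cdr p2) (cdr p1)))
             ;; Distance between the two fingers right now.
             (distance (sqrt (+ (* dx dx) (* dy dy))))
             ;; Midpoint of the imaginary line joining them.
             (center (cons (/ (+ (car p1) (car p2)) 2.0)
                           (/ (+ (cdr p1) (cdr p2)) 2.0))))
        ;; SCALE greater than 1 means the fingers moved apart, so the
        ;; display should be zoomed in; less than 1 means zoomed out.
        (cons (/ distance (float initial-distance)) center)))

A program tracking such a gesture would record the distance between the fingers when the second finger is placed, then recompute the scale and pan center each time a finger moves.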
The low-level touchscreen events described below can be used to implement all the touch sequences described above. In those events, each point is represented by a cons of an arbitrary number identifying the point and a mouse position list (see Click Events) specifying the position of the finger when the event occurred.
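For instance, given one element of such a list, a program can take the point apart with car and cdr, and examine the position with the usual position-list accessors such as posn-x-y and posn-window. The following sketch, with a hypothetical function name, shows the idea:

    (defun my-describe-touch-point (point)
      "Return a string describing the touch point POINT."
      (let* ((id (car point))           ; number identifying the point
             (position (cdr point))     ; a mouse position list
             (xy (posn-x-y position)))  ; pixel coordinates as (X . Y)
        (format "point %s at %s,%s in %s"
                id (car xy) (cdr xy)
                (posn-window position))))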
(touchscreen-begin point)
This event is sent when point is created by the user pressing a finger against the touchscreen.
(touchscreen-update points)
This event is sent when a point on the touchscreen changes position. points is a list of touch points giving the up-to-date position of each touch point currently on the touchscreen.
(touchscreen-end point)
This event is sent when point is no longer present on the display, either because another program took the grab or because the user raised the finger from the touchscreen.
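To illustrate how these events arrive, the following sketch binds a single command to all three and echoes what happens to each finger. The handler name and the messages are assumptions for the example, and installing such bindings would shadow Emacs's own bindings for these events:

    (defun my-touchscreen-echo (event)
      "Echo a short description of the touchscreen EVENT."
      (interactive "e")
      (pcase event
        ;; A finger was placed on the screen.
        (`(touchscreen-begin ,point)
         (message "finger %s down" (car point)))
        ;; One or more fingers moved; POINTS lists every current point.
        (`(touchscreen-update ,points)
         (message "%d finger(s) moving" (length points)))
        ;; A finger was raised, or the touch sequence was taken away.
        (`(touchscreen-end ,point . ,_)
         (message "finger %s up" (car point)))))

    ;; Install the handler for each of the three low-level events.
    (dolist (event '(touchscreen-begin touchscreen-update touchscreen-end))
      (define-key global-map (vector event) #'my-touchscreen-echo))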