Pointing device

from Wikipedia, the free encyclopedia
Logitech optical mouse
Touchpad and trackpoint of an IBM notebook
Wii remote control

A pointing device is a type of continuous input device for interacting with a system. It is usually used to map the user's movement (pointing) onto a position indicator (cursor) within a graphical user interface (GUI). Over the course of the historical development of human-computer interaction, various pointing devices have been developed and, depending on the context of use and the state of technology, have established themselves to this day.

Historical development

The history of pointing devices is closely tied to the development of human-computer interaction (HCI). Until the first pointing device, Doug Engelbart's computer mouse, was patented in 1970, interaction with a computer was largely limited to punched cards and keyboards. It was only in the early 1990s, with the development of graphical user interfaces and the spread of personal computers (PCs), that the pointing device consolidated its importance in HCI. Driven by technical progress and new areas of application, a variety of pointing devices with different interaction techniques have since been developed.


Classification

Pointing devices can be classified into the following classes:

  • Spatial relationship between the pointing device and the output device
    • Direct: the locations of input and output coincide
    • Indirect: the pointing device and the cursor are spatially separated from each other
  • Type of mapping
    • Absolute: the point sensed by the pointing device corresponds exactly to the point (cursor) displayed by the system
    • Relative: the movement sensed by the pointing device is translated relative to the resolution of the output device
  • Type of pointing: finger-based or via a mediating input device
  • Degrees of freedom of the pointing device
    • Integrated into the system
    • 2-dimensional (x/y) interaction
    • 3-dimensional (x/y/z) interaction
  • Type of sensing
    • Isotonic (pointing through movement: the distance moved determines the cursor displacement)
    • Isometric (pointing through pressure: the degree of mechanical force determines the cursor movement)
Pointing device    Indirect/direct    Input device/finger-based    Absolute/relative
Trackball          Indirect           Input device                 Relative
Light pen          Direct             Input device                 Absolute
Graphics tablet    Indirect           Input device                 Absolute
Joystick           Indirect           Input device                 Relative
Computer mouse     Indirect           Input device                 Relative
Touch screen       Direct             Finger-based                 Absolute
Touchpad           Indirect           Input device                 Relative
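The classification table above can be encoded as data, which makes it easy to query devices by property. This is an illustrative sketch; the class and helper names are invented for the example:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PointingDevice:
    name: str
    direct: bool        # True: input and output locations coincide
    finger_based: bool  # True: operated with the finger, not a mediating device
    absolute: bool      # True: sensed point maps 1:1 to the cursor position

DEVICES = [
    PointingDevice("Trackball",       direct=False, finger_based=False, absolute=False),
    PointingDevice("Light pen",       direct=True,  finger_based=False, absolute=True),
    PointingDevice("Graphics tablet", direct=False, finger_based=False, absolute=True),
    PointingDevice("Joystick",        direct=False, finger_based=False, absolute=False),
    PointingDevice("Computer mouse",  direct=False, finger_based=False, absolute=False),
    PointingDevice("Touch screen",    direct=True,  finger_based=True,  absolute=True),
    PointingDevice("Touchpad",        direct=False, finger_based=False, absolute=False),
]

def devices_where(**criteria):
    """Return the names of all devices matching every given attribute value."""
    return [d.name for d in DEVICES
            if all(getattr(d, key) == value for key, value in criteria.items())]
```

For example, `devices_where(direct=True)` returns the two direct devices from the table, the light pen and the touch screen.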

Common pointing devices

Trackball using a billiard ball

The following is a chronological listing of the most common pointing devices.


Trackball

In 1950, the first trackball was used in the DATAR system developed by the Canadian military to transmit position information. Initially used only in the military sector, the trackball also found use in the commercial mainframe area in the 1960s. With the invention of the mouse, the trackball was relegated to a rather subordinate role among pointing devices. Its technical development paralleled that of the mouse: in early models the pointing movement was transmitted via mechanical pick-up rollers, whereas today's generations record this movement using optical sensors.

Light pen

Using a light pen on a computer terminal, 1969

The first light pen was developed at Lincoln Laboratory in 1955 and presented publicly in 1963; it served as a pointing device for CRT screens. At the tip of the light pen is a photodiode that detects the moment the electron beam strikes the tube's luminescent layer at that point. From the timing of this event, the system can calculate where the light pen is currently located. Because of this operating principle, the device is tied to systems with a tube screen; since CRTs have been almost completely displaced by LCD monitors, it is no longer in use.

Graphics tablet

Graphics tablet with stylus

The graphics tablet (also drawing tablet; in English "digitizer" or "pen tablet") is mainly used in image processing, digital design and digital art. In 1957, Tom Dimond's "Stylator" was the first such system to be published; it was developed to recognize handwritten user input. Later devices have been optimized to realistically simulate drawing on paper: in addition to the pen position, pressure intensity, pen inclination and pen rotation are transmitted to the system. Quality criteria of a graphics tablet are its resolution, the number of pressure levels and the size of the tablet.



Joystick

The joystick was one of the first pointing devices, but due to its limited field of application it gained importance mainly in gaming and simulation. In 1969, Sega first used a joystick as an interaction device in the "Missile" arcade game. In 1977, Atari released the Atari 2600, which brought the first joystick for home use. On this digital joystick, the deflection of the stick was registered by pressure-sensitive switching elements; the mechanics are still used today, only the sensors have changed over time. In analog joystick variants, the deflection angle of the stick is also included in the calculation of the cursor movement. Mini joysticks were and are mainly used in mobile contexts: the first internet-enabled cell phones often had small control sticks for navigating on the screen, and the TrackPoint, an analog version of the mini joystick, is still used in notebooks today.

Computer mouse

Mechanical mouse with scroll wheel

The computer mouse is the most widely used pointing device. Conceived by Doug Engelbart in the 1960s as an "X-Y position indicator for a display system", the mouse achieved its breakthrough in 1983 when Apple marketed it worldwide with the "Lisa" desktop system. Over the course of the development of human-computer interaction, the mouse has been continuously refined. The most important variants are:

  • Mechanical mouse (ball mouse): the first type to reach the PC market. The rotation of a rubber-coated ball on the underside of the mouse is transferred to pick-up rollers arranged at right angles, and the x/y roller rotation is translated into relative coordinates for moving the mouse pointer on the output device.
  • Optical mouse: an optical sensor images the surface, and a processor integrated in the mouse calculates the mouse's movement from the sequence of images; the system then applies this movement to the mouse pointer. In the first generation, light-emitting diodes were used to illuminate the surface on which the mouse was moved.
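The relative mapping described above, from sensed device motion to cursor motion, can be sketched as follows. The function name, the 96 PPI display assumption and the gain parameter are illustrative, not part of any real driver API:

```python
def mouse_counts_to_cursor(dx_counts, dy_counts, mouse_dpi, cd_gain=1.0):
    """Translate raw mouse motion counts into relative cursor movement.

    dx_counts, dy_counts: motion reported by the sensor, in counts
    mouse_dpi: sensor resolution in counts per inch
    cd_gain: control-display gain (ratio of cursor speed to device speed)
    Returns the cursor displacement in pixels, assuming a 96 PPI display.
    """
    DISPLAY_PPI = 96  # assumed display density for this sketch
    inches_x = dx_counts / mouse_dpi
    inches_y = dy_counts / mouse_dpi
    return (inches_x * DISPLAY_PPI * cd_gain,
            inches_y * DISPLAY_PPI * cd_gain)
```

Moving an 800 DPI mouse one inch to the right (800 counts) at unit gain would move the cursor 96 pixels; doubling the gain doubles the displacement.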

Touch screen

Capacitive touch screen

The touchscreen is by far the most popular direct pointing device. The first widely used representative, PLATO IV, was developed in the early 1970s and deployed in schools and universities. With the first touch-enabled PC, the HP-150 (1983), the technology found its way into everyday life. The touchscreen's triumph in the mobile sector began in 1995 with the release of the IBM Simon; this multifunctional device was the first to combine a mobile phone with a PDA, using a resistive touchscreen as the only input method. The next milestone was the release of the Apple iPhone in 2007, the first device to use a multi-touch user interface with a capacitive interaction surface in a mobile context. Touch technology is now used in almost all areas of human-machine interaction, mainly because it can be installed in little space and used without additional input peripherals.


Touchpad

Touchpad on an Acer notebook

In the early 1990s, the touchpad was first used as a mouse replacement in the mobile context of notebooks. Given the space constraints of this setting, a practical way of integrating a pointing device was sought. The touchpad follows the paradigm of the mouse: next to the interaction surface for the pointer, which is usually capacitive, there are buttons for left and right click.

Wii Remote

Wii Remote with bracelet

In 2005, Nintendo released the new Wii console. In contrast to its competitors on the console market, the company relied on a new kind of interaction concept: in addition to position determination via an infrared sensor, which is what matters for pointing, acceleration and rotation of the pointing device are recorded. In 2009, Sony released the "PlayStation Move", a similar game controller that uses the PlayStation Eye camera to record the user's pointing movement. In contrast to the Wii Remote, the relative position of the controller's glowing "color ball", as recorded by the camera, is mapped to the output device, which makes pointing at targets far from the center of the camera's field of view relatively laborious.

Buxton Taxonomy

The Buxton taxonomy was the first attempt to classify continuous input devices. William Buxton used the number of dimensions (columns), the property sensed (rows) and the type of interaction (T = touch, M = mechanical, i.e. via an intermediary device) as criteria and visualized them in a table.

Buxton's Taxonomy with a selection of pointing devices

Buxton's three-state model

The three-state model was developed by William Buxton in 1990 in order to be able to characterize inputs in graphical user interfaces. This should enable the requirements for interactive transactions (interaction of input device with system) to be recorded easily and comprehensively.

It describes three characteristic states that can occur when interacting with a system:

  • Out of range (State 0): the position of the pointing device cannot be determined by the system because it is out of range
  • Tracking (State 1): the movement of the pointing device only moves the cursor
  • Dragging (State 2): the pointing device moves an object

Not every pointing device can assume each of the three states. The following sequences can be modeled from these states.

2-State Transaction: In State 1, moving the pointing device, here the mouse, moves the cursor (tracking). Holding down the mouse button over an object allows it to be moved (dragging), which is State 2. Releasing the mouse button returns to State 1.

State 0-1 Transaction: This sequence describes a touch interaction. In State 0, the pointing device, in this case the finger, moves outside the physical interaction range (out of range) and has no effect on the cursor. As soon as the finger touches the touch surface, State 1 (tracking) is entered and the cursor follows the pointing movement. As soon as the finger loses contact with the touch surface, the system returns to State 0.

State 0-1-2 Transaction: The above sequence can be extended by the dragging state if a stylus is used as the pointing device. State 2 (dragging) can follow State 1 (tracking) when the object is activated in the system by applying pressure or pressing a button on the stylus. Reducing the pressure or pressing the stylus button again drops the object and returns to State 1.

Note: in current touch systems without a stylus, dragging can be reached through various system-specific actions (long click, double click) (Google, 2016); State 1 is skipped in this case.

State 2 Set: A mouse with several buttons yields a "State 2 set" sequence, in which State 2 can take several variants depending on the mouse's range of functions. For example, selecting an object with button A could select and move it, while button B performs the same action on a copy of the object. After releasing the button, the system returns to State 1 (tracking).

Buxton points out in his work that this concept is a first attempt at classification and could be adapted in subsequent work. He also emphasizes that State 2 is not restricted to dragging; only State 0 and State 1 are fixed. The model can thus be adapted to new types of pointing techniques.
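The three states and the transactions above can be expressed as a small transition table. This is a sketch with invented event names; a mouse never emits the touch-related events and therefore stays within States 1 and 2, while a finger on a touchpad moves between States 0 and 1:

```python
# Buxton's three-state model as a transition table:
# (state, event) -> next state
TRANSITIONS = {
    (0, "touch_down"):  1,  # out of range -> tracking
    (1, "lift_off"):    0,  # tracking -> out of range
    (1, "button_down"): 2,  # tracking -> dragging (e.g. stylus pressure/button)
    (2, "button_up"):   1,  # dragging -> tracking
}

def step(state, event):
    """Apply one input event; events with no defined transition leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

# A stylus "State 0-1-2 transaction": approach, drag an object, release, leave.
state = 0
for event in ["touch_down", "button_down", "button_up", "lift_off"]:
    state = step(state, event)
# state is back at 0 (out of range)
```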

Fitts' law

Fitts' law was developed by Paul Fitts in 1954 and describes a model of human movement. In his experiments, Fitts investigated which factors influence the speed of movement during controlled actions; among other things, he calculated the time required for a movement as a function of the size and distance of the target. The results showed that large targets close to the starting point of the action are much easier to hit than small targets that are further away.

The mathematical formula for this is:

  MT = a + b · ID,  with  ID = log₂(2D / W)

in which:

  • MT (movement time) is the time needed for the movement
  • a is a constant for the user's reaction time
  • b is a constant for the time the nervous system needs to process one bit
  • ID (index of difficulty) is the difficulty of the task (defined by the number of bits required to complete it)
  • D (distance) is the distance between the starting point and the center of the target
  • W (width) is the width of the target object

Note: in the original formula, the constant a was not included and was added later.
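Using the variable definitions above, the predicted movement time can be computed directly. A minimal sketch using the original formulation of the index of difficulty; the function name and the example constants are invented for illustration:

```python
import math

def fitts_movement_time(a, b, distance, width):
    """Predicted movement time MT = a + b * ID, with ID = log2(2D / W).

    a, b: empirically fitted constants (e.g. seconds and seconds per bit)
    distance: D, distance from the starting point to the target center
    width: W, width of the target along the axis of motion
    """
    index_of_difficulty = math.log2(2 * distance / width)
    return a + b * index_of_difficulty
```

Doubling the target width lowers ID by one bit and thus shortens the predicted movement time by b, matching Fitts' observation that large, close targets are hit faster.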

Application areas in UI design

Fitts' law is used in the following areas of user interface design :

  • Interactive elements: the larger an interaction area, for example a button, the easier it is to reach with the cursor. Of course, this positive effect only holds as long as other elements are not negatively affected by it.
  • Edges of the interaction area:
    • Corners: since the cursor automatically stops in the corners of the screen, their "width" can be regarded as infinite. The time required to hit these points is accordingly minimal, so it is very helpful to place important interaction areas there.
    • Top and bottom edges: these areas offer the same benefit as corners, albeit to a lesser extent; the width of the interaction area plays a role here again.
  • Menus: when activated, a menu should be placed as close as possible to the current cursor location in order to keep the distance D small. With a large number of sub-entries in drop-down menus, selecting an entry can take a long time; pie menus are therefore ideal in some cases, as they implement Fitts' law very well.

Control display gain

The control display gain (CD gain) is a variable that describes the relationship between the movement of the pointing device and the movement of the cursor during a pointing process.

  • CD gain = 1: the cursor moves at the same speed and over the same distance as the pointing device.
  • CD gain < 1: the cursor moves more slowly and over a shorter distance than the pointing device.
  • CD gain > 1: the cursor moves faster and over a greater distance than the pointing device.

The CD gain is calculated as the ratio of the pointer's speed to the speed of the pointing device:

  CD gain = V_pointer / V_device

To optimize the interaction, modern operating systems adapt the CD gain dynamically to the speed of the pointing device's movement. With so-called pointer acceleration (PA), the CD gain is increased for fast movements of the pointing device and decreased for slow ones. This system-side control of the CD gain can prevent the following problems:

  • Quantization: the problem that individual pixels become unreachable. It occurs when, at a very high CD gain, the resolution of the pointing device no longer allows every pixel of the output device to be addressed. This can be prevented by capping the CD gain usable by the system at the ratio of the pointing device's resolution to the output device's resolution, with both expressed in the same unit of measurement (usually DPI).
  • Clutching: the "re-gripping" needed with pointing devices that have a limited interaction surface (e.g. a touchpad). If the user reaches the edge of the area available for pointing mid-movement, the pointing device must be repositioned in order to reach the intended target.
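A toy sketch of pointer acceleration and of the quantization limit described above; the curve shape, function names and all constants are invented for illustration and do not match any particular operating system:

```python
def pointer_acceleration_gain(device_speed, g_min=1.0, g_max=4.0, v_ref=0.2):
    """Toy pointer-acceleration curve: CD gain grows with device speed.

    device_speed: speed of the pointing device, in m/s
    g_min, g_max: lower and upper bounds on the CD gain
    v_ref: reference speed (m/s) controlling how quickly the gain rises
    """
    return g_min + (g_max - g_min) * (device_speed / (device_speed + v_ref))

def max_usable_gain(device_dpi, display_ppi):
    """Largest CD gain at which every display pixel remains reachable.

    Above device_dpi / display_ppi, a single sensor count moves the cursor
    by more than one pixel, so some pixels become unreachable
    (the quantization problem).
    """
    return device_dpi / display_ppi
```

For example, an 1600 DPI mouse on a 100 PPI display can tolerate a CD gain of up to 16 before individual pixels become unreachable, so the acceleration curve's upper bound should stay below that value.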

Other pointing devices

  • GlideCursor: software that bridges clutching by keeping the cursor moving while the user re-grips
  • RollerMouse: alternative mouse mounted below the keyboard, intended to reduce re-gripping
  • Foot mouse: mouse controlled with a foot, mainly used as a pointing device by users with disabilities
  • Pen mouse: mouse in pen form that uses optical sensors to detect the user's pointing movement
  • Laser projection pointing: recognition of the finger on a projected interaction surface
  • Ring mouse: mouse worn on the finger
  • 3D mouse: mouse specially designed for 3D software
  • Gyroscopic mouse: gesture-based pointing device
  • Eye tracking: control of the cursor by eye movement

