Multi-Touch and Touchscreen Technologies


Multi-touch allows the user to interact directly with a visual display unit using their fingertips. Commands are issued as gestures, created by the movements of fingers across the screen, and are then sent to the device to carry out the corresponding instructions. This sort of technology is widely used for interacting with smartphones and computer screens [1], and can be considered a replacement for conventional input devices such as a stylus or a mouse.

The term "multi-touch" itself has been trademarked by Apple Inc. [2]

There are different ways of implementing multi-touch, but it is mainly the size and type of the interface that determine which technique is used. First, the image is projected onto a glass or acrylic surface of a touch table or touch wall. A light source is then used to backlight the image. This light scatters when an object or finger touches the surface of the projected image, and the scattered light is sensed by cameras or sensors. Software then interprets the data sent from the capturing devices and carries out instructions according to the gestures created. [1]

The varying amounts of scattered light reflected from the surface can each trigger sub-commands, depending on how firmly an object or finger is pressed onto the surface. [1]
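This pressure-sensitive idea can be sketched in a few lines: treat the brightness of a detected touch region as a proxy for how firmly the finger is pressed. The function name and threshold values below are hypothetical, for illustration only.

```python
def pressure_level(blob_pixels, light_threshold=80, firm_threshold=180):
    """Classify a touch as 'light' or 'firm' from scattered-light intensity.

    blob_pixels: pixel intensities (0-255) inside one detected blob.
    A firmer press flattens more of the fingertip against the surface,
    scattering more infrared light, so the blob appears brighter.
    The thresholds here are arbitrary illustrative values.
    """
    mean = sum(blob_pixels) / len(blob_pixels)
    if mean >= firm_threshold:
        return "firm"
    if mean >= light_threshold:
        return "light"
    return "none"

print(pressure_level([200, 210, 190]))  # firm
print(pressure_level([90, 100, 110]))   # light
```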

1.1 Touch screen history

In 1982, the first human-input multi-touch system was created at the University of Toronto's Input Research labs. It consisted of a camera placed behind a frosted glass panel; when two or more fingers were pressed onto the surface, the camera detected them as black dots, and these dots were registered as input from the data sent.

In 1985, Bill Buxton of Bell Labs developed this concept further, reducing the size of the original apparatus by using capacitance rather than a bulky camera-based system.

In 1991, the Digital Desk was designed by Pierre Wellner. It was the first system to support pinching and other multi-finger motions, and since then numerous companies have produced their own designs based on this work.

Apple revealed the iPhone in 2007, followed by Microsoft introducing the Microsoft Surface. Apple alone was expected to increase production of multi-touch mobile phones and small-scale handheld devices from 200,000 units in 2006 to 21 million in 2012. This has caused competitors to constantly look for ways to improve their own interfaces in order to keep up with the technology currently available on the market.

2.0 Multi-touch Technologies

At present there are four major techniques used by major firms for building stable multi-touch hardware systems. They are Jeff Han's pioneering Frustrated Total Internal Reflection (FTIR) approach, Rear Diffused Illumination (Rear DI) used in applications such as Microsoft's Surface table, Laser Light Plane (LLP) pioneered by Alex Popovich and, finally, Diffused Surface Illumination (DSI) developed by Tim Roth [1].

These four techniques all work on the principle of computer vision, where inputs are extracted in real time by analysing video feeds from one or more cameras. This use of computer vision to capture and manipulate input data makes these techniques not only cost-effective with high resolution, but also very scalable, ranging in size from around 15 inches to more than 20 feet [1].

2.1 Frustrated Total Internal Reflection (FTIR)

This method was developed by Jeff Han in 2005. Infrared LEDs are positioned around the perimeter of an acrylic pane. Shining the IR LEDs through the sides of the sheet produces total internal reflection (TIR), where light is reflected within the acrylic. Acrylic is used because it loses very little light within the material, making it the most suitable choice. When a finger presses on the acrylic, the touch is sensed: the total internal reflection is "frustrated" and the infrared light scatters downwards at the point of contact. The scattered light is then captured by an infrared-sensitive camera placed behind the acrylic sheet. Image-processing software identifies the points of contact as blobs, which are converted into instructions for the device to carry out. [5]

Sensitivity is greatly improved by using some sort of compliant surface, often a silicone rubber layer; without one, only firm touches on the bare acrylic set off the FTIR effect [1].
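The blob-detection step that FTIR (and the other optical techniques below) relies on can be sketched in plain Python: threshold the infrared camera frame, then group connected bright pixels into blobs and report their centroids. This is a minimal illustration, not the algorithm from any cited source; real trackers use optimized image-processing libraries.

```python
def detect_blobs(frame, threshold=128):
    """Find bright 'blob' centroids in a grayscale IR frame.

    frame: 2-D list of pixel intensities (0-255).
    Returns a list of (row, col) centroids, one per connected bright
    region -- in an FTIR setup each blob is one fingertip contact.
    """
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for r in range(h):
        for c in range(w):
            if frame[r][c] > threshold and not seen[r][c]:
                # Flood-fill one connected bright region (4-connectivity).
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
                        if (0 <= ny < h and 0 <= nx < w and not seen[ny][nx]
                                and frame[ny][nx] > threshold):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                ys = [p[0] for p in pixels]
                xs = [p[1] for p in pixels]
                centroids.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return centroids

# Two synthetic fingertip touches on a dark background.
frame = [[0] * 80 for _ in range(60)]
for r in range(10, 14):
    for c in range(10, 14):
        frame[r][c] = 255
for r in range(40, 44):
    for c in range(60, 64):
        frame[r][c] = 255
print(len(detect_blobs(frame)))  # 2 touch points
```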

Figure 1 - An illustration showing FTIR method [1]

2.2 Diffused Illumination (DI):

"Diffused Illumination comes in two main forms" and both techniques use the same basic principles [1].

2.2.1 Rear DI:

In the first technique, infrared light is shone at the screen from below the touch surface. A diffuser is placed on either the top or the bottom of the touch surface, so that when an object touches the surface it reflects more light than the diffuser or objects in the background. The extra light is sensed by a camera and, depending on the diffuser, this method can also detect hover and objects placed on the surface [1].

Figure 2 - An illustration showing Rear DI method [1]

2.2.2 Front DI

The second technique uses infrared light, often from the ambient surroundings, shone at the screen from above the touch surface. A diffuser is placed on either the top or the bottom of the touch surface, so that when an object meets the surface it casts a shadow at its position, which is picked up and sensed by the camera [1].

Figure 3 - An illustration showing Front DI method [1]

2.3 Laser Light Plane (LLP):

LLP is the one technique that does not use infrared LEDs like the other multi-touch methods; instead it uses an IR laser [1].

Here, infrared light from a laser is shone just above the surface, forming a plane of light about 1 mm thick. When a finger breaks this plane, the light striking the fingertip is scattered and registered as an IR blob [1].

Figure 4 - An illustration showing LLP method [1]

There are, however, safety issues with this technology. The lasers used are Class 3B, which are hazardous on direct eye exposure. Protective eyewear is often worn when working with laser beams of this class, and the lasers must also be equipped with a manual key switch and a safety interlock. [6]

2.4 Diffused Surface Illumination (DSI):

DSI uses the basic FTIR setup with LEDs, but replaces the standard acrylic with a special acrylic that distributes the infrared light evenly across the surface plane. This acrylic contains small particles that act like thousands of tiny mirrors: when IR light is shone into the edges of the material, it is redirected and spread to the surface. The effect is similar to DI, but differs in evenness of illumination and in hotspots [1].

Figure 5 - An illustration showing Rear DSI method [1]

2.5 Other Technologies

Non-optical technologies like capacitive and resistive methods are also in widespread use. Capacitive touch screens are coated with a material that holds a continuous electrical field, creating a capacitance. A human touch disturbs this field, allowing the location to be processed as a touch point. This technique only works with bare fingers, not with gloved fingers or styluses. It does, however, support multi-touch, as seen on the Apple iPhone [1].

Resistive technology is the most commonly used type of touch-screen technology. A resistive touch screen is made of multiple thin layers; when pressed, the layers connect, completing an electrical circuit that is processed as a touch point. It does not support multi-touch, though, and is often confused with capacitive technology [1].

2.6 Display Techniques

Another aspect of touch screens is the output display, which acts as guidance for the end user. Laptops, for example, use touch technology in the track pad: a direct replacement for the mouse that moves the cursor as a finger is dragged across it. The iPhone, unlike a track pad, has a display beneath the touch technology, which allows the user to navigate by touching and dragging in one motion, making it more user-friendly.

2.6.1 Projector

Figure 6 - Illustration of Rear Projection [16]

A projector works in a very simple way: it projects images onto a medium chosen by the user. A projector is ideal when used with a transparent acrylic medium, allowing the display to be projected onto the diffuser and shown on the acrylic. With infrared light, finger touches can then be registered and processed.

2.6.2 LCD

LCD, or liquid crystal display, is a display made up of rows of a viscous crystalline liquid. The liquid crystals are aligned in a so-called "twisted" arrangement; passing an electric charge through them effectively untwists the crystals and allows light to pass through at various angles. This technology is used in a wide variety of products, from laptop screens to flat-panel monitors.

2.7 Diffused Surfaces/Reflection

Diffuse reflection occurs when light reflects off an uneven surface; the uneven surface can then be said to be the diffused surface. This phenomenon is very useful in multi-touch technology:

"A projector based setup uses the projection surface, sometimes referred to as the diffuser, because it stops a lot of the projected image's light." [1]

"A LCD based setup makes use of the diffuser in order to go under the LCD screen so that the back light can evenly cover the screen. Common materials used are vellum or tracing paper." [1]

"A Rear Diffused Illumination (Rear DI) setup uses a diffuser and places it on the touch surface which is often the projection layer. IR light is then projected out through the touch surface and when it hits the finger, it is reflected back to the camera. The camera sees the diffused light off the finger as being brighter and thus makes (and) creates a blob at that location." [1]

"A Front Diffused Illumination (Front DI) setup also places a diffuser on the touch surface but this time, the IR light is projected from above the touch surface and creates shadows when you touch the surface. The camera sees the shadows off your finger as being darker than the diffusion layer, and after reversing the image, the touch is brighter, and thus makes a blob out of it." [1]
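The image reversal described for Front DI is a simple per-pixel inversion: after it, fingertip shadows become the brightest regions, so the same blob detection used by the other techniques applies. A minimal sketch, with a plain 2-D list standing in for a camera frame:

```python
def invert_frame(frame):
    """Reverse a Front DI camera frame (a 2-D list of 0-255 intensities)
    so fingertip shadows become bright regions that a standard blob
    detector can pick up."""
    return [[255 - px for px in row] for row in frame]

# An evenly lit diffuser with one dark fingertip shadow.
frame = [[200] * 20 for _ in range(20)]
for r in range(8, 12):
    for c in range(8, 12):
        frame[r][c] = 30  # shadow cast by a touching finger

inverted = invert_frame(frame)
print(inverted[10][10], inverted[0][0])  # 225 55 -> shadow is now brightest
```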

3.0 Multi-touch Interactions

Multimodal interaction is interaction that involves several input modes at the same time. Multi-touch interaction is a special case of multimodal interaction in which the input modes are the user's fingers. Multi-touch interaction was imagined, and even studied, decades ago; only recently have hardware and algorithmic developments made it a reality for the rest of us.

Multi-touch interaction styles: as with single-touch interaction (even more so, in fact), there is a whole continuum of possible interaction styles for multi-touch devices. However, most demos published in recent years fit into one of the three following categories, or are simple combinations of them:

Multi-touch gestures: gestures performed with one, two or more fingers are mapped onto commands.

Multipoint interaction: just as if you had several mice, you can point at several objects at the same time, drag several windows, or resize a window by picking up two corners at once.

Physical interaction: the contact surfaces are used to interact with the contents of the screen. You can scoop objects between your two hands, for instance.
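The first category, mapping finger motions onto commands, can be sketched as a small classifier. The gesture names and the two-finger spread heuristic below are hypothetical illustrations, not taken from any cited system:

```python
def classify_gesture(start_points, end_points):
    """Map a completed finger motion onto a command name.

    start_points / end_points: lists of (x, y) positions, one entry per
    finger, sampled at the start and end of the gesture.
    """
    def spread(points):
        # Distance between the first and last finger positions.
        (x1, y1), (x2, y2) = points[0], points[-1]
        return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5

    if len(start_points) == 1:
        return "drag"
    if len(start_points) == 2:
        # Fingers moving apart -> zoom in; moving together -> zoom out.
        return "zoom_in" if spread(end_points) > spread(start_points) else "zoom_out"
    return "unknown"

print(classify_gesture([(10, 10), (20, 10)], [(0, 10), (30, 10)]))  # zoom_in
print(classify_gesture([(5, 5)], [(40, 5)]))                        # drag
```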

Multi-user touch can be incorporated into a tabletop interface, allowing two or more users to perform hand motions on the same tabletop. This demonstrates tabletop interaction on a larger scale, built from the same basic hand actions.

4.0 Multi-touch in real life use

With this pace of technological advancement, it seems that as soon as someone purchases a new piece of technology, a newer and better version is already in development.

This can be seen in the way regular mobile phones grew to feature touch screens, or how standard iPods became iPod Touches. The technology has not only developed within mobile phones and handheld devices; it has also changed teaching methods in schools, along with customer interaction in banking and retail stores [3].

'SMART' boards are now being implemented into a majority of schools and have resulted in teaching being more interactive and stimulating to pupils [3].

Banks are now adopting touchscreen kiosks, which allow customers to carry out transactions and view information about products and services. This has a great impact on banks, as customers no longer need to queue to speak to a representative, saving time for both the bank and its customers. [4]

Large retailers have also taken on the concept of touchscreen kiosks, which let customers check product availability and access services the store itself cannot provide. They are also used to help customers navigate the store, enhancing the customer experience [3]. Retail stores have further diversified their use of such screens by advertising on shop-window fronts, displaying items and offers from the store and giving a literal meaning to the phrase "window shopping".

Many users would agree that touchscreen devices make interaction more user-friendly and less complicated. Touchscreens also give devices a sleek design edge by eliminating buttons [3].

4.1 The Apple iPhone

Apple Inc. incorporated the multi-touch function into the iPhone. The iPhone uses a non-optical method: a capacitive touch screen capable of detecting multiple touch events. Tasks are performed with a few common gestures: the user navigates applications by "flicking" or "dragging", and scrolls by dragging up or down. Certain applications and web pages also allow side-to-side scrolling. A pinching gesture lets users zoom in and out when viewing photos, web pages, email, or maps.
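The pinch-to-zoom gesture essentially scales the view by the ratio of the final finger separation to the initial one. The sketch below illustrates the geometry; it is not Apple's actual implementation:

```python
import math

def pinch_scale(p1_start, p2_start, p1_end, p2_end):
    """Return the zoom factor implied by a two-finger pinch gesture:
    the ratio of the final finger separation to the initial one.
    A value > 1 means the fingers moved apart (zoom in); < 1 means
    they moved together (zoom out)."""
    d_start = math.dist(p1_start, p2_start)  # initial separation
    d_end = math.dist(p1_end, p2_end)        # final separation
    return d_end / d_start

# Fingers start 100 px apart and end 200 px apart: zoom in by 2x.
print(pinch_scale((100, 100), (200, 100), (50, 100), (250, 100)))  # 2.0
```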

Figure 7 - An illustration showing pinch gesture. [9]

Figure 8 - An illustration showing slide gesture for scrolling [9]

4.2 Microsoft Surface

Microsoft Surface is a multi-touch computer that allows multiple users to manipulate digital content using gesture recognition. The product was introduced on May 29, 2007. Unlike Perceptive Pixel's products, the technology can detect and recognise certain physical objects placed on the surface, using advanced image processing. [14] Content is displayed on the Surface using rear projection. [13]

"Microsoft Surface has four key capabilities that make it such a unique experience:" [13]

"Direct interaction. Users can grab digital information with their hands and interact with content on-screen by touch and gesture - without using a mouse or keyboard." [13]

"Multi-user experience. The large, horizontal, 30 inch display makes it easy for several people to gather and interact together with Microsoft Surface - providing a collaborative, face-to-face computing experience." [13]

"Multi-touch. Microsoft Surface responds to many points of contact simultaneously - not just from one finger, as with a typical touch screen, but from dozens of contact points at once." [13]

"Object recognition. Users can place physical objects on the screen to trigger different types of digital responses - providing for a multitude of applications and the transfer of digital content to mobile devices." [13]

Figure 11 - Illustration of what's inside Microsoft Surface [15]

Figure 12 - Illustration of Microsoft Surface [13]

4.3 Perceptive Pixel

Jeff Han, the founder of Perceptive Pixel, constructed an astounding multi-touch technology. It uses an optical technique in which touch is detected through Frustrated Total Internal Reflection (FTIR).

Currently, Perceptive Pixel systems are the most advanced sensing displays on the market [10]. Their products allow an unlimited number of simultaneous touches, so an unrestricted number of users can control different movements at the same time.

Later, after founding the company, Jeff Han's team developed a larger interface than his previous invention, called the "Magic Wall". This huge 85.5-inch solution offers more space for users to interact at the same time. The technology can be found in many commercial markets, including: [11]

medical imaging

energy exploration

industrial design

Figure 9 - FTIR tabletop. [12]

Figure 10 - Perceptive Pixel's 'Magic Wall' [10]

5.0 Aims and Objectives

The fundamental aim of this project is to design and develop a multi-touch system to be employed as a tabletop device. Unlike most tabletop devices, this device will have multi-touch functionality and an on-board display. The on-board display is the main feature that distinguishes it from generic multi-touch devices. The device would essentially sit in the same market as the Apple iPad.

5.1 Tabletop Requirements

The device must be physically self-contained: all components housed within the device, with the only external connection being an output to a laptop.

Mobility is another main requirement: the device must be portable.

The device must be able to track multiple touches.

A reasonably large touch surface is needed to cater for multiple touches.

The tracking software used will be open source, and will be compatible with open-source multi-touch applications.

An LCD display will be used to show the image.

A webcam will be used to detect touch.

Reliability is key to the success of this device.

The project will be completed well within the appointed time frame.

6.0 Proposed system

Research shows there are many different multi-touch technologies that have been implemented in marketed products. The multi-touch tabletop will be built using the optical Frustrated Total Internal Reflection (FTIR) method.

Unlike Microsoft Surface and Jeff Han's system, which both use a projector to display the image, the prototype will use an LCD as the display. The budget is a limiting factor for this project, so costs need to be taken into consideration, and the LCD option is more cost-effective. The initial prototype will be built from strong, thick cardboard to house all the components temporarily so that the system can be tested.

The camera will be placed at the bottom of the tabletop so that a wider range is achieved when tracking fingers across the LCD display. A cheap webcam can be used, provided its resolution is high enough to ensure sensitive finger tracking.

The camera will be connected to a PC or laptop; as this is a prototype, a laptop is preferable for portability. The camera tracks finger gestures across the screen, and its data is sent to tracking software called "Touchlib", an open-source package that can be downloaded from the web. The software allows calibration of the camera against the alignment of the LCD display. The laptop will run applications to test the multi-touch functionality, and further software configuration can enhance the sensitivity of touch.
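The calibration step amounts to mapping camera pixel coordinates onto LCD pixel coordinates. The sketch below assumes a simple axis-aligned linear fit from two corner correspondences; real tools such as Touchlib calibrate against a fuller grid of points, so treat this as an illustration only:

```python
def make_calibration(cam_tl, cam_br, scr_tl, scr_br):
    """Build a camera->screen coordinate mapping from two calibration
    touches: the top-left and bottom-right corners of the display, given
    as (camera pixel, screen pixel) correspondences.
    Assumes the camera is roughly axis-aligned with the LCD."""
    sx = (scr_br[0] - scr_tl[0]) / (cam_br[0] - cam_tl[0])  # x scale
    sy = (scr_br[1] - scr_tl[1]) / (cam_br[1] - cam_tl[1])  # y scale

    def to_screen(cam_point):
        return (scr_tl[0] + (cam_point[0] - cam_tl[0]) * sx,
                scr_tl[1] + (cam_point[1] - cam_tl[1]) * sy)

    return to_screen

# The camera sees the LCD corners at (20, 15) and (620, 465);
# the display itself is 1024x768.
to_screen = make_calibration((20, 15), (620, 465), (0, 0), (1024, 768))
print(to_screen((320, 240)))  # centre of the camera view -> centre of the LCD
```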

To ensure that brighter blobs can be detected by the camera and that fingers can glide across the screen easily, a compliant surface is needed on top of the acrylic sheet. Tracing paper could be used, but it would dull the display, so a clear plastic film sheet will be used instead. It is very important that the sheet is clear, thin and flexible, as the flexibility makes the blobs brighter when a finger presses down on the surface. [8]

Once all tests have been conducted, a wooden housing can be constructed, as cardboard is weaker and could damage the components if it collapsed. A wooden housing will add support and strength, look more presentable, and be durable for everyday use. With this system, more than one person will be able to operate the device at the same time, unlike a laptop track pad.

7.0 Gantt Chart - Timetable

Work packages


Review of Multi-touch Tabletop

Design tabletop prototype

Development of tabletop prototype

Testing of tabletop prototype

Design, Develop and Test actual tabletop

Evaluation of Multi-touch Tabletop system

Compiling Final report & oral presentation
