WO2004070604A2 - Method of selecting objects of a user interface on a display screen - Google Patents
- Publication number: WO2004070604A2
- Authority: WIPO (PCT)
- Prior art keywords: selection, objects, user interface, area, display screen
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F3/04842—Selection of displayed objects or displayed text elements
Definitions
- The selection coefficient is essentially determined by this selection area.
- This selection area may be predetermined by the user interface, for example, by a program-technical routine of a user-interface program performed on the computer and displaying the computer user interface on the display screen or the computer monitor.
- The predetermined selection area can be dynamically adjusted, for example, in dependence upon the number of objects displayed on the display resource of the display screen, which resource is provided for and utilized by the user interface. Further parameters used for determining the selection coefficient for selecting an object may be, for example, the placement of objects on the user interface, and the size, priority and accumulation of objects at a given position on the user interface.
- The selection coefficient does not automatically correspond to the object area but can principally be fixed arbitrarily so as to allow adaptation, particularly of the accuracy of entry when selecting an object, to, for example, the display resource of the display screen, the display resolution, the resolution of detecting a fingertip touch, etc.
- Fig. 2 is a cross-section of a further user interface 12 on a display screen or computer monitor (not shown).
- This cross-section shows two partially overlapping objects O4 and O5.
- The objects are arranged in different planes, which is represented by the axis Z, which essentially corresponds to the normal on the display screen surface.
- This representation is shown for reasons of clarity and does not represent the real situation, because the displayed objects O4 and O5 of the user interface 12 are two-dimensional and the third dimension indicated by the Z axis exists only virtually in the program performing the user interface.
- The front object O4 has a substantially rectangular object area A4 and a predetermined selection area TA4 which is substantially elliptical and whose circumference is shown in broken lines. It is situated within the object area A4 of the object O4.
- The object area A4 and the selection area TA4 correspond to the visible object area NA4 and the visible selection area TNA4, respectively. Since the object O4 is situated in front of the object O5 on the user interface 12, it also has a higher value on the Z axis.
- The further object O5, arranged behind or below the object O4, has a substantially square-shaped object area A5 and is partially masked by the object O4 in such a way that the lower left corner of the object O5 is not visible on the user interface 12.
- The masked part of the object O5 is shown in broken lines. Consequently, the visible area NA5 of the object O5 is smaller than the object area A5.
- The object O5 has a predetermined selection area TA5 which is substantially circular and whose circumference is shown in broken lines.
- The predetermined selection area TA5 of the object O5 is situated within or on the object area A5 of the object O5.
- The visible selection area TNA5 of the object O5 is characterized by the shaded area and is smaller than the predetermined selection area TA5. Since the visible selection area TNA5 is directly taken into account when computing the selection coefficient TC5 of the object O5, the selection coefficient TC5 is smaller than in the case when the object O5 is completely visible on the user interface 12, i.e. when it is not masked by other objects such as the object O4. The selection coefficient becomes TCZ5 by multiplying TC5 by the value of the object O5 on the Z axis.
- The selection coefficient TCZ5 of the object O5 is smaller than the selection coefficient TCZ4 of the object O4.
- A more accurate entry for selecting the second object O5 will be necessary because of the smaller selection coefficient TCZ5.
- The upper object O4 is therefore given priority because it has a selection coefficient TCZ4 which is larger than the selection coefficient TCZ5 of the object O5.
- If, however, the object O4 had only a small visible selection area, the selection coefficient TCZ4 for selecting the object O4 would also be small, so that possibly the partially masked object O5 would have a larger selection coefficient TCZ5 and would sooner be selected in the case of an inaccurate fingertip touch.
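The occlusion-dependent part of this computation can be sketched as follows. The grid estimate, the rectangular areas and the Z values below are illustrative assumptions for the sketch, not geometry taken from the patent figures:

```python
# Sketch: estimating the visible selection area of the partly masked
# object O5 on a sampling grid, then weighting by the plane (Z) value
# as described in the text. All shapes and Z values are assumptions.

def visible_fraction(area, masks, step=1.0):
    """Grid estimate of the unmasked fraction of a rectangular area."""
    x0, y0, x1, y1 = area
    total = visible = 0
    y = y0
    while y < y1:
        x = x0
        while x < x1:
            total += 1
            if not any(mx0 <= x < mx1 and my0 <= y < my1
                       for mx0, my0, mx1, my1 in masks):
                visible += 1
            x += step
        y += step
    return visible / total

ta5 = (0, 0, 10, 10)             # selection area TA5 of the back object O5
o4_mask = [(0, 0, 5, 5)]         # corner of TA5 masked by the front object O4
tc5 = visible_fraction(ta5, o4_mask)   # 0.75: a quarter of TA5 is hidden
tcz5 = tc5 * 1                   # O5 lies behind: assumed Z value 1
tcz4 = 1.0 * 2                   # O4 fully visible and frontmost: assumed Z value 2
assert tcz4 > tcz5               # the front object is preferred
```

With a fully visible selection area, `tc5` would be 1.0 and the two objects would be separated only by their Z factors.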
- When a user interface is employed, it is quite common to touch a plurality of neighboring objects simultaneously, here both objects O4 and O5 in the example described above. In such a case, it may often be difficult to select an object, particularly to decide which one of the touched objects should be selected. The decision does not present any problems when one of the touched objects has a much larger selection coefficient than the other touched objects, such as the object O4 in the example described above. This object can then be selected. However, if all of the touched objects have substantially the same selection coefficient, or when the distinction is too small for an unambiguous selection, the selection will be much more difficult. According to the invention, a solution to this problem is essentially to enlarge the group of neighboring objects, including the touched objects, on the user interface.
- The object O4 shown in Fig. 2 may, for example, have such a small predetermined selection area TA4 that the selection coefficients TCZ4 and TCZ5 of the two neighboring objects O4 and O5 are substantially equally large.
- In this case the selection is not unambiguous, and the objects should be automatically enlarged so as to facilitate the selection of one of these objects.
- An implementation of the algorithm for automatic enlargement may comprise the following steps.
- A normalized value TCZp is computed for each object considered for selection, by forming the ratio TCZ/TCZmax for that object.
- TCZmax is the largest value TCZ among the objects considered for selection.
- The object having the largest value TCZ is selected when all other objects have a value TCZp which is smaller than a predetermined threshold value P of, for example, 30%.
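Under the assumption that the plane-weighted coefficients TCZ have already been computed, the steps above can be sketched as follows; the function name and list representation are illustrative, and the 30% threshold is the example value from the text:

```python
# Sketch of the select-or-enlarge decision described above.

def select_or_zoom(tcz, threshold=0.30):
    """Return the index of the winning object, or None to trigger enlargement."""
    tcz_max = max(tcz)
    winner = tcz.index(tcz_max)
    # Normalized coefficients TCZp = TCZ / TCZmax for the other objects:
    others = [v / tcz_max for i, v in enumerate(tcz) if i != winner]
    if all(p < threshold for p in others):
        return winner      # unambiguous: select the object directly
    return None            # ambiguous: enlarge the group of objects instead

assert select_or_zoom([1.0, 0.1, 0.2]) == 0   # clear winner, object selected
assert select_or_zoom([1.0, 0.9]) is None     # too close: zoom in first
```

Returning `None` here stands for the automatic enlargement of the section of the user interface containing the touched group, after which the selection is repeated.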
- In summary, the present invention allows a less restrictive implementation of selectable objects in the design of a user interface, because a multitude of different parameters is taken into account when creating the selectable objects in accordance with the invention and can be taken into account for the design, so that particularly the accuracy of a user entry is improved by taking account of distinctive parameters, such as the display resolution on the screen, the resolution of the entry method, etc., in the function for selecting an object.
- TA1-TA5: predetermined selection, or touch, areas
Abstract
The invention relates to a method of selecting objects of a user interface on a display screen, in which a selection coefficient is determined for an object (O4) with reference to predetermined parameters, which selection coefficient serves for selecting the object. When selecting an object from a group of objects (O4, O5) of the user interface, the object that has the largest determined selection coefficient among the objects of the group can be selected.
Description
Method of selecting objects of a user interface on a display screen
The invention relates to a method of selecting objects of a user interface on a display screen as defined in claim 1.
In graphic or textual user interfaces, a multitude of objects is displayed on a display resource present on the display screen for displaying the user interface. Many of these objects are used for initiating entries to be performed by a computer to which the user interface is assigned.
The selection of objects of the user interface may principally be performed in two different ways: either by way of ancillary means connected to the computer, or directly by entry on the display screen. Ancillary means are, for example, keyboards or an indicator device such as a mouse, a trackball or a joystick. The direct entry on the display screen is only possible with types specially prepared for this purpose, for example, touch-sensitive display screens.
In both principal entry methods, it is often difficult to select a given object from a multitude of objects, particularly when the objects are represented in a tiny form on the display screen or when the display screen has a very high resolution in proportion to the resolution of the entry method.
Particularly when using touch-sensitive display screens, it may be very difficult to select small objects, or at least partly masked objects, on a high-resolution user interface, particularly when a user uses his fingers as the entry means. To provide the possibility of a reliable selection of objects, the size of selectable objects is often adapted to the entry method. User interfaces for touch-screens, i.e., for example, fingertip-operated display screens, therefore usually display very large objects, so that the number of objects to be displayed on the user interface is reduced. In other words, the selectable objects must cover a certain minimum area of the user interface if they are to be fingertip-selected in a precise manner.
This will hereinafter be elucidated with reference to an example. A 15" display screen which is touch-sensitive can generally display a user interface with a resolution of 1024 x 768 pixels. The accuracy of a touch-entry is, however, only about 1% of the pixels, which corresponds to about 10 pixels or about 3 mm. The parallax of the pixel plane and the
touch plane having a distance of about 2 to 3 mm reduces the accuracy of the entry by a further 2 to 3 mm in the case of an assumed maximum viewing angle of about 45°, which corresponds to about 6 to 9 pixels. The minimum area covered by objects to be easily selectable by the user's fingers can be calculated with reference to the fingertip-touch area on the display screen.
A finger may touch, for example, an area having a diameter of about 10-15 mm, which corresponds to about 30 to 45 pixels. In this case, entry errors, which are based on an inaccurate alignment between the display screen plane and the touch plane, or result from long-time offset and amplification errors in the touch entry, are not taken into account. Such errors can generally be eliminated by means of calibration methods.
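The arithmetic in this example can be reproduced with a short calculation. The sketch below assumes square pixels and a 15-inch diagonal, which the text implies but does not state explicitly:

```python
import math

# Assumptions: 15-inch diagonal, 1024 x 768 resolution, square pixels.
diag_px = math.hypot(1024, 768)       # 1280 pixels along the diagonal
diag_mm = 15 * 25.4                   # 381 mm
mm_per_px = diag_mm / diag_px         # ~0.298 mm per pixel

# Touch accuracy of about 1% of the horizontal pixels:
touch_accuracy_px = 0.01 * 1024                    # ~10 pixels
touch_accuracy_mm = touch_accuracy_px * mm_per_px  # ~3 mm, as stated

# A fingertip contact patch of 10 to 15 mm in diameter:
finger_px = (10 / mm_per_px, 15 / mm_per_px)       # roughly 34 to 50 pixels
```

The fingertip figure comes out slightly above the text's rounded "30 to 45 pixels", suggesting the text assumed a somewhat coarser pixel pitch; the 3 mm and 10-pixel figures agree exactly.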
The considerations mentioned above are usually taken into account in the design of the user interface and essentially determine the minimum area of objects that can be fingertip-selected. It is true that it is principally possible to use a stylus instead of the fingertip for entries, as is already current practice in many personal digital assistants. However, since the pixel distance decreases continuously, this problem will also be significant in future applications.
It is therefore an object of the present invention to propose a method of selecting objects of a user interface on a display screen, which method is improved as compared with conventional entry methods. This object is achieved by the method as defined in the characterizing part of claim 1. Further advantageous embodiments of the invention are defined in the dependent claims.
An essential idea of the invention is to determine a selection coefficient with reference to predetermined parameters for an object of a user interface on a display screen. The selection coefficient, which serves for selecting the object, can be considered as a kind of "virtual" selection area of an object. It provides the particular possibility of designing a user interface which, as regards the size of selectable objects, is less restrictive than has been described hereinbefore. Particularly, the design can be adapted to, for example, the resolution of a display on the display screen, the resolution of an entry method, the number of objects displayed on a display resource of a display screen, or similar parameters.
The invention specifically relates to a method of selecting objects of a user interface on a display screen, in which a selection coefficient is determined for an object with reference to predetermined parameters, which selection coefficient serves for selecting the object. When selecting an object from a group of objects of the user interface, for example,
the object that has the largest determined selection coefficient among the objects of the group can be selected. The particular advantage of this method is that the selection of an object depends on a coefficient which is determined with reference to predetermined parameters, which provides the possibility of a more flexible adaptation of the selection of objects as compared with the state of the art, in which only the area of an object that is visible on the user interface on a display screen is taken into account for selection.
Particularly, the predetermined parameters may comprise the object area, the part of the object area visible on the display screen, a predetermined selection area, the part of the selection area that is not masked by other objects and is thus particularly visible, and a plane of the user interface in which the object is situated. These different parameters, which can be taken into account in the method according to the invention, provide the basis for an intelligent and flexible control of selecting objects of the user interface on the display screen.
For example, the area can be determined with reference to the proportion between the selection area and the object area. To prefer objects with a small area to objects with a large area, the afore-mentioned proportion for the objects with a small area may be chosen to be very large as compared with the objects with a large area. Consequently, easy selection of a small-area object is also possible in a selection method using a small resolution, for example, a fingertip entry. Other computations for determining the selection coefficient essentially determining the object selection are of course feasible, for example, by taking the visible selection area of a first object in proportion to the visible selection area of a neighboring, second object into account when determining the selection coefficient for selecting the first object, or when a priority of an object is concerned. Priority is herein understood to mean particularly that an object having an important function (for example close window) has a higher priority than an object performing a less important function (for example, maximize/minimize window).
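A minimal sketch of this proportion-based coefficient follows; the function name, the priority weighting and the area values are illustrative assumptions, not values taken from the patent:

```python
# Hypothetical sketch: forming a selection coefficient from the ratio of
# the (virtual) selection area to the object area, weighted by priority.

def selection_coefficient(selection_area, object_area, priority=1.0):
    """A generous selection-area-to-object-area ratio favours small objects."""
    return (selection_area / object_area) * priority

# A small object given a selection area much larger than itself beats a
# large object whose selection area merely equals its own area:
small = selection_coefficient(selection_area=900.0, object_area=100.0)    # 9.0
large = selection_coefficient(selection_area=2500.0, object_area=2500.0)  # 1.0
assert small > large

# An important control (e.g. "close window") can be preferred via priority:
close_btn = selection_coefficient(400.0, 400.0, priority=2.0)
minimize = selection_coefficient(400.0, 400.0, priority=1.0)
assert close_btn > minimize
```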
The selection coefficient is preferably evaluated by a factor, for example, multiplied, which factor is assigned to the plane in which the object is situated. When selecting objects, a "third" dimension is used. In other words, not only areas and their ratios for improving the selection of objects of the user interface but also their arrangement on the normal of the display screen plane are taken into account.
In a particularly preferred embodiment of the method according to the invention, the object having the largest selection coefficient evaluated by the factor is selected from a group of objects in so far as the factor-evaluated selection coefficients of the other objects of the group are smaller than a predetermined threshold value. If, in contrast,
one of the factor-evaluated selection coefficients of the other objects is larger than the predetermined threshold value, a section of the user interface with the group of objects is automatically enlarged, similarly as in the case when the group of objects is magnified. The selection of an object is thereby essentially facilitated. This is particularly advantageous when all objects of the group essentially have the same factor-evaluated selection coefficient, i.e. about the same selection area, are situated in the same plane of the user interface and, as compared with the selection area, have a proportionally small visible object area. The predetermined threshold value may be derived, for example, from the largest factor-evaluated selection coefficient of the objects of a group and from the other factor-evaluated selection coefficients.
In a particularly preferred embodiment of the method, the user interface is a graphic user interface, in which an object is a symbol, a pictogram, a switch area, a selection menu, a window or a similar element of a graphic user interface.
Finally, the display screen is preferably a touch-sensitive display screen. Principally, the method may, however, be advantageously used in other entry methods of selecting objects on the user interface, for example, in high-resolution user interfaces having a multitude of objects and a mouse as an entry device.
The invention further relates to a data-processing device comprising a display screen which is program-technically arranged to perform the method according to the invention. Particularly, the data-processing device is part of a device for patient monitoring, in which the display screen is used for displaying patient data and for operating the device. For example, the display screen may be used for indicating given measuring data of a patient and simultaneously indicate a user interface of a patient monitoring system comprising objects for selection. When the display screen is touch-sensitive, the patient monitoring and, for example, the indicated data can be directly controlled via the selection of given objects which are indicated on the user interface of the display screen. In this respect, it is to be noted that the invention is not limited to its use in a patient monitoring system but can be used advantageously in any computer system comprising a user interface with selectable objects. These and other aspects of the invention are apparent from and will be elucidated with reference to the embodiments described hereinafter.
In the drawings:
Fig. 1 is a cross-section of a first example of a user interface with selectable objects and their areas such as selection areas with which the areas are determined for selection according to the invention, and
Fig. 2 is a cross-section of a second example of a user interface with selectable objects which are superposed in different planes, in which one object partly masks another object.
Fig. 1 shows objects of a user interface 10 assigned to a computer (not shown) and displayed on a touch-sensitive display screen or computer monitor (not shown). The Figure shows three objects O1, O2 and O3 which are juxtaposed for the sake of clarity. The objects are switch areas of the user interface which, when touched, control the computer for performing a predetermined operation.
The objects shown in Fig. 1 are particularly provided for fingertip selection or selection by entry means such as a stylus for use on a touch-sensitive display screen. As soon as one of the three objects Ol, O2 or O3 on the display screen is fingertip-touched, the function assigned to the object is performed by the computer. This is effected in that the touch-sensitive display screen detects a touch on one of the objects and signalizes this to the computer. The signalization is effected similarly as with a conventional computer keyboard on which a given key is pressed.
However, in contrast to a conventional computer keyboard, in which the switch areas are fixed by the size of the keys, the object size on a user interface depends on different factors such as, for example, the display screen resolution and the number of objects displayed. For this reason, a selection coefficient is determined according to the invention for each object, which selection coefficient is evaluated for selecting the corresponding object. This selection coefficient can also be considered to be a kind of virtual touch area or activation area for the object.
The object O1 shown at the extreme left in Fig. 1, which object is substantially square-shaped and has an object area A1, comprises a substantially circular predetermined selection, or touch, area TA1 whose circumference is shown in broken lines in Fig. 1. The predetermined selection area TA1 is arranged within the outer periphery of the object O1 and principally corresponds to said virtual touch area of the object O1. By touching the predetermined selection area TA1 with a fingertip, the object O1 is selected and the operation assigned to this object is performed by the computer. Since the predetermined selection area TA1 is smaller than the object area A1, no touch is detected when the peripheral region of the object area A1 is touched. Such an arrangement is therefore generally suitable for a larger object on the user interface.
The central object O2 has an object area A2 which is also substantially square-shaped but smaller than the object area A1 of the left object O1. Consequently, selecting the object O2 by means of a fingertip touch is more difficult than selecting the left object O1, which has a larger object area A1. To compensate for this drawback, the object O2 has a predetermined selection area TA2 which is substantially circular and larger than the object area A2. The outer periphery of the predetermined selection area TA2 is again shown in broken lines. The object area A2 is situated within the predetermined selection area TA2. Consequently, it is easily possible to select the object O2 by means of a fingertip touch, even though this object has a very small object area A2 as compared with, for example, the object O1. Finally, a third object O3 is shown at the extreme right in the Figure, which object has an object area A3 and is substantially elliptical. The predetermined selection area TA3, whose circumference is again shown in broken lines, substantially corresponds to the object area A3.
According to the invention, the selection coefficient to be determined for selecting an object can now be computed with reference to the parameters shown in Fig. 1, such as the object area and the predetermined selection area: the selection, or touch, coefficient of an object, also referred to as TC, is determined from the ratio between the visible selection area and the visible object area, in accordance with the expression TC = TNA/NA, in which TNA is the visible selection area and NA is the visible object area. Since the visible selection area TNA corresponds to the predetermined selection area TA1, TA2 or TA3 for each of the three fully visible objects O1, O2 and O3, respectively, the selection coefficient TC for each object is the proportion between the predetermined selection area and the visible object area, i.e. TA1/A1 (< 1), TA2/A2 (> 1) and TA3/A3 (~1).
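Expressed as code, the ratio TC = TNA/NA described above might be sketched as follows. The function name and the numeric area values are illustrative assumptions, not taken from the patent; for fully visible objects, the visible selection area TNA simply equals the predetermined selection area TA.

```python
def selection_coefficient(visible_selection_area, visible_object_area):
    """Selection, or touch, coefficient TC = TNA / NA: the ratio of the
    visible selection (touch) area to the visible object area. Sketch only."""
    return visible_selection_area / visible_object_area

# Hypothetical area values for the three fully visible objects of Fig. 1:
tc1 = selection_coefficient(0.5, 1.0)   # TA1 < A1, hence TC1 < 1
tc2 = selection_coefficient(1.5, 1.0)   # TA2 > A2, hence TC2 > 1
tc3 = selection_coefficient(1.0, 1.0)   # TA3 ~ A3, hence TC3 ~ 1
```

The units cancel in the ratio, so any consistent area measure (pixels, millimetres) may be used.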
Since the determined selection coefficient, or the above-mentioned proportion TC, essentially depends on the predetermined or visible selection area, the selection coefficient is essentially determined by this selection area. This selection area may be predetermined by the user interface, for example, by a program-technical routine of a user interface program which is performed on the computer and displays the computer user interface on the display screen or the computer monitor. Particularly, the predetermined selection area can be dynamically adjusted, for example, in dependence upon the number of objects displayed on the display resource of the display screen which is provided for and utilized by the user interface. Further parameters used for determining the selection coefficient for selecting an object may be, for example, the placement of objects on the user interface, and the size, priority and accumulation of objects at a given position on the user interface. Further parameters for computing the selection coefficients are of course also feasible. However, it is essential that the selection coefficient does not automatically correspond to the object area but can principally be fixed arbitrarily so as to allow adaptation, particularly of the accuracy of entry when selecting an object, to, for example, the display resource of the display screen, the display resolution, the resolution of detecting a fingertip touch, etc.
Fig. 2 is a cross-section of a further user interface 12 on a display screen or computer monitor (not shown). This cross-section shows two partially overlapping objects O4 and O5. The objects are arranged in different planes, which is represented by the axis Z, which essentially corresponds to the normal to the display screen surface. This representation is chosen for reasons of clarity and does not reflect the real situation, because the displayed objects O4 and O5 of the user interface 12 are two-dimensional and the third dimension indicated by the Z axis exists only virtually in the program implementing the user interface.
The front object O4 has a substantially rectangular object area A4 and a predetermined selection area TA4 which is substantially elliptical and whose circumference is shown in broken lines. It is situated within the object area A4 of the object O4. The object area A4 and the selection area TA4 correspond to the visible object area NA4 and the visible selection area TNA4, respectively. Since the object O4 is situated in front of the object O5 on the user interface 12, it also has a higher value on the Z axis. The selection coefficient TC4 = TNA4/NA4 = TA4/A4 is extended to the value TCZ4 by taking the value on the Z axis into account; TCZ4 can be obtained, for example, by multiplying the value Z of the object O4 on the Z axis by TC4. The further object O5, arranged behind or below the object O4, has a substantially square-shaped object area A5 and is partially masked by the object O4 in such a way that the lower left corner of the object O5 is not visible on the user interface 12. For the sake of clarity, the masked part of the object O5 is shown in broken lines. Consequently, the visible area NA5 of the object O5 is smaller than the object area A5. The object O5 has a predetermined selection area TA5 which is substantially circular and whose circumference is shown in broken lines. The predetermined selection area TA5 of the object O5 is situated within or on the object area A5 of the object O5. About one quarter of the predetermined selection area TA5 of the object O5 is masked by the object O4 and is consequently invisible (dotted part of the circumferential line of the selection area TA5). The visible selection area TNA5 of the object O5 is characterized by the shaded area and is smaller than the predetermined selection area TA5. Since the visible selection area TNA5 is directly taken into account when computing the selection coefficient TC5 of the object O5, the selection coefficient TC5 is smaller than in the case where the object O5 is completely visible on the user interface 12, i.e. when it is not masked by other objects such as the object O4. The selection coefficient becomes TCZ5 by multiplying TC5 by the value of the object O5 on the Z axis.
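The Z-weighted coefficient described for Fig. 2 can be sketched as follows. The Z values and areas below are invented for illustration; the patent leaves the exact Z weighting open ("for example, by multiplying"), and this sketch uses that simple multiplication.

```python
def tcz(z, visible_selection_area, visible_object_area):
    """Z-weighted selection coefficient: TCZ = Z * (TNA / NA).
    Sketch with hypothetical values, using multiplication by Z
    as the example weighting named in the text."""
    return z * (visible_selection_area / visible_object_area)

# Front object O4: fully visible, higher Z value (hypothetical numbers).
tcz4 = tcz(z=2.0, visible_selection_area=0.8, visible_object_area=1.0)

# Back object O5: about a quarter of its selection area TA5 is masked,
# so TNA5 = 0.75 * TA5, and its visible object area NA5 is also reduced.
tcz5 = tcz(z=1.0, visible_selection_area=0.75 * 1.0, visible_object_area=0.9)
```

With these assumed values the front object ends up with the larger coefficient, matching the priority behavior described in the text.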
The selection coefficient TCZ5 of the object O5 is smaller than the selection coefficient TCZ4 of the object O4. When selecting one of the two objects O4 or O5, for example, by means of a fingertip touch of the display screen on which the user interface 12 is shown, a more accurate entry will be necessary for selecting the second object O5 because of its smaller selection coefficient TCZ5. In other words, in the case of an inaccurate fingertip touch of the user interface, or more precisely of the relevant program, the upper object O4 is given priority sooner because its selection coefficient TCZ4 is larger than the selection coefficient TCZ5 of the object O5. However, if the predetermined selection area TA4 of the upper object O4, and hence the visible selection area TNA4, were much smaller than the object area A4 or the visible object area NA4 of the upper object O4, then the selection coefficient TCZ4 for selecting the object O4 would also be small, so that the partially masked object O5 might have the larger selection coefficient TCZ5 and would sooner be selected in the case of an inaccurate fingertip touch.
When a user interface is employed, it is quite common to touch a plurality of neighboring objects simultaneously, here both objects O4 and O5 in the example described above. In such a case, it may often be difficult to decide which one of the touched objects should be selected. The decision does not present any problems when one of the touched objects has a much larger selection coefficient than the other touched objects, such as the object O4 in the example described above; this object can then be selected. However, if all of the touched objects have substantially the same selection coefficient, or when the difference is too small for an unambiguous selection, the selection will be much more difficult. According to the invention, a solution to this problem is essentially to enlarge the group of neighboring objects, including the touched objects, on the user interface. This can be done in an individual window which is automatically opened as soon as such a case occurs. For example, the object O4 shown in Fig. 2 may have such a small predetermined selection area TA4 that the selection coefficients TCZ4 and TCZ5 of the two neighboring objects O4 and O5 are substantially equally large. When the two objects O4 and O5 are touched simultaneously, the selection is not unambiguous and the objects should be automatically enlarged so as to facilitate the selection of one of these objects.
An implementation of the algorithm for automatic enlargement may comprise the following steps.
1. Initially, all objects of a group from which an object is to be selected are sorted in accordance with their value TCZ.
2. Subsequently, a value TCZp is computed for each object considered for selection by forming the ratio TCZ/TCZmax for each object, TCZmax being the largest TCZ value among the objects considered for selection.
3. Subsequently, the object having the largest value TCZ is selected, when all other objects have a value TCZp which is smaller than a predetermined threshold value P of, for example, 30%.
The following computation example elucidates this algorithm. A group comprises four objects 1-4 having the following values:

object 1: TCZ = 1.11, TCZp = 10.1%
object 2: TCZ = 0.60, TCZp = 5.5%
object 3: TCZ = 7.1, TCZp = 64.7%
object 4: TCZ = TCZmax = 10.99, TCZp = 100.0%

When a threshold value P of 70% is predetermined for the enlargement, object 4 can be selected immediately because the values TCZp of the objects 1 to 3 are smaller than 70%. However, if the given threshold value is about 50% or even only 30%, the selection will be automatically enlarged because at least one of the values TCZp of the objects 1 to 3 is larger than the predetermined threshold value.

In summary, the present invention allows a less restrictive implementation of selectable objects in the design of a user interface, because a multitude of different parameters is taken into account when creating the selectable objects in accordance with the invention and can be taken into account for the design, so that particularly the accuracy of a user entry is improved by taking account of distinctive parameters such as the display resolution of the screen, the resolution of the entry method, etc. in the function for selecting an object.
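The three-step algorithm and the computation example above can be sketched as follows. The function name and the return convention (an index, or None when enlargement is required) are assumptions; the explicit sort of step 1 is folded into finding TCZmax directly.

```python
def select_or_enlarge(tcz_values, threshold):
    """Return the index of the object with the largest TCZ when every
    other object has TCZp = TCZ / TCZmax below the threshold P;
    return None when the group must be enlarged instead (sketch)."""
    tcz_max = max(tcz_values)                  # steps 1-2: find TCZmax
    best = tcz_values.index(tcz_max)
    tczp = [v / tcz_max for v in tcz_values]   # step 2: TCZp per object
    # Step 3: selection is unambiguous only if every other TCZp < P.
    if all(p < threshold for i, p in enumerate(tczp) if i != best):
        return best
    return None

# Objects 1-4 from the computation example:
values = [1.11, 0.60, 7.1, 10.99]
result_70 = select_or_enlarge(values, 0.70)  # object 4 selected at once
result_30 = select_or_enlarge(values, 0.30)  # enlarge: object 3's ~65% > 30%
```

With P = 70% the function returns index 3 (object 4); with P = 50% or 30% it returns None, signaling automatic enlargement, exactly as in the example.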
List of reference signs:
10, 12 cross-section of a user interface
O1-O5 objects
TA1-TA5 predetermined selection, or touch, areas
A1-A5 object areas
TNA5 visible selection area
Z "third" dimension
Claims
1. A method of selecting objects of a user interface on a display screen, wherein a selection coefficient is determined for an object (O4) with reference to predetermined parameters, which selection coefficient serves for selecting the object (O4).
2. A method as claimed in claim 1, characterized in that in the selection of an object (O4) from a group of objects (O4, O5) of the user interface, the object that has the largest determined selection coefficient among the objects of the group is selected.
3. A method as claimed in claim 1 or 2, characterized in that the predetermined parameters comprise the object area (A5), the part of the object area visible on the display screen, a predetermined selection area (TA5), and the part of the selection area (TNA5) that is not masked by other objects and is thus particularly visible.
4. A method as claimed in claim 3, characterized in that the predetermined parameters comprise the plane (Z) in the user interface in which the object (O5) is situated.
5. A method as claimed in claim 3 or 4, characterized in that the selection coefficient is determined with reference to the proportion between the part of the visible selection area (TNA5) and the object area (A5).
6. A method as claimed in claim 5, characterized in that the selection coefficient is further evaluated by a factor (Z) which is assigned to the plane in which the object (O1) is situated.
7. A method as claimed in claim 6, characterized in that the object having the largest selection coefficient evaluated by the factor (Z) is selected from a group of objects in so far as the selection coefficients of the other objects of the group evaluated by the factor (Z) are smaller than a predetermined threshold value, and, in the contrary case, a section of the user interface with the group of objects is automatically enlarged so as to facilitate selection of an object.
8. A method as claimed in any one of the preceding claims, characterized in that the user interface is a graphic user interface, in which an object is a symbol, a pictogram, a switch area, a selection menu, a window or a similar element of a graphic user interface.
9. A method as claimed in any one of the preceding claims, characterized in that the display screen is a touch-sensitive display screen.
10. A data-processing device comprising a display screen which is program-technically adapted to perform a method as claimed in any one of claims 1 to 9.
11. A data processing device as claimed in claim 10, characterized in that it is a part of a device for patient monitoring, in which the display screen is used for displaying patient data and for operating the device.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP03100238.9 | 2003-02-05 | ||
EP03100238 | 2003-02-05 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2004070604A2 true WO2004070604A2 (en) | 2004-08-19 |
WO2004070604A3 WO2004070604A3 (en) | 2004-09-16 |
Family
ID=32842809
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2004/000228 WO2004070604A2 (en) | 2003-02-05 | 2004-01-23 | Method of selecting objects of a user interface on a display screen |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2004070604A2 (en) |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2187297A3 (en) * | 2008-11-14 | 2011-01-19 | Samsung Electronics Co., Ltd. | Method for selecting area of content for enlargement, and apparatus and system for providing content |
EP2426591A1 (en) * | 2007-01-07 | 2012-03-07 | Apple Inc. | Portable multifunction device, method, and graphical user interface for interpreting a finger gesture on a touch screen display |
US8201109B2 (en) | 2008-03-04 | 2012-06-12 | Apple Inc. | Methods and graphical user interfaces for editing on a portable multifunction device |
US8358281B2 (en) | 2009-12-15 | 2013-01-22 | Apple Inc. | Device, method, and graphical user interface for management and manipulation of user interface elements |
US8370736B2 (en) | 2009-03-16 | 2013-02-05 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US8427445B2 (en) | 2004-07-30 | 2013-04-23 | Apple Inc. | Visual expander |
WO2013136707A1 (en) * | 2012-03-15 | 2013-09-19 | Sony Corporation | Information processing apparatus, method, and non-transitory computer-readable medium |
US8570278B2 (en) | 2006-10-26 | 2013-10-29 | Apple Inc. | Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker |
US8661339B2 (en) | 2011-05-31 | 2014-02-25 | Apple Inc. | Devices, methods, and graphical user interfaces for document manipulation |
CN103645841A (en) * | 2013-12-12 | 2014-03-19 | 深圳Tcl新技术有限公司 | Method and device for realizing self-adaptive display of 3-dimentional filed depth of mouse |
US9348511B2 (en) | 2006-10-26 | 2016-05-24 | Apple Inc. | Method, system, and graphical user interface for positioning an insertion marker in a touch screen display |
US9898162B2 (en) | 2014-05-30 | 2018-02-20 | Apple Inc. | Swiping functions for messaging applications |
US9971500B2 (en) | 2014-06-01 | 2018-05-15 | Apple Inc. | Displaying options, assigning notification, ignoring messages, and simultaneous user interface displays in a messaging application |
US10146431B2 (en) | 2008-09-11 | 2018-12-04 | Interdigital Ce Patent Holdings | Touch panel device |
US10620812B2 (en) | 2016-06-10 | 2020-04-14 | Apple Inc. | Device, method, and graphical user interface for managing electronic communications |
EP3838209A1 (en) * | 2019-12-20 | 2021-06-23 | Koninklijke Philips N.V. | Generating interactive zones for interventional medical devices |
US11188168B2 (en) | 2010-06-04 | 2021-11-30 | Apple Inc. | Device, method, and graphical user interface for navigating through a user interface using a dynamic object selection indicator |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5757358A (en) * | 1992-03-31 | 1998-05-26 | The United States Of America As Represented By The Secretary Of The Navy | Method and apparatus for enhancing computer-user selection of computer-displayed objects through dynamic selection area and constant visual feedback |
GB2352154A (en) * | 1999-07-16 | 2001-01-17 | Ibm | Automatic target enlargement for simplified selection |
US6259436B1 (en) * | 1998-12-22 | 2001-07-10 | Ericsson Inc. | Apparatus and method for determining selection of touchable items on a computer touchscreen by an imprecise touch |
Cited By (46)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8427445B2 (en) | 2004-07-30 | 2013-04-23 | Apple Inc. | Visual expander |
US9348511B2 (en) | 2006-10-26 | 2016-05-24 | Apple Inc. | Method, system, and graphical user interface for positioning an insertion marker in a touch screen display |
US9632695B2 (en) | 2006-10-26 | 2017-04-25 | Apple Inc. | Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker |
US8570278B2 (en) | 2006-10-26 | 2013-10-29 | Apple Inc. | Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker |
US9207855B2 (en) | 2006-10-26 | 2015-12-08 | Apple Inc. | Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker |
EP2426591A1 (en) * | 2007-01-07 | 2012-03-07 | Apple Inc. | Portable multifunction device, method, and graphical user interface for interpreting a finger gesture on a touch screen display |
EP2126676B1 (en) * | 2007-01-07 | 2013-05-22 | Apple Inc. | Portable multifunction device, method, and graphical user interface for interpreting a finger gesture on a touch screen display |
US8201109B2 (en) | 2008-03-04 | 2012-06-12 | Apple Inc. | Methods and graphical user interfaces for editing on a portable multifunction device |
US9529524B2 (en) | 2008-03-04 | 2016-12-27 | Apple Inc. | Methods and graphical user interfaces for editing on a portable multifunction device |
US10146431B2 (en) | 2008-09-11 | 2018-12-04 | Interdigital Ce Patent Holdings | Touch panel device |
EP2573663A1 (en) * | 2008-11-14 | 2013-03-27 | Samsung Electronics Co., Ltd | Method for selecting area of content for enlargement, and apparatus and system for providing content |
US8930848B2 (en) | 2008-11-14 | 2015-01-06 | Samsung Electronics Co., Ltd. | Method for selecting area of content for enlargement, and apparatus and system for providing content |
EP2187297A3 (en) * | 2008-11-14 | 2011-01-19 | Samsung Electronics Co., Ltd. | Method for selecting area of content for enlargement, and apparatus and system for providing content |
US8584050B2 (en) | 2009-03-16 | 2013-11-12 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US8370736B2 (en) | 2009-03-16 | 2013-02-05 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US10761716B2 (en) | 2009-03-16 | 2020-09-01 | Apple, Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US8756534B2 (en) | 2009-03-16 | 2014-06-17 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US9875013B2 (en) | 2009-03-16 | 2018-01-23 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US8661362B2 (en) | 2009-03-16 | 2014-02-25 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US9846533B2 (en) | 2009-03-16 | 2017-12-19 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US8510665B2 (en) | 2009-03-16 | 2013-08-13 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US8358281B2 (en) | 2009-12-15 | 2013-01-22 | Apple Inc. | Device, method, and graphical user interface for management and manipulation of user interface elements |
US11709560B2 (en) | 2010-06-04 | 2023-07-25 | Apple Inc. | Device, method, and graphical user interface for navigating through a user interface using a dynamic object selection indicator |
US11188168B2 (en) | 2010-06-04 | 2021-11-30 | Apple Inc. | Device, method, and graphical user interface for navigating through a user interface using a dynamic object selection indicator |
US11256401B2 (en) | 2011-05-31 | 2022-02-22 | Apple Inc. | Devices, methods, and graphical user interfaces for document manipulation |
US8677232B2 (en) | 2011-05-31 | 2014-03-18 | Apple Inc. | Devices, methods, and graphical user interfaces for document manipulation |
US9092130B2 (en) | 2011-05-31 | 2015-07-28 | Apple Inc. | Devices, methods, and graphical user interfaces for document manipulation |
US8661339B2 (en) | 2011-05-31 | 2014-02-25 | Apple Inc. | Devices, methods, and graphical user interfaces for document manipulation |
US8719695B2 (en) | 2011-05-31 | 2014-05-06 | Apple Inc. | Devices, methods, and graphical user interfaces for document manipulation |
US10664144B2 (en) | 2011-05-31 | 2020-05-26 | Apple Inc. | Devices, methods, and graphical user interfaces for document manipulation |
WO2013136707A1 (en) * | 2012-03-15 | 2013-09-19 | Sony Corporation | Information processing apparatus, method, and non-transitory computer-readable medium |
CN104185979A (en) * | 2012-03-15 | 2014-12-03 | 索尼公司 | Information processing apparatus, method, and non-transitory computer-readable medium |
US11747958B2 (en) | 2012-03-15 | 2023-09-05 | Sony Corporation | Information processing apparatus for responding to finger and hand operation inputs |
US10007401B2 (en) | 2012-03-15 | 2018-06-26 | Sony Corporation | Information processing apparatus, method, and non-transitory computer-readable medium |
CN103645841A (en) * | 2013-12-12 | 2014-03-19 | 深圳Tcl新技术有限公司 | Method and device for realizing self-adaptive display of 3-dimentional filed depth of mouse |
US11226724B2 (en) | 2014-05-30 | 2022-01-18 | Apple Inc. | Swiping functions for messaging applications |
US9898162B2 (en) | 2014-05-30 | 2018-02-20 | Apple Inc. | Swiping functions for messaging applications |
US10739947B2 (en) | 2014-05-30 | 2020-08-11 | Apple Inc. | Swiping functions for messaging applications |
US11068157B2 (en) | 2014-06-01 | 2021-07-20 | Apple Inc. | Displaying options, assigning notification, ignoring messages, and simultaneous user interface displays in a messaging application |
US10416882B2 (en) | 2014-06-01 | 2019-09-17 | Apple Inc. | Displaying options, assigning notification, ignoring messages, and simultaneous user interface displays in a messaging application |
US9971500B2 (en) | 2014-06-01 | 2018-05-15 | Apple Inc. | Displaying options, assigning notification, ignoring messages, and simultaneous user interface displays in a messaging application |
US11494072B2 (en) | 2014-06-01 | 2022-11-08 | Apple Inc. | Displaying options, assigning notification, ignoring messages, and simultaneous user interface displays in a messaging application |
US11868606B2 (en) | 2014-06-01 | 2024-01-09 | Apple Inc. | Displaying options, assigning notification, ignoring messages, and simultaneous user interface displays in a messaging application |
US10620812B2 (en) | 2016-06-10 | 2020-04-14 | Apple Inc. | Device, method, and graphical user interface for managing electronic communications |
EP3838209A1 (en) * | 2019-12-20 | 2021-06-23 | Koninklijke Philips N.V. | Generating interactive zones for interventional medical devices |
WO2021122181A1 (en) * | 2019-12-20 | 2021-06-24 | Koninklijke Philips N.V. | Generating interactive zones for interventional medical devices |
Also Published As
Publication number | Publication date |
---|---|
WO2004070604A3 (en) | 2004-09-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2004070604A2 (en) | Method of selecting objects of a user interface on a display screen | |
US8797363B2 (en) | Method of controlling touch panel display device and touch panel display device using the same | |
CN107066137B (en) | Apparatus and method for providing user interface | |
US9182884B2 (en) | Pinch-throw and translation gestures | |
EP2241959B1 (en) | Touch-panel device | |
US7924271B2 (en) | Detecting gestures on multi-event sensitive devices | |
EP1774429B1 (en) | Gestures for touch sensitive input devices | |
US9377909B2 (en) | Touchscreen data processing | |
AU2008100547B4 (en) | Speed/position mode translations | |
US8479122B2 (en) | Gestures for touch sensitive input devices | |
US9052817B2 (en) | Mode sensitive processing of touch data | |
US20090066659A1 (en) | Computer system with touch screen and separate display screen | |
US20080165154A1 (en) | Apparatus and method for controlling touch sensitivity of touch screen panel and touch screen display using the same | |
US20100220062A1 (en) | Touch sensitive display | |
KR20090029670A (en) | Method and apparatus for selecting an object within a user interface by performing a gesture | |
CN108064371A (en) | A kind of control method and device of flexible display screen | |
AU2015202763B2 (en) | Glove touch detection | |
US20110157045A1 (en) | Information processing apparatus, information processing method, and program | |
CN111221439A (en) | Touch operation method of touch screen oscilloscope, digital oscilloscope and signal measuring device | |
JP5414134B1 (en) | Touch-type input system and input control method | |
CN110633023B (en) | False touch prevention method and device, terminal and computer readable storage medium | |
US9524051B2 (en) | Method and terminal for inputting multiple events | |
EP2791773A1 (en) | Remote display area including input lenses each depicting a region of a graphical user interface | |
TW202127222A (en) | User interface adjustment method and touch display device | |
US20230063584A1 (en) | Contactless input device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A2 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
122 | Ep: pct application non-entry in european phase |