WO2000016186A2 - Apparatus and method for moving objects on a touchscreen display - Google Patents

Apparatus and method for moving objects on a touchscreen display

Info

Publication number
WO2000016186A2
WO2000016186A2 (application PCT/US1999/021301)
Authority
WO
WIPO (PCT)
Prior art keywords
portable intelligent communications device
touchscreen display
perimeter
Application number
PCT/US1999/021301
Other languages
French (fr)
Other versions
WO2000016186A3 (en)
Inventor
Mona Singh
Original Assignee
Ericsson Inc.
Application filed by Ericsson Inc. filed Critical Ericsson Inc.
Priority to DE19983569T (published as DE19983569T1)
Priority to JP2000570657A (published as JP2002525705A)
Priority to AU62508/99A (published as AU6250899A)
Publication of WO2000016186A2
Publication of WO2000016186A3
Priority to HK02104016.0A (published as HK1042359B)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

An apparatus and method for moving an object on a touchscreen display of a portable intelligent communications device or a separate computer is disclosed as including the steps of touching first and second areas on the display screen associated with the object to select the object, and identifying a new location for the object on the display screen. The object is selected when the first and second areas are touched within a predetermined time period, and moved to the new location when the location is identified on the screen within an additional predetermined time period. In touching the areas associated with an object to select the object, the screen is contacted at first and second points within a selection range about the object. From these points, the touches move in unison towards the center of the object, terminating at a point abutting or inside the periphery of the object. The first and second touches may be on opposite sides of the object and accomplished using a thumb and finger.

Description

APPARATUS AND METHOD FOR MOVING OBJECTS ON A TOUCHSCREEN DISPLAY
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates generally to a computer-controlled touchscreen display and, more particularly, to an apparatus and method for moving objects between distinct locations on a touchscreen display of a portable intelligent communications device or a separate computer.
2. Description of Related Art
Various types of computer-based devices have been developed for communications, information processing and other purposes. Among these devices are personal computers, personal digital assistants, and a relatively new class of devices known as portable intelligent communications devices. Unlike the first two devices, the portable intelligent communications device is designed expressly to be a communications device, rather than just a mobile computer, and as such it includes a computer integrated with communications hardware and software to provide telephony, messaging and information services. To enable at least some of these features, the portable intelligent communications device is able to be connected to the Internet by either a wired link or a wireless link. It will also be understood that certain software applications are provided within the portable intelligent communications device which facilitate the aforementioned features, as well as other desirable features such as a Personal Information Manager (PIM), games and the like. An exemplary portable intelligent communications device is shown and disclosed in a patent application entitled "Switching Of Analog Signals In Mobile Computing Devices" and having Serial No. 08/796,119, which is owned by the assignee of the present invention and is hereby incorporated by reference.

Portable intelligent communications devices, like other computer-controlled devices, include a screen or display panel to enable interaction with the computer via a graphical user interface. This interaction is oftentimes accomplished by way of a mouse or other pointing device. To input or select information from the screen, the user manipulates the mouse to direct a cursor to an appropriate area of the screen. Once at the appropriate area, the user selects an item by using a mouse button, or enters a command or text through a keyboard.
In addition to inputting and selecting information, oftentimes it is desirable to move objects such as icons, control tabs and text fields to new locations on the screen. In a mouse-based system, such as a Windows® graphical user interface, objects are moved to new screen locations using a drag and drop sequence. In this sequence, the cursor is positioned over the object to be moved, and the object is selected by pressing and holding down a mouse button. While the button is held down, the cursor and object are "dragged" to the new location on the display screen. At the new location, the mouse button is released to complete the move.
In an alternative method, an object is moved to a new screen location by first selecting a drag and drop mode from a control panel. Once in the drag and drop mode, the cursor is moved to the desired object, and the mouse "clicked" to select the object. The cursor is then moved to the new target location, and the mouse "clicked" again to move the object to that location. After the object is moved, the cursor must again be directed to the control panel to deselect and exit the drag and drop mode.
While the drag and drop procedures described above are satisfactory for moving objects in mouse-based systems, these procedures do not translate intuitively to a touch-based system in which a user interacts with the computer by touching designated areas on the display screen with a finger tip. In a touch-based system, moving objects by the primary drag and drop method described above leads to ambiguity and error since the user's view of the screen is oftentimes obstructed by the user's own hand during the drag motion. Furthermore, the single touch required to select and move an object is similar to actions utilized for executing other screen tasks and therefore can be misinterpreted, leading to the unintentional moving of objects. While the alternative drag and drop method described above eliminates some of these problems, it too is undesirable since users frequently forget to exit the drag and drop mode after a move sequence, resulting in the unintentional moving of objects.
Accordingly, it is a primary object of the present invention to provide an apparatus and method for moving objects on a touchscreen display that is intuitive for the modality of touch.
It is another object of the present invention to provide an apparatus and method for moving objects on a touchscreen display in which objects are selected with a distinct manual gesture, thereby virtually eliminating confusion between a move action and other screen tasks. It is still another object of the present invention to provide an apparatus and method for moving objects on a touchscreen display which eliminates the need to drag a selected object to the new location on the screen display.
Yet another object of the present invention is to provide an apparatus and method for moving an object on a touchscreen display of a portable intelligent communications device in which the target location for the object may be identified with a single touch.
These objects and other features of the present invention will become more readily apparent upon reference to the following description when taken in conjunction with the following drawings.
SUMMARY OF THE INVENTION
In accordance with a first aspect of the present invention, a method of moving an object depicted on a touchscreen display of a portable intelligent communications device or other computer-controlled device is disclosed as including the steps of selecting an object having an initial location on the touchscreen display by touching an area associated with the object in a predetermined manner, identifying a target location for the object on the touchscreen display, and moving the object from the initial location to the target location. The object is moved when the target location is identified within a predetermined time period after the object has been selected. The object is also identified as being selected and the target location as being allowed for the object prior to movement of the object. The object may be selected in one of several manners, including touching first and second areas on the touchscreen display associated with the object, touching the touchscreen display in a circular motion substantially about a perimeter of the object, simultaneously touching the object and the target location on the touchscreen display, and touching a corner of the object and moving diametrically thereacross to an opposite corner thereof.
In accordance with a second aspect of the present invention, a portable intelligent communications device is disclosed as including circuitry for performing telephony operations, a processing circuit, a memory circuit, and a touchscreen display coupled to the processing circuit for controlling the display. The processing circuit is operable to move the location of objects on the touchscreen display upon detection of a predetermined tactile gesture on the touchscreen display in an area associated with one of such objects followed by a subsequent touch at a new location on the touchscreen display. An object is moved to the new location when the predetermined tactile gesture selecting the object and the subsequent touch occur within a predetermined time period. The predetermined tactile gesture to select an object may be first and second touches by a thumb and finger on opposite sides of the object, a circular motion with a finger about the object's perimeter, simultaneously touching the object and the new location on the touchscreen display, and touching a corner of the object and moving diametrically thereacross to an opposite corner thereof.
BRIEF DESCRIPTION OF THE DRAWINGS
While the specification concludes with claims particularly pointing out and distinctly claiming the present invention, it is believed the same will be better understood from the following description taken in conjunction with the accompanying drawings in which:
Fig. 1 is a perspective view of a portable intelligent communications device in accordance with the present invention;
Fig. 2 is a block diagram of the major components of the portable intelligent communications device depicted in Fig. 1;
Fig. 3 is a block diagram of the software architecture for the portable intelligent communications device depicted in Figs. 1 and 2;
Fig. 4 is an exemplary screen display from a representative software application depicting an object being selected for movement to a new location on the screen display, as well as the identification of such new location for the object in accordance with the present invention;
Fig. 5 is an exemplary screen display similar to Fig. 4, depicting the selected object at the target location following movement from its original location;
Fig. 6 is a diagrammatic view of an object being selected for movement in accordance with the present invention;
Fig. 7 is a diagrammatic view of an alternative method for selecting an object to be moved in accordance with the present invention;
Fig. 8 is a diagrammatic view of another alternative method for selecting an object to be moved in accordance with the present invention; and
Fig. 9 is a flowchart of the steps by which a preferred method of the present invention is accomplished.
DETAILED DESCRIPTION OF THE INVENTION
Referring now to the drawings in detail, wherein identical numerals indicate the same elements throughout the figures, Fig. 1 depicts a portable intelligent communications device identified generally by the numeral 10. It will be understood that portable intelligent communications device 10 is principally a communications device and includes circuitry and components which allow it to function in such capacity through cellular, landline, infrared data association (IrDA), phone cards, and other modes. Portable intelligent communications device 10 also includes circuitry which enables it to function in the capacity of a computer, and a plurality of software applications may be utilized therewith. Because of this combination, portable intelligent communications device 10 is uniquely suited to interface software applications with communications hardware and software, particularly where connection to an Internet address is desired. In this regard, it will be understood that portable intelligent communications device 10 generally operates in accordance with a device shown and described in a patent application entitled "Switching Of Analog Signals In Mobile Computing Devices" and having Serial No. 08/796,119, which is also owned by the assignee of the present invention and is hereby incorporated by reference.
As seen in Fig. 1, portable intelligent communications device 10 includes a casing 12 for housing the communications and other circuitry as will be discussed in greater detail hereinafter. A handset 14 is positioned within a top portion 16 of casing 12 and preferably includes a built-in speaker 18 for use when handset 14 is maintained therein. A pivotable antenna 20 (shown in Fig. 1 in the open or use position) is provided to enable a communications function, as when portable intelligent communications device 10 is in a cellular mode of operation. It will be understood that various ports, jacks, and interfaces will be provided to further enable communications functions by portable intelligent communications device 10. Control buttons 21 and 23 are also shown as being located on top portion 16 of casing 12.
Portable intelligent communications device 10 further includes a display screen 22, which preferably is a type in which a user of the device is able to interact through touching designated areas thereon. It will be appreciated that a stylus 24 may optionally be utilized to indicate a particular area more specifically than can be accomplished with the user's finger, although most designated areas are sized for touch interaction by a typically sized finger. Since portable intelligent communications device 10 preferably is no larger than a standard business desk telephone, display screen 22 is sized to be approximately eight (8) inches measured diagonally across. This puts screen display 22 in a distinct size class, as it is smaller than normal monitor sizes for personal and portable computers and larger than screen displays for personal digital assistants (PDAs), calculators, and other similar personal electronic devices.
Fig. 2 depicts the internal circuitry of portable intelligent communications device 10 as including a processing circuit 26, which may, for example, be a Motorola microprocessor known by the designation Power PC 821. It will be seen that processing circuit 26 is connected to both Read Only Memory (ROM) 28 and Random Access Memory (RAM) 30 in which both operating systems and software applications are stored. An optional bulk storage device 32 is further provided for storing databases. Processing circuit 26 is also coupled to display screen 22 through a standard driver (not shown) in order to control the images displayed thereon, as well as receive information through graphical user interfaces in which the user of portable intelligent communications device 10 may indicate chosen options. The communications functions of portable intelligent communications device 10 are also handled through processing circuit 26 via a serial and/or parallel port 34 to the particular circuitry of a communications mode designated generically by reference numeral 36. As noted hereinabove, there are several communication mode options available, including cellular, landline, IrDA, and phone cards, and it will be appreciated that more than one such option may be utilized at a given time. A keyboard 38 may also be connected to processing circuit 26, where keyboard 38 can be depicted on display screen 22 or be a separate physical package which can be utilized with portable intelligent communications device 10 such as through a keyboard IR port 40 (see Fig. 1).
Fig. 3 depicts a schematic block diagram of the software architecture for portable intelligent communications device 10. As seen therein, the software is divided into three basic areas: applications software 42, desktop software 44, and system operating software 46 (which includes everything else from the class libraries down to the device drivers for portable intelligent communications device 10). It will be understood that neither applications software 42 nor desktop software 44 will ever interact with anything other than the top layer of system operating software 46. Exemplary software applications are shown within applications software 42, with particular reference being made to Phone Book software application 48.
Turning now to Fig. 4, an exemplary screen display 50 is illustrated on display screen 22 when portable intelligent communications device 10 operates within Phone Book software application 48. The present invention will be described with respect to representative Phone Book software application 48, which may be used to save and group business card information on portable intelligent communications device 10 or a similar computer. It will be appreciated, however, that although the present invention is described with respect to a Phone Book software application, the invention is applicable to any touch-based user interface, such that any screen image that may be moved via a drag and drop procedure may also be moved via the pick and place method of the present invention.
As can be seen in Fig. 4, the user interface of representative screen display 50 includes a variety of screen images or objects, otherwise known as "touchable items," through which a user interacts with the application. These touchable items include a plurality of virtual tabbed areas which make up a main control panel 52. In screen display 50, these tabbed areas are designated as "Phone" at 54, "Edit" at 56, "Setup" at 58, "Services" at 60 and "Help" at 62. A second level of objects or menu choices correspond to each of tabbed areas 54-62, and appear on display screen 22 when the corresponding tabbed area has been selected. In screen display 50, the "Phone" tabbed area at 54 has been selected, causing a second level of objects to be displayed. These objects include "Dialer" at 64, "End" at 66, "Hold" at 68, "Resume" at 70, "Transfer" at 72, "Mute" at 74, "Record" at 76, "Vol" at 78, and "Exit" at 80.
Below main control panel 52, in the lower half of screen display 50, is a second control panel 82. Control panel 82 includes the options "Phone Dialer" at 84, "Phone Book" at 86, "Speed Dial" at 88, and "Unanswered Calls" at 90, each of which may be selected by the user to perform a particular function within Phone Book software application 48. In screen display 50, the user has selected the "Phone Book" option at 86, which has brought forth a list window 92 containing a display list 94. Display list 94 includes a plurality of touchable icons 96 aligned under the group heading "Phone Books" and subheadings "Personal", "Professional" and "Emergency." Each of the touchable items 96 may or may not be associated with a text field which describes the depicted icon. In representative application 48, selection of any one of touchable items 96 brings forth a phone number corresponding to the individual or organization identified in the text field from memory circuits 28, 30 or 32.
Additional control buttons or objects identified as "Call" at 98 and "Cancel" at 100 are located beneath second control panel 82. Control buttons 98 and 100 may be used to initiate or terminate access to the telephony features of the portable intelligent communications device 10 using a telephone number obtained from display list 94. A bottom rectangular area 102 of screen display 50 may be used to display status information, as well as one or more additional control buttons (identified collectively by numeral 104). An additional list window or work area 105 may be provided to the right of list window 92 for entering or retrieving information related to display list 94.
In addition to the objects described above, it will be noted that screen display 50 includes a top window title bar 106 and the standard Windows-based control buttons 108 located along the right-hand side of title bar 106. A vertical scroll bar 110 is also provided for stepping through the items displayed in list window 92 when the document is too large to be displayed in its entirety therein. Scroll bar 110 preferably operates in the same manner as the equivalent vertical controls for a Windows-based program.
Each of the objects described above has a unique location on screen display 50 that is set and controlled by processing circuit 26. This location is interpreted by processing circuit 26 in determining what action to take following one or more touches on display screen 22. Although processing circuit 26 attributes a particular location to each touchable item, this location may be changed for many of the items, such as control tabs, buttons and icons, through a user-initiated sequence. In the present invention, processing circuit 26 relocates an object upon detecting a touch in an area of display screen 22 associated with the object in a predetermined manner (i.e., "picking" the object), followed by the identification of a new or target location (i.e., "placing" the object).
As can be seen in Fig. 4, an object, such as that indicated by reference numeral 111, is selected or "picked" by touching the object in a predetermined manner interpreted by processing circuit 26 as requesting a movement thereof.
This preferably involves touching first and second areas on object 111, as indicated by arrows 112 and 114. First touch 112 and second touch 114 are preferably on opposite sides of object 111, and the gesture is typically accomplished with a thumb and finger of a user's hand using the same motion generally made in picking up a physical object. It will be understood, however, that the touching gesture described may be done in any manner with any two separate digits of the user's hands. Preferably, first and second touches 112 and 114 occur substantially simultaneously (i.e., within approximately 0.10 second), but in any event within a predetermined time period (e.g., approximately one second or less), in order for processing circuit 26 to distinguish the touches as selecting object 111 for movement, rather than another screen task. First and second touches 112 and 114 that occur outside of the predetermined time period are interpreted by processing circuit 26 as selecting the object for a different action or result in an error message indicating a failed move attempt, but in any event would not initiate movement of the object. After object 111 has been selected, it is highlighted (see Fig. 4) to provide a visual indication to the user of its selection. Thereafter, a target location for object 111 is identified on display screen 22 in order to complete the move. In the preferred embodiment, a target location 118 for object 111 is identified by touching display screen 22 at the desired point. This generally is accomplished, as shown in screen display 50, by touching display screen 22 with a fingertip 116 at target location 118. In order for processing circuit 26 to associate the touch at target location 118 with movement of object 111, the touch preferably occurs within a predetermined time period after object 111 is selected for movement. In the preferred embodiment, the predetermined time period between selection of object 111 and identification of target location 118 is less than 2 seconds. If target location 118 is not identified within this predetermined time period, then object 111 is either automatically deselected or an error message is displayed on display screen 22 indicating a failed movement attempt. For movement of object 111 to be completed, target location 118 selected on display screen 22 must also be in an allowed area for the particular object being moved. It will be appreciated, for example, that the tabbed areas of main control panel 52 and secondary control panel 82 must remain therein and that touchable items 96 must remain within list window 92.
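Expressed as program logic, the two timing rules above reduce to a pair of windows: the pick touches must fall within about one second of each other, and the placing touch within about two seconds of a successful selection. The following Python sketch illustrates that logic only and is not the patented implementation; the class, the is_allowed_target callback, and the move_to method are invented names for the example, with the window values taken from the approximate figures quoted above.

    import time

    # Assumed timing windows, from the approximate values in the text: the two
    # "pick" touches should arrive within ~1 second of each other, and the
    # "place" touch within ~2 seconds of a successful selection.
    PICK_WINDOW_S = 1.0
    PLACE_WINDOW_S = 2.0

    class PickAndPlace:
        """Minimal pick-and-place state tracker (illustrative only)."""

        def __init__(self, is_allowed_target):
            # is_allowed_target(obj, point) -> bool is a hypothetical callback
            # supplied by the application, e.g. "icons must stay in the list window".
            self.is_allowed_target = is_allowed_target
            self.selected = None
            self.selected_at = 0.0

        def on_pick(self, obj, first_touch_t, second_touch_t):
            """Select obj if its two pick touches fall inside the pick window."""
            if abs(second_touch_t - first_touch_t) <= PICK_WINDOW_S:
                self.selected = obj
                self.selected_at = max(first_touch_t, second_touch_t)
                return True  # the caller would highlight the object here
            return False     # too far apart: treat as some other screen task

        def on_place(self, point, touch_t=None):
            """Move the selected object to point if the touch is timely and allowed."""
            touch_t = time.monotonic() if touch_t is None else touch_t
            if self.selected is None:
                return False
            if touch_t - self.selected_at > PLACE_WINDOW_S:
                self.selected = None   # timed out: deselect or show an error
                return False
            if not self.is_allowed_target(self.selected, point):
                self.selected = None   # disallowed area: deselect or show an error
                return False
            self.selected.move_to(point)  # hypothetical method on the object
            self.selected = None
            return True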
After object 111 has been "picked" as shown at 112 and 114, and target location 118 has been identified within the predetermined time period, processing circuit 26 alters display screen 22 to depict object 111 at target location 118. Fig. 5 depicts screen display 50 after object 111 has been selected and moved from its initial position under the subheading "Personal" to a new location under the subheading "Professional." It will be understood that the initial location of object 111 is shown in dashed lines at 120, while object 111 is shown highlighted at target location 118.
The selection of an object for movement within a screen display will now be described in more detail with reference to Fig. 6, which is a diagrammatic view of object 111 being double touched as described hereinabove. As shown in Fig. 6, touchable items 96 are modeled as a rectangle 122 having a center 124 (although other shapes may be utilized). Rectangle 122 is sized to best approximate the size and shape of object 111; thus, it may be of varying dimensions with the particular dimensions thereof depending upon the modeled object. In Fig. 6, it will be appreciated that touchable item 96 and its accompanying text field "Alex Jones" are modeled as a single rectangle 122 since they are associated on screen display 50 and movable as a single object.
In rectangular model 122, object 111 is divided into four equal quadrants 126, 128, 130 and 132 by vertical center line 134 and horizontal center line 136 extending between opposing sides 138, 140 and 142, 144, respectively, through center 124. Sides 138, 140, 142 and 144 of rectangle 122 form a perimeter 148 for object 111. A border 146, shown as a shaded area having a thickness t, surrounds rectangle 122. In the preferred embodiment, thickness t of border 146 is approximately 8-16 millimeters.
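The rectangular model of Fig. 6 is simple enough to capture in a few lines of geometry. The Python sketch below is one possible rendering, offered for illustration only: the class and method names are invented, and the border thickness constant is a placeholder standing in for the approximate 8-16 millimeter figure given above.

    from dataclasses import dataclass

    BORDER_T = 12.0  # border thickness in screen units; the text gives ~8-16 mm

    @dataclass
    class RectModel:
        """Rectangle 122: an object's bounding box with center, quadrants, border."""
        left: float
        top: float
        right: float
        bottom: float

        @property
        def center(self):
            return ((self.left + self.right) / 2, (self.top + self.bottom) / 2)

        def quadrant(self, x, y):
            """Return 1-4 for the quadrant of (x, y), as divided by the vertical
            and horizontal center lines (134 and 136) through center 124."""
            cx, cy = self.center
            if y < cy:
                return 1 if x < cx else 2
            return 3 if x < cx else 4

        def inside(self, x, y):
            """True if (x, y) lies on or within the perimeter (148)."""
            return self.left <= x <= self.right and self.top <= y <= self.bottom

        def in_border(self, x, y):
            """True if (x, y) falls in the shaded border (146) just outside
            the perimeter."""
            in_expanded = (self.left - BORDER_T <= x <= self.right + BORDER_T
                           and self.top - BORDER_T <= y <= self.bottom + BORDER_T)
            return in_expanded and not self.inside(x, y)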
In the preferred embodiment, object 111 is selected for movement by touching rectangle 122 within first and second areas of two different quadrants. In the model shown in Fig. 6, object 111 is touched at arrows 112 and 114 along opposing longitudinal sides 138 and 140 of rectangle 122 in quadrants 126 and 128. It will be understood that object 111 could alternatively be touched substantially simultaneously at quadrants 130 and 132 or along lateral sides 142 and 144 at quadrants 126 and 130 or 128 and 132. To select object 111, the two touches preferably begin within border 146 outside of the object and move in a sliding action along display screen 22, ending on or just inside perimeter 148 of object 111. As object 111 is touched in such manner, the user's fingertips move toward each other in the direction of arrows 112 and 114 so that the distance between the two touches decreases (i.e., moves toward horizontal center line 136 of object 111). This touching action is similar to that used to pick up a physical object, and is translated in the present invention to a touchscreen display in order to impart an intuitive hand motion to movement of an object depicted thereon. As described hereinabove, after object 111 has been touched in this manner and selected, it is moved to a target location. This is accomplished provided such target location is in an allowed area for the object and it is identified by touching display screen 22 within the predetermined time period.
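Combining that model with the conditions just stated gives a compact test for the two-touch pick gesture. The sketch below assumes the RectModel from the previous example and represents each touch by its start and end points; it simply encodes the stated conditions (begin in the border, begin in different quadrants, and either end on or inside the perimeter or converge) and is not drawn from the patent itself.

    def is_pick_gesture(rect, touch_a, touch_b):
        """Decide whether two touches select the object modeled by rect.

        touch_a and touch_b are ((x0, y0), (x1, y1)) start/end point pairs,
        mirroring the conditions described for Fig. 6.
        """
        (ax0, ay0), (ax1, ay1) = touch_a
        (bx0, by0), (bx1, by1) = touch_b

        def dist2(p, q):
            return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

        # Both touches must begin in the border region outside the perimeter.
        if not (rect.in_border(ax0, ay0) and rect.in_border(bx0, by0)):
            return False

        # The touches must begin in different quadrants (e.g., opposite sides).
        if rect.quadrant(ax0, ay0) == rect.quadrant(bx0, by0):
            return False

        # Either both touches end on or just inside the perimeter...
        ends_inside = rect.inside(ax1, ay1) and rect.inside(bx1, by1)
        # ...or the fingers converge, the distance between them decreasing.
        converging = (dist2((ax1, ay1), (bx1, by1))
                      < dist2((ax0, ay0), (bx0, by0)))

        return ends_inside or converging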
An alternative embodiment for selecting an object in accordance with the present invention is depicted in Fig. 7, where an object 211 is similarly modeled as a rectangle 222 having a center 224, quadrants 226, 228, 230 and 232, and a border 246. In this alternative method, the predetermined manner of selecting object 211 involves moving a human digit (preferably an index finger) from within border 246 (adjacent a first corner 250 of rectangle 222) diametrically across rectangle 222. This movement ends within border 246 adjacent an opposing second corner 252, as shown by arrow 212. In this method, only a single touch is required to select object 211, thereby eliminating the need to touch the object twice within a predetermined time period. After object 211 is selected, the target location may then be identified with a single touch on display screen 22 in order to complete the move as in the previous embodiment.
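The single-stroke selection of Fig. 7 can be tested with the same model. The sketch below, again assuming the RectModel above, checks that a stroke begins in the border near one corner and ends in the border near the diagonally opposite corner; the nearest-corner heuristic is an assumption made for the example, not a detail taken from the patent.

    def is_diagonal_pick(rect, stroke):
        """Decide whether a single stroke selects the object per Fig. 7.

        stroke is ((x0, y0), (x1, y1)): where the digit went down and came up.
        The stroke should start in the border adjacent one corner (e.g., 250)
        and end in the border adjacent the diagonally opposite corner (252).
        """
        (x0, y0), (x1, y1) = stroke
        if not (rect.in_border(x0, y0) and rect.in_border(x1, y1)):
            return False

        corners = [(rect.left, rect.top), (rect.right, rect.bottom),
                   (rect.right, rect.top), (rect.left, rect.bottom)]
        diagonal = {0: 1, 1: 0, 2: 3, 3: 2}  # indices of opposite corners

        def nearest_corner(x, y):
            return min(range(4), key=lambda i: (corners[i][0] - x) ** 2
                                               + (corners[i][1] - y) ** 2)

        return diagonal[nearest_corner(x0, y0)] == nearest_corner(x1, y1)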
Fig. 8 depicts another alternative embodiment for selecting an object in accordance with the present invention in which an object 311 is again modeled as a rectangle 322 having a center 324, four quadrants 326, 328, 330 and 332, and a border 346. In this alternative embodiment, object 311 is selected by touching it in a circular motion substantially about the area thereof, as shown by arrow 312. More specifically, circular touch 312 preferably begins within border 346 surrounding object 311 and proceeds about perimeter 348 of object 311. Although touch 312 preferably follows border 346 around perimeter 348, it need not fall entirely within the shaded area of border 346 in order for object 311 to be selected. Following the circular motion to select object 311, movement is completed by touching the target location on display screen 22 within the aforementioned predetermined time period.

In addition to the embodiments described hereinabove, an object may be selected and moved by simultaneously touching the object and target location on display screen 22. For instance, in screen display 50 of Fig. 4, object 111 may be moved by touching object 111 with a fingertip at the same time that a second fingertip (e.g., 116) touches target location 118.

A flow chart depicting the logical steps for moving an object within display screen 22 using the touch method described herein is provided in Fig. 9. Starting at a function block 154, it will be understood that the user touches an object on opposite sides in the manner depicted in Fig. 6. After this has occurred, a decision block 156 determines whether the two touches took place within the predetermined time period. If the answer is NO at 156, then the routine is finished and it returns to step 158 without moving or selecting the object. If the answer is YES at 156, then a second decision block 160 determines whether the two touches began in different quadrants of the rectangular model. If the answer is NO at 160, then the routine is finished and it returns to step 158. If the answer at 160 is YES, then a third decision block 162 determines whether the touches began within the border surrounding the object. If the answer at 162 is NO, then the routine is finished and it returns to step 158. If the answer at 162 is YES, then a fourth decision block 164 determines whether the touches either move toward each other or end on or within the perimeter of the object. If the answer is NO at 164, the routine is finished and it returns to step 158. If the answer is YES at 164, then a function block selects the indicated object, as evidenced by highlighting or some other visual or aural manner.
After the object is selected, a decision block 168 determines whether a subsequent touch has occurred on the display screen. If the answer is NO at 168, then the object is deselected at function block 170, the routine is finished and it returns to step 158 without moving the object. If the answer is YES at 168, then a decision block 172 determines whether the subsequent touch on the display screen occurred within the predetermined time period after the object was selected. If the answer at 172 is NO, then the object is deselected at function block 170, the routine is finished and it returns to step 158. If the answer is YES at 172, then a decision block 174 determines whether the location indicated by the subsequent touch is an allowed location for the object. If the answer at 174 is NO, then the object is deselected at function block 170, the routine is finished and it returns to step 158. If the answer is YES at 174, then a function block 176 moves the selected object to the location indicated by the subsequent touch. Following movement of the object at block 176, the routine then returns to step 158.
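The post-selection branch of Fig. 9 (blocks 168 through 176) admits a similarly compact sketch, continuing the assumptions above. The Tap record and the is_allowed callback are hypothetical placeholders; the patent leaves the rules that make a location "allowed" to the application.

from dataclasses import dataclass
from typing import Callable, Optional, Tuple

@dataclass
class Tap:
    position: Tuple[float, float]
    time: float  # seconds

def complete_move(obj, tap: Optional[Tap], select_time: float,
                  is_allowed: Callable[[object, Tuple[float, float]], bool],
                  max_interval: float = 2.0) -> Optional[Tuple[float, float]]:
    # Block 168: no subsequent touch, so deselect without moving (block 170).
    if tap is None:
        return None
    # Block 172: the subsequent touch must occur within the predetermined
    # time period after the object was selected.
    if tap.time - select_time > max_interval:
        return None
    # Block 174: the indicated location must be an allowed location.
    if not is_allowed(obj, tap.position):
        return None
    # Block 176: move the selected object to the indicated location.
    return tap.position

In this sketch a None result stands for deselection at function block 170, and a returned position stands for the move performed at function block 176 before the routine returns to step 158.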
Having shown and described the preferred embodiment of the present invention, further adaptations of the apparatus and method for moving an object on a touchscreen display can be accomplished by appropriate modifications by one of ordinary skill in the art without departing from the scope of the invention.

What is claimed is:

Claims

1. A method of moving an object depicted on a touchscreen display of a computer-controlled device, comprising the following steps:
(a) selecting an object having an initial location on said touchscreen display by touching an area associated with said object in a predetermined manner;
(b) identifying a target location for said object on said touchscreen display; and
(c) moving said object from said initial location to said target location.
2. The method of claim 1, wherein said target location is identified by touching said touchscreen display at a desired location.
3. The method of claim 1, wherein said object is moved when said target location is identified within a predetermined time period after said object has been selected.
4. The method of claim 1, wherein said object is selected by touching first and second areas on said touchscreen display associated with said object.
5. The method of claim 4, wherein said object is selected when said first and second areas are touched within a predetermined time period.
6. The method of claim 4, said selecting step further comprising:
(a) contacting first and second points on said touchscreen display adjacent said object; and
(b) moving from said first and second contact points towards a center line of said object between said contact points.
7. The method of claim 6, said first and second contact points being located outside a perimeter of said object, wherein said object is selected by moving from said first and second contact points to new points within the perimeter of said object.
8. The method of claim 6, said first and second contact points being located outside a perimeter of said object, wherein said object is selected by moving from said first and second contact points to new points within a border of said object.
9. The method of claim 7, further comprising the steps of:
(a) defining a border about the perimeter of said object; and
(b) selecting said object when said first and second contact points are within said border.
10. The method of claim 7, wherein said first and second contact points are located on opposite sides of said object.
11. The method of claim 10, wherein said first and second contact points are established by separate digits of a user's hands.
12. The method of claim 4, wherein said first and second areas are on opposite sides of said object.
13. The method of claim 11, wherein said first and second areas are touched by a thumb and finger.
14. The method of claim 5, wherein said predetermined time period is approximately one second.
15. The method of claim 6, wherein said predetermined time period is approximately two seconds.
16. The method of claim 1, further comprising the step of identifying said object as being selected prior to said moving step.
17. The method of claim 1, further comprising the step of verifying said target location as being allowed for said object prior to said moving step.
18. The method of claim 1, further comprising the step of providing a model for each object depicted on said touchscreen display.
19. The method of claim 18, wherein said models encompass each object and any associated text.
20. The method of claim 18, wherein said models are rectangular in shape.
21. The method of claim 18, wherein each model is divided into four substantially equal quadrants.
22. The method of claim 18, wherein a border is provided surrounding a perimeter of each said model.
23. The method of claim 21, said selecting step further comprising contacting said touchscreen display on opposite quadrants of said model with a pair of human digits.
24. The method of claim 23, wherein said human digits move from initial contact points toward a center line of said model.
25. The method of claim 24, wherein said motion extends from outside a perimeter of said model to inside the perimeter of said model.
26. The method of claim 24, wherein said motion begins within a specified border located outside a perimeter of said model.
27. The method of claim 1, wherein said predetermined manner of touching comprises moving a finger in a circular motion substantially about a perimeter of said object.
28. The method of claim 18, said selecting step further comprising:
(a) touching said touchscreen display on a perimeter of said model with a human digit; and
(b) moving said human digit in a circular motion substantially about said model perimeter.
29. The method of claim 1, wherein said selecting, identifying, and moving steps are accomplished by simultaneously touching said object and said target location on said touchscreen.
30. The method of claim 1, said selecting step further comprising moving a human digit diametrically across said object.
31. The method of claim 18, said selecting step further comprising:
(a) touching said touchscreen display at a first corner of said model with a human digit;
(b) moving said human digit diametrically across said model so as to intersect a center thereof; and
(c) terminating movement of said human digit at a second corner of said model opposite said first corner.
32. A portable intelligent communications device, comprising:
(a) circuitry for performing telephony operations;
(b) a processing circuit;
(c) a memory circuit; and
(d) a touchscreen display; said processing circuit being coupled to said touchscreen display to control the depiction of objects thereon, wherein said processing circuit moves the location of an object depicted on said touchscreen display upon detection of a predetermined tactile gesture on said touchscreen display in an area associated with said object followed by a subsequent touch at a new location on said touchscreen display.
33. The portable intelligent communications device of claim 32, wherein said processing circuit operates to move the location of said object when said predetermined tactile gesture and said subsequent touch occur within a predetermined time period.
34. The portable intelligent communications device of claim 33, wherein said predetermined time period is two seconds.
35. The portable intelligent communications device of claim 33, wherein said predetermined tactile gesture on said touchscreen display comprises first and second touches on opposite sides of said object.
36. The portable intelligent communications device of claim 35, wherein said processing circuit recognizes an object as being selected for movement when said first and second touches occur within a predetermined time period.
37. The portable intelligent communications device of claim 36, wherein said predetermined time period is approximately one second.
38. The portable intelligent communications device of claim 35, wherein said first and second touches move toward a center line of said object between said touches.
39. The portable intelligent communications device of claim 35, wherein said processing circuit detects a selection of said object for movement when said first and second touches move from outside a perimeter of said object to points inside the perimeter of said object.
40. The portable intelligent communications device of claim 35, wherein said processing circuit detects a selection of said object for movement when said first and second touches move from outside a perimeter of said object to points within a border surrounding said object.
41. The portable intelligent communications device of claim 39, said processing circuit defining a border about the perimeter of said object, wherein said processing circuit detects a selection of said object for movement when said first and second touches occur within said border.
42. The portable intelligent communications device of claim 32, wherein said processing circuit identifies said object as being selected for movement prior to moving the location of said object.
43. The portable intelligent communications device of claim 32, wherein said processing circuit verifies the new location for said object as being permitted prior to moving the location of said object.
44. The portable intelligent communications device of claim 32, wherein said processing circuit provides a model for each object depicted on said touchscreen display.
45. The portable intelligent communications device of claim 44, wherein said model encompasses each object and any associated text.
46. The portable intelligent communications device of claim 44, said model for each object being divided into four substantially equal quadrants, wherein said processing circuit detects selection of an object for movement when contact on said touchscreen display on opposite quadrants of said model is recognized.
47. The portable intelligent communications device of claim 44, wherein a border is provided surrounding a perimeter of each said model.
48. The portable intelligent communications device of claim 46, wherein said contacts move toward a center line of said model therebetween.
49. The portable intelligent communications device of claim 48, wherein said motion extends from outside a perimeter of said model to inside the perimeter of said model.
50. The portable intelligent communications device of claim 48, wherein said motion begins within a specified border located outside a perimeter of said model.
51. The portable intelligent communications device of claim 46, wherein said first and second touches are made by a thumb and index finger.
52. The portable intelligent communications device of claim 32, wherein said predetermined tactile gesture on said touchscreen display comprises a circular motion substantially about a perimeter of said object.
53. The portable intelligent communications device of claim 52, wherein said processing circuit operates to move the location of said object when said circular motion and said subsequent touch occur within a predetermined time period.
54. The portable intelligent communications device of claim 32, wherein said object is selected and moved by simultaneously touching said object and said new location on said touchscreen display.
55. The portable intelligent communications device of claim 32, wherein said predetermined tactile gesture on said touchscreen display comprises moving a human digit diametrically across said object.
PCT/US1999/021301 1998-09-15 1999-09-15 Apparatus and method for moving objects on a touchscreen display WO2000016186A2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
DE19983569T DE19983569T1 (en) 1998-09-15 1999-09-15 Device and method for moving objects on a touchscreen display
JP2000570657A JP2002525705A (en) 1998-09-15 1999-09-15 Apparatus and method for moving an object on a touch screen display
AU62508/99A AU6250899A (en) 1998-09-15 1999-09-15 Apparatus and method for moving objects on a touchscreen display
HK02104016.0A HK1042359B (en) 1998-09-15 2002-05-29 Apparatus and method for moving objects on a touchscreen display

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/153,701 US20020018051A1 (en) 1998-09-15 1998-09-15 Apparatus and method for moving objects on a touchscreen display
US09/153,701 1998-09-15

Publications (2)

Publication Number Publication Date
WO2000016186A2 true WO2000016186A2 (en) 2000-03-23
WO2000016186A3 WO2000016186A3 (en) 2000-05-25

Family

ID=22548368

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1999/021301 WO2000016186A2 (en) 1998-09-15 1999-09-15 Apparatus and method for moving objects on a touchscreen display

Country Status (7)

Country Link
US (1) US20020018051A1 (en)
JP (1) JP2002525705A (en)
CN (1) CN1126021C (en)
AU (1) AU6250899A (en)
DE (1) DE19983569T1 (en)
HK (1) HK1042359B (en)
WO (1) WO2000016186A2 (en)

Families Citing this family (169)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9722766D0 (en) 1997-10-28 1997-12-24 British Telecomm Portable computers
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US9239673B2 (en) 1998-01-26 2016-01-19 Apple Inc. Gesturing with a multipoint sensing device
EP1128638A1 (en) * 2000-02-23 2001-08-29 Koninklijke Philips Electronics N.V. Device with a display panel and method for displaying data
ATE338300T1 (en) * 2000-05-11 2006-09-15 Nes Stewart Irvine ZEROCLICK
FI20010817A (en) * 2001-04-20 2003-02-14 Nokia Corp A method for displaying information on an electronic device display and an electronic device
US7093201B2 (en) * 2001-09-06 2006-08-15 Danger, Inc. Loop menu navigation apparatus and method
JP3847641B2 (en) * 2002-02-28 2006-11-22 株式会社ソニー・コンピュータエンタテインメント Information processing apparatus, information processing program, computer-readable recording medium storing information processing program, and information processing method
US7958455B2 (en) * 2002-08-01 2011-06-07 Apple Inc. Mode activated scrolling
US20050085215A1 (en) * 2003-10-21 2005-04-21 Nokia Corporation Method and related apparatus for emergency calling in a touch screen mobile phone from a touch screen and keypad lock active state
US20050090304A1 (en) * 2003-10-24 2005-04-28 Pokertek, Inc. System and method of displaying or obscuring electronic playing cards
CN100421064C (en) * 2003-12-19 2008-09-24 升达科技股份有限公司 Touch control device, control method and electronic products thereof
JP4213052B2 (en) * 2004-01-28 2009-01-21 任天堂株式会社 Game system using touch panel input
AU2005201050A1 (en) * 2004-03-11 2005-09-29 Aruze Corp. Gaming machine and program thereof
JP4855654B2 (en) * 2004-05-31 2012-01-18 ソニー株式会社 On-vehicle device, on-vehicle device information providing method, on-vehicle device information providing method program, and on-vehicle device information providing method program
US9552141B2 (en) 2004-06-21 2017-01-24 Apple Inc. Methods and apparatuses for operating a data processing system
ES2325264T3 (en) * 2004-06-21 2009-08-31 Weike (S) Pte Ltd. VIRTUAL CARD GAME SYSTEM.
US10201753B2 (en) 2004-07-16 2019-02-12 Universal Entertainment Corporation Gaming machine and program thereof
CN103365595B (en) * 2004-07-30 2017-03-01 苹果公司 Gesture for touch sensitive input devices
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
US7719523B2 (en) 2004-08-06 2010-05-18 Touchtable, Inc. Bounding box gesture recognition on a touch detecting interactive display
US7794324B2 (en) * 2004-09-13 2010-09-14 Pokertek, Inc. Electronic player interaction area with player customer interaction features
CN100339813C (en) * 2004-10-28 2007-09-26 京瓷美达株式会社 Electronic instrument and its display control method
JP2008527557A (en) * 2005-01-14 2008-07-24 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Moving an object presented by a touch input display device
CN101133385B (en) * 2005-03-04 2014-05-07 苹果公司 Hand held electronic device, hand held device and operation method thereof
US7672512B2 (en) * 2005-03-18 2010-03-02 Searete Llc Forms for completion with an electronic writing device
US8225231B2 (en) 2005-08-30 2012-07-17 Microsoft Corporation Aggregation of PC settings
CN102841713A (en) * 2005-09-15 2012-12-26 苹果公司 System and method for processing raw data of track pad device
US8018440B2 (en) 2005-12-30 2011-09-13 Microsoft Corporation Unintentional touch rejection
JP2008012199A (en) * 2006-07-10 2008-01-24 Aruze Corp Game system and image display control method thereof
CN101356493A (en) * 2006-09-06 2009-01-28 苹果公司 Portable electronic device for photo management
US20080168402A1 (en) 2007-01-07 2008-07-10 Christopher Blumenberg Application Programming Interfaces for Gesture Operations
US20080168478A1 (en) 2007-01-07 2008-07-10 Andrew Platzer Application Programming Interfaces for Scrolling
US7844915B2 (en) 2007-01-07 2010-11-30 Apple Inc. Application programming interfaces for scrolling operations
KR101349811B1 (en) * 2007-02-23 2014-01-10 엘지전자 주식회사 Mobile communication terminal and method of connecting internet using thereof
KR100863046B1 (en) * 2007-03-09 2008-10-13 엘지전자 주식회사 Method for displaying broadcast channel information and broadcast receiver capable of implementing the same
KR101531416B1 (en) 2007-09-13 2015-06-24 옵티스 셀룰러 테크놀로지, 엘엘씨 Method For Transmitting Uplink Signals
US20090146908A1 (en) * 2007-12-07 2009-06-11 Research In Motion Limited System and method for event-dependent state activation for a mobile communication device
CN101458585B (en) * 2007-12-10 2010-08-11 义隆电子股份有限公司 Touch control panel detecting method
CN101458586B (en) * 2007-12-11 2010-10-13 义隆电子股份有限公司 Method for operating objects on touch control screen by multi-fingers
JP4924402B2 (en) * 2007-12-14 2012-04-25 ブラザー工業株式会社 Control device and control program
US20090189869A1 (en) * 2007-12-20 2009-07-30 Seiko Epson Corporation Touch panel input device, control method of touch panel input device, media stored control program, and electronic device
US8395584B2 (en) * 2007-12-31 2013-03-12 Sony Corporation Mobile terminals including multiple user interfaces on different faces thereof configured to be used in tandem and related methods of operation
KR100943908B1 (en) 2008-02-19 2010-02-24 엘지전자 주식회사 Method For Transmitting and Receiving Control Information Through PDCCH
US8416196B2 (en) 2008-03-04 2013-04-09 Apple Inc. Touch event model programming interface
US8645827B2 (en) 2008-03-04 2014-02-04 Apple Inc. Touch event model
US8717305B2 (en) 2008-03-04 2014-05-06 Apple Inc. Touch event model for web pages
US20100087173A1 (en) * 2008-10-02 2010-04-08 Microsoft Corporation Inter-threading Indications of Different Types of Communication
US20100087169A1 (en) * 2008-10-02 2010-04-08 Microsoft Corporation Threading together messages with multiple common participants
US20100107100A1 (en) * 2008-10-23 2010-04-29 Schneekloth Jason S Mobile Device Style Abstraction
US20100105441A1 (en) * 2008-10-23 2010-04-29 Chad Aron Voss Display Size of Representations of Content
US8411046B2 (en) * 2008-10-23 2013-04-02 Microsoft Corporation Column organization of content
US8385952B2 (en) * 2008-10-23 2013-02-26 Microsoft Corporation Mobile communications device user interface
JP5036684B2 (en) * 2008-10-27 2012-09-26 シャープ株式会社 Portable information terminal
US20110193812A1 (en) * 2008-10-30 2011-08-11 Kaoru Uchida Portable terminal device, data manipulation processing method and data manipulation processing program
CN101770326B (en) * 2008-12-31 2012-07-25 北京联想软件有限公司 Realization method for moving object on touch screen and computing device
US9311112B2 (en) 2009-03-16 2016-04-12 Apple Inc. Event recognition
US9684521B2 (en) 2010-01-26 2017-06-20 Apple Inc. Systems having discrete and continuous gesture recognizers
US8566045B2 (en) 2009-03-16 2013-10-22 Apple Inc. Event recognition
US8285499B2 (en) 2009-03-16 2012-10-09 Apple Inc. Event recognition
US8175653B2 (en) 2009-03-30 2012-05-08 Microsoft Corporation Chromeless user interface
US8355698B2 (en) * 2009-03-30 2013-01-15 Microsoft Corporation Unlock screen
US8238876B2 (en) * 2009-03-30 2012-08-07 Microsoft Corporation Notifications
US8269736B2 (en) * 2009-05-22 2012-09-18 Microsoft Corporation Drop target gestures
KR101587211B1 (en) * 2009-05-25 2016-01-20 엘지전자 주식회사 Mobile Terminal And Method Of Controlling Same
US8836648B2 (en) * 2009-05-27 2014-09-16 Microsoft Corporation Touch pull-in gesture
US20110007019A1 (en) * 2009-07-07 2011-01-13 Nuvoton Technology Corporation Systems and methods for using tft-based lcd panels as capacitive touch sensors
US9182854B2 (en) * 2009-07-08 2015-11-10 Microsoft Technology Licensing, Llc System and method for multi-touch interactions with a touch sensitive screen
KR101608770B1 (en) * 2009-08-03 2016-04-04 엘지전자 주식회사 Mobile terminal and method for controlling the same
US8411050B2 (en) * 2009-10-14 2013-04-02 Sony Computer Entertainment America Touch interface having microphone to determine touch impact strength
US9069437B2 (en) 2009-12-18 2015-06-30 Lenovo (Beijing) Limited Window management method, apparatus and computing device
KR101092841B1 (en) 2009-12-29 2011-12-14 (주)엑시스 소프트웨어 엔지니어링 Computing apparatus for recognizing touch input
US8786559B2 (en) * 2010-01-06 2014-07-22 Apple Inc. Device, method, and graphical user interface for manipulating tables using multi-contact gestures
US8487889B2 (en) * 2010-01-15 2013-07-16 Apple Inc. Virtual drafting tools
US8386965B2 (en) * 2010-01-15 2013-02-26 Apple Inc. Techniques and systems for enhancing touch screen device accessibility through virtual containers and virtually enlarged boundaries
CN102770835B (en) 2010-01-20 2016-01-06 诺基亚公司 For organizing the method and apparatus of image item
US8261213B2 (en) 2010-01-28 2012-09-04 Microsoft Corporation Brush, carbon-copy, and fill gestures
US9411504B2 (en) 2010-01-28 2016-08-09 Microsoft Technology Licensing, Llc Copy and staple gestures
CA2788710A1 (en) * 2010-02-04 2011-08-11 Nokia Corporation User input
CN102147694B (en) * 2010-02-09 2016-05-04 康佳集团股份有限公司 A kind of method, system and embedded device of window sliding
US8769443B2 (en) * 2010-02-11 2014-07-01 Apple Inc. Touch inputs interacting with user interface items
US9965165B2 (en) 2010-02-19 2018-05-08 Microsoft Technology Licensing, Llc Multi-finger gestures
US9367205B2 (en) 2010-02-19 2016-06-14 Microsoft Technolgoy Licensing, Llc Radial menus with bezel gestures
US9310994B2 (en) * 2010-02-19 2016-04-12 Microsoft Technology Licensing, Llc Use of bezel as an input mechanism
US9454304B2 (en) 2010-02-25 2016-09-27 Microsoft Technology Licensing, Llc Multi-screen dual tap gesture
EP2565764B1 (en) * 2010-04-30 2020-10-07 Nec Corporation Information processing terminal and operation control method for same
CN101882043A (en) * 2010-06-08 2010-11-10 苏州瀚瑞微电子有限公司 Method for improving touch precision of edge of capacitance type touch screen
US8552999B2 (en) 2010-06-14 2013-10-08 Apple Inc. Control selection approximation
CN102375588B (en) * 2010-08-19 2016-01-20 上海博泰悦臻电子设备制造有限公司 By the method and apparatus that the gesture opertaing device of electronic equipment screen operates
US20120159395A1 (en) 2010-12-20 2012-06-21 Microsoft Corporation Application-launching interface for multiple modes
US20120159383A1 (en) 2010-12-20 2012-06-21 Microsoft Corporation Customization of an immersive environment
US8689123B2 (en) 2010-12-23 2014-04-01 Microsoft Corporation Application reporting in an application-selectable user interface
US8612874B2 (en) 2010-12-23 2013-12-17 Microsoft Corporation Presenting an application change through a tile
US9423951B2 (en) 2010-12-31 2016-08-23 Microsoft Technology Licensing, Llc Content-based snap point
US9383917B2 (en) 2011-03-28 2016-07-05 Microsoft Technology Licensing, Llc Predictive tiling
US9298363B2 (en) 2011-04-11 2016-03-29 Apple Inc. Region activation for touch sensitive surface
US9158445B2 (en) 2011-05-27 2015-10-13 Microsoft Technology Licensing, Llc Managing an immersive interface in a multi-application immersive environment
US9104307B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US8893033B2 (en) 2011-05-27 2014-11-18 Microsoft Corporation Application notifications
US20120304132A1 (en) 2011-05-27 2012-11-29 Chaitanya Dev Sareen Switching back to a previously-interacted-with application
US9658766B2 (en) 2011-05-27 2017-05-23 Microsoft Technology Licensing, Llc Edge gesture
US9104440B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
CN103246382B (en) * 2012-02-13 2017-03-01 联想(北京)有限公司 Control method and electronic equipment
CN102866841A (en) * 2011-07-04 2013-01-09 鸿富锦精密工业(深圳)有限公司 File dragging method and system
US20130016129A1 (en) * 2011-07-14 2013-01-17 Google Inc. Region-Specific User Input
US8863027B2 (en) * 2011-07-31 2014-10-14 International Business Machines Corporation Moving object on rendered display using collar
US8687023B2 (en) 2011-08-02 2014-04-01 Microsoft Corporation Cross-slide gesture to select and rearrange
US20130057587A1 (en) 2011-09-01 2013-03-07 Microsoft Corporation Arranging tiles
US8922575B2 (en) 2011-09-09 2014-12-30 Microsoft Corporation Tile cache
US9557909B2 (en) 2011-09-09 2017-01-31 Microsoft Technology Licensing, Llc Semantic zoom linguistic helpers
US20130067398A1 (en) * 2011-09-09 2013-03-14 Theresa B. Pittappilly Semantic Zoom
US10353566B2 (en) 2011-09-09 2019-07-16 Microsoft Technology Licensing, Llc Semantic zoom animations
US9146670B2 (en) 2011-09-10 2015-09-29 Microsoft Technology Licensing, Llc Progressively indicating new content in an application-selectable user interface
US8933952B2 (en) 2011-09-10 2015-01-13 Microsoft Corporation Pre-rendering new content for an application-selectable user interface
US9244802B2 (en) 2011-09-10 2016-01-26 Microsoft Technology Licensing, Llc Resource user interface
JP2012027940A (en) * 2011-10-05 2012-02-09 Toshiba Corp Electronic apparatus
DE102011116175B4 (en) * 2011-10-14 2015-03-26 Volkswagen Aktiengesellschaft Method and device for providing a user interface, in particular in a vehicle
US9223472B2 (en) 2011-12-22 2015-12-29 Microsoft Technology Licensing, Llc Closing applications
US9128605B2 (en) 2012-02-16 2015-09-08 Microsoft Technology Licensing, Llc Thumbnail-image selection of applications
USD729760S1 (en) * 2012-08-27 2015-05-19 Aiphone Co., Ltd. Interphone
US9218118B2 (en) 2012-09-11 2015-12-22 Apple Inc. Media player playlist management
US9558278B2 (en) 2012-09-11 2017-01-31 Apple Inc. Integrated content recommendation
US9582122B2 (en) 2012-11-12 2017-02-28 Microsoft Technology Licensing, Llc Touch-sensitive bezel techniques
EP2741476A1 (en) * 2012-12-10 2014-06-11 Telefonaktiebolaget L M Ericsson (publ) Mobile device and method of operation
US11096668B2 (en) 2013-03-13 2021-08-24 Samsung Electronics Co., Ltd. Method and ultrasound apparatus for displaying an object
WO2014142468A1 (en) * 2013-03-13 2014-09-18 Samsung Electronics Co., Ltd. Method of providing copy image and ultrasound apparatus therefor
CN103513914B (en) * 2013-03-13 2016-05-11 展讯通信(上海)有限公司 The method of toch control of application and device
US9450952B2 (en) 2013-05-29 2016-09-20 Microsoft Technology Licensing, Llc Live tiles without application-code execution
US9020567B2 (en) * 2013-04-05 2015-04-28 Blackberry Limited Authentication using fingerprint sensor in gesture path
US20140359538A1 (en) * 2013-05-28 2014-12-04 General Electric Company Systems and methods for moving display objects based on user gestures
JP5511040B2 (en) * 2013-05-29 2014-06-04 Necカシオモバイルコミュニケーションズ株式会社 Terminal device and program
JP5686422B2 (en) * 2013-05-29 2015-03-18 Necカシオモバイルコミュニケーションズ株式会社 Terminal device and program
US9733716B2 (en) 2013-06-09 2017-08-15 Apple Inc. Proxy gesture recognizer
KR20150014084A (en) * 2013-07-29 2015-02-06 삼성전자주식회사 Device based on touch screen and method for controlling object thereof
CN103488392A (en) * 2013-09-03 2014-01-01 小米科技有限责任公司 Editing method and device used for editable content of touch screen, and terminal
CN103530040B (en) * 2013-10-22 2016-03-30 腾讯科技(深圳)有限公司 Object element moving method, device and electronic equipment
US9477337B2 (en) 2014-03-14 2016-10-25 Microsoft Technology Licensing, Llc Conductive trace routing for display and bezel sensors
KR102298602B1 (en) 2014-04-04 2021-09-03 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 Expandable application representation
EP3129846A4 (en) 2014-04-10 2017-05-03 Microsoft Technology Licensing, LLC Collapsible shell cover for computing device
EP3129847A4 (en) 2014-04-10 2017-04-19 Microsoft Technology Licensing, LLC Slider cover for computing device
US10592080B2 (en) 2014-07-31 2020-03-17 Microsoft Technology Licensing, Llc Assisted presentation of application windows
US10678412B2 (en) 2014-07-31 2020-06-09 Microsoft Technology Licensing, Llc Dynamic joint dividers for application windows
US10254942B2 (en) 2014-07-31 2019-04-09 Microsoft Technology Licensing, Llc Adaptive sizing and positioning of application windows
US10291597B2 (en) 2014-08-14 2019-05-14 Cisco Technology, Inc. Sharing resources across multiple devices in online meetings
US10642365B2 (en) 2014-09-09 2020-05-05 Microsoft Technology Licensing, Llc Parametric inertia and APIs
CN106662891B (en) 2014-10-30 2019-10-11 微软技术许可有限责任公司 Multi-configuration input equipment
US10542126B2 (en) 2014-12-22 2020-01-21 Cisco Technology, Inc. Offline virtual participation in an online conference meeting
US9948786B2 (en) 2015-04-17 2018-04-17 Cisco Technology, Inc. Handling conferences using highly-distributed agents
US10291762B2 (en) 2015-12-04 2019-05-14 Cisco Technology, Inc. Docking station for mobile computing devices
US9762709B1 (en) * 2016-03-10 2017-09-12 Cisco Technology, Inc. Unibody desk telephone
AU201612400S (en) * 2016-05-03 2016-06-02 C Rafin & Co Pty Ltd Health Information Communication Device
US10574609B2 (en) 2016-06-29 2020-02-25 Cisco Technology, Inc. Chat room access control
US10592867B2 (en) 2016-11-11 2020-03-17 Cisco Technology, Inc. In-meeting graphical user interface display using calendar information and system
US10516707B2 (en) 2016-12-15 2019-12-24 Cisco Technology, Inc. Initiating a conferencing meeting using a conference room device
US10515117B2 (en) 2017-02-14 2019-12-24 Cisco Technology, Inc. Generating and reviewing motion metadata
US9942519B1 (en) 2017-02-21 2018-04-10 Cisco Technology, Inc. Technologies for following participants in a video conference
US10440073B2 (en) 2017-04-11 2019-10-08 Cisco Technology, Inc. User interface for proximity based teleconference transfer
US10375125B2 (en) 2017-04-27 2019-08-06 Cisco Technology, Inc. Automatically joining devices to a video conference
US10404481B2 (en) 2017-06-06 2019-09-03 Cisco Technology, Inc. Unauthorized participant detection in multiparty conferencing by comparing a reference hash value received from a key management server with a generated roster hash value
US10375474B2 (en) 2017-06-12 2019-08-06 Cisco Technology, Inc. Hybrid horn microphone
US10477148B2 (en) 2017-06-23 2019-11-12 Cisco Technology, Inc. Speaker anticipation
US10516709B2 (en) 2017-06-29 2019-12-24 Cisco Technology, Inc. Files automatically shared at conference initiation
US10706391B2 (en) 2017-07-13 2020-07-07 Cisco Technology, Inc. Protecting scheduled meeting in physical room
US10091348B1 (en) 2017-07-25 2018-10-02 Cisco Technology, Inc. Predictive model for voice/video over IP calls
US10771621B2 (en) 2017-10-31 2020-09-08 Cisco Technology, Inc. Acoustic echo cancellation based sub band domain active speaker detection for audio and video conferencing applications
CN112204785A (en) * 2018-05-30 2021-01-08 日产自动车株式会社 Fuel cell system and method for operating the same
DK201970535A1 (en) 2019-05-06 2020-12-21 Apple Inc Media browsing user interface with intelligently selected representative media items
DK202070613A1 (en) 2020-02-14 2021-10-15 Apple Inc User interfaces for workout content

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61267128A (en) * 1985-05-21 1986-11-26 Sharp Corp Display erasure system
JP3256994B2 (en) * 1991-10-30 2002-02-18 富士通株式会社 Display target movement method by touch input
US5825352A (en) * 1996-01-04 1998-10-20 Logitech, Inc. Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad
JPH11191036A (en) * 1997-12-26 1999-07-13 Yokogawa Electric Corp Window moving device

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4746770A (en) * 1987-02-17 1988-05-24 Sensor Frame Incorporated Method and apparatus for isolating and manipulating graphic objects on computer video monitor
EP0528597A2 (en) * 1991-08-16 1993-02-24 Sun Microsystems, Inc. Apparatus and methods for moving/copying objects using destination and/or source bins
EP0536715A2 (en) * 1991-10-07 1993-04-14 Fujitsu Limited An apparatus for manipulating an object displayed on a display device
US5627567A (en) * 1993-04-27 1997-05-06 Hewlett-Packard Company Method and apparatus for adaptive touch recognition in a touch sensitive user interface
US5670987A (en) * 1993-09-21 1997-09-23 Kabushiki Kaisha Toshiba Virtual manipulating apparatus and method
EP0667567A2 (en) * 1993-12-30 1995-08-16 Xerox Corporation Apparatus and method for supporting the implicit structure of freeform lists, outlines, text, tables, and diagrams in a gesture-based input system and editing system
EP0690368A2 (en) * 1994-06-29 1996-01-03 International Business Machines Corporation A pen-based computer system
EP0698845A1 (en) * 1994-07-25 1996-02-28 International Business Machines Corporation Apparatus and method for marking text on a display screen in a personal communication device
US5760773A (en) * 1995-01-06 1998-06-02 Microsoft Corporation Methods and apparatus for interacting with data objects using action handles

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
DATABASE WPI Section EI, Week 199938 Derwent Publications Ltd., London, GB; Class T01, AN 1999-454132 XP002124473 & JP 11 191036 A (YOKOGAWA DENKI KK), 13 July 1999 (1999-07-13) *
PATENT ABSTRACTS OF JAPAN vol. 011, no. 121 (P-568), 16 April 1987 (1987-04-16) & JP 61 267128 A (SHARP CORP), 26 November 1986 (1986-11-26) *
PATENT ABSTRACTS OF JAPAN vol. 017, no. 488 (P-1606), 3 September 1993 (1993-09-03) & JP 05 119946 A (FUJITSU LTD), 18 May 1993 (1993-05-18) *
PEDERSEN E R ET AL: "TIVOLI: AN ELECTRONIC WHITEBOARD FOR INFORMAL WORKGROUP MEETINGS" PROCEEDINGS OF THE CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS. (INTERCHI),US,READING, ADDISON WESLEY, 24 - 29 April 1993, page 391-398 XP000473784 *
REKIMOTO J: "A MULTIPLE DEVICE APPROACH FOR SUPPORTING WHITEBOARD-BASED INTERACTIONS" CHI CONFERENCE PROCEEDINGS. HUMAN FACTORS IN COMPUTING SYSTEMS,US,NEW YORK, NY: ACM, 18 - 23 April 1998, page 344-351 XP000780809 ISBN: 0-89791-975-0 *

Cited By (94)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7462798B2 (en) 2005-04-27 2008-12-09 Aruze Corp. Gaming machine
EP1717770A1 (en) * 2005-04-27 2006-11-02 Aruze Corp. gaming machine
US9728044B2 (en) 2005-04-27 2017-08-08 Universal Entertainment Corporation Controlling method of a gaming machine
CN100397321C (en) * 2005-05-31 2008-06-25 富士通天株式会社 Map display device and map display method
WO2007019767A1 (en) * 2005-08-12 2007-02-22 Huawei Technologies Co., Ltd. A mobile terminal and its keyboard, a method for selecting a window, a method for allocating or activating a fix point on the screen
EP1955904A1 (en) * 2005-10-31 2008-08-13 Toyota Jidosha Kabushiki Kaisha Parking support device
EP1955904A4 (en) * 2005-10-31 2009-12-02 Toyota Motor Co Ltd Parking support device
US8035531B2 (en) 2005-10-31 2011-10-11 Toyota Jidosha Kabushiki Kaisha Parking support device
EP2390154A1 (en) * 2005-10-31 2011-11-30 Toyota Jidosha Kabushiki Kaisha Parking support device
US8487783B2 (en) 2005-10-31 2013-07-16 Toyota Jidosha Kabushiki Kaisha Parking support device
US9703474B2 (en) 2005-11-21 2017-07-11 Core Wireless Licensing S.A.R.L. Gesture based document editor
US8643605B2 (en) 2005-11-21 2014-02-04 Core Wireless Licensing S.A.R.L Gesture based document editor
WO2007057770A3 (en) * 2005-11-21 2008-03-06 Nokia Corp Gesture based document editor
US10359907B2 (en) 2005-12-30 2019-07-23 Apple Inc. Portable electronic device with interface reconfiguration mode
US9933913B2 (en) 2005-12-30 2018-04-03 Apple Inc. Portable electronic device with interface reconfiguration mode
US11650713B2 (en) 2005-12-30 2023-05-16 Apple Inc. Portable electronic device with interface reconfiguration mode
US11449194B2 (en) 2005-12-30 2022-09-20 Apple Inc. Portable electronic device with interface reconfiguration mode
US10536819B2 (en) 2006-09-06 2020-01-14 Apple Inc. Missed telephone call management for a portable multifunction device
US10313505B2 (en) 2006-09-06 2019-06-04 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US8255003B2 (en) 2006-09-06 2012-08-28 Apple Inc. Missed telephone call management for a portable multifunction device
US8135389B2 (en) 2006-09-06 2012-03-13 Apple Inc. Missed telephone call management for a portable multifunction device
US8504947B2 (en) 2006-09-06 2013-08-06 Apple Inc. Deletion gestures on a portable multifunction device
US10778828B2 (en) 2006-09-06 2020-09-15 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US8014760B2 (en) 2006-09-06 2011-09-06 Apple Inc. Missed telephone call management for a portable multifunction device
US11039283B2 (en) 2006-09-06 2021-06-15 Apple Inc. User interfaces for a messaging application
US11736602B2 (en) 2006-09-06 2023-08-22 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US11240362B2 (en) 2006-09-06 2022-02-01 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US8090087B2 (en) 2006-10-26 2012-01-03 Apple Inc. Method, system, and graphical user interface for making conference calls
US8972904B2 (en) 2007-01-07 2015-03-03 Apple Inc. Portable multifunction device, method, and graphical user interface for conference calling
US8091045B2 (en) 2007-01-07 2012-01-03 Apple Inc. System and method for managing lists
US10254949B2 (en) 2007-01-07 2019-04-09 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
WO2008085768A3 (en) * 2007-01-07 2008-09-18 Apple Inc System and method for moving list items on a touch screen
US7975242B2 (en) 2007-01-07 2011-07-05 Apple Inc. Portable multifunction device, method, and graphical user interface for conference calling
US10732821B2 (en) 2007-01-07 2020-08-04 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US11169691B2 (en) 2007-01-07 2021-11-09 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US11405507B2 (en) 2007-01-07 2022-08-02 Apple Inc. Portable multifunction device, method, and graphical user interface for conference calling
US11586348B2 (en) 2007-01-07 2023-02-21 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US10999442B2 (en) 2007-01-07 2021-05-04 Apple Inc. Portable multifunction device, method, and graphical user interface for conference calling
US9325852B2 (en) 2007-01-07 2016-04-26 Apple Inc. Portable multifunction device, method, and graphical user interface for conference calling
US10320987B2 (en) 2007-01-07 2019-06-11 Apple Inc. Portable multifunction device, method, and graphical user interface for conference calling
US11743390B2 (en) 2007-01-07 2023-08-29 Apple Inc. Portable multifunction device, method, and graphical user interface for conference calling
US9706054B2 (en) 2007-01-07 2017-07-11 Apple Inc. Portable multifunction device, method, and graphical user interface for conference calling
US10761691B2 (en) 2007-06-29 2020-09-01 Apple Inc. Portable multifunction device with animated user interface transitions
US11507255B2 (en) 2007-06-29 2022-11-22 Apple Inc. Portable multifunction device with animated sliding user interface transitions
US11861138B2 (en) 2007-09-04 2024-01-02 Apple Inc. Application menu user interface
US11604559B2 (en) 2007-09-04 2023-03-14 Apple Inc. Editing interface
US11010017B2 (en) 2007-09-04 2021-05-18 Apple Inc. Editing interface
US11126321B2 (en) 2007-09-04 2021-09-21 Apple Inc. Application menu user interface
US10620780B2 (en) 2007-09-04 2020-04-14 Apple Inc. Editing interface
US10628028B2 (en) 2008-01-06 2020-04-21 Apple Inc. Replacing display of icons in response to a gesture
US9606715B2 (en) 2008-09-30 2017-03-28 Apple Inc. Touch screen device, method, and graphical user interface for moving on-screen objects without using a cursor
US8284170B2 (en) 2008-09-30 2012-10-09 Apple Inc. Touch screen device, method, and graphical user interface for moving on-screen objects without using a cursor
US10209877B2 (en) 2008-09-30 2019-02-19 Apple Inc. Touch screen device, method, and graphical user interface for moving on-screen objects without using a cursor
US8780082B2 (en) 2008-09-30 2014-07-15 Apple Inc. Touch screen device, method, and graphical user interface for moving on-screen objects without using a cursor
US10788965B2 (en) 2009-09-22 2020-09-29 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US8863016B2 (en) 2009-09-22 2014-10-14 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US10564826B2 (en) 2009-09-22 2020-02-18 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US10282070B2 (en) 2009-09-22 2019-05-07 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US11334229B2 (en) 2009-09-22 2022-05-17 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US8464173B2 (en) 2009-09-22 2013-06-11 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US8456431B2 (en) 2009-09-22 2013-06-04 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US8766928B2 (en) 2009-09-25 2014-07-01 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US11366576B2 (en) 2009-09-25 2022-06-21 Apple Inc. Device, method, and graphical user interface for manipulating workspace views
US10928993B2 (en) 2009-09-25 2021-02-23 Apple Inc. Device, method, and graphical user interface for manipulating workspace views
US10254927B2 (en) 2009-09-25 2019-04-09 Apple Inc. Device, method, and graphical user interface for manipulating workspace views
US9310907B2 (en) 2009-09-25 2016-04-12 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US8780069B2 (en) 2009-09-25 2014-07-15 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US11947782B2 (en) 2009-09-25 2024-04-02 Apple Inc. Device, method, and graphical user interface for manipulating workspace views
US8832585B2 (en) 2009-09-25 2014-09-09 Apple Inc. Device, method, and graphical user interface for manipulating workspace views
US8799826B2 (en) 2009-09-25 2014-08-05 Apple Inc. Device, method, and graphical user interface for moving a calendar entry in a calendar application
US8539386B2 (en) 2010-01-26 2013-09-17 Apple Inc. Device, method, and graphical user interface for selecting and moving objects
US8539385B2 (en) 2010-01-26 2013-09-17 Apple Inc. Device, method, and graphical user interface for precise positioning of objects
US8612884B2 (en) 2010-01-26 2013-12-17 Apple Inc. Device, method, and graphical user interface for resizing objects
US8677268B2 (en) 2010-01-26 2014-03-18 Apple Inc. Device, method, and graphical user interface for resizing objects
US11281368B2 (en) 2010-04-07 2022-03-22 Apple Inc. Device, method, and graphical user interface for managing folders with multiple pages
US11809700B2 (en) 2010-04-07 2023-11-07 Apple Inc. Device, method, and graphical user interface for managing folders with multiple pages
US10788953B2 (en) 2010-04-07 2020-09-29 Apple Inc. Device, method, and graphical user interface for managing folders
US11500516B2 (en) 2010-04-07 2022-11-15 Apple Inc. Device, method, and graphical user interface for managing folders
US9098182B2 (en) 2010-07-30 2015-08-04 Apple Inc. Device, method, and graphical user interface for copying user interface objects between content regions
US8972879B2 (en) 2010-07-30 2015-03-03 Apple Inc. Device, method, and graphical user interface for reordering the front-to-back positions of objects
US9081494B2 (en) 2010-07-30 2015-07-14 Apple Inc. Device, method, and graphical user interface for copying formatting attributes
US9626098B2 (en) 2010-07-30 2017-04-18 Apple Inc. Device, method, and graphical user interface for copying formatting attributes
CN102368199A (en) * 2011-10-25 2012-03-07 中兴通讯股份有限公司 File management method and device for electronic equipment with touch screen, and electronic equipment
US10972600B2 (en) 2013-10-30 2021-04-06 Apple Inc. Displaying relevant user interface objects
US10250735B2 (en) 2013-10-30 2019-04-02 Apple Inc. Displaying relevant user interface objects
US11316968B2 (en) 2013-10-30 2022-04-26 Apple Inc. Displaying relevant user interface objects
US11323559B2 (en) 2016-06-10 2022-05-03 Apple Inc. Displaying and updating a set of application views
US10739974B2 (en) 2016-06-11 2020-08-11 Apple Inc. Configuring context-specific user interfaces
US11733656B2 (en) 2016-06-11 2023-08-22 Apple Inc. Configuring context-specific user interfaces
US11073799B2 (en) 2016-06-11 2021-07-27 Apple Inc. Configuring context-specific user interfaces
US11816325B2 (en) 2016-06-12 2023-11-14 Apple Inc. Application shortcuts for carplay
US11675476B2 (en) 2019-05-05 2023-06-13 Apple Inc. User interfaces for widgets
US11340757B2 (en) 2019-05-06 2022-05-24 Apple Inc. Clock faces for an electronic device
US11131967B2 (en) 2019-05-06 2021-09-28 Apple Inc. Clock faces for an electronic device

Also Published As

Publication number Publication date
DE19983569T1 (en) 2001-10-04
HK1042359A1 (en) 2002-08-09
JP2002525705A (en) 2002-08-13
CN1126021C (en) 2003-10-29
CN1326564A (en) 2001-12-12
US20020018051A1 (en) 2002-02-14
AU6250899A (en) 2000-04-03
WO2000016186A3 (en) 2000-05-25
HK1042359B (en) 2004-02-27

Similar Documents

Publication Publication Date Title
US20020018051A1 (en) Apparatus and method for moving objects on a touchscreen display
US6259436B1 (en) Apparatus and method for determining selection of touchable items on a computer touchscreen by an imprecise touch
US10353570B1 (en) Thumb touch interface
US8471822B2 (en) Dual-sided track pad
KR101424294B1 (en) Multi-touch uses, gestures, and implementation
JP5983503B2 (en) Information processing apparatus and program
US9292111B2 (en) Gesturing with a multipoint sensing device
CN106909304B (en) Method and apparatus for displaying graphical user interface
US6157379A (en) Apparatus and method of formatting a list for display on a touchscreen
US9348458B2 (en) Gestures for touch sensitive input devices
EP2317422B1 (en) Terminal and method for entering command in the terminal
JP5456529B2 (en) Method and computer system for manipulating graphical user interface objects
US6335725B1 (en) Method of partitioning a touch screen for data input
US20050162402A1 (en) Methods of interacting with a computer using a finger(s) touch sensing input device with visual feedback
US10509549B2 (en) Interface scanning for disabled users
US20150212591A1 (en) Portable electronic apparatus, and a method of controlling a user interface thereof
TWI482077B (en) Electronic device, method for viewing desktop thereof, and computer program product therof
JP2009525538A (en) Gesture using multi-point sensing device
WO2013003105A1 (en) Electronic device and method with dual mode rear touch pad
EP2065794A1 (en) Touch sensor for a display screen of an electronic device
KR20210005753A (en) Method of selection of a portion of a graphical user interface
CN104360813A (en) Display equipment and information processing method thereof
KR100381583B1 (en) Method for transmitting a user data in personal digital assistant
JPH11305933A (en) Input device and input method
WO2017147994A1 (en) Task management method and system based on pressure touch

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 99813289.6

Country of ref document: CN

AK Designated states

Kind code of ref document: A2

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
AK Designated states

Kind code of ref document: A3

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
ENP Entry into the national phase

Ref document number: 2000 570657

Country of ref document: JP

Kind code of ref document: A

RET De translation (de og part 6b)

Ref document number: 19983569

Country of ref document: DE

Date of ref document: 20011004

WWE Wipo information: entry into national phase

Ref document number: 19983569

Country of ref document: DE

122 Ep: pct application non-entry in european phase