WO1994029788A1 - A method for utilizing a low resolution touch screen system in a high resolution graphics environment - Google Patents

A method for utilizing a low resolution touch screen system in a high resolution graphics environment

Info

Publication number
WO1994029788A1
WO1994029788A1 (application PCT/US1994/006755)
Authority
WO
WIPO (PCT)
Prior art keywords
character
resolution
display
touch screen
zoom window
Prior art date
Application number
PCT/US1994/006755
Other languages
French (fr)
Inventor
Kevin P. Staggs
William B. Kilgore
Original Assignee
Honeywell Inc.
Priority date
Filing date
Publication date
Application filed by Honeywell Inc. filed Critical Honeywell Inc.
Publication of WO1994029788A1 publication Critical patent/WO1994029788A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04805Virtual magnifying lens, i.e. window or frame movable on top of displayed information to enlarge it for better reading or selection

Definitions

  • touch screen systems are utilized which allow direct information input to an information processing apparatus from a coordinate position on a display screen by pointing to a desired position on a display of a cathode ray tube (CRT) display device (or some other like device).
  • CRT cathode ray tube
  • This technique is well established and has become the accepted standard for process operator interaction today.
  • this technology utilizes a low resolution infrared LED touch screen with a resolution of 127 by 95 to select targets on a display with a resolution of 640 by 448. Even though the touch screen resolution is lower than that of the display, selecting targets is not a problem since the minimum target size is 8 by 8 pixels. This results in a target resolution of 80 by 56, which is less than the resolution of the touch screen.
  • the resultant target resolution is 160 by 128.
  • targets on this high resolution display cannot be selected directly with the current touch screen resolution.
  • the invention described in the related application identified above provides a system that zooms up the area around the immediate touch area. This technique works well except when the data immediately under the touch area is changing dynamically. When this situation occurs, the new data cannot be drawn because the zoomed up data covers where the new data should be drawn.
  • the present invention is an improvement to the invention identified in the related applications section above and corrects this condition.
  • a method of the present invention selects a character being displayed on a display using a pointer element.
  • the display has a first resolution and a touch screen system associated with the display has a second resolution.
  • the character has a third resolution resulting in a target resolution, such that the target resolution is higher than the resolution of the touch screen system.
  • the touch screen system transmits, to a data processing unit processing the display, an X and Y coordinate signal of the character pointed to.
  • the method comprises the steps of: when a character is pointed to, accepting an initial X and Y coordinate value from the touch screen system, and calculating a corresponding X and Y coordinate value of the display.
  • the display is zoomed a predetermined number of characters around the initial X and Y coordinate values into a zoom window, the zoom window having a predetermined size and having a predetermined location relative to an initial entry point of the pointer element. If the character to be selected is not over a cursor, the cursor being displayed in the center of the zoom window, the pointer element is moved towards the character to be selected; otherwise the pointer is withdrawn. Updated X and Y coordinate position values of the pointer element are continually accepted on a periodic basis.
  • a new center point of the display in the zoom window is calculated, the new center point following the pointer element at a predetermined speed with respect to the speed of the pointer element (a short sketch of this mapping and tracking follows this list).
  • Zoomed display data around the new center point is displayed in the zoom window.
  • the pointer is moved towards the character to be selected and the method is repeated until the character to be selected is in the center of the zoom window, i.e., in proximity or coincident with the cursor.
  • the pointer element is withdrawn, the new center point calculated previously having the X and Y coordinate values of the character to be selected, thereby permitting a desired character to be selected when the target resolution is higher than the resolution of the touch screen system.
  • Figure 2 shows a block diagram of common elements of each physical module of the process control system of Figure 1;
  • Figure 3 shows a functional block diagram of a typical physical module of the process control system
  • Figure 4 shows a partial, functional block diagram of the existing system and the opened system of the preferred embodiment
  • Figure 5 shows a functional block diagram of an open operator station of the preferred embodiment
  • Figure 6 shows a block diagram of a graphics card of the preferred embodiment of the universal station
  • Figure 7, which comprises Figures 7A and 7B, shows examples of screen displays of the display unit of the process control system;
  • Figure 8 shows an overview of the operation of the method of the present invention;
  • Figure 9, which comprises Figures 9A and 9B, shows a flow diagram of the logic of the graphics card 150 for implementing the method of the present invention;
  • Figure 10 shows an example of the display in the zoom window which follows the stylus (or pointer) at half speed (or half distance);
  • Figure 11, which comprises Figures 11A and 11B, shows a flow diagram of the event handler logic of a co-processor of the open operator station of the preferred embodiment of the present invention;
  • Figure 12 shows a partial display screen showing a partial row of characters with the corresponding display pixel value and beam of the touch screen system; and Figure 13 shows a partial expanded character display of Figure 12 according to the method of the present invention.
  • DETAILED DESCRIPTION Before describing the method of the present invention, it will be helpful to understand the system environment in which the invention is utilized. Referring to Figure 1, there is shown a block diagram of a process control system 10 of the preferred embodiment in which the present invention can be found.
  • the process control system 10 includes a plant control network 11, and connected thereto is a data highway 12, which permits a process controller 20' to be connected thereto.
  • additional process controllers 20' can be operatively connected to the plant control network 11 via a corresponding highway gateway 601 and a corresponding data highway 12.
  • UCN universal control network
  • NIM network interface module
  • additional process controllers 20 can be operatively connected to the plant control network 11 via a corresponding UCN 14 and a corresponding NIM 602.
  • the plant control network 11 includes a plurality of physical modules, which include a universal operator station (US) 122, an application module (AM) 124, a history module (HM) 126, a computer module (CM) 128, and duplicates (backup or secondary) of these modules (and additional types of modules, not shown) as necessary to perform the required control/supervisory function of the process being controlled.
  • Each of these physical modules is operatively connected to a local control network (LCN) 120 which permits each of these modules to communicate with each other as necessary.
  • the NIM 602 and HG 601 provide an interface between the LCN 120 and the UCN 14, and the LCN 120 and the data highway 12, respectively.
  • Physical modules 122, 124, 126, 128,... of network 11 of the preferred embodiment are of various specialized functional types.
  • Each physical module is the peer, or equivalent, of the other in terms of right of access to the network's communication medium, or LCN 120, for the purpose of transmitting data to other physical modules of network 11.
  • a history module (HM) 126 provides mass data storage capability.
  • the history module 126 includes at least one conventional disk mass storage device such as a Winchester disk, which disk storage device provides a large volume of nonvolatile storage capability for binary data.
  • the types of data stored by such a mass storage device are typically trend histories, event histories, ....or data from which such histories can be determined, data that constitutes or forms CRT type displays, copies of programs for the physical modules....
  • An application module (AM) 124 provides additional data processing capability in support of the process control functions performed by the controllers associated with the process control subsystem 20, 20', such as data acquisition, alarming, batch history collection, and provides continuous control computational facilities when needed.
  • the data processing capability of the application module 124 is provided by a processor (not shown) and a memory (not shown) associated with the module.
  • the local control network 120 is a high-speed, bit serial, dual redundant communication network that interconnects all the physical modules of plant control network 11.
  • LCN 120 provides the only data transfer path between the principal sources of data, such as highway gateway module 601, application module 124, and history module 126, and principal users of such data, such as universal operator station module 122, computer module 128, and application module 124.
  • LCN 120 also provides the communication medium over which large blocks of data, such as memory images, can be moved from one physical module such as history module 126 to universal station module 122.
  • LCN 120 is dual redundant in that it consists of two coaxial cables that permit the serial transmission of binary signals over both cables.
  • each of the physical modules includes a module central processor unit 38 and a module memory 40, a random-access memory (not shown), and such additional controller devices, or units (not shown), which are configured to provide the desired functionality of that type of module, i.e., that of the operator station 122, for example.
  • the data-processing capabilities of each module's CPU 38 and module memory 40 create a distributed processing environment which provides for improved reliability and performance of network 11 and process control system 10. The reliability of network 11 and system 10 is improved because, if one physical module of network 11 fails, the other physical modules will remain operational.
  • network 11 as a whole is not disabled by such an occurrence as would be the case in centralized systems.
  • Performance is improved by this distributed environment in that throughput and fast operator response times result from the increased computer processing resources, and the concurrency and parallelism of the data-processing capabilities of the system.
  • each physical module includes the bus interface unit, BIU, 32 which is connected to the LCN 120 by the transceiver 34.
  • Each physical module is also provided with the module bus 36 which, in the preferred embodiment, is capable of transmitting 16 bits of data in parallel, between the module CPU 38 and the module memory 40.
  • Other units, utilized to tailor each type of physical module to satisfy its functional requirements, are operatively connected to module bus 36 so that each such unit can communicate with the other units of the physical module via its module bus 36.
  • the BIU 32 of the physical module initiates the transmission of data over LCN 120. In the preferred embodiment, all transmissions by a BIU 32 are transmitted over the coaxial cables which, in the preferred embodiment, form the LCN 120.
  • Figure 3 shows a functional block diagram of a typical physical module 122, 124, 126, 128 of the plant control network 11, and includes the bus interface unit (BIU) 32 and the transceiver 34, which connects BIU 32 to the LCN 120.
  • BIU bus interface unit
  • BIU 32 is capable of transmitting binary data over LCN 120 and of receiving data from LCN 120.
  • Transceiver 34 in the preferred embodiment is transformer coupled to the LCN 120.
  • the LCN 120 is a dually redundant coaxial cable with the capability of transmitting bit serial data.
  • BIU 32 is provided with a very fast micro-engine 56.
  • micro-engine 56 is made up of bit slice components so that it can process eight bits in parallel, and can execute a 24 bit microinstruction from its programmable read only memory (PROM) 58.
  • PROM programmable read only memory
  • Signals received from the LCN 120 are transmitted by transceiver 34 and receive circuitry 52 to receive FIFO register 54.
  • Micro-engine 56 examines the data stored in FIFO register 54 and determines if the information is addressed to the physical module. If the data is an information frame, the received data is transferred by direct memory access (DMA) write circuitry 66 by conventional direct memory access techniques to the physical module memory unit (MMU) 40 over module bus 36.
  • DMA direct memory access
  • MMU physical module memory unit
  • Each MCPU 38 includes a timing subsystem 48 which, in response to clock signals from module clock 45, produces fine resolution, synchronization, and real-time timing signals.
  • Any timing subsystem 48 which is provided with a timing subsystem driver 50 has the capability of transmitting timing information to other physical modules over the LCN 120.
  • Another input to each timing subsystem 48 is timing information which is transmitted over LCN 120 and which is received through transceiver 34, timing receiver 55 and timing driver 57 of BIU 32.
  • Timing pulses from module power supply 59, which are a function of the frequency of the external source of A.C. electric power applied to power supply 59, are used by timing subsystem 48 to correct longer term frequency drift of the clock pulses produced by clock 45.
  • the universal operator station (US) 122 is coupled to a co-processor 200, and the co-processor is coupled to an open system, i.e., interfaces/protocols of differing design, including transmission control protocol/internet protocol (TCP/IP), open system interface (OSI), DECnet (a product of the Digital Equipment Corporation of Maynard, Massachusetts),....
  • the universal station 122 is also connected to the LCN 120 as described above.
  • the new universal operator station (open US) 123 includes the US 122 as described above in conjunction with the co-processor 200.
  • the purpose of the open US 123 is to open the graphical interface to the open systems and to provide information from the closed US to the open systems.
  • the co-processor 200 is structured to permit the interface to other systems, i.e., the open systems without jeopardizing the integrity of the existing system.
  • the co-processor 200 of the preferred embodiment is a Motorola 68040 microprocessor which executes the UNIX operating system (UNIX is an operating system of the American Telephone and Telegraph Company, AT&T; it is readily available and is well known to those skilled in the art), and is sometimes referred to as a UNIX co-processor.
  • the co-processor 200 requests service (e.g., the value of a point, contents of a file,... or any information of the process control system 10) of the module CPU 38 through shared memory 202.
  • the module CPU 38 then communicates with the appropriate module to perform the requested service in a normal fashion. Once the response is obtained, the information is passed to the co-processor 200 via shared memory 202. Since the module CPU 38 is communicating via the LCN 120, the integrity of the LCN (i.e., the system) is maintained and similarly the module memory 40 cannot be corrupted by the co-processor 200.
  • the graphics card 150 includes a card bus 152. Attached to the card bus 152 is a data memory 154 which contains the information which is to be displayed onto the CRT, and also contains some control information.
  • a microprocessor 156 is also coupled to the card bus 152 and further is coupled to the module bus 36.
  • a graphics processor 160 is coupled to the card bus 152 and performs all the processing for developing the information stored in the data memory 154, including some control functions.
  • a shared memory 158 is coupled to the card bus 152.
  • a connection is made from the card bus 152 to the co-processor 200, thereby providing the interface mentioned above to the graphics card 150 from the co-processor 200.
  • the microprocessor 156 of the preferred embodiment of the graphic card 150 is a Motorola 68020 processor.
  • the graphics card 150 is a two port graphics card, one port of the graphics card being tied to the module bus 36, which is how a display is driven from the LCN.
  • the LCN 120 provides a "single window to the process," i.e., a screen display of what the process/process control system is doing.
  • the second port is coupled to the co-processor 200 and provides the windows interface for the universal station 122.
  • the windows interface is the X-windows interface which is well defined and well known to those skilled in the art (the interface being defined by MIT, Cambridge, Massachusetts). It is through the interface from the co-processor 200 that all the window displays [i.e., the screen display(s) of the open system(s)] and windows controls are performed, including commands to the graphic card 150 to specify where to place the single window to the process on the screen of the CRT 151.
  • the interface between the graphics card 150 and the co-processor 200 is the full windows interface.
  • One of the windows is the display referred to above as the "single window to the process" (sometimes referred to as the LCN window).
  • the co-processor 200 commands the graphics card 150 where the LCN window is to be placed on the CRT 151 and its relative size on the display.
  • X-windows is a well defined protocol for communication between the graphics card 150 (or any graphics card) and display, and a computer, permitting many windows to be displayed.
  • a server is defined in X-windows as the machine that is driving the display (or that portion of the co-processor 200 which interfaces to the graphics card 150), and a client is the application program, in the present embodiment, the DEC processor 300.
  • the client 300 can have data which is desired to be displayed.
  • the client 300 communicates with the server portion of the co-processor 200 through an X-windows protocol indicating data to be displayed.
  • the server portion of the co-processor 200 communicates with the graphics card 150 through a device dependent layer (DDL), which is provided by the vendor of the graphics card (in X-windows, via the DDX protocol).
  • the microprocessor 156 maintains the integrity of the card bus 152 into the data memory 154.
  • the processing of the data to be displayed on the CRT 151 is performed by the graphics processor 160.
  • the microprocessor 156 (which accepts requests from the LCN 120 via module bus 36) places the data in shared memory 158; the data is subsequently processed by the graphics processor 160 and then stored in data memory 154.
  • when the open system 300 (via the client) desires to display some information, the information is communicated to the server portion of the co-processor 200, which then stores the information in the shared memory 158.
  • the graphics processor 160 then processes that information and stores it in the data memory 154 for display. In that manner, and under the control of the graphics processor 160, the plurality of displays, i.e., windows, is displayed on the CRT 151.
  • the co-processor 200 of the preferred embodiment of the present invention is a Motorola 68040, having bus capability with the other microprocessors of the system. It will be understood that a variety of processors can be utilized including a reduced instruction set processor which is available from Hewlett Packard among other processor manufacturers.
  • although the preferred embodiment utilizes the UNIX operating system, it will be recognized by those skilled in the art that any operating system can be utilized, including OSF/1 (Open Software Foundation, Cambridge, Massachusetts).
  • although the co-processor 200 controls the display in the preferred embodiment, the graphics card can also perform the display control. Since X-windows was readily available and performed the desired display control function, X-windows was utilized to take advantage of the availability of the desired control function. It will be recognized by those skilled in the art that implementation of the present invention is not limited to X-windows, and that any protocol can be utilized.
  • the process control system 10 is an open system permitting other systems to interface into the LCN of the process control system and, because of the communication scheme as described above, the integrity of the process control system 10 is maintained.
  • the graphics card 150, although not in immediate control of the display unit 151, guarantees that the graphic view (control view) to a field device (i.e., valve,...) or any other controls view of the process control system on the display unit is always maintained regardless of the operational state of the co-processor 200.
  • one side of the touch screen system includes LEDs and on the opposite side sensors are placed which receive the signal from the LEDs.
  • a report is transmitted by the touch screen system to an information processing apparatus identifying the coordinate inputs of the position desired.
  • both an X beam and a Y beam must be broken.
  • the present invention utilizes a touch screen having a resolution of 127 wide by 95 high.
  • a display resolution of 1280 by 1024 has a resultant target resolution of 160 wide by 128 high.
  • the target resolution is higher than the touch screen resolution, the touch screen resolution being 127 by 95.
  • the display is zoomed in a predefined "window area" off to the side of the finger (or stylus) thereby permitting new data around the zoomed up area to be drawn (displayed) and viewed.
  • the characters around the reported position displayed in the window area have a character resolution of 16 by 16. This results in a target resolution of 80 by 64.
  • the touch screen resolution of 127 wide by 95 high then results in at least one beam crossing each character, making it possible to position to a desired character.
  • the method of the present invention is implemented in the preferred embodiment wherein the graphics card 150 (more specifically, the microprocessor 156 and the graphics processor 160), sometimes referred to herein as the graphics controller, includes some logic, and the co-processor 200 includes some logic (in the X-server portion of the X-windows protocol and extension of the X-server), the graphics card 150 and the co-processor 200 operating together to obtain the improved display of the present invention.
  • Figure 8 shows an overview of the operation of the method of the present invention.
  • the touch screen system (or more simply referred to herein as display 151) detects the touching of the display 151 and reports the x-y coordinates of that touch to the graphics card 150.
  • the graphics card 150 reports the touch screen coordinates to the X-server logic.
  • the X-server logic computes the zoom region (sometimes referred to herein as the predefined "window area" or "window") and notifies the graphics card 150 of the bounding region and location in which to zoom the display, and the graphics card 150 zooms up the area around the x-y touch coordinates into this zoom window.
  • the initial touch position is not directly over the desired target.
  • the operator will move the pointing device (finger, stylus, pointer,...) until the touch position is over the desired target.
  • this results in the touch screen system sending a new x-y coordinate to the graphics card 150.
  • the graphics card 150 calculates new x-y coordinates to send to the X-server logic and zooms up the area around the new x-y coordinates into the zoom window.
  • the location of the zoom window on the display remains fixed, but the data contained within the zoom window follows the pointer as will be described in more detail hereinunder. This process continues until the pointing device is removed from the touch screen system.
  • the touch screen system sends an exit code to the graphics card 150.
  • the graphics card stops zooming data into the zoom window, and passes the exit code to the X-server logic.
  • when the X-server logic detects the exit code, it removes the zoom window, and the image that was covered by the zoom window is restored.
  • the zoom feature is included as part of the start up procedure by the operator. If the zoom feature is not initialized, then the touch screen zooming function is disabled. If the zoom feature is started, then a zoom window of settable size and location, i.e., programmable, is defined. The operator indicates the size of the zoom region by height and width, indicates the x-offset and y-offset, and indicates whether the operator is standing or sitting, right handed or left handed,.... In this fashion the zoom window will be presented in the most unobstructed manner, i.e., the zoom window will be unobstructed from the operator's view as determined by the inputted initialization information (a configuration sketch follows this list).
  • the cursor position is defined as the center of the zoom region and is not displayed when the display is in a zoom mode. When the finger is removed and the zoom region is deleted, the cursor will appear in its proper position on the display.
  • Figure 9, which comprises Figures 9A and 9B, shows a flow diagram of the logic of the graphics card 150 for implementing the method of the present invention.
  • a report is received by the information handling apparatus, in the preferred embodiment the graphics processor 160 of the process control system 10.
  • upon the first entry of the finger or stylus onto the display screen, the report also indicates to the graphics processor 160 that this is an entry report (block 415).
  • the graphics processor 160 checks to verify whether this is the first report (block 420) and, if it is, sets a first report flag (block 425).
  • a report is transmitted to the graphics processor 160 by the touch screen system about every 1/16 of a second, i.e., 16 reports per second. Since the initial report, as a result of placing the finger onto the touch screen, may not result in identifying the desired character, the finger or stylus may be moved. On each subsequent report, since the finger or stylus is still on the screen, the exit report check of block 415 will indicate a no; however, the first report determination will also be a no, thus proceeding to block 446.
  • the graphics processor 160 determines if the zoom touch and the zoom window are enabled (blocks 446, 447). If neither are enabled, the processing proceeds to block 430. If both are enabled the process proceeds to block 450. The new area to be zoomed around as the pointer is moved is now calculated.
  • Figure 11, which comprises Figures 11A and 11B, shows a flow diagram of the X-server logic of co-processor 200, which operates in conjunction with the graphics card logic, to generate the display, and specifically to generate the window of the display 151, in the preferred embodiment of the present invention.
  • Figure 11A shows the event handler of the X-server logic, including events, i.e., inputs, which are received from the open US 123, namely keyboard inputs, mouse inputs, touch screen inputs,....
  • the zoom window is removed from the screen, and the display restored to a normal display without the window.
  • the logic of block 501 is activated.
  • the logic of block 511 is activated.
  • if the touch screen event includes an exit code (block 520), the logic of block 511 is activated, in which the window is removed from the display. If the zoom window is mapped to the display (block 512), the zoom window is unmapped from the display (block 520).
  • the graphics controller is informed that the zoom window has been disabled (block 516).
  • the zoom window is removed from the screen.
  • the X-server logic sends the new cursor location on to the server for further processing (block 521). In this manner the X-server logic generates or removes the zoom window from the display and indicates to the graphics controller logic whether the zoom window is enabled or disabled (see the event-handler sketch following this list).
  • the graphics controller logic subsequently places the zoomed data within the zoom window in accordance with the method of the present invention described above.
  • the zoom window may be in other locations relative to the pointer, such as above and to the right, below and to the left, or below and to the right of the pointer.
  • the location of the window relative to the pointer is programmable from the data entered by the operator as described above.
  • the logic will invert the zoom window location such that it appears on the screen, i.e., below and to the right of the pointer.
  • the window will appear below the pointer.
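To make the coordinate mapping and the half-speed tracking above concrete, the following sketch shows one plausible rendering in C. It is illustrative only: the 127 by 95 and 1280 by 1024 resolutions come from the text, but the function names, the integer arithmetic, and the report handling are assumptions, not the patent's implementation.

    /* Illustrative sketch: mapping low-resolution touch reports to display
     * coordinates, and moving the zoom-window center half the distance the
     * pointer moved (the "half speed, half distance" rule of Figure 10). */
    #include <stdio.h>

    #define TOUCH_W  127    /* touch screen beams, horizontal */
    #define TOUCH_H   95    /* touch screen beams, vertical   */
    #define DISP_W  1280    /* display pixels, horizontal     */
    #define DISP_H  1024    /* display pixels, vertical       */

    /* Scale a touch report to display pixel coordinates. */
    static void touch_to_display(int tx, int ty, int *dx, int *dy)
    {
        *dx = (tx * DISP_W) / TOUCH_W;
        *dy = (ty * DISP_H) / TOUCH_H;
    }

    /* Move the zoom-window center half the distance the pointer moved
     * since the last report, so the zoomed data follows the pointer. */
    static void follow_half_speed(int px, int py, int last_px, int last_py,
                                  int *cx, int *cy)
    {
        *cx += (px - last_px) / 2;
        *cy += (py - last_py) / 2;
    }

    int main(void)
    {
        int cx, cy;
        touch_to_display(63, 47, &cx, &cy);    /* roughly mid-screen  */
        printf("initial zoom center: (%d, %d)\n", cx, cy);

        int px, py;
        touch_to_display(65, 47, &px, &py);    /* pointer moved right */
        follow_half_speed(px, py, cx, cy, &cx, &cy);
        printf("center after half-speed follow: (%d, %d)\n", cx, cy);
        return 0;
    }
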
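The operator-settable zoom parameters (size, offsets, posture, handedness) and the edge-inversion behavior described above suggest a small configuration record such as the one below. The field names, the placement rule, and the screen bounds are assumptions for illustration; the patent leaves the exact representation open.

    /* Illustrative zoom-window configuration and placement. The field
     * names and the inversion rule near screen edges are assumptions. */
    #include <stdio.h>
    #include <stdbool.h>

    struct zoom_config {
        int  width, height;       /* size of the zoom region              */
        int  x_offset, y_offset;  /* window offset from the entry point   */
        bool standing;            /* operator standing or sitting         */
        bool right_handed;        /* used to keep the window unobstructed */
    };

    /* Place the window relative to the entry point; if it would fall off
     * the screen, invert the offsets so the window appears on the other
     * side of the pointer (e.g., below and to the right). */
    static void place_zoom_window(const struct zoom_config *cfg,
                                  int entry_x, int entry_y, int *wx, int *wy)
    {
        *wx = entry_x + cfg->x_offset;
        *wy = entry_y + cfg->y_offset;
        if (*wx < 0) *wx = entry_x - cfg->x_offset;   /* invert horizontally */
        if (*wy < 0) *wy = entry_y - cfg->y_offset;   /* invert vertically   */
        if (*wx + cfg->width  > 1280) *wx = 1280 - cfg->width;
        if (*wy + cfg->height > 1024) *wy = 1024 - cfg->height;
    }

    int main(void)
    {
        /* window normally above and to the left of the entry point */
        struct zoom_config cfg = { 320, 256, -340, -270, false, true };
        int wx, wy;
        place_zoom_window(&cfg, 100, 80, &wx, &wy);   /* near top-left corner  */
        printf("zoom window at (%d, %d)\n", wx, wy);  /* offsets were inverted */
        return 0;
    }
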
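The event-handler flow of Figure 11 (map the zoom window on the first touch report, track the pointer while it stays on the screen, unmap on the exit code) might be dispatched roughly as follows. The event names, states, and stubbed helpers here are hypothetical; they stand in for the X-server and graphics-controller interfaces the text describes.

    /* Hypothetical dispatch of the Figure 11 event-handler flow. All
     * names and the stubbed helpers below are illustrative only. */
    #include <stdio.h>

    enum touch_event { TOUCH_ENTER, TOUCH_MOVE, TOUCH_EXIT };
    enum zoom_state  { ZOOM_DISABLED, ZOOM_ENABLED };

    /* Stubs standing in for the X-server / graphics-controller interface. */
    static void map_zoom_window(int x, int y)
    { printf("map zoom window near (%d, %d)\n", x, y); }
    static void unmap_zoom_window(void)
    { printf("unmap zoom window, restore covered image\n"); }
    static void notify_graphics_controller(enum zoom_state s)
    { printf("zoom %s\n", s == ZOOM_ENABLED ? "enabled" : "disabled"); }
    static void forward_coordinates(int x, int y)
    { printf("pointer now at (%d, %d)\n", x, y); }
    static void report_cursor_position(int x, int y)
    { printf("cursor (selected character) at (%d, %d)\n", x, y); }

    static void handle_touch_event(enum touch_event ev, int tx, int ty)
    {
        switch (ev) {
        case TOUCH_ENTER:                /* first report: present the window */
            map_zoom_window(tx, ty);
            notify_graphics_controller(ZOOM_ENABLED);
            break;
        case TOUCH_MOVE:                 /* periodic reports while touching  */
            forward_coordinates(tx, ty); /* graphics card re-zooms the data  */
            break;
        case TOUCH_EXIT:                 /* exit code: pointer withdrawn     */
            unmap_zoom_window();
            notify_graphics_controller(ZOOM_DISABLED);
            report_cursor_position(tx, ty); /* passed on for processing      */
            break;
        }
    }

    int main(void)
    {
        handle_touch_event(TOUCH_ENTER, 60, 40);
        handle_touch_event(TOUCH_MOVE, 64, 42);
        handle_touch_event(TOUCH_EXIT, 66, 44);
        return 0;
    }
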

Abstract

A relatively low resolution touch screen system is utilized in a graphic display environment wherein the resolution of the graphic display is relatively high. The character resolution yields a low target resolution, a single character being the target (or the desired character). The method zooms a predetermined number of characters around an initially reported position into a zoom window, the zoom window being an area of the display screen relative to the initially reported position, the center of the zoom window displaying a cursor character. The zooming increases the character size such that the target resolution is decreased to a value at which the touch screen resolution is at least equal to or higher than the target resolution. Further, as the stylus is moved toward the desired character, a new center position of the data to be displayed in the zoom window is calculated, the new center position following the pointer element at a predetermined speed. When the desired character is over the cursor, the pointing element is withdrawn, and the current position of the pointing element is utilized to identify the position of the desired character, and thus the desired character itself.

Description

A METHOD FOR UTILIZING A LOW RESOLUTION TOUCH SCREEN SYSTEM IN A HIGH RESOLUTION GRAPHICS ENVIRONMENT
BACKGROUND OF THE INVENTION
The present invention relates to graphic display systems, and more particularly, to a method for utilizing a touch screen system having a relatively low resolution in a relatively high resolution graphics environment.
Presently, touch screen systems are utilized which allow direct information input to an information processing apparatus from a coordinate position on a display screen by pointing to a desired position on a display of a cathode ray tube (CRT) display device (or some other like device). This technique is well established and has become the accepted standard for process operator interaction today. However, this technology utilizes a low resolution infrared LED touch screen with a resolution of 127 by 95 to select targets on a display with a resolution of 640 by 448. Even though the touch screen resolution is lower than that of the display, selecting targets is not a problem since the minimum target size is 8 by 8 pixels. This results in a target resolution of 80 by 56, which is less than the resolution of the touch screen.
However, when higher resolution displays are utilized, such as a display having a 1280 by 1024 resolution for example, the resultant target resolution is 160 by 128. Thus, targets on this high resolution display cannot be selected directly with the current touch screen resolution. The invention described in the related application identified above describes a system that zooms up the area around the immediate touch area. This technique works well except when the data immediately under the touch area is changing dynamically. When this situation occurs, the new data cannot be drawn because the zoomed up data covers where the new data should be drawn. The present invention is an improvement to the invention identified in the related applications section above and corrects this condition.
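As a quick check of the arithmetic in the two paragraphs above (the specific numbers are from the text; the code itself is merely an illustration):

    /* Illustrative check of the target-resolution arithmetic above. */
    #include <stdio.h>

    int main(void)
    {
        int char_w = 8,    char_h = 8;     /* minimum target (character) size */
        int low_w  = 640,  low_h  = 448;   /* original display resolution     */
        int high_w = 1280, high_h = 1024;  /* high-resolution display         */

        printf("640x448 display:    %d x %d targets\n",
               low_w / char_w, low_h / char_h);       /* 80 x 56   */
        printf("1280x1024 display:  %d x %d targets\n",
               high_w / char_w, high_h / char_h);     /* 160 x 128 */
        printf("zoomed 16x16 chars: %d x %d targets\n",
               high_w / 16, high_h / 16);             /* 80 x 64   */
        /* 80x56 and 80x64 fit within the 127x95 touch resolution;
           160x128 does not, hence the zoom window. */
        return 0;
    }
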
SUMMARY OF THE INVENTION
Therefore, there is provided by the present invention an improved method for utilizing a low resolution touch screen with a high resolution graphics display. A method of the present invention selects a character being displayed on a display using a pointer element. The display has a first resolution and a touch screen system associated with the display has a second resolution. The character has a third resolution resulting in a target resolution, such that the target resolution is higher than the resolution of the touch screen system. The touch screen system transmits, to a data processing unit processing the display, an X and Y coordinate signal of the character pointed to. The method comprises the steps of: when a character is pointed to, accepting an initial X and Y coordinate value from the touch screen system, and calculating a corresponding X and Y coordinate value of the display. Then the initial X and Y coordinate values and the corresponding calculated X and Y coordinate values are saved. The display is zoomed a predetermined number of characters around the initial X and Y coordinate values into a zoom window, the zoom window having a predetermined size and having a predetermined location relative to an initial entry point of the pointer element. If the character to be selected is not over a cursor, the cursor being displayed in the center of the zoom window, the pointer element is moved towards the character to be selected; otherwise the pointer is withdrawn. Updated X and Y coordinate position values of the pointer element are continually accepted on a periodic basis. A new center point of the display in the zoom window is calculated, the new center point following the pointer element at a predetermined speed with respect to the speed of the pointer element. Zoomed display data around the new center point is displayed in the zoom window. The pointer is moved towards the character to be selected and the method is repeated until the character to be selected is in the center of the zoom window, i.e., in proximity to or coincident with the cursor. Then the pointer element is withdrawn, the new center point calculated previously having the X and Y coordinate values of the character to be selected, thereby permitting a desired character to be selected when the target resolution is higher than the resolution of the touch screen system.
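Taken together, these steps amount to a simple tracking loop. The sketch below is a minimal, hypothetical rendering of it; the report format, the scripted report source, the scaling constants, and the fixed half-speed factor are assumptions drawn from this description, not the patent's code.

    /* Simplified sketch of the selection method summarized above. */
    #include <stdio.h>
    #include <stdbool.h>

    struct report { int x, y; bool exit; };    /* one touch-screen report */

    /* Stub producing a short scripted sequence of reports (about 16 per
     * second in the real system); a real implementation reads the touch
     * screen hardware instead. */
    static struct report read_touch_report(void)
    {
        static const struct report seq[] = {
            { 60, 40, false }, { 64, 42, false },
            { 66, 44, false }, { 66, 44, true }
        };
        static int n = 0;
        return seq[n++];
    }

    int main(void)
    {
        struct report r = read_touch_report();  /* initial entry report  */
        int cx = (r.x * 1280) / 127;            /* zoom-window center, X */
        int cy = (r.y * 1024) / 95;             /* zoom-window center, Y */
        int last_x = cx, last_y = cy;

        for (;;) {                              /* until pointer withdrawn */
            r = read_touch_report();
            if (r.exit)
                break;
            int px = (r.x * 1280) / 127;
            int py = (r.y * 1024) / 95;
            cx += (px - last_x) / 2;            /* center follows pointer */
            cy += (py - last_y) / 2;            /* at half speed          */
            last_x = px;
            last_y = py;
            /* here the zoomed data around (cx, cy) would be redrawn */
        }
        printf("selected character at display (%d, %d)\n", cx, cy);
        return 0;
    }
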
Accordingly, it is an object of the present invention to provide a method for utilizing a low resolution touch screen with a high resolution graphics display. It is another object of the present invention to provide a method for utilizing a low resolution touch screen to select a character of a display whereby the display has a relatively high resolution.
It is still another object of the present invention to provide an improved method for utilizing a low resolution touch screen to select a character of a display, the display having a relatively high resolution, such that dynamically changing data within the zoomed area is displayed. These and other objects of the present invention will become more apparent when taken in conjunction with the following description and attached drawings, wherein like characters indicate like parts, and which drawings form a part of the present application.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 shows a block diagram of a process control system of the preferred embodiment in which the present invention is utilized;
Figure 2 shows a block diagram of common elements of each physical module of the process control system of Figure 1;
Figure 3 shows a functional block diagram of a typical physical module of the process control system;
Figure 4 shows a partial, functional block diagram of the existing system and the opened system of the preferred embodiment;
Figure 5 shows a functional block diagram of an open operator station of the preferred embodiment;
Figure 6 shows a block diagram of a graphics card of the preferred embodiment of the universal station;
Figure 7, which comprises Figures 7A and 7B, shows examples of screen displays of the display unit of the process control system;
Figure 8 shows an overview of the operation of the method of the present invention;
Figure 9, which comprises Figures 9A and 9B, shows a flow diagram of the logic of the graphics card 150 for implementing the method of the present invention;
Figure 10 shows an example of the display in the zoom window which follows the stylus (or pointer) at half speed (or half distance);
Figure 11, which comprises Figures 11A and 11B, shows a flow diagram of the event handler logic of a co-processor of the open operator station of the preferred embodiment of the present invention;
Figure 12 shows a partial display screen showing a partial row of characters with the corresponding display pixel value and beam of the touch screen system; and
Figure 13 shows a partial expanded character display of Figure 12 according to the method of the present invention.
DETAILED DESCRIPTION
Before describing the method of the present invention, it will be helpful to understand the system environment in which the invention is utilized. Referring to Figure 1, there is shown a block diagram of a process control system 10 of the preferred embodiment in which the present invention can be found. The process control system 10 includes a plant control network 11, and connected thereto is a data highway 12, which permits a process controller 20' to be connected thereto. In the present day process control system 10, additional process controllers 20' can be operatively connected to the plant control network 11 via a corresponding highway gateway 601 and a corresponding data highway 12. A process controller 20, an interface apparatus which includes many new additions, improvements, and features over the process controller 20', is operatively connected to the plant control network 11 via a universal control network (UCN) 14 to a network interface module (NIM) 602. In the preferred embodiment of the process control system 10, additional process controllers 20 can be operatively connected to the plant control network 11 via a corresponding UCN 14 and a corresponding NIM 602. The process controllers 20, 20' interface the analog input and output signals, and digital input and output signals (A/I, A/O, D/I, and D/O, respectively) to the process control system 10 from the variety of field devices (not shown) of the process being controlled, which include valves, pressure switches, pressure gauges, thermocouples,....
The plant control network (or more simply, network) 11 provides the overall supervision of the controlled process, in conjunction with the plant operator, and obtains all the information needed to perform the supervisory function, and includes an interface with the operator. The plant control network 11 includes a plurality of physical modules, which include a universal operator station (US) 122, an application module (AM) 124, a history module (HM) 126, a computer module (CM) 128, and duplicates (backup or secondary) of these modules (and additional types of modules, not shown) as necessary to perform the required control/supervisory function of the process being controlled. Each of these physical modules is operatively connected to a local control network (LCN) 120 which permits each of these modules to communicate with each other as necessary. The NIM 602 and HG 601 provide an interface between the LCN 120 and the UCN 14, and the LCN 120 and the data highway 12, respectively. Physical modules 122, 124, 126, 128,... of network 11 of the preferred embodiment are of various specialized functional types. Each physical module is the peer, or equivalent, of the other in terms of right of access to the network's communication medium, or LCN 120, for the purpose of transmitting data to other physical modules of network 11.
Universal operator station module (US) 122 of network 11 is a work station for one or more plant operators. It includes an operator console which is the interface between the plant operator, or operators, and the process or processes of the plant for which they are responsible. Each universal operator station module 122, is connected to the LCN 120, and all communications between the universal operator station module 122, and any other physical module of network 11, is via the LCN 120. Universal operator station module 122 has access to data that is on the LCN 120 and the resources and data available through, or from, any of the other physical modules of network 11. The universal station module 122 includes a cathode ray tube display (CRT) (not shown) which includes a video display generator, an operator keyboard (KB) (not shown), a printer (PRT) (not shown), and can also include (but not shown) a cartridge disk data storage device, trend pen recorders, and status displays, for example.
A history module (HM) 126 provides mass data storage capability. The history module 126 includes at least one conventional disk mass storage device such as a Winchester disk, which disk storage device provides a large volume of nonvolatile storage capability for binary data. The types of data stored by such a mass storage device are typically trend histories, event histories, ....or data from which such histories can be determined, data that constitutes or forms CRT type displays, copies of programs for the physical modules.... An application module (AM) 124 provides additional data processing capability in support of the process control functions performed by the controllers associated with the process control subsystem 20, 20', such as data acquisition, alarming, batch history collection, and provides continuous control computational facilities when needed. The data processing capability of the application module 124 is provided by a processor (not shown) and a memory (not shown) associated with the module.
Computer module (CM) 128 uses the standard or common units of all physical modules to permit a medium-to-large scale, general purpose data processing system to communicate with other physical modules of network 11 and the units of such modules over the LCN 120 and the units of process control subsystems 20, 20' via the highway gateway module 601, and the NIM 602, respectively. Data processing systems of a computer module 128 are used to provide supervisory, optimization, generalized user program preparation and execution of such programs in higher level program languages.
Typically, the data processing systems of a computer module 128 have the capability of communicating with other such systems by a communication processor and communication lines.
The local control network 120 (LCN) is a high-speed, bit serial, dual redundant communication network that interconnects all the physical modules of plant control network 11. LCN 120 provides the only data transfer path between the principal sources of data, such as highway gateway module 601, application module 124, and history module 126, and principal users of such data, such as universal operator station module 122, computer module 128, and application module 124. LCN 120 also provides the communication medium over which large blocks of data, such as memory images, can be moved from one physical module such as history module 126 to universal station module 122. LCN 120 is dual redundant in that it consists of two coaxial cables that permit the serial transmission of binary signals over both cables.
Referring to Figure 2, there is shown a block diagram of the common elements of each physical module of the network 11 or the process control system 10. Each of the physical modules includes a module central processor unit 38 and a module memory 40, a random-access memory (not shown), and such additional controller devices, or units (not shown), which are configured to provide the desired functionality of that type of module, i.e., that of the operator station 122, for example. The data-processing capabilities of each module's CPU 38 and module memory 40 create a distributed processing environment which provides for improved reliability and performance of network 11 and process control system 10. The reliability of network 11 and system 10 is improved because, if one physical module of network 11 fails, the other physical modules will remain operational. As a result, network 11 as a whole is not disabled by such an occurrence as would be the case in centralized systems. Performance is improved by this distributed environment in that throughput and fast operator response times result from the increased computer processing resources, and the concurrency and parallelism of the data-processing capabilities of the system.
As mentioned above, each physical module includes the bus interface unit, BIU, 32 which is connected to the LCN 120 by the transceiver 34. Each physical module is also provided with the module bus 36 which, in the preferred embodiment, is capable of transmitting 16 bits of data in parallel, between the module CPU 38 and the module memory 40. Other units, utilized to tailor each type of physical module to satisfy its functional requirements, are operatively connected to module bus 36 so that each such unit can communicate with the other units of the physical module via its module bus 36. The BIU 32 of the physical module initiates the transmission of data over LCN 120. In the preferred embodiment, all transmissions by a BIU 32 are transmitted over the coaxial cables which, in the preferred embodiment, form the LCN 120.
Referring to Figure 3 there is shown a functional block diagram of a typical physical module 122, 124, 126, 128 of the plant control network 11, and includes the bus interface unit (BIU) 32 and the transceiver 34, which connects BIU 32 to the LCN 120.
BIU 32 is capable of transmitting binary data over LCN 120 and of receiving data from LCN 120. Transceiver 34, in the preferred embodiment, is transformer coupled to the LCN 120. In the preferred embodiment, the LCN 120 is a dually redundant coaxial cable with the capability of transmitting bit serial data. BIU 32 is provided with a very fast micro-engine 56. In the preferred embodiment, micro-engine 56 is made up of bit slice components so that it can process eight bits in parallel, and can execute a 24 bit microinstruction from its programmable read only memory (PROM) 58.
Signals received from the LCN 120 are transmitted by transceiver 34 and receive circuitry 52 to receive FIFO register 54. Micro-engine 56 examines the data stored in FIFO register 54 and determines if the information is addressed to the physical module. If the data is an information frame, the received data is transferred by direct memory access (DMA) write circuitry 66 by conventional direct memory access techniques to the physical module memory unit (MMU) 40 over module bus 36.
Communication between MCPU processor 68, a Motorola 68020 microprocessor in the preferred embodiment, and other functional elements of MCPU 38 is via local microprocessor bus 39. Module bus interface element 41 provides the communication link between local bus 39 and module bus 36. Processor 68 executes instructions fetched from either its local memory 43, in the preferred embodiment an EPROM, or from MMU 40. Processor 68 has a crystal controlled clock 45 which produces clock pulses, or timing signals. Input/output (I/O) port 49 provides communication between MCPU 38 and equipment external to the physical module to permit program loading, and the diagnosis of errors, or faults, for example.
Each MCPU 38 includes a timing subsystem 48 which, in response to clock signals from module clock 45, produces fine resolution, synchronization, and real-time timing signals. Any timing subsystem 48 which is provided with a timing subsystem driver 50 has the capability of transmitting timing information to other physical modules over the LCN 120. Another input to each timing subsystem 48 is timing information which is transmitted over LCN 120 and which is received through transceiver 34, timing receiver 55 and timing driver 57 of BIU 32. Timing pulses from module power supply 59, which are a function of the frequency of the external source of A.C. electric power applied to power supply 59, are used by timing subsystem 48 to correct longer term frequency drift of the clock pulses produced by clock 45.
Additional information of the BIU 32 can be found in U.S. Patent No. 4,556,974. A more detailed description of the process control system 10 can be had by referring to U.S. Patent No. 4,607,256. Additional information of the individual, common, functional blocks of the physical modules can be had by reference to U.S. Patent No. 4,709,347, all of the above-identified patents being assigned to the assignee of the present application, and additional information of the process controller 20' can be had by referencing U.S. Patent No. 4,296,464.
The addition of an interface apparatus which interfaces other systems to the process control system 10 described above, together with a modification to a graphics generator in the US 122, opens up the existing system, specifically the graphics interface; this includes designing in the capability to readily permit nodes of differing designs to communicate with the network, and will now be described.
Referring to Figure 4, there is shown a partial functional block diagram of the existing system and the open (or opened) system. The universal operator station (US) 122 is coupled to a co-processor 200, and the co-processor is coupled to an open system, i.e., interfaces/protocols of differing design, including transmission control protocol/internet protocol (TCP/IP), open system interface (OSI), DECnet (a product of the Digital Equipment Corporation of Maynard, Massachusetts),.... The universal station 122 is also connected to the LCN 120 as described above. Thus, the new universal operator station (open US) 123 includes the US 122 as described above in conjunction with the co-processor 200. The purpose of the open US 123 is to open the graphical interface to the open systems and to provide information from the closed US to the open systems. The co-processor 200 is structured to permit the interface to other systems, i.e., the open systems, without jeopardizing the integrity of the existing system. The co-processor 200 of the preferred embodiment is a Motorola 68040 microprocessor which executes the UNIX operating system (UNIX is an operating system of the American Telephone and Telegraph Company, AT&T; it is readily available and well known to those skilled in the art), and is sometimes referred to as a UNIX co-processor.
Referring to Figure 5, there is shown a functional block diagram of the open operator station 123 of the preferred embodiment. The operator station 122 as described above includes the BIU 32 connected to the module bus 36, the module memory 40, and the module CPU 38, both also connected to the module bus 36. These basic functional blocks are contained in all the physical modules. The additional functional blocks added to the physical module are what give the physical module its personality apart from any other physical module. The operator station 122 includes a graphics card 150 which interfaces with a display (CRT) and a keyboard (KB) 151, 153. A shared memory 202 is included and is also connected to the module bus 36, which provides for communication between the co-processor 200 and the US physical module 122 (thereby providing communication to the rest of the process control system 10 via the module CPU 38). Thus, the co-processor 200 requests service (e.g., the value of a point, contents of a file,... or any information of the process control system 10) of the module CPU 38 through shared memory 202. The module CPU 38 then communicates with the appropriate module to perform the requested service in a normal fashion. Once the response is obtained, the information is passed to the co-processor 200 via shared memory 202. Since the module CPU 38 is communicating via the LCN 120, the integrity of the LCN (i.e., the system) is maintained and similarly the module memory 40 cannot be corrupted by the co-processor 200.
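The request/response handshake through shared memory 202 might look schematically like the sketch below. The mailbox layout, flag discipline, service code, and point name are hypothetical; the patent does not specify the shared-memory format.

    /* Hypothetical sketch of the co-processor requesting LCN data from
     * the module CPU through shared memory 202. Layout is illustrative. */
    #include <stdio.h>
    #include <stdint.h>

    struct mailbox {
        volatile uint16_t request_ready;   /* set by the co-processor    */
        volatile uint16_t response_ready;  /* set by the module CPU 38   */
        uint16_t service;                  /* e.g., read a point's value */
        char     name[16];                 /* point or file name         */
        uint32_t value;                    /* response payload           */
    };

    /* Co-processor side: post a request and poll for the answer. The
     * module CPU services it over the LCN, so the co-processor never
     * touches module memory 40 or the LCN directly. */
    static uint32_t request_point_value(struct mailbox *mb, const char *point)
    {
        int i;
        for (i = 0; i < 15 && point[i]; i++)   /* copy the point name */
            mb->name[i] = point[i];
        mb->name[i] = '\0';
        mb->service = 1;                       /* hypothetical "read point" */
        mb->request_ready = 1;
        while (!mb->response_ready)            /* wait for module CPU 38 */
            ;                                  /* (real code would yield) */
        mb->response_ready = 0;
        return mb->value;
    }

    int main(void)
    {
        static struct mailbox mb;
        mb.value = 42;            /* demo only: pretend the module CPU */
        mb.response_ready = 1;    /* has already posted a response     */
        printf("point value: %u\n",
               (unsigned)request_point_value(&mb, "FIC101.PV"));
        return 0;
    }
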
Also shown in Figure 5 is an example open system (or foreign system), for example, a Digital Equipment Corporation system which includes the DECnet network and protocol and a DEC processor 300 attached to the DECnet network. In the preferred embodiment, the communication between the DEC open system and the co-processor 200 is via an X-windows protocol (X-windows being a protocol defined by the Massachusetts Institute of Technology, Cambridge, Massachusetts) for graphical display information, and other open systems standards being used for data exchange. Any requests of the outside system to the LCN is made via the co-processor 200 through the shared memory 202 to the module CPU 38 as described above.
It is also desired to open up the graphics interface such that a display which is not on the LCN can be displayed onto the CRT 151 of the US 122. This is achieved by the interface to the graphics card 150 from the co-processor 200. Referring to Figure 6, there is shown a block diagram of the graphics card 150 of the preferred embodiment. The graphics card includes a card bus 152. Attached to the card bus 152 is a data memory 154 which contains the information which is to be displayed onto the CRT, and also contains some control information. A microprocessor 156 is also coupled to the card bus 152 and further is coupled to the module bus 36. A graphics processor 160 is coupled to the card bus 152 and performs all the processing for developing the information stored in the data memory 154, including some control functions. A shared memory 158 is coupled to the card bus 152. A connection is made from the card bus 152 to the co-processor 200, thereby providing the interface mentioned above to the graphics card 150 from the co-processor 200. The microprocessor 156 of the preferred embodiment of the graphics card 150 is a Motorola 68020 processor. The graphics card 150 is a two port graphics card, one port of the graphics card being tied to the module bus 36, which is how a display is driven from the LCN. The LCN 120 provides a "single window to the process," i.e., a screen display of what the process/process control system is doing. The second port is coupled to the co-processor 200 and provides the windows interface for the universal station 122. The windows interface is the X-windows interface which is well defined and well known to those skilled in the art (the interface being defined by MIT, Cambridge, Massachusetts). It is through the interface from the co-processor 200 that all the window displays [i.e., the screen display(s) of the open system(s)] and windows controls are performed, including commands to the graphics card 150 to specify where to place the single window to the process on the screen of the CRT 151. The interface between the graphics card 150 and the co-processor 200 is the full windows interface. One of the windows is the display referred to above as the "single window to the process" (sometimes referred to as the LCN window). The co-processor 200 commands the graphics card 150 where the LCN window is to be placed on the CRT 151 and its relative size on the display. X-windows is a well defined protocol for communication between the graphics card 150 (or any graphics card) and display, and a computer, permitting many windows to be displayed.
This includes displaying at least one window from the LCN and/or at least one window from the open system 300. In this system, a server is defined in X-windows as the machine that is driving the display (or that portion of the co-processor 200 which interfaces to the graphics card 150), and a client is the application program, in the present embodiment, the DEC processor 300.
The client 300 can have data which is desired to be displayed. The client 300 communicates with the server portion of the co-processor 200 through the X-windows protocol, indicating data to be displayed. The server portion of the co-processor 200 communicates with the graphics card 150 through a device dependent layer (DDL), which is provided by the vendor of the graphics card (in X-windows, via the DDX protocol). The microprocessor 156 maintains the integrity of the card bus 152 into the data memory 154. The processing of the data to be displayed on the CRT 151 is performed by the graphics processor 160. When a predetermined data screen is to be displayed, the microprocessor 156 (which accepts requests from the LCN 120 via the module bus 36) places the data in the shared memory 158; the data is subsequently processed by the graphics processor 160 and then stored in the data memory 154. When the open system 300 (via the client) desires to display some information, the information is communicated to the server portion of the co-processor 200, which then stores the information in the shared memory 158. The graphics processor 160 then processes that information and stores it in the data memory 154 for display. In that manner, and under the control of the graphics processor 160, the plurality of displays, i.e., windows, is displayed on the CRT 151.
It will be understood by those skilled in the art that the X-window protocol is essentially the open interface standard, the X-window protocol being readily available and well known to those skilled in the art. In the preferred embodiment the UNIX operating system is utilized, the UNIX operating system being able to run on many commercially available processors. Further information on the graphics card 150 of the preferred embodiment of the US 122 can be had by reference to U.S. Patent Numbers 4,490,797 and 4,663,619, although it will be understood that any graphics card can be utilized as discussed above. The graphics processor 160 of the preferred embodiment of the present invention is a Texas Instruments (TI) TMS 34020. The microprocessor 156 and the module CPU 38 are each a Motorola 68020. The co-processor 200 of the preferred embodiment of the present invention is a Motorola 68040, having bus capability with the other microprocessors of the system. It will be understood that a variety of processors can be utilized, including a reduced instruction set (RISC) processor available from Hewlett-Packard, among other processor manufacturers.
Although the preferred embodiment utilizes the UNIX operating system, it will be recognized by those skilled in the art that any operating system can be utilized, including OSF/1 from the Open Software Foundation, Cambridge, Massachusetts. Although the co-processor 200 is controlling the display in the preferred embodiment, the graphics card can also perform the display control. Since X-windows was readily available and performed the desired display control function, X-windows was utilized to take advantage of the availability of the desired control function. It will be recognized by those skilled in the art that implementation of the present invention is not limited to X-windows, and that any protocol can be utilized.
Thus it can be seen that the process control system 10 is an open system permitting other systems to interface into the LCN of the process control system and, because of the communication scheme described above, the integrity of the process control system 10 is maintained. Further, the graphics card 150, although not in immediate control of the display unit 151, guarantees that the graphic view (control view) to a field device (i.e., valve,...) or any other controls view of the process control system on the display unit is always maintained regardless of the operational state of the co-processor 200. If the co-processor 200 is running and controlling the display unit 151 (and in particular the actual display on the screen of the display unit 151) and a malfunction or some other anomaly occurs to the co-processor 200, the function of the graphics card 150 guarantees that a single view of the process control system is maintained. As discussed above, the co-processor is connected into the US 122 and has and controls a graphical interface through the display 151 and keyboard 153.
Referring to Figure 7, which comprises Figures 7A and 7B, there is shown an example of two displays of the display unit 151. Figure 7A shows an example of a typical normal display, and Figure 7B shows the display when an anomaly occurs with the co-processor 200, i.e., the fallback display. Figure 7A shows, for example, the windows which can be displayed. The windows always include a "view of the process," i.e., a control view from the process control system 10. Also included can be, for example, a window showing event history, coming from an outside system running a process control system application, for example a DEC computer system 300 as shown in Figure 5. Another window can be data coming from another outside computer system (not shown), such as an Apple computer. This computer system can be running another application program referred to as documentation (in the preferred embodiment of the process control system, the documentation of the process control system is created on an Apple computer). Still another window can be displayed, for example, lab data coming from a Hewlett-Packard computer system. The windows, except for the control view, are displayed on a single screen of the display unit 151, the display information for these windows coming from a number of outside computer systems connected into the co-processor 200. If an error is detected with the co-processor 200, the method of the present invention guarantees that the display windows from the outside systems are inhibited and the control view is the only display shown, zoomed to take up the entire screen of the display unit 151. This also serves as an indication to the operator that a malfunction has occurred with the interface to the outside systems.
The utilization of a low resolution touch screen system in a high resolution graphics environment, the method of the present invention, will now be described. The display of the preferred embodiment of the present invention has a 1280 by 1024 resolution, i.e., 1280 pixels wide by 1024 pixels high. A single character on the display is an 8 by 8 pixel character, and it is desirable to be able to position to a character. The touch screen system of the preferred embodiment of the present invention is an infrared technology having X coordinate and Y coordinate beams across the display which are broken upon pointing to a desired position of the display. The pointing can be done by a finger (or a pen, pencil, stylus,...) which breaks the beams. As is well known to those skilled in the art, one side of the touch screen system includes LEDs, and on the opposite side sensors are placed which receive the signals from the LEDs. When a beam is broken, a report is transmitted by the touch screen system to an information processing apparatus identifying the coordinate inputs of the position desired. In order for a report to be transmitted, both an X beam and a Y beam must be broken. The present invention utilizes a touch screen having a resolution of 127 wide by 95 high. Thus, for an 8 by 8 character size, a display resolution of 1280 by 1024 has a resultant target resolution of 160 wide by 128 high. In this particular instance the target resolution is higher than the touch screen resolution, the touch screen resolution being 127 by 95. Thus the desired objective of being able to position to a character cannot be met.
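By way of illustration only, the arithmetic of the preceding paragraph can be checked with a short Python sketch (no such code appears in the patent). It computes the target resolution from the display resolution and character size, compares it against the touch screen resolution, and, assuming X beams evenly spaced at the 10.15-pixel scale introduced further below (an assumption about the beam geometry), counts the character columns that no beam crosses:

    # Can a 127 x 95 touch grid resolve 8 x 8 pixel characters on a
    # 1280 x 1024 display? (Illustrative sketch, not from the patent.)

    def target_resolution(display_px, char_px):
        # Characters addressable along one axis.
        return display_px // char_px

    TOUCH_W, TOUCH_H = 127, 95                 # touch screen beam counts
    target_w = target_resolution(1280, 8)      # 160 character columns
    target_h = target_resolution(1024, 8)      # 128 character rows
    print(target_w <= TOUCH_W and target_h <= TOUCH_H)   # False: cannot position to a character

    # Assuming evenly spaced X beams at a 10.15-pixel pitch, count the
    # 8-pixel character columns that no beam crosses.
    beams = [round(i * 10.15) for i in range(TOUCH_W)]
    uncovered = [c for c in range(target_w)
                 if not any(c * 8 <= b < (c + 1) * 8 for b in beams)]
    print(len(uncovered))                      # 33 columns receive no beam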
In order to be able to achieve the desired objective of positioning to a single character, the display is zoomed in a predefined "window area" off to the side of the finger (or stylus), thereby permitting new data around the zoomed-up area to be drawn (displayed) and viewed. The characters around the reported position displayed in the window area have a character resolution of 16 by 16. This results in a target resolution of 80 wide by 64 high. As a result of the lower target resolution, the touch screen resolution of 127 wide by 95 high results in having at least one beam cross each character, making it possible to position to a desired character.
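Continuing the illustrative sketch above, the same check passes once characters are zoomed to 16 by 16 pixels:

    # Zoomed characters are 16 x 16 pixels, halving the target resolution.
    zoom_w = target_resolution(1280, 16)       # 80
    zoom_h = target_resolution(1024, 16)       # 64
    print(zoom_w <= TOUCH_W and zoom_h <= TOUCH_H)   # True: at least one beam per character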
The method of the present invention is implemented in the preferred embodiment wherein the graphics card 150 (more specifically, the microprocessor 156 and the graphics processor 160), sometimes referred to herein as the graphics controller, includes some logic, and the co-processor 200 includes some logic (in the X-server portion of the X-windows protocol and an extension of the X-server), the graphics card 150 and the co-processor 200 operating together to obtain the improved display of the present invention. Referring to Figure 8, there is shown an overview of the operation of the method of the present invention.
When a user of the open universal station 123 determines to select a target on the display 151, the operator points at that target on the face of the display 151. When this is done, the touch screen system (or more simply referred to herein as display 151) detects the touching of the display 151 and reports the x-y coordinates of that touch to the graphics card 150. The graphics card 150 then reports the touch screen coordinates to the X-server logic. Using this data, the X-server logic computes the zoom region (sometimes referred to herein as the predefined "window area" or "window") and notifies the graphics card 150 of the bounding region and location in which to zoom the display, and the graphics card 150 zooms up the area around the x-y touch coordinates into this zoom window. Due to the inherent inaccuracy of the touch screen system, it is likely that the initial touch position is not directly over the desired target. As a result, the operator will move the pointing device (finger, stylus, pointer,...) until the touch position is over the desired target. This results in the touch screen system sending new x-y coordinates to the graphics card 150. When these new coordinates are received, the graphics card 150 calculates new x-y coordinates to send to the X-server logic and zooms up the area around the new x-y coordinates into the zoom window. The location of the zoom window on the display remains fixed, but the data contained within the zoom window follows the pointer, as will be described in more detail hereinunder. This process continues until the pointing device is removed from the touch screen system. When the removal of the pointing device is detected, the touch screen system sends an exit code to the graphics card 150. Upon receipt of this signal, the graphics card stops zooming data into the zoom window and passes the exit code to the X-server logic. When the X-server logic detects the exit code, it removes the zoom window, and the image that was covered by the zoom window is restored.
When the server logic starts up, the zoom feature is included as part of the start-up procedure by the operator. If the zoom feature is not initialized, then the touch screen zooming function is disabled. If the zoom feature is started, then a zoom window of settable size and location, i.e., programmable, is defined. The operator indicates the size of the zoom region by height and width, indicates the x-offset and y-offset, and indicates whether the operator is standing or sitting, right handed or left handed,.... In this fashion the zoom window will be presented in the most unobstructed manner, i.e., the zoom window will be unobstructed from the operator's view as determined by the inputted initialization information. The cursor position is defined as the center of the zoom region and is not displayed when the display is in a zoom mode. When the finger is removed and the zoom region is deleted, the cursor will appear in its proper position on the display.
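A minimal sketch of one such placement rule follows, continuing the Python fragments above. The function name, the above-and-to-the-left default, and the clamping rule are illustrative assumptions; the patent's actual placement is governed by the programmed offsets and the edge inversion described near the end of this description:

    # Hypothetical zoom-window placement: offset from the touch point,
    # flipped when the window would fall off the screen.
    def place_zoom_window(touch_x, touch_y, win_w, win_h, x_off, y_off,
                          screen_w=1280, screen_h=1024):
        x = touch_x - x_off - win_w        # default: above and to the left
        y = touch_y - y_off - win_h
        if x < 0:                          # near the left edge: flip to the right
            x = touch_x + x_off
        if y < 0:                          # near the top edge: flip below
            y = touch_y + y_off
        x = min(x, screen_w - win_w)       # keep the window fully on screen
        y = min(y, screen_h - win_h)
        return x, y

    print(place_zoom_window(40, 30, 320, 320, 16, 16))   # upper-left entry -> (56, 46)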
Referring to Figure 9, which comprises Figures 9A and 9B, there is shown a flow diagram of the logic of the graphics card 150 for implementing the method of the present invention. When the finger (or stylus) is inserted onto the touch screen such that an X and a Y beam are broken (block 410), a report is received by the information handling apparatus, in the preferred embodiment the graphics processor 160 of the process control system 10. Upon the first entry of the finger or stylus onto the display screen, the report also indicates that this is an entry report (block 415). The graphics processor 160 checks to verify whether this is the first report (block 420) and, if it is, sets a first report flag (block 425). The x and y positions (T/Sx and T/Sy) from the touch screen system are accepted by the graphics processor 160 (block 430). The graphics processor calculates the actual screen x and y coordinates, ACTx and ACTy, utilizing the touch screen (T/S) positions from the touch screen system (block 435). The actual coordinates are determined using the equations ACTx = T/Sx * 10.15 and ACTy = T/Sy * 10.88. These positions, i.e., the T/S positions and the actual positions, are saved (block 440), and the actual positions are sent to the co-processor 200 (block 440). If the zooming touch feature is disabled, the processing proceeds to block 410 (block 444). If the zooming touch feature is enabled (block 444), a "zoom up around touch" flag is set (block 445) and the processing proceeds to block 410.
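As an illustrative sketch, the mapping of blocks 430-440 is a simple scaling. The factors 10.15 and 10.88 are the patent's; they plausibly approximate 1280/126 and 1024/94, i.e., the screen width and height divided by the number of intervals between the 127 and 95 beams, though that derivation is an assumption:

    # Touch-grid position to actual screen coordinates (blocks 430-440).
    def touch_to_screen(ts_x, ts_y):
        act_x = ts_x * 10.15               # x-scale: ~1280 px / 126 beam intervals
        act_y = ts_y * 10.88               # y-scale: ~1024 px / 94 beam intervals
        return act_x, act_y

    print(touch_to_screen(63, 47))         # ~(639.5, 511.4), near screen center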
A report is transmitted to the graphics processor 160 by the touch screen system about every 1/16 of a second, i.e., 16 reports per second. Since the initial report, as a result of placing the finger onto the touch screen, may not result in identifying the desired character, the finger or stylus may be moved. On each subsequent report, since the finger or stylus is still on the screen, the exit report check of block 415 will indicate a no; however, the first report determination will also be a no, thus proceeding to block 446. The graphics processor 160 determines if the zoom touch and the zoom window are enabled (blocks 446, 447). If neither is enabled, the processing proceeds to block 430. If both are enabled, the process proceeds to block 450. The new area to be zoomed around as the pointer is moved is now calculated. The T/Sx, T/Sy coordinates are accepted (block 450), and the XDIFF and YDIFF values are calculated as follows: XDIFF = (INITIAL T/Sx - T/Sx) * 5.08 and YDIFF = (INITIAL T/Sy - T/Sy) * 5.44 (block 455). The center-of-screen X, Y coordinates which are to be zoomed around are calculated (block 456) and transmitted to the co-processor 200 (block 457). The processing then proceeds to block 445. The characters are zoomed to a 16 by 16 pixel size, showing approximately 20 by 20 characters (depending on the size of the programmable zoom region) around the new center position. Thus, as the finger or stylus is moved, the zoomed area follows the stylus at half the distance (or half the "speed") such that ultimately the desired character is pointed to by the finger or stylus. An example of the half-speed display of the zoomed area is shown in Figure 10, which comprises Figures 10A and 10B. Recall that in all cases the zoomed region remains fixed in location on the display screen. When the desired (or target) character is pointed to by the cursor, i.e., is in the center of the zoom region, in the preferred embodiment, the finger or stylus is withdrawn.
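Note that 5.08 and 5.44 are half of the 10.15 and 10.88 scale factors, which is what produces the half-distance following just described. Continuing the sketches above, blocks 450-456 might be expressed as follows; how the difference is combined with the initial screen position is an assumption:

    # Half-speed tracking of the zoom centre (blocks 450-456).
    def new_zoom_center(init_ts, cur_ts, init_act):
        xdiff = (init_ts[0] - cur_ts[0]) * 5.08   # half of the 10.15 x-scale
        ydiff = (init_ts[1] - cur_ts[1]) * 5.44   # half of the 10.88 y-scale
        return init_act[0] - xdiff, init_act[1] - ydiff

    init_ts = (60, 50)
    init_act = touch_to_screen(*init_ts)
    # Pointer moves 4 beams (~40.6 screen px) right; the centre moves ~20.3 px.
    print(new_zoom_center(init_ts, (64, 50), init_act))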
When the finger or stylus is withdrawn, the touch screen system transmits an exit report to the graphics processor 160, and the graphics processor clears the first report flag (block 485), sends an exit code to the co-processor 200 (block 481), and checks to see if the zoom window was enabled (block 482). If the zoom window has not been enabled, the processing proceeds to block 410. If the zoom window was enabled, the cursor is restored (block 483), and the processing proceeds to block 410.
At block 410, the graphics processor is waiting to receive reports. If no report is received, the graphics processor proceeds to determine if the zoom up around touch flag was set (block 490), shown in Figure 9B. If the flag was set, the graphics processor 160 determines if the zoom window has been enabled (block 492). If neither condition is met, the processing repeats itself at block 410 and is in a constant wait loop. If both conditions are met, the graphics processor clears the zoom up around touch flag (block 494), obtains the zoom window parameters from the co-processor 200 (block 496), and the zoom-up area around the actual screen x-y touch coordinates is placed into the zoom window (block 498). The processing then returns to block 410.
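Condensed into code, and continuing the earlier sketches, the Figure 9 report handling might look like the following; the report representation, flag names, and the printed stand-ins for the co-processor notifications are assumptions, not the patent's implementation:

    # Illustrative restatement of the Figure 9 report loop (~16 reports/s).
    def report_loop(get_report, zoom_touch_enabled, zoom_window_enabled):
        first_report = False
        init_ts = init_act = None
        while True:
            report = get_report()                    # block 410
            if report == "exit":                     # blocks 481-483
                first_report = False
                print("exit code sent to co-processor; cursor restored")
                return
            ts_x, ts_y = report
            if not first_report:                     # blocks 420-445
                first_report = True
                init_ts = (ts_x, ts_y)
                init_act = touch_to_screen(ts_x, ts_y)
                print("initial actual position sent:", init_act)
            elif zoom_touch_enabled and zoom_window_enabled:   # blocks 446-457
                center = new_zoom_center(init_ts, (ts_x, ts_y), init_act)
                print("new zoom centre sent:", center)

    reports = iter([(60, 50), (62, 50), "exit"])
    report_loop(lambda: next(reports), True, True)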
Referring to Figure 12, there is shown a partial display screen showing row 1 of characters 16 through 19. The X position pixels and Y position pixels of the display screen are also shown for the pertinent characters. The touch screen apparatus includes only the X beams (the discussion also applies to the Y beams, although they are not shown). Beams 12, 13, and 14 are shown intersecting characters 16, 18, and 19. This is because the resolution of the touch screen apparatus, as mentioned above, is lower than the target resolution (assuming for purposes of example here that the target character is character 17). With a touch screen resolution of 127 by 95 and a target resolution of 160 by 128, the beams can only cross approximately one out of every ten pixels. Thus, for a character having 8 pixels, there will be a number of characters which will not have an intersecting or crossing beam. Referring to Figure 13, there is shown a partial zoomed area of the display of Figure 12. Generally the finger or stylus will break 2 or 3 beams, rarely 4 beams, in each direction. Assume for purposes of example that X beams 12, 13, and 14 are broken, character 17 being the character attempted to be pointed to (i.e., the desired character). The X coordinate position for beam 13 is transmitted as the actual T/Sx coordinate value.
According to the present invention, character 18 will be the "pointed to" character, such that character 18 will be zoomed to a 16 by 16 pixel character size (including approximately 20 by 20 characters around that initial position according to the present invention). Since character 17 is the desired character, as the finger is moved towards character 17 the character will appear under the cursor in the zoom region. At the point in time when character 17 is on the cursor, the finger is withdrawn. The beam positions are transmitted to the processing device, which calculates the x, y screen coordinates identifying character 17. When the finger is withdrawn, the display is restored to its original state. Referring to Figure 11, which comprises Figures 11A and 11B, there is shown a flow diagram of the X-server logic of the co-processor 200, which operates in conjunction with the graphics card logic to generate the display, and specifically to generate the window of the display 151, in the preferred embodiment of the present invention. Figure 11A shows the event handler of the X-server logic, including events, i.e., inputs, which are received from the open US 123, namely keyboard inputs, mouse inputs, touch screen inputs,.... Generally, when the display is in a zoom mode and an input is received from other than the touch screen, the zoom window is removed from the screen and the display is restored to a normal display without the window. When an event is received from the touch screen and is a coordinate event (i.e., not including the exit code), the logic of block 501 is activated. When the event is a touch screen exit code, the logic of block 511 is activated.
Referring to Figure 11B, a more detailed flow chart of the touch screen event processing in the co-processor 200 is shown. If the zoom feature is enabled (block 502) and the zoom window has not been mapped to the display (block 503), the window is generated. The size and position of the zoom window are determined (block 504), and the zoom window is mapped to the display (block 506). The information (i.e., parameters) of the zoom window is passed to the graphics card logic 150 (sometimes referred to as the graphics controller (G/C)) (block 507), and the graphics controller is informed that the zoom window is enabled (block 508). The logic then continues to block 521.
If the touch screen event includes an exit code (block 520), then the logic of block 511 is activated, in which the window is removed from the display. If the zoom window is mapped to the display (block 512), the zoom window is unmapped from the display (block 514), and the graphics controller is informed that the zoom window has been disabled (block 516). Thus the zoom window is removed from the screen. If the zoom window has not been mapped to the display (block 512), the X-server logic sends the new cursor location on into the server for further processing (block 521). In this manner the X-server logic generates or removes the zoom window from the display and indicates to the graphics controller logic whether the zoom window is enabled or disabled. The graphics controller logic subsequently places the zoomed data within the zoom window in accordance with the method of the present invention described above.
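A compact sketch of this Figure 11 handling is shown below; the class, method names, and print statements are illustrative assumptions standing in for the actual mapping, unmapping, and graphics controller notifications:

    # Illustrative X-server touch-event handling (Figure 11B).
    class ZoomServer:
        def __init__(self):
            self.zoom_feature_enabled = True
            self.zoom_window_mapped = False

        def on_touch_coordinate(self, x, y):         # blocks 501-508, 521
            if self.zoom_feature_enabled and not self.zoom_window_mapped:
                # Size, map, and announce the zoom window (blocks 504-508).
                self.zoom_window_mapped = True
                print("zoom window mapped; parameters passed to G/C")
            print("cursor location forwarded into server:", (x, y))

        def on_touch_exit(self):                     # blocks 511-516
            if self.zoom_window_mapped:
                self.zoom_window_mapped = False
                print("zoom window unmapped; G/C informed zoom disabled")

    srv = ZoomServer()
    srv.on_touch_coordinate(640, 512)
    srv.on_touch_exit()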
It will be understood by one skilled in the art that although some of the figures show the zoom window above and to the left of the pointer, the window may be in other locations relative to the pointer, such as above and to the right, below and to the left, or below and to the right of the pointer. The location of the window relative to the pointer is programmable from the data entered by the operator as described above. Further, it will be understood by one skilled in the art that when the pointer enters the display at the upper left hand corner and the window is programmed to be above and to the left of the pointer, the logic will invert the zoom window location such that it appears on the screen, i.e., below and to the right of the pointer. Similarly, when the pointer enters the display at an upper edge, the window will appear below the pointer. When the pointer enters the upper right corner of the display, the window is placed below and to the left,.... While there has been shown what is considered the preferred embodiment of the present invention, it will be manifest that many changes and modifications can be made therein without departing from the essential spirit and scope of the invention. It is intended, therefore, in the annexed claims, to cover all such changes and modifications which fall within the true scope of the invention.

Claims

1. A method for selecting a character being displayed on a display using a pointer element, wherein the display has a first resolution and a touch screen system associated with the display has a second resolution, and wherein the character has a third resolution resulting in a target resolution, and further wherein the target resolution is higher than the resolution of the touch screen system, the touch screen system transmitting to a data processing unit processing the display an X and Y coordinate signal of the character pointed to, the method comprising the steps of:
a) when a character is pointed to, accepting an initial X and Y coordinate value from the touch screen system, the pointer element having a current pointer location;
b) calculating a corresponding X and Y coordinate value of the display;
c) saving the initial X and Y coordinate values and the corresponding calculated X and Y coordinate values;
d) zooming the display a predetermined number of characters around the initial X and Y coordinate values into a zoom window, the zoom window having a predetermined size and having a predetermined location relative to an initial entry point of the pointer element;
e) if the character to be selected is not coincident with a cursor, the cursor being displayed in the center of the zoom window, i) moving the pointer element from the current pointer location to a new current pointer location towards the character to be selected, otherwise ii) proceeding to step (j);
f) while the pointer element is being moved towards the character to be selected, continuing to accept updated X and Y coordinate position values of the current pointer location on a periodic basis;
g) calculating a new center point of the display in the zoom window, the new center point following the pointer element at a predetermined distance with respect to the distance of the pointer element;
h) providing the predetermined number of characters around the new center point to be displayed in the zoom window;
i) repeating the method from step (e) until the character to be selected is in the center of the zoom window; and
j) withdrawing the pointer element, the new center point calculated from step (g) having the X and Y coordinate values of the character to be selected, thereby permitting a desired character to be selected when the target resolution is higher than the resolution of the touch screen system.
2. A method for selecting a character according to Claim 1, wherein the step of zooming comprises the step of: zooming each character within the predetermined number of characters around the initial X and Y coordinate values wherein each of the zoomed characters has a fourth resolution, such that the resultant target resolution is less than the resolution of the touch screen system, thereby permitting a predetermined character to be selected.
3. A method for selecting a character according to Claim 2, wherein the fourth resolution, the resolution of the zoomed characters, is twice the third resolution, the resolution of the characters displayed on the display.
4. A method for selecting a character according to Claim 3, wherein the predetermined number of characters zoomed around the initial X and Y coordinates is 20 characters by 20 characters.
5. A method for selecting a character according to Claim 1, wherein the step of calculating a new center point to zoom the display generates the new center point coordinates from the current pointer location to a relative position from the initial X and Y coordinate value, the relative position being a factor less than one, such that as the pointer element is moved the desired character ultimately gets to a position in the zoomed display wherein the cursor is coincident with the desired character.
6. A method for selecting a character according to Claim 2, wherein the step of calculating a new center point to zoom the display, generates the new center point coordinates from the current pointer location to a relative position from the initial X and Y coordinate value, the relative position being a factor less than one, such that as the pointer element is moved the desired character ultimately gets to a position in the zoom window wherein the cursor is coincident with the desired character.
7. A method for selecting a character according to Claim 5, wherein the factor used to determine the relative position of the new center point is one half.
8. A method for selecting a character according to Claim 7, wherein the predetermined size of the zoom window is programmable.
9. A method for selecting a character according to Claim 8, wherein the predetermined location of the zoom window is programmable such that the zoom window is viewable and unobstructed from view.
10. A method for selecting a character according to Claim 9, wherein the predetermined location of the zoom window remains in a fixed predetermined location as the pointer element moves over the display.
PCT/US1994/006755 1993-06-15 1994-06-15 A method for utilizing a low resolution touch screen system in a high resolution graphics environment WO1994029788A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US7783893A 1993-06-15 1993-06-15
US08/077,838 1993-06-15

Publications (1)

Publication Number Publication Date
WO1994029788A1 true WO1994029788A1 (en) 1994-12-22

Family

ID=22140349

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1994/006755 WO1994029788A1 (en) 1993-06-15 1994-06-15 A method for utilizing a low resolution touch screen system in a high resolution graphics environment

Country Status (1)

Country Link
WO (1) WO1994029788A1 (en)



US9798393B2 (en) 2011-08-29 2017-10-24 Apple Inc. Text correction processing
US10241752B2 (en) 2011-09-30 2019-03-26 Apple Inc. Interface for a virtual digital assistant
US10134385B2 (en) 2012-03-02 2018-11-20 Apple Inc. Systems and methods for name pronunciation
US9483461B2 (en) 2012-03-06 2016-11-01 Apple Inc. Handling speech synthesis of content for multiple languages
WO2013157013A1 (en) * 2012-04-17 2013-10-24 Hewlett - Packard Development Company, L.P. Selection of user interface elements of a graphical user interface
US9953088B2 (en) 2012-05-14 2018-04-24 Apple Inc. Crowd sourcing information to fulfill user requests
US10079014B2 (en) 2012-06-08 2018-09-18 Apple Inc. Name recognition system
US9495129B2 (en) 2012-06-29 2016-11-15 Apple Inc. Device, method, and user interface for voice-activated navigation and browsing of a document
US9576574B2 (en) 2012-09-10 2017-02-21 Apple Inc. Context-sensitive handling of interruptions by intelligent digital assistant
US9971774B2 (en) 2012-09-19 2018-05-15 Apple Inc. Voice-based media searching
US10978090B2 (en) 2013-02-07 2021-04-13 Apple Inc. Voice trigger for a digital assistant
US10199051B2 (en) 2013-02-07 2019-02-05 Apple Inc. Voice trigger for a digital assistant
US9368114B2 (en) 2013-03-14 2016-06-14 Apple Inc. Context-sensitive handling of interruptions
US9697822B1 (en) 2013-03-15 2017-07-04 Apple Inc. System and method for updating an adaptive speech recognition model
US9922642B2 (en) 2013-03-15 2018-03-20 Apple Inc. Training an at least partial voice command system
US9633674B2 (en) 2013-06-07 2017-04-25 Apple Inc. System and method for detecting errors in interactions with a voice-based digital assistant
US9966060B2 (en) 2013-06-07 2018-05-08 Apple Inc. System and method for user-specified pronunciation of words for speech synthesis and recognition
US9582608B2 (en) 2013-06-07 2017-02-28 Apple Inc. Unified ranking with entropy-weighted information for phrase-based semantic auto-completion
US9620104B2 (en) 2013-06-07 2017-04-11 Apple Inc. System and method for user-specified pronunciation of words for speech synthesis and recognition
US10657961B2 (en) 2013-06-08 2020-05-19 Apple Inc. Interpreting and acting upon commands that involve sharing information with remote devices
US9966068B2 (en) 2013-06-08 2018-05-08 Apple Inc. Interpreting and acting upon commands that involve sharing information with remote devices
US10176167B2 (en) 2013-06-09 2019-01-08 Apple Inc. System and method for inferring user intent from speech inputs
US10185542B2 (en) 2013-06-09 2019-01-22 Apple Inc. Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant
US9300784B2 (en) 2013-06-13 2016-03-29 Apple Inc. System and method for emergency calls initiated by voice command
US10791216B2 (en) 2013-08-06 2020-09-29 Apple Inc. Auto-activating smart responses based on activities from remote devices
US9620105B2 (en) 2014-05-15 2017-04-11 Apple Inc. Analyzing audio input for efficient speech and music recognition
US10592095B2 (en) 2014-05-23 2020-03-17 Apple Inc. Instantaneous speaking of content on touch devices
US9502031B2 (en) 2014-05-27 2016-11-22 Apple Inc. Method for supporting dynamic grammars in WFST-based ASR
US11257504B2 (en) 2014-05-30 2022-02-22 Apple Inc. Intelligent assistant for home automation
US9760559B2 (en) 2014-05-30 2017-09-12 Apple Inc. Predictive text input
US9842101B2 (en) 2014-05-30 2017-12-12 Apple Inc. Predictive conversion of language input
US10170123B2 (en) 2014-05-30 2019-01-01 Apple Inc. Intelligent assistant for home automation
US9430463B2 (en) 2014-05-30 2016-08-30 Apple Inc. Exemplar-based natural language processing
US10497365B2 (en) 2014-05-30 2019-12-03 Apple Inc. Multi-command single utterance input method
US9966065B2 (en) 2014-05-30 2018-05-08 Apple Inc. Multi-command single utterance input method
US9633004B2 (en) 2014-05-30 2017-04-25 Apple Inc. Better resolution when referencing to concepts
US9734193B2 (en) 2014-05-30 2017-08-15 Apple Inc. Determining domain salience ranking from ambiguous words in natural speech
US10083690B2 (en) 2014-05-30 2018-09-25 Apple Inc. Better resolution when referencing to concepts
US9785630B2 (en) 2014-05-30 2017-10-10 Apple Inc. Text prediction using combined word N-gram and unigram language models
US10169329B2 (en) 2014-05-30 2019-01-01 Apple Inc. Exemplar-based natural language processing
US10289433B2 (en) 2014-05-30 2019-05-14 Apple Inc. Domain specific language for encoding assistant dialog
US11133008B2 (en) 2014-05-30 2021-09-28 Apple Inc. Reducing the need for manual start/end-pointing and trigger phrases
US10078631B2 (en) 2014-05-30 2018-09-18 Apple Inc. Entropy-guided text prediction using combined word and character n-gram language models
US9715875B2 (en) 2014-05-30 2017-07-25 Apple Inc. Reducing the need for manual start/end-pointing and trigger phrases
US9338493B2 (en) 2014-06-30 2016-05-10 Apple Inc. Intelligent automated assistant for TV user interactions
US10659851B2 (en) 2014-06-30 2020-05-19 Apple Inc. Real-time digital assistant knowledge updates
US10904611B2 (en) 2014-06-30 2021-01-26 Apple Inc. Intelligent automated assistant for TV user interactions
US9668024B2 (en) 2014-06-30 2017-05-30 Apple Inc. Intelligent automated assistant for TV user interactions
US10446141B2 (en) 2014-08-28 2019-10-15 Apple Inc. Automatic speech recognition based on user feedback
US10431204B2 (en) 2014-09-11 2019-10-01 Apple Inc. Method and apparatus for discovering trending terms in speech requests
US9818400B2 (en) 2014-09-11 2017-11-14 Apple Inc. Method and apparatus for discovering trending terms in speech requests
US9606986B2 (en) 2014-09-29 2017-03-28 Apple Inc. Integrated word N-gram and class M-gram language models
US9646609B2 (en) 2014-09-30 2017-05-09 Apple Inc. Caching apparatus for serving phonetic pronunciations
US9886432B2 (en) 2014-09-30 2018-02-06 Apple Inc. Parsimonious handling of word inflection via categorical stem + suffix N-gram language models
US10074360B2 (en) 2014-09-30 2018-09-11 Apple Inc. Providing an indication of the suitability of speech recognition
US10127911B2 (en) 2014-09-30 2018-11-13 Apple Inc. Speaker identification and unsupervised speaker adaptation techniques
US9986419B2 (en) 2014-09-30 2018-05-29 Apple Inc. Social reminders
US9668121B2 (en) 2014-09-30 2017-05-30 Apple Inc. Social reminders
US11556230B2 (en) 2014-12-02 2023-01-17 Apple Inc. Data detection
US10552013B2 (en) 2014-12-02 2020-02-04 Apple Inc. Data detection
US9711141B2 (en) 2014-12-09 2017-07-18 Apple Inc. Disambiguating heteronyms in speech synthesis
US9865280B2 (en) 2015-03-06 2018-01-09 Apple Inc. Structured dictation using intelligent automated assistants
US10567477B2 (en) 2015-03-08 2020-02-18 Apple Inc. Virtual assistant continuity
US9721566B2 (en) 2015-03-08 2017-08-01 Apple Inc. Competing devices responding to voice triggers
US10311871B2 (en) 2015-03-08 2019-06-04 Apple Inc. Competing devices responding to voice triggers
US9886953B2 (en) 2015-03-08 2018-02-06 Apple Inc. Virtual assistant activation
US11087759B2 (en) 2015-03-08 2021-08-10 Apple Inc. Virtual assistant activation
US9899019B2 (en) 2015-03-18 2018-02-20 Apple Inc. Systems and methods for structured stem and suffix language models
US9842105B2 (en) 2015-04-16 2017-12-12 Apple Inc. Parsimonious continuous-space phrase representations for natural language processing
US10083688B2 (en) 2015-05-27 2018-09-25 Apple Inc. Device voice control for selecting a displayed affordance
US10127220B2 (en) 2015-06-04 2018-11-13 Apple Inc. Language identification from short strings
US10101822B2 (en) 2015-06-05 2018-10-16 Apple Inc. Language input correction
US11025565B2 (en) 2015-06-07 2021-06-01 Apple Inc. Personalized prediction of responses for instant messaging
US10186254B2 (en) 2015-06-07 2019-01-22 Apple Inc. Context-based endpoint detection
US10747498B2 (en) 2015-09-08 2020-08-18 Apple Inc. Zero latency digital assistant
US10671428B2 (en) 2015-09-08 2020-06-02 Apple Inc. Distributed personal assistant
US11500672B2 (en) 2015-09-08 2022-11-15 Apple Inc. Distributed personal assistant
US9697820B2 (en) 2015-09-24 2017-07-04 Apple Inc. Unit-selection text-to-speech synthesis using concatenation-sensitive neural networks
US10366158B2 (en) 2015-09-29 2019-07-30 Apple Inc. Efficient word encoding for recurrent neural network language models
US11010550B2 (en) 2015-09-29 2021-05-18 Apple Inc. Unified language modeling framework for word prediction, auto-completion and auto-correction
US11587559B2 (en) 2015-09-30 2023-02-21 Apple Inc. Intelligent device identification
US10691473B2 (en) 2015-11-06 2020-06-23 Apple Inc. Intelligent automated assistant in a messaging environment
US11526368B2 (en) 2015-11-06 2022-12-13 Apple Inc. Intelligent automated assistant in a messaging environment
US10049668B2 (en) 2015-12-02 2018-08-14 Apple Inc. Applying neural network language models to weighted finite state transducers for automatic speech recognition
US10223066B2 (en) 2015-12-23 2019-03-05 Apple Inc. Proactive assistance based on dialog communication between devices
US10446143B2 (en) 2016-03-14 2019-10-15 Apple Inc. Identification of voice inputs providing credentials
US9934775B2 (en) 2016-05-26 2018-04-03 Apple Inc. Unit-selection text-to-speech synthesis based on predicted concatenation parameters
US9972304B2 (en) 2016-06-03 2018-05-15 Apple Inc. Privacy preserving distributed evaluation framework for embedded personalized systems
US10249300B2 (en) 2016-06-06 2019-04-02 Apple Inc. Intelligent list reading
US11069347B2 (en) 2016-06-08 2021-07-20 Apple Inc. Intelligent automated assistant for media exploration
US10049663B2 (en) 2016-06-08 2018-08-14 Apple Inc. Intelligent automated assistant for media exploration
US10354011B2 (en) 2016-06-09 2019-07-16 Apple Inc. Intelligent automated assistant in a home environment
US10490187B2 (en) 2016-06-10 2019-11-26 Apple Inc. Digital assistant providing automated status report
US10733993B2 (en) 2016-06-10 2020-08-04 Apple Inc. Intelligent digital assistant in a multi-tasking environment
US10192552B2 (en) 2016-06-10 2019-01-29 Apple Inc. Digital assistant providing whispered speech
US11037565B2 (en) 2016-06-10 2021-06-15 Apple Inc. Intelligent digital assistant in a multi-tasking environment
US10509862B2 (en) 2016-06-10 2019-12-17 Apple Inc. Dynamic phrase expansion of language input
US10067938B2 (en) 2016-06-10 2018-09-04 Apple Inc. Multilingual word prediction
US10297253B2 (en) 2016-06-11 2019-05-21 Apple Inc. Application integration with a digital assistant
US11152002B2 (en) 2016-06-11 2021-10-19 Apple Inc. Application integration with a digital assistant
US10269345B2 (en) 2016-06-11 2019-04-23 Apple Inc. Intelligent task discovery
US10089072B2 (en) 2016-06-11 2018-10-02 Apple Inc. Intelligent device arbitration and control
US10521466B2 (en) 2016-06-11 2019-12-31 Apple Inc. Data driven natural language event detection and classification
US10593346B2 (en) 2016-12-22 2020-03-17 Apple Inc. Rank-reduced token representation for automatic speech recognition
US11405466B2 (en) 2017-05-12 2022-08-02 Apple Inc. Synchronization and task delegation of a digital assistant
US10791176B2 (en) 2017-05-12 2020-09-29 Apple Inc. Synchronization and task delegation of a digital assistant
US10810274B2 (en) 2017-05-15 2020-10-20 Apple Inc. Optimizing dialogue policy decisions for digital assistants using implicit feedback
US20210200304A1 (en) * 2019-12-31 2021-07-01 Lenovo (Beijing) Co., Ltd. Display method and electronic device

Similar Documents

Publication | Publication Date | Title
WO1994029788A1 (en) A method for utilizing a low resolution touch screen system in a high resolution graphics environment
EP0575146A2 (en) A method for utilizing a low resolution touch screen system in a high resolution graphics environment
EP0575150B1 (en) Method for controlling window displays in an open systems windows environment
EP0575144B1 (en) A method of coupling open systems to a proprietary network
EP0925532B1 (en) Control system monitor
CA2097558C (en) Directly connected display of process control system in an open systems windows environment
CA2266446C (en) Emulator for visual display object files and method of operation thereof
EP0527596A2 (en) Generic data exchange
EP0575149B1 (en) Priority based graphics in an open systems windows environment
EP2036076A2 (en) Apparatus and methods for ensuring visibility of display window
EP0575145B1 (en) Open distributed digital control system
US6144850A (en) Real-time monitoring of remote base station transceiver subsystems
EP0575147A2 (en) Device dependent layer of a windowing system for a process control system display
CN1068906A (en) Method and apparatus for processing concurrent pick events
WO1998013747A1 (en) Method for re-invoking previously displayed software application displays in a multi-window environment

Legal Events

Date | Code | Title | Description
AK Designated states

Kind code of ref document: A1

Designated state(s): JP

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE

121 Ep: The EPO has been informed by WIPO that EP was designated in this application
122 Ep: PCT application non-entry in European phase