US20060136833A1 - Apparatus and method for chaining objects in a pointer drag path - Google Patents
- Publication number
- US20060136833A1 (application US 11/012,908)
- Authority
- US
- United States
- Prior art keywords
- objects
- chain
- pointer
- drag path
- chaining
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
Definitions
- the file3.xls object is selected, which results in an object corresponding to the file3.xls object being added to the chain of objects (which was previously empty).
- the chain of objects 125 contains a single object 610 corresponding to the file3.xls object, as shown in FIG. 6 .
- the workflow step 1272 could be displayed to the user with the two rules that were touched.
- the workflow step 1272 could be displayed as soon as the Workflow Step # 2 1210 is selected, and the rules 1220 and 1230 could then be added to the display as soon as they are touched in the drag path.
- the preferred embodiment also gives rise to a new pointer event.
- Pointer pickup events and pointer drop events are known in the art.
- the preferred embodiments allow the defining of a new pointer event, which we call a “pointer touch event”. This new event causes an object that is touched in the drag path of a selected object to be stored in the chain of objects.
- “chain” and “chaining” as used herein expressly extend to any and all data structures and methods of linking or grouping together objects in a drag path of a selected object.
Abstract
An apparatus and method for a graphical user interface allow performing operations simply by dragging a first object to touch a second object. The selection of the first object places a corresponding first object in a chain of objects. When the selected first object touches a second object, a corresponding second object is added to the chain of objects. This process may continue for the selection of many objects by merely touching each object with the selected first object, which causes a corresponding object to be added to the chain of objects. The chain of objects may then be processed as an atomic group of operations that may be rolled back if any of the operations in the group fail.
Description
- This patent application is related to the patent application entitled “Apparatus and Method for Pointer Drag Path Operations”, Attorney Docket No. ROC920040286US1, U.S. Ser. No. ______ filed on ______, which is incorporated herein by reference.
- 1. Technical Field
- This invention generally relates to computer systems and more specifically relates to graphical user interfaces for computer systems.
- 2. Background Art
- Early computer systems used command-based operating systems and software applications. These command-based systems provided a user interface that required the user to memorize a relatively large number of commands in order to perform meaningful work. The user interfaces for these command-driven computer programs required a relatively high level of skill to operate, and were not considered to be “user-friendly.” With the introduction of the IBM personal computer (PC), computers became more widely available, both in the workplace and in homes, and the computer industry soon recognized the need to provide more user-friendly interfaces to computer programs. As a result, many different operating systems were introduced that provided a graphical user interface (GUI), including IBM's OS/2, Microsoft Windows, and the Apple Macintosh. Software applications with graphical user interfaces soon followed, and the vast majority of computer programs running on personal computers today include a user-friendly graphical user interface.
- Most graphical user interfaces provide many similar features. The basic display area in a graphical user interface is known as a window. A single window may fill the entire display, or the display can be split into a number of different windows. Most graphical user interfaces provide a menu bar that presents several drop-down lists of commands the user can perform. Various toolbars may also be provided that are customized to performing certain tasks within the computer program. A pointing device such as a trackball or a mouse is generally used to select various commands and options by clicking on the appropriate buttons on the menu bar or toolbar, or by selecting particular objects or portions of a window. Many operations in known graphical user interfaces may be performed by selecting an object by pressing a pointer button, holding the pointer button down while moving the object to a desired location, and dropping the object at the desired location by releasing the pointer button. The moving of an object while holding a pointer button down (i.e., while the object is selected) is commonly referred to as “dragging” the object. For example, a user can delete a file by selecting a file icon by pressing a pointer button, dragging the file icon to a recycle bin icon or a trash can icon, then releasing the pointer button to drop the file icon onto the recycle bin or trash can. A user can move a file by selecting a file icon and dragging the file icon to a new directory, then releasing the pointer button to drop the file icon onto the new directory. Both of these examples relate to moving a file, but neither allows performing operations on the file without moving it. These and other operations that allow a user to drag an object in a graphical user interface are known in the art.
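The press-drag-release sequence described above can be sketched as a tiny state machine. The class, method names, and string object identifiers below are illustrative assumptions, not any real toolkit's API:

```python
# Minimal sketch of conventional drag-and-drop handling (illustrative only;
# event names and object identifiers are assumptions, not a real GUI API).
class DragAndDrop:
    def __init__(self):
        self.dragged = None          # the object selected by the pointer press

    def pointer_down(self, obj):
        """Pressing the pointer button selects (picks up) the object."""
        self.dragged = obj

    def pointer_up(self, target):
        """Releasing the button drops the dragged object on the target."""
        dropped, self.dragged = self.dragged, None
        if dropped is not None and target is not None:
            return (dropped, target)  # the (source, destination) pair to act on
        return None
```

For example, `pointer_down("file.txt")` followed by `pointer_up("recycle_bin")` yields the pair to delete; note that in this prior-art model the drag path between press and release carries no information.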
- While using a pointer is a very convenient way to navigate a graphical user interface, the user still must make multiple pointer clicks to perform most operations in ways that are often not intuitive or easy to perform. For example, if the user desires to combine two logical expressions A and B with a logical operator AND, the user could select the first logical expression A with a pointer click, select the second logical expression B with a pointer click, then click on an AND button to logically AND these two logical expressions together. This sequence of operations is different than the way a human user looks at the logical expression A AND B. As a result, a user's efficiency may be negatively affected by the current user interfaces known in the art. Without a mechanism that simplifies the use of a graphical user interface, users will have to continue using outdated graphical user interfaces that require an excessive number of pointer clicks to accomplish a desired task and that do not support performing computer tasks in a quick and intuitive fashion.
- According to the preferred embodiments, an apparatus and method for a graphical user interface allow performing operations simply by dragging a first object to touch a second object. The selection of the first object places a corresponding first object in a chain of objects. When the selected first object touches a second object, a corresponding second object is added to the chain of objects. This process may continue for the selection of many objects by merely touching these objects with the selected first object. The addition of objects to the chain is performed according to predefined rules. The chain of objects may then be processed as an atomic group of operations that may be rolled back if any of the operations in the group fail. The processing of objects in the chain is also performed according to predefined rules. In this manner, the number of pointer clicks to implement a particular function may be reduced and operations may be performed in a more intuitive manner, thereby enhancing the efficiency of the user. The preferred embodiments thus provide a way to perform functions in a graphical user interface in a way that is faster, more intuitive, and uses fewer pointer clicks than is possible in the prior art.
- The foregoing and other features and advantages of the invention will be apparent from the following more particular description of preferred embodiments of the invention, as illustrated in the accompanying drawings.
- The preferred embodiments of the present invention will hereinafter be described in conjunction with the appended drawings, where like designations denote like elements, and:
- FIG. 1 is a block diagram of an apparatus in accordance with the preferred embodiments;
- FIG. 2 is a flow diagram of a method for performing pointer drag path chaining operations in accordance with the preferred embodiments;
- FIG. 3 is a flow diagram of a method for a user of a graphical user interface to perform chaining of objects in accordance with the preferred embodiments;
- FIG. 4 is a diagram of a sample GUI window illustrating the methods shown in FIGS. 2 and 3;
- FIG. 5 is a table of rules that govern the function of a pointer drag path chaining mechanism for a first example in accordance with the preferred embodiments;
- FIG. 6 is a diagram of a chain of objects in accordance with the preferred embodiments immediately after selecting the file3.xls object in FIG. 4;
- FIG. 7 is a diagram of the chain of objects in FIG. 6 after the pointer in FIG. 4 touches the Compress button 420;
- FIG. 8 is a diagram of the chain of objects in FIG. 7 after the pointer in FIG. 4 touches the Encrypt button 430;
- FIG. 9 is a diagram of the chain of objects in FIG. 8 after the pointer in FIG. 4 touches the MyFiles object 440;
- FIG. 10 shows the resulting atomic group of operations represented by the objects in FIG. 9;
- FIG. 11 is a table of rules that govern the function of a pointer drag path chaining mechanism for a second example in accordance with the preferred embodiments;
- FIG. 12 is a diagram of a sample GUI window illustrating the method shown in FIG. 2 in accordance with the rules shown in FIG. 11;
- FIG. 13 is a diagram of a chain of objects in accordance with the preferred embodiments immediately after selecting the Workflow Step #2 button 1210 in FIG. 12;
- FIG. 14 is a diagram of the chain of objects in FIG. 13 after the dragged object in FIG. 12 touches the Rule #2 button 1220;
- FIG. 15 is a diagram of the chain of objects in FIG. 14 after the dragged object in FIG. 12 touches the Rule #1 button 1230;
- FIG. 16 is a diagram of the chain of objects in FIG. 15 after the dragged object in FIG. 12 touches the Workflow Step #1 button 1240;
- FIG. 17 is a diagram of the chain of objects in FIG. 16 after the dragged object in FIG. 12 touches the Rule #2 button 1220;
- FIG. 18 is a diagram of the chain of objects in FIG. 17 after the dragged object in FIG. 12 touches the Rule #3 button 1250;
- FIG. 19 is a diagram of the chain of objects in FIG. 18 after the dragged object in FIG. 12 touches the Workflow Step #3 button 1260; and
- FIG. 20 is a diagram of the chain of objects in FIG. 19 after the dragged object in FIG. 12 touches the Rule #3 button 1250.
- The preferred embodiments provide pointer drag path chaining operations that enhance the power of a graphical user interface by building a chain of objects that correspond to a selected object and to objects that are touched while dragging the selected object. The chain of objects may then be processed as an atomic group of operations that may be rolled back if any of the operations in the group fail. The addition of objects to the chain and the processing of objects in the chain are done according to predefined rules. In this manner, the number of pointer clicks to implement a particular function may be reduced and operations may be performed in a more intuitive manner, thereby enhancing the efficiency of the user.
- Referring now to
FIG. 1, a computer system 100 is one suitable implementation of an apparatus in accordance with the preferred embodiments of the invention. Computer system 100 is an IBM eServer iSeries computer system. However, those skilled in the art will appreciate that the mechanisms and methods of the present invention apply equally to any computer system, regardless of whether the computer system is a complicated multi-user computing apparatus, a single user workstation, or an embedded control system. As shown in FIG. 1, computer system 100 comprises a processor 110, a main memory 120, a mass storage interface 130, a display interface 140, a network interface 150, and a pointer interface 152. These system components are interconnected through the use of a system bus 160. Mass storage interface 130 is used to connect mass storage devices (such as a direct access storage device 155) to computer system 100. One specific type of direct access storage device 155 is a readable and writable CD-RW drive, which may store data to and read data from a CD-RW 195. -
Main memory 120 in accordance with the preferred embodiments contains data 121, an operating system 122, and a pointer drag path chaining mechanism 123. Data 121 represents any data that serves as input to or output from any program in computer system 100. Operating system 122 is a multitasking operating system known in the industry as OS/400; however, those skilled in the art will appreciate that the spirit and scope of the present invention is not limited to any one operating system. Pointer drag path chaining mechanism 123 provides enhanced functionality for the operating system 122 and for applications running on the computer system 100 by allowing the chaining together of objects that correspond to objects in a GUI so the objects may be processed as an atomic group of operations instead of being processed one at a time. This process of chaining objects allows a user interface that is more intuitive and easier to use (and hence more efficient) than known graphical user interfaces. An object is first selected using a pointer 124 that is generated and controlled by a suitable pointing device 154 (e.g., mouse or trackball) coupled to the pointer interface 152. An object corresponding to the selected first object is then added to a chain of objects 125 that represents a set of operators and operands that will be processed all at once when a defined event occurs, such as a pointer drop of an object. While the object is selected, the selected object may be dragged to touch one or more other objects, which results in one or more corresponding objects being added to the chain of objects 125. Note that an object will not necessarily be added to the chain of objects 125 each time an object in the GUI is touched by the selected object. Whether or not the pointer drag path chaining mechanism 123 will add an object to the chain of objects 125 when the selected object touches a different object may be determined using predefined rules 126.
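One way to picture the chain of objects 125 and its predefined rules 126 is sketched below. The class and field names are assumptions made for illustration; the patent does not specify an implementation:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class ChainObject:
    """An entry in the chain that corresponds to a GUI object.
    The kind/name fields are illustrative, not from the patent."""
    kind: str   # e.g. "operand" or "operator"
    name: str   # e.g. "file3.xls" or "Compress"

@dataclass
class ObjectChain:
    """The chain of objects (125) together with its predefined rules (126)."""
    # Each rule answers: given the current chain, may the touched object be added?
    rules: List[Callable[[List[ChainObject], ChainObject], bool]]
    items: List[ChainObject] = field(default_factory=list)

    def try_add(self, touched: ChainObject) -> bool:
        """Add the touched object only if some predefined rule permits it."""
        if any(rule(self.items, touched) for rule in self.rules):
            self.items.append(touched)
            return True
        return False
```

Selecting the first object would seed `items` with its corresponding entry; each touch in the drag path then goes through `try_add`, so touched objects that no rule permits are simply ignored.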
Once the pointer 124 deselects the first object, all of the operations represented in the chain of objects 125 will be processed according to predefined rules 126. In the preferred embodiments, the operations represented in the chain of objects 125 are performed as an atomic group of operations. By treating the chained operations as an atomic group of operations, all of the operations in the group may be rolled back in the event that any operation in the group fails. The function of the pointer drag path chaining mechanism 123 is shown in more detail below with reference to FIGS. 2-20.
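Processing the chain as an atomic group with rollback might look like the following sketch. The pairing of each operation with an explicit undo is an assumption; the patent leaves the rollback mechanism open:

```python
def run_atomic(operations):
    """Run a group of (do, undo) operation pairs atomically: if any step
    raises, undo the already-completed steps in reverse order, then re-raise.
    The (do, undo) structure is an illustrative assumption."""
    completed = []
    try:
        for do, undo in operations:
            do()
            completed.append(undo)
    except Exception:
        for undo in reversed(completed):  # e.g. uncompress after a failed encrypt
            undo()
        raise
```

Running the undos in reverse order restores the state that existed before the group started, which is what lets a partially failed chain leave no visible effect.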
Computer system 100 utilizes well known virtual addressing mechanisms that allow the programs of computer system 100 to behave as if they only have access to a large, single storage entity instead of access to multiple, smaller storage entities such as main memory 120 and DASD device 155. Therefore, while data 121, operating system 122 and pointer drag path chaining mechanism 123 are shown to reside in main memory 120, those skilled in the art will recognize that these items are not necessarily all completely contained in main memory 120 at the same time. It should also be noted that the term “memory” is used herein to generically refer to the entire virtual memory of computer system 100, and may include the virtual memory of other computer systems coupled to computer system 100. -
Processor 110 may be constructed from one or more microprocessors and/or integrated circuits. Processor 110 executes program instructions stored in main memory 120. Main memory 120 stores programs and data that processor 110 may access. When computer system 100 starts up, processor 110 initially executes the program instructions that make up operating system 122. Operating system 122 is a sophisticated program that manages the resources of computer system 100. Some of these resources are processor 110, main memory 120, mass storage interface 130, display interface 140, network interface 150, pointer interface 152, and system bus 160. - Although
computer system 100 is shown to contain only a single processor and a single system bus, those skilled in the art will appreciate that the present invention may be practiced using a computer system that has multiple processors and/or multiple buses. In addition, the interfaces that are used in the preferred embodiment each include separate, fully programmed microprocessors that are used to off-load compute-intensive processing from processor 110. However, those skilled in the art will appreciate that the present invention applies equally to computer systems that simply use I/O adapters to perform similar functions. -
Display interface 140 is used to directly connect one or more displays 165 to computer system 100. These displays 165, which may be non-intelligent (i.e., dumb) terminals or fully programmable workstations, are used to allow system administrators and users to communicate with computer system 100. Note, however, that while display interface 140 is provided to support communication with one or more displays 165, computer system 100 does not necessarily require a display 165, because all needed interaction with users and other processes may occur via network interface 150. -
Network interface 150 is used to connect other computer systems and/or workstations (e.g., 175 in FIG. 1) to computer system 100 across a network 170. The present invention applies equally no matter how computer system 100 may be connected to other computer systems and/or workstations, regardless of whether the network connection 170 is made using present-day analog and/or digital techniques or via some networking mechanism of the future. In addition, many different network protocols can be used to implement a network. These protocols are specialized computer programs that allow computers to communicate across network 170. TCP/IP (Transmission Control Protocol/Internet Protocol) is an example of a suitable network protocol. -
Pointer interface 152 is used to connect a pointing device 154 (such as a mouse or trackball) to computer system 100. A user may use the pointing device 154 to select an object in a graphical user interface. Once an object is selected, it may be dragged by moving the object while selected to change the object's location in the graphical user interface. While the mouse is the most commonly-used pointing device in use today, the preferred embodiments expressly extend to any and all pointing devices, whether currently known or developed in the future. - At this point, it is important to note that while the present invention has been and will continue to be described in the context of a fully functional computer system, those skilled in the art will appreciate that the present invention is capable of being distributed as a program product in a variety of forms, and that the present invention applies equally regardless of the particular type of signal bearing media used to actually carry out the distribution. Examples of suitable signal bearing media include: recordable type media such as floppy disks and CD-RW (e.g., 195 of
FIG. 1), and transmission type media such as digital and analog communications links. - In the discussion that follows, the terms “click” and “clicking” are used to denote pressing a button on a pointing device (such as a mouse or a trackball), which is one specific example of a way to perform a pointer selection of a GUI object within the scope of the preferred embodiments. The preferred embodiments expressly extend to any type of pointing device and any type of mechanism for selecting an object in a graphical user interface, including switch closures, optical detectors, voice commands, or any other way for selecting an object in a GUI, whether now known or developed in the future.
- Referring to
FIG. 2, method 200 begins by selecting an object in a graphical user interface (step 210). One suitable example for selecting an object is to place a mouse pointer on the object and click a mouse button to select the object. The object, once selected, may be dragged (step 220). If the dragged object touches a different object (step 230=YES), and if the predefined rules (i.e., rules 126 in FIG. 1) indicate the touched object may be chained (step 240=YES), the touched object is added to the chain (step 250). If the dragged object is not dropped (step 260=NO), method 200 loops back to step 230 and continues, potentially adding many touched objects to the chain. Once the dragged object is dropped (step 260=YES), the chain of objects is processed according to the predefined rules (step 270). These predefined rules are preferably included in the rules 126 shown in FIG. 1. - The preferred embodiments expressly extend to the chaining of GUI objects as well as the chaining of different objects that correspond to GUI objects. In the case of chaining different objects, an object in the GUI that may be touched by a dragged object has a corresponding object that may be added to the chain of objects. Thus, in the most preferred implementation, the chain of objects is a chain of objects that correspond to GUI objects instead of a chain of GUI objects themselves. The objects in the chain that correspond to GUI objects may simply indicate whether the object is an operand or an operator, and the type of operand or operator.
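The steps of method 200 can be summarized in a short sketch. The `(kind, obj)` event tuples are an illustrative encoding of the pointer events, not an API defined by the patent:

```python
def drag_path_chaining(events, rules):
    """Sketch of method 200. `events` yields (kind, obj) pairs with kind in
    {"select", "touch", "drop"}; `rules` are predicates over (chain, obj).
    Both encodings are assumptions made for illustration."""
    chain = []
    for kind, obj in events:
        if kind == "select":                              # step 210: selection seeds the chain
            chain = [obj]
        elif kind == "touch":                             # step 230: dragged object touches obj
            if any(rule(chain, obj) for rule in rules):   # step 240: consult predefined rules
                chain.append(obj)                         # step 250: chain the touched object
        elif kind == "drop":                              # step 260: drop ends the drag
            return chain                                  # step 270: chain is then processed
    return chain
```

A touch that no rule permits simply falls through the inner test, which matches the step 240=NO branch of the flow diagram.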
- Note that the discussion herein is couched in terms of “operators” and “operands”. An operator is broadly defined herein to mean anything that can perform a function of any kind on some operand. An operand is broadly defined herein to mean anything that can be acted upon by some operator. A simple mathematical example is 2+3. The numbers 2 and 3 are operands, and the plus sign is an operator.
- The objects being touched by a selected object may include a defined region that must be contacted by the dragged object or pointer to add the object to the chain of objects. This defined region may be the outline of the graphical icon for the object in the graphical user interface, or may be some region that is larger or smaller than the graphical icon for the object. In addition, the defined region may have any suitable shape, including shapes related to the shape of the icon and other unrelated shapes as well.
- In the alternative, a selected object that is being dragged must come within a defined proximity of an object to “touch” the object. This is an alternative way to define a region that must be contacted by the object being dragged or by the pointer in order for the touched object to be added to the chain of objects. For example, the region could be defined by the outline of the icon of the object being dragged across, plus some defined space around the icon. In the alternative, the region could be defined in simple Cartesian coordinates. Note that the defined region may be smaller than the icon, the same size as the icon, or larger than the icon. Note that a first object is in a drag path of a selected object if any defined region corresponding to the selected object or the pointer touches any defined region corresponding to the first object during the dragging of the selected object.
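Both the "defined region" and the "defined proximity" variants reduce to an overlap test between two regions. The axis-aligned rectangle below is a simplifying assumption, since the patent allows regions of any shape:

```python
from dataclasses import dataclass

@dataclass
class Region:
    """Axis-aligned defined region around an icon (a simplifying assumption;
    the patent permits arbitrary region shapes)."""
    left: float
    top: float
    right: float
    bottom: float

    def inflate(self, margin: float) -> "Region":
        """Grow the region by a margin, modeling the 'defined proximity' variant."""
        return Region(self.left - margin, self.top - margin,
                      self.right + margin, self.bottom + margin)

    def touches(self, other: "Region") -> bool:
        """True when the two regions overlap, i.e. one object 'touches' the other."""
        return (self.left <= other.right and other.left <= self.right and
                self.top <= other.bottom and other.top <= self.bottom)
```

On each pointer move, the dragged object's region (possibly inflated) would be tested against the defined region of every candidate object to decide whether that object is in the drag path.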
- When an object is being dragged, the pointer may show the outline of the object being dragged, or may show the pointer with some visual indication that a drag operation is occurring. Displaying the pointer instead of the dragged object is especially useful when the objects in the GUI are relatively large, which would make navigating around with a dragged object difficult without inadvertently touching objects the user did not intend to touch.
- In the preferred embodiments, multiple operands may be operated upon by multiple operators by chaining both operands and operators together in the chain of objects. One way for the pointer drag path chaining mechanism to function is shown in
FIGS. 3-4. Referring to method 300 in FIG. 3, a first operand is clicked (step 310), the first operand is then dragged over multiple operators (step 320), and dropped on a second operand (step 330). As the object representing the first operand touches the objects corresponding to the operators in step 320, the pointer drag path chaining mechanism adds an object corresponding to each touched object to the chain of objects (125 in FIG. 1). - One example of the function of
method 300 in FIG. 3 is shown in window 400 in FIG. 4. The user clicks on the object “file3.xls” 410, drags the object 410 across the Compress button 420, continues to drag the object 410 across the Encrypt button 430, and drops the object 410 on the MyFiles folder 440. By selecting the file3.xls file, an object corresponding to the file3.xls file is added to a chain of objects. As the object 410 is dragged to touch the buttons 420 and 430 and the MyFiles folder 440, the pointer drag path chaining mechanism adds objects corresponding to these touched objects to the chain of objects. When the object 410 is dropped, the pointer drag path chaining mechanism processes all of the objects in the chain as an atomic group of operations. - An example now follows to show the function of the pointer drag path chaining mechanism of the preferred embodiments for the specific example shown in
FIGS. 3 and 4. Referring to FIG. 5, a table is shown with chaining rules that determine whether a touched object may be chained (i.e., added to the chain of objects). There are three chaining rules in the table in FIG. 5, namely: 1) the Compress, Encrypt or Delete objects may be chained to a dragged file object; 2) a location object may be chained to a dragged file object; and 3) a location object may be chained to the Compress, Encrypt or Delete objects. We now apply these rules to determine whether a touched object is added to the chain for the example in FIG. 4. First, the file3.xls object is selected, which results in an object corresponding to the file3.xls object being added to the chain of objects (which was previously empty). As a result, after selecting the file3.xls file, the chain of objects 125 contains a single object 610 corresponding to the file3.xls object, as shown in FIG. 6. Once the dragged file3.xls object touches the Compress button 420 in FIG. 4, we determine from Rule #1 in FIG. 5 that the Compress object may be chained to a dragged file object, so an object 620 corresponding to the Compress button is added to the chain of objects 125, as shown in FIG. 7. When the dragged file3.xls object then touches the Encrypt button 430, we determine again from Rule #1 in FIG. 5 that the Encrypt object may be chained to a dragged file object, so object 630 corresponding to the Encrypt button is added to the chain of objects 125, as shown in FIG. 8. When the dragged file3.xls object then touches the MyFiles object 440 in FIG. 4, we determine from Rule #2 that the MyFiles location object 440 may be chained to the dragged file3.xls object, so object 640 corresponding to the MyFiles object 440 is added to the chain of objects 125, as shown in FIG. 9. We assume that the selected file3.xls object is dropped on the MyFiles object 440 in FIG. 4.
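The FIG. 5 rule table and the FIG. 4 drag path can be replayed with simple predicates. The type labels ("file", "operation", "location") and the tuple encoding are assumptions made for illustration:

```python
OPERATIONS = {"Compress", "Encrypt", "Delete"}  # the operator objects of FIG. 4

def may_chain(chain, name, kind):
    """Apply the three FIG. 5 chaining rules; chain entries are (name, kind)
    tuples, an illustrative encoding of the chain of objects 125."""
    dragged_name, dragged_kind = chain[0]        # the selected, dragged object
    last_name, _ = chain[-1]                     # the most recently chained object
    if dragged_kind == "file" and name in OPERATIONS:   # Rule 1
        return True
    if dragged_kind == "file" and kind == "location":   # Rule 2
        return True
    if last_name in OPERATIONS and kind == "location":  # Rule 3
        return True
    return False

# Replaying the drag path of FIG. 4:
chain = [("file3.xls", "file")]                  # selection (FIG. 6)
for name, kind in [("Compress", "operation"),    # touched: FIG. 7
                   ("Encrypt", "operation"),     # touched: FIG. 8
                   ("MyFiles", "location")]:     # touched: FIG. 9
    if may_chain(chain, name, kind):
        chain.append((name, kind))
# chain now holds four objects, matching FIG. 9
```

Each touch consults the rules against the chain built so far, so the same code rejects, for example, a second file icon crossed accidentally in the drag path.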
The “dropping” of an object is a well-known term in the GUI art, which simply means that the selected object file3.xls is deselected. As a result of deselecting the file3.xls object (by dropping the file3.xls object 410 onto the MyFiles object 440 in FIG. 4), the pointer drag path chaining mechanism knows to process the chain of objects 125 in FIG. 9 as an atomic group of operations. One possible representation of this atomic group of operations is shown in FIG. 10. One advantage of chaining objects into an atomic group of operations is that all of the operations in the group may be monitored as they are performed, which allows all of the operations in the group to be rolled back should any of the operations in the group fail. Thus, if the compression of file3.xls succeeds, but the encryption of file3.xls fails, the pointer drag path chaining mechanism can uncompress the file to return it to its original state. - Another example that shows the function of the pointer drag path chaining mechanism is shown in
FIGS. 11-20. This specific example is a workflow controller that allows a user to construct workflows that represent real-world processes. FIG. 11 shows a table with chaining rules for the graphical user interface shown in FIG. 12. These three rules state: A) a workflow step must be selected first; B) a rule is applied to the last selected workflow step; and C) when a workflow step follows a rule step, the previous workflow step ends and a new workflow step is begun. Note that these chaining rules have letter designators instead of numbers to avoid confusing these rules with the rule buttons shown in FIG. 12. - We assume for this example that the user initially clicks the pointer on the
object 1210 labeled Workflow Step #2 in FIG. 12. The user then drags the selected object 1210 over Rule #2 1220 and Rule #1 1230, over Workflow Step #1 1240, over Rule #2 1220 and Rule #3 1250, over Workflow Step #3 1260, over Rule #3 1250, then drops the selected object 1210 into the Workflow Built window 1270. Because there are multiple operands and multiple operators, the pointer drag path chaining mechanism needs to consult rules to determine whether an object may be chained when the object is touched in the drag path of the selected object. The rules A, B and C in FIG. 11 are applied to the GUI 1200 shown in FIG. 12. - The drag path begins by selecting
Workflow Step #2 1210. We see from Rule A in FIG. 11 that a workflow step must be selected first. The selection of workflow step 1210 causes a corresponding object 1310 to be added to the chain of objects 125 (which was previously empty), resulting in the chain of objects 125 containing a single object 1310 corresponding to the workflow step 1210, as shown in FIG. 13. When the selected object 1210 is dragged to touch the Rule #2 object 1220 in FIG. 12, the chaining rules in FIG. 11 are consulted to determine whether the Rule #2 object may be chained to the workflow step 1210. Rule B states that rules are applied to the last selected workflow step, which means the Rule #2 object may be chained to the workflow object 1210, so an object 1320 corresponding to the Rule #2 object 1220 is added to the chain of objects 125, as shown in FIG. 14. When the selected object 1210 is dragged to touch the Rule #1 object 1230, the chaining rules in FIG. 11 are consulted, where rule B indicates that Rule #1 should be applied to the last selected workflow step, namely workflow step 1210. As a result, an object 1330 corresponding to the Rule #1 object 1230 in FIG. 12 is added to the chain of objects 125, as shown in FIG. 15. Next, the drag path touches the Workflow Step #1 object 1240. The chaining rules in FIG. 11 are consulted, which indicate in rule C that going from a rule to a workflow step ends the previous workflow step and begins a new workflow step. As a result, an object 1340 corresponding to the Workflow Step #1 object 1240 is added to the chain of objects 125, as shown in FIG. 16. Now the drag path of the selected object touches the Rule #2 object 1220 again. Because Rule B in FIG. 11 allows a rule to follow a workflow step, an object 1350 corresponding to the Rule #2 object 1220 is added to the chain of objects 125, as shown in FIG. 17. Next, the drag path of the selected object touches the Rule #3 object 1250. Rule B in FIG.
11 allows adding anobject 1360 corresponding to theRule # 3object 1250 to be added to the chain ofobjects 125, as shown inFIG. 18 . Next, the drag path of the selected object touches theWorkflow Step # 3object 1260. Rule C inFIG. 11 states that this completes the previous workflow step and starts a new workflow step, so anobject 1370 corresponding to theworkflow step # 3 1260 is added to the chain ofobjects 125, as shown inFIG. 19 . The drag path of the selected object then touches theRule # 3object 1250. Rule B states that this object may be chained, and applies to the last selected workflow step, so anobject 1380 corresponding to theRule # 3object 1250 is added to the chain ofobjects 125, as shown inFIG. 20 . At this point, the selected object is dropped into the “Workflow Built”window 1270. The dropping (or deselecting) of the selected object tells the pointer dragpath chaining mechanism 123 that the process of building the workflow is complete. This means the complete workflow is displayed indisplay window 1270 shown inFIG. 12 . - Once the building of the workflow is complete, the workflow represented by the chain of
objects 125 in FIG. 20 may be performed as an atomic group of operations. By chaining objects together in a drag path of a pointer, without requiring explicit mouse clicks, a user interface is provided that is much easier and more intuitive, and allows accomplishing complex tasks without clicking the pointer to select each object in the drag path. Thus, building this complex workflow with three distinct steps that have multiple rules applied is done by clicking a pointer once to select an object, dragging the selected object, then dropping the selected object. FIGS. 11-20 show one simple illustration of how powerful the preferred embodiments can be in significantly reducing the number of pointer clicks it takes to accomplish a task and increasing the ease of use of a graphical user interface, thereby increasing a user's efficiency.
- There are many variations that could be applied to the exemplary embodiments discussed herein. For example, while the preferred embodiments discuss selecting a single object, then dragging the single object, multiple objects could be initially selected and dragged together. Thus, in window 400 in FIG. 4, if the user clicked on file1.doc, then on file2.doc, then on file3.xls, these could be selected as a group and dragged to the Compress button 420, which would result in all three of these files being compressed. In another variation, a user may need a way to cancel an operation after an object is selected and dragging has begun. One way to provide a cancel operation is to provide a “cancel” button within the GUI window. Another way to provide a cancel operation is to cancel any drag operation that is dropped outside of the current window.
- In another variation, what does a user do if the user accidentally touches an object he or she did not intend to touch? Because the pointer drag path chaining mechanism chains an object that is touched (if the rules allow), there needs to be a way for the user to get rid of the last object that was added to the chain if the user makes a mistake. One way to do this is to provide an object such as a “back” button within the GUI window that removes the last object that was dragged across. Another way is to specify that only one operator may be active, and that the selection of a second operator object following a first operator object in the drag path is intended to cancel the selection of the first operator object.
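The rule consultation applied in the drag path above (Rules A, B and C of FIG. 11) can be pictured with a minimal sketch. This is an illustrative assumption, not the patent's actual implementation; all function and variable names are invented for the example.

```python
# Sketch of consulting chaining rules (Rules A, B and C of FIG. 11)
# when an object is touched in the drag path. Names are illustrative.

def may_chain(chain, touched):
    """Return True if the touched object may be added to the chain.

    Objects are (kind, label) tuples where kind is "step" or "rule".
    Rule A: a workflow step must be selected first.
    Rules B and C: once a step is in the chain, a touched rule applies
    to the last selected step, and a touched step begins a new step.
    """
    kind, _ = touched
    if not chain:
        return kind == "step"   # Rule A: the first object must be a step
    return True                 # Rules B and C both permit chaining

def touch(chain, touched):
    """Add the touched object to the chain if the rules allow it."""
    if may_chain(chain, touched):
        chain.append(touched)
    return chain

# Abbreviated drag path of FIG. 12:
chain = []
for obj in [("step", "Workflow Step #2"), ("rule", "Rule #2"),
            ("rule", "Rule #1"), ("step", "Workflow Step #1")]:
    touch(chain, obj)
```

Note that with an empty chain a touched rule is simply ignored, which corresponds to the greyed-out-rules variation described below.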
- Another variation greys out objects that are not valid selections. Thus, in window 1200 in FIG. 12, because a workflow step must be selected first (see Rule A in FIG. 11), the Rule objects 1230, 1220 and 1250 could be initially greyed out, thereby preventing a user from selecting a rule before selecting a workflow step. This concept of greying out items in a GUI that should not or cannot be selected is known in the art, and extends the functionality of the preferred embodiments. The term “grey out” is used herein in a broad sense to mean any suitable way of disabling an object in the GUI from user selection, including the masking or hiding of the object from the user's view.
- In another variation, the status of the drag path could be displayed to the user as the user is dragging an item. There are many different ways to display such status information to a user. One way is to attach a small icon to the object being dragged or to the pointer to indicate the objects in the chain of objects. Another way is to provide a small status window that shows the selected object and the objects or attributes that have been picked up so far. This status window could be placed in a fixed location in the window, or could move to follow the pointer as it follows the drag path. Yet another way is to dynamically build a path in a separate window. For example, the workflow in the workflow window 1270 in FIG. 12 could be displayed to the user during the drag operation. Thus, as soon as the drag path contacts the Workflow Step #1 1240, the workflow step 1272 could be displayed to the user with the two rules that were touched. In the alternative, the workflow step 1272 could be displayed as soon as the Workflow Step #2 1210 is selected, and the rules 1220 and 1230 could then be added to the display as soon as they are touched in the drag path.
- One suitable implementation of the preferred embodiments uses the concept of “listeners”. The general concept of listeners for GUI objects is well-known in the art. A “select listener” for an object detects when a mouse button is pressed when the pointer is on the object. A “drop listener” for an object detects when a mouse button is released when the pointer is on the object. The preferred embodiments define a new listener referred to herein as a “drag listener” for each GUI object. In one specific example, a drag listener, when it detects a drag operation that touches its corresponding object, triggers an operation that adds a corresponding object to the chain of objects. Of course, other implementations could also be used within the scope of the preferred embodiments, which extends to any and all implementations that fall within the scope of the disclosure and claims herein.
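A drag listener of the kind just described might be sketched as follows. This is a toy dispatcher written for illustration only; the class and method names are assumptions and do not correspond to the API of any real GUI toolkit.

```python
# Illustrative sketch of a "drag listener": each GUI object gets a
# listener that, when the drag path touches its object, adds a
# corresponding object to the chain of objects.

class DragListener:
    def __init__(self, gui_object, chain):
        self.gui_object = gui_object
        self.chain = chain

    def on_drag_touch(self):
        # Triggered when a drag operation touches this listener's
        # object; adds the corresponding object to the chain.
        self.chain.append(self.gui_object)

chain = []
listeners = {name: DragListener(name, chain)
             for name in ("file1.doc", "Compress")}

# Simulate the drag path touching the two objects in order:
listeners["file1.doc"].on_drag_touch()
listeners["Compress"].on_drag_touch()
```

After the simulated drag, the chain holds the two touched objects in drag order, ready to be processed when the drop occurs.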
- The preferred embodiments also give rise to a new pointer event. Pointer pickup events and pointer drop events are known in the art. However, the preferred embodiments allow the defining of a new pointer event, referred to herein as a “pointer chain event”. This new event causes an object that is touched in the drag path of a selected object to be stored in the chain of objects.
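The relationship between the three pointer events and the chain of objects can be shown with a small dispatch sketch. The event names follow the description; the dispatch code and the process function are illustrative assumptions.

```python
# Sketch of the three pointer events driving the chain of objects:
# pickup (select), chain (touch in drag path), drop (deselect).

def process(chain):
    # Placeholder for performing the chained operations as a group.
    return list(chain)

def handle_event(event, obj, chain):
    if event == "pointer_pickup":    # selecting the first object
        chain.clear()
        chain.append(obj)
    elif event == "pointer_chain":   # touching an object in the drag path
        chain.append(obj)
    elif event == "pointer_drop":    # deselecting: process the chain
        return process(chain)

chain = []
handle_event("pointer_pickup", "file3.xls", chain)
handle_event("pointer_chain", "Compress", chain)
result = handle_event("pointer_drop", None, chain)
```

Here the pickup and drop events are the conventional ones, and only the chain event is new: it fires for each object touched in the drag path, with no extra button press.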
- The preferred embodiments provide a pointer drag path chaining mechanism that allows a user of a graphical user interface to select an object, thereby placing a corresponding object in a chain of objects, then adding one or more other objects to the chain of objects as other objects are touched in the drag path of the selected object. By chaining multiple objects together by simply touching those objects in the drag path of a selected object, the simple action of dragging an object to touch other objects results in performing multiple functions with a single mouse click that would normally take multiple mouse clicks using prior art GUIs.
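As noted earlier, the chain built in this way may be performed as an atomic group of operations. One way to sketch atomic processing with rollback is shown below; the (do, undo) operation pairs are an illustrative assumption, not the patent's disclosed mechanism.

```python
# Sketch of processing the chain of objects as an atomic group:
# if any operation fails, all completed operations are rolled back.

def process_atomically(operations):
    """Run each (do, undo) pair; undo completed work on failure."""
    completed = []
    try:
        for do, undo in operations:
            do()
            completed.append(undo)
    except Exception:
        # Roll back in reverse order, then re-raise the failure.
        for undo in reversed(completed):
            undo()
        raise
```

A failure part-way through the chain thus leaves the system as it was before the drop, which is the behavior the description and claims refer to as rolling back any performed operations.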
- Note that the preferred embodiments extend to dragging any object to touch a different object, even if the different object is in a different window in the graphical user interface, and even if the different object is in a window that corresponds to a different software application.
- While the discussion above and claims herein are couched in terms of “chaining” objects together, one skilled in the art will realize that any suitable data structure could be used to group objects together, including a linked list, a queue, etc. The terms “chain” and “chaining” as used herein expressly extend to any and all data structures and methods of linking or grouping together objects in a drag path of a selected object.
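As one example of such a grouping structure, the chain of objects could be kept as a singly linked list. This sketch is illustrative only; a plain list or a queue would serve equally well.

```python
# The "chain of objects" kept as a singly linked list that preserves
# the order in which objects were touched in the drag path.

class ChainNode:
    def __init__(self, obj, next_node=None):
        self.obj = obj
        self.next = next_node

class ObjectChain:
    def __init__(self):
        self.head = None
        self.tail = None

    def append(self, obj):
        """Add an object to the end of the chain."""
        node = ChainNode(obj)
        if self.tail is None:
            self.head = node
        else:
            self.tail.next = node
        self.tail = node

    def __iter__(self):
        # Yield objects in the order they were chained.
        node = self.head
        while node is not None:
            yield node.obj
            node = node.next
```

Iterating the chain yields the objects in touch order, which is the order in which the corresponding operations would be performed when the selected object is dropped.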
- One skilled in the art will appreciate that many variations are possible within the scope of the present invention. Thus, while the invention has been particularly shown and described with reference to preferred embodiments thereof, it will be understood by those skilled in the art that these and other changes in form and details may be made therein without departing from the spirit and scope of the invention.
Claims (32)
1. An apparatus comprising:
at least one processor;
a memory coupled to the at least one processor; and
a pointer drag path chaining mechanism residing in the memory and executed by the at least one processor, the pointer drag path chaining mechanism comprising a pointer that allows selecting a first object in a graphical user interface, the selection of the first object causing the pointer drag path chaining mechanism to place a first corresponding object in a chain of objects, wherein the pointer further allows dragging the selected first object to touch a second object, the touching of the second object causing the pointer drag path chaining mechanism to place a second corresponding object in the chain of objects.
2. The apparatus of claim 1 wherein the pointer drag path chaining mechanism places the second corresponding object in the chain of objects without a user of the pointer pressing any additional buttons on a pointing device that controls the pointer.
3. The apparatus of claim 1 wherein the pointer drag path chaining mechanism adds objects to the chain of objects according to at least one predefined rule.
4. The apparatus of claim 1 wherein the pointer drag path chaining mechanism processes the chain of objects according to at least one predefined rule.
5. The apparatus of claim 4 wherein the pointer drag path chaining mechanism processes the chain of objects when the selected first object is deselected.
6. The apparatus of claim 4 wherein the pointer drag path chaining mechanism processes the chain of objects as an atomic group of operations.
7. The apparatus of claim 4 wherein the pointer drag path chaining mechanism processes the chain of objects atomically, thereby allowing rolling back any performed operations if any operation in the chain of objects fails.
8. The apparatus of claim 1 wherein the pointer further allows dragging the selected first object to touch a third object, the touching of the third object causing a third corresponding object to be placed in the chain of objects.
9. The apparatus of claim 8 wherein the first, second and third objects each represent an operator or an operand.
10. A method for a user to perform a function using a graphical user interface, the method comprising the steps of:
selecting a first object;
as a result of selecting the first object, placing a first corresponding object in a chain of objects;
dragging the selected first object to touch a second object; and
as a result of the selected first object touching the second object, placing a second corresponding object in the chain of objects.
11. The method of claim 10 wherein the step of placing the second corresponding object in the chain of objects is performed without a user of the pointer pressing any additional buttons on a pointing device that controls the pointer.
12. The method of claim 10 further comprising the step of adding objects to the chain of objects according to at least one predefined rule.
13. The method of claim 10 further comprising the step of processing the chain of objects according to at least one predefined rule.
14. The method of claim 13 further comprising the step of processing the chain of objects when the selected first object is deselected.
15. The method of claim 13 further comprising the step of processing the chain of objects as an atomic group of operations.
16. The method of claim 13 further comprising the step of processing the chain of objects atomically, thereby allowing rolling back any performed operations if any operation in the chain of objects fails.
17. The method of claim 13 further comprising the step of dragging the selected first object to touch a third object, the touching of the third object causing a third corresponding object to be placed in the chain of objects.
18. The method of claim 17 wherein the first, second and third objects each represent an operator or an operand.
19. A method for using a graphical user interface, the method comprising the steps of:
generating a pointer pickup event in the graphical user interface by selecting a first object in the graphical user interface using the pointer, the pointer pickup event causing a first corresponding object to be stored in a chain of objects;
dragging the selected first object; and
touching a second object in the graphical user interface with the selected first object while dragging the selected first object, thereby generating a pointer chain event in the graphical user interface that causes a second corresponding object to be stored in the chain of objects.
20. The method of claim 19 wherein the step of placing the second corresponding object in the chain of objects is performed without a user of the pointer pressing any additional buttons on a pointing device that controls the pointer.
21. The method of claim 19 further comprising the step of generating a pointer drop event in the graphical user interface by deselecting the first object in the graphical user interface.
22. The method of claim 21 further comprising the step of processing the chain of objects as an atomic group of operations when the pointer drop event occurs.
23. A program product comprising:
a pointer drag path chaining mechanism comprising a pointer that allows selecting a first object in a graphical user interface, the selection of the first object causing the pointer drag path chaining mechanism to place a first corresponding object in a chain of objects, wherein the pointer further allows dragging the selected first object to touch a second object, the touching of the second object causing the pointer drag path chaining mechanism to place a second corresponding object in the chain of objects; and
computer-readable signal bearing media bearing the pointer drag path chaining mechanism.
24. The program product of claim 23 wherein the signal bearing media comprises recordable media.
25. The program product of claim 23 wherein the signal bearing media comprises transmission media.
26. The program product of claim 23 wherein the pointer drag path chaining mechanism adds objects to the chain of objects according to at least one predefined rule.
27. The program product of claim 23 wherein the pointer drag path chaining mechanism processes the chain of objects according to at least one predefined rule.
28. The program product of claim 27 wherein the pointer drag path chaining mechanism processes the chain of objects when the selected first object is deselected.
29. The program product of claim 27 wherein the pointer drag path chaining mechanism processes the chain of objects as an atomic group of operations.
30. The program product of claim 27 wherein the pointer drag path chaining mechanism processes the chain of objects atomically, thereby allowing rolling back any performed operations if any operation in the chain of objects fails.
31. The program product of claim 23 wherein the pointer further allows dragging the selected first object to touch a third object, the touching of the third object causing a third corresponding object to be placed in the chain of objects.
32. The program product of claim 31 wherein the first, second and third objects each represent an operator or an operand.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/012,908 US20060136833A1 (en) | 2004-12-15 | 2004-12-15 | Apparatus and method for chaining objects in a pointer drag path |
CNA2005100776441A CN1790241A (en) | 2004-12-15 | 2005-06-17 | Apparatus and method for chaining objects in a pointer drag path |
US12/132,272 US7865845B2 (en) | 2004-12-15 | 2008-06-03 | Chaining objects in a pointer drag path |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/012,908 US20060136833A1 (en) | 2004-12-15 | 2004-12-15 | Apparatus and method for chaining objects in a pointer drag path |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/132,272 Continuation US7865845B2 (en) | 2004-12-15 | 2008-06-03 | Chaining objects in a pointer drag path |
Publications (1)
Publication Number | Publication Date |
---|---|
US20060136833A1 true US20060136833A1 (en) | 2006-06-22 |
Family
ID=36597644
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/012,908 Abandoned US20060136833A1 (en) | 2004-12-15 | 2004-12-15 | Apparatus and method for chaining objects in a pointer drag path |
US12/132,272 Active 2025-08-31 US7865845B2 (en) | 2004-12-15 | 2008-06-03 | Chaining objects in a pointer drag path |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/132,272 Active 2025-08-31 US7865845B2 (en) | 2004-12-15 | 2008-06-03 | Chaining objects in a pointer drag path |
Country Status (2)
Country | Link |
---|---|
US (2) | US20060136833A1 (en) |
CN (1) | CN1790241A (en) |
Cited By (68)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030138184A1 (en) * | 2001-11-12 | 2003-07-24 | Reinhold Noe | Polarization scrambler and a method for polarization scrambling |
US20070130534A1 (en) * | 2005-12-02 | 2007-06-07 | C-Media Electronics Inc. | Graphic user interface with multi-divisional window |
US20070234226A1 (en) * | 2006-03-29 | 2007-10-04 | Yahoo! Inc. | Smart drag-and-drop |
US20070250304A1 (en) * | 2006-04-25 | 2007-10-25 | Stefan Elfner | Mapping a new user interface onto an existing integrated interface |
US20080186274A1 (en) * | 2006-12-04 | 2008-08-07 | Ulead Systems, Inc. | Method for selecting digital files and apparatus thereof |
US20080229210A1 (en) * | 2007-03-14 | 2008-09-18 | Akiko Bamba | Display processing system |
US20080229247A1 (en) * | 2007-03-14 | 2008-09-18 | Akiko Bamba | Apparatus, method, and computer program product for processing display |
US20080235609A1 (en) * | 2007-03-19 | 2008-09-25 | Carraher Theodore R | Function switching during drag-and-drop |
US20080313568A1 (en) * | 2007-06-12 | 2008-12-18 | Samsung Electronics Co., Ltd. | Digital multimedia playback apparatus and control method thereof |
US20090276701A1 (en) * | 2008-04-30 | 2009-11-05 | Nokia Corporation | Apparatus, method and computer program product for facilitating drag-and-drop of an object |
US20090290773A1 (en) * | 2008-05-21 | 2009-11-26 | Varian Medical Systems, Inc. | Apparatus and Method to Facilitate User-Modified Rendering of an Object Image |
US20100079465A1 (en) * | 2008-09-26 | 2010-04-01 | International Business Machines Corporation | Intuitively connecting graphical shapes |
US20100146451A1 (en) * | 2008-12-09 | 2010-06-10 | Sungkyunkwan University Foundation For Corporate Collaboration | Handheld terminal capable of supporting menu selection using dragging on touch screen and method of controlling the same |
US20100162179A1 (en) * | 2008-12-19 | 2010-06-24 | Nokia Corporation | Method and Apparatus for Adding or Deleting at Least One Item Based at Least in Part on a Movement |
US20110078624A1 (en) * | 2009-09-25 | 2011-03-31 | Julian Missig | Device, Method, and Graphical User Interface for Manipulating Workspace Views |
US20110145745A1 (en) * | 2009-12-14 | 2011-06-16 | Samsung Electronics Co., Ltd. | Method for providing gui and multimedia device using the same |
US20110173567A1 (en) * | 2010-01-13 | 2011-07-14 | Fuji Xerox Co., Ltd. | Display-controlling device, display device, display-controlling method, and computer readable medium |
US20110185315A1 (en) * | 2010-01-27 | 2011-07-28 | Microsoft Corporation | Simplified user controls for authoring workflows |
US20110181528A1 (en) * | 2010-01-26 | 2011-07-28 | Jay Christopher Capela | Device, Method, and Graphical User Interface for Resizing Objects |
US20110181529A1 (en) * | 2010-01-26 | 2011-07-28 | Jay Christopher Capela | Device, Method, and Graphical User Interface for Selecting and Moving Objects |
US8006183B1 (en) * | 2006-12-08 | 2011-08-23 | Trading Technologies International Inc. | System and method for using a curser to convey information |
US20110296328A1 (en) * | 2010-06-01 | 2011-12-01 | Sony Corporation | Display method and information processing apparatus |
US8091094B2 (en) | 2007-10-10 | 2012-01-03 | Sap Ag | Methods and systems for ambistateful backend control |
US20120030628A1 (en) * | 2010-08-02 | 2012-02-02 | Samsung Electronics Co., Ltd. | Touch-sensitive device and touch-based folder control method thereof |
US8245147B2 (en) | 2009-07-01 | 2012-08-14 | Apple Inc. | System and method for reordering a user interface |
US20130069899A1 (en) * | 2008-03-04 | 2013-03-21 | Jason Clay Beaver | Touch Event Model |
US20130174069A1 (en) * | 2012-01-04 | 2013-07-04 | Samsung Electronics Co. Ltd. | Method and apparatus for managing icon in portable terminal |
US8539385B2 (en) | 2010-01-26 | 2013-09-17 | Apple Inc. | Device, method, and graphical user interface for precise positioning of objects |
US8552999B2 (en) | 2010-06-14 | 2013-10-08 | Apple Inc. | Control selection approximation |
US8566044B2 (en) | 2009-03-16 | 2013-10-22 | Apple Inc. | Event recognition |
US8566045B2 (en) | 2009-03-16 | 2013-10-22 | Apple Inc. | Event recognition |
US20130326526A1 (en) * | 2012-06-05 | 2013-12-05 | Tomohiko Sasaki | Information processing apparatus, workflow generating system, and workflow generating method |
US20130335339A1 (en) * | 2012-06-18 | 2013-12-19 | Richard Maunder | Multi-touch gesture-based interface for network design and management |
US8661363B2 (en) | 2007-01-07 | 2014-02-25 | Apple Inc. | Application programming interfaces for scrolling operations |
US8682602B2 (en) | 2009-03-16 | 2014-03-25 | Apple Inc. | Event recognition |
US8717305B2 (en) | 2008-03-04 | 2014-05-06 | Apple Inc. | Touch event model for web pages |
US8723822B2 (en) | 2008-03-04 | 2014-05-13 | Apple Inc. | Touch event model programming interface |
US8732677B2 (en) | 2006-09-28 | 2014-05-20 | Sap Ag | System and method for extending legacy application with undo/redo functionality |
US20140181753A1 (en) * | 2011-04-26 | 2014-06-26 | Kyocera Corporation | Electronic device |
US8766928B2 (en) | 2009-09-25 | 2014-07-01 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US20140195968A1 (en) * | 2013-01-09 | 2014-07-10 | Hewlett-Packard Development Company, L.P. | Inferring and acting on user intent |
US8780069B2 (en) | 2009-09-25 | 2014-07-15 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US20140215405A1 (en) * | 2013-01-28 | 2014-07-31 | International Business Machines Corporation | Assistive overlay for report generation |
US8799826B2 (en) | 2009-09-25 | 2014-08-05 | Apple Inc. | Device, method, and graphical user interface for moving a calendar entry in a calendar application |
US20140304599A1 (en) * | 2011-10-06 | 2014-10-09 | Sony Ericsson Mobile Communications Ab | Method and Electronic Device for Manipulating a First or a Second User Interface Object |
US8863016B2 (en) | 2009-09-22 | 2014-10-14 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US20150033165A1 (en) * | 2013-07-29 | 2015-01-29 | Samsung Electronics Co., Ltd. | Device and method for controlling object on screen |
US8972879B2 (en) | 2010-07-30 | 2015-03-03 | Apple Inc. | Device, method, and graphical user interface for reordering the front-to-back positions of objects |
US20150153949A1 (en) * | 2013-12-03 | 2015-06-04 | Google Inc. | Task selections associated with text inputs |
US9081494B2 (en) | 2010-07-30 | 2015-07-14 | Apple Inc. | Device, method, and graphical user interface for copying formatting attributes |
US9098182B2 (en) | 2010-07-30 | 2015-08-04 | Apple Inc. | Device, method, and graphical user interface for copying user interface objects between content regions |
US9298363B2 (en) | 2011-04-11 | 2016-03-29 | Apple Inc. | Region activation for touch sensitive surface |
US9311112B2 (en) | 2009-03-16 | 2016-04-12 | Apple Inc. | Event recognition |
US9529519B2 (en) | 2007-01-07 | 2016-12-27 | Apple Inc. | Application programming interfaces for gesture operations |
EP2713602A3 (en) * | 2012-09-26 | 2017-02-22 | Kyocera Document Solutions Inc. | Display / input device, image forming apparatus therewith, and method of controlling a display / input device |
US9684521B2 (en) | 2010-01-26 | 2017-06-20 | Apple Inc. | Systems having discrete and continuous gesture recognizers |
US9720574B2 (en) | 2012-03-19 | 2017-08-01 | Microsoft Technology Licensing, Llc | Personal notes on a calendar item |
US9733716B2 (en) | 2013-06-09 | 2017-08-15 | Apple Inc. | Proxy gesture recognizer |
US20170285931A1 (en) * | 2016-03-29 | 2017-10-05 | Microsoft Technology Licensing, Llc | Operating visual user interface controls with ink commands |
US9792079B2 (en) | 2013-02-25 | 2017-10-17 | Ricoh Company, Ltd. | Smart drag and drop user interfaces for print workflow system |
US10032135B2 (en) | 2012-03-19 | 2018-07-24 | Microsoft Technology Licensing, Llc | Modern calendar system including free form input electronic calendar surface |
US10282905B2 (en) | 2014-02-28 | 2019-05-07 | International Business Machines Corporation | Assistive overlay for report generation |
KR20200048757A (en) * | 2018-10-30 | 2020-05-08 | 삼성에스디에스 주식회사 | Search method and apparatus thereof |
US10963142B2 (en) | 2007-01-07 | 2021-03-30 | Apple Inc. | Application programming interfaces for scrolling |
US11093623B2 (en) * | 2011-12-09 | 2021-08-17 | Sertainty Corporation | System and methods for using cipher objects to protect data |
US11243654B2 (en) * | 2011-09-30 | 2022-02-08 | Paypal, Inc. | Systems and methods for enhancing user interaction with displayed information |
US11444903B1 (en) * | 2021-02-26 | 2022-09-13 | Slack Technologies, Llc | Contextual discovery and design of application workflow |
US20220374856A1 (en) * | 2021-05-20 | 2022-11-24 | Bank Of America Corporation | Secure kinetic key patterns with soft assist for vision-impaired users |
Families Citing this family (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007058785A (en) * | 2005-08-26 | 2007-03-08 | Canon Inc | Information processor, and operating method for drag object in the same |
US7516108B2 (en) * | 2005-12-22 | 2009-04-07 | International Business Machines Corporation | Block allocation times in a computer system |
US7996789B2 (en) * | 2006-08-04 | 2011-08-09 | Apple Inc. | Methods and apparatuses to control application programs |
KR101531504B1 (en) * | 2008-08-26 | 2015-06-26 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
CN101673168B (en) * | 2008-09-09 | 2012-03-28 | 联想(北京)有限公司 | Method and device for selecting target object on interface |
JP5620134B2 (en) * | 2009-03-30 | 2014-11-05 | アバイア インク. | A system and method for managing trust relationships in a communication session using a graphical display. |
US8434022B2 (en) * | 2009-04-29 | 2013-04-30 | Applied Micro Circuits Corporation | System and method for photo-image local distribution |
US8621391B2 (en) * | 2009-12-16 | 2013-12-31 | Apple Inc. | Device, method, and computer readable medium for maintaining a selection order in a displayed thumbnail stack of user interface elements acted upon via gestured operations |
US20120102400A1 (en) * | 2010-10-22 | 2012-04-26 | Microsoft Corporation | Touch Gesture Notification Dismissal Techniques |
US20120102437A1 (en) * | 2010-10-22 | 2012-04-26 | Microsoft Corporation | Notification Group Touch Gesture Dismissal Techniques |
KR101740436B1 (en) * | 2010-12-08 | 2017-05-26 | 엘지전자 주식회사 | Mobile terminal and method for controlling thereof |
CN102750068B (en) * | 2011-04-21 | 2015-09-16 | 宏碁股份有限公司 | The method of control interface scrolling and electronic installation |
CN102279719B (en) * | 2011-08-19 | 2014-04-16 | 中体彩科技发展有限公司 | Printing control method |
JP5927872B2 (en) * | 2011-12-01 | 2016-06-01 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
JP5891875B2 (en) * | 2012-03-19 | 2016-03-23 | 富士ゼロックス株式会社 | Information processing apparatus and information processing program |
GB2511668A (en) | 2012-04-12 | 2014-09-10 | Supercell Oy | System and method for controlling technical processes |
US8954890B2 (en) * | 2012-04-12 | 2015-02-10 | Supercell Oy | System, method and graphical user interface for controlling a game |
US8814674B2 (en) | 2012-05-24 | 2014-08-26 | Supercell Oy | Graphical user interface for a gaming system |
JP5892040B2 (en) * | 2012-11-01 | 2016-03-23 | 富士ゼロックス株式会社 | Information processing apparatus and program |
US20140331187A1 (en) * | 2013-05-03 | 2014-11-06 | Barnesandnoble.Com Llc | Grouping objects on a computing device |
JP6212938B2 (en) * | 2013-05-10 | 2017-10-18 | 富士通株式会社 | Display processing apparatus, system, and display processing program |
US10528327B2 (en) * | 2015-11-23 | 2020-01-07 | Microsoft Technology Licensing Llc | Workflow development system with ease-of-use features |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4772882A (en) * | 1986-07-18 | 1988-09-20 | Commodore-Amiga, Inc. | Cursor controller user interface system |
US5428734A (en) * | 1992-12-22 | 1995-06-27 | Ibm Corporation | Method and apparatus for enhancing drag and drop manipulation of objects in a graphical user interface |
US5608860A (en) * | 1994-10-05 | 1997-03-04 | International Business Machines Corporation | Method and apparatus for multiple source and target object direct manipulation techniques |
US5638505A (en) * | 1991-08-16 | 1997-06-10 | Sun Microsystems, Inc. | Apparatus and methods for moving/copying objects using destination and/or source bins |
US6191807B1 (en) * | 1994-05-27 | 2001-02-20 | Canon Kabushiki Kaisha | Communication apparatus and method for performing a file transfer operation |
US6639612B2 (en) * | 2001-10-11 | 2003-10-28 | International Business Machines Corporation | Ad hoc check box selection |
US20030222915A1 (en) * | 2002-05-30 | 2003-12-04 | International Business Machines Corporation | Data processor controlled display system with drag and drop movement of displayed items from source to destination screen positions and interactive modification of dragged items during the movement |
US20040004638A1 (en) * | 2002-07-02 | 2004-01-08 | Ketan Babaria | Method and apparatus for multiple-window multiple-selection operations in graphical-user-interface environments |
US20050060653A1 (en) * | 2003-09-12 | 2005-03-17 | Dainippon Screen Mfg. Co., Ltd. | Object operation apparatus, object operation method and object operation program |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5801699A (en) * | 1996-01-26 | 1998-09-01 | International Business Machines Corporation | Icon aggregation on a graphical user interface |
JP3956553B2 (en) * | 1998-11-04 | 2007-08-08 | 富士ゼロックス株式会社 | Icon display processing device |
JP3798170B2 (en) * | 1999-02-08 | 2006-07-19 | シャープ株式会社 | Information processing system with graphical user interface |
US7454717B2 (en) * | 2004-10-20 | 2008-11-18 | Microsoft Corporation | Delimiters for selection-action pen gesture phrases |
US7770125B1 (en) * | 2005-02-16 | 2010-08-03 | Adobe Systems Inc. | Methods and apparatus for automatically grouping graphical constructs |
US7665028B2 (en) * | 2005-07-13 | 2010-02-16 | Microsoft Corporation | Rich drag drop user interface |
US7503009B2 (en) * | 2005-12-29 | 2009-03-10 | Sap Ag | Multifunctional icon in icon-driven computer system |
- 2004
  - 2004-12-15 US US11/012,908 patent/US20060136833A1/en not_active Abandoned
- 2005
  - 2005-06-17 CN CNA2005100776441A patent/CN1790241A/en active Pending
- 2008
  - 2008-06-03 US US12/132,272 patent/US7865845B2/en active Active
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4772882A (en) * | 1986-07-18 | 1988-09-20 | Commodore-Amiga, Inc. | Cursor controller user interface system |
US5638505A (en) * | 1991-08-16 | 1997-06-10 | Sun Microsystems, Inc. | Apparatus and methods for moving/copying objects using destination and/or source bins |
US5428734A (en) * | 1992-12-22 | 1995-06-27 | Ibm Corporation | Method and apparatus for enhancing drag and drop manipulation of objects in a graphical user interface |
US6191807B1 (en) * | 1994-05-27 | 2001-02-20 | Canon Kabushiki Kaisha | Communication apparatus and method for performing a file transfer operation |
US5608860A (en) * | 1994-10-05 | 1997-03-04 | International Business Machines Corporation | Method and apparatus for multiple source and target object direct manipulation techniques |
US6639612B2 (en) * | 2001-10-11 | 2003-10-28 | International Business Machines Corporation | Ad hoc check box selection |
US20030222915A1 (en) * | 2002-05-30 | 2003-12-04 | International Business Machines Corporation | Data processor controlled display system with drag and drop movement of displayed items from source to destination screen positions and interactive modification of dragged items during the movement |
US20040004638A1 (en) * | 2002-07-02 | 2004-01-08 | Ketan Babaria | Method and apparatus for multiple-window multiple-selection operations in graphical-user-interface environments |
US20050060653A1 (en) * | 2003-09-12 | 2005-03-17 | Dainippon Screen Mfg. Co., Ltd. | Object operation apparatus, object operation method and object operation program |
Cited By (135)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030138184A1 (en) * | 2001-11-12 | 2003-07-24 | Reinhold Noe | Polarization scrambler and a method for polarization scrambling |
US20070130534A1 (en) * | 2005-12-02 | 2007-06-07 | C-Media Electronics Inc. | Graphic user interface with multi-divisional window |
US20070234226A1 (en) * | 2006-03-29 | 2007-10-04 | Yahoo! Inc. | Smart drag-and-drop |
US8793605B2 (en) * | 2006-03-29 | 2014-07-29 | Yahoo! Inc. | Smart drag-and-drop |
US7784022B2 (en) * | 2006-04-25 | 2010-08-24 | Sap Ag | Mapping a new user interface onto an existing integrated interface |
US20070250304A1 (en) * | 2006-04-25 | 2007-10-25 | Stefan Elfner | Mapping a new user interface onto an existing integrated interface |
US9298429B2 (en) | 2006-09-28 | 2016-03-29 | Sap Se | System and method for extending legacy applications with undo/redo functionality |
US8732677B2 (en) | 2006-09-28 | 2014-05-20 | Sap Ag | System and method for extending legacy application with undo/redo functionality |
US20080186274A1 (en) * | 2006-12-04 | 2008-08-07 | Ulead Systems, Inc. | Method for selecting digital files and apparatus thereof |
US8732578B2 (en) | 2006-12-08 | 2014-05-20 | Trading Technologies International, Inc. | System and method for using a cursor to convey information |
US20110239165A1 (en) * | 2006-12-08 | 2011-09-29 | Trading Technologies International Inc. | System and Method for Using a Cursor to Convey Information |
US8006183B1 (en) * | 2006-12-08 | 2011-08-23 | Trading Technologies International Inc. | System and method for using a curser to convey information |
US8661363B2 (en) | 2007-01-07 | 2014-02-25 | Apple Inc. | Application programming interfaces for scrolling operations |
US10481785B2 (en) | 2007-01-07 | 2019-11-19 | Apple Inc. | Application programming interfaces for scrolling operations |
US9760272B2 (en) | 2007-01-07 | 2017-09-12 | Apple Inc. | Application programming interfaces for scrolling operations |
US9037995B2 (en) | 2007-01-07 | 2015-05-19 | Apple Inc. | Application programming interfaces for scrolling operations |
US11449217B2 (en) | 2007-01-07 | 2022-09-20 | Apple Inc. | Application programming interfaces for gesture operations |
US10963142B2 (en) | 2007-01-07 | 2021-03-30 | Apple Inc. | Application programming interfaces for scrolling |
US9639260B2 (en) | 2007-01-07 | 2017-05-02 | Apple Inc. | Application programming interfaces for gesture operations |
US10817162B2 (en) | 2007-01-07 | 2020-10-27 | Apple Inc. | Application programming interfaces for scrolling operations |
US10613741B2 (en) | 2007-01-07 | 2020-04-07 | Apple Inc. | Application programming interface for gesture operations |
US9575648B2 (en) | 2007-01-07 | 2017-02-21 | Apple Inc. | Application programming interfaces for gesture operations |
US9529519B2 (en) | 2007-01-07 | 2016-12-27 | Apple Inc. | Application programming interfaces for gesture operations |
US9448712B2 (en) | 2007-01-07 | 2016-09-20 | Apple Inc. | Application programming interfaces for scrolling operations |
US10175876B2 (en) | 2007-01-07 | 2019-01-08 | Apple Inc. | Application programming interfaces for gesture operations |
US9665265B2 (en) | 2007-01-07 | 2017-05-30 | Apple Inc. | Application programming interfaces for gesture operations |
US20080229247A1 (en) * | 2007-03-14 | 2008-09-18 | Akiko Bamba | Apparatus, method, and computer program product for processing display |
US20080229210A1 (en) * | 2007-03-14 | 2008-09-18 | Akiko Bamba | Display processing system |
US20080235609A1 (en) * | 2007-03-19 | 2008-09-25 | Carraher Theodore R | Function switching during drag-and-drop |
US20120140102A1 (en) * | 2007-06-12 | 2012-06-07 | Samsung Electronics Co., Ltd. | Digital multimedia playback apparatus and control method thereof |
US20080313568A1 (en) * | 2007-06-12 | 2008-12-18 | Samsung Electronics Co., Ltd. | Digital multimedia playback apparatus and control method thereof |
US8091094B2 (en) | 2007-10-10 | 2012-01-03 | Sap Ag | Methods and systems for ambistateful backend control |
US8560975B2 (en) * | 2008-03-04 | 2013-10-15 | Apple Inc. | Touch event model |
US9389712B2 (en) | 2008-03-04 | 2016-07-12 | Apple Inc. | Touch event model |
US11740725B2 (en) | 2008-03-04 | 2023-08-29 | Apple Inc. | Devices, methods, and user interfaces for processing touch events |
US9690481B2 (en) | 2008-03-04 | 2017-06-27 | Apple Inc. | Touch event model |
US9720594B2 (en) | 2008-03-04 | 2017-08-01 | Apple Inc. | Touch event model |
US9798459B2 (en) | 2008-03-04 | 2017-10-24 | Apple Inc. | Touch event model for web pages |
US9971502B2 (en) | 2008-03-04 | 2018-05-15 | Apple Inc. | Touch event model |
US8645827B2 (en) | 2008-03-04 | 2014-02-04 | Apple Inc. | Touch event model |
US8836652B2 (en) | 2008-03-04 | 2014-09-16 | Apple Inc. | Touch event model programming interface |
US20130069899A1 (en) * | 2008-03-04 | 2013-03-21 | Jason Clay Beaver | Touch Event Model |
US10521109B2 (en) | 2008-03-04 | 2019-12-31 | Apple Inc. | Touch event model |
US10936190B2 (en) | 2008-03-04 | 2021-03-02 | Apple Inc. | Devices, methods, and user interfaces for processing touch events |
US8717305B2 (en) | 2008-03-04 | 2014-05-06 | Apple Inc. | Touch event model for web pages |
US8723822B2 (en) | 2008-03-04 | 2014-05-13 | Apple Inc. | Touch event model programming interface |
US9323335B2 (en) | 2008-03-04 | 2016-04-26 | Apple Inc. | Touch event model programming interface |
US20090276701A1 (en) * | 2008-04-30 | 2009-11-05 | Nokia Corporation | Apparatus, method and computer program product for facilitating drag-and-drop of an object |
US20090290773A1 (en) * | 2008-05-21 | 2009-11-26 | Varian Medical Systems, Inc. | Apparatus and Method to Facilitate User-Modified Rendering of an Object Image |
US20100079465A1 (en) * | 2008-09-26 | 2010-04-01 | International Business Machines Corporation | Intuitively connecting graphical shapes |
US8698807B2 (en) | 2008-09-26 | 2014-04-15 | International Business Machines Corporation | Intuitively connecting graphical shapes |
US20100146451A1 (en) * | 2008-12-09 | 2010-06-10 | Sungkyunkwan University Foundation For Corporate Collaboration | Handheld terminal capable of supporting menu selection using dragging on touch screen and method of controlling the same |
US20100162179A1 (en) * | 2008-12-19 | 2010-06-24 | Nokia Corporation | Method and Apparatus for Adding or Deleting at Least One Item Based at Least in Part on a Movement |
US8566044B2 (en) | 2009-03-16 | 2013-10-22 | Apple Inc. | Event recognition |
US10719225B2 (en) | 2009-03-16 | 2020-07-21 | Apple Inc. | Event recognition |
US11755196B2 (en) | 2009-03-16 | 2023-09-12 | Apple Inc. | Event recognition |
US9965177B2 (en) | 2009-03-16 | 2018-05-08 | Apple Inc. | Event recognition |
US9483121B2 (en) | 2009-03-16 | 2016-11-01 | Apple Inc. | Event recognition |
US8682602B2 (en) | 2009-03-16 | 2014-03-25 | Apple Inc. | Event recognition |
US8566045B2 (en) | 2009-03-16 | 2013-10-22 | Apple Inc. | Event recognition |
US9311112B2 (en) | 2009-03-16 | 2016-04-12 | Apple Inc. | Event recognition |
US11163440B2 (en) | 2009-03-16 | 2021-11-02 | Apple Inc. | Event recognition |
US9285908B2 (en) | 2009-03-16 | 2016-03-15 | Apple Inc. | Event recognition |
US8245147B2 (en) | 2009-07-01 | 2012-08-14 | Apple Inc. | System and method for reordering a user interface |
US11334229B2 (en) | 2009-09-22 | 2022-05-17 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US10564826B2 (en) | 2009-09-22 | 2020-02-18 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US10788965B2 (en) | 2009-09-22 | 2020-09-29 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US8863016B2 (en) | 2009-09-22 | 2014-10-14 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US10282070B2 (en) | 2009-09-22 | 2019-05-07 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US8766928B2 (en) | 2009-09-25 | 2014-07-01 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US8780069B2 (en) | 2009-09-25 | 2014-07-15 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US9310907B2 (en) | 2009-09-25 | 2016-04-12 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US10928993B2 (en) | 2009-09-25 | 2021-02-23 | Apple Inc. | Device, method, and graphical user interface for manipulating workspace views |
US10254927B2 (en) | 2009-09-25 | 2019-04-09 | Apple Inc. | Device, method, and graphical user interface for manipulating workspace views |
US11366576B2 (en) | 2009-09-25 | 2022-06-21 | Apple Inc. | Device, method, and graphical user interface for manipulating workspace views |
US8799826B2 (en) | 2009-09-25 | 2014-08-05 | Apple Inc. | Device, method, and graphical user interface for moving a calendar entry in a calendar application |
US20110078624A1 (en) * | 2009-09-25 | 2011-03-31 | Julian Missig | Device, Method, and Graphical User Interface for Manipulating Workspace Views |
US20110145745A1 (en) * | 2009-12-14 | 2011-06-16 | Samsung Electronics Co., Ltd. | Method for providing gui and multimedia device using the same |
US20110173567A1 (en) * | 2010-01-13 | 2011-07-14 | Fuji Xerox Co., Ltd. | Display-controlling device, display device, display-controlling method, and computer readable medium |
US9086782B2 (en) * | 2010-01-13 | 2015-07-21 | Fuji Xerox Co., Ltd. | Display-controlling device, display device, display-controlling method, and computer readable medium |
US10732997B2 (en) | 2010-01-26 | 2020-08-04 | Apple Inc. | Gesture recognizers with delegates for controlling and modifying gesture recognition |
US8539385B2 (en) | 2010-01-26 | 2013-09-17 | Apple Inc. | Device, method, and graphical user interface for precise positioning of objects |
US20110181529A1 (en) * | 2010-01-26 | 2011-07-28 | Jay Christopher Capela | Device, Method, and Graphical User Interface for Selecting and Moving Objects |
US20110181528A1 (en) * | 2010-01-26 | 2011-07-28 | Jay Christopher Capela | Device, Method, and Graphical User Interface for Resizing Objects |
US8677268B2 (en) | 2010-01-26 | 2014-03-18 | Apple Inc. | Device, method, and graphical user interface for resizing objects |
US8612884B2 (en) | 2010-01-26 | 2013-12-17 | Apple Inc. | Device, method, and graphical user interface for resizing objects |
US8539386B2 (en) * | 2010-01-26 | 2013-09-17 | Apple Inc. | Device, method, and graphical user interface for selecting and moving objects |
US9684521B2 (en) | 2010-01-26 | 2017-06-20 | Apple Inc. | Systems having discrete and continuous gesture recognizers |
US20110185315A1 (en) * | 2010-01-27 | 2011-07-28 | Microsoft Corporation | Simplified user controls for authoring workflows |
US9141345B2 (en) * | 2010-01-27 | 2015-09-22 | Microsoft Technology Licensing, Llc | Simplified user controls for authoring workflows |
US20110296328A1 (en) * | 2010-06-01 | 2011-12-01 | Sony Corporation | Display method and information processing apparatus |
US8930854B2 (en) * | 2010-06-01 | 2015-01-06 | Sony Corporation | Display method and information processing apparatus |
US8552999B2 (en) | 2010-06-14 | 2013-10-08 | Apple Inc. | Control selection approximation |
US10216408B2 (en) | 2010-06-14 | 2019-02-26 | Apple Inc. | Devices and methods for identifying user interface objects based on view hierarchy |
US9081494B2 (en) | 2010-07-30 | 2015-07-14 | Apple Inc. | Device, method, and graphical user interface for copying formatting attributes |
US9098182B2 (en) | 2010-07-30 | 2015-08-04 | Apple Inc. | Device, method, and graphical user interface for copying user interface objects between content regions |
US9626098B2 (en) | 2010-07-30 | 2017-04-18 | Apple Inc. | Device, method, and graphical user interface for copying formatting attributes |
US8972879B2 (en) | 2010-07-30 | 2015-03-03 | Apple Inc. | Device, method, and graphical user interface for reordering the front-to-back positions of objects |
US20120030628A1 (en) * | 2010-08-02 | 2012-02-02 | Samsung Electronics Co., Ltd. | Touch-sensitive device and touch-based folder control method thereof |
US9535600B2 (en) * | 2010-08-02 | 2017-01-03 | Samsung Electronics Co., Ltd. | Touch-sensitive device and touch-based folder control method thereof |
US9298363B2 (en) | 2011-04-11 | 2016-03-29 | Apple Inc. | Region activation for touch sensitive surface |
US20140181753A1 (en) * | 2011-04-26 | 2014-06-26 | Kyocera Corporation | Electronic device |
US11720221B2 (en) | 2011-09-30 | 2023-08-08 | Paypal, Inc. | Systems and methods for enhancing user interaction with displayed information |
US11243654B2 (en) * | 2011-09-30 | 2022-02-08 | Paypal, Inc. | Systems and methods for enhancing user interaction with displayed information |
US20140304599A1 (en) * | 2011-10-06 | 2014-10-09 | Sony Ericsson Mobile Communications Ab | Method and Electronic Device for Manipulating a First or a Second User Interface Object |
US10394428B2 (en) * | 2011-10-06 | 2019-08-27 | Sony Corporation | Method and electronic device for manipulating a first or a second user interface object |
US11093623B2 (en) * | 2011-12-09 | 2021-08-17 | Sertainty Corporation | System and methods for using cipher objects to protect data |
US20130174069A1 (en) * | 2012-01-04 | 2013-07-04 | Samsung Electronics Co. Ltd. | Method and apparatus for managing icon in portable terminal |
US10732802B2 (en) | 2012-03-19 | 2020-08-04 | Microsoft Technology Licensing, Llc | Personal notes on a calendar item |
US9720574B2 (en) | 2012-03-19 | 2017-08-01 | Microsoft Technology Licensing, Llc | Personal notes on a calendar item |
US10872316B2 (en) | 2012-03-19 | 2020-12-22 | Microsoft Technology Licensing, Llc | Modern calendar system including free form input electronic calendar surface |
US10032135B2 (en) | 2012-03-19 | 2018-07-24 | Microsoft Technology Licensing, Llc | Modern calendar system including free form input electronic calendar surface |
US20130326526A1 (en) * | 2012-06-05 | 2013-12-05 | Tomohiko Sasaki | Information processing apparatus, workflow generating system, and workflow generating method |
US9256459B2 (en) * | 2012-06-05 | 2016-02-09 | Ricoh Company, Limited | Information processing apparatus, workflow generating system, and workflow generating method |
US20130335339A1 (en) * | 2012-06-18 | 2013-12-19 | Richard Maunder | Multi-touch gesture-based interface for network design and management |
US9189144B2 (en) * | 2012-06-18 | 2015-11-17 | Cisco Technology, Inc. | Multi-touch gesture-based interface for network design and management |
US9740394B2 (en) | 2012-09-26 | 2017-08-22 | Kyocera Document Solutions Inc. | Display/input device with swipe functionality, image forming apparatus therewith, and method of controlling a display/input device with swipe functionality |
EP2713602A3 (en) * | 2012-09-26 | 2017-02-22 | Kyocera Document Solutions Inc. | Display / input device, image forming apparatus therewith, and method of controlling a display / input device |
US20140195968A1 (en) * | 2013-01-09 | 2014-07-10 | Hewlett-Packard Development Company, L.P. | Inferring and acting on user intent |
US9372596B2 (en) * | 2013-01-28 | 2016-06-21 | International Business Machines Corporation | Assistive overlay for report generation |
US20140215405A1 (en) * | 2013-01-28 | 2014-07-31 | International Business Machines Corporation | Assistive overlay for report generation |
US9619110B2 (en) | 2013-01-28 | 2017-04-11 | International Business Machines Corporation | Assistive overlay for report generation |
US9792079B2 (en) | 2013-02-25 | 2017-10-17 | Ricoh Company, Ltd. | Smart drag and drop user interfaces for print workflow system |
US9733716B2 (en) | 2013-06-09 | 2017-08-15 | Apple Inc. | Proxy gesture recognizer |
US11429190B2 (en) | 2013-06-09 | 2022-08-30 | Apple Inc. | Proxy gesture recognizer |
US20150033165A1 (en) * | 2013-07-29 | 2015-01-29 | Samsung Electronics Co., Ltd. | Device and method for controlling object on screen |
US20150153949A1 (en) * | 2013-12-03 | 2015-06-04 | Google Inc. | Task selections associated with text inputs |
US10282905B2 (en) | 2014-02-28 | 2019-05-07 | International Business Machines Corporation | Assistive overlay for report generation |
US20170285931A1 (en) * | 2016-03-29 | 2017-10-05 | Microsoft Technology Licensing, Llc | Operating visual user interface controls with ink commands |
US11256407B2 (en) * | 2018-10-30 | 2022-02-22 | Samsung Sds Co., Ltd. | Searching method and apparatus thereof |
KR20200048757A (en) * | 2018-10-30 | 2020-05-08 | 삼성에스디에스 주식회사 | Search method and apparatus thereof |
KR102605448B1 (en) | 2018-10-30 | 2023-11-22 | 삼성에스디에스 주식회사 | Search method and apparatus thereof |
US11444903B1 (en) * | 2021-02-26 | 2022-09-13 | Slack Technologies, Llc | Contextual discovery and design of application workflow |
US20220374856A1 (en) * | 2021-05-20 | 2022-11-24 | Bank Of America Corporation | Secure kinetic key patterns with soft assist for vision-impaired users |
US11657378B2 (en) * | 2021-05-20 | 2023-05-23 | Bank Of America Corporation | Secure kinetic key patterns with soft assist for vision-impaired users |
Also Published As
Publication number | Publication date |
---|---|
CN1790241A (en) | 2006-06-21 |
US20080235610A1 (en) | 2008-09-25 |
US7865845B2 (en) | 2011-01-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7865845B2 (en) | Chaining objects in a pointer drag path | |
US8302021B2 (en) | Pointer drag path operations | |
US8607149B2 (en) | Highlighting related user interface controls | |
US9542020B2 (en) | Remote session control using multi-touch inputs | |
JP3086399B2 (en) | Lazy drag of graphical user interface (GUI) objects | |
US5745718A (en) | Folder bar widget | |
US6331840B1 (en) | Object-drag continuity between discontinuous touch screens of a single virtual desktop | |
US5555370A (en) | Method and system for creating complex objects for use in application development | |
US7480863B2 (en) | Dynamic and intelligent hover assistance | |
JP2755371B2 (en) | Display system with embedded icons in menu bar | |
JP3605538B2 (en) | Method and medium for transferring items between display windows | |
US6111575A (en) | Graphical undo/redo manager and method | |
US7546545B2 (en) | Emphasizing drop destinations for a selected entity based upon prior drop destinations | |
US6493006B1 (en) | Graphical user interface having contextual menus | |
US6097391A (en) | Method and apparatus for graphically manipulating objects | |
EP1402337B1 (en) | Directing users' attention to specific icons being approached by an on-screen pointer on user interactive display interfaces | |
US20030222915A1 (en) | Data processor controlled display system with drag and drop movement of displayed items from source to destination screen positions and interactive modification of dragged items during the movement | |
US20030007017A1 (en) | Temporarily moving adjacent or overlapping icons away from specific icons being approached by an on-screen pointer on user interactive display interfaces | |
US20070226642A1 (en) | Apparatus and method for displaying transparent windows when copying or moving items between windows | |
JP2755372B2 (en) | Display system with nested objects | |
JPH0916364A (en) | Context -sensitive menu system / menu operation | |
JPH04319726A (en) | Apparatus and method for providing composite action by icon | |
JP2002244788A (en) | Window processor and program | |
CN106126107B (en) | Electronic apparatus and control method | |
JPH01310430A (en) | Data processor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: INTERNATIONAL BUSINESS MACHINES CORPORATION, NEW YORK. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DETTINGER, RICHARD DEAN;KOLZ, DANIEL PAUL;WENZEL, SHANNON EVERETT;REEL/FRAME:015534/0073. Effective date: 20041214 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |