US20030214539A1 - Method and apparatus for hollow selection feedback - Google Patents
- Publication number
- US20030214539A1 (application No. US 10/144,256)
- Authority
- US
- United States
- Prior art keywords
- graphical object
- color
- graphical
- halo
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/20—Drawing from basic elements, e.g. lines or circles
- G06T11/203—Drawing of straight lines or curves
Definitions
- the present invention relates generally to graphical user interfaces. Specifically, aspects of the present invention relate to providing an indication to a user that certain graphical objects have been selected and/or highlighted. Further aspects of the present invention allow users to manipulate the selected graphical objects.
- the typical GUI uses a number of onscreen graphical objects to visually represent functions, applications, files, menus, and a host of other features for the computing environment.
- the user typically uses a mouse input device to move an onscreen pointer to select a particular graphical object for action.
- One common use of computers and GUIs is to generate and edit electronic documents.
- These electronic documents can contain text (e.g., electronic word processors) and/or images (e.g., pictures), which are displayed on the user's screen for editing.
- the user typically uses a mouse input device to move an onscreen pointer to the desired object, and presses a button on the mouse to select the object.
- the selection of the particular object may be reflected in a change in its appearance.
- electronic word processing programs, such as the MICROSOFT WORD® program, may display text as shown in FIG. 16.
- in FIG. 16, the text “Sample text” appears in black on a white background.
- the text is arranged automatically in uniform rows of text across the user's screen, where the rows of text are assigned a predefined height based on user-defined settings (e.g., the use of 10 pt. font, the line spacing, etc.).
- upon a click-and-drag motion of the mouse pointer over the words, their appearance may change to that shown in FIG. 17.
- the actual selected text is now given a white color
- the rectangular area inhabited by the text in the row is given a black background that serves as a blocked selection highlight, identifying the selected text.
- the black blocked selection highlight occupies the entire row height, and serves to differentiate the selected text from the non-selected text.
- a further drawback to this type of selection highlighting lies in the colors used for the selection block, and for the selected text appearing in the block.
- although users of word processing programs are often provided with the ability to select font colors and/or background colors, they are not provided with the ability to select the color to be used for the selection block highlight, nor are they offered the ability to select the color of the selected text.
- these word processing programs automatically use predetermined colors for the selection block and selected text, which are often entirely different colors from the font color and background color. By using these new colors, these word processing programs may inadvertently obliterate any meaning that had been assigned to the original font color. For example, if a user were accustomed to using a red font color to identify text added by a certain person, and this color were changed upon selection of the text, the user might then be unable to tell whether the selected text was originally red or not.
- FIG. 19 a,b show examples of how an image, such as a simple diagonal line, may appear when selected.
- the line is a simple vector line created, for example, using a line drawing option available in the Microsoft VISIO® program.
- the selected line's only change in appearance is the addition of selection handles 1901 at the endpoints.
- if the line is an image, such as a bitmap image, its appearance upon selection changes with the addition of a selection box with handles 1901 located around the periphery of the image.
- the mere addition of handles 1901 does not clearly indicate the selected graphical object.
- the selected line image may be identified by the surrounding box and handles, but there is much wasted white space attributed to this selected line. This white space is wasteful, as it obscures more visual “real estate,” or displayable area of the GUI, than is necessary.
- a graphical object may appear in one color (or pattern) against a background of another color (or pattern).
- its body may be changed to a predetermined background color (such as white), or changed in color to appear transparent, allowing the color or pattern of the underlying background to show.
- a “halo” is displayed around the periphery of the graphical object, where the halo is displayed in a color (or pattern) matching the graphical object's original color or pattern.
- the halo may have a change in transparency or darkness of the color or pattern, allowing for a “glowing” appearance.
- the halo follows the contours of the outer periphery of the original graphical object. In further embodiments, the halo follows the exact contour of the outer periphery of the original graphical object. In yet further embodiments, the halo may be of a predetermined, constant, thickness.
- the highlighted graphical object may also be surrounded with a bounding shape, such as a rectangle.
- the bounding shape may include selection handles.
- selection of the halo by the user may result in the selection of the graphical object.
- the highlighted graphical object may be moved to a different location on the graphical user interface.
- a separate, moving version of the graphical object may be displayed to follow the movement of the graphical object.
- the moving version of the graphical object may be drawn in a fainter color and/or pattern than the original graphical object.
- the moving version of the graphical object may include a bounding box.
- the bounding box of the moving version of the graphical object may follow the outer contour of the graphical object.
- the halo retains its color and/or pattern as the graphical object is moved from a first area of the GUI, having a first pair of foreground/background colors, to a second area of the GUI having a different pair of foreground/background colors.
- the graphical object may have a varying thickness that, upon highlighting, is changed to a constant thickness, which simplifies the processing required to render the highlighted object.
- the halo may be rendered using a plurality of graphical layers.
- the halo may be rendered by overlaying a duplicate of the halo over an enlarged duplicate of the halo, where the first duplicate is of a different color.
- the halo may be rendered by overlaying a plurality of enlarged duplicates of the object, where each of the enlarged duplicates is of a slightly different size and color.
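The layered rendering just described can be illustrated with a toy model: a graphical object is treated as a set of filled (x, y) pixels, "enlarging" is a simple morphological dilation, and the halo is what remains once the original body is removed. The function names and the 4-neighbour dilation are illustrative assumptions, not anything specified by the patent, which leaves the rendering method open.

```python
def dilate(pixels, steps=1):
    """Grow a set of (x, y) pixels outward by `steps` pixels (4-neighbour)."""
    grown = set(pixels)
    for _ in range(steps):
        grown |= {(x + dx, y + dy)
                  for (x, y) in grown
                  for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))}
    return grown

def make_halo(pixels, thickness=2):
    """Return the halo pixels: the enlarged silhouette minus the original body."""
    return dilate(pixels, thickness) - set(pixels)

# A single body pixel gains a diamond-shaped ring of halo pixels around it.
body = {(0, 0)}
halo = make_halo(body, thickness=1)
print(sorted(halo))  # the four 4-neighbours of (0, 0)
```

A real implementation would operate on the rendering system's pixel or vector representation, but the set-difference structure (enlarged copy minus body) is the same idea.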
- FIG. 1 shows a schematic diagram of a conventional general-purpose digital computing environment in which one or more embodiments of the present invention may be implemented.
- FIG. 2 shows a tablet personal computing (PC) environment in which one or more embodiments of the present invention may be implemented.
- FIG. 3 shows an example of a graphical object usable in one or more embodiments of the present invention.
- FIG. 4 shows an example highlighted graphical object according to one embodiment of the present invention.
- FIG. 5 shows a close-up view of a portion of the highlighted graphical object shown in FIG. 4, according to one embodiment of the present invention.
- FIG. 6 shows a close-up view of a portion of the highlighted graphical object shown in FIG. 4, according to another embodiment of the present invention.
- FIG. 7 shows a close-up view of a portion of the highlighted graphical object shown in FIG. 4, according to yet another embodiment of the present invention.
- FIG. 8 shows a highlighted graphical object according to a further embodiment of the present invention.
- FIGS. 9 a - 9 b show alternate embodiments of moving graphical objects according to further embodiments of the present invention.
- FIG. 10 shows an example of a graphical object to be highlighted according to aspects of the present invention.
- FIGS. 11 - 13 show example steps in the creation of a highlighted graphical object according to one embodiment of the present invention.
- FIGS. 14 - 15 show example steps in the creation of a highlighted graphical object according to one embodiment of the present invention.
- FIGS. 16 - 17 show an example of highlighted text in the prior art.
- FIG. 18 shows an example graphical object.
- FIGS. 19 a - b show an example of a highlighted line in the prior art.
- FIGS. 20 a - b are photographic examples of one embodiment of the present invention.
- FIGS. 21 a - b are photographic examples of another embodiment of the present invention.
- FIG. 1 illustrates a schematic diagram of a conventional general-purpose digital computing environment that can be used to implement various aspects of the present invention.
- a computer 100 includes a processing unit 110 , a system memory 120 , and a system bus 130 that couples various system components including the system memory to the processing unit 110 .
- the system bus 130 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
- the system memory 120 includes read only memory (ROM) 140 and random access memory (RAM) 150 .
- a basic input/output system 160 (BIOS), containing the basic routines that help to transfer information between elements within the computer 100 , such as during start-up, is stored in the ROM 140 .
- the computer 100 also includes a hard disk drive 170 for reading from and writing to a hard disk (not shown), a magnetic disk drive 180 for reading from or writing to a removable magnetic disk 190 , and an optical disk drive 191 for reading from or writing to a removable optical disk 192 such as a CD ROM or other optical media.
- the hard disk drive 170 , magnetic disk drive 180 , and optical disk drive 191 are connected to the system bus 130 by a hard disk drive interface 192 , a magnetic disk drive interface 193 , and an optical disk drive interface 194 , respectively.
- the drives and their associated computer-readable media provide nonvolatile storage of computer readable instructions, data structures, program modules and other data for the personal computer 100 . It will be appreciated by those skilled in the art that other types of computer readable media that can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, Bernoulli cartridges, random access memories (RAMs), read only memories (ROMs), and the like, may also be used in the example operating environment.
- a number of program modules can be stored on the hard disk drive 170 , magnetic disk 190 , optical disk 192 , ROM 140 or RAM 150 , including an operating system 195 , one or more application programs 196 , other program modules 197 , and program data 198 .
- a user can enter commands and information into the computer 100 through input devices such as a keyboard 101 and pointing device 102 .
- Other input devices may include a microphone, joystick, game pad, satellite dish, scanner or the like.
- These and other input devices are often connected to the processing unit 110 through a serial port interface 106 that is coupled to the system bus, but may be connected by other interfaces, such as a parallel port, game port or a universal serial bus (USB).
- these devices may be coupled directly to the system bus 130 via an appropriate interface (not shown).
- a monitor 107 or other type of display device is also connected to the system bus 130 via an interface, such as a video adapter 108 .
- personal computers typically include other peripheral output devices (not shown), such as speakers and printers.
- a pen digitizer 165 and accompanying pen or stylus 166 are provided in order to digitally capture freehand input.
- the pen digitizer 165 may be coupled to the processing unit 110 via a serial port, parallel port or other interface and the system bus 130 as known in the art.
- the digitizer 165 is shown apart from the monitor 107 , it is preferred that the usable input area of the digitizer 165 be co-extensive with the display area of the monitor 107 . Further still, the digitizer 165 may be integrated in the monitor 107 , or may exist as a separate device overlaying or otherwise appended to the monitor 107 .
- the computer 100 can operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 109 .
- the remote computer 109 can be a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 100 , although only a memory storage device 111 has been illustrated in FIG. 1.
- the logical connections depicted in FIG. 1 include a local area network (LAN) 112 and a wide area network (WAN) 113 .
- When used in a LAN networking environment, the computer 100 is connected to the local network 112 through a network interface or adapter 114.
- When used in a WAN networking environment, the personal computer 100 typically includes a modem 115 or other means for establishing communications over the wide area network 113, such as the Internet.
- the modem 115 which may be internal or external, is connected to the system bus 130 via the serial port interface 106 .
- program modules depicted relative to the personal computer 100 may be stored in the remote memory storage device.
- the environment of FIG. 1 is shown only as an example
- other computing environments may also be used.
- one or more embodiments of the present invention may use an environment having fewer than all of the various aspects shown in FIG. 1 and described above, and these aspects may appear in various combinations and subcombinations that will be apparent to one of ordinary skill.
- FIG. 2 illustrates a tablet personal computer (PC) 201 that can be used in accordance with various aspects of the present invention. Any or all of the features, subsystems, and functions in the system of FIG. 1 can be included in the computer of FIG. 2.
- Tablet PC 201 includes a large display surface 202 , e.g., a digitizing flat panel display, preferably, a liquid crystal display (LCD) screen, on which a plurality of windows 203 is displayed.
- a user can select, highlight, and write on the digitizing display area.
- suitable digitizing display panels include electromagnetic pen digitizers, such as the Mutoh or Wacom pen digitizers. Other types of pen digitizers, e.g., optical digitizers, may also be used.
- Tablet PC 201 interprets marks made using stylus 204 in order to manipulate data, enter text, and execute conventional computer application tasks such as spreadsheets, word processing programs, and the like.
- a stylus could be equipped with buttons or other features to augment its selection capabilities.
- a stylus could be implemented as a “pencil” or “pen”, in which one end constitutes a writing portion and the other end constitutes an “eraser” end, and which, when moved across the display, indicates portions of the display are to be erased.
- Other types of input devices such as a mouse, trackball, or the like could be used.
- a user's own finger could be used for selecting or indicating portions of the displayed image on a touch-sensitive or proximity-sensitive display. Consequently, the term “user input device”, as used herein, is intended to have a broad definition and encompasses many variations on well-known input devices.
- FIG. 3 shows an example of such a graphical object.
- the graphical object 300 is a handwritten version of the letter ‘H’ that, for example, is captured as digitized ink using a pen-based computer and its stylus.
- Other types of graphical objects include graphical icons, images, symbols, menus, prompts, and text.
- the body 301 of graphical object 300 may be drawn, or rendered, using any color or pattern value (e.g., hatched, shaded, a series of colors, etc.), and may be drawn on a background of a different color or pattern.
- the background 302 is a simple solid white background, but any other color and/or pattern may be used as the background 302 .
- the background 302 may also be comprised of one or more other graphical objects.
- the background may include a series of parallel horizontal (or vertical) lines to assist the user in keeping handwritten text in line.
- the object need not be constrained to such lines, however, and may simply overlap 1, 2, 3 or more of them.
- the background may also include various other objects, such as text, images, drawings, handwriting, etc.
- the graphical object 300 may become highlighted to represent that it has been selected, or to otherwise draw attention to it. This may occur in any number of ways. For example, in a pen-based computing device, the user may position the stylus over the displayed graphical object to select the object. The user may move a pointer over the graphical object to select it, and may also press a button to select it. Any other known method for selecting a graphical object in a GUI will also suffice.
- the body 301 may have its color value changed to a transparent value, such that the background 302 becomes visible in the area previously occupied by the body 301 .
- the highlighted graphical object may also have an outline, or “halo” 401 , of additional color and/or pattern drawn around its periphery, surrounding the original body 301 of the graphical object.
- a graphical object in a first color/pattern (e.g., red) on a background of a second color/pattern (e.g., black striped) will, upon selection, be changed in appearance to have a halo in the first color/pattern (i.e., red), and a body in the second color/pattern (i.e., black striped) due to its transparency.
- the body 301 of the graphical object may have its color changed to a predetermined color, such as white, rather than a transparent color. The use of a predetermined color may advantageously simplify the processing required to render the highlighted graphical object.
- FIG. 5 shows a close-up view taken from area A-A shown in FIG. 4.
- the halo may be of a predetermined thickness (T), such as a certain number of pixels (e.g., 8, 10, 12, 15, 20, etc.) or a predetermined distance (e.g., 1/16th of an inch, 1/8th of an inch, 1/4 of an inch, etc.), and appears in the original color of the graphical object.
- the contour 502 of the halo 401 may follow the outer contour 501 of the object 301 exactly, although a slight variation may occur due to the thickness of the halo 401 .
- the selected graphical object will more closely resemble the original graphical object.
- the halo may have a varying thickness to give the selected object a more distinguished appearance.
- the thickness of the halo may simply vary to give the highlighted graphical object a glowing appearance.
- the variation may be periodic between two or more thicknesses (T1 and T2) (e.g., varying between 10 and 15 pixels every linear inch), or alternatively, may randomly vary among two or more thicknesses.
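A periodic T1/T2 variation such as the one described above might be sketched as a function of arc length along the object's contour. The sinusoidal profile, parameter names, and units below are illustrative assumptions; the patent only calls for some periodic or random variation between two or more thicknesses.

```python
import math

def halo_thickness(arc_inches, t1=10, t2=15, period=1.0):
    """Halo thickness in pixels at a given arc-length position, varying
    smoothly between t1 and t2 once per `period` linear inches (cf. the
    T1/T2 variation above). The cosine profile is one arbitrary choice."""
    phase = (math.cos(2 * math.pi * arc_inches / period) + 1) / 2
    return t1 + (t2 - t1) * phase

print(halo_thickness(0.0), halo_thickness(0.5))  # 15.0 10.0
```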
- the halo may also vary its color and/or pattern value.
- the halo color and/or pattern may be darker near the outer edges of the graphical object body 301 , and become lighter near the outer edges of the halo 401 .
- the color/pattern may be a lighter version of the same color (e.g. light green becoming green, grey becoming black, etc.), or of the pattern (e.g., less concentrated stripes and/or shading, etc.). This variation in the halo color helps give the highlighted graphical object a glowing appearance. Other variations may also be used.
- the color and/or pattern may become darker towards the outer edge 502 of the halo, or may vary in darkness throughout.
- the process by which an example halo may be generated and rendered for display will be described further below with respect to FIGS. 10 - 15 .
- some embodiments of the present invention allow the user to select the graphical object by selecting its halo.
- a pen-based computing device may allow users to select graphical objects by placing the stylus on or over the highlighted graphical object. Similar selections may be made using a mouse that moves an onscreen pointer, pressing a sequence of keys on a keyboard, or any other form of GUI interaction.
- this form of highlighting is particularly useful for graphical objects having a thin appearance (e.g., a single hand-drawn stroke or line).
- One advantage offered by some embodiments of the halos described above is apparent in situations where multiple graphical objects appear on the display screen and overlap one another. If one of these graphical objects is selected while the other is not, the transparent, or “hollow,” body of the selected graphical object, together with its halo, clearly sets the object apart from the other, unselected graphical object(s).
- the highlighted graphical object may also be surrounded by a bounding box.
- FIG. 8 shows an example of a highlighted graphical object having a rectangular bounding box 801 .
- the bounding box 801 may be defined such that it is a rectangular shape whose dimensions depend on the size of the graphical object.
- the borders of the box 801 are horizontal and vertical, and are located a predetermined distance (d1) to the left, right, above and below the graphical object.
- the distance (d1) is a set number of pixels, such as 8, 10, 12, 15, or 20.
- the borders shown in FIG. 8 are rendered using a dashed line, but any other line type and/or color may be used.
- the bounding box 801 need not be rectangular, and need not be a box, at all.
- the bounding box may actually be implemented as any other shape, such as a circle, oval, diamond, etc. to surround the graphical object being highlighted.
- the bounding box 801 may include one or more selection handles 802 a,b that may be used to interact with the object for further handling, such as moving and/or resizing the object.
- Selection handles 802 a may be located at the corners of the bounding box 801 , and/or at other locations 802 b along the borders of the box. By selecting one of these handles (e.g., using a mouse-controlled pointer, stylus, or any other GUI selection method), the user may resize the graphical object, and/or select the graphical object for further processing/editing.
- the bounding box border distance (d1) is configured to be of sufficient size such that the selection handles do not overlap or obscure any portion of the highlighted graphical object.
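On the same toy pixel-set model, the bounding box 801 with border distance d1 and its corner handles (cf. 802 a) can be computed as below. The function names are illustrative, and d1 defaults to one of the example pixel counts given above; the patent does not prescribe any particular computation.

```python
def bounding_box(pixels, d1=10):
    """Axis-aligned box located d1 pixels beyond the object on each side,
    as described for box 801; returns (left, top, right, bottom)."""
    xs = [x for x, _ in pixels]
    ys = [y for _, y in pixels]
    return (min(xs) - d1, min(ys) - d1, max(xs) + d1, max(ys) + d1)

def corner_handles(box):
    """Selection-handle positions at the four corners of the box (cf. 802 a);
    mid-edge handles (cf. 802 b) could be added analogously."""
    left, top, right, bottom = box
    return [(left, top), (right, top), (left, bottom), (right, bottom)]

box = bounding_box({(0, 0), (5, 3)}, d1=10)
print(box)  # (-10, -10, 15, 13)
print(corner_handles(box))
```

With d1 at least as large as the halo thickness plus the handle radius, the handles never overlap the highlighted object, matching the sizing constraint stated above.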
- the user may move a highlighted graphical object through various portions of the GUI. This movement may be made, for example, by the user selecting the graphical object in a pen-based computing system by placing a stylus on the graphical object, pressing a button on the stylus to select the graphical object, and then moving the stylus across a display screen to the desired new location.
- some embodiments of the present invention may display a moving version of the graphical object at a location corresponding to the position of the user's input device. In this manner, the user is given feedback regarding the movement he/she is carrying out, as well as the system's acknowledgement that the move operation has begun.
- the moving version of the graphical object may appear as a simple outline of the object, as shown in FIG. 9 a , to assist the user in determining placement of the graphical object.
- the outline may be of any line type and/or color, and in some embodiments, is a thin dotted line. Using a simple dotted outline as the moving version allows the user to see background object(s) during the movement process.
- the moving version may alternatively appear as a bounding box with a pointer or cursor, as shown in FIG. 9 b .
- the dimensions of the bounding box are the minimum dimensions necessary to enclose the graphical object. Using these minimum dimensions helps convey to the user the amount of space needed by the object. Alternatively, the dimensions of the box may be greater than the minimum. Using a simple bounding box may simplify the processing required for rendering the box during the movement process, and the presence of the cursor may help make clear that the object is being moved. Alternative embodiments of this moving version may simply have the bounding box, without the cursor.
- the moving version of the graphical object may simply be a copy of the graphical object rendered in a different color and/or pattern, such as using a lighter shade of the color, or a fainter version of the color that allows more transparency.
- a lighter or fainter color for the moving version of the graphical object, the user is provided with a clear indication of what will be obscured upon repositioning of the object.
- the moving versions described above may be rendered with their own bounding boxes and/or selection handles, as described above, to further indicate that a movement process has begun.
- the GUI may include several distinct areas in which different color contrasts are used.
- a GUI may have a first area in which black graphical objects appear against a white background, and a second area in which red graphical objects appear against a green background.
- a graphical object originally appearing in black ink may be replaced with a transparent body and a black halo.
- the user may move this object from the first area to the second area.
- the color of the highlighted graphical object's body and halo remain the same. In this manner, any meaning associated with the color of the underlying graphical object is preserved in the new GUI area.
- the highlighted graphical object's halo may change in color to that of the second area (in this example, red).
- FIGS. 10 - 15 depict example processes that may be used to generate and render an example highlighted graphical object according to one or more of the embodiments described above.
- FIG. 10 depicts an example graphical object 1000 that may be highlighted.
- the object may be of any type of onscreen image, and one example would be a stroke of handwritten “ink” drawn by the user using a stylus and a touch-sensitive display.
- the stroke itself may follow a center line 1001 , and may vary in thickness or width. This variation may arise from a variety of causes, such as the user varying the pressure placed upon the stylus while drawing the stroke.
- FIGS. 11 - 13 depict example steps that may result in a graphical halo such as the one shown in FIG. 5.
- the computer display system may first generate an enlarged version 1101 of the original graphical object, as shown in FIG. 11.
- the enlarged version 1101 may be of the same color as the original graphical object, and have its width and/or height increased by a predetermined size, such as 0.125 inches, 10 pixels, etc., or by a predetermined percentage, such as 5%, 10%, etc.
- a graphical object comprised of lines may be enlarged by having its lines widened by the predetermined amount.
- the enlarged version 1101 in FIG. 11 has been slightly increased in both height and width.
- the computer display system may generate the enlarged version 1101 in an internal memory or display buffer, and need not actually display or render the enlarged version on screen immediately.
- the computer display system may also generate another copy 1201 of the original graphical object 1000 , where the copy 1201 has the same dimensions as the original object.
- the copy 1201 may be assigned a color value that is the background color value of the area on which the original object was displayed.
- the copy 1201 may then be overlaid upon the enlarged version 1101 , resulting in the highlighted graphical object shown in FIG. 13.
- the enlarged version 1101 and copy 1201 need not be displayed onscreen prior to this “overlaying,” and may simply be constructed in a memory or video buffer of the system.
- the color value of the copy 1201 will be such that background images, text, and/or objects may be visible.
- the copy 1201 may be assigned this transparency color value, and when this copy 1201 is overlaid upon the enlarged version, the interior of the enlarged version becomes transparent in color, allowing background images to appear. This effect may be achieved even in systems that do not directly support the use of such transparency values.
- the copy 1201 may have the appearance of the graphical image, text, etc. that would have appeared directly behind the original object 1000.
- the copy 1201 may be assigned a predetermined color, such as white, to simplify the processing required to generate the halo.
- Using the background minimizes the amount of visual real estate that is obscured by the halo, while using a predetermined color simplifies the processing involved.
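The enlarge-and-overlay steps above can be sketched in a few lines of Python. The boolean pixel mask, the function names, and the character stand-ins for the object and background colors are illustrative assumptions, not part of the described system:

```python
# Sketch of the "hollow halo" overlay: the object is held as a boolean
# pixel mask, an enlarged version (1101) is built by dilating the mask,
# and a background-colored copy (1201) is overlaid so that only the
# halo ring keeps the object's color.

OBJ, BG = "#", "."   # stand-ins for the object and background colors

def enlarge(mask, grow=1):
    """Dilate the mask by `grow` pixels in every direction -- the
    enlarged version 1101 of the object."""
    rows, cols = len(mask), len(mask[0])
    out = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if mask[r][c]:
                for rr in range(max(0, r - grow), min(rows, r + grow + 1)):
                    for cc in range(max(0, c - grow), min(cols, c + grow + 1)):
                        out[rr][cc] = True
    return out

def hollow_halo(mask, grow=1):
    """Overlay a background-colored copy (1201) on the enlarged version:
    pixels inside the original object revert to the background, leaving
    a hollow halo in the object's color."""
    big = enlarge(mask, grow)
    return [[OBJ if big[r][c] and not mask[r][c] else BG
             for c in range(len(mask[0]))]
            for r in range(len(mask))]
```

Applied to a mask containing a single object pixel, this yields an eight-pixel ring in the object's color around a hollow, background-colored center.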
- One additional advantage to allowing the background pattern to show lies in the readability of text that already has another highlighting (such as a yellow highlight) associated with it. If the text had already been highlighted, then allowing this highlighting to show through the hollow body of the halo helps preserve the meaning associated with that highlighting. The user is still able to see that the selected text had the highlight associated with it.
- a similar process of “overlaying” versions may be used to create the “glowing” effect of the halo shown in FIG. 7.
- multiple versions of the object may be created.
- one version 1401 is a copy, much like the copy 1201 shown in FIG. 12.
- One or more enlarged versions may also be created, where each version is of a slightly different size and color.
- version 1402 is slightly larger in dimension than version 1401, and is of the color of the original object.
- Another version 1403 is slightly larger than version 1402, and is of a slightly lighter color, i.e., a lighter shade of the color of version 1402.
- Alternatively, version 1403 may have the same color as version 1402, but a greater transparency value.
- The versions shown in FIG. 14 may be overlaid upon one another to result in the highlighted object shown in FIG. 15.
- Although the FIG. 15 example uses two enlarged versions, other embodiments may use more enlarged versions, with slighter differences in size and/or color between them, to create a smoother “glowing” effect.
- four such enlarged versions are used.
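The layering just described can be sketched the same way; the gray levels (0 = dark, 255 = light) and growth amounts below are assumed values standing in for the "slightly different size and color" of each enlarged version:

```python
# Sketch of the "glow" overlay: each (grow, shade) pair is one enlarged
# version, painted back to front from the largest/lightest to the
# smallest/darkest, and finally the hollow body is restored to the
# background shade.

def dilate(mask, grow):
    """Dilate a boolean pixel mask by `grow` pixels in every direction."""
    rows, cols = len(mask), len(mask[0])
    out = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if mask[r][c]:
                for rr in range(max(0, r - grow), min(rows, r + grow + 1)):
                    for cc in range(max(0, c - grow), min(cols, c + grow + 1)):
                        out[rr][cc] = True
    return out

def glow_halo(mask, layers=((1, 100), (2, 200)), background=255):
    """Overlay enlarged versions of slightly different size and shade
    (e.g. versions 1402 and 1403), then hollow out the body."""
    rows, cols = len(mask), len(mask[0])
    canvas = [[background] * cols for _ in range(rows)]
    for grow, shade in sorted(layers, reverse=True):   # largest first
        big = dilate(mask, grow)
        for r in range(rows):
            for c in range(cols):
                if big[r][c]:
                    canvas[r][c] = shade
    for r in range(rows):                              # hollow body
        for c in range(cols):
            if mask[r][c]:
                canvas[r][c] = background
    return canvas
```

Each additional (grow, shade) pair adds one more ring, so a smoother glow is obtained simply by supplying more layers with smaller differences between them.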
- the original graphical object 1000 may have its width changed before the various copies and enlarged versions are created.
- the object 1000 may originally have a varying width, but for purposes of generating the halo, it may have its width changed to a constant value.
- This constant value may be predefined, or may simply be the average width calculated along various points of the center line 1001 . Using such a constant width may simplify the processing required to generate the various copies and enlarged versions, and is particularly useful when the original graphical object was nearly of constant width to begin with (e.g., if the original graphical object was constructed using a linear stroke of a stylus).
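As a minimal sketch of this constant-width step (the sampled-point representation is an assumption), the stroke can be held as (x, y, local_width) samples along the center line 1001, with each local width replaced by the average:

```python
# Sketch: replace the varying local widths of a stroke with the average
# width calculated along its center line, before the copies and
# enlarged versions are generated.

def constant_width(samples):
    """samples: [(x, y, local_width), ...] along the center line 1001.
    Returns the same points with every width set to the average."""
    avg = sum(w for _, _, w in samples) / len(samples)
    return [(x, y, avg) for x, y, _ in samples]
```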
- the various steps of “overlaying” described above may be achieved using a version of the object(s) stored in a memory or display buffer.
- the overlaying may be performed on a pixel-by-pixel basis by replacing the individual pixels of a portion of one image with those of another image.
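A pixel-by-pixel overlay of this kind can be sketched as follows. The convention that a designated "transparent" value leaves the destination pixel untouched is an assumption, since the text does not fix one:

```python
# Sketch of overlaying one buffered image on another, pixel by pixel:
# pixels of `src` replace the corresponding pixels of `dest` unless
# they carry the transparent value, in which case `dest` shows through.

def overlay(dest, src, transparent=None):
    return [[d if s == transparent else s
             for d, s in zip(drow, srow)]
            for drow, srow in zip(dest, src)]
```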
- the graphical halo may be generated using the outer contour of the graphical object to define the contour that is to be followed by the halo.
- the computer system may calculate the path to be taken by the halo, and render just that path on the screen to create the halo effect.
- the halo may be rendered by tracing a single line around the contour. In such an embodiment, no processing is necessary for the “hollow” portion of the halo, but greater processing is needed to trace the path.
- the halo outline may be anti-aliased for improved resolution.
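The outer contour that such a halo path would follow can be extracted from a pixel mask as sketched below; the 4-neighborhood test is an illustrative choice, not a requirement of the described system:

```python
# Sketch: an object pixel belongs to the outer contour when any of its
# four neighbors lies outside the object (or outside the canvas). The
# resulting pixel set defines the path the traced halo would follow.

def contour_pixels(mask):
    rows, cols = len(mask), len(mask[0])
    edge = set()
    for r in range(rows):
        for c in range(cols):
            if not mask[r][c]:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if not (0 <= rr < rows and 0 <= cc < cols) or not mask[rr][cc]:
                    edge.add((r, c))
                    break
    return edge
```

Because only the contour is computed, no processing is spent on the hollow interior, which matches the trade-off noted above for the traced-path embodiment.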
- FIGS. 20 a - b are photographic images of one example embodiment.
- in FIG. 20 a, a handwritten word “highlight” 2001 is shown with highlighting 2002.
- Highlighting 2002 may resemble the traditional highlighting generated using a handheld highlighter marker.
- FIG. 20 b shows an example in which the highlighting 2002 has been selected by the user, and exhibits the hollow selection feedback described above. In the depicted example, there is also a bounding box 2003 having selection handles 2004.
- FIGS. 21 a - b are photographic images of another example embodiment.
- in FIG. 21 a, there is shown a handwritten stroke 2101.
- in FIG. 21 b, the stroke 2101 has been selected, and exhibits the hollow selection feedback described above.
- This depicted example also includes a bounding box 2102 having selection handles 2103 .
- the interior of the bounding box 2102 may be transparent, to permit the background to show through.
Description
- The present invention relates generally to graphical user interfaces. Specifically, aspects of the present invention relate to providing an indication to a user that certain graphical objects have been selected and/or highlighted. Further aspects of the present invention allow users to manipulate the selected graphical objects.
- A picture is worth a thousand words. The sentiment echoes throughout various aspects of our lives, and the world of computing is no exception. Since their inception, graphical user interfaces (GUIs) have become the standard, and preferred, way for millions of computer users to interact with their computers. Accordingly, more and more importance is being placed on the way in which computers visually show the user's actions and the environment in which the user may take these actions.
- The typical GUI uses a number of onscreen graphical objects to visually represent functions, applications, files, menus, and a host of other features for the computing environment. The user typically uses a mouse input device to move an onscreen pointer to select a particular graphical object for action.
- This GUI has proven effective, but a new step in the evolution of computing has revealed several drawbacks to existing GUIs. Specifically, the introduction of pen-based computing devices, such as the hand-held Personal Data Assistant (PDA) or the Tablet PC being introduced by Microsoft Corp., has changed the way we view the GUI, and the manner in which users can interact with their computers. This new approach to user interfaces has revealed problems and deficiencies in the traditional GUI described above. Examples of these problems will be discussed below.
- One common use of computers and GUIs is to generate and edit electronic documents. These electronic documents can contain text (e.g., electronic word processors) and/or images (e.g., pictures), which are displayed on the user's screen for editing. To interact with these onscreen objects, the user typically uses a mouse input device to move an onscreen pointer to the desired object, and presses a button on the mouse to select the object.
- The selection of the particular object may be reflected in a change in its appearance. For example, electronic word processing programs, such as the MICROSOFT WORD (R) program, may display text as shown in FIG. 16. In FIG. 16, the text “Sample text” appears in black on a white background. The text is arranged automatically in uniform rows of text across the user's screen, where the rows of text are assigned a predefined height based on user-defined settings (e.g., the use of 10 pt. font, the line spacing, etc.). Upon selecting these words using, for example, a click-and-drag motion of the mouse pointer over the words, their appearance may change to that shown in FIG. 17. In this figure, the actual selected text is now given a white color, and the rectangular area inhabited by the text in the row is given a black background that serves as a blocked selection highlight, identifying the selected text. The black blocked selection highlight occupies the entire row height, and serves to differentiate the selected text from the non-selected text.
- Although this prior art approach to highlighting text works well in the uniform, line-by-line environment of traditional word processors, this approach is less desirable in other environments that allow a greater degree of freedom, such as pen-based computing devices. For example, in systems where the text is handwritten (e.g., on a personal data assistant using a touch-sensitive screen), the user may be permitted to write text above and below any such regimented lines. FIG. 18 shows an example of handwritten text that does not conform to any such lines. The “blocked” approach discussed above would result in some confusion as to what is actually selected.
- A further drawback to this type of selection highlighting lies in the colors used for the selection block, and for the selected text appearing in the block. Although users of word processing programs are often provided with the ability to select font colors and/or background colors, they are not provided with the ability to select the color to be used for the selection block highlight, nor are they offered the ability to select the color of the selected text. Instead, these word processing programs automatically use predetermined colors for the selection block and selected text, which are often entirely different colors from the font color and background color. By using these new colors, these word processing programs may inadvertently obliterate any meaning that had been assigned to the original font color. For example, if a user were accustomed to using a red font color to identify text added by a certain person, and this color were changed upon selection of the text, the user might then be unable to tell whether the selected text was originally red or not.
- The prior art approach to highlighting images, as opposed to the regimented text of traditional word processors, does not offer a much better solution. FIGS. 19 a-b show examples of how an image, such as a simple diagonal line, may appear when selected. If the line is a simple vector line created, for example, using a line drawing option available in Microsoft VISIO (R), the selected line's only change in appearance is the addition of selection handles 1901 at the endpoints. If the line is an image, such as a bitmap image, its appearance upon selection changes with the addition of a selection box with handles 1901 located around the periphery of the image. There are drawbacks to each of these approaches. In the former, the mere addition of handles 1901 does not clearly indicate the selected graphical object. In the latter, the selected line image may be identified by the surrounding box and handles, but there is much wasted white space attributed to this selected line. This white space is wasteful, as it obscures more visual “real estate,” or displayable area of the GUI, than is necessary.
- Accordingly, there is now a need for an improved approach to identifying selected graphical objects in a GUI environment that can overcome one or more of the deficiencies identified above.
- Embodiments of the present invention may solve one or more of the problems and/or deficiencies identified above. In one aspect, a graphical object may appear in one color (or pattern) against a background of another color (or pattern). To highlight the graphical object, its body may be changed to a predetermined background color (such as white), or changed in color to appear transparent, allowing the color or pattern of the underlying background to show. A “halo” is displayed around the periphery of the graphical object, where the halo is displayed in a color (or pattern) matching the graphical object's original color or pattern. In further aspects, the halo may have a change in transparency or darkness of the color or pattern, allowing for a “glowing” appearance.
- In some embodiments, the halo follows the contours of the outer periphery of the original graphical object. In further embodiments, the halo follows the exact contour of the outer periphery of the original graphical object. In yet further embodiments, the halo may be of a predetermined, constant, thickness.
- In further embodiments, the highlighted graphical object may also be surrounded with a bounding shape, such as a rectangle. In yet further embodiments, the bounding shape may include selection handles.
- In some embodiments, selection of the halo by the user may result in the selection of the graphical object.
- In some embodiments, the highlighted graphical object may be moved to a different location on the graphical user interface. In further embodiments, a separate, moving version of the graphical object may be displayed to follow the movement of the graphical object. In yet further embodiments, the moving version of the graphical object may be drawn in a fainter color and/or pattern than the original graphical object. In further embodiments, the moving version of the graphical object may include a bounding box. In still further embodiments, the bounding box of the moving version of the graphical object may follow the outer contour of the graphical object.
- In some embodiments, the halo retains its color and/or pattern as the graphical object is moved from a first area of the GUI, having a first pair of foreground/background colors, to a second area of the GUI having a different pair of foreground/background colors.
- In further embodiments, the graphical object may have a varying thickness that, upon highlighting, is changed to a constant thickness, which simplifies the processing required to render the highlighted object.
- In yet further embodiments, the halo may be rendered using a plurality of graphical layers. In some embodiments, the halo may be rendered by overlaying a duplicate of the halo over an enlarged duplicate of the halo, where the first duplicate is of a different color. In further embodiments, the halo may be rendered by overlaying a plurality of enlarged duplicates of the object, where each of the enlarged duplicates is of a slightly different size and color.
- The above and other objects, features and advantages of the present invention will be readily apparent and fully understood from the following detailed description of preferred embodiments, taken in connection with the appended drawings.
- FIG. 1 shows a schematic diagram of a conventional general-purpose digital computing environment in which one or more embodiments of the present invention may be implemented.
- FIG. 2 shows a tablet personal computing (PC) environment in which one or more embodiments of the present invention may be implemented.
- FIG. 3 shows an example of a graphical object usable in one or more embodiments of the present invention.
- FIG. 4 shows an example highlighted graphical object according to one embodiment of the present invention.
- FIG. 5 shows a close-up view of a portion of the highlighted graphical object shown in FIG. 4, according to one embodiment of the present invention.
- FIG. 6 shows a close-up view of a portion of the highlighted graphical object shown in FIG. 4, according to another embodiment of the present invention.
- FIG. 7 shows a close-up view of a portion of the highlighted graphical object shown in FIG. 4, according to yet another embodiment of the present invention.
- FIG. 8 shows a highlighted graphical object according to a further embodiment of the present invention.
- FIGS. 9 a-9 b show alternate embodiments of moving graphical objects according to further embodiments of the present invention.
- FIG. 10 shows an example of a graphical object to be highlighted according to aspects of the present invention.
- FIGS. 11-13 show example steps in the creation of a highlighted graphical object according to one embodiment of the present invention.
- FIGS. 14-15 show example steps in the creation of a highlighted graphical object according to one embodiment of the present invention.
- FIGS. 16-17 show an example of highlighted text in the prior art.
- FIG. 18 shows an example graphical object.
- FIGS. 19a-b show an example of a highlighted line in the prior art.
- FIGS. 20a-b are photographic examples of one embodiment of the present invention.
- FIGS. 21a-b are photographic examples of another embodiment of the present invention.
- I. Example Hardware
- The present invention may be more readily described with reference to FIGS. 1-21.
- FIG. 1 illustrates a schematic diagram of a conventional general-purpose digital computing environment that can be used to implement various aspects of the present invention. In FIG. 1, a
computer 100 includes a processing unit 110, a system memory 120, and a system bus 130 that couples various system components including the system memory to the processing unit 110. The system bus 130 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. The system memory 120 includes read only memory (ROM) 140 and random access memory (RAM) 150. - A basic input/output system 160 (BIOS), containing the basic routines that help to transfer information between elements within the
computer 100, such as during start-up, is stored in the ROM 140. The computer 100 also includes a hard disk drive 170 for reading from and writing to a hard disk (not shown), a magnetic disk drive 180 for reading from or writing to a removable magnetic disk 190, and an optical disk drive 191 for reading from or writing to a removable optical disk 192 such as a CD ROM or other optical media. The hard disk drive 170, magnetic disk drive 180, and optical disk drive 191 are connected to the system bus 130 by a hard disk drive interface 192, a magnetic disk drive interface 193, and an optical disk drive interface 194, respectively. The drives and their associated computer-readable media provide nonvolatile storage of computer readable instructions, data structures, program modules and other data for the personal computer 100. It will be appreciated by those skilled in the art that other types of computer readable media that can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, Bernoulli cartridges, random access memories (RAMs), read only memories (ROMs), and the like, may also be used in the example operating environment. - A number of program modules can be stored on the
hard disk drive 170, magnetic disk 190, optical disk 192, ROM 140 or RAM 150, including an operating system 195, one or more application programs 196, other program modules 197, and program data 198. A user can enter commands and information into the computer 100 through input devices such as a keyboard 101 and pointing device 102. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner or the like. These and other input devices are often connected to the processing unit 110 through a serial port interface 106 that is coupled to the system bus, but may be connected by other interfaces, such as a parallel port, game port or a universal serial bus (USB). Further still, these devices may be coupled directly to the system bus 130 via an appropriate interface (not shown). A monitor 107 or other type of display device is also connected to the system bus 130 via an interface, such as a video adapter 108. In addition to the monitor, personal computers typically include other peripheral output devices (not shown), such as speakers and printers. In a preferred embodiment, a pen digitizer 165 and accompanying pen or stylus 166 are provided in order to digitally capture freehand input. Although a direct connection between the pen digitizer 165 and the processing unit 110 is shown, in practice, the pen digitizer 165 may be coupled to the processing unit 110 via a serial port, parallel port or other interface and the system bus 130 as known in the art. Furthermore, although the digitizer 165 is shown apart from the monitor 107, it is preferred that the usable input area of the digitizer 165 be co-extensive with the display area of the monitor 107. Further still, the digitizer 165 may be integrated in the monitor 107, or may exist as a separate device overlaying or otherwise appended to the monitor 107. - The
computer 100 can operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 109. The remote computer 109 can be a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 100, although only a memory storage device 111 has been illustrated in FIG. 1. The logical connections depicted in FIG. 1 include a local area network (LAN) 112 and a wide area network (WAN) 113. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet. - When used in a LAN networking environment, the
computer 100 is connected to the local network 112 through a network interface or adapter 114. When used in a WAN networking environment, the personal computer 100 typically includes a modem 115 or other means for establishing communications over the wide area network 113, such as the Internet. The modem 115, which may be internal or external, is connected to the system bus 130 via the serial port interface 106. In a networked environment, program modules depicted relative to the personal computer 100, or portions thereof, may be stored in the remote memory storage device. - It will be appreciated that the network connections shown are examples and other techniques for establishing a communications link between the computers can be used. The existence of any of various well-known protocols such as TCP/IP, Ethernet, FTP, HTTP and the like is presumed, and the system can be operated in a client-server configuration to permit a user to retrieve web pages from a web-based server. Any of various conventional web browsers can be used to display and manipulate data on web pages.
- Although the FIG. 1 environment shows an example environment, it will be understood that other computing environments may also be used. For example, one or more embodiments of the present invention may use an environment having fewer than all of the various aspects shown in FIG. 1 and described above, and these aspects may appear in various combinations and subcombinations that will be apparent to one of ordinary skill.
- FIG. 2 illustrates a tablet personal computer (PC)201 that can be used in accordance with various aspects of the present invention. Any or all of the features, subsystems, and functions in the system of FIG. 1 can be included in the computer of FIG. 2.
Tablet PC 201 includes a large display surface 202, e.g., a digitizing flat panel display, preferably, a liquid crystal display (LCD) screen, on which a plurality of windows 203 is displayed. Using stylus 204, a user can select, highlight, and write on the digitizing display area. Examples of suitable digitizing display panels include electromagnetic pen digitizers, such as the Mutoh or Wacom pen digitizers. Other types of pen digitizers, e.g., optical digitizers, may also be used. Tablet PC 201 interprets marks made using stylus 204 in order to manipulate data, enter text, and execute conventional computer application tasks such as spreadsheets, word processing programs, and the like. - A stylus could be equipped with buttons or other features to augment its selection capabilities. In one embodiment, a stylus could be implemented as a “pencil” or “pen”, in which one end constitutes a writing portion and the other end constitutes an “eraser” end, and which, when moved across the display, indicates portions of the display are to be erased. Other types of input devices, such as a mouse, trackball, or the like could be used. Additionally, a user's own finger could be used for selecting or indicating portions of the displayed image on a touch-sensitive or proximity-sensitive display. Consequently, the term “user input device”, as used herein, is intended to have a broad definition and encompasses many variations on well-known input devices.
- II. Example Highlighted Graphical Objects
- Anything appearing on a graphical user interface may be considered to be a graphical object, and FIG. 3 shows an example of such a graphical object. The
graphical object 300 is a handwritten version of the letter ‘H’ that, for example, is captured as digitized ink using a pen-based computer and its stylus. Other types of graphical objects include graphical icons, images, symbols, menus, prompts, and text. - The
body 301 of graphical object 300 may be drawn, or rendered, using any color or pattern value (e.g., hatched, shaded, a series of colors, etc.), and may be drawn on a background of a different color or pattern. In the FIG. 3 example, the background 302 is a simple solid white background, but any other color and/or pattern may be used as the background 302. The background 302 may also be comprised of one or more other graphical objects.
- The
graphical object 300 may become highlighted to represent that it has been selected, or to otherwise draw attention to it. This may occur in any number of ways. For example, in a pen-based computing device, the user may position the stylus over the displayed graphical object to select the object. The user may move a pointer over the graphical object to select it, and may also press a button to select it. Any other known method for selecting a graphical object in a GUI will also suffice. - When the graphical object is to be highlighted, its appearance changes. The
body 301 may have its color value changed to a transparent value, such that the background 302 becomes visible in the area previously occupied by the body 301. The highlighted graphical object may also have an outline, or “halo” 401, of additional color and/or pattern drawn around its periphery, surrounding the original body 301 of the graphical object. Thus, as an example, a graphical object in a first color/pattern (e.g., red) on a background of a second color/pattern (e.g., black striped) will, upon selection, be changed in appearance to have a halo in the first color/pattern (i.e., red), and a body in the second color/pattern (i.e., black striped) due to its transparency. In alternative embodiments, the body 301 of the graphical object may have its color changed to a predetermined color, such as white, rather than a transparent color. The use of a predetermined color may advantageously simplify the processing required to render the highlighted graphical object. - FIG. 5 shows a close-up view taken from area A-A shown in FIG. 4. The halo may be of a predetermined thickness (T), such as a certain number of pixels (e.g., 8, 10, 12, 15, 20, etc. pixels) or a predetermined distance (e.g., 1/16th of an inch, 1/8th of an inch, 1/4 of an inch, etc.), and appears in the original color of the graphical object. As a result of using a predetermined thickness, the
outer contour 502 of the halo 401 may follow the outer contour 501 of the graphical object 301. The contour 502 of the halo 401 may follow the outer contour 501 of the object 301 exactly, although a slight variation may occur due to the thickness of the halo 401. By following the contour 501, the selected graphical object will more closely resemble the original graphical object. - In alternative embodiments, the halo may have a varying thickness to give the selected object a more distinguished appearance. For example, as shown in FIG. 6, the thickness of the halo may simply vary to give the highlighted graphical object a glowing appearance. The variation may be periodic between two or more thicknesses (T1 and T2) (e.g., varying between 10 and 15 pixels every linear inch), or alternatively, may randomly vary among two or more thicknesses.
- As shown in FIG. 7, the halo may also vary its color and/or pattern value. For example, the halo color and/or pattern may be darker near the outer edges of the
graphical object body 301, and become lighter near the outer edges of the halo 401. At the outer edges of halo 401, the color/pattern may be a lighter version of the same color (e.g. light green becoming green, grey becoming black, etc.), or of the pattern (e.g., less concentrated stripes and/or shading, etc.). This variation in the halo color helps give the highlighted graphical object a glowing appearance. Other variations may also be used. For example, the color and/or pattern may become darker towards the outer edge 502 of the halo, or may vary in darkness throughout. The process by which an example halo may be generated and rendered for display will be described further below with respect to FIGS. 10-15. - As another feature, some embodiments of the present invention allow the user to select the graphical object by selecting its halo. For example, a pen-based computing device may allow users to select graphical objects by placing the stylus on or over the highlighted graphical object. Similar selections may be made using a mouse that moves an onscreen pointer, pressing a sequence of keys on a keyboard, or any other form of GUI interaction. With such a selectable halo, graphical objects having a thin appearance (e.g., a single hand-drawn stroke or line) may be easily selected after they have been highlighted.
- One advantage offered by some embodiments of the halos described above is apparent in situations where multiple graphical objects appear on the display screen, and overlap one another. If one of these graphical objects is selected, while the other is not, the transparent, or “hollow,” body of the selected graphical object, together with its halo, clearly set the object apart from the other unselected graphical object(s).
- In a further aspect, the highlighted graphical object may also be surrounded by a bounding box. FIG. 8 shows an example of a highlighted graphical object having a
rectangular bounding box 801. There are several advantages to a bounding box. For example, the box helps further indicate to the user that a graphical object has been selected, and may also identify with greater particularity the object (or objects) that are actually selected. The bounding box 801 may be defined such that it is a rectangular shape whose dimensions depend on the size of the graphical object. In the example shown in FIG. 8, the borders of the box 801 are horizontal and vertical, and are located a predetermined distance (d1) to the left, right, above and below the graphical object. In one exemplary embodiment, the distance (d1) is a set number of pixels, such as 8, 10, 12, 15, or 20. The borders shown in FIG. 8 are rendered using a dashed line, but any other line type and/or color may be used. - In alternate embodiments, the
bounding box 801 need not be rectangular, and need not be a box at all. For example, the bounding box may actually be implemented as any other shape, such as a circle, oval, diamond, etc., to surround the graphical object being highlighted. - In further embodiments, the
bounding box 801 may include one or more selection handles 802 a,b that may be used to interact with the object for further handling, such as moving and/or resizing the object. Selection handles 802 a may be located at the corners of the bounding box 801, and/or at other locations 802 b along the borders of the box. By selecting one of these handles (e.g., using a mouse-controlled pointer, stylus, or any other GUI selection method), the user may resize the graphical object, and/or select the graphical object for further processing/editing. In some embodiments, the bounding box border distance (d1) is configured to be of sufficient size such that the selection handles do not overlap or obscure any portion of the highlighted graphical object. - III. Movement of the Graphical Object
- In one aspect, the user may move a highlighted graphical object through various portions of the GUI. This movement may be made, for example, by the user selecting the graphical object in a pen-based computing system by placing a stylus on the graphical object, pressing a button on the stylus to select the graphical object, and then moving the stylus across a display screen to the desired new location. During this movement, some embodiments of the present invention may display a moving version of the graphical object at a location corresponding to the position of the user's input device. In this manner, the user is given feedback regarding the movement he/she is carrying out, as well as the system's acknowledgement that the move operation has begun.
- The moving version of the graphical object may appear as a simple outline of the object, as shown in FIG. 9a, to assist the user in determining placement of the graphical object. The outline may be of any line type and/or color, and in some embodiments, is a thin dotted line. Using a simple dotted outline as the moving version allows the user to see background object(s) during the movement process.
- The moving version may alternatively appear as a bounding box with a pointer or cursor, as shown in FIG. 9b. In some embodiments, the dimensions of the bounding box are the minimum dimensions necessary to enclose the graphical object. Using these minimum dimensions helps convey to the user the amount of space needed by the object. Alternatively, the dimensions of the box may be greater than the minimum. Using a simple bounding box may simplify the processing required for rendering the box during the movement process, and the presence of the cursor may help make clear that the object is being moved. Alternative embodiments of this moving version may simply have the bounding box, without the cursor.
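The bounding boxes described here (minimum dimensions, or a border a predetermined distance d1 beyond the object, with selection handles at the corners) reduce to simple min/max arithmetic over the object's points. A minimal sketch, with names of our own choosing:

```python
def bounding_box(points, margin=0):
    """Smallest axis-aligned rectangle enclosing `points`, optionally grown
    by `margin` (the predetermined distance d1) on every side."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return (min(xs) - margin, min(ys) - margin,
            max(xs) + margin, max(ys) + margin)

def corner_handles(box):
    """Selection-handle positions at the four corners of the box
    (cf. handles 802a); midpoints of the borders could be added similarly."""
    left, top, right, bottom = box
    return [(left, top), (right, top), (left, bottom), (right, bottom)]
```

With margin=0 this gives the minimum enclosing box used for the moving version; a nonzero margin gives the d1 border that keeps the handles clear of the object.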
- As a further alternative, the moving version of the graphical object may simply be a copy of the graphical object rendered in a different color and/or pattern, such as a lighter shade of the original color, or a fainter version of the color that allows more transparency. By using a lighter or fainter color for the moving version, the user is given a clear indication of what will be obscured when the object is repositioned.
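The two color treatments just mentioned can be computed directly on RGB(A) values: a lighter shade blends the color toward white, while a fainter version keeps the hue but lowers the alpha channel. A sketch (assuming 8-bit channels; function names are illustrative):

```python
def lighter_shade(rgb, amount):
    """Blend an RGB color toward white; amount=0.0 keeps the color
    unchanged, amount=1.0 yields pure white."""
    return tuple(int(c + (255 - c) * amount) for c in rgb)

def fainter(rgba, amount):
    """Keep the hue but reduce opacity, letting background objects
    show through the moving version."""
    r, g, b, a = rgba
    return (r, g, b, int(a * (1 - amount)))
```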
- Each of the moving versions described above may be rendered with its own bounding box and/or selection handles, as described above, to further indicate that a movement process has begun.
- In some embodiments, the GUI may include several distinct areas in which different color contrasts are used. For example, a GUI may have a first area in which black graphical objects appear against a white background, and a second area in which red graphical objects appear against a green background. Upon selection in the first area, a graphical object originally appearing in black ink may be replaced with a transparent body and a black halo. The user may move this object from the first area to the second area. In some embodiments, upon movement to this second area, the color of the highlighted graphical object's body and halo remain the same. In this manner, any meaning associated with the color of the underlying graphical object is preserved in the new GUI area. Alternatively, the highlighted graphical object's halo may change in color to that of the second area (in this example, red).
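The two policies described in this paragraph, preserving the object's original color across GUI areas versus adapting the halo to the new area's ink color, amount to a one-line choice. A hypothetical sketch (the policy names are ours):

```python
def halo_color(policy, object_color, new_area_ink_color):
    """'preserve' keeps the dragged object's original halo color so its
    meaning survives the move; 'adapt' switches the halo to the ink color
    of the destination area."""
    if policy == "preserve":
        return object_color
    return new_area_ink_color
```

In the example from the text, a black-inked object dragged into the red-on-green area keeps a black halo under "preserve" but gets a red halo under "adapt".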
- IV. Example Process of Highlighting Graphical Object
- FIGS. 10-15 depict example processes that may be used to generate and render an example highlighted graphical object according to one or more of the embodiments described above. To simplify the explanation, FIG. 10 depicts an example
graphical object 1000 that may be highlighted. The object may be any type of onscreen image; one example is a stroke of handwritten "ink" drawn by the user using a stylus and a touch-sensitive display. The stroke itself may follow a center line 1001, and may vary in thickness or width. This variation may arise from a variety of causes, such as the user varying the pressure placed upon the stylus while drawing the stroke. - FIGS. 11-13 depict example steps that may result in a graphical halo such as the one shown in FIG. 5. To create such a halo, the computer display system may first generate an
enlarged version 1101 of the original graphical object, as shown in FIG. 11. The enlarged version 1101 may be of the same color as the original graphical object, and have its width and/or height increased by a predetermined size, such as 0.125 inches, 10 pixels, etc., or by a predetermined percentage, such as 5%, 10%, etc. Increasing the dimensions by a predetermined amount advantageously allows consistency in the visual appearance of highlighted graphical objects. On the other hand, increasing the dimensions by a percentage may allow for a more dynamic visual experience. In some embodiments, a graphical object comprised of lines may be enlarged by having its lines widened by the predetermined amount. The enlarged version 1101 in FIG. 11 has been slightly increased in both height and width. - The computer display system may generate the
enlarged version 1101 in an internal memory or display buffer, and need not actually display or render the enlarged version on screen immediately. - The computer display system may also generate another
copy 1201 of the original graphical object 1000, where the copy 1201 has the same dimensions as the original object. The copy 1201 may be assigned a color value that is the background color value of the area on which the original object was displayed. The copy 1201 may then be overlaid upon the enlarged version 1101, resulting in the highlighted graphical object shown in FIG. 13. The enlarged version 1101 and copy 1201 need not be displayed onscreen prior to this "overlaying," and may simply be constructed in a memory or video buffer of the system. - In some embodiments, the color value of the
copy 1201 will be such that background images, text, and/or objects may be visible. For example, if the display system supports the use of transparency as a color value, then the copy 1201 may be assigned this transparency color value, and when this copy 1201 is overlaid upon the enlarged version, the interior of the enlarged version becomes transparent in color, allowing background images to appear. This effect may be achieved even in systems that do not directly support such transparency values. For example, rather than simply assigning the copy 1201 a single background color, the copy 1201 may be given the appearance of the graphical image, text, etc. that would have appeared directly behind the original object 1000. Alternatively, the copy 1201 may be assigned a predetermined color, such as white, to simplify the processing required to generate the halo. Using the background minimizes the amount of visual real estate that is obscured by the halo, while using a predetermined color simplifies the processing involved. One additional advantage to allowing the background pattern to show through lies in the readability of text that already has another highlighting (such as a yellow highlight) associated with it. If the text had already been highlighted, then allowing this highlighting to show through the hollow body of the halo helps preserve the meaning associated with that highlighting; the user is still able to see that the selected text had the highlight associated with it. - A similar process of "overlaying" versions may be used to create the "glowing" effect of the halo shown in FIG. 7. As shown in FIG. 14, multiple versions of the object may be created. In some embodiments, one
version 1401 is a copy, much like the copy 1201 shown in FIG. 12. One or more enlarged versions may also be created, where each version is of a slightly different size and color. For example, version 1402 is slightly larger in dimension than version 1401, and is of the color of the original object. Another version 1403 is slightly larger than version 1402, and is of a slightly lighter color. This lighter color may be a lighter shade than that of version 1402. Alternatively, version 1403 may have the same color as, but a greater transparency value than, version 1402. As with the FIG. 13 halo, the various versions shown in FIG. 14 may be overlaid upon one another to result in the highlighted object shown in FIG. 15. Although the FIG. 15 example uses two enlarged versions, other embodiments may use more enlarged versions, with slighter differences in size and/or color between them, to create a smoother "glowing" effect. In one embodiment, four such enlarged versions are used. - In further embodiments, the original
graphical object 1000 may have its width changed before the various copies and enlarged versions are created. For example, the object 1000 may originally have a varying width, but for purposes of generating the halo, it may have its width changed to a constant value. This constant value may be predefined, or may simply be the average width calculated along various points of the center line 1001. Using such a constant width may simplify the processing required to generate the various copies and enlarged versions, and is particularly useful when the original graphical object was nearly of constant width to begin with (e.g., if the original graphical object was constructed using a linear stroke of a stylus). - In some embodiments, the various steps of "overlaying" described above may be achieved using a version of the object(s) stored in a memory or display buffer. The overlaying may be performed on a pixel-by-pixel basis by replacing the individual pixels of a portion of one image with those of another image.
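The enlarge-and-overlay steps of FIGS. 11-13 can be sketched on a simplified pixel-set representation (an assumption of ours, not the patent's data structure): dilating the object's pixels by a radius plays the role of the enlarged version 1101, and subtracting the original pixels models overlaying the background-colored copy 1201, leaving only the hollow halo ring.

```python
def dilate(pixels, radius=1):
    """Enlarged version: every object pixel is widened by `radius`
    in all directions (cf. enlarged version 1101)."""
    grown = set()
    for x, y in pixels:
        for dx in range(-radius, radius + 1):
            for dy in range(-radius, radius + 1):
                grown.add((x + dx, y + dy))
    return grown

def hollow_halo(pixels, radius=1):
    """Overlaying the background-colored copy 1201 on the enlarged version
    amounts to removing the original pixels, leaving only the halo."""
    original = set(pixels)
    return dilate(original, radius) - original
```

A single-pixel object with radius 1 yields the eight surrounding pixels as its halo; painting those in the object's color and leaving the interior in the background color produces the hollow feedback of FIG. 13.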
- As an alternative, the graphical halo may be generated using the outer contour of the graphical object to define the contour that is to be followed by the halo. To generate the halo, the computer system may calculate the path to be taken by the halo, and render just that path on the screen to create the halo effect. The halo may be rendered by tracing a single line around the contour. In such an embodiment, no processing is necessary for the “hollow” portion of the halo, but greater processing is needed to trace the path. In some embodiments, the halo outline may be anti-aliased for improved resolution.
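The contour-following alternative can likewise be sketched on a pixel set: the halo path consists of the object pixels that touch the outside, i.e., those with at least one 4-connected neighbor not in the object. This is an illustrative sketch only; real implementations would convert these boundary pixels into an ordered, anti-aliased path.

```python
def contour(pixels):
    """Boundary pixels of the object: those with at least one 4-connected
    neighbor outside the set. Rendering only these traces the halo path,
    so no processing is spent on the 'hollow' interior."""
    px = set(pixels)
    edge = set()
    for x, y in px:
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (nx, ny) not in px:
                edge.add((x, y))
                break
    return edge
```

For a filled 3x3 square, only the center pixel is interior, so the contour is the surrounding eight pixels.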
- FIGS. 20a-b are photographic images of one example embodiment. In FIG. 20a, a handwritten word "highlight" 2001 is shown having a highlighting 2002. Highlighting 2002 may resemble the traditional highlighting generated using a handheld highlighter marker. FIG. 20b shows an example in which the highlighting 2002 has been selected by the user, and exhibits the hollow selection feedback described above. In the depicted example, there is also a
bounding box 2003 having selection handles 2004. - FIGS. 21a-b are photographic images of another example embodiment. In FIG. 21a, there is shown a
handwritten stroke 2101. In FIG. 21b, the stroke 2101 has been selected, and exhibits the hollow selection feedback described above. This depicted example also includes a bounding box 2102 having selection handles 2103. The interior of the bounding box 2102 may be transparent, to permit the background to show through. - V. Conclusion
- Various embodiments and aspects of the present invention have been described above, and it will be understood by those of ordinary skill that the present invention includes within its scope all combinations and subcombinations of these embodiments and aspects. The full scope of the present invention should only be limited by the claims, which read as follows:
Claims (34)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/144,256 US20030214539A1 (en) | 2002-05-14 | 2002-05-14 | Method and apparatus for hollow selection feedback |
EP03008066A EP1363185A3 (en) | 2002-05-14 | 2003-04-14 | Method and apparatus for hollow selection feedback |
US11/067,949 US7870501B2 (en) | 2002-05-14 | 2005-03-01 | Method for hollow selection feedback |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/144,256 US20030214539A1 (en) | 2002-05-14 | 2002-05-14 | Method and apparatus for hollow selection feedback |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/067,949 Division US7870501B2 (en) | 2002-05-14 | 2005-03-01 | Method for hollow selection feedback |
Publications (1)
Publication Number | Publication Date |
---|---|
US20030214539A1 true US20030214539A1 (en) | 2003-11-20 |
Family
ID=29269721
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/144,256 Abandoned US20030214539A1 (en) | 2002-05-14 | 2002-05-14 | Method and apparatus for hollow selection feedback |
US11/067,949 Expired - Fee Related US7870501B2 (en) | 2002-05-14 | 2005-03-01 | Method for hollow selection feedback |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/067,949 Expired - Fee Related US7870501B2 (en) | 2002-05-14 | 2005-03-01 | Method for hollow selection feedback |
Country Status (2)
Country | Link |
---|---|
US (2) | US20030214539A1 (en) |
EP (1) | EP1363185A3 (en) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040075699A1 (en) * | 2002-10-04 | 2004-04-22 | Creo Inc. | Method and apparatus for highlighting graphical objects |
US20050083330A1 (en) * | 2003-10-15 | 2005-04-21 | Crawford Anthony B. | White space algorithm to support the creation of high quality computer-based drawings free of graphics and text overwrites |
US20060005151A1 (en) * | 2004-07-02 | 2006-01-05 | Adobe Systems | Graphical interface for adjustment of text selections |
US20070057930A1 (en) * | 2002-07-30 | 2007-03-15 | Microsoft Corporation | Freeform Encounter Selection Tool |
US20070188499A1 (en) * | 2006-02-10 | 2007-08-16 | Adobe Systems Incorporated | Course grid aligned counters |
US7432939B1 (en) * | 2002-07-10 | 2008-10-07 | Apple Inc. | Method and apparatus for displaying pixel images for a graphical user interface |
US7873916B1 (en) * | 2004-06-22 | 2011-01-18 | Apple Inc. | Color labeling in a graphical user interface |
US20110201387A1 (en) * | 2010-02-12 | 2011-08-18 | Microsoft Corporation | Real-time typing assistance |
US9213714B1 (en) * | 2004-06-22 | 2015-12-15 | Apple Inc. | Indicating hierarchy in a computer system with a graphical user interface |
US20160042553A1 (en) * | 2014-08-07 | 2016-02-11 | Pixar | Generating a Volumetric Projection for an Object |
CN105808083A (en) * | 2014-12-30 | 2016-07-27 | 鸿合科技有限公司 | Display method and apparatus |
US11429274B2 (en) | 2019-05-06 | 2022-08-30 | Apple Inc. | Handwriting entry on an electronic device |
US11656758B2 (en) * | 2020-05-11 | 2023-05-23 | Apple Inc. | Interacting with handwritten content on an electronic device |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8677274B2 (en) | 2004-11-10 | 2014-03-18 | Apple Inc. | Highlighting items for search results |
US7479968B2 (en) * | 2006-01-31 | 2009-01-20 | Microsoft Corporation | Semi-transparent highlighting of selected objects in electronic documents |
US8739068B2 (en) | 2007-06-15 | 2014-05-27 | Microsoft Corporation | Dynamic user interface for in-diagram shape selection |
US8762871B2 (en) * | 2008-02-03 | 2014-06-24 | Microsoft Corporation | Dynamic preview of diagram elements to be inserted into a diagram |
EP2335450A1 (en) * | 2008-10-08 | 2011-06-22 | Research In Motion Limited | Modifying the appearance of a movable position-marker on a display screen of a handheld electronic communication device |
US8490026B2 (en) * | 2008-10-27 | 2013-07-16 | Microsoft Corporation | Painting user controls |
TWM385730U (en) * | 2010-02-02 | 2010-08-01 | Sunrex Technology Corp | Computer input device with a marco function configuration |
KR101705872B1 (en) * | 2010-09-08 | 2017-02-10 | 삼성전자주식회사 | Method for selecting area on a screen in a mobile device and apparatus therefore |
US20120102401A1 (en) * | 2010-10-25 | 2012-04-26 | Nokia Corporation | Method and apparatus for providing text selection |
US10929208B1 (en) * | 2018-05-09 | 2021-02-23 | Accusoft Corporation | Methods and apparatus for copying a selected browser region to a clipboard as an image |
WO2022261097A1 (en) * | 2021-06-08 | 2022-12-15 | Roblox Corporation | Computer-assisted graphical development tools |
Citations (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4570182A (en) * | 1983-11-18 | 1986-02-11 | Sperry Corporation | Halo generator for CRT display symbols |
US4698666A (en) * | 1985-07-12 | 1987-10-06 | The Grass Valley Group, Inc. | Video key glow and border generator |
US5339387A (en) * | 1991-10-24 | 1994-08-16 | Abekas Video Systems, Inc. | Planar color gradients constructed as an arbitrary function of a distance function from an arbitrary 2-D curvilinear function |
US5410331A (en) * | 1992-05-20 | 1995-04-25 | Carmex, Inc. | Process for generating and/or using a look-up table |
US5442742A (en) * | 1990-12-21 | 1995-08-15 | Apple Computer, Inc. | Method and apparatus for the manipulation of text on a computer display screen |
US5459831A (en) * | 1992-01-10 | 1995-10-17 | International Business Machines Corporation | Method for selecting graphical objects in quadrants with a cursor |
US5470182A (en) * | 1994-02-04 | 1995-11-28 | Omation Corporation | Envelope opening machine |
US5500934A (en) * | 1991-09-04 | 1996-03-19 | International Business Machines Corporation | Display and control system for configuring and monitoring a complex system |
US5577170A (en) * | 1993-12-23 | 1996-11-19 | Adobe Systems, Incorporated | Generation of typefaces on high resolution output devices |
US5631975A (en) * | 1992-04-14 | 1997-05-20 | Koninkl Philips Electronics Nv | Image segmentation device |
US5666139A (en) * | 1992-10-15 | 1997-09-09 | Advanced Pen Technologies, Inc. | Pen-based computer copy editing apparatus and method for manuscripts |
US5687331A (en) * | 1995-08-03 | 1997-11-11 | Microsoft Corporation | Method and system for displaying an animated focus item |
US5754178A (en) * | 1993-03-03 | 1998-05-19 | Apple Computer, Inc. | Method and apparatus for improved feedback during manipulation of data on a computer controlled display system |
US5754187A (en) * | 1994-05-16 | 1998-05-19 | Agfa Division, Bayer Corporation | Method for data compression of digital data to produce a scaleable font database |
US5798752A (en) * | 1993-07-21 | 1998-08-25 | Xerox Corporation | User interface having simultaneously movable tools and cursor |
US5835919A (en) * | 1993-05-10 | 1998-11-10 | Apple Computer, Inc. | Computer-human interface system which manipulates parts between a desktop and a document |
US5867150A (en) * | 1992-02-10 | 1999-02-02 | Compaq Computer Corporation | Graphic indexing system |
US5977981A (en) * | 1993-03-08 | 1999-11-02 | Canon Information Systems Research Australia Pty Ltd. | Softstroking of edges for computer graphics |
US5982381A (en) * | 1997-07-03 | 1999-11-09 | Microsoft Corporation | Method and apparatus for modifying a cutout image for compositing |
US6014138A (en) * | 1994-01-21 | 2000-01-11 | Inprise Corporation | Development system with methods for improved visual programming with hierarchical object explorer |
US6054989A (en) * | 1998-09-14 | 2000-04-25 | Microsoft Corporation | Methods, apparatus and data structures for providing a user interface, which exploits spatial memory in three-dimensions, to objects and which provides spatialized audio |
US6057858A (en) * | 1996-08-07 | 2000-05-02 | Desrosiers; John J. | Multiple media fonts |
US6077307A (en) * | 1997-10-31 | 2000-06-20 | Hewlett Packard Company | Forced conformance design-rule halos for computer aided design software |
US6088481A (en) * | 1994-07-04 | 2000-07-11 | Sanyo Electric Co., Ltd. | Handwritten character input device allowing input of handwritten characters to arbitrary application program |
US6091505A (en) * | 1998-01-30 | 2000-07-18 | Apple Computer, Inc. | Method and system for achieving enhanced glyphs in a font |
US6133925A (en) * | 1994-08-31 | 2000-10-17 | Microsoft Corporation | Automated system and method for annotation using callouts |
US6151309A (en) * | 1994-04-28 | 2000-11-21 | British Telecommunications Public Limited Company | Service provision system for communications networks |
US6243093B1 (en) * | 1998-09-14 | 2001-06-05 | Microsoft Corporation | Methods, apparatus and data structures for providing a user interface, which exploits spatial memory in three-dimensions, to objects and which visually groups matching objects |
US20010020956A1 (en) * | 1997-12-09 | 2001-09-13 | Zyris Plc. | Graphic image design |
US6337689B1 (en) * | 1999-04-03 | 2002-01-08 | Hewlett-Packard Company | Adaptive buffering of computer graphics vertex commands |
US20020046386A1 (en) * | 2000-10-18 | 2002-04-18 | Chipworks | Design analysis workstation for analyzing integrated circuits |
US20020056136A1 (en) * | 1995-09-29 | 2002-05-09 | Wistendahl Douglass A. | System for converting existing TV content to interactive TV programs operated with a standard remote control and TV set-top box |
US20020064308A1 (en) * | 1993-05-20 | 2002-05-30 | Dan Altman | System and methods for spacing, storing and recognizing electronic representations of handwriting printing and drawings |
US20020112180A1 (en) * | 2000-12-19 | 2002-08-15 | Land Michael Z. | System and method for multimedia authoring and playback |
US20020126990A1 (en) * | 2000-10-24 | 2002-09-12 | Gary Rasmussen | Creating on content enhancements |
US6476834B1 (en) * | 1999-05-28 | 2002-11-05 | International Business Machines Corporation | Dynamic creation of selectable items on surfaces |
US6496981B1 (en) * | 1997-09-19 | 2002-12-17 | Douglass A. Wistendahl | System for converting media content for interactive TV use |
US20030038836A1 (en) * | 1999-11-30 | 2003-02-27 | Ronald Simon Paul | Web map tool |
US20030052906A1 (en) * | 1999-08-16 | 2003-03-20 | Christopher Lau | Interactive video object processing environment having concurrently active subordinate windows |
US20030063797A1 (en) * | 2001-09-28 | 2003-04-03 | Kaixuan Mao | Smart masking tool for image processing |
US20030151628A1 (en) * | 2001-10-20 | 2003-08-14 | Salter Hal Christopher | Interactive game providing instruction in musical notation and in learning an instrument |
US6628666B1 (en) * | 1998-03-30 | 2003-09-30 | Genesys Telecomm Lab Inc | Managing bandwidth on demand for internet protocol messaging with capability for transforming telephony calls from one media type to another media type |
US20040039934A1 (en) * | 2000-12-19 | 2004-02-26 | Land Michael Z. | System and method for multimedia authoring and playback |
US20040190779A1 (en) * | 2000-02-29 | 2004-09-30 | Goldpocket Interactive, Inc. | Method for outlining and filling regions in multi-dimensional arrays |
US20050010927A1 (en) * | 1993-03-03 | 2005-01-13 | Stern Mark Ludwig | Method and apparatus for improved interaction with an application program according to data types and actions performed by the application program |
US6968511B1 (en) * | 2002-03-07 | 2005-11-22 | Microsoft Corporation | Graphical user interface, data structure and associated method for cluster-based document management |
US20060053383A1 (en) * | 2000-07-21 | 2006-03-09 | Microsoft Corporation | Integrated method for creating a refreshable web query |
US20080072168A1 (en) * | 1999-06-18 | 2008-03-20 | Microsoft Corporation | Methods, apparatus and data structures for providing a user interface to objects, the user interface exploiting spatial memory and visually indicating at least one object parameter |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2040681A1 (en) * | 1990-04-26 | 1991-10-27 | Alan H. Trist | Image editing system |
US5682439A (en) * | 1995-08-07 | 1997-10-28 | Apple Computer, Inc. | Boxed input correction system and method for pen based computer systems |
US5757383A (en) * | 1996-05-10 | 1998-05-26 | Apple Computer, Inc. | Method and system for highlighting typography along a geometric path |
US6337698B1 (en) * | 1998-11-20 | 2002-01-08 | Microsoft Corporation | Pen-based interface for a notepad computer |
US6381519B1 (en) * | 2000-09-19 | 2002-04-30 | Honeywell International Inc. | Cursor management on a multiple display electronic flight instrumentation system |
US7873916B1 (en) * | 2004-06-22 | 2011-01-18 | Apple Inc. | Color labeling in a graphical user interface |
- 2002-05-14 US US10/144,256 patent/US20030214539A1/en not_active Abandoned
- 2003-04-14 EP EP03008066A patent/EP1363185A3/en not_active Withdrawn
- 2005-03-01 US US11/067,949 patent/US7870501B2/en not_active Expired - Fee Related
Cited By (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7432939B1 (en) * | 2002-07-10 | 2008-10-07 | Apple Inc. | Method and apparatus for displaying pixel images for a graphical user interface |
US20070057930A1 (en) * | 2002-07-30 | 2007-03-15 | Microsoft Corporation | Freeform Encounter Selection Tool |
US8132125B2 (en) * | 2002-07-30 | 2012-03-06 | Microsoft Corporation | Freeform encounter selection tool |
US20040075699A1 (en) * | 2002-10-04 | 2004-04-22 | Creo Inc. | Method and apparatus for highlighting graphical objects |
US7057628B2 (en) * | 2003-10-15 | 2006-06-06 | Anthony Bruce Crawford | Method for locating white space to support the automated creation of computer-based drawings that are virtually free of graphics and text overwrites |
US20050083330A1 (en) * | 2003-10-15 | 2005-04-21 | Crawford Anthony B. | White space algorithm to support the creation of high quality computer-based drawings free of graphics and text overwrites |
US7873916B1 (en) * | 2004-06-22 | 2011-01-18 | Apple Inc. | Color labeling in a graphical user interface |
US20110145742A1 (en) * | 2004-06-22 | 2011-06-16 | Imran Chaudhri | Color labeling in a graphical user interface |
US9606698B2 (en) | 2004-06-22 | 2017-03-28 | Apple Inc. | Color labeling in a graphical user interface |
US9213714B1 (en) * | 2004-06-22 | 2015-12-15 | Apple Inc. | Indicating hierarchy in a computer system with a graphical user interface |
US20060005151A1 (en) * | 2004-07-02 | 2006-01-05 | Adobe Systems | Graphical interface for adjustment of text selections |
US20070188499A1 (en) * | 2006-02-10 | 2007-08-16 | Adobe Systems Incorporated | Course grid aligned counters |
US7868888B2 (en) * | 2006-02-10 | 2011-01-11 | Adobe Systems Incorporated | Course grid aligned counters |
US20110202876A1 (en) * | 2010-02-12 | 2011-08-18 | Microsoft Corporation | User-centric soft keyboard predictive technologies |
US8782556B2 (en) | 2010-02-12 | 2014-07-15 | Microsoft Corporation | User-centric soft keyboard predictive technologies |
US9165257B2 (en) | 2010-02-12 | 2015-10-20 | Microsoft Technology Licensing, Llc | Typing assistance for editing |
US20110202836A1 (en) * | 2010-02-12 | 2011-08-18 | Microsoft Corporation | Typing assistance for editing |
US20110201387A1 (en) * | 2010-02-12 | 2011-08-18 | Microsoft Corporation | Real-time typing assistance |
US9613015B2 (en) | 2010-02-12 | 2017-04-04 | Microsoft Technology Licensing, Llc | User-centric soft keyboard predictive technologies |
US10126936B2 (en) | 2010-02-12 | 2018-11-13 | Microsoft Technology Licensing, Llc | Typing assistance for editing |
US10156981B2 (en) | 2010-02-12 | 2018-12-18 | Microsoft Technology Licensing, Llc | User-centric soft keyboard predictive technologies |
US20160042553A1 (en) * | 2014-08-07 | 2016-02-11 | Pixar | Generating a Volumetric Projection for an Object |
US10169909B2 (en) * | 2014-08-07 | 2019-01-01 | Pixar | Generating a volumetric projection for an object |
CN105808083A (en) * | 2014-12-30 | 2016-07-27 | Honghe Technology Co., Ltd. (鸿合科技有限公司) | Display method and apparatus |
US11429274B2 (en) | 2019-05-06 | 2022-08-30 | Apple Inc. | Handwriting entry on an electronic device |
US11656758B2 (en) * | 2020-05-11 | 2023-05-23 | Apple Inc. | Interacting with handwritten content on an electronic device |
Also Published As
Publication number | Publication date |
---|---|
US7870501B2 (en) | 2011-01-11 |
US20050149882A1 (en) | 2005-07-07 |
EP1363185A3 (en) | 2007-12-05 |
EP1363185A2 (en) | 2003-11-19 |
Similar Documents
Publication | Title |
---|---|
US7870501B2 (en) | Method for hollow selection feedback |
US7428711B2 (en) | Glow highlighting as an ink attribute |
US8166388B2 (en) | Overlaying electronic ink |
US8132125B2 (en) | Freeform encounter selection tool |
US7190379B2 (en) | Method for resizing and moving an object on a computer screen |
CA2124604C (en) | Method and apparatus for operating on an object-based model data structure to produce a second image in the spatial context of a first image |
JP4637455B2 (en) | User interface utilization method and product including computer usable media |
US6891551B2 (en) | Selection handles in editing electronic documents |
JP4694606B2 (en) | Gesture determination method |
US7028256B2 (en) | Adding white space to a document generating adjusted page sizing |
US20050015731A1 (en) | Handling data across different portions or regions of a desktop |
US20040257346A1 (en) | Content selection and handling |
US9128919B2 (en) | Smart space insertion |
US20090091547A1 (en) | Information display device |
JP5664164B2 (en) | Electronic information board device, information display method, program |
US20020080126A1 (en) | Mode hinting and switching |
US20030226113A1 (en) | Automatic page size setting |
US10725653B2 (en) | Image processing device, image processing system, and image processing method |
US7454699B2 (en) | Smart content insertion |
US11137903B2 (en) | Gesture-based transitions between modes for mixed mode digital boards |
US10430458B2 (en) | Automated data extraction from a chart from user screen selections |
US7296240B1 (en) | Document object membranes |
CN108932054B (en) | Display device, display method, and non-transitory recording medium |
JP2004185478A (en) | Illustration creating program and line drawing program |
RU2365979C2 (en) | Input and reproduction of electronic ink |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: MICROSOFT CORPORATION, WASHINGTON; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:IWEMA, MARIEKE;DAVIS, SHAWNA;JARRETT, ROBERT J.;AND OTHERS;REEL/FRAME:012903/0564; Effective date: 20020509 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
| AS | Assignment | Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034766/0001; Effective date: 20141014 |