US20110109560A1 - Audio/Visual Device Touch-Based User Interface - Google Patents


Info

Publication number
US20110109560A1
Authority
US
United States
Prior art keywords
racetrack
menu
indication
audio
touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/613,943
Inventor
Santiago Carvajal
John Michael Sakalowsky
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bose Corp
Original Assignee
Bose Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bose Corp
Priority to US12/613,943 (US20110109560A1)
Assigned to BOSE CORPORATION (Assignors: CARVAJAL, SANTIAGO; SAKALOWSKY, JOHN MICHAEL)
Priority to US12/887,484 (US8686957B2)
Priority to US12/886,802 (US8350820B2)
Priority to US12/887,499 (US8638306B2)
Priority to US12/886,837 (US20110113371A1)
Priority to US12/886,998 (US8692815B2)
Priority to US12/887,479 (US8669949B2)
Priority to EP10777200A (EP2497017A1)
Priority to PCT/US2010/055628 (WO2011057076A1)
Publication of US20110109560A1
Priority to US13/414,436 (US8736566B2)
Priority to US13/448,657 (US9201584B2)

Classifications

    • G06F3/0362: Pointing devices displaced or positioned by the user, with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts
    • G06F3/03547: Touch pads, in which fingers can move on a surface
    • G06F3/0482: Interaction techniques based on graphical user interfaces [GUI]; interaction with lists of selectable items, e.g. menus
    • H04N21/42224: Touch pad or touch panel provided on the remote control
    • H04N21/4312: Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/47: End-user applications
    • H04N21/47217: End-user interface for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks

Definitions

  • This disclosure relates to user interfaces incorporating a visual display and/or a touch-sensitive control.
  • an audio/visual program may be, e.g., a piece of music, a recorded lecture, a recorded live performance, a movie, a slideshow, family pictures, or an episode of a television program.
  • the increasing variety of choices of sources of audio/visual programs and the increasing variety of mechanisms by which audio/visual programs are able to be stored and played have greatly complicated what was once the relatively simple act of watching or listening to the playing of an audio/visual program to enjoy it.
  • those wishing to “tune in” a broadcast audio/visual program must now select the channel on which to view it from as many as 500 channels available through typical cable and/or satellite connections for television and/or radio.
  • there are also audio/visual devices that are able to be programmed to autonomously tune in and record an audio/visual program for playing at a later time.
  • some of these possible sources of audio/visual programs require paid subscriptions for which key cards and/or decryption keys are required to gain access to at least some audio/visual programs.
  • Each such audio/visual device often has a unique user interface, and more often than not, is accompanied by a separate handheld wireless remote control by which it is operated. Attempts have been made to grapple with the resulting plethora of remote controls that often accompany a multitude of audio/visual devices by providing so-called “universal remotes” enabling multiple audio/visual devices to be operated using a single remote control.
  • a universal remote tends to go only so far in satisfying the desire of many users to simplify the coordination required in the operation of multiple audio/visual devices to perform the task of playing an audio/visual program.
  • a user interface for an audio/visual device incorporates one or both of a touch sensor having a touch surface on which is defined a racetrack surface having a ring shape, and a display element on which is displayed a racetrack menu also having a ring shape. Where the user interface incorporates both, the ring shapes of the racetrack surface and the racetrack menu are structured to generally correspond, such that the position of a marker on the racetrack menu corresponds to the position at which a digit of a user's hand touches the racetrack surface.
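  • The correspondence between the racetrack surface and the racetrack menu described above can be sketched as a perimeter mapping. The following is purely illustrative (the function names, side lengths and coordinate convention are assumptions, not part of the disclosure): a touch position on the ring-shaped racetrack surface is reduced to a fraction of the ring's total perimeter, and the marker is then placed at the same fraction of the racetrack menu's perimeter.

```python
# Illustrative sketch of the racetrack-surface-to-racetrack-menu mapping.
# A position on a rectangular ring is expressed as (side index, offset
# along that side); all names and dimensions here are assumptions.

def perimeter_fraction(side, offset, side_lengths):
    """Convert a (side, offset) touch position into a 0.0-1.0 fraction of
    the total distance around the rectangular ring."""
    total = sum(side_lengths)
    preceding = sum(side_lengths[:side])
    return (preceding + offset) / total

def marker_location(fraction, menu_side_lengths):
    """Map a perimeter fraction back onto the menu ring, returning the
    menu side and offset where the marker should be drawn."""
    distance = fraction * sum(menu_side_lengths)
    for side, length in enumerate(menu_side_lengths):
        if distance <= length:
            return side, distance
        distance -= length
    return len(menu_side_lengths) - 1, menu_side_lengths[-1]

# A touch halfway along side 0 of a (hypothetical) 40x20 mm sensor ring
# maps to the corresponding point on a 1600x900 pixel on-screen menu ring.
f = perimeter_fraction(0, 20.0, [40.0, 20.0, 40.0, 20.0])
print(marker_location(f, [1600.0, 900.0, 1600.0, 900.0]))
```

Because both positions are reduced to the same perimeter fraction, the mapping works for any pair of ring shapes with corresponding sides, which matches the disclosure's point that the two rings need only "generally correspond."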
  • in one aspect, an apparatus includes a display element capable of visually displaying a visual portion of an audio/visual program and a racetrack menu having a ring shape; a processing device; and a storage accessible to the processing device and storing a sequence of instructions.
  • the processing device is caused to: cause the racetrack menu to be visually displayed on the display element such that the racetrack menu surrounds a first display area in which the visual portion of the audio/visual program may be visually displayed; cause a plurality of menu items to be visually displayed in the racetrack menu; cause a first marker to be visually displayed in the racetrack menu; receive an indication that a first manually-operable control is being operated to move the first marker; in response to the indication of the first manually-operable control being operated to move the first marker, move the first marker about the racetrack menu and constrain movement of the first marker to remain within the racetrack menu; and receive an indication of the first manually-operable control being operated to select a menu item of the plurality of menu items that is in the vicinity of the first marker.
  • Implementations may include, and are not limited to, one or more of the following features.
  • the touch-sensitive surface of the touch sensor may have a ring shape that defines the ring shape of the racetrack surface such that the racetrack surface encompasses substantially all of the touch-sensitive surface.
  • the apparatus may further include a manually operable control, and a casing wherein the touch sensor is disposed on the casing relative to the manually operable control such that the touch-sensitive surface surrounds the manually operable control.
  • the touch-sensitive surface of the touch sensor may be a continuous surface having no hole interrupting the touch-sensitive surface formed therethrough, where the ring shape of the racetrack surface is defined on the touch-sensitive surface to encompass a first portion of the touch-sensitive surface and is defined to be positioned about the periphery of the touch-sensitive surface so as to surround a second portion of the touch-sensitive surface, and a navigation surface is defined on the touch-sensitive surface to encompass the second portion.
  • At least one ridge may be formed in the touch-sensitive surface, wherein the at least one ridge also at least partly defines the ring shape of the racetrack surface.
  • the processing device may be caused by the sequence of instructions to define the first and second portions of the touch-sensitive surface by: monitoring activity on the touch-sensitive surface; treating the receipt of an indication of the digit touching the touch-sensitive surface at a location within the first portion as the indication of the digit touching the racetrack surface at the position; treating the receipt of an indication of the digit touching the touch-sensitive surface at a location within the second portion as an indication of the digit operating a navigation control; and in response to the indication of the digit touching the navigation control, causing a command to be transmitted to a source of the audio/visual program to operate a function of another menu associated with the source.
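  • The partitioning described above, in which the software treats touches in a first (outer racetrack) portion and a second (inner navigation) portion of a continuous surface differently, might be implemented as a simple hit test. This is a hypothetical sketch; the coordinate system, surface dimensions and band width are illustrative assumptions.

```python
# Hypothetical hit test partitioning a continuous touch-sensitive surface
# into an outer "racetrack" band and the inner "navigation" area it
# surrounds; dimensions are in arbitrary sensor units.

def classify_touch(x, y, width, height, band):
    """Return which software-defined portion of the surface a touch at
    (x, y) falls in: the outer racetrack band or the inner navigation area."""
    on_band = (x < band or x > width - band or
               y < band or y > height - band)
    return "racetrack" if on_band else "navigation"
```

A touch classified as "racetrack" would be handled as operating the racetrack surface at that position, while a touch classified as "navigation" would be translated into a command transmitted to the source, as the bullet above describes.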
  • the apparatus may further include a source interface operable to transmit commands to a source of the audio/visual program; wherein execution of the sequence of instructions by the processing device further causes the processing device to receive an indication of the manually-operable control being operated; and in response to the indication of the manually-operable control being operated, operate the source interface to transmit a command to the source to cause the source to visually display a navigation menu of the source on the display element.
  • the menu may have a ring shape that substantially corresponds to the ring shape of the racetrack surface.
  • the ring shape of both the racetrack surface and the menu may be a rectangular ring shape such that the racetrack surface comprises four sides and the menu comprises four sides that correspond to the four sides of the racetrack surface.
  • the ring shape of the menu may surround a display area in which a visual portion of the audio/visual program is displayed at a time when the audio/visual program is played.
  • Execution of the sequence of instructions by the processing device may further cause the processing device to cause the menu to be visually displayed in response to the indication of the digit touching the racetrack surface at the position at a time when the menu is not being visually displayed.
  • Execution of the sequence of instructions by the processing device may further cause the processing device to cause the menu to be visually displayed in response to the indication of the digit touching the racetrack surface followed by an indication of the digit moving about the racetrack surface in a wiping motion starting at the position at a time when the menu is not being visually displayed; and cause a command concerning playing the audio/visual program to be transmitted to a source of the audio/visual program in response to the indication of the digit touching the racetrack surface followed by an indication of the digit ceasing to touch the racetrack surface at a time when the menu is not being visually displayed.
  • Execution of the sequence of instructions by the processing device may further cause the processing device to cause the menu to be visually displayed in response to the indication of the digit touching the racetrack surface followed by an indication of the digit remaining in contact with the racetrack surface for at least a predetermined period of time at a time when the menu is not being visually displayed; and cause a command concerning playing the audio/visual program to be transmitted to a source of the audio/visual program in response to the indication of the digit touching the racetrack surface followed by an indication of the digit ceasing to touch the racetrack surface at a time when the menu is not being visually displayed.
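  • The two menu-summoning variants above (a wiping motion, or remaining in contact for at least a predetermined period) and the tap-and-release playback command could be discriminated roughly as follows. The 0.5 s hold threshold and the (contact_seconds, moved, released) event summary are illustrative assumptions, not values taken from the disclosure.

```python
# Hypothetical sketch of gesture discrimination on the racetrack surface,
# used only while the racetrack menu is hidden.

def interpret_gesture(contact_seconds, moved, released, hold_threshold=0.5):
    """Return which action a touch on the racetrack surface should trigger
    while the menu is not displayed."""
    # A wiping motion or a sustained hold summons the racetrack menu.
    if moved or contact_seconds >= hold_threshold:
        return "show_menu"
    # A brief tap-and-release instead sends a playback command to the source.
    if released:
        return "send_playback_command"
    # Still touching, not yet moved or held long enough: keep waiting.
    return "wait"
```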
  • a method includes receiving an indication of a digit of a hand of a user touching a racetrack surface at a position on the racetrack surface, wherein the racetrack surface is defined on a touch-sensitive surface of a touch sensor to encompass at least a portion of the touch-sensitive surface and is operable by the digit; in response to the indication of the digit touching the racetrack surface at the position, causing a marker to be visually displayed at a location that corresponds to the position on the racetrack surface on a menu that is visually displayed on a display element; receiving an indication of the position at which the digit touches the racetrack surface being moved about the racetrack surface; in response to the indication of the position being moved about the racetrack surface, causing the marker to be moved about the menu in a manner that corresponds to the manner in which the position is being moved about the racetrack; receiving an indication of the user increasing the pressure with which the user's digit touches the racetrack surface at the position at a time subsequent to receiving the indication of the position being moved about the racetrack; and in response to the indication of the increased pressure, selecting a menu item of the menu that is in the vicinity of the marker.
  • Implementations may include, and are not limited to, one or more of the following features.
  • the method may further include defining the racetrack surface on a first portion of the touch-sensitive surface and defining a navigation surface on a second portion of the touch-sensitive surface such that the ring shape of the racetrack surface surrounds the navigation surface by: monitoring activity on the touch-sensitive surface; treating the receipt of an indication of the digit touching the touch-sensitive surface at a location within the first portion as the receiving of the indication of the digit touching the racetrack surface at the position; treating the receipt of an indication of the digit touching the touch-sensitive surface at a location within the second portion as receiving an indication of the digit operating a navigation control; and in response to the indication of the digit touching the navigation control, causing a command to be transmitted to a source of the audio/visual program to operate a function of another menu associated with the source.
  • the method may further include displaying the menu on the display element with a ring shape that substantially corresponds to the ring shape of the racetrack surface; and perhaps further include surrounding a display area on the display element with the menu, wherein a visual portion of the audio/visual program is displayed in the display area at a time when the audio/visual program is played.
  • the ring shape of both the racetrack surface and the menu may be a rectangular ring shape such that the racetrack surface comprises four sides and the menu comprises four sides that correspond to the four sides of the racetrack surface.
  • the method may further include displaying the menu on the display element in response to the indication of the digit touching the racetrack surface at the position at a time when the menu is not being visually displayed.
  • the method may further include displaying the menu on the display element in response to the indication of the digit touching the racetrack surface followed by receiving an indication of the digit moving about the racetrack surface in a wiping motion starting at the position at a time when the menu is not being visually displayed; and transmitting a command concerning playing the audio/visual program to a source of the audio/visual program in response to the indication of the digit touching the racetrack surface followed by receiving an indication of the digit ceasing to touch the racetrack surface at a time when the menu is not being visually displayed.
  • the method may further include displaying the menu on the display element in response to the indication of the digit touching the racetrack surface followed by receiving an indication of the digit remaining in contact with the racetrack surface for at least a predetermined period of time at a time when the menu is not being visually displayed; and transmitting a command concerning playing the audio/visual program to a source of the audio/visual program in response to the indication of the digit touching the racetrack surface followed by receiving an indication of the digit ceasing to touch the racetrack surface at a time when the menu is not being visually displayed.
  • FIG. 1 is a perspective view of an embodiment of a user interface.
  • FIG. 2 depicts correlations between movement of a digit on a racetrack sensor of the user interface of FIG. 1 and movement of a marker on a racetrack menu of the user interface of FIG. 1 .
  • FIGS. 3 a , 3 b , 3 c and 3 d depict possible variants of the user interface of FIG. 1 incorporating different forms and combinations of markers.
  • FIG. 4 is a block diagram of a possible architecture of the user interface of FIG. 1 .
  • FIG. 5 is a perspective view of another embodiment of the user interface of FIG. 1 combining more of the features of the user interface into a single device.
  • FIG. 6 depicts a possibility of switching between displaying and not displaying the racetrack menu of the user interface of FIG. 1 .
  • FIGS. 7 a and 7 b depict additional possible details of the user interface of FIG. 1 .
  • FIG. 8 is a perspective view of the embodiment of the user interface of FIG. 5 , additionally incorporating the possible details of FIGS. 7 a and 7 b.
  • FIG. 9 is a block diagram of the controller of the architecture of FIG. 4 .
  • FIGS. 10 a and 10 b depict possible variants of the touch sensor employed in the user interface of FIG. 1 .
  • FIGS. 11 a and 11 b depict possible variants of the user interface of FIG. 1 incorporating more than one display area.
  • FIG. 12 depicts another embodiment of the user interface of FIG. 1 in which the racetrack menu and the display area surrounded by the racetrack menu do not occupy substantially all of a display element.
  • This disclosure concerns audio/visual devices, i.e., devices that are structured to be employed by a user to play an audio/visual program (e.g., televisions, set-top boxes and hand-held remotes).
  • presentations of specific embodiments are intended to facilitate understanding through the use of examples, and should not be taken as limiting either the scope of disclosure or the scope of claim coverage.
  • FIG. 1 depicts a user interface 1000 enabling a user's hand-eye coordination to be employed to more intuitively operate at least one audio/visual device to select and play an audio/visual program.
  • the user interface 1000 incorporates a displayed “racetrack” menu 150 and a corresponding “racetrack” surface 250 .
  • the user interface 1000 is implemented by an interoperable set of devices that include at least an audio/visual device 100 and a handheld remote control 200 , and as will be explained in greater detail, may further include another audio/visual device 900 .
  • the user interface 1000 may be substantially fully implemented by a single audio/visual device, such as the audio/visual device 100 .
  • the racetrack menu 150 is visually displayed on a display element 120 disposed on a casing 110 of the audio/visual device 100 , and as depicted, the audio/visual device 100 is a flat panel display device such as a television, employing a flat panel form of the display element 120 such as a liquid crystal display (LCD) element or a plasma display element. The audio/visual device 100 may further incorporate acoustic drivers 130 to acoustically output sound. However, as those skilled in the art will readily recognize, the racetrack menu 150 may be displayed by any of a variety of types, configurations and sizes of audio/visual device, whether portable or stationary, including and not limited to a projector or a handheld device.
  • the racetrack surface 250 is defined on a touch-sensitive surface 225 of a touch sensor 220 disposed on a casing 210 of the handheld remote control 200 , and as depicted, the touch-sensitive surface 225 has a rectangular ring shape that physically defines the shape and position of the racetrack surface 250 such that the racetrack surface 250 encompasses substantially all of the touch-sensitive surface of the touch sensor 220 .
  • the touch sensor 220 may be incorporated into any of a wide variety of devices, whether portable or stationary, including and not limited to a wall-mounted control panel or a keyboard. Further, it is also envisioned that the touch sensor 220 may have a variant of the touch-sensitive surface 225 (see FIGS. 10 a and 10 b ).
  • the touch sensor 220 may be based on any of a variety of technologies.
  • both the racetrack menu 150 and the racetrack surface 250 have a ring shape that is a generally rectangular ring shape with corresponding sets of four sides. More specifically, the four sides 150 a, 150 b, 150 c and 150 d of the racetrack menu 150 are arranged to correspond to the four sides 250 a, 250 b, 250 c and 250 d of the racetrack surface 250 .
  • This four-sided nature of both the racetrack menu 150 and the racetrack surface 250 is meant to accommodate the rectilinear nature of the vast majority of display elements currently found in audio/visual devices and the rectilinear nature of the visual portion of the vast majority of currently existing audio/visual programs that have a visual portion.
  • Although the racetrack menu 150 and the racetrack surface 250 are depicted and discussed herein as having a rectangular ring shape, other embodiments are possible in which the ring shape adopted by the racetrack surface 250 has a circular ring shape, an oval ring shape, a hexagonal ring shape or still other geometric variants of a ring shape. Further, where the racetrack menu 150 and/or the racetrack surface 250 have a ring shape that is other than a rectangular ring shape, one or both of the display element 120 and the touch sensor 220 may have a shape other than the rectangular shapes depicted herein.
  • the four sides 150 a - d of the racetrack menu 150 surround or overlie the edges of a display area 950 in which the visual portion of an audio/visual program selected via the user interface 1000 may be played. It is this positioning of the racetrack menu 150 about the periphery of the display element 120 and the display area 950 (whether surrounding or overlying the periphery of the display area 950 ) that supplies the impetus for both the racetrack menu 150 and the racetrack surface 250 having a ring shape that is generally a rectangular ring shape, rather than a ring shape of some other geometry.
  • the display area 950 may remain blank (e.g., display only a black or blue background color) or display status information concerning the playing of the selected audio/visual program as the selected audio/visual program is played, perhaps with the audio portion being acoustically output by the acoustic drivers 130 .
  • the four sides 150 a - d of the racetrack menu 150 are displayed by the display element 120 at the edges of the display element 120 .
  • the four sides 150 a - d of the racetrack menu 150 may be positioned about the edges of a “window” of a graphical user interface of the type commonly employed in the operation of typical computer systems, perhaps where the audio/visual device 100 is a computer system on which audio/visual programs are selected and played through the user interface 1000 .
  • Positioned along the racetrack menu 150 are menu items 155 that may be selected by a user of the user interface 1000 .
  • the menu items 155 may include alphanumeric characters (such as those depicted as positioned along the side 150 a ) that may be selected to specify a channel or a website from which to select and/or receive an audio/visual program, symbols (such as those depicted as positioned along the side 150 b ) representing commands to control the operation of an audio/visual device capable of playing an audio/visual program (e.g., “play” and “stop” commands for a video cassette recorder, a disc media player, or solid state digital file player, etc.), and indicators of inputs (such as those depicted as positioned along the side 150 c ) to an audio/visual device that may be selected and through which an audio/visual program may be selected and/or received.
  • While the menu items 155 positioned along the racetrack menu 150 could conceivably serve any of a wide variety of purposes, it is envisioned that much of the functionality of the menu items 155 will be related to enabling a user to select an audio/visual program for playing, and/or to actually play an audio/visual program.
  • a user places the tip of a digit of one of their hands (i.e., the tip of a thumb or finger) on a portion of the racetrack surface 250 defined on the touch-sensitive surface 225 of the touch sensor 220 , and a marker 160 is displayed on a portion of the racetrack menu 150 that has a position on the racetrack menu 150 that corresponds to the position 260 on the racetrack surface 250 at which the tip of their digit is in contact with the touch-sensitive surface 225 of the touch sensor 220 .
  • the marker 160 moves about and is constrained to moving about the racetrack menu 150 to maintain a correspondence between its location on the racetrack menu 150 and the position 260 of the digit on the racetrack surface 250 as the user moves that digit about the racetrack surface 250 .
  • the marker 160 may move about the racetrack menu 150 in a manner in which the marker 160 “snaps” from being centered about one menu item 155 to an adjacent menu item 155 as the marker 160 is moved about a portion of the racetrack menu 150 having adjacent ones of the menu items 155 .
  • such “snapping” of the marker 160 between adjacent ones of the menu items 155 may be accompanied by the concurrent acoustic output of some form of sound (e.g., a “click” or “beep” sound that accompanies each “snap” of the marker 160 ) to provide further feedback to a user of the marker 160 moving from one such menu item 155 to another.
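  • The "snapping" of the marker 160 between adjacent menu items 155 might be approximated by quantizing the marker's perimeter position to the nearest menu item whenever it comes close enough. This is an illustrative sketch only; the snap radius and item positions are assumptions, not values from the disclosure.

```python
# Hypothetical sketch of the marker "snapping" between adjacent menu items.
# Positions are perimeter fractions (0.0-1.0) around the ring menu.

def snap_marker(fraction, item_fractions, snap_radius=0.02):
    """Snap the marker to the nearest menu item when it is close enough,
    otherwise leave it at the touched position."""
    nearest = min(item_fractions, key=lambda f: abs(f - fraction))
    return nearest if abs(nearest - fraction) <= snap_radius else fraction
```

Each time the returned position jumps from one item fraction to another, an implementation could emit the "click" or "beep" feedback described above.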
  • the touch sensor 220 is capable of distinguishing different degrees of pressure with which the digit is put into contact with the touch-sensitive surface 225 of the touch sensor 220 on which the racetrack surface 250 is defined in order to distinguish an instance in which the user is pressing harder with that digit to select one of the menu items 155 .
  • the touch sensor 220 is able to function in a manner not unlike a mechanically depressible button in which the additional pressure applied through that digit by the user causes the touch sensor 220 to be pressed inward towards the casing 210 as part of selecting a menu item. This may be accomplished by overlying one or more buttons disposed within the casing 210 with the touch sensor 220 so that such buttons are depressed by the touch sensor 220 as the touch sensor 220 is itself depressed towards the casing 210 .
  • touch sensor 220 is able to be pressed inward towards the casing 210 , such inward movement may be accompanied by a “click” sound that may be heard by the user and/or a tactile “snap” sensation that can be sensed by the user through their digit to give the user some degree of positive feedback that they've successfully selected one of the menu items 155 .
  • a “click” or other sound accompanying the user's use of increased pressure on the racetrack surface 250 to select one of the menu items 155 may be acoustically output through an acoustic driver (not shown) incorporated into the remote control 200 and/or through the acoustic drivers 130 of the audio/visual device 100 .
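  • The pressure-based selection described above can be sketched as a simple threshold test. This is a hypothetical illustration (the threshold value and normalized pressure scale are assumptions, not taken from the patent): a light touch merely moves the marker 160, while pressure beyond the threshold registers as selecting the menu item under the marker:

```python
# Hypothetical pressure-based selection: a light touch moves the marker,
# while pressure beyond a threshold registers as selecting the menu item
# under the marker (the threshold value here is purely illustrative).
SELECT_PRESSURE = 0.75  # normalized pressure, 0.0 (no contact) to 1.0


def classify_touch(pressure: float) -> str:
    """Classify a touch sample as no contact, marker movement, or selection."""
    if pressure <= 0.0:
        return "none"
    if pressure < SELECT_PRESSURE:
        return "move"     # digit resting/sliding: update marker position only
    return "select"       # digit pressing harder: select the current item
```

  A touch sensor that instead overlies mechanical buttons, as described above, would effectively implement the same threshold in hardware.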
  • FIGS. 3 a , 3 b and 3 c depict other variations of forms of marker and combinations of markers.
  • different forms of marker and combinations of multiple markers may be used to enhance the rapidity with which the eyes of a user of the user interface 1000 are drawn to a specific location on the racetrack menu 150 , and to aid the hand-eye coordination of that user.
  • FIG. 3 a depicts another variant of the marker 160 having the form of a triangular pointer. Still other possible graphical representations of the marker 160 will occur to those skilled in the art, such as forms of the marker 160 having other geometric shapes (e.g., a dot, a circle, an arrow, etc.) or other ways of being positioned in the vicinity of a given one of the menu items 155 (e.g., overlying, surrounding, pointing to, touching, etc., one of the menu items 155 ).
  • the marker 160 may instead be a modified form of a given one of the menu items 155 , such as a change in a color of a menu item, an enlargement of a menu item in comparison to others, or some form of recurring animation or movement imparted to a menu item.
  • the position of the marker 160 (and by extension, the position 260 of the tip of a digit on the racetrack surface 250 ) may be indicated by one of the menu items 155 changing color, changing font, becoming larger, becoming brighter, or being visually altered in comparison to the others of the menu items 155 in any of a number of ways to draw a user's eyes to it.
  • FIG. 3 a also depicts an optional additional marker 165 that follows the location of the marker 160 and provides a visual “highlight” of which one of the four sides 150 a - d the marker 160 is currently positioned within as a visual aid to enable a user's eyes to be more quickly directed to that one of the four sides 150 a - d when looking at the racetrack menu 150 .
  • the additional marker 165 may be implemented as a highlighting, change in color, change in background color, change in font, enlargement or other visual alteration made to all of the menu items 155 that are positioned in that one of the four sides 150 a - d.
  • FIG. 3 b depicts the manner in which the marker 160 may be dynamically resized as it is moved about the racetrack menu 150 , especially in embodiments where the marker 160 is of a form that in some way overlaps or surrounds one of the menu items 155 at a time in order to take into account the different sizes of different ones of the menu items 155 . More specifically, and as depicted in FIG. 3 b , the numeral “3” has visibly smaller dimensions (i.e., occupies less space in the racetrack menu 150 ) than does the numeral “III” that is also present on the same racetrack menu 150 .
  • the marker 160 when the depicted form of the marker 160 (i.e., the “box” form of the marker 160 that has been discussed at length) is positioned on one or the other of these two particular ones of the menu items 155 , the marker 160 is resized to be larger or smaller as needed to take into account the different sizes of these two particular ones of the menu items 155 .
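  • The resizing of the "box" form of the marker 160 can be sketched as deriving the box from the bounding rectangle of the menu item it surrounds. This is a hypothetical illustration (the function name, rectangle representation, and padding value are assumptions), so that a narrow numeral "3" gets a smaller box than a wide numeral "III":

```python
# Hypothetical resizing of the "box" marker: the box is derived from the
# bounding rectangle of the menu item it surrounds, plus a fixed padding,
# so a narrow "3" gets a smaller box than a wide "III".
def marker_box(item_bounds: tuple, padding: int = 4) -> tuple:
    """Given an item's (x, y, width, height), return the marker's rectangle."""
    x, y, w, h = item_bounds
    return (x - padding, y - padding, w + 2 * padding, h + 2 * padding)
```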
  • FIG. 3 c also depicts an optional additional marker 162 that follows the location of the marker 160 and provides a more precise visual indication than does the marker 160 of the position 260 of the tip of a user's finger along a corresponding portion of the racetrack surface 250 .
  • the marker 162 takes the form of what might be called a “dash” positioned along one of the edges of the box form of the marker 160 .
  • the marker 162 may take any of a variety of forms (e.g., a dot, a circle, an arrow, etc.).
  • FIG. 3 c depicts a succession of views of a portion of the racetrack menu 150 on which menu items 155 taking the form of the numerals “1” through “5” are positioned.
  • the marker 162 provides a more precise indication of the movement of the position 260 of the tip of the user's digit along a portion of the racetrack surface 250 from left to right than does the marker 160 which remains on the one of the menu items 155 having the form of the numeral “2” on this portion of the racetrack menu 150 .
  • Such a higher precision indication of the position 260 of the tip of the user's digit may aid the user in improving their hand-eye coordination in operating the user interface 1000 .
  • Such a higher precision indication of the position 260 may also provide a user with some degree of reassurance that the user interface 1000 is responding to their actions (or more specifically, whatever processing device is incorporated into the user interface 1000 is responding to their actions) by seeing that the exact position 260 of the tip of their digit is being successfully detected.
  • FIG. 3 d depicts yet another alternate variation of the marker 160 in a variant of the user interface 1000 in which the racetrack menu 150 is divided into multiple segments, with each such segment serving as a background to one of the menu items 155 .
  • the marker 160 is implemented as both a change in the color and/or brightness of one of those segments of the racetrack menu 150 and an enlarging of the graphical element representing the one of the menu items 155 (specifically, the numeral “3”) positioned within that segment.
  • the marker 160 might be said to have a form that is a variant of the earlier-depicted box, but a box that is made visible by having a color and/or brightness that differs from the rest of the racetrack menu 150 , rather than a box that is made visible by a border or outline.
  • FIG. 3 d also depicts this alternate variation of the marker 160 being used in combination with the earlier-described additional marker 162 that provides a more precise indication of the position 260 of the tip of a user's digit along a portion of the racetrack surface 250 .
  • FIG. 3 d also depicts how this variant of the marker 160 is resized to accommodate the different sizes of the different ones of the menu items 155 , although this resizing now corresponds to the differing dimensions of different ones of the segments into which the racetrack menu 150 is divided.
  • each of the segments may be individually sized to fit the visual size and shape of its corresponding one of the menu items 155 , as depicted in FIG. 3 d .
  • the segment of the racetrack menu 150 in which the numeral “3” is positioned is smaller than the segment in which the numeral “III” is positioned.
  • the segments filling at least one of the four sides 150 a - d may all be sized based on the quantity of the menu items 155 positioned in that one of the four sides so as to divide that one of the four sides 150 a - d into equal-sized segments.
  • the size of the segments in that one of the four sides 150 a - d may change in response to a change in quantity of the menu items 155 positioned in that one of the four sides 150 a - d.
  • a reduction in the quantity of menu items 155 in that one of the four sides 150 a - d results in each of its segments becoming larger in at least one dimension
  • an increase in the quantity of menu items 155 in that one of the four sides 150 a - d results in each of its segments becoming smaller.
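  • The equal-division of one side of the racetrack menu 150 into segments can be sketched as a simple division. This is a hypothetical illustration (the function and its units are assumptions): segment length is the side's length divided by the current item count, so removing items makes each segment larger and adding items makes each smaller:

```python
# Hypothetical equal-division of one side of the racetrack menu into
# segments: segment length is the side's length divided by the current
# item count, so removing items makes each remaining segment larger.
def segment_length(side_length: float, item_count: int) -> float:
    """Length of each equal-sized segment along one side of the racetrack."""
    if item_count <= 0:
        raise ValueError("side must hold at least one menu item")
    return side_length / item_count
```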
  • FIG. 4 is a block diagram of a possible architecture of the user interface 1000 by which a controller 500 receives input through a user's use of at least the racetrack surface 250 defined on at least a portion of a touch-sensitive surface 225 of the touch sensor 220 to which the controller 500 is coupled, and provides at least the racetrack menu 150 as a visual output to the user through at least the display element 120 to which the controller 500 is also coupled.
  • the controller 500 may be incorporated directly into the audio/visual device 100 , or into another audio/visual device 900 coupled to the audio/visual device 100 and shown in dotted lines in FIG. 1 .
  • as also depicted in FIG. 4 , the remote control 200 communicates wirelessly through the emission of radio frequency, infrared or other wireless emissions to whichever one of the audio/visual devices 100 and 900 incorporates the controller 500 .
  • the remote control 200 may communicate through an electrically and/or optically conductive cable (not shown) in other possible embodiments.
  • the remote control 200 may communicate through a combination of wireless and cable-based (optical or electrical) connections forming a network between the remote control 200 and the controller 500 .
  • FIG. 5 depicts an alternate variant of the audio/visual device 100 having more of a portable configuration incorporating both the display element 120 displaying the racetrack menu 150 and the touch sensor 220 on a touch-sensitive surface 225 on which the racetrack surface 250 is defined.
  • This alternative variant of the audio/visual device 100 may also incorporate the controller 500 , such that much (if not substantially all) of the user interface 1000 is implemented solely by the audio/visual device 100 .
  • the controller 500 incorporates multiple interfaces in the form of one or more connectors and/or one or more wireless transceivers by which the controller 500 is able to be coupled to one or more sources 901 , 902 , 903 and/or 904 .
  • Any such connectors may be disposed on the casing of whatever audio/visual device the controller 500 is incorporated into (e.g., the casing 110 of the audio/visual device 100 or a casing of the audio/visual device 900 ).
  • the controller 500 is able to transmit commands to one or more of the sources 901 - 904 to access and select audio/visual programs, and is able to receive audio/visual programs therefrom.
  • Each of the sources 901 - 904 may be any of a variety of types of audio/visual device, including and not limited to, RF tuners (e.g., cable television or satellite dish tuners), disc media recorders and/or players, tape media recorders and/or players, solid-state or disk-based digital file players (e.g., a MP3 file player), Internet access devices to access streaming data of audio/visual programs, or docking cradles for portable audio/visual devices (e.g., a digital camera). Further, in some embodiments, one or more of the sources 901 - 904 may be incorporated into the same audio/visual device into which the controller 500 is incorporated (e.g., a built-in disc media player or built-in radio frequency tuner).
  • any of a variety of types of electrical and/or optical signaling conveyed via electrically and/or optically conductive cabling may be employed.
  • a single cable is employed both in relaying commands from the controller 500 to that one of the sources 901 - 904 and in relaying audio/visual programs to the controller 500 .
  • combinations of cabling in which different cables separately perform these functions are also possible.
  • Some of the possible forms of cabling able to relay both commands and audio/visual programs may conform to one or more industry standards, including and not limited to, Syndicat des Constructeurs d'Appareils Radiorecepteurs et Televiseurs (SCART) promulgated in the U.S. by the Electronic Industries Alliance (EIA) of Arlington, Va.; Ethernet (IEEE-802.3) or IEEE-1394 promulgated by the Institute of Electrical and Electronics Engineers (IEEE) of Washington, D.C.; Universal Serial Bus (USB) promulgated by the USB Implementers Forum, Inc.; Digital Visual Interface (DVI) promulgated by the Digital Display Working Group (DDWG); High-Definition Multimedia Interface (HDMI); or standards maintained by the Video Electronics Standards Association (VESA).
  • cabling able to relay only one or the other of commands and audio/visual programs may conform to one or more industry standards, including and not limited to, RS-422 or RS-232-C promulgated by the EIA; Video Graphics Array (VGA) maintained by VESA; RC-5720C (more commonly called “Toslink”) maintained by the Japan Electronics and Information Technology Industries Association (JEITA) of Tokyo, Japan; the widely known and used Separate Video (S-Video); or S-Link maintained by Sony Corporation of Tokyo, Japan.
  • any of a variety of types of infrared, radio frequency or other wireless signaling may be employed.
  • a single wireless point-to-point coupling is employed both in relaying commands from the controller 500 to that one of the sources 901 - 904 and in relaying audio/visual programs to the controller 500 .
  • combinations of separate wireless couplings in which these functions are separately performed are also possible.
  • Some of the possible forms of wireless signaling able to relay both commands and audio/visual programs may conform to one or more industry standards, including and not limited to, IEEE 802.11a, 802.11b or 802.11g promulgated by the IEEE; Bluetooth promulgated by the Bluetooth Special Interest Group of Bellevue, Wash.; or ZigBee promulgated by the ZigBee Alliance of San Ramon, Calif.
  • a combination of cabling-based and wireless couplings may be used.
  • An example of such a combination may be the use of a cabling-based coupling to enable the controller 500 to receive an audio/visual program from that one of the sources 901 - 904 , while an infrared transmitter coupled to the controller 500 may be positioned at or near the one of the sources 901 - 904 to wirelessly transmit commands via infrared to that one of the sources 901 - 904 .
  • although FIG. 4 depicts each of the sources 901 - 904 as being directly coupled to the controller 500 in a point-to-point manner, those skilled in the art will readily recognize that one or more of the sources 901 - 904 may be coupled to the controller 500 indirectly through one or more of the others of the sources 901 - 904 , or through a network formed among the sources 901 - 904 (and possibly incorporating routers, bridges and other relaying devices that will be familiar to those skilled in the art) with multiple cabling-based and/or wireless couplings.
  • Some of the above-listed industry standards include specifications of commands that may be transmitted between audio/visual devices to control access to and/or control the playing of audio/visual programs, including most notably, SCART, IEEE-1394, USB, HDMI, and Bluetooth.
  • the controller 500 may limit the commands transmitted to one or more of the sources 901 - 904 to the commands specified by that industry standard and map one or more of those commands to corresponding ones of the menu items 155 such that a user is able to cause the controller 500 to send those commands to one or more of the sources 901 - 904 by selecting those corresponding ones of the menu items 155 .
  • the controller 500 may employ any of a wide variety of approaches to identify one or more of the sources 901 - 904 to an extent necessary to “learn” what commands are appropriate to transmit and the manner in which they must be transmitted.
  • a user of the user interface 1000 may select one of the sources 901 - 904 as part of selecting an audio/visual program for being played by employing the racetrack surface 250 and the marker 160 to select one or more of the menu items 155 shown on the racetrack menu 150 , such as the “I” through “IV” menu items 155 depicted as displayed by the controller 500 on the side 150 c of the racetrack menu 150 .
  • Those menu items 155 depicted on the side 150 c correspond to the sources 901 through 904 , which are depicted as bearing the labels “source I” through “source IV” in FIG. 4 .
  • the controller 500 receives input from the touch sensor 220 indicating the contact of the user's digit with a portion of the racetrack surface 250 , indicating movement of the position 260 of contact of the digit about the racetrack surface 250 , and indicating the application of greater pressure by the user through that digit against the touch sensor 220 at the position 260 (wherever the position 260 is at that moment) when selecting one of the menu items 155 .
  • the selection of one of the sources 901 - 904 by the user causes the controller 500 to switch to receiving audio/visual programs from that one of the sources 901 - 904 , and to be ready to display any visual portion in the display area 950 and acoustically output any audio portion through the acoustic drivers 130 (or whatever other acoustic drivers may be present and employed for playing audio/visual programs).
  • the selection of one of the sources 901 - 904 may further cause the controller 500 to alter the quantity and types of menu items 155 displayed on one or more of the sides 150 a - d of the racetrack menu 150 such that the displayed menu items 155 more closely correspond to the functions supported by whichever one of the sources 901 - 904 that has been selected.
  • This changing display of at least a subset of the menu items 155 enables the user to operate at least some functions of a selected one of the sources 901 - 904 by selecting one or more of the menu items 155 to thereby cause the controller 500 to transmit one or more commands corresponding to those menu items to the selected one of the sources 901 - 904 .
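  • The "modal" rebuild of the racetrack menu described above can be sketched as combining a base set of always-displayed items with the subset of commands supported by the selected source. This is a hypothetical illustration only; the item names, command sets, and function are assumptions, not drawn from the patent:

```python
# Hypothetical modal menu rebuild: a base set of always-displayed items is
# combined with the subset of commands supported by the selected source,
# mirroring how the controller alters the racetrack menu per source.
ALWAYS_SHOWN = ["source I", "source II", "source III", "source IV"]

SOURCE_COMMANDS = {                 # illustrative per-source command sets
    "source I": ["play", "pause", "record"],   # e.g., a recorder-capable source
    "source II": ["play", "pause"],            # e.g., a playback-only source
}


def build_menu_items(selected_source: str) -> list:
    """Return the menu items to display for the currently selected source."""
    return ALWAYS_SHOWN + SOURCE_COMMANDS.get(selected_source, [])
```

  Selecting a playback-only source thus removes the "record" item, matching the behavior described above.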
  • the racetrack menu 150 may include one or more menu items 155 that could be selected to cause the controller 500 to transmit a command to that previously selected one of the sources 901 - 904 to cause it to start recording an audio/visual program.
  • the controller 500 would alter the menu items 155 displayed on the racetrack menu 150 to remove one or more menu items associated with recording an audio/visual program.
  • at least a subset of the menu items 155 displayed on the racetrack menu 150 are “modal” in nature, insofar as at least that subset changes with the selection of different ones of the sources 901 - 904 .
  • the coupling and/or uncoupling of one or more of the sources 901 - 904 to and/or from whatever audio/visual device into which the controller 500 is incorporated may also cause the controller 500 to alter the quantity and/or types of menu items 155 that are displayed in another example of at least a subset of the menu items 155 being modal in nature.
  • the uncoupling of one of the sources 901 - 904 where that one of the sources 901 - 904 had been coupled through cabling may cause the controller 500 to remove the one of the menu items 155 by which that now uncoupled one of the sources 901 - 904 could be selected.
  • the controller 500 may respond to such an uncoupling by autonomously selecting one of the other of the sources 901 - 904 and altering the subset of the menu items 155 to correspond to the functions able to be performed by that newly selected one of the sources 901 - 904 .
  • the uncoupling of one of the sources 901 - 904 where that one of the sources 901 - 904 had been wirelessly coupled may or may not cause the controller 500 to remove the one of the menu items 155 by which that now uncoupled one of the sources 901 - 904 could be selected.
  • the uncoupling may well result in an alteration or removal of at least some of the menu items 155 displayed on the racetrack menu 150 .
  • the controller 500 may be caused to automatically select that now coupled one of the sources 901 - 904 . This may be done based on an assumption that the user has coupled that source to whatever audio/visual device into which the controller 500 is incorporated with the intention of immediately playing an audio/visual program from it.
  • menu items 155 may be modal in nature such that they are apt to change depending on the selection and/or condition of one or more of the sources 901 - 904
  • others of the menu items 155 may not be modal in nature such that they are always displayed whenever the racetrack menu 150 is displayed. More specifically, where one or more of the sources 901 - 904 are incorporated into the same audio/visual device as the controller 500 , the ones of the menu items 155 associated with those sources may remain displayed in the racetrack menu 150 , regardless of the occurrences of many possible events that may cause other menu items 155 having a modal nature to be displayed, to not be displayed, or to be displayed in some altered form.
  • a subset of the menu items 155 associated with selecting a radio frequency channel may be a subset of the menu items 155 that is always displayed in the racetrack menu 150 . It may be that the selection of any menu item of such a subset of the menu items 155 may cause the controller 500 to automatically switch the selection of a source of audio/visual programs to the source associated with those menu items 155 .
  • an audio/visual device incorporates a radio frequency tuner and menu items 155 associated with selecting a radio frequency channel are always displayed
  • the selection of any one of those menu items would cause the controller 500 to automatically switch to that radio frequency tuner as the source from which to receive an audio/visual program if that tuner were not already selected as the source.
  • one or more of the menu items 155 associated with selecting a source of audio/visual programs may be menu items that are always displayed in the racetrack menu 150 .
  • the racetrack menu 150 has a rectilinear configuration with the four sides 150 a - d that are configured to surround or overlie edges of the display area 950 .
  • the racetrack menu 150 is not always displayed such that what is shown on the display element 120 of the audio/visual device 100 could be either the display area 950 surrounded by the racetrack menu 150 , or the display area 950 expanded to fill the area otherwise occupied by the racetrack menu 150 .
  • the controller 500 may provide the display element 120 with an image that includes substantially nothing else but the display area 950 such that a visual portion of an audio visual program is substantially the only thing shown on the display element 120 .
  • the controller 500 then provides the display element 120 with an image that includes a combination of the display area 950 and the racetrack menu 150 .
  • the controller 500 reduces the size of the display area 950 to make room around the edges of the display area 950 for the display of the racetrack menu 150 on the display element 120 , and in so doing, may rescale the visual portion (if there is one) of whatever audio/visual program may be playing at that time.
  • the display area 950 is not resized, and instead, the racetrack menu 150 is displayed in a manner in which the racetrack menu 150 overlies edge portions of the display area 950 such that edge portions of any visual portion of an audio/visual program are no longer visible.
  • the racetrack menu 150 may be displayed in a manner in which at least some portions of the racetrack menu have a somewhat “transparent” quality in which the overlain edge portions of any visual portion of an audio/visual program can still be seen by the user “looking through” the racetrack menu 150 .
  • this “transparent” quality may be achieved through any of a number of possible approaches to combining the pixels of the image of the racetrack menu 150 with pixels of the overlain portion of any visual portion of an audio/visual program (e.g., by averaging pixel color values, alternately interspersing pixels, or bit-wise binary combining of pixels with a pixel mask).
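  • The pixel-averaging approach mentioned above can be sketched as a per-pixel weighted average. This is a hypothetical illustration (the function name and the alpha value are assumptions); each displayed pixel is a blend of the menu pixel and the underlying video pixel:

```python
# Hypothetical pixel-averaging approach to the "transparent" racetrack menu:
# each displayed pixel is a weighted average of the menu pixel and the
# underlying video pixel (the alpha value chosen here is illustrative).
def blend_pixel(menu_rgb: tuple, video_rgb: tuple, alpha: float = 0.5) -> tuple:
    """Blend a menu pixel over a video pixel; alpha=1.0 is fully opaque menu."""
    return tuple(int(alpha * m + (1.0 - alpha) * v)
                 for m, v in zip(menu_rgb, video_rgb))
```

  With alpha = 0.5 this reduces to the straight averaging of pixel color values mentioned above; other alphas give other degrees of "transparency".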
  • the controller 500 may also combine audio associated with operation of the user interface 1000 with an audio portion (if present) of an audio/visual program being played. More specifically, “click” sounds associated with the user pressing the racetrack surface 250 defined on a surface of the touch sensor 220 with greater pressure and/or with the “snapping” of the marker 160 between adjacent ones of the menu items 155 may be combined with whatever audio portion is acoustically output as part of the playing of an audio/visual program.
  • the controller 500 may do more than simply cause the racetrack menu 150 to be displayed in response to a user touching a portion of the racetrack surface 250 . More specifically, in addition to causing the racetrack menu 150 to be displayed, the controller 500 may take particular actions in response to particular ones of the sides 250 a - d of the racetrack surface 250 being touched by a user at a time when the racetrack menu 150 is not being displayed.
  • the detection of a touch to the side 250 d may cause a command to be sent to one of the sources 901 - 904 to provide an on-screen guide concerning audio/visual programs able to be provided by that source, where such a guide would be displayed in the display area 950 , with edges of the display area 950 being either surrounded or overlain by the racetrack menu 150 as has been previously described.
  • causing the racetrack menu 150 to be displayed requires both a touch and some minimum degree of movement of the tip of a user's digit on the racetrack surface 250 (i.e., a kind of “touch-and-drag” or “wiping” motion across a portion of the racetrack surface 250 ), while other particular actions are taken in response to there being only a touch of a tip of a user's digit on particular ones of the sides 250 a - d of the racetrack surface 250 .
  • touching the side 250 a may cause a command to be sent to a source to turn that source on or off, and touching the side 250 b may cause an audio portion of an audio/visual program to be muted, while both touching and moving a digit across a portion of the racetrack surface 250 in a “wiping” motion is required to enable the display and use of the racetrack menu 150 .
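  • The distinction between a stationary tap and a "wiping" touch-and-drag can be sketched as a movement-distance test. This is a hypothetical illustration (the threshold, the side labels, and the action mapping are assumptions): only movement beyond a minimum distance enables the racetrack menu, while a mere tap on a side triggers that side's action:

```python
# Hypothetical discrimination between a stationary tap and a "wiping"
# touch-and-drag: only movement beyond a minimum distance enables the
# racetrack menu, while a mere tap on a side triggers that side's action.
MIN_DRAG = 10.0  # illustrative movement threshold, in touch-surface units


def classify_gesture(start: tuple, end: tuple, side_actions: dict,
                     touched_side: str) -> str:
    """Return 'show_menu' for a drag, else the tapped side's action."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if (dx * dx + dy * dy) ** 0.5 >= MIN_DRAG:
        return "show_menu"
    return side_actions.get(touched_side, "none")


# Illustrative per-side actions, matching the examples in the text above.
actions = {"250a": "power", "250b": "mute"}
```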
  • FIGS. 7 a and 7 b depict additional features that may be incorporated into the user interface 1000 .
  • a selected one of the sources 901 - 904 displays its own on-screen menu 170 (e.g., a guide concerning audio/visual programs available from that source), either in place of a visual portion of an audio/visual program or overlying a visual portion of an audio/visual program
  • some embodiments of the user interface 1000 may be augmented to support at least partly integrating the manner in which a user would navigate such an on-screen menu 170 into the user interface 1000 .
  • the touch sensor 220 may be configured to surround a set of controls for use in navigating the on-screen menu 170 , just as the racetrack menu 150 surrounds the on-screen menu 170 itself.
  • FIG. 7 b depicts the manner in which the touch sensor 220 disposed on the casing 210 of the remote control 200 of FIG. 1 may surround navigation buttons 270 a, 270 b, 270 c and 270 d, as well as a selection button 280 , that are also disposed on the casing 210 .
  • other manually operable controls may be surrounded by the touch sensor 220 , in addition to or in place of the navigation buttons 270 a - d and the selection button 280 , including and not limited to, a joystick, or a four-way rocker switch that may either surround a selection button (such as the selection button 280 ) or be useable as a selection button by being pressed in the middle.
  • a nested arrangement of concentrically located manually operable controls is created.
  • FIG. 7 a depicts a form of possible on-screen menu that will be familiar to those skilled in the art, including various menu items 175 that may be selected via the selection button 280 , and a marker 180 that may be moved by a user among the menu items 175 via the navigation buttons 270 a - d.
  • the concentrically nested arrangement of manually operable controls surrounded by the racetrack surface 250 defined on the touch-sensitive surface 225 of the touch sensor 220 that is disposed on the casing 210 of the remote control 200 corresponds to the similarly nested arrangement of the on-screen menu 170 surrounded by the racetrack menu 150 that is displayed on the display element 120 .
  • FIG. 7 b also depicts additional controls 222 , 225 , 226 and 228 that may be employed to perform particular functions where it may be deemed desirable to provide at least some degree of functionality in a manner that does not require the selection of menu items to operate.
  • the controls 222 , 225 , 226 and 228 are operable as a power button, a mute button, a volume rocker switch and a channel increment/decrement rocker switch, respectively.
  • FIG. 8 depicts a variant of the handheld form of the audio/visual device 100 depicted in FIG. 5 .
  • FIG. 9 is a block diagram of a possible architecture of the controller 500 in which the controller 500 incorporates an output interface 510 , a sensor interface 520 , a storage 540 , a processing device 550 and a source interface 590 .
  • the processing device 550 is coupled to each of the output interface 510 , the sensor interface 520 , the storage 540 and the source interface 590 to at least coordinate the operation of each to perform at least the above-described functions of the controller 500 .
  • the processing device 550 may be any of a variety of types of processing device based on any of a variety of technologies, including and not limited to, a general purpose central processing unit (CPU), a digital signal processor (DSP), a microcontroller, or a sequencer.
  • the storage 540 may be based on any of a variety of data storage technologies, including and not limited to, any of a wide variety of types of volatile and nonvolatile solid-state memory, magnetic media storage, and/or optical media storage. It should be noted that although the storage 540 is depicted in a manner that is suggestive of it being a single storage device, the storage 540 may be made up of multiple storage devices, each of which may be based on different technologies.
  • Each of the output interface 510 , the sensor interface 520 and the source interface 590 may employ any of a variety of technologies to enable the controller 500 to communicate with other devices and/or other components of whatever audio/visual device into which the controller 500 is incorporated. More specifically, where the controller 500 is incorporated into an audio/visual device that also incorporates one or both of a display element (such as the display element 120 ) and at least one acoustic driver (such as the acoustic drivers 130 ), the output interface 510 may be of a type able to directly drive a display element with signals causing the display of the racetrack menu 150 and the display area 950 to display visual portions of audio/visual programs, and/or able to directly drive one or more acoustic drivers to acoustically output audio portions of audio/visual programs.
  • Alternatively, the output interface 510 may be of a type employing cabling-based and/or wireless signaling (perhaps signaling conforming to one of the previously listed industry standards) to transmit a signal to another audio/visual device into which a display element and/or acoustic drivers are incorporated (e.g., the audio/visual device 100 ).
  • the sensor interface 520 may be of a type able to directly receive electrical signals emanating from the touch sensor 220 . With such a more direct coupling, the sensor interface 520 may directly monitor a two-dimensional array of touch-sensitive points of the touch-sensitive surface 225 of the touch sensor 220 for indications of which touch-sensitive points are being touched by a tip of a user's digit, and thereby enable the processing device 550 to employ those indications to directly determine where the touch-sensitive surface 225 is being touched.
  • Where the controller 500 is incorporated into a device into which the touch sensor 220 is not also incorporated (e.g., the controller 500 is incorporated into the audio/visual device 100 and the touch sensor is incorporated into the remote control 200 ), the sensor interface 520 may be of a type able to receive cabling-based and/or wireless signaling transmitted by that other device (e.g., infrared signals emitted by the remote control 200 ).
  • circuitry that is co-located with the touch sensor 220 may perform the task of directly monitoring a two-dimensional array of touch-sensitive points of the touch-sensitive surface 225 , and then transmit indications of which touch-sensitive points are being touched by the tip of a user's digit to the sensor interface 520 .
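The direct-monitoring path described above can be sketched in Python. This is a minimal illustration under assumed names (the patent specifies no data structures): the indications received from the touch sensor 220 are modeled as a 2-D boolean grid of touch-sensitive points, and the touched position is taken to be the centroid of the touched points.

```python
def touch_position(grid):
    """Return the (row, col) centroid of touched points in a 2-D boolean
    grid of touch-sensitive points, or None if nothing is touched.

    `grid` is a hypothetical stand-in for the array of indications the
    sensor interface 520 receives from the touch sensor 220.
    """
    touched = [(r, c)
               for r, row in enumerate(grid)
               for c, val in enumerate(row) if val]
    if not touched:
        return None
    # A fingertip typically covers several adjacent points; the centroid
    # gives a single position for the processing device 550 to act on.
    rows = sum(r for r, _ in touched) / len(touched)
    cols = sum(c for _, c in touched) / len(touched)
    return (rows, cols)
```

Whether this reduction happens in the sensor interface itself or in circuitry co-located with the touch sensor is an implementation choice, as the surrounding passage notes.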
  • Although the audio/visual device into which the controller 500 is incorporated may not incorporate any sources (such as the sources 901 - 904 ) from which the controller 500 receives audio/visual programs, it is deemed more likely that the audio/visual device into which the controller 500 is incorporated will incorporate one or more of such sources in addition to being capable of receiving audio/visual programs from sources not incorporated into the same audio/visual device.
  • the controller 500 may be incorporated into an audio/visual device into which a radio frequency tuner and/or an Internet access device is also incorporated to enable access to audio/visual programs for selection and playing without the attachment of another audio/visual device, while also having the capability of being coupled to another audio/visual device to receive still other audio/visual programs.
  • the controller 500 may well be incorporated into an audio/visual device that is at least akin to a television, whether portable (e.g., as depicted in FIG. 5 ) or stationary (e.g., as depicted in FIG. 1 ). Therefore, although the source interface 590 may have any of a number of configurations to couple the controller 500 to any of a number of possible sources, it is envisioned that the source interface 590 will be configured to enable the controller 500 to be coupled to at least one source that is also incorporated into the same audio/visual device into which the controller 500 is incorporated, and to also enable the controller 500 to be coupled to at least one source that is not incorporated into the same audio/visual device.
  • the source interface 590 incorporates one or more of an electrical interface 595 , an optical interface 596 , a radio frequency transceiver 598 and/or an infrared receiver 599 .
  • the electrical interface 595 (if present) enables the source interface 590 to couple the controller 500 to at least one source, whether incorporated into the same audio/visual device as the controller 500 , or not, to receive electrical signals (e.g., Ethernet, S-Video, USB, HDMI, etc.) conveying an audio/visual program to the controller 500 .
  • the optical interface 596 (if present) enables the source interface 590 to couple the controller 500 to at least one source to receive optical signals (e.g., Toslink) conveying an audio/visual program to the controller 500 .
  • the radio frequency transceiver 598 (if present) enables the source interface 590 to wirelessly couple the controller 500 to at least one other audio/visual device functioning as a source to receive radio frequency signals (e.g., Bluetooth, a variant of IEEE 802.11, ZigBee, etc.) conveying an audio/visual program to the controller 500 from that other audio/visual device.
  • the infrared receiver 599 (if present) enables the source interface 590 to wirelessly couple the controller 500 to at least one other audio/visual device functioning as a source to receive infrared signals conveying an audio/visual program to the controller 500 from that other source.
  • Although the output interface 510 and the sensor interface 520 are depicted as separate from the source interface 590 , it may be deemed advantageous, depending on the nature of the signaling supported, to combine one or both of the output interface 510 and the sensor interface 520 with the source interface 590 .
  • Stored within the storage 540 are one or more of a control routine 450 , a protocols data 492 , a commands data 493 , an audio/visual data 495 , a rescaled audio/visual data 496 , and menu data 498 .
  • a sequence of instructions of the control routine 450 causes the processing device 550 to coordinate the monitoring of the touch sensor 220 for user input, the output of the racetrack menu 150 to a display element (e.g., the display element 120 ), the selection of a source of an audio/visual program to be played, and one or both of the display of a visual portion of an audio/visual program on a display element on which the racetrack menu 150 is also displayed and the acoustic output of an audio portion of the audio/visual program via one or more acoustic drivers (e.g., the acoustic drivers 130 ).
  • the control routine 450 causes the processing device 550 to operate the sensor interface 520 to await indications of a user placing a tip of a digit in contact with a portion of the racetrack surface 250 defined on a surface of the touch sensor 220 , moving that digit about the racetrack surface 250 and/or applying greater pressure at the position 260 on the racetrack surface 250 to make a selection.
  • the processing device 550 may be caused to operate the output interface 510 to display the racetrack menu 150 with one or more of the menu items 155 positioned thereon and surrounding the display area 950 via a display element, if the racetrack menu 150 is not already being displayed.
  • the processing device 550 is further caused to display and position at least the marker 160 on the racetrack menu 150 in a manner that corresponds to the position 260 of the user's digit on the racetrack surface 250 . Further, in response to the passage of a predetermined period of time without receiving indications of activity by the user involving the racetrack surface 250 , the processing device 550 may be caused to operate the output interface 510 to cease displaying the racetrack menu 150 , and to display substantially little else on a display element than the display area 950 .
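The show/track/hide behavior described in the preceding bullets can be sketched as a small state machine. This is a hedged illustration: the class, its method names, and the five-second timeout are assumptions (the patent says only "a predetermined period of time"), and positions along the racetrack are represented as a fraction from 0.0 to 1.0.

```python
import time

MENU_TIMEOUT_S = 5.0  # assumed value; the patent specifies only "a predetermined period"

class RacetrackMenuState:
    """Minimal sketch of the show/track/hide behavior of the racetrack menu 150."""

    def __init__(self, timeout=MENU_TIMEOUT_S, clock=time.monotonic):
        self.timeout = timeout
        self.clock = clock
        self.visible = False
        self.marker_pos = None   # position of the marker 160, 0.0..1.0 around the ring
        self._last_activity = None

    def on_touch(self, pos):
        # Any touch on the racetrack surface 250 shows the menu (if hidden)
        # and moves the marker 160 to the corresponding position.
        self.visible = True
        self.marker_pos = pos
        self._last_activity = self.clock()

    def tick(self):
        # After a period of inactivity, cease displaying the racetrack menu,
        # leaving substantially only the display area 950 shown.
        if self.visible and self.clock() - self._last_activity >= self.timeout:
            self.visible = False
```

Injecting the clock as a parameter keeps the timeout logic testable without waiting in real time.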
  • The control routine 450 causes the processing device 550 to operate the sensor interface 520 to await an indication of a selection of a menu item 155 that corresponds to selecting a source from which the user may wish an audio/visual program to be provided for playing, and may operate the source interface 590 to at least enable receipt of an audio/visual program from that selected source.
  • the processing device 550 may be further caused to buffer audio and/or visual portions of the audio/visual program in the storage 540 as the audio/visual data 495 .
  • the processing device 550 may be further caused to buffer the rescaled form of the visual portion in the storage 540 as the rescaled audio/visual data 496 .
  • The control routine 450 causes the processing device 550 to operate the sensor interface 520 to await an indication of a selection of a menu item 155 corresponding to the selection of a command (e.g., “play” or “record” commands, numerals or other symbols specifying a radio frequency channel to tune, etc.) to be transmitted to an audio/visual device serving as a source, and may operate the source interface 590 to transmit a command to that audio/visual device (e.g., one of sources 901 - 904 ) that corresponds to a menu item 155 that has been selected.
  • the processing device 550 may be further caused to refer to the protocols data 492 for data concerning sequences of signals that must be transmitted by the source interface 590 as part of a communications protocol in preparation for transmitting the command, and/or the processing device 550 may be further caused to refer to the commands data 493 for data concerning the sequence of signals that must be transmitted by the source interface 590 as part of transmitting the command.
  • some of the earlier listed forms of coupling make use of various protocols to organize various aspects of commands and/or data that are conveyed, including and not limited to, Ethernet, Bluetooth, IEEE-1394, USB, etc.
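One plausible way to model the lookups into the protocols data 492 and the commands data 493 is as two tables keyed by protocol and by (protocol, command): the full transmission is the protocol's preparatory signal sequence followed by the command's own signal sequence. Everything below — the table contents, the signal names, the opcode strings — is an illustrative placeholder, not taken from the patent.

```python
# Hypothetical contents of the protocols data 492: per-protocol signal
# sequences transmitted in preparation for a command.
PROTOCOLS_DATA = {
    "hdmi-cec": ["start-bit", "header-block"],
}

# Hypothetical contents of the commands data 493: per-command signal
# sequences, keyed by (protocol, command name).
COMMANDS_DATA = {
    ("hdmi-cec", "play"):   ["opcode-play"],
    ("hdmi-cec", "record"): ["opcode-record"],
}

def signals_for_command(protocol, command):
    """Assemble the full sequence the source interface 590 would transmit:
    the protocol preamble followed by the command's signals.
    Raises KeyError for an unknown protocol/command pair."""
    return PROTOCOLS_DATA[protocol] + COMMANDS_DATA[(protocol, command)]
```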
  • the processing device 550 is further caused to store data correlating at least some of the various menu items with actions to be taken by the processing device 550 in response to their selection by the user in the storage 540 as the menu data 498 .
  • the processing device 550 may be caused to operate the output interface 510 to alter the quantity and/or type of menu items 155 that are displayed at various positions on the racetrack menu 150 . In so doing, the processing device 550 may be further caused to store information concerning the size, shape, color and other characteristics of the racetrack menu 150 , at least some of the graphical representations of the menu items 155 , and/or at least one graphical representation of the marker 160 in the storage 540 as part of the menu data 498 .
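The menu data 498, which the passage above describes as correlating menu items 155 with actions taken on their selection, might be modeled as a simple mapping. The item names and action names below are hypothetical illustrations.

```python
# Illustrative sketch of the menu data 498: each menu item 155 is
# correlated with the action the processing device 550 takes when the
# user selects it. Names are placeholders, not from the patent.
MENU_DATA = {
    "play":         {"action": "send_command", "command": "play"},
    "record":       {"action": "send_command", "command": "record"},
    "select-input": {"action": "select_source"},
}

def action_for_selection(item):
    """Look up the action correlated with a selected menu item 155."""
    return MENU_DATA[item]["action"]
```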
  • FIGS. 10 a and 10 b depict and contrast two variants of the touch sensor 220 . Both variants are depicted in perspective as distinct touch-sensitive devices that are typically mounted within a recess of a casing of a device, including either the casing 110 of any variant of the audio/visual device 100 or the casing 210 of any variant of the remote control 200 .
  • Other touch-sensitive device technologies may yield variants of the touch sensor 220 that are film-like overlays that may be positioned to overlie a portion of a casing or of a circuitboard of a device. The discussion that follows is centered more on the shape and utilization of the touch-sensitive surface 225 of the touch sensor 220 , and not on the touch-sensitive technology employed.
  • FIG. 10 a depicts the variant of the touch sensor 220 having the ring shape that has been discussed above at length that permits other manually-operable controls (e.g., the navigation buttons 270 a - d and the selection button 280 ) to be positioned in a manner in which they are surrounded by the ring shape of the touch sensor 220 .
  • the ring shape of this variant of the touch sensor 220 provides a form of the touch-sensitive surface 225 that is bounded by the ring shape of the touch sensor 220 , and this in turn defines the ring shape of the racetrack surface 250 (where the racetrack surface 250 is defined on the touch-sensitive surface 225 to encompass substantially all of the touch-sensitive surface 225 ).
  • this variant of the touch sensor 220 is depicted as having a rectangular ring shape having four sides, other embodiments are possible in which the touch sensor 220 has a ring shape of a different geometry, such as a circular ring shape, an oval ring shape, a hexagonal ring shape, etc.
  • FIG. 10 b depicts an alternate variant of the touch sensor 220 having a rectangular shape that provides a continuous form of the touch-sensitive surface 225 that is bounded by this rectangular shape (i.e., there is no “hole” formed through the touch-sensitive surface 225 ).
  • This rectangular shape more easily enables more than the ring shape of the racetrack surface 250 to be defined on the touch-sensitive surface 225 in a manner in which the racetrack surface 250 encompasses only a portion of the touch-sensitive surface 225 and leaves open the possibility of one or more other surfaces that serve other functions also being defined thereon.
  • the ring shape of the racetrack surface 250 may be defined by a processing device executing a sequence of instructions of a routine, such as the processing device 550 executing the control routine 450 in FIG. 9 .
  • the location of the racetrack surface 250 may be defined by a processing device first being provided with indications of which touch-sensitive points of an array of touch-sensitive points making up the touch-sensitive surface 225 are being touched by a tip of a user's digit, and second treating some of those touch-sensitive points as belonging to the racetrack surface 250 and others of those touch-sensitive points as belonging to other surfaces that are defined on the touch-sensitive surface 225 (and which serve other functions).
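That point-classification step can be sketched as follows, assuming a rectangular touch-sensitive surface addressed in integer coordinates: points within a fixed band of any edge are treated as belonging to the peripheral racetrack surface 250, and interior points as belonging to other surfaces defined on the touch-sensitive surface 225. The `band` parameter and the function name are assumptions for illustration.

```python
def classify_point(x, y, width, height, band):
    """Classify a touch-sensitive point of the touch-sensitive surface 225:
    points within `band` of any edge belong to the peripheral racetrack
    surface 250; interior points belong to other defined surfaces.
    All names and parameters are illustrative."""
    if not (0 <= x < width and 0 <= y < height):
        raise ValueError("point outside the touch-sensitive surface")
    # The racetrack is the ring-shaped band around the periphery.
    on_band = (x < band or y < band or
               x >= width - band or y >= height - band)
    return "racetrack" if on_band else "inner"
```

Because the classification is done in software, the same physical surface can host a racetrack surface plus navigation and selection surfaces, as the following bullets describe.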
  • one or more ridges 227 and/or grooves may be formed in the touch-sensitive surface 225 to at least provide a tactile guide as to where the racetrack surface 250 is defined on the touch-sensitive surface 225 .
  • Such ridges 227 may be formed integrally with the touch-sensitive surface 225 , may be formed as part of a casing on which the touch sensor 220 is disposed, or may be adhered to the touch-sensitive surface 225 .
  • ridges 227 and/or grooves may coincide with locations on the touch-sensitive surface 225 at which the touch sensor 220 is incapable of detecting the touch of a tip of a digit (i.e., the touch-sensitive surface 225 may be made up of multiple separate touch-sensitive portions, of which one is a portion having a ring shape where the racetrack surface 250 is defined).
  • the racetrack surface 250 is defined on the touch-sensitive surface 225 so as to be positioned about the periphery of the touch-sensitive surface 225 such that the ring shape of the racetrack surface 250 surrounds the remainder of the touch-sensitive surface 225 .
  • at least a portion of the touch-sensitive surface 225 that is surrounded by the racetrack surface 250 may be employed to provide the equivalent function of other manually-operable controls, such as the navigation buttons 270 a - d and the selection button 280 .
  • the navigation buttons 270 a - d and the selection button 280 may be implemented as navigation surfaces and a selection surface, respectively, defined on the touch-sensitive surface of the touch sensor 220 (perhaps by a processing device executing a sequence of instructions), along with the racetrack surface 250 .
  • Although both of the variants of the touch sensor 220 have been depicted in FIGS. 10 a and 10 b as having rectangular shapes with right angle corners, either variant may alternatively have rounded corners. Indeed, where such a variant of the touch sensor 220 has one or more of the ridges 227 and/or grooves (not shown), such ones of the ridges 227 and/or grooves may also have rounded corners, despite being depicted as having right angle corners in FIGS. 10 a and 10 b.
  • FIGS. 11 a and 11 b , taken together, depict two variants of the user interface 1000 in which more than one display area is defined within the portion of the display element 120 that is surrounded by the racetrack menu 150 .
  • These variants enable more than one visual portion of one or more selected audio/visual programs to be played on the display element 120 in a manner that enables a user to view them simultaneously.
  • Also depicted is the manner in which various ones of the menu items 155 associated with only one of the display areas may be positioned along the racetrack menu 150 to provide a visual indication of their association with that one of the display areas.
  • FIG. 11 a depicts a configuration that is commonly referred to as “picture-in-picture” in which a display area 970 having smaller dimensions than the display area 950 is positioned within and overlies a portion of the display area 950 .
  • ones of the menu items 155 that are associated with the visual portion displayed in the display area 970 are positioned along portions of the racetrack menu 150 that are located closer to the display area 970 (specifically, portions of the sides 150 b and 150 d ) to provide a visual indication to the user of that one association.
  • ones of the menu items 155 that are associated with the visual portion displayed in the display area 950 are positioned along portions of the racetrack menu 150 that are further from the display area 970 (specifically, the sides 150 a and 150 c ) to provide a visual indication to the user of that other association.
  • the ones of the menu items 155 that are associated with the display area 950 correspond to commands to play or to stop playing an audio/visual program, selection of an input, and radio frequency channel tuning.
  • the ones of the menu items 155 that are associated with the display area 970 correspond to commands to play or to stop playing an audio/visual program, and selection of an input.
  • FIG. 11 b depicts a configuration that is commonly referred to as “picture-by-picture” in which the display areas 950 and 970 are positioned adjacent each other (as opposed to one overlapping the other) within the portion of the display element surrounded by the racetrack menu 150 .
  • ones of the menu items 155 that are associated with the visual portion displayed in the display area 950 are positioned along portions of the racetrack menu 150 that are located closer to the display area 950 (specifically, the side 150 c and portions of the sides 150 a and 150 b ) to provide a visual indication to the user of that one association.
  • ones of the menu items 155 that are associated with the visual portion displayed in the display area 970 are positioned along portions of the racetrack menu 150 that are located closer to the display area 970 (specifically, the side 150 d and portions of the sides 150 a and 150 b ) to provide a visual indication to the user of that other association.
  • each of the display areas 950 and 970 are associated with separate ones of the menu items 155 that correspond to commands to play or to stop playing an audio/visual program, selection of an input, and radio frequency channel tuning.
  • FIGS. 11 a and 11 b depict embodiments having only two display areas (i.e., the display areas 950 and 970 ) within the portion of the display element 120 surrounded by the racetrack menu 150 , those skilled in the art will readily recognize that other embodiments incorporating more than two such display areas are possible, and that in such embodiments, each of the menu items 155 may be positioned along the racetrack menu 150 in a manner providing a visual indication of its association with one of those display areas. Indeed, it is envisioned that variants of the user interface 1000 are possible having 2-by-2 or larger arrays of display areas to accommodate the simultaneous display of multiple visual portions, possibly in security applications.
  • Although FIGS. 11 a and 11 b depict separate sets of the menu items 155 corresponding to commands to play and to stop playing an audio/visual program that are separately associated with each of the display areas 950 and 970 , and although this suggests that the visual portions played in each of the display areas 950 and 970 must be from different audio/visual programs, it should be noted that the simultaneously displayed visual portions in the display areas 950 and 970 may be of the same audio/visual program.
  • an audio/visual program may have more than one visual portion.
  • An example of this may be an audio/visual program including video of an event taken from more than one angle, such as an audio/visual program of a sports event where an athlete is shown in action from more than one camera angle.
  • With the simultaneous display of multiple visual portions, there may be multiple audio portions that each correspond to a different one of the visual portions. While viewing multiple visual portions simultaneously may be relatively easy for a user insofar as the user is able to choose any visual portion to watch with their eyes, listening to multiple audio portions simultaneously may easily become overwhelming. To address this, some embodiments may select one of the audio portions to be acoustically output to the user based on the position 260 of a tip of a digit along the racetrack surface 250 (referring back to FIG. 2 ).
  • Where the position 260 of a tip of a digit is on a portion of the racetrack surface 250 that corresponds to a portion of the racetrack menu 150 that is closer to the display area 950 , an audio portion of the audio/visual program of the visual portion being displayed in the display area 950 is acoustically output to the user. If the user then moves that tip of a digit along the racetrack surface 250 such that the position 260 is moved to a portion of the racetrack surface 250 that corresponds to a portion of the racetrack menu 150 that is closer to the display area 970 , then an audio portion of the audio/visual program of the visual portion being displayed in the display area 970 is acoustically output to the user.
  • the corresponding position of the marker 160 along the racetrack menu 150 may serve as a visual indication to the user of which visual portion the current selection of audio portion corresponds to.
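This audio-portion selection can be sketched as a lookup from the side of the racetrack menu 150 on which the marker 160 currently lies to the display area nearest that side, assuming the picture-in-picture layout described above (sides 150 a and 150 c nearer the display area 950, sides 150 b and 150 d nearer the display area 970). The mapping itself is an illustrative assumption.

```python
# Assumed side-to-display-area proximity, following the picture-in-picture
# layout of FIG. 11a; the side names come from the figures, but the
# mapping is an illustrative assumption.
SIDE_TO_DISPLAY_AREA = {
    "150a": "950",
    "150c": "950",
    "150b": "970",
    "150d": "970",
}

def audio_source_for_marker(side):
    """Select which display area's audio portion to acoustically output,
    based on the side of the racetrack menu 150 on which the marker 160
    (and hence the position 260 on the racetrack surface 250) lies."""
    return SIDE_TO_DISPLAY_AREA[side]
```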
  • FIG. 12 depicts an alternate variant of the user interface 1000 in which the combined display of the racetrack menu 150 and the display area 950 surrounded by the racetrack menu 150 does not fill substantially all of the display element 120 .
  • Such an embodiment may be implemented on a more complex variant of the audio/visual device 100 capable of simultaneously performing numerous functions, some of which are entirely unrelated to selection and playing of an audio/visual program. As depicted, this leaves a display area 920 that is outside the racetrack menu 150 and that is overlain by the combination of the racetrack menu 150 and the display area 950 available for such unrelated functions.
  • Such a more complex variant of the audio/visual device 100 may be a general purpose computer system, perhaps one employed as a “media center system” or “whole house entertainment system.”
  • the combination of the racetrack menu 150 and the display area 950 may be displayed in a window defined by an operating system having a windowing graphical user interface where the window occupies substantially less than all of the display element 120 .
  • the user may select and control the playing of an audio/visual program through the use of a variant of the touch sensor 220 having a touch-sensitive surface 225 that has a continuous rectangular shape (such as the variant of the touch sensor 220 of FIG. 10 b ), as opposed to having a ring shape (such as the variant of the touch sensor 220 of FIG. 10 a ).
  • the racetrack surface 250 is defined on the touch-sensitive surface 225 in a manner that occupies the periphery of the touch-sensitive surface 225 and that surrounds a remaining portion of the touch-sensitive surface 225 that enables conventional operation of other functions of the audio/visual device 100 that may be unrelated to the selection and playing of an audio/visual program.
  • this remaining portion of the touch-sensitive surface 225 may be employed in a conventional manner that will be familiar to those skilled in the art of graphical user interfaces in which a user moves about a graphical cursor using a tip of a digit placed on this remaining portion.
  • the user may choose to engage in selecting audio/visual programs and controlling the playing of those audio/visual programs through the racetrack surface 250 , and may choose to engage in performing other tasks unrelated to the selection and playing of audio/visual programs through the remaining portion of the touch-sensitive surface 225 .
  • one or more ridges 227 and/or grooves may be formed in the touch-sensitive surface 225 .
  • the user may be aided in unerringly placing a tip of a digit on whichever one of the racetrack surface 250 or the remaining portion of the touch-sensitive surface 225 that they wish to place that tip upon, without errantly placing that tip on both, and without having to glance at the touch-sensitive surface 225 of the touch sensor 220 .

Abstract

A user interface for an audio/visual device incorporates one or both of a touch sensor having a touch surface on which is defined a racetrack surface having a ring shape and a display element on which is displayed a racetrack menu also having a ring shape, and where the user interface incorporates both, the ring shapes of the racetrack surface and the racetrack menu are structured to generally correspond such that the position of a marker on the racetrack menu is caused to correspond to the position at which a digit of a user's hand touches the racetrack surface.

Description

    TECHNICAL FIELD
  • This disclosure relates to user interfaces incorporating a visual display and/or a touch-sensitive control.
  • BACKGROUND
  • Part of enjoying the playing of an audio/visual program (e.g., a piece of music, a recorded lecture, a recorded live performance, a movie, a slideshow, family pictures, an episode of a television program, etc.) is the task of selecting the desired audio/visual program to be played. Unfortunately, the increasing variety of choices of sources of audio/visual programs and the increasing variety of mechanisms by which audio/visual programs are able to be stored and played has greatly complicated what was once the relatively simple act of watching or listening to the playing of an audio/visual program to enjoy it.
  • For example, those wishing to “tune in” an audio/visual program being broadcast must now select a channel on which to view an audio/visual program from as many as 500 channels available through typical cable and/or satellite connections for television and/or radio. Further, it has become commonplace to employ audio/visual devices that are able to be programmed to autonomously tune in and record an audio/visual program for playing at a later time. Still further, it is now becoming increasingly commonplace to obtain audio/visual programs from websites accessible through the Internet, either by receiving those audio/visual programs as streaming data while they are played, or downloading those audio/visual programs as a storable digital file on an audio/visual device for playing at a later time. Yet further, some of these possible sources of audio/visual programs require paid subscriptions for which key cards and/or decryption keys are required to gain access to at least some audio/visual programs.
  • Those seeking to avail themselves of even a modest subset of such a wide array of options for playing an audio/visual program have often found themselves having to obtain multiple audio/visual devices (e.g., tuners, descramblers, disc media players, video recorders, web access devices, digital file players, televisions, visual displays without tuners, etc.). Each such audio/visual device often has a unique user interface, and more often than not, is accompanied by a separate handheld wireless remote control by which it is operated. Attempts have been made to grapple with the resulting plethora of remote controls that often accompany a multitude of audio/visual devices by providing so-called “universal remotes” enabling multiple audio/visual devices to be operated using a single remote control. However, a universal remote tends to go only so far in satisfying the desire of many users to simplify the coordination required in the operation of multiple audio/visual devices to perform the task of playing an audio/visual program.
  • Efforts have recently been made through cooperation among multiple purveyors of audio/visual devices to further ease the coordinated operation of multiple audio/visual devices through the adoption of standardized command codes and various approaches to coupling multiple audio/visual devices to enable the exchange of those standardized command codes among multiple audio/visual devices. An example of this effort is the CEC standardized command set created as part of the HDMI interface specification promulgated by HDMI Licensing, LLC of Sunnyvale, Calif. However, these efforts, even in conjunction with a universal remote, still only go so far in making the playing of an audio/visual program into a truly simple undertaking.
  • SUMMARY
  • A user interface for an audio/visual device incorporates one or both of a touch sensor having a touch surface on which is defined a racetrack surface having a ring shape and a display element on which is displayed a racetrack menu also having a ring shape, and where the user interface incorporates both, the ring shapes of the racetrack surface and the racetrack menu are structured to generally correspond such that the position of a marker on the racetrack menu is caused to correspond to the position at which a digit of a user's hand touches the racetrack surface.
  • In one aspect, an apparatus includes a display element capable of visually displaying a visual portion of an audio/visual program and a racetrack menu having a ring shape; a processing device; and a storage accessible to the processing device and storing a sequence of instructions. When the sequence of instructions is executed by the processing device, the processing device is caused to: cause the racetrack menu to be visually displayed on the display element such that the racetrack menu surrounds a first display area in which the visual portion of the audio/visual program may be visually displayed; cause a plurality of menu items to be visually displayed in the racetrack menu; cause a first marker to be visually displayed in the racetrack menu; receive an indication that a first manually-operable control is being operated to move the first marker; in response to the indication of the first manually-operable control being operated to move the first marker, move the first marker about the racetrack menu and constrain movement of the first marker to remain within the racetrack menu; receive an indication of the first manually-operable control being operated to select a menu item of the plurality of menu items that is in the vicinity of the first marker at a time subsequent to the first manually-operable control being operated to move the first marker about the racetrack; and in response to the indication of the first manually-operable control being operated to select the menu item that is in the vicinity of the first marker, cause the menu item to be selected, wherein causing the menu item to be selected comprises taking an action to cause the audio/visual program to be selected for playing.
  • Implementations may include, and are not limited to, one or more of the following features. The touch-sensitive surface of the touch sensor may have a ring shape that defines the ring shape of the racetrack surface such that the racetrack surface encompasses substantially all of the touch-sensitive surface. The apparatus may further include a manually operable control, and a casing wherein the touch sensor is disposed on the casing relative to the manually operable control such that the touch-sensitive surface surrounds the manually operable control.
  • Alternatively, the touch-sensitive surface of the touch sensor may be a continuous surface having no hole interrupting the touch-sensitive surface formed therethrough, where the ring shape of the racetrack surface is defined on the touch-sensitive surface to encompass a first portion of the touch-sensitive surface and is defined to be positioned about the periphery of the touch-sensitive surface so as to surround a second portion of the touch-sensitive surface, and a navigation surface is defined on the touch-sensitive surface to encompass the second portion. At least one ridge may be formed in the touch-sensitive surface, wherein the at least one ridge also at least partly defines the ring shape of the racetrack surface. The processing device may be caused by the sequence of instructions to define the first and second portions of the touch-sensitive surface by: monitoring activity on the touch-sensitive surface; treating the receipt of an indication of the digit touching the touch-sensitive surface at a location within the first portion as the indication of the digit touching the racetrack surface at the position; treating the receipt of an indication of the digit touching the touch-sensitive surface at a location within the second portion as an indication of the digit operating a navigation control; and in response to the indication of the digit touching the navigation control, causing a command to be transmitted to a source of the audio/visual program to operate a function of another menu associated with the source.
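By way of illustration only (this sketch is not part of any claimed embodiment), the partitioning of a continuous touch-sensitive surface into a peripheral racetrack portion and an interior navigation portion might be implemented as follows, assuming touch coordinates normalized to the unit square and a hypothetical `border` parameter controlling the width of the racetrack ring:

```python
def classify_touch(x, y, border=0.2):
    """Classify a touch on a continuous rectangular touch-sensitive
    surface (coordinates normalized to 0..1) as landing either on the
    peripheral racetrack portion or on the interior navigation portion.

    `border` is the fraction of each dimension devoted to the racetrack
    ring along every edge; it is an assumed tuning parameter, not a
    value taken from the specification.
    """
    if not (0.0 <= x <= 1.0 and 0.0 <= y <= 1.0):
        raise ValueError("touch outside the touch-sensitive surface")
    # A touch counts as "racetrack" if it falls within the border band
    # along any of the four edges; everything else is "navigation".
    on_edge = (x < border or x > 1.0 - border or
               y < border or y > 1.0 - border)
    return "racetrack" if on_edge else "navigation"
```

A controller monitoring the surface would then route "racetrack" touches to the marker logic and "navigation" touches to commands for the source's own menu.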
  • The apparatus may further include a source interface operable to transmit commands to a source of the audio/visual program; wherein execution of the sequence of instructions by the processing device further causes the processing device to receive an indication of the manually-operable control being operated; and in response to the indication of the manually-operable control being operated, operate the source interface to transmit a command to the source to cause the source to visually display a navigation menu of the source on the display element. The menu may have a ring shape that substantially corresponds to the ring shape of the racetrack surface. The ring shape of both the racetrack surface and the menu may be a rectangular ring shape such that the racetrack surface comprises four sides and the menu comprises four sides that correspond to the four sides of the racetrack surface. The ring shape of the menu may surround a display area in which a visual portion of the audio/visual program is displayed at a time when the audio/visual program is played.
  • Execution of the sequence of instructions by the processing device may further cause the processing device to cause the menu to be visually displayed in response to the indication of the digit touching the racetrack surface at the position at a time when the menu is not being visually displayed. Execution of the sequence of instructions by the processing device may further cause the processing device to cause the menu to be visually displayed in response to the indication of the digit touching the racetrack surface followed by an indication of the digit moving about the racetrack surface in a wiping motion starting at the position at a time when the menu is not being visually displayed; and cause a command concerning playing the audio/visual program to be transmitted to a source of the audio/visual program in response to the indication of the digit touching the racetrack surface followed by an indication of the digit ceasing to touch the racetrack surface at a time when the menu is not being visually displayed. Execution of the sequence of instructions by the processing device may further cause the processing device to cause the menu to be visually displayed in response to the indication of the digit touching the racetrack surface followed by an indication of the digit remaining in contact with the racetrack surface for at least a predetermined period of time at a time when the menu is not being visually displayed; and cause a command concerning playing the audio/visual program to be transmitted to a source of the audio/visual program in response to the indication of the digit touching the racetrack surface followed by an indication of the digit ceasing to touch the racetrack surface at a time when the menu is not being visually displayed.
  • In one aspect, a method includes receiving an indication of a digit of a hand of a user touching a racetrack surface at a position on the racetrack surface, wherein the racetrack surface is defined on a touch-sensitive surface of a touch sensor to encompass at least a portion of the touch-sensitive surface and is operable by the digit; in response to the indication of the digit touching the racetrack surface at the position, causing a marker to be visually displayed at a location that corresponds to the position on the racetrack surface on a menu that is visually displayed on a display element; receiving an indication of the position at which the digit touches the racetrack surface being moved about the racetrack surface; in response to the indication of the position being moved about the racetrack surface, causing the marker to be moved about the menu in a manner that corresponds to the manner in which the position is being moved about the racetrack; receiving an indication of the user increasing the pressure with which the user's digit touches the racetrack surface at the position at a time subsequent to receiving the indication of the position being moved about the racetrack; and in response to the indication of the user increasing pressure with which the user's digit touches the racetrack surface at the position, causing a menu item displayed in the vicinity of the marker to be selected, wherein causing the menu item to be selected comprises taking an action to cause an audio/visual program to be selected for playing.
  • Implementations may include, and are not limited to, one or more of the following features. The method may further include defining the racetrack surface on a first portion of the touch-sensitive surface and defining a navigation surface on a second portion of the touch-sensitive surface such that the ring shape of the racetrack surface surrounds the navigation surface by: monitoring activity on the touch-sensitive surface; treating the receipt of an indication of the digit touching the touch-sensitive surface at a location within the first portion as the receiving of the indication of the digit touching the racetrack surface at the position; treating the receipt of an indication of the digit touching the touch-sensitive surface at a location within the second portion as receiving an indication of the digit operating a navigation control; and in response to the indication of the digit touching the navigation control, causing a command to be transmitted to a source of the audio/visual program to operate a function of another menu associated with the source. Alternatively and/or additionally, the method may further include displaying the menu on the display element with a ring shape that substantially corresponds to the ring shape of the racetrack surface; and perhaps further include surrounding a display area on the display element with the menu, wherein a visual portion of the audio/visual program is displayed in the display area at a time when the audio/visual program is played. The ring shape of both the racetrack surface and the menu may be a rectangular ring shape such that the racetrack surface comprises four sides and the menu comprises four sides that correspond to the four sides of the racetrack surface.
  • The method may further include displaying the menu on the display element in response to the indication of the digit touching the racetrack surface at the position at a time when the menu is not being visually displayed. The method may further include displaying the menu on the display element in response to the indication of the digit touching the racetrack surface followed by receiving an indication of the digit moving about the racetrack surface in a wiping motion starting at the position at a time when the menu is not being visually displayed; and transmitting a command concerning playing the audio/visual program to a source of the audio/visual program in response to the indication of the digit touching the racetrack surface followed by receiving an indication of the digit ceasing to touch the racetrack surface at a time when the menu is not being visually displayed. The method may further include displaying the menu on the display element in response to the indication of the digit touching the racetrack surface followed by receiving an indication of the digit remaining in contact with the racetrack surface for at least a predetermined period of time at a time when the menu is not being visually displayed; and transmitting a command concerning playing the audio/visual program to a source of the audio/visual program in response to the indication of the digit touching the racetrack surface followed by receiving an indication of the digit ceasing to touch the racetrack surface at a time when the menu is not being visually displayed.
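The touch-while-menu-hidden alternatives described above (a wiping motion or a sustained dwell brings up the menu, while a brief touch-and-release transmits a play command) might be sketched as a simple gesture classifier. This is an illustrative sketch only; the `dwell` and `move_threshold` values are assumptions, not values from the specification:

```python
def classify_gesture(events, dwell=0.5, move_threshold=0.05):
    """Classify a completed touch on the racetrack surface at a time
    when the menu is not being displayed.

    `events` is a list of (timestamp_seconds, position) tuples, where
    position is the digit's location along the racetrack expressed as
    a fraction of the perimeter; the list ends when the digit lifts.
    """
    if not events:
        return None
    t0, p0 = events[0]
    t_end, _ = events[-1]
    # How far the digit travelled from its initial touch position.
    travelled = max(abs(p - p0) for _, p in events)
    if travelled >= move_threshold:
        return "show_menu"        # wiping motion starting at the position
    if t_end - t0 >= dwell:
        return "show_menu"        # digit remained in contact long enough
    return "play_command"         # brief touch followed by release
```

A controller would invoke this when the digit ceases to touch the surface, either displaying the racetrack menu or transmitting the play command to the source.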
  • Other features and advantages of the invention will be apparent from the description and claims that follow.
  • DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a perspective view of an embodiment of a user interface.
  • FIG. 2 depicts correlations between movement of a digit on a racetrack sensor of the user interface of FIG. 1 and movement of a marker on a racetrack menu of the user interface of FIG. 1.
  • FIGS. 3a, 3b, 3c and 3d, together, depict possible variants of the user interface of FIG. 1 incorporating different forms and combinations of markers.
  • FIG. 4 is a block diagram of a possible architecture of the user interface of FIG. 1.
  • FIG. 5 is a perspective view of another embodiment of the user interface of FIG. 1 combining more of the features of the user interface into a single device.
  • FIG. 6 depicts a possibility of switching between displaying and not displaying the racetrack menu of the user interface of FIG. 1.
  • FIGS. 7a and 7b, together, depict additional possible details of the user interface of FIG. 1.
  • FIG. 8 is a perspective view of the embodiment of the user interface of FIG. 5, additionally incorporating the possible details of FIGS. 7a and 7b.
  • FIG. 9 is a block diagram of the controller of the architecture of FIG. 4.
  • FIGS. 10a and 10b, together, depict possible variants of the touch sensor employed in the user interface of FIG. 1.
  • FIGS. 11a and 11b, together, depict possible variants of the user interface of FIG. 1 incorporating more than one display area.
  • FIG. 12 depicts another embodiment of the user interface of FIG. 1 in which the racetrack menu and the display area surrounded by the racetrack menu do not occupy substantially all of a display element.
  • DETAILED DESCRIPTION
  • What is disclosed and what is claimed herein is intended to be applicable to a wide variety of audio/visual devices, i.e., devices that are structured to be employed by a user to play an audio/visual program. It should be noted that although various specific embodiments of audio/visual devices (e.g., televisions, set-top boxes and hand-held remotes) are presented with some degree of detail, such presentations of specific embodiments are intended to facilitate understanding through the use of examples, and should not be taken as limiting either the scope of disclosure or the scope of claim coverage.
  • It is intended that what is disclosed and what is claimed herein is applicable to audio/visual devices that employ a tuner and/or a network interface to receive an audio/visual program. It is intended that what is disclosed and what is claimed herein is applicable to audio/visual devices structured to cooperate with other devices to play an audio/visual program and/or to cause an audio/visual program to be played. It is intended that what is disclosed and what is claimed herein is applicable to audio/visual devices that are wirelessly connected to other devices, that are connected to other devices through electrically and/or optically conductive cabling, or that are not connected to any other device, at all. It is intended that what is disclosed and what is claimed herein is applicable to audio/visual devices having physical configurations structured to be either portable or not. Still other configurations of audio/visual devices to which what is disclosed and what is claimed herein are applicable will be apparent to those skilled in the art.
  • FIG. 1 depicts a user interface 1000 enabling a user's hand-eye coordination to be employed to more intuitively operate at least one audio/visual device to select and play an audio/visual program. The user interface 1000 incorporates a displayed “racetrack” menu 150 and a corresponding “racetrack” surface 250. As depicted, the user interface 1000 is implemented by an interoperable set of devices that include at least an audio/visual device 100 and a handheld remote control 200, and as will be explained in greater detail, may further include another audio/visual device 900. However, as will also be explained in greater detail, the user interface 1000 may be substantially fully implemented by a single audio/visual device, such as the audio/visual device 100.
  • The racetrack menu 150 is visually displayed on a display element 120 disposed on a casing 110 of the audio/visual device 100, and as depicted, the audio/visual device 100 is a flat panel display device such as a television, employing a flat panel form of the display element 120 such as a liquid crystal display (LCD) element or a plasma display element. Further, the audio/visual device 100 may also incorporate acoustic drivers 130 to acoustically output sound. However, as those skilled in the art will readily recognize, the racetrack menu 150 may be displayed by any of a variety of types, configurations and sizes of audio/visual device, whether portable or stationary, including and not limited to, a projector or a handheld device.
  • The racetrack surface 250 is defined on a touch-sensitive surface 225 of a touch sensor 220 disposed on a casing 210 of the handheld remote control 200, and as depicted, the touch-sensitive surface 225 has a rectangular ring shape that physically defines the shape and position of the racetrack surface 250 such that the racetrack surface 250 encompasses substantially all of the touch-sensitive surface of the touch sensor 220. However, as those skilled in the art will readily recognize, the touch sensor 220 may be incorporated into any of a wide variety of devices, whether portable or stationary, including and not limited to, a wall-mounted control panel or a keyboard. Further, it is also envisioned that the touch sensor 220 may have a variant of the touch-sensitive surface 225 (see FIG. 2) that is of a shape other than a ring shape with the racetrack surface 250 defined on that variant of the touch-sensitive surface 225 in another way such that the racetrack surface 250 encompasses only a subset of that variant of the touch-sensitive surface 225 of the touch sensor 220. Further, the touch sensor 220 may be based on any of a variety of technologies.
  • As depicted, both the racetrack menu 150 and the racetrack surface 250 have a ring shape that is a generally rectangular ring shape with corresponding sets of four sides. More specifically, the four sides 150a, 150b, 150c and 150d of the racetrack menu 150 are arranged to correspond to the four sides 250a, 250b, 250c and 250d of the racetrack surface 250. This four-sided nature of both the racetrack menu 150 and the racetrack surface 250 is meant to accommodate the rectilinear nature of the vast majority of display elements currently found in audio/visual devices and the rectilinear nature of the visual portion of the vast majority of currently existing audio/visual programs that have a visual portion. However, it is important to note that although the racetrack menu 150 and the racetrack surface 250 are depicted and discussed herein as having a rectangular ring shape, other embodiments are possible in which the ring shape adopted by the racetrack surface 250 is a circular ring shape, an oval ring shape, a hexagonal ring shape or still another geometric variant of a ring shape. Further, where the racetrack menu 150 and/or the racetrack surface 250 have a ring shape that is other than a rectangular ring shape, one or both of the display element 120 and the touch sensor 220 may have a shape other than the rectangular shapes depicted herein.
  • As will be explained in greater detail, the four sides 150a-d of the racetrack menu 150 surround or overlie the edges of a display area 950 in which the visual portion of an audio/visual program selected via the user interface 1000 may be played. It is this positioning of the racetrack menu 150 about the periphery of the display element 120 and the display area 950 (whether surrounding or overlying the periphery of the display area 950) that supplies the impetus for both the racetrack menu 150 and the racetrack surface 250 having a ring shape that is generally a rectangular ring shape, rather than a ring shape of some other geometry. Where a selected audio/visual program does not have a visual portion (e.g., the audio/visual program is an audio recording having only an audio portion), the display area 950 may remain blank (e.g., display only a black or blue background color) or display status information concerning the playing of the selected audio/visual program as the selected audio/visual program is played, perhaps with the audio portion being acoustically output by the acoustic drivers 130. As depicted, the four sides 150a-d of the racetrack menu 150 are displayed by the display element 120 at the edges of the display element 120. However, it is also envisioned that the four sides 150a-d of the racetrack menu 150 may be positioned about the edges of a “window” of a graphical user interface of the type commonly employed in the operation of typical computer systems, perhaps where the audio/visual device 100 is a computer system on which audio/visual programs are selected and played through the user interface 1000.
  • As shown in FIG. 2, at various positions along one or more of the four sides 150a-d of the racetrack menu 150 are menu items 155 that may be selected by a user of the user interface 1000. The menu items 155 may include alphanumeric characters (such as those depicted as positioned along the side 150a) that may be selected to specify a channel or a website from which to select and/or receive an audio/visual program, symbols (such as those depicted as positioned along the side 150b) representing commands to control the operation of an audio/visual device capable of playing an audio/visual program (e.g., “play” and “stop” commands for a video cassette recorder, a disc media player, or solid state digital file player, etc.), and indicators of inputs (such as those depicted as positioned along the side 150c) to an audio/visual device that may be selected and through which an audio/visual program may be selected and/or received. Although the various menu items 155 positioned along the racetrack menu 150 could conceivably serve any of a wide variety of purposes, it is envisioned that much of the functionality of the menu items 155 will be related to enabling a user to select an audio/visual program for playing, and/or to actually play an audio/visual program.
  • To operate the user interface 1000, a user places the tip of a digit of one of their hands (i.e., the tip of a thumb or finger) on a portion of the racetrack surface 250 defined on the touch-sensitive surface 225 of the touch sensor 220, and a marker 160 is displayed on a portion of the racetrack menu 150 that has a position on the racetrack menu 150 that corresponds to the position 260 on the racetrack surface 250 at which the tip of their digit is in contact with the touch-sensitive surface 225 of the touch sensor 220. FIG. 2 also depicts how the marker 160 moves about and is constrained to moving about the racetrack menu 150 to maintain a correspondence between its location on the racetrack menu 150 and the position 260 of the digit on the racetrack surface 250 as the user moves that digit about the racetrack surface 250. In some embodiments, the marker 160 may move about the racetrack menu 150 in a manner in which the marker 160 “snaps” from being centered about one menu item 155 to an adjacent menu item 155 as the marker 160 is moved about a portion of the racetrack menu 150 having adjacent ones of the menu items 155. Further, such “snapping” of the marker 160 between adjacent ones of the menu items 155 may be accompanied by the concurrent acoustic output of some form of sound (e.g., a “click” or “beep” sound that accompanies each “snap” of the marker 160) to provide further feedback to a user of the marker 160 moving from one such menu item 155 to another.
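The "snapping" behavior described above might be sketched as follows (an illustrative sketch only, not the claimed implementation). The digit's position and each menu item's position are assumed to be expressed as fractions of the full ring perimeter, so distance must wrap around the ring:

```python
def snap_marker(position, item_positions):
    """Return the menu item nearest to the digit's position, so the
    marker can "snap" from item to item as the digit moves.

    `position` is the digit's location on the racetrack surface as a
    fraction [0, 1) of the full perimeter; `item_positions` maps each
    menu item to its own perimeter fraction (both are assumed
    representations, not taken from the specification).
    """
    def ring_distance(a, b):
        # Distance wraps around the ring: an item near 0.99 is
        # adjacent to one near 0.01.
        d = abs(a - b) % 1.0
        return min(d, 1.0 - d)

    return min(item_positions,
               key=lambda item: ring_distance(position, item_positions[item]))
```

Each time the snapped item changes, the controller could emit the accompanying "click" or "beep" as feedback.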
  • When the marker 160 is positioned over a menu item 155 that the user wishes to select, the user selects that menu item 155 by pressing whichever one of their digits is already in contact with the racetrack surface 250 with greater pressure than was used in simply placing that digit in contact with the racetrack surface 250. In some embodiments, the touch sensor 220, itself, is capable of distinguishing different degrees of pressure with which the digit is put into contact with the touch-sensitive surface 225 of the touch sensor 220 on which the racetrack surface 250 is defined in order to distinguish an instance in which the user is pressing harder with that digit to select one of the menu items 155. In other embodiments, the touch sensor 220 is able to function in a manner not unlike a mechanically depressible button in which the additional pressure applied through that digit by the user causes the touch sensor 220 to be pressed inward towards the casing 210 as part of selecting a menu item. This may be accomplished by overlying one or more buttons disposed within the casing 210 with the touch sensor 220 so that such buttons are depressed by the touch sensor 220 as the touch sensor 220 is itself depressed towards the casing 210. Where the touch sensor 220 is able to be pressed inward towards the casing 210, such inward movement may be accompanied by a “click” sound that may be heard by the user and/or a tactile “snap” sensation that can be sensed by the user through their digit to give the user some degree of positive feedback that they've successfully selected one of the menu items 155.
Regardless of whether the touch sensor 220 is able to be pressed inward towards the casing 210, or not, a “click” or other sound accompanying the user's use of increased pressure on the racetrack surface 250 to select one of the menu items 155 may be acoustically output through an acoustic driver (not shown) incorporated into the remote control 200 and/or through the acoustic drivers 130 of the audio/visual device 100.
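For the pressure-sensing embodiments above, detecting the "press harder to select" action amounts to comparing later pressure samples against the resting contact pressure observed when the digit first touched. The following is a minimal sketch under assumed thresholds (`baseline` and `press_ratio` are invented tuning values, not from the specification):

```python
def detect_selection(pressure_samples, baseline=0.2, press_ratio=1.8):
    """Return the index of the sample at which the user presses hard
    enough to select the menu item under the marker, or None.

    A selection is registered when the contact pressure rises to at
    least `press_ratio` times the resting pressure observed when the
    digit first touched the racetrack surface.
    """
    if not pressure_samples:
        return None
    # The first sample (floored by `baseline` to tolerate very light
    # initial contact) serves as the resting reference pressure.
    resting = max(pressure_samples[0], baseline)
    for i, p in enumerate(pressure_samples[1:], start=1):
        if p >= resting * press_ratio:
            return i
    return None
```

On a positive result, the controller would select the menu item currently in the vicinity of the marker and could trigger the audible "click" feedback.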
  • FIGS. 3a, 3b and 3c depict other variations of forms of marker and combinations of markers. As will be made clear, different forms of marker and combinations of multiple markers may be used to enhance the rapidity with which the eyes of a user of the user interface 1000 are drawn to a specific location on the racetrack menu 150, and to aid the hand-eye coordination of that user.
  • Although the marker 160 was depicted in FIG. 2 as taking the form of a box-shaped graphical element sized to surround one of the menu items 155 at a time when positioned in the vicinity of one or more of the menu items 155, FIG. 3a depicts another variant of the marker 160 having the form of a triangular pointer. Still other possible graphical representations of the marker 160 will occur to those skilled in the art, such as forms of the marker 160 having other geometric shapes (e.g., a dot, a circle, an arrow, etc.) or other ways of being positioned in the vicinity of a given one of the menu items 155 (e.g., overlying, surrounding, pointing to, touching, etc., one of the menu items 155). Still further, instead of the marker being a graphical element that is separate and distinct from any of the menu items 155, the marker 160 may instead be a modified form of a given one of the menu items 155, such as a change in a color of a menu item, an enlargement of a menu item in comparison to others, or some form of recurring animation or movement imparted to a menu item. In other words, the position of the marker 160 (and by extension, the position 260 of the tip of a digit on the racetrack surface 250) may be indicated by one of the menu items 155 changing color, changing font, becoming larger, becoming brighter, or being visually altered in comparison to the others of the menu items 155 in any of a number of ways to draw a user's eyes to it.
  • FIG. 3a also depicts an optional additional marker 165 that follows the location of the marker 160 and provides a visual “highlight” of which one of the four sides 150a-d the marker 160 is currently positioned within as a visual aid to enable a user's eyes to be more quickly directed to that one of the four sides 150a-d when looking at the racetrack menu 150. Though not specifically depicted, in other embodiments, the additional marker 165 may be implemented as a highlighting, change in color, change in background color, change in font, enlargement or other visual alteration made to all of the menu items 155 that are positioned in that one of the four sides 150a-d.
  • FIG. 3b depicts the manner in which the marker 160 may be dynamically resized as it is moved about the racetrack menu 150, especially in embodiments where the marker 160 is of a form that in some way overlaps or surrounds one of the menu items 155 at a time in order to take into account the different sizes of different ones of the menu items 155. More specifically, and as depicted in FIG. 3b, the numeral “3” has visibly smaller dimensions (i.e., occupies less space in the racetrack menu 150) than does the numeral “III” that is also present on the same racetrack menu 150. Thus, when the depicted form of the marker 160 (i.e., the “box” form of the marker 160 that has been discussed at length) is positioned on one or the other of these two particular ones of the menu items 155, the marker 160 is resized to be larger or smaller as needed to take into account the different sizes of these two particular ones of the menu items 155.
  • FIG. 3c also depicts an optional additional marker 162 that follows the location of the marker 160 and provides a more precise visual indication than does the marker 160 of the position 260 of the tip of a user's finger along a corresponding portion of the racetrack surface 250. As depicted, the marker 162 takes the form of what might be called a “dash” positioned along one of the edges of the box form of the marker 160. However, it should be noted that the marker 162 may take any of a variety of forms (e.g., a dot, a circle, an arrow, etc.). The provision of the marker 162 may be deemed desirable in embodiments where the marker 160 moves in the manner previously described in which the marker 160 “snaps” between adjacent ones of the menu items 155 such that the marker 160 does not, itself, provide as precise an indication of the position 260 of the tip of the user's digit. More specifically, FIG. 3c depicts a succession of views of a portion of the racetrack menu 150 on which menu items 155 taking the form of the numerals “1” through “5” are positioned. As can be seen in this depicted succession, the marker 162 provides a more precise indication of the movement of the position 260 of the tip of the user's digit along a portion of the racetrack surface 250 from left to right than does the marker 160 which remains on the one of the menu items 155 having the form of the numeral “2” on this portion of the racetrack menu 150. Such a higher precision indication of the position 260 of the tip of the user's digit may aid the user in improving their hand-eye coordination in operating the user interface 1000. 
Such a higher precision indication of the position 260 may also provide a user with some degree of reassurance that the user interface 1000 is responding to their actions (or more specifically, whatever processing device is incorporated into the user interface 1000 is responding to their actions) by seeing that the exact position 260 of the tip of their digit is being successfully detected.
  • FIG. 3d depicts yet another alternate variation of the marker 160 in a variant of the user interface 1000 in which the racetrack menu 150 is divided into multiple segments, with each such segment serving as a background to one of the menu items 155. As depicted, the marker 160 is implemented as both a change in the color and/or brightness of one of those segments of the racetrack menu 150 and an enlarging of the graphical element representing the one of the menu items 155 (specifically, the numeral “3”) positioned within that segment. As so depicted, the marker 160 might be said to have a form that is a variant of the earlier-depicted box, but a box that is made visible by having a color and/or brightness that differs from the rest of the racetrack menu 150, rather than a box that is made visible by a border or outline. FIG. 3d also depicts this alternate variation of the marker 160 being used in combination with the earlier-described additional marker 162 that provides a more precise indication of the position 260 of the tip of a user's digit along a portion of the racetrack surface 250.
  • FIG. 3d also depicts how this variant of the marker 160 is resized to accommodate the different sizes of the different ones of the menu items 155, although this resizing now corresponds to the differing dimensions of different ones of the segments into which the racetrack menu 150 is divided. In some variants, each of the segments may be individually sized to fit the visual size and shape of its corresponding one of the menu items 155, as depicted in FIG. 3d. Thus, since the numeral “3” of one of the menu items 155 is smaller in at least one dimension than the numeral “III” of another one of the menu items 155 (even with the numeral “3” being enlarged in font size), the segment of the racetrack menu 150 in which the numeral “3” is positioned is smaller than the segment in which the numeral “III” is positioned. However, in other variants, the segments filling at least one of the four sides 150a-d may all be sized based on the quantity of the menu items 155 positioned in that one of the four sides so as to divide that one of the four sides 150a-d into equal-sized segments. Where the ones of the menu items 155 along that one of the four sides 150a-d may change in response to a selection of an input or for other reasons, the size of the segments in that one of the four sides 150a-d may change in response to a change in quantity of the menu items 155 positioned in that one of the four sides 150a-d. Thus, for example, a reduction in the quantity of menu items 155 in that one of the four sides 150a-d results in each of its segments becoming larger in at least one dimension, and an increase in the quantity of menu items 155 in that one of the four sides 150a-d results in each of its segments becoming smaller.
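The equal-sized-segment variant above can be sketched in a few lines (illustrative only; pixel coordinates and the function name are assumptions). Dividing one side of the racetrack menu by the number of menu items currently on that side directly yields the behavior described: fewer items produce larger segments, more items produce smaller ones:

```python
def segment_bounds(side_length, item_count):
    """Divide one side of the racetrack menu into equal-sized segments,
    one per menu item currently positioned on that side.

    Returns a list of (start, end) pixel bounds along the side; an
    empty list if the side currently holds no menu items.
    """
    if item_count < 1:
        return []
    width = side_length / item_count
    # Rounding at each boundary keeps the segments contiguous even
    # when side_length is not evenly divisible by item_count.
    return [(round(i * width), round((i + 1) * width))
            for i in range(item_count)]
```

Recomputing these bounds whenever the set of menu items on a side changes reproduces the dynamic resizing of the segments (and of the marker drawn over them).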
  • FIG. 4 is a block diagram of a possible architecture of the user interface 1000 by which a controller 500 receives input through a user's use of at least the racetrack surface 250 defined on at least a portion of a touch-sensitive surface 225 of the touch sensor 220 to which the controller 500 is coupled, and provides at least the racetrack menu 150 as a visual output to the user through at least the display element 120 to which the controller 500 is also coupled. In various possible embodiments, the controller 500 may be incorporated directly into the audio/visual device 100, or into another audio/visual device 900 coupled to the audio/visual device 100 and shown in dotted lines in FIG. 1. As also depicted in FIG. 1, the remote control 200 communicates wirelessly through the emission of radio frequency, infrared or other wireless emissions to whichever one of the audio/visual devices 100 and 900 incorporates the controller 500. However, as those skilled in the art will readily recognize, the remote control 200 may communicate through an electrically and/or optically conductive cable (not shown) in other possible embodiments. Alternatively and/or additionally, the remote control 200 may communicate through a combination of wireless and cable-based (optical or electrical) connections forming a network between the remote control 200 and the controller 500.
  • Still other embodiments may incorporate the touch sensor 220 directly on a user accessible portion of one or both of the audio/ visual devices 100 and 900, either in addition to or as an alternative to providing the touch sensor 220 on the remote control 200. Indeed, FIG. 5 depicts an alternate variant of the audio/visual device 100 having more of a portable configuration incorporating both the display element 120 displaying the racetrack menu 150 and the touch sensor 220 on a touch-sensitive surface 225 on which the racetrack surface 250 is defined. This alternative variant of the audio/visual device 100 may also incorporate the controller 500, such that much (if not substantially all) of the user interface 1000 is implemented solely by the audio/visual device 100.
  • Returning to FIG. 4, regardless of which audio/visual device incorporates the controller 500, the controller 500 incorporates multiple interfaces in the form of one or more connectors and/or one or more wireless transceivers by which the controller 500 is able to be coupled to one or more sources 901, 902, 903 and/or 904. Any such connectors may be disposed on the casing of whatever audio/visual device the controller 500 is incorporated into (e.g., the casing 110 of the audio/visual device 100 or a casing of the audio/visual device 900). In being so coupled, the controller 500 is able to transmit commands to one or more of the sources 901-904 to access and select audio/visual programs, and is able to receive audio/visual programs therefrom. Each of the sources 901-904 may be any of a variety of types of audio/visual device, including and not limited to, RF tuners (e.g., cable television or satellite dish tuners), disc media recorders and/or players, tape media recorders and/or players, solid-state or disk-based digital file players (e.g., an MP3 file player), Internet access devices to access streaming data of audio/visual programs, or docking cradles for portable audio/visual devices (e.g., a digital camera). Further, in some embodiments, one or more of the sources 901-904 may be incorporated into the same audio/visual device into which the controller 500 is incorporated (e.g., a built-in disc media player or built-in radio frequency tuner).
  • In embodiments where one of the sources 901-904 is not incorporated into the same audio/visual device as the controller 500, and where that one of the sources 901-904 is coupled to the controller 500 via an interface of the controller 500 employing a connector, any of a variety of types of electrical and/or optical signaling conveyed via electrically and/or optically conductive cabling may be employed. Preferably, a single cable is employed both in relaying commands from the controller 500 to that one of the sources 901-904 and in relaying audio/visual programs to the controller 500. However, combinations of cabling in which different cables separately perform these functions are also possible. Some of the possible forms of cabling able to relay both commands and audio/visual programs may conform to one or more industry standards, including and not limited to, Syndicat des Constructeurs d'Appareils Radiorecepteurs et Televiseurs (SCART) promulgated in the U.S. by the Electronic Industries Alliance (EIA) of Arlington, Va.; Ethernet (IEEE-802.3) or IEEE-1394 promulgated by the Institute of Electrical and Electronics Engineers (IEEE) of Washington, D.C.; Universal Serial Bus (USB) promulgated by the USB Implementers Forum, Inc. of Portland, Oreg.; Digital Visual Interface (DVI) promulgated by the Digital Display Working Group (DDWG) of Vancouver, Wash.; High-Definition Multimedia Interface (HDMI) promulgated by HDMI Licensing, LLC of Sunnyvale, Calif.; or DisplayPort promulgated by the Video Electronics Standards Association (VESA) of Milpitas, Calif. 
Other possible forms of cabling able to relay only one or the other of commands and audio/visual programs may conform to one or more industry standards, including and not limited to, RS-422 or RS-232-C promulgated by the EIA; Video Graphics Array (VGA) maintained by VESA; RC-5720C (more commonly called “Toslink”) maintained by the Japan Electronics and Information Technology Industries Association (JEITA) of Tokyo, Japan; the widely known and used Separate Video (S-Video); or S-Link maintained by Sony Corporation of Tokyo, Japan.
  • In other embodiments where one of the sources 901-904 is not incorporated into the same audio/visual device as the controller 500, and where that one of the sources 901-904 is coupled to the controller 500 via a wireless transceiver, any of a variety of types of infrared, radio frequency or other wireless signaling may be employed. Preferably, a single wireless point-to-point coupling is employed both in relaying commands from the controller 500 to that one of the sources 901-904 and in relaying audio/visual programs to the controller 500. However, combinations of separate wireless couplings in which these functions are separately performed are also possible. Some of the possible forms of wireless signaling able to relay both commands and audio/visual programs may conform to one or more industry standards, including and not limited to, IEEE 802.11a, 802.11b or 802.11g promulgated by the IEEE; Bluetooth promulgated by the Bluetooth Special Interest Group of Bellevue, Wash.; or ZigBee promulgated by the ZigBee Alliance of San Ramon, Calif.
  • In still other embodiments where one of the sources 901-904 is not incorporated into the same audio/visual device as the controller 500, a combination of cabling-based and wireless couplings may be used. An example of such a combination may be the use of a cabling-based coupling to enable the controller 500 to receive an audio/visual program from that one of the sources 901-904, while an infrared transmitter coupled to the controller 500 may be positioned at or near the one of the sources 901-904 to wirelessly transmit commands via infrared to that one of the sources 901-904. Still further, although FIG. 4 depicts each of the sources 901-904 as being directly coupled to the controller 500 in a point-to-point manner, those skilled in the art will readily recognize that one or more of the sources 901-904 may be coupled to the controller 500 indirectly through one or more of the others of the sources 901-904, or through a network formed among the sources 901-904 (and possibly incorporating routers, bridges and other relaying devices that will be familiar to those skilled in the art) with multiple cabling-based and/or wireless couplings.
  • Some of the above-listed industry standards include specifications of commands that may be transmitted between audio/visual devices to control access to and/or control the playing of audio/visual programs, including most notably, SCART, IEEE-1394, USB, HDMI, and Bluetooth. Where such an industry standard for coupling the controller 500 to one or more of the sources 901-904 is employed, the controller 500 may limit the commands transmitted to one or more of the sources 901-904 to the commands specified by that industry standard and map one or more of those commands to corresponding ones of the menu items 155 such that a user is able to cause the controller 500 to send those commands to one or more of the sources 901-904 by selecting those corresponding ones of the menu items 155. However, where the benefit of such a standardized command set is unavailable, the controller 500 may employ any of a wide variety of approaches to identify one or more of the sources 901-904 to an extent necessary to “learn” what commands are appropriate to transmit and the manner in which they must be transmitted.
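The mapping of standardized commands to menu items described above can be sketched simply. This is an illustrative assumption of how such a mapping might be constrained; the command names and menu-item labels are hypothetical placeholders, not drawn from any particular standard.

```python
# Hedged sketch: restrict the commands mapped to menu items to those a
# governing industry standard actually specifies, as described above.
# STANDARD_COMMANDS is a placeholder set, not a real standard's command list.

STANDARD_COMMANDS = {"play", "stop", "record", "power_on", "power_off"}

def build_command_map(requested: dict[str, str]) -> dict[str, str]:
    """Map menu-item labels to commands, dropping any command the
    standard does not specify."""
    return {label: cmd for label, cmd in requested.items()
            if cmd in STANDARD_COMMANDS}
```

A non-standard command (e.g. a hypothetical "eject") would simply not be mapped, so no menu item could cause it to be transmitted.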
  • A user of the user interface 1000 may select one of the sources 901-904 as part of selecting an audio/visual program for being played by employing the racetrack surface 250 and the marker 160 to select one or more of the menu items 155 shown on the racetrack menu 150, such as the “I” through “IV” menu items 155 depicted as displayed by the controller 500 on the side 150 c of the racetrack menu 150. Those menu items 155 depicted on the side 150 c correspond to the sources 901 through 904, which are depicted as bearing the labels “source I” through “source IV” in FIG. 4. The controller 500 receives input from the touch sensor 220 indicating the contact of the user's digit with a portion of the racetrack surface 250, indicating movement of the position 260 of contact of the digit about the racetrack surface 250, and indicating the application of greater pressure by the user through that digit against the touch sensor 220 at the position 260 (wherever the position 260 is at that moment) when selecting one of the menu items 155. The selection of one of the sources 901-904 by the user causes the controller 500 to switch to receiving audio/visual programs from that one of the sources 901-904, and to be ready to display any visual portion in the display area 950 and acoustically output any audio portion through the acoustic drivers 130 (or whatever other acoustic drivers may be present and employed for playing audio/visual programs).
  • The selection of one of the sources 901-904 may further cause the controller 500 to alter the quantity and types of menu items 155 displayed on one or more of the sides 150 a-d of the racetrack menu 150 such that the displayed menu items 155 more closely correspond to the functions supported by whichever one of the sources 901-904 that has been selected. This changing display of at least a subset of the menu items 155 enables the user to operate at least some functions of a selected one of the sources 901-904 by selecting one or more of the menu items 155 to thereby cause the controller 500 to transmit one or more commands corresponding to those menu items to the selected one of the sources 901-904. By way of example, where the one of the sources 901-904 with the ability to record an audio/visual program was previously selected, the racetrack menu 150 may include one or more menu items 155 that could be selected to cause the controller 500 to transmit a command to that previously selected one of the sources 901-904 to cause it to start recording an audio/visual program. However, if the user then selects another one of the sources 901-904 that does not have the ability to record an audio/visual program, then the controller 500 would alter the menu items 155 displayed on the racetrack menu 150 to remove one or more menu items associated with recording an audio/visual program. In this way, at least a subset of the menu items 155 displayed on the racetrack menu 150 are “modal” in nature, insofar as at least that subset changes with the selection of different ones of the sources 901-904.
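The "modal" behavior just described, recomputing the displayed menu items from the capabilities of the selected source, can be sketched as follows. The capability names and the dictionary representation are illustrative assumptions.

```python
# Minimal sketch of modal menu items: each item requires a capability, and
# only items whose capability the selected source supports are displayed.
# Selecting a source without "record" capability drops the record item.

def modal_menu_items(all_items: dict[str, str], source_caps: set[str]) -> list[str]:
    """Return labels of menu items whose required capability is
    supported by the currently selected source."""
    return [label for label, cap in all_items.items() if cap in source_caps]
```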
  • The coupling and/or uncoupling of one or more of the sources 901-904 to and/or from whatever audio/visual device into which the controller 500 is incorporated may also cause the controller 500 to alter the quantity and/or types of menu items 155 that are displayed in another example of at least a subset of the menu items 155 being modal in nature. By way of example, the uncoupling of one of the sources 901-904 where that one of the sources 901-904 had been coupled through cabling may cause the controller 500 to remove the one of the menu items 155 by which that now uncoupled one of the sources 901-904 could be selected. Alternatively and/or additionally, where that uncoupled one of the sources 901-904 was already selected at the time of such uncoupling such that a subset of the menu items 155 is displayed that is meant to correspond to the functions able to be performed by that now uncoupled one of the sources 901-904, the controller 500 may respond to such an uncoupling by autonomously selecting one of the other of the sources 901-904 and altering the subset of the menu items 155 to correspond to the functions able to be performed by that newly selected one of the sources 901-904. In contrast, and by way of another example, the uncoupling of one of the sources 901-904 where that one of the sources 901-904 had been wirelessly coupled may or may not cause the controller 500 to remove the one of the menu items 155 by which that now uncoupled one of the sources 901-904 could be selected. 
If there is a mechanism provided in the chosen form of wireless communications used in the coupling that indicates that the uncoupling is due simply to that one of the sources 901-904 entering into a low-power or "sleep" mode, then it may be that no change is made by the controller 500 to the menu items 155 that are displayed, especially if the form of wireless communications used allows the controller 500 to signal that one of the sources 901-904 to "wake up" in response to the user selecting one of the menu items 155 that is associated with it. However, if no such mechanism to indicate the circumstances of an uncoupling is available, then the uncoupling may well result in an alteration or removal of at least some of the menu items 155 displayed on the racetrack menu 150. Where a previously uncoupled one of the sources 901-904 is subsequently coupled once again, regardless of the type of coupling, the controller 500 may be caused to automatically select that now coupled one of the sources 901-904. This may be done based on an assumption that the user has coupled that source to whatever audio/visual device into which the controller 500 is incorporated with the intention of immediately playing an audio/visual program from it.
  • While at least some of the menu items 155 may be modal in nature such that they are apt to change depending on the selection and/or condition of one or more of the sources 901-904, others of the menu items 155 may not be modal in nature such that they are always displayed whenever the racetrack menu 150 is displayed. More specifically, where one or more of the sources 901-904 are incorporated into the same audio/visual device as the controller 500, the ones of the menu items 155 associated with those sources may remain displayed in the racetrack menu 150, regardless of the occurrences of many possible events that may cause other menu items 155 having a modal nature to be displayed, to not be displayed, or to be displayed in some altered form. By way of example, where a radio frequency tuner is incorporated into the same audio/visual device into which the controller 500 is incorporated, then a subset of the menu items 155 associated with selecting a radio frequency channel (e.g., the decimal point and numerals “0” through “9” depicted as displayed within the side 150 a) may be a subset of the menu items 155 that is always displayed in the racetrack menu 150. It may be that the selection of any menu item of such a subset of the menu items 155 may cause the controller 500 to automatically switch the selection of a source of audio/visual programs to the source associated with those menu items 155. Thus, in the example where an audio/visual device incorporates a radio frequency tuner and menu items 155 associated with selecting a radio frequency channel are always displayed, the selection of any one of those menu items would cause the controller 500 to automatically switch to that radio frequency tuner as the source from which to receive an audio/visual program if that tuner were not already selected as the source. 
By way of another example, one or more of the menu items 155 associated with selecting a source of audio/visual programs (e.g., the roman numerals “I” through “IV” depicted as displayed within the side 150 c) may be menu items that are always displayed in the racetrack menu 150.
  • Regardless of what source is selected or how the source is selected, if an audio/visual program received by the controller 500 from that source has a visual portion, then the controller 500 causes that visual portion to be displayed in the display area 950. As has so far been depicted and described, the racetrack menu 150 has a rectilinear configuration with the four sides 150 a-d that are configured to surround or overlie edges of the display area 950. However, in some embodiments, it may be that the racetrack menu 150 is not always displayed such that what is shown on the display element 120 of the audio/visual device 100 could be either the display area 950 surrounded by the racetrack menu 150, or the display area 950 expanded to fill the area otherwise occupied by the racetrack menu 150.
  • As depicted in FIG. 6, what is shown on the display element 120 could toggle between these two possibilities, and this toggling could occur in response to observed activity and/or a lack of observed activity in the operation of at least the racetrack surface 250. More specifically, on occasions where no indication of contact by a user's digit on the racetrack surface 250 has been received by the controller 500 for at least a predetermined period of time, the controller 500 may provide the display element 120 with an image that includes substantially nothing else but the display area 950 such that a visual portion of an audio visual program is substantially the only thing shown on the display element 120. However, once the controller 500 has received an indication of activity such as the tip of a digit making contact with the racetrack surface 250, the controller 500 then provides the display element 120 with an image that includes a combination of the display area 950 and the racetrack menu 150.
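The inactivity-based toggling above can be sketched as a small state holder. The class name, the use of explicit timestamps, and the timeout value are assumptions for illustration only.

```python
# Minimal sketch: the racetrack menu is shown while touch activity has been
# seen within a predetermined period, and hidden once that period elapses
# with no further contact, as described above.

class MenuVisibility:
    def __init__(self, timeout: float):
        self.timeout = timeout      # predetermined inactivity period (seconds)
        self.last_touch = None      # time of most recent contact, if any

    def on_touch(self, now: float) -> None:
        self.last_touch = now

    def menu_visible(self, now: float) -> bool:
        return (self.last_touch is not None
                and now - self.last_touch < self.timeout)
```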
  • In some embodiments, at a time when both the display area 950 and the racetrack menu 150 are displayed, the controller 500 reduces the size of the display area 950 to make room around the edges of the display area 950 for the display of the racetrack menu 150 on the display element 120, and in so doing, may rescale the visual portion (if there is one) of whatever audio/visual program may be playing at that time. In other embodiments, the display area 950 is not resized, and instead, the racetrack menu 150 is displayed in a manner in which the racetrack menu 150 overlies edge portions of the display area 950 such that edge portions of any visual portion of an audio/visual program are no longer visible. However, in those embodiments in which the racetrack menu overlies edge portions of the display area 950, the racetrack menu 150 may be displayed in a manner in which at least some portions of the racetrack menu have a somewhat “transparent” quality in which the overlain edge portions of any visual portion of an audio/visual program can still be seen by the user “looking through” the racetrack menu 150. As will be familiar to those skilled in the art, this “transparent” quality may be achieved through any of a number of possible approaches to combining the pixels of the image of the racetrack menu 150 with pixels of the overlain portion of any visual portion of an audio/visual program (e.g., by averaging pixel color values, alternately interspersing pixels, or bit-wise binary combining of pixels with a pixel mask).
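One of the compositing approaches mentioned above, averaging pixel color values, can be sketched for a single RGB pixel. A real implementation would run over the whole overlain edge region of each frame; this per-pixel function is only illustrative.

```python
# Hedged sketch of the "transparent" quality described above: each RGB
# channel of a menu pixel is averaged with the underlying video pixel, so
# the overlain video remains partly visible through the racetrack menu.

def blend_pixel(menu_px: tuple, video_px: tuple) -> tuple:
    """Average each color channel of the menu pixel with the video pixel."""
    return tuple((m + v) // 2 for m, v in zip(menu_px, video_px))
```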
  • Along with combining the visual display of the display area 950 and the racetrack menu 150, the controller 500 may also combine audio associated with operation of the user interface 1000 with an audio portion (if present) of an audio/visual program being played. More specifically, “click” sounds associated with the user pressing the racetrack surface 250 defined on a surface of the touch sensor 220 with greater pressure and/or with the “snapping” of the marker 160 between adjacent ones of the menu items 155 may be combined with whatever audio portion is acoustically output as part of the playing of an audio/visual program.
  • In some embodiments, at a time when the racetrack menu 150 is not displayed (e.g., at a time when only the display area 950 is displayed), the controller 500 may do more than simply cause the racetrack menu 150 to be displayed in response to a user touching a portion of the racetrack surface 250. More specifically, in addition to causing the racetrack menu 150 to be displayed, the controller 500 may take particular actions in response to particular ones of the sides 250 a-d of the racetrack surface 250 being touched by a user at a time when the racetrack menu 150 is not being displayed. By way of example, at a time when the racetrack menu 150 is not being displayed, the detection of a touch to the side 250 d may cause a command to be sent to one of the sources 901-904 to provide an on-screen guide concerning audio/visual programs able to be provided by that source, where such a guide would be displayed in the display area 950, with edges of the display area 950 being either surrounded or overlain by the racetrack menu 150 as has been previously described.
  • In a variation of such embodiments, it may be that causing the racetrack menu 150 to be displayed requires both a touch and some minimum degree of movement of the tip of a user's digit on the racetrack surface 250 (i.e., a kind of "touch-and-drag" or "wiping" motion across a portion of the racetrack surface 250), while other particular actions are taken in response to where there is only a touch of a tip of a user's digit on particular ones of the sides 250 a-d of the racetrack surface 250. By way of example, while the racetrack menu 150 is not displayed, touching the side 250 a may cause a command to be sent to a source to turn that source on or off, and touching the side 250 b may cause an audio portion of an audio/visual program to be muted, while both touching and moving a digit across a portion of the racetrack surface 250 in a "wiping" motion is required to enable the display and use of the racetrack menu 150.
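The distinction above between a simple touch and the "wiping" motion can be sketched as a distance threshold on the contact's start and end positions. The coordinate representation and threshold are assumptions for illustration.

```python
# Hypothetical sketch: a contact that moves at least min_move across the
# racetrack surface is classified as a "wipe" (enabling the racetrack menu),
# while lesser movement is a plain "touch" that triggers a side-specific
# action, as described above.

import math

def classify_gesture(start: tuple, end: tuple, min_move: float) -> str:
    dist = math.hypot(end[0] - start[0], end[1] - start[1])
    return "wipe" if dist >= min_move else "touch"
```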
  • FIGS. 7 a and 7 b, taken together, depict additional features that may be incorporated into the user interface 1000. Where a selected one of the sources 901-904 displays its own on-screen menu 170 (e.g., a guide concerning audio/visual programs available from that source), either in place of a visual portion of an audio/visual program or overlying a visual portion of an audio/visual program, some embodiments of the user interface 1000 may be augmented to support at least partly integrating the manner in which a user would navigate such an on-screen menu 170 into the user interface 1000. In such embodiments, the touch sensor 220, with its ring shape (whether that ring shape is a rectangular ring shape, or a ring shape of a different geometry), may be configured to surround a set of controls for use in navigating the on-screen menu 170 just as the racetrack menu 150 surrounds the on-screen menu 170, itself.
  • In particular, FIG. 7 b depicts the manner in which the touch sensor 220 disposed on the casing 210 of the remote control 200 of FIG. 1 may surround navigation buttons 270 a, 270 b, 270 c and 270 d, as well as a selection button 280, that are also disposed on the casing 210. In alternate variants, other forms of one or more manually-operable controls may be surrounded by the touch sensor 220, in addition to or in place of the navigation buttons 270 a-d and the selection button 280, including and not limited to, a joystick, or a four-way rocker switch that may either surround a selection button (such as the selection button 280) or be useable as a selection button by being pressed in the middle. As a result of the ring shape of the touch sensor 220 being employed to surround the navigation buttons 270 a-d and the selection button 280, a nested arrangement of concentrically located manually operable controls is created. FIG. 7 a depicts a form of possible on-screen menu that will be familiar to those skilled in the art, including various menu items 175 that may be selected via the selection button 280, and a marker 180 that may be moved by a user among the menu items 175 via the navigation buttons 270 a-d. The concentrically nested arrangement of manually operable controls surrounded by the racetrack surface 250 defined on the touch-sensitive surface 225 of the touch sensor 220 that is disposed on the casing 210 of the remote control 200 corresponds to the similarly nested arrangement of the on-screen menu 170 surrounded by the racetrack menu 150 that is displayed on the display element 120.
  • FIG. 7 b also depicts additional controls 222, 225, 226 and 228 that may be employed to perform particular functions where it may be deemed desirable to provide at least some degree of functionality in a manner that does not require the selection of menu items to operate. In one possible variant, the controls 222, 225, 226 and 228 are operable as a power button, a mute button, a volume rocker switch and a channel increment/decrement rocker switch, respectively. FIG. 8 depicts a variant of the handheld form of the audio/visual device 100 depicted in FIG. 5 in which the touch sensor 220 is positioned so as to surround the navigation buttons 270 a-d and the selection button 280, and in which this variant of the handheld form of the audio/visual device 100 may further incorporate the controls 222, 225, 226 and 228.
  • FIG. 9 is a block diagram of a possible architecture of the controller 500 in which the controller 500 incorporates an output interface 510, a sensor interface 520, a storage 540, a processing device 550 and a source interface 590. The processing device 550 is coupled to each of the output interface 510, the sensor interface 520, the storage 540 and the source interface 590 to at least coordinate the operation of each to perform at least the above-described functions of the controller 500.
  • The processing device 550 may be any of a variety of types of processing device based on any of a variety of technologies, including and not limited to, a general purpose central processing unit (CPU), a digital signal processor (DSP), a microcontroller, or a sequencer. The storage 540 may be based on any of a variety of data storage technologies, including and not limited to, any of a wide variety of types of volatile and nonvolatile solid-state memory, magnetic media storage, and/or optical media storage. It should be noted that although the storage 540 is depicted in a manner that is suggestive of it being a single storage device, the storage 540 may be made up of multiple storage devices, each of which may be based on different technologies.
  • Each of the output interface 510, the sensor interface 520 and the source interface 590 may employ any of a variety of technologies to enable the controller 500 to communicate with other devices and/or other components of whatever audio/visual device into which the controller 500 is incorporated. More specifically, where the controller 500 is incorporated into an audio/visual device that also incorporates one or both of a display element (such as the display element 120) and at least one acoustic driver (such as the acoustic drivers 130), the output interface 510 may be of a type able to directly drive a display element with signals causing the display of the racetrack menu 150 and the display area 950 to display visual portions of audio/visual programs, and/or able to directly drive one or more acoustic drivers to acoustically output audio portions of audio/visual programs. Alternatively, where one or both of a display element and acoustic drivers are not incorporated into the same audio/visual device into which the controller 500 is incorporated, the output interface 510 may be of a type employing cabling-based and/or a wireless signaling (perhaps signaling conforming to one of the previously listed industry standards) to transmit a signal to another audio/visual device into which a display element and/or acoustic drivers are incorporated (e.g., the audio/visual device 100).
  • Similarly, where the controller 500 is incorporated into an audio/visual device into which the touch sensor 220 is also incorporated, the sensor interface 520 may be of a type able to directly receive electrical signals emanating from the touch sensor 220. With such a more direct coupling, the sensor interface 520 may directly monitor a two-dimensional array of touch-sensitive points of the touch-sensitive surface 225 of the touch sensor 220 for indications of which touch-sensitive points are being touched by a tip of a user's digit, and thereby enable the processing device 550 to employ those indications to directly determine where the touch-sensitive surface 225 is being touched. The processing device 550 is thereby enabled to determine whether or not the tip of the digit is touching a portion of the racetrack surface 250 and/or to determine the position 260. However, where the controller 500 is incorporated into a device into which the touch sensor 220 is not also incorporated (e.g., the controller 500 is incorporated into the audio/visual device 100 and the touch sensor is incorporated into the remote control 200), the sensor interface 520 may be of a type able to receive cabling-based and/or wireless signaling transmitted by that other device (e.g., infrared signals emitted by the remote control 200). With such a more remote coupling, circuitry (not shown) that is co-located with the touch sensor 220 may perform the task of directly monitoring a two-dimensional array of touch-sensitive points of the touch-sensitive surface 225, and then transmit indications of which touch-sensitive points are being touched by the tip of a user's digit to the sensor interface 520.
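The determination of whether a touched point lies on the racetrack surface can be sketched as a hit test against a rectangular ring. Representing the racetrack surface as the region between an outer and an inner rectangle is an assumption for illustration; the patent does not prescribe coordinates.

```python
# Hedged sketch: decide whether a touched point of the two-dimensional
# sensor array lies on the racetrack surface, modeled here as the ring
# between an outer rectangle and an inner cutout.

def on_racetrack(x: float, y: float, outer: tuple, inner: tuple) -> bool:
    """outer and inner are (left, top, right, bottom) rectangles; the
    racetrack surface is the ring between them."""
    ox0, oy0, ox1, oy1 = outer
    ix0, iy0, ix1, iy1 = inner
    in_outer = ox0 <= x <= ox1 and oy0 <= y <= oy1
    in_inner = ix0 < x < ix1 and iy0 < y < iy1
    return in_outer and not in_inner
```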
  • Although it is possible that the audio/visual device into which the controller 500 is incorporated may not incorporate any sources (such as the sources 901-904) from which the controller 500 receives audio/visual programs, it is deemed more likely that the audio/visual device into which the controller 500 is incorporated will incorporate one or more of such sources in addition to being capable of receiving audio/visual programs from sources not incorporated into the same audio/visual device. By way of example, it is envisioned that the controller 500 may be incorporated into an audio/visual device into which a radio frequency tuner and/or an Internet access device is also incorporated to enable access to audio/visual programs for selection and playing without the attachment of another audio/visual device, while also having the capability of being coupled to another audio/visual device to receive still other audio/visual programs. In other words, it is envisioned that the controller 500 may well be incorporated into an audio/visual device that is at least akin to a television, whether portable (e.g., as depicted in FIG. 5) or stationary (e.g., as depicted in FIG. 1). Therefore, although the source interface 590 may have any of a number of configurations to couple the controller 500 to any of a number of possible sources, it is envisioned that the source interface 590 will be configured to enable the controller 500 to be coupled to at least one source that is also incorporated into the same audio/visual device into which the controller 500 is incorporated, and to also enable the controller 500 to be coupled to at least one source that is not incorporated into the same audio/visual device.
  • Thus, the source interface 590 incorporates one or more of an electrical interface 595, an optical interface 596, a radio frequency transceiver 598 and/or an infrared receiver 599. The electrical interface 595 (if present) enables the source interface 590 to couple the controller 500 to at least one source, whether incorporated into the same audio/visual device as the controller 500, or not, to receive electrical signals (e.g., Ethernet, S-Video, USB, HDMI, etc.) conveying an audio/visual program to the controller 500. The optical interface 596 (if present) enables the source interface 590 to couple the controller 500 to at least one source to receive optical signals (e.g., Toslink) conveying an audio/visual program to the controller 500. The radio frequency transceiver 598 (if present) enables the source interface 590 to wirelessly couple the controller 500 to at least one other audio/visual device functioning as a source to receive radio frequency signals (e.g., Bluetooth, a variant of IEEE 802.11, ZigBee, etc.) conveying an audio/visual program to the controller 500 from that other audio/visual device. The infrared receiver 599 (if present) enables the source interface 590 to wirelessly couple the controller 500 to at least one other audio/visual device functioning as a source to receive infrared signals conveying an audio/visual program to the controller 500 from that other source. It should be noted that although the output interface 510 and the sensor interface 520 are depicted as separate from the source interface 590, it may be deemed advantageous, depending on the nature of the signaling supported, to combine one or both of the output interface 510 and the sensor interface 520 with the source interface 590.
  • Stored within the storage 540 are one or more of a control routine 450, a protocols data 492, a commands data 493, an audio/visual data 495, a rescaled audio/visual data 496, and menu data 498. Upon being executed by the processing device 550, a sequence of instructions of the control routine 450 causes the processing device 550 to coordinate the monitoring of the touch sensor 220 for user input, the output of the racetrack menu 150 to a display element (e.g., the display element 120), the selection of a source of an audio/visual program to be played, and one or both of the display of a visual portion of an audio/visual program on a display element on which the racetrack menu 150 is also displayed and the acoustic output of an audio portion of the audio/visual program via one or more acoustic drivers (e.g., the acoustic drivers 130).
  • Upon execution, the control routine 450 causes the processing device 550 to operate the sensor interface 520 to await indications of a user placing a tip of a digit in contact with a portion of the racetrack surface 250 defined on a surface of the touch sensor 220, moving that digit about the racetrack surface 250 and/or applying greater pressure at the position 260 on the racetrack surface 250 to make a selection. Upon receiving an indication of activity by the user involving the racetrack surface 250, the processing device 550 may be caused to operate the output interface 510 to display the racetrack menu 150 with one or more of the menu items 155 positioned thereon and surrounding the display area 950 via a display element, if the racetrack menu 150 is not already being displayed. The processing device 550 is further caused to display and position at least the marker 160 on the racetrack menu 150 in a manner that corresponds to the position 260 of the user's digit on the racetrack surface 250. Further, in response to the passage of a predetermined period of time without receiving indications of activity by the user involving the racetrack surface 250, the processing device 550 may be caused to operate the output interface 510 to cease displaying the racetrack menu 150, and to display little else on a display element other than the display area 950.
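The show-on-touch and hide-after-timeout behavior just described can be modeled as a minimal sketch in Python; the class name, the timeout value, and the modeling of positions as a fraction in [0.0, 1.0) of the way around the ring are illustrative assumptions, not part of this disclosure.

```python
MENU_TIMEOUT_S = 5.0  # hypothetical inactivity period before the menu is hidden


class RacetrackMenuState:
    """Tracks whether the racetrack menu is shown and where the marker sits.

    Positions on the racetrack surface and locations on the racetrack menu
    are both modeled as a fraction in [0.0, 1.0) of the way around the ring.
    """

    def __init__(self):
        self.menu_visible = False
        self.marker_location = None
        self.last_activity = None

    def on_touch(self, position, now):
        # Any racetrack activity shows the menu (if hidden) and places the
        # marker at the menu location corresponding to the digit's position.
        self.menu_visible = True
        self.marker_location = position % 1.0
        self.last_activity = now

    def on_tick(self, now):
        # After a predetermined period without racetrack activity, cease
        # displaying the menu, leaving only the display area on screen.
        if (self.menu_visible and self.last_activity is not None
                and now - self.last_activity >= MENU_TIMEOUT_S):
            self.menu_visible = False
            self.marker_location = None
```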
  • Upon execution, the control routine 450 causes the processing device 550 to operate the sensor interface 520 to await an indication of a selection of a menu item 155 that corresponds to selecting a source from which the user may wish an audio/visual program to be provided for playing, and may operate the source interface 590 to at least enable receipt of an audio/visual program from that selected source. Where an audio/visual program is received, the processing device 550 may be further caused to buffer audio and/or visual portions of the audio/visual program in the storage 540 as the audio/visual data 495. In embodiments in which a visual portion of an audio/visual program is rescaled to be displayed in the display area 950 at a time when the display area 950 is surrounded by the racetrack menu 150, the processing device 550 may be further caused to buffer the rescaled form of the visual portion in the storage 540 as the rescaled audio/visual data 496.
  • Upon execution, the control routine 450 causes the processing device 550 to operate the sensor interface 520 to await an indication of a selection of a menu item 155 corresponding to the selection of a command (e.g., “play” or “record” commands, numerals or other symbols specifying a radio frequency channel to tune, etc.) to be transmitted to an audio/visual device serving as a source, and may operate the source interface 590 to transmit a command to that audio/visual device (e.g., one of sources 901-904) that corresponds to a menu item 155 that has been selected. In transmitting that command, the processing device 550 may be further caused to refer to the protocols data 492 for data concerning sequences of signals that must be transmitted by the source interface 590 as part of a communications protocol in preparation for transmitting the command, and/or the processing device 550 may be further caused to refer to the commands data 493 for data concerning the sequence of signals that must be transmitted by the source interface 590 as part of transmitting the command. As will be familiar to those skilled in the art, some of the earlier listed forms of coupling make use of various protocols to organize various aspects of commands and/or data that are conveyed, including and not limited to, Ethernet, Bluetooth, IEEE-1394, USB, etc. In support of the processing device 550 responding to the selection of various ones of the menu items 155, the processing device 550 is further caused to store data correlating at least some of the various menu items with actions to be taken by the processing device 550 in response to their selection by the user in the storage 540 as the menu data 498.
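As a non-normative sketch of how the protocols data 492 and the commands data 493 might be consulted together when building a command transmission, the following assembles a protocol preamble followed by the command's signal sequence; the table contents, source names, and function name are all hypothetical.

```python
# Hypothetical stand-ins for the protocols data 492 and commands data 493:
# per-source protocol preambles and per-command signal sequences.
PROTOCOLS = {
    "disc_player": ["wake", "handshake"],  # sent before any command
}
COMMANDS = {
    ("disc_player", "play"): ["cmd_play"],
    ("disc_player", "stop"): ["cmd_stop"],
}


def signals_for(source, command):
    """Build the full signal sequence a source interface would transmit:
    the protocol preamble first, then the command's own signal sequence."""
    preamble = PROTOCOLS.get(source, [])
    body = COMMANDS.get((source, command))
    if body is None:
        raise KeyError(f"no command data for {command!r} on {source!r}")
    return preamble + body
```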
  • Amidst operating the source interface 590 to enable receipt of an audio/visual program from a source selected by the user, the processing device 550 may be caused to operate the output interface 510 to alter the quantity and/or type of menu items 155 that are displayed at various positions on the racetrack menu 150. In so doing, the processing device 550 may be further caused to store information concerning the size, shape, color and other characteristics of the racetrack menu 150, at least some of the graphical representations of the menu items 155, and/or at least one graphical representation of the marker 160 in the storage 540 as part of the menu data 498.
  • FIGS. 10 a and 10 b, taken together, depict and contrast two variants of the touch sensor 220. Both variants are depicted in perspective as distinct touch-sensitive devices that are typically mounted within a recess of a casing of a device, including either the casing 110 of any variant of the audio/visual device 100 or the casing 210 of any variant of the remote control 200. However, as those skilled in the art will readily recognize, other touch-sensitive device technologies may yield variants of the touch sensor 220 that are film-like overlays that may be positioned to overlie a portion of a casing or of a circuit board of a device. The discussion that follows centers on the shape and utilization of the touch-sensitive surface 225 of the touch sensor 220, rather than on the touch-sensing technology employed.
  • FIG. 10 a depicts the variant of the touch sensor 220 having the ring shape that has been discussed above at length that permits other manually-operable controls (e.g., the navigation buttons 270 a-d and the selection button 280) to be positioned in a manner in which they are surrounded by the ring shape of the touch sensor 220. As has already been discussed, the ring shape of this variant of the touch sensor 220 provides a form of the touch-sensitive surface 225 that is bounded by the ring shape of the touch sensor 220, and this in turn defines the ring shape of the racetrack surface 250 (where the racetrack surface 250 is defined on the touch-sensitive surface 225 to encompass substantially all of the touch-sensitive surface 225). Once again, although this variant of the touch sensor 220 is depicted as having a rectangular ring shape having four sides, other embodiments are possible in which the touch sensor 220 has a ring shape of a different geometry, such as a circular ring shape, an oval ring shape, a hexagonal ring shape, etc.
  • FIG. 10 b depicts an alternate variant of the touch sensor 220 having a rectangular shape that provides a continuous form of the touch-sensitive surface 225 that is bounded by this rectangular shape (i.e., there is no “hole” formed through the touch-sensitive surface 225). This rectangular shape more easily enables surfaces in addition to the ring-shaped racetrack surface 250 to be defined on the touch-sensitive surface 225, with the racetrack surface 250 encompassing only a portion of the touch-sensitive surface 225 and leaving open the possibility of one or more other surfaces that serve other functions also being defined thereon. In this alternate variant, the ring shape of the racetrack surface 250 may be defined by a processing device executing a sequence of instructions of a routine, such as the processing device 550 executing the control routine 450 in FIG. 9. In other words, the location of the racetrack surface 250 may be defined by a processing device first being provided with indications of which touch-sensitive points of an array of touch-sensitive points making up the touch-sensitive surface 225 are being touched by a tip of a user's digit, and second treating some of those touch-sensitive points as belonging to the racetrack surface 250 and others of those touch-sensitive points as belonging to other surfaces that are defined on the touch-sensitive surface 225 (and which serve other functions).
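One way a processing device might partition a continuous touch-sensitive surface into a peripheral racetrack band and an interior region can be sketched as follows; the function, its parameters, and the rectangular-coordinate model of the touch-sensitive points are illustrative assumptions.

```python
def classify_touch(x, y, width, height, band):
    """Classify a touch-sensitive point on a continuous rectangular surface.

    Points within `band` units of any edge are treated as belonging to the
    peripheral racetrack surface; points inside that ring are treated as
    belonging to whatever other surfaces are defined in the interior.
    """
    if not (0 <= x < width and 0 <= y < height):
        return "outside"
    on_band = (x < band or x >= width - band or
               y < band or y >= height - band)
    return "racetrack" if on_band else "interior"
```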
  • Alternatively and/or additionally, one or more ridges 227 and/or grooves (not shown) may be formed in the touch-sensitive surface 225 to at least provide a tactile guide as to where the racetrack surface 250 is defined on the touch-sensitive surface 225. Such ridges 227 may be formed integrally with the touch-sensitive surface 225, may be formed as part of a casing on which the touch sensor 220 is disposed, or may be adhered to the touch-sensitive surface 225. Further, such ridges 227 and/or grooves (not shown) may coincide with locations on the touch-sensitive surface 225 at which the touch sensor 220 is incapable of detecting the touch of a tip of a digit (i.e., the touch-sensitive surface 225 may be made up of multiple separate touch-sensitive portions, of which one is a portion having a ring shape where the racetrack surface 250 is defined).
  • More specifically, and as depicted in dotted lines in FIG. 10 b, the racetrack surface 250 is defined on the touch-sensitive surface 225 so as to be positioned about the periphery of the touch-sensitive surface 225 such that the ring shape of the racetrack surface 250 surrounds the remainder of the touch-sensitive surface 225. As also depicted, at least a portion of the touch-sensitive surface 225 that is surrounded by the racetrack surface 250 may be employed to provide the equivalent function of other manually-operable controls, such as the navigation buttons 270 a-d and the selection button 280. In other words, the navigation buttons 270 a-d and the selection button 280 may be implemented as navigation surfaces and a selection surface, respectively, defined on the touch-sensitive surface of the touch sensor 220 (perhaps by a processing device executing a sequence of instructions), along with the racetrack surface 250.
  • It should be noted that although both of the variants of the touch sensor 220 have been depicted in FIGS. 10 a and 10 b as having rectangular shapes with right angle corners, either variant may alternatively have rounded corners. Indeed, where such a variant of the touch sensor 220 has one or more of the ridges 227 and/or grooves (not shown), such ones of the ridges 227 and/or grooves may also have rounded corners, despite being depicted as having right angle corners in FIGS. 10 a and 10 b.
  • FIGS. 11 a and 11 b, taken together, depict two variants of the user interface 1000 in which more than one display area is defined within the portion of the display element 120 that is surrounded by the racetrack menu 150. These variants enable more than one visual portion of one or more selected audio/visual programs to be played on the display element 120 in a manner that enables a user to view them simultaneously. Also depicted is the manner in which various ones of the menu items 155 associated with only one of the display areas may be positioned along the racetrack menu 150 to provide a visual indication of their association with that one of the display areas.
  • More specifically, FIG. 11 a depicts a configuration that is commonly referred to as “picture-in-picture” in which a display area 970 having smaller dimensions than the display area 950 is positioned within and overlies a portion of the display area 950. As also depicted, ones of the menu items 155 that are associated with the visual portion displayed in the display area 970 are positioned along portions of the racetrack menu 150 that are located closer to the display area 970 (specifically, portions of the sides 150 b and 150 d) to provide a visual indication to the user of that one association. Further, ones of the menu items 155 that are associated with the visual portion displayed in the display area 950 are positioned along portions of the racetrack menu 150 that are further from the display area 970 (specifically, the sides 150 a and 150 c) to provide a visual indication to the user of that other association. As suggested in the depiction of FIG. 11 a, the ones of the menu items 155 that are associated with the display area 950 correspond to commands to play or to stop playing an audio/visual program, selection of an input, and radio frequency channel tuning. The ones of the menu items 155 that are associated with the display area 970 correspond to commands to play or to stop playing an audio/visual program, and selection of an input.
  • Also more specifically, FIG. 11 b depicts a configuration that is commonly referred to as “picture-by-picture” in which the display areas 950 and 970 are positioned adjacent each other (as opposed to one overlapping the other) within the portion of the display element surrounded by the racetrack menu 150. Again as depicted, ones of the menu items 155 that are associated with the visual portion displayed in the display area 950 are positioned along portions of the racetrack menu 150 that are located closer to the display area 950 (specifically, the side 150 c and portions of the sides 150 a and 150 b) to provide a visual indication to the user of that one association. Further, ones of the menu items 155 that are associated with the visual portion displayed in the display area 970 are positioned along portions of the racetrack menu 150 that are located closer to the display area 970 (specifically, the side 150 d and portions of the sides 150 a and 150 b) to provide a visual indication to the user of that other association. As suggested in the depiction of FIG. 11 b, each of the display areas 950 and 970 are associated with separate ones of the menu items 155 that correspond to commands to play or to stop playing an audio/visual program, selection of an input, and radio frequency channel tuning.
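The side-assignment convention of FIGS. 11 a and 11 b, in which each menu item is drawn along the racetrack sides nearest its associated display area, can be sketched as follows; the side labels and area names mirror the figures, but the data structures and function name are assumptions for illustration.

```python
# Hypothetical layout table for the picture-by-picture arrangement of
# FIG. 11b: for each display area, the sides of the rectangular racetrack
# menu that lie closest to it.
NEAREST_SIDES = {
    "area_950": ["150c", "150a", "150b"],  # left area: left side, halves of top/bottom
    "area_970": ["150d", "150a", "150b"],  # right area: right side, halves of top/bottom
}


def place_menu_items(items):
    """Group (name, area) menu items by associated display area so that each
    group can be drawn along the racetrack sides nearest that area."""
    placement = {}
    for name, area in items:
        placement.setdefault(area, []).append(name)
    return {area: (NEAREST_SIDES[area], names)
            for area, names in placement.items()}
```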
  • Although FIGS. 11 a and 11 b depict embodiments having only two display areas (i.e., the display areas 950 and 970) within the portion of the display element 120 surrounded by the racetrack menu 150, those skilled in the art will readily recognize that other embodiments incorporating more than two such display areas are possible, and that in such embodiments, each of the menu items 155 may be positioned along the racetrack menu 150 in a manner providing a visual indication of its association with one of those display areas. Indeed, it is envisioned that variants of the user interface 1000 are possible having 2-by-2 or larger arrays of display areas to accommodate the simultaneous display of multiple visual portions, possibly in security applications.
  • Although FIGS. 11 a and 11 b depict separate sets of the menu items 155 corresponding to commands to play and to stop playing an audio/visual program that are separately associated with each of the display areas 950 and 970, and although this suggests that the visual portions played in each of the display areas 950 and 970 must be from different audio/visual programs, it should be noted that the simultaneously displayed visual portions in the display areas 950 and 970 may be of the same audio/visual program. As those skilled in the art will readily recognize, an audio/visual program may have more than one visual portion. An example of this may be an audio/visual program including video of an event taken from more than one angle, such as an audio/visual program of a sports event where an athlete is shown in action from more than one camera angle. In such instances, there may be only one set of the menu items 155 corresponding to commands to play, fast-forward, rewind, pause and/or to stop playing the single audio/visual program, instead of the separate sets of menu items depicted in FIGS. 11 a and 11 b.
  • With the simultaneous display of multiple visual portions, there may be multiple audio portions that each correspond to a different one of the visual portions. While viewing multiple visual portions simultaneously may be relatively easy for a user insofar as the user is able to choose any one visual portion to watch, listening to multiple audio portions simultaneously may easily become overwhelming. To address this, some embodiments may select one of the audio portions to be acoustically output to the user based on the position 260 of a tip of a digit along the racetrack surface 250 (referring back to FIG. 2). Where the position 260 at which the user places a tip of a digit on the racetrack surface 250 corresponds to a portion of the racetrack menu 150 that is closer to the display area 950, then an audio portion of the audio/visual program of the visual portion being displayed in the display area 950 is acoustically output to the user. If the user then moves that tip of a digit along the racetrack surface 250 such that the position 260 is moved to a portion of the racetrack surface 250 that corresponds to a portion of the racetrack menu 150 that is closer to the display area 970, then an audio portion of the audio/visual program of the visual portion being displayed in the display area 970 is acoustically output to the user. As the user moves the tip of a digit about the racetrack surface 250 and the selection of the acoustically output audio portion changes accordingly, the corresponding position of the marker 160 along the racetrack menu 150 may serve as a visual indication to the user of which visual portion the currently selected audio portion corresponds to.
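The audio-selection behavior described above amounts to a nearest-neighbor choice around the ring: a minimal sketch, assuming ring positions are modeled as fractions in [0, 1) and assuming a hypothetical mapping from each display area to its nearest ring position.

```python
def select_audio(position, area_centers):
    """Pick which display area's audio portion to acoustically output,
    based on the digit's position along the racetrack ring and each
    display area's representative position on that ring.

    `position` and the values of `area_centers` are fractions in [0, 1)
    of the way around the ring; both are illustrative simplifications.
    """
    def ring_distance(a, b):
        # Distance around a ring wraps: the gap is the shorter way around.
        d = abs(a - b) % 1.0
        return min(d, 1.0 - d)

    return min(area_centers,
               key=lambda area: ring_distance(position, area_centers[area]))
```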
  • FIG. 12 depicts an alternate variant of the user interface 1000 in which the combined display of the racetrack menu 150 and the display area 950 surrounded by the racetrack menu 150 does not fill substantially all of the display element 120. Such an embodiment may be implemented on a more complex variant of the audio/visual device 100 capable of simultaneously performing numerous functions, some of which are entirely unrelated to selection and playing of an audio/visual program. As depicted, this leaves a display area 920, outside the racetrack menu 150 and the portion of the display element overlain by the combination of the racetrack menu 150 and the display area 950, available for such unrelated functions. Such a more complex variant of the audio/visual device 100 may be a general purpose computer system, perhaps one employed as a “media center system” or “whole house entertainment system.” In such an embodiment, the combination of the racetrack menu 150 and the display area 950 may be displayed in a window defined by an operating system having a windowing graphical user interface where the window occupies substantially less than all of the display element 120.
  • As also depicted in FIG. 12, in such an embodiment, the user may select and control the playing of an audio/visual program through the use of a variant of the touch sensor 220 having a touch-sensitive surface 225 that has a continuous rectangular shape (such as the variant of the touch sensor 220 of FIG. 10 b), as opposed to having a ring shape (such as the variant of the touch sensor 220 of FIG. 10 a). The racetrack surface 250 is defined on the touch-sensitive surface 225 in a manner that occupies the periphery of the touch-sensitive surface 225 and that surrounds a remaining portion of the touch-sensitive surface 225 that enables conventional operation of other functions of the audio/visual device 100 that may be unrelated to the selection and playing of an audio/visual program. In essence, this remaining portion of the touch-sensitive surface 225 may be employed in a conventional manner that will be familiar to those skilled in the art of graphical user interfaces in which a user moves about a graphical cursor using a tip of a digit placed on this remaining portion. Thus, the user may choose to engage in selecting audio/visual programs and controlling the playing of those audio/visual programs through the racetrack surface 250, and may choose to engage in performing other tasks unrelated to the selection and playing of audio/visual programs through the remaining portion of the touch-sensitive surface 225.
  • To provide tactile guidance to the user as to the location of the racetrack surface 250, one or more ridges 227 and/or grooves (not shown) may be formed in the touch-sensitive surface 225. In this way, the user may be aided in unerringly placing a tip of a digit on whichever one of the racetrack surface 250 or the remaining portion of the touch-sensitive surface 225 that they wish to place that tip upon, without errantly placing that tip on both, and without having to glance at the touch-sensitive surface 225 of the touch sensor 220.
  • Other implementations are within the scope of the following claims and other claims to which the applicant may be entitled.

Claims (21)

1. An apparatus comprising:
a touch sensor having a touch-sensitive surface that is manually operable with a digit of a hand of a user;
a racetrack surface having a ring shape defined on the touch-sensitive surface to encompass at least a portion of the touch-sensitive surface and operable by the digit;
a processing device; and
a storage accessible to the processing device and storing a sequence of instructions that when executed by the processing device, causes the processing device to:
receive an indication of the digit touching the racetrack surface at a position on the racetrack surface;
in response to the indication of the digit touching the racetrack surface at the position, cause a marker to be visually displayed at a location that corresponds to the position on the racetrack surface on a menu that is visually displayed on a display element;
receive an indication of the position at which the digit touches the racetrack surface being moved about the racetrack surface;
in response to the indication of the position being moved about the racetrack surface, cause the marker to be moved about the menu in a manner that corresponds to the manner in which the position is being moved about the racetrack;
receive an indication of the user increasing the pressure with which the user's digit touches the racetrack surface at the position at a time subsequent to the position being moved about the racetrack; and
in response to the indication of the user increasing pressure with which the user's digit touches the racetrack surface at the position, cause a menu item displayed in the vicinity of the marker to be selected, wherein causing the menu item to be selected comprises taking an action to cause an audio/visual program to be selected for playing.
2. The apparatus of claim 1, wherein the touch-sensitive surface of the touch sensor has a ring shape that defines the ring shape of the racetrack surface such that the racetrack surface encompasses substantially all of the touch-sensitive surface.
3. The apparatus of claim 2, further comprising:
a manually operable control; and
a casing on which the touch sensor and the manually operable control are both disposed, wherein the touch sensor is disposed on the casing relative to the manually operable control such that the touch-sensitive surface surrounds the manually operable control.
4. The apparatus of claim 3, further comprising a source interface operable to transmit commands to a source of the audio/visual program, wherein execution of the sequence of instructions by the processing device further causes the processing device to:
receive an indication of the manually-operable control being operated; and
in response to the indication of the manually-operable control being operated, operate the source interface to transmit a command to the source to cause the source to visually display a navigation menu of the source on the display element.
5. The apparatus of claim 1, wherein:
the touch-sensitive surface of the touch sensor is a continuous surface having no hole interrupting the touch-sensitive surface formed therethrough;
the ring shape of the racetrack surface is defined on the touch-sensitive surface to encompass a first portion of the touch-sensitive surface and is defined to be positioned about the periphery of the touch-sensitive surface so as to surround a second portion of the touch-sensitive surface; and
a navigation surface is defined on the touch-sensitive surface to encompass the second portion.
6. The apparatus of claim 5, wherein at least one ridge is formed in the touch-sensitive surface, and wherein the at least one ridge also at least partly defines the ring shape of the racetrack surface.
7. The apparatus of claim 5, wherein the processing device is caused by the sequence of instructions to define the first and second portions of the touch-sensitive surface by:
monitoring activity on the touch-sensitive surface;
treating the receipt of an indication of the digit touching the touch-sensitive surface at a location within the first portion as the indication of the digit touching the racetrack surface at the position;
treating the receipt of an indication of the digit touching the touch-sensitive surface at a location within the second portion as an indication of the digit operating a navigation control; and
in response to the indication of the digit touching the navigation control, causing a command to be transmitted to a source of the audio/visual program to operate a function of another menu associated with the source.
8. The apparatus of claim 1, wherein the menu has a ring shape that substantially corresponds to the ring shape of the racetrack surface.
9. The apparatus of claim 8, wherein the ring shape of both the racetrack surface and the menu is a rectangular ring shape such that the racetrack surface comprises four sides and the menu comprises four sides that correspond to the four sides of the racetrack surface.
10. The apparatus of claim 8, wherein the ring shape of the menu surrounds a display area in which a visual portion of the audio/visual program is displayed at a time when the audio/visual program is played.
11. The apparatus of claim 1, wherein execution of the sequence of instructions by the processing device further causes the processing device to cause the menu to be visually displayed in response to the indication of the digit touching the racetrack surface at the position at a time when the menu is not being visually displayed.
12. The apparatus of claim 1, wherein execution of the sequence of instructions by the processing device further causes the processing device to:
cause the menu to be visually displayed in response to the indication of the digit touching the racetrack surface followed by an indication of the digit moving about the racetrack surface in a wiping motion starting at the position at a time when the menu is not being visually displayed; and
cause a command concerning playing the audio/visual program to be transmitted to a source of the audio/visual program in response to the indication of the digit touching the racetrack surface followed by an indication of the digit ceasing to touch the racetrack surface at a time when the menu is not being visually displayed.
13. The apparatus of claim 1, wherein execution of the sequence of instructions by the processing device further causes the processing device to:
cause the menu to be visually displayed in response to the indication of the digit touching the racetrack surface followed by an indication of the digit remaining in contact with the racetrack surface for at least a predetermined period of time at a time when the menu is not being visually displayed; and
cause a command concerning playing the audio/visual program to be transmitted to a source of the audio/visual program in response to the indication of the digit touching the racetrack surface followed by an indication of the digit ceasing to touch the racetrack surface at a time when the menu is not being visually displayed.
14. A method comprising:
receiving an indication of a digit of a hand of a user touching a racetrack surface at a position on the racetrack surface, wherein the racetrack surface is defined on a touch-sensitive surface of a touch sensor to encompass at least a portion of the touch-sensitive surface and is operable by the digit;
in response to the indication of the digit touching the racetrack surface at the position, causing a marker to be visually displayed at a location that corresponds to the position on the racetrack surface on a menu that is visually displayed on a display element;
receiving an indication of the position at which the digit touches the racetrack surface being moved about the racetrack surface;
in response to the indication of the position being moved about the racetrack surface, causing the marker to be moved about the menu in a manner that corresponds to the manner in which the position is being moved about the racetrack;
receiving an indication of the user increasing the pressure with which the user's digit touches the racetrack surface at the position at a time subsequent to receiving the indication of the position being moved about the racetrack; and
in response to the indication of the user increasing pressure with which the user's digit touches the racetrack surface at the position, causing a menu item displayed in the vicinity of the marker to be selected, wherein causing the menu item to be selected comprises taking an action to cause an audio/visual program to be selected for playing.
15. The method of claim 14, further comprising defining the racetrack surface on a first portion of the touch-sensitive surface and defining a navigation surface on a second portion of the touch-sensitive surface such that the ring shape of the racetrack surface surrounds the navigation surface by:
monitoring activity on the touch-sensitive surface;
treating the receipt of an indication of the digit touching the touch-sensitive surface at a location within the first portion as the receiving of the indication of the digit touching the racetrack surface at the position;
treating the receipt of an indication of the digit touching the touch-sensitive surface at a location within the second portion as receiving an indication of the digit operating a navigation control; and
in response to the indication of the digit touching the navigation control, causing a command to be transmitted to a source of the audio/visual program to operate a function of another menu associated with the source.
16. The method of claim 14, further comprising displaying the menu on the display element with a ring shape that substantially corresponds to the ring shape of the racetrack surface.
17. The method of claim 16, further comprising surrounding a display area on the display element with the menu, wherein a visual portion of the audio/visual program is displayed in the display area at a time when the audio/visual program is played.
18. The method of claim 16, wherein the ring shape of both the racetrack surface and the menu is a rectangular ring shape such that the racetrack surface comprises four sides and the menu comprises four sides that correspond to the four sides of the racetrack surface.
19. The method of claim 14, further comprising displaying the menu on the display element in response to the indication of the digit touching the racetrack surface at the position at a time when the menu is not being visually displayed.
20. The method of claim 14, further comprising:
displaying the menu on the display element in response to the indication of the digit touching the racetrack surface followed by receiving an indication of the digit moving about the racetrack surface in a wiping motion starting at the position at a time when the menu is not being visually displayed; and
transmitting a command concerning playing the audio/visual program to a source of the audio/visual program in response to the indication of the digit touching the racetrack surface followed by receiving an indication of the digit ceasing to touch the racetrack surface at a time when the menu is not being visually displayed.
21. The method of claim 14, further comprising:
displaying the menu on the display element in response to the indication of the digit touching the racetrack surface followed by receiving an indication of the digit remaining in contact with the racetrack surface for at least a predetermined period of time at a time when the menu is not being visually displayed; and
transmitting a command concerning playing the audio/visual program to a source of the audio/visual program in response to the indication of the digit touching the racetrack surface followed by receiving an indication of the digit ceasing to touch the racetrack surface at a time when the menu is not being visually displayed.
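Claims 19 through 21 distinguish several outcomes from the same initial touch while the menu is hidden: a wiping motion along the racetrack, or a sustained hold, causes the menu to be displayed, while a quick touch-and-release instead transmits a playback command to the source. A sketch of that disambiguation, with the event format and both thresholds assumed for illustration:

```python
def interpret_gesture(events, hold_threshold=0.5, move_threshold=3.0):
    """Decide between displaying the menu and transmitting a command
    when the menu is not currently displayed (claims 19-21).

    `events` is a list of (timestamp, kind, position) tuples, kind in
    {"down", "move", "up"}; positions are 1-D offsets along the
    racetrack. All names and thresholds are hypothetical.
    """
    down_time = down_pos = None
    for t, kind, pos in events:
        if kind == "down":
            down_time, down_pos = t, pos
        elif kind == "move" and down_pos is not None:
            # A wiping motion along the racetrack reveals the menu.
            if abs(pos - down_pos) >= move_threshold:
                return "display_menu"
        elif kind == "up" and down_time is not None:
            # Remaining in contact long enough also reveals the menu;
            # a quick tap-and-release instead transmits a command
            # concerning playing the audio/visual program.
            if t - down_time >= hold_threshold:
                return "display_menu"
            return "transmit_play_command"
    return None
```

Usage: a tap of 0.2 s with no movement yields `"transmit_play_command"`, while the same touch followed by a 5-unit wipe yields `"display_menu"`.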
US12/613,943 2009-11-06 2009-11-06 Audio/Visual Device Touch-Based User Interface Abandoned US20110109560A1 (en)

Priority Applications (11)

Application Number Priority Date Filing Date Title
US12/613,943 US20110109560A1 (en) 2009-11-06 2009-11-06 Audio/Visual Device Touch-Based User Interface
US12/887,479 US8669949B2 (en) 2009-11-06 2010-09-21 Touch-based user interface touch sensor power
US12/886,837 US20110113371A1 (en) 2009-11-06 2010-09-21 Touch-Based User Interface User Error Handling
US12/886,802 US8350820B2 (en) 2009-11-06 2010-09-21 Touch-based user interface user operation accuracy enhancement
US12/887,499 US8638306B2 (en) 2009-11-06 2010-09-21 Touch-based user interface corner conductive pad
US12/887,484 US8686957B2 (en) 2009-11-06 2010-09-21 Touch-based user interface conductive rings
US12/886,998 US8692815B2 (en) 2009-11-06 2010-09-21 Touch-based user interface user selection accuracy enhancement
EP10777200A EP2497017A1 (en) 2009-11-06 2010-11-05 Audio/visual device touch-based user interface
PCT/US2010/055628 WO2011057076A1 (en) 2009-11-06 2010-11-05 Audio/visual device touch-based user interface
US13/414,436 US8736566B2 (en) 2009-11-06 2012-03-07 Audio/visual device touch-based user interface
US13/448,657 US9201584B2 (en) 2009-11-06 2012-04-17 Audio/visual device user interface with tactile feedback

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/613,943 US20110109560A1 (en) 2009-11-06 2009-11-06 Audio/Visual Device Touch-Based User Interface

Related Child Applications (8)

Application Number Title Priority Date Filing Date
US12/886,837 Continuation-In-Part US20110113371A1 (en) 2009-11-06 2010-09-21 Touch-Based User Interface User Error Handling
US12/886,998 Continuation-In-Part US8692815B2 (en) 2009-11-06 2010-09-21 Touch-based user interface user selection accuracy enhancement
US12/886,802 Continuation-In-Part US8350820B2 (en) 2009-11-06 2010-09-21 Touch-based user interface user operation accuracy enhancement
US12/887,499 Continuation-In-Part US8638306B2 (en) 2009-11-06 2010-09-21 Touch-based user interface corner conductive pad
US12/887,484 Continuation-In-Part US8686957B2 (en) 2009-11-06 2010-09-21 Touch-based user interface conductive rings
US12/887,479 Continuation-In-Part US8669949B2 (en) 2009-11-06 2010-09-21 Touch-based user interface touch sensor power
US13/414,436 Continuation US8736566B2 (en) 2009-11-06 2012-03-07 Audio/visual device touch-based user interface
US13/448,657 Continuation-In-Part US9201584B2 (en) 2009-11-06 2012-04-17 Audio/visual device user interface with tactile feedback

Publications (1)

Publication Number Publication Date
US20110109560A1 true US20110109560A1 (en) 2011-05-12

Family

ID=43973802

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/613,943 Abandoned US20110109560A1 (en) 2009-11-06 2009-11-06 Audio/Visual Device Touch-Based User Interface
US13/414,436 Expired - Fee Related US8736566B2 (en) 2009-11-06 2012-03-07 Audio/visual device touch-based user interface

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/414,436 Expired - Fee Related US8736566B2 (en) 2009-11-06 2012-03-07 Audio/visual device touch-based user interface

Country Status (1)

Country Link
US (2) US20110109560A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110099513A1 (en) * 2009-10-23 2011-04-28 Ameline Ian Ross Multi-Touch Graphical User Interface for Interacting with Menus on a Handheld Device
EP2584403A3 (en) * 2011-10-21 2013-11-06 Disney Enterprises, Inc. Multi-user interaction with handheld projectors
DE102012217783A1 (en) * 2012-09-28 2014-04-03 Siemens Aktiengesellschaft Input element e.g. selector switch for hearing aids, has runner that is provided with recess or window through which the current set value or condition on runner or within the runner is readable
US20150293616A1 (en) * 2014-04-09 2015-10-15 Wei-Chih Cheng Operating system with shortcut touch panel having shortcut function
CN106445378A (en) * 2016-09-12 2017-02-22 青岛海信电器股份有限公司 Display control method and device of touch menu, and touch display equipment
JP2017164908A (en) * 2016-03-14 2017-09-21 セイコーエプソン株式会社 Printer, electronic apparatus, control program, and operational parameter setting method for printer
CN109582204A (en) * 2012-11-23 2019-04-05 三星电子株式会社 Show equipment, input equipment and its control method
CN110309238A (en) * 2018-03-08 2019-10-08 上海博泰悦臻网络技术服务有限公司 Point of interest interactive approach, system, electric terminal and storage medium in music
US10873718B2 (en) 2014-04-02 2020-12-22 Interdigital Madison Patent Holdings, Sas Systems and methods for touch screens associated with a display

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102334796B1 (en) * 2015-07-10 2021-12-07 삼성디스플레이 주식회사 Display apparatus
US9619032B1 (en) * 2015-10-16 2017-04-11 International Business Machines Corporation Accessibility path guiding through microfluidics on a touch screen

Citations (96)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4786767A (en) * 1987-06-01 1988-11-22 Southwall Technologies Inc. Transparent touch panel switch
US4825209A (en) * 1985-03-06 1989-04-25 Alps Electric Co., Ltd Remote control apparatus
US5222895A (en) * 1990-03-13 1993-06-29 Joerg Fricke Tactile graphic computer screen and input tablet for blind persons using an electrorheological fluid
US5327160A (en) * 1991-05-09 1994-07-05 Asher David J Touch sensitive user interface for television control
US5367199A (en) * 1992-05-01 1994-11-22 Triax Technologies Sliding contact control switch pad
US5371553A (en) * 1992-03-11 1994-12-06 Sony Corporation Monitor apparatus for selecting audio-visual units and operating modes from a control window
US5408275A (en) * 1992-05-25 1995-04-18 Goldstar Co., Ltd. Apparatus and method for controlling television receiver
US5508703A (en) * 1992-09-14 1996-04-16 Smk Corporation Membrane switch having a rotary motion detection function
US5545857A (en) * 1994-07-27 1996-08-13 Samsung Electronics Co. Ltd. Remote control method and apparatus thereof
US5589893A (en) * 1994-12-01 1996-12-31 Zenith Electronics Corporation On-screen remote control of a television receiver
US5691778A (en) * 1995-08-31 1997-11-25 Samsung Electronics Co., Ltd. Double-wide television set having double-deck videocassette recorder and CD-OK system and method of controlling the same using graphic-remote controller
US5790820A (en) * 1995-06-07 1998-08-04 Vayda; Mark Radial graphical menuing system
US5990890A (en) * 1997-08-25 1999-11-23 Liberate Technologies System for data entry and navigation in a user interface
US6067081A (en) * 1996-09-18 2000-05-23 Vdo Adolf Schindling Ag Method for producing tactile markings on an input surface and system for carrying out of the method
US6094156A (en) * 1998-04-24 2000-07-25 Henty; David L. Handheld remote control system with keyboard
US6118435A (en) * 1997-04-10 2000-09-12 Idec Izumi Corporation Display unit with touch panel
US6128009A (en) * 1996-05-29 2000-10-03 Sony Corporation Program guide controller
US6215417B1 (en) * 1997-11-04 2001-04-10 Allen M. Krass Electronic equipment interface with command preselection indication
US6218966B1 (en) * 1998-11-05 2001-04-17 International Business Machines Corporation Tactile feedback keyboard
US6313851B1 (en) * 1997-08-27 2001-11-06 Microsoft Corporation User friendly remote system interface
US6317128B1 (en) * 1996-04-18 2001-11-13 Silicon Graphics, Inc. Graphical user interface with anti-interference outlines for enhanced variably-transparent applications
US6396523B1 (en) * 1999-07-29 2002-05-28 Interlink Electronics, Inc. Home entertainment device remote control
US20020078445A1 (en) * 2000-07-11 2002-06-20 Imran Sharif Internet appliance for interactive audio/video display using a remote control unit for user input
US6445284B1 (en) * 2000-05-10 2002-09-03 Juan Manuel Cruz-Hernandez Electro-mechanical transducer suitable for tactile display and article conveyance
US6448986B1 (en) * 1999-09-07 2002-09-10 Spotware Technologies Llc Method and system for displaying graphical objects on a display screen
US20020154888A1 (en) * 2001-04-19 2002-10-24 Digeo, Inc. Remote control device with integrated display screen for controlling a digital video recorder
US20020180707A1 (en) * 2001-05-29 2002-12-05 Alps Electric Co., Ltd. Input device capable of button input and coordinate input on the same operating surface
US20030022701A1 (en) * 2001-07-25 2003-01-30 Aloke Gupta Buttonless communication device with touchscreen display
US6538643B2 (en) * 2001-04-25 2003-03-25 Interlink Electronics, Inc. Remote control having a touch pad operable in a pad-to-screen mapping mode for highlighting preselected parts of a slide displayed on a display screen
US20030058265A1 (en) * 2001-08-28 2003-03-27 Robinson James A. System and method for providing tactility for an LCD touchscreen
US6570994B1 (en) * 1999-03-25 2003-05-27 Agere Systems Inc. Field layer speaker for consumer products
US6574083B1 (en) * 1997-11-04 2003-06-03 Allen M. Krass Electronic equipment interface with command preselection indication
US6628195B1 (en) * 1999-11-10 2003-09-30 Jean-Max Coudon Tactile stimulation device for use by a deaf person
US6633281B2 (en) * 1999-12-10 2003-10-14 Sun Wave Technology Corp. Intelligent touch-type universal remote control
US6636202B2 (en) * 2001-04-27 2003-10-21 International Business Machines Corporation Interactive tactile display for computer screen
US6701525B1 (en) * 1998-01-30 2004-03-02 Koninklijke Philips Electronics N.V. Method for operating an audio/video set as based on hierarchical menuing of selectable bulletized and stringed items and an audio/video set arranged for practicing the method
US6750803B2 (en) * 2001-02-23 2004-06-15 Interlink Electronics, Inc. Transformer remote control
US6765557B1 (en) * 2000-04-10 2004-07-20 Interlink Electronics, Inc. Remote control having touch pad to screen mapping
US20040252109A1 (en) * 2002-04-11 2004-12-16 Synaptics, Inc. Closed-loop sensor on a solid-state object position detector
US20040252104A1 (en) * 2003-06-10 2004-12-16 Fujitsu Component Limited Inputting device stimulating tactile sense of operator thereof
US6834373B2 (en) * 2001-04-24 2004-12-21 International Business Machines Corporation System and method for non-visually presenting multi-part information pages using a combination of sonifications and tactile feedback
US20050030292A1 (en) * 2001-12-12 2005-02-10 Diederiks Elmo Marcus Attila Display system with tactile guidance
US20050030434A1 (en) * 2003-07-23 2005-02-10 Norifumi Sata Remote control transmitter and transmitting and receiving device using the same
US20050054390A1 (en) * 2001-11-28 2005-03-10 Juhani Tuovinen Piezoelectric user interface
US20050081164A1 (en) * 2003-08-28 2005-04-14 Tatsuya Hama Information processing apparatus, information processing method, information processing program and storage medium containing information processing program
US20050151727A1 (en) * 2004-01-08 2005-07-14 Intel Corporation Wireless enabled touch pad pointing device with integrated remote control function
US6957386B2 (en) * 1996-07-26 2005-10-18 Sony Corporation Apparatus and method for controlling display of electrical program guide
US20050264538A1 (en) * 2004-05-25 2005-12-01 I-Hau Yeh Remote controller
US7009595B2 (en) * 2002-01-03 2006-03-07 United States Of America Extended refreshable tactile graphic array for scanned tactile display
US7036091B1 (en) * 2001-09-24 2006-04-25 Digeo, Inc. Concentric curvilinear menus for a graphical user interface
US7034814B2 (en) * 2001-07-13 2006-04-25 Apple Computer, Inc. Methods and apparatuses using control indicators for data processing systems
US20060119586A1 (en) * 2004-10-08 2006-06-08 Immersion Corporation, A Delaware Corporation Haptic feedback for button and scrolling action simulation in touch input devices
US20060119585A1 (en) * 2004-12-07 2006-06-08 Skinner David N Remote control with touchpad and method
US7139623B2 (en) * 2004-12-23 2006-11-21 Honeywell International Inc. Method and apparatus for use of multiple control systems
US7170428B2 (en) * 2002-06-14 2007-01-30 Nokia Corporation Electronic device and method of managing its keyboard
US7174518B2 (en) * 2001-10-11 2007-02-06 Lg Electronics Inc. Remote control method having GUI function, and system using the same
US20070105591A1 (en) * 2005-11-09 2007-05-10 Lifemost Technology Co., Ltd. Wireless handheld input device
US7269484B2 (en) * 2004-09-09 2007-09-11 Lear Corporation Vehicular touch switches with adaptive tactile and audible feedback
US20070220418A1 (en) * 2004-05-10 2007-09-20 Matsushita Electric Industrial Co., Ltd. User Interface Apparatus, Program and Recording Medium
US20070231901A1 (en) * 2005-12-02 2007-10-04 Shuichi Takayama Microfluidic cell culture media
US20070243627A1 (en) * 2004-09-30 2007-10-18 Shuichi Takayama Computerized control method and system for microfluidics and computer program product for use therein
US20070256029A1 (en) * 2006-05-01 2007-11-01 Rpo Pty Llimited Systems And Methods For Interfacing A User With A Touch-Screen
US20080030463A1 (en) * 1995-03-27 2008-02-07 Forest Donald K User interface apparatus and method
US7336266B2 (en) * 2003-02-20 2008-02-26 Immersion Corproation Haptic pads for use with user-interface devices
US20080047765A1 (en) * 2005-04-07 2008-02-28 Microsoft Corporation Circular Touch Sensor
US20080058022A1 (en) * 2006-09-04 2008-03-06 Kwang-Hyun Ahn Operation mode conversion device, mobile communication terminal having the operation mode conversion device and method for converting operation mode using the same
US20080161065A1 (en) * 2006-12-13 2008-07-03 Lg Electronics Inc. Mobile communication terminal for providing tactile interface
US20080251364A1 (en) * 2007-04-11 2008-10-16 Nokia Corporation Feedback on input actuator
US7453442B1 (en) * 2002-12-03 2008-11-18 Ncr Corporation Reconfigurable user interface systems
US20090002328A1 (en) * 2007-06-26 2009-01-01 Immersion Corporation, A Delaware Corporation Method and apparatus for multi-touch tactile touch panel actuator mechanisms
US20090109183A1 (en) * 2007-10-30 2009-04-30 Bose Corporation Remote Control of a Display
US7548232B2 (en) * 2000-01-19 2009-06-16 Immersion Corporation Haptic interface for laptop computers and other portable devices
US20090153288A1 (en) * 2007-12-12 2009-06-18 Eric James Hope Handheld electronic devices with remote control functionality and gesture recognition
US20090181724A1 (en) * 2008-01-14 2009-07-16 Sony Ericsson Mobile Communications Ab Touch sensitive display with ultrasonic vibrations for tactile feedback
US20090195512A1 (en) * 2008-02-05 2009-08-06 Sony Ericsson Mobile Communications Ab Touch sensitive display with tactile feedback
US7574672B2 (en) * 2006-01-05 2009-08-11 Apple Inc. Text entry interface for a portable communication device
US20090210815A1 (en) * 2008-02-14 2009-08-20 Creative Technology Ltd Apparatus and method for information input in an electronic device with display
US7589714B2 (en) * 2004-06-23 2009-09-15 Pioneer Corporation Tactile display device and touch panel apparatus with tactile display function using electrorheological fluid
US20090275406A1 (en) * 2005-09-09 2009-11-05 Wms Gaming Inc Dynamic user interface in a gaming system
US7616192B2 (en) * 2005-07-28 2009-11-10 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Touch device and method for providing tactile feedback
US7663604B2 (en) * 2002-08-29 2010-02-16 Sony Corporation Input device and electronic device using the input device
US20100052880A1 (en) * 2007-04-12 2010-03-04 Nokia Corporation Keypad
US7701445B2 (en) * 2002-10-30 2010-04-20 Sony Corporation Input device and process for manufacturing the same, portable electronic apparatus comprising input device
US20100146451A1 (en) * 2008-12-09 2010-06-10 Sungkyunkwan University Foundation For Corporate Collaboration Handheld terminal capable of supporting menu selection using dragging on touch screen and method of controlling the same
US20100144395A1 (en) * 2008-12-05 2010-06-10 Sony Ericsson Mobile Communications Ab Mobile terminal and computer program
US20100156843A1 (en) * 2008-12-23 2010-06-24 Research In Motion Limited Piezoelectric actuator arrangement
US7745211B2 (en) * 2003-03-10 2010-06-29 The Regents Of The University Of Michigan Integrated microfluidic control employing programmable tactile actuators
US20100171715A1 (en) * 2009-01-08 2010-07-08 Cody George Peterson Tactile Surface
US7769417B2 (en) * 2002-12-08 2010-08-03 Immersion Corporation Method and apparatus for providing haptic feedback to off-activating area
US20100201652A1 (en) * 2009-02-12 2010-08-12 Sony Ericsson Mobile Communications Ab Embedded piezoelectric elements in touch panels
US20100231550A1 (en) * 2009-03-12 2010-09-16 Immersion Corporation Systems and Methods for Friction Displays and Additional Haptic Effects
US20100231367A1 (en) * 2009-03-12 2010-09-16 Immersion Corporation Systems and Methods for Providing Features in a Friction Display
US20100231539A1 (en) * 2009-03-12 2010-09-16 Immersion Corporation Systems and Methods for Interfaces Featuring Surface-Based Haptic Effects
US20100231541A1 (en) * 2009-03-12 2010-09-16 Immersion Corporation Systems and Methods for Using Textures in Graphical User Interface Widgets
US7825903B2 (en) * 2005-05-12 2010-11-02 Immersion Corporation Method and apparatus for providing haptic effects to a touch panel
US20110066980A1 (en) * 2009-09-16 2011-03-17 International Business Machines Corporation Placement of items in radial menus

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9014130D0 (en) 1990-06-25 1990-08-15 Hewlett Packard Co User interface
US5889506A (en) 1996-10-25 1999-03-30 Matsushita Electric Industrial Co., Ltd. Video user's environment
US6104334A (en) 1997-12-31 2000-08-15 Eremote, Inc. Portable internet-enabled controller and information browser for consumer devices
WO2000033571A1 (en) 1998-11-30 2000-06-08 Sony Corporation Information providing device and method
EP1923848B1 (en) 2002-03-28 2010-06-23 Igt System for interfacing a user and a casino gaming machine
JP2003308009A (en) 2002-04-17 2003-10-31 Nippon Hoso Kyokai <Nhk> Sensitivity information presenting apparatus with tactile stimulation
FR2847067B1 (en) 2002-11-08 2007-12-07 Delphi Tech Inc TOUCH RETURN FOR SENSITIVE SWITCH
FR2851347B1 (en) 2003-02-18 2005-10-21 Giat Ind Sa MACHINE INTERFACE DEVICE WITH TACTILE INFORMATION RETURN FOR TOUCH SLAB
KR101031753B1 (en) 2003-08-14 2011-04-29 파나소닉 주식회사 User interface system, program, and recording medium
JP3996144B2 (en) 2004-05-11 2007-10-24 日本航空電子工業株式会社 2-stage switch unit
JP2007066031A (en) 2005-08-31 2007-03-15 Sharp Corp Information input system
EP1946544A4 (en) 2005-10-03 2009-12-02 Thomson Licensing Method and apparatus for enabling channel selection
KR101241907B1 (en) 2006-09-29 2013-03-11 엘지전자 주식회사 Remote controller and Method for generation of key code on remote controller thereof
KR100896055B1 (en) 2007-01-15 2009-05-07 엘지전자 주식회사 Mobile terminal having a rotating input device and display method thereof
ATE497204T1 (en) 2007-06-08 2011-02-15 Research In Motion Ltd HAPTIC DISPLAY FOR AN ELECTRONIC HANDHELD DEVICE
WO2009039433A1 (en) 2007-09-20 2009-03-26 Incept Biosystems Inc. Analytical microfluidic culture system
DE602007001536D1 (en) 2007-11-02 2009-08-20 Research In Motion Ltd Electronic device and touchscreen
EP2169515A1 (en) 2008-09-26 2010-03-31 Research In Motion Limited Portable electronic device comprising tactile touch screen and method of controlling the portable electronic device


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110099513A1 (en) * 2009-10-23 2011-04-28 Ameline Ian Ross Multi-Touch Graphical User Interface for Interacting with Menus on a Handheld Device
US10101898B2 (en) * 2009-10-23 2018-10-16 Autodesk, Inc. Multi-touch graphical user interface for interacting with menus on a handheld device
EP2584403A3 (en) * 2011-10-21 2013-11-06 Disney Enterprises, Inc. Multi-user interaction with handheld projectors
US8902158B2 (en) 2011-10-21 2014-12-02 Disney Enterprises, Inc. Multi-user interaction with handheld projectors
DE102012217783A1 (en) * 2012-09-28 2014-04-03 Siemens Aktiengesellschaft Input element e.g. selector switch for hearing aids, has runner that is provided with recess or window through which the current set value or condition on runner or within the runner is readable
CN109582204A (en) * 2012-11-23 2019-04-05 三星电子株式会社 Show equipment, input equipment and its control method
US10873718B2 (en) 2014-04-02 2020-12-22 Interdigital Madison Patent Holdings, Sas Systems and methods for touch screens associated with a display
US20150293616A1 (en) * 2014-04-09 2015-10-15 Wei-Chih Cheng Operating system with shortcut touch panel having shortcut function
US9274620B2 (en) * 2014-04-09 2016-03-01 Wei-Chih Cheng Operating system with shortcut touch panel having shortcut function
JP2017164908A (en) * 2016-03-14 2017-09-21 セイコーエプソン株式会社 Printer, electronic apparatus, control program, and operational parameter setting method for printer
CN106445378A (en) * 2016-09-12 2017-02-22 青岛海信电器股份有限公司 Display control method and device of touch menu, and touch display equipment
CN110309238A (en) * 2018-03-08 2019-10-08 上海博泰悦臻网络技术服务有限公司 Point of interest interactive approach, system, electric terminal and storage medium in music

Also Published As

Publication number Publication date
US8736566B2 (en) 2014-05-27
US20120162542A1 (en) 2012-06-28

Similar Documents

Publication Publication Date Title
US9172897B2 (en) Audio/visual device graphical user interface
US8601394B2 (en) Graphical user interface user customization
US8736566B2 (en) Audio/visual device touch-based user interface
US8638306B2 (en) Touch-based user interface corner conductive pad
US8350820B2 (en) Touch-based user interface user operation accuracy enhancement
US8692815B2 (en) Touch-based user interface user selection accuracy enhancement
US20110113371A1 (en) Touch-Based User Interface User Error Handling
US8669949B2 (en) Touch-based user interface touch sensor power
US9354726B2 (en) Audio/visual device graphical user interface submenu
US9001044B2 (en) Method for inputting user command and video apparatus employing the same
US8686957B2 (en) Touch-based user interface conductive rings
US20090262084A1 (en) Display control system providing synchronous video information
JP5565142B2 (en) Information processing apparatus, information processing apparatus control method, and recording medium storing information processing apparatus control program
US20130104082A1 (en) Audio/visual device applications graphical user interface
KR20100067296A (en) Main image processing apparatus, sub image processing apparatus and control method thereof
US9483936B2 (en) Remote controller and control method thereof, display device and control method thereof, display system and control method thereof
US9930392B2 (en) Apparatus for displaying an image and method of operating the same
US20210019027A1 (en) Content transmission device and mobile terminal for performing transmission of content
US9201584B2 (en) Audio/visual device user interface with tactile feedback
US20170180777A1 (en) Display apparatus, remote control apparatus, and control method thereof
JP5802312B2 (en) Broadcast receiving apparatus, extended function execution apparatus, control method for broadcast receiving apparatus, and information processing apparatus
WO2011057076A1 (en) Audio/visual device touch-based user interface

Legal Events

Date Code Title Description
AS Assignment

Owner name: BOSE CORPORATION, MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CARVAJAL, SANTIAGO;SAKALOWSKY, JOHN MICHAEL;SIGNING DATES FROM 20091118 TO 20091125;REEL/FRAME:023588/0317

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION