US20120274547A1 - Techniques for content navigation using proximity sensing - Google Patents

Techniques for content navigation using proximity sensing

Info

Publication number
US20120274547A1
Authority
US
United States
Prior art keywords
user
display
appendage
representation
input system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/284,810
Inventor
Eric Raeber
Jean-Michel Chardon
Frederic Vexo
Nicolas Chauvin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Logitech Inc
Original Assignee
Logitech Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Logitech Inc
Priority to US13/284,810
Publication of US20120274547A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/93Document management systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/48Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/487Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using geographical or spatial information, e.g. location
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/60Information retrieval; Database structures therefor; File system structures therefor of audio data
    • G06F16/65Clustering; Classification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/42208Display device provided on the remote control
    • H04N21/42209Display device provided on the remote control for displaying non-command information, e.g. electronic program guide [EPG], e-mail, messages or a second television channel
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/4222Remote control device emulator integrated into a non-television apparatus, e.g. a PDA, media center or smart toy
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/42224Touch pad or touch panel provided on the remote control
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/43Querying
    • G06F16/432Query formulation
    • G06F16/433Query formulation using audio data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/60Information retrieval; Database structures therefor; File system structures therefor of audio data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70Information retrieval; Database structures therefor; File system structures therefor of video data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4781Games
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4782Web browsing, e.g. WebTV
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4786Supplemental services, e.g. displaying phone caller identification, shopping application e-mailing

Definitions

  • GUI: graphical user interface
  • input devices that incorporate touch are separated from a display on which a GUI is displayed.
  • touch pads: In typical uses, a user touches a touch pad and moves one or more fingers to cause a cursor displayed on the GUI to move accordingly. A button proximate to the touch pad, or sometimes the touch pad itself, can be tapped to cause a GUI element to be selected. Other ways of interacting with the touch pad and any buttons associated with the touch pad may be used to interact with the GUI accordingly.
  • Techniques of the present disclosure provide for interaction with graphical user interfaces using input devices that incorporate touch and/or proximity sensing. Such techniques provide advantages, including, in some embodiments, allowing users to obtain a touch user-input experience with displays that are not necessarily touch-input enabled. In an embodiment, a computer-implemented method of manipulating a display is described.
  • the method includes detecting a set of one or more user appendages proximate to and out of contact with a set of one or more corresponding sensor locations of a sensing region of a remote control device; determining, based at least in part on a mapping of the sensing region to a display region of a remote display device, a set of one or more display locations of the display region; and transmitting a signal that causes the display device to display a set of one or more representations of the detected set of one or more user appendages according to the determined set of one or more display locations.
  • the mapping may be an absolute mapping.
  • the method further includes: calculating at least one measurement that corresponds to distance of the detected appendage from the sensing region.
  • displaying the representation of the detected appendage may include displaying the representation of the detected appendage with one or more color characteristics that are based at least in part on the measurement.
  • the color characteristics may be, for instance, brightness, hue, opacity, and the like.
  • the display may display a graphical user interface and displaying the set of one or more representations may include overlaying the one or more representations on the graphical user interface or otherwise visually distinguishing locations corresponding to user interaction with the sensing region from other locations.
  • the graphical user interface includes one or more selectable options that each correspond to a selection region of the sensing region.
  • the method may further comprise detecting a contact event by at least one appendage of the set of one or more appendages.
  • the detected contact event may correspond to a contact location of the sensing region.
  • the graphical user interface may be updated according to the corresponding selectable option.
  • the displayed set of one or more representations may be changed upon detection of the contact event for which the contact location corresponds to the selection region of the corresponding selectable option.
  • Changing the displayed set of one or more representations may include removing the set of one or more representations from the display. At least one of the representations may resemble the corresponding appendage and the displayed set of one or more representations may include at least two representations of different forms, such as two different fingers.
  • a computer-implemented method of manipulating a display includes calculating measurements that correspond to distances of a user appendage from a sensing region of a remote control device as the user moves the user appendage relative to the sensing region; and taking one or more actions that cause a display device to display a representation of the appendage such that the representation has one or more color characteristics that vary based at least in part on the calculated measurements.
  • taking the one or more actions may include transmitting remotely generated signals to the display device.
  • the representation may have a transparent appearance when the user appendage is out of contact with the sensing region and an opaque appearance when the user appendage is in contact with the sensing region.
  • the method may also include determining location changes of the sensing region with which the user appendage is proximate or in contact.
  • taking the one or more actions may include updating locations of the representation on the display.
  • a user input system may be a set of one or more devices that collectively operate to change a display according to user input.
  • the user input system includes one or more processors and memory including instructions that, when executed collectively by the one or more processors, cause the user input system to cause a display device to update according to user input.
  • the display device may display a representation of a user appendage on a display of the display device and change one or more color characteristics of the representation based at least in part on changes in distances of the user appendage from a sensing region of a remote control device.
  • the instructions may further cause the user input system to cause the display device to change a location of the representation based at least in part on movement of the user appendage relative to the sensing region.
  • the instructions may also further cause the user input system to cause the display device to update the display according to a predefined action of multiple user appendages in connection with the sensing region.
  • the display device may be separate from the user input system.
  • the display device may be a television and the user input system may be a remote control device (or remote control system) that operates the television.
  • the user input system allows for user interaction with a graphical user interface.
  • the user input system may include, for example, one or more processors and memory including instructions that, when executed collectively by the one or more processors, cause the user input system to cause a display device to display information according to user input.
  • the display device may, for instance, display a representation of a user appendage at a display location of a display of the display device where the display location is based at least in part on an absolute mapping of display locations of the display device to sensing locations of a sensing region of a sensing device. At least when the appendage moves relative to and out of contact with the sensing device, the display device may change the display location based at least in part on the absolute mapping.
  • Variations of the user input system considered as being within the scope of the present disclosure include, but are not limited to, the instructions further causing the user input system to cause the display device to change a location of the representation based at least in part on movement of the user appendage relative to the sensing region.
  • the instructions may further cause the user input system to identify the user appendage from a set of potential user appendages and/or cause the user input system to update the display based at least in part on detection of an event that is uncausable using at least one other of the potential user appendages.
  • the sensing device may be a component of a device that is physically disconnected from the display device and/or the sensing device may be a remote control device for the display device.
  • a display device includes one or more processors and memory including instructions that, when executed collectively by the one or more processors, cause the display device to display information according to user input.
  • the display device may, for instance, display a graphical user interface, receive signals corresponding to user interaction with a sensing region of a sensing input device, the signals being based at least in part on a number of dimensions of user interaction that is greater than two, and change the graphical user interface according to the received signals.
  • the signals may be generated by an intermediate device that receives other signals from a remote control device.
  • the sensing input device may be separate from the display device.
  • Changing the graphical user interface may include updating an appearance characteristic of a representation of an object used to interact with the sensing input device.
  • Changing the graphical user interface may also include updating, on the graphical user interface, a location of a representation of an object used to interact with the sensing input device.
  • the user interaction may include contactless interaction with the sensing input device.
  • FIG. 1 shows an illustrative example of an environment in which various embodiments may be practiced.
  • FIG. 2 shows an illustration of a remote control device and a display device in accordance with at least one embodiment.
  • FIG. 3 shows the remote control device and the display device of FIG. 2 being navigated by a user in accordance with at least one embodiment.
  • FIG. 4 shows an illustrative example of a process for facilitating user navigation of an interface in accordance with at least one embodiment.
  • FIG. 5 shows an illustrative example of maintaining an interface in accordance with at least one embodiment.
  • FIG. 6 shows an illustration of an aspect of the invention in accordance with at least one embodiment.
  • FIG. 7 shows the aspect of FIG. 6 as it changes according to user movement in accordance with at least one embodiment.
  • FIG. 1 shows an environment 100 in which various embodiments may be practiced.
  • environment 100 utilizes a content appliance 102 in order to provide content to a user.
  • the content may be provided to the user in various ways.
  • the environment 100 in FIG. 1 includes a television 104 , an audio system 106 and a mobile device 108 (such as a mobile phone) that may be used to provide content to a user.
  • Content may include video content, audio content, text content, and generally any type of content that may be provided audibly, visually, or otherwise to a user.
  • Other devices may also be used in the environment 100 .
  • the environment 100 includes an audio visual (AV) receiver 110 which operates in connection with television 104 .
  • the environment 100 as illustrated in FIG. 1 includes a video camera 112 , a set top box 114 , a remote control 116 , and a keyboard 118 .
  • one or more devices may utilize the content appliance 102 in some manner.
  • the various devices shown in FIG. 1 are configured to communicate with one another according to various protocols.
  • the content appliance 102 is configured to communicate with various devices utilizing different methods, such as the methods and protocols illustrated in FIG. 1 .
  • the content appliance 102 is configured to generate and transmit infrared (IR) signals to various devices that are configured to receive IR signals and perform one or more functions accordingly.
  • Different devices may utilize different codes, and the content appliance may be configured to generate the proper codes for each appliance. For example, a television from one manufacturer may utilize different codes than a television from another manufacturer.
  • the content appliance 102 may be configured accordingly to generate and transmit appropriate codes.
  • the content appliance may include a data store that has the codes for various devices and codes may be obtained from remote sources, such as from remote databases as discussed below.
  • a user may configure the content appliance 102 to submit the correct codes to the appropriate device(s).
  • the content appliance 102 includes various ports which may be used to connect with various devices.
  • the content appliance 102 includes an HDMI OUT port 120 which may be used to provide content through an HDMI cable to another device.
  • the HDMI OUT port 120 communicates content to the AV receiver 110 .
  • the HDMI OUT port may be used to provide content to other devices, such as directly to the television 104 .
  • the content appliance 102 includes an S/PDIF port 122 to communicate with the audio system 106 .
  • An ethernet port 124 may be provided with the content appliance 102 to enable the content appliance 102 to communicate utilizing an appropriate networking protocol, such as illustrated in FIG. 1 .
  • the content appliance 102 may utilize the ethernet port 124 to communicate with a set top box.
  • the set top box may operate according to an application of a content provider such as a satellite or cable television provider.
  • the ethernet port 124 of the content appliance 102 may be used to instruct the set top box 114 to obtain content on demand.
  • the content appliance 102 includes one or more universal serial bus (USB) ports 126 .
  • the USB ports 126 may be utilized to communicate with various accessories that are configured to communicate utilizing a USB cable.
  • the content appliance 102 communicates with a video camera 112 .
  • the video camera 112 may be used, for instance, to enable use of the content appliance to make video calls over a public communications network, such as the Internet 128 .
  • the content appliance 102 may be configured to communicate with any device connectable using USB techniques.
  • Other ports on the content appliance 102 may include RCA ports 130 in order to provide content to devices that are configured to communicate using such ports and an HDMI IN port 132 which may be used to accept content from another device, such as from the set top box 114 .
  • the content appliance 102 may have additional ports to those discussed above and, in some embodiments, may include fewer ports than illustrated.
  • the remote control 116 may communicate with the content appliance 102 utilizing radio frequency (RF) communication.
  • the remote control 116 may include a touch screen that may be used in accordance with the various embodiments described herein.
  • a keyboard 118 may also communicate with the content appliance 102 utilizing RF or another method (and possibly one or more other devices, either directly, or through the content appliance 102 ).
  • the keyboard may be used for various actions, such as navigation of an interface displayed on the television 104 , user input by a user typing utilizing the keyboard 118 , and general remote control functions.
  • an interface displayed on the television 104 may include options for text entry.
  • the user may type text utilizing keyboard 118 . Keystrokes that the user makes on the keyboard 118 may be communicated to the content appliance 102 , which in turn generates an appropriate signal to send over an HDMI cable connecting the HDMI OUT port 120 to the AV receiver 110 .
  • the AV receiver 110 may communicate with television 104 over HDMI or another suitable connection to enable the television to display text or other content that corresponds to the user input.
  • the keyboard 118 may also include other features as well.
  • the keyboard 118 may include a touchpad, such as described below or generally a touchpad that may allow for user navigation of an interface displayed on a display device.
  • the touchpad may have proximity sensing capabilities to enable use of the keyboard in various embodiments of the present disclosure.
  • the mobile device 108 is also able to control the content appliance 102 (and possibly other devices, either directly, or through the content appliance 102 ).
  • the mobile device may include a remote control application that provides an interface for controlling the content appliance 102 .
  • the mobile device 108 includes a touch screen that may be used in a manner described below.
  • the mobile device may communicate with the content appliance 102 over wi-fi utilizing signals that correspond to the user's interaction with the mobile device 108 .
  • the content appliance 102 may be, for instance, configured to receive signals from the mobile device over wi-fi (directly, as illustrated, or indirectly, such as through a wireless router or other device).
  • the content appliance may be configured to generate signals of another type (such as IR, HDMI, RF, and the like) that correspond to codes received over wi-fi from the mobile device 108 and then generate and transmit signals accordingly.
  • An application executing on the mobile device 108 may provide a graphical user interface that allows users to use the mobile device 108 as a remote control and generate such codes accordingly.
  • the mobile device 108 (and other devices), as illustrated, may be configured to receive information from the content appliance 102 and reconfigure itself according to the information received.
  • the mobile device 108 may, for example, update a display and/or update any applications executing on the mobile device 108 according to information received from the content appliance 102 .
  • the mobile device may be a different device with at least some similar capabilities.
  • the mobile device may be a portable music player or tablet computing device with a touch screen.
  • Example mobile devices include, but are not limited to, a mobile phone with a touch screen (e.g., a smartphone such as an iPhone or an Android based phone, etc.), a portable music player (e.g., an iPod, etc.), a tablet computing device (e.g., an iPad, iPad2, etc.), and other devices with touch sensitive user input devices.
  • other devices may additionally be included as the mobile device in the environment illustrated in FIG. 1 .
  • the content appliance 102 is also configured to utilize various services provided over a public communications network, such as the Internet 128 .
  • the content appliance 102 may communicate with a router 134 of a home network.
  • the content appliance 102 and the router 134 may communicate utilizing a wired or wireless connection.
  • the router 134 may be directly or indirectly connected to the Internet 128 in order to access various third-party services.
  • a code service 136 is provided.
  • the code service in an embodiment provides codes for the content appliance 102 to control various devices to enable the content appliance to translate codes received from another device (such as the remote control 116 , the keyboard 118 , and/or the mobile device 108 ).
  • the various devices to control may be identified to the content appliance 102 by user input or through automated means.
  • the content appliance 102 may submit a request through the router 134 to the code service 136 for appropriate codes.
  • the codes may be, for example, IR codes that are used to control the various devices that utilize IR for communication.
  • when a user makes a selection using a controlling device, a signal corresponding to the selection by the user may be communicated to the content appliance 102 .
  • the content appliance 102 may then generate a code based at least in part on information received from the code service 136 .
  • a signal corresponding to selection of the play button may be sent to the content appliance 102 which may generate a play IR code, which is then transmitted to the television 104 or to another suitable appliance, such as generally any appliance that is able to play content.
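  • As a rough sketch of this translation step, the content appliance could look up a device-specific IR code for a command received from another controlling device. The following is a minimal illustration only; the table format, device identifiers, and function names are assumptions and are not specified by the patent.

```python
# Hypothetical sketch of the code-translation step described above.
# Device identifiers, command names, and the code table are illustrative
# assumptions, not taken from the patent.

# Table of IR codes, e.g. populated from the code service 136.
IR_CODES = {
    ("example_tv_model", "play"): 0x20DF8877,
    ("example_tv_model", "power"): 0x20DF10EF,
}

def handle_remote_selection(device_id: str, command: str) -> None:
    """Translate a selection received over RF/wi-fi into a device-specific IR code."""
    code = IR_CODES.get((device_id, command))
    if code is None:
        # Missing codes could be requested from the remote code service
        # through the router, as described above.
        code = request_code_from_service(device_id, command)
        IR_CODES[(device_id, command)] = code
    transmit_ir(code)  # emit the IR signal to the target appliance

def request_code_from_service(device_id: str, command: str) -> int:
    raise NotImplementedError("stand-in for a query to the code service 136")

def transmit_ir(code: int) -> None:
    print(f"transmitting IR code {code:#010x}")
```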
  • content services accessible over the Internet 128 may be, for example, any information resource, such as websites, video-streaming services, audio-streaming services, and generally any services that provide content over the Internet 128 .
  • The environment illustrated in FIG. 1 is provided for the purpose of illustration; numerous environments may be used to practice embodiments of the present disclosure.
  • Various embodiments, for example, are applicable in any environment where proximity sensing is used as a method of enabling user input, including any environment in which a touch pad or touch screen with proximity sensing capabilities is used to interact with a GUI on a separate display.
  • various embodiments may be described as utilizing a particular input device, such as a touch pad or touch screen, but, unless otherwise clear from context, various embodiments of the invention may utilize input devices other than those explicitly described.
  • FIG. 1 shows an example environment in which user input is provided to a display (television, in the illustrated example) through a content appliance.
  • techniques of the present disclosure are also applicable for providing user input directly to a device with a display.
  • the various techniques described herein may be used in connection with a television remote control device, where the television remote control device sends signals according to user interaction with a touch screen directly to a television.
  • FIG. 2 shows an illustration of such a remote control device 202 and a television 204 , although signals from the remote control device 202 could be transmitted using a content appliance, such as described above in connection with FIG. 1 .
  • the remote control device 202 is used to control how content is displayed on the television 204 or other device with a display.
  • the remote control device 202 may communicate signals directly to the television 204 or through an intermediate device, such as the content appliance 102 described above in connection with FIG. 1 .
  • the television 204 may display an interface which is navigable by a user utilizing the remote control 202 .
  • the television 204 in FIG. 2 displays an interface 206 which includes a plurality of selectable options.
  • the options are provided as interface buttons displayed on the television 204 .
  • Each button in this example corresponds to an activity that the user may perform by selecting one of the buttons.
  • the interface 206 includes a watch TV button 208 . Selection of the watch TV button 208 may result in one or more devices changing to a state suitable for watching TV on the television 204 .
  • a set top box (not shown) may be put into an on state if it was not in such a state already.
  • Television 204 may be put into a state wherein it receives television content from an appropriate source, such as from the set top box or from a different device, such as a content appliance 102 described in connection with FIG. 1 .
  • the television 204 (or a network of devices that includes the television 204 ) is configured to utilize one or more other devices, such as a DVD player, music player, a gaming device, and devices that allow communication over the Internet.
  • the users are able to utilize the television 204 to check an email account and/or stream a movie from a remote streaming service.
  • the television 204 or a network of devices that includes the television 204 may be configured for use with any device involved in providing content, either from the devices themselves, or from other sources, including remote sources accessible over the Internet or other communications network.
  • the remote control 202 includes a touch screen 210 .
  • the remote control is described as having a touch screen 210 , embodiments may utilize a remote control with a touch pad instead of or in addition to a touch screen.
  • the touch screen 210 or touch pad in an embodiment operates using capacitive proximity sensing.
  • the touch screen 210 (or touch pad) may be configured, for example, according to the disclosure of U.S. application Ser. No. 13/047,962, referenced above.
  • the touch screen 210 (or touch pad) may utilize any technique for proximity sensing.
  • the touch screen 210 (or touch pad) may be used as an input device by a user. The input may be input for controlling the remote control device 202 and/or the television 204 .
  • FIG. 2 illustrates various buttons, many of which are common on standard remote controls. Such buttons may be selected to perform corresponding functions. Performance of some functions may be caused by selection of corresponding buttons or selection of interface elements on the touch screen 210 , although some functions may be causable by only one of the buttons or the touch screen 210 , but not both.
  • each location on the touch screen or touch pad 210 is mapped to at least one location on the user interface on the display.
  • the absolute mapping may be a surjective mapping from the locations of the user interface on the display to the locations of the touch screen.
  • the mapping may be a surjective mapping from the locations of the touch screen to the locations of the user interface if there are more locations on the touch screen or touch pad than the user interface.
  • a mapping may also be a one-to-one mapping between touch screen sensor locations and pointer locations on the interface.
  • the mapping may be any mapping that is configured such that, from the user perspective, each location on the touch screen has a corresponding location on the user interface. It should be noted, however, that some embodiments of the present disclosure may utilize a relative mapping between the touch screen and user interface.
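  • One way to realize such an absolute mapping is a simple proportional scaling of sensing-region coordinates to display coordinates. The following is a minimal sketch under assumed resolutions; the dimensions and names are illustrative, not taken from the patent.

```python
# Minimal sketch of an absolute mapping from touch-sensor locations to
# display locations. The resolutions are assumed for illustration.

SENSOR_W, SENSOR_H = 320, 240      # sensing-region resolution (assumed)
DISPLAY_W, DISPLAY_H = 1920, 1080  # display resolution (assumed, e.g. from EDID)

def sensor_to_display(sx: float, sy: float) -> tuple[int, int]:
    """Map a sensor coordinate to its corresponding display coordinate.

    Each sensor location has exactly one display location, so from the user's
    perspective the sensing region behaves as a scaled-down copy of the screen.
    """
    dx = round(sx / (SENSOR_W - 1) * (DISPLAY_W - 1))
    dy = round(sy / (SENSOR_H - 1) * (DISPLAY_H - 1))
    return dx, dy

# A touch near the upper-right of the sensor maps to the upper-right of the display.
print(sensor_to_display(300, 20))  # -> (1805, 90)
```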
  • Mappings between a region of a remote control device and a display device may be determined in various ways.
  • a device in connection with the display device utilizes extended display identification data (EDID) or enhanced EDID (E-EDID) received over a high-definition multimedia interface (HDMI) or other connection to determine display parameters for the display device.
  • the data may, for example, specify a maximum horizontal image size and maximum vertical image size, which allows for a mapping of the sensing region to the display.
  • Other ways of determining the display size for generating a mapping may be used. For instance, a user may input the display size during a setup process.
  • the user may alternatively enter an identifier (model number, e.g.) for a display device and a database (which may be a remote database) may be referenced to determine the display size based on the model identifier.
  • a database which may be a remote database
  • some other method may be used to determine an identifier for the display device (e.g., knowledge of the legacy remote control used by the display device), and the identifier may be used to determine the display size (possibly using a remote database).
  • any suitable technique for determining the display dimensions and generating a mapping may be used.
  • the user interacts with the touch screen or touch pad 210 in order to navigate the user interface displayed on the television 204 .
  • the user may interact with the touch screen 210 (or touch pad) by using one or more appendages (such as fingers) to touch and/or hover over the touch screen 210 and/or move the one or more appendages relative to the touch screen 210 .
  • the manner in which the user interacts with the touch screen 210 may be sensed by the touch screen 210 (or touch pad) to generate signals.
  • the generated signals may be interpreted by one or more processors of the remote control 202 to generate one or more other signals corresponding to user input that are transmitted by the remote control 202 to another device, such as directly to the television 204 or to a content appliance, such as described above, or in any suitable manner.
  • one or more processors of the remote control 202 may interpret signals generated according to such touch and movement and generate and transmit one or more other signals that enable another device to update a device on which a GUI is displayed accordingly.
  • signals generated by the touch screen 210 or signals derived therefrom may be transmitted to another device (such as the television 204 or a content appliance) and interpreted by the other device.
  • any manner in which signals generated by the touch screen 210 (or touch pad) are interpreted as user input may be used.
  • FIG. 2 is simplified for the purpose of illustration and additional details and/or variations are considered as being within the scope of the present disclosure.
  • FIG. 2 shows an example where a remote control device is used to control a display on a television.
  • the techniques of the present disclosure may apply in any instance in which a touch-sensitive surface is used to provide user input for interaction with a GUI.
  • FIG. 2 shows the touch screen 210 without a display during user interaction.
  • the touch screen 210 may, however, include a display when the user interacts with the touch screen 210 , and/or at other times.
  • the display of the touch screen 210 may, for example, be identical or similar to the display on the television 204 . In this manner, the user can see the GUI by looking at either the remote control 202 or the television 204 .
  • a touch pad or other touch-sensitive user input device with which proximity sensing is possible may be used in accordance with various embodiments.
  • FIG. 2 shows a touch screen on a device (remote control 202 ) that is different from the device on which the GUI is displayed (or primarily displayed).
  • the techniques may also be applied in instances in which the touch screen and display are incorporated in a single device.
  • a notebook computer for example, may include a touch screen that is separate from a display of the notebook computer.
  • the techniques described herein may also be applied to single devices.
  • a touch screen and display operating in accordance with the present disclosure may both be part of a single mobile device (such as a smart phone, tablet computing device, or other device).
  • the scope of the present disclosure is not limited to the embodiments explicitly described and illustrated herein.
  • FIG. 3 accordingly shows an illustrative example of how the touch screen 210 of FIG. 2 may be utilized to navigate the interface on the television 204 .
  • The techniques illustrated in FIG. 3 could apply to a variety of environments and devices, not just those explicitly illustrated and described herein.
  • FIG. 3 in this particular example shows a remote control device 302 and a television 304 which may be the remote control device 202 and television 204 described above in connection with FIG. 2 .
  • the television 304 displays an interface 306 which has various selectable options, such as a watch TV button 308 .
  • the remote control device 302 includes a touch screen 310 which may be the touch screen 210 described above in connection with FIG. 2 .
  • interaction with the touch screen 310 in an embodiment includes touching the touch screen 310 and, generally, performing actions in close proximity to the touch screen 310 .
  • a user is interacting with the touch screen 310 with a left hand 312 and a right hand 314 .
  • other appendages may be used to interact with the touch screen 310 .
  • the user may interact with the touch screen 310 with any appendage or combination of appendages, or other portions of appendages (such as the palm of a hand).
  • other items, such as styluses or other non-human objects, may be used for interaction with the touch screen 310 , although the present disclosure will focus on thumbs for the purpose of illustration.
  • a left thumb 316 and a right thumb 318 are shown interacting with touch screen 310 .
  • the user is hovering over, that is, not touching, the touch screen 310 with the left thumb 316 and, in particular, in the lower-left corner of the touch screen 310 .
  • the user is touching the touch screen 310 and in particular, touching an upper-right portion of touch screen 310 .
  • representations of the left thumb 316 and right thumb 318 are shown in corresponding places on the user interface 306 .
  • a representation 320 of the left thumb is displayed at a location of the user interface 306 that corresponds to the location of the touch screen 310 over which the left thumb 316 is hovering.
  • a representation 322 of the right thumb 318 appears in the upper-right hand portion of the user interface 306 at a location that corresponds to a location touched by the user on the touch screen 310 .
  • the representation 320 of the left thumb 316 and the representation 322 of the right thumb 318 each have the appearance of a thumb. That is, the representations on the television 304 resemble the appendages used by the user. However, other representations may be used which do not necessarily resemble body parts. For example, circles or targets or any visual indicator of the user's interaction with touch screen 310 may be used. Further, FIG. 3 shows representations overlaid on a GUI for the purpose of illustration. However, other ways of providing visual feedback to the user based on the user's interaction with the touch screen may be used.
  • a representation may not be overlaid on the user interface, but may be made by manipulating the user interface in a way that indicates user interaction with the touch screen 310 .
  • the user interface could have a warped effect at a location where the user interacts with the touch screen 310 .
  • the display may brighten at locations corresponding to a location with which a user interacts with the touch screen 310 .
  • interface elements such as selectable options of the user interface, may change color, brightness or other characteristics when the user interacts with the touch screen 310 in a corresponding location.
  • any manner of providing visual feedback of the user's interaction with the touch screen 310 may be used.
  • the visual feedback of user interaction with the touch screen 310 may be provided in a varying manner. For example, as shown in FIG. 3 , color characteristics of representations of the user appendages vary according to how the user is interacting with the touch screen 310 . For example, because the left thumb 316 is not touching but is only hovering over the touch screen 310 , the representation 320 of the left thumb 316 in this example appears dim and transparent; that is, the combination of the representation 320 of the left thumb and the user interface has an appearance as if the interface shows through the representation 320 of the left thumb 316 .
  • the representation 322 of the right thumb 318 appears bright and opaque, thereby indicating to the user that the right thumb is touching the touch screen 310 .
  • the user can hover over the touch screen with an appendage and, based on the location of the representation on the interface 306 , knows where to move his or her appendage to navigate the interface as desired.
  • Because the representation is transparent when the corresponding appendage is hovering, the representation does not obscure the user interface.
  • the representation 320 of the left thumb 316 appears transparently over a play a game option 324 of the user interface 306 .
  • the user does not need to move the left thumb 316 in order to see what option would be selected by pressing the touch screen 310 at the same location over which the left thumb 316 is hovering.
  • other ways of providing visual feedback that do not obscure elements of the interface 306 , such as changing elements of the interface so that they remain recognizable, may be used and, in some embodiments, elements of the interface may be allowed to be obscured.
  • In FIG. 3 , lines radiating from the representation 322 of the right thumb at a location corresponding to a location where the user touches the touch screen 310 are shown as an illustrative example.
  • the radiated lines indicate that the user has touched the touch screen at this location and therefore selected a corresponding option on the user interface which, in this example, is a play music option 326 .
  • the lines may appear responsive to contact with the touch screen 310 (or proximity within a threshold), and may subsequently disappear, such as when the corresponding appendage loses contact with the touch screen 310 , when a determination is made that the user made a selection of an element, and/or at another time.
  • the amount by which a representation is transparent may vary according to a distance by which an appendage is hovering over the touch screen 310 .
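  • As a minimal sketch of how the representation's appearance might vary with hover distance, the following maps a measured distance to opacity and brightness values. The distance threshold and the linear ramp are assumptions for illustration only.

```python
# Hypothetical mapping from hover distance to color characteristics of the
# appendage representation. The threshold and ramp are illustrative assumptions.

MAX_HOVER_MM = 30.0  # distance beyond which no representation is shown (assumed)

def representation_style(distance_mm: float) -> dict:
    """Return alpha/brightness for the representation at a given hover distance.

    A distance of 0 means the appendage is in contact with the sensing region.
    """
    if distance_mm <= 0:
        # Contact: fully opaque and bright (like representation 322 in FIG. 3).
        return {"alpha": 1.0, "brightness": 1.0}
    if distance_mm >= MAX_HOVER_MM:
        return {"alpha": 0.0, "brightness": 0.0}  # too far away: hide it
    # Hovering: fade with distance so that closer appendages appear more
    # prominent (like the transparent representation 320 in FIG. 3).
    t = 1.0 - distance_mm / MAX_HOVER_MM
    return {"alpha": 0.2 + 0.6 * t, "brightness": 0.4 + 0.5 * t}
```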
  • While the example in FIG. 3 shows the user navigating the user interface using the touch screen 310 with two appendages (his or her thumbs), the navigation may be done with a single appendage or other mechanism, or with more than two appendages.
  • representations may disappear in order to allow a user to fully view a new updated interface that appears as a result of the selection of the option.
  • GUIs may be used in accordance with the present disclosure.
  • any GUI that may be presented on a display and manipulated using touch techniques may be used.
  • Example GUIs are GUIs configured for use with any operating system that allows for touch-based input.
  • Embodiments of the present disclosure may also be used to enable non-touch-based operating systems to be navigated using touch techniques.
  • a left-right-up-down (LRUD) operating system, such as on some televisions, may allow navigation only in the left, right, up, or down directions.
  • User input on a touch screen may be translated to corresponding left, right, up, or down commands to enable touch navigation.
  • a touch screen may be divided into regions, where each region corresponds to one or more LRUD commands. Touching the touch screen at an upper middle portion, for instance, may correspond to an up command. Some regions may correspond to multiple commands. Touching a touch screen in a corner, for instance, may correspond to a sequence of LRUD commands. The lower left corner may, for example, correspond to a down-left or left-down sequence of commands. The commands may be transmitted in sequence to the device with the LRUD operating system upon selection of the touch screen. Generally, one or more ways of interacting with a touch screen may correspond to one or more commands of a non-touch-screen operating system, as sketched below.
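  • The following sketches the region-to-command translation just described. The 3x3 grid, the center "select" region, and the particular corner sequences are assumptions made for illustration.

```python
# Illustrative translation of touch locations into LRUD command sequences for
# a non-touch (left/right/up/down) operating system. The 3x3 grid and the
# corner command sequences are assumptions, not specified by the patent.

LRUD_GRID = [
    ["up-left",   "up",     "up-right"],
    ["left",      "select", "right"],
    ["down-left", "down",   "down-right"],
]

def touch_to_lrud(sx: float, sy: float, width: float, height: float) -> list[str]:
    """Return the sequence of LRUD commands for a touch at (sx, sy)."""
    col = min(int(sx / width * 3), 2)
    row = min(int(sy / height * 3), 2)
    # Corner regions expand into a two-command sequence, e.g. "down-left"
    # becomes ["down", "left"], transmitted in that order.
    return LRUD_GRID[row][col].split("-")

# Touching the upper middle of the touch screen yields a single "up" command.
print(touch_to_lrud(160, 10, 320, 240))   # -> ['up']
# Touching the lower-left corner yields a down-then-left sequence.
print(touch_to_lrud(10, 230, 320, 240))   # -> ['down', 'left']
```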
  • FIG. 4 shows an illustrative example of a process 400 that may be used to provide navigation of an interface, such as in the manner illustrated in connection with FIGS. 2 and 3 .
  • Some or all of the process 400 may be performed under the control of one or more computer systems configured with executable instructions and may be implemented as code (e.g., executable instructions, one or more computer programs, or one or more applications) executing collectively on one or more processors, by hardware, or combinations thereof.
  • a computer system may be any device capable of performing processing functions, such as notebook computers, desktop computers, remote control devices, content appliances, mobile phones, tablet computing devices, and, generally, any device or collection of devices that utilize one or more processors.
  • One or more of the actions depicted in FIG. 4 may be performed by a device such as a remote control device, a content appliance, a television, or, generally, any device that is configured to participate in providing content.
  • the code may be stored on a computer-readable storage medium, for example, in the form of a computer program comprising a plurality of instructions executable by one or more processors.
  • the computer-readable storage medium may be non-transitory.
  • the process 400 includes displaying 402 an interface, such as described above.
  • Displaying an interface may mean actually displaying the interface or taking one or more actions that cause an interface to be displayed.
  • the content appliance 102 may display an interface by generating a signal and causing the signal to be sent to the AV receiver, which relays the signal to the television 104 .
  • a device performing the process 400 , such as a television thus configured, may display the interface itself.
  • a proximate appendage is detected 404 .
  • Detecting a proximate appendage may be done in any suitable manner, such as utilizing the techniques in U.S. application Ser. No. 13/047,962 and U.S. application Ser. No. 12/840,320, described above.
  • the appendage may be detected, for example, upon the user moving an appendage within a certain distance of a touch screen.
  • detecting the appendage may be performed in various ways. For example, detecting the appendage may be performed by detecting the appendage directly or receiving a signal from another device, such as a remote control device that detected the appendage.
  • an interface display location is determined 406 based at least in part on a mapping of input device locations to interface display locations, which may be an absolute mapping, as described above.
  • the representation of the appendage is overlaid 408 on the interface display at the determined interface display location.
  • other ways of providing representations may be performed, although overlays of representations are used for the purpose of illustration.
  • the position of the representation on the interface may be updated 410 according to movement of the appendage. For example, if the user moves the appendage to the left, the representation of the appendage may move to the left as well.
  • Determining how to update the position of the representation may include multiple detections of the appendage and corresponding determinations of the interface display location based on the mapping.
  • the user may make contact with the touch screen.
  • a touch event of the appendage is detected 412 .
  • an operation according to the touch event type and/or location of touch is performed 414 .
  • the way in which the user touches a touch screen may indicate, for example, how the user wishes to navigate a user interface. For instance, touching the touch screen and moving the appendage while in contact with the touch screen may indicate a drag operation in the user interface. If the initial touch was on an element that is draggable in the user interface, the element may move accordingly.
  • a double tap on the touch screen 210 may also be appropriately interpreted (for example, as a double click on an icon on the display).
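  • As a loose sketch of the operation-dispatch step of process 400, the following routes a detected touch event to an interface operation. The event types and the "interface" object (with element_at, select, activate, and move methods) are assumptions introduced for illustration.

```python
# Illustrative dispatch of touch events to interface operations, loosely
# following process 400. Event types and the interface object are assumptions.

from dataclasses import dataclass

@dataclass
class TouchEvent:
    kind: str      # "tap", "double_tap", or "drag" (assumed event types)
    x: int         # display coordinates obtained via the absolute mapping
    y: int
    dx: int = 0    # movement while in contact with the touch screen
    dy: int = 0

def perform_operation(event: TouchEvent, interface) -> None:
    """Perform an operation according to the touch event type and location."""
    element = interface.element_at(event.x, event.y)
    if element is None:
        return  # touch outside any selectable element: leave the GUI as is
    if event.kind == "tap":
        interface.select(element)           # e.g. selecting the play music option
    elif event.kind == "double_tap":
        interface.activate(element)         # interpreted like a double click on an icon
    elif event.kind == "drag" and getattr(element, "draggable", False):
        interface.move(element, event.dx, event.dy)  # drag a draggable element
```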
  • FIG. 5 shows a more detailed process 500 which may be used to update a user interface in accordance with an embodiment.
  • the process illustrated in FIG. 5 may be performed, for example, by a remote control device, such as described, or generally any device with a touch screen that is configured to operate in accordance with the present disclosure.
  • the process 500 includes displaying 502 an interface, such as described above.
  • the touch screen may be monitored 504 for various events involving user interaction with the touch screen.
  • a process in a device performing the process 500 may periodically or otherwise poll for events and take action when events are detected. For example, as illustrated in FIG. 5 , a repetitive sub-process is performed in which the touch screen is monitored 504 and determinations are made 506 whether an appendage is detected.
  • Determining whether an appendage is detected may be performed in various ways. For example, in some embodiments, the determination may simply be a determination of whether signals from a touch screen indicate the presence of an appendage proximate to the touch screen. However, the determination may be more complex and may include other determinations. For example, determining whether an appendage is detected may include determining how many appendages are detected. In addition, in an embodiment, for any appendages detected, the detected appendages may be matched to actual appendages. For instance, referring to FIG. 3 , the two detected appendages may be matched to the left and right thumb. Other appendages and objects may be detected and matched, such as other fingers, styluses, and the like.
  • Matching detected appendages to actual appendages may be done in various ways. For example, in an embodiment, when an appendage is detected, the appendage will generally cause different portions of the touch screen to generate different signals. For example, with capacitive proximity sensors, the capacitance measurements for the touch screen may increase for locations that are proximate to the detected appendage. The locations for which the capacitance changes may be used to determine which appendage a detected appendage corresponds to. For example, a thumb will generally affect a larger region of the touch screen than other fingers due to the thumb's larger relative size. In addition, regions of affected locations in the touch screen may generally be oriented differently depending on the appendage being sensed. Referring to FIG. 3 , a region of locations on the touch screen 310 affected by the left thumb would generally point upward and to the right, whereas a region affected by the right thumb would generally point upward and to the left.
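  • By way of a non-limiting illustration, the following sketch shows one way such matching might be implemented, assuming the touch screen reports each detected object as a set of (x, y, capacitance-change) samples; the function name, the area threshold, and the coordinate convention (y increasing upward) are illustrative assumptions rather than details taken from the disclosure.

```python
import math

def classify_region(samples):
    """Heuristically match one detected region to an appendage.

    `samples` is a list of (x, y, delta_capacitance) tuples for the
    touch-screen cells affected by a single detected object (a
    hypothetical driver output format). Returns "left_thumb",
    "right_thumb", or "finger".
    """
    total = sum(w for _, _, w in samples)
    cx = sum(x * w for x, _, w in samples) / total
    cy = sum(y * w for _, y, w in samples) / total

    # Weighted covariance of the affected region.
    sxx = sum(w * (x - cx) ** 2 for x, _, w in samples) / total
    syy = sum(w * (y - cy) ** 2 for _, y, w in samples) / total
    sxy = sum(w * (x - cx) * (y - cy) for x, y, w in samples) / total

    if len(samples) < 12:          # small affected area: treat as a non-thumb finger
        return "finger"

    # Orientation of the region's long axis, in degrees from the x-axis
    # (assumes y increases upward; invert the sign for screen coordinates).
    angle = math.degrees(0.5 * math.atan2(2 * sxy, sxx - syy))

    # A left thumb tends to point up and to the right (positive slope),
    # a right thumb up and to the left (negative slope).
    return "left_thumb" if angle >= 0 else "right_thumb"
```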
  • Touch event criteria may be criteria that, when met, indicate a touch event. For example, criteria for a touch event corresponding to selection of a user interface element may be that the user contacts the touch screen and then loses contact within a predetermined period of time. Other criteria may be simpler, of the same complexity, and/or more complex. Criteria may take into account information about the timing of various activities, such as how long the user has touched the touch screen, how many appendages or other objects touched the screen, whether or not the user moved a certain amount while in contact with the touch screen, and the like.
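  • A simple criterion of this kind might be encoded as follows, assuming the monitoring loop supplies contact and release events as (timestamp, x, y) tuples; the event format and the threshold constants are placeholders, not values given in the disclosure.

```python
TAP_MAX_DURATION_S = 0.3   # maximum contact time for a "selection" tap
TAP_MAX_TRAVEL_PX = 10     # maximum movement while in contact

def is_selection_tap(touch_down, touch_up):
    """Return True if a contact/release pair satisfies a simple selection
    criterion: short contact with little movement.

    `touch_down` and `touch_up` are (timestamp_s, x, y) tuples
    (hypothetical event format)."""
    t0, x0, y0 = touch_down
    t1, x1, y1 = touch_up
    duration = t1 - t0
    travel = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return duration <= TAP_MAX_DURATION_S and travel <= TAP_MAX_TRAVEL_PX

print(is_selection_tap((0.00, 100, 200), (0.20, 104, 203)))   # -> True
```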
  • the touch event criteria take into account matches to appendages that have been detected. Different touch events may correspond to different actions by a user using different subsets of his or her fingers (or other appendages or objects). For example, in an embodiment, when a middle finger and an index finger are detected, a right click event may be generated at a user interface (UI) location corresponding to a touch screen location selected by the index finger. The UI location may correspond to a particular right-click menu of selectable interface options that may be displayed upon detection of the right click event. As another example, when a thumb and right index finger are detected, a scroll left event may be detected for a UI element (scroll bar, icon, etc.) selected by the right index finger.
  • a scroll right event may be detected for a UI element selected by the right ring finger.
  • a scroll left or right event may be generated for a UI element selected by the index finger.
  • the direction of scroll may be determined in many ways, such as by the left or right thumb detected, by the direction of movement of the detected thumb, and the like. Generally, any way of matching actions of sets of one or more fingers (or other objects) to events may be used.
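  • One possible way to express such a matching is a lookup table from the set of detected appendages to an event type, as in the sketch below; the appendage labels, the event names, and the choice of which finger anchors the event are illustrative assumptions.

```python
# Hypothetical table mapping the set of detected appendages to an event type.
# The second element names the appendage whose UI location anchors the event.
GESTURE_EVENTS = {
    frozenset({"middle_finger", "index_finger"}): ("right_click", "index_finger"),
    frozenset({"thumb", "index_finger"}):         ("scroll_left", "index_finger"),
    frozenset({"thumb", "ring_finger"}):          ("scroll_right", "ring_finger"),
}

def event_for(detected):
    """Map a dict of {appendage_name: ui_location} to (event, location),
    or None if the detected combination has no assigned event."""
    key = frozenset(detected)
    if key not in GESTURE_EVENTS:
        return None
    event, anchor = GESTURE_EVENTS[key]
    return event, detected[anchor]

print(event_for({"thumb": (40, 100), "index_finger": (480, 210)}))
# -> ('scroll_left', (480, 210))
```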
  • the object may be moved on the interface accordingly.
  • If the user touches the touch screen at a location corresponding to a portion of the user interface that is not manipulable through user interaction and moves within that area while in contact with the touch screen, the user interface may be left as is, although representations of detected appendages may be updated with movement accordingly.
  • the touch screen may continue to be monitored 504 . If it has been determined that touch event criteria have not been satisfied (for example, if the user has touched the touch screen, but not moved or lost contact with the touch screen), a determination may be made 516 whether the detection of the appendage corresponds to a touch or a hover. If it is determined 516 that the detected appendage is hovering, then a hover mode representation of an appendage on the interface screen at a location corresponding to the detected location may be overlaid or updated. For example, if the hover mode representation is already on the interface, the position of the representation may be changed accordingly.
  • a representation may then be overlaid at the location corresponding to the location at which the appendage was detected on the touch screen. If it is determined that the detected appendage is touching the touch screen, a touch mode representation of the appendage on the interface screen at a location corresponding to the detection location may be overlaid or updated accordingly. For example, if a hover mode representation was on the touch screen and the user then touched the touch screen, the hover mode representation may be changed into a touch mode representation. If no representation was on the interface, then a touch mode representation may appear.
  • a location of the representation may be updated in accordance with any movement by the user (for example, if the location changed since the last time the determination was made). As with other steps herein, the touch screen continues to be monitored during performance of the process 500 .
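  • The branching just described (no appendage, hover, touch, or satisfied touch event criteria) might be organized roughly as in the following sketch; the Reading structure and its field names are hypothetical stand-ins for whatever the sensing hardware actually reports, and the returned action labels are illustrative only.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Reading:
    """One poll of the touch screen (hypothetical sensor output)."""
    ui_location: tuple           # mapped display coordinates
    in_contact: bool             # True if touching, False if hovering
    event: Optional[str] = None  # e.g. "select" once touch criteria are met

def interface_update(reading: Optional[Reading]):
    """Decide what the interface should do for one iteration of the
    monitoring loop described above (sketch only)."""
    if reading is None:
        return ("clear_representations", None)
    if reading.event is not None:
        return ("apply_event", (reading.event, reading.ui_location))
    mode = "touch" if reading.in_contact else "hover"
    return ("show_representation", (mode, reading.ui_location))

# Example: a hovering finger mapped to display location (320, 180)
print(interface_update(Reading(ui_location=(320, 180), in_contact=False)))
```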
  • representations of user appendages or other devices used to interact with a touch screen may change according to a manner in which the interaction is performed.
  • One way of varying the representation according to the manner in which the interaction is performed is to change the representation based at least in part on a distance of a detected appendage from the surface and, in particular, based at least in part on the distances of multiple locations of a detected appendage (or other object) from the surface.
  • the color characteristics of a representation of an appendage may be changed based at least in part on the distance the appendage is from the touch screen.
  • FIG. 6 shows a touch screen 602 over which a finger 604 is hovering at a distance that is not in contact with the touch screen 602 .
  • FIG. 6 also shows a corresponding user interface 606 on which a representation 608 of the finger 604 appears.
  • the touch screen 602 and user interface 606 are shown simplified for the purpose of illustration of an embodiment.
  • the touch screen 602 is shown without other portions of a device to which the touch screen 602 is attached.
  • the user interface 606 is illustrated without a device on which the user interface is displayed and without a GUI, although the user interface may, and typically will, have a GUI displayed.
  • the touch screen 602 may similarly have a GUI displayed, which may match a GUI displayed on the user interface 606 .
  • the representation 608 of the finger 604 appears on the interface at a location corresponding to the location at which the finger 604 hovers over the touch screen 602 .
  • the representation 608 resembles an outline of a finger and this outline is oriented according to the orientation of the finger 604 over the touch screen 602 .
  • different locations on the finger 604 are at different distances over the touch screen 602 .
  • the representation 608 of the finger 604 has color characteristics changing according to the various distances. For example, portions of the representation 608 of the finger 604 that correspond to locations of the finger 604 that are closer to the touch screen 602 are darker than portions of the representation 608 that correspond to locations on the finger 604 that are further from the touch screen 602 .
  • the representation may be updated accordingly.
  • the location of the representation 608 on the user interface 606 may be updated as well. (If a representation is displayed on the touch screen, the representation may be updated there as well.)
  • transparency and/or other color characteristics of the representation 608 may change according to distance changes at locations of the various portions of the finger 604 from the touch screen 602 .
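  • One plausible mapping from sampled distance to overlay opacity is a simple linear ramp, as sketched below; the sensing-range constant and the millimetre scale are assumptions rather than values given in the disclosure.

```python
MAX_SENSE_DISTANCE_MM = 30.0   # beyond this, the appendage is not drawn at all

def opacity_for_distance(distance_mm: float) -> float:
    """Map a sampled distance from the touch surface to an overlay opacity.

    0 mm (contact) -> fully opaque (1.0); at or beyond the sensing limit
    -> fully transparent (0.0); linear interpolation in between."""
    clamped = min(max(distance_mm, 0.0), MAX_SENSE_DISTANCE_MM)
    return 1.0 - clamped / MAX_SENSE_DISTANCE_MM

# Portions of the finger closer to the surface are rendered more opaque/darker.
print([round(opacity_for_distance(d), 2) for d in (0, 5, 15, 30, 40)])
# -> [1.0, 0.83, 0.5, 0.0, 0.0]
```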
  • FIG. 7 shows an illustrative example of one way in which a representation may be updated.
  • FIG. 7 shows a touch screen 702 over which a finger 704 hovers in order to cause a user interface 706 to display a representation 708 of the finger 704 .
  • the touch screen 702 , finger 704 , interface 706 and representation 708 may be the same as those similarly named items described above in connection with FIG. 6 .
  • the finger 704 is closer to the touch screen 702 than the finger 604 to the touch screen 602 in FIG. 6 .
  • the representation 708 of the finger 704 in FIG. 7 is accordingly displayed with different color characteristics than in FIG. 6 .
  • the tip of the finger 704 is closer to the touch screen 702 than the tip of the finger 604 to the touch screen 602 of FIG. 6 .
  • the tip of the representation 708 is therefore in this example less transparent than the tip of the representation 608 in FIG. 6 .
  • Portions of the representation 708 that correspond to portions of the finger 704 further from the touch screen 702 are more transparent, in the same manner as shown in FIG. 6 . However, they are less transparent than corresponding locations in FIG. 6 because, at corresponding locations, the finger 704 in FIG. 7 is overall closer to the touch screen than the finger 604 in FIG. 6 .
  • FIGS. 6 and 7 show comparative figures that illustrate one particular embodiment of the present disclosure.
  • the user interface may warp at locations corresponding to detected objects. The amount by which the user interface is warped may vary depending on the distance of the detected objects. The closer the object, the more the corresponding location on the user interface may be warped.
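  • A warp of this kind could be approximated by pulling nearby interface pixels toward the point under the detected object, with a strength that grows as the object approaches; the radius, scale factor, and distance range in the sketch below are purely illustrative.

```python
import math

def warped_position(px, py, cx, cy, distance_mm,
                    radius_px=60.0, max_distance_mm=30.0):
    """Displace an interface pixel (px, py) toward the point (cx, cy) under a
    detected object. The displacement is strongest at contact (distance 0)
    and vanishes at the sensing limit. All constants are placeholders."""
    dx, dy = px - cx, py - cy
    r = math.hypot(dx, dy)
    if r == 0 or r > radius_px:
        return px, py                          # outside the warped neighbourhood
    closeness = 1.0 - min(max(distance_mm, 0.0), max_distance_mm) / max_distance_mm
    pull = closeness * (1.0 - r / radius_px)   # strongest at the centre
    return px - dx * pull * 0.3, py - dy * pull * 0.3

print(warped_position(130, 100, 100, 100, distance_mm=0.0))   # -> (125.5, 100.0)
```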
  • any way of varying the representation based at least in part on distance of a corresponding object to a touch screen may be used.
  • any input device that is able to detect proximity and touch may be used.
  • touch screens used in some embodiments may not themselves display any information or may display information different from that which is displayed on another device, such as a television.
  • proximity sensing techniques may be used in other contexts, not just planar touch sensitive areas, such as those illustrated above.
  • For example, the techniques may be used with a remote control device having buttons, such as physically displaceable buttons.
  • Proximity sensors may be incorporated with the remote control device. When a user's appendage becomes close to a button (such as by making contact with the button), a representation of the appendage may appear on a display.
  • an action may be taken.
  • the action may correspond to the button, the location of the representation on the display, or otherwise.
  • a representation of the appendage may appear over a “play” button on a display, such as described above.
  • a play function may be performed. For instance, if watching a DVD, a DVD player may be put in a play state.
  • movement of the representation of an appendage on a display may correspond to user movement of the appendage relative to the remote control, manipulation of an input device (such as a joystick), or otherwise.
  • buttons of a remote control may be force sensing.
  • the above described hovering effects may be produced upon light presses of a button and selection of an object may be performed upon more forceful presses of the button.
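  • With a force-sensing button, that behavior might reduce to two thresholds on a normalized force reading, as in the sketch below; the threshold values and the 0-to-1 force scale are assumptions.

```python
HOVER_FORCE = 0.15    # light press: show hover-style feedback
SELECT_FORCE = 0.60   # firm press: treat as a selection

def button_state(force: float) -> str:
    """Interpret a normalized force reading (0.0 to 1.0) from a
    force-sensing remote-control button (hypothetical scale)."""
    if force >= SELECT_FORCE:
        return "select"
    if force >= HOVER_FORCE:
        return "hover"
    return "idle"

print([button_state(f) for f in (0.05, 0.3, 0.8)])   # -> ['idle', 'hover', 'select']
```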
  • sound, vibration, or other additional feedback may be provided to the user based on the user's interaction with a remote control device.
  • a remote control device may, for instance, vibrate upon making a selection using the techniques described herein.
  • the remote control or another device (such as a television or audio system) may make a sound upon selection.

Abstract

Techniques for content navigation utilize proximity sensing so that user interaction with a graphical user interface is based at least in part on both contact with a surface and contactless interaction with the surface. A representation of an object detected as being proximate to and/or in contact with a surface appears on a display, which may be separate from the surface. The representation may appear at a location of the display that is determined according to a mapping of surface locations to display locations. The representation is updated based at least in part on movement of the object relative to the surface and one or more distances of the object from the surface.

Description

    CROSS-REFERENCES TO RELATED APPLICATIONS
  • This application incorporates by reference for all purposes the full disclosure of U.S. application Ser. No. 13/284,668, entitled “Remote Control System for Connected Devices,” and filed concurrently herewith. This application also incorporates by reference the full disclosure of U.S. Application No. 61/480,849, entitled “Remote Control for Connected Devices,” and filed on Apr. 29, 2011.
  • This application also incorporates by reference for all purposes the full disclosure of U.S. Provisional Application No. 61/227,485, filed Jul. 22, 2009, U.S. Provisional Application No. 61/314,639, filed Mar. 17, 2010, U.S. application Ser. No. 12/840,320 entitled “System and Method for Remote, Virtual On Screen Input,” and filed on Jul. 21, 2010, and U.S. application Ser. No. 13/047,962, entitled “System and Method for Capturing Hand Annotations” and filed on Mar. 15, 2011.
  • BACKGROUND OF THE INVENTION
  • Various devices can be used by users to provide input to different systems. Input devices such as mice, keyboards, keypads, touch pads, joysticks, and other devices, for example, allow users to control one or more devices by interaction with the input devices. In addition, as such technology improves, touch screens have become more prevalent as input devices. In a typical application, a touch screen and a display are integrated so that a graphical user interface (GUI) displayed on the touch screen provides visual indicators of how user input can be provided to interact with the GUI. The GUI may, for instance, include selectable options so that a user can see on the display where to touch the touch screen to select a displayed option. In many instances, input devices that incorporate touch are separated from a display on which a GUI is displayed. Many notebook computers, for example, include touch pads. In typical uses, a user touches a touch pad and moves one or more fingers to cause a cursor displayed on the GUI to move accordingly. A button proximate to the touch pad or sometimes the touch pad itself can be tapped to cause a GUI element to be selected. Other ways of interacting with the touch pad and any buttons with the touch pad may be used to interact with the GUI accordingly.
  • Despite the numerous ways users are able to interact with devices using various input devices, existing devices, whether devices being controlled or input devices used to control other devices, do not take full advantage of technologies that have been developed. In addition, many devices are designed in a manner that makes use of developed technologies cumbersome. Televisions, for example, often are configured to provide high-quality displays. At the same time, the use of touch-based input devices with televisions can be awkward. Users, for example, often view televisions from a large enough distance that incorporation of a touch screen input device with the television display is impractical. Similar issues exist for many display devices, such as computer monitors. Thus, while touch-based input has proven to be beneficial, many benefits of touch-based input are often unattained.
  • BRIEF SUMMARY OF THE INVENTION
  • The following presents a simplified summary of some embodiments of the invention in order to provide a basic understanding of the invention. This summary is not an extensive overview of the invention. It is not intended to identify key/critical elements of the invention or to delineate the scope of the invention. Its sole purpose is to present some embodiments of the invention in a simplified form as a prelude to the more detailed description that is presented later.
  • Techniques of the present disclosure provide for interaction with graphical user interfaces using input devices that incorporate touch and/or proximity sensing. Such techniques provide advantages including advantages of some embodiments that allow users to obtain a touch user-input experience with displays that are not necessarily touch-input enabled. In an embodiment, a computer-implemented method of manipulating a display is described. The method includes detecting a set of one or more user appendages proximate to and out of contact with a set of one or more corresponding sensor locations of a sensing region of a remote control device; determining, based at least in part on a mapping of the sensing region to a display region of a remote display device, a set of one or more display locations of the display region; and transmitting a signal that causes the display device to display a set of one or more representations of the detected set of one or more user appendages according to the determined set of one or more display locations. The mapping may be an absolute mapping.
  • Variations of the method are also considered as being within the scope of the present disclosure. For example, in an embodiment, the method further includes: calculating at least one measurement that corresponds to distance of the detected appendage from the sensing region. In this embodiment, displaying the representation of the detected appendage may include displaying the representation of the detected appendage with one or more color characteristics that are based at least in part on the measurement. The color characteristics may be, for instance, brightness, hue, opacity, and the like. The display may display a graphical user interface and displaying the set of one or more representations may include overlaying the one or more representations on the graphical user interface or otherwise visually distinguishing locations corresponding to user interaction with the sensing region from other locations. In an embodiment, the graphical user interface includes one or more selectable options that each correspond to a selection region of the sensing region. In this embodiment, the method may further comprise detecting a contact event by at least one appendage of the set of one or more appendages. The detected contact event may correspond to a contact location of the sensing region. When the contact location corresponds to a selection region of a corresponding selectable option of the graphical user interface, the graphical user interface may be updated according to the corresponding selectable option.
  • The displayed set of one or more representations may be changed upon detection of the contact event for which the contact location corresponds to the selection region of the corresponding selectable option. Changing the displayed set of one or more representations may include removing the set of one or more representations from the display. At least one of the representations may resemble the corresponding appendage and the displayed set of one or more representations may include at least two representations of different forms, such as two different fingers.
  • In accordance with another embodiment, a computer-implemented method of manipulating a display is disclosed. The method, in this embodiment, includes calculating measurements that correspond to distances of a user appendage from a sensing region of a remote control device as the user moves the user appendage relative to the sensing region; and taking one or more actions that cause a display device to display a representation of the appendage such that the representation has one or more color characteristics that vary based at least in part on the calculated measurements.
  • As with all methods disclosed and suggested herein, variations are considered within the scope of the present disclosure. For example, taking the one or more actions may include transmitting remotely generated signals to the display device. As another example, the representation may have a transparent appearance when the user appendage is out of contact with the sensing region and an opaque appearance when the user appendage is in contact with the sensing region. The method may also include determining location changes of the sensing region with which the user appendage is proximate or in contact. In such instances, taking the one or more actions may include updating locations of the representation on the display.
  • In accordance with yet another embodiment, a user input system is described. The user input system may be a set of one or more devices that collectively operate to change a display according to user input. In this embodiment, the user input system includes one or more processors and memory including instructions that, when executed collectively by the one or more processors, cause the user input system to cause a display device to update according to user input. For instance, the display device may display a representation of a user appendage on a display of the display device and change one or more color characteristics of the representation based at least in part on changes in distances of the user appendage from a sensing region of a remote control device.
  • Variations of the user input system are also considered as being within the scope of the present disclosure. For example, the instructions may further cause the user input system to cause the display device to change a location of the representation based at least in part on movement of the user appendage relative to the sensing region. The instructions may also further cause the user input system to cause the display device to update the display according to a predefined action of multiple user appendages in connection with the sensing region. The display device may be separate from the user input system. For instance, the display device may be a television and the user input system may be a remote control device (or remote control system) that operates the television.
  • In accordance with another embodiment, another user input system is described. The user input system allows for user interaction with a graphical user interface. The user input system may include, for example, one or more processors and memory including instructions that, when executed collectively by the one or more processors, cause the user input system to cause a display device to display information according to user input. The display device may, for instance, display a representation of a user appendage at a display location of a display of the display device where the display location is based at least in part on an absolute mapping of display locations of the display device to sensing locations of a sensing region of a sensing device. At least when the appendage moves relative to and out of contact with the sensing device, the display device may change the display location based at least in part on the absolute mapping.
  • Variations of the user input system considered as being within the scope of the present disclosure include, but are not limited to, the instructions further causing the user input system to cause the display device to change a location of the representation based at least in part on movement of the user appendage relative to the sensing region. The instructions may further cause the user input system to identify the user appendage from a set of potential user appendages and/or cause the user input system to update the display based at least in part on detection of an event that is uncausable using at least one other of the potential user appendages. The sensing device may be a component of a device that is physically disconnected from the display device and/or the sensing device may be a remote control device for the display device.
  • In accordance with another embodiment, a display device is disclosed. The display device includes one or more processors and memory including instructions that, when executed collectively by the one or more processors, cause the display device to display information according to user input. The display device may, for instance, display a graphical user interface and receive signals corresponding to user interaction with a sensing region of a sensing input device, the signals being based at least in part on a number of dimensions of user interaction that is greater than two, and change the graphical user interface according to the received signals. The signals may be generated by an intermediate device that receives other signals from a remote control device. The sensing input device may be separate from the display device. Changing the graphical user interface may include updating an appearance characteristic of a representation of an object used to interact with the sensing input device. Changing the graphical user interface may also include updating, on the graphical user interface, a location of a representation of an object used to interact with the sensing input device. The user interaction may include contactless interaction with the sensing input device.
  • For a fuller understanding of the nature and advantages of the present invention, reference should be made to the ensuing detailed description and accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows an illustrative example of an environment in which various embodiments may be practiced.
  • FIG. 2 shows an illustration of a remote control device and a display device in accordance with at least one embodiment.
  • FIG. 3 shows the remote control device and the display device of FIG. 2 being navigated by a user in accordance with at least one embodiment.
  • FIG. 4 shows an illustrative example of a process for facilitating user navigation of an interface in accordance with at least one embodiment.
  • FIG. 5 shows an illustrative example of maintaining an interface in accordance with at least one embodiment.
  • FIG. 6 shows an illustration of an aspect of the invention in accordance with at least one embodiment.
  • FIG. 7 shows the aspect of FIG. 6 as it changes according to user movement in accordance with at least one embodiment.
  • DETAILED DESCRIPTION OF THE INVENTION
  • In the following description, various embodiments of the present invention will be described. For purposes of explanation, specific configurations and details are set forth in order to provide a thorough understanding of the embodiments. However, it will also be apparent to one skilled in the art that the present invention may be practiced without the specific details. Furthermore, well-known features may be omitted or simplified in order not to obscure the embodiment being described.
  • FIG. 1 shows an environment 100 in which various embodiments may be practiced. In accordance with an embodiment, environment 100 utilizes a content appliance 102 in order to provide content to a user. As illustrated in FIG. 1, the content may be provided to the user in various ways. For example, the environment 100 in FIG. 1 includes a television 104, an audio system 106 and a mobile device 108 (such as a mobile phone) that may be used to provide content to a user. Content may include video content, audio content, text content, and generally any type of content that may be provided audibly, visually, or otherwise to a user. Other devices may also be used in the environment 100. For example, as illustrated in FIG. 1, the environment 100 includes an audio visual (AV) receiver 110 which operates in connection with the television 104. Also, the environment 100 as illustrated in FIG. 1 includes a video camera 112, a set top box 114, a remote control 116, and a keyboard 118.
  • When a user utilizes an environment, such as the environment 100, one or more devices may utilize the content appliance 102 in some manner. To accomplish this, the various devices shown in FIG. 1 are configured to communicate with one another according to various protocols. As a result, in an embodiment, the content appliance 102 is configured to communicate with various devices utilizing different methods, such as according to the methods and protocols illustrated in FIG. 1. For example, in an embodiment, the content appliance 102 is configured to generate and transmit infrared (IR) signals to various devices that are configured to receive IR signals and perform one or more functions accordingly. Different devices may utilize different codes and the content appliance may be configured to generate the proper codes for each appliance. For example, a television from one manufacturer may utilize different codes than a television from another manufacturer. The content appliance 102 may be configured accordingly to generate and transmit appropriate codes. The content appliance may include a data store that has the codes for various devices, and codes may be obtained from remote sources, such as from remote databases as discussed below. In a setup process, a user may configure the content appliance 102 to submit the correct codes to the appropriate device(s).
  • As another example of how the content appliance 102 is able to communicate utilizing various protocols, the content appliance 102 includes various ports which may be used to connect with various devices. For example, in an embodiment, the content appliance 102 includes an HDMI OUT port 120 which may be used to provide content through an HDMI cable to another device. For example, as illustrated in FIG. 1, the HDMI OUT port 120 communicates content to the AV receiver 110. The HDMI OUT port may be used to provide content to other devices, such as directly to the television 104. In an embodiment, the content appliance 102 includes an S/PDIF port 122 to communicate with the audio system 106.
  • An Ethernet port 124 may be provided with the content appliance 102 to enable the content appliance 102 to communicate utilizing an appropriate networking protocol, such as illustrated in FIG. 1. For example, the content appliance 102 may communicate signals utilizing the Ethernet port 124 to communicate with the set top box 114. The set top box may operate according to an application of a content provider such as a satellite or cable television provider. The Ethernet port 124 of the content appliance 102 may be used to instruct the set top box 114 to obtain content on demand.
  • In an embodiment, the content appliance 102 includes one or more universal serial bus (USB) ports 126. The USB ports 126 may be utilized to communicate with various accessories that are configured to communicate utilizing a USB cable. For example, as shown in FIG. 1, the content appliance 102 communicates with a video camera 112. The video camera 112 may be used, for instance, to enable use of the content appliance to make video calls over a public communications network, such as the Internet 128. Generally, the content appliance 102 may be configured to communicate with any device connectable using USB techniques.
  • Other ports on the content appliance 102 may include RCA ports 130, in order to provide content to devices that are configured to communicate using such ports, and an HDMI IN port 132, which may be used to accept content from another device, such as from the set top box 114. Generally, the content appliance 102 may have ports in addition to those discussed above and, in some embodiments, may include fewer ports than illustrated.
  • Various devices in communication with the content appliance may be used to control the content appliance and other devices in the environment 100. For example, the remote control 116 may communicate with the content appliance 102 utilizing radio frequency (RF) communication. As described in more detail below, the remote control 116 may include a touch screen that may be used in accordance with the various embodiments described herein.
  • A keyboard 118 may also communicate with the content appliance 102 utilizing RF or another method (and possibly one or more other devices, either directly, or through the content appliance 102). The keyboard may be used for various actions, such as navigation on an interface displayed on the television 104, user input by a user typing utilizing the keyboard 118, and general remote control functions. For example, an interface displayed on the television 104 may include options for text entry. The user may type text utilizing the keyboard 118. Keystrokes that the user makes on the keyboard 118 may be communicated to the content appliance 102, which in turn generates an appropriate signal to send over an HDMI cable connecting the HDMI OUT port 120 to the AV receiver 110. The AV receiver 110 may communicate with the television 104 over HDMI or another suitable connection to enable the television to display text or other content that corresponds to the user input. The keyboard 118 may also include other features as well. For example, the keyboard 118 may include a touchpad, such as described below, or generally any touchpad that may allow for user navigation of an interface displayed on a display device. The touchpad may have proximity sensing capabilities to enable use of the keyboard in various embodiments of the present disclosure.
  • In an embodiment, the mobile device 108 is also able to control the content appliance 102 (and possibly other devices, either directly, or through the content appliance 102). The mobile device may include a remote control application that provides an interface for controlling the content appliance 102. In this particular example from FIG. 1, the mobile device 108 includes a touch screen that may be used in a manner described below. As the user interacts with the mobile device 108, the mobile device may communicate with the content appliance 102 over wi-fi utilizing signals that correspond to the user's interaction with the mobile device 108. The content appliance 102 may be, for instance, configured to receive signals from the mobile device over wi-fi (directly, as illustrated, or indirectly, such as through a wireless router or other device). The content appliance may be configured to generate signals of another type (such as IR, HDMI, RF, and the like) that correspond to codes received over wi-fi from the mobile device 108 and then generate and transmit signals accordingly. An application executing on the mobile device 108 may provide a graphical user interface that allows users to use the mobile device 108 as a remote control and generate such codes accordingly. The mobile device 108 (and other devices), as illustrated, may be configured to receive information from the content appliance 102 and reconfigure itself according to the information received. The mobile device 108 may, for example, update a display and/or update any applications executing on the mobile device 108 according to information received by the content appliance 102. It should be noted that, while the present disclosure discusses a mobile device illustrated as a mobile phone, the mobile device may be a different device with at least some similar capabilities. For example, the mobile device may be a portable music player or tablet computing device with a touch screen. Example mobile devices include, but are not limited to, a mobile phone with a touch screen (e.g., a smartphone such as an iPhone or an Android based phone, etc.), a portable music player (e.g., an iPod, etc.), a tablet computing device (e.g., an iPad, iPad2, etc.), and other devices with touch sensitive user input devices. Of course, such devices (and other devices) may be included additionally in a mobile device in the environment illustrated in FIG. 1.
  • In an embodiment, the content appliance 102 is also configured to utilize various services provided over a public communications network, such as the Internet 128. As an example, the content appliance 102 may communicate with a router 134 of a home network. The content appliance 102 and the router 134 may communicate utilizing a wired or wireless connection. The router 134 may be directly or indirectly connected to the Internet 128 in order to access various third-party services. For example, in an embodiment, a code service 136 is provided. The code service in an embodiment provides codes for the content appliance 102 to control various devices, enabling the content appliance to translate codes received from another device (such as the remote control 116, the keyboard 118, and/or the mobile device 108). The various devices to control may be identified to the content appliance 102 by user input or through automated means. The content appliance 102 may submit a request through the router 134 to the code service 136 for appropriate codes. The codes may be, for example, IR codes that are used to control the various devices that utilize IR for communication. Thus, for example, if a user presses a button on the remote control 116, keyboard 118, or an interface element of the mobile device 108, a signal corresponding to the selection by the user may be communicated to the content appliance 102. The content appliance 102 may then generate a code based at least in part on information received from the code service 136. As an illustrative example, if the user presses a play button of the remote control 116, a signal corresponding to selection of the play button may be sent to the content appliance 102, which may generate a play IR code, which is then transmitted to the television 104 or to another suitable appliance, such as generally any appliance that is able to play content.
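  • The button-to-IR translation described above might be organized around a locally cached table of codes retrieved from the code service, as in the following sketch; the device identifier, button names, and hexadecimal values are placeholders rather than real codes, and the lookup structure is an assumption, not the structure used by the disclosure.

```python
# Hypothetical, locally cached subset of codes retrieved from a code service.
# The identifiers and hex strings are placeholders, not real IR codes.
IR_CODES = {
    ("example_tv_model_x", "play"):  "0x00FF12ED",
    ("example_tv_model_x", "pause"): "0x00FF38C7",
}

def ir_code_for(device_id: str, button: str):
    """Translate a button press reported by a remote control into the IR
    code to transmit to the target device, or None if the code is unknown
    and must be requested from the code service."""
    return IR_CODES.get((device_id, button))

print(ir_code_for("example_tv_model_x", "play"))   # -> 0x00FF12ED
```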
  • Other services that may be accessed by the content appliance 102 over the Internet 128 include various content services 138. The content services may be, for example, any information resource, such as websites, video-streaming services, audio-streaming services and generally any services that provide content over the Internet 128.
  • It should be noted that the environment illustrated in FIG. 1 is provided for the purpose of illustration and that numerous environments may be used to practice embodiments of the present disclosure. Various embodiments, for example, are applicable in any environment where proximity sensing is used as a method of enabling user input, including any environment in which a touch pad or touch screen with proximity sensing capabilities is used to interact with a GUI on a separate display. It should be noted that various embodiments may be described as utilizing a particular input device such as a touch pad or touch screen but, unless otherwise clear from context, various embodiments of the invention may utilize input devices other than those explicitly described. For instance, in many instances, such as where a display on the input device is not required, embodiments described as utilizing a touch screen may be modified to utilize a touch pad or other suitable input device having touch and/or proximity sensing capabilities as an alternative to a touch screen. FIG. 1 shows an example environment in which user input is provided to a display (television, in the illustrated example) through a content appliance. However, techniques of the present disclosure are also applicable for providing user input directly to a device with a display. For instance, the various techniques described herein may be used in connection with a television remote control device, where the television remote control device sends signals according to user interaction with a touch screen directly to a television.
  • FIG. 2 shows an illustration of such a remote control device 202 and a television 204, although signals from the remote control device 202 could be transmitted using a content appliance, such as described above in connection with FIG. 1. In an embodiment, the remote control device 202 is used to control how content is displayed on the television 204 or other device with a display. The remote control device 202 may communicate signals directly to the television 204 or through an intermediate device, such as the content appliance 102 described above in connection with FIG. 1.
  • In an embodiment, the television 204 may display an interface which is navigable by a user utilizing the remote control 202. For example, the television 204 in FIG. 2 displays an interface 206 which includes a plurality of selectable options. In this specific example, the options are provided as interface buttons displayed on the television 204. Each button in this example corresponds to an activity that the user may perform by selecting one of the buttons. For example, the interface 206 includes a watch TV button 208. Selection of the watch TV button 208 may result in one or more devices changing to a state suitable for watching TV on the television 204. For example, a set top box (not shown) may be put into an on state if it was not in such a state already. The television 204 may be put into a state wherein it receives television content from an appropriate source, such as from the set top box or from a different device, such as the content appliance 102 described in connection with FIG. 1.
  • In this example, the television 204 (or a network of devices that includes the television 204) is configured to utilize one or more other devices, such as a DVD player, music player, a gaming device, and devices that allow communication over the Internet. For example, in an embodiment the users are able to utilize the television 204 to check an email account and/or stream a movie from a remote streaming service. Generally, the television 204 or a network of devices that includes the television 204 may be configured for use with any device involved in providing content, either from the devices themselves, or from other sources, including remote sources accessible over the Internet or other communications network.
  • In an embodiment, the remote control 202 includes a touch screen 210. As noted, while the remote control is described as having a touch screen 210, embodiments may utilize a remote control with a touch pad instead of or in addition to a touch screen. The touch screen 210 or touch pad in an embodiment operates using capacitive proximity sensing. The touch screen 210 (or touch pad) may be configured, for example, according to the disclosure of U.S. application Ser. No. 13/047,962, referenced above. The touch screen 210 (or touch pad), however, may utilize any technique for proximity sensing. As discussed below, the touch screen 210 (or touch pad) may be used as an input device by a user. The input may be input for controlling the remote control device 202 and/or the television 204. Other mechanisms for input may be used in addition to the touch screen 210. For instance, FIG. 2 illustrates various buttons, many of which are common on standard remote controls. Such buttons may be selected to perform corresponding functions. Performance of some functions may be caused by selection of corresponding buttons or selections of interface elements on the touchscreen 210, although some functions may be causable by either the buttons or the touchscreen 210, but not both.
  • In an embodiment, there is an absolute mapping between locations on the touch screen or touch pad 210 and locations on the graphical user interface on the television (or other display device). Thus, each location on the touch screen or touch pad 210 is mapped to at least one location on the user interface on the display. The absolute mapping may be a surjective mapping from the locations of the user interface on the display to the locations of the touch screen. The mapping may be a surjective mapping from the locations of the touch screen to the locations of the user interface if there are more locations on the touch screen or touch pad than the user interface. A mapping may also be a one-to-one mapping between touch screen sensor locations and pointer locations on the interface. Generally, the mapping may be any mapping that is configured such that, from the user perspective, each location on the touch screen has a corresponding location on the user interface. It should be noted, however, that some embodiments of the present disclosure may utilize a relative mapping between the touch screen and user interface.
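  • An absolute mapping of this kind can be expressed as a proportional scaling from sensor coordinates to display coordinates, as sketched below; the sensor and display resolutions are example values only, and the function name is illustrative.

```python
def map_to_display(sx, sy, sensor_size=(1024, 768), display_size=(1920, 1080)):
    """Absolute mapping: a sensor location (sx, sy) always maps to the same
    display location, regardless of any previous pointer position."""
    sw, sh = sensor_size
    dw, dh = display_size
    return (sx * (dw - 1) / (sw - 1), sy * (dh - 1) / (sh - 1))

# The centre of the touch surface always maps to the centre of the display.
print(map_to_display(511.5, 383.5))   # -> (959.5, 539.5)
```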
  • Mappings between a region of a remote control device and a display device may be determined in various ways. For example, in an embodiment, a device in connection with the display device utilizes extended display identification data (EDID) or enhanced EDID (E-EDID) received over a high-definition multimedia interface (HDMI) or other connection to determine display parameters for the display device. The data may, for example, specify a maximum horizontal image size and maximum vertical image size, which allows for a mapping of the sensing region to the display. Other ways of determining the display size for generating a mapping may be used. For instance, a user may input the display size during a setup process. The user may alternatively enter an identifier (e.g., a model number) for a display device and a database (which may be a remote database) may be referenced to determine the display size based on the model identifier. Alternatively, some other method may be used to determine an identifier for the display device (e.g., knowledge of the legacy remote control used by the display device), and the identifier may be used to determine the display size (possibly using a remote database). Generally, any suitable method for determining the display dimensions and generating a mapping may be used.
  • In an embodiment, the user interacts with the touch screen or touch pad 210 in order to navigate the user interface displayed on the television 204. Generally, the user may interact with the touch screen 210 (or touch pad) by using one or more appendages (such as fingers) to touch and/or hover over the touch screen 210 and/or move the one or more appendages relative to the touch screen 210. The manner in which the user interacts with the touch screen 210 may be sensed by the touch screen 210 (or touch pad) to generate signals. The generated signals may be interpreted by one or more processors of the remote control 202 to generate one or more other signals corresponding to user input that are transmitted by the remote control 202 to another device, such as directly to the television 204 or to a content appliance, such as described above, or in any suitable manner. For example, if the user touches the touch screen 210 (or touch pad) with his or her finger and moves the finger upward while in contact with the touch screen 210 (touch pad), one or more processors of the remote control 202 may interpret signals generated according to such touch and movement and generate and transmit one or more other signals that enable another device to update a device on which a GUI is displayed accordingly. Alternatively, signals generated by the touch screen 210 or signals derived therefrom may be transmitted to another device (such as the television 204 or a content appliance) and interpreted by the other device. Generally, any manner in which signals generated by the touch screen 210 (or touch pad) are interpreted as user input may be used.
  • It should be noted that FIG. 2 is simplified for the purpose of illustration and additional details and/or variations are considered as being within the scope of the present disclosure. For example, FIG. 2 shows an example where a remote control device is used to control a display on a television. Generally, the techniques of the present disclosure may apply in any instance in which a touch-sensitive surface is used to provide user input for interaction with a GUI. Further, FIG. 2 shows the touch screen 210 without a display during user interaction. The touch screen 210 may, however, include a display when the user interacts with the touch screen 210, and/or at other times. The display of the touch screen 210 may, for example, be identical or similar to the display on the television 204. In this manner, the user can see the GUI by looking at either the remote control 202 or the television 204. As noted, a touch pad or other touch-sensitive user input device with which proximity sensing is possible may be used in accordance with various embodiments.
  • In addition, FIG. 2 shows a touch screen on a device (remote control 202) that is different from the device on which the GUI is displayed (or primarily displayed). However, techniques of the present disclosure may also be applied in instances in which the touch screen and display are incorporated in a single device. A notebook computer, for example, may include a touch screen that is separate from a display of the notebook computer. The techniques described herein may also be applied to single devices. For example, a touch screen and display operating in accordance with the present disclosure may both be part of a single mobile device (such as a smart phone, tablet computing device, or other device). Generally, the scope of the present disclosure is not limited to the embodiments explicitly described and illustrated herein.
  • As mentioned, various embodiments of the present disclosure utilize touch and proximity sensing technology to allow a user to interact with a graphical user interface. FIG. 3 accordingly shows an illustrative example of how the touch screen 210 of FIG. 2 may be utilized to navigate the interface on the television 204. As with FIG. 2, techniques illustrated in FIG. 3 could apply to a variety of environments and devices, not just those explicitly illustrated and described herein.
  • FIG. 3 in this particular example shows a remote control device 302 and a television 304 which may be the remote control device 202 and television 204 described above in connection with FIG. 2. Also shown in FIG. 3, as in FIG. 2, the television 304 displays an interface 306 which has various selectable options, such as a watch TV button 308. In addition, the remote control device 302 includes a touch screen 310 which may be the touch screen 210 described above in connection with FIG. 2.
  • In an embodiment, when the user interacts with the touch screen 310, visual indicators of such interaction appear on the interface displayed on the television 304. For example, interaction with the touch screen 310 in an embodiment includes touching the touch screen 310 and, generally, performing actions in close proximity to the touch screen 310. For example, as shown in FIG. 3, a user is interacting with the touch screen 310 with a left hand 312 and a right hand 314. It should be noted, however, that other appendages may be used to interact with the touch screen 310. For example, the user may interact with the touch screen 310 with any appendage or combination of appendages, or other portions of appendages (such as the palm of a hand). In addition, other items, such as styluses or other, non-human, things may be used for interaction with the touch screen 310, although the present disclosure will focus on thumbs for the purpose of illustration.
  • Accordingly, in FIG. 3, a left thumb 316 and a right thumb 318 are shown interacting with touch screen 310. In this example, the user is hovering over, that is, not touching, the touch screen 310 with the left thumb 316 and, in particular, in the lower-left corner of the touch screen 310. However, with the right thumb 318, the user is touching the touch screen 310 and in particular, touching an upper-right portion of touch screen 310. In an embodiment, as a result of such user interaction with the touch screen 310, representations of the left thumb 316 and right thumb 318 are shown in corresponding places on the user interface 306. For example, a representation 320 of the left thumb 316 is displayed at a location of the user interface 306 that corresponds to the location of the touch screen 310 over which the left thumb 316 is hovering. Similarly, a representation 322 of the right thumb 318 appears in the upper-right hand portion of the user interface 306 at a location that corresponds to a location touched by the user on the touch screen 310.
  • In the illustrative example of FIG. 3, the representation 320 of the left thumb 316 and the representation 322 of the right thumb 318 each have the appearance of a thumb. That is, the representations on the television 304 resemble the appendages used by the user. However, other representations may be used which do not necessarily resemble body parts. For example, circles or targets or any visual indicator of the user's interaction with touch screen 310 may be used. Further, FIG. 3 shows representations overlaid on a GUI for the purpose of illustration. However, other ways of providing visual feedback to the user based on the user's interaction with the touch screen may be used. For instance, a representation may not be overlaid on the user interface, but may be made by manipulating the user interface in a way that indicates user interaction with the touch screen 310. For example, the user interface could have a warped effect at a location where the user interacts with the touch screen 310. As another example, the display may brighten at locations corresponding to a location with which a user interacts with the touch screen 310. As yet another example, interface elements, such as selectable options of the user interface, may change color, brightness or other characteristics when the user interacts with the touch screen 310 in a corresponding location. Generally, any manner of providing visual feedback of the user's interaction with the touch screen 310 may be used.
  • The visual feedback of user interaction with the touch screen 310 may be provided in a varying manner. For example, as shown in FIG. 3, color characteristics of representations of the user appendages vary according to how the user is interacting with touch screen 310. For example, because the left thumb 316 is not touching but is only hovering over the touch screen 310, the representation 320 of the left thumb 316 in this example appears dim and transparent; that is, the combination of the representation 320 of the left thumb and the user interface has an appearance as if the interface shows through the representation 320 of the left thumb 316.
  • The representation 322 of the right thumb 318, on the other hand, appears bright and opaque, thereby indicating to the user that the right thumb is touching the touch screen 310. In this manner, the user can hover over the touch screen with an appendage and, based on the location of the representation on the interface 306, knows where to move his or her appendage to navigate the interface as desired. Because, in this example, the representation is transparent when the corresponding appendage is hovering, the representation does not obscure the user interface. For example, as shown in FIG. 3, the representation 320 of the left thumb 316 appears transparently over a play a game option 324 of the user interface 306. In this manner, the user does not need to move the left thumb 316 in order to see what option would be selected by pressing the touch screen 310 at the same location over which the left thumb 316 is hovering. As noted, other ways of providing visual feedback that do not obscure elements of the interface 306, such as by changing elements of the interface to be still recognizable, may be used and, in some embodiments, elements of the interface may be allowed to be obscured.
  • Other indications of user interaction with an interface may also be shown in addition to the representations of the appendages. For example, in FIG. 3, lines radiating from the representation of the right thumb 322 at a location corresponding to a location where the user touches the touch screen 310 are shown as an illustrative example. The radiated lines indicate that the user has touched the touch screen at this location and therefore selected a corresponding option on the user interface which, in this example, is a play music option 326. The lines may appear responsive to contact with the touch screen 310 (or proximity within a threshold), and may subsequently disappear, such as when the corresponding appendage loses contact with the touch screen 310, when a determination is made that the user made a selection of an element, and/or at another time.
  • Other variations not illustrated in this figure, but described below, may also be used. For example, the amount by which a representation is transparent may vary according to a distance by which an appendage is hovering over the touch screen 310. Further, while the example in FIG. 3 shows the user navigating the user interface using the touch screen 310 utilizing two appendages, his or her thumbs, the navigation may be done by a single appendage or other mechanism, or more than two appendages. When a user selects an option of user interface 306, representations may disappear in order to allow a user to fully view a new updated interface that appears as a result of the selection of the option.
  • In addition, while the illustrative examples in FIGS. 2 and 3 illustrate a particular interface for a particular purpose, other GUIs may be used in accordance with the present disclosure. Generally, any GUI that may be presented on a display and manipulated using touch techniques may be used. Example GUIs are GUIs configured for use with any operating system that allows for touch-based input. Embodiments of the present disclosure may also be used to enable non-touch-based operating systems to be navigated using touch techniques. For example, a left-right-up-down (LRUD) operating system, such as on some televisions, may allow navigation only according to the left, right, up, or down directions. User input on a touch screen may be translated to corresponding left, right, up, or down commands to enable touch navigation. For instance, a touch screen may be divided into regions, where each region corresponds to one or more LRUD commands. Touching the touch screen at an upper middle portion, for instance, may correspond to an up command. Some regions may correspond to multiple commands. Touching a touch screen in a corner, for instance, may correspond to a sequence of LRUD commands. The lower left corner may, for example, correspond to a down-left or left-down sequence of commands. The commands may be transmitted in sequence to the device with the LRUD operating system upon selection of the touch screen. Generally, one or more ways of interacting with a touch screen may correspond to one or more commands of a non-touch-screen operating system.
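  • A translation of this kind might divide the touch surface into a three-by-three grid whose regions map to one- or two-command sequences, as in the sketch below; the particular grid layout, including the center "select" region, is one possible choice rather than an arrangement required by the disclosure.

```python
# One possible 3x3 grid of regions mapping a touch location to a sequence of
# left/right/up/down commands for a non-touch (LRUD) operating system.
LRUD_GRID = [
    ["up-left",   "up",     "up-right"],
    ["left",      "select", "right"],
    ["down-left", "down",   "down-right"],
]

def lrud_commands(x, y, width, height):
    """Return the list of LRUD commands for a touch at (x, y) on a touch
    surface of the given dimensions (sketch; the region layout is illustrative)."""
    col = min(int(3 * x / width), 2)
    row = min(int(3 * y / height), 2)
    return LRUD_GRID[row][col].split("-")

print(lrud_commands(5, 590, 600, 600))   # lower-left corner -> ['down', 'left']
```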
  • FIG. 4 shows an illustrative example of a process 400 that may be used to provide navigation of an interface, such as in the manner illustrated in connection with FIGS. 2 and 3. Some or all of the process 400 (or any other processes described herein, or variations and/or combinations thereof) may be performed under the control of one or more computer systems configured with executable instructions and may be implemented as code (e.g., executable instructions, one or more computer programs, or one or more applications) executing collectively on one or more processors, by hardware, or combinations thereof. A computer system may be any device capable of performing processing functions, such as a notebook computer, desktop computer, remote control device, content appliance, mobile phone, or tablet computing device, and, generally, any device or collection of devices that utilizes one or more processors. One or more of the actions depicted in FIG. 4 may be performed by a device such as a remote control device, a content appliance, a television, or, generally, any device that is configured to participate in providing content. The code may be stored on a computer-readable storage medium, for example, in the form of a computer program comprising a plurality of instructions executable by one or more processors. The computer-readable storage medium may be non-transitory.
  • In an embodiment, the process 400 includes displaying 402 an interface, such as described above. As used herein, displaying an interface may mean actually displaying the interface or taking one or more actions that cause the interface to be displayed. For example, referring to FIG. 1, the content appliance 102 may display an interface by generating a signal and causing the signal to be sent to the AV receiver, which relays the signal to the television 104. A device performing the process 400, such as a television thus configured, may display the interface itself.
  • In an embodiment, a proximate appendage is detected 404. Detecting a proximate appendage may be done in any suitable manner, such as by utilizing the techniques in U.S. application Ser. No. 13/047,962 and U.S. application Ser. No. 12/840,320, described above. The appendage may be detected, for example, when the user moves an appendage within a certain distance of a touch screen. Depending on the particular environment in which the process 400 is performed, detecting the appendage may be accomplished in various ways. For example, the appendage may be detected directly, or a signal may be received from another device, such as a remote control device that detected the appendage.
  • Once the appendage is detected 404, in an embodiment, an interface display location is determined 406 based at least in part on a mapping of input device locations to interface display locations, which may be an absolute mapping, as described above. Upon determining the interface display location, in an embodiment, a representation of the appendage is overlaid 408 on the interface display at the determined interface display location. As discussed above, other ways of providing representations may be used, although overlays of representations are used here for the purpose of illustration. As the user moves his or her appendage relative to the touch screen, the position of the representation on the interface may be updated 410 according to the movement of the appendage. For example, if the user moves the appendage to the left, the representation of the appendage may move to the left as well. If the user moves the appendage up or down relative to the touch screen (that is, closer to or farther from it), the representation may remain in the same place but change color characteristics, such as described above. Determining how to update the position of the representation may include multiple detections of the appendage and corresponding determinations of the interface display location based on the mapping.
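  • The following is a minimal sketch of the absolute mapping used to determine 406 an interface display location, assuming simple proportional scaling between the touch screen's coordinate space and the display's; the function name and dimensions are illustrative assumptions rather than anything specified in the disclosure.

```python
# Minimal sketch of an absolute mapping from touch screen coordinates to
# interface display coordinates (step 406). Proportional scaling is an
# assumption; any fixed mapping of sensor locations to display locations
# could be substituted.

def to_display_location(sensor_x, sensor_y, sensor_size, display_size):
    """Scale a sensed (x, y) position on the touch screen to the
    corresponding position on the display."""
    sensor_w, sensor_h = sensor_size
    display_w, display_h = display_size
    return (sensor_x / sensor_w * display_w,
            sensor_y / sensor_h * display_h)

# Example: an appendage hovering at (120, 80) on a 480x320 touch screen
# maps to (480.0, 270.0) on a 1920x1080 display.
print(to_display_location(120, 80, (480, 320), (1920, 1080)))
```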
  • In an embodiment, at some point during interaction with the touch screen, the user may make contact with the touch screen. In such instances, when the user touches the touch screen, a touch event of the appendage is detected 412. When a touch event is detected 412, an operation according to the touch event type and/or the location of the touch is performed 414. The way in which the user touches the touch screen may indicate, for example, how the user wishes to navigate the user interface. For instance, touching the touch screen and moving the appendage while in contact with the touch screen may indicate a drag operation in the user interface. If the initial touch was on an element that is draggable in the user interface, the element may move accordingly. Similarly, if the user touches the touch screen and subsequently raises the appendage away from the touch screen, losing contact with it, such an event may indicate selection of an option at the location that was touched. A double tap on the touch surface 210 may also be appropriately interpreted (for example, as a double click on an icon on the display).
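  • By way of illustration only, the sketch below shows one way a press/release cycle might be classified as a selection, a drag, or a double tap; the distance and timing thresholds, and the event names, are assumptions and are not taken from the present disclosure.

```python
# Illustrative sketch of interpreting a touch event (step 414). The movement
# and timing thresholds are assumptions for illustration.

DRAG_DISTANCE = 10.0        # pixels of movement while in contact -> drag
DOUBLE_TAP_INTERVAL = 0.3   # seconds between successive taps -> double tap

def classify_touch(down_pos, up_pos, time_since_last_tap):
    """Classify one press/release cycle on the touch screen."""
    dx = up_pos[0] - down_pos[0]
    dy = up_pos[1] - down_pos[1]
    moved = (dx * dx + dy * dy) ** 0.5

    if moved > DRAG_DISTANCE:
        return "drag"        # movement while in contact: drag an element
    if time_since_last_tap < DOUBLE_TAP_INTERVAL:
        return "double_tap"  # e.g. interpreted as a double click on an icon
    return "select"          # touch then release: select the touched option

print(classify_touch((100, 100), (102, 101), 5.0))  # 'select'
```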
  • FIG. 5 shows a more detailed process 500 which may be used to update a user interface in accordance with an embodiment. The process illustrated in FIG. 5 may be performed, for example, by a remote control device, such as described above, or generally by any device with a touch screen that is configured to operate in accordance with the present disclosure. In an embodiment, the process 500 includes displaying 502 an interface, such as described above. During performance of the process 500, the touch screen may be monitored 504 for various events involving user interaction with the touch screen. A process in a device performing the process 500 may periodically or otherwise poll for events and take action when events are detected. For example, as illustrated in FIG. 5, a repetitive sub-process is performed in which the touch screen is monitored 504 and determinations are made 506 whether an appendage is detected.
  • Determining whether an appendage is detected may be performed in various ways. For example, in some embodiments, the determination may simply be a determination of whether signals from a touch screen indicate the presence of an appendage proximate to the touch screen. However, the determination may be more complex and may include other determinations. For example, determining whether an appendage is detected may include determining how many appendages are detected. In addition, in an embodiment, for any appendages detected, the detected appendages may be matched to actual appendages. For instance, referring to FIG. 3, the two detected appendages may be matched to the left and right thumb. Other appendages and objects may be detected and matched, such as other fingers, styluses, and the like.
  • Matching detected appendages to actual appendages may be done in various ways. For example, in an embodiment, when an appendage is detected, the appendage will generally cause different portions of the touch screen to generate different signals. With capacitive sensing, for instance, the capacitance measurements for the touch screen may increase at locations that are proximate to the detected appendage. The locations for which the capacitance changes may be used to determine which actual appendage a detected appendage corresponds to. For example, a thumb will generally affect a larger region of the touch screen than other fingers due to the thumb's larger relative size. In addition, regions of affected locations on the touch screen may generally be oriented differently depending on the appendage being sensed. Referring to FIG. 3, for example, a region of locations on the touch screen 310 for the left thumb would generally point upward and to the right, whereas a region for the right thumb would point upward and to the left. Some suitable techniques for matching detected appendages to fingers are described in U.S. application Ser. No. 13/047,962 and U.S. application Ser. No. 12/840,320, referenced above.
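  • The following sketch illustrates, under assumed size and orientation thresholds, how a sensed region might be matched to a thumb or another finger in the manner described above; the numeric values are illustrative only and are not taken from the referenced applications.

```python
# Illustrative sketch of matching a detected contact/proximity region to an
# appendage using its size and orientation. The size threshold and the sign
# convention for the angle are assumptions for illustration.

def match_appendage(region_area, orientation_deg):
    """Guess which appendage produced a sensed region.

    region_area     -- area of affected touch screen locations (mm^2)
    orientation_deg -- major-axis angle of the region, measured from the
                       horizontal; positive angles lean to the right.
    """
    if region_area < 80:          # small region: likely a non-thumb finger
        return "finger"
    # Large region: a thumb. A left thumb tends to point up and to the
    # right; a right thumb tends to point up and to the left.
    if orientation_deg > 0:
        return "left thumb"
    return "right thumb"

print(match_appendage(120, 35))   # 'left thumb'
print(match_appendage(120, -40))  # 'right thumb'
```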
  • Returning to the process 500, in an embodiment, if no appendage is detected, the touch screen continues to be monitored, as illustrated. However, if it is determined 506 that an appendage is detected, then a determination may be made whether touch event criteria have been satisfied. (The touch screen may continue to be monitored when a touch event is detected, for example, to detect further touch events.) Touch event criteria are criteria that, when met, indicate a touch event. For example, criteria for a touch event corresponding to selection of a user interface element may be that the user contacts the touch screen and then loses contact within a predetermined period of time. Other criteria may be simpler, of the same complexity, and/or more complex. Criteria may take into account information about the timing of various activities, such as how long the user has touched the touch screen, how many appendages or other objects touched the screen, whether or not the user moved a certain amount while in contact with the touch screen, and the like.
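  • The sketch below illustrates one way touch event criteria might be represented as predicates over an interaction record; the field names, thresholds, and event names are assumptions for illustration rather than anything specified in the disclosure.

```python
# A minimal sketch of touch event criteria, assuming an interaction record
# with the fields shown.

def selection_criteria(interaction):
    # Contact was made and then lost within a predetermined period of time.
    return (interaction["released"]
            and interaction["contact_duration"] < 0.5)

def drag_criteria(interaction):
    # The appendage moved a certain amount while still in contact.
    return interaction["moved_distance"] > 10.0 and not interaction["released"]

TOUCH_EVENT_CRITERIA = {
    "select": selection_criteria,
    "drag": drag_criteria,
}

def satisfied_events(interaction):
    """Return the touch events whose criteria the interaction satisfies."""
    return [name for name, criteria in TOUCH_EVENT_CRITERIA.items()
            if criteria(interaction)]

example = {"released": True, "contact_duration": 0.2, "moved_distance": 1.0}
print(satisfied_events(example))  # ['select']
```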
  • In an embodiment, the touch event criteria take into account matches to appendages that have been detected. Different touch events may correspond to different actions by a user using different subsets of his or her fingers (or other appendages or objects). For example, in an embodiment, when a middle finger and an index finger are detected, a right click event may be generated at a user interface (UI) location corresponding to a touch screen location selected by the index finger. The location of the UI may correspond to a particular right-click menu of selectable interface options that may be displayed upon detection of the right click event. As another example, when a thumb and right index finger are detected, a scroll left event may be generated for a UI element (scroll bar, icon, etc.) selected by the right index finger. Similarly, when a thumb and right ring finger are detected, a scroll right event may be generated for a UI element selected by the right ring finger. When a thumb is detected, a scroll left or right event may be generated for a UI element selected by the index finger. The direction of scroll may be determined in many ways, such as by which thumb (left or right) is detected, by the direction of movement of the detected thumb, and the like. Generally, any way of matching actions of sets of one or more fingers (or other objects) to events may be used.
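  • A minimal sketch of such a mapping from detected appendage combinations to UI events, using the example combinations above, might look as follows; the dictionary contents and event names are illustrative assumptions.

```python
# Sketch of mapping detected appendage combinations to UI events.

APPENDAGE_COMBO_EVENTS = {
    frozenset(["middle finger", "index finger"]): "right_click",
    frozenset(["thumb", "right index finger"]):   "scroll_left",
    frozenset(["thumb", "right ring finger"]):    "scroll_right",
}

def event_for_appendages(detected):
    """Return the UI event, if any, for a set of detected appendages."""
    return APPENDAGE_COMBO_EVENTS.get(frozenset(detected))

print(event_for_appendages({"middle finger", "index finger"}))  # right_click
```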
  • If it is determined that touch event criteria are satisfied, a determination may be made 510 of the touch event type. If, for example, it is determined that the touch event type was an object selection, then the interface may be updated 512 according to the selection. As noted, updating the interface may be done in any suitable way, such as changing a display of the interface on a display device, providing a completely new display, or generally changing the interface in accordance with its programming logic. An overlay of representations on the interface may be removed, in accordance with an embodiment. If the touch event type is another type, such as a drag, scroll, or other event, then the user interface may be updated if applicable. For example, if the user interface is scrollable, the user interface may be scrolled. If a scroll event is detected with an interface object selected, the object may be moved on the interface accordingly. As another example, if the user touches the touch screen at a location corresponding to a portion of the user interface that is not manipulable through user interaction and moves within that area while in contact with the touch screen, the user interface may be left as is, although representations of detected appendages may be updated to track the movement.
  • As illustrated in FIG. 5, when an interface is updated or otherwise, the touch screen may continue to be monitored 504. If it has been determined that touch event criteria have not been satisfied (for example, if the user has touched the touch screen but has not moved or lost contact with it), a determination may be made 516 whether the detection of the appendage corresponds to a touch or a hover. If it is determined 516 that the detected appendage is hovering, then a hover mode representation of the appendage may be overlaid or updated on the interface screen at a location corresponding to the detected location. For example, if the hover mode representation is already on the interface, the position of the representation may be changed accordingly. If an overlay of the representation was not yet on the user interface, a representation may be overlaid at the location corresponding to the location at which the appendage was detected on the touch screen. If it is determined that the detected appendage is touching the touch screen, a touch mode representation of the appendage may be overlaid or updated accordingly on the interface screen at a location corresponding to the detected location. For example, if a hover mode representation was on the screen and the user then touched the touch screen, the hover mode representation may be changed into a touch mode representation. If no representation was on the interface, then a touch mode representation may appear. If a touch mode representation was already present on the interface, the location of the representation may be updated in accordance with any movement by the user, for example if the location has changed since the last time the determination was made. As with other steps herein, the touch screen continues to be monitored during performance of the process 500.
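  • The following sketch illustrates the representation update at determination 516, assuming a simple Representation object and an assumed hover transparency value; it is illustrative rather than a definitive implementation of the process 500.

```python
# Sketch of the representation update at step 516: overlay a hover mode or
# touch mode representation and keep it at the mapped display location.
# The Representation class and the hover alpha value are assumptions.

class Representation:
    def __init__(self):
        self.visible = False
        self.mode = None       # "hover" or "touch"
        self.alpha = 0.0       # 0.0 = fully transparent, 1.0 = opaque
        self.location = None   # interface display location

def update_representation(rep, touching, display_location):
    """Overlay the representation if absent, or update its mode and location."""
    rep.mode = "touch" if touching else "hover"
    rep.alpha = 1.0 if touching else 0.4   # hovering: drawn semi-transparently
    rep.location = display_location        # track movement of the appendage
    rep.visible = True
    return rep

rep = update_representation(Representation(), touching=False,
                            display_location=(480, 270))
print(rep.mode, rep.alpha)  # hover 0.4
```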
  • As noted, representations of user appendages or other devices used to interact with a touch screen may change according to the manner in which the interaction is performed. One way of doing so is to change the representation based at least in part on a distance of a detected appendage from the touch screen and, in particular, based at least in part on the distances of multiple locations of a detected appendage (or other object). As described above, the color characteristics of a representation of an appendage may be changed based at least in part on the distance the appendage is from the touch screen. FIG. 6, for example, shows a touch screen 602 over which a finger 604 is hovering at a distance, not in contact with the touch screen 602. FIG. 6 also shows a corresponding user interface 606 on which a representation 608 of the finger 604 appears. It should be noted that the touch screen 602 and user interface 606 are shown simplified for the purpose of illustrating an embodiment. For example, the touch screen 602 is shown without the other portions of the device to which the touch screen 602 is attached. Similarly, the user interface 606 is illustrated without the device on which the user interface is displayed and without a GUI, although the user interface may, and typically will, have a GUI displayed. The touch screen 602 may similarly have a GUI displayed, which may match the GUI displayed on the user interface 606.
  • The representation 608 of the finger 604 appears on the interface at a location corresponding to the location at which the finger 604 hovers over the touch screen 602. In addition, in this particular example, the representation 608 resembles an outline of a finger, and this outline is oriented according to the orientation of the finger 604 over the touch screen 602. As shown in FIG. 6, different portions of the finger 604 are at different distances from the touch screen 602. The representation 608 of the finger 604 has color characteristics that change according to these distances. For example, portions of the representation 608 that correspond to locations of the finger 604 that are closer to the touch screen 602 are darker than portions of the representation 608 that correspond to locations on the finger 604 that are further from the touch screen 602. As the user reorients the finger, for example, so that parts of the finger are at distances different from those illustrated in FIG. 6, the representation may be updated accordingly. Similarly, as the finger 604 is moved relative to the touch screen 602, the location of the representation 608 on the user interface 606 may be updated as well. (If a representation is displayed on the touch screen, the representation may be updated there as well.) When updated, the transparency and/or other color characteristics of the representation 608 may change according to changes in the distances of the various portions of the finger 604 from the touch screen 602.
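  • As an illustrative sketch of varying transparency with hover distance, the function below maps a sampled distance to an opacity value, assuming a maximum sensing distance; the specific values are assumptions rather than parameters from the disclosure.

```python
# Sketch of varying a representation's transparency with hover distance, as
# in FIGS. 6 and 7: portions of the finger closer to the touch screen are
# drawn more opaque. The maximum sensing distance is an assumption.

MAX_SENSE_DISTANCE_MM = 30.0   # beyond this, the appendage is not rendered

def alpha_for_distance(distance_mm):
    """Return an opacity in [0, 1]: 1.0 at contact, 0.0 at the sensing limit."""
    distance_mm = max(0.0, min(distance_mm, MAX_SENSE_DISTANCE_MM))
    return 1.0 - distance_mm / MAX_SENSE_DISTANCE_MM

# Example: sampled distances along a hovering finger, fingertip first.
for d in (2.0, 8.0, 20.0):
    print(round(alpha_for_distance(d), 2))   # 0.93, 0.73, 0.33
```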
  • FIG. 7 shows an illustrative example of one way in which a representation may be updated. In particular, FIG. 7 shows a touch screen 702 over which a finger 704 hovers in order to cause a user interface 706 to display a representation 708 of the finger 704. The touch screen 702, finger 704, interface 706, and representation 708 may be the same as the similarly named items described above in connection with FIG. 6. In FIG. 7, however, the finger 704 is closer to the touch screen 702 than the finger 604 is to the touch screen 602 in FIG. 6. The representation 708 of the finger 704 in FIG. 7 is accordingly displayed with different color characteristics than in FIG. 6. For example, the tip of the finger 704 is closer to the touch screen 702 than the tip of the finger 604 is to the touch screen 602 of FIG. 6. The tip of the representation 708 is therefore, in this example, less transparent than the tip of the representation 608 in FIG. 6. Portions of the representation 708 of the finger 704 that are further from the touch screen 702 are more transparent, in the same manner as shown in FIG. 6. However, they are less transparent than corresponding locations in FIG. 6 because, overall, the finger 704 in FIG. 7 is closer to the touch screen than the finger 604 in FIG. 6 at corresponding locations.
  • FIGS. 6 and 7 are comparative figures that illustrate one particular embodiment of the present disclosure. As noted, characteristics of a representation of a detected object may be varied in other ways. For example, as noted, the user interface may warp at locations corresponding to detected objects. The amount by which the user interface is warped may vary depending on the distance of the detected objects: the closer the object, the more the corresponding location on the user interface may be warped. Generally, any way of varying the representation based at least in part on the distance of a corresponding object from a touch screen may be used.
  • Further, while various embodiments of the present disclosure are described in terms of touch screens, any input device that is able to detect proximity and touch may be used. For example, touch screens used in some embodiments may not themselves display any information, or may display information different from that which is displayed on another device, such as a television. In addition, proximity sensing techniques may be used in other contexts, not just with planar touch-sensitive areas such as those illustrated above. For example, a remote control device with buttons (such as physically displaceable buttons) may incorporate proximity sensors. When a user's appendage comes close to a button (such as by making contact with the button), a representation of the appendage may appear on a display. When the user presses the button, an action may be taken. The action may correspond to the button, to the location of the representation on the display, or otherwise. As an illustrative example, if a user's appendage comes close to a "play" button on a remote control, a representation of the appendage may appear over a "play" button on a display, such as described above. When the user presses the button, a play function may be performed. For instance, if the user is watching a DVD, a DVD player may be put in a play state. In an embodiment where displaceable or other buttons are used in connection with proximity sensing techniques, movement of the representation of an appendage on a display may correspond to user movement of the appendage relative to the remote control, to manipulation of an input device (such as a joystick), or otherwise.
  • In addition to the above, other variations are within the scope of the present disclosure. For example, additional techniques may be incorporated with those above. As an example, buttons of a remote control (such as physically displaceable buttons) may be force sensing. The above described hovering effects may be produced upon light presses of a button and selection of an object may be performed upon more forceful presses of the button. As other examples, sound, vibration, or other additional feedback may be provided to the user based on the user's interaction with a remote control device. A remote control device may, for instance, vibrate upon making a selection using the techniques described herein. The remote control or another device (such as a television or audio system) may make a sound upon selection.
  • Other variations are within the spirit of the present invention. Thus, while the invention is susceptible to various modifications and alternative constructions, certain illustrated embodiments thereof are shown in the drawings and have been described above in detail. It should be understood, however, that there is no intention to limit the invention to the specific form or forms disclosed, but on the contrary, the intention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention, as defined in the appended claims.
  • The use of the terms "a" and "an" and "the" and similar referents in the context of describing the invention (especially in the context of the following claims) are to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context. The terms "comprising," "having," "including," and "containing" are to be construed as open-ended terms (i.e., meaning "including, but not limited to,") unless otherwise noted. The term "connected" is to be construed as partly or wholly contained within, attached to, or joined together, even if there is something intervening. Recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g., "such as") provided herein, is intended merely to better illuminate embodiments of the invention and does not pose a limitation on the scope of the invention unless otherwise claimed. No language in the specification should be construed as indicating any non-claimed element as essential to the practice of the invention.
  • Preferred embodiments of this invention are described herein, including the best mode known to the inventors for carrying out the invention. Variations of those preferred embodiments may become apparent to those of ordinary skill in the art upon reading the foregoing description. The inventors expect skilled artisans to employ such variations as appropriate, and the inventors intend for the invention to be practiced otherwise than as specifically described herein. Accordingly, this invention includes all modifications and equivalents of the subject matter recited in the claims appended hereto as permitted by applicable law. Moreover, any combination of the above-described elements in all possible variations thereof is encompassed by the invention unless otherwise indicated herein or otherwise clearly contradicted by context.
  • All references, including publications, patent applications, and patents, cited herein are hereby incorporated by reference to the same extent as if each reference were individually and specifically indicated to be incorporated by reference and were set forth in its entirety herein.

Claims (30)

1. A computer-implemented method of manipulating a display, comprising:
detecting a set of one or more user appendages proximate to and out of contact with a set of one or more corresponding sensor locations of a sensing region of a remote control device;
determining, based at least in part on a mapping of the sensing region to a display region of a remote display device, a set of one or more display locations of the display region; and
transmitting a signal that causes the display device to display a set of one or more representations of the detected set of one or more user appendages according to the determined set of one or more display locations.
2. The computer-implemented method of claim 1, wherein the mapping is an absolute mapping.
3. The computer-implemented method of claim 1, further comprising:
calculating at least one measurement that corresponds to a distance of the detected appendage from the sensing region; and
wherein displaying the representation of the detected appendage includes displaying the representation of the detected appendage with one or more color characteristics that are based at least in part on the measurement.
4. The computer-implemented method of claim 1, wherein the display displays a graphical user interface and wherein displaying the set of one or more representations includes overlaying the one or more representations on the graphical user interface.
5. The computer-implemented method of claim 4, wherein:
the graphical user interface includes one or more selectable options that each correspond to a selection region of the sensing region;
the method further comprises detecting a contact event by at least one appendage of the set of one or more appendages, the detected contact event corresponding to a contact location of the sensing region; and
when the contact location corresponds to a selection region of a corresponding selectable option of the graphical user interface, the graphical user interface is updated according to the corresponding selectable option.
6. The computer-implemented method of claim 5, wherein the displayed set of one or more representations is changed upon detection of the contact event for which the contact location corresponds to the selection region of the corresponding selectable option.
7. The computer-implemented method of claim 6, wherein changing the displayed set of one or more representations includes removing the set of one or more representations from the display.
8. The computer-implemented method of claim 1, wherein at least one of the representations resembles a corresponding appendage.
9. The computer-implemented method of claim 1, wherein the displayed set of one or more representations includes at least two representations of different forms.
10. A computer-implemented method of manipulating a display, comprising:
calculating measurements that correspond to distances of a user appendage from a sensing region of a remote control device as the user moves the user appendage relative to the sensing region;
determining, based at least in part on a mapping of locations of the sensing region to locations of a display device, a location on the display device that corresponds to a location of the appendage relative to the sensing region; and
taking one or more actions that cause a display device to display a representation of the appendage according to the determined location on the display device such that the representation has one or more color characteristics that vary based at least in part on the calculated measurements.
11. The computer-implemented method of claim 10, wherein taking the one or more actions includes transmitting remotely generated signals to the display device.
12. The computer-implemented method of claim 10, wherein the representation has a transparent appearance when the user appendage is out of contact with the sensing region and an opaque appearance when the user appendage is in contact with the sensing region.
13. The computer-implemented method of claim 10, further comprising determining location changes of the sensing region with which the user appendage is proximate or in contact and wherein taking the one or more actions includes updating locations of the representation on the display.
14. A user input system, comprising:
one or more processors; and
memory including instructions that, when executed collectively by the one or more processors, cause the user input system to cause a display device to at least:
display a representation of a user appendage on a display of the display device at a location of the display based at least in part on a mapping of locations of a sensing region of a remote control device to locations on the display; and
change one or more color characteristics of the representation based at least in part on changes in distances of the user appendage from a sensing region of a remote control device.
15. The user input system of claim 14, wherein the instructions further cause the user input system to cause the display device to:
change a location of the representation based at least in part on movement of the user appendage relative to the sensing region.
16. The user input system of claim 14, wherein the instructions further cause the user input system to cause the display device to:
update the display according to a predefined action of multiple user appendages in connection with the sensing region.
17. The user input system of claim 14, wherein the display device is separate from the user input system.
18. A user input system for interacting with a graphical user interface, comprising:
one or more processors; and
memory including instructions that, when executed collectively by the one or more processors, cause the user input system to cause a display device to at least:
display a representation of a user appendage at a display location of a display of the display device, the display location based at least in part on a mapping of display locations of the display device to sensing locations of a sensing region of a sensing device; and
at least when the appendage moves relative to and out of contact with the sensing device, change the display location based at least in part on the mapping.
19. The user input system of claim 18, wherein the instructions further cause the user input system to cause the display device to:
change a location of the representation based at least in part on movement of the user appendage relative to the sensing region.
20. The user input system of claim 18, wherein the mapping is an absolute mapping.
21. The user input system of claim 18, wherein the instructions further cause the user input system to identify the user appendage from a set of potential user appendages.
22. The user input system of claim 21, wherein the instructions further cause the user input system to update the display based at least in part on detection of an event that is uncausable using at least one other of the potential user appendages.
23. The user input system of claim 18, wherein the sensing device is a component of a device that is physically disconnected from the display device.
24. The user input system of claim 18, wherein the sensing device is a remote control device for the display device.
25. A display device, comprising:
one or more processors; and
memory including instructions that, when executed collectively by the one or more processors, cause the display device to at least:
display a graphical user interface;
receive signals corresponding to user interaction with a sensing region of a sensing input device, the signals being based at least in part on a number of dimensions of user interaction that is greater than two; and
change the graphical user interface according to the received signals.
26. The display device of claim 25, wherein the signals are generated by an intermediate device that receives other signals from a remote control device.
27. The display device of claim 25, wherein the sensing input device is separate from the display device.
28. The display device of claim 25, wherein changing the graphical user interface includes updating an appearance characteristic of a representation of an object used to interact with the sensing input device.
29. The display device of claim 25, wherein changing the graphical user interface includes updating, on the graphical user interface, a location of a representation of an object used to interact with the sensing input device.
30. The display device of claim 25, wherein the user interaction includes contactless interaction with the sensing input device.
US13/284,810 2011-04-29 2011-10-28 Techniques for content navigation using proximity sensing Abandoned US20120274547A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/284,810 US20120274547A1 (en) 2011-04-29 2011-10-28 Techniques for content navigation using proximity sensing

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201161480849P 2011-04-29 2011-04-29
US13/284,810 US20120274547A1 (en) 2011-04-29 2011-10-28 Techniques for content navigation using proximity sensing

Publications (1)

Publication Number Publication Date
US20120274547A1 true US20120274547A1 (en) 2012-11-01

Family

ID=47067498

Family Applications (3)

Application Number Title Priority Date Filing Date
US13/284,668 Active 2033-08-15 US9239837B2 (en) 2011-04-29 2011-10-28 Remote control system for connected devices
US13/284,775 Active US8745024B2 (en) 2011-04-29 2011-10-28 Techniques for enhancing content
US13/284,810 Abandoned US20120274547A1 (en) 2011-04-29 2011-10-28 Techniques for content navigation using proximity sensing

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US13/284,668 Active 2033-08-15 US9239837B2 (en) 2011-04-29 2011-10-28 Remote control system for connected devices
US13/284,775 Active US8745024B2 (en) 2011-04-29 2011-10-28 Techniques for enhancing content

Country Status (1)

Country Link
US (3) US9239837B2 (en)

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130107131A1 (en) * 2011-10-28 2013-05-02 Universal Electronics Inc. System and method for optimized appliance control
US20130155010A1 (en) * 2011-12-14 2013-06-20 Microchip Technology Incorporated Capacitive Proximity Based Gesture Input System
US20130176254A1 (en) * 2012-01-06 2013-07-11 Samsung Electronics Co., Ltd. Input apparatus, display apparatus, control method thereof and display system
US20130176505A1 (en) * 2012-01-06 2013-07-11 Samsung Electronics Co., Ltd. Input apparatus, display apparatus and methods for controlling a display through user manipulation
US20130249813A1 (en) * 2012-03-26 2013-09-26 Lenovo (Singapore) Pte, Ltd. Apparatus, system, and method for touch input
US8654074B1 (en) * 2010-07-02 2014-02-18 Alpha and Omega, Inc. Remote control systems and methods for providing page commands to digital electronic display devices
US20140130091A1 (en) * 2011-12-28 2014-05-08 Jiang Liu User interface interaction system and method for handheld device and tv set
US20140184513A1 (en) * 2012-12-31 2014-07-03 Nvidia Corporation Softkey magnification on touch screen
EP2763022A1 (en) * 2013-01-31 2014-08-06 Samsung Electronics Co., Ltd Method for controlling display of pointer and displaying the pointer, and apparatus thereof
US20150007232A1 (en) * 2013-06-26 2015-01-01 Echostar Technologies L.L.C. Grid system and method for remote control
US20150046945A1 (en) * 2012-03-30 2015-02-12 Zte Corporation Method for Controlling Touch Screen, and Mobile Terminal
WO2015105815A1 (en) * 2014-01-10 2015-07-16 Microsoft Technology Licensing, Llc Hover-sensitive control of secondary display
WO2015121175A1 (en) * 2014-02-17 2015-08-20 Volkswagen Aktiengesellschaft User interface and method for assisting a user in the operation of a user interface
US20160034058A1 (en) * 2014-07-31 2016-02-04 Microsoft Corporation Mobile Device Input Controller For Secondary Display
US20170060391A1 (en) * 2015-08-28 2017-03-02 Samsung Electronics Co., Ltd. Electronic device and operating method of the same
US20170277413A1 (en) * 2016-03-25 2017-09-28 Samsung Electronics Co., Ltd. Method for outputting screen and electronic device supporting the same
US20170337400A1 (en) * 2011-12-30 2017-11-23 Samsung Electronics Co., Ltd. Electronic device, user input apparatus controlling the same, and control method thereof
US9910512B1 (en) 2014-10-27 2018-03-06 Amazon Technologies, Inc. Systems and methods for using cursor movement profiles
US10042440B2 (en) 2012-03-26 2018-08-07 Lenovo (Singapore) Pte. Ltd. Apparatus, system, and method for touch input
US10185464B2 (en) 2015-05-28 2019-01-22 Microsoft Technology Licensing, Llc Pausing transient user interface elements based on hover information
US10228768B2 (en) 2014-03-25 2019-03-12 Analog Devices, Inc. Optical user interface
US10325487B2 (en) 2011-10-28 2019-06-18 Universal Electronics Inc. System and method for optimized appliance control
US10425568B2 (en) 2016-08-16 2019-09-24 Samsung Electronics Co., Ltd. Display device and system and method for controlling power of the same
CN110471524A (en) * 2019-07-31 2019-11-19 维沃移动通信有限公司 Display control method and terminal device
US10593195B2 (en) 2011-10-28 2020-03-17 Universal Electronics Inc. System and method for optimized appliance control
US20200204392A1 (en) * 2018-12-20 2020-06-25 Ming-Tsung Chen Home appliance control system
US10764625B2 (en) * 2016-09-08 2020-09-01 Fm Marketing Gmbh Smart touch
US10937308B2 (en) 2011-10-28 2021-03-02 Universal Electronics Inc. System and method for optimized appliance control
US11295605B2 (en) 2011-10-28 2022-04-05 Universal Electronics Inc. System and method for optimized appliance control
US20220206683A1 (en) * 2019-05-09 2022-06-30 Microsoft Technology Licensing, Llc Quick menu selection device and method
US11430325B2 (en) * 2013-06-26 2022-08-30 Google Llc Methods, systems, and media for controlling a remote device using a touch screen of a mobile device in a display inhibited state
EP4171041A4 (en) * 2020-07-07 2023-11-01 Samsung Electronics Co., Ltd. Display device and control method therefor

Families Citing this family (71)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2680639C (en) 2007-03-19 2017-03-07 University Of Virginia Patent Foundation Access needle pressure sensor device and method of use
US11058354B2 (en) 2007-03-19 2021-07-13 University Of Virginia Patent Foundation Access needle with direct visualization and related methods
WO2011103456A2 (en) 2010-02-18 2011-08-25 University Of Virginia Patent Foundation System, method, and computer program product for simulating epicardial electrophysiology procedures
US9468396B2 (en) 2007-03-19 2016-10-18 University Of Virginia Patent Foundation Systems and methods for determining location of an access needle in a subject
US9642534B2 (en) 2009-09-11 2017-05-09 University Of Virginia Patent Foundation Systems and methods for determining location of an access needle in a subject
US20120128334A1 (en) * 2010-11-19 2012-05-24 Samsung Electronics Co. Ltd. Apparatus and method for mashup of multimedia content
US8918544B2 (en) 2011-03-31 2014-12-23 Logitech Europe S.A. Apparatus and method for configuration and operation of a remote-control system
US9239837B2 (en) 2011-04-29 2016-01-19 Logitech Europe S.A. Remote control system for connected devices
KR101810403B1 (en) * 2011-05-13 2017-12-19 삼성전자주식회사 Apparatus and method for storing data of peripheral device in portable terminal
KR20130011434A (en) * 2011-07-21 2013-01-30 삼성전자주식회사 Display apparatus, host apparatus, display system and control method thereof
KR101869095B1 (en) * 2011-08-23 2018-06-19 삼성전자주식회사 Method and apparatus for displaying in a portagble terminal
CN103049997B (en) * 2011-10-11 2016-01-27 Lg电子株式会社 The control method of telepilot and multimedia equipment
KR101718894B1 (en) * 2011-11-29 2017-03-23 삼성전자주식회사 System and method for controlling device
US20130201161A1 (en) * 2012-02-03 2013-08-08 John E. Dolan Methods, Systems and Apparatus for Digital-Marking-Surface Content-Unit Manipulation
US20150163537A1 (en) 2012-06-14 2015-06-11 Flextronics Ap, Llc Intelligent television
US20140013193A1 (en) * 2012-06-29 2014-01-09 Joseph John Selinger Methods and systems for capturing information-enhanced images
US9423925B1 (en) * 2012-07-11 2016-08-23 Google Inc. Adaptive content control and display for internet media
CN104145434B (en) * 2012-08-17 2017-12-12 青岛海信国际营销股份有限公司 The channel switch device of intelligent television
CN104756506A (en) * 2012-10-24 2015-07-01 索尼公司 HDMI device control via IP
US20140195975A1 (en) * 2013-01-04 2014-07-10 Samsung Electronics Co., Ltd. Display apparatus and method of controlling a display apparatus
US9286323B2 (en) 2013-02-25 2016-03-15 International Business Machines Corporation Context-aware tagging for augmented reality environments
US9673925B2 (en) * 2013-03-15 2017-06-06 Universal Electronics Inc. System and method for monitoring user interactions with a universal controlling device
US9625884B1 (en) * 2013-06-10 2017-04-18 Timothy Harris Ousley Apparatus for extending control and methods thereof
KR102044701B1 (en) * 2013-07-10 2019-11-14 엘지전자 주식회사 Mobile terminal
US10114532B2 (en) * 2013-12-06 2018-10-30 Google Llc Editing options for image regions
KR20150092996A (en) * 2014-02-06 2015-08-17 삼성전자주식회사 display applaratus and method for controlling the electronic device using the same
FR3019428A1 (en) * 2014-03-31 2015-10-02 Orange DEVICE AND METHOD FOR REMOTELY CONTROLLING THE RESTITUTION OF MULTIMEDIA CONTENT
DE102014005534A1 (en) * 2014-04-16 2015-10-22 Fm Marketing Gmbh Method for programming a remote control
US20150301718A1 (en) * 2014-04-18 2015-10-22 Google Inc. Methods, systems, and media for presenting music items relating to media content
US20150371536A1 (en) 2014-06-20 2015-12-24 Ray Enterprises Inc. Universal remote control device
US9426203B2 (en) 2014-06-27 2016-08-23 Microsoft Technology Licensing, Llc Remote application control interface
US9554189B2 (en) 2014-06-30 2017-01-24 Microsoft Technology Licensing, Llc Contextual remote control interface
CN105338385A (en) 2014-07-16 2016-02-17 阿里巴巴集团控股有限公司 Method for video controlling and equipment
US10635296B2 (en) 2014-09-24 2020-04-28 Microsoft Technology Licensing, Llc Partitioned application presentation across devices
US10448111B2 (en) 2014-09-24 2019-10-15 Microsoft Technology Licensing, Llc Content projection
US9769227B2 (en) 2014-09-24 2017-09-19 Microsoft Technology Licensing, Llc Presentation of computing environment on multiple devices
US20160088060A1 (en) * 2014-09-24 2016-03-24 Microsoft Technology Licensing, Llc Gesture navigation for secondary user interface
US10025684B2 (en) 2014-09-24 2018-07-17 Microsoft Technology Licensing, Llc Lending target device resources to host device computing environment
WO2016109069A1 (en) 2014-12-31 2016-07-07 Pcms Holdings, Inc. Systems and methods for creation of a listening log and music library
KR102300435B1 (en) * 2015-01-09 2021-09-09 삼성전자주식회사 A display apparatus and a display method
USD795842S1 (en) * 2015-07-20 2017-08-29 Lg Electronics Inc. Remote control
US10521177B2 (en) 2015-08-30 2019-12-31 EVA Automation, Inc. User interface based on system-state information
US10452332B2 (en) 2015-08-30 2019-10-22 EVA Automation, Inc. User interface based on device-state information
US10198231B2 (en) 2015-08-30 2019-02-05 EVA Automation, Inc. User interface based on system-state information
US10200737B2 (en) 2015-08-30 2019-02-05 EVA Automation, Inc. User interface based on device-state information
US10198230B2 (en) 2015-08-30 2019-02-05 EVA Automation, Inc. User interface based on device-state information
US10296275B2 (en) * 2015-08-30 2019-05-21 EVA Automation, Inc. User interface based on device-state information
US10198232B2 (en) 2015-08-30 2019-02-05 EVA Automation, Inc. User interface based on system-state information
US10387095B2 (en) 2015-08-30 2019-08-20 EVA Automation, Inc. User interface based on system-state information
US10387094B2 (en) 2015-08-30 2019-08-20 EVA Automation, Inc. User interface based on device-state information
US10296276B2 (en) 2015-08-30 2019-05-21 EVA Automation, Inc. User interface based on device-state information
US9894409B2 (en) 2015-08-30 2018-02-13 EVA Automation, Inc. User interface based on device-state information
US9798554B2 (en) 2015-09-11 2017-10-24 EVA Automation, Inc. Touch-sensitive remote control with visual feedback
KR102467519B1 (en) * 2015-10-19 2022-11-16 삼성전자주식회사 Display apparatus for setting universal remote controller, method thereof and system
KR102462671B1 (en) 2015-10-28 2022-11-04 삼성전자 주식회사 Display Apparatus and Display Control Method Thereof
KR102395701B1 (en) * 2015-11-11 2022-05-10 삼성전자주식회사 Electronic apparatus and method for controlling of an electronic apparatus
US10431218B2 (en) * 2016-02-15 2019-10-01 EVA Automation, Inc. Integration and probabilistic control of electronic devices
KR20170106055A (en) * 2016-03-11 2017-09-20 삼성전자주식회사 Appratus and method for providing graphical user interface for providing service
KR102500558B1 (en) * 2016-03-16 2023-02-17 엘지전자 주식회사 Display device and method for operating thereof
CN105843412B (en) * 2016-03-31 2019-05-28 珠海迈科智能科技股份有限公司 A kind of method of the key of determining remote controler
KR102507161B1 (en) * 2016-09-27 2023-03-07 삼성전자주식회사 Apparatus and control method for displaying content of peripheral device
US10958963B2 (en) 2016-11-22 2021-03-23 Caavo Inc Automatic screen navigation for media device configuration and control
KR20180067108A (en) * 2016-12-12 2018-06-20 삼성전자주식회사 Display apparatus presenting status of external electronic apparatus and controlling method thereof
WO2018112730A1 (en) * 2016-12-20 2018-06-28 Arris Enterprises Llc Display device auto brightness adjustment controlled by a source device
US10652040B2 (en) * 2017-10-17 2020-05-12 Carrier Corporation Common social interface for system controls
EP3609190A1 (en) 2017-12-12 2020-02-12 Spotify AB Methods, computer server systems and media devices for media streaming
US11068137B2 (en) * 2017-12-18 2021-07-20 Facebook, Inc. Systems and methods for augmenting content
KR20190084767A (en) * 2018-01-09 2019-07-17 삼성전자주식회사 Electronic apparatus, user interface providing method and computer readable medium
US11474620B2 (en) * 2019-03-01 2022-10-18 Sony Interactive Entertainment Inc. Controller inversion detection for context switching
CA3143789A1 (en) * 2021-01-02 2022-07-02 Easywebapp Inc. Method and system for providing a website and a standalone application on a client device using a single code base
US20230247259A1 (en) * 2022-01-28 2023-08-03 Discovery.Com, Llc Systems and methods for media streaming application interacting with a social network

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6144366A (en) * 1996-10-18 2000-11-07 Kabushiki Kaisha Toshiba Method and apparatus for generating information input using reflected light image of target object
US20020196238A1 (en) * 2001-06-20 2002-12-26 Hitachi, Ltd. Touch responsive display unit and method
US20060132455A1 (en) * 2004-12-21 2006-06-22 Microsoft Corporation Pressure based selection
US20080231610A1 (en) * 2004-07-30 2008-09-25 Apple Inc. Gestures for touch sensitive input devices
US20090237372A1 (en) * 2008-03-20 2009-09-24 Lg Electronics Inc. Portable terminal capable of sensing proximity touch and method for controlling screen in the same
US20100020043A1 (en) * 2008-07-28 2010-01-28 Samsung Electronics Co. Ltd. Mobile terminal having touch screen and method for displaying cursor thereof
US20100059296A9 (en) * 2002-02-20 2010-03-11 Planar Systems, Inc. Light sensitive display
US20100315413A1 (en) * 2009-06-16 2010-12-16 Microsoft Corporation Surface Computer User Interaction
US20110141012A1 (en) * 2009-12-14 2011-06-16 Samsung Electronics Co., Ltd. Displaying device and control method thereof and display system and control method thereof
US20110261058A1 (en) * 2010-04-23 2011-10-27 Tong Luo Method for user input from the back panel of a handheld computerized device
US20110304542A1 (en) * 2010-06-10 2011-12-15 Isaac Calderon Multi purpose remote control with display

Family Cites Families (187)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3990012A (en) 1975-04-25 1976-11-02 Tocom, Inc. Remote transceiver for a multiple site location in a two-way cable television system
US4174517A (en) 1977-07-15 1979-11-13 Jerome Mandel Central system for controlling remote devices over power lines
JPS6334382Y2 (en) 1980-08-08 1988-09-12
DE3036552C2 (en) 1980-09-27 1985-04-25 Blaupunkt-Werke Gmbh, 3200 Hildesheim Television reception system
DE3364719D1 (en) 1982-09-07 1986-08-28 Emi Plc Thorn Television and distribution network
US4566034A (en) 1983-05-02 1986-01-21 Rca Corporation Remote control transmitter arrangement for one or more television devices
US4626848A (en) 1984-05-15 1986-12-02 General Electric Company Programmable functions for reconfigurable remote control
KR900002744B1 (en) 1985-05-29 1990-04-28 알프스덴기 가부시기 가이샤 Remote controller
US4774511A (en) 1985-05-30 1988-09-27 Nap Consumer Electronics Corp. Universal remote control unit
US4918439A (en) 1987-06-23 1990-04-17 Cl 9, Inc. Remote control device
US4837627A (en) 1987-08-19 1989-06-06 Rca Licensing Corporation Programmable operating-parameter control appatatus for a television receiver
US5255313A (en) 1987-12-02 1993-10-19 Universal Electronics Inc. Universal remote control system
US5414426A (en) 1987-10-14 1995-05-09 Universal Electronics Inc. Favorite key macro command and chained macro command in a remote control
US5515052A (en) 1987-10-14 1996-05-07 Universal Electronics Inc. Universal remote control with function synthesis
US6720904B1 (en) 1987-10-14 2004-04-13 Universal Electronics Inc. Remote control with LED capabilities
US6014092A (en) 1987-10-14 2000-01-11 Universal Electronics Inc. Key mover
US5481256A (en) 1987-10-14 1996-01-02 Universal Electronics Inc. Direct entry remote control with channel scan
US4959810A (en) 1987-10-14 1990-09-25 Universal Electronics, Inc. Universal remote control device
US5228077A (en) 1987-12-02 1993-07-13 Universal Electronics Inc. Remotely upgradable universal remote control
US5537463A (en) 1987-10-14 1996-07-16 Universal Electronics Inc. Magnetic modem in a remote control
US5177461A (en) 1988-11-28 1993-01-05 Universal Electronics Inc. Warning light system for use with a smoke detector
US5109222A (en) 1989-03-27 1992-04-28 John Welty Remote control system for control of electrically operable equipment in people occupiable structures
JPH0374139A (en) 1989-05-16 1991-03-28 Sony Corp Power source condition detector
US5272418A (en) 1990-01-09 1993-12-21 Universal Electronics, Inc. Time enabled photosensing circuit
US5367316A (en) * 1990-03-27 1994-11-22 Matsushita Electric Industrial Co., Ltd. Remote-control apparatus for electronics apparatus
US5161023A (en) 1990-09-24 1992-11-03 Thomson Consumer Electronics, Inc. Previous channel feature in a television receiver having multiple rf inputs
US5140326A (en) 1991-01-29 1992-08-18 Harris Corporation Converter comparator cell with improved resolution
US5192999A (en) 1991-04-25 1993-03-09 Compuadd Corporation Multipurpose computerized television
US5422783A (en) 1992-07-06 1995-06-06 Universal Electronics Inc. Modular casing for a remote control having upper housing member slidingly received in a panel section
US5410326A (en) 1992-12-04 1995-04-25 Goldstein; Steven W. Programmable remote control device for interacting with a plurality of remotely controlled devices
US5374999A (en) 1992-12-22 1994-12-20 Silitek Corporation Scan control system
KR100217236B1 (en) 1993-03-24 1999-09-01 피레하머 쥬니어 리챠드 에이. Infrared remote control device for a personal digital assistant
JPH06327056A (en) * 1993-05-14 1994-11-25 Sony Corp Remote control system
US6275268B1 (en) 1993-09-09 2001-08-14 United Video Properties, Inc. Electronic television program guide with remote product ordering
US5481251A (en) 1993-11-29 1996-01-02 Universal Electronics Inc. Minimal function remote control without digit keys and with a power toggle program and with a channel rotation program
KR0128169B1 (en) 1993-12-31 1998-04-15 김광호 Home automation
US5629868A (en) 1994-02-07 1997-05-13 Le Groupe Videotron Ltee Method of programming local control
KR960003475U (en) 1994-06-08 1996-01-22 Lighting Remote Control
US5671267A (en) 1994-12-30 1997-09-23 Lucent Technologies Inc. Interactive system for communications between a cordless telephone and a remotely operated device
US5517257A (en) * 1995-03-28 1996-05-14 Microsoft Corporation Video control user interface for interactive television systems and method for controlling display of a video movie
US6284563B1 (en) 1995-10-31 2001-09-04 Tessera, Inc. Method of making compliant microelectronic assemblies
JP3169815B2 (en) 1995-12-14 2001-05-28 松下電送システム株式会社 Image communication device and image communication method
KR0164089B1 (en) 1995-12-20 1998-12-01 양승택 Remote control method and system thereof
US5619196A (en) 1995-12-28 1997-04-08 Universal Electronics Inc. Single wire keyboard encode and decode circuit
US5638050A (en) 1995-12-29 1997-06-10 Universal Electronics, Inc. System for locating an object
US5686891A (en) 1995-12-29 1997-11-11 Universal Electronics Inc. System for locating an object
US5677711A (en) 1996-01-02 1997-10-14 Silitek Corporation Touch control type cursor control device
US5963145A (en) 1996-02-26 1999-10-05 Universal Electronics Inc. System for providing wireless pointer control
US5614906A (en) 1996-04-23 1997-03-25 Universal Electronics Inc. Method for selecting a remote control command set
US6173330B1 (en) 1996-09-17 2001-01-09 Motorola, Inc. Delivery and acquisition of data segments with optimized inter-arrival time
US5907322A (en) 1996-10-16 1999-05-25 Catch Tv Acquisition Corp. Television event marking system
US6177931B1 (en) 1996-12-19 2001-01-23 Index Systems, Inc. Systems and methods for displaying and recording control interface with television programs, video, advertising information and program scheduling information
US6130625A (en) 1997-01-24 2000-10-10 Chambord Technologies, Inc. Universal remote control with incoming signal identification
CN1118746C (en) 1997-03-24 2003-08-20 发展产品有限公司 Two-way remote control with advertising display
US6154204A (en) 1998-01-21 2000-11-28 Evolve Products, Inc. Tap antenna unit
US6130726A (en) 1997-03-24 2000-10-10 Evolve Products, Inc. Program guide on a remote control display
US6002450A (en) 1997-03-24 1999-12-14 Evolve Products, Inc. Two-way remote control with advertising display
US6271831B1 (en) 1997-04-03 2001-08-07 Universal Electronics Inc. Wireless control and pointer system
KR100288581B1 (en) 1997-05-29 2001-05-02 윤종용 Method for turning on/off powers of several monitors using remote controller
US6211870B1 (en) 1997-07-07 2001-04-03 Combi/Mote Corp. Computer programmable remote control
US6223348B1 (en) 1997-09-03 2001-04-24 Universal Electronics Inc. Universal remote control system
US6133847A (en) 1997-10-09 2000-10-17 At&T Corp. Configurable remote control device
KR100269343B1 (en) 1997-12-27 2000-10-16 서평원 Apparatus and method for sensing status of long distance amplifier in mobile communication system
US6104334A (en) 1997-12-31 2000-08-15 Eremote, Inc. Portable internet-enabled controller and information browser for consumer devices
US6097441A (en) 1997-12-31 2000-08-01 Eremote, Inc. System for dual-display interaction with integrated television and internet content
US6243035B1 (en) 1998-02-27 2001-06-05 Universal Electronics Inc. Key module for wireless keyboard
CN2324602Y (en) 1998-03-02 1999-06-16 东莞宇宙电子有限公司 Table screen type luminous photo album
US6147677A (en) 1998-03-10 2000-11-14 Universal Electronics Inc. Sensing and control devices using pressure sensitive resistive elements
US6255961B1 (en) 1998-05-08 2001-07-03 Sony Corporation Two-way communications between a remote control unit and one or more devices in an audio/visual environment
US6330091B1 (en) 1998-05-15 2001-12-11 Universal Electronics Inc. IR receiver using IR transmitting diode
US7831930B2 (en) 2001-11-20 2010-11-09 Universal Electronics Inc. System and method for displaying a user interface for a remote control application
US6157319A (en) 1998-07-23 2000-12-05 Universal Electronics Inc. Universal remote control system with device activated setup
US8098140B1 (en) 2000-07-13 2012-01-17 Universal Electronics Inc. Customizable and upgradable devices and methods related thereto
US7586398B2 (en) 1998-07-23 2009-09-08 Universal Electronics, Inc. System and method for setting up a universal remote control
US6784804B1 (en) 1998-07-23 2004-08-31 Universal Electronics Inc. Digital interconnect of entertainment equipment
US7218243B2 (en) 1998-07-23 2007-05-15 Universal Electronics Inc. System and method for automatically setting up a universal remote control
US6097309A (en) 1998-07-23 2000-08-01 Universal Electronics Inc. Remote control learning system and method using signal envelope pattern recognition
US6563430B1 (en) 1998-12-11 2003-05-13 Koninklijke Philips Electronics N.V. Remote control device with location dependent interface
TW515146B (en) 1998-12-09 2002-12-21 Intel Corp Remotely controlling electronic devices
JP2000184478A (en) 1998-12-16 2000-06-30 Sharp Corp Remote control device, device to be controlled, remote control system, and method for controlling remote control system
US6374404B1 (en) 1998-12-16 2002-04-16 Sony Corporation Of Japan Intelligent device having background caching of web pages from a digital television broadcast signal and method of same
US6225938B1 (en) 1999-01-14 2001-05-01 Universal Electronics Inc. Universal remote control system with bar code setup
US20050210101A1 (en) 1999-03-04 2005-09-22 Universal Electronics Inc. System and method for providing content, management, and interactivity for client devices
US7046161B2 (en) 1999-06-16 2006-05-16 Universal Electronics Inc. System and method for automatically setting up a universal remote control
US20020046083A1 (en) 1999-07-08 2002-04-18 Philips Electronics North America Corporation After-sales customization specified by retailer acts as incentive
US6567011B1 (en) 1999-10-14 2003-05-20 Universal Electronics Inc. Media system and remote control for same
AU2001239080A1 (en) 2000-03-15 2001-09-24 Glen Mclean Harris State-based remote control system
US20010033243A1 (en) 2000-03-15 2001-10-25 Harris Glen Mclean Online remote control configuration system
US8531276B2 (en) * 2000-03-15 2013-09-10 Logitech Europe S.A. State-based remote control system
US20020056084A1 (en) 2000-03-15 2002-05-09 Harris Glen Mclean Active media content access system
US6784805B2 (en) 2000-03-15 2004-08-31 Intrigue Technologies Inc. State-based remote control system
US7079113B1 (en) 2000-07-06 2006-07-18 Universal Electronics Inc. Consumer electronic navigation system and methods related thereto
JP2002058079A (en) 2000-08-11 2002-02-22 Hitachi Ltd Remote control system
US7142934B2 (en) 2000-09-01 2006-11-28 Universal Electronics Inc. Audio converter device and method for using the same
US20020065927A1 (en) 2000-09-05 2002-05-30 Janik Craig M. Webpad and method for using the same
US20060031550A1 (en) 2000-09-05 2006-02-09 Universal Electronics Inc. Webpad adapted to communicate using wide area and local area communication channels
US6748248B1 (en) 2000-10-20 2004-06-08 Silitek Corporation Extended input device for portable wireless communication apparatus
US7200357B2 (en) 2000-10-20 2007-04-03 Universal Electronics Inc. Automotive storage and playback device and method for using the same
US6946988B2 (en) 2000-11-10 2005-09-20 Simple Devices Detachable remote controller for an electronic entertainment device and a method for using the same
US6640144B1 (en) 2000-11-20 2003-10-28 Universal Electronics Inc. System and method for creating a controlling device
US6722984B1 (en) 2000-11-22 2004-04-20 Universal Electronics Inc. Game controller with parental control functionality
US6629077B1 (en) 2000-11-22 2003-09-30 Universal Electronics Inc. Universal remote control adapted to receive voice input
US7093003B2 (en) 2001-01-29 2006-08-15 Universal Electronics Inc. System and method for upgrading the remote control functionality of a device
US6938101B2 (en) 2001-01-29 2005-08-30 Universal Electronics Inc. Hand held device having a browser application
US8909739B2 (en) 2001-01-29 2014-12-09 Universal Electronics Inc. System and method for upgrading the remote control functionality of a device
US7102688B2 (en) 2001-01-29 2006-09-05 Universal Electronics Inc. System and method for using a hand held device to display a readable representation of an audio track
JP2002271871A (en) 2001-03-09 2002-09-20 Fuji Photo Film Co Ltd Remote control system and remote controller compatible with household LAN
US6724339B2 (en) 2001-03-14 2004-04-20 Universal Electronics Inc. System and method for controlling home appliances
US6859197B2 (en) 2001-05-02 2005-02-22 Universal Electronics Inc. Universal remote control with display and printer
US20030117427A1 (en) 2001-07-13 2003-06-26 Universal Electronics Inc. System and method for interacting with a program guide displayed on a portable electronic device
US8863184B2 (en) 2001-07-13 2014-10-14 Universal Electronics Inc. System and method for presenting program guide information in an electronic portable device
US20050134578A1 (en) 2001-07-13 2005-06-23 Universal Electronics Inc. System and methods for interacting with a control environment
US8063923B2 (en) 2001-07-13 2011-11-22 Universal Electronics Inc. System and method for updating information in an electronic portable device
US9264755B2 (en) 2001-07-13 2016-02-16 Universal Electronics Inc. System and method for presenting program guide information in an electronic portable device
CN1175623C (en) 2001-07-25 2004-11-10 台均实业有限公司 Comprehensive remote control method and device for household appliance via remote control code
US6947101B2 (en) 2001-08-03 2005-09-20 Universal Electronics Inc. Control device with easy lock feature
US6781638B1 (en) 2001-08-10 2004-08-24 Universal Electronics Inc. Universal remote control capable of simulating a skip search
US7266701B2 (en) 2001-09-06 2007-09-04 Universal Electronics, Inc. System and method for enabling a remote control to automatically and dynamically set-up a V-chip
JP2003087881A (en) 2001-09-14 2003-03-20 Funai Electric Co Ltd Learning remote controller, remote control function learning system, and remote control function learning method
US7193661B2 (en) 2001-09-27 2007-03-20 Universal Electronics Inc. Two way communication using light links
US6747591B1 (en) 2001-11-20 2004-06-08 Universal Electronics Inc. System and method for retrieving information while commanding operation of an appliance
US8176432B2 (en) 2001-11-20 2012-05-08 UEI Electronics Inc. Hand held remote control device having an improved user interface
KR100450251B1 (en) 2001-11-22 2004-09-24 최형락 Multi-purpose remote controller used together mobile communication terminal
US7254777B2 (en) 2001-12-20 2007-08-07 Universal Electronics Inc. System and method for controlling the recording functionality of an appliance using a program guide
KR100448176B1 (en) 2001-12-24 2004-09-10 최형락 A remote controller data download system using the Internet, and method thereof
US6650247B1 (en) 2002-02-20 2003-11-18 Universal Electronics Inc. System and method for configuring a home appliance communications network
US6642852B2 (en) 2002-03-01 2003-11-04 Universal Electronics Inc. Remote control device with appliance power awareness
US20040169590A1 (en) 2002-03-01 2004-09-02 Universal Electronics Inc. System and method for using appliance power awareness to select a remote control command set
US7274303B2 (en) 2002-03-01 2007-09-25 Universal Electronics Inc. Power strip with control and monitoring functionality
US8255968B2 (en) 2002-04-15 2012-08-28 Universal Electronics, Inc. System and method for adaptively controlling the recording of program material using a program guide
US7653212B2 (en) 2006-05-19 2010-01-26 Universal Electronics Inc. System and method for using image data in connection with configuring a universal controlling device
US7167913B2 (en) 2002-06-05 2007-01-23 Universal Electronics Inc. System and method for managing communication links
US6917302B2 (en) 2002-06-20 2005-07-12 Universal Electronics Inc. System and method for retrieving information while commanding operation of an appliance
US6788241B2 (en) 2002-09-25 2004-09-07 Universal Electronics Inc. System and method for using keystroke data to configure a remote control device
US6882729B2 (en) 2002-12-12 2005-04-19 Universal Electronics Inc. System and method for limiting access to data
US7013434B2 (en) 2003-01-03 2006-03-14 Universal Electronics Inc. Remote control with local, screen-guided setup
US20040210933A1 (en) 2003-01-07 2004-10-21 Universal Electronics Inc. User interface for a remote control application
CN1434422A (en) 2003-03-07 2003-08-06 赵依军 Remote control method and equipment
US7768234B2 (en) 2004-02-28 2010-08-03 Janik Craig M System and method for automatically synchronizing and acquiring content for battery powered devices
US7161524B2 (en) 2003-03-28 2007-01-09 Universal Electronics Inc. System and method for using an universal remote control to access extended operational functions of a device
US6885952B1 (en) 2003-04-09 2005-04-26 Universal Electronics Inc. System and method for determining voltage levels
EP1644904B1 (en) 2003-06-25 2009-12-09 Universal Electronics Inc. System and method for monitoring remote control transmissions
US7005979B2 (en) 2003-06-25 2006-02-28 Universal Electronics Inc. System and method for monitoring remote control transmissions
US7154428B2 (en) 2003-06-25 2006-12-26 Universal Electronics Inc. Remote control with selective key illumination
US7876255B2 (en) 2003-09-19 2011-01-25 Universal Electronics Inc. Controlling device using visual cues to indicate appliance and function key relationships
US7460050B2 (en) 2003-09-19 2008-12-02 Universal Electronics, Inc. Controlling device using cues to convey information
US7221306B2 (en) 2003-09-19 2007-05-22 Universal Electronics Inc. System and method for measuring and presenting memory size of a universal remote control
JP2005093022A (en) 2003-09-19 2005-04-07 Pioneer Electronic Corp Source selecting device, information output device, source allocation method or the like
US7209116B2 (en) 2003-10-08 2007-04-24 Universal Electronics Inc. Control device having integrated mouse and remote control capabilities
US8253532B2 (en) 2003-10-27 2012-08-28 Universal Electronics Inc. Controlling device having a device mode state toggle feature
US9131272B2 (en) 2003-11-04 2015-09-08 Universal Electronics Inc. System and method for saving and recalling state data for media and home appliances
US7363028B2 (en) 2003-11-04 2008-04-22 Universal Electronics, Inc. System and method for controlling device location determination
US7136709B2 (en) 2003-11-04 2006-11-14 Universal Electronics Inc. Home appliance control system and methods in a networked environment
US7155305B2 (en) 2003-11-04 2006-12-26 Universal Electronics Inc. System and methods for home appliance identification and control in a networked environment
US7412653B2 (en) 2003-11-06 2008-08-12 Universal Electronics, Inc. Remote control having a display with multi-function EL segments
US7652844B2 (en) 2003-12-24 2010-01-26 Bruce Edwards System and method for protecting removeable media playback devices
WO2005088894A1 (en) 2004-03-11 2005-09-22 Universal Electronics Inc. Syncronizing device-specific encrypted data to and from mobile devices using detachable storage media
US7872642B2 (en) 2004-03-12 2011-01-18 Universal Electronics Inc. Controlling device having multiple user interfaces
US20050283814A1 (en) 2004-06-16 2005-12-22 Universal Electronics Inc. System and method for enhanced data transfer within control environments
US9088748B2 (en) 2004-07-16 2015-07-21 Universal Electronics Inc. System for providing electronic media and commands via remote control and docking station
US7877328B2 (en) 2004-07-21 2011-01-25 Sony Corporation Communication system communication method, contents processing device, and computer program
US7941786B2 (en) 2004-09-08 2011-05-10 Universal Electronics Inc. Configurable controlling device and associated configuration distribution system and method
US7266777B2 (en) 2004-09-08 2007-09-04 Universal Electronics Inc. Configurable controlling device having an associated editing program
US7743012B2 (en) 2004-09-08 2010-06-22 Universal Electronics Inc. Configurable controlling device and associated configuration upload and download system and method
US7432916B2 (en) 2004-12-09 2008-10-07 Universal Electronics, Inc. Controlling device with dual-mode, touch-sensitive display
US8149218B2 (en) 2004-12-21 2012-04-03 Universal Electronics, Inc. Controlling device with selectively illuminated user interfaces
US7436346B2 (en) * 2005-01-20 2008-10-14 At&T Intellectual Property I, L.P. System, method and interface for controlling multiple electronic devices of a home entertainment system via a single control device
US7319426B2 (en) 2005-06-16 2008-01-15 Universal Electronics Inc. Controlling device with illuminated user interface
US20070077784A1 (en) 2005-08-01 2007-04-05 Universal Electronics Inc. System and method for accessing a user interface via a secondary device
US7549008B2 (en) 2005-08-05 2009-06-16 Universal Electronics, Inc. Interface adapter for a portable media player device
US7907222B2 (en) 2005-09-08 2011-03-15 Universal Electronics Inc. System and method for simplified setup of a universal remote control
US7764190B2 (en) 2005-09-30 2010-07-27 Universal Electronics Inc. System using a fiber optic cable to distribute commands for controlling operations of an appliance
US8321466B2 (en) 2005-12-22 2012-11-27 Universal Electronics Inc. System and method for creating and utilizing metadata regarding the structure of program content stored on a DVR
US8031270B1 (en) 2006-01-31 2011-10-04 Cypress Semiconductor Corporation Remote control system
US7548246B2 (en) 2006-03-24 2009-06-16 Universal Electronics, Inc. System and method for defining a controlled device command set
US7765245B2 (en) 2006-03-29 2010-07-27 Universal Electronics Inc. System and methods for enhanced metadata entry
US8054294B2 (en) * 2006-03-31 2011-11-08 Sony Corporation Touch screen remote control system for use in controlling one or more devices
JP4919825B2 (en) 2007-01-30 2012-04-18 株式会社東芝 Program development apparatus, program development method and program
US7525037B2 (en) 2007-06-25 2009-04-28 Sony Ericsson Mobile Communications Ab System and method for automatically beat mixing a plurality of songs using an electronic equipment
EP2248013A4 (en) 2007-12-20 2012-09-26 Hsbc Technologies Inc Automated methods and systems for developing and deploying projects in parallel
JP5428186B2 (en) 2008-04-04 2014-02-26 ソニー株式会社 Electronics
US20100005503A1 (en) * 2008-07-01 2010-01-07 Kaylor Floyd W Systems and methods for generating a video image by merging video streams
EP2204965B1 (en) 2008-12-31 2016-07-27 Google Technology Holdings LLC Device and method for receiving scalable content from multiple sources having different content quality
CA2807185C (en) 2010-08-04 2017-05-30 Nagravision S.A. Method for sharing data and synchronizing broadcast data with additional information
US20120128334A1 (en) * 2010-11-19 2012-05-24 Samsung Electronics Co. Ltd. Apparatus and method for mashup of multimedia content
US8918544B2 (en) 2011-03-31 2014-12-23 Logitech Europe S.A. Apparatus and method for configuration and operation of a remote-control system
US9239837B2 (en) 2011-04-29 2016-01-19 Logitech Europe S.A. Remote control system for connected devices

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6144366A (en) * 1996-10-18 2000-11-07 Kabushiki Kaisha Toshiba Method and apparatus for generating information input using reflected light image of target object
US20020196238A1 (en) * 2001-06-20 2002-12-26 Hitachi, Ltd. Touch responsive display unit and method
US20100059296A9 (en) * 2002-02-20 2010-03-11 Planar Systems, Inc. Light sensitive display
US20080231610A1 (en) * 2004-07-30 2008-09-25 Apple Inc. Gestures for touch sensitive input devices
US20060132455A1 (en) * 2004-12-21 2006-06-22 Microsoft Corporation Pressure based selection
US20090237372A1 (en) * 2008-03-20 2009-09-24 Lg Electronics Inc. Portable terminal capable of sensing proximity touch and method for controlling screen in the same
US20100020043A1 (en) * 2008-07-28 2010-01-28 Samsung Electronics Co. Ltd. Mobile terminal having touch screen and method for displaying cursor thereof
US20100315413A1 (en) * 2009-06-16 2010-12-16 Microsoft Corporation Surface Computer User Interaction
US20110141012A1 (en) * 2009-12-14 2011-06-16 Samsung Electronics Co., Ltd. Displaying device and control method thereof and display system and control method thereof
US20110261058A1 (en) * 2010-04-23 2011-10-27 Tong Luo Method for user input from the back panel of a handheld computerized device
US20110304542A1 (en) * 2010-06-10 2011-12-15 Isaac Calderon Multi purpose remote control with display

Cited By (78)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8654074B1 (en) * 2010-07-02 2014-02-18 Alpha and Omega, Inc. Remote control systems and methods for providing page commands to digital electronic display devices
US11145189B2 (en) 2011-10-28 2021-10-12 Universal Electronics Inc. System and method for optimized appliance control
US10991239B2 (en) 2011-10-28 2021-04-27 Universal Electronics Inc. System and method for optimized appliance control
US11887469B2 (en) 2011-10-28 2024-01-30 Universal Electronics Inc. System and method for optimized appliance control
US11651677B2 (en) 2011-10-28 2023-05-16 Universal Electronics Inc. System and method for optimized appliance control
US20140022462A1 (en) * 2011-10-28 2014-01-23 Universal Electronics Inc. System and method for optimized appliance control
US10325487B2 (en) 2011-10-28 2019-06-18 Universal Electronics Inc. System and method for optimized appliance control
US10339797B2 (en) 2011-10-28 2019-07-02 Universal Electronics Inc. System and method for optimized appliance control
US11322016B2 (en) 2011-10-28 2022-05-03 Universal Electronics Inc. System and method for optimized appliance control
US11308796B2 (en) 2011-10-28 2022-04-19 Universal Electronics Inc. System and method for optimized appliance control
US11295603B2 (en) 2011-10-28 2022-04-05 Universal Electronics Inc. System and method for optimized appliance control
US11295606B2 (en) 2011-10-28 2022-04-05 Universal Electronics Inc. System and method for optimized appliance control
US11295605B2 (en) 2011-10-28 2022-04-05 Universal Electronics Inc. System and method for optimized appliance control
US11170636B2 (en) 2011-10-28 2021-11-09 Universal Electronics Inc. System and method for optimized appliance control
US20130107131A1 (en) * 2011-10-28 2013-05-02 Universal Electronics Inc. System and method for optimized appliance control
US9215394B2 (en) * 2011-10-28 2015-12-15 Universal Electronics Inc. System and method for optimized appliance control
US11113954B2 (en) 2011-10-28 2021-09-07 Universal Electronics Inc. System and method for optimized appliance control
US9307178B2 (en) * 2011-10-28 2016-04-05 Universal Electronics Inc. System and method for optimized appliance control
US10325486B2 (en) 2011-10-28 2019-06-18 Universal Electronics Inc. System and method for optimized appliance control
US10970997B2 (en) 2011-10-28 2021-04-06 Universal Electronics Inc. System and method for optimized appliance control
US10943469B2 (en) 2011-10-28 2021-03-09 Universal Electronics Inc. System and method for optimized appliance control
US10937306B2 (en) 2011-10-28 2021-03-02 Universal Electronics Inc. System and method for optimized appliance control
US10937308B2 (en) 2011-10-28 2021-03-02 Universal Electronics Inc. System and method for optimized appliance control
US10922958B2 (en) 2011-10-28 2021-02-16 Universal Electronics Inc. System and method for optimized appliance control
US10636288B2 (en) 2011-10-28 2020-04-28 Universal Electronics Inc. System and method for optimized appliance control
US9693006B2 (en) 2011-10-28 2017-06-27 Universal Electronics Inc. System and method for optimized appliance control
US10614704B2 (en) 2011-10-28 2020-04-07 Universal Electronics Inc. System and method for optimized appliance control
US11769397B2 (en) 2011-10-28 2023-09-26 Universal Electronics Inc. System and method for optimized appliance control
US10593196B2 (en) 2011-10-28 2020-03-17 Universal Electronics Inc. System and method for optimized appliance control
US9800818B2 (en) 2011-10-28 2017-10-24 Universal Electronics Inc. System and method for optimized appliance control
US10593195B2 (en) 2011-10-28 2020-03-17 Universal Electronics Inc. System and method for optimized appliance control
US9716853B2 (en) 2011-10-28 2017-07-25 Universal Electronics Inc. System and method for optimized appliance control
US9942509B2 (en) 2011-10-28 2018-04-10 Universal Electronics Inc. System and method for optimized appliance control
US11315410B2 (en) 2011-10-28 2022-04-26 Universal Electronics Inc. System and method for optimized appliance control
US20130155010A1 (en) * 2011-12-14 2013-06-20 Microchip Technology Incorporated Capacitive Proximity Based Gesture Input System
US9723352B2 (en) * 2011-12-28 2017-08-01 Huizhou Tcl Mobile Communication Co., Ltd. User interface interaction system and method for handheld device and TV set
US20140130091A1 (en) * 2011-12-28 2014-05-08 Jiang Liu User interface interaction system and method for handheld device and tv set
US11194975B2 (en) 2011-12-30 2021-12-07 Samsung Electronics Co., Ltd. Electronic device, user input apparatus controlling the same, and control method thereof
US20170337400A1 (en) * 2011-12-30 2017-11-23 Samsung Electronics Co., Ltd. Electronic device, user input apparatus controlling the same, and control method thereof
US10671817B2 (en) * 2011-12-30 2020-06-02 Samsung Electronics Co., Ltd. Electronic device, user input apparatus controlling the same, and control method thereof
US20130176254A1 (en) * 2012-01-06 2013-07-11 Samsung Electronics Co., Ltd. Input apparatus, display apparatus, control method thereof and display system
US9342168B2 (en) * 2012-01-06 2016-05-17 Samsung Electronics Co., Ltd. Input apparatus, display apparatus, control method thereof and display system
US20130176505A1 (en) * 2012-01-06 2013-07-11 Samsung Electronics Co., Ltd. Input apparatus, display apparatus and methods for controlling a display through user manipulation
US20130249813A1 (en) * 2012-03-26 2013-09-26 Lenovo (Singapore) Pte, Ltd. Apparatus, system, and method for touch input
US10042440B2 (en) 2012-03-26 2018-08-07 Lenovo (Singapore) Pte. Ltd. Apparatus, system, and method for touch input
US9467731B2 (en) * 2012-03-30 2016-10-11 Zte Corporation Method for controlling touch screen, and mobile terminal
US20150046945A1 (en) * 2012-03-30 2015-02-12 Zte Corporation Method for Controlling Touch Screen, and Mobile Terminal
US20140184513A1 (en) * 2012-12-31 2014-07-03 Nvidia Corporation Softkey magnification on touch screen
EP2763022A1 (en) * 2013-01-31 2014-08-06 Samsung Electronics Co., Ltd Method for controlling display of pointer and displaying the pointer, and apparatus thereof
US9600088B2 (en) 2013-01-31 2017-03-21 Samsung Electronics Co., Ltd. Method and apparatus for displaying a pointer on an external display
US11770576B2 (en) * 2013-06-26 2023-09-26 DISH Technologies L.L.C. Grid system and method for remote control
US11749102B2 (en) 2013-06-26 2023-09-05 Google Llc Methods, systems, and media for controlling a remote device using a touch screen of a mobile device in a display inhibited state
US20150007232A1 (en) * 2013-06-26 2015-01-01 Echostar Technologies L.L.C. Grid system and method for remote control
US11190829B2 (en) * 2013-06-26 2021-11-30 DISH Technologies L.L.C. Grid system and method for remote control
US11582503B2 (en) * 2013-06-26 2023-02-14 DISH Technologies L.L.C. Grid system and method for remote control
US20230145059A1 (en) * 2013-06-26 2023-05-11 DISH Technologies L.L.C. Grid system and method for remote control
US20220053229A1 (en) * 2013-06-26 2022-02-17 DISH Technologies L.L.C. Grid system and method for remote control
US11430325B2 (en) * 2013-06-26 2022-08-30 Google Llc Methods, systems, and media for controlling a remote device using a touch screen of a mobile device in a display inhibited state
CN105900056A (en) * 2014-01-10 2016-08-24 微软技术许可有限责任公司 Hover-sensitive control of secondary display
US20150199030A1 (en) * 2014-01-10 2015-07-16 Microsoft Corporation Hover-Sensitive Control Of Secondary Display
WO2015105815A1 (en) * 2014-01-10 2015-07-16 Microsoft Technology Licensing, Llc Hover-sensitive control of secondary display
WO2015121175A1 (en) * 2014-02-17 2015-08-20 Volkswagen Aktiengesellschaft User interface and method for assisting a user in the operation of a user interface
US10228768B2 (en) 2014-03-25 2019-03-12 Analog Devices, Inc. Optical user interface
US20160034058A1 (en) * 2014-07-31 2016-02-04 Microsoft Corporation Mobile Device Input Controller For Secondary Display
CN106537326A (en) * 2014-07-31 2017-03-22 微软技术许可有限责任公司 Mobile device input controller for secondary display
EP3175346A1 (en) * 2014-07-31 2017-06-07 Microsoft Technology Licensing, LLC Mobile device input controller for secondary display
US9910512B1 (en) 2014-10-27 2018-03-06 Amazon Technologies, Inc. Systems and methods for using cursor movement profiles
US10185464B2 (en) 2015-05-28 2019-01-22 Microsoft Technology Licensing, Llc Pausing transient user interface elements based on hover information
US20170060391A1 (en) * 2015-08-28 2017-03-02 Samsung Electronics Co., Ltd. Electronic device and operating method of the same
US10528218B2 (en) * 2015-08-28 2020-01-07 Samsung Electronics Co., Ltd. Electronic device and operating method of the same
US10719209B2 (en) * 2016-03-25 2020-07-21 Samsung Electronics Co., Ltd. Method for outputting screen and electronic device supporting the same
US20170277413A1 (en) * 2016-03-25 2017-09-28 Samsung Electronics Co., Ltd. Method for outputting screen and electronic device supporting the same
US10425568B2 (en) 2016-08-16 2019-09-24 Samsung Electronics Co., Ltd. Display device and system and method for controlling power of the same
US10764625B2 (en) * 2016-09-08 2020-09-01 Fm Marketing Gmbh Smart touch
US20200204392A1 (en) * 2018-12-20 2020-06-25 Ming-Tsung Chen Home appliance control system
US20220206683A1 (en) * 2019-05-09 2022-06-30 Microsoft Technology Licensing, Llc Quick menu selection device and method
CN110471524A (en) * 2019-07-31 2019-11-19 维沃移动通信有限公司 Display control method and terminal device
EP4171041A4 (en) * 2020-07-07 2023-11-01 Samsung Electronics Co., Ltd. Display device and control method therefor

Also Published As

Publication number Publication date
US20120274863A1 (en) 2012-11-01
US20120278348A1 (en) 2012-11-01
US8745024B2 (en) 2014-06-03
US9239837B2 (en) 2016-01-19

Similar Documents

Publication Publication Date Title
US20120274547A1 (en) Techniques for content navigation using proximity sensing
CN105278674B (en) Radar-based gesture recognition through wearable devices
KR100980741B1 (en) A remote controller and a method for remote controlling a display
TWI588734B (en) Electronic apparatus and method for operating electronic apparatus
JP4933027B2 (en) Highlight navigation that switches seamlessly combined with freely moving cursors
US20110227947A1 (en) Multi-Touch User Interface Interaction
US20170293351A1 (en) Head mounted display linked to a touch sensitive input device
KR102169521B1 (en) Input apparatus, display apparatus and control method thereof
US20110134032A1 (en) Method for controlling touch control module and electronic device thereof
US20140181746A1 (en) Electronic device with shortcut function and control method thereof
KR101352329B1 (en) Apparatus and method for providing user interface by using remote controller
WO2015024252A1 (en) Remote controller, information processing method and system
KR20140035870A (en) Smart air mouse
JP6141301B2 (en) Dialogue model of indirect dialogue device
KR20110134810A (en) A remote controller and a method for remote controlling a display
KR20160139481A (en) User terminal apparatus and control method thereof
KR20150031986A (en) Display apparatus and control method thereof
US10386932B2 (en) Display apparatus and control method thereof
KR20140107829A (en) Display apparatus, input apparatus and control method thereof
US20140009403A1 (en) System and Method for Creating Optimal Command Regions for the Hand on a Touch Pad Device
US20160239201A1 (en) Multi-touch remote control method
KR101384493B1 (en) System for interworking and controlling devices and user device used in the same
EP3016400A2 (en) Display apparatus, system, and controlling method thereof
JP2015525927A (en) Method and apparatus for controlling a display device
KR101451941B1 (en) Method and set-top box for controlling screen associated icon

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION