US20110090168A1 - Touch Screen Input Method and Device - Google Patents

Touch Screen Input Method and Device

Info

Publication number
US20110090168A1
US20110090168A1 (Application US12/976,171)
Authority
US
United States
Prior art keywords
display
finger
image
instruction
section
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/976,171
Inventor
Koichi Goto
Mikiko Sakurai
Akio Yoshioka
Tatsushi Nashida
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Priority to US12/976,171 priority Critical patent/US20110090168A1/en
Publication of US20110090168A1 publication Critical patent/US20110090168A1/en
Priority to US13/246,299 priority patent/US20120038578A1/en
Priority to US13/617,475 priority patent/US8743070B2/en
Priority to US14/210,546 priority patent/US9001069B2/en
Abandoned legal-status Critical Current

Classifications

    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/0412: Digitisers structurally integrated in a display
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F3/04842: Selection of displayed objects or displayed text elements
    • G06F3/1423: Digital output to display device; controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F2203/0339: Touch strips, e.g. orthogonal touch strips to control cursor movement or scrolling; single touch strip to adjust a parameter or to implement a row of soft keys
    • H04N21/41265: Portable client peripheral, e.g. PDA or mobile phone, having a remote control device for bidirectional communication between the remote control device and the client device
    • H04N21/42204: User interfaces specially adapted for controlling a client device through a remote control device; remote control devices therefor
    • H04N21/42209: Display device provided on the remote control for displaying non-command information, e.g. electronic program guide [EPG], e-mail, messages or a second television channel
    • H04N21/42222: Additional components integrated in the remote control device, e.g. timer, speaker, sensors for detecting position, direction or movement of the remote control, microphone or battery charging device
    • H04N21/42224: Touch pad or touch panel provided on the remote control
    • H04N21/4622: Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
    • H04N21/47: End-user applications
    • H04N21/4782: Web browsing, e.g. WebTV
    • H04N21/482: End-user interface for program selection
    • H04N21/485: End-user interface for client configuration
    • H04N5/74: Projection arrangements for image reproduction, e.g. using eidophor

Definitions

  • the invention relates to an input apparatus having a construction in which a touch panel is attached onto the screen of a display such as a liquid crystal display or the like, and to an input method using such an input apparatus.
  • the auxiliary input/output apparatus has a construction in which a touch panel is attached onto the screen; an electronic program table is displayed on the auxiliary input/output apparatus, and the operation to select a desired program or to reserve the recording of the desired program is executed through the touch panel of the auxiliary input/output apparatus.
  • it is an object of the invention to provide an input method and an input apparatus in which, in the case of inputting by using a touch panel, the operation to display a menu and the operation to select a source to be monitored from the displayed menu can be executed at a time, and the operability is improved.
  • an input method using an input apparatus in which a touch panel is laminated onto a display screen of a display apparatus, a sensor unit is formed so as to be expanded to the outside of one side of the display screen, an instruction according to a touching position of a finger or a touch pen onto the sensor unit is given, and a controller generates a control signal on the basis of the instruction, comprising the steps of: displaying a selection display comprising a plurality of selection items along the side of the display screen when the finger or the touch pen is touched to the sensor unit; instructing one of the selection items when the finger or the touch pen is moved along the side on the sensor unit; and instructing selection of the instructed selection item when the finger or the touch pen is released from the sensor unit.
  • an input apparatus in which a touch panel is laminated onto a display screen of a display apparatus, comprising: a sensor unit formed so as to be expanded to the outside of one side of the display screen; and a controller to which an instruction according to a touching position of a finger or a touch pen onto the sensor unit is given and which generates a control signal on the basis of the instruction, wherein the controller controls the display apparatus in such a manner that a selection display comprising a plurality of selection items is displayed along the side of the display screen when the finger or the touch pen is touched to the sensor unit and one of the selection items is instructed when the finger or the touch pen is moved along the side on the sensor unit, and the controller generates the control signal to instruct selection of the instructed selection item when the finger or the touch pen is released from the sensor unit.
  • the selection display (for example, a menu) is displayed by the touch, and one selection item (for example, a source) is selected by releasing the finger or the touch pen from the sensor unit.
  • the display of the menu and the selection can thus be executed by one touching and releasing operation, and the operability can be improved. Since the selection display is displayed along the side near the sensor unit, the instructed items can be easily seen, and a situation in which the display screen becomes hard to see because of the selection display can be avoided.
  • FIG. 1 is a block diagram showing a system construction of an embodiment of the invention.
  • FIG. 2 is a block diagram showing a more detailed construction of the embodiment of the invention.
  • FIGS. 3A to 3C are schematic diagrams for use in explanation of the embodiment of the invention.
  • FIG. 4 is a flowchart for use in explanation of the embodiment of the invention.
  • FIG. 5 is a schematic diagram showing a more specific display example of the embodiment of the invention.
  • FIG. 6 is a schematic diagram showing a display example corresponding to a menu selected by a menu display.
  • FIG. 7 is a schematic diagram showing a display example corresponding to a menu selected by the menu display.
  • FIG. 8 is a schematic diagram showing a display example corresponding to a menu selected by the menu display.
  • FIG. 9 is a schematic diagram showing a display example corresponding to a menu selected by the menu display.
  • FIG. 10 is a schematic diagram showing a display example corresponding to a menu selected by the menu display.
  • FIG. 11 is a schematic diagram showing a display example corresponding to a menu selected by the menu display.
  • FIG. 12 is a schematic diagram showing a display example corresponding to a menu selected by the menu display.
  • reference numeral 1 denotes a whole display system to which the invention is applied;
  • 2 indicates a first display unit (hereinafter properly referred to as a primary display) having a large display panel such as a PDP (Plasma Display Panel), an LCD (Liquid Crystal Display), or the like; and 3 indicates a second, small display unit (hereinafter properly referred to as a secondary display).
  • the secondary display 3 has a construction in which a touch panel is laminated onto a small LCD of, for example, 7 inches, is put on a pedestal 4 , and can be carried by the user as necessary.
  • a video signal to be displayed is supplied to the primary display 2 through a media receiver 5 .
  • the video signal is a broadcast signal or streaming data which is distributed through the Internet.
  • the broadcast signal is received by an antenna 6 and supplied to the media receiver 5 ; the streaming data is branched by a switch 7 and supplied to the media receiver 5 through a LAN (Local Area Network).
  • a personal computer 8 is connected to another branch of the switch 7 .
  • the streaming data distributed through the Internet 10 is inputted to the WAN (Wide Area Network) side of a MODEM (modulator-demodulator) 9 of an ADSL (Asymmetric Digital Subscriber Line).
  • the switch 7 is connected to the LAN side of a MODEM 9 .
  • the ADSL is an example of broadband connection.
  • video contents can be received through a broadband connection using CATV (cable television), FTTH (Fiber To The Home), or the like. Ordinarily, the video contents are accompanied by audio data.
  • the media receiver 5 has two tuners to supply the reception signal to each of the primary display 2 and the secondary display 3 .
  • the media receiver 5 can transmit the video signal to the secondary display 3 through an access point 11 of a wireless LAN.
  • Control data such as a remocon (remote control) signal or the like can be transmitted to the access point 11 from the secondary display 3 and bidirectional communication can be made.
  • a wireless system of the IEEE (Institute of Electrical and Electronics Engineers) 802.11 family can be used, for example the 802.11a standard, which uses a frequency of 5.2 GHz and can realize a maximum transmission speed of 54 Mbps.
  • FIG. 2 shows, in more detail, a construction of an example of the display system comprising the primary display 2 and the secondary display 3 .
  • the primary display 2 has a relatively large display panel 21 of, for example, 30 inches or more and its driving unit (not shown).
  • a main tuner 22 a and a subtuner 22 b each for receiving a terrestrial wave are included in the media receiver 5 .
  • Reference numeral 23 denotes a digital tuner for receiving BS (Broadcasting Satellite) and 110° CS (Communication Satellite).
  • outputs of UHF/VHF antennas are supplied to the tuners 22 a and 22 b and an output of a parabolic antenna for receiving BS/110° CS is supplied to the digital tuner 23 .
  • the main tuner 22 a is used for the primary display 2 and the subtuner 22 b is used for the secondary display 3 .
  • the video signals of the main tuner 22 a and the subtuner 22 b are supplied to an AV switch 24 .
  • An output video signal of the AV switch 24 is inputted to an image processing unit 25 and a signal processing unit 32 .
  • the image processing unit 25 executes image processes for improving picture quality such as a process to further raise resolution and the like.
  • An output signal of the image processing unit 25 is inputted to the display panel 21 of the primary display 2 through a DVI (Digital Visual Interface) 26 as a display interface.
  • a picture quality adjusting circuit of the display panel 21 is provided at the front stage of the DVI 26 .
  • a copy prevention signal such as HDCP (High-bandwidth Digital Content Protection) is also outputted to prevent an illegal copy of the broadcast contents.
  • An output signal of the digital tuner 23 is inputted to a video decoder 27 , which performs decoding by MPEG2 (Moving Picture Experts Group Phase 2).
  • An HD (High Definition) video signal from the video decoder 27 is supplied to the image processing unit 25 and inputted to the display panel 21 through the DVI 26 .
  • the video decoder 27 has a function for outputting an SD (Standard Definition) video signal, for example, 480I (interlace signal in which the number of lines is equal to 480) to the signal processing unit 32 .
  • Reference numeral 28 denotes a system controller for controlling the operations of the primary display 2 and the media receiver 5 and the controller 28 is constructed by a CPU (Central Processing Unit).
  • the system controller 28 controls station selecting states of the main tuner 22 a and the subtuner 22 b.
  • the streaming data and the homepage data which were received through the Internet are supplied to the signal processing unit 32 through a LAN 31 .
  • two DSPs (Digital Signal Processors) 33 and 34 are connected to a PCI (Peripheral Component Interconnect) bus 35 , and a controller 36 comprising a CPU is connected to the PCI 35 through a bridge 37 .
  • the signal processing unit 32 decodes the inputted streaming data.
  • the decoded video signal is supplied to the image processing unit 25 and displayed by the primary display 2 . Therefore, the broadcast signal from each of the main tuner 22 a and the digital tuner 23 can be displayed on the primary display 2 and the contents received through the Internet can also be displayed.
  • the signal processing unit 32 encrypts the video signals from the subtuner 22 b and the digital tuner 23 , further, converts the encrypted video signals into a format in which they can be transmitted in a wireless manner, and sends the converted signals to the secondary display 3 through the access point 11 .
  • the contents such as streaming data received through the Internet and the like are sent to the secondary display 3 through the access point 11 without being decoded.
  • the signal processing unit 32 processes the control signal such as a remocon signal or the like from the secondary display 3 which was received by the access point 11 and sends it to the system controller 28 .
  • the secondary display 3 has a transceiver 41 for making wireless communication with the access point 11 .
  • a signal processing unit 42 is connected to the transceiver 41 .
  • a system controller 43 to control the operation of the secondary display 3 and a DSP 44 are connected through PCI 45 .
  • a display panel, for example an LCD 46 , a transparent touch panel 47 laminated on the display screen of the LCD 46 , a speaker 48 , and a memory card 49 are connected to the signal processing unit 42 . Further, a battery 50 as a power source is provided. The battery 50 is enclosed in, for example, the pedestal 4 (refer to FIG. 1 ).
  • the signal processing unit 42 decodes the encrypted video signal received from the access point 11 , decodes the data received through the Internet, and displays the decoded signal onto the LCD 46 . Further, the signal processing unit 42 transmits a remocon signal, a command, or the like generated by the operation of the touch panel 47 to the primary display 2 side. Moreover, the signal processing unit 42 has a function for decoding still image data stored in the memory card 49 and displaying it onto the LCD 46 .
  • the analog video signal of a base band demodulated by the main tuner 22 a is converted into a digital signal, subjected to the picture quality improving process by the image processing unit 25 , subjected to an interlace/progressive converting process, and thereafter, outputted to the display panel 21 through the DVI 26 .
  • a base band analog signal demodulated by the subtuner 22 b is supplied to the signal processing unit 32 , converted into a digital signal, and thereafter, compressed in a digital compression format such as MPEG2, MPEG4, or the like.
  • the compressed video signal is subjected to an encrypting process and, thereafter, transmitted to the secondary display 3 through the access point 11 by the wireless LAN.
  • the signal is subjected to a process for decrypting the encryption and a decompressing process by the signal processing unit 42 of the secondary display 3 and displayed by the LCD 46 .
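The wireless path just described keeps a fixed ordering: the base-band signal is digitized and compressed, then encrypted, then transmitted; the secondary display decrypts, then decompresses, then displays. The sketch below shows only that ordering; zlib stands in for the MPEG2/MPEG4 compression and a toy XOR cipher stands in for the real encryption, both purely illustrative assumptions rather than the codecs the patent names.

```python
import zlib

KEY = 0x5A  # toy single-byte key; a real system would use a proper cipher

def xor_cipher(data: bytes) -> bytes:
    """Toy stand-in for the encryption step (symmetric: same call decrypts)."""
    return bytes(b ^ KEY for b in data)

def send_to_secondary(frame: bytes) -> bytes:
    """Media receiver side: compress first, then encrypt, then transmit."""
    compressed = zlib.compress(frame)   # stands in for MPEG2/MPEG4
    return xor_cipher(compressed)       # encrypt the compressed stream

def receive_on_secondary(packet: bytes) -> bytes:
    """Secondary display side: decrypt, then decompress, then display."""
    compressed = xor_cipher(packet)     # decrypt
    return zlib.decompress(compressed)  # recover the frame for the LCD

frame = b"\x00\x01\x02" * 100           # placeholder video frame
assert receive_on_secondary(send_to_secondary(frame)) == frame
```

Compression must precede encryption: an encrypted stream looks like random bytes and no longer compresses, which is why the signal processing unit 32 compresses the subtuner output before the encrypting process.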
  • the digital broadcast signal is inputted to the digital tuner 23 and demodulated by a digital front-end block of the digital tuner 23 . After that, the digital video signal is decoded by the video decoder 27 . The digital video signal is displayed onto the display panel 21 through the image processing unit 25 and the DVI 26 .
  • the SD signal, for example the 480I video signal, which is outputted from the video decoder 27 is sent to the signal processing unit 32 , where it is compressed in a digital compression format and encrypted.
  • the resultant signal is transmitted to the secondary display 3 from the access point 11 of the wireless LAN.
  • when the input source is the HD signal, it is down-converted into the SD signal, for example the 480I video signal, and thereafter sent to the signal processing unit 32 .
  • the down-conversion is a process for protection of a copyright of the digital broadcast contents.
  • the signal inputted from the LAN 31 is subjected to a streaming decoding process in the signal processing unit 32 in accordance with the streaming compression format and sent to the display panel 21 through the image processing unit 25 and the DVI 26 .
  • when the streaming contents are displayed on the secondary display 3 , they are not subjected to the decoding process in the signal processing unit 32 but are transmitted to the secondary display 3 by the wireless LAN while keeping the state where they have been compressed by the streaming compression format.
  • the decoding process of the streaming compression is executed by the signal processing unit 42 of the secondary display 3 , the decoded video image is displayed by the LCD 46 , and the decoded audio sound is reproduced by the speaker 48 .
  • the invention intends to improve the station selecting operation of the broadcasting and the GUI (Graphical User Interface) at the time of selecting the contents of the Internet. That is, hitherto, in the case of operating the touch panel, when the user selects the desired contents, he first enters the menu display mode (or executes the operation corresponding to it) and thereafter selects the source to be monitored from the menu. In this manner, an operation of two stages is always necessary before the user reaches the desired contents. The invention eliminates this drawback: both operations can be realized by one touching and releasing operation.
  • FIG. 3A shows the touch panel 47 laminated on the LCD 46 of the secondary display 3 . Since the touch panel 47 is transparent, the display image on the LCD 46 can be seen through the touch panel 47 .
  • a pressure sensitive type in which the position where a contact pressure has been applied is detected or an electrostatic type in which the contact is detected as a change in electrostatic capacitance can be used.
  • a touch panel of an infrared detection system in which a number of sensors comprising infrared light emitting diodes and phototransistors are provided can be also used.
  • generally, the size of a touch panel is almost equal to that of the display screen. In the embodiment, however, the size of the touch panel 47 is larger than that of the display screen of the LCD 46 .
  • the touch panel 47 comprises a display/sensor unit 51 a which is slightly larger than the display screen of the LCD 46 ; and a sensor unit 51 b which is projected to the outside from one side, for example, from one side on the right.
  • a finger 52 of the user (a rod-shaped touch pen can also be used in place of the finger) is touched to a selection item such as a desired button, icon, or the like on the display/sensor unit 51 a , and is vertically moved on the sensor unit 51 b.
  • a selection display 53 comprising a plurality of selection items, for example, first to fifth buttons is displayed along the side of the right side of the display screen by the LCD 46 in accordance with the operation of the touch panel 47 .
  • a plurality of selection displays 53 are displayed in the vertical direction in parallel with the sensor unit 51 b.
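The selection display described above amounts to stacking the selection items vertically along the right edge of the screen, each item occupying a horizontal band that lines up with a stretch of the sensor strip. A minimal sketch of that layout follows; the screen dimensions, the item width, and the function name are illustrative assumptions, not values taken from the patent.

```python
from typing import NamedTuple

class Rect(NamedTuple):
    x: int
    y: int
    w: int
    h: int

def layout_selection_display(screen_w: int, screen_h: int,
                             n_items: int, item_w: int = 120) -> list[Rect]:
    """Stack n_items buttons vertically, flush against the right edge,
    so that each button's vertical band lines up with the sensor strip."""
    item_h = screen_h // n_items
    x = screen_w - item_w
    return [Rect(x, i * item_h, item_w, item_h) for i in range(n_items)]

# Five buttons on an 800x480 screen: each band is 96 px tall.
buttons = layout_selection_display(800, 480, 5)
assert buttons[0] == Rect(680, 0, 120, 96)
```

Because each button's y-range equals the corresponding stretch of the sensor strip, the same band arithmetic can later decide which button a finger at a given height instructs.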
  • a touch panel 47 construction other than that mentioned above can also be used.
  • the sensor unit is provided on the outside of another side (left side, upper side, or lower side) of the display screen of the LCD 46 and a selection display in which a plurality of selection items are arranged is displayed along such another side by the LCD 46 .
  • a belt-shaped touch panel can be also partially provided along one side of the display screen.
  • since the sensor unit 51 b is provided on the outside of the display screen, a situation in which the display screen becomes hard to see because of dirt from repeatedly operating the sensor unit 51 b can be avoided. Since the selection display 53 is displayed at the end of the display screen along the side where the sensor unit 51 b is provided, the degree to which the display image on the LCD 46 is obscured by the selection display 53 can be reduced.
  • the sensor unit 51 b is touched with the finger 52 in step S 1 .
  • the selection display 53 is displayed on the display screen as shown in FIG. 3A .
  • step S 2 when the finger 52 is vertically moved while keeping the state where the finger is touched onto the sensor unit 51 b , the button in the position of the same height as that of the finger is instructed and highlighted.
  • the “highlight” denotes a display in which the instructed button can be visually identified, that is, the luminance, color, reversal indication, flickering, or the like is made different.
  • the third button is highlighted.
  • step S 3 when the finger 52 is released from the sensor unit 51 b in the state where the third button is highlighted, the third button is selected. That is, the operation to display the selection display 53 and the operation to select a desired selection item in the selection display 53 can be executed by one touching and releasing operation. Since the third button is selected, a display screen of a lower layer corresponding to the third button is displayed by the LCD 46 .
  • Step S 4 shows the case where the finger is released from the sensor unit 51 b in this state. In this case, it is judged that the selecting operation has been cancelled, the processing routine is finished, and therefore, the state does not change and the display of the selection display 53 is continued.
  • FIG. 5 shows a selection display, for example, a menu display 54 which is displayed when the finger 52 is touched to the sensor unit 51 b of the right side of the touch panel 47 .
  • a menu display 54 which is displayed when the finger 52 is touched to the sensor unit 51 b of the right side of the touch panel 47 .
  • FIG. 5 shows the state where a channel list of the menu item is highlighted.
  • this menu item is selected.
  • a display screen of a lower layer corresponding to the selected menu item is displayed.
  • the image of the LCD 46 of the secondary display 3 is displayed on the display/sensor unit 51 a of the touch panel 47 .
  • FIG. 6 shows an example of a display 55 in the case where a menu item of “television channel list” is selected.
  • Channels of the terrestrial wave, BS, CS, and inputs are displayed on the LCD 46 .
  • a desired channel can be selected by the display/sensor unit 51 a of the touch panel 47 .
  • the channel list of FIG. 6 is a display of the list showing, for example, the items which are displayed on the primary display 2 .
  • FIG. 7 shows an example of a display 56 in the case where a menu item of “channel list” is selected. It shows a list of the contents which can be received by the secondary display 3 .
  • channels of news and the like which are received through the Internet are displayed on the LCD 46 .
  • a desired channel can be selected by the display/sensor unit 51 a of the touch panel 47 .
  • FIG. 8 shows an example, of a display 57 in the case where a menu item of “TV remocon” is selected.
  • Buttons for remocon are displayed on the display screen of the LCD 46 .
  • the buttons for remocon are buttons for ten-key, increase/decrease in sound volume, channel switching, and the like.
  • FIG. 9 shows an example of a display 58 in the case where a menu item of “memory stick (registered trade name)” is selected. Thumbnails of still images recorded in the memory card 49 are displayed on the display screen of the LCD 46 . Nine thumbnails can be displayed at once. The thumbnails which are displayed can be switched by vertically scrolling.
  • FIG. 10 shows an example of a display 59 in the case where a menu item of “Internet” is selected.
  • a list of titles and addresses of Homepages registered as “favorites” is displayed on the display screen of the LCD 46 .
  • a column to input words or phrases for searching is displayed. Further, other buttons necessary to access a site through the Internet are displayed. Browsing of Homepage is ordinarily performed on the secondary display 3 .
  • FIG. 11 shows an example of a display 60 in the case where a menu item of “setup” is selected. This display 60 is displayed to set the terrestrial channel.
  • FIG. 12 shows an example of a display 61 in the case where a menu of “setup” is selected. When the menu of the setup is selected, the displays 60 and 61 are used and the states of the primary display 2 and the media receiver 5 are set.
  • buttons of the operations of “slow”, “swap”, and “catch” are displayed on the lower side of the display 61 .
  • “Slow” is a process for displaying the same image as that displayed on the secondary display 3 onto the primary display 2 .
  • “Swap” is a process for exchanging the display of the primary display 2 for the display of the secondary display 3 .
  • “Catch” is a process for displaying the same image as that displayed on the primary display 2 onto the secondary display 3 .
  • Commands for executing such processes can be generated by the operation to move the finger 52 from the bottom to the top on the touch panel 47 (in the case of “slow”) or the operation to move the finger 52 from the top to the bottom (in the case of “catch”) besides the operation to touch onto the buttons mentioned above.
  • Such a changing process of the display image can be realized by transmitting the commands from the secondary display 3 to the primary display 2 side and controlling the main tuner 22 a and the subtuner 22 b by the system controller 28 .
  • the tuners it is possible to execute the operation which gives an impression as if the image were bidirectionally transmitted and received between the display panel 21 of the primary display 2 and the LCD 46 of the secondary display 3 .
  • the invention is not limited to the embodiment or the like of the invention as mentioned above but various modifications and applications are possible within the scope without departing from the essence of the invention.
  • the invention can be applied to a television apparatus having one display or the like besides the system having the primary display 2 and the secondary display 3 .

Abstract

A touch panel 47 is constructed by a display/sensor unit 51 a which is slightly larger than a display screen of an LCD 46 and a sensor unit 51 b which projects to the outside from one side, for example, from the side on the right. A finger 52 of the user (a rod-shaped touch pen can also be used in place of the finger) touches a selection item such as a desired button, icon, or the like on the display/sensor unit 51 a and is moved vertically on the sensor unit 51 b. A selection display 53 constructed by a plurality of buttons is displayed by the LCD 46 along the right side of the display screen. The button beside the finger 52 is highlighted, and when the finger 52 is released, the highlighted button is selected. If there is no button adjacent to the finger 52, no button is highlighted, and even if the finger 52 is released, the state does not change. When the finger 52 is moved to the display/sensor unit 51 a, the selection display 53 disappears.

Description

    TECHNICAL FIELD
  • The invention relates to an input apparatus with a construction in which a touch panel is attached onto the screen of a display such as a liquid crystal display or the like, and to an input method using such a kind of input apparatus.
  • BACKGROUND ART
  • Hitherto, as a digital broadcast receiver, a construction in which an auxiliary input/output apparatus having a small display is provided separately from a display main body having a large screen, and in which the display main body and the auxiliary input/output apparatus are connected in a wireless manner, has been disclosed in JP-A-2001-2030908. According to the apparatus disclosed in JP-A-2001-2030908, the auxiliary input/output apparatus has a construction in which a touch panel is attached onto the screen, an electronic program table is displayed on the auxiliary input/output apparatus, and the operation to select a desired program or to reserve recording of the desired program is executed through the touch panel of the auxiliary input/output apparatus.
  • However, in a system having the two displays as disclosed in JP-A-2001-2030908, in the case where the auxiliary input/output apparatus has the construction of the touch panel, when broadcast contents, Internet contents, or the like is selected and monitored by the operation of the touch panel, the operations of two stages are necessary. That is, when the user selects desired contents, first, he enters a menu display mode or executes the operation corresponding to the operation for entering the menu display mode and, thereafter, he selects a source to be monitored from the menu. In this manner, the operations of two stages are necessary until the target reaches the desired contents.
  • It is, therefore, an object of the invention to provide an input method and an input apparatus in which, in the case of inputting by using a touch panel, the operation to display a menu and the operation to select a source to be monitored from the displayed menu can be executed at a time and the operability is improved.
  • DISCLOSURE OF INVENTION
  • To solve the above problems, according to the invention of claim 1, there is provided an input method using an input apparatus in which a touch panel is laminated onto a display screen of a display apparatus, a sensor unit is formed so as to be expanded to the outside of one side of the display screen, an instruction according to a touching position of a finger or a touch pen onto the sensor unit is given, and a controller generates a control signal on the basis of the instruction, comprising the steps of: displaying a selection display comprising a plurality of selection items along the side of the display screen when the finger or the touch pen is touched to the sensor unit; instructing one of the selection items when the finger or the touch pen is moved along the side on the sensor unit; and instructing selection of the instructed selection item when the finger or the touch pen is released from the sensor unit.
  • According to the invention of claim 5, there is provided an input apparatus in which a touch panel is laminated onto a display screen of a display apparatus, comprising: a sensor unit formed so as to be expanded to the outside of one side of the display screen; and a controller to which an instruction according to a touching position of a finger or a touch pen onto the sensor unit is given and which generates a control signal on the basis of the instruction, wherein the controller controls the display apparatus in such a manner that a selection display comprising a plurality of selection items is displayed along the side of the display screen when the finger or the touch pen is touched to the sensor unit and one of the selection items is instructed when the finger or the touch pen is moved along the side on the sensor unit, and the controller generates the control signal to instruct selection of the instructed selection item when the finger or the touch pen is released from the sensor unit.
  • According to the invention, by displaying the selection display, for example, a menu by the touch and releasing the finger or the touch pen from the sensor unit, one selection item, for example, a source can be selected. The display of the menu and the selection can be executed by one touching and releasing operation and the operability can be improved. Since the selection display is displayed along the side near the sensor unit, the items which are instructed can be easily seen and a situation that the display screen cannot be easily seen because of the selection display can be avoided.
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIG. 1 is a block diagram showing a system construction of an embodiment of the invention.
  • FIG. 2 is a block diagram showing a more detailed construction of the embodiment of the invention.
  • FIGS. 3A to 3C are schematic diagrams for use in explanation of the embodiment of the invention.
  • FIG. 4 is a flowchart for use in explanation of the embodiment of the invention.
  • FIG. 5 is a schematic diagram showing a more specific display example of the embodiment of the invention.
  • FIG. 6 is a schematic diagram showing a display example corresponding to a menu selected by a menu display.
  • FIG. 7 is a schematic diagram showing a display example corresponding to a menu selected by the menu display.
  • FIG. 8 is a schematic diagram showing a display example corresponding to a menu selected by the menu display.
  • FIG. 9 is a schematic diagram showing a display example corresponding to a menu selected by the menu display.
  • FIG. 10 is a schematic diagram showing a display example corresponding to a menu selected by the menu display.
  • FIG. 11 is a schematic diagram showing a display example corresponding to a menu selected by the menu display.
  • FIG. 12 is a schematic diagram showing a display example corresponding to a menu selected by the menu display.
  • BEST MODE FOR CARRYING OUT THE INVENTION
  • An embodiment of the invention will be described hereinbelow with reference to the drawings.
  • In FIG. 1, reference numeral 1 denotes a whole display system to which the invention is applied; 2 indicates a first display unit (hereinafter, properly referred to as a primary display) having a large display panel such as a PDP (Plasma Display Panel), LCD (Liquid Crystal Display), or the like; and 3 indicates a small second display unit (hereinafter, properly referred to as a secondary display). The secondary display 3 has a construction in which a touch panel is laminated onto a small LCD of, for example, 7 inches, is put on a pedestal 4, and can be carried by the user as necessary.
  • A video signal to be displayed is supplied to the primary display 2 through a media receiver 5. The video signal is a broadcast signal or streaming data which is distributed through the Internet. The broadcast signal is received by an antenna 6, the streaming data is branched by a switch 7, and they are supplied to the media receiver 5 through a LAN (Local Area Network). A personal computer 8 is connected to another branch of the switch 7.
  • The streaming data distributed through the Internet 10 is inputted to the WAN (Wide Area Network) side of a MODEM (modulator-demodulator) 9 of an ADSL (Asymmetric Digital Subscriber Line). The switch 7 is connected to the LAN side of the MODEM 9. The ADSL is an example of a broadband connection. As another method, video contents can be received through a broadband connection using a CATV (cable television), an FTTH (Fiber To The Home), or the like. Ordinarily, the video contents are associated with audio data.
  • The media receiver 5 has two tuners to supply the reception signal to each of the primary display 2 and the secondary display 3. The media receiver 5 can transmit the video signal to the secondary display 3 through an access point 11 of a wireless LAN. Control data such as a remocon (remote control) signal or the like can be transmitted to the access point 11 from the secondary display 3 and bidirectional communication can be made. For example, a wireless system of IEEE (Institute of Electrical and Electronics Engineers) 802.11 can be used and the standard of, for example, 802.11a in such a wireless system can be used. This standard uses a frequency of 5.2 GHz and can realize a transmission speed of maximum 54 Mbps.
  • FIG. 2 shows, in more detail, a construction of an example of the display system comprising the primary display 2 and the secondary display 3. The primary display 2 has a relatively large display panel 21 of, for example, 30 inches or more and its driving unit (not shown).
  • A main tuner 22 a and a subtuner 22 b each for receiving a terrestrial wave are included in the media receiver 5. Reference numeral 23 denotes a digital tuner for receiving BS (Broadcasting Satellite) and 110° CS (Communication Satellite). Although not shown, outputs of UHF/VHF antennas are supplied to the tuners 22 a and 22 b and an output of a parabolic antenna for receiving BS/110° CS is supplied to the digital tuner 23. In the embodiment, the main tuner 22 a is used for the primary display 2 and the subtuner 22 b is used for the secondary display 3.
  • The video signals of the main tuner 22 a and the subtuner 22 b are supplied to an AV switch 24. An output video signal of the AV switch 24 is inputted to an image processing unit 25 and a signal processing unit 32. The image processing unit 25 executes image processes for improving picture quality such as a process to further raise resolution and the like.
  • An output signal of the image processing unit 25 is inputted to the display panel 21 of the primary display 2 through a DVI (Digital Visual Interface) 26 as a display interface. Although not shown, a picture quality adjusting circuit of the display panel 21 is provided at the front stage of the DVI 26. Further, in the case of supplying the digital video signal to the display panel 21, a copy prevention signal to prevent illegal copying of the broadcast contents is also outputted. For example, HDCP (High-bandwidth Digital Content Protection) can be used.
  • An output signal of the digital tuner 23 is inputted to a video decoder 27. For example, decoding by MPEG2 (Moving Picture Experts Group Phase 2) is executed by the video decoder 27. An HD (High Definition) video signal from the video decoder 27 is supplied to the image processing unit 25 and inputted to the display panel 21 through the DVI 26.
  • The video decoder 27 has a function for outputting an SD (Standard Definition) video signal, for example, 480I (interlace signal in which the number of lines is equal to 480) to the signal processing unit 32. Reference numeral 28 denotes a system controller for controlling the operations of the primary display 2 and the media receiver 5 and the controller 28 is constructed by a CPU (Central Processing Unit). For example, the system controller 28 controls station selecting states of the main tuner 22 a and the subtuner 22 b.
  • The streaming data and the data of Homepage which were received through the Internet are supplied to the signal processing unit 32 through a LAN 31. In the signal processing unit 32, two DSPs (Digital Signal Processors) 33 and 34 are connected to a bus such as PCI (Peripheral Component Interconnect) 35 and a controller 36 comprising a CPU is connected to the PCI 35 through a bridge 37.
  • The signal processing unit 32 decodes the inputted streaming data. The decoded video signal is supplied to the image processing unit 25 and displayed by the primary display 2. Therefore, the broadcast signal from each of the main tuner 22 a and the digital tuner 23 can be displayed on the primary display 2 and the contents received through the Internet can also be displayed.
  • The signal processing unit 32 encrypts the video signals from the subtuner 22 b and the digital tuner 23, further, converts the encrypted video signals into a format in which they can be transmitted in a wireless manner, and sends the converted signals to the secondary display 3 through the access point 11. The contents such as streaming data received through the Internet and the like are sent to the secondary display 3 through the access point 11 without being decoded. On the other hand, the signal processing unit 32 processes the control signal such as a remocon signal or the like from the secondary display 3 which was received by the access point 11 and sends it to the system controller 28.
  • The secondary display 3 has a transceiver 41 for making wireless communication with the access point 11. A signal processing unit 42 is connected to the transceiver 41. In the signal processing unit 42, a system controller 43 to control the operation of the secondary display 3 and a DSP 44 are connected through PCI 45.
  • A display panel, for example, an LCD 46, a transparent touch panel 47 laminated on the display screen of the LCD 46, a speaker 48, and a memory card 49 are connected to the signal processing unit 42. Further, a battery 50 as a power source is provided. The battery 50 is enclosed in, for example, the pedestal 4 (refer to FIG. 1). The signal processing unit 42 decodes the encrypted video signal received from the access point 11, decodes the data received through the Internet, and displays the decoded signal onto the LCD 46. Further, the signal processing unit 42 transmits a remocon signal, a command, or the like generated by the operation of the touch panel 47 to the primary display 2 side. Moreover, the signal processing unit 42 has a function for decoding still image data stored in the memory card 49 and displaying it onto the LCD 46.
  • The operation of the foregoing display system according to the embodiment of the invention will be described hereinbelow. The analog video signal of a base band demodulated by the main tuner 22 a is converted into a digital signal, subjected to the picture quality improving process by the image processing unit 25, subjected to an interlace/progressive converting process, and thereafter, outputted to the display panel 21 through the DVI 26.
  • A base band analog signal demodulated by the subtuner 22 b is supplied to the signal processing unit 32, converted into a digital signal, and thereafter, compressed in a digital compression format such as MPEG2, MPEG4, or the like. The compressed video signal is subjected to an encrypting process and, thereafter, transmitted to the secondary display 3 through the access point 11 by the wireless LAN. The signal is subjected to a process for decrypting the encryption and a decompressing process by the signal processing unit 42 of the secondary display 3 and displayed by the LCD 46.
  • In the case where the input source is a digital broadcast signal, the digital broadcast signal is inputted to the digital tuner 23 and demodulated by a digital front-end block of the digital tuner 23. After that, the digital video signal is decoded by the video decoder 27. The digital video signal is displayed onto the display panel 21 through the image processing unit 25 and the DVI 26.
  • The SD signal, for example the 480I video signal which is outputted from the video decoder 27, is sent to the signal processing unit 32, compressed in a digital compression format, and encrypted by the signal processing unit 32. The resultant signal is transmitted to the secondary display 3 from the access point 11 of the wireless LAN. In the case where the input source is the HD signal, it is down-converted into the SD signal, for example the 480I video signal, and thereafter sent to the signal processing unit 32. The down-conversion is a process for protection of the copyright of the digital broadcast contents.
  • In the case where the input source is the streaming contents from the Internet, the signal inputted from the LAN 31 is subjected to a streaming decoding process in the signal processing unit 32 in accordance with the streaming compression format and sent to the display panel 21 through the image processing unit 25 and the DVI 26.
  • In the case of displaying the streaming contents onto the secondary display 3, it is not subjected to the decoding process in the signal processing unit 32 but is transmitted to the secondary display 3 by the wireless LAN while keeping the state where it has been compressed by the streaming compression format. The decoding process of the streaming compression is executed by the signal processing unit 42 of the secondary display 3, the decoded video image is displayed by the LCD 46, and the decoded audio sound is reproduced by the speaker 48.
  • In the foregoing display system, the invention intends to improve the station selecting operation of the broadcasting and the GUI (Graphical User Interface) at the time of selecting contents from the Internet. That is, hitherto, in the case of operating the touch panel, when the user selects the desired contents, first, he enters the menu display mode or executes the operation corresponding to such an operation and, thereafter, he further selects the source to be monitored from the menu. In this manner, two-stage operations are necessary before the user reaches the desired contents. By eliminating such a drawback, those operations can be realized by the touching and releasing operation, that is, one operation.
  • An outline of the embodiment of the invention will now be described with reference to FIGS. 3A to 3C and 4. FIG. 3A shows the touch panel 47 laminated on the LCD 46 of the secondary display 3. Since the touch panel 47 is transparent, the display image on the LCD 46 can be seen through the touch panel 47. As a specific structure of the touch panel 47, either a pressure sensitive type in which the position where a contact pressure has been applied is detected or an electrostatic type in which the contact is detected as a change in electrostatic capacitance can be used. Further, a touch panel of an infrared detection system in which a number of sensors comprising infrared light emitting diodes and phototransistors are provided can be also used.
  • Ordinarily, a size of touch panel 47 is almost equal to that of the display screen of the LCD 46. In the embodiment, the size of touch panel 47 is larger than that of the display screen of the LCD 46. In the examples of FIGS. 3A to 3C, the touch panel 47 comprises a display/sensor unit 51 a which is slightly larger than the display screen of the LCD 46; and a sensor unit 51 b which is projected to the outside from one side, for example, from one side on the right. A finger 52 of the user (a rod-shaped touch pen can be also used in place of the finger) is touched to a selection item (item to be selected) such as desired button, icon, or the like on the display/sensor unit 51 a and vertically moved on the sensor unit 51 b.
  • A selection display 53 comprising a plurality of selection items, for example, first to fifth buttons is displayed along the side of the right side of the display screen by the LCD 46 in accordance with the operation of the touch panel 47. In the embodiment, a plurality of selection displays 53 are displayed in the vertical direction in parallel with the sensor device portion 51 b.
  • As a construction of the touch panel 47, construction other than that mentioned above can be used. For example, it is also possible to construct in such a manner that the sensor unit is provided on the outside of another side (left side, upper side, or lower side) of the display screen of the LCD 46 and a selection display in which a plurality of selection items are arranged is displayed along such another side by the LCD 46. A belt-shaped touch panel can be also partially provided along one side of the display screen.
  • Since the sensor unit 51 b is provided on the outside of the display screen, a situation that the display screen cannot be easily seen due to the dirt by repetitively operating the sensor unit 51 b can be avoided. Since the selection display 53 is displayed at the end of the display screen along the side where the sensor unit 51 b is provided, a degree at which the display image on the LCD 46 cannot be easily seen because of the selection display 53 can be reduced.
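The division of the panel into the display/sensor unit 51 a and the outer sensor unit 51 b described above can be sketched as a simple hit test. This is an illustrative sketch only: the coordinate system, the strip-on-the-right layout, and all names are assumptions for explanation, not an implementation from the patent.

```python
# Hypothetical hit-testing for a touch panel whose right-hand strip
# (sensor unit 51b) lies outside the LCD display area (unit 51a).
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class PanelLayout:
    panel_width: int    # full width of the touch panel 47, in sensor units
    display_width: int  # width covered by the LCD 46 (display/sensor unit 51a)

    def on_sensor_strip(self, x: int) -> bool:
        """True when a touch at horizontal position x falls on the
        sensor strip outside the display screen."""
        return self.display_width <= x < self.panel_width

def button_at_height(y: int, button_tops: List[int], button_height: int) -> Optional[int]:
    """Index of the selection-display button level with the finger,
    or None when the finger is above or below the button column."""
    for i, top in enumerate(button_tops):
        if top <= y < top + button_height:
            return i
    return None
```

For example, with an assumed 800-unit-wide panel whose leftmost 720 units cover the display, a touch at x = 760 lands on the strip, and with five 40-unit buttons starting at the top, a finger at y = 90 is level with the third button (index 2).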
  • In a flowchart shown in FIG. 4, the sensor unit 51 b is touched with the finger 52 in step S1. Thus, the selection display 53 is displayed on the display screen as shown in FIG. 3A.
  • In step S2, when the finger 52 is vertically moved while keeping the state where the finger is touched onto the sensor unit 51 b, the button in the position of the same height as that of the finger is instructed and highlighted. The “highlight” denotes a display in which the instructed button can be visually identified, that is, the luminance, color, reversal indication, flickering, or the like is made different. In FIG. 3A, the third button is highlighted.
  • In step S3, when the finger 52 is released from the sensor unit 51 b in the state where the third button is highlighted, the third button is selected. That is, the operation to display the selection display 53 and the operation to select a desired selection item in the selection display 53 can be executed by one touching and releasing operation. Since the third button is selected, a display screen of a lower layer corresponding to the third button is displayed by the LCD 46.
  • In the case where the position of the finger 52 is on the sensor unit 51 b but there is no button beside the finger as shown in FIG. 3B, no button is highlighted. That is, when the finger touches the area above or below the range where the five buttons are arranged, since there are no neighboring buttons, no button is highlighted. Step S4 shows the case where the finger is released from the sensor unit 51 b in this state. In this case, it is judged that the selecting operation has been cancelled and the processing routine is finished; therefore, the state does not change and the display of the selection display 53 is continued.
  • Further, when the finger 52 is moved to the display/sensor unit 51 a as shown in FIG. 3C, it is judged that the selecting operation has been cancelled, the processing routine is finished, and the display of the selection display 53 disappears. In this case, even if the finger 52 is released, the state is not changed.
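The touch/move/release flow of FIG. 4, including both cancellation paths, can be modelled as a small state machine. The sketch below is illustrative only; the class and method names are hypothetical, and the button geometry is an assumption made to keep the example concrete.

```python
# Hypothetical state machine for the FIG. 4 selection flow:
# touching the strip shows the menu (S1), sliding highlights the
# button level with the finger (S2), releasing selects it (S3),
# and releasing with no adjacent button (S4) or sliding onto the
# display area (FIG. 3C) cancels.
from typing import List, Optional

class SelectionFlow:
    def __init__(self, button_tops: List[int], button_height: int):
        self.button_tops = button_tops
        self.button_height = button_height
        self.menu_visible = False
        self.highlighted: Optional[int] = None  # index of highlighted button
        self.selected: Optional[int] = None     # index chosen on release

    def _button_at(self, y: int) -> Optional[int]:
        for i, top in enumerate(self.button_tops):
            if top <= y < top + self.button_height:
                return i
        return None

    def touch_strip(self, y: int) -> None:
        # Step S1: finger touches sensor unit 51b; menu appears.
        self.menu_visible = True
        self.highlighted = self._button_at(y)

    def move_on_strip(self, y: int) -> None:
        # Step S2: finger slides vertically; highlight follows it.
        self.highlighted = self._button_at(y)

    def move_onto_display(self) -> None:
        # FIG. 3C: sliding onto unit 51a cancels and hides the menu.
        self.menu_visible = False
        self.highlighted = None

    def release(self) -> None:
        # Steps S3/S4: release selects the highlighted button, or is
        # treated as a cancel (menu stays shown) when none is highlighted.
        if self.highlighted is not None:
            self.selected = self.highlighted
```

Under this model, one touch-and-release pair performs both the menu display and the selection, which is the one-operation behavior the embodiment describes.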
  • A more specific example of the foregoing embodiment will now be described. FIG. 5 shows a selection display, for example, a menu display 54 which is displayed when the finger 52 is touched to the sensor unit 51 b of the right side of the touch panel 47. When the finger 52 is vertically moved while keeping the state where it is touched to the sensor unit 51 b, only the menu item of almost the same height as that of the finger 52 is highlighted. FIG. 5 shows the state where a channel list of the menu item is highlighted.
  • When the finger 52 is released at the position of the highlighted menu item, this menu item is selected. A display screen of a lower layer corresponding to the selected menu item is displayed. Although not shown in FIG. 5, when the menu display 54 is displayed, the image of the LCD 46 of the secondary display 3 is displayed on the display/sensor unit 51 a of the touch panel 47.
  • FIG. 6 shows an example of a display 55 in the case where a menu item of “television channel list” is selected. Channels of the terrestrial wave, BS, CS, and inputs (Video 1 to Video 4) are displayed on the LCD 46. A desired channel can be selected by the display/sensor unit 51 a of the touch panel 47. The channel list of FIG. 6 is a display of the list showing, for example, the items which are displayed on the primary display 2.
  • FIG. 7 shows an example of a display 56 in the case where a menu item of “channel list” is selected. It shows a list of the contents which can be received by the secondary display 3. In addition to the channels of the television and the video inputs shown in FIG. 6, channels of news and the like which are received through the Internet are displayed on the LCD 46. A desired channel can be selected by the display/sensor unit 51 a of the touch panel 47.
  • FIG. 8 shows an example of a display 57 in the case where a menu item of “TV remocon” is selected. Buttons for remocon (remote control) are displayed on the display screen of the LCD 46. By pressing a desired button in the display/sensor unit 51 a of the touch panel 47, the primary display 2 and the media receiver 5 can be controlled. The buttons for remocon are buttons for the ten-key pad, increase/decrease of sound volume, channel switching, and the like.
  • FIG. 9 shows an example of a display 58 in the case where a menu item of “memory stick (registered trade name)” is selected. Thumbnails of still images recorded in the memory card 49 are displayed on the display screen of the LCD 46. Nine thumbnails can be displayed at once. The thumbnails which are displayed can be switched by vertically scrolling.
  • FIG. 10 shows an example of a display 59 in the case where a menu item of “Internet” is selected. A list of titles and addresses of Homepages registered as “favorites” is displayed on the display screen of the LCD 46. A column to input words or phrases for searching is displayed. Further, other buttons necessary to access a site through the Internet are displayed. Browsing of Homepage is ordinarily performed on the secondary display 3.
  • FIG. 11 shows an example of a display 60 in the case where the menu item of “setup” is selected; this display 60 is used to set the terrestrial channels. FIG. 12 shows another example, a display 61, for the same “setup” menu. When the setup menu is selected, the displays 60 and 61 are used to set the states of the primary display 2 and the media receiver 5.
  • In FIG. 12, buttons for the operations of “throw”, “swap”, and “catch” are displayed on the lower side of the display 61. “Throw” is a process for displaying the same image as that displayed on the secondary display 3 onto the primary display 2. “Swap” is a process for exchanging the display of the primary display 2 with the display of the secondary display 3. “Catch” is a process for displaying the same image as that displayed on the primary display 2 onto the secondary display 3.
  • Commands for executing these processes can also be generated by moving the finger 52 from the bottom to the top of the touch panel 47 (in the case of “throw”) or from the top to the bottom (in the case of “catch”), besides touching the buttons mentioned above. Such a change of the display image can be realized by transmitting the commands from the secondary display 3 to the primary display 2 side and controlling the main tuner 22 a and the subtuner 22 b with the system controller 28. By this control of the tuners, it is possible to produce an operation which gives the impression that the image is bidirectionally transmitted and received between the display panel 21 of the primary display 2 and the LCD 46 of the secondary display 3.
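The gesture handling described above reduces to classifying the vertical direction of a finger movement. The following is a minimal sketch under assumed names and an assumed threshold value; it is an illustration of the direction-to-command mapping, not the patent's implementation.

```python
# Hypothetical sketch: translating a vertical finger movement on the
# secondary display's touch panel into a "throw" or "catch" command.
SWIPE_THRESHOLD = 50  # minimum vertical travel in pixels (assumed value)

def command_from_gesture(start_y, end_y):
    """Map a vertical finger movement to a command.

    Screen coordinates grow downward, so a bottom-to-top movement
    (end_y < start_y) sends the image to the primary display ("throw"),
    and a top-to-bottom movement pulls it back ("catch").
    """
    delta = end_y - start_y
    if delta <= -SWIPE_THRESHOLD:
        return "throw"  # image goes from the secondary to the primary display
    if delta >= SWIPE_THRESHOLD:
        return "catch"  # image comes from the primary to the secondary display
    return None         # movement too short; ignore it

print(command_from_gesture(400, 100))  # bottom-to-top: throw
print(command_from_gesture(100, 400))  # top-to-bottom: catch
```

The returned command would then be transmitted to the primary display side, where the system controller switches the tuners accordingly.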
  • The invention is not limited to the embodiment described above; various modifications and applications are possible within the scope of the invention without departing from its essence. For example, the invention can also be applied to a television apparatus having a single display, besides the system having the primary display 2 and the secondary display 3.

Claims (12)

1-8. (canceled)
9. A display system comprising:
a first display having a display panel;
a second display having a display panel; and
a controller,
wherein the controller controls display on each said first and second displays,
wherein said controller accepts an instruction from a user to perform one of:
a first process to display a first image, displayed on the first display, onto said second display,
a second process to display a second image, displayed on said second display, onto said first display, and
a third process to exchange images displayed on said first and second displays.
10. The display system according to claim 9, wherein an instruction, according to a touching position of a finger or a touch pen, is given onto said second display.
11. The display system according to claim 9, wherein said first process, said second process, and said third process each correspond to a button displayed on said second display,
wherein each process is performed based on the user's instruction.
12. The display system according to claim 9, wherein said first process, said second process, and said third process each correspond to a movement of a finger or a touch pen, and wherein each process is performed based on the user's touch and movement.
13. The display system according to claim 9, wherein said first process, said second process, and said third process each correspond to a touch instruction according to a touching position of a finger or a touch pen onto said second display, and wherein each process is performed based on the user's touch instruction.
14. A controller comprising:
a communication section to communicate with a first display and a second display, wherein the controller selects one of:
sending a first image signal from the first display to the second display;
sending a second image signal from the second display to the first display; and
exchanging image signals between the first and second displays.
15. The controller according to claim 14, wherein the controller receives image data from one or more external connections.
16. The controller according to claim 14, wherein the controller receives image data from an internet source.
17. An information processing apparatus, comprising:
a display section,
a sensor section on the display section that accepts a user's instruction according to a touching position of a finger or a pen onto said sensor section,
an image controlling section that generates a control signal according to the instruction from the user, and
a communication section that sends an image signal displayed on said display section to an external device based on a first instruction from the user,
wherein said image controlling section displays an image from the external device onto said display section based on a second instruction from the user.
18. The information processing apparatus according to claim 17, wherein said image controlling section exchanges image signals between the image displayed on said display section and an image displayed on the external device, based on a third instruction from said user.
19. The information processing apparatus according to claim 17, wherein said sensor section accepts said first instruction when a finger or pen moves in a first direction on said display section, and accepts said second instruction when the finger or pen moves in a second direction on said display section, wherein said first direction and second direction are in opposite directions to each other.
US12/976,171 2003-06-16 2010-12-22 Touch Screen Input Method and Device Abandoned US20110090168A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US12/976,171 US20110090168A1 (en) 2003-06-16 2010-12-22 Touch Screen Input Method and Device
US13/246,299 US20120038578A1 (en) 2003-06-16 2011-09-27 Touch Screen Input Method and Device
US13/617,475 US8743070B2 (en) 2003-06-16 2012-09-14 Touch screen input method and device
US14/210,546 US9001069B2 (en) 2003-06-16 2014-03-14 Technique for executing a combined operation in response to a single press and release of a sensor unit

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP2003-170493 2003-06-16
JP2003170493A JP4161814B2 (en) 2003-06-16 2003-06-16 Input method and input device
US10/524,354 US7948476B2 (en) 2003-06-16 2004-06-11 Touch screen input method and device
PCT/JP2004/008593 WO2004111827A1 (en) 2003-06-16 2004-06-11 Inputting method and device
US12/976,171 US20110090168A1 (en) 2003-06-16 2010-12-22 Touch Screen Input Method and Device

Related Parent Applications (3)

Application Number Title Priority Date Filing Date
US10524354 Continuation 2004-06-11
PCT/JP2004/008593 Continuation WO2004111827A1 (en) 2003-06-16 2004-06-11 Inputting method and device
US10/524,354 Continuation US7948476B2 (en) 2003-06-16 2004-06-11 Touch screen input method and device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/246,299 Division US20120038578A1 (en) 2003-06-16 2011-09-27 Touch Screen Input Method and Device

Publications (1)

Publication Number Publication Date
US20110090168A1 true US20110090168A1 (en) 2011-04-21

Family

ID=33549427

Family Applications (5)

Application Number Title Priority Date Filing Date
US10/524,354 Active 2025-05-28 US7948476B2 (en) 2003-06-16 2004-06-11 Touch screen input method and device
US12/976,171 Abandoned US20110090168A1 (en) 2003-06-16 2010-12-22 Touch Screen Input Method and Device
US13/246,299 Abandoned US20120038578A1 (en) 2003-06-16 2011-09-27 Touch Screen Input Method and Device
US13/617,475 Active US8743070B2 (en) 2003-06-16 2012-09-14 Touch screen input method and device
US14/210,546 Active US9001069B2 (en) 2003-06-16 2014-03-14 Technique for executing a combined operation in response to a single press and release of a sensor unit

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/524,354 Active 2025-05-28 US7948476B2 (en) 2003-06-16 2004-06-11 Touch screen input method and device

Family Applications After (3)

Application Number Title Priority Date Filing Date
US13/246,299 Abandoned US20120038578A1 (en) 2003-06-16 2011-09-27 Touch Screen Input Method and Device
US13/617,475 Active US8743070B2 (en) 2003-06-16 2012-09-14 Touch screen input method and device
US14/210,546 Active US9001069B2 (en) 2003-06-16 2014-03-14 Technique for executing a combined operation in response to a single press and release of a sensor unit

Country Status (7)

Country Link
US (5) US7948476B2 (en)
EP (3) EP2613249B1 (en)
JP (1) JP4161814B2 (en)
KR (2) KR20060017736A (en)
CN (1) CN1701298A (en)
TW (1) TWI273471B (en)
WO (1) WO2004111827A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110025622A1 (en) * 2009-08-03 2011-02-03 Nitto Denko Corporation Touch panel and display device with touch panel
US20110074708A1 (en) * 2009-09-28 2011-03-31 Brother Kogyo Kabushiki Kaisha Input device with display panel
WO2013081819A1 (en) 2011-11-30 2013-06-06 Neonode Inc. Light-based finger gesture user interface
US9591346B2 (en) 2012-05-14 2017-03-07 Samsung Electronics Co., Ltd. Content delivery system with content sharing mechanism and method of operation thereof
US10061504B2 (en) 2013-01-17 2018-08-28 Toyota Jidosha Kabushiki Kaisha Operation apparatus

Families Citing this family (89)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11275405B2 (en) 2005-03-04 2022-03-15 Apple Inc. Multi-functional hand-held device
KR100751411B1 (en) * 2005-02-01 2007-08-22 엘지전자 주식회사 Method for display Channel Information
WO2006094308A2 (en) * 2005-03-04 2006-09-08 Apple Computer, Inc. Multi-functional hand-held device
TWI272001B (en) * 2005-08-26 2007-01-21 Benq Corp Displaying device and displaying method thereof
US20070093275A1 (en) * 2005-10-25 2007-04-26 Sony Ericsson Mobile Communications Ab Displaying mobile television signals on a secondary display device
KR20070066076A (en) * 2005-12-21 2007-06-27 삼성전자주식회사 Display apparatus and control method thereof
US8018440B2 (en) 2005-12-30 2011-09-13 Microsoft Corporation Unintentional touch rejection
KR100672605B1 (en) * 2006-03-30 2007-01-24 엘지전자 주식회사 Method for selecting items and terminal therefor
DE102006019724B4 (en) * 2006-04-27 2008-05-08 Fujitsu Siemens Computers Gmbh remote control
DE102006034415A1 (en) * 2006-07-25 2008-01-31 Siemens Ag Method and arrangement for operating electrical devices
KR100800450B1 (en) 2006-09-12 2008-02-04 엘지전자 주식회사 Manual input method using by touch pad, and terminal thereof
TWI399671B (en) 2007-01-19 2013-06-21 Lg Electronics Inc Inputting information through touch input device
KR101346931B1 (en) * 2007-01-19 2014-01-07 엘지전자 주식회사 Electronic Device With Touch Screen And Method Of Executing Application Using Same
CN101414229B (en) * 2007-10-19 2010-09-08 集嘉通讯股份有限公司 Method and apparatus for controlling switch of handhold electronic device touch control screen
TW200921478A (en) * 2007-11-06 2009-05-16 Giga Byte Comm Inc A picture-page scrolling control method of touch panel for hand-held electronic device and device thereof
US20090189869A1 (en) * 2007-12-20 2009-07-30 Seiko Epson Corporation Touch panel input device, control method of touch panel input device, media stored control program, and electronic device
JP4709234B2 (en) * 2008-01-25 2011-06-22 シャープ株式会社 Display input device and program
US8752121B2 (en) * 2008-06-27 2014-06-10 At&T Intellectual Property I, Lp System and method for displaying television program information on a remote control device
CN102077160B (en) * 2008-06-30 2014-06-18 日本电气株式会社 Information processing device and display control method
CN101727870B (en) * 2008-10-22 2012-02-22 承景科技股份有限公司 Displayer control device and method thereof
KR20100045188A (en) 2008-10-23 2010-05-03 삼성전자주식회사 Remote control device and method for controlling other devices using the remote control device
JP4670970B2 (en) * 2009-01-28 2011-04-13 ソニー株式会社 Display input device
DE102009006661B4 (en) * 2009-01-29 2011-04-14 Institut für Rundfunktechnik GmbH Device for controlling a device reproducing a picture content
JP2010191892A (en) * 2009-02-20 2010-09-02 Sony Corp Information processing apparatus, display control method, and program
JP2011059820A (en) * 2009-09-07 2011-03-24 Sony Corp Information processing apparatus, information processing method and program
KR20110050201A (en) * 2009-11-06 2011-05-13 삼성전자주식회사 Display apparatus and control method of the same
US20110148774A1 (en) * 2009-12-23 2011-06-23 Nokia Corporation Handling Tactile Inputs
JP2011134272A (en) * 2009-12-25 2011-07-07 Sony Corp Information processor, information processing method, and program
US8261213B2 (en) 2010-01-28 2012-09-04 Microsoft Corporation Brush, carbon-copy, and fill gestures
US9411504B2 (en) 2010-01-28 2016-08-09 Microsoft Technology Licensing, Llc Copy and staple gestures
US9519356B2 (en) 2010-02-04 2016-12-13 Microsoft Technology Licensing, Llc Link gestures
US9367205B2 (en) 2010-02-19 2016-06-14 Microsoft Technology Licensing, Llc Radial menus with bezel gestures
US9310994B2 (en) 2010-02-19 2016-04-12 Microsoft Technology Licensing, Llc Use of bezel as an input mechanism
US20110209098A1 (en) * 2010-02-19 2011-08-25 Hinckley Kenneth P On and Off-Screen Gesture Combinations
US9274682B2 (en) * 2010-02-19 2016-03-01 Microsoft Technology Licensing, Llc Off-screen gestures to create on-screen input
US9965165B2 (en) 2010-02-19 2018-05-08 Microsoft Technology Licensing, Llc Multi-finger gestures
US9454304B2 (en) 2010-02-25 2016-09-27 Microsoft Technology Licensing, Llc Multi-screen dual tap gesture
JP2011210138A (en) * 2010-03-30 2011-10-20 Sony Corp Electronic apparatus, image output method and program
JP5255093B2 (en) * 2010-05-17 2013-08-07 シャープ株式会社 Display device and image forming apparatus
TWM396996U (en) * 2010-06-09 2011-01-21 Darfon Electronics Corp Input system combining a mouse and a planar sensing device
JP5418440B2 (en) * 2010-08-13 2014-02-19 カシオ計算機株式会社 Input device and program
JP2012088805A (en) * 2010-10-15 2012-05-10 Sharp Corp Information processor and information processor control method
US10303357B2 (en) 2010-11-19 2019-05-28 TIVO SOLUTIONS lNC. Flick to send or display content
KR101788006B1 (en) * 2011-07-18 2017-10-19 엘지전자 주식회사 Remote Controller and Image Display Device Controllable by Remote Controller
JP6282793B2 (en) 2011-11-08 2018-02-21 サターン ライセンシング エルエルシーSaturn Licensing LLC Transmission device, display control device, content transmission method, recording medium, and program
CN103218143B (en) * 2012-01-18 2016-12-07 阿里巴巴集团控股有限公司 A kind of classification page switching method and mobile device
CN103248931B (en) * 2012-02-14 2016-12-14 联想(北京)有限公司 A kind of remote controller and the method generating control signal
JP2013200863A (en) 2012-02-23 2013-10-03 Panasonic Corp Electronic device
US9547425B2 (en) 2012-05-09 2017-01-17 Apple Inc. Context-specific user interfaces
US10990270B2 (en) 2012-05-09 2021-04-27 Apple Inc. Context-specific user interfaces
US10304347B2 (en) 2012-05-09 2019-05-28 Apple Inc. Exercised-based watch face and complications
US9459781B2 (en) 2012-05-09 2016-10-04 Apple Inc. Context-specific user interfaces for displaying animated sequences
US10613743B2 (en) 2012-05-09 2020-04-07 Apple Inc. User interface for receiving user input
WO2013187872A1 (en) * 2012-06-11 2013-12-19 Intel Corporation Techniques for select-hold-release electronic device navigation menu system
JP6004855B2 (en) * 2012-09-14 2016-10-12 キヤノン株式会社 Display control apparatus and control method thereof
US9582122B2 (en) 2012-11-12 2017-02-28 Microsoft Technology Licensing, Llc Touch-sensitive bezel techniques
US11210076B2 (en) 2013-01-28 2021-12-28 Samsung Electronics Co., Ltd. Downloading and launching an app on a second device from a first device
BR102013003187A2 (en) * 2013-02-08 2014-09-16 Tqtvd Software Ltda INTEGRATED LINEAR AND NONLINEAR MULTIMEDIA CONTENT USER INTERFACE FROM MULTIPLE SOURCES AND METHOD FOR IMPLEMENTATION
US10225611B2 (en) 2013-09-03 2019-03-05 Samsung Electronics Co., Ltd. Point-to-point content navigation using an auxiliary device
US9883231B2 (en) * 2013-09-03 2018-01-30 Samsung Electronics Co., Ltd. Content control using an auxiliary device
US9477337B2 (en) 2014-03-14 2016-10-25 Microsoft Technology Licensing, Llc Conductive trace routing for display and bezel sensors
US9250730B2 (en) * 2014-03-18 2016-02-02 City University Of Hong Kong Target acquisition system for use in touch screen graphical interface
US10873718B2 (en) 2014-04-02 2020-12-22 Interdigital Madison Patent Holdings, Sas Systems and methods for touch screens associated with a display
US9389703B1 (en) * 2014-06-23 2016-07-12 Amazon Technologies, Inc. Virtual screen bezel
CN116301544A (en) 2014-06-27 2023-06-23 苹果公司 Reduced size user interface
US10135905B2 (en) 2014-07-21 2018-11-20 Apple Inc. Remote user interface
US10452253B2 (en) 2014-08-15 2019-10-22 Apple Inc. Weather user interface
US10254948B2 (en) 2014-09-02 2019-04-09 Apple Inc. Reduced-size user interfaces for dynamically updated application overviews
EP3189406B1 (en) 2014-09-02 2022-09-07 Apple Inc. Phone user interface
US10055121B2 (en) 2015-03-07 2018-08-21 Apple Inc. Activity based thresholds and feedbacks
WO2016144385A1 (en) 2015-03-08 2016-09-15 Apple Inc. Sharing user-configurable graphical constructs
WO2016151919A1 (en) * 2015-03-26 2016-09-29 株式会社ミスミグループ本社 Browsing assistance method for electronic book, and browsing assistance program
US9916075B2 (en) 2015-06-05 2018-03-13 Apple Inc. Formatting content for a reduced-size user interface
CN113521710A (en) 2015-08-20 2021-10-22 苹果公司 Motion-based dial and complex function block
JP6677019B2 (en) * 2016-03-02 2020-04-08 富士通株式会社 Information processing apparatus, information processing program, and information processing method
DK201770423A1 (en) 2016-06-11 2018-01-15 Apple Inc Activity and workout updates
JP6832725B2 (en) * 2017-01-31 2021-02-24 シャープ株式会社 Display device, display method and program
DK179412B1 (en) 2017-05-12 2018-06-06 Apple Inc Context-Specific User Interfaces
KR102416838B1 (en) * 2017-10-12 2022-07-05 삼성전자주식회사 Display apparatus and control method of the same
US11327650B2 (en) 2018-05-07 2022-05-10 Apple Inc. User interfaces having a collection of complications
AU2020239670B2 (en) 2019-05-06 2021-07-15 Apple Inc. Restricted operation of an electronic device
US11131967B2 (en) 2019-05-06 2021-09-28 Apple Inc. Clock faces for an electronic device
US10878782B1 (en) 2019-09-09 2020-12-29 Apple Inc. Techniques for managing display usage
US11372659B2 (en) 2020-05-11 2022-06-28 Apple Inc. User interfaces for managing user interface sharing
WO2021231345A1 (en) 2020-05-11 2021-11-18 Apple Inc. User interfaces for managing user interface sharing
DK202070625A1 (en) 2020-05-11 2022-01-04 Apple Inc User interfaces related to time
US11694590B2 (en) 2020-12-21 2023-07-04 Apple Inc. Dynamic user interface with time indicator
US11720239B2 (en) 2021-01-07 2023-08-08 Apple Inc. Techniques for user interfaces related to an event
US11921992B2 (en) 2021-05-14 2024-03-05 Apple Inc. User interfaces related to time

Citations (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5148015A (en) * 1990-12-24 1992-09-15 Pitney Bowes Inc. Touch switch input device for computer system
US5250929A (en) * 1991-07-29 1993-10-05 Conference Communications, Inc. Interactive overlay-driven computer display system
US5434929A (en) * 1994-07-12 1995-07-18 Apple Computer, Inc. Method and apparatus for setting character style preferences in a pen-based computer system
US5524201A (en) * 1993-11-03 1996-06-04 Apple Computer, Inc. Method of preparing an electronic book for a computer system
US5644657A (en) * 1992-05-27 1997-07-01 Apple Computer, Inc. Method for locating and displaying information in a pointer-based computer system
US5726669A (en) * 1988-06-20 1998-03-10 Fujitsu Limited Multi-window communication system
US5745718A (en) * 1995-07-31 1998-04-28 International Business Machines Corporation Folder bar widget
US5748192A (en) * 1991-12-18 1998-05-05 Ampex Corporation Video special effects system with graphical operator interface
US5900875A (en) * 1997-01-29 1999-05-04 3Com Corporation Method and apparatus for interacting with a portable computer system
US5990975A (en) * 1996-11-22 1999-11-23 Acer Peripherals, Inc. Dual screen displaying device
US6084553A (en) * 1996-01-11 2000-07-04 Hewlett Packard Company Design and method for a large, virtual workspace
US6125230A (en) * 1995-09-05 2000-09-26 Sony Corporation Magnetic tape recording and reproducing apparatus for video signal
US6208340B1 (en) * 1998-05-26 2001-03-27 International Business Machines Corporation Graphical user interface including a drop-down widget that permits a plurality of choices to be selected in response to a single selection of the drop-down widget
US6308199B1 (en) * 1997-08-11 2001-10-23 Fuji Xerox Co., Ltd. Cooperative work support system for managing a window display
US6331840B1 (en) * 1998-03-27 2001-12-18 Kevin W. Nielson Object-drag continuity between discontinuous touch screens of a single virtual desktop
US6344836B1 (en) * 1997-11-27 2002-02-05 Hitachi, Ltd. Multimedia information system
US20020048413A1 (en) * 2000-08-23 2002-04-25 Fuji Photo Film Co., Ltd. Imaging system
US20020060750A1 (en) * 2000-03-29 2002-05-23 Istvan Anthony F. Single-button remote access to a synthetic channel page of specialized content
US20020073204A1 (en) * 2000-12-07 2002-06-13 Rabindranath Dutta Method and system for exchange of node characteristics for DATA sharing in peer-to-peer DATA networks
US20020109727A1 (en) * 1999-10-28 2002-08-15 Venture Union Inc. Recording medium reproducing apparatus
US20020109665A1 (en) * 2001-02-15 2002-08-15 Matthews Joseph H. Methods and systems for a portable, interactive display device for use with a computer
US6545669B1 (en) * 1999-03-26 2003-04-08 Husam Kinawi Object-drag continuity between discontinuous touch-screens
US20030142037A1 (en) * 2002-01-25 2003-07-31 David Pinedo System and method for managing context data in a single logical screen graphics environment
US20030229845A1 (en) * 2002-05-30 2003-12-11 David Salesin System and method for adaptive document layout via manifold content
US6670950B1 (en) * 1999-10-19 2003-12-30 Samsung Electronics Co., Ltd. Portable computer and method using an auxilliary LCD panel having a touch screen as a pointing device
US20040165013A1 (en) * 2003-02-20 2004-08-26 International Business Machines Corp. Cascading menu with automatic cursor relocation
US6919864B1 (en) * 2000-07-27 2005-07-19 Avaya Technology Corp. Display monitor
US7373605B2 (en) * 2003-06-13 2008-05-13 Sap Aktiengesellschaft Presentation system for displaying data

Family Cites Families (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH082748Y2 (en) * 1989-11-28 1996-01-29 横河電機株式会社 Touch screen device
US5398045A (en) * 1990-08-24 1995-03-14 Hughes Aircraft Company Touch control panel
US5119079A (en) * 1990-09-17 1992-06-02 Xerox Corporation Touch screen user interface with expanding touch locations for a reprographic machine
EP0520655A3 (en) * 1991-06-28 1993-11-18 Ncr Int Inc Item selection method and apparatus
JP3547015B2 (en) * 1993-01-07 2004-07-28 ソニー株式会社 Image display device and method for improving resolution of image display device
US5581670A (en) * 1993-07-21 1996-12-03 Xerox Corporation User interface having movable sheet with click-through tools
JP2813728B2 (en) * 1993-11-01 1998-10-22 インターナショナル・ビジネス・マシーンズ・コーポレイション Personal communication device with zoom / pan function
JPH082748A (en) 1994-06-21 1996-01-09 Kawashima Seisakusho:Kk Conveyer for layer sheaf
JPH0877493A (en) * 1994-09-08 1996-03-22 Fujitsu Ten Ltd Display device
US5682486A (en) * 1995-03-14 1997-10-28 International Business Machines Corporation Video display and control of multiple graphical interfaces
US5721853A (en) * 1995-04-28 1998-02-24 Ast Research, Inc. Spot graphic display element with open locking and periodic animation
EP1291812A3 (en) 1996-02-09 2004-06-23 Seiko Instruments Inc. Display unit, manufacturing method of the same and electronic device
JP3980679B2 (en) * 1996-02-16 2007-09-26 角田 達雄 Character / character string input processing device
KR100258931B1 (en) * 1997-06-17 2000-06-15 윤종용 A received signal discriminating circuit and method therefor
US5912667A (en) 1997-09-10 1999-06-15 Primax Electronics Ltd. Cursor control system for controlling a pop-up menu
JP3684832B2 (en) * 1998-03-31 2005-08-17 セイコーエプソン株式会社 Microcomputer, electronic equipment and debugging system
US6496122B2 (en) * 1998-06-26 2002-12-17 Sharp Laboratories Of America, Inc. Image display and remote control system capable of displaying two distinct images
US6337698B1 (en) * 1998-11-20 2002-01-08 Microsoft Corporation Pen-based interface for a notepad computer
US6396523B1 (en) * 1999-07-29 2002-05-28 Interlink Electronics, Inc. Home entertainment device remote control
JP2001203908A (en) 2000-01-19 2001-07-27 Sony Corp Information terminal, reception device, and information transmitting and receiving method
SE0000585D0 (en) 2000-02-23 2000-02-23 Ericsson Telefon Ab L M Tuning method and system
US6683633B2 (en) * 2000-03-20 2004-01-27 Incontext Enterprises, Inc. Method and system for accessing information
JP2002157086A (en) * 2000-11-17 2002-05-31 Seiko Epson Corp Display with input function, electronic equipment provided with the same, and method of manufacturing display with input function
US7240287B2 (en) * 2001-02-24 2007-07-03 Microsoft Corp. System and method for viewing and controlling a presentation
JP4127982B2 (en) * 2001-05-28 2008-07-30 富士フイルム株式会社 Portable electronic devices
US7451457B2 (en) * 2002-04-15 2008-11-11 Microsoft Corporation Facilitating interaction between video renderers and graphics device drivers
US20040001099A1 (en) * 2002-06-27 2004-01-01 Microsoft Corporation Method and system for associating actions with semantic labels in electronic documents
US6999045B2 (en) * 2002-07-10 2006-02-14 Eastman Kodak Company Electronic system for tiled displays
US7027035B2 (en) * 2002-10-07 2006-04-11 Hewlett-Packard Development Company, L.P. Image copy to a second display
JP2004213451A (en) 2003-01-07 2004-07-29 Matsushita Electric Ind Co Ltd Information processor and frame
US7898529B2 (en) * 2003-01-08 2011-03-01 Autodesk, Inc. User interface having a placement and layout suitable for pen-based computers
US7158123B2 (en) * 2003-01-31 2007-01-02 Xerox Corporation Secondary touch contextual sub-menu navigation for touch screen interface
JP4847801B2 (en) 2006-06-21 2011-12-28 株式会社クボタ Combine cabin door structure


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9262074B2 (en) 2002-11-04 2016-02-16 Neonode, Inc. Finger gesture user interface
US20110025622A1 (en) * 2009-08-03 2011-02-03 Nitto Denko Corporation Touch panel and display device with touch panel
US20110074708A1 (en) * 2009-09-28 2011-03-31 Brother Kogyo Kabushiki Kaisha Input device with display panel
US8922495B2 (en) 2009-09-28 2014-12-30 Brother Kogyo Kabushiki Kaisha Input device with display panel
WO2013081819A1 (en) 2011-11-30 2013-06-06 Neonode Inc. Light-based finger gesture user interface
EP2666075B1 (en) * 2011-11-30 2016-02-24 Neonode Inc. Light-based finger gesture user interface
US9591346B2 (en) 2012-05-14 2017-03-07 Samsung Electronics Co., Ltd. Content delivery system with content sharing mechanism and method of operation thereof
US10061504B2 (en) 2013-01-17 2018-08-28 Toyota Jidosha Kabushiki Kaisha Operation apparatus

Also Published As

Publication number Publication date
US8743070B2 (en) 2014-06-03
KR20110025866A (en) 2011-03-11
EP2613249A1 (en) 2013-07-10
US9001069B2 (en) 2015-04-07
JP4161814B2 (en) 2008-10-08
KR20060017736A (en) 2006-02-27
EP2613249B1 (en) 2016-11-16
EP2405350A3 (en) 2012-09-26
US20120038578A1 (en) 2012-02-16
CN1701298A (en) 2005-11-23
JP2005004690A (en) 2005-01-06
EP1548559B1 (en) 2011-11-16
EP1548559A4 (en) 2011-01-05
US20130021282A1 (en) 2013-01-24
US7948476B2 (en) 2011-05-24
TWI273471B (en) 2007-02-11
EP1548559A1 (en) 2005-06-29
EP2405350A2 (en) 2012-01-11
WO2004111827A1 (en) 2004-12-23
US20140289677A1 (en) 2014-09-25
US20050200611A1 (en) 2005-09-15
TW200508952A (en) 2005-03-01
KR101133824B1 (en) 2012-04-06

Similar Documents

Publication Publication Date Title
US9001069B2 (en) Technique for executing a combined operation in response to a single press and release of a sensor unit
KR101032802B1 (en) Receiving apparatus, receiving method, and transmitting/receiving apparatus
US20220007070A1 (en) Remote controller, control method thereof and image processing apparatus having the same
JP2004040656A (en) Image display system, image display method, and display device
JP2001157284A (en) Remote controller, control object device and control system
JP5278379B2 (en) Information processing device
US20080074557A1 (en) Method for switching a channel of an image display device and apparatus therefor
US8836864B2 (en) Display apparatus and control method thereof for displaying a picture-in-picture screen
JP5184491B2 (en) Television system
EP1336294A2 (en) System and method for selecting an area of a video display screen with a remote control unit
JP3873942B2 (en) Display stand and receiver
KR100771611B1 (en) Wireless television system and control method thereof
JP2008160240A (en) Image display apparatus
KR20060136035A (en) Wireless television system and control method thereof

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION