US20160170703A1 - System and method for linking and controlling terminals - Google Patents
- Publication number
- US20160170703A1 (application US 15/052,803)
- Authority
- US
- United States
- Prior art keywords
- touch
- user terminal
- image
- receiving terminal
- touch means
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06F3/1423—Digital output to display device; cooperation and interconnection of the display device with other functional units, controlling a plurality of local displays, e.g. CRT and flat panel display
- G06F3/038—Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
- G06F3/0412—Digitisers structurally integrated in a display
- G06F3/0414—Digitisers characterised by the transducing means, using force sensing means to determine a position
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/16—Sound input; sound output
- G06F9/452—Remote windowing, e.g. X-Window System, desktop virtualisation
- H04N21/41265—The peripheral being portable, e.g. PDAs or mobile phones, having a remote control device for bidirectional communication between the remote control device and client device
- H04N21/42204—User interfaces specially adapted for controlling a client device through a remote control device; remote control devices therefor
- H04N21/42209—Display device provided on the remote control for displaying non-command information, e.g. electronic program guide [EPG], e-mail, messages or a second television channel
- H04N21/4222—Remote control device emulator integrated into a non-television apparatus, e.g. a PDA, media center or smart toy
- H04N21/42224—Touch pad or touch panel provided on the remote control
- H04N5/4403—
- G06F2203/0384—Wireless input, i.e. hardware and software details of wireless interface arrangements for pointing devices
- H04N2005/4425—
Definitions
- the present invention relates to a system and method for linking and controlling terminals.
- Smart TVs have become very popular in recent times, and various methods have been proposed to increase the convenience of performing operations on a smart TV; one such method is disclosed, for example, in Korean Patent Publication No. 2011-0078656.
- Embodiments include a system and method with which to efficiently control a receiving terminal, such as a smart TV, etc., by using a user terminal, such as a smart phone, etc.
- An embodiment includes a user terminal that includes: a decoding unit configured to decode image data received from a receiving terminal; a touch display unit configured to display the decoded image; and an information transmitting unit configured to transmit selection information for an event-executing entity included in the decoded image to the receiving terminal.
- Another embodiment includes a receiving terminal that includes: an information transmitting unit configured to transmit an entirety of or a portion of an image to a user terminal; and an information receiving unit configured to receive selection information for an event-executing entity included in the image from the user terminal.
- a terminal linkage method includes: sharing an image with a receiving terminal; receiving input in the form of selection information for an event-executing entity included in the image; and transmitting the selection information to the receiving terminal.
- a terminal linkage method includes: transmitting an entirety of or a portion of an image to a user terminal; and receiving selection information for an event-executing entity included in the image from the user terminal.
- a program of instructions that can be executed to link a user terminal and a receiving terminal can be tangibly embodied in a recorded medium readable by a digital processing device, where the program of instructions is for a method that includes: sharing an image with the receiving terminal; receiving input in a form of selection information for an event-executing entity included in the image; and transmitting the selection information to the receiving terminal.
- a program of instructions that can be executed to link a user terminal and a receiving terminal can be tangibly embodied in a recorded medium readable by a digital processing device, where the program of instructions is for a method that includes: transmitting an entirety of or a portion of an image to the user terminal; and receiving selection information for an event-executing entity included in the image from the user terminal.
- the receiving terminal can provide the user terminal with an entirety of or a portion of a displayed image, the user can select an event-executing entity included in the image via the user terminal, and the user terminal can transmit the selection information of the event-executing entity to the receiving terminal.
- the user can efficiently control the operations of the receiving terminal by using the user terminal.
- the user can control the receiving terminal while looking only at the user terminal, without looking at the receiving terminal.
- controlling the receiving terminal can be freed from the limitation of place.
- a position image can be shown on the user terminal, indicating the position of a touch means that is nearby or is touching with a light pressure or area, so that the user can conveniently control the operation of the user terminal.
- the position image can also be shown on the receiving terminal, in which case the user can easily control the operation of the receiving terminal while viewing only the receiving terminal without looking at the user terminal.
- FIG. 1A , FIG. 1B , FIG. 1C , and FIG. 1D schematically illustrate a system for linking and controlling terminals according to an embodiment.
- FIG. 2A and FIG. 2B illustrate a method of linking and controlling terminals according to a first embodiment.
- FIG. 3 illustrates a method of linking and controlling terminals according to a second embodiment.
- FIG. 4A and FIG. 4B illustrate a method of linking and controlling terminals according to a third embodiment.
- FIG. 5 illustrates a method of linking and controlling terminals according to a fourth embodiment.
- FIG. 6 illustrates a method of linking and controlling terminals according to a fifth embodiment.
- FIG. 7A , FIG. 7B , and FIG. 7C illustrate a control method used for the method of linking and controlling terminals in FIG. 6 .
- FIG. 8A , FIG. 8B , FIG. 8C , FIG. 8D , and FIG. 8E illustrate a linking operation when different sensing levels are set in accordance with an embodiment.
- FIG. 9 is a flowchart illustrating a method of linking and controlling terminals according to a sixth embodiment.
- FIG. 10 is a flowchart illustrating a method of linking and controlling terminals according to a seventh embodiment.
- FIG. 11 illustrates a system for linking and controlling terminals according to another embodiment.
- FIG. 12 is a flowchart illustrating a method of linking and controlling terminals according to an eighth embodiment.
- FIG. 13 illustrates a method of sensing a touch means according to a first embodiment.
- FIG. 14 illustrates a method of sensing a touch means according to a second embodiment.
- FIG. 15A and FIG. 15B illustrate a method of sensing a touch means according to a third embodiment.
- FIG. 16A and FIG. 16B illustrate a method of sensing a touch means according to a fourth embodiment.
- FIG. 17 is a block diagram illustrating the structure of a user terminal according to an embodiment.
- FIG. 18 is a block diagram illustrating the structure of a user terminal according to another embodiment.
- FIG. 19 is a block diagram illustrating the structure of a receiving terminal according to an embodiment.
- a system for linking and controlling terminals relates to linking and controlling terminals, especially by sharing an image and controlling the operation of a terminal based on the shared image.
- a system for linking and controlling terminals can link a smaller terminal (e.g. smart phone, tablet PC, etc.), which is controlled by a touch method, with a larger terminal (e.g. TV, etc.) and enable various methods for controlling the larger terminal with the smaller terminal. That is, a system for linking and controlling terminals according to an embodiment can control a larger terminal with a smaller terminal utilized as a remote control.
- the smaller terminal controlled directly by the user will be referred to as the user terminal or a mobile terminal
- the larger terminal that receives the position information of the touch means from the user terminal will be referred to as the receiving terminal or a display device.
- while the user terminal may preferably have a smaller size compared to the receiving terminal, the sizes of the terminals are not thus limited.
- the terminals used in a system according to an embodiment may be provided with a function for displaying images and a function for wired/wireless communication, but the communication function is not essential to the invention. Considering real-life applications, however, it may be preferable if each of the terminals is equipped with an image display function and a communication function.
- the terminal is not limited to a particular type of device, as long as it is capable of displaying images and exchanging signals with another terminal, and various devices, such as a smart phone, smart TV, remote control, PC, tablet PC, laptop, touch pad, game console, cloud PC, etc., can be used as the terminal in an embodiment. However, it may be preferable if the smaller terminal is equipped with a touch function.
- first image: the image shared by the terminals
- second image: the position image
- FIGS. 1A through 1D schematically illustrate a system for linking and controlling terminals according to an embodiment.
- a system for linking and controlling terminals can include a user terminal 100 and a receiving terminal 102 , where the terminals 100 and 102 can be computing apparatuses.
- the user terminal 100 may be a terminal that can be directly controlled by the user and can be, for example, a smart phone, remote control, etc., that is capable of sharing an image with another device. Also, the user terminal 100 can be a terminal having a relatively small size and having a touch function, and for example can be a mobile terminal.
- the receiving terminal 102 may be a terminal that is not directly manipulated by the user but is linked with the user terminal 100 , and can be a display-enabled terminal. That is, the receiving terminal 102 may be any device capable of displaying an image, and from the perspective of displaying an image, can also be referred to as a display device.
- the receiving terminal 102 can be a device used for a different purpose from that of the user terminal 100 , and for example can be a TV for showing broadcast programs such as a drama series.
- the receiving terminal 102 may be a terminal having a relatively larger size, although it may not necessarily have a touch function.
- the overall size of the receiving terminal 102 can be larger than the overall size of the user terminal 100 , but it may be sufficient if the size of the display unit on the receiving terminal 102 is larger than the size of the display unit on the user terminal 100 . In the latter case, the overall sizes of the user terminal 100 and receiving terminal 102 need not be considered.
- the terminals 100 and 102 can be connected directly in a wired or wireless manner or indirectly using another device as a medium.
- the user may use the user terminal 100 to control the operation of the receiving terminal 102 , more specifically the operation of a program displayed on the receiving terminal 102 .
- the user terminal 100 can be located at a distance that allows near-field communication with the receiving terminal 102 , and can be located for example at a distance from which the user can view the receiving terminal 102 .
- the communication between the terminals 100 and 102 is not limited to near-field communication; for example, the user can control the operation of a receiving terminal 102 inside the home by using a user terminal 100 from outside the home. This is possible because the user terminal 100 shares at least a portion of the first image displayed on the receiving terminal 102 . Consequently, the control of the receiving terminal can be freed from the limitation of place.
- the receiving terminal 102 can transmit to the user terminal 100 the image data corresponding to at least a portion of the first image 110 displayed on the receiving terminal 102 .
- the user terminal 100 may display a first image 110 corresponding to the transmitted image data. That is, the user terminal 100 and the receiving terminal 102 can share the first image 110 , as illustrated in FIG. 1B .
- while the first image 110 displayed on the user terminal 100 may be substantially the same as the first image 110 displayed on the receiving terminal 102, the contrast, display proportion, etc., may differ according to the properties of the user terminal 100.
- the user terminal 100 can display the first image 110 as is, without particularly processing the transmitted image data, or the user terminal 100 can convert the resolution, etc., of the transmitted image data and then display the first image 110 corresponding to the converted image data.
- a separate device connected with the user terminal 100 can convert the image data transmitted from the receiving terminal 102 to a format suitable for the user terminal 100 and then transmit the converted image data to the user terminal 100 .
- the method of processing the image data at the user terminal 100 can be modified in various ways.
- the user terminal 100 can transmit the image data corresponding to the shared first image 110 to the receiving terminal 102 and thus share the first image 110 .
- the receiving terminal 102 can transmit image data corresponding to the changed first image 110 to the user terminal 100 . Consequently, the user terminal 100 and the receiving terminal 102 can continuously share the first image 110 .
- the user terminal 100 can transmit the information on the selection of the event-executing entity (the selection information) to the receiving terminal 102 .
- the receiving terminal 102 can execute a corresponding operation in accordance with the transmitted information, and as a result, the first image 110 can be changed.
- image data corresponding to the changed first image 110 may be transmitted to the user terminal 100 . That is, the user can use the user terminal 100 to control the operation of the receiving terminal 102 , particularly the operation of a program executed by the receiving terminal 102 .
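The round trip described above (selection information transmitted, corresponding operation executed, changed first image shared back) can be sketched as follows. The terminal objects and their method names are illustrative assumptions, not interfaces defined in the patent.

```python
def control_round(user_terminal, receiving_terminal, selection):
    """One round of the linkage loop: the receiving terminal executes the
    operation for the transmitted selection information, and the changed
    first image is sent back so both terminals keep sharing it."""
    receiving_terminal.execute(selection)        # may change the first image
    frame = receiving_terminal.current_image()   # image data after the change
    user_terminal.display(frame)                 # the first image stays shared
    return frame
```

In a real system the `execute` step would run inside the receiving terminal's program and the frame would travel over the communication channel between the terminals.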
- the selection information can include position information corresponding to a touch input for selecting an event-executing entity.
- the selection information can include control information for executing an event-executing entity selected by the touch input.
- the user terminal 100 can generate the position information or control information in accordance with the touch input.
- the receiving terminal 102 can execute an operation corresponding to the event-executing entity by using the position information corresponding to the touch input. Alternatively, the receiving terminal 102 can execute the operation corresponding to the event-executing entity by using the control information.
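The two forms of selection information (position information resolved by the receiving terminal, or control information already resolved by the user terminal) can be sketched as follows. The class, field, and function names are illustrative, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class SelectionInfo:
    """Selection information sent from the user terminal (illustrative names)."""
    position: Optional[Tuple[int, int]] = None  # touch coordinates, if position info is sent
    command: Optional[str] = None               # control info, if the entity is already resolved

def handle_selection(info: SelectionInfo, hit_test, execute):
    """Receiving-terminal side: resolve and run the selected event-executing entity."""
    if info.command is not None:
        # Control information: the user terminal already identified the entity.
        execute(info.command)
    elif info.position is not None:
        # Position information: look up which entity lies at the touch coordinates.
        entity = hit_test(info.position)
        if entity is not None:
            execute(entity)
```

`hit_test` stands in for whatever lookup the receiving terminal's program uses to map coordinates to an event-executing entity.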
- the selection of the event-executing entity can be determined according to the pressure or area by which the touch means touches the user terminal 100; as will be described later in more detail, the event-executing entity can be selected when the user touches the user terminal 100 with a pressure or area exceeding a preset pressure or area.
- the user terminal 100 can display a third image (a position image) 114 indicating the position information of a touch means that is in a touch-sensitive region of the user terminal 100, for example a touch means that is near the display unit or is touching the display unit with a pressure or area smaller than or equal to the preset pressure or area, as illustrated in FIG. 1C. That is, the position of a touch means can be shown on the user terminal 100, so that the user may control the user terminal 100 conveniently.
- the touch means is not limited to a particular form and can be a finger, a touch pen, etc.
- being near may refer to the touch means being positioned within a preset distance from the user terminal 100, including the case of the touch means contacting the user terminal 100.
- a second image 112 such as a pointer, a shadow image, etc., corresponding to the position information of the touch means can be displayed on the receiving terminal 102 together with the first image 110 , as illustrated in FIGS. 1B through 1D .
- the second image 112 can be substantially the same as the third image 114 or can have a different shape or color.
- the position of the touch means can be specified on the user terminal 100 and the receiving terminal 102 .
- the user can control the operation of the receiving terminal 102 while viewing only the receiving terminal 102 and without looking at the user terminal 100 .
- the second image 112 or the third image 114 corresponding to the position of the touch means can be changed according to the touch pressure or the touch area of the touch means contacting the user terminal 100 .
- when the touch means touches the user terminal 100 with a pressure or an area smaller than or equal to a preset value, a third image 114 corresponding to the position of the touch means may be shown on the user terminal 100
- when the touch means touches the user terminal 100 with a pressure or an area exceeding the preset value, the third image 114 corresponding to the position of the touch means may not be shown on the user terminal 100 or may be changed to a different shape.
- This operation can also apply in a similar manner to the second image 112 displayed on the receiving terminal 102 .
- the shape of the second image 112 or third image 114 can be varied according to the distance between the touch means and the user terminal 100 .
- the image 112 or 114 can be shaped as a shadow if the distance between the touch means and the user terminal 100 is within a first distance and can be shaped as a finger if the distance between the touch means and the user terminal 100 is within a second distance smaller than the first distance.
- At least one of the position of a touch means near to the user terminal 100 , the position of a touch means touching the user terminal 100 with a touch pressure smaller than or equal to a preset value, and the position of a touch means touching the user terminal 100 with a touch area smaller than or equal to a preset value can be indicated by a second image 112 or a third image 114 .
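The decision of whether a position image is shown, and in which shape, can be sketched as a function of proximity distance, touch pressure, and touch area. Every threshold value and shape name below is an illustrative assumption, not a value from the patent.

```python
def position_image_shape(distance, pressure, area,
                         near_distance=30.0, close_distance=10.0,
                         max_pressure=0.5, max_area=40.0):
    """Decide which position image (if any) to show for the touch means.

    Returns a shape name, or None when no position image applies: either
    the touch means is outside the touch-sensitive region, or the pressure
    or area exceeds the preset value, which counts as a selection touch.
    """
    if pressure > max_pressure or area > max_area:
        return None                      # selection touch: position image hidden
    if distance <= close_distance:
        return "finger"                  # within the second, smaller distance
    if distance <= near_distance:
        return "shadow"                  # within the first distance
    return None                          # not near the user terminal
```

The same decision can drive both the third image 114 on the user terminal and the second image 112 transmitted for display on the receiving terminal.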
- the user can control the operation of the receiving terminal 102 conveniently by using the user terminal 100 .
- a second image 112 indicating the position information of the touch means is shown on the receiving terminal 102
- the user can use the user terminal 100 to control the receiving terminal 102 while viewing only the receiving terminal 102 , without having to look at the user terminal 100 .
- the user can control the receiving terminal 102 by using a smart phone, for example, carried by the user.
- since the user can control the user terminal 100 by using a touch means, not only can the user select an event-executing entity to execute a particular operation, but the user can also input particular characters, etc., and even work on documents on the user terminal 100. Such operations on the user terminal 100 can be reflected immediately on the receiving terminal 102.
- the touch means can be utilized as a mouse for a computer, etc., to perform various operations
- the user can control the receiving terminal 102 with greater convenience.
- the user can scroll, copy, search, etc., articles on a portal site remotely on the user terminal 100, which may be reflected on the receiving terminal 102.
- the second image 112 indicating the position information of the touch means may be shown on the receiving terminal 102 , allowing the user to perform a desired operation by looking at the second image 112 displayed on the larger-sized receiving terminal 102 .
- embodiments can involve showing a second image 112 on the receiving terminal 102 to allow the user to control the receiving terminal 102 conveniently, using the user terminal 100 to implement various functions performed by a remote control, a mouse, a touch pen, etc.
- a receiving terminal 102 such as a smart TV, which can perform various functions, can be controlled in a convenient manner by using a user terminal 100 typically carried by the user.
- the user terminal 100 can just as well indicate the position of the touch means in the first image 110 and transmit the first image 110 , in which the position of the touch means is indicated, to the receiving terminal 102 .
- the user terminal 100 can superimpose the position image for the touch means over the first image 110 and transmit the image data, with the position image of the touch means superimposed, to the receiving terminal 102 .
- the user terminal 100 can modify a region corresponding to the position of the touch means in the first image 110 to a position image for the touch means. That is, the user terminal 100 can modify the first image 110 itself and create a new image to indicate the position of the touch means, and can then transmit the created image to the receiving terminal 102 .
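The superimposing variant can be sketched on a toy frame, with a list of strings standing in for real image data; the function name and marker character are illustrative, and real blending of a pointer or shadow image is abstracted away.

```python
def superimpose_marker(frame, pos, marker="+"):
    """Return a copy of the frame with a position marker drawn over it.

    `frame` is a list of equal-length strings standing in for image rows;
    `pos` is the (x, y) position of the touch means on that frame.
    """
    x, y = pos
    out = [list(row) for row in frame]
    if 0 <= y < len(out) and 0 <= x < len(out[y]):
        out[y][x] = marker               # overwrite the pixel at the touch position
    return ["".join(row) for row in out]
```

The user terminal would transmit the composited result to the receiving terminal, which then displays the first image with the position image already embedded.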
- the user can select a menu item or input a button manipulation, etc., on the user terminal 100 or on the receiving terminal 102 to stop showing the second image 112 or the third image 114 .
- the user terminal 100 can continuously transmit the position information of the touch means to the receiving terminal 102 and the receiving terminal 102 can store the transmitted position information, so that the receiving terminal 102 may display the second image 112 corresponding to the position information upon the user's request.
- the linkage and control system can include a user terminal, a receiving terminal, and a linkage server.
- the linkage server can transmit image data corresponding to the first image to the user terminal and the receiving terminal, and the user terminal can transmit the position information of the touch means to the receiving terminal, so that the second image indicating the position information of the touch means may be shown together with the first image at the receiving terminal.
- the linkage server can pre-store the resolution information, etc., of the user terminal and the receiving terminal to process the image data based on the stored resolution information, etc., before transmitting it to the user terminal and the receiving terminal.
- Terminals such as smart phones, TVs, etc., can be linked through wired communication standards such as High-Definition Multimedia Interface (HDMI), Mobile High-definition Link (MHL), DisplayPort, etc., or through wireless communication standards such as Digital Living Network Alliance (DLNA), Wi-Fi, etc.
- HDMI uses Consumer Electronics Control (CEC), Display Data Channel (DDC), Utility, and SCL/SDA as control channels; MHL uses CBUS as a control channel; DisplayPort uses the auxiliary channel as a control channel.
- it may not be necessary to establish separate channels for linking the terminals 100 and 102 ; the channels already available on the terminals 100 and 102 can be utilized as the channel by which to transmit the position information of the touch means according to an embodiment.
- the terminals 100 and 102 can exchange data in various forms according to the communication method used for the linkage system.
- the position information of the touch means can be the position information on the first image displayed on the user terminal 100 . Consequently, the position of the touch means on the first image of the user terminal 100 can be reflected in the first image 110 of the receiving terminal 102 as the second image 112 .
- the position information of the touch means can be coordinate information that is in accordance with the resolution or screen size of the display unit of the user terminal 100 . That is, the position information of the touch means can be the coordinate information of the touch means with respect to the display unit of the user terminal 100 , rather than the coordinate information of the touch means on the image displayed on the screen.
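When the coordinate information is given with respect to the user terminal's display rather than the shared image, one side would need to rescale it to the other display. A minimal sketch, assuming simple proportional mapping between hypothetical resolutions:

```python
def scale_position(pos, src_res, dst_res):
    """Map a touch coordinate from the user terminal's screen space into
    the receiving terminal's screen space (hypothetical helper)."""
    x, y = pos
    sw, sh = src_res  # source (user terminal) width and height
    dw, dh = dst_res  # destination (receiving terminal) width and height
    return (x * dw / sw, y * dh / sh)
```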
- when the user terminal 100 is to show the image data transmitted from the receiving terminal 102 in order that the user terminal 100 and the receiving terminal 102 may share the first image 110 , the user terminal 100 can display the first image 110 corresponding to the image data after setting the screen to the same resolution as the receiving terminal 102 .
- the user terminal 100 can accurately represent the position of the touch means by displaying the second image 112 with the position information, e.g. the coordinate information, of the touch means unmodified.
- the present invention can employ various methods for specifying the position of the touch means at the receiving terminal 102 .
- the user terminal 100 can transmit the position information of the touch means to the receiving terminal 102 , or generate second image data corresponding to the second image representing the position information and send this second image data to the receiving terminal 102 , or transmit the first image to the receiving terminal 102 with the region corresponding to the position of the touch means modified.
- FIG. 2A and FIG. 2B illustrate a method of linking and controlling terminals according to a first embodiment.
- the receiving terminal 102 can transmit image data corresponding to a first image 110 that includes an event-executing entity 200 , such as a UI, icon, application program, link, etc., for example, to the user terminal 100 . Consequently, the first image 110 displayed on the user terminal 100 can include the event-executing entity 200 .
- the user terminal 100 can transmit selection information, which notifies that the event-executing entity 200 was selected, to the receiving terminal 102 .
- the selection information can include position information corresponding to a touch input for selecting the event-executing entity 200 or, depending on the touch input, control information for executing the event-executing entity 200 .
- the receiving terminal 102 can recognize that the event-executing entity 200 was selected, based on the information corresponding to the touch input, and can execute an operation related to the selection of the event-executing entity 200 . Alternatively, the receiving terminal 102 can execute the related operation by using the control information for executing the event-executing entity 200 . Thus, the user can control the operation of the receiving terminal 102 by using the user terminal 100 .
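A hedged sketch of how the receiving terminal 102 might act on the selection information: either an explicit control command is carried, or a touch position is hit-tested against the known bounds of the event-executing entities. The message layout and entity table below are invented purely for illustration:

```python
def handle_selection(selection, entities):
    """Receiving-terminal dispatch sketch: 'selection' may carry either an
    explicit control command or a raw touch position (invented layout)."""
    if "control" in selection:
        return selection["control"]  # execute the named event directly
    x, y = selection["position"]
    for name, (left, top, right, bottom) in entities.items():
        if left <= x <= right and top <= y <= bottom:
            return name  # the touch falls on this event-executing entity
    return None  # no entity at the touched position
```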
- a second image 112 can be shown on the receiving terminal 102 in a position corresponding to the position of the touch means.
- a third image 114 indicating the position of the touch means can be shown on the user terminal 100 as well. If the user touches the event-executing entity 200 with the touch means with a pressure greater than the preset pressure or an area greater than the preset area, then the receiving terminal 102 can perform an operation corresponding to the event-executing entity 200 , for example by activating a game, etc. In this case, the first image 110 displayed on the receiving terminal 102 can be changed.
- This method of linking terminals can be very useful not only for games, navigation, etc., but also for controlling a smart TV by using a user terminal 100 , as illustrated in FIG. 2B .
- FIG. 3 illustrates a method of linking and controlling terminals according to a second embodiment.
- it is also possible to transmit to the user terminal 100 only the image 300 corresponding to a UI for control, taken from the first image 110 displayed on the receiving terminal 102 , so that the user terminal 100 can display a corresponding image 302 . That is, the user terminal 100 and the receiving terminal 102 can share just a portion of the image, especially a portion including an event-executing entity.
- a second image 112 indicating the position of the touch means may be displayed on the receiving terminal 102 .
- a third image 114 indicating the position of the touch means can also be displayed on the user terminal 100 , where the third image 114 can be substantially the same as the second image 112 .
- when the user touches an event-executing entity in the image 302 displayed on the user terminal 100 with a touch means, the user terminal 100 can transmit the selection information of the event-executing entity to the receiving terminal 102 , and the receiving terminal 102 can execute the operation corresponding to the selection information.
- the first image 110 on the receiving terminal 102 may be changed.
- the user terminal 100 may maintain the image 302 , whereas if the image 300 of the event-executing entity at the receiving terminal 102 is changed, an altered image 302 of the event-executing entity may be shown at the user terminal 100 .
- the second image 112 or third image 114 may be shown.
- event occurrence information can be outputted, for example in the form of a sound, etc. This will be described later in more detail with reference to FIG. 18 .
- embodiments can involve using the user terminal 100 as a means for controlling the receiving terminal 102 . This can be particularly efficient for games, electronic commerce, smart TV, etc.
- FIG. 4A and FIG. 4B illustrate a method of linking and controlling terminals according to a third embodiment.
- the second image 112 a indicating the position of the touch means can be changed to a different shape. Also, the second image 112 a can be changed if the touch means is present at a preset position while it is near the user terminal 100 or is touching the user terminal 100 with a pressure or an area smaller than or equal to a preset pressure or area level.
- the second image 112 a can be represented as a shadow image as illustrated in FIG. 4A , but if the touch means is positioned over an event-executing entity, it can be represented by a finger image as illustrated in FIG. 4B .
- the second image 112 can be changed when the touch means is positioned not only over an icon, but also over an Internet address input window, the bottom of the screen, a search window, a folder, and the like.
- the receiving terminal 102 can output event occurrence information, such as in the form of sound, light, vibration, etc., according to the event associated with the event-executing entity over which the touch means is positioned. This will be described later in further detail. Changing the second image 112 and outputting the event occurrence information can be performed simultaneously. The method described above can also apply in a similar manner to the third image 114 .
- the image 112 or 114 can be changed according to the number of times or the touch duration of the touch means touching the user terminal 100 .
- the image 112 or 114 can be a shadow image when a touch means touches the user terminal 100 once, but can be changed to an arrow image if the touch means makes a touch twice in a row.
- the image 112 or 114 can be a shadow image if the touch means touches the user terminal 100 for a duration shorter than or equal to a preset value, and can be changed to a different image if the preset duration is exceeded.
- the image 112 or 114 shown when a touch means is near the user terminal 100 can be different from the image 112 or 114 shown when the touch means is touching the user terminal 100 .
- While the second image 112 and the third image 114 can be changed simultaneously, it is also possible to change just one of them or change them into images that are different from each other.
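The image-changing rules above can be summarized in a small sketch; the image names and the duration threshold are illustrative stand-ins, not values prescribed by the disclosure:

```python
PRESET_DURATION = 0.5  # seconds; arbitrary stand-in for the preset value


def choose_pointer_image(over_entity, tap_count=1, duration=0.0):
    """Pick the position image (second/third image) from the touch state,
    following the shadow/finger/arrow examples in the text."""
    if over_entity:
        return "finger"  # hovering over an event-executing entity
    if tap_count >= 2:
        return "arrow"  # two touches in a row change the image
    if duration > PRESET_DURATION:
        return "alternate"  # a long touch switches to a different image
    return "shadow"  # default when near or touching lightly
```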
- FIG. 5 illustrates a method of linking and controlling terminals according to a fourth embodiment.
- a user terminal 100 and a receiving terminal 102 may be connected to begin linked operation (S 500 ).
- if the user terminal 100 or the receiving terminal 102 requests linkage from the counterpart terminal after the user terminal 100 and the receiving terminal 102 are connected by the user for linked operation, or if the user terminal 100 requests linkage from the receiving terminal 102 and the linkage is accepted, a channel capable of transmitting the position information of a touch means can be formed between the terminals 100 and 102 .
- the linkage of the terminals 100 and 102 can be performed according to a user's request or can be performed automatically by a terminal 100 or 102 .
- the user terminal 100 and the receiving terminal 102 can be connected by two channels, i.e. a data channel and a control channel, and the control signal for requesting or accepting linkage can be exchanged through the control channel.
- the receiving terminal 102 may transmit image data corresponding to at least a portion of a first image that is currently being displayed or about to be displayed to the user terminal 100 , and the user terminal 100 may display the first image corresponding to the transmitted image data (S 502 ). That is, the user terminal 100 and the receiving terminal 102 may share at least a portion of the first image.
- the user terminal 100 may sense the touch means (S 504 ).
- the user terminal 100 can sense the touch means using various methods such as capacitive sensing, electromagnetic sensing, etc.
- the user terminal 100 may transmit the position information of the touch means obtained according to the sensing result to the receiving terminal 102 , and the receiving terminal 102 may display a second image, which represents the position information of the touch means transmitted thus, together with the first image 110 (S 506 ).
- the user terminal 100 can also transmit image data (or combined image data) that includes data corresponding to the first image and data corresponding to the second image to the receiving terminal 102 .
- since this embodiment basically involves the receiving terminal 102 transmitting the image data corresponding to the first image to the user terminal 100 , the former method, in which the user terminal 100 transmits only the position information to the receiving terminal 102 , may be more efficient.
- the user terminal 100 may transmit the position information of the touch means to the receiving terminal 102 to display the second image on the receiving terminal 102 (S 508 ). That is, the movement of the touch means can be reflected on the receiving terminal 102 .
- FIG. 6 illustrates a method of linking and controlling terminals according to a fifth embodiment.
- FIGS. 7A through 7C illustrate a control method used for the method of linking and controlling terminals in FIG. 6
- FIGS. 8A through 8E illustrate a linking operation when different sensing levels are set in accordance with an embodiment.
- the user terminal 100 or the receiving terminal 102 may request linkage to begin linked operation (S 600 ).
- the user terminal 100 and the receiving terminal 102 can be connected by two channels, i.e. a data channel and a control channel, as illustrated in FIG. 7A , and the control signal for requesting or accepting linkage can be exchanged through the control channel.
- the data can be exchanged through one channel.
- the transmission periods for the channel can include data periods E 1 and E 2 for transmitting image data, and a control period C between the data periods E 1 and E 2 for transmitting the position information or the selection information for an event-executing entity, as illustrated in FIG. 7C .
- the position information or selection information can be transmitted by utilizing the blank periods C in-between the periods for transmitting image data.
- the user terminal 100 can transmit the image data to the receiving terminal 102 through one channel, and can transmit the position information or selection information to the receiving terminal 102 during the blank periods C existing in-between the data periods E 1 and E 2 for transmitting image data.
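A minimal sketch of transmitting position or selection information in the blank periods C between the data periods E 1 and E 2 , assuming a simple alternating schedule on a single channel:

```python
def interleave_frames(image_frames, control_msgs):
    """Place control packets (position/selection info) into the blank
    periods between image data periods, as in FIG. 7C (E1, C, E2, ...)."""
    stream = []
    pending = iter(control_msgs)
    for frame in image_frames:
        stream.append(("data", frame))
        msg = next(pending, None)
        if msg is not None:
            stream.append(("control", msg))  # blank period carries the info
    return stream
```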
- the receiving terminal 102 may transmit image data corresponding to at least a portion of the first image 110 to the user terminal 100 , to thus share the first image 110 with the user terminal 100 (S 602 ).
- the user terminal 100 may sense the touch pressure or the touch area of the touch means (S 604 ).
- in sensing the touch pressure or touch area, the user terminal 100 can set multiple levels, e.g. two levels, for the sensing levels, as illustrated in FIG. 7B . If it is not performing a terminal-linked process, i.e. if the user is using only the user terminal 100 , the sensing level for the user terminal 100 can be set to a higher level (L 2 ) such that the touch position is sensed only when there is a light touch made by the user.
- the user terminal 100 may set the sensing level to a lower level (L 1 ).
- the user terminal 100 can sense the touch means even when the touch means is near and not touching or when the touch pressure or touch area is smaller than or equal to a preset pressure or area.
- the user terminal 100 can employ various methods other than the capacitance-based method, such as methods based on electromagnetic induction, methods using a resistive overlay, optical methods, ultrasonic methods, etc., and the sensing conditions can vary according to the method employed.
- the user terminal 100 may transmit the position information of the touch means obtained according to the sensing result to the receiving terminal 102 , and the receiving terminal 102 may display a second image representing the position information, for example a position image 112 indicating the position of the touch means, such as that illustrated in FIG. 1B , together with the first image 110 (S 606 ).
- the position information of the touch means can be information in an image form or information in a coordinate form. That is, the user terminal 100 can generate the position information for the second image 112 directly in the form of image data and transmit it to the receiving terminal 102 , or transmit only the position information of the touch means to the receiving terminal 102 in the form of a control signal. Alternatively, the user terminal 100 can generate a position image (second image) for the touch means and transmit image data with the first image and the second image included to the receiving terminal 102 , so that the second image 112 can be displayed on the receiving terminal 102 concurrently with the sharing of the first image 110 .
- the position of the touch means can be displayed on the receiving terminal 102 if the touch pressure of the touch means is smaller than or equal to a preset value or the touch area is smaller than or equal to a preset value. That is, if the touch means touches the user terminal 100 lightly, the second image 112 can be displayed on the receiving terminal 102 . If the touch means moves while touching the user terminal 100 lightly, the second image 112 may reflect the movement of the touch means, and the second image 112 on the receiving terminal 102 may also move continuously (S 608 ).
- the user terminal 100 may not recognize the touch of the touch means as a touch input. If the touch means touches the user terminal 100 with a strong pressure, i.e. if the touch pressure exceeds a preset pressure or the touch area exceeds a preset touch area, the user terminal 100 may recognize the touch of the touch means as a touch input. Thus, if the touch means touches the user terminal 100 lightly, the icon may not be executed, but if the touch means touches the icon strongly, the icon can be executed.
- the embodiment shown in FIG. 5 and the embodiment shown in FIG. 6 can be applied together. That is, the second image 112 can be displayed on the receiving terminal 102 when the touch means is positioned within a preset distance from the user terminal 100 and when the touch means lightly touches the user terminal 100 .
- referring to FIGS. 8A to 8E , a more detailed description is provided below on setting the sensing levels for the embodiments illustrated in FIG. 5 and FIG. 6 .
- the user terminal 100 can be set to have multiple sensing levels. For example, a first level, a second level, and a third level can be set at the user terminal 100 : a first level for sensing the nearness of a touch means 804 , a second level for sensing the touch means 804 touching with a level (pressure level or area level) smaller than or equal to a preset level, and a third level for sensing the touch means 804 touching with a level greater than the preset level.
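The three sensing levels can be sketched as a simple classifier; the pressure threshold here is an arbitrary stand-in for the preset level:

```python
def sensing_level(near, pressure, preset=0.5):
    """Classify into the three levels: 1 = touch means near, 2 = touching
    at or below the preset level, 3 = touching above it. The threshold
    value is illustrative only."""
    if pressure > preset:
        return 3  # recognized as a selection; the entity is executed
    if pressure > 0:
        return 2  # position image shown, entity not executed
    if near:
        return 1  # hover: second image displayed at the receiving terminal
    return 0  # not sensed
```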
- the first image 110 can be displayed as illustrated in FIG. 8A .
- a first image 110 that is substantially the same can also be displayed at the user terminal 100 .
- the first image 110 can include event-executing entities 800 such as icons, application programs, links, etc., and when an event-executing entity 800 is selected, a racing game, for example, can be executed as illustrated in FIG. 8B .
- when the touch means 804 is brought near the display unit 806 of the user terminal 100 as illustrated in FIG. 8C , the second image 112 indicating the position of the touch means 804 can be displayed on the receiving terminal 102 as illustrated in FIG. 8A .
- the third image 114 can also be shown on the user terminal 100 .
- if the touch means 804 touches the display unit 806 with a level smaller than or equal to a preset level as illustrated in FIG. 8D , the image 112 or 114 indicating the position of the touch means 804 can remain as is, and the event-executing entity 800 may remain unexecuted.
- if the touch means 804 touches the display unit 806 with a level greater than the preset level, the user terminal can recognize this as a selection of the event-executing entity 800 and execute the game.
- the distal end 810 of the touch means 804 can be structured such that it can be inserted inside, and when the user makes a touch with the touch means 804 with a level greater than or equal to a preset level, the display unit 806 may be pressed with the distal end 810 inserted inside, as illustrated in FIG. 8E .
- a linkage and control system based on this embodiment can perform different operations according to sensing levels.
- an image 112 or 114 indicating the position of the touch means 804 can be shown, and if the touch means 804 touches the display unit 806 , the event-executing entity 800 can be executed.
- FIG. 9 is a flowchart illustrating a method of linking and controlling terminals according to a sixth embodiment.
- the user terminal 100 and the receiving terminal 102 may begin linked operation (S 900 ).
- the receiving terminal 102 may transmit image data corresponding to at least a portion of the displayed first image to the user terminal 100 , and the user terminal 100 may display a first image corresponding to the image data (S 902 ). That is, the user terminal 100 and the receiving terminal 102 may share the first image.
- the user terminal 100 may sense a touch means, such as a finger, a touch pen, etc., through any of a variety of methods (S 904 ).
- the user terminal 100 can sense the position of a touch means that is near the user terminal 100 or lightly touching the user terminal 100 .
- the user terminal 100 may generate a combined image, including the currently displayed first image together with the second image corresponding to the sensed position of the touch means, and may transmit combined image data (combination information) corresponding to the combined image to the receiving terminal 102 , and the receiving terminal 102 may display the combined image corresponding to the combined image data (S 906 ). Consequently, the second image together with the first image may be displayed on the receiving terminal 102 . In this case, the user terminal 100 can display the first image only or display the first image and the second image together.
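As a toy illustration of step S 906 , the combined image can be thought of as the first image with a position marker composited at the sensed coordinates; a text grid stands in for real image data here:

```python
def combine_images(first, cursor_pos, marker="*"):
    """Composite a position marker onto a text-grid 'image' (a toy
    stand-in for superimposing the second image over the first)."""
    x, y = cursor_pos
    grid = [list(row) for row in first]
    grid[y][x] = marker  # the second image occupies the sensed position
    return ["".join(row) for row in grid]
```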
- the user terminal 100 may transmit the position information of the touch means to the receiving terminal 102 , and the receiving terminal 102 may display the second image, which indicates the position of the touch means in accordance with the position information, together with the corresponding first image (S 908 ). That is, the movement of the touch means may be reflected in the screen of the receiving terminal 102 .
- the first image on the receiving terminal 102 may be the same image as the previous image or may be a different image from the previous image.
- this embodiment has the user terminal 100 generate a combined image that includes the first image and the second image and transmit the combined image thus generated to the receiving terminal 102 , so that the receiving terminal 102 may consequently display the second image together with the first image.
- the user terminal 100 can modify the first image such that a region corresponding to the position of the touch means is changed to a shadow image, etc., that is, the first image itself can be modified to create a new image, after which the image thus created can be transmitted to the receiving terminal 102 .
- the modified first image can be substantially the same as the combined image.
- FIG. 10 is a flowchart illustrating a method of linking and controlling terminals according to a seventh embodiment.
- the user terminal 100 and the receiving terminal 102 may begin linked operation (S 1000 ).
- the user or the user terminal 100 may set multiple sensing levels for sensing the touch means (S 1002 ). As described above, various levels can be set for different embodiments. Such settings may be established at the beginning of the linked operation or may be established beforehand in the user terminal 100 prior to linked operation.
- the user terminal 100 may sense the touch means, and the receiving terminal 102 may display the second image, which represents the sensed position of the touch means (S 1004 ).
- the stopping of linked operation can be requested by the user by a method such as a menu selection, etc., or can also be requested by turning off the connection between the user terminal 100 and the receiving terminal 102 .
- the user, controlling the user terminal 100 while viewing the receiving terminal 102 , may wish to use the user terminal 100 only or may wish to view the receiving terminal 102 only, in which case the user can request a stop of linked operation while the user terminal 100 and the receiving terminal 102 are in a connected state.
- step S 1004 may be performed again.
- the user terminal 100 may initialize the multiple levels such that only one level is available or change the levels to sensing levels which only sense touches (S 1008 ). That is, the user terminal 100 may change the levels such that the touch means is not sensed if the touch means does not make a touch, and that the touch means is sensed only when the touch means makes a touch.
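The level changes at steps S 1002 and S 1008 amount to toggling which sensing modes are active; a hedged sketch with invented mode names:

```python
def apply_link_state(linked):
    """Sensing configuration before/after linked operation: hover and
    light-touch sensing are enabled only while the terminals are linked
    (mode names are invented for illustration)."""
    if linked:
        return {"hover": True, "light_touch": True, "touch": True}
    # After linked operation stops, only contact touches are sensed.
    return {"hover": False, "light_touch": False, "touch": True}
```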
- a method of linking and controlling terminals can allow a user to arbitrarily request linked operation and request a stopping of the linked operation and to freely set and change sensing levels.
- a preliminary sensing level can be set at the beginning of linked operation between the user terminal 100 and receiving terminal 102 , and the sensing levels can be set differently during linked operation.
- the user can set different sensing levels during linked operation according to the nearness distance and touch strength of the touch means and can change the sensing level settings to sense the touch means only when it is in contact.
- FIG. 11 illustrates a system for linking and controlling terminals according to another embodiment.
- a system for linking and controlling terminals based on this embodiment can include a user terminal 100 , a receiving terminal 102 , and a transceiver device 1100 (e.g. a dongle).
- the transceiver device 1100 can connect the communications between the user terminal 100 and the receiving terminal 102 .
- the transceiver device 1100 can transmit the position information of the touch means, the combined image data, or the selection information for the event-executing entity to the receiving terminal 102 .
- image data transmitted from the receiving terminal 102 can be transmitted by the transceiver device 1100 to the user terminal 100 .
- the transceiver device 1100 can be connected to the user terminal 100 or the receiving terminal 102 , or can exist separately without being connected to the user terminal 100 and receiving terminal 102 .
- the transceiver device 1100 can be, for example, a dongle, a set-top box, etc.
- the position information, combined image data, or selection information transmitted from the user terminal 100 can be forwarded by the transceiver device 1100 as is, without modification, to the receiving terminal 102 .
- the transceiver device 1100 may convert the position information, combined image data, or selection information transmitted from the user terminal 100 to a format suitable for the receiving terminal 102 and then transmit the converted position information or combined image data to the receiving terminal 102 . Since many companies currently manufacture the receiving terminal 102 , in the form of a smart TV, etc., it may be necessary to match the position information, combined image data, or selection information with the format of the receiving terminal 102 according to manufacturer. An embodiment can use the transceiver device 1100 to convert the position information, combined image data, or selection information to fit the format of the receiving terminal 102 , so that the terminals 100 and 102 can be linked regardless of manufacturer.
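A minimal sketch of the transceiver device's conversion role; both vendor formats below are fabricated purely to illustrate per-manufacturer adaptation:

```python
def to_vendor_format(msg, vendor):
    """Dongle-side conversion of a position message into a
    manufacturer-specific payload; the vendor formats are invented."""
    x, y = msg["position"]
    if vendor == "vendor_a":
        return {"type": "ptr", "x": x, "y": y}  # structured payload
    if vendor == "vendor_b":
        return f"PTR {x} {y}"  # line-protocol payload
    return msg  # unknown vendor: forward unmodified
```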
- the transceiver device 1100 can include a communication unit, a signal unit, and a format changer unit.
- the communication unit may connect the user terminal 100 and the receiving terminal 102 .
- the signal unit can transmit to the receiving terminal 102 the position information of the touch means, the combined image data, or the selection information transmitted from the user terminal 100 , and can transmit the first image received from the receiving terminal 102 to the user terminal 100 .
- the format changer unit can modify the position information, combined image data, or selection information to the format of the receiving terminal 102 or modify the first image to the format of the user terminal 100 .
- the user terminal 100 and the receiving terminal 102 can send or receive the position information, image data, combined image data, or selection information by way of the transceiver device 1100 .
- the user terminal 100 or the receiving terminal 102 connected to the transceiver device 1100 need not have a communication function.
- the transceiver device 1100 may not only provide communication between the user terminal 100 and the receiving terminal 102 but may also sense the position of a touch means 1102 .
- the transceiver device 1100 may serve as a communication means for the user terminal 100 and can provide a particular communication function, such as WiBro communication, for example, and can also convert one communication function into another, such as by converting WiBro to Wi-Fi, for example, for use by the user terminal 100 .
- FIG. 12 is a flowchart illustrating a method of linking and controlling terminals according to an eighth embodiment.
- a receiver may be installed on the receiving terminal 102 (S 1200 ). In another embodiment, the receiver can be built into the receiving terminal 102 .
- the user terminal 100 and the receiving terminal 102 may begin linked operation (S 1202 ).
- the user terminal 100 may transmit image data corresponding to the first image to the receiving terminal 102 to share the first image 110 , or the receiving terminal 102 may transmit the image data to the user terminal 100 to share the first image 110 (S 1204 ).
- the receiver may sense the position of the touch means by way of infrared rays and ultrasonic waves emitted from the touch means and transmit the sensed position of the touch means to the receiving terminal 102 , and the receiving terminal 102 may display the second image, which represents the position thus obtained by sensing, together with the first image (S 1206 ).
- the user terminal 100 may transmit the position information based on the movement of the touch means to the receiving terminal 102 , and the receiving terminal 102 can show the movement of the touch means as a second image 112 or a third image different from the second image.
- FIG. 13 illustrates a method of sensing a touch means according to a first embodiment.
- a touch pen 1300 can be used as the touch means intended for touching the user terminal 100 .
- a capacitance-based touch panel may be used for the user terminal 100 .
- the touch pen 1300 may be composed of a body 1310 and a touch part 1312 .
- the body 1310 may be made of an electrically non-conducting material, while the touch part 1312 may be a conductor.
- a change in capacitance may occur when the touch pen 1300 is brought near the user terminal 100 or is touching the user terminal 100 , and the user terminal 100 can sense the touch pen 1300 based on the change in capacitance.
- FIG. 14 illustrates a method of sensing a touch means according to a second embodiment.
- the user terminal 100 can include a touch panel 1400 and an electromagnetic field generator unit 1402 .
- the electromagnetic field generator unit 1402 can be connected to a rear surface of the touch panel 1400 and can be made of a thin metal film to generate an electromagnetic field when electricity is applied.
- the touch pen 1404 may include a body 1410 and a touch part 1412 , where the touch part 1412 can preferably be made of a small metal coil. Consequently, when the touch pen 1404 is brought near the touch panel 1400 , electromagnetic induction may occur in the touch part 1412 , and as a result, an alteration may occur in the electromagnetic field created by the electromagnetic field generator unit 1402 . Thus, the user terminal 100 may recognize the position of the touch pen 1404 by sensing this alteration in the electromagnetic field. In particular, since the alteration of the electromagnetic field would differ according to the nearness and touch strength of the touch pen 1404 , this method of sensing the touch means can minutely sense the degree of proximity and the touch pressure of the touch pen 1404 with respect to the touch panel 1400 .
- FIG. 15A and FIG. 15B illustrate a method of sensing a touch means according to a third embodiment.
- a receiver 1500 can be installed on a portion of the user terminal 100 , and a touch pen 1502 can be used.
- the receiver 1500 can include an infrared sensor and two ultrasonic sensors to sense the movement of the touch pen 1502 by receiving the infrared rays and ultrasonic waves emitted from the touch part (pen tip) of the touch pen 1502 , and can transmit the position information of the touch pen 1502 obtained in accordance with the sensing results to the receiving terminal 102 .
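The geometry behind an infrared-plus-ultrasonic receiver of this kind can be sketched as follows: the infrared pulse (traveling effectively instantaneously) marks time zero, each ultrasonic sensor measures a time of flight, and the two resulting range circles are intersected to locate the pen tip. The sensor spacing, coordinate frame, and speed-of-sound value below are assumptions for illustration, not details from the disclosure.

```python
# Hedged sketch of two-sensor ultrasonic ranging. Sensor 1 sits at (0, 0)
# and sensor 2 at (spacing, 0); the IR flash provides the common start time,
# so each distance is simply speed * time of flight.

import math

SPEED_OF_SOUND = 343.0  # m/s at room temperature (assumed)

def pen_position(tof1: float, tof2: float, spacing: float) -> tuple:
    """Locate the pen tip from two ultrasonic times of flight (seconds)."""
    d1 = SPEED_OF_SOUND * tof1
    d2 = SPEED_OF_SOUND * tof2
    # Intersection of the two range circles (taking the y >= 0 solution).
    x = (d1 ** 2 - d2 ** 2 + spacing ** 2) / (2 * spacing)
    y = math.sqrt(max(d1 ** 2 - x ** 2, 0.0))
    return (x, y)

# Pen actually at (0.03 m, 0.04 m), sensors 0.10 m apart:
x, y = pen_position(0.05 / SPEED_OF_SOUND, math.sqrt(0.0065) / SPEED_OF_SOUND, 0.10)
assert math.isclose(x, 0.03, abs_tol=1e-9) and math.isclose(y, 0.04, abs_tol=1e-9)
```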
- the receiving terminal 102 may display a second image that represents the transmitted position information. Consequently, the second image may be displayed together with the first image.
- the position information can be transmitted to the receiving terminal 102 by the receiver 1500 or by the user terminal 100 .
- the receiver 1500 can perform not only the function of sensing the position of the touch pen 1502 but also the function of transmitting image data and position information to the receiving terminal 102 .
- the receiver 1500 can include a touch control unit 1510 , an image signal unit 1512 , a control signal unit 1514 , and a transceiver unit 1516 , as illustrated in FIG. 15B .
- the touch control unit 1510 may serve to sense the position of the touch pen 1502 by using the received infrared rays and ultrasonic waves and provide the user terminal 100 with the information on the sensed position.
- the user terminal 100 may show the position of the touch pen 1502 or perform a related operation in accordance to the information thus provided.
- the image signal unit 1512 can be provided with image data from the user terminal 100 and transmit the image data thus provided to the receiving terminal 102 via the transceiver unit 1516 .
- the control signal unit 1514 may serve to transmit a control signal, which includes the position information of the touch pen 1502 obtained above by sensing, to the receiving terminal 102 . That is, since the receiver 1500 transmits the image data transmitted from the user terminal 100 and the position information of the touch pen 1502 to the receiving terminal 102 , the user terminal 100 does not have to include a communication function. Therefore, even with a terminal that does not have a communication function or a terminal that has a communication function but is unable to use the related communication facilities, it is possible to recognize the position and action of the touch pen 1502 using the receiver 1500 as well as to employ a linkage method according to an embodiment for sharing images, displaying the second image, etc.
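Since the receiver forwards two distinct streams to the receiving terminal, image data from the image signal unit and pen-position control signals from the control signal unit, a simple multiplexing scheme could be used on the transceiver link. The framing below (a one-byte type tag plus a four-byte length) is purely an assumed format for illustration; the disclosure does not specify any wire format.

```python
# Illustrative sketch: multiplex image frames and control packets over one
# link by tagging each payload with a type byte and a big-endian length.

import struct

TYPE_IMAGE = 0x01    # image data forwarded by the image signal unit
TYPE_CONTROL = 0x02  # pen-position packets from the control signal unit

def frame(msg_type: int, payload: bytes) -> bytes:
    """Prefix a payload with its type and length for the transceiver unit."""
    return struct.pack(">BI", msg_type, len(payload)) + payload

def unframe(data: bytes) -> tuple:
    """Reverse of frame(): return (type, payload, remaining bytes)."""
    msg_type, length = struct.unpack(">BI", data[:5])
    return msg_type, data[5:5 + length], data[5 + length:]

packet = frame(TYPE_CONTROL, b"x=120,y=45") + frame(TYPE_IMAGE, b"...jpeg...")
t, payload, rest = unframe(packet)
assert t == TYPE_CONTROL and payload == b"x=120,y=45"
t, payload, rest = unframe(rest)
assert t == TYPE_IMAGE and rest == b""
```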
- the receiver can be incorporated into the user terminal 100 to be implemented as a single body.
- FIG. 16A and FIG. 16B illustrate a method of sensing a touch means according to a fourth embodiment.
- a receiver 1600 can be installed on or built into the receiving terminal 102 rather than the user terminal 100 .
- the receiver 1600 may receive the ultrasonic waves and infrared rays to sense the position of the touch pen 1602 .
- the receiver 1600 may transmit information regarding the position of the touch pen 1602 thus sensed to the receiving terminal 102 , and the receiving terminal 102 may display the second image representing the position of the touch pen 1602 together with the first image.
- the user terminal 100 and the receiving terminal 102 may display the second image while sharing the first image.
- the receiver 1600 can serve not only to sense the position of the touch pen 1602 but also to perform communication.
- the receiver 1600 can include a touch means sensing unit 1610 , an image signal unit 1612 , a control signal unit 1614 , and a transceiver unit 1616 .
- the touch means sensing unit 1610 may serve to sense the position of the touch pen 1602 .
- the image signal unit 1612 may receive the combined image data transmitted from the user terminal 100 by way of the transceiver unit 1616 and transmit the received image data by way of the transceiver unit 1616 to the receiving terminal 102 , or may also receive image data from the receiving terminal 102 and transmit the received image data to the user terminal 100 .
- the control signal unit 1614 can receive a control signal transmitted from the user terminal 100 related to the linked operation, etc., and can transmit the received control signal to the receiving terminal 102 .
- the position information of the touch pen 1602 need not be transmitted from the user terminal 100 to the receiver 1600 .
- the receiver 1600 may provide not only the function of sensing the position of the touch pen 1602 but also a communication function.
- a linking and control method according to an embodiment can be used even when the receiving terminal 102 does not have a communication function.
- FIG. 17 is a block diagram illustrating the structure of a user terminal according to an embodiment.
- the user terminal 100 of this embodiment can include a control unit 1700, a linkage unit 1702, an image unit 1704, a sensing unit 1706, a settings unit 1708, a display unit 1710, a signal unit 1712, a transceiver unit 1714, a decoding unit 1716, and a storage unit 1718.
- the linkage unit 1702 may manage all functions related to linkage with the receiving terminal 102 .
- the image unit 1704 can include an image generator unit and an image changer unit and can display, via the display unit 1710 , an image corresponding to the image data transmitted from the receiving terminal 102 .
- the image generator unit can generate a position image, which may represent the position of a touch means that is positioned within a preset distance from the display unit 1710 , generate a combined image, which may include the second image and the first image, or generate a position image of a touch means that is touching the display unit 1710 with a pressure or an area smaller than or equal to a preset value.
- the image changer unit can change the position image according to the distance between the touch means and the user terminal 100 , and can change the position image according to the pressure or area with which the touch means contacts the user terminal 100 . Also, the image changer unit can change the position image according to whether or not the touch means is over an event-executing entity or a preset position.
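The decision the image changer unit makes can be sketched as a small lookup over the pen's state. The distance thresholds, pressure convention, and style names below are invented for illustration; the disclosure only requires that the position image vary with distance, pressure or area, and entity hover.

```python
# Illustrative sketch of the image changer unit: pick a position-image style
# from the pen's distance, contact pressure, and whether it is over an
# event-executing entity. All thresholds and style names are hypothetical.

def choose_position_image(distance_mm: float, pressure: float, over_entity: bool) -> str:
    """Return the cursor variant the display unit should draw."""
    if over_entity:
        return "highlight"          # pen is over a clickable entity
    if pressure > 0:
        return "pressed"            # touching: show a contact marker
    if distance_mm <= 10:
        return "near-cursor-large"  # close hover: large, opaque cursor
    if distance_mm <= 30:
        return "near-cursor-small"  # far hover: small, faint cursor
    return "hidden"                 # out of sensing range

assert choose_position_image(5, 0, False) == "near-cursor-large"
assert choose_position_image(20, 0, False) == "near-cursor-small"
assert choose_position_image(0, 0.7, False) == "pressed"
assert choose_position_image(8, 0, True) == "highlight"
```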
- the sensing unit 1706 may serve to sense a touch means, such as a finger or a touch pen, etc. More specifically, the sensing unit 1706 can sense the position of the touch means, distinguishing when the touch means is near and when it is touching.
- the method of sensing is not limited to a particular method and can be a capacitance-based method, an electromagnetic induction-based method, etc.
- the information on the position of the touch means as sensed by the sensing unit 1706 can also be generated by a position information generating unit (not shown).
- the settings unit 1708 may manage the settings of various functions, such as linkage function settings, sensing level settings, etc.
- the display unit 1710 can be implemented in various ways such as by using capacitance-based types, resistive overlay types, electromagnetic induction types, etc.
- the display unit 1710 can be a touch display unit equipped with a touch function.
- the signal unit 1712 can include an information transmitting unit 1720 and an information receiving unit 1722 .
- the information transmitting unit 1720 can transmit the position information of a touch means or the selection information for an event-executing entity to the receiving terminal 102 .
- the selection information can include position information corresponding to a touch input for selecting an event-executing entity or control information for executing an event-executing entity according to a touch input.
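The two shapes of selection information mentioned above, raw position information for the selecting touch versus a control command that directly executes the entity, can be sketched as follows. The field names and structure are invented for illustration; the disclosure does not define a message format.

```python
# Sketch of the two forms of selection information the information
# transmitting unit 1720 might send. All field names are hypothetical.

def make_selection_info(entity_id, x=None, y=None):
    """Build selection information to transmit to the receiving terminal."""
    if x is not None and y is not None:
        # Position-based selection: the receiving terminal resolves which
        # event-executing entity lies under the reported coordinates.
        return {"type": "position", "x": x, "y": y}
    # Control-based selection: directly name the entity to execute.
    return {"type": "execute", "entity": entity_id}

assert make_selection_info("app-icon-3", x=120, y=45) == {"type": "position", "x": 120, "y": 45}
assert make_selection_info("app-icon-3") == {"type": "execute", "entity": "app-icon-3"}
```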
- the information receiving unit 1722 may receive image data from the receiving terminal 102 corresponding to the entirety or a portion of the first image.
- the transceiver unit 1714 may serve as a communication passageway to the receiving terminal 102 .
- the decoding unit 1716 may decode the image data received from the receiving terminal 102 .
- the storage unit 1718 may store various data, such as the first image, image signals, position information, control signals, application programs, etc.
- the control unit 1700 may control the overall operations of the components of the user terminal 100 .
- the user terminal 100 can further include a touch means unit, a receiver unit, an electromagnetic field generator unit, or a touch means operating unit.
- the touch means unit may receive and manage information on the position of the touch means when the receiver is connected with the user terminal 100 . That is, the receiver may sense the position of the touch means, while the touch means unit may analyze the signals transmitted from the receiver to detect the position of the touch means.
- the receiver unit may receive infrared rays and ultrasonic waves transmitted from the touch means when the touch means is brought near to or in contact with the user terminal 100 , and may analyze the infrared rays and ultrasonic waves thus received to detect the position of the touch means.
- the electromagnetic field generator unit may serve to create an electromagnetic field for sensing the touch means by electromagnetic induction, and may preferably be formed on a rear surface of the display unit.
- the touch means operating unit can perform a particular operation when the touch means is brought near the user terminal 100 or is touching the user terminal 100 with a pressure or an area smaller than or equal to a preset value. For example, a scroll function can be performed if the touch means is brought near a lower part of the display unit 1710 on the user terminal 100 .
- the position information of the touch means can be transmitted to the receiving terminal 102 or a third image 114 corresponding to the position information can be displayed on the display unit 1710 . That is, the user terminal 100 can transmit the position information of the touch means or display a third image 114 on the user terminal 100 , in response to an approaching near of or a light touch by the touch means, to result in a particular operation such as scrolling, etc.
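The touch means operating unit's behavior, mapping a near approach or a sufficiently light touch in a given screen region to an operation such as scrolling, can be sketched as below. The screen height, region split, and pressure limit are assumed values for illustration only.

```python
# Illustrative sketch of the touch means operating unit: decide an operation
# for a pen that is near or only lightly touching, based on where it is.

SCREEN_HEIGHT = 800  # px, hypothetical display

def hover_action(y: int, pressure: float, max_light_pressure: float = 0.2) -> str:
    """Choose an operation for a hovering or lightly touching pen."""
    if pressure > max_light_pressure:
        return "normal-touch"   # firm touch: handled as an ordinary touch input
    if y > SCREEN_HEIGHT * 0.85:
        return "scroll-down"    # hovering near the lower edge scrolls down
    if y < SCREEN_HEIGHT * 0.15:
        return "scroll-up"      # hovering near the upper edge scrolls up
    return "show-position"      # elsewhere: just display the position image

assert hover_action(780, 0.0) == "scroll-down"
assert hover_action(50, 0.1) == "scroll-up"
assert hover_action(400, 0.0) == "show-position"
assert hover_action(400, 0.9) == "normal-touch"
```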
- FIG. 18 is a block diagram illustrating the structure of a user terminal according to another embodiment.
- the user terminal 100 of this embodiment can include a control unit 1800 , a display unit 1802 , a sensing unit 1804 , a signal unit 1806 , an image unit 1808 , an information provider unit 1810 , and a mode changer unit 1812 .
- the user terminal 100 can include all or just some of the components above. Also, the user terminal 100 can additionally include components other than the components above.
- the display unit 1802 may display a first image that is shared by the user terminal 100 and the receiving terminal 102 . Also, the display unit 1802 can display a menu, etc., from which to select a touch mode.
- the sensing unit 1804 may sense the position of a touch means by way of various methods such as those described above, when the touch means is near or is touching the user terminal 100 .
- the position information representing the position of the touch means can be the position information of the touch means that is positioned within a preset distance from the user terminal 100 .
- the signal unit 1806 may transmit the position information of the touch means obtained by the sensing above to the receiving terminal 102 and may transmit image data corresponding to the first image shared by the user terminal 100 and the receiving terminal 102 to the receiving terminal 102 .
- the image unit 1808 may generate combined image data that is to be shared by the user terminal 100 and the receiving terminal 102 or the second image that indicates the position of the touch means.
- the information provider unit 1810 can output information according to the sensing results of the sensing unit 1804 . That is, nearness information can be outputted if a touch means is brought near the user terminal 100 and sensed by the sensing unit 1804 .
- the nearness information can be in the form of vibration, sound, or light, so as to stimulate the user's tactile, auditory, or visual senses.
- the information provider unit 1810 can provide the user with a tactile, auditory, or visual sensation in various types according to the state of nearness of the touch means with respect to the user terminal 100 , to allow the user of the user terminal 100 to perceive various tactile, auditory, or visual sensations.
- the information provider unit 1810 can provide a continuous vibration, sound, or light during a movement of the touch means. That is, a short vibration can be provided once when a near touch of the touch means is first recognized, after which continuous vibrations can be provided when the touch means moves. In other words, when a near touch of the touch means is first recognized, a vibration can be provided for a first duration, and afterwards when the touch means moves, a vibration can be provided for a second duration.
- the second duration can be longer than the first duration.
- the information provider unit 1810 can provide a vibration when a near touch is first recognized, and afterwards provide a sound when the touch means is moving.
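The feedback rule described above, a short pulse when the near state is first recognized and a longer signal while the pen keeps moving, can be sketched as a small event-to-feedback mapping. The durations and the choice of vibration for both cases are illustrative assumptions; as noted, the second case could equally use sound or a different vibration pattern.

```python
# Hedged sketch of the information provider unit's feedback rule.
# Durations are invented; only the ordering (second > first) matters here.

FIRST_DURATION_MS = 50    # short pulse on first recognition of nearness
SECOND_DURATION_MS = 200  # longer feedback while the pen moves

def nearness_feedback(event: str) -> tuple:
    """Map a sensing event to (feedback kind, duration in ms)."""
    if event == "near-enter":
        return ("vibration", FIRST_DURATION_MS)
    if event == "near-move":
        # Could equally be ("sound", ...) to distinguish the two cases.
        return ("vibration", SECOND_DURATION_MS)
    return ("none", 0)

kind, first = nearness_feedback("near-enter")
_, second = nearness_feedback("near-move")
assert kind == "vibration" and second > first
```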
- the information provider unit 1810 can provide the user with nearness information in the form of sound, etc., when the touch means is brought near a preset entity such as a folder, control UI, etc., from among the images shown on the screen of the user terminal 100 . That is, as described above with reference to FIGS. 6A and 6B , the position image of the touch means can change when the touch means is placed at a preset position, and at this time, the information provider unit 1810 can output nearness information, where the nearness information can correspond to event occurrence information. In this way, the user can perceive the entity immediately.
- the information provider unit 1810 can provide the nearness information for a first duration when the sensing unit 1804 recognizes a nearness state of the touch means, and can provide the nearness information for a second duration when the touch means moves while the sensing unit 1804 is aware of the nearness state of the touch means.
- the second duration can be a duration corresponding to the duration for which the touch means moves while the sensing unit 1804 is aware of the nearness state of the touch means.
- the information provider unit 1810 can provide the nearness information in different forms for a first case in which the sensing unit 1804 recognizes a nearness state of a touch means and a second case in which the touch means is moved while the sensing unit 1804 is aware of the nearness state of the touch means.
- a vibration can be provided for the first case and a sound can be provided for the second case, or vibrations of a first pattern can be provided for the first case and vibrations of a second pattern can be provided for the second case, so as to allow the user to perceive the movement of the touch means.
- the information provider unit 1810 may not provide the nearness information, allowing the user of the user terminal to differentiate between a near state and a direct touch.
- the information provider unit 1810 can provide the nearness information.
- once the touch means is brought near the user terminal 100 and touches the user terminal 100, the touch means may be separated from the user terminal 100. That is, since the purpose of the touch has been fulfilled, the touch means may be separated from the user terminal 100 to proceed with the next operation, at which time a nearness state may occur again.
- since the touch means is put in a nearness state for the first time after touching the display unit 1802, and was not intentionally placed in a nearness state by the user, the information provider unit 1810 may not provide nearness information. Then, when a nearness state occurs for the second time, i.e. when the user intentionally triggers a nearness state, the information provider unit 1810 can provide nearness information.
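The suppression behavior described above can be sketched as a small state machine: after a completed touch, the first return to the near state is treated as incidental and produces no nearness information, while the next near state does. The class and method names are invented for this sketch.

```python
# Illustrative state machine: suppress nearness feedback for the first
# nearness state that follows a touch, since it was not user-intended.

class NearnessNotifier:
    def __init__(self):
        self.suppress_next_near = False

    def on_event(self, event: str) -> bool:
        """Return True when nearness information should be provided."""
        if event == "touch":
            # Lifting off after a touch will re-enter the near state once.
            self.suppress_next_near = True
            return False
        if event == "near":
            if self.suppress_next_near:
                self.suppress_next_near = False
                return False  # first post-touch nearness: stay silent
            return True       # intentional nearness: notify the user
        return False

n = NearnessNotifier()
assert n.on_event("near") is True    # ordinary approach: notify
n.on_event("touch")                  # purpose of the touch fulfilled
assert n.on_event("near") is False   # separation after touch: no feedback
assert n.on_event("near") is True    # second, intentional near state: notify
```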
- the mode changer unit 1812 may change the touch mode according to the user's input, where the touch mode can include a near touch mode 1820 and a direct touch mode 1822 .
- the user input can be made by way of a switch.
- the switch can be configured as a ring/vibration conversion switch.
- the position image of a nearby touch means can be displayed on the user terminal 100 or the receiving terminal. Also, the position image of a touch means touching the user terminal 100 with a pressure level or area smaller than or equal to a preset value can be displayed on the user terminal 100 or the receiving terminal.
- the user terminal 100 may recognize the position of a touch means that is near the user terminal 100 for a touch input of the touch means, and in the direct touch mode, the user terminal 100 may recognize a touch by the touch means as a touch input.
- the mode changer unit 1812 can change the touch mode at the beginning of linked operation between the user terminal 100 and the receiving terminal 102 or change the touch mode according to the user's command, e.g. the user's touch, voice data, or visual data.
- the user can change the touch mode by selecting on a menu shown on the user terminal 100 , or the touch mode can be changed by the user's voice or a visual sequence such as motion.
- the touch mode change function described above can be provided when the linked operation of the user terminal 100 and the receiving terminal 102 begins, or can be provided in the user terminal 100 regardless of linked operation.
- the touch mode change function for linked operation can be provided automatically when the linked operation begins or can be provided after linking when the user selects the function. With a small screen as on a smart phone, it can be useful to sense a touch means that is nearby and show a corresponding image on the smart phone, and in such a device, it may be advantageous to provide the touch mode change function regardless of linked operation.
- the user terminal 100 can be provided with a link mode in addition to the near touch mode and direct touch mode. For example, if a user carrying a user terminal 100 such as a smart phone or a tablet PC, etc., wishes to link it to a receiving terminal 102 , the user can select the link mode. When the user selects the link mode, the user terminal 100 can search for display apparatuses close by, begin linking with a searched receiving terminal 102 , and display a menu from which to select a near touch mode and a direct touch mode on the user terminal 100 at the beginning of the linked operation.
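The link-mode flow above, search for nearby display apparatuses, begin linking with a found receiving terminal, then present the touch-mode menu, can be sketched as follows. Device discovery is faked with a plain list, and every name and return field is an assumption for illustration.

```python
# Hedged sketch of the link-mode steps: discover, link, then offer the
# near/direct touch-mode menu at the beginning of linked operation.

def start_link_mode(nearby_displays: list) -> dict:
    """Drive the link-mode steps and report the resulting state."""
    if not nearby_displays:
        return {"linked": False, "menu": None}
    receiving_terminal = nearby_displays[0]  # link with a searched display
    return {
        "linked": True,
        "receiving_terminal": receiving_terminal,
        # Let the user pick a touch mode once linked operation begins:
        "menu": ["near touch mode", "direct touch mode"],
    }

state = start_link_mode(["living-room TV"])
assert state["linked"] and state["receiving_terminal"] == "living-room TV"
assert "near touch mode" in state["menu"]
assert start_link_mode([]) == {"linked": False, "menu": None}
```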
- the selection of the touch mode change can be achieved by methods other than selecting a menu displayed on the user terminal 100 , such as by pressing a button provided on a side surface, a front surface, etc., of a smart phone.
- the control unit 1800 may control the overall operations of the components of the user terminal 100 .
- FIG. 19 is a block diagram illustrating the structure of a receiving terminal according to an embodiment.
- a receiving terminal 102 can include a control unit 1900 , a linkage unit 1902 , a transceiver unit 1904 , a signal unit 1906 , an image unit 1908 , a display unit 1910 , an operation executing unit 1912 , and a storage unit 1914 .
- the linkage unit 1902 may manage the function of linking to the user terminal 100 .
- the transceiver unit 1904 may serve as a communication passageway to the user terminal 100 .
- the signal unit 1906 can include an information transmitting unit 1920 and an information receiving unit 1922 .
- the information transmitting unit 1920 may transmit image data corresponding to the first image to the user terminal 100 .
- the information receiving unit 1922 can receive the position information of the touch means or the selection information of the event-executing entity that is transmitted from the user terminal 100 .
- the image unit 1908 may display the first image through the display unit 1910 , and may display the second image corresponding to the position information of the touch means received above, together with the first image. According to another embodiment, the image unit 1908 can combine the first image with the second image and display the combined image, i.e. the result of the combining, through the display unit 1910 .
- the display unit 1910 is not limited to a particular type as long as it is capable of displaying images, and can be implemented, for example, as an LCD, OLED, PDP, etc.
- the display unit 1910 does not necessarily require a touch function.
- the operation executing unit 1912 can execute the operation corresponding to the selection information of the event-executing entity.
- the storage unit 1914 may store various data such as the first image, the second image, a combined image, application programs, etc.
- the control unit 1900 may control the overall operations of the components of the receiving terminal 102 .
- a computer-readable medium can include program instructions, data files, data structures, etc., alone or in combination.
- the program instructions recorded on the medium can be designed and configured specifically for the present invention or can be of a kind known to and used by those skilled in the field of computer software.
- Examples of a computer-readable medium may include magnetic media such as hard disks, floppy disks, magnetic tapes, etc., optical media such as CD-ROMs, DVDs, etc., magneto-optical media such as floptical disks, etc., and hardware devices such as ROM, RAM, flash memory, etc.
- Examples of the program of instructions may include not only machine language codes produced by a compiler but also high-level language codes that can be executed by a computer through the use of an interpreter, etc.
- the hardware mentioned above can be made to operate as one or more software modules that perform the actions of the embodiments, and vice versa.
Abstract
A system for linking and controlling terminals is disclosed. The user terminal includes a decoding unit configured to decode image data received from a receiving terminal; a touch display unit configured to display the decoded image; and an information transmitting unit configured to transmit selection information for an event-executing entity included in the decoded image to the receiving terminal.
Description
- This application is a continuation of U.S. patent application Ser. No. 13/785,370 filed Mar. 5, 2013, which claims the benefit of Korean Patent Applications No. 10-2012-0023012, No. 10-2012-0022986, No. 10-2012-0022988, No. 10-2012-0022984, No. 10-2012-0024073, No. 10-2012-0024092, No. 10-2012-0032982, No. 10-2012-0033047, No. 10-2012-0043148, No. 10-2012-0057996, No. 10-2012-0057998, No. 10-2012-0058000, filed respectively with the Korean Intellectual Property Office on Mar. 6, 2012, Mar. 6, 2012, Mar. 6, 2012, Mar. 6, 2012, Mar. 8, 2012, Mar. 8, 2012, Mar. 30, 2012, Mar. 30, 2012, Apr. 25, 2012, May 31, 2012, May 31, 2012, May 31, 2012, the disclosures of which are incorporated herein by reference in their entireties.
- 1. Technical Field
- The present invention relates to a system and method for linking and controlling terminals.
- 2. Description of the Related Art
- The smart TV has become very popular in recent times, and various methods have been proposed to increase convenience in performing operations on a smart TV, one such method being disclosed, for example, in Korean Patent Publication No. 2011-0078656. However, it is inconvenient for a user to control a smart TV using touch methods, and remote controls for controlling smart TVs are not very efficient.
- Embodiments include a system and method with which to efficiently control a receiving terminal, such as a smart TV, etc., by using a user terminal, such as a smart phone, etc.
- An embodiment includes a user terminal that includes: a decoding unit configured to decode image data received from a receiving terminal; a touch display unit configured to display the decoded image; and an information transmitting unit configured to transmit selection information for an event-executing entity included in the decoded image to the receiving terminal.
- Another embodiment includes a receiving terminal that includes: an information transmitting unit configured to transmit an entirety of or a portion of an image to a user terminal; and an information receiving unit configured to receive selection information for an event-executing entity included in the image from the user terminal.
- A terminal linkage method according to an embodiment includes: sharing an image with a receiving terminal; receiving input in the form of selection information for an event-executing entity included in the image; and transmitting the selection information to the receiving terminal.
- A terminal linkage method according to another embodiment includes: transmitting an entirety of or a portion of an image to a user terminal; and receiving selection information for an event-executing entity included in the image from the user terminal.
- A program of instructions that can be executed to link a user terminal and a receiving terminal according to an embodiment can be tangibly embodied in a recorded medium readable by a digital processing device, where the program of instructions are for a method that includes: sharing an image with the receiving terminal; receiving input in a form of selection information for an event-executing entity included in the image; and transmitting the selection information to the receiving terminal.
- A program of instructions that can be executed to link a user terminal and a receiving terminal according to an embodiment can be tangibly embodied in a recorded medium readable by a digital processing device, where the program of instructions are for a method that includes: transmitting an entirety of or a portion of an image to the user terminal; and receiving selection information for an event-executing entity included in the image from the user terminal.
- In a method and system for linking and controlling terminals according to an embodiment, the receiving terminal can provide the user terminal with an entirety of or a portion of a displayed image, the user can select an event-executing entity included in the image via the user terminal, and the user terminal can transmit the selection information of the event-executing entity to the receiving terminal. Thus, the user can efficiently control the operations of the receiving terminal by using the user terminal.
- Since the image of the receiving terminal may be displayed on the user terminal, the user can control the receiving terminal while looking only at the user terminal, without looking at the receiving terminal. Thus, controlling the receiving terminal can be freed from the limitation of place.
- Also, a position image can be shown on the user terminal, indicating the position of a touch means that is nearby or is touching with a light pressure or area, so that the user can conveniently control the operation of the user terminal.
- Furthermore, the position image can also be shown on the receiving terminal, in which case the user can easily control the operation of the receiving terminal while viewing only the receiving terminal without looking at the user terminal.
- Additional aspects and advantages of the present invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice.
- FIG. 1A, FIG. 1B, FIG. 1C, and FIG. 1D schematically illustrate a system for linking and controlling terminals according to an embodiment.
- FIG. 2A and FIG. 2B illustrate a method of linking and controlling terminals according to a first embodiment.
- FIG. 3 illustrates a method of linking and controlling terminals according to a second embodiment.
- FIG. 4A and FIG. 4B illustrate a method of linking and controlling terminals according to a third embodiment.
- FIG. 5 illustrates a method of linking and controlling terminals according to a fourth embodiment.
- FIG. 6 illustrates a method of linking and controlling terminals according to a fifth embodiment.
- FIG. 7A, FIG. 7B, and FIG. 7C illustrate a control method used for the method of linking and controlling terminals in FIG. 6.
- FIG. 8A, FIG. 8B, FIG. 8C, FIG. 8D, and FIG. 8E illustrate a linking operation when different sensing levels are set in accordance with an embodiment.
- FIG. 9 is a flowchart illustrating a method of linking and controlling terminals according to a sixth embodiment.
- FIG. 10 is a flowchart illustrating a method of linking and controlling terminals according to a seventh embodiment.
- FIG. 11 illustrates a system for linking and controlling terminals according to another embodiment.
- FIG. 12 is a flowchart illustrating a method of linking and controlling terminals according to an eighth embodiment.
- FIG. 13 illustrates a method of sensing a touch means according to a first embodiment.
- FIG. 14 illustrates a method of sensing a touch means according to a second embodiment.
- FIG. 15A and FIG. 15B illustrate a method of sensing a touch means according to a third embodiment.
- FIG. 16A and FIG. 16B illustrate a method of sensing a touch means according to a fourth embodiment.
- FIG. 17 is a block diagram illustrating the structure of a user terminal according to an embodiment.
- FIG. 18 is a block diagram illustrating the structure of a user terminal according to another embodiment.
- FIG. 19 is a block diagram illustrating the structure of a receiving terminal according to an embodiment.
- Certain embodiments of the present invention will be described below in more detail with reference to the accompanying drawings.
- A system for linking and controlling terminals according to an embodiment relates to linking and controlling terminals, especially by sharing an image and controlling the operation of a terminal based on the shared image. In particular, a system for linking and controlling terminals according to an embodiment can link a smaller terminal (e.g. smart phone, tablet PC, etc.), which is controlled by a touch method, with a larger terminal (e.g. TV, etc.) and enable various methods for controlling the larger terminal with the smaller terminal. That is, a system for linking and controlling terminals according to an embodiment can control a larger terminal with a smaller terminal utilized as a remote control.
- For the sake of convenience, the smaller terminal controlled directly by the user will be referred to as the user terminal or a mobile terminal, and the larger terminal that receives the position information of the touch means from the user terminal will be referred to as the receiving terminal or a display device. Although the user terminal may preferably have a smaller size compared to the receiving terminal, the sizes of the terminals are not thus limited.
- The terminals used in a system according to an embodiment may be provided with a function for displaying images and a function for wired/wireless communication, but the communication function is not necessarily essential to the invention. Considering real-life applications, however, it may be preferable if each of the terminals is equipped with an image display function and a communication function. The terminal is not limited to a particular type of device, as long as it is capable of displaying images and exchanging signals with another terminal, and various devices, for example such as a smart phone, smart TV, remote control, PC, tablet PC, laptop, touch pad, game console, cloud PC, etc., can be used as the terminal in an embodiment. However, it may be preferable if the smaller terminal is equipped with a touch function.
- A system for linking and controlling terminals according to various embodiments of the present invention will be described below in detail with reference to the accompanying drawings. For convenience, the image shared by the terminals will be referred to as a first image, and the image representing the position information of the touch means will be referred to as a second image (position image).
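The linked operation just outlined, in which the receiving terminal shares the first image and the user terminal reports the touch position back so that a second image (pointer) can be shown, can be sketched as follows. This is a minimal illustration only; the class and method names are assumptions for explanation, not part of the disclosed system.

```python
# Minimal sketch of the linkage described above: the receiving terminal shares
# the first image, and the user terminal transmits the position information of
# the touch means back so a second image can be shown. Names are illustrative.

class ReceivingTerminal:
    def __init__(self, first_image):
        self.first_image = first_image
        self.second_image_pos = None        # where the second image is drawn

    def share_image(self):
        return self.first_image             # send the first image to the user terminal

    def show_position(self, pos):
        self.second_image_pos = pos         # display the second image at pos

class UserTerminal:
    def __init__(self, receiver):
        self.receiver = receiver
        self.first_image = receiver.share_image()   # share the first image

    def on_touch_sensed(self, pos):
        self.receiver.show_position(pos)    # transmit the position information

tv = ReceivingTerminal("first-image")
phone = UserTerminal(tv)
phone.on_touch_sensed((120, 80))
print(tv.second_image_pos)  # (120, 80)
```

In this sketch the "channel" between the terminals is a direct method call; in practice it would be the wired/wireless link described later in the text.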
-
FIGS. 1A through 1D schematically illustrate a system for linking and controlling terminals according to an embodiment. - Referring to
FIG. 1A, a system for linking and controlling terminals according to this embodiment can include a user terminal 100 and a receiving terminal 102, where the terminals can be linked with each other. - The
user terminal 100 may be a terminal that can be directly controlled by the user and can be, for example, a smart phone, remote control, etc., that is capable of sharing an image with another device. Also, the user terminal 100 can be a terminal having a relatively small size and having a touch function, and can be, for example, a mobile terminal. - The receiving
terminal 102 may be a terminal that is not directly manipulated by the user but is linked with the user terminal 100, and can be a display-enabled terminal. That is, the receiving terminal 102 may be any device capable of displaying an image, and from the perspective of displaying an image, can also be referred to as a display device. The receiving terminal 102 can be a device used for a different purpose from that of the user terminal 100, and for example can be a TV for showing broadcast programs such as a drama series. In an embodiment, the receiving terminal 102 may be a terminal having a relatively larger size, although it may not necessarily have a touch function. - The overall size of the receiving
terminal 102 can be larger than the overall size of the user terminal 100, but it may be sufficient if the size of the display unit on the receiving terminal 102 is larger than the size of the display unit on the user terminal 100. In the latter case, the overall sizes of the user terminal 100 and receiving terminal 102 need not be considered. - The
terminals can be linked with each other, and the user can use the user terminal 100 to control the operation of the receiving terminal 102, more specifically the operation of a program displayed on the receiving terminal 102. Thus, the user terminal 100 can be located at a distance that allows near-field communication with the receiving terminal 102, and can be located for example at a distance from which the user can view the receiving terminal 102. Of course, the communication between the terminals is not limited to near-field communication, and the user may, for example, control a receiving terminal 102 inside the home by using a user terminal 100 from outside the home. This is possible because the user terminal 100 shares at least a portion of the first image displayed on the receiving terminal 102. Consequently, the control of the receiving terminal can be freed from the limitation of place. - According to an embodiment, the receiving
terminal 102 can transmit to the user terminal 100 the image data corresponding to at least a portion of the first image 110 displayed on the receiving terminal 102. The user terminal 100 may display a first image 110 corresponding to the transmitted image data. That is, the user terminal 100 and the receiving terminal 102 can share the first image 110, as illustrated in FIG. 1B. - Of course, although the
first image 110 displayed on the user terminal 100 may be substantially the same as the first image 110 displayed on the receiving terminal 102, the contrast or display proportion, etc., may differ according to the properties of the user terminal 100. The user terminal 100 can display the first image 110 as is, without particularly processing the transmitted image data, or the user terminal 100 can convert the resolution, etc., of the transmitted image data and then display the first image 110 corresponding to the converted image data. In another example, a separate device connected with the user terminal 100 can convert the image data transmitted from the receiving terminal 102 to a format suitable for the user terminal 100 and then transmit the converted image data to the user terminal 100. - That is, as long as the receiving
terminal 102 and the user terminal 100 display substantially the same first image 110, the method of processing the image data at the user terminal 100 can be modified in various ways. - According to another embodiment, the
user terminal 100 can transmit the image data corresponding to the shared first image 110 to the receiving terminal 102 and thus share the first image 110. - When the
first image 110 on the receiving terminal 102 is changed, the receiving terminal 102 can transmit image data corresponding to the changed first image 110 to the user terminal 100. Consequently, the user terminal 100 and the receiving terminal 102 can continuously share the first image 110. - Also, when a user selects a particular event-executing entity, for example, from among the
first image 110 displayed on the user terminal 100, the user terminal 100 can transmit the information on the selection of the event-executing entity (the selection information) to the receiving terminal 102. In this case, the receiving terminal 102 can execute a corresponding operation in accordance with the transmitted information, and as a result, the first image 110 can be changed. Of course, image data corresponding to the changed first image 110 may be transmitted to the user terminal 100. That is, the user can use the user terminal 100 to control the operation of the receiving terminal 102, particularly the operation of a program executed by the receiving terminal 102. - The selection information can include position information corresponding to a touch input for selecting an event-executing entity. Alternatively, the selection information can include control information for executing an event-executing entity selected by the touch input. The
user terminal 100 can generate the position information or control information in accordance with the touch input. - The receiving
terminal 102 can execute an operation corresponding to the event-executing entity by using the position information corresponding to the touch input. Alternatively, the receiving terminal 102 can execute the operation corresponding to the event-executing entity by using the control information. - The selection of the event-executing entity can be determined according to the pressure or area by which the touch means touches the
user terminal 100, and the event-executing entity can be selected when the user touches the user terminal 100 with a pressure or area exceeding a preset pressure or area. This will be described later in more detail. - A description will now be provided on various embodiments by which a user can control the operation of the receiving
terminal 102 by using the user terminal 100. - According to an embodiment, the
user terminal 100 can display a third image (a position image) 114 indicating the position information of a touch means that is in a touch-sensitive region of the user terminal 100, for example a touch means that is near the display unit or is touching the display unit with a pressure or area smaller than or equal to the preset pressure or area, as illustrated in FIG. 1C. That is, the position of a touch means can be shown on the user terminal 100, so that the user may control the user terminal 100 conveniently. The touch means is not limited to a particular form and can be a finger, a touch pen, etc. Here, being near may refer to the touch means being positioned within a preset distance from the user terminal 100, including the case of the touch means contacting the user terminal 100. - According to another embodiment, if a touch means is brought near to the
user terminal 100 or is touching with a pressure or area smaller than or equal to a preset pressure or area, a second image 112 such as a pointer, a shadow image, etc., corresponding to the position information of the touch means can be displayed on the receiving terminal 102 together with the first image 110, as illustrated in FIGS. 1B through 1D. The second image 112 can be substantially the same as the third image 114 or can have a different shape or color. - That is, the position of the touch means can be specified on the
user terminal 100 and the receiving terminal 102. As the second image 112 is displayed on the receiving terminal 102, the user can control the operation of the receiving terminal 102 while viewing only the receiving terminal 102 and without looking at the user terminal 100. - In an embodiment, the
second image 112 or the third image 114 corresponding to the position of the touch means can be changed according to the touch pressure or the touch area of the touch means contacting the user terminal 100. For example, if the touch means touches the user terminal 100 with a pressure or an area smaller than or equal to a preset value, then a third image 114 corresponding to the position of the touch means may be shown on the user terminal 100, whereas if the touch means touches the user terminal 100 with a pressure or an area exceeding the preset value, then the third image 114 corresponding to the position of the touch means may not be shown on the user terminal 100 or may be changed to a different shape. This operation can also apply in a similar manner to the second image 112 displayed on the receiving terminal 102. - According to another embodiment, the shape of the
second image 112 or third image 114 can be varied according to the distance between the touch means and the user terminal 100. For example, the image can be shaped as a shadow if the distance between the touch means and the user terminal 100 is within a first distance, and can be shaped as a finger if the distance between the touch means and the user terminal 100 is within a second distance smaller than the first distance. - In various different embodiments, at least one of the position of a touch means near to the
user terminal 100, the position of a touch means touching the user terminal 100 with a touch pressure smaller than or equal to a preset value, and the position of a touch means touching the user terminal 100 with a touch area smaller than or equal to a preset value can be indicated by a second image 112 or a third image 114. - In short, the user can control the operation of the receiving
terminal 102 conveniently by using the user terminal 100. As a second image 112 indicating the position information of the touch means is shown on the receiving terminal 102, the user can use the user terminal 100 to control the receiving terminal 102 while viewing only the receiving terminal 102, without having to look at the user terminal 100. In particular, even when the user does not have a separate control means, the user can control the receiving terminal 102 by using a smart phone, for example, carried by the user. - Since the user can control the
user terminal 100 by using a touch means, not only can the user select an event-executing entity to execute a particular operation, but also the user can input particular characters, etc., and even work on documents on the user terminal 100. Such operations on the user terminal 100 can be reflected immediately on the receiving terminal 102. - Also, since the touch means can be utilized as a mouse for a computer, etc., to perform various operations, the user can control the receiving
terminal 102 with greater convenience. For example, the user can remotely scroll, copy, or search, etc., articles in a portal site on the user terminal 100, which may be reflected on the receiving terminal 102. Of course, the second image 112 indicating the position information of the touch means may be shown on the receiving terminal 102, allowing the user to perform a desired operation by looking at the second image 112 displayed on the larger-sized receiving terminal 102. That is, embodiments can involve showing a second image 112 on the receiving terminal 102 to allow the user to control the receiving terminal 102 conveniently, using the user terminal 100 to implement various functions performed by a remote control, a mouse, a touch pen, etc. Thus, a receiving terminal 102 such as a smart TV, which can perform various functions, can be controlled in a convenient manner by using a user terminal 100 typically carried by the user. - Although the descriptions above refer to the
user terminal 100 as transmitting the image data for the first image 110 and the position information of the touch means separately to the receiving terminal 102, the user terminal 100 can just as well indicate the position of the touch means in the first image 110 and transmit the first image 110, in which the position of the touch means is indicated, to the receiving terminal 102. For example, the user terminal 100 can superimpose the position image for the touch means over the first image 110 and transmit the image data, with the position image of the touch means superimposed, to the receiving terminal 102. - In another example, the
user terminal 100 can modify a region corresponding to the position of the touch means in the first image 110 to a position image for the touch means. That is, the user terminal 100 can modify the first image 110 itself and create a new image to indicate the position of the touch means, and can then transmit the created image to the receiving terminal 102. - According to another embodiment, if the user does not wish to show the
second image 112 during linked operation, the user can select a menu item or input a button manipulation, etc., on the user terminal 100 or on the receiving terminal 102 to stop showing the second image 112 or the third image 114. In this case, the user terminal 100 can continuously transmit the position information of the touch means to the receiving terminal 102 and the receiving terminal 102 can store the transmitted position information, so that the receiving terminal 102 may display the second image 112 corresponding to the position information upon the user's request. - According to yet another embodiment, the linkage and control system can include a user terminal, a receiving terminal, and a linkage server. The linkage server can transmit image data corresponding to the first image to the user terminal and the receiving terminal, and the user terminal can transmit the position information of the touch means to the receiving terminal, so that the second image indicating the position information of the touch means may be shown together with the first image at the receiving terminal. Here, the linkage server can pre-store the resolution information, etc., of the user terminal and the receiving terminal to process the image data based on the stored resolution information, etc., before transmitting it to the user terminal and the receiving terminal.
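The superimposing approach described above, in which the user terminal overlays the position image on the first image before transmitting it, could look roughly like the following sketch. Modelling the images as 2D lists of pixel values, and using None for transparent pixels, are assumptions made purely for illustration; they are not the actual image pipeline.

```python
# Hedged sketch of superimposing a position image (second image) over the
# first image before transmission. Images are 2D lists of pixel values;
# None marks a transparent pixel of the position image.

def superimpose(first_image, position_image, top, left):
    """Return a copy of first_image with position_image overlaid at (top, left)."""
    out = [row[:] for row in first_image]          # do not modify the original
    for dy, row in enumerate(position_image):
        for dx, px in enumerate(row):
            if px is not None:                     # skip transparent pixels
                out[top + dy][left + dx] = px
    return out

frame = [[0] * 4 for _ in range(3)]                # a tiny 3x4 "first image"
pointer = [[9, None], [9, 9]]                      # a tiny pointer shape
print(superimpose(frame, pointer, 1, 1))
# → [[0, 0, 0, 0], [0, 9, 0, 0], [0, 9, 9, 0]]
```

The alternative described in the text, modifying a region of the first image itself, would amount to writing the pointer pixels directly into `first_image` instead of into a copy.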
- Terminals such as smart phones, TVs, etc., use wired communication standards such as High-Definition Multimedia Interface (HDMI), Mobile High-definition Link (MHL), DisplayPort, etc., and wireless communication standards such as Digital Living Network Alliance (DLNA), WiFi, etc., and are each provided not only with data channels for transmitting data but also with a separate channel for exchanging control signals. For example, HDMI uses Consumer Electronics Control (CEC), Display Data Channel (DDC), Utility, and SCL/SDA as control channels; MHL uses CBUS as a control channel; and DisplayPort uses the auxiliary channel as a control channel. Thus, it may not be necessary to establish separate channels for linking the terminals, as the terminals can be linked by using these existing channels. - A description will now be provided of a method for specifying the touch position of the touch means at the receiving
terminal 102. - In an embodiment, the position information of the touch means can be the position information on the first image displayed on the
user terminal 100. Consequently, the position of the touch means on the first image of the user terminal 100 can be reflected in the first image 110 of the receiving terminal 102 as the second image 112. - In another embodiment, the position information of the touch means can be coordinate information that is in accordance with the resolution or screen size of the display unit of the
user terminal 100. That is, the position information of the touch means can be the coordinate information of the touch means with respect to the display unit of the user terminal 100, rather than the coordinate information of the touch means on the image displayed on the screen. - In still another embodiment, when the
user terminal 100 is to show the image data transmitted from the receiving terminal 102 in order that the user terminal 100 and the receiving terminal 102 may share the first image 110, the user terminal 100 can display the first image 110 corresponding to the image data after setting the screen to the same resolution as the receiving terminal 102. Thus, the position of the touch means can be accurately represented by displaying the second image 112 with the position information, e.g. the coordinate information, of the touch means unmodified. - That is, the present invention can employ various methods for specifying the position of the touch means at the receiving
terminal 102. The user terminal 100 can transmit the position information of the touch means to the receiving terminal 102, or generate second image data corresponding to the second image representing the position information and send this second image data to the receiving terminal 102, or transmit the first image to the receiving terminal 102 with the region corresponding to the position of the touch means modified. - The method for linking and controlling terminals according to various embodiments will be described below in more detail with reference to the accompanying drawings.
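As a rough illustration of the coordinate-based variants above, the following sketch assumes the position information is transmitted as coordinates on the user terminal's display unit and is scaled to the receiving terminal's resolution before the second image is drawn. The function name and the example resolutions are hypothetical; in the resolution-matched embodiment described above, the scale factors are simply 1 and the coordinates pass through unmodified.

```python
# Scale a touch coordinate from the user terminal's display resolution to the
# receiving terminal's display resolution. Resolutions are (width, height).

def map_touch_position(x, y, user_res, recv_res):
    ux, uy = user_res
    rx, ry = recv_res
    return round(x * rx / ux), round(y * ry / uy)

# A touch at the centre of a 1080x1920 phone screen lands at the centre of a
# 3840x2160 TV screen.
print(map_touch_position(540, 960, (1080, 1920), (3840, 2160)))  # (1920, 1080)
```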
-
FIG. 2A and FIG. 2B illustrate a method of linking and controlling terminals according to a first embodiment. - Referring to
FIG. 2A, the receiving terminal 102 can transmit image data corresponding to a first image 110 that includes an event-executing entity 200, such as a UI, icon, application program, link, etc., for example, to the user terminal 100. Consequently, the first image 110 displayed on the user terminal 100 can include the event-executing entity 200. - According to an embodiment, when the user touches the event-executing
entity 200 with a touch means, the user terminal 100 can transmit selection information, which notifies that the event-executing entity 200 was selected, to the receiving terminal 102. The selection information can include position information corresponding to a touch input for selecting the event-executing entity 200 or, depending on the touch input, control information for executing the event-executing entity 200. - The receiving
terminal 102 can recognize that the event-executing entity 200 was selected, based on the information corresponding to the touch input, and can execute an operation related to the selection of the event-executing entity 200. Alternatively, the receiving terminal 102 can execute the related operation by using the control information for executing the event-executing entity 200. Thus, the user can control the operation of the receiving terminal 102 by using the user terminal 100. - Here, if the user brings a touch means near the event-executing
entity 200 of the user terminal 100 or touches it with a pressure or an area smaller than or equal to a preset value, then a second image 112 can be shown on the receiving terminal 102 in a position corresponding to the position of the touch means. Of course, a third image 114 indicating the position of the touch means can be shown on the user terminal 100 as well. If the user touches the event-executing entity 200 with the touch means with a pressure greater than the preset pressure or an area greater than the preset area, then the receiving terminal 102 can perform an operation corresponding to the event-executing entity 200, for example by activating a game, etc. In this case, the first image 110 displayed on the receiving terminal 102 can be changed. - This method of linking terminals can be very useful not only for games, navigation, etc., but also for controlling a smart TV by using a
user terminal 100, as illustrated in FIG. 2B. -
FIG. 3 illustrates a method of linking and controlling terminals according to a second embodiment. - Referring to
FIG. 3, it is also possible to transmit to the user terminal 100 only the image 300 corresponding to a UI for control from among the first image 110 displayed on the receiving terminal 102, so that the user terminal 100 can display a corresponding image 302. That is, the user terminal 100 and the receiving terminal 102 can share just a portion of the image, especially a portion including an event-executing entity. In this case, a second image 112 indicating the position of the touch means may be displayed on the receiving terminal 102. Of course, a third image 114 indicating the position of the touch means can also be displayed on the user terminal 100, where the third image 114 can be substantially the same as the second image 112. - In such a linkage system, when the user touches an event-executing entity in the
image 302 displayed on the user terminal 100 with a touch means, the user terminal 100 can transmit the selection information of the event-executing entity to the receiving terminal 102, and the receiving terminal 102 can execute the operation corresponding to the selection information. Generally, the first image 110 on the receiving terminal 102 may be changed. In this case, if the image 300 of the event-executing entity at the receiving terminal 102 is not changed, the user terminal 100 may maintain the image 302, whereas if the image 300 of the event-executing entity at the receiving terminal 102 is changed, an altered image 302 of the event-executing entity may be shown at the user terminal 100. - If a touch means is brought near the display unit of the
user terminal 100, the second image 112 or third image 114 may be shown. In particular, if the touch means is positioned over an event-executing entity, event occurrence information can be outputted, for example in the form of a sound, etc. This will be described later in more detail with reference to FIG. 18. - In short, embodiments can involve using the
user terminal 100 as a means for controlling the receiving terminal 102. This can be particularly efficient for games, electronic commerce, smart TV, etc. -
FIG. 4A and FIG. 4B illustrate a method of linking and controlling terminals according to a third embodiment. - Referring to
FIG. 4A and FIG. 4B, the second image 112a indicating the position of the touch means can be changed to a different shape. Also, the second image 112a can be changed if the touch means is present at a preset position while it is near the user terminal 100 or is touching the user terminal 100 with a pressure or an area smaller than or equal to a preset pressure or area level. - For example, if the touch means is not positioned over an event-executing entity, the
second image 112a can be represented as a shadow image as illustrated in FIG. 4A, but if the touch means is positioned over an event-executing entity, it can be represented by a finger image as illustrated in FIG. 4B. The second image 112 can be changed when the touch means is positioned not only over an icon, but also over an Internet address input window, the bottom of the screen, a search window, a folder, and the like. - In this case, the receiving
terminal 102 can output event occurrence information, such as in the form of sound, light, vibration, etc., according to the event associated with the event-executing entity over which the touch means is positioned. This will be described later in further detail. Changing the second image 112 and outputting the event occurrence information can be performed simultaneously. The method described above can also apply in a similar manner to the third image 114. - According to another embodiment, the
image can be changed according to the touch input of the touch means on the user terminal 100. For example, the image can be displayed in one shape if the touch means touches the user terminal 100 once, but can be changed to an arrow image if the touch means makes a touch twice in a row. In another example, the image can remain unchanged if the touch means touches the user terminal 100 for a duration shorter than or equal to a preset value, and can be changed to a different image if the preset duration is exceeded. - In another example, the
image displayed when the touch means is near the user terminal 100 can be different from the image displayed when the touch means is touching the user terminal 100. - While the
second image 112 and the third image 114 can be changed simultaneously, it is also possible to change just one of them or change them into images that are different from each other. - The operations of the system for linking and controlling terminals according to an embodiment will be described below in more detail with reference to the accompanying drawings.
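The image-selection behavior described in the preceding embodiments can be summarized in a small sketch. The distance thresholds, shape names, and function signature below are hypothetical stand-ins for the preset values and images discussed above, not values from the disclosure.

```python
# Illustrative selection of the pointer image (second/third image) from the
# touch state: distance to the screen, whether the touch means is over an
# event-executing entity, and the number of consecutive touches.

FIRST_DISTANCE = 20    # shown as a shadow within this (farther) distance
SECOND_DISTANCE = 5    # shown as a finger within this (closer) distance

def pointer_shape(distance, over_entity=False, touch_count=0):
    if touch_count >= 2:
        return "arrow"             # e.g. two touches in a row change the image
    if over_entity:
        return "finger"            # over an icon, link, search window, etc.
    if distance <= SECOND_DISTANCE:
        return "finger"
    if distance <= FIRST_DISTANCE:
        return "shadow"
    return None                    # touch means not near: no image shown

print(pointer_shape(10))                    # shadow
print(pointer_shape(3, over_entity=True))   # finger
```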
-
FIG. 5 illustrates a method of linking and controlling terminals according to a fourth embodiment. - Referring to
FIG. 5, a user terminal 100 and a receiving terminal 102 may be connected to begin linked operation (S500). To be more specific, if the user terminal 100 or the receiving terminal 102 requests linkage to the counterpart terminal after the user terminal 100 and the receiving terminal 102 are connected by the user for linked operation, or if the user terminal 100 requests linkage to the receiving terminal 102 and the linkage is accepted, a channel can be formed between the terminals and the terminals can thereby be linked. - According to an embodiment, the
user terminal 100 and the receiving terminal 102 can be connected by two channels, i.e. a data channel and a control channel, and the control signal for requesting or accepting linkage can be exchanged through the control channel. - Next, the receiving
terminal 102 may transmit image data corresponding to at least a portion of a first image that is currently being displayed or about to be displayed to the user terminal 100, and the user terminal 100 may display the first image corresponding to the transmitted image data (S502). That is, the user terminal 100 and the receiving terminal 102 may share at least a portion of the first image. - Then, when a user brings a touch means, such as a finger or a touch pen, etc., near the
user terminal 100 or brings the touch means into contact with the user terminal 100 with a pressure or an area smaller than or equal to a preset value, then the user terminal 100 may sense the touch means (S504). The user terminal 100 can sense the touch means using various methods such as capacitive sensing, electromagnetic sensing, etc. - Next, the
user terminal 100 may transmit the position information of the touch means obtained according to the sensing result to the receiving terminal 102, and the receiving terminal 102 may display a second image, which represents the position information of the touch means thus transmitted, together with the first image 110 (S506). - According to another embodiment, the
user terminal 100 can also transmit image data (or combined image data) that includes data corresponding to the first image and data corresponding to the second image to the receiving terminal 102. However, since this embodiment basically involves the receiving terminal 102 transmitting the image data corresponding to the first image to the user terminal 100, the former method of the user terminal 100 transmitting only the position information to the receiving terminal 102 may be more efficient. - Then, when the user makes a touch with or moves the touch means, the
user terminal 100 may transmit the position information of the touch means to the receiving terminal 102 to display the second image on the receiving terminal 102 (S508). That is, the movement of the touch means can be reflected on the receiving terminal 102. -
FIG. 6 illustrates a method of linking and controlling terminals according to a fifth embodiment. FIGS. 7A through 7C illustrate a control method used for the method of linking and controlling terminals in FIG. 6, and FIGS. 8A through 8E illustrate a linking operation when different sensing levels are set in accordance with an embodiment. - Referring to
FIG. 6, the user terminal 100 or the receiving terminal 102 may request linkage to begin linked operation (S600). - According to an embodiment, the
user terminal 100 and the receiving terminal 102 can be connected by two channels, i.e. a data channel and a control channel, as illustrated in FIG. 7A, and the control signal for requesting or accepting linkage can be exchanged through the control channel. - According to another embodiment, the data can be exchanged through one channel. For example, the transmission periods for the channel can include data periods E1 and E2 for transmitting image data, and a control period C between the data periods E1 and E2 for transmitting the position information or the selection information for an event-executing entity, as illustrated in
FIG. 7C. In particular, the position information or selection information can be transmitted by utilizing the blank periods C in-between the periods for transmitting image data. For example, the user terminal 100 can transmit the image data to the receiving terminal 102 through one channel, and can transmit the position information or selection information to the receiving terminal 102 during the blank periods C existing in-between the data periods E1 and E2 for transmitting image data. - Next, the receiving
terminal 102 may transmit image data corresponding to at least a portion of the first image 110 to the user terminal 100, to thus share the first image 110 with the user terminal 100 (S602). - Then, the
user terminal 100 may sense the touch pressure or the touch area of the touch means (S604). - For example, in sensing the touch pressure or touch area, the
user terminal 100 can set multiple levels, e.g. two levels, for the sensing levels, as illustrated in FIG. 7B. If it is not performing a terminal-linked process, i.e. if the user is using only the user terminal 100, the sensing level for the user terminal 100 can be set to a higher level (L2) such that the touch position is sensed only when there is a light touch made by the user. - Conversely, if the user is performing a terminal-linked process, i.e. if the receiving
terminal 102 is also being used, the user terminal 100 may set the sensing level to a lower level (L1). Thus, the user terminal 100 can sense the touch means even when the touch means is near and not touching or when the touch pressure or touch area is smaller than or equal to a preset pressure or area. In other embodiments, the user terminal 100 can employ various methods other than the capacitance-based method, such as methods based on electromagnetic induction, methods using a resistive overlay, optical methods, ultrasonic methods, etc., and the setting conditions can vary according to the method employed. - Next, the
user terminal 100 may transmit the position information of the touch means obtained according to the sensing result to the receiving terminal 102, and the receiving terminal 102 may display a second image representing the position information, for example a position image 112 indicating the position of the touch means, such as that illustrated in FIG. 1B, together with the first image 110 (S606). - Here, the position information of the touch means can be information in an image form or information in a coordinate form. That is, the
user terminal 100 can generate the position information for the second image 112 directly in the form of image data and transmit it to the receiving terminal 102, or transmit only the position information of the touch means to the receiving terminal 102 in the form of a control signal. Alternatively, the user terminal 100 can generate a position image (second image) for the touch means and transmit image data with the first image and the second image included to the receiving terminal 102, so that the second image 112 can be displayed on the receiving terminal 102 concurrently with the sharing of the first image 110. - For example, the position of the touch means can be displayed on the receiving
terminal 102 if the touch pressure of the touch means is smaller than or equal to a preset value or the touch area is smaller than or equal to a preset value. That is, if the touch means touches the user terminal 100 lightly, the second image 112 can be displayed on the receiving terminal 102. If the touch means moves while touching the user terminal 100 lightly, the second image 112 may reflect the movement of the touch means, and the second image 112 on the receiving terminal 102 may also move continuously (S608). - According to an embodiment, if the touch means touches the
user terminal 100 lightly, theuser terminal 100 may not recognize the touch of the touch means as a touch input. If the touch means touches theuser terminal 100 with a strong pressure, i.e. if the touch pressure exceeds a preset pressure or the touch area exceeds a preset touch area, theuser terminal 100 may recognize the touch of the touch means as a touch input. Thus, if the touch means touches theuser terminal 100 lightly, the icon may not be executed, but if the touch means touches the icon strongly, the icon can be executed. - As described above, the embodiment shown in
FIG. 5 and the embodiment shown in FIG. 6 can be applied together. That is, the second image 112 can be displayed on the receiving terminal 102 when the touch means is positioned within a preset distance from the user terminal 100 and when the touch means lightly touches the user terminal 100. - Referring to FIGS. 8A to 8E, a more detailed description is provided below on setting the sensing levels for the embodiments illustrated in FIG. 5 and FIG. 6. - The
user terminal 100 can be set to have multiple sensing levels. For example, a first level, a second level, and a third level can be set at the user terminal 100: a first level for sensing a nearness of a touch means 804, a second level for sensing the touch means 804 touching with a level equal to or smaller than a preset level (pressure level or area level), and a third level for sensing the touch means 804 touching with a level equal to or greater than a preset level. - A description is provided below of the operations of the user terminal 100 and the receiving terminal 102 when there are multiple levels set as above. - At the receiving terminal 102, the first image 110 can be displayed as illustrated in FIG. 8A. Of course, a first image 110 that is substantially the same can also be displayed at the user terminal 100. The first image 110 can include event-executing entities 800 such as icons, application programs, links, etc., and when an event-executing entity 800 is selected, a racing game, for example, can be executed as illustrated in FIG. 8B. In the descriptions that follow, it will be assumed that the touch position of the touch means 804 corresponds to a particular point on the event-executing entity 800. - First, when the touch means 804 is brought near the display unit 806 of the user terminal 100 as illustrated in FIG. 8C, the second image 112 indicating the position of the touch means 804 can be displayed on the receiving terminal 102 as illustrated in FIG. 8A. In addition, the third image 114 can also be shown on the user terminal 100. - Next, if the touch means 804 touches the
display unit 806 with a level smaller than or equal to a preset level as illustrated in FIG. 8D, the second image 112 indicating the position of the touch means 804 may continue to be displayed. - If the touch means 804 touches the
display unit 806 with a level greater than or equal to the preset level as illustrated in FIG. 8E, the user terminal can recognize this as a selection of the event-executing entity 800 and execute the game. The distal end 810 of the touch means 804 can be structured such that it can be inserted inside, and when the user makes a touch with the touch means 804 with a level greater than or equal to a preset level, the display unit 806 may be pressed with the distal end 810 inserted inside, as illustrated in FIG. 8E. - In short, a linkage and control system based on this embodiment can perform different operations according to sensing levels.
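The three-level behavior walked through in FIGS. 8A to 8E can be sketched in a few lines. This is only an illustrative sketch: the threshold values, level names, and action strings below are assumptions, not part of the described system.

```python
# Sketch of the three sensing levels walked through in FIGS. 8A to 8E.
# Threshold values, level names, and action strings are assumptions.

NEAR_DISTANCE_MM = 10   # first level: touch means near but not touching
LIGHT_PRESSURE = 0.3    # boundary between the second and third levels

def classify(distance_mm, pressure):
    """Map a raw reading to one of the three sensing levels."""
    if pressure > LIGHT_PRESSURE:
        return "strong_touch"   # third level
    if pressure > 0:
        return "light_touch"    # second level
    if distance_mm <= NEAR_DISTANCE_MM:
        return "near"           # first level
    return "none"

def handle(level):
    """Different operations according to sensing level, as in the text."""
    return {
        "near": "display second image 112 on receiving terminal",
        "light_touch": "display second image 112 on receiving terminal",
        "strong_touch": "execute event-executing entity 800",
        "none": "no action",
    }[level]

print(classify(distance_mm=5, pressure=0.0))          # near
print(handle(classify(distance_mm=0, pressure=0.6)))  # execute event-executing entity 800
```

A light touch and a near state map to the same display operation, while only a strong touch triggers execution, mirroring the distinction the text draws between FIG. 8D and FIG. 8E.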
- In another embodiment, if the touch means 804 is brought near the display unit 806, the event-executing entity 800 can be executed.
-
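As noted earlier, the position information sent to the receiving terminal 102 can take either a coordinate form (a control signal) or an image form (image data with the second image already drawn in). The sketch below illustrates the two message shapes; the field names and marker glyph are invented for clarity:

```python
# Illustrative sketch of the two forms the position information can take:
# a coordinate-form control signal, or image-form data in which the second
# image has already been drawn in. The field names are invented.

def position_as_control_signal(x, y):
    return {"type": "position", "x": x, "y": y}

def position_as_image(first_image, x, y, marker="+"):
    # Image form: the user terminal renders the marker itself and sends pixels.
    frame = [row[:] for row in first_image]
    frame[y][x] = marker
    return {"type": "combined_image", "frame": frame}

print(position_as_control_signal(3, 1))  # {'type': 'position', 'x': 3, 'y': 1}
```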
FIG. 9 is a flowchart illustrating a method of linking and controlling terminals according to a sixth embodiment. - Referring to
FIG. 9, the user terminal 100 and the receiving terminal 102 may begin linked operation (S900). - Next, the receiving terminal 102 may transmit image data corresponding to at least a portion of the displayed first image to the user terminal 100, and the user terminal 100 may display a first image corresponding to the image data (S902). That is, the user terminal 100 and the receiving terminal 102 may share the first image. - Then, the user terminal 100 may sense a touch means, such as a finger, a touch pen, etc., through any of a variety of methods (S904). The user terminal 100 can sense the position of a touch means that is near the user terminal 100 or lightly touching the user terminal 100. - Next, the user terminal 100 may generate a combined image, including the currently displayed first image together with the second image corresponding to the sensed position of the touch means, and may transmit combined image data (combination information) corresponding to the combined image to the receiving terminal 102, and the receiving terminal 102 may display the combined image corresponding to the combined image data (S906). Consequently, the second image together with the first image may be displayed on the receiving terminal 102. In this case, the user terminal 100 can display the first image only or display the first image and the second image together. - If the user makes a touch with the touch means or moves while touching, the user terminal 100 may transmit the position information of the touch means to the receiving terminal 102, and the receiving terminal 102 may display the second image, which indicates the position of the touch means in accordance with the position information, together with the corresponding first image (S908). That is, the movement of the touch means may be reflected in the screen of the receiving terminal 102. The first image on the receiving terminal 102 may be the same image as the previous image or may be a different image from the previous image. - In short, this embodiment has the user terminal 100 generate a combined image that includes the first image and the second image and transmit the combined image thus generated to the receiving terminal 102, so that the receiving terminal 102 may consequently display the second image together with the first image. - Although the descriptions above use the expression "combined image," it is possible to modify the first image itself such that the first image indicates the position of the touch means. To be more specific, the user terminal 100 can modify the first image such that a region corresponding to the position of the touch means is changed to a shadow image, etc. That is, the first image itself can be modified to create a new image, after which the image thus created can be transmitted to the receiving terminal 102. Here, the modified first image can be substantially the same as the combined image.
-
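One way to picture the combining step is as writing a marker into a copy of the first image's buffer before transmission. The sketch below is purely illustrative; the 2-D character buffer and marker glyph stand in for real image data:

```python
# Illustrative sketch of generating a combined image: the second image
# (a position marker) is written into a copy of the first image before
# it is sent to the receiving terminal 102.

def combine(first_image, pos, marker="*"):
    """Return a combined image: the first image plus a marker at pos=(row, col)."""
    combined = [row[:] for row in first_image]  # leave the first image intact
    row, col = pos
    if 0 <= row < len(combined) and 0 <= col < len(combined[0]):
        combined[row][col] = marker
    return combined

first = [["."] * 8 for _ in range(4)]
print("\n".join("".join(row) for row in combine(first, (1, 3))))
```

Modifying the first image in place (the "shadow image" variant described above) would amount to skipping the copy step, which is why the result is substantially the same as the combined image.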
FIG. 10 is a flowchart illustrating a method of linking and controlling terminals according to a seventh embodiment. - Referring to
FIG. 10, the user terminal 100 and the receiving terminal 102 may begin linked operation (S1000). - Next, the user or the user terminal 100 may set multiple sensing levels for sensing the touch means (S1002). As described above, various levels can be set for different embodiments. Such settings may be established at the beginning of the linked operation or may be established beforehand in the user terminal 100 prior to linked operation. - Then, the user terminal 100 may sense the touch means, and the receiving terminal 102 may display the second image, which represents the sensed position of the touch means (S1004). - Next, it may be determined whether or not there was a request by the user to stop linked operation (S1006). The stopping of linked operation can be requested by the user by a method such as a menu selection, etc., or can also be requested by turning off the connection between the user terminal 100 and the receiving terminal 102. The user, controlling the user terminal 100 while viewing the receiving terminal 102, may wish to use the user terminal 100 only or may wish to view the receiving terminal 102 only, in which case the user can request a stop of linked operation while the user terminal 100 and the receiving terminal 102 are in a connected state. - If there is no request from the user to stop the linked operation, then step S1004 may be performed again.
- Conversely, if there is a request from the user to stop linked operation, then the
user terminal 100 may initialize the multiple levels such that only one level is available, or change the levels to sensing levels which only sense touches (S1008). That is, the user terminal 100 may change the levels such that the touch means is not sensed if the touch means does not make a touch, and is sensed only when the touch means makes a touch. - In short, a method of linking and controlling terminals according to an embodiment can allow a user to arbitrarily request linked operation and request a stopping of the linked operation and to freely set and change sensing levels.
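The S1002-S1008 handling above amounts to keeping a set of active sensing levels and collapsing it to a touch-only level on a stop request. A minimal sketch, with assumed level names:

```python
# Sketch of the FIG. 10 flow: multiple sensing levels are active while the
# terminals operate in linked mode (S1002), and a stop request collapses
# them to a touch-only level (S1008). Level names are assumptions.

class UserTerminal:
    def __init__(self):
        self.levels = ["touch"]          # default: sense direct touches only

    def set_levels(self, levels):        # S1002: establish multiple levels
        self.levels = list(levels)

    def stop_linked_operation(self):     # S1008: keep only the touch level
        self.levels = ["touch"]

terminal = UserTerminal()
terminal.set_levels(["near", "light_touch", "touch"])
terminal.stop_linked_operation()
print(terminal.levels)  # ['touch']
```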
- According to another embodiment, a preliminary sensing level can be set at the beginning of linked operation between the
user terminal 100 and receiving terminal 102, and the sensing levels can be set differently during linked operation. For example, the user can set different sensing levels during linked operation according to the nearness distance and touch strength of the touch means and can change the sensing level settings to sense the touch means only when it is in contact.
-
FIG. 11 illustrates a system for linking and controlling terminals according to another embodiment. - Referring to
FIG. 11, a system for linking and controlling terminals based on this embodiment can include a user terminal 100, a receiving terminal 102, and a transceiver device 1100 (e.g. a dongle). - The transceiver device 1100 can connect the communications between the user terminal 100 and the receiving terminal 102. To be more specific, when the user terminal 100 transmits the position information of a touch means, the combined image data, or the selection information for an event-executing entity to the transceiver device 1100, the transceiver device 1100 can transmit the position information of the touch means, the combined image data, or the selection information for the event-executing entity to the receiving terminal 102. Also, image data transmitted from the receiving terminal 102 can be transmitted by the transceiver device 1100 to the user terminal 100. - The transceiver device 1100 can be connected to the user terminal 100 or the receiving terminal 102, or can exist separately without being connected to the user terminal 100 and receiving terminal 102. The transceiver device 1100 can be, for example, a dongle, a set-top box, etc. - According to an embodiment, the position information, combined image data, or selection information transmitted from the user terminal 100 can be forwarded by the transceiver device 1100 as is, without modification, to the receiving terminal 102. - According to another embodiment, the transceiver device 1100 may convert the position information, combined image data, or selection information transmitted from the user terminal 100 to a format suitable for the receiving terminal 102 and then transmit the converted position information or combined image data to the receiving terminal 102. Since many companies currently manufacture the receiving terminal 102, in the form of a smart TV, etc., it may be necessary to match the position information, combined image data, or selection information with the format of the receiving terminal 102 according to manufacturer. An embodiment can use the transceiver device 1100 to convert the position information, combined image data, or selection information to fit the format of the receiving terminal 102, so that the terminals can be linked with each other regardless of manufacturer. - Although it is not illustrated in the drawings, the
transceiver device 1100 can include a communication unit, a signal unit, and a format changer unit. - The communication unit may connect the
user terminal 100 and the receiving terminal 102. - The signal unit can transmit to the receiving terminal 102 the position information of the touch means, the combined image data, or the selection information transmitted from the user terminal 100, and can transmit the first image received from the receiving terminal 102 to the user terminal 100. - The format changer unit can modify the position information, combined image data, or selection information to the format of the receiving terminal 102 or modify the first image to the format of the user terminal 100. - In short, the user terminal 100 and the receiving terminal 102 can send or receive the position information, image data, combined image data, or selection information by way of the transceiver device 1100. In this case, the user terminal 100 or the receiving terminal 102 connected to the transceiver device 1100 need not have a communication function. - According to another embodiment, the transceiver device 1100 may not only provide communication between the user terminal 100 and the receiving terminal 102 but may also sense the position of a touch means 1102. - According to yet another embodiment, the transceiver device 1100 may serve as a communication means for the user terminal 100 and can provide a particular communication function, such as WiBro communication, for example, and can also convert a particular communication function into another communication function, such as by converting WiBro to Wi-Fi, for example, for use by the user terminal 100.
-
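Because each manufacturer's receiving terminal 102 may expect its own message format, the transceiver device 1100 can act as a per-manufacturer converter. The sketch below illustrates the idea; the vendor names and output formats are invented examples:

```python
# Illustrative sketch of the transceiver device 1100 (dongle) converting
# position information into the format each receiving terminal expects
# before relaying it. The vendor names and formats are invented examples.

CONVERTERS = {
    "vendor_a": lambda x, y: {"posX": x, "posY": y},
    "vendor_b": lambda x, y: "POS:{},{}".format(x, y),
}

def relay_position(x, y, manufacturer):
    """Convert (x, y) for the given manufacturer, or forward it as is."""
    converter = CONVERTERS.get(manufacturer)
    if converter is None:
        return (x, y)  # no registered conversion: forward unmodified
    return converter(x, y)

print(relay_position(120, 45, "vendor_b"))  # POS:120,45
```

The forward-as-is branch corresponds to the first embodiment above, and the converter table to the second; the same dispatch could cover the combined image data and selection information as well.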
FIG. 12 is a flowchart illustrating a method of linking and controlling terminals according to an eighth embodiment. - Referring to
FIG. 12, a receiver may be installed on the receiving terminal 102 (S1200). In another embodiment, the receiver can be built into the receiving terminal 102. - Next, the user terminal 100 and the receiving terminal 102 may begin linked operation (S1202). - Then, the user terminal 100 may transmit image data corresponding to the first image to the receiving terminal 102 to share the first image 110, or the receiving terminal 102 may transmit the image data to the user terminal 100 to share the first image 110 (S1204). - Next, when a touch means is brought near the user terminal 100 or touches the display unit of the user terminal 100 with a pressure or an area smaller than or equal to a preset pressure or preset area, the receiver may sense the position of the touch means by way of infrared rays and ultrasonic waves emitted from the touch means and transmit the sensed position of the touch means to the receiving terminal 102, and the receiving terminal 102 may display the second image, which represents the position thus obtained by sensing, together with the first image (S1206). - Then, the user terminal 100 may transmit the position information based on the movement of the touch means to the receiving terminal 102, and the receiving terminal 102 can show the movement of the touch means as a second image 112 or a third image different from the second image. - Various methods for sensing a touch means will now be described in more detail with reference to the accompanying drawings.
-
FIG. 13 illustrates a method of sensing a touch means according to a first embodiment. - Referring to
FIG. 13, a touch pen 1300 can be used as the touch means intended for touching the user terminal 100. A capacitance-based touch panel may be used for the user terminal 100. - The touch pen 1300 may be composed of a body 1310 and a touch part 1312. The body 1310 may be made of an electrically non-conducting material, while the touch part 1312 may be a conductor. Thus, because of the touch part 1312, a change in capacitance may occur when the touch pen 1300 is brought near the user terminal 100 or is touching the user terminal 100, and the user terminal 100 can sense the touch pen 1300 based on the change in capacitance.
-
FIG. 14 illustrates a method of sensing a touch means according to a second embodiment. - Referring to
FIG. 14, the user terminal 100 can include a touch panel 1400 and an electromagnetic field generator unit 1402. - The electromagnetic field generator unit 1402 can be connected to a rear surface of the touch panel 1400 and can be made of a thin metal film to generate an electromagnetic field when electricity is applied. - The touch pen 1404 may include a body 1410 and a touch part 1412, where the touch part 1412 can preferably be made of a small metal coil. Consequently, when the touch pen 1404 is brought near the touch panel 1400, electromagnetic induction may occur in the touch part 1412, and as a result, an alteration may occur in the electromagnetic field created by the electromagnetic field generator unit 1402. Thus, the user terminal 100 may recognize the position of the touch pen 1404 by sensing this alteration in the electromagnetic field. In particular, since the alteration of the electromagnetic field would differ according to the nearness and touch strength of the touch pen 1404, this method of sensing the touch means can minutely sense the degree of proximity and the touch pressure of the touch pen 1404 with respect to the touch panel 1400.
-
FIG. 15A and FIG. 15B illustrate a method of sensing a touch means according to a third embodiment. - Referring to
FIG. 15A, a receiver 1500 can be installed on a portion of the user terminal 100, and a touch pen 1502 can be used. - The receiver 1500 can include an infrared sensor and two ultrasonic sensors to sense the movement of the touch pen 1502 by receiving the infrared rays and ultrasonic waves emitted from the touch part (pen tip) of the touch pen 1502, and can transmit the position information of the touch pen 1502 obtained in accordance with the sensing results to the receiving terminal 102. The receiving terminal 102 may display a second image that represents the transmitted position information. Consequently, the second image may be displayed together with the first image. Here, the position information can be transmitted to the receiving terminal 102 by the receiver 1500 or by the user terminal 100. - According to another embodiment, the receiver 1500 can perform not only the function of sensing the position of the touch pen 1502 but also the function of transmitting image data and position information to the receiving terminal 102. To be more specific, the receiver 1500 can include a touch control unit 1510, an image signal unit 1512, a control signal unit 1514, and a transceiver unit 1516, as illustrated in FIG. 15B. - The touch control unit 1510 may serve to sense the position of the touch pen 1502 by using the received infrared rays and ultrasonic waves and provide the user terminal 100 with the information on the sensed position. The user terminal 100 may show the position of the touch pen 1502 or perform a related operation in accordance with the information thus provided. - The image signal unit 1512 can be provided with image data from the user terminal 100 and transmit the image data thus provided to the receiving terminal 102 via the transceiver unit 1516. - The control signal unit 1514 may serve to transmit a control signal, which includes the position information of the touch pen 1502 obtained above by sensing, to the receiving terminal 102. That is, since the receiver 1500 transmits the image data transmitted from the user terminal 100 and the position information of the touch pen 1502 to the receiving terminal 102, the user terminal 100 does not have to include a communication function. Therefore, even with a terminal that does not have a communication function, or a terminal that has a communication function but is unable to use the related communication facilities, it is possible to recognize the position and action of the touch pen 1502 using the receiver 1500 as well as to employ a linkage method according to an embodiment for sharing images, displaying the second image, etc. - In another embodiment, the receiver can be incorporated into the
user terminal 100 to be implemented as a single body. -
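The receiver 1500 described above, with one infrared sensor and two ultrasonic sensors, lends itself to a classic time-of-flight calculation: the infrared pulse marks the emission time, each ultrasonic sensor yields a distance, and the pen position is the intersection of the two distance circles. The sketch below is illustrative only; the sensor spacing and speed of sound are assumed values:

```python
import math

# Sketch of how a receiver like the one in FIG. 15A can locate the touch
# pen: the infrared pulse marks the emission time, each ultrasonic sensor
# measures a time of flight, and the pen lies at the intersection of the
# two distance circles. Sensor spacing and speed of sound are assumptions.

SPEED_OF_SOUND = 343.0  # m/s at roughly room temperature

def pen_position(tof_left, tof_right, spacing=0.10):
    """Sensors at (0, 0) and (spacing, 0); return the pen's (x, y) in metres."""
    r1 = SPEED_OF_SOUND * tof_left
    r2 = SPEED_OF_SOUND * tof_right
    x = (r1 ** 2 - r2 ** 2 + spacing ** 2) / (2 * spacing)
    y_squared = r1 ** 2 - x ** 2
    if y_squared < 0:
        return None  # inconsistent measurements; the circles do not intersect
    return (x, math.sqrt(y_squared))

pos = pen_position(0.000450, 0.000350)  # times of flight in seconds
print(round(pos[0], 3), round(pos[1], 3))  # 0.097 0.12
```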
FIG. 16A and FIG. 16B illustrate a method of sensing a touch means according to a fourth embodiment. - Referring to
FIG. 16A, in a system for linking and controlling terminals according to an embodiment, a receiver 1600 can be installed on or built into the receiving terminal 102 rather than the user terminal 100. - When a touch pen 1602 serving as the touch means for the user terminal 100 emits ultrasonic waves and infrared rays to the receiver 1600 installed on the receiving terminal 102, the receiver 1600 may receive the ultrasonic waves and infrared rays to sense the position of the touch pen 1602. - The receiver 1600 may transmit information regarding the position of the touch pen 1602 thus sensed to the receiving terminal 102, and the receiving terminal 102 may display the second image representing the position of the touch pen 1602 together with the first image. Of course, the user terminal 100 and the receiving terminal 102 may display the second image while sharing the first image. - According to another embodiment, the receiver 1600 can serve not only to sense the position of the touch pen 1602 but also to perform communication. To be more specific, the receiver 1600 can include a touch means sensing unit 1610, an image signal unit 1612, a control signal unit 1614, and a transceiver unit 1616. - The touch means sensing unit 1610 may serve to sense the position of the touch pen 1602. - The image signal unit 1612 may receive the combined image data transmitted from the user terminal 100 by way of the transceiver unit 1616 and transmit the received image data by way of the transceiver unit 1616 to the receiving terminal 102, or may also receive image data from the receiving terminal 102 and transmit the received image data to the user terminal 100. - The control signal unit 1614 can receive a control signal transmitted from the user terminal 100 related to the linked operation, etc., and can transmit the received control signal to the receiving terminal 102. Of course, the position information of the touch pen 1602 need not be transmitted from the user terminal 100 to the receiver 1600. As such, the receiver 1600 may provide not only the function of sensing the position of the touch pen 1602 but also a communication function. Thus, a linking and control method according to an embodiment can be used even when the receiving terminal 102 does not have a communication function.
-
FIG. 17 is a block diagram illustrating the structure of a user terminal according to an embodiment. - Referring to
FIG. 17, the user terminal 100 of this embodiment can include a control unit 1700, a linkage unit 1702, an image unit 1704, a sensing unit 1706, a settings unit 1708, a display unit 1710, a signal unit 1712, a transceiver unit 1714, a decoding unit 1716, and a storage unit 1718. - The linkage unit 1702 may manage all functions related to linkage with the receiving terminal 102. - The image unit 1704 can include an image generator unit and an image changer unit and can display, via the display unit 1710, an image corresponding to the image data transmitted from the receiving terminal 102. - The image generator unit can generate a position image, which may represent the position of a touch means that is positioned within a preset distance from the display unit 1710, generate a combined image, which may include the second image and the first image, or generate a position image of a touch means that is touching the display unit 1710 with a pressure or an area smaller than or equal to a preset value.
- The image changer unit can change the position image according to the distance between the touch means and the
user terminal 100, and can change the position image according to the pressure or area with which the touch means contacts theuser terminal 100. Also, the image changer unit can change the position image according to whether or not the touch means is over an event-executing entity or a preset position. - The sensing unit 1706 may serve to sense a touch means, such as a finger or a touch pen, etc. More specifically, the sensing unit 1706 can sense the position of the touch means, distinguishing when the touch means is near and when it is touching. The method of sensing is not limited to a particular method and can be a capacitance-based method, an electromagnetic induction-based method, etc. The information on the position of the touch means as sensed by the sensing unit 1706 can also be generated by a position information generating unit (not shown).
- The
settings unit 1708 may manage the settings of various functions, such as linkage function settings, sensing level settings, etc. - The display unit 1710 can be implemented in various ways such as by using capacitance-based types, resistive overlay types, electromagnetic induction types, etc. The display unit 1710 can be a touch display unit equipped with a touch function.
- The signal unit 1712 can include an
information transmitting unit 1720 and an information receiving unit 1722. - The information transmitting unit 1720 can transmit the position information of a touch means or the selection information for an event-executing entity to the receiving terminal 102. The selection information can include position information corresponding to a touch input for selecting an event-executing entity or control information for executing an event-executing entity according to a touch input. - The information receiving unit 1722 may receive image data from the receiving terminal 102 corresponding to the entirety or a portion of the first image. - The transceiver unit 1714 may serve as a communication passageway to the receiving terminal 102. - The decoding unit 1716 may decode the image data received from the receiving terminal 102. - The storage unit 1718 may store various data, such as the first image, image signals, position information, control signals, application programs, etc. - The control unit 1700 may control the overall operations of the components of the user terminal 100. - Although it has not been described above, the
user terminal 100 can further include a touch means unit, a receiver unit, an electromagnetic field generator unit, or a touch means operating unit. - The touch means unit may receive and manage information on the position of the touch means when the receiver is connected with the
user terminal 100. That is, the receiver may sense the position of the touch means, while the touch means unit may analyze the signals transmitted from the receiver to detect the position of the touch means. - The receiver unit may receive infrared rays and ultrasonic waves transmitted from the touch means when the touch means is brought near to or in contact with the
user terminal 100, and may analyze the infrared rays and ultrasonic waves thus received to detect the position of the touch means. - The electromagnetic field generator unit may serve to create an electromagnetic field for sensing the touch means by electromagnetic induction, and may preferably be formed on a rear surface of the display unit.
- The touch means operating unit can perform a particular operation when the touch means is brought near the
user terminal 100 or is touching the user terminal 100 with a pressure or an area smaller than or equal to a preset value. For example, a scroll function can be performed if the touch means is brought near a lower part of the display unit 1710 on the user terminal 100. In this case, the position information of the touch means can be transmitted to the receiving terminal 102 or a third image 114 corresponding to the position information can be displayed on the display unit 1710. That is, the user terminal 100 can transmit the position information of the touch means or display a third image 114 on the user terminal 100, in response to the touch means approaching near or touching lightly, to result in a particular operation such as scrolling, etc.
-
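The touch means operating unit's behavior (mapping a near or light touch over a given screen region to a particular operation such as scrolling) can be sketched as a simple dispatch. The screen height, region bound, and action names below are assumptions:

```python
# Illustrative sketch of the touch means operating unit: a near touch over
# certain screen regions triggers a particular operation such as scrolling.
# The screen height, region bound, and action names are assumptions.

SCREEN_HEIGHT = 800

def near_touch_action(y, near=True):
    """Return the operation for a near touch at vertical position y (pixels)."""
    if not near:
        return "ignore"
    if y > SCREEN_HEIGHT * 0.9:  # near the lower part of the display unit
        return "scroll_down"
    return "show_third_image"    # default: just show the third image 114

print(near_touch_action(780))  # scroll_down
print(near_touch_action(400))  # show_third_image
```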
FIG. 18 is a block diagram illustrating the structure of a user terminal according to another embodiment. - Referring to
FIG. 18 , theuser terminal 100 of this embodiment can include acontrol unit 1800, adisplay unit 1802, a sensing unit 1804, a signal unit 1806, an image unit 1808, aninformation provider unit 1810, and amode changer unit 1812. Theuser terminal 100 can include all or just some of the components above. Also, theuser terminal 100 can additionally include components other than the components above. - The
display unit 1802 may display a first image that is shared by theuser terminal 100 and the receivingterminal 102. Also, thedisplay unit 1802 can display a menu, etc., from which to select a touch mode. - The sensing unit 1804 may sense the position of a touch means by way of various methods such as those described above, when the touch means is near or is touching the
user terminal 100. Here, the position information representing the position of the touch means can be the position information of the touch means that is positioned within a preset distance from theuser terminal 100. - The signal unit 1806 may transmit the position information of the touch means obtained by the sensing above to the receiving
terminal 102 and may transmit image data corresponding to the first image shared by theuser terminal 100 and the receivingterminal 102 to the receivingterminal 102. - The image unit 1808 may generate combined image data that is to be shared by the
user terminal 100 and the receivingterminal 102 or the second image that indicates the position of the touch means. - The
information provider unit 1810 can output information according to the sensing results of the sensing unit 1804. That is, nearness information can be outputted if a touch means is brought near theuser terminal 100 and sensed by the sensing unit 1804. The nearness information can be in the form of vibration, sound, or light, so as to stimulate the user's tactile, auditory, or visual senses. - The
information provider unit 1810 can provide the user with a tactile, auditory, or visual sensation in various types according to the state of nearness of the touch means with respect to theuser terminal 100, to allow the user of theuser terminal 100 to perceive various tactile, auditory, or visual sensations. - For example, with the
user terminal 100 having recognized a near touch of the touch means, theinformation provider unit 1810 can provide a continuous vibration, sound, or light during a movement of the touch means. That is, a short vibration can be provided once when a near touch of the touch means is first recognized, after which continuous vibrations can be provided when the touch means moves. In other words, when a near touch of the touch means is first recognized, a vibration can be provided for a first duration, and afterwards when the touch means moves, a vibration can be provided for a second duration. Here, the second duration can be longer than the first duration. - Alternatively, the
information provider unit 1810 can provide a vibration when a near touch is first recognized, and afterwards provide a sound when the touch means is moving. - In another example, the
information provider unit 1810 can provide the user with nearness information in the form of sound, etc., when the touch means is brought near a preset entity such as a folder, control UI, etc., from among the images shown on the screen of theuser terminal 100. That is, as described above with reference toFIGS. 6A and 6B , the position image of the touch means can change when the touch means is placed at a preset position, and at this time, theinformation provider unit 1810 can output nearness information, where the nearness information can correspond to event occurrence information. In this way, the user can perceive the entity immediately. - The
information provider unit 1810 can provide the nearness information for a first duration when the sensing unit 1804 recognizes a nearness state of the touch means, and can provide the nearness information for a second duration when the touch means moves while the sensing unit 1804 is aware of the nearness state of the touch means. The second duration can be a duration corresponding to the duration for which the touch means moves while the sensing unit 1804 is aware of the nearness state of the touch means. - Also, the
information provider unit 1810 can provide the nearness information in different forms for a first case in which the sensing unit 1804 recognizes a nearness state of a touch means and a second case in which the touch means is moved while the sensing unit 1804 is aware of the nearness state of the touch means. As described above, a vibration can be provided for the first case and a sound can be provided for the second case, or vibrations of a first pattern can be provided for the first case and vibrations of a second pattern can be provided for the second case, so as to allow the user to perceive the movement of the touch means. - If the touch means touches the
display unit 1802 after the sensing unit 1804 has recognized a nearness state, the information provider unit 1810 may not provide the nearness information, allowing the user of the user terminal to differentiate between a near state and a direct touch. - After the touch means has touched the display unit 1802, if the sensing unit 1804 recognizes a nearness state of the touch means for a second time, the information provider unit 1810 can provide the nearness information. Once the touch means is brought near the user terminal 100 and touches the user terminal 100, the touch means may be separated from the user terminal 100. That is, since the purpose of the touch has been fulfilled, the touch means may be separated from the user terminal 100 to proceed with the next operation, at which time a nearness state may occur again. In this case, since the touch means entered a nearness state for the first time after touching the display unit 1802 and was not intentionally placed in a nearness state by the user, the information provider unit 1810 may not provide nearness information. Then, when a nearness state occurs for the second time, i.e. when the user intentionally triggers a nearness state, the information provider unit 1810 can provide nearness information. - The
mode changer unit 1812 may change the touch mode according to the user's input, where the touch mode can include a near touch mode 1820 and a direct touch mode 1822. Here, the user input can be made by way of a switch. The switch can be configured as a ring/vibration conversion switch. - In the near touch mode 1820, the position image of a nearby touch means can be displayed on the user terminal 100 or the receiving terminal. Also, the position image of a touch means touching the user terminal 100 with a pressure level or area smaller than or equal to a preset value can be displayed on the user terminal 100 or the receiving terminal. - In the near touch mode, the user terminal 100 may recognize the position of a touch means that is near the user terminal 100 as a touch input, and in the direct touch mode, the user terminal 100 may recognize a touch by the touch means as a touch input. - According to an embodiment, the
mode changer unit 1812 can change the touch mode at the beginning of linked operation between the user terminal 100 and the receiving terminal 102, or change the touch mode according to the user's command, e.g. the user's touch, voice data, or visual data. For example, the user can change the touch mode by making a selection on a menu shown on the user terminal 100, or the touch mode can be changed by the user's voice or a visual sequence such as a motion. - The touch mode change function described above can be provided when the linked operation of the user terminal 100 and the receiving terminal 102 begins, or can be provided in the user terminal 100 regardless of linked operation. Also, the touch mode change function for linked operation can be provided automatically when the linked operation begins, or can be provided after linking when the user selects the function. With a small screen such as that of a smart phone, it can be useful to sense a touch means that is nearby and show a corresponding image on the smart phone, and in such a device it may be advantageous to provide the touch mode change function regardless of linked operation. - According to another embodiment, the user terminal 100 can be provided with a link mode in addition to the near touch mode and direct touch mode. For example, if a user carrying a user terminal 100 such as a smart phone or a tablet PC wishes to link it to a receiving terminal 102, the user can select the link mode. When the user selects the link mode, the user terminal 100 can search for display apparatuses close by, begin linking with a found receiving terminal 102, and display a menu for selecting between the near touch mode and the direct touch mode on the user terminal 100 at the beginning of the linked operation. - Although not described above, the selection of the touch mode change can be achieved by methods other than selecting a menu displayed on the user terminal 100, such as by pressing a button provided on a side surface, a front surface, etc., of a smart phone. - The
control unit 1800 may control the overall operations of the components of the user terminal 100. -
FIG. 19 is a block diagram illustrating the structure of a receiving terminal according to an embodiment. - Referring to
FIG. 19, a receiving terminal 102 according to this embodiment can include a control unit 1900, a linkage unit 1902, a transceiver unit 1904, a signal unit 1906, an image unit 1908, a display unit 1910, and an operation executing unit 1912. - The linkage unit 1902 may manage the function of linking to the
user terminal 100. - The
transceiver unit 1904 may serve as a communication passageway to the user terminal 100. - The signal unit 1906 can include an information transmitting unit 1920 and an information receiving unit 1922. - The information transmitting unit 1920 may transmit image data corresponding to the first image to the user terminal 100. - The information receiving unit 1922 can receive the position information of the touch means or the selection information of the event-executing entity that is transmitted from the user terminal 100. - The image unit 1908 may display the first image through the display unit 1910, and may display the second image corresponding to the position information of the touch means received above, together with the first image. According to another embodiment, the image unit 1908 can combine the first image with the second image and display the combined image, i.e. the result of the combining, through the display unit 1910. - The display unit 1910 is not limited to a particular type as long as it is capable of displaying images, and can be implemented, for example, as an LCD, OLED, PDP, etc. The display unit 1910 does not necessarily require a touch function. - The operation executing unit 1912 can execute the operation corresponding to the selection information of the event-executing entity.
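The image handling described above, in which the image unit 1908 displays a second image (the position image of the touch means) together with, or combined into, the first image, can be sketched as follows. This is a minimal illustration only; the function name, the (row, column) position convention, and the 2-D list used as a frame buffer are all assumptions, not the patented implementation.

```python
# Sketch of the image unit's combining step: overlay a small "second image"
# (the pointer marking the touch-means position) onto the "first image".
# The frame representation and names are illustrative assumptions.

def combine_images(first_image, second_image, position):
    """Return a copy of first_image with second_image drawn at (row, col)."""
    combined = [row[:] for row in first_image]  # copy so the first image is untouched
    top, left = position
    for r, row in enumerate(second_image):
        for c, pixel in enumerate(row):
            rr, cc = top + r, left + c
            # Skip pixels that would fall outside the display area.
            if 0 <= rr < len(combined) and 0 <= cc < len(combined[0]):
                combined[rr][cc] = pixel
    return combined

# Example: a 4x4 first image of zeros, a 2x2 pointer image of ones.
first = [[0] * 4 for _ in range(4)]
pointer = [[1, 1], [1, 1]]
frame = combine_images(first, pointer, (1, 2))
```

A real implementation would of course blend bitmaps rather than lists, but the flow is the same: the combined result, not the bare first image, is what reaches the display unit 1910.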
- The
storage unit 1914 may store various data such as the first image, the second image, a combined image, application programs, etc. - The control unit 1900 may control the overall operations of the components of the receiving
terminal 102. - Components in the embodiments described above can be easily understood from the perspective of processes. That is, each component can also be understood as an individual process. Likewise, processes in the embodiments described above can be easily understood from the perspective of components.
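The receiving terminal's signal handling described above, in which position information of the touch means updates the displayed pointer while selection information of an event-executing entity triggers the operation executing unit 1912, can be sketched as a simple message dispatch. The message format and handler names here are assumptions for illustration only.

```python
# Illustrative sketch of the receiving terminal's signal flow: "position"
# messages update the pointer the image unit would draw over the first image;
# "selection" messages drive the operation executing unit. Names are hypothetical.

class ReceivingTerminal:
    def __init__(self):
        self.pointer_position = None   # last known touch-means position
        self.executed = []             # operations run by the executing unit

    def handle_message(self, message):
        kind = message["type"]
        if kind == "position":
            # Information receiving unit (1922): record the pointer position.
            self.pointer_position = (message["x"], message["y"])
        elif kind == "selection":
            # Operation executing unit (1912): run the selected entity's event.
            self.executed.append(message["entity"])
        else:
            raise ValueError(f"unknown message type: {kind}")

rt = ReceivingTerminal()
rt.handle_message({"type": "position", "x": 120, "y": 80})
rt.handle_message({"type": "selection", "entity": "folder_open"})
```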
- Also, the technical features described above can be implemented in the form of program instructions that may be performed using various computer means and can be recorded in a computer-readable medium. Such a computer-readable medium can include program instructions, data files, data structures, etc., alone or in combination. The program instructions recorded on the medium can be designed and configured specifically for the present invention or can be of a kind known to and used by those skilled in the field of computer software. Examples of a computer-readable medium may include magnetic media such as hard disks, floppy disks, and magnetic tapes, optical media such as CD-ROMs and DVDs, magneto-optical media such as floptical disks, and hardware devices such as ROM, RAM, and flash memory. Examples of program instructions may include not only machine language code produced by a compiler but also high-level language code that can be executed by a computer through the use of an interpreter, etc. The hardware mentioned above can be made to operate as one or more software modules that perform the actions of the embodiments, and vice versa.
- The embodiments described above are disclosed only for illustrative purposes. A person having ordinary skill in the art would be able to make various modifications, alterations, and additions without departing from the spirit and scope of the present invention, but it is to be appreciated that such modifications, alterations, and additions are encompassed by the scope of the claims set forth below.
Claims (9)
1. A user terminal comprising:
a display unit;
a touch panel configured to sense touch pressure; and
a mode changer unit configured to change a touch mode according to a user's input,
wherein the touch mode comprises:
a first touch mode for recognizing a touch of the touch means regardless of the touch pressure or touch area of the touch means; and
a second touch mode for recognizing the touch pressure or touch area of the touch means.
2. The user terminal of claim 1, wherein the user terminal outputs event occurrence information if the touch means is positioned over an event-executing object displayed on the user terminal.
3. The user terminal of claim 2, wherein the event occurrence information is in the form of sound, light, or vibration.
4. A method for controlling a user terminal, the method comprising:
outputting image data;
sensing the touch pressure; and
changing a touch mode according to a user's input,
wherein the touch mode includes:
a first touch mode for recognizing a touch of the touch means regardless of the touch pressure or touch area of the touch means; and
a second touch mode for recognizing the touch pressure or touch area of the touch means.
5. The method of claim 4, wherein the user terminal outputs event occurrence information if the touch means is positioned over an event-executing object displayed on the user terminal.
6. The method of claim 5, wherein the event occurrence information is in the form of sound, light, or vibration.
7. A non-transitory recorded medium readable by a digital processing device, tangibly embodying a program of instructions executable by the digital processing device to control a user terminal, the program of instructions configured to perform a method comprising:
generating image data;
sensing the touch pressure; and
changing a touch mode according to a user's input,
wherein the touch mode comprises:
a first touch mode for recognizing a touch of the touch means regardless of the touch pressure or touch area of the touch means; and
a second touch mode for recognizing the touch pressure or touch area of the touch means.
8. The recorded medium of claim 7, wherein the user terminal outputs event occurrence information if the touch means is positioned over an event-executing object displayed on the user terminal.
9. The recorded medium of claim 8, wherein the event occurrence information is in the form of sound, light, or vibration.
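The two touch modes recited in the claims above, a first mode that recognizes a touch regardless of its pressure or area and a second mode that recognizes the pressure or area itself, might be modeled as in the following sketch. The class name, mode constants, and the pressure convention are illustrative assumptions, not the claimed implementation.

```python
# Sketch of the claimed touch modes. In the first mode any contact counts as
# a touch regardless of pressure/area; in the second mode the measured
# pressure is itself recognized. Names and values are assumptions.

FIRST_MODE = "first"    # recognize a touch regardless of pressure or area
SECOND_MODE = "second"  # recognize the pressure or area of the touch

class TouchModeChanger:
    def __init__(self):
        self.mode = FIRST_MODE

    def change_mode(self, user_input):
        """Change the touch mode according to the user's input."""
        if user_input in (FIRST_MODE, SECOND_MODE):
            self.mode = user_input

    def interpret(self, pressure):
        if self.mode == FIRST_MODE:
            # Any contact is a touch, whatever the pressure or area.
            return {"touched": pressure > 0}
        # Second mode: also report the pressure so callers can act on its level.
        return {"touched": pressure > 0, "pressure": pressure}

changer = TouchModeChanger()
light = changer.interpret(0.1)     # first mode: touch recognized, pressure ignored
changer.change_mode(SECOND_MODE)
measured = changer.interpret(0.1)  # second mode: pressure is part of the result
```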
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/052,803 US20160170703A1 (en) | 2012-03-06 | 2016-02-24 | System and method for linking and controlling terminals |
Applications Claiming Priority (26)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2012-0023012 | 2012-03-06 | ||
KR10-2012-0022984 | 2012-03-06 | ||
KR1020120022986A KR20130101886A (en) | 2012-03-06 | 2012-03-06 | System for interworking and controlling devices and reception device used in the same |
KR10-2012-0022986 | 2012-03-06 | ||
KR1020120022984A KR101951484B1 (en) | 2012-03-06 | 2012-03-06 | System for interworking and controlling devices and reception device used in the same |
KR1020120023012A KR101151549B1 (en) | 2012-03-06 | 2012-03-06 | System for interworking and controlling devices and user device used in the same |
KR10-2012-0022988 | 2012-03-06 | ||
KR1020120022988A KR101212364B1 (en) | 2012-03-06 | 2012-03-06 | System for interworking and controlling devices and user device used in the same |
KR10-2012-0024092 | 2012-03-08 | ||
KR10-2012-0024073 | 2012-03-08 | ||
KR20120024073 | 2012-03-08 | ||
KR20120024092 | 2012-03-08 | ||
KR20120032982 | 2012-03-30 | ||
KR20120033047 | 2012-03-30 | ||
KR10-2012-0033047 | 2012-03-30 | ||
KR10-2012-0032982 | 2012-03-30 | ||
KR20120043148 | 2012-04-25 | ||
KR10-2012-0043148 | 2012-04-25 | ||
KR1020120057998A KR101337665B1 (en) | 2012-03-08 | 2012-05-31 | System for interworking and controlling devices and user device used in the same |
KR10-2012-0057996 | 2012-05-31 | ||
KR1020120058000A KR101384493B1 (en) | 2012-03-08 | 2012-05-31 | System for interworking and controlling devices and user device used in the same |
KR10-2012-0057998 | 2012-05-31 | ||
KR10-2012-0058000 | 2012-05-31 | ||
KR20120057996 | 2012-05-31 | ||
US13/785,370 US20130234959A1 (en) | 2012-03-06 | 2013-03-05 | System and method for linking and controlling terminals |
US15/052,803 US20160170703A1 (en) | 2012-03-06 | 2016-02-24 | System and method for linking and controlling terminals |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/785,370 Continuation US20130234959A1 (en) | 2012-03-06 | 2013-03-05 | System and method for linking and controlling terminals |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160170703A1 true US20160170703A1 (en) | 2016-06-16 |
Family
ID=49113653
Family Applications (7)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/785,302 Abandoned US20130241854A1 (en) | 2012-03-06 | 2013-03-05 | Image sharing system and user terminal for the system |
US13/785,370 Abandoned US20130234959A1 (en) | 2012-03-06 | 2013-03-05 | System and method for linking and controlling terminals |
US13/785,600 Abandoned US20130234984A1 (en) | 2012-03-06 | 2013-03-05 | System for linking and controlling terminals and user terminal used in the same |
US13/785,498 Expired - Fee Related US8913026B2 (en) | 2012-03-06 | 2013-03-05 | System for linking and controlling terminals and user terminal used in the same |
US14/155,204 Abandoned US20140125622A1 (en) | 2012-03-06 | 2014-01-14 | System for linking and controlling terminals and user terminal used in the same |
US15/052,803 Abandoned US20160170703A1 (en) | 2012-03-06 | 2016-02-24 | System and method for linking and controlling terminals |
US16/142,998 Active US10656895B2 (en) | 2012-03-06 | 2018-09-26 | System for linking and controlling terminals and user terminal used in the same |
Family Applications Before (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/785,302 Abandoned US20130241854A1 (en) | 2012-03-06 | 2013-03-05 | Image sharing system and user terminal for the system |
US13/785,370 Abandoned US20130234959A1 (en) | 2012-03-06 | 2013-03-05 | System and method for linking and controlling terminals |
US13/785,600 Abandoned US20130234984A1 (en) | 2012-03-06 | 2013-03-05 | System for linking and controlling terminals and user terminal used in the same |
US13/785,498 Expired - Fee Related US8913026B2 (en) | 2012-03-06 | 2013-03-05 | System for linking and controlling terminals and user terminal used in the same |
US14/155,204 Abandoned US20140125622A1 (en) | 2012-03-06 | 2014-01-14 | System for linking and controlling terminals and user terminal used in the same |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/142,998 Active US10656895B2 (en) | 2012-03-06 | 2018-09-26 | System for linking and controlling terminals and user terminal used in the same |
Country Status (1)
Country | Link |
---|---|
US (7) | US20130241854A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160092152A1 (en) * | 2014-09-25 | 2016-03-31 | Oracle International Corporation | Extended screen experience |
CN108984140A (en) * | 2018-06-28 | 2018-12-11 | 联想(北京)有限公司 | A kind of display control method and system |
Families Citing this family (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130241854A1 (en) | 2012-03-06 | 2013-09-19 | Industry-University Cooperation Foundation Hanyang University | Image sharing system and user terminal for the system |
US20150109257A1 (en) * | 2013-10-23 | 2015-04-23 | Lumi Stream Inc. | Pre-touch pointer for control and data entry in touch-screen devices |
CN104683863A (en) * | 2013-11-28 | 2015-06-03 | 中国移动通信集团公司 | Method and equipment for multimedia data transmission |
KR20150069155A (en) * | 2013-12-13 | 2015-06-23 | 삼성전자주식회사 | Touch indicator display method of electronic apparatus and electronic appparatus thereof |
US20150199030A1 (en) * | 2014-01-10 | 2015-07-16 | Microsoft Corporation | Hover-Sensitive Control Of Secondary Display |
CN104793912A (en) * | 2014-01-22 | 2015-07-22 | 宏碁股份有限公司 | Operation method and operation system |
TWI524259B (en) * | 2014-01-24 | 2016-03-01 | 財團法人工業技術研究院 | System and method for controlling touching panel |
KR102187027B1 (en) * | 2014-06-25 | 2020-12-04 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
TWI533190B (en) * | 2014-09-23 | 2016-05-11 | 緯創資通股份有限公司 | Touch sensing apparatus, touch system and touch sensing method |
WO2016072635A1 (en) | 2014-11-03 | 2016-05-12 | Samsung Electronics Co., Ltd. | User terminal device and method for control thereof and system for providing contents |
US9335862B1 (en) * | 2014-11-14 | 2016-05-10 | International Business Machines Corporation | Virtual multi-device navigation in surface computing system |
US9830030B2 (en) | 2015-05-07 | 2017-11-28 | Industrial Technology Research Institute | Flexible touch panel, touch control device and operating method using the same |
CN105183284B (en) * | 2015-08-27 | 2018-07-20 | 广东欧珀移动通信有限公司 | A kind of method and user terminal for checking short message |
CN105426178B (en) * | 2015-11-02 | 2019-03-08 | Oppo广东移动通信有限公司 | Display system, the display methods of terminal system of terminal |
US20170160866A1 (en) * | 2015-12-08 | 2017-06-08 | Innolux Corporation | Touch display device |
CN105898462A (en) * | 2015-12-11 | 2016-08-24 | 乐视致新电子科技(天津)有限公司 | Method and device capable of operating video display device |
KR101771837B1 (en) * | 2016-02-17 | 2017-08-25 | (주)휴맥스 | Remote controller providing force input in a media system and method of driving the same |
KR101886209B1 (en) * | 2016-04-19 | 2018-08-08 | (주)휴맥스 | Apparatus and method of providing media service |
KR101775829B1 (en) * | 2016-05-16 | 2017-09-06 | (주)휴맥스 | Computer processing device and method for determining coordinate compensation and error for remote control key using user profile information based on force input |
CN108810592A (en) * | 2017-04-28 | 2018-11-13 | 数码士有限公司 | The remote controler and its driving method of strength input are provided in media system |
KR20200055983A (en) * | 2018-11-14 | 2020-05-22 | 삼성전자주식회사 | Method for estimating electromagnatic signal radiated from device and an electronic device thereof |
US11625155B2 (en) * | 2020-03-23 | 2023-04-11 | Ricoh Company, Ltd. | Information processing system, user terminal, method of processing information |
CN111541922B (en) * | 2020-04-15 | 2022-12-13 | 北京小米移动软件有限公司 | Method, device and storage medium for displaying interface input information |
CN114647390B (en) * | 2020-12-21 | 2024-03-26 | 华为技术有限公司 | Enhanced screen sharing method and system and electronic equipment |
US11606456B1 (en) | 2021-10-19 | 2023-03-14 | Motorola Mobility Llc | Electronic devices and corresponding methods utilizing ultra-wideband communication signals for user interface enhancement |
US11907495B2 (en) * | 2021-10-19 | 2024-02-20 | Motorola Mobility Llc | Electronic devices and corresponding methods utilizing ultra-wideband communication signals for user interface enhancement |
US11503358B1 (en) | 2021-10-19 | 2022-11-15 | Motorola Mobility Llc | Electronic devices and corresponding methods utilizing ultra-wideband communication signals for user interface enhancement |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6424338B1 (en) * | 1999-09-30 | 2002-07-23 | Gateway, Inc. | Speed zone touchpad |
US6492979B1 (en) * | 1999-09-07 | 2002-12-10 | Elo Touchsystems, Inc. | Dual sensor touchscreen utilizing projective-capacitive and force touch sensors |
US20050110769A1 (en) * | 2003-11-26 | 2005-05-26 | Dacosta Henry | Systems and methods for adaptive interpretation of input from a touch-sensitive input device |
US20060132456A1 (en) * | 2004-12-21 | 2006-06-22 | Microsoft Corporation | Hard tap |
US20070152976A1 (en) * | 2005-12-30 | 2007-07-05 | Microsoft Corporation | Unintentional touch rejection |
US20080165154A1 (en) * | 2007-01-04 | 2008-07-10 | Samsung Electronics Co., Ltd. | Apparatus and method for controlling touch sensitivity of touch screen panel and touch screen display using the same |
US20080204427A1 (en) * | 2004-08-02 | 2008-08-28 | Koninklijke Philips Electronics, N.V. | Touch Screen with Pressure-Dependent Visual Feedback |
US20080278454A1 (en) * | 2007-05-08 | 2008-11-13 | Samsung Electronics Co. Ltd. | Method for setting touch sensitivity in portable terminal |
US20090167704A1 (en) * | 2007-12-31 | 2009-07-02 | Apple Inc. | Multi-touch display screen with localized tactile feedback |
US20100007612A1 (en) * | 2008-07-09 | 2010-01-14 | Howard Locker | Apparatus, system, and method for automated touchpad adjustments |
US20100149130A1 (en) * | 2008-12-15 | 2010-06-17 | Samsung Electronics Co., Ltd. | Sensitivity-variable touch sensor according to user interface mode of terminal and method for providing the same |
US20110032199A1 (en) * | 2009-08-10 | 2011-02-10 | Samsung Electronics Co., Ltd. | Method and apparatus for controlling touch sensitivity in a portable terminal |
US20110261058A1 (en) * | 2010-04-23 | 2011-10-27 | Tong Luo | Method for user input from the back panel of a handheld computerized device |
US20110304550A1 (en) * | 2010-06-10 | 2011-12-15 | Qualcomm Incorporated | Auto-morphing adaptive user interface device and methods |
Family Cites Families (97)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4903012A (en) * | 1987-01-20 | 1990-02-20 | Alps Electric Co., Ltd. | Coordinate system input device providing registration calibration and a mouse function |
US4814552A (en) * | 1987-12-02 | 1989-03-21 | Xerox Corporation | Ultrasound position input device |
JPH07507886A (en) | 1990-11-01 | 1995-08-31 | ゲイゼル・グラフイツク・システムズ・インコーポレイテツド | Electromagnetic position transducer with active transmitting stylus |
US5120908A (en) * | 1990-11-01 | 1992-06-09 | Gazelle Graphic Systems Inc. | Electromagnetic position transducer |
JP3510318B2 (en) * | 1994-04-28 | 2004-03-29 | 株式会社ワコム | Angle information input device |
US6321158B1 (en) * | 1994-06-24 | 2001-11-20 | Delorme Publishing Company | Integrated routing/mapping information |
US5867146A (en) * | 1996-01-17 | 1999-02-02 | Lg Electronics Inc. | Three dimensional wireless pointing device |
US6008807A (en) * | 1997-07-14 | 1999-12-28 | Microsoft Corporation | Method and system for controlling the display of objects in a slide show presentation |
US6429846B2 (en) | 1998-06-23 | 2002-08-06 | Immersion Corporation | Haptic feedback for touchpads and other touch controls |
EP1032201B1 (en) * | 1999-02-26 | 2005-11-02 | Canon Kabushiki Kaisha | Image display control system and method |
US6411283B1 (en) | 1999-05-20 | 2002-06-25 | Micron Technology, Inc. | Computer touch screen adapted to facilitate selection of features at edge of screen |
US7565680B1 (en) * | 2000-06-30 | 2009-07-21 | Comcast Ip Holdings I, Llc | Advanced set top terminal having a video call feature |
US20020171689A1 (en) | 2001-05-15 | 2002-11-21 | International Business Machines Corporation | Method and system for providing a pre-selection indicator for a graphical user interface (GUI) widget |
TW508526B (en) * | 2001-06-15 | 2002-11-01 | Compal Electronics Inc | Personal digital assistant with a power-saving external image output port |
KR100474724B1 (en) * | 2001-08-04 | 2005-03-08 | 삼성전자주식회사 | Apparatus having touch screen and external display device using method therefor |
AU2002338134A1 (en) * | 2001-12-29 | 2003-07-15 | Xuanming Shi | A touch control display screen with a built-in electromagnet induction layer of septum array grids |
US7109975B2 (en) * | 2002-01-29 | 2006-09-19 | Meta4Hand Inc. | Computer pointer control |
JP3925297B2 (en) | 2002-05-13 | 2007-06-06 | ソニー株式会社 | Video display system and video display control device |
US20030231168A1 (en) | 2002-06-18 | 2003-12-18 | Jory Bell | Component for use as a portable computing device and pointing device in a modular computing system |
KR100480823B1 (en) * | 2002-11-14 | 2005-04-07 | 엘지.필립스 엘시디 주식회사 | touch panel for display device |
US20050071761A1 (en) | 2003-09-25 | 2005-03-31 | Nokia Corporation | User interface on a portable electronic device |
KR100539904B1 (en) | 2004-02-27 | 2005-12-28 | 삼성전자주식회사 | Pointing device in terminal having touch screen and method for using it |
EP1596538A1 (en) * | 2004-05-10 | 2005-11-16 | Sony Ericsson Mobile Communications AB | Method and device for bluetooth pairing |
US20050256923A1 (en) | 2004-05-14 | 2005-11-17 | Citrix Systems, Inc. | Methods and apparatus for displaying application output on devices having constrained system resources |
US7245502B2 (en) * | 2004-06-07 | 2007-07-17 | Broadcom Corporation | Small form factor USB bluetooth dongle |
US20060103871A1 (en) * | 2004-11-16 | 2006-05-18 | Erwin Weinans | Methods, apparatus and computer program products supporting display generation in peripheral devices for communications terminals |
US20060135865A1 (en) * | 2004-11-23 | 2006-06-22 | General Electric Company | Method and apparatus for synching of images using regions of interest mapped by a user |
US7683889B2 (en) * | 2004-12-21 | 2010-03-23 | Microsoft Corporation | Pressure based selection |
US8717301B2 (en) | 2005-08-01 | 2014-05-06 | Sony Corporation | Information processing apparatus and method, and program |
US7835505B2 (en) * | 2005-05-13 | 2010-11-16 | Microsoft Corporation | Phone-to-monitor connection device |
EP1724955A3 (en) * | 2005-05-17 | 2007-01-03 | Samsung Electronics Co.,Ltd. | Method for taking a telephone call while receiving a broadcast service, and digital multimedia broadcasting terminal using this method |
US20080115073A1 (en) * | 2005-05-26 | 2008-05-15 | ERICKSON Shawn | Method and Apparatus for Remote Display of Drawn Content |
US8495514B1 (en) * | 2005-06-02 | 2013-07-23 | Oracle America, Inc. | Transparency assisted window focus and selection |
JPWO2007097253A1 (en) | 2006-02-27 | 2009-07-09 | 京セラ株式会社 | Image information sharing system |
KR101128803B1 (en) | 2006-05-03 | 2012-03-23 | 엘지전자 주식회사 | A mobile communication terminal, and method of processing input signal in a mobile communication terminal with touch panel |
US9063647B2 (en) | 2006-05-12 | 2015-06-23 | Microsoft Technology Licensing, Llc | Multi-touch uses, gestures, and implementation |
KR100816286B1 (en) * | 2006-05-18 | 2008-03-24 | 삼성전자주식회사 | Display apparatus and support method using the portable terminal and the external device |
US8564544B2 (en) | 2006-09-06 | 2013-10-22 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
US20080079757A1 (en) * | 2006-09-29 | 2008-04-03 | Hochmuth Roland M | Display resolution matching or scaling for remotely coupled systems |
US7890863B2 (en) * | 2006-10-04 | 2011-02-15 | Immersion Corporation | Haptic effects with proximity sensing |
US8330773B2 (en) * | 2006-11-21 | 2012-12-11 | Microsoft Corporation | Mobile data and handwriting screen capture and forwarding |
US8665225B2 (en) * | 2007-01-07 | 2014-03-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface for interpreting a finger gesture |
US20080273015A1 (en) * | 2007-05-02 | 2008-11-06 | GIGA BYTE Communications, Inc. | Dual function touch screen module for portable device and opeating method therefor |
JP2009100246A (en) * | 2007-10-17 | 2009-05-07 | Hitachi Ltd | Display device |
JP4533421B2 (en) | 2007-11-22 | 2010-09-01 | シャープ株式会社 | Display device |
US8184096B2 (en) * | 2007-12-04 | 2012-05-22 | Apple Inc. | Cursor transitions |
US20090225043A1 (en) * | 2008-03-05 | 2009-09-10 | Plantronics, Inc. | Touch Feedback With Hover |
KR101513023B1 (en) | 2008-03-25 | 2015-04-22 | 엘지전자 주식회사 | Terminal and method of displaying information therein |
US8525802B2 (en) * | 2008-03-31 | 2013-09-03 | Lg Electronics Inc. | Portable terminal capable of sensing proximity touch and method for providing graphic user interface using the same |
US20090251422A1 (en) * | 2008-04-08 | 2009-10-08 | Honeywell International Inc. | Method and system for enhancing interaction of a virtual keyboard provided through a small touch screen |
KR101513025B1 (en) | 2008-05-08 | 2015-04-17 | 엘지전자 주식회사 | Mobile terminal and method of notifying information therein |
WO2009143294A2 (en) * | 2008-05-20 | 2009-11-26 | Citrix Systems, Inc. | Methods and systems for using external display devices with a mobile computing device |
US8594740B2 (en) * | 2008-06-11 | 2013-11-26 | Pantech Co., Ltd. | Mobile communication terminal and data input method |
US9030418B2 (en) * | 2008-06-24 | 2015-05-12 | Lg Electronics Inc. | Mobile terminal capable of sensing proximity touch |
KR101436608B1 (en) * | 2008-07-28 | 2014-09-01 | 삼성전자 주식회사 | Mobile terminal having touch screen and method for displaying cursor thereof |
KR101474452B1 (en) | 2008-08-04 | 2014-12-19 | 엘지전자 주식회사 | Method for controlling touch input of mobile terminal |
KR20100023981A (en) | 2008-08-23 | 2010-03-05 | 정윤경 | Credit card having contents and drviving method thereof |
KR101256008B1 (en) | 2008-09-05 | 2013-04-18 | 에스케이플래닛 주식회사 | Apparatus and method for providing virtual PC using broadcasting receiver and mobile terminal |
US8750938B2 (en) * | 2008-09-29 | 2014-06-10 | Microsoft Corporation | Glow touch feedback for virtual input devices |
KR20100064840A (en) | 2008-12-05 | 2010-06-15 | 엘지전자 주식회사 | Mobile terminal and operation method thereof |
JP2012515966A (en) * | 2009-01-26 | 2012-07-12 | ズッロ・テクノロジーズ・(2009)・リミテッド | Device and method for monitoring the behavior of an object |
KR20100095951A (en) | 2009-02-23 | 2010-09-01 | 주식회사 팬택 | Portable electronic equipment and control method thereof |
US8339372B2 (en) * | 2009-04-20 | 2012-12-25 | Broadcom Corporation | Inductive touch screen with integrated antenna for use in a communication device and methods for use therewith |
KR101491169B1 (en) | 2009-05-07 | 2015-02-06 | 현대자동차일본기술연구소 | Device and method for controlling AVN of vehicle |
KR101533691B1 (en) * | 2009-06-01 | 2015-07-03 | 삼성전자 주식회사 | System for connecting a device with bluetooth module and method thereof |
WO2011011025A1 (en) * | 2009-07-24 | 2011-01-27 | Research In Motion Limited | Method and apparatus for a touch-sensitive display |
US20130009907A1 (en) * | 2009-07-31 | 2013-01-10 | Rosenberg Ilya D | Magnetic Stylus |
KR20110025520A (en) | 2009-09-04 | 2011-03-10 | Samsung Electronics Co., Ltd. | Apparatus and method for controlling a mobile terminal |
KR20110027117A (en) | 2009-09-09 | 2011-03-16 | Samsung Electronics Co., Ltd. | Electronic apparatus with touch panel and displaying method thereof |
KR101179466B1 (en) | 2009-09-22 | 2012-09-07 | SK Planet Co., Ltd. | Mobile terminal and method for displaying object using approach sensing of touch tool thereof |
KR101651128B1 (en) | 2009-10-05 | 2016-08-25 | LG Electronics Inc. | Mobile terminal and method for controlling application execution thereof |
KR101500008B1 (en) | 2009-11-25 | 2015-03-18 | Hyundai Motor Company | System for connecting car handset unit with external device |
KR20110067559A (en) | 2009-12-14 | 2011-06-22 | Samsung Electronics Co., Ltd. | Display device and control method thereof, display system and control method thereof |
KR101210298B1 (en) | 2009-12-18 | 2012-12-10 | SK Planet Co., Ltd. | User interface method for using touch input and terminal |
US20120287090A1 (en) * | 2009-12-29 | 2012-11-15 | Sanford, L.P. | Interactive Whiteboard with Wireless Remote Control |
US10048725B2 (en) * | 2010-01-26 | 2018-08-14 | Apple Inc. | Video out interface for electronic device |
KR20110100121A (en) | 2010-03-03 | Samsung Electronics Co., Ltd. | Method and apparatus for inputting character in mobile terminal |
US9003334B2 (en) | 2010-03-05 | 2015-04-07 | Adobe Systems Incorporated | Editing content using multiple touch inputs |
JP5847407B2 (en) | 2010-03-16 | 2016-01-20 | Immersion Corporation | System and method for pre-touch and true touch |
KR20110119464A (en) | 2010-04-27 | 2011-11-02 | TouchUI Co., Ltd. | Touch screen device and methods of operating terminal using the same |
CN102860034B (en) * | 2010-04-28 | 2016-05-18 | LG Electronics Inc. | Image display apparatus and method of operating the same |
US8762893B2 (en) * | 2010-05-14 | 2014-06-24 | Google Inc. | Automatic derivation of analogous touch gestures from a user-defined gesture |
KR101099838B1 (en) | 2010-06-02 | 2011-12-27 | 신두일 | Remote A/S method using video phone call between computer and mobile phone |
WO2012020863A1 (en) * | 2010-08-13 | 2012-02-16 | LG Electronics Inc. | Mobile/portable terminal, device for displaying and method for controlling same |
WO2012020868A1 (en) * | 2010-08-13 | 2012-02-16 | LG Electronics Inc. | Mobile terminal, display device and method for controlling same |
JP5510185B2 (en) | 2010-08-20 | 2014-06-04 | Sony Corporation | Information processing apparatus, program, and display control method |
US8358596B2 (en) * | 2010-09-20 | 2013-01-22 | Research In Motion Limited | Communications system providing mobile wireless communications device application module associations for respective wireless communications formats and related methods |
US9851849B2 (en) * | 2010-12-03 | 2017-12-26 | Apple Inc. | Touch device communication |
US8309043B2 (en) | 2010-12-06 | 2012-11-13 | Fmc Corporation | Recovery of Li values from sodium saturate brine |
US8369893B2 (en) * | 2010-12-31 | 2013-02-05 | Motorola Mobility Llc | Method and system for adapting mobile device to accommodate external display |
US20120194440A1 (en) * | 2011-01-31 | 2012-08-02 | Research In Motion Limited | Electronic device and method of controlling same |
US8963858B2 (en) * | 2011-02-28 | 2015-02-24 | Semtech Corporation | Use of resistive touch screen as a proximity sensor |
US9152373B2 (en) * | 2011-04-12 | 2015-10-06 | Apple Inc. | Gesture visualization and sharing between electronic devices and remote displays |
CA2838280C (en) * | 2011-06-15 | 2017-10-10 | Smart Technologies Ulc | Interactive surface with user proximity detection |
KR101286358B1 (en) * | 2011-08-11 | 2013-07-15 | 엘지전자 주식회사 | Display method and apparatus |
US8966131B2 (en) * | 2012-01-06 | 2015-02-24 | Qualcomm Incorporated | System method for bi-directional tunneling via user input back channel (UIBC) for wireless displays |
US20130241854A1 (en) * | 2012-03-06 | 2013-09-19 | Industry-University Cooperation Foundation Hanyang University | Image sharing system and user terminal for the system |
2013
- 2013-03-05 US US13/785,302 patent/US20130241854A1/en not_active Abandoned
- 2013-03-05 US US13/785,370 patent/US20130234959A1/en not_active Abandoned
- 2013-03-05 US US13/785,600 patent/US20130234984A1/en not_active Abandoned
- 2013-03-05 US US13/785,498 patent/US8913026B2/en not_active Expired - Fee Related
2014
- 2014-01-14 US US14/155,204 patent/US20140125622A1/en not_active Abandoned
2016
- 2016-02-24 US US15/052,803 patent/US20160170703A1/en not_active Abandoned
2018
- 2018-09-26 US US16/142,998 patent/US10656895B2/en active Active
Patent Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6492979B1 (en) * | 1999-09-07 | 2002-12-10 | Elo Touchsystems, Inc. | Dual sensor touchscreen utilizing projective-capacitive and force touch sensors |
US6424338B1 (en) * | 1999-09-30 | 2002-07-23 | Gateway, Inc. | Speed zone touchpad |
US20050110769A1 (en) * | 2003-11-26 | 2005-05-26 | Dacosta Henry | Systems and methods for adaptive interpretation of input from a touch-sensitive input device |
US20080204427A1 (en) * | 2004-08-02 | 2008-08-28 | Koninklijke Philips Electronics, N.V. | Touch Screen with Pressure-Dependent Visual Feedback |
US20060132456A1 (en) * | 2004-12-21 | 2006-06-22 | Microsoft Corporation | Hard tap |
US20070152976A1 (en) * | 2005-12-30 | 2007-07-05 | Microsoft Corporation | Unintentional touch rejection |
US20080165154A1 (en) * | 2007-01-04 | 2008-07-10 | Samsung Electronics Co., Ltd. | Apparatus and method for controlling touch sensitivity of touch screen panel and touch screen display using the same |
US20080278454A1 (en) * | 2007-05-08 | 2008-11-13 | Samsung Electronics Co. Ltd. | Method for setting touch sensitivity in portable terminal |
US20090167704A1 (en) * | 2007-12-31 | 2009-07-02 | Apple Inc. | Multi-touch display screen with localized tactile feedback |
US20100007612A1 (en) * | 2008-07-09 | 2010-01-14 | Howard Locker | Apparatus, system, and method for automated touchpad adjustments |
US20100149130A1 (en) * | 2008-12-15 | 2010-06-17 | Samsung Electronics Co., Ltd. | Sensitivity-variable touch sensor according to user interface mode of terminal and method for providing the same |
US20110032199A1 (en) * | 2009-08-10 | 2011-02-10 | Samsung Electronics Co., Ltd. | Method and apparatus for controlling touch sensitivity in a portable terminal |
US20110261058A1 (en) * | 2010-04-23 | 2011-10-27 | Tong Luo | Method for user input from the back panel of a handheld computerized device |
US20110304550A1 (en) * | 2010-06-10 | 2011-12-15 | Qualcomm Incorporated | Auto-morphing adaptive user interface device and methods |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160092152A1 (en) * | 2014-09-25 | 2016-03-31 | Oracle International Corporation | Extended screen experience |
CN108984140A (en) * | 2018-06-28 | 2018-12-11 | 联想(北京)有限公司 | A kind of display control method and system |
Also Published As
Publication number | Publication date |
---|---|
US20130241854A1 (en) | 2013-09-19 |
US8913026B2 (en) | 2014-12-16 |
US10656895B2 (en) | 2020-05-19 |
US20140125622A1 (en) | 2014-05-08 |
US20130234959A1 (en) | 2013-09-12 |
US20130234984A1 (en) | 2013-09-12 |
US20130234983A1 (en) | 2013-09-12 |
US20190026059A1 (en) | 2019-01-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20160170703A1 (en) | System and method for linking and controlling terminals | |
US8839137B2 (en) | Information processing device, table, display control method, program, portable terminal, and information processing system | |
US9632642B2 (en) | Terminal apparatus and associated methodology for automated scroll based on moving speed | |
CN105612759B (en) | Display device and control method thereof | |
JP5871080B2 (en) | Image display device, portable terminal device, information processing system, image display device control method, and program | |
US9948979B2 (en) | User terminal and control method thereof | |
US10386932B2 (en) | Display apparatus and control method thereof | |
US20170147129A1 (en) | User terminal device and method for controlling same | |
KR102046181B1 (en) | System and method for interworking and controlling devices | |
US20130244730A1 (en) | User terminal capable of sharing image and method for controlling the same | |
US20230280837A1 (en) | Interaction method, display device, and non-transitory storage medium | |
KR101212364B1 (en) | System for interworking and controlling devices and user device used in the same | |
KR101151549B1 (en) | System for interworking and controlling devices and user device used in the same | |
US20170285767A1 (en) | Display device and display method | |
KR101555830B1 (en) | Set-top box system and Method for providing set-top box remote controller functions | |
KR102602034B1 (en) | display device | |
KR101515912B1 (en) | User Terminal Capable of Sharing Image and Method for Controlling the Same | |
KR101951484B1 (en) | System for interworking and controlling devices and reception device used in the same | |
JP6368263B2 (en) | Information processing system, information processing apparatus, and remote operation support method | |
TWI517686B (en) | A coordinate controlling system for wireless communication device with touch sensing and digital television | |
KR20130129684A (en) | System for interworking and controlling devices and user device used in the same | |
KR20130101886A (en) | System for interworking and controlling devices and reception device used in the same | |
KR20130129693A (en) | System for interworking and controlling devices and user device used in the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |