CN201181467Y - Handheld mobile communication device - Google Patents

Handheld mobile communication device

Info

Publication number
CN201181467Y
CN201181467Y (application CN200720194296U)
Authority
CN
China
Prior art keywords
touch
finger
gesture
scrolling
display screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
CN 200720194296
Other languages
Chinese (zh)
Inventor
格雷格·克里斯蒂 (Greg Christie)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Computer Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Computer Inc
Application granted
Publication of CN201181467Y
Anticipated expiration
Current legal status: Expired - Lifetime

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00: Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048: Indexing scheme relating to G06F 3/048
    • G06F 2203/04808: Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen

Abstract

The utility model relates to a handheld mobile communication device, and provides systems and software for implementing gesture operations by means of a touch sensitive device (such as a touch sensitive display) for managing and editing media files on a computing device or system. More particularly, gestural inputs of a human hand over a touch/proximity sensitive device can be used to control, edit, and manipulate files, such as media files including but not limited to graphic files, photo files, and video files.

Description

Handheld mobile communication device
Technical field
The utility model relates to a system for managing, manipulating, and editing media objects, such as graphical objects on a display, by using hand gestures on a touch sensitive device.
Background art
There exist today many types of input devices for performing operations in a computer system. The operations generally correspond to moving a cursor and making selections on a display screen. The operations may also include paging, scrolling, panning, zooming, and the like. By way of example, the input devices may include buttons, switches, keyboards, mice, trackballs, touch pads, joysticks, touch screens, and so forth. Each of these devices has advantages and disadvantages that must be taken into account when designing a computer system.
Buttons and switches are generally mechanical in nature and provide limited control with regard to the movement of a cursor and the making of selections. For example, they are generally dedicated to moving the cursor in a specific direction (e.g., arrow keys) or to making specific selections (e.g., enter, delete, number, etc.).
When using a mouse device, the movement of the input pointer on a display generally corresponds to the relative movements of the mouse as the user moves the mouse along a surface. When using a trackball device, the movement of the input pointer on the display generally corresponds to the relative movements of the trackball as the user rotates the ball within a housing. Mouse and trackball devices typically also include one or more buttons for making selections. A mouse device may also include a scroll wheel that allows a user to scroll the displayed content by rolling the wheel forward or backward.
With a touch pad device, such as a touch pad on a personal portable computer, the movement of the input pointer on a display generally corresponds to the relative movements of the user's finger (or stylus) as the finger is moved along a surface of the touch pad. Touch screens, on the other hand, are a type of display screen that typically includes a touch-sensitive transparent panel (or "skin") overlaying the display screen. When using a touch screen, a user typically makes a selection on the display screen by pointing directly at objects (such as GUI objects) displayed on the screen, usually with a finger or stylus.
To provide additional functionality, gestures have been implemented with some of these input devices. By way of example, on a touch pad, selections may be made when one or more taps are detected on the surface of the touch pad. In some cases, any portion of the touch pad may be tapped. In addition to selections, scrolling may be initiated by using finger motion at the edge of the touch pad.
U.S. Pat. Nos. 5,612,719 and 5,590,219, assigned to Apple Computer, Inc., describe some other uses of gesturing. U.S. Pat. No. 5,612,719 discloses an onscreen button that is responsive to at least two different button gestures made on the screen on or near the button. U.S. Pat. No. 5,590,219 discloses a method for recognizing an ellipse-type gesture input on a display screen of a computer system.
In recent times, more advanced gestures have been implemented. For example, scrolling may be initiated by placing four fingers on the touch pad so that the scrolling gesture is recognized, and thereafter moving these fingers on the touch pad to perform scrolling events. The methods for implementing these advanced gestures, however, can be limited and in many instances counterintuitive. In certain applications, especially applications involving managing or editing media files using a computer system, using gestures on a touch screen can allow a user to effect the intended operations more efficiently and more accurately.
Based on the above, there is a need for improvements that allow gestures to be performed on touch sensitive devices, especially with respect to managing and editing media files.
Summary of the utility model
The present application relates to systems and software for implementing gestures with touch sensitive devices (such as a touch sensitive display) for managing and editing media files on a computer system. Specifically, gestural inputs of a human hand over a touch/proximity sensitive device can be used to control, edit, and manipulate files, such as media files including but not limited to photo files and video files.
In accordance with one embodiment, gestural inputs over a touch sensitive computer desktop application display are used to effect conventional mouse/trackball actions, such as targeting, selecting, right clicking, scrolling, and the like.
In accordance with another embodiment, gestural inputs over a touch sensitive display are used to effect editing commands for editing image files, such as photo files. The gestural inputs can be recognized via user interface (UI) elements, such as a slide bar. The gestural inputs by way of a UI element can be varied by changing the number of touchdown points on the UI element.
In accordance with another embodiment, gestural inputs invoke the activation of a UI element, after which gestural interactions with the invoked UI element can effect further functions.
According to one aspect of the utility model, a handheld mobile communication device is provided, characterized by: a touch sensitive display screen; means for causing the display screen to display a portion of a media file, the media file comprising at least one of a text item and an image item; means for detecting a touch scroll input on the surface of the display screen, the touch scroll input comprising a touchdown point of a human finger on the surface of the display screen, the touchdown point corresponding to a location at which the portion of the media file is displayed on the display screen; means for detecting a drag movement of the touchdown point of the human finger on the display screen, the drag movement extending across a part of the displayed portion of the media file and comprising both a vertical and a horizontal vector component; means for determining that the drag movement of the finger touchdown point indicates a scrolling action; and means for causing the media file to scroll on the display screen, wherein the scrolling is restricted to one of the vertical and horizontal directions.
According to another aspect of the utility model, a handheld mobile communication device is provided, characterized by: a touch sensitive display screen; means for causing the display screen to display a portion of a media file, the media file comprising at least one of a text item and an image item; means for detecting a touch scroll input on the surface of the display screen, the touch scroll input comprising a touchdown point of a human finger on the surface of the display screen, the touchdown point corresponding to a location at which the portion of the media file is displayed on the display screen; means for detecting a drag movement of the touchdown point of the human finger on the display screen, the drag movement extending across a part of the displayed portion of the media file; means for detecting a direction of the drag movement of the touchdown point of the human finger, wherein the direction of the drag movement comprises a vertical vector component and a horizontal vector component; and means for causing the media file to scroll on the display screen in accordance with the detected direction of the drag movement.
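To make the two claimed scrolling behaviors concrete, the following minimal sketch (illustrative only, not part of the utility model; all function names are hypothetical) reduces a drag with vertical and horizontal vector components either to an axis-restricted scroll (first aspect) or to a direction-following scroll (second aspect):

```python
# Minimal sketch of the two claimed scroll behaviors; names are hypothetical.

def axis_locked_scroll(dx: float, dy: float) -> tuple[float, float]:
    """First aspect: restrict scrolling to the dominant axis of the drag."""
    if abs(dy) >= abs(dx):
        return (0.0, dy)   # vertical scroll only
    return (dx, 0.0)       # horizontal scroll only

def directional_scroll(dx: float, dy: float) -> tuple[float, float]:
    """Second aspect: scroll along the detected drag direction itself."""
    return (dx, dy)

if __name__ == "__main__":
    drag = (12.0, -48.0)               # horizontal and vertical components
    print(axis_locked_scroll(*drag))   # (0.0, -48.0)
    print(directional_scroll(*drag))   # (12.0, -48.0)
```

Locking to the dominant axis is one plausible reading of "restricted to one of the vertical and horizontal directions"; the claim itself does not specify how the axis is chosen.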
Description of drawings
Fig. 1 is a block diagram of a computer system in accordance with one exemplary embodiment of the utility model;
Fig. 2 illustrates a computer system in accordance with another exemplary embodiment of the utility model;
Fig. 3 illustrates a multipoint processing method in accordance with one exemplary embodiment of the utility model;
Figs. 4A and 4B illustrate detected touch images in accordance with one embodiment of the utility model;
Fig. 5 illustrates a group of features in accordance with one embodiment of the utility model;
Fig. 6 illustrates a parameter calculation method in accordance with one embodiment of the utility model;
Figs. 7A-7E and 7I-7K illustrate various gestures for performing targeting and/or selection tasks in accordance with one embodiment of the utility model;
Figs. 7F-7H show flow charts of methods for recognizing and implementing the gestural inputs of Figs. 7A-7E;
Figs. 8A-8G illustrate a rotate gesture in accordance with one embodiment of the utility model;
Fig. 9 shows a touch-based method in accordance with one embodiment of the utility model;
Fig. 10 shows a touch-based method in accordance with one embodiment of the utility model;
Fig. 11 shows a touch-based method in accordance with one embodiment of the utility model;
Fig. 12 shows a zoom gesture method in accordance with one embodiment of the utility model;
Figs. 13A-13H illustrate a zooming sequence in accordance with one embodiment of the utility model;
Fig. 14 shows a pan method in accordance with one embodiment of the utility model;
Figs. 15A-15D illustrate a panning sequence in accordance with one embodiment of the utility model;
Fig. 16 shows a rotate method in accordance with one embodiment of the utility model;
Figs. 17A-17C illustrate a rotating sequence in accordance with one embodiment of the utility model;
Figs. 17D and 17E illustrate a method for rotating a selectable target in accordance with one embodiment of the utility model;
Figs. 18A and 18B illustrate gestural inputs for editing a photo file in accordance with one embodiment of the utility model;
Fig. 18C shows a flow chart of a method for recognizing and implementing the gestural inputs of Figs. 18A and 18B;
Figs. 18D and 18E illustrate gestural inputs for zooming into and out of a photo file within a photo application in accordance with one embodiment of the utility model;
Figs. 19A-19D illustrate gestural inputs for scrolling through playback of sequential files in accordance with one embodiment of the utility model;
Figs. 19E and 19F illustrate gestural inputs for scrolling through playback of photo files on a digital camera display in accordance with one embodiment of the utility model;
Fig. 19G illustrates a gestural input for marking or deleting a photo file during playback in accordance with one embodiment of the utility model;
Fig. 19H illustrates another gestural input for marking or deleting a photo file during playback in accordance with another embodiment of the utility model;
Fig. 20 is an overview flow chart of a method for implementing the operations shown in Figs. 18A-19F in accordance with one embodiment of the utility model;
Figs. 21A-21D illustrate gestural inputs for controlling and/or editing video using a video application in accordance with one embodiment of the utility model;
Figs. 22A and 22B are flow charts of methods for implementing the gestural inputs of Figs. 21A-21D; and
Fig. 23 illustrates gestural inputs for controlling and/or editing audio using an audio application in accordance with one embodiment of the utility model.
Embodiment
In the following description of preferred embodiments, reference is made to the accompanying drawings, which form a part hereof and in which are shown, by way of illustration, specific embodiments in which the utility model may be practiced. It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the preferred embodiments of the utility model.
Fig. 1 is a block diagram of an exemplary computer system 50 in accordance with one embodiment of the utility model. The computer system 50 may be a personal computer system, such as a desktop, laptop, tablet, or handheld computer. The computer system may also be a computing device, such as a cell phone, PDA, dedicated media player, consumer electronic device, and the like.
The exemplary computer system 50 shown in Fig. 1 may include a processor 56 configured to execute instructions and to carry out operations associated with the computer system 50. For example, using instructions retrieved from memory, the processor 56 may control the reception and manipulation of input and output data between components of the computer system 50. The processor 56 can be implemented on a single chip, on multiple chips, or with multiple electrical components. For example, various architectures can be used for the processor 56, including a dedicated or embedded processor, a single purpose processor, a controller, an ASIC, and so forth.
In most cases, the processor 56, together with an operating system, operates to execute computer code and to produce and use data. Operating systems are generally well known and will not be described in greater detail. By way of example, the operating system may correspond to OS/2, DOS, Unix, Palm OS, and the like. The operating system can also be a special purpose operating system, such as may be used for limited purpose appliance-type computing devices. The operating system, other computer code, and data may reside within a memory block 58 that is operatively coupled to the processor 56. The memory block 58 generally provides a place to store computer code and data that are used by the computer system 50. By way of example, the memory block 58 may include read-only memory (ROM), random-access memory (RAM), a hard disk drive, and the like. The information could also reside on a removable storage medium and be loaded or installed onto the computer system 50 when needed. Removable storage media include, for example, CD-ROM, PC-CARD, memory card, floppy disk, magnetic tape, and a network component.
The computer system 50 may also include a display device 68 that is operatively coupled to the processor 56. The display device 68 may be a liquid crystal display (LCD) (e.g., active matrix, passive matrix, and the like). Alternatively, the display device 68 may be a monitor, such as a monochrome display, a color graphics adapter (CGA) display, an enhanced graphics adapter (EGA) display, a variable graphics array (VGA) display, a super VGA display, a cathode ray tube (CRT), and the like. The display device may also be a plasma display or a display implemented with electronic ink.
The display device 68 may generally be configured to display a graphical user interface (GUI) 69 that provides an easy to use interface between a user of the computer system and the operating system or application running thereon. Generally speaking, the GUI 69 represents programs, files, and operational options with graphical images, objects, or vector representations. The graphical images may include windows, fields, dialog boxes, menus, icons, buttons, scroll bars, and the like. Such images may be arranged in predefined layouts, or may be created dynamically to serve the specific actions being taken by a user. During operation, the user can select and/or activate various graphical images in order to initiate functions and tasks associated therewith. By way of example, a user may select a button that opens, closes, minimizes, or maximizes a window, or an icon that launches a particular program. The GUI 69 can additionally or alternatively display information for the user on the display device 68, such as non-interactive text and graphics.
The computer system 50 may also include an input device 70 that is operatively coupled to the processor 56. The input device 70 may be configured to transfer data from the outside world into the computer system 50. The input device 70 may, for example, be used to perform tracking and to make selections with respect to the GUI 69 on the display 68. The input device 70 may also be used to issue commands in the computer system 50. The input device 70 may include a touch sensing device configured to receive input from a user's touch and to send this information to the processor 56. By way of example, the touch sensing device may be a touch pad or a touch screen. In many cases, the touch sensing device recognizes touches, as well as the position and magnitude of touches on a touch sensitive surface. The touch sensing device detects and reports the touches to the processor 56, and the processor 56 interprets the touches in accordance with its programming. For example, the processor 56 may initiate a task in accordance with a particular touch. A dedicated processor can be used to process touches locally and to reduce demand on the main processor of the computer system.
The touch sensing device may be based on sensing technologies including but not limited to capacitive sensing, resistive sensing, surface acoustic wave sensing, pressure sensing, optical sensing, and the like. Furthermore, the touch sensing device may be based on single point sensing or multipoint sensing. Single point sensing is capable of distinguishing only a single touch, while multipoint sensing is capable of distinguishing multiple touches that occur at the same time.
As discussed above, the input device 70 may be a touch screen that is positioned over or in front of the display 68 and integrated with the display device 68, or it may be a separate component, such as a touch pad.
The computer system 50 also preferably includes capabilities for coupling to one or more I/O devices 80. By way of example, the I/O devices 80 may correspond to keyboards, printers, scanners, cameras, microphones, speakers, and the like. The I/O devices 80 may be integrated with the computer system 50, or they may be separate components (e.g., peripheral devices). In some cases, the I/O devices 80 may be connected to the computer system 50 through wired connections (e.g., cables/ports). In other cases, the I/O devices 80 may be connected to the computer system 50 through wireless connections. By way of example, the data link may correspond to PS/2, USB, IR, FireWire, RF, Bluetooth, and the like.
In accordance with one embodiment of the utility model, the computer system 50 is designed to recognize gestures 85 applied to the input device 70 and to control aspects of the computer system 50 based on the gestures 85. In some cases, a gesture may be defined as a stylized interaction with an input device that is mapped to one or more specific computing operations. The gestures 85 may be made through various hand motions and, more particularly, finger motions. Additionally or alternatively, the gestures may be made with a stylus. In all of these cases, the input device 70 receives the gestures 85, and the processor 56 executes instructions to carry out operations associated with the gestures 85. In addition, the memory block 58 may include a gesture operational program 88, which may be part of the operating system or a separate application. The gesture operational program 88 generally includes a set of instructions that recognizes the occurrence of gestures 85 and informs one or more software agents of the gestures 85 and/or of what action(s) to take in response to the gestures 85. Additional details regarding the various gestures that can be used as input commands are discussed further below.
In accordance with the preferred embodiment, upon the user performing one or more gestures, the input device 70 relays gesture information to the processor 56. Using instructions from the memory 58 and, more particularly, the gesture operational program 88, the processor 56 interprets the gestures 85 and controls different components of the computer system 50, such as the memory 58, the display 68, and the I/O devices 80, based on the gestures 85. The gestures 85 may be identified as commands for performing actions in applications stored in the memory 58, modifying image objects shown on the display 68, modifying data stored in the memory 58, and/or performing actions in the I/O devices 80.
It should be noted that, although Fig. 1 illustrates the input device 70 and the display 68 as two separate boxes for illustration purposes, the two may be realized on a single device.
Fig. 2 illustrates an exemplary computing system 10 that uses a multi-touch panel 24 as an input device for gestures; the multi-touch panel 24 can at the same time be a display panel. The computing system 10 can include one or more multi-touch panel processors 12 dedicated to a multi-touch subsystem 27. Alternatively, the multi-touch panel processor functionality can be implemented by dedicated logic, such as a state machine. Peripherals 11 can include, but are not limited to, random access memory (RAM) or other types of memory or storage, watchdog timers, and the like. The multi-touch subsystem 27 can include, but is not limited to, one or more analog channels 17, channel scan logic 18, and driver logic 19. The channel scan logic 18 can access RAM 16, autonomously read data from the analog channels, and provide control for the analog channels. This control can include multiplexing columns of the multi-touch panel 24 to the analog channels 17. In addition, the channel scan logic 18 can control the driver logic and the stimulation signals being selectively applied to rows of the multi-touch panel 24. In some embodiments, the multi-touch subsystem 27, the multi-touch panel processor 12, and the peripherals 11 can be integrated into a single application specific integrated circuit (ASIC).
The driver logic 19 can provide multiple multi-touch subsystem outputs 20 and can present a proprietary interface that drives a high voltage driver, which preferably includes a decoder 21 and a subsequent level shifter and driver stage 22, although the level shifting functions could be performed before the decoder functions. The level shifter and driver 22 can provide level shifting from a low voltage level (e.g., CMOS levels) to a higher voltage level, providing a better signal-to-noise (S/N) ratio for noise reduction purposes. The decoder 21 can decode the drive interface signals to one out of N outputs, where N is the maximum number of rows in the panel. The decoder 21 can be used to reduce the number of drive lines needed between the high voltage driver and the multi-touch panel 24. Each multi-touch panel row input 23 can drive one or more rows in the multi-touch panel 24. It should be noted that the driver 22 and the decoder 21 can be integrated into a single ASIC, be integrated into the driver logic 19, or in some instances be unnecessary.
The multi-touch panel 24 can include a capacitive sensing medium having a plurality of row traces or driving lines and a plurality of column traces or sensing lines, although other sensing media can also be used. The row and column traces can be formed from a transparent conductive medium, such as Indium Tin Oxide (ITO) or Antimony Tin Oxide (ATO), although other transparent and non-transparent materials, such as copper, can also be used. In some embodiments, the row and column traces can be formed on opposite sides of a dielectric material and can be perpendicular to each other, although in other embodiments other non-Cartesian orientations are possible. For example, in a polar coordinate system, the sensing lines can be concentric circles and the driving lines can be radially extending lines (or vice versa). It should be understood, therefore, that the terms "row" and "column", "first dimension" and "second dimension", or "first axis" and "second axis" as used herein are intended to encompass not only orthogonal grids, but also the intersecting traces of other geometric configurations having first and second dimensions (e.g., the concentric and radial lines of a polar-coordinate arrangement). The rows and columns can be formed on a single side of a substrate, or can be formed on two separate substrates separated by a dielectric material. In some instances, an additional dielectric cover layer can be placed over the row or column traces to strengthen the structure and to protect the entire assembly from damage.
At the "intersections" of the traces of the multi-touch panel 24, where the traces pass above and below (cross) each other (but do not make direct electrical contact with each other), the traces essentially form two electrodes (although more than two traces could intersect as well). Each intersection of row and column traces can represent a capacitive sensing node and can be viewed as a picture element (pixel) 26, which can be particularly useful when the multi-touch panel 24 is viewed as capturing an "image" of touch. In other words, after the multi-touch subsystem 27 has determined whether a touch event has been detected at each touch sensor in the multi-touch panel, the pattern of touch sensors in the multi-touch panel at which a touch event occurred can be viewed as an "image" of touch (e.g., a pattern of fingers touching the panel). When a given row is held at DC, the capacitance between the row and column electrodes appears as a stray capacitance on all the columns; when the given row is stimulated with an AC signal, it appears as a mutual capacitance Csig. The presence of a finger or other object near or on the multi-touch panel can be detected by measuring changes to Csig. The columns of the multi-touch panel 24 can drive one or more analog channels 17 (also referred to herein as event detection and demodulation circuits) in the multi-touch subsystem 27. In some implementations, each column can be coupled to one dedicated analog channel 17. However, in other implementations, the columns can be couplable via an analog switch to a smaller number of analog channels 17.
The computing system 10 can also include a host processor 14 for receiving outputs from the multi-touch panel processor 12 and performing actions based on those outputs, which can include, but are not limited to, moving an object such as a cursor or pointer, scrolling or panning, adjusting control settings, opening a file or document, viewing a menu, making a selection, executing instructions, operating a peripheral device connected to the host device, and the like. The host processor 14, which can be the CPU of a personal computer, can also perform additional functions that may not be related to multi-touch panel processing, and can be coupled to a program storage 15 and a display device 13, such as an LCD display, for providing a user interface (UI) to a user of the device.
It should be noted that, while Fig. 2 illustrates a dedicated multi-touch panel processor 12, the multi-touch subsystem can also be controlled directly by the host processor 14. Additionally, it should be noted that the multi-touch panel 24 and the display device 13 can be integrated into a single touch-screen display device. Further details of multi-touch sensor detection, including proximity detection by a touch panel, are described in commonly assigned co-pending applications, including application Ser. No. 10/840,862 (published as U.S. Patent Publication No. US2006/0097991), application Ser. No. 11/428,522 (published as U.S. Patent Publication No. US2006/0238522), and the application entitled "Proximity and Multi-Touch Sensor Detection and Demodulation" filed on January 3, 2007, the entirety of all of which are hereby incorporated herein by reference.
Fig. 3 illustrates a multipoint processing method 100 in accordance with one embodiment of the utility model. The multipoint processing method 100 may, for example, be performed in the system shown in Fig. 1 or Fig. 2. The multipoint processing method 100 generally begins at block 102, where images can be read from a multipoint input device, and more particularly from a multipoint touch screen. Although the term "image" is used, it should be noted that the data may come in other forms. In most cases, the image read from the touch screen provides a magnitude (Z) as a function of position (x, y) for each sensing point or pixel of the touch screen. The magnitude may, for example, reflect the capacitance measured at each point.
Following block 102, the multipoint processing method 100 proceeds to block 104, where the image can be converted into a collection or list of features. Each feature represents a distinct input, such as a touch. In most cases, each feature can include its own unique identifier (ID), x coordinate, y coordinate, Z magnitude, angle θ, area A, and the like. By way of example, Figs. 4A and 4B illustrate a particular image 120 at a point in time. In the image 120, there may be two features 122 based on two distinct touches. The touches may, for example, be formed from a pair of fingers touching the touch screen. As shown, each feature 122 can include a unique identifier (ID), x coordinate, y coordinate, Z magnitude, angle θ, and area A. More particularly, the first feature 122A may be represented by ID1, X1, Y1, Z1, θ1, A1, and the second feature 122B may be represented by ID2, X2, Y2, Z2, θ2, A2. This data may, for example, be output using a multi-touch protocol.
The conversion from data or images to features may be accomplished using methods described in co-pending U.S. patent application Ser. No. 10/840,862 (published as US2006/0097991), which is again incorporated herein by reference. As disclosed therein, the raw data is typically received in a digitized form and can include values for each node of the touch screen. The values may be between 0 and 256, where 0 equates to no touch pressure and 256 equates to full touch pressure. Thereafter, the raw data can be filtered to reduce noise. Once filtered, gradient data, which indicates the topology of each group of connected points, can be generated. Thereafter, the boundaries of the touch regions can be calculated based on the gradient data (i.e., a determination can be made as to which points can be grouped together to form each touch region). By way of example, a watershed algorithm can be used. Once the boundaries are determined, the data for each of the touch regions can be calculated (e.g., X, Y, Z, θ, A).
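The following is a simplified sketch of the block 102-104 pipeline under assumed inputs (a labeled set of touch regions, such as a watershed segmentation might produce). The field set mirrors the ID, x, y, Z, and area attributes described above, with θ omitted for brevity; none of this is the patent's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class TouchFeature:
    # One feature per distinct touch, as described for block 104.
    fid: int      # unique identifier (ID)
    x: float      # centroid x coordinate
    y: float      # centroid y coordinate
    z: float      # total magnitude (e.g., summed capacitance)
    area: int     # number of pixels in the touch region

def extract_features(regions):
    """regions: {fid: [(x, y, z), ...]}, the pixel lists for each touch
    region, e.g., as produced by watershed segmentation of the image."""
    features = []
    for fid, pixels in regions.items():
        total_z = sum(z for _, _, z in pixels) or 1e-9
        cx = sum(x * z for x, _, z in pixels) / total_z   # Z-weighted centroid
        cy = sum(y * z for _, y, z in pixels) / total_z
        features.append(TouchFeature(fid, cx, cy, total_z, len(pixels)))
    return features
```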
Following block 104, the multipoint processing method 100 proceeds to block 106, where feature classification and grouping can be performed. During classification, the identity of each feature can be determined. For example, a feature can be classified as a particular finger, a thumb, a palm, or another object. Once classified, the features can be grouped. The manner in which the groups are formed can vary widely. In most cases, the features can be grouped based on some criteria (e.g., they carry similar attributes). For example, the two features shown in Figs. 4A and 4B can be grouped together because they are located in proximity to each other, or because they are from the same hand. The grouping can include some level of filtering, in order to filter out features that are not part of the touch event. In filtering, one or more features can be rejected because they either meet some predefined criteria or fail to meet some criteria. By way of example, one of the features may be classified as a thumb located at the edge of a tablet PC. Because the thumb is being used to hold the device rather than to perform a task, the feature generated therefrom is rejected, i.e., is not considered part of the touch event being processed.
Following block 106, the multipoint processing method 100 proceeds to block 108, where key parameters for the feature groups can be calculated. The key parameters can include the distance between features, the x/y centroid of all features, the feature rotation, the total pressure of the group (e.g., the pressure at the centroid), and the like. As shown in Fig. 5, the calculation can include finding the centroid C, drawing a virtual line to each feature from the centroid C, defining the distance D of each virtual line (D1 and D2), and then averaging the distances D1 and D2. Once the parameters are calculated, the parameter values can be reported. The parameter values are typically reported with a group identifier (GID) and the number of features within each group (in this example, three). In most cases, both initial and current parameter values are reported. The initial parameter values may be based on set down, i.e., on when the user places their fingers on the touch screen, and the current values may be based on any point within a stroke occurring after set down.
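A minimal sketch of the block 108 computation, reusing the hypothetical TouchFeature records from the previous sketch (names and the exact parameter set are assumptions):

```python
import math

def group_parameters(features):
    """Compute key parameters described for block 108: the group centroid,
    the average feature-to-centroid distance, and the total pressure."""
    n = len(features)
    cx = sum(f.x for f in features) / n
    cy = sum(f.y for f in features) / n
    avg_dist = sum(math.hypot(f.x - cx, f.y - cy) for f in features) / n
    total_z = sum(f.z for f in features)
    return {"centroid": (cx, cy), "avg_distance": avg_dist, "pressure": total_z}
```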
As will be appreciated, blocks 102-108 may be performed repetitively during a user stroke, thereby generating a plurality of sequentially configured signals. The initial and current parameters can be compared in later steps in order to perform actions in the system.
Following block 108, the process flow proceeds to block 110, where the group is associated with a user interface (UI) element. UI elements can be button boxes, lists, sliders, wheels, knobs, and the like. Each UI element represents a component or control of the user interface. The application behind the UI element(s) can have access to the parameter data calculated in block 108. In one implementation, the application ranks the relevance of the touch data to the corresponding UI elements. The ranking may be based on some predetermined criteria. The ranking can include producing a figure of merit and, whichever UI element has the highest figure of merit, giving it sole access to the group. There can even be some degree of hysteresis (once one of the UI elements claims control of a group, the group sticks with that UI element until another UI element has a much higher ranking). By way of example, the ranking can include determining the proximity of the centroid (or features) to the image object associated with the UI element.
Following block 110, the multipoint processing method proceeds to blocks 112 and 114. Blocks 112 and 114 can be performed at approximately the same time. In one embodiment, from the user's perspective, blocks 112 and 114 appear to be performed concurrently. In block 112, one or more actions can be performed based on differences between the initial and current parameter values, and can also be based on the UI element to which they are associated, if any. In block 114, user feedback pertaining to the one or more actions being performed can be provided. By way of example, user feedback may include display, audio, or tactile feedback, and the like.
Fig. 6 illustrates a parameter calculation method 150 in accordance with one embodiment of the utility model. The parameter calculation method 150 may, for example, correspond to block 108 shown in Fig. 3. The parameter calculation method 150 generally begins at block 152, where a group of features can be received. Following block 152, the parameter calculation method 150 proceeds to block 154, where a determination can be made as to whether the number of features in the group has changed. For example, the number of features may have changed because the user picked up or placed an additional finger. Different fingers may be needed to perform different controls (e.g., tracking, gesturing). If the number of features has changed, the parameter calculation method 150 proceeds to block 156, where the initial parameter values can be calculated. If the number stays the same, the parameter calculation method 150 proceeds to block 158, where the current parameter values can be calculated. Thereafter, the parameter calculation method 150 proceeds to block 160, where the initial and current parameter values can be reported. By way of example, the initial parameter values may contain the average initial distance between points (or initial Distance (AVG)), and the current parameter values may contain the average current distance between points (or current Distance (AVG)). These can be compared in subsequent steps in order to control various aspects of a computer system.
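The Fig. 6 logic can be condensed as follows (a sketch with assumed names; `group_parameters` is the illustrative helper defined above):

```python
class ParameterTracker:
    """Reset the baseline whenever the finger count changes (blocks 154-158)."""
    def __init__(self):
        self.count = 0
        self.initial = None

    def update(self, features):
        current = group_parameters(features)
        if len(features) != self.count:       # a finger was lifted or added
            self.count = len(features)
            self.initial = current            # recompute initial values
        return self.initial, current          # block 160: report both
```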
The methods and techniques described above can be used to implement any number of GUI interface objects and actions. For example, gestures can be created to detect and effect user commands to resize a window, scroll a display, rotate an object, zoom in on or out of a displayed view, delete or insert text or other objects, and so forth.
A basic category of gestures should allow a user to input the common commands that can be inputted through the use of a conventional mouse or trackball instrument. Fig. 7F shows a flow chart for processing the detection of mouse-click actions. Starting at block 710, a detection can be made of either one or two touches by fingers. If the detected touch is determined at 711 to be one finger, then a determination can be made at 712 as to whether the touch is within a predetermined proximity of a displayed image object associated with a selectable file object and, if so, a selection action is made at 714. If a double tap associated with the selectable object is detected at 716, a double-click action can be invoked at 718. A double tap can be determined by detecting a finger leaving the touch screen and immediately touching the touch screen again. In accordance with another embodiment, a double-click action can also be invoked if the finger touching the selected object is detected to remain for longer than a predetermined period of time.
As shown in Fig. 7G, if the detected finger touch is not associated with a selectable file object but is determined at 720 to be associated with a network address hyperlink, then a single-click action can be invoked, whereby the hyperlink is activated. If the hyperlink is touched within a non-browser environment, a browser application can be launched.
If a two-finger touch is detected at 711 then, if at least one of the touchdown points is associated with a selectable file object at 713, the object is selected at 715. If one or more taps by one of the fingers on the touch sensitive display are detected at 717 while the touchdown point is maintained, a right-click mouse action can be invoked.
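A simplified rendering of the Fig. 7F-7G decision flow (the event representation and the `hit_test` helper are assumptions made for illustration; the right-click case additionally requires a tap while the first touchdown is held, which this snapshot-based sketch only notes in a comment):

```python
def classify_tap(touches, hit_test):
    """touches: list of (x, y) touchdown points; hit_test returns
    'selectable', 'hyperlink', or None for what lies under a point."""
    if len(touches) == 1:
        target = hit_test(*touches[0])
        if target == "selectable":
            return "select"            # block 714
        if target == "hyperlink":
            return "activate_link"     # Fig. 7G: single-click the link
        return "no_action"
    if len(touches) == 2:
        if any(hit_test(x, y) == "selectable" for x, y in touches):
            return "select"            # block 715; a subsequent tap by the
        return "no_action"             # second finger would invoke right-click
    return "no_action"
```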
In accordance with a preferred embodiment, if the detected touch or touches are not associated with any selectable file object or hyperlink then, as shown in Fig. 7H, a determination can be made at 722 as to whether the touchdown point(s) are associated with a scrollable area, such as a text editing application window, a file listing window, or an Internet web page.
Scrolling generally pertains to moving displayed data or images across a viewing area on a display screen so that a new set of data can be brought into view in the viewing area. In most cases, once the viewing area is full, each new set of data appears at the edge of the viewing area and all other sets of data move over one position. That is, a new set of data appears for each set of data that moves out of the viewing area. In essence, these functions allow a user to view consecutive sets of data currently outside of the viewing area. In most cases, the user is able to accelerate the traversal through the data sets by moving his or her finger at greater speeds. Examples of scrolling through lists can be found in U.S. Patent Publication Nos. 2003/0076303A1, 2003/0076301A1, and 2003/0095096A1, which are incorporated herein by reference.
If the touchdown point is within a scrollable area, a scrolling action can be invoked at 723, similar to pressing down the scroll wheel on a conventional mouse device. If the scrollable area is scrollable in only one direction, the invoked scrolling action is unidirectional scrolling. If the scrollable area is scrollable in two dimensions, the invoked scrolling action is omnidirectional.
In a unidirectional scrolling action where the scrolling is restricted to the vertical direction (i.e., the Y axis), only the vertical vector component of the tracked touch movement is used as the input for effecting vertical scrolling. Similarly, in a unidirectional scrolling action where the scrolling is restricted to the horizontal direction (i.e., the X axis), only the horizontal vector component of the tracked touch movement is used as the input for effecting horizontal scrolling. If the scrolling action is omnidirectional, the scrolling action effected tracks the movement of the tracked touch.
In accordance with a preferred embodiment, if the detected touch is of one finger, the scrolling action can be prepared at 724 to be performed at a normal, or 1X, speed. If and once the touched-down finger begins to move on the touch screen, the scrolling action can be performed by tracking the movement of the touchdown point on the touch screen. If the detected touch is of two fingers, the scrolling action can be performed at a double, or 2X, speed at 725. Additional fingers can be added to perform even faster scrolling actions, and a detection of a four-finger touch within a multi-page document window can be interpreted as a "page up" or "page down" command.
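The finger-count speed rule reduces to a small multiplier table (illustrative only; the text specifies 1X and 2X, while the scaling for three fingers and the page-command mapping for four are noted as assumptions in the comments):

```python
def scroll_delta(drag_dy: float, finger_count: int) -> float:
    """Scale a vertical drag into a scroll amount: 1 finger = 1X, 2 = 2X.
    Three-finger scaling is an assumed extension of "additional fingers";
    four fingers in a multi-page document would instead be interpreted as
    a page-up/page-down command rather than a scaled scroll."""
    speed = {1: 1.0, 2: 2.0, 3: 3.0}.get(finger_count, 1.0)
    return drag_dy * speed
```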
In accordance with another embodiment, the displayed data continues to move even when the finger is removed from the touch screen. The continuous motion can be based at least in part on the previous motion. For example, the scrolling can continue in the same direction and at the same speed. In some cases, the scrolling slows down over time, i.e., the traversal through the media items gets slower and slower until the scrolling eventually stops, leaving a static list. By way of example, each new media item brought into the viewing area can incrementally decrease the speed. Additionally or alternatively, the displayed data stops moving when the finger is placed back on the touch screen. That is, placing the finger back on the touch screen can implement braking, which stops or slows down the continuing motion.
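One plausible realization of the described inertial scrolling, with an assumed exponential decay constant (the utility model does not specify the slow-down curve):

```python
def momentum_scroll(velocity: float, decay: float = 0.95, stop_below: float = 0.5):
    """Yield per-frame scroll offsets after finger lift-off; the scroll
    continues in the same direction and gradually slows to a stop.
    A new touchdown would break out of this loop (braking)."""
    while abs(velocity) > stop_below:
        yield velocity
        velocity *= decay   # assumed exponential slow-down

# Example: sum(momentum_scroll(40.0)) gives the total coasting distance.
```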
Illustrating the above gestural actions by way of example: as shown in Fig. 7A, using a touch screen (such as the multi-touch screen 24 shown in Fig. 2), a single tap with a finger 501 on an image object (e.g., a file listing 500) can be translated into the equivalent of a single mouse click, which in this instance can indicate a selection, typically indicated by highlighting the selected file or image object. A detected double tap on the image object can be translated into the equivalent of a double mouse click, which can cause the launch of an application associated with the tapped image object. For instance, double tapping a file listed on a screen, such as a photo file, can cause the launch of a photo viewer application and the opening of that photo file.
A drag-and-drop function can be invoked by touching, with at least one finger, the image associated with the object to be dropped and graphically dragging the object to the desired drop location by maintaining the touch, as shown in Fig. 7B, which illustrates a file listing 501 being dragged and dropped from folder window 502 into folder window 503.
Certain mouse functionalities may require two touches to complete. For example, as shown in Fig. 7C, a "right-click" gesture can be made with two fingers, with one finger as a touchdown finger 506 and a second finger 507 tapping the screen at least once to indicate a right-click action. Fig. 7D illustrates that, after the right-click action has been performed, an action window 504 can be invoked, after which the first finger can move over to the invoked window 504 in order to select and tap an action item 505 with a single finger 506. In accordance with one embodiment of the utility model, a right-click action can be effected only if the detected tap is located in close proximity to the detected touchdown, and only if the detected tap is located to the left of the touchdown finger (to the right of the touchdown finger from the user's point of view).
Other file selection functions that normally require a combination of mouse and keyboard action can be performed using only touch actions. For instance, in the Microsoft Windows environment, in order to select multiple files within file window 502, a user typically needs to hold down the shift button while dragging the mouse icon over the sequential files to be selected. Without the shift button held down, the drag of the mouse icon may be interpreted as a drag-and-drop action. As shown in Fig. 7E, in accordance with an embodiment of the utility model, a detection of two closely associated touches dragging across a file listing can be read as a multi-select action for selecting a group of files 508. In order to avoid misinterpreting the two-touch action as another command, such as a rotation action, the two-touch multi-select function is preferably invoked only when the two detected touches are relatively close to each other.
Referring to the scrolling actions described with reference to Fig. 7H, and as illustrated in Figs. 7I and 7J, a touchdown of one or two fingers within a scrollable window can cause the displayed content of the window to scroll at different speeds. Specifically, once a scrolling action is invoked at 723, the scrolling is performed at 1X speed at 724 if it is determined that only one finger (or one touchdown point) is detected on the touch sensitive display, and at 2X speed at 725 if two fingers (or two touchdown points) are detected. In accordance with a preferred embodiment, during the scrolling action, scroll bars 727 and 728 move in a direction consistent with the direction of scrolling.
Finally, using a multi-touch display that is capable of proximity detection, such as the panels described in the aforementioned and commonly assigned co-pending application Ser. No. 10/840,862 (U.S. Patent Publication No. US2006/0097991) and in the application entitled "Proximity and Multi-Touch Sensor Detection and Demodulation" filed on January 3, 2007, both incorporated herein by reference, a gesture of a finger can also be used to invoke a hover action that is the equivalent of hovering a mouse icon over an image object.
For example, referring to Fig. 7K, a proximity detection of the user's finger 501 over an application icon 731 within a desktop 729 can be interpreted as a hover action, which invokes the rolling popup of the hovered-on application icon 730. If the user touches the popped-up icon, a double-click action can be invoked, whereby the application is launched. Similar application-specific uses can be conceived, for example, within photo management software that displays photo files in a thumbnail format, where a proximity detection of a finger over a thumbnail invokes a hover action whereby the size of the hovered-on photo thumbnail is enlarged (but not selected).
Gestures can also be used to invoke and manipulate virtual control interfaces, such as volume knobs, switches, sliders, keyboards, and other virtual interfaces that may be created to facilitate human interaction with a computing system or a consumer electronic item. By way of example of using a gesture to invoke a virtual control interface, and referring to Figs. 8A-8G, a rotate gesture for controlling a virtual volume knob 170 on a GUI interface of the display 174 of a tablet PC 175 will be described. To activate the knob 170, the user places their fingers 176 on a multipoint touch screen 178. The virtual control knob may already be displayed, or the particular number, orientation, or profile of the fingers at set down, or the movement of the fingers immediately thereafter, or some combination of these and other characteristics of the user's interaction, may invoke the virtual control knob to be displayed. In either case, the computing system associates the finger group with the virtual control knob and makes a determination that the user intends to use the virtual volume knob.
This association can also be based in part on the mode or current state of the computing device at the time of the input. For example, the same gesture can be interpreted alternatively as a volume knob gesture if a song is currently playing on the computing device, or as a rotate command if an object editing application is being executed. Other user feedback can be provided, including, for example, audible or tactile feedback.
Once the knob 170 is displayed, as shown in Fig. 8A, the user's fingers 176 can be positioned around the knob 170 as if it were an actual knob or dial, and thereafter can be rotated around the knob 170 in order to simulate turning the knob 170. Again, audible feedback in the form of a clicking sound or tactile feedback in the form of vibration, for example, can be provided as the knob 170 is "rotated". The user may also use their other hand to hold the tablet PC 175.
As shown in Fig. 8B, the multipoint touch screen 178 detects at least a pair of images. In particular, a first image 180 can be created at set down, and at least one other image 182 can be created when the fingers 176 are rotated. Although only two images are shown, in most cases there would be many more images occurring incrementally between these two. Each image represents a profile of the fingers in contact with the touch screen at a particular instant in time. These images can also be referred to as touch images. It should be understood that the term "image" does not mean that the profile is displayed on the screen 178 (it is, rather, imaged by the touch sensing device). It should also be noted that, although the term "image" is used, the data may be in other forms representative of the touch plane at various points in time.
As shown in Fig. 8C, each of the images 180 and 182 can be converted into a collection of features 184. Each feature 184 can be associated with a particular touch, for example, from the tip of each of the fingers 176 surrounding the knob 170, as well as from the thumb of the other hand 177 used to hold the tablet PC 175.
As shown in Fig. 8D, the features 184 are classified, i.e., each finger/thumb is identified, and grouped for each of the images 180 and 182. In this particular case, the features 184A associated with the knob 170 can be grouped together to form a group 188, and the feature 184B associated with the thumb can be filtered out. In alternative arrangements, the thumb feature 184B can be treated as a separate feature by itself (or in another group), for example, in order to alter the input or operational mode of the system, or to implement another gesture, such as a slider gesture associated with an equalizer slider displayed on the screen in the area of the thumb (or another finger).
As shown in Fig. 8E, the key parameters of the feature group 188 can be calculated for each of the images 180 and 182. The key parameters associated with the first image 180 represent the initial state, and the key parameters associated with the second image 182 represent the current state.
Also as shown in Fig. 8E, the knob 170 is the UI element associated with the feature group 188 because of its proximity to the knob 170. Thereafter, as shown in Fig. 8F, the key parameter values of the feature group 188 from each of the images 180 and 182 can be compared in order to determine the rotation vector, i.e., that the feature group rotated five (5) degrees clockwise from the initial state to the current state. In Fig. 8F, the initial feature group (image 180) is shown in dashed lines while the current feature group (image 182) is shown in solid lines.
As shown in Fig. 8G, based on the rotation vector, the speaker 192 of the tablet PC 175 increases (or decreases) its output in accordance with the amount of rotation of the fingers 176, i.e., increases the volume by 5% based on a rotation of 5 degrees. The display 174 of the tablet PC 175 can also adjust the rotation of the knob 170 in accordance with the amount of rotation of the fingers 176, i.e., the position of the knob 170 rotates five (5) degrees. In most cases, the rotation of the knob occurs simultaneously with the rotation of the fingers, i.e., for each degree of finger rotation the knob rotates one degree. In essence, the virtual control knob follows the gesture occurring on the screen. Still further, the audio unit 194 of the tablet PC can provide a clicking sound for each unit of rotation, e.g., five clicks based on a rotation of 5 degrees. Still further, the haptics unit 196 of the tablet PC 175 can provide a certain amount of vibration or other tactile feedback for each click, thereby simulating an actual knob.
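The rotation vector of Figs. 8E-8F can be obtained by comparing, between the set-down image and the current image, the angle of each feature about the group centroid (an illustrative sketch; the names and the sign convention are assumptions):

```python
import math

def rotation_degrees(initial_pts, current_pts):
    """Average per-feature change of angle about the group centroid,
    in degrees; positive is counterclockwise."""
    def angles(pts):
        cx = sum(x for x, _ in pts) / len(pts)
        cy = sum(y for _, y in pts) / len(pts)
        return [math.atan2(y - cy, x - cx) for x, y in pts]
    deltas = [b - a for a, b in zip(angles(initial_pts), angles(current_pts))]
    # Wrap each delta into [-pi, pi) so a small turn never reads as ~360 deg.
    deltas = [(d + math.pi) % (2 * math.pi) - math.pi for d in deltas]
    return math.degrees(sum(deltas) / len(deltas))

# A 5-degree clockwise turn of the finger group would then map to, e.g.,
# volume_change_percent = -rotation_degrees(initial, current)  # sign assumed
```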
It should be noted that additional gestures can be performed simultaneously with the virtual control knob gesture. For example, more than one virtual control knob can be controlled at the same time using both hands, i.e., one hand for each virtual control knob. Additionally or alternatively, one or more slider bars can be controlled at the same time as the virtual control knob, i.e., one hand operates the virtual control knob while at least one finger, and possibly more than one finger, of the other hand operates at least one slider, and possibly more than one slider bar, e.g., one slider bar per finger.
It should also be noted that, although the embodiment is described using a virtual control knob, in another embodiment the UI element can be a virtual scroll wheel. As an example, the virtual scroll wheel can mimic an actual scroll wheel, such as those described in U.S. Patent Publication Nos. US2003/0076303A1, US2003/0076301A1, and US2003/0095096A1, which are incorporated herein by reference.
FIG. 9 illustrates a touch-based method 200 in accordance with one embodiment of the present invention. The method begins at block 202, where a user input occurring over a multipoint touch-sensitive device is detected. The user input can include one or more touch inputs, with each touch input having a unique identifier. Following block 202, the touch-based method 200 proceeds to block 204, where the user input is classified as a tracking or selecting input when it includes a single unique identifier, or as a gesture input when it includes at least two unique identifiers (more than one touch input). If the user input is classified as a tracking input, the touch-based method 200 proceeds to block 206, where tracking corresponding to the user input is performed.
If the user input is classified as a gesture input, the touch-based method 200 proceeds to block 208, where one or more gesture control actions corresponding to the user input are performed. The gesture control actions are based at least in part on changes that occur with or between the at least two unique identifiers.
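In code, this classification step amounts to counting the unique touch identifiers present in the input. The following Python sketch illustrates blocks 202-208 under that reading; the type and function names are illustrative and are not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Touch:
    uid: int   # unique identifier assigned at finger set-down
    x: float   # surface coordinates
    y: float

def classify_input(touches):
    """Block 204: a single unique identifier is a tracking/selecting
    input; two or more unique identifiers form a gesture input."""
    ids = {t.uid for t in touches}
    return "tracking" if len(ids) <= 1 else "gesture"

# One touch is classified as tracking, two concurrent touches as a gesture.
print(classify_input([Touch(0, 10.0, 20.0)]))                       # tracking
print(classify_input([Touch(0, 10.0, 20.0), Touch(1, 42.0, 7.0)]))  # gesture
```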
FIG. 10 illustrates a touch-based method 250 in accordance with one embodiment of the invention. The touch-based method 250 begins at block 252, where an initial image can be captured during an input stroke performed on a touch-sensitive surface. Following block 252, the touch-based method 250 proceeds to block 254, where the touch mode is determined based on the initial image. For example, if the initial image includes a single unique identifier, the touch mode corresponds to a tracking or selection mode. On the other hand, if the image includes more than one unique identifier, the touch mode corresponds to a gesture mode.
Following block 254, the touch-based method 250 proceeds to block 256, where a next image can be captured during the input stroke on the touch-sensitive surface. Images are typically captured sequentially during the stroke, so there can be a plurality of images associated with the stroke.
Following block 256, the touch-based method 250 proceeds to block 258, where a determination is made as to whether the touch mode changed between the capture of the initial image and the capture of the next image. If the touch mode changed, the touch-based method 250 proceeds to block 260, where the next image can be set as the initial image, after which the touch mode is again determined at block 254 based on the new initial image. If the touch mode stayed the same, the method proceeds to block 262, where the initial image and the next image are compared and one or more control signals are generated based on the comparison.
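A minimal Python sketch of this capture-and-compare loop (blocks 252-262) follows; it assumes each "image" is simply the list of touch points captured in one sensor frame, and all names are hypothetical:

```python
def touch_mode(image):
    """Block 254: one touch point -> tracking/select mode,
    more than one -> gesture mode."""
    return "gesture" if len(image) > 1 else "tracking"

def process_stroke(frames):
    """Blocks 256-262: compare each new frame against the initial frame
    while the mode holds; reset the baseline when the mode changes."""
    frames = iter(frames)
    initial = next(frames)                        # block 252
    mode = touch_mode(initial)                    # block 254
    comparisons = []
    for nxt in frames:                            # block 256
        if touch_mode(nxt) != mode:               # block 258
            initial, mode = nxt, touch_mode(nxt)  # block 260
            continue
        comparisons.append((initial, nxt))        # block 262
    return comparisons

# Frames 1-2 are one-finger (tracking); frames 3-4 are two-finger (gesture).
stroke = [[(0, 0)], [(0, 1)], [(0, 2), (1, 5)], [(0, 3), (1, 6)]]
print(len(process_stroke(stroke)))  # 2 comparisons, one per mode segment
```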
FIG. 11 illustrates a touch-based method 300 in accordance with one embodiment of the present invention. The method begins at block 302, where an image object, which may be a GUI object, is output. For example, a processor can instruct a display to display a particular image object. Following block 302, the method 300 proceeds to block 304, where a gesture input is received over the image object. For instance, a user can place or move their fingers in a gestural way on the surface of the touch screen while over the displayed image object. The gestural input can include one or more single gestures that occur successively, or multiple gestures that occur simultaneously. Each of the gestures generally has a particular sequence, motion, or orientation associated therewith. For example, a gesture can include spreading fingers apart or closing fingers together, rotating the fingers, translating the fingers, and the like.
Following block 304, the method 300 proceeds to block 306, where the image object is modified based on, and in unison with, the gesture input. By "modified," it is meant that the image object changes according to the particular gesture being performed. By "in unison," it is meant that the changes occur approximately while the gesture is being performed. In most cases, there is a one-to-one relationship between the gesture and the changes occurring at the image object, and they occur substantially simultaneously. In essence, the image object follows the motion of the fingers. For example, spreading the fingers can simultaneously enlarge the object, closing the fingers can simultaneously reduce the image object, rotating the fingers can simultaneously rotate the object, and translating the fingers can allow the image object to be simultaneously panned or scrolled.
In one embodiment, block 306 includes determining which image object is associated with the gesture being performed, and thereafter locking the displayed object to the fingers positioned over it, so that the image object changes in accordance with the gestural input. By locking or associating the fingers to the image object, the image object can continuously adjust itself in accordance with what the fingers are doing on the touch screen. The determination and locking typically occur at set down, i.e., when the fingers are positioned on the touch screen.
FIG. 12 illustrates a zoom gesture method 350 in accordance with one embodiment of the present invention. The zoom gesture can be performed on a multipoint touch screen, such as the multi-touch panel 24 shown in FIG. 2. The method begins at block 352, where the presence of at least a first finger and a second finger on a touch-sensitive surface is detected at the same time. The presence of at least two fingers can be configured to indicate that the touch is a gestural touch rather than a tracking touch based on one finger. In some cases, the presence of only two fingers indicates that the touch is a gestural touch. In other cases, any number of more than two fingers indicates that the touch is a gestural touch. In fact, the gestural touch can be configured to operate whether two, three, four, or more fingers are touching, even if the number changes during the gesture; i.e., only a minimum of two fingers is needed at any time during the gesture.
Following block 352, the method 350 proceeds to block 354, where the distance between at least the two fingers is compared. The distance can be from finger to finger, or from each finger to some other reference point, e.g., a centroid. If the distance between the two fingers increases (fingers spread apart), a zoom-in signal is generated, as shown in block 356. If the distance between the two fingers decreases (fingers close together), a zoom-out signal is generated, as shown in block 358. In most cases, the set down of the fingers will associate or lock the fingers to a particular image object being displayed. For example, the touch-sensitive surface can be a touch screen, and the image object can be displayed on the touch screen. This typically occurs when at least one of the fingers is positioned over the image object. As a result, when the fingers are moved apart, the zoom-in signal can be used to increase the size of the embedded features in the image object, and when the fingers are pinched together, the zoom-out signal can be used to decrease the size of the embedded features in the object. The zooming generally occurs within a predefined boundary, such as the periphery of the display, the periphery of a window, the edge of the image object, and/or the like. The embedded features can be formed on a plurality of layers, each of which represents a different level of zoom.
In most cases, the amount of zooming varies according to the distance between the two objects. Furthermore, the zooming typically occurs substantially simultaneously with the motion of the objects. For instance, as the fingers spread apart or close together, the object zooms in or zooms out at the same time. Although this method is directed at zooming, it should be noted that it can also be used for enlarging or reducing. The zoom gesture method 350 is particularly useful in graphical programs such as publishing, photo, and drawing programs. Moreover, zooming can be used to control a peripheral device such as a camera, i.e., when the fingers are spread apart, the camera zooms out, and when the fingers are closed together, the camera zooms in.
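The distance comparison of blocks 354-358 can be sketched as follows in Python, using the mean finger-to-centroid distance so that the same test works for two or more fingers; the threshold value is an assumption:

```python
import math

def centroid(points):
    xs, ys = zip(*points)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def finger_spread(points):
    """Mean distance of each finger from the centroid (block 354 permits
    finger-to-finger or finger-to-reference-point distances)."""
    cx, cy = centroid(points)
    return sum(math.hypot(x - cx, y - cy) for x, y in points) / len(points)

def zoom_signal(prev_points, cur_points, threshold=1.0):
    """Blocks 356-358: growing spread -> zoom-in, shrinking -> zoom-out."""
    delta = finger_spread(cur_points) - finger_spread(prev_points)
    if delta > threshold:
        return "zoom_in"
    if delta < -threshold:
        return "zoom_out"
    return None

# Two fingers spreading from 10 units apart to 20 units apart:
print(zoom_signal([(0, 0), (10, 0)], [(-5, 0), (15, 0)]))  # zoom_in
```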
FIGS. 13A-13H illustrate a zooming sequence using the method described above. FIG. 13A illustrates a display presenting an image object 364 in the form of a map of North America, with embedded levels that can be zoomed. In some cases, as shown, the image object can be positioned inside a window that forms a boundary of the image object 364. FIG. 13B illustrates a user positioning their fingers 366 over North America 368, particularly over the United States 370, and more particularly over California 372. In order to zoom in on California 372, the user starts to spread their fingers 366 apart, as shown in FIG. 13C. As the fingers 366 spread further apart (i.e., the detected distance increases), the map zooms in further on Northern California, then on a particular region of Northern California 374, then on the Bay Area 376, then on the peninsula 378 (e.g., the area between San Francisco and the San Jose area), and then on the city of San Carlos 380, located between San Francisco and San Jose, as illustrated in FIGS. 13D-13H. In order to zoom out of San Carlos 380 and back to North America 368, the fingers 366 are closed back together, following the sequence described above but in reverse.
FIG. 14 illustrates a pan method 400 in accordance with one embodiment of the present invention. The pan gesture can be performed on a multipoint touch screen. The pan method 400 begins at block 402, where the presence of at least a first object and a second object is detected on a touch-sensitive surface at the same time. The presence of at least two objects can be configured to indicate that the touch is a gestural touch rather than a tracking touch based on one finger. In some cases, the presence of only two fingers indicates that the touch is a gestural touch. In other cases, any number of more than two fingers indicates that the touch is a gestural touch. In fact, the gestural touch can be configured to operate whether two, three, four, or more fingers are touching, even if the number changes during the gesture; i.e., only a minimum of two fingers is needed.
Following block 402, the method proceeds to block 404, where the positions of the two objects are monitored as the objects are moved together across the touch screen. Following block 404, the method 400 proceeds to block 406, where a pan signal is generated when the positions of the two objects change relative to an initial position. In most cases, the set down of the fingers will associate or lock the fingers to a particular image object displayed on the touch screen, typically when at least one of the fingers is positioned over the image object. As a result, when the fingers are moved across the touch screen, the pan signal can be used to translate the image in the direction of the fingers. In most cases, the amount of panning varies according to the distance the two objects move. Furthermore, the panning typically occurs substantially simultaneously with the motion of the objects. For instance, as the fingers move, the object moves with the fingers at the same time.
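A sketch of the pan signal of block 406 in the same style: the signal is simply the displacement of the finger centroid from its position at set down (all names are hypothetical):

```python
def centroid(points):
    xs, ys = zip(*points)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def pan_signal(initial_points, current_points):
    """Block 406: the pan vector is the displacement of the finger
    centroid relative to the initial position; the locked image object
    is translated by the same amount."""
    x0, y0 = centroid(initial_points)
    x1, y1 = centroid(current_points)
    return (x1 - x0, y1 - y0)

# Two fingers dragged together by (4, -10) screen units:
print(pan_signal([(0, 0), (10, 0)], [(4, -10), (14, -10)]))  # (4.0, -10.0)
```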
FIGS. 15A-15D illustrate a panning sequence based on the pan method 400 described above. Using the map of FIG. 13A, FIG. 15A illustrates a user positioning their fingers 366 over the map. Once set down, the fingers 366 are locked to the map. As shown in FIG. 15B, when the fingers 366 are moved vertically upwards, the entire map 364 is moved upwards, thereby causing previously viewed portions of the map 364 to be placed outside the viewing area and previously unviewed portions to be placed inside the viewing area. As shown in FIG. 15C, when the fingers 366 are moved horizontally sideways, the entire map 364 is moved sideways, thereby causing previously viewed portions of the map 364 to be placed outside the viewing area and previously unviewed portions to be placed inside the viewing area. As shown in FIG. 15D, when the fingers 366 are moved diagonally, the entire map 364 is moved diagonally, thereby causing previously viewed portions of the map 364 to be placed outside the viewing area and previously unviewed portions to be placed inside the viewing area. As should be appreciated, the motion of the map 364 follows the motion of the fingers 366. This process is similar to sliding a piece of paper along a desktop: the pressure the fingers exert on the paper locks the paper to the fingers, and when the fingers slide across the table, the piece of paper moves with them.
FIG. 16 illustrates a rotate method 450 in accordance with one embodiment of the invention. The rotate gesture can be performed on a multipoint touch screen. The rotate method 450 begins at block 452, where the simultaneous presence of a first object and a second object is detected. The presence of at least two fingers can be configured to indicate that the touch is a gestural touch rather than a tracking touch based on one finger. In some cases, the presence of only two fingers indicates that the touch is a gestural touch. In other cases, the presence of any number of more than two fingers indicates that the touch is a gestural touch. In still other cases, the gestural touch can be configured to operate whether two, three, four, or more fingers are touching, even if the number changes during the gesture; i.e., only a minimum of two fingers is needed at any time during the gesture.
Following block 452, the rotate method 450 proceeds to block 454, where the angle of each finger is set. The angles are generally determined relative to a reference point. Following block 454, the method proceeds to block 456, where a rotate signal is generated when the angle of at least one of the objects changes relative to the reference point. In most cases, the set down of the fingers will lock or associate the fingers to a particular image object displayed on the touch screen. In general, when at least one of the fingers is positioned over the image on the image object, the image object is locked to or associated with the fingers. As a result, when the fingers are rotated, the rotate signal can be used to rotate the object in the direction of finger rotation (e.g., clockwise or counterclockwise). In most cases, the amount of object rotation varies according to the amount of finger rotation, i.e., if the fingers rotate 5 degrees, then so does the object. Furthermore, the rotation typically occurs substantially simultaneously with the motion of the fingers. For instance, as the fingers rotate, the object rotates with the fingers at the same time.
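The angle bookkeeping of blocks 454-456 can be sketched as follows; it assumes the touches are reported in a stable tracked order and measures angles about the finger centroid (positive values are counterclockwise in standard mathematical axes; flip the sign for y-down screen coordinates):

```python
import math

def finger_angle(point, reference):
    """Block 454: the angle of a finger about a reference point, in degrees."""
    return math.degrees(math.atan2(point[1] - reference[1],
                                   point[0] - reference[0]))

def rotate_signal(initial_points, current_points):
    """Block 456: the rotate signal is the change in finger angle about
    the centroid, wrapped to the shortest signed arc."""
    ref0 = tuple(sum(c) / len(c) for c in zip(*initial_points))
    ref1 = tuple(sum(c) / len(c) for c in zip(*current_points))
    delta = (finger_angle(current_points[0], ref1)
             - finger_angle(initial_points[0], ref0))
    return (delta + 180) % 360 - 180  # wrap into (-180, 180]

# Two fingers rotated a quarter turn about their midpoint:
print(rotate_signal([(0, 0), (10, 0)], [(5, -5), (5, 5)]))  # 90.0
```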
FIGS. 17A-17C illustrate a rotating sequence based on the method described above. Using the map of FIG. 13, FIG. 17A illustrates a user positioning their fingers 366 over the map 364. Once set down, the fingers 366 are locked to the map 364. As shown in FIG. 17B, when the fingers 366 are rotated in a clockwise direction, the entire map 364 is rotated in the clockwise direction in accordance with the rotating fingers. As shown in FIG. 17C, when the fingers 366 are rotated in a counterclockwise direction, the entire map 364 is rotated in the counterclockwise direction in accordance with the rotating fingers.
It should be noted that although FIGS. 17A-17C illustrate the use of a thumb and an index finger to make the rotation gesture, two fingers, such as an index finger and a middle finger, can also be used to make the rotation gesture.
Moreover, in certain specific applications, two fingers are not needed to make a rotation gesture. For instance, in accordance with a preferred embodiment and as shown in FIGS. 17D and 17E, a photo thumbnail can be rotated to a desired orientation (e.g., from a landscape orientation to a portrait orientation) using a single-finger gesture. Specifically, a touch associated with a selectable photo thumbnail icon 741 is detected, and where the touch input is gestural in that the detected touch forms a rotational or radial arc about a center portion of the thumbnail, the input is interpreted as an instruction to rotate the thumbnail in accordance with the direction of the rotational or radial arc. In accordance with a preferred embodiment, the rotation of the thumbnail icon can also cause the corresponding file object to change its orientation configuration. In accordance with another embodiment, detection of a rotation gesture within a photo management application will invoke a snap command that automatically rotates the photo thumbnail 90 degrees in the direction of rotation.
FIGS. 18A and 18B illustrate another example of using a gestural input via a UI element to edit a media file, such as a photo, in accordance with the exemplary embodiment of the invention shown in FIG. 10. Specifically, as shown in FIG. 18A, within a photo editor environment 750, in which a photo image file (e.g., a JPEG file) 752 can be opened for editing, a UI element 751 can be provided for editing aspects of the photo. The UI element 751 can be a level slider bar for adjusting the level of a certain aspect of the photo. In the example illustrated in FIG. 18A, the UI element 751 can be an interface for receiving a touch gesture to adjust the brightness of the photo. Specifically, as the tracked finger touch moves to the left on the bar, the brightness decreases, and as the tracked touch moves to the right over the UI element, the brightness increases. In accordance with one embodiment, the UI element is preferably translucent so that the user can still see the image of the photo behind the UI element. In another embodiment, the size of the photo displayed on the screen can be reduced to make room for a separately displayed UI element, which can be positioned immediately below the displayed photo.
FIG. 18B illustrates the ability to selectively switch the gestural input mode via the UI element 751 by using one or more additional touchdown points. Specifically, as shown in FIG. 18B, detection of a second touchdown point on the UI element 751 causes the mode of operation to switch from brightness adjustment to contrast adjustment. In this example, movement of the two touchdown points to the left or right causes the contrast of the photo to decrease or increase, respectively. Detection of additional touchdown points (e.g., three or four fingers) can likewise be interpreted as instructions for switching to other modes of operation (e.g., zooming, hue adjustment, gamma adjustment, etc.). It should be noted that although FIGS. 18A and 18B illustrate adjusting brightness and contrast via the UI element 751, a user can program or customize the UI element 751 so that the number of touchdown points is interpreted as other forms of operating mode. It should also be noted that the slider bar UI element 751 can take other forms, such as a virtual scroll wheel.
FIG. 18C is a flow chart of an algorithm related to the specific example of FIGS. 18A and 18B described above. Specifically, as shown in FIG. 18C, the UI element 751 is output on the screen at 760. If a gestural input touch is detected at 761, then at 762-765 a determination is made as to the number of touchdown points associated with the touch. Depending on the number of touchdown points detected, a corresponding mode of operation is activated at 767-769. Once the appropriate mode of operation is activated, tracking of the touchdown point(s) is detected at 770 to effect the corresponding adjustment at 771 in accordance with the mode of operation. It should be noted that the mode of operation can be switched at any time during the editing process: if a changed number of touchdown points is detected at 772, the process loops back to 762-764 to activate a new mode of operation.
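The following Python sketch captures the control flow of FIG. 18C as described: the touchdown count selects the mode (762-769), the tracked points drive the adjustment (770-771), and a changed count re-enters mode selection (772). The mode table and function names are hypothetical:

```python
# Hypothetical mapping from touchdown count to editing mode.
MODES = {1: "brightness", 2: "contrast", 3: "zoom", 4: "gamma"}

def run_editor(frames, apply_adjustment):
    """Each frame is the list of touchdown points currently on the UI
    element; apply_adjustment(mode, frame) stands in for step 771."""
    mode = None
    for frame in frames:
        if not frame:
            mode = None                        # all fingers lifted
            continue
        wanted = MODES.get(len(frame))
        if wanted != mode:                     # 772: touchdown count changed
            mode = wanted                      # 767-769: activate new mode
        if mode:
            apply_adjustment(mode, frame)      # 770-771: track and adjust

# One finger adjusts brightness; adding a second switches to contrast.
run_editor([[(1, 1)], [(2, 1)], [(2, 1), (9, 1)]],
           lambda mode, frame: print(mode, frame))
```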
FIGS. 18D and 18E illustrate using the same UI element 751 discussed above to invoke additional operations by inputting other gestural instructions. Specifically, while adjusting the brightness of the displayed photo, a second finger can be used to effect a zoom-in or zoom-out operation. The zoom-in and zoom-out operations can be invoked by detecting a second touchdown point and a change in the distance between the two touchdown points. In accordance with the method described above and shown in FIG. 12, a change in the distance between the two touchdown points can be interpreted as a zoom-in or zoom-out operation. It should be noted that, in accordance with one embodiment, the zoom operation is not invoked if the distance between the second touchdown point and the first touchdown point remains constant; in that case, the gesture is interpreted as an input for activating a second mode of operation (e.g., switching from brightness adjustment to contrast adjustment, as shown in FIGS. 18A and 18B).
FIGS. 19A and 19B illustrate an example of using gestural inputs to scroll through media files (e.g., photo files displayed in a photo editor). Specifically, as shown in FIGS. 19A and 19B, a touch detection zone 754 can be dedicated to the scrolling operation, whereby an up-and-down finger movement gesture on the displayed photo 752 of the touch screen 750 can be interpreted as a gestural input for scrolling to the next photo 753. In accordance with a preferred embodiment, it is not necessary to display a UI element to invoke the scrolling mode of operation; rather, detection of a downward sliding action of a finger within the touch detection zone 754 is sufficient to automatically invoke the scrolling operation. In accordance with an alternative embodiment, a UI element, such as a virtual vertical slider bar, can be displayed on the screen to indicate to the user that a scrolling operation has been activated, as well as the area of the touch detection zone 754 in which the scrolling operation can be continued.
In accordance with a preferred embodiment, if the detected downward tracking movement has more than one touchdown point (e.g., a two-finger swipe gesture), the scrolling is performed at 2X speed, in a manner similar to that described above with respect to invoking a scrolling operation within a scrollable area.
FIGS. 19C and 19D illustrate another form of UI element, namely a virtual scroll wheel 755, for receiving gestural inputs to scroll through the display of photos. In this embodiment, the virtual scroll wheel can be invoked by the simple gesture of performing a circular touch on the photo with one finger, or by a touchdown of three fingers. Once the virtual scroll wheel UI element 755 is presented, the user can "rotate" the virtual scroll wheel to scroll through the photos. In this particular embodiment, the speed of scrolling is controlled not by the number of touchdown points detected on the scroll wheel 755, but rather by the speed at which the touchdown point rotates about the center of the virtual scroll wheel 755.
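Under this description, the scrolling rate tracks the angular velocity of the touchdown point about the wheel's center. A hedged sketch of that calculation:

```python
import math

def wheel_scroll_rate(prev, cur, center, dt):
    """Angular velocity (degrees per second) of a touchdown point about
    the wheel center; the photo scrolling speed would follow this rate
    rather than the number of fingers."""
    a0 = math.atan2(prev[1] - center[1], prev[0] - center[0])
    a1 = math.atan2(cur[1] - center[1], cur[0] - center[0])
    delta = math.degrees(a1 - a0)
    delta = (delta + 180) % 360 - 180  # shortest signed arc
    return delta / dt

# A touch sweeping a quarter turn in half a second -> 180 deg/s:
print(wheel_scroll_rate((10, 0), (0, 10), (0, 0), 0.5))
```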
FIGS. 19E and 19F illustrate the concept of FIGS. 19A and 19B on the display screen 781 of a digital camera 780. In accordance with a preferred embodiment, the display screen 781 of the digital camera 780 can be made of a multi-touch panel, such as the multi-touch panel 2 shown in FIG. 2 above.
FIG. 19E shows an embodiment in which, in a playback mode of the digital camera 780, detection of a gestural input consisting of a vertical downward swipe of at least one finger within a touch detection zone 782 invokes a playback scrolling operation, whereby the next photo is displayed. In accordance with another embodiment, a downward gestural input on any portion of the display 781 automatically invokes the scrolling operation.
FIG. 19F illustrates an alternative embodiment of FIG. 19E, in which two touches are required to invoke the playback scrolling. Specifically, the combination of a touchdown point within a lower touch zone 783 along with a downward slide input on or near an upper touch zone 782 can invoke the scrolling operation to display the next photo. It should be noted that the methods described in FIGS. 19A through 19E are not form-factor specific, in that they can be implemented on a PC monitor, a portable monitor, a digital camera, or any type of device having a touch screen.
FIG. 19G illustrates an additional gesture that can be input during the playback of media files (e.g., photo files) in accordance with another embodiment. Specifically, and similar to the embodiment shown in FIGS. 18A and 18B, the same movement can be interpreted differently by distinguishing the number of touchdown points on the touch-sensitive display (i.e., the number of fingers). In this example, a vertical downward swipe gesture performed with two fingers can be interpreted as a gesture for deleting the photo file, tagging the photo file (e.g., for compiling a photo album), or any other useful command.
FIG. 19H illustrates the detection of yet other additional gestures using other designated UI zones of the touch-sensitive display. In this example, detection of a touchdown point at another designated zone 756 can be interpreted as a delete, tag, or other useful command. In accordance with one embodiment, the multiple touchdown zones can be displayed as translucent overlays over the photo file.
It should be noted that although FIG. 19 illustrates swipe gestures in a vertical downward direction, it is also contemplated that a swipe in a vertical upward direction, or in a horizontal direction, can be designated as a gestural input of the same commands.
FIG. 20 illustrates one possible algorithm for implementing FIGS. 19A-19F. Specifically, in the first step 790, one of a plurality of photos is displayed on a touch-sensitive display. If a touch on the display screen is detected at 791, then a determination is made at 792 as to whether the touch is a gestural input, and the type of gestural input (e.g., a downward tracking swipe action, a circular tracking rotate action, etc.) is received at 793. In accordance with the detected gestural input, a UI element (e.g., a slider bar or a virtual scroll wheel) is output as needed at 794, after which the action corresponding to the use of the UI element or to the gestural input is invoked at 795.
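A compact sketch of this dispatch (steps 791-795), folding in the 2X two-finger scrolling rate described above; zone hit-testing and rendering are stubbed out, and all names are hypothetical:

```python
def interpret_playback_gesture(in_scroll_zone, touch_count, dy):
    """Steps 792-795: a downward slide inside the scrolling zone advances
    the photos; a two-finger swipe scrolls at twice the rate (2X)."""
    if not in_scroll_zone or dy <= 0:   # dy > 0 means downward motion here
        return None                     # not a scrolling gesture
    step = 2 if touch_count >= 2 else 1
    return ("scroll_next", step)

print(interpret_playback_gesture(True, 1, dy=35.0))  # ('scroll_next', 1)
print(interpret_playback_gesture(True, 2, dy=35.0))  # ('scroll_next', 2)
```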
It should be noted that the methods described in FIGS. 18-20 can also be implemented within a video environment. Specifically, during the playback of a video file, a UI element such as the horizontal slider bar shown in FIG. 18A can be invoked and displayed, whereby, depending on the number of touchdown points detected, a mode of operation can be activated for changing certain adjustable aspects of the video, such as brightness, contrast, etc. At the same time, the scrolling and zooming methods illustrated in FIGS. 19A-19F can be effected in a similar manner, although rewind and fast-forward operations would be performed instead of scrolling.
Additional editing/playback functions of video files can be implemented using gestural inputs over certain pre-existing control elements. In accordance with a preferred embodiment, non-linear time playback of a video file can be effected by selectively contracting or expanding a playback timeline bar. Specifically, FIG. 21A shows a video application 790 (e.g., a video playback application) displaying a video playback 791 together with a progress bar 792, on which a playback queue 793 indicates the time progress of the video playback.
In accordance with a preferred embodiment, the playback queue 793 can be moved forward and backward on the progress bar 792 to effect fast-forward and rewind of the video. The queue can also be held at the same position, or otherwise modulated at a non-linear speed, to effect variable-speed playback or pausing of the video. In accordance with a preferred embodiment, the video application 790 can be displayed on a touch-sensitive display, and the position of the playback queue 793 can be manipulated by a touch of a finger of a hand 501 on the screen over the position where the queue is displayed. That is, the playback queue 793 can serve both as a progress indicator and as a UI element for controlling the speed and temporal position of the video playback.
In accordance with a preferred embodiment, the entire progress bar 792 can serve as a UI element whereby a user can effect non-linear playback of the video by expanding or contracting one or more sections of the progress bar. Specifically, as shown in FIG. 21B, the UI element progress bar 792 can be manipulated by a two-finger zoom-in or zoom-out gesture (as discussed above with reference to FIG. 12). In the example shown in FIG. 21B, the zoom-in gesture causes the playback time between the 60-minute mark and the 80-minute mark to be expanded. The playback speed of the video thereby becomes non-linear, in that the playback speed can be slowed during the time period between the 60- and 80-minute marks. Alternatively, the playback speed of the video can be accelerated between the 0- and 60-minute marks and after the 80-minute mark, while the playback speed between the 60- and 80-minute marks remains normal.
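One way to model this is a piecewise-linear map from position on the bar to media time: expanding a segment gives it more screen length per media minute, so a playback queue moving across the bar at constant speed plays that interval more slowly. A sketch with assumed segment values matching the 60-80 minute example:

```python
# (fraction of the bar, media minutes covered) -- hypothetical values in
# which the 60-80 minute span has been expanded to most of the bar.
SEGMENTS = [(0.25, 60.0),   # 0-60 min compressed into 25% of the bar
            (0.60, 20.0),   # 60-80 min expanded across 60% of the bar
            (0.15, 20.0)]   # 80-100 min in the remaining 15%

def bar_to_media_time(pos):
    """Map a position in [0, 1] along the progress bar to media minutes."""
    elapsed = 0.0
    for width, minutes in SEGMENTS:
        if pos <= width:
            return elapsed + minutes * (pos / width)
        pos -= width
        elapsed += minutes
    return elapsed

print(bar_to_media_time(0.25))  # 60.0 -- end of the compressed span
print(bar_to_media_time(0.55))  # 70.0 -- halfway through the expanded span
```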
FIG. 21C shows an additional UI element 794 displayed within the video application 790. In this example, the UI element 794 can be a virtual scroll wheel whereby a user can further control the playback speed of the video. In combination with the manipulation of the progress bar 792, a user can first designate a section of the video in which the playback speed is slowed, whereupon the user can use the scroll wheel 794 to further modulate the playback queue 793 so as to control the playback direction and/or speed of the video.
FIG. 21D shows other additional touch-sensitive UI elements that can be added to the video application 790 for editing purposes. For example, as shown in FIG. 21D, a slider bar UI element 796 can be added to detect gestural inputs for invoking level adjustments, such as pan adjustment, or adjustments of brightness, color, contrast, gamma, and the like. Similar to the UI element 751 discussed with reference to FIGS. 18A-18E, the slider bar UI element 796 can be used to invoke different modes of operation by varying the number of touchdown points on it.
UI elements 795 can also be displayed within the video application 790 to effect sound editing of the video. Specifically, the UI elements 795 can include a plurality of level adjustments for the different channels, sounds, or music to be mixed with the video.
In accordance with a preferred embodiment, a user of the video application 790 can customize which UI elements are displayed, and can additionally program the UI elements to perform a desired function.
FIGS. 22A and 22B illustrate an example algorithm 800 for effecting the method described with reference to FIGS. 21A-21D. Specifically, as shown in FIG. 22A, the video application 790 can be launched at 802 to provide video playback and/or editing. At 803, the progress bar 792 is displayed. If a touch over the progress bar 792 is detected at 804, then a determination is made at 805 as to whether the touch is a zoom-in or zoom-out command. If the detected touch is not a zoom-in or zoom-out command, then the playback queue can be manipulated in accordance with the tracked touch input. If the detected touch is a zoom gesture, the portion of the progress bar at which the touch is detected can be manipulated to expand or contract in accordance with the gestural input.
As shown in FIG. 22B, steps 808-810 can be performed to selectively display additional UI elements, such as the scroll wheel, the sound mixer, and the slider bar level adjustments, respectively. Touches can be detected at steps 811-813, after which the appropriate functions 814-818 can be invoked.
FIG. 23 illustrates another embodiment of the invention for manipulating the playback and recording of audio or music files. As shown in FIG. 23, a music application 830 can display a pair of virtual turntables 842 and 843, on which two musical records 834 and 835 are being played, each record being either a single or an LP record. The records 834 and 835 can be graphical representations of digital music files (e.g., song A and song B) that are being played back by the music application 830. In other words, the records can be graphical imprints of the music files, as if the music files were imprinted on physical records.
As with a pair of physical turntables, the styluses 844 and 855 can be graphical icon representations of a playback queue, whose position can be changed by touching the icon on the touch-sensitive display screen and dragging it to the desired position on the graphical record. Moving a stylus causes a jump in the playback point of the corresponding song, as on a physical turntable.
Also as with a pair of physical turntables, the start/stop buttons 838 and 839 can be touched by one or more fingers to toggle starting and stopping/pausing the reproduction of a song. Speed variation bars 840 and 841 can be adjusted linearly to control the playback speed of the songs. Windows 831 and 833 can graphically reproduce the frequency representations of the songs being reproduced, while window 832 can display the frequency representation of the actual output of the music application 830, which can simply be one of the songs being reproduced, or a mix/combination of the songs. A mixing/pan bar 850 can be manipulated to modulate or balance the levels of the two songs being reproduced.
During song reproduction, the records 834 and 835 can be manipulated in a manner similar to physical records. For instance, rapid back-and-forth movement of a record can cause the sound effect of record "scratching," as disc jockeys often perform on physical turntables.
It should be noted that the methods described above can be implemented simultaneously during the same gestural stroke. That is, selecting, tracking, zooming, rotating, and panning can all be performed during the same gestural stroke, which can include spreading, rotating, and sliding the fingers. For example, upon set down of at least two fingers, the displayed object (the map) can be associated with or locked to the two fingers. In order to zoom, the user can spread or close their fingers. Each of these operations can occur simultaneously in a continuous motion. For example, the user can spread and close their fingers while rotating and sliding them across the touch screen. Alternatively, the user can segment each of these motions without having to reset the gestural stroke. For example, the user can first spread their fingers, then rotate them, then close them, then slide them, and so on.
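For a two-finger stroke, all three components can be read off a single pair of frames: the centroid displacement gives the pan, the change in finger separation gives the zoom factor, and the change in the angle of the inter-finger segment gives the rotation. A sketch (hypothetical names; assumes the two touches are tracked in a stable order):

```python
import math

def gesture_deltas(p0, p1, q0, q1):
    """One update of a two-finger stroke: p0/p1 are the previous finger
    positions, q0/q1 the current ones. Returns the simultaneous pan
    vector, zoom scale factor, and rotation in degrees."""
    pan = ((q0[0] + q1[0] - p0[0] - p1[0]) / 2,
           (q0[1] + q1[1] - p0[1] - p1[1]) / 2)      # centroid displacement
    zoom = math.dist(q0, q1) / math.dist(p0, p1)     # separation ratio
    rot = math.degrees(math.atan2(q1[1] - q0[1], q1[0] - q0[0])
                       - math.atan2(p1[1] - p0[1], p1[0] - p0[0]))
    rot = (rot + 180) % 360 - 180                    # shortest signed arc
    return pan, zoom, rot

# Fingers spread to twice their separation while the midpoint shifts right:
print(gesture_deltas((0, 0), (10, 0), (0, 0), (20, 0)))
# ((5.0, 0.0), 2.0, 0.0)
```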
It should also be noted that a human finger need not always be used to effect the gestural input. Where possible, a pointing device such as a stylus is also sufficient to effect the gestural input.
Additional examples of gestural strokes, including interactions with UI elements (e.g., a virtual scroll wheel), that can be used as inputs for effecting interface commands are described in commonly assigned co-pending U.S. application Ser. No. 10/903,964, published as U.S. Patent Publication No. US2006/0026521, and U.S. application Ser. No. 11/038,590, published as U.S. Patent Publication No. US2006/0026535, the entire contents of which are incorporated herein by reference.
Many alterations and modifications can be made by those having ordinary skill in the art without departing from the scope and spirit of the invention. Therefore, it must be understood that the embodiments presented are only examples and should not be taken as limiting the invention as defined by the claims. For example, although embodiments of the invention have been described with respect to personal computing devices, it should be understood that the invention is not limited to desktop or portable computers, but can be applied to other computing applications, such as mobile communication devices, standalone multimedia reproduction devices, and the like.
The words used in this specification to describe the invention and its various embodiments are to be understood not only in the sense of their commonly defined meanings, but also to include, by special definition in this specification, structures, materials, or acts beyond the scope of the commonly defined meanings. Thus, if an element can be understood in the context of this specification as including more than one meaning, then its use in a claim must be understood as being generic to all possible meanings supported by the specification and by the word itself.
The definitions of the words or elements of the following claims are therefore defined in this specification to include not only the combination of elements literally set forth, but also all equivalent structures, materials, or acts for performing substantially the same function in substantially the same way to obtain substantially the same result. In this sense, it is contemplated that an equivalent substitution of two or more elements can be made for any one of the elements in the claims below, or that a single element can be substituted for two or more elements in a claim.
Insubstantial changes from the claimed subject matter, as viewed by a person of ordinary skill in the art, now known or later devised, are expressly contemplated as being within the scope of the claims. Therefore, obvious substitutions now or later known to one of ordinary skill in the art are defined to be within the scope of the defined claim elements.
The claims are thus to be understood to include what is specifically illustrated and described above, what is conceptually equivalent, and what can be obviously substituted. For example, the term "computer" or "computer system" as recited in the claims shall include at least a desktop computer, a portable computer, and any mobile computing device, such as a mobile communication device (e.g., a cellular or WiFi/Skype telephone, an e-mail communication device, or a personal digital assistant device) or a multimedia reproduction device (e.g., an iPod, an MP3 player, or any digital graphics/photo reproducing device).

Claims (6)

1. A handheld mobile communication device, characterized in that it comprises:
a touch-sensitive display screen;
means for causing the display screen to display a portion of a media file, the media file comprising at least one of a text item and an image item;
means for detecting a touch scrolling input on the surface of the display screen, the touch scrolling input comprising a touchdown point of a human finger on the surface of the display screen, the touchdown point corresponding to a location on the display screen at which the portion of the media file is displayed;
means for detecting a drag movement of the touchdown point of the human finger on the display screen, the drag movement crossing over part of the displayed portion of the media file and comprising both a vertical and a horizontal vector component;
means for determining that the drag movement of the finger touchdown point indicates a scrolling operation; and
means for scrolling the media file on the display screen, wherein the scrolling is restricted to one of the vertical and horizontal directions.
2. The handheld mobile communication device of claim 1, further comprising means for determining whether the touch scrolling input is within a predetermined proximity of the media file and, if so, performing a selection operation on the media file.
3. The handheld mobile communication device of claim 1, further comprising means for utilizing only the horizontal component of the drag movement, or only the vertical component of the drag movement, as input, so as to effect scrolling in the horizontal or the vertical direction, respectively.
4. A handheld mobile communication device, characterized in that it comprises:
a touch-sensitive display screen;
means for causing the display screen to display a portion of a media file, the media file comprising at least one of a text item and an image item;
means for detecting a touch scrolling input on the surface of the display screen, the touch scrolling input comprising a touchdown point of a human finger on the surface of the display screen, the touchdown point corresponding to a location on the display screen at which the portion of the media file is displayed;
means for detecting a drag movement of the touchdown point of the human finger on the display screen, the drag movement crossing over part of the displayed portion of the media file;
means for detecting the direction of the drag movement of the touchdown point of the human finger, wherein the direction of the drag movement comprises a vertical component vector and a horizontal component vector; and
means for scrolling the media file on the display screen in accordance with the detected direction of the drag movement.
5. The handheld mobile communication device of claim 4, further comprising means for determining whether the touch scrolling input is within a predetermined proximity of the media file and, if so, performing a selection operation on the media file.
6. The handheld mobile communication device of claim 4, further comprising means for utilizing only the horizontal component of the drag movement, or only the vertical component of the drag movement, as input, so as to effect scrolling in the horizontal or the vertical direction, respectively.
CN 200720194296 2007-01-05 2007-12-05 Hand-hold mobile communicating device Expired - Lifetime CN201181467Y (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US87875407P 2007-01-05 2007-01-05
US60/878,754 2007-01-05
US11/818,342 2007-06-13

Publications (1)

Publication Number Publication Date
CN201181467Y true CN201181467Y (en) 2009-01-14

Family

ID=38860004

Family Applications (2)

Application Number Title Priority Date Filing Date
CN 200720194296 Expired - Lifetime CN201181467Y (en) 2007-01-05 2007-12-05 Hand-hold mobile communicating device
CN200780051755.2A Active CN101611373B (en) 2007-01-05 2007-12-28 Controlling, manipulating, and editing gestures of media files using touch sensitive devices

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN200780051755.2A Active CN101611373B (en) 2007-01-05 2007-12-28 Controlling, manipulating, and editing gestures of media files using touch sensitive devices

Country Status (2)

Country Link
CN (2) CN201181467Y (en)
DE (1) DE202007014957U1 (en)

Cited By (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101847055A (en) * 2009-03-24 2010-09-29 鸿富锦精密工业(深圳)有限公司 Input method based on touch screen
CN101996033A (en) * 2009-08-03 2011-03-30 Lg电子株式会社 Mobile terminal and controlling method thereof
WO2011035723A1 (en) * 2009-09-23 2011-03-31 Han Dingnan Method and interface for man-machine interaction
CN102043510A (en) * 2009-10-09 2011-05-04 禾瑞亚科技股份有限公司 Method and device for analyzing two-dimension sensing information
CN102063248A (en) * 2009-11-16 2011-05-18 索尼公司 Information processing apparatus, information processing method, and program
CN102193714A (en) * 2010-03-11 2011-09-21 龙旗科技(上海)有限公司 Man-machine interactive mode for data grouping management of mobile terminal
CN102200882A (en) * 2010-03-24 2011-09-28 Nec卡西欧移动通信株式会社 Terminal device and control program thereof
CN102243573A (en) * 2011-07-27 2011-11-16 北京风灵创景科技有限公司 Method and device for managing element attribute in application program
CN102243662A (en) * 2011-07-27 2011-11-16 北京风灵创景科技有限公司 Method for displaying browser interface on mobile equipment
CN102270081A (en) * 2010-06-03 2011-12-07 腾讯科技(深圳)有限公司 Method and device for adjusting size of list element
CN102439555A (en) * 2009-05-19 2012-05-02 三星电子株式会社 Method of operating a portable terminal and portable terminal supporting the same
CN102439859A (en) * 2009-05-19 2012-05-02 三星电子株式会社 Mobile device and method for executing particular function through touch event on communication related list
CN102549538A (en) * 2009-07-21 2012-07-04 思科技术公司 Gradual proximity touch screen
CN102750096A (en) * 2012-06-15 2012-10-24 深圳乐投卡尔科技有限公司 Vehicle-mounted Android platform multi-point gesture control method
CN102770837A (en) * 2010-02-25 2012-11-07 微软公司 Multi-screen pinch and expand gestures
CN101799727B (en) * 2009-02-11 2012-11-07 晨星软件研发(深圳)有限公司 Signal processing device and method of multipoint touch interface and selecting method of user interface image
CN102799299A (en) * 2011-05-27 2012-11-28 华硕电脑股份有限公司 Computer system with touch screen and processing method for gestures of computer system
CN102884499A (en) * 2010-03-26 2013-01-16 诺基亚公司 Apparatus and method for proximity based input
CN102292693B (en) * 2009-01-21 2013-03-13 微软公司 Bi-modal multiscreen interactivity
US8400423B2 (en) 2009-10-09 2013-03-19 Egalax—Empia Technology Inc. Method and device for analyzing positions
CN103069491A (en) * 2010-08-27 2013-04-24 三星电子株式会社 Method and apparatus for playing contents
US8471826B2 (en) 2009-10-09 2013-06-25 Egalax—Empia Technology Inc. Method and device for position detection
US8537131B2 (en) 2009-10-09 2013-09-17 Egalax—Empia Technology Inc. Method and device for converting sensing information
CN103329075A (en) * 2011-01-06 2013-09-25 Tivo有限公司 Method and apparatus for gesture based controls
CN103376945A (en) * 2012-04-13 2013-10-30 佳能株式会社 Information processing apparatus and method for controlling the same
US8587555B2 (en) 2009-10-09 2013-11-19 Egalax—Empia Technology Inc. Method and device for capacitive position detection
US8643613B2 (en) 2009-10-09 2014-02-04 Egalax—Empia Technology Inc. Method and device for dual-differential sensing
CN103733163A (en) * 2011-08-05 2014-04-16 三星电子株式会社 Electronic apparatus using motion recognition and method for controlling electronic apparatus thereof
CN104035696A (en) * 2013-03-04 2014-09-10 观致汽车有限公司 Display method and device of vehicle-mounted message center on touch display interface
US9075522B2 (en) 2010-02-25 2015-07-07 Microsoft Technology Licensing, Llc Multi-screen bookmark hold gesture
US9261964B2 (en) 2005-12-30 2016-02-16 Microsoft Technology Licensing, Llc Unintentional touch rejection
US9274682B2 (en) 2010-02-19 2016-03-01 Microsoft Technology Licensing, Llc Off-screen gestures to create on-screen input
CN105378634A (en) * 2013-05-17 2016-03-02 思杰系统有限公司 Remoting or localizing touch gestures at a virtualization client agent
US9310994B2 (en) 2010-02-19 2016-04-12 Microsoft Technology Licensing, Llc Use of bezel as an input mechanism
US9367205B2 (en) 2010-02-19 2016-06-14 Microsoft Technolgoy Licensing, Llc Radial menus with bezel gestures
US9411504B2 (en) 2010-01-28 2016-08-09 Microsoft Technology Licensing, Llc Copy and staple gestures
US9411498B2 (en) 2010-01-28 2016-08-09 Microsoft Technology Licensing, Llc Brush, carbon-copy, and fill gestures
US9454304B2 (en) 2010-02-25 2016-09-27 Microsoft Technology Licensing, Llc Multi-screen dual tap gesture
US9477337B2 (en) 2014-03-14 2016-10-25 Microsoft Technology Licensing, Llc Conductive trace routing for display and bezel sensors
US9483152B2 (en) 2009-10-09 2016-11-01 Egalax_Empia Technology Inc. Method and device for dual-differential sensing
US9519356B2 (en) 2010-02-04 2016-12-13 Microsoft Technology Licensing, Llc Link gestures
US9582122B2 (en) 2012-11-12 2017-02-28 Microsoft Technology Licensing, Llc Touch-sensitive bezel techniques
US9733895B2 (en) 2011-08-05 2017-08-15 Samsung Electronics Co., Ltd. Method for controlling electronic apparatus based on voice recognition and motion recognition, and electronic apparatus applying the same
US9864471B2 (en) 2009-10-09 2018-01-09 Egalax_Empia Technology Inc. Method and processor for analyzing two-dimension information
US9965165B2 (en) 2010-02-19 2018-05-08 Microsoft Technology Licensing, Llc Multi-finger gestures
CN108616771A (en) * 2018-04-25 2018-10-02 维沃移动通信有限公司 Video broadcasting method and mobile terminal

Families Citing this family (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI363983B (en) * 2008-04-25 2012-05-11 Benq Corp Interactive electronic apparatus and interaction method thereof
DE102008032451C5 (en) * 2008-07-10 2017-10-19 Rational Ag Display method and cooking appliance therefor
DE102008032448B4 (en) 2008-07-10 2023-11-02 Rational Ag Display method and cooking device therefor
EP2187290A1 (en) * 2008-11-18 2010-05-19 Studer Professional Audio GmbH Input device and method of detecting a user input with an input device
US8957865B2 (en) 2009-01-05 2015-02-17 Apple Inc. Device, method, and graphical user interface for manipulating a user interface object
KR20100086678A (en) * 2009-01-23 2010-08-02 삼성전자주식회사 Apparatus and method for playing of multimedia item
KR101691938B1 (en) 2010-01-06 2017-01-03 삼성전자주식회사 Method and apparatus for setting of repeat playing section in a portable terminal
US8786559B2 (en) * 2010-01-06 2014-07-22 Apple Inc. Device, method, and graphical user interface for manipulating tables using multi-contact gestures
CN101853128A (en) * 2010-05-08 2010-10-06 杭州惠道科技有限公司 Multi-touch method for human-computer interface of slide-wheel
US20110298720A1 (en) * 2010-06-02 2011-12-08 Rockwell Automation Technologies, Inc. System and method for the operation of a touch screen
EP2395440A3 (en) * 2010-06-14 2012-01-11 Lg Electronics Inc. Mobile terminal and conrolling method thereof
CN101957718A (en) * 2010-06-22 2011-01-26 宇龙计算机通信科技(深圳)有限公司 Method and device for moving icons and digital terminal
US8773370B2 (en) 2010-07-13 2014-07-08 Apple Inc. Table editing systems with gesture-based insertion and deletion of columns and rows
CN101986249A (en) * 2010-07-14 2011-03-16 上海无戒空间信息技术有限公司 Method for controlling computer by using gesture object and corresponding computer system
US9268431B2 (en) * 2010-08-27 2016-02-23 Apple Inc. Touch and hover switching
US8767019B2 (en) 2010-08-31 2014-07-01 Sovanta Ag Computer-implemented method for specifying a processing operation
US8972467B2 (en) 2010-08-31 2015-03-03 Sovanta Ag Method for selecting a data set from a plurality of data sets by means of an input device
US9710154B2 (en) * 2010-09-03 2017-07-18 Microsoft Technology Licensing, Llc Dynamic gesture parameters
CN101945499A (en) * 2010-09-06 2011-01-12 深圳市同洲电子股份有限公司 Method, terminal and system for transferring files
KR101685991B1 (en) 2010-09-30 2016-12-13 엘지전자 주식회사 Mobile terminal and control method for mobile terminal
CN102467327A (en) * 2010-11-10 2012-05-23 上海无戒空间信息技术有限公司 Method for generating and editing gesture object and operation method of audio data
CN102025831A (en) * 2010-11-18 2011-04-20 华为终端有限公司 Multimedia playing method and terminal
KR20120075839A (en) * 2010-12-29 2012-07-09 삼성전자주식회사 Method and apparatus for providing mouse right click function in touch screen terminal
CN102681748B (en) * 2011-03-09 2015-01-28 联想(北京)有限公司 Information processing equipment and information processing method
US9281010B2 (en) 2011-05-31 2016-03-08 Samsung Electronics Co., Ltd. Timeline-based content control method and apparatus using dynamic distortion of timeline bar, and method and apparatus for controlling video and audio clips using the same
JP5751030B2 (en) * 2011-06-03 2015-07-22 ソニー株式会社 Display control apparatus, display control method, and program
CN103001933A (en) * 2011-09-15 2013-03-27 北京同步科技有限公司 Interactive multimedia information distribution terminal and information distribution method thereof
CN102890694A (en) * 2011-09-22 2013-01-23 北京师科阳光信息技术有限公司 Time shaft system and implementation method thereof
US9052810B2 (en) * 2011-09-28 2015-06-09 Sonos, Inc. Methods and apparatus to manage zones of a multi-zone media playback system
CN103092389A (en) * 2011-11-04 2013-05-08 德尔福技术有限公司 Touch screen device and method for achieving virtual mouse action
US9405463B2 (en) * 2011-11-25 2016-08-02 Samsung Electronics Co., Ltd. Device and method for gesturally changing object attributes
CN103247310A (en) * 2012-02-14 2013-08-14 索尼爱立信移动通讯有限公司 Multimedia playing control method, playing control module and playing terminal
CN102609143A (en) * 2012-02-15 2012-07-25 张群 Handheld electronic equipment and video playing and controlling method thereof
JP6004693B2 (en) * 2012-03-23 2016-10-12 キヤノン株式会社 Display control apparatus and control method thereof
US20130257792A1 (en) * 2012-04-02 2013-10-03 Synaptics Incorporated Systems and methods for determining user input using position information and force sensing
CN102866988B (en) * 2012-08-28 2015-10-21 中兴通讯股份有限公司 A kind of terminal and realization towing thereof copy the method for paste text
US20140109012A1 (en) * 2012-10-16 2014-04-17 Microsoft Corporation Thumbnail and document map based navigation in a document
US20140118265A1 (en) * 2012-10-29 2014-05-01 Compal Electronics, Inc. Electronic apparatus with proximity sensing capability and proximity sensing control method therefor
CN103035273B (en) * 2012-12-12 2016-05-25 宁波高新区百瑞音响科技有限公司 A kind of device that utilizes knob type digital code switch to switch audio file
KR102091077B1 (en) * 2012-12-14 2020-04-14 삼성전자주식회사 Mobile terminal and method for controlling feedback of an input unit, and the input unit and method therefor
CN103885623A (en) * 2012-12-24 2014-06-25 腾讯科技(深圳)有限公司 Mobile terminal, system and method for processing sliding event into editing gesture
CN103902173B (en) * 2012-12-26 2017-12-26 联想(北京)有限公司 Portable terminal and its information processing method and display processing method
CN103076985B (en) * 2013-01-31 2016-03-02 北京魔力时间科技有限公司 Accurately manipulate and display video playing progress rate device and using method based on touch screen
WO2014132893A1 (en) * 2013-02-27 2014-09-04 アルプス電気株式会社 Operation detection device
US11209975B2 (en) * 2013-03-03 2021-12-28 Microsoft Technology Licensing, Llc Enhanced canvas environments
CN104123088B (en) * 2013-04-24 2018-01-02 华为技术有限公司 Mouse action implementation method and its device and touch screen terminal
CN104216625A (en) * 2013-05-31 2014-12-17 华为技术有限公司 Display object display position adjusting method and terminal equipment
CN103327247B (en) * 2013-06-17 2017-01-11 神思依图(北京)科技有限公司 Portrait collection operation device and method
JP6189680B2 (en) * 2013-08-23 2017-08-30 シャープ株式会社 Interface device, interface method, interface program, and computer-readable recording medium storing the program
US9519420B2 (en) * 2013-10-16 2016-12-13 Samsung Electronics Co., Ltd. Apparatus and method for editing synchronous media
CN104902331B (en) * 2014-03-07 2018-08-10 联想(北京)有限公司 A kind of playing progress rate adjusting method and electronic equipment
CN104077028A (en) * 2014-05-28 2014-10-01 天津三星通信技术研究有限公司 Equipment and method for controlling display item in electronic equipment
CN105653111A (en) * 2014-11-14 2016-06-08 神讯电脑(昆山)有限公司 Touch control input method and electronic device thereof
CN104571871A (en) * 2015-01-26 2015-04-29 深圳市中兴移动通信有限公司 Method and system for selecting files
CN105045513B (en) * 2015-08-27 2019-02-12 Oppo广东移动通信有限公司 Touch operation method and handheld device
CN105224220A (en) * 2015-09-08 2016-01-06 深圳市金立通信设备有限公司 A kind of control method of media play and device
CN106612425B (en) * 2015-10-23 2019-04-12 腾讯科技(深圳)有限公司 Image adjusting method and terminal device
DE102015222164A1 (en) * 2015-11-11 2017-05-11 Kuka Roboter Gmbh Method and computer program for generating a graphical user interface of a manipulator program
CN105573616B (en) * 2015-12-10 2018-05-29 广东欧珀移动通信有限公司 A kind of playlist control method and mobile terminal
CN105573631A (en) * 2015-12-14 2016-05-11 联想(北京)有限公司 Touch display electronic device and control method thereof
CN106527917B (en) * 2016-09-23 2020-09-29 北京仁光科技有限公司 Multi-finger touch operation identification method for screen interaction system
CN107438839A (en) * 2016-10-25 2017-12-05 深圳市大疆创新科技有限公司 A kind of multimedia editing method, device and intelligent terminal

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5900875A (en) * 1997-01-29 1999-05-04 3Com Corporation Method and apparatus for interacting with a portable computer system

Cited By (99)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9946370B2 (en) 2005-12-30 2018-04-17 Microsoft Technology Licensing, Llc Unintentional touch rejection
US9261964B2 (en) 2005-12-30 2016-02-16 Microsoft Technology Licensing, Llc Unintentional touch rejection
US9594457B2 (en) 2005-12-30 2017-03-14 Microsoft Technology Licensing, Llc Unintentional touch rejection
US10019080B2 (en) 2005-12-30 2018-07-10 Microsoft Technology Licensing, Llc Unintentional touch rejection
US9952718B2 (en) 2005-12-30 2018-04-24 Microsoft Technology Licensing, Llc Unintentional touch rejection
CN102292693B (en) * 2009-01-21 2013-03-13 Microsoft Corporation Bi-modal multiscreen interactivity
CN101799727B (en) * 2009-02-11 2012-11-07 MStar Software R&D (Shenzhen) Ltd. Signal processing device and method for a multi-touch interface, and method for selecting user interface images
CN101847055A (en) * 2009-03-24 2010-09-29 Hongfujin Precision Industry (Shenzhen) Co., Ltd. Input method based on touch screen
CN102439555A (en) * 2009-05-19 2012-05-02 Samsung Electronics Co., Ltd. Method of operating a portable terminal and portable terminal supporting the same
US9817546B2 (en) 2009-05-19 2017-11-14 Samsung Electronics Co., Ltd. Mobile device and method for executing particular function through touch event on communication related list
CN102439859A (en) * 2009-05-19 2012-05-02 Samsung Electronics Co., Ltd. Mobile device and method for executing particular function through touch event on communication related list
CN102439555B (en) * 2009-05-19 2015-07-15 Samsung Electronics Co., Ltd. Method of operating a portable terminal and portable terminal supporting the same
US11029816B2 (en) 2009-05-19 2021-06-08 Samsung Electronics Co., Ltd. Mobile device and method for executing particular function through touch event on communication related list
US8910068B2 (en) 2009-05-19 2014-12-09 Samsung Electronics Co., Ltd. Method of operating a portable terminal and portable terminal supporting the same
CN102549538B (en) * 2009-07-21 2016-04-27 Cisco Technology, Inc. Gradual proximity touch screen
CN102549538A (en) * 2009-07-21 2012-07-04 Cisco Technology, Inc. Gradual proximity touch screen
US8595646B2 (en) 2009-08-03 2013-11-26 Lg Electronics Inc. Mobile terminal and method of receiving input in the mobile terminal
CN101996033A (en) * 2009-08-03 2011-03-30 LG Electronics Inc. Mobile terminal and control method thereof
WO2011035723A1 (en) * 2009-09-23 2011-03-31 Han Dingnan Method and interface for man-machine interaction
CN102812426A (en) * 2009-09-23 2012-12-05 Han Dingnan Method and interface for man-machine interaction
US9081441B2 (en) 2009-10-09 2015-07-14 Egalax_Empia Technology Inc. Method and device for analyzing two-dimension sensing information
US8633917B2 (en) 2009-10-09 2014-01-21 Egalax_Empia Technology Inc. Method and device for capacitive position detection
US8400422B2 (en) 2009-10-09 2013-03-19 Egalax_Empia Technology Inc. Method and device for analyzing positions
US8400425B2 (en) 2009-10-09 2013-03-19 Egalax_Empia Technology Inc. Method and device for analyzing positions
US9798427B2 (en) 2009-10-09 2017-10-24 Egalax_Empia Technology Inc. Method and device for dual-differential sensing
US8473243B2 (en) 2009-10-09 2013-06-25 Egalax_Empia Technology Inc. Method and device for analyzing positions
US8471826B2 (en) 2009-10-09 2013-06-25 Egalax_Empia Technology Inc. Method and device for position detection
US8497851B2 (en) 2009-10-09 2013-07-30 Egalax_Empia Technology Inc. Method and device for analyzing positions
US8537131B2 (en) 2009-10-09 2013-09-17 Egalax_Empia Technology Inc. Method and device for converting sensing information
CN102043510A (en) * 2009-10-09 2011-05-04 Egalax_Empia Technology Inc. Method and device for analyzing two-dimension sensing information
US8564564B2 (en) 2009-10-09 2013-10-22 Egalax_Empia Technology Inc. Method and device for position detection
US8570289B2 (en) 2009-10-09 2013-10-29 Egalax_Empia Technology Inc. Method and device for position detection
US10310693B2 (en) 2009-10-09 2019-06-04 Egalax_Empia Technology Inc. Controller for position detection
US8583401B2 (en) 2009-10-09 2013-11-12 Egalax_Empia Technology Inc. Method and device for analyzing positions
US8587555B2 (en) 2009-10-09 2013-11-19 Egalax_Empia Technology Inc. Method and device for capacitive position detection
US8400423B2 (en) 2009-10-09 2013-03-19 Egalax_Empia Technology Inc. Method and device for analyzing positions
US8600698B2 (en) 2009-10-09 2013-12-03 Egalax_Empia Technology Inc. Method and device for analyzing positions
US9606692B2 (en) 2009-10-09 2017-03-28 Egalax_Empia Technology Inc. Controller for position detection
US8643613B2 (en) 2009-10-09 2014-02-04 Egalax_Empia Technology Inc. Method and device for dual-differential sensing
US8400424B2 (en) 2009-10-09 2013-03-19 Egalax_Empia Technology Inc. Method and device for analyzing positions
CN104331183B (en) * 2009-10-09 2017-12-15 Egalax_Empia Technology Inc. Method and apparatus for analyzing two-dimension dual-differential sensing information
US8872776B2 (en) 2009-10-09 2014-10-28 Egalax_Empia Technology Inc. Method and device for analyzing two-dimension sensing information
US8890821B2 (en) 2009-10-09 2014-11-18 Egalax_Empia Technology Inc. Method and device for dual-differential sensing
US9864471B2 (en) 2009-10-09 2018-01-09 Egalax_Empia Technology Inc. Method and processor for analyzing two-dimension information
US8941597B2 (en) 2009-10-09 2015-01-27 Egalax_Empia Technology Inc. Method and device for analyzing two-dimension sensing information
US9483152B2 (en) 2009-10-09 2016-11-01 Egalax_Empia Technology Inc. Method and device for dual-differential sensing
US9069410B2 (en) 2009-10-09 2015-06-30 Egalax_Empia Technology Inc. Method and device for analyzing two-dimension sensing information
US9977556B2 (en) 2009-10-09 2018-05-22 Egalax_Empia Technology Inc. Controller for position detection
US10101372B2 (en) 2009-10-09 2018-10-16 Egalax_Empia Technology Inc. Method and device for analyzing positions
US9141216B2 (en) 2009-10-09 2015-09-22 Egalax_Empia Technology Inc. Method and device for dual-differential sensing
CN102063248A (en) * 2009-11-16 2011-05-18 Sony Corporation Information processing apparatus, information processing method, and program
US9411498B2 (en) 2010-01-28 2016-08-09 Microsoft Technology Licensing, Llc Brush, carbon-copy, and fill gestures
US9857970B2 (en) 2010-01-28 2018-01-02 Microsoft Technology Licensing, Llc Copy and staple gestures
US9411504B2 (en) 2010-01-28 2016-08-09 Microsoft Technology Licensing, Llc Copy and staple gestures
US10282086B2 (en) 2010-01-28 2019-05-07 Microsoft Technology Licensing, Llc Brush, carbon-copy, and fill gestures
US9519356B2 (en) 2010-02-04 2016-12-13 Microsoft Technology Licensing, Llc Link gestures
US9310994B2 (en) 2010-02-19 2016-04-12 Microsoft Technology Licensing, Llc Use of bezel as an input mechanism
US9274682B2 (en) 2010-02-19 2016-03-01 Microsoft Technology Licensing, Llc Off-screen gestures to create on-screen input
US9367205B2 (en) 2010-02-19 2016-06-14 Microsoft Technology Licensing, Llc Radial menus with bezel gestures
US9965165B2 (en) 2010-02-19 2018-05-08 Microsoft Technology Licensing, Llc Multi-finger gestures
US10268367B2 (en) 2010-02-19 2019-04-23 Microsoft Technology Licensing, Llc Radial menus with bezel gestures
CN102770837B (en) * 2010-02-25 2016-12-21 Microsoft Technology Licensing, Llc Multi-screen pinch and expand gestures
US11055050B2 (en) 2010-02-25 2021-07-06 Microsoft Technology Licensing, Llc Multi-device pairing and combined display
CN102770837A (en) * 2010-02-25 2012-11-07 Microsoft Corporation Multi-screen pinch and expand gestures
US9454304B2 (en) 2010-02-25 2016-09-27 Microsoft Technology Licensing, Llc Multi-screen dual tap gesture
US9075522B2 (en) 2010-02-25 2015-07-07 Microsoft Technology Licensing, Llc Multi-screen bookmark hold gesture
CN102193714A (en) * 2010-03-11 Longcheer Technology (Shanghai) Co., Ltd. Man-machine interaction method for data grouping management on a mobile terminal
CN102200882B (en) * 2010-03-24 2015-03-11 NEC Casio Mobile Communications, Ltd. Terminal device and control program thereof
CN102200882A (en) * 2010-03-24 2011-09-28 NEC Casio Mobile Communications, Ltd. Terminal device and control program thereof
CN102884499A (en) * 2010-03-26 2013-01-16 Nokia Corporation Apparatus and method for proximity-based input
CN102270081A (en) * 2010-06-03 2011-12-07 Tencent Technology (Shenzhen) Co., Ltd. Method and device for adjusting the size of list elements
CN102270081B (en) * 2010-06-03 2015-09-23 Tencent Technology (Shenzhen) Co., Ltd. Method and device for adjusting the size of list elements
CN103069491A (en) * 2010-08-27 2013-04-24 Samsung Electronics Co., Ltd. Method and apparatus for playing contents
CN103329075A (en) * 2011-01-06 2013-09-25 TiVo Inc. Method and apparatus for gesture-based controls
CN103329075B (en) * 2011-01-06 2017-12-26 TiVo Solutions Inc. Method and apparatus for gesture-based controls
CN102799299B (en) * 2011-05-27 2015-11-25 ASUSTeK Computer Inc. Computer system with touch screen and gesture processing method thereof
CN102799299A (en) * 2011-05-27 2012-11-28 ASUSTeK Computer Inc. Computer system with touch screen and gesture processing method thereof
CN102243662A (en) * 2011-07-27 2011-11-16 Beijing Fengling Chuangjing Technology Co., Ltd. Method for displaying a browser interface on a mobile device
CN102243573A (en) * 2011-07-27 2011-11-16 Beijing Fengling Chuangjing Technology Co., Ltd. Method and device for managing element attributes in an application program
US9733895B2 (en) 2011-08-05 2017-08-15 Samsung Electronics Co., Ltd. Method for controlling electronic apparatus based on voice recognition and motion recognition, and electronic apparatus applying the same
CN103733163A (en) * 2011-08-05 2014-04-16 Samsung Electronics Co., Ltd. Electronic apparatus using motion recognition and method for controlling the electronic apparatus
CN103376945A (en) * 2012-04-13 2013-10-30 Canon Inc. Information processing apparatus and method for controlling the same
US9195381B2 (en) 2012-04-13 2015-11-24 Canon Kabushiki Kaisha Information processing apparatus, method for controlling the same, and storage medium to receive a touch operation for rotating a displayed image
CN103376945B (en) * 2012-04-13 2016-08-03 Canon Inc. Information processing apparatus and control method thereof
CN102750096A (en) * 2012-06-15 2012-10-24 Shenzhen Letou Kaer Technology Co., Ltd. Multi-point gesture control method for a vehicle-mounted Android platform
US10656750B2 (en) 2012-11-12 2020-05-19 Microsoft Technology Licensing, Llc Touch-sensitive bezel techniques
US9582122B2 (en) 2012-11-12 2017-02-28 Microsoft Technology Licensing, Llc Touch-sensitive bezel techniques
CN104035696B (en) * 2013-03-04 2017-12-19 Qoros Automotive Co., Ltd. Display method and device for a vehicle-mounted message center on a touch display interface
CN104035696A (en) * 2013-03-04 2014-09-10 Qoros Automotive Co., Ltd. Display method and device for a vehicle-mounted message center on a touch display interface
CN105378634B (en) * 2013-05-17 2019-03-29 Citrix Systems, Inc. Remoting or localizing touch gestures at a virtualization client agent
CN105378634A (en) * 2013-05-17 2016-03-02 Citrix Systems, Inc. Remoting or localizing touch gestures at a virtualization client agent
US10754436B2 (en) 2013-05-17 2020-08-25 Citrix Systems, Inc. Remoting or localizing touch gestures at a virtualization client agent
US10180728B2 (en) 2013-05-17 2019-01-15 Citrix Systems, Inc. Remoting or localizing touch gestures at a virtualization client agent
US11209910B2 (en) 2013-05-17 2021-12-28 Citrix Systems, Inc. Remoting or localizing touch gestures at a virtualization client agent
US11513609B2 (en) 2013-05-17 2022-11-29 Citrix Systems, Inc. Remoting or localizing touch gestures
US9946383B2 (en) 2014-03-14 2018-04-17 Microsoft Technology Licensing, Llc Conductive trace routing for display and bezel sensors
US9477337B2 (en) 2014-03-14 2016-10-25 Microsoft Technology Licensing, Llc Conductive trace routing for display and bezel sensors
CN108616771B (en) * 2018-04-25 2021-01-15 Vivo Mobile Communication Co., Ltd. Video playing method and mobile terminal
CN108616771A (en) * 2018-04-25 2018-10-02 Vivo Mobile Communication Co., Ltd. Video playing method and mobile terminal

Also Published As

Publication number Publication date
CN101611373A (en) 2009-12-23
CN101611373B (en) 2014-01-29
DE202007014957U1 (en) 2007-12-27

Similar Documents

Publication Publication Date Title
CN201181467Y (en) Hand-hold mobile communicating device
CN201266371Y (en) Handheld mobile communication equipment
CN103631496B (en) Gestures for controlling, manipulating, and editing of media files using touch sensitive devices
JP7321197B2 (en) Information processing device, information processing method, and computer program
US8970503B2 (en) Gestures for devices having one or more touch sensitive surfaces
CN101482794B (en) Mode-based graphical user interfaces for touch sensitive input devices
TWI669652B (en) Information processing device, information processing method and computer program
US20100328224A1 (en) Playback control using a touch interface
JP2013089202A (en) Input control unit, input control method and input control program
JP2013089200A (en) Input control unit, input control method and input control program
JP2013089201A (en) Input control unit, input control method and input control program
AU2011253700B2 (en) Gestures for controlling, manipulating, and editing of media files using touch sensitive devices

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
CX01 Expiry of patent term

Granted publication date: 20090114