CN102884498B - Method of performing input on a computing device - Google Patents
Method of performing input on a computing device
- Publication number
- CN102884498B CN102884498B CN201180009579.2A CN201180009579A CN102884498B CN 102884498 B CN102884498 B CN 102884498B CN 201180009579 A CN201180009579 A CN 201180009579A CN 102884498 B CN102884498 B CN 102884498B
- Authority
- CN
- China
- Prior art keywords
- bezel
- menu
- input
- frame
- function
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0483—Interaction with page-structured environments, e.g. book metaphor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/033—Indexing scheme relating to G06F3/033
- G06F2203/0339—Touch strips, e.g. orthogonal touch strips to control cursor movement or scrolling; single touch strip to adjust parameter or to implement a row of soft keys
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Abstract
Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device can be used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create on-screen input through bezel gestures. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger/different-hand bezel gestures.
Description
Technical field
The present invention relates to a method of performing input on a computing device, and in particular to off-screen gestures used to create on-screen input.
Background technology
One of the challenges that continues to face designers of devices having user-engageable displays, such as touch displays, pertains to providing enhanced functionality for users without permanently manifesting that functionality as part of the "chrome" of the device's user interface. This is so not only for devices having larger or multiple screens, but also in the context of devices having a relatively small footprint, such as tablet PCs, hand-held devices, smaller multi-screen devices, and the like.
Summary of the invention
This Summary is provided to introduce, in simplified form, a selection of concepts that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to assist in determining the scope of the claimed subject matter.
A bezel for touch displays is described. In at least some embodiments, the bezel of a device can be used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create on-screen input through bezel gestures. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger/different-hand bezel gestures.
Brief description of the drawings
The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances of the description and the figures may indicate similar or identical items.
Fig. 1 is an illustration of an environment in an example implementation in accordance with one or more embodiments.
Fig. 2 is an illustration of a system in an example implementation showing Fig. 1 in greater detail.
Fig. 3 illustrates an example computing device in accordance with one or more embodiments.
Fig. 4 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
Fig. 5 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
Fig. 6 illustrates an example computing device in accordance with one or more embodiments.
Fig. 7 illustrates an example computing device in accordance with one or more embodiments.
Fig. 8 illustrates an example computing device in accordance with one or more embodiments.
Fig. 9 illustrates an example computing device in accordance with one or more embodiments.
Fig. 10 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
Fig. 11 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
Fig. 12 illustrates an example computing device in accordance with one or more embodiments.
Fig. 13 illustrates an example computing device in accordance with one or more embodiments.
Fig. 14 illustrates an example computing device in accordance with one or more embodiments.
Fig. 15 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
Fig. 16 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
Fig. 17 illustrates an example computing device in accordance with one or more embodiments.
Fig. 18 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
Fig. 19 illustrates an example computing device in accordance with one or more embodiments.
Fig. 20 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
Fig. 21 illustrates an example computing device in accordance with one or more embodiments.
Fig. 22 illustrates an example computing device in accordance with one or more embodiments.
Fig. 23 illustrates an example computing device in accordance with one or more embodiments.
Fig. 24 illustrates an example computing device in accordance with one or more embodiments.
Fig. 25 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
Fig. 26 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
Fig. 27 illustrates an example computing device in accordance with one or more embodiments.
Fig. 28 illustrates an example computing device in accordance with one or more embodiments.
Fig. 29 illustrates an example computing device in accordance with one or more embodiments.
Fig. 30 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
Fig. 31 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
Fig. 32 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
Fig. 33 illustrates an example computing device that can be utilized to implement the embodiments described herein.
Detailed description
Overview
Bezel gestures for touch displays are described. In at least some embodiments, the bezel of a device can be used to extend functionality that is accessible through the use of so-called bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create on-screen input through bezel gestures. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger/different-hand bezel gestures.
In the discussion that follows, a variety of different implementations are described that involve bezel gestures, or gestures associated with bezel gestures, to initiate and/or implement functions on a computing device. In this way, a user may readily access enhanced functions of a computing device in an efficient and intuitive manner.
In the following discussion, an example environment is first described that is operable to employ the gesture techniques described herein. Example illustrations of the gestures and procedures are then described, which may be employed in the example environment as well as in other environments. Accordingly, the example environment is not limited to performing the example gestures, and the gestures are not limited to implementation in the example environment.
Example environment
Fig. 1 is an illustration of an environment 100 in an example implementation that is operable to employ bezel gestures and other techniques described herein. The illustrated environment 100 includes an example of a computing device 102 that may be configured in a variety of ways. For example, the computing device 102 may be configured as a traditional computer (e.g., a desktop personal computer, laptop computer, and so on), a mobile station, an entertainment appliance, a set-top box communicatively coupled to a television, a wireless phone, a netbook, a game console, a handheld device, and so forth, as further described in relation to Fig. 2. Thus, the computing device 102 may range from full-resource devices with substantial memory and processor resources (e.g., personal computers, game consoles) to low-resource devices with limited memory and/or processing resources (e.g., traditional set-top boxes, handheld game consoles). The computing device 102 also includes software that causes the computing device 102 to perform one or more operations as described below.
Computing device 102 includes a bezel 103 that forms part of the device's housing. The bezel is made up of the frame structure adjacent the device's display, referred to below as display device 108. Computing device 102 includes a gesture module 104 and a bezel gesture module 105 that forms part of the gesture module 104. The gesture modules can be implemented in connection with any suitable type of hardware, software, firmware, or combination thereof. In at least some embodiments, the gesture modules are implemented in software that resides on some type of tangible, computer-readable medium, examples of which are provided below.
Gesture module 104 and bezel gesture module 105 are representative of functionality that recognizes gestures and bezel gestures, respectively, and causes operations to be performed that correspond to the gestures. The gestures may be recognized by modules 104, 105 in a variety of different ways. For example, the gesture module 104 may be configured to recognize a touch input, such as a finger of a user's hand 106a, as proximal to the display device 108 of the computing device 102 using touchscreen functionality. In addition, the bezel gesture module 105 can be configured to recognize a touch input, such as a finger of a user's hand 106b, that initiates a gesture on or adjacent bezel 103 and proceeds onto display device 108. Any suitable technology can be utilized to sense an input on or adjacent bezel 103. For example, in at least some embodiments, the digitizer or sensing elements associated with display device 108 can extend underneath bezel 103. In this instance, technologies such as capacitive field technologies, as well as others, can be utilized to sense the user's input on or adjacent to the bezel 103.
Alternately or additionally, in embodiments in which display device 108 does not extend underneath bezel 103 but rather lies flush with it, bezel gesture module 105 can detect the changing contact profile of the user's finger as it emerges from bezel 103 onto display device 108. Alternately or additionally, approaches that utilize the centroid of the user's touch profile can be employed to detect a changing centroid contact profile that is suggestive of a bezel gesture. Techniques for fingerprint sensing can also be used. Specifically, if the sensing substrate is sensitive enough to determine the ridge pattern of one or more fingers contacting the display, then the orientation of the fingers, as well as the fact that the fingerprint is clipped by the bezel, can be detected. Needless to say, any number of different techniques can be utilized to sense a user's input relative to the bezel 103. The touch input may also be recognized as including attributes (e.g., movement, selection point, etc.) that are usable to differentiate the touch input from other touch inputs recognized by the gesture modules 104, 105. This differentiation may then serve as a basis for identifying a gesture from the touch inputs and, consequently, an operation that is to be performed based on identification of the gesture. This yields the general benefit that gestures that start from the bezel and enter onto the screen are distinguishable from other, ostensibly similar gestures that access on-screen content, because if the user's intent is to interact with something on the screen, the user has no reason to position a finger starting partially or fully off-screen. Hence, normal direct-manipulation gestures, even for objects near the screen boundaries, are still possible and do not interfere with bezel gestures, and vice versa.
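By way of illustration only (this sketch is not part of the disclosure), one way the start position of a touch could be used to classify a bezel gesture is shown below; the band width, names, and data model are all assumptions.

```python
from dataclasses import dataclass

BEZEL_BAND_PX = 8  # assumed margin where the digitizer extends under the bezel

@dataclass
class TouchSample:
    x: float
    y: float

def starts_on_bezel(sample, width, height, band=BEZEL_BAND_PX):
    """True if the sample lies in the sensing band surrounding the screen."""
    return (sample.x < band or sample.y < band or
            sample.x > width - band or sample.y > height - band)

def is_bezel_gesture(samples, width, height):
    """A bezel gesture begins on or adjacent the bezel and proceeds onto the display."""
    if not samples:
        return False
    return (starts_on_bezel(samples[0], width, height) and
            not starts_on_bezel(samples[-1], width, height))
```

A touch that both starts and ends in the interior of the screen would be routed to ordinary direct-manipulation handling instead.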
For instance, a finger of the user's hand 106a is illustrated as making a selection 110 of an image 112 displayed by the display device 108. The selection 110 of the image 112 and the subsequent movement of the finger of the user's hand 106a may be recognized by the gesture module 104. The gesture module 104 may then identify this recognized movement as indicating a "drag and drop" operation to change the location of the image 112 to the point in the display at which the finger of the user's hand 106a is lifted away from the display device 108. Thus, recognition of the touch input that describes selection of the image, movement of the selection point to another location, and then lifting of the finger of the user's hand 106a may be used to identify a gesture (e.g., a drag-and-drop gesture) that initiates the drag-and-drop operation.
A variety of different types of gestures may be recognized by the gesture modules 104, 105, such as gestures that are recognized from a single type of input (e.g., touch gestures such as the previously described drag-and-drop gesture) as well as gestures involving multiple types of inputs. For example, modules 104, 105 can be utilized to recognize single-finger gestures and bezel gestures, multiple-finger/same-hand gestures and bezel gestures, and/or multiple-finger/different-hand gestures and bezel gestures.
For example, the computing device 102 may be configured to detect and differentiate between a touch input (e.g., provided by one or more fingers of the user's hand 106a, 106b) and a stylus input (e.g., provided by a stylus 116). The differentiation may be performed in a variety of ways, such as by detecting the amount of the display device 108 that is contacted by a finger of the user's hand 106 versus the amount of the display device 108 that is contacted by the stylus 116.
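As an illustrative sketch only (not part of the disclosure), contact-area-based differentiation of touch versus stylus input, as suggested above, might look like the following; the threshold value is an assumption.

```python
STYLUS_MAX_AREA_MM2 = 4.0  # assumed: a pen tip contacts far less area than a fingertip

def classify_contact(contact_area_mm2):
    """Return 'stylus' for small contact areas and 'touch' otherwise."""
    return "stylus" if contact_area_mm2 <= STYLUS_MAX_AREA_MM2 else "touch"
```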
Thus, the gesture modules 104, 105 may support a variety of different gesture techniques through recognition and leverage of a division between stylus and touch inputs, as well as between different types of touch inputs.
Accordingly, the gesture modules 104, 105 may support a variety of different gestures. Examples of gestures described herein include a single-finger gesture 118, a single-finger bezel gesture 120, a multiple-finger/same-hand gesture 122, a multiple-finger/same-hand bezel gesture 124, a multiple-finger/different-hand gesture 126, and a multiple-finger/different-hand bezel gesture 128. Each of these different types of bezel gestures is described below.
Fig. 2 illustrates an example system showing the gesture module 104 and bezel gesture module 105 of Fig. 1 as being implemented in an environment in which multiple devices are interconnected through a central computing device. The central computing device may be local to the multiple devices or may be located remotely from them. In one embodiment, the central computing device is a "cloud" server farm, which comprises one or more server computers connected to the multiple devices through a network, the Internet, or other means.
In one embodiment, this interconnection architecture enables functionality to be delivered across multiple devices to provide a common and seamless experience to the user of those devices. Each of the multiple devices may have different physical requirements and capabilities, and the central computing device uses a platform to enable delivery of an experience that is both tailored to each device and yet common to all devices. In one embodiment, a "class" of target device is created and experiences are tailored to the generic class of devices. A class of device may be defined by physical features, usage, or other common characteristics of the devices. For example, as previously described, the computing device 102 may be configured in a variety of different ways, such as for mobile 202, computer 204, and television 206 uses. Each of these configurations has a generally corresponding screen size, and thus the computing device 102 may be configured as one of these device classes in this example system 200. For instance, the computing device 102 may assume the mobile 202 class of device, which includes mobile phones, music players, game devices, and so on. The computing device 102 may also assume the computer 204 class of device, which includes personal computers, laptop computers, netbooks, and so on. The television 206 configuration includes configurations of devices that involve display in a casual environment, e.g., televisions, set-top boxes, game consoles, and so on. Thus, the techniques described herein may be supported by these various configurations of the computing device 102 and are not limited to the specific examples described in the following sections.
Cloud 208 is illustrated as including a platform 210 for web services 212. The platform 210 abstracts the underlying functionality of hardware (e.g., servers) and software resources of the cloud 208 and thus may act as a "cloud operating system." For example, the platform 210 may abstract resources to connect the computing device 102 with other computing devices. The platform 210 may also serve to abstract the scaling of resources to provide a corresponding level of scale to encountered demand for the web services 212 that are implemented via the platform 210. A variety of other examples are also contemplated, such as load balancing of servers in a server farm, protection against malicious parties (e.g., spam, viruses, and other malware), and so on.
Thus, the cloud 208 is included as part of a strategy that pertains to software and hardware resources made available to the computing device 102 via the Internet or other networks. For example, the gesture modules 104, 105 may be implemented in part on the computing device 102 as well as via the platform 210 that supports the web services 212.
For example, the gesture techniques supported by the gesture modules may be detected using touchscreen functionality in the mobile configuration 202, trackpad functionality of the computer 204 configuration, or detected by a camera as part of support for a natural user interface (NUI) that does not involve contact with a specific input device, and so on. Further, performance of the operations to detect and recognize the inputs that identify a particular gesture may be distributed throughout the system 200, such as by the computing device 102 and/or the web services 212 supported by the platform 210 of the cloud 208.
Generally, any of the functions described herein can be implemented using software, firmware, hardware (e.g., fixed logic circuitry), manual processing, or a combination of these implementations. The terms "module," "functionality," and "logic" as used herein generally represent software, firmware, hardware, or a combination thereof. In the case of a software implementation, the module, functionality, or logic represents program code that performs specified tasks when executed on a processor (e.g., one or more CPUs). The program code can be stored in one or more computer-readable memory devices. The features of the gesture techniques described below are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.
In the discussion that follows, various sections describe example bezel gestures and gestures associated with bezel gestures. A first section, entitled "Use of bezel as an input mechanism," describes embodiments in which a computing device's bezel can be used as an input mechanism. Following this, a section entitled "Using off-screen motion to create on-screen input" describes how motion away from a device's screen can be utilized, through gestures, to create on-screen input. Next, a section entitled "Use of multiple fingers for gesturing" describes how multiple fingers can be utilized to provide gestural input. Following this, a section entitled "Radial menus" describes embodiments in which radial menus can be utilized to provide a robust collection of input options. Then, a section entitled "On- and off-screen gestures and combinations—page/object manipulation" describes various types of gestures and combinations that can be utilized to manipulate pages and/or objects. Finally, a section entitled "Example device" describes aspects of an example device that can be utilized to implement one or more embodiments.
Use of bezel as an input mechanism
In one or more embodiments, the bezel of a device can be used as an input mechanism. For example, in instances in which the display device extends underneath the bezel, a user's finger or other input mechanism can be sensed when it hovers over or physically engages the bezel. Alternately or additionally, the bezel can include sensing mechanisms, such as infrared mechanisms and others, that sense a user's finger or other input mechanism hovering over or physically engaging the bezel. Any combination of inputs relative to the bezel can be used. For example, to provide various inputs to the device, the bezel can be tapped one or more times, held, slid over, hovered over, and/or any combination of these or other inputs can be used.
As an example, consider the following. Many selection, manipulation, and context-menu activation schemes utilize a distinction between a device's background canvas and the objects that appear on the canvas. Using the bezel as an input mechanism can provide a way to access a page in the background canvas, even when the page itself is covered by many closely spaced objects. For example, tapping on the bezel may provide a mechanism to deselect all objects. Holding on the bezel could be used to trigger a context menu on the page. As an example, consider Fig. 3, which illustrates an example environment 300 that includes a computing device 302 having a bezel 303 and a display device 308. In this instance, a finger of the user's hand 306a is tapping on bezel 303. By tapping on the bezel, the user's input is sensed and an associated functionality that is mapped to the input can be provided. In the above example, such functionality might deselect all objects appearing on display device 308. In addition, input can be received at different locations on the bezel and can be mapped to different functionality. For example, input received on the right side of the bezel might be mapped to a first functionality, input received on the left side of the bezel might be mapped to a second functionality, and so on. Furthermore, input received in different regions of a bezel side can be mapped to different functionality, or to no functionality at all, depending on the orientation of the device and how the user is holding it. Some bezel edges may be left unassigned, or may be insensitive to touch-and-hold, so that inadvertent operations are not triggered. Thus, any one particular side of the bezel might be used to receive input and, depending on what region of the bezel receives the input, the input can accordingly be mapped to different functionality. It is to be appreciated and understood that input received via the bezel can be received independently of any input received via hardware input devices, such as buttons, trackballs, and other instrumentalities that might be located on an associated device. Further, in at least some embodiments, input received via the bezel can be the only user input utilized to ascertain and access a particular functionality. For example, input received solely on the bezel can provide the basis by which device functionality can be accessed. Further, in some embodiments, orientation sensors (e.g., accelerometers) may be used as an input to help decide which bezel edges are active. In some embodiments, quick, intentional taps remain available, but a simple touch-and-hold is ignored, so as to distinguish deliberate input from a finger that merely happens to be resting on the bezel.
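As an illustrative sketch only (not part of the disclosure), the mapping of inputs received at different bezel regions to different functionality might be organized as a lookup table, with some edges deliberately left unassigned so that a resting finger triggers nothing. The region names and actions below are assumptions.

```python
def deselect_all():
    return "deselect-all"

def show_page_context_menu():
    return "page-context-menu"

BEZEL_REGION_ACTIONS = {
    ("right", "tap"): deselect_all,
    ("left", "hold"): show_page_context_menu,
    # "top" and "bottom" edges intentionally unmapped: resting fingers do nothing
}

def dispatch_bezel_input(edge, kind):
    """Look up and invoke the function mapped to this bezel region and input kind."""
    action = BEZEL_REGION_ACTIONS.get((edge, kind))
    return action() if action else None
```

An orientation sensor could rewrite this table at runtime so that only the edges appropriate to the current grip remain active.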
Alternately or additionally, in at least some embodiments, a visual affordance can be utilized to provide a hint or indication of accessible functionality associated with the bezel. Specifically, a visual affordance can be utilized to indicate functionality that is accessible by virtue of a bezel gesture. Any suitable type of visual affordance can be utilized. As but one example, consider again Fig. 3. There, a visual affordance in the form of a semi-transparent strip 304 provides an indication that additional functionality can be accessed through utilization of a bezel gesture. The visual affordance can take any suitable form and can be located at any suitable position on display device 308. Furthermore, the visual affordance can be exposed in any suitable way. For example, in at least some embodiments, input received via the bezel can be used to expose or display the visual affordance. Specifically, in at least some embodiments, a "peek out" visual affordance can be presented responsive to detecting a hover over, or a physical engagement of, the device's bezel. In at least some embodiments, the "peek out" visual affordance can be dismissed by the user, such that the "peek out" is hidden.
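As an illustrative sketch only (not part of the disclosure), a "peek out" affordance that is shown on bezel hover and hidden after a timeout or explicit dismissal might be modeled as follows; the class name and timeout value are assumptions.

```python
import time

PEEK_TIMEOUT_S = 2.0  # assumed display interval

class PeekOutAffordance:
    def __init__(self):
        self.visible = False
        self._shown_at = None

    def on_bezel_hover(self, now=None):
        """Present the affordance when a hover over the bezel is detected."""
        self.visible = True
        self._shown_at = now if now is not None else time.monotonic()

    def on_dismiss(self):
        """The user dismisses the peek out so that it is hidden."""
        self.visible = False

    def tick(self, now=None):
        """Hide the affordance once the display interval has elapsed."""
        now = now if now is not None else time.monotonic()
        if self.visible and now - self._shown_at >= PEEK_TIMEOUT_S:
            self.visible = False
```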
In this particular example, the additional functionality associated with the semi-transparent strip 304 resides in the form of a so-called bezel menu that is accessible using a bezel gesture. Specifically, in one or more embodiments, the bezel menu can be accessed through a gesture in which a finger of the user's hand 306b touches the bezel and then moves across the bezel and onto the display device 308 in the direction of the illustrated arrow. This can allow the bezel menu to be dropped down, as will be described in more detail below.
Accordingly, various embodiments can use the bezel itself as an input mechanism, as in the first example above. Alternately or additionally, other embodiments can use the bezel in connection with a visual affordance that provides the user with a clue that additional functionality can be accessed by way of a bezel gesture.
Fig. 4 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system, such as those described above and below.
Step 400 receives an input associated with a bezel. Any suitable type of input can be received, examples of which are provided above. Step 402 accesses functionality associated with the received input. Any suitable type of functionality can be accessed. By virtue of providing a variety of different types of recognizable inputs (e.g., taps, tap combinations, tap/hold combinations, slides, etc.), and mapping those recognizable inputs to different types of functionality, a robust collection of user input mechanisms can be provided.
Fig. 5 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system, such as those described above and below.
Step 500 displays a visual affordance on a display device associated with a computing device. Any suitable type of visual affordance can be utilized, examples of which are provided above. Step 502 receives a bezel gesture input relative to the visual affordance. Any suitable type of bezel gesture input can be utilized. Step 504 accesses functionality associated with the received bezel gesture input. Any suitable type of functionality can be accessed, examples of which are provided above and described in more detail below.
Having considered examples in which the bezel can be used as an input mechanism, consider now various embodiments in which off-screen or off-display motion can be used to create screen or display input.
Using off-screen motion to create on-screen input
In at least some embodiments, off-screen to on-screen motion (or vice versa) can be utilized as a mechanism to expose a menu or to access some other type of functionality. The off-screen motion or input can be provided, as indicated above, relative to the device's bezel. Any suitable type of bezel gesture input can be provided to effectuate the off-screen to on-screen motion. For example, by way of example and not limitation, bezel gestures or inputs can start or end on the bezel, cross or recross the bezel, cross at different locations of the bezel (e.g., the corners, or particular coordinate ranges along a particular edge), and/or occur on one or more bezels associated with multiple screens (with the possibility of different semantics depending on the screen or edge involved). Further, by way of example and not limitation, bezel inputs can include a single-contact drag (finger or pen), a two-contact drag (two fingers), and/or a hand-contact drag (multiple fingers, a whole hand, or multiple or single fingers on different hands). For example, pinch gestures originating in off-screen space (i.e., originating on the bezel) can be utilized and mapped to different functionality. Bezel gestures with multiple contacts entering from different edges of the screen can also have different semantics. Specifically, two fingers entering from adjacent edges of the bezel (i.e., spanning a corner) might be mapped to a zoom-out operation that zooms out on a page to show an extended workspace or canvas. Two fingers entering from opposite edges, with either one hand (if the screen is small enough) or two hands (one finger from each hand), can be mapped to a different functionality. Multiple fingers entering on one edge of the bezel together with one or more fingers entering from an adjacent or opposite edge of the bezel might be mapped to yet another functionality. Additionally, multiple fingers entering from two or more edges can further be mapped to additional functionality.
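As an illustrative sketch only (not part of the disclosure), the assignment of different semantics to multi-contact bezel gestures according to which edges the contacts enter from, such as two fingers spanning a corner zooming out to an extended workspace, might be classified as below; the specific mapping is an assumption.

```python
ADJACENT_EDGE_PAIRS = [frozenset(p) for p in
                       (("top", "left"), ("top", "right"),
                        ("bottom", "left"), ("bottom", "right"))]

def classify_multi_edge_gesture(entry_edges):
    """entry_edges lists the bezel edge each contact entered from."""
    edges = frozenset(entry_edges)
    if len(entry_edges) == 2 and edges in ADJACENT_EDGE_PAIRS:
        return "zoom-out-to-workspace"  # two fingers spanning a corner
    if edges in (frozenset(("left", "right")), frozenset(("top", "bottom"))):
        return "opposite-edge-function"
    if len(edges) == 1 and len(entry_edges) > 1:
        return "same-edge-multi-finger"
    return "unmapped"
```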
As another example, consider Fig. 6. There, device 602 includes a bezel 603 and a visual affordance 604 that is rendered on display device 608. As described above, the visual affordance, in the form of a semi-transparent strip, can be used to provide a hint or indication of accessible functionality associated with the bezel — in this case, a bezel menu.
In one or more embodiments, the bezel menu can be accessed through a bezel gesture in which a finger of the user's hand 606 touches the bezel and then moves across the bezel, in the direction of the illustrated arrow, onto display device 608. This can allow bezel menu 610 to be dropped down, at which point it can become fully opaque.
In the illustrated and described embodiment, bezel menu 610 includes multiple selectable icons or slots 612, 614, 616, 618, and 620. Each of the icons or slots is associated with a different functionality such as, for example, paint functionality, pen functionality, note functionality, object creation, object editing, and the like. It is to be appreciated and understood that any type of functionality can be associated with the icons or slots.
In the illustrated and described environment, bezel menu 610 can enable the user to access and activate commands, tools, and objects. The bezel menu can be configured to respond to both touch input and pen input. Alternately or additionally, the bezel menu can be configured to respond to touch input only.
In at least some embodiments, different gestural modes can be utilized to access functionality associated with bezel menu 610. For example, one gestural mode can be a novice mode, and another gestural mode can be an expert mode.
In the novice mode, after the user's gesture reveals bezel menu 610, the user can lift their finger, whereupon the bezel menu can remain open for a configurable interval (or indefinitely). The user may then tap on the desired item associated with one of the icons or slots 612, 614, 616, 618, and 620. Through this gesture, the functionality associated with a particular icon or slot can be accessed. For example, tapping on a particular icon or slot can cause an object to be created on the canvas associated with display device 608. In at least some embodiments, in the novice mode, objects accessed from the bezel menu appear in a default location on the canvas. The user can close the bezel menu without activating any functionality either by sliding it back off of the screen (an on-screen to off-screen gesture) or by tapping outside of the bezel menu.
In the expert mode, once the user has become familiar with the locations of commonly used items accessible from the bezel menu, the user can perform, in a single transaction, a continuous finger-drag that passes through a slot or icon and onto the canvas, thereby creating the associated object (or tool, or interface mode) and dragging it to a specific desired position or path. The user can then release the object and interact with it. As an example, consider Fig. 7. There, the user has performed a bezel gesture that drags across icon or slot 614 to access the functionality associated with a post-it note, with the corresponding note positioned on the canvas as indicated. At this point, the user can lift their finger and annotate the digital post-it note as desired, using an associated pen. In at least some embodiments, bezel menu 610 may or may not remain fully open after a particular functionality has been accessed.
In at least some other embodiments, in the expert mode, the bezel menu need not be fully revealed at all in order to access an icon or slot's associated functionality. Rather, a bezel gesture that passes through the location of the visual affordance corresponding to a particular icon or slot may access the functionality associated with that icon or slot. As an example, consider Fig. 8. There, visual affordance 604 is shown. Notice that the bezel gesture passes through the portion of the visual affordance corresponding to icon or slot 614 (Fig. 7). Notice also that, by virtue of this bezel gesture, the corresponding post-it note has been accessed. This feature can be implemented by using a time delay of, for example, 1/3 second, and considering the location of the user's finger before actually deciding whether to deploy the bezel menu in response to the bezel gesture. The concept here is that the bezel menu stays hidden unless the user pauses, or pulls the menu out, without completing the drag-off of the desired item. This is achieved using the time delay before the bezel menu begins to slide out. Hence, once users become familiar with a particular operation on the bezel menu, they can rapidly drag through it to create and position an object without ever being distracted by the opening of the visual menu itself. This can encourage expert performance based on ballistic motion driven by procedural memory, rather than visually guided performance based on direct manipulation of a widget. The concept succeeds because its novice mode aids learning and encourages the expert mode of working.
As but one example of how this can work in accordance with one embodiment, consider the following. When a finger is observed crossing from the screen bezel into a slot of the bezel menu, a timer is started. No other immediate visual feedback occurs. When the timer expires, if the finger is still within the region occupied by the bezel menu, the bezel menu slides out and follows the user's finger. When the user's finger is lifted inside the bezel menu area, the menu remains posted. This is the novice mode described above. The user can lift the finger to view all of the slots and tap on the desired one to create the desired object (rather than dragging it). The user can also touch down on an item from the novice mode and drag it onto the canvas. If, instead, the finger slides past a threshold distance or region, the bezel menu remains closed, but the functionality indicated by the crossed slot is activated — for example, a post-it note is created and begins following the user's finger. This is the expert mode described above. One implementation consideration is that the slot selected by the expert-mode gesture can be determined by the position at which the finger crosses the screen edge.
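The timer-and-threshold dispatch just described can be sketched as follows. This is a minimal sketch under stated assumptions: the timeout value matches the 1/3-second delay mentioned above, but the menu-region depth, the slot width, and the function names are illustrative only.

```python
# Sketch of the novice/expert dispatch: a dwell timer vs. a distance
# threshold. MENU_DEPTH and slot_width are assumed values.

MENU_DEPTH = 100        # px depth of the (hidden) bezel-menu region
DWELL_TIMEOUT = 0.333   # seconds before the menu slides out (novice mode)

def slot_at(x, slot_width=80):
    """Slot chosen by where the finger crossed the screen edge."""
    return int(x // slot_width)

def interpret_drag(crossing_x, samples):
    """samples: (t, y) pairs after the finger crosses the bezel edge;
    y grows with distance onto the screen."""
    for t, y in samples:
        if y > MENU_DEPTH:
            # Finger slid past the menu region before the timer fired:
            # keep the menu closed, activate the crossed slot directly.
            return ("expert", slot_at(crossing_x))
        if t >= DWELL_TIMEOUT:
            # Finger dwelled inside the menu region: slide the menu out.
            return ("novice", None)
    return ("pending", None)

print(interpret_drag(250, [(0.1, 40), (0.2, 130)]))  # fast drag-through
```

A fast drag-through such as the one above resolves to expert mode with the slot determined by the edge-crossing position, while a finger that lingers inside the menu region until the timer fires resolves to novice mode.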
In at least some embodiments, the bezel menu can be scrollable in order to provide access to additional functionality. For example, the bezel menu can have left and right arrows on either side to enable scrollability. Alternately or additionally, a single- or multi-finger drag orthogonal to the opening direction of the bezel menu can scroll the menu, without any arrows.
In at least some embodiments, the bezel menu can create space for additional slots or icons. For example, by reducing the width of slots or icons that appear at the edge of the bezel menu, additional slots or icons can be added. As an example, consider Fig. 9.
There, a device includes a bezel 903 and a bezel menu 910 that appears on display device 908. Additional slots or icons 912, 914 appear in bezel menu 910. Notice that slots or icons 912, 914 have a reduced width relative to the other slots or icons. In this example, the width is reduced by about one half. In order to access objects associated with slots or icons 912, 914, a bezel gesture that drags in from the side of the device and passes through the slot or icon, as shown, can be used. In some embodiments, corner slots or icons can have a special status. For example, a corner slot or icon might be permanently assigned to a particular functionality and might not be customizable.
Accordingly, bezel menus can expose functionality to a user in a manner that neither permanently occupies screen real estate nor requires the use of a dedicated hardware button.
Fig. 10 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those described above and below.
Step 1000 displays a visual affordance associated with an accessible bezel menu. An example of a suitable visual affordance is given above. Step 1002 receives bezel gesture input relative to the visual affordance. Any suitable bezel gesture can be utilized, examples of which are provided above. Step 1004 presents, responsive to receiving the bezel gesture input, a bezel menu. Any suitable bezel menu can be utilized. In at least some embodiments, the bezel menu can be presented simply by virtue of receiving a bezel gesture, without the visual affordance being displayed. Alternately or additionally, the visual affordance may fade in when the user's finger or pen hovers above the associated bezel edge.
Fig. 11 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those described above and below.
Step 1100 receives gesture input. The input can be received relative to a bezel menu or a visual affordance associated with a bezel menu. Any suitable gesture input can be received. For example, the gesture input can include input that does not use or incorporate the bezel. An example of this was provided above in the discussion of Fig. 6, relative to the user tapping on an exposed portion of the bezel menu. Alternately or additionally, the gesture input can include bezel gesture input. An example of this was provided above in the discussion of Figs. 7-9. Step 1102 ascertains a functionality associated with the gesture input. Step 1104 accesses the functionality ascertained in step 1102. Examples of how this can be done are provided above.
The examples above illustrate gestures, including bezel gestures, that utilize a single finger. In other embodiments, more than one finger can be utilized in connection with gestures, including bezel gestures.
Use of Multiple Fingers for Gestural Representation
In one or more embodiments, multiple fingers can be utilized for gestural representation, including bezel-gesture representation. The multiple fingers can reside on one hand or, collectively, on both hands. The use of multiple fingers can enable different numbers of touches to be mapped to different functionalities or to objects associated with functionalities. For example, a two-finger gesture or bezel gesture might be mapped to a first functionality or an associated first object, while a three-finger gesture or bezel gesture might be mapped to a second functionality or an associated second object. As an example, consider Fig. 12.
There, device 1202 includes a bezel 1203 and a visual affordance 1204 rendered on the display device. As described above, the visual affordance 1204, in the form of a semi-transparent strip, can be used to provide a hint or indication of accessible functionality associated with the bezel — in this case, bezel menu 1210.
As described above, bezel menu 1210 can be accessed through a bezel gesture in which a finger of the user's hand touches the bezel and then moves across the bezel and onto the display device to drag the bezel menu down.
In one or more embodiments, bezel menu 1210 can be exposed and further extended into a drawer illustrated at 1212. In the illustrated and described embodiment, the following bezel gesture can be used to expose drawer 1212. First, the user touches down with one or more fingers on or near bezel 1203. This is shown in the top-most portion of Fig. 12. From there, the user can drag multiple fingers onto the display device, as shown in the bottom-most portion of Fig. 12, thereby revealing drawer 1212. In at least some embodiments, no object is created by default when multiple fingers simultaneously cross the bezel. That is, in these embodiments, a multi-finger gesture as described above indicates that drawer 1212 is being accessed. Drawer 1212 can hold additional objects, such as those illustrated. By way of example and not limitation, the additional objects can include additional tools, colors, and various other objects. In addition, in at least some embodiments, drawer 1212 can be utilized to store and/or arrange items. Items can be arranged or rearranged in any suitable way, such as through direct manipulation by the user, e.g., by dragging and dropping objects within the drawer.
In at least some embodiments, lifting the hand may leave the drawer open until it is later closed by a similar gesture in the opposite direction. In at least some embodiments, bezel menu 1210 can be customized using, for example, contents from drawer 1212. As an example, consider Fig. 13.
There, the user can change the default assignment of tools and/or objects to the main bezel menu slots via a drag-and-drop operation. For example, in the top-most portion of Fig. 13, the user touches down on a new tool 1300. The user then proceeds to drag tool 1300 into or onto one of the slots of bezel menu 1210. This gesture causes the object previously associated with that slot to be replaced by the new object that the user dropped.
Alternately or additionally, the user can also drag content from the page or canvas into drawer 1212. As an example, consider Fig. 14. There, the user touches down on an object 1400 on the page or canvas and drags the object into drawer 1212. By lifting the finger, object 1400 is deposited into drawer 1212.
It is to be appreciated and understood that while one drawer has been described above, various other embodiments can utilize multiple drawers. For example, other edges of the display device can be associated with different drawers. These different drawers may hold different tools, objects, or other content. On dual- or multi-screen devices, the drawers for each screen edge may be identical or may be differentiated. In at least some embodiments, multiple drawers may also be accessed on each screen edge by sliding orthogonally to the direction in which the drawer opens. This can be done either by a single touch and/or by multiple touches. If the bezel menu extends all the way to the screen edge, it can also be done by a bezel gesture from the orthogonal edge.
In the embodiment described above, multiple touches were used to access drawer 1212. Specifically, as shown in Fig. 12, three touches were used to access the illustrated drawer. In one or more embodiments, different numbers of touches can be utilized to access different drawers. For example, two touches might be mapped to a first drawer, three touches to a second drawer, four touches to a third drawer, and so on. Alternately or additionally, the spacing between multiple touches, and variances in that spacing, can be mapped to different functionalities. For example, a two-finger touch with a first spacing might be mapped to a first functionality, while a two-finger touch with a second, greater spacing might be mapped to a second, different functionality.
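The two mappings above — touch count to drawer, and contact spacing to functionality — can be sketched in a few lines. The drawer names, the spacing cutoff, and the function labels are illustrative assumptions; the text only specifies that the mappings differ.

```python
# Illustrative sketch: the number of touches selects a drawer; for
# two-finger touches, the spacing between contacts selects a function.

DRAWER_BY_TOUCH_COUNT = {
    2: "tools drawer",    # assumed assignments
    3: "colors drawer",
    4: "objects drawer",
}
SPACING_CUTOFF_PX = 60    # assumed boundary between the two spacings

def drawer_for(touch_count):
    return DRAWER_BY_TOUCH_COUNT.get(touch_count)

def two_finger_function(spacing_px):
    return ("first function" if spacing_px < SPACING_CUTOFF_PX
            else "second function")

print(drawer_for(3), "/", two_finger_function(90))
```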
Fig. 15 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those described above and below.
Step 1500 receives multi-finger gesture input. Any suitable type of gesture can be utilized including, by way of example and not limitation, bezel gesture input such as that described above. Step 1502 ascertains a functionality associated with the multi-finger gesture input. Examples of functionalities are described above. Step 1504 accesses the ascertained functionality. Examples of how this can be done are described above.
Fig. 16 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those described above and below.
Step 1600 receives bezel gesture input. Examples of bezel gesture input are described above. Step 1602 ascertains a functionality associated with the bezel gesture input. In this particular embodiment, the functionality associated with the bezel gesture input is one that is associated with accessing one or more drawers. Step 1604 exposes one or more drawers for the user. Examples of how this can be done are described above.
Radial Menus
In at least some embodiments, so-called radial menus can be utilized in connection with menus such as bezel menus. Although radial menus are described, other types of menus can be used without departing from the spirit and scope of the claimed subject matter. For example, pull-down menus can be used in conjunction with bezel menus. One of the general ideas associated with radial menus is that the user can touch down at a certain location and stroke or slide their finger in a certain direction to access and implement a particular functionality or menu command. The presence of a radial menu can be indicated by a small icon associated with a larger icon or slot of the bezel menu. As an example, consider Fig. 17.
There, device 1702 includes a bezel 1703 and a bezel menu 1710 that has been exposed on display device 1708 as described above. In the illustrated and described embodiment, bezel menu 1710 includes multiple selectable icons or slots, one of which is designated at 1712. Each of the icons or slots is associated with a different functionality such as, for example, paint functionality, pen functionality, note functionality, object creation, object editing, and the like. It is to be appreciated and understood that any type of functionality can be associated with the icons or slots.
As described above, bezel menu 1710 can enable the user to access and activate commands, tools, and objects. The bezel menu can be configured to respond to both touch input and pen input. Alternately or additionally, the bezel menu can be configured to respond to touch input only. In the illustrated and described embodiment, icon or slot 1712 includes a radial menu icon 1714 that gives the user a hint that one or more radial menus, such as radial menu 1715, are associated with this particular icon or slot. In the illustrated and described embodiment, radial menu 1715 can be accessed in any suitable way, e.g., by pen or touch. For example, in at least some embodiments, radial menu 1715 can be accessed by hovering a pen over or near radial menu icon 1714. Alternately or additionally, a pen or finger can be used to pull down radial menu 1715. Alternately or additionally, radial menu 1715 can be accessed through a tap-and-hold of the pen or finger on or near radial menu icon 1714. In some embodiments, tapping on the radial menu icon triggers a default action, which may or may not differ from the action associated with tapping on the bezel menu slot.
Once radial menu 1715 is exposed, the user can access various functionalities or commands by touching down on or near radial menu icon 1714 and stroking in a particular direction. In the illustrated and described embodiment, five different directions are indicated by arrows. Each direction corresponds to a different functionality or command. Each functionality or command is represented in the drawing by a cross-hatched square. In at least some embodiments, each icon or slot 1712 has a default functionality or command. By selecting a particular radial menu functionality or command, the default functionality or command can be replaced by the selected one.
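Selecting among the radial options by stroke direction amounts to bucketing the stroke's angle into sectors. The sketch below assumes equal-width sectors with option 0 centered on the positive x-axis; the specification only states that each direction maps to a different command, so this geometry is an assumption.

```python
# Sketch: map a radial-menu stroke vector to one of n_options commands
# arranged at equal angular spacing (sector 0 centered on "east").
import math

def pick_option(dx, dy, n_options=5):
    """(dx, dy): stroke vector from the touch-down point."""
    angle = math.atan2(dy, dx) % (2 * math.pi)  # stroke direction, 0..2*pi
    sector = 2 * math.pi / n_options            # angular width per option
    # Shift by half a sector so each option is centered on its direction.
    return int(angle / sector + 0.5) % n_options

print(pick_option(10, 0))  # stroke due "east" -> option 0
```

The same function also accommodates the variable option counts mentioned below (e.g., fewer options for end-of-menu slots) simply by changing `n_options`.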
In at least some embodiments, the number of options presented by a radial menu can vary depending on the location of the corresponding slot or icon with which the radial menu is associated. For example, in the illustrated and described embodiment, slot or icon 1712 offers the user five options. Radial menus associated with slots or icons that appear at the ends of bezel menu 1710 may have fewer options due to spacing constraints. Alternately or additionally, radial menus associated with slots or icons that appear as part of an exposed drawer may have more selectable options.
In at least some embodiments, radial menus can be implemented to include both a novice mode and an expert mode. In the novice mode, the radial menu can be fully exposed so that users who are unfamiliar with its accessible functionalities or commands can be visually guided through the selection process. In the expert mode, intended for users who are familiar with the radial menu's content and behavior, the radial menu may not be exposed at all. Rather, a quick touch-and-stroke gesture associated with an icon or slot, such as icon 1712, can enable the radial menu's functionality or command to be accessed directly.
Fig. 18 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those described above and below.
Step 1800 presents a bezel menu. Examples of bezel menus are provided above. Step 1802 provides an indication of one or more radial menus associated with the bezel menu. In the illustrated and described embodiment, the indication takes the form of a radial menu icon that appears on a slot or icon of the bezel menu. Step 1804 receives user input associated with one of the radial menus. Examples of how this can be done are provided above. For example, in at least some embodiments, a radial menu can be visually presented to the user so that the user can then touch and stroke in a particular direction to provide the input. Alternately or additionally, a radial menu need not be visually presented. Rather, a user who is familiar with the radial menu's content and behavior can simply gesture as described above to provide the input. Step 1806 accesses, responsive to the received user input, the associated functionality or command.
In one or more embodiments, the bezel menu may or may not be rotated when the screen orientation is rotated. For example, in some instances it may be desirable not to rotate the bezel menu when the screen orientation is rotated. This is particularly relevant in applications where the content should not be rotated, e.g., a journal page or a sketch pad in which the user rotates the screen to afford different drawing angles. In other instances, it may be desirable to rotate the bezel menu when the screen orientation is rotated. By default, it may be desirable to support the same number of bezel menu slots on all four edges of the screen so that menu items can be rotated from the long edge of the screen to the short edge without losing any items.
Alternately or additionally, bezel menus can be customized per screen orientation, so as to enable different numbers of slots to be used on the long and short edges of the screen. In some instances, some edges of the screen may be left without bezel items, depending on the screen orientation. For example, the left and bottom edges, for a right-handed individual, may be more likely to be swiped by accident and can be left without bezel items if desired.
On- and Off-Screen Gestures and Combinations: Page/Object Manipulation
In one or more embodiments, combinations of on-screen and off-screen gestures can be utilized to manipulate pages and/or other objects. For example, such combinations can include a gesture in which input is received on the screen relative to an object using one hand, and additional input, in the form of a bezel gesture, is received relative to that object using the same or a different hand. Any suitable type of gesture combination can be used. As an example, consider Fig. 19.
There, device 1902 includes a bezel 1903. A page 1904 is displayed on the display device (not designated). In the illustrated and described embodiment, a tear operation is performed using a combination of on-screen and off-screen gestures. Specifically, in the bottom-most portion of Fig. 19, the user's left hand — or, more specifically, the left forefinger — holds an object which, in this example, comprises page 1904. Using the right hand, the user initiates a bezel gesture that starts on bezel 1903 and moves, in the direction of the indicated arrow, through a portion of page 1904. By virtue of using a single finger to indicate the tear operation, a partial tear of the page is performed. A tear operation can be implemented by creating a bitmap of the torn-away portion of the page and rendering only the untorn portion of the page. Alternately or additionally, an object can be created to represent the torn-away portion. Within this created object, objects appearing in the torn-away portion can be created to represent the items that appeared on the page.
In one or more other embodiments, a tear operation can be implemented using multiple fingers. In these embodiments, the multi-finger input can be mapped to an operation that tears a page completely out of the canvas or book in which the page appears.
In at least some embodiments, the direction of the tear can carry different semantics with it. For example, a top-to-bottom tear might tear out and delete a page, while a top-to-side tear might tear out the page and allow it to be dragged to a new location.
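Combining the direction semantics with the single- versus multi-finger distinction above yields a small decision table, sketched below. The direction labels and return strings track the text; the function itself is a hypothetical illustration, not part of the specification.

```python
# Sketch of tear semantics: finger count selects partial vs. full tear,
# and tear direction selects what happens to a fully torn page.

def tear_action(direction, finger_count=1):
    """direction: 'top-to-bottom' or 'top-to-side' (assumed labels)."""
    if finger_count == 1:
        return "partial tear of the page"
    if direction == "top-to-bottom":
        return "tear out and delete the page"
    if direction == "top-to-side":
        return "tear out the page for dragging to a new location"
    return "tear the page completely out of its canvas or book"

print(tear_action("top-to-bottom", finger_count=2))
```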
Fig. 20 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those described above and below.
Step 2000 receives on-screen input associated with an object. Any suitable type of on-screen input can be received including, by way of example and not limitation, single-finger input and/or multi-finger input. Step 2002 receives bezel gesture input associated with the object. Any suitable type of bezel gesture input can be received including, by way of example and not limitation, single-finger input and/or multi-finger input. Step 2004 ascertains a functionality associated with the two inputs. Step 2006 accesses the associated functionality. Any suitable type of functionality can be associated with the combination of on-screen and bezel gesture inputs, examples of which are provided above.
Other page manipulations can be provided through the use of gestures, including bezel gestures. For example, page flipping and page saving (also termed "page pocketing") can be provided as described below.
As an example, consider Fig. 21. There, device 2102 includes a bezel 2103 and a page 2104. As shown in the bottom-most portion of Fig. 21, the user can flip to a previous page by using a bezel gesture that starts on bezel 2103 and proceeds rightward, in the direction of the arrow, across the screen. Doing so reveals the previous page 2106. Likewise, to turn to the next page, the user would utilize a similar bezel gesture, but simply in the opposite direction. Using the page-flipping gesture, the user's finger can be lifted at any suitable location on the screen.
In one or more embodiments, the semantics of the page-flipping gesture can vary from that described above. For example, in some instances a page-flipping gesture can be initiated as described above. However, if the user pauses with their finger on the screen, multiple pages can be flipped through. Alternately or additionally, pausing the finger on the screen in the middle of a page-flipping gesture can cause additional controls, such as section tabs, a command palette, or a bezel menu, to appear.
Alternately or additionally, in at least some embodiments, the further the user's finger progresses across the screen, the more pages can be flipped. Alternately or additionally, after a page-flipping gesture is initiated as described above, the finger can then be moved in a circular motion, clockwise or counterclockwise, to flip through multiple pages. In this instance, clockwise motion represents forward flipping and counterclockwise motion represents backward flipping. In this implementation, a circle can be fitted to the last N motion samples, and the speed of flipping can be a function of the circle's diameter. Note that, in this implementation, the user does not have to circle around any particular location on the screen, or even draw a well-formed circle at all. Rather, any curvilinear motion can be mapped to page flipping in an intuitive manner, while also allowing the user to easily stop and reverse course to flip in the opposite direction.
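The curvilinear flipping direction can be determined without fitting a full circle: the sign of the accumulated cross product between successive motion segments distinguishes clockwise from counterclockwise motion. This is one possible sketch; the specification's circle-fitting step (used to scale speed by diameter) is omitted here, and screen coordinates with y increasing downward are assumed.

```python
# Sketch: classify curvilinear finger motion as a forward or backward
# page flip from the turning sign of the path (no well-formed circle
# required). Clockwise = forward, in y-down screen coordinates.

def flip_direction(points):
    """points: recent (x, y) samples of the finger path."""
    turn = 0.0
    for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
        # z-component of the cross product of consecutive segments
        turn += (x1 - x0) * (y2 - y1) - (y1 - y0) * (x2 - x1)
    if turn > 0:
        return "forward"    # net clockwise turning
    if turn < 0:
        return "backward"   # net counterclockwise turning
    return "none"           # straight path so far

# Right-then-down is clockwise in y-down coordinates:
print(flip_direction([(0, 0), (10, 0), (10, 10)]))  # -> forward
```

Because the sign is accumulated incrementally, reversing the curve mid-gesture flips the sign, matching the text's note that the user can easily stop and reverse course.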
In at least some embodiments, a similar gesture can be used to save a page or "pocket" the page. In these embodiments, rather than the gesture terminating on the screen as in the page-flipping example, the gesture can terminate at a bezel portion or other structure of the screen opposite from where the gesture originated. As an example, consider Figs. 22 and 23.
There, device 2202 includes a bezel 2203 and a page 2204. As shown in the bottom-most portion of Fig. 22, the user can save or pocket page 2204 by using a bezel gesture that starts on bezel 2203 and proceeds rightward, in the direction of the arrow, across the screen to the bezel portion opposite the portion from which the gesture originated. Doing so reveals another page 2206. In one or more embodiments, a distance threshold can be defined such that, prior to the threshold, the page-flipping experience described and illustrated in Fig. 21 is provided. Beyond the defined distance threshold, a different page-saving or page-pocketing experience can be provided. For example, in the Fig. 22 illustration, page 2204 has been reduced to a thumbnail. The page-saving or page-pocketing experience can be provided through a combination of passing the minimum distance threshold and a minimum timeout, such as 1/3 second, after the point at which most page-flipping gestures would have been completed. In at least some embodiments, if the user lifts their finger before reaching the opposite-side bezel, a page-flipping operation can be presumed.
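The flip-versus-pocket disambiguation can be sketched as a simple classifier over travel distance, elapsed time, and whether the opposite bezel was reached. The 1/3-second timeout comes from the text; the screen width and the threshold fraction are illustrative assumptions.

```python
# Sketch: distinguish a page flip from a page save/pocket using the
# distance threshold and minimum timeout described above.

SCREEN_WIDTH = 1000                        # px, assumed
DISTANCE_THRESHOLD = 0.75 * SCREEN_WIDTH   # assumed fraction of width
MIN_TIMEOUT = 0.333                        # seconds, per the text

def classify_gesture(travel_px, elapsed_s, reached_opposite_bezel):
    if (reached_opposite_bezel
            and travel_px >= DISTANCE_THRESHOLD
            and elapsed_s >= MIN_TIMEOUT):
        return "page-pocket"  # page is reduced to a thumbnail and saved
    return "page-flip"        # finger lifted early: presume a flip

print(classify_gesture(900, 0.5, True))   # -> page-pocket
print(classify_gesture(400, 0.2, False))  # -> page-flip
```

The same logic carries over to the dual-screen case described next, with the spine standing in for the opposite-side bezel.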
Fig. 23 illustrates a device 2302 that includes a bezel 2303 and two separate display screens 2304, 2306 separated by a spine 2308. The spine 2308 can be considered as comprising part of the bezel or physical structure of the device. A page 2310 is shown displayed on display screen 2304.
As shown in the bottom-most portion of Fig. 23, the user can save or pocket the page by using a bezel gesture that starts on bezel 2303 and proceeds rightward, in the direction of the arrow, across screen 2304 to the spine 2308 located opposite the portion of the bezel from which the gesture originated. Doing so reveals another page 2312. In one or more embodiments, a distance threshold can be defined such that, prior to the threshold, the page-flipping experience described and illustrated in Fig. 21 is provided. Beyond the defined distance threshold, a different page-saving or page-pocketing experience can be provided. For example, in the Fig. 23 illustration, page 2310 has been reduced to a thumbnail. The page-saving or page-pocketing experience can be provided after a minimum timeout, such as 1/3 second, after the point at which most page-flipping gestures would have been completed. In at least some embodiments, if the user lifts their finger before reaching spine 2308, a page-flipping operation can be presumed.
In one or more embodiments, portions of pages can be saved or pocketed. As an example, consider Fig. 24. There, device 2402 includes a bezel 2403 and two separate display screens 2404, 2406 separated by a spine 2408. The spine 2408 can be considered as comprising part of the bezel or physical structure of the device. A page 2410 is shown displayed on display screen 2404.
As shown in the bottom-most portion of Fig. 24, the user can save or pocket a portion of the page by using a bezel gesture. First, two fingers of the user's hand (in this case the left hand) sweep onto the screen from the bezel. In this particular instance, the user's left hand initiates the bezel gesture from the spine 2408 and moves in the direction of the top-most arrow. The region between the fingers, here illustrated at 2412, is then highlighted. The user's other hand can then sweep across the highlighted area to tear out the highlighted portion of the page and pocket or save that portion. In one or more embodiments, this gesture can be supported on any of the four edges of the screen, thus permitting horizontal or vertical strips to be torn from either screen by right-handed or left-handed users. In at least some embodiments, the torn portion of the page can have two torn edges and two clean-cut edges, in order to distinguish it from pocketed pages or other pocketed objects.
Figure 25 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those described above and below.
Step 2500 receives a bezel gesture input relative to a page. Step 2502 ascertains a page-manipulation functionality associated with the input. Any suitable type of page-manipulation functionality can be ascertained, examples of which are provided above. Step 2504 accesses the ascertained page-manipulation functionality.
Figure 26 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those described above and below.
Step 2600 receives an on-screen input relative to a page. Any suitable type of input can be received. In at least some embodiments, the received screen input comprises a touch input or a stylus input. Step 2602 receives a bezel gesture input relative to the page. Any suitable type of bezel gesture input can be received, examples of which are provided above. Step 2604 ascertains a page-manipulation functionality associated with the combined inputs. Examples of page-manipulation functionality are provided above. Step 2606 accesses the ascertained page-manipulation functionality and implements it relative to the page.
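Steps 2600 through 2606 amount to a lookup and dispatch over the combined inputs. A minimal sketch follows; the table entries, function names, and page representation are hypothetical, chosen only to show the dispatch pattern:

```python
# Hypothetical dispatch from a combined (on-screen input, bezel gesture) pair
# to an associated page-manipulation function. Entries are illustrative.

def flip_page(page):
    return dict(page, flipped=True)

def pocket_page(page):
    return dict(page, pocketed=True)

# (on-screen input type, bezel gesture type) -> page-manipulation function
PAGE_MANIPULATIONS = {
    ("touch", "bezel-swipe"): flip_page,
    ("stylus-hold", "bezel-swipe"): pocket_page,
}

def handle_inputs(screen_input, bezel_input, page):
    """Ascertain the functionality associated with the combined inputs
    (step 2604) and implement it relative to the page (step 2606)."""
    func = PAGE_MANIPULATIONS.get((screen_input, bezel_input))
    if func is None:
        return page  # no associated page-manipulation functionality
    return func(page)
```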
Thus, page-flip and page-save operations can be unified through the use of bezel gestures that include at least some common aspects. Unifying the two operations yields simplicity for the user and promotes discoverability.
In one or more embodiments, other page-manipulation operations can be achieved through the use of bezel gestures. As an example, consider Figure 27. There, device 2702 includes a bezel 2703. A page 2704 is shown displayed on a display device (not designated). In the illustrated and described embodiment, a bookmark tab can be created through the use of a bezel gesture. Specifically, as shown in the lowermost portion of Figure 27, a bookmark tab 2706 can be created by initiating a gesture on bezel 2703 and moving onto page 2704. In the illustrated and described embodiment, the bezel gesture that creates the bookmark tab originates on a corner of the bezel as shown. Any suitable location on the bezel can be used to create a bookmark tab.
Alternatively or additionally, bezel gestures can be used to dog-ear a page. As an example, consider Figure 28. There, device 2802 includes a bezel 2803. A page 2804 is shown displayed on a display device (not designated). In the illustrated and described embodiment, a dog-ear can be created through the use of a bezel gesture. Specifically, as shown in the lowermost portion of Figure 28, a dog-ear 2806 can be created by initiating a gesture on bezel 2803, moving onto page 2804, and then exiting the page in an opposite direction as shown by the arrows. In the illustrated and described embodiment, the bezel gesture that creates the dog-ear originates on a corner of the bezel as shown. Any suitable location on the bezel can be used to create a dog-ear. For example, in other embodiments, a dog-ear can be created through a bezel gesture that slices across a corner of the page.
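The dog-ear gesture (enter the page from the bezel, then exit back the way the gesture came) can be recognized from the gesture's offset history. This is a rough illustration under stated assumptions; the function name and the offset-based model are hypothetical:

```python
# Illustrative recognizer for the "enter then reverse out" dog-ear gesture
# described above. The path model (x-offsets from the bezel edge over time)
# is an assumption made for this sketch.

def is_dog_ear(path):
    """path is a list of offsets from the bezel edge, sampled over time.

    A dog-ear is assumed when the gesture first moves onto the page
    (offset rises above zero) and then exits back across the bezel
    in the opposite direction (offset returns to zero or below).
    """
    if len(path) < 3:
        return False
    peak = max(path)
    peak_index = path.index(peak)
    entered = peak > 0 and peak_index > 0
    exited = path[-1] <= 0 and peak_index < len(path) - 1
    return entered and exited
```

A gesture that enters the page but lifts without reversing would instead fall through to the bookmark-tab behavior of Figure 27.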
In one or more embodiments, gestures can be used to expose tab structure, such as tabs a user has created in a document or predefined tabs. As an example, consider Figure 29. There, device 2902 includes a bezel 2903. A page 2904 is shown displayed on a display device (not designated). In one or more embodiments, tabs can be exposed through a bezel gesture that pulls at the edge of page 2904 to expose a tab structure 2906. As the bezel gesture moves onto the screen, the page is pulled slightly to the right, exposing tab structure 2906. In this case, the gesture includes two or more fingers held together, as shown, rather than with a gap between the fingers.
In one or more embodiments, continuing to drag the page can reveal further structure. For example, continuing to drag the page can expose an organizational view on the left side of page 2904. In at least some embodiments, continuing the gesture across the entire page can save, or pocket, the entire page as described above.
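This progressive reveal (slight pull exposes tabs, a longer drag exposes the organizational view, a drag across the full page pockets it) can be modeled as distance-gated stages. The stage names and pixel thresholds below are illustrative assumptions:

```python
# Sketch of the progressive reveal described above: dragging the page further
# exposes successively more structure. Thresholds are assumed values.

REVEAL_STAGES = [
    (40,  "tab-structure"),      # slight pull: tab structure (e.g., 2906) shown
    (200, "organization-view"),  # continued drag: organizational view exposed
    (600, "pocket-page"),        # drag across the full page: save/pocket it
]

def reveal_stage(drag_px):
    """Return the deepest reveal stage reached for a given drag distance,
    or None if the drag is too short to expose anything."""
    stage = None
    for threshold, name in REVEAL_STAGES:
        if drag_px >= threshold:
            stage = name
    return stage
```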
Figure 30 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those described above and below.
Step 3000 receives a bezel gesture input relative to a page. Step 3002 creates a bookmark tab relative to the page responsive to receiving the bezel gesture input. Examples of how this can be done are provided above.
Figure 31 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those described above and below.
Step 3100 receives a bezel gesture input relative to a page. Step 3102 creates a dog-ear on the page responsive to receiving the bezel gesture input. Examples of how this can be done are provided above.
Figure 32 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method can be implemented in connection with a system such as those described above and below.
Step 3200 receives a bezel gesture input relative to a page. Step 3202 exposes a tab structure associated with the page. Examples of how this can be done are provided above.
Example Device
Figure 33 illustrates various components of an example device 3300 that can be implemented as any type of portable and/or computer device as described with reference to Figures 1 and 2 to implement embodiments of the gesture techniques described herein. Device 3300 includes communication devices 3302 that enable wired and/or wireless communication of device data 3304 (e.g., received data, data being received, data scheduled for broadcast, data packets of the data, etc.). The device data 3304 or other device content can include configuration settings of the device, media content stored on the device, and/or information associated with a user of the device. Media content stored on device 3300 can include any type of audio, video, and/or image data. Device 3300 includes one or more data inputs 3306 via which any type of data, media content, and/or inputs can be received, such as user-selectable inputs, messages, music, television media content, recorded video content, and any other type of audio, video, and/or image data received from any content and/or data source.
Device 3300 also includes communication interfaces 3308, which can be implemented as any one or more of a serial and/or parallel interface, a wireless interface, any type of network interface, a modem, and any other type of communication interface. The communication interfaces 3308 provide a connection and/or communication links between device 3300 and a communication network by which other electronic, computing, and communication devices can communicate with device 3300.
Device 3300 includes one or more processors 3310 (e.g., any of microprocessors, controllers, and the like) which process various computer-executable or readable instructions to control the operation of device 3300 and to implement the gesture embodiments described above. Alternatively or additionally, device 3300 can be implemented with any one or combination of hardware, firmware, or fixed-logic circuitry implemented in connection with processing and control circuits generally identified at 3312. Although not shown, device 3300 can include a system bus or data transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
Device 3300 also includes computer-readable media 3314, such as one or more memory components, examples of which include random access memory (RAM), non-volatile memory (e.g., any one or more of a read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device. A disk storage device may be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewritable compact disc (CD), any type of digital versatile disc (DVD), and the like. Device 3300 can also include a mass storage media device 3316.
Computer-readable media 3314 provides data storage mechanisms to store the device data 3304, as well as various device applications 3318 and any other types of information and/or data related to operational aspects of device 3300. For example, an operating system 3320 can be maintained as a computer application with the computer-readable media 3314 and executed on processors 3310. The device applications 3318 can include a device manager (e.g., a control application, software application, signal processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, etc.). The device applications 3318 also include any system components or modules that implement embodiments of the gesture techniques described herein. In this example, the device applications 3318 include an interface application 3322 and a gesture-capture driver 3324 that are shown as software modules and/or computer applications. The gesture-capture driver 3324 is representative of software that is used to provide an interface with a device configured to capture gestures, such as a touch screen, track pad, camera, and so on. Alternatively or additionally, the interface application 3322 and the gesture-capture driver 3324 can be implemented as hardware, software, firmware, or any combination thereof.
Device 3300 also includes an audio and/or video input-output system 3326 that provides audio data to an audio system 3328 and/or provides video data to a display system 3330. The audio system 3328 and/or the display system 3330 can include any devices that process, display, and/or otherwise render audio, video, and image data. Video signals and audio signals can be communicated from device 3300 to an audio device and/or to a display device via an RF (radio frequency) link, S-video link, composite video link, component video link, DVI (digital video interface), analog audio connection, or other similar communication link. In an embodiment, the audio system 3328 and/or the display system 3330 are implemented as external components to device 3300. Alternatively, the audio system 3328 and/or the display system 3330 are implemented as integrated components of the example device 3300.
Conclusion
Bezel gestures for touch displays have been described. In at least some embodiments, the bezel of a device is used to extend functionality that is accessible through the use of bezel gestures. In at least some embodiments, off-screen motion can be used, by virtue of the bezel, to create screen input through a bezel gesture. Bezel gestures can include single-finger bezel gestures, multiple-finger/same-hand bezel gestures, and/or multiple-finger, different-hand bezel gestures.
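The mapping from gesture composition (finger count, one hand or two) to distinct functionality can be sketched as a lookup table; the entries below are hypothetical pairings chosen only to illustrate the idea:

```python
# Hypothetical mapping from bezel gesture composition to functionality,
# illustrating that gestures with different numbers of fingers (and hands)
# can be mapped to different functions or objects. Entries are assumptions.

BEZEL_GESTURE_MAP = {
    (1, "same-hand"): "flip-page",
    (2, "same-hand"): "tear-region",
    (2, "different-hands"): "pocket-selection",
}

def function_for_gesture(finger_count, hands):
    """Resolve a bezel gesture to its mapped function, or "no-op"."""
    return BEZEL_GESTURE_MAP.get((finger_count, hands), "no-op")
```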
Although the embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that the embodiments defined in the appended claims are not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed embodiments.
Claims (12)
1. A method of performing input on a computing device, comprising:
receiving a bezel gesture relative to a visual affordance on a bezel of the computing device, wherein the bezel gesture starts on the bezel and moves onto a display device associated with the computing device, the visual affordance serving to provide a bezel menu associated with the bezel, wherein the bezel gesture can be performed with one or more fingers, and bezel gestures performed with different numbers of fingers can be mapped to different functionality or different objects; and
responsive at least in part to said receiving, accessing functionality that can be obtained through the bezel menu.
2. the method for claim 1, it is characterised in that receive described bezel gestures before being additionally included in the described function of access and in response to relative to the enlightenment of described vision, present described bezel menu.
3. The method of claim 2, wherein the bezel menu comprises a drop-down menu.
4. the method for claim 1, it is characterised in that described bezel menu includes multiple groove, each groove is associated with corresponding function.
5. the method for claim 1, it is characterised in that described bezel menu is configured to enable the access to order, instrument and object.
6. A method of performing input on a computing device, comprising:
implementing an accessible bezel menu on the computing device;
displaying a visual affordance associated with the accessible bezel menu, the visual affordance being configured to be used to access functionality associated with the accessible bezel menu; and
accessing functionality through a bezel gesture input, at least some of the functionality being associated with the accessible bezel menu, the bezel gesture input comprising an input that crosses or re-crosses the bezel, wherein the bezel gesture can be performed with one or more fingers, and bezel gestures performed with different numbers of fingers can be mapped to different functionality or different objects.
7. The method of claim 6, wherein the bezel gesture input further comprises at least one of: an input that starts and ends on the bezel, an input that crosses the bezel at different locations, an input that occurs on one or more bezels associated with multiple screens, an input comprising a single-contact drag, an input comprising a dual-contact drag, or an input comprising a hand-contact drag.
8. The method of claim 6, wherein the bezel menu comprises a plurality of slots, each slot being associated with an associated functionality.
9. The method of claim 6, wherein the bezel menu comprises a plurality of slots, each slot being associated with an associated functionality, and wherein at least one slot has a reduced width relative to other slots.
10. The method of claim 6, wherein the bezel menu comprises a plurality of slots, each slot being associated with an associated functionality, wherein at least one slot has a reduced width relative to other slots, and wherein the at least one slot comprises a corner slot.
11. The method of claim 6, further comprising enabling access to the functionality associated with the accessible bezel menu through different gesture modes.
12. The method of claim 6, further comprising enabling access to the functionality associated with the accessible bezel menu through different gesture modes, wherein a first gesture mode comprises a novice mode in which the accessible bezel menu is revealed, and a second gesture mode comprises an expert mode in which the accessible bezel menu is not revealed.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/709,204 | 2010-02-19 | ||
US12/709,204 US9274682B2 (en) | 2010-02-19 | 2010-02-19 | Off-screen gestures to create on-screen input |
PCT/US2011/025131 WO2011103218A2 (en) | 2010-02-19 | 2011-02-17 | Off-screen gestures to create on-screen input |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102884498A CN102884498A (en) | 2013-01-16 |
CN102884498B true CN102884498B (en) | 2016-07-06 |
Family
ID=44476093
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201180009579.2A Active CN102884498B (en) | 2010-02-19 | 2011-02-17 | The method carrying out on the computing device inputting |
Country Status (6)
Country | Link |
---|---|
US (1) | US9274682B2 (en) |
EP (1) | EP2537088B1 (en) |
JP (1) | JP5883400B2 (en) |
CN (1) | CN102884498B (en) |
CA (1) | CA2788137C (en) |
WO (1) | WO2011103218A2 (en) |
Families Citing this family (163)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8018440B2 (en) | 2005-12-30 | 2011-09-13 | Microsoft Corporation | Unintentional touch rejection |
JP2010191892A (en) * | 2009-02-20 | 2010-09-02 | Sony Corp | Information processing apparatus, display control method, and program |
US8803474B2 (en) * | 2009-03-25 | 2014-08-12 | Qualcomm Incorporated | Optimization of wireless power devices |
US8836648B2 (en) | 2009-05-27 | 2014-09-16 | Microsoft Corporation | Touch pull-in gesture |
US8239785B2 (en) | 2010-01-27 | 2012-08-07 | Microsoft Corporation | Edge gestures |
US8261213B2 (en) * | 2010-01-28 | 2012-09-04 | Microsoft Corporation | Brush, carbon-copy, and fill gestures |
US9411504B2 (en) | 2010-01-28 | 2016-08-09 | Microsoft Technology Licensing, Llc | Copy and staple gestures |
US9519356B2 (en) | 2010-02-04 | 2016-12-13 | Microsoft Technology Licensing, Llc | Link gestures |
US9965165B2 (en) | 2010-02-19 | 2018-05-08 | Microsoft Technology Licensing, Llc | Multi-finger gestures |
US9367205B2 (en) | 2010-02-19 | 2016-06-14 | Microsoft Technolgoy Licensing, Llc | Radial menus with bezel gestures |
US20110209098A1 (en) * | 2010-02-19 | 2011-08-25 | Hinckley Kenneth P | On and Off-Screen Gesture Combinations |
US8799827B2 (en) | 2010-02-19 | 2014-08-05 | Microsoft Corporation | Page manipulations using on and off-screen gestures |
US9310994B2 (en) | 2010-02-19 | 2016-04-12 | Microsoft Technology Licensing, Llc | Use of bezel as an input mechanism |
US8539384B2 (en) * | 2010-02-25 | 2013-09-17 | Microsoft Corporation | Multi-screen pinch and expand gestures |
US8751970B2 (en) | 2010-02-25 | 2014-06-10 | Microsoft Corporation | Multi-screen synchronous slide gesture |
US9454304B2 (en) | 2010-02-25 | 2016-09-27 | Microsoft Technology Licensing, Llc | Multi-screen dual tap gesture |
US8473870B2 (en) | 2010-02-25 | 2013-06-25 | Microsoft Corporation | Multi-screen hold and drag gesture |
US8707174B2 (en) | 2010-02-25 | 2014-04-22 | Microsoft Corporation | Multi-screen hold and page-flip gesture |
US9075522B2 (en) * | 2010-02-25 | 2015-07-07 | Microsoft Technology Licensing, Llc | Multi-screen bookmark hold gesture |
KR20110121125A (en) * | 2010-04-30 | 2011-11-07 | 삼성전자주식회사 | Interactive display apparatus and operating method thereof |
US20110296351A1 (en) * | 2010-05-26 | 2011-12-01 | T-Mobile Usa, Inc. | User Interface with Z-axis Interaction and Multiple Stacks |
US20120050183A1 (en) | 2010-08-27 | 2012-03-01 | Google Inc. | Switching display modes based on connection state |
JP5001412B2 (en) * | 2010-09-03 | 2012-08-15 | 株式会社コナミデジタルエンタテインメント | GAME DEVICE, GAME CONTROL METHOD, AND PROGRAM |
EP2434387B1 (en) | 2010-09-24 | 2020-01-08 | 2236008 Ontario Inc. | Portable electronic device and method therefor |
EP3451123B8 (en) * | 2010-09-24 | 2020-06-17 | BlackBerry Limited | Method for conserving power on a portable electronic device and a portable electronic device configured for the same |
CA2811253C (en) | 2010-09-24 | 2018-09-04 | Research In Motion Limited | Transitional view on a portable electronic device |
EP2434388B1 (en) * | 2010-09-24 | 2017-12-20 | BlackBerry Limited | Portable electronic device and method of controlling same |
US8823640B1 (en) | 2010-10-22 | 2014-09-02 | Scott C. Harris | Display reconfiguration and expansion across multiple devices |
US20120159395A1 (en) | 2010-12-20 | 2012-06-21 | Microsoft Corporation | Application-launching interface for multiple modes |
US8689123B2 (en) | 2010-12-23 | 2014-04-01 | Microsoft Corporation | Application reporting in an application-selectable user interface |
US8612874B2 (en) | 2010-12-23 | 2013-12-17 | Microsoft Corporation | Presenting an application change through a tile |
US8446363B1 (en) | 2010-12-30 | 2013-05-21 | Google Inc. | Enhanced input using touch screen |
JP5197777B2 (en) * | 2011-02-01 | 2013-05-15 | 株式会社東芝 | Interface device, method, and program |
US20120256829A1 (en) * | 2011-04-05 | 2012-10-11 | Qnx Software Systems Limited | Portable electronic device and method of controlling same |
CN103999028B (en) * | 2011-05-23 | 2018-05-15 | 微软技术许可有限责任公司 | Invisible control |
US9104307B2 (en) | 2011-05-27 | 2015-08-11 | Microsoft Technology Licensing, Llc | Multi-application environment |
US9158445B2 (en) | 2011-05-27 | 2015-10-13 | Microsoft Technology Licensing, Llc | Managing an immersive interface in a multi-application immersive environment |
US9658766B2 (en) | 2011-05-27 | 2017-05-23 | Microsoft Technology Licensing, Llc | Edge gesture |
US8893033B2 (en) | 2011-05-27 | 2014-11-18 | Microsoft Corporation | Application notifications |
US9104440B2 (en) | 2011-05-27 | 2015-08-11 | Microsoft Technology Licensing, Llc | Multi-application environment |
US20140123080A1 (en) * | 2011-06-07 | 2014-05-01 | Beijing Lenovo Software Ltd. | Electrical Device, Touch Input Method And Control Method |
US9792017B1 (en) | 2011-07-12 | 2017-10-17 | Domo, Inc. | Automatic creation of drill paths |
US9202297B1 (en) * | 2011-07-12 | 2015-12-01 | Domo, Inc. | Dynamic expansion of data visualizations |
US9454299B2 (en) * | 2011-07-21 | 2016-09-27 | Nokia Technologies Oy | Methods, apparatus, computer-readable storage mediums and computer programs for selecting functions in a graphical user interface |
US9417754B2 (en) | 2011-08-05 | 2016-08-16 | P4tents1, LLC | User interface system, method, and computer program product |
US8743069B2 (en) * | 2011-09-01 | 2014-06-03 | Google Inc. | Receiving input at a computing device |
US20130057587A1 (en) | 2011-09-01 | 2013-03-07 | Microsoft Corporation | Arranging tiles |
US9146670B2 (en) | 2011-09-10 | 2015-09-29 | Microsoft Technology Licensing, Llc | Progressively indicating new content in an application-selectable user interface |
US10318146B2 (en) * | 2011-09-12 | 2019-06-11 | Microsoft Technology Licensing, Llc | Control area for a touch screen |
US8810535B2 (en) | 2011-10-18 | 2014-08-19 | Blackberry Limited | Electronic device and method of controlling same |
JP5816516B2 (en) | 2011-10-24 | 2015-11-18 | 京セラ株式会社 | Electronic device, control program, and process execution method |
US9990119B2 (en) * | 2011-12-15 | 2018-06-05 | Blackberry Limited | Apparatus and method pertaining to display orientation |
US20130154959A1 (en) * | 2011-12-20 | 2013-06-20 | Research In Motion Limited | System and method for controlling an electronic device |
US9030407B2 (en) * | 2011-12-21 | 2015-05-12 | Nokia Technologies Oy | User gesture recognition |
US9250768B2 (en) * | 2012-02-13 | 2016-02-02 | Samsung Electronics Co., Ltd. | Tablet having user interface |
US9594499B2 (en) * | 2012-02-21 | 2017-03-14 | Nokia Technologies Oy | Method and apparatus for hover-based spatial searches on mobile maps |
US9778706B2 (en) * | 2012-02-24 | 2017-10-03 | Blackberry Limited | Peekable user interface on a portable electronic device |
US20130222272A1 (en) * | 2012-02-28 | 2013-08-29 | Research In Motion Limited | Touch-sensitive navigation in a tab-based application interface |
CN102629185A (en) * | 2012-02-29 | 2012-08-08 | 中兴通讯股份有限公司 | Processing method of touch operation and mobile terminal |
ES2845274T3 (en) * | 2012-03-06 | 2021-07-26 | Huawei Device Co Ltd | Touch screen and terminal operation method |
US10078420B2 (en) * | 2012-03-16 | 2018-09-18 | Nokia Technologies Oy | Electronic devices, associated apparatus and methods |
FR2990020B1 (en) * | 2012-04-25 | 2014-05-16 | Fogale Nanotech | CAPACITIVE DETECTION DEVICE WITH ARRANGEMENT OF CONNECTION TRACKS, AND METHOD USING SUCH A DEVICE. |
US9395852B2 (en) * | 2012-05-07 | 2016-07-19 | Cirque Corporation | Method for distinguishing between edge swipe gestures that enter a touch sensor from an edge and other similar but non-edge swipe actions |
JP6082458B2 (en) | 2012-05-09 | 2017-02-15 | アップル インコーポレイテッド | Device, method, and graphical user interface for providing tactile feedback of actions performed within a user interface |
WO2013169865A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input |
AU2013259637B2 (en) | 2012-05-09 | 2016-07-07 | Apple Inc. | Device, method, and graphical user interface for selecting user interface objects |
EP3401773A1 (en) | 2012-05-09 | 2018-11-14 | Apple Inc. | Device, method, and graphical user interface for displaying additional information in response to a user contact |
WO2013169300A1 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Thresholds for determining feedback in computing devices |
JP6002836B2 (en) | 2012-05-09 | 2016-10-05 | アップル インコーポレイテッド | Device, method, and graphical user interface for transitioning between display states in response to a gesture |
WO2013169843A1 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for manipulating framed graphical objects |
WO2013169851A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for facilitating user interaction with controls in a user interface |
WO2013169849A2 (en) | 2012-05-09 | 2013-11-14 | Industries Llc Yknots | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
JP6182207B2 (en) | 2012-05-09 | 2017-08-16 | アップル インコーポレイテッド | Device, method, and graphical user interface for providing feedback for changing an activation state of a user interface object |
CN107728906B (en) | 2012-05-09 | 2020-07-31 | 苹果公司 | Device, method and graphical user interface for moving and placing user interface objects |
WO2013169845A1 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for scrolling nested regions |
WO2013169875A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for displaying content associated with a corresponding affordance |
WO2013169842A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for selecting object within a group of objects |
US10108265B2 (en) | 2012-05-09 | 2018-10-23 | Apple Inc. | Calibration of haptic feedback systems for input devices |
KR101710771B1 (en) * | 2012-05-18 | 2017-02-27 | 애플 인크. | Device, method, and graphical user interface for manipulating user interfaces based on fingerprint sensor inputs |
WO2013188307A2 (en) | 2012-06-12 | 2013-12-19 | Yknots Industries Llc | Haptic electromagnetic actuator |
CN102830901A (en) * | 2012-06-29 | 2012-12-19 | 鸿富锦精密工业(深圳)有限公司 | Office device |
US8994678B2 (en) * | 2012-07-18 | 2015-03-31 | Samsung Electronics Co., Ltd. | Techniques for programmable button on bezel of mobile terminal |
US9886116B2 (en) * | 2012-07-26 | 2018-02-06 | Apple Inc. | Gesture and touch input detection through force sensing |
US20140055369A1 (en) * | 2012-08-22 | 2014-02-27 | Qualcomm Innovation Center, Inc. | Single-gesture mobile computing device operations |
CN107247538B (en) * | 2012-09-17 | 2020-03-20 | 华为终端有限公司 | Touch operation processing method and terminal device |
US9411461B2 (en) * | 2012-10-17 | 2016-08-09 | Adobe Systems Incorporated | Moveable interactive shortcut toolbar and unintentional hit rejecter for touch input devices |
TWI493386B (en) * | 2012-10-22 | 2015-07-21 | Elan Microelectronics Corp | Cursor control device and controlling method for starting operating system function menu by using the same |
US9582122B2 (en) | 2012-11-12 | 2017-02-28 | Microsoft Technology Licensing, Llc | Touch-sensitive bezel techniques |
US9542018B2 (en) * | 2012-11-12 | 2017-01-10 | Htc Corporation | Touch panel and electronic apparatus |
JP2015535628A (en) | 2012-11-27 | 2015-12-14 | トムソン ライセンシングThomson Licensing | Adaptive virtual keyboard |
WO2014083368A1 (en) | 2012-11-27 | 2014-06-05 | Thomson Licensing | Adaptive virtual keyboard |
JP2014127103A (en) * | 2012-12-27 | 2014-07-07 | Brother Ind Ltd | Material sharing program, terminal device, and material sharing method |
CN104903834B (en) | 2012-12-29 | 2019-07-05 | 苹果公司 | For equipment, method and the graphic user interface in touch input to transition between display output relation |
AU2013368441B2 (en) | 2012-12-29 | 2016-04-14 | Apple Inc. | Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture |
CN104885050B (en) | 2012-12-29 | 2017-12-08 | 苹果公司 | For determining the equipment, method and the graphic user interface that are rolling or selection content |
WO2014105277A2 (en) | 2012-12-29 | 2014-07-03 | Yknots Industries Llc | Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics |
CN109375853A (en) | 2012-12-29 | 2019-02-22 | 苹果公司 | To equipment, method and the graphic user interface of the navigation of user interface hierarchical structure |
WO2014105279A1 (en) | 2012-12-29 | 2014-07-03 | Yknots Industries Llc | Device, method, and graphical user interface for switching between user interfaces |
CN103076959A (en) | 2013-02-06 | 2013-05-01 | 华为终端有限公司 | Electronic equipment and screen unlocking method thereof |
US9304587B2 (en) | 2013-02-13 | 2016-04-05 | Apple Inc. | Force sensing mouse |
US10578499B2 (en) | 2013-02-17 | 2020-03-03 | Microsoft Technology Licensing, Llc | Piezo-actuated virtual buttons for touch surfaces |
US20140232679A1 (en) * | 2013-02-17 | 2014-08-21 | Microsoft Corporation | Systems and methods to protect against inadvertant actuation of virtual buttons on touch surfaces |
CN104035694B (en) * | 2013-03-04 | 2018-11-09 | 观致汽车有限公司 | Motor multimedia exchange method and device |
CN103150095B (en) * | 2013-03-07 | 2015-12-23 | 东莞宇龙通信科技有限公司 | Terminal and terminal control method |
US20140282240A1 (en) * | 2013-03-15 | 2014-09-18 | William Joseph Flynn, III | Interactive Elements for Launching from a User Interface |
CN104063037B (en) * | 2013-03-18 | 2017-03-29 | 联想(北京)有限公司 | A kind of operational order recognition methods, device and Wearable electronic equipment |
US9063576B1 (en) * | 2013-04-04 | 2015-06-23 | Amazon Technologies, Inc. | Managing gesture input information |
KR102157270B1 (en) * | 2013-04-26 | 2020-10-23 | 삼성전자주식회사 | User terminal device with a pen and control method thereof |
US20140347326A1 (en) * | 2013-05-21 | 2014-11-27 | Samsung Electronics Co., Ltd. | User input using hovering input |
US10481769B2 (en) * | 2013-06-09 | 2019-11-19 | Apple Inc. | Device, method, and graphical user interface for providing navigation and search functionalities |
JP2015005182A (en) * | 2013-06-21 | 2015-01-08 | カシオ計算機株式会社 | Input device, input method, program and electronic apparatus |
WO2015020639A1 (en) * | 2013-08-06 | 2015-02-12 | Intel Corporation | Dual screen visibility with virtual transparency |
US9423946B2 (en) * | 2013-08-12 | 2016-08-23 | Apple Inc. | Context sensitive actions in response to touch input |
GB2519558A (en) | 2013-10-24 | 2015-04-29 | Ibm | Touchscreen device with motion sensor |
CN105940385B (en) * | 2013-11-07 | 2021-06-25 | Intel Corporation | Controlling primary and secondary displays from a single touch screen |
KR101870848B1 (en) | 2013-12-30 | 2018-06-25 | Huawei Technologies Co., Ltd. | Side menu displaying method and apparatus and terminal |
US9448631B2 (en) | 2013-12-31 | 2016-09-20 | Microsoft Technology Licensing, Llc | Input device haptics and pressure sensing |
US9600172B2 (en) * | 2014-01-03 | 2017-03-21 | Apple Inc. | Pull down navigation mode |
US20150242037A1 (en) | 2014-01-13 | 2015-08-27 | Apple Inc. | Transparent force sensor with strain relief |
CN103793055A (en) * | 2014-01-20 | 2014-05-14 | Huawei Device Co., Ltd. | Method and terminal for responding to gesture |
US9477337B2 (en) | 2014-03-14 | 2016-10-25 | Microsoft Technology Licensing, Llc | Conductive trace routing for display and bezel sensors |
WO2015159774A1 (en) * | 2014-04-14 | 2015-10-22 | Sharp Corporation | Input device and method for controlling input device |
JP6237890B2 (en) * | 2014-04-18 | 2017-11-29 | Murata Manufacturing Co., Ltd. | Display device and program |
US9798399B2 (en) | 2014-06-02 | 2017-10-24 | Synaptics Incorporated | Side sensing for electronic devices |
US10297119B1 (en) | 2014-09-02 | 2019-05-21 | Apple Inc. | Feedback device in an electronic device |
US20160077793A1 (en) * | 2014-09-15 | 2016-03-17 | Microsoft Corporation | Gesture shortcuts for invocation of voice input |
US9939901B2 (en) | 2014-09-30 | 2018-04-10 | Apple Inc. | Haptic feedback assembly |
US10048767B2 (en) * | 2014-11-06 | 2018-08-14 | Samsung Electronics Co., Ltd. | Electronic apparatus and method of controlling multi-vision screen including a plurality of display apparatuses |
KR102327146B1 (en) * | 2014-11-06 | 2021-11-16 | Samsung Electronics Co., Ltd. | Electronic apparatus and method for controlling a screen of display apparatus |
CN104503682A (en) * | 2014-11-07 | 2015-04-08 | MediaTek Singapore Pte. Ltd. | Method for processing screen display window and mobile terminal |
CN105759950B (en) * | 2014-12-18 | 2019-08-02 | Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. | Mobile terminal information input method and mobile terminal |
US9798409B1 (en) | 2015-03-04 | 2017-10-24 | Apple Inc. | Multi-force input device |
US9645732B2 (en) | 2015-03-08 | 2017-05-09 | Apple Inc. | Devices, methods, and graphical user interfaces for displaying and using menus |
US9990107B2 (en) | 2015-03-08 | 2018-06-05 | Apple Inc. | Devices, methods, and graphical user interfaces for displaying and using menus |
US9632664B2 (en) | 2015-03-08 | 2017-04-25 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10048757B2 (en) | 2015-03-08 | 2018-08-14 | Apple Inc. | Devices and methods for controlling media presentation |
US10095396B2 (en) | 2015-03-08 | 2018-10-09 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US9785305B2 (en) | 2015-03-19 | 2017-10-10 | Apple Inc. | Touch input cursor manipulation |
US9639184B2 (en) | 2015-03-19 | 2017-05-02 | Apple Inc. | Touch input cursor manipulation |
US20170045981A1 (en) | 2015-08-10 | 2017-02-16 | Apple Inc. | Devices and Methods for Processing Touch Inputs Based on Their Intensities |
US10152208B2 (en) | 2015-04-01 | 2018-12-11 | Apple Inc. | Devices and methods for processing touch inputs based on their intensities |
US10346030B2 (en) | 2015-06-07 | 2019-07-09 | Apple Inc. | Devices and methods for navigating between user interfaces |
US9830048B2 (en) | 2015-06-07 | 2017-11-28 | Apple Inc. | Devices and methods for processing touch inputs with instructions in a web page |
US10200598B2 (en) | 2015-06-07 | 2019-02-05 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US9891811B2 (en) | 2015-06-07 | 2018-02-13 | Apple Inc. | Devices and methods for navigating between user interfaces |
US9674426B2 (en) | 2015-06-07 | 2017-06-06 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US9860451B2 (en) | 2015-06-07 | 2018-01-02 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10235035B2 (en) | 2015-08-10 | 2019-03-19 | Apple Inc. | Devices, methods, and graphical user interfaces for content navigation and manipulation |
US10416800B2 (en) | 2015-08-10 | 2019-09-17 | Apple Inc. | Devices, methods, and graphical user interfaces for adjusting user interface objects |
US9880735B2 (en) | 2015-08-10 | 2018-01-30 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10248308B2 (en) | 2015-08-10 | 2019-04-02 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures |
US10430051B2 (en) * | 2015-12-29 | 2019-10-01 | Facebook, Inc. | Multi-user content presentation system |
US9454259B2 (en) | 2016-01-04 | 2016-09-27 | Secugen Corporation | Multi-level command sensing apparatus |
DE102016208575A1 (en) * | 2016-05-19 | 2017-11-23 | Heidelberger Druckmaschinen Ag | Touchpad with gesture control for wallscreen |
WO2018068207A1 (en) * | 2016-10-11 | 2018-04-19 | Huawei Technologies Co., Ltd. | Method and device for identifying operation, and mobile terminal |
JP6511499B2 (en) * | 2017-10-18 | 2019-05-15 | Huawei Technologies Co., Ltd. | Side menu display method, device and terminal |
KR102442457B1 (en) | 2017-11-10 | 2022-09-14 | Samsung Electronics Co., Ltd. | Electronic device and control method thereof |
USD875740S1 (en) * | 2018-03-14 | 2020-02-18 | Google Llc | Display screen with graphical user interface |
USD879114S1 (en) * | 2018-03-29 | 2020-03-24 | Google Llc | Display screen with graphical user interface |
CN111433723A (en) * | 2018-06-27 | 2020-07-17 | Huawei Technologies Co., Ltd. | Shortcut key control method and terminal |
JP6686084B2 (en) * | 2018-08-31 | 2020-04-22 | レノボ・シンガポール・プライベート・リミテッド | Electronics |
JP7382863B2 (en) * | 2020-03-16 | 2023-11-17 | Wacom Co., Ltd. | Pointer position detection method and sensor controller |
US11409410B2 (en) | 2020-09-14 | 2022-08-09 | Apple Inc. | User input interfaces |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1936799A (en) * | 2005-09-23 | 2007-03-28 | Hongfujin Precision Industry (Shenzhen) Co., Ltd. | User operation control apparatus and method |
CN101432677A (en) * | 2005-03-04 | 2009-05-13 | Apple Inc. | Electronic device having display and surrounding touch sensitive bezel for user interface and control |
Family Cites Families (269)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4686332A (en) | 1986-06-26 | 1987-08-11 | International Business Machines Corporation | Combined finger touch and stylus detection system for use on the viewing surface of a visual display device |
US4843538A (en) * | 1985-04-30 | 1989-06-27 | Prometrix Corporation | Multi-level dynamic menu which suppresses display of items previously designated as non-selectable |
US6167439A (en) | 1988-05-27 | 2000-12-26 | Kodak Limited | Data retrieval, manipulation and transmission with facsimile images |
US5231578A (en) * | 1988-11-01 | 1993-07-27 | Wang Laboratories, Inc. | Apparatus for document annotation and manipulation using images from a window source |
US5237647A (en) | 1989-09-15 | 1993-08-17 | Massachusetts Institute Of Technology | Computer aided drawing in three dimensions |
US5898434A (en) * | 1991-05-15 | 1999-04-27 | Apple Computer, Inc. | User interface system having programmable user interface elements |
US5349658A (en) * | 1991-11-01 | 1994-09-20 | Rourke Thomas C O | Graphical user interface |
US5351995A (en) | 1992-01-29 | 1994-10-04 | Apple Computer, Inc. | Double-sided, reversible electronic paper |
US5661773A (en) | 1992-03-19 | 1997-08-26 | Wisconsin Alumni Research Foundation | Interface for radiation therapy machine |
US5821930A (en) | 1992-08-23 | 1998-10-13 | U S West, Inc. | Method and system for generating a working window in a computer system |
US6097392A (en) | 1992-09-10 | 2000-08-01 | Microsoft Corporation | Method and system of altering an attribute of a graphic object in a pen environment |
US5463725A (en) | 1992-12-31 | 1995-10-31 | International Business Machines Corp. | Data processing system graphical user interface which emulates printed material |
DE69430967T2 (en) * | 1993-04-30 | 2002-11-07 | Xerox Corp | Interactive copying system |
DE69432199T2 (en) * | 1993-05-24 | 2004-01-08 | Sun Microsystems, Inc., Mountain View | Graphical user interface with methods for interfacing with remote control devices |
US5583984A (en) * | 1993-06-11 | 1996-12-10 | Apple Computer, Inc. | Computer system with graphical user interface including automated enclosures |
US5497776A (en) * | 1993-08-05 | 1996-03-12 | Olympus Optical Co., Ltd. | Ultrasonic image diagnosing apparatus for displaying three-dimensional image |
US5596697A (en) * | 1993-09-30 | 1997-01-21 | Apple Computer, Inc. | Method for routing items within a computer system |
US5664133A (en) | 1993-12-13 | 1997-09-02 | Microsoft Corporation | Context sensitive menu system/menu behavior |
US5491783A (en) * | 1993-12-30 | 1996-02-13 | International Business Machines Corporation | Method and apparatus for facilitating integrated icon-based operations in a data processing system |
JPH086707A (en) * | 1993-12-30 | 1996-01-12 | Xerox Corp | Screen-directivity-display processing system |
US5555369A (en) | 1994-02-14 | 1996-09-10 | Apple Computer, Inc. | Method of creating packages for a pointer-based computer system |
US5664128A (en) * | 1995-02-23 | 1997-09-02 | Apple Computer, Inc. | Object storage apparatus for use with data sets in computer applications |
JPH0926769A (en) | 1995-07-10 | 1997-01-28 | Hitachi Ltd | Picture display device |
US5694150A (en) | 1995-09-21 | 1997-12-02 | Elo Touchsystems, Inc. | Multiuser/multi pointing device graphical user interface system |
US6029214A (en) | 1995-11-03 | 2000-02-22 | Apple Computer, Inc. | Input tablet system with user programmable absolute coordinate mode and relative coordinate mode segments |
US5761485A (en) * | 1995-12-01 | 1998-06-02 | Munyan; Daniel E. | Personal electronic book system |
JPH10192A (en) | 1996-04-15 | 1998-01-06 | Olympus Optical Co Ltd | Ultrasonic image diagnosing device |
US5969720A (en) | 1997-03-07 | 1999-10-19 | International Business Machines Corporation | Data processing system and method for implementing an informative container for a file system |
US6920619B1 (en) * | 1997-08-28 | 2005-07-19 | Slavoljub Milekic | User interface for removing an object from a display |
US6310610B1 (en) | 1997-12-04 | 2001-10-30 | Nortel Networks Limited | Intelligent touch display |
US6037937A (en) * | 1997-12-04 | 2000-03-14 | Nortel Networks Corporation | Navigation tool for graphical user interface |
WO1999028811A1 (en) * | 1997-12-04 | 1999-06-10 | Northern Telecom Limited | Contextual gesture interface |
US20010047263A1 (en) | 1997-12-18 | 2001-11-29 | Colin Donald Smith | Multimodal user interface |
US9292111B2 (en) | 1998-01-26 | 2016-03-22 | Apple Inc. | Gesturing with a multipoint sensing device |
US8479122B2 (en) * | 2004-07-30 | 2013-07-02 | Apple Inc. | Gestures for touch sensitive input devices |
US7760187B2 (en) * | 2004-07-30 | 2010-07-20 | Apple Inc. | Visual expander |
US20070177804A1 (en) | 2006-01-30 | 2007-08-02 | Apple Computer, Inc. | Multi-touch gesture dictionary |
US7663607B2 (en) * | 2004-05-06 | 2010-02-16 | Apple Inc. | Multipoint touchscreen |
EP1717684A3 (en) | 1998-01-26 | 2008-01-23 | Fingerworks, Inc. | Method and apparatus for integrating manual input |
US7800592B2 (en) | 2005-03-04 | 2010-09-21 | Apple Inc. | Hand held electronic device with multiple touch sensing devices |
US6639577B2 (en) | 1998-03-04 | 2003-10-28 | Gemstar-Tv Guide International, Inc. | Portable information display device with ergonomic bezel |
US6239798B1 (en) * | 1998-05-28 | 2001-05-29 | Sun Microsystems, Inc. | Methods and apparatus for a window access panel |
US6337698B1 (en) | 1998-11-20 | 2002-01-08 | Microsoft Corporation | Pen-based interface for a notepad computer |
US6507352B1 (en) * | 1998-12-23 | 2003-01-14 | Ncr Corporation | Apparatus and method for displaying a menu with an interactive retail terminal |
US6545669B1 (en) * | 1999-03-26 | 2003-04-08 | Husam Kinawi | Object-drag continuity between discontinuous touch-screens |
US6396523B1 (en) | 1999-07-29 | 2002-05-28 | Interlink Electronics, Inc. | Home entertainment device remote control |
US6992687B1 (en) | 1999-12-07 | 2006-01-31 | Microsoft Corporation | Bookmarking and placemarking a displayed document in a computer system |
US6957233B1 (en) | 1999-12-07 | 2005-10-18 | Microsoft Corporation | Method and apparatus for capturing and rendering annotations for non-modifiable electronic content |
JP4803883B2 (en) | 2000-01-31 | 2011-10-26 | Canon Inc. | Position information processing apparatus and method, and program therefor |
US6859909B1 (en) * | 2000-03-07 | 2005-02-22 | Microsoft Corporation | System and method for annotating web-based documents |
JP2001265523A (en) | 2000-03-21 | 2001-09-28 | Sony Corp | Information input/output system, information input/ output method and program storage medium |
AU2001271763A1 (en) * | 2000-06-30 | 2002-01-14 | Zinio Systems, Inc. | System and method for encrypting, distributing and viewing electronic documents |
JP2002055753A (en) | 2000-08-10 | 2002-02-20 | Canon Inc | Information processor, function list display method and storage medium |
WO2002059868A1 (en) * | 2001-01-24 | 2002-08-01 | Interlink Electronics, Inc. | Game and home entertainment device remote control |
US20020101457A1 (en) | 2001-01-31 | 2002-08-01 | Microsoft Corporation | Bezel interface for small computing devices |
US20020116421A1 (en) | 2001-02-17 | 2002-08-22 | Fox Harold L. | Method and system for page-like display, formating and processing of computer generated information on networked computers |
US6961912B2 (en) | 2001-07-18 | 2005-11-01 | Xerox Corporation | Feedback mechanism for use with visual selection methods |
US7085274B1 (en) * | 2001-09-19 | 2006-08-01 | Juniper Networks, Inc. | Context-switched multi-stream pipelined reorder engine |
US6762752B2 (en) * | 2001-11-29 | 2004-07-13 | N-Trig Ltd. | Dual function input device and method |
JP2003195998A (en) | 2001-12-26 | 2003-07-11 | Canon Inc | Information processor, control method of information processor, control program of information processor and storage medium |
JP2003296015A (en) | 2002-01-30 | 2003-10-17 | Casio Comput Co Ltd | Electronic equipment |
US20030179541A1 (en) | 2002-03-21 | 2003-09-25 | Peter Sullivan | Double screen portable computer |
US7158675B2 (en) * | 2002-05-14 | 2007-01-02 | Microsoft Corporation | Interfacing with ink |
US7330184B2 (en) | 2002-06-12 | 2008-02-12 | Smart Technologies Ulc | System and method for recognizing connector gestures |
US7023427B2 (en) | 2002-06-28 | 2006-04-04 | Microsoft Corporation | Method and system for detecting multiple touches on a touch-sensitive screen |
US7656393B2 (en) * | 2005-03-04 | 2010-02-02 | Apple Inc. | Electronic device having display and surrounding touch sensitive bezel for user interface and control |
US20090143141A1 (en) * | 2002-08-06 | 2009-06-04 | Igt | Intelligent Multiplayer Gaming System With Multi-Touch Display |
US9756349B2 (en) * | 2002-12-10 | 2017-09-05 | Sony Interactive Entertainment America Llc | User interface, system and method for controlling a video stream |
US7898529B2 (en) | 2003-01-08 | 2011-03-01 | Autodesk, Inc. | User interface having a placement and layout suitable for pen-based computers |
ATE471501T1 (en) | 2003-02-10 | 2010-07-15 | N-Trig Ltd. | Touch detection for a digitizer |
US7159189B2 (en) | 2003-06-13 | 2007-01-02 | Alphabase Systems, Inc. | Method and system for controlling cascaded windows on a GUI desktop on a computer |
JP4161814B2 (en) | 2003-06-16 | 2008-10-08 | Sony Corporation | Input method and input device |
KR100580245B1 (en) | 2003-06-26 | 2006-05-16 | Samsung Electronics Co., Ltd. | Apparatus and method for displaying double picture at the same time |
JP2005026834A (en) | 2003-06-30 | 2005-01-27 | Minolta Co Ltd | Annotation providing apparatus and annotation providing program |
WO2005008444A2 (en) * | 2003-07-14 | 2005-01-27 | Matt Pallakoff | System and method for a portable multimedia client |
JP2005122271A (en) | 2003-10-14 | 2005-05-12 | Sony Ericsson Mobilecommunications Japan Inc | Portable electronic device |
US20050101864A1 (en) * | 2003-10-23 | 2005-05-12 | Chuan Zheng | Ultrasound diagnostic imaging system and method for 3D qualitative display of 2D border tracings |
US7532196B2 (en) | 2003-10-30 | 2009-05-12 | Microsoft Corporation | Distributed sensing techniques for mobile devices |
JP2005149279A (en) | 2003-11-18 | 2005-06-09 | Matsushita Electric Ind Co Ltd | Information processor and program |
TWI275041B (en) * | 2003-12-10 | 2007-03-01 | Univ Nat Chiao Tung | System and method for constructing large-scale drawings of similar objects |
JP4239090B2 (en) | 2004-01-08 | 2009-03-18 | Fujifilm Corporation | File management program |
US7197502B2 (en) | 2004-02-18 | 2007-03-27 | Friendly Polynomials, Inc. | Machine-implemented activity management system using asynchronously shared activity data objects and journal data items |
CN101268504A (en) | 2004-02-25 | 2008-09-17 | 爱普乐技术公司 | Apparatus for providing multi-mode digital input |
US7995036B2 (en) | 2004-02-27 | 2011-08-09 | N-Trig Ltd. | Noise reduction in digitizer system |
US7383500B2 (en) * | 2004-04-30 | 2008-06-03 | Microsoft Corporation | Methods and systems for building packages that contain pre-paginated documents |
JP4148187B2 (en) | 2004-06-03 | 2008-09-10 | Sony Corporation | Portable electronic device, input operation control method and program thereof |
US7743348B2 (en) * | 2004-06-30 | 2010-06-22 | Microsoft Corporation | Using physical objects to adjust attributes of an interactive display application |
US7649524B2 (en) | 2004-07-15 | 2010-01-19 | N-Trig Ltd. | Tracking window for a digitizer system |
EP1787281A2 (en) * | 2004-07-15 | 2007-05-23 | N-Trig Ltd. | Automatic switching for a dual mode digitizer |
US8381135B2 (en) * | 2004-07-30 | 2013-02-19 | Apple Inc. | Proximity detector in handheld device |
US7454717B2 (en) | 2004-10-20 | 2008-11-18 | Microsoft Corporation | Delimiters for selection-action pen gesture phrases |
US20060092177A1 (en) | 2004-10-30 | 2006-05-04 | Gabor Blasko | Input method and apparatus using tactile guidance and bi-directional segmented stroke |
US7925996B2 (en) | 2004-11-18 | 2011-04-12 | Microsoft Corporation | Method and system for providing multiple input connecting user interface |
US7489324B2 (en) | 2005-03-07 | 2009-02-10 | Vistaprint Technologies Limited | Automated image processing |
US20060241864A1 (en) | 2005-04-22 | 2006-10-26 | Outland Research, Llc | Method and apparatus for point-and-send data transfer within an ubiquitous computing environment |
JP4603931B2 (en) | 2005-05-16 | 2010-12-22 | Nintendo Co., Ltd. | Object movement control device and object movement control program |
US20060262105A1 (en) | 2005-05-18 | 2006-11-23 | Microsoft Corporation | Pen-centric polyline drawing tool |
US20060262188A1 (en) | 2005-05-20 | 2006-11-23 | Oded Elyada | System and method for detecting changes in an environment |
US7676767B2 (en) * | 2005-06-15 | 2010-03-09 | Microsoft Corporation | Peel back user interface to show hidden functions |
US8161415B2 (en) * | 2005-06-20 | 2012-04-17 | Hewlett-Packard Development Company, L.P. | Method, article, apparatus and computer system for inputting a graphical object |
US7734654B2 (en) | 2005-08-16 | 2010-06-08 | International Business Machines Corporation | Method and system for linking digital pictures to electronic documents |
JP4394057B2 (en) | 2005-09-21 | 2010-01-06 | Alps Electric Co., Ltd. | Input device |
US7574628B2 (en) * | 2005-11-14 | 2009-08-11 | Hadi Qassoudi | Clickless tool |
US7868874B2 (en) | 2005-11-15 | 2011-01-11 | Synaptics Incorporated | Methods and systems for detecting a position-based attribute of an object using digital codes |
US7636071B2 (en) * | 2005-11-30 | 2009-12-22 | Hewlett-Packard Development Company, L.P. | Providing information in a multi-screen device |
US7603633B2 (en) | 2006-01-13 | 2009-10-13 | Microsoft Corporation | Position-based multi-stroke marking menus |
JP5092255B2 (en) | 2006-03-09 | 2012-12-05 | Casio Computer Co., Ltd. | Display device |
US20070097096A1 (en) * | 2006-03-25 | 2007-05-03 | Outland Research, Llc | Bimodal user interface paradigm for touch screen devices |
US20070236468A1 (en) | 2006-03-30 | 2007-10-11 | Apaar Tuli | Gesture based device activation |
KR100672605B1 (en) | 2006-03-30 | 2007-01-24 | LG Electronics Inc. | Method for selecting items and terminal therefor |
US20100045705A1 (en) * | 2006-03-30 | 2010-02-25 | Roel Vertegaal | Interaction techniques for flexible displays |
US8587526B2 (en) | 2006-04-12 | 2013-11-19 | N-Trig Ltd. | Gesture recognition feedback for a dual mode digitizer |
US20090278806A1 (en) * | 2008-05-06 | 2009-11-12 | Matias Gonzalo Duarte | Extended touch-sensitive control area for electronic device |
US20070262951A1 (en) | 2006-05-09 | 2007-11-15 | Synaptics Incorporated | Proximity sensor device and method with improved indication of adjustment |
KR100897806B1 (en) | 2006-05-23 | 2009-05-15 | LG Electronics Inc. | Method for selecting items and terminal therefor |
US8086971B2 (en) * | 2006-06-28 | 2011-12-27 | Nokia Corporation | Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications |
US7880728B2 (en) * | 2006-06-29 | 2011-02-01 | Microsoft Corporation | Application switching via a touch screen interface |
WO2008016031A1 (en) | 2006-07-31 | 2008-02-07 | Access Co., Ltd. | Electronic device, display system, display method, and program |
JP4514830B2 (en) | 2006-08-15 | 2010-07-28 | N-Trig Ltd. | Gesture detection for digitizer |
US7813774B2 (en) * | 2006-08-18 | 2010-10-12 | Microsoft Corporation | Contact, motion and position sensing circuitry providing data entry associated with keypad and touchpad |
US8564544B2 (en) * | 2006-09-06 | 2013-10-22 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
US8106856B2 (en) * | 2006-09-06 | 2012-01-31 | Apple Inc. | Portable electronic device for photo management |
US7831727B2 (en) * | 2006-09-11 | 2010-11-09 | Apple Computer, Inc. | Multi-content presentation of unassociated content types |
US8564543B2 (en) * | 2006-09-11 | 2013-10-22 | Apple Inc. | Media player with imaged based browsing |
US20080084400A1 (en) * | 2006-10-10 | 2008-04-10 | Outland Research, Llc | Touch-gesture control of video media play on handheld media players |
US8970503B2 (en) * | 2007-01-05 | 2015-03-03 | Apple Inc. | Gestures for devices having one or more touch sensitive surfaces |
DE202007014957U1 (en) | 2007-01-05 | 2007-12-27 | Apple Inc., Cupertino | Multimedia touch screen communication device responsive to gestures for controlling, manipulating and editing media files |
US7956847B2 (en) * | 2007-01-05 | 2011-06-07 | Apple Inc. | Gestures for controlling, manipulating, and editing of media files using touch sensitive devices |
US8963842B2 (en) * | 2007-01-05 | 2015-02-24 | Visteon Global Technologies, Inc. | Integrated hardware and software user interface |
US7877707B2 (en) * | 2007-01-06 | 2011-01-25 | Apple Inc. | Detecting and interpreting real-world and security gestures on touch and hover sensitive devices |
US7978182B2 (en) | 2007-01-07 | 2011-07-12 | Apple Inc. | Screen rotation gestures on a portable multifunction device |
US8607167B2 (en) * | 2007-01-07 | 2013-12-10 | Apple Inc. | Portable multifunction device, method, and graphical user interface for providing maps and directions |
US8519963B2 (en) | 2007-01-07 | 2013-08-27 | Apple Inc. | Portable multifunction device, method, and graphical user interface for interpreting a finger gesture on a touch screen display |
US10437459B2 (en) | 2007-01-07 | 2019-10-08 | Apple Inc. | Multitouch data fusion |
US8665225B2 (en) * | 2007-01-07 | 2014-03-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface for interpreting a finger gesture |
WO2008095137A2 (en) * | 2007-01-31 | 2008-08-07 | Perceptive Pixel, Inc. | Methods of interfacing with multi-point input devices and multi-point input systems employing interfacing techniques |
JP4973245B2 (en) | 2007-03-08 | 2012-07-11 | Fuji Xerox Co., Ltd. | Display device and program |
US8347206B2 (en) | 2007-03-15 | 2013-01-01 | Microsoft Corporation | Interactive image tagging |
US20080249682A1 (en) | 2007-04-06 | 2008-10-09 | Visteon Global Technologies, Inc. | Touch control bezel for display devices |
WO2008138030A1 (en) | 2007-05-11 | 2008-11-20 | Rpo Pty Limited | User-defined enablement protocol |
TWM325544U (en) | 2007-05-15 | 2008-01-11 | High Tech Comp Corp | Electronic device with switchable user interface and electronic device with accessible touch operation |
JP2008305087A (en) | 2007-06-06 | 2008-12-18 | Toshiba Matsushita Display Technology Co Ltd | Display device |
JP2008305036A (en) | 2007-06-06 | 2008-12-18 | Hitachi Displays Ltd | Display device with touch panel |
US20090019188A1 (en) * | 2007-07-11 | 2009-01-15 | Igt | Processing input for computing systems based on the state of execution |
MY154070A (en) | 2007-07-19 | 2015-04-30 | Choy Heng Kah | Dual screen presentation notebook computer |
US20090033632A1 (en) * | 2007-07-30 | 2009-02-05 | Szolyga Thomas H | Integrated touch pad and pen-based tablet input system |
WO2009018314A2 (en) * | 2007-07-30 | 2009-02-05 | Perceptive Pixel, Inc. | Graphical user interface for large-scale, multi-user, multi-touch systems |
KR20090013927A (en) | 2007-08-03 | 2009-02-06 | SK Telecom Co., Ltd. | Method for executing memo at viewer screen of electronic book, apparatus applied to the same |
US20090054107A1 (en) * | 2007-08-20 | 2009-02-26 | Synaptics Incorporated | Handheld communication device and method for conference call initiation |
US7778118B2 (en) * | 2007-08-28 | 2010-08-17 | Garmin Ltd. | Watch device having touch-bezel user interface |
US7941758B2 (en) | 2007-09-04 | 2011-05-10 | Apple Inc. | Animation of graphical objects |
US8122384B2 (en) * | 2007-09-18 | 2012-02-21 | Palo Alto Research Center Incorporated | Method and apparatus for selecting an object within a user interface by performing a gesture |
US20090079699A1 (en) * | 2007-09-24 | 2009-03-26 | Motorola, Inc. | Method and device for associating objects |
KR101386473B1 (en) | 2007-10-04 | 2014-04-18 | LG Electronics Inc. | Mobile terminal and its menu display method |
DE202008018283U1 (en) * | 2007-10-04 | 2012-07-17 | Lg Electronics Inc. | Menu display for a mobile communication terminal |
KR100869557B1 (en) | 2007-10-18 | 2008-11-27 | Kim Jong-dae | Power transmission of chainless type bicycle |
KR100930563B1 (en) * | 2007-11-06 | 2009-12-09 | LG Electronics Inc. | Mobile terminal and method of switching broadcast channel or broadcast channel list of mobile terminal |
TW200921478A (en) | 2007-11-06 | 2009-05-16 | Giga Byte Comm Inc | A picture-page scrolling control method of touch panel for hand-held electronic device and device thereof |
US8294669B2 (en) * | 2007-11-19 | 2012-10-23 | Palo Alto Research Center Incorporated | Link target accuracy in touch-screen mobile devices by layout adjustment |
US20090153289A1 (en) | 2007-12-12 | 2009-06-18 | Eric James Hope | Handheld electronic devices with bimodal remote control functionality |
US8154523B2 (en) * | 2007-12-13 | 2012-04-10 | Eastman Kodak Company | Electronic device, display and touch-sensitive user interface |
US8395584B2 (en) * | 2007-12-31 | 2013-03-12 | Sony Corporation | Mobile terminals including multiple user interfaces on different faces thereof configured to be used in tandem and related methods of operation |
US20090167702A1 (en) | 2008-01-02 | 2009-07-02 | Nokia Corporation | Pointing device detection |
CN101482790B (en) | 2008-01-09 | 2012-03-14 | HTC Corporation | Electronic device capable of transferring object between two display elements and its control method |
AU2009203914A1 (en) | 2008-01-09 | 2009-07-16 | Smart Technologies Ulc | Multi-page organizing and manipulating electronic documents |
WO2009097555A2 (en) | 2008-01-30 | 2009-08-06 | Google Inc. | Notification of mobile device events |
KR100981268B1 (en) | 2008-02-15 | 2010-09-10 | Korea Research Institute of Standards and Science | Touch screen apparatus using tactile sensors |
US8555207B2 (en) | 2008-02-27 | 2013-10-08 | Qualcomm Incorporated | Enhanced input using recognized gestures |
TW200943140A (en) | 2008-04-02 | 2009-10-16 | Asustek Comp Inc | Electronic apparatus and control method thereof |
US8289289B2 (en) | 2008-04-03 | 2012-10-16 | N-trig, Ltd. | Multi-touch and single touch detection |
KR20090106755A (en) | 2008-04-07 | 2009-10-12 | KT Tech Co., Ltd. | Method and terminal for providing a memo recording function, and computer-readable recording medium storing a program for executing the method |
US20090276701A1 (en) | 2008-04-30 | 2009-11-05 | Nokia Corporation | Apparatus, method and computer program product for facilitating drag-and-drop of an object |
TWI364699B (en) | 2008-05-07 | 2012-05-21 | Acer Inc | Synchronous operating method of remote systems and synchronous operating method of local touch screens |
US20090282332A1 (en) | 2008-05-12 | 2009-11-12 | Nokia Corporation | Apparatus, method and computer program product for selecting multiple items using multi-touch |
US20090284478A1 (en) | 2008-05-15 | 2009-11-19 | Microsoft Corporation | Multi-Contact and Single-Contact Input |
JP5164675B2 (en) | 2008-06-04 | 2013-03-21 | Canon Inc. | User interface control method, information processing apparatus, and program |
TW200951783A (en) | 2008-06-06 | 2009-12-16 | Acer Inc | Electronic device and controlling method thereof |
CN101566865A (en) | 2008-06-08 | 2009-10-28 | Xu Wenwu | Multi-working-mode dual-screen notebook computer system and operation control method |
US8245156B2 (en) | 2008-06-28 | 2012-08-14 | Apple Inc. | Radial menu selection |
JP2010015238A (en) | 2008-07-01 | 2010-01-21 | Sony Corp | Information processor and display method for auxiliary information |
US20110115735A1 (en) | 2008-07-07 | 2011-05-19 | Lev Jeffrey A | Tablet Computers Having An Internal Antenna |
JP2010019643A (en) | 2008-07-09 | 2010-01-28 | Toyota Motor Corp | Information terminal, navigation apparatus, and option display method |
JP5606669B2 (en) * | 2008-07-16 | 2014-10-15 | Nintendo Co., Ltd. | 3D puzzle game apparatus, game program, 3D puzzle game system, and game control method |
US8159455B2 (en) * | 2008-07-18 | 2012-04-17 | Apple Inc. | Methods and apparatus for processing combinations of kinematical inputs |
JP5670016B2 (en) | 2008-07-22 | 2015-02-18 | Lenovo Innovations Limited (Hong Kong) | Display device, communication terminal, display device display method, and display control program |
US8390577B2 (en) | 2008-07-25 | 2013-03-05 | Intuilab | Continuous recognition of multi-touch gestures |
US8924892B2 (en) * | 2008-08-22 | 2014-12-30 | Fuji Xerox Co., Ltd. | Multiple selection on devices with many gestures |
KR101529916B1 (en) * | 2008-09-02 | 2015-06-18 | LG Electronics Inc. | Portable terminal |
KR100969790B1 (en) | 2008-09-02 | 2010-07-15 | LG Electronics Inc. | Mobile terminal and method for synthesizing contents |
WO2010030984A1 (en) * | 2008-09-12 | 2010-03-18 | Gesturetek, Inc. | Orienting a displayed element relative to a user |
KR101548958B1 (en) | 2008-09-18 | 2015-09-01 | Samsung Electronics Co., Ltd. | Method and apparatus for operation control in a mobile terminal with a touch screen |
US8600446B2 (en) * | 2008-09-26 | 2013-12-03 | Htc Corporation | Mobile device interface with dual windows |
US8547347B2 (en) * | 2008-09-26 | 2013-10-01 | Htc Corporation | Method for generating multiple windows frames, electronic device thereof, and computer program product using the method |
JP5362307B2 (en) | 2008-09-30 | 2013-12-11 | 富士フイルム株式会社 | Drag and drop control device, method, program, and computer terminal |
US8284170B2 (en) | 2008-09-30 | 2012-10-09 | Apple Inc. | Touch screen device, method, and graphical user interface for moving on-screen objects without using a cursor |
US9250797B2 (en) * | 2008-09-30 | 2016-02-02 | Verizon Patent And Licensing Inc. | Touch gesture interface apparatuses, systems, and methods |
KR101586627B1 (en) * | 2008-10-06 | 2016-01-19 | 삼성전자주식회사 | A method for controlling of list with multi touch and apparatus thereof |
KR101503835B1 (en) * | 2008-10-13 | 2015-03-18 | 삼성전자주식회사 | Apparatus and method for object management using multi-touch |
JP4683110B2 (en) * | 2008-10-17 | 2011-05-11 | ソニー株式会社 | Display device, display method, and program |
US20100107067A1 (en) * | 2008-10-27 | 2010-04-29 | Nokia Corporation | Input on touch based user interfaces |
KR20100050103A (en) * | 2008-11-05 | 2010-05-13 | 엘지전자 주식회사 | Method of controlling 3 dimension individual object on map and mobile terminal using the same |
CN201298220Y (en) | 2008-11-26 | 2009-08-26 | 陈伟山 | Infrared reflection multipoint touching device based on LCD liquid crystal display screen |
KR101544475B1 (en) * | 2008-11-28 | 2015-08-13 | 엘지전자 주식회사 | Controlling of Input/Output through touch |
JP5268595B2 (en) * | 2008-11-28 | 2013-08-21 | ソニー株式会社 | Image processing apparatus, image display method, and image display program |
US20100185949A1 (en) | 2008-12-09 | 2010-07-22 | Denny Jaeger | Method for using gesture objects for computer control |
US8749497B2 (en) | 2008-12-12 | 2014-06-10 | Apple Inc. | Multi-touch shape drawing |
TWI381305B (en) | 2008-12-25 | 2013-01-01 | Compal Electronics Inc | Method for displaying and operating user interface and electronic device |
US9864513B2 (en) * | 2008-12-26 | 2018-01-09 | Hewlett-Packard Development Company, L.P. | Rendering a virtual input device upon detection of a finger movement across a touch-sensitive display |
US20100164878A1 (en) * | 2008-12-31 | 2010-07-01 | Nokia Corporation | Touch-click keypad |
US8330733B2 (en) | 2009-01-21 | 2012-12-11 | Microsoft Corporation | Bi-modal multiscreen interactivity |
US8279184B2 (en) | 2009-01-27 | 2012-10-02 | Research In Motion Limited | Electronic device including a touchscreen and method |
JP4771183B2 (en) | 2009-01-30 | 2011-09-14 | 株式会社デンソー | Operating device |
US8219937B2 (en) | 2009-02-09 | 2012-07-10 | Microsoft Corporation | Manipulation of graphical elements on graphical user interface via multi-touch gestures |
TWI370473B (en) | 2009-02-20 | 2012-08-11 | Wistron Corp | Switch structure mounted on the sidewall of circuit boards for electronic devices and manufacturing methods of the circuit boards thereof |
WO2010096755A1 (en) | 2009-02-23 | 2010-08-26 | Provo Craft And Novelty, Inc. | System for controlling an electronic cutting machine |
US9250788B2 (en) | 2009-03-18 | 2016-02-02 | IdentifyMine, Inc. | Gesture handlers of a gesture engine |
US20100251112A1 (en) | 2009-03-24 | 2010-09-30 | Microsoft Corporation | Bimodal touch sensitive digital notebook |
US8134539B2 (en) | 2009-03-30 | 2012-03-13 | Eastman Kodak Company | Digital picture frame having near-touch and true-touch |
US8370762B2 (en) | 2009-04-10 | 2013-02-05 | Cellco Partnership | Mobile functional icon use in operational area in touch panel devices |
JP5229083B2 (en) * | 2009-04-14 | 2013-07-03 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
US8212788B2 (en) | 2009-05-07 | 2012-07-03 | Microsoft Corporation | Touch input to modulate changeable parameter |
TW201040823A (en) | 2009-05-11 | 2010-11-16 | Au Optronics Corp | Multi-touch method for resistive touch panel |
US8169418B2 (en) | 2009-05-12 | 2012-05-01 | Sony Ericsson Mobile Communications Ab | Displays for electronic devices that detect and respond to the size and/or angular orientation of user input objects |
CN101551728A (en) | 2009-05-21 | 2009-10-07 | 友达光电股份有限公司 | Electric resistance and touch control panel multi-point touch control process |
US8269736B2 (en) | 2009-05-22 | 2012-09-18 | Microsoft Corporation | Drop target gestures |
US8836648B2 (en) | 2009-05-27 | 2014-09-16 | Microsoft Corporation | Touch pull-in gesture |
US8549432B2 (en) | 2009-05-29 | 2013-10-01 | Apple Inc. | Radial menus |
US9405456B2 (en) | 2009-06-08 | 2016-08-02 | Xerox Corporation | Manipulation of displayed objects by virtual magnetism |
CN101576789B (en) | 2009-06-16 | 2011-07-27 | 广东威创视讯科技股份有限公司 | Method for maintaining and modifying cross-screen writing stroke attributes in display wall positioning system |
US9152317B2 (en) * | 2009-08-14 | 2015-10-06 | Microsoft Technology Licensing, Llc | Manipulation of graphical elements via gestures |
US20110055753A1 (en) * | 2009-08-31 | 2011-03-03 | Horodezky Samuel J | User interface methods providing searching functionality |
US9262063B2 (en) * | 2009-09-02 | 2016-02-16 | Amazon Technologies, Inc. | Touch-screen user interface |
US9274699B2 (en) | 2009-09-03 | 2016-03-01 | Obscura Digital | User interface for a large scale multi-user, multi-touch system |
US20110072036A1 (en) | 2009-09-23 | 2011-03-24 | Microsoft Corporation | Page-based content storage system |
US20110117526A1 (en) * | 2009-11-16 | 2011-05-19 | Microsoft Corporation | Teaching gesture initiation with registration posture guides |
US20110126094A1 (en) * | 2009-11-24 | 2011-05-26 | Horodezky Samuel J | Method of modifying commands on a touch screen user interface |
US20110143769A1 (en) * | 2009-12-16 | 2011-06-16 | Microsoft Corporation | Dual display mobile communication device |
US20110167336A1 (en) | 2010-01-04 | 2011-07-07 | Hit Development Llc | Gesture-based web site design |
JP2011150413A (en) | 2010-01-19 | 2011-08-04 | Sony Corp | Information processing apparatus, method and program for inputting operation |
US8239785B2 (en) | 2010-01-27 | 2012-08-07 | Microsoft Corporation | Edge gestures |
US8261213B2 (en) | 2010-01-28 | 2012-09-04 | Microsoft Corporation | Brush, carbon-copy, and fill gestures |
US20110185299A1 (en) | 2010-01-28 | 2011-07-28 | Microsoft Corporation | Stamp Gestures |
US9411504B2 (en) | 2010-01-28 | 2016-08-09 | Microsoft Technology Licensing, Llc | Copy and staple gestures |
US20110185320A1 (en) | 2010-01-28 | 2011-07-28 | Microsoft Corporation | Cross-reference Gestures |
US9519356B2 (en) | 2010-02-04 | 2016-12-13 | Microsoft Technology Licensing, Llc | Link gestures |
US20110191719A1 (en) | 2010-02-04 | 2011-08-04 | Microsoft Corporation | Cut, Punch-Out, and Rip Gestures |
US20110191704A1 (en) | 2010-02-04 | 2011-08-04 | Microsoft Corporation | Contextual multiplexing gestures |
US20110199386A1 (en) | 2010-02-12 | 2011-08-18 | Honeywell International Inc. | Overlay feature to provide user assistance in a multi-touch interactive display environment |
US20110231796A1 (en) | 2010-02-16 | 2011-09-22 | Jose Manuel Vigil | Methods for navigating a touch screen device in conjunction with gestures |
US20110209098A1 (en) | 2010-02-19 | 2011-08-25 | Hinckley Kenneth P | On and Off-Screen Gesture Combinations |
US8799827B2 (en) | 2010-02-19 | 2014-08-05 | Microsoft Corporation | Page manipulations using on and off-screen gestures |
US9965165B2 (en) | 2010-02-19 | 2018-05-08 | Microsoft Technology Licensing, Llc | Multi-finger gestures |
US9367205B2 (en) | 2010-02-19 | 2016-06-14 | Microsoft Technology Licensing, Llc | Radial menus with bezel gestures |
US9310994B2 (en) | 2010-02-19 | 2016-04-12 | Microsoft Technology Licensing, Llc | Use of bezel as an input mechanism |
US9075522B2 (en) | 2010-02-25 | 2015-07-07 | Microsoft Technology Licensing, Llc | Multi-screen bookmark hold gesture |
US8539384B2 (en) | 2010-02-25 | 2013-09-17 | Microsoft Corporation | Multi-screen pinch and expand gestures |
US20110209101A1 (en) | 2010-02-25 | 2011-08-25 | Hinckley Kenneth P | Multi-screen pinch-to-pocket gesture |
US8751970B2 (en) | 2010-02-25 | 2014-06-10 | Microsoft Corporation | Multi-screen synchronous slide gesture |
US20110209089A1 (en) | 2010-02-25 | 2011-08-25 | Hinckley Kenneth P | Multi-screen object-hold and page-change gesture |
US9454304B2 (en) | 2010-02-25 | 2016-09-27 | Microsoft Technology Licensing, Llc | Multi-screen dual tap gesture |
US8473870B2 (en) * | 2010-02-25 | 2013-06-25 | Microsoft Corporation | Multi-screen hold and drag gesture |
US8707174B2 (en) * | 2010-02-25 | 2014-04-22 | Microsoft Corporation | Multi-screen hold and page-flip gesture |
US20110209058A1 (en) | 2010-02-25 | 2011-08-25 | Microsoft Corporation | Multi-screen hold and tap gesture |
US20110291964A1 (en) | 2010-06-01 | 2011-12-01 | Kno, Inc. | Apparatus and Method for Gesture Control of a Dual Panel Electronic Device |
USD631043S1 (en) * | 2010-09-12 | 2011-01-18 | Steven Kell | Electronic dual screen personal tablet computer with integrated stylus |
EP2437153A3 (en) | 2010-10-01 | 2016-10-05 | Samsung Electronics Co., Ltd. | Apparatus and method for turning e-book pages in portable terminal |
US8495522B2 (en) | 2010-10-18 | 2013-07-23 | Nokia Corporation | Navigation in a display |
US8640047B2 (en) | 2011-06-01 | 2014-01-28 | Microsoft Corporation | Asynchronous handling of a user interface manipulation |
US8810533B2 (en) | 2011-07-20 | 2014-08-19 | Z124 | Systems and methods for receiving gesture inputs spanning multiple input devices |
2010
- 2010-02-19 US US12/709,204 patent/US9274682B2/en active Active

2011
- 2011-02-17 JP JP2012554008A patent/JP5883400B2/en active Active
- 2011-02-17 EP EP11745193.0A patent/EP2537088B1/en active Active
- 2011-02-17 WO PCT/US2011/025131 patent/WO2011103218A2/en active Application Filing
- 2011-02-17 CN CN201180009579.2A patent/CN102884498B/en active Active
- 2011-02-17 CA CA2788137A patent/CA2788137C/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101432677A (en) * | 2005-03-04 | 2009-05-13 | 苹果公司 | Electronic device having display and surrounding touch sensitive bezel for user interface and control |
CN1936799A (en) * | 2005-09-23 | 2007-03-28 | 鸿富锦精密工业(深圳)有限公司 | User operation control apparatus and method |
Also Published As
Publication number | Publication date |
---|---|
JP5883400B2 (en) | 2016-03-15 |
JP2013520727A (en) | 2013-06-06 |
WO2011103218A3 (en) | 2012-01-05 |
EP2537088B1 (en) | 2017-06-21 |
EP2537088A2 (en) | 2012-12-26 |
CA2788137A1 (en) | 2011-08-25 |
CA2788137C (en) | 2017-12-05 |
US9274682B2 (en) | 2016-03-01 |
EP2537088A4 (en) | 2016-03-09 |
US20110205163A1 (en) | 2011-08-25 |
CN102884498A (en) | 2013-01-16 |
WO2011103218A2 (en) | 2011-08-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102884498B (en) | Method of performing input on a computing device | |
CN102207788B (en) | Radial menus with bezel gestures | |
CN102122230B (en) | Multi-finger gesture | |
US8799827B2 (en) | Page manipulations using on and off-screen gestures | |
JP5684291B2 (en) | Combination of on and offscreen gestures | |
CN102122229A (en) | Use of bezel as an input mechanism | |
EP2539803B1 (en) | Multi-screen hold and page-flip gesture | |
EP2539799B1 (en) | Multi-screen pinch and expand gestures | |
EP2539801B1 (en) | Multi-screen dual tap gesture | |
EP2539802B1 (en) | Multi-screen hold and tap gesture | |
US9383897B2 (en) | Spiraling radial menus in computer systems | |
CN102147704B (en) | Multi-screen bookmark hold gesture | |
US8751970B2 (en) | Multi-screen synchronous slide gesture | |
US20100192102A1 (en) | Displaying radial menus near edges of a display area | |
US20100192101A1 (en) | Displaying radial menus in a graphics container | |
EP2740022A1 (en) | Cross-slide gesture to select and rearrange |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
ASS | Succession or assignment of patent right |
Owner name: MICROSOFT TECHNOLOGY LICENSING LLC Free format text: FORMER OWNER: MICROSOFT CORP. Effective date: 20150717 |
|
C41 | Transfer of patent application or patent right or utility model | ||
TA01 | Transfer of patent application right |
Effective date of registration: 20150717 Address after: Washington State Applicant after: Microsoft Technology Licensing, LLC Address before: Washington State Applicant before: Microsoft Corp. |
|
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |