US6545670B1 - Methods and apparatus for man machine interfaces and related activity - Google Patents
- Publication number
- US6545670B1 US6545670B1 US09/568,554 US56855400A US6545670B1 US 6545670 B1 US6545670 B1 US 6545670B1 US 56855400 A US56855400 A US 56855400A US 6545670 B1 US6545670 B1 US 6545670B1
- Authority
- US
- United States
- Prior art keywords
- touch screen
- outer member
- camera
- screen according
- person
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0428—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by sensing at the edges of the touch surface the interruption of optical paths, e.g. an illumination plane, parallel to the touch surface which may be virtual
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04109—FTIR in optical digitiser, i.e. touch detection by frustrating the total internal reflection within an optical waveguide due to changes of optical properties or deformation at the touch location
Definitions
- the invention relates to simple input devices for computers, particularly, but not necessarily, intended for use with 3-D graphically intensive activities, and operating by optically sensing a human input to a display screen or other object and/or the sensing of human positions or orientations.
- the invention herein is a continuation-in-part of several inventions of mine, listed above.
- One embodiment is a monitor housing for a computer that integrally incorporates digital TV cameras to look at points on the hand or the finger, or objects held in the hand of the user, which are used to input data to the computer. It may also or alternatively, look at the head of the user as well.
- improved touch screens and further discloses camera based sensing of laser pointer indications.
- the invention in several other embodiments, uses real time stereo photogrammetry or other methods using single or multiple TV cameras whose output is analyzed and used as input to a personal computer, typically to gather data concerning the 3D location of parts of, or objects held by, a person or persons.
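In the idealised rectified two-camera case, the stereo photogrammetry mentioned above reduces to triangulation from pixel disparity. A minimal sketch (the function name and the parallel-axis simplification are illustrative assumptions, not the patent's specific method):

```python
def triangulate(xl, xr, y, focal_px, baseline_m):
    """Recover a 3-D point from a rectified stereo pair.

    xl, xr: horizontal pixel coordinates of the same target in the
    left and right images, measured from each image centre;
    y: vertical pixel coordinate (equal in both images after
    rectification); focal_px: focal length in pixels;
    baseline_m: camera separation in metres.
    Idealised parallel-axis geometry is assumed throughout.
    """
    disparity = xl - xr
    if disparity <= 0:
        raise ValueError("target must be in front of both cameras")
    z = focal_px * baseline_m / disparity   # depth from the cameras
    x = xl * z / focal_px                   # lateral offset
    y3 = y * z / focal_px                   # vertical offset
    return (x, y3, z)
```

With a 1000-pixel focal length and a 0.5 m baseline (plausible for cameras at opposite top corners of a large monitor bezel), a 50-pixel disparity places the target 10 m out; larger disparities mean closer targets.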
- FIG. 1 illustrates a computer monitor of the invention integrally incorporating one or two cameras pointing outward from the surface of the bezel facing the user to observe objects held by the user, or parts of the user such as fingers and hands, the bezel preferably including an opaque (to the user) plastic cover for both cameras and light sources. Also illustrated is an additional camera for viewing a user directly or for other purposes.
- FIG. 2 illustrates a version of the invention wherein the light illuminating the target datums of the object is itself generated by the monitor viewed by the user or users, said monitor being of the CRT, LED, Projected light, scanned laser spot or any other variety.
- FIG. 3 illustrates a touch screen of the invention of co-pending application 1 referenced above, having improved screen rigidity.
- distortion of the screen occurs primarily in a zone that is able to distort, with the zone supported by a rigid backing member.
- FIG. 4 illustrates the use of a TV camera based transduction using the camera for screen distortion determination similar to that of FIG. 3 with the camera used for the determination of the position of a laser pointer indication such as a spot directed by a user on the screen, particularly in response to an image displayed on the screen.
- FIG. 5 illustrates a variation of FIG. 4 in which the laser spot is spatially encoded to carry information that is itself then sent to the camera system.
- FIG. 6 illustrates an embodiment using laser pointers for acquaintance making purposes, including the use of the laser pointer to designate an image on a television screen using the invention of FIG. 4 or 5 above.
- FIG. 7 illustrates handwriting and signature recognition of sensed pencil position for internet commerce and other purposes, including a D-Sight technology based writing pad capable of distortion signature determination.
- FIG. 1
- This embodiment illustrating a computer display with camera(s) and illumination system is an alternative or addition to that of FIG. 1 of copending reference 3 above (Ser. No. 09/138,339).
- a PC-based embodiment is shown in FIG. 1 a.
- a stereo pair of cameras 100 and 101 located on each side of the upper surface of monitor 102 (for example a rear projection TV having 80 inch diagonal screen 104 ) facing the user, desirably having one or more cover windows 103 .
- a single extensive cover window 103 covers both cameras and their associated light sources 110 and 111 , and is mounted flush with the monitor front bezel surface.
- the LED's in this application are typically used to illuminate targets associated with the fingers, hand, head of the user, or objects held by the user, such as user 135 with hands 136 and 137 , and head 138 .
- These targets are desirably, but not necessarily, retro-reflective, and may be constituted by the object features themselves (e.g. a finger), or by features of clothing worn by the user, or by artificial targets other than retroreflectors.
- the cameras are preferably pointed obliquely inward at angles theta, and downward, if desired, at further angles phi, toward the center of the desired work volume 140 in front of the monitor (the angles depend on the computer monitor width, the distance of the work zone volume 140 from the monitor, etc.).
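The aim angles just described follow from simple geometry. A hedged sketch, assuming the work volume centre sits on the monitor axis, a given distance out and a given drop below the camera (the function and parameter names are illustrative, not from the patent):

```python
import math

def aim_angles(monitor_width_m, cam_drop_m, work_dist_m):
    """Approximate aim for a camera at a top corner of the monitor
    bezel: theta is the inward angle toward the monitor axis, phi the
    downward angle, for a work volume centre assumed to lie on-axis,
    cam_drop_m below the camera and work_dist_m out from the screen.
    Returns (theta, phi) in degrees."""
    theta = math.degrees(math.atan2(monitor_width_m / 2, work_dist_m))
    phi = math.degrees(math.atan2(cam_drop_m, work_dist_m))
    return theta, phi
```

For a 1 m wide monitor with the work zone 0.5 m out and 0.5 m below the cameras, both angles come to 45 degrees; a more distant work zone flattens both.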
- a single camera can be used, for determining user or other object positions such as 120 with light source 121 , both also optionally located behind a cover window (such as 103 ).
- the cover window 103 is preferably black or dark plastic which lets the LED light source wavelength pass easily, but attenuates sunlight or room lights, thus aiding camera signal to noise in many cases, and making the cameras and light sources substantially invisible to the user (especially if the light sources are in the near infrared) and thus pleasing to the eye and not distracting.
- Alternate camera locations may be used such as in the sides of the monitor bezel, or anywhere desired, for example as appendages to the monitor. They may alternately or in addition, be at the rear of the keyboard in front of the monitor. In the case of cameras mounted at the rear of the keyboard (toward the display screen), these cameras are also inclined to point toward the user at an angle as well.
- an additional camera for viewing a user directly or other purposes may be employed.
- a stereo pair such as 100 and 101
- a third camera such as 120 might be used just for imaging using ambient illumination such as room lights (i.e. LED source 121 is not needed, though could be provided if desired)
- any light sources such as 110 and 111 located near the optical axes of the other two, do not generally illuminate any retro-reflectors in such a way as to register same on camera 120 —due to the limited angular return characteristic of retro-reflectors.
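Whether a given camera registers a retro-reflected return can be estimated geometrically: sheet retroreflectors return light in a narrow cone about the illumination direction. A small sketch under an assumed cone half-angle (the 2-degree figure and all names here are illustrative assumptions):

```python
import math

def sees_retro_return(source_pos, camera_pos, target_pos, cone_deg=2.0):
    """True if the camera lies within the retroreflector's narrow
    return cone about the illumination direction. Positions are
    (x, y, z) in metres; cone_deg is an assumed half-angle typical
    of bead-type retroreflective sheeting."""
    def unit(a, b):
        v = [b[i] - a[i] for i in range(3)]
        n = math.sqrt(sum(c * c for c in v))
        return [c / n for c in v]
    to_source = unit(target_pos, source_pos)   # illumination direction, reversed
    to_camera = unit(target_pos, camera_pos)   # viewing direction
    dot = sum(s * c for s, c in zip(to_source, to_camera))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    return angle <= cone_deg
```

A camera right beside the light source (as 110 is beside 100) falls inside the cone and sees a bright return; a third camera well off that axis (like 120) does not, which is exactly the behavior the bullet above relies on.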
- FIG. 2 illustrates a version of the invention of the co-pending applications wherein the light illuminating the target datums of the object is itself generated by the monitor viewed by the user or users, said monitor being of the CRT, LED, Projected light, scanned laser spot or any other variety.
- target 201 on finger 202 is illuminated by light 205 from zone 210 of screen 220 on which an image is projected by projection device 221 as shown.
- Light reflected by the target 201 is imaged by camera 230 such as that employed in the invention of FIG. 1 for example.
- the color of light 205 , or the point (or points) 210 from which it emanates, may be varied by the program of computer 250 controlling screen display driver 260 .
- the control of point location and color allows selective illumination of targets or object features such as finger tip 203 , both by choice of color sensitivity response with respect to the target illuminated, and, if the target is retroreflective, by choice of screen location within the viewing field of view 225 of TV camera 230 . This can be used to select, by choice of screen-generated light source location, which camera (of two, for example) sees the target.
- FIG. 3 illustrates a touch screen of the invention as in the co-pending invention 1 referenced above, having however, improved screen rigidity.
- distortion of the screen occurs primarily in a region that is able to distort or otherwise be changed, with the region supported by a rigid backing member.
- transparent screen member 305 whose outer scattering surface 310 is touched, and locally distorted inward, by finger 315 .
- This surface is separated from rigid (typically glass) backing member 320 by optically transparent refractive medium 330 of thickness t, which is compressed by the force of finger 315 .
- the medium 330 can be a liquid, such as water or alcohol, that is either compressed, or displaced temporarily into a reservoir such as 340 (dotted lines) for example.
- the index of refraction of member 320 and material 330 are closely matched such that little refraction occurs as light passes from 320 through 330 to surface 311 and back toward camera 370 (after retroreflection by expansive retroreflector 365 , typically comprised of Scotchlight 7615 glass bead material), for example used to determine the distortion of 311 due to the retroreflective based “D sight” image effect or another optical phenomenon such as disclosed in copending reference 1.
- it is desirable that outer surface 310 be relatively hard, such as thin Plexiglas or Mylar.
- a partially reflective coating such as commonly done with vacuum deposition of silver or interference coatings—the latter useful if at the wavelength used for distortion determination, which additionally can be in the near infrared (e.g. 0.8 microns, where LED sources and TV camera sensitivity is commonplace)
- a periodic array of zones of reflective material for example a reflective dot or stripe 0.05 mm wide, every 1 mm
- the projection device if desired can be programmed to minimize the effect of such reflective or partially reflective zones in the displayed image.
- FIG. 4 illustrates the use of a TV camera based transduction of finger touch using for example camera 410 to determine screen distortion occurring on touch similar to that of FIG. 3 .
- the camera 410 may also be used for the determination of the position of a laser or other optical based pointer indication such as a spot 420 from laser pointer 425 directed by a user 430 on the screen 440 , particularly in response to an image (not shown for clarity) displayed on the screen, for example by image projector 455 .
- a child could point a pointer at the pony image on the screen, and the program in system computer 460 could acknowledge this with audio feedback from loudspeaker 465 to the child, and if desired record in memory 470 that the child had correctly identified the pony, useful for tracking the child's learning, or recording scores for game purposes.
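The pony-identification step above amounts to hit-testing the detected laser-spot position against the screen regions occupied by displayed images. A minimal sketch (the region names and rectangle layout are illustrative assumptions):

```python
def identify_target(spot_xy, regions):
    """Return the name of the displayed image region containing the
    detected laser-spot position, or None if the spot misses them all.

    spot_xy: (x, y) spot position in screen pixels, as found by the
    camera system; regions: dict mapping region names to
    (x0, y0, x1, y1) pixel rectangles of the displayed images.
    """
    x, y = spot_xy
    for name, (x0, y0, x1, y1) in regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None
```

With hypothetical regions `{"pony": (100, 100, 300, 260), "duck": (400, 120, 560, 240)}`, a spot at (150, 200) identifies the pony, and the program can then trigger the audio acknowledgment and record the result.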
- optical pointer function is distinct from that of the physical touch screen indication of FIG. 3 above. Either function can exist independently, or both together. A separate camera such as 411 can alternatively be used to determine laser pointer indication.
- a person such as 475 can draw a drawing 487 with a laser pointer 480 , for example by successively tracing it on display screen 481 , where the successive laser pointer indications on the screen are digitized with the camera system 485 comprising one or more TV cameras connected, via for example an IEEE 1394 firewire connection, to computer 490 as shown, equipped to digitize the camera inputs. This allows one, sitting in a conference room for example, to draw modifications onto a drawing and have them digitized by the camera.
- the camera can optionally be equipped with a laser wavelength bandpass filter such as 495 in front of it, to make it easy to detect the position of the laser spot anywhere on the big screen, even in bright light surroundings.
- the brightness of the laser alone is often significant enough to allow reliable detection.
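Behind a laser-line bandpass filter, the spot is simply the brightest blob in the frame, so an intensity-weighted centroid over thresholded pixels locates it. A sketch under that assumption (the threshold value and names are illustrative):

```python
def laser_spot_centroid(frame, threshold=200):
    """Locate a laser spot as the intensity-weighted centroid of
    pixels at or above threshold.

    frame: 2-D grid (list of rows) of 0-255 grey levels, as from a
    camera viewing the screen through a laser bandpass filter.
    Returns (x, y) in pixel coordinates, or None if nothing is
    bright enough (e.g. the pointer is off).
    """
    sx = sy = sw = 0.0
    for row_idx, row in enumerate(frame):
        for col_idx, v in enumerate(row):
            if v >= threshold:
                sx += col_idx * v
                sy += row_idx * v
                sw += v
    if sw == 0:
        return None
    return (sx / sw, sy / sw)
```

The weighted centroid gives sub-pixel position when the spot spans several pixels, which helps when digitizing a traced drawing as in the conference-room example above.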
- Computer 490 also controls the front projector 496 , such that detected laser spot indications from the digitized camera image can be used to modify the software used to generate the image on the screen in whatever manner desired, for example to make a car design larger in a certain area, or to inject a new line on the drawing, or whatever.
- a camera such as 485 to determine laser pointer indications on a front projection screen can also be used to see datums on objects in front of the screen as well, as discussed in FIG. 1 and referenced copending applications for example. These can include natural features of the person 475 , such as fingers, hands, or specialized datums such as retroreflectors generally located on apparel or extremities of the user. Combined also can be a camera to determine laser pointer location and the screen deflection or other characteristic of touch or other contact with the screen. The camera can also determine location of target datums on the object as well as laser pointer indications, and other things as well.
- FIG. 5 illustrates a variation of FIG. 4 in which the laser spot is spatially encoded to carry information.
- information can be in the form of a shape (such as a heart 501 projected by laser pointer 502 whose beam is transmitted through grating 505 ), an alphanumeric character, or anything else desired.
- information can be easily changed by the user, either by changing fixed lasers, selecting different lasers with different spatially encoded holographic gratings or by having a turret of such gratings in front of a single laser.
- the color of the laser can also be changed, with certain colors signifying desired actions. Tunable wavelength lasers make this easier today.
- the information can be projected directly on an object, or on a front or rear projection screen displaying other information.
- the projected information can also be sensed as in FIG. 4, using a TV camera such as 530 , viewing an object such as a chair 540 , or alternatively a screen such as 481 on which the information is projected.
- FIG. 6 illustrates embodiments using laser pointers for acquaintance making purposes. This embodiment of the invention is particularly illustrated here for social purposes, but any application to which it is useful is contemplated.
- it is adapted to signaling one's wish to meet a person, particularly of the opposite sex, for the purpose of dating, etc. It more particularly concerns use in a bar, restaurant, or other social scene, using a laser pointer or other means of indicating information on the beam so pointed, to point at a person at another table or across the room in a way that would indicate an interest.
- this particular invention is more direct because it concerns actually pointing, with a signal, at an area at or near the person in question, much as you might wave your hand or do something else, but in this case it's subtle and less embarrassing. For example, if one sits in a crowded restaurant and waves their hand in the air at somebody, everyone sees that, whereas if you aim a laser beam at the coffee cup of the person in question, no one sees it but the person in question and the company they are with. This is a major difference.
- a heart or some other spatially extending information signaling the particular idea. This can be purposely aimed to project onto a person's clothing, or onto a glass or whatever on a table in a bar or restaurant, for example.
- information that actually carries data such as contact information details.
- as the cellular phone becomes more prevalent, one idea is to project a person's cell phone number with a holographic grating or other mechanism for generating such a pattern at a distance. If you have your cell phone on, and so does the other party, dialog can be initiated immediately. In addition, the person doesn't have to embarrass themselves by looking back around to see who shot the beam, so to speak.
- Another idea is to send a message concerning one's e-mail address. This then allows the person to correspond later, in an even less obvious way without even having to talk in person.
- Transponders at the other end could beep back. For example, you could put a cigarette lighter that actually was a transponder on the edge of your restaurant table. If someone hit that with a laser beam, it would light up, send a beam back, or do something. This could be an indicator, for example. Certainly something on one's clothes would be another option, or something on a purse or another article that might be obvious for the other person to shoot at.
- colors can mean different things. For example, blue could mean something like “let's dance”, or red could mean “let's leave”, or whatever. Progress in colored lasers will make this a reality soon, insofar as laser sources for the indication are concerned.
- in FIG. 5, a laser pointer is held in the hand of the signaler, who aims it at the coffee cup of a potential acquaintance sitting at a table.
- the method here is that the user aims the laser and triggers it to send an optical indication signal to hit the coffee cup or any other point (such as the chair illustrated in FIG. 5) that is visible to the potential acquaintance, thereby in this simple case, signaling a message to look at the person who is signaling.
- a holographic or other pattern-generating element on the front of the laser 502 can be used to make such a signal but with a message carried spatially in a more descriptive manner.
- this message may be either a pattern, a word, or even a phrase, such as “let's get together”, or conversely a phone number, e-mail address, or other useful piece of information.
- FIG. 6 is an embodiment of a transponder device responsive to a laser directed signal, such as that from laser 600 , aimed at items on the potential acquaintance's table, for example ashtray 620 placed on the table 630 , and having the ability to signal back to the potential acquaintance or, in another version, to be programmed to signal to any sender (or a sender having a particular code) that the person is interested, or not interested, as the case may be.
- this signal could either be a modulated signal that can be detected through a modulated infrared or radio respondent means for example, or it could be a visible signal capable of lighting up and showing that the person is interested.
- This is shown by LED 632 , attached to ashtray 620 and responsive to the signal from photovoltaic detector 635 connected to a readout capable of energizing said LED on receipt of a desired signal, such as, that produced by laser beam 640 from the laser pointer 600 .
- the detector and LED are driven and powered by a circuit and batteries, not shown.
- the detector would be responsive over a wide range of angles, ideally 360 degrees. Alternatively multiple detectors facing in different directions could be used. This would also give an indication of the direction from which the signal came. Each detector could have a corresponding LED as well, making the indication easy to see. Or other means of communicating the arrival and/or direction of a signal could be used, such as generating an audio signal or radio signal.
- Laser 600 can also be equipped with modulator 655 modulating the laser beam with a signal 670 that is responsive to the needs of the signaler. For example, if a key energizing the modulator is pressed three times, the unit is set up to modulate at a frequency that will be demodulated at the other end and provide an indicated meaning that the signaler wants to meet. This signal could be provided by having the LED 632 just blink 3 times in rapid succession for example.
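The three-press, three-blink exchange just described is a simple pulse-count code: the modulator emits a fixed number of pulses, and the receiver counts rising edges to recover the meaning. A hedged sketch of that idea (timing values and names are illustrative assumptions, not circuit specifications):

```python
def encode_pulses(count, pulse_ms=50, gap_ms=50):
    """Build an on/off timeline, one sample per millisecond, carrying
    `count` laser pulses, as modulator 655 might impose on the beam."""
    timeline = []
    for _ in range(count):
        timeline += [1] * pulse_ms + [0] * gap_ms
    return timeline

def decode_pulses(timeline):
    """Count rising edges in a sampled on/off timeline, as the
    receiver circuit driving an indicator LED might, to recover the
    encoded pulse count."""
    edges = 0
    prev = 0
    for sample in timeline:
        if sample and not prev:
            edges += 1
        prev = sample
    return edges
```

A count of three would then map, per the example above, to "wants to meet", and the receiving end could echo it by blinking its LED the same number of times.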
- a return signal to signal 670 could indicate that the signaler can't meet right now but would like to meet at some time in the future, and could automatically signal, modulated on the beam, the person's phone number or cell number, e-mail address, or whatever. This then makes it easy for one to have an ashtray or handbag that has the receptor for this particular number.
- a preferred goal of this invention is to provide a discreet method of signaling.
- the whole point of using a laser beam is that it is very discreet because no one can hear it, see it, or whatever—except in a region near an impacted object or in the direct line of sight. For this reason, a mechanism may be desirable to look back along the line of sight. An optical system buried in a purse, handbag, ashtray, or whatever can do this.
- Another neat idea is to have lights over each table. These lights would be energized with the laser pointer, which would be an easy shoot from a long distance, and then they could light up with either the message carried by the laser pointer or some standard message.
- Another laser pointer idea is to point at a screen on a projection TV and then sense that from behind using a camera. This was disclosed in FIG. 4 for other purposes.
- the alternative of a modulated message is also possible, where the laser can put out any sort of pulse, frequency, or amplitude modulated code, and some sort of detection system can read that.
- the simplest thing is to have a single analog sensor looking at the back end of the screen to demodulate the signal and tell which laser was used (in a room full of potential signaling users) or what signal was encoded in the modulated message.
- Another idea is to have a lighting fixture over a table that would receive the encoded messages either encoded in a time based fashion (pulse width, pulse spacing, frequency, etc) or spatially encoded.
- the spatially encoded one has the advantage that it can be done without any sort of electronic system. In other words, the human recipient of the message can see it directly. However, it's less versatile, as to change the data you have to change the spatially encoded masks, be they holographic gratings or whatever.
- such a phase mask can also be generated through other means, through what have historically been called light valves, but that is complex.
- the goal here is to try to reduce this to the simplest type of system useful by large numbers of people.
- Another embodiment of the invention may utilize a TV camera in place of single detector 635 to detect the incoming radiation from laser pointer 600 .
- the camera system can be utilized, for example, in a handbag or whatever that would possibly have the benefit of actually presenting to the owner an image of where the laser beam is coming from.
- illustrated in FIG. 7 is handwriting and signature recognition of sensed pencil position, such as for internet commerce and other purposes, including a D-Sight pad.
- user 701 with pen 710 writes his signature on paper 720 , resting on glass plate 725 .
- the backside of the paper 726 is reflective and using camera 730 , retroreflector 735 and light source 740 , a D sight image (using the D-Sight effect—see Ref 1 U.S. Pat. No. 4,629,319) is created which is viewed on camera 730 and analyzed where desired by computer 750 .
- This image is a function both of the xy position of the pen and of the force used (with some writing instruments, such as brushes, and papers, the force is proportional to the width of the mark produced).
- the image generated by camera 730 can be digitized and transmitted, if desired, to a remote analysis site 770 for authentication. It is uniquely a D-Sight image, and cannot be copied, even if the user's signature, say off a credit card receipt, were available to a forger.
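Authentication of such a signature amounts to comparing a newly sensed trace of pen position and force against a stored reference. A minimal sketch of one plausible comparison, mean per-sample distance over (x, y, force) triples (the function, the metric, and the equal-length assumption are all illustrative; a real system would resample and normalise the traces first):

```python
def trace_distance(a, b):
    """Mean per-sample Euclidean distance between two equal-length
    signature traces of (x, y, force) tuples. Smaller means more
    alike; a threshold on this value could gate authentication.
    Assumes traces are pre-aligned and of equal length."""
    if len(a) != len(b) or not a:
        raise ValueError("traces must be non-empty and equal length")
    total = 0.0
    for (x1, y1, f1), (x2, y2, f2) in zip(a, b):
        total += ((x1 - x2) ** 2 + (y1 - y2) ** 2 + (f1 - f2) ** 2) ** 0.5
    return total / len(a)
```

Including the force channel is what distinguishes this from copying a flat signature image: a forger matching the xy path alone would still diverge in the third coordinate.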
- a reflective member such as Saran can be placed between paper 720 and glass plate 725 and pressed or sucked into contact with it, and the reflective member (Saran in this case) conforming to the writing material monitored. If D-Sight is not the optical means used to monitor the force signature, then other means, such as grid projection described in copending applications, may not require reflective material at all.
- the apparatus of the above embodiments can be used to determine location of items in a scene, for example furniture in a house, for which homicide studies or insurance fraud could be an issue (see also referenced co-pending application for further detail on this application).
- users may each point a laser pointer at each other, which can be detected by one or more cameras of the invention.
- each or both may point at an image on the TV screen of the invention.
- the TV camera picks this up, and displays, possibly discreetly, that each liked (or disliked) that image. In this way, mutual likes and dislikes can be registered and noted.
- the laser pointer of the invention could be supplanted by any optical pointer capable of being easily viewed by people, and sensed.
- the TV camera of the invention, in whatever location used for sensing laser radiation, can be equipped with an interference filter passing substantially only the laser wavelengths used (assuming all persons using the system use a similar wavelength).
Claims (17)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/568,554 US6545670B1 (en) | 1999-05-11 | 2000-05-11 | Methods and apparatus for man machine interfaces and related activity |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13367399P | 1999-05-11 | 1999-05-11 | |
US09/568,554 US6545670B1 (en) | 1999-05-11 | 2000-05-11 | Methods and apparatus for man machine interfaces and related activity |
Publications (1)
Publication Number | Publication Date |
---|---|
US6545670B1 true US6545670B1 (en) | 2003-04-08 |
Family
ID=26831576
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/568,554 Expired - Lifetime US6545670B1 (en) | 1999-05-11 | 2000-05-11 | Methods and apparatus for man machine interfaces and related activity |
Country Status (1)
Country | Link |
---|---|
US (1) | US6545670B1 (en) |
Cited By (88)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20010020933A1 (en) * | 2000-02-21 | 2001-09-13 | Christoph Maggioni | Method and configuration for interacting with a display visible in a display window |
US20020126876A1 (en) * | 1999-08-10 | 2002-09-12 | Paul George V. | Tracking and gesture recognition system particularly suited to vehicular control applications |
US20030001818A1 (en) * | 2000-12-27 | 2003-01-02 | Masaji Katagiri | Handwritten data input device and method, and authenticating device and method |
US20030048280A1 (en) * | 2001-09-12 | 2003-03-13 | Russell Ryan S. | Interactive environment using computer vision and touchscreens |
US20030063260A1 (en) * | 2001-09-28 | 2003-04-03 | Fuji Photo Optical Co., Ltd. | Presentation system |
US6608648B1 (en) * | 1999-10-21 | 2003-08-19 | Hewlett-Packard Development Company, L.P. | Digital camera cursor control by sensing finger position on lens cap |
US20030184645A1 (en) * | 2002-03-27 | 2003-10-02 | Biegelsen David K. | Automatic camera steering control and video conferencing |
US20030222849A1 (en) * | 2002-05-31 | 2003-12-04 | Starkweather Gary K. | Laser-based user input device for electronic projection displays |
US20040008185A1 (en) * | 2002-03-29 | 2004-01-15 | Mitac International Corp. | Data processing device, presentation device, and projection method for presentation |
US20040027455A1 (en) * | 2000-12-15 | 2004-02-12 | Leonard Reiffel | Imaged coded data source tracking product |
US20040041027A1 (en) * | 2000-12-15 | 2004-03-04 | Leonard Reiffel | Imaged coded data source transducer product |
US20040125224A1 (en) * | 2000-08-18 | 2004-07-01 | Leonard Reiffel | Annotating imaged data product |
US20050102332A1 (en) * | 2000-12-15 | 2005-05-12 | Leonard Reiffel | Multi-imager multi-source multi-use coded data source data iInput product |
US20060036944A1 (en) * | 2004-08-10 | 2006-02-16 | Microsoft Corporation | Surface UI for gesture-based interaction |
US7000840B2 (en) | 2000-05-03 | 2006-02-21 | Leonard Reiffel | Dual mode data imaging product |
US20060044282A1 (en) * | 2004-08-27 | 2006-03-02 | International Business Machines Corporation | User input apparatus, system, method and computer program for use with a screen having a translucent surface |
US7034803B1 (en) | 2000-08-18 | 2006-04-25 | Leonard Reiffel | Cursor display privacy product |
US20060119798A1 (en) * | 2004-12-02 | 2006-06-08 | Huddleston Wyatt A | Display panel |
US20060238493A1 (en) * | 2005-04-22 | 2006-10-26 | Dunton Randy R | System and method to activate a graphical user interface (GUI) via a laser beam |
US7137711B1 (en) | 2000-03-21 | 2006-11-21 | Leonard Reiffel | Multi-user retro reflector data input |
US20060291797A1 (en) * | 2003-05-27 | 2006-12-28 | Leonard Reiffel | Multi-imager multi-source multi-use coded data source data input product |
JP2007072637A (en) * | 2005-09-06 | 2007-03-22 | Hitachi Ltd | Input device using elastic material |
US20070063982A1 (en) * | 2005-09-19 | 2007-03-22 | Tran Bao Q | Integrated rendering of sound and image on a display |
US20070171891A1 (en) * | 2006-01-26 | 2007-07-26 | Available For Licensing | Cellular device with broadcast radio or TV receiver |
US20070187506A1 (en) * | 2001-04-19 | 2007-08-16 | Leonard Reiffel | Combined imaging coded data source data acquisition |
US20070222734A1 (en) * | 2006-03-25 | 2007-09-27 | Tran Bao Q | Mobile device capable of receiving music or video content from satellite radio providers |
US20070229233A1 (en) * | 2004-08-02 | 2007-10-04 | Dort David B | Reconfigurable tactile-enhanced display including "tap-and-drop" computing system for vision impaired users |
US20070262995A1 (en) * | 2006-05-12 | 2007-11-15 | Available For Licensing | Systems and methods for video editing |
US20080024463A1 (en) * | 2001-02-22 | 2008-01-31 | Timothy Pryor | Reconfigurable tactile control display applications |
US20080065291A1 (en) * | 2002-11-04 | 2008-03-13 | Automotive Technologies International, Inc. | Gesture-Based Control of Vehicular Components |
US7355561B1 (en) | 2003-09-15 | 2008-04-08 | United States Of America As Represented By The Secretary Of The Army | Systems and methods for providing images |
US20080088587A1 (en) * | 2001-02-22 | 2008-04-17 | Timothy Pryor | Compact rtd instrument panels and computer interfaces |
US20080096651A1 (en) * | 2006-07-28 | 2008-04-24 | Aruze Corp. | Gaming machine |
US20080122805A1 (en) * | 2000-10-11 | 2008-05-29 | Peter Smith | Books, papers, and downloaded information to facilitate human interaction with computers |
US20080192027A1 (en) * | 2002-11-08 | 2008-08-14 | Morrison James C | Interactive window display |
US20080211779A1 (en) * | 1994-08-15 | 2008-09-04 | Pryor Timothy R | Control systems employing novel physical controls and touch screens |
US20090027357A1 (en) * | 2007-07-23 | 2009-01-29 | Smart Technologies, Inc. | System and method of detecting contact on a display |
US20090116692A1 (en) * | 1998-08-10 | 2009-05-07 | Paul George V | Realtime object tracking system |
US20090273563A1 (en) * | 1999-11-08 | 2009-11-05 | Pryor Timothy R | Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics |
US20090300531A1 (en) * | 1995-06-29 | 2009-12-03 | Pryor Timothy R | Method for providing human input to a computer |
US20090322499A1 (en) * | 1995-06-29 | 2009-12-31 | Pryor Timothy R | Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics |
US20100008582A1 (en) * | 2008-07-10 | 2010-01-14 | Samsung Electronics Co., Ltd. | Method for recognizing and translating characters in camera-based image |
US20100079493A1 (en) * | 2008-09-29 | 2010-04-01 | Smart Technologies Ulc | Method for selecting and manipulating a graphical object in an interactive input system, and interactive input system executing the method |
US20100079385A1 (en) * | 2008-09-29 | 2010-04-01 | Smart Technologies Ulc | Method for calibrating an interactive input system and interactive input system executing the calibration method |
US20100083109A1 (en) * | 2008-09-29 | 2010-04-01 | Smart Technologies Ulc | Method for handling interactions with multiple users of an interactive input system, and interactive input system executing the method |
US20100079409A1 (en) * | 2008-09-29 | 2010-04-01 | Smart Technologies Ulc | Touch panel for an interactive input system, and interactive input system incorporating the touch panel |
EP2213501A2 (en) | 2003-03-31 | 2010-08-04 | Timothy R. Pryor | Reconfigurable vehicle instrument panels |
US20100217433A1 (en) * | 2007-10-16 | 2010-08-26 | Hyun Dong Son | Store management system capable of switching between manned or unmanned sales |
US20110050650A1 (en) * | 2009-09-01 | 2011-03-03 | Smart Technologies Ulc | Interactive input system with improved signal-to-noise ratio (snr) and image capture method |
US20110069019A1 (en) * | 2009-07-08 | 2011-03-24 | Smart Technologies Ulc | Method for manipulating a graphic widget in a three-dimensional environment displayed on a touch panel of an interactive input system |
US20110169748A1 (en) * | 2010-01-11 | 2011-07-14 | Smart Technologies Ulc | Method for handling user input in an interactive input system, and interactive input system executing the method |
US20110243380A1 (en) * | 2010-04-01 | 2011-10-06 | Qualcomm Incorporated | Computing device interface |
US8077147B2 (en) | 2005-12-30 | 2011-12-13 | Apple Inc. | Mouse with optical sensing surface |
US8239784B2 (en) | 2004-07-30 | 2012-08-07 | Apple Inc. | Mode-based graphical user interfaces for touch sensitive input devices |
US8314773B2 (en) | 2002-09-09 | 2012-11-20 | Apple Inc. | Mouse having an optically-based scrolling feature |
CN102831387A (en) * | 2005-01-07 | 2012-12-19 | 高通股份有限公司 | Detecting and tracking objects in images |
US8381135B2 (en) | 2004-07-30 | 2013-02-19 | Apple Inc. | Proximity detector in handheld device |
US20130050549A1 (en) * | 2006-01-04 | 2013-02-28 | Apple Inc. | Embedded camera with privacy filter |
US20130141569A1 (en) * | 2011-12-06 | 2013-06-06 | Canon Kabushiki Kaisha | Information processing apparatus, control method of information processing apparatus, and storage medium |
US20130229669A1 (en) * | 2007-10-10 | 2013-09-05 | Gerard Dirk Smits | Method, apparatus, and manufacture for a tracking camera or detector with fast asynchronous triggering |
US8576199B1 (en) | 2000-02-22 | 2013-11-05 | Apple Inc. | Computer control systems |
US8711370B1 (en) | 2012-10-04 | 2014-04-29 | Gerard Dirk Smits | Scanning optical positioning system with spatially triangulating receivers |
US8781171B2 (en) | 2012-10-24 | 2014-07-15 | Honda Motor Co., Ltd. | Object recognition in low-lux and high-lux conditions |
US8867015B2 (en) | 2012-01-11 | 2014-10-21 | Apple Inc. | Displays with liquid crystal shutters |
US20140370980A1 (en) * | 2013-06-17 | 2014-12-18 | Bally Gaming, Inc. | Electronic gaming displays, gaming tables including electronic gaming displays and related assemblies, systems and methods |
US8971568B1 (en) | 2012-10-08 | 2015-03-03 | Gerard Dirk Smits | Method, apparatus, and manufacture for document writing and annotation with virtual ink |
US20150199018A1 (en) * | 2014-01-14 | 2015-07-16 | Microsoft Corporation | 3d silhouette sensing system |
US20150336588A1 (en) * | 2012-07-06 | 2015-11-26 | Audi Ag | Method and control system for operating a motor vehicle |
US9239677B2 (en) | 2004-05-06 | 2016-01-19 | Apple Inc. | Operation of a computer with touch screen interface |
US9239673B2 (en) | 1998-01-26 | 2016-01-19 | Apple Inc. | Gesturing with a multipoint sensing device |
US9292111B2 (en) | 1998-01-26 | 2016-03-22 | Apple Inc. | Gesturing with a multipoint sensing device |
US9304593B2 (en) | 1998-08-10 | 2016-04-05 | Cybernet Systems Corporation | Behavior recognition system |
US9377533B2 (en) | 2014-08-11 | 2016-06-28 | Gerard Dirk Smits | Three-dimensional triangulation and time-of-flight based tracking systems and methods |
US20160301900A1 (en) * | 2015-04-07 | 2016-10-13 | Omnivision Technologies, Inc. | Touch screen rear projection display |
US9753126B2 (en) | 2015-12-18 | 2017-09-05 | Gerard Dirk Smits | Real time position sensing of objects |
US9810913B2 (en) | 2014-03-28 | 2017-11-07 | Gerard Dirk Smits | Smart head-mounted projection system |
US9813673B2 (en) | 2016-01-20 | 2017-11-07 | Gerard Dirk Smits | Holographic video capture and telepresence system |
US9946076B2 (en) | 2010-10-04 | 2018-04-17 | Gerard Dirk Smits | System and method for 3-D projection and enhancements for interactivity |
US10067230B2 (en) | 2016-10-31 | 2018-09-04 | Gerard Dirk Smits | Fast scanning LIDAR with dynamic voxel probing |
US10157469B2 (en) | 2015-04-13 | 2018-12-18 | Gerard Dirk Smits | Machine vision for ego-motion, segmenting, and classifying objects |
US10166995B2 (en) * | 2016-01-08 | 2019-01-01 | Ford Global Technologies, Llc | System and method for feature activation via gesture recognition and voice command |
US10261183B2 (en) | 2016-12-27 | 2019-04-16 | Gerard Dirk Smits | Systems and methods for machine perception |
US10379220B1 (en) | 2018-01-29 | 2019-08-13 | Gerard Dirk Smits | Hyper-resolved, high bandwidth scanned LIDAR systems |
US10473921B2 (en) | 2017-05-10 | 2019-11-12 | Gerard Dirk Smits | Scan mirror systems and methods |
US10591605B2 (en) | 2017-10-19 | 2020-03-17 | Gerard Dirk Smits | Methods and systems for navigating a vehicle including a novel fiducial marker system |
US20220360755A1 (en) * | 2020-10-23 | 2022-11-10 | Ji Shen | Interactive display with integrated camera for capturing audio and visual information |
US20230113359A1 (en) * | 2020-10-23 | 2023-04-13 | Pathway Innovations And Technologies, Inc. | Full color spectrum blending and digital color filtering for transparent display screens |
US11829059B2 (en) | 2020-02-27 | 2023-11-28 | Gerard Dirk Smits | High resolution scanning of remote objects with fast sweeping laser beams and signal recovery by twitchy pixel array |
Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3748751A (en) * | 1972-09-07 | 1973-07-31 | Us Navy | Laser machine gun simulator |
US3757322A (en) * | 1971-02-03 | 1973-09-04 | Hall Barkan Instr Inc | Transparent touch controlled interface with interreactively related display |
US4017848A (en) * | 1975-05-19 | 1977-04-12 | Rockwell International Corporation | Transparent keyboard switch and array |
US4772028A (en) * | 1987-08-27 | 1988-09-20 | Rockhold Christopher K | Electronic shootout game |
US4948371A (en) * | 1989-04-25 | 1990-08-14 | The United States Of America As Represented By The United States Department Of Energy | System for training and evaluation of security personnel in use of firearms |
US5328190A (en) * | 1992-08-04 | 1994-07-12 | Dart International, Inc. | Method and apparatus enabling archery practice |
US5495269A (en) * | 1992-04-03 | 1996-02-27 | Xerox Corporation | Large area electronic writing system |
US5502514A (en) | 1995-06-07 | 1996-03-26 | Nview Corporation | Stylus position sensing and digital camera with a digital micromirror device |
US5515079A (en) | 1989-11-07 | 1996-05-07 | Proxima Corporation | Computer input system and method of using same |
US5613913A (en) * | 1994-04-06 | 1997-03-25 | Sega Enterprises, Ltd. | Method for developing attractions in a shooting game system |
US5649706A (en) * | 1994-09-21 | 1997-07-22 | Treat, Jr.; Erwin C. | Simulator and practice method |
US5982352A (en) * | 1992-09-18 | 1999-11-09 | Pryor; Timothy R. | Method for providing human input to a computer |
US6008800A (en) * | 1992-09-18 | 1999-12-28 | Pryor; Timothy R. | Man machine interfaces for entering data into a computer |
US20010012001A1 (en) * | 1997-07-07 | 2001-08-09 | Junichi Rekimoto | Information input apparatus |
US6339748B1 (en) * | 1997-11-11 | 2002-01-15 | Seiko Epson Corporation | Coordinate input system and display apparatus |
US6441807B1 (en) * | 1997-09-03 | 2002-08-27 | Plus Industrial Corporation | Display system |
Legal Events (2000)
- 2000-05-11: US US09/568,554 patent/US6545670B1/en not_active Expired - Lifetime
Cited By (161)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9513744B2 (en) | 1994-08-15 | 2016-12-06 | Apple Inc. | Control systems employing novel physical controls and touch screens |
US20080211779A1 (en) * | 1994-08-15 | 2008-09-04 | Pryor Timothy R | Control systems employing novel physical controls and touch screens |
US20090322499A1 (en) * | 1995-06-29 | 2009-12-31 | Pryor Timothy R | Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics |
US9758042B2 (en) | 1995-06-29 | 2017-09-12 | Apple Inc. | Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics |
US8427449B2 (en) | 1995-06-29 | 2013-04-23 | Apple Inc. | Method for providing human input to a computer |
US8228305B2 (en) | 1995-06-29 | 2012-07-24 | Apple Inc. | Method for providing human input to a computer |
US8610674B2 (en) | 1995-06-29 | 2013-12-17 | Apple Inc. | Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics |
US20090300531A1 (en) * | 1995-06-29 | 2009-12-03 | Pryor Timothy R | Method for providing human input to a computer |
US9292111B2 (en) | 1998-01-26 | 2016-03-22 | Apple Inc. | Gesturing with a multipoint sensing device |
US9239673B2 (en) | 1998-01-26 | 2016-01-19 | Apple Inc. | Gesturing with a multipoint sensing device |
US7684592B2 (en) | 1998-08-10 | 2010-03-23 | Cybernet Systems Corporation | Realtime object tracking system |
US20090116692A1 (en) * | 1998-08-10 | 2009-05-07 | Paul George V | Realtime object tracking system |
US9304593B2 (en) | 1998-08-10 | 2016-04-05 | Cybernet Systems Corporation | Behavior recognition system |
US7050606B2 (en) * | 1999-08-10 | 2006-05-23 | Cybernet Systems Corporation | Tracking and gesture recognition system particularly suited to vehicular control applications |
US20070195997A1 (en) * | 1999-08-10 | 2007-08-23 | Paul George V | Tracking and gesture recognition system particularly suited to vehicular control applications |
US20020126876A1 (en) * | 1999-08-10 | 2002-09-12 | Paul George V. | Tracking and gesture recognition system particularly suited to vehicular control applications |
US6608648B1 (en) * | 1999-10-21 | 2003-08-19 | Hewlett-Packard Development Company, L.P. | Digital camera cursor control by sensing finger position on lens cap |
US20090273563A1 (en) * | 1999-11-08 | 2009-11-05 | Pryor Timothy R | Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics |
US8482535B2 (en) | 1999-11-08 | 2013-07-09 | Apple Inc. | Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics |
US20010020933A1 (en) * | 2000-02-21 | 2001-09-13 | Christoph Maggioni | Method and configuration for interacting with a display visible in a display window |
US7034807B2 (en) * | 2000-02-21 | 2006-04-25 | Siemens Aktiengesellschaft | Method and configuration for interacting with a display visible in a display window |
US8576199B1 (en) | 2000-02-22 | 2013-11-05 | Apple Inc. | Computer control systems |
US7137711B1 (en) | 2000-03-21 | 2006-11-21 | Leonard Reiffel | Multi-user retro reflector data input |
US7000840B2 (en) | 2000-05-03 | 2006-02-21 | Leonard Reiffel | Dual mode data imaging product |
US7034803B1 (en) | 2000-08-18 | 2006-04-25 | Leonard Reiffel | Cursor display privacy product |
US20040125224A1 (en) * | 2000-08-18 | 2004-07-01 | Leonard Reiffel | Annotating imaged data product |
US7161581B2 (en) | 2000-08-18 | 2007-01-09 | Leonard Reiffel | Annotating imaged data product |
US8040328B2 (en) | 2000-10-11 | 2011-10-18 | Peter Smith | Books, papers, and downloaded information to facilitate human interaction with computers |
US20080122805A1 (en) * | 2000-10-11 | 2008-05-29 | Peter Smith | Books, papers, and downloaded information to facilitate human interaction with computers |
US20050102332A1 (en) * | 2000-12-15 | 2005-05-12 | Leonard Reiffel | Multi-imager multi-source multi-use coded data source data iInput product |
US6945460B2 (en) | 2000-12-15 | 2005-09-20 | Leonard Reiffel | Imaged coded data source transducer product |
US20040027455A1 (en) * | 2000-12-15 | 2004-02-12 | Leonard Reiffel | Imaged coded data source tracking product |
US7184075B2 (en) | 2000-12-15 | 2007-02-27 | Leonard Reiffel | Imaged coded data source tracking product |
US20040041027A1 (en) * | 2000-12-15 | 2004-03-04 | Leonard Reiffel | Imaged coded data source transducer product |
US7099070B2 (en) | 2000-12-15 | 2006-08-29 | Leonard Reiffel | Multi-imager multi-source multi-use coded data source data input product |
US20030001818A1 (en) * | 2000-12-27 | 2003-01-02 | Masaji Katagiri | Handwritten data input device and method, and authenticating device and method |
US6947029B2 (en) * | 2000-12-27 | 2005-09-20 | Masaji Katagiri | Handwritten data input device and method, and authenticating device and method |
US20080024463A1 (en) * | 2001-02-22 | 2008-01-31 | Timothy Pryor | Reconfigurable tactile control display applications |
US20080088587A1 (en) * | 2001-02-22 | 2008-04-17 | Timothy Pryor | Compact rtd instrument panels and computer interfaces |
US7377438B2 (en) | 2001-04-19 | 2008-05-27 | Leonard Reiffel | Combined imaging coded data source data acquisition |
US20070187506A1 (en) * | 2001-04-19 | 2007-08-16 | Leonard Reiffel | Combined imaging coded data source data acquisition |
US20030048280A1 (en) * | 2001-09-12 | 2003-03-13 | Russell Ryan S. | Interactive environment using computer vision and touchscreens |
US7027041B2 (en) * | 2001-09-28 | 2006-04-11 | Fujinon Corporation | Presentation system |
US20030063260A1 (en) * | 2001-09-28 | 2003-04-03 | Fuji Photo Optical Co., Ltd. | Presentation system |
US9606668B2 (en) | 2002-02-07 | 2017-03-28 | Apple Inc. | Mode-based graphical user interfaces for touch sensitive input devices |
US20030184645A1 (en) * | 2002-03-27 | 2003-10-02 | Biegelsen David K. | Automatic camera steering control and video conferencing |
US7969472B2 (en) * | 2002-03-27 | 2011-06-28 | Xerox Corporation | Automatic camera steering control and video conferencing |
US20040008185A1 (en) * | 2002-03-29 | 2004-01-15 | Mitac International Corp. | Data processing device, presentation device, and projection method for presentation |
US20030222849A1 (en) * | 2002-05-31 | 2003-12-04 | Starkweather Gary K. | Laser-based user input device for electronic projection displays |
US8314773B2 (en) | 2002-09-09 | 2012-11-20 | Apple Inc. | Mouse having an optically-based scrolling feature |
US20080065291A1 (en) * | 2002-11-04 | 2008-03-13 | Automotive Technologies International, Inc. | Gesture-Based Control of Vehicular Components |
US20080192027A1 (en) * | 2002-11-08 | 2008-08-14 | Morrison James C | Interactive window display |
US7978184B2 (en) * | 2002-11-08 | 2011-07-12 | American Greetings Corporation | Interactive window display |
EP2213501A2 (en) | 2003-03-31 | 2010-08-04 | Timothy R. Pryor | Reconfigurable vehicle instrument panels |
EP2581248A1 (en) | 2003-03-31 | 2013-04-17 | Timothy R. Pryor | Reconfigurable vehicle instrument panels |
US20060291797A1 (en) * | 2003-05-27 | 2006-12-28 | Leonard Reiffel | Multi-imager multi-source multi-use coded data source data input product |
US7355561B1 (en) | 2003-09-15 | 2008-04-08 | United States Of America As Represented By The Secretary Of The Army | Systems and methods for providing images |
US9239677B2 (en) | 2004-05-06 | 2016-01-19 | Apple Inc. | Operation of a computer with touch screen interface |
US8381135B2 (en) | 2004-07-30 | 2013-02-19 | Apple Inc. | Proximity detector in handheld device |
US8479122B2 (en) | 2004-07-30 | 2013-07-02 | Apple Inc. | Gestures for touch sensitive input devices |
US8239784B2 (en) | 2004-07-30 | 2012-08-07 | Apple Inc. | Mode-based graphical user interfaces for touch sensitive input devices |
US11036282B2 (en) | 2004-07-30 | 2021-06-15 | Apple Inc. | Proximity detector in handheld device |
US10042418B2 (en) | 2004-07-30 | 2018-08-07 | Apple Inc. | Proximity detector in handheld device |
US8612856B2 (en) | 2004-07-30 | 2013-12-17 | Apple Inc. | Proximity detector in handheld device |
US9348458B2 (en) | 2004-07-30 | 2016-05-24 | Apple Inc. | Gestures for touch sensitive input devices |
US20070229233A1 (en) * | 2004-08-02 | 2007-10-04 | Dort David B | Reconfigurable tactile-enhanced display including "tap-and-drop" computing system for vision impaired users |
US20100027843A1 (en) * | 2004-08-10 | 2010-02-04 | Microsoft Corporation | Surface ui for gesture-based interaction |
US20060036944A1 (en) * | 2004-08-10 | 2006-02-16 | Microsoft Corporation | Surface UI for gesture-based interaction |
US8560972B2 (en) | 2004-08-10 | 2013-10-15 | Microsoft Corporation | Surface UI for gesture-based interaction |
WO2006025872A2 (en) * | 2004-08-27 | 2006-03-09 | International Business Machines Corporation | User input apparatus, system, method and computer program for use with a screen having a translucent surface |
US20060044282A1 (en) * | 2004-08-27 | 2006-03-02 | International Business Machines Corporation | User input apparatus, system, method and computer program for use with a screen having a translucent surface |
WO2006025872A3 (en) * | 2004-08-27 | 2008-11-20 | Ibm | User input apparatus, system, method and computer program for use with a screen having a translucent surface |
US8508710B2 (en) * | 2004-12-02 | 2013-08-13 | Hewlett-Packard Development Company, L.P. | Display panel |
US20060119798A1 (en) * | 2004-12-02 | 2006-06-08 | Huddleston Wyatt A | Display panel |
CN102831387B (en) * | 2005-01-07 | 2016-12-14 | 高通股份有限公司 | Object in detect and track image |
CN102831387A (en) * | 2005-01-07 | 2012-12-19 | 高通股份有限公司 | Detecting and tracking objects in images |
US20060238493A1 (en) * | 2005-04-22 | 2006-10-26 | Dunton Randy R | System and method to activate a graphical user interface (GUI) via a laser beam |
JP2007072637A (en) * | 2005-09-06 | 2007-03-22 | Hitachi Ltd | Input device using elastic material |
JP4635788B2 (en) * | 2005-09-06 | 2011-02-23 | 株式会社日立製作所 | Input device using elastic material |
US20070063982A1 (en) * | 2005-09-19 | 2007-03-22 | Tran Bao Q | Integrated rendering of sound and image on a display |
US8077147B2 (en) | 2005-12-30 | 2011-12-13 | Apple Inc. | Mouse with optical sensing surface |
US20130050549A1 (en) * | 2006-01-04 | 2013-02-28 | Apple Inc. | Embedded camera with privacy filter |
CN104702827A (en) * | 2006-01-04 | 2015-06-10 | 苹果公司 | Embedded camera with privacy filter |
US8797451B2 (en) * | 2006-01-04 | 2014-08-05 | Apple Inc. | Embedded camera with privacy filter |
US20070171891A1 (en) * | 2006-01-26 | 2007-07-26 | Available For Licensing | Cellular device with broadcast radio or TV receiver |
US20070222734A1 (en) * | 2006-03-25 | 2007-09-27 | Tran Bao Q | Mobile device capable of receiving music or video content from satellite radio providers |
US20110230232A1 (en) * | 2006-05-12 | 2011-09-22 | Tran Bao Q | Systems and methods for video editing |
US20070262995A1 (en) * | 2006-05-12 | 2007-11-15 | Available For Licensing | Systems and methods for video editing |
US7827491B2 (en) | 2006-05-12 | 2010-11-02 | Tran Bao Q | Systems and methods for video editing |
US20080096651A1 (en) * | 2006-07-28 | 2008-04-24 | Aruze Corp. | Gaming machine |
US20090027357A1 (en) * | 2007-07-23 | 2009-01-29 | Smart Technologies, Inc. | System and method of detecting contact on a display |
US8094137B2 (en) | 2007-07-23 | 2012-01-10 | Smart Technologies Ulc | System and method of detecting contact on a display |
US10962867B2 (en) | 2007-10-10 | 2021-03-30 | Gerard Dirk Smits | Method, apparatus, and manufacture for a tracking camera or detector with fast asynchronous triggering |
US8696141B2 (en) * | 2007-10-10 | 2014-04-15 | Gerard Dirk Smits | Method, apparatus, and manufacture for a tracking camera or detector with fast asynchronous triggering |
US9581883B2 (en) | 2007-10-10 | 2017-02-28 | Gerard Dirk Smits | Method, apparatus, and manufacture for a tracking camera or detector with fast asynchronous triggering |
US20130229669A1 (en) * | 2007-10-10 | 2013-09-05 | Gerard Dirk Smits | Method, apparatus, and manufacture for a tracking camera or detector with fast asynchronous triggering |
US20100217433A1 (en) * | 2007-10-16 | 2010-08-26 | Hyun Dong Son | Store management system capable of switching between manned or unmanned sales |
US20100008582A1 (en) * | 2008-07-10 | 2010-01-14 | Samsung Electronics Co., Ltd. | Method for recognizing and translating characters in camera-based image |
US8810522B2 (en) | 2008-09-29 | 2014-08-19 | Smart Technologies Ulc | Method for selecting and manipulating a graphical object in an interactive input system, and interactive input system executing the method |
US20100079409A1 (en) * | 2008-09-29 | 2010-04-01 | Smart Technologies Ulc | Touch panel for an interactive input system, and interactive input system incorporating the touch panel |
US20100079385A1 (en) * | 2008-09-29 | 2010-04-01 | Smart Technologies Ulc | Method for calibrating an interactive input system and interactive input system executing the calibration method |
US20100079493A1 (en) * | 2008-09-29 | 2010-04-01 | Smart Technologies Ulc | Method for selecting and manipulating a graphical object in an interactive input system, and interactive input system executing the method |
US20100083109A1 (en) * | 2008-09-29 | 2010-04-01 | Smart Technologies Ulc | Method for handling interactions with multiple users of an interactive input system, and interactive input system executing the method |
US8416206B2 (en) | 2009-07-08 | 2013-04-09 | Smart Technologies Ulc | Method for manipulating a graphic widget in a three-dimensional environment displayed on a touch panel of an interactive input system |
US20110069019A1 (en) * | 2009-07-08 | 2011-03-24 | Smart Technologies Ulc | Method for manipulating a graphic widget in a three-dimensional environment displayed on a touch panel of an interactive input system |
US8902195B2 (en) | 2009-09-01 | 2014-12-02 | Smart Technologies Ulc | Interactive input system with improved signal-to-noise ratio (SNR) and image capture method |
US20110050650A1 (en) * | 2009-09-01 | 2011-03-03 | Smart Technologies Ulc | Interactive input system with improved signal-to-noise ratio (snr) and image capture method |
US20110169748A1 (en) * | 2010-01-11 | 2011-07-14 | Smart Technologies Ulc | Method for handling user input in an interactive input system, and interactive input system executing the method |
US8502789B2 (en) | 2010-01-11 | 2013-08-06 | Smart Technologies Ulc | Method for handling user input in an interactive input system, and interactive input system executing the method |
US8818027B2 (en) * | 2010-04-01 | 2014-08-26 | Qualcomm Incorporated | Computing device interface |
US20110243380A1 (en) * | 2010-04-01 | 2011-10-06 | Qualcomm Incorporated | Computing device interface |
US9946076B2 (en) | 2010-10-04 | 2018-04-17 | Gerard Dirk Smits | System and method for 3-D projection and enhancements for interactivity |
US9288455B2 (en) * | 2011-12-06 | 2016-03-15 | Canon Kabushiki Kaisha | Information processing apparatus, control method of information processing apparatus, and storage medium for determining whether a projection pattern of a current frame differs from that of a previous frame |
US20130141569A1 (en) * | 2011-12-06 | 2013-06-06 | Canon Kabushiki Kaisha | Information processing apparatus, control method of information processing apparatus, and storage medium |
US8867015B2 (en) | 2012-01-11 | 2014-10-21 | Apple Inc. | Displays with liquid crystal shutters |
US9493169B2 (en) * | 2012-07-06 | 2016-11-15 | Audi Ag | Method and control system for operating a motor vehicle |
US20150336588A1 (en) * | 2012-07-06 | 2015-11-26 | Audi Ag | Method and control system for operating a motor vehicle |
US8711370B1 (en) | 2012-10-04 | 2014-04-29 | Gerard Dirk Smits | Scanning optical positioning system with spatially triangulating receivers |
US9501176B1 (en) | 2012-10-08 | 2016-11-22 | Gerard Dirk Smits | Method, apparatus, and manufacture for document writing and annotation with virtual ink |
US8971568B1 (en) | 2012-10-08 | 2015-03-03 | Gerard Dirk Smits | Method, apparatus, and manufacture for document writing and annotation with virtual ink |
US8781171B2 (en) | 2012-10-24 | 2014-07-15 | Honda Motor Co., Ltd. | Object recognition in low-lux and high-lux conditions |
US9469251B2 (en) | 2012-10-24 | 2016-10-18 | Honda Motor Co., Ltd. | Object recognition in low-lux and high-lux conditions |
US9302621B2 (en) | 2012-10-24 | 2016-04-05 | Honda Motor Co., Ltd. | Object recognition in low-lux and high-lux conditions |
US9852332B2 (en) | 2012-10-24 | 2017-12-26 | Honda Motor Co., Ltd. | Object recognition in low-lux and high-lux conditions |
US20140370980A1 (en) * | 2013-06-17 | 2014-12-18 | Bally Gaming, Inc. | Electronic gaming displays, gaming tables including electronic gaming displays and related assemblies, systems and methods |
US20170285763A1 (en) * | 2014-01-14 | 2017-10-05 | Microsoft Technology Licensing, Llc | 3d silhouette sensing system |
US20150199018A1 (en) * | 2014-01-14 | 2015-07-16 | Microsoft Corporation | 3d silhouette sensing system |
US9720506B2 (en) * | 2014-01-14 | 2017-08-01 | Microsoft Technology Licensing, Llc | 3D silhouette sensing system |
US10001845B2 (en) * | 2014-01-14 | 2018-06-19 | Microsoft Technology Licensing, Llc | 3D silhouette sensing system |
US9810913B2 (en) | 2014-03-28 | 2017-11-07 | Gerard Dirk Smits | Smart head-mounted projection system |
US10061137B2 (en) | 2014-03-28 | 2018-08-28 | Gerard Dirk Smits | Smart head-mounted projection system |
US9377533B2 (en) | 2014-08-11 | 2016-06-28 | Gerard Dirk Smits | Three-dimensional triangulation and time-of-flight based tracking systems and methods |
US11137497B2 (en) | 2014-08-11 | 2021-10-05 | Gerard Dirk Smits | Three-dimensional triangulation and time-of-flight based tracking systems and methods |
US10324187B2 (en) | 2014-08-11 | 2019-06-18 | Gerard Dirk Smits | Three-dimensional triangulation and time-of-flight based tracking systems and methods |
US20160301900A1 (en) * | 2015-04-07 | 2016-10-13 | Omnivision Technologies, Inc. | Touch screen rear projection display |
US10901548B2 (en) * | 2015-04-07 | 2021-01-26 | Omnivision Technologies, Inc. | Touch screen rear projection display |
US10157469B2 (en) | 2015-04-13 | 2018-12-18 | Gerard Dirk Smits | Machine vision for ego-motion, segmenting, and classifying objects |
US10325376B2 (en) | 2015-04-13 | 2019-06-18 | Gerard Dirk Smits | Machine vision for ego-motion, segmenting, and classifying objects |
US10274588B2 (en) | 2015-12-18 | 2019-04-30 | Gerard Dirk Smits | Real time position sensing of objects |
US10502815B2 (en) | 2015-12-18 | 2019-12-10 | Gerard Dirk Smits | Real time position sensing of objects |
US11714170B2 (en) | 2015-12-18 | 2023-08-01 | Samsung Semiconductor, Inc. | Real time position sensing of objects
US9753126B2 (en) | 2015-12-18 | 2017-09-05 | Gerard Dirk Smits | Real time position sensing of objects |
US10166995B2 (en) * | 2016-01-08 | 2019-01-01 | Ford Global Technologies, Llc | System and method for feature activation via gesture recognition and voice command |
US10084990B2 (en) | 2016-01-20 | 2018-09-25 | Gerard Dirk Smits | Holographic video capture and telepresence system |
US10477149B2 (en) | 2016-01-20 | 2019-11-12 | Gerard Dirk Smits | Holographic video capture and telepresence system |
US9813673B2 (en) | 2016-01-20 | 2017-11-07 | Gerard Dirk Smits | Holographic video capture and telepresence system |
US10935659B2 (en) | 2016-10-31 | 2021-03-02 | Gerard Dirk Smits | Fast scanning lidar with dynamic voxel probing |
US10067230B2 (en) | 2016-10-31 | 2018-09-04 | Gerard Dirk Smits | Fast scanning LIDAR with dynamic voxel probing |
US10451737B2 (en) | 2016-10-31 | 2019-10-22 | Gerard Dirk Smits | Fast scanning with dynamic voxel probing |
US11709236B2 (en) | 2016-12-27 | 2023-07-25 | Samsung Semiconductor, Inc. | Systems and methods for machine perception |
US10564284B2 (en) | 2016-12-27 | 2020-02-18 | Gerard Dirk Smits | Systems and methods for machine perception |
US10261183B2 (en) | 2016-12-27 | 2019-04-16 | Gerard Dirk Smits | Systems and methods for machine perception |
US10473921B2 (en) | 2017-05-10 | 2019-11-12 | Gerard Dirk Smits | Scan mirror systems and methods |
US11067794B2 (en) | 2017-05-10 | 2021-07-20 | Gerard Dirk Smits | Scan mirror systems and methods |
US10591605B2 (en) | 2017-10-19 | 2020-03-17 | Gerard Dirk Smits | Methods and systems for navigating a vehicle including a novel fiducial marker system |
US10935989B2 (en) | 2017-10-19 | 2021-03-02 | Gerard Dirk Smits | Methods and systems for navigating a vehicle including a novel fiducial marker system |
US10725177B2 (en) | 2018-01-29 | 2020-07-28 | Gerard Dirk Smits | Hyper-resolved, high bandwidth scanned LIDAR systems |
US10379220B1 (en) | 2018-01-29 | 2019-08-13 | Gerard Dirk Smits | Hyper-resolved, high bandwidth scanned LIDAR systems |
US11829059B2 (en) | 2020-02-27 | 2023-11-28 | Gerard Dirk Smits | High resolution scanning of remote objects with fast sweeping laser beams and signal recovery by twitchy pixel array |
US20230113359A1 (en) * | 2020-10-23 | 2023-04-13 | Pathway Innovations And Technologies, Inc. | Full color spectrum blending and digital color filtering for transparent display screens |
US20220360755A1 (en) * | 2020-10-23 | 2022-11-10 | Ji Shen | Interactive display with integrated camera for capturing audio and visual information |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6545670B1 (en) | Methods and apparatus for man machine interfaces and related activity | |
US11630315B2 (en) | Measuring content brightness in head worn computing | |
US10139635B2 (en) | Content presentation in head worn computing | |
KR100921543B1 (en) | A touch pad, a stylus for use with the touch pad, and a method of operating the touch pad | |
US7310090B2 (en) | Optical generic switch panel | |
US6008800A (en) | Man machine interfaces for entering data into a computer | |
JP3067452B2 (en) | Large electronic writing system | |
US5317140A (en) | Diffusion-assisted position location particularly for visual pen detection | |
US20160048160A1 (en) | Content presentation in head worn computing | |
US11790617B2 (en) | Content presentation in head worn computing | |
KR20090060283A (en) | Multi touch sensing display through frustrated total internal reflection | |
JP2014517361A (en) | Camera-type multi-touch interaction device, system and method | |
KR20070045188A (en) | User input apparatus, system, method and computer program for use with a screen having a translucent surface | |
CN101971128A (en) | Interaction arrangement for interaction between a display screen and a pointer object | |
US8899474B2 (en) | Interactive document reader | |
JP2006145645A (en) | Information display apparatus | |
CN1701351A (en) | Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device | |
US11340710B2 (en) | Virtual mouse | |
US11868569B2 (en) | Virtual mouse | |
US20240094851A1 (en) | Virtual mouse | |
US20230409148A1 (en) | Virtual mouse | |
Shimoda et al. | Development of Head-attached Interface Device (HIDE) and its Functional Evaluation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
REMI | Maintenance fee reminder mailed |
FPAY | Fee payment |
Year of fee payment: 4 |
|
SULP | Surcharge for late payment |
AS | Assignment |
Owner name: APPLE INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PRYOR, TIMOTHY R.;REEL/FRAME:024320/0642

Effective date: 20100330
|
FPAY | Fee payment |
Year of fee payment: 8 |
|
FEPP | Fee payment procedure |
Free format text: PAT HOLDER NO LONGER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: STOL); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
FPAY | Fee payment |
Year of fee payment: 12 |