US6545670B1 - Methods and apparatus for man machine interfaces and related activity


Info

Publication number
US6545670B1
Authority
US
United States
Prior art keywords
touch screen
outer member
camera
person
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US09/568,554
Inventor
Timothy R. Pryor
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Timothy R. Pryor
Application filed by Timothy R. Pryor
Priority to US09/568,554
Application granted
Publication of US6545670B1
Assigned to APPLE INC. Assignors: PRYOR, TIMOTHY R. (assignment of assignors interest; see document for details)
Anticipated expiration
Legal status: Expired - Lifetime (current)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0428 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by sensing at the edges of the touch surface the interruption of optical paths, e.g. an illumination plane, parallel to the touch surface which may be virtual
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0425 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/041 - Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F 2203/04109 - FTIR in optical digitiser, i.e. touch detection by frustrating the total internal reflection within an optical waveguide due to changes of optical properties or deformation at the touch location

Definitions

  • the image generated by camera 730 can be digitized and transmitted, if desired, to a remote analysis site 770 for authentication. It is uniquely a D-Sight image, and cannot be copied even if the user's signature, say from a credit card receipt, were available to a forger.
  • a reflective member such as Saran can be placed between paper 720 and glass plate 725 and pressed or sucked into contact with it, and the reflective member (Saran in this case), conforming to the writing material, monitored. If D-Sight is not the optical means used to monitor the force signature, then other means, such as the grid projection described in copending applications, may not require reflective material at all.
  • the apparatus of the above embodiments can be used to determine the location of items in a scene, for example furniture in a house, for which homicide studies or insurance fraud could be an issue (see also the referenced co-pending application for further detail on this application).
  • users may each point a laser pointer at each other, which can be detected by one or more cameras of the invention.
  • each or both may point at an image on the TV screen of the invention.
  • the TV camera picks this up and displays, possibly discreetly, that each liked (or disliked) that image. In this way, mutual likes and dislikes can be registered and noted.
  • the laser pointer of the invention could be supplanted by any optical pointer capable of easy viewing by people, and of sensing.
  • the TV camera of the invention, in whatever location used for sensing laser radiation, can be equipped with an interference filter passing substantially only the laser wavelengths used (assuming all persons using the system use a similar wavelength).

Abstract

Disclosed herein are new forms of computer monitors and displays. Preferred embodiments utilize electro-optical sensors, particularly TV cameras, providing optically inputted data from the display screen or from specialized datums on objects and/or natural features of objects, which may be illuminated using specialized light sources, such as laser pointers. The invention is a continuation of earlier applications aimed at providing affordable methods and apparatus for data communication with respect to people and computers.

Description

This application claims benefit of U.S. Provisional Application No. 60/133,673 filed May 11, 1999.
CROSS REFERENCES TO RELATED CO-PENDING APPLICATIONS BY THE INVENTOR
1. Touch TV and Other Man Machine Interfaces, Ser. No. 09/435,854.
2. More Useful Man Machine Interfaces and Applications, Ser. No. 09/433,297.
3. Target Holes and Corners, U.S. Ser. Nos. 08/203,603 and 08/468,358.
4. Useful Man Machine Interfaces and Applications, Ser. No. 09/138,339.
5. Vision Target Based Assembly, U.S. Ser. Nos. 08/469,429, 08/469,907, 08/470,325, and 08/466,294.
6. Camera Based Applications of Man Machine Interfaces, U.S. provisional application No. 60/142,777.
7. Picture Taking Method and Apparatus, U.S. provisional application No. 60/133,671, filed May 11, 1999.
8. Tactile Touch Screens for Automobile Dashboards, Interiors and Other Applications.
9. Apparel Manufacture and Distance Fashion Shopping in Both Present and Future, filed March 2000.
The disclosures of the above referenced applications are incorporated herein by reference.
FEDERALLY SPONSORED R AND D STATEMENT
Not applicable.
MICROFICHE APPENDIX
Not applicable.
The disclosures of the following U.S. patents and co-pending patent applications are incorporated herein by reference:
1. U.S. Pat. No. 4,629,319 (Panel Surface Flaw inspection, which discloses a novel optical principle commonly called “D Sight”).
2. U.S. Ser. No. 09/435,854 and U.S. Pat. No. 5,982,352, and U.S. Ser. No. 08/290,516 (“Man Machine Interfaces”), filed Aug. 15, 1994, now U.S. Pat. No. 6,008,000, the disclosures of which are contained in that of Ser. No. 09/435,854.
3. U.S. application Ser. No. 09/138,339, Useful Man Machine Interfaces and Applications.
4. U.S. application Ser. No. 09/433,297, More Useful Man Machine Interfaces and Applications.
PROVISIONAL PATENT APPLICATIONS
5. Camera Based Applications of Man-Machine Interfaces, U.S. Ser. No. 60/142,777.
6. Picture Taking Method and Apparatus, U.S. Ser. No. 60/133,671.
7. Methods and Apparatus for Man Machine Interfaces and Related Activity, U.S. Ser. No. 60/133,673.
8. Tactile Touch Screens for Automobile Dashboards, Interiors and Other Applications, Ser. No. 60/183,807, filed Feb. 22, 2000.
9. Apparel Manufacture and Distance Fashion Shopping in Both Present and Future, Ser. No. 60/187,397, filed Mar. 7, 2000.
BACKGROUND OF THE INVENTION
1. Field of the invention
The invention relates to simple input devices for computers, particularly, but not necessarily, intended for use with 3-D graphically intensive activities, and operating by optically sensing a human input to a display screen or other object and/or sensing human positions or orientations. The invention herein is a continuation-in-part of several inventions of mine, listed above.
This continuation application seeks to provide further detail on useful embodiments for computing. One embodiment is a monitor housing for a computer that integrally incorporates digital TV cameras to look at points on the hand or the finger, or objects held in the hand of the user, which are used to input data to the computer. It may also, or alternatively, look at the head of the user.
Further disclosed are improved touch screens and camera-based sensing of laser pointer indications. In several other embodiments, the invention uses real-time stereo photogrammetry or other methods using single or multiple TV cameras whose output is analyzed and used as input to a personal computer, typically to gather data concerning the 3-D location of parts of, or objects held by, a person or persons.
2. Description of Related Art
The above mentioned co-pending applications incorporated by reference discuss many prior art references in various pertinent fields, which form a background for this invention.
Regarding the use of laser pointers to signal or provide information to computers associated with TV displays, the closest reference I can find is U.S. Pat. No. 5,515,079 by Hauck, “Computer input system and method of using same” (incorporated herein by reference). Hauck, however, does not disclose operation in a rear projection context as disclosed here.
Another reference which does use rear projection is U.S. Pat. No. 5,502,514 by Vogeley et al., entitled “Stylus position sensing and digital camera with a digital micromirror device”. This, however, can only be used with a DLP projector comprising such a device, and does not use the simple TV camera based sensing approach of the instant invention.
No reference I have been able to find discusses the unique aspects of the disclosed invention relative to human-interaction-based information and the ability of the input data from the aiming pointer to be spatially encoded.
DESCRIPTION OF FIGURES
FIG. 1 illustrates a computer monitor of the invention integrally incorporating one or two cameras pointing outward from the surface of the bezel facing the user, to observe objects held by the user or parts of the user such as fingers and hands, the bezel preferably including an opaque (to the user) plastic cover for both cameras and light sources. Also illustrated is an additional camera for viewing a user directly or for other purposes.
FIG. 2 illustrates a version of the invention wherein the light illuminating the target datums of the object is itself generated by the monitor viewed by the user or users, said monitor being of the CRT, LED, projected light, scanned laser spot or any other variety.
FIG. 3 illustrates a touch screen of the invention of co-pending application 1 referenced above, having improved screen rigidity. In this case, distortion of the screen occurs primarily in a zone that is able to distort, with the zone supported by a rigid backing member.
FIG. 4 illustrates the use of TV camera based transduction, using the camera for screen distortion determination similar to that of FIG. 3, with the camera also used to determine the position of a laser pointer indication, such as a spot directed by a user on the screen, particularly in response to an image displayed on the screen.
FIG. 5 illustrates a variation of FIG. 4 in which the laser spot is spatially encoded to carry information that is itself then sent to the camera system.
FIG. 6 illustrates an embodiment using laser pointers for acquaintance making purposes, including the use of the laser pointer to designate an image on a television screen using the invention of FIG. 4 or 5 above.
FIG. 7 illustrates handwriting and signature recognition of sensed pencil position for internet commerce and other purposes, including a D-Sight technology based writing pad capable of distortion signature determination.
THE INVENTION EMBODIMENTS
FIG. 1
This embodiment illustrating a computer display with camera(s) and illumination system is an alternative or addition to that of FIG. 1 of copending reference 3 above (Ser. No. 09/138,339).
A PC computer based embodiment is shown in FIG. 1a. In this case, a stereo pair of cameras 100 and 101 is located on each side of the upper surface of monitor 102 (for example a rear projection TV having an 80 inch diagonal screen 104) facing the user, desirably with one or more cover windows 103. In this case a single extensive cover window 103 covers both cameras and their associated light sources 110 and 111, and is mounted flush with the monitor front bezel surface. The LEDs (light sources 110 and 111) in this application are typically used to illuminate targets associated with the fingers, hand, or head of the user, or objects held by the user, such as user 135 with hands 136 and 137, and head 138. These targets are desirably, but not necessarily, retro-reflective, and may be constituted by the object features themselves (e.g. a finger), or by features of clothing worn by the user, or by artificial targets other than retroreflectors.
The cameras are preferably pointed obliquely inward at angles theta, and downward, if desired, at further angles phi toward the center of the desired work volume 140 in front of the monitor, as required (the angles depend on the computer monitor width, the distance of the work zone volume 140 from the monitor, etc.).
Alternatively, or in addition, a single camera such as 120, with light source 121, can be used for determining user or other object positions, both camera and source also optionally located behind a cover window (such as 103).
The cover window 103 is preferably black or dark plastic which lets the LED light source wavelength pass easily, but attenuates sunlight or room lights, thus aiding camera signal to noise in many cases, and making the cameras and light sources substantially invisible to the user (especially if the light sources are in the near infrared) and thus pleasing to the eye and not distracting.
Alternate camera locations may be used, such as in the sides of the monitor bezel, or anywhere desired, for example as appendages to the monitor. They may alternately, or in addition, be at the rear of the keyboard in front of the monitor. In the case of cameras mounted at the rear of the keyboard (toward the display screen), the cameras are also inclined to point toward the user at an angle.
It is noted that an additional camera may be employed for viewing a user directly or for other purposes. For example, if a stereo pair such as 100 and 101 is utilized for position determination, then a third camera such as 120 might be used just for imaging using ambient illumination such as room lights (i.e. LED source 121 is not needed, though it could be provided if desired). When this camera is located far enough from the other two, then light sources such as 110 and 111, located near the optical axes of the other two, do not generally illuminate any retro-reflectors in such a way as to register them on camera 120, due to the limited angular return characteristic of retro-reflectors.
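To make the stereo position determination concrete, here is a minimal sketch of the triangulation such a camera pair could perform. It assumes idealized, rectified pinhole cameras; the baseline, focal length and pixel coordinates below are hypothetical values chosen for illustration, not figures from this disclosure.

```python
import numpy as np

def triangulate(p_left, p_right, baseline_m, focal_px, cx, cy):
    """Estimate the 3-D position of one target (e.g. a retroreflector on a
    finger) from a rectified stereo pair such as cameras 100 and 101.
    p_left / p_right: (x, y) pixel coordinates of the target in each image;
    baseline_m: camera separation in meters; focal_px: focal length in
    pixels; (cx, cy): principal point of the left camera."""
    disparity = p_left[0] - p_right[0]      # horizontal shift between views
    if disparity <= 0:
        raise ValueError("target must have positive disparity")
    z = focal_px * baseline_m / disparity   # depth from the camera baseline
    x = (p_left[0] - cx) * z / focal_px     # lateral position
    y = (p_left[1] - cy) * z / focal_px     # vertical position
    return np.array([x, y, z])

# Hypothetical example: cameras 0.5 m apart, 800 px focal length, target
# seen at (420, 300) in the left image and (380, 300) in the right.
print(triangulate((420, 300), (380, 300), 0.5, 800.0, 320, 240))
```

In practice the target must first be matched between the two images; the bright, localized return from a retroreflector makes that matching comparatively easy.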
FIG. 2
FIG. 2 illustrates a version of the invention of the co-pending applications wherein the light illuminating the target datums of the object is itself generated by the monitor viewed by the user or users, said monitor being of the CRT, LED, Projected light, scanned laser spot or any other variety.
For example, target 201 on finger 202 is illuminated by light 205 from zone 210 of screen 220, on which an image is projected by projection device 221 as shown. Light reflected by the target 201 is imaged by camera 230, such as that employed in the invention of FIG. 1 for example. The color of light 205, or the point (or points) 210 from which it emanates, may be varied by the program of computer 250 controlling screen display driver 260. The control of point location and color allows selective illumination of targets or object features such as fingertip 203, both by choice of color sensitivity response with respect to the target illuminated and, if the target is retroreflective, by choice of screen location within the viewing field of view 225 of TV camera 230. This can be used to select, by choice of the screen-generated light source location, which camera (of two cameras, for example) sees the target.
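A minimal sketch of how such screen-generated illumination might be driven in software follows. The draw_zone and grab_frame hooks are hypothetical stand-ins for display driver 260 and camera 230, which this text does not specify in code form.

```python
import numpy as np

def find_target_with_screen_light(draw_zone, grab_frame, zones):
    """Locate a retroreflective target by lighting screen zones one at a time.

    draw_zone(zone): lights one region of the display, leaving the rest dark
    (hypothetical display-driver hook). grab_frame(): returns the current
    grayscale camera image as a numpy array. Differencing lit and dark
    frames isolates the light returned by the target, and the zone giving
    the strongest return indicates which screen location illuminates it."""
    dark = grab_frame().astype(np.int32)     # baseline frame, no zone lit
    best = None
    for zone in zones:
        draw_zone(zone)
        lit = grab_frame().astype(np.int32)
        diff = np.clip(lit - dark, 0, None)  # light contributed by this zone
        y, x = np.unravel_index(np.argmax(diff), diff.shape)
        if best is None or diff[y, x] > best[0]:
            best = (int(diff[y, x]), zone, (int(x), int(y)))
    return best                              # (return strength, zone, pixel)
```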
FIG. 3
FIG. 3 illustrates a touch screen of the invention as in co-pending invention 1 referenced above, having, however, improved screen rigidity. In this case, distortion of the screen occurs primarily in a region that is able to distort or otherwise be changed, with the region supported by a rigid backing member.
For example, consider transparent screen member 305 whose outer scattering surface 310 is touched, and locally distorted inward, by finger 315. This surface is separated from rigid (typically glass) backing member 320 by optically transparent refractive medium 330 of thickness t, which is compressed by the force of finger 315. Alternatively the medium 330 can be a liquid, such as water or alcohol, that is either compressed or displaced temporarily into a reservoir such as 340 (dotted lines), for example.
Light from light source 350 at the rear passes through member 320 and is reflected off the back surface 311 of screen member 305. Desirably, but not necessarily, the indices of refraction of member 320 and material 330 are closely matched, such that little refraction occurs as light passes from 320 through 330 to surface 311 and back toward camera 370 (after retroreflection by expansive retroreflector 365, typically comprised of Scotchlight 7615 glass bead material), used for example to determine the distortion of 311 via the retroreflection-based “D-Sight” image effect or another optical phenomenon such as disclosed in copending reference 1.
While primarily intended for rear projection application using a TV or computer display image projector, such as a Sharp brand LCD based projector 375, there is also a front projection version in which the distortion of surface 310 is directly observed from the front side, for example by camera 380, also as taught in reference 1 (copending application Ser. No. 09/435,854).
In the embodiment of FIG. 3, I have found it of use to coat the surface 311 with a partially reflecting coating, as was also described in copending application Ser. No. 09/435,854. It is also generally desirable that outer surface 310 of member 305 be relatively hard, such as thin Plexiglas or Mylar.
As an alternative to providing a partially reflective coating (such as is commonly done with vacuum deposition of silver, or with interference coatings, the latter useful if matched to the wavelength used for distortion determination, which can additionally be in the near infrared, e.g. 0.8 microns, where LED sources and TV camera sensitivity are commonplace), it is possible to provide on surface 311 a periodic array of zones of reflective material (for example a reflective dot or stripe 0.05 mm wide, every 1 mm), whose movement as a result of touch is detected using the electro-optical system of the invention. The projection device, if desired, can be programmed to minimize the effect of such reflective or partially reflective zones in the displayed image.
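One plausible way to reduce the camera image to a touch position and force estimate is reference-image differencing, sketched below. The threshold and weighted-centroid measures are assumptions for illustration; the underlying image formation is the D-Sight effect described in reference 1.

```python
import numpy as np

def detect_touch(reference, live, threshold=25):
    """Find a touch from a distortion image such as that seen by camera 370.

    reference: image of the undisturbed screen; live: current image. A press
    on surface 310 distorts surface 311, showing up as a localized brightness
    change in the retroreflected image. The centroid of the changed pixels
    gives the touch position; the summed change is a crude force proxy."""
    diff = np.abs(live.astype(np.int32) - reference.astype(np.int32))
    mask = diff > threshold
    if not mask.any():
        return None                          # no touch detected
    ys, xs = np.nonzero(mask)
    w = diff[ys, xs].astype(np.float64)
    cx = float((xs * w).sum() / w.sum())     # intensity-weighted centroid
    cy = float((ys * w).sum() / w.sum())
    return (cx, cy), float(w.sum())          # position, force proxy
```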
FIG. 4
FIG. 4 illustrates the use of TV camera based transduction of finger touch, using for example camera 410 to determine screen distortion occurring on touch, similar to that of FIG. 3. As disclosed herein, the camera 410 may also be used to determine the position of a laser or other optical pointer indication, such as a spot 420 from laser pointer 425 directed by a user 430 on the screen 440, particularly in response to an image (not shown for clarity) displayed on the screen, for example by image projector 455. For example, if a pony image were displayed, a child could point a pointer at the pony image on the screen, and the program in system computer 460 could acknowledge this with audio feedback from loudspeaker 465 to the child, and if desired record in memory 470 that the child had correctly identified the pony, useful for tracking the child's learning, or for recording scores for game purposes.
The optical pointer function is distinct from that of the physical touch screen indication of FIG. 3 above. Either function can exist independently, or both together. A separate camera such as 411 can alternatively be used to determine laser pointer indication.
This method of laser pointer designation is highly interesting, especially as one considers very large display screens. While finger pointing might be more natural, let's say, actually holding a laser pointer is not much different, and its use on the front projection screen shown in FIG. 4 allows widespread use. A person such as 475 can draw a drawing 487 with a laser pointer 480, for example by successively tracing it on display screen 481, where the successive laser pointer indications on the screen are digitized with the camera system 485 comprising one or more TV cameras connected, via for example an IEEE 1394 FireWire connection, to computer 490 as shown, equipped to digitize the camera inputs. This allows one, for example while sitting in a conference room, to draw modifications onto a drawing and have them digitized by the camera.
The camera can optionally be equipped with a laser wavelength bandpass filter such as 495 in front of it, to make it easy to detect the position of the laser spot anywhere on the big screen, even in bright light surroundings. In a rear projection application such as depicted in FIG. 4a, the brightness of the laser alone is often significant enough to allow reliable detection.
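A minimal sketch of this digitization follows, assuming a grayscale camera view of the screen and a one-time identification of the screen corners in the camera image; the brightness threshold and the corner-based homography are illustrative assumptions, not prescribed here.

```python
import cv2
import numpy as np

def digitize_laser_trace(frames, screen_corners_cam, screen_w, screen_h,
                         min_brightness=200):
    """Turn successive laser-spot sightings into a trace in screen coordinates.

    frames: iterable of grayscale images from a camera system such as 485.
    screen_corners_cam: the screen's four corners (clockwise from top-left)
    as seen in the camera image, used to build a camera-to-screen homography.
    min_brightness: assumed threshold for a frame to contain the spot; a
    bandpass filter such as 495 makes this threshold easy to satisfy."""
    dst = np.float32([[0, 0], [screen_w, 0],
                      [screen_w, screen_h], [0, screen_h]])
    H, _ = cv2.findHomography(np.float32(screen_corners_cam), dst)
    trace = []
    for frame in frames:
        blur = cv2.GaussianBlur(frame, (5, 5), 0)       # suppress hot pixels
        _, max_val, _, max_loc = cv2.minMaxLoc(blur)
        if max_val < min_brightness:
            continue                                    # no spot this frame
        pt = cv2.perspectiveTransform(np.float32([[max_loc]]), H)[0][0]
        trace.append((float(pt[0]), float(pt[1])))      # screen coordinates
    return trace
```

Computer 490 could then feed such a trace into the drawing software as a polyline, or hit-test each point against displayed image regions (the pony of the example above).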
Computer 490 also controls the front projector 496, such that detected laser spot indications from the digitized camera image can be used to modify the software used to generate the image on the screen in whatever manner desired, for example to make a car design larger in a certain area, or to inject a new line on the drawing, or whatever.
A camera such as 485 used to determine laser pointer indications on a front projection screen can also be used to see datums on objects in front of the screen, as discussed in FIG. 1 and the referenced copending applications, for example. These can include natural features of the person 475, such as fingers and hands, or specialized datums such as retroreflectors generally located on apparel or extremities of the user. A camera can also combine determination of laser pointer location with sensing of the screen deflection or another characteristic of touch or other contact with the screen. The camera can also determine the location of target datums on the object as well as laser pointer indications, and other things as well.
FIG. 5
FIG. 5 illustrates a variation of FIG. 4 in which the laser spot is spatially encoded to carry information. Such information can be in the form of a shape (such as a heart 501 projected by laser pointer 502, whose beam is transmitted through grating 505), an alphanumeric character, or anything else desired. It is also foreseen that such information can be easily changed by the user, either by changing fixed lasers, by selecting different lasers with different spatially encoding holographic gratings, or by having a turret of such gratings in front of a single laser. The color of the laser can also be changed, with certain colors signifying desired actions. Tunable wavelength lasers make this easier today.
The information can be projected directly on an object, or on a front or rear projection screen displaying other information. The projected information can also be sensed as in FIG. 4, using a TV camera such as 530, viewing an object such as a chair 540, or alternatively a screen such as 481 on which the information is projected.
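As a rough illustration of sensing such spatially encoded information, the sketch below compares the camera image against a small library of known patterns by normalized cross-correlation. The template library, the score threshold, and the assumption that scale and perspective have already been normalized are all hypothetical simplifications.

```python
import cv2

def classify_projected_symbol(camera_image, templates, min_score=0.6):
    """Decide which spatially encoded symbol (heart, character, ...) a
    pointer such as 502 is projecting, as seen by a camera such as 530.

    camera_image: grayscale view of the surface carrying the projection.
    templates: dict mapping symbol name -> small grayscale template image
    (a hypothetical, pre-built library). Returns the best-matching name,
    or None if nothing scores above min_score."""
    best_name, best_score = None, min_score
    for name, tmpl in templates.items():
        result = cv2.matchTemplate(camera_image, tmpl, cv2.TM_CCOEFF_NORMED)
        score = float(result.max())
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```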
FIG. 6
FIG. 6 illustrates embodiments using laser pointers for acquaintance making purposes. This embodiment of the invention is particularly illustrated here for social purposes, but any application for which it is useful is contemplated.
In particular, it is adapted to signaling one's wish to meet a person, particularly of the opposite sex, for the purpose of dating, etc. It more particularly concerns use in a bar, restaurant or other social scene, using a laser pointer or other means to point at or near a person at another table or across the room, with information indicated on the laser beam so pointed, in a way that would indicate an interest.
A common problem in social interaction is finding someone of the opposite sex with whom to start a dialogue. This is a famous problem for persons, typically males, who need to approach females for this purpose. It is difficult because the females are often in a group, which makes it more embarrassing, and in any case there is always the fear of rejection.
I am roughly aware of various types of radio transponders and so forth for dealing with this issue that have been proposed in Japan, for example, where a transponder on a girl would indicate her availability and would match up with the transponder signals of a guy.
However, this particular invention is more direct, because it concerns actually pointing a signal into an area at or near the person in question, much as you might wave your hand or do something else, but in this case it is subtle and less embarrassing. For example, if one sits in a crowded restaurant and waves their hand in the air at somebody, everyone sees that, whereas if you aim a laser beam at the coffee cup of the person in question, no one sees it but the person in question and the company they are with. This is a major difference.
The other question is: what information does the laser beam carry? Clearly in the first case of the invention, one simply signals with a laser beam, period. In this case, the person looks back to see who is sending the signal, but does one know what the signal means?
In the second case, similar to FIG. 5 above, one sends, for example, a heart or some other spatially extended pattern signaling the particular idea. This can be purposely aimed to project onto a person's clothing, or onto a glass or whatever on a table in a bar or restaurant, for example.
In addition, one can project information that actually carries data, such as contact information details. For example, as the cellular phone becomes more prevalent, one idea is to project a person's cell phone number with a holographic grating or other mechanism for generating such a pattern at a distance. If you have your cell phone on, and so does the other party, dialogue can be initiated immediately. In addition, the person doesn't have to embarrass themselves by looking back around to see who shot the beam, so to speak.
It is true that this message could be sent by radio control, but the question is how you could decide to whom it was sent. This gets back to the other invention, where only those people who wish to have something sent would have the transponder on; but again, you only wish to send it to one particular person. This is why the laser becomes useful.
Another idea is to send a message concerning one's e-mail address. This then allows the person to correspond later, in an even less obvious way without even having to talk in person.
Another idea is transponders at the other end that would beep back. For example, you could put a cigarette lighter that actually was a transponder on the edge of your restaurant table. If someone hit that with a laser beam, it would light up, send a beam back, or do something else that serves as an indicator. Certainly something on one's clothes would be another possibility, or something on a purse, or on another article that might be obvious for the other person to shoot at.
In addition, colors can mean different things. For example, blue could be something like “let's dance”, and red could be “let's leave”, or whatever. Progress in colored lasers will make this a reality soon, insofar as laser sources for the indication are concerned.
And then we come to whether one could also modulate the laser, which is easy technically, but more costly. For example, a laser modulated at a certain frequency would interact with a detector located, say, in a purse, which would then give out a certain response, either to the person being signaled or back to the signaler.
One can imagine that in FIG. 5 a laser pointer is being held in the hand of the signaler, who aims it at the coffee cup of a potential acquaintance sitting at a table. The method here is that the user aims the laser and triggers it to send an optical indication signal to hit the coffee cup or any other point (such as the chair illustrated in FIG. 5) that is visible to the potential acquaintance, thereby, in this simple case, signaling a message to look at the person who is signaling.
As also discussed relative to FIG. 5, a holographic or other pattern-generating element on the front of the laser 502 can be used to make such a signal, but with a message carried spatially in a more descriptive manner. This message may be either a pattern, a word, or even a phrase, such as “let's get together” or something, or conversely a phone number, e-mail address, or other useful piece of information.
FIG. 6 shows an embodiment of a transponder device responsive to a laser-directed signal, such as that from laser 600, aimed at items on the potential acquaintance's table, for example ashtray 620 placed on the table 630. The device has the ability to signal back to the potential acquaintance or, in another version, to be programmed to signal to any sender (or a sender having a particular code) that the person is interested, or not interested, as the case may be.
But what could this return signal be? It could either be a modulated signal that can be detected through modulated infrared or radio respondent means, for example, or it could be a visible signal capable of lighting up and showing that the person is interested. This is shown by LED 632, attached to ashtray 620 and responsive to the signal from photovoltaic detector 635, connected to a readout capable of energizing said LED on receipt of a desired signal, such as that produced by laser beam 640 from the laser pointer 600. The detector and LED are driven and powered by a circuit and batteries, not shown. The detector would be responsive over a wide range of angles, ideally 360 degrees. Alternatively, multiple detectors facing in different directions could be used; this would also give an indication of the direction from which the signal came. Each detector could have a corresponding LED as well, making the indication easy to see. Or other means of communicating the arrival and/or direction of a signal could be used, such as generating an audio signal or radio signal.
Laser 600 can also be equipped with modulator 655, modulating the laser beam with a signal 670 that is responsive to the needs of the signaler. For example, if a key energizing the modulator is pressed three times, the unit is set up to modulate at a frequency that will be demodulated at the other end and indicate that the signaler wants to meet. This could be acknowledged by having LED 632 simply blink three times in rapid succession, for example.
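A minimal sketch of the receiving side of such a keyed signal follows, assuming the output of photovoltaic detector 635 has been sampled into a normalized array; the comparator threshold and pulse timing are invented for illustration.

```python
import numpy as np

def count_laser_pulses(samples, threshold=0.5):
    """Count laser bursts seen by a detector such as photovoltaic cell 635.

    samples: detector output normalized to the range 0..1; threshold is an
    assumed comparator level. Three presses of the signaler's key produce
    three bursts, which the receiver could acknowledge by blinking LED 632
    the same number of times. Returns the number of rising edges."""
    binary = np.concatenate(([False], np.asarray(samples) > threshold))
    return int(np.count_nonzero(binary[1:] & ~binary[:-1]))

# Hypothetical example: 1 kHz samples containing three 50 ms bursts.
t = np.arange(0.0, 1.0, 0.001)
sig = (((t % 0.2) < 0.05) & (t < 0.6)).astype(float)
print(count_laser_pulses(sig))   # -> 3
```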
A return signal to signal 670 could indicate that the person can't meet right now but would like to meet at some time in the future, and could automatically signal back, modulated on the beam, the person's phone number or cell number, e-mail address, or whatever. This makes it easy for one to have an ashtray or handbag that has the receptor for this particular number.
The issue there, though, is whether one really wants to meet this person at all, since in some cases it is relatively easy to receive the signal. Only by looking back to see where the signal came from can you ascertain whether you want to meet the person. At this point, the signal could also convey data as to what the person is like; their age, weight, height, physical dimensions, educational background, all of that could be encoded on the signal that is transmitted. This, of course, gives much more data to the receiving person, from which they can make a decision as to whether they want to signal back. This readout can be made either with an automatic voice actualization, or it can be made into a computer system or whatever; for example, the computer sitting on your desk can do this.
Nonetheless, how does one actually see who is sending the signal? The more obtrusive the laser signal, the less discreet it is, and a preferred goal of this invention is to provide a discreet method of signaling. The whole point of using a laser beam is that it is very discreet: no one can hear or see it except in a region near an impacted object or in the direct line of sight. For this reason, a mechanism may be desirable to look back along the line of sight; an optical system buried in a purse, handbag, ashtray, or the like can do this.
Another idea is to have lights over each table. These lights would be energized by the laser pointer, an easy shot even from a long distance, and could then light up with either the message carried by the laser pointer or some standard message.
Another use of the laser pointer is to point at the screen of a projection TV and sense the spot from behind using a camera, as disclosed in FIG. 4 for other purposes.
When pointing at the screen this way, one can aim the spot at whatever image appears on the screen and correlate the image of the moment, or a text box, icon, or other displayed item, to the pointing indication. One can also project some sort of message that can be read by the camera. This message can then either be displayed directly onto the screen, which requires no machine reading at all, or be machine read (character recognition, for example) and used as information input to the computer. In other words, the laser would project a specially encoded message of the user's, rather than the more typical modulated message.
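As an illustrative sketch of the spot-to-image correlation just described, the following assumes a hypothetical icon layout and a camera that reports the spot in screen coordinates; both are assumptions for illustration.

```python
# Sketch of correlating a sensed laser-spot position with whatever icon is
# displayed there. The icon layout and coordinate mapping are assumptions.

# Assumed icon layout: name -> (x0, y0, x1, y1) in screen pixel coordinates.
ICONS = {"text box": (0, 0, 400, 300), "photo": (400, 0, 800, 300)}

def icon_at(spot_xy):
    """Return the displayed item under the laser spot, if any."""
    x, y = spot_xy
    for name, (x0, y0, x1, y1) in ICONS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

# The rear camera reports the spot at (550, 120): the pointer is on the photo.
print(icon_at((550, 120)))  # -> photo
```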
The alternative of a modulated message is also possible, in which the laser puts out a pulse-, frequency-, or amplitude-modulated code and some sort of detection system reads it. The simplest arrangement is a single analog sensor looking at the back of the screen to demodulate the signal and tell which laser was used (in a room full of potential signaling users) or what signal was encoded in the modulated message.
Another idea is a lighting fixture over a table that receives the encoded messages, either encoded in a time-based fashion (pulse width, pulse spacing, frequency, etc.) or spatially encoded. Spatial encoding has the advantage that it requires no electronic system at all; the human recipient can see the message directly. However, it is less versatile, since to change the data one must change the spatial encoding masks, be they holographic gratings or otherwise.
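The time-based encodings mentioned above could be demodulated by a single analog sensor roughly as in the following sketch; the sample rate, per-pointer tone table, and zero-crossing method are all assumptions.

```python
# Minimal decoder sketch for frequency-coded pointers read by one analog
# sensor. Each pointer is assumed to modulate at its own carrier frequency.

import math

SAMPLE_RATE = 20_000          # assumed analog-sensor sample rate, Hz
POINTER_TONES = {1000: "pointer A", 1500: "pointer B", 3000: "pointer C"}

def estimate_frequency(samples):
    """Estimate the carrier frequency by counting rising zero crossings."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a <= 0 < b)
    return crossings * SAMPLE_RATE / len(samples)

def identify_pointer(samples, tolerance_hz=100):
    """Match the estimated frequency to a known pointer's carrier tone."""
    f = estimate_frequency(samples)
    for tone, who in POINTER_TONES.items():
        if abs(f - tone) < tolerance_hz:
            return who
    return "unknown"

# Synthesize a 0.1 s test burst at 1.5 kHz and identify its hypothetical sender.
burst = [math.sin(2 * math.pi * 1500 * t / SAMPLE_RATE) for t in range(2000)]
print(identify_pointer(burst))  # -> pointer B
```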
One can also generate such a phase mask by other means, through what have historically been called light valves, but this is complex.
The goal here is to reduce this to the simplest type of system usable by large numbers of people.
Another embodiment of the invention may utilize a TV camera in place of the single detector 635 to detect the incoming radiation from laser pointer 600. In this case, the camera system can be carried, for example, in a handbag, with the added benefit of presenting to the owner an image of where the laser beam is coming from.
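A minimal sketch of how such a camera frame could localize the incoming beam follows; the toy frame and the brightest-pixel method are illustrative assumptions.

```python
# Hedged sketch: locate the incoming laser spot in a small camera frame; the
# spot's offset from center hints at the beam's arrival direction.
# The frame contents and brightness scale are made-up test data.

def brightest_pixel(frame):
    """Return (row, col) of the peak intensity in a 2-D frame."""
    return max(
        ((r, c) for r, row in enumerate(frame) for c, _ in enumerate(row)),
        key=lambda rc: frame[rc[0]][rc[1]],
    )

# 4x6 toy frame; the saturated value 255 marks the laser spot.
frame = [
    [3, 5, 2, 4, 1, 2],
    [4, 6, 255, 7, 3, 2],
    [2, 4, 5, 3, 2, 1],
    [1, 2, 3, 2, 1, 1],
]
row, col = brightest_pixel(frame)
print(f"spot at row {row}, col {col}; offset from center hints at direction")
```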
FIG. 7
Illustrated in FIG. 7 is handwriting and signature recognition based on sensed pen position, such as for internet commerce and other purposes, including a D-Sight pad. As shown, user 701 with pen 710 writes his signature on paper 720, which rests on glass plate 725. The backside 726 of the paper is reflective, and using camera 730, retroreflector 735, and light source 740, a D-Sight image (using the D-Sight effect; see Ref. 1, U.S. Pat. No. 4,629,319) is created, viewed by camera 730, and analyzed where desired by computer 750. This image is a function both of the xy position of the pen and of the force used (which force, with some writing instruments such as brushes and some papers, is proportional to the width of the mark produced).
Alternatively, the image generated by camera 730 can be digitized and transmitted, if desired, to a remote analysis site 770 for authentication. It is uniquely a D-Sight image and cannot be copied, even if the user's signature, say from a credit card receipt, were available to a forger.
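Purely as an illustration of trace-based verification, the following sketch compares an (x, y, force) probe against an enrolled reference; the distance measure and threshold are assumptions, and this does not reproduce the D-Sight optics themselves.

```python
# Hedged sketch of authenticating a force-augmented signature by comparing a
# fresh (x, y, force) trace against an enrolled reference trace.

import math

def trace_distance(ref, probe):
    """Mean Euclidean distance between equal-length (x, y, force) traces."""
    return sum(math.dist(a, b) for a, b in zip(ref, probe)) / min(
        len(ref), len(probe)
    )

def authenticate(ref, probe, threshold=2.0):
    """Accept the probe if its mean distance to the reference is small."""
    return trace_distance(ref, probe) < threshold

reference = [(0, 0, 1.0), (5, 2, 1.4), (9, 1, 0.8)]   # enrolled signature
attempt = [(0, 1, 1.1), (5, 2, 1.3), (10, 1, 0.9)]    # fresh sample
print("accepted" if authenticate(reference, attempt) else "rejected")
```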
If the paper is not sufficiently reflective, a reflective member such as saran can be placed between paper 720 and glass plate 725 and pressed or sucked into contact with it, and the reflective member (saran in this case), conforming to the writing material, is monitored. If D-Sight is not the optical means used to monitor the force signature, other means, such as the grid projection described in copending applications, may not require reflective material at all.
Other Points
Clearly the apparatus of the above embodiments can be used to determine the location of items in a scene, for example furniture in a house, where homicide studies or insurance fraud could be an issue (see also the referenced co-pending application for further detail on this application).
In addition, it is noted that users may each point a laser pointer at the other, which can be detected by one or more cameras of the invention, or each or both may point at an image on the TV screen of the invention. For example, if an image in which both users were interested was shown on the screen, both could point at it with their encoded laser pointers (e.g., the girl's pointer carrying her name, the boy's his); the TV camera then picks this up and displays, possibly discreetly, that each liked (or disliked) that image. In this way, mutual likes and dislikes can be registered and noted.
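A sketch of registering such mutual likes from decoded pointer events follows; the user names, image identifiers, and event format are illustrative assumptions.

```python
# Sketch of mutual-interest matching: two encoded pointers hit the same
# on-screen image and the system records the match. All names are assumptions.

from collections import defaultdict

def register_pointer_events(events):
    """events: (user_id, image_id) pairs decoded from the pointer beams.
    Returns {image_id: set of users} and the images pointed at by 2+ users."""
    likes = defaultdict(set)
    for user, image in events:
        likes[image].add(user)
    mutual = {img: users for img, users in likes.items() if len(users) >= 2}
    return likes, mutual

events = [("girl", "img_42"), ("boy", "img_42"), ("boy", "img_7")]
_, mutual = register_pointer_events(events)
for img, users in mutual.items():
    print(f"{' and '.join(sorted(users))} both pointed at {img}")
```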
In reference 1, the idea of a simulator was discussed for application to various airplane dashboards, for example. One can also use such a touch screen for an actual airplane or car dashboard, or a portion thereof, and change one's mind about where certain functions reside simply by reprogramming the video and touch screen of the invention.
The laser pointer of the invention could be supplanted by any optical pointer whose output is easily viewed by people and easily sensed. The TV camera of the invention, in whatever location it is used for sensing laser radiation, can be equipped with an interference filter passing substantially only the laser wavelengths used (assuming all persons using the system use a similar wavelength).

Claims (17)

What is claimed is:
1. A touch screen, comprising
a deformable outer member on which visual information is displayed to a person viewing the screen and which is deformable by touching to provide an interaction with the person,
a transparent inner member which is relatively more rigid than the outer member,
a transparent medium, located between said inner and outer members, and
an electro-optical means for determining a presence of the interaction of the person with the outer member, said electro-optical means being located in a space behind and spaced from the inner member.
2. A touch screen according to claim 1 wherein said medium is compressed when said outer member is touched.
3. A touch screen according to claim 1 wherein said medium is displaced when said outer member is touched.
4. A touch screen according to claim 1 wherein said electro-optical means includes a TV camera means to view said outer member.
5. Apparatus according to claim 1, further comprising a reflective coating on an inner surface of said outer member.
6. A touch screen according to claim 1, wherein the visual information is rear projected onto said outer member.
7. A touch screen according to claim 1, wherein said medium is a liquid.
8. A touch screen according to claim 1, wherein said medium is a solid.
9. A touch screen according to claim 7, wherein electro-optical means further includes a light source adjacent said TV camera view which projects light onto an inner surface of said outer member and a retro-reflector to which light reflected from the inner surface is directed and returned back to the inner surface for viewing by the TV camera.
10. A rear projection touch screen, comprising
a deformable outer member on which visual information is displayed to a person viewing the screen and which is deformable by touching to provide an interaction with the person,
a transparent inner member which is relatively more rigid than the outer member,
a transparent medium, located between said inner and outer members, and
an electro-optical means for determining a location of the interaction of the person with the outer member, said electro-optical means being located in a space behind and spaced from the inner member.
11. A touch screen according to claim 10 wherein said medium is compressed when said outer member is touched.
12. A touch screen according to claim 10 wherein said medium is displaced when said outer member is touched.
13. A touch screen according to claim 1 wherein said electro-optical means includes a TV camera means to view said outer member.
14. A touch screen according to claim 10, wherein the visual information is rear projected onto said outer member.
15. A touch screen according to claim 10, wherein said medium is a liquid.
16. A touch screen according to claim 10, wherein said medium is a solid.
17. A touch screen according to claim 16, wherein electro-optical means further includes a light source adjacent said TV camera view which projects light onto an inner surface of said outer member and a retro-reflector to which light reflected from the inner surface is directed and returned back to the inner surface for viewing by the TV camera.
US09/568,554 1999-05-11 2000-05-11 Methods and apparatus for man machine interfaces and related activity Expired - Lifetime US6545670B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09/568,554 US6545670B1 (en) 1999-05-11 2000-05-11 Methods and apparatus for man machine interfaces and related activity

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13367399P 1999-05-11 1999-05-11
US09/568,554 US6545670B1 (en) 1999-05-11 2000-05-11 Methods and apparatus for man machine interfaces and related activity

Publications (1)

Publication Number Publication Date
US6545670B1 true US6545670B1 (en) 2003-04-08

Family

ID=26831576

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/568,554 Expired - Lifetime US6545670B1 (en) 1999-05-11 2000-05-11 Methods and apparatus for man machine interfaces and related activity

Country Status (1)

Country Link
US (1) US6545670B1 (en)

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3757322A (en) * 1971-02-03 1973-09-04 Hall Barkan Instr Inc Transparent touch controlled interface with interreactively related display
US3748751A (en) * 1972-09-07 1973-07-31 Us Navy Laser machine gun simulator
US4017848A (en) * 1975-05-19 1977-04-12 Rockwell International Corporation Transparent keyboard switch and array
US4772028A (en) * 1987-08-27 1988-09-20 Rockhold Christopher K Electronic shootout game
US4948371A (en) * 1989-04-25 1990-08-14 The United States Of America As Represented By The United States Department Of Energy System for training and evaluation of security personnel in use of firearms
US5515079A (en) 1989-11-07 1996-05-07 Proxima Corporation Computer input system and method of using same
US5495269A (en) * 1992-04-03 1996-02-27 Xerox Corporation Large area electronic writing system
US5328190A (en) * 1992-08-04 1994-07-12 Dart International, Inc. Method and apparatus enabling archery practice
US5982352A (en) * 1992-09-18 1999-11-09 Pryor; Timothy R. Method for providing human input to a computer
US6008800A (en) * 1992-09-18 1999-12-28 Pryor; Timothy R. Man machine interfaces for entering data into a computer
US5613913A (en) * 1994-04-06 1997-03-25 Sega Enterprises, Ltd. Method for developing attractions in a shooting game system
US5649706A (en) * 1994-09-21 1997-07-22 Treat, Jr.; Erwin C. Simulator and practice method
US5502514A (en) 1995-06-07 1996-03-26 Nview Corporation Stylus position sensing and digital camera with a digital micromirror device
US20010012001A1 (en) * 1997-07-07 2001-08-09 Junichi Rekimoto Information input apparatus
US6441807B1 (en) * 1997-09-03 2002-08-27 Plus Industrial Corporation Display system
US6339748B1 (en) * 1997-11-11 2002-01-15 Seiko Epson Corporation Coordinate input system and display apparatus

Cited By (161)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9513744B2 (en) 1994-08-15 2016-12-06 Apple Inc. Control systems employing novel physical controls and touch screens
US20080211779A1 (en) * 1994-08-15 2008-09-04 Pryor Timothy R Control systems employing novel physical controls and touch screens
US20090322499A1 (en) * 1995-06-29 2009-12-31 Pryor Timothy R Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
US9758042B2 (en) 1995-06-29 2017-09-12 Apple Inc. Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
US8427449B2 (en) 1995-06-29 2013-04-23 Apple Inc. Method for providing human input to a computer
US8228305B2 (en) 1995-06-29 2012-07-24 Apple Inc. Method for providing human input to a computer
US8610674B2 (en) 1995-06-29 2013-12-17 Apple Inc. Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
US20090300531A1 (en) * 1995-06-29 2009-12-03 Pryor Timothy R Method for providing human input to a computer
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US9239673B2 (en) 1998-01-26 2016-01-19 Apple Inc. Gesturing with a multipoint sensing device
US7684592B2 (en) 1998-08-10 2010-03-23 Cybernet Systems Corporation Realtime object tracking system
US20090116692A1 (en) * 1998-08-10 2009-05-07 Paul George V Realtime object tracking system
US9304593B2 (en) 1998-08-10 2016-04-05 Cybernet Systems Corporation Behavior recognition system
US7050606B2 (en) * 1999-08-10 2006-05-23 Cybernet Systems Corporation Tracking and gesture recognition system particularly suited to vehicular control applications
US20070195997A1 (en) * 1999-08-10 2007-08-23 Paul George V Tracking and gesture recognition system particularly suited to vehicular control applications
US20020126876A1 (en) * 1999-08-10 2002-09-12 Paul George V. Tracking and gesture recognition system particularly suited to vehicular control applications
US6608648B1 (en) * 1999-10-21 2003-08-19 Hewlett-Packard Development Company, L.P. Digital camera cursor control by sensing finger position on lens cap
US20090273563A1 (en) * 1999-11-08 2009-11-05 Pryor Timothy R Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
US8482535B2 (en) 1999-11-08 2013-07-09 Apple Inc. Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
US20010020933A1 (en) * 2000-02-21 2001-09-13 Christoph Maggioni Method and configuration for interacting with a display visible in a display window
US7034807B2 (en) * 2000-02-21 2006-04-25 Siemens Aktiengesellschaft Method and configuration for interacting with a display visible in a display window
US8576199B1 (en) 2000-02-22 2013-11-05 Apple Inc. Computer control systems
US7137711B1 (en) 2000-03-21 2006-11-21 Leonard Reiffel Multi-user retro reflector data input
US7000840B2 (en) 2000-05-03 2006-02-21 Leonard Reiffel Dual mode data imaging product
US7034803B1 (en) 2000-08-18 2006-04-25 Leonard Reiffel Cursor display privacy product
US20040125224A1 (en) * 2000-08-18 2004-07-01 Leonard Reiffel Annotating imaged data product
US7161581B2 (en) 2000-08-18 2007-01-09 Leonard Reiffel Annotating imaged data product
US8040328B2 (en) 2000-10-11 2011-10-18 Peter Smith Books, papers, and downloaded information to facilitate human interaction with computers
US20080122805A1 (en) * 2000-10-11 2008-05-29 Peter Smith Books, papers, and downloaded information to facilitate human interaction with computers
US20050102332A1 (en) * 2000-12-15 2005-05-12 Leonard Reiffel Multi-imager multi-source multi-use coded data source data iInput product
US6945460B2 (en) 2000-12-15 2005-09-20 Leonard Reiffel Imaged coded data source transducer product
US20040027455A1 (en) * 2000-12-15 2004-02-12 Leonard Reiffel Imaged coded data source tracking product
US7184075B2 (en) 2000-12-15 2007-02-27 Leonard Reiffel Imaged coded data source tracking product
US20040041027A1 (en) * 2000-12-15 2004-03-04 Leonard Reiffel Imaged coded data source transducer product
US7099070B2 (en) 2000-12-15 2006-08-29 Leonard Reiffel Multi-imager multi-source multi-use coded data source data input product
US20030001818A1 (en) * 2000-12-27 2003-01-02 Masaji Katagiri Handwritten data input device and method, and authenticating device and method
US6947029B2 (en) * 2000-12-27 2005-09-20 Masaji Katagiri Handwritten data input device and method, and authenticating device and method
US20080024463A1 (en) * 2001-02-22 2008-01-31 Timothy Pryor Reconfigurable tactile control display applications
US20080088587A1 (en) * 2001-02-22 2008-04-17 Timothy Pryor Compact rtd instrument panels and computer interfaces
US7377438B2 (en) 2001-04-19 2008-05-27 Leonard Reiffel Combined imaging coded data source data acquisition
US20070187506A1 (en) * 2001-04-19 2007-08-16 Leonard Reiffel Combined imaging coded data source data acquisition
US20030048280A1 (en) * 2001-09-12 2003-03-13 Russell Ryan S. Interactive environment using computer vision and touchscreens
US7027041B2 (en) * 2001-09-28 2006-04-11 Fujinon Corporation Presentation system
US20030063260A1 (en) * 2001-09-28 2003-04-03 Fuji Photo Optical Co., Ltd. Presentation system
US9606668B2 (en) 2002-02-07 2017-03-28 Apple Inc. Mode-based graphical user interfaces for touch sensitive input devices
US20030184645A1 (en) * 2002-03-27 2003-10-02 Biegelsen David K. Automatic camera steering control and video conferencing
US7969472B2 (en) * 2002-03-27 2011-06-28 Xerox Corporation Automatic camera steering control and video conferencing
US20040008185A1 (en) * 2002-03-29 2004-01-15 Mitac International Corp. Data processing device, presentation device, and projection method for presentation
US20030222849A1 (en) * 2002-05-31 2003-12-04 Starkweather Gary K. Laser-based user input device for electronic projection displays
US8314773B2 (en) 2002-09-09 2012-11-20 Apple Inc. Mouse having an optically-based scrolling feature
US20080065291A1 (en) * 2002-11-04 2008-03-13 Automotive Technologies International, Inc. Gesture-Based Control of Vehicular Components
US20080192027A1 (en) * 2002-11-08 2008-08-14 Morrison James C Interactive window display
US7978184B2 (en) * 2002-11-08 2011-07-12 American Greetings Corporation Interactive window display
EP2213501A2 (en) 2003-03-31 2010-08-04 Timothy R. Pryor Reconfigurable vehicle instrument panels
EP2581248A1 (en) 2003-03-31 2013-04-17 Timothy R. Pryor Reconfigurable vehicle instrument panels
US20060291797A1 (en) * 2003-05-27 2006-12-28 Leonard Reiffel Multi-imager multi-source multi-use coded data source data input product
US7355561B1 (en) 2003-09-15 2008-04-08 United States Of America As Represented By The Secretary Of The Army Systems and methods for providing images
US9239677B2 (en) 2004-05-06 2016-01-19 Apple Inc. Operation of a computer with touch screen interface
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US8239784B2 (en) 2004-07-30 2012-08-07 Apple Inc. Mode-based graphical user interfaces for touch sensitive input devices
US11036282B2 (en) 2004-07-30 2021-06-15 Apple Inc. Proximity detector in handheld device
US10042418B2 (en) 2004-07-30 2018-08-07 Apple Inc. Proximity detector in handheld device
US8612856B2 (en) 2004-07-30 2013-12-17 Apple Inc. Proximity detector in handheld device
US9348458B2 (en) 2004-07-30 2016-05-24 Apple Inc. Gestures for touch sensitive input devices
US20070229233A1 (en) * 2004-08-02 2007-10-04 Dort David B Reconfigurable tactile-enhanced display including "tap-and-drop" computing system for vision impaired users
US20100027843A1 (en) * 2004-08-10 2010-02-04 Microsoft Corporation Surface ui for gesture-based interaction
US20060036944A1 (en) * 2004-08-10 2006-02-16 Microsoft Corporation Surface UI for gesture-based interaction
US8560972B2 (en) 2004-08-10 2013-10-15 Microsoft Corporation Surface UI for gesture-based interaction
WO2006025872A2 (en) * 2004-08-27 2006-03-09 International Business Machines Corporation User input apparatus, system, method and computer program for use with a screen having a translucent surface
US20060044282A1 (en) * 2004-08-27 2006-03-02 International Business Machines Corporation User input apparatus, system, method and computer program for use with a screen having a translucent surface
WO2006025872A3 (en) * 2004-08-27 2008-11-20 Ibm User input apparatus, system, method and computer program for use with a screen having a translucent surface
US8508710B2 (en) * 2004-12-02 2013-08-13 Hewlett-Packard Development Company, L.P. Display panel
US20060119798A1 (en) * 2004-12-02 2006-06-08 Huddleston Wyatt A Display panel
CN102831387B (en) * 2005-01-07 2016-12-14 高通股份有限公司 Object in detect and track image
CN102831387A (en) * 2005-01-07 2012-12-19 高通股份有限公司 Detecting and tracking objects in images
US20060238493A1 (en) * 2005-04-22 2006-10-26 Dunton Randy R System and method to activate a graphical user interface (GUI) via a laser beam
JP2007072637A (en) * 2005-09-06 2007-03-22 Hitachi Ltd Input device using elastic material
JP4635788B2 (en) * 2005-09-06 2011-02-23 株式会社日立製作所 Input device using elastic material
US20070063982A1 (en) * 2005-09-19 2007-03-22 Tran Bao Q Integrated rendering of sound and image on a display
US8077147B2 (en) 2005-12-30 2011-12-13 Apple Inc. Mouse with optical sensing surface
US20130050549A1 (en) * 2006-01-04 2013-02-28 Apple Inc. Embedded camera with privacy filter
CN104702827A (en) * 2006-01-04 2015-06-10 苹果公司 Embedded camera with privacy filter
US8797451B2 (en) * 2006-01-04 2014-08-05 Apple Inc. Embedded camera with privacy filter
US20070171891A1 (en) * 2006-01-26 2007-07-26 Available For Licensing Cellular device with broadcast radio or TV receiver
US20070222734A1 (en) * 2006-03-25 2007-09-27 Tran Bao Q Mobile device capable of receiving music or video content from satellite radio providers
US20110230232A1 (en) * 2006-05-12 2011-09-22 Tran Bao Q Systems and methods for video editing
US20070262995A1 (en) * 2006-05-12 2007-11-15 Available For Licensing Systems and methods for video editing
US7827491B2 (en) 2006-05-12 2010-11-02 Tran Bao Q Systems and methods for video editing
US20080096651A1 (en) * 2006-07-28 2008-04-24 Aruze Corp. Gaming machine
US20090027357A1 (en) * 2007-07-23 2009-01-29 Smart Technologies, Inc. System and method of detecting contact on a display
US8094137B2 (en) 2007-07-23 2012-01-10 Smart Technologies Ulc System and method of detecting contact on a display
US10962867B2 (en) 2007-10-10 2021-03-30 Gerard Dirk Smits Method, apparatus, and manufacture for a tracking camera or detector with fast asynchronous triggering
US8696141B2 (en) * 2007-10-10 2014-04-15 Gerard Dirk Smits Method, apparatus, and manufacture for a tracking camera or detector with fast asynchronous triggering
US9581883B2 (en) 2007-10-10 2017-02-28 Gerard Dirk Smits Method, apparatus, and manufacture for a tracking camera or detector with fast asynchronous triggering
US20130229669A1 (en) * 2007-10-10 2013-09-05 Gerard Dirk Smits Method, apparatus, and manufacture for a tracking camera or detector with fast asynchronous triggering
US20100217433A1 (en) * 2007-10-16 2010-08-26 Hyun Dong Son Store management system capable of switching between manned or unmanned sales
US20100008582A1 (en) * 2008-07-10 2010-01-14 Samsung Electronics Co., Ltd. Method for recognizing and translating characters in camera-based image
US8810522B2 (en) 2008-09-29 2014-08-19 Smart Technologies Ulc Method for selecting and manipulating a graphical object in an interactive input system, and interactive input system executing the method
US20100079409A1 (en) * 2008-09-29 2010-04-01 Smart Technologies Ulc Touch panel for an interactive input system, and interactive input system incorporating the touch panel
US20100079385A1 (en) * 2008-09-29 2010-04-01 Smart Technologies Ulc Method for calibrating an interactive input system and interactive input system executing the calibration method
US20100079493A1 (en) * 2008-09-29 2010-04-01 Smart Technologies Ulc Method for selecting and manipulating a graphical object in an interactive input system, and interactive input system executing the method
US20100083109A1 (en) * 2008-09-29 2010-04-01 Smart Technologies Ulc Method for handling interactions with multiple users of an interactive input system, and interactive input system executing the method
US8416206B2 (en) 2009-07-08 2013-04-09 Smart Technologies Ulc Method for manipulating a graphic widget in a three-dimensional environment displayed on a touch panel of an interactive input system
US20110069019A1 (en) * 2009-07-08 2011-03-24 Smart Technologies Ulc Method for manipulating a graphic widget in a three-dimensional environment displayed on a touch panel of an interactive input system
US8902195B2 (en) 2009-09-01 2014-12-02 Smart Technologies Ulc Interactive input system with improved signal-to-noise ratio (SNR) and image capture method
US20110050650A1 (en) * 2009-09-01 2011-03-03 Smart Technologies Ulc Interactive input system with improved signal-to-noise ratio (snr) and image capture method
US20110169748A1 (en) * 2010-01-11 2011-07-14 Smart Technologies Ulc Method for handling user input in an interactive input system, and interactive input system executing the method
US8502789B2 (en) 2010-01-11 2013-08-06 Smart Technologies Ulc Method for handling user input in an interactive input system, and interactive input system executing the method
US8818027B2 (en) * 2010-04-01 2014-08-26 Qualcomm Incorporated Computing device interface
US20110243380A1 (en) * 2010-04-01 2011-10-06 Qualcomm Incorporated Computing device interface
US9946076B2 (en) 2010-10-04 2018-04-17 Gerard Dirk Smits System and method for 3-D projection and enhancements for interactivity
US9288455B2 (en) * 2011-12-06 2016-03-15 Canon Kabushiki Kaisha Information processing apparatus, control method of information processing apparatus, and storage medium for determining whether a projection pattern of a current frame differs from that of a previous frame
US20130141569A1 (en) * 2011-12-06 2013-06-06 Canon Kabushiki Kaisha Information processing apparatus, control method of information processing apparatus, and storage medium
US8867015B2 (en) 2012-01-11 2014-10-21 Apple Inc. Displays with liquid crystal shutters
US9493169B2 (en) * 2012-07-06 2016-11-15 Audi Ag Method and control system for operating a motor vehicle
US20150336588A1 (en) * 2012-07-06 2015-11-26 Audi Ag Method and control system for operating a motor vehicle
US8711370B1 (en) 2012-10-04 2014-04-29 Gerard Dirk Smits Scanning optical positioning system with spatially triangulating receivers
US9501176B1 (en) 2012-10-08 2016-11-22 Gerard Dirk Smits Method, apparatus, and manufacture for document writing and annotation with virtual ink
US8971568B1 (en) 2012-10-08 2015-03-03 Gerard Dirk Smits Method, apparatus, and manufacture for document writing and annotation with virtual ink
US8781171B2 (en) 2012-10-24 2014-07-15 Honda Motor Co., Ltd. Object recognition in low-lux and high-lux conditions
US9469251B2 (en) 2012-10-24 2016-10-18 Honda Motor Co., Ltd. Object recognition in low-lux and high-lux conditions
US9302621B2 (en) 2012-10-24 2016-04-05 Honda Motor Co., Ltd. Object recognition in low-lux and high-lux conditions
US9852332B2 (en) 2012-10-24 2017-12-26 Honda Motor Co., Ltd. Object recognition in low-lux and high-lux conditions
US20140370980A1 (en) * 2013-06-17 2014-12-18 Bally Gaming, Inc. Electronic gaming displays, gaming tables including electronic gaming displays and related assemblies, systems and methods
US20170285763A1 (en) * 2014-01-14 2017-10-05 Microsoft Technology Licensing, Llc 3d silhouette sensing system
US20150199018A1 (en) * 2014-01-14 2015-07-16 Microsoft Corporation 3d silhouette sensing system
US9720506B2 (en) * 2014-01-14 2017-08-01 Microsoft Technology Licensing, Llc 3D silhouette sensing system
US10001845B2 (en) * 2014-01-14 2018-06-19 Microsoft Technology Licensing, Llc 3D silhouette sensing system
US9810913B2 (en) 2014-03-28 2017-11-07 Gerard Dirk Smits Smart head-mounted projection system
US10061137B2 (en) 2014-03-28 2018-08-28 Gerard Dirk Smits Smart head-mounted projection system
US9377533B2 (en) 2014-08-11 2016-06-28 Gerard Dirk Smits Three-dimensional triangulation and time-of-flight based tracking systems and methods
US11137497B2 (en) 2014-08-11 2021-10-05 Gerard Dirk Smits Three-dimensional triangulation and time-of-flight based tracking systems and methods
US10324187B2 (en) 2014-08-11 2019-06-18 Gerard Dirk Smits Three-dimensional triangulation and time-of-flight based tracking systems and methods
US20160301900A1 (en) * 2015-04-07 2016-10-13 Omnivision Technologies, Inc. Touch screen rear projection display
US10901548B2 (en) * 2015-04-07 2021-01-26 Omnivision Technologies, Inc. Touch screen rear projection display
US10157469B2 (en) 2015-04-13 2018-12-18 Gerard Dirk Smits Machine vision for ego-motion, segmenting, and classifying objects
US10325376B2 (en) 2015-04-13 2019-06-18 Gerard Dirk Smits Machine vision for ego-motion, segmenting, and classifying objects
US10274588B2 (en) 2015-12-18 2019-04-30 Gerard Dirk Smits Real time position sensing of objects
US10502815B2 (en) 2015-12-18 2019-12-10 Gerard Dirk Smits Real time position sensing of objects
US11714170B2 (en) 2015-12-18 2023-08-01 Samsung Semiconuctor, Inc. Real time position sensing of objects
US9753126B2 (en) 2015-12-18 2017-09-05 Gerard Dirk Smits Real time position sensing of objects
US10166995B2 (en) * 2016-01-08 2019-01-01 Ford Global Technologies, Llc System and method for feature activation via gesture recognition and voice command
US10084990B2 (en) 2016-01-20 2018-09-25 Gerard Dirk Smits Holographic video capture and telepresence system
US10477149B2 (en) 2016-01-20 2019-11-12 Gerard Dirk Smits Holographic video capture and telepresence system
US9813673B2 (en) 2016-01-20 2017-11-07 Gerard Dirk Smits Holographic video capture and telepresence system
US10935659B2 (en) 2016-10-31 2021-03-02 Gerard Dirk Smits Fast scanning lidar with dynamic voxel probing
US10067230B2 (en) 2016-10-31 2018-09-04 Gerard Dirk Smits Fast scanning LIDAR with dynamic voxel probing
US10451737B2 (en) 2016-10-31 2019-10-22 Gerard Dirk Smits Fast scanning with dynamic voxel probing
US11709236B2 (en) 2016-12-27 2023-07-25 Samsung Semiconductor, Inc. Systems and methods for machine perception
US10564284B2 (en) 2016-12-27 2020-02-18 Gerard Dirk Smits Systems and methods for machine perception
US10261183B2 (en) 2016-12-27 2019-04-16 Gerard Dirk Smits Systems and methods for machine perception
US10473921B2 (en) 2017-05-10 2019-11-12 Gerard Dirk Smits Scan mirror systems and methods
US11067794B2 (en) 2017-05-10 2021-07-20 Gerard Dirk Smits Scan mirror systems and methods
US10591605B2 (en) 2017-10-19 2020-03-17 Gerard Dirk Smits Methods and systems for navigating a vehicle including a novel fiducial marker system
US10935989B2 (en) 2017-10-19 2021-03-02 Gerard Dirk Smits Methods and systems for navigating a vehicle including a novel fiducial marker system
US10725177B2 (en) 2018-01-29 2020-07-28 Gerard Dirk Smits Hyper-resolved, high bandwidth scanned LIDAR systems
US10379220B1 (en) 2018-01-29 2019-08-13 Gerard Dirk Smits Hyper-resolved, high bandwidth scanned LIDAR systems
US11829059B2 (en) 2020-02-27 2023-11-28 Gerard Dirk Smits High resolution scanning of remote objects with fast sweeping laser beams and signal recovery by twitchy pixel array
US20230113359A1 (en) * 2020-10-23 2023-04-13 Pathway Innovations And Technologies, Inc. Full color spectrum blending and digital color filtering for transparent display screens
US20220360755A1 (en) * 2020-10-23 2022-11-10 Ji Shen Interactive display with integrated camera for capturing audio and visual information

Similar Documents

Publication Publication Date Title
US6545670B1 (en) Methods and apparatus for man machine interfaces and related activity
US11630315B2 (en) Measuring content brightness in head worn computing
US10139635B2 (en) Content presentation in head worn computing
KR100921543B1 (en) A touch pad, a stylus for use with the touch pad, and a method of operating the touch pad
US7310090B2 (en) Optical generic switch panel
US6008800A (en) Man machine interfaces for entering data into a computer
JP3067452B2 (en) Large electronic writing system
US5317140A (en) Diffusion-assisted position location particularly for visual pen detection
US20160048160A1 (en) Content presentation in head worn computing
US11790617B2 (en) Content presentation in head worn computing
KR20090060283A (en) Multi touch sensing display through frustrated total internal reflection
JP2014517361A (en) Camera-type multi-touch interaction device, system and method
KR20070045188A (en) User input apparatus, system, method and computer program for use with a screen having a translucent surface
CN101971128A (en) Interaction arrangement for interaction between a display screen and a pointer object
US8899474B2 (en) Interactive document reader
JP2006145645A (en) Information display apparatus
CN1701351A (en) Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
US11340710B2 (en) Virtual mouse
US11868569B2 (en) Virtual mouse
US20240094851A1 (en) Virtual mouse
US20230409148A1 (en) Virtual mouse
Shimoda et al. Development of Head-attached Interface Device (HIDE) and its Functional Evaluation

Legal Events

Date Code Title Description
STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

REMI Maintenance fee reminder mailed
FPAY Fee payment

Year of fee payment: 4

SULP Surcharge for late payment
AS Assignment

Owner name: APPLE INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PRYOR, TIMOTHY R.;REEL/FRAME:024320/0642

Effective date: 20100330

FPAY Fee payment

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: PAT HOLDER NO LONGER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: STOL); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 12