US20110199342A1 - Apparatus and method for providing elevated, indented or texturized sensations to an object near a display device or input detection using ultrasound

Publication number
US20110199342A1
Authority
US
United States
Prior art keywords
ultrasound
display device
user
sensations
pattern
Prior art date
Legal status
Abandoned
Application number
US12/706,205
Inventor
Harry Vartanian
Jaron Jurikson-Rhodes
Current Assignee
Jjr Laboratories LLC
Original Assignee
HJ Labs LLC
Priority date
Filing date
Publication date
Application filed by HJ Labs LLC
Priority to US12/706,205
Assigned to HJ Laboratories, LLC. Assignors: JURIKSON-RHODES, JARON; VARTANIAN, HARRY
Publication of US20110199342A1
Priority to US15/498,122 (US10496170B2)
Priority to US16/700,518 (US20200103975A1)
Assigned to JJR Laboratories, LLC (change of name from HJ Laboratories, LLC)
Legal status: Abandoned

Classifications

    • G06F3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F3/043: Digitisers characterised by the transducing means, using propagating acoustic waves
    • G06F3/04845: Interaction techniques based on graphical user interfaces [GUI] for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0488: GUI interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886: GUI interaction techniques using a touch-screen or digitiser, by partitioning the display area or digitising-tablet surface into independently controllable areas, e.g. virtual keyboards or menus
    • B60K35/10; B60K2360/11; B60K2360/113; B60K2360/1438
    • G06F2203/014: Force feedback applied to GUI
    • G06F2203/04102: Flexible digitiser, i.e. constructional details allowing the whole digitising part of a device to be flexed or rolled like a sheet of paper
    • G06F2203/04108: Touchless 2D digitiser, i.e. digitiser detecting the X/Y position of the input means (finger or stylus) when proximate to, but not touching, the interaction surface, without distance measurement in the Z direction
    • G06F2203/04809: Textured surface identifying touch areas, e.g. overlay structure for a virtual keyboard
    (All G06F codes fall under G PHYSICS > G06 COMPUTING > G06F ELECTRIC DIGITAL DATA PROCESSING; the B60K codes relate to vehicle instrument arrangements.)

Definitions

  • This application is related to an apparatus and method for providing elevated, indented, or texturized sensations to an object near a display device using ultrasound or ultrasonic waves. Ultrasound may also be provided with or without sensations to an object for detecting input. Processes are provided and described involving elevated, indented, or texturized sensations to an object near a display device using ultrasound or ultrasonic waves. Processes are also provided for detecting input from an object using ultrasound.
  • Display devices for inputting information are commonplace in electronic devices such as mobile devices, cellular phones, personal digital assistants, smart phones, tablet personal computers (PCs), laptop computers, televisions, monitors, touchscreens, picture frames, or the like.
  • Currently, display devices may be based on liquid crystal, plasma, light emitting, or organic light emitting technologies using rigid or flexible substrates.
  • When a display device functions as an input device, such as a touchscreen, its applications are mostly limited to displaying and interacting with a user in two dimensions.
  • Another limitation or problem of current display devices is the lack of texture to the user interface. As the world becomes more electronic, texture is needed for enhancing and enabling certain applications, computer processes, or commerce.
  • Ultrasound or ultrasonic technology has become ubiquitous in the medical imaging field. Recently, ultrasound has been proposed for virtual reality applications. However, the use of embedded or integrated ultrasound technology in display devices or computers for enhancing the user interface to multiple dimensions has been limited. Therefore, it is desirable to have display devices or computers that can provide elevated, indented, or texturized sensations to an object near a display device using embedded or integrated ultrasound technology. It is also desirable for ultrasound to be provided to an object with or without sensations for detecting input.
  • An apparatus and method for providing elevated, indented, or texturized contactless sensations to an object at a distance from a display device using ultrasound or ultrasonic waves is disclosed. Processes are also given involving elevated, indented, or texturized sensations to an object near a display device using airborne ultrasound or ultrasonic waves. By providing elevated, indented, or texturized sensations to an object near a display device, enhanced input/output functions are provided.
  • FIG. 1 is a diagram of an electronic device having a display device providing elevated, indented, or texturized sensations to an object near the display device using ultrasound in accordance with one embodiment
  • FIGS. 2 a - 2 d and 2 f are diagrams of configurations for providing elevated, indented, or texturized sensations to an object using ultrasound in accordance with another embodiment
  • FIG. 2 e is a diagram of various ultrasound focal point patterns in accordance with another embodiment
  • FIG. 3 is a diagram of processes for an electronic device providing elevated, indented, or texturized sensations to an object near a display device using ultrasound in accordance with another embodiment
  • FIG. 4 is a diagram for providing varying ultrasound strengths to an object for providing elevated, indented, or texturized sensations in accordance with another embodiment.
  • FIG. 5 is a process for providing elevated, indented, or texturized sensations to an object near a display device using ultrasound in accordance with another embodiment.
  • Ultrasound or ultrasonic waves are given as an example to provide elevated, indented, or texturized sensations to an object near a display device.
  • However, any acoustic or radio wave that excites a distant object or is sensed by the human body may be applicable for the examples and processes given in this disclosure.
  • The sensation felt by an object via airborne ultrasound may be similar to vibration or gyration.
  • The sensation may be varied by producing focal points of different sizes and intensities.
  • For human skin, the vibration or gyration caused by airborne ultrasound may depend on the targeted receptors in the skin. Adapting or controlling the ultrasound focal or control points for different receptors may cause different sensations for the user's skin.
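To make the receptor-targeting idea above concrete, the sketch below amplitude-modulates an ultrasound carrier at a receptor-dependent envelope frequency, since the skin senses the pulsing radiation-pressure envelope rather than the carrier itself. The 40 kHz carrier and the per-receptor modulation frequencies are illustrative assumptions drawn from general airborne-haptics practice, not parameters from this disclosure.

```python
import numpy as np

# Illustrative modulation frequencies (Hz) per skin receptor type; these
# values are assumptions, not taken from the patent.
RECEPTOR_MODULATION_HZ = {
    "pacinian": 200.0,  # deep vibration receptors, most sensitive near 200 Hz
    "meissner": 40.0,   # light-touch receptors, sensitive at lower frequencies
}

def modulated_drive(receptor, duration_s, carrier_hz=40_000.0, sample_rate=1_000_000):
    """Amplitude-modulate the ultrasound carrier so its radiation-pressure
    envelope pulses at a rate the chosen receptor type responds to."""
    t = np.arange(int(duration_s * sample_rate)) / sample_rate
    envelope = 0.5 * (1.0 + np.sin(2 * np.pi * RECEPTOR_MODULATION_HZ[receptor] * t))
    return envelope * np.sin(2 * np.pi * carrier_hz * t)

signal = modulated_drive("pacinian", duration_s=0.01)  # 10 ms burst for Pacinian receptors
```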
  • Elevation or elevated sensations describe different sensations that may be caused to an object using ultrasound at a predetermined or random distance from a display or electronic device.
  • As an example, the relative distance of the object may range from one or more millimeters to several meters, as desired.
  • Indenting may be a configuration where an object is given a sensation around its perimeter while little sensation is given to the inner area of the object. Indenting may also describe a configuration where a given location in space near a display device provides a substantially sensible ultrasound to an object, but at a point lower or closer to the display device the ultrasound is not substantially sensible. Indenting may further describe a configuration where the ultrasound is not substantially sensible below such a location until a predetermined point near the display device is reached.
  • Texturizing or texturing describes a process where an electronic device using controlled ultrasound over air may provide, simulate, or mimic friction, pulsing sensation, pulsating sensation, variable smoothness, variable thickness, coarseness, fineness, irregularity, a movement sensation, bumpiness, or rigidness that is sensed by or detectable by an object.
  • FIG. 1 is a diagram of a wireless subscriber unit, user equipment (UE), mobile station, pager, cellular telephone, personal digital assistant (PDA), computing device, surface computer, tablet computer, monitor, general display, versatile device, automobile computer system, vehicle computer system, or television device 100 for mobile or fixed applications.
  • Device 100 comprises computer bus 140 that couples one or more processors 102, one or more interface controllers 104, memory 106 having software 108, storage device 110, power source 112, and/or one or more display controllers 120.
  • In addition, device 100 comprises an elevation, indenting, or texturizing controller 121 to provide sensations to an object located near one or more display devices 122.
  • One or more display devices 122 can be configured as a liquid crystal display (LCD), light emitting diode (LED), field emission display (FED), organic light emitting diode (OLED), or flexible OLED display device.
  • the one or more display devices 122 may be configured, manufactured, produced, or assembled based on the descriptions provided in US Patent Publication Nos. 2007-247422, 2007-139391, 2007-085838, or 2006-096392 or U.S. Pat. No. 7,050,835 or WO Publication 2007-012899 all herein incorporated by reference as if fully set forth.
  • In the case of a flexible display device, the one or more electronic display devices 122 may be configured and assembled using organic light emitting diodes (OLED), liquid crystal displays using flexible substrate technology, flexible transistors, or field emission displays (FED) using flexible substrate technology, as desired.
  • One or more display devices 122 can be configured as a touch screen display using resistive, capacitive, surface-acoustic wave (SAW) capacitive, infrared, strain gauge, optical imaging, dispersive signal technology, acoustic pulse recognition, frustrated total internal reflection or magneto-strictive technology, as understood by one of ordinary skill in the art.
  • Coupled to one or more display devices 122 may be pressure sensors 123. Coupled to computer bus 140 are one or more input/output (I/O) controllers 116, I/O devices 118, GPS device 114, one or more network adapters 128, and/or one or more antennas 130.
  • Device 100 may have one or more motion, proximity, light, optical, chemical, environmental, moisture, acoustic, heat, temperature, radio frequency identification (RFID), biometric, face recognition, image, photo, or voice recognition sensors 126 and touch detectors 124 for detecting any touch inputs, including multi-touch inputs, for one or more display devices 122 .
  • One or more interface controllers 104 may communicate with touch detectors 124 and I/O controller 116 for determining user inputs to device 100 .
  • Ultrasound source/detector 125 may be configured in combination with touch detectors 124, elevation, indenting, or texturizing controller 121, one or more display devices 122, pressure sensors 123, or sensors 126 to project or generate ultrasound waves, rays, or beams to an object to simulate elevated, indented, or texturized sensations, recognize inputs, or track the object, as will be explained in more detail below. There may be cases for input recognition or object tracking wherein ultrasound is provided without a detectable sensation to the object.
  • Storage device 110 may be any disk-based or solid-state memory device for storing data.
  • Power source 112 may be a plug-in, battery, solar panels for receiving and storing solar energy, or a device for receiving and storing wireless power as described in U.S. Pat. No. 7,027,311 herein incorporated by reference as if fully set forth.
  • One or more network adapters 128 may be configured as a Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Orthogonal Frequency-Division Multiplexing (OFDM), Orthogonal Frequency-Division Multiple Access (OFDMA), Global System for Mobile (GSM) communications, Enhanced Data rates for GSM Evolution (EDGE), General Packet Radio Service (GPRS), cdma2000, wideband CDMA (W-CDMA), long term evolution (LTE), 802.11x, Wi-Max, mobile Wi-MAX, Bluetooth, or any other wireless or wired transceiver for modulating and demodulating information communicated via one or more antennas 130 .
  • FIGS. 2 a - 2 d are diagrams of configurations for providing elevated, indented, or texturized sensations to an object using ultrasound.
  • A display device layer 204 lies proximate to ultrasound layer 205.
  • Layers 204 and 205 can be composed of a plurality of sublayers.
  • Although display device layer 204 is shown above ultrasound layer 205, some or most of the components of ultrasound layer 205, such as ultrasound transducers or detectors, may be provided in substantially the same level plane as display device layer 204.
  • Display device layer 204 can be either a flexible or rigid display device for displaying video, images, photos, graphics, text, etc.
  • Ultrasound layer 205 can be configured and composed of ultrasound transducer, source, or detector devices as described in “Two-dimensional scanning tactile display using ultrasound radiation pressure” by Shinoda et al. (2006), “A Tactile Display using Ultrasound Linear Phased Array” by Shinoda et al. (2004), or “Small and Lightweight Tactile Display (SaLT) and Its Application” by Kim et al. (2009) that are all herein incorporated by reference as if fully set forth.
  • Linear phased arrays of ultrasound can provide focal or control points of at least 1 mm diameter for fine, precise tactile airborne stimuli at variable frequencies and intensities. Larger focal points may also be provided.
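To illustrate how a phased array could form such a focal point, the following sketch computes per-element emission delays so that every transducer's wavefront arrives at the focus in phase; elements farther from the focus fire earlier. The 10 x 10 array geometry, 5 mm pitch, and speed of sound are assumptions for illustration, not values from the disclosure.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air, assumed room temperature

def focus_delays(element_xy, focus_xyz):
    """Per-element emission delays (seconds) so that all wavefronts arrive
    at the focal point at the same instant."""
    elements = np.column_stack([element_xy, np.zeros(len(element_xy))])  # array in z = 0 plane
    distances = np.linalg.norm(elements - focus_xyz, axis=1)
    return (distances.max() - distances) / SPEED_OF_SOUND  # nearest element fires last

# Example: a 10 x 10 transducer grid at 5 mm pitch focusing 15 cm above its center.
pitch = 0.005
grid = np.array([(i * pitch, j * pitch) for i in range(10) for j in range(10)], dtype=float)
grid -= grid.mean(axis=0)
delays = focus_delays(grid, np.array([0.0, 0.0, 0.15]))
```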
  • Techniques for tracking or detecting motion of a focal or control point and object may include Time Delay of Arrival (TDOA), where the difference in arrival times and the velocity of an ultrasound at one or more detectors are used to establish and track location.
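Below is a hedged sketch of TDOA-style localization: given known detector positions and measured arrival times, it grid-searches for the point whose pairwise range differences best match the measured time differences. A practical implementation would use a closed-form or iterative solver; the detector layout and search bounds here are invented for illustration.

```python
import numpy as np
from itertools import combinations

SPEED_OF_SOUND = 343.0  # m/s in air, assumed

def locate_tdoa(detectors, arrival_times, box=((0.0, 0.3), (0.0, 0.3), (0.0, 0.3)), steps=30):
    """Coarse grid search for the point whose pairwise range differences best
    match the measured arrival-time differences at the detectors."""
    axes = [np.linspace(lo, hi, steps) for lo, hi in box]
    best, best_err = None, np.inf
    for x in axes[0]:
        for y in axes[1]:
            for z in axes[2]:
                p = np.array([x, y, z])
                d = np.linalg.norm(detectors - p, axis=1)
                err = sum((d[i] - d[j] - (arrival_times[i] - arrival_times[j]) * SPEED_OF_SOUND) ** 2
                          for i, j in combinations(range(len(detectors)), 2))
                if err < best_err:
                    best, best_err = p, err
    return best

# Four detectors on an assumed 30 cm display frame; object 12 cm above it.
detectors = np.array([[0.0, 0.0, 0.0], [0.3, 0.0, 0.0], [0.0, 0.3, 0.0], [0.3, 0.3, 0.0]])
truth = np.array([0.10, 0.10, 0.12])
times = np.linalg.norm(detectors - truth, axis=1) / SPEED_OF_SOUND
print(locate_tdoa(detectors, times))  # approximately [0.10, 0.10, 0.12]
```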
  • Airborne refers to an ultrasound transmission that may propagate through the air for at least a predetermined distance.
  • Stimuli can be provided to an object by a phased array transmitting one or more ultrasound focal points to cause a vibration, gyration, beat, or tap.
  • The ultrasound intensity may be varied to cause different sensations for the object. Sensations may also be varied by changing focal point sizes.
  • Ultrasound layer 205 comprises an array of coupled ultrasound transducers and/or detectors that may emit directional ultrasound waves, rays, or beams through air to objects at location points 206, 208, and/or 210 in sensation zone 202. Layer 205 also detects reflections of the emitted waves off of the objects at location points 206, 208, and/or 210. Layer 205 is controlled in part by elevation, indenting, or texturizing controller 121. Sensation zone 202 may be the space, part of the space, or a force field above display device layer 204 that defines the range of ultrasound perception. Sensation zone 202 may be defined using approximate boundaries in order to limit the space over which ultrasound is emitted above display device layer 204, for safety or power conservation. Another benefit of having sensation zone 202 is that a user can have space in other areas of display device layer 204 for normal operation of device 100.
  • Alternatively, ultrasound layer 205 may be configured with transducers and detectors directed away from the user. This double-sided configuration is desirable for providing ultrasound sensations to fingers placed behind device 100 for grasping in mobile applications. An airborne ultrasound zone behind device 100 may be used to give a user the ability to virtually grasp, from afar, images on a screen perpendicular to device 100.
  • Objects at location points 206 , 208 , and/or 210 may be any one of a finger, part of a finger, a hand, part of a hand, skin, any body part, a special ultrasound sensitive glove, part of a special ultrasound sensitive glove, an ultrasound sensitive finger attachment, an ultrasound sensitive thimble, an ultrasound sensitive wand, a material that reacts in response to ultrasound, or a material that is perceptive to ultrasound, as desired.
  • FIG. 2 b is a diagram showing various approximate airborne ultrasound patterns 222 , 224 , 226 , and 228 emitted over display device surface 231 .
  • Substantially cubical pattern 222 may be provided by emitting rays by ultrasound layer 205 to provide a substantially cubical sensation.
  • FIG. 2 e shows an example of a focal point pattern 222 1 for providing a substantially cubical pattern sensation on finger 222 2 by ultrasound layer 205.
  • Ultrasound control or focal points shown in FIG. 2 e or other figures are not drawn to scale and may be approximate in size.
  • Dot or dimple pattern 224 may be provided by emitting rays by ultrasound layer 205 to provide a substantially spherical sensation.
  • FIG. 2 e shows an example of a focal point pattern 224 1 for a dot or dimple pattern on finger 224 2 emitted by ultrasound layer 205 to provide a substantially spherical sensation.
  • Substantially cylindrical pattern 226 may be provided by emitting rays by ultrasound layer 205 to provide a substantially circular sensation.
  • FIG. 2 e shows an example of a focal point pattern 226 1 for a cylindrical pattern sensation on finger 226 2 provided by ultrasound layer 205 to provide a substantially circular sensation.
  • Substantially rectangular pattern 228 may be provided by emitting rays by ultrasound layer 205 to provide a substantially rectangular sensation.
  • FIG. 2 e shows an example of focal point edge patterns 228 1 and 228 2 for a rectangular pattern sensation on finger 228 3 provided by ultrasound layer 205. Although two edges are shown on finger 228 3, a single edge or multiple edges may be projected. Edge projections are desirable for virtual keyboard applications where the projected edges help to define the boundaries of a key.
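To illustrate edge projection, this sketch generates focal-point coordinates along the perimeter of a hypothetical rectangular virtual key, so a fingertip feels the key's boundary rather than its interior. The key size and point spacing are assumptions chosen for illustration.

```python
import numpy as np

def key_edge_focal_points(x0, y0, width, height, spacing=0.002):
    """Focal-point coordinates (meters) along a virtual key's perimeter, so
    the fingertip senses the key boundary rather than its interior."""
    pts = []
    for t in np.arange(0.0, width, spacing):
        pts += [(x0 + t, y0), (x0 + t, y0 + height)]   # bottom and top edges
    for t in np.arange(0.0, height, spacing):
        pts += [(x0, y0 + t), (x0 + width, y0 + t)]    # left and right edges
    return np.array(pts)

# An 18 mm x 18 mm key rendered with perimeter points every 2 mm.
edge_points = key_edge_focal_points(0.010, 0.010, 0.018, 0.018)
print(len(edge_points), "focal points")
```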
  • Ultrasound layer 205 may be controlled in part by ultrasound source/detector 125 in combination with elevation, indenting, or texturizing controller 121.
  • The ultrasound may be swept or stroked over each focal or control point in a pattern at a high frequency or at variable pulsating frequencies, using various intensity levels depending upon the desired sensation or virtual effect.
  • Although well-defined shapes are shown in FIGS. 2 b and 2 e, actual sensations will vary from person to person and with the accuracy of the phased array ultrasound source.
  • FIG. 2 c is a diagram providing an example configuration of display device layer 204 and ultrasound layer 205 .
  • Display pixels 232 1 to 232 n may lie partially adjacent to, on the same level as, or on the same layer as elevation, indenting, or texturizing cells 234 1 to 234 n, each having an ultrasound transducer, source, and/or detector. Alternatively, display pixels 232 1 to 232 n may lie partially above elevation, indenting, or texturizing cells 234 1 to 234 n.
  • Display and ultrasound array or matrix 233 also comprises display pixels 236 1 to 236 n adjacent to elevation, indenting, or texturizing cells 238 1 to 238 n, which are adjacent to display pixels 240 1 to 240 n.
  • The elevation, indenting, or texturizing cells may be controlled by elevation, indenting, or texturizing controller 121 to adjust the intensity, orientation, or direction of the ultrasound emitted to location points 206, 208, or 210.
  • FIG. 2 d shows an embodiment of a display device array or matrix 235 from a top view, where ultrasound transducer, source, or detector cells 239 and 241 are placed selectively within two predetermined areas without display pixels so that the surface of display device array or matrix 235 consists mostly of display pixels 237.
  • Cells 239 and 241 may line the perimeter of display device array or matrix 235. When around the perimeter, integration with existing display device layouts may be more easily enabled.
  • FIG. 3 is a diagram of processes for an electronic device providing elevated, indented, or texturized sensations to an object near display device 302 using ultrasound.
  • The object provided elevated, indented, or texturized sensations near display device 302 using ultrasound may be any one of a finger, part of a finger, multiple fingers, a hand, part of a hand, or two hands, as desired.
  • Display device 302 may be assembled with at least some of the components described in device 100 .
  • A “click here” displayed universal resource locator (URL) or hyperlink is provided to an object that may be at location points 206, 208, and/or 210 with an elevated substantially circular ultrasound pattern 304. Clicking may be performed by tracking the motion, momentum, or velocity of the object, as provided in the example in FIG. 5 below.
  • Motion of the object relative to display device 302 that can be recognized as an input, gesture, or command may be a push towards display device 302, a pull away from display device 302, sideways or lateral motion relative to display device 302, a circle gesture, a square gesture, a rectangular gesture, a spiral gesture, a swirl gesture, a swipe gesture, a pinch gesture, a flick gesture, a customized gesture, a user-defined gesture, a multiple-finger coordinated motion, or a single-finger gesture, as desired.
  • Single-finger gesture control is desirable in mobile applications since the user may, for example, use the thumb for gestures to signal an input or command while holding device 100 at the same time, leaving the other hand free.
  • Gestures may be stored in a gesture library or database in storage device 110 .
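As a toy example of matching tracked motion against such a library, the sketch below classifies a short 3-D trajectory as a push, pull, or lateral swipe from its dominant displacement axis. The threshold and reduced gesture set are assumptions, not the recognition method of the disclosure.

```python
import numpy as np

def classify_gesture(trajectory, min_travel=0.02):
    """trajectory: (N, 3) object positions in meters, +z pointing away from
    the display. Returns a coarse gesture label from the dominant axis."""
    disp = trajectory[-1] - trajectory[0]
    if np.linalg.norm(disp) < min_travel:         # under 2 cm of travel: ignore
        return "none"
    axis = int(np.argmax(np.abs(disp)))
    if axis == 2:
        return "push" if disp[2] < 0 else "pull"  # toward vs. away from display
    return "swipe_x" if axis == 0 else "swipe_y"

# A finger moving 5 cm toward the display reads as a push.
track = np.array([[0.0, 0.0, 0.10], [0.0, 0.0, 0.07], [0.0, 0.0, 0.05]])
print(classify_gesture(track))  # -> "push"
```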
  • Tracking an object relative to display device 302 may be used for drawing purposes.
  • A user may use a finger to draw in the air a character or shape that is detected by ultrasound source/detector 125 and rendered into an image by one or more processors 102. This feature may be useful, for instance, in computer games, toys, or graphics applications.
  • Part of an on-screen virtual or simulated keyboard displayed on display device 302 provides the letter “E” key having an elevated substantially square ultrasound 308 provided to an object at location points 206, 208, and/or 210.
  • Display device 302 can be configured to show a whole QWERTY keyboard, a numeric keypad, or a combination of a whole QWERTY keyboard and a numeric keypad, as desired.
  • The letter “S” key is provided by a partially displayed portion and an elevated substantially circular ultrasound 310.
  • The virtual or simulated keyboard may also be programmed to replicate Braille lettering, as desired.
  • Ultrasound 306 1 and 306 2 are projected around the perimeter or edges of the keys to define boundaries so that a user may type the correct key and can find or feel the correct position of the keys.
  • A user may also type a key by physically touching display device 302.
  • The touch input is detected by touch detectors 124.
  • A pull-away motion of an object from display device 302 may be detectable as a capital or superscript letter input, while a push motion in the direction towards the display device may indicate subscripting of the letter.
  • Haptic feedback, force feedback, or tactile feedback in the form of a played sound, gyration, or vibration may be provided via I/O controller 116.
  • Chart 400 shows an example of how ultrasound focal or control point strength or intensity units may be varied over time to provide different sensations to a user's finger, hand, or any other object. For instance, as a user pulls a finger away from display device 302, which is detected by ultrasound source/detector 125, strength or intensity units may be reduced by elevation, indenting, or texturizing controller 121. Conversely, when the finger is pushed towards display device 302, strength or intensity units may be increased for a predetermined period, creating a virtual feeling of resistance.
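A toy version of the intensity policy suggested by chart 400 might look like the following; the fade-out range and resistance gain are invented constants, not values read from the figure.

```python
def intensity_for_motion(distance_m, velocity_z, base=0.5, resist_gain=8.0):
    """Reduce intensity as the finger retreats from the display; briefly boost
    it while the finger pushes toward the display to simulate resistance."""
    falloff = max(0.0, 1.0 - distance_m / 0.3)        # fades out by about 30 cm
    resistance = resist_gain * max(0.0, -velocity_z)  # negative vz = approaching
    return min(1.0, base * falloff + resistance)

print(intensity_for_motion(0.25, 0.05))   # retreating finger: weak output
print(intensity_for_motion(0.05, -0.02))  # approaching finger: felt pushback
```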
  • Display device 242 may project ultrasound as shown in FIG. 2 f.
  • Ultrasound transducer, source, or detector cells 244 may project onto zone 246 so that the user's view of display device 242 is unobstructed.
  • Zone 246 may be projected onto a table or desk giving the user the ability to use the space as an input area similar to that of a keyboard or mouse.
  • A special pad 248 may be used to reflect or vibrate in response to ultrasound from transducer, source, or detector cells 244.
  • Instructions in software 108 can be used to predict or anticipate keystrokes. Prediction or anticipation may be based on a word or sentence being entered. In response to the anticipation, a different key may emit ultrasound to a user's finger, hand, or any other object to encourage or invoke input and provide context awareness.
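One way to picture keystroke anticipation: a small sketch that predicts the next character from a prefix list and returns the virtual key that should emit an encouraging ultrasound pulse. The word list and interface are hypothetical stand-ins for a real prediction engine.

```python
from typing import Optional

# Hypothetical word list; a real implementation might use an n-gram model
# or the platform's autocomplete engine.
WORDS = ["search", "screen", "sensation", "texture", "touch"]

def predicted_key(typed_prefix: str) -> Optional[str]:
    """Return the next character of the most likely completion, i.e., the
    virtual key that should emit an encouraging ultrasound pulse."""
    for word in WORDS:
        if word.startswith(typed_prefix.lower()) and len(word) > len(typed_prefix):
            return word[len(typed_prefix)]
    return None

print(predicted_key("se"))  # -> "a" (completing "search"), so the "A" key pulses
```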
  • Advertisement 316, such as an AdWord by Google, can be sold to an advertiser for a certain price for having elevated substantially circular ultrasound 317 on at least part of or the entire advertisement image or photo.
  • Advertisement 318 can be sold to an advertiser for a different price, higher or lower, for having elevated substantially circular ultrasound 318 1 and 318 2 each projected at a different intensity in comparison to substantially circular ultrasound 317 .
  • The strength or intensity of substantially circular ultrasound 317, 318 1, and 318 2 may be dependent on location determined by GPS device 114 and varied over time as shown in FIG. 4.
  • Advertisement 316 or 318 may be provided in a separate pop-up window with the emitted ultrasound to an object at location points 206, 208, and/or 210.
  • The emitted ultrasound may be provided only for a predetermined time period after the pop-up window is displayed, thereby providing a nudge or feeling sensation to the object.
  • The intensity of ultrasound to the object may be increased over time prior to turning off, thereby simulating the effect of the pop-up window virtually emerging from display device 302.
  • Substantially circular ultrasound 317 may provide a sensation to a user's fingers by projecting multiple focal or control points when the user virtually tries to grab a window shown on display device 302.
  • A slight vibration is provided by substantially circular ultrasound 317.
  • A strong vibration may be provided by substantially circular ultrasound 317 when running into obstacles or boundaries on the screen. The vibration may stop when the user releases the window, as desired.
  • An embodiment of the present invention may provide electronic commerce processes.
  • A “Buy Now” button is provided with an elevated substantially circular ultrasound 322 1 and an elevated substantially square ultrasound 322 2 to an object at location points 206, 208, and/or 210.
  • The “Buy Now” button is associated with triggering the purchase of displayed shirt 324 by sending a request to a server (not shown) over one or more network adapters 128.
  • Ultrasound texturizing pattern 326 is provided to virtually replicate or simulate the surface or composition of shirt 324.
  • Ultrasound texturizing pattern 326 can be a combination of different ultrasound focal or control points. Although shirt 324 is shown, ultrasound texturizing pattern 326 can be used to provide surface information for any product being sold or displayed on display device 302.
  • Displayed shirt 324 can be highlighted and then rotated in response to a multitouch input while ultrasound texturizing pattern 326 is dynamically changed to virtually reflect the different surfaces or materials used to make the shirt.
  • Shirt 324 can be zoomed in and out using multitouch inputs detected by touch detectors 124, with each zoom level reflecting texture differences in ultrasound texturizing pattern 326. For instance, a zoomed-in view may be grainier or rougher compared to a zoomed-out view.
  • The zoom levels can also be configured with a fading-in or fading-out effect by one or more processors 102 and can involve retrieving additional information from a server (not shown) over one or more network adapters 128. Beyond the examples of fabrics, any material may be replicated or simulated by ultrasound texturizing pattern 326.
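To sketch how zoom level might drive the texturizing pattern, the toy mapping below converts a zoom factor into focal-point spacing, intensity, and pulse rate, so zoomed-in views feel grainier as described above. All constants are invented for illustration.

```python
def texture_params(zoom):
    """Map a zoom factor (1.0 = fit to screen) to texturizing parameters:
    zoomed-in views expose weave detail, so spread focal points farther
    apart and pulse them harder for a grainier, rougher feel."""
    zoom = max(1.0, min(zoom, 8.0))
    return {
        "point_spacing_mm": 1.0 + 0.5 * zoom,  # coarser grain when zoomed in
        "intensity": min(1.0, 0.3 + 0.08 * zoom),
        "pulse_hz": 200 - 10 * zoom,           # slower, rougher pulsing up close
    }

print(texture_params(1.0))  # zoomed out: fine, smooth texture
print(texture_params(6.0))  # zoomed in: coarse, rough texture
```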
  • Airborne ultrasound feedback, similar to multitouch inputs, may also be used to change views, angles, or the size of displayed shirt 324.
  • Display device 302 itself may be elevated, indented, or texturized in accordance with examples given in U.S. application Ser. No. 12/406,273.
  • With shirt 324 texturized both on display device 302 and at a distance to an object using ultrasound, the user is given an improved realization of the composition of the shirt by combining the two enhancements.
  • An embodiment of the present invention provides an electronic game, such as tic-tac-toe, by projecting ultrasound pattern 328 to an object at location points 206, 208, and/or 210.
  • Ultrasound pattern 328 may be projected to multiple fingers and tracked as the user tries to pinch, grab, or push an object in a game or any other simulated environment displayed on display device 302.
  • Ultrasound pattern 328 emitted onto an object can also control scrolling or drag and drop functions of items in a game in combination with multitouch inputs detected by touch detectors 124 .
  • Ultrasound pattern 328 can be controlled by elevation, indenting, or texturizing controller 121 such that an object being tracked at location point 206, such as a user's hand, can be handed off or switched to location point 208, such as a user's other hand, to be tracked.
  • A user may dribble a ball from one hand to another in front of display device 302.
  • Passing of location points in space and time from 206 to 208 results in passing a location point of an object between different horizontal planes relative to display device layer 204.
  • Alternatively, the location point may be passed on the same plane.
  • Ultrasound pattern 328 can be used to emulate a spring-like sensation to an object and simulate elasticity to a user's hand in a game or any other application.
  • Ultrasound layer 205 can also simulate whole screen explosions, blasts, or bullets being fired at the user by turning on several ultrasound transducers for a predetermined period of time in a game or movie.
  • Ultrasound pattern 328 may also provide a gaming feature where tilting or rotation detected by an accelerometer in sensors 126 controls ultrasound output for four dimensional motion gaming.
  • Ultrasound pattern 328 may also define the boundaries of a virtual space or layer between location points 206 , 208 , and 210 and display device layer 204 .
  • Ultrasound pattern 328 projected onto multiple fingers can be used to simulate a virtual joystick or pointing stick for 360-degree rotational input by tracking the movement of the fingers with ultrasound source/detector 125.
  • A three-dimensional accelerometer can be included in sensors 126 to be used in combination with elevation, indenting, or texturizing controller 121 to project ultrasound pattern 328 in response to a programmed action in the game.
  • A visual haptic ultrasound mouse or track pad may be configured by projecting and controlling ultrasound pattern 328 to replicate the functionality of a mouse or track pad and provide a 4-D free-space tactile user interface device.
  • Ultrasound pattern 328 can provide enhanced features for online collaboration, distance learning, online conferencing, social networking, or online dating. For instance, in response to a push command on a networked computing device (not shown), which may or may not have an ultrasound-enhanced display device, ultrasound pattern 328 may provide feedback to an object at location points 206, 208, and/or 210. Examples of feedback are a poke sensation similar to that on Facebook, a push sensation, a virtual handshake sensation, etc. In online conferencing, tactile inputs or gestures via ultrasound pattern 328 may be used during a video conference application for additional interaction between conversing parties. Social networking or adult entertainment applications can be enhanced by ultrasound pattern 328 providing stimulation in connection with a video, image, photo, or audio media on display device 302.
  • Ultrasound rays 327 1 and 327 2 may be used to augment, enhance, or characterize different objects in photo or image 327 3.
  • Ultrasound rays 327 1 and 327 2 may be preprogrammed into photo or image 327 3 by the owner for watermarking, artistic design, or the like. Ultrasound 327 1 and 327 2 may also be used to augment photo editing applications. If display device 302 is configured as a digital sign, ultrasound 327 1 and 327 2 may be used to get the attention of people walking near or viewing the photo or image 327 3 on the sign.
  • Ultrasound pattern 328 may also project sensations to simulate maps, topography, geography, imagery, or location service processes in combination with GPS device 114.
  • Ultrasound pattern 328 can simulate mountainous regions on a map by projecting an ultrasound of various heights and intensities to an object at location points 206 , 208 , and/or 210 .
  • Ultrasound pattern 328 may also be used to simulate the action of picking up (i.e., cutting) or dropping (i.e., pasting) text in an email, a 3rd Generation Partnership Project (3GPP) or 3GPP2 short message service (SMS) text message, or a 3GPP/3GPP2 multimedia message service (MMS) message.
  • Ultrasound pattern 328 may also be used in connection with a PDF document, a Word document, an Excel spreadsheet, a four-dimensional (4-D) screensaver, 4-D art, 4-D drawings, 3-D imagery, a 3-D sculpture, a 4-D “etch-a-sketch”, or architecture designs using scalable or vector graphics. Any of the actions given above for ultrasound pattern 328 may be used in combination with transmitting or receiving information over one or more network adapters 128.
  • Ultrasound pattern 328 can be used to replicate or simulate the edge of a page and allow a user to virtually lift or pick up a page. Moreover, a user may be able to feel text of varying sensations provided by ultrasound pattern 328 that is hyperlinked or highlighted on an e-book page as the user moves a finger across the page.
  • Airborne ultrasound pattern 328 may be used to simulate friction or resistance as a user moves an image by touching the screen, zooms into an image, or zooms out of an image. When zooming beyond a threshold, ultrasound pattern 328 can be used to provide resistance, thereby defining boundaries and providing a warning or alarm. While scrolling, panning, or gliding, hitting a threshold level or endpoint causes an ultrasound tactile feedback or response. For scrolling momentum, ultrasound pattern 328 may provide high intensity initially to simulate inertia and then less intensity as momentum builds. For navigating through a list of items on display device 302, items may be highlighted on the screen as the user scrolls through the list from afar.
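A compact sketch of that scrolling feedback policy, assuming a normalized momentum value and a hard endpoint; the thresholds are illustrative only.

```python
def scroll_feedback(momentum, at_endpoint):
    """Ultrasound intensity for list scrolling: strong at first to simulate
    inertia, fading as momentum builds, and spiking at a list endpoint."""
    if at_endpoint:
        return 1.0                                    # tactile bump at the end
    return max(0.15, 0.8 - 0.6 * min(momentum, 1.0))

print(scroll_feedback(0.1, False))  # early drag: heavy resistance
print(scroll_feedback(0.9, False))  # coasting: light feedback
print(scroll_feedback(0.5, True))   # endpoint hit: full-strength bump
```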
  • Display device 302 may have ultrasound source/detectors 330 1-330 4 in a slightly beveled position or level with the frame of display device 302.
  • Display device 302 may also have digital image or infrared cameras 334 1-334 4 for tracking motion of objects at location points 206, 208, and/or 210, using algorithms such as that described in U.S. Pat. No. 7,317,872, herein incorporated by reference as if fully set forth, and these cameras can be used to perform additional sensor measurements.
  • Other sensor measurements for additional metrics and refinement include infrared or optical detection to detect depth of objects at location points 206 , 208 , and/or 210 . These sensors can be embedded next to or within each display cell in display device 302 .
  • Display device 302 replicates, mimics, or simulates a customizable or programmable interface or control panel for a remote control, an instrument panel on a vehicle, an automobile dashboard configuration, audio equalizers, multitouch equalizers, a radio button list, or a consumer electronics button surface with ultrasound patterns 332 1-332 3.
  • Ultrasound patterns 332 1-332 3 provide a user the ability to simulate buttons on a device prior to purchase or for use as a tutorial.
  • Ultrasound patterns 332 1-332 3 can be programmed for volume control, replicating smart home switches or controllers, or replicating a dial or knob, as desired.
  • Ultrasound may be used to feel, sense, or move text, images, photos, windows, or icons.
  • Web searching is performed by dragging and dropping the text “TEST SEARCH” 337 into search box 336.
  • A user may be provided substantially circular ultrasound 338 when grabbing the text “TEST SEARCH” from a distance from display device 302.
  • The user then moves or drags the text “TEST SEARCH” from afar via path 339 over to search box 336 and releases or drops it.
  • The user's finger movements are tracked by ultrasound source/detector 125 in combination with elevation, indenting, or texturizing controller 121.
  • The text may be shown moving on display device 302 as the user's fingers are tracked.
  • A visual, photo, or image search may be performed by grabbing the image of shirt 324 and dropping it into search box 336.
  • Ultrasound pattern 328 can be used to replicate or simulate a virtual stylus, pen, or pencil, allowing a user to mimic writing or drawing on display device 302 similar to a notepad.
  • The virtual stylus, pen, or pencil may be configured without the user physically holding anything.
  • The writing or drawing may be done at a predetermined distance from display device 302 in sensation zone 202.
  • Ultrasound pattern 328 can also be used for medical applications. For instance, in laparoscopic surgery a physician located in the operating room, or remote from it, may be able to feel or sense images or photos of a patient's organs provided by an internal surgical camera and displayed on display device 302. Ultrasound pattern 328 may also be used to simulate a patient's pain for a doctor over the Internet.
  • Ultrasound pattern 328 can be responsive to voice or visual commands or recognition detected by sensors 126.
  • Ultrasound pattern 328 can be a preprogrammed texturized pattern to notify the user of an incoming call, similar to a customized ringtone.
  • Ultrasound pattern 328 may be used for providing a warning to a driver in relation to a safety feature on an automobile.
  • Ultrasound pattern 328 may be used for enhancing icons on a system tray, with each icon having a different characteristic vibration sensation.
  • Device 100 may be controlled remotely, either wired or wirelessly, by a server or cloud computing platform (not shown) via one or more network adapters 128.
  • Ultrasound pattern 328 can be used to replicate, simulate, or enhance features for biometrics, musical instruments, video clips, editing audio tracks, editing video, computer-aided design (CAD), semiconductor layouts, e-books, a children's educational product, a children's productivity or educational game, a general education product, a 3-D drawing tool, distance learning, or a pop-up children's book, as desired.
  • FIG. 5 is a process 500 for providing elevated, indented, or texturized sensations to an object near a display device using ultrasound.
  • The object may be one or more fingers or hands at location points 206, 208, and/or 210.
  • Ultrasound source/detector 125 determines the initial object location and calculates a distance and angle relative to display device 302 (step 502 ) to calculate focal or control point vectors.
  • A user's fingers, hand, or a predetermined object may be placed over a predetermined zone over display device 302.
  • Alternatively, a user's fingers, hand, or a predetermined object may be detected by digital or infrared cameras 334 1-334 4 using image or photo recognition technology.
  • Device 100 may then display a preprogrammed image, such as a virtual keyboard or icon, on display device 302 at the detected location.
  • Ultrasound source/detector 125 in combination with elevation, indenting, or texturizing controller 121 projects or emits one or more ultrasound patterns, such as the ones shown in FIG. 2 e , having one or more focal or control points (step 504 ).
  • The intensity of ultrasound at one or more focal or control points may be varied.
  • Ultrasound source/detector 125 may be time-multiplexed to project different ultrasound patterns to each object.
  • Elevation, indenting or texturizing controller 121 focuses or adjusts focal or control point vectors (step 506 ).
  • Ultrasound source/detector 125 in combination with elevation, indenting or texturizing controller 121 detects, tracks, or senses movement of focal or control points to determine momentum and/or velocity of an object (step 508 ). While the object moves and is tracked, the ultrasound patterns provided to the object may vary based on images, text, video, or the like displayed on display device 302 .
  • Device 100 may detect and track multitouch inputs by other fingers and/or input detected by other sensors (step 510).
  • An animation or video of a generated surface may be displayed on display device 302 for feedback and showing the tracking of the object (step 512 ). If an input, gesture, or command is recognized by ultrasound source/detector 125 in combination with elevation, indenting or texturizing controller 121 (step 514 ), the input, gesture, or command is processed by one or more processors 102 (step 516 ) and information is retrieved based on the input, gesture, or command (step 518 ).
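Pulling steps 502 through 518 together, here is a hedged end-to-end sketch of the control loop. Every device interaction is a stub standing in for the hardware behavior described above; the class name, positions, and thresholds are illustrative assumptions.

```python
import numpy as np

class UltrasoundHapticsLoop:
    """Illustrative control loop for process 500 (steps 502-518). All device
    interaction is stubbed out; this is a sketch, not device firmware."""

    def locate_object(self):
        # Step 502: detect the object and compute distance/angle to the display.
        return np.array([0.0, 0.0, 0.12])  # stub: 12 cm above screen center

    def project_pattern(self, position, pattern="dimple", intensity=0.6):
        # Step 504: emit one or more focal points at the object's position.
        print(f"projecting {pattern} at {position} with intensity {intensity}")

    def run_once(self):
        position = self.locate_object()                 # step 502
        self.project_pattern(position)                  # step 504
        # Step 506: refocus control-point vectors; step 508: track movement
        # (stubbed here as a fixed 3 cm move toward the display).
        trajectory = np.array([position, position - [0.0, 0.0, 0.03]])
        velocity = trajectory[-1] - trajectory[0]
        # Steps 510-512: other sensor inputs and on-screen feedback omitted.
        if velocity[2] < -0.02:                         # step 514: recognize gesture
            print("push recognized; process input and retrieve data (steps 516-518)")

UltrasoundHapticsLoop().run_once()
```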
  • Examples of computer-readable storage media include a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks, digital versatile disks (DVDs), and Blu-ray discs.
  • Suitable processors include, by way of example, a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) circuits, any other type of integrated circuit (IC), and/or a state machine.
  • A processor in association with software may be used to implement hardware functions for use in a computer, wireless transmit receive unit (WTRU), or any host computer.
  • The programmed hardware functions may be used in conjunction with modules, implemented in hardware and/or software, such as a camera, a video camera module, a videophone, a speakerphone, a vibration device, a speaker, a microphone, a television transceiver, a hands-free headset, a keyboard, a Bluetooth® module, a frequency modulated (FM) radio unit, a liquid crystal display (LCD) display unit, an organic light-emitting diode (OLED) display unit, a digital music player, a media player, a video game player module, an Internet browser, and/or any wireless local area network (WLAN) or Ultra Wide Band (UWB) module.

Abstract

An apparatus and method for providing elevated, indented, or texturized contactless sensations to an object at a distance from a display device using ultrasound or ultrasonic waves is disclosed. Processes are also given involving elevated, indented, or texturized sensations to an object near a display device using airborne ultrasound or ultrasonic waves. By providing elevated, indented, or texturized sensations to an object near a display device, enhanced input/output functions are provided.

Description

    FIELD OF INVENTION
  • This application is related to an apparatus and method for providing elevated, indented, or texturized sensations to an object near a display device using ultrasound or ultrasonic waves. Ultrasound may also be provided with or without sensations to an object for detecting input. Processes are provided and described involving elevated, indented, or texturized sensations to an object near a display device using ultrasound or ultrasonic waves. Processes are also provided for detecting input from an object using ultrasound.
  • BACKGROUND
  • Display devices for inputting information are commonplace in electronic devices such as mobile devices, cellular phones, personal digital assistants, smart phones, tablet personal computers (PCs), laptop computers, televisions, monitors, touchscreens, picture frames, or the like. Currently, display devices may be based on liquid crystal, plasma, light emitting, or organic light emitting technologies using rigid or flexible substrates. When a display device functions as an input device, such as a touchscreen, its applications are mostly limited to displaying and interacting with a user in two dimensions. Another limitation or problem of current display devices is the lack of texture in the user interface. As the world becomes more electronic, texture is needed for enhancing and enabling certain applications, computer processes, or commerce.
  • Ultrasound or ultrasonic technology has become ubiquitous in the medical imaging field. Recently, ultrasound has been proposed for virtual reality applications. However, the use of embedded or integrated ultrasound technology in display devices or computers for enhancing the user interface to multiple dimensions has been limited. Therefore, it is desirable to have display devices or computers that can provide elevated, indented, or texturized sensations to an object near a display device using embedded or integrated ultrasound technology. It is also desirable for ultrasound to be provided to an object with or without sensations for detecting input.
  • SUMMARY
  • An apparatus and method for providing elevated, indented, or texturized contactless sensations to an object at a distance from a display device using ultrasound or ultrasonic waves is disclosed. Processes are also given involving elevated, indented, or texturized sensations to an object near a display device using airborne ultrasound or ultrasonic waves. By providing elevated, indented, or texturized sensations to an object near a display device, enhanced input/output functions are provided.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • A more detailed understanding may be had from the following description, given by way of example in conjunction with the accompanying drawings wherein:
  • FIG. 1 is a diagram of an electronic device having a display device providing elevated, indented, or texturized sensations to an object near the display device using ultrasound in accordance with one embodiment;
  • FIGS. 2 a-2 d and 2 f are diagrams of configurations for providing elevated, indented, or texturized sensations to an object using ultrasound in accordance with another embodiment;
  • FIG. 2 e is a diagram of various ultrasound focal point patterns in accordance with another embodiment;
  • FIG. 3 is a diagram of processes for an electronic device providing elevated, indented, or texturized sensations to an object near a display device using ultrasound in accordance with another embodiment;
  • FIG. 4 is a diagram for providing varying ultrasound strengths to an object for providing elevated, indented, or texturized sensations in accordance with another embodiment; and
  • FIG. 5 is a process for providing elevated, indented, or texturized sensations to an object near a display device using ultrasound in accordance with another embodiment.
  • DETAILED DESCRIPTION
  • The present invention will be described with reference to the drawing figures, wherein like numerals represent like elements throughout. For the processes described below, the steps recited may be performed out of sequence, and sub-steps not explicitly described or shown may be performed. In addition, “coupled” or “operatively coupled” may mean that objects are linked through zero or more intermediate objects.
  • In the examples forthcoming, ultrasound or ultrasonic waves are given as an example to provide elevated, indented, or texturized sensations to an object near a display device. However, one of ordinary skill would appreciate that any acoustic or radio wave that excites a distant object or is sensed by the human body may be applicable for the examples and processes given in this disclosure.
  • In the examples forthcoming, the sensation felt by an object via an airborne ultrasound may be similar to vibration or gyration. The sensation may be varied by producing focal points of different sizes and intensities. For the case where the object is human skin, the vibration or gyration caused by an airborne ultrasound may depend on the targeted receptors in the skin. Adapting or controlling the ultrasound focal or control points for different receptors may cause different sensations for the user's skin.
  • Elevation or elevated sensations describe different sensations that may be caused to an object using ultrasound at a predetermined or random distance from a display or electronic device. As an example, the relative distance of the object may range from one millimeter to several meters, as desired.
  • Indenting may be a configuration where an object is given a sensation around its perimeter while little sensation is given to the inner area of the object. Indenting may also describe a configuration where a given location in space near a display device provides a substantially sensible ultrasound to an object, but at a point lower or closer to the display device the ultrasound is not substantially sensible. Indenting may further describe a configuration where a given location in space near a display device provides a substantially sensible ultrasound to an object, but at a point lower or closer to the display device the ultrasound is not substantially sensible until a predetermined point near the display device is reached.
  • Texturizing or texturing describes a process where an electronic device using controlled ultrasound over air may provide, simulate, or mimic friction, pulsing sensation, pulsating sensation, variable smoothness, variable thickness, coarseness, fineness, irregularity, a movement sensation, bumpiness, or rigidness that is sensed by or detectable by an object.
  • U.S. application Ser. No. 12/406,273 is herein incorporated by reference as if fully set forth and may be used in combination with the given examples to provide a display device that is elevated, indented, or texturized while ultrasound provides a sensation to an object near the display device.
  • FIG. 1 is a diagram of a wireless subscriber unit, user equipment (UE), mobile station, pager, cellular telephone, personal digital assistant (PDA), computing device, surface computer, tablet computer, monitor, general display, versatile device, automobile computer system, vehicle computer system, or television device 100 for mobile or fixed applications. Device 100 comprises computer bus 140 that couples one or more processors 102, one or more interface controllers 104, memory 106 having software 108, storage device 110, power source 112, and/or one or more display controllers 120. In addition, device 100 comprises an elevation, indenting, or texturizing controller 121 to provide sensations to an object located near one or more display devices 122.
  • One or more display devices 122 can be configured as a liquid crystal display (LCD), light emitting diode (LED), field emission display (FED), organic light emitting diode (OLED), or flexible OLED display device. The one or more display devices 122 may be configured, manufactured, produced, or assembled based on the descriptions provided in US Patent Publication Nos. 2007-247422, 2007-139391, 2007-085838, or 2006-096392 or U.S. Pat. No. 7,050,835 or WO Publication 2007-012899, all herein incorporated by reference as if fully set forth. In the case of a flexible display device, the one or more electronic display devices 122 may be configured and assembled using organic light emitting diodes (OLED), liquid crystal displays using flexible substrate technology, flexible transistors, or field emission displays (FED) using flexible substrate technology, as desired. One or more display devices 122 can be configured as a touch screen display using resistive, capacitive, surface acoustic wave (SAW), infrared, strain gauge, optical imaging, dispersive signal technology, acoustic pulse recognition, frustrated total internal reflection, or magneto-strictive technology, as understood by one of ordinary skill in the art.
  • Coupled to one or more display devices 122 may be pressure sensors 123. Coupled to computer bus 140 are one or more input/output (I/O) controllers 116, I/O devices 118, GPS device 114, one or more network adapters 128, and/or one or more antennas 130. Device 100 may have one or more motion, proximity, light, optical, chemical, environmental, moisture, acoustic, heat, temperature, radio frequency identification (RFID), biometric, face recognition, image, photo, or voice recognition sensors 126 and touch detectors 124 for detecting any touch inputs, including multi-touch inputs, for one or more display devices 122. One or more interface controllers 104 may communicate with touch detectors 124 and I/O controller 116 for determining user inputs to device 100.
  • Ultrasound source/detector 125 may be configured in combination with touch detectors 124, elevation, indenting, or texturizing controller 121, one or more display devices 122, pressure sensors 123, or sensors 126 to project or generate ultrasound waves, rays, or beams to an object to simulate elevated, indented, or texturized sensations, recognize inputs, or track the object, as will be explained in more detail below. There may be cases for input recognition or object tracking wherein ultrasound is provided without a detectable sensation to the object.
  • Still referring to device 100, storage device 110 may be any disk based or solid state memory device for storing data. Power source 112 may be a plug-in, battery, solar panels for receiving and storing solar energy, or a device for receiving and storing wireless power as described in U.S. Pat. No. 7,027,311, herein incorporated by reference as if fully set forth. One or more network adapters 128 may be configured as a Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Orthogonal Frequency-Division Multiplexing (OFDM), Orthogonal Frequency-Division Multiple Access (OFDMA), Global System for Mobile (GSM) communications, Enhanced Data rates for GSM Evolution (EDGE), General Packet Radio Service (GPRS), cdma2000, wideband CDMA (W-CDMA), long term evolution (LTE), 802.11x, Wi-Max, mobile Wi-Max, Bluetooth, or any other wireless or wired transceiver for modulating and demodulating information communicated via one or more antennas 130. Additionally, any of the devices, controllers, displays, components, etc. in device 100 may be combined, made integral, or separated, as desired. For instance, elevation, indenting, or texturizing controller 121 may be combined with ultrasound source/detector 125 in one unit.
  • FIGS. 2 a-2 d are diagrams of configurations for providing elevated, indented, or texturized sensations to an object using ultrasound. In FIG. 2 a, display device layer 204 lies proximate to ultrasound layer 205. Although single layers are shown, layers 204 and 205 can each be composed of a plurality of sublayers. Although display device layer 204 is shown above ultrasound layer 205, some or most of the components of ultrasound layer 205, such as ultrasound transducers or detectors, may be provided in substantially the same plane as display device layer 204. Display device layer 204 can be either a flexible or rigid display device for displaying video, images, photos, graphics, text, etc.
  • Ultrasound layer 205 can be configured and composed of ultrasound transducer, source, or detector devices as described in “Two-dimensional scanning tactile display using ultrasound radiation pressure” by Shinoda et al. (2006), “A Tactile Display using Ultrasound Linear Phased Array” by Shinoda et al. (2004), or “Small and Lightweight Tactile Display (SaLT) and Its Application” by Kim et al. (2009), all of which are herein incorporated by reference as if fully set forth. As indicated by the incorporated references, linear phased arrays of ultrasound can provide focal or control points of at least 1 mm diameter for fine, precise tactile airborne stimuli at variable frequencies and intensities. Larger focal points may also be provided. Techniques for tracking or detecting motion of a focal or control point and object may include Time Delay of Arrival (TDOA), where the difference in arrival times and the velocity of an ultrasound at one or more detectors are used to establish and track location. Airborne refers to an ultrasound transmission that may propagate through the air for at least a predetermined distance.
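  • As a minimal sketch of how TDOA-style localization might recover an object's position from the reflections described above, the Python example below assumes known detector positions and uses a simple grid search; a production implementation would use a closed-form or least-squares TDOA solver, and all positions and timings here are synthetic.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def locate_by_tdoa(detectors, arrival_times, box, step=0.005):
    """Estimate a 2-D reflector position from arrival-time differences.

    detectors: (N, 2) detector positions in meters.
    arrival_times: (N,) measured arrival times in seconds.
    box: ((xmin, xmax), (ymin, ymax)) search region in meters.
    """
    best, best_err = None, float("inf")
    (x0, x1), (y0, y1) = box
    for x in np.arange(x0, x1, step):
        for y in np.arange(y0, y1, step):
            dist = np.linalg.norm(detectors - np.array([x, y]), axis=1)
            t = dist / SPEED_OF_SOUND
            # Compare time *differences* so the unknown emission time cancels.
            resid = (t - t[0]) - (arrival_times - arrival_times[0])
            err = float(np.sum(resid ** 2))
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Self-check with a synthetic object at (0.10 m, 0.20 m):
dets = np.array([[0.0, 0.0], [0.3, 0.0], [0.0, 0.3], [0.3, 0.3]])
truth = np.array([0.10, 0.20])
times = np.linalg.norm(dets - truth, axis=1) / SPEED_OF_SOUND
print(locate_by_tdoa(dets, times, ((0.0, 0.3), (0.0, 0.3))))
```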
  • As previously stated, stimuli can be provided to an object by transmitting one or more ultrasound focal points from a phased array to cause a vibration, gyration, beat, or tap. The ultrasound intensity may be varied to cause different feelings in the object. Sensations may also be varied by changing focal point sizes.
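  • The sketch below illustrates, under stated assumptions, the phased-array principle behind such focal points: each element fires with a delay chosen so that all wavefronts arrive in phase at the target, and per-element amplitude scaling gives the intensity variation described above. The 40 kHz carrier, the helper names, and the array geometry are illustrative assumptions, not an interface from this disclosure.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s
CARRIER_HZ = 40_000.0   # a carrier frequency commonly used for airborne ultrasound

def focusing_delays(elements, focal_point):
    """Per-element firing delays so all wavefronts arrive in phase at
    focal_point; elements farther from the point fire earlier.

    elements: (N, 3) transducer positions in meters.
    focal_point: (3,) target point in meters.
    """
    dist = np.linalg.norm(elements - np.asarray(focal_point), axis=1)
    return (dist.max() - dist) / SPEED_OF_SOUND  # seconds, all >= 0

def drive_signal(delay_s, amplitude, t):
    """Delayed, amplitude-scaled carrier for one element at sample times t."""
    return amplitude * np.sin(2.0 * np.pi * CARRIER_HZ * (t - delay_s))

# A 4x4 array in the z=0 plane focusing 10 cm above its center:
xs = np.linspace(-0.015, 0.015, 4)
elements = np.array([[x, y, 0.0] for x in xs for y in xs])
print(focusing_delays(elements, (0.0, 0.0, 0.10)))
```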
  • Ultrasound layer 205 comprises an array of coupled ultrasound transducers and/or detectors that may emit directional ultrasound waves, rays, or beams through air to objects at location points 206, 208, and/or 210 in sensation zone 202. Layer 205 also detects reflections of the emitted waves off of the objects at location points 206, 208, and/or 210. Layer 205 is controlled in part by elevation, indenting, or texturizing controller 121. Sensation zone 202 may be the space, part of the space, or a force field above display device layer 204 that defines the range of ultrasound perception. Sensation zone 202 may be defined using approximate boundaries in order to limit the space over which ultrasound is emitted above display device layer 204 for safety or power conservation. Another benefit of having sensation zone 202 is that a user can have space in other areas of display device layer 204 for normal operation of device 100.
  • In addition to providing airborne ultrasound in the direction of the user, ultrasound layer 205 may be configured with transducers and detectors directed away from the user. This double-sided configuration is desirable for providing ultrasound sensations to fingers placed behind device 100 for grasping in mobile applications. An airborne ultrasound zone behind device 100 may also give a user the ability to virtually grasp, from afar, on-screen images perpendicular to device 100.
  • Objects at location points 206, 208, and/or 210 may be any one of a finger, part of a finger, a hand, part of a hand, skin, any body part, a special ultrasound sensitive glove, part of a special ultrasound sensitive glove, an ultrasound sensitive finger attachment, an ultrasound sensitive thimble, an ultrasound sensitive wand, a material that reacts in response to ultrasound, or a material that is perceptive to ultrasound, as desired.
  • FIG. 2 b is a diagram showing various approximate airborne ultrasound patterns 222, 224, 226, and 228 emitted over display device surface 231. Substantially cubic pattern 222 may be provided by emitting rays from ultrasound layer 205 to provide a substantially cubic sensation. FIG. 2 e shows an example of a focal point pattern 222 1 for providing a substantially cubic pattern sensation on finger 222 2 by ultrasound layer 205. Ultrasound control or focal points shown in FIG. 2 e or other figures are not drawn to scale and may be approximate in size. Dot or dimple pattern 224 may be provided by emitting rays from ultrasound layer 205 to provide a substantially spherical sensation. FIG. 2 e shows an example of a focal point pattern 224 1 for a dot or dimple pattern on finger 224 2 emitted by ultrasound layer 205 to provide a substantially spherical sensation.
  • Moreover, substantially cylindrical pattern 226 may be provided by emitting rays from ultrasound layer 205 to provide a substantially circular sensation. FIG. 2 e shows an example of a focal point pattern 226 1 for a cylindrical pattern sensation on finger 226 2 provided by ultrasound layer 205.
  • Substantially rectangular pattern 228 may be provided by emitting rays from ultrasound layer 205 to provide a substantially rectangular sensation. FIG. 2 e shows an example of focal point edge patterns 228 1 and 228 2 for a rectangular pattern sensation on finger 228 3 provided by ultrasound layer 205. Although two edges are shown on finger 228 3, a single edge or multiple edges may be projected. Edge projections are desirable for virtual keyboard applications, where the projected edges help define the boundaries of a key.
  • In the examples given in FIG. 2 b, ultrasound layer 205 may be controlled in part by ultrasound source/detector 125 in combination with elevation, indenting, or texturizing controller 121. In FIG. 2 e, the ultrasound may be swept or stroked over each focal or control point in a pattern at a high frequency or at variable pulsating frequencies using various intensity levels dependent upon the desired sensation or virtual effect. Although well-defined shapes are shown in FIGS. 2 b and 2 e, actual sensations will vary from person to person and with the accuracy of the phased array ultrasound source.
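  • A minimal sketch of such a sweep appears below, assuming a callback that steers the array to one focal point at a time; the point list, rates, and the emit_at stand-in are illustrative assumptions rather than hardware interfaces from this disclosure.

```python
import itertools
import time

def sweep_pattern(points, emit_at, sweep_hz=200.0, duration_s=1.0):
    """Stroke the focal point over each control point of a shape in turn.

    points: list of (x, y, z) focal points forming the pattern, e.g. the
        edge points of a key boundary.
    emit_at: callback that steers the phased array to one focal point;
        a stand-in for the hardware driver.
    sweep_hz: control points visited per second.
    """
    period = 1.0 / sweep_hz
    end = time.monotonic() + duration_s
    for point in itertools.cycle(points):
        if time.monotonic() >= end:
            break
        emit_at(point)
        time.sleep(period)

# Example: trace a small square boundary 5 mm above the display surface.
square = [(0.00, 0.00, 0.005), (0.01, 0.00, 0.005),
          (0.01, 0.01, 0.005), (0.00, 0.01, 0.005)]
sweep_pattern(square, emit_at=print, duration_s=0.02)
```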
  • FIG. 2 c is a diagram providing an example configuration of display device layer 204 and ultrasound layer 205. Display pixels 232 1 to 232 n may lie partially adjacent, on the same level, or on the same layer relative to elevation, indenting, or texturizing cells 234 1 to 234 n, each having an ultrasound transducer, source, and/or detector. Alternatively, display pixels 232 1 to 232 n may lie partially above elevation, indenting, or texturizing cells 234 1 to 234 n. Display and ultrasound array or matrix 233 also comprises display pixels 236 1 to 236 n adjacent to elevation, indenting, or texturizing cells 238 1 to 238 n that are adjacent to display pixels 240 1 to 240 n. The elevation, indenting, or texturizing cells may be controlled by elevation, indenting, or texturizing controller 121 to adjust the intensity, orientation, or direction of the ultrasound emitted to location points 206, 208, or 210.
  • FIG. 2 d shows an embodiment of a display device array or matrix 235 from a top view where ultrasound transducer, source, or detector cells 239 and 241 are placed selectively within two predetermined areas without display pixels so that the surface of display device array or matrix 235 is mostly composed of display pixels 237. In an alternative embodiment, cells 239 and 241 may line the perimeter of display device array or matrix 235. When placed around the perimeter, integration with existing display device layouts may be more easily enabled.
  • FIG. 3 is a diagram of processes for an electronic device providing elevated, indented, or texturized sensations to an object near display device 302 using ultrasound. For the examples given in FIG. 3, the object provided elevated, indented, or texturized sensations near display device 302 using ultrasound may be any one of a finger, part of a finger, multiple fingers, a hand, part of a hand, or two hands, as desired. Display device 302 may be assembled with at least some of the components described for device 100.
  • For inputting or triggering a requested action, a displayed “click here” uniform resource locator (URL) or hyperlink is provided to an object that may be at location points 206, 208, and/or 210 with an elevated substantially circular ultrasound pattern 304. Clicking may be performed by tracking the motion, momentum, or velocity of the object as provided in the example of FIG. 5 below. Motion of the object relative to display device 302 that can be recognized as an input, gesture, or command may be a push towards display device 302, a pull away from display device 302, a sideways or lateral motion relative to display device 302, a circle gesture, a square gesture, a rectangular gesture, a spiral gesture, a swirl gesture, a swipe gesture, a pinch gesture, a flick gesture, a customized gesture, a user defined gesture, a multiple finger coordinated motion, or a single finger gesture, as desired. In particular, single finger gesture control is desirable for mobile applications since the user may use the thumb, for example, for gestures to signal an input or command while holding device 100 at the same time, leaving the other hand free. Gestures may be stored in a gesture library or database in storage device 110.
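  • As a small sketch of how such gestures might be recognized from tracked positions, the example below labels a 3-D trace as a push, pull, or lateral motion; the 2 cm threshold and all names are illustrative assumptions, and a full gesture library would match many more shapes than this.

```python
import numpy as np

def classify_gesture(trace, threshold_m=0.02):
    """Label a tracked trace as push, pull, lateral, or none.

    trace: (T, 3) object positions over time, with z the distance
        from the display surface in meters.
    threshold_m: minimum travel (2 cm here) before motion counts.
    """
    dz = trace[-1, 2] - trace[0, 2]
    lateral = float(np.linalg.norm(trace[-1, :2] - trace[0, :2]))
    if abs(dz) >= max(lateral, threshold_m):
        return "push" if dz < 0 else "pull"  # toward vs. away from display
    if lateral >= threshold_m:
        return "lateral"
    return "none"

# A finger moving 5 cm toward the screen reads as a push:
print(classify_gesture(np.array([[0.0, 0.0, 0.10], [0.0, 0.0, 0.05]])))
```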
  • In addition to gestures, tracking an object relative to display device 302, as provided in an example in FIG. 5, may be used for drawing purposes. A user may use a finger to draw a character or shape in the air that is detected by ultrasound source/detector 125 and rendered into an image by one or more processors 102. This feature may be useful, for instance, in computer games, toys, or graphics applications.
  • In FIG. 3, part of an on-screen virtual or simulated keyboard displayed on display device 302 provides the letter “E” key having an elevated substantially square ultrasound 308 provided to an object at location points 206, 208, and/or 210. Although part of a virtual or simulated keyboard is shown, display device 302 can be configured to show a whole QWERTY keyboard, a numeric keypad, or a combination of a whole QWERTY keyboard and a numeric keypad, as desired. The letter “S” key is provided by a partially displayed portion and an elevated substantially circular ultrasound 310. The virtual or simulated keyboard may also be programmed to replicate Braille lettering, as desired.
  • As an example, for letters “Q” and “A”, ultrasound 306 1 and 306 2 are projected around the perimeter or edges of the keys to define boundaries so that a user may type the correct key and can find or feel the correct position of the keys. For displayed letters “Q” and “A”, a user may type the key by physically touching display device 302. The touch input is detected by touch detectors 124.
  • In one embodiment, a pull-away motion of an object from display device 302 may be detected as a capital or superscript letter input, while a push motion in the direction towards the display device may indicate subscripting of the letter. In response to a detected motion, haptic feedback, force feedback, or tactile feedback in the form of a played sound, gyration, or vibration may be provided via I/O controller 116.
  • Referring to FIG. 4, chart 400 shows an example of how ultrasound focal or control point strength or intensity may be varied over time to provide different sensations to a user's finger, hand, or any other object. For instance, as a user pulls a finger away from display device 302, which is detected by ultrasound source/detector 125, strength or intensity may be reduced by elevation, indenting, or texturizing controller 121. Conversely, when the finger is pushed towards display device 302, strength or intensity may be increased for a predetermined period, creating a virtual feeling of resistance.
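  • One possible mapping from object distance and direction of motion to drive level is sketched below; the constants and function names are illustrative assumptions chosen only to mirror the behavior of chart 400, not values from this disclosure.

```python
def drive_level(distance_m, approaching, base=0.05, resist_gain=1.5):
    """Map object distance and direction of motion to a 0..1 drive level.

    Pulling away lowers intensity; pushing toward the display raises it
    for a predetermined period to create a feeling of resistance.
    """
    level = base / max(distance_m, 0.01)  # nearer object -> stronger field
    if approaching:
        level *= resist_gain              # brief extra "resistance"
    return min(level, 1.0)

print(drive_level(0.10, approaching=False))  # 0.5
print(drive_level(0.10, approaching=True))   # 0.75
print(drive_level(0.30, approaching=False))  # ~0.17, weaker when far away
```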
  • In addition to inputting information via the on-screen virtual or simulated keyboard shown in FIG. 3, display device 242 may project ultrasound as shown in FIG. 2 f. Ultrasound transducer, source, or detector cells 244 may project onto zone 246 so that the user's view of display device 242 is unobstructed. Zone 246 may be projected onto a table or desk, giving the user the ability to use that space as an input area similar to a keyboard or mouse. A special pad 248 may be used to reflect or vibrate in response to ultrasound from transducer, source, or detector cells 244.
  • Referring again to the virtual or simulated keyboard on display device 302, instructions in software 108 can be used to predict or anticipate keystrokes. Prediction or anticipation may be based on a word or sentence entered. In response to the anticipation, a different key may emit ultrasound to a user's finger, hand, or any other object to encourage or invoke input and provide context awareness.
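  • A toy sketch of such anticipation appears below: a bigram model counts letter-to-letter transitions in previously entered text and proposes the likeliest next key, over which the device could then emit ultrasound. The model, corpus, and names are illustrative assumptions; software 108 could use any predictive text method.

```python
from collections import Counter, defaultdict

def build_bigram_model(corpus):
    """Count letter-to-letter transitions in previously entered text."""
    model = defaultdict(Counter)
    for a, b in zip(corpus, corpus[1:]):
        model[a][b] += 1
    return model

def anticipated_key(model, typed):
    """Return the most likely next key, or None if there is no history;
    the device could emit ultrasound over that key to invite input."""
    if not typed or typed[-1] not in model:
        return None
    return model[typed[-1]].most_common(1)[0][0]

model = build_bigram_model("the quick brown fox jumps over the lazy dog")
print(anticipated_key(model, "th"))  # 'e'
```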
  • An embodiment of the present invention may provide enhanced electronic advertising processes. Advertisement 316, such as an adword by Google, can be sold to an advertiser for a certain price for having elevated substantially circular ultrasound 317 on at least one part or the entire advertisement image or photo. Advertisement 318 can be sold to an advertiser for a different price, higher or lower, for having elevated substantially circular ultrasound 318 1 and 318 2 each projected at a different intensity in comparison to substantially circular ultrasound 317. In addition, the strength or intensity of substantially circular ultrasound 317, 318 1, and 318 2 may be dependent on location determined by GPS device 114 and varied over time as shown in FIG. 4.
  • Advertisement 316 or 318 may be provided in a separate pop up window with the emitted ultrasound to an object at location points 206, 208, and/or 210. The emitted ultrasound may be provided only for a predetermined time period after the pop up window is displayed thereby providing a nudge or feeling sensation to the object. As the pop up window emerges the intensity of ultrasound to the object may be increased over time prior to turning off thereby simulating the effect of the pop up window virtually emerging from display device 302.
  • With advertisement 316 or 318 in a separate pop up window, or for any other application in a window, a user may interact with an operating system by moving, grabbing, dragging, or dropping windows. Substantially circular ultrasound 317, for instance, may provide a sensation to a user's fingers by projecting multiple focal or control points when the user virtually tries to grab a window shown on display device 302. As the user moves a window, a slight vibration is provided by substantially circular ultrasound 317. A strong vibration may be provided by substantially circular ultrasound 317 when running into obstacles or boundaries on the screen. The vibration may stop when the user releases the window, as desired.
  • An embodiment of the present invention may provide electronic commerce processes. A “Buy Now” button is provided with an elevated substantially circular ultrasound 322 1 and an elevated substantially square ultrasound 322 2 to an object at location points 206, 208, and/or 210. The “Buy Now” button is associated with triggering the purchasing of displayed shirt 324 by sending a request to a server (not shown) over one or more network adapters 128. For shirt 324, ultrasound texturizing pattern 326 is provided to virtually replicate or simulate the surface or composition of shirt 324. Ultrasound texturizing pattern 326 can be a combination of different ultrasound focal or control points. Although a shirt 324 is shown, ultrasound texturizing pattern 326 can be used to provide surface information for any product being sold or displayed on display device 302.
  • Using touch detectors 124 in combination with elevation, indenting, or texturizing controller 121, displayed shirt 324 can be highlighted and then rotated in response to a multitouch input while ultrasound texturizing pattern 326 is dynamically changed to virtually reflect the different surfaces or materials used to make the shirt. Shirt 324 can be zoomed in and out using multitouch inputs detected by touch detectors 124, with each zoom level reflecting texture differences in ultrasound texturizing pattern 326. For instance, a zoomed-in view may feel grainier or rougher compared to a zoomed-out view. The zoom levels can also be configured with a fading in or out effect by one or more processors 102 and can involve retrieving additional information from a server (not shown) over one or more network adapters 128. Beyond the examples of fabrics, any material may be replicated or simulated by ultrasound texturizing pattern 326. Airborne ultrasound feedback, similar to multitouch inputs, may also be used to change views, angles, or size of displayed shirt 324.
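  • One way the zoom-to-texture relationship could be parameterized is sketched below; the constants are illustrative assumptions that make zoomed-in views feel grainier through a finer control-point pitch and deeper intensity modulation.

```python
def texture_parameters(zoom):
    """Map a zoom factor to texturizing parameters.

    Returns (control-point pitch in mm, 0..1 intensity modulation depth);
    smaller pitch and deeper modulation read as a grainier, rougher feel.
    """
    spacing_mm = max(1.0, 8.0 / zoom)         # finer grid when zoomed in
    modulation_depth = min(1.0, 0.25 * zoom)  # rougher feel when zoomed in
    return spacing_mm, modulation_depth

for zoom in (0.5, 1.0, 2.0, 4.0):
    print(zoom, texture_parameters(zoom))
```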
  • Still referring to displayed shirt 324, display device 302 may be elevated, indented, or texturized in accordance with examples given in U.S. application Ser. No. 12/406,273. With shirt 324 texturized on display device 302 and at a distance to an object using ultrasound, the user is given an improved realization of the composition of the shirt by combining the two enhancements.
  • Referring again to FIG. 3, an embodiment of the present invention provides an electronic game, such as tic-tac-toe, by projecting ultrasound pattern 328 to an object at location points 206, 208, and/or 210. As an example in gaming applications, ultrasound pattern 328 may be projected to multiple fingers and tracked as the user tries to pinch, grab, or push an object in a game or any other simulated environment displayed on display device 302. Ultrasound pattern 328 emitted onto an object can also control scrolling or drag and drop functions of items in a game in combination with multitouch inputs detected by touch detectors 124.
  • In another example, ultrasound pattern 328 can be controlled by elevation, indenting, or texturizing controller 121 such that an object being tracked at location point 206, such as a user's hand, can be handed off or switched to location point 208, such as the user's other hand, to be tracked. Using this process, for instance, a user may dribble a ball from one hand to another in front of display device 302. Moreover, passing location points in space and time from 206 to 208 results in passing a location point of an object between different horizontal planes relative to display device layer 204. Alternatively, the location point may be passed on the same plane.
  • In another example, ultrasound pattern 328 can be used to emulate a spring like sensation to an object and simulate elasticity to a user's hand in a game or any other application. Ultrasound layer 205 can also simulate whole screen explosions, blasts, or bullets being fired at the user by turning on several ultrasound transducers for a predetermined period of time in a game or movie. Ultrasound pattern 328 may also provide a gaming feature where tilting or rotation detected by an accelerometer in sensors 126 controls ultrasound output for four dimensional motion gaming. Ultrasound pattern 328 may also define the boundaries of a virtual space or layer between location points 206, 208, and 210 and display device layer 204.
  • In another embodiment, ultrasound pattern 328 projected onto multiple fingers can be used to simulate a virtual joystick or pointing stick for 360-degree rotational input by tracking the movement of the fingers with ultrasound source/detector 125. A three dimensional accelerometer can be included in sensors 126 to be used in combination with elevation, indenting, or texturizing controller 121 to project ultrasound pattern 328 in response to a programmed action in the game. Similarly, a visual haptic ultrasound mouse or track pad may be configured by projecting and controlling ultrasound pattern 328 to replicate the functionality of a mouse or track pad and provide a 4-D free space tactile user interface device.
  • In another embodiment, ultrasound pattern 328 can provide enhanced features for online collaboration, distance learning, online conferencing, social networking, or online dating. For instance, in response to a push command on a networked computing device (not shown), which may or may not have an ultrasound enhanced display device, ultrasound pattern 328 may provide feedback to an object at location points 206, 208, and/or 210. Examples of feedback are a poke sensation similar to that on Facebook, a push sensation, a virtual handshake sensation, etc. In online conferencing, tactile inputs or gestures via ultrasound pattern 328 may be used during a video conference application for additional interaction between conversing parties. Social networking or adult entertainment applications can be enhanced by ultrasound pattern 328 providing stimulation in connection with video, image, photo, or audio media on display device 302.
  • For digital imagery, ultrasound rays 327 1 and 327 2 may be used to augment, enhance, or characterize different objects in photo or image 327 3. Ultrasound rays 327 1 and 327 2 may be preprogrammed into photo or image 327 3 by the owner for watermarking, artistic design, or the like. Ultrasound 327 1 and 327 2 may also be used to augment photo editing applications. If display device 302 is configured as a digital sign, ultrasound 327 1 and 327 2 may be used to get the attention of people walking near or viewing the photo or image 327 3 on the sign.
  • In addition, ultrasound pattern 328 may also project sensations to simulate maps, topography, geography, imagery, or location service processes in combination with GPS device 114. Ultrasound pattern 328 can simulate mountainous regions on a map by projecting an ultrasound of various heights and intensities to an object at location points 206, 208, and/or 210.
  • Ultrasound pattern 328 may also be used to simulate the actions of picking up text (i.e., cut) or dropping text (i.e., paste) in an email, a 3rd Generation Partnership Project (3GPP) or 3GPP2 short message service (SMS) text message, or a 3GPP/3GPP2 multimedia message service (MMS) message. Ultrasound pattern 328 may also be used in connection with a PDF document, a Word document, an Excel spreadsheet, a four dimensional (4-D) screensaver, 4-D art, 4-D drawings, 3-D imagery, a 3-D sculpture, a 4-D “etch-a-sketch”, or architecture designs using scalable or vector graphics. Any of the actions given above for ultrasound pattern 328 may be used in combination with transmitting or receiving information over one or more network adapters 128.
  • In e-book applications, ultrasound pattern 328 can be used to replicate or simulate the edge of a page and allow a user to virtually lift or pick-up a page. Moreover, a user may be able to feel text of varying sensations provided by ultrasound pattern 328 that is hyperlinked or highlighted on an e-book page as the user moves a finger across the page.
  • For multitouch applications, airborne ultrasound pattern 328 may be used to simulate friction or resistance as a user moves an image by touching the screen, zooms into an image, or zooms out of an image. When zooming beyond a threshold, ultrasound pattern 328 can be used to provide resistance, thereby defining boundaries and providing a warning or alarm. While scrolling, panning, or gliding, hitting a threshold level or endpoint causes ultrasound tactile feedback or a response. For scrolling momentum, ultrasound pattern 328 may provide high intensity initially to simulate inertia and then less intensity as momentum builds. For navigating through a list of items on display device 302, items may be highlighted on the screen as the user scrolls through the list from afar.
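  • The sketch below shows one plausible intensity schedule for such scrolling feedback; the constants are illustrative assumptions mirroring the inertia-then-ease-off behavior and the boundary pulse described above.

```python
def scroll_feedback(progress, at_boundary, start_intensity=0.8, floor=0.1):
    """Tactile drive level (0..1) while scrolling.

    progress: 0.0 at the start of the scroll, 1.0 at full momentum.
    at_boundary: True when a threshold level or endpoint is hit.
    """
    if at_boundary:
        return 1.0  # sharp warning/alarm pulse at an endpoint
    easing = start_intensity * (1.0 - progress)  # strong inertia at first
    return max(easing, floor)

print(scroll_feedback(0.0, at_boundary=False))  # 0.8 simulating inertia
print(scroll_feedback(0.9, at_boundary=False))  # 0.1 as momentum builds
print(scroll_feedback(0.5, at_boundary=True))   # 1.0 hitting an endpoint
```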
  • Moreover, display device 302 may have ultrasound source/detectors 330 1-330 4 in a slightly beveled position or level with the frame of display device 302. Display device 302 may also have digital image or infrared cameras 334 1-334 4 for tracking motion of objects at location points 206, 208, and/or 210 using algorithms such as those described in U.S. Pat. No. 7,317,872, herein incorporated by reference as if fully set forth, to perform additional sensor measurements. Other sensor measurements for additional metrics and refinement include infrared or optical detection of the depth of objects at location points 206, 208, and/or 210. These sensors can be embedded next to or within each display cell in display device 302.
  • In another embodiment, display device 302 replicates, mimics, or simulates a customizable or programmable interface or control panel for a remote control, an instrument panel on a vehicle, an automobile dashboard configuration, audio equalizers, multitouch equalizers, a radio button list, or a consumer electronics button surface with ultrasound patterns 332 1-332 3. For demoing consumer electronics online, ultrasound patterns 332 1-332 3 give a user the ability to simulate buttons on a device prior to purchase or as a tutorial. Moreover, ultrasound patterns 332 1-332 3 can be programmed for volume control, replicating smart home switches or controllers, or replicating a dial or knob, as desired.
  • Still referring to FIG. 3, ultrasound may be used to feel, sense, or move text, images, photos, windows, or icons. For instance, web searching is performed by dragging and dropping the text “TEST SEARCH” 337 into search box 336. A user may be provided substantially circular ultrasound 338 when grabbing the text “TEST SEARCH” from a distance to display device 302. The user then moves or drags the text “TEST SEARCH” from afar via path 339 over to search box 336 and releases or drops it. The user's finger movements are tracked by ultrasound source/detector 125 in combination with elevation, indenting, or texturizing controller 121. The text may be shown as moving on display device 302 as the user's fingers are tracked. Similarly, a visual, photo, or image search may be performed by grabbing the image of shirt 324 and dropping it in search box 336.
  • In another example, ultrasound pattern 328 can be used to replicate or simulate a virtual stylus, pen, or pencil allowing a user to mimic writing or drawing on display device 302 similar to a notepad. The virtual stylus, pen, or pencil may be configured without the user physically holding anything. Unlike a notepad, the writing or drawing may be done at a predetermined distance from display device 302 in sensation zone 202.
  • Ultrasound pattern 328 can also be used for medical applications. For instance, with laparoscopic surgery a physician located in the surgery room or remote to the surgery room may be able to feel or sense images or photos of organs of a patient provided by an internal surgical camera and displayed on display device 302. Ultrasound pattern 328 may also be used to simulate pain of a patient to a doctor over the Internet.
  • In another example, ultrasound pattern 328 can be responsive to voice or visual commands or recognition detected by sensors 126. Alternatively, ultrasound pattern 328 can be a preprogrammed texturized pattern to notify the user of an incoming call, similar to a customized ringtone. Alternatively, ultrasound pattern 328 may be used for providing a warning to a driver in relation to a safety feature on an automobile. Alternatively, ultrasound pattern 328 may be used for enhancing icons in a system tray, with each icon having a different characteristic vibration sensation. Alternatively, device 100 may be controlled remotely, either wired or wirelessly, via a server or cloud computing platform (not shown) over one or more network adapters 128.
  • Moreover, ultrasound pattern 328 can be used to replicate, simulate, or enhance features for biometrics, musical instruments, video clips, editing audio tracks, editing video, computer aided designs (CAD), semiconductor layouts, e-books, a children's educational product, a children's productivity or educational game, a general education product, a 3-D drawing tool, distance learning, or pop-up children's books, as desired.
  • FIG. 5 is a process 500 for providing elevated, indented, or texturized sensations to an object near a display device using ultrasound. In the example given here, the object may be one or more fingers or hands at location points 206, 208, and/or 210. Ultrasound source/detector 125 determines the initial object location and calculates a distance and angle relative to display device 302 (step 502) to calculate focal or control point vectors. For initialization, a user's fingers, hand, or a predetermined object may be placed over a predetermined zone of display device 302. Alternatively, a user's fingers, hand, or a predetermined object may be detected by digital or infrared cameras 334 1-334 4 using image or photo recognition technology. Once the location of the object is determined, device 100 may display a preprogrammed image, such as a virtual keyboard or icon, on display device 302 at the detected location.
  • Ultrasound source/detector 125 in combination with elevation, indenting, or texturizing controller 121 projects or emits one or more ultrasound patterns, such as the ones shown in FIG. 2 e, having one or more focal or control points (step 504). In order to project a predetermined sensation, the intensity of ultrasound at one or more focal or control points may be varied. Also, in the case of multiple objects, ultrasound source/detector 125 may be time multiplexed to project different ultrasound patterns to each object. Elevation, indenting, or texturizing controller 121 focuses or adjusts focal or control point vectors (step 506). Ultrasound source/detector 125 in combination with elevation, indenting, or texturizing controller 121 detects, tracks, or senses movement of focal or control points to determine momentum and/or velocity of an object (step 508). While the object moves and is tracked, the ultrasound patterns provided to the object may vary based on images, text, video, or the like displayed on display device 302.
  • In order to enhance accuracy or user experience, device 100 may detect and track multitouch inputs by other fingers and/or input detected by other sensors (step 510). An animation or video of a generated surface may be displayed on display device 302 for feedback and showing the tracking of the object (step 512). If an input, gesture, or command is recognized by ultrasound source/detector 125 in combination with elevation, indenting or texturizing controller 121 (step 514), the input, gesture, or command is processed by one or more processors 102 (step 516) and information is retrieved based on the input, gesture, or command (step 518).
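  • The skeleton below restates steps 502-518 as a single control loop. Every object it touches (source_detector, controller, display, gesture_library) is a hypothetical stand-in for the components described above, not an interface defined by this disclosure.

```python
def run_process_500(source_detector, controller, display, gesture_library):
    """Control loop mirroring FIG. 5; all parameters are duck-typed
    stand-ins for the hardware and software components of device 100."""
    position = source_detector.locate_object()          # step 502
    controller.project_pattern(position)                # step 504
    while True:
        controller.adjust_focal_vectors(position)       # step 506
        position, velocity = source_detector.track()    # step 508
        extras = source_detector.read_other_sensors()   # step 510
        display.render_tracking_feedback(position)      # step 512
        gesture = gesture_library.match(position, velocity, extras)
        if gesture is not None:                         # step 514
            result = gesture.process()                  # step 516
            display.show(result)                        # step 518
```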
  • Although features and elements are described above in particular combinations, each feature or element can be used alone without the other features and elements or in various combinations with or without other features and elements. The methods, processes, or flow charts provided herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable storage medium for execution by a general purpose computer or a processor. Examples of computer-readable storage mediums include a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks, digital versatile disks (DVDs), and Blu-ray discs.
  • Suitable processors include, by way of example, a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) circuits, any other type of integrated circuit (IC), and/or a state machine.
  • A processor in association with software may be used to implement hardware functions for use in a computer, wireless transmit receive unit (WTRU) or any host computer. The programmed hardware functions may be used in conjunction with modules, implemented in hardware and/or software, such as a camera, a video camera module, a videophone, a speakerphone, a vibration device, a speaker, a microphone, a television transceiver, a hands free headset, a keyboard, a Bluetooth® module, a frequency modulated (FM) radio unit, a liquid crystal display (LCD) display unit, an organic light-emitting diode (OLED) display unit, a digital music player, a media player, a video game player module, an Internet browser, and/or any wireless local area network (WLAN) or Ultra Wide Band (UWB) module.

Claims (2)

1. An apparatus for providing ultrasound sensations to a user at a predetermined distance from the apparatus, the apparatus comprising:
a display device having a plurality of display pixels, a plurality of ultrasound transducers, and a plurality of ultrasound detectors;
a controller coupled to the plurality of ultrasound transducers and the plurality of ultrasound detectors, where the controller controls transmission of ultrasound over air to the user by the plurality of ultrasound transducers and receives, from the plurality of ultrasound detectors, feedback information of reflections off of the user; and
wherein the controller controls transmission of ultrasound over air to provide an ultrasound pattern having a plurality of ultrasound focal points to the user and tracks the motion of the user from received feedback from the plurality of ultrasound detectors to determine user input commands, where the user input commands are related to selection of a hyperlink displayed on the display device.
2. A method for providing ultrasound sensations to a user at a predetermined distance from an apparatus, the method comprising:
controlling a plurality of ultrasound transducers to transmit ultrasound over air to the user;
controlling a plurality of ultrasound detectors to receive feedback information of reflections off of the user;
transmitting ultrasound over air to provide an ultrasound pattern having a plurality of ultrasound focal points to the user; and
tracking the motion of the user from received feedback from the plurality of ultrasound detectors to determine user input commands, where user input commands are related to selecting a hyperlink displayed on a display device.

Family Cites Families (236)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7274652B1 (en) 2000-06-02 2007-09-25 Conexant, Inc. Dual packet configuration for wireless communications
US4871992A (en) 1988-07-08 1989-10-03 Petersen Robert C Tactile display apparatus
US5327457A (en) 1991-09-13 1994-07-05 Motorola, Inc. Operation indicative background noise in a digital receiver
US5867144A (en) 1991-11-19 1999-02-02 Microsoft Corporation Method and system for the direct manipulation of information, including non-default drag and drop operation
US6597347B1 (en) 1991-11-26 2003-07-22 Itu Research Inc. Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom
US5402490A (en) 1992-09-01 1995-03-28 Motorola, Inc. Process for improving public key authentication
US5412189A (en) 1992-12-21 1995-05-02 International Business Machines Corporation Touch screen apparatus with tactile information
US5490087A (en) 1993-12-06 1996-02-06 Motorola, Inc. Radio channel access control
US5463657A (en) 1994-02-15 1995-10-31 Lockheed Missiles & Space Company, Inc. Detection of a multi-sequence spread spectrum signal
FI95178C (en) 1994-04-08 1995-12-27 Nokia Mobile Phones Ltd Keyboard
US5504938A (en) 1994-05-02 1996-04-02 Motorola, Inc. Method and apparatus for varying apparent cell size in a cellular communication system
DE69514926T2 (en) 1994-08-18 2000-10-05 Interval Research Corp INPUT DEVICE FOR VIDEO WITH TACTILE FEEDBACK DEPENDING ON THE CONTENT OF THE VIDEO
US5602901A (en) 1994-12-22 1997-02-11 Motorola, Inc. Specialized call routing method and apparatus for a cellular communication system
US5673256A (en) 1995-07-25 1997-09-30 Motorola, Inc. Apparatus and method for sending data messages at an optimum time
US5712870A (en) 1995-07-31 1998-01-27 Harris Corporation Packet header generation and detection circuitry
US5752162A (en) 1995-11-03 1998-05-12 Motorola, Inc. Methods for assigning subscriber units to visited gateways
US5825308A (en) 1996-11-26 1998-10-20 Immersion Human Interface Corporation Force feedback interface having isotonic and isometric functionality
US5937049A (en) 1995-12-29 1999-08-10 Apropos Technology Service bureau caller ID collection with ISDN BRI
EP1291812A3 (en) 1996-02-09 2004-06-23 Seiko Instruments Inc. Display unit, manufacturing method of the same and electronic device
US5724659A (en) 1996-07-01 1998-03-03 Motorola, Inc. Multi-mode variable bandwidth repeater switch and method therefor
US5892902A (en) 1996-09-05 1999-04-06 Clark; Paul C. Intelligent token protected system with network authentication
US5867789A (en) 1996-12-30 1999-02-02 Motorola, Inc. Method and system for real-time channel management in a radio telecommunications system
US20010020202A1 (en) * 1999-09-21 2001-09-06 American Calcar Inc. Multimedia information and control system for automobiles
US6882086B2 (en) 2001-05-22 2005-04-19 Sri International Variable stiffness electroactive polymer systems
US7102621B2 (en) 1997-09-30 2006-09-05 3M Innovative Properties Company Force measurement system correcting for inertial interference
US6037882A (en) 1997-09-30 2000-03-14 Levy; David H. Method and apparatus for inputting data to an electronic system
WO1999017929A1 (en) 1997-10-03 1999-04-15 The Trustees Of The University Of Pennsylvania Polymeric electrostrictive systems
US6243078B1 (en) 1998-06-23 2001-06-05 Immersion Corporation Pointing device with forced feedback button
US6131032A (en) 1997-12-01 2000-10-10 Motorola, Inc. Method and apparatus for monitoring users of a communications system
US6256011B1 (en) 1997-12-03 2001-07-03 Immersion Corporation Multi-function control device with force feedback
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
KR100595920B1 (en) 1998-01-26 2006-07-05 Wayne Westerman Method and apparatus for integrating manual input
US7663607B2 (en) 2004-05-06 2010-02-16 Apple Inc. Multipoint touchscreen
US7760187B2 (en) 2004-07-30 2010-07-20 Apple Inc. Visual expander
NL1008351C2 (en) 1998-02-19 1999-08-20 No Wires Needed B V Data communication network.
US6104922A (en) 1998-03-02 2000-08-15 Motorola, Inc. User authentication in a communication system utilizing biometric information
US6185536B1 (en) 1998-03-04 2001-02-06 Motorola, Inc. System and method for establishing a communication link using user-specific voice data parameters as a user discriminator
JP3739927B2 (en) 1998-03-04 2006-01-25 Japan Science and Technology Agency Tactile sensor and tactile detection system
US5888161A (en) 1998-03-19 1999-03-30 Ford Global Technologies, Inc. All wheel drive continuously variable transmission having dual mode operation
US6184868B1 (en) 1998-09-17 2001-02-06 Immersion Corp. Haptic feedback control devices
DE19827905C1 (en) 1998-06-23 1999-12-30 Papenmeier Friedrich Horst Device for entering and reading out data
US6563487B2 (en) 1998-06-23 2003-05-13 Immersion Corporation Haptic feedback for directional control pads
US6429846B2 (en) 1998-06-23 2002-08-06 Immersion Corporation Haptic feedback for touchpads and other touch controls
US6140913A (en) 1998-07-20 2000-10-31 Nec Corporation Apparatus and method of assisting visually impaired persons to generate graphical data in a computer
US6117296A (en) 1998-07-21 2000-09-12 Thomson; Timothy Electrically controlled contractile polymer composite
US6004049A (en) 1998-10-29 1999-12-21 Sun Microsystems, Inc. Method and apparatus for dynamic configuration of an input device
US6787238B2 (en) 1998-11-18 2004-09-07 The Penn State Research Foundation Terpolymer systems for electromechanical and dielectric applications
US6434702B1 (en) 1998-12-08 2002-08-13 International Business Machines Corporation Automatic rotation of digit location in devices used in passwords
US6198470B1 (en) * 1998-12-28 2001-03-06 Uri Agam Computer input device
US6266690B1 (en) 1999-01-27 2001-07-24 Adc Telecommunications, Inc. Enhanced service platform with secure system and method for subscriber profile customization
DE19918432A1 (en) 1999-04-23 2000-10-26 Saueressig Gmbh & Co Expansion layer of compressible material between core cylinder and its sleeve is provided with depressions on its outer or inner circumferential surface
US6462840B1 (en) 1999-05-17 2002-10-08 Grigory Kravtsov Three dimensional monitor and tactile scanner
EP1188170B1 (en) 1999-06-22 2004-05-26 Peratech Ltd. Variable conductance structures
WO2001001374A1 (en) 1999-06-28 2001-01-04 Tactilics, Inc. Braille computer monitor
US6492979B1 (en) 1999-09-07 2002-12-10 Elo Touchsystems, Inc. Dual sensor touchscreen utilizing projective-capacitive and force touch sensors
JP2001117721A (en) 1999-10-14 2001-04-27 Mitsubishi Electric Corp Touch panel input type keyboard
US6535201B1 (en) 1999-12-17 2003-03-18 International Business Machines Corporation Method and system for three-dimensional topographical modeling
JP2003520374A (en) 2000-01-11 2003-07-02 Cirque Corporation Flexible touchpad sensor grid for conforming to an arcuate surface
US6822635B2 (en) 2000-01-19 2004-11-23 Immersion Corporation Haptic interface for laptop computers and other portable devices
GB0011829D0 (en) 2000-05-18 2000-07-05 Lussey David Flexible switching devices
US7196688B2 (en) 2000-05-24 2007-03-27 Immersion Corporation Haptic devices using electroactive polymers
US7210099B2 (en) 2000-06-12 2007-04-24 Softview Llc Resolution independent vector display of internet content
US6559620B2 (en) 2001-03-21 2003-05-06 Digital Angel Corporation System and method for remote monitoring utilizing a rechargeable battery
US6925495B2 (en) 2000-07-13 2005-08-02 Vendaria Media, Inc. Method and system for delivering and monitoring an on-demand playlist over a network using a template
US6571102B1 (en) 2000-08-08 2003-05-27 Motorola, Inc. Channel management technique for asymmetric data services
DE10046099A1 (en) 2000-09-18 2002-04-04 Siemens Ag Touch sensitive display with tactile feedback
US6456245B1 (en) 2000-12-13 2002-09-24 Magis Networks, Inc. Card-based diversity antenna structure for wireless communications
US6782102B2 (en) 2000-12-21 2004-08-24 Motorola, Inc. Multiple format secure voice apparatus for communication handsets
US6842428B2 (en) 2001-01-08 2005-01-11 Motorola, Inc. Method for allocating communication network resources using adaptive demand prediction
US6628511B2 (en) 2001-01-22 2003-09-30 Xoucin, Inc. Palm-sized handheld device with inverted ergonomic keypad
US6856816B2 (en) 2001-03-23 2005-02-15 Hall Aluminum Llc Telephone quick dialing/re-dialing method and apparatus
US6852416B2 (en) 2001-03-30 2005-02-08 The Penn State Research Foundation High dielectric constant composites of metallophthalaocyanine oligomer and poly(vinylidene-trifluoroethylene) copolymer
US6636202B2 (en) 2001-04-27 2003-10-21 International Business Machines Corporation Interactive tactile display for computer screen
JP3913496B2 (en) 2001-05-28 2007-05-09 Japan Science and Technology Agency Tactile detection system
GB0113905D0 (en) 2001-06-07 2001-08-01 Peratech Ltd Analytical device
FI20012231A (en) 2001-06-21 2002-12-22 Ismo Rakkolainen System for creating a user interface
US7408872B2 (en) 2001-07-09 2008-08-05 Spyder Navigations, L.L.C. Modulation of signals for transmission in packets via an air interface
JP2003029898A (en) 2001-07-16 2003-01-31 Japan Science & Technology Corp Tactile device
US7042997B2 (en) 2001-07-30 2006-05-09 Persona Software, Inc. Passive call blocking method and apparatus
AUPR694401A0 (en) 2001-08-10 2001-09-06 University Of Wollongong, The Bio-mechanical feedback device
CA2398798A1 (en) 2001-08-28 2003-02-28 Research In Motion Limited System and method for providing tactility for an lcd touchscreen
US6703550B2 (en) 2001-10-10 2004-03-09 Immersion Corporation Sound data output and manipulation using haptic feedback
WO2003039080A1 (en) 2001-10-31 2003-05-08 Nokia Corporation A method for handling of messages between a terminal and a data network
CN1582465B (en) 2001-11-01 2013-07-24 Immersion Corporation Input device and mobile telephone comprising the input device
US8339379B2 (en) 2004-04-29 2012-12-25 Neonode Inc. Light-based touch screen
US8095879B2 (en) 2002-12-10 2012-01-10 Neonode Inc. User interface for mobile handheld computer unit
SE0103835L (en) 2001-11-02 2003-05-03 Neonode Ab Touch screen realized by display unit with light transmitting and light receiving units
WO2009008786A1 (en) 2007-07-06 2009-01-15 Neonode Inc. Scanning of a touch screen
GB2382291A (en) 2001-11-16 2003-05-21 Int Computers Ltd Overlay for touch sensitive screen
US7352356B2 (en) 2001-12-13 2008-04-01 United States Of America Refreshable scanning tactile graphic display for localized sensory stimulation
KR100769783B1 (en) 2002-03-29 2007-10-24 Kabushiki Kaisha Toshiba Display input device and display input system
US6864878B2 (en) 2002-03-29 2005-03-08 Xerox Corporation Tactile overlays for screens
WO2003088354A2 (en) 2002-04-15 2003-10-23 Schott Ag Method for producing a copy protection for an electronic circuit and corresponding component
US7289826B1 (en) 2002-04-16 2007-10-30 Faulkner Interstices, Llc Method and apparatus for beam selection in a smart antenna system
US7400640B2 (en) 2002-05-03 2008-07-15 Conexant, Inc. Partitioned medium access control implementation
US7269153B1 (en) 2002-05-24 2007-09-11 Conexant Systems, Inc. Method for minimizing time critical transmit processing for a personal computer implementation of a wireless local area network adapter
TW554538B (en) 2002-05-29 2003-09-21 Toppoly Optoelectronics Corp TFT planar display panel structure and process for producing same
US6988247B2 (en) 2002-06-18 2006-01-17 Koninklijke Philips Electronics N.V. Graphic user interface having touch detectability
US11275405B2 (en) 2005-03-04 2022-03-15 Apple Inc. Multi-functional hand-held device
US6842607B2 (en) 2002-09-09 2005-01-11 Conexant Systems, Inc Coordination of competing protocols
US20040046739A1 (en) 2002-09-11 2004-03-11 Palm, Inc. Pliable device navigation method and apparatus
US20060092344A1 (en) 2002-09-19 2006-05-04 Toshihiko Ura Illumination unit and liquid crystal display comprising it
US7138985B2 (en) 2002-09-25 2006-11-21 Ui Evolution, Inc. Tactilely enhanced visual image display
US7253807B2 (en) 2002-09-25 2007-08-07 Uievolution, Inc. Interactive apparatuses with tactiley enhanced visual imaging capability and related methods
US20040100448A1 (en) 2002-11-25 2004-05-27 3M Innovative Properties Company Touch display
US6881063B2 (en) 2003-02-24 2005-04-19 Peichun Yang Electroactive polymer actuator braille cell and braille display
KR100563460B1 (en) 2003-02-25 2006-03-23 LG.Philips LCD Co., Ltd. Liquid Crystal Display Associated with Touch Panel And Driving Method Thereof
US7081888B2 (en) * 2003-04-24 2006-07-25 Eastman Kodak Company Flexible resistive touch screen
WO2004109488A2 (en) 2003-05-30 2004-12-16 Immersion Corporation System and method for low power haptic feedback
JP2005056267A (en) 2003-08-06 2005-03-03 Sony Corp Kinesthetic sense feedback device
DK1665880T3 (en) 2003-09-03 2013-02-25 Stanford Res Inst Int Electroactive surface deformation polymer transducers
US20060172557A1 (en) 2004-09-09 2006-08-03 He Xinhua Sam Electrical actuator having smart muscle wire
TW594183B (en) 2003-09-26 2004-06-21 Chi Lin Technology Co Ltd Panel positioning and testing device
US20050088417A1 (en) 2003-10-24 2005-04-28 Mulligan Roger C. Tactile touch-sensing system
US7042711B2 (en) 2003-11-18 2006-05-09 Kabushiki Kaisha Toshiba Multi-functional electronic device with a continuously accessible pointing device
US7054145B2 (en) 2003-11-18 2006-05-30 Kabushiki Kaisha Toshiba Mechanism for adjusting a display
GB0402191D0 (en) 2004-02-02 2004-03-03 Eleksen Ltd Linear sensor
GB0406080D0 (en) 2004-03-18 2004-04-21 Eleksen Ltd Sensor assembly
GB0406079D0 (en) 2004-03-18 2004-04-21 Eleksen Ltd Sensor response
US7436318B2 (en) 2004-04-19 2008-10-14 Atg Designworks, Llc Self contained device for displaying electronic information
US7750890B2 (en) 2004-05-11 2010-07-06 The Chamberlain Group, Inc. Movable barrier operator system display method and apparatus
US7519223B2 (en) * 2004-06-28 2009-04-14 Microsoft Corporation Recognizing gestures and using gestures for interacting with software applications
TWI287771B (en) 2004-07-06 2007-10-01 Au Optronics Corp Active matrix organic light emitting diode (AMOLED) display and a pixel drive circuit thereof
US7522153B2 (en) 2004-09-14 2009-04-21 Fujifilm Corporation Displaying apparatus and control method
KR100682901B1 (en) 2004-11-17 2007-02-15 Samsung Electronics Co., Ltd. Apparatus and method for providing fingertip haptics of visual information using electro-active polymer in an image displaying device
CN101107587B (en) 2005-01-14 2013-03-13 Koninklijke Philips Electronics N.V. Moving objects presented by a touch input display device
US7952564B2 (en) 2005-02-17 2011-05-31 Hurst G Samuel Multiple-touch sensor
US7602118B2 (en) 2005-02-24 2009-10-13 Eastman Kodak Company OLED device having improved light output
US7193350B1 (en) 2005-02-25 2007-03-20 United States Of America As Represented By The Secretary Of The Navy Electroactive polymer structure
JP4360497B2 (en) 2005-03-09 2009-11-11 The University of Tokyo Electric tactile presentation device and electric tactile presentation method
US20060209083A1 (en) 2005-03-18 2006-09-21 Outland Research, L.L.C. Method and electroactive device for a dynamic graphical imagery display
JP2006276707A (en) 2005-03-30 2006-10-12 Toshiba Matsushita Display Technology Co Ltd Display device and its driving method
US20070020589A1 (en) 2005-04-06 2007-01-25 Ethan Smith Electrothermal refreshable Braille cell and method for actuating same
US7382357B2 (en) 2005-04-25 2008-06-03 Avago Technologies Ecbu Ip Pte Ltd User interface incorporating emulated hard keys
US7368307B2 (en) 2005-06-07 2008-05-06 Eastman Kodak Company Method of manufacturing an OLED device with a curved light emitting surface
EP1907921B1 (en) 2005-07-25 2017-07-19 Flexenable Limited Flexible touch screen display
US7659887B2 (en) 2005-10-20 2010-02-09 Microsoft Corp. Keyboard with a touchpad layer on keys
JP4536638B2 (en) 2005-10-28 2010-09-01 Square Enix Co., Ltd. Display information selection apparatus and method, program, and recording medium
WO2007066717A1 (en) 2005-12-08 2007-06-14 The University Of Tokyo Electric tactile display
KR100801089B1 (en) 2005-12-13 2008-02-05 Samsung Electronics Co., Ltd. Mobile device and operation control method using touch and drag
KR100791379B1 (en) 2006-01-02 2008-01-07 Samsung Electronics Co., Ltd. System and method for user interface
KR100877067B1 (en) 2006-01-03 2009-01-07 Samsung Electronics Co., Ltd. Haptic button, and haptic device using it
JP4412288B2 (en) 2006-01-26 2010-02-10 セイコーエプソン株式会社 Electro-optical device and electronic apparatus
TW200735626A (en) 2006-01-30 2007-09-16 Alain C Briancon Skin tone mobile device and service
US7594839B2 (en) 2006-02-24 2009-09-29 Eastman Kodak Company OLED device having improved light output
US7511702B2 (en) 2006-03-30 2009-03-31 Apple Inc. Force and location sensitive display
EP1843406A1 (en) 2006-04-05 2007-10-10 Nederlandse Organisatie voor Toegepast-Natuuurwetenschappelijk Onderzoek TNO Actuator comprising an electroactive polymer
US7978181B2 (en) 2006-04-25 2011-07-12 Apple Inc. Keystroke tactility arrangement on a smooth touch surface
US20070257891A1 (en) 2006-05-03 2007-11-08 Esenther Alan W Method and system for emulating a mouse on a multi-touch sensitive surface
US20090189748A1 (en) 2006-08-24 2009-07-30 Koninklijke Philips Electronics N.V. Device for and method of processing an audio signal and/or a video signal to generate haptic excitation
KR100910577B1 (en) 2006-09-11 2009-08-04 Samsung Electronics Co., Ltd. Computer system and control method thereof
KR20080023901A (en) 2006-09-12 2008-03-17 Samsung Electronics Co., Ltd. Method and device for presenting Braille in a wireless mobile terminal
US20080062088A1 (en) 2006-09-13 2008-03-13 Tpo Displays Corp. Pixel driving circuit and OLED display apparatus and electrionic device using the same
US20100315345A1 (en) 2006-09-27 2010-12-16 Nokia Corporation Tactile Touch Screen
US20080122589A1 (en) 2006-11-28 2008-05-29 Ivanov Yuri A Tactile Output Device
EP1930800A1 (en) 2006-12-05 2008-06-11 Electronics and Telecommunications Research Institute Tactile and visual display device
US8970501B2 (en) 2007-01-03 2015-03-03 Apple Inc. Proximity and multi-touch sensor detection and demodulation
US20080180399A1 (en) 2007-01-31 2008-07-31 Tung Wan Cheng Flexible Multi-touch Screen
US8269729B2 (en) 2007-01-31 2012-09-18 Perceptive Pixel Inc. Methods of interfacing with multi-point input devices and multi-point input systems employing interfacing techniques
US7731670B2 (en) 2007-02-02 2010-06-08 Honda Motor Co., Ltd. Controller for an assistive exoskeleton based on active impedance
US8098234B2 (en) 2007-02-20 2012-01-17 Immersion Corporation Haptic feedback system with stored effects
US20080211353A1 (en) 2007-03-02 2008-09-04 Charles Erklin Seeley High temperature bimorph actuator
US8253654B2 (en) 2007-03-16 2012-08-28 Motorola Mobility Llc Visual interface control based on viewing display area configuration
WO2008128080A2 (en) 2007-04-13 2008-10-23 Saint-Gobain Ceramics & Plastics, Inc. Electrostatic dissipative stage for use in forming lcd products
KR100863571B1 (en) 2007-05-23 2008-10-15 Samsung Electronics Co., Ltd. Display pixel using electro active polymer and display employing the same
KR100888480B1 (en) 2007-05-23 2009-03-12 Samsung Electronics Co., Ltd. Reflective unit using electro active polymer and flexible display
US20080303795A1 (en) 2007-06-08 2008-12-11 Lowles Robert J Haptic display for a handheld electronic device
US20090002328A1 (en) 2007-06-26 2009-01-01 Immersion Corporation, A Delaware Corporation Method and apparatus for multi-touch tactile touch panel actuator mechanisms
US7956770B2 (en) 2007-06-28 2011-06-07 Sony Ericsson Mobile Communications Ab Data input device and portable electronic device
US7952498B2 (en) 2007-06-29 2011-05-31 Verizon Patent And Licensing Inc. Haptic computer interface
US20090015560A1 (en) 2007-07-13 2009-01-15 Motorola, Inc. Method and apparatus for controlling a display of a device
KR101430445B1 (en) 2007-08-20 2014-08-14 LG Electronics Inc. Terminal having function for controlling screen size and program recording medium
EP2034399B1 (en) 2007-09-04 2019-06-05 LG Electronics Inc. Scrolling method of mobile terminal
WO2009037379A1 (en) 2007-09-18 2009-03-26 Senseg Oy Method and apparatus for sensory stimulation
US8098235B2 (en) * 2007-09-28 2012-01-17 Immersion Corporation Multi-touch device having dynamic haptic effects
US20090102805A1 (en) 2007-10-18 2009-04-23 Microsoft Corporation Three-dimensional object simulation using audio, visual, and tactile feedback
EP2065870A1 (en) 2007-11-21 2009-06-03 Roche Diagnostics GmbH Medical device for visually impaired users and users not visually impaired
US8136402B2 (en) 2007-11-28 2012-03-20 International Business Machines Corporation Accelerometer module for use with a touch sensitive device
KR20090062190A (en) 2007-12-12 2009-06-17 Samsung Electronics Co., Ltd. Input/output device for tactile sensation and driving method for the same
JP2009151684A (en) 2007-12-21 2009-07-09 Sony Corp Touch-sensitive sheet member, input device and electronic equipment
US9857872B2 (en) 2007-12-31 2018-01-02 Apple Inc. Multi-touch display screen with localized tactile feedback
US8547339B2 (en) 2008-01-04 2013-10-01 Tactus Technology, Inc. System and methods for raised touch screens
US8154527B2 (en) 2008-01-04 2012-04-10 Tactus Technology User interface system
US7890257B2 (en) 2008-01-14 2011-02-15 Research In Motion Limited Using a shape-changing display as an adaptive lens for selectively magnifying information displayed onscreen
US8310444B2 (en) 2008-01-29 2012-11-13 Pacinian Corporation Projected field haptic actuation
US20090195512A1 (en) 2008-02-05 2009-08-06 Sony Ericsson Mobile Communications Ab Touch sensitive display with tactile feedback
US8022933B2 (en) 2008-02-21 2011-09-20 Sony Corporation One button remote control with haptic feedback
US9513704B2 (en) 2008-03-12 2016-12-06 Immersion Corporation Haptically enabled user interface
US8786555B2 (en) 2008-03-21 2014-07-22 Sprint Communications Company L.P. Feedback-providing keypad for touchscreen devices
KR100943989B1 (en) 2008-04-02 2010-02-26 MIDT Inc. Capacitive Touch Screen
US9256342B2 (en) 2008-04-10 2016-02-09 Perceptive Pixel, Inc. Methods of interfacing with multi-input devices and multi-input display systems employing interfacing techniques
US20090256807A1 (en) 2008-04-14 2009-10-15 Nokia Corporation User interface
KR101456001B1 (en) * 2008-05-23 2014-11-03 LG Electronics Inc. Terminal and method for controlling the same
US7924143B2 (en) 2008-06-09 2011-04-12 Research In Motion Limited System and method for providing tactile feedback to a user of an electronic device
US8054300B2 (en) 2008-06-17 2011-11-08 Apple Inc. Capacitive sensor panel having dynamically reconfigurable sensor size and shape
KR101498623B1 (en) * 2008-06-25 2015-03-04 LG Electronics Inc. Mobile Terminal Capable of Previewing Different Channels
US8508495B2 (en) 2008-07-03 2013-08-13 Apple Inc. Display with dual-function capacitive elements
US10031549B2 (en) 2008-07-10 2018-07-24 Apple Inc. Transitioning between modes of input
US7953462B2 (en) 2008-08-04 2011-05-31 Vartanian Harry Apparatus and method for providing an adaptively responsive flexible display device
KR101505198B1 (en) 2008-08-18 2015-03-23 LG Electronics Inc. Portable terminal and driving method of the same
JP2010086471A (en) 2008-10-02 2010-04-15 Sony Corp Operation feeling providing device, and operation feeling feedback method, and program
US8593409B1 (en) 2008-10-10 2013-11-26 Immersion Corporation Method and apparatus for providing haptic feedback utilizing multi-actuated waveform phasing
US8427433B2 (en) 2008-10-17 2013-04-23 Honeywell International Inc. Tactile-feedback touch screen
US8433138B2 (en) 2008-10-29 2013-04-30 Nokia Corporation Interaction using touch and non-touch gestures
EP2353066A4 (en) 2008-11-04 2014-10-15 Bayer Ip Gmbh Electroactive polymer transducers for tactile feedback devices
US8413066B2 (en) 2008-11-06 2013-04-02 Dmytro Lysytskyy Virtual keyboard with visually enhanced keys
US9116569B2 (en) 2008-11-26 2015-08-25 Blackberry Limited Touch-sensitive display method and apparatus
US8558803B2 (en) 2008-11-28 2013-10-15 Samsung Electronics Co., Ltd. Input device for portable terminal and method thereof
US9600070B2 (en) 2008-12-22 2017-03-21 Apple Inc. User interface having changeable topography
TW201025085A (en) 2008-12-23 2010-07-01 Hannstar Display Corp Keyboard formed from a touch display, method of endowing a touch display with a keyboard function, and a device with functions of keyboard or writing pad input and image output
US8686952B2 (en) 2008-12-23 2014-04-01 Apple Inc. Multi touch with multi haptics
US8760413B2 (en) 2009-01-08 2014-06-24 Synaptics Incorporated Tactile surface
US8255323B1 (en) 2009-01-09 2012-08-28 Apple Inc. Motion based payment confirmation
JP2010165032A (en) 2009-01-13 2010-07-29 Hitachi Displays Ltd Touch panel display device
CA2749984A1 (en) 2009-01-21 2010-07-29 Bayer Materialscience Ag Electroactive polymer transducers for tactile feedback devices
EP2391972B1 (en) 2009-02-02 2015-05-27 Eyesight Mobile Technologies Ltd. System and method for object recognition and tracking in a video stream
US8406816B2 (en) 2009-02-03 2013-03-26 Research In Motion Limited Method and apparatus for implementing a virtual rotary dial pad on a portable electronic device
TW201030588A (en) 2009-02-13 2010-08-16 Hannstar Display Corp In-cell touch panel
US8188844B2 (en) 2009-02-16 2012-05-29 GM Global Technology Operations LLC Reconfigurable tactile interface utilizing active material actuation
US8077021B2 (en) 2009-03-03 2011-12-13 Empire Technology Development Llc Dynamic tactile interface
US9927873B2 (en) 2009-03-12 2018-03-27 Immersion Corporation Systems and methods for using textures in graphical user interface widgets
KR101628782B1 (en) 2009-03-20 2016-06-09 Samsung Electronics Co., Ltd. Apparatus and method for providing haptic function using multi vibrator in portable terminal
JP5658235B2 (en) 2009-05-07 2015-01-21 Immersion Corporation Method and apparatus for forming shape change display by tactile feedback
US8279200B2 (en) 2009-05-19 2012-10-02 Microsoft Corporation Light-induced shape-memory polymer display screen
US7827975B1 (en) * 2009-05-28 2010-11-09 Ford Global Technologies, Llc Direct-start engine operation utilizing multi-strike ignition
US8738219B2 (en) * 2009-08-24 2014-05-27 Robert Bosch Gmbh Good checking for vehicle longitudinal acceleration sensor
US20110199342A1 (en) 2010-02-16 2011-08-18 Harry Vartanian Apparatus and method for providing elevated, indented or texturized sensations to an object near a display device or input detection using ultrasound
US8174931B2 (en) 2010-10-08 2012-05-08 HJ Laboratories, LLC Apparatus and method for providing indoor location, position, or tracking of a mobile computer using building information
US8485941B2 (en) * 2011-03-08 2013-07-16 Chrysler Group Llc Driver selectable low speed mode for disabling stop/start technology
US8816977B2 (en) * 2011-03-21 2014-08-26 Apple Inc. Electronic devices with flexible displays
US9939963B2 (en) * 2012-08-14 2018-04-10 Christopher V. Beckman Techniques improving displays
US20140195926A1 (en) * 2013-01-08 2014-07-10 Emo2 Inc. Systems and methods for enabling access to one or more applications on a device
US9238449B2 (en) * 2014-03-06 2016-01-19 Omega Patents, L.L.C. Vehicle control system including accelerometer based security warning and related methods
EP3176062B1 (en) * 2014-07-28 2023-11-15 Robert Bosch GmbH Information providing device and program for motorcycle
KR20160090524A (en) * 2015-01-22 2016-08-01 LG Electronics Inc. Electric Vehicle and Control Method Thereof
JP2016182005A (en) * 2015-03-24 2016-10-13 Denso Corporation Control device

Patent Citations (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5717434A (en) * 1992-07-24 1998-02-10 Toda; Kohji Ultrasonic touch system
US7500952B1 (en) * 1995-06-29 2009-03-10 Teratech Corporation Portable ultrasound imaging system
US7317872B1 (en) * 1997-10-10 2008-01-08 Posa John G Remote microphone and range-finding configuration
US20020030699A1 (en) * 1998-04-17 2002-03-14 Van Ee Jan Hand-held with auto-zoom for graphical display of Web page
US6441828B1 (en) * 1998-09-08 2002-08-27 Sony Corporation Image display apparatus
US6313825B1 (en) * 1998-12-28 2001-11-06 Gateway, Inc. Virtual input device
US7756297B2 (en) * 1999-07-08 2010-07-13 Pryor Timothy R Camera based sensing in handheld, mobile, gaming, or other devices
US6744178B2 (en) * 1999-12-27 2004-06-01 Seiko Instruments Inc. Pulse detection device and method of manufacturing the same
US6859572B2 (en) * 2000-03-31 2005-02-22 Sony Corporation Photon operating device and photon operating method
US6859569B2 (en) * 2000-03-31 2005-02-22 Sony Corporation Information receiving/display apparatus and information receiving/display method
US20020050983A1 (en) * 2000-09-26 2002-05-02 Qianjun Liu Method and apparatus for a touch sensitive system employing spread spectrum technology for the operation of one or more input devices
US7828733B2 (en) * 2000-11-24 2010-11-09 U-Systems Inc. Coronal and axial thick-slice ultrasound images derived from ultrasonic scans of a chestwardly-compressed breast
US20060096392A1 (en) * 2001-07-24 2006-05-11 Tactex Controls Inc. Touch sensitive membrane
US20030048260A1 (en) * 2001-08-17 2003-03-13 Alec Matusis System and method for selecting actions based on the identification of user's fingers
US7050835B2 (en) * 2001-12-12 2006-05-23 Universal Display Corporation Intelligent multi-media display communication system
US6984208B2 (en) * 2002-08-01 2006-01-10 The Hong Kong Polytechnic University Method and apparatus for sensing body gesture, posture and movement
US7841944B2 (en) * 2002-08-06 2010-11-30 Igt Gaming device having a three dimensional display device
US7190416B2 (en) * 2002-10-18 2007-03-13 Nitto Denko Corporation Liquid crystal display with touch panel having internal front polarizer
US7077015B2 (en) * 2003-05-29 2006-07-18 Vincent Hayward Apparatus to reproduce tactile sensations
US20050012723A1 (en) * 2003-07-14 2005-01-20 Move Mobile Systems, Inc. System and method for a portable multimedia client
US7027311B2 (en) * 2003-10-17 2006-04-11 Firefly Power Technologies, Inc. Method and apparatus for a wireless power supply
US20080042981A1 (en) * 2004-03-22 2008-02-21 Itay Katz System and Method for Inputing User Commands to a Processor
US20090043195A1 (en) * 2004-10-12 2009-02-12 Koninklijke Philips Electronics, N.V. Ultrasound Touchscreen User Interface and Display
US20090051662A1 (en) * 2005-03-14 2009-02-26 Martin Klein Touch-Sensitive Screen With Haptic Acknowledgement
US20070078345A1 (en) * 2005-09-30 2007-04-05 Siemens Medical Solutions Usa, Inc. Flexible ultrasound transducer array
US20070085828A1 (en) * 2005-10-13 2007-04-19 Schroeder Dale W Ultrasonic virtual mouse
US20070085838A1 (en) * 2005-10-17 2007-04-19 Ricks Theodore K Method for making a display with integrated touchscreen
US20070109276A1 (en) * 2005-11-17 2007-05-17 Lg Electronics Inc. Method for Allocating/Arranging Keys on Touch-Screen, and Mobile Terminal for Use of the Same
US20070139391A1 (en) * 2005-11-18 2007-06-21 Siemens Aktiengesellschaft Input device
US20080216001A1 (en) * 2006-01-05 2008-09-04 Bas Ording Portable electronic device with content-dependent touch sensitivity
US20070247422A1 (en) * 2006-03-30 2007-10-25 Xuuk, Inc. Interaction techniques for flexible displays
US20090315851A1 (en) * 2006-05-02 2009-12-24 Hotelling Steven P Multipoint Touch Surface Controller
US20080062148A1 (en) * 2006-06-09 2008-03-13 Hotelling Steve P Touch screen liquid crystal display
US20080005703A1 (en) * 2006-06-28 2008-01-03 Nokia Corporation Apparatus, Methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications
US7843449B2 (en) * 2006-09-20 2010-11-30 Apple Inc. Three-dimensional display system
US20080100907A1 (en) * 2006-10-10 2008-05-01 Cbrite Inc. Electro-optic display
US20100160786A1 (en) * 2007-06-01 2010-06-24 Koninklijke Philips Electronics N.V. Wireless Ultrasound Probe User Interface
US20090198132A1 (en) * 2007-08-10 2009-08-06 Laurent Pelissier Hand-held ultrasound imaging device having reconfigurable user interface
US20090043205A1 (en) * 2007-08-10 2009-02-12 Laurent Pelissier Hand-held ultrasound system having sterile enclosure
US20100298713A1 (en) * 2007-10-29 2010-11-25 Koninklijke Philips Electronics N.V. Systems and methods for ultrasound assembly including multiple imaging transducer arrays
US20100257491A1 (en) * 2007-11-29 2010-10-07 Koninklijke Philips Electronics N.V. Method of providing a user interface
US8232973B2 (en) * 2008-01-09 2012-07-31 Apple Inc. Method, device, and graphical user interface providing word recommendations for text input
US20090181724A1 (en) * 2008-01-14 2009-07-16 Sony Ericsson Mobile Communications Ab Touch sensitive display with ultrasonic vibrations for tactile feedback
US20080150911A1 (en) * 2008-01-21 2008-06-26 Sony Computer Entertainment America Inc. Hand-held device with touchscreen and digital tactile pixels
US20090184936A1 (en) * 2008-01-22 2009-07-23 Mathematical Inventing - Slicon Valley 3D touchpad
US20090199392A1 (en) * 2008-02-11 2009-08-13 General Electric Company Ultrasound transducer probes and system and method of manufacture
US20090262078A1 (en) * 2008-04-21 2009-10-22 David Pizzi Cellular phone with special sensor functions
US20090284480A1 (en) * 2008-05-16 2009-11-19 International Business Machines Corporation System and apparatus for a multi-point touch-sensitive sensor user interface using distinct digit identification
US20090295760A1 (en) * 2008-06-02 2009-12-03 Sony Ericsson Mobile Communications Ab Touch screen display
US20090322687A1 (en) * 2008-06-27 2009-12-31 Microsoft Corporation Virtual touchpad
US20100007511A1 (en) * 2008-07-14 2010-01-14 Sony Ericsson Mobile Communications Ab Touchless control of a control device
US20100013777A1 (en) * 2008-07-18 2010-01-21 Microsoft Corporation Tracking input in a screen-reflective interface environment
US20100026656A1 (en) * 2008-07-31 2010-02-04 Apple Inc. Capacitive sensor behind black mask
US20100177050A1 (en) * 2009-01-14 2010-07-15 Immersion Corporation Method and Apparatus for Generating Haptic Feedback from Plasma Actuation
US20100225734A1 (en) * 2009-03-03 2010-09-09 Horizon Semiconductors Ltd. Stereoscopic three-dimensional interactive system and method
US20100231540A1 (en) * 2009-03-12 2010-09-16 Immersion Corporation Systems and Methods For A Texture Engine
US20100234077A1 (en) * 2009-03-12 2010-09-16 Yoo Jae-Suk Mobile terminal and method for providing user interface thereof
US20100238114A1 (en) * 2009-03-18 2010-09-23 Harry Vartanian Apparatus and method for providing an elevated, indented, or texturized display device
US20100259633A1 (en) * 2009-04-14 2010-10-14 Sony Corporation Information processing apparatus, information processing method, and program
US20110107958A1 (en) * 2009-11-12 2011-05-12 Apple Inc. Input devices and methods of operation
US20110109588A1 (en) * 2009-11-12 2011-05-12 Senseg Ltd. Tactile stimulation apparatus having a composite section comprising a semiconducting material
US20110175813A1 (en) * 2010-01-20 2011-07-21 Apple Inc. Piezo-based acoustic and capacitive detection

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Takayuki Iwamoto, Mari Tatezono, Takayuki Hoshi, Hiroyuki Shinoda, "Airborne Ultrasound Tactile Display," SIGGRAPH 2008 New Tech Demos, Aug. 2008. *

Cited By (216)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9459728B2 (en) 2009-03-18 2016-10-04 HJ Laboratories, LLC Mobile device with individually controllable tactile sensations
US10191652B2 (en) 2009-03-18 2019-01-29 Hj Laboratories Licensing, Llc Electronic device with an interactive pressure sensitive multi-touch display
US9778840B2 (en) 2009-03-18 2017-10-03 Hj Laboratories Licensing, Llc Electronic device with an interactive pressure sensitive multi-touch display
US9772772B2 (en) 2009-03-18 2017-09-26 Hj Laboratories Licensing, Llc Electronic device with an interactive pressure sensitive multi-touch display
US9547368B2 (en) 2009-03-18 2017-01-17 Hj Laboratories Licensing, Llc Electronic device with a pressure sensitive multi-touch display
US8866766B2 (en) 2009-03-18 2014-10-21 HJ Laboratories, LLC Individually controlling a tactile area of an image displayed on a multi-touch display
US9448632B2 (en) 2009-03-18 2016-09-20 Hj Laboratories Licensing, Llc Mobile device with a pressure and indentation sensitive multi-touch display
US8686951B2 (en) 2009-03-18 2014-04-01 HJ Laboratories, LLC Providing an elevated and texturized display in an electronic device
US9423905B2 (en) 2009-03-18 2016-08-23 Hj Laboratories Licensing, Llc Providing an elevated and texturized display in a mobile electronic device
US9405371B1 (en) 2009-03-18 2016-08-02 HJ Laboratories, LLC Controllable tactile sensations in a consumer device
US9400558B2 (en) 2009-03-18 2016-07-26 HJ Laboratories, LLC Providing an elevated and texturized display in an electronic device
US9335824B2 (en) 2009-03-18 2016-05-10 HJ Laboratories, LLC Mobile device with a pressure and indentation sensitive multi-touch display
US10496170B2 (en) 2010-02-16 2019-12-03 HJ Laboratories, LLC Vehicle computing system to provide feedback
US20120038652A1 (en) * 2010-08-12 2012-02-16 Palm, Inc. Accepting motion-based character input on mobile computing devices
US20120096354A1 (en) * 2010-10-14 2012-04-19 Park Seungyong Mobile terminal and control method thereof
US9721489B2 (en) 2011-03-21 2017-08-01 HJ Laboratories, LLC Providing augmented reality based on third party information
US8743244B2 (en) 2011-03-21 2014-06-03 HJ Laboratories, LLC Providing augmented reality based on third party information
US11327599B2 (en) 2011-04-26 2022-05-10 Sentons Inc. Identifying a contact type
US20120274609A1 (en) * 2011-04-26 2012-11-01 Sentons Inc. Method and apparatus for active ultrasonic touch devices
US10877581B2 (en) 2011-04-26 2020-12-29 Sentons Inc. Detecting touch input force
US10444909B2 (en) 2011-04-26 2019-10-15 Sentons Inc. Using multiple signals to detect touch input
US9477350B2 (en) * 2011-04-26 2016-10-25 Sentons Inc. Method and apparatus for active ultrasonic touch devices
US9639213B2 (en) * 2011-04-26 2017-05-02 Sentons Inc. Using multiple signals to detect touch input
US10198097B2 (en) 2011-04-26 2019-02-05 Sentons Inc. Detecting touch input force
US20140078112A1 (en) * 2011-04-26 2014-03-20 Sentons Inc. Using multiple signals to detect touch input
US10969908B2 (en) 2011-04-26 2021-04-06 Sentons Inc. Using multiple signals to detect touch input
US11907464B2 (en) 2011-04-26 2024-02-20 Sentons Inc. Identifying a contact type
US20130061176A1 (en) * 2011-09-07 2013-03-07 Konami Digital Entertainment Co., Ltd. Item selection device, item selection method and non-transitory information recording medium
US10353509B2 (en) 2011-11-18 2019-07-16 Sentons Inc. Controlling audio volume using touch input force
US9099971B2 (en) 2011-11-18 2015-08-04 Sentons Inc. Virtual keyboard interaction using touch input force
US10055066B2 (en) 2011-11-18 2018-08-21 Sentons Inc. Controlling audio volume using touch input force
US10248262B2 (en) 2011-11-18 2019-04-02 Sentons Inc. User interface interaction using touch input force
US9449476B2 (en) 2011-11-18 2016-09-20 Sentons Inc. Localized haptic feedback
US11209931B2 (en) 2011-11-18 2021-12-28 Sentons Inc. Localized haptic feedback
US10698528B2 (en) 2011-11-18 2020-06-30 Sentons Inc. Localized haptic feedback
US11829555B2 (en) 2011-11-18 2023-11-28 Sentons Inc. Controlling audio volume using touch input force
US11016607B2 (en) 2011-11-18 2021-05-25 Sentons Inc. Controlling audio volume using touch input force
US10732755B2 (en) 2011-11-18 2020-08-04 Sentons Inc. Controlling audio volume using touch input force
US9594450B2 (en) 2011-11-18 2017-03-14 Sentons Inc. Controlling audio volume using touch input force
US10235004B1 (en) 2011-11-18 2019-03-19 Sentons Inc. Touch input detector with an integrated antenna
US20130227409A1 (en) * 2011-12-07 2013-08-29 Qualcomm Incorporated Integrating sensation functionalities into social networking services and applications
US9335845B2 (en) 2012-01-31 2016-05-10 MCube Inc. Selective accelerometer data processing methods and apparatus
US20130215038A1 (en) * 2012-02-17 2013-08-22 Rukman Senanayake Adaptable actuated input device with integrated proximity detection
US8928582B2 (en) 2012-02-17 2015-01-06 Sri International Method for adaptive interaction with a legacy software application
US20150169059A1 (en) * 2012-04-18 2015-06-18 Nokia Corporation Display apparatus with haptic feedback
WO2013171025A1 (en) * 2012-05-18 2013-11-21 Robert Bosch Gmbh Arrangement and method for stimulating the tactile sense of a user
US9959464B2 (en) 2012-05-24 2018-05-01 HJ Laboratories, LLC Mobile device utilizing multiple cameras for environmental detection
US10599923B2 (en) 2012-05-24 2020-03-24 HJ Laboratories, LLC Mobile device utilizing multiple cameras
US9578200B2 (en) 2012-05-24 2017-02-21 HJ Laboratories, LLC Detecting a document using one or more sensors
US9218526B2 (en) 2012-05-24 2015-12-22 HJ Laboratories, LLC Apparatus and method to detect a paper document using one or more sensors
US9983718B2 (en) 2012-07-18 2018-05-29 Sentons Inc. Detection of type of object used to provide a touch contact input
US10860132B2 (en) 2012-07-18 2020-12-08 Sentons Inc. Identifying a contact type
US10466836B2 (en) 2012-07-18 2019-11-05 Sentons Inc. Using a type of object to provide a touch contact input
US10209825B2 (en) 2012-07-18 2019-02-19 Sentons Inc. Detection of type of object used to provide a touch contact input
US20170344143A1 (en) * 2012-07-26 2017-11-30 Apple Inc. Ultrasound-Based Force Sensing and Touch Sensing
US10013118B2 (en) * 2012-07-26 2018-07-03 Apple Inc. Ultrasound-based force sensing and touch sensing
US10635217B2 (en) 2012-07-26 2020-04-28 Apple Inc. Ultrasound-based force sensing of inputs
US9772721B2 (en) * 2012-07-26 2017-09-26 Apple Inc. Ultrasound-based force sensing and touch sensing
US10949020B2 (en) 2012-07-26 2021-03-16 Apple Inc. Fingerprint-assisted force estimation
US20160062497A1 (en) * 2012-07-26 2016-03-03 Apple Inc. Ultrasound-Based Force Sensing and Touch Sensing
US9891738B2 (en) 2012-07-26 2018-02-13 Apple Inc. Ultrasound-based force sensing of inputs
US20160054826A1 (en) * 2012-07-26 2016-02-25 Apple Inc. Ultrasound-Based Force Sensing
US20150286380A1 (en) * 2012-08-10 2015-10-08 Blackberry Limited Method of momentum based zoom of content on an electronic device
US10489031B2 (en) * 2012-08-10 2019-11-26 Blackberry Limited Method of momentum based zoom of content on an electronic device
US10108286B2 (en) 2012-08-30 2018-10-23 Apple Inc. Auto-baseline determination for force sensing
WO2014081508A1 (en) * 2012-11-20 2014-05-30 Immersion Corporation Systems and methods for providing mode or state awareness with programmable surface texture
EP2733577A3 (en) * 2012-11-20 2014-07-23 Immersion Corporation A method of producing a haptic effect and haptic effect enabled device
EP3508952A1 (en) * 2012-11-20 2019-07-10 Immersion Corporation Systems and methods for providing mode or state awareness with programmable surface texture
CN103838421A (en) * 2012-11-20 2014-06-04 Immersion Corporation Method and apparatus for providing haptic cues for guidance and alignment with electrostatic friction
EP3252567A1 (en) * 2012-11-20 2017-12-06 Immersion Corporation A method of producing a haptic effect and haptic effect enabled device
US10078384B2 (en) 2012-11-20 2018-09-18 Immersion Corporation Method and apparatus for providing haptic cues for guidance and alignment with electrostatic friction
US9836150B2 (en) 2012-11-20 2017-12-05 Immersion Corporation System and method for feedforward and feedback with haptic effects
US10691230B2 (en) 2012-12-29 2020-06-23 Apple Inc. Crown input for a wearable electronic device
US10281567B2 (en) 2013-05-08 2019-05-07 Ultrahaptics Ip Ltd Method and apparatus for producing an acoustic field
US11543507B2 (en) 2013-05-08 2023-01-03 Ultrahaptics Ip Ltd Method and apparatus for producing an acoustic field
US11624815B1 (en) 2013-05-08 2023-04-11 Ultrahaptics Ip Ltd Method and apparatus for producing an acoustic field
US9977120B2 (en) 2013-05-08 2018-05-22 Ultrahaptics Ip Ltd Method and apparatus for producing an acoustic field
US10061453B2 (en) 2013-06-07 2018-08-28 Sentons Inc. Detecting multi-touch inputs
US9244603B2 (en) * 2013-06-21 2016-01-26 Nook Digital, Llc Drag and drop techniques for discovering related content
US20140380214A1 (en) * 2013-06-21 2014-12-25 Barnesandnoble.Com Llc Drag and drop techniques for discovering related content
US9804675B2 (en) 2013-06-27 2017-10-31 Elwha Llc Tactile feedback generated by non-linear interaction of surface acoustic waves
US20150003204A1 (en) * 2013-06-27 2015-01-01 Elwha Llc Tactile feedback in a two or three dimensional airspace
US10671168B2 (en) 2013-06-27 2020-06-02 Elwha Llc Tactile feedback generated by non-linear interaction of surface acoustic waves
US8884927B1 (en) 2013-06-27 2014-11-11 Elwha Llc Tactile feedback generated by phase conjugation of ultrasound surface acoustic waves
US8766953B1 (en) 2013-06-27 2014-07-01 Elwha Llc Tactile display driven by surface acoustic waves
GB2516820A (en) * 2013-07-01 2015-02-11 Nokia Corp An apparatus
US10075630B2 (en) 2013-07-03 2018-09-11 HJ Laboratories, LLC Providing real-time, personal services by accessing components on a mobile device
US10921976B2 (en) 2013-09-03 2021-02-16 Apple Inc. User interface for manipulating user interface objects
EP3620903A1 (en) * 2013-09-03 2020-03-11 Apple Inc. User interface for manipulating user interface objects with magnetic properties
US11656751B2 (en) 2013-09-03 2023-05-23 Apple Inc. User interface for manipulating user interface objects with magnetic properties
US11068128B2 (en) 2013-09-03 2021-07-20 Apple Inc. User interface object manipulations in a user interface
US11829576B2 (en) 2013-09-03 2023-11-28 Apple Inc. User interface object manipulations in a user interface
US11537281B2 (en) 2013-09-03 2022-12-27 Apple Inc. User interface for manipulating user interface objects with magnetic properties
US10386966B2 (en) 2013-09-20 2019-08-20 Sentons Inc. Using spectral control in detecting touch input
US9612658B2 (en) 2014-01-07 2017-04-04 Ultrahaptics Ip Ltd Method and apparatus for providing tactile sensations
US10921890B2 (en) 2014-01-07 2021-02-16 Ultrahaptics Ip Ltd Method and apparatus for providing tactile sensations
US9898089B2 (en) 2014-01-07 2018-02-20 Ultrahaptics Ip Ltd Method and apparatus for providing tactile sensations
US11720861B2 (en) 2014-06-27 2023-08-08 Apple Inc. Reduced size user interface
US11250385B2 (en) 2014-06-27 2022-02-15 Apple Inc. Reduced size user interface
US10133353B2 (en) 2014-07-11 2018-11-20 New York University Three dimensional tactile feedback system
WO2016007920A1 (en) * 2014-07-11 2016-01-14 New York University Three dimensional tactile feedback system
US11644911B2 (en) 2014-09-02 2023-05-09 Apple Inc. Button functionality
US11941191B2 (en) 2014-09-02 2024-03-26 Apple Inc. Button functionality
US11402968B2 (en) 2014-09-02 2022-08-02 Apple Inc. Reduced size user in interface
US11474626B2 (en) 2014-09-02 2022-10-18 Apple Inc. Button functionality
US11157143B2 (en) 2014-09-02 2021-10-26 Apple Inc. Music user interface
US11068083B2 (en) 2014-09-02 2021-07-20 Apple Inc. Button functionality
US11743221B2 (en) 2014-09-02 2023-08-29 Apple Inc. Electronic message user interface
US11204644B2 (en) 2014-09-09 2021-12-21 Ultrahaptics Ip Ltd Method and apparatus for modulating haptic feedback
GB2530036A (en) * 2014-09-09 2016-03-16 Ultrahaptics Ltd Method and apparatus for modulating haptic feedback
US10444842B2 (en) 2014-09-09 2019-10-15 Ultrahaptics Ip Ltd Method and apparatus for modulating haptic feedback
WO2016038347A1 (en) * 2014-09-09 2016-03-17 Ultrahaptics Limited Method and apparatus for modulating haptic feedback
US11656686B2 (en) 2014-09-09 2023-05-23 Ultrahaptics Ip Ltd Method and apparatus for modulating haptic feedback
US9958943B2 (en) 2014-09-09 2018-05-01 Ultrahaptics Ip Ltd Method and apparatus for modulating haptic feedback
US11768540B2 (en) 2014-09-09 2023-09-26 Ultrahaptics Ip Ltd Method and apparatus for modulating haptic feedback
CN106575161A (en) * 2014-09-09 2017-04-19 Ultrahaptics Ip Ltd Method and apparatus for modulating haptic feedback
US20160180636A1 (en) * 2014-12-17 2016-06-23 Igt Canada Solutions Ulc Contactless tactile feedback on gaming terminal with 3d display
US20160175701A1 (en) * 2014-12-17 2016-06-23 Gtech Canada Ulc Contactless tactile feedback on gaming terminal with 3d display
US20160180644A1 (en) * 2014-12-17 2016-06-23 Fayez Idris Gaming system with movable ultrasonic transducer
US20160175709A1 (en) * 2014-12-17 2016-06-23 Fayez Idris Contactless tactile feedback on gaming terminal with 3d display
US9672689B2 (en) * 2014-12-17 2017-06-06 Igt Canada Solutions Ulc Gaming system with movable ultrasonic transducer
US10195525B2 (en) * 2014-12-17 2019-02-05 Igt Canada Solutions Ulc Contactless tactile feedback on gaming terminal with 3D display
US20190134503A1 (en) * 2014-12-17 2019-05-09 Igt Canada Solutions Ulc Contactless tactile feedback on gaming terminal with 3d display
US10737174B2 (en) * 2014-12-17 2020-08-11 Igt Canada Solutions Ulc Contactless tactile feedback on gaming terminal with 3D display
US10427034B2 (en) * 2014-12-17 2019-10-01 Igt Canada Solutions Ulc Contactless tactile feedback on gaming terminal with 3D display
US10403084B2 (en) * 2014-12-17 2019-09-03 Igt Canada Solutions Ulc Contactless tactile feedback on gaming terminal with 3D display
US9841819B2 (en) 2015-02-20 2017-12-12 Ultrahaptics Ip Ltd Perceptions in a haptic system
US10101811B2 (en) 2015-02-20 2018-10-16 Ultrahaptics Ip Ltd. Algorithm improvements in a haptic system
US11550432B2 (en) 2015-02-20 2023-01-10 Ultrahaptics Ip Ltd Perceptions in a haptic system
EP3916525A1 (en) * 2015-02-20 2021-12-01 Ultrahaptics IP Limited Perceptions in a haptic system
US10685538B2 (en) 2015-02-20 2020-06-16 Ultrahaptics Ip Ltd Algorithm improvements in a haptic system
US11276281B2 (en) 2015-02-20 2022-03-15 Ultrahaptics Ip Ltd Algorithm improvements in a haptic system
US10930123B2 (en) 2015-02-20 2021-02-23 Ultrahaptics Ip Ltd Perceptions in a haptic system
US11830351B2 (en) 2015-02-20 2023-11-28 Ultrahaptics Ip Ltd Algorithm improvements in a haptic system
US10101814B2 (en) 2015-02-20 2018-10-16 Ultrahaptics Ip Ltd. Perceptions in a haptic system
US10884592B2 (en) 2015-03-02 2021-01-05 Apple Inc. Control of system zoom magnification using a rotatable input mechanism
WO2016192266A1 (en) * 2015-05-29 2016-12-08 BOE Technology Group Co., Ltd. Sound wave touch control device and electronic device
US10248263B2 (en) 2015-05-29 2019-04-02 Boe Technology Group Co., Ltd. Acoustic wave touch device and electronic apparatus
US10818162B2 (en) 2015-07-16 2020-10-27 Ultrahaptics Ip Ltd Calibration techniques in haptic systems
US11727790B2 (en) 2015-07-16 2023-08-15 Ultrahaptics Ip Ltd Calibration techniques in haptic systems
US10048811B2 (en) 2015-09-18 2018-08-14 Sentons Inc. Detecting touch input provided by signal transmitting stylus
US11189140B2 (en) 2016-01-05 2021-11-30 Ultrahaptics Ip Ltd Calibration and detection techniques in haptic systems
US9679547B1 (en) * 2016-04-04 2017-06-13 Disney Enterprises, Inc. Augmented reality music composition
US10262642B2 (en) 2016-04-04 2019-04-16 Disney Enterprises, Inc. Augmented reality music composition
US10531212B2 (en) 2016-06-17 2020-01-07 Ultrahaptics Ip Ltd. Acoustic transducers in haptic systems
US10496175B2 (en) 2016-08-03 2019-12-03 Ultrahaptics Ip Ltd Three-dimensional perceptions in haptic systems
US10268275B2 (en) 2016-08-03 2019-04-23 Ultrahaptics Ip Ltd Three-dimensional perceptions in haptic systems
US11307664B2 (en) 2016-08-03 2022-04-19 Ultrahaptics Ip Ltd Three-dimensional perceptions in haptic systems
US10915177B2 (en) 2016-08-03 2021-02-09 Ultrahaptics Ip Ltd Three-dimensional perceptions in haptic systems
US11714492B2 (en) 2016-08-03 2023-08-01 Ultrahaptics Ip Ltd Three-dimensional perceptions in haptic systems
US10755538B2 (en) 2016-08-09 2020-08-25 Ultrahaptics Ip Ltd Metamaterials and acoustic lenses in haptic systems
US10908741B2 (en) 2016-11-10 2021-02-02 Sentons Inc. Touch input detection along device sidewall
US10147243B2 (en) 2016-12-05 2018-12-04 Google Llc Generating virtual notation surfaces with gestures in an augmented and/or virtual reality environment
US10339723B2 (en) 2016-12-05 2019-07-02 Google Llc Generating virtual notation surfaces with gestures in an augmented and/or virtual reality environment
US20180153752A1 (en) * 2016-12-07 2018-06-07 Stryker Corporation Haptic Systems And Methods For A User Interface Of A Patient Support Apparatus
US10744053B2 (en) * 2016-12-07 2020-08-18 Stryker Corporation Haptic systems and methods for a user interface of a patient support apparatus
US11246777B2 (en) 2016-12-07 2022-02-15 Stryker Corporation Haptic systems and methods for a user interface of a patient support apparatus
US10296144B2 (en) 2016-12-12 2019-05-21 Sentons Inc. Touch input detection with shared receivers
US10509515B2 (en) 2016-12-12 2019-12-17 Sentons Inc. Touch input detection with shared receivers
US10943578B2 (en) 2016-12-13 2021-03-09 Ultrahaptics Ip Ltd Driving techniques for phased-array systems
WO2018109466A1 (en) * 2016-12-13 2018-06-21 Ultrahaptics Ip Limited Driving techniques for phased-array systems
US11955109B2 (en) 2016-12-13 2024-04-09 Ultrahaptics Ip Ltd Driving techniques for phased-array systems
US10497358B2 (en) 2016-12-23 2019-12-03 Ultrahaptics Ip Ltd Transducer driver
US10126877B1 (en) 2017-02-01 2018-11-13 Sentons Inc. Update of reference data for touch input detection
US10444905B2 (en) 2017-02-01 2019-10-15 Sentons Inc. Update of reference data for touch input detection
US10585522B2 (en) 2017-02-27 2020-03-10 Sentons Inc. Detection of non-touch inputs using a signature
US11061510B2 (en) 2017-02-27 2021-07-13 Sentons Inc. Detection of non-touch inputs using a signature
KR102367547B1 (en) 2017-07-17 2022-02-24 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Display apparatuses and pixels for a display apparatus
JP7366109B2 (en) 2017-07-17 2023-10-20 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Display device
KR20200023649A (en) * 2017-07-17 2020-03-05 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Display devices and pixels for display devices
US11061477B2 (en) 2017-07-17 2021-07-13 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Display devices and pixel for a display device
JP2020527269A (en) * 2017-07-17 2020-09-03 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Display devices and pixels for display devices
KR20210147088A (en) * 2017-07-17 2021-12-06 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Display apparatuses and pixels for a display apparatus
KR102337968B1 (en) 2017-07-17 2021-12-13 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Display devices and pixels for display devices
JP7064567B2 (en) 2017-07-17 2022-05-10 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Pixels for display devices and display devices
JP2022019992A (en) * 2017-07-17 2022-01-27 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Display device
WO2019015882A1 (en) * 2017-07-17 2019-01-24 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Display apparatuses and pixels for a display apparatus
US20190021924A1 (en) * 2017-07-19 2019-01-24 Stryker Corporation Techniques For Generating Auditory And Haptic Output With A Vibrational Panel Of A Patient Support Apparatus
US10980687B2 (en) * 2017-07-19 2021-04-20 Stryker Corporation Techniques for generating auditory and haptic output with a vibrational panel of a patient support apparatus
US11392206B2 (en) 2017-07-27 2022-07-19 Emerge Now Inc. Mid-air ultrasonic haptic interface for immersive computing environments
US11048329B1 (en) 2017-07-27 2021-06-29 Emerge Now Inc. Mid-air ultrasonic haptic interface for immersive computing environments
US11262253B2 (en) 2017-08-14 2022-03-01 Sentons Inc. Touch input detection using a piezoresistive sensor
US11580829B2 (en) 2017-08-14 2023-02-14 Sentons Inc. Dynamic feedback for haptics
US11435242B2 (en) 2017-08-14 2022-09-06 Sentons Inc. Increasing sensitivity of a sensor using an encoded signal
US11340124B2 (en) 2017-08-14 2022-05-24 Sentons Inc. Piezoresistive sensor for detecting a physical disturbance
US11009411B2 (en) 2017-08-14 2021-05-18 Sentons Inc. Increasing sensitivity of a sensor using an encoded signal
US11531395B2 (en) 2017-11-26 2022-12-20 Ultrahaptics Ip Ltd Haptic effects from focused acoustic fields
US11921928B2 (en) 2017-11-26 2024-03-05 Ultrahaptics Ip Ltd Haptic effects from focused acoustic fields
US11360546B2 (en) 2017-12-22 2022-06-14 Ultrahaptics Ip Ltd Tracking in haptic systems
US11704983B2 (en) 2017-12-22 2023-07-18 Ultrahaptics Ip Ltd Minimizing unwanted responses in haptic systems
WO2019207143A1 (en) * 2018-04-27 2019-10-31 Myvox Ab A device, system and method for generating an acoustic-potential field of ultrasonic waves
US11883847B2 (en) 2018-05-02 2024-01-30 Ultraleap Limited Blocking plate structure for improved acoustic transmission efficiency
US10911861B2 (en) 2018-05-02 2021-02-02 Ultrahaptics Ip Ltd Blocking plate structure for improved acoustic transmission efficiency
US11529650B2 (en) 2018-05-02 2022-12-20 Ultrahaptics Ip Ltd Blocking plate structure for improved acoustic transmission efficiency
US11740018B2 (en) 2018-09-09 2023-08-29 Ultrahaptics Ip Ltd Ultrasonic-assisted liquid manipulation
US11098951B2 (en) 2018-09-09 2021-08-24 Ultrahaptics Ip Ltd Ultrasonic-assisted liquid manipulation
US10928907B2 (en) 2018-09-11 2021-02-23 Apple Inc. Content-based tactile outputs
US11921926B2 (en) 2018-09-11 2024-03-05 Apple Inc. Content-based tactile outputs
US11435830B2 (en) 2018-09-11 2022-09-06 Apple Inc. Content-based tactile outputs
US10712824B2 (en) 2018-09-11 2020-07-14 Apple Inc. Content-based tactile outputs
US11378997B2 (en) 2018-10-12 2022-07-05 Ultrahaptics Ip Ltd Variable phase and frequency pulse-width modulation technique
US11550395B2 (en) 2019-01-04 2023-01-10 Ultrahaptics Ip Ltd Mid-air haptic textures
US11842517B2 (en) 2019-04-12 2023-12-12 Ultrahaptics Ip Ltd Using iterative 3D-model fitting for domain adaptation of a hand-pose-estimation neural network
US10996761B2 (en) 2019-06-01 2021-05-04 Apple Inc. User interfaces for non-visual output of time
US11460925B2 (en) 2019-06-01 2022-10-04 Apple Inc. User interfaces for non-visual output of time
US11553295B2 (en) 2019-10-13 2023-01-10 Ultraleap Limited Dynamic capping with virtual microphones
US11742870B2 (en) 2019-10-13 2023-08-29 Ultraleap Limited Reducing harmonic distortion by dithering
US11374586B2 (en) 2019-10-13 2022-06-28 Ultraleap Limited Reducing harmonic distortion by dithering
US11169610B2 (en) 2019-11-08 2021-11-09 Ultraleap Limited Tracking techniques in haptic systems
US11715453B2 (en) 2019-12-25 2023-08-01 Ultraleap Limited Acoustic transducer structures
US20220229493A1 (en) * 2020-06-06 2022-07-21 Battelle Memorial Institute Delivery of somatosensation for medical diagnostics or guiding motor action
US11726567B2 (en) * 2020-06-06 2023-08-15 Battelle Memorial Institute Delivery of somatosensation for medical diagnostics or guiding motor action
US11816267B2 (en) 2020-06-23 2023-11-14 Ultraleap Limited Features of airborne ultrasonic fields
US11886639B2 (en) 2020-09-17 2024-01-30 Ultraleap Limited Ultrahapticons
FR3127309A1 (en) * 2021-09-21 2023-03-24 Mz Technology Contactless interaction framework for human/machine interface
WO2023047050A1 (en) 2021-09-21 2023-03-30 Mz Technology Contactless interaction frame for human/machine interface

Also Published As

Publication number Publication date
US10496170B2 (en) 2019-12-03
US20170228023A1 (en) 2017-08-10
US20200103975A1 (en) 2020-04-02

Similar Documents

Publication Publication Date Title
US10496170B2 (en) Vehicle computing system to provide feedback
US9983676B2 (en) Simulation of tangible user interface interactions and gestures using array of haptic cells
US11360558B2 (en) Computer systems with finger devices
US10191652B2 (en) Electronic device with an interactive pressure sensitive multi-touch display
US10032346B2 (en) Haptic device incorporating stretch characteristics
CN101730874B (en) Touchless gesture based input
CN111602101A (en) Human interaction with an aerial haptic system
US20100020036A1 (en) Portable electronic device and method of controlling same
Freeman et al. Multimodal feedback in HCI: haptics, non-speech audio, and their applications
CN107943273A (en) Context pressure-sensing haptic response
KR20190002525A (en) Gadgets for multimedia management of compute devices for people who are blind or visually impaired
Rekimoto Organic interaction technologies: from stone to skin
Cho et al. Multi-context photo browsing on mobile devices based on tilt dynamics
Gong Enriching Input Modalities for Computing Devices Using Wrist-Worn Sensors
Kodippilige Magicwand: A comparison of gestural affordances between cylindrical and flat display form factors

Legal Events

Date Code Title Description
AS Assignment

Owner name: HJ LABORATORIES, LLC, PENNSYLVANIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:VARTANIAN, HARRY;JURIKSON-RHODES, JARON;REEL/FRAME:026674/0709

Effective date: 20110725

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: JJR LABORATORIES, LLC, PENNSYLVANIA

Free format text: CHANGE OF NAME;ASSIGNOR:HJ LABORATORIES, LLC;REEL/FRAME:057419/0583

Effective date: 20210720