US20010043189A1 - Active edge user interface - Google Patents
- Publication number
- US20010043189A1 (application US09/097,150)
- Authority
- US
- United States
- Prior art keywords
- user interface
- display
- input device
- user
- active edge
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0489—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using dedicated keyboard keys or combinations thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01H—ELECTRIC SWITCHES; RELAYS; SELECTORS; EMERGENCY PROTECTIVE DEVICES
- H01H13/00—Switches having rectilinearly-movable operating part or parts adapted for pushing or pulling in one direction only, e.g. push-button switch
- H01H13/70—Switches having rectilinearly-movable operating part or parts adapted for pushing or pulling in one direction only, e.g. push-button switch having a plurality of operating members associated with different sets of contacts, e.g. keyboard
- H01H13/78—Switches having rectilinearly-movable operating part or parts adapted for pushing or pulling in one direction only, e.g. push-button switch having a plurality of operating members associated with different sets of contacts, e.g. keyboard characterised by the contacts or the contact sites
- H01H13/807—Switches having rectilinearly-movable operating part or parts adapted for pushing or pulling in one direction only, e.g. push-button switch having a plurality of operating members associated with different sets of contacts, e.g. keyboard characterised by the contacts or the contact sites characterised by the spatial arrangement of the contact sites, e.g. superimposed sites
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/033—Indexing scheme relating to G06F3/033
- G06F2203/0339—Touch strips, e.g. orthogonal touch strips to control cursor movement or scrolling; single touch strip to adjust parameter or to implement a row of soft keys
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01H—ELECTRIC SWITCHES; RELAYS; SELECTORS; EMERGENCY PROTECTIVE DEVICES
- H01H2225/00—Switch site location
- H01H2225/002—Switch site location superimposed
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01H—ELECTRIC SWITCHES; RELAYS; SELECTORS; EMERGENCY PROTECTIVE DEVICES
- H01H2225/00—Switch site location
- H01H2225/018—Consecutive operations
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01H—ELECTRIC SWITCHES; RELAYS; SELECTORS; EMERGENCY PROTECTIVE DEVICES
- H01H2239/00—Miscellaneous
- H01H2239/074—Actuation by finger touch
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/02—Constructional features of telephone sets
- H04M1/0202—Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
- H04M1/026—Details of the structure or mounting of specific components
- H04M1/0266—Details of the structure or mounting of specific components for a display module assembly
Definitions
- the present invention relates generally to interface devices, and more particularly to a user interface device that includes dynamically configurable flexible touch areas located near the perimeter of a display to support interactive communication between a user and a user environment.
- the keyboard allows a user to enter text and symbol information into a computer, and provides predefined keys for executing specific functions (e.g., “save” and “exit” functions).
- the introduction of the windows-based operating system exposed the limitations of the keyboard, which often required a user to perform multiple keystrokes to execute simple computer functions.
- the mouse was created to provide “point-and-click” functionality.
- This user interface tool significantly increased the efficiency of a computer session regardless of whether a user performed simple word processing or engaged in complex computer-generated graphic designs. For example, selecting and opening a word processing file typically required three or more keystrokes with a keyboard. However, with a mouse, the user can simply point to the file on the desktop or in a pull-down menu and click on the file to open it.
- keyboards and mice are not readily adaptable to smaller computing devices, such as palm-sized computers, wireless communication products, and public kiosks where space is at a premium.
- touch-screen systems seem to be the preferred choice of users since they do not require physical keys or buttons to enter data into each device. By eliminating physical keys, small computing device manufacturers can significantly reduce the size and weight of the device, characteristics that appeal to consumers.
- Touch-screen systems typically include a touch-responsive medium that senses a human touch on a particular area of the display and software to implement a function associated with the touched area.
- a touch-screen interface is found in U.S. Pat. No. 5,594,471 to Deeran et al. (the “'471 patent”).
- the '471 patent discloses an industrial computer workstation with a display and a touch-screen.
- the touch-screen includes a display touch zone that overlaps the display and a border touch zone located outside the display. Portions of the display touch zone and the border touch zone are programmable as user input areas of the touch-screen and are identified to a user via removable templates.
- touch-screen systems such as the touch-screen interface of the '471 patent have disadvantages.
- Removable templates on a touch-screen display can be lost, destroyed, or misplaced, and when using a finger to select an item on a touch-screen, the user's hand can often block a view of the screen.
- touch-screens quickly become dirty, especially when installed in a public kiosk or an industrial environment, and they do not support key travel—a sliding motion across the screen to execute a function (e.g., scrolling through data) or “two-step” functionality—the ability to implement multiple functions from a single predetermined area of the user interface device.
- Systems and methods consistent with the present invention provide a user interface device that includes dynamically configurable flexible touch areas located near the perimeter of a display to support interactive communication between a user and a user environment.
- a user interface consistent with this invention comprises a display; an input device located adjacent an edge of the display, and operatively connected to the display to respond to a physical contact; and a processor for executing user interface software configured to implement a function in response to the physical contact on the input device.
- a method for implementing a user interface comprises the steps of generating an image on a display in response to at least one of a human touch and a first pressure on a predetermined area of an input device adjacent the display; and implementing a function associated with the image when a second pressure is applied to the predetermined area of the input device.
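The two-step method recited above lends itself to a small state-machine sketch. The class, area numbers, and function names below are illustrative assumptions, not code from the patent: a first pressure on an area of the input device highlights the associated function, and a second pressure on the same area executes it.

```python
# Hypothetical sketch of the claimed two-step method. All names here
# (ActiveEdge, the area ids, "save"/"exit") are invented for illustration.

class ActiveEdge:
    def __init__(self, area_functions):
        # area_functions maps an input-device area id to a function name
        self.area_functions = area_functions
        self.highlighted = None   # area currently highlighted on the display
        self.executed = []        # functions implemented so far

    def press(self, area, pressure):
        """pressure 1 = touch (step one), pressure 2 = press (step two)."""
        if pressure == 1:
            # step one: generate/highlight the image adjacent the touched area
            self.highlighted = area
            return "highlight:" + self.area_functions[area]
        if pressure == 2 and self.highlighted == area:
            # step two: implement the function associated with the image
            self.executed.append(self.area_functions[area])
            return "execute:" + self.area_functions[area]
        return "ignored"

edge = ActiveEdge({0: "save", 1: "exit"})
print(edge.press(0, 1))   # touch area 0 -> "highlight:save"
print(edge.press(0, 2))   # press same area -> "execute:save"
```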
- FIG. 1 illustrates an active edge user interface consistent with the present invention
- FIG. 2 a illustrates a cross-sectional view of a user input device at rest consistent with the present invention
- FIG. 2 b illustrates a cross-sectional view of the user input device in FIG. 2 a with contact applied
- FIG. 2 c illustrates a cross-sectional view of the user input device in FIG. 2 a with additional contact applied
- FIG. 3 a illustrates a cross-sectional view of another user input device at rest consistent with the present invention
- FIG. 3 b illustrates a cross-sectional view of the user input device in FIG. 3 a with contact applied
- FIG. 3 c illustrates a cross-sectional view of the user input device in FIG. 3 a with additional contact applied
- FIG. 4 a illustrates the selection of an item illustrated on a display using a user input device consistent with the present invention
- FIG. 4 b illustrates a response to the selection of an item illustrated on a display using a user input device consistent with the present invention
- FIG. 5 a illustrates an implementation of an active edge user interface on a wireless communications device for responding to a call consistent with the present invention
- FIG. 5 b illustrates an implementation of an active edge user interface on the wireless communications device of FIG. 5 a for forwarding a call
- FIG. 5 c illustrates an implementation of an active edge user interface on the wireless communications device of FIG. 5 a for locating information in memory
- FIG. 5 d illustrates an implementation of an active edge user interface on the wireless communications device of FIG. 5 a for selecting the name of a person
- FIG. 6 illustrates a flowchart of a method for implementing an active edge user interface consistent with the present invention.
- Systems and methods consistent with the present invention use an active edge user interface positioned near the edge of a display that allows a user to interact with a host device.
- the active edge user interface includes a flexible input device that extends along at least one edge of a display and responds to touch and pressure to implement one or more functions viewable on the display. This design supports key travel, programmability, ease-of-use, and adaptability to a variety of applications and technologies.
- FIG. 1 illustrates an active edge user interface 100 consistent with the present invention.
- Active edge user interface 100 includes a display 110 , active touch input device 120 , processor 130 , and memory 140 . These components represent the basic infrastructure of active edge user interface 100 .
- active edge interface 100 may include additional components depending on the host device in which it is used.
- active edge user interface 100 can be used in a wristwatch, which may require altering the shape and size of display 110 and input device 120 .
- active edge user interface 100 can be installed in a desktop computer which may include additional processors and memory.
- Active edge user interface 100 is designed as a universal interface that can operate in any graphical user interface environment.
- Display 110 is any commercially available display that is capable of displaying textual and graphical images.
- display 110 is a liquid crystal display (LCD); however, the type of display used with active edge user interface 100 can depend on the user environment.
- active edge user interface 100 may be used in a desktop computer system. In this instance, images can be generated on display 110 using a cathode ray tube.
- active edge user interface 100 may be used in a wireless communication device, such as a cellular phone, in which case display 110 is an LCD display.
- display 110 can be any geometrical shape.
- Active edge input device 120 is a user interface device positioned adjacent display 110 . Active edge input device 120 may actually touch display 110 or lie a predetermined distance away from an edge of display 110 .
- the shape of active edge input device 120 may vary depending on the user environment. For example, active edge input device 120 may be shaped in a manner that visibly distinguishes between a highly used area of the device and a lesser used area of the device (e.g., the highly used area is wider than the lesser used area).
- active edge input device 120 extends around the perimeter of display 110 . Nevertheless, active edge input device 120 may be configured to extend only along one, two, or three sides of display 110 . If display 110 has a round geometrical shape, active edge input device 120 may form a complete circle around the display or only extend around a portion of the display. The position of active edge input device 120 relative to display 110 is important to provide an ergonomically correct, user-friendly interface device. The structure of and method for using active edge input device 120 with display 110 are described in detail with respect to FIGS. 2 - 6 .
- Processor 130 is preferably a high-speed processor, such as an Intel Pentium® processor, capable of processing simple and complex graphic applications.
- Processor 130 communicates with display 110 and controls active edge user interface 100 .
- processor 130 can be integrated into display 110 or located in a peripheral device.
- Memory 140 is a random access memory (RAM) that communicates with processor 130 to store and retrieve data and software. Preferably, memory 140 facilitates high-speed access to enhance the storage and retrieval process. As illustrated in FIG. 1, memory 140 includes data storage 150 and user interface software 160 . One skilled in the art will appreciate that memory 140 can store additional data and software not described herein. For example, in a wireless communications environment, memory 140 may include communications software to support the transfer of voice signals to and from a cell site.
- Data storage 150 is an area of memory 140 that stores data.
- data storage 150 may include a listing of telephone numbers or call information (e.g., number of calls received within a specified time period).
- the type of data resident in data storage 150 may change based on the user environment.
- User interface software 160 is a software program resident in memory 140 that implements methods of active edge user interface 100 in accordance with the present invention.
- User interface software 160 is executed by processor 130 to respond to user inputs into active edge input device 120 .
- User interface software 160 interprets the user inputs and implements an appropriate response. For example, if a user wishes to call a friend, the user selects the friend's name from a telephone listing displayed on the screen by pressing on active edge input device 120 in a predetermined area (e.g., adjacent the friend's name). In response to the selection, user interface software 160 associates the name with a telephone number stored in data storage 150 and instructs processor 130 to dial the number.
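The telephone example above can be sketched as a simple lookup. The data, mapping from edge areas to displayed names, and function name below are invented for illustration and are not the patent's actual software:

```python
# Hypothetical sketch: user interface software resolves a touch on the
# active edge into a dialed number. phone_book stands in for data storage
# 150, and displayed for the names shown on display 110.

phone_book = {"Alice": "555-0100", "Bob": "555-0101"}
displayed = ["Alice", "Bob"]

def handle_touch(area_index):
    """Map the touched edge area to the adjacent displayed name and dial it."""
    name = displayed[area_index]          # name adjacent the touched area
    number = phone_book[name]             # lookup in data storage
    return "dialing " + number            # instruct the processor to dial

print(handle_touch(1))   # touch adjacent to "Bob" -> "dialing 555-0101"
```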
- User interface software 160 can be configured to operate in a variety of user environments such as on a desktop computer or a public kiosk.
- FIGS. 2 a - 2 c illustrate cross-sectional views of active edge input device 120 in accordance with a preferred embodiment consistent with the present invention.
- active edge input device 120 is a strip of material that extends along a border of display 110 and is responsive to touch or pressure.
- Active edge input device 120 is designed to provide “two-step” functionality.
- a first function is implemented at the first step when a first pressure or touch is applied to the input device (e.g., pressure applied by a human finger).
- a second function is implemented at the second step when a second pressure is applied to the same area on the input device (e.g., additional pressure applied by a human finger in the same location).
- FIG. 2 a illustrates a cross-sectional view of active edge input device 120 at rest.
- Active edge input device 120 includes a flexible strip 200 positioned adjacent a host device body surface 260 .
- Body surface 260 is a surface of a host device in which active edge user interface 100 is employed. For example, if the active edge user interface 100 is employed in a wireless communication device, then body surface 260 is a surface of the wireless communication device body.
- Flexible strip 200 is an elastomer strip of material that includes an upper surface 205 , a lower surface 207 and one or more cavities 210 . Although an elastomer material is preferable, flexible strip 200 can be composed of any resilient material. Preferably, flexible strip 200 is a continuous strip of material that extends around at least one side of display 110 . However, flexible strip 200 may be sectioned (i.e., non-continuous) as appropriate in the user environment to satisfy design requirements.
- Upper surface 205 is a surface of flexible strip 200 that is exposed to a user as illustrated in FIG. 1.
- upper surface 205 is smooth; however, it may include protrusions or have a distinct texture to allow users to locate certain areas on active edge input device 120 by touch alone.
- the smoothness of upper surface 205 allows a user to drag their finger or other instrument along flexible strip 200 in a sweeping motion. This motion, for example, may be used to implement a scrolling function which allows a user to quickly view information on display 110 .
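As a rough sketch of how a sweep along the strip might drive scrolling, the list contents, finger positions, and area granularity below are assumptions for illustration only:

```python
# Hypothetical scrolling model: finger travel along the flexible strip is
# translated into movement through a list shown on the display.

items = ["shoes", "socks", "shirts", "pants", "jackets", "scarfs", "hats"]

def scroll_index(start_pos, end_pos, areas_per_item=1, start_index=0):
    """Translate finger travel (in strip areas) into a new list index,
    clamped to the bounds of the displayed list."""
    delta = (end_pos - start_pos) // areas_per_item
    return max(0, min(len(items) - 1, start_index + delta))

idx = scroll_index(0, 2)   # drag the finger down two areas
print(items[idx])          # -> shirts
```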
- Lower surface 207 includes one or more protrusions 208 that extend outward and include extensions 209 .
- the faces of protrusions 208 include upper electrical contacts 220 that are fixed thereon.
- these electrical contacts are made from a conductive carbon material and form a continuous ring around extensions 209 as illustrated in FIG. 2 a .
- Upper electrical contacts 220 can, however, be sectioned into distinct units that are spaced around extensions 209 .
- the faces of extensions 209 include lower electrical contacts 230 that are fixed thereon. These electrical contacts are “puck-shaped” and are preferably formed from a carbon material.
- Body surface 260 includes body protrusion electrical contacts 240 and body extension electrical contacts 250 which are fixed thereon.
- these electrical contacts are also composed of carbon and are aligned with upper electrical contact 220 and lower electrical contacts 230 , respectively.
- Cavities 210 are formed in an area of flexible strip 200 adjacent each protrusion 208 .
- each of cavities 210 is formed in an image of protrusions 208 and extensions 209 , but may have any shape.
- Cavities 210 are designed to collapse when a pressure is applied and return to their original shape when the pressure is released. Thus, cavities 210 provide a “soft button” effect when engaged by a user. The deformation of cavities 210 under pressure is illustrated in FIGS. 2 b and 2 c.
- FIG. 2 b illustrates a cross-sectional view of a first pressure applied to active edge input device 120 consistent with a first embodiment of the present invention.
- a first pressure (e.g., a “touch”) is applied to area 270 of flexible strip 200 .
- the pressure forces protrusion 208 downward until lower electrical contact 230 makes contact with body extension electrical contact 250 .
- the connection of these two electrical contacts generates a signal that is sent to processor 130 for processing.
- A discussion of how processor 130 responds to this connection is provided with respect to FIGS. 4 - 6 .
- Pressure on one area of flexible strip 200 only affects the components directly below. That is, if pressure is applied to one of three adjacent areas on flexible strip 200 , only the selected area will respond to the pressure as shown in FIG. 2 b.
- FIG. 2 c illustrates a cross-sectional view of a second pressure applied to a user input device consistent with a first embodiment of the present invention.
- This figure shows the second step of the “two-step” functionality described herein.
- the first pressure shown on area 270 is increased to a second pressure (e.g., a “press”) until upper electrical contact 220 makes contact with body protrusion electrical contact 240 .
- both lower electrical contact 230 and upper electrical contact 220 are electrically coupled with the respective body electrical contacts under area 270 .
- This connection generates a second signal to processor 130 which is processed accordingly.
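The two contact pairs of FIGS. 2 a - 2 c suggest a simple threshold model: displacement past a first gap closes lower contacts 230/250 and yields the first signal, and further displacement closes upper contacts 220/240 and yields the second. The gap values below are invented for illustration:

```python
# Hypothetical displacement model of the two-step contact closure.
# Gap distances are assumptions, not values from the patent.

FIRST_GAP = 0.5    # mm until lower contacts 230/250 meet (assumed)
SECOND_GAP = 1.0   # mm until upper contacts 220/240 also meet (assumed)

def signals(displacement_mm):
    """Return which signals a given strip displacement generates."""
    first = displacement_mm >= FIRST_GAP     # "touch" closes the lower pair
    second = displacement_mm >= SECOND_GAP   # "press" also closes the upper pair
    if second:
        return ("first", "second")
    if first:
        return ("first",)
    return ()

print(signals(0.2))   # at rest -> ()
print(signals(0.6))   # touch -> ('first',)
print(signals(1.2))   # press -> ('first', 'second')
```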
- FIGS. 3 a - 3 c illustrate a cross-sectional view of a user input device consistent with a second embodiment of the present invention.
- active edge input device 120 includes an alternative design for entering data into a host device.
- the active edge input device illustrated in FIGS. 3 a - 3 c also provides “two-step” functionality as described herein.
- FIG. 3 a illustrates a cross-sectional view of a second embodiment of active edge input device 120 at rest.
- active edge input device 120 includes a flexible strip 300 positioned adjacent a host body surface 350 .
- Body surface 350 is a surface of a host device in which active edge user interface 100 is installed. For example, if active edge user interface 100 is installed in a wireless communication device, then body surface 350 is a surface of the wireless communication device.
- Flexible strip 300 is an elastomer strip of material that includes an upper surface 305 , a lower surface 307 , and one or more cavities 320 . Although elastomer is preferable, flexible strip 300 can be composed of any resilient material. Preferably, flexible strip 300 is a continuous strip of material that extends around at least one side of display 110 . However, flexible strip 300 may be sectioned (i.e., non-continuous) as appropriate in the user environment to satisfy design requirements.
- Upper surface 305 is a surface of flexible strip 300 that is exposed to a user as illustrated in FIG. 1.
- upper surface 305 is smooth; however, it may include protrusions to allow users to locate certain areas on active edge input device 120 by touch alone.
- the smoothness of upper surface 305 allows users to drag their finger or other instrument along flexible strip 300 in a sweeping motion. This motion, for example, may be used to implement a scrolling function which allows a user to scroll through information on display 110 .
- Lower surface 307 includes a resistive plate 310 that is responsive to a human touch.
- resistive plate 310 extends along lower surface 307 as a continuous strip of conductive material.
- resistive plate 310 may have separate and distinct sections that are positioned along lower surface 307 .
- Resistive plate 310 may comprise resistive material currently used in conventional touch-screen devices.
- Attached to resistive plate 310 are one or more protrusions 308 that extend outward and include extensions 309 .
- the face of extensions 309 include input device electrical contacts 330 fixed thereon, as illustrated in FIG. 3 a . These electrical contacts are “puck-shaped” and are formed from an electrically conductive material (e.g., carbon).
- Body surface 350 includes body electrical contacts 340 which are fixed thereon. These electrical contacts are also composed of an electrically conductive material (e.g., carbon) and are aligned with input device electrical contacts 330 . A gap exists between the electrical contacts on body surface 350 and the electrical contacts on extensions 309 while active edge input device 120 is at rest.
- Cavities 320 are formed in an area of flexible strip 300 adjacent each protrusion 308 .
- each of cavities 320 is formed in an image of protrusions 308 and extensions 309 , as illustrated in FIG. 3 a , but may have any shape.
- Cavities 320 are designed to collapse when a pressure is applied and return to their original shape when the pressure is released. Thus, cavities 320 provide a “soft button” effect when a pressure is applied thereto by a user. The deformation of cavities 320 under pressure is illustrated in FIGS. 3 b and 3 c.
- FIG. 3 b illustrates a cross-sectional view of a touch applied to active edge input device 120 consistent with a second embodiment of the present invention.
- This figure shows the first step of the “two-step” functionality described herein.
- a voltage is applied to resistive plate 310 during operation of the host device.
- when a human touches upper surface 305 of flexible strip 300 (e.g., on area 360 ), a change in voltage is detected and a first signal is generated.
- Processor 130 receives the first signal and responds by implementing user interface software 160 .
- a discussion of how processor 130 implements user interface software 160 is described with respect to FIGS. 4 - 6 .
- active edge input device 120 can be configured to simply sense a human touch without requiring the application of pressure to flexible strip 300 .
- resistive plate 310 simply detects the presence of a human touch on area 360 and does not require any deformation of flexible strip 300 .
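Touch sensing on resistive plate 310 could be modeled as a voltage-deviation test: the first signal is generated when the measured voltage departs from its baseline by more than a threshold. The baseline and threshold values below are assumptions:

```python
# Hypothetical model of touch detection on the voltage-driven resistive
# plate. BASELINE_V and THRESHOLD_V are invented values for illustration.

BASELINE_V = 3.3    # assumed voltage applied to resistive plate 310
THRESHOLD_V = 0.2   # assumed minimum deviation that counts as a touch

def first_signal(measured_v):
    """Generate the first signal when a touch shifts the plate voltage."""
    return abs(measured_v - BASELINE_V) >= THRESHOLD_V

print(first_signal(3.3))   # plate at rest -> False
print(first_signal(2.9))   # finger on area 360 -> True
```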
- FIG. 3 c illustrates a cross-sectional view of a pressure applied to active edge input device 120 consistent with a second embodiment of the present invention.
- This figure shows the second step of the “two-step” functionality described herein.
- the first pressure shown in FIG. 3 b is increased to a second pressure (e.g., a “press”) on area 370 of flexible strip 300 until input device electrical contact 330 makes contact with body electrical contact 340 .
- the second pressure deforms flexible strip 300 including resistive plate 310 and cavity 320 .
- the connection of the electrical contacts generates a second signal to processor 130 which is processed accordingly by implementing user interface software 160 .
- FIGS. 4 a - 4 b illustrate the operation of selecting an item illustrated on a display using an active edge input device consistent with the present invention. Specifically, the operation of display 400 , active edge input devices 420 and 430 , and user interface software 160 (of FIG. 1) is discussed with reference to FIGS. 4 a - 4 b .
- Active edge input devices consistent with the present invention are dynamically configurable such that different functions can be associated with each selectable area of the input device depending on the user environment.
- FIGS. 4 a and 4 b illustrate a mode of operation for an active edge user interface consistent with the present invention.
- the user environment illustrated in these figures includes a notebook computer with an active edge user interface.
- the notebook computer includes a display 400 and active edge input devices 420 and 430 located on the right and left sides of display 400 , respectively.
- Active edge input devices 420 and 430 may include the design of FIGS. 2 a - 2 c or 3 a - 3 c . In either case, the user can enter information into the notebook computer using active edge input devices 420 and 430 .
- information stored in data storage 150 or a peripheral device is generated on display 400 .
- this information relates to fashion and includes a main category “clothing” displayed on the left side of display 400 and a plurality of sub-categories including “shoes, socks, shirts, pants, jackets, scarfs, and hats” displayed on the right side of display 400 .
- a user can touch or press an area of active edge input device 420 to highlight a sub-category adjacent thereto.
- users can drag their finger down or up active edge user input device 420 to scroll through the sub-categories. As illustrated in FIG. 4 a , the sub-category “shirts” is highlighted as a result of a touch or press on an adjacent area of active edge input device 420 .
- A sub-category, or any data displayed and selected using embodiments consistent with the present invention, can be highlighted in many different ways.
- For example, the selected data can change colors, expand, contract, flash, or be affected in any manner that indicates it has been selected by a user via active edge input device 420.
- The touch or press on active edge input device 420 corresponding to the selection of the “shirts” sub-category sends a first signal to processor 130, which processes the signal using user interface software 160.
- User interface software 160 interprets the signal as a selection of the “shirts” category based on the screen location of the currently displayed data and the selected area on active edge input device 420. Since the touch or press only implements the first step of the “two-step” functionality described herein, the “shirts” category is simply highlighted for the user.
- Once the sub-category is highlighted, the user has the option of accepting the selected category or moving to another displayed category.
- The latter option highlights a newly selected sub-category in a manner similar to the highlighted “shirts” sub-category. If the user chooses to accept the “shirts” sub-category, they simply increase the pressure on active edge input device 420 until the electrical contacts of active edge input device 420 contact the electrical contacts connected to a surface of the host device. This operation implements the second step of the “two-step” functionality described herein. At this point, a second signal is sent to processor 130 indicating that the selection is accepted, and the “shirts” sub-category moves to the left side of the screen under the “clothing” category, as illustrated in FIG. 4b.
- User interface software 160 then implements the function associated with the user selection, which, in this example, is updating the category listing with “shirts.”
- The function implemented by user interface software 160 will change depending on the user environment.
- For example, the display may show an “Announce” function that, when selected, announces predetermined information to specified subscribers over a wireless or wireline communication channel.
- The “Announce” function may allow the user to select the priority of the announcement by displaying priority selections adjacent an active edge input device (e.g., gold priority for urgent, silver priority for semi-urgent, and bronze for not urgent).
- Using the active edge input device, the user can scroll through the displayed priority categories and select the desired priority using the “two-step” functionality described herein. Another example of this feature is discussed with reference to FIGS. 5a-5d.
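As a rough illustration of how user interface software might resolve a touch on the edge strip to the adjacent displayed item, the mapping can be sketched as follows. This is a minimal sketch under assumed geometry; the function name, coordinates, and row layout are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical sketch: map a touch position along the active edge strip to
# the on-screen item displayed adjacent to it. Names and geometry are
# illustrative assumptions, not disclosed structure.

def item_at_edge_position(touch_y, items, display_top, display_height):
    """Return the index of the item adjacent to a touch at touch_y.

    touch_y        -- vertical position of the touch on the edge strip
    items          -- list of item labels currently displayed
    display_top    -- top screen coordinate of the item list
    display_height -- height of the area occupied by the list
    """
    if not items or not (display_top <= touch_y < display_top + display_height):
        return None  # touch outside the listed area
    row_height = display_height / len(items)
    return int((touch_y - display_top) // row_height)

# A touch adjacent to the third of seven sub-categories highlights it:
subcats = ["shoes", "socks", "shirts", "pants", "jackets", "scarfs", "hats"]
idx = item_at_edge_position(touch_y=125, items=subcats,
                            display_top=40, display_height=280)
print(subcats[idx])  # "shirts"
```

In a full implementation, the returned index would drive the highlight step (the first signal); the harder press at the same area would then accept the highlighted item.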
- FIG. 5 a illustrates an implementation of an active edge user interface on a wireless communications device 500 for responding to a call consistent with the present invention.
- Wireless communication device 500 is a host device that includes a display 510 , an active edge input device 520 , and a keypad 525 .
- The upper highlighted portion of display 510 indicates the currently displayed function (e.g., “call from” or “contact”).
- The middle portion of display 510 shows data entered by a user or received from a remote device.
- The lower portion of display 510 shows function parameters, such as “Fwd,” “Ans,” and “Send.”
- Active edge input device 520 is a continuous strip of flexible material that borders three sides of display 510 .
- Active edge input device 520 includes protrusions in the shape of ribs 540 on the left and right sides of display 510 , and buttons 550 on the bottom side of the display. One or more buttons 550 correspond to one or more of the displayed function parameters.
- Display 510 in FIG. 5a indicates to the user that wireless communications device 500 is receiving or has received a call from “Alan Frank,” whose telephone number is “459-6232.” The user has the option of answering or forwarding the call by pressing the appropriate button 550. If the “Ans” function parameter is selected, wireless communications device 500 connects the call. If the “Fwd” function parameter is selected, the user has the option of forwarding the call to “VMail” (i.e., voicemail) or to “Home” (i.e., to telephone number “763-5463”), as illustrated in FIG. 5b. The user can move between each displayed option, for example, by dragging a finger along the left or right side surface of active edge input device 520.
- The active edge user interface may be configured such that the user can use only one side of active edge input device 520 to select between the options on display 510.
- The option is highlighted, as shown in FIG. 5b.
- The touching or slight pressure represents the first step of the “two-step” functionality implemented by embodiments consistent with the present invention.
- The user then presses harder on active edge input device 520, which forwards Alan Frank's call to the user's home.
- This secondary pressure represents the second step of the “two-step” functionality.
- The user may choose to quit the current display at any time by touching on active edge input device 520 below the displayed “Quit” function parameter.
- The user may also choose to make a call from wireless communications device 500.
- The user presses on active edge input device 520 below the “Call” function, as illustrated in FIG. 5c.
- A list of names stored in memory appears on display 510. If the list is voluminous, the user can scroll through the list by dragging (e.g., touching or slightly pressing) a finger or other instrument in an upward or downward motion across the surface of active edge input device 520.
- Display 510 may automatically switch to an iconic view to show where the user is on the list, as shown in FIG. 5c.
- The name is highlighted by the touch or slight pressure on active edge input device 520 adjacent the name, as illustrated in FIG. 5d.
- The user can then initiate the call by pressing harder on active edge input device 520.
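The drag-to-scroll behavior described for FIG. 5c can be sketched as a conversion from drag distance to list position. This is a hedged illustration; the scale factor and all names are assumptions introduced for the sketch.

```python
# Hypothetical sketch of drag-to-scroll on the edge strip: the distance a
# finger is dragged along the strip is converted into a new top-of-list
# index. The rows-per-unit scale and the names are illustrative assumptions.

def scroll_position(start_y, current_y, top_index, rows_per_unit=0.05):
    """Return the new top-of-list index after a drag from start_y to current_y."""
    delta = current_y - start_y              # downward drag -> positive delta
    return max(0, top_index + int(delta * rows_per_unit))

# Dragging 100 units down the strip advances a list five rows:
print(scroll_position(start_y=20, current_y=120, top_index=0))  # 5
```

An upward drag produces a negative delta and scrolls back toward the top of the list, clamped at index 0.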
- The user could only send a message to a specified person by selecting the appropriate function key on the bottom of display 510.
- FIG. 6 illustrates a method for implementing an active edge user interface consistent with the present invention.
- An active edge user interface generates an image on a display in response to a touch or pressure on a predetermined area of an input device adjacent the display (step 600).
- The active edge user interface implements a function associated with the image when a greater pressure is applied to the predetermined area of the input device (step 620).
- The function, for example, could be calling a highlighted name (i.e., represented by the image) on a wireless communications device.
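The two-step method of FIG. 6 can be sketched as a simple pressure dispatcher. This is a minimal sketch, assuming numeric pressure readings and thresholds; the threshold values, class, and handler names are assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the FIG. 6 method: a light touch on an edge area
# highlights (generates an image for) the adjacent item; a harder press on
# the same area implements the associated function. Thresholds are assumed.

TOUCH_THRESHOLD = 0.1   # first pressure: highlight (step 600)
PRESS_THRESHOLD = 0.6   # second pressure: execute  (step 620)

def handle_edge_input(pressure, area, ui):
    """Dispatch a pressure reading on an edge area to the UI."""
    if pressure >= PRESS_THRESHOLD:
        return ui.execute(area)      # step 620: implement the function
    if pressure >= TOUCH_THRESHOLD:
        return ui.highlight(area)    # step 600: generate the image
    return None                      # below sensing threshold: ignore

class DemoUI:
    def highlight(self, area):
        return f"highlight:{area}"
    def execute(self, area):
        return f"execute:{area}"

ui = DemoUI()
print(handle_edge_input(0.2, "shirts", ui))  # highlight:shirts
print(handle_edge_input(0.8, "shirts", ui))  # execute:shirts
```

Checking the press threshold first ensures that a firm press executes the function directly rather than merely re-highlighting the item.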
- Systems and methods consistent with the present invention thus provide an active edge user interface that offers great functionality and ease-of-use. Moreover, an active edge user interface consistent with the present invention eliminates the need to touch the actual display while preserving the benefits of a graphical user interface.
Description
- The present invention relates generally to interface devices, and more particularly to a user interface device that includes dynamically configurable flexible touch areas located near the perimeter of a display to support interactive communication between a user and a user environment.
- There is always a need for user interface devices that simplify human interaction with computers. Current user interface devices include the keyboard, mouse, and touch-screen systems. Each of these user interface devices offers varying functionality in a desktop environment.
- The keyboard allows a user to enter text and symbol information into a computer, and provides predefined keys for executing specific functions (e.g., “save” and “exit” functions). The introduction of the windows-based operating system exposed the limitations of the keyboard, which often required a user to perform multiple keystrokes to execute simple computer functions. To take advantage of the user-friendly, windows-based environment, the mouse was created to provide “point-and-click” functionality. This user interface tool significantly increased the efficiency of a computer session, regardless of whether a user performed simple word processing or engaged in complex computer-generated graphic designs. For example, selecting and opening a word processing file typically required three or more keystrokes with a keyboard. However, with a mouse, the user can simply point to the file on the desktop or in a pull-down menu and click on the file to open it.
- Although preferred in a desktop environment, keyboards and mice are not readily adaptable to smaller computing devices, such as palm-sized computers, wireless communication products, and public kiosks where space is at a premium. For these user environments, touch-screen systems seem to be the preferred choice of users since they do not require physical keys or buttons to enter data into each device. By eliminating physical keys, small computing device manufacturers can significantly reduce the size and weight of the device, characteristics that appeal to consumers. Moreover, through a touch-screen system, a user can interact with a public kiosk using only a display to request and retrieve information. Touch-screen systems typically include a touch-responsive medium that senses a human touch on a particular area of the display and software to implement a function associated with the touched area.
- One example of a touch-screen interface is found in U.S. Pat. No. 5,594,471 to Deeran et al. (the “'471 patent”). The '471 patent discloses an industrial computer workstation with a display and a touch-screen. The touch-screen includes a display touch zone that overlaps the display and a border touch zone located outside the display. Portions of the display touch zone and the border touch zone are programmable as user input areas of the touch-screen and are identified to a user via removable templates. Although convenient, touch-screen systems such as the touch-screen interface of the '471 patent have disadvantages. Removable templates on a touch-screen display can be lost, destroyed, or misplaced, and when using a finger to select an item on a touch-screen, the user's hand can often block the view of the screen. Furthermore, touch-screens quickly become dirty, especially when installed in a public kiosk or an industrial environment, and they do not support key travel (a sliding motion across the screen to execute a function, e.g., scrolling through data) or “two-step” functionality (the ability to implement multiple functions from a single predetermined area of the user interface device).
- Therefore, it is desirable to provide an improved user interface device that is robust and ergonomically correct to create a user-friendly environment that does not require physical keys, templates, or touching the actual display.
- Systems and methods consistent with the present invention provide a user interface device that includes dynamically configurable flexible touch areas located near the perimeter of a display to support interactive communication between a user and a user environment.
- Specifically, a user interface consistent with this invention comprises a display; an input device located adjacent an edge of the display, and operatively connected to the display to respond to a physical contact; and a processor for executing user interface software configured to implement a function in response to the physical contact on the input device.
- A method for implementing a user interface comprises the steps of generating an image on a display in response to at least one of a human touch and a first pressure on a predetermined area of an input device adjacent the display; and implementing a function associated with the image when a second pressure is applied to the predetermined area of the input device.
- Both the foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the invention as claimed.
- The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate presently preferred embodiments of the invention and, together with the preceding general description and the following detailed description, explain the principles of the invention.
- In the drawings:
- FIG. 1 illustrates an active edge user interface consistent with the present invention;
- FIG. 2a illustrates a cross-sectional view of a user input device at rest consistent with the present invention;
- FIG. 2b illustrates a cross-sectional view of the user input device in FIG. 2a with contact applied;
- FIG. 2c illustrates a cross-sectional view of the user input device in FIG. 2a with additional contact applied;
- FIG. 3a illustrates a cross-sectional view of another user input device at rest consistent with the present invention;
- FIG. 3b illustrates a cross-sectional view of the user input device in FIG. 3a with contact applied;
- FIG. 3c illustrates a cross-sectional view of the user input device in FIG. 3a with additional contact applied;
- FIG. 4a illustrates the selection of an item illustrated on a display using a user input device consistent with the present invention;
- FIG. 4b illustrates a response to the selection of an item illustrated on a display using a user input device consistent with the present invention;
- FIG. 5a illustrates an implementation of an active edge user interface on a wireless communications device for responding to a call consistent with the present invention;
- FIG. 5b illustrates an implementation of an active edge user interface on the wireless communications device of FIG. 5a for forwarding a call;
- FIG. 5c illustrates an implementation of an active edge user interface on the wireless communications device of FIG. 5a for locating information in memory;
- FIG. 5d illustrates an implementation of an active edge user interface on the wireless communications device of FIG. 5a for selecting the name of a person; and
- FIG. 6 illustrates a flowchart of a method for implementing an active edge user interface consistent with the present invention.
- Systems and methods consistent with the present invention use an active edge user interface positioned near the edge of a display that allows a user to interact with a host device. The active edge user interface includes a flexible input device that extends along at least one edge of a display and responds to touch and pressure to implement one or more functions viewable on the display. This design supports key travel, programmability, ease-of-use, and adaptability to a variety of applications and technologies.
- FIG. 1 illustrates an active edge user interface 100 consistent with the present invention. Active edge user interface 100 includes a display 110, an active touch input device 120, a processor 130, and a memory 140. These components represent the basic infrastructure of active edge user interface 100. One skilled in the art will appreciate that active edge user interface 100 may include additional components depending on the host device in which it is used. For example, active edge user interface 100 can be used in a wristwatch, which may require altering the shape and size of display 110 and input device 120. In addition, active edge user interface 100 can be installed in a desktop computer, which may include additional processors and memory. Active edge user interface 100 is designed as a universal interface that can operate in any graphical user interface environment.
- Display 110 is any commercially available display that is capable of displaying textual and graphical images. Preferably, display 110 is a liquid crystal display (LCD); however, the type of display used with active edge user interface 100 can depend on the user environment. For example, active edge user interface 100 may be used in a desktop computer system. In this instance, images can be generated on display 110 using a cathode ray tube. Alternatively, active edge user interface 100 may be used in a wireless communication device, such as a cellular phone, in which case display 110 is an LCD. Although illustrated in FIG. 1 with a square screen, display 110 can be any geometrical shape.
- Active edge input device 120 is a user interface device positioned adjacent display 110. Active edge input device 120 may actually touch display 110 or lie a predetermined distance away from an edge of display 110. The shape of active edge input device 120 may vary depending on the user environment. For example, active edge input device 120 may be shaped in a manner that visibly distinguishes between a highly used area of the device and a lesser used area of the device (e.g., the highly used area is wider than the lesser used area).
- As illustrated in FIG. 1, active edge input device 120 extends around the perimeter of display 110. Nevertheless, active edge input device 120 may be configured to extend along only one, two, or three sides of display 110. If display 110 has a round geometrical shape, active edge input device 120 may form a complete circle around the display or extend around only a portion of the display. The position of active edge input device 120 relative to display 110 is important to provide an ergonomically correct, user-friendly interface device. The structure of and method for using active edge input device 120 with display 110 are described in detail with respect to FIGS. 2-6.
- Processor 130 is preferably a high-speed processor, such as an Intel Pentium® processor, capable of processing simple and complex graphic applications. Processor 130 communicates with display 110 and controls active edge user interface 100. Although illustrated as an external unit, processor 130 can be integrated into display 110 or located in a peripheral device.
- Memory 140 is a random access memory (RAM) that communicates with processor 130 to store and retrieve data and software. Preferably, memory 140 facilitates high-speed access to enhance the storage and retrieval process. As illustrated in FIG. 1, memory 140 includes data storage 150 and user interface software 160. One skilled in the art will appreciate that memory 140 can store additional data and software not described herein. For example, in a wireless communications environment, memory 140 may include communications software to support the transfer of voice signals to and from a cell site.
- Data storage 150 is an area of memory 140 that stores data. For example, when utilizing active edge input device 120 in a wireless communications device, data storage 150 may include a listing of telephone numbers or call information (e.g., the number of calls received within a specified time period). Of course, the type of data resident in data storage 150 may change based on the user environment.
- User interface software 160 is a software program resident in memory 140 that implements methods of active edge user interface 100 in accordance with the present invention. User interface software 160 is executed by processor 130 to respond to user inputs on active edge input device 120. User interface software 160 interprets the user inputs and implements an appropriate response. For example, if a user wishes to call a friend, the user selects the friend's name from a telephone listing displayed on the screen by pressing on active edge input device 120 in a predetermined area (e.g., adjacent the friend's name). In response to the selection, user interface software 160 associates the name with a telephone number stored in data storage 150 and instructs processor 130 to dial the number. User interface software 160 can be configured to operate in a variety of user environments, such as on a desktop computer or a public kiosk.
- FIGS. 2a-2c illustrate cross-sectional views of active edge input device 120 in accordance with a preferred embodiment consistent with the present invention. As illustrated in FIG. 1, active edge input device 120 is a strip of material that extends along a border of display 110 and is responsive to touch or pressure. Active edge input device 120 is designed to provide “two-step” functionality. A first function is implemented at the first step when a first pressure or touch is applied to the input device (e.g., pressure applied by a human finger). A second function is implemented at the second step when a second pressure is applied to the same area on the input device (e.g., additional pressure applied by a human finger in the same location).
- FIG. 2a illustrates a cross-sectional view of active edge input device 120 at rest. Active edge input device 120 includes a flexible strip 200 positioned adjacent a host device body surface 260. Body surface 260 is a surface of the host device in which active edge user interface 100 is employed. For example, if active edge user interface 100 is employed in a wireless communication device, then body surface 260 is a surface of the wireless communication device body.
- Flexible strip 200 is an elastomer strip of material that includes an upper surface 205, a lower surface 207, and one or more cavities 210. Although an elastomer material is preferable, flexible strip 200 can be composed of any resilient material. Preferably, flexible strip 200 is a continuous strip of material that extends around at least one side of display 110. However, flexible strip 200 may be sectioned (i.e., non-continuous) as appropriate in the user environment to satisfy design requirements.
- Upper surface 205 is a surface of flexible strip 200 that is exposed to a user, as illustrated in FIG. 1. Preferably, upper surface 205 is smooth; however, it may include protrusions or have a distinct texture to allow users to locate certain areas on active edge input device 120 by touch alone. The smoothness of upper surface 205 allows a user to drag a finger or other instrument along flexible strip 200 in a sweeping motion. This motion, for example, may be used to implement a scrolling function, which allows a user to quickly view information on display 110.
- Lower surface 207 includes one or more protrusions 208 that extend outward and include extensions 209. The faces of protrusions 208 include upper electrical contacts 220 fixed thereon. Preferably, these electrical contacts are made from a conductive carbon material and form a continuous ring around extensions 209, as illustrated in FIG. 2a. Upper electrical contacts 220 can, however, be sectioned into distinct units that are spaced around extensions 209. The faces of extensions 209 include lower electrical contacts 230 fixed thereon. These electrical contacts are “puck-shaped” and are preferably formed from a carbon material.
- Body surface 260 includes body protrusion electrical contacts 240 and body extension electrical contacts 250, which are fixed thereon. Preferably, these electrical contacts are also composed of carbon and are aligned with upper electrical contacts 220 and lower electrical contacts 230, respectively. A gap exists between the electrical contacts on body surface 260 and the electrical contacts on flexible strip 200 while active edge input device 120 is at rest.
- Cavities 210 are formed in an area of flexible strip 200 adjacent each protrusion 208. Preferably, each of cavities 210 is formed in an image of protrusions 208 and extensions 209, but may have any shape. Cavities 210 are designed to collapse when a pressure is applied and return to their original shape when the pressure is released. Thus, cavities 210 provide a “soft button” effect when engaged by a user. The deformation of cavities 210 under pressure is illustrated in FIGS. 2b and 2c.
- FIG. 2b illustrates a cross-sectional view of a first pressure applied to active edge input device 120 consistent with a first embodiment of the present invention. This figure shows the first step of the “two-step” functionality described herein. In this instance, a first pressure (e.g., a “touch”) is applied to an area 270 of flexible strip 200, which deforms upper surface 205 and cavity 210. The pressure forces protrusion 208 downward until lower electrical contact 230 makes contact with body extension electrical contact 250. The connection of these two electrical contacts generates a signal that is sent to processor 130 for processing. A discussion of how processor 130 responds to this connection is provided with respect to FIGS. 4-6. Pressure on one area of flexible strip 200 affects only the components directly below. That is, if pressure is applied to one of three adjacent areas on flexible strip 200, only the selected area will respond to the pressure, as shown in FIG. 2b.
- FIG. 2c illustrates a cross-sectional view of a second pressure applied to a user input device consistent with a first embodiment of the present invention. This figure shows the second step of the “two-step” functionality described herein. In this instance, the first pressure shown on area 270 is increased to a second pressure (e.g., a “press”) until upper electrical contact 220 makes contact with body protrusion electrical contact 240. In this position, both lower electrical contact 230 and upper electrical contact 220 are electrically coupled with the respective body electrical contacts under area 270. This connection generates a second signal to processor 130, which is processed accordingly.
- FIGS. 3a-3c illustrate cross-sectional views of a user input device consistent with a second embodiment of the present invention. In this second embodiment, active edge input device 120 includes an alternative design for entering data into a host device. Although the embodiment in FIGS. 2a-2c is preferred, the active edge input device illustrated in FIGS. 3a-3c also provides “two-step” functionality as described herein.
- FIG. 3a illustrates a cross-sectional view of a second embodiment of active edge input device 120 at rest. As in the first embodiment, active edge input device 120 includes a flexible strip 300 positioned adjacent a host body surface 350. Body surface 350 is a surface of the host device in which active edge user interface 100 is installed. For example, if active edge user interface 100 is installed in a wireless communication device, then body surface 350 is a surface of the wireless communication device.
- Flexible strip 300 is an elastomer strip of material that includes an upper surface 305, a lower surface 307, and one or more cavities 320. Although elastomer is preferable, flexible strip 300 can be composed of any resilient material. Preferably, flexible strip 300 is a continuous strip of material that extends around at least one side of display 110. However, flexible strip 300 may be sectioned (i.e., non-continuous) as appropriate in the user environment to satisfy design requirements.
- Upper surface 305 is a surface of flexible strip 300 that is exposed to a user, as illustrated in FIG. 1. Preferably, upper surface 305 is smooth; however, it may include protrusions to allow users to locate certain areas on active edge input device 120 by touch alone. The smoothness of upper surface 305 allows users to drag a finger or other instrument along flexible strip 300 in a sweeping motion. This motion, for example, may be used to implement a scrolling function, which allows a user to scroll through information on display 110.
- Lower surface 307 includes a resistive plate 310 that is responsive to a human touch. Preferably, resistive plate 310 extends along lower surface 307 as a continuous strip of conductive material. However, resistive plate 310 may have separate and distinct sections that are positioned along lower surface 307. Resistive plate 310 may comprise resistive material currently used in conventional touch-screen devices.
- Attached to resistive plate 310 are one or more protrusions 308 that extend outward and include extensions 309. The faces of extensions 309 include input device electrical contacts 330 fixed thereon, as illustrated in FIG. 3a. These electrical contacts are “puck-shaped” and are formed from an electrically conductive material (e.g., carbon).
- Body surface 350 includes body electrical contacts 340, which are fixed thereon. These electrical contacts are also composed of an electrically conductive material (e.g., carbon) and are aligned with input device electrical contacts 330. A gap exists between the electrical contacts on body surface 350 and the electrical contacts on extensions 309 while active edge input device 120 is at rest.
- Cavities 320 are formed in an area of flexible strip 300 adjacent each protrusion 308. Preferably, each of cavities 320 is formed in an image of protrusions 308 and extensions 309, as illustrated in FIG. 3a, but may have any shape. Cavities 320 are designed to collapse when a pressure is applied and return to their original shape when the pressure is released. Thus, cavities 320 provide a “soft button” effect when a pressure is applied thereto by a user. The deformation of cavities 320 under pressure is illustrated in FIGS. 3b and 3c.
- FIG. 3b illustrates a cross-sectional view of a touch applied to active edge input device 120 consistent with a second embodiment of the present invention. This figure shows the first step of the “two-step” functionality described herein. In this instance, a voltage is applied to resistive plate 310 during operation of the host device. When a human touches upper surface 305 of flexible strip 300 (e.g., on area 360), a change in voltage is detected and a first signal is generated. Processor 130 receives the first signal and responds by implementing user interface software 160. A discussion of how processor 130 implements user interface software 160 is provided with respect to FIGS. 4-6. Although FIG. 3b illustrates deformation of flexible strip 300 in the area where a touch is applied, active edge input device 120 can be configured to simply sense a human touch without requiring the application of pressure to flexible strip 300. In this instance, resistive plate 310 simply detects the presence of a human touch on area 360 and does not require any deformation of flexible strip 300.
- FIG. 3c illustrates a cross-sectional view of a pressure applied to active edge input device 120 consistent with a second embodiment of the present invention. This figure shows the second step of the “two-step” functionality described herein. In this instance, the first pressure shown in FIG. 3b is increased to a second pressure (e.g., a “press”) on area 370 of flexible strip 300 until input device electrical contact 330 makes contact with body electrical contact 340. The second pressure deforms flexible strip 300, including resistive plate 310 and cavity 320. The connection of the electrical contacts generates a second signal to processor 130, which is processed accordingly by implementing user interface software 160.
display 400, activeedge input devices - FIGS. 4a and 4 b illustrate a mode of operation for an active edge user interface consistent with the present invention. The user environment illustrated in these figures includes a notebook computer with an active edge user interface. The notebook computer includes a
display 400 and activeedge input devices display 400, respectively. Activeedge input devices edge input devices - Initially, information stored in
data storage 150 or a peripheral device is generated ondisplay 400. As shown in FIG. 4a, this information relates to fashion and includes a main category “clothing” displayed on the left side ofdisplay 400 and a plurality of sub-categories including “shoes, socks, shirts, pants, jackets, scarfs, and hats” displayed on the right side ofdisplay 400. In operation, a user can touch or press an area of activeedge input device 420 to highlight a sub-category adjacent thereto. In addition, users can drag their finger down or up active edgeuser input device 420 to scroll through the sub-categories. As illustrated in FIG. 4a, the sub-category “shirts” is highlighted as a result of a touch or press on an adjacent area of activeedge input device 420. A sub-category, or any data displayed and selected using embodiments consistent with the present invention, can by highlighted in many different ways. For example, the selected data can change colors, expand, contract, flash, or be affected in any manner that indicates it has been selected by a user via activeedge input device 420. - The touch or press on active
edge input device 420 corresponding to the selection of the “shirts” sub-category sends a first signal to processor 130, which processes the signal using user interface software 160. User interface software 160 interprets the signal as a selection of the “shirts” category based on the screen location of the currently displayed data and the selected area on active edge input device 420. Since the touch or press only implements the first step of the “two-step” functionality described herein, the “shirts” category is simply highlighted for the user. - Once the sub-category is highlighted, the user has the option of accepting the selected category or moving to another displayed category. The latter option highlights a newly selected sub-category in a manner similar to the highlighted “shirts” sub-category. If the user chooses to accept the “shirts” sub-category, they simply increase the pressure on active
edge input device 420 until the electrical contacts of active edge input device 420 contact the electrical contacts connected to a surface of the host device. This operation implements the second step of the “two-step” functionality described herein. At this point, a second signal is sent to processor 130 indicating that the selection is accepted, and the “shirts” sub-category moves to the left side of the screen under the “clothing” category, as illustrated in FIG. 4b. User interface software 160 then implements the function associated with the user selection, which, in this example, is updating the category listing with “shirts.” - The function implemented by
user interface software 160 will change depending on the user environment. For example, the display may show an “Announce” function that, when selected, announces predetermined information to specified subscribers over a wireless or wireline communication channel. The “Announce” function may allow the user to select the priority of the announcement by displaying priority selections adjacent an active edge input device (e.g., gold priority for urgent, silver priority for semi-urgent, and bronze priority for not urgent). Using the active edge input device, the user can scroll through the displayed priority categories and select the desired priority using the “two-step” functionality described herein. Another example of this feature is discussed with reference to FIGS. 5a-5d. - FIG. 5a illustrates an implementation of an active edge user interface on a
wireless communications device 500 for responding to a call consistent with the present invention. Wireless communications device 500 is a host device that includes a display 510, an active edge input device 520, and a keypad 525. The upper highlighted portion of display 510 indicates the currently displayed function (e.g., “call from” or “contact”). The middle portion of display 510 shows data entered by a user or received from a remote device. The lower portion of display 510 shows function parameters, such as “Fwd,” “Ans,” and “Send.” Active edge input device 520 is a continuous strip of flexible material that borders three sides of display 510. Active edge input device 520 includes protrusions in the shape of ribs 540 on the left and right sides of display 510, and buttons 550 on the bottom side of the display. One or more buttons 550 correspond to one or more of the displayed function parameters.
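The highlight-by-adjacency behavior described for FIGS. 4a-5d, where dragging a finger along the edge strip scrolls a highlight through the items displayed beside it, can be sketched as a mapping from the touch coordinate on the strip to the adjacent list item. This is a hypothetical illustration only; `item_at_position`, the even item layout, and the coordinate units are assumptions rather than details from the disclosure.

```python
# Hypothetical sketch: map a touch along an active edge strip to the
# on-screen item displayed next to it, assuming items are laid out
# evenly along the strip's length.

def item_at_position(touch_y, strip_length, items):
    """Return the index of the item adjacent to a touch at touch_y
    (0 .. strip_length) on the edge strip."""
    if not items:
        raise ValueError("no items displayed")
    # Clamp the touch coordinate to the strip, then scale it to an index.
    touch_y = max(0.0, min(touch_y, strip_length))
    index = int(touch_y / strip_length * len(items))
    return min(index, len(items) - 1)

items = ["shoes", "socks", "shirts", "pants", "jackets", "scarfs", "hats"]
# A touch roughly a third of the way down a 100-unit strip lands on "shirts".
highlighted = item_at_position(35.0, strip_length=100.0, items=items)
```

Re-running the mapping on each pressure sample as the finger drags up or down yields the scrolling highlight described above.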
- Display 510 in FIG. 5a indicates to the user that wireless communications device 500 is receiving or has received a call from “Alan Frank,” whose telephone number is “459-6232.” The user has the option of answering or forwarding the call by pressing the appropriate button 550. If the “Ans” function parameter is selected, wireless communications device 500 connects the call. If the “Fwd” function parameter is selected, the user has the option of forwarding the call to “VMail” (i.e., voicemail) or to “Home” (i.e., to telephone number “763-5463”), as illustrated in FIG. 5b. The user can move between each displayed option, for example, by dragging a finger along the left or right side surface of active edge input device 520. One skilled in the art will appreciate that the active edge user interface may be configured such that the user can use only one side of active edge input device 520 to select between the options on display 510. - When the user is touching or slightly pressing on an area of active
edge input device 520 adjacent a desired option, the option is highlighted, as shown in FIG. 5b. The touching or slight pressure represents the first step of the “two-step” functionality implemented by embodiments consistent with the present invention. To accept the highlighted option, the user presses harder on active edge input device 520, which forwards Alan Frank's call to the user's home. This secondary pressure represents the second step of the “two-step” functionality. The user may choose to quit the current display at any time by touching on active edge input device 520 below the displayed “Quit” function parameter. - The user may choose to make a call from
wireless communications device 500. In this instance, the user presses on active edge input device 520 below the “Call” function, as illustrated in FIG. 5c. Upon selecting this function, a list of names stored in memory appears on display 510. If the list is voluminous, the user can scroll through the list by dragging (e.g., touching or slightly pressing) a finger or other instrument in an upward or downward motion across the surface of active edge input device 520. In the scrolling mode, display 510 may automatically switch to an iconic view to show where the user is on the list, as shown in FIG. 5c. - Upon reaching a desired name on the list, the name is highlighted by the touch or slight pressure on active
edge input device 520 adjacent the name, as illustrated in FIG. 5d. The user can then initiate the call by pressing harder on active edge input device 520. Alternatively, the user could send only a message to a specified person by selecting the appropriate function key on the bottom of display 510. - FIG. 6 illustrates a method for implementing an active edge user interface consistent with the present invention. Initially, an active edge user interface generates an image on a display in response to a touch or pressure on a predetermined area of an input device adjacent the display (step 600). Subsequently, the active edge user interface implements a function associated with the image when a greater pressure is applied to the predetermined area of the input device (step 620). The function, for example, could be calling a highlighted name (i.e., represented by the image) on a wireless communications device.
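The two-step method of FIG. 6 can be sketched as a small pressure classifier: a reading above a light “touch” threshold highlights the adjacent image (step 600), and a reading above a heavier “press” threshold implements the associated function (step 620). The class, the callback names, and the threshold values below are illustrative assumptions; the patent discloses the two pressure levels but not any particular software implementation.

```python
# Hypothetical sketch of the "two-step" functionality of FIG. 6.
# Thresholds are illustrative stand-ins for the light touch sensed by the
# resistive layer and the harder press that closes the electrical contacts.

TOUCH_THRESHOLD = 0.2   # first pressure: highlight only (step 600)
PRESS_THRESHOLD = 0.8   # second pressure: accept the selection (step 620)

class ActiveEdge:
    def __init__(self, on_highlight, on_select):
        self.on_highlight = on_highlight  # step 600: highlight the image
        self.on_select = on_select        # step 620: implement the function
        self.highlighted = None

    def sample(self, area, pressure):
        """Process one pressure reading from an area of the edge strip."""
        if pressure >= PRESS_THRESHOLD:
            # Second step: contacts meet, send the "accept" signal.
            self.on_select(area)
        elif pressure >= TOUCH_THRESHOLD:
            # First step: light touch, highlight the adjacent item once.
            if area != self.highlighted:
                self.highlighted = area
                self.on_highlight(area)

events = []
edge = ActiveEdge(on_highlight=lambda a: events.append(("highlight", a)),
                  on_select=lambda a: events.append(("select", a)))
edge.sample("shirts", 0.3)   # light touch: highlight only
edge.sample("shirts", 0.9)   # harder press: accept the selection
```

Feeding the classifier a stream of (area, pressure) samples reproduces the highlight-then-accept sequence described for the “shirts” sub-category.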
- Systems and methods consistent with the present invention thus provide an active edge user interface that offers great functionality and ease-of-use. Moreover, an active edge user interface consistent with the present invention eliminates the need to touch the actual display while preserving the benefits of a graphical user interface.
- While preferred embodiments and methods of the present invention have been illustrated and described, those skilled in the art will understand that various changes and modifications may be made, and equivalents may be substituted for elements thereof, without departing from the true scope of the invention.
- In addition, many modifications may be made to adapt a particular element, technique or implementation to the teachings of the present invention without departing from the central scope of the invention. Therefore, this invention should not be limited to the particular embodiments and methods disclosed herein, but should include all embodiments falling within the scope of the appended claims.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/097,150 US6369803B2 (en) | 1998-06-12 | 1998-06-12 | Active edge user interface |
Publications (2)
Publication Number | Publication Date |
---|---|
US20010043189A1 true US20010043189A1 (en) | 2001-11-22 |
US6369803B2 US6369803B2 (en) | 2002-04-09 |
Family
ID=22261515
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/097,150 Expired - Lifetime US6369803B2 (en) | 1998-06-12 | 1998-06-12 | Active edge user interface |
Country Status (1)
Country | Link |
---|---|
US (1) | US6369803B2 (en) |
Cited By (83)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030095095A1 (en) * | 2001-11-20 | 2003-05-22 | Nokia Corporation | Form factor for portable device |
US20040228266A1 (en) * | 2003-05-14 | 2004-11-18 | Knight Kenneth Kingston | Digital animation disk |
EP1526440A1 (en) * | 2003-10-24 | 2005-04-27 | Giat Industries | Two dimensional pointing device |
WO2006045209A2 (en) * | 2004-10-26 | 2006-05-04 | Dätwyler I/O Devices Ag | Input device |
US20060146036A1 (en) * | 2004-12-30 | 2006-07-06 | Michael Prados | Input device |
US20060146039A1 (en) * | 2004-12-30 | 2006-07-06 | Michael Prados | Input device |
US20060146037A1 (en) * | 2004-12-30 | 2006-07-06 | Michael Prados | Input device |
US20060238517A1 (en) * | 2005-03-04 | 2006-10-26 | Apple Computer, Inc. | Electronic Device Having Display and Surrounding Touch Sensitive Bezel for User Interface and Control |
EP1764987A1 (en) * | 2005-09-16 | 2007-03-21 | NTT DoCoMo Inc. | Mobile terminal and program for pre-explanation of multi-function keys before executing the functions |
WO2007103631A2 (en) * | 2006-03-03 | 2007-09-13 | Apple Inc. | Electronic device having display and surrounding touch sensitive bezel for user interface and control |
US20080001924A1 (en) * | 2006-06-29 | 2008-01-03 | Microsoft Corporation | Application switching via a touch screen interface |
WO2008028499A1 (en) * | 2006-09-05 | 2008-03-13 | Nokia Corporation | Mobile electronic device with competing input devices |
US20080074400A1 (en) * | 2000-11-30 | 2008-03-27 | Palm, Inc. | Input detection system for a portable electronic device |
EP1973208A3 (en) * | 2007-03-17 | 2008-11-19 | Aizo AG | Method for operating and programming switches, in particular light switches |
US20090174687A1 (en) * | 2008-01-04 | 2009-07-09 | Craig Michael Ciesla | User Interface System |
US20090174673A1 (en) * | 2008-01-04 | 2009-07-09 | Ciesla Craig M | System and methods for raised touch screens |
US20090287834A1 (en) * | 2008-05-15 | 2009-11-19 | Alcorn Byron A | Method and system for allocating on-demand resources using a connection manager |
US20100103137A1 (en) * | 2008-01-04 | 2010-04-29 | Craig Michael Ciesla | User interface system and method |
US20100117983A1 (en) * | 2008-11-10 | 2010-05-13 | Asustek Computer Inc. | Resistive touch panel and method for detecting touch points thereof |
WO2010055195A1 (en) | 2008-11-14 | 2010-05-20 | Nokia Corporation | Warning system for breaking touch screen or display |
US20100171719A1 (en) * | 2009-01-05 | 2010-07-08 | Ciesla Michael Craig | User interface system |
US20100171720A1 (en) * | 2009-01-05 | 2010-07-08 | Ciesla Michael Craig | User interface system |
US20100238138A1 (en) * | 2009-02-15 | 2010-09-23 | Neonode Inc. | Optical touch screen systems using reflected light |
US20100238139A1 (en) * | 2009-02-15 | 2010-09-23 | Neonode Inc. | Optical touch screen systems using wide light beams |
US20110001613A1 (en) * | 2009-07-03 | 2011-01-06 | Craig Michael Ciesla | Method for adjusting the user interface of a device |
US20110012851A1 (en) * | 2009-07-03 | 2011-01-20 | Craig Michael Ciesla | User Interface Enhancement System |
WO2011006678A1 (en) * | 2009-07-15 | 2011-01-20 | Sony Ericsson Mobile Communications Ab | Sensor assembly and display including a sensor assembly |
US20110148793A1 (en) * | 2008-01-04 | 2011-06-23 | Craig Michael Ciesla | User Interface System |
US20110157080A1 (en) * | 2008-01-04 | 2011-06-30 | Craig Michael Ciesla | User Interface System |
US20120094723A1 (en) * | 2002-12-10 | 2012-04-19 | Neonode, Inc. | User interface |
WO2012158902A2 (en) | 2011-05-19 | 2012-11-22 | Microsoft Corporation | Pressure-sensitive multi-touch device |
US20130069886A1 (en) * | 2011-09-16 | 2013-03-21 | Wan-Qiu Wang | Edge grip detection method of a touch panel and a device using the same |
US8416217B1 (en) | 2002-11-04 | 2013-04-09 | Neonode Inc. | Light-based finger gesture user interface |
US8456438B2 (en) | 2008-01-04 | 2013-06-04 | Tactus Technology, Inc. | User interface system |
US8553005B2 (en) | 2008-01-04 | 2013-10-08 | Tactus Technology, Inc. | User interface system |
US8570295B2 (en) | 2008-01-04 | 2013-10-29 | Tactus Technology, Inc. | User interface system |
US8587541B2 (en) | 2010-04-19 | 2013-11-19 | Tactus Technology, Inc. | Method for actuating a tactile interface layer |
TWI419039B (en) * | 2009-12-14 | 2013-12-11 | Casio Computer Co Ltd | Touch panel |
US8619035B2 (en) | 2010-02-10 | 2013-12-31 | Tactus Technology, Inc. | Method for assisting user input to a device |
US8643628B1 (en) | 2012-10-14 | 2014-02-04 | Neonode Inc. | Light-based proximity detection system and user interface |
US8654524B2 (en) | 2009-08-17 | 2014-02-18 | Apple Inc. | Housing as an I/O device |
US8674966B2 (en) | 2001-11-02 | 2014-03-18 | Neonode Inc. | ASIC controller for light-based touch screen |
US8775023B2 (en) | 2009-02-15 | 2014-07-08 | Neonode Inc. | Light-based touch controls on a steering wheel and dashboard |
US8917239B2 (en) | 2012-10-14 | 2014-12-23 | Neonode Inc. | Removable protective cover with embedded proximity sensors |
US8922510B2 (en) | 2008-01-04 | 2014-12-30 | Tactus Technology, Inc. | User interface system |
US8947383B2 (en) | 2008-01-04 | 2015-02-03 | Tactus Technology, Inc. | User interface system and method |
US9013417B2 (en) | 2008-01-04 | 2015-04-21 | Tactus Technology, Inc. | User interface system |
US9019228B2 (en) | 2008-01-04 | 2015-04-28 | Tactus Technology, Inc. | User interface system |
US9052777B2 (en) | 2001-11-02 | 2015-06-09 | Neonode Inc. | Optical elements with alternating reflective lens facets |
US9052790B2 (en) | 2008-01-04 | 2015-06-09 | Tactus Technology, Inc. | User interface and methods |
US9063627B2 (en) | 2008-01-04 | 2015-06-23 | Tactus Technology, Inc. | User interface and methods |
US9128525B2 (en) | 2008-01-04 | 2015-09-08 | Tactus Technology, Inc. | Dynamic tactile interface |
US9158416B2 (en) | 2009-02-15 | 2015-10-13 | Neonode Inc. | Resilient light-based touch surface |
US9164625B2 (en) | 2012-10-14 | 2015-10-20 | Neonode Inc. | Proximity sensor for determining two-dimensional coordinates of a proximal object |
US9239623B2 (en) | 2010-01-05 | 2016-01-19 | Tactus Technology, Inc. | Dynamic tactile interface |
US9274612B2 (en) | 2008-01-04 | 2016-03-01 | Tactus Technology, Inc. | User interface system |
US9280224B2 (en) | 2012-09-24 | 2016-03-08 | Tactus Technology, Inc. | Dynamic tactile interface and methods |
US9298261B2 (en) | 2008-01-04 | 2016-03-29 | Tactus Technology, Inc. | Method for actuating a tactile interface layer |
US9367132B2 (en) | 2008-01-04 | 2016-06-14 | Tactus Technology, Inc. | User interface system |
US9372565B2 (en) | 2008-01-04 | 2016-06-21 | Tactus Technology, Inc. | Dynamic tactile interface |
US9405417B2 (en) | 2012-09-24 | 2016-08-02 | Tactus Technology, Inc. | Dynamic tactile interface and methods |
US9423875B2 (en) | 2008-01-04 | 2016-08-23 | Tactus Technology, Inc. | Dynamic tactile interface with exhibiting optical dispersion characteristics |
US9552065B2 (en) | 2008-01-04 | 2017-01-24 | Tactus Technology, Inc. | Dynamic tactile interface |
US9557915B2 (en) | 2008-01-04 | 2017-01-31 | Tactus Technology, Inc. | Dynamic tactile interface |
US9557813B2 (en) | 2013-06-28 | 2017-01-31 | Tactus Technology, Inc. | Method for reducing perceived optical distortion |
US9588683B2 (en) | 2008-01-04 | 2017-03-07 | Tactus Technology, Inc. | Dynamic tactile interface |
US9588684B2 (en) | 2009-01-05 | 2017-03-07 | Tactus Technology, Inc. | Tactile interface for a computing device |
US9612659B2 (en) | 2008-01-04 | 2017-04-04 | Tactus Technology, Inc. | User interface system |
US9619030B2 (en) | 2008-01-04 | 2017-04-11 | Tactus Technology, Inc. | User interface system and method |
US9720501B2 (en) | 2008-01-04 | 2017-08-01 | Tactus Technology, Inc. | Dynamic tactile interface |
US9741184B2 (en) | 2012-10-14 | 2017-08-22 | Neonode Inc. | Door handle with optical proximity sensors |
US9760172B2 (en) | 2008-01-04 | 2017-09-12 | Tactus Technology, Inc. | Dynamic tactile interface |
US9785258B2 (en) | 2003-09-02 | 2017-10-10 | Apple Inc. | Ambidextrous mouse |
US9921661B2 (en) | 2012-10-14 | 2018-03-20 | Neonode Inc. | Optical proximity sensor and associated user interface |
US10282034B2 (en) | 2012-10-14 | 2019-05-07 | Neonode Inc. | Touch sensitive curved and flexible displays |
US10324565B2 (en) | 2013-05-30 | 2019-06-18 | Neonode Inc. | Optical proximity sensor |
US20190278332A1 (en) * | 2018-03-09 | 2019-09-12 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Electronic device and manufacturing method thereof |
US10585530B2 (en) | 2014-09-23 | 2020-03-10 | Neonode Inc. | Optical proximity sensor |
US10949082B2 (en) | 2016-09-06 | 2021-03-16 | Apple Inc. | Processing capacitive touch gestures implemented on an electronic device |
US11275405B2 (en) | 2005-03-04 | 2022-03-15 | Apple Inc. | Multi-functional hand-held device |
US11429230B2 (en) | 2018-11-28 | 2022-08-30 | Neonode Inc | Motorist user interface sensor |
US11669210B2 (en) | 2020-09-30 | 2023-06-06 | Neonode Inc. | Optical touch sensor |
US11842014B2 (en) | 2019-12-31 | 2023-12-12 | Neonode Inc. | Contactless touch input system |
Families Citing this family (77)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6525715B2 (en) * | 1997-03-24 | 2003-02-25 | Seiko Epson Corporation | Portable information acquisition device |
US7834855B2 (en) | 2004-08-25 | 2010-11-16 | Apple Inc. | Wide touchpad on a portable computer |
JP3792920B2 (en) * | 1998-12-25 | 2006-07-05 | 株式会社東海理化電機製作所 | Touch operation input device |
US7151528B2 (en) * | 1999-06-22 | 2006-12-19 | Cirque Corporation | System for disposing a proximity sensitive touchpad behind a mobile phone keypad |
US20060121938A1 (en) * | 1999-08-12 | 2006-06-08 | Hawkins Jeffrey C | Integrated handheld computing and telephony device |
US8064886B2 (en) * | 1999-08-12 | 2011-11-22 | Hewlett-Packard Development Company, L.P. | Control mechanisms for mobile devices |
US6781575B1 (en) | 2000-09-21 | 2004-08-24 | Handspring, Inc. | Method and apparatus for organizing addressing elements |
US7007239B1 (en) * | 2000-09-21 | 2006-02-28 | Palm, Inc. | Method and apparatus for accessing a contacts database and telephone services |
US6924792B1 (en) * | 2000-03-10 | 2005-08-02 | Richard V. Jessop | Electrowetting and electrostatic screen display systems, colour displays and transmission means |
US8531276B2 (en) | 2000-03-15 | 2013-09-10 | Logitech Europe S.A. | State-based remote control system |
US6784805B2 (en) | 2000-03-15 | 2004-08-31 | Intrigue Technologies Inc. | State-based remote control system |
US20010033243A1 (en) * | 2000-03-15 | 2001-10-25 | Harris Glen Mclean | Online remote control configuration system |
JP3785902B2 (en) * | 2000-07-11 | 2006-06-14 | インターナショナル・ビジネス・マシーンズ・コーポレーション | Device, device control method, pointer movement method |
JP2002149308A (en) * | 2000-11-10 | 2002-05-24 | Nec Corp | Information input method and input device |
US7088343B2 (en) * | 2001-04-30 | 2006-08-08 | Lenovo (Singapore) Pte., Ltd. | Edge touchpad input device |
JP4127982B2 (en) * | 2001-05-28 | 2008-07-30 | 富士フイルム株式会社 | Portable electronic devices |
US7808487B2 (en) * | 2001-06-06 | 2010-10-05 | Cirque Corporation | System for disposing a proximity sensitive touchpad behind a mobile phone keymat |
JP3971907B2 (en) * | 2001-09-17 | 2007-09-05 | アルプス電気株式会社 | Coordinate input device and electronic device |
CN1582465B (en) | 2001-11-01 | 2013-07-24 | 伊梅森公司 | Input device and mobile telephone comprising the input device |
US6791529B2 (en) * | 2001-12-13 | 2004-09-14 | Koninklijke Philips Electronics N.V. | UI with graphics-assisted voice control system |
US6888537B2 (en) * | 2002-02-13 | 2005-05-03 | Siemens Technology-To-Business Center, Llc | Configurable industrial input devices that use electrically conductive elastomer |
US6809280B2 (en) * | 2002-05-02 | 2004-10-26 | 3M Innovative Properties Company | Pressure activated switch and touch panel |
EP1505484B1 (en) | 2002-05-16 | 2012-08-15 | Sony Corporation | Inputting method and inputting apparatus |
US7362386B2 (en) * | 2002-08-06 | 2008-04-22 | Toshiba America Consumer Products, L.L.C. | Integrated structural screen panel for projection television |
TW579010U (en) * | 2002-11-15 | 2004-03-01 | Lite On Technology Corp | Input device for alarming of excessive applied force |
US7295852B1 (en) | 2003-05-01 | 2007-11-13 | Palm, Inc. | Automated telephone conferencing method and system |
ATE502685T1 (en) * | 2004-03-22 | 2011-04-15 | Nintendo Co Ltd | GAME APPARATUS, GAME PROGRAM, STORAGE MEDIUM IN WHICH THE GAME PROGRAM IS STORED, AND GAME CONTROL METHOD |
US7417625B2 (en) * | 2004-04-29 | 2008-08-26 | Scenera Technologies, Llc | Method and system for providing input mechanisms on a handheld electronic device |
JP4148187B2 (en) * | 2004-06-03 | 2008-09-10 | ソニー株式会社 | Portable electronic device, input operation control method and program thereof |
JP4303167B2 (en) * | 2004-06-11 | 2009-07-29 | アルプス電気株式会社 | Input device |
US7561146B1 (en) | 2004-08-25 | 2009-07-14 | Apple Inc. | Method and apparatus to reject accidental contact on a touchpad |
US20060092177A1 (en) * | 2004-10-30 | 2006-05-04 | Gabor Blasko | Input method and apparatus using tactile guidance and bi-directional segmented stroke |
US7468199B2 (en) * | 2004-12-23 | 2008-12-23 | 3M Innovative Properties Company | Adhesive membrane for force switches and sensors |
US7260999B2 (en) * | 2004-12-23 | 2007-08-28 | 3M Innovative Properties Company | Force sensing membrane |
US7892096B2 (en) * | 2005-02-22 | 2011-02-22 | Wms Gaming Inc. | Gaming machine with configurable button panel |
DE102006018238A1 (en) | 2005-04-20 | 2007-03-29 | Logitech Europe S.A. | Remote control system for home theater system, analyzes log of events stored by remote controller to identify patterns of interest in logged use of remote controller |
JP2006345209A (en) * | 2005-06-08 | 2006-12-21 | Sony Corp | Input device, information processing apparatus, information processing method, and program |
US7509881B2 (en) * | 2005-07-29 | 2009-03-31 | 3M Innovative Properties Company | Interdigital force switches and sensors |
US20070055597A1 (en) * | 2005-09-08 | 2007-03-08 | Visa U.S.A. | Method and system for manipulating purchase information |
CN100583012C (en) * | 2005-09-21 | 2010-01-20 | 鸿富锦精密工业(深圳)有限公司 | Crossing-type menu displaying device and display control method |
CN100592247C (en) * | 2005-09-21 | 2010-02-24 | 鸿富锦精密工业(深圳)有限公司 | Multi-gradation menu displaying device and display control method |
TWI320160B (en) * | 2005-09-23 | 2010-02-01 | Apparatus and method for displaying a multi-level menu | |
CN1940834B (en) * | 2005-09-30 | 2014-10-29 | 鸿富锦精密工业(深圳)有限公司 | Circular menu display device and its display controlling method |
CN1949161B (en) * | 2005-10-14 | 2010-05-26 | 鸿富锦精密工业(深圳)有限公司 | Multi gradation menu displaying device and display controlling method |
DE102006038293A1 (en) * | 2005-10-28 | 2007-06-06 | Volkswagen Ag | Input device for motor vehicles, has adjustable transparency layer that is arranged between display and touch screen |
US20070152983A1 (en) | 2005-12-30 | 2007-07-05 | Apple Computer, Inc. | Touch pad with symbols based on mode |
FR2899719B1 (en) * | 2006-04-11 | 2008-05-16 | Itt Mfg Enterprises Inc | ELECTRICAL SWITCH WITH MULTIPLE SWITCHES |
TW200823729A (en) * | 2006-06-14 | 2008-06-01 | Polymer Vision Ltd | User input on rollable display device |
US8022935B2 (en) | 2006-07-06 | 2011-09-20 | Apple Inc. | Capacitance sensing electrode with integrated I/O mechanism |
WO2008041062A2 (en) * | 2006-10-06 | 2008-04-10 | Bethesda Waters And Associates, Llc | System for tracking a person or object and analyzing and reporting related event information |
US7804488B2 (en) * | 2006-11-03 | 2010-09-28 | Research In Motion Limited | Method of employing a switch assembly to provide input, and handheld electronic device |
US20080136784A1 (en) * | 2006-12-06 | 2008-06-12 | Motorola, Inc. | Method and device for selectively activating a function thereof |
KR101405928B1 (en) * | 2007-06-07 | 2014-06-12 | 엘지전자 주식회사 | A method for generating key signal in mobile terminal and the mobile terminal |
US20090174679A1 (en) | 2008-01-04 | 2009-07-09 | Wayne Carl Westerman | Selective Rejection of Touch Contacts in an Edge Region of a Touch Surface |
KR20100006219A (en) * | 2008-07-09 | 2010-01-19 | 삼성전자주식회사 | Method and apparatus for user interface |
US8500732B2 (en) * | 2008-10-21 | 2013-08-06 | Hermes Innovations Llc | Endometrial ablation devices and systems |
US8294047B2 (en) | 2008-12-08 | 2012-10-23 | Apple Inc. | Selective input signal rejection and modification |
US20100182135A1 (en) * | 2009-01-16 | 2010-07-22 | Research In Motion Limited | Portable electronic device including tactile touch-sensitive display |
US20110087963A1 (en) * | 2009-10-09 | 2011-04-14 | At&T Mobility Ii Llc | User Interface Control with Edge Finger and Motion Sensing |
TWI545468B (en) * | 2010-03-04 | 2016-08-11 | Sentelic Corp | Input device |
US8508401B1 (en) | 2010-08-31 | 2013-08-13 | Logitech Europe S.A. | Delay fixing for command codes in a remote control system |
US8847890B2 (en) | 2011-01-04 | 2014-09-30 | Synaptics Incorporated | Leveled touchsurface with planar translational responsiveness to vertical travel |
US8309870B2 (en) | 2011-01-04 | 2012-11-13 | Cody George Peterson | Leveled touchsurface with planar translational responsiveness to vertical travel |
US8912458B2 (en) | 2011-01-04 | 2014-12-16 | Synaptics Incorporated | Touchsurface with level and planar translational travel responsiveness |
US20130016129A1 (en) * | 2011-07-14 | 2013-01-17 | Google Inc. | Region-Specific User Input |
US8730174B2 (en) | 2011-10-13 | 2014-05-20 | Blackberry Limited | Device and method for receiving input |
US9582178B2 (en) | 2011-11-07 | 2017-02-28 | Immersion Corporation | Systems and methods for multi-pressure interaction on touch-sensitive surfaces |
US9891709B2 (en) | 2012-05-16 | 2018-02-13 | Immersion Corporation | Systems and methods for content- and context specific haptic effects using predefined haptic effects |
US9324515B2 (en) | 2012-08-06 | 2016-04-26 | Synaptics Incorporated | Touchsurface assembly utilizing magnetically enabled hinge |
US9040851B2 (en) | 2012-08-06 | 2015-05-26 | Synaptics Incorporated | Keycap assembly with an interactive spring mechanism |
US9218927B2 (en) | 2012-08-06 | 2015-12-22 | Synaptics Incorporated | Touchsurface assembly with level and planar translational responsiveness via a buckling elastic component |
US9177733B2 (en) | 2012-08-06 | 2015-11-03 | Synaptics Incorporated | Touchsurface assemblies with linkages |
US9904394B2 (en) | 2013-03-13 | 2018-02-27 | Immersion Corporation | Method and devices for displaying graphical user interfaces based on user contact |
US9384919B2 (en) | 2013-03-14 | 2016-07-05 | Synaptics Incorporated | Touchsurface assembly having key guides formed in a sheet metal component |
US9213372B2 (en) | 2013-04-19 | 2015-12-15 | Synaptics Incorporated | Retractable keyboard keys |
US9569008B1 (en) * | 2015-08-31 | 2017-02-14 | Logitech Europe S.A. | Solid state input device |
US10768803B2 (en) | 2015-09-21 | 2020-09-08 | Motorola Solutions, Inc. | User interface system with active and passive display spaces |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4017848A (en) * | 1975-05-19 | 1977-04-12 | Rockwell International Corporation | Transparent keyboard switch and array |
US4085302A (en) * | 1976-11-22 | 1978-04-18 | Control Data Corporation | Membrane-type touch panel |
US4310839A (en) * | 1979-11-23 | 1982-01-12 | Raytheon Company | Interactive display system with touch data entry |
US4471177A (en) * | 1982-08-13 | 1984-09-11 | Press On, Inc. | Enlarged switch area membrane switch and method |
DE3879725T2 (en) * | 1987-02-02 | 1993-07-08 | Sharp Kk | DEVICE FOR A FLAT KEYBOARD. |
US5121091A (en) * | 1989-09-08 | 1992-06-09 | Matsushita Electric Industrial Co., Ltd. | Panel switch |
JPH0458316A (en) * | 1990-06-28 | 1992-02-25 | Toshiba Corp | Information processor |
US5594471A (en) | 1992-01-09 | 1997-01-14 | Casco Development, Inc. | Industrial touchscreen workstation with programmable interface and method |
JPH0695796A (en) * | 1992-09-14 | 1994-04-08 | Mutoh Ind Ltd | Pen input device |
FR2697935B1 (en) * | 1992-11-12 | 1995-01-13 | Sextant Avionique | Compact and ergonomic communication terminal with proximity detection surfaces. |
US5459461A (en) * | 1993-07-29 | 1995-10-17 | Crowley; Robert J. | Inflatable keyboard |
US5724069A (en) * | 1994-07-15 | 1998-03-03 | Chen; Jack Y. | Special purpose terminal for interactive user interface |
US5757361A (en) * | 1996-03-20 | 1998-05-26 | International Business Machines Corporation | Method and apparatus in computer systems to selectively map tablet input devices using a virtual boundary |
US5910802A (en) * | 1997-06-11 | 1999-06-08 | Microsoft Corporation | Operating system for handheld computing device having taskbar auto hide |
- 1998
- 1998-06-12 US US09/097,150 patent/US6369803B2/en not_active Expired - Lifetime
Cited By (166)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9489018B2 (en) | 2000-11-30 | 2016-11-08 | Qualcomm Incorporated | Input detection system for a portable electronic device |
US20080117184A1 (en) * | 2000-11-30 | 2008-05-22 | Palm, Inc. | Flexible screen display with touch sensor in a portable computer |
US20080074400A1 (en) * | 2000-11-30 | 2008-03-27 | Palm, Inc. | Input detection system for a portable electronic device |
US8674966B2 (en) | 2001-11-02 | 2014-03-18 | Neonode Inc. | ASIC controller for light-based touch screen |
US9052777B2 (en) | 2001-11-02 | 2015-06-09 | Neonode Inc. | Optical elements with alternating reflective lens facets |
US7009599B2 (en) * | 2001-11-20 | 2006-03-07 | Nokia Corporation | Form factor for portable device |
US20030095095A1 (en) * | 2001-11-20 | 2003-05-22 | Nokia Corporation | Form factor for portable device |
US9983742B2 (en) | 2002-07-01 | 2018-05-29 | Apple Inc. | Electronic device having display and surrounding touch sensitive bezel for user interface and control |
US8810551B2 (en) | 2002-11-04 | 2014-08-19 | Neonode Inc. | Finger gesture user interface |
US8884926B1 (en) | 2002-11-04 | 2014-11-11 | Neonode Inc. | Light-based finger gesture user interface |
US9262074B2 (en) | 2002-11-04 | 2016-02-16 | Neonode, Inc. | Finger gesture user interface |
US8416217B1 (en) | 2002-11-04 | 2013-04-09 | Neonode Inc. | Light-based finger gesture user interface |
US20120094723A1 (en) * | 2002-12-10 | 2012-04-19 | Neonode, Inc. | User interface |
US8650510B2 (en) * | 2002-12-10 | 2014-02-11 | Neonode Inc. | User interface |
US10088975B2 (en) * | 2002-12-10 | 2018-10-02 | Neonode Inc. | User interface |
US8812993B2 (en) * | 2002-12-10 | 2014-08-19 | Neonode Inc. | User interface |
US20120192094A1 (en) * | 2002-12-10 | 2012-07-26 | Neonode, Inc. | User interface |
US20140325441A1 (en) * | 2002-12-10 | 2014-10-30 | Neonode Inc. | User interface |
US20040228266A1 (en) * | 2003-05-14 | 2004-11-18 | Knight Kenneth Kingston | Digital animation disk |
US9785258B2 (en) | 2003-09-02 | 2017-10-10 | Apple Inc. | Ambidextrous mouse |
US10474251B2 (en) | 2003-09-02 | 2019-11-12 | Apple Inc. | Ambidextrous mouse |
US10156914B2 (en) | 2003-09-02 | 2018-12-18 | Apple Inc. | Ambidextrous mouse |
EP1526440A1 (en) * | 2003-10-24 | 2005-04-27 | Giat Industries | Two dimensional pointing device |
FR2861473A1 (en) * | 2003-10-24 | 2005-04-29 | Giat Ind Sa | BI-DIMENSIONAL POINTING DEVICE |
WO2006045209A2 (en) * | 2004-10-26 | 2006-05-04 | Dätwyler I/O Devices Ag | Input device |
WO2006045209A3 (en) * | 2004-10-26 | 2006-06-29 | Daetwyler I O Devices Ag | Input device |
US7920126B2 (en) | 2004-12-30 | 2011-04-05 | Volkswagen Ag | Input device |
US20060146039A1 (en) * | 2004-12-30 | 2006-07-06 | Michael Prados | Input device |
US8599142B2 (en) * | 2004-12-30 | 2013-12-03 | Volkswagen Ag | Input device |
US20060146036A1 (en) * | 2004-12-30 | 2006-07-06 | Michael Prados | Input device |
US8040323B2 (en) | 2004-12-30 | 2011-10-18 | Volkswagen Ag | Input device |
US20060146037A1 (en) * | 2004-12-30 | 2006-07-06 | Michael Prados | Input device |
US9047009B2 (en) | 2005-03-04 | 2015-06-02 | Apple Inc. | Electronic device having display and surrounding touch sensitive bezel for user interface and control |
US10386980B2 (en) | 2005-03-04 | 2019-08-20 | Apple Inc. | Electronic device having display and surrounding touch sensitive surfaces for user interface and control |
US7656393B2 (en) | 2005-03-04 | 2010-02-02 | Apple Inc. | Electronic device having display and surrounding touch sensitive bezel for user interface and control |
US11360509B2 (en) | 2005-03-04 | 2022-06-14 | Apple Inc. | Electronic device having display and surrounding touch sensitive surfaces for user interface and control |
US20060238517A1 (en) * | 2005-03-04 | 2006-10-26 | Apple Computer, Inc. | Electronic Device Having Display and Surrounding Touch Sensitive Bezel for User Interface and Control |
US10921941B2 (en) | 2005-03-04 | 2021-02-16 | Apple Inc. | Electronic device having display and surrounding touch sensitive surfaces for user interface and control |
US11275405B2 (en) | 2005-03-04 | 2022-03-15 | Apple Inc. | Multi-functional hand-held device |
US20070063988A1 (en) * | 2005-09-16 | 2007-03-22 | Ntt Docomo, Inc. | Mobile terminal device and program used in mobile terminal device |
US8750937B2 (en) | 2005-09-16 | 2014-06-10 | Ntt Docomo, Inc. | Mobile terminal device and program used in mobile terminal device |
EP1764987A1 (en) * | 2005-09-16 | 2007-03-21 | NTT DoCoMo Inc. | Mobile terminal and program for pre-explanation of multi-function keys before executing the functions |
WO2007103631A3 (en) * | 2006-03-03 | 2008-11-13 | Apple Inc | Electronic device having display and surrounding touch sensitive bezel for user interface and control |
EP2141566A3 (en) * | 2006-03-03 | 2013-12-04 | Apple Inc. | Electronic device having display and surrounding touch sensitive bezel for user interface and control |
WO2007103631A2 (en) * | 2006-03-03 | 2007-09-13 | Apple Inc. | Electronic device having display and surrounding touch sensitive bezel for user interface and control |
EP3835920A1 (en) * | 2006-03-03 | 2021-06-16 | Apple Inc. | Electronic device having display and surrounding touch sensitive bezel for user interface and control |
US20080001924A1 (en) * | 2006-06-29 | 2008-01-03 | Microsoft Corporation | Application switching via a touch screen interface |
US7880728B2 (en) * | 2006-06-29 | 2011-02-01 | Microsoft Corporation | Application switching via a touch screen interface |
WO2008028499A1 (en) * | 2006-09-05 | 2008-03-13 | Nokia Corporation | Mobile electronic device with competing input devices |
EP1973208A3 (en) * | 2007-03-17 | 2008-11-19 | Aizo AG | Method for operating and programming switches, in particular light switches |
US9626059B2 (en) | 2008-01-04 | 2017-04-18 | Tactus Technology, Inc. | User interface system |
US20100103137A1 (en) * | 2008-01-04 | 2010-04-29 | Craig Michael Ciesla | User interface system and method |
US9619030B2 (en) | 2008-01-04 | 2017-04-11 | Tactus Technology, Inc. | User interface system and method |
US9720501B2 (en) | 2008-01-04 | 2017-08-01 | Tactus Technology, Inc. | Dynamic tactile interface |
US8456438B2 (en) | 2008-01-04 | 2013-06-04 | Tactus Technology, Inc. | User interface system |
US8547339B2 (en) | 2008-01-04 | 2013-10-01 | Tactus Technology, Inc. | System and methods for raised touch screens |
US8553005B2 (en) | 2008-01-04 | 2013-10-08 | Tactus Technology, Inc. | User interface system |
US8570295B2 (en) | 2008-01-04 | 2013-10-29 | Tactus Technology, Inc. | User interface system |
US9612659B2 (en) | 2008-01-04 | 2017-04-04 | Tactus Technology, Inc. | User interface system |
US9760172B2 (en) | 2008-01-04 | 2017-09-12 | Tactus Technology, Inc. | Dynamic tactile interface |
US8179375B2 (en) * | 2008-01-04 | 2012-05-15 | Tactus Technology | User interface system and method |
US8154527B2 (en) * | 2008-01-04 | 2012-04-10 | Tactus Technology | User interface system |
US9588683B2 (en) | 2008-01-04 | 2017-03-07 | Tactus Technology, Inc. | Dynamic tactile interface |
US9557915B2 (en) | 2008-01-04 | 2017-01-31 | Tactus Technology, Inc. | Dynamic tactile interface |
US9552065B2 (en) | 2008-01-04 | 2017-01-24 | Tactus Technology, Inc. | Dynamic tactile interface |
US9229571B2 (en) | 2008-01-04 | 2016-01-05 | Tactus Technology, Inc. | Method for adjusting the user interface of a device |
US9524025B2 (en) | 2008-01-04 | 2016-12-20 | Tactus Technology, Inc. | User interface system and method |
US9274612B2 (en) | 2008-01-04 | 2016-03-01 | Tactus Technology, Inc. | User interface system |
US9495055B2 (en) | 2008-01-04 | 2016-11-15 | Tactus Technology, Inc. | User interface and methods |
US8717326B2 (en) | 2008-01-04 | 2014-05-06 | Tactus Technology, Inc. | System and methods for raised touch screens |
US20110157080A1 (en) * | 2008-01-04 | 2011-06-30 | Craig Michael Ciesla | User Interface System |
US20110148793A1 (en) * | 2008-01-04 | 2011-06-23 | Craig Michael Ciesla | User Interface System |
US9477308B2 (en) | 2008-01-04 | 2016-10-25 | Tactus Technology, Inc. | User interface system |
US9207795B2 (en) | 2008-01-04 | 2015-12-08 | Tactus Technology, Inc. | User interface system |
US9298261B2 (en) | 2008-01-04 | 2016-03-29 | Tactus Technology, Inc. | Method for actuating a tactile interface layer |
US9367132B2 (en) | 2008-01-04 | 2016-06-14 | Tactus Technology, Inc. | User interface system |
US9128525B2 (en) | 2008-01-04 | 2015-09-08 | Tactus Technology, Inc. | Dynamic tactile interface |
US9448630B2 (en) | 2008-01-04 | 2016-09-20 | Tactus Technology, Inc. | Method for actuating a tactile interface layer |
US9430074B2 (en) | 2008-01-04 | 2016-08-30 | Tactus Technology, Inc. | Dynamic tactile interface |
US8922510B2 (en) | 2008-01-04 | 2014-12-30 | Tactus Technology, Inc. | User interface system |
US8947383B2 (en) | 2008-01-04 | 2015-02-03 | Tactus Technology, Inc. | User interface system and method |
US9423875B2 (en) | 2008-01-04 | 2016-08-23 | Tactus Technology, Inc. | Dynamic tactile interface with exhibiting optical dispersion characteristics |
US8970403B2 (en) | 2008-01-04 | 2015-03-03 | Tactus Technology, Inc. | Method for actuating a tactile interface layer |
US9372565B2 (en) | 2008-01-04 | 2016-06-21 | Tactus Technology, Inc. | Dynamic tactile interface |
US9013417B2 (en) | 2008-01-04 | 2015-04-21 | Tactus Technology, Inc. | User interface system |
US9019228B2 (en) | 2008-01-04 | 2015-04-28 | Tactus Technology, Inc. | User interface system |
US9035898B2 (en) | 2008-01-04 | 2015-05-19 | Tactus Technology, Inc. | System and methods for raised touch screens |
US20090174673A1 (en) * | 2008-01-04 | 2009-07-09 | Ciesla Craig M | System and methods for raised touch screens |
US20090174687A1 (en) * | 2008-01-04 | 2009-07-09 | Craig Michael Ciesla | User Interface System |
US9052790B2 (en) | 2008-01-04 | 2015-06-09 | Tactus Technology, Inc. | User interface and methods |
US9063627B2 (en) | 2008-01-04 | 2015-06-23 | Tactus Technology, Inc. | User interface and methods |
US9075525B2 (en) | 2008-01-04 | 2015-07-07 | Tactus Technology, Inc. | User interface system |
US9098141B2 (en) | 2008-01-04 | 2015-08-04 | Tactus Technology, Inc. | User interface system |
US9372539B2 (en) | 2008-01-04 | 2016-06-21 | Tactus Technology, Inc. | Method for actuating a tactile interface layer |
US20090287834A1 (en) * | 2008-05-15 | 2009-11-19 | Alcorn Byron A | Method and system for allocating on-demand resources using a connection manager |
US8140693B2 (en) * | 2008-05-15 | 2012-03-20 | Hewlett-Packard Development Company, L.P. | Method and system for allocating on-demand resources using a connection manager |
US20100117983A1 (en) * | 2008-11-10 | 2010-05-13 | Asustek Computer Inc. | Resistive touch panel and method for detecting touch points thereof |
EP2353067A4 (en) * | 2008-11-14 | 2013-03-20 | Nokia Corp | Warning system for breaking touch screen or display |
WO2010055195A1 (en) | 2008-11-14 | 2010-05-20 | Nokia Corporation | Warning system for breaking touch screen or display |
EP2353067A1 (en) * | 2008-11-14 | 2011-08-10 | Nokia Corporation | Warning system for breaking touch screen or display |
US9588684B2 (en) | 2009-01-05 | 2017-03-07 | Tactus Technology, Inc. | Tactile interface for a computing device |
US8199124B2 (en) * | 2009-01-05 | 2012-06-12 | Tactus Technology | User interface system |
US8179377B2 (en) * | 2009-01-05 | 2012-05-15 | Tactus Technology | User interface system |
US20100171720A1 (en) * | 2009-01-05 | 2010-07-08 | Ciesla Michael Craig | User interface system |
US20100171719A1 (en) * | 2009-01-05 | 2010-07-08 | Ciesla Michael Craig | User interface system |
US9811163B2 (en) | 2009-02-15 | 2017-11-07 | Neonode Inc. | Elastic touch input surface |
US8775023B2 (en) | 2009-02-15 | 2014-07-08 | Neonode Inc. | Light-based touch controls on a steering wheel and dashboard |
US9213443B2 (en) | 2009-02-15 | 2015-12-15 | Neonode Inc. | Optical touch screen systems using reflected light |
US9158416B2 (en) | 2009-02-15 | 2015-10-13 | Neonode Inc. | Resilient light-based touch surface |
US20100238138A1 (en) * | 2009-02-15 | 2010-09-23 | Neonode Inc. | Optical touch screen systems using reflected light |
US20100238139A1 (en) * | 2009-02-15 | 2010-09-23 | Neonode Inc. | Optical touch screen systems using wide light beams |
US9116617B2 (en) | 2009-07-03 | 2015-08-25 | Tactus Technology, Inc. | User interface enhancement system |
US20110001613A1 (en) * | 2009-07-03 | 2011-01-06 | Craig Michael Ciesla | Method for adjusting the user interface of a device |
US20110012851A1 (en) * | 2009-07-03 | 2011-01-20 | Craig Michael Ciesla | User Interface Enhancement System |
US8587548B2 (en) | 2009-07-03 | 2013-11-19 | Tactus Technology, Inc. | Method for adjusting the user interface of a device |
US8243038B2 (en) * | 2009-07-03 | 2012-08-14 | Tactus Technologies | Method for adjusting the user interface of a device |
US8207950B2 (en) * | 2009-07-03 | 2012-06-26 | Tactus Technologies | User interface enhancement system |
US8120588B2 (en) | 2009-07-15 | 2012-02-21 | Sony Ericsson Mobile Communications Ab | Sensor assembly and display including a sensor assembly |
WO2011006678A1 (en) * | 2009-07-15 | 2011-01-20 | Sony Ericsson Mobile Communications Ab | Sensor assembly and display including a sensor assembly |
US11644865B2 (en) | 2009-08-17 | 2023-05-09 | Apple Inc. | Housing as an I/O device |
US10248221B2 (en) | 2009-08-17 | 2019-04-02 | Apple Inc. | Housing as an I/O device |
US8654524B2 (en) | 2009-08-17 | 2014-02-18 | Apple Inc. | Housing as an I/O device |
US9600037B2 (en) | 2009-08-17 | 2017-03-21 | Apple Inc. | Housing as an I/O device |
US10739868B2 (en) | 2009-08-17 | 2020-08-11 | Apple Inc. | Housing as an I/O device |
TWI419039B (en) * | 2009-12-14 | 2013-12-11 | Casio Computer Co Ltd | Touch panel |
US9239623B2 (en) | 2010-01-05 | 2016-01-19 | Tactus Technology, Inc. | Dynamic tactile interface |
US9298262B2 (en) | 2010-01-05 | 2016-03-29 | Tactus Technology, Inc. | Dynamic tactile interface |
US8619035B2 (en) | 2010-02-10 | 2013-12-31 | Tactus Technology, Inc. | Method for assisting user input to a device |
US8587541B2 (en) | 2010-04-19 | 2013-11-19 | Tactus Technology, Inc. | Method for actuating a tactile interface layer |
US8723832B2 (en) | 2010-04-19 | 2014-05-13 | Tactus Technology, Inc. | Method for actuating a tactile interface layer |
EP2710452A2 (en) * | 2011-05-19 | 2014-03-26 | Microsoft Corporation | Pressure-sensitive multi-touch device |
EP2710452A4 (en) * | 2011-05-19 | 2014-12-10 | Microsoft Corp | Pressure-sensitive multi-touch device |
WO2012158902A2 (en) | 2011-05-19 | 2012-11-22 | Microsoft Corporation | Pressure-sensitive multi-touch device |
US9372588B2 (en) | 2011-05-19 | 2016-06-21 | Microsoft Technology Licensing, Llc | Pressure-sensitive multi-touch device |
US20130069886A1 (en) * | 2011-09-16 | 2013-03-21 | Wan-Qiu Wang | Edge grip detection method of a touch panel and a device using the same |
US8963859B2 (en) * | 2011-09-16 | 2015-02-24 | Tpk Touch Solutions (Xiamen) Inc. | Edge grip detection method of a touch panel and a device using the same |
US9280224B2 (en) | 2012-09-24 | 2016-03-08 | Tactus Technology, Inc. | Dynamic tactile interface and methods |
US9405417B2 (en) | 2012-09-24 | 2016-08-02 | Tactus Technology, Inc. | Dynamic tactile interface and methods |
US10534479B2 (en) | 2012-10-14 | 2020-01-14 | Neonode Inc. | Optical proximity sensors |
US9569095B2 (en) | 2012-10-14 | 2017-02-14 | Neonode Inc. | Removable protective cover with embedded proximity sensors |
US10282034B2 (en) | 2012-10-14 | 2019-05-07 | Neonode Inc. | Touch sensitive curved and flexible displays |
US11733808B2 (en) | 2012-10-14 | 2023-08-22 | Neonode, Inc. | Object detector based on reflected light |
US11714509B2 (en) | 2012-10-14 | 2023-08-01 | Neonode Inc. | Multi-plane reflective sensor |
US9921661B2 (en) | 2012-10-14 | 2018-03-20 | Neonode Inc. | Optical proximity sensor and associated user interface |
US8917239B2 (en) | 2012-10-14 | 2014-12-23 | Neonode Inc. | Removable protective cover with embedded proximity sensors |
US10496180B2 (en) | 2012-10-14 | 2019-12-03 | Neonode, Inc. | Optical proximity sensor and associated user interface |
US11379048B2 (en) | 2012-10-14 | 2022-07-05 | Neonode Inc. | Contactless control panel |
US9741184B2 (en) | 2012-10-14 | 2017-08-22 | Neonode Inc. | Door handle with optical proximity sensors |
US9164625B2 (en) | 2012-10-14 | 2015-10-20 | Neonode Inc. | Proximity sensor for determining two-dimensional coordinates of a proximal object |
US10140791B2 (en) | 2012-10-14 | 2018-11-27 | Neonode Inc. | Door lock user interface |
US10802601B2 (en) | 2012-10-14 | 2020-10-13 | Neonode Inc. | Optical proximity sensor and associated user interface |
US8643628B1 (en) | 2012-10-14 | 2014-02-04 | Neonode Inc. | Light-based proximity detection system and user interface |
US10928957B2 (en) | 2012-10-14 | 2021-02-23 | Neonode Inc. | Optical proximity sensor |
US10949027B2 (en) | 2012-10-14 | 2021-03-16 | Neonode Inc. | Interactive virtual display |
US10004985B2 (en) | 2012-10-14 | 2018-06-26 | Neonode Inc. | Handheld electronic device and associated distributed multi-display system |
US9001087B2 (en) | 2012-10-14 | 2015-04-07 | Neonode Inc. | Light-based proximity detection system and user interface |
US11073948B2 (en) | 2012-10-14 | 2021-07-27 | Neonode Inc. | Optical proximity sensors |
US10324565B2 (en) | 2013-05-30 | 2019-06-18 | Neonode Inc. | Optical proximity sensor |
US9557813B2 (en) | 2013-06-28 | 2017-01-31 | Tactus Technology, Inc. | Method for reducing perceived optical distortion |
US10585530B2 (en) | 2014-09-23 | 2020-03-10 | Neonode Inc. | Optical proximity sensor |
US10949082B2 (en) | 2016-09-06 | 2021-03-16 | Apple Inc. | Processing capacitive touch gestures implemented on an electronic device |
US10520984B2 (en) * | 2018-03-09 | 2019-12-31 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Electronic device and manufacturing method thereof |
US20190278332A1 (en) * | 2018-03-09 | 2019-09-12 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Electronic device and manufacturing method thereof |
US11429230B2 (en) | 2018-11-28 | 2022-08-30 | Neonode Inc | Motorist user interface sensor |
US11842014B2 (en) | 2019-12-31 | 2023-12-12 | Neonode Inc. | Contactless touch input system |
US11669210B2 (en) | 2020-09-30 | 2023-06-06 | Neonode Inc. | Optical touch sensor |
Also Published As
Publication number | Publication date |
---|---|
US6369803B2 (en) | 2002-04-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6369803B2 (en) | Active edge user interface | |
CA2634098C (en) | Electronic device and method of providing haptic feedback | |
US8179371B2 (en) | Method, system, and graphical user interface for selecting a soft keyboard | |
US8537117B2 (en) | Handheld wireless communication device that selectively generates a menu in response to received commands | |
US6587131B1 (en) | Method for assisting user to operate pointer | |
KR101038459B1 (en) | Text selection using a touch sensitive screen of a handheld mobile communication device | |
CA2572574C (en) | Method and arrangement for a primary action on a handheld electronic device | |
US20080303795A1 (en) | Haptic display for a handheld electronic device | |
US20050071761A1 (en) | User interface on a portable electronic device | |
US20110193787A1 (en) | Input mechanism for providing dynamically protruding surfaces for user interaction | |
US20070013665A1 (en) | Method for shifting a shortcut in an electronic device, a display unit of the device, and an electronic device | |
US20070268259A1 (en) | Handheld wireless communication device | |
US20080168401A1 (en) | Method, system, and graphical user interface for viewing multiple application windows | |
US20070281733A1 (en) | Handheld wireless communication device with chamfer keys | |
US20070254701A1 (en) | Handheld wireless communication device | |
US20070254700A1 (en) | Handheld wireless communication device | |
US20070259697A1 (en) | Handheld wireless communication device | |
US7973765B2 (en) | Handheld wireless communication device | |
JP2002111813A (en) | Portable communication unit of radio communication system | |
US8064946B2 (en) | Handheld wireless communication device | |
US20070254705A1 (en) | Handheld wireless communication device | |
US20070252817A1 (en) | Handheld wireless communication device | |
US20090167695A1 (en) | Embedded navigation assembly and method on handheld device | |
US20070254690A1 (en) | Handheld wireless communication device | |
US20070254688A1 (en) | Handheld wireless communication device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: NORTHERN TELECOM LIMITED, CANADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BRISEBOIS, MICHEL A.;MAHAN, LAURA;FRENCH-ST. GEORGE, MARILYN;AND OTHERS;REEL/FRAME:009369/0481 Effective date: 19980707 |
|
AS | Assignment |
Owner name: NORTEL NETWORKS CORPORATION, CANADA Free format text: CHANGE OF NAME;ASSIGNOR:NORTHERN TELECOM LIMITED;REEL/FRAME:010567/0001 Effective date: 19990429 |
|
AS | Assignment |
Owner name: NORTEL NETWORKS LIMITED, CANADA Free format text: CHANGE OF NAME;ASSIGNOR:NORTEL NETWORKS CORPORATION;REEL/FRAME:011195/0706 Effective date: 20000830 |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
FPAY | Fee payment |
Year of fee payment: 8 |
|
FEPP | Fee payment procedure |
Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
AS | Assignment |
Owner name: CITIBANK, N.A., AS ADMINISTRATIVE AGENT, NEW YORK Free format text: SECURITY AGREEMENT;ASSIGNOR:AVAYA INC.;REEL/FRAME:023892/0500 Effective date: 20100129 |
|
AS | Assignment |
Owner name: CITICORP USA, INC., AS ADMINISTRATIVE AGENT, NEW YORK Free format text: SECURITY AGREEMENT;ASSIGNOR:AVAYA INC.;REEL/FRAME:023905/0001 Effective date: 20100129 |
|
AS | Assignment |
Owner name: AVAYA INC., NEW JERSEY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NORTEL NETWORKS LIMITED;REEL/FRAME:023998/0878 Effective date: 20091218 |
|
AS | Assignment |
Owner name: BANK OF NEW YORK MELLON TRUST, NA, AS NOTES COLLATERAL AGENT, THE, PENNSYLVANIA Free format text: SECURITY AGREEMENT;ASSIGNOR:AVAYA INC., A DELAWARE CORPORATION;REEL/FRAME:025863/0535 Effective date: 20110211 |
|
AS | Assignment |
Owner name: BANK OF NEW YORK MELLON TRUST COMPANY, N.A., THE, PENNSYLVANIA Free format text: SECURITY AGREEMENT;ASSIGNOR:AVAYA, INC.;REEL/FRAME:030083/0639 Effective date: 20130307 |
|
FPAY | Fee payment |
Year of fee payment: 12 |
|
AS | Assignment |
Owner name: CITIBANK, N.A., AS ADMINISTRATIVE AGENT, NEW YORK Free format text: SECURITY INTEREST;ASSIGNORS:AVAYA INC.;AVAYA INTEGRATED CABINET SOLUTIONS INC.;OCTEL COMMUNICATIONS CORPORATION;AND OTHERS;REEL/FRAME:041576/0001 Effective date: 20170124 |
|
AS | Assignment |
Owner name: OCTEL COMMUNICATIONS LLC (FORMERLY KNOWN AS OCTEL COMMUNICATIONS CORPORATION), CALIFORNIA Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 041576/0001;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:044893/0531 Effective date: 20171128 Owner name: AVAYA INTEGRATED CABINET SOLUTIONS INC., CALIFORNIA Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 041576/0001;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:044893/0531 Effective date: 20171128 Owner name: AVAYA INC., CALIFORNIA Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 023892/0500;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:044891/0564 Effective date: 20171128 Owner name: AVAYA INC., CALIFORNIA Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 025863/0535;ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST, NA;REEL/FRAME:044892/0001 Effective date: 20171128 Owner name: VPNET TECHNOLOGIES, INC., CALIFORNIA Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 041576/0001;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:044893/0531 Effective date: 20171128 Owner name: AVAYA INC., CALIFORNIA Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 041576/0001;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:044893/0531 Effective date: 20171128 Owner name: AVAYA INC., CALIFORNIA Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 030083/0639;ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A.;REEL/FRAME:045012/0666 Effective date: 20171128 |
|
AS | Assignment |
Owner name: GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT, NEW YORK Free format text: SECURITY INTEREST;ASSIGNORS:AVAYA INC.;AVAYA INTEGRATED CABINET SOLUTIONS LLC;OCTEL COMMUNICATIONS LLC;AND OTHERS;REEL/FRAME:045034/0001 Effective date: 20171215 |
|
AS | Assignment |
Owner name: SIERRA HOLDINGS CORP., NEW JERSEY Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CITICORP USA, INC.;REEL/FRAME:045045/0564 Effective date: 20171215 Owner name: AVAYA, INC., CALIFORNIA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CITICORP USA, INC.;REEL/FRAME:045045/0564 Effective date: 20171215 |
|
AS | Assignment |
Owner name: CITIBANK, N.A., AS COLLATERAL AGENT, NEW YORK Free format text: SECURITY INTEREST;ASSIGNORS:AVAYA INC.;AVAYA INTEGRATED CABINET SOLUTIONS LLC;OCTEL COMMUNICATIONS LLC;AND OTHERS;REEL/FRAME:045124/0026 Effective date: 20171215 |
|
AS | Assignment |
Owner name: AVAYA INTEGRATED CABINET SOLUTIONS LLC, NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS AT REEL 45124/FRAME 0026;ASSIGNOR:CITIBANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:063457/0001 Effective date: 20230403 Owner name: AVAYA MANAGEMENT L.P., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS AT REEL 45124/FRAME 0026;ASSIGNOR:CITIBANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:063457/0001 Effective date: 20230403 Owner name: AVAYA INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS AT REEL 45124/FRAME 0026;ASSIGNOR:CITIBANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:063457/0001 Effective date: 20230403 Owner name: AVAYA HOLDINGS CORP., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS AT REEL 45124/FRAME 0026;ASSIGNOR:CITIBANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:063457/0001 Effective date: 20230403 |
|
AS | Assignment |
Owner name: AVAYA MANAGEMENT L.P., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501 Owner name: CAAS TECHNOLOGIES, LLC, NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501 Owner name: HYPERQUALITY II, LLC, NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501 Owner name: HYPERQUALITY, INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501 Owner name: ZANG, INC. (FORMER NAME OF AVAYA CLOUD INC.), NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501 Owner name: VPNET TECHNOLOGIES, INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501 Owner name: OCTEL COMMUNICATIONS LLC, NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501 Owner name: AVAYA INTEGRATED CABINET SOLUTIONS LLC, NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501 Owner name: INTELLISIST, INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501 Owner name: AVAYA INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501 |