US20060262146A1 - Mobile communication terminal and method - Google Patents

Mobile communication terminal and method

Info

Publication number
US20060262146A1
US 20060262146 A1 (application US 11/158,921)
Authority
US
United States
Prior art keywords
user interface
interface element
candidate
desired direction
candidate user
Legal status
Abandoned
Application number
US11/158,921
Inventor
Antti Koivisto
Andrei Popescu
Wei Liu
Guido Grassel
Virpi Roto
Lasse Pajunen
Matti Vaisanen
Current Assignee
Nokia Oyj
Original Assignee
Nokia Oyj
Application filed by Nokia Oyj
Priority to US 11/158,921
Assigned to Nokia Corporation (assignors: Liu, Wei; Roto, Virpi; Koivisto, Antti J.; Vaisanen, Matti; Pajunen, Lasse; Popescu, Andrei; Grassel, Guido)
Publication of US20060262146A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485 Scrolling or panning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90 Details of database functions independent of the retrieved data types
    • G06F 16/95 Retrieval from the web
    • G06F 16/957 Browsing optimisation, e.g. caching or content distillation
    • G06F 16/9577 Optimising the visualization of content, e.g. distillation of HTML documents
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units

Definitions

  • The weighted distance function applied to each candidate user interface element is: dist_basic + dist_parallel + 2*dist_orthogonal − √overlap. Its parameters are defined as follows, with reference to FIGS. 10A and 10B.
  • The parameter dist_basic is a Euclidean distance between the current focus position of the currently focused user interface element and the candidate focus position of the candidate user interface element. If the two positions have the same coordinate on the axis orthogonal to the desired direction, dist_basic is forced to be 0.
  • In FIG. 10B, dist_basic is the Euclidean distance between positions 356 and 358. For the candidate focus position 357, which has the same x-coordinate as the current focus position 356, dist_basic is forced to be 0. In this example, user interface element 355 will yield a smaller distance value than 354, so the focus will correctly move to 355.
  • The parameter dist_parallel is a component along an axis parallel to the desired direction of a distance between a first point 360 of the first user interface element utmost in the desired direction and a second point 361 of the candidate user interface element utmost in a direction opposite the desired direction. In other words, dist_parallel is the component of the distance between the points 360 and 361, projected onto an axis parallel to the desired direction.
  • If the first user interface element and the candidate user interface element overlap in a direction parallel to the desired direction, dist_parallel is set to 0. Note that in FIG. 10B, as the currently focused user interface element 353 is rectangular, the point 360 may be chosen arbitrarily along the lower edge of the currently focused user interface element 353. Analogously, the point 361 may be chosen arbitrarily along the upper edge of the candidate user interface element 355.
  • The parameter dist_orthogonal is set to 0 if there is an overlap between the first user interface element and the candidate user interface element in a direction orthogonal to the desired direction; otherwise dist_orthogonal is a component along an axis orthogonal to the desired direction of a distance between a third point, which may be the same as the first point, of the first user interface element closest to the candidate user interface element and a fourth point, which may be the same as the second point, of the candidate user interface element closest to the first user interface element.
  • In FIG. 10A, dist_orthogonal is the vertical distance, as the desired direction is horizontal, between the point 362 of the currently focused user interface element 347 and the point 363 of the candidate user interface element 348. In other words, dist_orthogonal is the component of the distance between the points 362 and 363, projected onto an axis orthogonal to the desired direction. The point 362 may be chosen arbitrarily along the upper edge of the currently focused user interface element 347, and the point 363 may be chosen arbitrarily along the lower edge of the candidate user interface element 348.
  • The term dist_orthogonal is used in the calculation to compensate for situations where a link is close along the desired direction but very far away on the orthogonal axis. In such a case, it is more natural to navigate to another link, which may be further away along the desired direction but approximately on the same level on the orthogonal axis.
  • The parameter overlap is a distance of overlap between the currently focused user interface element and the candidate user interface element in a direction orthogonal to the desired direction. In FIG. 10A, the overlap between the currently focused user interface element 347 and the candidate user interface element 349 is a distance 364.
  • Candidate user interface elements are rewarded for having a high overlap with the currently focused user interface element: subtracting the square root of the overlap lowers the weighted distance of well-aligned candidates. A visible width may optionally be set as an upper limit for the overlap, which prevents long user interface elements from always winning focus over shorter ones, as shown in the sketch below.
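  • To make the overlap term concrete, the following is a minimal TypeScript sketch (not from the patent text), assuming axis-aligned boxes projected onto the axis orthogonal to the desired direction; the names Interval, orthogonalOverlap and visibleCap are illustrative assumptions:

```typescript
// Sketch: 1-D overlap of two elements projected onto the axis
// orthogonal to the desired direction, optionally capped at a
// visible width (an assumption based on the description above).
interface Interval { start: number; end: number; }

function orthogonalOverlap(a: Interval, b: Interval, visibleCap?: number): number {
  // Length of the intersection of the two intervals; 0 if disjoint.
  const overlap = Math.max(0, Math.min(a.end, b.end) - Math.max(a.start, b.start));
  // Optionally limit the overlap so that very long elements do not
  // always win focus over shorter ones.
  return visibleCap === undefined ? overlap : Math.min(overlap, visibleCap);
}
```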
  • FIG. 11 shows a flow chart illustrating an embodiment of the present invention. As a person skilled in the art will realize, not all steps in this embodiment are required to implement the invention.
  • First, a directional input signal is detected via an input device such as the five-way navigation key 5 a. This input gives information about the desired direction in which the user wishes to move focus from the currently focused user interface element to a target user interface element.
  • A search area is then determined, as described in detail with reference to FIG. 6.
  • Focusable user interface elements in the search area are all considered candidate user interface elements, and references to these are collected in a set.
  • In a determine a test candidate user interface element step 413, a test candidate user interface element for which a weighted distance has not yet been calculated is selected from the set of candidate user interface elements.
  • A weighted distance is then calculated between the test candidate user interface element and the currently focused user interface element, as described in detail with reference to FIGS. 10A and 10B.
  • In a conditional uncalculated user interface elements step 415, it is tested whether there are any more uncalculated candidate user interface elements. If there are, the method proceeds back to the determine a test candidate user interface element step 413; otherwise the method proceeds to a determine target user interface element step 416.
  • In the determine target user interface element step 416, the target user interface element is determined as the element in the set of candidate user interface elements having the minimum weighted distance to the currently focused user interface element.
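  • As a rough illustration of the flow chart, here is a hedged TypeScript sketch of the candidate-scoring loop (steps 413 through 416); the Rect type and the weightedDistance callback are assumptions standing in for the terminal's actual data structures:

```typescript
// Sketch of the flow-chart loop: score each candidate once and focus
// the candidate with the minimum weighted distance.
interface Rect { x: number; y: number; width: number; height: number; }

function determineTarget(
  focused: Rect,
  candidates: Rect[],                                  // collected from the search area
  weightedDistance: (from: Rect, to: Rect) => number,  // the function of FIGS. 10A and 10B
): Rect | undefined {
  let target: Rect | undefined;
  let minDist = Infinity;
  for (const candidate of candidates) {
    // Steps 413-415: visit each uncalculated candidate exactly once.
    const d = weightedDistance(focused, candidate);
    if (d < minDist) {
      minDist = d;
      target = candidate;
    }
  }
  return target; // step 416: the element that receives focus, if any
}
```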

Abstract

A method is shown for moving focus from a first user interface element to a second user interface element shown on a display of a mobile communication terminal, said mobile communication terminal further comprising an input device. The method comprises the steps of: detecting a directional input via said input device, said input indicating a desired direction to move; determining a set of candidate user interface elements being eligible to receive focus; for each candidate user interface element in said set of candidate user interface elements, determining a weighted distance between said candidate user interface element and said first user interface element by applying a function including, as an input parameter, an overlap between said first user interface element and said candidate user interface element in a direction orthogonal to said desired direction; and determining said second user interface element to be a particular candidate user interface element in said set of candidate user interface elements that has a minimum weighted distance to said first user interface element.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation-in-part of U.S. patent application Ser. No. 11/135,624 filed on May 23, 2005.
  • FIELD OF THE INVENTION
  • The present invention generally relates to user interfaces of mobile communication terminals, and more particularly to changing focus of user interface elements shown on displays of mobile communication terminals.
  • BACKGROUND OF THE INVENTION
  • Mobile communication terminals have changed dramatically in the last decade. With the first 2G terminals, the only real purpose was to make normal phone calls. Now with 2.5G (GPRS), CDMA2000 and UMTS technology, mobile communication terminals not only facilitate voice communication, but also digital communication such as text and multimedia messaging, as well as browsing content provided by Internet servers.
  • The browser applications, also known as user agents, that are responsible for rendering documents such as HTML, SVG or SMIL containing focusable user interface elements, for example links or form controls, need to provide a method that takes input from the user, e.g. an event generated by the hardware input controller, and translates that input into an action that changes the state of the document by removing the focus from one user interface element and setting it on another.
  • If the user agent is running on a device that allows a pen or a mouse as an input device, the user can directly indicate the user interface element that will receive focus by simply tapping the screen, or by moving the mouse and clicking, at a position where the desired user interface element is rendered. However, if the device only has a four-way or five-way navigation key or a joystick, the user agent has the much more difficult task of determining the user interface element that will receive focus solely based on:
  • the currently focused user interface element,
  • the desired direction (given by the joystick, etc.), and
  • the position of the other focusable user interface elements relative to the currently focused user interface element.
  • Two important requirements that should be fulfilled by a user agent that provides such a method are:
  • the newly focused user interface element should usually be the same as the one intended by the user, and
  • the method should provide a certain degree of reversibility (e.g. a “left” press on the joystick, etc. followed by a “right” press should transfer the focus between the same two user interface elements).
  • The European patent EP 0 671 682 B1 presents a method, apparatus and computer readable storage device for positioning a cursor on one of a plurality of controls being displayed on a screen. However, the presented method is unsuitable for use in modern mobile communication terminals capable of displaying complex documents with focusable user interface elements, as the method provides a low degree of reversibility.
  • Consequently, there is a problem of how to manage navigation in documents displayed on a mobile communication terminal having only a limited input device.
  • SUMMARY OF THE INVENTION
  • In view of the above, an objective of the invention is to solve or at least reduce the above-identified and other problems and shortcomings with the prior art, and to provide improvements to a mobile communication terminal.
  • Generally, the above objectives and purposes are achieved by methods, mobile communication terminals and computer program products according to the attached independent patent claims.
  • A first aspect of the present invention is a method to move focus from a first user interface element to a second user interface element shown on a display of a mobile communication terminal, said mobile communication terminal further comprising an input device. The method comprises the steps of:
  • detecting a directional input via said input device, said input indicating a desired direction to move;
  • determining a set of candidate user interface elements being eligible to receive focus;
  • for each candidate user interface element in said set of candidate user interface elements, determining a weighted distance between said candidate user interface element and said first user interface element by applying a function including, as an input parameter, an overlap between said first user interface element and said candidate user interface element in a direction orthogonal to said desired direction; and
  • determining said second user interface element to be a particular candidate user interface element in said set of candidate user interface elements that has a minimum weighted distance to said first user interface element.
  • This provides a method with improved predictability of the shift of focus from a first user interface element to a second user interface element in a complex user interface environment.
  • In one embodiment, said mobile communication terminal further comprises a current focus position related to said first user interface element, said method comprising the further step of determining a new focus position related to said second user interface element, such that a component distance, along an axis orthogonal to said desired direction, of an absolute distance between said current focus position and said new focus position, is minimized. The focus point furthermore increases the reversibility and provides a way for the user to more predictably move focus from the first user interface element to the second user interface element.
  • In one embodiment, in said step of determining a new focus position, said new focus position is determined such that it is placed inside said second user interface element, at least a margin distance from any border thereof. This prevents the focus point from being placed right on the border of a user interface element, as the visible part of the user interface element is often actually placed at a distance from the selectable border of the user interface element.
  • In one embodiment, said step of detecting an input involves detecting an input from a directional input device. It is especially important to improve predictability for users when a directional input device is employed.
  • In one embodiment, said step of detecting an input involves detecting an input from a device selected from the group consisting of a four way input device, a five way input device, an eight way input device, a nine way input device, a joystick, joypad and a navigation key.
  • In one embodiment, said overlap between said first user interface element and said candidate user interface element in a direction orthogonal to said desired direction is restricted to an overlap being visible on said display. This prevents longer user interface elements always winning focus over shorter user interface elements when all user interface elements completely overlap with the currently focused user interface element.
  • In one embodiment, the method further comprises a step of providing a visual representation of said focus position on said display. This gives the user an indication of the position of the focus point which allows the user to improve the prediction of the movement of the focus point. The visual representation may be a graphical symbol. This allows the user to determine the position of the focus points by means of familiar user interface symbols or icons.
  • In one embodiment, the method further comprises a step, after said step of detecting an input, of determining a search area in a currently displayed document in which said first and second user interface elements are included, wherein said step of determining a set of candidate user interface elements is confined to user interface elements included in said search area. The introduction of a search area decreases the required processing.
  • In one embodiment, said search area is a combination of a part of said document currently visible on said display and a part of said document that would be visible if said document is scrolled in said desired direction. This search area should contain all user interface items the user would expect to be able to navigate to, given the desired direction.
  • In one embodiment, said applied function is:
    dist_basic + dist_parallel + 2*dist_orthogonal − √overlap,
  • wherein dist_basic is a Euclidean distance between a current focus position related to said first user interface element and a candidate focus position related to said candidate user interface element;
  • dist_parallel is a component along an axis parallel to said desired direction of a distance between a first point of said first user interface element utmost in said desired direction and a second point of said candidate user interface element utmost in a direction opposite said desired direction;
  • dist_orthogonal is a component along an axis orthogonal to said desired direction of a distance between a third point, which may be the same as said first point, of said first user interface element closest to said candidate user interface element and a fourth point, which may be the same as said second point, of said candidate user interface element closest to said first user interface element; and
  • overlap is a distance of overlap between said first user interface element and said candidate user interface element in a direction orthogonal to said desired direction.
  • The formula provided gives a good level of reversibility while being reasonably simple to calculate.
  • In one embodiment, dist_parallel is determined to be 0 if said first user interface element and said candidate user interface element overlap in a direction parallel to said desired direction.
  • In one embodiment, dist_orthogonal is determined to be 0 if said first user interface element and said candidate user interface element overlap in a direction orthogonal to said desired direction.
  • In one embodiment, if a component of said current focus position on an axis orthogonal to said desired direction equals a component of said candidate focus position on an axis orthogonal to said desired direction, dist_basic is determined to be 0.
  • A second aspect of the present invention is a mobile communication terminal comprising a display and an input device, said terminal being configured to allow movement of focus from a first user interface element to a second user interface element shown on said display, said terminal furthermore comprising:
  • means for detecting a directional input via said input device, said input indicating a desired direction to move;
  • means for determining a set of candidate user interface elements being eligible to receive focus;
  • means for determining, for each candidate user interface element in said set of candidate user interface elements, a weighted distance between said candidate user interface element and said first user interface element by applying a function including, as an input parameter, an overlap between said first user interface element and said candidate user interface element in a direction orthogonal to said desired direction; and
  • means for determining said second user interface element to be a particular candidate user interface element in said set of candidate user interface elements that has a minimum weighted distance to said first user interface element.
  • This provides a mobile communication terminal with improved predictability of the shift of focus from a first user interface element to a second user interface element in a complex user interface environment.
  • A third aspect of the present invention is a computer program product, directly loadable into a memory of a digital computer, comprising software code portions for performing a method according to the first aspect. This provides a computer program product, when executed, providing improved predictability of the shift of focus from a first user interface element to a second user interface element in a complex user interface environment.
  • Other objectives, features and advantages of the present invention will appear from the following detailed disclosure, from the attached dependent claims as well as from the drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Embodiments of the invention will now be described in more detail, with reference to the enclosed drawings.
  • FIG. 1 is a perspective view of a mobile communication terminal in the form of a pocket computer according to one embodiment of the present invention.
  • FIG. 2 illustrates a computer network environment in which the pocket computer of FIG. 1 advantageously may be used for providing wireless access for the user to network resources and remote services.
  • FIG. 3 is a schematic block diagram of the pocket computer according to the previous drawings.
  • FIG. 4 illustrates a web browser showing content with hyperlinks.
  • FIG. 5 illustrates a user agent behavior as it would appear to a user navigating in a web page rendered on a display of a mobile terminal according to an embodiment of the present invention.
  • FIG. 6 illustrates a search area in an embodiment of the present invention.
  • FIG. 7 illustrates an exemplary behavior of movement of the focus position.
  • FIG. 8 illustrates a movement of a focus position when content is scrolled in an embodiment of the present invention.
  • FIG. 9 illustrates a movement of a focus position to a focusable user interface element in an image in an embodiment of the present invention.
  • FIGS. 10A and 10B illustrate the use of a distance function in an embodiment of the present invention.
  • FIG. 11 shows a flow chart illustrating an embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • FIG. 1 is a perspective view of a mobile communication terminal in the form of a pocket computer according to one embodiment of the present invention.
  • The pocket computer 1 of the illustrated embodiment comprises an apparatus housing 2 and a display 3 provided at the surface of a front side 2 f of the apparatus housing 2. Next to the display 3, a plurality of hardware keys 5 a-d are provided, as well as a speaker 6.
  • More particularly, key 5 a is a five-way navigation key, i.e. a key which is depressible at four different peripheral positions to command navigation in respective orthogonal directions (“up”, “down”, “left”, “right”) among information shown on the display 3, as well as depressible at a center position to command selection among information shown on the display 3. Key 5 b is a cancel key, key 5 c is a menu or options key, and key 5 d is a home key.
  • At the surface of a short side 21 of the apparatus housing 2, there is provided an earphone audio terminal 7 a, a mains power terminal 7 b and a wire-based data interface 7 c in the form of a serial USB port.
  • FIG. 2 illustrates a computer network environment in which the pocket computer 1 of FIG. 1 advantageously may be used for providing wireless access for the user to network resources and remote services. To allow portable use, the pocket computer 1 has a rechargeable battery. The pocket computer according to an embodiment of the invention also has at least one interface 55 (FIG. 3) for wireless access to network resources on at least one digital network. The pocket computer 1 may connect to a data communications network 32 by establishing a wireless link via a network access point 30, such as a WLAN (Wireless Local Area Network) router. The data communications network 32 may be a wide area network (WAN), such as the Internet or some part thereof, a local area network (LAN), etc. A plurality of network resources 40-44 may be connected to the data communications network 32 and are thus made available to the user 9 through the pocket computer 1. For instance, the network resources may include servers 40 with associated content 42 such as www data, wap data, ftp data, email data, audio data, video data, etc. The network resources may also include other end-user devices 44, such as personal computers.
  • A second digital network 26 is shown in FIG. 2 in the form of a mobile telecommunications network, compliant with any available mobile telecommunications standard such as GSM, UMTS, D-AMPS or CDMA2000. In the illustrated exemplifying embodiment, the user 9 may access network resources 28 on the mobile telecommunications network 26 through the pocket computer 1 by establishing a wireless link 10 b to a mobile terminal 20, which in turn has operative access to the mobile telecommunications network 26 over a wireless link 22 to a base station 24, as is well known per se. The wireless links 10 a, 10 b may for instance be in compliance with Bluetooth™, WLAN (Wireless Local Area Network, e.g. as specified in IEEE 802.11), HomeRF or HIPERLAN. Thus, the interface(s) 55 will contain all the necessary hardware and software required for establishing such links, as is readily realized by a person skilled in the art.
  • FIG. 3 is a schematic block diagram of the pocket computer according to the previous drawings. As seen in FIG. 3, the pocket computer 1 has a controller 50 with associated memory 54. The controller is responsible for the overall operation of the pocket computer 1 and may be implemented by any commercially available CPU (Central Processing Unit), DSP (Digital Signal Processor) or any other electronic programmable logic device. The associated memory may be internal and/or external to the controller 50 and may be RAM memory, ROM memory, EEPROM memory, flash memory, hard disk, or any combination thereof.
  • The memory 54 is used for various purposes by the controller 50, one of them being for storing data and program instructions for various pieces of software in the pocket computer 1. The software may include a real-time operating system, drivers e.g. for a user interface 51, as well as various applications 57.
  • Many if not all of these applications will interact with the user 9 both by receiving data input from him, such as text or navigational input through the input device(s) 53, and by providing data output to him, such as visual output in the form of e.g. text and graphical information presented on the display 52. Non-limiting examples of applications are a www/wap browser application, a contacts application, a messaging application (email, SMS, MMS), a calendar application, an organizer application, a video game application, a calculator application, a voice memo application, an alarm clock application, a word processing application, a spreadsheet application, a code memory application, a music player application, a media streaming application, and a control panel application. GUI (graphical user interface) functionality 56 in the user interface 51 controls the interaction between the applications 57, the user 9 and the user interface elements 52, 53 of the user interface.
  • FIG. 4 illustrates a browser application showing content with hyperlinks. In this example, the browser application executing in the pocket computer 1 renders a text on the display 52, including a number of hyperlinks 310-313, where the hyperlink 311 is currently focused. As is known in the art, if the user activates the focused hyperlink, the browser application will instead display a new page, referred to by the activated hyperlink.
  • FIG. 5 illustrates a user agent behavior as it would appear to a user navigating in a web page 304 rendered on the display 52 of a mobile communication terminal according to the present invention such as aforesaid pocket computer. With a currently focused user interface element being a first link 302, the user first navigates to the right 305 to a new link 303 by pressing on a right side of the five way input device 5 a. Secondly, the user navigates left 306 by pressing on a left side of the five way input device 5 a, resulting in focus shifting back to the first link 302 again, i.e. the navigation is reversible as the user ends up with a focus on the first link 302. A graphical symbol in the form of a hand cursor 301 is used to indicate a currently focused user interface element and can be positioned at coordinates given by a focus position. Note that user interface elements may have arbitrary shape, and in particular, user interface elements are not limited to be of rectangular shape.
  • FIG. 6 illustrates a search area in an embodiment of the present invention. In this example, the user has pressed down on a directional device, such as the five-way directional device 5 a, resulting in a desired direction 320 being downwards. A search area 323, in which user interface elements eligible to receive focus could exist, is then determined, given a part 321 of the current document visible on the display 52 and the desired direction 320.
  • The search area 323 comprises the part 321 of the document currently visible on the display 52 plus a search range 322, which is the document area that would become visible as a result of scrolling the document in the desired direction 320.
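  • As an illustration (not from the patent text), this combination could be computed as in the following TypeScript sketch, assuming axis-aligned rectangles in document coordinates with y growing downwards; scrollRange is an assumed name for the extent of the area 322:

```typescript
// Sketch: extend the visible part of the document (321) by the range
// that would become visible when scrolling in the desired direction (322).
interface Rect { x: number; y: number; width: number; height: number; }
type Direction = "up" | "down" | "left" | "right";

function searchArea(visible: Rect, direction: Direction, scrollRange: number): Rect {
  switch (direction) {
    case "down":
      return { ...visible, height: visible.height + scrollRange };
    case "up":
      return { ...visible, y: visible.y - scrollRange, height: visible.height + scrollRange };
    case "right":
      return { ...visible, width: visible.width + scrollRange };
    case "left":
      return { ...visible, x: visible.x - scrollRange, width: visible.width + scrollRange };
  }
}
```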
  • A set of candidate user interface elements is populated by recursively traversing a rendering tree of the browser and testing boxes in the nodes of the tree for overlap with the search area 323. Those boxes that are found to be overlapping are used to get pointers to corresponding user interface elements in the rendering tree. The rendering tree nodes that correspond to focusable user interface elements are added to the set of candidate user interface elements.
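  • A hedged sketch of this traversal follows; the RenderNode shape is an assumption about the browser's internal rendering tree, not Nokia's actual implementation:

```typescript
// Sketch: recursively walk the rendering tree and collect every
// focusable node whose box overlaps the search area.
interface Rect { x: number; y: number; width: number; height: number; }

interface RenderNode {
  box: Rect;              // the node's rendered box
  focusable: boolean;     // true for links, form controls, ...
  children: RenderNode[];
}

function intersects(a: Rect, b: Rect): boolean {
  return a.x < b.x + b.width && b.x < a.x + a.width &&
         a.y < b.y + b.height && b.y < a.y + a.height;
}

function collectCandidates(node: RenderNode, area: Rect, out: RenderNode[] = []): RenderNode[] {
  if (node.focusable && intersects(node.box, area)) {
    out.push(node);
  }
  for (const child of node.children) {
    collectCandidates(child, area, out);
  }
  return out;
}
```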
  • FIG. 7 illustrates an exemplary behavior of movement of the focus position. The focus position is a point inside the area of the currently focused user interface element, displayed on the display 52. Alternatively, the focus point may not actually be inside the currently focused user interface element, but positioned in proximity, exterior to the focused user interface element. This point is moved along the desired direction 328 (given by several user input events), and the user interface elements traversed this way are favored for receiving the focus. This adds reversibility to the navigation, as can be seen in the following example.
  • The focus position movement follows the navigational input given by the user: given that a user interface element 320 has focus and the focus position is in the position 325, navigating downward 328 causes the focus position to move in the same direction. Initially, the focus position encounters a user interface element 322, resulting in the focus position moving to a new position 326. An additional user input event indicating a desire to move further downwards analogously moves the focus position to a new position 327 in a user interface element 323. When moving upward from user interface element 323, the focus position will be determined such that the same user interface elements receive focus (322 followed by 320). Here is a difference from prior art solutions to the same problem: when traveling upwards from user interface element 323, the focus arrives at 322. At this point, an algorithm that relies solely on a mathematical function to compute the distance between user interface elements would most likely choose user interface element 321 as the next focus target. One reason for this can be that a geometrical center 328 of the user interface element 322 is closer to a geometrical center 329 of user interface element 321 than to the geometrical center 330 of user interface element 320. However, this would lead to a rather bad user experience, since the expectation is that 320, and not 321, would receive focus.
  • FIG. 8 illustrates a movement of a focus position when content is scrolled in an embodiment of the present invention. On mobile communication terminals it is quite often the case that only a small part of a document is visible through the display 52. Consequently it frequently happens that only one user interface element is currently visible, and the user has to scroll the document in order to see other user interface elements.
  • For example, a user triggers a scroll in the browser application along a direction 336 to the left from an original display view 335 a such that the display view changes to 335 b, and in response to an additional user input event, to a display view 335 c. Content, such as user interface elements 331, 332 and 333, is moved correspondingly as an effect of the scrolling. Additionally, synchronized with the scrolling, the focus position moves from an original position 334 a to an intermediate position 334 b and finally to a position 334 c. The focus position keeps, where possible, the same relative position inside the display 52.
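  • Read as code, keeping the focus position at the same relative position reduces to shifting its document coordinate together with the scrolled content. A minimal sketch under that assumption (positions and deltas in document coordinates):

      def focus_position_after_scroll(focus_pos, scroll_delta):
          # FIG. 8: when the document content is shifted by scroll_delta,
          # moving the focus position by the same delta keeps it at the same
          # relative position inside the display.
          return (focus_pos[0] + scroll_delta[0], focus_pos[1] + scroll_delta[1])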
  • FIG. 9 illustrates a movement of a focus position to a focusable user interface element in an image 341. This illustrates the situation where a focusable user interface element 341 encloses other focusable user interface elements 342-345. An example of such a situation is an HTML document where the author places several links 342-345 inside a large image 341, which is, in itself, focusable.
  • The focus position is originally in a position 339 in a user interface element 340, displayed on the display 52. The user indicates a desired direction 337 of movement, in this case downwards, resulting in the focus position moving to a new position 338. This new position can then be the starting point for applying a distance function to the user interface elements 342-345 inside the image 341, as described in detail with reference to FIGS. 10A and 10B below. Once a user interface element is chosen, the focus position can be used again as described above.
  • FIGS. 10A and 10B illustrate the use of a distance function in an embodiment of the present invention.
  • In FIG. 10A, the currently focused user interface element 347 is indicated as having focus by a box outline 346. The focus position is in a position 350. The user indicates a desired direction 351 to the right, resulting in the candidate user interface elements being the user interface elements 348 and 349.
  • In FIG. 10B, the currently focused user interface element 353 is indicated as having focus by a box outline 352. The current focus position is in a position 356. The user indicates a desired direction 359 downwards, resulting in the candidate user interface elements being user interface elements 354 and 355. A candidate focus position 357 is determined for user interface element 354 by having a co-ordinate with a component on the y-axis corresponding to the geometrical center of the user interface element 354, and a component on the x-axis equal to that of the current focus position 356. A candidate focus position 358 is determined for user interface element 355 by having a co-ordinate with a component on the y-axis corresponding to the geometrical center of the user interface element 355, and a component on the x-axis as close as possible to that of the current focus position 356 while still remaining within the user interface element 355. In this case, an optional margin is applied such that the candidate focus position 358 is positioned a margin distance from the border of the user interface element 355.
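  • Before turning to the distance function itself, the derivation of candidate focus positions such as 357 and 358 can be sketched as follows, again for a downward desired direction only and reusing the hypothetical Rect type above:

      def candidate_focus_position(current_pos, box, margin=2.0):
          # Downward navigation: the y component is the geometric center of
          # the candidate box; the x component stays as close as possible to
          # the current focus position while remaining at least `margin`
          # inside the candidate (positions 357 and 358 in FIG. 10B).
          y = box.y + box.h / 2.0
          x = min(max(current_pos[0], box.x + margin), box.x + box.w - margin)
          return (x, y)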
  • The distance function is used to determine a weighted distance between the currently focused user interface element and a candidate user interface element. It is applied to each candidate user interface element in turn, and the new user interface element to receive focus is then determined to be the candidate user interface element with the smallest weighted distance to the currently focused user interface element. The weighted distance is calculated by means of the following formula:
    dist_weighted = dist_basic + dist_parallel + 2 * dist_orthogonal − √(overlap)
  • The parameter dist_basic is a Euclidean distance between the current focus position of the currently focused user interface element and the candidate focus position of the candidate user interface element. If the two positions have the same coordinate on the axis orthogonal to the desired direction, dist_basic is forced to be 0. For example, in FIG. 10B, when calculating the weighted distance between the user interface elements 353 and 355, dist_basic is the Euclidean distance between 356 and 358. When calculating dist_weighted between user interface elements 353 and 354, dist_basic is forced to be 0. However, due to the presence of the other terms in the calculation of dist_weighted, user interface element 355 will yield a smaller distance value than 354, so the focus will correctly move to 355.
  • The parameter dist_parallel is a component along an axis parallel to the desired direction of a distance between a first point 360 of the first user interface element utmost in the desired direction, and a second point 361 of the candidate user interface element utmost in a direction opposite the desired direction. The parameter dist_parallel is in other words the component of the distance between the points 360 and 361, projected along an axis parallel to the desired direction. Preferably, if the first point is further along the desired direction than the second point, dist_parallel is set to be 0. Note that in FIG. 10B, as the currently focused user interface element 353 is rectangular, the point 360 may be chosen arbitrarily along the lower edge of the currently focused user interface element 353. Analogously, the point 361 may be chosen arbitrarily along the upper edge of the candidate user interface element 355.
  • The parameter dist_orthogonal is set to 0 if there is an overlap between the first user interface element and the candidate user interface element in a direction orthogonal to the desired direction; otherwise dist_orthogonal is a component along an axis orthogonal to the desired direction of a distance between a third point, which may be the same as the first point, of the first user interface element closest to the candidate user interface element and a fourth point, which may be the same as the second point, of the candidate user interface element closest to the first user interface element. For example, in FIG. 10A, dist_orthogonal is the vertical distance, as the desired direction is horizontal, between the point 362 of the currently focused user interface element 347 and the point 363 of the candidate user interface element 348. The parameter dist_orthogonal is in other words the component of the distance between the points 362 and 363, projected along an axis orthogonal to the desired direction. Note that in this case, as the currently focused user interface element is rectangular, the point 362 may be chosen arbitrarily along the upper edge of the currently focused user interface element 347. Analogously, the point 363 may be chosen arbitrarily along the lower edge of the candidate user interface element 348. The term dist_orthogonal is used in the calculation to compensate for situations where a link is close along the desired direction but very far away on the orthogonal axis. In such a case, it is more natural to navigate to another link, which may be further away along the desired direction but approximately on the same level on the orthogonal axis.
  • The parameter overlap is a distance of overlap between the currently focused user interface element and the candidate user interface element in a direction orthogonal to the desired direction. For example, in FIG. 10A, the overlap between the currently focused user interface element 347 and the candidate user interface element 349 is a distance 364. User interface elements are rewarded for having a high overlap with the currently focused user interface element. To prevent longer user interface elements from always winning focus over shorter ones when all candidates completely overlap with the currently focused user interface element, a visible width may optionally be set as an upper limit for the overlap.
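  • Collecting the four terms, a minimal sketch of the weighted distance for a downward desired direction (the horizontal case is symmetric) may look as follows; the box and position arguments reuse the hypothetical Rect type and focus-position tuples above:

      import math

      def weighted_distance(cur_box, cur_pos, cand_box, cand_pos,
                            visible_width=None):
          # dist_weighted = dist_basic + dist_parallel
          #                 + 2 * dist_orthogonal - sqrt(overlap),
          # written out for a downward desired direction.

          # dist_basic: Euclidean distance between the two focus positions,
          # forced to 0 when they share the coordinate orthogonal to the
          # desired direction (here, the same x).
          if cur_pos[0] == cand_pos[0]:
              dist_basic = 0.0
          else:
              dist_basic = math.hypot(cand_pos[0] - cur_pos[0],
                                      cand_pos[1] - cur_pos[1])

          # dist_parallel: gap along the desired direction between the lower
          # edge of the focused element and the upper edge of the candidate;
          # 0 if the candidate begins above that edge.
          dist_parallel = max(0.0, cand_box.y - (cur_box.y + cur_box.h))

          # dist_orthogonal: 0 when the elements overlap horizontally,
          # otherwise the horizontal gap between their nearest edges.
          left = max(cur_box.x, cand_box.x)
          right = min(cur_box.x + cur_box.w, cand_box.x + cand_box.w)
          dist_orthogonal = max(0.0, left - right)

          # overlap: length of the horizontal overlap, optionally capped at
          # a visible width so long elements do not always win focus.
          overlap = max(0.0, right - left)
          if visible_width is not None:
              overlap = min(overlap, visible_width)

          return (dist_basic + dist_parallel + 2.0 * dist_orthogonal
                  - math.sqrt(overlap))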
  • FIG. 11 shows a flow chart illustrating an embodiment of the present invention. As a person skilled in the art will realize, not all steps in this embodiment are required to implement the invention.
  • In a detect directional input step 410, a directional input signal is detected via an input device such as the five-way navigation key 5 a. This input gives information about the desired direction in which the user wishes to move focus from the currently focused user interface element to a target user interface element.
  • In a determine search area step 411, a search area is determined, as described in detail with reference to FIG. 6 above.
  • In a determine a set of candidate user interface elements step 412, the user interface elements within the search area are all considered candidate user interface elements, and references to them are collected in a set.
  • In a determine a test candidate user interface element step 413, a test candidate user interface element, for which a weighted distance has not been calculated yet, is selected from the set of candidate user interface elements.
  • In a calculate weighted distance step 414, a weighted distance is calculated between the test candidate user interface element and the currently focused user interface element, as described in detail with reference to FIGS. 10A and 10B above.
  • In a conditional uncalculated user interface elements step 415, it is tested whether there are any more uncalculated user interface elements. If there are more uncalculated user interface elements, the method proceeds to the determine a test candidate user interface element step 413, otherwise the method proceeds to a determine target user interface element step 416.
  • In the determine target user interface element step 416, the target user interface element is determined as the element in the candidate set of user interface elements having the minimum weighted distance to the currently focused user interface element.
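  • Taken together, steps 410-416 amount to a short selection loop. The following sketch reuses the hypothetical helpers from the examples above (collect_candidates, candidate_focus_position and weighted_distance) and is, again, only an illustration of the flow chart, not a definitive implementation:

      def move_focus(root, cur_node, cur_pos, search_area, margin=2.0):
          # Steps 411-412: determine the search area (passed in here) and
          # collect the set of candidate user interface elements within it.
          candidates = []
          collect_candidates(root, search_area, candidates)
          candidates = [c for c in candidates if c is not cur_node]
          if not candidates:
              return cur_node, cur_pos

          # Steps 413-416: compute a weighted distance for every candidate
          # and give focus to the one with the minimum distance.
          def score(cand):
              cand_pos = candidate_focus_position(cur_pos, cand.box, margin)
              return weighted_distance(cur_node.box, cur_pos,
                                       cand.box, cand_pos)

          target = min(candidates, key=score)
          new_pos = candidate_focus_position(cur_pos, target.box, margin)
          return target, new_pos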
  • The invention has mainly been described above with reference to a number of embodiments. However, as is readily appreciated by a person skilled in the art, other embodiments than the ones disclosed above are equally possible within the scope of the invention, as defined by the appended patent claims. It is to be noted that the invention may be exercised in other kinds of mobile communication terminals than the pocket computer of FIGS. 1-3, including but not limited to mobile (cellular) telephones and personal digital assistants (PDAs).

Claims (16)

1. A method to move focus from a first user interface element to a second user interface element shown on a display of a mobile communication terminal, said mobile communication terminal further comprising an input device, said method comprising the steps of:
detecting a directional input via said input device, said input indicating a desired direction to move;
determining a set of candidate user interface elements being eligible to receive focus;
for each candidate user interface element in said set of candidate user interface elements, determining a weighted distance between said candidate user interface element and said first user interface element by applying a function including, as an input parameter, an overlap between said first user interface element and said candidate user interface element in a direction orthogonal to said desired direction; and
determining said second user interface element to be a particular candidate user interface element in said set of candidate user interface elements that has a minimum weighted distance to said first user interface element.
2. The method of claim 1, wherein said mobile communication terminal further comprises a current focus position related to said first user interface element, said method comprising the further step of:
determining a new focus position related to said second user interface element, such that a component distance, along an axis orthogonal to said desired direction, of an absolute distance between said current focus position and said new focus position, is minimized.
3. The method according to claim 2, wherein in said step of determining a new focus position, said new focus position is determined such that it is placed inside said second user interface element, at least a margin distance from any border thereof.
4. The method of claim 1, wherein said step of detecting an input involves detecting an input from a directional input device.
5. The method of claim 4, wherein said step of detecting an input involves detecting an input from a device selected from the group consisting of a four way input device, a five way input device, an eight way input device, a nine way input device, a joystick, a joypad and a navigation key.
6. The method of claim 1, wherein said overlap between said first user interface element and said candidate user interface element in a direction orthogonal to said desired direction is restricted to an overlap being visible on said display.
7. The method of claim 1, further comprising the step of:
providing a visual representation of said focus position on said display.
8. The method of claim 7, wherein said visual representation is a graphical symbol.
9. The method of claim 1, further comprising the step, after said step of detecting an input, of:
determining a search area in a currently displayed document in which said first and second user interface elements are included, wherein said step of determining a set of candidate user interface elements is confined to user interface elements included in said search area.
10. The method of claim 9, wherein said search area is a combination of a part of said document currently visible on said display and a part of said document that would be visible if said document is scrolled in said desired direction.
11. The method of claim 1, wherein said applied function is:

dist_basic + dist_parallel + 2 * dist_orthogonal − √(overlap),
wherein dist_basic is a Euclidean distance between a current focus position related to said first user interface element and a candidate focus position related to said candidate user interface element;
dist_parallel is a component along an axis parallel to said desired direction of a distance between a first point of said first user interface element utmost in said desired direction and a second point of said candidate user interface element utmost in a direction opposite said desired direction;
dist_orthogonal is a component along an axis orthogonal to said desired direction of a distance between a third point, which may be the same as said first point, of said first user interface element closest to said candidate user interface element and a fourth point, which may be the same as said second point, of said candidate user interface element closest to said first user interface element; and
overlap is a distance of overlap between said first user interface element and said candidate user interface element in a direction orthogonal to said desired direction.
12. The method of claim 11, wherein dist_parallel is set to 0 if said first user interface element and said candidate user interface element overlap in a direction parallel to said desired direction.
13. The method of claim 11, wherein dist_orthogonal is set to 0 if said first user interface element and said candidate user interface element overlap in a direction orthogonal to said desired direction.
14. The method of claim 11, wherein if a component of said current focus position on an axis orthogonal to said desired direction equals a component of said candidate focus position on an axis orthogonal to said desired direction, dist_basic is determined to be 0.
15. A mobile communication terminal comprising a display and an input device, said terminal being configured to allow movement of focus from a first user interface element to a second user interface element shown on said display, said terminal furthermore comprising:
means for detecting a directional input via said input device, said input indicating a desired direction to move;
means for determining a set of candidate user interface elements being eligible to receive focus;
means for determining, for each candidate user interface element in said set of candidate user interface elements, a weighted distance between said candidate user interface element and said first user interface element by applying a function including, as an input parameter, an overlap between said first user interface element and said candidate user interface element in a direction orthogonal to said desired direction; and
means for determining said second user interface element to be a particular candidate user interface element in said set of candidate user interface elements that has a minimum weighted distance to said first user interface element.
16. A computer program product, directly loadable into a memory of a digital computer, comprising software code portions for performing a method according to claim 1.
US11/158,921 2005-05-23 2005-06-22 Mobile communication terminal and method Abandoned US20060262146A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/158,921 US20060262146A1 (en) 2005-05-23 2005-06-22 Mobile communication terminal and method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/135,624 US9785329B2 (en) 2005-05-23 2005-05-23 Pocket computer and associated methods
US11/158,921 US20060262146A1 (en) 2005-05-23 2005-06-22 Mobile communication terminal and method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/135,624 Continuation-In-Part US9785329B2 (en) 2005-05-23 2005-05-23 Pocket computer and associated methods

Publications (1)

Publication Number Publication Date
US20060262146A1 true US20060262146A1 (en) 2006-11-23

Family

ID=37447912

Family Applications (4)

Application Number Title Priority Date Filing Date
US11/135,624 Active 2027-12-17 US9785329B2 (en) 2005-05-23 2005-05-23 Pocket computer and associated methods
US11/158,921 Abandoned US20060262146A1 (en) 2005-05-23 2005-06-22 Mobile communication terminal and method
US11/249,156 Active 2026-11-08 US9448711B2 (en) 2005-05-23 2005-10-12 Mobile communication terminal and associated methods
US11/439,530 Abandoned US20070120832A1 (en) 2005-05-23 2006-05-23 Portable electronic apparatus and associated method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/135,624 Active 2027-12-17 US9785329B2 (en) 2005-05-23 2005-05-23 Pocket computer and associated methods

Family Applications After (2)

Application Number Title Priority Date Filing Date
US11/249,156 Active 2026-11-08 US9448711B2 (en) 2005-05-23 2005-10-12 Mobile communication terminal and associated methods
US11/439,530 Abandoned US20070120832A1 (en) 2005-05-23 2006-05-23 Portable electronic apparatus and associated method

Country Status (2)

Country Link
US (4) US9785329B2 (en)
JP (1) JP2008542868A (en)


Families Citing this family (160)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS58103744A (en) * 1981-12-16 1983-06-20 Hitachi Ltd Manufacturing method for fluorescent lamp
US7760187B2 (en) 2004-07-30 2010-07-20 Apple Inc. Visual expander
AU2003277358A1 (en) * 2002-10-10 2004-05-04 Action Engine Corporation A method for dynamically assigning and displaying character shortcuts on a computing device display
US20070192711A1 (en) * 2006-02-13 2007-08-16 Research In Motion Limited Method and arrangement for providing a primary actions menu on a handheld communication device
US7890881B1 (en) * 2005-07-29 2011-02-15 Adobe Systems Incorporated Systems and methods for a fold preview
US20070094280A1 (en) * 2005-10-26 2007-04-26 Elina Vartiainen Mobile communication terminal
CN1991726A (en) * 2005-12-30 2007-07-04 鸿富锦精密工业(深圳)有限公司 Keyboard-free inputting portable electronic device and its realizing method and operation interface
US7705861B2 (en) * 2006-01-19 2010-04-27 Microsoft Corporation Snap to element analytical tool
FR2896716B1 (en) * 2006-01-31 2009-06-26 Abb Mc Soc Par Actions Simplif METHOD FOR CONTROLLING A ROBOTIZED WORK STATION AND CORRESPONDING ROBOTIC STATION
US8904286B2 (en) * 2006-02-13 2014-12-02 Blackberry Limited Method and arrangement for providing a primary actions menu on a wireless handheld communication device
US7461349B1 (en) * 2006-02-28 2008-12-02 Adobe Systems Incorporated Methods and apparatus for applying functions to content
US8717302B1 (en) * 2006-06-30 2014-05-06 Cypress Semiconductor Corporation Apparatus and method for recognizing a gesture on a sensing device
US7996789B2 (en) * 2006-08-04 2011-08-09 Apple Inc. Methods and apparatuses to control application programs
US8564544B2 (en) 2006-09-06 2013-10-22 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
WO2008045690A2 (en) 2006-10-06 2008-04-17 Veveo, Inc. Linear character selection display interface for ambiguous text input
US8570278B2 (en) 2006-10-26 2013-10-29 Apple Inc. Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
US7856605B2 (en) 2006-10-26 2010-12-21 Apple Inc. Method, system, and graphical user interface for positioning an insertion marker in a touch screen display
US20080100585A1 (en) * 2006-11-01 2008-05-01 Teemu Pohjola mobile communication terminal
KR100851977B1 (en) * 2006-11-20 2008-08-12 삼성전자주식회사 Controlling Method and apparatus for User Interface of electronic machine using Virtual plane.
US20080168402A1 (en) 2007-01-07 2008-07-10 Christopher Blumenberg Application Programming Interfaces for Gesture Operations
US20080168478A1 (en) * 2007-01-07 2008-07-10 Andrew Platzer Application Programming Interfaces for Scrolling
US8082523B2 (en) * 2007-01-07 2011-12-20 Apple Inc. Portable electronic device with graphical user interface supporting application switching
US8689132B2 (en) 2007-01-07 2014-04-01 Apple Inc. Portable electronic device, method, and graphical user interface for displaying electronic documents and lists
KR20090000137A (en) * 2007-01-11 2009-01-07 삼성전자주식회사 System and method for navigation of web browser
WO2008087871A1 (en) * 2007-01-15 2008-07-24 Nec Corporation Portable communication terminal, browsing method, and browsing program
US8391786B2 (en) * 2007-01-25 2013-03-05 Stephen Hodges Motion triggered data transfer
US20080313574A1 (en) * 2007-05-25 2008-12-18 Veveo, Inc. System and method for search with reduced physical interaction requirements
US8074178B2 (en) * 2007-06-12 2011-12-06 Microsoft Corporation Visual feedback display
US9772751B2 (en) 2007-06-29 2017-09-26 Apple Inc. Using gestures to slide between user interfaces
KR101404606B1 (en) * 2007-08-21 2014-06-10 삼성전자주식회사 A method of providing menus by using a touchscreen and a multimedia apparatus thereof
US10203873B2 (en) 2007-09-19 2019-02-12 Apple Inc. Systems and methods for adaptively presenting a keyboard on a touch-sensitive display
US9110590B2 (en) 2007-09-19 2015-08-18 Typesoft Technologies, Inc. Dynamically located onscreen keyboard
US10126942B2 (en) * 2007-09-19 2018-11-13 Apple Inc. Systems and methods for detecting a press on a touch-sensitive surface
US20120075193A1 (en) * 2007-09-19 2012-03-29 Cleankeys Inc. Multiplexed numeric keypad and touchpad
US9454270B2 (en) 2008-09-19 2016-09-27 Apple Inc. Systems and methods for detecting a press on a touch-sensitive surface
US9489086B1 (en) 2013-04-29 2016-11-08 Apple Inc. Finger hover detection for improved typing
US20090100380A1 (en) * 2007-10-12 2009-04-16 Microsoft Corporation Navigating through content
TW200923758A (en) * 2007-11-27 2009-06-01 Wistron Corp A key-in method and a content display method of an electronic device, and the application thereof
KR101474561B1 (en) * 2007-11-27 2014-12-19 삼성전자주식회사 Method for executing application in mobile communication teminal and apparatus therefor
US20090158190A1 (en) * 2007-12-13 2009-06-18 Yuvee, Inc. Computing apparatus including a personal web and application assistant
US9612847B2 (en) * 2008-02-05 2017-04-04 Microsoft Technology Licensing, Llc Destination list associated with an application launcher
US8205157B2 (en) 2008-03-04 2012-06-19 Apple Inc. Methods and graphical user interfaces for conducting searches on a portable multifunction device
US20090228779A1 (en) * 2008-03-04 2009-09-10 Richard John Williamson Use of remote services by a local wireless electronic device
US8645827B2 (en) 2008-03-04 2014-02-04 Apple Inc. Touch event model
US8650507B2 (en) 2008-03-04 2014-02-11 Apple Inc. Selecting of text using gestures
US8201109B2 (en) 2008-03-04 2012-06-12 Apple Inc. Methods and graphical user interfaces for editing on a portable multifunction device
KR100952699B1 (en) * 2008-03-10 2010-04-13 한국표준과학연구원 Full-browsing display method in touchscreen apparatus using tactile sensors
US20090235186A1 (en) * 2008-03-12 2009-09-17 Microsoft Corporation Limited-scope rendering
US8525802B2 (en) * 2008-03-31 2013-09-03 Lg Electronics Inc. Portable terminal capable of sensing proximity touch and method for providing graphic user interface using the same
WO2009126143A1 (en) * 2008-04-08 2009-10-15 Hewlett-Packard Development Company, L.P. Systems and methods for launching a user application on a computing device
KR101079624B1 (en) * 2008-05-06 2011-11-03 삼성전자주식회사 Method for display of browser and portable terminal using the same
US8296670B2 (en) * 2008-05-19 2012-10-23 Microsoft Corporation Accessing a menu utilizing a drag-operation
US8237666B2 (en) * 2008-10-10 2012-08-07 At&T Intellectual Property I, L.P. Augmented I/O for limited form factor user-interfaces
US8253713B2 (en) 2008-10-23 2012-08-28 At&T Intellectual Property I, L.P. Tracking approaching or hovering objects for user-interfaces
KR101083158B1 (en) * 2008-12-04 2011-11-11 에스케이텔레시스 주식회사 Contents conversion method for mobile terminal with touch screen
JP2010134755A (en) * 2008-12-05 2010-06-17 Toshiba Corp Communication device
US8489569B2 (en) 2008-12-08 2013-07-16 Microsoft Corporation Digital media retrieval and display
US9037992B2 (en) * 2008-12-29 2015-05-19 International Business Machines Corporation System and method for changing system modes
US20100171888A1 (en) * 2009-01-05 2010-07-08 Hipolito Saenz Video frame recorder
US20100218141A1 (en) * 2009-02-23 2010-08-26 Motorola, Inc. Virtual sphere input controller for electronics device
US8589374B2 (en) 2009-03-16 2013-11-19 Apple Inc. Multifunction device with integrated search and application selection
US9684521B2 (en) 2010-01-26 2017-06-20 Apple Inc. Systems having discrete and continuous gesture recognizers
US8756534B2 (en) 2009-03-16 2014-06-17 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US8566045B2 (en) 2009-03-16 2013-10-22 Apple Inc. Event recognition
KR101640460B1 (en) * 2009-03-25 2016-07-18 삼성전자 주식회사 Operation Method of Split Window And Portable Device supporting the same
US20100281409A1 (en) * 2009-04-30 2010-11-04 Nokia Corporation Apparatus and method for handling notifications within a communications device
JP2011022842A (en) * 2009-07-16 2011-02-03 Sony Corp Display apparatus, display method, and program
KR101629645B1 (en) * 2009-09-18 2016-06-21 엘지전자 주식회사 Mobile Terminal and Operation method thereof
EP2341419A1 (en) * 2009-12-31 2011-07-06 Sony Computer Entertainment Europe Limited Device and method of control
US8698845B2 (en) * 2010-01-06 2014-04-15 Apple Inc. Device, method, and graphical user interface with interactive popup views
US20120287039A1 (en) * 2010-01-28 2012-11-15 Craig Brown User interface for application selection and action control
JP5486977B2 (en) * 2010-03-24 2014-05-07 株式会社日立ソリューションズ Coordinate input device and program
US20110252357A1 (en) 2010-04-07 2011-10-13 Imran Chaudhri Device, Method, and Graphical User Interface for Managing Concurrently Open Software Applications
US8423911B2 (en) 2010-04-07 2013-04-16 Apple Inc. Device, method, and graphical user interface for managing folders
US9513801B2 (en) 2010-04-07 2016-12-06 Apple Inc. Accessing electronic notifications and settings icons with gestures
US9823831B2 (en) 2010-04-07 2017-11-21 Apple Inc. Device, method, and graphical user interface for managing concurrently open software applications
US10788976B2 (en) 2010-04-07 2020-09-29 Apple Inc. Device, method, and graphical user interface for managing folders with multiple pages
US8552999B2 (en) 2010-06-14 2013-10-08 Apple Inc. Control selection approximation
CN102298595A (en) * 2010-06-23 2011-12-28 北京爱国者信息技术有限公司 Browser guiding system and guiding method thereof
JP5569271B2 (en) * 2010-09-07 2014-08-13 ソニー株式会社 Information processing apparatus, information processing method, and program
JP5693901B2 (en) * 2010-09-24 2015-04-01 シャープ株式会社 Display device, display method, and program
US8782534B2 (en) * 2010-10-12 2014-07-15 International Business Machines Corporation Independent viewing of web conference content by participants
US8572489B2 (en) * 2010-12-16 2013-10-29 Harman International Industries, Incorporated Handlebar audio controls
US9244606B2 (en) * 2010-12-20 2016-01-26 Apple Inc. Device, method, and graphical user interface for navigation of concurrently open software applications
WO2012098469A2 (en) 2011-01-20 2012-07-26 Cleankeys Inc. Systems and methods for monitoring surface sanitation
US9665250B2 (en) 2011-02-07 2017-05-30 Blackberry Limited Portable electronic device and method of controlling same
US20120266090A1 (en) * 2011-04-18 2012-10-18 Microsoft Corporation Browser Intermediary
WO2012144667A1 (en) * 2011-04-19 2012-10-26 Lg Electronics Inc. Method and electronic device for gesture recognition
US8661339B2 (en) 2011-05-31 2014-02-25 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
JP5852336B2 (en) * 2011-06-13 2016-02-03 任天堂株式会社 Display control program, display control method, display control system, and display control apparatus
KR20130004654A (en) * 2011-07-04 2013-01-14 삼성전자주식회사 Method and device for editing text in wireless terminal
US20130055164A1 (en) * 2011-08-24 2013-02-28 Sony Ericsson Mobile Communications Ab System and Method for Selecting Objects on a Touch-Sensitive Display of a Mobile Communications Device
US8806369B2 (en) 2011-08-26 2014-08-12 Apple Inc. Device, method, and graphical user interface for managing and interacting with concurrently open software applications
US9507454B1 (en) * 2011-09-19 2016-11-29 Parade Technologies, Ltd. Enhanced linearity of gestures on a touch-sensitive surface
KR101873056B1 (en) 2011-09-20 2018-07-02 삼성전자주식회사 Device and method for performing application in wireless terminal
US20130086112A1 (en) 2011-10-03 2013-04-04 James R. Everingham Image browsing system and method for a digital content platform
US8737678B2 (en) 2011-10-05 2014-05-27 Luminate, Inc. Platform for providing interactive applications on a digital content platform
USD737290S1 (en) 2011-10-10 2015-08-25 Yahoo! Inc. Portion of a display screen with a graphical user interface
USD736224S1 (en) 2011-10-10 2015-08-11 Yahoo! Inc. Portion of a display screen with a graphical user interface
JP6159078B2 (en) * 2011-11-28 2017-07-05 京セラ株式会社 Apparatus, method, and program
US9747019B2 (en) * 2012-02-24 2017-08-29 Lg Electronics Inc. Mobile terminal and control method thereof
US8255495B1 (en) 2012-03-22 2012-08-28 Luminate, Inc. Digital image and content display systems and methods
US9104260B2 (en) 2012-04-10 2015-08-11 Typesoft Technologies, Inc. Systems and methods for detecting a press on a touch-sensitive surface
WO2013169843A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for manipulating framed graphical objects
WO2013169842A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for selecting object within a group of objects
WO2013169851A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for facilitating user interaction with controls in a user interface
EP3185116B1 (en) 2012-05-09 2019-09-11 Apple Inc. Device, method and graphical user interface for providing tactile feedback for operations performed in a user interface
KR101670570B1 (en) 2012-05-09 2016-10-28 애플 인크. Device, method, and graphical user interface for selecting user interface objects
WO2013169870A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for transitioning between display states in response to gesture
WO2013169845A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for scrolling nested regions
JP2015519656A (en) 2012-05-09 2015-07-09 アップル インコーポレイテッド Device, method and graphical user interface for moving and dropping user interface objects
WO2013169849A2 (en) 2012-05-09 2013-11-14 Industries Llc Yknots Device, method, and graphical user interface for displaying user interface objects corresponding to an application
WO2013169875A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for displaying content associated with a corresponding affordance
WO2013169865A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
CN104487929B (en) 2012-05-09 2018-08-17 苹果公司 For contacting the equipment for carrying out display additional information, method and graphic user interface in response to user
EP2847662B1 (en) 2012-05-09 2020-02-19 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US20140028554A1 (en) * 2012-07-26 2014-01-30 Google Inc. Recognizing gesture on tactile input device
KR101961860B1 (en) 2012-08-28 2019-03-25 삼성전자주식회사 User terminal apparatus and contol method thereof
KR101957173B1 (en) 2012-09-24 2019-03-12 삼성전자 주식회사 Method and apparatus for providing multi-window at a touch device
RU2662636C2 (en) * 2012-09-25 2018-07-26 Опера Софтвэар Ас Information management and display in web browsers
EP2939095B1 (en) 2012-12-29 2018-10-03 Apple Inc. Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics
KR101958517B1 (en) 2012-12-29 2019-03-14 애플 인크. Device, method, and graphical user interface for transitioning between touch input to display output relationships
EP2912542B1 (en) 2012-12-29 2022-07-13 Apple Inc. Device and method for forgoing generation of tactile output for a multi-contact gesture
CN105264479B (en) 2012-12-29 2018-12-25 苹果公司 Equipment, method and graphic user interface for navigating to user interface hierarchical structure
EP3564806B1 (en) 2012-12-29 2024-02-21 Apple Inc. Device, method and graphical user interface for determining whether to scroll or select contents
JP6115136B2 (en) * 2013-01-08 2017-04-19 日本電気株式会社 Information communication apparatus, control method thereof, and program
US9658740B2 (en) 2013-03-15 2017-05-23 Apple Inc. Device, method, and graphical user interface for managing concurrently open software applications
US9477404B2 (en) 2013-03-15 2016-10-25 Apple Inc. Device, method, and graphical user interface for managing concurrently open software applications
JP6018017B2 (en) * 2013-05-31 2016-11-02 グリー株式会社 Information processing method, information processing system, and program
US9733716B2 (en) 2013-06-09 2017-08-15 Apple Inc. Proxy gesture recognizer
KR102110779B1 (en) * 2013-06-27 2020-05-14 삼성전자 주식회사 Method and apparatus for managing page display mode in application of an user device
KR102179056B1 (en) * 2013-07-19 2020-11-16 엘지전자 주식회사 Mobile terminal and control method for the mobile terminal
US10757241B2 (en) * 2013-07-29 2020-08-25 Oath Inc. Method and system for dynamically changing a header space in a graphical user interface
US10289302B1 (en) 2013-09-09 2019-05-14 Apple Inc. Virtual keyboard animation
JP6149684B2 (en) * 2013-10-21 2017-06-21 ブラザー工業株式会社 Portable terminal, image processing apparatus, and program
US9934207B1 (en) 2014-05-02 2018-04-03 Tribune Publishing Company, Llc Online information system with continuous scrolling and previous section removal
USD882582S1 (en) 2014-06-20 2020-04-28 Google Llc Display screen with animated graphical user interface
USD774062S1 (en) * 2014-06-20 2016-12-13 Google Inc. Display screen with graphical user interface
JP6390213B2 (en) * 2014-06-30 2018-09-19 ブラザー工業株式会社 Display control apparatus, display control method, and display control program
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US9645732B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US10048757B2 (en) 2015-03-08 2018-08-14 Apple Inc. Devices and methods for controlling media presentation
US9785305B2 (en) 2015-03-19 2017-10-10 Apple Inc. Touch input cursor manipulation
US20170045981A1 (en) 2015-08-10 2017-02-16 Apple Inc. Devices and Methods for Processing Touch Inputs Based on Their Intensities
US9674426B2 (en) 2015-06-07 2017-06-06 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9860451B2 (en) 2015-06-07 2018-01-02 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10496275B2 (en) * 2015-10-12 2019-12-03 Microsoft Technology Licensing, Llc Multi-window keyboard
US10386997B2 (en) * 2015-10-23 2019-08-20 Sap Se Integrating functions for a user input device
US10127198B2 (en) * 2016-05-26 2018-11-13 International Business Machines Corporation Real-time text layout conversion control and management on a mobile electronic device
US9971483B2 (en) * 2016-05-26 2018-05-15 International Business Machines Corporation Contextual-based real-time text layout conversion control and management on a mobile electronic device
DK201670595A1 (en) 2016-06-11 2018-01-22 Apple Inc Configuring context-specific user interfaces
JP6147903B2 (en) * 2016-09-29 2017-06-14 グリー株式会社 Information processing method, information processing system, and program
US10283082B1 (en) 2016-10-29 2019-05-07 Dvir Gassner Differential opacity position indicator
CN106598406A (en) * 2016-11-16 2017-04-26 上海斐讯数据通信技术有限公司 Intelligent terminal-based page display method and intelligent terminal
CN107391559B (en) * 2017-06-08 2020-06-02 广东工业大学 General forum text extraction algorithm based on block, pattern recognition and line text
CN107765979A (en) * 2017-09-27 2018-03-06 北京金山安全软件有限公司 Display method and device of predicted words and electronic equipment
CN107748741B (en) * 2017-11-20 2021-04-23 维沃移动通信有限公司 Text editing method and mobile terminal
JP6909849B2 (en) * 2018-07-03 2021-07-28 グリー株式会社 Game processing program, game processing method, and game processing system
US11016643B2 (en) 2019-04-15 2021-05-25 Apple Inc. Movement of user interface object with user-specified content
CN111857505B (en) * 2020-07-16 2022-07-05 Oppo广东移动通信有限公司 Display method, device and storage medium


Family Cites Families (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3164353B2 (en) * 1990-03-30 2001-05-08 ソニー株式会社 Information input control device and method
US5880411A (en) * 1992-06-08 1999-03-09 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
TW238450B (en) 1993-06-07 1995-01-11 Microsoft Corp
US5808604A (en) 1994-03-10 1998-09-15 Microsoft Corporation Apparatus and method for automatically positioning a cursor on a control
US6321158B1 (en) * 1994-06-24 2001-11-20 Delorme Publishing Company Integrated routing/mapping information
JPH0937358A (en) 1995-07-20 1997-02-07 Sony Corp Keyboard and video camera control system
KR980009337A (en) 1996-07-30 1998-04-30 한형수 Method for producing biaxially oriented polyester film
US5953541A (en) * 1997-01-24 1999-09-14 Tegic Communications, Inc. Disambiguating system for disambiguating ambiguous input sequences by displaying objects associated with the generated input sequences in the order of decreasing frequency of use
US5999176A (en) * 1997-04-04 1999-12-07 International Business Machines Corporation Method to provide a single scrolling control for a multi-window interface
JPH10340178A (en) 1997-06-06 1998-12-22 Sony Corp Portable terminal equipment, information display method and information processing method
US6278465B1 (en) * 1997-06-23 2001-08-21 Sun Microsystems, Inc. Adaptive font sizes for network browsing
JP2000163444A (en) 1998-11-25 2000-06-16 Seiko Epson Corp Portable information device and information storage medium
US6590594B2 (en) * 1999-03-25 2003-07-08 International Business Machines Corporation Window scroll-bar
GB0017793D0 (en) 2000-07-21 2000-09-06 Secr Defence Human computer interface
US6981223B2 (en) * 2001-03-19 2005-12-27 Ecrio, Inc. Method, apparatus and computer readable medium for multiple messaging session management with a graphical user interface
JP3618303B2 (en) 2001-04-24 2005-02-09 松下電器産業株式会社 Map display device
US20030098891A1 (en) * 2001-04-30 2003-05-29 International Business Machines Corporation System and method for multifunction menu objects
US20020186257A1 (en) * 2001-06-08 2002-12-12 Cadiz Jonathan J. System and process for providing dynamic communication access and information awareness in an interactive peripheral display
US6640185B2 (en) * 2001-07-21 2003-10-28 Alpine Electronics, Inc. Display method and apparatus for navigation system
US7075513B2 (en) * 2001-09-04 2006-07-11 Nokia Corporation Zooming and panning content on a display screen
US7009599B2 (en) 2001-11-20 2006-03-07 Nokia Corporation Form factor for portable device
JP4085304B2 (en) 2002-03-25 2008-05-14 富士電機ホールディングス株式会社 Manufacturing method of solar cell module
US7225407B2 (en) * 2002-06-28 2007-05-29 Microsoft Corporation Resource browser sessions search
US7006074B2 (en) 2002-09-05 2006-02-28 Thomas Peter Chesters Multimodal pointer method
US8015259B2 (en) * 2002-09-10 2011-09-06 Alan Earl Swahn Multi-window internet search with webpage preload
US7523397B2 (en) * 2002-09-30 2009-04-21 Microsoft Corporation Centralized alert and notifications repository, manager, and viewer
JP2004206300A (en) 2002-12-24 2004-07-22 Nokia Corp Mobile electronic device
JP4074530B2 (en) 2003-02-28 2008-04-09 京セラ株式会社 Portable information terminal device
US20050052427A1 (en) * 2003-09-10 2005-03-10 Wu Michael Chi Hung Hand gesture interaction with touch surface
KR100568495B1 (en) 2003-09-16 2006-04-07 주식회사 쏠리테크 A portable electronic apparatus and a method for controlling the apparatus
US7411575B2 (en) 2003-09-16 2008-08-12 Smart Technologies Ulc Gesture recognition method and touch system incorporating the same
EP1574971A1 (en) 2004-03-10 2005-09-14 Alcatel A method, a hypermedia browser, a network client, a network server, and a computer software product for providing joint navigation of hypermedia documents
US7450111B2 (en) * 2004-10-27 2008-11-11 Nokia Corporation Key functionality for communication terminal
US20060267967A1 (en) 2005-05-24 2006-11-30 Microsoft Corporation Phrasing extensions and multiple modes in one spring-loaded control

Patent Citations (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4969097A (en) * 1985-09-18 1990-11-06 Levin Leonid D Method of rapid entering of text into computer equipment
US5523775A (en) * 1992-05-26 1996-06-04 Apple Computer, Inc. Method for selecting objects on a computer display
US5543591A (en) * 1992-06-08 1996-08-06 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
US5375201A (en) * 1992-12-18 1994-12-20 Borland International, Inc. System and methods for intelligent analytical graphing
US5623681A (en) * 1993-11-19 1997-04-22 Waverley Holdings, Inc. Method and apparatus for synchronizing, displaying and manipulating text and image documents
US5689666A (en) * 1994-01-27 1997-11-18 3M Method for handling obscured items on computer displays
US5724457A (en) * 1994-06-06 1998-03-03 Nec Corporation Character string input system
US5675753A (en) * 1995-04-24 1997-10-07 U.S. West Technologies, Inc. Method and system for presenting an electronic user-interface specification
US5703620A (en) * 1995-04-28 1997-12-30 U.S. Philips Corporation Cursor/pointer speed control based on directional relation to target objects
US5805159A (en) * 1996-08-22 1998-09-08 International Business Machines Corporation Mobile client computer interdependent display data fields
US5864340A (en) * 1996-08-22 1999-01-26 International Business Machines Corporation Mobile client computer programmed to predict input
US5959629A (en) * 1996-11-25 1999-09-28 Sony Corporation Text input device and method
US6002390A (en) * 1996-11-25 1999-12-14 Sony Corporation Text input device and method
US5995084A (en) * 1997-01-17 1999-11-30 Tritech Microelectronics, Ltd. Touchpad pen-input and mouse controller
US6173297B1 (en) * 1997-09-12 2001-01-09 Ericsson Inc. Dynamic object linking interface
US6008817A (en) * 1997-12-31 1999-12-28 Comparative Visual Assessments, Inc. Comparative visual assessment system and method
US6208345B1 (en) * 1998-04-15 2001-03-27 Adc Telecommunications, Inc. Visual data integration system and method
US6337698B1 (en) * 1998-11-20 2002-01-08 Microsoft Corporation Pen-based interface for a notepad computer
US20050283364A1 (en) * 1998-12-04 2005-12-22 Michael Longe Multimodal disambiguation of speech recognition
US6862712B1 (en) * 1999-03-08 2005-03-01 Tokyo University Of Agriculture And Technology Method for controlling displayed contents on a display device
US20050223308A1 (en) * 1999-03-18 2005-10-06 602531 British Columbia Ltd. Data entry for personal computing devices
US20020024506A1 (en) * 1999-11-09 2002-02-28 Flack James F. Motion detection and tracking system to control navigation and display of object viewers
US7171353B2 (en) * 2000-03-07 2007-01-30 Microsoft Corporation Grammar-based automatic data completion and suggestion for user input
US20010045949A1 (en) * 2000-03-29 2001-11-29 Autodesk, Inc. Single gesture map navigation graphical user interface for a personal digital assistant
US7107204B1 (en) * 2000-04-24 2006-09-12 Microsoft Corporation Computer-aided writing system and method with cross-language writing wizard
US7254527B2 (en) * 2000-04-24 2007-08-07 Microsoft Corporation Computer-aided reading system and method with cross-language reading wizard
US7228268B2 (en) * 2000-04-24 2007-06-05 Microsoft Corporation Computer-aided reading system and method with cross-language reading wizard
US7315809B2 (en) * 2000-04-24 2008-01-01 Microsoft Corporation Computer-aided reading system and method with cross-language reading wizard
US20020052900A1 (en) * 2000-05-15 2002-05-02 Freeman Alfred Boyd Computer assisted text input system
US20020156864A1 (en) * 2000-06-06 2002-10-24 Kniest James Newton System for wireless exchange of data with hand held devices
US20070263007A1 (en) * 2000-08-07 2007-11-15 Searchlite Advances, Llc Visual content browsing with zoom and pan features
US20040239681A1 (en) * 2000-08-07 2004-12-02 Zframe, Inc. Visual content browsing using rasterized representations
US20020015042A1 (en) * 2000-08-07 2002-02-07 Robotham John S. Visual content browsing using rasterized representations
US6570583B1 (en) * 2000-08-28 2003-05-27 Compal Electronics, Inc. Zoom-enabled handheld device
US7194404B1 (en) * 2000-08-31 2007-03-20 Semantic Compaction Systems Linguistic retrieval system and method
US20020103698A1 (en) * 2000-10-31 2002-08-01 Christian Cantrell System and method for enabling user control of online advertising campaigns
US20030045331A1 (en) * 2001-08-30 2003-03-06 Franco Montebovi Mobile telecommunications device browser
US20050012723A1 (en) * 2003-07-14 2005-01-20 Move Mobile Systems, Inc. System and method for a portable multimedia client
US20050044506A1 (en) * 2003-08-19 2005-02-24 Nokia Corporation Updating information content on a small display
US20060274051A1 (en) * 2003-12-22 2006-12-07 Tegic Communications, Inc. Virtual Keyboard Systems with Automatic Correction
US7327349B2 (en) * 2004-03-02 2008-02-05 Microsoft Corporation Advanced navigation techniques for portable devices
US20050195221A1 (en) * 2004-03-04 2005-09-08 Adam Berger System and method for facilitating the presentation of content via device displays
US20060020904A1 (en) * 2004-07-09 2006-01-26 Antti Aaltonen Stripe user interface
US20060026535A1 (en) * 2004-07-30 2006-02-02 Apple Computer Inc. Mode-based graphical user interfaces for touch sensitive input devices
US20060161870A1 (en) * 2004-07-30 2006-07-20 Apple Computer, Inc. Proximity detector in handheld device
US20060161871A1 (en) * 2004-07-30 2006-07-20 Apple Computer, Inc. Proximity detector in handheld device
US20060026521A1 (en) * 2004-07-30 2006-02-02 Apple Computer, Inc. Gestures for touch sensitive input devices
US20060026536A1 (en) * 2004-07-30 2006-02-02 Apple Computer, Inc. Gestures for touch sensitive input devices
US20060101005A1 (en) * 2004-10-12 2006-05-11 Yang Wendy W System and method for managing and presenting entity information
US20060095842A1 (en) * 2004-11-01 2006-05-04 Nokia Corporation Word completion dictionary
US20060112346A1 (en) * 2004-11-19 2006-05-25 Microsoft Corporation System and method for directional focus navigation

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100214250A1 (en) * 2001-05-16 2010-08-26 Synaptics Incorporated Touch screen with user interface enhancement
US8402372B2 (en) * 2001-05-16 2013-03-19 Synaptics Incorporated Touch screen with user interface enhancement
US8560947B2 (en) 2001-05-16 2013-10-15 Synaptics Incorporated Touch screen with user interface enhancement
US20080222530A1 (en) * 2007-03-06 2008-09-11 Microsoft Corporation Navigating user interface controls on a two-dimensional canvas
US20090259952A1 (en) * 2008-04-14 2009-10-15 Canon Kabushiki Kaisha Information processing apparatus and method of controlling same
US10254878B2 (en) 2009-04-30 2019-04-09 Synaptics Incorporated Operating a touch screen control system according to a plurality of rule sets
US9052764B2 (en) 2009-04-30 2015-06-09 Synaptics Incorporated Operating a touch screen control system according to a plurality of rule sets
US9304619B2 (en) 2009-04-30 2016-04-05 Synaptics Incorporated Operating a touch screen control system according to a plurality of rule sets
US9703411B2 (en) 2009-04-30 2017-07-11 Synaptics Incorporated Reduction in latency between user input and visual feedback
US8564555B2 (en) 2009-04-30 2013-10-22 Synaptics Incorporated Operating a touch screen control system according to a plurality of rule sets
US9563337B2 (en) 2011-03-23 2017-02-07 Nec Corporation Information processing device, method for controlling an information processing device, and program
US10812957B2 (en) 2014-09-10 2020-10-20 Lego A/S Method for establishing a wireless connection between electronic devices
US11262899B2 (en) 2014-09-10 2022-03-01 Lego A/S Method for establishing a functional relationship between input and output functions
US20170102847A1 (en) * 2015-10-12 2017-04-13 Microsoft Technology Licensing, Llc Directional navigation of graphical user interface
US10592070B2 (en) * 2015-10-12 2020-03-17 Microsoft Technology Licensing, Llc User interface directional navigation using focus maps
US20170185246A1 (en) * 2015-12-24 2017-06-29 Samsung Electronics Co., Ltd. Image display apparatus and method of displaying image

Also Published As

Publication number Publication date
US20060262136A1 (en) 2006-11-23
US20060265653A1 (en) 2006-11-23
US9785329B2 (en) 2017-10-10
US20070120832A1 (en) 2007-05-31
US9448711B2 (en) 2016-09-20
JP2008542868A (en) 2008-11-27

Similar Documents

Publication Title
US20060262146A1 (en) Mobile communication terminal and method
KR101025259B1 (en) Improved pocket computer and associated methods
US8621378B2 (en) Mobile terminal device and display control method
KR101720849B1 (en) Touch screen hover input handling
JP5639158B2 (en) Organizing content columns
US7272790B2 (en) Method and device for automatically selecting a frame for display
US20070024646A1 (en) Portable electronic apparatus and associated method
JP2009266220A (en) Method and device for operating graphic menu bar and recording medium using the same
US7761807B2 (en) Portable electronic device and method for displaying large format data files
CN104965649B (en) Content display method and device and terminal
EP1855185A2 (en) Method of displaying text using mobile terminal
US8704782B2 (en) Electronic device, method for viewing desktop thereof, and computer-readable medium
KR20100029191A (en) Visual feedback display
US20100107066A1 (en) scrolling for a touch based graphical user interface
CN1288539C (en) Method and device for navigating inside an image
JP2011060301A (en) Improved mobile communications terminal and method
JP2006505025A (en) Graphical user interface for expandable menus
US20080172618A1 (en) Navigation system of web browser and method thereof
US20080163065A1 (en) Using a light source to indicate navigation spots on a web page
CN108491152B (en) Touch screen terminal control method, terminal and medium based on virtual cursor
JP2012174247A (en) Mobile electronic device, contact operation control method, and contact operation control program
US20080100585A1 (en) Mobile communication terminal
CN113311982A (en) Information selection method and device
CA2619260C (en) Portable electronic device and method for displaying large format data files
KR20080083996A (en) Mobile communication device and control method thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: NOKIA CORPORATION, FINLAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KOIVISTO, ANTTI J.;POPESCU, ANDREI;LIU, WEI;AND OTHERS;REEL/FRAME:017328/0846;SIGNING DATES FROM 20050921 TO 20051206

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION