US9619139B2 - Device, method, and storage medium storing program

Device, method, and storage medium storing program

Info

Publication number
US9619139B2
Authority
US
United States
Prior art keywords
icon
application
icons
gesture
sub
Legal status
Expired - Fee Related
Application number
US13/633,934
Other versions
US20130082965A1 (en)
Inventor
Yuuki Wada
Katsuaki Oonishi
Current Assignee
Kyocera Corp
Original Assignee
Kyocera Corp
Application filed by Kyocera Corp
Assigned to KYOCERA CORPORATION. Assignors: OONISHI, KATSUAKI; WADA, YUUKI
Publication of US20130082965A1
Application granted
Publication of US9619139B2

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72427User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations
    • H04M1/72544
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/66Substation equipment, e.g. for use by subscribers with means for preventing unauthorised or fraudulent calling
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72519
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/22Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Definitions

  • the present application relates to a device, a method, and a storage medium storing therein a program. More particularly, the present application relates to a device including a touch screen display, a method of controlling the device, and a storage medium storing therein a program for controlling the device.
  • a touch screen device having a touch screen display has been known.
  • the touch screen devices include, but are not limited to, a smartphone and a tablet.
  • the touch screen device detects a gesture of a finger, a pen, or a stylus pen through the touch screen display. Then, the touch screen device operates according to the detected gesture.
  • An example of the operation according to the detected gesture is described in, for example, International Publication Pamphlet No. 2008/086302.
  • the basic operation of the touch screen device is implemented by an operating system (OS) built into the device.
  • Examples of the OS built into the touch screen device include, but are not limited to, Android, BlackBerry OS, iOS, Symbian OS, and Windows Phone.
  • touch screen devices have a lock function so as to prevent erroneous operations or the like.
  • the touch screen device executes a lock function to display a lock screen on the touch screen display during a locked state. On the lock screen, operations other than a set operation are invalidated. Therefore, the touch screen device can prevent erroneous operations during the locked state by executing the lock function.
  • the touch screen device unlocks the locked state when an unlock operation is detected in the locked state. Therefore, when a user executes a desired application in the locked state, the user is required to input the unlock operation, select the application, and execute the selected application. Furthermore, when particular processing executable in the desired application is executed, the user is required to perform the unlock operation, select the desired application, and select particular processing from a menu or the like of the application. For example, in a case where the application is a mail application, the particular processing includes incoming mail check processing, new mail composition processing, outgoing mail check processing, and the like. As described above, the above-mentioned touch screen device has low operability and convenience in the locked state.
  • a device includes a touch screen display and a controller.
  • the touch screen display displays a lock screen including a first icon and a second icon.
  • the controller displays a sub icon associated with the second icon on the lock screen when a gesture in which the first icon and the second icon are superimposed is detected.
  • a method for controlling a device having a touch screen display. The method includes: displaying a lock screen, including a first icon and a second icon, on the touch screen display; and displaying a sub icon associated with the second icon when a gesture in which the first icon and the second icon are superimposed is detected.
  • a non-transitory storage medium stores therein a program.
  • when executed by a device having a touch screen display, the program causes the device to execute: displaying a lock screen, including a first icon and a second icon, on the touch screen display; and displaying a sub icon associated with the second icon when a gesture in which the first icon and the second icon are superimposed is detected.
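  • As a non-limiting editorial illustration of the control summarized above, the Kotlin sketch below models a lock screen that holds a first icon and second icons, and that displays the sub icons associated with a second icon when a gesture superimposing the first icon on that second icon is detected. The class, property, and icon names are assumptions introduced for this sketch and are not taken from the patent.

        // Hypothetical sketch of the claimed control; all names are illustrative assumptions.
        data class Icon(val id: String, val subIcons: List<Icon> = emptyList())

        class LockScreen(val firstIcon: Icon, val secondIcons: List<Icon>) {
            // Sub icons currently shown on the lock screen (empty until a superimposing gesture).
            val visibleSubIcons = mutableListOf<Icon>()

            // Called when a detected gesture superimposes the first icon on a second icon.
            fun onSuperimposed(target: Icon) {
                if (target in secondIcons && target.subIcons.isNotEmpty()) {
                    visibleSubIcons.clear()
                    visibleSubIcons.addAll(target.subIcons)  // display the associated sub icons
                }
            }
        }

        fun main() {
            val key = Icon("key")                                         // first icon
            val mail = Icon("mail",
                listOf(Icon("IN BOX"), Icon("COMPOSE"), Icon("OUT BOX"))) // second icon with sub icons
            val screen = LockScreen(key, listOf(Icon("phone"), mail))
            screen.onSuperimposed(mail)
            println(screen.visibleSubIcons.map { it.id })  // [IN BOX, COMPOSE, OUT BOX]
        }
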
  • FIG. 1 is a perspective view of a smartphone according to an embodiment
  • FIG. 2 is a front view of the smartphone
  • FIG. 3 is a back view of the smartphone
  • FIG. 4 is a diagram illustrating an example of a home screen
  • FIG. 5 is a block diagram of the smartphone
  • FIG. 6 is a diagram illustrating an example of a lock screen
  • FIG. 7 is a diagram illustrating an example of a control during displaying the lock screen
  • FIG. 8 is a diagram illustrating an example of a control during displaying the lock screen
  • FIG. 9 is a diagram illustrating an example of an operation screen in a case where a text editor application is executed.
  • FIG. 10 is a diagram illustrating an example of a control during displaying the lock screen
  • FIG. 11 is a flowchart illustrating a procedure of a control that is performed in a locked state
  • FIG. 15 is a diagram illustrating an example of a control during displaying the lock screen
  • FIG. 16 is a flowchart illustrating a procedure of a control that is performed in a locked state
  • FIG. 17C is a diagram illustrating an example of the lock screen
  • FIG. 19 is a flowchart illustrating a procedure of a control for setting up display content of the lock screen
  • FIG. 20 is a flowchart illustrating a procedure of a control for setting up display content of the lock screen
  • FIG. 21 is a diagram illustrating an example of a control for setting up display content of the lock screen.
  • FIG. 22 is a flowchart illustrating a procedure of a control for setting up display content of the lock screen.
  • a smartphone will be explained below as an example of a device provided with a touch screen display.
  • the smartphone 1 includes a housing 20 .
  • the housing 20 includes a front face 1 A, a back face 1 B, and side faces 1 C 1 to 1 C 4 .
  • the front face 1 A is a front of the housing 20 .
  • the back face 1 B is a back of the housing 20 .
  • the side faces 1 C 1 to 1 C 4 are sides each connecting the front face 1 A and the back face 1 B.
  • the side faces 1 C 1 to 1 C 4 may be collectively called “side face 1 C” without being specific to any of the side faces.
  • the smartphone 1 includes a touch screen display 2 , buttons 3 A to 3 C, an illumination (ambient light) sensor 4 , a proximity sensor 5 , a receiver 7 , a microphone 8 , and a camera 12 , which are provided in the front face 1 A.
  • the smartphone 1 includes a camera 13 , which is provided in the back face 1 B.
  • the smartphone 1 includes buttons 3 D to 3 F and a connector 14 , which are provided in the side face 1 C.
  • the buttons 3 A to 3 F may be collectively called “button 3 ” without being specific to any of the buttons.
  • the touch screen display 2 includes a display 2 A and a touch screen 2 B.
  • each of the display 2 A and the touch screen 2 B is approximately rectangular-shaped; however, the shapes of the display 2 A and the touch screen 2 B are not limited thereto.
  • Each of the display 2 A and the touch screen 2 B may have any shape such as a square, a circle or the like.
  • the display 2 A and the touch screen 2 B are arranged in a superimposed manner; however, the manner in which the display 2 A and the touch screen 2 B are arranged is not limited thereto.
  • the display 2 A and the touch screen 2 B may be arranged, for example, side by side or apart from each other.
  • In the illustrated example, longer sides of the display 2 A are aligned with longer sides of the touch screen 2 B, and shorter sides of the display 2 A are aligned with shorter sides of the touch screen 2 B; however, the manner in which the display 2 A and the touch screen 2 B are superimposed is not limited thereto. In a case where the display 2 A and the touch screen 2 B are arranged in the superimposed manner, they can be arranged such that, for example, one or more sides of the display 2 A are not aligned with any sides of the touch screen 2 B.
  • the display 2 A is provided with a display device such as a liquid crystal display (LCD), an organic electroluminescence display (OELD), or an inorganic electroluminescence display (IELD).
  • the display 2 A displays text, images, symbols, graphics, and the like.
  • the touch screen 2 B detects a contact of a finger, a pen, a stylus pen, or the like on the touch screen 2 B.
  • the touch screen 2 B can detect positions where a plurality of fingers, pens, stylus pens, or the like make contact with the touch screen 2 B.
  • a finger, pen, stylus pen, and the like may be referred to as a “contact object” or an “object”.
  • the detection method of the touch screen 2 B may be any detection methods, including but not limited to, a capacitive type detection method, a resistive type detection method, a surface acoustic wave type (or ultrasonic type) detection method, an infrared type detection method, an electro magnetic induction type detection method, and a load sensing type detection method.
  • the smartphone 1 determines a type of a gesture based on at least one of a contact detected by the touch screen 2 B, a position where the contact is detected, a change of a position where the contact is detected, an interval between detected contacts, and the number of detection times of the contact.
  • the gesture is an operation performed on the touch screen 2 B. Examples of the gestures determined by the smartphone 1 include, but are not limited to, touch, long touch, release, swipe, tap, double tap, long tap, drag, flick, pinch in, and pinch out.
  • “Touch” is a gesture in which a finger makes contact with the touch screen 2 B.
  • the smartphone 1 determines a gesture in which the finger makes contact with the touch screen 2 B as touch.
  • “Long touch” is a gesture in which a finger makes contact with the touch screen 2 B for longer than a given time.
  • the smartphone 1 determines a gesture in which the finger makes contact with the touch screen 2 B for longer than a given time as long touch.
  • “Release” is a gesture in which a finger separates from the touch screen 2 B.
  • the smartphone 1 determines a gesture in which the finger separates from the touch screen 2 B as release.
  • “Swipe” is a gesture in which a finger moves on the touch screen 2 B with continuous contact thereon.
  • the smartphone 1 determines a gesture in which the finger moves on the touch screen 2 B with continuous contact thereon as swipe.
  • “Tap” is a gesture in which a touch is followed by a release.
  • the smartphone 1 determines a gesture in which a touch is followed by a release as tap.
  • “Double tap” is a gesture such that a gesture in which a touch is followed by a release is successively performed twice.
  • the smartphone 1 determines a gesture such that a gesture in which a touch is followed by a release is successively performed twice as double tap.
  • “Long tap” is a gesture in which a long touch is followed by a release.
  • the smartphone 1 determines a gesture in which a long touch is followed by a release as long tap.
  • “Drag” is a gesture in which a swipe is performed from an area where a movable-object is displayed.
  • the smartphone 1 determines a gesture in which a swipe is performed from an area where the movable-object is displayed as drag.
  • “Flick” is a gesture in which a finger separates from the touch screen 2 B while moving after making contact with the touch screen 2 B. That is, “Flick” is a gesture in which a touch is followed by a release accompanied with a movement of the finger.
  • the smartphone 1 determines a gesture in which the finger separates from the touch screen 2 B while moving after making contact with the touch screen 2 B as flick.
  • the flick is performed, in many cases, with a finger moving along one direction.
  • the flick includes “upward flick” in which the finger moves upward on the screen, “downward flick” in which the finger moves downward on the screen, “rightward flick” in which the finger moves rightward on the screen, and “leftward flick” in which the finger moves leftward on the screen, and the like. Movement of the finger during the flick is, in many cases, quicker than that of the finger during the swipe.
  • “Pinch in” is a gesture in which a swipe with a plurality of fingers is performed in a direction to move the fingers toward each other.
  • the smartphone 1 determines a gesture in which the distance between a position of one finger and a position of another finger detected by the touch screen 2 B becomes shorter as pinch in.
  • “Pinch out” is a gesture in which a swipe with a plurality of fingers is performed in a direction to move the fingers away from each other.
  • the smartphone 1 determines a gesture in which the distance between a position of one finger and a position of another finger detected by the touch screen 2 B becomes longer as pinch out.
  • a gesture performed by using a finger may be referred to as a “single touch gesture”, and a gesture performed by using a plurality of fingers may be referred to as a “multi touch gesture”.
  • Examples of the multi touch gesture include a pinch in and a pinch out.
  • a tap, a flick, a swipe, and the like are a single touch gesture when performed by using a finger, and are a multi touch gesture when performed by using a plurality of fingers.
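  • To make the gesture definitions above concrete, the Kotlin sketch below classifies a completed single-touch contact trajectory into touch-screen gestures. The duration and distance thresholds, the sample format, and the function names are editorial assumptions; an actual device would tune these values and also handle multi touch gestures such as pinch in and pinch out.

        import kotlin.math.sqrt

        // One detected contact point reported by the touch screen (illustrative format).
        data class TouchSample(val x: Float, val y: Float, val timeMs: Long)

        enum class Gesture { TAP, LONG_TAP, SWIPE, FLICK }

        // Classifies a completed single-touch trajectory; assumes at least one sample.
        fun classify(
            samples: List<TouchSample>,
            longTouchMs: Long = 500L,       // assumed threshold for "longer than a given time"
            moveThresholdPx: Float = 10f,   // assumed threshold separating taps from movement
            flickSpeedPxPerMs: Float = 1f   // assumed speed separating flick from swipe
        ): Gesture {
            val first = samples.first()
            val last = samples.last()
            val dx = last.x - first.x
            val dy = last.y - first.y
            val distance = sqrt(dx * dx + dy * dy)
            val durationMs = (last.timeMs - first.timeMs).coerceAtLeast(1L)
            return when {
                distance < moveThresholdPx && durationMs < longTouchMs -> Gesture.TAP
                distance < moveThresholdPx -> Gesture.LONG_TAP
                distance / durationMs >= flickSpeedPxPerMs -> Gesture.FLICK  // quick movement before release
                else -> Gesture.SWIPE                                        // slower continuous movement
            }
        }
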
  • FIG. 4 represents an example of a home screen.
  • the home screen may also be called “desktop”, “standby screen”, “idle screen”, or “standard screen”.
  • the home screen is displayed on the display 2 A.
  • the home screen is a screen allowing the user to select which one of applications (programs) installed in the smartphone 1 is executed.
  • the smartphone 1 executes the application selected on the home screen in the foreground.
  • the screen of the application executed in the foreground is displayed on the display 2 A in a different manner from that of the home screen.
  • the icons 50 include an image and a character string.
  • the icons 50 may contain a symbol or a graphic instead of an image.
  • the icons 50 do not have to include either one of the image and the character string.
  • the icons 50 are arranged based on a layout pattern.
  • a wall paper 41 is displayed behind the icons 50 .
  • the wall paper may sometimes be called “photo screen”, “back screen”, “idle image”, or “background image”.
  • the smartphone 1 can use an arbitrary image as the wall paper 41 .
  • the smartphone 1 may be configured so that the user can select an image to be displayed as the wall paper 41 .
  • the smartphone 1 displays an indicator (a locator) 51 on the home screen.
  • the indicator 51 includes one or more symbols. The number of the symbols is the same as that of the home screens.
  • a symbol corresponding to a home screen that is currently displayed is displayed in a different manner from that of symbols corresponding to the other home screens.
  • the indicator 51 in an example illustrated in FIG. 4 includes four symbols. This means the number of home screens is four. According to the indicator 51 in the example illustrated in FIG. 4 , the second symbol from the left is displayed in a different manner from that of the other symbols. This means that the second home screen from the left is currently displayed.
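  • As a small illustration of the indicator 51 described above, the Kotlin sketch below renders one symbol per home screen and shows the currently displayed screen in a different manner. The symbols and the function name are assumptions made for this sketch.

        // One symbol per home screen; the current screen's symbol is rendered differently.
        fun renderIndicator(totalHomeScreens: Int, currentIndex: Int): String =
            (0 until totalHomeScreens).joinToString(" ") { if (it == currentIndex) "●" else "○" }

        fun main() {
            // Four home screens, second from the left currently displayed (as in FIG. 4).
            println(renderIndicator(4, 1))  // ○ ● ○ ○
        }
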
  • the smartphone 1 changes the home screen to be displayed on the display 2 A from a first home screen to a second home screen, when a gesture is detected while displaying the first home screen, such that the area of the first home screen displayed on the display 2 A gradually becomes smaller and the area of the second home screen displayed gradually becomes larger.
  • the smartphone 1 may switch the home screens such that the first home screen is instantly replaced by the second home screen.
  • the touch screen display 2 includes, as explained above, the display 2 A and the touch screen 2 B.
  • the display 2 A displays text, images, symbols, graphics, or the like.
  • the touch screen 2 B detects contact(s).
  • the controller 10 detects a gesture performed for the smartphone 1 . Specifically, the controller 10 detects an operation (a gesture) for the touch screen 2 B in cooperation with the touch screen 2 B.
  • the button 3 is operated by the user.
  • the button 3 includes buttons 3 A to 3 F.
  • the controller 10 detects an operation for the button 3 in cooperation with the button 3 . Examples of the operations for the button 3 include, but are not limited to, a click, a double click, a triple click, a push, and a multi-push.
  • the illumination sensor 4 detects illumination of the ambient light of the smartphone 1 .
  • the illumination indicates intensity of light, lightness, or brightness.
  • the illumination sensor 4 is used, for example, to adjust the brightness of the display 2 A.
  • the proximity sensor 5 detects the presence of a nearby object without any physical contact.
  • the proximity sensor 5 detects the presence of the object based on a change of the magnetic field, a change of the return time of the reflected ultrasonic wave, etc.
  • the proximity sensor 5 detects that, for example, the touch screen display 2 is brought close to someone's face.
  • the illumination sensor 4 and the proximity sensor 5 may be configured as one sensor.
  • the illumination sensor 4 can be used as a proximity sensor.
  • the communication unit 6 performs communication via radio waves.
  • a communication system supported by the communication unit 6 is a wireless communication standard.
  • the wireless communication standard includes, for example, a communication standard of cellular phones such as 2G, 3G, and 4G.
  • the communication standard of cellular phones includes, for example, Long Term Evolution (LTE), Wideband Code Division Multiple Access (W-CDMA), CDMA 2000, a Personal Digital Cellular (PDC), a Global System for Mobile Communications (GSM), and a Personal Handy-phone System (PHS).
  • the wireless communication standard further includes, for example, Worldwide Interoperability for Microwave Access (WiMAX), IEEE 802.11, Bluetooth, Infrared Data Association (IrDA), and Near Field Communication (NFC).
  • the communication unit 6 may support one or more communication standards.
  • the storage 9 stores therein programs and data.
  • the storage 9 is used also as a work area that temporarily stores a processing result of the controller 10 .
  • the storage 9 may include any non-transitory storage medium such as a semiconductor storage medium and a magnetic storage medium.
  • the storage 9 may include a plurality of types of storage mediums.
  • the storage 9 may include a combination of a portable storage medium such as a memory card, an optical disc, or a magneto-optical disc with a reader of the storage medium.
  • the storage 9 may include a storage device used as a temporary storage area such as Random Access Memory (RAM).
  • the storage 9 stores therein, for example, a control program 9 A, a mail application 9 B, a browser application 9 C, and setting data 9 Z.
  • the mail application 9 B provides an e-mail function for composing, transmitting, receiving, and displaying e-mail, and the like.
  • the browser application 9 C provides a WEB browsing function for displaying WEB pages.
  • the setting data 9 Z contains information related to various settings on the operations of a smartphone 1 .
  • the controller 10 executes instructions contained in the program stored in the storage 9 while referring to the data stored in the storage 9 as necessary.
  • the controller 10 controls a function unit according to the data and the instructions to thereby implement the various functions.
  • Examples of the function units include, but are not limited to, the display 2 A, the communication unit 6 , and the receiver 7 .
  • the controller 10 can change the control of the function unit according to the detection result of a detector. Examples of the detectors include, but are not limited to, the touch screen 2 B, the button 3 , the illumination sensor 4 , the proximity sensor 5 , the microphone 8 , the camera 12 , the camera 13 , the acceleration sensor 15 , the direction sensor 16 , and the gyroscope 17 .
  • the controller 10 executes, for example, the control program 9 A to perform various controls, such as a control for changing information displayed on the display 2 A in accordance with the gesture detected through the touch screen 2 B.
  • the camera 12 is an in-camera for photographing an object facing the front face 1 A.
  • the camera 13 is an out-camera for photographing an object facing the back face 1 B.
  • the connector 14 is a terminal to which another device is connected.
  • the connector 14 may be a general-purpose terminal such as a Universal Serial Bus (USB), a High-Definition Multimedia Interface (HDMI), Light Peak (Thunderbolt), and an earphone/microphone connector.
  • the connector 14 may be a dedicated terminal such as a dock connector. Examples of the devices connected to the connector 14 include, but are not limited to, an external storage device, a speaker, and a communication device.
  • the acceleration sensor 15 detects a direction and a magnitude of acceleration applied to the smartphone 1 .
  • the direction sensor 16 detects a direction of geomagnetism.
  • the gyroscope 17 detects an angle and an angular velocity of the smartphone 1 . The detection results of the acceleration sensor 15 , the direction sensor 16 , and the gyroscope 17 are used in combination with each other in order to detect a position of the smartphone 1 and a change of its attitude.
  • Part or all of the programs and the data stored in the storage 9 in FIG. 5 may be downloaded from any other device through communication by the communication unit 6 .
  • Part or all of the programs and the data stored in the storage 9 in FIG. 5 may be stored in the non-transitory storage medium that can be read by the reader included in the storage 9 .
  • Part or all of the programs and the data stored in the storage 9 in FIG. 5 may be stored in the non-transitory storage medium that can be read by a reader connected to the connector 14 .
  • Examples of the non-transitory storage mediums include, but are not limited to, an optical disc such as a CD, a DVD, and a Blu-ray disc, a magneto-optical disc, a magnetic storage medium, a memory card, and a solid-state storage medium.
  • the smartphone 1 is provided with three types of sensors in order to detect its position and attitude; however, the smartphone 1 does not have to be provided with some of the sensors. Alternatively, the smartphone 1 may be provided with any other type of sensor for detecting at least one of the position and the attitude.
  • the function provided by the control program 9 A includes a function for changing the screen displayed on the display 2 A according to gestures detected through the touch screen 2 B while the locked state is set.
  • a control performed according to user instructions while the locked state is set will be described.
  • FIG. 6 illustrates an example of the lock screen.
  • a lock screen 60 is a screen representing that the locked state is set.
  • the lock screen 60 is a screen that transitions to another screen when a preset unlock gesture is detected.
  • the lock screen 60 is a screen on which gestures other than a preset gesture are invalidated.
  • the smartphone 1 is in a state in which it cannot perform various operations until a particular gesture is detected on the lock screen.
  • the wall paper 61 is displayed behind the date/time image 62 , the key icon 64 , the ring 66 , the application icons 68 a , 68 b , 68 c and 68 d , and the home icon 69 .
  • the key icon 64 is a lock-shaped image and is displayed in a substantially central portion of the screen.
  • the key icon 64 is an object that a user drags through an unlock gesture and a gesture for executing each application.
  • the smartphone 1 moves a display position of the key icon 64 according to a movement of a contact position of the swipe.
  • the application icons 68 a , 68 b , 68 c and 68 d are displayed separately on the ring 66 .
  • the application icons 68 a , 68 b , 68 c and 68 d are arranged on the ring 66 in this order in the clockwise direction.
  • Each of the application icons 68 a , 68 b , 68 c and 68 d is associated with a particular application installed in the smartphone 1 .
  • the smartphone 1 executes the application associated with the application icon for which the particular gesture is performed. The particular gesture will be described below.
  • the application icon 68 a is associated with a phone application.
  • the application icon 68 b is associated with a mail application.
  • the application icon 68 c is associated with an SMS application.
  • the application icon 68 d is associated with a text editor application.
  • Each of the application icons 68 a , 68 b , 68 c and 68 d includes an image that represents the associated application.
  • Each of the application icons 68 a , 68 b , 68 c and 68 d may contain an image and a character like the icon 50 , and may contain a symbol or a graphic instead of an image.
  • Each of the application icons 68 a , 68 b , 68 c and 68 d may contain only a character string, without containing an image.
  • the home icon 69 is displayed in an area at the lower end side of the lock screen 60 , outside the ring 66 .
  • the home icon 69 is an icon that is associated with execution of unlock processing and processing of moving to the home screen 40 .
  • the smartphone 1 When detecting a particular gesture for the home icon 69 , the smartphone 1 unlocks the locked state and displays the home screen 40 on the display 2 A. The particular gesture will be described below.
  • At Step S 1 illustrated in FIG. 7 , the lock screen 60 is displayed on the display 2 A.
  • the user's finger F touches the key icon 64 .
  • the smartphone 1 detects a touch on a portion where the key icon 64 is arranged.
  • the user's finger F drops the key icon 64 on the home icon 69 . That is, the user uses his/her finger F to touch the area where the key icon 64 is displayed at Step S 1 , drags the key icon 64 along a path indicated by an arrow α1 , and releases the key icon 64 in the area where the home icon 69 is displayed.
  • the smartphone 1 detects a swipe, of which the start point is the portion where the key icon 64 is arranged and the end point is the portion where the home icon 69 is arranged. That is, the smartphone 1 detects a drop of the key icon 64 on the home icon 69 . When detecting such a drop, the smartphone 1 unlocks the locked state and displays the home screen 40 on the touch screen display 2 .
  • the drop of the key icon 64 on the home icon 69 is set as the particular gesture for the home icon 69 .
  • the smartphone 1 may display the key icon 64 at the position where the contact is detected by the swipe. That is, when detecting the swipe of which the start point is the key icon 64 , the smartphone 1 may display the key icon 64 during the swipe while moving according to the movement of the finger F.
  • the smartphone 1 When detecting a gesture of dropping the key icon 64 on the outer area (the second area) of the ring 66 , the smartphone 1 also unlocks the locked state and displays the home screen 40 on the display 2 A. That is, by dropping key icon 64 on the second area, the user can unlock the locked state and allow the home screen 40 to be displayed.
  • the smartphone 1 When detecting a gesture of dropping the key icon 64 on the inner area (the first area) of the ring 66 , the smartphone 1 returns the key icon 64 to the initial position. That is, when the key icon 64 is not dropped on either of the second area and the application icons 68 a , 68 b , 68 c and 68 d , the smartphone 1 displays the key icon 64 at the central position of the ring 66 .
  • FIG. 8 describes a case where the particular gesture is performed with respect to the application icon 68 d so as to execute an application desired by a user.
  • the lock screen 60 is displayed on the display 2 A.
  • the user's finger F touches the key icon 64 .
  • the smartphone 1 detects a touch on a portion where the key icon 64 is arranged.
  • the user's finger F drops the key icon 64 on the application icon 68 d . That is, at Step S 4 , the user uses his/her finger F to touch the area where the key icon 64 is displayed, drags the key icon 64 along a path indicated by an arrow α2 , and releases the key icon 64 in the area where the application icon 68 d is displayed.
  • the smartphone 1 detects the swipe, of which the start point is the portion where the key icon 64 is arranged and the end point is the portion where the application icon 68 d is arranged. That is, the smartphone 1 detects the drop of the key icon 64 on the application icon 68 d .
  • the smartphone 1 When detecting such a drop, the smartphone 1 unlocks the locked state and executes the text editor application as the application associated with the application icon 68 d . Subsequently, the smartphone 1 displays an operation screen, which is displayed in a case the text editor application is executed as the application associated with the application icon 68 d , on the touch screen display 2 .
  • FIG. 9 illustrates an example of the operation screen in case the text editor application is executed.
  • the smartphone 1 displays an operation screen 80 illustrated in FIG. 9 on the touch screen display 2 .
  • the operation screen 80 illustrated in FIG. 9 includes a display area 82 for checking an input character string on a substantially entire area of the upper portion of the screen, a keyboard object 84 for executing the input of a character string on the lower portion of the screen, a memo list display button 86 for displaying a memo list registered by the text editor on the upper left side of the display area 82 , and an end button 88 for ending the processing of the text editor on the upper right side of the display area 82 .
  • the smartphone 1 In such a state that the operation screen 80 is displayed, when detecting a tap or a swipe with respect to the keyboard object 84 , the smartphone 1 detects a character corresponding to a tapped area or a swiped trajectory as an input character. The smartphone 1 displays the input character at a set position of the display area 82 . In such a state that the operation screen 80 is displayed, when detecting a tap with respect to the memo list display button 86 or the completion button 88 , the smartphone 1 executes the processing associated with the tapped button. In this manner, the smartphone 1 executes a variety of processing of the text editor application and detects the input of the text.
  • the smartphone 1 of the embodiment sets the drop of the key icon 64 on the application icons 68 a , 68 b , 68 c or 68 d as the particular gesture for executing the application associated with the application icon on which the key icon 64 is dropped.
  • the smartphone 1 of the embodiment sets the drop of the key icon 64 on the application icon, that is, the gesture of dropping the key icon 64 on the application icon, as the particular gesture for the application icon; however, the particular gesture is not limited thereto. For example, a gesture of flicking the key icon 64 toward the application icon may be set as the particular gesture for the application icon.
  • a gesture of tapping an application icon after tapping the key icon 64 may be set as the particular gesture for the application icon.
  • the particular gesture for the application icon may be a gesture of releasing the key icon 64 in such a state that the key icon 64 and the application icon are superimposed.
  • the above-described gestures correspond to the gesture of releasing the key icon 64 in such a state that the key icon 64 and the application icon are superimposed.
  • a gesture in which the touch on the application icon is the start point may be set as the particular gesture for executing the application associated with the application icon.
  • a gesture of dragging the application icon and dropping the application icon on the key icon may be set as the particular gesture for the application icon.
  • FIG. 10 describes a particular gesture for executing particular processing that is executable in an application desired by a user.
  • the lock screen 60 is displayed on the display 2 A.
  • the user's finger F touches the key icon 64 .
  • the smartphone 1 detects the touch in the portion where the key icon 64 is arranged.
  • the user's finger F moves the key icon 64 onto the application icon 68 b . That is, the finger F touches the area where the key icon 64 is displayed at Step S 5 , and drags the key icon 64 along a path indicated by an arrow α3 , so that the key icon 64 is moved to the area where the application icon 68 b is displayed.
  • the smartphone 1 detects the swipe, of which the start point is the portion where the key icon 64 is arranged and which moves to the portion where the application icon 68 b is arranged. That is, the smartphone 1 detects the gesture of superimposing the key icon 64 on the application icon 68 b .
  • the smartphone 1 When detecting the gesture of superimposing the key icon 64 on the application icon 68 b as the particular gesture for the application icon 68 b , the smartphone 1 displays sub icons 78 a , 78 b and 78 c associated with the application icon 68 b.
  • On the lock screen 60 displayed at Step S 6 of FIG. 10 , the date/time image 62 on the wall paper 61 , the key icon (first icon) 64 , the ring 66 , the application icons (second icons) 68 a , 68 b , 68 c and 68 d , the home icon 69 , the sub ring 76 , and the sub icons 78 a , 78 b and 78 c associated with the application icon 68 b are arranged.
  • the sub ring 76 is displayed at a location surrounding the outer periphery of the application icon 68 b .
  • the sub ring 76 has a circular frame shape.
  • the application icon 68 b is arranged in the central portion of the circular frame of the sub ring 76 .
  • the circular frame of the sub ring 76 is arranged with the same center as that of the application icon 68 b and has a shape with a larger diameter than that of the outer edge of the application icon 68 b .
  • the circular frame of the sub ring 76 is arranged apart from the outer edge of the application icon 68 b by more than a predetermined distance.
  • the sub ring 76 has a closed shape and becomes a boundary that divides the area of the lock screen 60 into two areas, that is, the inner area and the outer area of the sub ring 76 .
  • the sub icons 78 a , 78 b and 78 c are displayed separately on the sub ring 76 .
  • the sub icons 78 a , 78 b and 78 c are arranged on the sub ring 76 in this order in the clockwise direction.
  • Each of the sub icons 78 a , 78 b and 78 c is associated with particular processing that is executable in the application associated with the application icon 68 b .
  • the smartphone 1 executes the particular processing associated with the sub icon for which the particular gesture is performed. The particular gesture will be described below.
  • the sub icon 78 a is associated with incoming mail check processing that is executable in the mail application.
  • the sub icon 78 b is associated with new mail composition processing that is executable in the mail application.
  • the sub icon 78 c is associated with outgoing mail check processing that is executable in the mail application.
  • Each of the sub icons 78 a , 78 b and 78 c includes an image that represents the associated particular processing.
  • Each of the sub icons 78 a , 78 b and 78 c may contain an image or a character, or may contain a symbol or a graphic instead of an image.
  • Each of the sub icons 78 a , 78 b and 78 c may contain only a character string, without containing an image.
  • the sub icon 78 a contains a character string of “IN BOX” representing that the sub icon 78 a is associated with the incoming mail check processing.
  • the sub icon 78 b contains a character string of “COMPOSE” representing that the sub icon 78 b is associated with the new mail composition processing.
  • the sub icon 78 c contains a character string of “OUT BOX” representing that the sub icon 78 c is associated with the outgoing mail check processing.
  • the user's finger F moves the key icon 64 onto the sub icon 78 b . That is, the user swipes his/her finger F, which touches the area where the application icon 68 b is displayed at Step S 6 , along a path indicated by an arrow α4 , and moves the finger F to the area where the sub icon 78 b is displayed.
  • the smartphone 1 detects that the swipe, which reached the portion where the application icon 68 b is arranged at Step S 6 , continues to an end point in the portion where the sub icon 78 b is arranged. That is, the smartphone 1 detects the gesture of releasing the key icon 64 superimposed on the sub icon 78 b .
  • the smartphone 1 executes the particular processing associated with the sub icon 78 b , that is, the new mail composition processing. That is, when detecting the gesture of superimposing the key icon 64 on the sub icon 78 b , the smartphone 1 displays the new mail composition screen of the mail application.
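  • The relationship between the sub icons 78 a , 78 b and 78 c and the particular processing of the mail application, as described above, can be sketched as follows in Kotlin; the labels follow the description, while the processing identifiers are assumptions.

        // Sub icons of the mail application icon 68b and the processing each one starts.
        enum class MailSubIcon(val label: String) {
            IN_BOX("IN BOX"),      // sub icon 78a
            COMPOSE("COMPOSE"),    // sub icon 78b
            OUT_BOX("OUT BOX")     // sub icon 78c
        }

        // Executed when the key icon is released while superimposed on a sub icon.
        fun onKeyIconReleasedOn(subIcon: MailSubIcon): String = when (subIcon) {
            MailSubIcon.IN_BOX  -> "incoming mail check processing"
            MailSubIcon.COMPOSE -> "new mail composition processing"   // opens the composition screen
            MailSubIcon.OUT_BOX -> "outgoing mail check processing"
        }
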
  • the processing of displaying the sub icons 78 a , 78 b and 78 c in the example of the application icon 68 b associated with the mail application has been described; however, the application icon associated with the sub icon is not limited thereto.
  • the smartphone 1 may display the sub icons associated with the application icon 68 a .
  • the sub icons associated with the phone may include, but are not limited to, sub icons that are short-cuts of missed call check processing, new call start processing, and outgoing call history check processing.
  • the smartphone 1 may display the sub icons associated with the application icon 68 c , as in the case of the mail application.
  • the sub icons associated with the mail may include, but are not limited to, sub icons that are short-cuts of incoming mail check processing, new mail composition processing, and outgoing mail check processing.
  • the smartphone 1 can execute the mail application associated with the application icon 68 b .
  • the smartphone 1 may delete the displayed sub icons 78 a , 78 b and 78 c .
  • the smartphone 1 may execute the application associated with the application icon 68 b located at the center of the sub ring 76 .
  • a gesture in which the touch on the key icon is the start point may be set as the particular gesture for executing the particular processing associated with the sub icon.
  • the gesture of dragging the sub icon and dropping the sub icon on the key icon may be set as the particular gesture for the sub icon.
  • FIG. 11 illustrates the procedure of the control that is performed in the locked state, in particular, the control that is performed during displaying the lock screen.
  • the procedure illustrated in FIG. 11 is realized by the controller 10 executing the control program 9 A.
  • the procedure illustrated in FIG. 11 is executed in a case where the locked state is set and an operation of displaying a screen on the display 2 A is detected.
  • the case where the operation of displaying the screen on the display 2 A is detected is, for example, a case where a screen return operation is detected in such a state that a power-saving mode is set so that the screen is not displayed on the touch screen display 2 .
  • the controller 10 may execute another procedure for control related to the lock screen 60 in parallel with the procedure illustrated in FIG. 11 .
  • the controller 10 displays the lock screen on the touch screen display 2 at Step S 10 .
  • the controller 10 determines at Step S 12 whether a gesture has been detected.
  • the controller 10 obtains the detection result of the touch screen 2 B, and determines whether a gesture is detected, based on the obtained detection result.
  • the controller 10 determines at Step S 14 whether threshold value time ≤ waiting time is satisfied. That is, the controller 10 determines whether the waiting time defined as the elapsed time after the completion of the latest operation is equal to or greater than the predetermined threshold value time.
  • When it is determined at Step S 14 that threshold value time ≤ waiting time is not satisfied (No at Step S 14 ), that is, when it is determined that threshold value time > waiting time is satisfied, the controller 10 proceeds to Step S 12 and determines again whether there is a gesture. When it is determined at Step S 14 that threshold value time ≤ waiting time is satisfied (Yes at Step S 14 ), the controller 10 shifts to the power-saving mode at Step S 16 and ends the processing. That is, the controller 10 changes to a state in which the lock screen is not displayed, by turning off the touch screen display 2 , and ends the processing.
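  • The waiting-time check at Steps S 12 to S 16 amounts to the comparison sketched below in Kotlin; the threshold value used here is an arbitrary placeholder, not a value from the patent.

        // Returns true when the elapsed time since the latest operation has reached the threshold,
        // i.e. threshold value time <= waiting time, so the device shifts to the power-saving mode.
        fun shouldEnterPowerSavingMode(lastOperationMs: Long, nowMs: Long, thresholdMs: Long = 30_000L): Boolean {
            val waitingTime = nowMs - lastOperationMs
            return waitingTime >= thresholdMs
        }
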
  • the controller 10 determines, at Step S 18 , whether the gesture is a gesture of touching the key icon. That is, the controller 10 determines whether the gesture detected at Step S 12 is a gesture of touching the key icon. When it is determined at Step S 18 that the detected gesture is not the gesture of touching the key icon (No at Step S 18 ), the controller 10 executes the process corresponding to the detected gesture at Step S 20 , and proceeds to Step S 12 .
  • Examples of the process corresponding to the detected gesture include, but are not limited to, processing of displaying the sub icons associated with the application icon and processing of moving the positions of the application icons displayed on the lock screen. The processing of displaying the sub icons and the processing of moving the positions of the application icons will be described below.
  • the process corresponding to the detected gesture may be processing of displaying screens displayable on the lock screen, for example, a help screen or an emergency notice screen.
  • When it is determined at Step S 22 that the key icon is released on the second area or on the home icon (Yes at Step S 22 ), the controller 10 executes unlock processing at Step S 24 , and displays the home screen on the touch screen display 2 at Step S 26 . When the home screen is displayed at Step S 26 , the controller 10 ends the processing.
  • When it is determined at Step S 28 that there are sub icons associated with the application icon (Yes at Step S 28 ), the controller 10 displays the sub icons on the lock screen at Step S 29 .
  • For example, when detecting the particular gesture for the application icon 68 b (in FIG. 10 , the gesture of moving the key icon 64 onto the application icon 68 b ), the controller 10 displays the sub icons 78 a , 78 b and 78 c on the lock screen 60 . Subsequently, the controller 10 proceeds to processing of Step S 30 .
  • When it is determined at Step S 28 that there are no sub icons associated with the application icon (No at Step S 28 ), the controller 10 proceeds to the processing of Step S 30 .
  • At Step S 30 , the controller 10 determines whether the touch is released on the application icon. That is, the controller 10 determines whether the touch on the key icon, which is detected at Step S 18 , is swiped and then released so that the corresponding released position (position where the key icon is dropped) is on the application icon.
  • When it is determined at Step S 30 that the key icon is released on the application icon (Yes at Step S 30 ), the controller 10 executes unlock processing at Step S 31 , executes the application corresponding to the application icon located at the dropped position at Step S 32 , and displays the screen of the executed application on the touch screen display 2 at Step S 33 .
  • the controller 10 executes the text editor application associated with the application icon 68 d .
  • the controller 10 ends the processing.
  • At Step S 34 , the controller 10 determines whether the key icon is released on the sub icon. That is, the controller 10 determines whether the touch on the key icon, which is detected at Step S 18 , is swiped and then released so that the corresponding released position (position where the key icon is dropped) is on the sub icon.
  • When it is determined at Step S 34 that the key icon is released on the sub icon (Yes at Step S 34 ), the controller 10 executes unlock processing at Step S 35 , and executes particular processing corresponding to the sub icon located at the dropped position at Step S 36 . That is, the controller 10 displays the operation screen for executing the particular processing corresponding to the sub icon. For example, as illustrated at Step S 7 of FIG. 10 , when detecting the release of the key icon 64 superimposed on the sub icon 78 b , the controller 10 executes the new mail composition processing associated with the sub icon 78 b . That is, the controller 10 displays the new mail composition screen for executing the new mail composition processing associated with the sub icon 78 b . When the operation screen for executing the particular processing of the application is displayed at Step S 36 , the controller 10 ends the processing.
  • When it is determined at Step S 27 that there is no gesture of moving the key icon onto the application icon (No at Step S 27 ), or when it is determined at Step S 34 that the key icon is not released on the sub icon (No at Step S 34 ), the controller 10 determines at Step S 37 whether the key icon is released in the first area. That is, the controller 10 determines whether the touch on the key icon, which is detected at Step S 18 , is swiped and then released so that the corresponding released position (position where the key icon is dropped) is in the first area.
  • When it is determined at Step S 37 that the key icon is released in the first area (Yes at Step S 37 ), the controller 10 moves the key icon to the initial position at Step S 38 , and proceeds to Step S 12 . In a case where the sub icon is displayed at Step S 29 , the controller 10 deletes the sub icon. When it is determined at Step S 37 that the key icon is not released in the first area (No at Step S 37 ), the controller 10 proceeds to Step S 22 .
  • When detecting the touch on the key icon at Step S 18 , the controller 10 repeats the processing of Step S 22 , Step S 27 , Step S 30 , Step S 34 , and Step S 37 until the release of the touch gesture, that is, the drop of the key icon, is detected at Step S 22 , Step S 30 , or Step S 34 .
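  • The loop described above for FIG. 11 can be condensed into the Kotlin sketch below. The event and target names are editorial assumptions; the step numbers in the comments refer to the description above, and unlock processing is folded into the returned action for brevity.

        enum class ReleaseTarget { SECOND_AREA_OR_HOME_ICON, APPLICATION_ICON, SUB_ICON, FIRST_AREA, NONE }

        // One observation of the dragged key icon: whether it currently overlaps an application
        // icon that has sub icons, and where (if anywhere) the touch was released.
        data class DragEvent(val overAppIconWithSubIcons: Boolean, val releasedOn: ReleaseTarget)

        fun handleKeyIconDrag(events: Sequence<DragEvent>): String {
            for (event in events) {
                if (event.overAppIconWithSubIcons) showSubIcons()       // Steps S27 to S29
                when (event.releasedOn) {
                    ReleaseTarget.SECOND_AREA_OR_HOME_ICON -> return "unlock and display home screen"    // S22 to S26
                    ReleaseTarget.APPLICATION_ICON -> return "unlock and execute application"            // S30 to S33
                    ReleaseTarget.SUB_ICON -> return "unlock and execute particular processing"          // S34 to S36
                    ReleaseTarget.FIRST_AREA -> { resetKeyIcon(); return "keep displaying lock screen" } // S37, S38
                    ReleaseTarget.NONE -> { /* touch still held: keep monitoring */ }
                }
            }
            return "keep displaying lock screen"
        }

        fun showSubIcons() { /* display the sub icons associated with the application icon */ }
        fun resetKeyIcon() { /* move the key icon back to its initial position */ }
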
  • the smartphone 1 displays the sub icons when the particular gesture for the application icon (gesture of superimposing the key icon on the application icon in the embodiment) is detected. Therefore, according to the smartphone 1 of the embodiment, in the case of executing the particular processing that is executable in the desired application, the user does not need to perform three-stage operations, that is, the execution of the unlock processing, the selection of the desired application, and the selection of the particular processing from the application menu, or the like. That is, by just making the gesture of superimposing the key icon on the application icon on the lock screen, the user can display the operation screen for executing the particular processing of the desired application.
  • the user can smoothly select and execute, on the lock screen, the desired processing from among the incoming mail check processing, the new mail composition processing, the outgoing mail check processing, and the like, which can be executed in the mail application.
  • the smartphone 1 of the embodiment provides the user with high operability and high convenience in a state that the lock screen representing the locked state is displayed.
  • the smartphone 1 executes the application associated with the application icon. Therefore, the user can execute the desired application quickly from the locked state.
  • the smartphone 1 executes the unlock processing, the application selection processing, and the application execution processing.
  • the application icons displayed on the lock screen work as icons having a short-cut function. Therefore, the user can trigger these three processes through a single gesture, and can execute the desired application by a short-cut operation in which part of the operations, for example, the operation of performing the unlock and displaying the icon selection screen and the operation of selecting the icon on the icon selection screen, is omitted.
  • the smartphone 1 When detecting the drop of the key icon on the second area or the home icon, the smartphone 1 unlocks the locked state and displays the home screen. Therefore, the user can unlock the locked state through a simple operation.
  • the smartphone 1 may be configured not to display the home icon on the lock screen.
  • the smartphone 1 may execute processing different from that of the case where the drop of the key icon in the second area is detected. For example, when detecting the drop of the key icon on the home icon, the smartphone 1 unlocks the locked state and displays the home screen. On the other hand, when detecting the drop of the key icon in the second area, the smartphone 1 unlocks the locked state, and then displays the application if there is an application being already executed, and displays the home screen if there is no application being executed. In this manner, by changing the processing to be executed according to whether the position where the key icon is dropped is on the home icon or in the second area, the smartphone 1 can execute more types of processing quickly from the locked state.
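  • The variant just described, in which the home icon and the second area trigger different processing, can be sketched as follows in Kotlin; representing an already executed application by a nullable parameter is an assumption of this sketch.

        // Drop on the home icon: always unlock and show the home screen.
        // Drop in the second area: unlock, then resume a running application if one exists.
        fun onUnlockDrop(droppedOnHomeIcon: Boolean, runningApplication: String?): String = when {
            droppedOnHomeIcon -> "display home screen"
            runningApplication != null -> "display $runningApplication"
            else -> "display home screen"
        }
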
  • the smartphone 1 determines the detected gesture for the key icon in order of Step S 22 , Step S 27 , Step S 30 , Step S 34 , and Step S 37 , but the order of the determination is not specially limited.
  • the smartphone 1 may execute the determinations of Step S 22 , Step S 27 , Step S 30 , Step S 34 , and Step S 37 in any order.
  • When it is determined at Step S 37 that the key icon is released in the first area, the smartphone 1 deletes the displayed sub icons, but it is not limited thereto. After displaying the sub icons at Step S 29 , the smartphone 1 may delete the displayed sub icons based on the position of the key icon 64 .
  • For example, as illustrated at Step S 6 of FIG. 10 , when detecting the gesture of superimposing the key icon 64 and the application icon 68 b , the smartphone 1 displays the sub icons 78 a , 78 b and 78 c associated with the application icon 68 b ; after that, when detecting the gesture of moving the key icon 64 to the outer area of the sub ring 76 , the smartphone 1 may delete the displayed sub icons 78 a , 78 b and 78 c .
  • the smartphone 1 may delete the displayed sub icons 78 a , 78 b and 78 c . Therefore, the user can efficiently change the display of the sub icons associated with the respective application icons.
  • when it is determined at Step S 37 that the key icon is not released in the first area, the smartphone 1 proceeds to Step S 22 again to repeat the processing, but the control is not limited thereto.
  • the smartphone 1 may further determine whether there is a gesture of dropping the key icon in the inner area of the sub ring.
  • when detecting the drop of the key icon in the inner area of the sub ring, the smartphone 1 may execute the application associated with the application icon located at the center of the sub ring. That is, when the key icon 64 is not dropped in any of the first area, the second area, and the sub icons after the sub icons are displayed, the smartphone 1 may execute the application associated with the application icon located at the center of the sub ring.
  • the smartphone 1 executes the unlock processing, but it is not limited thereto.
  • the smartphone 1 may execute the application associated with the application icon, without executing the unlock processing.
  • FIG. 12 is a flowchart illustrating another example of the procedure of the control that is performed in the locked state.
  • FIG. 12 illustrates another example of the procedure that is performed when it is determined that the gesture of releasing the key icon on the application icon is detected, that is, the procedure that is performed after it is determined as Yes at Step S 30 of FIG. 11 .
  • the procedure illustrated in FIG. 12 is realized by the controller 10 executing the control program 9 A.
  • the controller 10 executes the application associated with the application icon located at the dropped position at Step S 32 , and displays the screen of the executed application on the touch screen display 2 at Step S 33 .
  • the controller 10 performs the processing of the executed application at Step S 39 . For example, when a gesture is detected, the controller 10 executes the processing associated with the detected gesture.
  • the controller 10 determines whether to end (terminate or suspend) the application at Step S 40 . For example, when detecting the gesture of ending the application, and when it is determined that a preset processing condition is satisfied, the controller 10 determines to terminate the application. When it is determined not to end the application (No at Step S 40 ), the controller 10 proceeds to Step S 39 and performs the processing of the application. When it is determined to end the application (Yes at Step S 40 ), the controller 10 displays the lock screen at Step S 41 and ends the processing. That is, when it is determined as Yes at Step S 40 , the controller 10 proceeds to Step S 10 in FIG. 11 and ends the processing illustrated in FIG. 12 .
  • the smartphone 1 executes the application associated with the application icon on which the key icon is superimposed, and displays the lock screen again when the application is ended.
  • for example, the smartphone 1 may execute a processing operation of executing the mail application when the key icon and the application icon associated with the mail application are superimposed, and returning to the lock screen when the mail is transmitted. Therefore, the user can use the application associated with the application icon displayed on the lock screen without the locked state being unlocked. That is, the user can execute the predetermined set processing without inputting the cumbersome unlock gesture.
  • the smartphone 1 executes the unlock processing, but it is not limited thereto.
  • the smartphone 1 may execute the particular processing associated with the sub icon, without executing the unlock processing. Subsequently, the smartphone 1 may display the lock screen again when the particular processing is ended.
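The FIG. 12 variation described above, in which the selected application runs while the locked state is kept and the lock screen returns once the application ends, can be summarized in a small loop. The sketch below is an assumption-laden illustration, not the patent's implementation; the App interface and the gesture strings are hypothetical.

    // Hypothetical sketch of the FIG. 12 style flow: run the application while
    // the device stays locked, then show the lock screen again when it ends.
    public class LockedAppSession {
        /** Hypothetical application interface; not part of the patent. */
        public interface App {
            void showScreen();                  // corresponds to Step S 33
            void handle(String gesture);        // corresponds to Step S 39
            boolean shouldEnd(String gesture);  // corresponds to the Step S 40 decision
        }

        private final Runnable showLockScreen;  // corresponds to Step S 41

        public LockedAppSession(Runnable showLockScreen) {
            this.showLockScreen = showLockScreen;
        }

        public void run(App app, Iterable<String> detectedGestures) {
            app.showScreen();
            for (String gesture : detectedGestures) {
                if (app.shouldEnd(gesture)) {   // Yes at Step S 40
                    break;
                }
                app.handle(gesture);            // No at Step S 40: keep processing
            }
            showLockScreen.run();               // back to the lock screen, still locked
        }
    }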
  • the smartphone 1 of the embodiment sets the gesture of superimposing the key icon 64 on an application icon (for example, the application icon 68 b) as the particular gesture for displaying the sub icons (for example, the sub icons 78 a, 78 b and 78 c) associated with the application icon.
  • however, the particular gesture for displaying the sub icons is not limited thereto.
  • a gesture of directly tapping the application icon, without the user operating the key icon 64, may be set as the particular gesture for displaying the sub icons associated with the application icon.
  • FIG. 13 illustrates an example of the control during displaying the lock screen.
  • the lock screen 60 is displayed on the display 2 A, and the user's finger F taps the application icon 68 b .
  • the smartphone 1 detects the tap in the portion where the application icon 68 b is arranged.
  • at Step S 9, the sub icons 78 a, 78 b and 78 c are displayed on the lock screen 60.
  • when detecting the gesture of tapping the application icon 68 b as the particular gesture for the application icon 68 b, the smartphone 1 displays the sub icons 78 a, 78 b and 78 c associated with the application icon 68 b.
  • the smartphone 1 of the embodiment sets the gesture of tapping an application icon as the particular gesture for displaying the sub icons associated with the application icon.
  • FIG. 14 illustrates the procedure of the control that is performed in the locked state, in particular, the control that is performed during displaying the lock screen.
  • the procedure illustrated in FIG. 14 is realized by the controller 10 executing the control program 9 A.
  • the procedure illustrated in FIG. 14 is performed as a part of the processing of Step S 20 .
  • the controller 10 may execute another procedure for control related to the lock screen 60 in parallel with the procedure illustrated in FIG. 14 .
  • the controller 10 determines at Step S 42 whether the detected gesture is a gesture of tapping the application icon. That is, as illustrated in FIG. 13 , the controller 10 determines whether a tap is detected in the portion where the application icon 68 b is arranged.
  • when it is determined at Step S 42 that the detected gesture is a tap on the application icon (Yes at Step S 42), the controller 10 determines at Step S 43 whether there are sub icons associated with the application icon.
  • when it is determined at Step S 43 that there are sub icons associated with the application icon (Yes at Step S 43), the controller 10 displays the sub icons on the lock screen at Step S 44.
  • the controller 10 displays the sub icons 78 a , 78 b and 78 c on the lock screen 60 . Then, the processing is ended.
  • otherwise, the controller 10 ends the processing.
  • the smartphone 1 displays the sub icons 78 a , 78 b and 78 c associated with the application icon 68 b . Therefore, the user can display the sub icons 78 a , 78 b and 78 c associated with the application icon 68 b on the lock screen by just tapping the application icon, without performing the operation of superimposing the key icon 64 on the application icon. Furthermore, the user can execute desired processing quickly from the lock screen by selecting a sub icon associated with the desired processing from among the displayed sub icons 78 a , 78 b and 78 c.
  • the smartphone 1 may display the sub icons associated with the respective application icons 68 a to 68 c displayed on the lock screen in response to other gestures. For example, when detecting a gesture such as a double tap or a long tap on the application icon 68 b, the smartphone 1 may display the sub icons associated with the application icon. In this case, after displaying the sub icons, when detecting the particular gesture (a tap or the like) on the application icon again, the smartphone 1 may delete the sub icons displayed on the lock screen.
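The tap-driven display of sub icons (Steps S 42 to S 44), together with the optional hide-on-second-tap behavior mentioned in the preceding bullet, can be modeled as a small lookup-and-toggle. The sketch below is illustrative only; the map keys and icon names are hypothetical.

    // Hypothetical sketch: tap on an application icon shows its sub icons
    // (Steps S 42 to S 44); a second tap on the same icon hides them again.
    import java.util.List;
    import java.util.Map;

    public class SubIconController {
        private final Map<String, List<String>> subIconsByAppIcon; // e.g. "mail" -> [check, compose, outbox]
        private String expandedAppIcon;                            // icon whose sub icons are currently shown

        public SubIconController(Map<String, List<String>> subIconsByAppIcon) {
            this.subIconsByAppIcon = subIconsByAppIcon;
        }

        /** Returns the sub icons to display after a tap, or an empty list when nothing is shown. */
        public List<String> onApplicationIconTapped(String appIcon) {
            if (appIcon.equals(expandedAppIcon)) {        // second tap on the same icon: hide
                expandedAppIcon = null;
                return List.of();
            }
            List<String> subIcons = subIconsByAppIcon.getOrDefault(appIcon, List.of()); // Step S 43
            expandedAppIcon = subIcons.isEmpty() ? null : appIcon;
            return subIcons;                              // Step S 44: display these on the lock screen
        }
    }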
  • the sub icons are associated with the particular processing executable in the mail application; however, a manner in which the sub icons are displayed is not limited thereto.
  • Examples of the particular processing executable in the mail application include incoming mail check processing, new mail composition processing, outgoing mail check processing, and the like.
  • the sub icon may be associated with an arbitrary application installed in the smartphone 1 .
  • the user can hierarchically organize the applications executable on the lock screen by associating frequently used applications with the application icons of an upper layer and associating occasionally used applications with the sub icons of a lower layer.
  • third icons of a lower layer may be associated with the sub icons.
  • the user can use groups of icons (that is, the application icon of the upper layer, the sub icons of the intermediate layer, and the third icons of the lower layer) that function as the short-cuts of the applications executable on the lock screen or the particular processing executable on the lock screen. As a result, the operability and convenience of the lock screen are further improved.
  • the application icons are associated with the sub icons, but may not be necessarily associated with the applications. Therefore, the user can use the application icons as folders for organizing the sub icons. That is, the user can organize the sub icons into desired categories by using the application icons of the upper layer displayed on the lock screen as the folders. For example, the user can set the image of the application icon displayed on the lock screen to a character string (for example, “amusement”, “work”, “private”, and the like) representing a category, and associate the sub icons with the application icon according to the intended category. As a result, the operability and convenience of the lock screen are further improved.
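The layered organization described above (application icons as an upper layer or folder, sub icons below them, and optional third icons below those) is essentially a small tree of icons with optional actions. The following sketch is one possible representation under that assumption; it is not taken from the patent.

    // Hypothetical sketch of the layered icon structure: an icon may carry an
    // action (application or particular processing) and/or children (sub icons,
    // third icons), so an upper-layer icon with children only acts as a folder.
    import java.util.ArrayList;
    import java.util.List;

    public class LockScreenIcon {
        private final String label;                  // e.g. "work", "Mail", "Compose"
        private final Runnable action;               // null when the icon is used purely as a folder
        private final List<LockScreenIcon> children = new ArrayList<>();

        public LockScreenIcon(String label, Runnable action) {
            this.label = label;
            this.action = action;
        }

        public LockScreenIcon add(LockScreenIcon child) {
            children.add(child);
            return this;
        }

        public boolean isFolder()              { return action == null && !children.isEmpty(); }
        public String label()                  { return label; }
        public List<LockScreenIcon> children() { return children; }
        public void execute()                  { if (action != null) action.run(); }
    }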
  • FIG. 15 illustrates an example of the control during displaying the lock screen.
  • the lock screen 60 is displayed on the display 2 A.
  • the user's finger F touches the ring 66 .
  • the smartphone 1 detects the touch in the portion where the ring 66 is arranged.
  • at Step S 52, the user swipes his/her finger F along the ring 66.
  • the smartphone 1 detects the swipe along the ring 66 .
  • the smartphone 1 rotates the application icons 68 a, 68 b, 68 c and 68 d arranged on the ring 66, based on a movement amount of the swipe on the ring 66 (in the embodiment, an angle of an arc of the swipe).
  • the smartphone 1 rotates the ring 66 and the application icons 68 a , 68 b , 68 c and 68 d by using the center of the ring 66 as the rotational axis.
  • the smartphone 1 may display as if only the application icons 68 a , 68 b , 68 c and 68 d are rotated.
  • at Step S 53, the user swipes his/her finger F along the ring 66.
  • the smartphone 1 detects the swipe along the ring 66 .
  • the smartphone 1 further rotates the application icons 68 a, 68 b, 68 c and 68 d arranged on the ring 66, based on a movement amount of the swipe (in the embodiment, an angle of an arc of the swipe on the ring 66).
  • FIG. 16 illustrates the procedure of the control that is performed in the locked state, in particular, the control that is performed during displaying the lock screen.
  • the procedure illustrated in FIG. 16 is realized by the controller 10 executing the control program 9 A.
  • the procedure illustrated in FIG. 16 is performed as a part of the processing of Step S 20 .
  • the controller 10 may execute another procedure for control related to the lock screen 60 in parallel with the procedure illustrated in FIG. 16 .
  • the controller 10 determines at Step S 60 whether the detected gesture is a swipe along the ring 66. That is, the controller 10 determines whether a gesture of touching the area where the ring 66 is displayed and swiping along the ring 66, as illustrated in FIG. 15, is detected.
  • when it is determined that the detected gesture is a swipe along the ring 66 (Yes at Step S 60), the controller 10 changes the display position of the application icons at Step S 62. That is, the controller 10 rotates the application icons 68 a, 68 b, 68 c and 68 d arranged on the ring 66, based on the swipe detected at Step S 60.
  • after changing the display position of the application icons, or when it is determined at Step S 60 that the detected gesture is not a swipe along the ring 66, the controller 10 ends the processing.
  • when detecting the gesture of swiping along the ring 66, the smartphone 1 rotates the application icons 68 a, 68 b, 68 c and 68 d arranged on the ring 66. Therefore, the user can easily adjust the positions of the application icons 68 a, 68 b, 68 c and 68 d on the lock screen, and move the desired application icon to a position where it is easy to drop the key icon 64.
  • the smartphone 1 may be configured to adjust the positions of the application icons 68 a, 68 b, 68 c and 68 d displayed on the lock screen by other gestures. For example, when detecting a swipe of which the start point is an application icon and the end point is an arbitrary position on the ring 66, that is, when detecting the drop of the application icon on the arbitrary position on the ring 66, the smartphone 1 may move the display position of the application icon to the dropped position. When detecting a swipe of which the start point is an application icon and the end point is another application icon on the ring 66, that is, when detecting the drop of the application icon on the other application icon, the smartphone 1 may exchange the display position of the application icon with the display position of the other application icon.
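Rotating the application icons by the arc swept along the ring 66 (FIGS. 15 and 16) amounts to tracking the angle of the contact point around the ring's center. The sketch below illustrates that geometry under assumed coordinates; it is not the patent's algorithm.

    // Hypothetical sketch: rotate the icons on the ring by the angle swept by a
    // swipe around the ring's center.
    public class RingRotation {
        private final double centerX, centerY;   // center of the ring (assumed coordinates)
        private double rotationRad;              // accumulated rotation applied to the icons

        public RingRotation(double centerX, double centerY) {
            this.centerX = centerX;
            this.centerY = centerY;
        }

        private double angleOf(double x, double y) {
            return Math.atan2(y - centerY, x - centerX);
        }

        /** Called for each swipe movement: add the arc swept between the two contact points. */
        public void onSwipe(double fromX, double fromY, double toX, double toY) {
            double delta = angleOf(toX, toY) - angleOf(fromX, fromY);
            // keep the delta in (-pi, pi] so crossing the +/-pi boundary does not cause a jump
            if (delta > Math.PI)   delta -= 2 * Math.PI;
            if (delta <= -Math.PI) delta += 2 * Math.PI;
            rotationRad += delta;
        }

        /** Display position of an icon whose resting angle on the ring is baseAngleRad. */
        public double[] iconPosition(double baseAngleRad, double radius) {
            double a = baseAngleRad + rotationRad;
            return new double[] { centerX + radius * Math.cos(a), centerY + radius * Math.sin(a) };
        }
    }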
  • FIGS. 17A to 17C illustrate examples of the respective lock screens.
  • the lock screen 60 displays four application icons 68 a, 68 b, 68 c and 68 d on the ring 66; however, the number of the application icons is not limited thereto.
  • the lock screen 60 a illustrated in FIG. 17A displays two application icons of the application icons 68 a and 68 b on the ring 66 .
  • the smartphone 1 may display the home icon 69 on the ring 66 .
  • the lock screen 60 c illustrated in FIG. 17C displays five application icons of the application icons 68 a , 68 b , 68 c , 68 d and 68 e and the home icon 69 on the ring 66 .
  • the home icon 69 is displayed between the application icon 68 a and the application icon 68 e.
  • the shape of the ring 66 of the lock screen is not limited to a circle.
  • the ring 66 has only to divide the lock screen into a first area, which includes the key icon 64 , and a second area, which does not include the key icon 64 .
  • the ring 66 has only to have a frame surrounding the outer edge of the key icon 64 .
  • the frame surrounding the outer edge of the key icon 64 may have various shapes, such as a polygon, an ellipse, and a combined shape of a curve and a straight line.
  • the ring 66 may be displayed on the lock screen. By displaying the ring 66 on the lock screen, the smartphone 1 can clearly indicate to the user the boundary between the first area and the second area. Therefore, the user can execute the desired processing more reliably.
  • by displaying the application icons on the ring 66 or the frame, the smartphone 1 can facilitate the input of the gesture of superimposing the application icon and the key icon, while preventing applications from being executed by erroneous operations. Therefore, it is preferable to display the application icons on the ring 66 or the frame, but the smartphone 1 may also display the application icons in an area that does not overlap the ring 66 or the frame.
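For the circular ring of the embodiment, distinguishing the first area (inside the frame surrounding the key icon) from the second area reduces to a containment test against the frame. The sketch below assumes a circular frame; for a polygonal or elliptical frame, only this containment check would change.

    // Hypothetical sketch: classifying a point into the first or second area
    // when the frame surrounding the key icon is a circle.
    public class RingAreaTest {
        private final double centerX, centerY, radius;   // geometry of the ring (assumed)

        public RingAreaTest(double centerX, double centerY, double radius) {
            this.centerX = centerX;
            this.centerY = centerY;
            this.radius = radius;
        }

        /** True when (x, y) lies in the first area, i.e. inside the frame containing the key icon. */
        public boolean isInFirstArea(double x, double y) {
            double dx = x - centerX, dy = y - centerY;
            return dx * dx + dy * dy <= radius * radius;
        }

        /** The second area is everything the frame does not enclose. */
        public boolean isInSecondArea(double x, double y) {
            return !isInFirstArea(x, y);
        }
    }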
  • FIG. 18 illustrates an example of an icon setting screen.
  • the icon setting screen 90 illustrated in FIG. 18 displays a plurality of items 92 , check boxes 94 corresponding to the items 92 , and a scroll bar 98 .
  • Check marks 96 are displayed in some of the check boxes 94 .
  • the check boxes 94 are square frames and are displayed on the left of the respective items 92 .
  • the check boxes 94 are display areas for representing whether the items 92 are selected.
  • the check mark 96 is an image for representing whether the item 92 corresponding to the check box 94 is selected as the item to be displayed as an application icon. As described above, in the check box 94, the check mark 96 is displayed when the item 92 is selected, and the check mark 96 is not displayed when the item 92 is unselected.
  • the scroll bar 98 is an image representing to which area of the entire icon setting screen 90 the area currently displayed on the display 2 A corresponds. When detecting an operation of moving an object 98 a representing a current position of the scroll bar 98 , the smartphone 1 scrolls the icon setting screen 90 displayed on the display 2 A, based on the detected operation.
  • the icon setting screen 90 is not limited thereto.
  • the icon setting screen 90 may use, as the items 92 , images of the application icons associated with the applications, or images of the icons displayed on the home screen.
  • FIG. 19 is a flowchart illustrating the procedure of the control for setting up the display content of the lock screen.
  • the procedure illustrated in FIG. 19 is realized by the controller 10 executing the control program 9 A.
  • the procedure illustrated in FIG. 19 is executed, for example, when the operation of executing the setting application of the application icon displayed on the lock screen is detected.
  • the controller 10 determines at Step S 74 whether the gesture is an item selection operation. That is, the controller 10 determines whether the gesture detected at Step S 72 is a gesture of selecting an item displayed on the icon setting screen.
  • the gesture of selecting the item displayed on the icon setting screen is a preset particular gesture selected among various gestures.
  • the gesture of selecting the item displayed on the icon setting screen may use a tap, a long tap, or a double tap with respect to the area where the item is displayed, and may use a tap, a long tap, a double tap, or the like with respect to the area where the check box corresponding to the item is displayed.
  • when it is determined at Step S 74 that the gesture is an item selection operation (Yes at Step S 74), the controller 10 determines at Step S 76 whether the targeted item is in a selected state. That is, the controller 10 detects the state of the item determined as selected at Step S 74, and determines whether the item is in a selected state, in the embodiment, whether there is the check mark 96 in the check box 94 corresponding to the item 92.
  • when it is determined at Step S 76 that the item is in the selected state (Yes at Step S 76), the controller 10 changes the item to an unselected state at Step S 78, and proceeds to Step S 82.
  • when it is determined at Step S 76 that the item is not in the selected state (No at Step S 76), the controller 10 changes the item to a selected state at Step S 80, and proceeds to Step S 82.
  • after Step S 78 or Step S 80, the controller 10 changes the display state of the item at Step S 82. That is, the controller 10 clears the check mark of the check box of the item changed to the unselected state at Step S 78, and displays the check mark in the check box of the item changed to the selected state at Step S 80.
  • after Step S 82, the controller 10 proceeds to Step S 72 and repeats the above-described processing.
  • when it is determined at Step S 74 that the gesture is not an item selection operation (No at Step S 74), the controller 10 determines at Step S 84 whether the gesture is a setting completion operation.
  • when it is determined at Step S 84 that the gesture is not the setting completion operation (No at Step S 84), the controller 10 executes the process corresponding to the detected gesture at Step S 86, and proceeds to Step S 72.
  • Examples of the process corresponding to the detected gesture include screen scroll processing of the icon setting screen, display processing of screens displayable on the icon setting screen, for example, a help screen, and the like.
  • when it is determined at Step S 84 that the gesture is the setting completion operation (Yes at Step S 84), the controller 10 sets the items in the selected state as the items to be displayed at Step S 88. That is, the controller 10 sets the applications of the items in the selected state as the applications whose application icons are displayed on the lock screen. When the display items are set at Step S 88, the controller 10 ends the processing.
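The selection handling of FIG. 19 (Steps S 72 to S 88) is essentially a toggle of each tapped item followed by a commit on the setting completion operation. The following sketch is a minimal, hypothetical model of that state; it omits the screen drawing and scroll handling.

    // Hypothetical sketch of the icon setting screen state: toggle items on each
    // selection operation, then fix the display items on the completion operation.
    import java.util.ArrayList;
    import java.util.LinkedHashSet;
    import java.util.List;
    import java.util.Set;

    public class IconSettingModel {
        private final Set<String> selected = new LinkedHashSet<>();

        /** Steps S 76 to S 82: toggle the tapped item; the return value says whether the check mark is shown. */
        public boolean toggle(String item) {
            if (selected.remove(item)) {
                return false;                  // was selected -> unselected (Step S 78)
            }
            selected.add(item);                // was unselected -> selected (Step S 80)
            return true;
        }

        /** Step S 88: on the setting completion operation, fix the items to be displayed on the lock screen. */
        public List<String> commitDisplayItems() {
            return new ArrayList<>(selected);
        }
    }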
  • FIG. 20 is a flowchart illustrating the procedure of the control for setting up the display content of the lock screen, in particular, the procedure of the control for setting up the display position of the application icon to be displayed.
  • the procedure illustrated in FIG. 20 is realized by the controller 10 executing the control program 9 A.
  • the procedure illustrated in FIG. 20 is executed when the setting processing of the application icons to be displayed is completed, or when the operation of displaying the lock screen is detected for the first time after the setting processing of the application icons to be displayed is completed.
  • the procedure illustrated in FIG. 20 may be executed whenever the operation of displaying the lock screen is detected.
  • the controller 10 extracts the application icons to be displayed at Step S 90. Specifically, the controller 10 extracts the applications set as the display items in the above-described processing, and extracts the application icons associated with the extracted applications as the application icons to be displayed.
  • the smartphone 1 may allow the user to select the application icon to be displayed on the lock screen. Accordingly, since the desired application icon can be displayed on the lock screen, the user can execute the desired application quickly.
  • the smartphone 1 may determine the arrangement positions of the application icons based on the gesture detected through the touch screen display 2 .
  • FIG. 21 illustrates an example of the control for setting up the display content of the lock screen.
  • an icon position setting screen 102 is displayed on the display 2 A.
  • the icon position setting screen 102 is a screen for setting the display positions of the application icons to be displayed on the lock screen.
  • on the icon position setting screen 102, a ring 106 and application icons 108 a, 108 b, 108 c and 108 d are arranged.
  • the same area 42 as the area 42 of the home screen 40 is arranged at the top edge of the display 2 A.
  • a remaining mark 43 indicating a remaining amount of a power supply and a radio-wave level mark 44 indicating electric field strength of radio wave for communication are displayed in the area 42 .
  • at Step S 102, the user's finger F touches the application icon 108 a.
  • the smartphone 1 detects the touch in the portion where the application icon 108 a is arranged.
  • the user's finger F drops the application icon 108 a on the ring 106. That is, the user uses his/her finger F to touch the area where the application icon 108 a is displayed at Step S 102, drags the application icon 108 a along a path indicated by an arrow, and releases the application icon 108 a in the area where the ring 106 is displayed.
  • the smartphone 1 detects the swipe, of which the start point is the portion where the application icon 108 a is arranged and the end point is the portion where the ring 106 is arranged. That is, the smartphone 1 detects the drop of the application icon 108 a on the ring 106 .
  • at Step S 104, the application icon 108 a is arranged on the ring 106. That is, when the drop is detected, as illustrated in Step S 104, the smartphone 1 sets the position on the ring 106, at which the application icon 108 a is dropped, as the display position of the application icon 108 a.
  • the user can determine the positions of the respective application icons 108 b , 108 c and 108 d on the ring 106 .
  • the smartphone 1 sets the dropped positions on the ring 106 as the display positions of the dropped application icons.
  • when the application icon is dropped at a position that is not on the ring 106, the smartphone 1 of the embodiment returns the dropped application icon to the position prior to the drop.
  • FIG. 22 is a flowchart illustrating the procedure of the control for setting up the display content of the lock screen.
  • the procedure illustrated in FIG. 22 is realized by the controller 10 executing the control program 9 A.
  • the procedure illustrated in FIG. 22 is executed, for example, when the operation of executing the application for determining the display positions of the application icons to be displayed on the lock screen is detected.
  • the controller 10 displays the icon position setting screen at Step S 120 , that is, the screen illustrated in Step S 101 of FIG. 21 , on the touch screen display 2 .
  • the controller 10 determines at Step S 122 whether a gesture has been detected. That is, the controller 10 obtains the detection result of the touch screen 2 B, and determines whether a gesture is detected based on the obtained detection result.
  • when it is determined that no gesture has been detected (No at Step S 122), the controller 10 proceeds to Step S 122 and determines again whether a gesture has been detected.
  • when it is determined at Step S 122 that a gesture has been detected (Yes at Step S 122), the controller 10 determines at Step S 124 whether the detected gesture is a touch on an icon. That is, the controller 10 determines whether the gesture detected at Step S 122 is a touch on an application icon displayed on the icon position setting screen.
  • when it is determined at Step S 124 that the detected gesture is a touch on an icon (Yes at Step S 124), the controller 10 determines at Step S 126 whether a release has been detected. That is, the controller 10 determines whether the touch on the application icon, which is detected at Step S 124, is released. When it is determined at Step S 126 that there is no release (No at Step S 126), that is, when it is determined that the touch on the application icon is continued, the controller 10 proceeds to Step S 126 and determines again whether a release has been detected.
  • when it is determined at Step S 126 that a release has been detected (Yes at Step S 126), the controller 10 determines at Step S 128 whether the release position is on the ring. That is, the controller 10 determines whether the position of the release determined as being present at Step S 126 is on the ring, that is, whether the application icon is dropped on the ring.
  • when it is determined at Step S 128 that the release position is on the ring (Yes at Step S 128), the controller 10 sets the release position as the icon display position at Step S 130. That is, the controller 10 sets the position on the ring, at which the application icon is dropped, as the display position of the application icon. When the display position of the icon is changed at Step S 130, the controller 10 proceeds to Step S 122.
  • when it is determined at Step S 128 that the release position is not on the ring (No at Step S 128), that is, when it is determined that the position of the dropped application icon is a position where the application icon is not superimposed on the ring, the controller 10 invalidates the operation of moving the icon at Step S 132. That is, the controller 10 returns the dropped application icon to the position touched at Step S 124, that is, the position prior to the movement. When the operation of moving the icon is invalidated at Step S 132, the controller 10 proceeds to Step S 122.
  • when it is determined at Step S 134 that the setting operation is completed, the controller 10 determines the display position of the icons at Step S 138. That is, the controller 10 sets the position of each application icon, which is displayed on the ring of the icon position setting screen at the time point when the setting operation is determined as being completed, as the display position of the application icon. The controller 10 sets the application icons, which are not displayed on the ring, as icons which are not displayed on the lock screen.
  • when the display position of the icons is determined at Step S 138, the controller 10 ends the processing.
  • in the embodiment, when the application icon is dropped at a position that is not on the ring, the smartphone 1 invalidates the operation of moving the icon.
  • the smartphone 1 may also change the display position of the icon to the dropped position.
  • the smartphone 1 may continue the processing of FIG. 22 until all of the application icons to be arranged on the lock screen are arranged on the ring.
  • the smartphone 1 may set the initial state of the icon position setting screen as a state in which the application icons are automatically arranged on the ring.
  • the user can arrange all of the application icons on the ring by just adjusting the positions of the application icons.
  • the smartphone 1 may allow the use of only the application icons associated with the preset applications. That is, the smartphone 1 may disable the modification of the application icons to be displayed on the lock screen.
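The drop validation of FIG. 22 (accept a dropped application icon only when it is released on the ring, otherwise invalidate the move) can be sketched as follows. The geometry helper and the tolerance band are assumptions; the patent does not specify how "on the ring" is tested.

    // Hypothetical sketch of the FIG. 22 drop validation: a dropped icon keeps
    // its new position only when it is released within a band around the ring.
    import java.util.HashMap;
    import java.util.Map;

    public class IconPositionSetting {
        private final double centerX, centerY, ringRadius, tolerance; // assumed geometry of the ring
        private final Map<String, double[]> positions = new HashMap<>();

        public IconPositionSetting(double cx, double cy, double ringRadius, double tolerance) {
            this.centerX = cx;
            this.centerY = cy;
            this.ringRadius = ringRadius;
            this.tolerance = tolerance;
        }

        private boolean isOnRing(double x, double y) {
            double d = Math.hypot(x - centerX, y - centerY);
            return Math.abs(d - ringRadius) <= tolerance;   // "on the ring" as a band around the ring line
        }

        /** Steps S 126 to S 132: returns true when the drop is accepted as the new display position. */
        public boolean onIconDropped(String icon, double x, double y) {
            if (isOnRing(x, y)) {               // Yes at Step S 128: keep the dropped position
                positions.put(icon, new double[] { x, y });
                return true;
            }
            return false;                       // No at Step S 128: invalidate; the caller restores the old position
        }

        public Map<String, double[]> displayPositions() {
            return positions;
        }
    }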

Abstract

According to an aspect, a device includes a touch screen display and a controller. The touch screen display displays a lock screen including a first icon and a second icon. The controller displays a sub icon associated with the second icon on the lock screen when a gesture in which the first icon and the second icon are superimposed is detected.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority from Japanese Application No. 2011-219531, filed on Oct. 3, 2011, and Japanese Application No. 2012-221208, filed on Oct. 3, 2012, the contents of which are incorporated by reference herein in their entireties.
BACKGROUND
1. Technical Field
The present application relates to a device, a method, and a storage medium storing therein a program. More particularly, the present application relates to a device including a touch screen display, a method of controlling the device, and a storage medium storing therein a program for controlling the device.
2. Description of the Related Art
A touch screen device having a touch screen display has been known. Examples of the touch screen devices include, but are not limited to, a smartphone and a tablet. The touch screen device detects a gesture of a finger, a pen, or a stylus pen through the touch screen display. Then, the touch screen device operates according to the detected gesture. An example of the operation according to the detected gesture is described in, for example, International Publication Pamphlet No. 2008/086302.
The basic operation of the touch screen device is implemented by an operating system (OS) built into the device. Examples of the OS built into the touch screen device include, but are not limited to, Android, BlackBerry OS, iOS, Symbian OS, and Windows Phone.
Many touch screen devices have a lock function so as to prevent erroneous operations or the like. The touch screen device executes a lock function to display a lock screen on the touch screen display during a locked state. On the lock screen, operations other than a set operation are invalidated. Therefore, the touch screen device can prevent erroneous operations during the locked state by executing the lock function.
The touch screen device unlocks the locked state when an unlock operation is detected in the locked state. Therefore, when a user executes a desired application in the locked state, the user is required to input the unlock operation, select the application, and execute the selected application. Furthermore, when particular processing executable in the desired application is executed, the user is required to perform the unlock operation, select the desired application, and select particular processing from a menu or the like of the application. For example, in a case where the application is a mail application, the particular processing includes incoming mail check processing, new mail composition processing, outgoing mail check processing, and the like. As described above, the above-mentioned touch screen device has low operability and convenience in the locked state.
For the foregoing reasons, there is a need for a device, a method, and a program that provide the user with high operability and convenience in the locked state.
SUMMARY
According to an aspect, a device includes a touch screen display and a controller. The touch screen display displays a lock screen including a first icon and a second icon. The controller displays a sub icon associated with the second icon on the lock screen when a gesture in which the first icon and the second icon are superimposed is detected.
According to another aspect, a method is for controlling a device having a touch screen display. The method includes: displaying a lock screen, including a first icon and a second icon, on the touch screen display; and displaying a sub icon associated with the second icon when a gesture in which the first icon and the second icon are superimposed is detected.
According to another aspect, a non-transitory storage medium stores therein a program. When executed by a device having a touch screen display, the program causes the device to execute: displaying a lock screen, including a first icon and a second icon, on the touch screen display; and displaying a sub icon associated with the second icon when a gesture in which the first icon and the second icon are superimposed is detected.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a perspective view of a smartphone according to an embodiment;
FIG. 2 is a front view of the smartphone;
FIG. 3 is a back view of the smartphone;
FIG. 4 is a diagram illustrating an example of a home screen;
FIG. 5 is a block diagram of the smartphone;
FIG. 6 is a diagram illustrating an example of a lock screen;
FIG. 7 is a diagram illustrating an example of a control during displaying the lock screen;
FIG. 8 is a diagram illustrating an example of a control during displaying the lock screen;
FIG. 9 is a diagram illustrating an example of an operation screen in a case where a text editor application is executed;
FIG. 10 is a diagram illustrating an example of a control during displaying the lock screen;
FIG. 11 is a flowchart illustrating a procedure of a control that is performed in a locked state;
FIG. 12 is a flowchart illustrating another example of a procedure of a control that is performed in a locked state;
FIG. 13 is a diagram illustrating an example of a control during displaying the lock screen;
FIG. 14 is a flowchart illustrating procedures of a control that is performed in a locked state;
FIG. 15 is a diagram illustrating an example of a control during displaying the lock screen;
FIG. 16 is a flowchart illustrating a procedure of a control that is performed in a locked state;
FIG. 17A is a diagram illustrating an example of the lock screen;
FIG. 17B is a diagram illustrating an example of the lock screen;
FIG. 17C is a diagram illustrating an example of the lock screen;
FIG. 18 is a diagram illustrating an example of an icon setting screen;
FIG. 19 is a flowchart illustrating a procedure of a control for setting up display content of the lock screen;
FIG. 20 is a flowchart illustrating a procedure of a control for setting up display content of the lock screen;
FIG. 21 is a diagram illustrating an example of a control for setting up display content of the lock screen; and
FIG. 22 is a flowchart illustrating a procedure of a control for setting up display content of the lock screen.
DETAILED DESCRIPTION
Exemplary embodiments of the present invention will be explained in detail below with reference to the accompanying drawings. A smartphone will be explained below as an example of a device provided with a touch screen display.
An overall configuration of a smartphone 1 according to an embodiment will be explained below with reference to FIG. 1 to FIG. 3. As illustrated in FIG. 1 to FIG. 3, the smartphone 1 includes a housing 20. The housing 20 includes a front face 1A, a back face 1B, and side faces 1C1 to 1C4. The front face 1A is a front of the housing 20. The back face 1B is a back of the housing 20. The side faces 1C1 to 1C4 are sides each connecting the front face 1A and the back face 1B. Hereinafter, the side faces 1C1 to 1C4 may be collectively called “side face 1C” without being specific to any of the side faces.
The smartphone 1 includes a touch screen display 2, buttons 3A to 3C, an illumination (ambient light) sensor 4, a proximity sensor 5, a receiver 7, a microphone 8, and a camera 12, which are provided in the front face 1A. The smartphone 1 includes a camera 13, which is provided in the back face 1B. The smartphone 1 includes buttons 3D to 3F and a connector 14, which are provided in the side face 1C. Hereinafter, the buttons 3A to 3F may be collectively called “button 3” without being specific to any of the buttons.
The touch screen display 2 includes a display 2A and a touch screen 2B. In the example of FIG. 1, each of the display 2A and the touch screen 2B is approximately rectangular-shaped; however, the shapes of the display 2A and the touch screen 2B are not limited thereto. Each of the display 2A and the touch screen 2B may have any shape such as a square, a circle or the like. In the example of FIG. 1, the display 2A and the touch screen 2B are arranged in a superimposed manner; however, the manner in which the display 2A and the touch screen 2B are arranged is not limited thereto. The display 2A and the touch screen 2B may be arranged, for example, side by side or apart from each other. In the example of FIG. 1, longer sides of the display 2A are along the longer sides of the touch screen 2B, and shorter sides of the display 2A are along the shorter sides of the touch screen 2B; however, the manner in which the display 2A and the touch screen 2B are superimposed is not limited thereto. In a case where the display 2A and the touch screen 2B are arranged in the superimposed manner, they can be arranged such that, for example, one or more sides of the display 2A are not along any sides of the touch screen 2B.
The display 2A is provided with a display device such as a liquid crystal display (LCD), an organic electroluminescence display (OELD), or an inorganic electroluminescence display (IELD). The display 2A displays text, images, symbols, graphics, and the like.
The touch screen 2B detects a contact of a finger, a pen, a stylus pen, or the like on the touch screen 2B. The touch screen 2B can detect positions where a plurality of fingers, pens, stylus pens, or the like make contact with the touch screen 2B. In the description herein below, a finger, pen, stylus pen, and the like may be referred to as a “contact object” or an “object”.
The detection method of the touch screen 2B may be any detection method, including but not limited to, a capacitive type detection method, a resistive type detection method, a surface acoustic wave type (or ultrasonic type) detection method, an infrared type detection method, an electromagnetic induction type detection method, and a load sensing type detection method. In the description herein below, for the sake of simplicity, it is assumed that the user uses his/her finger(s) to make contact with the touch screen 2B in order to operate the smartphone 1.
The smartphone 1 determines a type of a gesture based on at least one of a contact detected by the touch screen 2B, a position where the contact is detected, a change of a position where the contact is detected, an interval between detected contacts, and the number of detection times of the contact. The gesture is an operation performed on the touch screen 2B. Examples of the gestures determined by the smartphone 1 include, but are not limited to, touch, long touch, release, swipe, tap, double tap, long tap, drag, flick, pinch in, and pinch out.
“Touch” is a gesture in which a finger makes contact with the touch screen 2B. The smartphone 1 determines a gesture in which the finger makes contact with the touch screen 2B as touch. “Long touch” is a gesture in which a finger makes contact with the touch screen 2B for longer than a given time. The smartphone 1 determines a gesture in which the finger makes contact with the touch screen 2B for longer than a given time as long touch.
“Release” is a gesture in which a finger separates from the touch screen 2B. The smartphone 1 determines a gesture in which the finger separates from the touch screen 2B as release. “Swipe” is a gesture in which a finger moves on the touch screen 2B with continuous contact thereon. The smartphone 1 determines a gesture in which the finger moves on the touch screen 2B with continuous contact thereon as swipe.
“Tap” is a gesture in which a touch is followed by a release. The smartphone 1 determines a gesture in which a touch is followed by a release as tap. “Double tap” is a gesture such that a gesture in which a touch is followed by a release is successively performed twice. The smartphone 1 determines a gesture such that a gesture in which a touch is followed by a release is successively performed twice as double tap.
“Long tap” is a gesture in which a long touch is followed by a release. The smartphone 1 determines a gesture in which a long touch is followed by a release as long tap. “Drag” is a gesture in which a swipe is performed from an area where a movable-object is displayed. The smartphone 1 determines a gesture in which a swipe is performed from an area where the movable-object is displayed as drag.
“Flick” is a gesture in which a finger separates from the touch screen 2B while moving after making contact with the touch screen 2B. That is, “Flick” is a gesture in which a touch is followed by a release accompanied with a movement of the finger. The smartphone 1 determines a gesture in which the finger separates from the touch screen 2B while moving after making contact with the touch screen 2B as flick. The flick is performed, in many cases, with a finger moving along one direction. The flick includes “upward flick” in which the finger moves upward on the screen, “downward flick” in which the finger moves downward on the screen, “rightward flick” in which the finger moves rightward on the screen, and “leftward flick” in which the finger moves leftward on the screen, and the like. Movement of the finger during the flick is, in many cases, quicker than that of the finger during the swipe.
“Pinch in” is a gesture in which a swipe with a plurality of fingers is performed in a direction to move the fingers toward each other. The smartphone 1 determines a gesture in which the distance between a position of one finger and a position of another finger detected by the touch screen 2B becomes shorter as pinch in. “Pinch out” is a gesture in which a swipe with a plurality of fingers is performed in a direction to move the fingers away from each other. The smartphone 1 determines a gesture in which the distance between a position of one finger and a position of another finger detected by the touch screen 2B becomes longer as pinch out.
In the description herein below, a gesture performed by using a finger may be referred to as a “single touch gesture”, and a gesture performed by using a plurality of fingers may be referred to as a “multi touch gesture”. Examples of the multi touch gesture include a pinch in and a pinch out. A tap, a flick, a swipe, and the like are a single touch gesture when performed by using a finger, and are a multi touch gesture when performed by using a plurality of fingers.
The smartphone 1 performs operations according to these gestures which are determined through the touch screen 2B. Therefore, user-friendly and intuitive operability is achieved. The operations performed by the smartphone 1 according to the determined gestures may be different depending on the screen displayed on the display 2A. In the following explanation, for the sake of simplicity of explanation, the fact that the touch screen detects the contact(s) and then the smartphone determines the type of the gesture as X based on the contact(s) may be simply described as “the smartphone detects X” or “the controller detects X”.
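As a rough illustration of how the gesture types listed above might be told apart from a single contact's duration and movement, consider the following sketch. The thresholds and the subset of gestures covered are assumptions for illustration; the smartphone 1's actual determination logic is not disclosed at this level of detail.

    // Hypothetical sketch with assumed thresholds: telling a few of the gesture
    // types apart from one contact's total movement and duration.
    public class GestureClassifier {
        private static final long LONG_TOUCH_MS = 500;           // assumed threshold
        private static final double MOVE_THRESHOLD_PX = 20;      // assumed threshold
        private static final double FLICK_SPEED_PX_PER_MS = 1.0; // assumed threshold

        public enum Gesture { TAP, LONG_TAP, SWIPE, FLICK }

        /** Classifies a single touch-to-release sequence from its displacement and duration. */
        public Gesture classify(double dx, double dy, long durationMs) {
            double distance = Math.hypot(dx, dy);
            if (distance < MOVE_THRESHOLD_PX) {
                // little movement: a tap, or a long tap if the contact lasted long enough
                return durationMs >= LONG_TOUCH_MS ? Gesture.LONG_TAP : Gesture.TAP;
            }
            // a quick movement ending with the release is treated as a flick,
            // a slower continuous movement as a swipe
            double speed = distance / Math.max(1L, durationMs);
            return speed >= FLICK_SPEED_PX_PER_MS ? Gesture.FLICK : Gesture.SWIPE;
        }
    }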
An example of the screen displayed on the display 2A will be explained below with reference to FIG. 4. FIG. 4 represents an example of a home screen. The home screen may also be called “desktop”, “standby screen”, “idle screen”, or “standard screen”. The home screen is displayed on the display 2A. The home screen is a screen allowing the user to select which one of applications (programs) installed in the smartphone 1 is executed. The smartphone 1 executes the application selected on the home screen in the foreground. The screen of the application executed in the foreground is displayed on the display 2A in a different manner from that of the home screen.
Icons can be arranged on the home screen of the smartphone 1. A plurality of icons 50 are arranged on a home screen 40 illustrated in FIG. 4. Each of the icons 50 is previously associated with an application installed in the smartphone 1. When detecting a gesture for an icon 50, the smartphone 1 executes the application associated with the icon 50 for which the gesture is detected. For example, when detecting a tap on an icon 50 associated with a mail application, the smartphone 1 executes the mail application.
The icons 50 include an image and a character string. The icons 50 may contain a symbol or a graphic instead of an image. The icons 50 do not have to include either one of the image and the character string. The icons 50 are arranged based on a layout pattern. A wall paper 41 is displayed behind the icons 50. The wall paper may sometimes be called “photo screen”, “back screen”, “idle image”, or “background image”. The smartphone 1 can use an arbitrary image as the wall paper 41. The smartphone 1 may be configured so that the user can select an image to be displayed as the wall paper 41.
The smartphone 1 can include a plurality of home screens. The smartphone 1 determines, for example, the number of home screens according to setting by the user. The smartphone 1 displays a selected one on the display 2A even if there is a plurality of home screens.
The smartphone 1 displays an indicator (a locator) 51 on the home screen. The indicator 51 includes one or more symbols. The number of the symbols is the same as that of the home screens. In the indicator 51, a symbol corresponding to a home screen that is currently displayed is displayed in a different manner from that of symbols corresponding to the other home screens.
The indicator 51 in an example illustrated in FIG. 4 includes four symbols. This means the number of home screens is four. According to the indicator 51 in the example illustrated in FIG. 4, the second symbol from the left is displayed in a different manner from that of the other symbols. This means that the second home screen from the left is currently displayed.
The smartphone 1 can change a home screen to be displayed on the display 2A. When a gesture is detected while displaying one of home screens, the smartphone 1 changes the home screen to be displayed on the display 2A to another one. For example, when detecting a rightward flick, the smartphone 1 changes the home screen to be displayed on the display 2A to a home screen on the left side. For example, when detecting a leftward flick, the smartphone 1 changes the home screen to be displayed on the display 2A to a home screen on the right side. The smartphone 1 changes the home screen to be displayed on the display 2A from a first home screen to a second home screen, when a gesture is detected while displaying the first home screen, such that the area of the first home screen displayed on the display 2A gradually becomes smaller and the area of the second home screen displayed gradually becomes larger. The smartphone 1 may switch the home screens such that the first home screen is instantly replaced by the second home screen.
An area 42 is provided along the top edge of the display 2A. Displayed on the area 42 are a remaining mark 43 indicating a remaining amount of a power supply and a radio-wave level mark 44 indicating an electric field strength of radio wave for communication. The smartphone 1 may display time, weather, an application during execution thereof, a type of communication system, a status of a phone call, a mode of the device, an event occurring in the device, and the like in the area 42. In this manner, the area 42 is used to inform the user of various notifications. The area 42 may be provided on any screen other than the home screen 40. A position where the area 42 is provided is not limited to the top edge of the display 2A.
The home screen 40 illustrated in FIG. 4 is only an example, and therefore the configuration of each of elements, the arrangement of the elements, the number of home screens 40, the way to perform each of operations on the home screen 40, and the like do not have to be like the above mentioned explanation.
FIG. 5 is a block diagram of the smartphone 1. The smartphone 1 includes the touch screen display 2, the button 3, the illumination sensor 4, the proximity sensor 5, a communication unit 6, the receiver 7, the microphone 8, a storage 9, a controller 10, the cameras 12 and 13, the connector 14, an acceleration sensor 15, a direction (orientation) sensor 16, and a gyroscope 17.
The touch screen display 2 includes, as explained above, the display 2A and the touch screen 2B. The display 2A displays text, images, symbols, graphics, or the like. The touch screen 2B detects contact(s). The controller 10 detects a gesture performed for the smartphone 1. Specifically, the controller 10 detects an operation (a gesture) for the touch screen 2B in cooperation with the touch screen 2B.
The button 3 is operated by the user. The button 3 includes buttons 3A to 3F. The controller 10 detects an operation for the button 3 in cooperation with the button 3. Examples of the operations for the button 3 include, but are not limited to, a click, a double click, a triple click, a push, and a multi-push.
The buttons 3A to 3C are, for example, a home button, a back button, or a menu button. The button 3D is, for example, a power on/off button of the smartphone 1. The button 3D may function also as a sleep/sleep release button. The buttons 3E and 3F are, for example, volume buttons.
The illumination sensor 4 detects illumination of the ambient light of the smartphone 1. The illumination indicates intensity of light, lightness, or brightness. The illumination sensor 4 is used, for example, to adjust the brightness of the display 2A. The proximity sensor 5 detects the presence of a nearby object without any physical contact. The proximity sensor 5 detects the presence of the object based on a change of the magnetic field, a change of the return time of the reflected ultrasonic wave, etc. The proximity sensor 5 detects that, for example, the touch screen display 2 is brought close to someone's face. The illumination sensor 4 and the proximity sensor 5 may be configured as one sensor. The illumination sensor 4 can be used as a proximity sensor.
The communication unit 6 performs communication via radio waves. A communication system supported by the communication unit 6 is a wireless communication standard. The wireless communication standard includes, for example, a communication standard of cellular phones such as 2G, 3G, and 4G. The communication standard of cellular phones includes, for example, Long Term Evolution (LTE), Wideband Code Division Multiple Access (W-CDMA), CDMA 2000, a Personal Digital Cellular (PDC), a Global System for Mobile Communications (GSM), and a Personal Handy-phone System (PHS). The wireless communication standard further includes, for example, Worldwide Interoperability for Microwave Access (WiMAX), IEEE 802.11, Bluetooth, Infrared Data Association (IrDA), and Near Field Communication (NFC). The communication unit 6 may support one or more communication standards.
The receiver 7 is a sound output unit. The receiver 7 outputs a sound signal transmitted from the controller 10 as sound. The receiver 7 is used, for example, to output voice of the other party on the phone. The microphone 8 is a sound input unit. The microphone 8 converts speech of the user or the like to a sound signal and transmits the converted signal to the controller 10. The smartphone 1 may be provided with a speaker instead of, or in addition to, the receiver 7.
The storage 9 stores therein programs and data. The storage 9 is used also as a work area that temporarily stores a processing result of the controller 10. The storage 9 may include any non-transitory storage medium such as a semiconductor storage medium and a magnetic storage medium. The storage 9 may include a plurality of types of storage mediums. The storage 9 may include a combination of a portable storage medium such as a memory card, an optical disc, or a magneto-optical disc with a reader of the storage medium. The storage 9 may include a storage device used as a temporary storage area such as Random Access Memory (RAM).
The programs stored in the storage 9 include applications executed in the foreground or in the background and a control program for assisting operations of the applications. The application causes, for example, a predetermined screen to be displayed on the display 2A, and the controller to perform a process according to a gesture detected through the touch screen 2B. The control program is, for example, an OS. The application and the control program may be installed in the storage 9 through communication by the communication unit 6 or through a non-transitory storage medium.
The storage 9 stores therein, for example, a control program 9A, a mail application 9B, a browser application 9C, and setting data 9Z. The mail application 9B provides an e-mail function for composing, transmitting, receiving, and displaying e-mail, and the like. The browser application 9C provides a WEB browsing function for displaying WEB pages. The setting data 9Z contains information related to various settings on the operations of a smartphone 1.
The control program 9A provides a function related to various controls for operating the smartphone 1. The control program 9A controls, for example, the communication unit 6, the receiver 7, and the microphone 8 to make a phone call. The functions provided by the control program 9A include a function for performing various controls such as changing the screen displayed on the display 2A according to gestures detected through the touch screen 2B while the locked state is set. The functions provided by the control program 9A can be used in combination with a function provided by another program such as the mail application 9B.
The controller 10 is a processing unit. Examples of the processing units include, but are not limited to, a Central Processing Unit (CPU), System-on-a-chip (SoC), a Micro Control Unit (MCU), and a Field-Programmable Gate Array (FPGA). The controller 10 integrally controls the operations of the smartphone 1 to implement various functions.
Specifically, the controller 10 executes instructions contained in the program stored in the storage 9 while referring to the data stored in the storage 9 as necessary. The controller 10 controls a function unit according to the data and the instructions to thereby implement the various functions. Examples of the function units include, but are not limited to, the display 2A, the communication unit 6, and the receiver 7. The controller 10 can change the control of the function unit according to the detection result of a detector. Examples of the detectors include, but are not limited to, the touch screen 2B, the button 3, the illumination sensor 4, the proximity sensor 5, the microphone 8, the camera 12, the camera 13, the acceleration sensor 15, the direction sensor 16, and the gyroscope 17.
The controller 10 executes, for example, the control program 9A to perform various controls, such as a control for changing information displayed on the display 2A in accordance with the gesture detected through the touch screen 2B.
The camera 12 is an in-camera for photographing an object facing the front face 1A. The camera 13 is an out-camera for photographing an object facing the back face 1B.
The connector 14 is a terminal to which another device is connected. The connector 14 may be a general-purpose terminal such as a Universal Serial Bus (USB), a High-Definition Multimedia Interface (HDMI), Light Peak (Thunderbolt), and an earphone/microphone connector. The connector 14 may be a dedicated terminal such as a dock connector. Examples of the devices connected to the connector 14 include, but are not limited to, an external storage device, a speaker, and a communication device.
The acceleration sensor 15 detects a direction and a magnitude of acceleration applied to the smartphone 1. The direction sensor 16 detects a direction of geomagnetism. The gyroscope 17 detects an angle and an angular velocity of the smartphone 1. The detection results of the acceleration sensor 15, the direction sensor 16, and the gyroscope 17 are used in combination with each other in order to detect a position of the smartphone 1 and a change of its attitude.
Part or all of the programs and the data stored in the storage 9 in FIG. 5 may be downloaded from any other device through communication by the communication unit 6. Part or all of the programs and the data stored in the storage 9 in FIG. 5 may be stored in the non-transitory storage medium that can be read by the reader included in the storage 9. Part or all of the programs and the data stored in the storage 9 in FIG. 5 may be stored in the non-transitory storage medium that can be read by a reader connected to the connector 14. Examples of the non-transitory storage mediums include, but are not limited to, an optical disc such as CD, DVD, and Blu-ray, a magneto-optical disc, magnetic storage medium, a memory card, and solid-state storage medium.
The configuration of the smartphone 1 illustrated in FIG. 5 is only an example, and therefore it can be modified as required within a scope that does not depart from the gist of the present invention. For example, the number and the type of the button 3 are not limited to the example of FIG. 5. The smartphone 1 may be provided with buttons in a numeric keypad layout, a QWERTY layout, or the like as buttons for operating the screen instead of the buttons 3A to 3C. The smartphone 1 may be provided with only one button to operate the screen, or with no button. In the example of FIG. 5, the smartphone 1 is provided with two cameras; however, the smartphone 1 may be provided with only one camera or with no camera. In the example of FIG. 5, the smartphone 1 is provided with three types of sensors in order to detect its position and attitude; however, the smartphone 1 does not have to be provided with some of these sensors. Alternatively, the smartphone 1 may be provided with any other type of sensor for detecting at least one of the position and the attitude.
An example of the controls based on the functions provided by the control program 9A will be described with reference to FIGS. 6 to 22. The function provided by the control program 9A includes a function for changing the screen displayed on the display 2A according to gestures detected through the touch screen 2B while the locked state is set. Hereinafter, an example of a control performed according to user instructions while the locked state is set will be described.
An example of the lock screen will be described with reference to FIG. 6. The lock screen is displayed on the display 2A while the locked state is set, that is, while the setting of the locked state is ON. FIG. 6 illustrates an example of the lock screen. A lock screen 60 is a screen representing that the locked state is set. The lock screen 60 is a screen that transitions to another screen when a preset unlock gesture is detected. The lock screen 60 is a screen on which gestures other than a preset gesture are invalidated. The smartphone 1 is in a state in which various operations cannot be performed until a particular gesture is detected on the lock screen.
The lock screen 60 illustrated in FIG. 6 has a date/time image 62, a key icon (first icon) 64, a ring 66, application icons (second icons) 68 a, 68 b, 68 c and 68 d, and a home icon 69 arranged on a wallpaper 61. On the lock screen 60, the same area 42 as the area 42 of the home screen 40 is arranged at the top edge of the display 2A. Displayed on the area 42 of the lock screen 60 are a remaining mark 43 indicating a remaining amount of a power supply and a radio-wave level mark 44 indicating electric field strength of radio wave for communication. The wallpaper 61 is displayed behind the date/time image 62, the key icon 64, the ring 66, the application icons 68 a, 68 b, 68 c and 68 d, and the home icon 69.
The date/time image 62 is an image that indicates time and date, and is displayed under the area 42 in the upper portion of the lock screen 60. In the date/time image 62 illustrated in FIG. 6, a state display indicating the time is “12:34 PM” that indicates twelve o'clock thirty four minutes in the afternoon, and a state display indicating the date is “Aug. 22” that indicates August 22.
The key icon 64 is a lock-shaped image and is displayed in a substantially central portion of the screen. In the embodiment, the key icon 64 is an object that the user drags when performing an unlock gesture or a gesture for executing an application. When a swipe starting from the key icon 64 is detected, the smartphone 1 moves a display position of the key icon 64 according to a movement of a contact position of the swipe.
The outer periphery of the ring 66 is displayed at a location surrounding the key icon 64. The ring 66 has a circular frame shape. The key icon 64 is arranged in a central portion of the circular frame of the ring 66. The circular frame of the ring 66 is arranged with the same center as that of the key icon 64 and has a shape with a larger diameter than that of the outer edge of the key icon 64. The circular frame of the ring 66 is arranged to be spaced apart from the outer edge of the key icon 64 by more than a predetermined distance. The ring 66 has a closed shape and becomes a boundary that divides the area of the lock screen 60 into two areas, that is, an inner area (a first area) and an outer area (a second area) of the ring 66.
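Because the ring 66 partitions the lock screen into the first area and the second area, deciding which area a contact point belongs to reduces to a simple distance check when the ring is circular, as in FIG. 6. The following Kotlin sketch illustrates such a check under assumed coordinate values and names; it is only one plausible realization and is not part of the disclosed control program 9A.

```kotlin
import kotlin.math.hypot

// Illustrative classification of a touch point into the first area (inside the ring 66)
// or the second area (outside it), assuming the ring is a circle. The coordinates and
// the ring radius are hypothetical values chosen for this example.
enum class LockScreenArea { FIRST_AREA, SECOND_AREA }

fun classifyArea(x: Float, y: Float, centerX: Float, centerY: Float, ringRadius: Float): LockScreenArea {
    val distance = hypot(x - centerX, y - centerY)
    return if (distance < ringRadius) LockScreenArea.FIRST_AREA else LockScreenArea.SECOND_AREA
}

fun main() {
    // Assumed screen with the ring centered at (240, 400) and a radius of 150 pixels.
    println(classifyArea(250f, 410f, centerX = 240f, centerY = 400f, ringRadius = 150f)) // FIRST_AREA
    println(classifyArea(240f, 700f, centerX = 240f, centerY = 400f, ringRadius = 150f)) // SECOND_AREA
}
```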
The application icons 68 a, 68 b, 68 c and 68 d are displayed separately on the ring 66. The application icons 68 a, 68 b, 68 c and 68 d are arranged on the ring 66 in this order in the clockwise direction. Each of the application icons 68 a, 68 b, 68 c and 68 d is associated with a particular application installed in the smartphone 1. When detecting a particular gesture for the application icons 68 a, 68 b, 68 c or 68 d, the smartphone 1 executes the application associated with the application icon for which the particular gesture is performed. The particular gesture will be described below.
In the example illustrated in FIG. 6, the application icon 68 a is associated with a phone application. The application icon 68 b is associated with a mail application. The application icon 68 c is associated with an SMS application. The application icon 68 d is associated with a text editor application.
Each of the application icons 68 a, 68 b, 68 c and 68 d includes an image that represents the associated application. Each of the application icons 68 a, 68 b, 68 c and 68 d may contain an image and a character like the icon 50, and may contain a symbol or a graphic instead of an image. Each of the application icons 68 a, 68 b, 68 c and 68 d may contain only a character string, without containing an image.
The home icon 69 is displayed in an area on the lower end side of the lock screen 60, outside the ring 66. The home icon 69 is an icon that is associated with execution of unlock processing and processing of moving to the home screen 40. When detecting a particular gesture for the home icon 69, the smartphone 1 unlocks the locked state and displays the home screen 40 on the display 2A. The particular gesture will be described below.
Then, the particular gesture for the home icon 69 will be described with reference to FIG. 7. At Step S1 illustrated in FIG. 7, the lock screen 60 is displayed on the display 2A. At Step S1, the user's finger F touches the key icon 64. In this case, the smartphone 1 detects a touch on a portion where the key icon 64 is arranged.
At Step S2, the user's finger F drops the key icon 64 on the home icon 69. That is, the user uses his/her finger F to touch the area where the key icon 64 is displayed at Step S1, drags the key icon 64 along a path indicated by an arrow α1, and releases the key icon 64 in the area where the home icon 69 is displayed. In this case, the smartphone 1 detects a swipe, of which the start point is the portion where the key icon 64 is arranged and the end point is the portion where the home icon 69 is arranged. That is, the smartphone 1 detects a drop of the key icon 64 on the home icon 69. When detecting such a drop, the smartphone 1 unlocks the locked state and displays the home screen 40 on the touch screen display 2. As described above, in the smartphone 1 of the embodiment, the drop of the key icon 64 on the home icon 69 is set as the particular gesture for the home icon 69.
When detecting the swipe of which the start point is the key icon 64, the smartphone 1 may display the key icon 64 at the position where the contact is detected by the swipe. That is, when detecting the swipe of which the start point is the key icon 64, the smartphone 1 may display the key icon 64 during the swipe while moving it according to the movement of the finger F.
When detecting a gesture of dropping the key icon 64 on the outer area (the second area) of the ring 66, the smartphone 1 also unlocks the locked state and displays the home screen 40 on the display 2A. That is, by dropping the key icon 64 on the second area, the user can unlock the locked state and allow the home screen 40 to be displayed.
When detecting a gesture of dropping the key icon 64 on the inner area (the first area) of the ring 66, the smartphone 1 returns the key icon 64 to the initial position. That is, when the key icon 64 is not dropped on the second area or on any of the application icons 68 a, 68 b, 68 c and 68 d, the smartphone 1 displays the key icon 64 at the central position of the ring 66.
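The drop rules described with reference to FIGS. 7 and 8 can be summarized as a small dispatch on where the key icon is released. The following Kotlin sketch uses illustrative type names and assumes the hit-test results are computed elsewhere (for example, by the area classification sketched earlier); it is an illustration, not the disclosed control program 9A. The application-icon case included here is the one described below with reference to FIG. 8.

```kotlin
// Hypothetical sketch of the drop handling of the key icon 64 on the lock screen 60.
// All names are illustrative assumptions.
sealed class DropAction
object UnlockAndShowHomeScreen : DropAction()      // dropped on the home icon or in the second area
object ReturnKeyIconToCenter : DropAction()        // dropped in the first area
data class ExecuteApplication(val appIconId: String) : DropAction() // dropped on an application icon (FIG. 8)

fun resolveKeyIconDrop(
    releasedOnHomeIcon: Boolean,
    releasedInSecondArea: Boolean,
    releasedOnAppIconId: String?   // non-null when released on an application icon
): DropAction = when {
    releasedOnAppIconId != null                -> ExecuteApplication(releasedOnAppIconId)
    releasedOnHomeIcon || releasedInSecondArea -> UnlockAndShowHomeScreen
    else                                       -> ReturnKeyIconToCenter
}
```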
Then, the particular gesture for the application icons 68 a, 68 b, 68 c and 68 d will be described with reference to FIG. 8. FIG. 8 describes a case where the particular gesture is performed with respect to the application icon 68 d so as to execute an application desired by a user. At Step S3 illustrated in FIG. 8, the lock screen 60 is displayed on the display 2A. At Step S3, the user's finger F touches the key icon 64. In this case, the smartphone 1 detects a touch on a portion where the key icon 64 is arranged.
At Step S4, the user's finger F drops the key icon 64 on the application icon 68 d. That is, at Step S4, the user uses his/her finger F to touch the area where the key icon 64 is displayed, drags the key icon 64 along a path indicated by an arrow α2, and releases the key icon 64 in the area where the application icon 68 d is displayed. In this case, the smartphone 1 detects the swipe, of which the start point is the portion where the key icon 64 is arranged and the end point is the portion where the application icon 68 d is arranged. That is, the smartphone 1 detects the drop of the key icon 64 on the application icon 68 d. When detecting such a drop, the smartphone 1 unlocks the locked state and executes the text editor application as the application associated with the application icon 68 d. Subsequently, the smartphone 1 displays, on the touch screen display 2, an operation screen that is displayed in a case where the text editor application is executed.
FIG. 9 illustrates an example of the operation screen in a case where the text editor application is executed. When the text editor application is executed, the smartphone 1 displays an operation screen 80 illustrated in FIG. 9 on the touch screen display 2. The operation screen 80 illustrated in FIG. 9 includes a display area 82 for checking an input character string on a substantially entire area of the upper portion of the screen, a keyboard object 84 for inputting a character string on the lower portion of the screen, a memo list display button 86 for displaying a memo list registered by the text editor on the upper left side of the display area 82, and an end button 88 for ending the processing of the text editor on the upper right side of the display area 82. In such a state that the operation screen 80 is displayed, when detecting a tap or a swipe with respect to the keyboard object 84, the smartphone 1 detects a character corresponding to a tapped area or a swiped trajectory as an input character. The smartphone 1 displays the input character at a set position of the display area 82. In such a state that the operation screen 80 is displayed, when detecting a tap with respect to the memo list display button 86 or the end button 88, the smartphone 1 executes the processing associated with the tapped button. In this manner, the smartphone 1 executes a variety of processing of the text editor application and detects the input of the text.
As described above, the smartphone 1 of the embodiment sets the drop of the key icon 64 on the application icons 68 a, 68 b, 68 c or 68 d as the particular gesture for executing the application associated with the application icon on which the key icon 64 is dropped. The smartphone 1 of the embodiment sets the drop of the key icon 64 on the application icon, that is, the gesture of dropping the key icon 64 on the application icon, as the particular gesture for the application icon; however, the particular gesture is not limited thereto. For example, a gesture of flicking the key icon 64 toward the application icon may be set as the particular gesture for the application icon. In a case where it is set to move the key icon 64 to a next tapped position after the key icon 64 is tapped, a gesture of tapping an application icon after tapping the key icon 64 may be set as the particular gesture for the application icon. In other words, the particular gesture for the application icon may be any gesture of releasing the key icon 64 in such a state that the key icon 64 and the application icon are superimposed, and the above-described gestures all correspond to such a gesture.
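The state in which the key icon and an application icon are "superimposed" can be detected in several ways; one simple possibility is an overlap test on the icons' bounding boxes. The following sketch is only one plausible realization under the assumption of axis-aligned rectangular bounds; the specification does not fix a particular hit-test.

```kotlin
// Illustrative hit test for "the key icon 64 and an application icon are superimposed",
// assuming each icon is represented by an axis-aligned bounding box. Names are hypothetical.
data class Box(val left: Float, val top: Float, val right: Float, val bottom: Float)

fun isSuperimposed(keyIcon: Box, appIcon: Box): Boolean =
    keyIcon.left < appIcon.right && keyIcon.right > appIcon.left &&
    keyIcon.top < appIcon.bottom && keyIcon.bottom > appIcon.top
```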
In the embodiment, although the gesture in which the touch on the key icon is the start point has been described, a gesture in which the touch on the application icon is the start point may be set as the particular gesture for executing the application associated with the application icon. For example, a gesture of dragging the application icon and dropping the application icon on the key icon may be set as the particular gesture for the application icon.
Then, another example of the particular gesture for an application icon will be described with reference to FIG. 10. FIG. 10 describes a particular gesture for executing particular processing that is executable in an application desired by a user. At Step S5 illustrated in FIG. 10, the lock screen 60 is displayed on the display 2A. At Step S5, the user's finger F touches the key icon 64. In this case, the smartphone 1 detects the touch in the portion where the key icon 64 is arranged.
At Step S6, the user's finger F moves the key icon 64 onto the application icon 68 b. That is, the finger F touches the area where the key icon 64 is displayed at Step S5, and drags the key icon 64 along a path indicated by an arrow α3, so that the key icon 64 is moved to the area where the application icon 68 b is displayed. In this case, the smartphone 1 detects the swipe, of which the start point is the portion where the key icon 64 is arranged and which moves to the portion where the application icon 68 b is arranged. That is, the smartphone 1 detects the gesture of superimposing the key icon 64 on the application icon 68 b. When detecting the gesture of superimposing the key icon 64 on the application icon 68 b as the particular gesture for the application icon 68 b, the smartphone 1 displays sub icons 78 a, 78 b and 78 c associated with the application icon 68 b.
On the lock screen 60 displayed at Step S6 of FIG. 10, the date/time image 62, the key icon (first icon) 64, the ring 66, the application icons (second icons) 68 a, 68 b, 68 c and 68 d, the home icon 69, a sub ring 76, and the sub icons 78 a, 78 b and 78 c associated with the application icon 68 b are arranged on the wallpaper 61.
The sub ring 76 is displayed at a location surrounding the outer periphery of the application icon 68 b. The sub ring 76 has a circular frame shape. The application icon 68 b is arranged in the central portion of the circular frame of the sub ring 76. The circular frame of the sub ring 76 is arranged with the same center as that of the application icon 68 b and has a shape with a larger diameter than that of the outer edge of the application icon 68 b. The circular frame of the sub ring 76 is arranged apart from the outer edge of the application icon 68 b by more than a predetermined distance. The sub ring 76 has a closed shape and becomes a boundary that divides the area of the lock screen 60 into two areas, that is, the inner area and the outer area of the sub ring 76.
The sub icons 78 a, 78 b and 78 c are displayed separately on the sub ring 76. The sub icons 78 a, 78 b and 78 c are arranged on the sub ring 76 in this order in the clockwise direction. Each of the sub icons 78 a, 78 b and 78 c is associated with particular processing that is executable in the application associated with the application icon 68 b. When detecting a particular gesture for the sub icons 78 a, 78 b or 78 c, the smartphone 1 executes the particular processing associated with the sub icon for which the particular gesture is performed. The particular gesture will be described below.
In the example illustrated in FIG. 10, since the application icon 68 b is associated with the mail application, the sub icon 78 a is associated with incoming mail check processing that is executable in the mail application. In a similar manner, the sub icon 78 b is associated with new mail composition processing that is executable in the mail application. The sub icon 78 c is associated with outgoing mail check processing that is executable in the mail application.
Each of the sub icons 78 a, 78 b and 78 c includes an image that represents the associated particular processing. Each of the sub icons 78 a, 78 b and 78 c may contain an image or a character, or may contain a symbol or a graphic instead of an image. Each of the sub icons 78 a, 78 b and 78 c may contain only a character string, without containing an image. For example, in the example illustrated in FIG. 10, the sub icon 78 a contains a character string of “IN BOX” representing that the sub icon 78 a is associated with the incoming mail check processing. The sub icon 78 b contains a character string of “COMPOSE” representing that the sub icon 78 b is associated with the new mail composition processing. The sub icon 78 c contains a character string of “OUT BOX” representing that the sub icon 78 c is associated with the outgoing mail check processing.
At Step S7, the user's finger F moves the key icon 64 onto the sub icon 78 b. That is, the user swipes his/her finger F, which touches the area where the application icon 68 b is displayed at Step S6, along a path indicated by an arrow α4, and moves the finger F to the area where the sub icon 78 b is displayed. In this case, the smartphone 1 detects that the swipe, which passed through the portion where the application icon 68 b is arranged at Step S6, continues to an end point in the portion where the sub icon 78 b is arranged. That is, the smartphone 1 detects the gesture of releasing the key icon 64 superimposed on the sub icon 78 b. When detecting the gesture of releasing the key icon 64 superimposed on the sub icon 78 b as the particular gesture for the sub icon 78 b, the smartphone 1 executes the particular processing associated with the sub icon 78 b, that is, the new mail composition processing. That is, when detecting this gesture, the smartphone 1 displays the new mail composition screen of the mail application.
In the embodiment, as an example of an application icon associated with sub icons, the processing of displaying the sub icons 78 a, 78 b and 78 c for the application icon 68 b associated with the mail application has been described; however, the application icon associated with sub icons is not limited thereto. For example, when detecting the particular gesture for the application icon 68 a associated with the phone application, the smartphone 1 may display the sub icons associated with the application icon 68 a. Examples of the sub icons associated with the phone may include, but are not limited to, sub icons that are short-cuts of missed call check processing, new call start processing, and outgoing call history check processing. When detecting the particular gesture for the application icon 68 c associated with the SMS application, the smartphone 1 may display the sub icons associated with the application icon 68 c, as in the case of the mail application. Examples of the sub icons associated with the mail application may include, but are not limited to, sub icons that are short-cuts of incoming mail check processing, new mail composition processing, and outgoing mail check processing.
In the embodiment, the control in which the smartphone 1 displays the sub icons 78 a, 78 b and 78 c when detecting the gesture of superimposing the key icon 64 on the application icon 68 b and executes the particular processing associated with the sub icon 78 b has been described. However, in a case where the user drops the key icon 64 on the application icon 68 b at Step S6 of FIG. 10, the smartphone 1 can execute the mail application associated with the application icon 68 b. When detecting the gesture of moving the key icon 64 to the outer area of the sub ring 76, the smartphone 1 may delete the displayed sub icons 78 a, 78 b and 78 c. After displaying the sub icons 78 a, 78 b and 78 c associated with the application icon 68 b, when detecting the gesture of superimposing the key icon 64 on another of the application icons 68 a, 68 c and 68 d, the smartphone 1 may delete the displayed sub icons 78 a, 78 b and 78 c. When detecting the gesture of dropping the key icon 64 on the inner area of the sub ring 76, the smartphone 1 may execute the application associated with the application icon 68 b located at the center of the sub ring 76. That is, when the key icon 64 is not dropped on the first area, the second area, or any of the sub icons 78 a, 78 b and 78 c after the sub icons are displayed, the smartphone 1 may execute the application associated with the application icon 68 b located at the center of the sub ring 76.
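The optional sub-ring behavior described above can be viewed as a small decision rule applied while the sub icons are shown. The Kotlin sketch below reflects the "may" variations of the preceding paragraph under illustrative names and hit-test inputs; it is not a mandatory implementation.

```kotlin
// Hypothetical sketch of the sub-ring 76 behavior while the sub icons of an application
// icon are displayed. All names are illustrative assumptions.

// Moving the key icon outside the sub ring may hide the displayed sub icons.
fun onKeyIconMovedWhileSubIconsShown(insideSubRing: Boolean, hideSubIcons: () -> Unit) {
    if (!insideSubRing) hideSubIcons()
}

// Dropping the key icon resolves to a sub-icon process, the centered application, or nothing.
fun onKeyIconDroppedWhileSubIconsShown(
    droppedSubIconId: String?,           // non-null when released on a sub icon
    insideSubRing: Boolean,
    runSubProcess: (String) -> Unit,     // particular processing associated with the sub icon
    launchCenterApplication: () -> Unit  // application at the center of the sub ring
) {
    when {
        droppedSubIconId != null -> runSubProcess(droppedSubIconId)
        insideSubRing            -> launchCenterApplication()
        else                     -> Unit // handled by the other drop rules (first/second area)
    }
}
```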
In the embodiment, although the gesture in which the touch on the key icon is the start point has been described, a gesture in which the touch on the sub icon is the start point may be set as the particular gesture for executing the particular processing associated with the sub icon. For example, the gesture of dragging the sub icon and dropping the sub icon on the key icon may be set as the particular gesture for the sub icon.
An example of the procedure of the control based on the functions provided by the control program 9A will be described with reference to FIG. 11. FIG. 11 illustrates the procedure of the control that is performed in the locked state, in particular, the control that is performed during displaying the lock screen. The procedure illustrated in FIG. 11 is realized by the controller 10 executing the control program 9A. The procedure illustrated in FIG. 11 is executed in a case where the locked state is set and an operation of displaying a screen on the display 2A is detected. The case where the operation of displaying the screen on the display 2A is detected is, for example, a case where a screen return operation is detected in such a state that a power-saving mode is set so that the screen is not displayed on the touch screen display 2. The controller 10 may execute another procedure for control related to the lock screen 60 in parallel with the procedure illustrated in FIG. 11.
The controller 10 displays the lock screen on the touch screen display 2 at Step S10. When the lock screen is displayed at Step S10, the controller 10 determines at Step S12 whether a gesture has been detected. The controller 10 obtains the detection result of the touch screen 2B, and determines whether a gesture is detected, based on the obtained detection result. When it is determined at Step S12 that no gesture has been detected (No at Step S12), the controller 10 determines at Step S14 whether threshold value time≦waiting time is satisfied. That is, the controller 10 determines whether the waiting time defined as the elapsed time after the completion of the latest operation is equal to or greater than the predetermined threshold value time.
When it is determined at Step S14 that threshold value time≦waiting time is not satisfied (No at Step S14), that is, when it is determined that threshold value time>waiting time is satisfied, the controller 10 proceeds to Step S12 and determines again whether there is a gesture. When it is determined at Step S14 that threshold value time≦waiting time is satisfied (Yes at Step S14), the controller 10 shifts to the power-saving mode at Step S16 and ends the processing. That is, the controller 10 changes to a state in which the lock screen is not displayed, by turning off the touch screen display 2, and ends the processing.
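The waiting-time comparison at Steps S14 and S16 amounts to a simple timeout check. The sketch below assumes the waiting time is measured in milliseconds from the completion of the latest operation; the unit and the names are assumptions for illustration.

```kotlin
// Minimal sketch of the check "threshold value time <= waiting time" (Step S14).
// lastOperationTimeMs and thresholdMs are hypothetical parameters.
fun shouldEnterPowerSavingMode(lastOperationTimeMs: Long, nowMs: Long, thresholdMs: Long): Boolean {
    val waitingTimeMs = nowMs - lastOperationTimeMs   // elapsed time since the latest operation
    return waitingTimeMs >= thresholdMs               // true: shift to the power-saving mode (Step S16)
}
```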
When it is determined at Step S12 that a gesture has been detected (Yes at Step S12), the controller 10 determines, at Step S18, whether the gesture is a gesture of touching the key icon. That is, the controller 10 determines whether the gesture detected at Step S12 is a gesture of touching the key icon. When it is determined at Step S18 that the detected gesture is not the gesture of touching the key icon (No at Step S18), the controller 10 executes the process corresponding to the detected gesture at Step S20, and proceeds to Step S12. Examples of the process corresponding to the detected gesture include, but are not limited to, processing of displaying the sub icons associated with the application icon and processing of moving the positions of the application icons displayed on the lock screen. The processing of displaying the sub icons and the processing of moving the positions of the application icons will be described below. Furthermore, the process corresponding to the detected gesture may be processing of displaying screens displayable on the lock screen, for example, a help screen or an emergency notice screen.
When it is determined at Step S18 that the detected gesture is the gesture of touching the key icon (Yes at Step S18), the controller 10 determines at Step S22 whether the touch is released on the second area or the home icon. That is, the controller 10 determines whether the touch on the key icon, which is detected at Step S18, is swiped and then released so that the corresponding released position (position where the key icon is dropped) is on the second area or on the home icon.
When it is determined at Step S22 that the key icon is released on the second area or on the home icon (Yes at Step S22), the controller 10 executes unlock processing at Step S24, and displays the home screen on the touch screen display 2 at Step S26. When the home screen is displayed at Step S26, the controller 10 ends the processing.
When it is determined at Step S22 that the key icon is not released on either of the second area and the home icon (No at Step S22), the controller 10 determines at Step S27 whether a gesture of moving the key icon onto the application icon is detected. That is, the controller 10 determines whether the touch on the key icon, which is detected at Step S18, is swiped and is on the application icon.
When it is determined at Step S27 that the gesture of moving the key icon onto the application icon is detected (Yes at Step S27), the controller 10 determines at Step S28 whether there are sub icons associated with the application icon. When it is determined at Step S27 that there is no gesture of moving the key icon onto the application icon (No at Step S27), the controller 10 proceeds to processing of Step S37.
When it is determined at Step S28 that there are the sub icons associated with the application icon (Yes at Step S28), the controller 10 displays the sub icons on the lock screen at Step S29. For example, as illustrated in FIG. 10, when detecting the particular gesture for the application icon 68 b (in FIG. 10, the gesture of moving the key icon 64 onto the application icon 68 b), the controller 10 displays the sub icons 78 a, 78 b and 78 c on the lock screen 60. Subsequently, the controller 10 proceeds to processing of Step S30. When it is determined at Step S28 that there are no sub icons associated with the application icon (No at Step S28), the controller 10 proceeds to the processing of Step S30.
When it is determined at Step S28 that there are no sub icons associated with the application icon (No at Step S28), or when the sub icons are displayed at Step S29, the controller 10 determines at Step S30 whether the touch is released on the application icon. That is, the controller 10 determines whether the touch on the key icon, which is detected at Step S18, is swiped and then released so that the corresponding released position (position where the key icon is dropped) is on the application icon.
When it is determined at Step S30 that the key icon is released on the application icon (Yes at Step S30), the controller 10 executes unlock processing at Step S31, executes the application corresponding to the application icon located at the dropped position at Step S32, and displays the screen of the executed application on the touch screen display 2 at Step S33. For example, as illustrated in FIGS. 8 and 9, when detecting the release of the key icon 64 superimposed on the application icon 68 d, the controller 10 executes the text editor application associated with the application icon 68 d. When the screen of the executed application is displayed at Step S33, the controller 10 ends the processing.
When it is determined at Step S30 that the key icon is not released on the application icon (No at Step S30), the controller 10 determines at Step S34 whether the key icon is released on the sub icon. That is, the controller 10 determines whether the touch on the key icon, which is detected at Step S18, is swiped and then released so that the corresponding released position (position where the key icon is dropped) is on the sub icon.
When it is determined at Step S34 that the key icon is released on the sub icon (Yes at Step S34), the controller 10 executes unlock processing at Step S35, and executes particular processing corresponding to the sub icon located at the dropped position at Step S36. That is, the controller 10 displays the operation screen for executing the particular processing corresponding to the sub icon. For example, as illustrated at Step S7 of FIG. 10, when detecting the release of the key icon 64 superimposed on the sub icon 78 b, the controller 10 executes the new mail composition processing associated with the sub icon 78 b. That is, the controller 10 displays the new mail composition screen for executing the new mail composition processing associated with the sub icon 78 b. When the operation screen for executing the particular processing of the application is displayed at Step S36, the controller 10 ends the processing.
When it is determined at Step S27 that there is no gesture of moving the key icon onto the application icon (No at Step S27), or when it is determined at Step S34 that the key icon is not released on the sub icon (No at Step S34), the controller 10 determines at Step S37 whether the key icon is released in the first area. That is, the controller 10 determines whether the touch on the key icon, which is detected at Step S18, is swiped and then released so that the corresponding released position (position where the key icon is dropped) is the first area.
When it is determined at Step S37 that the key icon is released in the first area (Yes at Step S37), the controller 10 moves the key icon to the initial position at Step S38, and proceeds to Step S12. In a case where the sub icons are displayed at Step S29, the controller 10 also deletes the sub icons. When it is determined at Step S37 that the key icon is not released in the first area (No at Step S37), the controller 10 proceeds to Step S22. As described above, when detecting the touch on the key icon at Step S18, the controller 10 repeats the processing of Step S22, Step S27, Step S30, Step S34, and Step S37 until the release of the touch gesture, that is, the drop of the key icon, is detected at Step S22, Step S30, or Step S34.
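Putting the branches of FIG. 11 together, the handling of a touch that starts on the key icon can be viewed as a dispatch on where the key icon is moved or released. The following Kotlin sketch condenses Steps S22 through S38 under hypothetical type and callback names; it is offered only as a reading aid for the flowchart, not as the disclosed control program 9A.

```kotlin
// Hypothetical condensation of the branches of FIG. 11 after a touch on the key icon is
// detected at Step S18. All types and callbacks are illustrative assumptions.
data class SubIcon(val id: String)
data class AppIcon(val id: String, val subIcons: List<SubIcon> = emptyList())

sealed class KeyIconEvent {
    data class MovedOntoAppIcon(val appIcon: AppIcon) : KeyIconEvent()   // Step S27: Yes (not yet released)
    data class ReleasedOnAppIcon(val appIcon: AppIcon) : KeyIconEvent()  // Step S30: Yes
    data class ReleasedOnSubIcon(val subIcon: SubIcon) : KeyIconEvent()  // Step S34: Yes
    object ReleasedOnHomeIconOrSecondArea : KeyIconEvent()               // Step S22: Yes
    object ReleasedInFirstArea : KeyIconEvent()                          // Step S37: Yes
}

class LockScreenController(
    private val unlock: () -> Unit,                      // Steps S24, S31, S35
    private val showHomeScreen: () -> Unit,              // Step S26
    private val executeApplication: (AppIcon) -> Unit,   // Steps S32, S33
    private val executeSubProcessing: (SubIcon) -> Unit, // Step S36
    private val showSubIcons: (List<SubIcon>) -> Unit,   // Step S29
    private val resetKeyIcon: () -> Unit                 // Step S38
) {
    fun onKeyIconEvent(event: KeyIconEvent) {
        when (event) {
            is KeyIconEvent.MovedOntoAppIcon ->
                if (event.appIcon.subIcons.isNotEmpty()) showSubIcons(event.appIcon.subIcons)
            KeyIconEvent.ReleasedOnHomeIconOrSecondArea -> {
                unlock(); showHomeScreen()
            }
            is KeyIconEvent.ReleasedOnAppIcon -> {
                unlock(); executeApplication(event.appIcon)
            }
            is KeyIconEvent.ReleasedOnSubIcon -> {
                unlock(); executeSubProcessing(event.subIcon)
            }
            KeyIconEvent.ReleasedInFirstArea -> resetKeyIcon()
        }
    }
}
```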
As described above, in a case where there are the sub icons associated with the application icon displayed on the lock screen, the smartphone 1 displays the sub icons when the particular gesture for the application icon (gesture of superimposing the key icon on the application icon in the embodiment) is detected. Therefore, according to the smartphone 1 of the embodiment, in the case of executing the particular processing that is executable in the desired application, the user does not need to perform three-stage operations, that is, the execution of the unlock processing, the selection of the desired application, and the selection of the particular processing from the application menu, or the like. That is, by just making the gesture of superimposing the key icon on the application icon on the lock screen, the user can display the operation screen for executing the particular processing of the desired application. For example, in a case where the application is the mail application, the user can smoothly select and execute, on the lock screen, the desired processing from among the incoming mail check processing, the new mail composition processing, the outgoing mail check processing, and the like, which can be executed in the mail application. As described above, the smartphone 1 of the embodiment provides the user with high operability and high convenience in a state that the lock screen representing the locked state is displayed.
When detecting the drop of the key icon on the application icon, the smartphone 1 executes the application associated with the application icon. Therefore, the user can execute the desired application quickly from the locked state. When detecting the release gesture in the state that the application icon and the key icon are superimposed, the smartphone 1 executes the unlock processing, the application selection processing, and the application execution processing. Thus, the application icons displayed on the lock screen work as icons having a short-cut function, and the user can instruct these three processes with a single gesture. As a result, the user can execute the desired application through a short-cut operation in which some operations, for example, the operation of performing the unlock and displaying an icon selection screen and the operation of selecting an icon on the icon selection screen, are omitted.
By arranging the application icon at a position spaced apart from the key icon, specifically on the ring 66, the smartphone 1 can suppress the application associated with the application icon from being executed at unintended timings due to erroneous operations, even when the application icon is arranged on the lock screen.
When detecting the drop of the key icon on the second area or the home icon, the smartphone 1 unlocks the locked state and displays the home screen. Therefore, the user can unlock the locked state through a simple operation. The smartphone 1 may be configured not to display the home icon on the lock screen.
When detecting the drop of the key icon on the home icon, the smartphone 1 may execute processing different from that of the case where the drop of the key icon in the second area is detected. For example, when detecting the drop of the key icon on the home icon, the smartphone 1 unlocks the locked state and displays the home screen. On the other hand, when detecting the drop of the key icon in the second area, the smartphone 1 unlocks the locked state, and then displays the application if there is an application being already executed, and displays the home screen if there is no application being executed. In this manner, by changing the processing to be executed according to whether the position where the key icon is dropped is on the home icon or in the second area, the smartphone 1 can execute more types of processing quickly from the locked state.
In the processing operation illustrated in FIG. 11, the smartphone 1 determines the detected gesture for the key icon in the order of Step S22, Step S27, Step S30, Step S34, and Step S37, but the order of the determination is not particularly limited. The smartphone 1 may execute the determinations of Step S22, Step S27, Step S30, Step S34, and Step S37 in any order.
In the processing operation illustrated in FIG. 11, when it is determined at Step S37 that the key icon is released in the first area, the smartphone 1 deletes the displayed sub icons, but it is not limited thereto. After displaying the sub icons at Step S29, the smartphone 1 may delete the displayed sub icons, based on the position of the key icon 64. For example, as illustrated in Step S6 of FIG. 10, when detecting the gesture of superimposing the key icon 64 and the application icon 68 b, the smartphone 1 displays the sub icons 78 a, 78 b and 78 c associated with the application icon 68 b, but, after that, when detecting the gesture of moving the key icon 64 to the outer area of the sub ring 76, the smartphone 1 may delete the displayed sub icons 78 a, 78 b and 78 c. After displaying the sub icons 78 a, 78 b and 78 c associated with the application icon 68 b, when detecting the gesture of superimposing the key icon 64 on other application icons 68 a, 68 c and 68 d, the smartphone 1 may delete the displayed sub icons 78 a, 78 b and 78 c. Therefore, the user can efficiently change the display of the sub icons associated with the respective application icons.
In the processing operation illustrated in FIG. 11, when it is determined at Step S37 that the key icon is not released in the first area, the smartphone 1 proceeds to Step S22 again to repeat the processing, but it is not limited thereto. When it is determined that the key icon is not released in the first area, the smartphone 1 may further determine whether there is a gesture of dropping the key icon in the inner area of the sub ring. When detecting the gesture of dropping the key icon in the inner area of the sub ring, the smartphone 1 may execute the application associated with the application icon located at the center of the sub ring. That is, when the key icon 64 is not dropped in either of the first area, the second area, and the sub icons after displaying the sub icons, the smartphone 1 may execute the application associated with the application icon located at the center of the sub ring.
In the processing operation illustrated in FIG. 11, when it is determined at Step S30 that it is a gesture of releasing the key icon on the application icon, the smartphone 1 executes the unlock processing, but it is not limited thereto. When detecting the gesture of releasing the key icon on the application icon, the smartphone 1 may execute the application associated with the application icon, without executing the unlock processing.
FIG. 12 is a flowchart illustrating another example of the procedure of the control that is performed in the locked state. FIG. 12 illustrates another example of the procedure that is performed when it is determined that the gesture of releasing the key icon on the application icon is detected, that is, the procedure that is performed after it is determined as Yes at Step S30 of FIG. 11. The procedure illustrated in FIG. 12 is realized by the controller 10 executing the control program 9A.
When it is determined as Yes at Step S30, the controller 10 executes the application associated with the application icon located at the dropped position at Step S32, and displays the screen of the executed application on the touch screen display 2 at Step S33. When the screen of the executed application is displayed at Step S33, the controller 10 performs the processing of the executed application at Step S39. For example, when a gesture is detected, the controller 10 executes the processing associated with the detected gesture.
When the processing of the application is executed at Step S39, the controller 10 determines whether to end (terminate or suspend) the application at Step S40. For example, the controller 10 determines to end the application when detecting a gesture of ending the application or when it is determined that a preset condition is satisfied. When it is determined not to end the application (No at Step S40), the controller 10 proceeds to Step S39 and performs the processing of the application. When it is determined to end the application (Yes at Step S40), the controller 10 displays the lock screen at Step S41 and ends the processing. That is, when it is determined as Yes at Step S40, the controller 10 proceeds to Step S10 in FIG. 11 and ends the processing illustrated in FIG. 12.
As illustrated in FIG. 12, when detecting the release gesture in the state that the key icon and the application icon are superimposed, the smartphone 1 executes the application associated with the application icon on which the key icon is superimposed, and displays the lock screen again when the application is ended. For example, the smartphone 1 may execute the processing operation of executing the mail application when the key icon and the application icon associated with the mail application are superimposed, and returning to the lock screen when the mail is transmitted. Therefore, the user can use the application associated with the application icon displayed on the lock screen in a state in which the locked state is not unlocked. That is, the user can execute the predetermined set processing, without inputting the cumbersome unlock gesture.
In a similar manner, in the processing operation illustrated in FIG. 11, when it is determined at Step S34 that the gesture of releasing the key icon on the sub icon is detected, the smartphone 1 executes the unlock processing, but it is not limited thereto. When detecting the gesture of releasing the key icon on the sub icon, the smartphone 1 may execute the particular processing associated with the sub icon, without executing the unlock processing. Subsequently, the smartphone 1 may display the lock screen again when the particular processing is ended.
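The FIG. 12 variant thus differs from FIG. 11 only in that the application (or the particular processing of the sub icon) runs without unlocking and the lock screen is redisplayed when it ends. A minimal sketch, assuming a hypothetical application call that returns only when the application ends:

```kotlin
// Minimal sketch of the FIG. 12 variant. runApplication and showLockScreen are hypothetical
// callbacks; runApplication is assumed to block until the application is ended (Step S40: Yes).
fun executeApplicationWithoutUnlock(runApplication: () -> Unit, showLockScreen: () -> Unit) {
    runApplication()   // Steps S32, S33, S39: execute the application and process its gestures
    showLockScreen()   // Step S41: redisplay the lock screen after the application ends
}
```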
The smartphone 1 of the embodiment sets the gesture of superimposing the key icon 64 and the application icon 68 b as the particular gesture for displaying the sub icons 78 a, 78 b and 78 c associated with the application icon 68 b. However, the particular gesture for displaying the sub icons is not limited to this gesture. For example, a gesture of directly tapping the application icon, without the user operating the key icon 64, may be set as the particular gesture for displaying the sub icons associated with the application icon.
Then, an example of the processing of displaying the sub icons, which is executed during displaying the lock screen, will be described with reference to FIGS. 13 and 14. FIG. 13 illustrates an example of the control during displaying the lock screen. At Step S8 illustrated in FIG. 13, the lock screen 60 is displayed on the display 2A, and the user's finger F taps the application icon 68 b. In this case, the smartphone 1 detects the tap in the portion where the application icon 68 b is arranged.
At Step S9, the sub icons 78 a, 78 b and 78 c are displayed on the lock screen 60. In this case, at Step S8, when detecting the gesture of tapping the application icon 68 b as the particular gesture for the application icon 68 b, the smartphone 1 displays the sub icons 78 a, 78 b and 78 c associated with the application icon 68 b.
As described above, the smartphone 1 of the embodiment sets the gesture of tapping an application icon as the particular gesture for displaying the sub icons associated with the application icon.
An example of the procedure of displaying the sub icons based on the functions provided by the control program 9A will be described with reference to FIG. 14. FIG. 14 illustrates the procedure of the control that is performed in the locked state, in particular, the control that is performed during displaying the lock screen. The procedure illustrated in FIG. 14 is realized by the controller 10 executing the control program 9A. The procedure illustrated in FIG. 14 is performed as a part of the processing of Step S20. At Step S20 of FIG. 11, the controller 10 may execute another procedure for control related to the lock screen 60 in parallel with the procedure illustrated in FIG. 14.
The controller 10 determines at Step S42 whether the detected gesture is a gesture of tapping the application icon. That is, as illustrated in FIG. 13, the controller 10 determines whether a tap is detected in the portion where the application icon 68 b is arranged.
When it is determined at Step S42 that the detected gesture is the gesture of tapping the application icon (Yes at Step S42), the controller 10 determines at Step S43 whether there are sub icons associated with the application icon. When it is determined at Step S42 that there is no gesture of tapping the application icon (No at Step S42), the controller 10 ends the processing.
When it is determined at Step S43 that there are the sub icons associated with the application icon (Yes at Step S43), the controller 10 displays the sub icons on the lock screen at Step S44. For example, as illustrated in FIG. 13, when detecting the particular gesture for the application icon 68 b (in FIG. 13, the gesture of tapping the application icon 68 b), the controller 10 displays the sub icons 78 a, 78 b and 78 c on the lock screen 60. Then, the processing is ended. When it is determined at Step S43 that there are no sub icons associated with the application icon (No at Step S43), the controller 10 ends the processing.
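The tap handling of FIG. 14 reduces to a single check on whether the tapped application icon has associated sub icons. The sketch below represents the sub icons simply by identifier strings for illustration; the names are assumptions.

```kotlin
// Hypothetical sketch of the tap handling of FIG. 14: a tap on an application icon displays
// its associated sub icons, if any; otherwise nothing is displayed.
fun onApplicationIconTapped(subIconsOfTappedIcon: List<String>, showSubIcons: (List<String>) -> Unit) {
    if (subIconsOfTappedIcon.isNotEmpty()) {   // Step S43: are there associated sub icons?
        showSubIcons(subIconsOfTappedIcon)     // Step S44: display them on the lock screen
    }                                          // No at Step S43: end without displaying anything
}
```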
As illustrated in FIGS. 13 and 14, when the gesture of tapping the application icon 68 b is detected as the particular gesture for the application icon 68 b so as to display the sub icons 78 a, 78 b and 78 c, the smartphone 1 displays the sub icons 78 a, 78 b and 78 c associated with the application icon 68 b. Therefore, the user can display the sub icons 78 a, 78 b and 78 c associated with the application icon 68 b on the lock screen by just tapping the application icon, without performing the operation of superimposing the key icon 64 on the application icon. Furthermore, the user can execute desired processing quickly from the lock screen by selecting a sub icon associated with the desired processing from among the displayed sub icons 78 a, 78 b and 78 c.
The smartphone 1 may display the sub icons associated with the respective application icons 68 a to 68 d displayed on the lock screen in response to other gestures. For example, when detecting a gesture such as a double tap or a long tap for the application icon 68 b, the smartphone 1 may display the sub icons associated with the application icon. In this case, after displaying the sub icons, when detecting again the particular gesture (a tap or the like) for the application icon, the smartphone 1 may delete the sub icons displayed on the lock screen.
In the embodiment, the example in which the sub icons are associated with particular processing executable in the mail application has been described; however, what the sub icons are associated with is not limited thereto. Examples of the particular processing executable in the mail application include incoming mail check processing, new mail composition processing, outgoing mail check processing, and the like. As in the case of the application icons, a sub icon may be associated with an arbitrary application installed in the smartphone 1. Thus, the user can hierarchically organize the applications executable on the lock screen by associating frequently used applications with the application icons of the upper layer and associating occasionally used applications with the sub icons of the lower layer.
In the embodiment, the example in which the sub icons are associated with the application icon has been described; however, third icons of a lower layer may be associated with the sub icons. Thus, the user can use groups of icons (that is, the application icon of the upper layer, the sub icons of the intermediate layer, and the third icons of the lower layer) that function as the short-cuts of the applications executable on the lock screen or the particular processing executable on the lock screen. As a result, the operability and convenience of the lock screen are further improved.
In the embodiment, the application icons are associated with the sub icons; however, an application icon does not necessarily have to be associated with an application. Therefore, the user can use the application icons as folders for organizing the sub icons. That is, the user can organize the sub icons into desired categories by using the application icons of the upper layer displayed on the lock screen as the folders. For example, the user can set the image of the application icon displayed on the lock screen to a character string (for example, "amusement", "work", "private", and the like) representing a category, and associate the sub icons with the application icon according to the intended category. As a result, the operability and convenience of the lock screen are further improved.
Then, an example of the processing of moving the position of the application icon executed during displaying the lock screen will be described with reference to FIGS. 15 and 16. FIG. 15 illustrates an example of the control during displaying the lock screen. At Step S51 illustrated in FIG. 15, the lock screen 60 is displayed on the display 2A. At Step S51, the user's finger F touches the ring 66. In this case, the smartphone 1 detects the touch in the portion where the ring 66 is arranged.
At Step S52, the user swipes his/her finger F along the ring 66. After the user's finger F touches the position where the ring 66 is displayed at Step S51, the user swipes the finger F along a path indicated by an arrow α5. In this case, the smartphone 1 detects the swipe along the ring 66. When detecting such a swipe, the smartphone 1 rotates the application icons 68 a, 68 b, 68 c and 68 d arranged on the ring 66, based on a movement amount of the swipe on the ring 66, in the embodiment, the angle of the arc of the swipe. That is, in such a state that the ring 66 and the application icons 68 a, 68 b, 68 c and 68 d arranged on the ring 66 are connected together, the smartphone 1 rotates the ring 66 and the application icons 68 a, 68 b, 68 c and 68 d by using the center of the ring 66 as the rotational axis. Because the shape of the ring 66 does not change while it is rotated, the smartphone 1 may display the screen as if only the application icons 68 a, 68 b, 68 c and 68 d were rotated.
At Step S53, the user swipes his/her finger F along the ring 66. After the user's finger F touches the position where the ring 66 is displayed at Step S51, the user swipes the finger F along a path indicated by an arrow α5, and also swipes the finger F along a path indicated by an arrow α6. In this case, the smartphone 1 detects the swipe along the ring 66. When detecting such a swipe, the smartphone 1 further rotates the application icons 68 a, 68 b, 68 c and 68 d arranged on the ring 66, based on a movement amount of the swipe, in the embodiment, an angle of an arc of the swipe on the ring 66.
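The rotation amount in FIG. 15 is described as the angle of the arc traced by the swipe. One way to compute such an angle, assuming a circular ring and using the angular positions of successive contact points around the ring center, is sketched below; the formula and the names are assumptions, not the disclosed algorithm.

```kotlin
import kotlin.math.atan2

// Illustrative computation of the arc angle of a swipe around the center of the ring 66,
// and rotation of the icon positions by that angle. Coordinates are in pixels; all names
// are hypothetical.
fun arcAngleDegrees(centerX: Double, centerY: Double,
                    prevX: Double, prevY: Double,
                    curX: Double, curY: Double): Double {
    val previousAngle = atan2(prevY - centerY, prevX - centerX)
    val currentAngle = atan2(curY - centerY, curX - centerX)
    return Math.toDegrees(currentAngle - previousAngle)
}

// Each application icon keeps an angular position on the ring; rotating the ring offsets
// every position by the arc angle of the swipe.
fun rotateIconAngles(iconAngles: List<Double>, arcAngleDeg: Double): List<Double> =
    iconAngles.map { ((it + arcAngleDeg) % 360.0 + 360.0) % 360.0 }
```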
An example of the procedure of the control based on the functions provided by the control program 9A will be described with reference to FIG. 16. FIG. 16 illustrates the procedure of the control that is performed in the locked state, in particular, the control that is performed during displaying the lock screen. The procedure illustrated in FIG. 16 is realized by the controller 10 executing the control program 9A. The procedure illustrated in FIG. 16 is performed as a part of the processing of Step S20. At Step S20 of FIG. 11, the controller 10 may execute another procedure for control related to the lock screen 60 in parallel with the procedure illustrated in FIG. 16.
The controller 10 determines at Step S60 whether the detected gesture is a swipe along the ring 66. That is, the controller 10 determines whether a gesture of touching the area where the ring 66 is displayed and swiping along the ring 66, as illustrated in FIG. 15, is detected. When it is determined at Step S60 that the detected gesture is the swipe along the ring 66 (Yes at Step S60), the controller 10 changes the display positions of the application icons at Step S62. That is, the controller 10 rotates the application icons 68 a, 68 b, 68 c and 68 d arranged on the ring 66, based on the swipe detected at Step S60. When the display positions of the application icons are changed at Step S62, the controller 10 ends the processing. When it is determined at Step S60 that there is no swipe along the ring 66 (No at Step S60), the controller 10 ends the processing.
As described in FIGS. 15 and 16, when detecting the gesture of swiping along the ring 66, the smartphone 1 rotates the application icons 68 a, 68 b, 68 c and 68 d arranged on the ring 66. Therefore, the user can easily adjust the positions of the application icons 68 a, 68 b, 68 c and 68 d on the lock screen, and move the desired application icon to the position where it is easy to drop the key icon 64.
The smartphone 1 may be configured to adjust the positions of the application icons 68 a, 68 b, 68 c and 68 d displayed on the lock screen by other gestures. For example, when detecting a swipe of which the start point is an application icon and the end point is an arbitrary position on the ring 66, that is, when detecting the drop of the application icon on the arbitrary position on the ring 66, the smartphone 1 may move the display position of the application icon to the dropped position. When detecting a swipe of which the start point is an application icon and the end point is another application icon on the ring 66, that is, when detecting the drop of the application icon on the other application icon, the smartphone 1 may exchange the display position of the application icon with the display position of the other application icon.
Another example of the lock screen will be described with reference to FIGS. 17A to 17C. FIGS. 17A to 17C illustrate examples of the respective lock screens. Although the lock screen 60 displays four application icons 68 a, 68 b, 68 c and 68 d on the ring 66, the number of the application icons is not limited thereto. For example, the lock screen 60 a illustrated in FIG. 17A displays two application icons of the application icons 68 a and 68 b on the ring 66. The lock screen 60 b illustrated in FIG. 17B displays eight application icons of the application icons 68 a, 68 b, 68 c, 68 d, 68 e, 68 f, 68 g and 68 h on the ring 66. The lock screens 60 a and 60 b have the same configuration as the lock screen 60, except for the number of the application icons. As illustrated in FIGS. 17A and 17B, the smartphone 1 can display an arbitrary number of the application icons on the lock screen.
The smartphone 1 may display the home icon 69 on the ring 66. Specifically, the lock screen 60 c illustrated in FIG. 17C displays five application icons of the application icons 68 a, 68 b, 68 c, 68 d and 68 e and the home icon 69 on the ring 66. The home icon 69 is displayed between the application icon 68 a and the application icon 68 e.
The shape of the ring 66 of the lock screen is not limited to a circle. The ring 66 has only to divide the lock screen into a first area, which includes the key icon 64, and a second area, which does not include the key icon 64. The ring 66 has only to have a frame surrounding the outer edge of the key icon 64. The frame surrounding the outer edge of the key icon 64 may have various shapes, such as a polygon, an ellipse, and a combined shape of a curve and a straight line. The ring 66 may be displayed on the lock screen. By displaying the ring 66 on the lock screen, the smartphone 1 can clearly indicate to the user the boundary between the first area and the second area. Therefore, the user can execute the desired processing more reliably.
By displaying the application icon on the ring 66 or the frame, which is the boundary between the first area and the second area, the smartphone 1 can facilitate the input of the gesture of superimposing the application icon and the key icon, while suppressing the application from being executed by erroneous operations. Therefore, it is preferable to display the application icon on the ring 66 or the frame, but the smartphone 1 may also display the application icon on the area that does not overlap the ring 66 or the frame.
Then, the control for setting up the display content of the lock screen will be described with reference to FIGS. 18 to 22. FIG. 18 illustrates an example of an icon setting screen. The icon setting screen 90 illustrated in FIG. 18 displays a plurality of items 92, check boxes 94 corresponding to the items 92, and a scroll bar 98. Check marks 96 are displayed in some of the check boxes 94.
The items 92 are images indicating applications that are executable by the smartphone 1 and can be displayed as application icons. The items 92 display character information set to the applications, specifically, application names. The icon setting screen 90 includes the items 92 created for the respective applications that are executable by the smartphone 1, and displays the items 92 as a list.
The check boxes 94 are square frames and are displayed to the left of the respective items 92. The check boxes 94 are display areas for representing whether the items 92 are selected. The check mark 96 is an image representing whether the item 92 corresponding to the check box 94 is selected as an item to be displayed as an application icon. As described above, the check mark 96 is displayed in the check box 94 when the item 92 is selected, and is not displayed when the item 92 is unselected. The scroll bar 98 is an image representing which part of the entire icon setting screen 90 is currently displayed on the display 2A. When detecting an operation of moving an object 98 a representing a current position on the scroll bar 98, the smartphone 1 scrolls the icon setting screen 90 displayed on the display 2A, based on the detected operation.
Although the items 92 are displayed as the images of character information of the application names, the icon setting screen 90 is not limited thereto. The icon setting screen 90 may use, as the items 92, images of the application icons associated with the applications, or images of the icons displayed on the home screen.
The smartphone 1 displays the icon setting screen 90, and determines the application icons to be displayed on the lock screen, based on the operations detected while the icon setting screen 90 is displayed.
The procedure of the control based on the functions provided by the control program 9A, specifically the procedure of the control of the processing of determining the application icons displayed on the lock screen, will be described with reference to FIG. 19. FIG. 19 is a flowchart illustrating the procedure of the control for setting up the display content of the lock screen. The procedure illustrated in FIG. 19 is realized by the controller 10 executing the control program 9A. The procedure illustrated in FIG. 19 is executed, for example, when an operation of executing the application for setting the application icons displayed on the lock screen is detected.
At Step S70, the controller 10 displays the icon setting screen, that is, the screen illustrated in FIG. 18, on the touch screen display 2. When the icon setting screen is displayed at Step S70, the controller 10 determines at Step S72 whether a gesture has been detected. That is, the controller 10 obtains the detection result of the touch screen 2B, and determines whether a gesture has been detected based on the obtained detection result. When it is determined at Step S72 that no gesture has been detected (No at Step S72), the controller 10 returns to Step S72 and determines again whether a gesture has been detected.
When it is determined at Step S72 that a gesture has been detected (Yes at Step S72), the controller 10 determines at Step S74 whether the gesture is an item selection operation. That is, the controller 10 determines whether the gesture detected at Step S72 is a gesture of selecting an item displayed on the icon setting screen. The gesture of selecting an item displayed on the icon setting screen is a particular gesture selected in advance from among various gestures. The gesture of selecting an item may be, for example, a tap, a long tap, or a double tap on the area where the item is displayed, or a tap, a long tap, a double tap, or the like on the area where the check box corresponding to the item is displayed.
When it is determined at Step S74 that the gesture is the item selection operation (Yes at Step S74), the controller 10 determines at Step S76 whether the targeted item is in a selected state. That is, the controller 10 detects the state of the item selected at Step S74, and determines whether the item is in the selected state; in the embodiment, this is determined by whether the check mark 96 is displayed in the check box 94 corresponding to the item 92.
When it is determined at Step S76 that the item is in the selected state (Yes at Step S76), the controller 10 changes the item to an unselected state at Step S78, and proceeds to Step S82. When it is determined at Step S76 that the item is not in the selected state (No at Step S76), the controller 10 changes the item to a selected state at Step S80, and proceeds to Step S82.
When the processing of Step S78 or Step S80 is executed, the controller 10 changes the display state of the item at Step S82. That is, the controller 10 clears the check mark of the check box of the item changed to the unselected state at Step S78, and displays the check mark in the check box of the item changed to the selected state at Step S80. When the processing of Step S82 is executed, the controller 10 proceeds to Step S72 and repeats the above-described processing.
When it is determined at Step S74 that the gesture is not the item selection operation (No at Step S74), the controller 10 determines at Step S84 whether the gesture is the setting completion operation. When it is determined at Step S84 that the gesture is not the setting completion operation (No at Step S84), the controller 10 executes the process corresponding to the detected gesture at Step S86, and proceeds to Step S72. Examples of the process corresponding to the detected gesture include scroll processing of the icon setting screen, display processing of screens displayable from the icon setting screen, for example, a help screen, and the like. When it is determined at Step S84 that the gesture is the setting completion operation (Yes at Step S84), the controller 10 sets the items in the selected state as the items to be displayed at Step S88. That is, the controller 10 sets the applications of the items in the selected state as the applications whose application icons are displayed on the lock screen. When the display items are set at Step S88, the controller 10 ends the processing.
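The selection loop of FIG. 19 can be expressed compactly in code. The following Python sketch is an illustrative assumption of how the toggle-and-complete logic could be written; the event representation and the names run_icon_setting, events and so on are hypothetical and are not taken from the description, which specifies only the control flow.

def run_icon_setting(applications, events):
    """applications: names of the items listed on the icon setting screen.
    events: iterable of (kind, target) tuples such as ("select", "Mail"),
    ("other", None) or ("complete", None)."""
    selected = set()                      # items whose check boxes show a check mark
    for kind, target in events:
        if kind == "select" and target in applications:
            # Steps S74 to S82: toggle the check mark of the tapped item.
            if target in selected:
                selected.discard(target)  # S78: change to the unselected state
            else:
                selected.add(target)      # S80: change to the selected state
        elif kind == "complete":
            # Steps S84 and S88: items left selected become the application
            # icons displayed on the lock screen.
            return sorted(selected)
        else:
            pass                          # S86: scrolling, help screen, and the like
    return sorted(selected)

# Example: select Mail and Browser, deselect Mail, then complete the setting.
apps = ["Mail", "Browser", "Camera", "Music"]
print(run_icon_setting(apps, [("select", "Mail"), ("select", "Browser"),
                              ("select", "Mail"), ("complete", None)]))  # ['Browser']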
The procedure of the control based on the functions provided by the control program 9A, specifically the procedure of the control of setting the display positions of the application icons to be displayed, will be described with reference to FIG. 20. FIG. 20 is a flowchart illustrating the procedure of the control for setting up the display content of the lock screen, in particular, the procedure of the control for setting up the display positions of the application icons to be displayed. The procedure illustrated in FIG. 20 is realized by the controller 10 executing the control program 9A. The procedure illustrated in FIG. 20 is executed when the setting processing of the application icons to be displayed is completed, or when an operation of displaying the lock screen is detected for the first time after the setting processing of the application icons to be displayed is completed. The procedure illustrated in FIG. 20 may also be executed whenever an operation of displaying the lock screen is detected.
The controller 10 extracts the application icons to be displayed at Step S90. Specifically, the controller 10 extracts the applications set as display items in the above-described processing, and extracts the application icons associated with the extracted applications as the application icons to be displayed.
When the application icons are extracted at Step S90, the controller 10 determines the spacing between the application icons at Step S92. Specifically, the controller 10 determines the spacing between the application icons on the ring 66, based on the number of the extracted application icons. The spacing may be equal, or may vary by position. For example, the controller 10 may set ranks for the application icons and determine the spacing such that the distance to the adjacent application icon is increased for application icons whose ranks are set high, and the distance to the adjacent application icon is decreased for application icons whose ranks are set low. Thus, the application icons having high ranks are arranged farther from the other application icons, so that the key icon can be superimposed on them more easily.
When the spacing is determined at Step S92, the controller 10 determines the arrangement positions of the application icons at Step S94. That is, the controller 10 determines the arrangement positions on the lock screen of the application icons extracted at Step S90, based on the spacing determined at Step S92. The controller 10 arranges the application icons at positions superimposed on the ring 66 with the determined spacing. The application icons may be arranged on the ring 66 in various orders, for example, in an order set in advance by the user.
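One way to realize the spacing and arrangement of Steps S92 and S94 is sketched below in Python. Equal spacing places the icons at uniform angular intervals on the ring; the optional rank weighting widens the gap around high-ranked icons. The weighting scheme, the function arrange_on_ring and its parameters are illustrative assumptions rather than formulas given in the description.

import math

def arrange_on_ring(icons, cx, cy, radius, ranks=None, start_angle=-90.0):
    """Return a dict mapping each icon to an (x, y) position on a circular ring.
    ranks, if given, maps an icon to a weight; heavier icons receive a wider
    arc, so their distance to the adjacent icons is increased."""
    weights = [1.0] * len(icons) if ranks is None else [float(ranks.get(i, 1.0)) for i in icons]
    total = sum(weights)
    positions, angle = {}, start_angle
    for icon, w in zip(icons, weights):
        arc = 360.0 * w / total                  # arc allotted to this icon
        theta = math.radians(angle + arc / 2.0)  # icon is centered within its arc
        positions[icon] = (cx + radius * math.cos(theta),
                           cy + radius * math.sin(theta))
        angle += arc
    return positions

# Four icons with equal spacing on a ring centered at (180, 320) with radius 140.
print(arrange_on_ring(["68a", "68b", "68c", "68d"], 180, 320, 140))
# Giving 68a a higher rank widens the gaps between 68a and its neighbors.
print(arrange_on_ring(["68a", "68b", "68c", "68d"], 180, 320, 140, ranks={"68a": 2.0}))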
As illustrated in FIGS. 19 and 20, the smartphone 1 may allow the user to select the application icon to be displayed on the lock screen. Accordingly, since the desired application icon can be displayed on the lock screen, the user can execute the desired application quickly.
By determining the spacing based on the number of the application icons to be displayed, the smartphone 1 can arrange the application icons on the ring with appropriate spacing. Therefore, the user can easily drop the key icon on the respective application icons arranged on the ring.
Although the case of automatically arranging the application icons has been described in FIG. 20, a manner of arranging the application icons is not limited thereto. The smartphone 1 may determine the arrangement positions of the application icons based on the gesture detected through the touch screen display 2.
Hereinafter, an example of the control that determines the arrangement positions of the application icons, based on the gesture detected through the touch screen display 2, will be described with reference to FIGS. 21 and 22. FIG. 21 illustrates an example of the control for setting up the display content of the lock screen.
At Step S101 illustrated in FIG. 21, an icon position setting screen 102 is displayed on the display 2A. The icon position setting screen 102 is a screen for setting the display positions of the application icons to be displayed on the lock screen. On the icon position setting screen 102, a ring 106 and application icons 108 a, 108 b, 108 c and 108 d are arranged. On the icon position setting screen 102, the same area 42 as the area 42 of the home screen 40 is arranged at the top edge of the display 2A. On the icon position setting screen 102, a remaining mark 43 indicating a remaining amount of a power supply and a radio-wave level mark 44 indicating electric field strength of radio wave for communication are displayed in the area 42.
The ring 106 is an image having a circular frame shape, and is displayed at the same position as that of the ring 66 displayed on the lock screen 60. The ring 106 is an image corresponding to the ring 66 of the lock screen 60. The ring 106 serves as a reference for determining the positions of the application icons 108 a, 108 b, 108 c and 108 d on the icon position setting screen 102.
The application icons 108 a, 108 b, 108 c and 108 d are the same images as the respective application icons 68 a, 68 b, 68 c and 68 d of the lock screen 60. The application icons 108 a, 108 b, 108 c and 108 d are arranged under the area 42 in a row.
At Step S102, the user's finger F touches the application icon 108 a. In this case, the smartphone 1 detects the touch in the portion where the application icon 108 a is arranged.
At Step S103, the user's finger F drops the application icon 108 a on the ring 106. That is, the user uses his/her finger F to touch the area where the application icon 108 a is displayed at Step S102, drags the application icon 108 a along a path indicated by an arrow α7, and releases the application icon 108 a in the area where the ring 106 is displayed. In this case, the smartphone 1 detects a swipe whose start point is the portion where the application icon 108 a is arranged and whose end point is the portion where the ring 106 is arranged. That is, the smartphone 1 detects the drop of the application icon 108 a on the ring 106.
At Step S104, the application icon 108 a is arranged on the ring 106. That is, when the drop is detected, as illustrated in Step S104, the smartphone 1 sets the position on the ring 106, at which the application icon 108 a is dropped, as the display position of the application icon 108 a.
By inputting the same gesture with respect to the application icons 108 b, 108 c and 108 d, the user can determine the positions of the respective application icons 108 b, 108 c and 108 d on the ring 106. When detecting the drop of the respective application icons 108 b, 108 c and 108 d on the ring 106, the smartphone 1 sets the dropped positions on the ring 106 as the display positions of the dropped application icons. When detecting the drop of the application icon 108 a, 108 b, 108 c or 108 d on an area not superimposed on the ring 106, the smartphone 1 of the embodiment returns the dropped application icon to its position prior to the drop.
The procedure of the control based on the functions provided by the control program 9A, specifically the procedure of the control of the processing of determining the display positions of the application icons to be displayed on the lock screen, will be described with reference to FIG. 22. FIG. 22 is a flowchart illustrating the procedure of the control for setting up the display content of the lock screen. The procedure illustrated in FIG. 22 is realized by the controller 10 executing the control program 9A. The procedure illustrated in FIG. 22 is executed, for example, when the operation of executing the application for determining the display positions of the application icons to be displayed on the lock screen is detected.
At Step S120, the controller 10 displays the icon position setting screen, that is, the screen illustrated in Step S101 of FIG. 21, on the touch screen display 2. When the icon position setting screen is displayed at Step S120, the controller 10 determines at Step S122 whether a gesture has been detected. That is, the controller 10 obtains the detection result of the touch screen 2B, and determines whether a gesture has been detected based on the obtained detection result. When it is determined at Step S122 that no gesture has been detected (No at Step S122), the controller 10 returns to Step S122 and determines again whether a gesture has been detected.
When it is determined at Step S122 that a gesture has been detected (Yes at Step S122), the controller 10 determines at Step S124 whether the detected gesture is a touch on the icon. That is, the controller 10 determines whether the gesture detected at Step S122 is a touch on the application icon displayed on the icon position setting screen.
When it is determined at Step S124 that the gesture is the touch on the icon (Yes at Step S124), the controller 10 determines at Step S126 whether a release has been detected. That is, the controller 10 determines whether the touch on the application icon, which is detected at Step S124, has been released. When it is determined at Step S126 that there is no release (No at Step S126), that is, when it is determined that the touch on the application icon is continued, the controller 10 returns to Step S126 and determines again whether a release has been detected.
When it is determined at Step S126 that a release has been detected (Yes at Step S126), the controller 10 determines at Step S128 whether the release position is on the ring. That is, the controller 10 determines whether the position of the release detected at Step S126 is on the ring, that is, whether the application icon has been dropped on the ring.
When it is determined at Step S128 that the release position is on the ring (Yes at Step S128), the controller 10 sets the release position as the icon display position at Step S130. That is, the controller 10 sets the position on the ring, at which the application icon is dropped, as the display position of the application icon. When the display position of the icon is changed at Step S130, the controller 10 proceeds to Step S122.
When it is determined at Step S128 that the release position is not on the ring (No at Step S128), that is, when it is determined that the dropped application icon is at a position not superimposed on the ring, the controller 10 invalidates the operation of moving the icon at Step S132. That is, the controller 10 returns the dropped application icon to the position touched at Step S124, that is, its position prior to the movement. When the operation of moving the icon is invalidated at Step S132, the controller 10 proceeds to Step S122.
When it is determined at Step S124 that the gesture is not the touch on the icon (No at Step S124), the controller 10 determines at Step S134 whether the gesture is the setting completion operation. When it is determined at Step S134 that the gesture is not the setting completion operation (No at Step S134), the controller 10 executes the process corresponding to the detected gesture at Step S136, and proceeds to Step S122. Examples of the process corresponding to the detected gesture include processing of adding the application icon to the icon position setting screen, display processing of screens displayable on the icon position setting screen, for example, a help screen, and the like. When it is determined at Step S134 that the gesture is the setting completion operation (Yes at Step S134), the controller 10 determines the display position of the icon at Step S138. That is, the controller 10 sets the position of the application icon, which is displayed on the ring of the icon position setting screen at the time point when the setting operation is determined as being completed, as the display position of the application icon. The controller 10 sets the application icons, which are not displayed on the ring, as icons which are not displayed on the lock screen. When the display position of the icon is set at Step S138, the controller 10 ends the processing.
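The drop validation of Steps S128 to S132 amounts to a simple hit test against the ring followed by either accepting the release position or reverting the move. The Python sketch below is an illustrative assumption; the Ring class, the band width and the function place_icon are hypothetical, and the coordinates are arbitrary.

import math

class Ring:
    """Circular frame defined by a center, a radius and a band width."""
    def __init__(self, cx, cy, radius, band=48.0):
        self.cx, self.cy, self.radius, self.band = cx, cy, radius, band

    def hit(self, x, y):
        # True when (x, y) falls within the ring's frame band.
        return abs(math.hypot(x - self.cx, y - self.cy) - self.radius) <= self.band / 2

def place_icon(ring, touch_point, release_point):
    """Return the icon's new display position, or the original touch position
    when the release is off the ring (the move is invalidated)."""
    if ring.hit(*release_point):
        return release_point   # Step S130: set the release position as the display position
    return touch_point         # Step S132: invalidate the operation of moving the icon

ring = Ring(cx=180, cy=320, radius=140)
print(place_icon(ring, (60, 120), (180, 180)))  # released on the ring: position kept
print(place_icon(ring, (60, 120), (180, 320)))  # released inside the ring: move reverted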
As illustrated in FIGS. 21 and 22, the smartphone 1 can adjust the display positions of the application icons, based on the gesture of the user. Therefore, the application icons can be displayed on the lock screen in an arrangement that allows the user to use the application icons more easily.
In the examples illustrated in FIGS. 21 and 22, when an icon is dropped on a position where the icon is not superimposed on the ring, the smartphone 1 invalidates the operation of moving the icon. However, even when an icon is dropped on a position where the icon is not superimposed on the ring, the smartphone 1 may change the display position of the icon to the dropped position. The smartphone 1 may continue the processing of FIG. 22 until all of the application icons to be arranged on the lock screen are arranged on the ring. The smartphone 1 may set the initial state of the icon position setting screen as a state in which the application icons are automatically arranged on the ring. Thus, the user can arrange all of the application icons on the ring by merely adjusting their positions.
The smartphone 1 may allow the use of only the application icons associated with the preset applications. That is, the smartphone 1 may disable the modification of the application icons to be displayed on the lock screen.
The embodiment disclosed in the present application can be modified without departing from the gist and scope of the invention. Moreover, the embodiments and their modifications disclosed in the present application can be combined with each other as necessary. For example, the embodiment may be modified as follows.
For example, the programs illustrated in FIG. 5 may be divided into a plurality of modules, or may be combined with any other program.
In the embodiment, the smartphone has been explained as an example of the device provided with the touch screen display; however, the device according to the appended claims is not limited to the smartphone. The device according to the appended claims may be a mobile electronic device other than the smartphone. Examples of the mobile electronic devices include, but are not limited to, mobile phones, tablets, mobile personal computers, digital cameras, media players, electronic book readers, navigators, and gaming devices. The device according to the appended claims may be a stationary-type electronic device. Examples of the stationary-type electronic devices include, but are not limited to, desktop personal computers, automatic teller machines (ATM), and television receivers.
Although the art of appended claims has been described with respect to a specific embodiment for a complete and clear disclosure, the appended claims are not to be thus limited but are to be construed as embodying all modifications and alternative constructions that may occur to one skilled in the art which fairly fall within the basic teaching herein set forth.

Claims (8)

What is claimed is:
1. A device, comprising:
a touch screen display configured to display a lock screen, the lock screen at least including a first icon not associated with any applications and a plurality of second icons each associated with an application; and
a controller,
wherein,
while displaying the first icon and the plurality of second icons on the lock screen and without displaying a sub icon on the lock screen,
in response to a detection of a gesture in which the first icon and one of the plurality of second icons are superimposed,
the controller is configured to simultaneously display the first icon and a sub icon corresponding to the superimposed second icon, the sub icon corresponding to a process executable in the application associated with the superimposed second icon,
when the first icon, the plurality of second icons, and the sub icon are simultaneously displayed on the lock screen,
the first icon is moveable to be superimposed on the plurality of second icons and the sub icon,
the controller is configured to execute the process associated with the sub icon, in response to a gesture in which the first icon and the sub icon are superimposed, and
the controller is configured to execute an application among the applications and associated with a second icon among the plurality of second icons, in response to a gesture in which the first icon and the second icon are superimposed.
2. The device according to claim 1, wherein
the touch screen display is configured to display the lock screen including a first area in which the first icon is arranged and a second area,
the first and second areas are separated by a closed shape boundary,
the first area is surrounded by the closed shape boundary,
the closed shape boundary is surrounded by the second area, and
each of the plurality of second icons is arranged on the closed shape boundary and extends into both the first and second areas.
3. The device according to claim 1, wherein the first icon is a key icon configured to unlock a locked state in which the lock screen is displayed.
4. The device according to claim 1, wherein the sub icon includes a plurality of sub icons, each of the sub icons corresponds to a process executable in the application associated with the superimposed second icon.
5. A method of controlling a device having a touch screen display, the method comprising:
displaying a lock screen on the touch screen display, the lock screen at least including a first icon not associated with any applications and a plurality of second icons each associated with an application;
while displaying the first icon and the plurality of second icons on the lock screen and without displaying a sub icon on the lock screen, in response to a detection of a gesture in which the first icon and one of the plurality of second icons are superimposed,
simultaneously displaying the first icon and a sub icon corresponding to the superimposed second icon, the sub icon corresponding to a process executable in the application associated with the superimposed second icon, wherein the first icon is moveable to be superimposed on the sub icon and the plurality of second icons; and
when the first icon, the plurality of second icons, and the sub icon are simultaneously displayed on the lock screen,
executing the process associated with the sub icon, in response to a gesture in which the first icon and the sub icon are superimposed, and
executing an application among the applications and associated with a second icon among the plurality of second icons, in response to a gesture in which the first icon and the second icon are superimposed.
6. The method according to claim 5, further comprising
displaying the lock screen including a first area in which the first icon is arranged and a second area,
wherein
the first and second areas are separated by a closed shape boundary,
the first area is surrounded by the closed shape boundary,
the closed shape boundary is surrounded by the second area, and
each of the plurality of second icons is arranged on the closed shape boundary and extends into both the first and second areas.
7. A non-transitory storage medium storing therein a program for causing, when executed by a device having a touch screen display, the device to execute:
displaying a lock screen on the touch screen display, the lock screen at least including a first icon not associated with any applications and a plurality of second icons each associated with an application;
while displaying the first icon and the plurality of second icons on the lock screen and without displaying a sub icon on the lock screen, in response to a detection of a gesture in which the first icon and one of the plurality of second icons are superimposed,
simultaneously displaying the first icon and a sub icon corresponding to the superimposed second icon but not displaying the superimposed second icon that is superimposed by the first icon, the sub icon corresponding to a process executable in the application associated with the superimposed second icon, wherein the first icon is moveable to be superimposed on the sub icon and remaining second icons other than the superimposed second icon; and
when the first icon, the remaining second icons, and the sub icon are simultaneously displayed on the lock screen,
executing the process associated with the sub icon, in response to a gesture in which the first icon and the sub icon are superimposed, and
executing an application among the applications and associated with a second icon among the remaining second icons, in response to a gesture in which the first icon and the second icon among the remaining second icons are superimposed.
8. The non-transitory storage medium according to claim 7 storing therein the program for causing, when executed by the device having the touch screen display, the device to further execute:
displaying the lock screen including a first area in which the first icon is arranged and a second area,
wherein
the first and second areas are separated by a closed shape boundary,
the first area is surrounded by the closed shape boundary,
the closed shape boundary is surrounded by the second area, and
each of the plurality of second icons is arranged on the closed shape boundary and extends into both the first and second areas.
US13/633,934 2011-10-03 2012-10-03 Device, method, and storage medium storing program Expired - Fee Related US9619139B2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2011219531 2011-10-03
JP2011-219531 2011-10-03
JP2012221208A JP6194162B2 (en) 2011-10-03 2012-10-03 Apparatus, method, and program
JP2012-221208 2012-10-03

Publications (2)

Publication Number Publication Date
US20130082965A1 US20130082965A1 (en) 2013-04-04
US9619139B2 true US9619139B2 (en) 2017-04-11

Family

ID=47992101

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/633,934 Expired - Fee Related US9619139B2 (en) 2011-10-03 2012-10-03 Device, method, and storage medium storing program

Country Status (2)

Country Link
US (1) US9619139B2 (en)
JP (1) JP6194162B2 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150046880A1 (en) * 2012-09-24 2015-02-12 Huizhou Tcl Mobile Communication Co., Ltd Screen-unlocking unit, screen-unlocking method thereof and mobile communication apparatus
US20180217732A1 (en) * 2016-06-07 2018-08-02 Huizhou Tcl Mobile Communication Co., Ltd Method and mobile terminal for quickly opening an application based on lock screen
US20190243536A1 (en) * 2018-02-05 2019-08-08 Alkymia Method for interacting with one or more software applications using a touch sensitive display
US10628008B2 (en) 2016-09-01 2020-04-21 Honda Motor Co., Ltd. Information terminal controlling an operation of an application according to a user's operation received via a touch panel mounted on a display device
US20230409165A1 (en) * 2019-05-05 2023-12-21 Apple Inc. User interfaces for widgets

Families Citing this family (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9489074B2 (en) * 2011-03-23 2016-11-08 Kyocera Corporation Electronic device, operation control method, and operation control program
US10304347B2 (en) 2012-05-09 2019-05-28 Apple Inc. Exercised-based watch face and complications
US10990270B2 (en) 2012-05-09 2021-04-27 Apple Inc. Context-specific user interfaces
US10613743B2 (en) 2012-05-09 2020-04-07 Apple Inc. User interface for receiving user input
US9547425B2 (en) 2012-05-09 2017-01-17 Apple Inc. Context-specific user interfaces
US9459781B2 (en) 2012-05-09 2016-10-04 Apple Inc. Context-specific user interfaces for displaying animated sequences
US20140055369A1 (en) * 2012-08-22 2014-02-27 Qualcomm Innovation Center, Inc. Single-gesture mobile computing device operations
US9715282B2 (en) * 2013-03-29 2017-07-25 Microsoft Technology Licensing, Llc Closing, starting, and restarting applications
CN103197767B (en) * 2013-04-10 2017-05-17 周可 Method and device for virtual keyboard input by aid of hand signs
KR102141155B1 (en) 2013-04-22 2020-08-04 삼성전자주식회사 Mobile apparatus providing with changed-shortcut icon responding to status of mobile apparatus and control method thereof
KR102203885B1 (en) * 2013-04-26 2021-01-15 삼성전자주식회사 User terminal device and control method thereof
CN104166468A (en) * 2013-05-17 2014-11-26 环达电脑(上海)有限公司 Touch screen device
CN104184862B (en) * 2013-05-27 2016-08-10 腾讯科技(深圳)有限公司 A kind of rapid communication method and apparatus
EP3018568A4 (en) * 2013-07-05 2017-04-19 Clarion Co., Ltd. Information processing device
KR102207443B1 (en) * 2013-07-26 2021-01-26 삼성전자주식회사 Method for providing graphic user interface and apparatus for the same
CN103500063B (en) * 2013-09-24 2016-08-17 小米科技有限责任公司 virtual keyboard display method, device and terminal
US10234988B2 (en) * 2013-09-30 2019-03-19 Blackberry Limited User-trackable moving image for control of electronic device with touch-sensitive display
US9733806B2 (en) * 2013-10-09 2017-08-15 Htc Corporation Electronic device and user interface operating method thereof
US20150121314A1 (en) * 2013-10-24 2015-04-30 Jens Bombolowsky Two-finger gestures
USD766951S1 (en) * 2014-05-01 2016-09-20 Beijing Qihoo Technology Co. Ltd Display screen with a graphical user interface
USD768148S1 (en) * 2014-05-23 2016-10-04 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
EP3584671B1 (en) 2014-06-27 2022-04-27 Apple Inc. Manipulation of calendar application in device with touch screen
WO2016014601A2 (en) 2014-07-21 2016-01-28 Apple Inc. Remote user interface
US10452253B2 (en) 2014-08-15 2019-10-22 Apple Inc. Weather user interface
CN104238876A (en) * 2014-08-29 2014-12-24 惠州Tcl移动通信有限公司 Intelligent terminal and display method of application icons thereof
US10254948B2 (en) 2014-09-02 2019-04-09 Apple Inc. Reduced-size user interfaces for dynamically updated application overviews
WO2016036541A2 (en) 2014-09-02 2016-03-10 Apple Inc. Phone user interface
USD760246S1 (en) * 2014-10-15 2016-06-28 EndGame Design Laboratories, LLC Display screen with graphical user interface
USD760245S1 (en) * 2014-10-15 2016-06-28 EndGame Design Laboratories, LLC Display screen with graphical user interface
USD759672S1 (en) * 2014-10-15 2016-06-21 EndGame Design Laboratories, LLC Display screen with animated graphical user interface
CN105718132B (en) * 2014-12-05 2018-10-30 富泰华工业(深圳)有限公司 Desktop navigation system and method
US10055121B2 (en) 2015-03-07 2018-08-21 Apple Inc. Activity based thresholds and feedbacks
WO2016144385A1 (en) 2015-03-08 2016-09-15 Apple Inc. Sharing user-configurable graphical constructs
US9916075B2 (en) 2015-06-05 2018-03-13 Apple Inc. Formatting content for a reduced-size user interface
EP4321088A2 (en) 2015-08-20 2024-02-14 Apple Inc. Exercise-based watch face
USD817966S1 (en) * 2016-01-26 2018-05-15 Sony Corporation Portion of display panel or screen with animated graphical user interface
DK201770423A1 (en) 2016-06-11 2018-01-15 Apple Inc Activity and workout updates
DK179412B1 (en) 2017-05-12 2018-06-06 Apple Inc Context-Specific User Interfaces
CN109324846B (en) * 2017-07-28 2021-11-23 北京小米移动软件有限公司 Application display method and device and storage medium
USD872120S1 (en) * 2017-10-26 2020-01-07 Herdx, Inc. Flat soft touch control panel user interface
JP6871846B2 (en) * 2017-12-20 2021-05-19 京セラ株式会社 Electronics and control methods
CN108536381B (en) * 2018-03-26 2021-11-09 珠海格力电器股份有限公司 Short-time screen locking method and device
USD865799S1 (en) 2018-05-03 2019-11-05 Caterpillar Paving Products Inc. Display screen with animated graphical user interface
US11327650B2 (en) 2018-05-07 2022-05-10 Apple Inc. User interfaces having a collection of complications
JP6921338B2 (en) 2019-05-06 2021-08-18 アップル インコーポレイテッドApple Inc. Limited operation of electronic devices
US11131967B2 (en) 2019-05-06 2021-09-28 Apple Inc. Clock faces for an electronic device
DK180684B1 (en) 2019-09-09 2021-11-25 Apple Inc Techniques for managing display usage
JP7383984B2 (en) 2019-10-30 2023-11-21 セイコーエプソン株式会社 Electronics
USD1018575S1 (en) * 2019-12-09 2024-03-19 Caterpillar Inc. Display screen having a graphical user interface
US11360651B2 (en) * 2020-02-14 2022-06-14 Dtoor Inc. Spc Mobile communication system with charging station and circular user interface
CN115552375A (en) 2020-05-11 2022-12-30 苹果公司 User interface for managing user interface sharing
US11372659B2 (en) 2020-05-11 2022-06-28 Apple Inc. User interfaces for managing user interface sharing
DK202070625A1 (en) 2020-05-11 2022-01-04 Apple Inc User interfaces related to time
CN112486367B (en) * 2020-11-27 2022-08-16 维沃移动通信有限公司 Application icon management method and device
US11694590B2 (en) 2020-12-21 2023-07-04 Apple Inc. Dynamic user interface with time indicator
US11720239B2 (en) 2021-01-07 2023-08-08 Apple Inc. Techniques for user interfaces related to an event
US11921992B2 (en) 2021-05-14 2024-03-05 Apple Inc. User interfaces related to time
USD995542S1 (en) * 2021-06-06 2023-08-15 Apple Inc. Display screen or portion thereof with graphical user interface
WO2023248912A1 (en) * 2022-06-23 2023-12-28 京セラドキュメントソリューションズ株式会社 Electronic device and control method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080165160A1 (en) 2007-01-07 2008-07-10 Kenneth Kocienda Portable Multifunction Device, Method, and Graphical User Interface for Interpreting a Finger Gesture on a Touch Screen Display
US20090307631A1 (en) * 2008-02-01 2009-12-10 Kim Joo Min User interface method for mobile device and mobile communication system
US20100001967A1 (en) * 2008-07-07 2010-01-07 Yoo Young Jin Mobile terminal and operation control method thereof
US20100269040A1 (en) * 2009-04-16 2010-10-21 Lg Electronics Inc. Mobile terminal and control method thereof
US20100299638A1 (en) * 2009-05-25 2010-11-25 Choi Jin-Won Function execution method and apparatus thereof
US20110016390A1 (en) * 2009-07-14 2011-01-20 Pantech Co. Ltd. Mobile terminal to display menu information according to touch signal
US20120046077A1 (en) * 2010-08-18 2012-02-23 Lg Electronics Inc. Mobile terminal and controlling method thereof
US20120069231A1 (en) * 2010-09-21 2012-03-22 Altek Corporation Unlocking method of a touch screen and electronic device with camera function thereof

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4526940B2 (en) * 2004-12-09 2010-08-18 株式会社リコー Information processing device
JP4899991B2 (en) * 2007-03-30 2012-03-21 富士ゼロックス株式会社 Display device and program
EP2060970A1 (en) * 2007-11-12 2009-05-20 Research In Motion Limited User interface for touchscreen device
KR101526965B1 (en) * 2008-02-29 2015-06-11 엘지전자 주식회사 Terminal and method for controlling the same
US9197738B2 (en) * 2008-12-04 2015-11-24 Microsoft Technology Licensing, Llc Providing selected data through a locked display
WO2010109849A1 (en) * 2009-03-23 2010-09-30 パナソニック株式会社 Information processing device, information processing method, recording medium, and integrated circuit
KR101608673B1 (en) * 2009-10-30 2016-04-05 삼성전자주식회사 Operation Method for Portable Device including a touch lock state And Apparatus using the same
KR101563150B1 (en) * 2011-09-09 2015-10-28 주식회사 팬택 Method for providing shortcut in lock screen and portable device employing the same

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080165160A1 (en) 2007-01-07 2008-07-10 Kenneth Kocienda Portable Multifunction Device, Method, and Graphical User Interface for Interpreting a Finger Gesture on a Touch Screen Display
WO2008086302A1 (en) 2007-01-07 2008-07-17 Apple Inc. Portable multifunction device, method, and graphical user interface for interpreting a finger gesture on a touch screen display
US20090307631A1 (en) * 2008-02-01 2009-12-10 Kim Joo Min User interface method for mobile device and mobile communication system
US20100001967A1 (en) * 2008-07-07 2010-01-07 Yoo Young Jin Mobile terminal and operation control method thereof
US20100269040A1 (en) * 2009-04-16 2010-10-21 Lg Electronics Inc. Mobile terminal and control method thereof
US20100299638A1 (en) * 2009-05-25 2010-11-25 Choi Jin-Won Function execution method and apparatus thereof
US20110016390A1 (en) * 2009-07-14 2011-01-20 Pantech Co. Ltd. Mobile terminal to display menu information according to touch signal
US20120046077A1 (en) * 2010-08-18 2012-02-23 Lg Electronics Inc. Mobile terminal and controlling method thereof
US20120069231A1 (en) * 2010-09-21 2012-03-22 Altek Corporation Unlocking method of a touch screen and electronic device with camera function thereof

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150046880A1 (en) * 2012-09-24 2015-02-12 Huizhou Tcl Mobile Communication Co., Ltd Screen-unlocking unit, screen-unlocking method thereof and mobile communication apparatus
US20180217732A1 (en) * 2016-06-07 2018-08-02 Huizhou Tcl Mobile Communication Co., Ltd Method and mobile terminal for quickly opening an application based on lock screen
US10732819B2 (en) * 2016-06-07 2020-08-04 Huizhou Tcl Mobile Communication Co., Ltd. Method and mobile terminal for quickly opening an application based on lock screen
US10628008B2 (en) 2016-09-01 2020-04-21 Honda Motor Co., Ltd. Information terminal controlling an operation of an application according to a user's operation received via a touch panel mounted on a display device
US20190243536A1 (en) * 2018-02-05 2019-08-08 Alkymia Method for interacting with one or more software applications using a touch sensitive display
US20230409165A1 (en) * 2019-05-05 2023-12-21 Apple Inc. User interfaces for widgets

Also Published As

Publication number Publication date
US20130082965A1 (en) 2013-04-04
JP2013093021A (en) 2013-05-16
JP6194162B2 (en) 2017-09-06

Similar Documents

Publication Publication Date Title
US9619139B2 (en) Device, method, and storage medium storing program
US9342235B2 (en) Device, method, and storage medium storing program
US9423952B2 (en) Device, method, and storage medium storing program
US9280275B2 (en) Device, method, and storage medium storing program
US9495025B2 (en) Device, method and storage medium storing program for controlling screen orientation
US9268481B2 (en) User arrangement of objects on home screen of mobile device, method and storage medium thereof
US9013422B2 (en) Device, method, and storage medium storing program
US9524091B2 (en) Device, method, and storage medium storing program
US9563347B2 (en) Device, method, and storage medium storing program
US9703382B2 (en) Device, method, and storage medium storing program with control for terminating a program
US9323444B2 (en) Device, method, and storage medium storing program
US9817544B2 (en) Device, method, and storage medium storing program
US9448691B2 (en) Device, method, and storage medium storing program
US9874994B2 (en) Device, method and program for icon and/or folder management
US20130080964A1 (en) Device, method, and storage medium storing program
US9116595B2 (en) Device, method, and storage medium storing program
US20130167090A1 (en) Device, method, and storage medium storing program
US20130139107A1 (en) Device, method, and storage medium storing program
US9542019B2 (en) Device, method, and storage medium storing program for displaying overlapped screens while performing multitasking function
US20130086523A1 (en) Device, method, and storage medium storing program
US9733712B2 (en) Device, method, and storage medium storing program
US20130162574A1 (en) Device, method, and storage medium storing program
US20130235088A1 (en) Device, method, and storage medium storing program
JP6058790B2 (en) Apparatus, method, and program
EP2963533A1 (en) Electronic device, control method, and control program

Legal Events

Date Code Title Description
AS Assignment

Owner name: KYOCERA CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WADA, YUUKI;OONISHI, KATSUAKI;SIGNING DATES FROM 20121011 TO 20121015;REEL/FRAME:029209/0613

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20210411