US20140089833A1 - Method and apparatus for providing multi-window in touch device - Google Patents


Info

Publication number
US20140089833A1
US20140089833A1
Authority
US
United States
Prior art keywords
application
window
screen
execution
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/035,266
Inventor
Daesik HWANG
Hyesoon JEONG
JeongHoon Kim
Dongjun Lee
Jonghwa OH
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Assigned to SAMSUNG ELECTRONICS CO., LTD. reassignment SAMSUNG ELECTRONICS CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HWANG, Daesik, Jeong, Hyesoon, KIM, JEONGHOON, LEE, DONGJUN, OH, Jonghwa
Publication of US20140089833A1
Priority claimed by application US17/030,645 (published as US11714520B2)

Classifications

    • G06F9/46: Multiprogramming arrangements
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: GUI interaction based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817: GUI interaction using icons
    • G06F3/0486: Drag-and-drop
    • G06F3/0488: GUI interaction using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886: GUI interaction by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F2203/04803: Split screen, i.e. subdividing the display area or the window area into separate subareas

Definitions

  • the present disclosure relates to a method and an apparatus for operating a function in a touch device. More particularly, the present disclosure relates to a method of providing a multi-window in a touch device so that a plurality of applications may be used efficiently through multi-splitting of a window on one screen provided by the touch device, and an apparatus thereof.
  • the mobile device may have various functions, such as processing audio and image calls, transmitting and receiving messages such as Short Message Service (SMS)/Multimedia Message Service (MMS) messages and e-mail, an electronic note, photography, broadcast playback, video playback, music playback, Internet browsing, a messenger, and a Social Networking Service (SNS).
  • an aspect of the present disclosure is to provide a method of implementing, in a single touch device system, a multi-window environment composed of at least two split windows, and an apparatus thereof.
  • Another aspect of the present disclosure is to provide a method of providing a multi-window in a touch device capable of maximizing usability for the user by splitting one screen into at least two windows so that a plurality of applications may be easily arranged and executed, and an apparatus thereof.
  • Another aspect of the present disclosure is to provide a method of supporting a multi-window environment in a touch device capable of simply changing a layout for the convenient operation of a plurality of applications in the multi-window environment, and an apparatus thereof.
  • Another aspect of the present disclosure is to provide a method of supporting a multi-window in a touch device capable of minimizing the burden of user operations in a multi-window environment and increasing the user's convenience by allowing the windows of a plurality of applications to be freely adjusted, and an apparatus thereof.
  • Another aspect of the present disclosure is to provide a method of supporting a multi-window environment in a touch device capable of providing large amounts of information and various experiences to the user by implementing a multi-window environment in the touch device, and an apparatus thereof.
  • Another aspect of the present disclosure is to provide a method of supporting a multi-window environment capable of improving user convenience and the usability of the touch device by implementing an optimal environment for supporting a multi-window in the touch device, and an apparatus thereof.
  • a method of executing an application in a touch device includes displaying an execution screen of a first application as a full screen, receiving an input of an execution event for executing a second application, configuring a multi-window in a split scheme when the execution event is released on a specific window, and individually displaying screens of the first application and the second application through respective split windows.
  • a method of executing an application in a touch device includes executing a first application corresponding to a user selection and displaying the application through one window as a full screen, receiving a first event input for selecting and moving a second application when the first application is executed, determining a multi-window split scheme and a region to which a first event is input, outputting a feedback for a window in which the second application is able to be executed and the region to which the first event is input, receiving a second event input for executing the second application, configuring the multi-window in response to the second event input, and independently displaying a screen of the first application and a screen of the second application through corresponding windows separated by the multi-window.
  • a method of executing an application in a touch device includes displaying an execution screen of a first application as a full screen, sliding-in a tray including an execution icon of an application according to a user input when the first application is executed, receiving an input for selecting an execution icon of a second application from the tray and dragging the selected execution icon into the full screen, receiving an input for dropping the execution icon in a specific window while the execution icon is dragged, executing the second application in response to the drop input of the execution icon, splitting a full screen into windows for displaying screens of the first application and the second application, and displaying a screen of the second application through the specific window in which the execution icon is dropped and displaying the screen of the first application through another split window.
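The drag-and-drop methods above can be sketched as a minimal state model in Python. The `MultiWindowManager` class, its method names, and the top/bottom region scheme are illustrative assumptions, not part of the disclosure:

```python
# Illustrative sketch of the drop-to-split flow described above.
# Class, method, and region names are assumptions for illustration.

class MultiWindowManager:
    def __init__(self):
        self.windows = []          # list of (app_name, region) tuples

    def launch_full_screen(self, app):
        # A single application initially occupies the full screen.
        self.windows = [(app, "full")]

    def drop_icon(self, app, region):
        # Releasing a dragged execution icon on a region ("top" or
        # "bottom") splits the full screen: the second application
        # takes the drop region, the first takes the other window.
        if len(self.windows) == 1:
            first_app, _ = self.windows[0]
            other = "bottom" if region == "top" else "top"
            self.windows = [(app, region), (first_app, other)]

mgr = MultiWindowManager()
mgr.launch_full_screen("browser")
mgr.drop_icon("memo", "top")
print(mgr.windows)   # memo in the drop window, browser in the other
```

Dropping the icon on the bottom window would, symmetrically, place the second application there and move the first application to the top window.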
  • a computer readable recording medium recording a program for executing the methods in a processor.
  • a touch device configured to display a screen interface of a multi-window environment, to display screens of a plurality of applications through a plurality of windows split in the screen interface, and to receive an event input for operating the applications, and a controller configured to control execution of the applications in the multi-window environment, and to control to independently display screens of at least two applications through the windows according to a user selection from among a plurality of executed applications.
  • a computer readable recording medium having recorded thereon a program performing a method.
  • the method includes receiving an input of an execution event for executing a second application when an execution screen of a first application is displayed as a full screen, configuring a multi-window in a split scheme when the execution event is released on a specific window, and individually displaying screens of the first application and the second application through respective split windows.
  • FIG. 1 is a block diagram schematically illustrating a configuration of a touch device according to an embodiment of the present disclosure
  • FIG. 2 is a diagram of a screen schematically illustrating a screen interface in a touch device according to an embodiment of the present disclosure
  • FIG. 3 is a diagram schematically illustrating an operation of a multi-window in a touch device according to an embodiment of the present disclosure
  • FIG. 4 is a diagram schematically illustrating an operation for splitting a multi-window in a touch device according to an embodiment of the present disclosure
  • FIGS. 5, 6, 7, 8, 9, 10, 11, and 12 are diagrams illustrating examples of an operation screen operating a tray for rapidly executing an application in a multi-window environment according to an embodiment of the present disclosure
  • FIGS. 13, 14, 15, 16, and 17 are diagrams illustrating examples of an operation screen operating a plurality of applications in a multi-window environment according to an embodiment of the present disclosure
  • FIGS. 18, 19, 20, 21, 22, and 23 are diagrams illustrating examples of operating a plurality of applications in a multi-window environment according to an embodiment of the present disclosure
  • FIGS. 24, 25, 26, 27, 28, and 29 are diagrams illustrating examples of operating a key pad for text input in a multi-window environment according to an embodiment of the present disclosure
  • FIG. 30 is a diagram illustrating an example of operating a plurality of applications in a multi-window environment according to an embodiment of the present disclosure
  • FIGS. 31, 32, 33, and 34 are diagrams illustrating examples of an operation screen providing information with respect to a plurality of applications executed according to a multi-window environment in a touch device according to an embodiment of the present disclosure
  • FIG. 35 is a flowchart illustrating a method of executing an additional application by switching a multi-window environment in a touch device according to an embodiment of the present disclosure.
  • FIG. 36 is a flowchart illustrating a method of executing an additional application in a multi-window environment in a touch device according to an embodiment of the present disclosure.
  • the present disclosure relates to a method of providing a multi-window in a touch device which splits a screen of the touch device into at least two windows in a split scheme, allowing a user to efficiently use a plurality of applications through the multi-window on one screen, and an apparatus thereof.
  • Embodiments of the present disclosure may include selecting an additional application in a touch device to determine a screen split scheme upon execution of a drag, and may provide feedback indicating the window, from among the respective windows split from one screen, in which the additional application is able to be executed. Accordingly, the user may know where the additional application will be executed. Further, according to an embodiment of the present disclosure, when the additional application is executed at a location selected by the user, a screen of the application may be displayed at a size suitable for the corresponding window.
  • a configuration of the touch device and a method of controlling an operation thereof according to embodiments of the present disclosure are not limited to the following description, but are also applicable to various additional embodiments based on the embodiments described herein.
  • FIG. 1 is a block diagram schematically illustrating a configuration of a touch device according to an embodiment of the present disclosure.
  • the touch device of the present disclosure may include a Radio Frequency (RF) communication unit 110, a user input unit 120, a display unit 130, an audio processor 140, a memory 150, an interface unit 160, a controller 170, and a power supply 180. Since the constituent elements shown in FIG. 1 are not all essential, a touch device of the present disclosure may be implemented with more or fewer elements than those described above.
  • the RF communication unit 110 may include one or more modules capable of performing wireless communication between the touch device and a wireless communication system, or between the touch device and a network in which another device is located.
  • the RF communication unit 110 may include a mobile communication module 111, a Wireless Local Area Network (WLAN) module 113, a short range communication module 115, a location calculation module 117, and a broadcasting reception module 119.
  • the mobile communication module 111 transmits and receives a wireless signal to and from at least one of a base station, an external terminal, and various servers (e.g., an integration server, a provider server, a content server, or the like).
  • the wireless signal may include a voice call signal, an image call signal, or data of various formats according to the transmission/reception of a character/multi-media message.
  • the mobile communication module 111 may access at least one of various servers under control of the controller 170 to receive an application available in a touch device according to user selection.
  • the WLAN module 113 may be a module for accessing the wireless Internet and forming a wireless LAN link with another touch device, and may be installed inside or outside the touch device.
  • Wireless Internet techniques may include Wireless LAN/Wi-Fi (WLAN), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), and High Speed Downlink Packet Access (HSDPA).
  • the WLAN module 113 may access at least one of various servers to receive an application usable in the touch device according to user selection, under control of the controller 170. Further, when a WLAN link is formed with another touch device, the WLAN module 113 may transmit or receive an application according to the user selection to or from the other touch device.
  • the short range communication module 115 is a module for short range communication.
  • the short range communication techniques may include Bluetooth, Bluetooth Low Energy (BLE), Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and Near Field Communication (NFC).
  • the location calculation module 117 is a module for acquiring a location of the touch device.
  • the location calculation module 117 includes a Global Positioning System (GPS) module.
  • the location calculation module 117 may calculate distance information from at least three base stations together with exact time information, and apply trigonometry to the calculated information so that three-dimensional current location information according to latitude, longitude, and altitude may be calculated.
  • the location calculation module 117 may continuously receive the current location of the touch device from at least three satellites in real time to calculate location information.
  • the location information of the touch device may be acquired by various schemes.
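The base-station calculation mentioned above can be illustrated with a simplified two-dimensional trilateration sketch (the disclosure describes three dimensions and timing information). Subtracting the circle equations pairwise gives a linear system in (x, y); the station layout and the function name below are illustrative assumptions:

```python
import math

def trilaterate(stations, dists):
    # 2D trilateration sketch: from |p - s_i| = d_i for three
    # stations, subtracting the squared-distance equations pairwise
    # removes the quadratic terms and leaves a 2x2 linear system.
    # Requires the three stations not to be collinear.
    (x1, y1), (x2, y2), (x3, y3) = stations
    d1, d2, d3 = dists
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1              # nonzero if not collinear
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

stations = [(0, 0), (4, 0), (0, 4)]
dists = [math.dist((1, 1), s) for s in stations]
print(trilaterate(stations, dists))      # approximately (1.0, 1.0)
```

The real GPS computation additionally solves for altitude and receiver clock bias, which is why at least four satellites are typically used in practice.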
  • the broadcasting reception module 119 receives a broadcasting signal (e.g., a TV broadcasting signal, a radio broadcasting signal, or a data broadcasting signal) and/or information (e.g., a broadcasting channel, a broadcasting program, or information about a broadcasting service provider) from an external broadcasting management server through a broadcasting channel (e.g., a satellite channel or a terrestrial channel).
  • the user input unit 120 generates input data for controlling an operation of the touch device in response to user manipulation.
  • the user input unit 120 may include a key pad, a dome switch, a touch pad (e.g., a resistive/capacitive type), a jog wheel, and a jog switch.
  • the user input unit 120 may be implemented in the form of buttons on the exterior of the touch device, and some buttons may be implemented by a touch panel.
  • the display unit 130 displays (i.e., outputs) information processed by the touch device. For example, when the touch device is in a call mode, the display unit 130 displays a User Interface (UI) or Graphical UI (GUI) associated with the call. When the touch device is in an image call mode or a shooting mode, the display unit 130 displays the photographed and/or received image, or the corresponding UI and GUI.
  • the display unit 130 may display an execution screen with respect to various functions (or applications) executed in the touch device through one or more windows, as will be illustrated in relation to the following figures, for instance FIG. 3 .
  • the execution screen may therefore display data relating to multiple applications.
  • the display unit 130 may provide at least two split screen regions according to a split scheme, and may provide each of the split screen regions as one window, so as to form a multi-window. That is, the display unit 130 may display a screen corresponding to the multi-window environment, and may display execution screens with respect to a plurality of applications through the multi-window, that is, the split regions. In this case, the display unit 130 may simultaneously display a screen of one window and a screen of another window in parallel.
  • the display unit 130 may display a separator for separating respective windows, that is, split regions.
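The split windows and separator can be sketched as a simple region computation over one screen. The 50/50 default ratio, the 4-pixel separator thickness, and the (x, y, width, height) tuples are illustrative assumptions:

```python
def split_regions(width, height, scheme="horizontal", ratio=0.5, sep=4):
    # Compute two window rectangles (x, y, w, h) plus a separator
    # strip from one screen, according to a split scheme.
    if scheme == "horizontal":          # top/bottom windows
        h1 = int(height * ratio) - sep // 2
        top = (0, 0, width, h1)
        separator = (0, h1, width, sep)
        bottom = (0, h1 + sep, width, height - h1 - sep)
        return top, separator, bottom
    else:                               # left/right windows
        w1 = int(width * ratio) - sep // 2
        left = (0, 0, w1, height)
        separator = (w1, 0, sep, height)
        right = (w1 + sep, 0, width - w1 - sep, height)
        return left, separator, right

# For a 720x1280 portrait screen the two windows and the separator
# strip tile the full height exactly (638 + 4 + 638 = 1280).
print(split_regions(720, 1280))
```

Dragging the separator would amount to re-running the same computation with a new `ratio`, which is how freely adjustable windows can be realized.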
  • the display unit 130 may display a tray (or an application launcher) for efficiently and intuitively executing an application according to the multi-window environment.
  • the tray comprises a screen region in which, for instance, icons representing respective applications may be displayed and selected.
  • the tray may comprise a pop-up object displayed upon the screen.
  • the tray may be moved within the screen.
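The tray behavior in the three points above can be modeled as a small data structure: a movable pop-up region holding execution icons. The `Tray` class and its method names are illustrative assumptions:

```python
class Tray:
    # Sketch of the slide-in tray (application launcher) described
    # above. Names are illustrative, not from the disclosure.
    def __init__(self, icons, position=(0, 0)):
        self.icons = list(icons)   # registered application icons
        self.position = position   # top-left corner on the screen
        self.visible = False

    def slide_in(self):
        self.visible = True

    def slide_out(self):
        self.visible = False

    def move_to(self, position):
        # The tray may be repositioned anywhere within the screen.
        self.position = position

    def pick(self, name):
        # Selecting a registered icon begins a drag of that app.
        return name if name in self.icons else None

tray = Tray(["memo", "browser", "music"])
tray.slide_in()
tray.move_to((0, 300))
print(tray.pick("memo"))   # memo
```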
  • the display unit 130 may display a virtual input device (e.g., a touch key pad, or a floating key pad which may be freely moved within the full screen region).
  • the display unit 130 may receive a user input on a full screen (the whole of the available screen area of the display unit 130) or on an individual window screen provided through one or more windows in a multi-window environment, and may transfer an input signal according to the user input to the controller 170. Further, the display unit 130 may support screen display in a landscape mode, screen display in a portrait mode, and switching of the screen display between the landscape mode and the portrait mode according to a change in the orientation of the touch device. An embodiment of a screen of the display unit 130 operated according to an embodiment of the present disclosure will be described herein.
  • the display unit 130 may include at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor-Liquid Crystal Display (TFT LCD), a Light Emitting Diode (LED), an Organic Light-Emitting Diode (OLED), an Active Matrix OLED (AMOLED), a flexible display, a bendable display, and a 3D display.
  • Some of the above displays may be implemented as a transparent display configured in a transparent type or a light transmittance type, so that the outside is visible through the display.
  • the display unit 130 may be used as an input device as well as an output device.
  • the touch panel may convert pressure applied to a specific part of the display unit 130 or a variation in capacitance created at the specific part of the display unit 130 into an electric input signal.
  • the touch panel may detect a touched location, an area, or pressure upon touch.
  • a signal(s) corresponding to the touch input is sent to a touch controller (not shown).
  • the touch controller processes the signal(s) and transmits corresponding data to the controller 170 . Accordingly, the controller 170 may recognize which region of the display unit 130 is touched.
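The region recognition described above reduces to a point-in-rectangle hit test: given the split window regions, the controller can determine which window a reported touch coordinate falls in. The window map and the function name are illustrative assumptions:

```python
def window_at(windows, touch):
    # Given window rectangles (x, y, w, h) and a touch point reported
    # by the touch controller, return the name of the touched window
    # region, or None if the touch lands outside every window
    # (e.g., on the separator strip between split windows).
    tx, ty = touch
    for name, (x, y, w, h) in windows.items():
        if x <= tx < x + w and y <= ty < y + h:
            return name
    return None

windows = {"top": (0, 0, 720, 638), "bottom": (0, 642, 720, 638)}
print(window_at(windows, (100, 700)))   # bottom
```

Routing each input signal through such a test is what lets a single touch screen drive several independently operating application windows.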
  • the audio processor 140 transmits an audio signal received from the controller 170 to a speaker 141 , and transfers an audio signal, such as a voice input from the microphone 143 , to the controller 170 .
  • the audio processor 140 converts voice/sound data into an audible sound and outputs the audible sound through the speaker 141 under the control of the controller 170 .
  • the audio processor 140 may convert an audio signal such as a voice input from the microphone 143 into a digital signal, and may transfer the digital signal to the controller 170 .
  • the speaker 141 may output audio data received from the RF communication unit 110 or stored in the memory 150 in a call mode, a record mode, a media contents play mode, a photographing mode, or a multimedia mode.
  • the speaker 141 may output a sound signal associated with a function (e.g., a receiving call connection, a sending call connection, a music file play, a video file play, an external output, or the like) performed in the touch device.
  • the microphone 143 may receive an external sound signal and process it into electric voice data in a call mode, a record mode, a voice recognition mode, or a photographing mode.
  • the processed voice data are converted into a transmissible format, and the converted data are output to a mobile communication base station through the mobile communication module 111 .
  • Various noise removal algorithms for removing a noise generated during a procedure of receiving an external sound signal may be implemented in the microphone 143 .
  • the memory 150 may store a program for the processing and control of the controller 170 , and may temporarily store input/output data (e.g., a telephone number, a message, audio, media contents (e.g., a music file or a video file), or an application).
  • the memory 150 may store a use frequency (e.g., the frequency of use of an application, of media contents, or of a phone number, a message, and multimedia), an importance, a priority, or a preference according to the function operation of the touch device.
  • the memory 150 may store data regarding a vibration or a sound of various patterns output upon touch input on the touch screen.
  • the memory 150 may store split information with respect to a screen split scheme for operating a multi-window, application information to be registered in the tray, or application information executed by multi-tasking by the multi-window.
  • the memory 150 may include a storage medium having at least one of memory types including a flash memory type, a hard disk type, a micro type, a card type (e.g., an SD card or XD card memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Programmable Read-Only Memory (PROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Magnetic RAM (MRAM), a magnetic disc, or an optical disc.
  • the touch device may operate in association with a web storage executing the storage function of the memory 150 on the Internet.
  • the interface unit 160 serves as a passage to all external devices connected to the touch device.
  • the interface unit 160 may receive data or power from an external device, transfer the data or power to each element inside of the touch device, or transmit data of the inside of touch device to an external device.
  • the interface unit 160 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identity module, an audio I/O (input/output) port, a video I/O (input/output) port, and an earphone port.
  • the interface unit 160 includes an interface for connecting with an external device in a wired or wireless scheme.
  • the controller 170 controls an overall operation of the touch device. For example, the controller 170 performs control associated with an operation of an application according to a voice call, a data communication, an image call, or operating a multi-window environment.
  • the controller 170 may include a separate multi-media module (not shown) for operating a multi-window function. According to certain embodiments of the present disclosure, the multi-media module (not shown) may be implemented in the controller 170 and may be implemented separately from the controller 170 .
  • the controller 170 may control a series of operations for supporting a multi-window function according to embodiments of the present disclosure.
  • the controller 170 may control execution of a plurality of applications in a multi-window environment.
  • the controller 170 may control independent display of screens relating to at least two applications, selected by the user from among a plurality of executed applications, through the plurality of windows.
  • the controller 170 may receive an execution event input, for instance a touch input, for executing a second application in a state in which an execution screen of the first application is displayed as a full screen (that is, occupying all or substantially all of the available screen area within the display unit 130 ).
  • the controller 170 may control a feedback output (for instance, visual feedback) with respect to a window where a dragged icon relating to the second application is currently located, or another movement location before the execution event is released. If the execution event is released when located over a specific window, the controller 170 may configure a multi-window according to a pre-set split scheme, and may control to independently display a screen of the first application and the second application through respective split windows.
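The drop-target logic above can be sketched as follows. This is a minimal illustrative model, not the patent's implementation; the names `Window`, `hit_test`, and `split_screen` are assumptions, and the preset split scheme is taken here to be a single horizontal separator producing two equal windows.

```python
from dataclasses import dataclass

@dataclass
class Window:
    name: str
    x: int
    y: int
    width: int
    height: int

    def contains(self, px: int, py: int) -> bool:
        # True when the point lies inside this window's rectangle.
        return (self.x <= px < self.x + self.width and
                self.y <= py < self.y + self.height)

def hit_test(windows, px, py):
    """Return the window under the drag point, to drive feedback highlighting."""
    for w in windows:
        if w.contains(px, py):
            return w
    return None

def split_screen(width, height):
    """Preset split scheme (assumed): one horizontal separator, two equal windows."""
    half = height // 2
    return [Window("upper", 0, 0, width, half),
            Window("lower", 0, half, width, height - half)]
```

While the execution event is held, `hit_test` picks the window to highlight; when the event is released over a window, `split_screen` supplies the two regions through which the first and second applications are independently displayed.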
  • the controller 170 may control execution of the additional application through a window selected to execute the additional application.
  • the controller 170 processes an application previously executed through the selected window in the background (that is, without continuing to display the executing application), and controls display of the additional application screen through the selected window.
  • the controller 170 may control the display of a tray, a separator, or a floating key pad provided from a screen interface according to the multi-window environment.
  • the controller 170 may allow the displayed tray, separator or floating key pad to be moved within the screen according to a user input or otherwise. More particularly, the controller 170 may determine (i.e., change) the size of each window according to the multi-window environment in accordance with the movement of the separator.
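As a rough sketch of how the separator position could determine each window's size (the function name and the minimum-size clamp are illustrative assumptions, not from the disclosure):

```python
def window_sizes(screen_height, separator_y, min_size=100):
    """Return (upper_height, lower_height) for a horizontal separator 200.

    The separator position splits the screen height; it is clamped so
    each window keeps at least min_size pixels.
    """
    y = max(min_size, min(separator_y, screen_height - min_size))
    return y, screen_height - y
```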
  • A detailed control operation of the controller 170 will be described in examples of an operation of the touch device and a control method thereof with reference to the following drawings.
  • the power supply 180 receives power from an external power source or an internal power source, and supplies the power necessary to operate each constituent element under control of the controller 170 .
  • Various embodiments according to the present disclosure may be implemented in a recording medium which may be read by a computer or a similar device using software, hardware or a combination thereof.
  • various embodiments of the present disclosure may be implemented using at least one of Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and an electric unit for executing the functions.
  • various embodiments of procedures and functions according to this specification may be implemented by separate software modules.
  • the recording medium may include a computer readable recording medium recording a program processed to receive an input of an execution event for executing a second application in a state in which an execution screen of the first application is displayed on a full screen, to output feedback with respect to a window of a moved location when the execution event is moved while not being released, to configure a multi-window according to a preset split scheme when the execution event is released from the moved specific window, and to independently display screens of the first and second applications through respective split windows.
  • the touch device of the present disclosure illustrated in FIG. 1 may include various information communication devices, multi-media devices supporting a function of the present disclosure, and an application device thereof, such as various devices using an Application Processor (AP), a Graphic Processing Unit (GPU), and a Central Processing Unit (CPU).
  • the touch device includes devices such as a tablet Personal Computer (PC), a Smart Phone, a digital camera, a Portable Multimedia Player (PMP), a media player, a portable game terminal, a Personal Digital Assistant (PDA) as well as mobile communication terminals operating based on respective communication protocols corresponding to various communication systems.
  • FIG. 2 is a diagram of a screen schematically illustrating a screen interface in a touch device according to an embodiment of the present disclosure.
  • a screen interface for supporting a multi-window environment in a touch device includes execution regions 210 and 230 split from one screen to display an execution screen of an application. That is, within the screen there are separate execution regions 210 and 230 in which execution screens relating to separate applications can be displayed. Each execution region 210 and 230 may be referred to as a separate window, and collectively the separate windows may be referred to as multi-windows or a multi-window environment. Furthermore, the screen interface includes a separator 200 separating at least two execution regions 210 and 230 split according to a split scheme to adjust a window size of the execution regions 210 and 230 .
  • the split scheme refers to the relative disposition and size of the two or more execution regions 210 and 230 or windows within the multi-window environment. It will be appreciated that if there are more than two windows within the multi-window environment then further separators may be required.
  • the respective execution regions 210 and 230 split according to the multi-window environment may include a navigation region, a scroll region, or a text input region which are independently formed according to the execution application or the respective execution applications.
  • the screen interface of the present disclosure provides a tray 300 for conveniently supporting execution of an application using respective windows separated as a multi-window.
  • the tray 300 may include execution icons (or shortcut icons) 400 of all applications executable in the touch device, or of only some applications according to settings of the user.
  • the tray 300 may be arranged such that it appears to slide-in (i.e., be displayed) on the screen or to slide-out and be hidden from the screen.
  • the tray 300 may include a handle item 350 capable of receiving a user command (for instance a touch input or a touch and drag input) for switching between the slide-in and slide-out states.
  • the tray 300 may support scrolling through execution icons 400 in the tray 300 and the execution icon 400 in the tray 300 may be corrected, added, or removed according to user selection. Although it has been illustrated in FIG. 2 that the tray 300 is disposed in a row, the tray 300 may be disposed in two or more rows, which may be changed according to user selection.
  • In FIG. 2 , a screen of the touch device is split into two execution regions (i.e., windows) 210 and 230 through one separator 200 . However, one or more separators 200 may be provided in response to the number of windows, that is, according to the split scheme that configures the multi-window environment. For example, when the screen is split into two regions, one separator 200 may be provided; when the screen is split into three regions, two separators 200 may be provided; and when the screen is split into four regions, two or three separators 200 may be provided according to the split region.
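The relationship between the number of windows and the number of separators can be sketched as follows; the scheme names are hypothetical, and the grid case reflects the option of using one full-width and one full-height separator for four regions.

```python
def separator_count(num_windows, scheme="stack"):
    """Number of separators 200 for a given split scheme (illustrative).

    "stack": windows in a single row or column -> one separator per boundary.
    "grid2x2": four windows split by one full-width and one full-height separator.
    """
    if scheme == "grid2x2":
        return 2
    return max(0, num_windows - 1)
```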
  • FIG. 3 is a diagram schematically illustrating an operation of a multi-window in a touch device according to an embodiment of the present disclosure.
  • a screen example of reference numeral ⁇ 301 > indicates a screen example of a touch device when the touch device executes an Internet application. More particularly, the screen of reference numeral ⁇ 301 > indicates a state in which the Internet application is displayed as a full screen through one window. The full screen consumes all or substantially all of the available screen space (which may, for instance, be less than the total screen size to allow for status bars to be continuously displayed).
  • a screen example of reference numeral ⁇ 303 > indicates a screen example of a touch device when two applications are executed through a multi-window.
  • the user may additionally execute a map (MAP) application in a state in which a full screen of the Internet application is displayed.
  • one screen is split into different execution regions by two windows through the separator 200 , and execution screens of an Internet application and a MAP application are provided through respective execution regions (windows).
  • a plurality of applications split among at least two screens may be simultaneously operated according to embodiments of the present disclosure.
  • a screen example of reference numeral ⁇ 305 > indicates a screen example where sizes of respective windows are changed according to a user operation from the screen of reference numeral ⁇ 303 >.
  • the user moves (e.g., a touch & drag) the separator 200 to adjust a window size of an execution region in which the Internet application is executed and an execution region in which a MAP application is executed.
  • the screen size of the application may be suitably changed according to a variation in the window size of a corresponding execution region.
  • FIG. 4 is a diagram schematically illustrating an operation for separating a multi-window in a touch device according to an embodiment of the present disclosure.
  • a screen example of reference numeral ⁇ 401 > indicates a case where a screen is split into two windows for a multi-window environment and a screen example when an application A and an application B are executed through two windows separated through one separator 200 .
  • Screen examples of reference numerals ⁇ 403 > and ⁇ 405 > indicate a case where a screen is split into three windows for a multi-window environment, and indicates a screen example when applications A, B, and C are executed through three windows using two separators 200 .
  • the screen of the present disclosure may be split into various forms according to settings of the user, and the split scheme may be pre-defined.
  • FIGS. 5 , 6 , 7 , 8 , 9 , 10 , 11 , and 12 are diagrams illustrating examples of an operation screen operating a tray for rapidly executing an application in a multi-window environment according to embodiments of the present disclosure.
  • FIG. 5 illustrates a screen example of a touch device when the touch device displays an idle screen (or home screen).
  • FIG. 5 illustrates an example where the idle screen is operated in a normal mode before operating the multi-window environment. That is, according to an embodiment of the present disclosure, the touch device may be operated in a multi-window mode and a normal mode and may switch between the two.
  • the user may activate the tray 300 to be indicated on the idle screen as illustrated in FIG. 6 in a state in which the idle screen is displayed, according to an embodiment of the present disclosure.
  • the user may input a menu operation through the displayed idle screen of the touch device to display the tray 300 .
  • the tray 300 may be displayed through selection of a function key for executing a multi-window mode, or in response to a touch event set to execute the multi-window mode (e.g., a gesture having a specific pattern such as figures and characters).
  • the touch device may activate and indicate (display) a tray 300 on a pre-set region on an idle screen as shown in FIG. 6 .
  • the tray 300 may be disposed at a left frame (a left edge) of a rectangular full screen such that the full screen (currently displaying the idle screen in FIG. 6 ) is reduced in size.
  • the tray 300 also may be provided in the form of an overlay through a separate layer on a currently displayed screen, and may have a handle item 350 , such that the tray 300 overlaps the idle screen, as shown in FIG. 6 .
  • the user may input a movement event (e.g., a touch & drag) moving the tray 300 to another region on a screen as shown in FIG. 7 in a state in which the tray 300 is displayed on an idle screen, according to an embodiment of the present disclosure.
  • the user may touch a part of the tray 300 to input a movement event dragging the tray to a different part of the screen (for instance, toward the opposite side of the screen, e.g., toward the right frame of the window (the right edge of the screen)).
  • the touch device may provide a User Interface (UI) or a Graphic User Interface (GUI) that separates the tray 300 from the left frame according to the movement event and moves the tray with the drag of the user.
  • the touch device may change and display a direction of a handle item 350 of the tray 300 . That is, the touch device may differently display the handle item 350 for sliding-in the tray 300 in a screen according to a region in which the tray 300 is located.
  • the handle item 350 illustrated in FIG. 6 may be switched to a direction of a handle item 350 as illustrated in FIG. 7 according to a movement of the tray 300 .
  • the user may move the tray 300 close to a desired region to release the input movement event. That is, the user may release drag input for moving the tray 300 .
  • the touch device may determine the moved region of the tray 300 and arrange and display the tray 300 on the determined region. For example, as shown in FIG. 8 , the touch device may arrange and provide the tray 300 at a right frame of the window (a right edge of the screen). That is, if a user input for moving the tray 300 is released, the tray 300 provided in the screen of FIG. 6 is rearranged as illustrated in FIG. 8 according to its movement.
  • the touch device may determine an arranged region of the tray 300 according to a movement degree of the tray 300 . For example, the touch device may arrange the tray 300 at a window frame (screen edge) closest to the moved region (based on a point of contact of a user input on the tray 300 ). For instance, when the user input is released when the tray 300 is closest to the left frame (the left edge of the screen), the tray 300 is arranged and displayed at the left frame (the left edge). When the user input is released when the tray 300 is closest to a respect right, upper or lower frame (edge of the screen), the tray 300 is arranged and displayed at the respective right, upper or lower frame (edge).
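The edge-snapping rule above (arrange the tray at the frame closest to the release point) can be sketched as a simple distance comparison; the function name is an assumption for illustration.

```python
def nearest_edge(x, y, width, height):
    """Return which screen edge (frame) is closest to the release point (x, y)."""
    distances = {
        "left": x,               # distance to the left edge
        "right": width - x,      # distance to the right edge
        "upper": y,              # distance to the upper edge
        "lower": height - y,     # distance to the lower edge
    }
    return min(distances, key=distances.get)
```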
  • Screen examples where the tray 300 is arranged in different locations according to a user input are illustrated in FIG. 6 (arranged at a left frame), in FIG. 8 (arranged at a right frame), in FIG. 9 (arranged at an upper frame), and in FIG. 10 (arranged at a lower frame). That is, according to embodiments of the present disclosure, referring to FIGS. 6 to 10 , an arranged location of the tray 300 may be changed in real time according to user input.
  • FIG. 11 illustrates a screen example of a slide-out, that is, a hidden state in a state in which the tray 300 is arranged at a lower frame as shown in FIG. 10 .
  • the tray 300 is not displayed on a screen but only a handle item 350 of the tray 300 may be displayed.
  • a slide-out of the tray 300 is achieved by a user input using the handle item 350 , or the tray 300 may be automatically slid-out when no user input occurs for a predetermined time in a slide-in state.
  • For example, when no user input occurs during a predetermined time, the tray 300 may be automatically slid-out, and when a user input (e.g., on the handle item 350 ) moves (i.e., a drag, a flick, or the like) toward the inside of the screen, the tray 300 may be slid-in.
  • FIG. 12 illustrates a screen example when a screen of a landscape mode is displayed according to a rotation of the touch device in a screen display of a portrait mode as illustrated in FIGS. 6 , 7 , 8 , 9 , 10 , and 11 , according to embodiments of the present disclosure.
  • the tray 300 may be arranged and provided at a location corresponding to a direction arranged in a previous mode.
  • the tray 300 may be automatically arranged and provided at a left frame from the user's viewing point (a left edge of the portrait mode). That is, regardless of the mode switch, the tray 300 may be arranged and provided at the same location based on the user's viewing point.
  • screens of respective applications of split execution regions are rotated and provided according to a mode switch, and the window size split by the separator 200 may be maintained in accordance with a previous state.
  • FIGS. 13 , 14 , 15 , 16 , and 17 are diagrams illustrating examples of an operation screen operating a plurality of applications in a multi-window environment according to an embodiment of the present disclosure.
  • FIG. 13 illustrates a screen example of a touch device when the touch device executes one application (e.g., Internet application) as a full screen.
  • In FIG. 13 , the tray 300 is activated but slid-out and hidden, so that only the handle item 350 is displayed on the screen.
  • the user may select (e.g., touch & drag) the handle item 350 in a state in which the Internet application is displayed to slide-in the tray 300 on a screen as shown in FIG. 14 .
  • the touch device displays a screen as shown in FIG. 14 . That is, a screen of the touch device illustrated in FIG. 13 is switched according to the user input as illustrated in FIG. 14 .
  • the user may select an execution icon 410 of an application to be additionally executed according to the multi-window environment from among the application execution icons 400 previously registered in the tray 300 , and input an event moving the icon onto the screen in a state in which the tray 300 is displayed.
  • the user selects (i.e., touches) an execution icon 410 capable of executing a map application in the tray 300 and inputs an event moving (i.e., dragging) the execution icon into the screen region currently displaying Internet application while the touch is maintained.
  • the touch device displays a state in which the execution icon 410 is moved into the screen in response to a user input as shown in FIG. 15 .
  • the touch device confirms the region in which the execution icon 410 is located and the split scheme as illustrated in FIG. 15 , and outputs to the user feedback for the execution region in which the application of the execution icon 410 is to be executed (illustrated by the hashed box in FIG. 15 ).
  • the feedback may be expressed by various schemes which may be intuitively recognized by the user such as focusing a corresponding window in which the execution icon 410 is located among windows of the split execution region, highlighting and displaying only a corresponding window, or changing a color of a corresponding window.
  • the UI or GUI may provide a fade-out effect such that the space in which the execution icon 410 was located in the tray 300 remains blank. Further, when the execution icon 410 is separated from the tray 300 and enters the screen, the tray 300 may be slid-out. That is, a screen of the touch device illustrated in FIG. 15 may be switched as illustrated in FIG. 16 according to the user input.
  • Alternatively, the blank space in the tray 300 resulting from the separation of the execution icon may retain its original shape. That is, as illustrated in a screen example of FIG. 18 to be described later, the space in which the execution icon 410 was located may be displayed as if the corresponding icon were still present.
  • a multi-window environment may be split into two execution regions having two windows with an upper window and a lower window.
  • FIG. 15 illustrates a case where the execution icon 410 is currently located in the upper window according to a user input, and where the lower window is focused when the execution icon 410 is moved to a lower side of the screen in a state in which the touch input on the execution icon 410 is maintained.
  • the user may move the execution icon 410 to a lower side of the screen in a state in which a touch input to the execution icon 410 is maintained, and input an event of releasing a touch input to the execution icon 410 in the lower window.
  • the user may release (i.e., drag & drop) a touch input to the execution icon 410 .
  • the touch device executes an application (i.e., the map application) associated with the execution icon 410 in response to the user input and displays an execution screen of the application on the lower window.
  • the touch device separates the full screen into two split execution regions through the separator 200 to form two separate windows.
  • the touch device displays a screen of the additional application (i.e., map application) through a window (e.g., a lower window) of an execution region in which the execution icon 410 is located, and displays a screen of the previous application (i.e., Internet application) through a window (e.g., an upper window) of another execution region.
  • Upon execution of the additional application, the touch device displays a screen of a suitable size corresponding to the size of the window (e.g., a lower window) of the execution region in which the additional application is executed. Further, upon splitting the screen, the touch device displays a screen of the previous application as a full screen or a partial screen in a window (e.g., an upper window) of one split execution region according to a characteristic of the previous application, and displays a screen of the additional application in a window (lower window) of the other split execution region as a full screen or a partial screen.
  • the touch device may change to a screen of a suitable size corresponding to a window (e.g., an upper window and a lower window) of a split execution region and display a play screen in a corresponding window as a full screen.
  • the touch device may display only a partial screen corresponding to a size of a corresponding window (i.e., upper window, lower window) of the split execution region.
  • an execution screen of a first application may be displayed as the full screen.
  • the touch device may receive an execution event input (e.g., a user input which selects an execution icon 400 from the tray 300 and moves to the screen) for executing a second application from a user while displaying the first application as a full screen.
  • the touch device may output feedback with respect to the window of a location to which the execution event is moved (i.e., a location to which the execution icon 400 is being moved (i.e., dragged) according to a user input).
  • a multi-window may be configured according to a pre-set split scheme, and screens of the first application and the second application may be independently displayed through respective split windows.
  • FIGS. 18 , 19 , 20 , 21 , 22 , and 23 are diagrams illustrating examples of operating a plurality of applications in a multi-window environment according to an embodiment of the present disclosure.
  • FIG. 18 illustrates a screen example of a touch device when the tray 300 is slid-in according to the user input using a handle item 350 in a state in which the touch device displays screens of different applications through each window of two split execution regions as illustrated in FIG. 17 .
  • the user may select an execution icon 430 of an application (e.g., a note application) to be additionally executed from among execution icons 400 previously registered in the tray 300 in response to the foregoing operation and input an event moving on the screen as illustrated in FIG. 19 .
  • the touch device moves the execution icon 430 into the screen in response to the user input as illustrated in FIG. 19 , and outputs to the user feedback for the execution region in which the application of the execution icon 430 is to be executed at the corresponding location according to the movement.
  • a slide-out operation of the tray 300 according to the movement of the execution icon 430 and an execution operation of an application (e.g., a note application) of the execution icon 430 correspond to the foregoing operation.
  • FIG. 19 illustrates a case where a touch input to the execution icon 430 is moved to an upper window of the screen and is released (i.e., drag & drop).
  • the touch device executes an application (e.g., a note application) of an execution icon 430 in response to the user input and displays an execution screen of the application on an upper window.
  • the touch device processes the application (e.g., an Internet application) previously executed through the upper window in the background (not displayed), and displays a screen of the additional application (e.g., a note application) whose execution is newly requested through the upper window.
  • the touch device may continuously execute the application (e.g., a map application) allocated to the lower window and continuously displays a screen (e.g., currently progressing screen) according to the execution state through the lower window.
  • the touch device may receive a user input for executing an additional application while displaying screens of a plurality of applications through the multi-window. Accordingly, the touch device may execute the additional application through a corresponding window selected by the user for executing the additional application. Upon executing the additional application, the application previously executed through the selected window may be processed in the background, and the additional application screen may be displayed through the selected window.
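A minimal model of this per-window behavior, where launching an additional application pushes the previous one into the background of the selected window, might look like the following (class and method names are illustrative, not from the disclosure):

```python
class MultiWindow:
    """Illustrative per-window application stacks; last entry is the foreground app."""

    def __init__(self):
        self.windows = {}  # window name -> list of apps

    def launch(self, window, app):
        # The previously displayed app stays in the stack but moves to the background.
        self.windows.setdefault(window, []).append(app)

    def foreground(self, window):
        apps = self.windows.get(window)
        return apps[-1] if apps else None

    def background(self, window):
        return self.windows.get(window, [])[:-1]
```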
  • FIGS. 21 , 22 , and 23 illustrate an operation of changing the window size according to the user input in a state in which a window of split execution regions of the touch device is displayed.
  • the user may input an event to select, as illustrated in FIG. 21 , the separator 200 in a screen like FIG. 20 and to move the selected separator 200 in a specific direction (e.g., upward or downward).
  • the user may input an event which touches the separator 200 as illustrated in FIG. 21 and drags the separator 200 to a lower direction of the screen in a state in which the touch is maintained.
  • the touch device displays a moved state of the separator 200 in response to a user input as illustrated in FIG. 21 .
  • the touch device may change and display only a moving state of the separator 200 according to a user input while maintaining a screen of the application displayed through each window in its current state as shown in FIG. 21 .
  • the touch device may adaptively change and display a screen of an application according to a window size changed when the separator 200 is moved according to the user input through a window size control scheme.
  • the user may input an event which moves the separator 200 to correspond to the desired size ratio of each window, and then release the touch input on the separator 200 .
  • the user may drag the separator 200 and release (i.e., drag & drop) a touch input to the separator 200 in a state in which the separator 200 is moved to a location of the lower window as illustrated in FIG. 21 .
  • the touch device changes and displays a window size according to movement of the separator 200 in response to the user input as shown in FIG. 22 .
  • the touch device changes and displays a display state of a screen of an application allocated to each window (e.g., upper window and lower window) according to variation in the window size. For example, as shown in FIG. 22 , remaining hidden contents may be displayed according to increase of the window size on a screen of an application displayed on the upper window, and a screen of an application displayed on the lower window may be provided in a state in which a region displayed according to reduction of the window size is reduced.
  • FIG. 23 illustrates an opposite case of FIG. 22 , and illustrates a screen example in a state in which a separator 200 is moved to an upper direction of a screen according to a user input, and accordingly the size of an upper window is reduced and the size of the lower window is enlarged.
  • FIGS. 24 , 25 , 26 , 27 , 28 , and 29 are diagrams illustrating examples operating a key pad for text input in a multi-window environment according to an embodiment of the present disclosure.
  • the present disclosure provides a touch key pad (e.g., a floating key pad) 500 having a different form from a normal touch key pad for efficiently operating a multi-window environment. That is, according to embodiments of the present disclosure, a touch key pad operated in a normal mode providing a screen of one application as a full screen, and a floating key pad 500 operated in a multi-window mode providing screens of a plurality of applications as individual screens through screen split, may be differentially provided.
  • the floating key pad 500 is not fixed to a pre-defined region like a normal touch key pad, but may be freely moved around in a screen of the touch device in response to the user input.
  • the floating key pad of the present disclosure may be provided in the form of a pop-up when a text input is requested (e.g., by a user input selecting a text input window of an application of a specific window) from an application of the specific window according to user selection from among applications of the plurality of windows separated as a multi-window in the multi-window environment.
  • FIG. 24 illustrates a screen example of a touch device in a state in which the touch device displays a screen of different applications through each window of two split execution regions.
  • Referring to FIG. 25, the user may display a floating key pad at a predetermined region (e.g., a pre-defined region or a previously executed region) according to a user input in a state in which screens of a plurality of applications according to a multi-window environment are simultaneously displayed.
  • the user may input a menu operation of the touch device, function key selection for executing the floating key pad 500 , or a touch event (e.g., a gesture having a specific pattern such as figures and characters) set to execute the floating key pad 500 .
  • the touch device activates a floating key pad 500 at one region of a screen operated as the multi-window.
  • When the floating key pad 500 is activated, it may be provided in a form in which a bottom end of the floating key pad 500 adheres to a lower frame of the screen.
  • the floating key pad 500 has a separate layer and may be provided in an overlay form on screens according to a multi-window.
  • the user may input a movement event (e.g., a touch & drag) moving the floating key pad 500 to another region on the screen as illustrated in FIG. 26 in a state in which the floating key pad 500 is displayed on the screen.
  • For example, the user may input a movement event which touches and drags a part of the floating key pad 500 to another region (e.g., upward) of the screen.
  • In response to the movement event, the touch device may provide a UI or GUI in which the floating key pad 500 is separated from the lower frame and moved along with the drag of the user.
  • the user may move the floating key pad 500 to a desired location and release the input movement event as shown in FIG. 27 . That is, the user may release a drag input for moving the floating key pad 500 . Accordingly, the touch device may arrange and display the floating key pad 500 in a location in which the drag input is released.
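  • The drag-and-release placement described above can be sketched as a small model. This is an illustrative sketch only, not the patent's implementation; the class name, screen dimensions, and clamping behavior are assumptions.

```python
# Hypothetical model of the floating key pad: it activates docked to the
# lower frame of the screen and may then be freely dragged; on release it
# stays at the location where the drag ended, clamped inside the screen.
class FloatingKeypad:
    def __init__(self, screen_w, screen_h, pad_w, pad_h):
        self.screen_w, self.screen_h = screen_w, screen_h
        self.w, self.h = pad_w, pad_h
        # On activation, the bottom end adheres to the lower frame.
        self.x = (screen_w - pad_w) // 2
        self.y = screen_h - pad_h
        self.visible = False

    def activate(self):
        self.visible = True

    def drag_to(self, x, y):
        # Keep the pad fully on-screen while it is being dragged.
        self.x = max(0, min(x, self.screen_w - self.w))
        self.y = max(0, min(y, self.screen_h - self.h))

    def release(self):
        # The pad is arranged and displayed where the drag is released.
        return (self.x, self.y)
```

For example, a 600 x 400 pad on a 720 x 1280 screen would activate at y = 880 (bottom-docked) and then follow the user's drag to any clamped position.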
  • In a state in which the floating key pad 500 is provided, the user input may be achieved both in the respective windows of the split execution regions and on the floating key pad 500.
  • That is, a user input for the floating key pad 500 is received in the region that the floating key pad 500 occupies, and a user input for a corresponding window may be received in the remaining region.
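  • The routing rule above (the pad's own region versus the remaining regions) amounts to a hit test. The following is a hedged sketch under assumed coordinates; the function and window names are illustrative, not from the patent.

```python
# A touch inside the floating pad's rectangle is routed to the pad;
# otherwise it is routed to the split window containing the point.
def route_touch(x, y, pad_rect, separator_y):
    px, py, pw, ph = pad_rect  # pad position and size
    if px <= x < px + pw and py <= y < py + ph:
        return "keypad"
    return "upper_window" if y < separator_y else "lower_window"
```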
  • the user may perform a text input using the floating key pad 500 in a state in which the floating key pad 500 is displayed.
  • the user inputs a text on a screen of an application executing on the upper window.
  • the user selects the upper window (i.e., selects any one region (e.g., a text input window) in which a text input is possible from an application screen of an upper window), and selects and inputs a desired character button on the floating key pad 500 .
  • the user selects a text input window 610 on a screen of an application executing through the upper window to implement a state in which the text input is possible. Further, the user may sequentially input respective buttons to which characters p, s, and y are allocated to input “psy” using the floating key pad 500 . Accordingly, the touch device may input and display a corresponding character on the text input window 610 in response to the user input as illustrated in FIGS. 27 and 28 .
  • The touch device may provide a result for the text (e.g., "psy") input to the text input window 610 of the application executing on the upper window, in the form of an underlay beneath the floating key pad 500, as illustrated in FIG. 28.
  • For example, a searched result corresponding to the text input into the text input window 610 may be recommended through a recommendation region 620 of a new layout while the current state is maintained.
  • The recommendation region 620 may be provided in such a way that it overlies the screen of the application while the floating key pad 500 overlies the recommendation region 620. That is, the floating key pad 500 may be disposed at the uppermost position and may maintain its current state.
  • the text input to the text input window 610 may be input to the same layer as an application screen and may be directly provided thereon.
  • In the case of a text input window into which receiver information is input, as in the mail application executed in the lower window, and unlike the example of FIG. 28, only an input result may be displayed through the text input window of the application screen without a separate new layer.
  • The user may select any one recommended result in a state in which the recommendation region 620 is displayed beneath the floating key pad 500 as an underlay, or may operate (i.e., command) search execution for the text input to the text input window 610.
  • a corresponding result screen is illustrated in FIG. 29 . That is, a screen of a touch device illustrated in FIG. 28 is switched as illustrated in FIG. 29 according to a user input.
  • When the text shown in the text input window 610 is input through the floating key pad 500 according to a user input and function execution for a corresponding application (e.g., a search execution, a mail transmission execution, a memo storage execution, a message transmission execution, or the like) is requested, the floating key pad 500 is removed from the screen, and a result of the execution may be provided through a corresponding window of the application executing the function.
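  • The keypad lifecycle just described (characters typed into the selected text input window, then the pad removed on function execution) can be modeled minimally. All names here are illustrative assumptions, not the device's actual code.

```python
# Toy model: characters typed on the floating key pad accumulate in the
# selected text input window; executing a function (search, mail
# transmission, memo storage, message transmission, ...) removes the pad
# and returns the result to the owning window.
class TextSession:
    def __init__(self):
        self.buffer = ""           # contents of the text input window
        self.keypad_visible = True

    def type_char(self, ch):
        if self.keypad_visible:
            self.buffer += ch

    def execute(self, action):
        self.keypad_visible = False   # pad is removed from the screen
        return (action, self.buffer)  # result shown in the app's window
```

Typing "p", "s", "y" and then executing a search would yield ("search", "psy") with the pad hidden.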
  • a search result for “psy” input from an application of an upper window may be provided through the upper window.
  • FIG. 30 is a diagram illustrating an example of operating a plurality of applications in a multi-window environment according to an embodiment of the present disclosure.
  • FIG. 30 illustrates a screen example when specific setting for respective windows is changed according to the user input in a state in which the touch device displays screens of different applications through respective windows of two split execution regions.
  • a function may be independently set in every split window. That is, a function suitable for a characteristic of an execution application of a window selected by the user from among windows of split execution regions may be changed.
  • the user may select a left window from among windows of split execution regions, and operate a pre-set function (e.g., operate a function key provided to control a volume).
  • The touch device may identify a characteristic of the application executing through the left window.
  • The touch device may display a volume setting item 700 according to the characteristic of the identified application (e.g., a media playing capability, such as a video playing capability), and may feed back a setting value changed according to the user input.
  • Alternatively, a screen brightness setting item (not shown) may be provided on the screen instead of the volume setting item 700, and feedback in which the brightness of the screen is changed according to the user input may be provided. Further, the setting for an application executing on the right window may be changed in accordance with the foregoing scheme.
  • an independent setting may be achieved for each window. For example, when a volume or screen brightness is set on the left window, a setting value may be reflected and displayed only for the left window.
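  • The per-window independence described above amounts to keeping a separate settings store per window and choosing the setting item from the executing application's characteristic. A speculative sketch; the mapping and all names are assumptions for illustration.

```python
# A media-capable application (e.g., a video player) exposes a volume
# item; other applications are assumed here to expose screen brightness.
def setting_item_for(app_characteristic):
    return "volume" if app_characteristic == "media" else "brightness"

# Each split window keeps its own setting values, so changing the volume
# on the left window leaves the right window untouched.
class WindowSettings:
    def __init__(self):
        self.values = {}  # window id -> {setting item: value}

    def set(self, window, item, value):
        self.values.setdefault(window, {})[item] = value

    def get(self, window, item, default=None):
        return self.values.get(window, {}).get(item, default)
```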
  • FIGS. 31 , 32 , 33 , and 34 are diagrams illustrating examples of an operation screen providing information for a plurality of applications executed according to a multi-window environment in a touch device according to an embodiment of the present disclosure.
  • FIG. 31 illustrates a screen example of a touch device when the touch device displays a list of a plurality of applications executed according to a multi-window environment.
  • a list of applications executed in the multi-window environment by the user may be provided through a full screen according to user selection.
  • The user may input a menu operation of the touch device, a function key selection for executing the list, or a touch event (e.g., a gesture having a specific pattern such as figures or characters) set to execute the list, in a state in which a function by the multi-window is operating or the screen is converted into an idle screen.
  • The touch device may display a list of currently executed applications (including those executing in the background) through a UI or GUI set as illustrated in FIG. 31.
  • FIG. 31 illustrates a list including an E-mail application 910 , a Video Player application 920 , a Note application 930 , a Map application 940 , and a Play Store application 950 .
  • Referring to FIGS. 32 and 33, remaining applications (e.g., a Gmail application 960, a Wi-Fi application 970, and a Phone application 980) which are not displayed on the initial list screen of FIG. 31 but are hidden may be spread and displayed according to scroll (or navigation) control of the user. That is, the list illustrated in FIG. 31 includes further applications which are not displayed on the screen but are hidden.
  • the number of applications included in the initial list may be suitably set in consideration of intuition of the user according to the size of a screen of the touch device. When the number of executing applications is greater than the preset number, excessive applications may be hidden as illustrated in examples of FIGS. 31 to 34 .
  • Information for the applications of the list may be provided in such a manner that the information display region of the application (e.g., the Video Player application 920) disposed at the lower side among the applications is mainly allocated, and the information display regions gradually become smaller in the upward direction. Accordingly, the uppermost application (e.g., the Play Store application 950) may display only a state bar capable of discriminating the corresponding application.
  • The application (e.g., the E-mail application 910) disposed at the lowermost region so as to display only a state bar may correspond to at least one application which was most recently executed by the user or was displayed on the screen just before execution of the list.
  • The application disposed at the lowermost region may be fixed and provided at the corresponding region regardless of scroll control of the user, or the fixed arrangement may be disabled according to a user setting.
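  • The shrinking allocation of information regions (largest at the lower side, only a state bar at the top) can be approximated with a simple height calculation. The total height, state-bar height, and weighting are invented numbers for illustration only.

```python
# Allocate list heights bottom-up: every application gets at least a
# state-bar height, and the remaining space is shared in decreasing
# portions so the lowest application's region is the largest.
def allocate_heights(n_apps, total=600, bar=40):
    heights = [bar] * n_apps              # index 0 = lowest application
    extra = total - bar * n_apps
    weights = list(range(n_apps, 0, -1))  # lowest app weighted highest
    wsum = sum(weights)
    for i, w in enumerate(weights):
        heights[i] += extra * w // wsum
    return heights
```

With four scrollable applications this yields strictly decreasing regions, e.g. [216, 172, 128, 84] for a 600-pixel list.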
  • a list screen for the execution applications of the present disclosure may include a command region 800 for supporting a variety of command types (e.g., an application scroll, a termination of application execution, an application search, or the like) for the execution applications in the list.
  • the list screen may include a scroll item 850 for controlling a scroll (or a spread) for the applications in the list. That is, the user may scroll the applications in the list through a user input using the scroll item 850 .
  • The touch device may provide a UI or GUI in which the overlapped information of the applications is spread according to a user input on the scroll item 850. In this case, each time the user input is repeated, the touch device may control (e.g., spread) one scroll step in response to the corresponding input.
  • the touch device may continuously control automatic scroll while the user input is maintained.
  • The user may select (touch) the scroll item 850 and maintain the input in a state in which the list is displayed as illustrated in FIG. 31. Accordingly, when a user input for the scroll item 850 is detected, the touch device displays a screen in which the information of the applications is spread from top to bottom as illustrated in FIGS. 32, 33, and 34. That is, the list screen of the touch device illustrated in FIG. 31 is switched as shown in FIGS. 32, 33, and 34 according to the user input.
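  • One scroll step of the spread behavior can be modeled as a queue operation: the lowest non-fixed application is pulled off the screen and one hidden application is revealed. A toy sketch; the fixed-application convention follows the description above, everything else is assumed.

```python
from collections import deque

def spread_step(visible, hidden):
    """One scroll step. visible[0] is the fixed lowermost application
    (e.g., the E-mail application), which never scrolls away."""
    visible, hidden = deque(visible), deque(hidden)
    if hidden:
        fixed = visible.popleft()         # fixed app stays put
        visible.popleft()                 # lowest scrollable app pulled off
        visible.append(hidden.popleft())  # one hidden app revealed on top
        visible.appendleft(fixed)
    return list(visible), list(hidden)
```

Starting from the FIG. 31 list, one step would remove the Video Player entry and reveal the Gmail entry while keeping E-mail fixed.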
  • A UI or GUI may be provided in such a manner that the Video Player application 920 is pulled downward in response to the user input using the scroll item 850 and disappears from the screen, while the information of the applications disposed at the upper side is gradually spread and sequentially pulled downward.
  • Accordingly, other hidden applications (e.g., the Gmail application 960 (FIG. 33), the Wi-Fi application 970 (FIG. 34), the Phone application 980 (FIG. 34), or the like) may be sequentially revealed, while the E-mail application 910 may be fixed at the corresponding location and continuously displayed.
  • the user may select an item of a specific application in a state in which the list is displayed, or during scroll control. Accordingly, the touch device may display the selected application as a full screen.
  • the touch device may automatically display a recently executed application (i.e., an application [e.g., E-mail application 910 ] fixed and arranged at the lowermost side) as a full screen.
  • FIG. 35 is a flowchart illustrating a method of operating a multi-window environment in a touch device according to an embodiment of the present disclosure. More particularly, FIG. 35 illustrates an example of switching to the multi-window environment during an operation of one window.
  • a controller 170 executes an application (hereinafter, referred to as a “first application”) corresponding to user selection at operation 3501 , and controls screen display for the executing first application at operation 3503 .
  • the controller 170 controls display of a full screen of the first application through one window.
  • the execution standby event may refer to an event for additionally executing and displaying another application by a multi-window environment in a state in which the user executes and displays any one application. More particularly, the execution standby event may refer to an event which allows the user to activate (e.g., slide in) the tray 300 on the screen and select an execution icon of an application to be additionally executed from the activated tray 300 to move (e.g., drag) into the screen.
  • the controller 170 traces and determines a moved location of the execution icon at operation 3509 .
  • For example, the controller 170 may confirm the window of the region in which the execution icon is currently located by tracing the location of the execution icon.
  • The controller 170 controls a feedback output for a window of an execution region in which an additional application is able to be executed, in response to the determined split scheme and the location of the execution icon, at operation 3511. That is, the controller 170 may control the feedback output for the specific window at the location over which the execution icon is dragged while the execution icon is moved on the full screen according to the drag. For example, the controller 170 may focus and display the window of the location to which the execution icon is moved.
  • the controller 170 splits a screen at operation 3515 and controls execution of the second application at operation 3517 .
  • the execution event may be an event dropping the execution icon in one region of the screen.
  • That is, the controller 170 identifies the region (e.g., the region in which the execution icon is dragged and dropped (i.e., a drag & drop)) to which the execution icon is moved to generate the execution event, splits the full screen of the first application, and determines the region in which the execution event is generated, from among the split regions, as the window (i.e., execution region) for displaying the screen of the second application.
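  • The drop-position decision can be sketched in a few lines for the two-window case. The half-screen rule and all names are assumptions; the patent's actual split scheme may differ.

```python
# When the execution icon is dropped, the full screen of the first
# application is split and the half containing the drop point becomes
# the window of the newly executed second application.
def split_on_drop(drop_y, screen_h, first_app, second_app):
    if drop_y < screen_h // 2:
        return {"upper": second_app, "lower": first_app}
    return {"upper": first_app, "lower": second_app}
```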
  • Upon executing the second application, the controller 170 controls display of a screen having a suitable size corresponding to the window size of the split execution region (i.e., the execution region in which the second application is executed) at operation 3519.
  • the controller 170 may display a screen of the first application in a window (e.g., an upper window) of a split execution region as a full screen or a partial screen, and display a screen of the second application in a window (e.g., a lower window) of another split execution region as a full screen or a partial screen.
  • the controller 170 may change into a screen of a suitable size pertinent to a corresponding window size of a split execution region, and display a playing screen in the window as the full screen.
  • The controller 170 may also display a partial screen corresponding to the window size of the split execution region. That is, according to embodiments of the present disclosure, a screen of the first application and a screen of the second application may be independently displayed in corresponding windows by implementing the multi-window environment.
  • the controller 170 may execute the second application in response to a drop input of the execution icon.
  • the controller 170 may split the full screen into windows for displaying screens of the first application and the second application. Further, the controller 170 may display a screen of the second application through the specific window in which the execution icon is dropped, and display a screen of the first application through another split window.
  • FIG. 36 is a flowchart illustrating a method of operating a multi-window environment in a touch device according to an embodiment of the present disclosure.
  • FIG. 36 illustrates an operation example in which an additional application is executed while operating the multi-window.
  • the controller 170 may receive an input for selecting an additional application to additionally execute an application at operation 3603 . That is, according to embodiments of the present disclosure, another application may be further executed while independently displaying screens of a plurality of different applications through respective split windows in the multi-window environment.
  • the controller 170 determines a split scheme and a currently executed window (e.g., an “execution window”) at operation 3605 .
  • For example, the controller 170 may confirm, through pre-defined split information, how many window split schemes are supported for the multi-window environment, and determine into how many windows the screen is currently split and operated.
  • The controller 170 compares the number of execution windows with the split information to determine whether the number of execution windows corresponds to a maximum value set in the pre-defined split information at operation 3607. For example, the controller 170 may determine whether the pre-defined split information is 3 and the number of currently executed windows is 3. If the number of execution windows does not correspond to the maximum value set in the split information (NO of operation 3607), the controller 170 controls execution of a corresponding operation at operation 3609.
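  • Operation 3607 is essentially a bounded-count check against the pre-defined split information. A minimal sketch, assuming the maximum of 3 used in the example above; the function name is illustrative.

```python
# True if the screen may still be split to open another window, i.e.,
# the number of execution windows is below the pre-defined maximum.
def can_split_further(execution_windows, max_split=3):
    return len(execution_windows) < max_split
```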
  • For example, the controller 170 may control an additional screen split for executing the additional application, execution of the additional application according thereto, and the screen display for a plurality of applications. This may correspond to an operation for controlling execution of the additional application due to a screen split on the full screen, as illustrated in the example of FIG. 35.
  • the controller 170 traces and determines a location for a user input selecting an execution region for executing the additional application at operation 3611 . For example, when the user selects an execution icon of an application to be additionally executed from the tray 300 and moves the selected icon into the screen, the controller 170 may trace and determine a moved location of the execution icon.
  • The controller 170 feeds back an execution region in which the additional application is able to be executed, in response to the determined location, at operation 3613. For example, when the execution icon is moved from the tray 300 and enters the screen, the controller 170 focuses and displays the window of the location to which the execution icon is moved.
  • the controller 170 executes the additional application and controls processing of a previous application executed in a corresponding execution region as a background at operation 3617 .
  • For example, the controller 170 may process the application previously executed through the window selected to execute the additional application as a background application, and may display the screen of the additional application which is requested to be executed through the corresponding window. That is, the controller 170 may process the previous application allocated to the corresponding window as a background application so that it continues executing, and may simply replace the screen displayed in the corresponding window.
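  • The background handoff at operation 3617 can be modeled as a window that stacks its previous foreground application rather than terminating it. An illustrative toy model, not the controller's actual code; the class and attribute names are assumptions.

```python
# The previously displayed application keeps running in the background;
# only the screen shown in the window is replaced by the new application.
class Window:
    def __init__(self, app):
        self.foreground = app
        self.background = []  # still-executing, no longer displayed apps

    def show(self, new_app):
        self.background.append(self.foreground)
        self.foreground = new_app
```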
  • the controller 170 may control a screen display corresponding to a window size of an execution region in which the additional application is executed at operation 3619 .
  • the controller 170 may display a screen of the additional application in a window of a corresponding execution region as a full screen or a partial screen.
  • When the additional application is an application having a media playing capability, such as a video, the controller 170 changes to a screen having a suitable size corresponding to the window size of the corresponding execution region, and may display a playing screen in the window as a full screen.
  • the controller 170 may display a partial screen corresponding to a window size of the corresponding execution region.
  • The computer readable recording medium may include a program command, a data file, and a data structure, individually or in combination.
  • The program command recorded in the recording medium may be specially designed or configured for the present disclosure, or may be known to and used by a person having ordinary skill in the computer software field.
  • The computer readable recording medium includes magnetic media such as a hard disk, a floppy disk, or a magnetic tape, optical media such as a Compact Disc Read Only Memory (CD-ROM) or a Digital Versatile Disc (DVD), magneto-optical media such as a floptical disk, and a hardware device such as a ROM.
  • The program command includes a machine language code created by a compiler and a high-level language code executable by a computer using an interpreter.
  • the foregoing hardware device may be configured to be operated as at least one software module to perform an operation of an embodiment of the present disclosure, and vice versa.
  • embodiments provide a program comprising code for implementing apparatus or a method as claimed in any one of the claims of this specification and a machine-readable storage storing such a program. Still further, such programs may be conveyed electronically via any medium, for example a communication signal carried over a wired or wireless connection and embodiments suitably encompass the same.
  • The user may simultaneously use a plurality of applications in a determined split screen or a free style through a simple method. For example, in order to split the screen to use a multi-window in a state in which one application is executed as a full screen, the user drags an additional application from the tray and drops the application at a determined location or a free location, thereby simultaneously operating a plurality of applications.
  • The user may easily arrange and confirm a plurality of applications on one screen through a multi-window, and freely change each window of the multi-window to a desired layout, thereby relieving the burden and trouble with respect to an efficient configuration of the screen and the operation of a plurality of applications.
  • the user may efficiently and simultaneously perform an operation with respect to various applications by a multi-window environment on a small screen of the touch device.
  • the user may simultaneously perform other operations such as creation of messages and mail while viewing and listening to a video on one screen of the touch device.
  • An optimal environment capable of supporting a multi-window environment in the touch device is implemented so that convenience for the user can be improved, and the usability, convenience, and competitiveness of the touch device can be improved.
  • The present disclosure may be simply implemented in various types of touch devices and various corresponding devices.
  • Certain embodiments aim to achieve the technical effect of enhancing the precision of an input device.
  • Certain embodiments aim to achieve the technical effect of lowering a burden (e.g. a cognitive, operative, operational, operating, or manipulative burden) of a user when performing certain computer or device interactions.
  • Certain embodiments aim to achieve the technical effect of providing a more efficient man-machine (user-machine) interface.

Abstract

A method of executing an application in a touch device is provided. The method includes displaying an execution screen of a first application as a full screen, receiving an input of an execution event for executing a second application, configuring a multi-window in a split scheme when the execution event is released on a specific window, and independently displaying screens of the first application and the second application through respective split windows.

Description

    CROSS-REFERENCE TO RELATED APPLICATION(S)
  • This application claims the benefit under 35 U.S.C. §119(a) of a Korean patent application filed on Sep. 24, 2012 in the Korean Intellectual Property Office and assigned Serial No. 10-2012-0105898, the entire disclosure of which is hereby incorporated by reference.
  • TECHNICAL FIELD
  • The present disclosure relates to a method and an apparatus for operating a function in a touch device. More particularly, the present disclosure relates to a method of providing a multi-window in a touch device so that a plurality of applications may be efficiently used through multi-splitting of a window on one screen provided from the touch device, and an apparatus thereof.
  • BACKGROUND
  • In recent years, with the development of digital technology, various mobile devices such as a mobile communication terminal, a Personal Digital Assistant (PDA), an electronic note device, a smart phone, a tablet Personal Computer (PC), and the like, each capable of processing communication and personal information while a user is moving, have been introduced. These mobile devices have developed to a mobile convergence stage encompassing the traditional field of communication and other terminal fields. The mobile device may have various functions, such as the ability to process an audio call or an image call, to transmit and receive messages such as a Short Message Service (SMS)/Multimedia Message Service (MMS) message or an e-mail, and to provide an electronic note, photography, a broadcast play, a video play, a music play, information from the Internet, a messenger, and a Social Networking Service (SNS).
  • However, due to the characteristic of the touch device having a small screen, only one application view can be provided at once, and any additional application is displayed through a pop-up. Accordingly, in the related art, due to the small size of the screen, although a plurality of applications are simultaneously executed, only one application view is provided on the current screen according to the user selection. That is, the related art cannot efficiently use a plurality of applications.
  • Therefore, a need exists for a method and apparatus in which a plurality of applications may be efficiently used by splitting a window displayed on one screen of the touch device.
  • The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present disclosure.
  • SUMMARY
  • Aspects of the present disclosure are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present disclosure is to provide a method of implementing a multi-window environment in a single system of a touch device composed of at least two split windows and an apparatus thereof.
  • Another aspect of the present disclosure is to provide a method of providing a multi-window in a touch device capable of maximizing the usability of the touch device by a user by splitting one screen into at least two windows to easily arrange and execute a plurality of applications and an apparatus thereof.
  • Another aspect of the present disclosure is to provide a method of supporting a multi-window environment in a touch device capable of simply changing a layout for convenience of an operation of a plurality of applications in the multi-window environment and supporting the convenience of a user operation in the multi-window environment, and an apparatus thereof.
  • Another aspect of the present disclosure is to provide a method of supporting a multi-window in a touch device capable of minimizing a burden of a user operation in a multi-window environment and increasing the user's convenience with respect to a plurality of applications by freely adjusting windows with respect to the plurality of applications, and an apparatus thereof.
  • Another aspect of the present disclosure is to provide a method of supporting a multi-window environment in a touch device capable of supporting large amounts of information and various experiences to the user by implementing a multi-window environment in a touch device and an apparatus thereof.
  • Another aspect of the present disclosure is to provide a method of supporting a multi-window environment capable of improving convenience for a user and usability of the touch device by implementing an optimal environment for supporting the multi-window environment in the touch device, and an apparatus thereof.
  • In accordance with an aspect of the present disclosure, a method of executing an application in a touch device is provided. The method includes displaying an execution screen of a first application as a full screen, receiving an input of an execution event for executing a second application, configuring a multi-window in a split scheme when the execution event is released on a specific window, and individually displaying screens of the first application and the second application through respective split windows.
  • In accordance with another aspect of the present disclosure, a method of executing an application in a touch device is provided. The method includes executing a first application corresponding to a user selection and displaying the application through one window as a full screen, receiving a first event input for selecting and moving a second application when the first application is executed, determining a multi-window split scheme and a region to which a first event is input, outputting a feedback for a window in which the second application is able to be executed and the region to which the first event is input, receiving a second event input for executing the second application, configuring the multi-window in response to the second event input, and independently displaying a screen of the first application and a screen of the second application through corresponding windows separated by the multi-window.
  • In accordance with another aspect of the present disclosure, a method of executing an application in a touch device is provided. The method includes displaying an execution screen of a first application as a full screen, sliding-in a tray including an execution icon of an application according to a user input when the first application is executed, receiving an input for selecting an execution icon of a second application from the tray and dragging the selected execution icon into the full screen, receiving an input for dropping the execution icon in a specific window while the execution icon is dragged, executing the second application in response to the drop input of the execution icon, splitting a full screen into windows for displaying screens of the first application and the second application, and displaying a screen of the second application through the specific window in which the execution icon is dropped and displaying the screen of the first application through another split window.
  • In order to achieve the above objects, a computer readable recording medium having recorded thereon a program for executing the methods in a processor is provided.
  • In accordance with another aspect of the present disclosure, a touch device is provided. The touch device includes a touch screen configured to display a screen interface of a multi-window environment, to display screens of a plurality of applications through a plurality of windows split in the screen interface, and to receive an event input for operating the applications, and a controller configured to control execution of the applications in the multi-window environment, and to control to independently display screens of at least two applications through the windows according to a user selection from among a plurality of executed applications.
  • In accordance with another aspect of the present disclosure, a computer readable recording medium having recorded thereon a program performing a method is provided. The method includes receiving an input of an execution event for executing a second application when an execution screen of a first application is displayed as a full screen, configuring a multi-window in a split scheme when the execution event is released on a specific window, and individually displaying screens of the first application and the second application through respective split windows.
  • Other aspects, advantages, and salient features of the disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses various embodiments of the present disclosure.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other aspects, features, and advantages of certain embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
  • FIG. 1 is a block diagram schematically illustrating a configuration of a touch device according to an embodiment of the present disclosure;
  • FIG. 2 is a diagram of a screen schematically illustrating a screen interface in a touch device according to an embodiment of the present disclosure;
  • FIG. 3 is a diagram schematically illustrating an operation of a multi-window in a touch device according to an embodiment of the present disclosure;
  • FIG. 4 is a diagram schematically illustrating an operation for splitting a multi-window in a touch device according to an embodiment of the present disclosure;
  • FIGS. 5, 6, 7, 8, 9, 10, 11, and 12 are diagrams illustrating examples of an operation screen operating a tray for rapidly executing an application in a multi-window environment according to an embodiment of the present disclosure;
  • FIGS. 13, 14, 15, 16, and 17 are diagrams illustrating examples of an operation screen operating a plurality of applications in a multi-window environment according to an embodiment of the present disclosure;
  • FIGS. 18, 19, 20, 21, 22, and 23 are diagrams illustrating examples of operating a plurality of applications in a multi-window environment according to an embodiment of the present disclosure;
  • FIGS. 24, 25, 26, 27, 28, and 29 are diagrams illustrating examples of operating a key pad for text input in a multi-window environment according to an embodiment of the present disclosure;
  • FIG. 30 is a diagram illustrating an example of operating a plurality of applications in a multi-window environment according to an embodiment of the present disclosure;
  • FIGS. 31, 32, 33, and 34 are diagrams illustrating examples of an operation screen providing information with respect to a plurality of applications executed according to a multi-window environment in a touch device according to an embodiment of the present disclosure;
  • FIG. 35 is a flowchart illustrating a method of executing an additional application by switching a multi-window environment in a touch device according to an embodiment of the present disclosure; and
  • FIG. 36 is a flowchart illustrating a method of executing an additional application in a multi-window environment in a touch device according to an embodiment of the present disclosure.
  • The same reference numerals are used to represent the same elements throughout the drawings.
  • DETAILED DESCRIPTION
  • The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the present disclosure as defined by the claims and their equivalents. It includes various specific details to assist in that understanding, but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the various embodiments described herein can be made without departing from the scope and spirit of the present disclosure. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
  • The terms and words used in the following description and claims are not limited to the bibliographical meanings, but, are merely used by the inventor to enable a clear and consistent understanding of the present disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the present disclosure is provided for illustration purpose only and not for the purpose of limiting the present disclosure as defined by the appended claims and their equivalents.
  • It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
  • The present disclosure relates to a method and apparatus for providing a multi-window in a touch device, in which a screen of the touch device is split into at least two windows in a split scheme to provide a multi-window, allowing a user to efficiently use a plurality of applications through the multi-window on one screen.
  • Embodiments of the present disclosure may determine a screen split scheme when an additional application is selected and dragged in the touch device, and may provide feedback indicating the window, from among the respective windows split from one screen, in which the additional application is able to be executed. Accordingly, the user may know where the additional application will be executed. Further, according to an embodiment of the present disclosure, when the additional application is executed at a location selected by the user, a screen of the application may be displayed suitably for the size of the corresponding window.
  • Hereinafter, a configuration of a touch device and a method of controlling an operation thereof according to embodiments of the present disclosure will be described with reference to the accompanying drawings. A configuration of the touch device and a method of controlling an operation thereof according to embodiments of the present disclosure are not limited to the following description, but are also applicable to various additional embodiments based on the embodiments described herein.
  • FIG. 1 is a block diagram schematically illustrating a configuration of a touch device according to an embodiment of the present disclosure.
  • Referring to FIG. 1, the touch device of the present disclosure may include a Radio Frequency (RF) communication unit 110, a user input unit 120, a display unit 130, an audio processor 140, a memory 150, an interface unit 160, a controller 170, and a power supply 180. Since constituent elements shown in FIG. 1 may not be essential, a touch device of the present disclosure may be implemented with more than the above described elements or less than the above described elements.
  • The RF communication unit 110 may include one or more modules capable of performing wireless communication between the touch device and a wireless communication system, or between the touch device and a network in which another device is located. For example, the RF communication unit 110 may include a mobile communication module 111, a Wireless Local Area Network (WLAN) module 113, a short range communication module 115, a location calculation module 117, and a broadcasting reception module 119.
  • The mobile communication module 111 transmits and receives a wireless signal to and from at least one of a base station, an external terminal, and various servers (e.g., an integration server, a provider server, a content server, or the like). The wireless signal may include a voice call signal, an image call signal, or data of various formats according to the transmission/reception of a character/multimedia message. The mobile communication module 111 may access at least one of the various servers under control of the controller 170 to receive an application available in the touch device according to user selection.
  • The WLAN module 113 may be a module for access to wireless Internet and for forming a wireless LAN link with another touch device, and may be installed inside or outside the touch device. Wireless Internet techniques may include Wireless LAN/Wi-Fi (WLAN), Wireless broadband (Wibro), World Interoperability for Microwave Access (Wimax), and High Speed Downlink Packet Access (HSDPA). The WLAN module 113 may access at least one of various servers to receive an application usable in the touch device according to user selection under control of the controller 170. Further, when a WLAN link is formed with another touch device, the WLAN module 113 may transmit or receive an application according to the user selection to or from the other touch device.
  • The short range communication module 115 is a module for short range communication. The short range communication techniques may include Bluetooth, Bluetooth Low Energy (BLE), Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and Near Field Communication (NFC). When the short range communication module 115 connects short range communication with another touch device, the short range communication module 115 may transmit or receive an application according to the user selection to or from another touch device.
  • The location calculation module 117 is a module for acquiring a location of the touch device. For example, the location calculation module 117 includes a Global Position System (GPS). The location calculation module 117 may calculate distance information from at least three base stations and exact time information, and apply trigonometry to the calculated information so that three-dimensional current location information according to latitude, longitude, and altitude may be calculated. The location calculation module 117 may continuously receive a current location of the touch device from at least three satellites in real time to calculate location information. The location information of the touch device may be acquired by various schemes.
  • The broadcasting reception module 119 receives a broadcasting signal (e.g., a TV broadcasting signal, a radio broadcasting signal, a data broadcasting signal) and/or information (e.g., a broadcasting channel, a broadcasting program, or information about a broadcasting service provider) from an external broadcasting management server through a broadcasting channel (e.g., a satellite channel or a terrestrial channel).
  • The user input unit 120 generates input data for controlling an operation of the touch device by the user. The user input unit 120 may be configured by a key pad, a dome switch, a touch pad (e.g., a resistive/capacitive type), a jog wheel, and a jog switch. The user input unit 120 may be implemented in the form of a button outside the touch device, and some buttons may be implemented by a touch panel.
  • The display unit 130 displays (i.e., outputs) information processed by the touch device. For example, when the touch device is in a call mode, the display unit 130 displays a User Interface (UI) or Graphical UI (GUI) associated with a call. When the touch device is in an image call mode or a shooting mode, the display unit 130 displays a photographed and/or received image, or the UI and GUI.
  • In the present disclosure, the display unit 130 may display an execution screen with respect to various functions (or applications) executed in the touch device through one or more windows, as will be illustrated in relation to the following figures, for instance FIG. 3. The execution screen may therefore display data relating to multiple applications. In particular, the display unit 130 may provide at least two split screen regions according to a split scheme, and may provide each split screen region as one window to form a multi-window. That is, the display unit 130 may display a screen corresponding to the multi-window environment, and may display execution screens with respect to a plurality of applications through the multi-window, that is, through the split regions. In this case, the display unit 130 may simultaneously display a screen of one window and a screen of another window in parallel. The display unit 130 may display a separator for separating the respective windows, that is, the split regions. The display unit 130 may display a tray (or an application launcher) for efficiently and intuitively executing an application according to the multi-window environment. The tray comprises a screen region in which, for instance, icons representing respective applications may be displayed and selected. The tray may comprise a pop-up object displayed upon the screen. The tray may be moved within the screen. The display unit 130 may display a virtual input device (e.g., a touch key pad or a floating key pad which is freely moved in a full screen region). Further, the display unit 130 may receive a user input on a full screen (the whole of the available screen area of the display unit 130) or on an individual window screen provided through one or more windows in a multi-window environment, and may transfer an input signal according to the user input to the controller 170.
Further, the display unit 130 may support screen display in a landscape mode, screen display in a vertical mode (portrait mode), and a screen switch display between the landscape mode and the portrait mode according to the orientation, or a change in the orientation, of the touch device. An embodiment of a screen of the display unit 130 operated according to an embodiment of the present disclosure will be described herein.
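For illustration only, the splitting of one screen into window regions described above can be sketched as follows. The function name, the coordinate convention, and the split ratio are assumptions of this sketch, not part of the disclosure; a landscape split places the windows side by side, while a portrait split stacks them.

```python
# Illustrative sketch (hypothetical names): compute the pixel rectangles of two
# split windows from a full-screen size, a split ratio, and the orientation.
def split_windows(width, height, ratio=0.5, landscape=True):
    """Return (window_a, window_b), each as an (x, y, w, h) rectangle."""
    if landscape:
        # Side-by-side split: window A takes the left portion of the screen.
        w_a = int(width * ratio)
        return (0, 0, w_a, height), (w_a, 0, width - w_a, height)
    # Stacked split: window A takes the top portion of the screen.
    h_a = int(height * ratio)
    return (0, 0, width, h_a), (0, h_a, width, height - h_a)
```

For example, an even landscape split of a 1280x720 screen yields two 640x720 windows, each of which would then display one application's execution screen scaled to its region.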
  • The display unit 130 may include at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor-Liquid Crystal Display (TFT LCD), a Light Emitting Diode (LED), an Organic Light-Emitting Diode (OLED), an Active Matrix OLED (AMOLED), a flexible display, a bendable display, and a 3D display. Some of the above displays may be implemented as a transparent display configured in a transparent type or a light transmittance type, such that the outside is visible therethrough.
  • When a touch panel detecting a touch operation forms a layer structure with the display unit 130 (e.g., a “touch screen”), the display unit 130 may be used as an input device as well as an output device. The touch panel may convert pressure applied to a specific part of the display unit 130, or a variation in capacitance created at the specific part of the display unit 130, into an electric input signal. The touch panel may detect a touched location, an area, or pressure upon touch. When there is a touch input with respect to the touch panel, a signal(s) corresponding to the touch input is sent to a touch controller (not shown). The touch controller (not shown) processes the signal(s) and transmits corresponding data to the controller 170. Accordingly, the controller 170 may recognize which region of the display unit 130 is touched.
  • The audio processor 140 transmits an audio signal input from the controller 170 to a speaker 141, and transfers an audio signal such as a voice input from the microphone 143 to the controller 170. The audio processor 140 converts voice/sound data into an audible sound and outputs the audible sound through the speaker 141 under the control of the controller 170. The audio processor 140 may convert an audio signal such as a voice input from the microphone 143 into a digital signal, and may transfer the digital signal to the controller 170.
  • The speaker 141 may output audio data received from the RF communication unit 110 or stored in the memory 150 in a call mode, a record mode, a media contents play mode, a photographing mode, or a multimedia mode. The speaker 141 may output a sound signal associated with a function (e.g., a receiving call connection, a sending call connection, a music file play, a video file play, an external output, or the like) performed in the touch device.
  • The microphone 143 may receive and process an external sound signal into electric voice data in a call mode, a record mode, a voice recognition mode, or a photographing mode. The processed voice data are converted into a transmissible format, and the converted data are output to a mobile communication base station through the mobile communication module 111. Various noise removal algorithms for removing noise generated during the procedure of receiving an external sound signal may be implemented in the microphone 143.
  • The memory 150 may store a program for processing and control of the controller 170, and may temporarily store input/output data (e.g., a telephone number, a message, audio, media contents [e.g., a music file or a video file], or an application). The memory 150 may store a use frequency (e.g., frequencies in the use of an application, media contents, a phone number, a message, or multimedia), an importance, a priority, or a preference according to a function operation of the touch device. The memory 150 may store data regarding a vibration or a sound of various patterns output upon touch input on the touch screen. In particular, the memory 150 may store split information with respect to a screen split scheme for operating a multi-window, application information to be registered in the tray, or application information executed by multi-tasking through the multi-window.
  • The memory 150 may include a storage medium having at least one of memory types including a flash memory type, a hard disk type, a micro type, a card type (e.g., an SD card or XD card memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Programmable Read-Only Memory (PROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Magnetic RAM (MRAM), a magnetic disc, or an optical disc. The touch device may operate associated with a web storage executing a storage function of the memory 150 on Internet.
  • The interface unit 160 serves as a passage to all external devices connected to the touch device. The interface unit 160 may receive data or power from an external device, transfer the data or power to each element inside of the touch device, or transmit data from the inside of the touch device to an external device. For example, the interface unit 160 may include a wire/wireless headset port, an external charger port, a wire/wireless data port, a memory card port, a port for connecting a device having an identity module, an audio I/O (input/output) port, a video I/O (input/output) port, and an earphone port. The interface unit 160 includes an interface for connecting with an external device in a wired or wireless scheme.
  • The controller 170 controls an overall operation of the touch device. For example, the controller 170 performs control associated with an operation of an application according to a voice call, a data communication, an image call, or operating a multi-window environment. The controller 170 may include a separate multi-media module (not shown) for operating a multi-window function. According to certain embodiments of the present disclosure, the multi-media module (not shown) may be implemented in the controller 170 and may be implemented separately from the controller 170.
  • More particularly, the controller 170 may control a series of operations for supporting a multi-window function according to embodiments of the present disclosure. For example, the controller 170 may control execution of a plurality of applications in a multi-window environment. The controller 170 may control the independent display of screens relating to at least two applications, according to user selection from among a plurality of executed applications, through the plurality of windows.
  • For example, the controller 170 may receive an execution event input, for instance a touch input, for executing a second application in a state in which an execution screen of the first application is displayed as a full screen (that is, occupying all or substantially all of the available screen area within the display unit 130). The controller 170 may control a feedback output (for instance, visual feedback) with respect to a window where a dragged icon relating to the second application is currently located, or another movement location before the execution event is released. If the execution event is released when located over a specific window, the controller 170 may configure a multi-window according to a pre-set split scheme, and may control to independently display a screen of the first application and the second application through respective split windows.
  • Further, when an input requesting execution of an additional application is received while displaying screens of a plurality of applications through multi-windows, the controller 170 may control execution of the additional application through a window selected to execute the additional application. In this case, the controller 170 processes an application previously executed through the selected window in the background (that is, without continuing to display the executing application), and controls to display the additional application screen through the selected window.
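The window behavior just described (an additional application takes over a selected window while the previously displayed application keeps executing in the background) can be sketched as follows. The class and method names are illustrative assumptions of this sketch, not identifiers from the disclosure.

```python
# Illustrative sketch (hypothetical names): a window that keeps at most one
# foreground application and moves any displaced application to the background,
# where it continues executing without being displayed.
class Window:
    def __init__(self):
        self.foreground = None   # application currently displayed in this window
        self.background = []     # applications still executing but not displayed

    def launch(self, app):
        """Execute `app` in this window, backgrounding the previous application."""
        if self.foreground is not None:
            self.background.append(self.foreground)
        self.foreground = app
```

For instance, launching a MAP application in a window already displaying an Internet application would display the MAP screen while the Internet application continues in the background.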
  • Further, the controller 170 may control the display of a tray, a separator, or a floating key pad provided from a screen interface according to the multi-window environment. The controller 170 may allow the displayed tray, separator or floating key pad to be moved within the screen according to a user input or otherwise. More particularly, the controller 170 may determine (i.e., change) the size of each window according to the multi-window environment in accordance with the movement of the separator.
  • A detailed control operation of the controller 170 will be described in an example of an operation of the touch device and a control method thereof referring to the following drawings.
  • The power supply 180 uses power which is applied from an external power source or an internal power source thereto, and supplies power necessary to operate each constituent element under control of the controller 170.
  • Various embodiments according to the present disclosure may be implemented in a recording medium which may be read by a computer or a similar device using software, hardware or a combination thereof. According to hardware implementation, various embodiments of the present disclosure may be implemented using at least one of Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and an electric unit for executing the functions. In some cases, embodiments of this disclosure may be implemented by the controller 170. According to the software implementation, various embodiments of procedures and functions according to this specification may be implemented by separate software modules. The software modules may perform one or more functions and operations described in the specification.
  • The recording medium may include a computer readable recording medium recording a program processing to receive an input of an execution event for executing a second application in a state in which an execution screen of a first application is displayed as a full screen, to output feedback with respect to a window of a moved location when the execution event is moved while not being released, to configure a multi-window according to a preset split scheme when the execution event is released on the specific window to which it was moved, and to independently display screens of the first and second applications through respective split windows.
  • Further, the touch device of the present disclosure illustrated in FIG. 1 may include various information communication devices, multi-media devices supporting a function of the present disclosure, and an application device thereof, such as various devices using an Application Processor (AP), a Graphic Processing Unit (GPU), and a Central Processing Unit (CPU). For example, the touch device includes devices such as a tablet Personal Computer (PC), a Smart Phone, a digital camera, a Portable Multimedia Player (PMP), a media player, a portable game terminal, a Personal Digital Assistant (PDA) as well as mobile communication terminals operating based on respective communication protocols corresponding to various communication systems.
  • FIG. 2 is a diagram of a screen schematically illustrating a screen interface in a touch device according to an embodiment of the present disclosure.
  • Referring to FIG. 2, a screen interface for supporting a multi-window environment in a touch device according to certain embodiments of the present disclosure includes execution regions 210 and 230 split from one screen to display an execution screen of an application. That is, within the screen there are separate execution regions 210 and 230 in which execution screens relating to separate applications can be displayed. Each execution region 210 and 230 may be referred to as a separate window, and collectively the separate windows may be referred to as multi-windows or a multi-window environment. Furthermore, the screen interface includes a separator 200 separating the at least two execution regions 210 and 230 split according to a split scheme, to adjust a window size of the execution regions 210 and 230. The split scheme refers to the relative disposition and size of the two or more execution regions 210 and 230 or windows within the multi-window environment. It will be appreciated that if there are more than two windows within the multi-window environment, then further separators may be required. The respective execution regions 210 and 230 split according to the multi-window environment may include a navigation region, a scroll region, or a text input region which are independently formed according to the respective execution applications.
  • Further, the screen interface of the present disclosure provides a tray 300 for conveniently supporting execution of an application using respective windows separated as a multi-window. The tray 300 includes execution icons (or shortcut icons) 400 for all applications executable in the touch device, or only for some applications according to settings of the user. The tray 300 may be arranged such that it appears to slide-in (i.e., be displayed) on the screen, or to slide-out and be hidden from the screen. The tray 300 may include a handle item 350 capable of receiving a user command (for instance a touch input or a touch and drag input) for switching between the slide-in and slide-out states. In addition, the tray 300 may support scrolling through the execution icons 400 in the tray 300, and the execution icons 400 in the tray 300 may be edited, added, or removed according to user selection. Although it has been illustrated in FIG. 2 that the tray 300 is disposed in a row, the tray 300 may be disposed in two or more rows, which may be changed according to user selection.
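The tray behavior above (a handle item toggling between slide-in and slide-out states, with user-editable execution icons) can be sketched as follows. The class and method names are hypothetical, introduced only for this sketch.

```python
# Illustrative sketch (hypothetical names): a tray that toggles visibility when
# its handle item is touched, and whose execution icons can be added or removed.
class Tray:
    def __init__(self, icons=None):
        self.icons = list(icons or [])
        self.visible = False     # slide-out (hidden) state by default

    def handle_touched(self):
        """Toggle between the slide-in (visible) and slide-out (hidden) states."""
        self.visible = not self.visible

    def add_icon(self, app):
        self.icons.append(app)

    def remove_icon(self, app):
        self.icons.remove(app)
```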
  • Although it has been illustrated in FIG. 2 that a screen of a touch device is split into two execution regions (i.e., windows) 210 and 230 through one separator 200, according to an embodiment of the present disclosure, the screen of the touch device may be split into a larger number of windows up to a maximum number N (N>1, N=natural number), where N is proportional to the screen size. Accordingly, one or more separators 200 may be provided in response to the number of windows, that is, according to the split scheme that configures the multi-window environment. For example, when the screen of the touch device is split into two execution regions as shown in FIG. 2, one separator 200 may be provided. When the screen of the touch device is split into three execution regions, two separators 200 may be provided. When the screen of the touch device is split into four execution regions, two or three separators 200 may be provided according to the split scheme.
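The relationship between the number of windows and the number of separators can be sketched as follows. The function names are hypothetical; a linear split of N windows uses N-1 separators, while a grid arrangement of rows x cols windows can use (rows - 1) + (cols - 1) separators, which matches the "two or three separators" cases for four execution regions described above.

```python
# Illustrative sketch (hypothetical names): separator counts for two split schemes.
def separators_linear(n_windows):
    """A row or column of N windows needs N-1 separators between them."""
    return max(n_windows - 1, 0)

def separators_grid(rows, cols):
    """A rows x cols grid can be separated by (rows-1) + (cols-1) separators."""
    return (rows - 1) + (cols - 1)
```

For example, four windows in a row need three separators, while a 2x2 grid of four windows needs only two.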
  • FIG. 3 is a diagram schematically illustrating an operation of a multi-window in a touch device according to an embodiment of the present disclosure.
  • Referring to FIG. 3, a screen example of reference numeral <301> indicates a screen example of a touch device when the touch device executes an Internet application. More particularly, the screen of reference numeral <301> indicates a state in which the Internet application is displayed as a full screen through one window. The full screen consumes all or substantially all of the available screen space (which may, for instance, be less than the total screen size to allow for status bars to be continuously displayed).
  • A screen example of reference numeral <303> indicates a screen example of a touch device when two applications are executed through a multi-window. For example, the user may additionally execute a map (MAP) application in a state in which a full screen of the Internet application is displayed. Accordingly, as shown in the screen example of reference numeral <303>, one screen is split into different execution regions by two windows through the separator 200, and execution screens of the Internet application and the MAP application are provided through the respective execution regions (windows). In this manner, a plurality of applications may be simultaneously operated through at least two split windows according to embodiments of the present disclosure.
  • A screen example of reference numeral <305> indicates a screen example where sizes of respective windows are changed according to a user operation from the screen of reference numeral <303>. For example, the user moves (e.g., a touch & drag) the separator 200 to adjust a window size of the execution region in which the Internet application is executed and the execution region in which the MAP application is executed. According to embodiments of the present disclosure, when adjusting the window size by movement of the separator 200, the screen size of the application may be suitably changed according to a variation in the window size of the corresponding execution region.
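The separator-drag resizing just described can be sketched as follows for a two-window landscape split. The function name, the clamping rule, and the minimum window width are assumptions of this sketch, not values from the disclosure.

```python
# Illustrative sketch (hypothetical names): dragging the separator moves the
# split boundary, and each window's width is recomputed so the application
# screens can be rescaled to the new sizes.
def move_separator(width, separator_x, drag_dx, min_window=100):
    """Return (new separator position, left (x, w), right (x, w)) after a drag,
    clamped so neither window shrinks below min_window pixels."""
    new_x = separator_x + drag_dx
    new_x = max(min_window, min(width - min_window, new_x))
    left = (0, new_x)               # left window spans [0, new_x)
    right = (new_x, width - new_x)  # right window spans [new_x, width)
    return new_x, left, right
```

For example, dragging the separator of a 1280-pixel-wide screen 200 pixels to the right from the center widens the left window to 840 pixels and narrows the right window to 440 pixels, after which each application screen is redrawn at its window's new size.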
  • FIG. 4 is a diagram schematically illustrating an operation for splitting a multi-window in a touch device according to an embodiment of the present disclosure.
  • Referring to FIG. 4, a screen example of reference numeral <401> indicates a case where a screen is split into two windows for a multi-window environment, and shows a screen example in which an application A and an application B are executed through the two windows separated by one separator 200.
  • Screen examples of reference numerals <403> and <405> indicate cases where a screen is split into three windows for a multi-window environment, and show screen examples in which applications A, B, and C are executed through the three windows using two separators 200.
• As illustrated in the screen examples of reference numerals <403> and <405>, the screen of the present disclosure may be split into various forms according to settings of the user, and the split scheme may be pre-defined.
  • FIGS. 5, 6, 7, 8, 9, 10, 11, and 12 are diagrams illustrating examples of an operation screen operating a tray for rapidly executing an application in a multi-window environment according to embodiments of the present disclosure.
  • Referring to FIGS. 5, 6, 7, 8, 9, 10, 11, and 12, FIG. 5 illustrates a screen example of a touch device when the touch device displays an idle screen (or home screen).
  • Although an idle screen is displayed as a full screen in the screen example of FIG. 5, an execution screen of a specific application may be displayed as a full screen. More particularly, FIG. 5 illustrates an example where the idle screen is operated in a normal mode before operating the multi-window environment. That is, according to an embodiment of the present disclosure, the touch device may be operated in a multi-window mode and a normal mode and may switch between the two.
• The user may activate the tray 300 so that it is displayed on the idle screen as illustrated in FIG. 6, in a state in which the idle screen is displayed, according to an embodiment of the present disclosure. For example, the user may input a menu operation through the displayed idle screen of the touch device to display the tray 300. Alternatively, the tray 300 may be displayed through selection of a function key for executing a multi-window mode, or in response to a touch event set to execute the multi-window mode (e.g., a gesture having a specific pattern such as figures and characters). Accordingly, the touch device may activate and display the tray 300 on a pre-set region of the idle screen as shown in FIG. 6. For example, the tray 300 may be disposed at a left frame (a left edge) of a rectangular full screen such that the full screen (currently displaying the idle screen in FIG. 6) is reduced in size. The tray 300 may also be provided in the form of an overlay through a separate layer on a currently displayed screen, and may have a handle item 350, such that the tray 300 overlaps the idle screen, as shown in FIG. 6.
• The user may input a movement event (e.g., a touch & drag) moving the tray 300 to another region of the screen as shown in FIG. 7, in a state in which the tray 300 is displayed on the idle screen, according to an embodiment of the present disclosure. For example, the user may touch a part of the tray 300 and input a movement event dragging the tray to a different part of the screen, for instance, in an opposite direction of the screen (e.g., toward a right frame of a window (a right edge of the screen)). Accordingly, the touch device may provide a User Interface (UI) or a Graphic User Interface (GUI) that separates the tray 300 from the left frame according to the movement event and moves the tray 300 along with the drag of the user. In this case, when the tray 300 is moved in a specific direction beyond a predetermined range (e.g., based on a center of the screen) in response to the drag movement of the user, the touch device may change and display a direction of the handle item 350 of the tray 300. That is, the touch device may differently display the handle item 350 for sliding-in the tray 300 on the screen according to the region in which the tray 300 is located. For example, the handle item 350 illustrated in FIG. 6 may be switched to the direction of the handle item 350 as illustrated in FIG. 7 according to the movement of the tray 300.
• Referring to FIG. 7, the user may move the tray 300 close to a desired region and release the input movement event. That is, the user may release the drag input for moving the tray 300. Then, the touch device may determine the region to which the tray 300 has been moved and arrange and display the tray 300 on the determined region. For example, as shown in FIG. 8, the touch device may arrange and provide the tray 300 at a right frame of a window (a right edge of the screen). That is, if the user input for moving the tray 300 is released, the touch device displays a screen as illustrated in FIG. 8, and the tray 300 provided on the screen of the touch device shown in FIG. 6 is switched as illustrated in FIG. 8 according to the movement of the tray 300. The touch device may determine the arranged region of the tray 300 according to a movement degree of the tray 300. For example, the touch device may arrange the tray 300 at the window frame (screen edge) closest to the moved region (based on a point of contact of the user input on the tray 300). For instance, when the user input is released while the tray 300 is closest to the left frame (the left edge of the screen), the tray 300 is arranged and displayed at the left frame (the left edge). When the user input is released while the tray 300 is closest to the right, upper, or lower frame (edge of the screen), the tray 300 is arranged and displayed at that respective frame (edge).
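The "closest frame" determination described above can be sketched as a simple distance comparison; the function name and the coordinate convention (origin at the upper-left corner) are assumptions for illustration, not taken from the disclosure.

```python
def nearest_edge(x, y, width, height):
    """Return the screen edge ('left', 'right', 'top', 'bottom') closest
    to the point where the drag on the tray was released."""
    distances = {
        "left": x,             # distance to the left frame
        "right": width - x,    # distance to the right frame
        "top": y,              # distance to the upper frame
        "bottom": height - y,  # distance to the lower frame
    }
    return min(distances, key=distances.get)
```

The tray would then be snapped to and displayed along the returned edge when the user input is released.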
  • In this manner, screen examples where the tray 300 is arranged in different locations according to a user input are illustrated in FIG. 6 (arranged at a left frame), in FIG. 8 (arranged at a right frame), in FIG. 9 (arranged at an upper frame), and in FIG. 10 (arranged at a lower frame). That is, according to embodiments of the present disclosure, referring to FIGS. 6 to 10, an arranged location of the tray 300 may be changed in real time according to user input.
  • FIG. 11 illustrates a screen example of a slide-out, that is, a hidden state in a state in which the tray 300 is arranged at a lower frame as shown in FIG. 10.
• Referring to FIG. 11, if the tray 300 is slid-out, the tray 300 is not displayed on the screen; only the handle item 350 of the tray 300 may be displayed. In the present disclosure, a slide-out of the tray 300 is achieved by a user input using the handle item 350, or the tray 300 may be automatically slid-out when no user input occurs for a predetermined time in the slide-in state. When a specific execution icon 400 is selectively moved from the tray 300 to the screen according to the user input, the tray 300 may also be automatically slid-out.
• Further, when the user touches the handle item 350 and moves (e.g., drags or flicks) it in an inner direction of the screen in a state in which the tray 300 is slid-out, the tray 300 may be slid-in.
• FIG. 12 illustrates a screen example when a screen of a landscape mode is displayed according to a rotation of the touch device from a screen display of a portrait mode as illustrated in FIGS. 6, 7, 8, 9, 10, and 11, according to embodiments of the present disclosure. When the touch device is switched from the landscape mode to the portrait mode or from the portrait mode to the landscape mode, the tray 300 may be arranged and provided at a location corresponding to the direction in which it was arranged in the previous mode. For example, when the touch device switches from the landscape mode to the portrait mode in a state in which the tray 300 is arranged at a left frame from the user's viewpoint (a left edge of the landscape mode), the tray 300 may be automatically arranged and provided at a left frame from the user's viewpoint (a left edge of the portrait mode). That is, regardless of the mode switch, the tray 300 may be arranged and provided at the same location based on the user's viewpoint.
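Keeping the tray on the same edge from the user's viewpoint across a mode switch can be sketched as below. The tray's edge is stored relative to the user's viewpoint, so rotation leaves it unchanged; only the device-coordinate edge needs remapping. The clockwise edge ordering and the rotation convention are assumptions for the example.

```python
EDGES = ["top", "right", "bottom", "left"]  # clockwise order (assumed)

def device_edge(user_edge, rotation_deg):
    """Map a user-viewpoint edge to the device-coordinate edge for a
    given screen rotation (0, 90, 180, or 270 degrees clockwise)."""
    steps = (rotation_deg // 90) % 4
    return EDGES[(EDGES.index(user_edge) + steps) % 4]
```

With this model, a tray stored as "left" in user space stays "left" to the user after any rotation, while the device redraws it along a different hardware edge.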
  • Referring to FIG. 12, screens of respective applications of split execution regions (windows) are rotated and provided according to a mode switch, and the window size split by the separator 200 may be maintained in accordance with a previous state.
  • FIGS. 13, 14, 15, 16, and 17 are diagrams illustrating examples of an operation screen operating a plurality of applications in a multi-window environment according to an embodiment of the present disclosure.
  • Referring to FIGS. 13, 14, 15, 16, and 17, FIG. 13 illustrates a screen example of a touch device when the touch device executes one application (e.g., Internet application) as a full screen. As shown in FIG. 13, the tray 300 is activated, slid-out, and hidden on the screen so that only the handle item 350 is displayed on the screen.
• The user may select (e.g., touch & drag) the handle item 350 in a state in which the Internet application is displayed to slide-in the tray 300 on the screen as shown in FIG. 14. When the user input with respect to the handle item 350 is detected in a state in which the tray 300 is slid-out, the touch device displays a screen as shown in FIG. 14. That is, the screen of the touch device illustrated in FIG. 13 is switched according to the user input as illustrated in FIG. 14.
• The user may select an execution icon 410 of an application to be additionally executed in the multi-window environment from among the application execution icons 400 previously registered in the tray 300, and input an event moving it on the screen in a state in which the tray 300 is displayed. For example, the user selects (i.e., touches) an execution icon 410 capable of executing a map application in the tray 300, and inputs an event moving (i.e., dragging) the execution icon into the screen region currently displaying the Internet application while the touch is maintained.
• Then, the touch device displays a state in which the execution icon 410 is moved into the screen in response to the user input as shown in FIG. 15. In this case, the touch device confirms the region in which the execution icon 410 is located and the split scheme as illustrated in FIG. 15, and outputs to the user a feedback for the execution region in which the application of the execution icon 410 is to be executed (illustrated by the hatched box in FIG. 15). The feedback may be expressed by various schemes which may be intuitively recognized by the user, such as focusing the window in which the execution icon 410 is located among the windows of the split execution regions, highlighting and displaying only the corresponding window, or changing a color of the corresponding window.
• When an execution icon 410 in the tray 300 enters the screen according to the user input, the UI or GUI may provide a fade-out effect such that the space in which the execution icon 410 was located in the tray 300 remains blank. Further, when the execution icon 410 is separated from the tray 300 and enters the screen, the tray 300 may be slid-out. That is, the screen of the touch device illustrated in FIG. 15 may be switched as illustrated in FIG. 16 according to the user input.
• The blank processing of the present disclosure is provided for intuitive recognition by the user. When the tray 300 is slid-out, that is, when FIG. 15 is switched to FIG. 16, the space in the tray 300 blanked according to separation of the execution icon may be restored to its original shape. That is, as illustrated in the screen example of FIG. 18 to be described later, the space in which the execution icon 410 was located may be provided in a state in which the corresponding icon is present.
• Further, in the case of FIGS. 15 and 16, the multi-window environment may be split into two execution regions having two windows, an upper window and a lower window. In addition, FIG. 15 illustrates a case where the current location of the execution icon 410 is in the upper window according to a user input, and where the lower window is focused when the execution icon 410 is moved to a lower side of the screen in a state in which the touch input on the execution icon 410 is maintained.
  • Referring to FIG. 16, the user may move the execution icon 410 to a lower side of the screen in a state in which a touch input to the execution icon 410 is maintained, and input an event of releasing a touch input to the execution icon 410 in the lower window. For example, when the lower window is focused and displayed in a state in which the execution icon 410 is dragged and moved to the lower window, the user may release (i.e., drag & drop) a touch input to the execution icon 410.
• Accordingly, referring to FIG. 17, the touch device executes the application (i.e., the map application) associated with the execution icon 410 in response to the user input and displays an execution screen of the application on the lower window. In this case, if a previous application, such as the Internet application, is executed as a full screen and execution of an additional application, such as the map application, is detected, the touch device separates the full screen into two split execution regions through the separator 200 to form two separate windows. Further, the touch device displays a screen of the additional application (i.e., the map application) through a window (e.g., the lower window) of the execution region in which the execution icon 410 is located, and displays a screen of the previous application (i.e., the Internet application) through a window (e.g., the upper window) of the other execution region.
  • In this case, upon execution of the additional application, the touch device displays a screen of a suitable size corresponding to a window (e.g., a lower window) size of an execution region in which the additional application is executed. Further, the touch device displays a screen of the previous application as a full screen or a partial screen in a window (e.g., an upper window) of a split execution region according to a characteristic of a previous application, and displays a screen of the additional application in a window (lower window) of another split execution region as a full screen or a partial screen upon splitting the screen.
  • For example, when the previous application and the additional application are each an application capable of playing content, such as a video, the touch device may change to a screen of a suitable size corresponding to a window (e.g., an upper window and a lower window) of a split execution region and display a play screen in a corresponding window as a full screen. When the previous application and the additional application are each an application capable of displaying a text or a list, such as an Internet application, the touch device may display only a partial screen corresponding to a size of a corresponding window (i.e., upper window, lower window) of the split execution region.
• As illustrated in the screen examples of FIGS. 13, 14, 15, 16, and 17, according to embodiments of the present disclosure, when the touch device executes an application, an execution screen of a first application may be displayed as a full screen. Further, the touch device may receive an execution event input (e.g., a user input which selects an execution icon 400 from the tray 300 and moves it onto the screen) for executing a second application from the user while displaying the first application as a full screen. In this case, while the execution event is moved within the screen without being released, the touch device may output feedback with respect to the window of the location to which the execution event is moved (i.e., the location to which the execution icon 400 is being moved (i.e., dragged) according to the user input). Further, when the execution event is released in a specific window to which it has been moved (e.g., when the user drops an execution icon 400 dragged into a region of the specific window after a selection thereof), a multi-window may be configured according to a pre-set split scheme, and the screens of the first application and the second application may be independently displayed through the respective split windows.
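The drag-and-drop launch flow above can be sketched as follows. This is a minimal model, assuming a dictionary mapping window names to running applications; the names (`handle_drop`, `"full"`, `"upper"`, `"lower"`) are hypothetical, and backgrounding of a replaced application is not modeled here.

```python
def handle_drop(windows, drop_window, new_app):
    """When the execution icon is dropped on a window, split the full
    screen into two regions if needed and assign the new application
    to the chosen window."""
    if len(windows) == 1:                 # full-screen state: split in two
        prev_app = windows["full"]
        windows = {"upper": prev_app, "lower": None}
    windows[drop_window] = new_app        # new app takes the chosen window
    return windows
```

For instance, dropping a map icon on the lower half of a full-screen Internet application yields an upper Internet window and a lower map window, matching FIGS. 16 and 17.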
  • FIGS. 18, 19, 20, 21, 22, and 23 are diagrams illustrating examples of operating a plurality of applications in a multi-window environment according to an embodiment of the present disclosure.
  • Referring to FIGS. 18, 19, 20, 21, 22, and 23, FIG. 18 illustrates a screen example of a touch device when the tray 300 is slid-in according to the user input using a handle item 350 in a state in which the touch device displays screens of different applications through each window of two split execution regions as illustrated in FIG. 17.
• The user may select an execution icon 430 of an application (e.g., a note application) to be additionally executed from among the execution icons 400 previously registered in the tray 300, in accordance with the foregoing operation, and input an event moving it on the screen as illustrated in FIG. 19.
• Accordingly, the touch device moves the execution icon 430 into the screen in response to the user input as illustrated in FIG. 19, and outputs to the user a feedback for the execution region in which the execution icon 430 is to be executed at a corresponding location according to the movement. A slide-out operation of the tray 300 according to the movement of the execution icon 430 and an execution operation of the application (e.g., a note application) of the execution icon 430 correspond to the foregoing operation. In this case, FIG. 19 illustrates a case where the execution icon 430 is moved to an upper window of the screen and the touch input is released (i.e., drag & drop).
• Referring to FIG. 20, the touch device executes the application (e.g., a note application) of the execution icon 430 in response to the user input and displays an execution screen of the application on the upper window. In this case, the touch device processes the application (e.g., the Internet application) previously executed through the upper window in the background (not displayed), and displays a screen of the additional application (e.g., the note application) whose execution is newly requested through the upper window. Further, the touch device may continuously execute the application (e.g., the map application) allocated to the lower window and continuously display a screen (e.g., a currently progressing screen) according to the execution state through the lower window.
• In this manner, as illustrated in the screen examples of FIGS. 18, 19, and 20, according to embodiments of the present disclosure, the touch device may receive a user input for executing an additional application while displaying screens of a plurality of applications through the multi-window. Accordingly, the touch device may execute the additional application through the window selected by the user for executing the additional application. Upon executing the additional application, the application previously executed through the selected window may be processed in the background, and the additional application screen may be displayed through the selected window.
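The per-window backgrounding described above can be sketched as a stack kept for each window: the topmost application is displayed and the rest continue running in the background. The class and property names are assumptions for illustration.

```python
class Window:
    """Each split window keeps a stack of applications: the topmost
    app is displayed, the rest run in the background."""
    def __init__(self, app):
        self.stack = [app]

    def launch(self, app):
        self.stack.append(app)   # previous app moves to the background

    @property
    def displayed(self):
        return self.stack[-1]    # app currently shown in this window

    @property
    def background(self):
        return self.stack[:-1]   # apps still running but not displayed
```

Launching the note application on the upper window would push the Internet application onto that window's background while the lower window's map application is unaffected.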
  • The user may change the window size for two split execution regions through the separator 200 as illustrated in FIG. 20. That is, FIGS. 21, 22, and 23 illustrate an operation of changing the window size according to the user input in a state in which a window of split execution regions of the touch device is displayed.
  • The user may input an event to select, as illustrated in FIG. 21, the separator 200 in a screen like FIG. 20 and to move the selected separator 200 in a specific direction (e.g., upward or downward). For example, the user may input an event which touches the separator 200 as illustrated in FIG. 21 and drags the separator 200 to a lower direction of the screen in a state in which the touch is maintained.
• Accordingly, the touch device displays a moved state of the separator 200 in response to the user input as illustrated in FIG. 21. In this case, the touch device may change and display only the moving state of the separator 200 according to the user input while maintaining the screen of the application displayed through each window in its current state as shown in FIG. 21. However, according to embodiments of the present disclosure, the touch device may adaptively change and display the screen of an application according to the window size changed when the separator 200 is moved according to the user input, through a window size control scheme.
• The user may input an event which moves the separator 200 to correspond to the size ratio of each window to be adjusted, and then releases the touch input on the separator 200. For example, the user may drag the separator 200 and release (i.e., drag & drop) the touch input on the separator 200 in a state in which the separator 200 is moved to a location in the lower window as illustrated in FIG. 21.
• Accordingly, the touch device changes and displays the window sizes according to the movement of the separator 200 in response to the user input as shown in FIG. 22. In this case, the touch device changes the display state of the screen of the application allocated to each window (e.g., the upper window and the lower window) according to the variation in the window size. For example, as shown in FIG. 22, previously hidden contents may be displayed on the screen of the application displayed on the upper window according to the increase of its window size, and the screen of the application displayed on the lower window may be provided in a state in which the displayed region is reduced according to the reduction of its window size.
• FIG. 23 illustrates the opposite case of FIG. 22, that is, a screen example in a state in which the separator 200 is moved in an upper direction of the screen according to a user input, so that the size of the upper window is reduced and the size of the lower window is enlarged.
• FIGS. 24, 25, 26, 27, 28, and 29 are diagrams illustrating examples of operating a key pad for text input in a multi-window environment according to an embodiment of the present disclosure.
• Referring to FIGS. 24, 25, 26, 27, 28, and 29, the present disclosure provides a touch key pad (e.g., a floating key pad) 500 having a different form from a normal touch key pad for efficiently operating a multi-window environment. That is, according to embodiments of the present disclosure, a touch key pad operated in a normal mode providing a screen of one application as a full screen, and a floating key pad 500 operated in a multi-window mode providing screens of a plurality of applications as individual screens through screen split, may be differentially provided. In the present disclosure, the floating key pad 500 is not fixed to a pre-defined region like a normal touch key pad, but may be freely moved around the screen of the touch device in response to the user input. The floating key pad 500 of the present disclosure may be provided in the form of a pop-up when a text input is requested (e.g., a user input selecting a text input window of an application of a specific window) from an application of the specific window according to user selection from among the applications of the plurality of windows separated as a multi-window in the multi-window environment.
  • Referring to FIGS. 24, 25, 26, 27, 28 and 29, FIG. 24 illustrates a screen example of a touch device in a state in which the touch device displays a screen of different applications through each window of two split execution regions.
• Referring to FIG. 25, the user may display the floating key pad 500 at a predetermined region (e.g., a pre-defined region or a previously executed region) according to a user input in a state in which screens of a plurality of applications according to a multi-window environment are simultaneously displayed. For example, the user may input a menu operation of the touch device, a function key selection for executing the floating key pad 500, or a touch event (e.g., a gesture having a specific pattern such as figures and characters) set to execute the floating key pad 500. More particularly, in the present disclosure, when a text input window in which a text input is possible is selected on an application screen executed on a window of each split execution region, the floating key pad 500 may be automatically executed and provided on the screen.
• Referring to FIG. 25, the touch device activates the floating key pad 500 at one region of the screen operated as the multi-window. For example, when the floating key pad 500 is activated, it may be provided in a form in which the bottom end of the floating key pad 500 adheres to a lower frame. In the present disclosure, the floating key pad 500 has a separate layer and may be provided in an overlay form on the screens according to the multi-window.
• The user may input a movement event (e.g., a touch & drag) moving the floating key pad 500 to another region of the screen as illustrated in FIG. 26, in a state in which the floating key pad 500 is displayed on the screen. For example, the user may input a movement event which touches and drags a part of the floating key pad 500 to another region (e.g., upward) of the screen. Accordingly, the touch device may provide a UI or GUI separating the floating key pad 500 from the lower frame according to the movement event and moving the floating key pad 500 along with the drag of the user.
  • The user may move the floating key pad 500 to a desired location and release the input movement event as shown in FIG. 27. That is, the user may release a drag input for moving the floating key pad 500. Accordingly, the touch device may arrange and display the floating key pad 500 in a location in which the drag input is released.
• According to embodiments of the present disclosure, the user input may be achieved both on the respective windows of the split execution regions and on the floating key pad 500 in a state in which the floating key pad 500 is provided. In this case, a user input for the floating key pad 500 is received in the region that the floating key pad 500 occupies, and a user input for the corresponding window may be received in the remaining region.
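The input routing described above can be sketched as a hit test: a touch inside the floating key pad's bounds is consumed by the key pad, and any other touch is dispatched to the window under the point. The function name, the rectangle format, and the horizontal-separator assumption are illustrative only.

```python
def route_input(x, y, keypad_rect, separator_y):
    """Dispatch a touch point: the floating key pad consumes touches in
    its own region; everything else goes to the window under the point
    (assuming an upper/lower split at separator_y)."""
    kx, ky, kw, kh = keypad_rect          # (left, top, width, height)
    if kx <= x < kx + kw and ky <= y < ky + kh:
        return "keypad"
    return "upper" if y < separator_y else "lower"
```

Because the key pad floats on its own layer, moving it simply changes `keypad_rect` without affecting how the remaining screen area maps to the split windows.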
  • Referring to FIG. 27, the user may perform a text input using the floating key pad 500 in a state in which the floating key pad 500 is displayed. For example, it is assumed that the user inputs a text on a screen of an application executing on the upper window. In this case, the user selects the upper window (i.e., selects any one region (e.g., a text input window) in which a text input is possible from an application screen of an upper window), and selects and inputs a desired character button on the floating key pad 500.
  • Referring to FIGS. 27 and 28, the user selects a text input window 610 on a screen of an application executing through the upper window to implement a state in which the text input is possible. Further, the user may sequentially input respective buttons to which characters p, s, and y are allocated to input “psy” using the floating key pad 500. Accordingly, the touch device may input and display a corresponding character on the text input window 610 in response to the user input as illustrated in FIGS. 27 and 28.
• Referring to FIG. 28, the touch device may provide a result for a text (e.g., "psy") input into the text input window 610 of the application executing on the upper window, under the floating key pad 500 in the form of an underlay as illustrated in FIG. 28. For example, as in the example of FIG. 28, the text input into the text input window 610 may be provided through a recommendation region 620 of a new layer recommending a searched result corresponding to the text input into the text input window 610 while maintaining a current state. The recommendation region 620 may be provided in such a way that it overlies the screen of the application, and the floating key pad 500 overlies the recommendation region 620. That is, the floating key pad 500 may be disposed at the uppermost position and may maintain its current state.
• The text input to the text input window 610 may be input to the same layer as the application screen and may be directly provided thereon. For example, in a case of a text input window into which receiver information is input, like the mail application executed in the lower window, and unlike the example of FIG. 28, only an input result may be displayed through the text input window of the application screen without a separate new layer.
  • Referring to FIG. 28, the user may select any one recommended result in a state in which a recommendation region 620 is displayed on the floating key pad 500 as an underlay, or operate (i.e., command) search execution for a text input to the text input window 610. A corresponding result screen is illustrated in FIG. 29. That is, a screen of a touch device illustrated in FIG. 28 is switched as illustrated in FIG. 29 according to a user input.
• Referring to FIG. 29, after a text shown in the text input window 610 is input through the floating key pad 500 according to a user input, when function execution for a corresponding application (e.g., a search execution, a mail transmission execution, a memo storage execution, a message transmission execution, or the like) is input, the floating key pad 500 is removed from the screen, and a result of the execution may be provided through the corresponding window of the application executing the function. For example, referring to FIGS. 28 and 29, a search result for "psy" input from the application of the upper window may be provided through the upper window.
  • FIG. 30 is a diagram illustrating an example of operating a plurality of applications in a multi-window environment according to an embodiment of the present disclosure.
  • Referring to FIG. 30, FIG. 30 illustrates a screen example when specific setting for respective windows is changed according to the user input in a state in which the touch device displays screens of different applications through respective windows of two split execution regions.
• According to embodiments of the present disclosure, a function may be independently set for every split window. That is, a function suitable for a characteristic of the execution application of a window selected by the user from among the windows of the split execution regions may be changed. For example, the user may select a left window from among the windows of the split execution regions, and operate a pre-set function (e.g., operate a function key provided to control a volume). Accordingly, the touch device may discriminate a characteristic of the application executing through the left window. Further, the touch device may display a volume setting item 700 according to the characteristic of the discriminated application (e.g., a media playing capability, like a video playing capability), and may feed back a setting value changed according to the user input. In this case, when the user defines a setting of screen brightness with respect to the media characteristic, a screen brightness setting item (not shown) instead of the volume setting item 700 may be provided on the screen, and a feedback in which the brightness of the screen is changed according to the user input may be provided. Further, the setting for an application executing on the right window may be changed in accordance with the foregoing scheme.
  • As described above, when a function setting is changed according to a user input on a specific window, an independent setting may be achieved for each window. For example, when a volume or screen brightness is set on the left window, a setting value may be reflected and displayed only for the left window.
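The independent per-window settings described above can be sketched as a nested mapping keyed by window, so that a value set for one window never affects another. The helper name and setting keys are assumptions for illustration.

```python
def set_window_option(settings, window, key, value):
    """Store a setting (e.g., volume or brightness) for one window only,
    leaving the settings of the other windows untouched."""
    settings.setdefault(window, {})[key] = value
    return settings

# e.g., set volume on the left window, brightness on the right window
s = set_window_option({}, "left", "volume", 7)
s = set_window_option(s, "right", "brightness", 40)
```

Here the volume value is reflected only for the left window; the right window carries its own, independent settings.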
  • FIGS. 31, 32, 33, and 34 are diagrams illustrating examples of an operation screen providing information for a plurality of applications executed according to a multi-window environment in a touch device according to an embodiment of the present disclosure.
• Referring to FIGS. 31, 32, 33, and 34, FIG. 31 illustrates a screen example of a touch device when the touch device displays a list of a plurality of applications executed according to a multi-window environment. Referring to FIG. 31, a list of the applications executed in the multi-window environment by the user may be provided through a full screen according to user selection. The user may input a menu operation of the touch device, a function key selection for executing the list, or a touch event (e.g., a gesture having a specific pattern such as figures or characters) set to execute the list, in a state in which a function by the multi-window is operating or the screen is converted into an idle screen. Accordingly, as illustrated in FIG. 31, the touch device may display a list of the applications currently executed (including background execution) through a UI or GUI set as in FIG. 31.
  • Referring to FIG. 31, applications which are executed by the user in the multi-window environment and which currently maintain their execution may be provided in a specific arrangement format. For example, the applications may be arranged and provided in an execution order or in a random order. FIG. 31 illustrates a list including an E-mail application 910, a Video Player application 920, a Note application 930, a Map application 940, and a Play Store application 950.
  • Referring to FIGS. 32 and 33, although not displayed on the initial list screen of FIG. 31, the remaining hidden applications (e.g., a Gmail application 960, a Wi-Fi application 970, and a Phone application 980) may be spread and displayed according to a scroll (or navigation) control of the user. That is, the list illustrated in FIG. 31 includes other applications which are not displayed through the screen but are hidden. The number of applications included in the initial list may be suitably set, in consideration of the intuition of the user, according to the size of the screen of the touch device. When the number of executing applications is greater than the preset number, the excess applications may be hidden as illustrated in the examples of FIGS. 31 to 34. Information for the applications of the list may be provided in such a manner that the information display region of an application disposed at the lower side (e.g., the Video Player application 920) is allocated the most space, and the information display regions become gradually smaller in the upward direction. Accordingly, the uppermost application (e.g., the Play Store application 950) may display only a state bar capable of discriminating the corresponding application.
  • Further, as shown in FIG. 31, the application (e.g., the E-mail application 910) disposed at the lowermost region so as to display only a state bar may correspond to at least one application which is most recently executed by the user or which is displayed on the screen just before execution of the list. In this manner, the application disposed at the lowermost region may be fixed and provided at the corresponding region regardless of the scroll control of the user, and this fixed arrangement may be disabled according to a user setting.
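The list layout described above (a limited number of visible applications, hidden excess applications, and the most recently executed application pinned at the lowermost region) can be sketched as follows. The class and method names are assumptions for illustration; the sketch only models the visibility and pinning behavior, not the rendered state bars.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of the execution-application list: at most
// `visibleCount` scrollable applications are shown, the rest stay
// hidden until scrolled, and the most recently executed application
// (the last element) stays pinned at the bottom regardless of scrolling.
class ExecutionAppList {
    private final List<String> apps;   // pinned app is the last element
    private final int visibleCount;
    private int scrollOffset = 0;

    ExecutionAppList(List<String> apps, int visibleCount) {
        this.apps = apps;
        this.visibleCount = visibleCount;
    }

    String pinned() { return apps.get(apps.size() - 1); }

    void scroll(int steps) {
        int maxOffset = Math.max(0, apps.size() - 1 - visibleCount);
        scrollOffset = Math.max(0, Math.min(maxOffset, scrollOffset + steps));
    }

    // Applications currently visible, with the pinned app always last.
    List<String> visible() {
        List<String> out = new ArrayList<>();
        int end = Math.min(apps.size() - 1, scrollOffset + visibleCount);
        for (int i = scrollOffset; i < end; i++) out.add(apps.get(i));
        out.add(pinned());
        return out;
    }
}
```

Scrolling shifts which hidden applications are revealed, while the pinned application never leaves the visible set.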
  • Further, a list screen for the execution applications of the present disclosure may include a command region 800 for supporting a variety of command types (e.g., an application scroll, a termination of application execution, an application search, or the like) for the execution applications in the list. More particularly, the list screen may include a scroll item 850 for controlling a scroll (or a spread) of the applications in the list. That is, the user may scroll the applications in the list through a user input using the scroll item 850. The touch device may provide a UI or GUI in which the overlapped information of the applications is spread according to the user input scheme for the scroll item 850. In this case, when the user input is a single input, the touch device may control (e.g., spread) one scroll step in response to the corresponding input. When the user input maintains an input (e.g., touched) state of the scroll item 850, the touch device may continuously control an automatic scroll while the user input is maintained.
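The two input schemes for the scroll item 850 can be sketched as a small state model: a single touch produces one scroll step, and a maintained touch keeps producing steps until released. The tick-based granularity and all names here are assumptions for illustration.

```java
// Illustrative sketch of the scroll item 850's two input schemes:
// one step per discrete touch, plus automatic scrolling (one step per
// tick in this sketch) for as long as the touch is maintained.
class ScrollItemInput {
    private boolean touched = false;
    private int steps = 0;

    void touchDown() { touched = true; steps++; } // single input: one step
    void tick()      { if (touched) steps++; }    // auto-scroll while held
    void touchUp()   { touched = false; }         // release stops the scroll

    int totalSteps() { return steps; }
}
```

Ticks arriving after the touch is released produce no further scrolling, matching the "while the user input is maintained" condition above.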
  • The user may select (touch) the scroll item 850 and maintain the input in a state in which the list is displayed as illustrated in FIG. 31. Accordingly, when a user input for the scroll item 850 is detected, the touch device displays a screen in which the information of the applications is spread from top to bottom as illustrated in FIGS. 32, 33, and 34. That is, the list screen of the touch device illustrated in FIG. 31 is switched as shown in FIGS. 32, 33, and 34 according to the user input.
  • Referring to FIGS. 32, 33, and 34, a UI or GUI may be provided in such a manner that the Video Player application 920 is pulled downward in response to the user input using the scroll item 850 and disappears from the screen, while the information of the other applications disposed in the upper side is gradually spread and sequentially pulled downward. Further, when the list is scrolled according to the scroll control of the user input, referring to FIGS. 33 and 34, the other hidden applications (e.g., the Gmail application 960 (FIG. 33), the Wi-Fi application 970 (FIG. 34), the Phone application 980 (FIG. 34), or the like) may be sequentially displayed on the screen. In this case, as illustrated in FIGS. 32, 33, and 34, the E-mail application 910 may be fixed at the corresponding location to be continuously displayed.
  • As illustrated in FIGS. 31, 33, and 34, the user may select an item of a specific application in a state in which the list is displayed, or during the scroll control. Accordingly, the touch device may display the selected application as a full screen. Referring to FIGS. 31, 33, and 34, when the user input through the scroll item 850 continues until the scroll for all the applications included in the list is achieved, that is, when all the applications in the list are spread and pulled downward, the touch device may automatically display the most recently executed application (i.e., the application (e.g., the E-mail application 910) fixed and arranged at the lowermost side) as a full screen.
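The terminal behavior above (an explicit selection wins; otherwise, exhausting the scroll auto-selects the pinned, most recently executed application) can be sketched as a single decision function. The names and list convention (pinned application last) are assumptions for illustration.

```java
// Illustrative sketch of which application is displayed as a full
// screen: a tapped item if there is one; the pinned (most recently
// executed) application once every scrollable item has been pulled
// downward; otherwise none (the list stays on screen).
class FullScreenPolicy {
    // apps: scrollable applications with the pinned app as the last element.
    static String appToDisplay(java.util.List<String> apps,
                               int scrolledCount, String selected) {
        if (selected != null) return selected;         // user tapped an item
        if (scrolledCount >= apps.size() - 1)
            return apps.get(apps.size() - 1);          // list exhausted: pinned app
        return null;                                   // keep showing the list
    }
}
```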
  • FIG. 35 is a flowchart illustrating a method of operating a multi-window environment in a touch device according to an embodiment of the present disclosure. More particularly, FIG. 35 illustrates an example of switching to the multi-window environment during an operation of one window.
  • Referring to FIG. 35, a controller 170 executes an application (hereinafter, referred to as a “first application”) corresponding to user selection at operation 3501, and controls screen display for the executing first application at operation 3503. In this case, the controller 170 controls display of a full screen of the first application through one window.
  • The controller 170 receives an execution standby event input for executing an additional application (e.g., a “second application”) in a state in which the first application is executed at operation 3505, and determines a preset multi-window split scheme at operation 3507. In the present disclosure, the execution standby event may refer to an event for additionally executing and displaying another application in a multi-window environment in a state in which the user executes and displays any one application. More particularly, the execution standby event may refer to an event in which the user activates (e.g., slides in) the tray 300 on the screen, selects an execution icon of an application to be additionally executed from the activated tray 300, and moves (e.g., drags) the execution icon into the screen.
  • When the execution icon is moved from the tray 300 and enters the screen, the controller 170 traces and determines the moved location of the execution icon at operation 3509. The controller 170 may confirm the window of the region where the execution icon is currently located through a location trace of the execution icon.
  • The controller 170 controls a feedback output for the window of an execution region in which the additional application is able to be executed, in response to the determined split scheme and the location of the execution icon, at operation 3511. That is, the controller 170 may control a feedback output for the specific window of the location over which the execution icon is being dragged while the execution icon is moved on the full screen according to the drag. For example, the controller 170 may focus and display the window of the location to which the execution icon is moved.
  • If an execution event of the second application by the execution icon is input at operation 3513, the controller 170 splits the screen at operation 3515 and controls execution of the second application at operation 3517. The execution event may be an event of dropping the execution icon in one region of the screen. The controller 170 identifies the region (e.g., the region where the execution icon is dragged and dropped (i.e., a drag & drop)) where the execution icon is moved to generate the execution event, splits the full screen for the first application, and determines the region in which the execution event is generated, from among the split regions, as one window (i.e., an execution region) for displaying a screen of the second application.
  • Upon executing the second application, the controller 170 controls to display a screen having a suitable size corresponding to the window size of the split execution region (i.e., the execution region in which the second application is executed) at operation 3519. Here, the controller 170 may display the screen of the first application in a window (e.g., an upper window) of one split execution region as a full screen or a partial screen, and display the screen of the second application in a window (e.g., a lower window) of another split execution region as a full screen or a partial screen. For example, when the first application or the second application is an application having a capability of playing media, such as a video, the controller 170 may change into a screen of a suitable size pertinent to the corresponding window size of the split execution region, and display a playing screen in the window as the full screen. When the first application and the second application are applications having a characteristic of a text or a list, such as an Internet application, the controller 170 may display a partial screen in response to the corresponding window size of the split execution region. That is, according to embodiments of the present disclosure, the screen of the first application and the screen of the second application may be independently displayed on corresponding windows by implementing the multi-window environment.
  • That is, if an input where the execution icon is dropped on a specific window during drag is received, the controller 170 may execute the second application in response to a drop input of the execution icon. In this case, when executing the second application, the controller 170 may split the full screen into windows for displaying screens of the first application and the second application. Further, the controller 170 may display a screen of the second application through the specific window in which the execution icon is dropped, and display a screen of the first application through another split window.
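The FIG. 35 flow described above (full-screen first application, drag feedback, then a drop that splits the screen and launches the second application) can be sketched as a small controller model. The region names ("full", "upper", "lower") and all identifiers are assumptions for illustration; the real disclosure's split schemes and window geometry are not modeled here.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of the FIG. 35 flow: the first application fills
// the screen, a dragged execution icon focuses the window beneath it,
// and dropping the icon splits the screen, giving the drop region to
// the second application and the other region to the first.
class MultiWindowFlow {
    private final String firstApp;
    final Map<String, String> windows = new HashMap<>();
    String focusedRegion;

    MultiWindowFlow(String firstApp) {
        this.firstApp = firstApp;
        windows.put("full", firstApp);       // operation 3503: full screen
    }

    // Operations 3509-3511: trace the icon and feedback (focus) its window.
    void onDrag(String region) { focusedRegion = region; }

    // Operations 3513-3519: the drop generates the execution event,
    // splits the full screen, and assigns the windows.
    void onDrop(String secondApp, String dropRegion) {
        String other = dropRegion.equals("lower") ? "upper" : "lower";
        windows.clear();
        windows.put(dropRegion, secondApp);
        windows.put(other, firstApp);
        focusedRegion = null;
    }
}
```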
  • FIG. 36 is a flowchart illustrating a method of operating a multi-window environment in a touch device according to an embodiment of the present disclosure. In particular, FIG. 36 illustrates an operation example in which an additional application is executed while operating the multi-window.
  • Referring to FIG. 36, when displaying screens of a plurality of applications by a multi-window at operation 3601, the controller 170 may receive an input for selecting an additional application to execute at operation 3603. That is, according to embodiments of the present disclosure, another application may be further executed while independently displaying screens of a plurality of different applications through respective split windows in the multi-window environment.
  • If an input for selecting an additional application is received in the multi-window environment, the controller 170 determines the split scheme and the currently executed windows (e.g., the “execution windows”) at operation 3605. For example, the controller 170 may confirm, through pre-defined split information, how many window splits are supported for the multi-window environment, and determine into how many windows the screen is currently split and operated.
  • The controller 170 compares the number of execution windows with the split information to determine whether the number of execution windows corresponds to the maximum value set in the pre-defined split information at operation 3607. For example, the controller 170 may determine whether the pre-defined split information is 3 and the number of currently executed windows is 3. If the number of execution windows does not correspond to the maximum value set in the split information (i.e., NO of operation 3607), the controller 170 controls execution of a corresponding operation at operation 3609.
  • For example, as described above, the controller 170 may control an additional screen split for executing the additional application, execution of the additional application according thereto, and a screen display for the plurality of applications. This may correspond to an operation for controlling execution of the additional application through a screen split on the full screen as illustrated in the example of FIG. 35.
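The branch at operation 3607 reduces to one comparison: below the pre-defined split maximum, a further split is performed; at the maximum, the additional application must take over an existing window instead. The enum and method names below are assumptions for illustration.

```java
// Illustrative sketch of the decision at operation 3607: split further
// while capacity remains, otherwise replace an application in an
// existing execution window (the previous app moves to the background).
class SplitPolicy {
    enum Strategy { SPLIT_FURTHER, REPLACE_IN_WINDOW }

    static Strategy forAdditionalApp(int executionWindows, int splitMax) {
        return executionWindows < splitMax ? Strategy.SPLIT_FURTHER
                                           : Strategy.REPLACE_IN_WINDOW;
    }
}
```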
  • If the number of execution windows corresponds to the maximum value set to the split information (i.e., YES of operation 3607), the controller 170 traces and determines a location for a user input selecting an execution region for executing the additional application at operation 3611. For example, when the user selects an execution icon of an application to be additionally executed from the tray 300 and moves the selected icon into the screen, the controller 170 may trace and determine a moved location of the execution icon.
  • The controller 170 provides a feedback for an execution region in which the additional application is able to be executed, in response to the determined location, at operation 3613. For example, when the execution icon is moved from the tray 300 and enters the screen, the controller 170 focuses and displays the window of the location to which the execution icon is moved.
  • If an execution event for the additional application is input at operation 3615, the controller 170 executes the additional application and controls processing of a previous application executed in a corresponding execution region as a background at operation 3617.
  • For example, when executing the additional application in response to the user input, the controller 170 may process the application previously executed through the window selected to execute the additional application as the background, and may display the screen of the additional application, which is requested to be executed, through the corresponding window. That is, the controller 170 may process the previous application allocated to the corresponding window as a background so as to continuously execute that application, and may simply replace the screen displayed on the corresponding window.
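The replace-in-window behavior at operation 3617 can be sketched as follows: the window's previous application keeps running (as a background application) and only the displayed screen changes. All names are assumptions for illustration.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of operation 3617: the application previously
// displayed in the selected window continues executing in the
// background, and only the screen shown in that window is replaced.
class ExecutionWindow {
    private String foregroundApp;
    private final List<String> backgroundApps = new ArrayList<>();

    ExecutionWindow(String initialApp) { foregroundApp = initialApp; }

    void launchAdditional(String app) {
        backgroundApps.add(foregroundApp); // previous app keeps executing
        foregroundApp = app;               // displayed screen is replaced
    }

    String foreground() { return foregroundApp; }
    List<String> background() { return backgroundApps; }
}
```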
  • Upon executing the additional application, the controller 170 may control a screen display corresponding to a window size of an execution region in which the additional application is executed at operation 3619. For example, the controller 170 may display a screen of the additional application in a window of a corresponding execution region as a full screen or a partial screen.
  • Here, when the additional application is an application having a capability of playing media, such as a video, the controller 170 changes into a screen having a suitable size corresponding to the window size of the corresponding execution region, and may display a playing screen in the window as a full screen. When the additional application is an application having a capability of processing a text or a list, e.g., an Internet application, the controller 170 may display a partial screen corresponding to the window size of the corresponding execution region.
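The display-mode decision at operation 3619 can be sketched as a mapping from application characteristic to display mode. The two-category split and all names are assumptions for illustration; the disclosure does not enumerate application kinds exhaustively.

```java
// Illustrative sketch of operation 3619: a media-playing application
// (e.g., video) fills its split window, while a text- or list-style
// application (e.g., an Internet app) renders a partial screen sized
// to the window.
class DisplayPolicy {
    enum AppKind { MEDIA, TEXT_OR_LIST }
    enum Mode { FULL_IN_WINDOW, PARTIAL_IN_WINDOW }

    static Mode modeFor(AppKind kind) {
        return kind == AppKind.MEDIA ? Mode.FULL_IN_WINDOW
                                     : Mode.PARTIAL_IN_WINDOW;
    }
}
```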
  • The foregoing various embodiments of the present disclosure may be implemented in an executable program command form by various computer means and be recorded in a computer readable recording medium. In this case, the computer readable recording medium may include a program command, a data file, and a data structure individually or a combination thereof. In the meantime, the program command recorded in the recording medium may be specially designed or configured for the present disclosure, or may be known and available to a person having ordinary skill in the computer software field. The computer readable recording medium includes Magnetic Media such as a hard disk, a floppy disk, or a magnetic tape, Optical Media such as a Compact Disc Read Only Memory (CD-ROM) or a Digital Versatile Disc (DVD), Magneto-Optical Media such as a floptical disk, and a hardware device, such as a ROM, a RAM, or a flash memory, storing and executing program commands. Further, the program command includes a machine language code created by a compiler and a high-level language code executable by a computer using an interpreter. The foregoing hardware device may be configured to be operated as at least one software module to perform an operation of an embodiment of the present disclosure, and vice versa.
  • Accordingly, embodiments provide a program comprising code for implementing apparatus or a method as claimed in any one of the claims of this specification and a machine-readable storage storing such a program. Still further, such programs may be conveyed electronically via any medium, for example a communication signal carried over a wired or wireless connection and embodiments suitably encompass the same.
  • As described above, according to the method and the apparatus for providing a multi-window in a touch device of the present disclosure, the user may simultaneously use a plurality of applications, in a determined split screen or in a free style, through a simple method. For example, in order to split the screen to use a multi-window in a state in which one application is executed as a full screen, the user drags an additional application from the tray and drops the application at a determined location or a free location, thereby simultaneously operating a plurality of applications.
  • Further, according to the present disclosure, the user may easily arrange and confirm a plurality of applications on one screen through a multi-window, and freely change each window of the multi-window to a desired layout, thereby relieving the burden and trouble of efficiently configuring a screen and operating a plurality of applications.
  • According to the present disclosure, large amounts of information and various user experiences may be provided to the user through the multi-window environment. Further, according to the present disclosure, the user may efficiently and simultaneously perform operations with respect to various applications in a multi-window environment on the small screen of the touch device. For example, the user may simultaneously perform other operations, such as the creation of messages and mail, while viewing and listening to a video on one screen of the touch device. Accordingly, according to the present disclosure, an optimal environment capable of supporting a multi-window environment in the touch device is implemented, so that convenience for the user can be improved, and the usability, convenience, and competitiveness of the touch device can be improved. The present disclosure may be simply implemented in various types of touch devices and various corresponding devices.
  • It will be appreciated from the following description that, in certain embodiments of the invention, features concerning the graphic design of user interfaces are combined with interaction steps or means to achieve a technical effect.
  • It will be appreciated from the following description that, in certain embodiments of the invention, graphic features concerning technical information (e.g. internal machine states) are utilised to achieve a technical effect.
  • Certain embodiments aim to achieve the technical effect of enhancing the precision of an input device.
  • Certain embodiments aim to achieve the technical effect of lowering a burden (e.g. a cognitive, operative, operational, operating, or manipulative burden) of a user when performing certain computer or device interactions.
  • Certain embodiments aim to achieve the technical effect of providing a more efficient man-machine (user-machine) interface.
  • While the present disclosure has been shown and described with reference to various embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present disclosure as defined by the appended claims and their equivalents.

Claims (35)

What is claimed is:
1. A method of executing an application in a touch device, the method comprising:
displaying an execution screen of a first application as a full screen;
receiving an input of an execution event for executing a second application;
configuring a multi-window in a split scheme when the execution event is released on a specific window; and
independently displaying screens of the first application and the second application through respective split windows.
2. The method of claim 1, wherein the execution event is an event for selecting an execution icon of the second application to be additionally executed from a tray and moving the selected execution icon into a screen.
3. The method of claim 1, further comprising outputting a feedback for a window corresponding to an updated location of an execution icon when an execution icon is moved and is not released.
4. The method of claim 3, wherein the outputting of the feedback comprises confirming a window of a region where the execution icon is moved and currently located through a location trace of the execution icon.
5. The method of claim 4, further comprising releasing the execution icon by dropping the execution icon on a window where the execution icon is currently located.
6. The method of claim 2, wherein the independently displaying of the screens comprises respectively displaying a screen corresponding to a size of a corresponding window in which the first application and the second application are executed.
7. The method of claim 1, further comprising:
displaying a screen of a plurality of applications through the multi-window;
receiving an input of an execution event for an additional application while displaying the screen of the plurality of applications;
executing the additional application through a window selected to execute the additional application; and
processing an application previously executed through the selected window as a background, and displaying a screen of the additional application through the selected window.
8. The method of claim 7, further comprising comparing the number of currently executed execution windows with a split information, and determining whether the number of execution windows corresponds to a value set to the split information when the input of the execution event for selecting the additional application is received.
9. The method of claim 2, wherein the tray is moved to another region in the screen according to a user input.
10. The method of claim 2, wherein a floating key pad is provided when a text input is requested from an application of a specific window during an operation by the multi-window.
11. The method of claim 10, wherein the floating key pad is moved to another region in the screen according to a user input.
12. The method of claim 11, wherein an input character by the floating key pad is input to a text input window provided from the application of the specific window and is displayed.
13. The method of claim 1, wherein respective windows of the multi-window are separated by a separator.
14. The method of claim 13, wherein the sizes of the respective windows are changed according to a movement of the separator.
15. A method of executing an application in a touch device, the method comprising:
executing a first application corresponding to user selection and displaying the application through one window as a full screen;
receiving a first event input for selecting and moving a second application when the first application is executed;
determining a multi-window split scheme and a region to which the first event is input;
outputting a feedback for a window in which the second application is able to be executed and the region to which the first event is input;
receiving a second event input of executing the second application;
configuring the multi-window in response to the second event input; and
independently displaying a screen of the first application and a screen of the second application respectively through corresponding windows separated by the multi-window.
16. The method of claim 15, wherein the first event comprises an event for selecting an execution icon of the second application to be additionally executed from a tray and moving the selected execution icon into a screen.
17. The method of claim 15, wherein the second event comprises moving the execution icon and releasing the execution icon from a current window when the first event is not released.
18. A method of executing an application in a touch device, the method comprising:
displaying an execution screen of a first application as a full screen;
sliding-in a tray including an execution icon of an application according to a user input when the first application is executed;
receiving an input for selecting an execution icon of a second application from the tray and dragging the selected execution icon into the full screen;
receiving an input for dropping the execution icon in a specific window while the execution icon is dragged;
executing the second application in response to the drop input of the execution icon;
splitting a full screen into windows for displaying screens of the first application and the second application; and
displaying a screen of the second application through the specific window in which the execution icon is dropped, and displaying the screen of the first application through another split window.
19. The method of claim 18, further comprising sliding-out the tray when the execution icon is selected and is moved to the full screen.
20. The method of claim 18, further comprising:
blanking a region to which the execution icon is allocated in the tray when the execution icon is selected and is moved to the full screen; and
restoring again the blanked region when the tray is slid-out.
21. The method of claim 18, further comprising outputting a feedback for the specific window into which the execution icon is dragged while the execution icon is moved on the full screen according to the drag.
22. The method of claim 18, further comprising:
popping-up a floating key pad when a text input is requested from an application of a specific window from among the split windows; and
inputting a text for the application of the specific window according to a user input using the floating key pad.
23. The method of claim 22, further comprising moving the floating key pad to another region in a screen to input a text for an application of another window.
24. The method of claim 18, wherein the windows for displaying the screens of the first application and the second application are split through a separator.
25. The method of claim 24, further comprising changing sizes of the windows according to a movement of the separator.
26. A touch device comprising:
a touch screen configured to display a screen interface of a multi-window environment, to display screens of a plurality of applications through a plurality of windows separated in the screen interface, and to receive an event input for operating the plurality of applications; and
a controller configured to control execution of the plurality of applications in the multi-window environment, and to control to independently display screens of at least two applications according to a user selection from among a plurality of executed applications through the plurality of windows.
27. The touch device of claim 26, wherein the controller receives an input of an execution event for executing a second application when an execution screen of a first application is displayed as a full screen, configures a multi-window according to a split scheme when the execution event is released from a specific window, and controls to independently display screens of the first application and the second application through respective split windows.
28. The touch device of claim 27, wherein the controller controls a feedback output for a window of a moved location while an execution icon is moved in a state in which the execution event is not released.
29. The touch device of claim 27, wherein, when execution of an additional application is received while displaying screens of the plurality of applications through the multi-window, the controller executes the additional application through a selected window for executing the additional application, processes an application previously executed through the selected window as a background, and controls to display a screen of the additional application through the selected window.
30. The touch device of claim 26, wherein the screen interface comprises an execution icon of an application and a tray moved to another region in a screen according to a user input.
31. The touch device of claim 26, wherein the screen interface comprises a floating key pad which is popped up when a text input is requested from an application of a specific window, and moved to another region in a screen according to a user input.
32. The touch device of claim 26, wherein the screen interface comprises a separator for separating respective windows according to the multi-window environment and changing sizes of the respective windows according to a user input.
33. The touch device of claim 32, wherein the controller determines a changing size of the respective windows according to movement of the separator.
34. A computer readable recording medium having recorded thereon a program for executing a process comprising receiving an input of an execution event for executing a second application when an execution screen of a first application is displayed as a full screen, configuring a multi-window in a split scheme when the execution event is released on a specific window, and independently displaying screens of the first application and the second application through respective split windows.
35. The recording medium of claim 34, wherein the program further comprises outputting a feedback for a window of a moved location when an execution icon is moved and when the execution icon is not released.
US14/035,266 2012-09-24 2013-09-24 Method and apparatus for providing multi-window in touch device Abandoned US20140089833A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/030,645 US11714520B2 (en) 2012-09-24 2020-09-24 Method and apparatus for providing multi-window in touch device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020120105898A KR101957173B1 (en) 2012-09-24 2012-09-24 Method and apparatus for providing multi-window at a touch device
KR10-2012-0105898 2012-09-24

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/030,645 Continuation US11714520B2 (en) 2012-09-24 2020-09-24 Method and apparatus for providing multi-window in touch device

Publications (1)

Publication Number Publication Date
US20140089833A1 true US20140089833A1 (en) 2014-03-27

Family

ID=49263142

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/035,266 Abandoned US20140089833A1 (en) 2012-09-24 2013-09-24 Method and apparatus for providing multi-window in touch device
US17/030,645 Active 2034-06-19 US11714520B2 (en) 2012-09-24 2020-09-24 Method and apparatus for providing multi-window in touch device

Family Applications After (1)

Application Number Title Priority Date Filing Date
US17/030,645 Active 2034-06-19 US11714520B2 (en) 2012-09-24 2020-09-24 Method and apparatus for providing multi-window in touch device

Country Status (7)

Country Link
US (2) US20140089833A1 (en)
EP (2) EP2725466B1 (en)
KR (1) KR101957173B1 (en)
CN (1) CN103677627B (en)
AU (1) AU2013318697B2 (en)
ES (1) ES2706010T3 (en)
WO (1) WO2014046525A1 (en)

USD742396S1 (en) * 2012-08-28 2015-11-03 General Electric Company Display screen with graphical user interface
US20150339804A1 (en) * 2014-05-26 2015-11-26 Samsung Electronics Co., Ltd. Electronic device and method for operating display
US20150365306A1 (en) * 2014-06-12 2015-12-17 Apple Inc. Systems and Methods for Multitasking on an Electronic Device with a Touch-Sensitive Display
US9229918B2 (en) 2010-12-23 2016-01-05 Microsoft Technology Licensing, Llc Presenting an application change through a tile
WO2016004116A1 (en) 2014-06-30 2016-01-07 Reliance Jio Infocomm Usa, Inc. System and method for providing a user-controlled overlay for user interface
USD747351S1 (en) * 2013-09-03 2016-01-12 Samsung Electronics Co., Ltd. Display screen or portion thereof with icon
USD748140S1 (en) * 2013-09-03 2016-01-26 Samsung Electronics Co., Ltd. Display screen portion with icon
USD748673S1 (en) * 2013-09-03 2016-02-02 Samsung Electronics Co., Ltd. Display screen portion with icon
WO2016024776A1 (en) * 2014-08-14 2016-02-18 Samsung Electronics Co., Ltd. Electronic device and method for providing user interface
USD751082S1 (en) * 2013-09-13 2016-03-08 Airwatch Llc Display screen with a graphical user interface for an email application
USD751603S1 (en) * 2013-09-03 2016-03-15 Samsung Electronics Co., Ltd. Display screen portion with icon
EP2998854A1 (en) * 2014-09-16 2016-03-23 Samsung Electronics Co., Ltd. Electronic device having independent screen configurations
US20160092064A1 (en) * 2014-06-20 2016-03-31 Huawei Technologies Co., Ltd. Method and Apparatus for Displaying Application Interface, and Electronic Device
USD754184S1 (en) * 2014-06-23 2016-04-19 Google Inc. Portion of a display panel with an animated computer icon
US20160110147A1 (en) * 2014-10-17 2016-04-21 Lenovo (Beijing) Co., Ltd. Display Method And Electronic Device
CN105573740A (en) * 2015-06-30 2016-05-11 宇龙计算机通信科技(深圳)有限公司 Split-screen display mode operation method and terminal
USD756398S1 (en) * 2014-06-23 2016-05-17 Google Inc. Portion of a display panel with an animated computer icon
US9367214B2 (en) * 2008-06-05 2016-06-14 Qualcomm Incorporated Wireless communication device having deterministic control of foreground access of the user interface
US9372347B1 (en) 2015-02-09 2016-06-21 Microsoft Technology Licensing, Llc Display system
USD760780S1 (en) * 2013-09-30 2016-07-05 Terumo Kabushiki Kaisha Display screen with icon
US20160202884A1 (en) * 2013-08-22 2016-07-14 Sony Corporation Information processing apparatus, storage medium and control method
US20160209973A1 (en) * 2015-01-21 2016-07-21 Microsoft Technology Licensing, Llc. Application user interface reconfiguration based on an experience mode transition
KR20160088631A (en) * 2015-01-16 2016-07-26 삼성전자주식회사 Method for controlling display and an electronic device thereof
US9423360B1 (en) 2015-02-09 2016-08-23 Microsoft Technology Licensing, Llc Optical components
CN105892823A (en) * 2016-04-27 2016-08-24 宇龙计算机通信科技(深圳)有限公司 Multi-window editing method, system and mobile terminal
US9429692B1 (en) 2015-02-09 2016-08-30 Microsoft Technology Licensing, Llc Optical components
USD770530S1 (en) * 2015-05-27 2016-11-01 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
US9513480B2 (en) 2015-02-09 2016-12-06 Microsoft Technology Licensing, Llc Waveguide
USD774062S1 (en) 2014-06-20 2016-12-13 Google Inc. Display screen with graphical user interface
US9535253B2 (en) 2015-02-09 2017-01-03 Microsoft Technology Licensing, Llc Display system
US20170017355A1 (en) * 2015-07-13 2017-01-19 Lg Electronics Inc. Mobile terminal and control method thereof
US9658766B2 (en) 2011-05-27 2017-05-23 Microsoft Technology Licensing, Llc Edge gesture
EP3171242A1 (en) * 2015-11-18 2017-05-24 Samsung Electronics Co., Ltd. Electronic device and method for configuring display thereof
US9678542B2 (en) 2012-03-02 2017-06-13 Microsoft Technology Licensing, Llc Multiple position input device cover
CN106843638A (en) * 2016-12-26 2017-06-13 北京奇艺世纪科技有限公司 The control method of video playing terminal, device and video playing terminal
US9696888B2 (en) 2010-12-20 2017-07-04 Microsoft Technology Licensing, Llc Application-launching interface for multiple modes
US20170199771A1 (en) * 2016-01-08 2017-07-13 Nasdaq, Inc. Systems and methods for calendar synchronization with enterprise web applications
USD792462S1 (en) 2016-01-26 2017-07-18 Google Inc. Display screen with transitional graphical user interface for image navigation and selection
US20170205990A1 (en) * 2016-01-14 2017-07-20 Lenovo (Beijing) Limited Method, system, and apparatus for controlling display regions for an electronic device
USD793440S1 (en) * 2016-01-26 2017-08-01 Google Inc. Display screen with transitional graphical user interface
US9785340B2 (en) 2014-06-12 2017-10-10 Apple Inc. Systems and methods for efficiently navigating between applications with linked content on an electronic device with a touch-sensitive display
US9787576B2 (en) 2014-07-31 2017-10-10 Microsoft Technology Licensing, Llc Propagating routing awareness for autonomous networks
US9827209B2 (en) 2015-02-09 2017-11-28 Microsoft Technology Licensing, Llc Display system
US9857935B2 (en) 2013-09-02 2018-01-02 Samsung Electronics Co., Ltd. Method and apparatus for providing multiple applications
USD808428S1 (en) 2016-06-29 2018-01-23 Quantcast Corporation Display screen or portion thereof with icon
USD808421S1 (en) * 2015-07-07 2018-01-23 Google Llc Display screen or portion thereof with a transitional graphical user interface component for identifying current location
US9904327B2 (en) 2012-03-02 2018-02-27 Microsoft Technology Licensing, Llc Flexible hinge and removable attachment
WO2018074798A1 (en) * 2016-10-17 2018-04-26 Samsung Electronics Co., Ltd. Electronic device and method for controlling display in electronic device
US20180165005A1 (en) * 2016-12-13 2018-06-14 Lg Electronics Inc. Mobile terminal and method for controlling the same
US20180189099A1 (en) * 2016-12-30 2018-07-05 TCL Research America Inc. Mobile-phone ux design for multitasking with priority and layered structure
US10018844B2 (en) 2015-02-09 2018-07-10 Microsoft Technology Licensing, Llc Wearable image display system
US20180204416A1 (en) * 2010-09-30 2018-07-19 Jesus Perea-Ochoa Method and System of Operating Multi-Task Interactive Electronic Devices
USD823871S1 (en) * 2017-02-03 2018-07-24 Google Llc Display screen with animated graphical user interface
US20180210643A1 (en) * 2013-02-17 2018-07-26 Benjamin Firooz Ghassabian Data entry systems
US10073976B2 (en) 2014-10-24 2018-09-11 Samsung Electronics Co., Ltd. Application executing method and device, and recording medium thereof
US10126943B2 (en) * 2014-06-17 2018-11-13 Lg Electronics Inc. Mobile terminal for activating editing function when item on front surface display area is dragged toward side surface display area
EP3435218A1 (en) * 2017-07-28 2019-01-30 Beijing Xiaomi Mobile Software Co., Ltd. Method and device for displaying application, and storage medium
US10254955B2 (en) 2011-09-10 2019-04-09 Microsoft Technology Licensing, Llc Progressively indicating new content in an application-selectable user interface
US10254942B2 (en) 2014-07-31 2019-04-09 Microsoft Technology Licensing, Llc Adaptive sizing and positioning of application windows
US10317677B2 (en) 2015-02-09 2019-06-11 Microsoft Technology Licensing, Llc Display system
USD851118S1 (en) * 2014-09-02 2019-06-11 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
US10324733B2 (en) 2014-07-30 2019-06-18 Microsoft Technology Licensing, Llc Shutdown notifications
US10338765B2 (en) 2014-09-05 2019-07-02 Microsoft Technology Licensing, Llc Combined switching and window placement
USD854035S1 (en) 2015-05-17 2019-07-16 Google Llc Display screen with an animated graphical user interface
WO2019143071A1 (en) 2018-01-22 2019-07-25 Samsung Electronics Co., Ltd. Electronic device for controlling a plurality of applications
US20190244586A1 (en) * 2016-02-18 2019-08-08 Samsung Electronics Co., Ltd. Content display method and electonic device for performing same
TWI672632B (en) * 2018-02-26 2019-09-21 宏碁股份有限公司 Method for filtering screen split configurations and computer device using the same
USD861721S1 (en) * 2018-04-09 2019-10-01 Palm Ventures Group, Inc. Display screen or portion thereof with a graphical user interface for handling swipe gesture
US10430917B2 (en) 2012-01-20 2019-10-01 Microsoft Technology Licensing, Llc Input mode recognition
US10452256B2 (en) * 2013-07-25 2019-10-22 Samsung Electronics Co., Ltd. Non-interfering multi-application display method and an electronic device thereof
US10509547B2 (en) 2014-12-18 2019-12-17 Samsung Electronics Co., Ltd. Electronic device and method for controlling a display
RU2710309C2 (en) * 2017-06-15 2019-12-25 Боргвард Трейдмарк Холдингс ГмбХ Method and apparatus for processing split screen and vehicle
US10534434B2 (en) 2014-11-12 2020-01-14 Samsung Electronics Co., Ltd. Apparatus and method for using blank area in screen
USD874495S1 (en) 2018-04-09 2020-02-04 Palm Ventures Group, Inc. Display screen or portion thereof with a graphical user interface for an application launcher
US10552031B2 (en) 2014-12-30 2020-02-04 Microsoft Technology Licensing, Llc Experience mode transition
US10564792B2 (en) 2012-12-06 2020-02-18 Samsung Electronics Co., Ltd. Display device and method of indicating an active region in a milti-window display
US10579250B2 (en) 2011-09-01 2020-03-03 Microsoft Technology Licensing, Llc Arranging tiles
US10592080B2 (en) 2014-07-31 2020-03-17 Microsoft Technology Licensing, Llc Assisted presentation of application windows
US10628987B2 (en) * 2018-05-08 2020-04-21 Google Llc Condensed transitions of graphical elements presented in graphical user interfaces
USD882582S1 (en) * 2014-06-20 2020-04-28 Google Llc Display screen with animated graphical user interface
US10649791B2 (en) * 2015-07-14 2020-05-12 Samsung Electronics Co., Ltd. Method for an initial setup and electronic device thereof
CN111142769A (en) * 2019-12-20 2020-05-12 维沃移动通信有限公司 Split screen display method and electronic equipment
US10678743B2 (en) 2012-05-14 2020-06-09 Microsoft Technology Licensing, Llc System and method for accessory device architecture that passes via intermediate processor a descriptor when processing in a low power state
US10678412B2 (en) 2014-07-31 2020-06-09 Microsoft Technology Licensing, Llc Dynamic joint dividers for application windows
USD890198S1 (en) * 2018-08-21 2020-07-14 Facebook, Inc. Display screen with graphical user interface
USD890774S1 (en) * 2018-02-22 2020-07-21 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
US10739987B2 (en) 2014-08-28 2020-08-11 Samsung Electronics Co., Ltd. Electronic device including touch sensitive display and method for managing the display
USD894921S1 (en) 2018-08-21 2020-09-01 Facebook, Inc. Display screen with graphical user interface
US20200356265A1 (en) 2014-12-29 2020-11-12 Samsung Electronics Co., Ltd. User terminal device and control method thereof
US10838594B2 (en) * 2018-07-03 2020-11-17 Canon Production Printing Holding B.V. Method of controlling a user interface
US10942978B1 (en) 2018-08-27 2021-03-09 Facebook, Inc. Systems and methods for creating interactive metadata elements in social media compositions
CN112540740A (en) * 2020-12-09 2021-03-23 维沃移动通信有限公司 Split screen display method and device, electronic equipment and readable storage medium
US10969944B2 (en) 2010-12-23 2021-04-06 Microsoft Technology Licensing, Llc Application reporting in an application-selectable user interface
USD916873S1 (en) * 2019-06-19 2021-04-20 Stryker Corporation Display screen or portion thereof with graphical user interface
CN112689818A (en) * 2018-11-14 2021-04-20 深圳市柔宇科技股份有限公司 Anti-disturbance method, electronic device and computer readable storage medium
US10990268B2 (en) * 2016-09-22 2021-04-27 Beijing Bytedance Network Technology Co Ltd. Operation method and terminal device
US10996839B2 (en) * 2019-05-20 2021-05-04 Microsoft Technology Licensing, Llc Providing consistent interaction models in communication sessions
US11017164B1 (en) 2018-08-27 2021-05-25 Facebook, Inc. Systems and methods for collecting multiple forms of digital content using a single landing screen
US20210160291A1 (en) * 2013-11-13 2021-05-27 T1V, Inc. Simultaneous input system for web browsers and other applications
US11025582B1 (en) 2018-09-05 2021-06-01 Facebook, Inc. Systems and methods for creating multiple renditions of a social media composition from inputs to a single digital composer
USD922997S1 (en) 2018-04-09 2021-06-22 Palm Ventures Group, Inc. Personal computing device
US11054987B1 (en) * 2019-12-25 2021-07-06 Shanghai Transsion Co., Ltd. Sidebar interaction method, device, and computer-readable storage medium
US11086216B2 (en) 2015-02-09 2021-08-10 Microsoft Technology Licensing, Llc Generating electronic components
US11158290B2 (en) 2019-08-19 2021-10-26 Samsung Electronics Co., Ltd. Electronic device and method of controlling the same
CN113783995A (en) * 2021-08-13 2021-12-10 维沃移动通信有限公司 Display control method, display control device, electronic apparatus, and medium
US20210397309A1 (en) * 2019-07-19 2021-12-23 Tencent Technology (Shenzhen) Company Limited Interface display method and apparatus, and storage medium
USD940176S1 (en) * 2020-03-01 2022-01-04 Schlumberger Technology Corporation Display device with a graphical user interface having a responsive menu
RU2764157C1 (en) * 2018-06-29 2022-01-13 Бэйцзин Майкролайв Вижн Текнолоджи Ко., Лтд Method and apparatus for switching global special effects, terminal apparatus and data carrier
CN113946245A (en) * 2017-08-24 2022-01-18 华为技术有限公司 Split screen display method and device and terminal
US11231847B2 (en) * 2019-05-06 2022-01-25 Apple Inc. Drag and drop for a multi-window operating system
US11249643B2 (en) * 2018-10-26 2022-02-15 Samsung Electronics Co., Ltd Electronic device for displaying list of executable applications on split screen and operating method thereof
US11272017B2 (en) 2011-05-27 2022-03-08 Microsoft Technology Licensing, Llc Application notifications manifest
CN114168047A (en) * 2019-08-22 2022-03-11 华为技术有限公司 Application window processing method and device
USD948533S1 (en) * 2019-04-25 2022-04-12 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
US11340959B2 (en) * 2019-10-29 2022-05-24 Lg Electronics Inc. Electronic apparatus for running application and control method thereof
US11340752B2 (en) 2014-08-29 2022-05-24 Samsung Electronics Co., Ltd Window management method and electronic device supporting the same
CN114546314A (en) * 2022-01-21 2022-05-27 合肥联宝信息技术有限公司 Window display method and device, electronic equipment and storage medium
US11405725B2 (en) * 2017-09-08 2022-08-02 Samsung Electronics Co., Ltd. Method for controlling audio output by application through earphones and electronic device implementing same
US20220276752A1 (en) * 2019-04-15 2022-09-01 Apple Inc. Systems, Methods, and User Interfaces for Interacting with Multiple Application Windows
US20220308753A1 (en) * 2019-06-30 2022-09-29 Huawei Technologies Co., Ltd. Split-Screen Method and Electronic Device
US20220334855A1 (en) * 2020-03-17 2022-10-20 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Multi-task operation method, electronic device, and storage medium
US20220382427A1 (en) * 2020-05-25 2022-12-01 Beijing Bytedance Network Technology Co., Ltd. Method and apparatus for controlling display of video call interface, storage medium and device
USD988353S1 (en) 2019-06-25 2023-06-06 Stryker Corporation Display screen or portion thereof with graphical user interface
WO2023245310A1 (en) * 2022-06-20 2023-12-28 北京小米移动软件有限公司 Window adjustment method and apparatus, and terminal and storage medium

Families Citing this family (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2013356799B2 (en) * 2012-12-06 2019-08-08 Samsung Electronics Co., Ltd. Display device and method of controlling the same
CN103995722B (en) * 2014-05-26 2017-08-25 天津三星通信技术研究有限公司 Open the method and apparatus of multiple windows simultaneously on screen
CN115269088A (en) * 2014-06-12 2022-11-01 苹果公司 System and method for multitasking on an electronic device with a touch-sensitive display
CN104049866B (en) * 2014-06-25 2019-06-21 努比亚技术有限公司 The implementation method and device of a kind of mobile terminal and its split screen
CN104168515A (en) * 2014-08-21 2014-11-26 三星电子(中国)研发中心 Intelligent television terminal and screen control method thereof
CN105487742B (en) * 2014-09-18 2019-06-18 北京三星通信技术研究有限公司 The display methods and device of more application widgets
US10459608B2 (en) * 2014-12-01 2019-10-29 Ebay Inc. Mobile optimized shopping comparison
CN106293353B (en) * 2015-05-22 2020-06-16 腾讯科技(深圳)有限公司 Expansion control method and device for list elements
CN104965697A (en) * 2015-05-29 2015-10-07 深圳市金立通信设备有限公司 Window display method and terminal
CN105094733B (en) * 2015-06-30 2020-12-29 努比亚技术有限公司 Split screen display method and device
CN104991704A (en) * 2015-07-06 2015-10-21 魅族科技(中国)有限公司 Screen-splitting method for terminal and terminal
CN104991705A (en) * 2015-07-16 2015-10-21 魅族科技(中国)有限公司 Interface display method and terminal
CN105511778A (en) * 2015-11-25 2016-04-20 网易(杭州)网络有限公司 Interaction method device for controlling display of multiple game scenes
CN105426150B (en) * 2015-11-27 2018-10-23 青岛海信电器股份有限公司 A kind of multimedia messages playback method and device
CN105912192A (en) * 2016-03-31 2016-08-31 联想(北京)有限公司 Display control method and electronic equipment
CN105955802B (en) * 2016-04-21 2020-06-12 青岛海信移动通信技术股份有限公司 Application running method of mobile terminal and mobile terminal
CN106020592A (en) * 2016-05-09 2016-10-12 北京小米移动软件有限公司 Split screen display method and device
CN106055252B (en) * 2016-05-30 2019-04-30 努比亚技术有限公司 Mobile terminal and its split screen display available processing method
CN106502513A (en) * 2016-10-31 2017-03-15 珠海我爱拍科技有限公司 A kind of split screen display available technology based on Android system
CN106534914A (en) * 2016-10-31 2017-03-22 努比亚技术有限公司 Split screen display device, mobile terminal and method
CN110622121A (en) * 2017-05-15 2019-12-27 苹果公司 System and method for interacting with multiple applications simultaneously displayed on an electronic device with a touch-sensitive display
DK180117B1 (en) * 2017-05-15 2020-05-15 Apple Inc. Systems and methods for interacting with multiple applications that are simultaneously displayed on an electronic device with a touchsensitive display
CN107645593A (en) * 2017-09-07 2018-01-30 宁波亿拍客网络科技有限公司 A kind of method of fast operating equipment
CN111149086B (en) * 2017-09-30 2022-01-14 华为技术有限公司 Method for editing main screen, graphical user interface and electronic equipment
CN108595067A (en) * 2018-03-19 2018-09-28 青岛海信移动通信技术股份有限公司 A kind of input operation zone of action of suspension keyboard determines method and device
CN112068907A (en) * 2019-05-25 2020-12-11 华为技术有限公司 Interface display method and electronic equipment
CN114816209A (en) * 2019-06-25 2022-07-29 华为技术有限公司 Full screen display method and device of mobile terminal
CN110471725A (en) * 2019-07-02 2019-11-19 华为技术有限公司 A kind of split screen method and electronic equipment
CN110442297B (en) * 2019-08-08 2021-08-27 Oppo广东移动通信有限公司 Split screen display method, split screen display device and terminal equipment
CN112578982A (en) * 2019-09-29 2021-03-30 华为技术有限公司 Electronic equipment and operation method thereof
WO2021221184A1 (en) * 2020-04-27 2021-11-04 엘지전자 주식회사 Mobile terminal and control method therefor
CN111638842B (en) * 2020-05-21 2021-09-28 维沃移动通信有限公司 Display control method and device and electronic equipment
KR102256292B1 (en) * 2020-11-26 2021-05-26 삼성전자 주식회사 An electronic device for providing multiple windows using an expandable display
CN115826902A (en) * 2021-07-12 2023-03-21 荣耀终端有限公司 Display method, medium, program product, chip device, and electronic apparatus
CN115016695A (en) * 2021-11-18 2022-09-06 荣耀终端有限公司 Application program starting method and electronic equipment
TWI812072B (en) * 2022-03-16 2023-08-11 緯創資通股份有限公司 Window arrangement method and window arrangement system
CN114706521A (en) * 2022-06-07 2022-07-05 芯行纪科技有限公司 Method for managing windows in EDA (electronic design automation) software interface and related equipment

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6008809A (en) * 1997-09-22 1999-12-28 International Business Machines Corporation Apparatus and method for viewing multiple windows within a dynamic window
US20040119750A1 (en) * 2002-12-19 2004-06-24 Harrison Edward R. Method and apparatus for positioning a software keyboard
US20100066698A1 (en) * 2008-09-18 2010-03-18 Samsung Electronics Co., Ltd. Method and appress for controlling multitasking operations of mobile terminal having touchscreen
US20100088597A1 (en) * 2008-10-02 2010-04-08 Samsung Electronics Co., Ltd. Method and apparatus for configuring idle screen of portable terminal
US20100088634A1 (en) * 2007-01-25 2010-04-08 Akira Tsuruta Multi-window management apparatus and program, storage medium and information processing apparatus
US20100248788A1 (en) * 2009-03-25 2010-09-30 Samsung Electronics Co., Ltd. Method of dividing screen areas and mobile terminal employing the same
US20110138276A1 (en) * 2009-12-03 2011-06-09 Mobile Devices Ingenierie Information Device for a Vehicle Driver and Method for Controlling Such a Device
US20110164048A1 (en) * 2008-09-08 2011-07-07 Ntt Docomo, Inc. Information-processing device and program
US20110175930A1 (en) * 2010-01-19 2011-07-21 Hwang Inyong Mobile terminal and control method thereof
US20110209093A1 (en) * 2010-02-19 2011-08-25 Microsoft Corporation Radial menus with bezel gestures
US20110244924A1 (en) * 2010-04-06 2011-10-06 Lg Electronics Inc. Mobile terminal and controlling method thereof
US20120023453A1 (en) * 2010-07-26 2012-01-26 Wagner Oliver P Device, Method, and Graphical User Interface for Navigating Through a Hierarchy
US20120066630A1 (en) * 2010-09-15 2012-03-15 Lg Electronics Inc. Mobile terminal and controlling method thereof
US20120131483A1 (en) * 2010-11-22 2012-05-24 International Business Machines Corporation Drag-and-drop actions for web applications using an overlay and a set of placeholder elements
US20130057587A1 (en) * 2011-09-01 2013-03-07 Microsoft Corporation Arranging tiles
US20130076793A1 (en) * 2011-09-27 2013-03-28 Imerj LLC Desktop application manager: tapping dual-screen cards

Family Cites Families (126)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5169342A (en) 1990-05-30 1992-12-08 Steele Richard D Method of communicating with a language deficient patient
US5390295A (en) 1991-12-20 1995-02-14 International Business Machines Corporation Method and apparatus for proportionally displaying windows on a computer display screen
WO1994024657A1 (en) * 1993-04-20 1994-10-27 Apple Computer Inc. Interactive user interface
US5819055A (en) 1994-12-13 1998-10-06 Microsoft Corporation Method and apparatus for docking re-sizeable interface boxes
CA2175148C (en) 1996-04-26 2002-06-11 Robert Cecco User interface control for creating split panes in a single window
DE69805986T2 (en) 1997-03-28 2003-01-23 Sun Microsystems Inc METHOD AND DEVICE FOR CONFIGURING SLIDING WINDOWS
US6166736A (en) 1997-08-22 2000-12-26 Natrificial Llc Method and apparatus for simultaneously resizing and relocating windows within a graphical display
US6832355B1 (en) 1998-07-28 2004-12-14 Microsoft Corporation Web page display system
CA2430432A1 (en) 2000-12-01 2002-06-06 Ginganet Corporation Video terminal, video terminal communication system, and video conferencing system
US6771292B2 (en) 2001-03-29 2004-08-03 International Business Machines Corporation Method and system for providing feedback concerning a content pane to be docked in a host window
US9256356B2 (en) * 2001-03-29 2016-02-09 International Business Machines Corporation Method and system for providing feedback for docking a content pane in a host window
US20020165993A1 (en) 2001-05-04 2002-11-07 Andre Kramer System and method of partitioning software components of a monolithic component-based application program to separate graphical user interface elements for local execution at a client system in conjunction with remote execution of the application program at a server system
US20020191028A1 (en) 2001-06-19 2002-12-19 Senechalle David A. Window manager user interface
US9405459B2 (en) 2011-08-24 2016-08-02 Z124 Unified desktop laptop dock software operation
JP4102045B2 (en) * 2001-09-28 2008-06-18 富士フイルム株式会社 Display control method and display control processing device for concealment window on desktop
US6961906B2 (en) 2001-11-14 2005-11-01 Lenovo Pte. Ltd. Method and system for switching between windows in a multiwindow computer environment
US6850255B2 (en) 2002-02-28 2005-02-01 James Edward Muschetto Method and apparatus for accessing information, computer programs and electronic communications across multiple computing devices using a graphical user interface
US7269797B1 (en) 2002-03-28 2007-09-11 Fabrizio Bertocci Mechanism to organize windows in a graphic application
US7656393B2 (en) 2005-03-04 2010-02-02 Apple Inc. Electronic device having display and surrounding touch sensitive bezel for user interface and control
US7362341B2 (en) * 2003-06-02 2008-04-22 Microsoft Corporation System and method for customizing the visual layout of screen display areas
US20060020903A1 (en) 2004-07-26 2006-01-26 Shih-Yang Wang Window split system and method
US20060224992A1 (en) 2005-04-01 2006-10-05 Microsoft Corporation Graphical user interface management
US9785329B2 (en) 2005-05-23 2017-10-10 Nokia Technologies Oy Pocket computer and associated methods
KR100739747B1 (en) * 2005-10-31 2007-07-13 삼성전자주식회사 Apparatus and method for interfacing with an user for a touch screen
US20090288036A1 (en) 2005-12-22 2009-11-19 Kazuya Osawa Multi-window display apparatus, multi-window display method, and integrated circuit
KR100818918B1 (en) * 2006-02-14 2008-04-04 삼성전자주식회사 Apparatus and method for managing window layout
EP1847924A1 (en) 2006-04-20 2007-10-24 International Business Machines Corporation Optimal display of multiple windows within a computer display
KR100827228B1 (en) * 2006-05-01 2008-05-07 삼성전자주식회사 Apparatus and method for providing area separate means with touch function
WO2007141995A1 (en) 2006-06-05 2007-12-13 Konica Minolta Medical & Graphic, Inc. Display processing device
JP4068119B2 (en) 2006-07-25 2008-03-26 シャープ株式会社 Video display device, video display method, video display program, and recording medium
KR100831721B1 (en) * 2006-12-29 2008-05-22 엘지전자 주식회사 Apparatus and method for displaying of mobile terminal
JP4858313B2 (en) 2007-06-01 2012-01-18 富士ゼロックス株式会社 Workspace management method
US9116593B2 (en) 2007-07-06 2015-08-25 Qualcomm Incorporated Single-axis window manager
US8245155B2 (en) 2007-11-29 2012-08-14 Sony Corporation Computer implemented display, graphical user interface, design and method including scrolling features
KR101387527B1 (en) 2007-12-06 2014-04-23 엘지전자 주식회사 Terminal and method for displaying menu icon therefor
US20090150823A1 (en) * 2007-12-10 2009-06-11 Ati Technologies Ulc Apparatus and Method for Improved Window Management in a Grid Management System
EP3432656B1 (en) 2008-01-30 2021-04-14 Google LLC Notification of mobile device events
US20090199127A1 (en) * 2008-01-31 2009-08-06 Microsoft Corporation Previewing target display areas
KR101012300B1 (en) 2008-03-07 2011-02-08 삼성전자주식회사 User interface apparatus of mobile station having touch screen and method thereof
JP2009245423A (en) 2008-03-13 2009-10-22 Panasonic Corp Information device and window display method
KR101447752B1 (en) 2008-03-25 2014-10-06 삼성전자주식회사 Apparatus and method for separating and composing screen in a touch screen
US8434019B2 (en) 2008-06-02 2013-04-30 Daniel Paul Nelson Apparatus and method for positioning windows on a display
JP4632102B2 (en) 2008-07-17 2011-02-16 ソニー株式会社 Information processing apparatus, information processing method, and information processing program
KR101568351B1 (en) 2008-08-08 2015-11-20 엘지전자 주식회사 Mobile Terminal With Touch Screen And Method Of Processing Data Using Same
US8600446B2 (en) 2008-09-26 2013-12-03 Htc Corporation Mobile device interface with dual windows
US8547347B2 (en) 2008-09-26 2013-10-01 Htc Corporation Method for generating multiple windows frames, electronic device thereof, and computer program product using the method
JP5362307B2 (en) 2008-09-30 2013-12-11 富士フイルム株式会社 Drag and drop control device, method, program, and computer terminal
JP4683110B2 (en) 2008-10-17 2011-05-11 ソニー株式会社 Display device, display method, and program
KR20100048297A (en) * 2008-10-30 2010-05-11 에스케이텔레시스 주식회사 Screen controlling apparatus and method thereof for mobile terminal
KR101609162B1 (en) 2008-11-13 2016-04-05 엘지전자 주식회사 Mobile Terminal With Touch Screen And Method Of Processing Data Using Same
WO2010111369A1 (en) 2009-03-24 2010-09-30 Fuhu, Inc. Apparatus, system and method for an icon driven tile bar in a graphical user interface
JP5229083B2 (en) 2009-04-14 2013-07-03 ソニー株式会社 Information processing apparatus, information processing method, and program
JP5606686B2 (en) 2009-04-14 2014-10-15 ソニー株式会社 Information processing apparatus, information processing method, and program
US20100293501A1 (en) 2009-05-18 2010-11-18 Microsoft Corporation Grid Windows
KR101587211B1 (en) 2009-05-25 2016-01-20 엘지전자 주식회사 Mobile Terminal And Method Of Controlling Same
US8555185B2 (en) 2009-06-08 2013-10-08 Apple Inc. User interface for multiple display regions
US9092115B2 (en) 2009-09-23 2015-07-28 Microsoft Technology Licensing, Llc Computing system with visual clipboard
JP4959765B2 (en) 2009-09-28 2012-06-27 京セラ株式会社 Mobile terminal device
US10101898B2 (en) 2009-10-23 2018-10-16 Autodesk, Inc. Multi-touch graphical user interface for interacting with menus on a handheld device
KR101636570B1 (en) * 2009-10-28 2016-07-20 엘지전자 주식회사 Apparatus and Method for controlling an output display area
KR101714781B1 (en) 2009-11-17 2017-03-22 엘지전자 주식회사 Method for playing contents
KR101722616B1 (en) 2009-12-24 2017-04-19 삼성전자주식회사 Method and apparatus for operating application of a touch device having touch-based input interface
WO2011099808A2 (en) 2010-02-12 2011-08-18 Samsung Electronics Co., Ltd. Method and apparatus for providing a user interface
US20120005602A1 (en) 2010-07-02 2012-01-05 Nokia Corporation Methods and apparatuses for facilitating task switching
US8621386B2 (en) 2010-07-19 2013-12-31 Verizon Patent And Licensing Inc. File management and transfer using user interface icons associated with applications
US9766903B2 (en) 2010-08-18 2017-09-19 Red Hat, Inc. Inline response to notification messages
KR101811743B1 (en) 2010-09-09 2018-01-25 삼성전자주식회사 Multimedia apparatus and Method for providing contents thereof
US20120069049A1 (en) 2010-09-16 2012-03-22 Omnyx, LLC Digital pathology image manipulation
EP2434368B1 (en) 2010-09-24 2018-08-01 BlackBerry Limited Method for conserving power on a portable electronic device and a portable electronic device configured for the same
CN108681424B (en) 2010-10-01 2021-08-31 Z124 Dragging gestures on a user interface
US8994713B2 (en) 2010-10-01 2015-03-31 Z124 Smart pad operation with differing display parameters applied to different display elements
US20120218202A1 (en) 2010-10-01 2012-08-30 Sanjiv Sirpal Windows position control for phone applications
JP5628625B2 (en) 2010-10-14 2014-11-19 京セラ株式会社 Electronic device, screen control method, and screen control program
US20120102437A1 (en) * 2010-10-22 2012-04-26 Microsoft Corporation Notification Group Touch Gesture Dismissal Techniques
KR102188757B1 (en) * 2010-11-18 2020-12-08 구글 엘엘씨 Surfacing off-screen visible objects
KR101767504B1 (en) * 2010-12-01 2017-08-11 엘지전자 주식회사 Mobile terminal and operation method thereof
US20120144331A1 (en) 2010-12-03 2012-06-07 Ari Tolonen Method for Arranging Application Windows on a Display
US10620794B2 (en) 2010-12-23 2020-04-14 Apple Inc. Device, method, and graphical user interface for switching between two user interfaces
KR101788051B1 (en) * 2011-01-04 2017-10-19 엘지전자 주식회사 Mobile terminal and method for controlling thereof
US9015641B2 (en) 2011-01-06 2015-04-21 Blackberry Limited Electronic device and method of providing visual notification of a received communication
US9477311B2 (en) 2011-01-06 2016-10-25 Blackberry Limited Electronic device and method of displaying information in response to a gesture
US9465440B2 (en) 2011-01-06 2016-10-11 Blackberry Limited Electronic device and method of displaying information in response to a gesture
US9471145B2 (en) 2011-01-06 2016-10-18 Blackberry Limited Electronic device and method of displaying information in response to a gesture
US9423878B2 (en) 2011-01-06 2016-08-23 Blackberry Limited Electronic device and method of displaying information in response to a gesture
US10042546B2 (en) * 2011-01-07 2018-08-07 Qualcomm Incorporated Systems and methods to present multiple frames on a touch screen
KR101842906B1 (en) 2011-02-10 2018-05-15 삼성전자주식회사 Apparatus having a plurality of touch screens and screen changing method thereof
US9104290B2 (en) * 2011-02-11 2015-08-11 Samsung Electronics Co., Ltd. Method for controlling screen of mobile terminal
JP5580227B2 (en) 2011-02-24 2014-08-27 京セラ株式会社 Mobile terminal device
US8904305B2 (en) 2011-03-11 2014-12-02 Google Inc. Automatically hiding controls
US20120317499A1 (en) 2011-04-11 2012-12-13 Shen Jin Wen Instant messaging system that facilitates better knowledge and task management
US8713473B2 (en) 2011-04-26 2014-04-29 Google Inc. Mobile browser context switching
KR101199618B1 (en) 2011-05-11 2012-11-08 주식회사 케이티테크 Apparatus and Method for Screen Split Displaying
KR101251761B1 (en) * 2011-05-13 2013-04-05 주식회사 케이티 Method for Data Transferring Between Applications and Terminal Apparatus Using the Method
US20120289290A1 (en) 2011-05-12 2012-11-15 KT Corporation, KT TECH INC. Transferring objects between application windows displayed on mobile terminal
US9158445B2 (en) 2011-05-27 2015-10-13 Microsoft Technology Licensing, Llc Managing an immersive interface in a multi-application immersive environment
DE202011101724U1 (en) * 2011-06-11 2011-10-05 Ruia Global Fasteners Ag Self-centering cage nut
KR101802760B1 (en) 2011-06-27 2017-12-28 엘지전자 주식회사 Mobile terminal and method for controlling thereof
KR101860341B1 (en) 2011-09-09 2018-05-24 엘지전자 주식회사 Mobile terminal and control method for the same
KR101859102B1 (en) 2011-09-16 2018-05-17 엘지전자 주식회사 Mobile terminal and control method for mobile terminal
US8836654B2 (en) 2011-10-04 2014-09-16 Qualcomm Incorporated Application window position and size control in (multi-fold) multi-display devices
KR20130054074A (en) * 2011-11-16 2013-05-24 삼성전자주식회사 Apparatus displaying event view on splited screen and method for controlling thereof
KR101888457B1 (en) * 2011-11-16 2018-08-16 삼성전자주식회사 Apparatus having a touch screen processing plurality of apllications and method for controlling thereof
KR101905038B1 (en) 2011-11-16 2018-10-08 삼성전자주식회사 Apparatus having a touch screen under multiple applications environment and method for controlling thereof
US20130141371A1 (en) 2011-12-01 2013-06-06 Research In Motion Limited Electronic device and method of displaying information in response to a gesture
US9032292B2 (en) 2012-01-19 2015-05-12 Blackberry Limited Simultaneous display of multiple maximized applications on touch screen electronic devices
US9619038B2 (en) 2012-01-23 2017-04-11 Blackberry Limited Electronic device and method of displaying a cover image and an application image from a low power condition
US9058168B2 (en) 2012-01-23 2015-06-16 Blackberry Limited Electronic device and method of controlling a display
US20130227490A1 (en) 2012-02-24 2013-08-29 Simon Martin THORSANDER Method and Apparatus for Providing an Option to Enable Multiple Selections
EP2631760A1 (en) 2012-02-24 2013-08-28 Research In Motion Limited Method and apparatus for providing a user interface on a device enabling selection of operations to be performed in relation to content
US20130227413A1 (en) 2012-02-24 2013-08-29 Simon Martin THORSANDER Method and Apparatus for Providing a Contextual User Interface on a Device
EP2631738B1 (en) 2012-02-24 2016-04-13 BlackBerry Limited Method and apparatus for adjusting a user interface to reduce obscuration
KR101356368B1 (en) 2012-02-24 2014-01-29 주식회사 팬택 Application switching apparatus and method
EP2631747B1 (en) 2012-02-24 2016-03-30 BlackBerry Limited Method and apparatus for providing a user interface on a device that indicates content operators
WO2013169070A1 (en) 2012-05-11 2013-11-14 Samsung Electronics Co., Ltd. Multiple window providing apparatus and method
KR101417318B1 (en) 2012-08-17 2014-07-09 주식회사 팬택 Method for providing User Interface having multi-tasking function, Mobile Communication Device and Computer Readable Recording Medium for providing the same
KR101961860B1 (en) 2012-08-28 2019-03-25 삼성전자주식회사 User terminal apparatus and contol method thereof
US9588674B2 (en) 2012-11-30 2017-03-07 Qualcomm Incorporated Methods and systems for providing an automated split-screen user interface on a device
EP3287884B1 (en) 2012-12-06 2021-11-10 Samsung Electronics Co., Ltd. Display device and method of controlling the same
CN103870772B (en) 2012-12-17 2017-08-08 国基电子(上海)有限公司 Touch-screen electronic device and control method thereof
US9690476B2 (en) 2013-03-14 2017-06-27 Blackberry Limited Electronic device and method of displaying information in response to a gesture
DE102013002891A1 (en) 2013-03-22 2014-09-25 Volkswagen Aktiengesellschaft An information reproduction system for a vehicle and method for providing information to the user of a vehicle
US9507495B2 (en) 2013-04-03 2016-11-29 Blackberry Limited Electronic device and method of displaying information in response to a gesture
KR102127925B1 (en) * 2013-04-29 2020-06-29 엘지전자 주식회사 Mobile terminal and control method thereof
US10564836B2 (en) 2013-05-01 2020-02-18 Apple Inc. Dynamic moveable interface elements on a touch screen device
US9933922B2 (en) 2014-03-27 2018-04-03 Sybase, Inc. Child container control of parent container of a user interface
US10254942B2 (en) 2014-07-31 2019-04-09 Microsoft Technology Licensing, Llc Adaptive sizing and positioning of application windows

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6008809A (en) * 1997-09-22 1999-12-28 International Business Machines Corporation Apparatus and method for viewing multiple windows within a dynamic window
US20040119750A1 (en) * 2002-12-19 2004-06-24 Harrison Edward R. Method and apparatus for positioning a software keyboard
US20100088634A1 (en) * 2007-01-25 2010-04-08 Akira Tsuruta Multi-window management apparatus and program, storage medium and information processing apparatus
US20110164048A1 (en) * 2008-09-08 2011-07-07 Ntt Docomo, Inc. Information-processing device and program
US20100066698A1 (en) * 2008-09-18 2010-03-18 Samsung Electronics Co., Ltd. Method and apparatus for controlling multitasking operations of mobile terminal having touchscreen
US20100088597A1 (en) * 2008-10-02 2010-04-08 Samsung Electronics Co., Ltd. Method and apparatus for configuring idle screen of portable terminal
US20100248788A1 (en) * 2009-03-25 2010-09-30 Samsung Electronics Co., Ltd. Method of dividing screen areas and mobile terminal employing the same
US20110138276A1 (en) * 2009-12-03 2011-06-09 Mobile Devices Ingenierie Information Device for a Vehicle Driver and Method for Controlling Such a Device
US20110175930A1 (en) * 2010-01-19 2011-07-21 Hwang Inyong Mobile terminal and control method thereof
US20110209093A1 (en) * 2010-02-19 2011-08-25 Microsoft Corporation Radial menus with bezel gestures
US20110244924A1 (en) * 2010-04-06 2011-10-06 Lg Electronics Inc. Mobile terminal and controlling method thereof
US20120023453A1 (en) * 2010-07-26 2012-01-26 Wagner Oliver P Device, Method, and Graphical User Interface for Navigating Through a Hierarchy
US20120066630A1 (en) * 2010-09-15 2012-03-15 Lg Electronics Inc. Mobile terminal and controlling method thereof
US20120131483A1 (en) * 2010-11-22 2012-05-24 International Business Machines Corporation Drag-and-drop actions for web applications using an overlay and a set of placeholder elements
US20130057587A1 (en) * 2011-09-01 2013-03-07 Microsoft Corporation Arranging tiles
US20130076793A1 (en) * 2011-09-27 2013-03-28 Imerj LLC Desktop application manager: tapping dual-screen cards

Cited By (219)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9367214B2 (en) * 2008-06-05 2016-06-14 Qualcomm Incorporated Wireless communication device having deterministic control of foreground access of the user interface
US20180204416A1 (en) * 2010-09-30 2018-07-19 Jesus Perea-Ochoa Method and System of Operating Multi-Task Interactive Electronic Devices
US10741025B2 (en) * 2010-09-30 2020-08-11 Jesus Perea-Ochoa Method and system of operating multi-task interactive electronic devices
US9696888B2 (en) 2010-12-20 2017-07-04 Microsoft Technology Licensing, Llc Application-launching interface for multiple modes
US10969944B2 (en) 2010-12-23 2021-04-06 Microsoft Technology Licensing, Llc Application reporting in an application-selectable user interface
US11126333B2 (en) 2010-12-23 2021-09-21 Microsoft Technology Licensing, Llc Application reporting in an application-selectable user interface
US9229918B2 (en) 2010-12-23 2016-01-05 Microsoft Technology Licensing, Llc Presenting an application change through a tile
US9104307B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US11272017B2 (en) 2011-05-27 2022-03-08 Microsoft Technology Licensing, Llc Application notifications manifest
US9104440B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US9535597B2 (en) 2011-05-27 2017-01-03 Microsoft Technology Licensing, Llc Managing an immersive interface in a multi-application immersive environment
US9658766B2 (en) 2011-05-27 2017-05-23 Microsoft Technology Licensing, Llc Edge gesture
US11698721B2 (en) 2011-05-27 2023-07-11 Microsoft Technology Licensing, Llc Managing an immersive interface in a multi-application immersive environment
US10303325B2 (en) 2011-05-27 2019-05-28 Microsoft Technology Licensing, Llc Multi-application environment
US9158445B2 (en) 2011-05-27 2015-10-13 Microsoft Technology Licensing, Llc Managing an immersive interface in a multi-application immersive environment
US9052820B2 (en) 2011-05-27 2015-06-09 Microsoft Technology Licensing, Llc Multi-application environment
US10579250B2 (en) 2011-09-01 2020-03-03 Microsoft Technology Licensing, Llc Arranging tiles
US10254955B2 (en) 2011-09-10 2019-04-09 Microsoft Technology Licensing, Llc Progressively indicating new content in an application-selectable user interface
USD735736S1 (en) * 2012-01-06 2015-08-04 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
US10430917B2 (en) 2012-01-20 2019-10-01 Microsoft Technology Licensing, Llc Input mode recognition
US10013030B2 (en) 2012-03-02 2018-07-03 Microsoft Technology Licensing, Llc Multiple position input device cover
US9678542B2 (en) 2012-03-02 2017-06-13 Microsoft Technology Licensing, Llc Multiple position input device cover
US9904327B2 (en) 2012-03-02 2018-02-27 Microsoft Technology Licensing, Llc Flexible hinge and removable attachment
US10963087B2 (en) 2012-03-02 2021-03-30 Microsoft Technology Licensing, Llc Pressure sensitive keys
US9262059B2 (en) * 2012-03-27 2016-02-16 Lg Electronics Inc. Optimization of application execution based on length of pulled out flexible display screen
US20130275910A1 (en) * 2012-03-27 2013-10-17 Lg Electronics Inc. Optimization of application execution based on length of pulled out flexible display screen
US10678743B2 (en) 2012-05-14 2020-06-09 Microsoft Technology Licensing, Llc System and method for accessory device architecture that passes via intermediate processor a descriptor when processing in a low power state
USD742396S1 (en) * 2012-08-28 2015-11-03 General Electric Company Display screen with graphical user interface
US11853523B2 (en) 2012-12-06 2023-12-26 Samsung Electronics Co., Ltd. Display device and method of indicating an active region in a multi-window display
US10564792B2 (en) 2012-12-06 2020-02-18 Samsung Electronics Co., Ltd. Display device and method of indicating an active region in a multi-window display
US20180210643A1 (en) * 2013-02-17 2018-07-26 Benjamin Firooz Ghassabian Data entry systems
US10976922B2 (en) * 2013-02-17 2021-04-13 Benjamin Firooz Ghassabian Data entry systems
USD737841S1 (en) * 2013-03-14 2015-09-01 Microsoft Corporation Display screen with graphical user interface
USD735749S1 (en) * 2013-03-14 2015-08-04 Microsoft Corporation Display screen with graphical user interface
US20140304634A1 (en) * 2013-04-09 2014-10-09 Fujitsu Limited Electronic apparatus and computer-readable recording medium
US9367225B2 (en) * 2013-04-09 2016-06-14 Fujitsu Limited Electronic apparatus and computer-readable recording medium
US10126914B2 (en) * 2013-04-24 2018-11-13 Canon Kabushiki Kaisha Information processing device, display control method, and computer program recording medium
US20140325433A1 (en) * 2013-04-24 2014-10-30 Canon Kabushiki Kaisha Information processing device, display control method, and computer program recording medium
USD739873S1 (en) * 2013-06-10 2015-09-29 Huawei Technologies Co., Ltd. Display screen with icon
US10452256B2 (en) * 2013-07-25 2019-10-22 Samsung Electronics Co., Ltd. Non-interfering multi-application display method and an electronic device thereof
US20160202884A1 (en) * 2013-08-22 2016-07-14 Sony Corporation Information processing apparatus, storage medium and control method
US11687214B2 (en) 2013-08-30 2023-06-27 Samsung Electronics Co., Ltd. Method and apparatus for changing screen in electronic device
US20150067588A1 (en) * 2013-08-30 2015-03-05 Samsung Electronics Co., Ltd. Method and apparatus for changing screen in electronic device
US11137881B2 (en) 2013-08-30 2021-10-05 Samsung Electronics Co., Ltd. Method and apparatus for changing screen in electronic device
US10620774B2 (en) 2013-09-02 2020-04-14 Samsung Electronics Co., Ltd. Method and apparatus for providing multiple applications
US9857935B2 (en) 2013-09-02 2018-01-02 Samsung Electronics Co., Ltd. Method and apparatus for providing multiple applications
USD748140S1 (en) * 2013-09-03 2016-01-26 Samsung Electronics Co., Ltd. Display screen portion with icon
USD747351S1 (en) * 2013-09-03 2016-01-12 Samsung Electronics Co., Ltd. Display screen or portion thereof with icon
USD748673S1 (en) * 2013-09-03 2016-02-02 Samsung Electronics Co., Ltd. Display screen portion with icon
USD751603S1 (en) * 2013-09-03 2016-03-15 Samsung Electronics Co., Ltd. Display screen portion with icon
US10775992B2 (en) * 2013-09-06 2020-09-15 Seespace Ltd. Method and apparatus for controlling display of video content
US9846532B2 (en) * 2013-09-06 2017-12-19 Seespace Ltd. Method and apparatus for controlling video content on a display
US20150309687A1 (en) * 2013-09-06 2015-10-29 Seespace Ltd. Method and apparatus for controlling video content on a display
US10437453B2 (en) 2013-09-06 2019-10-08 Seespace Ltd. Method and apparatus for controlling display of video content
US11175818B2 (en) 2013-09-06 2021-11-16 Seespace Ltd. Method and apparatus for controlling display of video content
USD751082S1 (en) * 2013-09-13 2016-03-08 Airwatch Llc Display screen with a graphical user interface for an email application
US20150089427A1 (en) * 2013-09-26 2015-03-26 Yamaha Hatsudoki Kabushiki Kaisha Vessel display system and small vessel including the same
US10126748B2 (en) * 2013-09-26 2018-11-13 Yamaha Hatsudoki Kabushiki Kaisha Vessel display system and small vessel including the same
USD760780S1 (en) * 2013-09-30 2016-07-05 Terumo Kabushiki Kaisha Display screen with icon
US9841944B2 (en) * 2013-10-28 2017-12-12 Lenovo (Beijing) Co., Ltd. Method for processing information and electronic apparatus
US20150121229A1 (en) * 2013-10-28 2015-04-30 Lenovo (Beijing) Co., Ltd. Method for Processing information and Electronic Apparatus
US11570222B2 (en) * 2013-11-13 2023-01-31 T1V, Inc. Simultaneous input system for web browsers and other applications
US20210160291A1 (en) * 2013-11-13 2021-05-27 T1V, Inc. Simultaneous input system for web browsers and other applications
US20180095809A1 (en) * 2014-01-02 2018-04-05 Samsung Electronics Co., Ltd. Multi-window control method and electronic device supporting the same
US9891965B2 (en) * 2014-01-02 2018-02-13 Samsung Electronics Co., Ltd. Multi-window control method and electronic device supporting the same
US20150186024A1 (en) * 2014-01-02 2015-07-02 Samsung Electronics Co., Ltd. Multi-window control method and electronic device supporting the same
US10754711B2 (en) 2014-01-02 2020-08-25 Samsung Electronics Co., Ltd. Multi-window control method and electronic device supporting the same
US11494244B2 (en) 2014-01-02 2022-11-08 Samsung Electronics Co., Ltd. Multi-window control method and electronic device supporting the same
US20150286393A1 (en) * 2014-04-08 2015-10-08 Volkswagen Ag User interface and method for adapting a view on a display unit
US10061508B2 (en) * 2014-04-08 2018-08-28 Volkswagen Ag User interface and method for adapting a view on a display unit
US20150339804A1 (en) * 2014-05-26 2015-11-26 Samsung Electronics Co., Ltd. Electronic device and method for operating display
US10068315B2 (en) * 2014-05-26 2018-09-04 Samsung Electronics Co., Ltd. Electronic device and method for operating display
EP2950196A1 (en) * 2014-05-26 2015-12-02 Samsung Electronics Co., Ltd Electronic device and method for operating display
US10402007B2 (en) 2014-06-12 2019-09-03 Apple Inc. Systems and methods for activating a multi-tasking mode using an application selector that is displayed in response to a swipe gesture on an electronic device with a touch-sensitive display
US9785340B2 (en) 2014-06-12 2017-10-10 Apple Inc. Systems and methods for efficiently navigating between applications with linked content on an electronic device with a touch-sensitive display
US20170245017A1 (en) * 2014-06-12 2017-08-24 Apple Inc. Systems and Methods for Presenting and Interacting with a Picture-in-Picture Representation of Video Content on an Electronic Device with a Touch-Sensitive Display
US10795490B2 (en) * 2014-06-12 2020-10-06 Apple Inc. Systems and methods for presenting and interacting with a picture-in-picture representation of video content on an electronic device with a touch-sensitive display
US9648062B2 (en) * 2014-06-12 2017-05-09 Apple Inc. Systems and methods for multitasking on an electronic device with a touch-sensitive display
US20150365306A1 (en) * 2014-06-12 2015-12-17 Apple Inc. Systems and Methods for Multitasking on an Electronic Device with a Touch-Sensitive Display
US11592923B2 (en) 2014-06-12 2023-02-28 Apple Inc. Systems and methods for resizing applications in a multitasking view on an electronic device with a touch-sensitive display
US10732820B2 (en) 2014-06-12 2020-08-04 Apple Inc. Systems and methods for efficiently navigating between applications with linked content on an electronic device with a touch-sensitive display
US10126943B2 (en) * 2014-06-17 2018-11-13 Lg Electronics Inc. Mobile terminal for activating editing function when item on front surface display area is dragged toward side surface display area
US11294560B2 (en) * 2014-06-20 2022-04-05 Huawei Technologies Co., Ltd. Method and apparatus for changing the ratio between interfaces
USD882582S1 (en) * 2014-06-20 2020-04-28 Google Llc Display screen with animated graphical user interface
US20160092064A1 (en) * 2014-06-20 2016-03-31 Huawei Technologies Co., Ltd. Method and Apparatus for Displaying Application Interface, and Electronic Device
USD774062S1 (en) 2014-06-20 2016-12-13 Google Inc. Display screen with graphical user interface
USD754184S1 (en) * 2014-06-23 2016-04-19 Google Inc. Portion of a display panel with an animated computer icon
USD756398S1 (en) * 2014-06-23 2016-05-17 Google Inc. Portion of a display panel with an animated computer icon
US20210342058A1 (en) * 2014-06-30 2021-11-04 Reliance Jio Infocomm Usa, Inc. System and method for controlling errors in a system with a plurality of user-controlled devices using a network-controlled overlay
WO2016004116A1 (en) 2014-06-30 2016-01-07 Reliance Jio Infocomm Usa, Inc. System and method for providing a user-controlled overlay for user interface
EP3161596A4 (en) * 2014-06-30 2018-01-24 Reliance JIO Infocomm USA, Inc. System and method for providing a user-controlled overlay for user interface
US10324733B2 (en) 2014-07-30 2019-06-18 Microsoft Technology Licensing, Llc Shutdown notifications
US10678412B2 (en) 2014-07-31 2020-06-09 Microsoft Technology Licensing, Llc Dynamic joint dividers for application windows
US9787576B2 (en) 2014-07-31 2017-10-10 Microsoft Technology Licensing, Llc Propagating routing awareness for autonomous networks
US10592080B2 (en) 2014-07-31 2020-03-17 Microsoft Technology Licensing, Llc Assisted presentation of application windows
US10254942B2 (en) 2014-07-31 2019-04-09 Microsoft Technology Licensing, Llc Adaptive sizing and positioning of application windows
WO2016024776A1 (en) * 2014-08-14 2016-02-18 Samsung Electronics Co., Ltd. Electronic device and method for providing user interface
US11762550B2 (en) 2014-08-28 2023-09-19 Samsung Electronics Co., Ltd. Electronic device including touch sensitive display and method for managing the display
US11449220B2 (en) 2014-08-28 2022-09-20 Samsung Electronics Co., Ltd. Electronic device including touch sensitive display and method for managing the display
US10739987B2 (en) 2014-08-28 2020-08-11 Samsung Electronics Co., Ltd. Electronic device including touch sensitive display and method for managing the display
US11340752B2 (en) 2014-08-29 2022-05-24 Samsung Electronics Co., Ltd Window management method and electronic device supporting the same
USD851118S1 (en) * 2014-09-02 2019-06-11 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
US10338765B2 (en) 2014-09-05 2019-07-02 Microsoft Technology Licensing, Llc Combined switching and window placement
EP2998854A1 (en) * 2014-09-16 2016-03-23 Samsung Electronics Co., Ltd. Electronic device having independent screen configurations
US9880798B2 (en) * 2014-10-17 2018-01-30 Lenovo (Beijing) Co., Ltd. Method and electronic device for controlling displayed content based on operations
US20160110147A1 (en) * 2014-10-17 2016-04-21 Lenovo (Beijing) Co., Ltd. Display Method And Electronic Device
US10073976B2 (en) 2014-10-24 2018-09-11 Samsung Electronics Co., Ltd. Application executing method and device, and recording medium thereof
US10942574B2 (en) 2014-11-12 2021-03-09 Samsung Electronics Co., Ltd. Apparatus and method for using blank area in screen
US10534434B2 (en) 2014-11-12 2020-01-14 Samsung Electronics Co., Ltd. Apparatus and method for using blank area in screen
US10509547B2 (en) 2014-12-18 2019-12-17 Samsung Electronics Co., Ltd. Electronic device and method for controlling a display
US11782595B2 (en) 2014-12-29 2023-10-10 Samsung Electronics Co., Ltd. User terminal device and control method thereof
US20200356265A1 (en) 2014-12-29 2020-11-12 Samsung Electronics Co., Ltd. User terminal device and control method thereof
US10552031B2 (en) 2014-12-30 2020-02-04 Microsoft Technology Licensing, Llc Experience mode transition
KR102297330B1 (en) 2015-01-16 2021-09-02 삼성전자주식회사 Method for controlling display and an electronic device thereof
KR20160088631A (en) * 2015-01-16 2016-07-26 삼성전자주식회사 Method for controlling display and an electronic device thereof
US11249592B2 (en) 2015-01-16 2022-02-15 Samsung Electronics Co., Ltd. Method of splitting display area of a screen and electronic device for processing the same
US20160209973A1 (en) * 2015-01-21 2016-07-21 Microsoft Technology Licensing, Llc. Application user interface reconfiguration based on an experience mode transition
CN108027695A (en) * 2015-01-21 2018-05-11 微软技术许可有限责任公司 Application user interface based on experience mode conversion reconfigures
US9535253B2 (en) 2015-02-09 2017-01-03 Microsoft Technology Licensing, Llc Display system
US9372347B1 (en) 2015-02-09 2016-06-21 Microsoft Technology Licensing, Llc Display system
US10018844B2 (en) 2015-02-09 2018-07-10 Microsoft Technology Licensing, Llc Wearable image display system
US9429692B1 (en) 2015-02-09 2016-08-30 Microsoft Technology Licensing, Llc Optical components
US9423360B1 (en) 2015-02-09 2016-08-23 Microsoft Technology Licensing, Llc Optical components
US9827209B2 (en) 2015-02-09 2017-11-28 Microsoft Technology Licensing, Llc Display system
US10317677B2 (en) 2015-02-09 2019-06-11 Microsoft Technology Licensing, Llc Display system
US9513480B2 (en) 2015-02-09 2016-12-06 Microsoft Technology Licensing, Llc Waveguide
US11086216B2 (en) 2015-02-09 2021-08-10 Microsoft Technology Licensing, Llc Generating electronic components
USD854035S1 (en) 2015-05-17 2019-07-16 Google Llc Display screen with an animated graphical user interface
USD919641S1 (en) 2015-05-17 2021-05-18 Google Llc Display screen with an animated graphical user interface
USD899444S1 (en) 2015-05-17 2020-10-20 Google Llc Display screen with an animated graphical user interface
USD770530S1 (en) * 2015-05-27 2016-11-01 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
CN105573740A (en) * 2015-06-30 2016-05-11 宇龙计算机通信科技(深圳)有限公司 Split-screen display mode operation method and terminal
USD808421S1 (en) * 2015-07-07 2018-01-23 Google Llc Display screen or portion thereof with a transitional graphical user interface component for identifying current location
US20170017355A1 (en) * 2015-07-13 2017-01-19 Lg Electronics Inc. Mobile terminal and control method thereof
US10649791B2 (en) * 2015-07-14 2020-05-12 Samsung Electronics Co., Ltd. Method for an initial setup and electronic device thereof
KR20170058152A (en) * 2015-11-18 2017-05-26 삼성전자주식회사 Electronic apparatus and method for configuring of display thereof
EP3171242A1 (en) * 2015-11-18 2017-05-24 Samsung Electronics Co., Ltd. Electronic device and method for configuring display thereof
EP3712742A1 (en) * 2015-11-18 2020-09-23 Samsung Electronics Co., Ltd. Electronic device and method for configuring display thereof
KR102426070B1 (en) 2015-11-18 2022-07-28 삼성전자 주식회사 Electronic apparatus and method for configuring of display thereof
US10921967B2 (en) 2015-11-18 2021-02-16 Samsung Electronics Co., Ltd. Electronic device and method for configuring display thereof
US11720421B2 (en) 2016-01-08 2023-08-08 Nasdaq, Inc. Systems and methods for calendar synchronization with enterprise web applications
US20170199771A1 (en) * 2016-01-08 2017-07-13 Nasdaq, Inc. Systems and methods for calendar synchronization with enterprise web applications
US11449368B2 (en) * 2016-01-08 2022-09-20 Nasdaq, Inc. Systems and methods for calendar synchronization with enterprise web applications
US20170205990A1 (en) * 2016-01-14 2017-07-20 Lenovo (Beijing) Limited Method, system, and apparatus for controlling display regions for an electronic device
USD792462S1 (en) 2016-01-26 2017-07-18 Google Inc. Display screen with transitional graphical user interface for image navigation and selection
USD793440S1 (en) * 2016-01-26 2017-08-01 Google Inc. Display screen with transitional graphical user interface
USD832885S1 (en) 2016-01-26 2018-11-06 Google Llc Display screen with a transitional graphical user interface for image navigation and selection
US10937390B2 (en) * 2016-02-18 2021-03-02 Samsung Electronics Co., Ltd. Content display method and electronic device for performing same
US20190244586A1 (en) * 2016-02-18 2019-08-08 Samsung Electronics Co., Ltd. Content display method and electronic device for performing same
CN105892823A (en) * 2016-04-27 2016-08-24 宇龙计算机通信科技(深圳)有限公司 Multi-window editing method, system and mobile terminal
USD808428S1 (en) 2016-06-29 2018-01-23 Quantcast Corporation Display screen or portion thereof with icon
US10990268B2 (en) * 2016-09-22 2021-04-27 Beijing Bytedance Network Technology Co Ltd. Operation method and terminal device
US10444920B2 (en) 2016-10-17 2019-10-15 Samsung Electronics Co., Ltd. Electronic device and method for controlling display in electronic device
US10642437B2 (en) 2016-10-17 2020-05-05 Samsung Electronics Co., Ltd. Electronic device and method for controlling display in electronic device
US11093049B2 (en) 2016-10-17 2021-08-17 Samsung Electronics Co., Ltd. Electronic device and method for controlling display in electronic device
WO2018074798A1 (en) * 2016-10-17 2018-04-26 Samsung Electronics Co., Ltd. Electronic device and method for controlling display in electronic device
US20180165005A1 (en) * 2016-12-13 2018-06-14 Lg Electronics Inc. Mobile terminal and method for controlling the same
US10564845B2 (en) * 2016-12-13 2020-02-18 Lg Electronics Inc. Mobile terminal and method for controlling the same
CN108614675A (en) * 2016-12-13 2018-10-02 Lg电子株式会社 Mobile terminal and its control method
CN106843638A (en) * 2016-12-26 2017-06-13 北京奇艺世纪科技有限公司 Control method and device for a video playback terminal, and video playback terminal
US10203982B2 (en) * 2016-12-30 2019-02-12 TCL Research America Inc. Mobile-phone UX design for multitasking with priority and layered structure
US20180189099A1 (en) * 2016-12-30 2018-07-05 TCL Research America Inc. Mobile-phone ux design for multitasking with priority and layered structure
CN108268251A (en) * 2016-12-30 2018-07-10 Tcl集团股份有限公司 The user experience design method and system and medium of the mobile phone of multitasking
USD823871S1 (en) * 2017-02-03 2018-07-24 Google Llc Display screen with animated graphical user interface
RU2710309C2 (en) * 2017-06-15 2019-12-25 Боргвард Трейдмарк Холдингс ГмбХ Method and apparatus for processing split screen and vehicle
US11243660B2 (en) 2017-07-28 2022-02-08 Beijing Xiaomi Mobile Software Co., Ltd. Method and device for displaying application, and storage medium
EP3435218A1 (en) * 2017-07-28 2019-01-30 Beijing Xiaomi Mobile Software Co., Ltd. Method and device for displaying application, and storage medium
CN113946245A (en) * 2017-08-24 2022-01-18 华为技术有限公司 Split screen display method and device and terminal
US11405725B2 (en) * 2017-09-08 2022-08-02 Samsung Electronics Co., Ltd. Method for controlling audio output by application through earphones and electronic device implementing same
WO2019143071A1 (en) 2018-01-22 2019-07-25 Samsung Electronics Co., Ltd. Electronic device for controlling a plurality of applications
US10929002B2 (en) 2018-01-22 2021-02-23 Samsung Electronics Co., Ltd. Electronic device for controlling a plurality of applications
USD890774S1 (en) * 2018-02-22 2020-07-21 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
TWI672632B (en) * 2018-02-26 2019-09-21 宏碁股份有限公司 Method for filtering screen split configurations and computer device using the same
USD861721S1 (en) * 2018-04-09 2019-10-01 Palm Ventures Group, Inc. Display screen or portion thereof with a graphical user interface for handling swipe gesture
USD922997S1 (en) 2018-04-09 2021-06-22 Palm Ventures Group, Inc. Personal computing device
USD931887S1 (en) 2018-04-09 2021-09-28 Palm Ventures Group, Inc. Display screen or portion thereof with a graphical user interface for handling swipe gesture
USD874495S1 (en) 2018-04-09 2020-02-04 Palm Ventures Group, Inc. Display screen or portion thereof with a graphical user interface for an application launcher
USD927522S1 (en) 2018-04-09 2021-08-10 Palm Ventures Group, Inc. Display screen or portion thereof with a graphical user interface for an application launcher
US10628987B2 (en) * 2018-05-08 2020-04-21 Google Llc Condensed transitions of graphical elements presented in graphical user interfaces
US11249630B2 (en) * 2018-06-29 2022-02-15 Beijing Microlive Vision Technology Co., Ltd Method, apparatus, terminal device, and storage medium for switching global special effects
RU2764157C1 (en) * 2018-06-29 2022-01-13 Бэйцзин Майкролайв Вижн Текнолоджи Ко., Лтд Method and apparatus for switching global special effects, terminal apparatus and data carrier
US10838594B2 (en) * 2018-07-03 2020-11-17 Canon Production Printing Holding B.V. Method of controlling a user interface
USD890198S1 (en) * 2018-08-21 2020-07-14 Facebook, Inc. Display screen with graphical user interface
USD928192S1 (en) 2018-08-21 2021-08-17 Facebook, Inc. Display screen with graphical user interface
USD894921S1 (en) 2018-08-21 2020-09-01 Facebook, Inc. Display screen with graphical user interface
US11017164B1 (en) 2018-08-27 2021-05-25 Facebook, Inc. Systems and methods for collecting multiple forms of digital content using a single landing screen
US11874886B1 (en) 2018-08-27 2024-01-16 Meta Platforms, Inc. Systems and methods for creating interactive metadata elements in social media compositions
US10942978B1 (en) 2018-08-27 2021-03-09 Facebook, Inc. Systems and methods for creating interactive metadata elements in social media compositions
US11838258B1 (en) 2018-09-05 2023-12-05 Meta Platforms, Inc. Systems and methods for creating multiple renditions of a social media composition from inputs to a single digital composer
US11025582B1 (en) 2018-09-05 2021-06-01 Facebook, Inc. Systems and methods for creating multiple renditions of a social media composition from inputs to a single digital composer
US11249643B2 (en) * 2018-10-26 2022-02-15 Samsung Electronics Co., Ltd Electronic device for displaying list of executable applications on split screen and operating method thereof
CN112689818A (en) * 2018-11-14 2021-04-20 深圳市柔宇科技股份有限公司 Anti-disturbance method, electronic device and computer readable storage medium
US20220276752A1 (en) * 2019-04-15 2022-09-01 Apple Inc. Systems, Methods, and User Interfaces for Interacting with Multiple Application Windows
US11698716B2 (en) * 2019-04-15 2023-07-11 Apple Inc. Systems, methods, and user interfaces for interacting with multiple application windows
USD948533S1 (en) * 2019-04-25 2022-04-12 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
US11231847B2 (en) * 2019-05-06 2022-01-25 Apple Inc. Drag and drop for a multi-window operating system
US10996839B2 (en) * 2019-05-20 2021-05-04 Microsoft Technology Licensing, Llc Providing consistent interaction models in communication sessions
USD916873S1 (en) * 2019-06-19 2021-04-20 Stryker Corporation Display screen or portion thereof with graphical user interface
USD951969S1 (en) 2019-06-19 2022-05-17 Stryker Corporation Display screen or portion thereof with graphical user interface
USD988353S1 (en) 2019-06-25 2023-06-06 Stryker Corporation Display screen or portion thereof with graphical user interface
US11687235B2 (en) * 2019-06-30 2023-06-27 Huawei Technologies Co., Ltd. Split-screen method and electronic device
US20220308753A1 (en) * 2019-06-30 2022-09-29 Huawei Technologies Co., Ltd. Split-Screen Method and Electronic Device
US11816305B2 (en) * 2019-07-19 2023-11-14 Tencent Technology (Shenzhen) Company Limited Interface display method and apparatus, and storage medium
US20210397309A1 (en) * 2019-07-19 2021-12-23 Tencent Technology (Shenzhen) Company Limited Interface display method and apparatus, and storage medium
US11735143B2 (en) 2019-08-19 2023-08-22 Samsung Electronics Co., Ltd. Electronic device and method of controlling the same
US11158290B2 (en) 2019-08-19 2021-10-26 Samsung Electronics Co., Ltd. Electronic device and method of controlling the same
CN114168047A (en) * 2019-08-22 2022-03-11 华为技术有限公司 Application window processing method and device
US11340959B2 (en) * 2019-10-29 2022-05-24 Lg Electronics Inc. Electronic apparatus for running application and control method thereof
CN111142769A (en) * 2019-12-20 2020-05-12 维沃移动通信有限公司 Split screen display method and electronic equipment
US11054987B1 (en) * 2019-12-25 2021-07-06 Shanghai Transsion Co., Ltd. Sidebar interaction method, device, and computer-readable storage medium
USD940176S1 (en) * 2020-03-01 2022-01-04 Schlumberger Technology Corporation Display device with a graphical user interface having a responsive menu
US20220334855A1 (en) * 2020-03-17 2022-10-20 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Multi-task operation method, electronic device, and storage medium
US20220382427A1 (en) * 2020-05-25 2022-12-01 Beijing Bytedance Network Technology Co., Ltd. Method and apparatus for controlling display of video call interface, storage medium and device
US11853543B2 (en) * 2020-05-25 2023-12-26 Beijing Bytedance Network Technology Co., Ltd. Method and apparatus for controlling display of video call interface, storage medium and device
WO2022121790A1 (en) * 2020-12-09 2022-06-16 维沃移动通信有限公司 Split-screen display method and apparatus, electronic device, and readable storage medium
CN112540740A (en) * 2020-12-09 2021-03-23 维沃移动通信有限公司 Split screen display method and device, electronic equipment and readable storage medium
CN113783995A (en) * 2021-08-13 2021-12-10 维沃移动通信有限公司 Display control method, display control device, electronic apparatus, and medium
CN114546314A (en) * 2022-01-21 2022-05-27 合肥联宝信息技术有限公司 Window display method and device, electronic equipment and storage medium
WO2023245310A1 (en) * 2022-06-20 2023-12-28 北京小米移动软件有限公司 Window adjustment method and apparatus, and terminal and storage medium

Also Published As

Publication number Publication date
KR20140039575A (en) 2014-04-02
CN103677627B (en) 2020-12-11
US20210011610A1 (en) 2021-01-14
AU2013318697A1 (en) 2015-02-26
CN103677627A (en) 2014-03-26
KR101957173B1 (en) 2019-03-12
US11714520B2 (en) 2023-08-01
EP3493042A1 (en) 2019-06-05
EP2725466A1 (en) 2014-04-30
AU2013318697B2 (en) 2018-09-20
ES2706010T3 (en) 2019-03-27
EP2725466B1 (en) 2018-12-26
WO2014046525A1 (en) 2014-03-27

Similar Documents

Publication Publication Date Title
US11714520B2 (en) Method and apparatus for providing multi-window in touch device
US11809693B2 (en) Operating method for multiple windows and electronic device supporting the same
CA2835099C (en) Method and apparatus for sharing data between different network devices
KR102256702B1 (en) Foldable device and control method thereof
US8928614B2 (en) Method and apparatus for operating function in touch device
EP2797300B1 (en) Apparatus and method for transmitting an information in portable device
US9829706B2 (en) Control apparatus, information processing apparatus, control method, information processing method, information processing system and wearable device
KR20150080756A (en) Controlling Method For Multi-Window And Electronic Device supporting the same
US8994678B2 (en) Techniques for programmable button on bezel of mobile terminal
CN103677711A (en) Method for connecting mobile terminal and external display and apparatus implementing the same
KR20120138618A (en) Method and apparatus for operating multi tasking in a mobile device
US20150241957A1 (en) Control apparatus, information processing apparatus, control method, information processing method, information processing system and wearable device
KR20120081877A (en) Method for operating a communication terminal
KR20120073928A (en) Method for operating a communication terminal

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HWANG, DAESIK;JEONG, HYESOON;KIM, JEONGHOON;AND OTHERS;REEL/FRAME:031269/0665

Effective date: 20130912

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION