US20110004461A1 - Character communication device - Google Patents

Character communication device

Info

Publication number
US20110004461A1
Authority
US
United States
Prior art keywords
message
communication
character
language
messages
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/641,820
Inventor
Kei Nakajima
Takao Miyoshi
Yuji Naka
Shiro Maekawa
Katsumi Yabuno
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sega Corp
Original Assignee
Sega Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sega Corp filed Critical Sega Corp
Priority to US12/641,820
Publication of US20110004461A1
Status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/023 Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F 3/0233 Character input methods
    • G06F 3/0236 Character input methods using selection techniques to select from displayed items
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/85 Providing additional services to players
    • A63F 13/87 Communicating with other players during game play, e.g. by e-mail or chat
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F 13/33 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections
    • A63F 13/335 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections using Internet
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F 13/35 Details of game servers
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/533 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/90 Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
    • A63F 13/92 Video game devices specially adapted to be hand-held while playing
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/20 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
    • A63F 2300/204 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform the platform being a handheld device
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/308 Details of the user interface
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/40 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of platform network
    • A63F 2300/407 Data transfer via internet
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/50 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/50 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F 2300/57 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of game services offered to the player
    • A63F 2300/572 Communication between players during game play of non game information, e.g. e-mail, chat, file transfer, streaming of audio and streaming of video
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/60 Methods for processing data by generating or executing the game program
    • A63F 2300/6063 Methods for processing data by generating or executing the game program for sound processing
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10 TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10S TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S 715/00 Data processing: presentation processing of document, operator interface processing, and screen saver display processing
    • Y10S 715/961 Operator interface with visual structure or function dictated by intended use
    • Y10S 715/965 Operator interface with visual structure or function dictated by intended use for process control and configuration
    • Y10S 715/966 Computer process, e.g. operation of computer
    • Y10S 715/968 Computer process, e.g. operation of computer interface for database querying and retrieval

Definitions

  • the present invention relates to an apparatus including a character inputting device, and in particular to improved technology for inputting characters into an information processing device or a portable communication device which does not have a JIS (Japanese Industrial Standard) keyboard or another type of keyboard for inputting characters.
  • JIS: Japanese Industrial Standard
  • “Chat” is a method of online communication conducted on a computer network, either between parties connecting to the network or between a party connecting to the network and his host computer. “Chat” is a conversational function conducted through a network by inputting characters, and its participants generally input words with the aid of a JIS or an ASCII keyboard.
  • In order to participate in a “chat”, one must have a terminal device for connection to a network.
  • Various types of terminal devices may be used for this purpose, including a so-called “personal computer”, a portable terminal device, or a game device having a modem mounted thereon.
  • a foreigner may be included among the participants.
  • a transmitter sends a message in a language which is understood by the other party of the communication.
  • GUI: graphical user interface
  • the user displays a software keyboard for inputting characters on the screen of a device, moves a cursor over the keyboard to select a character, and has the selected characters recognized by a computer built into the device.
  • since chats are generally performed in a language commonly understood by the parties of communication, it is preferable that “chats” may be easily performed between parties whose languages of daily use are different.
  • one purpose of this invention is to facilitate the input of characters in an information processing device or a portable communication device which does not have a character inputting keyboard.
  • Another purpose of this invention is to provide a character inputting device easy to use even for those who have not learned to type on keyboards.
  • Another purpose of this invention is to provide a character inputting device which facilitates online conversations between communicators speaking different languages.
  • the character communication device is a communication device connected to a network and enabling communication of messages at least by characters, which has a transmitting and receiving means for implementing communication with the communication device of the other party of communication via a network, a communication content displaying means for displaying in a communication content displaying area of a screen display the content of communication with the communication device of the other party, a candidate term displaying means for displaying in a candidate term displaying area of the screen display a group of candidate terms prepared in advance for communication of the messages, a term selecting means for outputting a term selected by a transmitter out of the group of candidate terms, and a message forming means for serially displaying a plurality of output terms in a message editing area of the screen display, and thereby forming a message and sending the formed message to the transmitting and receiving means.
  • the communication device of the other party is either a host computer executing a program of a communication-type game in accordance with accesses from a plurality of computers having a character communication device, or a communication device operated by participants taking part in the communication-type game.
  • the candidate term displaying means receives the group of candidate terms from the communication device of the other party.
  • the group of candidate terms is classified at least according to the names of the participants in the game, or game-related nouns, pronouns, verbs, adjectives, inflections, symbols, or short sentences registered by the users.
  • the group of candidate terms is arranged in a table in a plurality of pages, and one of such plurality of pages is displayed in the candidate term displaying area.
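The paged candidate-term table and serial message forming described above can be sketched as follows. This is a minimal illustration: the class, function, and vocabulary names are assumptions for the example, not taken from the patent.

```python
# Sketch of forming a message by selecting terms from a paged candidate-term
# table, as in the candidate term displaying / term selecting / message
# forming means described above. Vocabulary and names are illustrative.

PAGE_SIZE = 8
CANDIDATE_TERMS = ["attack", "defend", "go", "north", "castle", "help",
                   "yes", "no", "wait", "follow", "me", "!"]

def page(n):
    """Return one page of the candidate-term table for the display area."""
    return CANDIDATE_TERMS[n * PAGE_SIZE:(n + 1) * PAGE_SIZE]

class MessageEditor:
    """Serially collects selected terms in the message editing area."""
    def __init__(self):
        self.terms = []

    def select(self, term):
        # term selecting means: output a term chosen by the transmitter
        self.terms.append(term)

    def form_message(self):
        # message forming means: join the output terms into one message
        return " ".join(self.terms)

editor = MessageEditor()
for t in ("go", "north", "!"):
    editor.select(t)
print(editor.form_message())  # go north !
```

Paging keeps the on-screen table small enough to navigate with a game controller rather than a keyboard.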
  • Another communication device of the present invention has a storing means for storing a message language conversion table including a single message expressed in a plurality of languages, a menu displaying means for selectably displaying on a menu screen of a display a plurality of messages expressed in a single language, a language conversion means for referring to the message language conversion table and converting a selected message to a message in a language of a party with whom communication is held, and a transmitting means for transmitting the converted message to the communication counterpart.
  • chats: communication by characters
  • conversations can be held in their respective languages.
  • the load of information processing for conversion of languages is conveniently small, because the user only has to select applicable messages out of a plurality of messages displayed in different languages.
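The message language conversion table can be sketched as follows. The table contents, language codes, and function names are assumptions for illustration; the patent does not specify a concrete format.

```python
# Sketch of the message language conversion table: a single message is stored
# in several languages, the menu shows it in the sender's language, and the
# conversion means looks up the other party's language before transmission.

MESSAGE_TABLE = {
    # message id -> the same message expressed in a plurality of languages
    1: {"ja": "こんにちは", "en": "Hello", "fr": "Bonjour"},
    2: {"ja": "さようなら", "en": "Goodbye", "fr": "Au revoir"},
}

def menu(language):
    """Menu displaying means: list the messages in a single language."""
    return [(mid, texts[language]) for mid, texts in sorted(MESSAGE_TABLE.items())]

def convert(message_id, target_language):
    """Language conversion means: render a selected message in the
    language of the party with whom communication is held."""
    return MESSAGE_TABLE[message_id][target_language]

# A Japanese-speaking sender picks message 1 from a Japanese-language menu;
# the device transmits the English rendering to an English-speaking party.
print(convert(1, "en"))  # Hello
```

Because conversion is a table lookup rather than machine translation, the processing load stays small, as the text above notes.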
  • another communication device of the present invention has a storing means for storing a database including a plurality of messages in a single language and marks added to each of the messages, a menu displaying device for selectably displaying some of the plurality of messages on a menu screen of a display, a message mark outputting means for outputting the marks corresponding to a selected message by referring to the database, and a transmitting means for transmitting the output message marks to a party with whom communication is held.
  • as a device for inputting characters, it is possible to perform input operations via a device which can be used for selective operation, for example via a game controller (a control pad, joystick, etc.) serving as the inputting device of the game device.
  • a game controller: a control pad, joystick, etc.
  • the character communication system is a character communication system connected to a network and structured to include a plurality of character communication devices capable of performing message communication at least by characters, and includes a first character communication device used for communication by a first message group in a single language, and a second character communication device used for communication by a second message group in a plurality of languages, wherein communication between the first and second character communication devices is performed using marks commonly added in advance to messages having common meanings, included in each of the first message group and the second message group.
  • the character communication system is a character communication system having a first character communication device for displaying messages in a first language, a second character communication device for displaying messages in a second language, and a network for connecting both communication devices, wherein the first character communication device has a storing means for storing a message table including a plurality of messages in the first and second languages, a message displaying means for selectably displaying on a screen some of the plurality of messages in the first language, a language conversion means for converting an applicable message in the first language to a message in the second language by referring to the message table, and a transmitting means for transmitting the converted message to the second character communication device.
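The mark-based variant can be sketched as follows, assuming hypothetical mark identifiers and tables. Here each device stores messages only in its own language, and only the shared mark travels over the network; neither side needs the other's language data.

```python
# Sketch of mark-based communication: messages with common meanings carry
# marks commonly added in advance, so the sender transmits a mark and the
# receiver renders it in its own language. Tables and marks are illustrative.

MARKS_JA = {"M01": "こんにちは", "M02": "はい"}   # Japanese-side database
MARKS_EN = {"M01": "Hello", "M02": "Yes"}        # English-side database

def send(local_table, text):
    """Message mark outputting means: find the mark for a selected
    local-language message and transmit the mark, not the text."""
    for mark, message in local_table.items():
        if message == text:
            return mark
    raise KeyError(text)

def receive(local_table, mark):
    """Render a received mark as a message in the receiver's language."""
    return local_table[mark]

mark = send(MARKS_JA, "こんにちは")   # sender side outputs "M01"
print(receive(MARKS_EN, mark))        # receiver side prints "Hello"
```

Transmitting marks instead of converted text keeps the payload language-neutral, which is what lets the first (single-language) and second (multi-language) devices interoperate.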
  • the character inputting device is a character communication device for inputting characters expressed by a combination of vowels and consonants, which includes a consonant inputting means for mainly inputting consonant information of characters which are to be input, a vowel inputting means for mainly inputting vowel information of characters which are to be input, and an input character identifying means for identifying characters input according to a combination of consonant information and vowel information.
  • Japanese kana characters can be input easily and intuitively. This is especially advantageous when inputting Japanese characters.
  • the consonant inputting means and the vowel inputting means are respectively arranged on the operation panel of the character inputting device, and wherein the consonant inputting means is arranged such that it is positioned toward one side of the operation panel with respect to the center position thereof in the longitudinal direction, and the vowel inputting means is arranged to be positioned toward the other side of the operation panel with respect to the center position thereof in the longitudinal direction, and both of the consonant inputting means and the vowel inputting means can be operated manually with both hands holding the operation panel.
  • the device can be used with both hands for inputting characters even when the device is held in the user's hand. Therefore, this is advantageous for a character inputting device of a portable communication device.
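The consonant/vowel identification can be sketched as follows. The kana table below is a small illustrative excerpt (two consonant rows only); in the described device, the consonant would come from the figure keys under one hand and the vowel from the cross key under the other.

```python
# Sketch of the input character identifying means: each kana is addressed
# by a (consonant, vowel) pair, mirroring the 5-vowel structure of the
# Japanese syllabary. Table entries are an illustrative excerpt.

KANA = {
    ("k", "a"): "か", ("k", "i"): "き", ("k", "u"): "く",
    ("k", "e"): "け", ("k", "o"): "こ",
    ("s", "a"): "さ", ("s", "i"): "し", ("s", "u"): "す",
    ("s", "e"): "せ", ("s", "o"): "そ",
}

def identify(consonant, vowel):
    """Identify the input character from the combination of consonant
    information and vowel information."""
    return KANA[(consonant, vowel)]

print(identify("k", "a"))  # か
```

Because every kana row shares the same five vowels, one consonant key plus a five-way vowel control (e.g. a cross key) is enough to address the whole syllabary.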
  • the consonant inputting means can be structured by a figure keyboard mainly including a plurality of figure keys.
  • the vowel inputting means can be structured by a composite switch for producing a plurality of types of output in accordance with the operational status of one actuator.
  • the aforementioned composite switch can be structured by a cross key, a joystick, a pointing device, a trackball, etc.
  • the portable communication device including the aforementioned character inputting device is easy to understand using one's senses, even for those who are not accustomed to keyboard operation, and therefore, input in the Japanese language is easily conducted. Secondly, even when the portable communication device is held in the user's hand, both hands can be used upon inputting operations. Thirdly, since the built-in character inputting device is small and takes up less space, it is advantageous for a portable communication device.
  • FIG. 1 is an explanatory view illustrating a message forming screen of a character communication device according to the present invention.
  • FIG. 2 is an explanatory view illustrating the overall structure of a communication network.
  • FIG. 3 is a block diagram illustrating an example of a game device which has a communicational function.
  • FIG. 4 is a flowchart illustrating a message formation processing in the character communication device.
  • FIGS. 5A and 5B are explanatory diagrams illustrating examples of term categories.
  • FIGS. 6A and 6B are explanatory diagrams illustrating examples of term categories.
  • FIGS. 7A and 7B are explanatory diagrams illustrating examples of term categories.
  • FIG. 8A is an explanatory diagram illustrating a mini-window for inputting figures.
  • FIG. 8B is an explanatory diagram illustrating a mini-window for inputting te/ni/wo/ha particles (Japanese postpositional particles).
  • FIGS. 9A and 9B are explanatory diagrams illustrating a character conversation where message cards are used.
  • FIG. 10 is an explanatory diagram illustrating a character conversation conducted by three parties.
  • FIG. 11 is a flowchart illustrating communication by three parties using message cards.
  • FIG. 12 is an explanatory view illustrating an example of a message conversion table.
  • FIG. 13 is an explanatory view illustrating an example of a portable communication device (portable information processing device).
  • FIG. 14 is an explanatory view illustrating an example of an inner circuit structure of the portable communication device.
  • FIG. 15 is a flowchart illustrating an algorithm for distinguishing characters input through the operation of figure keys from those input through the operation of a cross key.
  • FIG. 2 is an overall block diagram showing the outline of the communication system according to the present invention.
  • FIG. 2 shows direct connection of communication terminal devices 201 and 202 capable of character communication via a network 203 , as well as indirect connection of terminal devices 201 and 202 via a host computer 210 .
  • the network 203 includes a public communication line, a dedicated line, the Internet, a LAN, etc.
  • the host computer 210 includes a data processing function and a data exchange function, and is connected to the terminal devices 201 and 202 via the aforementioned network 203 .
  • the data processing above may include host functions of a communication-type game.
  • the host computer, which is a server of the game, provides event information, map information, game parameters, coordinate moving information, character status information and other information.
  • a plurality of terminal devices are connected to the network 203 , and the terminal devices contemplated here are not only limited to the terminal devices placed in domestic areas but also those placed abroad.
  • the terminal device includes a personal computer or a game device having a communicational function.
  • the terminal device includes at least a main body, a display, and an inputting device.
  • the terminal device may be accomplished, for example, by a household game device.
  • the game device includes a main body 1 of the game device, and a game controller (for example, control pad) 2 b which serves as a game input device.
  • the terminal device 202 may also be accomplished by a similar structure, and in the present embodiment, a keyboard 4 is further provided. However, the keyboard 4 is not essential to the present invention.
  • FIG. 3 shows one example of the terminal device 201 or 202 , and here a game device having a modem is used.
  • This game device can be used as a network terminal device, and one may play the so-called “communication-type game”.
  • the game device is mainly structured by a CPU block 10 for controlling the entire device, a video block 11 for controlling display of a game screen, a sound block 12 for generating sound effects, etc., a subsystem 13 for reading CD-ROMs, and a modem 14 for implementing data communication with an outside party.
  • the CPU block 10 is structured by an SCU (System Control Unit) 100 , a main CPU 101 , a RAM 102 , a ROM 103 , a cartridge I/F 1 a for connecting a modem, etc., a sub CPU 104 , a CPU bus 105 , etc.
  • the main CPU 101 controls the entire device.
  • the main CPU 101 includes inside thereof a computing function similar to a DSP (Digital Signal Processor), and is capable of executing application software at a high speed.
  • the main CPU 101 automatically recognizes the type of a peripheral (FDD 3 b in FIG. 3 ) connected to a connector 3 a, and conducts data communication with such peripheral. More specifically, the peripheral is connected to the SCI which is built into the main CPU 101 .
  • connected to the serial connector 3 a are SCI signal lines from a master SH and a slave SH, three each, as well as MIDI in/out signal lines from an SCSP (sound DSP).
  • the FDD 3 b is used, for example, for storing the data of the backup memory (stores various data and parameters of a game; not shown in the Figure) on a floppy disk, or for copying floppy disk data into the backup memory.
  • the RAM 102 is used as a work area of the main CPU 101 .
  • the ROM 103 has an initial program for initialization processing, etc. written thereto.
  • the SCU 100 allows, by controlling buses 105 , 106 and 107 , smooth input and output of data between the main CPU 101 and VDPs 120 , 130 , DSP 140 , CPU 141 , etc.
  • the SCU 100 internally includes a DMA controller, and is capable of transmitting sprite data to a VRAM of the video block 11 during a game. This allows execution of games and other application software at a high speed.
  • the cartridge I/F 1 a allows input of application software which is provided in the form of a ROM cartridge (not shown).
  • the modem 14 is provided in the form of a cartridge, and allows transmission and reception of data.
  • By using a modem, one can play the so-called “communication-type game”.
  • the aforementioned game parameters, etc. are exchanged between the game server and the main CPU 101 .
  • the sub CPU 104 which is called the “SMPC (System Manager & Peripheral Condition),” has functions such as collecting peripheral data from a pad 2 b via a connector 2 a according to a request received from the main CPU 101 .
  • the main CPU 101 implements processing such as moving combat vessels shown in a game screen according to the peripheral data received from the sub CPU 104 .
  • a desired peripheral selected from a pad, a joystick, a keyboard, etc. can be connected to the connector 2 a.
  • the sub CPU 104 has a function of automatically recognizing the type of the peripheral connected to a connector 2 a (terminal on the main body side), and collecting peripheral data, etc. in a communication system according to the type of each peripheral.
  • the video block 11 includes the VDP (Video Display Processor) 120 for drawing video game characters, etc., which are made of polygons, and the VDP 130 intended for drawing background pictures, synthesizing polygon picture data with the background pictures, and implementing clipping processing, etc.
  • the VDP 120 is connected to a VRAM 121 and frame buffers 122 and 123 .
  • the polygon drawing data representing the characters of the video game device are sent from the main CPU 101 to the VDP 120 via the SCU 100 , and then written in the VRAM 121 .
  • the drawing data written in the VRAM is drawn in the drawing frame buffer 122 or 123 in, for example, a 16- or 8-bit/pixel form.
  • the data of the frame buffer 122 or 123 which includes drawings therein are sent to the VDP 130 .
  • Information for controlling the drawing processing is provided from the main CPU 101 to the VDP 120 via the SCU 100 .
  • the VDP 120 then executes drawing processing according to these orders.
  • the VDP 130 is connected to the VRAM 131 , and is structured such that the picture data output from the VDP 130 are output to an encoder 160 via a memory 132 .
  • the encoder 160 adds synchronizing signals to these picture data and thereby produces image data, and outputs such data to a TV receiver 5 . As a consequence, various types of game screens are displayed on the TV receiver 5 .
  • the sound block 12 is structured by the DSP 140 for synthesizing sounds according to the PCM or FM method, and the CPU 141 for controlling the DSP 140 . Sound data produced by the DSP 140 are converted to 2-channel signals by the D/A converter 170 and thereafter output to a speaker 5 b.
  • the subsystem 13 is structured by a CD-ROM drive 1 b, a CD I/F 180 , a CPU 181 , an MPEG AUDIO 182 , an MPEG VIDEO 183 , etc.
  • the subsystem 13 has functions such as reading application software provided in a form of a CD-ROM, and replaying animations.
  • the CD-ROM drive 1 b reads data from the CD-ROM.
  • the CPU 181 implements processing such as controlling the CD-ROM drive 1 b, correcting errors in data which are read out. Data read from the CD-ROM are provided to the main CPU 101 via the CD I/F 180 , the bus 106 , the SCU 100 , and are used as application software.
  • the MPEG AUDIO 182 and the MPEG VIDEO 183 are devices for restoring data compressed according to the MPEG (Moving Picture Experts Group) standard. By restoring the MPEG compressed data written in the CD-ROM with the aid of the MPEG AUDIO 182 and the MPEG VIDEO 183 , animations can be replayed.
  • MPEG: Moving Picture Experts Group
  • graphic data including fonts, sound data, tutorial maps, mail documents (backup RAMs), etc.
  • Graphic data, etc. are provided, for example, by a CD-ROM.
  • graphic data are held on the side of the game device, and a data group including parameters is held on the side of the server. Parameters and map information are administered on the side of the server in their entirety, and information on the results thereof is received on the side of the game device, where display of the screen is processed in a specialized manner.
  • Operation of the map data and parameters by the server allows for the provision of games with new content without CD-ROMs being changed. Furthermore, game parameters (strength of characters) held on the side of the server allow entry of new types of monsters according to the rise of the user's level.
  • In a communication-type game, one may find game comrades by playing the game through a network.
  • RPG: Role Playing Game
  • multiple players at different locations can form a party in a virtual game space.
  • Individual characters representing the respective players can be controlled in the virtual space.
  • intentions of the players must be mutually communicated. For example, players need to talk with a party whom they meet for the first time, to discuss with a comrade the destination of their adventure, or to consult on strategies at the time of a combat.
  • a chat function (a real-time conversation system using characters) is provided.
  • a telegram function can be used while the other player is logged onto the server, which allows a player to send messages to a specific party (or the other party) regardless of the location or the status of the player or the other party. This function is effective when contacting one's friends and acquaintances. Furthermore, in a virtual game space, there may be provided a bulletin board as a means of communication united with the world of the game, or a letter transmission function for sending letters to a specific party.
  • the chat function and the telegram function, etc. described above are usually performed with the aid of a keyboard intended for inputting characters.
  • the keyboard is generally provided as an optional item, and is not included in the same package as the main body of the game device. Accordingly, a character interface for inputting characters is prepared for the purpose of typing keywords. Nonetheless, such an inputting device is not limited to game pads.
  • a character interface (character inputting device) will be explained below with reference to FIGS. 1 through 4 .
  • a control pad 2 b is provided with, for example, the respective switches in the forms of buttons A, B, C, X, Y, Z, L, R, and further with a cross key. In a character inputting mode for chats, etc., prescribed functions are assigned to the respective buttons.
  • button A is a shift key for switching the cross key to move between categories
  • button B is for back space (deletion of one letter)
  • button C is for determining words
  • button X is for turning on/off the display of a mini-window for inputting figures
  • button Y is for turning on/off the display of a mini-window for inputting te/ni/wo/ha particles (Japanese postpositional particles)
  • button Z is for CR (message transmission)
  • button L is for moving the cursor forward by one keyword category
  • button R is for moving the cursor backward by one keyword category
  • the cross key has a function of outputting orders for selecting words (it moves between categories when button A is pressed).
  • FIG. 1 shows an example of a chat screen.
  • a chat window (communication content displaying area) 51 for displaying the content of conversation
  • a keyword category window (candidate term displaying area) 52 for displaying keywords (terms) according to categories
  • an editing window (message editing area) 53 for editing messages.
  • the main CPU 101 executes a program (algorithm) as shown in FIG. 4 .
  • For example, in the case of a party organizing mode, if button A of the pad 2 b is pressed by a player upon a scene where participation in a prescribed team is recruited, at the state of a guidance screen displayed during the game, or upon a scene where the characters discuss and exchange information at a bar for making friends, then the chat mode is designated and a flag is thereby placed. As a result, messages can be exchanged, and the party members can consult with one another.
  • the CPU 101 reads parameters (situation parameters) showing the current situation (for example, scenes) (S 102 ).
  • a variety of “current situations” is contemplated, including participation in a party, sending a telegram, sending a letter, viewing a guidance screen, conversation at a bar, a strategy meeting prior to a combat, discussing a course of adventure, etc.
  • a set of terms corresponding to the current situation is read from a database recorded in a CD-ROM and is displayed on the keyword category window 52 . Furthermore, as shown in FIGS. 5 through 8 , sets of terms are prepared according to categories.
  • FIG. 5A shows a table for category 1 , collecting terms related to names. Registered here are personal pronouns, names of party participants, and names of other communicators. Furthermore, senders of telegrams or letters can be automatically registered to the table.
  • FIG. 5B shows a table for category 2 , collecting terms such as nouns and pronouns.
  • FIG. 6A shows a table for category 3 , which collects terms such as verbs and adjectives.
  • FIG. 6B shows a table for category 4 , where inflection terms are collected.
  • FIG. 7A shows a table for category 5 , collecting symbols, emotional expressions, and other terms.
  • FIG. 7B shows a table for category 6 , collecting short sentences registered by the user in advance.
  • FIG. 8A shows the mini-window for figures
  • FIG. 8B shows the mini-window for te/ni/wo/ha particles (Japanese postpositional particles).
  • the player may switch and display the aforementioned sets of terms according to categories (fields, groups, items, etc.) (S 104 ).
  • the terms are divided into several pages, and the content of a single page selected (for example, page 1 , as shown in FIG. 1 ) is displayed.
  • the display area can be used effectively. This also enables easy switching of the content displayed (pages).
  • the player selects terms by operating the control pad 2 b.
  • By moving the cross key of the pad 2 b, the cursor (or reverse display or other selected display) is moved and a term is selected. When button A is pressed, categories are switched.
  • When button C is then pressed, the term to be selected is determined and the term selection flag is placed.
  • the selected term, for example, “yah-ho-” (“yoo-hoo” in Japanese), is transferred to the message editing program.
  • the message editing program has a simple editor function. With the aid of this program, the selected terms are displayed in the message editing window 53 (S 108 ). Subsequently, when “,” (comma), “konnichiwa” (“hello” in Japanese), and “。” (full stop in Japanese) are input, the editing window 53 similarly displays “yah-ho-, konnichiwa。” (“yoo-hoo, hello.” in Japanese).
  • messages can be modified, altered or input with a keyboard (S 106 through S 110 ; No).
  • the host computer system includes a database which stores sets of terms shown in FIGS. 5 through 8 , thereby enabling transmission of a relevant set of terms to the terminal device.
  • data transmission in characters can be performed between terminal devices (game device) without the intermediation of the host computer.
  • terms can be selected from a table for inputting characters, whereby messages can be formed and thereafter transmitted.
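The selection-and-assembly flow above (S 102 through S 110 ) can be sketched as follows. This is an illustrative editorial sketch, not part of the patent: the table contents, category names, and the `MessageEditor` class are assumptions for illustration only.

```python
# Hypothetical sketch of the term-selection flow: category tables are
# displayed, a term is picked with the control pad (button C), and picked
# terms are serially concatenated in the message editing area (window 53).

CATEGORY_TABLES = {
    "greetings": ["yah-ho-", "konnichiwa", "ohayou"],
    "symbols": [",", "。", "!"],
}

class MessageEditor:
    """Minimal editor corresponding to the message editing window 53."""

    def __init__(self):
        self.terms = []

    def select_term(self, category, index):
        # Corresponds to moving the cursor and pressing button C (S106).
        self.terms.append(CATEGORY_TABLES[category][index])

    def message(self):
        # Terms are serially displayed to form the message (S108).
        return "".join(self.terms)

editor = MessageEditor()
editor.select_term("greetings", 0)   # "yah-ho-"
editor.select_term("symbols", 0)     # ","
editor.select_term("greetings", 1)   # "konnichiwa"
editor.select_term("symbols", 1)     # "。"
print(editor.message())              # → "yah-ho-,konnichiwa。"
```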
  • The second embodiment will now be explained with reference to FIGS. 9 through 12 .
  • message cards shown on the screen are selected and mutually sent, thereby enabling the exchange of messages.
  • the languages of the message cards can be converted to overcome any inconvenience caused by the difference in languages. Since the second embodiment may have the structure of the device shown in FIGS. 2 and 3 , explanation of such structure will be omitted.
  • FIG. 9A shows an example of a display screen at the time the terminal device is set at the message exchange mode.
  • greeting message cards are displayed.
  • the first card reads “konnichiwa” (“hello” in Japanese), the second card “ohayou” (“good morning” in Japanese), and the third card “konbanwa” (“good evening” in Japanese).
  • the communicator operates the control pad and moves the cursor over an applicable message card, for example, “ohayou” (“good morning” in Japanese), and operates the selection button.
  • FIG. 9B shows an example of the other English-speaking communicator's display screen when the terminal device is in the message exchange mode.
  • “konnichiwa” (“hello” in Japanese), is converted and displayed as “HELLO”.
  • FIG. 10 shows an example where player 1 (Japanese language) sends a message card to player 2 (Japanese language) and player 3 (English language).
  • a message conversion table shown in FIG. 12 is stored in advance.
  • the conversion table is provided by a CD-ROM or a game server.
  • a greeting message card is displayed on the monitor screen of the first player's game device.
  • the first card reads “konnichiwa” (“hello” in Japanese), the second card “ohayou” (“good morning” in Japanese), and the third card “konbanwa” (“good evening” in Japanese).
  • the first player selects “konnichiwa” (“hello” in Japanese), designates a receiving party, and sends the message to the server.
  • the receiving party can be an individual party or all of the party participants.
  • Message card no. “ 2 ” corresponding to “konnichiwa” (“hello” in Japanese) is sent to the server.
  • the server sends to the designated receiving party message card no. “ 2 ” which corresponds to “konnichiwa” (“hello” in Japanese).
  • message card no. “ 2 ” is transmitted to player 2 and player 3 who are participating in a communication-type game together with player 1 .
  • the terminal device of player 2 refers to the message conversion table shown in FIG. 12 , decodes message card no. “ 2 ” to “konnichiwa” (“hello” in Japanese). The message “konnichiwa” is displayed on the screen.
  • the terminal device of player 3 refers to the message conversion table shown in FIG. 12 , decodes message card no. “ 2 ” to “hello”. The message “hello” is displayed on the screen.
  • player 2 designates a message selection mode to the terminal device and displays multiple greeting message cards on the screen. “Konnichiwa” (“hello” in Japanese) is selected and transmitted to player 1 . Message card no. “ 2 ” corresponding to “konnichiwa” (“hello” in Japanese) is sent to the server and then transferred to player 1 from the server.
  • the terminal device of player 1 refers to the message conversion table and decodes the received message card no. “ 2 ” into “konnichiwa” (“hello” in Japanese) and displays the message card “konnichiwa” on the screen.
  • player 3 designates the message selection mode to the terminal device and displays multiple greeting message cards on the screen. “HELLO” is selected and transmitted to player 1 .
  • Message card no. “ 2 ” is sent to the server and then transferred to player 1 from the server.
  • the terminal device of player 1 refers to the message conversion table and decodes the received message card no. “ 2 ” into “konnichiwa” (“hello” in Japanese) and displays the message card “konnichiwa” on the screen.
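The card-number exchange just described can be sketched as follows. This is a minimal editorial illustration, not part of the patent; the card numbers and message strings stand in for the message conversion table of FIG. 12 .

```python
# Hypothetical sketch of the message conversion table (FIG. 12): each card
# number maps to the same greeting in every supported language, so only the
# number travels over the network and each terminal decodes it into its own
# display language. Table contents here are assumed for illustration.

MESSAGE_CONVERSION_TABLE = {
    1: {"ja": "ohayou", "en": "GOOD MORNING"},
    2: {"ja": "konnichiwa", "en": "HELLO"},
    3: {"ja": "konbanwa", "en": "GOOD EVENING"},
}

def encode(message, language):
    """Sender side (S130a): look up the card number for a selected message."""
    for number, translations in MESSAGE_CONVERSION_TABLE.items():
        if translations[language] == message:
            return number
    raise ValueError("message is not a registered card")

def decode(card_number, language):
    """Receiver side: display the card in the terminal's own language."""
    return MESSAGE_CONVERSION_TABLE[card_number][language]

card = encode("konnichiwa", "ja")     # player 1 selects "konnichiwa"
print(card)                           # → 2
print(decode(card, "ja"))             # player 2's terminal → "konnichiwa"
print(decode(card, "en"))             # player 3's terminal → "HELLO"
```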
  • the respective terminal devices can directly convert messages into the receiving party's language and send such messages. For example, when player 1 selects “ohayou” (“good morning” in Japanese), the terminal device may refer to the message conversion table and convert the message into “GOOD MORNING” which corresponds to “ohayou” (good morning), and send to the server the message “GOOD MORNING” which is represented by a group of character codes.
  • the server may assume the role of referring to the message conversion table and converting the languages of the messages.
  • FIG. 11 is a flowchart explaining the message transmission operation implemented at the respective devices.
  • the CPU (for example, the main CPU 101 ) of the terminal device implements these procedures when a card message mode flag is identified during the execution of the main program (not shown).
  • the CPU reads game parameters (S 122 ) and displays on the screen multiple message cards according to the progress of the current game.
  • Sets of message cards can be displayed according to the respective categories (fields, groups, items, etc.) (S 124 ).
  • the player operates the cursor and selects a message card.
  • the cross key of the game pad 2 b is moved for moving the cursor (or reverse display or other selected display), and a message card is thereby selected.
  • message card categories are switched.
  • When button C is pressed and the message card to be selected is determined, a selection flag is placed.
  • the selected message card is identified according to the position of the card on the screen, as well as the position of the cursor on the screen (S 128 ). The number of the card selected is then identified. For example, if the message card “konnichiwa” (“hello” in Japanese) is selected, message card no. “ 2 ” is identified (S 130 a ). Message card no. “ 2 ” is transmitted to the server together with an address to which it is to be sent (S 132 a ). If the message is addressed to player 2 and player 3 , message card no. “ 2 ” is transferred to player 2 and player 3 from the server.
  • the terminal device of player 2 refers to the message conversion table and converts message card no. “ 2 ” into “konnichiwa” (“hello” in Japanese) and displays such message card on the display.
  • the terminal device of player 3 refers to the message conversion table and converts message card no. “ 2 ” into “HELLO” and displays the message card “HELLO” on the display.
  • the message is converted into the receiving party's language at the terminal device and the message is transmitted to the receiving party after the language of the message has been converted. Therefore, the algorithm steps S 130 a and S 132 a described above are modified.
  • the identified message (S 128 ) is converted to the receiving party's language (S 130 b ).
  • the message is converted into “HELLO”.
  • the message is transmitted to the receiving party after the language of the message has been converted (S 132 b ).
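The sender-side variant (S 130 b and S 132 b) can be sketched as follows; again an illustrative editorial sketch, not part of the patent, with the table entries assumed.

```python
# Hypothetical sketch of the third-embodiment variant: the sender's terminal
# converts the message into the receiving party's language before
# transmitting, so the message travels as a group of character codes rather
# than as a card number.

MESSAGE_CONVERSION_TABLE = {
    "ohayou": "GOOD MORNING",
    "konnichiwa": "HELLO",
    "konbanwa": "GOOD EVENING",
}

def convert_and_send(message, receiver_language):
    # S130b: convert the identified message to the receiver's language.
    if receiver_language == "en":
        converted = MESSAGE_CONVERSION_TABLE[message]
    else:
        converted = message
    # S132b: the converted text itself is what gets transmitted.
    return converted

print(convert_and_send("ohayou", "en"))      # → "GOOD MORNING"
print(convert_and_send("konnichiwa", "ja"))  # → "konnichiwa"
```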
  • the terminal game devices can communicate with each other directly through a network.
  • FIG. 13 is an explanatory diagram illustrating the fourth embodiment. This embodiment presents a character inputting device in a portable communication device or a portable information processing device.
  • the portable communication device 300 is structured roughly by a cover 300 a and a main body 300 b.
  • the cover 300 a internally includes a display panel 308 made of liquid crystals, etc.
  • the display panel 308 is capable of displaying an English character displaying area 308 a for displaying an English character table intended for selection of English characters, a symbol displaying area 308 b for displaying a symbol table intended for selecting symbols, and a message displaying area 308 c for displaying several messages out of the multiple messages registered in advance.
  • the main body 300 b internally encloses a CPU board, etc., and the top face thereof constitutes an inputting panel.
  • a speaker 310 is arranged at substantially the center of the upper edge of the input operating panel.
  • a figure keyboard (figure keys) 321 is arranged to the right (in the longitudinal direction) with respect to the center of the input operation panel.
  • the figure keyboard includes figure buttons “0” through “9”, and an “*” button and a “#” button.
  • the figures may have the same arrangement as in a telephone set.
  • a cross key 322 is arranged to the left (in the longitudinal direction) from the center of the input operation panel.
  • the cross key 322 is capable of designating upward (↑), downward (↓), rightward (→), or leftward (←) positions.
  • the neutral position of the actuator may be designated as the fifth output (“N”).
  • the figure key 321 is arranged on the right side and the cross key 322 on the left side for facilitating operation by both hands when the portable communication device is held in either hand.
  • mode selection switches 323 a, 323 b and 323 c are arranged.
  • the mode selection switch 323 a is used for activating the selection mode of the registered (stock) messages. By operating this switch, display of the message displaying area 308 c can be turned on/off.
  • the mode selection switch 323 b sets the cross key to a cursor mode.
  • English characters and symbols can be input at the English character displaying area 308 a and the symbol displaying area 308 b.
  • the mode selection switch 323 c is used for switching among the Japanese hiragana character/Japanese katakana character/figure/lowercase/uppercase modes.
  • FIG. 14 is a block diagram schematically illustrating the structure of the portable communication device.
  • a CPU 301 implements data processing and controls the respective components according to an operating system program or an application program.
  • a RAM 302 holds data and programs therein, and constitutes a principal work area for the CPU 301 .
  • a ROM 303 holds data and programs therein in a nonvolatile manner.
  • An external inputting interface 304 encodes the respective output results by the figure key 321 , the cross key 322 and the function keys 323 and outputs the same to the prescribed area of the RAM 302 .
  • a display interface 305 displays to the liquid crystal display unit 308 any information written into the video RAM by the CPU 301 .
  • a communication interface 306 exchanges data signals with an analog circuit modem 309 . Connected to the analog circuit modem 309 is, for example, a public telecommunication line, or a local area network.
  • An external interface 307 intermediates the computer system and an externally connected device (for example, a printer). The external interface 307 includes a simple pronunciation circuit which drives a speaker 310 .
  • the CPU identifies whether or not a flag representing operation of the cross key has been placed. If the flag has not been placed, processing returns to the main program (S 152 ; No).
  • the CPU reads the value which has been input through the figure key 321 (S 154 ) or the cross key.
  • neutral flag “N” is placed at the neutral position of the actuator.
  • Characters which have been input are then identified. For example, values “1” through “*” output by the figure keys respectively correspond to the “a” row, “ka” row, and “sa” through “wa” rows in the consonant arrangements of the Japanese fifty-syllabary.
  • the values “N” and “↑” through “←” output by the cross key correspond to the vowel arrangements of the Japanese fifty-syllabary. “N” corresponds to “a”; “↑” to “i”; “↓” to “u”; “→” to “e”; and “←” to “o”. Characters can thus be input by the figure key and the cross key.
  • the CPU identifies a combination of: “1” and “←” as “o”; “6” and “N” as “ha”; “8” and “←” as “yo”; and “1” and “↓” as “u”. Consequently, “1←6N8←1↓” is identified as “o•ha•yo•u”.
  • the symbol “#” can be used for dividing the characters. For example, this symbol can be used as “1←#6#8←#1↓#”, or “1←#6#8←1↓”. Symbols serially input are converted to the Japanese kana characters “o•ha•yo•u”. In this case, the symbol “N” above need not be used.
  • values may be associated with the characters “a•ka•sa•ta•na•ha•ma•ya•ra•wa” in advance, and when there are two consecutive figures in the character or symbol sequences which are input, the first value can be converted to an applicable character out of “a•ka•sa•ta•na•ha•ma•ya•ra•wa”.
  • “1←68←1↓” is, for example, identified as “o•ha•yo•u”, and the symbol “N” above need not be added.
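The figure-key/cross-key identification scheme can be sketched as follows. This is an editorial illustration, not part of the patent: the direction-to-vowel assignment (U/D/R/L letters standing in for the four cross-key directions) and the key-to-row mapping beyond the neutral position “N” = “a” are assumptions, and the “#” divider and consecutive-figure rules are omitted for brevity.

```python
# Hypothetical sketch of kana input: a figure key mainly supplies the
# consonant row and a cross-key direction supplies the vowel; the CPU
# identifies each (figure, direction) pair as one kana character.

CONSONANT_ROWS = {
    "1": "", "2": "k", "3": "s", "4": "t", "5": "n",
    "6": "h", "7": "m", "8": "y", "9": "r", "0": "w",
}
# "N" is the neutral actuator position; U/D/R/L are the four directions.
VOWELS = {"N": "a", "U": "i", "D": "u", "R": "e", "L": "o"}

def identify(sequence):
    """Decode a key sequence (e.g. "1L 6N 8L 1D") into romanized kana."""
    kana = []
    for pair in sequence.split():
        figure, vowel = pair[0], pair[1]
        kana.append(CONSONANT_ROWS[figure] + VOWELS[vowel])
    return "".join(kana)

print(identify("1L 6N 8L 1D"))  # → "ohayou" (o-ha-yo-u)
```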
  • the characters identified are delivered to a text processing program which has an editor function.
  • the characters identified are displayed in the message forming area (not shown).
  • the characters and symbols which have been input can be modified in this area.
  • Japanese kana characters can thus be input using the figure keys and the cross key.
  • a cross key is used in the embodiment above, a joystick, a trackball or other pointing devices can be used instead.
  • the character communication device requires no keyboard, since the terms displayed on the display are selected, assembled and transmitted as messages. Furthermore, even those who are not accustomed to typing on keyboards can easily use this device.
  • The word-card character communication device also requires no keyboard, and even those having no knowledge of foreign languages can communicate with foreigners.
  • both hands can be used when the communication device is held in either hand, and the device also takes up less space.

Abstract

A character communication device which is connected to a network and enables at least communication by using characters. The device comprises a candidate term display means (S104) for displaying a group of candidate terms prepared for message communication in a candidate term display area (52) of the screen, a term selecting means (2 b) for outputting a term which is selected by the operator from the group of candidate terms, and a message generating means (S108) for generating a message by serially displaying a plurality of outputted terms in a message editing area (53) of the screen and sending the generated message to a transmitting/receiving means.

Description

    FIELD OF THE INVENTION
  • The present invention relates to an apparatus including a character inputting device, especially to an improved technology of inputting characters into an information processing device or a portable communication device which does not have a JIS (Japanese Industrial Standard) keyboard or other types of keyboards for inputting characters.
  • BACKGROUND ART
  • “Chat” is a method of online communication conducted on a computer network, either between parties connecting to the network or between a party connecting to the network and his host computer. “Chat” is a conversational function conducted through a network by inputting characters, and its participants generally input words with the aid of a JIS or an ASCII keyboard.
  • In order to participate in a “chat”, one must have a terminal device for connection to a network. Various types of terminal devices may be used for this purpose, including a so-called “personal computer”, a portable terminal device, or a game device having a modem mounted thereon.
  • Furthermore, in an online “chat”, a foreigner may be included among the participants. In such case, a transmitter sends a message in a language which is understood by the other party of the communication.
  • However, in the case of a portable terminal device (hand-held computer), in order to keep the overall shape of the device small, mounting a character keyboard in a complete form is difficult. Moreover, in the case of a household game device, due to the purpose of its use as a game device or for cost reasons, a keyboard is not included in a standard game device, or the original design specification of a game device does not allow the connection of a keyboard. Accordingly, input of characters is generally not easy in these devices.
  • Therefore, one method used is the employment of a graphical user interface (GUI). For example, the user displays on the screen of a device a software keyboard for inputting characters, moves a cursor over such keyboard to select a character, and causes the selected characters to be recognized by a computer built inside the device.
  • However, selecting each character from a software keyboard displayed according to the Japanese fifty-syllabary is time consuming, and is not suitable for a real-time communication such as “chats.”
  • Furthermore, although “chats” are generally performed in a language commonly understood by the parties of communication, it is preferable that “chats” may be easily performed between parties whose languages of daily use are different.
  • Accordingly, one purpose of this invention is to facilitate the input of characters in an information processing device or a portable communication device which does not have a character inputting keyboard.
  • Moreover, another purpose of this invention is to provide a character inputting device easy to use even for those who have not learned to type on keyboards.
  • Furthermore, another purpose of this invention is to provide a character inputting device which facilitates online conversations between communicators speaking different languages.
  • DESCRIPTION OF THE INVENTION
  • In order to accomplish the purposes mentioned above, the character communication device according to the present invention is a communication device connected to a network and enabling communication of messages at least by characters, which has a transmitting and receiving means for implementing communication with the communication device of the other party of communication via a network, a communication content displaying means for displaying in a communication content displaying area of a screen display the content of communication with the communication device of the other party, a candidate term displaying means for displaying in a candidate term displaying area of the screen display a group of candidate terms prepared in advance for communication of the messages, a term selecting means for outputting a term selected by a transmitter out of the group of candidate terms, and a message forming means for serially displaying a plurality of output terms in a message editing area of the screen display, and thereby forming a message and sending the formed message to the transmitting and receiving means.
  • According to such structure, terms shown on a display can be selected, and a message can be formed and transmitted. Accordingly, even those who are not able to type on keyboards can communicate by characters.
  • In the aforementioned invention, the communication device of the other party is either a host computer executing a program of a communication-type game in accordance with accesses from a plurality of computers having a character communication device, or a communication device operated by participants taking part in the communication-type game.
  • According to such structure, even if the game devices of the participants taking part in a communication-type game such as an RPG do not have a keyboard for inputting characters, messages can be transmitted to the game devices of these participants. Furthermore, online chats can be conducted among participants taking part in a communication-type game.
  • In the aforementioned invention, the candidate term displaying means receives the group of candidate terms from the communication device of the other party.
  • According to such structure, terms can be easily obtained even if one's game device does not have a database in advance. Furthermore, in a communication-type game such as RPG, glossaries conforming with the content or progress of the game can be obtained, and there is a higher possibility that terms necessary for the current scene can be selected. Thus, forming messages is made easier. Moreover, forming messages expected by the other party is made easier.
  • Furthermore, in the invention above, the group of candidate terms is classified at least according to the names of the participants in the game, or game-related nouns, pronouns, verbs, adjectives, inflections, symbols, or short sentences registered by the users.
  • According to such structure, one may select terms quickly. This is advantageous for online chats.
  • Furthermore, the group of candidate terms is arranged in a table over a plurality of pages, and one of such plurality of pages is displayed in the candidate term displaying area.
  • According to such structure, even if the displaying area of the screen is formed as a relatively small area, a group of terms for selection can be displayed efficiently in such area. Furthermore, by turning the pages, one may easily move to another page.
  • Another communication device of the present invention has a storing means for storing a message language conversion table including a single message expressed in a plurality of languages, a menu displaying means for selectably displaying on a menu screen of a display a plurality of messages expressed in a single language, a language conversion means for referring to the message language conversion table and converting a selected message to a message in a language of a party with whom communication is held, and a transmitting means for transmitting the converted message to the communication counterpart.
  • According to such structure, firstly, communication by characters (chats) can be conducted in the form of cards without requiring a keyboard. Secondly, even if the communicators use different languages, conversations can be held in their respective languages. Thirdly, the load of information processing for conversion of languages is conveniently small because the user only has to select applicable messages out of a plurality of messages displayed in different languages.
  • Furthermore, a communication device of the present invention has a storing means for storing a database including a plurality of messages in a single language and marks added to each of the messages, a menu displaying means for selectably displaying some of the plurality of messages on a menu screen of a display, a message mark outputting means for outputting marks corresponding to a selected message by referring to the database, and a transmitting means for transmitting the output message marks to a party with whom communication is held.
  • According to such structure, since messages are sent not in a series of words but in marks corresponding to the messages, the amount of transmission information required for the messages is small, and handling thereof is easy. Furthermore, since common marks are added to a group of messages having common meanings, replacement to a message in another language is facilitated.
  • By displaying the aforementioned messages in the form of cards, conversation can be held by showing the cards. By making the messages uniform in the form of cards, the handling of messages, the language conversion, the preparation of the database, etc. are facilitated.
  • By having the aforementioned character communication device formed inside (built into) a household game device, communication by characters is performed at a relatively low cost with the aid of a game device having a relatively small storing capacity and not including a keyboard in its standard package. This is advantageous especially when playing a communication-type game.
  • Regarding a device for inputting characters, it is possible to perform input operations via a device which can be used for selective operation, for example, via a game controller (a control pad, joystick, etc.) serving as an inputting device of the game device.
  • The character communication system according to the present invention is a character communication system connected to a network and structured to include a plurality of character communication devices capable of performing message communication at least by characters, and includes a first character communication device used for communication by a first message group in a single language, and a second character communication device used for communication by a second message group in a plurality of languages, wherein communication between the first and second character communication devices is performed using marks commonly added in advance to messages having common meanings, included in each of the first message group and the second message group.
  • According to such structure, a character communication system is obtained which has a small message transmission capacity and easily displays, in different languages, messages having the same meanings.
  • The character communication system according to the present invention is a character communication system having a first character communication device for displaying messages in a first language, a second character communication device for displaying messages in a second language, and a network for connecting both communication devices, wherein the first character communication device has a storing means for storing a message table including a plurality of messages in the first and second languages, a message displaying means for selectably displaying on a screen some of the plurality of messages in the first language, a language conversion means for converting an applicable message in the first language to a message in the second language by referring to the message table, and a transmitting means for transmitting the converted message to the second character communication device.
  • According to such structure, it is possible to transmit from one's station a message which has been converted into the language of a party with whom communication is held.
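The language conversion means described above amounts to a table lookup. A minimal Python sketch, assuming a message table keyed by a common message number with one entry per language (the table contents and language codes here are illustrative, not taken from the patent):

```python
# Minimal sketch of the claimed language conversion: each message has a
# common number and one entry per supported language.
MESSAGE_TABLE = {
    1: {"ja": "ohayou", "en": "GOOD MORNING"},
    2: {"ja": "konnichiwa", "en": "HELLO"},
    3: {"ja": "konbanwa", "en": "GOOD EVENING"},
}

def convert_message(text: str, src_lang: str, dst_lang: str) -> str:
    """Convert a message from src_lang to dst_lang via the message table."""
    for entry in MESSAGE_TABLE.values():
        if entry.get(src_lang) == text:
            return entry[dst_lang]
    raise KeyError(f"message not in table: {text!r}")

# The first device converts before transmitting to the second device:
assert convert_message("konnichiwa", "ja", "en") == "HELLO"
```

Because every device holds the same table, only messages appearing in the table can be exchanged, which is what keeps the transmission capacity small.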
  • The character inputting device according to the present invention is a character inputting device for inputting characters expressed by a combination of vowels and consonants, which includes a consonant inputting means for mainly inputting consonant information of characters which are to be input, a vowel inputting means for mainly inputting vowel information of characters which are to be input, and an input character identifying means for identifying input characters according to a combination of consonant information and vowel information.
  • According to such structure, Japanese kana characters can be input easily and intuitively. This is especially advantageous when inputting Japanese characters.
  • In the aforementioned character inputting device, the consonant inputting means and the vowel inputting means are respectively arranged on the operation panel of the character inputting device, and wherein the consonant inputting means is arranged such that it is positioned toward one side of the operation panel with respect to the center position thereof in the longitudinal direction, and the vowel inputting means is arranged to be positioned toward the other side of the operation panel with respect to the center position thereof in the longitudinal direction, and both of the consonant inputting means and the vowel inputting means can be operated manually with both hands holding the operation panel.
  • According to such structure, the device can be used with both hands for inputting characters even when the device is held in the user's hand. Therefore, this is advantageous for a character inputting device of a portable communication device.
  • In the aforementioned character inputting device, the consonant inputting means can be structured by a figure keyboard mainly including a plurality of figure keys, and the vowel inputting means can be structured by a composite switch for producing a plurality of types of output in accordance with the operational status of one actuator.
  • The aforementioned composite switch can be structured by a cross key, a joystick, a pointing device, a trackball, etc.
  • Furthermore, the portable communication device including the aforementioned character inputting device has several advantages. First, it is intuitive to understand, even for those who are not accustomed to keyboard operation, and therefore, input in the Japanese language is easily conducted. Second, even when the portable communication device is held in the user's hand, both hands can be used upon inputting operations. Third, since the built-in character inputting device is small and takes up less space, it is advantageous for a portable communication device.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is an explanatory view illustrating a message forming screen of a character communication device according to the present invention.
  • FIG. 2 is an explanatory view illustrating the overall structure of a communication network.
  • FIG. 3 is a block diagram illustrating an example of a game device which has a communicational function.
  • FIG. 4 is a flowchart illustrating a message formation processing in the character communication device.
  • FIGS. 5A and 5B are explanatory diagrams illustrating examples of term categories.
  • FIGS. 6A and 6B are explanatory diagrams illustrating examples of term categories.
  • FIGS. 7A and 7B are explanatory diagrams illustrating examples of term categories.
  • FIG. 8A is an explanatory diagram illustrating a mini-window for inputting figures.
  • FIG. 8B is an explanatory diagram illustrating a mini-window for inputting te/ni/wo/ha particles (Japanese postpositional particles).
  • FIGS. 9A and 9B are explanatory diagrams illustrating a character conversation where message cards are used.
  • FIG. 10 is an explanatory diagram illustrating a character conversation conducted by three parties.
  • FIG. 11 is a flowchart illustrating communication by three parties using message cards.
  • FIG. 12 is an explanatory view illustrating an example of a message conversion table.
  • FIG. 13 is an explanatory view illustrating an example of a portable communication device (portable information processing device).
  • FIG. 14 is an explanatory view illustrating an example of an inner circuit structure of the portable communication device.
  • FIG. 15 is a flowchart illustrating an algorithm for distinguishing characters input through the operation of figure keys from those input through the operation of a cross key.
  • BEST MODES FOR CARRYING OUT THE INVENTION
  • The First Embodiment of the present invention will be explained below with reference to the drawings.
  • FIG. 2 is an overall block diagram showing the outline of the communication system according to the present invention. FIG. 2 shows direct connection of communication terminal devices 201 and 202 capable of character communication via a network 203, as well as indirect connection of terminal devices 201 and 202 via a host computer 210. The network 203 includes a public communication line, a dedicated line, the Internet, a LAN, etc. The host computer 210 includes a data processing function and a data exchange function, and is connected to the terminal devices 201 and 202 via the aforementioned network 203. The data processing above may include host functions of a communication-type game. In such case, the host computer, which is a server of a game, provides event information, map information, game parameters, coordinate moving information, character status information and other information. A plurality of terminal devices are connected to the network 203, and the terminal devices contemplated here are not only limited to the terminal devices placed in domestic areas but also those placed abroad.
  • Besides a dedicated communication device, the terminal device includes a personal computer or a game device having a communication function. The terminal device includes at least a main body, a display, and an inputting device.
  • As explained below, the terminal device may be accomplished, for example, by a household game device. In such case, the game device includes a main body 1 of the game device, and a game controller (for example, control pad) 2 b which serves as a game input device.
  • The terminal device 202 may also be accomplished by a similar structure, and in the present embodiment, a keyboard 4 is further provided. However, the keyboard 4 is not essential to the present invention.
  • FIG. 3 shows one example of the terminal device 201 or 202, and here a game device having a modem is used. This game device can be used as a network terminal device, and one may play the so-called “communication-type game”.
  • The game device is mainly structured by a CPU block 10 for controlling the entire device, a video block 11 for controlling display of a game screen, a sound block 12 for generating sound effects, etc., a subsystem 13 for reading CD-ROMs, and a modem 14 for implementing data communication with an outside party.
  • The CPU block 10 is structured by an SCU (System Control Unit) 100, a main CPU 101, a RAM 102, a ROM 103, a cartridge I/F 1 a for connecting a modem, etc., a sub CPU 104, a CPU bus 105, etc. The main CPU 101 controls the entire device. The main CPU 101 includes inside thereof a computing function similar to a DSP (Digital Signal Processor), and is capable of executing application software at a high speed. The main CPU 101 automatically recognizes the type of a peripheral (FDD 3 b in FIG. 3) connected to the serial connector 3 a, and conducts data communication with such peripheral. More specifically, the peripheral is connected to the SCI which is built in the main CPU 101. Furthermore, connected to the serial connector 3 a are SCI signal lines from a master SH and a slave SH, three each, as well as MIDI in/out signal lines from an SCSP (sound DSP). Moreover, the FDD 3 b is used, for example, for storing the data of the backup memory (which stores various data and parameters of a game; not shown in the Figure) on a floppy disk, or for copying floppy disk data into the backup memory.
  • The RAM 102 is used as a work area of the main CPU 101. The ROM 103 has an initial program for initialization processing, etc. written therein. The SCU 100 allows, by controlling buses 105, 106 and 107, smooth input and output of data between the main CPU 101 and the VDPs 120, 130, DSP 140, CPU 141, etc. The SCU 100 internally includes a DMA controller, and is capable of transmitting sprite data during a game to a VRAM of the video block 11. This allows execution of games and other application software at a high speed. The cartridge I/F 1 a allows input of application software which is provided in the form of a ROM cartridge (not shown). Furthermore, it allows use of the modem 14 in the form of a cartridge for transmission and reception of data. By using a modem, one can play the so-called “communication-type game”. The aforementioned game parameters, etc. are exchanged between the game server and the main CPU 101.
  • The sub CPU 104, which is called the “SMPC (System Manager & Peripheral Condition),” has functions such as collecting peripheral data from a pad 2 b via a connector 2 a according to a request received from the main CPU 101. The main CPU 101 implements processing such as moving combat vessels shown in a game screen according to the peripheral data received from the sub CPU 104. A desired peripheral selected from a pad, a joystick, a keyboard, etc. can be connected to the connector 2 a. The sub CPU 104 has a function of automatically recognizing the type of the peripheral connected to a connector 2 a (terminal on the main body side), and collecting peripheral data, etc. in a communication system according to the type of each peripheral.
  • The video block 11 includes the VDP (Video Display Processor) 120 for drawing video game characters, etc., which are made of polygons, and the VDP 130 intended for drawing background pictures, synthesizing polygon picture data with the background pictures, and implementing clipping processing, etc. The VDP 120 is connected to a VRAM 121 and frame buffers 122 and 123. The polygon drawing data representing the characters of the video game device are sent from the main CPU 101 to the VDP 120 via the SCU 100, and then written in the VRAM 121. The drawing data written in the VRAM are drawn in the drawing frame buffer 122 or 123 in, for example, a 16- or 18-bit/pixel form. The data of the frame buffer 122 or 123 containing the drawings are sent to the VDP 130. Information for controlling the drawing processing is provided from the main CPU 101 to the VDP 120 via the SCU 100. The VDP 120 then executes drawing processing according to these orders.
  • The VDP 130 is connected to the VRAM 131, and is structured such that the picture data output from the VDP 130 are output to an encoder 160 via a memory 132. The encoder 160 adds synchronizing signals to these picture data and thereby produces image data, and outputs such data to a TV receiver 5. As a consequence, various types of game screens are displayed on the TV receiver 5.
  • The sound block 12 is structured by the DSP 140 for synthesizing sounds according to the PCM or FM method, and the CPU 141 for controlling the DSP 140. Sound data produced by the DSP 140 are converted to 2-channel signals by the D/A converter 170 and thereafter output to a speaker 5 b.
  • The subsystem 13 is structured by a CD-ROM drive 1 b, a CD I/F 180, a CPU 181, an MPEG AUDIO 182, an MPEG VIDEO 183, etc. The subsystem 13 has functions such as reading application software provided in the form of a CD-ROM, and replaying animations. The CD-ROM drive 1 b reads data from the CD-ROM. The CPU 181 implements processing such as controlling the CD-ROM drive 1 b and correcting errors in data which are read out. Data read from the CD-ROM are provided to the main CPU 101 via the CD I/F 180, the bus 106, and the SCU 100, and are used as application software. Furthermore, the MPEG AUDIO 182 and the MPEG VIDEO 183 are devices for restoring data compressed according to the MPEG (Moving Picture Experts Group) standard. By restoring the MPEG compressed data written in the CD-ROM with the aid of the MPEG AUDIO 182 and the MPEG VIDEO 183, animations can be replayed.
  • When playing a communication-type game under such structure, provided on the side of the game device are, for example, all graphic data including fonts, sound data, tutorial maps, mail documents (backup RAMs), etc. Graphic data, etc. are provided, for example, by a CD-ROM. Furthermore, provided on the side of the server are, for example, map data, event data (message data, etc.), monster parameters, various parameter data, backup information, etc. Basically, graphic data are held on the side of the game device, and data groups including parameters are held on the side of the server. Parameters and map information are administered on the side of the server in their entirety, and information on results thereof is received on the side of the game device, where display of the screen is processed in a specialized manner. Operation of the map data and parameters by the server allows for the provision of games with new content without CD-ROMs being changed. Furthermore, game parameters (strength of characters) held on the side of the server allow entry of new types of monsters as the user's level rises.
  • In a communication-type game, one may find comrades of a game by playing the game through a network. For example, in a network RPG (Role Playing Game), multiple players at different locations can form a party in a virtual game space. Individual characters representing the respective players can be controlled in the virtual space. In such case, intentions of the players must be mutually communicated. For example, players need to talk with a party whom they meet for the first time, to discuss with a comrade a destination of their adventure, or to consult strategies at the time of a combat. A chat function (real-time conversation system by characters) is provided for this reason. Furthermore, a telegram function can be used while the other player is logged onto the server, which allows a player to send messages to a specific party (or the other party) regardless of the location or the status of the player or the other party. This function is effective when contacting one's friends and acquaintances. Furthermore, in a virtual game space, there may be provided a bulletin board as a means of communication united with the world of the game, or a letter transmission function for sending letters to a specific party.
  • The chat function and the telegram function, etc. described above are usually performed with the aid of a keyboard intended for inputting characters. However, in the present game device, the keyboard is generally provided as an optional item, and is not included in the same package as the main body of the game device. Accordingly, a character interface is prepared which allows characters to be input with a game pad by selecting keywords. Nonetheless, such inputting device is not limited to game pads.
  • A character interface (character inputting device) will be explained below with reference to FIGS. 1 through 4.
  • A control pad 2 b is provided with, for example, the respective switches in the forms of buttons A, B, C, X, Y, Z, L, R, and further with a cross key. In a character inputting mode for chats, etc., prescribed functions are assigned to the respective buttons. For example, button A is a shift key for switching the cross key to move between categories, button B is for back space (deletion of one letter), button C is for determining words, button X is for turning on/off the display of a mini-window for inputting figures, button Y is for turning on/off the display of a mini-window for inputting te/ni/wo/ha particles (Japanese postpositional particles), button Z is for CR (message transmission), button L is for moving the cursor forward by one keyword category, button R is for moving the cursor backward by one keyword category, and the cross key has a function of outputting orders for selecting words (and moves between categories when button A is pressed).
  • FIG. 1 shows an example of a chat screen. Displayed on the screen of the display 5 are a chat window (communication content displaying area) 51 for displaying the content of conversation, a keyword category window (candidate term displaying area) 52 for displaying keywords (terms) according to categories, and an editing window (message editing area) 53 for editing messages.
  • If, during the execution of the main program, the main CPU 101 identifies placement of a flag for performing chats, then the main CPU 101 executes a program (algorithm) as shown in FIG. 4. For example, in the case of a party organizing mode, if button A of the pad 2 b is pressed by a player upon a scene where participation in a prescribed team is recruited, or in the state of a guidance screen displayed during the game, or upon a scene where the characters discuss and exchange information at a bar for making friends, then the chat mode is designated and a flag is thereby placed. As a result, messages can be exchanged, and the party members can consult with one another.
  • Among the game parameters, the CPU 101 reads parameters (situation parameters) showing the current situation (for example, scenes) (S102). A variety of “current situations” is contemplated, including participation in a party, sending a telegram, sending a letter, viewing a guidance screen, conversation at a bar, a strategy meeting prior to a combat, discussing a course of adventure, etc. A set of terms corresponding to the current situation is read from a database recorded in a CD-ROM and is displayed on the keyword category window 52. Furthermore, as shown in FIGS. 5 through 8, sets of terms are prepared according to categories.
  • FIG. 5A shows a table for category 1, collecting terms related to names. Registered here are personal pronouns, names of party participants, and names of other communicators. Furthermore, senders of telegrams or letters can be automatically registered to the table. FIG. 5B shows a table for category 2, collecting terms such as nouns and pronouns. FIG. 6A shows a table for category 3, which collects terms such as verbs and adjectives. FIG. 6B shows a table for category 4, where inflection terms are collected. FIG. 7A shows a table for category 5, collecting symbols, emotional expressions, and other terms. FIG. 7B shows a table for category 6, collecting short sentences registered by the user in advance.
  • FIG. 8A shows the mini-window for figures, and FIG. 8B shows the mini-window for te/ni/wo/ha particles (Japanese postpositional particles). In these mini-windows, by operating a single key (the X or Y button), one-letter figures or characters that are frequently used can be called out. Mini-windows may overlap the category windows when they are displayed or input.
  • The player may switch and display the aforementioned sets of terms according to categories (fields, groups, items, etc.) (S104). When there are many terms, the terms are divided into several pages, and the content of a single selected page (for example, page 1, as shown in FIG. 1) is displayed. By introducing the page concept, the display area can be used effectively. This also enables easy switching of the displayed content (pages).
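The paging of term sets just described is essentially list slicing. A small Python sketch (the page size and category contents are illustrative assumptions, not values from the patent):

```python
# Sketch of splitting a category's terms into fixed-size display pages
# for the keyword category window.
PAGE_SIZE = 8  # illustrative: number of terms shown at once

def paginate(terms, page_size=PAGE_SIZE):
    """Divide a term list into pages; the player switches between pages."""
    return [terms[i:i + page_size] for i in range(0, len(terms), page_size)]

category_terms = ["item", "weapon", "monster", "map", "shop", "inn",
                  "guild", "quest", "party", "server"]
pages = paginate(category_terms)
# Page 1 holds the first PAGE_SIZE terms; later pages hold the rest.
assert len(pages) == 2 and pages[0][0] == "item"
```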
  • The player selects terms by operating the control pad 2 b. As described above, by operating the cross key, the cursor (or reverse display or other selected display) is moved and a term is selected. By operating button A concurrently with the cross key, categories are switched. Button C is then pressed, the term to be selected is determined, and the term selection flag is placed.
  • When the CPU 101 identifies selection of a term (S106), the selected term, for example, “yah-ho-” (“yoo-hoo” in Japanese) is transferred to the message editing program. The message editing program has a simple editor function. With the aid of this program, the selected terms are displayed in the message editing window 53 (S108). Subsequently, when “,” (comma), “konnichiwa” (“hello” in Japanese), and “。” (full stop in Japanese) are input, the editing window 53 similarly displays “yah-ho-, konnichiwa。” (“yoo-hoo, hello.” in Japanese). In the message editing window 53, by using the editor function, messages can be modified, altered or input with a keyboard (S106 through S110; No).
  • When a message is formed and button Z is operated, a message transmission flag is placed. When this is identified by the CPU 101 (S110; Yes), the message assembled within the message editing window 53 is transmitted (S112). Messages transmitted to and received from the other party are stored in the RAM 102 and displayed in the chat window 51 by a communication displaying program.
  • Thus, when the player selects necessary terms from a table of terms with the aid of the pad 2 b, the selected terms are assembled and a message is thereby formed. By transmitting such message, chats by characters can be conducted.
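The term-selection and transmission flow of FIG. 4 can be sketched as a small editing buffer; assuming, for illustration, that selected terms are simply concatenated in the editing window (the class and term strings are hypothetical, not from the patent):

```python
# Sketch of the message editing buffer of FIG. 4: terms selected from
# the category window are appended, and button Z transmits the result.
class MessageEditor:
    def __init__(self):
        self.buffer = []

    def select_term(self, term: str):   # S106/S108: term chosen, displayed
        self.buffer.append(term)

    def backspace(self):                # button B: delete one element
        if self.buffer:
            self.buffer.pop()

    def transmit(self) -> str:          # S110/S112: button Z sends message
        message, self.buffer = "".join(self.buffer), []
        return message

editor = MessageEditor()
for term in ["yah-ho-", ", ", "konnichiwa", "\u3002"]:
    editor.select_term(term)
assert editor.transmit() == "yah-ho-, konnichiwa\u3002"
```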
  • Although according to the embodiment explained above, data transmission is conducted between the terminal device (game device) and the host computer system (game server) using characters, the host computer system (game server) may include a database which stores the sets of terms shown in FIGS. 5 through 8, thereby enabling transmission of a relevant set of terms to the terminal device. Furthermore, data transmission in characters can be performed between terminal devices (game devices) without the intermediation of the host computer. In such case, as in the case of communication with the host, terms can be selected from a table for inputting characters, whereby messages can be formed and thereafter transmitted.
  • The second embodiment will be now explained with reference to the FIGS. 9 through 12. In this embodiment, instead of assembling terms and forming a message as in the first embodiment, message cards shown on the screen are selected and mutually sent, thereby enabling the exchange of messages. In this processing, if different languages are used by the communicators, the languages of the message cards can be converted for overcoming any inconvenience caused by the difference in the languages. Since the second embodiment may have the structure of the device shown in FIGS. 2 and 3, explanation on such structure will be omitted.
  • Foremost, the outline of the second embodiment will be explained.
  • FIG. 9A shows an example of a display screen at the time the terminal device is set at the message exchange mode. On the screen of a monitor 5, greeting message cards are displayed. The first card reads “konnichiwa” (“hello” in Japanese), the second card “ohayou” (“good morning” in Japanese), and the third card “konbanwa” (“good evening” in Japanese). The communicator operates the control pad and moves the cursor over an applicable message card, for example, “ohayou” (“good morning” in Japanese), and operates the selection button.
  • FIG. 9B shows an example of the other English-speaking communicator's display screen when the terminal device is in the message exchange mode. Here, “konnichiwa” (“hello” in Japanese) is converted and displayed as “HELLO”.
  • FIG. 10 shows an example where player 1 (Japanese language) sends a message card to player 2 (Japanese language) and player 3 (English language). In the game devices of the respective players, a message conversion table shown in FIG. 12 is stored in advance. The conversion table is provided by a CD-ROM or a game server.
  • A greeting message card is displayed on the monitor screen of the first player's game device. The first card reads “konnichiwa” (“hello” in Japanese), the second card “ohayou” (“good morning” in Japanese), and the third card “konbanwa” (“good evening” in Japanese). The first player selects “konnichiwa” (“hello” in Japanese), designates a receiving party, and sends the message to the server. The receiving party can be an individual party or all of the party participants. Message card no. “2” corresponding to “konnichiwa” (“hello” in Japanese) is sent to the server. The server sends to the designated receiving party message card no. “2” which corresponds to “konnichiwa” (“hello” in Japanese). For example, message card no. “2” is transmitted to player 2 and player 3 who are participating in a communication-type game together with player 1.
  • The terminal device of player 2 refers to the message conversion table shown in FIG. 12 and decodes message card no. “2” into “konnichiwa” (“hello” in Japanese). The message “konnichiwa” is displayed on the screen.
  • The terminal device of player 3 refers to the message conversion table shown in FIG. 12 and decodes message card no. “2” into “hello”. The message “hello” is displayed on the screen.
  • In order to send a reply, player 2 designates a message selection mode to the terminal device and displays multiple greeting message cards on the screen. “Konnichiwa” (“hello” in Japanese) is selected and transmitted to player 1. Message card no. “2” corresponding to “konnichiwa” (“hello” in Japanese) is sent to the server and then transferred to player 1 from the server. The terminal device of player 1 refers to the message conversion table and decodes the received message card no. “2” into “konnichiwa” (“hello” in Japanese) and displays the message card “konnichiwa” on the screen.
  • Similarly, in order to send a reply, player 3 designates the message selection mode to the terminal device and displays multiple greeting message cards on the screen. “HELLO” is selected and transmitted to player 1. Message card no. “2” is sent to the server and then transferred to player 1 from the server. The terminal device of player 1 refers to the message conversion table and decodes the received message card no. “2” into “konnichiwa” (“hello” in Japanese) and displays the message card “konnichiwa” on the screen.
  • Online conversation (chat) by characters is thus performed between the players.
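The exchange above can be sketched as follows: only the card number travels over the network, and each receiving terminal decodes it with its own language setting. A Python sketch with an illustrative conversion table (the actual contents of FIG. 12 are not reproduced here):

```python
# Sketch of the message-card protocol: the sender transmits only a card
# number; each receiving terminal decodes it in its own language.
CONVERSION_TABLE = {
    1: {"ja": "ohayou", "en": "GOOD MORNING"},
    2: {"ja": "konnichiwa", "en": "HELLO"},
    3: {"ja": "konbanwa", "en": "GOOD EVENING"},
}

def encode(text: str, lang: str) -> int:
    """Sender side: look up the card number of the selected card."""
    for number, entry in CONVERSION_TABLE.items():
        if entry[lang] == text:
            return number
    raise KeyError(text)

def decode(number: int, lang: str) -> str:
    """Receiver side: display the card in the terminal's own language."""
    return CONVERSION_TABLE[number][lang]

card_no = encode("konnichiwa", "ja")          # player 1 selects card no. 2
assert decode(card_no, "ja") == "konnichiwa"  # player 2 (Japanese terminal)
assert decode(card_no, "en") == "HELLO"       # player 3 (English terminal)
```

Transmitting a single card number rather than a character string is what gives the small transmission capacity claimed for this system.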
  • By referring to the message conversion table, the respective terminal devices can directly convert messages into the receiving party's language and send such messages. For example, when player 1 selects “ohayou” (“good morning” in Japanese), the terminal device may refer to the message conversion table and convert the message into “GOOD MORNING” which corresponds to “ohayou” (good morning), and send to the server the message “GOOD MORNING” which is represented by a group of character codes.
  • Furthermore, when messages are exchanged between terminal devices utilizing different languages, the server may assume the role of referring to the message conversion table and converting the languages of the messages.
  • FIG. 11 is a flowchart explaining the message transmission operation implemented at the respective devices.
  • The CPU (for example, the main CPU 101) of the terminal device implements these procedures when a card message mode flag is identified during the execution of the main program (not shown).
  • Foremost, the CPU reads game parameters (S122) and displays on the screen multiple message cards according to the progress of the current game. Sets of message cards can be displayed according to the respective categories (fields, groups, items, etc.) (S124).
  • The player operates the cursor and selects a message card. As explained above, the cross key of the game pad 2 b is moved for moving the cursor (or reverse display or other selected display), and a message card is thereby selected. By operating button A concurrently with the cross key, message card categories are switched. When button C is pressed and a message card to be selected is determined, a selection flag is placed.
  • When the main CPU 101 identifies that a message card has been selected (S126), the selected message card is identified according to the position of the card on the screen, as well as the position of the cursor on the screen (S128). The number of the selected card is then identified. For example, if the message card “konnichiwa” (“hello” in Japanese) is selected, message card no. “2” is identified (S130 a). Message card no. “2” is transmitted to the server together with an address to which it is to be sent (S132 a). If the message is addressed to player 2 and player 3, message card no. “2” is transferred to player 2 and player 3 from the server. The terminal device of player 2 refers to the message conversion table, converts message card no. “2” into “konnichiwa” (“hello” in Japanese), and displays such message card on the display. The terminal device of player 3 refers to the message conversion table, converts message card no. “2” into “HELLO”, and displays the message card “HELLO” on the display.
  • Online conversation using message cards can be thus performed.
  • One modification of the second embodiment will be explained with reference to FIG. 11B.
  • According to the third embodiment, instead of transmitting message card no. “2”, the message is converted into the receiving party's language at the terminal device and the message is transmitted to the receiving party after the language of the message has been converted. Therefore, the algorithm steps S130 a and S132 a described above are modified.
  • In other words, by referring to the message conversion table, the identified message (S128) is converted to the receiving party's language (S130 b). For example, in the case of a message card “konnichiwa” (“hello” in Japanese), the message is converted into “HELLO”. The message is transmitted to the receiving party after the language of the message has been converted (S132 b).
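The modified steps S130 b and S132 b can be sketched as follows, assuming the sending terminal knows each receiving party's language (the player names, language codes, and table contents are illustrative assumptions):

```python
# Sketch of steps S130b/S132b: the sending terminal converts the selected
# card into the receiver's language before transmission.
CONVERSION_TABLE = {
    2: {"ja": "konnichiwa", "en": "HELLO"},
}
RECEIVER_LANG = {"player1": "ja", "player2": "ja", "player3": "en"}

def prepare_transmission(card_no: int, receiver: str) -> str:
    """S130b: convert to the receiver's language; S132b: return the
    character string that would be transmitted to that receiver."""
    return CONVERSION_TABLE[card_no][RECEIVER_LANG[receiver]]

assert prepare_transmission(2, "player3") == "HELLO"       # English party
assert prepare_transmission(2, "player2") == "konnichiwa"  # Japanese party
```

Unlike the card-number protocol, this variant transmits a full character string, so the receiving terminal needs no conversion table of its own.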
  • Thus, exchange of message cards and performance of online chats between players speaking different languages can be accomplished also by this method.
  • Although the embodiments described above suggest performance of online conversation and communication by characters via the server, the present invention is not restricted to these embodiments. The terminal game devices can communicate with each other directly through a network.
  • FIG. 13 is an explanatory diagram illustrating the fourth embodiment. This embodiment suggests a character inputting device in a portable communication device or a portable information processing device.
  • In FIG. 13, the portable communication device 300 is structured roughly by a cover 300 a and a main body 300 b. The cover 300 a internally includes a display panel 308 made of liquid crystals, etc. In a character inputting mode, the display panel 308 is capable of displaying an English character displaying area 308 a for displaying an English character table intended for selection of English characters, a symbol displaying area 308 b for displaying a symbol table intended for selecting symbols, and a message displaying area 308 c intended for displaying several messages out of the multiple messages registered in advance.
  • The main body 300 b internally encloses a CPU board, etc., and the top face thereof constitutes an inputting panel. A speaker 310 is arranged at substantially the center of the upper verge of the input operating panel. To the right (in the longitudinal direction) with respect to the center of the input operation panel, a figure keyboard (figure keys) 321 is arranged. The figure keyboard includes figure buttons “0” through “9”, and an “*” button and a “#” button. The figures may have the same arrangement as in a telephone set.
  • To the left (in the longitudinal direction) from the center of the input operation panel, a cross key 322 is arranged. By operating a cross-shape actuator, the cross key 322 is capable of designating upward (↑), downward (↓), rightward (→), or leftward (←) positions. Furthermore, the neutral position of the actuator may be designated as the fifth output (“N”). By operating the cross key 322, an upward, downward, rightward or leftward movement can be designated, as well as change of items selected out of a table, etc.
  • The figure keys 321 are arranged on the right side and the cross key 322 on the left side to facilitate two-handed operation when the portable communication device is held in both hands.
  • At the lower edge of the input operation panel, mode selection switches 323 a, 323 b and 323 c are arranged. The mode selection switch 323 a activates the selection mode for the registered (stock) messages; operating this switch turns display of the message displaying area 308 c on and off. When operated, the mode selection switch 323 b puts the cross key into a cursor mode, so that English characters and symbols can be input from the English character displaying area 308 a and the symbol displaying area 308 b. The mode selection switch 323 c switches among the Japanese hiragana character/Japanese katakana character/figure/lowercase/uppercase modes.
  • FIG. 14 is a block diagram schematically illustrating the structure of the portable communication device. In this Figure, any components corresponding to those shown in FIG. 13 are indicated with the same reference numerals. In FIG. 14, a CPU 301 performs data processing and controls the respective components according to an operating system program or an application program. A RAM 302 holds data and programs, and constitutes the principal work area of the CPU 301. A ROM 303 holds data and programs in a nonvolatile manner. An external input interface 304 encodes the respective outputs of the figure keys 321, the cross key 322 and the function keys 323, and writes them to a prescribed area of the RAM 302.
  • A display interface 305 causes the liquid crystal display unit 308 to display information written into the video RAM by the CPU 301. A communication interface 306 exchanges data signals with an analog circuit modem 309, to which, for example, a public telecommunication line or a local area network is connected. An external interface 307 mediates between the computer system and an externally connected device (for example, a printer), and includes a simple pronunciation circuit which drives the speaker 310.
  • Input of characters under the structure above will now be explained with reference to FIG. 15. When a flag representing the Japanese (kana) character input mode is set during execution of the main program, the CPU executes the routine processing.
  • The CPU identifies whether or not a flag representing operation of the cross key has been set. If the flag has not been set, processing returns to the main program and is repeated (S152; No).
  • If the flag indicating operation of the keys has been set (S152; Yes), the CPU reads the value which has been input through the figure keys 321 (S154) or the cross key. As mentioned above, the neutral value “N” is output at the neutral position of the actuator.
  • The characters which have been input are then identified. For example, the values “1” through “*” output by the figure keys respectively correspond to the “a” row, the “ka” row, and the “sa” through “wa” rows in the consonant arrangement of the Japanese fifty-sound syllabary. The values “N” and “←” through “↓” output by the cross key correspond to the vowel arrangement of the syllabary: “N” corresponds to “a”; “←” to “i”; “↑” to “u”; “→” to “e”; and “↓” to “o”. Characters can thus be input by combining the figure keys and the cross key. For example, the CPU identifies the combination of “1” and “↓” as “o”; “6” and “N” as “ha”; “8” and “↓” as “yo”; and “1” and “↑” as “u”. Consequently, “1↓6N8↓1↑” is identified as “o•ha•yo•u”.
  • Furthermore, if characters are divided and input one character at a time, the symbol “#” can be used to divide the characters. For example, the input can be given as “1↓#6#8↓#1↑#” or “1↓#6#8↓1↑”. The symbols input in series are converted to the Japanese kana characters “o•ha•yo•u”. In this case, the symbol “N” above need not be used.
  • Moreover, the values may be associated in advance with the characters “a•ka•sa•ta•na•ha•ma•ya•ra•wa”, and when two figures appear consecutively in the input character or symbol sequence, the first value can be converted to the applicable character out of “a•ka•sa•ta•na•ha•ma•ya•ra•wa”. By doing so, “1↓68↓1↑”, for example, is identified as “o•ha•yo•u”, and the symbol “N” above need not be added.
  • Through so-called inter-program processing, the identified characters are delivered to a text processing program which has an editor function. The identified characters are displayed in a message forming area (not shown), where the input characters and symbols can be modified.
  • Japanese kana characters can thus be input using the figure keys and the cross key.
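The decoding described above can be sketched as follows. This is a hypothetical illustration, not the patented implementation: only the pairings stated in the text (“1” → “a” row, “6” → “ha” row, “8” → “ya” row; “N/←/↑/→/↓” → the vowels “a/i/u/e/o”) come from the source, and the remaining row entries are standard fifty-sound-syllabary romanizations filled in for illustration.

```python
# Sketch of the figure-key + cross-key kana decoding described above.
# Consonant row selected by a figure key (subset of the ten rows shown).
ROWS = {
    "1": ["a", "i", "u", "e", "o"],       # "a" row
    "6": ["ha", "hi", "fu", "he", "ho"],  # "ha" row
    "8": ["ya", "ya", "yu", "yo", "yo"],  # "ya" row (gaps filled for illustration)
}

# Vowel index selected by the cross key ("N" = neutral position of the actuator).
VOWELS = {"N": 0, "←": 1, "↑": 2, "→": 3, "↓": 4}

def decode(sequence: str) -> list[str]:
    """Decode a sequence of figure-key/cross-key values into kana syllables."""
    syllables = []
    i = 0
    while i < len(sequence):
        row = ROWS[sequence[i]]
        if i + 1 < len(sequence) and sequence[i + 1] in VOWELS:
            # Figure key followed by a cross-key value: row + vowel.
            syllables.append(row[VOWELS[sequence[i + 1]]])
            i += 2
        else:
            # Two consecutive figures (or end of input): the first value
            # defaults to the "a" vowel, as in the variant described above.
            syllables.append(row[0])
            i += 1
    return syllables

print(decode("1↓6N8↓1↑"))  # ['o', 'ha', 'yo', 'u']
print(decode("1↓68↓1↑"))   # same result without the neutral "N"
```

The same loop handles both input variants: with the explicit neutral value “N” and with the consecutive-figure default.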
  • Furthermore, although a cross key is used in the embodiment above, another pointing device such as a joystick or a trackball can be used instead.
  • INDUSTRIAL APPLICABILITY
  • As explained above, the character communication device according to the present invention requires no keyboard, because terms displayed on the display are selected, assembled and transmitted as messages. Furthermore, even those who are not accustomed to typing on keyboards can easily use this device.
  • Furthermore, another word-card character communication device according to the present invention requires no keyboard, and even those having no knowledge of foreign languages can communicate with speakers of other languages.
  • Moreover, the portable communication device (information processing device) according to the present invention can be operated with both hands while it is held, and the device also takes up less space.
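The message selection and translation scheme summarized above can be sketched as follows. This is an illustrative assumption of one possible structure, not the claimed implementation: each registered message is identified by a language-neutral symbol, only the symbol travels over the network, and each terminal looks the symbol up in its own language column of the message table. All message texts, symbol names and language codes here are hypothetical.

```python
# Sketch of a message language conversion table: the same symbol indexes
# the message text in every supported language (entries are illustrative).
MESSAGE_TABLE = {
    "MSG_GREETING": {"en": "Hello!", "ja": "こんにちは！"},
    "MSG_THANKS":   {"en": "Thank you.", "ja": "ありがとう。"},
}

def send_message(symbol: str) -> str:
    """The sender transmits only the language-neutral symbol."""
    return symbol

def display_message(symbol: str, language: str) -> str:
    """The receiver renders the symbol in its own display language."""
    return MESSAGE_TABLE[symbol][language]

# A Japanese-side user selects "こんにちは！"; the English-side terminal
# receives the same symbol and renders it as "Hello!".
wire = send_message("MSG_GREETING")
print(display_message(wire, "en"))  # Hello!
```

Because only symbols are exchanged, no translation engine is needed at either end; each device needs only its own copy of the shared table.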

Claims (6)

1.-5. (canceled)
6. A character communication device comprising:
a storing means for storing a message language conversion table including a single message expressed in a plurality of languages;
a menu displaying means for selectably displaying on a menu screen of a display a plurality of messages expressed in a single language;
a language conversion means for referring to said message language conversion table and converting a selected message to a message in a language of a communication counterpart; and
a transmitting means for transmitting the converted message to said communication counterpart.
7.-10. (canceled)
11. A character communication system connected to a network and structured to include a plurality of character communication devices capable of performing message communication at least by characters, comprising:
a first character communication device used for communication by a first message group in one language; and
a second character communication device used for communication by a second message group in another language;
wherein communication between said first and second character communication devices is performed using symbols commonly assigned in advance to messages having common meanings, included in each of said first message group and said second message group.
12. A character communication system comprising:
a first character communication device for displaying messages in a first language;
a second character communication device for displaying messages in a second language; and
a network for connecting both communication devices,
wherein said first character communication device comprises:
a storing means for storing a message table including a plurality of messages in said first and second languages;
a message displaying means for selectably displaying on a screen some of said plurality of messages in the first language;
a language conversion means for converting an applicable message in the first language to a message in the second language by referring to said message table; and
a transmitting means for transmitting a converted message to said second character communication device.
13.-17. (canceled)
US12/641,820 1997-11-10 2009-12-18 Character communication device Abandoned US20110004461A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/641,820 US20110004461A1 (en) 1997-11-10 2009-12-18 Character communication device

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP9307537A JPH11143616A (en) 1997-11-10 1997-11-10 Character communication equipment
JP9-307537 1997-11-10
PCT/JP1998/005057 WO1999024900A1 (en) 1997-11-10 1998-11-10 Character communication device
US55406502A 2002-04-01 2002-04-01
US11/431,549 US7664536B2 (en) 1997-11-10 2006-05-11 Character communication device
US12/641,820 US20110004461A1 (en) 1997-11-10 2009-12-18 Character communication device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/431,549 Division US7664536B2 (en) 1997-11-10 2006-05-11 Character communication device

Publications (1)

Publication Number Publication Date
US20110004461A1 true US20110004461A1 (en) 2011-01-06

Family

ID=17970297

Family Applications (3)

Application Number Title Priority Date Filing Date
US09/554,065 Expired - Fee Related US7203908B1 (en) 1997-11-10 1998-11-10 Character communication device
US11/431,549 Active 2024-05-20 US7664536B2 (en) 1997-11-10 2006-05-11 Character communication device
US12/641,820 Abandoned US20110004461A1 (en) 1997-11-10 2009-12-18 Character communication device

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US09/554,065 Expired - Fee Related US7203908B1 (en) 1997-11-10 1998-11-10 Character communication device
US11/431,549 Active 2024-05-20 US7664536B2 (en) 1997-11-10 2006-05-11 Character communication device

Country Status (7)

Country Link
US (3) US7203908B1 (en)
EP (1) EP1031912B1 (en)
JP (1) JPH11143616A (en)
AU (1) AU9763698A (en)
DE (1) DE69836975T2 (en)
ES (1) ES2283077T3 (en)
WO (1) WO1999024900A1 (en)

Families Citing this family (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1051013A1 (en) * 1999-05-03 2000-11-08 Siemens Aktiengesellschaft Text edition by selection of information units
JP3458090B2 (en) * 2000-03-15 2003-10-20 コナミ株式会社 GAME SYSTEM HAVING MESSAGE EXCHANGE FUNCTION, GAME DEVICE USED FOR THE GAME SYSTEM, MESSAGE EXCHANGE SYSTEM, AND COMPUTER-READABLE STORAGE MEDIUM
JP2002032306A (en) * 2000-07-19 2002-01-31 Atr Media Integration & Communications Res Lab Mail transmission system
US6741833B2 (en) * 2000-07-21 2004-05-25 Englishtown, Inc. Learning activity platform and method for teaching a foreign language over a network
JP3818428B2 (en) * 2000-09-21 2006-09-06 株式会社セガ Character communication device
JP2002215538A (en) * 2000-10-31 2002-08-02 Sony Computer Entertainment Inc Communication system, entertainment device, recording medium, and program
JP2002157202A (en) 2000-11-17 2002-05-31 Square Co Ltd Information processor, message communication method, recording medium and computer program
JP2002325965A (en) * 2001-04-27 2002-11-12 Sega Corp Input character processing method
EP1265172A3 (en) 2001-05-18 2004-05-12 Kabushiki Kaisha Square Enix (also trading as Square Enix Co., Ltd.) Terminal device, information viewing method, information viewing method of information server system, and recording medium
JP2002346230A (en) * 2001-05-25 2002-12-03 Namco Ltd Game information, information storage medium, computer system and server system
EP1262216A3 (en) * 2001-05-29 2003-01-08 Konami Computer Entertainment Osaka, Inc. Server device for net game, net game management method, net game management program, and recording medium which stores net game
JP3417935B2 (en) 2001-05-30 2003-06-16 株式会社コナミコンピュータエンタテインメント大阪 NET GAME SERVER DEVICE, NET GAME MANAGEMENT METHOD, AND NET GAME MANAGEMENT PROGRAM
JP3429287B2 (en) 2001-05-29 2003-07-22 株式会社コナミコンピュータエンタテインメント大阪 NET GAME SYSTEM AND NET GAME MANAGEMENT METHOD
EP1262215A3 (en) * 2001-05-29 2003-01-08 Konami Computer Entertainment Osaka, Inc. Server device for net game, net game management method, net game mamagement program, and recording medium which stores net game
EP1262217A3 (en) * 2001-05-29 2003-01-08 Konami Computer Entertainment Osaka, Inc. Server device for net game, net game management method, net game management program, and recording medium which stores net game
JP3417936B2 (en) 2001-05-30 2003-06-16 株式会社コナミコンピュータエンタテインメント大阪 NET GAME SERVER DEVICE, NET GAME MANAGEMENT METHOD, AND NET GAME MANAGEMENT PROGRAM
US7370239B2 (en) 2001-05-31 2008-05-06 Fisher-Rosemount Systems, Inc. Input/output device with configuration, fault isolation and redundant fault assist functionality
FI20021224A (en) 2002-06-24 2003-12-25 Nokia Corp Improved procedure for realizing a text call and terminal equipment utilizing the procedure
US6797998B2 (en) 2002-07-16 2004-09-28 Nvidia Corporation Multi-configuration GPU interface device
JP2004126786A (en) * 2002-09-30 2004-04-22 Konami Co Ltd Communication device, program and communication method
US7908324B2 (en) * 2002-10-02 2011-03-15 Disney Enterprises, Inc. Multi-user interactive communication network environment
JP4506359B2 (en) * 2004-08-31 2010-07-21 カシオ計算機株式会社 Character input device
US8941594B2 (en) * 2004-10-01 2015-01-27 Nvidia Corporation Interface, circuit and method for interfacing with an electronic device
JP2006155433A (en) * 2004-12-01 2006-06-15 Nec Corp Character input language conversion display system, its method, communication terminal using the same and program
JP3865141B2 (en) * 2005-06-15 2007-01-10 任天堂株式会社 Information processing program and information processing apparatus
JP2008104707A (en) * 2006-10-26 2008-05-08 Taito Corp Communication game system, transmitting terminal, receiving terminal, and communication game program
WO2008081657A1 (en) * 2006-12-28 2008-07-10 Softbank Bb Corp. Mobile communication device
JP5114706B2 (en) * 2006-12-28 2013-01-09 ソフトバンクBb株式会社 Mobile communication device
JP4974763B2 (en) * 2007-05-29 2012-07-11 京セラ株式会社 Writing device
US20090049513A1 (en) * 2007-08-17 2009-02-19 Root Jason E System and method for controlling a virtual environment of a user
JP4764910B2 (en) * 2008-11-13 2011-09-07 富士フイルム株式会社 Image display control device
US20100331088A1 (en) * 2009-06-29 2010-12-30 Daniel Jason Culbert Method and System for Real Time Collaborative Story Generation and Scoring
US9477947B2 (en) 2009-08-24 2016-10-25 International Business Machines Corporation Retrospective changing of previously sent messages
JP5760359B2 (en) * 2010-09-24 2015-08-12 日本電気株式会社 Display device, display method, and program
JP2014079400A (en) * 2012-10-17 2014-05-08 Konami Digital Entertainment Co Ltd Game system with comment function and reply control method thereof
US20150004591A1 (en) * 2013-06-27 2015-01-01 DoSomething.Org Device, system, method, and computer-readable medium for providing an educational, text-based interactive game
WO2016136639A1 (en) * 2015-02-27 2016-09-01 株式会社ソニー・インタラクティブエンタテインメント Game device, method for controlling game device, and program
JP6030695B2 (en) * 2015-04-17 2016-11-24 株式会社ソニー・インタラクティブエンタテインメント Information processing apparatus and information processing method
US10311857B2 (en) * 2016-12-09 2019-06-04 Microsoft Technology Licensing, Llc Session text-to-speech conversion
US10179291B2 (en) 2016-12-09 2019-01-15 Microsoft Technology Licensing, Llc Session speech-to-text conversion


Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62127924A (en) * 1985-11-28 1987-06-10 Toshiba Eng Co Ltd Keyboard for inputting japanese language
JPH01137362A (en) * 1987-11-25 1989-05-30 Fujitsu Ltd Data input system
JPH0752471Y2 (en) * 1989-08-08 1995-11-29 株式会社藤井合金製作所 Gas tap
FR2651581B1 (en) 1989-09-06 1994-05-13 Centre Nal Recherc Scientifique MEANS FOR THE DIAGNOSIS OF DEMYELINIZING NEUROPATHIES, IN PARTICULAR MULTIPLE SCLEROSIS.
JPH0485650A (en) * 1990-07-30 1992-03-18 Ricoh Co Ltd Operation manual preparing device
JPH05165979A (en) 1991-12-12 1993-07-02 Casio Comput Co Ltd Computer, handy terminal, and system for key input from computer to handy terminal
JPH0736904A (en) 1993-07-15 1995-02-07 Nec Corp Message control management device
JP3262677B2 (en) 1994-05-02 2002-03-04 株式会社ワコム Information input device
DE4416697A1 (en) 1994-05-11 1995-11-16 Giesecke & Devrient Gmbh Data carrier with integrated circuit
JPH0844640A (en) 1994-07-29 1996-02-16 Nec Corp Automatic merge system for message
JPH08237298A (en) 1995-02-24 1996-09-13 Matsushita Electric Ind Co Ltd Electronic mail transferring device
JPH09222952A (en) 1996-02-19 1997-08-26 Toshiba Corp Character input method and message transmitting method

Patent Citations (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5305206A (en) * 1990-01-26 1994-04-19 Ricoh Company, Ltd. Apparatus for producing an operation manual for use with a host system
US5275818A (en) * 1992-02-11 1994-01-04 Uwe Kind Apparatus employing question and answer grid arrangement and method
US5724457A (en) * 1994-06-06 1998-03-03 Nec Corporation Character string input system
GB2295474A (en) * 1994-11-15 1996-05-29 Fuji Xerox Co Ltd Language-information providing apparatus
US7970598B1 (en) * 1995-02-14 2011-06-28 Aol Inc. System for automated translation of speech
US6021336A (en) * 1995-07-21 2000-02-01 Sony Corporation Portable communication terminal capable of transmitting text data
US6104381A (en) * 1995-12-28 2000-08-15 King Jim Co., Ltd. Character input apparatus
US6002390A (en) * 1996-11-25 1999-12-14 Sony Corporation Text input device and method
US6580917B1 (en) * 1997-05-27 2003-06-17 Siemens Aktiengesellschaft Mobile station for use in mobile radio systems
US6278886B1 (en) * 1997-07-25 2001-08-21 Samsung Electronics Co., Ltd. Device and method for inputting and transmitting messages in a predetermined sequence in a portable telephone
US6490235B1 (en) * 1997-08-07 2002-12-03 Sony Corporation Storage and reproduction apparatus with rotary control element for controlling operations
US6487424B1 (en) * 1998-01-14 2002-11-26 Nokia Mobile Phones Limited Data entry by string of possible candidate information in a communication terminal
US6810272B2 (en) * 1998-01-14 2004-10-26 Nokia Mobile Phones Limited Data entry by string of possible candidate information in a hand-portable communication terminal
US6145084A (en) * 1998-10-08 2000-11-07 Net I Trust Adaptive communication system enabling dissimilar devices to exchange information over a network
US6633746B1 (en) * 1998-11-16 2003-10-14 Sbc Properties, L.P. Pager with a touch-sensitive display screen and method for transmitting a message therefrom
US6848080B1 (en) * 1999-11-05 2005-01-25 Microsoft Corporation Language input architecture for converting one text form to another text form with tolerance to spelling, typographical, and conversion errors
US20010029455A1 (en) * 2000-03-31 2001-10-11 Chin Jeffrey J. Method and apparatus for providing multilingual translation over a network
US6741235B1 (en) * 2000-06-13 2004-05-25 Michael Goren Rapid entry of data and information on a reduced size input area
US20080270113A1 (en) * 2000-11-20 2008-10-30 Jokipii Eron A Multi-language system for online communications
US20020188670A1 (en) * 2001-06-08 2002-12-12 Stringham Gary G. Method and apparatus that enables language translation of an electronic mail message
US6859211B2 (en) * 2001-09-13 2005-02-22 Terry H. Friedlander System and method for generating an online interactive story
US20030125927A1 (en) * 2001-12-28 2003-07-03 Microsoft Corporation Method and system for translating instant messages
US20040030542A1 (en) * 2002-07-26 2004-02-12 Fujitsu Limited Apparatus for and method of performing translation, and computer product
US20040078189A1 (en) * 2002-10-18 2004-04-22 Say-Ling Wen Phonetic identification assisted Chinese input system and method thereof
US20040158471A1 (en) * 2003-02-10 2004-08-12 Davis Joel A. Message translations
US20060133585A1 (en) * 2003-02-10 2006-06-22 Daigle Brian K Message translations
US20040229697A1 (en) * 2003-05-15 2004-11-18 Williams Roland E. Text entry within a video game
US20110223567A1 (en) * 2010-02-03 2011-09-15 Kai Staats Language and communication system

Also Published As

Publication number Publication date
DE69836975T2 (en) 2007-11-15
AU9763698A (en) 1999-05-31
EP1031912A1 (en) 2000-08-30
EP1031912B1 (en) 2007-01-24
ES2283077T3 (en) 2007-10-16
WO1999024900A1 (en) 1999-05-20
EP1031912A4 (en) 2002-01-16
US20060203009A1 (en) 2006-09-14
US7664536B2 (en) 2010-02-16
US7203908B1 (en) 2007-04-10
JPH11143616A (en) 1999-05-28
DE69836975D1 (en) 2007-03-15

Similar Documents

Publication Publication Date Title
US7664536B2 (en) Character communication device
JP3818428B2 (en) Character communication device
JP3679350B2 (en) Program, information storage medium and computer system
US9729478B2 (en) Network server and computer system for providing user status data
JP6552114B2 (en) Chat interface with haptic feedback functionality
US8317617B2 (en) Communication system and method using pictorial characters
US6629793B1 (en) Emoticon keyboard
JP2006285535A (en) Information display controller, its method, program, and recording medium
JP2002346230A (en) Game information, information storage medium, computer system and server system
JP3930489B2 (en) Chat system, communication apparatus, control method thereof, and program
EP1172132A2 (en) Entertainment system, recording medium
Albrant Universal Usability & Social Inclusion in the Video Game Industry: Where We are Now and Where We Go Next
KR100546864B1 (en) A character input system using the dual shock and a method thereof
JP2004215800 (en) GAME MESSAGE DISPLAY PROGRAM, GAME DEVICE AND GAME MESSAGE DISPLAY METHOD
JP2002078963A (en) Entertainment device and recording medium
JP2003108548A (en) Game device, game control method, its recording medium and computer program
JP2003108298A (en) Game device, game control method and recording medium therefor, and computer program
JP2003108549A (en) Game device, game control method, its recording medium and computer program
JP2002219272A (en) Videogame device, method of controlling it, program for videogame, and computer readable storage medium
JPH03270445A (en) Keyboard device
KR20050016674A (en) Communication device, communication method, program, and information recording medium
JP2004180701A (en) Input device for game which can be operated with one hand

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION