US20070180360A1 - Apparatus, method and machine-readable medium for facilitating generation of a markup language document containing identical sets of markup language elements - Google Patents

Apparatus, method and machine-readable medium for facilitating generation of a markup language document containing identical sets of markup language elements

Info

Publication number
US20070180360A1
US20070180360A1 (application US11/345,326)
Authority
US
United States
Prior art keywords
markup language
xml
application
wireless communication
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US11/345,326
Other versions
US8046679B2 (en)
Inventor
Tim Neil
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Malikie Innovations Ltd
Original Assignee
Nextair Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Assigned to NEXTAIR CORPORATION (assignment of assignors interest; see document for details). Assignors: NEIL, TIM
Priority to US11/345,326 (granted as US8046679B2)
Priority to EP06101233A (published as EP1816573A1)
Application filed by Nextair Corp
Priority to CA002576697A (published as CA2576697A1)
Publication of US20070180360A1
Assigned to RESEARCH IN MOTION LIMITED (assignment of assignors interest; see document for details). Assignors: NEXTAIR CORPORATION
Publication of US8046679B2
Application granted
Assigned to BLACKBERRY LIMITED (change of name; see document for details). Assignors: RESEARCH IN MOTION LIMITED
Assigned to MALIKIE INNOVATIONS LIMITED (assignment of assignors interest; see document for details). Assignors: BLACKBERRY LIMITED
Assigned to MALIKIE INNOVATIONS LIMITED (nunc pro tunc assignment; see document for details). Assignors: BLACKBERRY LIMITED
Legal status: Active
Expiration: Adjusted

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/80 - Information retrieval; Database structures therefor; File system structures therefor of semi-structured data, e.g. markup language structured data such as SGML, XML or HTML
    • G06F 16/84 - Mapping; Conversion
    • G06F 16/88 - Mark-up to mark-up conversion

Definitions

  • the present invention relates to markup languages, and more particularly to an apparatus, method and machine-readable medium for facilitating generation of a markup language document containing identical sets of markup language elements.
  • the use of markup languages such as Extensible Markup Language (XML) is prevalent in modern computing. This is likely due in part to the fact that markup language documents may be expressed in a simple textual form which can be processed by many different types of computing devices and operating system platforms. Markup language documents may thus facilitate cross-platform computing.
  • An apparatus, method and machine-readable medium for facilitating generation of a markup language document containing identical sets of markup language elements would be desirable.
  • FIG. 1 schematically illustrates a wireless communication device including virtual machine software
  • FIG. 2A illustrates the organization of exemplary virtual machine software at the wireless communication device of FIG. 1 ;
  • FIG. 2B further illustrates the organization of exemplary virtual machine software at the wireless communication device of FIG. 1 ;
  • FIG. 3 illustrates an operating environment for the wireless communication device of FIG. 1 ;
  • FIG. 4 illustrates the structure of example application definitions used by the device of FIG. 1 ;
  • FIG. 5 schematically illustrates the formation of application definition files at a transaction server of FIG. 3 from a master definition file
  • FIG. 6 schematically illustrates the transaction server of FIG. 3 in greater detail
  • FIG. 7 is a flow diagram illustrating the exchange of sample messages passed between a wireless communication device, transaction server and application server;
  • FIGS. 8-10 illustrate operation performed at a wireless communication device under control of virtual machine software of FIGS. 2A and 2B ;
  • FIG. 11 schematically illustrates the wireless communication device operating environment of FIG. 3 with an exemplary Rapid Application Development (RAD) tool which may be used to develop master definition files in a manner exemplary of an embodiment of the present invention
  • FIG. 12 schematically illustrates the RAD tool of FIG. 11 in greater detail
  • FIG. 13 illustrates an exemplary graphical user interface (GUI) of the RAD tool of FIG. 12 ;
  • FIG. 14 illustrates a project explorer portion of the RAD tool GUI of FIG. 13 in which exemplary global functions are declared
  • FIG. 15 illustrates a login screen for a Pocket PC wireless computing device defined in the project explorer of FIG. 14 ;
  • FIG. 16 illustrates a French language version of the login screen of FIG. 15 that is also defined in the project explorer of FIG. 14 ;
  • FIGS. 17A-17B textually illustrate a master definition Document Object Model (DOM) tree that is maintained in the memory of the RAD tool of FIG. 12 during mobile application design to represent the screens of FIGS. 15 and 16 ; and
  • FIGS. 18A-18B illustrate a master definition file which results from the serialization of the DOM tree of FIGS. 17A-17B .
  • the embodiment described herein pertains to an apparatus, method and machine-readable medium for facilitating generation of a markup language document containing identical sets of markup language elements.
  • the embodiment is described, however, within the specific context of a system that presents server-side applications at varied wireless communication devices (also referred to as “mobile devices”).
  • an apparatus comprising: at least one processor; and a memory coupled to the at least one processor storing: a markup language document containing: a markup language element hierarchy containing a set of markup language elements; and a plurality of references to the markup language element hierarchy; and machine-executable code which, when executed by the at least one processor, generates, from the markup language document, another markup language document containing one instance of the set of markup language elements for each of the plurality of references.
  • a machine-readable medium comprising: machine-executable code for generating, from a markup language document containing: a markup language element hierarchy containing a set of markup language elements; and a plurality of references to the markup language element hierarchy, another markup language document containing one instance of the set of markup language elements for each of the plurality of references.
  • a method comprising: generating, from a markup language document containing: a markup language element hierarchy containing a set of markup language elements; and a plurality of references to the markup language element hierarchy, another markup language document containing one instance of the set of markup language elements for each of the plurality of references.
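  • As a rough illustration only (not the claimed implementation), the generation recited above can be pictured as a DOM transformation: every reference element is replaced by a deep copy of the referenced element hierarchy, so the output document ends up holding one identical set of elements per reference. The element and attribute names (FUNCTION, FUNCTIONREF, NAME) in the sketch below are assumptions chosen for illustration.

```java
import java.io.File;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

// Sketch: expand every reference element in a source document into a deep copy of the
// referenced element hierarchy, so the output holds one identical set of elements per
// reference. Element and attribute names are assumptions, not the actual ARML tags.
public final class ReferenceExpander {

    public static Document expand(File source) throws Exception {
        DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.parse(source);

        // Index each declared element hierarchy (e.g. a global function) by its name.
        Map<String, Element> declared = new HashMap<>();
        NodeList functions = doc.getElementsByTagName("FUNCTION");
        for (int i = 0; i < functions.getLength(); i++) {
            Element function = (Element) functions.item(i);
            declared.put(function.getAttribute("NAME"), function);
        }

        // Snapshot the references first, because the NodeList is live and the tree is mutated.
        NodeList refs = doc.getElementsByTagName("FUNCTIONREF");
        List<Element> references = new ArrayList<>();
        for (int i = 0; i < refs.getLength(); i++) references.add((Element) refs.item(i));

        // Replace each reference with a copy of the referenced hierarchy's child elements.
        for (Element ref : references) {
            Element target = declared.get(ref.getAttribute("NAME"));
            Node parent = ref.getParentNode();
            NodeList children = target.getChildNodes();
            for (int j = 0; j < children.getLength(); j++) {
                parent.insertBefore(children.item(j).cloneNode(true), ref);
            }
            parent.removeChild(ref);
        }
        return doc;  // serialize (e.g. with javax.xml.transform) to obtain the output document
    }
}
```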
  • a system which facilitates execution of server-side applications at wireless communication devices utilizes a text-based application definition file to govern the manner in which an application is presented at a wireless communication device.
  • the application definition file contains a description of how an application is to be presented at a wireless communication device, the format of transactions over the wireless network, and a format of data related to the application to be stored at the wireless communication device.
  • the application definition file of the present embodiment is written in Extensible Markup Language (XML).
  • a virtual machine software component at the wireless communication device interprets the definition file and presents an interface to the application in accordance with the definition file.
  • FIG. 1 illustrates a wireless communication device 10 , exemplary of an embodiment of the present invention.
  • Wireless communication device 10 may be any conventional wireless communication device, modified to function in the manner described below.
  • wireless communication device 10 includes a processor 12 , in communication with a network interface 14 , storage memory 16 , and a user interface 18 typically including a keypad and/or touch-screen.
  • Network interface 14 enables device 10 to transmit and receive data over a wireless network 22 .
  • Wireless communication device 10 may be, for example, a Research in Motion (RIM) two-way paging device, a WinCE-based device, a PalmOS device, a WAP-enabled mobile telephone, or the like.
  • Memory 16 of device 10 stores mobile operating system software 20 , such as the PalmOS or WinCE operating system.
  • Operating system software 20 typically includes graphical user interface and network interface software having suitable application programmer interfaces (“API”s) for use by other applications executing at device 10 .
  • Memory at device 10 further stores virtual machine software 24 which, when executed by wireless communication device 10 , enables device 10 to present an interface for server-side applications provided by a transaction server, described below.
  • virtual machine software 24 interprets a textual application definition file (a markup language document) containing a definition of the user interface 18 controlling application functionality and the display format (including display flow) at device 10 for a particular server-side application; the format of data to be exchanged over the wireless network for the application; and the format of data to be stored locally at device 10 for the application.
  • Virtual machine software 24 uses operating system 20 and associated APIs to interact with device 10 , in accordance with the received application definition file. In this way, device 10 may present interfaces for a variety of applications, stored at a server.
  • virtual machine software 24 is viewed as another application resident at device 10 .
  • multiple wireless devices each having a similar virtual machine software 24 may use a common server-side application in combination with an application definition file, to present a user interface and program flow specifically adapted for the device.
  • the exemplary virtual machine software 24 is specifically adapted to work with the particular wireless communication device 10 .
  • for example, if device 10 is a RIM device, virtual machine software 24 is a RIM virtual machine; similarly, if device 10 is a PalmOS or WinCE device, virtual machine software 24 would be a PalmOS or a WinCE virtual machine.
  • virtual machine software 24 is capable of accessing local storage 26 at device 10 .
  • the application definition file is formed using the well-known markup language XML.
  • XML entities are understood by the virtual machine software 24 .
  • XML entities are detailed in Appendix “A”, attached hereto.
  • AIRIX™ Markup Language (ARML) is an XML markup language used in the present embodiment.
  • the defined XML entities are interpreted by the virtual machine software 24 , and may be used as building blocks to present server-side applications at wireless communication device 10 , as detailed herein.
  • virtual machine software 24 includes a conventional XML parser 61 ; an event handler 65 ; a screen generation engine 67 ; and object classes 69 corresponding to XML entities supported by the virtual machine software 24 , and possibly contained within an application definition file 28 .
  • Supported XML entities are detailed in Appendix “A”.
  • a person of ordinary skill will readily appreciate that those XML elements and attributes identified in Appendix “A” are exemplary only, and may be extended or shortened as desired, as described in Section II hereinafter, for example.
  • XML parser 61 may be formed in accordance with the Document Object Model, or DOM, which is available at www.w3.org/DOM/ and is incorporated by reference hereinto. Parser 61 enables virtual machine software 24 to read an application definition file. Using the parser, the virtual machine software 24 may form a binary representation of the application definition file for storage at the wireless communication device, thereby eliminating the need to parse text each time an application is used. Parser 61 may convert each XML tag contained in the application definition file, and its associated data to tokens, for later processing. As will become apparent, this may avoid the need to repeatedly parse the text of an application definition file.
  • Screen generation engine 67 displays initial and subsequent screens at the wireless communication device, in accordance with an application definition 28 , as detailed below.
  • Event handler 65 of virtual machine software 24 allows device 10 under control of virtual machine software 24 to react to certain external events.
  • Example events include user interaction with presented graphical user interface (GUI) screens or display elements, incoming messages received from a wireless network, or the like.
  • Object classes 69 also form part of virtual machine 24 and define objects that allow device 10 to process each of the supported XML entities at the wireless communication device.
  • Each of object classes 69 includes attributes (e.g. fields or data members) used to store parameters defined by the XML file (XML element and/or attribute values), and allowing the XML entity to be processed at the wireless communication device, as detailed in Appendix “A”, for each supported XML entity.
  • Virtual machine software 24 may be expanded to support XML entities not detailed in Appendix “A”.
  • object instances corresponding to GUI screen display elements (e.g. menu items, text items, buttons, etc.) are instantiated at run time, and the object instances are “customized” using XML element and attribute values contained in the application definition file 28 .
  • the event handler 65 of the virtual machine software 24 reacts to events for the application. The manner in which the event handler reacts to events is governed by the contents of the application definition file. Events may trigger processing defined within instances of associated “action” objects, which objects are instantiated from object classes 69 of virtual machine software 24 .
  • object classes 69 of virtual machine software 24 further include object classes corresponding to data tables and network transactions defined in the Table Definition and Package Definition sections of Appendix “A”. At run time, instances of object classes corresponding to these classes are created and populated with parameters contained within application definition file, as required.
  • FIG. 2B illustrates in greater detail the manner in which the virtual machine software 24 of FIG. 2A may be organized.
  • the wireless communication device 10 is currently executing a wireless communication device application (also referred to as a “mobile application”).
  • the virtual machine software 24 has two categories of components, namely, objects 169 and general purpose routines 59 .
  • Objects 169 are instantiations of object classes 69 ( FIG. 2A ) which are instantiated dynamically at run time when the application is executed at the wireless communication device 10 .
  • the types of objects 169 that are instantiated at any given moment (e.g. screens, buttons, events, actions, etc., as will be described) depend upon the mobile application currently being executed and its state, including which user interface screen is currently displayed at the wireless communication device.
  • Each of objects 169 corresponds to an application component defined within the application definition file 28 .
  • the objects 169 are instantiated from binary representations 178 thereof which are maintained in secondary storage 26 , which representations 178 are created when the application definition file 28 is initially parsed.
  • Each object 169 contains methods which capture certain behaviours that are performed by all instances of the represented object, as well as data members which permit the characteristics or behavior of the object to be “customized” (e.g. each instance of a button object may include the same highlight( ) method which, if invoked, causes the button to become highlighted, and may further include X and Y coordinate data member values which define a unique location of the button on the encompassing UI screen).
  • General purpose routines 59 constitute a managing environment for the objects 169 .
  • the routines 59 encompass functionality which is useful for executing a mobile application at the wireless communication device but is not necessarily tied to a particular type of object 169 .
  • the routines 59 may include the XML parser 61 , which initially parses the application definition file 28 .
  • Other routines may facilitate loading or closing of UI screens, or the sending of messages over the wireless network 22 , as will be described.
  • the routines 59 effectively consolidate certain functionality for convenient invocation from any of objects 169 , as required.
  • virtual machine software 24 may be formed using conventional object-oriented programming techniques, and existing device libraries and APIs, so as to function as detailed herein.
  • object classes 69 will vary depending on the type of virtual machine software, its operating system, and the APIs available at the device.
  • a machine executable version of virtual machine software 24 may be loaded and stored at a wireless communication device, using conventional techniques. It can be embedded in ROM, loaded into RAM over a network, or from a computer readable medium.
  • although virtual machine software 24 and software forming object classes 69 are formed using object-oriented structures, object classes 69 forming part of the virtual machine could be replaced by equivalent functions, data structures or subroutines formed using a conventional (i.e. non-object-oriented) programming environment. Operation of virtual machine software 24 under control of an application definition file containing various XML definitions exemplified in Appendix “A” is further detailed below.
  • FIG. 3 illustrates the operating environment for a wireless communication device 10 .
  • wireless communication devices 30 , 32 and 34 are also illustrated in FIG. 3 .
  • These wireless communication devices 30 , 32 and 34 are similar to device 10 and also store and execute virtual machine software.
  • Virtual machine software like that stored at device 10 , executes on each wireless communication device 10 , 30 , 32 , 34 , and communicates with a transaction server 44 (referred to as a “middleware server 44 ” in U.S. Patent Publication No. US 2003/0060896, referenced above) by way of example wireless networks 36 and 38 and network gateways 40 and 42 .
  • Example gateways 40 and 42 are generally available as a service for those people wishing to have data access to wireless networks.
  • Wireless networks 36 and 38 are further connected to one or more computer data networks, such as the Internet and/or private data networks by way of gateway 40 or 42 .
  • embodiments of the invention may work with many types of wireless networks.
  • Transaction server 44 is in turn in communication with a data network, that is in communication with wireless networks 36 and 38 .
  • the protocol used for such communication is HyperText Transfer Protocol (HTTP) over Transmission Control Protocol/Internet Protocol (TCP/IP).
  • At least two categories of communication between transaction server 44 and wireless communication devices 10 , 30 , 32 and 34 exist.
  • virtual machine software 24 at each device may query transaction server 44 for a list of applications that a user of an associated wireless communication device 10 , 30 , 32 or 34 can make use of. If a user decides to use a particular application, device 10 , 30 , 32 or 34 can download a text description, in the form of an application definition file, for the application from the transaction server 44 over its wireless interface.
  • virtual machine software 24 may send and receive (as well as present, and locally store) data to and from transaction server 44 which is related to the execution of applications, or its own internal operations.
  • the format of exchanged data for each application is defined by an associated application definition file. Again, the exchanged data may be formatted using XML, in accordance with the application definition file.
  • Transaction server 44 stores, in a pre-defined format understood by virtual machine software 24 , XML application definition files for those applications that have been enabled to work with the various devices 10 , 30 , 32 and 34 .
  • Software providing the functions of the transaction server 44 in the exemplary embodiment is written in C#, using an SQL Server or MySQL database.
  • the XML of the application definition files may conform to XML version 1.0, detailed in the XML version 1.0 specification third edition and available at www.w3.org/TR/2004/REC-xml-20040404, for example.
  • Each application definition file is formatted according to defined rules and uses pre-determined XML markup tags known by both virtual machine software 24 , and complementary transaction server software 68 . That is, each application definition file 28 is an XML document (i.e. an XML data instance file) which conforms to a predefined XML schema designed to support the execution of server-side applications at various types of wireless communication devices. Tags define XML elements used as building blocks to present an application at a wireless communication device. Knowledge of these rules, and an understanding of how each tag and section of text should be interpreted, allows virtual machine software 24 to process an XML application definition and thereafter execute an application, as described below. Virtual machine software 24 effectively acts as an interpreter for a given application definition file.
  • FIG. 4 illustrates an example format for an XML application definition file 28 .
  • the example application definition file 28 for a given device and application includes three components: a user interface definition section 48 , specific to the user interface for the device 10 , which defines the format of graphical user interface (GUI) screens for the application and how the user interacts with them and contains application flow control events and actions; a network transactions definition section 50 defining the format of data to be exchanged with the application; and a local data definition section 52 defining the format of data to be stored locally on the wireless communication device by the application.
  • XML markup tags are used to create an application definition file 28 .
  • the defined tags may broadly be classified into three categories, corresponding to the three sections 48 , 50 and 52 of an application definition file 28 .
  • Example XML tags and their corresponding significance are detailed in Appendix “A”.
  • virtual machine software 24 at a wireless communication device includes object classes corresponding to each of the XML tags. At run time, instances of the objects are created as required.
  • the second category of example XML tags describes the network transaction section 50 of application definition 28 . These may include the following example XML tags:
  • the third category of XML tags used to describe an application are those used to define a logical database that may be stored at the wireless communication device.
  • the tags available that may be used in this section are:
  • virtual machine software 24 may, from time to time, need to perform certain administrative functions on behalf of a user.
  • one of object classes 69 has its own repertoire of tags to intercommunicate with the transaction server 44 .
  • tags differ from the previous three groupings in that they do not form part of an application definition file, but are solely used for administrative communications between the virtual machine software 24 and the transaction server 44 .
  • Data packages using these tags are composed and sent due to user interactions with the virtual machine's configuration screens.
  • the tags used for this include:
  • FIG. 5 illustrates the organization of application definitions at transaction server 44 and how transaction server 44 may form an application definition file 28 ( FIG. 4 ) for a given device 10 , 30 , 32 or 34 .
  • in FIG. 5 , only two wireless communication devices 10 and 30 are considered.
  • the portion of each application definition file that typically varies by device is the user interface definition, i.e. the definition of its GUI screens.
  • transaction server 44 stores a master definition file 58 (or simply “master definition” 58 ) for a given server-side application.
  • This master definition 58 contains example user interface descriptions 48 , 54 , 56 for each possible type of wireless communication device 10 , 30 , 32 ; descriptions of the network transactions 50 that are possible and data descriptions 52 of the data to be stored locally on the wireless communication device.
  • the network transactions 50 and data descriptions 52 will be the same for all wireless communication devices 10 , 30 and 32 , while the user interface descriptions 48 , 54 , and 56 vary slightly from one another. This may for example be due to display size limitations on some wireless communication devices which force a designer to lay out the display elements of a user interface slightly differently from device to device.
  • transaction server 44 composes an application definition file 28 by querying the device type and adding an appropriate user interface description 48 for device 10 to the definitions for the network transactions 50 and the data 52 .
  • transaction server 44 composes the application definition file 28 by adding the user interface description 54 for device 30 to the definitions for the network transactions 50 and data 52 .
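  • A minimal sketch of this composition step (the element names APPLICATION, INTERFACE, NETWORKTRANSACTIONS and LOCALDATA, and the DEVICE attribute, are assumptions rather than the actual ARML tags): the shared network-transaction and local-data sections are copied verbatim from the master definition, and only the interface description matching the requesting device type is added.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

// Sketch of the composition step of FIG. 5 under assumed element names.
public final class DefinitionComposer {

    public static Document compose(Document master, String deviceType) throws Exception {
        Document appDef = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().newDocument();
        Element root = appDef.createElement("APPLICATION");
        appDef.appendChild(root);

        // Shared sections: identical for every device type.
        root.appendChild(appDef.importNode(firstByTag(master, "NETWORKTRANSACTIONS"), true));
        root.appendChild(appDef.importNode(firstByTag(master, "LOCALDATA"), true));

        // Device-specific section: pick the one interface description for this device type.
        NodeList interfaces = master.getElementsByTagName("INTERFACE");
        for (int i = 0; i < interfaces.getLength(); i++) {
            Element iface = (Element) interfaces.item(i);
            if (deviceType.equals(iface.getAttribute("DEVICE"))) {
                root.appendChild(appDef.importNode(iface, true));
                break;
            }
        }
        return appDef;
    }

    private static Node firstByTag(Document doc, String tag) {
        return doc.getElementsByTagName(tag).item(0);
    }
}
```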
  • the master definition 58 for a given application is created away from the transaction server 44 and may be loaded onto the transaction server 44 by administrative staff charged with its operation. Master definition files may be created by a developer using a rapid application development tool such as the one described below in Section II. Alternatively, a simple text editor could be used. It will be appreciated that the master definition file 58 is an XML document.
  • FIG. 6 illustrates the organization of transaction server 44 .
  • Transaction server 44 may be any conventional application server, modified to function as described herein.
  • transaction server 44 includes a processor 60 , in communication with a network interface 66 and storage memory 64 .
  • Transaction server 44 may be, for example, a server running Windows Server 2003 , a Sun Solaris server, or the like.
  • Memory of transaction server 44 stores an operating system such as Windows Server 2003, or Solaris operating system software 62 .
  • Network interface 66 enables transaction server 44 to transmit and receive data over a data network 63 . Transmissions are used to communicate with both the virtual machine software 24 (via the wireless networks 36 , 38 and wireless gateways 40 , 42 of FIG. 3 ) and one or more application servers, such as application server 70 , that are the end recipients of data sent from the mobile client applications and the generators of data that is sent to the mobile client applications.
  • Memory at transaction server 44 further stores software 68 which, when executed by transaction server 44 , enables the transaction server to understand and compose XML data packages that are sent and received by the transaction server 44 . These packages may be exchanged between transaction server 44 and the virtual machine software 24 , or between the transaction server 44 and the application server 70 .
  • Transaction server software 68 may be loaded from a machine-readable medium.
  • communication between the application server 70 and the transaction server 44 can, in an exemplary embodiment, use HTTP running on top of a standard TCP/IP stack; however this is not a requirement.
  • An HTTP connection between a running application at the application server 70 and the transaction server 44 is established in response to the virtual machine software at a wireless communication device presenting the application.
  • the server-side application provides output to transaction server 44 over this connection.
  • the server-side application formats its output data into appropriate XML data packages understood by the virtual machine software 24 at a wireless communication device.
  • a server-side application (or an interface portion of the application) formats application output into XML in a manner consistent with the format defined by the application definition file for the application.
  • an interface component separate from the application could easily be formed with an understanding of the format and output for a particular application. That is, with a knowledge of the format of data provided and expected by an application at application server 70 , an interface component could be produced using techniques readily understood by those of ordinary skill.
  • the interface portion could translate application output to XML, as expected by transaction server 44 .
  • the interface portion may translate XML input from a wireless communication device into a format understood by the server-side application.
  • the particular identity of the wireless communication device on which the application is to be presented may be identified by a suitable identifier, in the form of a header contained in the server-side application output. This header may be used by transaction server 44 to forward the data to the appropriate wireless communication device. Alternatively, the identity of the connection could be used to forward the data to the appropriate wireless communication device.
  • FIG. 7 illustrates a sequence diagram detailing data (application data or application definition files 28 ) flow between wireless communication device 10 and transaction server 44 .
  • for data requested from transaction server 44 , device 10 , executing virtual machine software 24 , makes a request to transaction server 44 , which passes over the wireless network 36 through network gateway 40 .
  • Network gateway 40 passes the request to the transaction server 44 .
  • Transaction server 44 responds by executing a database query on its database 46 that finds which applications are available to the user and the user's wireless communication device.
  • for data passed from transaction server 44 to device 10 , data is routed through network gateway 40 .
  • Network gateway 40 forwards the information to the user's wireless communication device over the wireless network 36 .
  • FIG. 7 when considered with FIG. 3 illustrates a sequence of communications between virtual machine software 24 (executing at device 10 ) and transaction server 44 that may occur when the user of a wireless communication device wishes to download an application definition file 28 for a server-side application.
  • Initially device 10 may interrogate server 44 to determine which applications are available for the particular wireless communication device being used. This may be accomplished by the user instructing the virtual machine software 24 at device 10 to interrogate the server 44 . Responsive to these instructions, the virtual machine software 24 sends an XML message to the server requesting the list of applications (data flow 72 ); the XML message may contain the <FINDAPPS> tag, signifying to the transaction server 44 its desire for a list of available applications. In response, transaction server 44 makes a query to database 46 . Database 46 , responsive to this query, returns a list of applications that are available to the user and the wireless communication device.
  • the list is typically based, at least in part, on the type of wireless communication device making the request, and the applications known to transaction server 44 .
  • Transaction server 44 converts this list to an XML message and sends it to the virtual machine (data flow 74 ). Again, a suitable XML tag identifies the message as containing the list of available applications.
  • a user at device 10 may choose to register for an available server-side application.
  • virtual machine software 24 at device 10 composes and sends an XML registration request for a selected application (data flow 76 ) to transaction server 44 .
  • an XML message containing a <REG> tag is sent to transaction server 44 .
  • the name of the application is specified in the message.
  • the transaction server 44 queries its database for the user interface definition for the selected application for the user's wireless communication device. Thereafter, the transaction server creates the application definition file, as detailed with reference to FIG. 5 .
  • transaction server 44 sends to the wireless communication device (data flow 78 — FIG. 7 ) the created application definition file 28 .
  • the user is then able to use the functionality defined by the interface description to send and receive data.
  • parser 61 of virtual machine software 24 may parse the XML text of the application definition file to form a tokenized version of the file. That is, each XML tag may be converted to a defined token for compact storage and to minimize repeated parsing of the XML text file.
  • the tokenized version of the application definition file may be stored for immediate or later use by device 10 .
  • “tokenized” may refer to placement of the XML structure into binary objects which are run-time accessible, much like conversion of a script into byte code.
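  • A toy sketch of what such tokenization could look like (the tag-to-token mapping and helper names are assumptions, not the embodiment's actual encoding): each known tag name is replaced by a small binary token, with attribute values kept alongside, so the stored form can be read back without re-parsing XML text.

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.util.Map;

// Sketch only: compact binary tokens standing in for textual XML tags.
public final class TagTokenizer {
    private static final Map<String, Integer> TOKENS =
            Map.of("SCREEN", 1, "BTN", 2, "EVENT", 3, "ACTION", 4);

    /** Writes a one-byte token followed by a length-prefixed attribute string. */
    public static void writeElement(DataOutputStream out, String tag, String attributes)
            throws IOException {
        out.writeByte(TOKENS.get(tag));   // compact stand-in for the textual tag
        out.writeUTF(attributes);         // attribute values are still needed at run time
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        try (DataOutputStream out = new DataOutputStream(buffer)) {
            writeElement(out, "BTN", "NAME=BTN1 INDEX=1 CAPTION=OK");
        }
        System.out.println("binary form is " + buffer.size() + " bytes");
    }
}
```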
  • the application definition file may initially be converted to a DOM tree representation.
  • the entire DOM tree may then be traversed.
  • a corresponding object 169 ( FIG. 2B ) may be instantiated from one of object classes 69 .
  • Instantiation of each object 169 may be facilitated by a fromXML( ) “constructor” method within the corresponding class 69 , which populates the object's data members based on XML element/attribute values.
  • the constructor method may receive the XML fragment which defines the XML element in the application definition file 28 and, based on the element and attribute values within the fragment, automatically populate the newly instantiated object's data members with like values.
  • the constructor method may or may not meet the strict definition of the term “constructor” as it is understood in the context of certain object-oriented programming languages (e.g. the method may not have the same name as the class).
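  • The XML fragment discussed in the next paragraphs is not reproduced here; the following is a plausible ARML-style reconstruction, shown as a Java string constant. Its element and attribute names are assumptions, but the attribute values (“BTN1”, “1”, “OK”) and the single ONCLICK event with two actions match the description that follows.

```java
// Hypothetical ARML-style fragment for the "OK" button described below; element and
// attribute names are assumptions. Details of the two actions are omitted for brevity.
public final class SampleFragments {
    public static final String OK_BUTTON_FRAGMENT =
            "<BTN NAME=\"BTN1\" INDEX=\"1\" CAPTION=\"OK\">\n"
            + "  <EVENTS>\n"
            + "    <EVENT TYPE=\"ONCLICK\">\n"
            + "      <ACTION/>\n"   // first action (details omitted)
            + "      <ACTION/>\n"   // second action (details omitted)
            + "    </EVENT>\n"
            + "  </EVENTS>\n"
            + "</BTN>";
}
```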
  • the above XML fragment represents an “OK” button on a containing GUI screen (not shown) which performs two actions when clicked. The details of the actions are omitted for brevity.
  • the result may be instantiation of the button object 173 ( FIG. 2B ) from the following one of classes 69 :
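  • The class listing itself is not shown; a minimal Java sketch consistent with the description that follows (data members name, index and caption, an array of event objects each holding an array of action objects, and a fromXML( ) “constructor” method) might look like this. All identifiers are assumptions.

```java
import java.util.ArrayList;
import java.util.List;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Sketch of one of classes 69 (assumed names): data members mirror the XML attributes, and
// fromXML() populates a new instance, including its event array, from the button's fragment.
public class ButtonObject {
    String name;                                   // populated with "BTN1"
    int index;                                     // populated with 1
    String caption;                                // populated with "OK"
    List<ButtonEvent> events = new ArrayList<>();  // e.g. a single ONCLICK event object

    public static ButtonObject fromXML(Element buttonElement) {
        ButtonObject button = new ButtonObject();
        button.name = buttonElement.getAttribute("NAME");
        button.index = Integer.parseInt(buttonElement.getAttribute("INDEX"));
        button.caption = buttonElement.getAttribute("CAPTION");
        NodeList eventElements = buttonElement.getElementsByTagName("EVENT");
        for (int i = 0; i < eventElements.getLength(); i++) {
            button.events.add(ButtonEvent.fromXML((Element) eventElements.item(i)));
        }
        return button;
    }

    void highlight() { /* ask the OS-level widget to highlight itself */ }

    /** One significant event (e.g. ONCLICK) and the actions to run when it occurs. */
    static class ButtonEvent {
        String type;
        List<ButtonAction> actions = new ArrayList<>();

        static ButtonEvent fromXML(Element eventElement) {
            ButtonEvent event = new ButtonEvent();
            event.type = eventElement.getAttribute("TYPE");
            NodeList actionElements = eventElement.getElementsByTagName("ACTION");
            for (int i = 0; i < actionElements.getLength(); i++) {
                event.actions.add(new ButtonAction());  // details of each action omitted
            }
            return event;
        }
    }

    /** Placeholder for an action object populated from an ACTION element. */
    static class ButtonAction { }
}
```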
  • the data members “name”, “index” and “caption” of object 173 correspond to attributes of the same name within the XML fragment.
  • the constructor method fromXML( ) populates these data members with the values “BTN1”, “1” and “OK”, respectively, based on the relevant XML attribute values.
  • the constructor method also populates the event array of button object 173 .
  • the event array is an array of event objects, each representing a different type of event that is significant with regard to the containing GUI screen display element (in this case, button object 173 ).
  • only one significant event is defined for the “OK” button (button object 173 ), namely, an “ONCLICK” event which represents the clicking of the button. Accordingly, only one event object 175 is instantiated.
  • the event object's data members include an array of action objects 177 and 179 (one for each action element in the above XML fragment) representing actions to be taken when the event occurs.
  • Each action object is also populated by a constructor method within the action object, in like fashion.
  • The result of instantiating the button object and subordinate objects is illustrated in FIG. 2B .
  • the button object is shown at 173 , within the context of objects 169 .
  • illustration of an object within the border of another object connotes the latter object's containment of the former.
  • a contained object is a data member of the containing object.
  • the button object 173 is contained within a screen object 171 which also includes an edit box object 181 .
  • This hierarchy indicates a UI screen having both a button and an edit box.
  • the sole significant event for the button object 173 is represented by event object 175 , which is the sole member of the event array of button object 173 .
  • the event object 175 in turn contains action objects 177 and 179 which represent actions that are to be taken when the containing event occurs. The actions may be of various types, as will be described.
  • the edit box object 181 of FIG. 2B contains two events 183 and 191 , each representing a significant event for the edit box (say, selection and text entry). These event objects in turn contain actions 185 , 193 and 195 , representing actions to be taken when the relevant event occurs.
  • the button class contains an onEvent( ) method. This method is invoked via a callback from the operating system 20 upon the detection of any event pertaining to the button UI construct for purposes of determining whether the detected event is significant and thus requires action to be taken.
  • Other UI constructs such as edit boxes, menu items, and the like also have a similar method. Cumulatively, these methods within instantiated objects 169 may comprise event handler 65 of FIG. 2A .
  • Each class also includes a writeToPersistentStorage( ) method which saves the object's state by storing data member values, e.g. to a file system.
  • the values are typically stored in a binary representation.
  • This method is invoked during initial DOM tree traversal for purposes of writing to persistent storage newly instantiated objects which are not immediately needed.
  • the objects may be de-allocated, and as a result, it is not necessary to maintain a vast set of objects representative of the entire application definition file 28 within wireless communication device memory. Only objects 169 pertaining to the current wireless communication device application state are instantiated at any given time, and wireless communication device resources are thereby conserved.
  • a corresponding readFromPersistentStorage( ) method permits a newly instantiated object to assume the state of a previously de-allocated object from values saved to persistent storage by the writeToPersistentStorage( ) method, e.g., when a screen is loaded due to user navigation to that screen. By initially storing the entire set of objects to persistent storage in this fashion, the need to maintain a DOM tree is avoided.
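  • A short sketch of what this pair of persistence methods could look like for a simple display element; the class name and binary field layout are assumptions.

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

// Sketch (assumed names): an object writes its data members in compact binary form during the
// initial DOM traversal, and a newly instantiated object later re-assumes that state when its
// screen is loaded, so no DOM tree or full object set need be kept in device memory.
public class PersistableButton {
    String name;
    int x, y;

    /** Called during initial traversal for objects that are not immediately needed. */
    void writeToPersistentStorage(DataOutputStream out) throws IOException {
        out.writeUTF(name);
        out.writeInt(x);
        out.writeInt(y);
    }

    /** Called when the containing screen is later loaded, e.g. by user navigation. */
    void readFromPersistentStorage(DataInputStream in) throws IOException {
        name = in.readUTF();
        x = in.readInt();
        y = in.readInt();
    }
}
```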
  • the screen generation engine 67 of the virtual machine software 24 at the device causes the virtual machine software to locate the definition of an initial screen for that application.
  • Operation for loading a first or subsequent screen is illustrated in FIG. 8 .
  • generation engine 67 may employ a loadScreen(X) routine, which may be one of the general purpose routines 59 within virtual machine software 24 ( FIG. 2B ).
  • This routine may accept as a parameter a unique screen identifier X. Based on that identifier, the routine may find the appropriate representation of the named screen within local storage 26 (as previously stored by the writeToPersistentStorage( ) method of that screen when the DOM tree was initially traversed), instantiate a new screen object 169 (S 802 ), and cause the new screen object to populate itself with the stored data, e.g. by invoking the screen object's readFromPersistentStorage( ) method.
  • the readFromPersistentStorage( ) method of the screen object may in turn instantiate subordinate objects (such as buttons, edit boxes, menus, list boxes, choice items, and checkboxes, as detailed in Appendix “A”) and cause the readFromPersistentStorage( ) methods of these subordinate objects to be called, in an iterative fashion.
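  • A sketch of a loadScreen(X)-style routine under the same assumptions: the stored binary representation of the named screen is located in local storage, a fresh screen object is instantiated (S 802 ), and that object repopulates itself and, iteratively, its subordinate display-element objects.

```java
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

// Sketch (assumed names and storage layout) of a loadScreen(X)-style general purpose routine.
public final class ScreenLoader {

    public static ScreenObject loadScreen(String screenId) throws IOException {
        ScreenObject screen = new ScreenObject();            // S 802: new screen object
        try (DataInputStream in = new DataInputStream(
                new FileInputStream(screenId + ".bin"))) {   // written at initial parse time
            screen.readFromPersistentStorage(in);            // populate self and children
        }
        return screen;
    }

    /** Minimal stand-in for a screen object class; real classes hold child display elements. */
    static class ScreenObject {
        void readFromPersistentStorage(DataInputStream in) throws IOException {
            // read own attributes, then iteratively instantiate and populate subordinate
            // objects (buttons, edit boxes, menus, ...) by calling their own read methods
        }
    }
}
```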
  • a hierarchy of instances 169 of object classes 69 is created within the virtual machine software 24 , as illustrated in FIG. 2B .
  • Each definition of a visual UI construct causes virtual machine software 24 to use the operating system of the wireless communication device to create a corresponding display element of a graphical user interface, as more particularly illustrated in FIG. 8 .
  • the associated XML definition is read in (S 806 , S 816 , S 826 , S 836 and S 846 ) and a corresponding instance of a screen object defined as part of the virtual machine software 24 is created by the virtual machine software 24 (at S 808 , S 818 , S 828 , S 838 and S 848 ), in accordance with S 902 and onward illustrated in FIG. 9 .
  • Each interface object instance is created at S 902 .
  • Each instance takes as attributes the values defined by the XML text associated with the element.
  • a method of the virtual machine object is further called (S 904 ), and causes a corresponding device operating system object to be created (S 906 ).
  • Attributes originally defined in the XML text file, as stored within the virtual machine object instance, are applied to the corresponding instance of a display object created using the device operating system (S 908 -S 914 ). This is repeated for all attributes of the virtual machine object instance.
  • the event handler 65 of virtual machine software 24 is registered to process operating system events. This may for example entail registering, for each display element on the screen (e.g. each button or edit box), a callback that the operating system invokes when an event pertaining to that element is detected.
  • the onEvent( ) methods for all of the objects which are associated with the currently displayed screen may constitute the event handler 65 for that screen. That is, there may be no distinct instance of an event handler 65 per se other than the onEvent( ) methods of currently displayed or instantiated UI construct objects.
  • for each event (as identified by an <EVENT> tag in the application definition file 28 ) and action (as identified by an <ACTION> tag), virtual machine software 24 creates an instance of a corresponding event and action object forming part of virtual machine software 24 .
  • Virtual machine software 24 maintains a list identifying each instance of each event and action object, which may take the form of arrays of event objects and subordinate arrays of action objects as described above (S 916 to S 928 ).
  • Operation at S 902 -S 930 is repeated for each element of the screen at S 808 , S 818 , S 828 , S 838 and S 848 as illustrated in FIG. 8 . All elements originally defined between the <SCREEN> definition tags of the application definition file are so processed. After the entire screen has been so created in memory, it is displayed (S 854 ), using conventional techniques.
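  • As an illustration of S 902 -S 914 (the types and method names below are assumptions standing in for the device operating system's GUI API, not actual device calls), a virtual machine display object asks the operating system to create the corresponding native widget, applies the attribute values originally defined in the XML text, and registers for event callbacks.

```java
// Sketch only: mapping a virtual-machine display object onto an OS-level widget.
public class VmButton {
    String caption;
    int x, y;

    /** Stand-in for the device GUI API exposed by operating system 20. */
    interface OsToolkit {
        OsWidget createButton();                    // S 906: create the OS-level object
    }

    /** Stand-in for the OS-level widget created on behalf of this object. */
    interface OsWidget {
        void setLabel(String label);
        void setPosition(int x, int y);
        void onUserEvent(Runnable callback);
    }

    void realize(OsToolkit os) {
        OsWidget widget = os.createButton();
        widget.setLabel(caption);                   // S 908 -S 914: apply stored attributes
        widget.setPosition(x, y);
        widget.onUserEvent(this::onEvent);          // register so events reach the handler
    }

    void onEvent() { /* decide whether the event is significant and, if so, run its actions */ }
}
```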
  • Events may be handled by virtual machine software 24 as illustrated in FIG. 10 . Operation at S 1002 and onward is performed in response to the operating system detecting an event.
  • upon the occurrence of an event, such as the receipt of data from a wireless network 36 or 38 or user interaction with user interface controls at the wireless communication device, the operating system automatically invokes the onEvent( ) method for each object 169 in respect of which a callback was earlier registered (S 1002 ).
  • the onEvent( ) method for these objects determines whether the event is significant for the object (S 1004 ) and, if so (S 1006 ), passes control to each of the action(s) in the array of action objects within the relevant event object, in succession (S 1008 -S 1016 ), for purposes of effecting the desired behaviour.
  • Control may for example be passed by invoking a doAction( ) method of each instance of an action object within the action object array that comprises the data members of the relevant event object, in order to effect desired processing as defined by the XML <ACTION> tag.
  • the result of executing this method may be loading a named screen, closing the current screen, sending a message, storing data locally at the device, or other actions, as described below.
  • within the doAction( ) method of each action object, hard-coded instructions exist that are capable of causing various types of actions to be performed.
  • the attribute values within the action object's data members dictate which of these hard-coded instructions are executed and thereby effectively serve as parameters to the action which determine the resultant operation of the wireless communication device.
  • the doAction( ) method may invoke a general purpose routine. For example, if an action specifies that a screen should be closed, a “destroy screen X” routine 181 , which is one of general purpose routines 59 ( FIG. 2B ) in the present embodiment, may be invoked from the action object's doAction( ) method. This routine may traverse screen objects within the instantiated objects 169 until the screen with the specified name X is found, at which point that screen object may be instructed to destroy itself. If the action indicates that a message (package) should be sent, a “createXMLPackage( )” general purpose routine 187 ( FIG. 2B ) may similarly be invoked.
  • within routine 187 , methods within an XML builder object may assemble data into an XML package which is then passed to a message server object.
  • the message server object may use the device's network APIs to transmit the assembled XML package across the wireless network.
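  • Pulling the event-handling steps of FIG. 10 together, here is a schematic sketch (all identifiers and action kinds are assumptions) of how onEvent( ) significance checking and per-action doAction( ) dispatch could be arranged:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the FIG. 10 dispatch: the OS calls back into onEvent(); if the event is
// significant, each action object in the matching event's array runs in succession, and the
// action's stored attribute values select which built-in behaviour executes.
public class EventDispatch {

    /** Stand-in for an action object populated from an ACTION element. */
    static class ActionObject {
        String kind;        // assumed values; real behaviour is selected by ACTION attributes
        String argument;    // e.g. a screen name or message name

        void doAction() {                                           // S 1008 -S 1016
            switch (kind) {
                case "OPEN":  /* load the named screen via a general purpose routine */ break;
                case "CLOSE": /* destroy the named screen */                            break;
                case "SEND":  /* assemble and transmit an XML package */                break;
                case "SAVE":  /* store data locally at the device */                    break;
                default:      /* other hard-coded behaviours */                         break;
            }
        }
    }

    /** Stand-in for an event object: a significant event type plus its actions. */
    static class EventEntry {
        String type;                                 // e.g. "ONCLICK"
        List<ActionObject> actions = new ArrayList<>();
    }

    List<EventEntry> events = new ArrayList<>();     // the event array of a UI construct object

    /** Invoked via an operating-system callback whenever any event touches this construct. */
    void onEvent(String osEventType) {
        for (EventEntry event : events) {
            if (event.type.equals(osEventType)) {    // S 1004 -S 1006: is the event significant?
                for (ActionObject action : event.actions) {
                    action.doAction();
                }
            }
        }
    }
}
```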
  • for data events (e.g. receipt of data from the wireless network), the relevant event objects will not be directly contained within a UI construct object within objects 169 (as shown in FIG. 2B ). Rather, the relevant event objects will be defined at the application level or at the screen level, depending upon whether the data event is defined as an application-level event (i.e. significant regardless of the currently displayed screen) or a screen-level event (i.e. only significant when a particular screen is displayed), respectively.
  • the event objects whose onEvent( ) methods are invoked will be at the same level as, or will be directly contained by, the screen object corresponding to the currently displayed screen ( FIG. 2B ).
  • a user could send a login request 80 by interacting with an initial login screen, defined in the application definition file for the application. This would be passed by the transaction server 44 to the backend application server 70 .
  • the backend application server, according to the logic embedded within its application, would return a login response 82 , which the transaction server 44 would pass to the virtual machine software 24 .
  • Other applications, running on the same or other application servers might involve different interactions, the nature of such interactions being based upon the functionality and logic embedded within the application server 70 .
  • FIG. 11 illustrates the tool 116 within the wireless communication device operating environment of FIG. 3 .
  • Wireless communication device 30 of FIG. 3 is omitted from FIG. 11 for clarity.
  • the RAD tool 116 of FIG. 11 is a computing device 118 , such as an Intel®-Processor based personal computer (PC) for example, executing rapid application development (RAD) software, which may be loaded from a machine-readable medium such as an optical disk 120 .
  • the tool 116 allows the developer to create a master application definition file 58 from which device-specific application definition files 28 may be generated.
  • Completed master definition files 58 are uploaded from computing device 118 to transaction server 44 , via network 119 , which may be an Ethernet local area network for example, for downloading to wireless communication devices 10 , 32 and 34 .
  • the device-specific application definition file 28 when downloaded, interpreted and executed at a wireless communication device 10 , 32 or 34 , permits the wireless communication device to emulate and intercommunicate with an application that is actually executing on an application server 70 ( FIG. 6 ), as described above.
  • FIG. 12 illustrates RAD tool 116 in greater detail.
  • the tool 116 is a PC 118 executing RAD software 130 .
  • the PC 118 includes a processor 134 in communication with memory 132 which stores the software 130 .
  • the PC 118 further includes a conventional display 138 , such as a Cathode Ray Tube (CRT) monitor or flat-screen display for example, and a conventional user input mechanism (UIM) 145 , such as a keyboard and/or mouse for example.
  • the PC 118 also includes a network interface card 142 (e.g. an Ethernet interface) which facilitates communication by the tool 116 over network 119 ( FIG. 11 ), e.g. for purposes of uploading a completed master definition file 58 from secondary storage 146 to the transaction server 44 .
  • RAD software 130 , when executed by PC 118 , provides an intuitive graphical user interface which facilitates “drag and drop” application development, so that even developers who lack depth of expertise in software development may “develop a mobile application” (i.e. may generate a master definition file 58 ).
  • the procedure for developing a mobile application essentially consists of creating a visual hierarchy or “tree” of icons which correlates to a logical hierarchy of XML markup tags (e.g. as defined in Appendix “A”).
  • the created visual hierarchy may be similar to a graphical directory and file structure representation in a conventional graphical operating system.
  • Each icon represents a building block of the application (e.g. a GUI screen, a database table for storing program data, or an action to be executed upon the occurrence of a defined event) and corresponds to a defined ARML tag (i.e. an instance of an XML element with attributes).
  • the tool 116 automatically generates a dynamically-accessible representation of the corresponding hierarchy of XML elements and attributes within memory 132 , in the form of a master definition DOM tree 150 data structure.
  • a DOM tree is essentially a dynamically accessible representation of an XML document that is well understood in the art.
  • a technique is employed to efficiently represent sets of actions that may be triggered in more than one scenario.
  • the RAD software 130 which may be referred to by the proprietary name “AIRIX Design Studio” or simply “Design Studio”, may be implemented as a set of plug-ins to a generic integrated design environment (IDE) framework such as the Eclipse framework.
  • the Eclipse platform is designed for building integrated development environments that can be used to create various applications such as web sites, embedded Java™ programs, C++ programs, and Enterprise JavaBeans™ for example.
  • the platform exposes to tool providers mechanisms to use and rules to follow via well-defined APIs, classes and methods.
  • the RAD software 130 may be written in Delphi, using an SQL Server database for example.
  • FIG. 13 illustrates an exemplary GUI 1300 of the RAD tool 116 when the RAD software 130 is executed.
  • the GUI 1300 includes various components, including a toolbar 1302 , a project explorer 1304 , and a main design area 1306 .
  • Toolbar 1302 includes a menu list and icons for performing various development activities during mobile application development. Activities which may be performed include opening a new project, compiling a current mobile application, and publishing a current mobile application.
  • the term “project” refers to the mobile application under development. Compiling refers to the checking of various aspects of an application for errors or deviations from good programming practices. Compilation may cause hints, warnings or errors to be displayed, e.g.:
  • Publishing refers to the creation of a master definition file 58 by serializing the master definition DOM tree 150 . Publishing may cause hints, warnings or errors to be displayed, as during compilation.
  • the project explorer 1304 contains the visual hierarchy of icons 1305 that is created by the developer to represent the mobile application.
  • the visual hierarchy 1305 defines a simple inventory application for a Pocket PC mobile device.
  • A more detailed view of the project explorer 1304 is provided in FIG. 14 , which is described below.
  • Main design area 1306 of FIG. 13 displays the currently-selected application component of project explorer 1304 .
  • when an icon is selected in the visual hierarchy 1305 , a graphical representation of the relevant component (e.g. a screen or database table) and its properties appears in main design area 1306 .
  • icon 1308 representing a Pocket PC GUI screen has been selected, as indicated by reverse video.
  • an “interface designer” GUI that is specific to the relevant platform appears in design area 1306 .
  • the Pocket PC interface designer GUI includes a number of GUI areas, namely, a screen designer 1310 , an interface component drop down list 1312 and a properties tab 1314 .
  • the screen designer 1310 is a “screen painter” window which displays a graphical representation of the relevant wireless communication device type (a “virtual device”) for which screens are to be created.
  • the designer 1310 permits a developer to design screens by dragging and dropping display elements (such as buttons, edit boxes, or other widgets) to the virtual device screen in the window, offering a “what you see is what you get” (WYSIWYG) view of the interface screen under development.
  • the interface component drop down list 1312 facilitates definition and selection of individual GUI display elements which make up the screen as displayed within the interface designer window 1310 .
  • the properties tab 1314 displays the properties of the interface component that is currently selected in the drop down list 1312 . Properties that are generally applicable to the overall screen may also be displayed. Displayed properties may be updated as desired.
  • FIG. 14 illustrates project explorer 1304 in greater detail.
  • a different visual hierarchy 1400 , “Project—Example”, than the one shown in FIG. 13 is illustrated.
  • the illustrated project exemplifies a mobile application for a Pocket PC wireless computing device.
  • the mobile application utilizes global functions to efficiently declare a set of actions that is executed in multiple scenarios.
  • the visual hierarchy 1400 includes platform-independent components 1402 and platform-specific components 1404 .
  • Platform-independent components 1402 are application components (i.e. application building blocks such as GUI screens, definitions of significant events, and actions to be performed upon the occurrence of these events) which are not specific to a particular type of wireless communication device and may therefore alternatively be considered “platform-independent” (or “device-independent”).
  • Platform-specific components 1404 are application components that may vary from wireless communication device type to wireless communication device type. For example, the GUI screens of a mobile application may differ in some measure between wireless communication device types, due to differences in the capabilities of the devices (e.g. screen size and supported display elements).
  • application components generally correspond to XML elements within the master definition file 58 that will be generated by the RAD tool 116 .
  • The platform-independent components 1402 (which comprise the “Device Independent” branch of the visual hierarchy) include application events 1406, data rules 1408, database tables 1410 and global functions 1412.
  • Application events 1406 define the events which trigger processing within the mobile application regardless of the application's status (e.g. regardless of which GUI screen is presently displayed) as well as the actions that are to be performed upon the events' occurrence.
  • For example, the receipt of an XML package at the wireless communication device can be defined as an application level event which results in the display of a message box.
  • Application level events are to be distinguished from screen level events (arrival of an XML package when a specific screen is displayed) and control level events (user manipulation of a GUI control such as a button press), which are defined separately from application level events 1406 .
  • An exemplary definition of a control-level event is described below.
  • Data rules 1408 dictate how XML packages received from enterprise applications affect data stored in database tables associated with an application.
  • For example, a rule may define which field(s) of a table will be impacted by incoming data and the nature of the impact. Because rules make reference to database tables, logically they are defined after the tables (described below) have been defined. Rules also dictate how to apply changes to database tables from XML created in an outgoing XML transaction. Like application-level events 1406, data rules 1408 are device independent.
  • Database tables 1410 are defined by a developer for purposes of storing data at run time for use by the mobile application executing at the wireless communication device.
  • Global functions 1412 contain definitions of global functions for the mobile application, which are a focus of the present description.
  • a global function is a named aggregation or set of actions which can be referenced from other areas of the visual hierarchy 1400 where the actions might otherwise be declared (i.e., from any event declaration in hierarchy 1400 ).
  • the referencing event declaration defines the circumstances in which the actions comprising the referenced global function will be triggered.
  • a global function may be warranted when the same set of N actions (N being an integer greater than one) should be executed in more than one scenario of a mobile application.
  • Each global function defines actions to be executed and a sequence for execution.
  • the global functions section 1412 of FIG. 14 contains two global function definitions 1414 and 1422 .
  • The first global function definition 1414, “Function1”, contains definitions for three actions A, B and C.
  • the first action 1416 “Action A” causes an XML package representing a login message with predetermined username and password field values to be sent from the wireless communication device 10 to the transaction server 70 .
  • The second action 1418, “Action B”, and the third action 1420, “Action C”, each cause the wireless communication device 10 to activate the configured notification for the device (e.g. to play a “beep”).
  • the order of the actions A, B and C determines their sequence of execution.
  • Global function 1414 thus constitutes a “unit” of code which causes the wireless communication device 10 to send an XML package login message and to activate the configured notification twice, in that sequence.
  • the definition of this global function is motivated by the fact that the mobile application requires that set of actions A, B and C to be executed in that sequence in more than one scenario.
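  • The effect can be pictured with a small, purely illustrative Python sketch (the names and data structures below are hypothetical analogies, not the XML form actually used, which is described later): the named set of actions is declared once and expanded wherever it is referenced.
        # Illustrative analogy only; names are hypothetical.
        # A global function is a named, ordered set of actions declared once.
        global_functions = {
            "Function1": ["ActionA (send login XML package)",
                          "ActionB (activate notification)",
                          "ActionC (activate notification)"],
        }

        # Event declarations reference the function by its unique name
        # rather than repeating the three actions.
        events = {
            "scenario 1": {"call": "Function1", "then": ["scenario-specific action"]},
            "scenario 2": {"call": "Function1", "then": ["another scenario-specific action"]},
        }

        def expand(event):
            """Return the full, ordered list of actions triggered by an event."""
            return global_functions[event["call"]] + event["then"]

        for name, event in events.items():
            print(name, "->", expand(event))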
  • the procedure for defining the global function 1414 in the project explorer window 1304 may be as follows.
  • the icon corresponding to the global functions section 1412 may initially be selected with a mouse (or similar user input mechanism 145 ) of the RAD tool 116 ( FIG. 12 ).
  • a right-click (or similar user action) may cause a pop up menu to be displayed.
  • The pop-up menu may present a list of permissible application components that may be defined within the global functions section 1412.
  • An Add Function option may permit the user to define a new global function. Selection of that menu item may cause a new global function icon 1414 to be created below icon 1412 , as shown in FIG. 14 , and a Function Properties window to be displayed in the main design area 1306 ( FIG. 13 ).
  • the Function Properties window may permit the user to enter properties of the newly-defined global function.
  • the Function Properties window may include a Function Name field for defining a unique function name that is not already in use by any other global function.
  • the name uniqueness constraint ensures that each global function may be uniquely identified from other areas of the visual hierarchy 1400 . In the present example, it is assumed that the name “Function1” is entered, thus that name is displayed as part of the icon at 1414 .
  • the newly-defined icon corresponding to function 1414 may be selected and the mouse again right-clicked to cause another, different pop-up menu to be displayed.
  • This pop-up menu may contain a list of permissible application components that may be defined within a function.
  • An Add Action option may permit the user to declare a new action. (Actions can be used to navigate to different portions of the application or to handle application data.)
  • the RAD tool 116 may for example support the different action types described in Appendix “A”. Each action type instructs the application to perform a different operation, thus the properties associated with each action type vary.
  • Selection of the Add Action menu item may cause a new action icon at 1416 to be created below the icon at 1414 and an Action Properties window to be displayed in the main design area 1306 for entering properties associated with the newly-defined action.
  • the Action Properties window may include an Action Name field for entering a name (e.g. “ActionA”) and an Action Type drop down list for selecting the type of action.
  • the drop down list may contain a predetermined list of possible action types (e.g. as described in Appendix “A”). When one of the action types is selected, the Action Properties window may be updated for entering further properties associated with the selected action type.
  • the action type for the first action A is set to “XML Transaction”, and the remaining properties are also set, so as to define an action which causes an XML package comprising a login request with a fixed username field of “SMITHJ” and a fixed password of “ABC321” to be sent to the transaction server 70 .
  • Definition of this action may involve typing the XML text comprising the message into an XML Text field.
  • a similar procedure may be followed to create the second global function 1422 .
  • The set of actions defined under global function 1422 (which are not visible in the project explorer window 1304 due to the fact that the portion of the visual hierarchy 1402 below the function definition 1422 has not been expanded), may be partly or wholly different from those defined in the first global function 1414.
  • GUI screens are defined for only one platform, namely the Pocket PC platform, which is represented by “Pocket PC” branch 1426 .
  • GUI screen definitions 1428 and 1444 are illustrated in Pocket PC branch 1426 (other Pocket PC GUI screen definitions being omitted from FIG. 14 for brevity).
  • Screen definition 1428 defines the GUI screen 1500 of FIG. 15 and screen definition 1444 defines a similar GUI screen 1600 of FIG. 16 .
  • GUI screen 1500 has a title 1502 (“Login”), a text item 1504 (“Press the button to log in”) and an “OK” button 1506 .
  • the desired behavior for the screen is for four actions to occur upon selection of the “OK” button 1506 .
  • First, an XML package comprising a login request with a fixed username field of “SMITHJ” and a fixed password of “ABC321” should be sent to the transaction server 70 .
  • Second, a notification sound should be played.
  • Third, the notification sound should be repeated.
  • Fourth, a message box 1508 with a title “Informational Message” and a message “Message sent.”, as shown in FIG. 15, should be displayed.
  • FIG. 16 illustrates the other GUI screen 1600 .
  • Screen 1600 is intended for display instead of GUI screen 1500 only in the case where logic within the mobile application has determined that the wireless computing device user prefers display of GUI screens in the French language.
  • the GUI screen 1600 is analogous in non-textual appearance and operation to GUI screen 1500 of FIG. 15 .
  • screen 1600 also has a title 1602 , a text item 1604 , an “OK” button 1606 and a message box 1608 .
  • the design of screen 1600 is such that, upon selection of the “OK” button 1606 , the same four actions as described above occur.
  • the textual aspects of the display elements such as title 1602 , text item 1604 , and the textual components of the message box 1608 , are in the French language.
  • Screen 1500 of FIG. 15 is defined in GUI screen definition 1428 of FIG. 14 .
  • the procedure for creating the screen definition 1428 in the project explorer window 1304 may involve steps that are similar to above-described steps for defining global functions 1412 . For example, right-clicking of the Pocket PC branch 1426 may pop up a menu having a “New Screen” option. Selection of that option may cause the icon 1428 to be created.
  • A unique screen name, “LoginScr”, typed into a screen name field of a Screen Properties window (which may be displayed in the main design area 1306), becomes part of the icon 1428 representing the screen.
  • the title 1502 for the screen (“Login”) may be typed in a title field of the Screen Properties window.
  • the first icon 1430 represents the text item 1504 of FIG. 15 .
  • the second icon 1432 represents the “OK” button 1506 of FIG. 15 .
  • Each of these application components may be created by right-clicking the icon 1428 , choosing the appropriate new display element to be added (from a pop-up menu or a toolbar for example) and defining the new display element's properties in an associated properties window.
  • a ButtonClick event 1434 is declared below the button icon 1432 .
  • This event represents the selection of the “OK” button 1506 of FIG. 15 by a wireless communication device user.
  • Right-clicking of the event 1434 in project explorer 1304 causes another pop-up menu to be displayed.
  • the options that are presented on the displayed pop-up menu include an Add Function Call option (assuming that at least one global function has been declared) and an Add Action option. These two options represent the two ways in which wireless communication device behavior responsive to the occurrence of an event may be specified.
  • the first option, Add Function Call permits a reference to a previously-defined global function to be created. This option may be selected to create function call icon 1436 .
  • The definition of properties in an associated Function Call Properties window displayed in main design area 1306 is illustrated in inset 1438.
  • a name field 1440 contains a user-specified name (“CallFn1”) that will be displayed as part of the icon 1436 .
  • A “Function To Call” field 1442 provides a drop-down list which lists, by (unique) name, each global function defined in the global functions section 1412. In FIG. 14, the list 1442 is illustrated in a dropped-down state, with two entries, namely Function1 and Function2, corresponding to the two previously-defined global functions 1414 and 1422, being visible.
  • The heavy border around the first entry, Function1, indicates that this entry has been selected by the user of RAD tool 116, such that the function call 1436 defines a call to Function1.
  • Function call 1436 is thus operationally equivalent to the creation of actions 1416, 1418 and 1420 directly below the ButtonClick event 1434.
  • a further action 1438 which causes English language message box 1508 ( FIG. 15 ) to be displayed may be defined by right-clicking the ButtonClick event 1434 of hierarchy 1400 , selecting pop-up menu option “Add Action”, and typing the desired message box title and message content in the Action Properties window.
  • the procedure for creating screen definition 1444 representing the other screen 1600 ( FIG. 16 ) is virtually identical to the above-described steps for creating screen definition 1428 , except that the textual content for screen 1600 is typed in French.
  • the resultant application components namely text item 1446 , button 1448 , ButtonClick event 1450 , function call 1452 , and action 1454 are accordingly the same as text item 1430 , button 1432 , ButtonClick event 1434 , function call 1436 , and action 1438 , respectively.
  • the same global function (“Function1”) that is referenced by function call 1436 for the English language screen 1500 is also referenced by function call 1452 for the French language screen 1600 .
  • Because all of the actions 1416, 1418 and 1420 comprising the referenced function “Function1” are language-neutral (i.e. none of them causes any textual content to be displayed), they may be defined only once in the form of a global function, and the global function may then be referenced from each screen.
  • In contrast, language-specific actions, which have different textual content on screens 1500 and 1600, are defined once for the English language screen 1500 and once for the French language screen 1600.
  • Actions which are specific to a platform, screen or event will be defined within the context of that platform, screen or event, as there is little motivation for defining such actions within a global function that would be referenced only once.
  • Global functions are not necessarily always referenced from the context of a control-level event. Global functions may also be referenced from screen-level or application-level events.
  • Referring to FIGS. 17A-17B, the master definition DOM tree 150 of FIG. 12 is illustrated in greater detail.
  • DOM tree 150 is represented textually as XML in FIGS. 17A-17B for ease of reference. It will be appreciated that the DOM tree 150 in memory 132 of RAD tool 116 ( FIG. 12 ) is actually a dynamically-accessible representation.
  • the DOM tree 150 of FIGS. 17A-17B corresponds to the mobile application design illustrated in the project explorer 1304 of FIG. 14 . That is, the DOM tree 150 is automatically generated in memory 132 by RAD tool 116 as a result of the developer's creation of the “Project-Example” hierarchy 1400 of FIG. 14 .
  • Two relevant portions of the master definition DOM tree 150 are illustrated in FIGS. 17A-17B.
  • the first portion 1700 (lines 2 - 19 of FIG. 17A ) corresponds to the global functions section 1412 of FIG. 14 .
  • The second portion 1702 (lines 21-61 of FIGS. 17A-17B) corresponds to the platform-specific components section 1404 of FIG. 14.
  • Other portions of the master definition DOM tree 150 are omitted for brevity.
  • The terms “XML element”, “XML element instance” and “instance of an XML element” are understood to be synonymous herein. Each of these is a form of markup language element, or an “instance of” a markup language element.
  • Within the first portion 1700, an outermost XML element contains two hierarchies of XML elements (i.e. two markup language hierarchies or sub-trees within DOM tree 150).
  • The first hierarchy, which appears at lines 3-13 of FIG. 17A, corresponds to the first global function 1414.
  • The second hierarchy, which appears at lines 14-18 of FIG. 17A, corresponds to the second global function 1422.
  • Each hierarchy has a parent XML element, FUNCTION, and contains a number of instances of the ACTION element which, as previously noted, defines an action to be performed by the wireless computing device 10 .
  • the XML elements FUNCTIONS and FUNCTION are extensions of the XML elements and attributes identified in Appendix “A”.
  • the ACTION element at lines 4 - 10 corresponds to action 1416 of FIG. 14 .
  • this ACTION element causes a login message (whose “body” is defined at lines 6 - 8 ) to be sent from the device 10 to the transaction server 70 .
  • the ACTION elements at lines 11 and 12 correspond to actions 1418 and 1420 of FIG. 14 , and each result in the playing of a notification sound at the wireless communication device 10 .
  • The second markup language element hierarchy also contains multiple ACTION elements, at lines 15-17, whose details are omitted for brevity.
  • The second portion 1702 of DOM tree 150 defines the GUI screens for the Pocket PC mobile application, which include screens 1500 and 1600 of FIGS. 15 and 16 respectively.
  • lines 24 - 40 of FIGS. 17A-17B define English language screen 1500 while lines 41 - 57 of FIG. 17B define French language screen 1600 .
  • XML elements and attributes at lines 31 - 39 define the “OK” button 1506 ( FIG. 15 ).
  • the XML elements include an EVENT element at lines 33 - 37 which defines the ButtonClick event 1434 ( FIG. 14 ).
  • the EVENT element in turn contains a FNCALL element at line 33 referencing a global function. More specifically, the FNCALL element has a CALLEDFN attribute whose value, “Function1”, uniquely identifies the hierarchy of XML elements at lines 3 - 13 of FIG. 17A as the global function whose actions are to be executed upon occurrence of the ButtonClick event.
  • the ACTION element at lines 35 - 36 of FIG. 17B corresponds to the message box action 1438 of FIG. 14 , which is declared immediately within the containing EVENT element.
  • the XML elements and attributes which define screen 1600 are similar to those defining screen 1500 , except that the textual components are in the French language. It is noted that the CALLEDFN attribute of the FNCALL markup language element at line 50 (which corresponds to the function call 1452 of FIG. 14 ) is the same as the CALLEDFN attribute of the FNCALL markup language element at line 33 , since both reference the same global function 1414 .
  • FIGS. 18A-18B illustrate the master definition file 58 (an XML document) that is generated through serialization of the master definition DOM tree 150 of FIGS. 17A-17B. Only the portion of the master definition file 58 which defines the two GUI screens 1500 and 1600 for the Pocket PC platform is illustrated in FIGS. 18A-18B. Screen 1500 is defined at lines 3-30 of FIG. 18A, while screen 1600 is defined at lines 31-55 of FIGS. 18A-18B.
  • the definition of screen 1500 at lines 3 - 30 represents a “merging” of the English language screen definition at lines 24 - 40 of FIGS. 17A-17B of the master definition DOM tree 150 and the definition of the first global function at lines 3 - 13 of FIG. 17A of the master definition DOM tree 150 .
  • the screen definition at lines 3 - 30 of FIG. 18A is a reproduction of the screen definition at lines 24 - 40 of FIGS. 17A-17B , with the exception that the FNCALL element at line 33 of FIG. 17A has been replaced with all of the ACTION elements which make up the referenced global function (i.e., with the three actions defined at lines 4 - 12 of FIG. 17A ).
  • the resulting markup language at lines 14 - 23 of FIG. 18A is the same as if those three actions had originally been defined within the ButtonClick EVENT element, like the action 1438 (see lines 35 - 36 of FIG. 17B ).
  • the definition of screen 1600 at lines 31 - 55 of FIGS. 18A-18B represents a “merging” of the French language screen definition at lines 41 - 57 of the master definition DOM tree 150 ( FIG. 17B ) and the definition of the first global function at lines 3 - 13 of the master definition DOM tree 150 ( FIG. 17A ).
  • The FNCALL element at line 50 of FIG. 17B has again been replaced with the three actions defined at lines 4-12 of FIG. 17A.
  • the resulting markup language at lines 40 - 48 is the same as if those three actions had originally been defined within the context of the ButtonClick EVENT element, like action 1454 (corresponding to lines 51 - 52 of FIG. 17B ).
  • the resultant master definition file 58 may be used by the baseline system for presenting server-side applications at varied wireless communication devices as described in section I above.
  • the master definition file 58 of FIGS. 18A-18B is generated by machine-readable code comprising the RAD software 130 which traverses and serializes the DOM tree 150 using the above-described technique for substituting global function references with the actions of the referenced function. This traversal and serialization are performed upon user selection of a “publish” command of RAD tool 116 .
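  • A minimal sketch of this substitution, under stated assumptions, follows (Python with xml.etree.ElementTree rather than the actual RAD software 130; the element names FUNCTIONS, FUNCTION, ACTION, SCREEN, BTN, EVENT and FNCALL and the CALLEDFN attribute come from the description above, while the root element, the NAME and TYPE attributes and all element content are hypothetical):
        # Sketch only: expand each FNCALL reference into the ACTION elements of
        # the referenced global function while producing the published output.
        import copy
        import xml.etree.ElementTree as ET

        MASTER_DOM = """
        <ARML>
          <FUNCTIONS>
            <FUNCTION NAME="Function1">
              <ACTION TYPE="XMLTRANSACTION">login package (username, password)</ACTION>
              <ACTION TYPE="NOTIFY"/>
              <ACTION TYPE="NOTIFY"/>
            </FUNCTION>
          </FUNCTIONS>
          <SCREEN NAME="LoginScr">
            <BTN NAME="OK">
              <EVENT TYPE="BUTTONCLICK">
                <FNCALL CALLEDFN="Function1"/>
                <ACTION TYPE="MSGBOX">Message sent.</ACTION>
              </EVENT>
            </BTN>
          </SCREEN>
        </ARML>
        """

        root = ET.fromstring(MASTER_DOM)

        # Index the global functions by their unique names.
        functions = {f.get("NAME"): f for f in root.iter("FUNCTION")}

        # Replace every FNCALL with copies of the referenced function's ACTIONs,
        # preserving the original position within the containing EVENT element.
        for event in root.iter("EVENT"):
            for fncall in list(event.findall("FNCALL")):
                index = list(event).index(fncall)
                event.remove(fncall)
                for action in functions[fncall.get("CALLEDFN")].findall("ACTION"):
                    event.insert(index, copy.deepcopy(action))
                    index += 1

        # Serializing the expanded tree yields one copy of the action set per reference.
        print(ET.tostring(root, encoding="unicode"))
  • Whether the published file also retains the FUNCTIONS section is not addressed by this sketch; the sketch only illustrates the expansion of references.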
  • the master definition file 58 could alternatively be generated by machine-executable parser code which parses a textual version of the master definition DOM tree 150 (as shown in FIGS. 17A-17B for example).
  • the approach for facilitating generation of a markup language document containing identical sets of markup language elements as described hereinabove is not necessarily limited to markup language documents pertaining to mobile applications and wireless computing device action.
  • the approach may be used to simplify the generation of virtually any markup language document containing identical sets of markup language elements.
  • the approach may be used for various representations of a markup language document, such as textual markup language document files, DOM tree representations, or otherwise.
  • The term “markup language document” or “XML document” as used herein is understood to include not only textual (e.g. ASCII) electronic files, but other document representations, such as DOM trees or Simple API for XML (SAX) representations for example.

Abstract

From a markup language document (e.g. an Extensible Markup Language (XML) document expressed as a Document Object Model (DOM) tree) having a markup language element hierarchy containing a set of markup language elements and a plurality of references to the hierarchy, another markup language document (e.g. a textual XML document) is generated which contains one instance of the set of markup language elements for each of the plurality of references. The generated markup language document may otherwise have the same content as the original markup language document. Generation of a markup language document containing identical sets of markup language elements may thereby be simplified.

Description

    COPYRIGHT NOTICE
  • A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by any one of the patent document or patent disclosure, as it appears in a Patent Office patent file or records, but otherwise reserves all copyright rights whatsoever.
  • FIELD OF THE INVENTION
  • The present invention relates to markup languages, and more particularly to an apparatus, method and machine-readable medium for facilitating generation of a markup language document containing identical sets of markup language elements.
  • BACKGROUND OF THE INVENTION
  • The use of markup languages such as Extensible Markup Language (XML) is prevalent in modern computing. This is likely due in part to the fact that markup language documents may be expressed in a simple textual form which can be processed by many different types of computing devices and operating system platforms. Markup language documents may thus facilitate cross-platform computing.
  • When a developer creates a markup language document, either using a text editor or through the use of an integrated development environment (IDE), it may be necessary to repeatedly generate the same set of markup language elements within the document. The repeated generation of the same set of markup language elements can be tedious and can result in a markup language document that is difficult to maintain.
  • An apparatus, method and machine-readable medium for facilitating generation of a markup language document containing identical sets of markup language elements would be desirable.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In the figures which illustrate example embodiments of this invention:
  • FIG. 1 schematically illustrates a wireless communication device including virtual machine software;
  • FIG. 2A illustrates the organization of exemplary virtual machine software at the wireless communication device of FIG. 1;
  • FIG. 2B further illustrates the organization of exemplary virtual machine software at the wireless communication device of FIG. 1;
  • FIG. 3 illustrates an operating environment for the wireless communication device of FIG. 1;
  • FIG. 4 illustrates the structure of example application definitions used by the device of FIG. 1;
  • FIG. 5 schematically illustrates the formation of application definition files at a transaction server of FIG. 3 from a master definition file;
  • FIG. 6 schematically illustrates the transaction server of FIG. 3 in greater detail;
  • FIG. 7 is a flow diagram illustrating the exchange of sample messages passed between a wireless communication device, transaction server and application server;
  • FIGS. 8-10 illustrate operation performed at a wireless communication device under control of virtual machine software of FIGS. 2A and 2B;
  • FIG. 11 schematically illustrates the wireless communication device operating environment of FIG. 3 with an exemplary Rapid Application Development (RAD) tool which may be used to develop master definition files in a manner exemplary of an embodiment of the present invention;
  • FIG. 12 schematically illustrates the RAD tool of FIG. 11 in greater detail;
  • FIG. 13 illustrates an exemplary graphical user interface (GUI) of the RAD tool of FIG. 12;
  • FIG. 14 illustrates a project explorer portion of the RAD tool GUI of FIG. 13 in which exemplary global functions are declared;
  • FIG. 15 illustrates a login screen for a Pocket PC wireless computing device defined in the project explorer of FIG. 14;
  • FIG. 16 illustrates a French language version of the login screen of FIG. 15 that is also defined in the project explorer of FIG. 14;
  • FIGS. 17A-17B textually illustrate a master definition Document Object Model (DOM) tree that is maintained in the memory of the RAD tool of FIG. 12 during mobile application design to represent the screens of FIGS. 15 and 16; and
  • FIGS. 18A-18B illustrate a master definition file which results from the serialization of the DOM tree of FIGS. 17A-17B.
  • DETAILED DESCRIPTION
  • The embodiment described herein pertains to an apparatus, method and machine-readable medium for facilitating generation of a markup language document containing identical sets of markup language elements. The embodiment is described, however, within the specific context of a system that presents server-side applications at varied wireless communication devices (also referred to as “mobile devices”).
  • The system for presenting server-side applications at varied wireless communication devices which serves as the context for the present description was originally described in U.S. Patent Publication No. US 2003/0060896 (which is hereby incorporated by reference hereinto). This system is referred to as the “baseline system” for convenience. An overview of the baseline system is initially provided below under the section heading “I. Baseline System Facilitating Execution of Server-Side Applications At Wireless communication devices” to provide a context for the description which follows. Thereafter, a description of a rapid application development (RAD) tool exemplary of an embodiment of the present invention which may be used to facilitate creation of a master definition file for use in the baseline system is provided under the section heading “II. Rapid Application Development Tool”.
  • In one aspect of the below-described embodiment, there is provided an apparatus comprising: at least one processor; and a memory coupled to the at least one processor storing: a markup language document containing: a markup language element hierarchy containing a set of markup language elements; and a plurality of references to the markup language element hierarchy; and machine-executable code which, when executed by the at least one processor, generates, from the markup language document, another markup language document containing one instance of the set of markup language elements for each of the plurality of references.
  • In another aspect of the below-described embodiment, there is provided a machine-readable medium comprising: machine-executable code for generating, from a markup language document containing: a markup language element hierarchy containing a set of markup language elements; and a plurality of references to the markup language element hierarchy, another markup language document containing one instance of the set of markup language elements for each of the plurality of references.
  • In yet another aspect of the below-described embodiment, there is provided a method comprising: generating, from a markup language document containing: a markup language element hierarchy containing a set of markup language elements; and a plurality of references to the markup language element hierarchy, another markup language document containing one instance of the set of markup language elements for each of the plurality of references.
  • I. System Facilitating Execution of Server-Side Applications at Wireless Communication Devices
  • In overview, a system which facilitates execution of server-side applications at wireless communication devices utilizes a text-based application definition file to govern the manner in which an application is presented at a wireless communication device. The application definition file contains a description of how an application is to be presented at a wireless communication device, the format of transactions over the wireless network, and a format of data related to the application to be stored at the wireless communication device. The application definition file of the present embodiment is written in Extensible Markup Language (XML). A virtual machine software component at the wireless communication device interprets the definition file and presents an interface to the application in accordance with the definition file.
  • FIG. 1 illustrates a wireless communication device 10, exemplary of an embodiment of the present invention. Wireless communication device 10 may be any conventional wireless communication device, modified to function in the manner described below. As such, wireless communication device 10 includes a processor 12, in communication with a network interface 14, storage memory 16, and a user interface 18 typically including a keypad and/or touch-screen. Network interface 14 enables device 10 to transmit and receive data over a wireless network 22. Wireless communication device 10 may be, for example, a Research in Motion (RIM) two-way paging device, a WinCE based device, a PalmOS device, a WAP-enabled mobile telephone, or the like. Memory 16 of device 10 stores mobile operating system software 20, such as the PalmOS or WinCE operating system. Operating system software 20 typically includes graphical user interface and network interface software having suitable application programmer interfaces (“API”s) for use by other applications executing at device 10.
  • Memory at device 10 further stores virtual machine software 24 which, when executed by wireless communication device 10, enables device 10 to present an interface for server-side applications provided by a transaction server, described below. Specifically, virtual machine software 24 interprets a textual application definition file (a markup language document) defining a definition of a user interface 18 controlling application functionality, and the display format (including display flow) at device 10 for a particular server-side application; the format of data to be exchanged over the wireless network for the application; and the format of data to be stored locally at device 10 for the application. Virtual machine software 24 uses operating system 20 and associated APIs to interact with device 10, in accordance with the received application definition file. In this way, device 10 may present interfaces for a variety of applications, stored at a server. From the perspective of operating system 20, virtual machine software 24 is viewed as another application resident at device 10. Moreover, multiple wireless devices each having a similar virtual machine software 24 may use a common server-side application in combination with an application definition file, to present a user interface and program flow specifically adapted for the device.
  • As such, and as will become apparent, the exemplary virtual machine software 24 is specifically adapted to work with the particular wireless communication device 10. Thus if device 10 is a RIM BlackBerry device, virtual machine software 24 is a RIM virtual machine. Similarly, if device 10 is a PalmOS or WinCE device, virtual machine software 24 would be a PalmOS or a WinCE virtual machine. As further illustrated in FIG. 1, virtual machine software 24 is capable of accessing local storage 26 at device 10.
  • In the present example, the application definition file is formed using the well-known markup language XML. Defined XML entities are understood by the virtual machine software 24. Defined XML entities are detailed in Appendix “A”, attached hereto. AIRIX™ Markup Language (ARML) is an XML markup language used in the present embodiment. The defined XML entities are interpreted by the virtual machine software 24, and may be used as building blocks to present server-side applications at wireless communication device 10, as detailed herein.
  • Specifically, as illustrated in FIG. 2A, virtual machine software 24 includes a conventional XML parser 61; an event handler 65; a screen generation engine 67; and object classes 69 corresponding to XML entities supported by the virtual machine software 24, and possibly contained within an application definition file 28. Supported XML entities are detailed in Appendix “A”. A person of ordinary skill will readily appreciate that those XML elements and attributes identified in Appendix “A” are exemplary only, and may be extended, or shortened as desired, as described in section II hereinafter for example.
  • XML parser 61 may be formed in accordance with the Document Object Model, or DOM, which is available at www.w3.org/DOM/ and is incorporated by reference hereinto. Parser 61 enables virtual machine software 24 to read an application definition file. Using the parser, the virtual machine software 24 may form a binary representation of the application definition file for storage at the wireless communication device, thereby eliminating the need to parse text each time an application is used. Parser 61 may convert each XML tag contained in the application definition file, and its associated data to tokens, for later processing. As will become apparent, this may avoid the need to repeatedly parse the text of an application definition file.
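  • For illustration, the idea of converting a parsed definition into tokens for later processing might look like the following Python sketch (the token layout and the sample attributes are assumptions; the virtual machine's actual binary representation is not specified here):
        # Sketch: turn each XML element of an application definition into a
        # (depth, tag, attributes) token so the text need not be re-parsed later.
        from xml.dom.minidom import parseString

        definition = parseString('<SCREEN NAME="LoginScr"><BTN NAME="OK"/></SCREEN>')

        def tokenize(node, depth=0, out=None):
            if out is None:
                out = []
            if node.nodeType == node.ELEMENT_NODE:
                out.append((depth, node.tagName, dict(node.attributes.items())))
            for child in node.childNodes:
                tokenize(child, depth + 1, out)
            return out

        for token in tokenize(definition.documentElement):
            print(token)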
  • Screen generation engine 67 displays initial and subsequent screens at the wireless communication device, in accordance with an application definition 28, as detailed below.
  • Event handler 65, of virtual machine software 24 allows device 10 under control of virtual machine software 24 to react to certain external events. Example events include user interaction with presented graphical user interface (GUI) screens or display elements, incoming messages received from a wireless network, or the like.
  • Object classes 69 also form part of virtual machine 24 and define objects that allow device 10 to process each of the supported XML entities at the wireless communication device. Each of object classes 69 includes attributes (e.g. fields or data members) used to store parameters defined by the XML file (XML element and/or attribute values), and allowing the XML entity to be processed at the wireless communication device, as detailed in Appendix “A”, for each supported XML entity. Virtual machine software 24 may be expanded to support XML entities not detailed in Appendix “A”.
  • As detailed below, upon invocation of a particular application at wireless communication device 10, the virtual machine software 24 presents an initial GUI screen based on the contents of the application definition 28 for the application. GUI screen display elements (e.g. menu items, text items, buttons, etc.) are created by screen generation engine 67 by creating instances of corresponding object classes for defined elements, as contained within object classes 69. The object instances are “customized” using XML element and attribute values contained in the application definition file 28. Thereafter the event handler 65 of the virtual machine software 24 reacts to events for the application. The manner in which the event handler reacts to events is governed by the contents of the application definition file. Events may trigger processing defined within instances of associated “action” objects, which objects are instantiated from object classes 69 of virtual machine software 24.
  • Similarly, object classes 69 of virtual machine software 24 further include object classes corresponding to data tables and network transactions defined in the Table Definition and Package Definition sections of Appendix “A”. At run time, instances of object classes corresponding to these classes are created and populated with parameters contained within application definition file, as required.
  • FIG. 2B illustrates in greater detail the manner in which the virtual machine software 24 of FIG. 2A may be organized. For purposes of FIG. 2B it is assumed that the wireless communication device 10 is currently executing a wireless communication device application (also referred to as a “mobile application”). As illustrated, the virtual machine software 24 has two categories of components, namely, objects 169 and general purpose routines 59.
  • Objects 169 are instantiations of object classes 69 (FIG. 2A) which are instantiated dynamically at run time when the application is executed at the wireless communication device 10. The types of objects 169 that are instantiated at any given moment (e.g. screens, buttons, events, actions, etc., as will be described) depend upon the mobile application currently being executed and its state, including which user interface screen is currently displayed at the wireless communication device. Each of objects 169 corresponds to an application component defined within the application definition file 28. The objects 169 are instantiated from binary representations 178 thereof which are maintained in secondary storage 26, which representations 178 are created when the application definition file 28 is initially parsed. Each object 169 contains methods which capture certain behaviours that are performed by all instances of the represented object, as well as data members which permit the characteristics or behavior of the object to be “customized” (e.g. each instance of a button object may include the same highlight( ) method which, if invoked, causes the button to become highlighted, and may further include X and Y coordinate data member values which define a unique location of the button on the encompassing UI screen). A more detailed description of the exemplary set of objects 169 of FIG. 2B is provided below in conjunction with the description of operation of wireless computing device 10.
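  • A toy Python sketch of that object model follows (class, method and attribute names are illustrative assumptions; the real objects are instantiated within the device's virtual machine software 24):
        # Sketch: shared behaviour lives in methods; per-instance characteristics
        # come from XML element and attribute values in the application definition.
        class Button:
            def __init__(self, caption, x, y):
                self.caption = caption      # customized from the definition file
                self.x, self.y = x, y       # unique location on the enclosing screen
                self.highlighted = False

            def highlight(self):
                # Behaviour common to every Button instance.
                self.highlighted = True

        # Instantiation driven by (assumed) attribute values parsed from the XML.
        ok_button = Button(caption="OK", x=10, y=120)
        ok_button.highlight()
        print(ok_button.caption, ok_button.x, ok_button.y, ok_button.highlighted)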
  • General purpose routines 59, on the other hand, constitute a managing environment for the objects 169. The routines 59 encompass functionality which is useful for executing a mobile application at the wireless communication device but is not necessarily tied to a particular type of object 169. For example, the routines 59 may include the XML parser 61, which initially parses the application definition file 28. Other routines may facilitate loading or closing of UI screens, or the sending of messages over the wireless network 22, as will be described. The routines 59 effectively consolidate certain functionality for convenient invocation from any of objects 169, as required.
  • Using this general description and the description which follows, persons of ordinary skill in the art will be able to form virtual machine software 24 for any particular device. Typically, virtual machine software 24 may be formed using conventional object-oriented programming techniques, and existing device libraries and APIs, as to function as detailed herein. As will be appreciated, the particular format of screen generation engine 67, object classes 69 will vary depending on the type of virtual machine software, its operating system and API available at the device. Once formed, a machine executable version of virtual machine software 24 may be loaded and stored at a wireless communication device, using conventional techniques. It can be embedded in ROM, loaded into RAM over a network, or from a computer readable medium.
  • Although, in the described embodiment the virtual machine software 24 and software forming object classes 69 are formed using object-oriented structures, persons of ordinary skill will readily appreciate that other approaches could be used to form suitable virtual machine software. For example, object classes 69 forming part of the virtual machine could be replaced by equivalent functions, data structures or subroutines formed using a conventional (i.e. non-object-oriented) programming environment. Operation of virtual machine software 24 under control of an application definition file containing various XML definitions exemplified in Appendix “A” is further detailed below.
  • FIG. 3 illustrates the operating environment for a wireless communication device 10. Further example wireless communication devices 30, 32 and 34 are also illustrated in FIG. 3. These wireless communication devices 30, 32 and 34 are similar to device 10 and also store and execute virtual machine software.
  • Virtual machine software like that stored at device 10 executes on each wireless communication device 10, 30, 32, 34, and communicates with a transaction server 44 (referred to as a “middleware server 44” in U.S. Patent Publication No. US 2003/0060896, referenced above) by way of example wireless networks 36 and 38 and network gateways 40 and 42. Example gateways 40 and 42 are generally available as a service for those people wishing to have data access to wireless networks. Wireless networks 36 and 38 are further connected to one or more computer data networks, such as the Internet and/or private data networks, by way of gateway 40 or 42. As will be appreciated, embodiments of the invention may work with many types of wireless networks. Transaction server 44 is in turn in communication with a data network that is in communication with wireless networks 36 and 38. The protocol used for such communication is HyperText Transfer Protocol (HTTP) over Transmission Control Protocol/Internet Protocol (TCP/IP). As will be appreciated, other network protocols such as X.25 or Systems Network Architecture (SNA) could equally be used for this purpose.
  • At least two categories of communication between transaction server 44 and wireless communication devices 10, 30, 32 and 34 exist. First, virtual machine software 24 at each device may query transaction server 44 for a list of applications that a user of an associated wireless communication device 10, 30, 32 or 34 can make use of. If a user decides to use a particular application, device 10, 30, 32 or 34 can download a text description, in the form of an application definition file, for the application from the transaction server 44 over its wireless interface. Second, virtual machine software 24 may send and receive (as well as present, and locally store) data to and from transaction server 44 which is related to the execution of applications, or its own internal operations. The format of exchanged data for each application is defined by an associated application definition file. Again, the exchanged data may be formatted using XML, in accordance with the application definition file.
  • Transaction server 44 stores XML application definition files for those applications that have been enabled to work with the various devices 10, 30, 32, and 34 using virtual machine software 24 in a pre-defined format understood by virtual machine software 24. Software providing the functions of the transaction server 44 is, in the exemplary embodiment, written in C#, using a SQL Server or MySQL database.
  • The XML of the application definition files may conform to XML version 1.0, detailed in the XML version 1.0 specification third edition and available at www.w3.org/TR/2004/REC-xml-20040404, for example.
  • Each application definition file is formatted according to defined rules and uses pre-determined XML markup tags known by both virtual machine software 24, and complementary transaction server software 68. That is, each application definition file 28 is an XML document (i.e. an XML data instance file) which conforms to a predefined XML schema designed to support the execution of server-side applications at various types of wireless communication devices. Tags define XML elements used as building blocks to present an application at a wireless communication device. Knowledge of these rules, and an understanding of how each tag and section of text should be interpreted, allows virtual machine software 24 to process an XML application definition and thereafter execute an application, as described below. Virtual machine software 24 effectively acts as an interpreter for a given application definition file.
  • FIG. 4 illustrates an example format for an XML application definition file 28. As illustrated, the example application definition file 28 for a given device and application includes three components: a user interface definition section 48, specific to the user interface for the device 10, which defines the format of graphical user interface (GUI) screens for the application and how the user interacts with them and contains application flow control events and actions; a network transactions definition section 50 defining the format of data to be exchanged with the application; and a local data definition section 52 defining the format of data to be stored locally on the wireless communication device by the application.
  • Defined XML markup tags are used to create an application definition file 28. The defined tags may broadly be classified into three categories, corresponding to the three sections 48, 50 and 52 of an application definition file 28.
  • Example XML tags and their corresponding significance are detailed in Appendix “A”. As noted above, virtual machine software 24 at a wireless communication device includes object classes corresponding to each of the XML tags. At run time, instances of the objects are created as required.
  • Broadly, the following list includes example XML tags (i.e. XML elements) which may be used to define the GUI screens:
      • SCREEN—this defines a screen. A SCREEN tag pair contains the definitions of the user interface elements (buttons, radio buttons, and the like) and the events associated with the screen and its elements
      • BTN—this tag defines a button and its associated attributes
      • LB—this tag defines a list box that allows selection of an item from a list of items.
      • CHOICE—this tag defines a choice item, that allows selection of an item from a set of items.
      • MENU—the application developer will use this tag to define a menu for a given screen.
      • MENUITEM—defines a selectable item on a given menu.
      • EB—this tag defines an edit box for entering textual content.
      • TI—this tag describes a text label that is displayed.
      • CHK—this tag describes a checkbox.
      • GRID—this tag defines a grid made up of a configurable number of columns and rows defining cells for containing textual content.
      • EVENT—this defines an event to be processed by the virtual machine software. Events can be defined against the application as a whole, individual screens or individual items on a given screen. Sample events would be receipt of data over the wireless interface, or an edit of text in an edit box by a wireless communication device user.
      • ACTION—this describes a particular action that should be performed upon the occurrence of a particular event. Sample actions would be navigating to a new window or displaying a message box.
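  • Purely as an illustration of how these tags might fit together (attribute names and values are assumptions; the governing syntax is in Appendix “A”, which is not reproduced here), a one-button screen could be sketched as follows:
        # Hypothetical fragment only; the real element and attribute syntax is
        # defined in Appendix "A". Parsing it merely confirms well-formedness.
        from xml.dom.minidom import parseString

        fragment = parseString("""
        <SCREEN NAME="LoginScr" TITLE="Login">
          <TI>Press the button to log in</TI>
          <BTN NAME="OK">
            <EVENT TYPE="BUTTONCLICK">
              <ACTION TYPE="MSGBOX">Message sent.</ACTION>
            </EVENT>
          </BTN>
        </SCREEN>
        """)
        print(fragment.documentElement.tagName)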
  • The second category of example XML tags describes the network transaction section 50 of application definition 28. These may include the following example XML tags:
      • TUPDATE—using this tag, the application developer can define an update that is performed to a table in the device's local storage. Attributes allow the update to be performed against multiple rows in a given table at once.
      • PKGFIELD—this tag is used to define a field in a data package (message) that passes over the wireless interface.
  • The third category of XML tags used to describe an application are those used to define a logical database that may be stored at the wireless communication device. The tags available that may be used in this section are:
      • TDEF—this tag, and its attributes and subordinate field tags, define a table. Contained within a pair of TDEF tags are definitions of the fields contained in that table. The attributes of a table control such standard relational database functions as the primary key for the table.
      • FLD—this tag describes a field and its attributes. Attributes of a field are those found in a standard relational database system, such as the data type, whether the field relates to one in a different table, the need to index the field, and so on.
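  • For illustration only (attribute names and values are assumptions), a local table might be declared along the following lines:
        # Hypothetical sketch of a table declaration using the TDEF and FLD tags
        # named above; attribute names and values are assumptions.
        from xml.dom.minidom import parseString

        table = parseString("""
        <TDEF NAME="INVENTORY" PK="ITEMNO">
          <FLD NAME="ITEMNO" TYPE="INT" INDEX="YES"/>
          <FLD NAME="DESCRIPTION" TYPE="STRING"/>
          <FLD NAME="QTY" TYPE="INT"/>
        </TDEF>
        """)
        print(table.documentElement.getAttribute("NAME"))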
  • In addition to these XML tags, virtual machine software 24 may, from time to time, need to perform certain administrative functions on behalf of a user. In order to do this, one of object classes 69 has its own repertoire of tags to intercommunicate with the transaction server 44. Such tags differ from the previous three groupings in that they do not form part of an application definition file, but are solely used for administrative communications between the virtual machine software 24 and the transaction server 44. Data packages using these tags are composed and sent due to user interactions with the virtual machine's configuration screens. The tags used for this include:
      • REG—this allows the application to register and deregister a user for use with the transaction server.
      • FINDAPPS—by using this operation, users can interrogate the server for the list of applications that are available to them.
      • APPREG—using this operation, the end-user can register (or deregister) for an application and have the application interface downloaded automatically to their device (or remove the interface description from the device's local storage).
      • SA—If the user's preferred device is malfunctioning, or out of power or coverage, they will need a mechanism to tell the Server to attempt delivery to a different device. The Set Active (SA) command allows the user to set the device that they are currently using as their active one.
  • Referring again generally to the manner in which execution of server-based applications at wireless communication devices is facilitated, FIG. 5 illustrates the organization of application definitions at transaction server 44 and how transaction server 44 may form an application definition file 28 (FIG. 4) for a given device 10, 30, 32 or 34. In the illustration of FIG. 5, only two wireless communication devices 10 and 30 are considered. Typically, since network transactions and local data are the same across devices, the only piece of the application definition that varies for different devices is the user interface definition (i.e. the definition of its GUI screens).
  • So, transaction server 44 stores a master definition file 58 (or simply “master definition” 58) for a given server-side application. This master definition 58 contains example user interface descriptions 48, 54, 56 for each possible type of wireless communication device 10, 30, 32; descriptions of the network transactions 50 that are possible and data descriptions 52 of the data to be stored locally on the wireless communication device. Typically the network transactions 50 and data descriptions 52 will be the same for all wireless communication devices 10, 30 and 32, while the user interface descriptions 48, 54, and 56 vary slightly from one another. This may for example be due to display size limitations on some wireless communication devices which force a designer to lay out the display elements of a user interface slightly differently from device to device.
  • For device 10, transaction server 44 composes an application definition file 28 by querying the device type and adding an appropriate user interface description 48 for device 10 to the definitions for the network transactions 50 and the data 52. For device 30, transaction server 44 composes the application definition file 28 by adding the user interface description 54 for device 30 to the definitions for the network transactions 50 and data 52. These two files 28 may be thought of as platform-specific versions of a mobile application.
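  • As a rough Python sketch (the section element names and device identifiers are assumptions), the composition performed by transaction server 44 could be pictured as follows:
        # Sketch: pick the interface section matching the requesting device type
        # and combine it with the shared network-transaction and table sections.
        # Element names for the sections are assumptions for illustration.
        import xml.etree.ElementTree as ET

        master = ET.fromstring("""
        <MASTER>
          <INTERFACE PLATFORM="RIM">RIM screens</INTERFACE>
          <INTERFACE PLATFORM="POCKETPC">Pocket PC screens</INTERFACE>
          <TRANSACTIONS>network transaction definitions</TRANSACTIONS>
          <TABLES>local data definitions</TABLES>
        </MASTER>
        """)

        def compose_definition(master, platform):
            app_def = ET.Element("APPDEF")
            for ui in master.findall("INTERFACE"):
                if ui.get("PLATFORM") == platform:
                    app_def.append(ui)                   # device-specific screens
            app_def.append(master.find("TRANSACTIONS"))  # shared across devices
            app_def.append(master.find("TABLES"))        # shared across devices
            return ET.tostring(app_def, encoding="unicode")

        print(compose_definition(master, "POCKETPC"))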
  • The master definition 58 for a given application is created away from the transaction server 44 and may be loaded onto the transaction server 44 by administrative staff charged with its operation. Master definition files may be created by a developer using a rapid application development tool such as the one described below in Section II. Alternatively, a simple text editor could be used. It will be appreciated that the master definition file 58 is an XML document.
  • FIG. 6 illustrates the organization of transaction server 44. Transaction server 44 may be any conventional application server, modified to function as described herein. As such, transaction server 44 includes a processor 60, in communication with a network interface 66 and storage memory 64. Transaction server 44 may be, for example, a server running Windows Server 2003, a Sun Solaris server, or the like. Memory 64 of transaction server 44 stores operating system software 62, such as Windows Server 2003 or Solaris.
  • Network interface 66 enables transaction server 44 to transmit and receive data over a data network 63. Transmissions are used to communicate with both the virtual machine software 24 (via the wireless networks 36, 38 and wireless gateways 40, 42 of FIG. 3) and one or more application servers, such as application server 70, that are the end recipients of data sent from the mobile client applications and the generators of data that is sent to the mobile client applications.
  • Memory at transaction server 44 further stores software 68 which, when executed by transaction server 44, enables the transaction server to understand and compose XML data packages that are sent and received by the transaction server 44. These packages may be exchanged between transaction server 44 and the virtual machine software 24, or between the transaction server 44 and the application server 70. Transaction server software 68 may be loaded from a machine-readable medium.
  • As described above, communication between the application server 70 and the transaction server 44 can, in an exemplary embodiment, use HTTP running on top of a standard TCP/IP stack; however this is not a requirement. An HTTP connection between a running application at the application server 70 and the transaction server 44 is established in response to the application being presented at a wireless communication device. The server-side application provides output to transaction server 44 over this connection. The server-side application formats its output data into appropriate XML data packages understood by the virtual machine software 24 at the wireless communication device.
  • That is, a server-side application (or an interface portion of the application) formats application output into XML in a manner consistent with the format defined by the application definition file for the application. Alternatively, an interface component separate from the application could easily be formed with an understanding of the format and output for a particular application. That is, with a knowledge of the format of data provided and expected by an application at application server 70, an interface component could be produced using techniques readily understood by those of ordinary skill. The interface portion could translate application output to XML, as expected by transaction server 44. Similarly, the interface portion may translate XML input from a wireless communication device into a format understood by the server-side application.
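  • By way of illustration only, a minimal sketch of such an interface portion might take the following form; the class name, element names (PKG, FIELD) and the map-based field handling below are assumptions made for illustration and are not part of the described embodiment. The reverse translation, from XML received via the transaction server 44 into a format understood by the server-side application, could be implemented in a like manner.
  • // Illustrative sketch (assumption): an interface portion that wraps server-side
    // application output into XML of the kind expected by the transaction server.
    import java.util.Map;

    public class AppServerXmlAdapter {

        // Format application output fields as an XML data package; element names are assumptions.
        public String toXml(String deviceId, Map<String, String> outputFields) {
            StringBuilder xml = new StringBuilder();
            xml.append("<PKG DEVICE=\"").append(deviceId).append("\">");
            for (Map.Entry<String, String> field : outputFields.entrySet()) {
                xml.append("<FIELD NAME=\"").append(field.getKey()).append("\">")
                   .append(field.getValue())
                   .append("</FIELD>");
            }
            xml.append("</PKG>");
            return xml.toString();
        }
    }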
  • The particular identity of the wireless communication device on which the application is to be presented may be identified by a suitable identifier, in the form of a header contained in the server-side application output. This header may be used by transaction server 44 to forward the data to the appropriate wireless communication device. Alternatively, the identity of the connection could be used to forward the data to the appropriate wireless communication device.
  • FIG. 7 illustrates a sequence diagram detailing data (application data or application definition files 28) flow between wireless communication device 10 and transaction server 44.
  • For data requested from transaction server 44, device 10, executing virtual machine software 24, makes a request to transaction server 44; the request passes over the wireless network 36 to network gateway 40, which passes it on to the transaction server 44. Transaction server 44 responds by executing a database query on its database 46 that finds which applications are available to the user and the user's wireless communication device. For data passed from transaction server 44 to device 10, data is routed through network gateway 40. Network gateway 40 forwards the information to the user's wireless communication device over the wireless network 36.
  • FIG. 7 when considered with FIG. 3 illustrates a sequence of communications between virtual machine software 24 (executing at device 10) and transaction server 44 that may occur when the user of a wireless communication device wishes to download an application definition file 28 for a server-side application.
  • Initially device 10 may interrogate server 44 to determine which applications are available for the particular wireless communication device being used. This may be accomplished by the user instructing the virtual machine software 24 at device 10 to interrogate the server 44. Responsive to these instructions the virtual machine software 24 sends an XML message to the server requesting the list of applications (data flow 72); the XML message may contain the <FINDAPPS> tag, signifying to the transaction server 44 its desire for a list of available applications. In response, transaction server 44 makes a query to database 46. Database 46, responsive to this query, returns a list of applications that are available to the user and the wireless communication device. The list is typically based, at least in part, on the type of wireless communication device making the request, and the applications known to transaction server 44. Transaction server 44 converts this list to an XML message and sends it to the virtual machine (data flow 74). Again, a suitable XML tag identifies the message as containing the list of available applications.
  • In response, a user at device 10 may choose to register for an available server-side application. When a user chooses to register for an application, virtual machine software 24 at device 10 composes and sends an XML registration request for a selected application (data flow 76) to transaction server 44. As illustrated in FIG. 7, an XML message containing a <REG> tag is sent to transaction server 44. The name of the application is specified in the message. The transaction server 44, in response, queries its database for the user interface definition for the selected application for the user's wireless communication device. Thereafter, the transaction server creates the application definition file, as detailed with reference to FIG. 5. Then, transaction server 44 sends to the wireless communication device (data flow 78 of FIG. 7) the created application definition file 28.
  • The user is then able to use the functionality defined by the interface description to send and receive data.
  • At this time, parser 61 of virtual machine software 24 may parse the XML text of the application definition file to form a tokenized version of the file. That is, each XML tag may be converted to a defined token for compact storage and to minimize repeated parsing of the XML text file. The tokenized version of the application definition file may be stored for immediate or later use by device 10. In this context, the term “tokenized” may refer to placement of the XML structure into binary objects which are run-time accessible, which is much like conversion of a script into byte code.
  • The application definition file may initially be converted to a DOM tree representation. The entire DOM tree may then be traversed. For each XML element that is encountered during the traversal, a corresponding object 169 (FIG. 2B) may be instantiated from one of object classes 69. Instantiation of each object 169 may be facilitated by a fromXML( ) “constructor” method within the corresponding class 69, which populates the object's data members based on XML element/attribute values. For example, the constructor method may receive the XML fragment which defines the XML element in the application definition file 28 and, based on the element and attribute values within the fragment, automatically populate the newly instantiated object's data members with like values. It is noted that the constructor method may or may not meet the strict definition of the term “constructor” as it is understood in the context of certain object-oriented programming languages (e.g. the method may not have the same name as the class).
  • For purposes of illustrating the instantiation of a subset of the objects 169 of FIG. 2B, it is assumed that the following XML fragment is read from an application definition file 28:
  • <BTN NAME=“BTN1” INDEX=“1” CAPTION=“OK”>
    <EVENTS>
    <EVENT TYPE=“ONCLICK” . . .>
    <ACTION . . .>
    <ACTION . . .>
    </EVENT>
    </EVENTS>
    </BTN>
  • The above XML fragment represents an “OK” button on a containing GUI screen (not shown) which performs two actions when clicked. The details of the actions are omitted for brevity. When a DOM tree representation of the above is encountered, the result may be instantiation of the button object 173 (FIG. 2B) from the following one of classes 69:
  • Public class button
    {
    str name;
    int index;
    str caption;
    event[] events; // event array
    fromXML(<XML>) { ... } // “constructor”
    writeToPersistentStorage( ) { ... }
    readFromPersistentStorage( ) { ... }
    onEvent( ) { ... }
    :
    }
  • The data members “name”, “index” and “caption” of object 173 correspond to attributes of the same name within the XML fragment. The constructor method fromXML( ) populates these data members with the values “BTN1”, “1” and “OK”, respectively, based on the relevant XML attribute values.
  • The constructor method also populates the event array of button object 173. The event array is an array of event objects, each representing a different type of event that is significant with regard to the containing GUI screen display element (in this case, button object 173). In the above example, only one significant event is defined for the “OK” button, namely, an “ONCLICK” event which represents the clicking of the button. Accordingly, only one event object 175 is instantiated. The event object's data members include an array of action objects 177 and 179 (one for each action element in the above XML fragment) representing actions to be taken when the event occurs. Each action object is also populated by a constructor method within the action object, in like fashion.
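  • By way of illustration only, a fromXML( ) “constructor” of the kind described above might be sketched as follows, here using the standard org.w3c.dom API; the Java form, the field types and the nested Event stub are assumptions made for illustration rather than part of the described embodiment.
  • // Illustrative sketch (assumption): a class 69 whose fromXML() populates the new
    // object's data members from the attributes of a <BTN> element and instantiates
    // subordinate event objects for each <EVENT> element it contains.
    import java.util.ArrayList;
    import java.util.List;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;

    public class Button {
        String name;
        int index;
        String caption;
        List<Event> events = new ArrayList<>();

        // “Constructor” in the loose sense used above.
        public void fromXML(Element btn) {
            name = btn.getAttribute("NAME");                     // e.g. "BTN1"
            index = Integer.parseInt(btn.getAttribute("INDEX")); // e.g. 1
            caption = btn.getAttribute("CAPTION");               // e.g. "OK"
            NodeList eventElements = btn.getElementsByTagName("EVENT");
            for (int i = 0; i < eventElements.getLength(); i++) {
                Event event = new Event();
                event.fromXML((Element) eventElements.item(i));  // populate subordinate object
                events.add(event);
            }
        }

        // Minimal stub so the sketch is self-contained; a full event object would also
        // hold an array of action objects, as described above.
        public static class Event {
            String type;
            public void fromXML(Element e) { type = e.getAttribute("TYPE"); }
        }
    }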
  • The result of instantiating the button object and subordinate objects is illustrated in FIG. 2B. The button object is shown at 173, within the context of objects 169. In FIG. 2B, illustration of an object within the border of another object connotes the latter object's containment of the former. In the present embodiment a contained object is a data member of the containing object.
  • The button object 173 is contained within a screen object 171 which also includes an edit box object 181. This hierarchy indicates a UI screen having both a button and an edit box. The sole significant event for the button object 173 is represented by event object 175, which is the sole member of the event array of button object 173. The event object 175 in turn contains action objects 177 and 179 which represent actions that are to be taken when the containing event occurs. The actions may be of various types, as will be described.
  • The remaining objects 169 of FIG. 2B are based on other portions of the application definition file 28 which are not expressly set forth above. Briefly, the edit box object 181 of FIG. 2B contains two events 183 and 191, each representing a significant event for the edit box (say, selection and text entry). These event objects in turn contain actions 185, 193 and 195, representing actions to be taken when the relevant event occurs.
  • The button class contains an onEvent( ) method. This method is invoked via a callback from the operating system 20 upon the detection of any event pertaining to the button UI construct for purposes of determining whether the detected event is significant and thus requires action to be taken. Other UI constructs, such as edit boxes, menu items, and the like also have a similar method. Cumulatively, these methods within instantiated objects 169 may comprise event handler 65 of FIG. 2A.
  • Each class also includes a writeToPersistentStorage( ) method which saves the object's state by storing data member values, e.g. to a file system. The values are typically stored in a binary representation. This method is invoked during initial DOM tree traversal for purposes of writing to persistent storage newly instantiated objects which are not immediately needed. Once the data has been so stored, the objects may be de-allocated, and as a result, it is not necessary to maintain a vast set of objects representative of the entire application definition file 28 within wireless communication device memory. Only objects 169 pertaining to the current wireless communication device application state are instantiated at any given time, and wireless communication device resources are thereby conserved. A corresponding readFromPersistentStorage( ) method permits a newly instantiated object to assume the state of a previously de-allocated object from values saved to persistent storage by the writeToPersistentStorage( ) method, e.g., when a screen is loaded due to user navigation to that screen. By initially storing the entire set of objects to persistent storage in this fashion, the need to maintain a DOM tree is avoided.
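  • By way of illustration only, the persistence methods described above might be sketched as follows; the binary layout, stream types and field names are assumptions made for illustration only.
  • // Illustrative sketch (assumption): saving and restoring an object's state so that the
    // full set of objects representing the application definition file need not be kept in
    // wireless communication device memory.
    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.io.IOException;

    public class PersistableButton {
        String name;
        int index;
        String caption;

        // Write the data member values in a binary representation, e.g. to a file system.
        public void writeToPersistentStorage(DataOutputStream out) throws IOException {
            out.writeUTF(name);
            out.writeInt(index);
            out.writeUTF(caption);
        }

        // Allow a newly instantiated object to assume the state of a previously
        // de-allocated object, e.g. when a screen is loaded due to user navigation.
        public void readFromPersistentStorage(DataInputStream in) throws IOException {
            name = in.readUTF();
            index = in.readInt();
            caption = in.readUTF();
        }
    }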
  • Thereafter, upon invocation of a particular application for which the device 10 has registered, the screen generation engine 67 of the virtual machine software 24 at the device locates the definition of an initial screen for that application. The initially loaded screen will be the one identified within the application definition file 28 for that application by way of the attribute <First screen=“yes”>.
  • Operation for loading a first or subsequent screen is illustrated in FIG. 8. To load a screen, generation engine 67 may employ a loadScreen(X) routine, which may be one of the general purpose routines 59 within virtual machine software 24 (FIG. 2B). This routine may accept as a parameter a unique screen identifier X. Based on that identifier, the routine may find the appropriate representation of the named screen within local storage 26 (as previously stored by the writeToPersistentStorage( ) method of that screen when the DOM tree was initially traversed), instantiate a new screen object 169 (S802), and cause the new screen object to populate itself with the stored data, e.g. through invocation of the readFromPersistentStorage( ) method of the screen object. The latter method may in turn instantiate subordinate objects (such as buttons, edit boxes, menus, list boxes, choice items, and checkboxes, as detailed in Appendix “A”) and cause the readFromPersistentStorage( ) methods of these subordinate objects to be called, in an iterative fashion. In the result, a hierarchy of instances 169 of object classes 69 is created within the virtual machine software 24, as illustrated in FIG. 2B.
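  • By way of illustration only, a loadScreen(X) routine of the kind just described might be sketched as follows; the in-memory map standing in for local storage 26, the Screen stub and the exception handling are assumptions made for illustration only.
  • // Illustrative sketch (assumption): locate the stored representation of a named screen,
    // instantiate a new screen object (S802) and populate it from persistent storage.
    import java.io.ByteArrayInputStream;
    import java.io.DataInputStream;
    import java.io.FileNotFoundException;
    import java.io.IOException;
    import java.util.HashMap;
    import java.util.Map;

    public class ScreenLoader {
        // Stand-in for local storage 26: screen name mapped to its stored binary state.
        private final Map<String, byte[]> localStorage = new HashMap<>();

        public Screen loadScreen(String screenName) throws IOException {
            byte[] saved = localStorage.get(screenName);
            if (saved == null) {
                throw new FileNotFoundException("No stored screen named " + screenName);
            }
            Screen screen = new Screen();                  // S802: instantiate a new screen object
            screen.readFromPersistentStorage(              // populate it (and, in a full
                new DataInputStream(new ByteArrayInputStream(saved))); // implementation, its subordinate objects)
            return screen;
        }

        // Minimal stub so the sketch is self-contained.
        public static class Screen {
            String title;
            public void readFromPersistentStorage(DataInputStream in) throws IOException {
                title = in.readUTF();
            }
        }
    }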
  • Each definition of a visual UI construct (also referred to as a “display element”) causes virtual machine software 24 to use the operating system of the wireless communication device to create a corresponding display element of a graphical user interface, as more particularly illustrated in FIG. 8. Specifically, for each element (S804, S814, S824, S834 and S844), the associated XML definition is read in (S806, S816, S826, S836 and S846) and a corresponding instance of a screen object defined as part of the virtual machine software 24 is created by the virtual machine software 24 (at S808, S818, S828, S838 and S848), in accordance with S902 and onward illustrated in FIG. 9. Each interface object instance is created at S902. Each instance takes, as attributes, values defined by the XML text associated with the element. A method of the virtual machine object is further called (S904), and causes a corresponding device operating system object to be created (S906). Attributes originally defined in the XML text file, as stored within the virtual machine object instance, are applied to the corresponding instance of a display object created using the device operating system (S908-S914). This is repeated for all attributes of the virtual machine object instance. For any element allowing user interaction, giving rise to an operating system event, the event handler 65 of virtual machine software 24 is registered to process operating system events. This may for example entail registering, for each display element on the screen (e.g. buttons, menu items, etc.), a callback to an onEvent( ) method of the UI construct, which will be invoked upon the occurrence of any event in respect of that construct for purposes of determining whether the event is significant, such that the event's actions should be executed as a result. The onEvent( ) methods for all of the objects which are associated with the currently displayed screen may constitute the event handler 65 for that screen. That is, there may be no distinct instance of an event handler 65 per se other than the onEvent( ) methods of currently displayed or instantiated UI construct objects.
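  • By way of illustration only, the creation of a device operating system display element from a virtual machine object (S906-S914) might be sketched as follows. The Swing toolkit is used here purely as a stand-in for the device operating system's native UI API, and the class and method names are assumptions made for illustration.
  • // Illustrative sketch (assumption): a virtual machine button object creates its
    // corresponding operating system display element, applies its stored attributes and
    // registers a callback so that events can be routed to its onEvent() method.
    import javax.swing.JButton;
    import javax.swing.JFrame;

    public class VmButton {
        String name = "BTN1";      // attribute values previously read from the XML definition
        String caption = "OK";

        private JButton nativeButton;

        // S904/S906: create the OS object; S908-S914: apply the attributes held by this instance.
        public void createNativeElement(JFrame screen) {
            nativeButton = new JButton();
            nativeButton.setText(caption);
            nativeButton.setName(name);
            nativeButton.addActionListener(e -> onEvent("ONCLICK"));  // register for OS events
            screen.add(nativeButton);
        }

        // Part of event handler 65: decide whether the detected event is significant.
        void onEvent(String type) {
            if ("ONCLICK".equals(type)) {
                // pass control to the actions declared for this event (omitted in this sketch)
            }
        }
    }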
  • As described above and illustrated in FIG. 2B, for each event (as identified by an <EVENT> tag in the application definition file 28) and action (as identified by an <ACTION> tag), virtual machine software 24 creates an instance of a corresponding event and action object forming part of virtual machine software 24. Virtual machine software 24 maintains a list identifying each instance of each event and action object, which may take the form of arrays of event objects and subordinate arrays of action objects as described above (S916 to S928).
  • Operation at S902-S930 is repeated for each element of the screen at S808, S818, S828, S838 and S848 as illustrated in FIG. 8. All elements originally defined between the <SCREEN> definition tags of the application definition file are so processed. After the entire screen has been so created in memory, it is displayed (S854), using conventional techniques.
  • Events may be handled by virtual machine software 24 as illustrated in FIG. 10. Operation at S1002 and onward is performed in response to the operating system detecting an event.
  • Upon the occurrence of an event, such as the receipt of data from a wireless network 36 or 38 or user interaction with user interface controls at the wireless communication device, the operating system automatically invokes the onEvent( ) method for each object 169 in respect of which a callback was earlier registered (S1002). The onEvent( ) method for these objects determines whether the event is significant for the object (S1004) and, if so (S1006), passes control to each of the action(s) in the array of action objects within the relevant event object, in succession (S1008-S1016), for purposes of effecting the desired behaviour. Control may for example be passed by invoking a doAction( ) method of each instance of an action object within the action object array that comprises the data members of the relevant event object, in order to effect desired processing as defined by the XML <ACTION> tag. The result of executing this method may be loading a named screen, closing the current screen, sending a message, storing data locally at the device, or other actions, as described below. Within the doAction( ) method of each action object, hard-coded instructions exist that are capable of causing various types of actions to be performed. The attribute values within the action object's data members dictate which of these hard-coded instructions are executed and thereby effectively serve as parameters to the action which determine the resultant operation of the wireless communication device.
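  • By way of illustration only, the dispatch just described might be sketched as follows; the class names, the fire( ) method and the particular action type strings are assumptions made for illustration only.
  • // Illustrative sketch (assumption): an event object passes control to each action object
    // in its array, in succession, once its event has been found to be significant (S1008-S1016).
    import java.util.ArrayList;
    import java.util.List;

    public class VmEvent {
        String type;                                   // e.g. "ONCLICK", from the <EVENT> tag
        List<VmAction> actions = new ArrayList<>();    // one entry per <ACTION> tag

        // Invoked from an onEvent() callback after the event is determined to be significant.
        public void fire() {
            for (VmAction action : actions) {
                action.doAction();                     // effect the behaviour of one <ACTION> tag
            }
        }
    }

    class VmAction {
        String actionType;                             // attribute values serve as parameters
        String target;                                 // e.g. the name of a screen to open or close

        public void doAction() {
            // Hard-coded instructions, selected by the attribute values held in the data members.
            switch (actionType) {
                case "OPEN":   /* load the screen named by 'target'    */ break;
                case "CLOSE":  /* destroy the screen named by 'target' */ break;
                case "NOTIFY": /* play the configured notification      */ break;
                default:       /* other action types, as described below */ break;
            }
        }
    }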
  • In some cases, the doAction( ) method may invoke a general purpose routine. For example, if an action specifies that a screen should be closed, a “destroy screen X” routine 181, which is one of general purpose routines 59 (FIG. 2B) in the present embodiment, may be invoked from the action object's doAction( ) method. This routine may traverse screen objects within the instantiated objects 169 until the screen with the specified name X is found, at which point that screen object may be instructed to destroy itself. If the action indicates that a message (package) should be sent, a “createXMLPackage( )” general purpose routine 187 (FIG. 2B) may be invoked from the action object's doAction( ) method, to create and send a message over wireless network 22 containing specified data. According to that routine 187, methods within an XML builder object may assemble data into an XML package which is then passed to a message server object. The message server object may use the device's network APIs to transmit the assembled XML package across the wireless network.
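  • By way of illustration only, the assembly performed by such a routine might be sketched as follows using the standard javax.xml APIs; the element names and the map-based input are assumptions made for illustration, and transmission by the message server object is omitted.
  • // Illustrative sketch (assumption): assemble specified data into an XML package and
    // serialize it; a message server object would then transmit the result over the network.
    import java.io.StringWriter;
    import java.util.Map;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;

    public class XmlPackageBuilder {

        public String createXmlPackage(Map<String, String> fields) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().newDocument();
            Element pkg = doc.createElement("PKG");    // package root; element name is an assumption
            doc.appendChild(pkg);
            for (Map.Entry<String, String> field : fields.entrySet()) {
                Element child = doc.createElement(field.getKey());
                child.setTextContent(field.getValue());
                pkg.appendChild(child);
            }
            StringWriter out = new StringWriter();     // serialize the assembled package
            TransformerFactory.newInstance().newTransformer()
                    .transform(new DOMSource(doc), new StreamResult(out));
            return out.toString();
        }
    }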
  • By executing actions which cause new screens to be loaded or closed, navigation through the screens of the application is accomplished according to the definition embodied in the application definition file 28.
  • If the event is the receipt of an XML package from the wireless network 22, then the relevant event objects will not be directly contained within a UI construct object within objects 169 (as shown in FIG. 2B). Rather, the relevant event objects will be defined at the application level or at the screen level, depending upon whether the data event is defined as an application-level event (i.e. significant regardless of the currently displayed screen) or a screen-level event (i.e. only significant when a particular screen is displayed), respectively. When an XML package is received, the event objects whose onEvent( ) methods are invoked will be at the same level as, or will be directly contained by, the screen object corresponding to the currently displayed screen (FIG. 2B).
  • So, for example, as illustrated in FIG. 7, a user could send a login request 80 by interacting with an initial login screen, defined in the application definition file for the application. This would be passed by the transaction server 44 to the backend application server 70. The backend application server, according to the logic embedded within its application, would return a login response 82, which the transaction server 44 would pass to the virtual machine software 24. Other applications, running on the same or other application servers, might involve different interactions, the nature of such interactions being based upon the functionality and logic embedded within the application server 70.
  • II. Rapid Application Development Tool
  • In order to facilitate the development of a master definition file 58 (and, indirectly, application definition files 28) for use in the system described above, a rapid application development tool may be used. An exemplary RAD tool 116 (or simply “tool 116”) is illustrated in FIG. 11. FIG. 11 illustrates the tool 116 within the wireless communication device operating environment of FIG. 3. Wireless communication device 30 of FIG. 3 is omitted from FIG. 11 for clarity.
  • The RAD tool 116 of FIG. 11 is a computing device 118, such as an Intel®-Processor based personal computer (PC) for example, executing rapid application development (RAD) software, which may be loaded from a machine-readable medium such as an optical disk 120. The tool 116 allows the developer to create a master application definition file 58 from which device-specific application definition files 28 may be generated. Completed master definition files 58 are uploaded from computing device 118 to transaction server 44, via network 119, which may be an Ethernet local area network for example, for downloading to wireless communication devices 10, 32 and 34. In turn, the device-specific application definition file 28, when downloaded, interpreted and executed at a wireless communication device 10, 32 or 34, permits the wireless communication device to emulate and intercommunicate with an application that is actually executing on an application server 70 (FIG. 6), as described above.
  • FIG. 12 illustrates RAD tool 116 in greater detail. In the present embodiment the tool 116 is a PC 118 executing RAD software 130. The PC 118 includes a processor 134 in communication with memory 132 which stores the software 130. The PC 118 further includes a conventional display 138, such as a Cathode Ray Tube (CRT) monitor or flat-screen display for example, and a conventional user input mechanism (UIM) 145, such as a keyboard and/or mouse for example. The PC 118 also includes a network interface card 142 (e.g. an Ethernet interface) which facilitates communication by the tool 116 over network 119 (FIG. 11), e.g. for purposes of uploading a completed master definition file 58 from secondary storage 146 to the transaction server 44.
  • In overview, when RAD software 130 is executed by PC 118, it provides an intuitive graphical user interface which facilitates “drag and drop” application development, so that even developers who lack depth of expertise in software development may “develop a mobile application” (i.e. may generate a master definition file 58). The procedure for developing a mobile application essentially consists of creating a visual hierarchy or “tree” of icons which correlates to a logical hierarchy of XML markup tags (e.g. as defined in Appendix “A”). The created visual hierarchy may be similar to a graphical directory and file structure representation in a conventional graphical operating system. Each icon represents a building block of the application (e.g. a GUI screen, a database table for storing program data, an action to be executed upon the occurrence of a defined event, etc.) and corresponds to a defined ARML tag (i.e. an instance of an XML element with attributes). As a user creates icons and assigns properties to them, the tool 116 automatically generates a dynamically-accessible representation of the corresponding hierarchy of XML elements and attributes within memory 132, in the form of a master definition DOM tree 150 data structure. A DOM tree is essentially a dynamically accessible representation of an XML document that is well understood in the art. Within this master definition DOM tree 150, a technique is employed to efficiently represent sets of actions that may be triggered in more than one scenario. This technique is a focus of the present description, and is detailed below. When the user of tool 116 has completed development of the mobile application, the application is “published”, i.e. the master definition DOM tree 150 is serialized to form a master definition file 58, in a manner exemplary of the present invention.
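  • By way of illustration only, the maintenance of the master definition DOM tree 150 as icons are created might be sketched as follows using the standard org.w3c.dom API; the root element name and the NAME attribute below are assumptions made for illustration only.
  • // Illustrative sketch (assumption): as the developer creates an icon and assigns its
    // properties, the tool adds a corresponding element, with attributes, to the DOM tree.
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.parsers.ParserConfigurationException;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;

    public class MasterDefinitionTree {
        private final Document doc;
        private final Element root;

        public MasterDefinitionTree() throws ParserConfigurationException {
            doc = DocumentBuilderFactory.newInstance().newDocumentBuilder().newDocument();
            root = doc.createElement("ARML");      // root element name is an assumption
            doc.appendChild(root);
        }

        // Called when a new application component icon (screen, table, action, ...) is created.
        public Element addComponent(Element parent, String tag, String name) {
            Element component = doc.createElement(tag);
            component.setAttribute("NAME", name);
            (parent == null ? root : parent).appendChild(component);
            return component;
        }
    }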
  • The RAD software 130, which may be referred to by the proprietary name “AIRIX Design Studio” or simply “Design Studio”, may be implemented as a set of plug-ins to a generic integrated design environment (IDE) framework such as the Eclipse framework. As is known in the art, the Eclipse platform is designed for building integrated development environments that can be used to create various applications such as web sites, embedded Java™ programs, C++ programs, and Enterprise JavaBeans™ for example. The platform exposes to tool providers mechanisms to use and rules to follow via well-defined APIs, classes and methods. The RAD software 130 may be written in Delphi, using an SQL Server database for example.
  • FIG. 13 illustrates an exemplary GUI 1300 of the RAD tool 116 when the RAD software 130 is executed. The GUI 1300 includes various components, including a toolbar 1302, a project explorer 1304, and a main design area 1306.
  • Toolbar 1302 includes a menu list and icons for performing various development activities during mobile application development. Activities which may be performed include opening a new project, compiling a current mobile application, and publishing a current mobile application. The term “project” refers to the mobile application under development. Compiling refers to the checking of various aspects of an application for errors or deviations from good programming practices. Compilation may cause hints, warnings or errors to be displayed, e.g.:
      • Hint—screen “AddScm” for device “RIM” has an IF action and has no actions for its ELSE LIST.
      • Warning—Value “[SP.screenname.savename]” is used in the action “gotomm” on screen “AddScm” for device “RIM”. It should be noted that if this scratchpad (a temporary buffer on the wireless communication device capable of storing variables under a name “savename” on a screen-specific basis) or query value results to null, the OPEN command will not open a screen.
      • Error—The action “NewAction” on MenuItem “addloc” on screen “AddScm” for device “RIM” cannot have a blank screen.
  • Publishing refers to the creation of a master definition file 58 by serializing the master definition DOM tree 150. Publishing may cause hints, warnings or errors to be displayed, as during compilation.
  • The project explorer 1304 contains the visual hierarchy of icons 1305 that is created by the developer to represent the mobile application. In FIG. 13, the visual hierarchy 1305 defines a simple inventory application for a Pocket PC mobile device. A more detailed view of the project explorer 1304 is provided in FIG. 14, which is described below.
  • Main design area 1306 of FIG. 13 displays the currently-selected application component of project explorer 1304. When an icon is selected in the visual hierarchy 1305, a graphical representation of the relevant component (e.g. a screen or database table) and its properties appears in main design area 1306. For example, in FIG. 13 icon 1308 representing a Pocket PC GUI screen has been selected, as indicated by reverse video. As a result, an “interface designer” GUI that is specific to the relevant platform appears in design area 1306. The Pocket PC interface designer GUI includes a number of GUI areas, namely, a screen designer 1310, an interface component drop down list 1312 and a properties tab 1314.
  • The screen designer 1310 is a “screen painter” window which displays a graphical representation of the relevant wireless communication device type (a “virtual device”) for which screens are to be created. The designer 1310 permits a developer to design screens by dragging and dropping display elements (such as buttons, edit boxes, or other widgets) to the virtual device screen in the window, offering a “what you see is what you get” (WYSIWYG) view of the interface screen under development.
  • The interface component drop down list 1312 facilitates definition and selection of individual GUI display elements which make up the screen as displayed within the interface designer window 1310.
  • The properties tab 1314 displays the properties of the interface component that is currently selected in the drop down list 1312. Properties that are generally applicable to the overall screen may also be displayed. Displayed properties may be updated as desired.
  • FIG. 14 illustrates project explorer 1304 in greater detail. In FIG. 14, a visual hierarchy 1400, “Project—Example”, different from that shown in FIG. 13, is illustrated. The illustrated project exemplifies a mobile application for a Pocket PC wireless computing device. As will be described, the mobile application utilizes global functions to efficiently declare a set of actions that is executed in multiple scenarios.
  • The visual hierarchy 1400 includes platform-independent components 1402 and platform-specific components 1404. Platform-independent components 1402 are application components (i.e. application building blocks such as GUI screens, definitions of significant events, and actions to be performed upon the occurrence of these events) which are not specific to a particular type of wireless communication device and may therefore alternatively be considered “device-independent”. Platform-specific components 1404, on the other hand, are application components that may vary from wireless communication device type to wireless communication device type. For example, the GUI screens of a mobile application may differ in some measure between wireless communication device types, due to differences in the capabilities of the devices (e.g. screen size and supported display elements). As will be appreciated, application components generally correspond to XML elements within the master definition file 58 that will be generated by the RAD tool 116.
  • As shown in FIG. 14, the platform-independent components 1402 (which comprise the “Device Independent” branch of the visual hierarchy) include application events 1406, data rules 1408, database tables 1410 and global functions 1412.
  • Application events 1406 define the events which trigger processing within the mobile application regardless of the application's status (e.g. regardless of which GUI screen is presently displayed) as well as the actions that are to be performed upon the events' occurrence. For example, the receipt of an XML package at the wireless communication device can be defined as an application level event which results in the display of a message box for example. Application level events are to be distinguished from screen level events (arrival of an XML package when a specific screen is displayed) and control level events (user manipulation of a GUI control such as a button press), which are defined separately from application level events 1406. An exemplary definition of a control-level event is described below.
  • Data rules 1408 dictate how XML packages received from enterprise applications affect data stored in database tables associated with an application. A rule may define which field(s) of a table will be impacted by incoming data and the nature of the impact. Because rules make reference to database tables, logically they are defined after the tables (described below) have been defined. Rules also dictate how to apply changes to database tables from XML created in an outgoing XML transaction. Like application-level events 1406, data rules 1408 are device independent.
  • Database tables 1410 are defined by a developer for purposes of storing data at run time for use by the mobile application executing at the wireless communication device.
  • Global functions 1412 contain definitions of global functions for the mobile application, which are a focus of the present description. A global function is a named aggregation or set of actions which can be referenced from other areas of the visual hierarchy 1400 where the actions might otherwise be declared (i.e., from any event declaration in hierarchy 1400). The referencing event declaration defines the circumstances in which the actions comprising the referenced global function will be triggered. A global function may be warranted when the same set of N actions (N being an integer greater than one) should be executed in more than one scenario of a mobile application. Each global function defines actions to be executed and a sequence for execution. By referencing a global function from multiple places within the visual hierarchy 1400, instead of repeatedly declaring the same set of actions, the developer may simplify implementation. Moreover, maintainability is improved, because any changes to the set of actions need only be effected in one place, i.e., the global function.
  • In the illustrated embodiment, the global functions section 1412 of FIG. 14 contains two global function definitions 1414 and 1422.
  • The first global function definition 1414, “Function1”, contains definitions for three actions A, B and C. The first action 1416, “Action A”, causes an XML package representing a login message with predetermined username and password field values to be sent from the wireless communication device 10 to the transaction server 70. The second action 1418, “Action B”, and the third action 1420, “Action C”, each cause the wireless communication device 10 to activate the configured notification for the device (e.g. to play a “beep”). The order of the actions A, B and C determines their sequence of execution. Global function 1414 thus constitutes a “unit” of code which causes the wireless communication device 10 to send an XML package login message and to activate the configured notification twice, in that sequence. As will be appreciated, the definition of this global function is motivated by the fact that the mobile application requires that the set of actions A, B and C be executed in that sequence in more than one scenario.
  • The procedure for defining the global function 1414 in the project explorer window 1304 may be as follows. The icon corresponding to the global functions section 1412 may initially be selected with a mouse (or similar user input mechanism 145) of the RAD tool 116 (FIG. 12). A right-click (or similar user action) may cause a pop-up menu to be displayed. The pop-up menu may present a list of permissible application components that may be defined within the global functions section 1412. An Add Function option may permit the user to define a new global function. Selection of that menu item may cause a new global function icon 1414 to be created below icon 1412, as shown in FIG. 14, and a Function Properties window to be displayed in the main design area 1306 (FIG. 13). The Function Properties window may permit the user to enter properties of the newly-defined global function. The Function Properties window may include a Function Name field for defining a unique function name that is not already in use by any other global function. The name uniqueness constraint ensures that each global function may be uniquely identified from other areas of the visual hierarchy 1400. In the present example, it is assumed that the name “Function1” is entered, thus that name is displayed as part of the icon at 1414.
  • Subsequently, the newly-defined icon corresponding to function 1414 may be selected and the mouse again right-clicked to cause another, different pop-up menu to be displayed. This pop-up menu may contain a list of permissible application components that may be defined within a function. An Add Action option may permit the user to declare a new action. (Actions can be used to navigate to different portions of the application or to handle application data.) The RAD tool 116 may for example support the different action types described in Appendix “A”. Each action type instructs the application to perform a different operation, thus the properties associated with each action type vary.
  • Selection of the Add Action menu item may cause a new action icon at 1416 to be created below the icon at 1414 and an Action Properties window to be displayed in the main design area 1306 for entering properties associated with the newly-defined action. The Action Properties window may include an Action Name field for entering a name (e.g. “ActionA”) and an Action Type drop down list for selecting the type of action. The drop down list may contain a predetermined list of possible action types (e.g. as described in Appendix “A”). When one of the action types is selected, the Action Properties window may be updated for entering further properties associated with the selected action type.
  • In the present example, the action type for the first action A is set to “XML Transaction”, and the remaining properties are also set, so as to define an action which causes an XML package comprising a login request with a fixed username field of “SMITHJ” and a fixed password of “ABC321” to be sent to the transaction server 70. Definition of this action may involve typing the XML text comprising the message into an XML Text field.
  • By repeating the above-described steps for adding an action, but selecting a different action type “Notify” (rather than “XML Transaction”) which has no properties, a second action with the name “ActionB”, as represented by icon 1418, for causing the configured notification for the wireless communication device 10 to be activated, is defined. A further repetition of these steps results in the definition of action “ActionC” as represented by icon 1420. The definition of the first global function 1414 is thus completed.
  • A similar procedure may be followed to create the second global function 1422. The set of actions defined under global function 1422 (which are not visible in the project explorer window 1304 due to the fact that the portion of the visual hierarchy 1402 below the function definition 1422 has not been expanded) may be partly or wholly different from those defined in the first global function 1414.
  • Turning to the platform-specific components section 1404 of visual hierarchy 1400 (which comprises the “Operating Systems” branch), this section permits the definition of wireless communication device type-specific aspects of an application, which primarily comprise the displayable GUI screens and associated functionality of an application. In FIG. 14, GUI screens are defined for only one platform, namely the Pocket PC platform, which is represented by “Pocket PC” branch 1426.
  • Two GUI screen definitions 1428 and 1444 are illustrated in Pocket PC branch 1426 (other Pocket PC GUI screen definitions being omitted from FIG. 14 for brevity). Screen definition 1428 defines the GUI screen 1500 of FIG. 15 and screen definition 1444 defines a similar GUI screen 1600 of FIG. 16.
  • Referring to FIG. 15, it can be seen that GUI screen 1500 has a title 1502 (“Login”), a text item 1504 (“Press the button to log in”) and an “OK” button 1506. The desired behavior for the screen is for four actions to occur upon selection of the “OK” button 1506. First, an XML package comprising a login request with a fixed username field of “SMITHJ” and a fixed password of “ABC321” should be sent to the transaction server 70. Second, a notification sound should be played. Third, the notification sound should be repeated. Fourth, a message box 1508 with a title “Informational Message” and a message “Message sent.”, as shown in FIG. 15, should be displayed.
  • FIG. 16 illustrates the other GUI screen 1600. Screen 1600 is intended for display instead of GUI screen 1500 only in the case where logic within the mobile application has determined that the wireless computing device user prefers display of GUI screens in the French language. The GUI screen 1600 is analogous in non-textual appearance and operation to GUI screen 1500 of FIG. 15. For example, screen 1600 also has a title 1602, a text item 1604, an “OK” button 1606 and a message box 1608. Moreover, the design of screen 1600 is such that, upon selection of the “OK” button 1606, the same four actions as described above occur. However the textual aspects of the display elements, such as title 1602, text item 1604, and the textual components of the message box 1608, are in the French language.
  • Screen 1500 of FIG. 15 is defined in GUI screen definition 1428 of FIG. 14. The procedure for creating the screen definition 1428 in the project explorer window 1304 may involve steps that are similar to above-described steps for defining global functions 1412. For example, right-clicking of the Pocket PC branch 1426 may pop up a menu having a “New Screen” option. Selection of that option may cause the icon 1428 to be created. A unique screen name typed into a screen name field of a Screen Properties window (which may be displayed in the main design area 1306), “LoginScr”, becomes part of the icon 1428 representing the screen. The title 1502 for the screen (“Login”) may be typed in a title field of the Screen Properties window.
  • Below icon 1428 within the hierarchy, two icons 1430 and 1432 are created. The first icon 1430 represents the text item 1504 of FIG. 15. The second icon 1432 represents the “OK” button 1506 of FIG. 15. Each of these application components may be created by right-clicking the icon 1428, choosing the appropriate new display element to be added (from a pop-up menu or a toolbar for example) and defining the new display element's properties in an associated properties window.
  • A ButtonClick event 1434 is declared below the button icon 1432. This event represents the selection of the “OK” button 1506 of FIG. 15 by a wireless communication device user. Right-clicking of the event 1434 in project explorer 1304 causes another pop-up menu to be displayed. The options that are presented on the displayed pop-up menu include an Add Function Call option (assuming that at least one global function has been declared) and an Add Action option. These two options represent the two ways in which wireless communication device behavior responsive to the occurrence of an event may be specified.
  • The first option, Add Function Call, permits a reference to a previously-defined global function to be created. This option may be selected to create function call icon 1436. The definition of properties in an associated Function Call Properties window displayed in main design area 1306 is illustrated in inset 1438. A name field 1440 contains a user-specified name (“CallFn1”) that will be displayed as part of the icon 1436. A “Function To Call” field 1442 provides a drop-down list which lists, by (unique) name, each global function defined in the global functions section 1412. In FIG. 14, the list 1442 is illustrated in a dropped-down state, with two entries, namely Function1 and Function2, corresponding to the two previously-defined global functions 1414 and 1422, being visible. The heavy border around the first entry, Function1, indicates that this entry has been selected by the user of RAD tool 116, such that the function call 1436 defines a call to Function1. As will be appreciated, function call 1436 is operationally equivalent to the creation of actions 1416, 1418 and 1420 below the ButtonClick event 1434.
  • A further action 1438 (“Action4”) which causes English language message box 1508 (FIG. 15) to be displayed may be defined by right-clicking the ButtonClick event 1434 of hierarchy 1400, selecting pop-up menu option “Add Action”, and typing the desired message box title and message content in the Action Properties window.
  • The procedure for creating screen definition 1444 representing the other screen 1600 (FIG. 16) is virtually identical to the above-described steps for creating screen definition 1428, except that the textual content for screen 1600 is typed in French. The resultant application components, namely text item 1446, button 1448, ButtonClick event 1450, function call 1452, and action 1454 are accordingly the same as text item 1430, button 1432, ButtonClick event 1434, function call 1436, and action 1438, respectively.
  • Notably, the same global function (“Function1”) that is referenced by function call 1436 for the English language screen 1500 is also referenced by function call 1452 for the French language screen 1600. This reflects the fact that the same actions are to be performed upon selection of the “OK” button regardless of which of the English language or French language screens 1500 or 1600 is displayed. Because all of the actions 1416, 1418 and 1420 comprising the referenced function “Function1” are language-neutral (i.e. none of them cause any textual content to be displayed), they may be defined only once in the form of a global function, and the global function may then be referenced from each screen. In contrast, language-specific actions, which have different textual content on screens 1500 and 1600, are defined once for the English language screen 1500 and once for the French language screen 1600. Generally, actions which are specific to a platform, screen or event will be defined within the context of that platform, screen or event, as there is little motivation for defining such actions within a global function that would be referenced only once.
  • It will be appreciated that global functions are not necessarily always referenced from the context of a control-level event. Global functions may also be referenced from screen-level or application-level events.
  • Referring to FIGS. 17A-17B, the master definition DOM tree 150 of FIG. 12 is illustrated in greater detail. DOM tree 150 is represented textually as XML in FIGS. 17A-17B for ease of reference. It will be appreciated that the DOM tree 150 in memory 132 of RAD tool 116 (FIG. 12) is actually a dynamically-accessible representation. The DOM tree 150 of FIGS. 17A-17B corresponds to the mobile application design illustrated in the project explorer 1304 of FIG. 14. That is, the DOM tree 150 is automatically generated in memory 132 by RAD tool 116 as a result of the developer's creation of the “Project-Example” hierarchy 1400 of FIG. 14.
  • Two relevant portions of the master definition DOM tree 150 are illustrated in FIGS. 17A-17B. The first portion 1700 (lines 2-19 of FIG. 17A) corresponds to the global functions section 1412 of FIG. 14. The second portion 1702 (lines 21-61 of FIGS. 17A-17B) corresponds to the platform-specific components section 1404 of FIG. 14. Other portions of the master definition DOM tree 150 are omitted for brevity.
  • In the description which follows, the terms “XML element”, “XML element instance”, and “instance of an XML element” are understood to be synonymous. Each of these is a form of markup language element, or an “instance of” a markup language element.
  • In the first portion 1700, an outermost XML element, FUNCTIONS, contains two hierarchies of XML elements (i.e. two markup language hierarchies or sub-trees within DOM tree 150). The first hierarchy, which appears at lines 3-13 of FIG. 17A, corresponds to the first global function 1414. The second hierarchy, which appears at lines 14-18 of FIG. 17A, corresponds to the second global function 1422. Each hierarchy has a parent XML element, FUNCTION, and contains a number of instances of the ACTION element which, as previously noted, defines an action to be performed by the wireless computing device 10. It is noted that the XML elements FUNCTIONS and FUNCTION are extensions of the XML elements and attributes identified in Appendix “A”.
  • In the first markup language element hierarchy, the ACTION element at lines 4-10 corresponds to action 1416 of FIG. 14. When interpreted and executed by the virtual machine software 24 of a wireless computing device 10 (FIG. 1), this ACTION element causes a login message (whose “body” is defined at lines 6-8) to be sent from the device 10 to the transaction server 70. The ACTION elements at lines 11 and 12, on the other hand, correspond to actions 1418 and 1420 of FIG. 14, and each result in the playing of a notification sound at the wireless communication device 10.
  • The second markup language element hierarchy also contains multiple ACTION elements at lines 15-17, whose details are omitted for brevity.
  • The second portion 1702 of DOM tree 150 defines the GUI screens for the Pocket PC mobile application, which include screens 1500 and 1600 of FIGS. 15 and 16 respectively. In particular, lines 24-40 of FIGS. 17A-17B define English language screen 1500 while lines 41-57 of FIG. 17B define French language screen 1600.
  • Examining lines 24-40 more closely, it can be seen that XML elements and attributes at lines 31-39 define the “OK” button 1506 (FIG. 15). The XML elements include an EVENT element at lines 33-37 which defines the ButtonClick event 1434 (FIG. 14). The EVENT element in turn contains a FNCALL element at line 33 referencing a global function. More specifically, the FNCALL element has a CALLEDFN attribute whose value, “Function1”, uniquely identifies the hierarchy of XML elements at lines 3-13 of FIG. 17A as the global function whose actions are to be executed upon occurrence of the ButtonClick event. The ACTION element at lines 35-36 of FIG. 17B, on the other hand, corresponds to the message box action 1438 of FIG. 14, which is declared immediately within the containing EVENT element.
  • Turning to lines 41-57 of FIG. 17B, it can be seen that the XML elements and attributes which define screen 1600 are similar to those defining screen 1500, except that the textual components are in the French language. It is noted that the CALLEDFN attribute of the FNCALL markup language element at line 50 (which corresponds to the function call 1452 of FIG. 14) is the same as the CALLEDFN attribute of the FNCALL markup language element at line 33, since both reference the same global function 1414.
  • FIGS. 18A-18B illustrate the master definition file 58 (an XML document) that is generated through serialization of the master definition DOM tree 150 of FIGS. 17A-17B. Only the portion of the master definition file 58 which defines the two GUI screens 1500 and 1600 for the Pocket PC platform is illustrated in FIGS. 18A-18B. Screen 1500 is defined at lines 3-30 of FIG. 18A, while screen 1600 is defined at lines 31-55 of FIGS. 18A-18B.
  • Referring first to FIG. 18A, it can be seen that the definition of screen 1500 at lines 3-30 represents a “merging” of the English language screen definition at lines 24-40 of FIGS. 17A-17B of the master definition DOM tree 150 and the definition of the first global function at lines 3-13 of FIG. 17A of the master definition DOM tree 150. In essence, the screen definition at lines 3-30 of FIG. 18A is a reproduction of the screen definition at lines 24-40 of FIGS. 17A-17B, with the exception that the FNCALL element at line 33 of FIG. 17A has been replaced with all of the ACTION elements which make up the referenced global function (i.e., with the three actions defined at lines 4-12 of FIG. 17A). The resulting markup language at lines 14-23 of FIG. 18A is the same as if those three actions had originally been defined within the ButtonClick EVENT element, like the action 1438 (see lines 35-36 of FIG. 17B).
  • Similarly, the definition of screen 1600 at lines 31-55 of FIGS. 18A-18B represents a “merging” of the French language screen definition at lines 41-57 of the master definition DOM tree 150 (FIG. 17B) and the definition of the first global function at lines 3-13 of the master definition DOM tree 150 (FIG. 17A). The FNCALL element at line 51 of FIG. 17A has again been replaced with the three actions defined at lines 4-12 of FIG. 17A. The resulting markup language at lines 40-48 is the same as if those three actions had originally been defined within the context of the ButtonClick EVENT element, like action 1454 (corresponding to lines 51-52 of FIG. 17B).
  • It will be appreciated that the XML elements FUNCTIONS and FUNCTION do not appear in the resultant master definition file 58. This illustrates the fact that global functions are simply a design-time convenience for the user of RAD tool 116 which facilitates implementation of mobile applications in which the same set of actions is to be performed in multiple scenarios. Global functions do not form part of the mobile application per se.
  • The resultant master definition file 58 may be used by the baseline system for presenting server-side applications at varied wireless communication devices as described in section I above.
  • In the above-described embodiment, the master definition file 58 of FIGS. 18A-18B is generated by machine-readable code comprising the RAD software 130 which traverses and serializes the DOM tree 150 using the above-described technique for substituting global function references with the actions of the referenced function. This traversal and serialization are performed upon user selection of a “publish” command of RAD tool 116. However, the master definition file 58 could alternatively be generated by machine-executable parser code which parses a textual version of the master definition DOM tree 150 (as shown in FIGS. 17A-17B for example).
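  • By way of illustration only, the substitution performed during publishing might be sketched as follows using the standard org.w3c.dom and javax.xml.transform APIs; the NAME attribute on the FUNCTION element and the omission of error handling are assumptions made for illustration only.
  • // Illustrative sketch (assumption): each FNCALL reference in the master definition DOM
    // tree is replaced with copies of the ACTION elements of the referenced FUNCTION, after
    // which the FUNCTIONS section is removed and the tree is serialized.
    import java.io.StringWriter;
    import javax.xml.transform.TransformerException;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.Node;
    import org.w3c.dom.NodeList;

    public class MasterDefinitionPublisher {

        public String publish(Document masterDefinition) throws TransformerException {
            NodeList calls = masterDefinition.getElementsByTagName("FNCALL");
            // Iterate backwards so that removing nodes does not disturb the live NodeList.
            for (int i = calls.getLength() - 1; i >= 0; i--) {
                Element call = (Element) calls.item(i);
                Element function = findFunction(masterDefinition, call.getAttribute("CALLEDFN"));
                Node parent = call.getParentNode();
                NodeList children = function.getChildNodes();
                for (int j = 0; j < children.getLength(); j++) {
                    Node child = children.item(j);
                    if (child.getNodeType() == Node.ELEMENT_NODE) {
                        parent.insertBefore(child.cloneNode(true), call);  // one instance per reference
                    }
                }
                parent.removeChild(call);
            }
            // The FUNCTIONS section itself does not appear in the resultant master definition file.
            NodeList functionsSections = masterDefinition.getElementsByTagName("FUNCTIONS");
            for (int i = functionsSections.getLength() - 1; i >= 0; i--) {
                Node fns = functionsSections.item(i);
                fns.getParentNode().removeChild(fns);
            }
            StringWriter out = new StringWriter();
            TransformerFactory.newInstance().newTransformer()
                    .transform(new DOMSource(masterDefinition), new StreamResult(out));
            return out.toString();
        }

        private Element findFunction(Document doc, String name) {
            NodeList functions = doc.getElementsByTagName("FUNCTION");
            for (int i = 0; i < functions.getLength(); i++) {
                Element fn = (Element) functions.item(i);
                if (name.equals(fn.getAttribute("NAME"))) {   // NAME attribute is an assumption
                    return fn;
                }
            }
            return null;  // not expected for a well-formed master definition
        }
    }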
  • It should be understood that the approach for facilitating generation of a markup language document containing identical sets of markup language elements as described hereinabove is not necessarily limited to markup language documents pertaining to mobile applications and wireless computing device action. The approach may be used to simplify the generation of virtually any markup language document containing identical sets of markup language elements. Moreover, the approach may be used for various representations of a markup language document, such as textual markup language document files, DOM tree representations, or otherwise.
  • Use of the term “markup language document” or “XML document” herein is understood to include not only textual (e.g. ASCII) electronic files, but other document representations, such as DOM trees or Simple API for XML (SAX) representations for example.
  • As will be appreciated by those skilled in the art, modifications to the above-described embodiment can be made without departing from the essence of the invention. For example, another markup language such as Standard Generalized Markup Language could be employed instead of XML.
  • It is possible that global functions could reference other global functions, such that a developer could “build upon” one global function in another. This may be useful in the case where N global functions (N being an integer greater than 1) should contain the same subset of ACTION elements. Instead, that subset of ACTION elements could be defined in a further global function that is simply referenced from each of the N global functions.
  • Other modifications will be apparent to those skilled in the art and, therefore, the invention is defined in the claims.

Claims (18)

1. An apparatus comprising:
at least one processor; and
a memory coupled to said at least one processor storing:
a markup language document containing:
a markup language element hierarchy containing a set of markup language elements; and
a plurality of references to said markup language element hierarchy; and
machine-executable code which, when executed by the at least one processor, generates, from said markup language document, another markup language document containing one instance of said set of markup language elements for each of said plurality of references.
2. The apparatus of claim 1 wherein said markup language is Extensible Markup Language (XML), said markup language document is an XML document, said set of markup language elements is a set of XML elements, and said another markup language document is another XML document.
3. The apparatus of claim 1 wherein said markup language element hierarchy has a parent markup language element and wherein each of said plurality of references is a further markup language element having an attribute identifying said parent markup language element of said markup language element hierarchy.
4. The apparatus of claim 1 wherein each of said markup language elements of said set is representative of a wireless communication device action and wherein said markup language element hierarchy has a parent markup language element representative of an aggregation of wireless communication device actions.
5. The apparatus of claim 4 wherein said wireless communication device action is any of displaying a message box, sending a markup language message, terminating an application, closing a graphical user interface (GUI) screen, executing a conditional if-then-else expression, playing a notification sound, opening a specified GUI screen, purging a buffer, refreshing a displayed GUI screen or saving a value to the buffer.
6. The apparatus of claim 1 wherein each of said markup language elements of said set is an instance of the same markup language element.
7. A machine-readable medium comprising:
machine-executable code for generating, from a markup language document containing:
a markup language element hierarchy containing a set of markup language elements; and
a plurality of references to said markup language element hierarchy, another markup language document containing one instance of said set of markup language elements for each of said plurality of references.
8. The machine-readable medium of claim 7 wherein said markup language is Extensible Markup Language (XML), said markup language document is an XML document, said set of markup language elements is a set of XML elements, and said another markup language document is another XML document.
9. The machine-readable medium of claim 7 wherein said markup language element hierarchy has a parent markup language element and wherein each of said plurality of references is a further markup language element having an attribute identifying said parent markup language element of said markup language element hierarchy.
10. The machine-readable medium of claim 7 wherein each of said markup language elements of said set is representative of a wireless communication device action and wherein said markup language element hierarchy has a parent markup language element representative of an aggregation of wireless communication device actions.
11. The machine-readable medium of claim 10 wherein said wireless communication device action is any of displaying a message box, sending a markup language message, terminating an application, closing a graphical user interface (GUI) screen, executing a conditional if-then-else expression, playing a notification sound, opening a specified GUI screen, purging a buffer, refreshing a displayed GUI screen or saving a value to the buffer.
12. The machine-readable medium of claim 7 wherein each of said markup language elements of said set is an instance of the same markup language element.
13. A method comprising:
generating, from a markup language document containing:
a markup language element hierarchy containing a set of markup language elements; and
a plurality of references to said markup language element hierarchy, another markup language document containing one instance of said set of markup language elements for each of said plurality of references.
14. The method of claim 13 wherein said markup language is Extensible Markup Language (XML), said markup language document is an XML document, said set of markup language elements is a set of XML elements, and said another markup language document is another XML document.
15. The method of claim 13 wherein said markup language element hierarchy has a parent markup language element and wherein each of said plurality of references is a further markup language element having an attribute identifying said parent markup language element of said markup language element hierarchy.
16. The method of claim 13 wherein each of said markup language elements of said set is representative of a wireless communication device action and wherein said markup language element hierarchy has a parent markup language element representative of an aggregation of wireless communication device actions.
17. The method of claim 16 wherein said wireless communication device action is any of displaying a message box, sending a markup language message, terminating an application, closing a graphical user interface (GUI) screen, executing a conditional if-then-else expression, playing a notification sound, opening a specified GUI screen, purging a buffer, refreshing a displayed GUI screen or saving a value to the buffer.
18. The method of claim 13 wherein each of said markup language elements of said set is an instance of the same markup language element.
US11/345,326 2006-02-02 2006-02-02 Apparatus, method and machine-readable medium for facilitating generation of a markup language document containing identical sets of markup language elements Active 2030-08-27 US8046679B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US11/345,326 US8046679B2 (en) 2006-02-02 2006-02-02 Apparatus, method and machine-readable medium for facilitating generation of a markup language document containing identical sets of markup language elements
EP06101233A EP1816573A1 (en) 2006-02-02 2006-02-02 Apparatus, method and machine-readable medium for facilitating generation of a markup language document containing identical sets of markup language elements
CA002576697A CA2576697A1 (en) 2006-02-02 2007-02-01 Apparatus, method and machine-readable medium for facilitating generation of a markup language document containing identical sets of markup language elements

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/345,326 US8046679B2 (en) 2006-02-02 2006-02-02 Apparatus, method and machine-readable medium for facilitating generation of a markup language document containing identical sets of markup language elements
EP06101233A EP1816573A1 (en) 2006-02-02 2006-02-02 Apparatus, method and machine-readable medium for facilitating generation of a markup language document containing identical sets of markup language elements

Publications (2)

Publication Number Publication Date
US20070180360A1 true US20070180360A1 (en) 2007-08-02
US8046679B2 US8046679B2 (en) 2011-10-25

Family

ID=42732728

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/345,326 Active 2030-08-27 US8046679B2 (en) 2006-02-02 2006-02-02 Apparatus, method and machine-readable medium for facilitating generation of a markup language document containing identical sets of markup language elements

Country Status (3)

Country Link
US (1) US8046679B2 (en)
EP (1) EP1816573A1 (en)
CA (1) CA2576697A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8819567B2 (en) 2011-09-13 2014-08-26 Apple Inc. Defining and editing user interface behaviors
US9164576B2 (en) 2011-09-13 2015-10-20 Apple Inc. Conformance protocol for heterogeneous abstractions for defining user interface behaviors
US20130063484A1 (en) * 2011-09-13 2013-03-14 Samir Gehani Merging User Interface Behaviors

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7546298B2 (en) 2001-01-09 2009-06-09 Nextair Corporation Software, devices and methods facilitating execution of server-side applications at mobile devices
GB2414820A (en) * 2004-03-04 2005-12-07 Sendo Int Ltd A method for retrieving data embedded in a textual data file

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6336124B1 (en) * 1998-10-01 2002-01-01 Bcl Computers, Inc. Conversion data representing a document to other formats for manipulation and display
US20040153968A1 (en) * 2002-10-24 2004-08-05 Jennie Ching Method and system for user customizable asset metadata generation in a web-based asset management system
US20070276646A1 (en) * 2004-06-01 2007-11-29 Nikil Dutt Retargetable Instruction Set Simulators
US20070044012A1 (en) * 2005-08-19 2007-02-22 Microsoft Corporation Encoding of markup language data

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9239673B2 (en) 1998-01-26 2016-01-19 Apple Inc. Gesturing with a multipoint sensing device
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US9606668B2 (en) 2002-02-07 2017-03-28 Apple Inc. Mode-based graphical user interfaces for touch sensitive input devices
US9239677B2 (en) 2004-05-06 2016-01-19 Apple Inc. Operation of a computer with touch screen interface
US8612856B2 (en) 2004-07-30 2013-12-17 Apple Inc. Proximity detector in handheld device
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
US9348458B2 (en) 2004-07-30 2016-05-24 Apple Inc. Gestures for touch sensitive input devices
US8239784B2 (en) * 2004-07-30 2012-08-07 Apple Inc. Mode-based graphical user interfaces for touch sensitive input devices
US10042418B2 (en) 2004-07-30 2018-08-07 Apple Inc. Proximity detector in handheld device
US11036282B2 (en) 2004-07-30 2021-06-15 Apple Inc. Proximity detector in handheld device
US9524073B1 (en) * 2013-05-13 2016-12-20 Google Inc. Triggering action on a web page
CN112527297A (en) * 2020-12-23 2021-03-19 北京飞漫软件技术有限公司 Data processing method, device, equipment and storage medium
US11893401B1 (en) * 2022-12-06 2024-02-06 Capital One Services, Llc Real-time event status via an enhanced graphical user interface

Also Published As

Publication number Publication date
CA2576697A1 (en) 2007-08-02
US8046679B2 (en) 2011-10-25
EP1816573A9 (en) 2008-12-24
EP1816573A1 (en) 2007-08-08

Similar Documents

Publication Publication Date Title
US8046679B2 (en) Apparatus, method and machine-readable medium for facilitating generation of a markup language document containing identical sets of markup language elements
US7890853B2 (en) Apparatus and machine-readable medium for generating markup language representing a derived entity which extends or overrides attributes of a base entity
US7913234B2 (en) Execution of textually-defined instructions at a wireless communication device
US20070288853A1 (en) Software, methods and apparatus facilitating presentation of a wireless communication device user interface with multi-language support
US7941450B2 (en) Software, devices and methods facilitating execution of server-side applications at mobile devices
US7865528B2 (en) Software, devices and methods facilitating execution of server-side applications at mobile devices
US6961750B1 (en) Server-side control objects for processing client-side user interface elements
US7917888B2 (en) System and method for building multi-modal and multi-channel applications
US20070300237A1 (en) Facilitating access to application data at an application server by a wireless communication device
US20060190569A1 (en) Facilitating mobile device awareness of the availability of new or updated server-side applications
US8224951B2 (en) Determining operational status of a mobile device capable of executing server-side applications
US7533114B2 (en) Mobile device having extensible software for presenting server-side applications, software and methods
CA2578178C (en) Apparatus and machine-readable medium for generating markup language representing a derived entity which extends or overrides attributes of a base entity
US7441228B2 (en) Design-time representation for a first run-time environment with converting and executing applications for a second design-time environment
McClanahan et al. JavaServer™ Faces Specification
CA2592399C (en) Facilitating access to application data at an application server by a wireless communication device
CA2578177C (en) Execution of textually-defined instructions at a wireless communication device
EP1865422A1 (en) Software, methods and apparatus facilitating presentation of a wireless communication device user interface with multi-language support
EP1865423B1 (en) Software and Device for refreshing Markup Language-Based Database Queries Independently from User Interface Screens

Legal Events

Date Code Title Description
AS Assignment

Owner name: NEXTAIR CORPORATION, CANADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NEIL, TIM;REEL/FRAME:017542/0815

Effective date: 20060131

AS Assignment

Owner name: RESEARCH IN MOTION LIMITED, CANADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NEXTAIR CORPORATION;REEL/FRAME:026430/0390

Effective date: 20110610

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: BLACKBERRY LIMITED, ONTARIO

Free format text: CHANGE OF NAME;ASSIGNOR:RESEARCH IN MOTION LIMITED;REEL/FRAME:034179/0923

Effective date: 20130709

FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 12

AS Assignment

Owner name: MALIKIE INNOVATIONS LIMITED, IRELAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BLACKBERRY LIMITED;REEL/FRAME:064104/0103

Effective date: 20230511

AS Assignment

Owner name: MALIKIE INNOVATIONS LIMITED, IRELAND

Free format text: NUNC PRO TUNC ASSIGNMENT;ASSIGNOR:BLACKBERRY LIMITED;REEL/FRAME:064269/0001

Effective date: 20230511