WO2014147867A1 - Image processing apparatus and program - Google Patents
Image processing apparatus and program (画像処理装置及びプログラム)
- Publication number
- WO2014147867A1 (PCT/JP2013/074195)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- area
- image
- position information
- unit
- information
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
- G06T2207/20104—Interactive definition of region of interest [ROI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20112—Image segmentation details
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/56—Extraction of image or video features relating to colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/62—Text, e.g. of license plates, overlay texts or captions on TV images
- G06V20/63—Scene text, e.g. street names
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/40—Document-oriented image-based pattern recognition
- G06V30/41—Analysis of document content
- G06V30/413—Classification of content, e.g. text, photographs or tables
Definitions
- the present invention relates to an image processing apparatus and program.
- in a graphical user interface (GUI), a known method of selecting an icon is to click (tap) the position of the icon to be selected.
- a method is also known of selecting a plurality of icons by dragging out a rectangular range that overlaps the icons to be selected.
- Such an icon selection method employed in the GUI is also applicable to the image processing apparatus.
- for an object such as a pattern, a figure, or a symbol represented in an image to be processed by the image processing apparatus, it is possible to select an object whose position overlaps the position indicated by the user.
- An object of the present invention is to provide an image processing apparatus and program capable of selecting even an object whose position does not overlap with the position indicated by the user.
- the invention according to claim 1 is an image processing apparatus characterized by including: indicated position information acquisition means for acquiring indicated position information including at least one indicated position of a user in an image in which at least one object is represented; reference position information acquisition means for acquiring reference position information including the position of the object in the image; division means for dividing a processing target area of the image based on the reference position information and generating a plurality of divided areas; divided area specifying means for specifying at least one divided area based on the plurality of divided areas and the indicated position information; and object determination means for determining whether the object is selected based on the specified divided area.
- the invention according to claim 2 is the image processing apparatus according to claim 1, characterized in that the object determination means determines that the object is selected based on the position of the object or the area of the object.
- the invention according to claim 3 is the image processing apparatus according to claim 1 or 2, wherein the reference position information includes at least one position indicating a boundary of the processing target area of the image, the object determination means includes area determination means for determining whether or not the divided area specified by the divided area specifying means is related to the object, and the object determination means determines that the object is not selected when the specified divided area is a divided area partitioned based on a position indicating the boundary of the processing target area of the image and is determined not to be related to the object.
- the invention according to claim 4 is the image processing apparatus according to any one of claims 1 to 3, characterized in that the division means acquires the positions of generating points in the image based on the reference position information, divides the processing target area of the image based on a Voronoi boundary defined from the positions of the generating points, and generates the plurality of divided areas.
- the invention according to claim 5 is the image processing apparatus according to claim 4, characterized in that the division means divides the processing target area of the image based on the Voronoi boundary and on a Delaunay boundary defined from the positions of the generating points, and generates the plurality of divided areas.
- the invention according to claim 6 is the image processing apparatus according to any one of claims 1 to 5, wherein the indicated position information includes information indicating an indicated line segment of the user or an indicated area of the user, and the divided area specifying means specifies at least one divided area based on the information indicating the user's indicated line segment or indicated area.
- the invention according to claim 7 is the image processing apparatus according to any one of claims 1 to 6, wherein the image is an image in which at least one object and at least one marker are represented, and the apparatus is characterized by including means for acquiring a significant pixel area represented in the image, identification means for identifying the significant pixel area as either the object or the marker, and means for acquiring the indicated position information based on the marker in the image.
- the invention according to claim 8 is the image processing apparatus according to any one of claims 1 to 7, wherein the indicated position information includes instruction processing information indicating processing that the user instructs to be performed on the object; the divided area specifying means includes first instruction processing information acquisition means for acquiring the instruction processing information related to the indicated position information used in the specification and associating it with the specified divided area; the object determination means includes second instruction processing information acquisition means for acquiring the instruction processing information related to the specified divided area used in determining that the object is selected and associating it with that object; and the apparatus further includes object processing means for executing, on an object determined to be selected, the processing indicated in the instruction processing information associated with that object.
- the invention according to claim 9 is a program that causes a computer to function as indicated position information acquisition means for acquiring indicated position information including at least one indicated position of a user in an image in which at least one object is represented, and reference position information acquisition means for acquiring reference position information including the position of the object in the image.
- according to the second aspect of the present invention, it is possible to determine that an object whose position or area does not overlap the user's indicated position is selected.
- it can also be determined, according to the position indicated by the user, that the object is not selected.
- processing associated with the indicated position information can be executed.
- FIG. 1 shows an example of an image processing apparatus according to an embodiment of the present invention.
- the image processing apparatus 100 includes a control unit 110, a storage unit 120, an operation unit 130, and a display unit 140.
- the control unit 110 is configured to include a CPU and the like. Control unit 110 operates in accordance with a program stored in storage unit 120.
- the storage unit 120 is configured to include a RAM, a ROM, and the like.
- the storage unit 120 stores a program executed by the control unit 110.
- the storage unit 120 also functions as a work memory of the control unit 110.
- the storage unit 120 may store image data to be processed by the image processing apparatus in advance.
- the storage unit 120 may store various data in advance, such as image data other than an image to be processed which is used by the image processing apparatus.
- the storage unit 120 may be any computer-readable information storage medium and may include, for example, a hard disk. In the present embodiment the case where the program executed by the control unit 110 is stored in the storage unit 120 will be described, but the program may instead be provided via a communication network such as the Internet, or stored on and provided from various computer-readable information storage media such as a CD-ROM or a DVD-ROM.
- the operation unit 130 is an input device for receiving an operation by a user, and for example, a keyboard, a mouse, an operation button, and the like are used.
- the operation content received by the operation unit 130 is output to the control unit 110.
- the display unit 140 is, for example, a display, a projector, or the like.
- the display unit 140 executes output processing, such as displaying an image on its screen or projecting an image onto a screen, in accordance with instructions from the control unit 110 and the various known functions of the display unit 140.
- as the operation unit 130, a touch panel may also be used, and various other known input devices can be applied.
- as shown in FIG. 1, the control unit 110 functionally includes an image reception unit 111, a significant pixel area acquisition unit 112, a marker/object identification unit 113, an indicated position information acquisition unit 114, a reference position information acquisition unit 115, an area dividing unit 116, a divided area specifying unit 117, and an object determination unit 118.
- These functions are realized, for example, by the control unit 110 executing a program stored in the storage unit 120.
- the details of these functions will be described with reference to the drawings.
- the image receiving unit 111 receives a target image to be processed by the image processing apparatus 100 according to the present invention.
- the image receiving unit 111 receives a target image stored in the storage unit 120 by receiving a request from a user, a given application, or the like.
- the image accepting unit 111 may accept a target image from a given device (a mobile phone, digital camera, scanner, or the like) connected to the image processing apparatus 100, or may download and accept the target image from a device connected via a network.
- FIG. 2 shows an example of an image to be processed.
- the image 200 of the present embodiment is, for example, image data obtained by imaging a whiteboard (recording medium).
- the image 200 represents, as objects 201, characters, patterns, figures, symbols, and the like drawn on the surface of the whiteboard in a color different from the background color (here, a color outside the color gamut of the whiteboard surface), together with a marker 202 such as a sticky note placed in the vicinity of an object 201 for the purpose of pointing at that object 201.
- the marker 202 represented in the image 200 may be anything having a predetermined image feature (for example, an image feature detectable by pattern matching); besides the sticky note it may be, for example, a magnet, or a specific symbol, figure, or the like.
- the image 200 is not limited to an image obtained by imaging such a whiteboard.
- the image 200 may be an image in which the object 201 or the marker 202 is displayed on the background color, and may be, for example, an image created from a given application.
- the image 200 is not limited to the image data stored in the storage unit 120.
- Image data generated by the image generation unit 150 included in the image processing apparatus 100 may be a processing target.
- the image receiving unit 111 may receive the image data generated by the image generating unit 150 as the image 200.
- the display area of the display unit 140 may be set as the area to be processed by the image processing apparatus 100 (hereinafter referred to as the processing target area), and the entire area of the displayed image may be acquired as the image 200.
- the image receiving unit 111 may obtain, as the image 200, an image represented in an area on the screen designated by the user with a pointing device (mouse, touch panel or the like).
- the significant pixel area acquiring unit 112 acquires the significant pixel area represented in the image 200.
- the significant pixel area acquiring unit 112 acquires a plurality of significant pixel areas in accordance with a predetermined reference based on the image 200 acquired by the image receiving unit 111.
- the significant pixel area is an area in the image 200 that includes a color different from the background color. That is, each significant pixel area includes either the object 201 or the marker 202.
- FIG. 3 shows an example of the significant pixel area.
- the significant pixel area acquiring unit 112 acquires the significant pixel area 203 from the object 201 and the marker 202 represented in the image 200.
- the significant pixel region acquiring unit 112 acquires a continuous pixel group including pixels in which colors different from the background color are continuous.
- as a method of acquiring such continuous pixel groups, a method widely known as labeling processing can be used.
- the significant pixel region acquiring unit 112 defines a basic rectangle circumscribing each acquired continuous pixel group and, by evaluating the area of each basic rectangle, the density of background-color pixels within it, the distances between basic rectangles, and the like, defines significant pixel areas 203 in which a plurality of basic rectangles are integrated.
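the labeling process and bounding-rectangle step above can be sketched as follows; the function name `significant_pixel_areas`, the binary-grid encoding, and the choice of 4-connectivity are illustrative assumptions, not the patent's implementation (the rectangle-integration heuristics are omitted):

```python
from collections import deque

def significant_pixel_areas(grid):
    """Label 4-connected groups of non-background (non-zero) pixels and
    return the bounding rectangle (min_x, min_y, max_x, max_y) of each
    continuous pixel group, as in a standard labeling process."""
    h, w = len(grid), len(grid[0])
    seen = [[False] * w for _ in range(h)]
    rects = []
    for y in range(h):
        for x in range(w):
            if grid[y][x] and not seen[y][x]:
                # Flood-fill one continuous pixel group.
                q = deque([(x, y)])
                seen[y][x] = True
                min_x = max_x = x
                min_y = max_y = y
                while q:
                    cx, cy = q.popleft()
                    min_x, max_x = min(min_x, cx), max(max_x, cx)
                    min_y, max_y = min(min_y, cy), max(max_y, cy)
                    for nx, ny in ((cx + 1, cy), (cx - 1, cy),
                                   (cx, cy + 1), (cx, cy - 1)):
                        if (0 <= nx < w and 0 <= ny < h
                                and grid[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            q.append((nx, ny))
                rects.append((min_x, min_y, max_x, max_y))
    return rects

# 0 = background colour, 1 = pixel whose colour differs from the background
image = [
    [0, 1, 1, 0, 0, 0],
    [0, 1, 0, 0, 1, 1],
    [0, 0, 0, 0, 1, 1],
]
print(significant_pixel_areas(image))  # [(1, 0, 2, 1), (4, 1, 5, 2)]
```

in a real implementation the basic rectangles returned here would then be merged according to the area, density, and distance criteria described above.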
- the significant pixel area acquisition unit 112 acquires significant pixel area information from each significant pixel area 203 thus defined, and stores the information in the storage unit 120.
- the significant pixel area information includes, for example, the boundary position (the position of each vertex) of the significant pixel area 203 and the representative position (the position of the center of gravity) of the significant pixel area 203.
- in the present embodiment, the significant pixel area acquisition unit 112 acquires each position coordinate related to the significant pixel area 203 as significant pixel area information and stores it in the storage unit 120.
- the significant pixel area acquiring unit 112 may instead acquire, for example, the position coordinates of the upper-left vertex together with the extents from that vertex in the X and Y coordinate axis directions.
- the representative position of the significant pixel area 203 is not limited to the barycentric position of the area, but it is desirable that it be information indicating the central position of the area.
- the significant pixel regions 203 are all shown as rectangles in FIG. 3, they may be shown as other polygons or circles. Further, the boundary position of the significant pixel area 203 in this case is not limited to the position coordinate of each vertex, and may include, for example, a plurality of position coordinates indicating the boundary of the area.
- the significant pixel area acquiring unit 112 may further divide one significant pixel area 203 into a plurality of parts based on the features (such as the gradation and the number of colors) of the image portion included in each significant pixel area. Also, a plurality of significant pixel areas 203 may be integrated into one. In addition, the two significant pixel areas 203 may be at positions overlapping each other.
- the marker / object identification unit 113 identifies the significant pixel area 203 as either the object 201 or the marker 202. For example, the marker / object identification unit 113 identifies significant pixel area information as either the area of the object 201 or the area of the marker 202 according to a predetermined standard.
- FIG. 4 shows an example of the area of the object 201 and the area of the marker 202.
- the area 205 of the marker is an area including the marker 202 in the image 200.
- the area 204 of the object is an area including the object 201 in the image 200.
- an example in which the marker/object identification unit 113 identifies the significant pixel area 203 will now be described.
- the marker/object identification unit 113 performs pattern matching between the image represented in the significant pixel area 203 and a reference image acquired in advance and stored in the storage unit 120; if the image in the significant pixel area 203 is similar to the reference image, the significant pixel area 203 is identified as a marker area 205, and otherwise as an object area 204.
- the method of identifying the significant pixel area 203 by the marker / object identification unit 113 is not limited to that by pattern matching, and may be compared with, for example, an image feature set by the user.
- the marker / object identification unit 113 may identify the image in the significant pixel area 203 based on the information such as the shape and color of the marker accepted by the user's operation.
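a minimal sketch of this kind of identification, substituting a simple colour comparison for full pattern matching; the function names, the per-area "dominant colour" field, and the tolerance value are all hypothetical stand-ins for the user-supplied marker features described above:

```python
def identify_areas(areas, marker_colour, tolerance=60):
    """Classify each significant pixel area as a marker area or an object
    area by comparing its dominant colour with a user-supplied marker
    colour (a stand-in for the pattern matching described in the text)."""
    def close(c1, c2):
        # Per-channel comparison against the tolerance.
        return all(abs(a - b) <= tolerance for a, b in zip(c1, c2))
    return ["marker" if close(a["dominant_colour"], marker_colour) else "object"
            for a in areas]

areas = [
    {"dominant_colour": (20, 20, 200)},   # blue sticky-note marker
    {"dominant_colour": (10, 10, 10)},    # black pen stroke (an object)
]
print(identify_areas(areas, marker_colour=(0, 0, 255)))  # ['marker', 'object']
```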
- the indicated position information acquisition unit 114 acquires indicated position information including at least one indicated position of the user. For example, it acquires the indicated position information based on the markers 202 in the image 200; specifically, it acquires the representative position of the marker area 205 as indicated position information and stores it in the storage unit 120.
- the pointing position information may include information indicating a pointing segment of the user or a pointing region of the user.
- the information indicating the user's designated line segment is, for example, information indicating a line segment which passes through the representative position of the region 205 of the marker and is divided based on the boundary position of the region 205 of the marker.
- the information indicating the user's designated area is, for example, information indicating an area surrounded by the boundary position of the marker area 205, and the boundary position itself is also included.
- the pointing position information includes position coordinates of each vertex as a boundary position in the area 205 of the marker. That is, the pointing position information includes the pointing area of the user.
- the indicated position information acquiring unit 114 may also acquire the indicated position information based on a user operation performed via the operation unit 130.
- the pointing position information acquisition unit 114 may acquire pointing position information based on a position, a line segment, or an area on a screen designated by the user with a pointing device (mouse, touch panel or the like).
- the pointed position information acquiring unit 114 may acquire each coordinate value included in the pointed position information from a value input by the user using a keyboard or the like.
- the indicated position information acquisition unit 114 may also acquire the indicated position information by detecting, with a known technique, the position indicated by a human finger or the like on an image projected by a projector.
- FIG. 5 shows an example of the area 206 indicated by the pointing device.
- the indicated position information further includes the indicated area of the user accepted in this manner.
- the image processing apparatus 100 may display an image indicating the thus-acquired pointing position information on the display unit 140 so as to be superimposed on the image 200 so that the user can visually recognize the image.
- the reference position information acquisition unit 115 acquires reference position information including the position of the object 201 in the image 200.
- the reference position information acquisition unit 115 acquires a representative position of the area 204 of the object (that is, the position of the object 201 in the image 200) as reference position information, and stores the acquired position in the storage unit 120.
- the reference position information may include at least one position indicating the boundary of the processing target area of the image 200.
- for example, the reference position information acquisition unit 115 may acquire the vertex coordinates of the processing target area of the image 200 as positions indicating its boundary and include them in the reference position information.
- the reference position information includes the representative position of the area 204 of the object and the position of each vertex in the processing target area of the image 200.
- the reference position information acquisition unit 115 acquires the position of the object 201 and the area of the object 201.
- the reference position information acquisition unit 115 acquires the representative position and the boundary position of the area 204 of the object as the object position information.
- the object position information includes the representative position and the boundary position of the area 204 of the object.
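the reference position information described above (each object's representative position plus the vertices of the processing target area) might be assembled as in the following sketch; the function name and the rectangle encoding are assumptions:

```python
def reference_positions(object_areas, width, height):
    """Generating points for the division step: each object's
    representative position (here, the centre of its bounding rectangle)
    plus the four vertices of the processing target area."""
    centres = [((x0 + x1) / 2, (y0 + y1) / 2)
               for x0, y0, x1, y1 in object_areas]
    corners = [(0, 0), (width - 1, 0), (0, height - 1), (width - 1, height - 1)]
    return centres + corners

print(reference_positions([(0, 0, 4, 2)], 10, 10))
# [(2.0, 1.0), (0, 0), (9, 0), (0, 9), (9, 9)]
```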
- the area dividing unit 116 divides the processing target area of the image 200 based on the reference position information, and generates a plurality of divided areas.
- the area dividing unit 116 acquires the reference position information stored in the storage unit 120, and sets a plurality of positions included in the reference position information as the positions of the generating points.
- the region dividing unit 116 defines Voronoi boundaries and Delaunay boundaries in the processing target region of the image 200 based on the positions of the generating points.
- the area division unit 116 acquires division area information from a plurality of division areas including the defined boundary lines, and stores the division area information in the storage unit 120.
- the Voronoi boundary is, for example, a boundary line in a Voronoi diagram, which divides every position on the image 200 according to which generating point it is closest to.
- such a Voronoi boundary consists of parts of the perpendicular bisectors between pairs of generating points, and each resulting region is characterized by being a polygon.
- the Delaunay boundary is a boundary line formed by connecting the generating points of divided regions that are adjacent to each other across the previously generated Voronoi boundary. The Delaunay boundary formed in this manner is characterized by forming triangles except in special cases.
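a Voronoi partition of the kind described can be approximated by assigning every pixel of the processing target area to its nearest generating point; borders between differently labelled pixels then trace the Voronoi boundary. This grid-based sketch (names assumed) omits the Delaunay step:

```python
def voronoi_labels(width, height, sites):
    """Label every pixel of the processing target area with the index of
    its nearest generating point (site); boundaries between differently
    labelled pixels approximate the Voronoi boundary."""
    def nearest(x, y):
        # Squared Euclidean distance to each generating point.
        return min(range(len(sites)),
                   key=lambda i: (sites[i][0] - x) ** 2 + (sites[i][1] - y) ** 2)
    return [[nearest(x, y) for x in range(width)] for y in range(height)]

# Two generating points; every pixel left of x == 3 is closer to site 0,
# so the Voronoi boundary runs vertically between x == 2 and x == 3.
labels = voronoi_labels(6, 3, [(1, 1), (4, 1)])
print(labels[0])  # [0, 0, 0, 1, 1, 1]
```

a production implementation would instead compute the exact polygonal boundaries (and the Delaunay triangulation) with a computational-geometry library.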
- FIG. 6 shows an example of a Voronoi boundary, a Delaunay boundary, and a divided region 209 composed of these boundaries.
- the region dividing unit 116 defines each region partitioned by the Voronoi boundaries 207 (broken lines in the figure) and the Delaunay boundaries 208 (dashed-dotted lines in the figure) as a divided region 209, and acquires a plurality of pieces of divided area information.
- a divided area 209 different from the area 204 of the object is generated, and such divided area information is acquired.
- the image processing apparatus 100 may display the image indicating the section area acquired in this manner on the display unit 140 so as to be superimposed on the image 200 so that the user can visually recognize it. Further, the image processing apparatus 100 may be configured to obtain indication position information by waiting for the user's operation from the indication position information acquisition unit 114 after visual recognition by the user as described above.
- the divided area specifying unit 117 specifies at least one divided area based on the plurality of divided areas 209 and the indicated position information. For example, the divided area specifying unit 117 determines, based on the indicated position information, whether each piece of divided area information is specified.
- FIG. 7 shows an example of the identified partitioned area 211.
- the divided area specifying unit 117 specifies divided areas 211 from the area based on the indicated position information (hereinafter referred to as the indicated area 210).
- the divided area 211A including the indicated area 210A is specified.
- a plurality of divided areas 211B, each overlapping a part of the indicated area 210B, are specified.
- the divided area specifying unit 117 acquires the plurality of pieces of divided area information and the indicated position information stored in the storage unit 120, and determines whether at least part of the divided area 209 obtained from each piece of divided area information overlaps the indicated area 210 obtained from the indicated position information. If at least part of the areas overlap, the divided area specifying unit 117 specifies that divided area 209, assigns "1" to the specification result of its divided area information, and stores the result in the storage unit 120; if the areas do not overlap, the divided area 209 is not specified, "0" is assigned to the specification result, and the result is stored in the storage unit 120.
- alternatively, for example, only the one divided area containing the largest portion of the indicated area 210B may be specified.
- a segment area 209 in which at least a portion overlaps with a designated line segment obtained from the designated position information may be specified.
- the divided area 209 overlapping with the designated position obtained from the designated position information may be specified.
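the overlap test and the "1"/"0" specification results described above can be sketched as follows, approximating each divided area and the indicated area by axis-aligned rectangles (function names and the rectangle encoding are assumptions):

```python
def overlaps(a, b):
    """True if rectangles a and b, each (min_x, min_y, max_x, max_y),
    share at least part of their area."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def specify_divided_areas(divided_areas, indicated_area):
    """Assign 1 to each divided area that overlaps the indicated area,
    and 0 otherwise, mirroring the stored specification results."""
    return [1 if overlaps(d, indicated_area) else 0 for d in divided_areas]

divided = [(0, 0, 4, 4), (5, 0, 9, 4), (0, 5, 9, 9)]
# The indicated area straddles the first two divided areas only.
print(specify_divided_areas(divided, (3, 3, 6, 4)))  # [1, 1, 0]
```

real divided regions are polygons (and Delaunay triangles), so a production version would use a polygon-intersection test instead of the rectangle check.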
- the object determining unit 118 determines whether the object 201 is selected based on the identified partitioned area 211. For example, the object determination unit 118 determines whether or not the area 204 (object 201) of at least one object is selected based on the object position information and the division area information.
- FIG. 8 shows an example of the area 212 of the selected object.
- the object determination unit 118 determines that the object area 212A overlapping the specified divided area 211A, and hence its object 201, has been selected.
- the object determination unit 118 determines that the object is not selected when the specified divided area 211 is a divided area partitioned based on a position indicating the boundary of the processing target area of the image 200 and is determined not to be related to the object 201. For example, although not shown in the drawing, when the divided area 211 specified from the indicated position information does not overlap any object area 204, it is determined that neither the object area 204 nor the object 201 is selected.
- the object determination unit 118 acquires the object position information, the indicated position information, and the divided area information including the specification results, which are stored in the storage unit 120. It then determines whether the object area 204 based on the object position information overlaps at least part of the indicated area 210 obtained from the indicated position information, or at least part of a divided area 211 whose specification result is "1". When at least part of the areas overlap, the object determination unit 118 determines that the object area 204 (the object 201) is selected and stores "1" as the object selection result in the storage unit 120; when the areas do not overlap, it determines that the object area 204 (the object 201) is not selected and stores "0" as the object selection result in the storage unit 120.
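the selection decision above — direct overlap with the indicated area, or overlap with a specified divided area — can be sketched as follows, again with rectangles standing in for the actual regions (names are assumptions):

```python
def rect_overlap(a, b):
    """True if rectangles a and b (min_x, min_y, max_x, max_y) overlap."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def object_selected(object_area, indicated_area, specified_divided_areas):
    """Return 1 (selected) when the object area overlaps the user's
    indicated area directly, or overlaps any specified divided area;
    otherwise return 0, as in the stored object selection result."""
    if rect_overlap(object_area, indicated_area):
        return 1
    if any(rect_overlap(object_area, d) for d in specified_divided_areas):
        return 1
    return 0

# The object does not touch the indicated area but shares a specified
# divided area with it, so it is still determined to be selected.
print(object_selected((0, 0, 2, 2), (8, 8, 9, 9), [(0, 0, 5, 5)]))  # 1
```

this is exactly what allows an object whose position does not overlap the indicated position to be selected: the specified divided area bridges the two.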
- FIG. 9 and FIG. 10 are flowcharts showing processing executed by the image processing apparatus 100 in the present embodiment. The processing executed by the image processing apparatus 100 will be described below with reference to this flowchart.
- the image receiving unit 111 receives and acquires the image 200 from the user's operation, a request from a given application, and the like (S101).
- the significant pixel area acquiring unit 112 acquires a plurality of significant pixel areas 203 in accordance with a predetermined reference based on the image 200 acquired in S101 (S102).
- the marker / object identifying unit 113 identifies the plurality of significant pixel areas 203 acquired in S102 as either the area 204 of the object or the area 205 of the marker according to a predetermined standard (S103).
- the reference position information acquisition unit 115 acquires object position information from the object areas 204 identified in S103 (S104). Furthermore, the reference position information acquisition unit 115 acquires reference position information from the representative positions of the object areas 204 and the positions on the boundary of the image 200 (S105). Subsequently, the area division unit 116 acquires the positions of the plurality of generating points from the reference position information, divides the processing target area of the image 200 based on each generating point, and generates a plurality of divided areas 209 (S106).
- the pointing position information acquisition unit 114 acquires pointing position information based on the area 205 of the marker identified in S103 and the setting operation of the user (S107).
- the divided area specifying unit 117 specifies, from the plurality of divided areas 209 generated in S106, the divided areas 211 indicated by the indicated position information acquired in S107 (S108).
- the object determining unit 118 determines whether the object 201 is selected based on the designated position information acquired in S107 (S109). If it is determined that the object is selected (Y in S109), “1” is set in the object selection result and stored in the storage unit 120 (S112).
- If the object determining unit 118 determines that the object 201 is not selected (N in S109), it further determines whether the object 201 is selected based on the divided area 211 specified in S108 (S110). If it is determined that the object is selected (Y in S110), "1" is set in the object selection result (S112); if it is determined that the object is not selected (N in S110), "0" is set in the object selection result (S111), and the result is stored in the storage unit 120. Finally, the determination processing for the objects 201 (S109 to S112) is repeated (N in S113) until all objects 201 have been determined (Y in S113).
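The S109 to S113 loop just described can be sketched as follows, with invented rectangular areas (x0, y0, x1, y1): "direct" corresponds to the S109 test against the pointed position, and "indirect" to the S110 test against the specified divided area.

```python
# Hedged sketch of the S109-S113 loop: an object's selection result is
# "1" when the pointed position falls inside its own area (S109) or
# when its area touches the specified divided area (S110). Rectangles
# (x0, y0, x1, y1) and all coordinates are invented for illustration.
def overlaps(a, b):
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def selection_results(object_areas, specified_area, pointed):
    px, py = pointed
    results = []
    for x0, y0, x1, y1 in object_areas:                    # repeat until Y in S113
        direct = x0 <= px <= x1 and y0 <= py <= y1             # S109
        indirect = overlaps((x0, y0, x1, y1), specified_area)  # S110
        results.append(1 if direct or indirect else 0)         # S112 / S111
    return results

objects = [(0, 0, 3, 3), (10, 10, 12, 12)]
print(selection_results(objects, (0, 0, 5, 5), (8, 8)))  # [1, 0]
```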
- As described above, the divided areas 209 that partition the processing target area of the image 200 are generated, and at least one divided area 211 is specified based on the position pointed to by the user. This makes it possible to determine that an object 201 has been selected even when the pointed position does not overlap its object area 204.
- In the above embodiment, the object determining unit 118 determines whether an object 201 is selected based on the object area 204; however, the determination processing may instead be executed based on the position of the object.
- FIG. 11 shows an example of the position 213 of the selected object.
- The object determining unit 118 in the present modification determines that an object position 213 in contact with the specified divided area 211, and the object 201 corresponding to that position, have been selected.
- In this case, the area dividing unit 116 may include, in the divided area information, the reference position information that served as the basis when generating the divided areas 209.
- The object determining unit 118 then acquires the object position 213 from the reference position information added to the specified divided area information, and determines that the object position and the corresponding object 201 have been selected.
- It suffices that the image 200 represents at least one object 201; that is, the image 200 need not contain any marker 202.
- In this case, the image processing apparatus 100 need not include the marker/object identifying unit 113 described in the above embodiment.
- In this case, the image processing apparatus 100 treats all significant pixel areas acquired by the significant pixel area acquiring unit 112 as object areas 204, and the pointed position information acquiring unit 114 acquires the pointed position information based on the user's operation performed through the operation unit 130.
- The area dividing unit 116 may acquire the positions of the mother points based on the reference position information and generate the plurality of divided areas 209 from the Voronoi boundaries defined by those mother point positions, thereby acquiring the divided area information. That is, the area dividing unit 116 may generate the plurality of divided areas 209 without defining the Delaunay boundaries described in the above embodiment.
- FIG. 12 shows an example of Voronoi boundaries 207 (indicated by broken lines in the drawing) and the divided areas 209 generated from those boundaries in the present modification.
- Each divided area 209 shown in FIG. 12 differs from the divided areas 209 of the embodiment (those generated from both the Voronoi boundaries 207 and the Delaunay boundaries 208).
- FIG. 13 illustrates an example of a divided area 211 specified based on the user's pointed position and an area 212 of a selected object in the present modification.
- The specified divided area 211 shown in FIG. 13 is larger than those shown in FIGS. 7 and 8 in the description of the embodiment. Accordingly, with respect to the object areas 212 shown in each drawing, more object areas 212 are determined to be selected in the present modification.
- In other words, the object determining unit 118 in this modification may determine that more object areas 212 (objects 201) are selected than in the embodiment.
- According to modification (3), more objects 201 can be determined to be selected than in the embodiment. Conversely, the embodiment also makes it possible to determine that an object 201 whose position does not overlap the position pointed to by the user is selected, though to a lesser extent than modification (3).
- The image processing apparatus 100 may associate, with the pointed position information, instruction processing information indicating the user's instructed processing targeting the objects 201, and may execute, on an object 201 determined to be selected, the image processing targeting the object area 204 indicated in that instruction processing information.
- FIG. 14 shows an example of the image processing apparatus according to the present modification.
- As shown in FIG. 14, the image processing apparatus 100 according to this modification further includes, in addition to the functions of the image processing apparatus 100 of the embodiment, an instruction processing information associating unit 301, a first instruction processing information acquiring unit 302, a second instruction processing information acquiring unit 303, and an object processing unit 304.
- These functions, like the functions of the embodiment, are realized, for example, by the control unit 110 executing a program stored in the storage unit 120.
- The details of the added functions are described below.
- The instruction processing information associating unit 301 associates, with the pointed position information, instruction processing information indicating the user's instructed processing targeting an object.
- The instruction processing information is, for example, information including the content of the image processing to be executed on the object area 204 in the image 200 (mask processing, encryption processing, distribution processing, and the like).
- The instruction processing information may also include, for example, information indicating the priority of predetermined processing (such as the order of rendering processing when displaying the image 200).
- The instruction processing information associating unit 301 receives, for example, the user's operation performed through the operation unit 130, and associates instruction processing information with each piece of pointed position information.
- For example, the image processing apparatus 100 causes the display unit 140 to display an image indicating the pointed position information together with a list indicating the contents of the instruction processing information.
- The instruction processing information associating unit 301 then receives the user's operation of selecting pointed position information and instruction processing information, and associates the selected instruction processing information with the selected pointed position information.
- Alternatively, the instruction processing information associating unit 301 may acquire the instruction processing information associated in advance with the reference image or image feature used as the criterion when identifying the marker areas 205, and associate it with the pointed position information.
- When associating the instruction processing information with the pointed position information, the instruction processing information associating unit 301 may further associate, for example, information indicating the priority of each instructed process (hereinafter referred to as processing priority information).
- The first instruction processing information acquiring unit 302 acquires the instruction processing information associated with the pointed position information used to specify the divided area 211, as the instruction processing information related to that specified divided area 211.
- For example, when the divided area specifying unit 117 specifies a divided area 211, it acquires the instruction processing information associated with the pointed position information that served as the basis for the specification, as the instruction processing information related to that divided area 211.
- A divided area 211 may be specified based on a plurality of pieces of pointed position information. In such a case, the first instruction processing information acquiring unit 302 may, for example, acquire the instruction processing information for all the pieces of pointed position information that served as the basis, may acquire only the instruction processing information with the highest priority, or may acquire the instruction processing information associated with the pointed position information of the largest of the pointed areas 210 indicated in the respective pieces of pointed position information.
- The second instruction processing information acquiring unit 303 acquires, as the instruction processing information related to the object, the instruction processing information related to the specified divided area used in the determination of the object 201 determined to be selected. For example, when the object determining unit 118 determines that an object area 212 and object 201 have been selected, the instruction processing information related to the divided area 211 that served as the basis for the determination is acquired as the instruction processing information related to that object area 212 and object 201.
- Similarly, the second instruction processing information acquiring unit 303 may acquire the instruction processing information associated with the pointed position information that served as the basis for the determination, as the instruction processing information related to the object area 212 and the object 201.
- An object area 212 may be determined to be selected based on a plurality of pieces of instruction processing information or a plurality of specified divided areas 211. In such a case, the second instruction processing information acquiring unit 303 may acquire all the instruction processing information that served as the basis for the determination, may acquire only the instruction processing information with the highest priority, or may acquire the instruction processing information associated with the area covering the largest area.
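The three alternatives just listed (take everything, take only the highest priority, take the one tied to the largest area) can be sketched with an invented record format; the field names "op", "priority", and "area" are hypothetical, not from the patent.

```python
# Hedged sketch: resolving several applicable pieces of instruction
# processing information. The record fields are invented names.
def resolve(infos, strategy="all"):
    if strategy == "all":
        return [i["op"] for i in infos]
    if strategy == "priority":           # processing priority information
        return [max(infos, key=lambda i: i["priority"])["op"]]
    if strategy == "largest_area":       # largest pointed/divided area
        return [max(infos, key=lambda i: i["area"])["op"]]
    raise ValueError(strategy)

infos = [{"op": "mask", "priority": 2, "area": 40},
         {"op": "encrypt", "priority": 5, "area": 15}]
print(resolve(infos, "priority"))      # ['encrypt']
print(resolve(infos, "largest_area"))  # ['mask']
```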
- The object processing unit 304 executes, on an object 201 determined to be selected, the processing indicated in the instruction processing information related to that object 201. For example, based on the instruction processing information related to the object 201 acquired by the second instruction processing information acquiring unit 303, the object processing unit 304 executes the processing indicated in the instruction processing information (image processing targeting the object area 204).
- The instruction processing information associating unit 301 associates, with the pointed position information acquired in S107, instruction processing information indicating the user's instructed processing targeting the objects 201 (S107-2). Then, the first instruction processing information acquiring unit 302 acquires the instruction processing information related to the divided area 211 specified in S108 (S108-2). Furthermore, the second instruction processing information acquiring unit 303 acquires the instruction processing information related to the object 201 determined to be selected in S109 or S110 (S112-2).
- Finally, the object processing unit 304 executes the processing indicated in the instruction processing information (image processing targeting the object area 204) based on the instruction processing information related to the object 201 acquired in S112-2 (S114).
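A minimal sketch of S114, assuming an invented "mask" operation that blanks the pixels of the object area; the grid, the rectangle format, and the operation name are illustrative only, not the patent's actual processing.

```python
# Hedged sketch of S114: apply the processing named in the instruction
# processing information to an object area. Only an invented "mask"
# stand-in is implemented; it blanks the object's pixels.
def apply_instruction(grid, object_area, op):
    x0, y0, x1, y1 = object_area
    if op == "mask":                     # blank out the object's pixels
        for y in range(y0, y1 + 1):
            for x in range(x0, x1 + 1):
                grid[y][x] = 0
    return grid

grid = [[1] * 4 for _ in range(4)]
apply_instruction(grid, (1, 1, 2, 2), "mask")
print(grid[1][1], grid[0][0])  # 0 1: inside masked, outside untouched
```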
- 100 image processing apparatus, 110 control unit, 111 image receiving unit, 112 significant pixel area acquiring unit, 113 marker/object identifying unit, 114 pointed position information acquiring unit, 115 reference position information acquiring unit, 116 area dividing unit, 117 divided area specifying unit, 118 object determining unit, 120 storage unit, 130 operation unit, 140 display unit, 150 image generating unit, 200 target image, 201 object, 202 marker, 203 significant pixel area, 204 object area, 205 marker area, 206 pointing device indicated area, 207 Voronoi boundary, 208 Delaunay boundary, 209 divided area, 210, 210A, 210B indicated area, 211, 211A, 211B specified divided area, 212, 212A, 212B selected object area, 213 selected object position, 301 instruction processing information associating unit, 302 first instruction processing information acquiring unit, 303 second instruction processing information acquiring unit, 304 object processing unit.
Abstract
Description
FIG. 1 shows an example of an image processing apparatus according to an embodiment of the present invention. As shown in FIG. 1, the image processing apparatus 100 includes a control unit 110, a storage unit 120, an operation unit 130, and a display unit 140.
Next, the functions realized by the image processing apparatus 100 in the present embodiment will be described. As shown in FIG. 1, the functions of the image processing apparatus 100 include an image receiving unit 111, a significant pixel area acquiring unit 112, a marker/object identifying unit 113, a pointed position information acquiring unit 114, a reference position information acquiring unit 115, an area dividing unit 116, a divided area specifying unit 117, and an object determining unit 118. These functions are realized, for example, by the control unit 110 executing a program stored in the storage unit 120. The details of these functions are described below with reference to FIGS. 2 to 10.
First, the image receiving unit 111 receives a target image to be processed by the image processing apparatus 100 according to the present invention. For example, the image receiving unit 111 acquires a target image stored in the storage unit 120 in response to a request from the user, a given application, or the like. Alternatively, the image receiving unit 111 may receive the target image from a given device connected to the image processing apparatus 100 (a mobile phone, a digital camera, a scanner, etc.), or may download and receive the target image from a device connected via a network.
The significant pixel area acquiring unit 112 acquires the significant pixel areas represented in the image 200. For example, based on the image 200 acquired by the image receiving unit 111, the significant pixel area acquiring unit 112 acquires a plurality of significant pixel areas in accordance with a predetermined criterion. Here, a significant pixel area is an area of the image 200 that contains a color different from the background color. That is, each significant pixel area contains either an object 201 or a marker 202.
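As a rough, hypothetical illustration of this step, significant pixel areas can be collected as connected groups of non-background pixels. The grid, the background value 0, and the 4-connectivity below are invented for the example and are not the patent's predetermined criterion.

```python
# Hedged sketch: collect "significant pixel areas" as connected groups
# of pixels whose value differs from the background (0). Uses iterative
# flood fill with 4-connectivity; the grid is invented for illustration.
def significant_areas(grid, background=0):
    h, w = len(grid), len(grid[0])
    seen = [[False] * w for _ in range(h)]
    areas = []
    for sy in range(h):
        for sx in range(w):
            if grid[sy][sx] != background and not seen[sy][sx]:
                stack, region = [(sx, sy)], []
                seen[sy][sx] = True
                while stack:
                    x, y = stack.pop()
                    region.append((x, y))
                    for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                        if (0 <= nx < w and 0 <= ny < h
                                and not seen[ny][nx]
                                and grid[ny][nx] != background):
                            seen[ny][nx] = True
                            stack.append((nx, ny))
                areas.append(region)
    return areas

grid = [[0, 1, 1, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 2],
        [0, 0, 2, 2]]
print(len(significant_areas(grid)))  # 2 connected regions
```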
Next, the marker/object identifying unit 113 identifies each significant pixel area 203 as either an object 201 or a marker 202. For example, the marker/object identifying unit 113 classifies the significant pixel area information, in accordance with a predetermined criterion, as either an object area 204 or a marker area 205.
The pointed position information acquiring unit 114 acquires pointed position information including at least one position pointed to by the user. For example, the pointed position information acquiring unit 114 acquires the pointed position information based on the markers 202 in the image 200. Specifically, the pointed position information acquiring unit 114 acquires the representative position of each marker area 205 as pointed position information and stores it in the storage unit 120.
The reference position information acquiring unit 115 acquires reference position information including the positions of the objects 201 in the image 200. For example, the reference position information acquiring unit 115 acquires the representative position of each object area 204 (that is, the position of the object 201 in the image 200) as reference position information and stores it in the storage unit 120.
The area dividing unit 116 divides the processing target area of the image 200 based on the reference position information and generates a plurality of divided areas.
The divided area specifying unit 117 specifies at least one divided area based on the plurality of divided areas 209 and the pointed position information. For example, the divided area specifying unit 117 determines whether each piece of divided area information is specified by the pointed position information.
The object determining unit 118 determines whether an object 201 has been selected based on the specified divided area 211. For example, based on the object position information and the divided area information, the object determining unit 118 determines, for at least one object area 204 (object 201), whether it has been selected.
Next, the processing executed by the image processing apparatus 100 in the present embodiment will be described. FIGS. 9 and 10 are flowcharts showing the processing executed by the image processing apparatus 100 in the present embodiment. The processing executed by the image processing apparatus 100 is described below with reference to these flowcharts.
The present invention is not limited to the embodiments described above, and may be modified as appropriate without departing from the spirit of the present invention.
The instruction processing information associating unit 301 associates, with the pointed position information, instruction processing information indicating the user's instructed processing targeting an object. Here, the instruction processing information is, for example, information including the content of the image processing to be executed on the object area 204 in the image 200 (mask processing, encryption processing, distribution processing, etc.). The instruction processing information may also include, for example, information indicating the priority of predetermined processing (such as the order of rendering processing when displaying the image 200).
The first instruction processing information acquiring unit 302 acquires the instruction processing information associated with the pointed position information used to specify the divided area 211, as the instruction processing information related to that specified divided area 211. For example, when the divided area specifying unit 117 specifies a divided area 211, the instruction processing information associated with the pointed position information that served as the basis for the specification is acquired as the instruction processing information related to that divided area 211.
The second instruction processing information acquiring unit 303 acquires, as the instruction processing information related to the object, the instruction processing information related to the specified divided area used in the determination of the object 201 determined to be selected. For example, when the object determining unit 118 determines that an object area 212 and object 201 have been selected, the instruction processing information related to the divided area 211 that served as the basis for the determination is acquired as the instruction processing information related to that object area 212 and object 201.
The object processing unit 304 executes, on an object 201 determined to be selected, the processing indicated in the instruction processing information related to that object 201. For example, based on the instruction processing information related to the object 201 acquired by the second instruction processing information acquiring unit, the object processing unit 304 executes the processing indicated in the instruction processing information (image processing targeting the object area 204).
Next, the processing executed by the image processing apparatus 100 in the present modification will be described. FIGS. 8 and 15 are flowcharts showing the processing executed by the image processing apparatus 100 in the present modification. The processing executed by the image processing apparatus 100 is described below with reference to these flowcharts. Repeated description of processing identical to that of the embodiment is omitted.
Claims (9)
- An image processing apparatus comprising: pointed position information acquiring means for acquiring pointed position information including at least one position pointed to by a user in an image representing at least one object; reference position information acquiring means for acquiring reference position information including the position of the object in the image; dividing means for dividing a processing target area of the image based on the reference position information to generate a plurality of divided areas; divided area specifying means for specifying at least one divided area based on the plurality of divided areas and the pointed position information; and object determining means for determining, based on the specified divided area, whether the object has been selected.
- The image processing apparatus according to claim 1, wherein the object determining means determines whether the object has been selected based on the position of the object or the area of the object.
- The image processing apparatus according to claim 1 or 2, wherein the reference position information includes at least one position indicating a boundary of the processing target area of the image, and the object determining means includes area determining means for determining whether a divided area specified by the divided area specifying means is related to the object, and determines that the object has not been selected when the specified divided area is a divided area partitioned based on a position indicating the boundary of the processing target area of the image and is determined not to be related to the object.
- The image processing apparatus according to any one of claims 1 to 3, wherein the dividing means acquires positions of mother points in the image based on the reference position information, and divides the processing target area of the image based on Voronoi boundaries defined from the positions of the mother points to generate the plurality of divided areas.
- The image processing apparatus according to claim 4, wherein the dividing means divides the processing target area of the image based on Delaunay boundaries defined from the positions of the mother points and the Voronoi boundaries to generate the plurality of divided areas.
- The image processing apparatus according to any one of claims 1 to 5, wherein the pointed position information includes information indicating a line segment pointed to by the user or an area pointed to by the user, and the divided area specifying means specifies at least one divided area based on that information.
- The image processing apparatus according to any one of claims 1 to 6, wherein the image represents at least one object and at least one marker, the apparatus further comprises significant pixel area acquiring means for acquiring significant pixel areas represented in the image and identifying means for identifying each significant pixel area as either the object or the marker, and the pointed position information acquiring means includes marker information acquiring means for acquiring the pointed position information based on the marker in the image.
- The image processing apparatus according to any one of claims 1 to 7, further comprising: means for associating, with the pointed position information, instruction processing information indicating the user's instructed processing targeting the object; first instruction processing information acquiring means for acquiring the instruction processing information associated with the pointed position information used in specifying the specified divided area, the instruction processing information being related to that specified divided area; second instruction processing information acquiring means for acquiring the instruction processing information related to the specified divided area used in the determination of the object determined to be selected, the instruction processing information being related to that object; and object processing means for executing, on the object determined to be selected, the processing indicated in the instruction processing information related to that object.
- A program for causing a computer to function as: pointed position information acquiring means for acquiring pointed position information including at least one position pointed to by a user in an image representing at least one object; reference position information acquiring means for acquiring reference position information including the position of the object in the image; dividing means for dividing a processing target area of the image based on the reference position information to generate a plurality of divided areas; divided area specifying means for specifying at least one divided area based on the plurality of divided areas and the pointed position information; and object determining means for determining, based on the specified divided area, whether the object has been selected.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2013383628A AU2013383628B2 (en) | 2013-03-21 | 2013-09-09 | Image processing apparatus, program, computer readable medium and image processing method |
SG11201506119RA SG11201506119RA (en) | 2013-03-21 | 2013-09-09 | Image processing apparatus and program |
CN201380073275.1A CN104995591B (zh) | 2013-03-21 | 2013-09-09 | 图像处理设备和图像处理方法 |
US14/745,054 US10095940B2 (en) | 2013-03-21 | 2015-06-19 | Image processing apparatus, image processing method and non-transitory computer readable medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013059087A JP2014186392A (ja) | 2013-03-21 | 2013-03-21 | 画像処理装置及びプログラム |
JP2013-059087 | 2013-03-21 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/745,054 Continuation US10095940B2 (en) | 2013-03-21 | 2015-06-19 | Image processing apparatus, image processing method and non-transitory computer readable medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014147867A1 true WO2014147867A1 (ja) | 2014-09-25 |
Family
ID=51579582
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/074195 WO2014147867A1 (ja) | 2013-03-21 | 2013-09-09 | 画像処理装置及びプログラム |
Country Status (6)
Country | Link |
---|---|
US (1) | US10095940B2 (ja) |
JP (1) | JP2014186392A (ja) |
CN (1) | CN104995591B (ja) |
AU (1) | AU2013383628B2 (ja) |
SG (1) | SG11201506119RA (ja) |
WO (1) | WO2014147867A1 (ja) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6261190B2 (ja) * | 2013-05-31 | 2018-01-17 | Canon Inc. | Setting device and setting method |
CN106527912B (zh) * | 2016-10-28 | 2019-10-15 | Shandong University | Information retrieval visualization system and method based on Voronoi treemaps |
US10083218B1 * | 2017-06-30 | 2018-09-25 | Konica Minolta Laboratory U.S.A., Inc. | Repairing tables |
CN107362535B (zh) | 2017-07-19 | 2019-04-26 | Tencent Technology (Shenzhen) Co., Ltd. | Method, apparatus, and electronic device for locking a target object in a game scene |
CN111191730B (zh) * | 2020-01-02 | 2023-05-12 | Xi'an Aeronautics Computing Technique Research Institute, AVIC | Object detection method and system for very large images, oriented to embedded deep learning |
CN111310624B (zh) * | 2020-02-05 | 2023-11-21 | Tencent Technology (Shenzhen) Co., Ltd. | Occlusion recognition method and apparatus, computer device, and storage medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11507455A (ja) * | 1995-06-06 | 1999-06-29 | Silicon Graphics, Inc. | Method and apparatus for creating, controlling, and displaying menus |
JP2003233462A (ja) * | 2002-02-12 | 2003-08-22 | Seiko Epson Corp | Drawing assistance system, image printing system, drawing assistance program, image printing program, drawing assistance method, and image printing method |
WO2009084084A1 (ja) * | 2007-12-27 | 2009-07-09 | Pioneer Corporation | Recording medium playback device, recording medium playback method, recording medium playback program, and recording medium storing the recording medium playback program |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE4406668C2 (de) * | 1993-04-27 | 1996-09-12 | Hewlett Packard Co | Method and device for operating a touch-sensitive display device |
US7844914B2 (en) * | 2004-07-30 | 2010-11-30 | Apple Inc. | Activating virtual keys of a touch-screen virtual keyboard |
JP3741965B2 (ja) * | 2001-03-19 | 2006-02-01 | Namco Ltd. | Image processing device, image processing program, and recording medium recording the program |
US20040183833A1 (en) * | 2003-03-19 | 2004-09-23 | Chua Yong Tong | Keyboard error reduction method and apparatus |
JP4217969B2 (ja) | 2003-12-08 | 2009-02-04 | Fuji Xerox Co., Ltd. | Image processing apparatus and program |
JP5421727B2 (ja) * | 2009-10-20 | 2014-02-19 | Canon Inc. | Image processing apparatus and control method therefor |
US8502789B2 (en) * | 2010-01-11 | 2013-08-06 | Smart Technologies Ulc | Method for handling user input in an interactive input system, and interactive input system executing the method |
CA2750093A1 (en) * | 2010-08-19 | 2012-02-19 | Daniel Reem | Method for computing and storing voronoi diagrams, and uses therefor |
US8493404B2 (en) * | 2010-08-24 | 2013-07-23 | Qualcomm Incorporated | Pixel rendering on display |
US9047686B2 (en) * | 2011-02-10 | 2015-06-02 | Qualcomm Incorporated | Data storage address assignment for graphics processing |
JP2012252559A (ja) | 2011-06-03 | 2012-12-20 | Sony Corp | Image processing apparatus and method, recording medium, and program |
US9417754B2 (en) * | 2011-08-05 | 2016-08-16 | P4tents1, LLC | User interface system, method, and computer program product |
- 2013-03-21: JP JP2013059087A patent/JP2014186392A/ja active Pending
- 2013-09-09: AU AU2013383628A patent/AU2013383628B2/en active Active
- 2013-09-09: SG SG11201506119RA patent/SG11201506119RA/en unknown
- 2013-09-09: CN CN201380073275.1A patent/CN104995591B/zh active Active
- 2013-09-09: WO PCT/JP2013/074195 patent/WO2014147867A1/ja active Application Filing
- 2015-06-19: US US14/745,054 patent/US10095940B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US10095940B2 (en) | 2018-10-09 |
CN104995591A (zh) | 2015-10-21 |
AU2013383628A1 (en) | 2015-08-20 |
SG11201506119RA (en) | 2015-09-29 |
CN104995591B (zh) | 2019-01-04 |
US20160012302A1 (en) | 2016-01-14 |
AU2013383628B2 (en) | 2017-04-06 |
JP2014186392A (ja) | 2014-10-02 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13878776; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2013383628; Country of ref document: AU; Date of ref document: 20130909; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 13878776; Country of ref document: EP; Kind code of ref document: A1 |