Shared vehicle positioning method, operation and maintenance terminal, server and storage medium


1. A shared vehicle positioning method, comprising:

receiving first image information and target range information sent by a server, wherein the first image information is an instant image of a target vehicle shot by a user, and the target range information is used for indicating the position range of the target vehicle;

acquiring second image information, wherein the second image information is an image shot by an operation and maintenance worker facing different directions at a target position, and the target position is located in a target range indicated by the target range information;

and determining the direction information of the target vehicle in the target range according to the landmark information in the first image information and the landmark information in the second image information.

2. The shared vehicle positioning method of claim 1, wherein the second image information comprises at least one of the following: a plurality of pictures with direction identifiers, or a 360-degree rotation video.

3. The shared vehicle positioning method as claimed in claim 2, wherein, in a case where the second image information includes a plurality of pictures with direction identifiers, the determining of the direction information of the target vehicle within the target range according to the landmark information in the first image information and the landmark information in the second image information includes:

identifying landmark information in the plurality of pictures and landmark information in the first image information;

and determining the direction information of the target vehicle in the target range by comparing the landmark information in the plurality of pictures with the landmark information in the first image information in combination with the direction identifiers in the plurality of pictures.

4. The shared vehicle positioning method as claimed in claim 2, wherein, in a case where the second image information includes a 360-degree rotation video, the determining of the direction information of the target vehicle within the target range according to the landmark information in the first image information and the landmark information in the second image information includes:

sending the 360-degree rotation video and the first image information to a data server;

receiving the direction information of the target vehicle within the target range sent by the data server;

the data server is used for identifying landmark information in the 360-degree rotation video and landmark information in the first image information, and determining the direction information of the target vehicle within the target range by comparing the landmark information in the 360-degree rotation video with the landmark information in the first image information.

5. A shared vehicle positioning method, comprising:

receiving first image information and target range information from a user terminal, wherein the first image information is an instant image of a target vehicle shot by a user, and the target range information is used for indicating the position range of the target vehicle;

identifying whether the target vehicle is a damaged vehicle according to the first image information;

and under the condition that the target vehicle is determined to be a damaged vehicle, sending the first image information and the target range information to an operation and maintenance terminal.

6. The shared vehicle positioning method of claim 5, wherein after receiving the first image information and the target range information from the user terminal, the method further comprises:

judging whether the target vehicle is in a preset remote area according to the target range information;

and under the condition that the target vehicle is determined not to be in a preset remote area, sending preset processing time to the operation and maintenance terminal, wherein the preset processing time is used for indicating the time limit for repairing the target vehicle.

7. An operation and maintenance terminal, comprising: a receiving unit, an acquisition unit, and a processing unit;

the receiving unit is used for receiving first image information and target range information sent by a server, wherein the first image information is an instant image of a target vehicle shot by a user, and the target range information is used for indicating the position range of the target vehicle;

the acquisition unit is used for acquiring second image information, the second image information is an image shot by operation and maintenance personnel facing different directions at a target position, and the target position is located in a target range indicated by the target range information;

the processing unit is used for determining the direction information of the target vehicle in the target range according to the landmark information in the first image information and the landmark information in the second image information.

8. A server, comprising: a receiving unit, a processing unit and a transmitting unit;

the receiving unit is used for receiving first image information and target range information from a user terminal, wherein the first image information is an instant image of a target vehicle shot by a user, and the target range information is used for indicating the position range of the target vehicle;

the processing unit is used for identifying whether the target vehicle is a damaged vehicle according to the first image information;

the sending unit is used for sending the first image information and the target range information to an operation and maintenance terminal under the condition that the target vehicle is determined to be a damaged vehicle.

9. An operation and maintenance terminal, comprising a memory and a processor; the memory is used for storing computer-executable instructions, and the processor is connected with the memory through a bus;

when the operation and maintenance terminal is running, the processor executes the computer-executable instructions stored by the memory to cause the operation and maintenance terminal to perform the shared vehicle positioning method of any one of claims 1-4.

10. A server, comprising a memory and a processor; the memory is used for storing computer-executable instructions, and the processor is connected with the memory through a bus;

the processor executes the computer-executable instructions stored by the memory when the server is running to cause the server to perform the shared vehicle positioning method of any of claims 5-6.

11. A computer-readable storage medium comprising computer-executable instructions that, when executed on a computer, cause the computer to perform the shared vehicle positioning method of any of claims 1-4 or 5-6.

Background

Generally, when a user uses a shared vehicle and finds that the shared vehicle has a fault, the user can notify the server by photographing the vehicle and uploading the image.

In the prior art, after learning that a shared vehicle has a fault, the server can determine the position of the shared vehicle according to the positioning system of the shared vehicle. However, if the shared vehicle is located in a remote place, limited positioning accuracy makes it difficult for the server to quickly and accurately determine the specific location of the shared vehicle.

Therefore, those skilled in the art have made efforts to develop a shared vehicle locating method, an operation and maintenance terminal, a server, and a storage medium that can quickly find a shared vehicle.

Disclosure of Invention

In view of the above-mentioned drawbacks of the prior art, the technical problem to be solved by the present invention is how to quickly find a shared vehicle.

In order to achieve the above object, the present invention adopts the following technical solution:

in a first aspect, the present invention provides a shared vehicle positioning method, including: receiving first image information and target range information sent by a server, wherein the first image information is an instant image of a target vehicle shot by a user, and the target range information is used for indicating the position range of the target vehicle; acquiring second image information, wherein the second image information is a long-range image shot by operation and maintenance personnel facing different directions at a target position, and the target position is located in a target range indicated by the target range information; and determining the direction information of the target vehicle in the target range according to the landmark information in the first image information and the landmark information in the second image information.

In the embodiment of the invention, under the condition that the target range is determined, the images shot by the operation and maintenance personnel facing different directions at the target position within the target range are collected, and the direction information of the target vehicle within the target range is determined according to the instant image of the target vehicle shot by the user and the images shot at the target position. The first image information includes landmark information, and the second image information is a long-range image shot by the operation and maintenance personnel facing different directions at the target position and also includes landmark information, so the second image information and the first image information partially overlap, and the direction information of the target vehicle within the target range can be determined according to the landmark information of the overlapping part.

In a preferred embodiment of the present invention, the second image information includes at least one of the following: a plurality of pictures with direction identifiers, or a 360-degree rotation video.

In a preferred embodiment of the present invention, in the case that the second image information includes a plurality of pictures with direction indicators, the determining the direction information of the target vehicle within the target range according to the landmark information in the first image information and the landmark information in the second image information includes: identifying landmark information in the plurality of pictures and landmark information in the first image information; and determining the direction information of the target vehicle in the target range by comparing the landmark information in the plurality of pictures with the landmark information in the first image information in combination with the direction identifiers in the plurality of pictures.

In a preferred embodiment of the present invention, in the case where the second image information includes a 360-degree rotation video, the determining the direction information of the target vehicle within the target range according to the landmark information in the first image information and the landmark information in the second image information includes: sending the 360-degree rotation video and the first image information to a data server; and receiving the direction information of the target vehicle within the target range sent by the data server; wherein the data server is used for identifying landmark information in the 360-degree rotation video and landmark information in the first image information, and determining the direction information of the target vehicle within the target range by comparing the landmark information in the 360-degree rotation video with the landmark information in the first image information.

In a second aspect, the present invention provides a shared vehicle positioning method, including: receiving first image information and target range information from a user terminal, wherein the first image information is an instant image of a target vehicle shot by a user, and the target range information is used for indicating the position range of the target vehicle; identifying whether the target vehicle is a damaged vehicle according to the first image information; and under the condition that the target vehicle is determined to be a damaged vehicle, sending the first image information and the target range information to an operation and maintenance terminal.

In the embodiment of the invention, whether the target vehicle is a damaged vehicle or not can be identified according to the first image information, and the instant image and the target range information of the target vehicle are sent to the operation and maintenance terminal under the condition that the target vehicle is determined to be the damaged vehicle. Therefore, after the operation and maintenance terminal receives the first image information and the target range information, the operation and maintenance personnel can reach the approximate position of the damaged vehicle according to the target range information and further determine the direction of the damaged vehicle according to the first image information.

In a preferred embodiment of the present invention, after receiving the first image information and the target range information from the user terminal, the method further includes: judging whether the target vehicle is in a preset remote area according to the target range information; and under the condition that it is determined that the target vehicle is not in the preset remote area, sending a preset processing time to the operation and maintenance terminal, wherein the preset processing time is used for indicating the time limit for repairing the target vehicle.

In a third aspect, the present invention provides an operation and maintenance terminal, including a receiving unit, an acquisition unit, and a processing unit; the receiving unit is used for receiving first image information and target range information sent by a server, wherein the first image information is an instant image of a target vehicle shot by a user, and the target range information is used for indicating the position range of the target vehicle; the acquisition unit is used for acquiring second image information, wherein the second image information is an image shot by operation and maintenance personnel facing different directions at a target position, and the target position is located in the target range indicated by the target range information; and the processing unit is used for determining the direction information of the target vehicle within the target range according to the landmark information in the first image information and the landmark information in the second image information.

In a preferred embodiment of the present invention, the second image information includes at least one of the following: a plurality of pictures with direction identifiers, or a 360-degree rotation video.

In a preferred embodiment of the present invention, in a case that the second image information includes a plurality of pictures with direction identifiers, the processing unit is specifically configured to: identifying landmark information in the plurality of pictures and landmark information in the first image information; and determining the direction information of the target vehicle in the target range by comparing the landmark information in the plurality of pictures with the landmark information in the first image information in combination with the direction identifiers in the plurality of pictures.

In a preferred embodiment of the present invention, the operation and maintenance terminal further includes a sending unit; in a case where the second image information includes a 360-degree rotation video, the sending unit is configured to send the 360-degree rotation video and the first image information to a data server; the receiving unit is further configured to receive the direction information of the target vehicle within the target range sent by the data server; the data server is used for identifying landmark information in the 360-degree rotation video and landmark information in the first image information, and determining the direction information of the target vehicle within the target range by comparing the landmark information in the 360-degree rotation video with the landmark information in the first image information.

In a fourth aspect, the present invention provides a server, comprising: a receiving unit, a processing unit and a transmitting unit; the receiving unit is used for receiving first image information and target range information from a user terminal, wherein the first image information is an instant image of a target vehicle shot by a user, and the target range information is used for indicating the position range of the target vehicle; the processing unit is used for identifying whether the target vehicle is a damaged vehicle according to the first image information; the sending unit is used for sending the first image information and the target range information to an operation and maintenance terminal under the condition that the target vehicle is determined to be a damaged vehicle.

In a preferred embodiment of the present invention, after the receiving unit receives the first image information and the target range information from the user terminal, the processing unit is further configured to determine whether the target vehicle is in a preset remote area according to the target range information; and the sending unit is further configured to send a preset processing time to the operation and maintenance terminal under the condition that it is determined that the target vehicle is not in the preset remote area, wherein the preset processing time is used for indicating the time limit for repairing the target vehicle.

In a fifth aspect, the present invention provides an operation and maintenance terminal, which includes a memory and a processor. The memory is used for storing computer-executable instructions, and the processor is connected with the memory through a bus. When the operation and maintenance terminal is running, the processor executes the computer-executable instructions stored in the memory, so that the operation and maintenance terminal performs the shared vehicle positioning method provided by the first aspect and its various possible embodiments.

In a sixth aspect, the invention provides a server comprising a memory and a processor. The memory is used for storing computer-executable instructions, and the processor is connected with the memory through a bus. When the server is running, the processor executes the computer-executable instructions stored by the memory to cause the server to perform the shared vehicle positioning method provided by the second aspect and its various possible embodiments.

In a seventh aspect, a computer-readable storage medium is provided, where the computer-readable storage medium includes computer-executable instructions, and when the computer-executable instructions are run on a computer, the operation and maintenance terminal is caused to perform the shared vehicle positioning method provided in the first aspect and its various possible implementations.

In an eighth aspect, a computer program product is provided, which comprises computer instructions that, when run on a computer, cause the server to perform the shared vehicle positioning method provided by the second aspect and its various possible embodiments described above.

It should be noted that all or part of the computer instructions may be stored on the computer-readable storage medium. The computer-readable storage medium may be packaged together with the processor of the operation and maintenance terminal or the server, or may be packaged separately from that processor, which is not limited in the embodiments of the present invention.

For the description of the third, fifth and seventh aspects of the present invention, reference may be made to the detailed description of the first aspect; in addition, for the beneficial effects described in the third aspect, the fifth aspect and the seventh aspect, reference may be made to the beneficial effect analysis of the first aspect, and details are not repeated here.

Reference may be made to the detailed description of the second aspect for the description of the fourth, sixth and eighth aspects of the invention; in addition, for the beneficial effects described in the fourth aspect, the sixth aspect and the eighth aspect, reference may be made to the beneficial effect analysis of the second aspect, and details are not repeated here.

The conception, the specific structure and the technical effects of the present invention will be further described with reference to the accompanying drawings to fully understand the objects, the features and the effects of the present invention.

Drawings

FIG. 1 is a schematic diagram of a shared vehicle positioning system provided by an embodiment of the present invention;

FIG. 2 is a schematic flow chart diagram of a shared vehicle positioning method provided by an embodiment of the invention;

FIG. 3 is a schematic structural diagram of an operation and maintenance terminal according to an embodiment of the present invention;

FIG. 4 is a schematic structural diagram of a server according to an embodiment of the present invention;

FIG. 5 is a second schematic structural diagram of an operation and maintenance terminal or server according to an embodiment of the present invention.

Detailed Description

The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict.

It should be noted that, in the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate examples, illustrations, or explanations. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention is not to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "for example" is intended to present related concepts in a concrete fashion.

It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present invention is not limited to performing functions in the order illustrated or discussed; functions may also be performed in a substantially simultaneous manner or in a reverse order depending on the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.

For the convenience of clearly describing the technical solutions of the embodiments of the present invention, in the embodiments of the present invention, the words "first", "second", and the like are used for distinguishing the same items or similar items with basically the same functions and actions, and those skilled in the art can understand that the words "first", "second", and the like are not limited in number or execution order.

Some exemplary embodiments of the invention have been described for illustrative purposes, and it is to be understood that the invention may be practiced otherwise than as specifically described.

The above-described implementations are described in detail below with reference to specific embodiments and the accompanying drawings.

As shown in fig. 1, an embodiment of the present invention provides a shared vehicle positioning system 100, which includes a user terminal 101, a server 102, and an operation and maintenance terminal 103.

The user terminal 101 may be configured to, in a case where the user finds that the target vehicle has a fault, capture an instant image of the target vehicle, which includes image information of the faulty portion of the target vehicle and landmark information, and send the instant image and the position range information of the target vehicle to the server 102.

The server 102 may be configured to receive the location range information and the instant image of the target vehicle sent by the user terminal 101, determine whether the target vehicle is a damaged vehicle according to the instant image, and send the instant image and the location range information of the target vehicle to the operation and maintenance terminal 103 when the target vehicle is determined to be a damaged vehicle.

The operation and maintenance terminal 103 may be configured to receive the instant image and the position range information of the target vehicle sent by the server 102. After the operation and maintenance personnel reach the target range according to the position range information of the target vehicle, they can use the operation and maintenance terminal 103 to capture distant view images facing different directions at a position within the target range, where the distant view images include landmark information, and the operation and maintenance terminal 103 determines the direction information of the target vehicle within the target range according to the landmark information in the distant view images and the landmark information in the received instant image.

Optionally, the shared vehicle positioning system 100 may further include a data server 104. When determining the direction information of the target vehicle within the target range according to the distant view images and the received instant image, the operation and maintenance terminal 103 may send the distant view images and the received instant image to the data server 104, and the data server 104 may be configured to receive the distant view images and the instant image and determine the direction information of the target vehicle within the target range according to them. That is, the operation and maintenance terminal 103 may determine the direction information of the target vehicle within the target range according to the distant view images and the received instant image itself, or may determine it through the data server 104.

As shown in fig. 2, an embodiment of the present invention provides a shared vehicle positioning method, which may be applied to the above shared vehicle positioning system. The shared vehicle positioning method may include steps S201 to S205:

S201, the server receives first image information and target range information from the user terminal.

The first image information is an instant image of a target vehicle shot by a user, and the target range information is used for indicating the position range of the target vehicle.

After the user unlocks the target vehicle, if the user finds during use that the target vehicle has a fault that affects riding, the user can shoot the first image information and send the first image information and the target range information to the server, so as to inform the server of the fault condition of the target vehicle.

Optionally, the first image information may include a plurality of images, and the plurality of images include image information of a portion of the target vehicle where the fault exists, for example, if the chain of the target vehicle is broken, the plurality of images include an image of a broken portion of the chain of the target vehicle. The first image information may also include landmark information near the target vehicle. For example, when the user takes an instant image of the target vehicle, an application on the user terminal may prompt the user to take at least one distant view picture.

Optionally, the target range information may be position information of the user terminal, or position information of the target vehicle collected by the user terminal, which may be specifically determined according to an actual use condition, and this is not limited in this embodiment of the present application.

S202, the server identifies whether the target vehicle is a damaged vehicle according to the first image information.

Since the first image information includes image information of the portion of the target vehicle where the fault exists, the server can recognize whether the target vehicle is a damaged vehicle based on the first image information. Specifically, the server may identify whether the target vehicle is a damaged vehicle through a pre-trained fault recognition model, where the fault recognition model is trained based on images of different fault types.
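As a purely illustrative sketch (not part of the claimed method), the server-side damage check of S202 could be wrapped around such a pre-trained model roughly as follows; the names FaultRecognitionModel, FaultPrediction, and DAMAGE_THRESHOLD are hypothetical stand-ins for whatever model and threshold are actually deployed.

```python
# Minimal sketch of the damage check in S202, assuming a pre-trained
# fault-recognition model exposed behind a simple predict() interface.
# All names below are illustrative assumptions, not the patented implementation.

from dataclasses import dataclass
from typing import List

DAMAGE_THRESHOLD = 0.8  # assumed confidence threshold


@dataclass
class FaultPrediction:
    fault_type: str      # e.g. "broken_chain", "flat_tire"
    confidence: float


class FaultRecognitionModel:
    """Placeholder for a model trained on images of different fault types."""

    def predict(self, image_bytes: bytes) -> List[FaultPrediction]:
        raise NotImplementedError("plug in the actual trained model here")


def is_damaged_vehicle(model: FaultRecognitionModel,
                       first_image_info: List[bytes]) -> bool:
    """Return True if any image in the first image information is classified
    as showing a fault with sufficient confidence."""
    for image in first_image_info:
        for prediction in model.predict(image):
            if prediction.confidence >= DAMAGE_THRESHOLD:
                return True
    return False
```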

S203, the server sends the first image information and the target range information to the operation and maintenance terminal under the condition that the target vehicle is determined to be a damaged vehicle.

Correspondingly, the operation and maintenance terminal receives the first image information and the target range information sent by the server.

If the server determines that the target vehicle is a damaged vehicle, this indicates that the target vehicle needs to be repaired by operation and maintenance personnel, and therefore the server can send the first image information and the target range information to the operation and maintenance terminal under the condition that the target vehicle is determined to be a damaged vehicle.

Optionally, when determining that the target vehicle is a damaged vehicle, the server may send no-charge information to the user terminal, so as to avoid a situation in which the user is charged for unlocking the vehicle even though the user could not actually use the target vehicle.

S204, the operation and maintenance terminal acquires second image information.

The second image information is an image shot by the operation and maintenance personnel facing different directions at a target position, and the target position is located in a target range indicated by the target range information.

After the operation and maintenance terminal receives the first image information and the target range information sent by the server, the operation and maintenance personnel can reach the target range according to the target range information. However, due to the limitation of positioning accuracy, the operation and maintenance personnel may not be able to accurately find the position of the target vehicle by relying on the target range information alone, so they can shoot long-range images in different directions at a target position within the target range. The target position may be any position within the target range.

Optionally, the second image information may include at least one of the following: a plurality of pictures with direction identifiers, or a 360-degree rotation video. That is to say, the operation and maintenance personnel can take a plurality of pictures facing different directions at the target position, or stand at the target position and rotate slowly to take a 360-degree rotation video.

It should be noted that, in the case that the second image information includes a plurality of pictures, after each picture is taken, the operation and maintenance terminal may use its built-in compass function to determine the shooting direction of the picture, determine the direction identifier according to the shooting direction, and mark the direction identifier on the picture.
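The following is a minimal, hypothetical sketch of how a terminal might turn a compass heading into such a direction identifier; the 8-way labels and the TaggedPicture structure are illustrative assumptions, and the platform-specific compass API is replaced by a plain heading value in degrees.

```python
# Minimal sketch: converting the terminal's compass heading into a direction
# identifier attached to each picture. All names are illustrative.

from dataclasses import dataclass

DIRECTION_LABELS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]


def heading_to_identifier(heading_degrees: float) -> str:
    """Map a compass heading (0-360, 0 = north) to an 8-way direction label."""
    sector = int(((heading_degrees % 360) + 22.5) // 45) % 8
    return DIRECTION_LABELS[sector]


@dataclass
class TaggedPicture:
    image_path: str
    direction_id: str  # direction identifier marked on the picture, e.g. "NE"


def tag_picture(image_path: str, heading_degrees: float) -> TaggedPicture:
    """Attach the direction identifier derived from the heading to a picture."""
    return TaggedPicture(image_path, heading_to_identifier(heading_degrees))
```

For example, a picture taken while the terminal reports a heading of 95 degrees would be tagged "E" under this assumed labelling scheme.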

S205, the operation and maintenance terminal determines the direction information of the target vehicle in the target range according to the landmark information in the first image information and the landmark information in the second image information.

Optionally, the operation and maintenance terminal may determine the direction information of the target vehicle within the target range in either of the following two ways:

Mode one

Under the condition that the second image information includes a plurality of pictures with direction identifiers, the operation and maintenance terminal can identify the landmark information in the plurality of pictures and the landmark information in the first image information, and determine the direction information of the target vehicle within the target range by comparing the landmark information in the plurality of pictures with the landmark information in the first image information, in combination with the direction identifiers in the pictures.
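A minimal sketch of mode one is given below, assuming landmark recognition has already reduced each image to a set of landmark names; best_matching_direction and the landmark sets are illustrative assumptions, not the patented implementation.

```python
# Minimal sketch of mode one: pick the direction whose picture shares the most
# landmarks with the user's instant image. Landmark recognition itself is
# assumed to have already reduced each image to a set of landmark names.

from typing import Dict, Optional, Set


def best_matching_direction(
        first_image_landmarks: Set[str],
        landmarks_by_direction: Dict[str, Set[str]]) -> Optional[str]:
    """Return the direction identifier whose picture overlaps most with the
    landmarks seen in the first image information, or None if nothing overlaps."""
    best_direction, best_overlap = None, 0
    for direction_id, landmarks in landmarks_by_direction.items():
        overlap = len(first_image_landmarks & landmarks)
        if overlap > best_overlap:
            best_direction, best_overlap = direction_id, overlap
    return best_direction
```

For instance, if the instant image shows a bus stop and a red kiosk and only the picture tagged "NE" also contains the red kiosk, the target vehicle would be taken to lie roughly to the north-east of the target position.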

Mode two

In the case that the second image information includes a 360-degree rotation video, the operation and maintenance terminal may send the 360-degree rotation video and the first image information to the data server. The data server may identify the landmark information in the 360-degree rotation video and the landmark information in the first image information, and determine the direction information of the target vehicle within the target range by comparing the landmark information in the 360-degree rotation video with the landmark information in the first image information. Finally, the operation and maintenance terminal can receive the direction information of the target vehicle within the target range sent by the data server.
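For the data-server side of mode two, a rough sketch is shown below under the simplifying assumption that the 360-degree rotation video is recorded at an approximately uniform rotation speed, so that the index of the best-matching frame can be mapped back to a bearing relative to the start of the rotation; direction_from_rotation_video and recognize_landmarks are hypothetical names.

```python
# Minimal sketch of the data-server side of mode two. The rotation is assumed
# to be roughly uniform, so a frame index can be mapped back to a bearing.
# recognize_landmarks is a stand-in for the actual recognition step.

from typing import Callable, List, Optional, Set


def direction_from_rotation_video(
        frames: List[bytes],
        first_image_landmarks: Set[str],
        recognize_landmarks: Callable[[bytes], Set[str]]) -> Optional[float]:
    """Return an approximate bearing in degrees (0 = start of rotation) for the
    frame whose landmarks best match the first image information."""
    best_index, best_overlap = None, 0
    for index, frame in enumerate(frames):
        overlap = len(first_image_landmarks & recognize_landmarks(frame))
        if overlap > best_overlap:
            best_index, best_overlap = index, overlap
    if best_index is None:
        return None
    return 360.0 * best_index / len(frames)
```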

In the embodiment of the invention, the server can identify whether the target vehicle is a damaged vehicle according to the first image information, and send the instant image and the target range information of the target vehicle to the operation and maintenance terminal under the condition that the target vehicle is determined to be a damaged vehicle. Therefore, after the operation and maintenance terminal receives the first image information and the target range information, the operation and maintenance personnel can reach the approximate position of the damaged vehicle according to the target range information. Under the condition that the target range is determined, the operation and maintenance terminal can collect images shot by the operation and maintenance personnel facing different directions at the target position within the target range, and determine the direction information of the target vehicle within the target range according to the instant image of the target vehicle shot by the user and the images shot at the target position. The first image information includes landmark information, and the second image information is a long-range image shot by the operation and maintenance personnel facing different directions at the target position and also includes landmark information, so the second image information and the first image information partially overlap, and the direction information of the target vehicle within the target range can be determined according to the landmark information of the overlapping part.

Optionally, after the server receives the first image information and the target range information from the user terminal, the shared vehicle positioning method may further include S206 and S207:

and S206, the server judges whether the target vehicle is in a preset remote area or not according to the target range information.

The server can define the operation area of the shared vehicle in advance, and the areas beyond the operation area belong to the preset remote area. After receiving the target range information sent by the user terminal, the server can judge, according to the target range information, whether the target vehicle is in the operation area; if not, it can be determined that the target vehicle is in the preset remote area, and if so, it can be determined that the target vehicle is not in the preset remote area.
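As an illustration only, the remote-area judgment of S206 could be reduced to a point-in-polygon test against a pre-defined operation area, assuming the target range information is represented by a single (longitude, latitude) coordinate; the ray-casting helper below is a generic geometric sketch, not the patented logic.

```python
# Minimal sketch of S206: treat the operation area as a polygon of
# (longitude, latitude) vertices and test whether the reported position falls
# inside it; anything outside counts as the preset remote area.

from typing import List, Tuple

Point = Tuple[float, float]  # (longitude, latitude)


def inside_operation_area(position: Point, area: List[Point]) -> bool:
    """Ray-casting point-in-polygon test over the operation area vertices."""
    x, y = position
    inside = False
    j = len(area) - 1
    for i in range(len(area)):
        xi, yi = area[i]
        xj, yj = area[j]
        crosses = (yi > y) != (yj > y)
        if crosses and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside


def in_preset_remote_area(position: Point, operation_area: List[Point]) -> bool:
    """A vehicle outside the pre-defined operation area is treated as remote."""
    return not inside_operation_area(position, operation_area)
```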

S207, under the condition that the target vehicle is determined not to be in the preset remote area, the server sends the preset processing time to the operation and maintenance terminal.

The preset processing time may be used to indicate the time limit for repairing the target vehicle.

Since a target vehicle that is not located in the preset remote area is more likely to be used than one that is, if the server determines that the target vehicle is not located in the preset remote area, the fault of the target vehicle needs to be handled as soon as possible.

The server may send the preset processing time to the operation and maintenance terminal if it determines that the target vehicle is not located in the preset remote area. The preset processing time is shorter than the regular operation and maintenance cycle in which the operation and maintenance personnel handle faulty vehicles in batches.

In the embodiment of the invention, the preset processing time can be sent to the operation and maintenance terminal under the condition that the target vehicle is determined not to be in the preset remote area, so that the target vehicle with a fault can be repaired as soon as possible, and the influence on the normal use of the target vehicle by a user is avoided.

The solution provided by the embodiments of the present application has been described above mainly from the perspective of the method. To implement the above functions, the operation and maintenance terminal or the server includes corresponding hardware structures and/or software modules for performing the respective functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or as a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.

In the embodiments of the present application, the operation and maintenance terminal or the server may be divided into functional modules according to the above method example; for example, each functional module may correspond to one function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. Optionally, the division of the modules in the embodiments of the present application is schematic and is only a logical function division; there may be other division manners in actual implementation.

As shown in fig. 3, an operation and maintenance terminal 300 is provided in the embodiment of the present application. The operation and maintenance terminal 300 includes: a receiving unit 301, an acquisition unit 302, and a processing unit 303. The receiving unit 301 is configured to receive first image information and target range information, where the first image information is an instant image of a target vehicle shot by a user, and the target range information is used to indicate the position range of the target vehicle; the acquisition unit 302 is configured to acquire second image information, where the second image information is an image shot by operation and maintenance personnel facing different directions at a target position, and the target position is located in the target range indicated by the target range information; the processing unit 303 is configured to determine, according to the landmark information in the first image information and the landmark information in the second image information, the direction information of the target vehicle within the target range.

Optionally, the second image information includes at least one of the following: a plurality of pictures with direction identifiers, or a 360-degree rotation video.

Optionally, when the second image information includes a plurality of pictures with direction identifiers, the processing unit 303 is specifically configured to: identifying landmark information in the plurality of pictures and landmark information in the first image information; and determining the direction information of the target vehicle in the target range by comparing the landmark information in the plurality of pictures with the landmark information in the first image information in combination with the direction identifiers in the plurality of pictures.

Optionally, with continuing reference to fig. 3, the operation and maintenance terminal 300 may further include a sending unit 304. In a case where the second image information includes a 360-degree rotation video, the sending unit 304 is configured to send the 360-degree rotation video and the first image information to a data server; the receiving unit 301 is further configured to receive the direction information of the target vehicle within the target range sent by the data server; the data server is used for identifying landmark information in the 360-degree rotation video and landmark information in the first image information, and determining the direction information of the target vehicle within the target range by comparing the landmark information in the 360-degree rotation video with the landmark information in the first image information.

Under the condition that the target range is determined, the operation and maintenance terminal provided by the embodiment of the invention can acquire images shot by the operation and maintenance personnel facing different directions at the target position within the target range, and determine the direction information of the target vehicle within the target range according to the instant image of the target vehicle shot by the user and the long-range images shot at different angles at the target position. The first image information includes landmark information, and the second image information is a long-range image shot by the operation and maintenance personnel facing different directions at the target position and also includes landmark information, so the second image information and the first image information partially overlap, and the direction information of the target vehicle within the target range can be determined according to the landmark information of the overlapping part.

As shown in fig. 4, an embodiment of the present application provides a server 400. The server 400 includes: a receiving unit 401, a processing unit 402, and a transmitting unit 403; the receiving unit 401 is configured to receive first image information and target range information from a user terminal, where the first image information is an instant image of a target vehicle captured by a user, and the target range information is used to indicate a position range of the target vehicle; the processing unit 402 is configured to identify whether the target vehicle is a damaged vehicle according to the first image information; the sending unit 403 is configured to send the first image information and the target range information to an operation and maintenance terminal when it is determined that the target vehicle is a damaged vehicle.

Optionally, after the receiving unit 401 receives the first image information and the target range information from the user terminal, the processing unit 402 is further configured to determine whether the target vehicle is in a preset remote area according to the target range information; and the sending unit 403 is further configured to send a preset processing time to the operation and maintenance terminal when it is determined that the target vehicle is not located in the preset remote area, where the preset processing time is used to indicate the time limit for repairing the target vehicle.

The server provided by the embodiment of the invention can identify whether the target vehicle is a damaged vehicle according to the first image information, and send the instant image and the target range information of the target vehicle to the operation and maintenance terminal under the condition that the target vehicle is determined to be the damaged vehicle. Therefore, after the operation and maintenance terminal receives the first image information and the target range information, the operation and maintenance personnel can reach the approximate position of the damaged vehicle according to the target range information and further determine the direction of the damaged vehicle according to the first image information.

The embodiment of the present application further provides an operation and maintenance terminal or server as shown in fig. 5, where the operation and maintenance terminal or server includes a processor 11, a memory 12, a communication interface 13, and a bus 14. The processor 11, the memory 12 and the communication interface 13 may be connected by a bus 14.

The processor 11 is a control center of an operation and maintenance terminal or a server, and may be a single processor or a collective term for multiple processing elements. For example, the processor 11 may be a general-purpose Central Processing Unit (CPU), or may be another general-purpose processor. Wherein a general purpose processor may be a microprocessor or any conventional processor or the like.

For one embodiment, processor 11 may include one or more CPUs, such as CPU 0 and CPU 1 shown in FIG. 5.

The memory 12 may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that may store static information and instructions, a Random Access Memory (RAM) or other type of dynamic storage device that may store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.

In a possible implementation, the memory 12 may exist separately from the processor 11, and the memory 12 may be connected to the processor 11 via the bus 14 for storing instructions or program code. The shared vehicle positioning method provided by the embodiments of the present application can be implemented when the processor 11 calls and executes the instructions or program code stored in the memory 12.

In another possible implementation, the memory 12 may also be integrated with the processor 11.

The communication interface 13 is configured to connect to other devices through a communication network. The communication network may be an Ethernet network, a radio access network, a wireless local area network (WLAN), or the like. The communication interface 13 may comprise a receiving unit for receiving data and a transmitting unit for transmitting data.

The bus 14 may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 5, but this is not intended to represent only one bus or type of bus.

It should be noted that the structure shown in FIG. 5 does not constitute a limitation on the operation and maintenance terminal or server. The operation and maintenance terminal or server may include more or fewer components than shown in FIG. 5, some components may be combined, or the components may be arranged differently.

Embodiments of the present invention also provide a computer-readable storage medium, which includes computer-executable instructions. When the computer-executable instructions are run on a computer, the computer performs the steps performed by the operation and maintenance terminal in the shared vehicle positioning method provided by the above embodiments.

Embodiments of the present invention also provide a computer-readable storage medium, which includes computer-executable instructions. When the computer-executable instructions are run on a computer, the computer performs the steps performed by the server in the shared vehicle positioning method provided by the above embodiments.

The embodiments of the present invention further provide a computer program product, which may be directly loaded into the memory and contains software code. After being loaded and executed by the computer, the computer program product can implement the shared vehicle positioning method provided by the above embodiments and perform the steps performed by the operation and maintenance terminal.

The embodiments of the present invention further provide a computer program product, which may be directly loaded into the memory and contains software code. After being loaded and executed by the computer, the computer program product can implement the shared vehicle positioning method provided by the above embodiments and perform the steps performed by the server.

Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for causing a terminal to execute the methods according to the embodiments of the present invention.

The foregoing embodiments are merely illustrative of the principles and utilities of the present invention and are not intended to limit the invention. Any person skilled in the art can modify or change the above-mentioned embodiments without departing from the spirit and scope of the present invention. Accordingly, it is intended that all equivalent modifications or changes made by those skilled in the art without departing from the spirit and technical scope of the present invention be covered by the claims of the present invention.
