Method for generating a traffic prediction multi-time-sequence model, information sending method, and device
1. A method for generating a multi-time-sequence model for traffic prediction comprises the following steps:
acquiring a historical traffic volume set of a target object in a preset time period;
inputting the historical traffic volume set into each traffic prediction time sequence model in a traffic prediction time sequence model pool, and obtaining a predicted traffic volume of the target object at each return time granularity in a preset return time period as a predicted traffic volume group, so as to form a predicted traffic volume group set, wherein the preset time period comprises at least one periodic return time period of the preset return time period;
determining the predicted traffic volume group set and each historical traffic volume at each return time granularity in the historical traffic volume set as input parameters of a preset linearized objective function, so as to generate an objective function to be solved;
solving the objective function to be solved based on a constraint condition set corresponding to the preset linearized objective function to obtain a model weight coefficient set, wherein model weight coefficients in the model weight coefficient set correspond to the traffic prediction time sequence models in the traffic prediction time sequence model pool, and the constraint condition set comprises a constraint condition for constraining the number of models;
and according to the model weight coefficient set, carrying out weighted combination processing on the traffic prediction time sequence model corresponding to each model weight coefficient in the model weight coefficient set to obtain a traffic prediction multi-time sequence model.
2. The method of claim 1, wherein prior to said solving the objective function to be solved, the method further comprises:
and generating, for each traffic prediction time sequence model, a model error based on the predicted traffic volume group corresponding to the traffic prediction time sequence model in the predicted traffic volume group set and each historical traffic volume at each return time granularity in the historical traffic volume set, so as to obtain a model error set.
3. The method of claim 2, wherein after the generating a model error corresponding to the traffic prediction time sequence model to obtain a model error set, the method further comprises:
and deleting, from the constraint condition set, each constraint condition for constraining the model weight coefficients, so as to update the constraint condition set.
4. The method of claim 3, wherein after the deleting, from the constraint condition set, of each constraint condition for constraining the model weight coefficients, the method further comprises:
determining the model error set as input parameters of a first model weight function, a second model weight function and a third model weight function to generate a first model weight function to be weighted, a second model weight function to be weighted and a third model weight function to be weighted;
weighting the first model weight function to be weighted, the second model weight function to be weighted and the third model weight function to be weighted to obtain a weighted model weight function as a weight constraint condition;
adding the weight constraint condition to the constraint condition set to update the constraint condition set.
5. An information sending method, comprising:
acquiring a historical traffic volume set of a target object in a preset time period;
inputting the historical traffic set into a traffic prediction multi-time sequence model to obtain a predicted traffic of each predicted time granularity of the target object in a preset prediction time period as a target predicted traffic, wherein the traffic prediction multi-time sequence model is generated by adopting the method of any one of claims 1 to 4;
and sending the obtained target predicted traffic volume to an associated display device.
6. The method of claim 5, wherein the obtaining the historical traffic volume set of the target object in the preset time period further comprises:
and acquiring an inventory amount of the target object.
7. The method of claim 6, wherein the method further comprises:
and controlling, in response to the inventory amount and the obtained target predicted traffic volume satisfying a preset replenishment condition, an associated scheduling device to execute a scheduling operation according to the inventory amount and the obtained target predicted traffic volume.
8. A device for generating a traffic prediction multi-time-sequence model, comprising:
an acquisition unit configured to acquire a historical traffic volume set of a target object in a preset time period;
an input unit configured to input the historical traffic volume set into each traffic prediction time sequence model in a traffic prediction time sequence model pool, and obtain a predicted traffic volume of the target object at each return time granularity in a preset return time period as a predicted traffic volume group, so as to form a predicted traffic volume group set, wherein the preset time period includes at least one periodic return time period of the preset return time period;
a determining unit configured to determine the predicted traffic volume group set and each historical traffic volume at each return time granularity in the historical traffic volume set as input parameters of a preset linearized objective function, so as to generate an objective function to be solved;
a generating unit configured to solve the objective function to be solved based on a constraint condition set corresponding to the preset linearized objective function to obtain a model weight coefficient set, wherein model weight coefficients in the model weight coefficient set correspond to the traffic prediction time sequence models in the traffic prediction time sequence model pool, and the constraint condition set comprises a constraint condition for constraining the number of models;
and a weighted combination unit configured to perform weighted combination processing on the traffic prediction time sequence models corresponding to the model weight coefficients in the model weight coefficient set according to the model weight coefficient set, to obtain a traffic prediction multi-time-sequence model.
9. An information transmission apparatus comprising:
a historical traffic volume set acquisition unit configured to acquire a historical traffic volume set of a target object within a preset time period;
a historical traffic volume set input unit configured to input the historical traffic volume set into a traffic prediction multi-time-sequence model, and obtain a predicted traffic volume of the target object at each prediction time granularity in a preset prediction time period as a target predicted traffic volume, wherein the traffic prediction multi-time-sequence model is generated by the method of any one of claims 1 to 4;
a transmitting unit configured to transmit the obtained target predicted traffic volume to an associated display device.
10. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon, which, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
11. A computer-readable medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any one of claims 1-7.
Background
Demand forecasting (Demand Forecast) is a very important link in the field of supply chains; together with inventory planning and supply chain execution, it is commonly referred to as one of the three lines of defense of the supply chain. Currently, when generating a predicted traffic volume, the following method is generally adopted: generating the traffic volume for a future period of time from historical traffic volume data and one or more selected time series models.
However, when generating the predicted traffic volume in the above manner, there are often technical problems as follows: when a single time sequence model is adopted, the multi-periodicity characteristics of a time sequence cannot be covered, so that the prediction accuracy and robustness are poor; when a plurality of time sequence models are adopted, each time sequence model needs to be selected in advance, the number of the time sequence models and the weight of each time sequence model need to be determined, the stability of a prediction result is poor, and the model determining process is complicated.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Some embodiments of the present disclosure propose a method, an information transmitting method, an apparatus, an electronic device, and a computer-readable medium for generating a multi-timing model for traffic prediction, to solve one or more of the technical problems mentioned in the above background section.
In a first aspect, some embodiments of the present disclosure provide a method for generating a traffic prediction multi-time-sequence model, where the method includes: acquiring a historical traffic volume set of a target object in a preset time period; inputting the historical traffic volume set into each traffic prediction time sequence model in a traffic prediction time sequence model pool, and obtaining a predicted traffic volume of the target object at each return time granularity in a preset return time period as a predicted traffic volume group, so as to form a predicted traffic volume group set, wherein the preset time period includes at least one periodic return time period of the preset return time period; determining the predicted traffic volume group set and each historical traffic volume at each return time granularity in the historical traffic volume set as input parameters of a preset linearized objective function, so as to generate an objective function to be solved; solving the objective function to be solved based on a constraint condition set corresponding to the preset linearized objective function to obtain a model weight coefficient set, wherein model weight coefficients in the model weight coefficient set correspond to the traffic prediction time sequence models in the traffic prediction time sequence model pool, and the constraint condition set includes a constraint condition for constraining the number of models; and performing, according to the model weight coefficient set, weighted combination processing on the traffic prediction time sequence models corresponding to the model weight coefficients in the model weight coefficient set, to obtain a traffic prediction multi-time-sequence model.
Optionally, before the solving of the objective function to be solved, the method further includes: generating, for each traffic prediction time sequence model, a model error based on the predicted traffic volume group corresponding to the traffic prediction time sequence model in the predicted traffic volume group set and each historical traffic volume at each return time granularity in the historical traffic volume set, so as to obtain a model error set.
Optionally, after generating a model error corresponding to each traffic prediction time sequence model to obtain a model error set, the method further includes: deleting, from the constraint condition set, each constraint condition for constraining the model weight coefficients, so as to update the constraint condition set.
Optionally, after the deleting, from the constraint condition set, of the constraint conditions for constraining the model weight coefficients, the method further includes: determining the model error set as input parameters of a first model weight function, a second model weight function and a third model weight function, to generate a first model weight function to be weighted, a second model weight function to be weighted and a third model weight function to be weighted; weighting the first model weight function to be weighted, the second model weight function to be weighted and the third model weight function to be weighted to obtain a weighted model weight function as a weight constraint condition; and adding the weight constraint condition to the constraint condition set to update the constraint condition set.
In a second aspect, some embodiments of the present disclosure provide an information sending method, including: acquiring a historical traffic volume set of a target object in a preset time period; inputting the historical traffic volume set into a traffic prediction multi-time-sequence model to obtain a predicted traffic volume of the target object at each prediction time granularity in a preset prediction time period as a target predicted traffic volume; and sending the obtained target predicted traffic volume to an associated display device.
Optionally, the obtaining the historical traffic volume set of the target object in the preset time period further includes: acquiring an inventory amount of the target object.
Optionally, the method further includes: controlling, in response to the inventory amount and the obtained target predicted traffic volume satisfying a preset replenishment condition, an associated scheduling device to execute a scheduling operation according to the inventory amount and the obtained target predicted traffic volume.
In a third aspect, some embodiments of the present disclosure provide a device for generating a traffic prediction multi-time-sequence model, the device including: an acquisition unit configured to acquire a historical traffic volume set of a target object in a preset time period; an input unit configured to input the historical traffic volume set into each traffic prediction time sequence model in a traffic prediction time sequence model pool, and obtain a predicted traffic volume of the target object at each return time granularity in a preset return time period as a predicted traffic volume group, so as to form a predicted traffic volume group set, wherein the preset time period includes at least one periodic return time period of the preset return time period; a determining unit configured to determine the predicted traffic volume group set and each historical traffic volume at each return time granularity in the historical traffic volume set as input parameters of a preset linearized objective function, so as to generate an objective function to be solved; a generating unit configured to solve the objective function to be solved based on a constraint condition set corresponding to the preset linearized objective function to obtain a model weight coefficient set, wherein model weight coefficients in the model weight coefficient set correspond to the traffic prediction time sequence models in the traffic prediction time sequence model pool, and the constraint condition set includes a constraint condition for constraining the number of models; and a weighted combination unit configured to perform weighted combination processing on the traffic prediction time sequence models corresponding to the model weight coefficients in the model weight coefficient set according to the model weight coefficient set, to obtain a traffic prediction multi-time-sequence model.
Optionally, before the generating unit, the apparatus further includes: a model error generation unit configured to generate, for each traffic prediction time sequence model in the traffic prediction time sequence model pool, a model error based on the predicted traffic volume group corresponding to that model in the predicted traffic volume group set and each historical traffic volume at each return time granularity in the historical traffic volume set, to obtain a model error set.
Optionally, after the model error generation unit, the apparatus further includes: a deleting unit configured to delete, from the constraint condition set, each constraint condition for constraining the model weight coefficients, so as to update the constraint condition set.
Optionally, after the deleting unit, the apparatus further includes: an input parameter determination unit, a weighting processing unit, and an adding unit. The input parameter determination unit is configured to determine the model error set as input parameters of a first model weight function, a second model weight function and a third model weight function, to generate a first model weight function to be weighted, a second model weight function to be weighted and a third model weight function to be weighted. The weighting processing unit is configured to perform weighting processing on the first model weight function to be weighted, the second model weight function to be weighted and the third model weight function to be weighted, to obtain a weighted model weight function as a weight constraint condition. The adding unit is configured to add the weight constraint condition to the constraint condition set to update the constraint condition set.

In a fourth aspect, some embodiments of the present disclosure provide an information sending apparatus, the apparatus including: a historical traffic volume set acquisition unit configured to acquire a historical traffic volume set of a target object within a preset time period; a historical traffic volume set input unit configured to input the historical traffic volume set into a traffic prediction multi-time-sequence model, and obtain a predicted traffic volume of the target object at each prediction time granularity within a preset prediction time period as a target predicted traffic volume; and a transmitting unit configured to transmit the obtained target predicted traffic volume to an associated display device.
Optionally, the historical traffic volume set acquisition unit further includes: an inventory amount acquisition unit configured to acquire an inventory amount of the target object.
Optionally, the apparatus further includes: a control unit configured to control, in response to the inventory amount and the obtained target predicted traffic volume satisfying a preset replenishment condition, an associated scheduling device to execute a scheduling operation according to the inventory amount and the obtained target predicted traffic volume.
In a fifth aspect, some embodiments of the present disclosure provide an electronic device, comprising: one or more processors; a storage device having one or more programs stored thereon, which when executed by one or more processors, cause the one or more processors to implement the method described in any of the implementations of the first or second aspects.
In a sixth aspect, some embodiments of the present disclosure provide a computer readable medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method described in any of the implementations of the first or second aspects.
The above embodiments of the present disclosure have the following advantages: the traffic prediction multi-time-sequence model obtained by the method for generating a traffic prediction multi-time-sequence model of some embodiments of the present disclosure improves prediction accuracy, robustness and stability of the prediction result, and simplifies the model determination process. Specifically, the reasons why prediction accuracy, robustness and prediction result stability are poor and the model determination process is complicated are as follows: when a single time sequence model is adopted, the multi-periodicity characteristics of a time series cannot be covered, so prediction accuracy and robustness are poor; when a plurality of time sequence models are adopted, each time sequence model needs to be selected in advance, and the number of time sequence models and the weight of each time sequence model need to be determined, so the stability of the prediction result is poor and the model determination process is complicated. Based on this, the method for generating a traffic prediction multi-time-sequence model according to some embodiments of the present disclosure first obtains a historical traffic volume set of a target item within a preset time period. Then, the historical traffic volume set is input into each traffic prediction time sequence model in a traffic prediction time sequence model pool, and a predicted traffic volume of the target item at each return time granularity in a preset return time period is obtained as a predicted traffic volume group, so as to form a predicted traffic volume group set. The preset time period includes at least one periodic return time period of the preset return time period. In this way, the traffic volume of the target item at each return time granularity can be predicted by each traffic prediction time sequence model in the traffic prediction time sequence model pool, and the predicted traffic volume group set obtained by prediction can be used as an input parameter for the subsequent weight solving of the objective function to be solved. Next, the predicted traffic volume group set and each historical traffic volume at each return time granularity in the historical traffic volume set are determined as input parameters of a preset linearized objective function, so as to generate an objective function to be solved. In this way, the objective function to be solved can serve as the objective condition when determining the model weight coefficients, so that the determined model weight coefficients form an optimal solution. Next, the objective function to be solved is solved based on a constraint condition set corresponding to the preset linearized objective function, to obtain a model weight coefficient set. The model weight coefficients in the model weight coefficient set correspond to the traffic prediction time sequence models in the traffic prediction time sequence model pool, and the constraint condition set includes a constraint condition for constraining the number of models. In this way, the model weight coefficient set obtained by the solving process can be an optimal solution while satisfying all constraint conditions in the constraint condition set.
Finally, according to the model weight coefficient set, weighted combination processing is performed on the traffic prediction time sequence models corresponding to the model weight coefficients in the model weight coefficient set, to obtain a traffic prediction multi-time-sequence model. In this way, the traffic prediction multi-time-sequence model corresponding to the target item can be obtained by weighted combination according to the optimal model weight coefficient set. Moreover, when the traffic prediction multi-time-sequence model is generated, the predicted traffic volume of the target item in the preset return time period is determined from the historical traffic volume set of the target item in the preset time period, so the multi-periodicity characteristics of the traffic volume of the target item over the historical time period can be covered, which improves prediction accuracy. In addition, because the traffic prediction multi-time-sequence model is generated specifically for the target item, the influence of the differing traffic volume variation characteristics of different items over historical time (for example, the influence of large differences in the traffic volumes of different items in different seasons) is reduced, which improves the robustness of prediction. Furthermore, because the selection of models and the determination of the model weight coefficients are both obtained by solving, whether a model is selected can be reflected by its model weight coefficient, so there is no need to select each time sequence model in advance or to determine the number of time sequence models and the weight of each time sequence model, which improves the stability of the prediction result and simplifies the model determination process.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements are not necessarily drawn to scale.
FIG. 1 is a schematic diagram of one application scenario of a method of generating a multi-temporal model for traffic prediction, according to some embodiments of the present disclosure;
fig. 2 is a schematic diagram of one application scenario of an information transmission method according to some embodiments of the present disclosure;
FIG. 3 is a flow diagram of some embodiments of a method of generating a multi-temporal model for traffic prediction according to the present disclosure;
fig. 4 is a flow chart of some embodiments of an information sending method according to the present disclosure;
FIG. 5 is a schematic block diagram of some embodiments of a traffic prediction multi-temporal model generation apparatus according to the present disclosure;
fig. 6 is a schematic structural diagram of some embodiments of an information transmitting apparatus according to the present disclosure;
FIG. 7 is a schematic structural diagram of an electronic device suitable for use in implementing some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings. The embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that references to "a", "an", and "the" modifications in this disclosure are intended to be illustrative rather than limiting, and that those skilled in the art will recognize that "one or more" may be used unless the context clearly dictates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 is a schematic diagram of an application scenario of a method for generating a multi-temporal model for traffic prediction according to some embodiments of the present disclosure.
In the application scenario of fig. 1, first, the computing device 101 may obtain a historical traffic volume set 102 of a target item over a preset time period. Then, the computing device 101 may input the historical traffic volume set 102 into each traffic prediction time sequence model in the traffic prediction time sequence model pool 103, and obtain a predicted traffic volume of the target item at each return time granularity within a preset return time period as a predicted traffic volume group, to form a predicted traffic volume group set 104. The preset time period includes at least one periodic return time period of the preset return time period. Thereafter, the computing device 101 may determine the predicted traffic volume group set 104 and each historical traffic volume 105 at each return time granularity in the historical traffic volume set 102 as input parameters of a preset linearized objective function 106, to generate an objective function to be solved 107. Next, the computing device 101 may solve the objective function 107 to be solved based on the constraint condition set 108 corresponding to the preset linearized objective function 106, to obtain a model weight coefficient set 109. The model weight coefficients in the model weight coefficient set 109 correspond to the traffic prediction time sequence models in the traffic prediction time sequence model pool 103. The constraint condition set 108 includes a constraint condition for constraining the number of models. Finally, the computing device 101 may perform weighted combination processing on the traffic prediction time sequence models corresponding to the model weight coefficients in the model weight coefficient set 109 according to the model weight coefficient set 109, to obtain a traffic prediction multi-time-sequence model 110.
The computing device 101 may be hardware or software. When the computing device is hardware, it may be implemented as a distributed cluster composed of multiple servers or terminal devices, or may be implemented as a single server or a single terminal device. When the computing device is embodied as software, it may be installed in the hardware devices enumerated above. It may be implemented, for example, as multiple software or software modules to provide distributed services, or as a single software or software module. And is not particularly limited herein.
It should be understood that the number of computing devices in FIG. 1 is merely illustrative. There may be any number of computing devices, as implementation needs dictate.
Fig. 2 is a schematic diagram of an application scenario of an information sending method according to some embodiments of the present disclosure.
In the application scenario of fig. 2, first, the computing device 201 may obtain a historical traffic volume set 202 of the target item over a preset time period. Then, the computing device 201 may input the historical traffic volume set 202 into the traffic prediction multi-time-sequence model 203, and obtain a predicted traffic volume of the target item at each prediction time granularity within a preset prediction time period as a target predicted traffic volume. For example, the traffic prediction multi-time-sequence model 203 may be the traffic prediction multi-time-sequence model 110 in fig. 1. Finally, the computing device 201 may send the obtained target predicted traffic volume 204 to the associated display device 205.
The computing device 201 may be hardware or software. When the computing device is hardware, it may be implemented as a distributed cluster composed of multiple servers or terminal devices, or may be implemented as a single server or a single terminal device. When the computing device is embodied as software, it may be installed in the hardware devices enumerated above. It may be implemented, for example, as multiple software or software modules to provide distributed services, or as a single software or software module. And is not particularly limited herein.
It should be understood that the number of computing devices in FIG. 2 is merely illustrative. There may be any number of computing devices, as implementation needs dictate.
The execution subjects of the traffic prediction multi-sequence model generation method and the information transmission method may be the same computing device or different computing devices.
With continued reference to fig. 3, a flow 300 of some embodiments of a method of generating a multi-temporal model for traffic prediction in accordance with the present disclosure is shown. The method for generating the flow transfer prediction multi-time sequence model comprises the following steps:
step 301, obtaining a historical flow traffic set of a target article in a preset time period.
In some embodiments, an execution subject of the method for generating a traffic prediction multi-time-sequence model (e.g., the computing device 101 shown in fig. 1 or the computing device 201 shown in fig. 2) may obtain, through a wired connection or a wireless connection, a historical traffic volume set of the target item within a preset time period from a terminal that stores the historical traffic volumes of the target item. In practice, the execution subject may obtain, from the terminal, each historical traffic volume whose corresponding article identifier is the same as the article identifier of the target item and whose traffic time falls within the preset time period, to obtain the historical traffic volume set. The target item may be a currently selected item. A historical traffic volume in the historical traffic volume set may be the traffic volume of the target item at a historical time (for example, it may be the sales volume of the target item at that historical time). The article identifier may be used to uniquely identify an article. The preset time period may be a preset time period that contains one or more periodic time periods. For example, the preset time period may be 2018/1/1-2020/12/31, its time granularity may be one day, and it may include the periodic time periods 2018/1/1-2019/1/1 and 2019/1/1-2020/1/1. It should be noted that the wireless connection may include, but is not limited to, a 3G/4G connection, a WiFi connection, a Bluetooth connection, a WiMAX connection, a ZigBee connection, a UWB (Ultra Wide Band) connection, and other wireless connections now known or developed in the future.
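A minimal sketch of this filtering step follows, assuming only that each record carries an item identifier, a traffic time and a traffic volume; the field names and sample data are illustrative, not taken from the disclosure.

```python
# Minimal sketch (not the disclosed implementation): filter historical traffic
# volume records by item identifier and by the preset time period.
# Field names ("item_id", "date", "volume") are illustrative assumptions.
from datetime import date

def get_historical_traffic(records, target_item_id, start, end):
    """Return the historical traffic volume set (ordered by date) for the
    target item whose traffic time falls inside [start, end]."""
    return [
        r["volume"]
        for r in sorted(records, key=lambda r: r["date"])
        if r["item_id"] == target_item_id and start <= r["date"] <= end
    ]

# Example usage with made-up records:
records = [
    {"item_id": "sku-1", "date": date(2018, 1, 1), "volume": 120},
    {"item_id": "sku-1", "date": date(2018, 1, 2), "volume": 95},
    {"item_id": "sku-2", "date": date(2018, 1, 1), "volume": 40},
]
history = get_historical_traffic(records, "sku-1",
                                 date(2018, 1, 1), date(2020, 12, 31))
```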
Step 302, inputting the historical traffic sets into each traffic prediction time sequence model in the traffic prediction time sequence model pool, and obtaining the predicted traffic of each return time granularity of the target object in a preset return time period as a predicted traffic group to form a predicted traffic group set.
In some embodiments, the execution subject may input the historical traffic volume set into each traffic prediction time sequence model in the traffic prediction time sequence model pool, and obtain a predicted traffic volume of the target item at each return time granularity within a preset return time period as a predicted traffic volume group, to form a predicted traffic volume group set. Each predicted traffic volume in a predicted traffic volume group corresponds to one return time granularity. Each predicted traffic volume group in the predicted traffic volume group set corresponds to one traffic prediction time sequence model. The traffic prediction time sequence model pool may be a collection of individual traffic prediction time sequence models. A traffic prediction time sequence model may be a time series model for predicting the traffic volume of an article. For example, the traffic prediction time sequence model pool may include, but is not limited to, the following time series models for predicting the traffic volume of an article: ETS (exponential smoothing), Holt-Winters (triple exponential smoothing), SES (simple exponential smoothing), SARIMA (seasonal autoregressive integrated moving average), and SA (simple average). The preset return time period may be a preset time period, within the preset time period, over which the traffic volume of the target item is predicted back for comparison. The preset time period includes at least one periodic return time period of the preset return time period, a periodic return time period being a periodic time period corresponding to the preset return time period. For example, the preset return time period may be 2020/12/1-2020/12/31, and the corresponding periodic return time periods may be 2018/12/1-2018/12/31 and 2019/12/1-2019/12/31. Each return time granularity may be a unit time into which the preset return time period is further divided; for example, the return time granularities may be the individual days within the preset return time period 2020/12/1-2020/12/31. In this way, the traffic volume of the target item at each return time granularity can be predicted by each traffic prediction time sequence model in the traffic prediction time sequence model pool, and the predicted traffic volume group set obtained by prediction can be used as an input parameter for the subsequent weight solving of the objective function to be solved.
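The sketch below shows, under stated assumptions, how such a model pool maps one historical traffic volume set to one predicted traffic volume group per model. Two deliberately simple hand-written forecasters (simple average and simple exponential smoothing) stand in for the listed models (ETS, Holt-Winters, SES, SARIMA, SA); they are illustrative only and do not reproduce the disclosed models.

```python
# Illustrative traffic prediction time sequence model pool: each model maps
# the historical traffic volume set to a predicted traffic volume group
# covering `horizon` return time granularities.

def simple_average_forecast(history, horizon):
    level = sum(history) / len(history)
    return [level] * horizon

def ses_forecast(history, horizon, alpha=0.3):
    # Simple exponential smoothing: the flat forecast is the last smoothed level.
    level = history[0]
    for y in history[1:]:
        level = alpha * y + (1 - alpha) * level
    return [level] * horizon

model_pool = {"SA": simple_average_forecast, "SES": ses_forecast}

def predict_traffic_groups(history, horizon):
    """One predicted traffic volume group per model in the pool."""
    return {name: model(history, horizon) for name, model in model_pool.items()}

predicted_groups = predict_traffic_groups(history=[100, 96, 104, 99, 101],
                                          horizon=31)  # e.g. 2020/12/1-2020/12/31
```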
Step 303, determining each historical traffic of each return time granularity in the prediction traffic group set and the historical traffic set as an input parameter of a preset linearized objective function, so as to generate an objective function to be solved.
In some embodiments, the execution subject may determine the predicted traffic volume group set and each historical traffic volume at each return time granularity in the historical traffic volume set as input parameters of a preset linearized objective function, so as to generate an objective function to be solved. The preset linearized objective function may be an objective function obtained by linearizing a preset objective function. For example, the preset objective function may be a function that minimizes the MAPE (Mean Absolute Percentage Error) value, and may be expressed as:

$$\min_{\omega}\ \frac{1}{H}\sum_{t=1}^{H}\frac{\left|\sum_{i=1}^{N}\omega_{i}\,\hat{y}_{i,t}-y_{t}^{hist}\right|}{y_{t}^{hist}}$$

where H represents the number of return time granularities included in the preset return time period, t represents the index of a return time granularity, N represents the number of traffic prediction time sequence models included in the traffic prediction time sequence model pool, i represents the index of a traffic prediction time sequence model in the traffic prediction time sequence model pool, ω_i represents the model weight coefficient of the ith traffic prediction time sequence model, ŷ_{i,t} represents the predicted traffic volume of the target item generated by the ith traffic prediction time sequence model at the tth return time granularity, and y_t^{hist} represents the historical traffic volume of the target item at the tth return time granularity.

The preset linearized objective function, that is, the objective function after linearization, may be:

$$\min_{\omega,\,z}\ \frac{1}{H}\sum_{t=1}^{H}z_{t},\qquad z_{t}\ge\frac{\sum_{i=1}^{N}\omega_{i}\,\hat{y}_{i,t}-y_{t}^{hist}}{y_{t}^{hist}},\qquad z_{t}\ge-\frac{\sum_{i=1}^{N}\omega_{i}\,\hat{y}_{i,t}-y_{t}^{hist}}{y_{t}^{hist}}$$

where z_t represents an auxiliary variable bounding the absolute percentage error at the tth return time granularity; at the optimum, z_t equals that absolute percentage error, so the linearized objective is equivalent to the original MAPE objective.
Through step 303, the generated objective function to be solved may be used as an objective condition when determining the model weight coefficient, so that the determined model weight coefficient is an optimal solution.
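As a quick check of the linearization, the following small worked example (illustrative numbers only, not taken from the disclosure) evaluates the objective for a fixed candidate weight vector; at the optimum of the linearized problem each z_t equals the corresponding absolute percentage error, so both objectives coincide.

```latex
% Illustrative numbers only: N = 2 models, H = 2 return time granularities,
% \hat{y}_{1,t} = (100, 120), \hat{y}_{2,t} = (90, 110), y_t^{hist} = (95, 115),
% candidate weights \omega = (0.6, 0.4).
\begin{aligned}
\sum_i \omega_i \hat{y}_{i,1} &= 0.6\cdot 100 + 0.4\cdot 90 = 96,
& z_1 &= \tfrac{|96-95|}{95} \approx 0.0105,\\
\sum_i \omega_i \hat{y}_{i,2} &= 0.6\cdot 120 + 0.4\cdot 110 = 116,
& z_2 &= \tfrac{|116-115|}{115} \approx 0.0087,\\
\tfrac{1}{H}\sum_t z_t &\approx 0.0096 & &\text{(MAPE of about } 0.96\%\text{)}.
\end{aligned}
```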
Step 304, solving the objective function to be solved based on the constraint condition set corresponding to the preset linearized objective function to obtain a model weight coefficient set.
In some embodiments, the execution subject may solve the objective function to be solved based on the constraint condition set corresponding to the preset linearized objective function, so as to obtain a model weight coefficient set. The constraint condition set may be a set of conditions for constraining the values of the coefficients in the preset linearized objective function. The constraint condition set includes a constraint condition for constraining the number of models. The model weight coefficients in the model weight coefficient set correspond to the traffic prediction time sequence models in the traffic prediction time sequence model pool. For example, the constraint condition set may include constraints expressed as the following mathematical models:

$$\omega_{i}\in[0,1] \quad (1)$$
$$\sum_{i=1}^{N}\omega_{i}=1 \quad (2)$$
$$x_{i}\le Q\,\omega_{i} \quad (3)$$
$$x_{i}\in\{0,1\} \quad (4)$$
$$\sum_{i=1}^{N}x_{i}\ge 1 \quad (5)$$
$$\sum_{i=1}^{N}x_{i}\le P \quad (6)$$
$$\omega_{i}\le x_{i} \quad (7)$$

where x_i is a selection variable indicating whether the ith traffic prediction time sequence model is selected, Q is an arbitrarily large integer, and P represents a threshold on the number of selected traffic prediction time sequence models, with P less than or equal to N.

Mathematical models (3) and (7) indicate that if the ith traffic prediction time sequence model is selected, ω_i is greater than 0, and if the ith traffic prediction time sequence model is not selected, ω_i is equal to 0. Mathematical model (5) indicates that at least 1 traffic prediction time sequence model is selected. Mathematical model (6) indicates that at most P traffic prediction time sequence models are selected. Mathematical model (6) is the constraint condition for constraining the number of models.
In practice, the execution subject may call an Application Programming Interface (API) of a preset solver to solve the objective function to be solved, so as to obtain the model weight coefficient set. For example, the preset solver may be a Gurobi solver or a CPLEX solver. In this way, the model weight coefficient set obtained by the solving process can be an optimal solution while satisfying all constraint conditions in the constraint condition set.
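A minimal sketch of this solving step is given below, assuming the objective and constraint set are as reconstructed above (weight coefficients ω_i, selection variables x_i, auxiliary variables z_t). It uses the gurobipy API of the Gurobi solver named in the text; the prediction and history values, P, and Q are illustrative. A CPLEX-based model would be analogous.

```python
# Sketch: solve for the model weight coefficient set with Gurobi (gurobipy).
import gurobipy as gp
from gurobipy import GRB

y_hat = [[100, 120], [90, 110]]   # y_hat[i][t]: prediction of model i at granularity t
y_hist = [95, 115]                # historical traffic volume per return granularity
N, H, P, Q = len(y_hat), len(y_hist), 2, 10**6   # Q: arbitrarily large integer

m = gp.Model("traffic_weight_selection")
w = m.addVars(N, lb=0.0, ub=1.0, name="w")       # model weight coefficients, (1)
x = m.addVars(N, vtype=GRB.BINARY, name="x")     # selection variables, (4)
z = m.addVars(H, lb=0.0, name="z")               # absolute-percentage-error bounds

m.addConstr(w.sum() == 1, name="sum_w")                                # (2)
m.addConstrs((x[i] <= Q * w[i] for i in range(N)), name="link_lo")     # (3)
m.addConstrs((w[i] <= x[i] for i in range(N)), name="link_up")         # (7)
m.addConstr(x.sum() >= 1, name="at_least_one")                         # (5)
m.addConstr(x.sum() <= P, name="at_most_P")  # (6): constrains the number of models

for t in range(H):
    combined = gp.quicksum(w[i] * y_hat[i][t] for i in range(N))
    # z_t bounds the absolute percentage error |combined - y_hist| / y_hist
    m.addConstr(y_hist[t] * z[t] >= combined - y_hist[t])
    m.addConstr(y_hist[t] * z[t] >= y_hist[t] - combined)

m.setObjective((1.0 / H) * z.sum(), GRB.MINIMIZE)   # linearized MAPE
m.optimize()
weights = [w[i].X for i in range(N)]                # the model weight coefficient set
```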
Optionally, before step 304, the execution subject may generate, for each traffic prediction time sequence model, a model error based on the predicted traffic volume group corresponding to that model in the predicted traffic volume group set and each historical traffic volume at each return time granularity in the historical traffic volume set, to obtain a model error set. In practice, the execution subject may generate the model error corresponding to a traffic prediction time sequence model using the Root Mean Square Error (RMSE) formula. The smaller the model error, the better the prediction effect of the corresponding traffic prediction time sequence model; the obtained model error set can then be used to adjust the model weight coefficients during the solving process.
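The following sketch applies the standard RMSE formula to obtain one model error per traffic prediction time sequence model; the history and predicted groups shown are illustrative values.

```python
# Sketch of the model error computation: one RMSE per model, comparing its
# predicted traffic volume group with the historical traffic volumes at the
# same return time granularities.
import math

def rmse(predicted_group, historical):
    return math.sqrt(
        sum((p - y) ** 2 for p, y in zip(predicted_group, historical))
        / len(historical)
    )

y_hist = [95, 115]                                      # illustrative history
predicted_groups = {"SA": [100, 120], "SES": [90, 110]}  # illustrative predictions
model_errors = {name: rmse(group, y_hist)               # the model error set
                for name, group in predicted_groups.items()}
```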
Optionally, after generating the model errors corresponding to the traffic prediction time sequence models to obtain the model error set, the execution subject may delete, from the constraint condition set, each constraint condition for constraining the model weight coefficients, so as to update the constraint condition set. For example, the execution subject may delete mathematical models (1), (2), (3) and (7) and thereby update the constraint condition set, so that the updated constraint condition set includes mathematical models (4), (5) and (6). In this way, the constraint condition set can be updated to accommodate the insertion of constraint conditions associated with the model errors.
Optionally, after deleting from the constraint condition set the constraint conditions for constraining the model weight coefficients, the execution subject may first determine the model error set as input parameters of a first model weight function, a second model weight function and a third model weight function, so as to generate a first model weight function to be weighted, a second model weight function to be weighted and a third model weight function to be weighted. The first model weight function, the second model weight function and the third model weight function may be functions for determining model weight coefficients from model errors. For example, the first model weight function to be weighted is an expression in the model error acc_i of the ith traffic prediction time sequence model and the base e of the natural logarithm, and yields a candidate weight for the ith model.
The second model weight function to be weighted is likewise an expression in the model errors that yields a candidate weight for each traffic prediction time sequence model.
The third model weight function to be weighted is also an expression in the model errors and the base e of the natural logarithm that yields a candidate weight for each traffic prediction time sequence model.
Then, the execution subject may perform weighting processing on the first model weight function to be weighted, the second model weight function to be weighted and the third model weight function to be weighted, to obtain a weighted model weight function as the weight constraint condition. For example, the weighting processing may use the following formula:

$$\omega_{i}=h_{1}\,\tilde{\omega}_{i}^{(1)}+h_{2}\,\tilde{\omega}_{i}^{(2)}+h_{3}\,\tilde{\omega}_{i}^{(3)}$$

where ω_i denotes the weighted model weight function of the ith traffic prediction time sequence model, the three terms on the right are the first, second and third model weight functions to be weighted evaluated for that model, and h_1, h_2 and h_3 are preset weighting coefficients (for example, non-negative values summing to 1).
Finally, the execution subject may add the weight constraint condition to the constraint condition set to update the constraint condition set. The updated constraint condition set then comprises mathematical models (4), (5) and (6) together with the weight constraint condition above.
therefore, the constraint condition set can be updated, so that the constraint conditions related to the model errors are inserted into the constraint condition set, the adjustment effect of the model errors on the model weight coefficients can be considered when the model weight coefficients are solved, and the prediction accuracy of the multi-time-series model for the flow quantity prediction can be improved through the generated model weight coefficient set.
Step 305, performing weighted combination processing on the traffic prediction time sequence models corresponding to the model weight coefficients in the model weight coefficient set according to the model weight coefficient set, to obtain a traffic prediction multi-time-sequence model.
In some embodiments, the execution subject may perform weighted combination processing on the traffic prediction time sequence models corresponding to the model weight coefficients in the model weight coefficient set according to the model weight coefficient set, to obtain a traffic prediction multi-time-sequence model. The number of traffic prediction time sequence models weighted into the traffic prediction multi-time-sequence model satisfies the constraint condition for constraining the number of models. In practice, the execution subject may first multiply each model weight coefficient by the corresponding traffic prediction time sequence model, and then add the weighted traffic prediction time sequence models to obtain the traffic prediction multi-time-sequence model. It should be understood that the addition here refers to adding the prediction results of the individual traffic prediction time sequence models, that is, adding the individual weighted predicted traffic volumes. In this way, the traffic prediction multi-time-sequence model corresponding to the target item can be obtained by weighted combination according to the optimal model weight coefficient set.
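A short sketch of this combination step follows: the multi-time-sequence model's forecast at each time granularity is the weight-coefficient-weighted sum of the individual models' forecasts, so models with a zero coefficient drop out. The coefficient and forecast values are illustrative.

```python
# Sketch of the weighted combination step.
def combined_forecast(weight_coefficients, model_forecasts):
    """weight_coefficients: {model_name: w_i};
    model_forecasts: {model_name: [forecast per time granularity]}."""
    horizon = len(next(iter(model_forecasts.values())))
    return [
        sum(weight_coefficients[name] * model_forecasts[name][t]
            for name in model_forecasts)
        for t in range(horizon)
    ]

forecast = combined_forecast({"SA": 0.0, "SES": 1.0},        # illustrative coefficients
                             {"SA": [100, 120], "SES": [90, 110]})
# -> [90.0, 110.0]: only the selected (non-zero-weight) model contributes.
```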
The above embodiments of the present disclosure have the following advantages: the traffic prediction multi-time-sequence model obtained by the method for generating a traffic prediction multi-time-sequence model of some embodiments of the present disclosure improves prediction accuracy, robustness and stability of the prediction result, and simplifies the model determination process. Specifically, the reasons why prediction accuracy, robustness and prediction result stability are poor and the model determination process is complicated are as follows: when a single time sequence model is adopted, the multi-periodicity characteristics of a time series cannot be covered, so prediction accuracy and robustness are poor; when a plurality of time sequence models are adopted, each time sequence model needs to be selected in advance, and the number of time sequence models and the weight of each time sequence model need to be determined, so the stability of the prediction result is poor and the model determination process is complicated. Based on this, the method for generating a traffic prediction multi-time-sequence model according to some embodiments of the present disclosure first obtains a historical traffic volume set of a target item within a preset time period. Then, the historical traffic volume set is input into each traffic prediction time sequence model in a traffic prediction time sequence model pool, and a predicted traffic volume of the target item at each return time granularity in a preset return time period is obtained as a predicted traffic volume group, so as to form a predicted traffic volume group set. The preset time period includes at least one periodic return time period of the preset return time period. In this way, the traffic volume of the target item at each return time granularity can be predicted by each traffic prediction time sequence model in the traffic prediction time sequence model pool, and the predicted traffic volume group set obtained by prediction can be used as an input parameter for the subsequent weight solving of the objective function to be solved. Next, the predicted traffic volume group set and each historical traffic volume at each return time granularity in the historical traffic volume set are determined as input parameters of a preset linearized objective function, so as to generate an objective function to be solved. In this way, the objective function to be solved can serve as the objective condition when determining the model weight coefficients, so that the determined model weight coefficients form an optimal solution. Next, the objective function to be solved is solved based on a constraint condition set corresponding to the preset linearized objective function, to obtain a model weight coefficient set. The model weight coefficients in the model weight coefficient set correspond to the traffic prediction time sequence models in the traffic prediction time sequence model pool, and the constraint condition set includes a constraint condition for constraining the number of models. In this way, the model weight coefficient set obtained by the solving process can be an optimal solution while satisfying all constraint conditions in the constraint condition set.
Finally, according to the model weight coefficient set, weighted combination processing is performed on the traffic prediction time sequence models corresponding to the model weight coefficients in the model weight coefficient set, to obtain a traffic prediction multi-time-sequence model. In this way, the traffic prediction multi-time-sequence model corresponding to the target item can be obtained by weighted combination according to the optimal model weight coefficient set. Moreover, when the traffic prediction multi-time-sequence model is generated, the predicted traffic volume of the target item in the preset return time period is determined from the historical traffic volume set of the target item in the preset time period, so the multi-periodicity characteristics of the traffic volume of the target item over the historical time period can be covered, which improves prediction accuracy. In addition, because the traffic prediction multi-time-sequence model is generated specifically for the target item, the influence of the differing traffic volume variation characteristics of different items over historical time (for example, the influence of large differences in the traffic volumes of different items in different seasons) is reduced, which improves the robustness of prediction. Furthermore, because the selection of models and the determination of the model weight coefficients are both obtained by solving, whether a model is selected can be reflected by its model weight coefficient, so there is no need to select each time sequence model in advance or to determine the number of time sequence models and the weight of each time sequence model, which improves the stability of the prediction result and simplifies the model determination process.
With further reference to fig. 4, a flow 400 of some embodiments of an information sending method is shown. The process 400 of the information sending method includes the following steps:
step 401, obtaining a historical flow traffic set of a target article in a preset time period.
In some embodiments, an executing subject of the information sending method (for example, the computing device 101 shown in fig. 1 or the computing device 201 shown in fig. 2) may obtain, through a wired connection manner or a wireless connection manner, a historical traffic set of the target item in a preset time period from a terminal storing the historical traffic of the target item.
Optionally, the executing main body may further obtain an inventory amount of the target item. The inventory may be a remaining amount of the target item in at least one warehouse for storing the target item.
Step 402, inputting the historical traffic sets into a traffic prediction multi-time sequence model, and obtaining a predicted traffic of each predicted time granularity of the target object in a preset prediction time period as a target predicted traffic.
In some embodiments, the execution subject may input the historical traffic volume set into a traffic prediction multi-time-sequence model, and obtain a predicted traffic volume of the target item at each prediction time granularity within a preset prediction time period as a target predicted traffic volume. The traffic prediction multi-time-sequence model may be the traffic prediction multi-time-sequence model obtained through steps 301-305 in the embodiments corresponding to fig. 3. In this way, the accuracy of the target predicted traffic volume can be improved by the traffic prediction multi-time-sequence model. Moreover, because the traffic prediction multi-time-sequence model is generated specifically for the target item, the influence of the differing traffic volume variation characteristics of different items over historical time is reduced, which improves the robustness of prediction.
Step 403, sending the obtained target predicted traffic volume to the associated display device.
In some embodiments, the execution subject may send the obtained target predicted traffic volume to the associated display device. The display device may be a device that has a display function and is associated with the execution subject. In this way, the target predicted traffic volume can be displayed.
Optionally, the execution subject may, in response to the inventory amount and the obtained target predicted traffic volumes satisfying a preset replenishment condition, control an associated scheduling device to execute a scheduling operation according to the inventory amount and the obtained target predicted traffic volumes. The preset replenishment condition may be that the difference between the sum of the obtained target predicted traffic volumes and the inventory amount is greater than a preset threshold; the specific value of the preset threshold is not limited here. The scheduling device may be a device with the function of dispatching goods, for example an unmanned vehicle. The scheduling operation may be an operation performed by the scheduling device to dispatch articles. In practice, the execution subject may first determine the difference between the sum of the obtained target predicted traffic volumes and the inventory amount as the scheduling amount, and then control the scheduling device to dispatch the scheduling amount of the target item. In this way, when the current inventory amount is insufficient, the target item can be dispatched according to the obtained target predicted traffic volumes to replenish the inventory in advance.
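A hedged sketch of this replenishment check follows: the condition is the one described above (predicted demand minus inventory exceeds a preset threshold), while the threshold value and the dispatch() call on the scheduling device are hypothetical placeholders, not part of the disclosure.

```python
# Sketch of the replenishment check and scheduling amount computation.
def check_and_schedule(target_predicted_volumes, inventory, scheduler, threshold=0):
    demand = sum(target_predicted_volumes)
    shortfall = demand - inventory
    if shortfall > threshold:                 # preset replenishment condition
        scheduler.dispatch(amount=shortfall)  # dispatch the scheduling amount
        return shortfall
    return 0

class PrintScheduler:                         # stand-in for the scheduling device
    def dispatch(self, amount):
        print(f"dispatching {amount} units")

check_and_schedule([90.0, 110.0, 105.0], inventory=250, scheduler=PrintScheduler())
# demand = 305, shortfall = 55 > 0 -> dispatch 55 units of the target item
```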
The above embodiments of the present disclosure have the following advantages: first, a historical traffic volume set of a target item within a preset time period is obtained. Then, the historical traffic volume set is input into a traffic prediction multi-time-sequence model to obtain a predicted traffic volume of the target item at each prediction time granularity within a preset prediction time period as a target predicted traffic volume. In this way, the accuracy of the target predicted traffic volume can be improved by the traffic prediction multi-time-sequence model. Moreover, because the traffic prediction multi-time-sequence model is generated specifically for the target item, the influence of the differing traffic volume variation characteristics of different items over historical time is reduced, which improves the robustness of prediction. Finally, the obtained target predicted traffic volume is sent to the associated display device, so that the target predicted traffic volume can be displayed.
With further reference to fig. 5, as an implementation of the method shown in fig. 3, the present disclosure provides some embodiments of an apparatus for generating a traffic prediction multi-time-series model. These apparatus embodiments correspond to the method embodiments shown in fig. 3, and the apparatus may be applied to various electronic devices.
As shown in fig. 5, the apparatus 500 for generating a traffic prediction multi-time-series model according to some embodiments includes: an acquisition unit 501, an input unit 502, a determination unit 503, a generation unit 504, and a weighted combination unit 505. The acquisition unit 501 is configured to acquire a historical traffic volume set of a target item within a preset time period. The input unit 502 is configured to input the historical traffic volume set into each traffic prediction time series model in a traffic prediction time series model pool, and obtain a predicted traffic volume of the target item at each return time granularity within a preset return time period as a predicted traffic volume group, so as to form a predicted traffic volume group set, wherein the preset time period includes at least one periodic return time period of the preset return time period. The determination unit 503 is configured to determine the predicted traffic volume group set and each historical traffic volume at each return time granularity in the historical traffic volume set as input parameters of a preset linearized objective function, so as to generate an objective function to be solved. The generation unit 504 is configured to solve the objective function to be solved based on a constraint condition set corresponding to the preset linearized objective function, so as to obtain a model weight coefficient set, where the model weight coefficients in the model weight coefficient set correspond to the traffic prediction time series models in the traffic prediction time series model pool, and the constraint condition set includes a constraint condition characterizing a constraint on the number of models. The weighted combination unit 505 is configured to perform, according to the model weight coefficient set, weighted combination processing on the traffic prediction time series model corresponding to each model weight coefficient in the model weight coefficient set, so as to obtain a traffic prediction multi-time-series model.
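For a concrete picture of the solving step performed by the generation unit 504, the sketch below poses it as a small mixed-integer linear program: the linearized objective minimizes the total absolute deviation between the weighted base-model forecasts and the historical traffic volumes, the weights form a convex combination, and binary selectors bound the number of models that may receive nonzero weight. The exact objective, constraint forms, and solver are not fixed by the disclosure; the use of scipy and every name below are assumptions.

import numpy as np
from scipy.optimize import LinearConstraint, Bounds, milp

def solve_model_weights(P, y, max_models):
    # P : (K, T) predicted traffic volumes, one row per base time series model.
    # y : (T,)   historical traffic volumes at the same return time granularities.
    # Returns the model weight coefficient set w (length K).
    # Variable layout: [w_1..w_K, e_1..e_T, z_1..z_K], with z_k binary selectors.
    K, T = P.shape
    n = 2 * K + T
    c = np.concatenate([np.zeros(K), np.ones(T), np.zeros(K)])  # minimize sum of errors e_t

    I_T, I_K = np.eye(T), np.eye(K)
    Z_TK, Z_KT = np.zeros((T, K)), np.zeros((K, T))
    constraints = [
        # linearized absolute error: P.T @ w - y <= e and y - P.T @ w <= e
        LinearConstraint(np.hstack([P.T, -I_T, Z_TK]), ub=y),
        LinearConstraint(np.hstack([-P.T, -I_T, Z_TK]), ub=-y),
        # weights sum to one
        LinearConstraint(np.concatenate([np.ones(K), np.zeros(T + K)]), lb=1, ub=1),
        # w_k can be nonzero only if model k is selected (w_k <= z_k)
        LinearConstraint(np.hstack([I_K, Z_KT, -I_K]), ub=np.zeros(K)),
        # at most `max_models` base models may be selected
        LinearConstraint(np.concatenate([np.zeros(K + T), np.ones(K)]), ub=max_models),
    ]
    bounds = Bounds(lb=np.zeros(n),
                    ub=np.concatenate([np.ones(K), np.full(T, np.inf), np.ones(K)]))
    integrality = np.concatenate([np.zeros(K + T), np.ones(K)])  # only z_k are integer

    result = milp(c=c, constraints=constraints, bounds=bounds, integrality=integrality)
    return result.x[:K]

P = np.array([[100, 105, 110, 120],
              [ 98, 108, 112, 118],
              [ 90, 100, 105, 125]], dtype=float)
y = np.array([99, 107, 111, 119], dtype=float)
print(solve_model_weights(P, y, max_models=2))

Coupling each weight to a binary selector through w_k <= z_k is one common way to express the model-count constraint; other formulations could serve equally well.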
In an optional implementation of some embodiments, before the generation unit 504, the apparatus 500 for generating a traffic prediction multi-time-series model may further include: a model error generation unit (not shown in the figure) configured to generate, based on the predicted traffic volume group corresponding to each traffic prediction time series model in the predicted traffic volume group set and each historical traffic volume at each return time granularity in the historical traffic volume set, a model error corresponding to the traffic prediction time series model, so as to obtain a model error set.
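As a small illustrative aside, the model error for each base model could be computed as follows; the choice of mean absolute error is an assumption, since the disclosure does not name a specific error metric.

import numpy as np

def model_error_set(P, y):
    # P: (K, T) predicted traffic volume groups, one row per base model.
    # y: (T,)   historical traffic volumes at the same return time granularities.
    # Returns one error value per traffic prediction time series model.
    return np.abs(P - y).mean(axis=1)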
In an optional implementation of some embodiments, after the model error generation unit, the apparatus 500 for generating a traffic prediction multi-time-series model may further include: a deletion unit (not shown in the figure) configured to delete, from the constraint condition set, each constraint condition characterizing a constraint on the model weight coefficients, so as to update the constraint condition set.
In an optional implementation of some embodiments, after the deletion unit, the apparatus 500 for generating a traffic prediction multi-time-series model may further include: an input parameter determination unit, a weighting processing unit, and an adding unit (not shown in the figure). The input parameter determination unit is configured to determine the model error set as input parameters of a first model weight function, a second model weight function, and a third model weight function, so as to generate a first model weight function to be weighted, a second model weight function to be weighted, and a third model weight function to be weighted. The weighting processing unit is configured to perform weighting processing on the first model weight function to be weighted, the second model weight function to be weighted, and the third model weight function to be weighted to obtain a weighted model weight function as a weight constraint condition. The adding unit is configured to add the weight constraint condition to the constraint condition set to update the constraint condition set.
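The following heavily hedged sketch shows one possible reading of the three model weight functions and the resulting weight constraint condition. The inverse-error, softmax, and rank-based forms, the equal mixing weights, and the idea of lower-bounding each model weight by a fraction of the combined value are all hypothetical choices; the disclosure leaves these functions unspecified.

import numpy as np
from scipy.optimize import LinearConstraint

def combined_weight_function(errors, mix=(1/3, 1/3, 1/3)):
    errors = np.asarray(errors, dtype=float)
    inv = 1.0 / (errors + 1e-9)                    # first model weight function: inverse error
    inv /= inv.sum()
    soft = np.exp(-errors)                         # second model weight function: softmax of negative error
    soft /= soft.sum()
    rank = np.argsort(np.argsort(-errors)) + 1.0   # third model weight function: lower error, higher rank
    rank /= rank.sum()
    return mix[0] * inv + mix[1] * soft + mix[2] * rank  # weighted model weight function

def weight_constraint(errors, K, T, floor_fraction=0.5):
    # Lower-bound each w_k by a fraction of its combined weight-function value,
    # expressed over the [w, e, z] variable layout used in the earlier MILP sketch.
    ref = combined_weight_function(errors)
    A = np.hstack([np.eye(K), np.zeros((K, T + K))])
    return LinearConstraint(A, lb=floor_fraction * ref)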
It will be understood that the elements described in the apparatus 500 correspond to various steps in the method described with reference to fig. 3. Thus, the operations, features and resulting advantages described above with respect to the method are also applicable to the apparatus 500 and the units included therein, and are not described herein again.
With further reference to fig. 6, as an implementation of the method shown in fig. 4, the present disclosure provides some embodiments of an information sending apparatus. These apparatus embodiments correspond to the method embodiments shown in fig. 4, and the apparatus may specifically be applied to various electronic devices.
As shown in fig. 6, the information sending apparatus 600 of some embodiments includes: a historical traffic volume set acquisition unit 601, a historical traffic volume set input unit 602, and a sending unit 603. The historical traffic volume set acquisition unit 601 is configured to acquire a historical traffic volume set of a target item within a preset time period. The historical traffic volume set input unit 602 is configured to input the historical traffic volume set into a traffic prediction multi-time-series model, and obtain a predicted traffic volume of the target item at each prediction time granularity within a preset prediction time period as a target predicted traffic volume. The sending unit 603 is configured to send the obtained target predicted traffic volume to the associated display device.
In an optional implementation of some embodiments, the historical traffic volume set acquisition unit 601 may further include: an inventory quantity acquisition unit (not shown in the figure) configured to acquire an inventory quantity of the target item.
In an optional implementation of some embodiments, the information sending apparatus 600 may further include: a control unit (not shown in the figure) configured to, in response to the inventory quantity and the obtained target predicted traffic volume satisfying a preset replenishment condition, control an associated scheduling device to perform a scheduling operation according to the inventory quantity and the obtained target predicted traffic volume.
It will be understood that the elements described in the apparatus 600 correspond to various steps in the method described with reference to fig. 4. Thus, the operations, features and resulting advantages described above with respect to the method are also applicable to the apparatus 600 and the units included therein, and are not described herein again.
Referring now to FIG. 7, a block diagram of an electronic device (e.g., computing device 101 of FIG. 1 or computing device 201 of FIG. 2) 700 suitable for use in implementing some embodiments of the present disclosure is shown. The electronic device shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 7, the electronic device 700 may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 701 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 702 or a program loaded from a storage device 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data necessary for the operation of the electronic device 700 are also stored. The processing device 701, the ROM 702, and the RAM 703 are connected to each other by a bus 704. An input/output (I/O) interface 705 is also connected to the bus 704.
Generally, the following devices may be connected to the I/O interface 705: input devices 706 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 707 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 708 including, for example, magnetic tape, hard disk, etc.; and a communication device 709. The communication means 709 may allow the electronic device 700 to communicate wirelessly or by wire with other devices to exchange data. While fig. 7 illustrates an electronic device 700 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 7 may represent one device or may represent multiple devices as desired.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In some such embodiments, the computer program may be downloaded and installed from a network via communications means 709, or may be installed from storage 708, or may be installed from ROM 702. The computer program, when executed by the processing device 701, performs the above-described functions defined in the methods of some embodiments of the present disclosure.
It should be noted that the computer readable medium described in some embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communications network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device, or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquire a historical traffic volume set of a target item within a preset time period; input the historical traffic volume set into each traffic prediction time series model in a traffic prediction time series model pool, and obtain a predicted traffic volume of the target item at each return time granularity within a preset return time period as a predicted traffic volume group, so as to form a predicted traffic volume group set, wherein the preset time period includes at least one periodic return time period of the preset return time period; determine the predicted traffic volume group set and each historical traffic volume at each return time granularity in the historical traffic volume set as input parameters of a preset linearized objective function, so as to generate an objective function to be solved; solve the objective function to be solved based on a constraint condition set corresponding to the preset linearized objective function to obtain a model weight coefficient set, wherein the model weight coefficients in the model weight coefficient set correspond to the traffic prediction time series models in the traffic prediction time series model pool, and the constraint condition set includes a constraint condition characterizing a constraint on the number of models; and perform, according to the model weight coefficient set, weighted combination processing on the traffic prediction time series model corresponding to each model weight coefficient in the model weight coefficient set, so as to obtain a traffic prediction multi-time-series model.
Computer program code for carrying out operations of embodiments of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in some embodiments of the present disclosure may be implemented by software, and may also be implemented by hardware. The described units may also be provided in a processor, and may be described as: a processor includes an acquisition unit, an input unit, a determination unit, a generation unit, and a weighted combination unit. The names of the units do not form a limitation on the units themselves in some cases, for example, the acquiring unit may also be described as a unit for acquiring a historical flow traffic set of the target item in a preset time period.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
The foregoing description is merely a description of preferred embodiments of the present disclosure and of the technical principles employed. It will be appreciated by those skilled in the art that the scope of the invention involved in the embodiments of the present disclosure is not limited to technical solutions formed by the specific combination of the above technical features, and should also cover other technical solutions formed by any combination of the above technical features or their equivalents without departing from the above inventive concept, for example, technical solutions formed by replacing the above features with (but not limited to) technical features having similar functions disclosed in the embodiments of the present disclosure.