Coarse ranking method and apparatus

Document No.: 7851  Published: 2021-09-17

1. A method of coarse ranking, the method comprising:

acquiring a history entry set of a user, and determining various interest point information of the user according to the history entry set;

acquiring a candidate item set, and determining the relevance scores of each candidate item in the candidate item set and the multiple kinds of interest point information respectively;

determining a final relevance score of each candidate item from the relevance scores;

and selecting a target candidate item for fine sorting from the candidate item set according to the final relevance score of each candidate item.

2. The method of claim 1, wherein determining a plurality of point of interest information for the user from the set of history entries comprises:

generating a history item vector of each history item;

and inputting each historical entry vector into a pre-trained interest network, and obtaining various interest point information output by the interest network.

3. The method of claim 2, wherein the interest network comprises a first fully connected network and a second fully connected network, and wherein the interest point information comprises an initial interest point vector, a final interest point vector, and an interest point weight;

inputting each history entry vector into a pre-trained interest network, and obtaining a plurality of interest point information output by the interest network, wherein the method comprises the following steps:

respectively inputting each history entry vector into a first fully-connected network and a second fully-connected network which are trained in advance, and obtaining interest point weights of one or more interest points corresponding to each history entry vector output by the first fully-connected network and initial interest point vectors of one or more interest points corresponding to each history entry vector output by the second fully-connected network;

and determining a final interest point vector of each interest point according to the initial interest point vector of one or more interest points corresponding to each history entry vector and the corresponding interest point weight.

4. The method of claim 3, wherein determining a final point of interest vector for each point of interest based on the initial point of interest vectors and corresponding point of interest weights for the one or more points of interest for each history entry vector comprises:

for each history entry vector, normalizing the interest point weight of the history entry vector;

calculating the product of the initial interest point vector of each interest point and the corresponding normalized interest point weight to obtain an intermediate interest point vector of each interest point;

and summarizing the intermediate interest point vectors of the same interest points in all the history entries to obtain the final interest point vector of each interest point.

5. The method of any one of claims 1-4, wherein said determining a relevance score for each candidate item in the set of candidate items with respect to the plurality of types of point of interest information comprises:

determining candidate entry vectors corresponding to the candidate entries;

and calculating the similarity between the candidate item vectors and the final interest point vector of each interest point respectively as a relevance score aiming at each candidate item vector.

6. The method of claim 5, wherein determining a final relevance score for each candidate item from the relevance scores comprises:

and selecting the maximum relevance score from the relevance scores of the current candidate item and each interest point as a final relevance score.

7. The method of claim 1, wherein selecting a target candidate item for fine ranking from the set of candidate items according to the final relevance score of each candidate item comprises:

selecting a plurality of candidate items with the highest final relevance scores as target candidate items;

alternatively,

acquiring the prior score of each candidate item, and fusing the prior score of each candidate item with the corresponding final relevance score to obtain a final score; and selecting a plurality of candidate items with the highest final scores as target candidate items.

8. An apparatus for coarse ranking, the apparatus comprising:

the history item acquisition module is used for acquiring a history item set of a user;

the interest point information determining module is used for determining various interest point information of the user according to the history entry set;

the relevancy score determining module is used for acquiring a candidate item set and determining the relevancy scores of each candidate item in the candidate item set and the plurality of types of interest point information;

a final relevance score determining module for determining a final relevance score of each candidate item from the relevance scores;

and the target candidate item determining module is used for selecting the target candidate items for fine sorting from the candidate item set according to the final relevance scores of the candidate items.

9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1-7 when executing the program.

10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.

Background

In scenarios that require large-scale ranking, such as search, recommendation, and advertising, the funnel-style cascade ranking architecture is widely used. A cascade ranking architecture generally comprises recall, coarse ranking, fine ranking, and reordering modules in sequence, with the number of candidate entries decreasing at each stage.

Coarse ranking sits between recall and fine ranking: it must pick hundreds to thousands of candidate entries out of tens of thousands and pass them on to fine ranking, so its design must balance latency against accuracy. On the one hand, because of the latency constraint, the coarse ranking model is usually simpler than the fine ranking model and therefore less accurate. On the other hand, the training data used for coarse ranking is the exposure data already selected by fine ranking, while at inference time the model scores a much broader set of candidates, causing serious selection bias; moreover, exposed-but-not-clicked samples, which the coarse ranking model had returned as positives, are used as negative samples, making the model hard to train. These factors make the coarse ranking model's predictions inaccurate and degrade the recommendation effect.

Disclosure of Invention

The application provides a coarse ranking method and apparatus to address the low prediction accuracy of coarse ranking models in the prior art.

In a first aspect, an embodiment of the present application provides a method of coarse ranking, where the method includes:

acquiring a history entry set of a user, and determining various interest point information of the user according to the history entry set;

acquiring a candidate item set, and determining the relevance scores of each candidate item in the candidate item set and the multiple kinds of interest point information respectively;

determining a final relevance score of each candidate item from the relevance scores;

and selecting a target candidate item for fine sorting from the candidate item set according to the final relevance score of each candidate item.

In a second aspect, an embodiment of the present application further provides an apparatus for coarse ranking, where the apparatus includes:

the history item acquisition module is used for acquiring a history item set of a user;

the interest point information determining module is used for determining various interest point information of the user according to the history entry set;

the relevancy score determining module is used for acquiring a candidate item set and determining the relevancy scores of each candidate item in the candidate item set and the plurality of types of interest point information;

a final relevance score determining module for determining a final relevance score of each candidate item from the relevance scores;

and the target candidate item determining module is used for selecting the target candidate items for fine sorting from the candidate item set according to the final relevance scores of the candidate items.

In a third aspect, an embodiment of the present application further provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the method of the first aspect is implemented.

In a fourth aspect, the present application further provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to implement the method of the first aspect.

The application has the following beneficial effects:

in this embodiment, the user's history entries are taken into account in the coarse ranking stage of the recommendation system: multiple kinds of interest point information for the current user are extracted from the history entry set, relevance scores between each candidate entry in the candidate entry set and each kind of interest point information are computed, a final relevance score is determined for each candidate entry from those relevance scores, and target candidate entries for fine ranking are then selected from the candidate entry set according to the final relevance scores. The selected target candidate entries match the user's current usage habits, so the degree of personalization is high. This improves the quality and accuracy of target-candidate screening in the coarse ranking stage, strengthens the personalization of the coarse ranking decision, and in turn improves the recommendation effect of the recommendation system, increasing user stickiness and product retention.

In addition, this embodiment can quickly extract the interest point information associated with the history entries and then obtain the final relevance score of each candidate entry from the relevance between that interest point information and the candidate entries. This meets the requirement of scoring tens of thousands of candidate entries in the coarse ranking stage, improves scoring efficiency, and thus improves the efficiency of screening target candidate entries.

Drawings

FIG. 1 is a flowchart of an embodiment of a coarse ranking method provided in the first embodiment of the present application;

FIG. 2 is a schematic diagram of a coarse ranking framework according to the first embodiment of the present application;

FIG. 3 is a block diagram of an embodiment of a coarse ranking apparatus according to the second embodiment of the present application;

FIG. 4 is a schematic structural diagram of an electronic device according to the third embodiment of the present application.

Detailed Description

The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be further noted that, for the convenience of description, only some of the structures related to the present application are shown in the drawings, not all of the structures.

Example one

Fig. 1 is a flowchart of an embodiment of a coarse ranking method provided in the first embodiment of the present application. The method may be applied in a recommendation system and is suitable for real-time personalized recommendation scenarios for products such as news, information feeds, short videos, music, and advertisements.

The present embodiment may include the following steps:

step 110, obtaining a history entry set of a user, and determining various interest point information of the user according to the history entry set.

In this step, the user refers to a target account that needs to be personalized and recommended currently, and the source of the target account may be any one of all registered accounts in the system, or an account that initiates a request currently, or a target account determined according to actual business requirements, which is not limited in this embodiment.

In one example, the history entry set may include entry data (items) that the current user interacted with over a past period of time, for example by placing the entry data the user clicked during that period into the history entry set. In implementation, the log data of the current user can be obtained and the history entry set of the current user extracted from it. The entry data may include an entry id.

This embodiment extracts the interest point information of multiple interest points of the user from the user's history entry set. In one embodiment, a deep learning network is used to extract the interest point information, and step 110 may further include the following steps:

step 110-1, a history entry vector is generated for each history entry.

In implementation, the system may include a vector layer (Embedding layer) that maps each piece of entry data (e.g., an entry id) into a dense vector of fixed dimension, i.e., an entry vector. The Embedding layer may be implemented as an Embedding model: the entry id of each history entry is input into the model, which outputs the history entry vector corresponding to that id; each history entry vector is then used as an input to the interest network.

The history entry vectors of all history entries may constitute a history entry vector set [v_1, v_2, …, v_n], v_i ∈ R^d, where d is the dimension of the dense vector and n is the number of history entries.
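The Embedding lookup described above can be sketched as follows (a minimal NumPy illustration; the table is randomly initialized and all sizes are hypothetical, whereas a real Embedding layer is learned during training):

```python
import numpy as np

# Lookup-table sketch of the Embedding layer: each entry id indexes a row
# of a dense matrix. The table is random here; in practice it is learned.
rng = np.random.default_rng(0)

d = 8              # dense-vector dimension (hypothetical)
vocab_size = 100   # number of distinct entry ids (hypothetical)
embedding_table = rng.normal(size=(vocab_size, d))

def embed(entry_ids):
    """Map a list of entry ids to their dense vectors, one row per id."""
    return embedding_table[np.asarray(entry_ids)]

history_ids = [3, 17, 42]             # a user's clicked-entry ids (hypothetical)
history_vectors = embed(history_ids)  # shape (n, d) = (3, 8)
```

The resulting rows form the history entry vector set that is fed to the interest network.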

And step 110-2, inputting each historical entry vector into a pre-trained interest network, and obtaining a plurality of interest point information output by the interest network.

The interest network can be a deep learning network trained by a deep learning algorithm (such as a BP algorithm), and the interest network is used for extracting one or more interest point information of each history item.

In one embodiment, the interest network may be an MLP network (multi-layer fully connected neural network) that may include at least a first fully connected network and a second fully connected network; the point of interest information may illustratively include an initial point of interest vector, a final point of interest vector, and a point of interest weight. Step 110-2 may further include the steps of:

step 110-2-1, respectively inputting each history entry vector into a first fully-connected network and a second fully-connected network which are trained in advance, and obtaining interest point weights of one or more interest points corresponding to each history entry vector output by the first fully-connected network and initial interest point vectors of one or more interest points corresponding to each history entry vector output by the second fully-connected network.

In this step, for each history entry vector of the set of history entry vectors, it may be input to the first fully connected network and the second fully connected network, respectively. Wherein the first fully connected network is operative to map the history entry vector to interest point weights for one or more interest points; the second fully connected network functions to map the history entry vector to an initial point of interest vector of one or more points of interest.

For example, as shown in the coarse ranking framework diagram in fig. 2, each history entry vector (e.g., history emb 1, history emb 2, …, history emb n in fig. 2) is input into the two fully connected networks (MLPs) to obtain the interest point weights (interest weight in fig. 2) and the initial interest point vectors (interest vector in fig. 2) of one or more interest points for that history entry vector. The interest point weights for one history entry vector can be expressed as w ∈ R^T, and the initial interest point vectors as P ∈ R^{T×S}, where T is the number of interest points and S is the vector dimension of an interest point.
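The two fully connected networks can be sketched as follows, reduced to one linear layer each for illustration (a real MLP would stack several layers with nonlinearities; all weights and sizes are hypothetical):

```python
import numpy as np

# Sketch of the interest network: one linear map per fully connected network.
# The first maps a history entry vector to T interest point weights; the
# second maps it to T initial interest point vectors of dimension S.
rng = np.random.default_rng(1)

d, T, S = 8, 4, 6                       # dimensions are hypothetical
W_weight = rng.normal(size=(d, T))      # first fully connected network
W_vector = rng.normal(size=(d, T * S))  # second fully connected network

def interest_network(v):
    """Return (interest point weights w in R^T, initial vectors P in R^{T x S})."""
    w = v @ W_weight                    # one raw weight per interest point
    P = (v @ W_vector).reshape(T, S)    # one initial vector per interest point
    return w, P

v = rng.normal(size=d)                  # one history entry vector
w, P = interest_network(v)
```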

It should be noted that, according to different recommended scenes, the points of interest are also different, for example, the points of interest in the short video field may include different categories of videos, such as beauty, fun, games, and the like; the interest points of the e-commerce recommendation scene may be different categories of commodities, such as lady clothes, electric appliances, daily necessities, and the like, and the embodiment does not limit the specific interest points.

And step 110-2-2, determining a final interest point vector of each interest point according to the initial interest point vectors of one or more interest points corresponding to each history entry vector and the corresponding interest point weights.

For each history entry vector, after obtaining the interest point weights of one or more interest points corresponding to a history entry vector and the initial interest point vector, the interest point weights of the interest points and the initial interest point vector may be fused to obtain a final interest point vector of each interest point.

In one embodiment, step 110-2-2 may include the steps of:

and step 110-2-2-1, normalizing the interest point weights of the history entry vectors for each history entry vector.

In order to facilitate comparison and weighting of interest point weights of different magnitudes, after the first fully-connected network outputs the interest point weights of the interest points, normalization processing may be performed on the interest point weights by using a normalization function. The specific normalization function is not limited in this embodiment as long as it can achieve the normalization effect, for example, as shown in fig. 2, the normalization function may be a softmax function.

And step 110-2-2-2, calculating the product of the initial interest point vector of each interest point and the corresponding normalized interest point weight to obtain an intermediate interest point vector of each interest point.

In this step, for each history entry vector, an interest point vector of interest points of the history entry vector may be calculated as an intermediate interest point vector, respectively. Specifically, the product of the initial interest point vector of each interest point and the corresponding normalized interest point weight may be calculated as the intermediate interest point vector of the corresponding interest point.

The intermediate interest point vector of each interest point can be calculated with the following formula:

I_i = ŵ_i × P_i

where I_i is the intermediate interest point vector of interest point i, ŵ_i is the normalized interest point weight of interest point i, and P_i is the initial interest point vector of interest point i.

And step 110-2-2-3, summarizing the intermediate interest point vectors of the same interest points in all the history entries to obtain the final interest point vector of each interest point.

In this step, after the intermediate interest point vector of each interest point has been obtained for each history entry vector, the final interest point vector of each interest point can be obtained by accumulating (weighted-sum) the intermediate vectors of the same interest point across all history entry vectors. That is, the final interest point vector of each interest point is calculated with the following formula:

FI_i = Σ_{j=1}^{n} I_i^{(j)}

where FI_i is the final interest point vector of interest point i and I_i^{(j)} is the intermediate interest point vector of interest point i from the j-th history entry vector.
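Steps 110-2-2-1 through 110-2-2-3 can be sketched end to end as follows (a minimal NumPy illustration with hypothetical shapes and random inputs):

```python
import numpy as np

# End-to-end sketch of steps 110-2-2-1..3: softmax-normalize each history
# entry's interest point weights, scale each initial interest point vector
# by its normalized weight, then sum over all history entries.
def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def final_interest_vectors(weights, vectors):
    """weights: (n, T) raw interest point weights, one row per history entry.
    vectors: (n, T, S) initial interest point vectors.
    Returns FI with shape (T, S): one final vector per interest point."""
    norm = np.apply_along_axis(softmax, 1, weights)  # step 110-2-2-1
    intermediate = norm[:, :, None] * vectors        # step 110-2-2-2: I_i = w_i * P_i
    return intermediate.sum(axis=0)                  # step 110-2-2-3: weighted sum

rng = np.random.default_rng(2)
n, T, S = 3, 4, 6                                    # hypothetical sizes
FI = final_interest_vectors(rng.normal(size=(n, T)), rng.normal(size=(n, T, S)))
# FI has shape (4, 6)
```

Because the weights of each history entry are softmax-normalized, they sum to 1 per entry, so each entry contributes a convex combination of its initial interest point vectors.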

Step 120, obtaining a candidate item set, and determining a relevance score between each candidate item in the candidate item set and the plurality of interest point information.

In this step, the set of candidate items is a set of items to be recommended. In implementation, a relevance computation algorithm may be employed to compute a relevance score for each candidate entry and the point of interest information for each point of interest.

In one embodiment, step 120 may include the steps of:

step 120-1, a candidate entry vector corresponding to each candidate entry is determined.

In implementation, the entry id of each candidate entry may be input to an Embedding layer of the system, so that the Embedding layer outputs a dense vector corresponding to the current candidate entry id, that is, a candidate entry vector.

And step 120-2, calculating the similarity between each candidate item vector and the final interest point vector of each interest point as a relevance score.

In this step, for the candidate entry vector of each candidate entry, the similarity between the candidate entry vector and the final interest point vector of each interest point may be calculated. For example, the similarity may be determined by dot product calculation according to the following formula:

S_i = FI_i · v_t,  i ∈ [1, T],  S ∈ R^T

where S_i is the dot-product similarity between the current candidate entry vector v_t and the final interest point vector FI_i of interest point i, and T is the number of interest points.

Of course, besides the dot product, other similarity measures may be used, such as cosine similarity or Euclidean distance; this embodiment does not limit the choice.

Step 130, determining a final relevance score for each candidate item from the relevance scores.

In one implementation, the maximum relevance score may be selected from the relevance scores between the current candidate entry and the interest points as the final relevance score sim_t of the current candidate entry, namely:

sim_t = max(S_i), i ∈ [1, T]
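Steps 120-2 and 130 can be sketched together as follows (a minimal NumPy illustration; the interest point vectors and the candidate vector are hypothetical):

```python
import numpy as np

# Sketch of steps 120-2 and 130: dot the candidate entry vector against each
# final interest point vector, then keep the maximum score as sim_t.
def final_relevance_score(FI, v_t):
    """FI: (T, S) final interest point vectors; v_t: (S,) candidate vector.
    Returns (per-interest-point scores S_i, final score sim_t = max_i S_i)."""
    scores = FI @ v_t          # S_i = FI_i . v_t
    return scores, scores.max()

FI = np.array([[1.0, 0.0],     # T=3 interest points, S=2 (hypothetical)
               [0.0, 1.0],
               [0.5, 0.5]])
v_t = np.array([2.0, 1.0])
scores, sim_t = final_relevance_score(FI, v_t)
# scores = [2.0, 1.0, 1.5], sim_t = 2.0
```

Taking the maximum means a candidate is kept if it matches any one of the user's interest points strongly, rather than requiring it to match all of them.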

and step 140, selecting target candidate items for fine sorting from the candidate item set according to the final relevance scores of the candidate items.

After the final relevance score between each candidate entry in the candidate entry set and the current user has been determined, the target candidate entries for fine ranking can be selected from the candidate entry set according to those scores.

In one embodiment, the target candidate entry may be determined by a similarity truncation method, and step 140 may include the following steps:

and selecting a plurality of candidate items with the highest final relevance scores as target candidate items.

In this embodiment, in the coarse ranking stage, after the final relevance score of each candidate entry with respect to the current user is obtained, the candidate entries can be ranked and truncated by that score: the candidate entries are sorted from the highest final relevance score to the lowest, and the top-ranked entries are selected as the target candidate entries for fine ranking, ensuring the quality of the candidate entries used in the subsequent fine ranking process.

In another embodiment, the target candidate entry may be determined by fusing existing prior scores, and step 140 may include the steps of:

acquiring the prior score of each candidate item, and fusing the prior score of each candidate item with the corresponding final relevance score to obtain a final score; and selecting a plurality of candidate items with the highest final scores as target candidate items.

In this embodiment, each candidate entry may be scored through other processes to obtain a prior score. The final relevance score between each candidate entry and the current user is then used as a weight on the prior score to obtain the final score of each candidate entry, i.e., the final score is calculated as:

score_t = sim_t × other_t

where score_t is the final score of candidate entry t, sim_t is its final relevance score, and other_t is the prior score determined for candidate entry t by the other processes.

Then the candidate entries in the candidate entry set can be ranked from the highest final score to the lowest, and the top-ranked entries selected as the target candidate entries for fine ranking, ensuring the quality of the candidate entries used in the subsequent fine ranking process.
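The prior-score fusion and truncation described above can be sketched as follows (a minimal NumPy illustration with hypothetical scores):

```python
import numpy as np

# Sketch of the prior-score fusion variant of step 140: fuse each candidate's
# final relevance score with its prior score by multiplication, then rank
# from high to low and truncate at k.
def select_top_k(sim, prior, k):
    """sim, prior: (m,) final relevance scores and prior scores.
    Returns the indices of the k candidates with the highest fused score."""
    final = sim * prior                 # score_t = sim_t x other_t
    return np.argsort(final)[::-1][:k]  # high-to-low ranking, truncated at k

sim = np.array([0.9, 0.2, 0.6, 0.8])    # hypothetical final relevance scores
prior = np.array([0.5, 1.0, 1.0, 0.4])  # hypothetical prior scores
top = select_top_k(sim, prior, k=2)
# fused scores: [0.45, 0.2, 0.6, 0.32] -> top-2 candidate indices [2, 0]
```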

The number of the target candidate items selected to enter the fine ranking stage may be determined according to actual requirements, which is not limited in this embodiment.

In this embodiment, the user's history entries are taken into account in the coarse ranking stage of the recommendation system: multiple kinds of interest point information for the current user are extracted from the history entry set, relevance scores between each candidate entry in the candidate entry set and each kind of interest point information are computed, a final relevance score is determined for each candidate entry from those relevance scores, and target candidate entries for fine ranking are then selected from the candidate entry set according to the final relevance scores. The selected target candidate entries match the user's current usage habits, so the degree of personalization is high. This improves the quality and accuracy of target-candidate screening in the coarse ranking stage, strengthens the personalization of the coarse ranking decision, and in turn improves the recommendation effect of the recommendation system, increasing user stickiness and product retention.

In addition, this embodiment can quickly extract the interest point information associated with the history entries and then obtain the final relevance score of each candidate entry from the relevance between that interest point information and the candidate entries. This meets the requirement of scoring tens of thousands of candidate entries in the coarse ranking stage, improves scoring efficiency, and thus improves the efficiency of screening target candidate entries.

Example two

Fig. 3 is a block diagram of an embodiment of a rough sorting apparatus provided in the second embodiment of the present application, where the apparatus may be applied in a recommendation system, and may include the following modules:

a history item obtaining module 210, configured to obtain a history item set of a user;

an interest point information determining module 220, configured to determine multiple kinds of interest point information of the user according to the history entry set;

a relevance score determining module 230, configured to obtain a candidate entry set, and determine relevance scores of each candidate entry in the candidate entry set and the multiple kinds of interest point information;

a final relevance score determining module 240 for determining a final relevance score for each candidate item from the relevance scores;

and a target candidate item determining module 250, configured to select a target candidate item for fine sorting from the candidate item set according to the final relevance score of each candidate item.

In one embodiment, the point of interest information determination module 220 may include the following sub-modules:

the history entry vector generation submodule is used for generating a history entry vector of each history entry;

and the interest network processing submodule is used for inputting each historical item vector into a pre-trained interest network and acquiring various interest point information output by the interest network.

In one embodiment, the interest network comprises a first fully connected network and a second fully connected network, the interest point information comprises an initial interest point vector, a final interest point vector and an interest point weight;

the interest network processing sub-module may include the following elements:

the full-connection network processing unit is used for respectively inputting each historical entry vector into a first full-connection network and a second full-connection network which are trained in advance, and obtaining interest point weights of one or more interest points corresponding to each historical entry vector output by the first full-connection network and initial interest point vectors of one or more interest points corresponding to each historical entry vector output by the second full-connection network;

and the final interest point vector determining unit is used for determining the final interest point vector of each interest point according to the initial interest point vector of one or more interest points corresponding to each history entry vector and the corresponding interest point weight.

In an embodiment, the final interest point vector determining unit is specifically configured to:

for each history entry vector, normalizing the interest point weight of the history entry vector;

calculating the product of the initial interest point vector of each interest point and the corresponding normalized interest point weight to obtain an intermediate interest point vector of each interest point;

and summarizing the intermediate interest point vectors of the same interest points in all the history entries to obtain the final interest point vector of each interest point.

In an embodiment, the relevancy score determining module 230 is specifically configured to:

determining candidate entry vectors corresponding to the candidate entries;

and calculating the similarity between the candidate item vectors and the final interest point vector of each interest point respectively as a relevance score aiming at each candidate item vector.

In an embodiment, the final relevance score determining module 240 is specifically configured to:

and selecting the maximum relevance score from the relevance scores of the current candidate item and each interest point as a final relevance score.

In an embodiment, the target candidate entry determining module 250 is specifically configured to:

selecting a plurality of candidate items with the highest final relevance scores as target candidate items;

alternatively,

acquiring the prior score of each candidate item, and fusing the prior score of each candidate item with the corresponding final relevance score to obtain a final score; and selecting a plurality of candidate items with the highest final scores as target candidate items.

It should be noted that the apparatus for rough sorting provided in the embodiment of the present application can execute the method for rough sorting provided in the embodiment of the present application, and has corresponding functional modules and beneficial effects for executing the method.

Example three

Fig. 4 is a schematic structural diagram of an electronic device according to the third embodiment of the present disclosure. As shown in Fig. 4, the electronic device includes a processor 410, a memory 420, an input device 430, and an output device 440. The number of processors 410 in the electronic device may be one or more; one processor 410 is taken as an example in Fig. 4. The processor 410, the memory 420, the input device 430, and the output device 440 in the electronic device may be connected by a bus or by other means; connection by a bus is taken as an example in Fig. 4.

The memory 420 serves as a computer-readable storage medium for storing software programs, computer-executable programs, and modules, such as the program instructions/modules corresponding to the methods in the embodiments of the present application. By running the software programs, instructions, and modules stored in the memory 420, the processor 410 executes the various functional applications and data processing of the electronic device, that is, implements the above-described method.

The memory 420 may mainly include a program storage area and a data storage area. The program storage area may store an operating system and an application program required by at least one function; the data storage area may store data created according to the use of the terminal, and the like. Further, the memory 420 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some examples, the memory 420 may further include memory located remotely from the processor 410, which may be connected to the electronic device over a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.

The input device 430 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. The output device 440 may include a display device such as a display screen.

Example four

The fourth embodiment of the present application further provides a storage medium containing computer-executable instructions which, when executed by a processor of a server, perform the method described in any one of the first embodiment.

From the above description of the embodiments, those skilled in the art can clearly understand that the present application may be implemented by software plus necessary general-purpose hardware, and certainly may also be implemented by hardware alone, though the former is the preferred embodiment in many cases. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which may be stored in a computer-readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a flash memory (FLASH), a hard disk, or an optical disk of a computer, and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the methods described in the embodiments of the present application.

It should be noted that, in the apparatus embodiment, the included units and modules are merely divided according to functional logic, but the division is not limited to the above as long as the corresponding functions can be implemented. In addition, the specific names of the functional units are only used for distinguishing one functional unit from another, and are not used to limit the protection scope of the present application.

It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present application and the technical principles employed. It will be understood by those skilled in the art that the present application is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the application. Therefore, although the present application has been described in more detail with reference to the above embodiments, the present application is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present application, and the scope of the present application is determined by the scope of the appended claims.
