Sequential social recommendation method, system and storage medium based on a gate mechanism
1. A sequential social recommendation method based on a gate mechanism, characterized by comprising the following steps:
step S1, dividing the user's original consumption data and the friends' original consumption data into sequences respectively to obtain user sequence segments and friend sequence segments, and initializing the user sequence segments and the friend sequence segments to obtain user sequence data and friend sequence data recognizable by a GRU neural network;
step S2, filtering and selecting the user sequence data based on a GRU neural network with a select-gate mechanism to obtain the user's current interest, and filtering and selecting the friend sequence data to obtain the friends' current interests;
step S3, splicing the friends' current interests to obtain the friends' short-term interests, initializing the commodity data in the friends' original consumption data to obtain the friends' long-term interests, and splicing the friends' short-term interests with the friends' long-term interests to obtain the friends' final interests;
step S4, convolving the user's current interest with the friends' final interests based on a graph-attention neural network to obtain the friends' weights on the user's interest, and calculating the friend influence according to the friends' weights on the user's interest;
step S5, splicing the friend influence with the user's current interest to obtain the user's final interest;
and step S6, calculating the user's final interest with a Softmax function to obtain the user's probability distribution over different commodities, performing model training according to the probability distribution to obtain a trained model, and recommending commodity information to the user according to the trained model.
2. The gate mechanism-based sequential social recommendation method according to claim 1, wherein the step S1 comprises:
dividing the user's and the friends' original consumption data into sequences according to a preset period to obtain a plurality of sequence segments, and initializing each sequence segment to obtain sequence data; after the sequence data are unrolled along the GRU neural network, the user sequence data are expressed as x_u = (m_1, m_2, ..., m_j), the friend sequence data are expressed as x_f = (k_1, k_2, ..., k_j), and j represents the time-step size.
3. The gate mechanism-based sequential social recommendation method according to claim 2, wherein the step S2 comprises:
S21, outputting the user sequence data through the GRU neural network according to formula (1) to obtain a plurality of hidden states h_n corresponding to the order of the sequence data; outputting the friend sequence data through the GRU neural network according to formula (2) to obtain a plurality of hidden states h_f corresponding to the order of the sequence data; formula (1) is: h_n = GRU(x_u); formula (2) is: h_f = GRU(x_f), where x_u is the user sequence data and x_f is the friend sequence data;
S22, merging the user sequence data with the last of the hidden states h_n according to formula (3) to obtain the select gate g_n; merging the friend sequence data with the last of the hidden states h_f according to formula (4) to obtain the select gate g_f; the select gates g_n and g_f contain both the user's original consumption data and the current consumption characteristics;
formula (3) is: g_n = σ([W_1 x_u; W_2 h_j] + b),
formula (4) is: g_f = σ([W_1 x_f; W_2 h_g] + b),
where σ is the activation function, W_1 and W_2 are weight matrices, b is a bias vector, x_u is the user sequence data, x_f is the friend sequence data, h_j is the last hidden state of h_n, and h_g is the last hidden state of h_f;
S23, filtering and selecting the plurality of hidden states h_n according to formula (5) to obtain the user's current interest, and filtering and selecting the plurality of hidden states h_f according to formula (6) to obtain the friends' current interest; formula (5) is: ĥ_n = g_n ⊙ h_n; formula (6) is: ĥ_f = g_f ⊙ h_f, where g_n and g_f are the select gates and ⊙ is the Hadamard product.
4. The gate mechanism-based sequential social recommendation method according to claim 3, wherein the step S3 comprises:
S31, splicing the friends' current interests according to formula (7) to obtain a spliced interest sequence, outputting it through the GRU neural network, and taking the last output hidden state as the friends' short-term interest; formula (7) is: x_s = [ĥ_f1; ĥ_f2; ...; ĥ_fc], the splice of the friends' current interests ĥ_f, where h_g is the last hidden state of h_f;
S32, performing initialization learning on the commodity data in the friends' original consumption data to obtain the friends' long-term interests;
and S33, splicing the friends' short-term interests with the friends' long-term interests to obtain the friends' final interests.
5. The gate mechanism-based sequential social recommendation method according to claim 4, wherein the step S4 comprises:
S41, convolving the user's current interest with the friends' final interests based on the graph-attention neural network according to formula (8) to obtain the weights of a plurality of friends on the user's interest, wherein the graph-attention formula (8) is: a_c = exp(h_c^T h_n) / Σ_{k ∈ d∪u} exp(h_k^T h_n),
where T denotes transposition, h_c represents the interest of the c-th friend on the social network, k = d ∪ u means that k is the union of the plurality of friends d and the user u, h_k represents the set of the friends' final interests and the user's current interest, and h_n represents the GRU neural network output of the user sequence data;
S42, performing a weighted calculation with the different friends' weights on the user's interest according to formula (9) to obtain the friend influence, wherein formula (9) is: friend influence = Σ_{k ∈ d∪u} a_k h_k, where k = d ∪ u denotes that k is the union of the plurality of friends d and the user u, h_k represents the set of the friends' final interests and the user's current interest, and a_k represents the weight on the user's interest.
6. The gate mechanism-based sequential social recommendation method according to claim 5, wherein the step S6 comprises:
S61, performing dimension conversion on the commodity data in the original consumption data, wherein the converted commodity data can be expressed as z = (z_1, z_2, ..., z_I);
S62, calculating the user's final interest against the converted commodity data according to the Softmax formula (10), wherein the Softmax formula (10) is: p'_q = exp(z_q^T h_l) / Σ_{y=1}^{I} exp(z_y^T h_l),
where I is the total number of commodities, z_y is the y-th commodity, z_q is the q-th commodity, T is the transpose, q is any commodity, and h_l is the user's final interest;
S63, performing model training on the interest probability of each commodity with a cross-entropy loss function according to formula (11), and obtaining the trained model when the output loss value stabilizes, wherein formula (11) is: L = -Σ_q [p_q log p'_q + (1 - p_q) log(1 - p'_q)],
where p'_q is the predicted probability that the user is interested in a commodity, 1 - p'_q is the predicted probability that the user is not interested in the commodity, and p_q is the actual probability that the user is interested in the commodity;
and S64, selecting the top-ranked commodity information according to the trained model, the Recall metric and the NDCG metric, and recommending the commodity information to the user.
7. A select-gate mechanism-based sequential social recommendation system, characterized by comprising an initialization module, an interest acquisition module and a training module;
the initialization module is used for dividing the user's original consumption data and the friends' original consumption data into sequences respectively to obtain user sequence segments and friend sequence segments, and for initializing the user sequence segments and the friend sequence segments to obtain user sequence data and friend sequence data recognizable by the GRU neural network;
the interest acquisition module is used for filtering and selecting the user sequence data based on a GRU neural network with a select-gate mechanism to obtain the user's current interest, filtering and selecting the friend sequence data to obtain the friends' current interests, and splicing the friends' current interests to obtain the friends' short-term interests; initializing the commodity data in the friends' original consumption data to obtain the friends' long-term interests, and splicing the friends' short-term interests with the friends' long-term interests to obtain the friends' final interests; convolving the user's current interest with the friends' final interests based on a graph-attention neural network to obtain the friends' weights on the user's interest, and obtaining the friend influence by weighted calculation according to those weights; and splicing the friend influence with the user's current interest to obtain the user's final interest;
the training module is used for calculating the user's final interest with a Softmax function to obtain the user's probability distribution over different commodities, performing model training according to the probability distribution to obtain a trained model, and recommending commodity information to the user according to the trained model.
8. A select-gate mechanism-based sequential social recommendation system comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the gate mechanism-based sequential social recommendation method of any one of claims 1 to 6.
9. A storage medium storing one or more computer programs executable by one or more processors to implement the gate mechanism-based sequential social recommendation method of any one of claims 1 to 6.
Background
A recommendation system discovers users' personalized demands by analyzing their behavior, recommends commodities to the corresponding users in a personalized manner, and helps users find commodities they want but would otherwise struggle to find. Various recommendation system models exist. For example, an RNN-based sequence recommendation system: given a series of historical user-item interactions, it attempts to predict the next likely interaction by modeling the sequential dependencies among the given interactions; beyond the basic RNN, RNNs based on long short-term memory (LSTM) and the Gated Recurrent Unit (GRU) have also been developed to capture long-term dependencies in the sequence. As another example, recommendation systems combining sequences with social information exploit not only the user's own interest but also the influence of friends, improving recommendation performance. In the prior art, however, the learning of user interest is inaccurate, and the user's real interest is not expressed sufficiently.
Disclosure of Invention
The invention aims to solve the above technical problem of the prior art and provides a sequential social recommendation method, system and storage medium based on a gate mechanism. A select gate filters out the feature information of useless information and retains the feature information related to the current overall consumption characteristics, yielding the interest expressions of the user and the friends; through the select-gate mechanism the user's interest can be learned more accurately, further improving the recommendation performance of the recommendation system.
The technical scheme for solving the above technical problem is as follows: a sequential social recommendation method based on a gate mechanism comprises the following steps:
step S1, dividing the user's original consumption data and the friends' original consumption data into sequences respectively to obtain user sequence segments and friend sequence segments, and initializing the user sequence segments and the friend sequence segments to obtain user sequence data and friend sequence data recognizable by a GRU neural network;
step S2, filtering and selecting the user sequence data based on a GRU neural network with a select-gate mechanism to obtain the user's current interest, and filtering and selecting the friend sequence data to obtain the friends' current interests;
step S3, splicing the friends' current interests to obtain the friends' short-term interests, initializing the commodity data in the friends' original consumption data to obtain the friends' long-term interests, and splicing the friends' short-term interests with the friends' long-term interests to obtain the friends' final interests;
step S4, convolving the user's current interest with the friends' final interests based on a graph-attention neural network to obtain the friends' weights on the user's interest, and calculating the friend influence according to the friends' weights on the user's interest;
step S5, splicing the friend influence with the user's current interest to obtain the user's final interest;
and step S6, calculating the user's final interest with a Softmax function to obtain the user's probability distribution over different commodities, performing model training according to the probability distribution to obtain a trained model, and recommending commodity information to the user according to the trained model.
The invention has the beneficial effects that: the user interest obtained from the GRU neural network output can be filtered by a select gate to remove information irrelevant to the current overall consumption characteristics while the related feature information is retained, yielding the interest expressions of the user and the friends; the influence of friends whose interests are similar to the user's is strengthened according to the different friends' interest weights, so the user's consumption interest can be obtained more accurately.
On the basis of the technical scheme, the invention can be further improved as follows:
further, the step S1 specifically includes:
dividing the user's and the friends' original consumption data into sequences according to a preset period to obtain a plurality of sequence segments, and initializing each sequence segment to obtain sequence data; after the sequence data are unrolled along the GRU neural network, the user sequence data are expressed as x_u = (m_1, m_2, ..., m_j), the friend sequence data are expressed as x_f = (k_1, k_2, ..., k_j), and j represents the time-step size.
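The division and initialization above can be sketched as follows. This is a minimal illustration under assumptions, not the patented implementation: the `(day, item_id)` pair format, the embedding lookup table, and zero-padding to j time steps are all hypothetical choices (the 7-day default follows the preset period recommended later in the description).

```python
import numpy as np

def split_into_segments(consumption, period=7):
    """Divide raw consumption data into fixed-period sequence segments.

    `consumption` is a list of (day, item_id) pairs; purchases falling in
    the same `period`-day window form one sequence segment.
    """
    buckets = {}
    for day, item in consumption:
        buckets.setdefault(day // period, []).append(item)
    return [buckets[w] for w in sorted(buckets)]

def init_segment(segment, item_embed, j):
    """Initialize one segment into GRU-ready data x = (m_1, ..., m_j):
    look up each item's embedding row and pad/truncate to j time steps."""
    rows = [item_embed[i] for i in segment[:j]]
    rows += [np.zeros(item_embed.shape[1])] * (j - len(rows))
    return np.stack(rows)  # shape (j, d)
```

A segment shorter than j is zero-padded so every sequence has the same time-step size, which is what lets a single unrolled GRU consume every segment.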
Further, the step S2 specifically includes:
S21, outputting the user sequence data through the GRU neural network according to formula (1) to obtain a plurality of hidden states h_n corresponding to the order of the sequence data; outputting the friend sequence data through the GRU neural network according to formula (2) to obtain a plurality of hidden states h_f corresponding to the order of the sequence data; formula (1) is: h_n = GRU(x_u); formula (2) is: h_f = GRU(x_f), where x_u is the user sequence data and x_f is the friend sequence data;
S22, merging the user sequence data with the last of the hidden states h_n according to formula (3) to obtain the select gate g_n; merging the friend sequence data with the last of the hidden states h_f according to formula (4) to obtain the select gate g_f; the select gates g_n and g_f contain both the user's original consumption data and the current consumption characteristics;
formula (3) is: g_n = σ([W_1 x_u; W_2 h_j] + b),
formula (4) is: g_f = σ([W_1 x_f; W_2 h_g] + b),
where σ is the activation function, W_1 and W_2 are weight matrices, b is a bias vector, x_u is the user sequence data, x_f is the friend sequence data, h_j is the last hidden state of h_n, and h_g is the last hidden state of h_f;
S23, filtering and selecting the plurality of hidden states h_n according to formula (5) to obtain the user's current interest, and filtering and selecting the plurality of hidden states h_f according to formula (6) to obtain the friends' current interest; formula (5) is: ĥ_n = g_n ⊙ h_n; formula (6) is: ĥ_f = g_f ⊙ h_f, where g_n and g_f are the select gates and ⊙ is the Hadamard product.
The beneficial effect of adopting the above further scheme is that: after the user sequence data are filtered and selected by the Hadamard product with the select gate, the feature information related to the user's current overall consumption characteristics is retained and the information unrelated to the current overall consumption characteristics is filtered out, so the user's current interest can be expressed accurately.
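A minimal numerical sketch of the select-gate filtering of formulas (3) and (5). One plausible reading of the bracketed term is assumed here: each step's gate combines that step's input with the last hidden state h_j through an affine map. The GRU itself is omitted and its hidden states are taken as given, so this is illustrative only.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def select_gate_filter(x, h, W1, W2, b):
    """Select-gate filtering in the spirit of formulas (3) and (5).

    x: (j, d) sequence data; h: (j, d) GRU hidden states.
    Per step t the gate is g_t = sigmoid(W1 x_t + W2 h_j + b), where h_j
    is the last hidden state summarizing the whole segment; the current
    interest is the Hadamard product g * h, which keeps features related
    to the overall consumption characteristics and damps the rest.
    """
    h_last = h[-1]                                # h_j
    g = sigmoid(x @ W1.T + h_last @ W2.T + b)     # gate values in (0, 1)
    return g * h                                  # element-wise filtering
```

Because every gate value lies strictly between 0 and 1, the filter can only attenuate a hidden-state component, never amplify it, which matches the "retain relevant, weaken irrelevant" behavior described above.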
Further, step S3 specifically includes:
S31, splicing the friends' current interests according to formula (7) to obtain a spliced interest sequence, outputting it through the GRU neural network, and taking the last output hidden state as the friends' short-term interest; formula (7) is: x_s = [ĥ_f1; ĥ_f2; ...; ĥ_fc], the splice of the friends' current interests ĥ_f, where h_g is the last hidden state of h_f;
S32, performing initialization learning on the commodity data in the friends' original consumption data to obtain the friends' long-term interests;
and S33, splicing the friends' short-term interests with the friends' long-term interests to obtain the friends' final interests.
The friends' current interests obtained through selection and filtering contain the consumption characteristics related to current consumption. After the friends' current interests are spliced and output through the GRU neural network, the friends' short-term interests contain not only the friends' filtered consumption characteristic information but also the friends' overall consumption characteristics. Splicing the friends' short-term interests with the friends' long-term interests yields the friends' final interests, which enriches the friends' consumption information and emphasizes the friends' consumption characteristics, so the friends' consumption interests can be obtained accurately.
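The splicing in steps S31-S33 can be sketched as below. This is a stand-in illustration: the time-mean replaces the second GRU pass whose last hidden state gives the short-term interest, and the already-computed current-interest and long-term-interest vectors are assumed inputs.

```python
import numpy as np

def friend_final_interest(current_interests, long_term):
    """Sketch of step S3: splice the friends' filtered current interests
    along the time axis, summarize them into a short-term interest, and
    concatenate that with the long-term interest learned from the
    friends' commodity data. The time-mean below is only a stand-in for
    the last hidden state of a second GRU pass."""
    spliced = np.concatenate(current_interests, axis=0)   # formula (7): splice
    short_term = spliced.mean(axis=0)                     # GRU stand-in
    return np.concatenate([short_term, long_term])        # final interest
```

The final interest therefore has twice the base dimension: one half carrying recent, filtered behavior, the other half carrying stable long-term preferences.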
Further, the step S4 specifically includes:
S41, convolving the user's current interest with the friends' final interests based on the graph-attention neural network according to formula (8) to obtain the weights of a plurality of friends on the user's interest, wherein the graph-attention formula (8) is: a_c = exp(h_c^T h_n) / Σ_{k ∈ d∪u} exp(h_k^T h_n),
where T denotes transposition, h_c represents the interest of the c-th friend on the social network, k = d ∪ u means that k is the union of the plurality of friends d and the user u, h_k represents the set of the friends' final interests and the user's current interest, and h_n represents the GRU neural network output of the user sequence data;
S42, performing a weighted calculation with the different friends' weights on the user's interest according to formula (9) to obtain the friend influence, wherein formula (9) is: friend influence = Σ_{k ∈ d∪u} a_k h_k, where k = d ∪ u denotes that k is the union of the plurality of friends d and the user u, h_k represents the set of the friends' final interests and the user's current interest, and a_k represents the weight on the user's interest.
The beneficial effect of adopting the above further scheme is that: by convolving the friends' final interests and weighting them according to the different friends' interest weights, the influence of friends whose interests are similar to the user's can be strengthened, so the user's consumption interest is obtained more accurately, the trained model learns the user's interest more accurately, and the recommendation performance of the recommendation system is further improved.
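A sketch of the attention weighting of formulas (8) and (9), assuming plain dot-product scores and that all interest vectors (the friends' final interests and the user's current interest) have already been projected to a common dimension; the real model's learned attention parameters are omitted.

```python
import numpy as np

def friend_influence(h_n, interest_set):
    """Dot-product graph attention over the set k = d ∪ u.

    Scores h_k^T h_n against the user's current interest h_n are
    softmax-normalized into weights a_k, and the friend influence is the
    weighted sum of the interests, so friends similar to the user's own
    interest contribute more."""
    H = np.stack(interest_set)            # (K, dim)
    scores = H @ h_n                      # h_k^T h_n for each k
    a = np.exp(scores - scores.max())     # numerically stable softmax
    a /= a.sum()
    return a, a @ H                       # weights a_k and the influence
```

Subtracting the maximum score before exponentiating leaves the softmax unchanged but avoids overflow for large dot products, a standard stabilization trick.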
Further, the step S6 specifically includes:
S61, performing dimension conversion on the commodity data in the original consumption data, wherein the converted commodity data can be expressed as z = (z_1, z_2, ..., z_I);
S62, calculating the user's final interest against the converted commodity data according to the Softmax formula (10), wherein the Softmax formula (10) is: p'_q = exp(z_q^T h_l) / Σ_{y=1}^{I} exp(z_y^T h_l),
where I is the total number of commodities, z_y is the y-th commodity, z_q is the q-th commodity, T is the transpose, q is any commodity, and h_l is the user's final interest;
S63, performing model training on the interest probability of each commodity with a cross-entropy loss function according to formula (11), and obtaining the trained model when the output loss value stabilizes, wherein formula (11) is: L = -Σ_q [p_q log p'_q + (1 - p_q) log(1 - p'_q)],
where p'_q is the predicted probability that the user is interested in a commodity, 1 - p'_q is the predicted probability that the user is not interested in the commodity, and p_q is the actual probability that the user is interested in the commodity;
and S64, selecting the top-ranked commodity information according to the trained model, the Recall metric and the NDCG metric, and recommending the commodity information to the user.
The beneficial effect of adopting the above further scheme is that: by calculating the user's interest probability for each commodity and then training the model with a cross-entropy loss function, the accuracy with which the trained model learns the user's interest in each commodity can be improved; when the output loss value stabilizes, the trained model can be used on test data, further improving the recommendation performance of the recommendation system.
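The Softmax scoring of formula (10) and the loss of formula (11) can be sketched as follows. The binary form of the loss is an assumption consistent with the p'_q and 1 - p'_q description above; shapes and symbol names are illustrative.

```python
import numpy as np

def item_probabilities(h_l, Z):
    """Formula (10): p'_q = exp(z_q^T h_l) / sum_y exp(z_y^T h_l),
    the user's probability distribution over all I commodities.
    Z has one row z_q per commodity; h_l is the user's final interest."""
    logits = Z @ h_l                      # z_q^T h_l for each commodity
    e = np.exp(logits - logits.max())     # stabilized softmax
    return e / e.sum()

def cross_entropy_loss(p_true, p_pred, eps=1e-12):
    """Formula (11): binary cross-entropy between the actual interest
    p_q and the predicted interest p'_q, summed over commodities.
    Clipping keeps log() away from 0 and 1 exactly."""
    p = np.clip(p_pred, eps, 1.0 - eps)
    return -np.sum(p_true * np.log(p) + (1 - p_true) * np.log(1 - p))
```

Training would repeatedly compute `item_probabilities`, evaluate `cross_entropy_loss` against the observed interactions, and update the parameters until the loss value stabilizes, as step S63 describes.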
In order to solve the above technical problem, the present invention further provides a gate mechanism-based sequential social recommendation system, comprising: an initialization module, an interest acquisition module and a training module;
the initialization module is used for dividing the user's original consumption data and the friends' original consumption data into sequences respectively to obtain user sequence segments and friend sequence segments, and for initializing the user sequence segments and the friend sequence segments to obtain user sequence data and friend sequence data recognizable by the GRU neural network;
the interest acquisition module is used for filtering and selecting the user sequence data based on a GRU neural network with a select-gate mechanism to obtain the user's current interest, filtering and selecting the friend sequence data to obtain the friends' current interests, splicing the friends' current interests to obtain the friends' short-term interests, initializing the commodity data in the friends' original consumption data to obtain the friends' long-term interests, and splicing the friends' short-term interests with the friends' long-term interests to obtain the friends' final interests; convolving the user's current interest with the friends' final interests based on a graph-attention neural network to obtain the friends' weights on the user's interest, and obtaining the friend influence by weighted calculation according to those weights; and splicing the friend influence with the user's current interest to obtain the user's final interest;
the training module is used for calculating the user's final interest with a Softmax function to obtain the user's probability distribution over different commodities, performing model training according to the probability distribution to obtain a trained model, and recommending commodity information to the user according to the trained model.
Drawings
Fig. 1 is a flowchart of a sequential social recommendation method based on a gate mechanism according to an embodiment of the present invention;
fig. 2 is a schematic diagram of the GRU neural network output of a sequential social recommendation method based on a gate mechanism according to an embodiment of the present invention;
fig. 3 is an overall framework diagram of a sequential social recommendation method based on a gate mechanism according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a sequential social recommendation method based on a gate mechanism according to an embodiment of the present invention.
Detailed Description
The principles and features of this invention are described below in conjunction with the following drawings, which are set forth by way of illustration only and are not intended to limit the scope of the invention.
Example one
A sequential social recommendation method based on a gate mechanism comprises the following steps:
step S1, dividing the user's original consumption data and the friends' original consumption data into sequences respectively to obtain user sequence segments and friend sequence segments, and initializing the user sequence segments and the friend sequence segments to obtain user sequence data and friend sequence data recognizable by a GRU neural network;
step S2, filtering and selecting the user sequence data based on a GRU neural network with a select-gate mechanism to obtain the user's current interest, and filtering and selecting the friend sequence data to obtain the friends' current interests;
step S3, splicing the friends' current interests to obtain the friends' short-term interests, initializing the commodity data in the friends' original consumption data to obtain the friends' long-term interests, and splicing the friends' short-term interests with the friends' long-term interests to obtain the friends' final interests;
step S4, convolving the user's current interest with the friends' final interests based on a graph-attention neural network to obtain the friends' weights on the user's interest, and calculating the friend influence according to the friends' weights on the user's interest;
step S5, splicing the friend influence with the user's current interest to obtain the user's final interest;
and step S6, calculating the user's final interest with a Softmax function to obtain the user's probability distribution over different commodities, performing model training according to the probability distribution to obtain a trained model, and recommending commodity information to the user according to the trained model.
In this embodiment, the user interest obtained from the GRU neural network output can be filtered by a select gate to remove information irrelevant to the current overall consumption characteristics while the related feature information is retained, yielding the interest expressions of the user and the friends; the influence of friends whose interests are similar to the user's is strengthened according to the different friends' interest weights, so the user's consumption interest can be obtained more accurately, the trained model learns the user's interest more accurately, and the recommendation performance of the recommendation system is further improved.
Preferably, as an embodiment of the present invention, the step S1 specifically includes:
dividing the user's and the friends' original consumption data into sequences according to a preset period to obtain a plurality of sequence segments, and initializing each sequence segment to obtain sequence data; after the sequence data are unrolled along the GRU neural network, the user sequence data are expressed as x_u = (m_1, m_2, ..., m_j), the friend sequence data are expressed as x_f = (k_1, k_2, ..., k_j), and j represents the time-step size.
On the same social network, friends are users who follow each other;
the user's original consumption data comprise the set of commodity information consumed by the user, and the friends' original consumption data comprise the set of commodity information consumed by the friends;
wherein, for the preset period, 7 days is recommended as one period;
wherein the time step represents the length of the GRU neural network after unrolling.
Preferably, as an embodiment of the present invention, the step S2 specifically includes:
S21, outputting the user sequence data through the GRU neural network according to formula (1) to obtain a plurality of hidden states h_n corresponding to the order of the sequence data; outputting the friend sequence data through the GRU neural network according to formula (2) to obtain a plurality of hidden states h_f corresponding to the order of the sequence data; formula (1) is: h_n = GRU(x_u); formula (2) is: h_f = GRU(x_f), where x_u is the user sequence data and x_f is the friend sequence data;
S22, merging the user sequence data with the last of the hidden states h_n according to formula (3) to obtain the select gate g_n; merging the friend sequence data with the last of the hidden states h_f according to formula (4) to obtain the select gate g_f; the select gates g_n and g_f contain both the user's original consumption data and the current consumption characteristics;
formula (3) is: g_n = σ([W_1 x_u; W_2 h_j] + b),
formula (4) is: g_f = σ([W_1 x_f; W_2 h_g] + b),
where σ is the activation function, W_1 and W_2 are weight matrices, b is a bias vector, x_u is the user sequence data, x_f is the friend sequence data, h_j is the last hidden state of h_n, and h_g is the last hidden state of h_f;
S23, filtering and selecting the plurality of hidden states h_n according to formula (5) to obtain the user's current interest, and filtering and selecting the plurality of hidden states h_f according to formula (6) to obtain the friends' current interest; formula (5) is: ĥ_n = g_n ⊙ h_n; formula (6) is: ĥ_f = g_f ⊙ h_f, where g_n and g_f are the select gates and ⊙ is the Hadamard product.
In addition, the above h_n and ĥ_n are two different values: h_n represents the plurality of hidden states, while ĥ_n represents the user's current interest. Likewise, h_f and ĥ_f are two different values: h_f represents the plurality of hidden states, while ĥ_f represents the friends' current interest.
It will be appreciated that, as shown in FIG. 2, the user sequence data x_u = (m_1, m_2, ..., m_j) are input into the GRU neural network to obtain a plurality of hidden states h_n = (h_1, h_2, ..., h_j), and the plurality of hidden states h_n are filtered and selected to obtain the user's current interest ĥ_n.
It should be noted that the user's consumption data may contain some consumption information irrelevant to the current interest. For example, the user wants to buy sports equipment but accidentally browses a certain digital product; this digital product is impurity information irrelevant to the current interest. A traditional GRU neural network has difficulty processing and filtering such irrelevant information, so the learned user interest is inaccurate. Through the select-gate processing, impurity information that does not match the user's current interest can be filtered out effectively: impurity information such as the digital product is weakened while the sports equipment is retained, so the user's consumption behavior is learned accurately and the user's current interest is expressed.
It can be understood that a GRU neural network with a selection gate is designed to process the user sequence data. The last hidden state h_j of the output h_n includes the current overall consumption characteristic information that is most representative of the user's whole sequence data. To accurately acquire the consumption interest of the user, a selection gate is constructed by combining the user sequence data with h_j through the formula, giving the selection gate g_n = (g_1, g_2, ... g_j), so that g_n contains not only the original information of the user but also the consumption characteristic information of the user. The output h_n is then filtered and selected through the Hadamard product to obtain the current interest of the user, which includes the characteristic information related to the user's overall consumption characteristics; by the same filtering, the current interest of the friend includes the characteristic information related to the friend's overall consumption characteristics. The current interest of the user is the user interest obtained by learning the (n + 1)th sequence interaction information of the user through the GRU neural network, and the current interest of the friend is the friend interest obtained by learning the (j + 1)th sequence interaction information of the friend through the GRU neural network.
In the above embodiment, after the user sequence data and the friend sequence data are filtered by the Hadamard product with the selection gate, the characteristic information related to the current overall consumption characteristics of the user and the friend is retained, and the information unrelated to the current overall consumption characteristics is filtered out, so that the current interest of the user can be accurately expressed.
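The selection-gate filtering of steps S21–S23 can be sketched in plain Python as follows. This is a minimal illustration only: the scalar weights w1 and w2 (the patent uses weight matrices W1, W2), the bias b, the helper names, and the toy dimensions are all assumptions for the sketch, not the patent's actual implementation.

```python
import math

def sigmoid(vec):
    """Element-wise sigmoid, playing the role of the activation function σ."""
    return [1.0 / (1.0 + math.exp(-v)) for v in vec]

def select_gate_filter(x_seq, h_states, w1, w2, b):
    """For each step t, build a selection gate g_t = σ(w1·x_t + w2·h_last + b)
    from the sequence data and the last hidden state, then filter the hidden
    state by the Hadamard (element-wise) product g_t ⊙ h_t."""
    h_last = h_states[-1]  # last hidden state h_j summarises the whole sequence
    filtered = []
    for x_t, h_t in zip(x_seq, h_states):
        g_t = sigmoid([w1 * x + w2 * h + b for x, h in zip(x_t, h_last)])
        filtered.append([g * h for g, h in zip(g_t, h_t)])  # g_t ⊙ h_t
    return filtered

# toy sequence: 3 steps, 2-dimensional embeddings / hidden states
x_u = [[0.2, 0.1], [0.9, 0.4], [0.3, 0.8]]   # user sequence data
h_n = [[0.5, 0.1], [0.6, 0.3], [0.7, 0.9]]   # GRU hidden states
current_interest = select_gate_filter(x_u, h_n, w1=1.0, w2=1.0, b=0.0)
```

Because each gate value lies strictly between 0 and 1, the filtered states are attenuated copies of the hidden states: components that do not accord with the overall consumption characteristics are weakened rather than passed through unchanged.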
Preferably, as an embodiment of the present invention, the step S3 specifically includes:
s31, splicing the current interest of the friend according to the formula (7) to obtain the interest, outputting the interest through the GRU neural network, and taking the output last hidden state as the short-term interest of the friend; in the formula (7), the current interest of the friend is spliced with h_g, where h_g is the last hidden state of the plurality of hidden states h_f;
s32, performing initialization learning on the commodity data of the initial consumption data of the friends to obtain the long-term interest of the friends;
and S33, splicing the short-term interest of the friend with the long-term interest of the friend to obtain the final interest of the friend.
Wherein, according to the formula, the interest is output through the GRU neural network.
Wherein, according to the formula, the short-term interest of the friend and the long-term interest of the friend are spliced to obtain the final interest i_d of the friend.
It can be understood that the short-term interest of the friend is obtained by learning the j-th sequence interaction information of the friend with the GRU neural network. The long-term interest of the friend does not change with time or with the short-term interest; it is a fixed label of the friend. When the short-term interest of the friend does not accord with the interest of the user but the long-term interest does, the long-term favorites of the friend can still be recommended to the user.
In this embodiment, the current interest of the friend, containing the consumption characteristics related to current consumption, can be obtained through selection and filtering. After the current interest of the friend is spliced and output through the GRU neural network, the short-term interest of the friend contains not only the filtered consumption characteristic information of the friend but also the overall consumption characteristics of the friend. The final interest of the friend is obtained after the short-term interest and the long-term interest of the friend are spliced, which enriches the consumption information of the friend, emphasizes the consumption characteristics of the friend, and allows the consumption interest of the friend to be acquired more accurately.
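The splicing in steps S31–S33 is vector concatenation; a minimal sketch (the helper name and toy vectors are assumptions for illustration):

```python
def splice(*vectors):
    """Splice (concatenate) interest vectors, as in i_d = [short-term ; long-term]."""
    out = []
    for v in vectors:
        out.extend(v)
    return out

short_term = [0.4, 0.7]   # e.g. last GRU hidden state after gate filtering
long_term = [0.1, 0.9]    # e.g. initialized embedding of all items the friend consumed
i_d = splice(short_term, long_term)  # final interest of the friend
```

The spliced vector keeps both parts intact, so a downstream layer can weigh the friend's stable long-term preferences against the recent short-term ones.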
Preferably, as an embodiment of the present invention, the step S4 specifically includes:
s41, convolving the current interest of the user with the final interests of the friends based on the graph attention function formula (8) of the neural network to obtain the weights of the plurality of friends for the user interest, wherein the graph attention function formula (8) is: a_k = exp(h_n^T h_k) / Σ_{c∈d∪u} exp(h_n^T h_c);
in the formula, T represents transposition, h_c represents the interest of the c-th friend on the social network, k = d ∪ u means that k is the set of the plurality of friends d and the user u, h_k represents the set of the final interests of the plurality of friends and the current interest of the user, and h_n represents the GRU neural network output of the user sequence data;
s42, performing weighted calculation on the weights of the different friends for the user interest according to the formula (9) to obtain the friend influence, wherein the formula (9) is: h_m = Σ_{k∈d∪u} a_k h_k, where k = d ∪ u denotes that k is the set of the plurality of friends d and the user u, h_k represents the set of the final interests of the friends and the current interest of the user, and a_k represents the weight for the user interest.
It should be noted that the interests of different friends are not necessarily consistent with those of the user. More attention therefore needs to be paid to friends whose interests are similar to the user's, and less attention to friends whose interests are dissimilar, so the graph attention network is added while convolving so that different friends affect the user differently. The higher the similarity of a friend's interest to the user's interest, the higher that friend's weight for the user, and vice versa.
In this embodiment, the final interests of the friends are convolved and weighted according to the interest weights of the different friends, so that the influence of friends whose interests are similar to the user's is enhanced. Because the friend influence incorporates the user interest, the consumption interest of the user can be acquired more accurately, the training model can learn the interest of the user more accurately, and the recommendation performance of the recommendation system is further improved.
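The attention weighting of steps S41–S42 can be sketched as follows. This is a simplified dot-product reading of formulas (8) and (9) with toy 2-dimensional vectors; the function names and example values are assumptions, not the patent's implementation.

```python
import math

def attention_weights(h_n, h_k):
    """Formula (8) as a softmax over dot products h_n^T h_c: friends whose
    final interest is closer to the user's GRU output get larger weights."""
    scores = [sum(a * b for a, b in zip(h_n, h_c)) for h_c in h_k]
    m = max(scores)                          # shift for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def friend_influence(h_n, h_k):
    """Formula (9): h_m = sum_k a_k * h_k, the attention-weighted sum."""
    a = attention_weights(h_n, h_k)
    dim = len(h_k[0])
    return [sum(a[c] * h_k[c][i] for c in range(len(h_k))) for i in range(dim)]

h_n = [1.0, 0.0]                       # user's GRU output
h_k = [[1.0, 0.0], [0.0, 1.0]]         # final interests of two friends
a = attention_weights(h_n, h_k)        # the more similar friend gets the larger weight
h_m = friend_influence(h_n, h_k)       # friend influence
```

With these toy values the first friend's interest coincides with the user's, so its weight dominates and h_m is pulled toward that friend's vector.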
Preferably, as an embodiment of the present invention, the step S6 specifically includes:
s61, performing dimension conversion on the commodity data in the original consumption data, wherein the converted commodity data can be expressed as z = (z_1, z_2, ... z_I);
s62, calculating the final interest of the user and the converted commodity data according to the Softmax function formula (10), wherein the Softmax function formula (10) is: p_y = exp(z_y^T h_l) / Σ_{q=1}^{I} exp(z_q^T h_l);
wherein I is the total number of commodities, z_y is the y-th commodity, z_q is the q-th commodity (q is any commodity), T is the transpose, and h_l is the final interest of the user;
s63, performing model training on the interest probability of each commodity by adopting a cross entropy loss function according to the formula (11), and obtaining the training model when the output loss value tends to be stable, wherein the formula (11) is: L = −Σ_q [p_q log p'_q + (1 − p_q) log(1 − p'_q)];
in the formula, p'_q is the predicted probability that the user is interested in the commodity, 1 − p'_q is the predicted probability that the user is not interested in the commodity, and p_q is the probability that the user is actually interested in the commodity;
and S64, selecting commodity information ranked at the front according to the training model, the Recall index and the NDCG index, and recommending the commodity information to a user.
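The calculations of formulas (10) and (11) in steps S62–S63 can be sketched as follows. The toy vectors, function names, and example labels are assumptions for illustration only; a real implementation would compute these over learned embeddings.

```python
import math

def softmax_probs(h_l, items):
    """Formula (10): p_y = exp(z_y^T h_l) / sum_q exp(z_q^T h_l),
    giving the user's probability distribution over all commodities."""
    scores = [sum(z * h for z, h in zip(z_q, h_l)) for z_q in items]
    m = max(scores)                          # shift for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(p_true, p_pred, eps=1e-12):
    """Formula (11): -sum_q [p_q log p'_q + (1 - p_q) log(1 - p'_q)]."""
    return -sum(p * math.log(pp + eps) + (1.0 - p) * math.log(1.0 - pp + eps)
                for p, pp in zip(p_true, p_pred))

h_l = [0.5, 0.5]                              # user's final interest
items = [[1.0, 1.0], [0.0, 0.0], [1.0, 0.0]]  # converted commodity data z
probs = softmax_probs(h_l, items)
loss = cross_entropy([1.0, 0.0, 0.0], probs)  # toy label: user bought item 0
```

Training then minimizes this loss over all users until it stabilizes, at which point the softmax output directly ranks the commodities.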
Splicing the friend influence and the current interest of the user to obtain the final interest h_l of the user.
The training model is obtained when the loss value tends to be stable. According to the training model, the predicted probability value p'_q that the user is interested in each commodity is determined, the predicted probability values are sorted from large to small, the Recall index and the NDCG index are calculated at the same time, the commodity information ranked in the top 20 is selected, and the commodity information is recommended to the user.
And recommending the commodity information to the user in an information flow mode.
In the above embodiment, after the probability value of the user's interest in each commodity is calculated, the cross entropy loss function is adopted to train the model, which can improve the accuracy with which the training model learns the user's interest in each commodity. When the output loss value tends to be stable, the training model can be used on the test data, further improving the recommendation performance of the recommendation system.
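The Recall and NDCG indexes mentioned above can be illustrated with their standard top-k definitions; the helper names and toy data below are assumptions, not the patent's evaluation code.

```python
import math

def recall_at_k(recommended, relevant, k):
    """Fraction of the user's relevant items that appear in the top-k list."""
    hits = sum(1 for item in recommended[:k] if item in relevant)
    return hits / len(relevant)

def ndcg_at_k(recommended, relevant, k):
    """Discounted cumulative gain normalized by the ideal ordering:
    hits near the top of the list count more than hits near the bottom."""
    dcg = sum(1.0 / math.log2(rank + 2)
              for rank, item in enumerate(recommended[:k]) if item in relevant)
    ideal = sum(1.0 / math.log2(rank + 2) for rank in range(min(len(relevant), k)))
    return dcg / ideal

recommended = ["a", "b", "c", "d"]   # model's ranked list
relevant = {"b", "d"}                # items the user actually consumed
r = recall_at_k(recommended, relevant, k=2)   # only "b" is in the top-2
n = ndcg_at_k(recommended, relevant, k=2)
```

Recall measures whether the relevant items were retrieved at all, while NDCG additionally rewards placing them high in the ranking, which is why both are reported together.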
Example two
For convenience of understanding, the present embodiment describes a sequential social recommendation method based on a door mechanism by using a more specific example, and as shown in fig. 3, the sequential social recommendation method based on the door mechanism includes:
s1, inputting the user sequence data x_u = (m_1, m_2, ... m_j) into the GRU neural network to obtain a plurality of hidden states h_n corresponding to the sequence data, and filtering and selecting the plurality of hidden states h_n through the selection gate to obtain the current interest of the user;
S2, inputting the friend sequence data x_f = (k_1, k_2, ... k_j) into the GRU neural network to obtain a plurality of hidden states h_f corresponding to the sequence data; filtering and selecting with the friend sequence data and the last hidden state h_g of h_f to obtain the current interest of the friend; splicing h_g with the current interest of the friend to obtain the interest i, and inputting the interest i into the GRU neural network to output the short-term interest of the friend;
S3, initializing all the commodity data consumed by the friend to obtain the long-term interest of the friend;
S4, splicing the short-term interest of the friend with the long-term interest of the friend to obtain the final interest i_d of the friend;
S5, performing convolution and weighted calculation on the final interest i_d of the friend to obtain the friend influence h_m, and splicing the friend influence h_m with the current interest of the user to obtain the final interest h_l of the user;
S6, calculating the final interest h_l of the user according to the Softmax function to obtain the probability value for each commodity; arranging the probability values of the user's interest in the commodities in reverse order, selecting the commodity information with the top-ranked probability values, and recommending the commodity information to the user.
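The final selection step of S6 is a reverse-order sort of the probability values; a minimal sketch (the function name and toy probabilities are assumptions):

```python
def top_n_recommendations(probs, n=20):
    """Rank commodities by predicted interest probability (descending)
    and return the indices of the top-n, as in step S6 / S64."""
    ranked = sorted(range(len(probs)), key=lambda q: probs[q], reverse=True)
    return ranked[:n]

# toy probability values for 5 commodities, e.g. from the Softmax of formula (10)
probs = [0.05, 0.40, 0.15, 0.30, 0.10]
top2 = top_n_recommendations(probs, n=2)   # indices of the two most probable items
```

In the embodiment above, n = 20 and the selected indices are mapped back to commodity information before being pushed to the user.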
EXAMPLE III
The present embodiment provides a sequential social recommendation system based on a door mechanism, as shown in fig. 4, including: the system comprises an initial module, an interest acquisition module and a training module;
the initial module is used for respectively dividing the user original consumption data and the friend original consumption data into sequences to obtain a user sequence segment and a friend sequence segment, and initializing the user sequence segment and the friend sequence segment to obtain user sequence data and friend sequence data for the GRU neural network to identify;
the interest acquisition module is used for filtering and selecting the user sequence data based on a GRU neural network of a selection door mechanism to obtain the current interest of the user, and filtering and selecting the friend sequence data to obtain the current interest of the friend; splicing the current interests of the friends to obtain short-term interests of the friends, initializing commodity data in the original consumption data of the friends to obtain long-term interests of the friends, and splicing the short-term interests of the friends and the long-term interests of the friends to obtain final interests of the friends; convolving the current interest of the user and the final interest of the friends based on a neural network of graph attention to obtain weights of the friends for the user interest, and obtaining friend influences through weighted calculation according to the weights of the friends for the user interest; splicing the friend influence and the current interest of the user to obtain the final interest of the user;
the training module is used for calculating the final interest of the user according to a Softmax function to obtain the probability distribution of the user to different commodities, carrying out model training according to the probability distribution to obtain a training model, and recommending commodity information to the user according to the training model.
The embodiment also provides a sequential social recommendation system based on a door mechanism, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and is characterized in that when the processor executes the computer program, the steps of the sequential social recommendation method based on the door mechanism are implemented, and are not described in detail herein.
The present embodiment further provides a storage medium, where the storage medium includes one or more computer programs stored therein, and the one or more computer programs may be executed by one or more processors to implement the steps of the sequential social recommendation method based on the door mechanism in the above embodiments, which are not described herein again.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention essentially or partially contributes to the prior art, or all or part of the technical solution can be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The technical solutions provided by the embodiments of the present invention are described in detail above, and the principles and embodiments of the present invention are explained in this patent by applying specific examples, and the descriptions of the embodiments above are only used to help understanding the principles of the embodiments of the present invention; the present invention is not limited to the above preferred embodiments, and any modifications, equivalent replacements, improvements, etc. within the spirit and principle of the present invention should be included in the protection scope of the present invention.