Implement DeepFM for CTR prediction #485
Changes from 6 commits
76cc746
5d4166a
6e44fd6
250c394
ac65153
8a1af03
7b83fa4
@@ -0,0 +1,90 @@
# Deep Factorization Machines (DeepFM) for Click-Through Rate prediction

## Introduction
This model implements the DeepFM proposed in the following paper:

```text
Huifeng Guo, Ruiming Tang, Yunming Ye, Zhenguo Li and Xiuqiang He. DeepFM:
A Factorization-Machine based Neural Network for CTR Prediction.
Proceedings of the Twenty-Sixth International Joint Conference on
Artificial Intelligence (IJCAI-17), 2017
```

Review comment: Remove the whitespace before lines 7 ~ 10.
Reply: Done.

The DeepFM combines factorization machines and deep neural networks to model
both low-order and high-order feature interactions. For details of
factorization machines, please refer to the paper [Factorization
Machines](https://www.csie.ntu.edu.tw/~b97053/paper/Rendle2010FM.pdf).

## Dataset
This example uses the Criteo dataset, which was used for the [Display Advertising
Challenge](https://www.kaggle.com/c/criteo-display-ad-challenge/)
hosted by Kaggle.

Each row contains the features for one ad display, and the first column is a label
indicating whether this ad has been clicked or not. There are 39 features in
total: 13 features take integer values and the other 26 features are
categorical. For the test dataset, the labels are omitted.

Download the dataset:
```bash
cd data && ./download.sh && cd ..
```
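
For orientation, each raw training row is a tab-separated record: the label, followed by the 13 integer features and then the 26 categorical features. This layout is assumed from the standard Criteo DAC release; `preprocess.py` and `reader.py` in this example are the authoritative parsers. A minimal sketch of splitting one such row:

```python
# Hypothetical illustration only; preprocess.py and reader.py do the real parsing.
def split_raw_row(line):
    fields = line.rstrip('\n').split('\t')
    label = int(fields[0])      # 1 if the ad was clicked, 0 otherwise
    dense = fields[1:14]        # 13 integer-valued features (values may be empty)
    sparse = fields[14:40]      # 26 categorical features (hashed string values)
    return label, dense, sparse
```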

## Model
The DeepFM model is composed of the factorization machine layer (FM) and deep
neural networks (DNN). All the input features are fed to both the FM and the DNN.
The outputs from the FM and the DNN are combined to form the final output. The
embedding layer for sparse features in the DNN shares its parameters with the
latent vectors (factors) of the FM layer.

The factorization machine layer in PaddlePaddle computes the second-order
interactions. The following code example combines the factorization machine
layer and a fully connected layer to form the full version of a factorization
machine:

```python
def fm_layer(input, factor_size):
    first_order = paddle.layer.fc(input=input, size=1, act=paddle.activation.Linear())
    second_order = paddle.layer.factorization_machine(input=input, factor_size=factor_size)
    fm = paddle.layer.addto(input=[first_order, second_order],
                            act=paddle.activation.Linear(),
                            bias_attr=False)

    return fm
```
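
For context, the combination above corresponds to the standard second-order factorization machine from Rendle's paper; the equation below is added here for reference and is not part of the example code:

```
\hat{y}(x) = w_0 + \sum_{i=1}^{n} w_i x_i
           + \sum_{i=1}^{n} \sum_{j=i+1}^{n} \langle \mathbf{v}_i, \mathbf{v}_j \rangle \, x_i x_j
```

The fully connected layer supplies the linear terms, and the `factorization_machine` layer supplies the pairwise interaction terms through latent vectors of length `factor_size`.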

## Data preparation
To preprocess the raw dataset, the integer features are clipped and then min-max
normalized to [0, 1], and the categorical features are one-hot encoded. The raw
training dataset is split such that 90% is used for training and the other
10% is used for validation during training.

```bash
python preprocess.py --datadir ./data/raw --outdir ./data
```
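
As a rough illustration of the clipping and min-max normalization applied to the integer features (a hypothetical sketch with an assumed clipping threshold; `preprocess.py` is the actual implementation used by this example):

```python
# Hypothetical sketch; preprocess.py is the real implementation for this example.
def normalize_column(values, clip_max=100.0):
    clipped = [min(float(v), clip_max) for v in values]   # clip large outliers
    lo, hi = min(clipped), max(clipped)
    if hi == lo:                                          # constant column maps to 0
        return [0.0 for _ in clipped]
    return [(v - lo) / (hi - lo) for v in clipped]        # scale into [0, 1]
```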

## Train
The command line options for training can be listed by `python train.py -h`.

To train the model:
```bash
python train.py \
        --train_data_path data/train.txt \
        --test_data_path data/valid.txt \
        2>&1 | tee train.log
```

## Evaluate
After training pass 9 batch 40000, the testing AUC is `0.807178` and the testing
cost is `0.445196`.

## Infer
The command line options for inference can be listed by `python infer.py -h`.

To run inference on the test dataset:
```bash
python infer.py \
        --model_gz_path models/model-pass-9-batch-10000.tar.gz \
        --data_path data/test.txt \
        --prediction_output_path ./predict.txt
```

@@ -0,0 +1,8 @@
#!/bin/bash

wget --no-check-certificate https://s3-eu-west-1.amazonaws.com/criteo-labs/dac.tar.gz
tar zxf dac.tar.gz
rm -f dac.tar.gz

mkdir raw
mv ./*.txt raw/

@@ -0,0 +1,63 @@
import os
import gzip
import argparse
import itertools

import paddle.v2 as paddle

from network_conf import DeepFM
import reader


def parse_args():
    parser = argparse.ArgumentParser(description="PaddlePaddle DeepFM example")
    parser.add_argument(
        '--model_gz_path',
        type=str,
        required=True,
        help="path of model parameters gz file")
    parser.add_argument(
        '--data_path',
        type=str,
        required=True,
        help="path of the dataset to infer")
    parser.add_argument(
        '--prediction_output_path',
        type=str,
        required=True,
        help="path to output the prediction")
    parser.add_argument(
        '--factor_size',
        type=int,
        default=10,
        help="the factor size for the factorization machine (default:10)")

    return parser.parse_args()


def infer():
    args = parse_args()

    # Initialize PaddlePaddle for single-threaded CPU inference.
    paddle.init(use_gpu=False, trainer_count=1)

    # Build the network in inference mode so it returns the prediction layer.
    model = DeepFM(args.factor_size, infer=True)

    # Load the trained parameters from the gzipped tar archive saved during training.
    parameters = paddle.parameters.Parameters.from_tar(
        gzip.open(args.model_gz_path, 'r'))

    inferer = paddle.inference.Inference(
        output_layer=model, parameters=parameters)

    dataset = reader.Dataset()

    infer_reader = paddle.batch(dataset.infer(args.data_path), batch_size=1000)

    # Write one prediction per line.
    with open(args.prediction_output_path, 'w') as out:
        for id, batch in enumerate(infer_reader()):
            res = inferer.infer(input=batch)
            predictions = [x for x in itertools.chain.from_iterable(res)]
            out.write('\n'.join(map(str, predictions)) + '\n')


if __name__ == '__main__':
    infer()

@@ -0,0 +1,76 @@
import paddle.v2 as paddle

dense_feature_dim = 13
sparse_feature_dim = 117568


def fm_layer(input, factor_size, fm_param_attr):
    # First-order (linear) term.
    first_order = paddle.layer.fc(
        input=input, size=1, act=paddle.activation.Linear())
    # Second-order pairwise interactions through shared latent factors.
    second_order = paddle.layer.factorization_machine(
        input=input,
        factor_size=factor_size,
        act=paddle.activation.Linear(),
        param_attr=fm_param_attr)
    out = paddle.layer.addto(
        input=[first_order, second_order],
        act=paddle.activation.Linear(),
        bias_attr=False)
    return out

Review comment: Lines 7 ~ 19 could be contributed as a helper under the Paddle repo. Let's add that after the corresponding PR under Paddle is merged.
Reply: OK.

def DeepFM(factor_size, infer=False):
    dense_input = paddle.layer.data(
        name="dense_input",
        type=paddle.data_type.dense_vector(dense_feature_dim))
    sparse_input = paddle.layer.data(
        name="sparse_input",
        type=paddle.data_type.sparse_binary_vector(sparse_feature_dim))
    sparse_input_ids = [
        paddle.layer.data(
            name="C" + str(i),
            type=paddle.data_type.integer_value(sparse_feature_dim))
        for i in range(1, 27)
    ]

    dense_fm = fm_layer(
        dense_input,
        factor_size,
        fm_param_attr=paddle.attr.Param(name="DenseFeatFactors"))
    sparse_fm = fm_layer(
        sparse_input,
        factor_size,
        fm_param_attr=paddle.attr.Param(name="SparseFeatFactors"))

    # The DNN embeddings share the "SparseFeatFactors" parameters with the
    # latent vectors of the sparse FM layer.
    def embedding_layer(input):
        return paddle.layer.embedding(
            input=input,
            size=factor_size,
            param_attr=paddle.attr.Param(name="SparseFeatFactors"))

    sparse_embed_seq = map(embedding_layer, sparse_input_ids)
    sparse_embed = paddle.layer.concat(sparse_embed_seq)

    fc1 = paddle.layer.fc(
        input=[sparse_embed, dense_input],
        size=400,
        act=paddle.activation.Relu())
    fc2 = paddle.layer.fc(input=fc1, size=400, act=paddle.activation.Relu())
    fc3 = paddle.layer.fc(input=fc2, size=400, act=paddle.activation.Relu())

    # Combine the FM outputs and the DNN output to form the final prediction.
    predict = paddle.layer.fc(
        input=[dense_fm, sparse_fm, fc3],
        size=1,
        act=paddle.activation.Sigmoid())

    if not infer:
        label = paddle.layer.data(
            name="label", type=paddle.data_type.dense_vector(1))
        cost = paddle.layer.multi_binary_label_cross_entropy_cost(
            input=predict, label=label)
        paddle.evaluator.classification_error(
            name="classification_error", input=predict, label=label)
        paddle.evaluator.auc(name="auc", input=predict, label=label)
        return cost
    else:
        return predict

Review comment: Deep Factorization Machine for Click-Through Rate prediction
Reply: Done.