QRec: A Python Framework for quick implementation of recommender systems (TensorFlow Based)

Overview

Introduction

QRec is a Python framework for recommender systems (supported on Python 3.7.4 and TensorFlow 1.14+) in which a number of influential and recent state-of-the-art recommendation models are implemented. QRec has a lightweight architecture and provides user-friendly interfaces that facilitate model implementation and evaluation.
Founder and principal contributor: @Coder-Yu
Other contributors: @DouTong @Niki666 @HuXiLiFeng @BigPowerZ @flyxu
Supported by: @AIhongzhi (A/Prof. Hongzhi Yin, UQ), @mingaoo (A/Prof. Min Gao, CQU)

What's New

12/10/2021 - BUIR, proposed in a SIGIR'21 paper, has been added.
30/07/2021 - We have migrated QRec from Python 2 to Python 3.
07/06/2021 - SEPT, proposed in our KDD'21 paper, has been added.
16/05/2021 - SGL, proposed in a SIGIR'21 paper, has been added.
16/01/2021 - MHCN, proposed in our WWW'21 paper, has been added.
22/09/2020 - DiffNet, proposed in a SIGIR'19 paper, has been added.
19/09/2020 - DHCF, proposed in a KDD'20 paper, has been added.
29/07/2020 - ESRF, proposed in our TKDE paper, has been added.
23/07/2020 - LightGCN, proposed in a SIGIR'20 paper, has been added.
17/09/2019 - NGCF, proposed in a SIGIR'19 paper, has been added.
13/08/2019 - RSGAN, proposed in an ICDM'19 paper, has been added.
09/08/2019 - Our paper has been accepted as a full research paper by ICDM'19.
20/02/2019 - IRGAN, proposed in a SIGIR'17 paper, has been added.
12/02/2019 - CFGAN, proposed in a CIKM'18 paper, has been added.

Architecture

QRec Architecture

Workflow

QRec Workflow

Features

  • Cross-platform: QRec can be easily deployed and executed on any platform, including MS Windows, Linux, and macOS.
  • Fast execution: QRec is built on NumPy, TensorFlow, and lightweight data structures, which keeps it fast.
  • Easy configuration: QRec configures recommenders through a configuration file and provides multiple evaluation protocols.
  • Easy expansion: QRec provides a set of well-designed recommendation interfaces with which new algorithms can be easily implemented.

Requirements

  • gensim==4.1.2
  • joblib==1.1.0
  • mkl==2022.0.0
  • mkl_service==2.4.0
  • networkx==2.6.2
  • numba==0.53.1
  • numpy==1.20.3
  • scipy==1.6.2
  • tensorflow==1.14.0

Usage

There are two ways to run the recommendation models in QRec:

  • Configure the xx.conf file in the directory named config (xx is the name of the model you want to run), then run main.py.

Or

  • Follow the code in snippet.py.

For more details, we refer you to the handbook of QRec.
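
For a quick start, the following minimal sketch shows what the snippet.py path might look like. The module paths and class names (ModelConf in util.config and the QRec class) are assumptions based on the project layout; consult snippet.py and the handbook for the exact entry points.

# Illustrative sketch only: the imports below are assumptions; see snippet.py for the real entry points.
from util.config import ModelConf
from QRec import QRec

conf = ModelConf('./config/LightGCN.conf')  # the .conf file of the model to run
recommender = QRec(conf)
recommender.execute()  # trains the model and evaluates it with the protocol set in the conf file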

Configuration

Essential Options

Entry Example Description
ratings D:/MovieLens/100K.txt Set the file path of the rating dataset. Format: the columns of each row are separated by a space, tab, or comma.
social D:/MovieLens/trusts.txt Set the file path of the social dataset. Format: the columns of each row are separated by a space, tab, or comma.
ratings.setup -columns 0 1 2 -columns: (user, item, rating) columns of rating data are used.
social.setup -columns 0 1 2 -columns: (trustor, trustee, weight) columns of social data are used.
model.name UserKNN/ItemKNN/SlopeOne/etc. Name of the recommendation model.
evaluation.setup -testSet ../dataset/testset.txt Main option: -testSet, -ap, -cv (choose one of them)
-testSet path/to/test/file (need to specify the test set manually)
-ap ratio (ap means that the ratings are automatically partitioned into training set and test set, the number is the ratio of the test set. e.g. -ap 0.2)
-cv k (-cv means cross validation; k is the number of folds. e.g. -cv 5)
-predict path/to/user list/file (predict for a given list of users without evaluation; the user list file must be specified manually, with one user per line)
Secondary options: -b, -p, -cold, -tf, -val (multiple choices allowed)
-val ratio (the model is tested on a validation set generated by randomly sampling the training data with the given ratio.)
-b thres (binarize the rating values. Ratings greater than or equal to thres are set to 1, and ratings below thres are discarded. e.g. -b 3.0)
-p (if this option is added, cross validation is executed in parallel; otherwise the folds run one by one)
-tf (model training will be conducted on TensorFlow (only applicable and needed for shallow models))
-cold thres (evaluation on cold-start users; users with more than thres rated items in the training set are removed from the test set)
item.ranking off -topN -1 Main option: whether to do item ranking
-topN N1,N2,N3...: the lengths of the recommendation lists. *QRec can generate evaluation results for multiple values of N at the same time
output.setup on -dir ./Results/ Main option: whether to output recommendation results
-dir path: the directory path of output results.
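
Putting the entries above together, an illustrative xx.conf might look as follows. The values mirror the examples in the table; the key=value layout is an assumption, so compare it against the .conf files shipped in the config directory before relying on it.

ratings=D:/MovieLens/100K.txt
ratings.setup=-columns 0 1 2
social=D:/MovieLens/trusts.txt
social.setup=-columns 0 1 2
model.name=UserKNN
evaluation.setup=-cv 5 -b 3.0
item.ranking=off -topN -1
output.setup=on -dir ./Results/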

Memory-based Options

similarity pcc/cos Set the similarity method to use. Options: PCC, COS;
num.neighbors 30 Set the number of neighbors used for KNN-based algorithms such as UserKNN, ItemKNN.

Model-based Options

num.factors 5/10/20/number Set the number of latent factors
num.max.epoch 100/200/number Set the maximum number of epochs for iterative recommendation algorithms.
learnRate -init 0.01 -max 1 -init: initial learning rate for iterative recommendation algorithms;
-max: maximum learning rate (default 1);
reg.lambda -u 0.05 -i 0.05 -b 0.1 -s 0.1 -u: user regularization; -i: item regularization; -b: bias regularization; -s: social regularization
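
Continuing the illustrative conf file above, memory-based and model-based entries follow the same pattern (values are taken from the tables; again, check the bundled configuration files for the exact syntax):

similarity=cos
num.neighbors=30
num.factors=10
num.max.epoch=100
learnRate=-init 0.01 -max 1
reg.lambda=-u 0.05 -i 0.05 -b 0.1 -s 0.1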

Implement Your Model

  • 1. Make your new algorithm extend the proper base class.
  • 2. Reimplement the following functions as needed.
          - readConfiguration()
          - printAlgorConfig()
          - initModel()
          - trainModel()
          - saveModel()
          - loadModel()
          - predictForRanking()
          - predict()

For more details, we refer you to the handbook of QRec.
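
As a rough sketch, a new model typically subclasses one of the recommender base classes and overrides the hooks listed above. The base-class name and import path below are assumptions (QRec ships several base classes, e.g. for iterative and social recommenders); adapt them to your checkout and the handbook.

# Skeleton of a custom model; the import path and base-class name are assumptions.
from baseclass.IterativeRecommender import IterativeRecommender

class MyModel(IterativeRecommender):
    def initModel(self):
        super(MyModel, self).initModel()
        # add model-specific parameters here, e.g. extra embedding matrices

    def trainModel(self):
        # training loop: iterate over epochs/batches and update the parameters
        pass

    def predict(self, u, i):
        # return the predicted rating of user u on item i (rating prediction)
        pass

    def predictForRanking(self, u):
        # return predicted scores over all items for user u (item ranking)
        pass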

Implemented Algorithms

       
Rating prediction Paper
SlopeOne Lemire and Maclachlan, Slope One Predictors for Online Rating-Based Collaborative Filtering, SDM'05.
PMF Salakhutdinov and Mnih, Probabilistic Matrix Factorization, NIPS'08.
SoRec Ma et al., SoRec: Social Recommendation Using Probabilistic Matrix Factorization, CIKM'08.
SVD++ Koren, Factorization meets the neighborhood: a multifaceted collaborative filtering model, SIGKDD'08.
RSTE Ma et al., Learning to Recommend with Social Trust Ensemble, SIGIR'09.
SVD Y. Koren, Collaborative Filtering with Temporal Dynamics, SIGKDD'09.
SocialMF Jamali and Ester, A Matrix Factorization Technique with Trust Propagation for Recommendation in Social Networks, RecSys'10.
EE Khoshneshin et al., Collaborative Filtering via Euclidean Embedding, RecSys'10.
SoReg Ma et al., Recommender systems with social regularization, WSDM'11.
LOCABAL Tang, Jiliang, et al. Exploiting local and global social context for recommendation, AAAI'13.
SREE Li et al., Social Recommendation Using Euclidean embedding, IJCNN'17.
CUNE-MF Zhang et al., Collaborative User Network Embedding for Social Recommender Systems, SDM'17.

                       
Item Ranking Paper
BPR Rendle et al., BPR: Bayesian Personalized Ranking from Implicit Feedback, UAI'09.
WRMF Hu et al., Collaborative Filtering for Implicit Feedback Datasets, ICDM'08.
SBPR Zhao et al., Leveraging Social Connections to Improve Personalized Ranking for Collaborative Filtering, CIKM'14.
ExpoMF Liang et al., Modeling User Exposure in Recommendation, WWW'16.
CoFactor Liang et al., Factorization Meets the Item Embedding: Regularizing Matrix Factorization with Item Co-occurrence, RecSys'16.
TBPR Wang et al., Social Recommendation with Strong and Weak Ties, CIKM'16.
CDAE Wu et al., Collaborative Denoising Auto-Encoders for Top-N Recommender Systems, WSDM'16.
DMF Xue et al., Deep Matrix Factorization Models for Recommender Systems, IJCAI'17.
NeuMF He et al., Neural Collaborative Filtering, WWW'17.
CUNE-BPR Zhang et al., Collaborative User Network Embedding for Social Recommender Systems, SDM'17.
IRGAN Wang et al., IRGAN: A Minimax Game for Unifying Generative and Discriminative Information Retrieval Models, SIGIR'17.
SERec Wang et al., Collaborative Filtering with Social Exposure: A Modular Approach to Social Recommendation, AAAI'18.
APR He et al., Adversarial Personalized Ranking for Recommendation, SIGIR'18.
IF-BPR Yu et al., Adaptive Implicit Friends Identification over Heterogeneous Network for Social Recommendation, CIKM'18.
CFGAN Chae et al., CFGAN: A Generic Collaborative Filtering Framework based on Generative Adversarial Networks, CIKM'18.
NGCF Wang et al., Neural Graph Collaborative Filtering, SIGIR'19.
DiffNet Wu et al., A Neural Influence Diffusion Model for Social Recommendation, SIGIR'19.
RSGAN Yu et al., Generating Reliable Friends via Adversarial Learning to Improve Social Recommendation, ICDM'19.
LightGCN He et al. LightGCN: Simplifying and Powering Graph Convolution Network for Recommendation, SIGIR'20.
DHCF Ji et al. Dual Channel Hypergraph Collaborative Filtering, KDD'20.
ESRF Yu et al., Enhancing Social Recommendation with Adversarial Graph Convolutional Networks, TKDE'20.
MHCN Yu et al. Self-Supervised Multi-Channel Hypergraph Convolutional Network for Social Recommendation, WWW'21.
SGL Wu et al. Self-supervised Graph Learning for Recommendation, SIGIR'21.
SEPT Yu et al. Socially-Aware Self-supervised Tri-Training for Recommendation, KDD'21.
BUIR Lee et al. Bootstrapping User and Item Representations for One-Class Collaborative Filtering, SIGIR'21.

Related Datasets

   
Data Set | Users | Items | Ratings (Scale) | Density | Social Users | Social Links (Type)
Ciao [1] | 7,375 | 105,114 | 284,086 [1, 5] | 0.0365% | 7,375 | 111,781 (Trust)
Epinions [2] | 40,163 | 139,738 | 664,824 [1, 5] | 0.0118% | 49,289 | 487,183 (Trust)
Douban [3] | 2,848 | 39,586 | 894,887 [1, 5] | 0.794% | 2,848 | 35,770 (Trust)
LastFM [4] | 1,892 | 17,632 | 92,834 (implicit) | 0.27% | 1,892 | 25,434 (Trust)
Yelp [5] | 19,539 | 21,266 | 450,884 (implicit) | 0.11% | 19,539 | 864,157 (Trust)
Amazon-Book [6] | 52,463 | 91,599 | 2,984,108 (implicit) | 0.11% | - | -

Reference

[1]. Tang, J., Gao, H., Liu, H.: mTrust: Discerning multi-faceted trust in a connected world. In: International Conference on Web Search and Web Data Mining, WSDM 2012, Seattle, WA, USA, February. pp. 93–102 (2012)

[2]. Massa, P., Avesani, P.: Trust-aware recommender systems. In: Proceedings of the 2007 ACM conference on Recommender systems. pp. 17–24. ACM (2007)

[3]. G. Zhao, X. Qian, and X. Xie, “User-service rating prediction by exploring social users’ rating behaviors,” IEEE Transactions on Multimedia, vol. 18, no. 3, pp. 496–506, 2016.

[4]. Iván Cantador, Peter Brusilovsky, and Tsvi Kuflik. 2011. 2nd Workshop on Information Heterogeneity and Fusion in Recommender Systems (HetRec 2011). In Proceedings of the 5th ACM conference on Recommender systems (RecSys 2011). ACM, New York, NY, USA

[5]. Yu et al. Self-Supervised Multi-Channel Hypergraph Convolutional Network for Social Recommendation, WWW'21.

[6]. He et al. LightGCN: Simplifying and Powering Graph Convolution Network for Recommendation, SIGIR'20.

Acknowledgment

This project is supported by the Responsible Big Data Intelligence Lab (RBDI) at the School of ITEE, The University of Queensland, and by Chongqing University.

If our project is helpful to you, please cite one of these papers.
@inproceedings{yu2018adaptive,
title={Adaptive implicit friends identification over heterogeneous network for social recommendation},
author={Yu, Junliang and Gao, Min and Li, Jundong and Yin, Hongzhi and Liu, Huan},
booktitle={Proceedings of the 27th ACM International Conference on Information and Knowledge Management},
pages={357--366},
year={2018},
organization={ACM}
}

@inproceedings{yu2021self,
title={Self-Supervised Multi-Channel Hypergraph Convolutional Network for Social Recommendation},
author={Yu, Junliang and Yin, Hongzhi and Li, Jundong and Wang, Qinyong and Hung, Nguyen Quoc Viet and Zhang, Xiangliang},
booktitle={Proceedings of the Web Conference 2021},
pages={413--424},
year={2021}
}
