This is the repository of our article published in MDPI Entropy, "Feature Selection for Recommender Systems with Quantum Computing".

Overview

Collaborative-driven Quantum Feature Selection

This repository was developed by Riccardo Nembrini, PhD student at Politecnico di Milano. See the websites of our quantum computing group and of our recommender systems group for more information on our teams and work. This repository contains the source code for the article "Feature Selection for Recommender Systems with Quantum Computing".

Here we explain how to install the dependencies, set up the connection to the D-Wave Leap quantum cloud services, and run the experiments included in this repository.

Installation

NOTE: This repository requires Python 3.7

We suggest installing all the required packages in a new Python environment. After checking out the repository, enter its folder and run one of the following commands to create a new environment:

If you're using virtualenv:

virtualenv -p python3 cqfs
source cqfs/bin/activate

If you're using conda:

conda create -n cqfs python=3.7 anaconda
conda activate cqfs

Remember to add the project folder to the PYTHONPATH environment variable if you plan to run the experiments from the terminal:

export PYTHONPATH=$PYTHONPATH:/path/to/project/folder

Then, make sure the environment is active and install all the required packages through pip:

pip install -r requirements.txt

After installing the dependencies, we suggest compiling the Cython code in the repository.

To compile the code you must first install gcc and the Python 3 development headers. On Linux, they can be installed with the following commands:

sudo apt install gcc 
sudo apt-get install python3-dev

If you are using Windows as your operating system, the installation procedure is a bit more complex. You may refer to THIS guide.

Now you can compile all the Cython algorithms by running the following command. The script compiles within the currently active environment. The code has been developed for Linux and Windows platforms. During the compilation you may see some warnings.

python run_compile_all_cython.py

D-Wave Setup

To make use of the D-Wave cloud services you must first sign up for D-Wave Leap and get your API token.

Then, you need to run the following command in the newly created Python environment:

dwave setup

This is a guided setup for the D-Wave Ocean SDK. When asked which non-open-source packages to install, answer y and install at least the D-Wave Drivers (the D-Wave Problem Inspector package is not required, but can be useful for analysing problem solutions when solving problems directly on the QPU).

Then, continue the configuration by setting custom properties (or keeping the defaults, as we suggest), except for the Authentication token field, where you should paste the API token obtained from the D-Wave Leap dashboard.

You should now be able to connect to the D-Wave cloud services. To verify the connection, you can use the following command, which sends a test problem to D-Wave's QPU:

dwave ping
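
If you prefer to verify the connection programmatically, the following is a minimal sketch using the Ocean dwave-cloud-client API (not a script from this repository) that lists the solvers your token can access:

# Minimal sketch using dwave-cloud-client; not part of this repository.
from dwave.cloud import Client

with Client.from_config() as client:  # reads the configuration created by dwave setup
    for solver in client.get_solvers():
        print(solver.id)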

Running CQFS Experiments

First of all, you need to prepare the original files for the datasets.

For The Movies Dataset, download it from Kaggle and place the compressed file in the directory recsys/Data_manager_offline_datasets/TheMoviesDataset/, making sure the file is called the-movies-dataset.zip.

For CiteULike_a, download the following .zip file and place it in the directory recsys/Data_manager_offline_datasets/CiteULike/, making sure the file is called CiteULike_a_t.zip.

We cannot provide data for Xing Challenge 2017, but if you have the dataset available, place the compressed file containing the dataset's original files in the directory recsys/Data_manager_offline_datasets/XingChallenge2017/, making sure the file is called xing_challenge_data_2017.zip.

After preparing the datasets, run the following command from the data directory:

python split_NameOfTheDataset.py

This Python script generates the data splits used in the experiments. It also preprocesses the dataset and checks for errors in the preprocessing phase. The resulting splits are saved in the recsys/Data_manager_split_datasets directory.

After splitting the dataset, you can actually run the experiments. All the experiment scripts are in the experiments directory, so enter this folder first. Each dataset has separate experiment scripts that you can find in the corresponding directories. From now on, we assume that you are running the following commands from the dataset-specific folders, thus running the scripts contained there.

Collaborative models

First of all, we need to optimize the chosen collaborative models to use with CQFS. To do so, run the following command:

python CollaborativeFiltering.py

The resulting models will be saved in the results directory.
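
As a rough illustration of what these models compute (a conceptual sketch with toy data, not the repository's implementation), an ItemKNN collaborative model boils down to an item-item cosine similarity built from the user-item interaction matrix:

# Conceptual sketch with toy data; the repository uses its own recommender library.
import scipy.sparse as sps
from sklearn.preprocessing import normalize

URM = sps.random(500, 1000, density=0.02, format="csr")  # toy user-item interaction matrix

item_rows = normalize(URM.T.tocsr(), norm="l2", axis=1)  # items as L2-normalised rows
similarity = item_rows @ item_rows.T                     # item-item cosine similarity
print(similarity.shape)                                  # (1000, 1000)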

CQFS

Then, you can run the CQFS procedure. We divided the procedure into a selection phase and a recommendation phase. To perform the selection through CQFS, run the following command:

python CQFS.py

This script will solve the CQFS problem on the corresponding dataset and save all the selected features in appropriate subdirectories under the results directory.
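
For reference, the following is a minimal sketch of how a feature-selection QUBO can be submitted to the D-Wave Leap hybrid sampler; the toy QUBO coefficients are illustrative and this is not the repository's actual implementation:

# Minimal sketch, not the repository's implementation; requires a configured Leap token.
import dimod
from dwave.system import LeapHybridSampler

Q = {(0, 0): -1.0, (1, 1): -0.5, (2, 2): -0.8, (0, 1): 0.6}  # toy QUBO coefficients
bqm = dimod.BinaryQuadraticModel.from_qubo(Q)

sampler = LeapHybridSampler()
sampleset = sampler.sample(bqm)
selected = [v for v, value in sampleset.first.sample.items() if value == 1]
print("Selected features:", selected)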

After solving the feature selection problem, you should run the following command:

python CQFSTrainer.py

This script will optimize an ItemKNN content-based recommender system for each feature selection previously obtained through CQFS with the given hyperparameters, using only the selected features. Again, all the results are saved in the corresponding subdirectories under the results directory.
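
Conceptually, the recommendation phase restricts the item-content matrix (ICM) to the selected features before fitting the content-based model; a toy sketch (names and data are illustrative, not taken from the repository) is:

# Toy sketch of restricting an item-content matrix to a feature selection.
import numpy as np
import scipy.sparse as sps

ICM = sps.random(1000, 200, density=0.05, format="csr")  # toy item-feature matrix
selected_features = np.arange(50)                        # stand-in for a CQFS selection

ICM_selected = ICM[:, selected_features]                 # keep only the selected features
print(ICM_selected.shape)                                # (1000, 50)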

NOTE: Each selection with the D-Wave Leap hybrid service on these problems takes around 8 seconds for The Movies Dataset and around 30 seconds for CiteULike_a. Therefore, running the script as it is would consume all the free time included with the developer plan on D-Wave Leap and may result in errors or invalid selections once no free time remains.

We suggest reducing the number of hyperparameters passed when running the experiments or, even better, choosing a single collaborative model and performing all the experiments on it.

This is not the case when running experiments with Simulated Annealing, since it is executed locally.
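
As a reference, the same kind of QUBO can be sampled locally with the Ocean simulated annealing sampler; a minimal sketch (toy coefficients, not the repository's code) is:

# Minimal sketch of local simulated annealing with dwave-neal; toy QUBO coefficients.
import neal

Q = {(0, 0): -1.0, (1, 1): -0.5, (2, 2): -0.8, (0, 1): 0.6}
sampler = neal.SimulatedAnnealingSampler()
sampleset = sampler.sample_qubo(Q, num_reads=100)
print(sampleset.first.sample)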

For Xing Challenge 2017, experiments run directly on the D-Wave QPU. With all the hyperparameters left unchanged, the experiments should not exceed the free time of the developer plan. Pay attention when increasing the number of reads from the sampler or the annealing time.

Baselines

To obtain the baseline evaluations, run the corresponding scripts with the following commands:

# ItemKNN content-based with all the features
python baseline_CBF.py

# ItemKNN content-based with features selected through TF-IDF
python baseline_TFIDF.py

# CFeCBF feature weighting baseline
python baseline_CFW.py
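
For reference, the TF-IDF baseline amounts to ranking features by a TF-IDF score computed on the item-feature matrix and keeping the top ones; the following is a toy sketch of that idea (toy data, not the repository's baseline_TFIDF.py script):

# Toy sketch of TF-IDF based feature scoring; not the repository's baseline script.
import numpy as np
import scipy.sparse as sps

ICM = sps.random(1000, 200, density=0.05, format="csr")  # toy item-feature matrix
n_items = ICM.shape[0]

document_frequency = np.asarray((ICM > 0).sum(axis=0)).ravel()
idf = np.log(n_items / np.maximum(document_frequency, 1))
scores = np.asarray(ICM.mean(axis=0)).ravel() * idf      # average TF weighted by IDF

top_k = 50
selected_features = np.argsort(-scores)[:top_k]
print(selected_features)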

Acknowledgements

Software produced by Riccardo Nembrini. Recommender systems library by Maurizio Ferrari Dacrema.

Article authors: Riccardo Nembrini, Maurizio Ferrari Dacrema, Paolo Cremonesi
