
This repository contains the data and code for the paper "Diverse Text Generation via Variational Encoder-Decoder Models with Gaussian Process Priors" (SPNLP@ACL 2022).

Overview

GP-VAE

This repository provides datasets and code for preprocessing, training and testing models for the paper:

Diverse Text Generation via Variational Encoder-Decoder Models with Gaussian Process Priors
Wanyu Du, Jianqiao Zhao, Liwei Wang and Yangfeng Ji
ACL 2022 6th Workshop on Structured Prediction for NLP


Installation

The following command installs all necessary packages:

pip install -r requirements.txt

The project was tested using Python 3.6.6.

Datasets

  1. Twitter URL includes trn/val/tst.tsv, where each line has the following format (a minimal loading sketch follows this list):
source_sentence \t reference_sentence
  2. GYAFC has two sub-domains, em and fr; please request and download the data from the original paper here.
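
A minimal loading sketch for the Twitter URL format (the path and helper name below are illustrative, assuming the data sits under data/twitter_url/ as in the training commands):

def load_pairs(path):
    # Each line is: source_sentence \t reference_sentence
    pairs = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.rstrip("\n")
            if line:
                source, reference = line.split("\t", 1)
                pairs.append((source, reference))
    return pairs

pairs = load_pairs("data/twitter_url/trn.tsv")
print(pairs[0])  # (source_sentence, reference_sentence)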

Models

Training

Train the LSTM-based variational encoder-decoder with GP priors:

cd models/pg/
python main.py --task train --data_file ../../data/twitter_url \
			   --model_type gp_full --kernel_v 65.0 --kernel_r 0.0001

where --data_file indicates the data path for the training data,
--model_type indicates which prior to use (one of copynet/normal/gp_full), and
--kernel_v and --kernel_r specify the hyper-parameters for the kernel of the GP prior (illustrated below).
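
The exact kernel is defined by the model code; purely as a hypothetical illustration of what the two flags control, a squared-exponential kernel in which kernel_v scales the overall variance and kernel_r adds a small diagonal term could look like this:

import numpy as np

def gp_kernel(positions, kernel_v=65.0, kernel_r=0.0001):
    # Squared-exponential kernel over positions; kernel_v sets the variance
    # scale, kernel_r adds a small diagonal term for numerical stability.
    d = positions[:, None] - positions[None, :]
    K = kernel_v * np.exp(-0.5 * d ** 2)
    return K + kernel_r * np.eye(len(positions))

K = gp_kernel(np.arange(10, dtype=float))
print(K.shape)  # (10, 10)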

Train the transformer-based variational encoder-decoder with GP priors:

cd models/t5/
python t5_gpvae.py --task train --dataset twitter_url \
    			   --kernel_v 512.0 --kernel_r 0.001 

where --dataset indicates the dataset name for training, and
--kernel_v and --kernel_r specify the hyper-parameters for the kernel of the GP prior.

Inference

Test the LSTM-based variational encoder-decoder with GP priors:

cd models/pg/
python main.py --task decode --data_file ../../data/twitter_url \
			   --model_type gp_full --kernel_v 65.0 --kernel_r 0.0001 \
			   --decode_from sample \
			   --model_file /path/to/best/checkpoint

where --data_file indicates the data path for the testing data,
--model_type indicates which prior to use (one of copynet/normal/gp_full),
--kernel_v and --kernel_r specify the hyper-parameters for the kernel of the GP prior, and
--decode_from indicates whether generation conditions on z_mean or on a randomly sampled z (one of mean/sample; see the sketch below).
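
A hypothetical sketch of the difference between the two --decode_from modes (the names z_mean, z_logvar, and select_z are illustrative, not taken from this repo):

import torch

def select_z(z_mean, z_logvar, decode_from="sample"):
    # "mean": deterministic decoding from the posterior mean.
    if decode_from == "mean":
        return z_mean
    # "sample": reparameterized draw z ~ N(z_mean, sigma^2) for diverse outputs.
    std = torch.exp(0.5 * z_logvar)
    return z_mean + std * torch.randn_like(std)

z = select_z(torch.zeros(1, 64), torch.zeros(1, 64), decode_from="sample")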

Test the transformer-based variational encoder-decoder with GP priors:

cd models/t5/
python t5_gpvae.py --task eval --dataset twitter_url \
    			   --kernel_v 512.0 --kernel_r 0.001 \
    			   --from_mean \
    			   --timestamp '2021-02-14-04-57-04' \
    			   --ckpt '30000' # load best checkpoint

where --dataset indicates the dataset name for testing,
--kernel_v and --kernel_r specify the hyper-parameters for the kernel of the GP prior,
--from_mean indicates whether to generate results conditioning on z_mean rather than on a randomly sampled z, and
--timestamp and --ckpt together identify the file path of the best checkpoint (see the sketch below).
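
The actual checkpoint layout is determined by the training script; as a purely hypothetical sketch of how --timestamp and --ckpt might compose into a checkpoint path:

import os

def checkpoint_path(dataset, timestamp, ckpt, root="checkpoints"):
    # Illustrative layout only: root/dataset/timestamp/model_<step>.pt
    return os.path.join(root, dataset, timestamp, f"model_{ckpt}.pt")

print(checkpoint_path("twitter_url", "2021-02-14-04-57-04", "30000"))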

Citation

If you find this work useful for your research, please cite our paper:

Diverse Text Generation via Variational Encoder-Decoder Models with Gaussian Process Priors

@inproceedings{du2022gpvae,
    title = "Diverse Text Generation via Variational Encoder-Decoder Models with Gaussian Process Priors",
    author = "Du, Wanyu and Zhao, Jianqiao and Wang, Liwei and Ji, Yangfeng",
    booktitle = "Proceedings of the 6th Workshop on Structured Prediction for NLP (SPNLP 2022)",
    year = "2022",
    publisher = "Association for Computational Linguistics",
}