Grover

UPDATE, Sept 17 2019. We got into NeurIPS (camera ready coming soon!) and we've made Grover-Mega publicly available without you needing to fill out the form. You can download it using download_model.py.

(aka, code for Defending Against Neural Fake News)

Grover is a model for Neural Fake News -- both generation and detection. However, it probably can also be used for other generation tasks.

Visit our project page at rowanzellers.com/grover, the AI2 online demo, or read the full paper at arxiv.org/abs/1905.12616.


What's in this repo?

We are releasing the following:

  • Code for the Grover generator (in lm/). This involves training the model as a language model across the article fields (domain, date, authors, headline, and body).
  • Code for the Grover discriminator in discrimination/. Without many changes, you can run Grover as a discriminator to detect Neural Fake News.
  • Code for generating from a Grover model, in sample/. (An example of the metadata format it expects is sketched after this list.)
  • Code for making your own RealNews dataset in realnews/.
  • Model checkpoints, freely available online for all of the Grover models. To use the RealNews dataset for research, please submit this form and contact me on Twitter or through email. You will need to use a valid account that has Google Cloud enabled; otherwise, I won't be able to give you access 😢
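
For generation, sample/contextual_generate.py (see the Quickstart below) reads article metadata as one JSON object per line (a .jsonl file) and conditions on fields like the domain, date, authors, and headline. Below is a minimal sketch of writing such a file in Python; the field names here are assumptions for illustration, so compare against sample/april2019_set_mini.jsonl for the schema the script actually expects.

import json

# Hypothetical metadata record. The field names below are assumptions for
# illustration only -- check sample/april2019_set_mini.jsonl for the real schema.
record = {
    "domain": "nytimes.com",
    "date": "04-19-2019",
    "authors": ["Jane Doe"],
    "title": "An example headline to condition the generator on",
    "article": "",  # left empty so the generator writes the body
}

with open("my_metadata.jsonl", "w") as f:
    f.write(json.dumps(record) + "\n")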

Scroll down 👇 for some easy-to-use instructions for setting up Grover to generate news articles.

Setting up your environment

NOTE: If you just care about making your own RealNews dataset, you will need to set up a separate environment just for that, using an AWS machine (see realnews/).

There are a few ways you can run Grover:

  • Generation mode (inference). This requires a GPU because I wasn't able to get top-p sampling, or caching of transformer hidden states, to work on a TPU. (A short sketch of what top-p sampling does is included below.)
  • LM Validation mode (perplexity). This could be run on a GPU or a TPU, but I've only tested this with TPU inference.
  • LM Training mode. This requires a large TPU pod.
  • Discrimination mode (training). This requires a TPU pod.
  • Discrimination mode (inference). This could be run on a GPU or a TPU, but I've only tested this with TPU inference.

NOTE: You might be able to get things to work using different hardware. However, it would probably be a lot of engineering work, and I don't recommend attempting it. Please don't contact me with requests like this, as there's not much help I can give you.
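
For background on the generation mode above: top-p (nucleus) sampling draws each token from the smallest set of highest-probability tokens whose cumulative probability reaches p, after renormalizing that set. Here is a minimal NumPy sketch of the idea; it is illustrative only and is not the TensorFlow implementation used in sample/.

import numpy as np

def top_p_sample(probs, p=0.95):
    # probs: 1-D array of next-token probabilities over the vocabulary.
    order = np.argsort(probs)[::-1]                   # token ids, most probable first
    cumulative = np.cumsum(probs[order])
    cutoff = int(np.searchsorted(cumulative, p)) + 1  # smallest prefix whose mass reaches p
    keep = order[:cutoff]
    renormalized = probs[keep] / probs[keep].sum()
    return int(np.random.choice(keep, p=renormalized))

# Example: with p=0.9, only the tokens covering the top 90% of probability mass
# can ever be sampled (here, the first three tokens).
token_id = top_p_sample(np.array([0.5, 0.3, 0.1, 0.05, 0.05]), p=0.9)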

I used Python 3.6 for everything. Usually I set it up using the following commands:

curl -o ~/miniconda.sh https://repo.continuum.io/miniconda/Miniconda3-4.5.4-Linux-x86_64.sh && \
     chmod +x ~/miniconda.sh && \
     ~/miniconda.sh -b -p ~/conda && \
     rm ~/miniconda.sh && \
     ~/conda/bin/conda install -y python=3.6

Then run pip install -r requirements-gpu.txt if you're installing on a GPU, or pip install -r requirements-tpu.txt for a TPU.

Misc notes/tips:

  • If you have a lot of projects on your machine, you might want to use an anaconda environment to handle them all. Use conda create -n grover python=3.6 to create an environment named grover. To enter the environment use source activate grover. To leave use source deactivate.
  • I'm using TensorFlow 1.13.1, which requires CUDA 10.0. You'll need to install that from the NVIDIA website. I usually install it into /usr/local/cuda-10.0/, so you will need to run export LD_LIBRARY_PATH=/usr/local/cuda-10.0/lib64 so TensorFlow knows where to find it. (A quick sanity check for this setup is sketched after this list.)
  • I always set my PYTHONPATH to the repository root. While in the grover directory, run export PYTHONPATH=$(pwd) to set it.
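
After installing the GPU requirements, a quick sanity check (not part of the Grover codebase) that TensorFlow can actually see CUDA and your GPU:

import tensorflow as tf

# Should print 1.13.1 and True if CUDA 10.0 and the GPU are set up correctly.
print(tf.__version__)
print(tf.test.is_gpu_available())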

Quickstart: setting up Grover for generation!

  1. Set up your environment. Here's the easy way, assuming anaconda is installed: conda create -y -n grover python=3.6 && source activate grover && pip install -r requirements-gpu.txt
  2. Download the model using python download_model.py base
  3. Now generate: PYTHONPATH=$(pwd) python sample/contextual_generate.py -model_config_fn lm/configs/base.json -model_ckpt models/base/model.ckpt -metadata_fn sample/april2019_set_mini.jsonl -out_fn april2019_set_mini_out.jsonl

Congrats! You can view the generations, conditioned on the domain/headline/date/authors, in april2019_set_mini_out.jsonl.
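
If you want to inspect the generations programmatically, here is a minimal sketch for reading the output file. The field names "title" and "gens_article" are assumptions about the output schema; print a raw line of the file first to confirm what the script actually writes.

import json

# Walk the output .jsonl and print the conditioning headline plus the generation.
# NOTE: "title" and "gens_article" are assumed field names -- verify them first.
with open("april2019_set_mini_out.jsonl") as f:
    for line in f:
        record = json.loads(line)
        print(record.get("title"))
        print(record.get("gens_article"))
        print("-" * 80)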

FAQ: What's the deal with the release of Grover?

Our core position is that it is important to release possibly-dangerous models to researchers. At the same time, we believe Grover-Mega isn't particularly useful to anyone who isn't doing research in this area, particularly as we have an online web demo available and the model is computationally expensive. We were previously a bit stricter and limited initial use of Grover-Mega to researchers. Now that several months have passed since we put the paper on arXiv, and since several other large-scale language models have been publicly released, we figured that there is little harm in fully releasing Grover-Mega.

Bibtex

@inproceedings{zellers2019grover,
    title={Defending Against Neural Fake News},
    author={Zellers, Rowan and Holtzman, Ari and Rashkin, Hannah and Bisk, Yonatan and Farhadi, Ali and Roesner, Franziska and Choi, Yejin},
    booktitle={Advances in Neural Information Processing Systems 32},
    year={2019}
}