GraphNLI: A Graph-based Natural Language Inference Model for Polarity Prediction in Online Debates

Overview


Vibhor Agarwal, Sagar Joglekar, Anthony P. Young and Nishanth Sastry, "GraphNLI: A Graph-based Natural Language Inference Model for Polarity Prediction in Online Debates", The ACM Web Conference (TheWebConf), 2022.

Abstract

An online forum that allows participatory engagement between users very often becomes a stage for heated debates. These debates sometimes escalate into full-blown exchanges of hate and misinformation. As such, modeling these conversations through the lens of argumentation theory as graphs of supports and attacks has shown promise, especially in identifying which claims should be accepted. However, the argumentative relation of supports and attacks, also called the polarity, is difficult to infer from natural language exchanges, not least because whether a reply supports or attacks another argument intuitively depends on the surrounding context.

Various deep learning models have been used to classify polarity, where the inputs to the model are typically just the texts of the replying argument and the argument being replied to. We propose GraphNLI, a novel graph-based deep learning architecture for inferring argumentative relations that considers not only the immediate pair of arguments involved in a reply but also, through graph walks, the surrounding arguments, thereby capturing the context of the discussion. We demonstrate the performance of this model on a curated debate dataset from Kialo, an online debating platform. Our model outperforms the relevant baselines with an overall accuracy of 83%, showing that incorporating nearby arguments in addition to the pair of arguments directly involved in the reply helps in predicting argumentative relations in online debates.

The paper PDF will be available soon!

Directory Structure

  • The GraphNLI folder contains the implementation of the graph walks and the GraphNLI model.
  • The Baselines folder contains the implementation of all four baselines reported in the paper.
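
The graph walks mentioned in the abstract gather the arguments surrounding a reply so that the model sees the wider discussion rather than just the reply pair. Below is a minimal, illustrative sketch of one way such a walk could be realised over a reply tree; it is not the repository code, and the names (graph_walk, parent_of, text_of) and the p_up parameter are assumptions made for illustration. For the actual walk and model implementation, see the GraphNLI folder.

import random

# Illustrative sketch only (not the repository code). Assumptions: the debate is a
# reply tree given as `parent_of` (node id -> id of the argument it replies to,
# None for the root) and `text_of` (node id -> argument text); `p_up` is a
# hypothetical probability of continuing the walk towards the root at each step.

def graph_walk(node, parent_of, walk_len=4, p_up=0.75, seed=None):
    """Collect up to `walk_len` nodes by walking from `node` towards the root."""
    rng = random.Random(seed)
    walk, current = [node], node
    while len(walk) < walk_len:
        parent = parent_of.get(current)
        if parent is None or rng.random() > p_up:
            break  # reached the root of the debate, or stopped early
        walk.append(parent)
        current = parent
    return walk

# Toy debate tree: c replies to b, b replies to the root claim a.
parent_of = {"a": None, "b": "a", "c": "b"}
text_of = {
    "a": "Root claim.",
    "b": "Argument replying to the claim.",
    "c": "Reply whose polarity (support/attack) we want to predict.",
}

context_ids = graph_walk("c", parent_of, walk_len=3, p_up=1.0, seed=0)
context_text = " ".join(text_of[i] for i in context_ids)
print(context_ids)   # ['c', 'b', 'a']
print(context_text)  # surrounding context a polarity classifier could use

In this toy example, the walk starting at the reply c collects its parent and the root claim; the collected texts serve as the extra discussion context considered alongside the reply pair when predicting support or attack.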

Citation

If you find this paper useful in your research, please consider citing:

@inproceedings{agarwal2022graphnli,
  title={GraphNLI: A Graph-based Natural Language Inference Model for Polarity Prediction in Online Debates},
  author={Vibhor Agarwal and Sagar Joglekar and Anthony P. Young and Nishanth Sastry},
  booktitle={The ACM Web Conference (TheWebConf)},
  year={2022}
}