Overview

Data and code for the paper Outlining and Filling: Hierarchical Query Graph Generation for Answering Complex Questions over Knowledge Graph are available for research purposes.

Results

We evaluate our approach on three KGQA benchmarks: ComplexWebQuestions (Talmor and Berant, 2018), LC-QuAD (Trivedi et al., 2017), and WebQSP (Yih et al., 2016).

Dataset              Structure Acc.  Query Graph Acc.  Precision  Recall  F1-score  Hit@1
ComplexWebQuestions  66.96           51.68             65.27      68.44   64.95     65.25
LC-QuAD              78.00           60.90             75.82      75.22   75.10     76.00
WebQSP               79.91           62.63             70.22      74.38   70.61     70.37

Requirements

  • Python == 3.7.0
  • cudatoolkit == 10.1.243
  • cudnn == 7.6.5
  • six == 1.15.0
  • torch == 1.4.0
  • transformers == 4.9.2
  • numpy == 1.19.2
  • SPARQLWrapper == 1.8.5
  • rouge_score == 0.0.4
  • filelock == 3.0.12
  • nltk == 3.6.2
  • absl == 0.0
  • dataclasses == 0.6
  • datasets == 1.9.0
  • jsonlines == 2.0.0
  • python_Levenshtein == 0.12.2
  • Virtuoso SPARQL query service
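
The dependencies above can be installed into an isolated environment. Below is a minimal setup sketch using conda and pip; the environment name, the conda channels, and the exact CUDA-enabled build of PyTorch are assumptions that may need adapting to your machine. Virtuoso is set up separately (see the Data section).

# Create and activate a Python 3.7 environment (the name "hgnet" is arbitrary).
conda create -n hgnet python=3.7.0 -y
conda activate hgnet

# CUDA toolkit and cuDNN pinned to the versions listed above.
conda install -y cudatoolkit=10.1.243 cudnn=7.6.5

# Python packages pinned to the listed versions.
pip install six==1.15.0 torch==1.4.0 transformers==4.9.2 numpy==1.19.2 \
    SPARQLWrapper==1.8.5 rouge_score==0.0.4 filelock==3.0.12 nltk==3.6.2 \
    dataclasses==0.6 datasets==1.9.0 jsonlines==2.0.0 python_Levenshtein==0.12.2
# The absl pin listed above may correspond to the absl-py package on PyPI; adjust as needed.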

Data

  • Download and unzip our preprocessed data to ./. You can also regenerate it by running our scripts under ./preprocess.

  • Download our processed Freebase and DBpedia. Both contain only English triples; triples in other languages have been removed. Download and install Virtuoso to host the SPARQL query service for the downloaded Freebase and DBpedia. Here is a tutorial on how to install Virtuoso and import the knowledge graph into it; a minimal bulk-loading sketch also follows this list.

  • Download the GloVe embeddings glove.42B.300d.txt and put the file at your_glove_path.

  • Download our vocabulary from here, unzip it, and put it under ./. It contains the SPARQL cache used by the execution-guided strategy.
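
A minimal sketch of importing one of the downloaded dumps into Virtuoso with its bulk loader is shown below. The data directory, the graph IRI, and the default isql port 1111 with the dba/dba credentials are assumptions; the directory must also be listed under DirsAllowed in virtuoso.ini, and on some systems the client is installed as isql-v instead of isql.

# Register all N-Triples files in the dump directory for bulk loading.
isql 1111 dba dba exec="ld_dir('/data/freebase', '*.nt', 'http://freebase.com');"

# Run the loader and persist the loaded triples.
isql 1111 dba dba exec="rdf_loader_run();"
isql 1111 dba dba exec="checkpoint;"

# The SPARQL endpoint is then served at http://localhost:8890/sparql by default.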

Running Code

1. Training for HGNet

Before training, set the following hyperparameter in train_cwq.sh, train_lcq.sh, and train_wsp.sh.

--glove_path your_glove_path
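
If the scripts ship with the literal placeholder your_glove_path, it can be replaced in all three at once; the one-liner below assumes GNU sed and a hypothetical GloVe location.

sed -i 's|your_glove_path|/path/to/glove.42B.300d.txt|' train_cwq.sh train_lcq.sh train_wsp.sh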

Execute the following command to train the model on ComplexWebQuestions.

sh train_cwq.sh

Execute the following command to train the model on LC-QuAD.

sh train_lcq.sh

Execute the following command to train the model on WebQSP.

sh train_wsp.sh

The trained model file is saved under the ./runs directory.
The path of the trained model has the format ./runs/RUN_ID/checkpoints/best_snapshot_epoch_xx_best_val_acc_xx_model.pt.
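
To pick up the most recent best checkpoint for evaluation, a convenience one-liner such as the following can be used (it simply assumes the path format above):

ls -t ./runs/*/checkpoints/best_snapshot_*_model.pt | head -n 1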

2. Testing for HGNet

Before testing, you need to train a model first and set the following hyperparameters in eval_cwq.sh, eval_lcq.sh, and eval_wsp.sh.

--cpt your_trained_model_path
--kb_endpoint your_sparql_service_ip
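
Before running evaluation, it is worth confirming that the SPARQL service behind --kb_endpoint responds. The check below assumes Virtuoso's default endpoint URL on port 8890; substitute your own host and port.

curl -G 'http://localhost:8890/sparql' \
    -H 'Accept: application/sparql-results+json' \
    --data-urlencode 'query=SELECT * WHERE { ?s ?p ?o } LIMIT 5'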

You can also directly download our trained models from here. Unzip the archive and put it under ./.

Execute the following command to test the model on ComplexWebQuestions.

sh eval_cwq.sh

Execute the following command to test the model on LC-QuAD.

sh eval_lcq.sh

Execute the following command to test the model on WebQSP.

sh eval_wsp.sh