An interpreter for RASP as described in the ICML 2021 paper "Thinking Like Transformers"

RASP

Setup

Mac or Linux

Run ./setup.sh. It will create a python3 virtual environment and install the dependencies for RASP. It will also try to install graphviz (the non-python part) and rlwrap on your machine. If these fail, you will still be able to use RASP; however, the interface will not be as nice without rlwrap, and drawing s-op computation flows will not be possible without graphviz. After having set up, you can run ./rasp.sh to start the RASP read-evaluate-print loop (REPL).

Windows

Follow the instructions given in windows instructions.txt

The REPL

After having set up, if you are on Mac or Linux, you can run ./rasp.sh to start the RASP REPL. Otherwise, run python3 RASP_support/REPL.py. Use Ctrl+C to quit a partially entered command, and Ctrl+D to exit the REPL.

Initial Environment

RASP starts with the base s-ops: tokens, indices, and length. It also has the base functions select, aggregate, and selector_width as described in the paper, a selector full_s created through select(1,1,==) that creates a "full" attention pattern, and several other library functions (check out RASP_support/rasplib.rasp to see them).

Additionally, the REPL begins with a base example, "hello", on which it shows the output for each created s-op or selector. This example can be changed, and toggled on and off, through commands to the REPL.
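
For example, evaluating the base s-ops in a fresh REPL shows their values on "hello" (the printouts in this and the following sketches are approximate and may differ slightly in formatting):

>> indices;
     s-op: indices
 	 Example: indices("hello") = [0, 1, 2, 3, 4] (ints)
>> length;
     s-op: length
 	 Example: length("hello") = [5]*5 (ints)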

All RASP commands end with a semicolon. Commands to the REPL -- such as changing the base example -- do not.
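
For instance, setting the base example is a REPL command and takes no semicolon, while evaluating an s-op is a RASP command and requires one (again, a sketch; the string printout format is an assumption):

>> set example "hello"
>> tokens;
     s-op: tokens
 	 Example: tokens("hello") = [h, e, l, l, o] (strings)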

Start by following along with the examples -- they are kept at the bottom of this readme.

Note on input types:

RASP expects inputs of one of four types: strings, integers, floats, or booleans, handled respectively by tokens_str, tokens_int, tokens_float, and tokens_bool. Initially, RASP loads with tokens set to tokens_str; this can be changed by assignment, e.g.: tokens=tokens_int;. When changing the input type, you will also want to change the base example, e.g.: set example [0,1,2].
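
For instance, a switch to integer inputs might look like the sketch below (intermediate printouts omitted, and the shown output format is approximate); the examples later in this readme assume string tokens and the "hello" base example:

>> tokens=tokens_int;
>> set example [0,1,2]
>> tokens+1;
     s-op: out
 	 Example: out([0, 1, 2]) = [1, 2, 3] (ints)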

Note that assignments do not retroactively change the computation trees of existing s-ops!

Writing and Loading RASP files

To store RASP code in files and load it later, save the files with the .rasp extension, and load them with the 'load' command, leaving off the extension. For example, you can load the examples file paper_examples.rasp from this repository into the REPL as follows:

>> load "paper_examples";

This will make (almost) all values in the file available in the loading environment (whether the REPL, or a different .rasp file): values whose names begin with an underscore remain private to the file they are written in. Loading files in the REPL will also print a list of all loaded values.
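
For instance, consider a hypothetical file my_lib.rasp with the following two definitions:

_doubled = indices + indices;
four_x = 2 * _doubled;

Loading it with load "my_lib"; would make four_x available in the loading environment, while _doubled (whose name begins with an underscore) stays private to the file.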

Syntax Highlighting

For the Sublime Text editor, you can get syntax highlighting for .rasp files as follows:

  1. Install Package Control for Sublime (you might already have it: look in the menu [Sublime Text]->[Preferences] and see if it's there; if not, follow the instructions at https://packagecontrol.io/installation).
  2. Install the PackageDev package through Package Control ([Sublime Text]->[Preferences]->[Package Control], then type [install package], then [packagedev]).
  3. After installing PackageDev, create a new syntax definition file through [Tools]->[Packages]->[Package Development]->[New Syntax Definition].
  4. Copy the contents of RASP_support/RASP.sublime-syntax into the new syntax definition file, and save it as RASP.sublime-syntax.

[The above essentially follows the instructions at http://ilkinulas.github.io/programming/2016/02/05/sublime-text-syntax-highlighting.html, and then copies in the contents of the provided RASP.sublime-syntax file.]

Examples

Play around in the REPL!

Try simple elementwise manipulations of s-ops:

>> threexindices = 3 * indices;
     s-op: threexindices
 	 Example: threexindices("hello") = [0, 3, 6, 9, 12] (ints)
>> indices+indices;
     s-op: out
 	 Example: out("hello") = [0, 2, 4, 6, 8] (ints)

Change the base example, and create a selector that focuses each position on all positions before it:

>> set example "hey"
>> prevs=select(indices,indices,<);
     selector: prevs
 	 Example:
 			     h e y
 			 h |      
 			 e | 1    
 			 y | 1 1  

Check the output of an s-op on your new base example:

>> threexindices;
     s-op: threexindices
 	 Example: threexindices("hey") = [0, 3, 6] (ints)

Or on specific inputs:

>> threexindices(["hi","there"]);
	 =  [0, 3] (ints)
>> threexindices("hiya");
	 =  [0, 3, 6, 9] (ints)

Aggregate with the full selection pattern (loaded automatically with the REPL) to compute the proportion of a letter in your input:

>> full_s;
     selector: full_s
 	 Example:
 			     h e y
 			 h | 1 1 1
 			 e | 1 1 1
 			 y | 1 1 1
>> my_frac=aggregate(full_s,indicator(tokens=="e"));
     s-op: my_frac
 	 Example: my_frac("hey") = [0.333]*3 (floats)

Note: when an s-op's output is identical in all positions, RASP simply prints the output of one position, followed by " * X" (where X is the sequence length) to mark the repetition.

Check if a letter is in your input at all:

>> "e" in tokens;
     s-op: out
 	 Example: out("hey") = [T]*3 (bools)

Alternatively, check elementwise whether each of your input tokens belongs to some group:

>> vowels = ["a","e","i","o","u"];
     list: vowels = ['a', 'e', 'i', 'o', 'u']
>> tokens in vowels;
     s-op: out
 	 Example: out("hey") = [F, T, F] (bools)

Draw the computation flow for an s-op you have created, on an input of your choice (this will create a pdf in the comp_flows subfolder of the current directory):

>> draw(my_frac,"abcdeeee");
	 =  [0.5]*8 (floats)

Or simply on the base example:

>> draw(my_frac);
	 =  [0.333]*3 (floats)

If they bother you, turn the examples off, and bring them back when you need them:

>> examples off
>> indices;
     s-op: indices
>> full_s;
     selector: full_s
>> examples on
>> indices;
     s-op: indices
 	 Example: indices("hey") = [0, 1, 2] (ints)

You can also do this selectively, turning only selector or s-op examples on and off, e.g.: selector examples off.
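
For example (a sketch; the command selector examples on is assumed to mirror examples on):

>> selector examples off
>> full_s;
     selector: full_s
>> indices;
     s-op: indices
 	 Example: indices("hey") = [0, 1, 2] (ints)
>> selector examples on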

Create a selector that focuses each position on all positions containing the same token (including itself). But first, set the base example to "hello" for a better idea of what's happening:

>> set example "hello"
>> same_token=select(tokens,tokens,==);
     selector: same_token
 	 Example:
 			     h e l l o
 			 h | 1        
 			 e |   1      
 			 l |     1 1  
 			 l |     1 1  
 			 o |         1

Then, use selector_width to compute, for each position, how many positions the selector same_token focuses it on. This effectively computes an in-place histogram over the input:

>> histogram=selector_width(same_token);
     s-op: histogram
 	 Example: histogram("hello") = [1, 1, 2, 2, 1] (ints)

For more complicated examples, check out paper_examples.rasp!
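
One of the examples from the paper is reverse, which flips its input: a selector matches each position with its mirror position length-indices-1, and aggregate then reads the tokens through it. A sketch of how this might look in the REPL (exact printouts may differ):

>> flip = select(indices, length-indices-1, ==);
     selector: flip
 	 Example:
 			     h e l l o
 			 h |         1
 			 e |       1  
 			 l |     1    
 			 l |   1      
 			 o | 1        
>> reverse = aggregate(flip, tokens);
     s-op: reverse
 	 Example: reverse("hello") = [o, l, l, e, h] (strings)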

Experiments on Transformers

The transformers in the paper were trained, and their attention heatmaps visualised, using the code in this repository: https://github.com/tech-srl/RASP-exps
