Vvim - Keyboardless Vim interactions


Vvim enables keyboardless Vim interaction via a hardware glove that the user wears. The glove detects the fingers' positions and translates them into key presses. It's currently a work in progress.

The glove prototype, with 4 sensors on two fingers

Subset of data

The streams of data from the 4 sensors (each shown in a different colour) have been zeroed so that they all centre on the time when the user pressed the 'y' key.

Current Features

  • Glove prototype has been constructed.

  • Glove can detect finger movements of the right forefinger and right middle finger (with space to expand to more fingers if these first two actually work)

    • This corresponds to the following keys, shown with how often each appears in the current dataset: h: 628, u: 291, y: 171, m: 171, b: 155, k: 120, j: 21
  • Glove records finger movements via an Arduino script vvim.ino on an Uno, and sends them to serial output.

  • Serial output is read by the python script glove_logger.py and saved to the file glove.log along with the Unix milliseconds since epoch.

  • A keylogger is installed on the developer's machine, and logs key presses to the file keys.log along with Unix milliseconds since epoch.

  • Running cleanup.sh cleans up the data from the keylogger and the serial output into one file named sorted.log.

  • A gradient-boosted tree has been trained and saved to model.pkl. Currently it has a test accuracy of 79.7% (a rough sketch of what such a training pipeline might look like is given after this list).

    • This will hopefully improve as more data is gathered: currently there are only 587 keypresses on which to train 9 categories, or about 65 examples per category, which is not enough.
  • The file eda.py saves plots to plots/, such as the graphs described in the Graphs section below.
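To give a feel for the shape of that pipeline, a training script could look roughly like the sketch below. This is not the repository's actual code: the sorted.log column layout, the window size around each keypress, and the train/test split are all assumptions made for illustration.

# Hypothetical sketch: train a gradient-boosted tree on windowed sensor data.
# The sorted.log column names and layout here are assumed, not the real schema.
import pickle
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

WINDOW = 20  # sensor readings kept on either side of each keypress (assumed)

log = pd.read_csv("sorted.log", names=["ms", "s0", "s1", "s2", "s3", "key"])
sensors = log[["s0", "s1", "s2", "s3"]].to_numpy()

X, y = [], []
for i in log.index[log["key"].notna()]:  # rows where a keypress was logged
    lo, hi = i - WINDOW, i + WINDOW
    if lo < 0 or hi >= len(log):
        continue
    X.append(sensors[lo:hi].ravel())  # flatten the window into one feature vector
    y.append(log.at[i, "key"])

X_train, X_test, y_train, y_test = train_test_split(
    np.array(X), np.array(y), test_size=0.2, random_state=0)

model = GradientBoostingClassifier().fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))

with open("model.pkl", "wb") as f:
    pickle.dump(model, f)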

Graphs

Each colour is a differently positioned sensor. Each line is one stream of data recorded by a sensor. The streams have each been zeroed so that every instance of pressing a certain key is centred.

Keys on the home row

Some keys are easier to spot than others: my fingers move a lot more when pressing a y than a k, simply because of where the keys are positioned on the keyboard.

More or less data

The data has not been normalised, so there is far more data for common keys like h than for a less common key like j.
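The zeroing described above could be done along the lines of the sketch below. Again, this is only an illustration, assuming the same made-up sorted.log layout as the training sketch earlier rather than the actual contents of eda.py.

# Hypothetical sketch of the zeroing: overlay every sensor trace, centred on
# each press of one key. Column names and window size are assumptions.
import matplotlib.pyplot as plt
import pandas as pd

WINDOW = 20  # readings kept on either side of each keypress (assumed)
KEY = "y"

log = pd.read_csv("sorted.log", names=["ms", "s0", "s1", "s2", "s3", "key"])

for i in log.index[log["key"] == KEY]:
    if i - WINDOW < 0 or i + WINDOW >= len(log):
        continue
    window = log.iloc[i - WINDOW:i + WINDOW]
    for sensor, colour in zip(["s0", "s1", "s2", "s3"], "rgbk"):
        # the x axis runs from -WINDOW to +WINDOW so every press lines up at 0
        plt.plot(range(-WINDOW, WINDOW), window[sensor], colour, alpha=0.3)

plt.xlabel(f"readings relative to each '{KEY}' press")
plt.savefig(f"plots/{KEY}.png")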

In Progress

  • Currently there are only about 600 keypresses recorded. Record more examples of typing and add more sensors to the fingers so that fewer keystrokes have to be typed in order to get the data.

To Do

  • If flex sensors aren't enough to predict exactly when a key is pressed, add force sensors to the fingertips.
  • Use an Arduino Nano instead of an Uno, and host the entire thing on the user's hand
  • Connect the glove to the computer via Bluetooth, instead of a wired connection
  • Current models don't have the option of categorising a sequence of sensor readings as not pressing any key at all. This should be fixed so the model isn't constantly assuming that at least one key is being pressed (see the sketch after this list)
    • This could be done easily with pressure sensors
  • Write some sort of visualiser to live track sensor data, actual key presses, and predicted key presses
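
One way the missing 'no key pressed' category could be added (as mentioned in the list above) is sketched below: windows that sit far from every recorded keypress get a rest label and can be fed to the classifier alongside the real keys. It reuses the made-up log layout from the earlier sketches and only illustrates the idea; it is not code from the repository.

# Hypothetical sketch: label sensor windows with no nearby keypress as "NONE",
# so a model trained on them can also predict that no key is pressed at all.
import pandas as pd

WINDOW = 20       # readings on either side of a window's centre (assumed)
GAP = 3 * WINDOW  # a centre this far from every keypress counts as rest

log = pd.read_csv("sorted.log", names=["ms", "s0", "s1", "s2", "s3", "key"])
press_rows = set(log.index[log["key"].notna()])

rest_centres = [
    i for i in range(WINDOW, len(log) - WINDOW)
    if all(abs(i - p) >= GAP for p in press_rows)
]
# Windows centred on rest_centres can now be added to the training set with
# the label "NONE", alongside the windows centred on real keypresses.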

Keys and which finger tends to press them

Note that this list is likely very specific to the author, as different people type differently. I think I probably use my right ring finger much more than I really should. Also, I type a y with my index finger for words like type or you (where I subsequently have to type another letter with my right hand), but I type it with my middle finger for words like yes, yank, or keyboard.

  • Right Hand
    • Thumb: space
    • Index: j, m, n, b, h, y
    • Middle: k, y, u, i, <, (, [
    • Ring: l, :, BACKSPACE, o, p, >, ), ], 0, _, -, +, =, ,, .
    • Pinky: ;, ENTER, /, ?
  • Left Hand (Incomplete as I've not yet built a glove for the left hand)
    • Pinky:
    • Ring:
    • Middle:
    • Index:
    • Thumb:

Here's a picture of my keyboard for reference:
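
If this mapping is ever needed programmatically (for example to label each logged keypress with the finger expected to press it), it could be encoded as a plain dictionary, as in the hypothetical sketch below; the names are made up and this is not part of the repository.

# Hypothetical encoding of the author's right-hand mapping from the list above.
FINGER_TO_KEYS = {
    "right_thumb":  [" "],
    "right_index":  ["j", "m", "n", "b", "h", "y"],
    "right_middle": ["k", "y", "u", "i", "<", "(", "["],
    "right_ring":   ["l", ":", "BACKSPACE", "o", "p", ">", ")", "]",
                     "0", "_", "-", "+", "=", ",", "."],
    "right_pinky":  [";", "ENTER", "/", "?"],
}

KEY_TO_FINGERS = {}
for finger, keys in FINGER_TO_KEYS.items():
    for key in keys:
        # note that y appears twice: it is typed with both index and middle finger
        KEY_TO_FINGERS.setdefault(key, []).append(finger)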

How to Start Recording Data

Probably best to do this all in tmux since handling multiple terminal windows is a pain otherwise. A keylogger (I use Casey Scarborough's keylogger) is also required.

  1. Install the requirements:
pip3 install -r requirements.txt
  2. Run the command to clear the logfile:
sudo keylogger clear
  3. Start the keylogger:
sudo keylogger ./keys.log
  4. Start recording glove movements (a minimal sketch of such a logger is given at the end of this section):
python3 glove_logger.py
  5. Put the glove on, and start typing things out. I usually do this by opening a text file (like Alice in Wonderland, available on Gutenberg) in vim (vim alice.txt), splitting the window vertically (:vsp), and then opening a temporary file to type into (:e tmp). Finally, type (:set cursorbind) into both frames so that the source text scrolls as you type it. The keystrokes and finger movements will be recorded separately.

  6. Remove the glove.

  7. Stop the keylogger with CTRL-C.

  8. Stop recording the finger movements with CTRL-C.

  9. Now that the data is recorded, clean it up:

./cleanup.sh
  10. Analyse the data with eda.py:
python3 eda.py

The images will be saved to plots/ for your viewing pleasure.
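
For context, a serial logger along the lines of glove_logger.py can be very small; the sketch below shows one way it might work, with the serial port, baud rate, and line format all being assumptions rather than the script's actual values.

# Hypothetical sketch of the glove logger: read sensor lines from the Arduino's
# serial output and append them to glove.log with Unix milliseconds since epoch.
import time
import serial  # pyserial

PORT = "/dev/ttyACM0"  # typical port for an Uno on Linux (assumed)
BAUD = 9600            # must match the rate set in vvim.ino (assumed)

with serial.Serial(PORT, BAUD, timeout=1) as arduino, open("glove.log", "a") as log:
    while True:
        line = arduino.readline().decode(errors="ignore").strip()
        if not line:
            continue  # read timed out with no data
        millis = int(time.time() * 1000)
        log.write(f"{millis} {line}\n")
        log.flush()  # so the log survives being stopped with CTRL-C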

License

This work is licensed under GNU GPLv3. See the attached LICENSE. See https://choosealicense.com/licenses/gpl-3.0/# for a non-legalese explanation of the license.
