Analyses of the individual electric field magnitudes with ROAST.

Overview

Aloi Davide - PhD Student (UoB)

Analysis of electric field magnitudes (WP2a dataset only at the moment) and correlation analysis with Dynamic Causal Modelling (DCM) results.

The goal of these analyses is to establish whether there is a relationship between single-subject electric field (E-field) magnitudes generated with the ROAST pipeline (Huang et al., 2019) and changes in effective connectivity within the motor network, derived using DCM and parametric empirical Bayes (PEB).

The two analyses are:

  1. Correlation analysis between E-field magnitude (medians and max values; or current density?) in the motor cortex (M1) and thalamus (Th) and the self-/between-region connectivities (M1 and Th only?) derived from the DCM, e.g. Indahlastari et al. (2021). At the moment I am correlating E-field measures only with DCM measures derived from the contrast pre vs post Day-1 anodal. However, I should also correlate these E-field measures with DCM measures derived from the contrast pre vs post Day-1 sham. I expect to find correlations between E-field measures and DCM measures for the anodal condition but not for sham.
  2. Pattern-recognition analysis using a support vector machine (SVM) on MRI-derived tDCS current models to classify tDCS treatment response (as reflected by increased M1->Th or Th->M1 connectivity, or whatever other measure we decide), e.g. Albizu et al. (2020). The question here is: can we classify people who showed an increase in thalamo-cortical connectivity using features from the MRI-derived current models? A minimal sketch follows this list.
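As a rough illustration of analysis 2, the sketch below trains a linear SVM to separate "responders" from "non-responders" using E-field features. The feature matrix, labels, file name and the leave-one-out choice are placeholders, not decisions we have made (Albizu et al., 2020 use voxel-wise current-model features).

    % Hypothetical sketch (MATLAB): classify tDCS response from E-field features.
    % efield_features: n_subjects x n_features matrix (e.g. median/max E-field in M1 and Th).
    % responder_labels: n_subjects x 1 vector (e.g. 1 = increased Th->M1 connectivity).
    load('svm_features.mat', 'efield_features', 'responder_labels');   % placeholder file name

    % Linear SVM with leave-one-out cross-validation (small sample, n = 22).
    cv_svm = fitcsvm(efield_features, responder_labels, ...
        'KernelFunction', 'linear', 'Standardize', true, 'Leaveout', 'on');

    % Cross-validated classification accuracy.
    fprintf('Leave-one-out accuracy: %.2f\n', 1 - kfoldLoss(cv_svm));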

The two analyses require similar preprocessing steps. Here is the list of the steps I have done so far and the corresponding scripts.

WP2a: I start from a dataset containing 22 folders (one per participant), each containing a T1 and a T2 scan (except for subject 16, who has only a T1).

  1. Renaming of anatomical scans: this renames the anatomical scans of each participant (e.g. sub-01_T1.nii, sub-01_T2.nii).
  2. ROAST simulations: this script runs the ROAST simulations. In brief, ROAST segments the tissues using SPM routines and outputs the following scans for each subject: voltage ("subjName_simulationTag_v.nii", in mV), E-field ("subjName_simulationTag_e.nii", in V/m) and E-field magnitude ("subjName_simulationTag_emag.nii", in V/m). The settings used for the simulation are: roast(t1, {'C3', 1.0, 'Fp2', -1.0}, 'T2', t2, 'electype', 'pad', 'elecsize', [50 50 3], 'capType', '1020').
  3. Post-ROAST preprocessing: ROAST outputs are in the ROAST model space. This script moves the results back to the MRI space, then coregisters and normalises the electric-field maps generated by ROAST. The script also normalises the T1 scan and all the masks.
  4. Extraction of Ep values from the PEB results (Day-1 only): starting from this .mat structure containing 66 PEBs (one per participant / polarity), this script extracts the Ep values for each participant. The resulting file contains 66 matrices (participant 1 anodal, cathodal and sham; participant 2 ...; and so on up to participant 22).
  5. Estimation of the posterior probability associated with each PEB extracted above: the script runs Bayesian model averaging for each PEB using the DCM function spm_dcm_peb_bmc. Results are saved in this .mat structure and used later in the analyses to exclude connections with a posterior probability lower than 75% (a sketch is given after this list).
  6. WP2a E-field magnitude measures estimation and correlation analysis (a sketch of the core operations is given after this list). Steps:
    1. Load MNI template and M1/Th ROIs.
    2. Load the .mat structure with Ep values and the .mat structure with Pp values (N.B. Pp values are no longer used);
    3. For each subject:
      1. Load the normalised scan containing the E-field magnitude (wsub-T1_emag.nii) and the normalised grey matter, white matter and CSF maps (wc1-2-3sub*.nii).
      2. Save the DCM values for the connections M1-M1, Th-Th, M1->Th and Th->M1;
      3. Smooth the E-field magnitude map with a 4 mm FWHM Gaussian kernel;
      4. Mask the E-field magnitude map with the MNI template to exclude values outside the brain (redundant if I then mask with the CSF/WM/GM maps or with the M1/Th ROIs);
      5. Mask E-field magnitude map with M1 and Th ROIs and estimate means, medians and max electric-field values within the two ROIs;
      6. Save electric-field magnitude derived measures;
      7. Plot smoothed E-field magnitude map;
      8. Run 16 correlations: 4 DCM measures by 4 E-field measures (medians and max values in M1 and Th).
      9. Plot correlations.
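A minimal sketch of step 5, assuming the 66 PEBs are stored in a cell array called PEBs (the actual layout of the .mat structure may differ). spm_dcm_peb_bmc returns a Bayesian model average whose Pp field is used here to flag connections below the 75% threshold.

    % Sketch: Bayesian model averaging per PEB and Pp thresholding (assumed data layout).
    load('PEBs_day1.mat', 'PEBs');            % placeholder file name
    Pp_thresh = 0.75;                         % posterior-probability threshold

    for i = 1:numel(PEBs)
        BMA = spm_dcm_peb_bmc(PEBs{i});       % Bayesian model reduction / averaging
        Ep  = full(BMA.Ep);                   % posterior means of the connectivity changes
        Pp  = full(BMA.Pp);                   % posterior probabilities (parameter ~= 0)
        Ep(Pp < Pp_thresh) = NaN;             % drop connections with weak evidence
        bma_results(i).Ep = Ep;
        bma_results(i).Pp = Pp;
    end
    save('PEB_BMA_results.mat', 'bma_results');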
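The following sketch illustrates step 6 for a single subject. The ROI file names (M1_roi.nii, Thalamus_roi.nii) are assumptions, and all images are expected to be in the same normalised (MNI) space.

    % Sketch (MATLAB/SPM12): ROI measures from the normalised E-field magnitude map.
    spm_smooth('wsub-01_T1_emag.nii', 'swsub-01_T1_emag.nii', [4 4 4]);    % 4 mm FWHM Gaussian

    emag = spm_read_vols(spm_vol('swsub-01_T1_emag.nii'));                 % smoothed E-field magnitude (V/m)
    m1   = spm_read_vols(spm_vol('M1_roi.nii'))       > 0.5;               % binarised M1 ROI (assumed file name)
    th   = spm_read_vols(spm_vol('Thalamus_roi.nii')) > 0.5;               % binarised thalamus ROI (assumed file name)

    % ROI summary measures used in the correlation analysis.
    efield.m1_median = median(emag(m1), 'omitnan');
    efield.m1_max    = max(emag(m1), [], 'omitnan');
    efield.th_median = median(emag(th), 'omitnan');
    efield.th_max    = max(emag(th), [], 'omitnan');

    % Across subjects, each E-field measure is then correlated with each DCM measure, e.g.:
    % [r, p] = corr(m1_medians(:), th_to_m1_Ep(:), 'type', 'Spearman');

The Spearman choice in the commented line is only an example; the actual correlation type (Pearson vs Spearman) is still to be decided.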

Questions:

  1. Electric field magnitudes or current densities?
  2. If we use current densities, how should we deal with probabilistic tissue masks?
  3. Should I threshold WM masks and apply binary erosion to remove the overlap between WM and GM?
  4. How should I deal with Ep values whose corresponding Pp is lower than our threshold (75%)?
  5. Should I mask out CSF tissue? Should I use a binary map containing only WM and GM?
  6. Hypotheses? Ideas?

Plots: Sticky note mind map

References:

  1. Huang, Y., Datta, A., Bikson, M., & Parra, L. C. (2019). Realistic volumetric-approach to simulate transcranial electric stimulation—ROAST—a fully automated open-source pipeline. Journal of Neural Engineering, 16(5), 056006. https://doi.org/10.1088/1741-2552/ab208d
  2. Indahlastari, A., Albizu, A., Kraft, J. N., O’Shea, A., Nissim, N. R., Dunn, A. L., Carballo, D., Gordon, M. P., Taank, S., Kahn, A. T., Hernandez, C., Zucker, W. M., & Woods, A. J. (2021). Individualized tDCS modeling predicts functional connectivity changes within the working memory network in older adults. Brain Stimulation, 14(5), 1205–1215. https://doi.org/10.1016/j.brs.2021.08.003
  3. Albizu, A., Fang, R., Indahlastari, A., O’Shea, A., Stolte, S. E., See, K. B., Boutzoukas, E. M., Kraft, J. N., Nissim, N. R., & Woods, A. J. (2020). Machine learning and individual variability in electric field characteristics predict tDCS treatment response. Brain Stimulation, 13(6), 1753–1764. https://doi.org/10.1016/j.brs.2020.10.001