A FAIR dataset of TCV experimental results for validating edge/divertor turbulence models.

Overview

TCV-X21 validation for divertor turbulence simulations

Quick links

  • arXiv PDF
  • Binder DOI
  • Dataset licence
  • Software licence
  • Test Python package
  • codecov

Intro

Welcome to TCV-X21. We're glad you've found us!

This repository is designed to let you perform the analysis presented in Oliveira and Body et al., Nuclear Fusion, 2021, both using the data given in the paper and with a turbulence simulation of your own. We hope that, by providing the analysis, the TCV-X21 case can serve as a standard validation and benchmarking case for turbulence simulations of the divertor in fusion experiments. The repository allows you to scrutinise and suggest improvements to the analysis (there's always room for improvement), to directly interact with and explore the data in greater depth than is possible in a paper, and — we hope — to use this case to test a simulation of your own.

To use this repository, you'll need either to use the mybinder.org link below OR to have user rights on a computer with Python-3, conda and git-lfs pre-installed.

Video tutorial

This quick tutorial shows you how to navigate the repository and use some of the functionality of the library.

Video_tutorial.mp4

What can you find in this repository?

  • 1.experimental_data: data from the TCV experimental campaign, in NetCDF, MATLAB and IMAS formats, as well as information about the reference scenario, and the reference magnetic geometry (in .eqdsk, IMAS and PARALLAX-nc formats)
  • 2.simulation_data: data from simulations of the TCV-X21 case, in NetCDF format, as well as raw data files and conversion routines
  • 3.results: high-resolution PNGs and LaTeX-ready tables for a paper
  • tcvx21: a Python library of software, which includes
    • record_c: a class to interface with NetCDF/HDF5 formatted data files
    • observable_c: a class to interact with and plot observables
    • file_io: tools to interact with MATLAB and JSON files
    • quant_validation: routines to perform the quantitative validation
    • analysis: statistics, curve-fitting, bootstrap algorithms, contour finding
    • units_m.py: setting up pint-based unit-aware analysis (it's difficult to overstate how cool this library is); see the short pint sketch after this list
    • grillix_post: a set of routines used for post-processing GRILLIX simulation data, which might help if you're trying to post-process your own simulation. You can see a worked example in simulation_postprocessing.ipynb
  • notebooks: Jupyter notebooks, which allow us to provide code with outputs and comments together
    • simulation_setup.ipynb: what you might need to set up a simulation to test
    • simulation_postprocessing.ipynb: how to post-process the data
    • data_exploration.ipynb: some examples to get you started exploring the data
    • bulk_process.ipynb: runs over every observable to make the results — which you'll need to do if you're writing a paper from the results
  • tests: tests to make sure that we haven't broken anything in the analysis routines
  • README.md: this file, which helps you get the software up and running and explains where you can find everything you need. It also provides the details of the licensing (below). There are more specific README.md files in several of the subfolders.

and lots more files. If you're not a developer, you can safely ignore these.
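
If you'd like a quick taste of the data and of the unit-aware analysis before opening the notebooks, here is a minimal sketch. It uses plain xarray and pint rather than the tcvx21 helpers, and the file name is hypothetical (substitute any NetCDF file from 1.experimental_data).

import pint
import xarray as xr

# Open one of the experimental NetCDF files and inspect its contents
# (hypothetical file name: use any file from 1.experimental_data)
dataset = xr.open_dataset("1.experimental_data/some_diagnostic.nc")
print(dataset)

# pint (which tcvx21/units_m.py sets up for the library) carries physical units
# through the analysis, so dimensional mistakes raise errors instead of
# silently giving wrong numbers
ureg = pint.UnitRegistry()
density = 1.5e19 * ureg("1/m^3")      # an illustrative edge density
temperature = 25.0 * ureg("eV")       # an illustrative edge temperature
pressure = (density * temperature).to("Pa")
print(f"Electron pressure: {pressure:.1f}")

For anything beyond a quick look, the data_exploration.ipynb notebook and the record_c class are the better starting points.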

What can't you find in this repository?

Due to licensing issues, the source code of the simulations is not provided. Sorry!

Also, the raw simulations are not provided here due to space limitations (some runs have more than a terabyte of data), but they are all backed up on archive servers. If you'd like to access the raw data, get in contact.

Licence and attribution notice

The TCV-X21 datasets are licensed under a Creative Commons Attribution 4.0 licence, given in LICENCE. The source code of the analysis routines and the Python library is licensed under an MIT licence, given in tcvx21/LICENCE.

For the datasets, we ask that you provide attribution when using this data, via the citation in the CITATION.cff file. We additionally require that you mark any changes to the dataset, and state specifically that the authors do not endorse your work unless such endorsement has been expressly given.

For the software, you may use, modify and share it without attribution or marking your changes.

Running the Jupyter notebooks (installation as non-root user)

To run the Jupyter notebooks, you have two options. The first is to use the mybinder.org interface, which lets you interact with the notebooks via a web interface. You can launch the binder for this repository by clicking the binder badge in the repository header. Note that not all of the repository content is copied to the Docker image (this is specified in .dockerignore). The large checkpoint files are not included in the image, although they can be found in the repository at 2.simulation_data/GRILLIX/checkpoints_for_1mm. Additionally, the default Docker image will not work with git.

Alternatively, if you'd like to run the notebooks locally or to extend the repository, you'll need to install additional Python packages. First of all, you need Python-3 and conda installed (latest versions recommended). Then, to install the necessary packages, we make a sandbox environment. This has a few advantages over installing packages globally — sudo rights are not required, you can install specific package versions without risking breaking other Python scripts, and if everything goes terribly wrong you can easily delete everything and restart. We've included a simple shell script to perform the necessary steps, which you can execute with

./install_env.sh

This will create a conda environment in a subfolder of the TCV-X21 repository called tcvx21_env and install the library into it. It will also add a kernel to your global Jupyter installation. To remove the environment, delete the tcvx21_env folder and run jupyter kernelspec uninstall tcvx21.

To run tests and open Jupyter

Once you've installed via either option, you can activate the Python environment with conda activate ./tcvx21_env. To deactivate, run conda deactivate.

Then it is recommended to run the test suite with pytest, which checks that everything is installed and working correctly. If something fails, let us know in the issues. Note that this executes all of the analysis notebooks, so it might take a while to run.

Finally, run jupyter lab to open a Jupyter server in the TCV-X21 repository. Then you can open any of the notebooks (.ipynb extension) by clicking on them in the side-bar.

A note on pinned dependencies

To ensure that the results are reproducible, the environment.yml file has pinned dependencies. However, if you want to use this software as a library, pinned dependencies are unnecessarily restrictive. You can remove the version pins (everything from the = sign onwards in each entry of environment.yml), but be warned that things might break.
