Reproduces the results of the paper "Finite Basis Physics-Informed Neural Networks (FBPINNs): a scalable domain decomposition approach for solving differential equations".

Overview

Finite basis physics-informed neural networks (FBPINNs)


This repository reproduces the results of the paper Finite Basis Physics-Informed Neural Networks (FBPINNs): a scalable domain decomposition approach for solving differential equations, B. Moseley, T. Nissen-Meyer and A. Markham, July 2021, arXiv.


Key contributions

  • Physics-informed neural networks (PINNs) offer a powerful new paradigm for solving problems relating to differential equations
  • However, a key limitation is that PINNs struggle to scale to problems with large domains and/or multi-scale solutions
  • We present finite basis physics-informed neural networks (FBPINNs), which are able to scale to these problems
  • To do so, FBPINNs use a combination of domain decomposition, subdomain normalisation and flexible training schedules
  • FBPINNs outperform PINNs in terms of accuracy and computational resources required

Workflow

FBPINNs divide the problem domain into many small, overlapping subdomains. A neural network is placed within each subdomain, such that in the centre of the subdomain the network learns the full solution, whilst in the overlapping regions the solution is defined as the sum over all overlapping networks.

We use smooth, differentiable window functions to locally confine each network to its subdomain, and the inputs of each network are individually normalised over the subdomain.

In comparison to existing domain decomposition techniques, FBPINNs do not require additional interface terms in their loss function, and they ensure the solution is continuous across subdomain interfaces by the construction of their solution ansatz.
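
This construction can be written down in a few lines. The sketch below is a minimal 1D illustration only, assuming a simple sigmoid-based window function and a generic list of subdomain networks; it is not the implementation used in this repository, but it shows how the global solution is assembled as a windowed sum of individually normalised subdomain networks:

import torch

def window(x, xmin, xmax, tol=0.2):
    # smooth, differentiable window: ~1 in the centre of [xmin, xmax], decaying to ~0 at its edges
    return torch.sigmoid((x - xmin)/tol)*torch.sigmoid((xmax - x)/tol)

def fbpinn_solution(x, networks, subdomains, mu=0.0, sd=1.0):
    # x:          (N, 1) tensor of global input points
    # networks:   one small neural network per (overlapping) subdomain
    # subdomains: list of (xmin, xmax) tuples defining the overlapping subdomains
    # mu, sd:     output unnormalisation constants (cf. the Y_N argument below)
    u = torch.zeros_like(x)
    for net, (xmin, xmax) in zip(networks, subdomains):
        xn = 2*(x - xmin)/(xmax - xmin) - 1              # normalise inputs over the subdomain
        u = u + window(x, xmin, xmax)*(mu + sd*net(xn))  # windowed, unnormalised network output
    return u                                             # sum over all overlapping networks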

Installation

The FBPINNs code is written entirely in Python and only requires a few Python libraries to run.

We recommend setting up a new environment, for example:

conda create -n fbpinns python=3  # Use conda package manager
conda activate fbpinns

and then installing the following libraries:

conda install scipy matplotlib jupyter
conda install pytorch torchvision torchaudio cudatoolkit=10.2 -c pytorch
pip install tensorboardX

All of our work was completed using PyTorch version 1.8.1 with CUDA 10.2.
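
If you want to quickly check that PyTorch is installed and can see your GPU before training, you can run:

import torch
print(torch.__version__)          # e.g. 1.8.1
print(torch.cuda.is_available())  # True if a CUDA-enabled GPU is visible to PyTorch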

Finally, download the source code:

git clone https://github.com/benmoseley/FBPINNs.git

Getting started

The workflow to train and compare FBPINNs and PINNs is very simple to set up, and consists of three steps:

  1. Initialise a problems.Problem class, which defines the differential equation (and boundary condition) you want to solve
  2. Initialise a constants.Constants object, which defines all of the other training hyperparameters (domain, number of subdomains, training schedule, etc.)
  3. Pass this Constants object to the main.FBPINNTrainer or main.PINNTrainer class and call the .train() method to start training.

For example, to solve the problem du/dx = cos(wx), you can use the following code to train a FBPINN and, for comparison, a PINN:

import numpy as np

# the following modules live in the fbpinns/ directory of this repository; run this code
# from within that directory (or add it to your PYTHONPATH) so they can be imported
import problems, constants, active_schedulers, main

P = problems.Cos1D_1(w=1, A=0)  # initialise problem class

c1 = constants.Constants(
            RUN="FBPINN_%s"%(P.name),  # run name
            P=P,  # problem class
            SUBDOMAIN_XS=[np.linspace(-2*np.pi,2*np.pi,5)],  # defines subdomains
            SUBDOMAIN_WS=[2*np.ones(5)],  # defines width of overlapping regions between subdomains
            BOUNDARY_N=(1/P.w,),  # optional arguments passed to the constraining operator
            Y_N=(0,1/P.w,),  # defines unnormalisation
            ACTIVE_SCHEDULER=active_schedulers.AllActiveSchedulerND,  # training scheduler
            ACTIVE_SCHEDULER_ARGS=(),  # training scheduler arguments
            N_HIDDEN=16,  # number of hidden units in subdomain network
            N_LAYERS=2,  # number of hidden layers in subdomain network
            BATCH_SIZE=(200,),  # number of training points
            N_STEPS=5000,  # number of training steps
            BATCH_SIZE_TEST=(400,),  # number of testing points
            )

run = main.FBPINNTrainer(c1)  # train FBPINN
run.train()

c2 = constants.Constants(
            RUN="PINN_%s"%(P.name),
            P=P,
            SUBDOMAIN_XS=[np.linspace(-2*np.pi,2*np.pi,5)],
            BOUNDARY_N=(1/P.w,),
            Y_N=(0,1/P.w,),
            N_HIDDEN=32,
            N_LAYERS=3,
            BATCH_SIZE=(200,),
            N_STEPS=5000,
            BATCH_SIZE_TEST=(400,),
            )

run = main.PINNTrainer(c2)  # train PINN
run.train()

The training code will automatically start outputting training statistics, plots and tensorboard summaries. The tensorboard summaries can be viewed by installing tensorboard (for example, pip install tensorboard) and then running the command tensorboard --logdir fbpinns/results/summaries/.

Defining your own problems.Problem class

To learn how to define and solve your own problem, see the Defining your own problem Jupyter notebook included in this repository.
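
As a rough sketch of the general pattern (the notebook and fbpinns/problems.py define the actual interface; the class and method names below are hypothetical and for illustration only), a 1D problem bundles together the differential-equation residual, its parameters and an exact solution used for testing:

import torch

class MyCos1D:  # hypothetical sketch, not the actual problems.Problem interface

    # solves du/dx = cos(w*x) with u(0) = A, i.e. the same problem as problems.Cos1D_1

    def __init__(self, w=1, A=0):
        self.w, self.A = w, A
        self.name = "MyCos1D_%s"%(w,)

    def residual(self, x, u, dudx):
        # physics loss term: how far the network solution is from satisfying the equation
        return dudx - torch.cos(self.w*x)

    def exact_solution(self, x):
        # analytical solution, used only to measure the test error
        return self.A + torch.sin(self.w*x)/self.w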

Reproducing our results

The purpose of each folder is as follows:

  • fbpinns : contains the main code which defines and trains FBPINNs.
  • analytical_solutions : contains a copy of the BURGERS_SOLUTION code used to compute the exact solution to the Burgers equation problem.
  • seismic-cpml : contains a Python implementation of the SEISMIC_CPML FD library used to solve the wave equation problem.
  • shared_modules : contains generic Python helper functions and classes.

To reproduce the results in the paper, use the following steps:

  1. Run the scripts fbpinns/paper_main_1D.py, fbpinns/paper_main_2D.py, fbpinns/paper_main_3D.py. These train and save all of the FBPINNs and PINNs presented in the paper.
  2. Run the notebook fbpinns/Paper plots.ipynb. This generates all of the plots in the paper.
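
For example, assuming the scripts are run from inside the fbpinns/ directory (an assumption about the expected working directory; adjust the paths if you run them from elsewhere):

cd FBPINNs/fbpinns
python paper_main_1D.py
python paper_main_2D.py
python paper_main_3D.py
jupyter notebook "Paper plots.ipynb"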

Further questions?

Please raise a GitHub issue or feel free to contact us.
