🧮 KSE 801 Final Report with Code

This is the final report with code for KAIST course KSE 801.

Authors: Chuanbo Hua, Federico Berto.

💡 Introduction to OSC

Orthogonal collocation is a method for the numerical solution of partial differential equations. It uses collocation at the zeros of orthogonal polynomials to transform the partial differential equation (PDE) into a set of ordinary differential equations (ODEs), which can then be solved by any standard method. It has been shown that it is usually advantageous to choose the collocation points as the zeros of the corresponding Jacobi polynomial, independent of the PDE system [1].
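
As a minimal sketch of this idea (an illustration only, not the repository's code), the interior collocation points can be taken as the zeros of an orthogonal polynomial. Here we use Gauss-Legendre nodes (the Jacobi case with alpha = beta = 0) from plain numpy, mapped to the domain [0, 1]:

```python
# Minimal sketch (illustration only, not the repository's code): pick interior
# collocation points as zeros of an orthogonal polynomial. Gauss-Legendre nodes
# are the Jacobi case with alpha = beta = 0.
import numpy as np

n = 5                                          # number of interior collocation points
nodes, _ = np.polynomial.legendre.leggauss(n)  # zeros of the degree-n Legendre polynomial on [-1, 1]
collocation_points = 0.5 * (nodes + 1.0)       # map to the domain [0, 1]
print(collocation_points)
```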

The orthogonal collocation method became popular in the 1970s and was mainly developed by B. A. Finlayson [2]. It is a powerful collocation tool for solving both partial and ordinary differential equations.

The orthogonal collocation method also works for problems with more than one variable, but here we restrict ourselves to one-variable cases, since they are simpler to understand and the most widely used.

💡 Introduction to the GNN

You can find more details in the Jupyter notebooks inside the gnn-notebook folder, which cover dataset initialization, model training, and testing.

Reminder: the datasets are generated with a separate dataset-generator repository. Please refer to https://github.com/DiffEqML/pde-dataset-generator.
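
As a rough orientation (a generic sketch using the dgl and torch packages listed in the requirements, not the actual model from the notebooks), a graph and a single graph-convolution layer can be set up like this:

```python
# Generic sketch (not the repository's model): build a toy graph with dgl and
# apply one GraphConv layer from dgl.nn, as a starting point for the notebooks.
import dgl
import torch
from dgl.nn import GraphConv

# A toy 3-node cycle graph with 4-dimensional node features.
g = dgl.graph((torch.tensor([0, 1, 2]), torch.tensor([1, 2, 0])))
g = dgl.add_self_loop(g)                    # avoid zero-in-degree issues
feat = torch.randn(3, 4)

conv = GraphConv(in_feats=4, out_feats=2)   # one message-passing layer
h = conv(g, feat)                           # updated node embeddings, shape (3, 2)
print(h.shape)
```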

🏷 Features

  • Tutorials. We provide several examples, including linear and nonlinear problems, to help you understand how to use the method and how well it performs.
  • Algorithm Explanation. We provide a document that explains in detail, by example, how this algorithm works, which we think makes it easier to grasp. For more detail, please refer to the Algorithm section.

⚙️ Requirements

Python version: 3.6 or later
Python packages: numpy, matplotlib, jupyter-notebook/jupyter-lab, dgl, torch

🔧 Structure

  • src: source code of the OSC algorithm.
  • fig: algorithm output figures for the README.
  • osc-notebook: tutorial Jupyter notebooks for our OSC method.
  • gnn-notebook: tutorial Jupyter notebooks for the graph neural network.
  • script: training and testing scripts for the graph neural network.

🔦 How to use

Step 1. Download or clone this repository.

Step 2. Refer to osc-notebook/example.ipynb; it introduces in detail, through examples, how to use this model. The main steps are listed below (a plain-numpy sketch of the whole workflow follows the list):

  1. collocation1d(): generate the collocation points.
  2. generator1d(): generate the algebraic equations from the PDEs to be solved.
  3. numpy.linalg.solve(): solve the algebraic equations to obtain the polynomial result.
  4. polynomial1d(): generate simulated values to check the loss.
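
Below is a plain-numpy sketch of these four steps on a toy problem (y''(x) = 6x on [0, 1] with y(0) = 0, y(1) = 1, exact solution y = x³). It illustrates the workflow only; the repository's collocation1d(), generator1d(), and polynomial1d() helpers are replaced by inline code here because their exact signatures are not reproduced in this README.

```python
# Illustrative sketch of the 4-step workflow using plain numpy (the repository's
# collocation1d / generator1d / polynomial1d helpers are not called here).
# Toy problem: y''(x) = 6x on [0, 1], y(0) = 0, y(1) = 1, exact solution y = x^3.
import numpy as np

# Step 1: collocation points -- zeros of the degree-2 Legendre polynomial mapped to (0, 1).
x_col = 0.5 + 0.5 * np.array([-1.0, 1.0]) / np.sqrt(3.0)

# Step 2: build the algebraic equations A c = b for a cubic trial polynomial
#         y(x) = c0 + c1*x + c2*x^2 + c3*x^3.
A = np.zeros((4, 4))
b = np.zeros(4)
A[0] = [1, 0, 0, 0]; b[0] = 0.0                  # boundary condition y(0) = 0
A[1] = [1, 1, 1, 1]; b[1] = 1.0                  # boundary condition y(1) = 1
for i, x in enumerate(x_col):
    A[2 + i] = [0, 0, 2, 6 * x]                  # y''(x) = 2*c2 + 6*c3*x
    b[2 + i] = 6 * x                             # right-hand side of the ODE

# Step 3: solve the algebraic equations for the polynomial coefficients.
c = np.linalg.solve(A, b)                        # expect c close to [0, 0, 0, 1]

# Step 4: evaluate the polynomial and check the loss against the exact solution.
xs = np.linspace(0, 1, 101)
y_num = np.polyval(c[::-1], xs)                  # np.polyval wants highest degree first
loss = np.max(np.abs(y_num - xs**3))
print("c =", c, " max abs error =", loss)
```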

Step 3. Refer to the notebooks under gnn-notebook to get an idea of how the graph models are trained.

📈 Examples

  • One variable, linear, 3rd order: loss < 1e-4
  • One variable, linear, 4th order: loss = 2.2586
  • One variable, nonlinear: loss = 0.0447
  • 2D PDE simulation
  • Dam-breaking simulation

📜 Algorithm

Here we briefly introduce, by example, how 1D OSC works. For the original PDF, please refer to Introduction.pdf in this repository.
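
As a brief sketch of the formulation (generic notation, not necessarily that of Introduction.pdf): the solution is approximated by a trial polynomial, and the residual of the differential equation is forced to vanish at the collocation points.

```latex
% Generic 1D collocation formulation (sketch; notation may differ from Introduction.pdf).
y_N(x) = \sum_{j=0}^{N} c_j \, x^j
\qquad \text{(trial polynomial with unknown coefficients } c_j\text{)}

R\!\left(y_N\right)(x_i) = 0, \quad i = 1, \dots, N-1
\qquad \text{(residual of the ODE/PDE vanishes at the collocation points)}

y_N(a) = y_a, \qquad y_N(b) = y_b
\qquad \text{(boundary conditions)}
```

Together these give N+1 algebraic equations for the N+1 coefficients, which is the system solved in step 3 of the workflow above.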

📚 References

[1] "Orthogonal collocation." Wikipedia, 30 January 2018. https://en.wikipedia.org/wiki/Orthogonal_collocation.

[2] Carey, G. F., and B. A. Finlayson. "Orthogonal collocation on finite elements." Chemical Engineering Science 30.5-6 (1975): 587-596.
