
[CVPR 2022] Eigenlanes: Data-Driven Lane Descriptors for Structurally Diverse Lanes

Dongkwon Jin, Wonhui Park, Seong-Gyun Jeong, Heeyeon Kwon, and Chang-Su Kim

[Overview figure]

Official implementation for "Eigenlanes: Data-Driven Lane Descriptors for Structurally Diverse Lanes" [paper] [supp] [video].

We construct a new dataset called "SDLane". SDLane is available here. Currently, only the test set is provided due to privacy issues. The full dataset will be released soon.

Video

[Demo video]

Related work

We will also present another paper, "Eigencontours: Novel Contour Descriptors Based on Low-Rank Approximation", accepted to CVPR 2022 (oral) [github] [video].

Requirements

  • PyTorch >= 1.6
  • CUDA >= 10.0
  • CuDNN >= 7.6.5
  • python >= 3.6

Installation

  1. Download the repository. We refer to this directory as ROOT:
$ git clone https://github.com/dongkwonjin/Eigenlanes.git
  2. Download the pre-trained model parameters and preprocessed data into ROOT:
$ cd ROOT
$ unzip pretrained.zip
$ unzip preprocessed.zip
  3. Create a conda environment:
$ conda create -n eigenlanes python=3.7 anaconda
$ conda activate eigenlanes
  4. Install dependencies:
$ conda install pytorch torchvision torchaudio cudatoolkit=10.2 -c pytorch
$ pip install -r requirements.txt
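
To confirm that the environment is set up correctly, you can optionally run a quick sanity check. This snippet is not part of the repository; it simply prints the installed PyTorch version and whether CUDA is visible:

# Optional sanity check (not included in the repository).
import torch

print(torch.__version__)          # should be >= 1.6
print(torch.cuda.is_available())  # should be True on a CUDA-capable machine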

Directory structure

.                           # ROOT
├── Preprocessing           # directory for data preprocessing
│   ├── culane              # dataset name (culane, tusimple)
│   │   ├── P00             # preprocessing step 1
│   │   │   └── code
│   │   ├── P01             # preprocessing step 2
│   │   │   └── code
│   │   └── ...
│   └── ...                 # etc.
├── Modeling                # directory for modeling
│   ├── culane              # dataset name (culane, tusimple)
│   │   └── code
│   ├── tusimple
│   │   └── code
│   └── ...                 # etc.
├── pretrained              # pretrained model parameters
│   ├── culane
│   ├── tusimple
│   └── ...                 # etc.
├── preprocessed            # preprocessed data
│   ├── culane              # dataset name (culane, tusimple)
│   │   ├── P03
│   │   │   └── output
│   │   ├── P04
│   │   │   └── output
│   │   └── ...
│   └── ...
.

Evaluation (for CULane)

To test on CULane, you need to install the official CULane evaluation tools. The official metric implementation is available here. Please download the tools into ROOT/Modeling/culane/code/evaluation/culane/. The tools require OpenCV C++; please follow here to install it. Then, compile the evaluation tools as shown below. We also recommend reading the installation guideline.

$ cd ROOT/Modeling/culane/code/evaluation/culane/
$ make

Train

  1. Set the dataset you want to train on (DATASET_NAME).
  2. Pass your dataset path to the --dataset_dir argument.
  3. Edit config.py if you want to control the training process in detail:
$ cd ROOT/Modeling/DATASET_NAME/code/
$ python main.py --run_mode train --pre_dir ROOT/preprocessed/DATASET_NAME/ --dataset_dir /where/is/your/dataset/path/ 

Test

  1. Set the dataset you want to test on (DATASET_NAME).
  2. Pass your dataset path to the --dataset_dir argument.
  3. To reproduce the results reported in the paper:
$ cd ROOT/Modeling/DATASET_NAME/code/
$ python main.py --run_mode test_paper --pre_dir ROOT/preprocessed/DATASET_NAME/ --paper_weight_dir ROOT/pretrained/DATASET_NAME/ --dataset_dir /where/is/your/dataset/path/
  4. To evaluate a model you trained yourself:
$ cd ROOT/Modeling/DATASET_NAME/code/
$ python main.py --run_mode test --pre_dir ROOT/preprocessed/DATASET_NAME/ --dataset_dir /where/is/your/dataset/path/

Preprocessing

[Preprocessing example figure]

Data preprocessing is divided into five steps: P00, P01, P02, P03, and P04. Each step is described below.

  1. In P00, the ground-truth lanes in a dataset are converted to pickle format.
  2. In P01, each lane in the training set is represented by 2D points sampled uniformly in the vertical direction.
  3. In P02, a lane matrix is constructed and SVD is performed. Each lane is then transformed into its coefficient vector (a minimal illustration of P02 and P03 follows this list).
  4. In P03, clustering is performed to obtain lane candidates.
  5. In P04, training labels are generated to train the SI module of the proposed SIIC-Net.
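
The sketch below is only an illustration of the ideas behind P02 and P03, not the repository's actual preprocessing code. The array lanes_xy, the rank, and the number of clusters are hypothetical placeholders. Each lane is described by the x-coordinates sampled at fixed image rows, the lane matrix is approximated by a low-rank SVD to obtain per-lane coefficient vectors, and those coefficients are clustered to form lane candidates.

# Illustrative sketch of P02-P03 (not the repository's code).
import numpy as np
from sklearn.cluster import KMeans

# lanes_xy: hypothetical array of shape (num_lanes, num_points), where each row holds
# the x-coordinates of one lane sampled uniformly along the vertical direction (cf. P01).
lanes_xy = np.random.rand(1000, 50).astype(np.float32)

# P02: build the lane matrix (lanes as columns) and perform SVD.
A = lanes_xy.T                                   # (num_points, num_lanes)
U, S, Vt = np.linalg.svd(A, full_matrices=False)

rank = 4                                         # number of eigenlanes to keep (placeholder)
U_m = U[:, :rank]                                # eigenlane basis vectors
coeffs = U_m.T @ A                               # (rank, num_lanes) coefficient vectors

# P03: cluster the coefficient vectors to obtain lane candidates.
num_candidates = 100                             # placeholder
kmeans = KMeans(n_clusters=num_candidates, n_init=10).fit(coeffs.T)
candidate_coeffs = kmeans.cluster_centers_       # (num_candidates, rank)

# Candidates can be mapped back to image space through the eigenlane basis.
candidate_lanes = (U_m @ candidate_coeffs.T).T   # (num_candidates, num_points)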

If you want to generate the preprocessed data yourself, please run the preprocessing codes in order. Alternatively, you can download the preprocessed data.

$ cd ROOT/Preprocessing/DATASET_NAME/PXX_each_preprocessing_step/code/
$ python main.py --dataset_dir /where/is/your/dataset/path/

Reference

@InProceedings{Jin2022eigenlanes,
    title={Eigenlanes: Data-Driven Lane Descriptors for Structurally Diverse Lanes},
    author={Jin, Dongkwon and Park, Wonhui and Jeong, Seong-Gyun and Kwon, Heeyeon and Kim, Chang-Su},
    booktitle={CVPR},
    year={2022}
}