Immortal tracker


Prerequisite

Our code is tested with Python 3.6.
To install the required libraries:

pip install -r requirements.txt

Waymo Open Dataset

Prepare dataset & off-the-shelf detections

Download the WOD perception dataset and organize it as follows:

#Waymo Dataset         
└── waymo
       ├── training (not required)  
       ├── validation   
       ├── testing 

To extract timestamp and ego-pose information from the .tfrecord files, run the following:

bash preparedata/waymo/waymo_preparedata.sh /<path to WOD>/waymo
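
For reference, the fields this script extracts live in the per-frame Frame proto of each .tfrecord segment. A minimal inspection sketch, assuming the waymo-open-dataset pip package and TensorFlow are installed (the segment path below is a placeholder):

import tensorflow as tf
from waymo_open_dataset import dataset_pb2

segment = "/<path to WOD>/waymo/validation/segment-xxxx.tfrecord"  # placeholder file name
for data in tf.data.TFRecordDataset(segment, compression_type=""):
    frame = dataset_pb2.Frame()
    frame.ParseFromString(bytearray(data.numpy()))
    print(frame.timestamp_micros)       # per-frame timestamp in microseconds
    print(list(frame.pose.transform))   # 4x4 ego pose (row-major), vehicle -> global frame
    break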

Run the following to convert the detection results into .npz files. The detection results should be in the official WOD submission format (.bin).
We recommend using CenterPoint (the two-frame model for tracking) detection results to reproduce our results. Please follow https://github.com/tianweiy/CenterPoint or email its authors for the CenterPoint detection results.

bash preparedata/waymo/waymo_convert_detection.sh <path to detection results>/detection_result.bin cp
#you can also use other detections:
#bash preparedata/waymo/waymo_convert_detection.sh <path to other detection .bin> <detection name>
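
If you want to sanity-check a detection .bin before converting it, the official WOD submission format is a serialized Objects message from the metrics protos. A minimal sketch, assuming the waymo-open-dataset package is installed (the path below is a placeholder):

from waymo_open_dataset.protos import metrics_pb2

objects = metrics_pb2.Objects()
with open("<path to detection results>/detection_result.bin", "rb") as f:
    objects.ParseFromString(f.read())

print(len(objects.objects), "detected boxes in total")
first = objects.objects[0]
print(first.context_name, first.frame_timestamp_micros, first.object.type, first.score)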

Inference

Use the following command to run inference on WOD. The validation set is used by default.

python main_waymo.py --name immortal --det_name cp --config_path configs/waymo_configs/immortal.yaml --process 8

Evaluation with the WOD official devkit:

Follow https://github.com/waymo-research/waymo-open-dataset to build the evaluation tools, then run the following for evaluation:

#Convert the tracking results into a .bin file
python evaluation/waymo/pred_bin.py --name immortal
#For evaluation
<path to waymo-open-dataset>/bazel-bin/waymo_open_dataset/metrics/tools/compute_tracking_metrics_main mot_results/waymo/validation/immortal/bin/pred.bin <path to ground truth>/validation_gt.bin

nuScenes Dataset

Prepare dataset & off-the-shelf detections

Download the nuScenes dataset and organize it as follows:

# For nuScenes Dataset         
└── NUSCENES_DATASET_ROOT
       ├── samples       
       ├── sweeps       
       ├── maps         
       ├── v1.0-trainval 
       ├── v1.0-test
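
A quick way to verify the layout is to load it with the nuScenes devkit. A minimal sketch, assuming the nuscenes-devkit package is installed (the dataroot below is a placeholder):

from nuscenes.nuscenes import NuScenes

# Point dataroot at the NUSCENES_DATASET_ROOT directory shown above
nusc = NuScenes(version="v1.0-trainval", dataroot="<path to nuScenes>/nuscenes", verbose=True)
print(len(nusc.scene), "scenes,", len(nusc.sample), "samples")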

To extract timestamp and ego-pose information, run the following:

bash preparedata/nuscenes/nu_preparedata.sh <path to nuScenes>/nuscenes

Run the following to convert the detection results into .npz files. The detection results should be in the official nuScenes submission format (.json).
We recommend using CenterPoint (the two-frame model for tracking) detection results to reproduce our results.

bash preparedata/nuscenes/nu_convert_detection.sh <path to detection results>/detection_result.json cp
#you can also use other detections:
#bash preparedata/nuscenes/nu_convert_detection.sh <path to other detection .json> <detection name>
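
To sanity-check the detection file before conversion, note that the official nuScenes submission JSON holds a meta block plus a results dict keyed by sample token. A minimal sketch (the path below is a placeholder):

import json

with open("<path to detection results>/detection_result.json") as f:
    detections = json.load(f)

print(detections["meta"])                     # sensor-modality flags of the submission
print(len(detections["results"]), "samples with detections")
token, boxes = next(iter(detections["results"].items()))
print(token, len(boxes), "boxes; first:", boxes[0]["detection_name"], boxes[0]["detection_score"])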

Inference

Use the following command to run inference on nuScenes. The validation set is used by default.

python main_nuscenes.py --name immortal --det_name cp --config_path configs/nu_configs/immortal.yaml --process 8

Evaluation with the nuScenes official devkit:

Follow https://github.com/nutonomy/nuscenes-devkit to build the official evaluation tools for nuScenes, then run the following for evaluation:

#To convert tracking results into .json format
bash evaluation/nuscenes/pipeline.sh immortal
#To evaluate
python <path to nuscenes-devkit>/nuscenes-devkit/python-sdk/nuscenes/eval/tracking/evaluate.py \
"./mot_results/nuscenes/validation_2hz/immortal/results/results.json" \
--output_dir "./mot_results/nuscenes/validation_2hz/immortal/results" \
--eval_set "val" \
--dataroot <path to nuScenes>/nuscenes