Dynamic Realtime Animation Control

Our project aims to build an application that dynamically detects the user's expressions and gestures and maps them onto animation software, which then renders a 2D/3D animation in real time for live broadcast. In its skeletal state, the project merges facial keypoint detection, keypoint meshing, and emotion and gesture tracking with animation. The final rendered animation can be projected or broadcast into any application that requires webcam access.
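A first step in a pipeline like this is converting pixel-space keypoints into a resolution-independent rig space before they drive the animation. The sketch below illustrates one way to do that; the function name and the [-1, 1] centroid-centered convention are illustrative assumptions, not code taken from this repository.

```python
from typing import List, Tuple

def normalize_landmarks(points: List[Tuple[float, float]],
                        width: int, height: int) -> List[Tuple[float, float]]:
    """Map pixel-space keypoints into a [-1, 1] rig-space frame centered
    on the landmark centroid, so the downstream animation rig does not
    depend on camera resolution or where the face sits in the frame."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    scale = max(width, height) / 2.0
    return [((x - cx) / scale, (y - cy) / scale) for x, y in points]
```

The same transform can be applied per frame to each set of detected landmarks before they are sent to the renderer.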

[Figure: 3-step pipeline]

MOTIVATION

To look presentable on any video call at any given time.
To protect your identity and maintain privacy on the internet.
To replace raw webcam video streaming with a lower-bandwidth animated feed.

OBJECTIVES

To build a pose-detection model using OpenCV and TensorFlow.
To build an emotion-detection model.
To animate the detected pose and emotion in real-time rendering software such as Blender or Three.js.
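A common building block for keypoint-based emotion detection is a simple geometric feature computed over facial landmarks, such as the ratio of mouth opening to mouth width. The helper below is an illustrative sketch under that assumption; the specific landmarks and threshold are hypothetical, not taken from this project.

```python
import math
from typing import Tuple

Point = Tuple[float, float]

def mouth_aspect_ratio(left: Point, right: Point,
                       top: Point, bottom: Point) -> float:
    """Ratio of vertical mouth opening to horizontal mouth width.
    Larger values indicate a more open mouth (e.g. surprise)."""
    width = math.dist(left, right)
    height = math.dist(top, bottom)
    return height / width if width else 0.0

def is_mouth_open(left: Point, right: Point,
                  top: Point, bottom: Point,
                  threshold: float = 0.5) -> bool:
    """Threshold the ratio as a crude per-frame expression cue."""
    return mouth_aspect_ratio(left, right, top, bottom) > threshold
```

In practice such handcrafted features would feed, alongside the raw keypoints, into a trained classifier rather than a fixed threshold.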

OUTPUT

[Demo images: working prototype and final output]

CONTRIBUTORS

Harsh-Avinash
Seshank-k
Nishita-Varshney
Aaryan Bhatiya Ghosh
