Python implementation of "Elliptic Fourier Features of a Closed Contour"

Overview

PyEFD

A Python/NumPy implementation of a method for approximating a contour with a Fourier series, as described in [1].

Installation

pip install pyefd

Usage

Given a closed contour of a shape, generated by e.g. scikit-image or OpenCV, this package can fit a Fourier series approximating the shape of the contour.

General usage examples

This section describes the general usage patterns of pyefd.

from pyefd import elliptic_fourier_descriptors
coeffs = elliptic_fourier_descriptors(contour, order=10)

The input contour is expected to be an array of 2D points of shape (N, 2), tracing the closed contour in order. The returned coeffs array holds one row per harmonic, containing the a_n, b_n, c_n and d_n of the following Fourier series representation of the shape:
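
    x(t) = A_0 + \sum_{n=1}^{N} \left( a_n \cos\frac{2 \pi n t}{T} + b_n \sin\frac{2 \pi n t}{T} \right)

    y(t) = C_0 + \sum_{n=1}^{N} \left( c_n \cos\frac{2 \pi n t}{T} + d_n \sin\frac{2 \pi n t}{T} \right)

where t runs along the contour, T is the total arc length of the contour, N is the chosen order, and A_0, C_0 is the locus (DC component) of the contour.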

The coefficients returned are by default normalized so that they are rotation and size-invariant. This can be overridden by calling:

from pyefd import elliptic_fourier_descriptors
coeffs = elliptic_fourier_descriptors(contour, order=10, normalize=False)

Normalization can also be done afterwards:

from pyefd import normalize_efd
coeffs = normalize_efd(coeffs)
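
Going the other way, an approximate contour can be reconstructed from a set of coefficients, and the shape described by the coefficients can be plotted. A minimal sketch, assuming contour is the same (N, 2) point array as above and that matplotlib is available for the plotting call:

from pyefd import elliptic_fourier_descriptors, reconstruct_contour, plot_efd

# Un-normalized coefficients keep the original scale and orientation, so the
# reconstruction stays comparable to the input contour.
coeffs = elliptic_fourier_descriptors(contour, order=10, normalize=False)

# Rebuild a closed curve from the coefficients; locus is the offset the curve
# is drawn around, and num_points controls the sampling density.
reconstruction = reconstruct_contour(coeffs, locus=(0, 0), num_points=300)

# Plot the shape described by the coefficients.
plot_efd(coeffs)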

OpenCV example

If you are using OpenCV to generate contours, this example shows how to connect it to pyefd.

import cv2 
import numpy
from pyefd import elliptic_fourier_descriptors

# Find the contours of a binary image using OpenCV.
contours, hierarchy = cv2.findContours(
    im, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)

# Iterate through all contours found and store each contour's
# elliptic Fourier descriptor coefficients.
coeffs = []
for cnt in contours:
    # Find the coefficients of all contours
    coeffs.append(elliptic_fourier_descriptors(
        numpy.squeeze(cnt), order=10))
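
Note that the return signature of cv2.findContours differs between OpenCV versions: 3.x returns (image, contours, hierarchy) while 4.x returns (contours, hierarchy), so the two-value unpacking above assumes OpenCV 4.x. A version-agnostic variant could look like this:

# The contours are the second-to-last element on both OpenCV 3.x and 4.x.
result = cv2.findContours(im, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)
contours = result[-2]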

Using EFD as features

To use these as features, one can write a small wrapper function:

from pyefd import elliptic_fourier_descriptors

def efd_feature(contour):
    coeffs = elliptic_fourier_descriptors(contour, order=10, normalize=True)
    return coeffs.flatten()[3:]

If the coefficients are normalized, then coeffs[0, 0] = 1.0, coeffs[0, 1] = 0.0 and coeffs[0, 2] = 0.0, so they can be disregarded when using the elliptic Fourier descriptors as features.
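
For example, with order=10 the descriptor array has shape (10, 4), so the feature vector above contains 4 * 10 - 3 = 37 values. A quick, hypothetical check, reusing the first OpenCV contour from the example above (assuming at least one contour was found):

feature = efd_feature(numpy.squeeze(contours[0]))
print(feature.shape)  # (37,): 40 coefficients minus the 3 fixed by normalization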

See [1] for more technical details.

Testing

Run tests with Pytest:

py.test tests.py

The tests include a single image from the MNIST dataset of handwritten digits ([2]) as a contour to use for testing.

Documentation

See ReadTheDocs.

References

[1]: Frank P Kuhl, Charles R Giardina, Elliptic Fourier features of a closed contour, Computer Graphics and Image Processing, Volume 18, Issue 3, 1982, Pages 236-258, ISSN 0146-664X, http://dx.doi.org/10.1016/0146-664X(82)90034-X.

[2]: LeCun et al. (1999): The MNIST Dataset Of Handwritten Digits

Comments
  • Vectorized contour reconstruction function

    Vectorized contour reconstruction function

    I hope to contribute some more to this project with an extracted contour reconstruction function. I refactored the tests accordingly. To compare reconstructed shapes I had to import a reliable Hausdorff distance function, for which the scipy package was added to the test requirements.

    opened by reinvantveer 4
  • fix x/y swapping and add demo

    fix x/y swapping and add demo

    Hi,

    I noticed that in some places the x/y dimensions were apparently mixed up, and I attempted to fix this. As a test and demo, I added a few geometric figures to showcase this method.

    Best regards, Jonathan

    enhancement 
    opened by jonathanschilling 3
  • Method not robust to random index ?

    Method not robust to random index ?

    Hello,

    I wanted to test your method. I do not really know how it works, but it seems that the order in which the points are indexed matters, as I get strange results when the array is indexed differently... Is there a way to resolve this?

    Below is an illustration of what I mean.

    Normal result when the points are correctly ordered: [image]

    Abnormal result when the points are randomly ordered: [image]

    opened by julienguegan 3
  • Bad reconstruction results

    Bad reconstruction results

    Hi, I am writing code that reconstructs the image from the EFD coefficients @hbldh

    import cv2
    import numpy as np
    import pyefd

    # 28 x 28 grayscale test image; rows are written as run-length
    # concatenations of the original pixel values for brevity.
    img_1 = np.array(
        [
            [255] * 28,
            [255] * 28,
            [255] * 28,
            [255] * 28,
            [255] * 28,
            [255] * 10 + [191, 64, 127] + [255] * 15,
            [255] * 10 + [0, 0, 0, 127] + [255] * 14,
            [255] * 9 + [64, 0, 0, 0, 0, 64] + [255] * 13,
            [255] * 8 + [191, 0, 0, 0, 0, 0, 0, 0, 64, 127, 64, 64, 0, 0, 64, 191] + [255] * 4,
            [255] * 7 + [191] + [0] * 15 + [127] + [255] * 4,
            [255] * 7 + [64, 0, 0, 127, 255, 255, 191, 64, 0, 0, 0, 0, 0, 64, 127, 127] + [255] * 5,
            [255] * 6 + [191, 0, 0, 0] + [255] * 18,
            [255] * 6 + [191, 0, 0, 0, 64, 127] + [255] * 16,
            [255] * 7 + [64, 0, 0, 0, 0, 0, 64, 191] + [255] * 13,
            [255] * 9 + [127, 64, 0, 0, 0, 0, 64, 191] + [255] * 11,
            [255] * 11 + [191, 127, 0, 0, 0, 0, 127] + [255] * 10,
            [255] * 13 + [191, 127, 0, 0, 0, 64] + [255] * 9,
            [255] * 16 + [0, 0, 0, 191] + [255] * 8,
            [255] * 16 + [127, 0, 0, 127] + [255] * 8,
            [255] * 16 + [127, 0, 0, 127] + [255] * 8,
            [255] * 9 + [127, 191, 255, 255, 255, 255, 127, 0, 0, 0, 191] + [255] * 8,
            [255] * 8 + [127, 0, 127, 255, 255, 191, 64, 0, 0, 0, 191] + [255] * 9,
            [255] * 9 + [0] * 9 + [191] + [255] * 9,
            [255] * 9 + [127, 0, 0, 0, 0, 0, 0, 64] + [255] * 11,
            [255] * 10 + [127, 0, 0, 0, 64, 191] + [255] * 12,
            [255] * 28,
            [255] * 28,
            [255] * 28,
        ]
    )
    
    img_1 = np.uint8(img_1)
    edges = cv2.Canny(img_1, 100, 200)

    # Collect the edge pixels (note: this gathers them in raster-scan order,
    # not traced along the contour).
    contour_2 = []
    for i in range(edges.shape[0]):
        for j in range(edges.shape[1]):
            if edges[i, j] == 255:
                contour_2.append([i, j])
    contour_2 = np.array(contour_2)

    cv2.imwrite('test1.png', img_1)

    coeffs = pyefd.elliptic_fourier_descriptors(contour_2, order=10, normalize=False)

    reconstruction = pyefd.reconstruct_contour(coeffs, locus=(0, 0), num_points=300)

    # Rasterize the reconstructed contour into an image.
    tmp = np.zeros_like(edges)
    for i in range(reconstruction.shape[0]):
        tmp[int(round(reconstruction[i][0]))][int(round(reconstruction[i][1]))] = 255
    print(tmp.shape)
    cv2.imwrite('test2.png', tmp)


    However, the result is not the expected one. How can I fix my code to reconstruct the correct image?

    [Images: test1 — reconstruction of img_1 (test1.png); test2 — reconstruction of the edges; test3 — reconstruction from coeffs (test2.png)]

    opened by MADONOKOUKI 2
  • Error: operands could not be broadcast together with shapes (0,1,2) (10,0)

    Error: operands could not be broadcast together with shapes (0,1,2) (10,0)

    Hi, I am passing my contour sequence to your function to compute the descriptors, following the OpenCV example in your README file, but I get the error above. What is the reason?

    My code:

    import cv2 
    import numpy as np
    from pyefd import elliptic_fourier_descriptors
    
    def auto_canny(image, sigma=0.33):
    	# compute the median of the single channel pixel intensities
    	v = np.median(image)
    	# apply automatic Canny edge detection using the computed median
    	lower = int(max(0, (1.0 - sigma) * v))
    	upper = int(min(255, (1.0 + sigma) * v))
    	edged = cv2.Canny(image, lower, upper)
    	# return the edged image
    	return edged
    def efd_feature(contour):
        coeffs = elliptic_fourier_descriptors(contour, order=10, normalize=True)
        return coeffs.flatten()[3:]
    img = cv2.imread('C:/Users/Ogeday/image.jpg')
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    retval, th = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    cv2.imshow("thresholded", th)
    
    canny = auto_canny(th)
    
    cv2.imshow("cannied", canny)
    # Find the contours of a binary image using OpenCV.
    contours, hierarchy = cv2.findContours(canny, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    
    # Iterate through all contours found and store each contour's 
    # elliptical Fourier descriptor's coefficients.
    coeffs = []
    for cnt in contours:
        # Find the coefficients of all contours
        coeffs.append(elliptic_fourier_descriptors(np.squeeze(cnt), order=10))
    
    efd = efd_feature(contours)
    print(efd)
    
    opened by OgedayOztekin 2
  • pyefd for 3D points

    pyefd for 3D points

    Hi!

    I wondered if I could use pyefd for generating the contour from 3D data points, where x, y, and z are the coordinates of a generic point. Do you have any suggestions?

    I really appreciate any help you can provide!

    opened by dalbenzioG 1
  • Feature request: normalize_efd function that also outputs angle and scale

    Feature request: normalize_efd function that also outputs angle and scale

    Thank you very much for this beautiful piece of software. For my purposes it would be great to also get the normalization angle and scale in order to store it alongside the descriptor for future lookups. Would it be possible to have a analogous function to normalize_efd that outputs those values and the normalized descriptor as a tuple?
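
    For reference, the release notes further down list a return_transformation keyword on elliptic_fourier_descriptors (merged #11, fixing #5), which appears to address this request. A hedged sketch of how it might be called; the exact contents of the returned transformation tuple are an assumption here:

    from pyefd import elliptic_fourier_descriptors

    coeffs, transformation = elliptic_fourier_descriptors(
        contour, order=10, normalize=True, return_transformation=True
    )
    # `transformation` is assumed to hold the normalization parameters
    # (scale and rotation/phase angles); check the pyefd docs for the exact layout.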

    enhancement 
    opened by geloescht 1
  • Release/v1.5.0

    Release/v1.5.0

    Version 1.5.0

    Added

    • return_transformation keyword on elliptic_fourier_descriptors method. Merged #11. Fixes #5.

    Fixes

    • Documentation correction. Merged #12.
    opened by hbldh 0
  • Create Dependabot config file

    Create Dependabot config file

    :wave: Dependabot is moving natively into GitHub! This pull request migrates your configuration from Dependabot.com to a config file, using the new syntax. When you merge this pull request, we'll swap out dependabot-preview (me) for a new dependabot app, and you'll be all set!

    With this change, you'll now use the Dependabot page in GitHub, rather than the Dependabot dashboard, to monitor your version updates. Dependabot is now configured exclusively using config files.

    If you've got any questions or feedback for us, please let us know by creating an issue in the dependabot/dependabot-core repository.

    Learn more about the relaunch of Dependabot

    Please note that regular @dependabot commands do not work on this pull request.

    :robot::yellow_heart:

    dependencies 
    opened by dependabot-preview[bot] 0
  • Dependabot couldn't authenticate with https://pypi.python.org/simple/

    Dependabot couldn't authenticate with https://pypi.python.org/simple/

    Dependabot couldn't authenticate with https://pypi.python.org/simple/.

    You can provide authentication details in your Dependabot dashboard by clicking into the account menu (in the top right) and selecting 'Config variables'.

    View the update logs.

    opened by dependabot-preview[bot] 0
  • Dependabot can't resolve your Python dependency files

    Dependabot can't resolve your Python dependency files

    Dependabot can't resolve your Python dependency files.

    As a result, Dependabot couldn't update your dependencies.

    The error Dependabot encountered was:

    ERROR: ERROR: Could not find a version that matches black
    Skipped pre-versions: 18.3a0, 18.3a0, 18.3a1, 18.3a1, 18.3a2, 18.3a2, 18.3a3, 18.3a3, 18.3a4, 18.3a4, 18.4a0, 18.4a0, 18.4a1, 18.4a1, 18.4a2, 18.4a2, 18.4a3, 18.4a3, 18.4a4, 18.4a4, 18.5b0, 18.5b0, 18.5b1, 18.5b1, 18.6b0, 18.6b0, 18.6b1, 18.6b1, 18.6b2, 18.6b2, 18.6b3, 18.6b3, 18.6b4, 18.6b4, 18.9b0, 18.9b0, 19.3b0, 19.3b0
    There are incompatible versions in the resolved dependencies.
    [pipenv.exceptions.ResolutionFailure]:       req_dir=requirements_dir
    [pipenv.exceptions.ResolutionFailure]:   File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 726, in resolve_deps
    [pipenv.exceptions.ResolutionFailure]:       req_dir=req_dir,
    [pipenv.exceptions.ResolutionFailure]:   File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 480, in actually_resolve_deps
    [pipenv.exceptions.ResolutionFailure]:       resolved_tree = resolver.resolve()
    [pipenv.exceptions.ResolutionFailure]:   File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 395, in resolve
    [pipenv.exceptions.ResolutionFailure]:       raise ResolutionFailure(message=str(e))
    [pipenv.exceptions.ResolutionFailure]:       pipenv.exceptions.ResolutionFailure: ERROR: ERROR: Could not find a version that matches black
    [pipenv.exceptions.ResolutionFailure]:       Skipped pre-versions: 18.3a0, 18.3a0, 18.3a1, 18.3a1, 18.3a2, 18.3a2, 18.3a3, 18.3a3, 18.3a4, 18.3a4, 18.4a0, 18.4a0, 18.4a1, 18.4a1, 18.4a2, 18.4a2, 18.4a3, 18.4a3, 18.4a4, 18.4a4, 18.5b0, 18.5b0, 18.5b1, 18.5b1, 18.6b0, 18.6b0, 18.6b1, 18.6b1, 18.6b2, 18.6b2, 18.6b3, 18.6b3, 18.6b4, 18.6b4, 18.9b0, 18.9b0, 19.3b0, 19.3b0
    [pipenv.exceptions.ResolutionFailure]: Warning: Your dependencies could not be resolved. You likely have a mismatch in your sub-dependencies.
      First try clearing your dependency cache with $ pipenv lock --clear, then try the original command again.
     Alternatively, you can use $ pipenv install --skip-lock to bypass this mechanism, then run $ pipenv graph to inspect the situation.
      Hint: try $ pipenv lock --pre if it is a pre-release dependency.
    ERROR: ERROR: Could not find a version that matches black
    Skipped pre-versions: 18.3a0, 18.3a0, 18.3a1, 18.3a1, 18.3a2, 18.3a2, 18.3a3, 18.3a3, 18.3a4, 18.3a4, 18.4a0, 18.4a0, 18.4a1, 18.4a1, 18.4a2, 18.4a2, 18.4a3, 18.4a3, 18.4a4, 18.4a4, 18.5b0, 18.5b0, 18.5b1, 18.5b1, 18.6b0, 18.6b0, 18.6b1, 18.6b1, 18.6b2, 18.6b2, 18.6b3, 18.6b3, 18.6b4, 18.6b4, 18.9b0, 18.9b0, 19.3b0, 19.3b0
    There are incompatible versions in the resolved dependencies.
    
    ['Traceback (most recent call last):\n', '  File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 501, in create_spinner\n    yield sp\n', '  File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 649, in venv_resolve_deps\n    c = resolve(cmd, sp)\n', '  File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 539, in resolve\n    sys.exit(c.return_code)\n', 'SystemExit: 1\n']
    

    If you think the above is an error on Dependabot's side please don't hesitate to get in touch - we'll do whatever we can to fix it.

    You can mention @dependabot in the comments below to contact the Dependabot team.

    opened by dependabot-preview[bot] 0
  • Contour chain approximation "simple" is buggy or numerically unstable

    Contour chain approximation "simple" is buggy or numerically unstable

    Description

    I was running Fourier descriptor extraction on contours that naturally contain long straight lines. I used cv.CHAIN_APPROX_SIMPLE as usual, but was getting weird results, as if the method does not converge: [image]

    I tried storing the contour with cv.CHAIN_APPROX_NONE instead, and that fixed the problem for all of my cases: [image]

    Minimal setup to reproduce:

    import cv2 as cv
    import matplotlib.pyplot as plt
    import numpy as np
    import pyefd

    img = np.zeros((100,100), dtype=np.uint8)
    img = cv.rectangle(img, (25,25), (75,75), (255,255,255), -1)
    cnt, h = cv.findContours(img,cv.RETR_EXTERNAL, cv.CHAIN_APPROX_SIMPLE)
    coeffs = pyefd.elliptic_fourier_descriptors(cnt[0].reshape(-1,2), order=10, normalize=True)
    pyefd.plot_efd(coeffs)
    plt.show()
    
    img = np.zeros((100,100), dtype=np.uint8)
    img = cv.rectangle(img, (25,25), (75,75), (255,0,0), -1)
    cnt, h = cv.findContours(img,cv.RETR_EXTERNAL, cv.CHAIN_APPROX_NONE)
    coeffs = pyefd.elliptic_fourier_descriptors(cnt[0].reshape(-1,2), order=10, normalize=True)
    pyefd.plot_efd(coeffs)
    plt.show()
    

    I get: [image] [image]

    opened by MikeTkachuk 0
  • RuntimeWarning: invalid value encountered in true_divide

    RuntimeWarning: invalid value encountered in true_divide

    A specific contour leads to a warning and to NaNs due to division by zero.

    from pyefd import elliptic_fourier_descriptors
    import numpy as np
    
    contour = np.array([(0.0007365261134166801, 0.0008592751780890362), (0.0011385481809349507, 0.0005073326831297464), (0.0016015060818268534, 0.00024058327913523136), (0.002107608603590938, 6.927799610623175e-05), (0.002637406510141327, 0.0), (0.003170539965043462, 3.5411605355473164e-05), (0.0036865209486098838, 0.00017415196403836042), (0.0036865209486098838, 0.00017415196403836042), (0.003301593851628093, 0.0011941724608851567), (0.003301593851628093, 0.0011941724608851567), (0.0029920052614881287, 0.001110928245675824), (0.002672125188546981, 0.0010896812824625624), (0.002354246444616681, 0.0011312480801257685), (0.002050584931558297, 0.0012340312499438122), (0.0017728101910231553, 0.001394080892339833), (0.001531596950512193, 0.0016052463893156954), (0.0013362148995842427, 0.001859412769243729), (0.0011941724608850457, 0.0021468125606828314), (0.001110928245675491, 0.0024564011508226846), (0.0010896812824621183, 0.0027762812237640544), (0.0011312480801258795, 0.003094159967693799), (0.001234031249943368, 0.0033978214807524054), (0.001394080892340055, 0.003675596221287547), (0.0016052463893154734, 0.003916809461798509), (0.00185941276924384, 0.004112191512726571), (0.0021468125606826094, 0.004254233951425768), (0.0017618854637007075, 0.005274254448272675), (0.0012828858113027586, 0.005037517050440643), (0.0008592751780888142, 0.0047118802988938), (0.0005073326831298575, 0.004309858231375752), (0.0002405832791353424, 0.003846900330483627), (6.927799610623175e-05, 0.0033407978087195422), (0.0, 0.0028109999021695975), (3.5411605355584186e-05, 0.0022778664472672405), (0.0001741519640382494, 0.0017618854637008186), (0.00041088936187017033, 0.0012828858113032027), (0.0007365261134166801, 0.0008592751780890362)])
    y = elliptic_fourier_descriptors(contour, order=3, normalize=False)
    print(y)
    

    will give the following output:

    [[nan nan nan nan]
     [nan nan nan nan]
     [nan nan nan nan]]
    /usr/local/lib/python3.7/dist-packages/pyefd.py:67: RuntimeWarning: invalid value encountered in true_divide
      a = consts * np.sum((dxy[:, 0] / dt) * d_cos_phi, axis=1)
    /usr/local/lib/python3.7/dist-packages/pyefd.py:68: RuntimeWarning: invalid value encountered in true_divide
      b = consts * np.sum((dxy[:, 0] / dt) * d_sin_phi, axis=1)
    /usr/local/lib/python3.7/dist-packages/pyefd.py:69: RuntimeWarning: invalid value encountered in true_divide
      c = consts * np.sum((dxy[:, 1] / dt) * d_cos_phi, axis=1)
    /usr/local/lib/python3.7/dist-packages/pyefd.py:70: RuntimeWarning: invalid value encountered in true_divide
      d = consts * np.sum((dxy[:, 1] / dt) * d_sin_phi, axis=1)


    Any idea how to fix this, or how to work around it?
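
    A possible work-around (a sketch, assuming the NaNs come from the consecutive duplicate points in this contour, which make the arc-length increment dt zero) is to drop repeated points before calling pyefd:

    import numpy as np
    from pyefd import elliptic_fourier_descriptors

    # Keep the first point plus every point that differs from its predecessor,
    # so that no segment has zero length (dt == 0).
    keep = np.r_[True, np.any(np.diff(contour, axis=0) != 0, axis=1)]
    y = elliptic_fourier_descriptors(contour[keep], order=3, normalize=False)
    print(y)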

    opened by ghost 3
  • Descriptors not consistent across cycled contour indices

    Descriptors not consistent across cycled contour indices

    Description

    I am trying to create invariant descriptors for the same silhouettes at different rotation angles.

    What I Did

    Created rotated copies of the same picture, ran skimage.measure.find_contours() on them to extract contours, and ran pyefd.elliptic_fourier_descriptors(normalize=True) on the results. Expected output: equal, within some margin of error, for the differently rotated copies. Actual output: the results are only sometimes equal.

    Unfortunately my code is spread over several source files and depends on data, so I cannot easily share an example of what I am actually doing. But here is a function that, when inserted into tests.py, will result in a failing test:

    def test_normalizing_4():
        contour_2 = np.roll(contour_1[:-1,:], 40, axis=0)
        contour_2 = np.append(contour_2, [contour_2[0]], axis=0)
        c1 = pyefd.elliptic_fourier_descriptors(contour_1, normalize=True)
        c2 = pyefd.elliptic_fourier_descriptors(contour_2, normalize=True)
        np.testing.assert_almost_equal(c1, c2, decimal=12)
    

    The reason for this behaviour is actually mentioned in the original paper, in chapter 5.1 and figure 8: for every shape there are two possible classifications, each aligned with one of the two semi-major axes (rotated 180 degrees from each other). It seems like pyefd chooses one of them based on the location of the first point in the contour.

    There might be two solutions to this: either return both classifications, or choose one of them (more) consistently by examining the higher harmonic content of the descriptor. Note that the (near-)circular case also exists, as outlined in chapter 5.2 of the paper, so returning multiple descriptors and normalization parameters might be required anyway for contours with rotational symmetry.

    bug enhancement help wanted 
    opened by geloescht 2
Releases (v1.6.0)
  • v1.6.0 (Dec 9, 2021)

    Version 1.6.0 (2021-12-09)

    Added

    • Added a demo for 3D surfaces with cylindrical symmetries. (examples/example1.py)

    Fixes

    • Fixes incorrectly plotted curves when no imshow has been called.
    • Fixes ugly coefficient calculation code.
  • v1.5.1 (Jan 22, 2021)

    1.5.1 (2021-01-22)

    Added

    • return_transformation keyword on elliptic_fourier_descriptors method. Merged #11. Fixes #5.

    Fixes

    • Documentation correction. Merged #12.

    Removed

    • Removed example script which did not work anymore.
  • v.1.5.1-2 (Jan 22, 2021)

    1.5.1 (2021-01-22)

    Added

    • return_transformation keyword on elliptic_fourier_descriptors method. Merged #11. Fixes #5.

    Fixes

    • Documentation correction. Merged #12.

    Removed

    • Removed example script which did not work anymore.
  • v1.4.1 (Sep 28, 2020)

  • v0.1.0 (Feb 9, 2016)

Owner
Henrik Blidh
Mathematician, Python programmer and Pointless Projecteer.