Python implementation of "Elliptic Fourier Features of a Closed Contour"

Overview

PyEFD


A Python/NumPy implementation of a method for approximating a contour with a Fourier series, as described in [1].

Installation

pip install pyefd

Usage

Given a closed contour of a shape, generated by e.g. scikit-image or OpenCV, this package can fit a Fourier series approximating the shape of the contour.

General usage examples

This section describes the general usage patterns of pyefd.

from pyefd import elliptic_fourier_descriptors
coeffs = elliptic_fourier_descriptors(contour, order=10)
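For a contour given as a (K, 2) NumPy array of (x, y) points, the returned coeffs array has shape (order, 4), with one row of (a_n, b_n, c_n, d_n) per harmonic. A minimal sketch (the sampled circle is only an illustrative stand-in for a real contour):

import numpy as np
from pyefd import elliptic_fourier_descriptors

# Illustrative closed contour: a unit circle sampled at 100 points.
t = np.linspace(0, 2 * np.pi, 100)
contour = np.column_stack([np.cos(t), np.sin(t)])

coeffs = elliptic_fourier_descriptors(contour, order=10)
print(coeffs.shape)  # (10, 4): one row of (a_n, b_n, c_n, d_n) per harmonic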

The coefficients returned are the a_n, b_n, c_n and d_n of the following Fourier series representation of the shape, where t is the position along the contour, T is the total contour length and N is the chosen order:

x(t) = A_0 + sum_{n=1}^{N} ( a_n cos(2 pi n t / T) + b_n sin(2 pi n t / T) )
y(t) = C_0 + sum_{n=1}^{N} ( c_n cos(2 pi n t / T) + d_n sin(2 pi n t / T) )

The coefficients returned are by default normalized so that they are rotation and size-invariant. This can be overridden by calling:

from pyefd import elliptic_fourier_descriptors
coeffs = elliptic_fourier_descriptors(contour, order=10, normalize=False)

Normalization can also be done afterwards:

from pyefd import normalize_efd
coeffs = normalize_efd(coeffs)
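The package also provides helpers for going back from the coefficients to a point set. A brief sketch, assuming the reconstruct_contour and calculate_dc_coefficients helpers available in recent pyefd versions:

import numpy as np
from pyefd import (elliptic_fourier_descriptors,
                   calculate_dc_coefficients, reconstruct_contour)

# Illustrative closed contour: an ellipse sampled at 200 points.
t = np.linspace(0, 2 * np.pi, 200)
contour = np.column_stack([3 * np.cos(t), 2 * np.sin(t)])

coeffs = elliptic_fourier_descriptors(contour, order=10, normalize=False)

# The DC components (A_0, C_0) give the locus (centre) of the contour.
locus = calculate_dc_coefficients(contour)

# Evaluate the truncated Fourier series back into a (300, 2) point set.
reconstruction = reconstruct_contour(coeffs, locus=locus, num_points=300)

There is also a plot_efd(coeffs) helper (requiring matplotlib) for a quick visual check of the approximation.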

OpenCV example

If you are using OpenCV to generate contours, this example shows how to connect it to pyefd.

import cv2 
import numpy
from pyefd import elliptic_fourier_descriptors

# Find the contours of a binary image using OpenCV.
contours, hierarchy = cv2.findContours(
    im, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)

# Iterate through all contours found and store each contour's
# elliptic Fourier descriptor coefficients.
coeffs = []
for cnt in contours:
    # Squeeze away OpenCV's extra axis and compute this contour's coefficients.
    coeffs.append(elliptic_fourier_descriptors(
        numpy.squeeze(cnt), order=10))
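Note that the tuple returned by cv2.findContours differs between OpenCV versions: 2.x and 4.x return (contours, hierarchy), while 3.x returns (image, contours, hierarchy). A small guard such as the following (not part of pyefd) keeps the example working on either:

# cv2.findContours returns (contours, hierarchy) in OpenCV 2.x/4.x,
# but (image, contours, hierarchy) in OpenCV 3.x.
result = cv2.findContours(im, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)
contours = result[0] if len(result) == 2 else result[1]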

Using EFD as features

To use these as features, one can write a small wrapper function:

from pyefd import elliptic_fourier_descriptors

def efd_feature(contour):
    coeffs = elliptic_fourier_descriptors(contour, order=10, normalize=True)
    return coeffs.flatten()[3:]

If the coefficients are normalized, then coeffs[0, 0] = 1.0, coeffs[0, 1] = 0.0 and coeffs[0, 2] = 0.0, so they can be disregarded when using the elliptic Fourier descriptors as features.
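As an illustration (this is not part of pyefd), two such feature vectors can be compared directly, for example with a Euclidean distance, or fed to any feature-based classifier:

import numpy as np
from pyefd import elliptic_fourier_descriptors

def efd_feature(contour):
    coeffs = elliptic_fourier_descriptors(contour, order=10, normalize=True)
    return coeffs.flatten()[3:]

# Two illustrative shapes: an ellipse and a rotated, scaled copy of it.
t = np.linspace(0, 2 * np.pi, 200)
ellipse = np.column_stack([3 * np.cos(t), np.sin(t)])
angle = 0.7
rot = np.array([[np.cos(angle), -np.sin(angle)],
                [np.sin(angle), np.cos(angle)]])
rotated = 2.5 * ellipse @ rot.T

# Since the normalized descriptors are rotation- and size-invariant,
# this distance should be close to zero.
print(np.linalg.norm(efd_feature(ellipse) - efd_feature(rotated)))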

See [1] for more technical details.

Testing

Run the tests with pytest:

py.test tests.py

The tests include a single image from the MNIST dataset of handwritten digits ([2]) as a contour to use for testing.

Documentation

See ReadTheDocs.

References

[1]: Frank P Kuhl, Charles R Giardina, Elliptic Fourier features of a closed contour, Computer Graphics and Image Processing, Volume 18, Issue 3, 1982, Pages 236-258, ISSN 0146-664X, http://dx.doi.org/10.1016/0146-664X(82)90034-X.

[2]: LeCun et al. (1999): The MNIST Dataset Of Handwritten Digits

Comments
  • Vectorized contour reconstruction function

I hope to contribute some more to this project with an extracted contour reconstruction function, and I have refactored the tests accordingly. To compare reconstructed shapes I had to import a reliable Hausdorff distance function, for which the scipy package was added to the test requirements.

    opened by reinvantveer 4
  • fix x/y swapping and add demo

    Hi,

    I noticed that in some places apparently the x/y dimension was mixed up and I attempted to fix this. As a test and demo, I added a few geometric figures to showcase this method.

    Best regards, Jonathan

    enhancement 
    opened by jonathanschilling 3
  • Method not robust to random index ?

    Hello,

I wanted to test your method. I do not really know how it works, but it seems that how the points are indexed matters, as I get strange results when the array is indexed differently. Is there a way to resolve this?

Below is an illustration of what I mean.

Normal result when the points are correctly ordered (image).

Abnormal result when the points are randomly ordered (image).

    opened by julienguegan 3
  • Bad reconstruction results

Hi, I'm now writing code that reconstructs the image from the EFD coefficients @hbldh

    img_1 = np.array(
        [
            # 28 x 28 grayscale digit image; each expression below builds one
            # 28-pixel row (run-length style, identical values to the original).
            [255] * 28,
            [255] * 28,
            [255] * 28,
            [255] * 28,
            [255] * 28,
            [255] * 10 + [191, 64, 127] + [255] * 15,
            [255] * 10 + [0, 0, 0, 127] + [255] * 14,
            [255] * 9 + [64, 0, 0, 0, 0, 64] + [255] * 13,
            [255] * 8 + [191, 0, 0, 0, 0, 0, 0, 0, 64, 127, 64, 64, 0, 0, 64, 191] + [255] * 4,
            [255] * 7 + [191] + [0] * 15 + [127] + [255] * 4,
            [255] * 7 + [64, 0, 0, 127, 255, 255, 191, 64] + [0] * 5 + [64, 127, 127] + [255] * 5,
            [255] * 6 + [191, 0, 0, 0] + [255] * 18,
            [255] * 6 + [191, 0, 0, 0, 64, 127] + [255] * 16,
            [255] * 7 + [64, 0, 0, 0, 0, 0, 64, 191] + [255] * 13,
            [255] * 9 + [127, 64, 0, 0, 0, 0, 64, 191] + [255] * 11,
            [255] * 11 + [191, 127, 0, 0, 0, 0, 127] + [255] * 10,
            [255] * 13 + [191, 127, 0, 0, 0, 64] + [255] * 9,
            [255] * 16 + [0, 0, 0, 191] + [255] * 8,
            [255] * 16 + [127, 0, 0, 127] + [255] * 8,
            [255] * 16 + [127, 0, 0, 127] + [255] * 8,
            [255] * 9 + [127, 191, 255, 255, 255, 255, 127, 0, 0, 0, 191] + [255] * 8,
            [255] * 8 + [127, 0, 127, 255, 255, 191, 64, 0, 0, 0, 191] + [255] * 9,
            [255] * 9 + [0] * 9 + [191] + [255] * 9,
            [255] * 9 + [127, 0, 0, 0, 0, 0, 0, 64] + [255] * 11,
            [255] * 10 + [127, 0, 0, 0, 64, 191] + [255] * 12,
            [255] * 28,
            [255] * 28,
            [255] * 28,
        ]
    )
    
    img_1 = np.uint8(img_1)
    edges = cv2.Canny(img_1,100,200)
    contour_2 = []
    
    for i in range(edges.shape[0]):
        for j in range(edges.shape[1]):
            if edges[i,j] == 255:
              contour_2.append([i,j])
    contour_2 = np.array(contour_2)
    
    cv2.imwrite('test1.png',img_1)
    
    coeffs = pyefd.elliptic_fourier_descriptors(contour_2, order=10, normalize=False)
    
    contour_2 = pyefd.reconstruct_contour(coeffs, locus=(0, 0), num_points=300)
    
    for i in range(contour_1.shape[0]):
        tmp[int(round(contour_1[i][0]))][int(round(contour_1[i][1]))] = 255
    print(tmp.shape)
    cv2.imwrite('test2.png',tmp)
    

However, the result is not the expected one. How can I fix my code to reconstruct the correct image?

Attached images: test1 = img_1 (test1.png), test2 = the edge image, test3 = the reconstruction from coeffs (test2.png).

    opened by MADONOKOUKI 2
  • Error: operands could not be broadcast together with shapes (0,1,2) (10,0)

Hi, I am passing my contour sequence to your function to compute the descriptors, following the OpenCV example in your readme file, but I get the error above. What is the reason?

    My code:

    import cv2
    import numpy as np
    from pyefd import elliptic_fourier_descriptors

    def auto_canny(image, sigma=0.33):
        # compute the median of the single channel pixel intensities
        v = np.median(image)
        # apply automatic Canny edge detection using the computed median
        lower = int(max(0, (1.0 - sigma) * v))
        upper = int(min(255, (1.0 + sigma) * v))
        edged = cv2.Canny(image, lower, upper)
        # return the edged image
        return edged

    def efd_feature(contour):
        coeffs = elliptic_fourier_descriptors(contour, order=10, normalize=True)
        return coeffs.flatten()[3:]

    img = cv2.imread('C:/Users/Ogeday/image.jpg')
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    retval, th = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    cv2.imshow("thresholded", th)

    canny = auto_canny(th)

    cv2.imshow("cannied", canny)
    # Find the contours of a binary image using OpenCV.
    contours, hierarchy = cv2.findContours(canny, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    # Iterate through all contours found and store each contour's
    # elliptic Fourier descriptor coefficients.
    coeffs = []
    for cnt in contours:
        # Find the coefficients of all contours
        coeffs.append(elliptic_fourier_descriptors(np.squeeze(cnt), order=10))

    efd = efd_feature(contours)
    print(efd)
    
    opened by OgedayOztekin 2
  • pyefd for 3D points

    Hi!

    I wondered if I could use pyefd for generating the contour from 3D data points, where x, y, and z are the coordinates of a generic point. Do you have any suggestions?

    I really appreciate any help you can provide!

    opened by dalbenzioG 1
  • Feature request: normalize_efd function that also outputs angle and scale

Thank you very much for this beautiful piece of software. For my purposes it would be great to also get the normalization angle and scale, in order to store them alongside the descriptor for future lookups. Would it be possible to have an analogous function to normalize_efd that outputs those values and the normalized descriptor as a tuple?

    enhancement 
    opened by geloescht 1
  • Release/v1.5.0

    Version 1.5.0

    Added

    • return_transformation keyword on elliptic_fourier_descriptors method. Merged #11. Fixes #5.

    Fixes

    • Documentation correction. Merged #12.
    opened by hbldh 0
  • Create Dependabot config file

    :wave: Dependabot is moving natively into GitHub! This pull request migrates your configuration from Dependabot.com to a config file, using the new syntax. When you merge this pull request, we'll swap out dependabot-preview (me) for a new dependabot app, and you'll be all set!

    With this change, you'll now use the Dependabot page in GitHub, rather than the Dependabot dashboard, to monitor your version updates. Dependabot is now configured exclusively using config files.

    If you've got any questions or feedback for us, please let us know by creating an issue in the dependabot/dependabot-core repository.

    Learn more about the relaunch of Dependabot

    Please note that regular @dependabot commands do not work on this pull request.

    :robot::yellow_heart:

    dependencies 
    opened by dependabot-preview[bot] 0
  • Dependabot couldn't authenticate with https://pypi.python.org/simple/

    Dependabot couldn't authenticate with https://pypi.python.org/simple/.

    You can provide authentication details in your Dependabot dashboard by clicking into the account menu (in the top right) and selecting 'Config variables'.

    View the update logs.

    opened by dependabot-preview[bot] 0
  • Dependabot can't resolve your Python dependency files

    Dependabot can't resolve your Python dependency files.

    As a result, Dependabot couldn't update your dependencies.

    The error Dependabot encountered was:

    ERROR: ERROR: Could not find a version that matches black
    Skipped pre-versions: 18.3a0, 18.3a0, 18.3a1, 18.3a1, 18.3a2, 18.3a2, 18.3a3, 18.3a3, 18.3a4, 18.3a4, 18.4a0, 18.4a0, 18.4a1, 18.4a1, 18.4a2, 18.4a2, 18.4a3, 18.4a3, 18.4a4, 18.4a4, 18.5b0, 18.5b0, 18.5b1, 18.5b1, 18.6b0, 18.6b0, 18.6b1, 18.6b1, 18.6b2, 18.6b2, 18.6b3, 18.6b3, 18.6b4, 18.6b4, 18.9b0, 18.9b0, 19.3b0, 19.3b0
    There are incompatible versions in the resolved dependencies.
    [pipenv.exceptions.ResolutionFailure]:       req_dir=requirements_dir
    [pipenv.exceptions.ResolutionFailure]:   File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 726, in resolve_deps
    [pipenv.exceptions.ResolutionFailure]:       req_dir=req_dir,
    [pipenv.exceptions.ResolutionFailure]:   File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 480, in actually_resolve_deps
    [pipenv.exceptions.ResolutionFailure]:       resolved_tree = resolver.resolve()
    [pipenv.exceptions.ResolutionFailure]:   File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 395, in resolve
    [pipenv.exceptions.ResolutionFailure]:       raise ResolutionFailure(message=str(e))
    [pipenv.exceptions.ResolutionFailure]:       pipenv.exceptions.ResolutionFailure: ERROR: ERROR: Could not find a version that matches black
    [pipenv.exceptions.ResolutionFailure]:       Skipped pre-versions: 18.3a0, 18.3a0, 18.3a1, 18.3a1, 18.3a2, 18.3a2, 18.3a3, 18.3a3, 18.3a4, 18.3a4, 18.4a0, 18.4a0, 18.4a1, 18.4a1, 18.4a2, 18.4a2, 18.4a3, 18.4a3, 18.4a4, 18.4a4, 18.5b0, 18.5b0, 18.5b1, 18.5b1, 18.6b0, 18.6b0, 18.6b1, 18.6b1, 18.6b2, 18.6b2, 18.6b3, 18.6b3, 18.6b4, 18.6b4, 18.9b0, 18.9b0, 19.3b0, 19.3b0
    [pipenv.exceptions.ResolutionFailure]: Warning: Your dependencies could not be resolved. You likely have a mismatch in your sub-dependencies.
      First try clearing your dependency cache with $ pipenv lock --clear, then try the original command again.
     Alternatively, you can use $ pipenv install --skip-lock to bypass this mechanism, then run $ pipenv graph to inspect the situation.
      Hint: try $ pipenv lock --pre if it is a pre-release dependency.
    ERROR: ERROR: Could not find a version that matches black
    Skipped pre-versions: 18.3a0, 18.3a0, 18.3a1, 18.3a1, 18.3a2, 18.3a2, 18.3a3, 18.3a3, 18.3a4, 18.3a4, 18.4a0, 18.4a0, 18.4a1, 18.4a1, 18.4a2, 18.4a2, 18.4a3, 18.4a3, 18.4a4, 18.4a4, 18.5b0, 18.5b0, 18.5b1, 18.5b1, 18.6b0, 18.6b0, 18.6b1, 18.6b1, 18.6b2, 18.6b2, 18.6b3, 18.6b3, 18.6b4, 18.6b4, 18.9b0, 18.9b0, 19.3b0, 19.3b0
    There are incompatible versions in the resolved dependencies.
    
    ['Traceback (most recent call last):\n', '  File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 501, in create_spinner\n    yield sp\n', '  File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 649, in venv_resolve_deps\n    c = resolve(cmd, sp)\n', '  File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 539, in resolve\n    sys.exit(c.return_code)\n', 'SystemExit: 1\n']
    

    If you think the above is an error on Dependabot's side please don't hesitate to get in touch - we'll do whatever we can to fix it.

    You can mention @dependabot in the comments below to contact the Dependabot team.

    opened by dependabot-preview[bot] 0
  • Contour chain approximation "simple" is buggy or numerically unstable

    Description

I was running Fourier descriptor extraction on contours that naturally contain long straight lines. I used cv.CHAIN_APPROX_SIMPLE as usual but got weird results, as if the method does not converge (see attached image).

I tried storing the contour with cv.CHAIN_APPROX_NONE instead and it fixed the problem for all of my cases (see attached image).

    Minimal setup to reproduce:

    import cv2 as cv
    import numpy as np
    import pyefd
    from matplotlib import pyplot as plt

    img = np.zeros((100, 100), dtype=np.uint8)
    img = cv.rectangle(img, (25, 25), (75, 75), (255, 255, 255), -1)
    cnt, h = cv.findContours(img, cv.RETR_EXTERNAL, cv.CHAIN_APPROX_SIMPLE)
    coeffs = pyefd.elliptic_fourier_descriptors(cnt[0].reshape(-1, 2), order=10, normalize=True)
    pyefd.plot_efd(coeffs)
    plt.show()

    img = np.zeros((100, 100), dtype=np.uint8)
    img = cv.rectangle(img, (25, 25), (75, 75), (255, 0, 0), -1)
    cnt, h = cv.findContours(img, cv.RETR_EXTERNAL, cv.CHAIN_APPROX_NONE)
    coeffs = pyefd.elliptic_fourier_descriptors(cnt[0].reshape(-1, 2), order=10, normalize=True)
    pyefd.plot_efd(coeffs)
    plt.show()
    

I get the two attached result images.

    opened by MikeTkachuk 0
  • RuntimeWarning: invalid value encountered in true_divide

A specific contour leads to a warning and to NaN values due to division by zero.

    from pyefd import elliptic_fourier_descriptors
    import numpy as np
    
    contour = np.array([(0.0007365261134166801, 0.0008592751780890362), (0.0011385481809349507, 0.0005073326831297464), (0.0016015060818268534, 0.00024058327913523136), (0.002107608603590938, 6.927799610623175e-05), (0.002637406510141327, 0.0), (0.003170539965043462, 3.5411605355473164e-05), (0.0036865209486098838, 0.00017415196403836042), (0.0036865209486098838, 0.00017415196403836042), (0.003301593851628093, 0.0011941724608851567), (0.003301593851628093, 0.0011941724608851567), (0.0029920052614881287, 0.001110928245675824), (0.002672125188546981, 0.0010896812824625624), (0.002354246444616681, 0.0011312480801257685), (0.002050584931558297, 0.0012340312499438122), (0.0017728101910231553, 0.001394080892339833), (0.001531596950512193, 0.0016052463893156954), (0.0013362148995842427, 0.001859412769243729), (0.0011941724608850457, 0.0021468125606828314), (0.001110928245675491, 0.0024564011508226846), (0.0010896812824621183, 0.0027762812237640544), (0.0011312480801258795, 0.003094159967693799), (0.001234031249943368, 0.0033978214807524054), (0.001394080892340055, 0.003675596221287547), (0.0016052463893154734, 0.003916809461798509), (0.00185941276924384, 0.004112191512726571), (0.0021468125606826094, 0.004254233951425768), (0.0017618854637007075, 0.005274254448272675), (0.0012828858113027586, 0.005037517050440643), (0.0008592751780888142, 0.0047118802988938), (0.0005073326831298575, 0.004309858231375752), (0.0002405832791353424, 0.003846900330483627), (6.927799610623175e-05, 0.0033407978087195422), (0.0, 0.0028109999021695975), (3.5411605355584186e-05, 0.0022778664472672405), (0.0001741519640382494, 0.0017618854637008186), (0.00041088936187017033, 0.0012828858113032027), (0.0007365261134166801, 0.0008592751780890362)])
    y = elliptic_fourier_descriptors(contour, order=3, normalize=False)
    print(y)
    

    will give the following output :

    [[nan nan nan nan]
     [nan nan nan nan]
     [nan nan nan nan]]
    /usr/local/lib/python3.7/dist-packages/pyefd.py:67: RuntimeWarning: invalid value encountered in true_divide
      a = consts * np.sum((dxy[:, 0] / dt) * d_cos_phi, axis=1)
    /usr/local/lib/python3.7/dist-packages/pyefd.py:68: RuntimeWarning: invalid value encountered in true_divide
      b = consts * np.sum((dxy[:, 0] / dt) * d_sin_phi, axis=1)
    /usr/local/lib/python3.7/dist-packages/pyefd.py:69: RuntimeWarning: invalid value encountered in true_divide
      c = consts * np.sum((dxy[:, 1] / dt) * d_cos_phi, axis=1)
    /usr/local/lib/python3.7/dist-packages/pyefd.py:70: RuntimeWarning: invalid value encountered in true_divide
      d = consts * np.sum((dxy[:, 1] / dt) * d_sin_phi, axis=1)


Any idea how to fix this, or how to work around it?

    opened by ghost 3
  • Descriptors not consistent across cycled contour indices

    Description

    I am trying to create invariant descriptors for the same silhouettes at different rotation angles.

    What I Did

Created rotated copies of the same picture. Ran skimage.measure.find_contours() on them to extract a contour and pyefd.elliptic_fourier_descriptors(normalize=True) on the result.

Expected output: equal, within some margin of error, for differently rotated copies.

Actual output: the result is only sometimes equal.

Unfortunately my code is spread over several source files and depends on data, so I cannot easily share an example of what I am actually doing. But here is a function that, when inserted into tests.py, will result in a failing test:

    def test_normalizing_4():
        contour_2 = np.roll(contour_1[:-1,:], 40, axis=0)
        contour_2 = np.append(contour_2, [contour_2[0]], axis=0)
        c1 = pyefd.elliptic_fourier_descriptors(contour_1, normalize=True)
        c2 = pyefd.elliptic_fourier_descriptors(contour_2, normalize=True)
        np.testing.assert_almost_equal(c1, c2, decimal=12)
    

    The reason for this behaviour is actually mentioned in the original paper in chapter 5.1 and figure 8: For every shape there are two possible classifications, each rotated along one of the two semi-major axes (rotated 180 degrees from each other). It seems like pyefd chooses one of them based on the location of the first point in the contour.

There might be two solutions to this: either return both classifications, or choose one of them (more) consistently by examining the higher harmonic content of the descriptor. Note that the (near-)circular case also exists, as outlined in the paper in chapter 5.2, so returning multiple descriptors and normalisation parameters might be required anyway for contours with rotational symmetry.

    bug enhancement help wanted 
    opened by geloescht 2
Releases(v1.6.0)
  • v1.6.0(Dec 9, 2021)

    Version 1.6.0 (2021-12-09)

    Added

    • Added a demo for 3D surfaces with cylindrical symmetries. (examples/example1.py)

    Fixes

    • Fixes incorrectly plotted curves when no imshow has been called.
    • Fixes ugly coefficient calculation code.
  • v1.5.1(Jan 22, 2021)

    1.5.1 (2021-01-22)

    Added

    • return_transformation keyword on elliptic_fourier_descriptors method. Merged #11. Fixes #5.

    Fixes

    • Documentation correction. Merged #12.

    Removed

    • Removed example script which did not work anymore.
  • v.1.5.1-2(Jan 22, 2021)

    1.5.1 (2021-01-22)

    Added

    • return_transformation keyword on elliptic_fourier_descriptors method. Merged #11. Fixes #5.

    Fixes

    • Documentation correction. Merged #12.

    Removed

    • Removed example script which did not work anymore.
  • v1.4.1(Sep 28, 2020)

  • v0.1.0(Feb 9, 2016)

Owner
Henrik Blidh
Mathematician, Python programmer and Pointless Projecteer.