Python implementation of "Elliptic Fourier Features of a Closed Contour"

Overview

PyEFD

A Python/NumPy implementation of a method for approximating a closed contour with a Fourier series, as described in [1].

Installation

pip install pyefd

Usage

Given a closed contour of a shape, generated by e.g. scikit-image or OpenCV, this package can fit a Fourier series approximating the shape of the contour.

General usage examples

This section describes the general usage patterns of pyefd.

from pyefd import elliptic_fourier_descriptors
coeffs = elliptic_fourier_descriptors(contour, order=10)

The coefficients returned are the a_n, b_n, c_n, and d_n of the following truncated Fourier series representation of the shape:
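
In the notation of [1], with T the total perimeter of the contour, t the accumulated arc length, and N the chosen order, the series is

$$
x_N(t) = A_0 + \sum_{n=1}^{N} \left( a_n \cos\frac{2\pi n t}{T} + b_n \sin\frac{2\pi n t}{T} \right), \qquad
y_N(t) = C_0 + \sum_{n=1}^{N} \left( c_n \cos\frac{2\pi n t}{T} + d_n \sin\frac{2\pi n t}{T} \right).
$$

The returned array has shape (order, 4), one row of (a_n, b_n, c_n, d_n) per harmonic; the constant terms A_0 and C_0 (the locus of the contour) are not part of it.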

The coefficients returned are by default normalized so that they are rotation- and size-invariant. This can be overridden by calling:

from pyefd import elliptic_fourier_descriptors
coeffs = elliptic_fourier_descriptors(contour, order=10, normalize=False)

Normalization can also be done afterwards:

from pyefd import normalize_efd
coeffs = normalize_efd(coeffs)
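
The coefficients can also be turned back into a point set, which is useful as a sanity check. A minimal round-trip sketch, assuming contour is an (N, 2) array as above and a pyefd version that ships reconstruct_contour and plot_efd (both are used in the issue reports further down):

from pyefd import elliptic_fourier_descriptors, reconstruct_contour, plot_efd

coeffs = elliptic_fourier_descriptors(contour, order=10)
# Rebuild a (num_points, 2) point set approximating the original outline.
approximation = reconstruct_contour(coeffs, locus=(0, 0), num_points=300)
# Plot the shape described by the coefficients (requires matplotlib).
plot_efd(coeffs)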

OpenCV example

If you are using OpenCV to generate contours, this example shows how to connect it to pyefd.

import cv2 
import numpy
from pyefd import elliptic_fourier_descriptors

# Find the contours of a binary image using OpenCV.
contours, hierarchy = cv2.findContours(
    im, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)

# Iterate through all contours found and store each contour's
# elliptic Fourier descriptor coefficients.
coeffs = []
for cnt in contours:
    # Find the coefficients of all contours
    coeffs.append(elliptic_fourier_descriptors(
        numpy.squeeze(cnt), order=10))
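
The number of values returned by cv2.findContours differs between OpenCV releases (3.x also returns the modified image), so a version-agnostic variant of the call above can be useful:

# cv2.findContours returns (contours, hierarchy) in OpenCV 2.x/4.x and
# (image, contours, hierarchy) in OpenCV 3.x.
result = cv2.findContours(im, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)
contours = result[0] if len(result) == 2 else result[1]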

Using EFD as features

To use these as features, one can write a small wrapper function:

from pyefd import elliptic_fourier_descriptors

def efd_feature(contour):
    coeffs = elliptic_fourier_descriptors(contour, order=10, normalize=True)
    return coeffs.flatten()[3:]

If the coefficients are normalized, then coeffs[0, 0] = 1.0, coeffs[0, 1] = 0.0 and coeffs[0, 2] = 0.0, so they can be disregarded when using the elliptic Fourier descriptors as features.
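
A hypothetical usage sketch, assuming contours were obtained as in the OpenCV example above:

features = [efd_feature(numpy.squeeze(cnt)) for cnt in contours]
# With order=10, each flattened coefficient array has 40 values; dropping the
# three constant ones leaves a 37-element feature vector per contour.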

See [1] for more technical details.

Testing

Run tests with pytest:

py.test tests.py

The tests use the contour of a single image from the MNIST dataset of handwritten digits ([2]) as test data.

Documentation

See ReadTheDocs.

References

[1]: Frank P Kuhl, Charles R Giardina, Elliptic Fourier features of a closed contour, Computer Graphics and Image Processing, Volume 18, Issue 3, 1982, Pages 236-258, ISSN 0146-664X, http://dx.doi.org/10.1016/0146-664X(82)90034-X.

[2]: LeCun et al. (1999): The MNIST Dataset Of Handwritten Digits

Comments
  • Vectorized contour reconstruction function

    Vectorized contour reconstruction function

    Hope to contribute some more to this project with an extracted contour reconstruction function. Refactored tests accordingly. To compare reconstructed shapes I had to import a reliable Hausdorff distance function, for which SciPy was added to the test requirements.
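
    A sketch of what such a comparison might look like, assuming SciPy's directed_hausdorff and an (N, 2) contour array:

    import pyefd
    from scipy.spatial.distance import directed_hausdorff

    coeffs = pyefd.elliptic_fourier_descriptors(contour, order=10)
    reconstructed = pyefd.reconstruct_contour(coeffs, locus=(0, 0), num_points=len(contour))
    # Symmetric Hausdorff distance between the original and reconstructed points.
    distance = max(directed_hausdorff(contour, reconstructed)[0],
                   directed_hausdorff(reconstructed, contour)[0])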

    opened by reinvantveer 4
  • fix x/y swapping and add demo

    fix x/y swapping and add demo

    Hi,

    I noticed that in some places apparently the x/y dimension was mixed up and I attempted to fix this. As a test and demo, I added a few geometric figures to showcase this method.

    Best regards, Jonathan

    enhancement 
    opened by jonathanschilling 3
  • Method not robust to random index?

    Method not robust to random index?

    Hello,

    I wanted to test your method. I do not really know how it works, but it seems that the order in which the points are indexed matters, as I get strange results when the array is indexed differently. Is there a way to resolve this?

    Find below an illustration of what I mean:

    [image: normal result when the points are correctly ordered]

    [image: abnormal result when the points are randomly ordered]
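
    For context, the descriptors are computed from the polygon traced by consecutive rows of the input array, so the point order matters. A minimal sketch with a hypothetical square contour illustrating this:

    import numpy as np
    from pyefd import elliptic_fourier_descriptors

    # Four corners of a square, listed in traversal order.
    square = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
    # The same points in a random order no longer describe the same outline.
    shuffled = np.random.permutation(square)

    print(elliptic_fourier_descriptors(square, order=3))
    print(elliptic_fourier_descriptors(shuffled, order=3))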

    opened by julienguegan 3
  • Bad reconstruction results

    Bad reconstruction results

    Hi, I am now writing code that reconstructs the image from the EFD coefficients @hbldh

    import cv2
    import numpy as np
    import pyefd

    img_1 = np.array(
        [
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 191, 64, 127, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 0, 0, 0, 127, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 64, 0, 0, 0, 0, 64, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 191, 0, 0, 0, 0, 0, 0, 0, 64, 127, 64, 64, 0, 0, 64, 191, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 191, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 127, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 64, 0, 0, 127, 255, 255, 191, 64, 0, 0, 0, 0, 0, 64, 127, 127, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 191, 0, 0, 0, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 191, 0, 0, 0, 64, 127, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 64, 0, 0, 0, 0, 0, 64, 191, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 127, 64, 0, 0, 0, 0, 64, 191, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 191, 127, 0, 0, 0, 0, 127, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 191, 127, 0, 0, 0, 64, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 0, 0, 0, 191, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 127, 0, 0, 127, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 127, 0, 0, 127, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 127, 191, 255, 255, 255, 255, 127, 0, 0, 0, 191, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 127, 0, 127, 255, 255, 191, 64, 0, 0, 0, 191, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 0, 0, 0, 0, 0, 0, 0, 0, 0, 191, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 127, 0, 0, 0, 0, 0, 0, 64, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 127, 0, 0, 0, 64, 191, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
            [255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255],
        ]
    )
    
    img_1 = np.uint8(img_1)
    edges = cv2.Canny(img_1, 100, 200)
    contour_2 = []

    # Collect the Canny edge pixels (note: this visits them in raster order).
    for i in range(edges.shape[0]):
        for j in range(edges.shape[1]):
            if edges[i, j] == 255:
                contour_2.append([i, j])
    contour_2 = np.array(contour_2)

    cv2.imwrite('test1.png', img_1)

    coeffs = pyefd.elliptic_fourier_descriptors(contour_2, order=10, normalize=False)

    reconstruction = pyefd.reconstruct_contour(coeffs, locus=(0, 0), num_points=300)

    # Draw the reconstructed points into an empty image and save it.
    tmp = np.zeros(img_1.shape, dtype=np.uint8)
    for i in range(reconstruction.shape[0]):
        tmp[int(round(reconstruction[i][0]))][int(round(reconstruction[i][1]))] = 255
    print(tmp.shape)
    cv2.imwrite('test2.png', tmp)
    

    However, the result is not the expected one. How can I fix my code to reconstruct the correct image?

    [attached images: test1, reconstruction of img_1 (test1.png); test2, reconstruction of the edges; test3, reconstruction from coeffs (test2.png)]
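
    For reference, the raster scan over the Canny edge pixels above does not produce points in traversal order, which the descriptors assume. A sketch of one way to obtain an ordered contour for the same image (the threshold value and the use of calculate_dc_coefficients are assumptions):

    # Threshold so the digit becomes foreground, then extract an ordered contour.
    _, binary = cv2.threshold(img_1, 200, 255, cv2.THRESH_BINARY_INV)
    found, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    ordered = np.squeeze(found[0])

    coeffs = pyefd.elliptic_fourier_descriptors(ordered, order=10, normalize=False)
    locus = pyefd.calculate_dc_coefficients(ordered)
    reconstruction = pyefd.reconstruct_contour(coeffs, locus=locus, num_points=300)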

    opened by MADONOKOUKI 2
  • Error: operands could not be broadcast together with shapes (0,1,2) (10,0)

    Error: operands could not be broadcast together with shapes (0,1,2) (10,0)

    Hi, I am passing my contour sequence to your function, following the OpenCV example in your readme file, but I get the following error. What is the reason?

    My code:

    import cv2
    import numpy as np
    from pyefd import elliptic_fourier_descriptors

    def auto_canny(image, sigma=0.33):
        # compute the median of the single channel pixel intensities
        v = np.median(image)
        # apply automatic Canny edge detection using the computed median
        lower = int(max(0, (1.0 - sigma) * v))
        upper = int(min(255, (1.0 + sigma) * v))
        edged = cv2.Canny(image, lower, upper)
        # return the edged image
        return edged

    def efd_feature(contour):
        coeffs = elliptic_fourier_descriptors(contour, order=10, normalize=True)
        return coeffs.flatten()[3:]

    img = cv2.imread('C:/Users/Ogeday/image.jpg')
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    retval, th = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    cv2.imshow("thresholded", th)

    canny = auto_canny(th)

    cv2.imshow("cannied", canny)
    # Find the contours of a binary image using OpenCV.
    contours, hierarchy = cv2.findContours(canny, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    # Iterate through all contours found and store each contour's
    # elliptic Fourier descriptor coefficients.
    coeffs = []
    for cnt in contours:
        # Find the coefficients of all contours
        coeffs.append(elliptic_fourier_descriptors(np.squeeze(cnt), order=10))

    efd = efd_feature(contours)
    print(efd)
    
    opened by OgedayOztekin 2
  • pyefd for 3D points

    pyefd for 3D points

    Hi!

    I wondered if I could use pyefd for generating the contour from 3D data points, where x, y, and z are the coordinates of a generic point. Do you have any suggestions?

    I really appreciate any help you can provide!

    opened by dalbenzioG 1
  • Feature request: normalize_efd function that also outputs angle and scale

    Feature request: normalize_efd function that also outputs angle and scale

    Thank you very much for this beautiful piece of software. For my purposes it would be great to also get the normalization angle and scale, in order to store them alongside the descriptor for future lookups. Would it be possible to have an analogous function to normalize_efd that outputs those values and the normalized descriptor as a tuple?
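
    The 1.5.x release notes further down mention a return_transformation keyword on elliptic_fourier_descriptors that addresses this request. A hedged usage sketch, where the exact contents of the returned tuple are an assumption:

    from pyefd import elliptic_fourier_descriptors

    coeffs, transformation = elliptic_fourier_descriptors(
        contour, order=10, normalize=True, return_transformation=True)
    # transformation is expected to hold the normalization parameters
    # (scale, rotation angle, phase shift) that were applied to the coefficients.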

    enhancement 
    opened by geloescht 1
  • Release/v1.5.0

    Release/v1.5.0

    Version 1.5.0

    Added

    • return_transformation keyword on elliptic_fourier_descriptors method. Merged #11. Fixes #5.

    Fixes

    • Documentation correction. Merged #12.
    opened by hbldh 0
  • Create Dependabot config file

    Create Dependabot config file

    :wave: Dependabot is moving natively into GitHub! This pull request migrates your configuration from Dependabot.com to a config file, using the new syntax. When you merge this pull request, we'll swap out dependabot-preview (me) for a new dependabot app, and you'll be all set!

    With this change, you'll now use the Dependabot page in GitHub, rather than the Dependabot dashboard, to monitor your version updates. Dependabot is now configured exclusively using config files.

    If you've got any questions or feedback for us, please let us know by creating an issue in the dependabot/dependabot-core repository.

    Learn more about the relaunch of Dependabot

    Please note that regular @dependabot commands do not work on this pull request.

    :robot::yellow_heart:

    dependencies 
    opened by dependabot-preview[bot] 0
  • Dependabot couldn't authenticate with https://pypi.python.org/simple/

    Dependabot couldn't authenticate with https://pypi.python.org/simple/

    Dependabot couldn't authenticate with https://pypi.python.org/simple/.

    You can provide authentication details in your Dependabot dashboard by clicking into the account menu (in the top right) and selecting 'Config variables'.

    View the update logs.

    opened by dependabot-preview[bot] 0
  • Dependabot can't resolve your Python dependency files

    Dependabot can't resolve your Python dependency files

    Dependabot can't resolve your Python dependency files.

    As a result, Dependabot couldn't update your dependencies.

    The error Dependabot encountered was:

    ERROR: ERROR: Could not find a version that matches black
    Skipped pre-versions: 18.3a0, 18.3a0, 18.3a1, 18.3a1, 18.3a2, 18.3a2, 18.3a3, 18.3a3, 18.3a4, 18.3a4, 18.4a0, 18.4a0, 18.4a1, 18.4a1, 18.4a2, 18.4a2, 18.4a3, 18.4a3, 18.4a4, 18.4a4, 18.5b0, 18.5b0, 18.5b1, 18.5b1, 18.6b0, 18.6b0, 18.6b1, 18.6b1, 18.6b2, 18.6b2, 18.6b3, 18.6b3, 18.6b4, 18.6b4, 18.9b0, 18.9b0, 19.3b0, 19.3b0
    There are incompatible versions in the resolved dependencies.
    [pipenv.exceptions.ResolutionFailure]:       req_dir=requirements_dir
    [pipenv.exceptions.ResolutionFailure]:   File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 726, in resolve_deps
    [pipenv.exceptions.ResolutionFailure]:       req_dir=req_dir,
    [pipenv.exceptions.ResolutionFailure]:   File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 480, in actually_resolve_deps
    [pipenv.exceptions.ResolutionFailure]:       resolved_tree = resolver.resolve()
    [pipenv.exceptions.ResolutionFailure]:   File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 395, in resolve
    [pipenv.exceptions.ResolutionFailure]:       raise ResolutionFailure(message=str(e))
    [pipenv.exceptions.ResolutionFailure]:       pipenv.exceptions.ResolutionFailure: ERROR: ERROR: Could not find a version that matches black
    [pipenv.exceptions.ResolutionFailure]:       Skipped pre-versions: 18.3a0, 18.3a0, 18.3a1, 18.3a1, 18.3a2, 18.3a2, 18.3a3, 18.3a3, 18.3a4, 18.3a4, 18.4a0, 18.4a0, 18.4a1, 18.4a1, 18.4a2, 18.4a2, 18.4a3, 18.4a3, 18.4a4, 18.4a4, 18.5b0, 18.5b0, 18.5b1, 18.5b1, 18.6b0, 18.6b0, 18.6b1, 18.6b1, 18.6b2, 18.6b2, 18.6b3, 18.6b3, 18.6b4, 18.6b4, 18.9b0, 18.9b0, 19.3b0, 19.3b0
    [pipenv.exceptions.ResolutionFailure]: Warning: Your dependencies could not be resolved. You likely have a mismatch in your sub-dependencies.
      First try clearing your dependency cache with $ pipenv lock --clear, then try the original command again.
     Alternatively, you can use $ pipenv install --skip-lock to bypass this mechanism, then run $ pipenv graph to inspect the situation.
      Hint: try $ pipenv lock --pre if it is a pre-release dependency.
    ERROR: ERROR: Could not find a version that matches black
    Skipped pre-versions: 18.3a0, 18.3a0, 18.3a1, 18.3a1, 18.3a2, 18.3a2, 18.3a3, 18.3a3, 18.3a4, 18.3a4, 18.4a0, 18.4a0, 18.4a1, 18.4a1, 18.4a2, 18.4a2, 18.4a3, 18.4a3, 18.4a4, 18.4a4, 18.5b0, 18.5b0, 18.5b1, 18.5b1, 18.6b0, 18.6b0, 18.6b1, 18.6b1, 18.6b2, 18.6b2, 18.6b3, 18.6b3, 18.6b4, 18.6b4, 18.9b0, 18.9b0, 19.3b0, 19.3b0
    There are incompatible versions in the resolved dependencies.
    
    ['Traceback (most recent call last):\n', '  File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 501, in create_spinner\n    yield sp\n', '  File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 649, in venv_resolve_deps\n    c = resolve(cmd, sp)\n', '  File "/usr/local/.pyenv/versions/3.7.3/lib/python3.7/site-packages/pipenv/utils.py", line 539, in resolve\n    sys.exit(c.return_code)\n', 'SystemExit: 1\n']
    

    If you think the above is an error on Dependabot's side please don't hesitate to get in touch - we'll do whatever we can to fix it.

    You can mention @dependabot in the comments below to contact the Dependabot team.

    opened by dependabot-preview[bot] 0
  • Contour chain approximation

    Contour chain approximation "simple" is buggy or numerically unstable

    Description

    I was running Fourier descriptor extraction on contours that naturally contain long straight lines. I used cv.CHAIN_APPROX_SIMPLE as usual, but was getting weird results, as if the method does not converge:

    [image]

    I tried storing the contour with cv.CHAIN_APPROX_NONE instead, and it fixed the problem for all of my cases:

    [image]

    Minimal setup to reproduce:

    import cv2 as cv
    import numpy as np
    import pyefd
    from matplotlib import pyplot as plt

    img = np.zeros((100, 100), dtype=np.uint8)
    img = cv.rectangle(img, (25, 25), (75, 75), (255, 255, 255), -1)
    cnt, h = cv.findContours(img, cv.RETR_EXTERNAL, cv.CHAIN_APPROX_SIMPLE)
    coeffs = pyefd.elliptic_fourier_descriptors(cnt[0].reshape(-1, 2), order=10, normalize=True)
    pyefd.plot_efd(coeffs)
    plt.show()

    img = np.zeros((100, 100), dtype=np.uint8)
    img = cv.rectangle(img, (25, 25), (75, 75), (255, 0, 0), -1)
    cnt, h = cv.findContours(img, cv.RETR_EXTERNAL, cv.CHAIN_APPROX_NONE)
    coeffs = pyefd.elliptic_fourier_descriptors(cnt[0].reshape(-1, 2), order=10, normalize=True)
    pyefd.plot_efd(coeffs)
    plt.show()
    

    I get: [images of the two resulting plots]

    opened by MikeTkachuk 0
  • RuntimeWarning: invalid value encountered in true_divide

    RuntimeWarning: invalid value encountered in true_divide

    A specific contour leads to a warning and to NaN values due to division by zero.

    from pyefd import elliptic_fourier_descriptors
    import numpy as np
    
    contour = np.array([(0.0007365261134166801, 0.0008592751780890362), (0.0011385481809349507, 0.0005073326831297464), (0.0016015060818268534, 0.00024058327913523136), (0.002107608603590938, 6.927799610623175e-05), (0.002637406510141327, 0.0), (0.003170539965043462, 3.5411605355473164e-05), (0.0036865209486098838, 0.00017415196403836042), (0.0036865209486098838, 0.00017415196403836042), (0.003301593851628093, 0.0011941724608851567), (0.003301593851628093, 0.0011941724608851567), (0.0029920052614881287, 0.001110928245675824), (0.002672125188546981, 0.0010896812824625624), (0.002354246444616681, 0.0011312480801257685), (0.002050584931558297, 0.0012340312499438122), (0.0017728101910231553, 0.001394080892339833), (0.001531596950512193, 0.0016052463893156954), (0.0013362148995842427, 0.001859412769243729), (0.0011941724608850457, 0.0021468125606828314), (0.001110928245675491, 0.0024564011508226846), (0.0010896812824621183, 0.0027762812237640544), (0.0011312480801258795, 0.003094159967693799), (0.001234031249943368, 0.0033978214807524054), (0.001394080892340055, 0.003675596221287547), (0.0016052463893154734, 0.003916809461798509), (0.00185941276924384, 0.004112191512726571), (0.0021468125606826094, 0.004254233951425768), (0.0017618854637007075, 0.005274254448272675), (0.0012828858113027586, 0.005037517050440643), (0.0008592751780888142, 0.0047118802988938), (0.0005073326831298575, 0.004309858231375752), (0.0002405832791353424, 0.003846900330483627), (6.927799610623175e-05, 0.0033407978087195422), (0.0, 0.0028109999021695975), (3.5411605355584186e-05, 0.0022778664472672405), (0.0001741519640382494, 0.0017618854637008186), (0.00041088936187017033, 0.0012828858113032027), (0.0007365261134166801, 0.0008592751780890362)])
    y = elliptic_fourier_descriptors(contour, order=3, normalize=False)
    print(y)
    

    will give the following output :

    [[nan nan nan nan]
     [nan nan nan nan]
     [nan nan nan nan]]
    /usr/local/lib/python3.7/dist-packages/pyefd.py:67: RuntimeWarning: invalid value encountered in true_divide
      a = consts * np.sum((dxy[:, 0] / dt) * d_cos_phi, axis=1)
    /usr/local/lib/python3.7/dist-packages/pyefd.py:68: RuntimeWarning: invalid value encountered in true_divide
      b = consts * np.sum((dxy[:, 0] / dt) * d_sin_phi, axis=1)
    /usr/local/lib/python3.7/dist-packages/pyefd.py:69: RuntimeWarning: invalid value encountered in true_divide
      c = consts * np.sum((dxy[:, 1] / dt) * d_cos_phi, axis=1)
    /usr/local/lib/python3.7/dist-packages/pyefd.py:70: RuntimeWarning: invalid value encountered in true_divide
      d = consts * np.sum((dxy[:, 1] / dt) * d_sin_phi, axis=1)


    Any idea how to fix this, or how to work around it?
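
    For what it is worth, the contour above contains repeated consecutive points, which makes the segment length dt zero in the coefficient formulas. A possible workaround sketch is to drop consecutive duplicates before fitting:

    import numpy as np
    from pyefd import elliptic_fourier_descriptors

    # Keep the first point and every point that differs from its predecessor.
    keep = np.ones(len(contour), dtype=bool)
    keep[1:] = np.any(np.diff(contour, axis=0) != 0, axis=1)
    deduplicated = contour[keep]

    y = elliptic_fourier_descriptors(deduplicated, order=3, normalize=False)
    print(y)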

    opened by ghost 3
  • Descriptors not consistent across cycled contour indices

    Descriptors not consistent across cycled contour indices

    Description

    I am trying to create invariant descriptors for the same silhouettes at different rotation angles.

    What I Did

    Created rotated copies of the same picture. Ran skimage.measure.find_contours() on each to extract a contour, and pyefd.elliptic_fourier_descriptors(normalize=True) on the result.

    Expected output: equal descriptors, within some margin of error, for the differently rotated copies.
    Actual output: the results are only sometimes equal.

    Unfortunately my code is spread over several source files and depends on data, so I cannot easily share an example of what I am actually doing. But here is a function that, when inserted into tests.py, will result in a failed test:

    def test_normalizing_4():
        contour_2 = np.roll(contour_1[:-1,:], 40, axis=0)
        contour_2 = np.append(contour_2, [contour_2[0]], axis=0)
        c1 = pyefd.elliptic_fourier_descriptors(contour_1, normalize=True)
        c2 = pyefd.elliptic_fourier_descriptors(contour_2, normalize=True)
        np.testing.assert_almost_equal(c1, c2, decimal=12)
    

    The reason for this behaviour is actually mentioned in the original paper in chapter 5.1 and figure 8: For every shape there are two possible classifications, each rotated along one of the two semi-major axes (rotated 180 degrees from each other). It seems like pyefd chooses one of them based on the location of the first point in the contour.

    There might be two solutions to this: either return both classifications, or choose one of them (more) consistently by examining the higher harmonic content of the descriptor. Note that the (near-)circular case also exists, as outlined in chapter 5.2 of the paper, so returning multiple descriptors and normalisation parameters might be required anyway for contours with rotational symmetry.

    bug enhancement help wanted 
    opened by geloescht 2
Releases(v1.6.0)
  • v1.6.0(Dec 9, 2021)

    Version 1.6.0 (2021-12-09)

    Added

    • Added a demo for 3D surfaces with cylindrical symmetries. (examples/example1.py)

    Fixes

    • Fixes incorrectly plotted curves when no imshow has been called.
    • Fixes ugly coefficient calculation code.
  • v1.5.1(Jan 22, 2021)

    1.5.1 (2021-01-22)

    Added

    • return_transformation keyword on elliptic_fourier_descriptors method. Merged #11. Fixes #5.

    Fixes

    • Documentation correction. Merged #12.

    Removed

    • Removed example script which did not work anymore.
  • v.1.5.1-2(Jan 22, 2021)

    1.5.1 (2021-01-22)

    Added

    • return_transformation keyword on elliptic_fourier_descriptors method. Merged #11. Fixes #5.

    Fixes

    • Documentation correction. Merged #12.

    Removed

    • Removed example script which did not work anymore.
  • v1.4.1(Sep 28, 2020)

  • v0.1.0(Feb 9, 2016)

Owner
Henrik Blidh
Mathematician, Python programmer and Pointless Projecteer.