Unified Interface for Constructing and Managing Workflows on different workflow engines, such as Argo Workflows, Tekton Pipelines, and Apache Airflow.

Overview


Couler

What is Couler?

Couler aims to provide a unified interface for constructing and managing workflows on different workflow engines, such as Argo Workflows, Tekton Pipelines, and Apache Airflow.

Couler is included in CNCF Cloud Native Landscape and LF AI Landscape.

Who uses Couler?

You can find a list of organizations who are using Couler in ADOPTERS.md. If you'd like to add your organization to the list, please send us a pull request.

Why use Couler?

Many workflow engines exist nowadays, e.g. Argo Workflows, Tekton Pipelines, and Apache Airflow. However, their programming experience varies and they have different levels of abstraction that are often obscure and complex. The code snippets below are examples of constructing workflows with Apache Airflow and Kubeflow Pipelines.

Apache Airflow:

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def create_dag(dag_id,
               schedule,
               dag_number,
               default_args):
    def hello_world_py(*args):
        print('Hello World')

    dag = DAG(dag_id,
              schedule_interval=schedule,
              default_args=default_args)
    with dag:
        t1 = PythonOperator(
            task_id='hello_world',
            python_callable=hello_world_py,
            dag_number=dag_number)
    return dag

for n in range(1, 10):
    dag_id = 'hello_world_{}'.format(str(n))
    default_args = {'owner': 'airflow',
                    'start_date': datetime(2018, 1, 1)}
    globals()[dag_id] = create_dag(dag_id,
                                   '@daily',
                                   n,
                                   default_args)

Kubeflow Pipelines:

import kfp.dsl as dsl
from kfp.dsl import graph_component


class FlipCoinOp(dsl.ContainerOp):
    """Flip a coin and output heads or tails randomly."""
    def __init__(self):
        super(FlipCoinOp, self).__init__(
            name='Flip',
            image='python:alpine3.6',
            command=['sh', '-c'],
            arguments=['python -c "import random; result = \'heads\' if random.randint(0,1) == 0 '
                       'else \'tails\'; print(result)" | tee /tmp/output'],
            file_outputs={'output': '/tmp/output'})

class PrintOp(dsl.ContainerOp):
    """Print a message."""
    def __init__(self, msg):
        super(PrintOp, self).__init__(
            name='Print',
            image='alpine:3.6',
            command=['echo', msg],
        )

# define the recursive operation
@graph_component
def flip_component(flip_result):
    print_flip = PrintOp(flip_result)
    flipA = FlipCoinOp().after(print_flip)
    with dsl.Condition(flipA.output == 'heads'):
        flip_component(flipA.output)

@dsl.pipeline(
    name='pipeline flip coin',
    description='shows how to use graph_component.'
)
def recursive():
    flipA = FlipCoinOp()
    flipB = FlipCoinOp()
    flip_loop = flip_component(flipA.output)
    flip_loop.after(flipB)
    PrintOp('cool, it is over. %s' % flipA.output).after(flip_loop)

Couler provides a unified interface for constructing and managing workflows that offers the following:

  • Simplicity: Unified interface and imperative programming style for defining workflows with automatic construction of directed acyclic graph (DAG).
  • Extensibility: Extensible to support various workflow engines.
  • Reusability: Reusable steps for tasks such as distributed training of machine learning models.
  • Efficiency: Automatic workflow and resource optimizations under the hood.
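
As a quick taste of this unified, imperative style, here is a minimal sketch of a single-step workflow using only the Couler APIs shown in the examples below (couler.run_container, ArgoSubmitter, and couler.run); cluster-specific details such as the namespace depend on your Argo installation.

import couler.argo as couler
from couler.argo_submitter import ArgoSubmitter

# Define a single containerized step; Couler constructs the workflow spec.
couler.run_container(
    image="docker/whalesay:latest",
    command=["cowsay"],
    args=["hello world"],
    step_name="hello",
)

# Submit the generated workflow to the Argo Workflows backend.
submitter = ArgoSubmitter()
couler.run(submitter=submitter)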

Please see the following sections for installation guide and examples.

Installation

  • Couler currently only supports Argo Workflows. Please see instructions here to install Argo Workflows on your Kubernetes cluster.
  • Install Python 3.6+
  • Install Couler Python SDK via the following pip command:
pip install git+https://github.com/couler-proj/couler

Alternatively, you can clone this repository and then run the following to install:

python setup.py install
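
To verify the installation, a quick check is to import the SDK from the Python interpreter (this assumes the package was installed into the active environment):

python -c "import couler.argo as couler; from couler.argo_submitter import ArgoSubmitter; print('Couler SDK imported successfully')"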

Examples

Coin Flip

This example combines the use of a Python function result, along with conditionals, to take a dynamic path in the workflow. In this example, depending on the result of the first step defined in flip_coin(), the template will either run the heads() step or the tails() step.

Steps can be defined via either couler.run_script() for Python functions or couler.run_container() for containers. In addition, the conditional logic that decides which step to run based on the flip result is defined via the combined use of couler.when() and couler.equal().

import couler.argo as couler
from couler.argo_submitter import ArgoSubmitter


def random_code():
    import random

    res = "heads" if random.randint(0, 1) == 0 else "tails"
    print(res)


def flip_coin():
    return couler.run_script(image="python:alpine3.6", source=random_code)


def heads():
    return couler.run_container(
        image="alpine:3.6", command=["sh", "-c", 'echo "it was heads"']
    )


def tails():
    return couler.run_container(
        image="alpine:3.6", command=["sh", "-c", 'echo "it was tails"']
    )


result = flip_coin()
couler.when(couler.equal(result, "heads"), lambda: heads())
couler.when(couler.equal(result, "tails"), lambda: tails())

submitter = ArgoSubmitter()
couler.run(submitter=submitter)
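
If Argo Workflows is installed in a non-default namespace, the submitter can be pointed at it explicitly, as one of the issue reports further down this page also does; a minimal sketch (replace the namespace with your own):

submitter = ArgoSubmitter(namespace="argo")
couler.run(submitter=submitter)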

DAG

This example demonstrates different ways to define the workflow as a directed acyclic graph (DAG) by specifying the dependencies of each task via couler.set_dependencies() and couler.dag(). Please see the code comments for the specific shape of the DAG defined in linear() and diamond().

import couler.argo as couler
from couler.argo_submitter import ArgoSubmitter


def job_a(message):
    couler.run_container(
        image="docker/whalesay:latest",
        command=["cowsay"],
        args=[message],
        step_name="A",
    )


def job_b(message):
    couler.run_container(
        image="docker/whalesay:latest",
        command=["cowsay"],
        args=[message],
        step_name="B",
    )


def job_c(message):
    couler.run_container(
        image="docker/whalesay:latest",
        command=["cowsay"],
        args=[message],
        step_name="C",
    )


def job_d(message):
    couler.run_container(
        image="docker/whalesay:latest",
        command=["cowsay"],
        args=[message],
        step_name="D",
    )

#     A
#    / \
#   B   C
#  /
# D
def linear():
    couler.set_dependencies(lambda: job_a(message="A"), dependencies=None)
    couler.set_dependencies(lambda: job_b(message="B"), dependencies=["A"])
    couler.set_dependencies(lambda: job_c(message="C"), dependencies=["A"])
    couler.set_dependencies(lambda: job_d(message="D"), dependencies=["B"])


#   A
#  / \
# B   C
#  \ /
#   D
def diamond():
    couler.dag(
        [
            [lambda: job_a(message="A")],
            [lambda: job_a(message="A"), lambda: job_b(message="B")],  # A -> B
            [lambda: job_a(message="A"), lambda: job_c(message="C")],  # A -> C
            [lambda: job_b(message="B"), lambda: job_d(message="D")],  # B -> D
            [lambda: job_c(message="C"), lambda: job_d(message="D")],  # C -> D
        ]
    )


linear()
submitter = ArgoSubmitter()
couler.run(submitter=submitter)
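
To submit the diamond-shaped DAG instead, call diamond() in place of linear() before creating the submitter:

diamond()
submitter = ArgoSubmitter()
couler.run(submitter=submitter)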

Note that the current version only works with Argo Workflows, but we are actively working on the design of a unified interface that is extensible to additional workflow engines. Please stay tuned for more updates; we welcome feedback and contributions from the community.

Community Blogs and Presentations

Comments
  • feat: Support all script template fields

    feat: Support all script template fields

    What changes were proposed in this pull request?

    This PR makes it so that script templates inherit from the container template and support all the fields that Argo supports.

    Why are the changes needed?

    We were not able to use the script template as is since a lot of fields were missing.

    Does this PR introduce any user-facing change?

    Yes, the script template signature has additional arguments.

    How was this patch tested?

    We made sure the current tests pass and this use case for script templates is supported.
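
    For illustration, a hypothetical sketch of what this change might enable; the env keyword below is an assumption based on the PR description (script templates inheriting the container template's fields), not a confirmed signature.

    import couler.argo as couler


    def hello():
        import os

        # Read a value that would be injected via the (assumed) env field.
        print("Hello,", os.environ.get("GREETING", "world"))


    # Hypothetical: pass a container-level field (env) to a script template.
    couler.run_script(
        image="python:alpine3.6",
        source=hello,
        env={"GREETING": "Couler"},
    )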

    opened by rushtehrani 12
  • fix: Fix issue with volume mount

    fix: Fix issue with volume mount

    What changes were proposed in this pull request?

    fix Issue #193

    Why are the changes needed?

    fix Issue #193

    Does this PR introduce any user-facing change?

    No

    How was this patch tested?

    volume_test.py

    opened by peiniliu 8
  • feat: removing stray yaml dump

    feat: removing stray yaml dump

    What changes were proposed in this pull request?

    This fixes https://github.com/couler-proj/couler/issues/248

    In that issue, @merlintang mentions this section can be removed.

    Why are the changes needed?

    Since this code is not contained in a method, it ends up being called outside of the context of couler/argo.

    Does this PR introduce any user-facing change?

    Only the reduction of output.

    How was this patch tested?

    Removed this code, raised an exception, and didn't get the YAML dump.

    opened by dmerrick 7
  • feat: Add image pull secret support for python src

    feat: Add image pull secret support for python src

    What changes were proposed in this pull request?

    Add an image pull secret config for the workflow.

    Why are the changes needed?

    Because our project needs to pull images from a private hub.

    Does this PR introduce any user-facing change?

    I think no; it is a new feature.

    How was this patch tested?

    Added one unit test, and our project uses this code to pull images.

    opened by lcgash 7
  • Install instructions don't work in PyCharm `venv`

    Install instructions don't work in PyCharm `venv`

    Summary

    I'm working with a PyCharm-generated venv (3.8.5), and running the install scripts supplied here doesn't work. The folder in site-packages has no real code in it.

    When I run it from a normal terminal, that is Python 3.8.5 sans venv, it installs an egg. I don't think this is expected behaviour - I was expecting a wheel...

    Diagnostics

    macOS Big Sur 11.3.1, latest repository version 0.1.1rc8.

    bug 
    opened by moshewe 7
  • [RFC] Add initial design doc for core Couler API

    [RFC] Add initial design doc for core Couler API

    Hi community,

    We are opening this PR to share our thoughts on supporting multiple workflow engines. We'd appreciate any feedback and suggestions. In addition, if you are interested in contributing either a new backend or functionalities of the existing backend, please let us know in this pull request.

    opened by terrytangyuan 7
  • fix: Relax dependency version pinning.

    fix: Relax dependency version pinning.

    What changes were proposed in this pull request?

    To avoid dependency version conflicts with other libraries, pin dependencies to version ranges rather than exact versions.

    Why are the changes needed?

    Let me know if you disagree, but in general I think library code should try to avoid pinning to exact versions. Otherwise it can get very tricky to resolve conflicts for non-trivial projects.

    Does this PR introduce any user-facing change?

    It's possible that there are breaking changes in dependencies that I don't know about! I'll wait for tests to run and see if anything breaks.

    How was this patch tested?

    See above: this change shouldn't require new tests.

    opened by jmcarp 6
  • container volumeMounts cannot mount an existing PVC.

    container volumeMounts cannot mount an existing PVC.

    Summary

    What happened/what you expected to happen?

    We expect the container to use the existing volume defined inside the workflow, such as in volumes-existing.yaml.

    example for testing:

    import os
    
    import couler.argo as couler
    from couler.argo_submitter import ArgoSubmitter
    from couler.core.templates.volume import VolumeMount, Volume
    
    couler.add_volume(Volume("apppath", "mnist"))
    
    mount = VolumeMount("apppath", "/data/")
    command = ["ls", mount.mount_path]
    
    couler.run_container(
        image="alpine:3.12.0", command=command, volume_mounts=[mount]
    )
    
    submitter = ArgoSubmitter(namespace="testagent")
    couler.run(submitter=submitter)
    

    OrderedDict([('apiVersion', 'argoproj.io/v1alpha1'), ('kind', 'Workflow'), ('metadata', {'generateName': 'runpy-'}), ('spec', {'entrypoint': 'runpy', 'volumes': [OrderedDict([('name', 'apppath'), ('persistentVolumeClaim', {'claimName': 'mnist'})])], 'templates': [{'name': 'runpy', 'steps': [[OrderedDict([('name', 'module-3418'), ('template', 'module')])]]}, OrderedDict([('name', 'module'), ('container', OrderedDict([('image', 'alpine:3.12.0'), ('command', ['ls', '/data/']), ('volumeMounts', [OrderedDict([('name', 'apppath'), ('mountPath', '/data/')])])])), (**'volumes', [{'name': 'apppath', 'emptyDir': {}}**])])]})])

    This raises the problem that the volumeMounts inside the container resolve to the volumes defined on the container ('emptyDir') rather than the volumes defined on the workflow (PVC).

    The reason is this code, which automatically generates volumes for the volumeMounts inside the container:

    https://github.com/couler-proj/couler/blob/d20f874882e55c5e3aa53ffaf78670f6b4d314a0/couler/core/templates/container.py#L146-L153

    After removing the automatically generated 'emptyDir: {}', the volumeMount points to the correct volume definition inside the workflow.

    OrderedDict([('apiVersion', 'argoproj.io/v1alpha1'), ('kind', 'Workflow'), ('metadata', {'generateName': 'runpy-'}), ('spec', {'entrypoint': 'runpy', 'volumes': [OrderedDict([('name', 'apppath'), ('persistentVolumeClaim', {'claimName': 'mnist'}**)])], 'templates': [{'name': 'runpy', 'steps': [[OrderedDict([('name', 'module-3418'), ('template', 'module')])]]}, OrderedDict([('name', 'module'), ('container', OrderedDict([('image', 'alpine:3.12.0'), ('command', ['ls', '/data/']), ('volumeMounts', [OrderedDict([('name', 'apppath'), ('mountPath', '/data/')])])])), ('volumes', [])**])]})])

    Diagnostics

    What is the version of Couler you are using?

    latest v0.1.1rc8

    What is the version of the workflow engine you are using?

    argo: v3.0.0-rc3

    Any logs or other information that could help debugging?


    Message from the maintainers:

    Impacted by this bug? Give it a 👍. We prioritize the issues with the most 👍.

    bug good first issue 
    opened by peiniliu 6
  • fix: Support multiple function arguments in couler.map()

    fix: Support multiple function arguments in couler.map()

    What changes were proposed in this pull request?

    Added changes to accept multiple function arguments in couler.map(), continuing the fix from #169 and adding a test function for the modifications.

    I had to start again from scratch. Reason: the previous idea pretty much just looped the arguments in map():

    return map(map(function, input_list), *other)

    But the return value is a Step class, not a function, so the first map() will succeed but not the second or third, because map() checks the function first.

    inner_step = Step(name=inner_dict["id"], template=template_name)

    return inner_step #32 issue

    Why are the changes needed?

    #32 issue

    Does this PR introduce any user-facing change?

    No.

    How was this patch tested?

    A test was created for it, similar to the one-argument test.
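
    For context, a hypothetical usage sketch of the multi-argument form this PR targets; only the single-list form couler.map(function, input_list) appears elsewhere on this page, so the multi-list call below is an assumption taken from the PR description, not confirmed API.

    import couler.argo as couler


    def greet(greeting, name):
        # A two-argument step; each pair of inputs becomes one mapped task.
        return couler.run_container(
            image="docker/whalesay:latest",
            command=["cowsay"],
            args=[greeting, name],
        )


    # Hypothetical multi-argument call (signature assumed from the PR description).
    couler.map(greet, ["hello", "hi"], ["alice", "bob"])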

    opened by nooraangelva 5
  • fix: Volume_claim to dynamic

    fix: Volume_claim to dynamic

    What changes were proposed in this pull request?

    The proposal makes the VolumeClaimTemplate's size and accessModes dynamic by letting the user input them so they are not hardcoded.

    Why are the changes needed?

    #210. While working on a workflow, I noticed that it required the accessModes to be ReadWriteMany, not ReadWriteOnce. Since ReadWriteOnce was hardcoded, I thought it would be better if the workflow creator could set it as needed. The same applies to the size.

    Does this PR introduce any user-facing change?

    Users would have to write 3 arguments to the VolumeClaimTemplate instead of one.

    Previous version: volume = VolumeClaimTemplate("workdir")

    Updated version: volume = VolumeClaimTemplate("workdir", ['ReadWriteMany'], '1Gi')

    How was this patch tested?

    The testing was done by following Couler's instructions.

    I added a new scripts/integration_tests.Unix.sh because I encountered the following problem; the solution is in the link as well.

    opened by nooraangelva 5
  • How to specify securityContext

    How to specify securityContext

    The k8s cluster I deploy to has a pod security policy and requires that Argo workflows have the following top-level securityContext:

    apiVersion: argoproj.io/v1alpha1
    kind: Workflow
    metadata:
      generateName: main-
    spec:
      securityContext:
         fsGroup: 2000
         runAsNonRoot: true
         runAsUser: 1000
    ...
    

    How can I specify that via couler? I couldn't find anything in the docs.

    opened by kodeninja 5
  • Kubernetes API exception 404

    Kubernetes API exception 404

    Hi, I am trying out one of the examples using Argo Workflows and I am getting a Kubernetes API exception. Please find the logs:

    (venv) (base) [email protected] pythonProject1 % python main.py
    INFO:root:Argo submitter namespace: argo
    INFO:root:Found local kubernetes config. Initialized with kube_config.
    INFO:root:Checking workflow name/generatedName main-
    INFO:root:Submitting workflow to Argo
    ERROR:root:Failed to submit workflow
    Traceback (most recent call last):
      File "main.py", line 9, in <module>
        result = couler.run(submitter=submitter)
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/couler/argo.py", line 73, in run
        res = submitter.submit(wf, secrets=secrets)
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/couler/argo_submitter.py", line 151, in submit
        return self._create_workflow(workflow_yaml)
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/couler/argo_submitter.py", line 174, in _create_workflow
        raise e
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/couler/argo_submitter.py", line 158, in _create_workflow
        response = self._custom_object_api_client.create_namespaced_custom_object(  # noqa: E501
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/kubernetes/client/api/custom_objects_api.py", line 225, in create_namespaced_custom_object
        return self.create_namespaced_custom_object_with_http_info(group, version, namespace, plural, body, **kwargs)  # noqa: E501
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/kubernetes/client/api/custom_objects_api.py", line 344, in create_namespaced_custom_object_with_http_info
        return self.api_client.call_api(
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/kubernetes/client/api_client.py", line 348, in call_api
        return self.__call_api(resource_path, method,
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/kubernetes/client/api_client.py", line 180, in __call_api
        response_data = self.request(
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/kubernetes/client/api_client.py", line 391, in request
        return self.rest_client.POST(url,
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/kubernetes/client/rest.py", line 275, in POST
        return self.request("POST", url,
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/kubernetes/client/rest.py", line 234, in request
        raise ApiException(http_resp=r)
    kubernetes.client.exceptions.ApiException: (404)
    Reason: Not Found
    HTTP response headers: HTTPHeaderDict({'Audit-Id': '47ae3075-3530-4e82-b495-fc359d544f51', 'Cache-Control': 'no-cache, private', 'Content-Type': 'text/plain; charset=utf-8', 'X-Content-Type-Options': 'nosniff', 'X-Kubernetes-Pf-Flowschema-Uid': '52efdd2c-fc48-4b6a-a014-0f20cb376b7f', 'X-Kubernetes-Pf-Prioritylevel-Uid': '4dbba312-e8f6-4b16-97b2-90fe3b9d7dc0', 'Date': 'Tue, 26 Jul 2022 06:20:02 GMT', 'Content-Length': '19'})
    HTTP response body: 404 page not found

    bug 
    opened by smetal1 0
  • Documentation: Difference to other meta workflow engines

    Documentation: Difference to other meta workflow engines

    Couler is a meta workflow engine, but I know of at least ZenML and Kedro, which are meta workflow engines as well. While these other two present themselves primarily as machine learning tools, it seems to me they are generally usable for any type of work. What is Couler aiming to do differently from ZenML and/or Kedro?

    opened by Make42 3
  • Need additional clarification/examples around using set_dependencies+map

    Need additional clarification/examples around using set_dependencies+map

    Summary

    I'm confused about how to properly use dependencies. Let's say I have a workflow with 4 groups of steps (A, B, C, D) and each has multiple subtasks that can happen in parallel (A1, A2, ..., B1, B2, ...). Currently, I'm adding all the A steps using couler.map, then adding all the B steps with couler.map, etc. This correctly parallelizes across A1, A2, ..., but none of the B steps start until all the A steps have completed, despite the fact that I never explicitly set dependencies.

    In this case, I want A and B to run in parallel, then C then D. Having this run sequentially as A, B, C, D is technically correct, but not ideally performant. However, given that I'm not setting dependencies, and they're still running sequentially, I feel like using the set_dependencies function wouldn't help. Also, when I tried to use the set_dependencies function, the couler code errored on parsing its own generated yaml due to duplicate anchor definitions. Would definitely like to see a more in-depth example than those currently present in the README which shows how to properly use set_dependencies in combination with functions like map.

    Use Cases

    Mostly explained above.


    Message from the maintainers:

    Impacted by this bug? Give it a 👍. We prioritize the issues with the most 👍.

    enhancement 
    opened by varunm22 2
  • Revert

    Revert "home brew" code to official Python client

    Summary

    Change custom code to use the official Python API

    Use Cases

    It's difficult developing new features and onboarding new developers to the codebase, as the underlying structures don't follow the (well-documented) official Python API.


    Message from the maintainers:

    Impacted by this bug? Give it a 👍. We prioritize the issues with the most 👍.

    enhancement 
    opened by moshewe 6
  • Explicit parameter passing between steps

    Explicit parameter passing between steps

    Summary

    Example:

    out1 = couler.create_parameter_artifact(path="/mnt/test.txt")
    out2 = couler.create_parameter_artifact(path="/mnt/test2.txt")
    
    
    def producer(name):
        return couler.run_container(
            image="alpine:3.6", command=["sh", "-c", 'echo "test" > /mnt/test.txt']
            , step_name=name, output=[out1, out2]
        )
    
    
    def consumer(name):
        inputs = couler.get_step_output(step_name="1")
        return couler.run_container(
            image="alpine:3.6", command=["sh", "-c", 'cat /mnt/test.txt']
            , step_name=name, args=[inputs[0]],
        )
    
    
    couler.set_dependencies(lambda: producer("1"), dependencies=None)
    couler.set_dependencies(lambda: consumer("2"), dependencies=["1"])
    

    Now arguments in consumer template look like this:

    arguments:
      parameters:
        - name: para-2-0
          value: "{{tasks.1.outputs.parameters.output-id-15}}"
        - name: para-2-1
          value: "{{tasks.1.outputs.parameters.output-id-15}}"
        - name: para-2-2
          value: "{{tasks.1.outputs.parameters.output-id-16}}"
    

    It would be useful if there were a way to set dependencies between steps without implicit parameter passing.

    For myself, I just added a flag to run_container that turns off this behavior.

    Use Cases

    I have one parent step that generates a couple of outputs and multiple child steps, each of which only needs a proper subset of the parent outputs; the rest of the information would be redundant.


    Message from the maintainers:

    Impacted by this bug? Give it a 👍. We prioritize the issues with the most 👍.

    enhancement 
    opened by hcnt 1
  • Example for usage of secret

    Example for usage of secret

    Summary

    What change needs making? An example script showing how to use a created secret successfully. I tried it myself using the following code from the tests; it creates the secret successfully but fails to echo the secret's value.

    Use Cases

    When would you use this? When I need to use a secret, for example for authentication.

    Message from the maintainers:

    Impacted by this bug? Give it a 👍. We prioritize the issues with the most 👍.

    enhancement good first issue help wanted 
    opened by nooraangelva 2
Releases
  • v0.1.1rc8-stable(Apr 12, 2021)

    This release includes compatibility fixes for different protobuf versions, as well as a fix for an unnecessarily raised exception when using /tmp as the mount path. It also introduces support for workflow memoization caches.

    List of changes since the last release can be found here.

  • v0.1.1rc8(Mar 23, 2021)

  • v0.1.1rc7(Dec 2, 2020)

  • v0.1.1rc6(Sep 25, 2020)

    This release includes several bug fixes and enhancements. Below are some of the notable changes:

    • Bump the dependency of Argo Python client to v3.5.1 and re-enable Argo Workflow spec validation.
    • Fix incorrect ApiException import path for Kubernetes Python client with version 11.0.0 and above.
    • Support any callable for Couler core APIs instead of only types.FunctionType as before.
    • Switch to use Argo Workflows v2.10.2 for integration tests.
  • v0.1.1rc5(Sep 15, 2020)

Owner
Couler Project
Unified Interface for Constructing and Managing Workflows