✨️🐍 SPARQL endpoint built with RDFLib to serve machine learning models, or any other logic implemented in Python

Overview

SPARQL endpoint for RDFLib


rdflib-endpoint is a SPARQL endpoint based on an RDFLib Graph, built to easily serve machine learning models, or any other logic implemented in Python, via custom SPARQL functions.

It aims to enable Python developers to easily deploy functions that can be queried in a federated fashion using SPARQL. For example: using a Python function to resolve labels for specific identifiers, or running a classifier on entities retrieved via a SERVICE query to another SPARQL endpoint.
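At its core, the label-resolution idea is just a plain Python lookup that can later be wrapped as a custom SPARQL function. A minimal sketch (the identifiers and labels here are made up for illustration):

```python
# Hypothetical lookup table mapping identifiers to human-readable labels
LABELS = {
    "Q146": "house cat",
    "Q42": "Douglas Adams",
}

def resolve_label(identifier: str) -> str:
    """Return the label for an identifier, falling back to the identifier itself."""
    return LABELS.get(identifier, identifier)

print(resolve_label("Q146"))  # house cat
print(resolve_label("Q999"))  # Q999 (no label known, falls back)
```

The same function body can then be exposed as a custom SPARQL function so that remote endpoints can call it through a SERVICE clause.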

🧑‍🏫 How it works

The user defines and registers custom SPARQL functions in Python, and/or populates the RDFLib Graph; the endpoint is then deployed using the FastAPI framework.

The deployed SPARQL endpoint can be used as a SERVICE in a federated SPARQL query from regular triplestore SPARQL endpoints. Tested with OpenLink Virtuoso and Ontotext GraphDB (RDF4J-based). The endpoint is CORS enabled.
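As a sketch, a triplestore could call the custom function defined in the example further below through a federated query like this (the endpoint URL is a placeholder):

```
PREFIX myfunctions: <https://w3id.org/um/sparql-functions/>
SELECT ?concat WHERE {
    SERVICE <https://your-endpoint-url/sparql> {
        BIND(myfunctions:custom_concat("First", "last") AS ?concat)
    }
}
```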

Built with RDFLib and FastAPI. Tested with Python 3.7, 3.8, and 3.9.

Please create an issue, or send a pull request, if you are facing problems or would like to see a feature implemented.

📥 Install the package

Install the package from PyPI:

pip install rdflib-endpoint

🐍 Define custom SPARQL functions

Check out the example folder for a complete working app to get started, including a Docker deployment. The easiest way to create a new SPARQL endpoint is to copy this example folder and build from it.

Create an app/main.py file in your project folder with your functions and endpoint parameters:

from rdflib_endpoint import SparqlEndpoint
import rdflib
from rdflib.plugins.sparql.evalutils import _eval

def custom_concat(query_results, ctx, part, eval_part):
    """Concatenate two strings in both orders, and return each length as an additional Length variable"""
    argument1 = str(_eval(part.expr.expr[0], eval_part.forget(ctx, _except=part.expr._vars)))
    argument2 = str(_eval(part.expr.expr[1], eval_part.forget(ctx, _except=part.expr._vars)))
    evaluation = []
    scores = []
    evaluation.append(argument1 + argument2)
    evaluation.append(argument2 + argument1)
    scores.append(len(argument1 + argument2))
    scores.append(len(argument2 + argument1))
    # Append the results for our custom function
    for i, result in enumerate(evaluation):
        query_results.append(eval_part.merge({
            part.var: rdflib.Literal(result), 
            rdflib.term.Variable(part.var + 'Length'): rdflib.Literal(scores[i])
        }))
    return query_results, ctx, part, eval_part

# Start the SPARQL endpoint based on a RDFLib Graph and register your custom functions
g = rdflib.graph.ConjunctiveGraph()
app = SparqlEndpoint(
    graph=g,
    # Register the functions:
    functions={
        'https://w3id.org/um/sparql-functions/custom_concat': custom_concat
    },
    # CORS enabled by default
    cors_enabled=True,
    # Metadata used for the service description and Swagger UI:
    title="SPARQL endpoint for RDFLib graph", 
    description="A SPARQL endpoint to serve machine learning models, or any other logic implemented in Python. \n[Source code](https://github.com/vemonet/rdflib-endpoint)",
    version="0.1.0",
    public_url='https://your-endpoint-url/sparql',
    # Example queries displayed in the Swagger UI to help users try your function
    example_query="""Example query:\n
```
PREFIX myfunctions: <https://w3id.org/um/sparql-functions/>
SELECT ?concat ?concatLength WHERE {
    BIND("First" AS ?first)
    BIND(myfunctions:custom_concat(?first, "last") AS ?concat)
}
```"""
)

🦄 Run the SPARQL endpoint

To quickly get started, you can run the FastAPI server from the example folder with uvicorn on http://localhost:8000:

cd example
uvicorn main:app --reload --app-dir app

Check out example/README.md for more details, such as deploying it with Docker.
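Once the server is running, queries can be sent over HTTP. A minimal sketch of building a GET request URL for the example query above, assuming the endpoint is running locally on the /sparql path (the format=json parameter mirrors the server log shown in the issues below):

```python
from urllib.parse import urlencode

# The example query from above, calling the custom_concat function
query = """PREFIX myfunctions: <https://w3id.org/um/sparql-functions/>
SELECT ?concat ?concatLength WHERE {
    BIND("First" AS ?first)
    BIND(myfunctions:custom_concat(?first, "last") AS ?concat)
}"""

# urlencode percent-encodes the query so special characters survive the trip
url = "http://localhost:8000/sparql?" + urlencode({"query": query, "format": "json"})
print(url)
```

The resulting URL can be fetched with any HTTP client; the endpoint returns SPARQL JSON results.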

🧑‍💻 Development

📥 Install for development

Install from the latest GitHub commit to make sure you have the latest updates:

pip install "rdflib-endpoint@git+https://github.com/vemonet/rdflib-endpoint.git@main"

Or clone and install locally for development:

git clone https://github.com/vemonet/rdflib-endpoint
cd rdflib-endpoint
pip install -e .

You can use a virtual environment to avoid conflicts:

# Create the virtual environment folder in your workspace
python3 -m venv .venv
# Activate it using a script in the created folder
source .venv/bin/activate

✅️ Run the tests

Install additional dependencies:

pip install pytest requests

Run the tests locally (from the root folder):

pytest -s

📂 Projects using rdflib-endpoint

Here are some projects using rdflib-endpoint to deploy custom SPARQL endpoints with Python:

Comments
  • rdflib.plugins.sparql.CUSTOM_EVALS

    rdflib.plugins.sparql.CUSTOM_EVALS["exampleEval"] = customEval

    I would like to know if it is possible to implement this kind of custom evaluation function, such as in this example: Source code for examples.custom_eval

    Also, do you know if this would work in a federated query from Wikidata or Wikibase? Thanks.

    enhancement 
    opened by rchateauneu 3
  • Fix error when dealing with AND operator (&&)

    Hello, there is an error when dealing with the AND operator (&&). parse_qsl incorrectly parses the request body, misinterpreting && as new parameters. I suggest unquoting the data later on. Best regards, Ba Huy

    opened by tbhuy 2
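The parse_qsl behaviour described in this issue can be reproduced with the standard library alone (the query string below is just an illustration):

```python
from urllib.parse import parse_qsl, quote

sparql = "SELECT * WHERE { FILTER(?a && ?b) }"

# Unencoded body: parse_qsl splits on "&", so "&&" truncates the query
broken = dict(parse_qsl("query=" + sparql))
print(broken["query"])  # 'SELECT * WHERE { FILTER(?a '

# Percent-encoding the query before sending keeps it intact
fixed = dict(parse_qsl("query=" + quote(sparql)))
print(fixed["query"])   # 'SELECT * WHERE { FILTER(?a && ?b) }'
```

This is why clients should always percent-encode the query parameter, and why the fix unquotes the data only after the body has been split.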
  • Should be able to use prefixes bound in graph's NamespaceManager

    Prefixes can be bound to a graph using the NamespaceManager. This makes them globally available, when loading and serializing as well as when querying.

    rdflib-endpoint, however, requires prefixes to be declared in the query input even for those bound to the graph, because prepareQuery is not aware of the graph.

    It would be convenient if the SPARQL endpoint could use the globally bound prefixes. My suggestion would be to drop the prepareQuery completely. I actually don't see why it is there in the first place, given that the prepared query is never used later on.

    opened by Natureshadow 0
  • SPARQL query containing 'coalesce' returns no result on rdflib-endpoint.SparqlEndpoint()

    Hello!

    My setup:

    I define a graph, g = Graph(), then I g.parse() a number of .ttl files and create the endpoint with SparqlEndpoint(graph=g). Then I use uvicorn.run(app, ...) to expose the endpoint on my local machine.

    I can successfully run this simple sparql statement to query for keywords:

    PREFIX dcat: <http://www.w3.org/ns/dcat#>
    SELECT 
    ?keyword
    WHERE { ?subj dcat:keyword ?keyword }
    

    However, as soon as I add a coalesce statement, the query does not return results any more:

    PREFIX dcat: <http://www.w3.org/ns/dcat#>
    SELECT 
    ?keyword
    (coalesce(?keyword, "xyz") as ?foo) 
    WHERE { ?subj dcat:keyword ?keyword }
    

    I tried something similar on wikidata which has no problems:

    SELECT ?item ?itemLabel 
    (coalesce(?itemLabel, 2) as ?foo)
    WHERE 
    {
      ?item wdt:P31 wd:Q146. # Must be of a cat
      SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". } 
    }
    

    The server debug output for the keywords query looks ok to me.

    INFO:     127.0.0.1:43942 - "GET /sparql?format=json&query=%0APREFIX+dcat%3A+%3Chttp%3A%2F%2Fwww.w3.org%2Fns%2Fdcat%23%3E%0ASELECT+%0A%3Fkeyword%0A%28coalesce%28%3Fkeyword%2C+%22xyz%22%29+as+%3Ffoo%29+%0AWHERE+%7B+%3Fsubj+dcat%3Akeyword+%3Fkeyword+%7D%0ALIMIT+10%0A HTTP/1.1" 200 OK
    

    Putting COALESCE in the WHERE clause does not solve the problem:

    PREFIX dcat: <http://www.w3.org/ns/dcat#>
    SELECT
    ?keyword
    ?foo
    WHERE {
    ?subj dcat:keyword ?keyword
    BIND(coalesce(?keyword, 2) as ?foo)
    }
    

    Am I missing something?

    Thanks in advance

    opened by byte-for-byte 4
  • Construct query

    Using rdflib-endpoint more frequently, I ran into an issue with a SPARQL CONSTRUCT query.

    SPARQL without prefixes:

    CONSTRUCT {
      ?work dcterms:date ?jaar .
    }
    WHERE {
    {  select ?work (min(?year) as ?jaar)
        where {
              ?work dc:date ?year .
        }
        group by ?work }
    }
    

    The correct tuples are selected but no triples are created. Using rdflib directly does give the correct triples.

    opened by RichDijk 1
Releases
  • 0.2.6(Dec 19, 2022)

    Changelog

    • YASGUI now uses the provided example query as the default query

    Full Changelog: https://github.com/vemonet/rdflib-endpoint/compare/0.2.5...0.2.6

  • 0.2.5(Dec 19, 2022)

  • 0.2.4(Dec 19, 2022)

    Changelog

    • Add support for rdflib.Graph using store="Oxigraph" (oxrdflib package https://github.com/oxigraph/oxrdflib), related to https://github.com/vemonet/rdflib-endpoint/issues/4
    • Added a CLI option for defining the store

    Full Changelog: https://github.com/vemonet/rdflib-endpoint/compare/0.2.3...0.2.4

  • 0.2.3(Dec 19, 2022)

  • 0.2.2(Dec 19, 2022)

  • 0.2.1(Dec 19, 2022)

  • 0.2.0(Dec 19, 2022)

    Changelog

    • Migrate from setup.py to pyproject.toml with a hatch build backend and src/ layout
    • Added types
    • Check for strict type compliance using mypy
    • Added tests for python 3.10 and 3.11
    • Added CITATION.cff file and pre-commit hooks
    • Merged pull request https://github.com/vemonet/rdflib-endpoint/pull/2 "Fix error when dealing with AND operator (&&)"

    Full Changelog: https://github.com/vemonet/rdflib-endpoint/compare/0.1.6...0.2.0

  • 0.1.6(Feb 14, 2022)

  • 0.1.5(Jan 10, 2022)

  • 0.1.4(Dec 15, 2021)

  • 0.1.3(Dec 14, 2021)

  • 0.1.2(Dec 14, 2021)

    rdflib-endpoint, a library to quickly deploy a SPARQL endpoint based on an RDFLib Graph, optionally with custom functions defined in Python. Supports SELECT, CONSTRUCT, DESCRIBE, and ASK queries. No support for INSERT in RDFLib currently.

    This SPARQL endpoint can perform federated SERVICE queries, and can be queried through a SERVICE query from another SPARQL endpoint (tested with endpoints based on RDF4J Sail and Jena, as well as Virtuoso).

    Changelog:

    • Added a CLI feature to quickly start a SPARQL endpoint based on a local RDF file: rdflib-endpoint server your-file.nt

    Full Changelog: https://github.com/vemonet/rdflib-endpoint/compare/0.1.1...0.1.2

  • 0.1.1(Dec 8, 2021)

    Changelog:

    • Minor improvements to tests and workflows
  • 0.1.0(Dec 7, 2021)

    First release of rdflib-endpoint, a library to quickly deploy a SPARQL endpoint based on an RDFLib Graph, optionally with custom functions defined in Python.

    Supports SELECT, CONSTRUCT, DESCRIBE, and ASK queries. No support for INSERT in RDFLib currently.

    This SPARQL endpoint can perform federated SERVICE queries, and can be queried through a SERVICE query from another SPARQL endpoint (tested with endpoints based on RDF4J Sail and Jena, as well as Virtuoso).
