Simple to use local JSON database 🦉
# This is pure Python, not specific to pylowdb ;)
db.data['posts'].append({ 'id': 1, 'title': 'pylowdb is awesome' })
# Save to file
db.write()
# db.json
{
  "posts": [
    { "id": 1, "title": "pylowdb is awesome" }
  ]
}
- Lightweight
- Minimalist and easy to learn API
- Query and modify data using plain Python
- Atomic write
- Hackable:
  - Change storage, file format (JSON, YAML, ...) or add encryption via adapters
pip install pylowdb
import os
from os import path
from pylowdb import (
    Low,
    JSONFile,
)
# Use JSON file for storage
file = path.join(os.getcwd(), 'db.json')
adapter = JSONFile(file)
db = Low(adapter)
# Read data from the JSON file; this sets the content of db.data
db.read()
# If db.json doesn't exist, db.data will be None
# Set default data
db.data = db.data or { 'posts': [] }
# Create and query items using plain Python
db.data['posts'].append('hello world')
db.data['posts'][0]  # 'hello world'
# You can also use this syntax if you prefer
posts = db.data['posts']
posts.append('hello world')
# Write db.data content to db.json
db.write()
# db.json
{
  "posts": [ "hello world" ]
}
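Because db.data is a plain Python structure, standard Python is all you need to query it. A small sketch (the db_data dict below is a hypothetical stand-in for db.data):

```python
# db.data is just a dict/list structure, so comprehensions and
# generator expressions work directly on it.
db_data = {'posts': [
    {'id': 1, 'title': 'pylowdb is awesome'},
    {'id': 2, 'title': 'hello world'},
]}

# Find a single post by id (None if no match)
post = next((p for p in db_data['posts'] if p['id'] == 2), None)

# Collect all titles containing a word
titles = [p['title'] for p in db_data['posts'] if 'awesome' in p['title']]
```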
For more examples, see the examples/ directory.
Pylowdb exposes the Low class, which works with synchronous adapters.
from pylowdb import (
    Low,
    JSONFile,
)
db = Low(JSONFile('file.json'))
db.read()
db.write()
db.read() calls adapter.read() and sets db.data.
Note: the JSONFile adapter will set db.data to None if the file doesn't exist.
db.data # is None
db.read()
db.data # is not None
db.write() calls adapter.write(db.data).
db.data = { 'posts': [] }
db.write() # file.json will be {"posts": []}
db.data = {}
db.write() # file.json will be {}
db.data holds your database content. If you're using the adapters bundled with pylowdb, it can be any type supported by json.dumps.
For example:
db.data = 'string'
db.data = [1, 2, 3]
db.data = { 'key': 'value' }
Adapter for reading and writing JSON files.
Low(JSONFile(filename))
In-memory adapter. Useful for speeding up unit tests.
Low(Memory())
Adapter for reading and writing YAML files.
Low(YAMLFile(filename))
Adapters for reading and writing text. Useful for creating custom adapters.
If you've published an adapter for pylowdb, feel free to create a PR to add it here.
You may want to create an adapter to write db.data to YAML or XML, encrypt data, write to remote storage, ...
An adapter is a simple class that just needs to expose two methods:
class CustomAdapter:
    def read(self):
        # should return deserialized data
        pass

    def write(self, data):
        # should return nothing
        pass
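To make that contract concrete, here is a minimal working adapter that keeps data in memory (a sketch; the bundled Memory adapter plays the same role):

```python
# A complete two-method adapter: read() returns the deserialized data,
# write() persists it. An instance can be passed to Low() just like the
# built-in adapters.
class MemoryAdapter:
    def __init__(self):
        self._data = None  # nothing stored yet, mirroring a missing file

    def read(self):
        # Return the deserialized data (None until something is written)
        return self._data

    def write(self, data):
        # Persist the data; return nothing
        self._data = data

adapter = MemoryAdapter()
adapter.write({'posts': ['hello world']})
```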
For example, let's say you have some remote storage and want to create an adapter for it:
api = YourAPI()

class CustomAdapter:
    # Optional: your adapter can take arguments
    def __init__(self, *args, **kwargs):
        pass

    def read(self):
        data = api.read()
        return data

    def write(self, data):
        api.write(data)

adapter = CustomAdapter()
db = Low(adapter)
See pylowdb/adapters for more examples.
To create an adapter for a format other than JSON, you can use TextFile.
For example:
from pylowdb import (
    Adapter,
    Low,
    TextFile,
)
import yaml

class YAMLFile(Adapter):
    def __init__(self, filename: str):
        self.adapter = TextFile(filename)

    def read(self):
        data = self.adapter.read()
        if data is None:
            return None
        else:
            return yaml.safe_load(data)

    def write(self, obj):
        return self.adapter.write(yaml.dump(obj))

adapter = YAMLFile('file.yaml')
db = Low(adapter)
If you have large Python objects (~10-100MB), you may hit some performance issues. This is because whenever you call db.write(), the whole of db.data is serialized and written to storage.
Depending on your use case, this can be fine or not. It can be mitigated by doing batch operations and calling db.write() only when you need it.
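The batching idea, sketched with a hypothetical counting stand-in for a Low instance (it only tracks how many full serializations happen, rather than writing a real file):

```python
# CountingDB is a hypothetical stand-in for a Low instance: write()
# here just counts calls, where pylowdb would serialize all of db.data.
class CountingDB:
    def __init__(self):
        self.data = {'posts': []}
        self.writes = 0

    def write(self):
        self.writes += 1  # in pylowdb: serialize db.data and hit storage

db = CountingDB()

# Batched: mutate db.data freely in memory, then flush once at the end,
# instead of calling db.write() after every single append.
for i in range(1000):
    db.data['posts'].append({'id': i})
db.write()
```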
If you plan to scale, it's highly recommended to use databases like PostgreSQL, MySQL, or Oracle.