Overview

Timescale NFT Starter Kit

The Timescale NFT Starter Kit is a step-by-step guide to get up and running with collecting, storing, analyzing and visualizing NFT data from OpenSea, using PostgreSQL and TimescaleDB.

The NFT Starter Kit will give you a foundation for analyzing NFT trends so that you can bring some data to your purchasing decisions, or just learn about the NFT space from a data-driven perspective. It also serves as a solid foundation for your more complex NFT analysis projects in the future.

We recommend following along with the NFT Starter Kit tutorial to get familiar with the contents of this repository.

For more information about the NFT Starter Kit, see the announcement blog post.

Project components

Earn a Time Travel Tiger NFT

Time Travel Tigers is a collection of 20 hand-crafted NFTs featuring Timescale's mascot, Eon the friendly tiger, as they travel through space and time wearing various disguises to spread the word about time-series data. The first 20 people to complete the NFT Starter Kit tutorial can earn a limited-edition NFT from the collection, for free! Simply download the NFT Starter Kit, complete the tutorial, and fill out this form, and we'll send one of the limited-edition Eon NFTs to your ETH address (at no cost to you!).

Get started

Clone the nft-starter-kit repository:

git clone https://github.com/timescale/nft-starter-kit.git
cd nft-starter-kit

Setting up the pre-built Superset dashboards

This part of the project is fully Dockerized. TimescaleDB and the Superset dashboards are built automatically using docker-compose. After completing the steps below, you will have local TimescaleDB and Superset instances running in containers, preloaded with 500K+ NFT transactions from OpenSea.

The Docker services use port 8088 (for Superset) and port 6543 (for TimescaleDB), so make sure no other services are using those ports before starting the installation process.
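
If you want to check this quickly, here is a minimal sketch using only the Python standard library; the port numbers are the ones listed above, and anything already accepting connections on them will be reported.

    import socket

    # Ports used by the starter kit's docker-compose setup:
    # 8088 for Superset, 6543 for TimescaleDB.
    PORTS = {8088: "Superset", 6543: "TimescaleDB"}

    for port, service in PORTS.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(1)
            # connect_ex() returns 0 if something is already listening on the port.
            if sock.connect_ex(("127.0.0.1", port)) == 0:
                print(f"Port {port} ({service}) is already in use -- stop that service first.")
            else:
                print(f"Port {port} ({service}) looks free.")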

Prerequisites

  • Docker

  • Docker compose

    Verify that both are installed:

    docker --version && docker-compose --version

Instructions

  1. Run docker-compose up --build in the /pre-built-dashboards folder:

    cd pre-built-dashboards
    docker-compose up --build

    Wait for the process to finish (it can take a couple of minutes); you'll know it's done when you see:

    timescaledb_1      | PostgreSQL init process complete; ready for start up.
  2. Go to http://0.0.0.0:8088/ in your browser and log in with these credentials:

    user: admin
    password: admin
    
  3. Open the Databases page inside Superset (http://0.0.0.0:8088/databaseview/list/). You will see exactly one item there called NFT Starter Kit.

  4. Click the edit button (pencil icon) on the right side of the table (under "Actions").

  5. Don't change anything in the popup window; just click Finish. This confirms that the database can be reached from Superset.

  6. Go check out your NFT dashboards!

    Collections dashboard: http://0.0.0.0:8088/superset/dashboard/1

    Assets dashboard: http://0.0.0.0:8088/superset/dashboard/2
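
If the dashboards don't come up, a quick sanity check is to confirm that Superset is answering on port 8088 before logging in. Below is a minimal sketch using only the Python standard library; it assumes Superset's standard /health endpoint is exposed (if not, requesting the root URL and checking for an HTTP 200 works just as well).

    from urllib.request import urlopen

    # Superset from the pre-built docker-compose setup listens on port 8088 (see above).
    URL = "http://0.0.0.0:8088/health"  # assumes Superset's standard /health endpoint

    try:
        with urlopen(URL, timeout=5) as resp:
            print(f"Superset responded with HTTP {resp.status}: {resp.read().decode().strip()}")
    except OSError as exc:
        print(f"Superset is not reachable at {URL}: {exc}")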

Running the data ingestion script

If you'd like to ingest data into your database (whether it's a local TimescaleDB instance or one running in Timescale Cloud) straight from the OpenSea API, follow these steps to configure and run the ingestion script:

Prerequisites

  • Python 3 and virtualenv

  • A TimescaleDB database to ingest into (local or in Timescale Cloud)

Instructions

  1. Go to the root folder of the project:
    cd nft-starter-kit
  2. Create a new Python virtual environment and install the requirements:
    virtualenv env && source env/bin/activate
    pip install -r requirements.txt
  3. Replace the parameters in the config.py file:
    DB_NAME="tsdb"
    HOST="YOUR_HOST_URL"
    USER="tsdbadmin"
    PASS="YOUR_PASSWORD_HERE"
    PORT="PORT_NUMBER"
    OPENSEA_START_DATE="2021-10-01T00:00:00" # example start date (UTC)
    OPENSEA_END_DATE="2021-10-06T23:59:59" # example end date (UTC)
  4. Run the Python script:
    python opensea_ingest.py
    This will start ingesting data in batches, ~300 rows at a time:
    Start ingesting data between 2021-10-01 00:00:00+00:00 and 2021-10-06 23:59:59+00:00
    ---
    Fetching transactions from OpenSea...
    Data loaded into temp table!
    Data ingested!
    Data has been backfilled until this time: 2021-10-06 23:51:31.140126+00:00
    ---
    You can stop the ingestion process at any time (Ctrl+C); otherwise, the script will run until all the transactions from the given time period have been ingested.
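
To double-check what actually landed in the database, a minimal sketch along these lines can help. It reuses the connection parameters from config.py above and assumes psycopg2 is installed (if requirements.txt pins a different PostgreSQL driver, adapt accordingly); nft_sales and its time column are the table and column used elsewhere in this guide.

    import psycopg2

    from config import DB_NAME, HOST, USER, PASS, PORT  # the parameters you set in step 3

    # Connect with the same parameters the ingestion script uses.
    conn = psycopg2.connect(dbname=DB_NAME, host=HOST, user=USER, password=PASS, port=PORT)
    cur = conn.cursor()

    # Row count and covered time range of the ingested OpenSea transactions.
    cur.execute("SELECT count(*), MIN(time), MAX(time) FROM nft_sales;")
    rows, min_time, max_time = cur.fetchone()
    print(f"{rows} sales ingested between {min_time} and {max_time}")

    cur.close()
    conn.close()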

Ingest the sample data

If you don't want to wait for a decent amount of data to be ingested, you can use our sample dataset, which contains 500K+ sale transactions from OpenSea (this is the same sample used for the Superset dashboards).

Prerequisites

  • A TimescaleDB database with the NFT tables (accounts, collections, assets, nft_sales) already created

  • The psql command-line client

Instructions

  1. Go to the folder with the sample CSV files (or you can also download them from here):
    cd pre-built-dashboards/database/data
  2. Connect to your database with PSQL:
    psql -x "postgres://host:port/tsdb?sslmode=require"
    If you're using Timescale Cloud, the instructions under How to Connect provide a customized command to run to connect directly to your database.
  3. Import the CSV files in this order (it can take a few minutes in total):
    \copy accounts FROM 001_accounts.csv CSV HEADER;
    \copy collections FROM 002_collections.csv CSV HEADER;
    \copy assets FROM 003_assets.csv CSV HEADER;
    \copy nft_sales FROM 004_nft_sales.csv CSV HEADER;
  4. Try running some queries on your database:
    SELECT count(*), MIN(time) AS min_date, MAX(time) AS max_date FROM nft_sales;
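
Once the sample data is loaded, you can also explore it from Python instead of psql. Below is a minimal sketch, assuming psycopg2 is installed and using a placeholder connection string in the same form as the psql command above; it groups the time column of nft_sales into daily buckets with TimescaleDB's time_bucket() function.

    import psycopg2

    # Placeholder connection string -- replace with your own credentials.
    conn = psycopg2.connect("postgres://user:password@host:port/tsdb?sslmode=require")
    cur = conn.cursor()

    # Daily number of NFT sales, bucketed with TimescaleDB's time_bucket().
    cur.execute("""
        SELECT time_bucket('1 day', time) AS day, count(*) AS sales
        FROM nft_sales
        GROUP BY day
        ORDER BY day;
    """)
    for day, sales in cur.fetchall():
        print(day, sales)

    cur.close()
    conn.close()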
Python & Julia port of codes in excellent R books

X4DS This repo is a collection of Python & Julia port of codes in the following excellent R books: An Introduction to Statistical Learning (ISLR) Stat

Gitony 5 Jun 21, 2022
Drag’n’drop Pivot Tables and Charts for Jupyter/IPython Notebook, care of PivotTable.js

pivottablejs: the Python module Drag’n’drop Pivot Tables and Charts for Jupyter/IPython Notebook, care of PivotTable.js Installation pip install pivot

Nicolas Kruchten 512 Dec 26, 2022
Here are my graphs for hw_02

Let's Have A Look At Some Graphs! Graph 1: State Mentions in Congressperson's Tweets on 10/01/2017 The graph below uses this data set to demonstrate h

7 Sep 02, 2022
A collection of 100 Deep Learning images and visualizations

A collection of Deep Learning images and visualizations. The project has been developed by the AI Summer team and currently contains almost 100 images.

AI Summer 65 Sep 12, 2022
Application for viewing pokemon regional variants.

Pokemon Regional Variants Application Application for viewing pokemon regional variants. Run The Source Code Download Python https://www.python.org/do

Michael J Bailey 4 Oct 08, 2021
Log visualizer for whirl-framework

Lumberjack Log visualizer for whirl-framework Установка pip install -r requirements.txt Как пользоваться python3 lumberjack.py -l путь до лога -o

Vladimir Malinovskii 2 Dec 19, 2022
Because trello only have payed options to generate a RunUp chart, this solves that!

Trello Runup Chart Generator The basic concept of the project is that Corello is pay-to-use and want to use Trello To-Do/Doing/Done automation with gi

Rômulo Schiavon 1 Dec 21, 2021
Type-safe YAML parser and validator.

StrictYAML StrictYAML is a type-safe YAML parser that parses and validates a restricted subset of the YAML specification. Priorities: Beautiful API Re

Colm O'Connor 1.2k Jan 04, 2023
plotly scatterplots which show molecule images on hover!

molplotly Plotly scatterplots which show molecule images on hovering over the datapoints! Required packages: pandas rdkit jupyter_dash ➡️ See example.

150 Dec 28, 2022
An interactive dashboard for visualisation, integration and classification of data using Active Learning.

AstronomicAL An interactive dashboard for visualisation, integration and classification of data using Active Learning. AstronomicAL is a human-in-the-

45 Nov 28, 2022
Active Transport Analytics Model (ATAM) is a new strategic transport modelling and data visualization framework for Active Transport as well as emerging micro-mobility modes

{ATAM} Active Transport Analytics Model Active Transport Analytics Model (“ATAM”) is a new strategic transport modelling and data visualization framew

Peter Stephan 0 Jan 12, 2022
Bcc2telegraf: An integration that sends ebpf-based bcc histogram metrics to telegraf daemon

bcc2telegraf bcc2telegraf is an integration that sends ebpf-based bcc histogram

Peter Bobrov 2 Feb 17, 2022
🎨 Python3 binding for `@AntV/G2Plot` Plotting Library .

PyG2Plot 🎨 Python3 binding for @AntV/G2Plot which an interactive and responsive charting library. Based on the grammar of graphics, you can easily ma

hustcc 990 Jan 05, 2023
ICS-Visualizer is an interactive Industrial Control Systems (ICS) network graph that contains up-to-date ICS metadata

ICS-Visualizer is an interactive Industrial Control Systems (ICS) network graph that contains up-to-date ICS metadata (Name, company, port, user manua

QeeqBox 2 Dec 13, 2021
Fast data visualization and GUI tools for scientific / engineering applications

PyQtGraph A pure-Python graphics library for PyQt5/PyQt6/PySide2/PySide6 Copyright 2020 Luke Campagnola, University of North Carolina at Chapel Hill h

pyqtgraph 3.1k Jan 08, 2023
📊 Charts with pure python

A zero-dependency python package that prints basic charts to a Jupyter output Charts supported: Bar graphs Scatter plots Histograms 🍑 📊 👏 Examples

Max Humber 54 Oct 04, 2022
:small_red_triangle: Ternary plotting library for python with matplotlib

python-ternary This is a plotting library for use with matplotlib to make ternary plots plots in the two dimensional simplex projected onto a two dime

Marc 611 Dec 29, 2022
Investment and risk technologies maintained by Fortitudo Technologies.

Fortitudo Technologies Open Source This package allows you to freely explore open-source implementations of some of our fundamental technologies under

Fortitudo Technologies 11 Dec 14, 2022
This is Pygrr PolyArt, a program used for drawing custom Polygon models for your Pygrr project!

This is Pygrr PolyArt, a program used for drawing custom Polygon models for your Pygrr project!

Isaac 4 Dec 14, 2021
An automatic prover for tautologies in Metamath

completeness An automatic prover for tautologies in Metamath This program implements the constructive proof of the Completeness Theorem for propositio

Scott Fenton 2 Dec 15, 2021