The command line interface for Gradient - Gradient is an end-to-end MLOps platform

Overview

Gradient CLI

Get started: Create Account | Install CLI | Tutorials | Docs

Resources: Website | Blog | Support | Contact Sales


Gradient is an end-to-end MLOps platform that enables individuals and organizations to quickly develop, train, and deploy Deep Learning models. The Gradient software stack runs on any infrastructure, e.g. AWS, GCP, on-premise, and low-cost Paperspace GPUs. Leverage automatic versioning, distributed training, built-in graphs & metrics, hyperparameter search, GradientCI, 1-click Jupyter Notebooks, our Python SDK, and more.

Key components:

  • Notebooks: 1-click Jupyter Notebooks.
  • Workflows: Train models at scale with composable actions.
  • Inference: Deploy models as API endpoints.

Gradient supports any ML/DL framework (TensorFlow, PyTorch, XGBoost, etc.).


See releasenotes.md for details on the current release, as well as release history.


Getting Started

  1. Make sure you have a Paperspace account set up. Go to http://paperspace.com to register and generate an API key.

  2. Use pip, pipenv, or conda to install the gradient package, e.g.:

    pip install -U gradient

    To install or update a prerelease (Alpha/Beta) version of gradient, use:

    pip install -U --pre gradient

  3. Set your API key by executing the following:

    gradient apiKey XXXXXXXXXXXXXXXXXXX

    Note: your API key is cached in ~/.paperspace/config.json (a Python SDK alternative is sketched after these steps).

    You can remove your cached API key by executing:

    gradient logout
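
If you use the Python SDK mentioned above instead of the CLI, the clients can also take the key directly rather than reading the cached config. A minimal sketch, assuming the gradient package exposes ProjectsClient with an api_key constructor argument as described in the SDK docs; the key value is a placeholder:

    # Minimal sketch: pass the API key to an SDK client directly instead of
    # relying on the key cached by "gradient apiKey". ProjectsClient and its
    # api_key argument follow the SDK docs; adjust if your gradient version differs.
    from gradient import ProjectsClient

    projects_client = ProjectsClient(api_key="XXXXXXXXXXXXXXXXXXX")  # placeholder key

    # Print the projects visible to this API key.
    for project in projects_client.list():
        print(project)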

Executing tasks on Gradient

The Gradient CLI follows a standard [command] [--options] syntax.

For example, to create a new Workflow use:

gradient workflows create [type] [--options]

For a full list of available commands run gradient workflows --help. You can also view more info about Workflows in the docs.

Contributing

Want to contribute? Contact us at [email protected]

Pre-Release Testing

Have a Paperspace QA tester install your change directly from the branch to test it. They can do it with pip install git+https://github.com/Paperspace/[email protected].

Comments
  • Some parameters in job_client.create() don't work

    I'm following the sample docs at gradient.api_sdk.clients.job_client. I can't get working_directory or job_env to work.

    working_directory: auto-set to /paperspace. When I pass in /app, it's ignored and /paperspace is still used (listed under Console > Job > Environment, and on the Job model fetched from job_client.list()). I just caved in and moved all my Docker stuff from /app to /paperspace and it worked. Ideally I wouldn't even want to pass /app; it would just be picked up from WORKDIR in the Dockerfile (which is /app).

    job_env: I can't find a workaround (save putting a config.json in /storage). The parameter gets ignored; of note, the docs have a typo:

    job = job_client.create(
        ...
        job_env={
            'CUSTOM_ENV'='Some value that will be set as system environment',
        }
    )
    

    'CUSTOM_ENV'='..' should be 'CUSTOM_ENV': '..'. Not nit-picking; it just had me second-guessing my approach, since using a dict gets ignored ([j.job_env for j in job_client.list()] => [None, None, ..]), so I tried json.dumps(my_dict), which was still ignored. Any suggestions on getting job_env passed in?

    Overall, it seems like some params are respected, and others not; and it's hard to know which is which. Maybe a mismatch on the docs vs Github vs PyPi?

    opened by lefnire 8
  • Unable to install due to numpy==1.19.4

    Hi, I am unable to use gradient on Python 3.7.9 due to a RuntimeError caused by numpy==1.19.4:

    RuntimeError: The current Numpy installation ('c:\users\XXXX\envs\gradient\lib\site-packages\numpy\__init__.py') fails to pass a sanity check due to a bug in the windows runtime. See this issue for more information: https://tinyurl.com/y3dm3h86

    I tried downgrading numpy, but it appears to be incompatible and I got the following error: ModuleNotFoundError: No module named '_curses'

    opened by thompsonalecbgo 7
  • Update Readme.md

    I found two outdated or currently non-working commands while trying to execute my container:

    1. The --project option was ignored, but providing --projectId worked.
    2. The example for the --command option produced the error message Error: Missing argument "SCRIPT...". Upon providing a script, the originally provided command was ignored. Instead I had to use the --shell option to execute the correct call.
    opened by nziermann 6
  • Keep artifacts for canceled jobs

    I am aware this is probably not the place to open this issue, but it seems like artifacts are not kept when a job is canceled. Why? I just finished a job that ran for 2 days and wanted access to my best model, but the artifacts are not showing up, potentially because I canceled the job. If this is the case, it is very frustrating.

    opened by MichelML 5
  • feat(notebooks): enable basic notebook lifecycle commands PS-13680

    Newly Supported Commands

    • notebooks stop
    • notebook start
    • notebook create
    • notebook fork
    • notebook artifacts list

    Related tickets

    https://paperspace.atlassian.net/browse/PS-12219 https://paperspace.atlassian.net/browse/PS-11559 https://paperspace.atlassian.net/browse/PS-13681 https://paperspace.atlassian.net/browse/PS-13682 https://paperspace.atlassian.net/browse/PS-13683 https://paperspace.atlassian.net/browse/PS-13684

    This PR follows up on the PR for PS-12219 (https://github.com/Paperspace/PS_API/pull/1526). It enables the createNotebook, startNotebook, artifactsList, and forkNotebook commands with the v2 endpoint. It adds the ability to create a notebook with vm_type_id or vm_type_label.

    opened by kevin-kabore 4
  • Failing to start notebook

    I'm trying to start an instance of an existing notebook: gradient notebooks start --id [id] --machineType Free-P5000

    and get in return: Failed to create resource: Cluster null not found

    Specifying any cluster ID doesn't change anything and the output remains the same.

    A bug?

    bug 
    opened by macsunmood 3
  • CR2-22 CR2-49 CR2-48 add metrics list to sdk

    Adds list-custom-metrics functionality to the SDK and CLI for jobs, experiments, deployments, and notebooks.

    QA Test Plan:

    1. In the CLI, check that this new list command shows up in the help text: gradient deployments metrics --help
    2. Create a deployment (GUI or CLI, doesn't matter) and wait for it to finish provisioning.
    3. Try the new list command: gradient deployments metrics list --id XYZ; it should return custom metrics (a list of words).
    4. Repeat 1 and 2 for jobs, experiments, and notebooks. These might return null, but that's okay for now; we're just checking that this new command is available. Note anything that returns null below, if any, and I'll solve it in a separate ticket.
    released 
    opened by robghchen 3
  • Fix: Serialize jobenv to envVars PS-15020

    Fixes serialization of job environment

    Test Plan: Create a job and specify the jobEnv parameter. Use the command "env" to see if the job environment is affected.

    released 
    opened by paperspace-philip 3
  • When providing --ignoreFiles comma separated the code returns Attribute error.

    The command below with the --ignoreFiles field set

    gradient experiments run single node \
    --name test \
    --projectid 12345 \
    --container paperspace/tensorflow-python \
    --machineType V100 \
    --command 'python main.py' \
    --ignoreFiles "file1,file2,file3,folder1,folder2,folder3"
    

    Returns the following error

    file_paths = self._retrieve_file_paths(workspace_path, ignore_files)
    File "/lib/python3.7/site-packages/gradient/workspace.py", line 59, in _retrieve_file_paths
        exclude += ignored_files.split(',')
    AttributeError: 'list' object has no attribute 'split'
    
    opened by EvDuijnhoven 3
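
    For anyone hitting the same traceback: the error above comes from calling .split(',') on a value that is already a list. A small generic sketch (not the project's actual fix) of normalizing the ignore-files input so that both a comma-separated string and a list are accepted:

    # Illustrative sketch only; not the change that shipped in the CLI.
    def normalize_ignore_files(ignore_files):
        """Accept either a comma-separated string or a list of names.

        The traceback above shows the value arriving as a list and then being
        .split(','), which only works on strings; normalizing both shapes to a
        flat list avoids the AttributeError.
        """
        if isinstance(ignore_files, str):
            return [name.strip() for name in ignore_files.split(",") if name.strip()]
        return list(ignore_files)

    print(normalize_ignore_files("file1,file2,folder1"))  # ['file1', 'file2', 'folder1']
    print(normalize_ignore_files(["file1", "file2"]))     # ['file1', 'file2']
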
  • Notebook example with SDK concepts

    Example of using new SDK functionality to create projects and experiments, analyze a model, and create an inference deployment.

    Tested on TensorFlow 2.0 & Python3 container image.

    opened by dte 3
  • Replace faulty chunk counting logic

    PR #384, which allowed multipart uploads, introduced a bug that prevents uploading datasets containing files whose sizes are exact multiples of 500 MB. This became more of an issue with #389, which changes the chunk size to 15 MB, meaning that any file that is a multiple of 15 MB (75 MB in my case) will cause the dataset upload to fail.

    What I believe is happening is that the faulty logic instructs us to read an additional block of data from the filesystem that doesn't exist. In my experience this shows up as the CLI hanging indefinitely, or crashing when run in a Workflow.

    released 
    opened by fmorlock-tt 2
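
    To make the failure mode above concrete: with ceiling division, a file whose size is an exact multiple of the chunk size maps to exactly that many chunks, with no phantom extra read. An illustrative sketch (not the shipped fix); the 15 MB constant is taken from the report above:

    import math

    CHUNK_SIZE = 15 * 1024 * 1024  # 15 MB chunk size mentioned in the report

    def chunk_count(file_size: int, chunk_size: int = CHUNK_SIZE) -> int:
        """Number of chunks needed to upload a file of file_size bytes.

        Ceiling division never rounds up past the real number of chunks, so a
        file that is exactly N chunks long yields N rather than N + 1, the
        off-by-one described above.
        """
        return math.ceil(file_size / chunk_size)

    # A 75 MB file with 15 MB chunks needs exactly 5 chunks, not 6.
    assert chunk_count(5 * CHUNK_SIZE) == 5
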
  • Failed to execute request against storage provider when uploading a dataset

    Dear everyone,

    I am trying to upload a new version of my dataset, but I keep getting broken pipe errors. Here is what I am doing:

    gradient datasets versions create --id ...
    gradient datasets files put --id ...:... --source-path "."
    
    Failed to execute request against storage provider: ('Connection aborted.', BrokenPipeError(32, 'Broken pipe'))
    

    Uploading to other cloud providers is working, but not here.

    By the way, is gradient-cli uploading everything again at each new version? Or is it comparing hashes to avoid unnecessary file transfers?

    Thanks, Clément

    opened by clementpoiret 0
Releases: v2.0.6

Owner: Paperspace