s3-credentials

A tool for creating credentials for accessing S3 buckets

For project background, see s3-credentials: a tool for creating credentials for S3 buckets on my blog.

⚠️ Warning

I am not an AWS security expert. You should review how this tool works carefully before using it against your own AWS account.

If you are an AWS security expert I would love to get your feedback!

Installation

Install this tool using pip:

$ pip install s3-credentials

Configuration

This tool uses boto3 under the hood, which supports a number of different ways of providing your AWS credentials. If you have an existing ~/.aws/config or ~/.aws/credentials file the tool will use that - otherwise you can set the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables before calling this tool.
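
For example, this minimal boto3 sketch (assuming boto3 is installed and some credentials are configured) shows which credentials boto3 resolved:

import boto3

# boto3 checks, in order: explicit arguments, environment variables
# (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY), then the shared
# ~/.aws/credentials and ~/.aws/config files
session = boto3.Session()
credentials = session.get_credentials()
print(credentials.access_key)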

Usage

The s3-credentials create command is the core feature of this tool. Pass it one or more S3 bucket names and it will create a new user with permission to access just those specific buckets, then create access credentials for that user and output them to your console.

Make sure to record the SecretAccessKey because it will only be displayed once and cannot be recreated later on.

In this example I create credentials for reading and writing files in my static.niche-museums.com S3 bucket:

% s3-credentials create static.niche-museums.com

Created user: s3.read-write.static.niche-museums.com with permissions boundary: arn:aws:iam::aws:policy/AmazonS3FullAccess
Attached policy s3.read-write.static.niche-museums.com to user s3.read-write.static.niche-museums.com
Created access key for user: s3.read-write.static.niche-museums.com
{
    "UserName": "s3.read-write.static.niche-museums.com",
    "AccessKeyId": "AKIAWXFXAIOZOYLZAEW5",
    "Status": "Active",
    "SecretAccessKey": "...",
    "CreateDate": "2021-11-03 01:38:24+00:00"
}

The command has several additional options:

  • --username TEXT: The username to use for the user that is created by the command (or the username of an existing user if you do not want to create a new one). If omitted, a default such as s3.read-write.static.niche-museums.com will be used.
  • -c, --create-bucket: Create the buckets if they do not exist. Without this option, any missing buckets will be treated as an error.
  • --read-only: The user should only be allowed to read files from the bucket.
  • --write-only: The user should only be allowed to write files to the bucket, but not read them. This is useful for logging use-cases.
  • --bucket-region: If creating buckets, the region in which they should be created.
  • --silent: Don't output details of what is happening, just output the JSON for the created access credentials at the end.
  • --user-permissions-boundary: Custom permissions boundary to use for users created by this tool. This will default to restricting those users to only interacting with S3, taking the --read-only option into account. Use none to create users without any permissions boundary at all.
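
For example, to create read-only credentials for a bucket, creating it in a specific region if it is missing (the bucket name here is hypothetical):

% s3-credentials create my-example-bucket --read-only --create-bucket --bucket-region us-east-1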

Here's the full sequence of events that takes place when you run this command (a rough boto3 sketch follows the list):

  1. Confirm that each of the specified buckets exists. If they do not and --create-bucket was passed create them - otherwise exit with an error.
  2. If a username was not specified, determine a username using the s3.$permission.$buckets format.
  3. If a user with that username does not exist, create one with an S3 permissions boundary that respects the --read-only option - unless --user-permissions-boundary=none was passed (or a custom permissions boundary string).
  4. For each specified bucket, add an inline IAM policy to the user that gives them permission to either read-only, write-only or read-write against that bucket.
  5. Create a new access key for that user and output the key and its secret to the console.
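
For illustration, the underlying IAM calls look roughly like this (a minimal boto3 sketch with assumed names and a truncated policy, not the tool's exact implementation):

import json
import boto3

iam = boto3.client("iam")
username = "s3.read-write.static.niche-museums.com"

# Step 3: create the user with an S3 permissions boundary
iam.create_user(
    UserName=username,
    PermissionsBoundary="arn:aws:iam::aws:policy/AmazonS3FullAccess",
)

# Step 4: attach an inline policy granting access to the bucket
# (only the ListBucket statement is shown here; see the full policy
# under list-user-policies below)
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": ["arn:aws:s3:::static.niche-museums.com"],
        }
    ],
}
iam.put_user_policy(
    UserName=username,
    PolicyName=username,
    PolicyDocument=json.dumps(policy),
)

# Step 5: create an access key - the secret is only available here
key = iam.create_access_key(UserName=username)["AccessKey"]
print(key["AccessKeyId"], key["SecretAccessKey"])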

Other commands

whoami

To see which user you are authenticated as:

s3-credentials whoami

This will output JSON representing the currently authenticated user.
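
Under the hood this corresponds to the STS GetCallerIdentity API (see the 0.6 release notes below). A minimal boto3 equivalent:

import boto3

# Ask STS which identity the current credentials belong to;
# this works with any kind of access key
sts = boto3.client("sts")
identity = sts.get_caller_identity()
print(identity["Arn"])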

list-users

To see a list of all users that exist for your AWS account:

s3-credentials list-users

This will return pretty-printed JSON objects by default.

Add --nl to collapse these to single lines as valid newline-delimited JSON.

Add --array to output a valid JSON array of objects instead.
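
The --nl output is convenient to consume line by line. A hypothetical snippet, assuming the tool is installed and on your PATH:

import json
import subprocess

# Each line of --nl output is a single JSON user object
result = subprocess.run(
    ["s3-credentials", "list-users", "--nl"],
    capture_output=True, text=True, check=True,
)
for line in result.stdout.splitlines():
    print(json.loads(line)["UserName"])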

list-buckets

Shows a list of all buckets in your AWS account.

s3-credentials list-buckets

Accepts the same --nl and --array options as list-users.

list-user-policies

To see a list of inline policies belonging to users:

% s3-credentials list-user-policies s3.read-write.static.niche-museums.com

User: s3.read-write.static.niche-museums.com
PolicyName: s3.read-write.static.niche-museums.com
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ListObjectsInBucket",
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::static.niche-museums.com"
            ]
        },
        {
            "Sid": "AllObjectActions",
            "Effect": "Allow",
            "Action": "s3:*Object",
            "Resource": [
                "arn:aws:s3:::static.niche-museums.com/*"
            ]
        }
    ]
}

You can pass any number of usernames here. If you don't specify a username the tool will loop through every user belonging to your account:

s3-credentials list-user-policies

delete-user

In trying out this tool it's possible you will create several different user accounts that you later decide to clean up.

Deleting AWS users is a little fiddly: you first need to delete their access keys, then their inline policies and finally the user themselves.
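
For illustration, that sequence looks roughly like this as boto3 calls (a minimal sketch with pagination omitted, not the tool's exact code):

import boto3

iam = boto3.client("iam")
username = "s3.read-write.simonw-test-bucket-10"

# 1. Delete the user's access keys
for key in iam.list_access_keys(UserName=username)["AccessKeyMetadata"]:
    iam.delete_access_key(UserName=username, AccessKeyId=key["AccessKeyId"])

# 2. Delete their inline policies
for name in iam.list_user_policies(UserName=username)["PolicyNames"]:
    iam.delete_user_policy(UserName=username, PolicyName=name)

# 3. Finally, delete the user itself
iam.delete_user(UserName=username)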

The s3-credentials delete-user command handles all of this for you:

% s3-credentials delete-user s3.read-write.simonw-test-bucket-10
User: s3.read-write.simonw-test-bucket-10
  Deleted policy: s3.read-write.simonw-test-bucket-10
  Deleted access key: AKIAWXFXAIOZK3GPEIWR
  Deleted user

You can pass it multiple usernames to delete multiple users at a time.

Development

To contribute to this tool, first check out the code. Then create a new virtual environment:

cd s3-credentials
python -m venv venv
source venv/bin/activate

Or if you are using pipenv:

pipenv shell

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest
Comments
  • `s3-credentials create` command

    This is the command which creates a user and returns credentials for a specified bucket, optionally creating the bucket as well.

    See initial design notes in #1.

    enhancement 
    opened by simonw 22
  • Standard default output should be a valid JSON array

    I just spotted list-buckets has the same not-quite-newline-delimited JSON output format, which is a bad default. I should fix that too.

    Originally posted by @simonw in https://github.com/simonw/s3-credentials/issues/28#issuecomment-1014838721

    enhancement 
    opened by simonw 15
  • Research creating expiring credentials using `sts.assume_role()`

    The initial reason for creating this tool was that I wanted to be able to create long-lived (never expiring) tokens for the kinds of use-cases described in this post: https://simonwillison.net/2021/Nov/3/s3-credentials/

    Expiring credentials are fantastic for all sorts of other use-cases. It would be great if this tool could optionally create those instead of creating long-lived credentials.

    This would mean the tool didn't have to create users at all (when used in that mode) - it could create a role and then create temporary access credentials for that role using sts.assume_role(): https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/sts.html#STS.Client.assume_role
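
    A minimal sketch of that approach (the role ARN and session name here are hypothetical):

    import boto3

    sts = boto3.client("sts")
    # Temporary credentials that expire after DurationSeconds
    response = sts.assume_role(
        RoleArn="arn:aws:iam::123456789012:role/my-s3-role",
        RoleSessionName="s3-credentials",
        DurationSeconds=3600,
    )
    # response["Credentials"] contains AccessKeyId, SecretAccessKey,
    # SessionToken and Expiration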

    enhancement research 
    opened by simonw 15
  • `s3-credentials put-objects` command

    It's frustrating when using s3-credentials put-object that you have to specify the key name each time, rather than deriving that from the filename:

    s3-credentials put-object simonwillison-cors-allowed-public \
      click_default_group-1.2.2-py3-none-any.whl \
      /tmp/click-default-group/dist/click_default_group-1.2.2-py3-none-any.whl
    

    One way to fix this would be with a s3-credentials put-objects which works like this:

    s3-credentials put-objects simonwillison-cors-allowed-public /tmp/click-default-group/dist/click_default_group-1.2.2-py3-none-any.whl
    

    It could accept multiple files (hence the plural name) and could also accept directories and recursively upload their contents.

    enhancement 
    opened by simonw 13
  • Make it easier to add extra policy statements

    The current --policy option lets you set a custom policy, but leaves it to you to define one.

    I find myself wanting to mix in the following to the policy that I use, for s3-ocr:

    https://docs.aws.amazon.com/textract/latest/dg/security_iam_id-based-policy-examples.html#security_iam_async-actions

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "textract:StartDocumentTextDetection",
                    "textract:StartDocumentAnalysis",
                    "textract:GetDocumentTextDetection",
                    "textract:GetDocumentAnalysis"
                ],
                "Resource": "*"
            }
        ]
    }
    

    Would be nice if there was a neat way to do this.

    enhancement research 
    opened by simonw 10
  • Mechanism for running tests against a real AWS account

    The tests for this project currently run against mocks - which is good, because I don't like the idea of GitHub Action tests hitting real APIs.

    But... this project is about building securely against AWS. As such, automated tests that genuinely exercise a live AWS account (and check that the resulting permissions behave as expected) would be incredibly valuable for growing my confidence that this tool works as advertised.

    These tests would need quite a high level of administrative access, because they need to be able to create users, roles etc.

    I don't like the idea of storing my own AWS administrator account credentials in a GitHub Actions secret though. I think I'll write these tests such that they can be run outside of GitHub Actions, maybe configured via environment variables that allow other project contributors to run tests against their own accounts.

    tests 
    opened by simonw 10
  • Stop using action wildcards and start explicitly listing permissions

    See https://github.com/simonw/s3-credentials/issues/11#issuecomment-959844042 for context.

    The read-write policy currently uses "Action": "s3:*Object" - and the read-only one uses "Action": "s3:GetObject*".

    This is pretty gross - surely explicitly listing the allowed actions is better practice?

    • [x] #23
    • [x] #24
    • [x] #25
    research 
    opened by simonw 10
  • Support configuring the bucket as a website

    It would be useful to have an opt-in option for saying "this bucket should be configured as a website" - because setting that up without a tool is quite fiddly.

    https://docs.aws.amazon.com/AmazonS3/latest/userguide/WebsiteAccessPermissionsReqd.html has the details:

    When you configure a bucket as a static website, if you want your website to be public, you can grant public read access. To make your bucket publicly readable, you must disable block public access settings for the bucket and write a bucket policy that grants public read access.

    See #20 for "block public access" setting, and #19 for bucket policies.
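
    For reference, configuring a bucket as a website is a single API call. A hedged boto3 sketch (bucket name hypothetical; index and error documents as in the 0.13 release notes):

    import boto3

    s3 = boto3.client("s3")
    # Serve the bucket as a static website
    s3.put_bucket_website(
        Bucket="my-example-bucket",
        WebsiteConfiguration={
            "IndexDocument": {"Suffix": "index.html"},
            "ErrorDocument": {"Key": "error.html"},
        },
    )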

    enhancement 
    opened by simonw 9
  • Work-in-progress create command

    Refs #3. This is implemented... but it doesn't seem to work - when I copy and paste the credentials into Transmit it refuses to connect.

    • [x] Get it working
    • [x] Add tests
    • [x] Add documentation
    opened by simonw 8
  • Manually test --prefix against litestream.io

    Originally posted by @simonw in https://github.com/simonw/s3-credentials/pull/39#issuecomment-1014857276

    Splitting this into a separate issue mainly so I can clearly document how to use Litestream in the comments here.

    Goal is to confirm that S3 credentials created using s3-credentials create ... --prefix litestream-test/ can be used with Litestream to back up a SQLite database to that path within the bucket.

    research tests 
    opened by simonw 7
  • Apply jdub policy suggestions

    https://github.com/simonw/s3-credentials/blob/main/s3_credentials/policies.py

    My suggestions:

    • specify individual actions explicitly (no wildcards)
    • separate permissions by resource (Buckets vs. Objects)
    • Sid is unnecessary

    Your read/write policy is good, but instead of *Object, list GetObject and PutObject.

    Your read-only policy would be better written like your read/write policy, one section for the bucket permission (ListBucket), one for the object permission (which should be GetObject, no wildcard).

    Your write-only policy is great as is.

    You may want to add additional permissions to let clients set ACLs. But if it's all simple object-by-object stuff, these very simple policies are great.

    Originally posted by @jdub in https://github.com/simonw/s3-credentials/issues/7#issuecomment-958651592

    enhancement research 
    opened by simonw 7
  • Add s3:PutObjectAcl to write policies

    This came up here:

    • https://github.com/simonw/public-notes/issues/9#issuecomment-1328567164

    It turned out django-storages needs a write policy that includes s3:PutObjectAcl: https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html#iam-policy

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "VisualEditor0",
                "Effect": "Allow",
                "Action": [
                    "s3:PutObject",
                    "s3:GetObjectAcl",
                    "s3:GetObject",
                    "s3:ListBucket",
                    "s3:DeleteObject",
                    "s3:PutObjectAcl"
                ],
                "Principal": {
                    "AWS": "arn:aws:iam::example-AWS-account-ID:user/example-user-name"
                },
                "Resource": [
                    "arn:aws:s3:::example-bucket-name/*",
                    "arn:aws:s3:::example-bucket-name"
                ]
            }
        ]
    }
    

    Looks like I should add s3:GetObjectAcl to the default read policies too.

    enhancement 
    opened by simonw 3
  • Add the options to add tags to the created resources

    Hello! I am looking into using s3-credentials for my projects. I use tags to identify resources in different ways, such as how they were created or the project they belong to. I was wondering if there is planned support for adding tags to the created resources, or if you would be open to a contribution in that area.

    enhancement 
    opened by threkk 3
  • `get-objects/put-objects` `--skip` and `--skip-hash` options

    Idea:

    • --skip to skip downloading a file if it already exists with the same filename
    • --skip-hash to skip downloading a file if it already exists AND the MD5 hash has not changed (more expensive, as it needs to calculate the local hash)

    Originally posted by @simonw in https://github.com/simonw/s3-credentials/issues/78#issuecomment-1248398247
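
    A sketch of the comparison --skip-hash implies (hypothetical helper; note that S3 ETags are plain MD5 digests only for non-multipart uploads):

    import hashlib

    def local_md5(path):
        # Stream the file so large objects can be hashed cheaply
        h = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()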

    enhancement 
    opened by simonw 1
  • Provide a `--profile` option to allow AWS profile selection

    Users with multiple AWS accounts can declare named profiles to manage the different sets of credentials/regions. It would be ideal if s3-credentials accepted a --profile argument, just like the aws command line tool.

    enhancement 
    opened by nk9 3
  • Bad session token error masked if not creating a new bucket

    If a bad or expired session token is set and --create-bucket isn't specified, the create command fails with a misleading error claiming the bucket doesn't exist. If --create-bucket is specified, a traceback with more information is given instead:

    export AWS_ACCESS_KEY_ID="..."
    export AWS_SECRET_ACCESS_KEY="..."
    export AWS_SESSION_TOKEN="EXPIRED_TOKEN" 
    
    $ s3-credentials create --username USERNAME BUCKET
    Error: Bucket does not exist: BUCKET - try --create-bucket to create it
    
    $ s3-credentials create --create-bucket --username USERNAME BUCKET
    Traceback (most recent call last):
      File "/home/kimv/work/data_engineering/sbsa_archive/.venv/bin/s3-credentials", line 8, in <module>
        sys.exit(cli())
      File "/home/kimv/work/data_engineering/sbsa_archive/.venv/lib/python3.9/site-packages/click/core.py", line 1128, in __call__
        return self.main(*args, **kwargs)
      File "/home/kimv/work/data_engineering/sbsa_archive/.venv/lib/python3.9/site-packages/click/core.py", line 1053, in main
        rv = self.invoke(ctx)
      File "/home/kimv/work/data_engineering/sbsa_archive/.venv/lib/python3.9/site-packages/click/core.py", line 1659, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
      File "/home/kimv/work/data_engineering/sbsa_archive/.venv/lib/python3.9/site-packages/click/core.py", line 1395, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "/home/kimv/work/data_engineering/sbsa_archive/.venv/lib/python3.9/site-packages/click/core.py", line 754, in invoke
        return __callback(*args, **kwargs)
      File "/home/kimv/work/data_engineering/sbsa_archive/.venv/lib/python3.9/site-packages/s3_credentials/cli.py", line 314, in create
        s3.create_bucket(Bucket=bucket, **kwargs)
      File "/home/kimv/work/data_engineering/sbsa_archive/.venv/lib/python3.9/site-packages/botocore/client.py", line 391, in _api_call
        return self._make_api_call(operation_name, kwargs)
      File "/home/kimv/work/data_engineering/sbsa_archive/.venv/lib/python3.9/site-packages/botocore/client.py", line 719, in _make_api_call
        raise error_class(parsed_response, operation_name)
    botocore.exceptions.ClientError: An error occurred (ExpiredToken) when calling the CreateBucket operation: The provided token has expired.
    
    bug 
    opened by kimvanwyk 3
Releases
  • 0.14(Sep 15, 2022)

    • s3-credentials put-objects command (docs) for uploading more than one file or directory to an S3 bucket at a time. #68
    • s3-credentials get-objects command (docs) for downloading multiple files from an S3 bucket. #78
  • 0.13(Aug 12, 2022)

    • Documentation now lives on a dedicated documentation website: https://s3-credentials.readthedocs.io/ #71
    • s3-credentials create ... --website --create-bucket now creates an S3 bucket that is configured to act as a website, with index.html as the index page and error.html as the page used for any errors. #21
    • s3-credentials list-buckets --details now returns the bucket region and the URL to the website, if it is configured to act as a website. #77
    • Fixed a bug where list-bucket would return an error if the bucket (or specified --prefix) was empty. #76
  • 0.12.1(Aug 1, 2022)

    • Using the --policy or --statement options now implies --user-permissions-boundary=none. Previously it was easy to use these options to accidentally create credentials that did not work as expected since they would have a default permissions boundary that locked them down to only being able to access S3. #74
    • The s3-credentials.AmazonS3FullAccess role created by this tool in order to issue temporary credentials previously used the default MaxSessionDuration value of 3600, preventing it from creating credentials that could last more than an hour. This has been increased to 12 hours. See this issue comment for instructions on fixing your existing role if this bug is affecting your account. #75
  • 0.12(Jun 30, 2022)

    • New --statement JSON option for both the s3-credentials create and s3-credentials policy commands, allowing one or more additional policy statements (provided as JSON strings) to be added to the generated IAM policy. #72
  • 0.11(May 1, 2022)

  • 0.10(Jan 25, 2022)

  • 0.9(Jan 18, 2022)

    See Weeknotes: s3-credentials prefix and Datasette 0.60 for extra background on these new features.

    • New --prefix myprefix/ option to s3-credentials create, which configures the credentials to only allow access to keys within the S3 bucket that start with the provided prefix. #12
    • s3-credentials policy --prefix myprefix/ command for generating and outputting a JSON policy that is restricted to the specified prefix. You can see examples in the README.
    • New list-bucket command for listing the contents of a specified bucket. #28
    • The list-users, list-buckets and list-bucket commands all default to outputting an indented JSON array - previously they output indented JSON objects separated by newlines. The --nl option can be used to return newline-delimited single line JSON objects. The new --csv and --tsv options can be used to return CSV or TSV output. #48
  • 0.8(Dec 7, 2021)

    • s3-credentials create my-bucket --public option for creating public buckets, which allow anyone with knowledge of a filename to download that file. This works by attaching this public bucket policy to the bucket after it is created. #42
    • s3-credentials put-object now sets the Content-Type header on the uploaded object. The type is detected based on the filename, or can be specified using the new --content-type option. #43
    • s3-credentials policy my-bucket --public-bucket outputs the public bucket policy that would be attached to a bucket of that name. #44
  • 0.7(Nov 30, 2021)

    • s3-credentials policy command, to output the JSON policy that would be used directly to the terminal. #37
    • README now includes examples of the three different policies. #36
    • s3-credentials put-object and s3-credentials get-object commands for uploading and downloading files from an S3 bucket. #38
  • 0.6(Nov 18, 2021)

    • create --dry-run option outputs a summary of changes that would be made to an AWS account without applying them. #35
    • s3-credentials whoami command now uses sts.GetCallerIdentity, which means it works with any kind of access key. #33
  • 0.5(Nov 11, 2021)

    • New s3-credentials create --duration 20m option. This creates temporary credentials that only last for the specified time, by creating a role and using STS.AssumeRole() to retrieve credentials. #27
    • Redesigned read-only and read-write policies to no longer use wildcards and instead explicitly list allowed actions. #15
    • Commands now accept an optional --auth file/path.json option to specify a JSON or INI file containing the credentials to use. #29
    • New s3-credentials list-buckets --details option to include ACLs, website configuration and bucket policies. #22
    • New s3-credentials create --format ini option for outputting INI format instead of JSON. #17
    • Now uses botocore.stub in some of the tests - thanks, Niko Abeler. #16
    • Added integration tests, run using pytest --integration, which exercise the tool against an AWS account and delete any created resources afterwards. #30
    • Added tips section to the README, including how to access CloudTrail
  • 0.4(Nov 4, 2021)

    • New options for authenticating with AWS: --access-key, --secret-key, --session-token, --endpoint-url. #2
    • Various improvements to JSON policies - thanks, @jdub! #11
    • --policy filename.json option for specifying a custom JSON policy. #14
  • 0.3(Nov 3, 2021)

  • 0.2(Nov 3, 2021)

  • 0.1(Nov 3, 2021)

    • Initial release
    • s3-credentials create name-of-bucket creates a new user with read-write access only to the specified S3 bucket, creates an access key for that user and outputs it to the console. #3
    • s3-credentials list-users lists all of the users for the current AWS account. #4
    • s3-credentials list-user-policies lists inline policies for the specified users, or all users. #5
    • s3-credentials whoami shows information about the currently authenticated user.
Owner
Simon Willison