An automated Udemy coupon scraper that scrapes coupons and auto-posts the results to a Blogspot blog

Overview

Autoscraper-n-blogger

An automated Udemy coupon scraper that scrapes coupons, auto-posts the results to a Blogspot blog, and sends a notification via a Telegram bot

Requirements

  • A Blogger account and its blog ID
  • A Telegram Bot API key and your Telegram chat ID (used to notify you and send the results)

    Setup

    Before setup, place your Telegram bot API key, Telegram chat ID, and Blogger blog ID in the config.json file!
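    A config.json might look like the sketch below; the key names shown here are assumptions, so check the repository's sample config for the exact field names:

    ```json
    {
      "telegram_bot_api_key": "123456:ABC-your-bot-token",
      "telegram_chat_id": "123456789",
      "blogger_blog_id": "1234567890123456789"
    }
    ```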

    How do I get my Telegram bot API key? - Telegram-bot api-key

    How do I get my Telegram chat ID? - Telegram chat-id

    pip3 install -r requirements.txt

    Once all the requirements are installed, set up easyblogger with the command below:

    easyblogger --blogid get

    To find your blog ID, refer to https://subinsb.com/how-to-find-blogger-blog-id

    This will open a browser window that you can use to authenticate with your Google account.

    Note: authenticate with the Google account associated with your Blogger account.

    You're all set to use easyblogger!

    python3 auto.py

    The script above scrapes all the Udemy courses and coupons, posts them to Blogger, and sends a copy of the scraped results via the Telegram bot!
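    The Telegram notification step can be sketched with the Bot API's sendMessage endpoint. This is a minimal illustration, not the code in auto.py; the function names here are made up for the example:

    ```python
    import json
    import urllib.request

    SEND_MESSAGE_URL = "https://api.telegram.org/bot{token}/sendMessage"

    def build_payload(chat_id, text):
        """Build the JSON body for the Bot API sendMessage call."""
        return {"chat_id": chat_id, "text": text}

    def notify(token, chat_id, text):
        """Send the scraped results to the given chat via the Telegram Bot API."""
        req = urllib.request.Request(
            SEND_MESSAGE_URL.format(token=token),
            data=json.dumps(build_payload(chat_id, text)).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        # The API responds with a JSON object; "ok" is True on success.
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)
    ```

    The token and chat ID would come from config.json, matching the setup step above.
    
    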

    This can be hosted on a cloud server to run automatically every day!
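    One way to schedule the daily run is a cron entry on the server; the paths below are placeholders for your own Python interpreter and repository location:

    ```
    # Run the scraper every day at 09:00 server time
    0 9 * * * /usr/bin/python3 /path/to/Autoscraper-n-blogger/auto.py
    ```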

    Demo

    Autoscraper.mp4
  • Owner
    GOKUL A.P
    Pythonist | Web Application Pentester | CTF player | Automation developer