Ransomware leak site monitoring

Overview

RansomWatch


RansomWatch is a ransomware leak site monitoring tool. It scrapes the entries on various ransomware leak sites, stores the data in a SQLite database, and sends notifications via Slack or Discord when a new victim shows up or an existing victim is removed.

Configuration

In config_vol/, copy config.sample.yaml to config.yaml and add the following (a hypothetical sketch of the finished file follows this list):

  • Leak site URLs. I decided not to make this list public, to avoid giving these sites even more notoriety, so if you have them, add them in. If not, this tool isn't for you.
  • Notification destinations. RansomWatch currently supports notifying via the following:
    • Slack: Follow these instructions to add a new app to your Slack workspace and add the webhook URL to the config.
    • Discord: Follow these instructions to add a new app to your Discord server and add the webhook URL to the config.
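
Below is a hypothetical sketch of what the finished config.yaml might look like. The authoritative key names live in config.sample.yaml; the layout here is an assumption based on the options above, and every URL is a placeholder:

sites:
  conti: http://<conti-onion-address>.onion/        # leak site URLs, intentionally not distributed
notifications:
  slack:
    my-workspace:                                   # arbitrary label for your Slack workspace
      webhook: https://hooks.slack.com/services/XXX/YYY/ZZZ
  discord:
    my-server:                                      # arbitrary label for your Discord server
      webhook: https://discord.com/api/webhooks/XXX/YYY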

Additionally, there are a few environment variables you may need to set:

  • RW_DB_PATH: Path for the SQLite database to use
  • RW_CONFIG_PATH: Path to the config.yaml file

These are both set in the provided docker-compose.yml.

Usage

This is intended to be run in Docker via a cronjob at whatever interval you choose.

First, build the image: docker-compose build app

Then, add it to your crontab. Example crontab entry (running every 8 hours):

0 */8 * * * cd /path/to/ransomwatch && docker-compose up --abort-on-container-exit

If you'd prefer, you can use the image published on Docker Hub (captaingeech/ransomwatch) instead, with a docker-compose.yml that looks something like this:

version: "3"

services:
  app:
    image: captaingeech/ransomwatch:latest
    depends_on:
      - proxy
    volumes:
      - ./db_vol:/db
      - ./config_vol:/config
    environment:
      PYTHONUNBUFFERED: 1
      RW_DB_PATH: /db/ransomwatch.db
      RW_CONFIG_PATH: /config/config.yaml

  proxy:
    image: captaingeech/tor-proxy:latest

This can also be run via the command line, but that requires you to have your own Tor proxy (with the control service) running. Example execution:

$ RW_DB_PATH=./db_vol/ransomwatch.db RW_CONFIG_PATH=./config_vol/config.yaml python3 src/ransomwatch.py

Example Slack Messages

Slack notification for new victim

Slack notification for removed victim

Slack notification for site down

Slack notification for an error

The messages sent to Discord are very similar in style and identical in content.

Leak Site Implementations

The following leak sites are (planned to be) supported:

  • Conti
  • MAZE
  • Egregor
  • Sodinokibi/REvil
  • DoppelPaymer (captcha; probably won't be supported for a while)
  • NetWalker
  • Pysa
  • Avaddon
  • DarkSide
  • CL0P
  • Nefilim
  • Mount Locker
  • Suncrypt
  • Everest
  • Ragnarok
  • Ragnar_Locker
  • BABUK LOCKER
  • Pay2Key
  • Cuba
  • RansomEXX
  • Ranzy Locker
  • Astro Team
  • LV

If there are other leak sites you want implemented, feel free to open a PR or DM me on Twitter: @captainGeech42.

Comments
  • Pysa timestamp format change

    Traceback (most recent call last):
      File "/app/ransomwatch.py", line 66, in main
        s.scrape_victims()
      File "/app/sites/pysa.py", line 38, in scrape_victims
        published_dt = datetime.strptime(
      File "/usr/local/lib/python3.9/_strptime.py", line 568, in _strptime_datetime
        tt, fraction, gmtoff_fraction = _strptime(data_string, format)
      File "/usr/local/lib/python3.9/_strptime.py", line 349, in _strptime
        raise ValueError("time data %r does not match format %r" %
    ValueError: time data '22/03/21' does not match format '%m/%d/%y'
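
    The strict strptime format string breaks whenever the site changes its date layout. The v1.2 release notes mention that date parsing was moved to a library, and later logs show dateparser in use; here is a minimal sketch of that approach (the DATE_ORDER setting is an assumption about how Pysa writes its dates):

    import logging

    import dateparser

    raw = "22/03/21"
    # let the library infer the format instead of hardcoding %m/%d/%y
    published_dt = dateparser.parse(raw, settings={"DATE_ORDER": "DMY"})
    if published_dt is None:
        logging.warning("couldn't parse timestamp: %s", raw)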
    
    opened by captainGeech42 4
  • Something broken with REvil

    app_1    | 2021/04/20 18:36:25 [ERROR] Got an error while scraping REvil, notifying
    app_1    | 2021/04/20 18:36:25 [ERROR] Error sending Discord notification (400): {"embeds": ["0"]}
    app_1    | 2021/04/20 18:36:25 [ERROR] Failed to send error notification to Discord guild "test-discord"
    app_1    | 2021/04/20 18:36:25 [ERROR] Traceback (most recent call last):
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 699, in urlopen
    app_1    |     httplib_response = self._make_request(
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 445, in _make_request
    app_1    |     six.raise_from(e, None)
    app_1    |   File "<string>", line 3, in raise_from
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 440, in _make_request
    app_1    |     httplib_response = conn.getresponse()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 1347, in getresponse
    app_1    |     response.begin()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 307, in begin
    app_1    |     version, status, reason = self._read_status()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 276, in _read_status
    app_1    |     raise RemoteDisconnected("Remote end closed connection without"
    app_1    | http.client.RemoteDisconnected: Remote end closed connection without response
    app_1    |
    app_1    | During handling of the above exception, another exception occurred:
    app_1    |
    app_1    | Traceback (most recent call last):
    app_1    |   File "/usr/local/lib/python3.9/site-packages/requests/adapters.py", line 439, in send
    app_1    |     resp = conn.urlopen(
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 755, in urlopen
    app_1    |     retries = retries.increment(
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/util/retry.py", line 532, in increment
    app_1    |     raise six.reraise(type(error), error, _stacktrace)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/packages/six.py", line 734, in reraise
    app_1    |     raise value.with_traceback(tb)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 699, in urlopen
    app_1    |     httplib_response = self._make_request(
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 445, in _make_request
    app_1    |     six.raise_from(e, None)
    app_1    |   File "<string>", line 3, in raise_from
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 440, in _make_request
    app_1    |     httplib_response = conn.getresponse()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 1347, in getresponse
    app_1    |     response.begin()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 307, in begin
    app_1    |     version, status, reason = self._read_status()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 276, in _read_status
    app_1    |     raise RemoteDisconnected("Remote end closed connection without"
    app_1    | urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
    app_1    |
    app_1    | During handling of the above exception, another exception occurred:
    app_1    |
    app_1    | Traceback (most recent call last):
    app_1    |   File "/app/ransomwatch.py", line 52, in main
    app_1    |     s.scrape_victims()
    app_1    |   File "/app/sites/revil.py", line 62, in scrape_victims
    app_1    |     r = p.get(f"{self.url}?page={i}", headers=self.headers)
    app_1    |   File "/app/net/proxy.py", line 101, in get
    app_1    |     return self.session.get(*args, **kwargs)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 555, in get
    app_1    |     return self.request('GET', url, **kwargs)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 542, in request
    app_1    |     resp = self.send(prep, **send_kwargs)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 655, in send
    app_1    |     r = adapter.send(request, **kwargs)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/requests/adapters.py", line 498, in send
    app_1    |     raise ConnectionError(err, request=request)
    app_1    | requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
    app_1    | 2021/04/20 18:36:25 [INFO] Finished all sites, exiting
    

    not sure what's going on. similar error w/ slack
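
    For reference, a minimal well-formed payload for the standard Discord webhook API looks like the sketch below (the webhook URL is a placeholder). A 400 response that points at "embeds" generally means an embed object failed validation, for example by exceeding Discord's 4096-character embed description limit, so truncating long tracebacks before embedding them is a reasonable guard:

    import requests

    webhook_url = "https://discord.com/api/webhooks/XXX/YYY"  # placeholder
    tb = "...long traceback..."
    payload = {
        "embeds": [{
            "title": "RansomWatch error",
            "description": tb[:4000],  # stay under Discord's embed limits
        }]
    }
    requests.post(webhook_url, json=payload).raise_for_status()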

    bug 
    opened by captainGeech42 3
  • Conti - Scraping Error

    Describe the bug

    Error Message Below:

    Traceback (most recent call last):
      File "/app/ransomwatch.py", line 66, in main
        s.scrape_victims()
      File "/app/sites/conti.py", line 56, in scrape_victims
        last_li = page_list.find_all("li")[-1]
    AttributeError: 'NoneType' object has no attribute 'find_all'

    To Reproduce: This error has happened several times over the last 24 hours while ransomwatch has been running on a cron job.

    Expected behavior: Parse the contents of the Conti site with no errors, or have additional error handling built in to handle this failure gracefully (see the sketch below).
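
    A minimal sketch of such a guard, assuming BeautifulSoup as in the traceback (the page_list and last_li names come from the traceback; the selector itself is hypothetical):

    import logging

    from bs4 import BeautifulSoup

    html = "<html>...</html>"  # the fetched Conti page body
    soup = BeautifulSoup(html, "html.parser")
    page_list = soup.find("ul", class_="pagination")  # hypothetical selector
    if page_list is None:
        # the site layout changed, or an interstitial/captcha page was served
        logging.warning("Conti: pagination list not found, skipping")
    else:
        last_li = page_list.find_all("li")[-1]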


    Environment

    • OS: Ubuntu 20.04
    • How you are running it: Docker via cron job (the README's recommended setup)

    opened by GRIT-5ynax 2
  • Dockerhub image out of date

    Running the Dockerhub image results in

    app_1 | Traceback (most recent call last):
    app_1 |   File "/app/ransomwatch.py", line 98, in <module>
    app_1 |     NotificationManager.send_error_notification(f"Non-scraping failure", tb, fatal=True)
    app_1 |   File "/app/notifications/manager.py", line 30, in send_error_notification
    app_1 |     for workspace, params in Config["slack"].items():
    app_1 | KeyError: 'slack'

    It works if the image is built locally.

    bug 
    opened by nhova 2
  • New sites

    • [x] Ranzy
    • [x] Astro
    • [x] Pay2Key
    • [x] Cuba
    • [x] RansomEXX
    • [x] Mount Locker
    • [x] Ragnarok
    • [ ] Ragnar Locker
    • [x] Suncrypt
    • [x] Everest
    • [x] Nefilim
    • [x] CL0P
    • [x] Pysa
    opened by captainGeech42 2
  • New Scraper: BLACKMATTER // ARVIN // EL COMETA // LORENZ // XING // LOCKBIT

    This pull request adds support for BLACKMATTER, ARVIN, EL COMETA, LORENZ, XING, LOCKBIT.

    • [x] The URL for the site is nowhere in the git history
    • [x] The site is added to config.sample.yaml
    • [x] There aren't any debug logging statements/etc.
    • [x] The data going into the DB is properly parsed and is accurate
    enhancement 
    opened by x-originating-ip 1
  • cl0p scraper broken

    Describe the bug: The Cl0p scraper is out of date.

    Logs

    Traceback (most recent call last):
      File "/app/ransomwatch.py", line 66, in main
        s.scrape_victims()
      File "/app/sites/cl0p.py", line 21, in scrape_victims
        victim_list = soup.find("div", class_="collapse-section").find_all("li")
    AttributeError: 'NoneType' object has no attribute 'find_all'
    

    Should probably just update this to the v3 site as well.

    bug 
    opened by captainGeech42 1
  • Enhance pysa datetimes processing (#50)

    Describe the changes

    Added some logic to pysa.py to process the datetimes more robustly. Exception handling has also been added to avoid crashing the script.

    Related issue(s)

    #50

    How was it tested?

    Before: scraping failed at some point if pysa was defined in the YAML config file (see the related issue).

    Now:

    • [x] scraping works
    • [x] dates look plausible (since we don't know the true values, we can only say they appear reasonable)
    • [x] the script no longer crashes, thanks to the try/except handling.
    opened by biligonzales 1
  • Handle missing notifications element in the yaml config file (#52)

    Describe the changes

    Added minor changes to manager.py so that it no longer fails when notifications are not configured. Basically, the presence of the notifications element in the YAML config is now tested (a sketch of the idea follows).
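
    A minimal sketch of that guard (the notifications element's exact shape is an assumption; Config is the project's loaded YAML config per the tracebacks):

    def slack_targets(config: dict) -> dict:
        # tolerate a missing or empty notifications element
        return (config.get("notifications") or {}).get("slack", {})

    # with no notifications configured, the loop body simply never runs
    for workspace, params in slack_targets({"notifications": None}).items():
        print(workspace, params)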

    Related issue(s)

    #52

    How was it tested?

    • [x] Docker started with an empty notifications element
    • [x] Docker started without any notifications element
    opened by biligonzales 1
  • Unable to run without configured notifications

    The notifications part of the config.yaml file currently needs to be present and configured to avoid errors at runtime. It would be great to be able to leave the notifications part empty (or even omit it from the YAML config).

    opened by biligonzales 1
  • Conti: scraper fixed (#73)

    Describe the changes

    Fixed the Conti scraper to use the newsList JavaScript object, because the HTML elements were no longer available.

    Related issue(s)

    This fixes issue #73

    How was it tested?

    1. Add Conti url to config.yaml
    2. Run docker-compose build app
    3. Run docker-compose up --abort-on-container-exit
    4. Conti results are pushed again in the database

    Checklist for a new scraper (delete if N/A)

    • [x] The URL for the site is nowhere in the git history
    • [x] The site is added to config.sample.yaml
    • [x] There aren't any debug logging statements/etc. (there was one logging.debug there, I left it as it was)
    • [x] The data going into the DB is properly parsed and is accurate
    opened by biligonzales 0
  • Lockbit scraper fixed (now uses playwright) #74

    Describe the changes

    Lockbit 2.0 now uses a DDoS protection mechanism, so a plain HTTP GET no longer works.

    As a workaround, I have integrated Microsoft's Playwright library, which performs the request as if a real browser made it (a sketch of the approach follows the summary below).

    Summary of the changes:

    1. lockbit.py: replaced the use of requests by playwright
    2. requirements.txt: added playwright
    3. Dockerfile: added playwright chromium support as well as required libraries.

    I have also upgraded the base image at the top of the Dockerfile from python3.9-buster to python3.10-bullseye.
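
    A sketch of the new page-retrieval step, assuming Playwright's sync API routed through the Tor proxy container from docker-compose (the URL and proxy address are placeholders; the real code lives in lockbit.py):

    from playwright.sync_api import sync_playwright

    with sync_playwright() as p:
        # route the headless browser through the Tor SOCKS proxy
        browser = p.chromium.launch(proxy={"server": "socks5://proxy:9050"})
        page = browser.new_page()
        # waiting for network idle gives the DDoS-protection JS time to finish
        page.goto("http://<lockbit-onion>.onion/", wait_until="networkidle")
        html = page.content()
        browser.close()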

    Related issue(s)

    It fixes Issue #74

    Note that the scraping engine for lockbit has been left untouched, as it still works correctly. Only the web page retrieval method has been altered.

    How was it tested?

    • [x] docker-compose build app
    • [x] docker-compose up --abort-on-container-exit
    • [x] Checked that Lockbit entries have been inserted into the database
    opened by biligonzales 3
  • new victims monitoring is broken, alert only when sites are down

    Describe the bug: The app doesn't alert when new victims are added to the ransom sites (we noticed that new victims are being added on some of the sites). We only get alerts when the sites are down.

    Expected behavior: The app alerts when new victims are added to the ransom sites being monitored.

    Logs

    Starting ransomwatch_proxy_1 ... done
    Starting ransomwatch_app_1 ... done
    Attaching to ransomwatch_proxy_1, ransomwatch_app_1
    proxy_1 | Feb 07 14:50:31.819 [notice] Tor 0.4.5.7 running on Linux with Libevent 2.1.12-stable, OpenSSL 1.1.1i, Zlib 1.2.11, Liblzma 5.2.5, Libzstd 1.4.5 and Unknown N/A as libc.
    proxy_1 | Feb 07 14:50:31.822 [notice] Tor can't help you if you use it wrong! Learn how to be safe at https://www.torproject.org/download/download#warning
    proxy_1 | Feb 07 14:50:31.822 [notice] Read configuration file "/etc/tor/torrc".
    proxy_1 | Feb 07 14:50:31.825 [notice] Opening Socks listener on 0.0.0.0:9050
    proxy_1 | Feb 07 14:50:31.825 [notice] Opened Socks listener connection (ready) on 0.0.0.0:9050
    proxy_1 | Feb 07 14:50:31.825 [notice] Opening Control listener on 0.0.0.0:9051
    proxy_1 | Feb 07 14:50:31.825 [notice] Opened Control listener connection (ready) on 0.0.0.0:9051
    app_1 | 2022/02/07 14:50:33 [INFO] Initializing
    app_1 | 2022/02/07 14:50:33 [INFO] Found 30 sites
    app_1 | 2022/02/07 14:50:33 [INFO] Starting process for Avaddon
    app_1 | 2022/02/07 14:50:33 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:50:33 [INFO] Starting process for Conti
    app_1 | 2022/02/07 14:50:38 [INFO] Scraping victims
    app_1 | 2022/02/07 14:51:48 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:51:48 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:51:48 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:51:48 [INFO] Finished Conti
    app_1 | 2022/02/07 14:51:48 [INFO] Starting process for DarkSide
    app_1 | 2022/02/07 14:51:48 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:51:48 [INFO] Starting process for REvil
    app_1 | 2022/02/07 14:51:48 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:51:48 [INFO] Starting process for Babuk
    app_1 | 2022/02/07 14:51:50 [INFO] Scraping victims
    app_1 | 2022/02/07 14:51:51 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:51:51 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:51:51 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:51:51 [INFO] Finished Babuk
    app_1 | 2022/02/07 14:51:51 [INFO] Starting process for Ranzy
    app_1 | 2022/02/07 14:51:51 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:51:51 [INFO] Starting process for Astro
    app_1 | 2022/02/07 14:51:51 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:51:51 [INFO] Starting process for Pay2Key
    app_1 | 2022/02/07 14:51:53 [INFO] Scraping victims
    app_1 | 2022/02/07 14:51:54 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:51:54 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:51:54 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:51:54 [INFO] Finished Pay2Key
    app_1 | 2022/02/07 14:51:54 [INFO] Starting process for Cuba
    app_1 | 2022/02/07 14:51:57 [INFO] This is the first scrape for Cuba, no victim notifications will be sent
    app_1 | 2022/02/07 14:51:57 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:08 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:08 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:08 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:08 [INFO] Finished Cuba
    app_1 | 2022/02/07 14:52:08 [INFO] Starting process for RansomEXX
    app_1 | 2022/02/07 14:52:10 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:13 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:13 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:13 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:13 [INFO] Finished RansomEXX
    app_1 | 2022/02/07 14:52:13 [INFO] Starting process for Mount
    app_1 | 2022/02/07 14:52:13 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:13 [INFO] Starting process for Ragnarok
    app_1 | 2022/02/07 14:52:13 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:13 [INFO] Starting process for Ragnar
    app_1 | 2022/02/07 14:52:13 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:13 [INFO] Starting process for Suncrypt
    app_1 | 2022/02/07 14:52:15 [INFO] This is the first scrape for Suncrypt, no victim notifications will be sent
    app_1 | 2022/02/07 14:52:15 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:17 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:17 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:17 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:17 [INFO] Finished Suncrypt
    app_1 | 2022/02/07 14:52:17 [INFO] Starting process for Everest
    app_1 | 2022/02/07 14:52:17 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:17 [INFO] Starting process for Nefilim
    app_1 | 2022/02/07 14:52:17 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:17 [INFO] Starting process for Cl0p
    app_1 | 2022/02/07 14:52:17 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:17 [INFO] Starting process for Pysa
    app_1 | 2022/02/07 14:52:19 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:23 [WARNING] couldn't parse timestamp: 00/00/00
    app_1 | /usr/local/lib/python3.9/site-packages/dateparser/date_parser.py:35: PytzUsageWarning: The localize method is no longer necessary, as this time zone supports the fold attribute (PEP 495). For more details on migrating to a PEP 495-compliant implementation, see https://pytz-deprecation-shim.readthedocs.io/en/latest/migration.html
    app_1 |   date_obj = stz.localize(date_obj)
    app_1 | 2022/02/07 14:52:24 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:24 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:24 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:24 [INFO] Finished Pysa
    app_1 | 2022/02/07 14:52:24 [INFO] Starting process for Hive
    app_1 | 2022/02/07 14:52:24 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:24 [INFO] Starting process for Lockbit
    app_1 | 2022/02/07 14:52:24 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:24 [INFO] Starting process for Xing
    app_1 | 2022/02/07 14:52:24 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:24 [INFO] Starting process for Lorenz
    app_1 | 2022/02/07 14:52:26 [INFO] This is the first scrape for Lorenz, no victim notifications will be sent
    app_1 | 2022/02/07 14:52:26 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:27 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:27 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:27 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:27 [INFO] Finished Lorenz
    app_1 | 2022/02/07 14:52:27 [INFO] Starting process for ElCometa
    app_1 | 2022/02/07 14:52:27 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:27 [INFO] Starting process for Arvin
    app_1 | 2022/02/07 14:52:30 [INFO] This is the first scrape for Arvin, no victim notifications will be sent
    app_1 | 2022/02/07 14:52:30 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:33 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:33 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:33 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:33 [INFO] Finished Arvin
    app_1 | 2022/02/07 14:52:33 [INFO] Starting process for Blackmatter
    app_1 | 2022/02/07 14:52:33 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:33 [INFO] Starting process for Avoslocker
    app_1 | 2022/02/07 14:52:33 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:33 [INFO] Starting process for LV
    app_1 | 2022/02/07 14:52:35 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:37 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:37 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:37 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:37 [INFO] Finished LV
    app_1 | 2022/02/07 14:52:37 [INFO] Starting process for Marketo
    app_1 | 2022/02/07 14:52:37 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:37 [INFO] Starting process for LockData
    app_1 | 2022/02/07 14:52:40 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:42 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:42 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:42 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:42 [INFO] Finished LockData
    app_1 | 2022/02/07 14:52:42 [INFO] Starting process for Rook
    app_1 | 2022/02/07 14:52:42 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:42 [INFO] Finished all sites, exiting

    Environment

    • OS: Ubuntu 20.04.3
    • How you are running it: Docker with cronjob
    opened by Deventual 1
  • Victim removal detection doesn't work properly when onion changes

    Victim removal detection currently uses the full URL, which includes the onion domain. One side effect of this is that whenever the onion address for a site changes, all of that site's victims are considered removed and then new on the next scrape, which is problematic.

    Change this to just use the URI + site ID.
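
    A minimal sketch of that identity change (the names here are hypothetical; the point is just to drop the onion host from the key):

    from urllib.parse import urlparse

    def victim_key(site_id: int, victim_url: str) -> str:
        # keep only the URI path so a rotated onion domain doesn't make
        # every victim look removed and then re-added on the next scrape
        return f"{site_id}:{urlparse(victim_url).path}"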

    bug 
    opened by captainGeech42 0
  • LOCKBIT 2.0 Support

    Site Info (no URL): LOCKBIT 2.0 was released some time ago. It should be confirmed whether the scraper works with the new site; otherwise, the module should be rewritten.

    Is the site currently online? Yes

    opened by wersas1 5
Releases (v1.2)
  • v1.2 (Dec 4, 2021)

    This release fixes a few different bugs on the following scrapers:

    • Ragnar
    • Lorenz
    • Pysa
    • Arvin

    What's Changed

    • fixed #79 by @captainGeech42 in https://github.com/captainGeech42/ransomwatch/pull/80
    • fixed #76 by @captainGeech42 in https://github.com/captainGeech42/ransomwatch/pull/81
    • fixed #77, changed dateparsing to use lib by @captainGeech42 in https://github.com/captainGeech42/ransomwatch/pull/82
    • changed arvin date parsing to use lib (fixes #75) by @captainGeech42 in https://github.com/captainGeech42/ransomwatch/pull/83

    Full Changelog: https://github.com/captainGeech42/ransomwatch/compare/v1.1...v1.2

  • v1.1 (Dec 2, 2021)

    Ransomwatch v1.1

    This release adds support for many new sites, and has a critical security update. For details on the security update, see here.

    Supported Sites

    This release supports the following shame sites:

    • Conti
    • Sodinokibi/REvil
    • Pysa
    • Avaddon
    • DarkSide
    • CL0P
    • Nefilim
    • Mount Locker
    • Suncrypt
    • Everest
    • Ragnarok
    • Ragnar_Locker
    • BABUK LOCKER
    • Pay2Key
    • Cuba
    • RansomEXX
    • Ranzy Locker
    • Astro Team
    • BlackMatter
    • Arvin
    • El_Cometa
    • Lorenz
    • Xing
    • Lockbit
    • AvosLocker
    • LV
    • Marketo
    • Lockdata
  • v1.0 (Apr 18, 2021)

    v1.0 Ransomwatch Release

    This initial version of Ransomwatch supports the following sites:

    • Conti
    • REvil/Sodinokibi
    • Avaddon
    • DarkSide

    This release supports notifying via:

    • Slack Webhooks

    More sites/notification capabilities will be added over time. However, this release has been tested in a production capacity and should be suitable for starting collection.

    If you find any bugs or run across any problems, please open an issue to help improve Ransomwatch.

Owner

Zander Work (@osusec / @OSU-SOC)