Python HDFS client

Overview

Because the world needs yet another way to talk to HDFS from Python.

Usage

This library provides a Python client for WebHDFS. NameNode HA is supported by passing in both NameNodes. Responses are returned as convenient Python classes, and any failed operation raises a subclass of HdfsException matching the Java exception.

Example usage:

>>> fs = pyhdfs.HdfsClient(hosts='nn1.example.com:50070,nn2.example.com:50070', user_name='someone')
>>> fs.list_status('/')
[FileStatus(pathSuffix='benchmarks', permission='777', type='DIRECTORY', ...), FileStatus(...), ...]
>>> fs.listdir('/')
['benchmarks', 'hbase', 'solr', 'tmp', 'user', 'var']
>>> fs.mkdirs('/fruit/x/y')
True
>>> fs.create('/fruit/apple', 'delicious')
>>> fs.append('/fruit/apple', ' food')
>>> with contextlib.closing(fs.open('/fruit/apple')) as f:
...     f.read()
...
b'delicious food'
>>> fs.get_file_status('/fruit/apple')
FileStatus(length=14, owner='someone', type='FILE', ...)
>>> fs.get_file_status('/fruit/apple').owner
'someone'
>>> fs.get_content_summary('/fruit')
ContentSummary(directoryCount=3, fileCount=1, length=14, quota=-1, spaceConsumed=14, spaceQuota=-1)
>>> list(fs.walk('/fruit'))
[('/fruit', ['x'], ['apple']), ('/fruit/x', ['y'], []), ('/fruit/x/y', [], [])]
>>> fs.exists('/fruit/apple')
True
>>> fs.delete('/fruit')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File ".../pyhdfs.py", line 525, in delete
  ...
pyhdfs.HdfsPathIsNotEmptyDirectoryException: `/fruit is non empty': Directory is not empty
>>> fs.delete('/fruit', recursive=True)
True
>>> fs.exists('/fruit/apple')
False
>>> issubclass(pyhdfs.HdfsFileNotFoundException, pyhdfs.HdfsIOException)
True

The methods and return values generally map directly to WebHDFS endpoints. The client also provides convenience methods that mimic Python os methods and HDFS CLI commands (e.g. walk and copy_to_local).
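
For example, the copy helpers in action (the local and remote paths here are illustrative, not from the project's docs):

>>> fs.copy_from_local('report.csv', '/tmp/report.csv', overwrite=True)
>>> fs.copy_to_local('/tmp/report.csv', 'report_copy.csv')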

pyhdfs logs all HDFS actions at the INFO level, so turning on INFO level logging will give you a debug record for your application.
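
A minimal way to surface those records, using only the standard library:

import logging

# Route INFO-level records (including pyhdfs's per-action logs) to stderr.
logging.basicConfig(level=logging.INFO)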

For more information, see the full API docs.

Installing

pip install pyhdfs

Python 3 is required.

Development testing

First run install-hdfs.sh x.y.z, which will download, extract, and run the HDFS NN/DN processes in the current directory. (Replace x.y.z with a real version.) Then run the following commands. Note they will create and delete hdfs://localhost/tmp/pyhdfs_test.

Commands:

python3 -m venv env
source env/bin/activate
pip install -e .
pip install -r dev_requirements.txt
pytest

Comments
  • client should return some info when successfully creating a file

    For example, the HDFS server may return a response with headers like this:

    HTTP/1.1 201 Created
    Location: webhdfs://<HOST>:<PORT>/<PATH>
    Content-Length: 0
    

    I want to get the Location from the response headers; however, client.create does not return anything.
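
    A possible workaround sketch (not part of the pyhdfs API; the host, user, and data below are hypothetical) is to drive the WebHDFS CREATE two-step with requests directly so the final response headers stay visible:

    import requests

    host = 'nn1.example.com:50070'  # hypothetical NameNode address
    path = '/fruit/apple'

    # Step 1: the NameNode answers CREATE with a 307 redirect to a DataNode.
    r1 = requests.put(
        'http://%s/webhdfs/v1%s' % (host, path),
        params={'op': 'CREATE', 'user.name': 'someone'},
        allow_redirects=False,
    )

    # Step 2: PUT the data to the DataNode; its 201 Created response
    # carries the Location header this issue asks about.
    r2 = requests.put(r1.headers['Location'], data=b'delicious')
    print(r2.status_code, r2.headers.get('Location'))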

    opened by cosven 7
  • Write error

    Hello, mkdirs and listdir work fine, but create doesn't:

    fs.create('/fruit/apple', 'delicious')
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/root/miniconda2/lib/python2.7/site-packages/pyhdfs.py", line 426, in create
        metadata_response.headers['location'], data=data, **self._requests_kwargs)
      File "/root/miniconda2/lib/python2.7/site-packages/requests/api.py", line 126, in put
        return request('put', url, data=data, **kwargs)
      File "/root/miniconda2/lib/python2.7/site-packages/requests/api.py", line 58, in request
        return session.request(method=method, url=url, **kwargs)
      File "/root/miniconda2/lib/python2.7/site-packages/requests/sessions.py", line 512, in request
        resp = self.send(prep, **send_kwargs)
      File "/root/miniconda2/lib/python2.7/site-packages/requests/sessions.py", line 622, in send
        r = adapter.send(request, **kwargs)
      File "/root/miniconda2/lib/python2.7/site-packages/requests/adapters.py", line 513, in send
        raise ConnectionError(e, request=request)
    requests.exceptions.ConnectionError: HTTPConnectionPool(host='1566bb80c4dc', port=50075): Max retries exceeded with url: /webhdfs/v1/fruit/apple?op=CREATE&user.name=hdfs&namenoderpcaddress=localhost:8020&overwrite=false (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f644f364510>: Failed to establish a new connection: [Errno -2] Name or service not known',))
    
    opened by albertoRamon 4
  • requests.exceptions.ConnectionError: ('Connection aborted.', ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host.', None, 10054, None))

    Traceback (most recent call last):
      File "D:\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 601, in urlopen
        chunked=chunked)
      File "D:\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 357, in _make_request
        conn.request(method, url, **httplib_request_kw)
      File "D:\Anaconda3\lib\http\client.py", line 1239, in request
        self._send_request(method, url, body, headers, encode_chunked)
      File "D:\Anaconda3\lib\http\client.py", line 1285, in _send_request
        self.endheaders(body, encode_chunked=encode_chunked)
      File "D:\Anaconda3\lib\http\client.py", line 1234, in endheaders
        self._send_output(message_body, encode_chunked=encode_chunked)
      File "D:\Anaconda3\lib\http\client.py", line 1065, in _send_output
        self.send(chunk)
      File "D:\Anaconda3\lib\http\client.py", line 986, in send
        self.sock.sendall(data)
    ConnectionResetError: [WinError 10054] An existing connection was forcibly closed by the remote host.

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "D:\Anaconda3\lib\site-packages\requests\adapters.py", line 440, in send
        timeout=timeout
      File "D:\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 639, in urlopen
        _stacktrace=sys.exc_info()[2])
      File "D:\Anaconda3\lib\site-packages\urllib3\util\retry.py", line 357, in increment
        raise six.reraise(type(error), error, _stacktrace)
      File "D:\Anaconda3\lib\site-packages\urllib3\packages\six.py", line 685, in reraise
        raise value.with_traceback(tb)
      File "D:\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 601, in urlopen
        chunked=chunked)
      File "D:\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 357, in _make_request
        conn.request(method, url, **httplib_request_kw)
      File "D:\Anaconda3\lib\http\client.py", line 1239, in request
        self._send_request(method, url, body, headers, encode_chunked)
      File "D:\Anaconda3\lib\http\client.py", line 1285, in _send_request
        self.endheaders(body, encode_chunked=encode_chunked)
      File "D:\Anaconda3\lib\http\client.py", line 1234, in endheaders
        self._send_output(message_body, encode_chunked=encode_chunked)
      File "D:\Anaconda3\lib\http\client.py", line 1065, in _send_output
        self.send(chunk)
      File "D:\Anaconda3\lib\http\client.py", line 986, in send
        self.sock.sendall(data)
    urllib3.exceptions.ProtocolError: ('Connection aborted.', ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host.', None, 10054, None))

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "D:\workspace\phdfs\check_wrf.py", line 144, in <module>
        fs.copy_from_local(parname,"/test/fcst/china/10d_arwpost_sta/near/" + wrflisttime.format("YYYYMMDD") + "/" + parname,overwrite = True)
      File "D:\Anaconda3\lib\site-packages\pyhdfs.py", line 753, in copy_from_local
        self.create(dest, f, **kwargs)
      File "D:\Anaconda3\lib\site-packages\pyhdfs.py", line 426, in create
        metadata_response.headers['location'], data=data, **self._requests_kwargs)
      File "D:\Anaconda3\lib\site-packages\requests\api.py", line 126, in put
        return request('put', url, data=data, **kwargs)
      File "D:\Anaconda3\lib\site-packages\requests\api.py", line 58, in request
        return session.request(method=method, url=url, **kwargs)
      File "D:\Anaconda3\lib\site-packages\requests\sessions.py", line 508, in request
        resp = self.send(prep, **send_kwargs)
      File "D:\Anaconda3\lib\site-packages\requests\sessions.py", line 618, in send
        r = adapter.send(request, **kwargs)
      File "D:\Anaconda3\lib\site-packages\requests\adapters.py", line 490, in send
        raise ConnectionError(err, request=request)
    requests.exceptions.ConnectionError: ('Connection aborted.', ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host.', None, 10054, None))

    opened by Georege 4
  • BUG: Chinese characters can't be copied to HDFS

    UnicodeEncodeError: 'latin-1' codec can't encode characters in position 2-3: Body ('张三') is not valid Latin-1. Use body.encode('utf-8') if you want to send it encoded in UTF-8.
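
    A workaround sketch, following the error message's own suggestion (the path below is hypothetical): requests encodes str bodies as Latin-1, so pass bytes instead:

    fs.create('/user/test/name.txt', '张三'.encode('utf-8'))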

    opened by yiershanxll 3
  • Help me, please. The second run of the function in the script results in an abnormal result

    I am a rookie~~!!

    The following code:

    import pyhdfs

    list_info = [{"tenant": "coco", "hive_path": "/user/open_001_dev", "ftp_path": "/files/prov/001"},
                 {"tenant": "lili", "hive_path": "/user/open_002_dev", "ftp_path": "/files/prov/002"}]
    result = 0
    client = pyhdfs.HdfsClient(hosts="10.173.5.18:9000", user_name="hdfs", timeout=10, max_tries=3,
                               randomize_hosts=False)

    def hive_content_size():
        global result
        for item in range(2):
            if "hive_path" in list_info[item]:
                print(client.get_content_summary(list_info[item]["hive_path"]))

    hive_content_size()
    

    The result of the first loop iteration is output normally, but the second iteration fails.

    The error report is below:

    ContentSummary(directoryCount=1258, fileCount=3773, length=141829751002, quota=4000000, spaceConsumed=425489253006, spaceQuota=659706976665600)
    
    Failed to reach to 10.173.5.18:9000 (attempt 3/3)
    Traceback (most recent call last):
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/connectionpool.py", line 445, in _make_request
        six.raise_from(e, None)
      File "<string>", line 3, in raise_from
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/connectionpool.py", line 440, in _make_request
        httplib_response = conn.getresponse()
      File "/usr/local/python/lib/python3.9/http/client.py", line 1347, in getresponse
        response.begin()
      File "/usr/local/python/lib/python3.9/http/client.py", line 307, in begin
        version, status, reason = self._read_status()
      File "/usr/local/python/lib/python3.9/http/client.py", line 268, in _read_status
        line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
      File "/usr/local/python/lib/python3.9/socket.py", line 704, in readinto
        return self._sock.recv_into(b)
    socket.timeout: timed out
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/usr/local/python/lib/python3.9/site-packages/requests-2.25.1-py3.9.egg/requests/adapters.py", line 439, in send
        resp = conn.urlopen(
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/connectionpool.py", line 755, in urlopen
        retries = retries.increment(
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/util/retry.py", line 532, in increment
        raise six.reraise(type(error), error, _stacktrace)
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/packages/six.py", line 735, in reraise
        raise value
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/connectionpool.py", line 699, in urlopen
        httplib_response = self._make_request(
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/connectionpool.py", line 447, in _make_request
        self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
      File "/usr/local/python/lib/python3.9/site-packages/urllib3-1.26.4-py3.9.egg/urllib3/connectionpool.py", line 336, in _raise_timeout
        raise ReadTimeoutError(
    urllib3.exceptions.ReadTimeoutError: HTTPConnectionPool(host='10.173.5.18', port=9000): Read timed out. (read timeout=10)
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/usr/local/python/lib/python3.9/site-packages/PyHDFS-0.3.1-py3.9.egg/pyhdfs/__init__.py", line 418, in _request
        response = self._requests_session.request(
      File "/usr/local/python/lib/python3.9/site-packages/requests-2.25.1-py3.9.egg/requests/sessions.py", line 542, in request
        resp = self.send(prep, **send_kwargs)
      File "/usr/local/python/lib/python3.9/site-packages/requests-2.25.1-py3.9.egg/requests/sessions.py", line 655, in send
        r = adapter.send(request, **kwargs)
      File "/usr/local/python/lib/python3.9/site-packages/requests-2.25.1-py3.9.egg/requests/adapters.py", line 529, in send
        raise ReadTimeout(e, request=request)
    requests.exceptions.ReadTimeout: HTTPConnectionPool(host='10.162.3.171', port=19888): Read timed out. (read timeout=10)
    Traceback (most recent call last):
      File "/home/hadoop/shay/monthly_report/test01.py", line 24, in <module>
        print(hive_content_size())
      File "/home/hadoop/shay/monthly_report/test01.py", line 22, in hive_content_size
        print(client.get_content_summary(list_info[item]["hive_path"]))
      File "/usr/local/python/lib/python3.9/site-packages/PyHDFS-0.3.1-py3.9.egg/pyhdfs/__init__.py", line 633, in get_content_summary
      File "/usr/local/python/lib/python3.9/site-packages/PyHDFS-0.3.1-py3.9.egg/pyhdfs/__init__.py", line 450, in _get
      File "/usr/local/python/lib/python3.9/site-packages/PyHDFS-0.3.1-py3.9.egg/pyhdfs/__init__.py", line 442, in _request
    pyhdfs.HdfsNoServerException: Could not use any of the given hosts
    

    Asking for help~~!!!

    opened by qwe55982 2
  • HdfsFileAlreadyExistsException is not implemented?

    Hi! Thanks for your great work. I have noticed that some exceptions are not implemented right now.

    For example: if I try to upload a file to a path that already exists, Python raises ConnectionError instead of HdfsFileAlreadyExistsException (a workaround sketch follows the traceback below).

    The error message is as follows:

    Traceback (most recent call last):
      File "test_pyhdfs.py", line 12, in <module>
        fs.create('/xxx/xxx/images/test.png', data=file)
      File "/home/chiuhongyu/workplace/xxx/venv/lib/python3.6/site-packages/pyhdfs/__init__.py", line 504, in create
        metadata_response.headers['location'], data=data, **self._requests_kwargs)
      File "/home/chiuhongyu/workplace/xxx/venv/lib/python3.6/site-packages/requests/api.py", line 132, in put
        return request('put', url, data=data, **kwargs)
      File "/home/chiuhongyu/workplace/xxx/venv/lib/python3.6/site-packages/requests/api.py", line 61, in request
        return session.request(method=method, url=url, **kwargs)
      File "/home/chiuhongyu/workplace/xxx/venv/lib/python3.6/site-packages/requests/sessions.py", line 542, in request
        resp = self.send(prep, **send_kwargs)
      File "/home/chiuhongyu/workplace/xxx/venv/lib/python3.6/site-packages/requests/sessions.py", line 655, in send
        r = adapter.send(request, **kwargs)
      File "/home/chiuhongyu/workplace/xxx/venv/lib/python3.6/site-packages/requests/adapters.py", line 498, in send
        raise ConnectionError(err, request=request)
    requests.exceptions.ConnectionError: ('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer'))
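
    A workaround sketch in the meantime (the path and file object come from the snippet above and are illustrative): check the destination first with the documented exists method:

    if not fs.exists('/xxx/xxx/images/test.png'):
        fs.create('/xxx/xxx/images/test.png', data=file)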
    
    opened by james77777778 1
  • Support customized WEBHDFS_PATH

    In the latest version of pyhdfs, the WebHDFS path is set as the constant '/webhdfs/v1'. This works well in most scenarios, but users may need a customized HTTP URL; for example, they may run their own WebHDFS service using Pylon and access their RESTful server with a customized URL pattern like http://<HOST>:<HTTP_PORT>/webhdfs/api/v2/<PATH>?op=... (a stopgap sketch follows).
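
    A possible stopgap sketch, assuming the constant is the module-level WEBHDFS_PATH this report names (the host and prefix below are hypothetical):

    import pyhdfs

    # Point every request at a custom gateway prefix before building the client.
    pyhdfs.WEBHDFS_PATH = '/webhdfs/api/v2'
    fs = pyhdfs.HdfsClient(hosts='gateway.example.com:8080')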

    opened by SparkSnail 1
  • TypeError: __new__() got an unexpected keyword argument 'storagePolicy'

    I am using Hadoop 2.6 (with Docker: sudo docker run -i -t sequenceiq/hadoop-docker:2.6.0 /etc/bootstrap.sh -bash).

    When I use PyHDFS to call client.list_status, I get this error:

    Traceback (most recent call last):
      File "testhdfs.py", line 3, in <module>
        print(client.list_status('/'))
      File "...testenv/lib/python3.4/site-packages/pyhdfs.py", line 428, in list_status
        _json(self._get(path, 'LISTSTATUS', **kwargs))['FileStatuses']['FileStatus']
      File "...testenv/lib/python3.4/site-packages/pyhdfs.py", line 427, in <listcomp>
        FileStatus(**item) for item in
    TypeError: __new__() got an unexpected keyword argument 'storagePolicy'
    

    The code:

    from pyhdfs import HdfsClient
    client = HdfsClient(hosts='172.17.0.2:50070')
    print(client.list_status('/'))
    

    This issue is caused by the JSON from the server having an extra property, storagePolicy; adding it to pyhdfs.py fixes this. But I want to know whether this property is a standard property of HDFS/WebHDFS.
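
    A minimal sketch of the failure mode, assuming FileStatus is a namedtuple with a fixed field list (the fields below are abbreviated, not the real ones):

    from collections import namedtuple

    # Fixed fields mean any unexpected JSON key fails at construction time.
    FileStatus = namedtuple('FileStatus', ['length', 'owner', 'type'])
    FileStatus(length=0, owner='x', type='FILE', storagePolicy=0)
    # TypeError: __new__() got an unexpected keyword argument 'storagePolicy'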

    bug 
    opened by robberphex 1
  • why response assert not empty

    In pyhdfs.py, line 424

    assert not metadata_response.content
    

    In my client, I get this response body when uploading files:

    b'<html>\r\n<head><title>307 Temporary Redirect</title></head>\r\n<body bgcolor="white">\r\n<center><h1>307 Temporary Redirect</h1></center>\r\n<hr><center>nginx/1.13.8</center>\r\n</body>\r\n</html>\r\n'
    

    This response does not mean the upload failed, and I can successfully upload my files when I delete this line. Why was this assertion added? Could you please help me figure out this problem?

    opened by SparkSnail 0
  • Support setting webhdfs_path

    In the latest version of pyhdfs, the WebHDFS path is set as the constant '/webhdfs/v1'. This works well in most scenarios, but users may need a customized HTTP URL; for example, they may run their own WebHDFS service using Pylon and access their RESTful server with a customized URL pattern like http://<HOST>:<HTTP_PORT>/webhdfs/api/v2/<PATH>?op=...

    opened by SparkSnail 0
  • Let pyhdfs visit HDFS in a Kerberos environment

    When HDFS requires Kerberos authentication, pyhdfs cannot access HDFS, so maybe you should add authentication support to pyhdfs. In fact, pyhdfs calls the requests module when it talks to HDFS, so authentication information could be added there (see the sketch below).
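
    A possible workaround sketch, assuming the third-party requests-kerberos package is installed and that HdfsClient forwards requests_kwargs to every requests call (as the tracebacks elsewhere on this page suggest; the host below is hypothetical):

    import pyhdfs
    from requests_kerberos import HTTPKerberosAuth, OPTIONAL

    # Attach a Kerberos/SPNEGO auth handler to every WebHDFS request.
    fs = pyhdfs.HdfsClient(
        hosts='nn1.example.com:50070',
        requests_kwargs={'auth': HTTPKerberosAuth(mutual_authentication=OPTIONAL)},
    )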

    opened by LuckyNemo 0
  • got a TypeError while appending to a file

    File "/usr/local/lib/python3.6/site-packages/pyhdfs/__init__.py", line 520, in append path, 'APPEND', expected_status=HTTPStatus.TEMPORARY_REDIRECT, **kwargs) File "/usr/local/lib/python3.6/site-packages/pyhdfs/__init__.py", line 466, in _post return self._request('post', path, op, expected_status, **kwargs) File "/usr/local/lib/python3.6/site-packages/pyhdfs/__init__.py", line 431, in _request _check_response(response, expected_status) File "/usr/local/lib/python3.6/site-packages/pyhdfs/__init__.py", line 933, in _check_response remote_exception['message'] = exception_name + ' - ' + remote_exception['message'] TypeError: must be str, not NoneType

    opened by BingoZ 0
  • can't parse JSON with unprintable characters

    If a weird non-UTF-8 file name is created in HDFS, then the client fails because it can't interpret the response as a valid JSON string.

    e.g. it's possible to put a Ctrl-R character in the file name

    bug 
    opened by jingw 0
Releases (v0.3.1)