Vericopy - This Python script provides various usage modes for secure local file copying and hashing.

Overview

Vericopy

This Python script provides various usage modes for secure local file copying and hashing. Hash data is captured and logged for each path before and after copying, both to confirm that transfers were successful and to enable optional copying modes such as merge and exchange, in which only files that are not already present somewhere in the destination are transferred. Multiprocessing is used for hashing to optimise performance.

Prerequisites

Python 3.7 or later is required, with the tqdm progress bar module installed (pip install tqdm).

This script has been tested using Python versions 3.7, 3.8.5, and 3.9.5, running on macOS 11.4, Ubuntu 20.04, and Windows 10 20H2.

Script usage

The script has six usage modes, outlined below. Generated hash data will be written to a log folder, created by default as vericopy_logs in the folder the script is run from.

Multiprocessing will be used to hash data on each of the source and destination paths simultaneously. Performance will be maximised if the provided paths are on separate drives.
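
As an illustration of the parallel-hashing approach, the sketch below hashes each provided path in its own worker process. This is a minimal, hypothetical sketch only; the helper names and structure are assumptions rather than the script's actual code:

import hashlib
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

def hash_file(path, algorithm="sha1"):
    # Read in chunks so large files do not need to fit in memory
    digest = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return str(path), digest.hexdigest()

def hash_tree(root):
    # Recursively hash every file under 'root' and return {path: hash}
    return dict(hash_file(p) for p in Path(root).rglob("*") if p.is_file())

if __name__ == "__main__":
    paths = ["source_folder", "destination_folder"]  # illustrative paths
    # One worker per path, so paths on separate drives can be read simultaneously
    with ProcessPoolExecutor(max_workers=len(paths)) as pool:
        results = dict(zip(paths, pool.map(hash_tree, paths)))
        print(results)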

Script-wide flags can be viewed using: python3 vericopy.py --help, and are as follows:

  • -l or --log: opt to write the status messages displayed in the console to a log file in the log folder.
  • -d or --debug: display debug messages and write these to a dedicated debug log file in the log folder.
  • --logfolder [str]: folder to write logs to (if not specified, the default of vericopy_logs will be used).
  • --algorithms [md5, sha1, sha256]: SHA1 is the hashing algorithm used by default; this flag allows for other algorithm(s) to be specified, from options 'md5', 'sha1' and 'sha256'.
  • --ignore-dotfiles: files and folders beginning with . will not be processed (such folders are typically hidden and contain system/settings data).
  • --ignore-windows-volume-folders: folders named System Volume Information and $RECYCLE.BIN will not be processed (such folders typically contain hidden Windows system information).

Usage example incorporating flags:

python3 vericopy.py --algorithms md5 sha1 sha256 -l --logfolder alternate_log_folder --ignore-dotfiles --ignore-windows-volume-folders [mode and mode arguments]

Usage modes

Copy

Files in one or more source paths will be copied to a destination folder, with hash verification performed before and after the copy to confirm the transfer completed successfully. Files within subfolders are included for this and the other modes (i.e. copies are recursive). Absolute or relative paths may be provided; all generated metadata will record absolute paths.

Syntax:

python3 vericopy.py copy source_path [source_path ...] destination_path [flags]

Usage example:

python3 vericopy.py copy gov.archives.arc.1155023 TourTheInternationalSpaceStation space_videos

The above will copy all files in folders gov.archives.arc.1155023 and TourTheInternationalSpaceStation to folder space_videos, with file hashing performed before and after the copy to confirm success.
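
The verify-after-copy idea for a single file can be pictured as follows. This is a minimal sketch with hypothetical helper names; the script itself operates over whole folder trees and logs the hash data rather than raising exceptions:

import hashlib
import shutil
from pathlib import Path

def sha1_of(path):
    digest = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

def copy_and_verify(source, destination_folder):
    # Hash the source, copy it, then hash the new file and compare
    source = Path(source)
    destination_folder = Path(destination_folder)
    destination_folder.mkdir(parents=True, exist_ok=True)
    before = sha1_of(source)
    target = destination_folder / source.name
    shutil.copy2(source, target)
    if sha1_of(target) != before:
        raise IOError(f"Hash mismatch after copying {source} to {target}")
    return target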

The available flags can be viewed using: python3 vericopy.py copy --help, and are as follows:

  • --only-hash-transferred-files: instead of performing a full before-and-after hash of the destination folder, only the files transferred will be hashed and verified. This is useful if files are already present in the destination folder and do not need to be captured in the generated metadata.
  • --hash-files [str ... str]: one or more (space separated) paths to pre-computed hash files, to avoid re-generation of hash data for the source(s) and destination. These hash files may be generated using the hash mode, or will be present in the log folder from any previous executions of copy, move, or merge.
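
For example, to reuse a previously generated hash file and only hash the files actually transferred (the hash file name below is illustrative):

python3 vericopy.py copy gov.archives.arc.1155023 space_videos --only-hash-transferred-files --hash-files previous_hashes.txt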

Move

The usage and available flags for move are identical to copy, except that files will be moved (i.e. deleted from the source after the transfer is verified) instead of copied. The 'move' is achieved by first copying files from the source(s) to the destination, then running a hash check to confirm the transfer was successful; once confirmed, the files are deleted from the source(s).
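
Usage example (mirroring the copy example above):

python3 vericopy.py move gov.archives.arc.1155023 TourTheInternationalSpaceStation space_videos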

Merge

Hash data will be generated and compared between source(s) and destination, and only those files that are not present somewhere within the destination path will be copied. When using this mode, data will be transferred to a [timestamp]_merge subfolder in the destination. This subfolder contains:

  • A folder for each of the source(s), within which are the unique files that were not previously present in the destination.
  • A folder merge_hash_references, which contains [source_file_filename].references.txt files for each of the files on the source(s) that were already present on the destination before the transfer. Each of these files contains a list of where copies of the original source file may be found on the destination (as this is a hash-based reference, the filenames on the destination may differ from those on the source).

This means that a file is created on the destination for every file in the source(s), but for those files that already existed somewhere within the destination, this file will be a small .references.txt file rather than the original data. Inclusion of these reference files allows for a 'view' of the original source(s) that may be reconstructed later if desired (provided that no data is deleted on the destination in the meantime).
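
The per-file decision can be sketched roughly as follows. This is illustrative only; the helper names, arguments, and the exact layout of the .references.txt files are assumptions rather than the script's actual code:

import shutil
from pathlib import Path

def merge_file(source_file, source_hash, destination_hashes, merge_folder):
    # destination_hashes maps each hash to the destination paths already holding that data
    source_file = Path(source_file)
    merge_folder = Path(merge_folder)
    if source_hash not in destination_hashes:
        # Unique file: copy the real data into the merge subfolder
        merge_folder.mkdir(parents=True, exist_ok=True)
        shutil.copy2(source_file, merge_folder / source_file.name)
    else:
        # Duplicate: record where existing copies live instead of copying the data again
        references_folder = merge_folder / "merge_hash_references"
        references_folder.mkdir(parents=True, exist_ok=True)
        reference_file = references_folder / (source_file.name + ".references.txt")
        reference_file.write_text("\n".join(str(p) for p in destination_hashes[source_hash]))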

Syntax:

python3 vericopy.py merge source_path [source_path ...] destination_path [flags]

Usage example:

python3 vericopy.py merge gov.archives.arc.1155023 TourTheInternationalSpaceStation space_videos

The above will merge all files in folders gov.archives.arc.1155023 and TourTheInternationalSpaceStation into space_videos - the files in gov.archives.arc.1155023 and TourTheInternationalSpaceStation that were not present in space_videos will be fully copied, while .references.txt files will be created in reference to files that were already present. This data will be placed in a [timestamp]_merge subfolder in space_videos.

The available flags are the same as for the copy mode listed above.

Exchange

As with merge, hash data will be generated and compared between two or more sources. Each source will then receive a copy of any files stored within the other source(s) that were not previously present somewhere within that source. When using this mode, data will be transferred to a [timestamp]_exchange subfolder in each source, with contents and structure as per the equivalent in the merge mode above.

Syntax:

python3 vericopy.py exchange source_path source_path [source_path ...] [flags]

Usage example:

python3 vericopy.py exchange gov.archives.arc.1155023 TourTheInternationalSpaceStation space_videos

The above will exchange all files in folders gov.archives.arc.1155023, TourTheInternationalSpaceStation, and space_videos - each folder will receive copies of files that were previously only present within the other two folders. This data will be placed in a [timestamp]_exchange subfolder in each folder.

The available flags are the same as for the copy mode listed above.

Hash

Hash data will be generated for the provided source(s), with no subsequent file transfers. This is useful for pre-computing hash data for one of the modes above. This mode also allows for hashing of files within .zip and .7z archives if specified using the flags detailed below.

Syntax:

python3 vericopy.py hash source_path [source_path ...] [flags]

Usage example:

python3 vericopy.py hash gov.archives.arc.1155023 TourTheInternationalSpaceStation

The above will hash all files in folders gov.archives.arc.1155023 and TourTheInternationalSpaceStation, with results placed in the log folder.

The available flags can be viewed using: python3 vericopy.py hash --help, and are as follows:

  • -o [str] or --output [str]: a file path to output unified hash metadata to, taken from all of the source(s). If not specified, per-source hash metadata will still be available in the log folder.
  • -a or --archives: attempt to hash files within archive files. At this time only .zip and .7z files are supported, and encrypted .zip files and all .7z files require the 7z commandline tool to be installed and accessible on the system PATH.
  • --only-archive-contents: attempt to hash files within archive files, but do not hash the archive file itself. E.g. the files contained within sample.zip would be hashed, but sample.zip itself would not be hashed.
  • -c [str] or --cache [str]: if attempting to hash files within archive files, any encrypted .zip files or .7z files will need to be temporarily extracted in order to hash the files within. The default cache folder location is vericopy_cache within the script folder; a custom cache folder may be specified with this flag. As this folder will be deleted at the end of script execution, for safety it must not exist before the script is run.
  • -p [str] or --password [str]: if attempting to hash files within encrypted archive files, specify the password to attempt with this flag. Unencrypted archive files will still extract even if the password is set. Note that terminal history on your system may reveal this password to other users.
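
Usage example incorporating flags (the folder names, password, and output path below are illustrative):

python3 vericopy.py hash my_archive_folder -a -c temp_hash_cache -p examplepassword -o combined_hashes.txt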

Compare

Hash files generated while transferring data or using the hash mode may be compared with the compare mode, to determine if all hash values in a 'source' hash file can be found within a 'destination' hash file.

Syntax:

python3 vericopy.py compare source_output_path destination_output_path [flags]

Usage example:

python3 vericopy.py compare hashes-original.txt hashes-updated.txt

The above will compare the hashes within hashes-original.txt and hashes-updated.txt, checking that all hashes in hashes-original.txt are present at least once in hashes-updated.txt.
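
Conceptually, this is a set-membership check on the hash values. The sketch below assumes, purely for illustration, a simple whitespace-separated 'hash path' line format; the script's actual hash-file format may differ:

def load_hashes(hash_file):
    # Assumes one 'hash  path' entry per line (illustrative format only)
    with open(hash_file, encoding="utf-8") as f:
        return {line.split()[0] for line in f if line.strip()}

missing = load_hashes("hashes-original.txt") - load_hashes("hashes-updated.txt")
print(f"{len(missing)} source hashes were not found in the destination hash file")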

The available flags can be viewed using: python3 vericopy.py compare --help, and are as follows:

  • -c or --compare-filepaths: by default, only hash values are compared, not file paths. This behaviour allows for files to have been moved between sources, and for duplicate files to have been deleted. Using this flag will additionally check that all file paths in the source file can be found in the destination file.
  • -m [str] or --missing-files-output [str]: any missing hashes will be reported in command line output; these may be consolidated to a missing file list in an output file path using this flag.
  • --copy-missing-dest [str]: any hash values found to be missing will have a copy of the source file copied to a folder specified using this flag. Note that the files must still be present at the original source locations for this to work.
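
Usage example incorporating flags (the output paths below are illustrative):

python3 vericopy.py compare hashes-original.txt hashes-updated.txt -m missing_hashes.txt --copy-missing-dest recovered_files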

Privacy, log data, and uninstallation

This script runs entirely locally; no third party services are communicated with.

Log data and hash metadata are stored by default in folder vericopy_logs (created in the folder that the script is executed in). Debug logs capture system details (including Python version and operating system), command line arguments used, and events occurring during script execution. Archive passwords are not recorded in these logs, but will be retained on the local system in terminal history.

Full uninstallation can be achieved by:

  1. Deleting the script and any other downloaded files (e.g. the readme and license).
  2. Deleting the logs folder (vericopy_logs by default).
  3. If desired, removing records of archive passwords stored in terminal history.
  4. If desired, removing the tqdm library and Python runtime.

Known issues

  1. A Python bug may cause issues in Windows when trying to quit the script using CTRL+C. A SIGBREAK can be sent instead using CTRL+BREAK, or by invoking the on-screen keyboard (WIN+R, then run osk.exe) and using its Ctrl+ScrLk keys.

Contributing

If you would like to contribute, please fork the repository and use a feature branch. Pull requests are warmly welcome.

Licensing

The code in this project is licensed under the MIT License.


Releases

  • v1.0.3 (Jul 30, 2021): additional logging and empty-input-file checking added to compare mode, and the final warning message will no longer advise checking log files if they are not being created (i.e. -l or -d flag not set).
  • v1.0.2 (Jul 27, 2021): paths to pipes and sockets will now be skipped (with a user warning generated), and a filename fix for logs created if the Unix root directory (i.e. '/') is hashed.
  • v1.0.1 (Jul 26, 2021): exception handling and user warnings added for file/folder permission errors, and for file-not-found errors (if files are deleted in the time between the path file-listing scan and the deleted file being reached in the hash queue).
  • v1.0.0 (Jul 24, 2021)
