
Commit 957215e (1 parent: ef8ef7a)
Author: ThreePlanetsSoftware
Updated README for readability

File tree: 1 file changed (+3, -1 lines)


README.md

Lines changed: 3 additions & 1 deletion
@@ -8,11 +8,13 @@ This script mines SQLite databases for hidden gems that might be overlooked. It
This script searches identified SQLite databases to find files that are hidden in blob objects within the database. The `fun_stuff.pl` file controls the regular expressions that will be matched to assert that a given blob is a given file type. Currently it only supports file types whose magic number starts at offset 0.
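To illustrate the kind of offset-0 magic-number matching described above, here is a minimal sketch in Python (the actual tool keeps its regex table in `fun_stuff.pl`; the table below is a small hypothetical subset for illustration, not the script's real list):

```python
import re

# Hypothetical subset of a magic-number table like the one fun_stuff.pl
# defines. Each pattern is anchored with ^ so it only matches when the
# magic number starts at offset 0 of the blob.
MAGIC_NUMBERS = {
    "gzip": re.compile(rb"^\x1f\x8b"),
    "png":  re.compile(rb"^\x89PNG\r\n\x1a\n"),
    "jpeg": re.compile(rb"^\xff\xd8\xff"),
}

def identify_blob(blob: bytes):
    """Return the first file type whose magic number matches at offset 0."""
    for file_type, pattern in MAGIC_NUMBERS.items():
        if pattern.match(blob):
            return file_type
    return None
```

A blob whose magic number appears anywhere other than offset 0 would not match, which mirrors the limitation noted above.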

## Usage
-### Base
+### Individual Files
This script is run by perl on a command line. The easiest usage is to examine a single SQLite database by running `perl sqlite_miner.pl --file=<path to SQLite database>`. The script will create a folder in the output folder named `YYYY_MM_DD_<database_name>`. For example, running this on NoteStore.sqlite today will generate `2017_10_21_NoteStore.sqlite`. Importantly, at the beginning of the run, the script copies the target SQLite database into this folder and works from the copy, not the original. That folder will also contain a `results.csv` file listing, line by line, each blob identified as potentially being a known file type. If the `--export` option is set, the folder will also contain an export folder holding all of the files that were recognized. If the `--decompress` option is set, the copied database will be updated with any decompressed data that is identified.
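The output-folder convention described above (a date-stamped directory, with all work done on a copy of the database) can be sketched like this. This is an illustrative Python reconstruction, not the script's Perl code; the `prepare_output` name and `output_root` parameter are assumptions for the sketch:

```python
import datetime
import pathlib
import shutil

def prepare_output(db_path: str, output_root: str = "output") -> pathlib.Path:
    """Create output/YYYY_MM_DD_<database_name>/ and copy the database
    into it, so all mining happens on the copy, never the original."""
    db = pathlib.Path(db_path)
    stamp = datetime.date.today().strftime("%Y_%m_%d")
    out_dir = pathlib.Path(output_root) / f"{stamp}_{db.name}"
    out_dir.mkdir(parents=True, exist_ok=True)
    shutil.copy2(db, out_dir / db.name)  # work on the copy from here on
    return out_dir
```

Working from a copy is a sensible forensic default: the original evidence file is never modified, even when `--decompress` rewrites blobs.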

+### Entire Directories
To cover more ground, the script can recursively examine an entire directory with `perl sqlite_miner.pl --dir=<path to directory>`. The script will walk through every file under that directory, check each file's header to see if it is SQLite format 3, and process each identified SQLite file exactly as if it had been passed via the `--file=` option above. The only difference is that all results from the entire folder are stored under one output directory named `YYYY_MM_DD_<folder_name>`. For example, running this on /home/test/backup_unpacked today will generate `2017_10_21_backup_unpacked/`. The `results.csv` will contain all results from the entire directory, but each specific database will still have its own output folder within the overall directory.
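The header check used to identify candidate databases during the walk can be sketched as follows. Every SQLite 3 database begins with the 16-byte string `SQLite format 3\0`; this Python sketch (the `find_sqlite_files` name is an assumption, not the script's own function) yields every file under a root that starts with it:

```python
import os

SQLITE_MAGIC = b"SQLite format 3\x00"  # first 16 bytes of every SQLite 3 file

def find_sqlite_files(root: str):
    """Recursively yield paths of files whose header is 'SQLite format 3'."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as fh:
                    if fh.read(16) == SQLITE_MAGIC:
                        yield path
            except OSError:
                continue  # unreadable file; skip it and keep walking
```

Checking the header rather than the file extension means renamed or extension-less databases are still found, which matters for mobile backups.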

+### Android Backup Files
The script can also open an Android backup without unpacking it first, with `perl sqlite_miner.pl --android-backup=<path to backup>`. The script will open the Android backup, decompress the TAR portion to a temporary folder, then iteratively walk through the TAR file, exporting any files that have a SQLite format 3 header. From that point on, the script behaves the same as the `--dir=` option described above. When done, the script removes the temporary folder the TAR file was saved in, but be aware that for large, full backups this may temporarily require an additional 1GB+ of space (the same amount of space as if you had decompressed the file yourself in order to run the `--dir=` option against an actual folder). In addition, while this script attempts to limit memory usage, tar may run out of memory and crash on large files. If that occurs, decompress the file separately, then run this script with the `--dir=` option on the exported data. Finally, decompressing the Android backup and working through the TAR adds time to the script's processing. For example, on a not-very-powerful computer, a 1GB backup.ab file took roughly 45 seconds longer to run against the Android backup than against the already-exported files. That said, decompressing the files separately takes roughly as long.
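The "decompress the TAR portion" step relies on the layout of unencrypted `.ab` files: four text header lines (`ANDROID BACKUP`, a version number, a compression flag, and an encryption method), followed by the TAR stream, zlib-compressed when the flag is `1`. A minimal Python sketch of that extraction, assuming an unencrypted backup (the function name is hypothetical, and a real tool would decompress streamwise rather than in one shot to keep memory bounded, as the note above about tar memory use suggests):

```python
import zlib

def extract_backup_tar(ab_path: str) -> bytes:
    """Read an unencrypted .ab file and return the raw TAR bytes.

    Assumed layout: four text header lines -- 'ANDROID BACKUP', version,
    compression flag ('1' = zlib), encryption method -- then the payload.
    """
    with open(ab_path, "rb") as fh:
        if fh.readline().rstrip(b"\n") != b"ANDROID BACKUP":
            raise ValueError("not an Android backup file")
        fh.readline()                                      # version line
        compressed = fh.readline().rstrip(b"\n") == b"1"   # compression flag
        if fh.readline().rstrip(b"\n") != b"none":
            raise ValueError("encrypted backups are not handled here")
        payload = fh.read()
    # One-shot decompression for clarity; stream with zlib.decompressobj()
    # in a real tool to limit memory use on multi-GB backups.
    return zlib.decompress(payload) if compressed else payload
```

Once the TAR bytes are recovered, each member can be header-checked for `SQLite format 3` exactly as in the directory walk above.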

### Options
