README.md: 4 additions & 4 deletions
@@ -2,7 +2,7 @@
By: Jon Baumann, [Ciofeca Forensics](https://www.ciofecaforensics.com)

## About
-This script mines SQLite databases for hidden gems that might be overlooked. It identifies for the forensic examiner which databases, tables, and columns had which potential types of files within them.
+This script mines SQLite databases for hidden gems that might be overlooked. It identifies for the forensic examiner which databases, tables, and columns had which potential types of files within them. For an explanation of why I wrote this script, please see [this blog entry](https://www.ciofecaforensics.com/2017/10/23/mining-hidden-gems-with-sqlite-miner/).

## How It Works
This script searches identified SQLite databases to find files that are hidden in blob objects within the database. The `fun_stuff.pl` file controls the regular expressions that will be matched to assert that a given blob is a given file type. Currently it only supports file types whose magic number starts at offset 0.
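As a rough illustration of that matching step, the sketch below checks a blob against a few magic-number regular expressions anchored at offset 0. The `%file_signatures` hash, its entries, and the `identify_blob` helper are hypothetical examples for this README and do not reflect the actual contents of `fun_stuff.pl`:

```perl
#!/usr/bin/perl
# Hypothetical sketch: map file types to regexes that must match at offset 0
# of a blob, mirroring the magic-number idea described above.
use strict;
use warnings;

my %file_signatures = (
  'png'  => qr/^\x89PNG\x0d\x0a\x1a\x0a/,   # PNG magic number
  'gzip' => qr/^\x1f\x8b/,                  # gzip magic number
  'jpeg' => qr/^\xff\xd8\xff/,              # JPEG magic number
);

# Return the first file type whose signature matches the start of the blob
sub identify_blob {
  my ($blob) = @_;
  foreach my $type (sort keys %file_signatures) {
    return $type if $blob =~ $file_signatures{$type};
  }
  return undef;
}

# Example: a blob that begins with the gzip magic number
my $blob = "\x1f\x8b\x08\x00" . 'rest of compressed data';
my $type = identify_blob($blob);
print defined $type ? "Blob looks like: $type\n" : "No match\n";
```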
@@ -13,13 +13,13 @@ This script is run by perl on a command line. The easiest usage is to look at on
For a larger outlook, the script can be run to recursively look at an entire directory with `perl sqlite_miner.pl --dir=<path to directory>`. That will cause the script to recursively walk through every file under that directory, check the file's header to see if it is SQLite format 3, and run each identified SQLite file as if it had been done using the `--file=` option above. The only difference is all of the results from that entire folder will be stored under an output directory named `YYYY_MM_DD_<folder_name>`. For example, running this on /home/test/backup_unpacked today will generate `2017_10_21_backup_unpacked/`. The `results.csv` will contain all results from the entire directory, but each specific database will have its own output folder within the overall directory.
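For a sense of what that walk involves, here is a minimal, self-contained sketch (not the script's actual code) that recurses through a directory and reports files whose first 16 bytes match the SQLite format 3 header:

```perl
#!/usr/bin/perl
# Minimal sketch: recursively find files that start with the
# "SQLite format 3\0" header, as the --dir= mode described above does.
use strict;
use warnings;
use File::Find;

my $dir = shift @ARGV or die "Usage: $0 <directory>\n";

find(sub {
  return unless -f $_;
  open(my $fh, '<:raw', $_) or return;
  my $bytes_read = read($fh, my $header, 16);
  close($fh);
  return unless defined $bytes_read and $bytes_read == 16;
  print "SQLite database: $File::Find::name\n"
    if $header eq "SQLite format 3\0";
}, $dir);
```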
-The script can also be used to open an Adnroid backup without unpacking it first with `perl sqlite_miner.pl --android-backup=<path to backup>`. That will cause the script to open the Android backup, decompress the TAR portion to a temporary folder, then recurseivly walk through the TAR file, exporting any files that have a SQLite format 3 header. Then the script will behave the same as using the `--dir=` option described above. When done, the script will remove the temporary folder that the TAR file was saved in, but please be aware that for large, full, backups, this may result in an additional 1GB+ of space (the same amount of sapce as if you decompressed the file in order to run the `--dir=` option against an actual folder.
+The script can also be used to open an Android backup without unpacking it first with `perl sqlite_miner.pl --android-backup=<path to backup>`. That will cause the script to open the Android backup, decompress the TAR portion to a temporary folder, then iteratively walk through the TAR file, exporting any files that have a SQLite format 3 header. At that point the script will behave the same as using the `--dir=` option described above. When done, the script will remove the temporary folder that the TAR file was saved in, but please be aware that for large, full backups, this may result in an additional 1GB+ of space (the same amount of space as if you decompressed the file in order to run the `--dir=` option against an actual folder). In addition, while this script attempts to cut down on memory usage, for large files tar may run out of memory and crash. If that occurs, decompress the file separately, then run this script with the `--dir` option on the exported data. Finally, decompressing the Android backup and working through the TAR will add time to the script's processing. For example, a 1GB backup.ab file took roughly an additional 45 seconds to run, as compared to running against the already-exported files on a not-very-powerful computer. That said, decompressing the files separately takes roughly as long.
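As a rough sketch of what opening an Android backup can look like, the example below assumes an unencrypted, compressed backup whose payload, after the four-line text header, is a zlib-compressed tar stream; the file names are placeholders and this is not the script's actual implementation:

```perl
#!/usr/bin/perl
# Hedged sketch: strip the .ab text header and inflate the zlib payload
# out to a plain tar file that can then be walked for SQLite databases.
use strict;
use warnings;
use IO::Uncompress::Inflate qw(inflate $InflateError);

my ($backup, $out_tar) = ('backup.ab', 'backup.tar');   # placeholder names

open(my $in, '<:raw', $backup) or die "Cannot open $backup: $!\n";
my @header = map { scalar <$in> } 1 .. 4;   # magic, version, compression, encryption
chomp @header;
die "Not an Android backup\n"              unless $header[0] eq 'ANDROID BACKUP';
die "Encrypted backups not handled here\n" unless $header[3] eq 'none';

# Everything after the header is a zlib stream containing the tar data
inflate($in => $out_tar, BinModeOut => 1)
  or die "Inflate failed: $InflateError\n";
close($in);
print "Wrote $out_tar\n";
```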
### Options
The required options that are currently supported are (one of):
1. `--file=`: This option tells the script where to find the SQLite database you want to mine.
-2. `--dir=`: This option tells the script where to find a directory to recursively search for SQLite format 3 database files and to parse each of them as if the --fiile option was called on them above.
-3. `--android-backup=`: This option tells the script where to find ain Android backup file to recursively search for SQLite format 3 database files and to parse each of them as if the --fiile option was called on them above.
+2. `--dir=`: This option tells the script where to find a directory to recursively search for SQLite format 3 database files and to parse each of them as if the `--file` option was called on them above.
+3. `--android-backup=`: This option tells the script where to find an Android backup file to recursively search for SQLite format 3 database files and to parse each of them as if the `--file` option was called on them above.

The optional arguments are:
1. `--decompress`: This option tells the script to decompress any compressed data it knows it can unpack and replace the original data with the decompressed data to provide the examiner with a plaintext view. Note, this option drastically increases the run time as now the script is reading in the compressed object, decompressing it, and writing it back into the database.
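To make that behavior concrete, here is a hypothetical sketch that inflates gzip-compressed blobs and writes the plaintext back over the original cells in the working copy; the database, table, and column names are invented for illustration and are not part of the script:

```perl
#!/usr/bin/perl
# Hypothetical sketch: find gzip-compressed blobs in one table and replace
# them with their decompressed contents, as the --decompress idea describes.
use strict;
use warnings;
use DBI qw(:sql_types);
use IO::Uncompress::Gunzip qw(gunzip $GunzipError);

# Invented names: working_copy.sqlite, table "messages", column "data"
my $dbh = DBI->connect('dbi:SQLite:dbname=working_copy.sqlite', '', '',
                       { RaiseError => 1, AutoCommit => 1 });

my $rows = $dbh->selectall_arrayref(
  'SELECT rowid, data FROM messages WHERE data IS NOT NULL');
my $update = $dbh->prepare('UPDATE messages SET data = ? WHERE rowid = ?');

foreach my $row (@$rows) {
  my ($rowid, $blob) = @$row;
  next unless substr($blob, 0, 2) eq "\x1f\x8b";   # gzip magic number
  my $plain;
  unless (gunzip(\$blob => \$plain)) {
    warn "rowid $rowid: $GunzipError\n";
    next;
  }
  $update->bind_param(1, $plain, SQL_BLOB);   # keep the column a blob
  $update->bind_param(2, $rowid);
  $update->execute();
}

$dbh->disconnect();
```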
sqlite_miner.pl

print_log_line_if($log_file_handle, "Found no SQLite databases in Android backup.\n", !$tmp_tar_files_extracted);

# Clean up our temporary directory
File::Path->remove_tree($tmp_export_dir);

@@ -812,6 +826,7 @@ sub print_usage {
print "\t--dir=<path>: Identifies a directory to recursively search to find SQLite files to work on.\n";
print "\t--android-backup=<path>: Identifies an Android backup to open and search for SQLite files to work on.\n";
print "\t\tNote: This will unpack the Android backup on disk for TAR, so please ensure you have available space.\n";
+print "\t\tNote: This also can be memory intensive while decompressing. If it errors out due to low memory, extract the backup yourself and use the --dir= option.\n";
print "\nOptional Options:\n";
print "\t--decompress: If set, will decompress recognized and supported compressed blobs, replacing the original blob contents on the working copy\n";
print "\t\tNote: Decompress gets very slow in a database with large compressed objects. Expect this to take a few seconds to run.\n";