It happens to every tester eventually. You come across a file share hosting dozens of database backups, and giddiness ensues as you realize you have full read access and can copy any of them down to your drop box. Then you notice the backups are tens, if not hundreds, of gigabytes in size, and in this particular situation you have neither the hard drive space nor the bandwidth to pull down a massive database backup and boot up a virtual machine to search through the data in a timely fashion.
Cue Amazon Web Services (AWS). We can upload the database to a secure, non-public S3 bucket and have Amazon Relational Database Service (RDS) restore the database directly. This means that we can have access to that data in as little as 10 minutes while all the “heavy lifting” is performed by the cloud.
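At its core the trick is just two operations: stage the backup in a private S3 bucket, then ask the RDS SQL Server instance to ingest it with the rds_restore_database stored procedure (exposed once the SQLSERVER_BACKUP_RESTORE option is attached). A minimal sketch of those two calls, using made-up bucket, file, and database names rather than anything the script actually generates:

```shell
# Illustrative placeholders only -- the real script generates random names.
BUCKET="s3-sql-restore-example"
BACKUP="MyDatabaseBackup.bak"

# Step 1: stage the backup in a private S3 bucket.
UPLOAD_CMD="aws s3 cp ${BACKUP} s3://${BUCKET}/${BACKUP}"

# Step 2: once the RDS instance is up, trigger the native restore via
# the rds_restore_database stored procedure.
RESTORE_SQL="exec msdb.dbo.rds_restore_database
  @restore_db_name='MYDATABASE',
  @s3_arn_to_restore_from='arn:aws:s3:::${BUCKET}/${BACKUP}';"

# Printed rather than executed, since actually running these requires
# live AWS credentials and a provisioned RDS instance:
echo "$UPLOAD_CMD"
echo "sqlcmd -S <rds-hostname> -U <user> -P <pass> -Q \"${RESTORE_SQL}\""
```

Everything else the script does (bucket, security group, IAM role, option group) exists to make those two calls possible.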
***NOTE: This script can help you demonstrate the impact of test findings without overtaxing your time or hardware, but always remember to discuss the potential use of cloud technology with your clients before any testing on the engagement begins.
Unfortunately, AWS likes to complicate things, and there are quite a few steps involved in performing those two actions. Linked at the bottom is a bash script that handles the entire exchange. The only inputs are the database backup file to be uploaded and the name of the database it contains. After a successful upload and restore, you are given a table count and connection details for running further queries.
Running without any arguments:
$ ./sql-backup-restore.sh
usage: ./sql-backup-restore.sh options

This script restores a SQL Server database backup to AWS and returns connection details & a table count

OPTIONS:
   -h   Show this message
   -f   The SQL Server Database backup file (usually .bak)
   -d   Database Name (ex. MYDATBASE)
Running on a test database:
$ ./sql-backup-restore.sh -f /mnt/FileSrv_IP/DB_Backups/JulyDatabaseBackup.bak -d THISISMYDATABASENAME
[*] Creating S3 Bucket to store database backup: s3-sql-restore-wi41zjcsdg
[*] Uploading backup file (/mnt/FileSrv_IP/DB_Backups/JulyDatabaseBackup.bak) to S3 bucket (s3-sql-restore-wi41zjcsdg)
upload: ../JulyDatabaseBackup.bak to s3://s3-sql-restore-wi41zjcsdg/JulyDatabaseBackup.bak
[*] Creating a VPC security group allowing TCP1433 inbound for RDS
[*] Creating the IAM Role & Policy so RDS can access S3
[*] Creating an option group (option-group-sql-restore) to hold the SQLSERVER_BACKUP_RESTORE option for RDS
[*] Adding the SQLSERVER_BACKUP_RESTORE option to option-group-sql-restore group
Username: user34wkeceq
Password: pass9zoacs5
[*] Creating the RDS SQL Server Database - db-sql-restore-rwkmm7hog ~15mins
[*] RDS SQL Server now starting
RDS Still coming up...may take a few minutes
<SNIP>
RDS Still coming up...may take a few minutes
RDS Still coming up...may take a few minutes
[*] SQL Server hostname:
Hostname: db-sql-restore-rwkmm7hog.cicdy9uy2.us-east-1.rds.amazonaws.com
Username: user34wkeceq
Password: pass9zoacs5
[*] Restoring the SQL server database from S3
[*] still restoring the DB
<SNIP>
[*] still restoring the DB
1 RESTORE_DB THISISMYDATABASENAME [2019-01-18 1 2019-01-18 16:42:22.087 2019-01-18 16:41:15.730 arn:aws:s3:::s3-sql-restore-wi41zjcsdg/JulyDatabaseBackup.bak 0 NULL
[*] Row count for all tables in the database
Changed database context to 'THISISMYDATABASENAME'.
                         rows
------------------------ -----------
sysclones                0
sysseobjvalues           <SNIP>
                         1220
sysschobjs               2428

(94 rows affected)

[*] Run whatever SQL queries you want with:
sqlcmd -S db-sql-restore-rwkmm7hog.cicdy9uy2.us-east-1.rds.amazonaws.com -U user34wkeceq -P pass9zoacs5
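The repeated "still restoring the DB" lines are the script polling RDS for you, but you can also check restore progress by hand with the rds_task_status stored procedure. A sketch, with placeholder connection details standing in for whatever your own run prints:

```shell
# Placeholder values; substitute the hostname/credentials your run prints.
DB_NAME="THISISMYDATABASENAME"
STATUS_SQL="exec msdb.dbo.rds_task_status @db_name='${DB_NAME}';"

# Assembled but not executed here, as it needs the live RDS instance:
echo "sqlcmd -S <rds-hostname> -U <user> -P <pass> -Q \"${STATUS_SQL}\""
```

The task's lifecycle column moves through states like CREATED and IN_PROGRESS until the restore finishes, which is the same signal the script waits on.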
Now, while the script relies on Microsoft's "sqlcmd" to run the stored procedures automatically, there is nothing stopping you from connecting with something like SQL Server Management Studio for autocomplete and other features.
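For instance, a per-table row count like the one the script prints can be pulled from the system catalog. The query below is one common way to do it, not necessarily the exact query the script runs, and the connection details are placeholders:

```shell
# Hypothetical follow-up query: per-table row counts via sys.partitions.
# This is one common approach; the script's own query may differ.
COUNT_SQL="SELECT t.name, SUM(p.rows) AS row_count
  FROM sys.tables t
  JOIN sys.partitions p ON p.object_id = t.object_id
  WHERE p.index_id IN (0, 1)
  GROUP BY t.name;"

# Assembled but not executed here, as it needs the live RDS instance:
echo "sqlcmd -S <rds-hostname> -U <user> -P <pass> -d MYDATABASE -Q \"${COUNT_SQL}\""
```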
The tool can be found here: https://github.com/DolosGroup/sql-backup-restore