Using Powershell to read and write MSSQL

These examples use a test database called EmailList with a table called Table1, which has three fields: ‘username’, ‘email’ and ‘display name’.

This sample code will log in to MSSQL with your currently logged-in Windows credentials,

$dataSource = "TESTSQLSERVER"
$database = "EmailList"
$connectionString = "Server=$dataSource;Database=$database;Integrated Security=True;"
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()
$command = $connection.CreateCommand()
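If you need to connect with a SQL login instead of Windows authentication, only the connection string changes. A hedged variant — the user name and password here are placeholders, not from the original post:

```
$connectionString = "Server=$dataSource;Database=$database;User Id=myuser;Password=mypassword;"
```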

This code will return the number of records with the specified email address,

$command.CommandText = "SELECT COUNT(*) FROM Table1 WHERE Email=''"
$command.ExecuteScalar()

This code will write a record to the database,

$command.CommandText = "INSERT INTO Table1 VALUES ('username','','Users Name')"
$command.ExecuteNonQuery()

This code will read records from the database,

$command.CommandText = "SELECT * FROM Table1 WHERE Email=''"
$reader = $command.ExecuteReader()
While ($reader.Read()) {$reader[0]; $reader[1]; $reader[2]}
$reader.Close()

Don’t forget to close your connection once you have finished,

$connection.Close()
Clean up corrupted SQL files

This is VBScript code I used to clean up massive SQL files that had random corruption appearing as long strings of repeated characters,

'Written by Ben Penney
'Script requires sub-folders CLEAN, BAD and ERROR to be in the script path
Option Explicit
Dim objShell, objFSO
Set objShell = WScript.CreateObject("WScript.Shell")
Set objFSO = CreateObject("Scripting.FileSystemObject")
Dim objFile, objInput, objOutput, objBad, objError
Dim intLineNum, strRead, blnLineGood, CharNum, CharNum2

For Each objFile in objFSO.GetFolder(objShell.CurrentDirectory).Files
    If LCase(Right(objFile.Name, 3)) = "sql" Then
        Set objInput = objFSO.OpenTextFile(objFile.Name)
        Set objOutput = objFSO.CreateTextFile("CLEAN\" & objFile.Name)
        Set objBad = objFSO.CreateTextFile("BAD\" & objFile.Name)
        Set objError = objFSO.CreateTextFile("ERROR\" & objFile.Name)
        intLineNum = 1
        Do While Not objInput.AtEndOfStream
            strRead = objInput.ReadLine
            blnLineGood = True
            ' ---------- START: Check for repeat characters ----------
            For CharNum = 1 To Len(strRead) - 102
                If Mid(strRead, CharNum, 1) <> " " Then
                    For CharNum2 = 1 To 100
                        blnLineGood = False ' Assume bad line unless we prove it is not
                        If Mid(strRead, CharNum, 1) <> Mid(strRead, CharNum + CharNum2, 1) Then
                            blnLineGood = True ' Line is not bad
                            Exit For
                        End If
                    Next
                    If blnLineGood = False Then Exit For
                End If
            Next
            ' ----------- END: Check for repeat characters -----------
            ' ---------- START: Check for SQL syntax ----------
            If Left(strRead, 6) <> "INSERT" And Right(strRead, 2) <> ")," And Right(strRead, 2) <> ");" Then blnLineGood = False
            If Left(strRead, 6) <> "INSERT" And Left(strRead, 1) <> "(" Then blnLineGood = False
            ' ----------- END: Check for SQL syntax -----------
            ' ---------- START: Write line to output files. Catch errors. ----------
            On Error Resume Next
            Err.Clear
            If blnLineGood = True Then
                objOutput.WriteLine strRead
            Else
                objBad.WriteLine strRead
            End If
            If Err.Number <> 0 Then objError.WriteLine "error writing line " & intLineNum
            On Error GoTo 0
            ' ----------- END: Write line to output files. Catch errors. -----------
            intLineNum = intLineNum + 1
        Loop
        objInput.Close
        objOutput.Close
        objBad.Close
        objError.Close
    End If
Next
WScript.Echo "All Done."

 **Script requires sub-folders CLEAN, BAD and ERROR to be in the script path**

It is pretty rough, but basically it checks each character in each line for repeated characters (ignoring spaces). If it finds the same character repeated 100 times in a row, it outputs that line to the BAD file.

Additionally, it checks that each line starts with ‘(’ and ends with either ‘),’ or ‘);’ (excluding INSERT lines). If a line fails these tests it is written to the BAD file; otherwise it goes into the CLEAN file.
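The same repeat-character test can be sketched in shell, assuming GNU grep (this is a rough equivalent of the script's logic, not part of the original post — the 100-repeat threshold and the non-space requirement are carried over):

```shell
# Build a sample file: one good SQL-ish line, and one line of 150 repeated X's.
printf 'INSERT INTO t VALUES (1),\n%s\n' "$(head -c 150 /dev/zero | tr '\0' 'X')" > sample.txt

# A non-space character followed by 100 more copies of itself marks a corrupted line.
grep    '\([^ ]\)\1\{100\}' sample.txt > bad.txt    # repeated-character lines
grep -v '\([^ ]\)\1\{100\}' sample.txt > clean.txt  # everything else
```

The back-reference `\1\{100\}` requires GNU grep; the VBScript version is more portable on Windows, but the grep version is handy for a quick triage on a Linux box.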

The ERROR file is there to catch errors that occasionally pop up when writing a line to either file; the line number of each failed write is recorded there.

It can easily be tweaked to suit any sort of syntax check you are interested in.


MySQL handy commands

After spending three days (so far) manually recovering databases in MySQL, I thought I would make my own cheat sheet of useful commands.

List the size of all databases in MySQL,

SELECT table_schema AS "DB Name",
       ROUND(SUM(data_length + index_length) / 1024 / 1024, 1) AS "DB Size in MB"
FROM information_schema.tables
GROUP BY table_schema;

List the row count of all tables in a database (estimate only — table_rows in information_schema is approximate for InnoDB),

SELECT table_name, table_rows FROM information_schema.tables WHERE table_schema = 'DB_NAME';
List the row count of all tables in a database accurately (still not sure how to add the quotes to the output correctly),

mysql -B -uusername -ppassword --disable-column-names --execute "SELECT CONCAT('SELECT ""',table_name,'"" AS table_name, COUNT(*) AS exact_row_count FROM ""',table_schema,'"".""',table_name,'"" UNION ') FROM INFORMATION_SCHEMA.TABLES WHERE table_schema = 'DB_NAME';"

This will output lines such as (the table names shown are just examples),

SELECT "table1" AS table_name, COUNT(*) AS exact_row_count FROM "DB_NAME"."table1" UNION
Grab the output, leave off the last UNION, and finish the command with a ‘;’
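That last step can be automated by piping the generated query through sed, replacing the trailing UNION on the final line with a ‘;’ — a hypothetical convenience, not part of the original workflow (the two sample lines below stand in for the real generated output):

```shell
# Simulate two generated lines, then turn the last trailing "UNION" into ";".
printf 'SELECT "t1" AS table_name, COUNT(*) AS c FROM "db"."t1" UNION \nSELECT "t2" AS table_name, COUNT(*) AS c FROM "db"."t2" UNION \n' \
  | sed '$ s/ UNION *$/;/'
```

The `$` address tells sed to apply the substitution only on the last line of input.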

Return the number of tables in the database,

SELECT COUNT(*) FROM information_schema.tables WHERE table_schema = 'DB_NAME';

Import all .sql files in current directory (can modify wildcard search to be more exclusive),

find . -name '*.sql' | awk '{ print "source",$0 }' | mysql --batch DB_NAME
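The awk stage simply prefixes each filename with the word `source`, producing a batch of mysql `source` commands. You can preview exactly what will be fed to mysql by dropping the final pipe — a small demo using hypothetical empty files in a temp directory:

```shell
# Create two hypothetical .sql files and show the generated "source" commands.
tmp=$(mktemp -d) && cd "$tmp"
touch a.sql b.sql
find . -name '*.sql' | awk '{ print "source",$0 }'
```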

Troubleshooting Microsoft SQL full data or log volumes

If you have administered any SQL servers, no doubt you have come across a transaction log file (or sometimes a data file) that has filled up a drive, and had to work out what to do.

First you should determine whether the log files suddenly grew abnormally, which can happen if someone runs a complex query. Hopefully there is some monitoring history for you to reference to determine what has happened. The following resolution is really only recommended when the log or data files have grown due to some one-off process, such as a database having half of its data removed, or some unique update that grew the log file to a huge size.

The following code comes from this page, which contains a lot more detail for those who want more information.

You can use this query to display the current size of the DB files and how much free space each has.

SELECT name AS FileName,
       size/128.0 AS CurrentSizeMB,
       size/128.0 - CAST(FILEPROPERTY(name, 'SpaceUsed') AS INT)/128.0 AS FreeSpaceMB
FROM sys.database_files;
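The division by 128.0 is there because sys.database_files reports size in 8 KB pages, and 1024 KB per MB ÷ 8 KB per page = 128 pages per MB. A quick sanity check of that arithmetic (the page count below is a made-up example for a 9 GB file):

```shell
# Hypothetical page count for a 9 GB log file: 9 * 1024 MB * 128 pages/MB
pages=1179648
echo "$((pages / 128)) MB"   # pages -> MB
```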

To do this, open up SQL Server Management Studio and click on ‘New Query’.


Then paste the code into the new window and select which database you wish to run the code against using the database drop-down box on the toolbar.


When you click the ‘Execute’ button you will see a result at the bottom of the window looking something like this,


This result shows that the database has a data file and a log file. In this example the data file is roughly 23 gigabytes in size with 4 gigabytes of that being free space, while the log file is 9 gigabytes with 9 gigabytes free (in fact only 40 megabytes is currently in use). Whether the log file is overly large and wasting space is hard to say; in general this file will grow and shrink internally as a normal part of operation. However, if the log file has been, for example, 1 gigabyte in size for the past 6 months and grew to 9 gigabytes overnight, then it may be possible to shrink the file and recover some space. Keep in mind the log file will grow again if it needs to (assuming you have not disabled the auto-grow options).

So if you make the decision to shrink the file here are the steps to follow.

Right click on the database you wish to shrink and select Tasks -> Shrink -> Files.


For this example I am shrinking the log file for this database, so I need to change the ‘File type’ drop-down box to Log and click OK (or select an appropriate ‘Shrink file to’ option first if you like).


Now with any luck the file has shrunk in the file system and recovered some of your drive space.

As a side note, you may be wondering why I didn’t just open the shrink file dialog box to see the ‘Available free space’ for the database. Firstly, that is a more dangerous approach, as you leave yourself open to accidentally clicking OK and shrinking a file that may not need it. Secondly, you can check multiple databases more quickly using the script and the drop-down menu (I am still trying to work out how to show all databases in one script, and if I ever do, I will update this post).

Thanks for reading.