Last Login from all Domain Controllers

Requires the Active Directory module. Just paste the function into PowerShell, then run it as shown in the example.

function Get-ADUserLastLogon([string]$userName) {
    $dcs = Get-ADDomainController -Filter {Name -like "*"}
    $time = 0
    foreach($dc in $dcs) {
        $hostname = $dc.HostName
        $user = Get-ADUser $userName -Server $hostname -Properties lastLogon
        if($user.LastLogon -gt $time) {
            $time = $user.LastLogon
        }
    }
    $dt = [DateTime]::FromFileTime($time)
    "$userName last logged on at: $($dt.ToString("yyyy/MM/dd HH:mm:ss"))"
}


    Get-ADUserLastLogon -UserName JoeBloggs

Additional configurations:
If you wish to exclude specific domain controllers due to communication limitations, such as all DCs with DMZ in their name, change the Get-ADDomainController line to,

    $dcs = Get-ADDomainController -Filter {Name -notlike "*DMZ*"}

Export list of subnets from Active Directory Sites and Services

Save the following code,

$Sites = [System.DirectoryServices.ActiveDirectory.Forest]::GetCurrentForest().Sites
foreach ($Site in $Sites) {
   foreach ($Subnet in $Site.Subnets) {
      $Subnet | Select-Object Name,Site
   }
}

to ‘ExportSites.ps1’ and run the command,

    .\ExportSites.ps1 | Export-Csv sites.txt -NoTypeInformation

Migrate FSRM Quotas using PowerShell to 2012 R2

This version has been superseded by this one

Hello to all of you,

I have been tasked with migrating from one file server solution to another, with shared folders going to various locations and roughly 200,000 assigned quotas. The current system is 2008 R2 while the new one is 2012 R2. My plan is to use the dirquota.exe command to export the current configuration and then auto-generate the required PowerShell commands to import onto the new 2012 R2 server. Almost all of our quotas are basic auto-assigned quotas that still match their quota template, and these do not need to be individually set, but naturally there are the odd quotas that have been modified manually with their own custom email thresholds and limits.

This script makes the following assumptions:
1) The quota templates to be applied have already been migrated to the new server. This can be done with a native command: dirquota.exe template export /File:PATH
2) All auto quotas are using templates that match. It probably wouldn't be much work to make it work with other setups, but I am pretty confident that most admins will have set up their auto quotas in this way.
3) Commands are set with a KillTimeOut of 10 minutes. dirquota.exe is not able to output the KillTimeOut value. You can change this manually in either the script or the output.
4) dirquota.exe does not output whether the 'Files By Property' report is enabled, so it is assumed that this report is ticked on for all instances of report actions.
5) Auto quotas with notifications will probably cause errors with this script.
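Regarding point 3, one quick way to adjust the KillTimeOut after generating the output (assuming you redirected it to ImportQuotas.ps1 as in the script's example usage) is a simple find and replace:

```powershell
# Change the generated 10 minute KillTimeOut to 30 minutes
(Get-Content ImportQuotas.ps1) -replace '-KillTimeOut 10', '-KillTimeOut 30' |
    Set-Content ImportQuotas.ps1
```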

I believe this is now a working product. Still some more testing to go and to look at ways to improve the code.

** Drop me a line if you find any errors and I will work on updating the code **

List of work to do
– Currently the output includes notifications for all quotas that do not match the template even if their notifications have not been modified. Might be too much work to prevent it. ** BUG: if auto quotas have notifications this will not work as it will try to add notifications that already exist **

# Written by Ben Penney
# Version 0.95
# This script has been written to be run on the old/source FSRM server and the output generated to be executed on the target server.
# This script has been tested on 2008 R2 and 2012 R2 source servers and 2012 R2 target servers.
# SourcePath is the local path on the old server of the folder you want to export the quotas from.
# TargetPath is the local path on the new server where you will execute the generated PowerShell commands
# Example:
# .\ExportQuotas -SourcePath "O:\StaffShared" -TargetPath "C:\TEST" > ImportQuotas.ps1
Param([Parameter(Mandatory = $true)] $SourcePath,$TargetPath)

# ---------- START: Process Auto Quotas ----------
# This process is used to generate the PowerShell commands to create the Auto Quotas on the new server.
# Additionally the list of auto quotas are kept for comparison later to exclude all quotas that will
# be automatically generated from the output of this script.
$AutoQuotas = @()
$AutoQuotaTemplates = @()
ForEach ($line in ((dirquota autoquota list /path:$SourcePath\*) + (dirquota autoquota list /path:$SourcePath\))) {
    If ($line -like 'Auto Apply Quota Path:*') {$AutoQuotas = $AutoQuotas + $line.Substring(24,$line.Length-24)}
    If ($line -like 'Source Template:*') {
        If ($line -like '*(Matches template)') {
            $AutoQuotaTemplates = $AutoQuotaTemplates + $line.Substring(24,$line.Length-43)
            "New-FsrmAutoQuota -Path '$($AutoQuotas[-1] -iReplace(($SourcePath -replace "\\","\\"),$TargetPath))' -Template '$($AutoQuotaTemplates[-1])'"
        } Else {Write-Error "Expecting all autoquotas to be using matching templates. Sorry. Quitting now."}
    }
}
If ($AutoQuotas.Length -ne $AutoQuotaTemplates.Length) {Write-Error "Something went wrong with parsing the auto quotas"}
# ----------- END: Process Auto Quotas -----------

"Start-Sleep -s 30 # This line is to give the server time to create the auto quotas"

Function processDirQuotaOutput ($QuotaOutput) {
    [int]$LineNum = 2
    If ($QuotaOutput[1] -like 'This tool is deprecated*') {$LineNum += 2} # For 2012 output
    While ($LineNum -lt $QuotaOutput.Length) {
        # ---------- Quota Path Line ----------
        $QuotaPath = $QuotaOutput[$LineNum].Substring(24,$QuotaOutput[$LineNum].Length-24)
        $QuotaPathNew = $($QuotaPath -iReplace(($SourcePath -replace "\\","\\"),$TargetPath))
        $QuotaPathParent = $QuotaPath.SubString(0,$QuotaPath.LastIndexOf("\"))
        $LineNum++
        # ---------- Share Path (2008) or Description (2012) Line ----------
        # Skipped
        $LineNum++
        # ---------- Source Template Line ----------
        If ($QuotaOutput[$LineNum] -like '*(Does not match template)') {
            $SourceTemplateMatches = $false
            $SourceTemplate = $QuotaOutput[$LineNum].Substring(24,$QuotaOutput[$LineNum].Length-50)
        } ElseIf ($QuotaOutput[$LineNum] -like '*(Matches template)') {
            $SourceTemplateMatches = $true
            $SourceTemplate = $QuotaOutput[$LineNum].Substring(24,$QuotaOutput[$LineNum].Length-43)
        }
        $LineNum++
        # ---------- Quota Status Line ----------
        If ($QuotaOutput[$LineNum] -like '*Disabled') {$QuotaStatus = "-Disabled "}
        $LineNum++
        # ---------- Limit Line ----------
        $Limit = $QuotaOutput[$LineNum].Substring(24,$QuotaOutput[$LineNum].Length-31).Replace(' ','')
        If ($QuotaOutput[$LineNum].Substring($QuotaOutput[$LineNum].Length-5,4) -eq 'Soft') { $LimitType = "-SoftLimit" }
        $LineNum++
        # ---------- Used, Available, Peak Usage Line ----------
        # Skipped
        $LineNum += 3
        # ---------- START: Generate Quota commands ----------
        # Since this is the first variable output by the dirquota.exe command we need to create a command
        # for the previous quota read in from the file if there was one.
        $ProcessNotifications = $false
        If ($AutoQuotas -contains $QuotaPathParent) {
            # Quota is the child of an auto quota so will have a template set (assuming all auto quotas are templates)
            If ($SourceTemplate -eq $Null) {
                # Quota has no template so should be removed (assuming all auto quotas are templates)
                "Remove-FsrmQuota -Path '$QuotaPathNew' -Confirm:`$False"
                "New-FsrmQuota -Path '$QuotaPathNew' -Size $Limit $LimitType $QuotaStatus"
                $ProcessNotifications = $true
            } ElseIf ($SourceTemplate -ne $AutoQuotaTemplates[[array]::indexof($AutoQuotas,$QuotaPathParent)]) {
                # Quota does not match template of auto quota
                "Reset-FsrmQuota -Path '$QuotaPathNew' -Template '$SourceTemplate' -Confirm:`$False"
                $ProcessNotifications = $true
            }
        } Else {
            # Quota is not the child of auto quota so will not have any quota yet
            If ($SourceTemplate -eq $Null) {
                "New-FsrmQuota -Path '$QuotaPathNew' -Size $Limit $LimitType $QuotaStatus"
                $ProcessNotifications = $true
            } Else {
                "New-FsrmQuota -Path '$QuotaPathNew' -Template '$SourceTemplate'"
                $ProcessNotifications = $true
            }
        }
        If ($SourceTemplateMatches -eq $False) {
            # Quota is a template but does not match so update the changes
            "Set-FsrmQuota -Path '$QuotaPathNew' -Size $Limit $LimitType $QuotaStatus"
            $ProcessNotifications = $true
        }
        # ----------- END: Generate Quota commands -----------
        # ---------- Thresholds Line ----------
        $LineNum += 2
        If ($QuotaOutput[$LineNum-2] -notlike '*None' -And $ProcessNotifications) {
            $Thresholds = ""
            While ($LineNum -lt $QuotaOutput.Length -And `
                   $QuotaOutput[$LineNum] -notlike 'Quota Path:*') {
                $Actions = ""
                [int]$NotificationLimit = $QuotaOutput[$LineNum].SubString($QuotaOutput[$LineNum].IndexOf("(")+1,3).Trim()
                $LineNum += 2
                While ($LineNum -lt $QuotaOutput.Length -And `
                       $QuotaOutput[$LineNum] -notlike 'Quota Path:*' -And `
                       $QuotaOutput[$LineNum] -notlike 'Notifications for *') {
                    $NotificationType = $QuotaOutput[$LineNum].SubString(32,$QuotaOutput[$LineNum].Length-32)
                    Switch ($NotificationType) {
                        'Event Log' {
                            [int]$RunLimitInterval = $QuotaOutput[$LineNum+1].SubString(32,$QuotaOutput[$LineNum+1].Length-39)
                            $EventType = $QuotaOutput[$LineNum+2].SubString(32,$QuotaOutput[$LineNum+2].Length-32)
                            $MessageBody = $QuotaOutput[$LineNum+3].SubString(32,$QuotaOutput[$LineNum+3].Length-32)
                            $LineNum += 5
                            While ($LineNum -lt $QuotaOutput.Length -And `
                                   $QuotaOutput[$LineNum] -notlike 'Quota Path:*' -And `
                                   $QuotaOutput[$LineNum] -notlike '*Notification Type:*' -And `
                                   $QuotaOutput[$LineNum] -notlike 'Notifications for *') {
                                $MessageBody += "`r`n" + $QuotaOutput[$LineNum-1]
                                $LineNum++
                            }
                            $Actions += ",(New-FsrmAction Event -EventType $EventType -Body '$MessageBody')"
                        }
                        'E-mail' {
                            [int]$RunLimitInterval = $QuotaOutput[$LineNum+1].SubString(32,$QuotaOutput[$LineNum+1].Length-39)
                            $MailTo = $QuotaOutput[$LineNum+2].SubString(32,$QuotaOutput[$LineNum+2].Length-32)
                            $LineNum += 3
                            $MailSubject = ""
                            If ($QuotaOutput[$LineNum] -like '*Mail Subject:*') {
                                $MailSubject = $QuotaOutput[$LineNum].SubString(32,$QuotaOutput[$LineNum].Length-32)
                                $LineNum++
                            }
                            $MessageBody = $QuotaOutput[$LineNum].SubString(32,$QuotaOutput[$LineNum].Length-32)
                            $LineNum += 2
                            While ($LineNum -lt $QuotaOutput.Length -And `
                                   $QuotaOutput[$LineNum] -notlike 'Quota Path:*' -And `
                                   $QuotaOutput[$LineNum] -notlike '*Notification Type:*' -And `
                                   $QuotaOutput[$LineNum] -notlike 'Notifications for *') {
                                $MessageBody += "`r`n" + $QuotaOutput[$LineNum-1]
                                $LineNum++
                            }
                            $Actions += ",(New-FsrmAction Email -MailTo '$MailTo' -Subject '$MailSubject' -Body '$MessageBody'"
                            $Actions += " -RunLimitInterval $RunLimitInterval)"
                        }
                        'Command' {
                            [int]$RunLimitInterval = $QuotaOutput[$LineNum+1].SubString(32,$QuotaOutput[$LineNum+1].Length-39)
                            $Command = $QuotaOutput[$LineNum+2].SubString(32,$QuotaOutput[$LineNum+2].Length-32)
                            $Arguments = $QuotaOutput[$LineNum+3].SubString(32,$QuotaOutput[$LineNum+3].Length-32)
                            $WorkingDirectory = $QuotaOutput[$LineNum+4].SubString(32,$QuotaOutput[$LineNum+4].Length-32)
                            Switch ($QuotaOutput[$LineNum+5].SubString(32,$QuotaOutput[$LineNum+5].Length-32)) {
                                'NT AUTHORITY\LOCAL SERVICE' {$RunAs = 'LocalService'}
                                'NT AUTHORITY\SYSTEM' {$RunAs = 'LocalSystem'}
                                'NT AUTHORITY\NETWORK SERVICE' {$RunAs = 'NetworkService'}
                            }
                            $LineNum += 7
                            $Actions += ",(New-FsrmAction Command -Command '$Command' -CommandParameters '$Arguments' -WorkingDirectory '$WorkingDirectory'"
                            $Actions += " -SecurityLevel $RunAs -RunLimitInterval $RunLimitInterval -KillTimeOut 10)"
                        }
                        'Report' {
                            #---Do nothing for now---
                            [int]$RunLimitInterval = $QuotaOutput[$LineNum+1].SubString(32,$QuotaOutput[$LineNum+1].Length-39)
                            $LineNum += 2
                            $Reports = "FilesByProperty"
                            While ($QuotaOutput[$LineNum] -notlike '*Reports saved to:*') {
                                Switch ($QuotaOutput[$LineNum].SubString(6,10)) {
                                    'Duplicate ' {$Reports += ",DuplicateFiles"}
                                    'File Scree' {$Reports += ",FileScreen"}
                                    'Files by F' {$Reports += ",FilesByFileGroup"}
                                    'Files by O' {$Reports += ",FilesByOwner"}
                                    'Large File' {$Reports += ",LargeFiles"}
                                    'Least Rece' {$Reports += ",LeastRecentlyAccessed"}
                                    'Most Recen' {$Reports += ",MostRecentlyAccessed"}
                                    'Quota Usag' {$Reports += ",QuotaUsage"}
                                }
                                $LineNum++
                            }
                            $SendReportsTo = ""
                            If ($QuotaOutput[$LineNum+1].Length -gt 32) {
                                $SendReportsTo = $QuotaOutput[$LineNum+1].SubString(32,$QuotaOutput[$LineNum+1].Length-32)
                            }
                            $LineNum += 3
                            $Actions += ",(New-FsrmAction Report -ReportTypes $Reports -MailTo '$SendReportsTo')"
                        }
                    }
                }
                $Thresholds += ",(New-FsrmQuotaThreshold -Percentage $NotificationLimit -Action $($Actions.SubString(1)))"
            }
            "Set-FsrmQuota -Path '$QuotaPathNew' -Threshold $($Thresholds.SubString(1))"
        }
        # ----------- END: Generate Threshold commands -----------
        # ---------- START: Reset variables ----------
        $QuotaPath = $null
        $QuotaPathNew = $null
        $QuotaPathParent = $null
        $SourceTemplate = $null
        $SourceTemplateMatches = $null
        $QuotaStatus = $null
        $Limit = $null
        $LimitType = $null
        $Thresholds = @()
        # ----------- END: Reset variables -----------
    }
}

# ---------- START: Process Quotas ----------
processDirQuotaOutput (dirquota quota list /path:$SourcePath\* /List-Notifications)
processDirQuotaOutput (dirquota quota list /path:$SourcePath\ /List-Notifications)
# ----------- END: Process Quotas -----------

Script to split large files

This VBScript will split all files with the matching extension ('sql' in this example) in the script folder into separate files of size intSize (1 GB in this example). Files will not be exactly that size but will be slightly larger depending on your line sizes and how often the script checks the output file size (this can be adjusted by modifying the line with the 'Mod' expression in it). I did try doing it in PowerShell but it took roughly 10,000 times as long to process!

'Written by Ben Penney
Option Explicit

Dim strExtension, intSize
strExtension = "sql"     'The file type to split
intSize = 1000000000     'The size to split into in bytes

Dim objShell, objFSO
Set objShell = WScript.CreateObject("WScript.Shell")
Set objFSO = CreateObject("Scripting.FileSystemObject")

Dim objFile, objInput, objOutput, objOutputFile, objError
Dim intLineNum, intFilenum, strRead

For Each objFile in objFSO.GetFolder(objShell.CurrentDirectory).Files
    If LCase(Right(objFile.Name, Len(strExtension))) = strExtension Then
        intFilenum = 1
        Set objInput = objFSO.OpenTextFile(objFile.Name)
        Set objOutput = objFSO.CreateTextFile(Replace(objFile.Name, "." & strExtension, intFilenum & "." & strExtension))
        Set objOutputFile = objFSO.GetFile(Replace(objFile.Name, "." & strExtension, intFilenum & "." & strExtension))
        Set objError = objFSO.CreateTextFile(Replace(objFile.Name, "." & strExtension, "ERROR." & strExtension))
        intLineNum = 1
        Do While Not objInput.AtEndOfStream
            strRead = objInput.ReadLine
            If intLineNum Mod 10000 = 0 Then 'Put this in as it slowed down the process checking size every line
                If objOutputFile.Size > intSize Then
                    objOutput.Close
                    intFilenum = intFilenum + 1
                    Set objOutput = objFSO.CreateTextFile(Replace(objFile.Name, "." & strExtension, intFilenum & "." & strExtension))
                    Set objOutputFile = objFSO.GetFile(Replace(objFile.Name, "." & strExtension, intFilenum & "." & strExtension))
                End If
            End If
            Err.Clear
            On Error Resume Next
            objOutput.WriteLine strRead
            If Err.Number <> 0 Then objError.WriteLine "error writing line " & intLineNum
            On Error Goto 0
            intLineNum = intLineNum + 1
        Loop
        objInput.Close
        objOutput.Close
        objError.Close
    End If
Next

Wscript.Echo "All Done."

Clean up corrupted SQL files

This is the VBScript code I used to clean up massive SQL files that had random corruption appearing as long strings of repeated characters,

'Written by Ben Penney
'Script requires sub-folders CLEAN, BAD and ERROR to be in the script path
Option Explicit

Dim objShell, objFSO
Set objShell = WScript.CreateObject("WScript.Shell")
Set objFSO = CreateObject("Scripting.FileSystemObject")

Dim objFile, objInput, objOutput, objBad, objError
Dim intLineNum, strRead, blnLineGood, CharNum, CharNum2

For Each objFile in objFSO.GetFolder(objShell.CurrentDirectory).Files
    If LCase(Right(objFile.Name, 3)) = "sql" Then
        Set objInput = objFSO.OpenTextFile(objFile.Name)
        Set objOutput = objFSO.CreateTextFile("CLEAN\" & objFile.Name)
        Set objBad = objFSO.CreateTextFile("BAD\" & objFile.Name)
        Set objError = objFSO.CreateTextFile("ERROR\" & objFile.Name)
        intLineNum = 1
        Do While Not objInput.AtEndOfStream
            strRead = objInput.ReadLine
            blnLineGood = True
            ' ---------- START: Check for repeat characters ----------
            For CharNum = 1 to Len(strRead) - 102
                If Mid(strRead, CharNum, 1) <> " " Then
                    For CharNum2 = 1 to 100
                        blnLineGood = False ' Assume bad line unless we prove it is not
                        If Mid(strRead, CharNum, 1) <> Mid(strRead, CharNum + CharNum2, 1) Then
                            blnLineGood = True ' Line is not bad
                            Exit For
                        End If
                    Next
                    If blnLineGood = False Then Exit For
                End If
            Next
            ' ----------- END: Check for repeat characters -----------
            ' ---------- START: Check for SQL syntax ----------
            If Left(strRead, 6) <> "INSERT" And Right(strRead, 2) <> ")," And Right(strRead, 2) <> ");" Then blnLineGood = False
            If Left(strRead, 6) <> "INSERT" And Left(strRead, 1) <> "(" Then blnLineGood = False
            ' ----------- END: Check for SQL syntax -----------
            ' ---------- START: Write line to output files. Catch errors. ----------
            On Error Resume Next
            Err.Clear
            If blnLineGood = True Then
                objOutput.WriteLine strRead
            Else
                objBad.WriteLine strRead
            End If
            If Err.Number <> 0 Then objError.WriteLine "error writing line " & intLineNum
            On Error Goto 0
            ' ----------- END: Write line to output files. Catch errors. -----------
            intLineNum = intLineNum + 1
        Loop
        objInput.Close
        objOutput.Close
        objBad.Close
        objError.Close
    End If
Next

Wscript.Echo "All Done."

 **Script requires sub-folders CLEAN, BAD and ERROR to be in the script path**

It is pretty rough, but basically it checks each character in each line for repeats (ignoring spaces). If it finds the same character repeated 100 times it will output that line to the BAD file.

Additionally it checks that each line starts with '(' and ends with either '),' or ');' (excluding INSERT lines). If a line fails these tests it is output to the BAD file, otherwise it goes into the CLEAN file.

The ERROR file is there to catch errors with writing the line to either file which seem to pop up occasionally. The line numbers of the failed line are recorded in this file.

Can be easily tweaked to suit any sort of syntax check you were interested in.


MySQL handy commands

After spending 3 days so far manually recovering databases in MySQL I thought I would make my own cheat sheet for useful commands.

List size of all databases in MySQL,

SELECT table_schema "DB Name",
       Round(Sum(data_length + index_length) / 1024 / 1024, 1) "DB Size in MB"
FROM information_schema.tables
GROUP BY table_schema;

List the row count of all tables in a database (estimate only),
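A common way to get these estimated counts is the table_rows column in information_schema (this figure is only an approximation for InnoDB tables; DB_NAME is a placeholder for your database name):

```sql
SELECT table_name, table_rows
FROM information_schema.tables
WHERE table_schema = 'DB_NAME';
```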


List the row count of all tables in a database accurately (still not sure how to add the quotes to the output correctly),

mysql -B -uusername -ppassword --disable-column-names --execute "SELECT CONCAT('SELECT ""',table_name,'"" AS table_name, COUNT(*) AS exact_row_count FROM ""',table_schema,'"".""',table_name,'"" UNION ') FROM INFORMATION_SCHEMA.TABLES WHERE table_schema = 'DB_NAME';"

This will output one line per table, such as (mytable here is a hypothetical table name; as noted above, the quotes may not come out as intended),

    SELECT mytable AS table_name, COUNT(*) AS exact_row_count FROM DB_NAME.mytable UNION 

Grab the output, leave off the final UNION, and finish the command with a ';'.

Return the number of tables in the database,

SELECT COUNT(*) FROM information_schema.tables WHERE table_schema = 'DB_NAME';

Import all .sql files in current directory (can modify wildcard search to be more exclusive),

find . -name '*.sql' | awk '{ print "source",$0 }' | mysql --batch DB_NAME
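Before piping into mysql you can sanity-check the generated 'source' statements by dropping the final stage (the scratch directory /tmp/sql_import_demo and file names here are just for illustration):

```shell
# Create a scratch folder with two dummy .sql files and run the pipeline
# without the final mysql stage to see what it will feed to the client
mkdir -p /tmp/sql_import_demo
cd /tmp/sql_import_demo
touch schema.sql data.sql
find . -name '*.sql' | awk '{ print "source",$0 }' | sort
# prints:
#   source ./data.sql
#   source ./schema.sql
```

The awk stage simply prefixes each path with the word source, which the mysql client interprets as its source command.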

Troubleshooting Control Manager Operations Error 7006

Recently a number of servers in my workplace started to show up with the following errors,

Log Name: System
Source: Service Control Manager
Event ID: 7006
Level: Error
The ScRegSetValueExW call failed for Start with the following error: Access is denied.

A simple Google search finds plenty of talk about this error, much of it around anti-virus software. Fair enough, but none of it made clear how to identify the underlying cause. A tip for those who are getting this error: check your System log around the time of the error messages and see what the Service Control Manager was doing.
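One way to pull those surrounding events out of the System log is a sketch along these lines with Get-WinEvent (adjust -MaxEvents to taste):

```powershell
# List recent Service Control Manager events from the System log -
# both the 7006 errors and the 7036 state-change events around them
Get-WinEvent -FilterHashtable @{LogName='System'; ProviderName='Service Control Manager'; Id=7006,7036} -MaxEvents 50 |
    Select-Object TimeCreated, Id, Message |
    Format-Table -AutoSize -Wrap
```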

In my case the following information level events appeared immediately before the errors,

Log Name: System
Source: Service Control Manager
Event ID: 7036
Level: Information
The Microsoft Network Inspection service entered the running state.

and immediately after,

Log Name: System
Source: Service Control Manager
Event ID: 7036
Level: Information
The Microsoft Network Inspection service entered the stopped state.

So I had my culprit, which in my case was Microsoft System Center Endpoint Protection (or Microsoft Security Essentials). Still working on the resolution.

SCCM Kerberos Error 4

Recently it came to my attention that our SCCM servers were bringing up the following error for many of our workstations,

Log Name: System
Source: Security-Kerberos
Event ID: 4
Level: Error
The Kerberos client received a KRB_AP_ERR_MODIFIED error from the server computer1$. The target name used was cifs/ This indicates that the target server failed to decrypt the ticket provided by the client. This can occur when the target server principal name (SPN) is registered on an account other than the account the target service is using. Ensure that the target SPN is only registered on the account used by the server. This error can also happen if the target service account password is different than what is configured on the Kerberos Key Distribution Center for that target service. Ensure that the service on the server and the KDC are both configured to use the same password. If the server name is not fully qualified, and the target domain ( is different from the client domain (, check if there are identically named server accounts in these two domains, or use the fully-qualified name to identify the server.

I did find the following posts online,
Event ID 4 — Kerberos Client Configuration which suggests deleting the offending computer object and recreating a new one (to summarise). Very good advice but did not resolve my issue.

Kerberos and SPN problems which suggested to install SPN records for the SQL server and follow up posts that this did not work and that it is a DNS reverse look up issue.

This post got me thinking. I checked my DNS reverse lookup zones and they were all there from what I could see. Next I wondered what computer1 and computer2 resolved to in DNS. Bingo: both of these machines responded on the same IP address, meaning that when SCCM does its reverse lookup for computer1 it gets back the name of computer2 (I still have no idea why SCCM is doing this reverse lookup). The cause of this issue is not that we have two computers out there with the same IP address, but that there are two records in DNS for the same IP with two different names. This was due to our DHCP lease times being much shorter than our DNS scavenging times. To resolve the issue we increased our DHCP leases to 8 days and our scavenging to 5-10 days.
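A quick way to check what two machines resolve to, and what a reverse lookup of their shared IP returns (computer1/computer2 and the IP here are hypothetical; Resolve-DnsName needs Windows 8/2012 or later, otherwise use nslookup):

```powershell
# Forward lookups for the two machines named in the Kerberos error
Resolve-DnsName computer1 | Select-Object Name, IPAddress
Resolve-DnsName computer2 | Select-Object Name, IPAddress

# The reverse lookup SCCM is effectively doing - if both names point at
# this IP, only one of them will come back from the PTR query
Resolve-DnsName 10.0.0.25
```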

If you want to see some instructions on setting up a reverse lookup zone in DNS check out this guide from Tom’s Hardware Create Reverse Primary DNS Zone in Windows Server 2012

If you want to see some instructions on setting up DNS scavenging settings check out this guide Don’t be afraid of DNS Scavenging. Just be patient.

If you want to see some instructions on where to change DHCP lease time check out these very basic instructions (sorry best I could find without being too wordy) How do I change the DHCP address lease time in Windows 2000?

Troubleshooting Microsoft SQL full data or log volumes

If you have administered any SQL servers, no doubt you have come across a transaction log file (or sometimes a data file) that has filled up your drive, and had to work out what to do.

First you should determine whether the log files suddenly grew abnormally, which can happen if someone runs a complex query. Hopefully there is some monitoring history for you to reference to determine what has happened. The following resolution is really only recommended when the log or data files have grown due to some one-off process, such as a database having half of its data removed or some unique update that grew the log file to a huge size.

The following code comes from this page and contains a lot more detail for those who want more information.

You can use this query to display the current size of the DB files and how much free space each has.

SELECT name AS FileName,
       size/128.0 AS CurrentSizeMB,
       size/128.0 - CAST(FILEPROPERTY(name, 'SpaceUsed') AS INT)/128.0 AS FreeSpaceMB
FROM sys.database_files;

To do this open up your SQL Management Studio and click on ‘New Query’.


Then paste the code into the new window and select what database you wish to run the code on with the drop down box highlighted in the image below.


When you click the ‘Execute’ button you will see a result at the bottom of the window looking something like this,


What this result shows us is that the database has a data file and a log file. In this example the data file is roughly 23 gigabytes in size with 4 gigabytes of that being free space, while the log file is 9 gigabytes with 9 gigabytes free (in fact only 40 megabytes is currently in use). Whether the log file is overly large and wasting space is difficult to say; in general this file will grow and shrink internally as a normal function. However, if this log file has been, for example, 1 gigabyte in size for the past 6 months and grew to 9 gigabytes overnight, then it is possible that the file could be shrunk to recover some space. The log file will grow again if it needs to (assuming you have not disabled the auto grow options).

So if you make the decision to shrink the file here are the steps to follow.

Right click on the database you wish to shrink and select Tasks -> Shrink -> Files.


For this example I am shrinking the log file for this database so I need to change the ‘File type’ drop down box to Log as shown below and click OK (or select an appropriate ‘Shrink file to’ option first if you like).


Now with any luck the file has shrunk in the file system and recovered some of your drive space.
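If you would rather script the shrink than click through the GUI, the equivalent T-SQL is DBCC SHRINKFILE (the database name, the logical file name MyDatabase_log and the 1024 MB target below are examples; get the real logical name from the FileName column of the earlier query):

```sql
USE MyDatabase;
-- Shrink the transaction log file down to roughly 1 GB
DBCC SHRINKFILE (MyDatabase_log, 1024);
```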

As a side note, you may be thinking to yourself: why not just go into the shrink file dialog box to see what the 'Available free space' was for the database? Firstly, that is a more dangerous approach, as you are leaving yourself open to accidentally clicking OK and shrinking a file that may not require it. Secondly, you can check multiple databases more quickly using the script and the drop down menu (I am still trying to work out how to simply show all databases in one script, and if I ever work it out I will update my post).

Thanks for reading.

Running Microsoft Office as SYSTEM account

If you have ever written some code that uses one of the Microsoft Office products and intended to run it on a server, without user interaction, using the built-in SYSTEM account, you may very well have discovered that it just does not work, and troubleshooting why is difficult.

According to this extremely helpful Microsoft article, doing this is basically considered a no-no. What would have been even better is if they had said 'hey, we don't support this, but here is how to make it work', especially considering how ridiculously simple the solution is.

Credit to this post that finally gave me the answer, which is to create the SYSTEM account's Desktop folder.

32 bit Office:

64 bit Office:
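For reference, the systemprofile Desktop folders usually cited for this fix (these assume a 64-bit OS; double-check the paths against your own system) can be created like so:

```powershell
# 32 bit Office on 64 bit Windows
New-Item -ItemType Directory C:\Windows\SysWOW64\config\systemprofile\Desktop
# 64 bit Office
New-Item -ItemType Directory C:\Windows\System32\config\systemprofile\Desktop
```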

That’s it!

Note: This has been tested on Office 2010 and Office 2013

A little more information for those who are interested and some basic troubleshooting for those who do run applications as SYSTEM in general.

Firstly if you are troubleshooting running applications as SYSTEM go and grab PSEXEC.EXE from here. Put the PSEXEC.EXE file somewhere on your machine that you are testing from and then launch either a command prompt or PowerShell and change directory to the path where you placed the EXE. Now we can run powershell.exe (or cmd.exe if you prefer) as SYSTEM by using the command,

psexec.exe /i /s powershell.exe

This will load up another window which is running under the SYSTEM credentials, which we can confirm in PowerShell with the following command,

    [Environment]::UserName

which should give us the response,

    SYSTEM

From here you can run, for example, Excel (change directory to the office folder and run using the command ‘.\EXCEL.EXE’ in PowerShell).

Using Office 2010 in this way will show you that Excel launches and you can create a new document as normal, but when you click the save button, or even 'Save As', the application simply does nothing without that 'Desktop' folder created. Using Office 2013 in the same way shows that Microsoft has made some improvements, as the application actually creates the missing folder and continues to work perfectly fine. Unfortunately the folder still needs to be created manually if you simply try to run your code as SYSTEM.

Using PSEXEC.EXE in this way is also required if you plan on generating any password hashes for PowerShell code that will run under the SYSTEM account. But more on that in another post.