MySQL handy commands

After spending three days (so far) manually recovering databases in MySQL, I thought I would make my own cheat sheet of useful commands.

List the size of all databases in MySQL,

SELECT table_schema AS "DB Name",
       ROUND(SUM(data_length + index_length) / 1024 / 1024, 1) AS "DB Size in MB"
FROM information_schema.tables
GROUP BY table_schema;

List the row count of all tables in a database (estimate only),
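One way to do this (a sketch on my part, using the TABLE_ROWS statistic from information_schema, which is only an estimate for InnoDB tables):

```sql
-- TABLE_ROWS is a statistics-based estimate, not an exact count
SELECT table_name, table_rows
FROM information_schema.tables
WHERE table_schema = 'DB_NAME';
```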


List the row count of all tables in a database accurately (escaping the backticks so they survive the shell),

mysql -B -uusername -ppassword --disable-column-names --execute "SELECT CONCAT('SELECT ''',table_name,''' AS table_name, COUNT(*) AS exact_row_count FROM \`',table_schema,'\`.\`',table_name,'\` UNION ') FROM INFORMATION_SCHEMA.TABLES WHERE table_schema = 'DB_NAME';"

This will output lines such as (table names will vary),

SELECT 'mytable' AS table_name, COUNT(*) AS exact_row_count FROM `DB_NAME`.`mytable` UNION 

Grab the output, leave off the last UNION, finish the statement with a ';' and run it.
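That manual trimming can also be scripted. As a sketch (assuming sed is available; the sample lines below stand in for the real mysql output), replace the trailing UNION on the last line with a ';':

```shell
# Sample of the generated output (two hypothetical tables); in practice
# this would come from the mysql --execute command above.
generated="SELECT 'a' AS table_name, COUNT(*) AS exact_row_count FROM \`db\`.\`a\` UNION 
SELECT 'b' AS table_name, COUNT(*) AS exact_row_count FROM \`db\`.\`b\` UNION "

# Replace the trailing UNION on the last line with a ';' so the whole
# thing is runnable as a single statement.
printf '%s\n' "$generated" | sed '$ s/UNION *$/;/'
```

Piping the trimmed result straight back into mysql would then give the exact counts in one step.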

Return the number of tables in the database,

SELECT COUNT(*) FROM information_schema.tables WHERE table_schema = 'DB_NAME';

Import all .sql files in current directory (can modify wildcard search to be more exclusive),

find . -name '*.sql' | awk '{ print "source",$0 }' | mysql --batch DB_NAME
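If you would rather stop at the first failed import instead of continuing, a plain shell loop is one possible variation (a sketch; DB_NAME and credentials as above):

```shell
# Import each .sql file in turn, stopping on the first failure
for f in ./*.sql; do
  echo "Importing $f"
  mysql --batch DB_NAME < "$f" || { echo "Import of $f failed" >&2; break; }
done
```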

Troubleshooting Control Manager Operations Error 7006

Recently a number of servers in my workplace started to show up with the following errors,

Log Name: System
Source: Service Control Manager
Event ID: 7006
Level: Error
The ScRegSetValueExW call failed for Start with the following error: Access is denied.

A simple Google search finds plenty of talk about this error, with a lot of discussion around anti-virus software. Fair enough, but it seems nobody made it clear how to identify the underlying cause. A tip for those getting this error: check your System log around the time of the error messages and see what the Service Control Manager was doing at those times.

In my case the following information-level events appeared immediately before the errors,

Log Name: System
Source: Service Control Manager
Event ID: 7036
Level: Information
The Microsoft Network Inspection service entered the running state.

and immediately after,

Log Name: System
Source: Service Control Manager
Event ID: 7036
Level: Information
The Microsoft Network Inspection service entered the stopped state.

So I had my culprit, which in my case was Microsoft System Center Endpoint Protection (or Microsoft Security Essentials). I am still working on the resolution.

SCCM Kerberos Error 4

Recently it came to my attention that our SCCM servers were bringing up the following error for many of our workstations,

Log Name: System
Source: Security-Kerberos
Event ID: 4
Level: Error
The Kerberos client received a KRB_AP_ERR_MODIFIED error from the server computer1$. The target name used was cifs/…. This indicates that the target server failed to decrypt the ticket provided by the client. This can occur when the target server principal name (SPN) is registered on an account other than the account the target service is using. Ensure that the target SPN is only registered on the account used by the server. This error can also happen if the target service account password is different than what is configured on the Kerberos Key Distribution Center for that target service. Ensure that the service on the server and the KDC are both configured to use the same password. If the server name is not fully qualified, and the target domain (…) is different from the client domain (…), check if there are identically named server accounts in these two domains, or use the fully-qualified name to identify the server.

I did find the following posts online,
Event ID 4 — Kerberos Client Configuration, which suggests (to summarise) deleting the offending computer object and recreating it. Very good advice, but it did not resolve my issue.

Kerberos and SPN problems, which suggested installing SPN records for the SQL server, with follow-up posts saying that this did not work and that it is a DNS reverse lookup issue.

This post got me thinking. I checked my DNS reverse lookup zones and, from what I could see, they were all there. Next I wondered what computer1 and computer2 resolved to in DNS. Bingo: both of these machines resolved to the same IP address, meaning that when SCCM does its reverse lookup for computer1 it gets back the name of computer2 (I still have no idea why SCCM is doing this reverse lookup). The cause of this issue is not that we have two computers out there with the same IP address, but that there are two records in DNS for the same IP with two different names. This was due to our DHCP lease times being much shorter than our DNS scavenging times. To resolve the issue we increased our DHCP leases to 8 days and our scavenging to 5-10 days.

If you want to see some instructions on setting up a reverse lookup zone in DNS check out this guide from Tom’s Hardware Create Reverse Primary DNS Zone in Windows Server 2012

If you want to see some instructions on setting up DNS scavenging settings check out this guide Don’t be afraid of DNS Scavenging. Just be patient.

If you want to see some instructions on where to change DHCP lease time check out these very basic instructions (sorry best I could find without being too wordy) How do I change the DHCP address lease time in Windows 2000?

Troubleshooting Microsoft SQL full data or log volumes

If you have administered any SQL servers, no doubt you have come across a transaction log file (or sometimes a data file) that has filled up your drive, leaving you to work out what to do.

First you should determine whether the log files suddenly grew abnormally, which can happen if someone runs a complex query. Hopefully there is some monitoring history for you to reference to determine what has happened. The following resolution is really only recommended when the log or data files have grown due to some one-off process, such as a database having half of its data removed or some unique update that grew the log file to a huge size.
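One quick way to check for abnormal log growth across every database at once (my own sketch, not from the page referenced below) is the built-in DBCC command:

```sql
-- Reports log file size and percentage in use for every database on the instance
DBCC SQLPERF(LOGSPACE);
```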

The following code comes from this page and contains a lot more detail for those who want more information.

You can use this query to display the current size of the DB files and how much free space each has.

SELECT name AS FileName,
       size/128.0 AS CurrentSizeMB,
       size/128.0 - CAST(FILEPROPERTY(name, 'SpaceUsed') AS INT)/128.0 AS FreeSpaceMB
FROM sys.database_files;

To do this, open up SQL Server Management Studio and click 'New Query'.


Then paste the code into the new query window and select which database you wish to run it against using the database drop-down box on the toolbar.


When you click the 'Execute' button you will see a result at the bottom of the window.


What this result shows us is that the database has a data file and a log file. In this example the data file is roughly 23 gigabytes in size with 4 gigabytes of that being free space, while the log file is 9 gigabytes with 9 gigabytes free (in fact only 40 megabytes is currently in use). Whether the log file is overly large and wasting space is difficult to say; in general this file will grow and shrink internally as a normal function. However, if this log file had been, for example, 1 gigabyte in size for the past 6 months and grew to 9 gigabytes overnight, then it may be possible to shrink the file and recover some space. Keep in mind the log file will grow again if it needs to (assuming you have not disabled the auto-grow options).
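As a small variation on the query above (my own sketch, not part of the original page), a percent-free column makes oversized files easier to spot at a glance:

```sql
SELECT name AS FileName,
       size/128.0 AS CurrentSizeMB,
       size/128.0 - CAST(FILEPROPERTY(name, 'SpaceUsed') AS INT)/128.0 AS FreeSpaceMB,
       100.0 * (size - CAST(FILEPROPERTY(name, 'SpaceUsed') AS INT)) / size AS PercentFree
FROM sys.database_files;
```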

So if you make the decision to shrink the file here are the steps to follow.

Right click on the database you wish to shrink and select Tasks -> Shrink -> Files.


For this example I am shrinking the log file for this database, so I need to change the 'File type' drop down box to Log and click OK (or select an appropriate 'Shrink file to' value first if you like).


Now with any luck the file has shrunk in the file system and recovered some of your drive space.
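The same shrink can be done in T-SQL if you prefer (a sketch; the database and logical file names here are hypothetical, and the target size is just an example):

```sql
USE [YourDatabase];

-- Find the logical name of the log file first
SELECT name, type_desc FROM sys.database_files;

-- Shrink the log file down to a 1024 MB target
DBCC SHRINKFILE (YourDatabase_log, 1024);
```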

As a side note, you may be thinking to yourself: why didn't I just go into the shrink file dialog box to see what the 'Available free space' was for the database? Firstly, this is a more dangerous approach, as you are leaving yourself open to accidentally clicking OK and shrinking a file that may not require it. Secondly, you can check multiple databases more quickly using the script and the drop-down menu (I am still trying to work out how to simply show all databases in one script, and if I ever work it out I will update my post).
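On showing all databases in one script, one possible approach (a sketch using the undocumented sp_MSforeachdb stored procedure, so treat it with appropriate care) is:

```sql
-- The ? placeholder is substituted with each database name in turn
EXEC sp_MSforeachdb 'USE [?];
SELECT DB_NAME() AS DatabaseName,
       name AS FileName,
       size/128.0 AS CurrentSizeMB,
       size/128.0 - CAST(FILEPROPERTY(name, ''SpaceUsed'') AS INT)/128.0 AS FreeSpaceMB
FROM sys.database_files;';
```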

Thanks for reading.

Running Microsoft Office as SYSTEM account

If you have ever written some code that uses one of the Microsoft Office products and intended to run it on a server without user interaction using the built-in SYSTEM account, you may very well have discovered that it simply does not work, and that troubleshooting why is difficult.

According to this extremely helpful Microsoft article, doing this is basically considered a no-no. What would have been even better is if they had said 'hey, we don't support this, but here is how to make it work', especially considering how ridiculously simple the solution is.

Credit to this post that finally gave me the answer, which is to create the SYSTEM account's Desktop folder.

32 bit Office:

64 bit Office:
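The exact paths did not survive in this copy, but as a sketch of the usual workaround (paths assume a default 64-bit Windows install; verify them for your system), the folders can be created like so:

```powershell
# Assumed paths for the SYSTEM profile Desktop folder on 64-bit Windows:
# 32-bit Office uses the WOW64 system profile, 64-bit Office the System32 one.
New-Item -ItemType Directory -Force -Path "C:\Windows\SysWOW64\config\systemprofile\Desktop"
New-Item -ItemType Directory -Force -Path "C:\Windows\System32\config\systemprofile\Desktop"
```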

That’s it!

Note: This has been tested on Office 2010 and Office 2013

A little more information for those who are interested and some basic troubleshooting for those who do run applications as SYSTEM in general.

Firstly, if you are troubleshooting running applications as SYSTEM, go and grab PSEXEC.EXE from here. Put the PSEXEC.EXE file somewhere on the machine you are testing from, then launch either a command prompt or PowerShell and change directory to the path where you placed the EXE. Now we can run powershell.exe (or cmd.exe if you prefer) as SYSTEM using the command,

psexec.exe /i /s powershell.exe

This will load up another window which is running under the SYSTEM credentials, which we can confirm in PowerShell with the following command,

PS C:\> [Environment]::UserName

which should give us the response,

SYSTEM

From here you can run, for example, Excel (change directory to the office folder and run using the command ‘.\EXCEL.EXE’ in PowerShell).

Using Office 2010 in this way will show you that Excel launches and you can create a new document as normal, but when you click the save button, or even 'Save As', the application simply does nothing unless that 'Desktop' folder has been created. Using Office 2013 in the same way shows that Microsoft has made some improvements: the application actually creates the missing folder and continues to work perfectly fine. Unfortunately, the folder still needs to be created manually if you simply try to run your code as SYSTEM.

Using PSEXEC.EXE in this way is also required if you plan on generating any password hashes for PowerShell code that will run under the SYSTEM account. But more on that in another post.

Reading emails from Office365 account using PowerShell

Welcome fellow Sys Admins.

After spending all day trawling the internet for a way to easily read through the mail in an O365 mailbox using PowerShell, and gathering code from all over the place to do it, I thought it would be good to upload my code for everyone to share. Hopefully you end up here at the start of your quest and not at the end 🙂

This code will allow you to read email directly from O365 without the need to install Outlook. All you need is the .dll file from the Microsoft Exchange Web Services Managed API 2.2, which you can download here. Ensure that you update the file path to the dll if it is different from what is defined in the code.

Credit goes to the following post from ‘Steve Goodman’s Exchange & Office 365 Blog’ for the majority of the code,
How to export Email Headers from Exchange using Powershell

Credit also to Exchange Managed API autodiscover with Powershell for resolving the security error when trying to autodiscover to O365, by adding the $TestUrlCallback script block. While you could simply call autodiscover with the second argument as $true, you would then be trusting any redirections that may happen.

# Written by Ben Penney
$mail=""
$password="password"

# Set the path to your copy of EWS Managed API
$dllpath = "C:\Program Files\Microsoft\Exchange\Web Services\2.2\Microsoft.Exchange.WebServices.dll"

# Load the Assembly
[void][Reflection.Assembly]::LoadFile($dllpath)

# Create a new Exchange service object
$service = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService

# These are your O365 credentials
$service.Credentials = New-Object Microsoft.Exchange.WebServices.Data.WebCredentials($mail,$password)

# This TestUrlCallback is purely a security check
$TestUrlCallback = {
    param ([string] $url)
    if ($url -eq "") {$true} else {$false}
}

# Autodiscover using the mail address set above
$service.AutodiscoverUrl($mail,$TestUrlCallback)

# Create a Property Set to include the body and headers of each email
$PropertySet = New-Object Microsoft.Exchange.WebServices.Data.PropertySet([Microsoft.Exchange.WebServices.Data.BasePropertySet]::FirstClassProperties)

# Set email body to text
$PropertySet.RequestedBodyType = [Microsoft.Exchange.WebServices.Data.BodyType]::Text

# Set how many emails we want to read at a time
$numOfEmailsToRead = 100

# Index to keep track of where we are up to. Set to 0 initially.
$index = 0

# Do/while loop for paging through the folder
do {
    # Set what we want to retrieve from the folder. This will grab the next $numOfEmailsToRead emails
    $view = New-Object Microsoft.Exchange.WebServices.Data.ItemView($numOfEmailsToRead,$index)

    # Retrieve the data from the folder
    $findResults = $service.FindItems([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Inbox,$view)

    foreach ($item in $findResults.Items) {
        # Load the additional properties for the item
        $item.Load($PropertySet)

        # Output the results
        "From: $($item.From.Name)"
        "Subject: $($item.Subject)"
        "Body: $($item.Body.Text)"
        "References: $($item.References)"
        "InternetMessageID: $($item.InternetMessageID)"
        "InternetMessageHeaders:"
        # Display the headers - using a little foreach loop
        $item.InternetMessageHeaders | foreach {"$($_.Name): $($_.Value)"}
        ""
    }

    # Increment $index to the next block of emails
    $index += $numOfEmailsToRead
} while ($findResults.MoreAvailable) # Do/While there are more emails to retrieve

Update: I just found another blog where someone has done similar with some additional coding for filtering mail based on subject and read status,
Check/Read Messages Exchange/Office365 Inbox with Powershell
So check that out as well!