The life of an application config from the registry to SQLite to SQL CE to XML

A new beta of ShutOff 2000 v3.0.0 is now available. The product is fast approaching code completion.

The Visual Basic 6 edition of the application has always used the Windows Registry to store its settings. However, Microsoft’s best practice for the Windows 7 and Windows Server 2008 R2 platform does not allow for this, so I’ve been researching new ways to store settings outside of the Registry.

Right from the start, I wanted to store everything in an XML file. I toyed around with encrypting the file, making it binary, and so on, but eventually I dismissed the notion and decided to use an embedded database.

That’s where things got interesting. Originally, I used SQLite, but after playing around with it for a while, I decided that it carried a lot of overhead, as well as a non-native interop layer, so I switched instead to SQL Compact Edition (SQL CE), which operates in a similar fashion to SQLite.

Unfortunately, in all of my testing, SQLite and SQL CE were just not fast enough. The queries themselves to store and retrieve records were fast, but the initial connection was too slow. Since the new ShutOff 2000 has a command-line interface, I wanted millisecond response times when reading stored settings.

I readily admit that the connection speed is probably my fault, but when it comes right down to it, I originally used the Windows Registry because I only need to save around ten plain-text values, and an embedded database felt like over-engineering.
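The shape of that trade-off is easy to demonstrate. Here is a rough sketch using Python’s built-in sqlite3 module as a stand-in for the .NET/SQL CE stack I was actually using (so the absolute numbers won’t match, and all the paths and setting names are invented), comparing a fresh database connection per read, as each command-line invocation would need, against a single flat-file read:

```python
import os
import sqlite3
import tempfile
import time

# Set up one setting stored two ways: an SQLite database and a flat file.
workdir = tempfile.mkdtemp()
db_path = os.path.join(workdir, "settings.db")
txt_path = os.path.join(workdir, "settings.txt")

con = sqlite3.connect(db_path)
con.execute("CREATE TABLE settings (key TEXT PRIMARY KEY, value TEXT)")
con.execute("INSERT INTO settings VALUES ('ShutdownDelay', '30')")
con.commit()
con.close()

with open(txt_path, "w") as f:
    f.write("ShutdownDelay=30\n")

def read_db():
    # A fresh connection per call, as a short-lived CLI process would need.
    con = sqlite3.connect(db_path)
    row = con.execute(
        "SELECT value FROM settings WHERE key = 'ShutdownDelay'").fetchone()
    con.close()
    return row[0]

def read_file():
    # One open and parse; no engine startup at all.
    with open(txt_path) as f:
        return dict(line.strip().split("=", 1) for line in f)["ShutdownDelay"]

start = time.perf_counter()
for _ in range(100):
    read_db()
db_seconds = time.perf_counter() - start

start = time.perf_counter()
for _ in range(100):
    read_file()
file_seconds = time.perf_counter() - start
```

On my machine the connection setup dominates by a wide margin for a handful of values; the query itself is never the bottleneck.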

After announcing on Twitter that SQL CE was too slow, I decided to fall back to my original XML idea. As mentioned above, this comes with specific rules about where you can save your files. At least with the embedded database, the file was distributed with the application, but in the case of XML configuration, I’d need to be able to freely create and modify the settings file at will.

Enter a question that pointed me in the right direction. I used the suggestions provided by the accepted answer, but was not completely satisfied with the result, which only worked at a per-user level. Instead, I used my Bingle-Fu to find a project called “User options in XML” on CodeProject, by Richard Schneider.

I tweaked the class that he wrote so that it would utilise Environment.SpecialFolder.CommonApplicationData instead of Environment.SpecialFolder.ApplicationData as written, and added some very basic error handling for managing a corrupt XML file (which came about when I renamed the class and got an InvalidOperationException reading the file).
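Schneider’s class is C#, but the idea ports anywhere. Here is a minimal sketch of the same pattern in Python (the folder, file name, and default values are invented for illustration): settings live in an XML file under a machine-wide data folder, and a file that fails to parse is treated the same as a missing one, falling back to defaults instead of crashing:

```python
import os
import tempfile
import xml.etree.ElementTree as ET

# Stand-in for the CommonApplicationData folder; names are examples only.
SETTINGS_DIR = os.path.join(tempfile.gettempdir(), "ShutOff2000-demo")
SETTINGS_PATH = os.path.join(SETTINGS_DIR, "user.config")
DEFAULTS = {"ShutdownDelay": "30"}

def load_settings():
    """Read settings from XML; a missing or corrupt file yields the defaults."""
    try:
        tree = ET.parse(SETTINGS_PATH)
        return {el.get("name"): el.text or "" for el in tree.findall("setting")}
    except (OSError, ET.ParseError):
        # Corrupt (or absent) XML: fall back rather than die on a parse error.
        return dict(DEFAULTS)

def save_settings(settings):
    """Write the settings dict out as <settings><setting name=…>…</setting>…."""
    os.makedirs(SETTINGS_DIR, exist_ok=True)
    root = ET.Element("settings")
    for name, value in settings.items():
        el = ET.SubElement(root, "setting", name=name)
        el.text = value
    ET.ElementTree(root).write(SETTINGS_PATH, encoding="utf-8")
```

The fallback in `load_settings` is the bit that would have saved me from the renamed-class exception: bad file in, defaults out, and the next save rewrites it cleanly.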

The good news is that although this wasn’t exactly a drop-in replacement for SQL CE, the refactoring I previously did to separate registry settings from SQL CE settings reduced the implementation time dramatically. I was also able to refactor a lot more code and make the product more stable.

So now I’m feeling a lot better about how this is turning out. It’s been a long road for me: I’m more of a database person than a software developer, and I really wanted the embedded databases to work, but I think this is the right approach.

Happy New Year!

Regular followers will recognise the date of 22 May as the first day of the Smiters of Iniquity calendar year.

As I mentioned last year on the Smiters’ forum, I will be naming the months of the year after disciples as they die, in honour of the memory left by Ian when he died, and his mostly carbon life form became more mostly carbon, due to the fact that he no longer has as much water in his system, a side effect of being dead.

Henceforth, the start of the calendar will be known as Glory Hole Day, and the month will be known as Ian. 7.1.1 begins now.

On the importance of good backups

I’ve been vocal for some years about the importance of backups, and I have zero sympathy for anyone who does not have good backups*.

This comes from (very) painful experience: a few years ago, while migrating data from several drives to my then-new computer, I lost the drive everything had been copied to, before I made a backup. It was not recoverable:

Unfortunately we were not able to recover any data from the Seagate hard drive you sent us. The drive arrived to us with what looked like a bad voice coil. We were able to verify the circuit board and voice coil were operational. We opened the hard drive in a clean room and discovered massive damage to all platters and heads from a skipping head. We can see millions of pot marks on the platters from a skipping head. The internal white filter is black with platter debris. The desiccant packet inside the drive shows that it absorbed a lot of water causing black blotches on the outside. This drive was turned on after the initial failure for a very long time to create the damage. We typically see this type of damage when someone puts the drive in and out of a freezer several times. This would not be a recoverable drive.

(Thanks to Robert Fovall for taking a look. If you’re interested, his company has the best rates and service out there.)

Tonight, to participate in the latest survey from Paul Randal, I ran a query on one of my client’s servers. While I was signed in, I decided to do a checkup on disk space and backups, because, you know, backups.

I noticed that the data drive on the server was using over 500GB for the backup folder (which is copied every 15 minutes to a separate machine, with a removable encrypted drive, swapped out every week, etc., etc.).

The problem is that I know the backup drive has a 500GB capacity. Strangely, when I checked this drive last week, it hadn’t run out of space. So someone missed it on their daily checklist (or Run Book). I’ll have to figure that out in the morning.

More importantly, the SQL Server backups were not being copied off the SQL Server machine every fifteen minutes. This is almost as bad as not having a backup at all: should the RAID fail, I’d lose everything newer than the last backup that made it off the machine, which for all I know could be a week old.

So I decided to delete some of the old backup files. Normally I recommend against this unless absolutely necessary, but our plan to retain three months of backups for this server simply doesn’t fit on the backup drive. So we’ll have to modify the plan and update the Run Book to take this into account.
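For the Run Book, the cleanup wants to be a scripted retention rule rather than a hand-pick of files. Here is a sketch (in Python for illustration; the folder layout and the 90-day window are assumptions, and the real window should be whatever actually fits the drive): delete any backup file older than the cutoff and report what was removed:

```python
import os
import tempfile
import time

def prune_backups(folder, keep_days=90):
    """Delete backup files older than keep_days; return the names removed."""
    cutoff = time.time() - keep_days * 86400
    deleted = []
    for name in sorted(os.listdir(folder)):
        path = os.path.join(folder, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            deleted.append(name)
    return deleted

# Tiny demo: a fresh file survives, a 100-day-old file is pruned.
demo = tempfile.mkdtemp()
open(os.path.join(demo, "new.bak"), "w").close()
old = os.path.join(demo, "old.bak")
open(old, "w").close()
os.utime(old, (time.time() - 100 * 86400,) * 2)
```

Scheduled daily, something like this keeps the retention decision in the plan rather than in someone’s memory (or their daily checklist).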

But here’s the fun part: I decided to use my existing recovery scripts to do a test-restore of the largest database, and guess what? The scripts were wrong. There was a small thing, easily fixed by me, but for someone who doesn’t know SQL Server, and can’t read error messages? World-ending (or in this case, medical clinic-closing).

I attended to a disk space shortage, tested my restore scripts, and found them wanting. So this is a reminder that even your perfectly-laid plans must be checked and re-checked periodically.

* A good backup is one that can be restored. I know it sounds obvious, but this is why you test them, right? Right?
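The only test that counts is actually restoring a copy somewhere else and checking that it matches. Here is a file-level sketch of that principle in Python (for a SQL Server backup the real equivalent is a test restore, not a file copy, so treat this as an analogy): “restore” the backup to a scratch location and compare checksums against the original:

```python
import hashlib
import os
import shutil
import tempfile

def sha256(path):
    """Checksum a file in chunks so large backups don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_is_restorable(original, backup):
    """Restore the backup to a scratch dir and verify it matches the original."""
    restore_dir = tempfile.mkdtemp()
    restored = os.path.join(restore_dir, os.path.basename(backup))
    shutil.copy2(backup, restored)  # the "restore" step, file-level analogy
    return sha256(restored) == sha256(original)
```

A backup that has never been through something like this is a hope, not a backup.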

Bug in git svn clone, failing at r1000

There’s an interesting bug I ran across tonight while converting a large SVN repository (containing over 4,100 revisions and 52,400 objects) to Git, using git svn clone. The clone failed at r1000 and left me with an incomplete repository.

I did some searching and came across several questions on Stack Overflow. The one that is still open in my browser sums it up quite well:

Is it possible to clone a git repository which has more than 1000 revisions? We tried to do it, and after the 1000th commit it does a GC and exits, leaving the clone in an unusable state.

While I didn’t have duplication in my .config file per the above question, I did have to run git svn fetch manually, several more times, to get the rest of the SVN revisions down properly.
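Because git svn fetch resumes from wherever the previous attempt stopped, the workaround amounts to re-running it until it exits cleanly. Sketched here as a small Python helper for consistency with the rest of these posts (the attempt cap is my own addition, to avoid looping forever on a genuine error rather than a stall):

```python
import subprocess
import sys

def run_until_success(cmd, max_attempts=10):
    """Re-run cmd until it exits 0; returns the number of attempts taken."""
    for attempt in range(1, max_attempts + 1):
        if subprocess.run(cmd).returncode == 0:
            return attempt
    raise RuntimeError(f"{cmd!r} still failing after {max_attempts} attempts")

# In the broken clone's working directory, something like:
#   run_until_success(["git", "svn", "fetch"])
```

In my case a handful of manual re-runs did the job; the loop just saves the babysitting on a repository with thousands of revisions.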

My aim in converting is fairly obvious: Git is faster and uses less space than SVN, doesn’t scatter those crazy .svn folders everywhere, and has far superior branch and merge functionality. For comparative purposes, this single repository went from 176MB to 86.5MB (a saving of 51%) after conversion.

The original source control used on this project was Visual SourceSafe, which I converted to SVN a couple of years ago using VSS2SVN. The VSS repository was over 1GB in size, so I’ve effectively recovered over 91% of my storage, just by using proper source control.

ShutOff 2000 progress report

I got a new laptop in March, which includes an SSD and 16GB of RAM, and it was enough inspiration (over and above the virtual machines for SQL Server that I’ve been playing with) to work some more on ShutOff 2000.

I’ve made astounding progress on it, considering the lethargic pace the C# version has suffered since 2002. Ten years? Ouch …

The latest version of the VB6 edition (v2.8.6) was released on 31 March 2011, just over a year ago. After that, I worked on version 2.8.7, which included some neat new features for the software updater code. However, this code remains unreleased to the public because, well, 3.0.0 (the C# version) is almost done!

In the last few weeks, I’ve completely refactored some chunks of code that were badly designed in the first place (as Jeff Atwood says, the worst code you’ve looked at is your own). Also, instead of using the Windows Registry to store settings, I’m going with a SQL Compact Edition database file. I picked this over SQLite because it’s easier to code against (though I prefer the idea of SQLite and may change my mind in a later version).

My opinion on the piracy problem is pragmatic, as long-time readers will already know. When the Core group cracked ShutOff 2000 in the early part of the last decade, I was honoured that they took the time. In fact, their key generator had a better UI than my own and I took some design tips from it.

In any event, I think the C# version is ready to go in the next couple of weeks. I need to do some testing still, and then there’s the small matter of how to package and release it. I’ve toyed with the idea of an automated registration system, which is very different to the manual process I currently have in place. For every new registration, I manually generate a registration code and email the purchaser directly. Madness, but there you are.

The latest beta is available from the information page.

The other thing I’ve been playing with a lot more is Git for source control. I decided that I didn’t like SVN after all, and I already use Git for managing my website source control on the Mac, so it was a no-brainer to use it on Windows too.

To sum up this rambling post, this is one more step in the process of sorting out my life. I’m getting on top of things I’ve left too long. Spring cleaning, I suppose.