Many a year ago, SQL Server CE (SQL Server Compact Edition) was an amazing little database that Microsoft released for use on Windows CE / Pocket PC devices. The coolest thing about it was that it was somewhat compatible with the big-boy version of SQL Server, so a lot of your queries and SQL knowledge from SQL Server could be carried over to mobile software.
This little database performed adequately for most situations, considering it was running on a little mobile 200 MHz CPU, so the occasional slowdown was often excusable. Except when it came to inserting a lot of data. Keep in mind that the common technology stack at the time was to use the .NET Compact Framework (a stripped-down, mobile-enabled version of the .NET Framework) and have this .NET code interact with your SQL CE database. Microsoft had envisioned a beautiful world where ADO.NET would work in the same disconnected fashion as it did on the full .NET Framework, using DataSets and DataTables, and even DataReaders if you were a performance maniac. These disconnected data layers work great on the full framework, but on a mobile device they proved to be a big headache: after all, the device is the only thing interacting with the local database, so why duplicate everything in a table into yet another in-memory structure that effectively mimics... a table? Many developers would eventually ditch DataSets and move to using DataReaders for populating Compact Framework forms, then use direct SQL for adding and updating records in the database. In many cases, this proved to be the only way to avoid waiting several seconds for a form to load a single record.
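A minimal sketch of that pattern, as it might have looked on the Compact Framework (the Customers table, column names, and database path here are hypothetical, not from any specific project):

```csharp
using System.Data.SqlServerCe;

class CustomerForm
{
    static void LoadAndSave()
    {
        using (SqlCeConnection conn = new SqlCeConnection(@"Data Source=\MyApp\app.sdf"))
        {
            conn.Open();

            // Read with a DataReader instead of filling a DataSet,
            // binding values straight to the form.
            using (SqlCeCommand cmd = new SqlCeCommand(
                "SELECT Id, Name FROM Customers WHERE Id = @id", conn))
            {
                cmd.Parameters.AddWithValue("@id", 42);
                using (SqlCeDataReader reader = cmd.ExecuteReader())
                {
                    if (reader.Read())
                    {
                        int id = reader.GetInt32(0);
                        string name = reader.GetString(1);
                        // ...assign id/name to form controls here...
                    }
                }
            }

            // Write back with direct SQL rather than DataSet.Update().
            using (SqlCeCommand update = new SqlCeCommand(
                "UPDATE Customers SET Name = @name WHERE Id = @id", conn))
            {
                update.Parameters.AddWithValue("@name", "New Name");
                update.Parameters.AddWithValue("@id", 42);
                update.ExecuteNonQuery();
            }
        }
    }
}
```

The point is simply that no DataSet ever exists: the reader streams rows out, and plain parameterized commands push changes back in.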
But most mobile projects ran into a much larger problem: the need for larger databases on the mobile device. It makes perfect sense to need multi-megabyte databases on a mobile computer, since most of these devices ran in disconnected mode and would sync their (often sizeable) databases with a larger server whenever they were docked. These large databases were almost impossible to create with a SQL CE database. Most devs would initially download a large chunk of data from a remote server, fill up a DataSet, and then tell it to update the database. This was a disaster: even 10k records could take hours to update.
So, why don't we create a SQL CE database on a server, and just download the whole thing to the device to update it? Sorry: at that time, SQL CE was only supported on mobile computers; no full-PC version existed. (A company, whose name I don't recall, actually created a product that would do this by reverse engineering the SQL CE file format... yikes!)
The next step most devs would take was to ditch the DataSet and use SQL INSERT statements to add each row of data directly. This helped a lot, and with tuning (prepared statements, etc.) could speed things up considerably. But even then, that same 10k records could still take 30 minutes. Possibly an order of magnitude faster, but still nowhere near usable.
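The prepared-statement tuning looked roughly like this sketch: one parameterized INSERT compiled once and reused for every row (table, columns, and path are hypothetical):

```csharp
using System.Data;
using System.Data.SqlServerCe;

class BulkInsertDemo
{
    static void InsertRows()
    {
        using (SqlCeConnection conn = new SqlCeConnection(@"Data Source=\MyApp\app.sdf"))
        {
            conn.Open();
            using (SqlCeCommand cmd = new SqlCeCommand(
                "INSERT INTO Readings (Id, Value) VALUES (@id, @value)", conn))
            {
                // Prepare() needs explicitly typed parameters.
                cmd.Parameters.Add("@id", SqlDbType.Int);
                cmd.Parameters.Add("@value", SqlDbType.NVarChar, 100);
                cmd.Prepare();  // compile the statement once

                for (int i = 0; i < 10000; i++)
                {
                    cmd.Parameters["@id"].Value = i;
                    cmd.Parameters["@value"].Value = "row " + i;
                    cmd.ExecuteNonQuery();  // still one engine round trip per row
                }
            }
        }
    }
}
```

This avoids re-parsing the SQL for every row, but each insert still goes through the query processor, which is why it only bought an order of magnitude.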
My company back in those days created a product specifically to address this problem (you may recall the SSCEDirect product), in which we provided a Compact Framework interface to a lower-level native-code OLE DB interface to a SQL CE database. This product would typically increase performance by well over another order of magnitude, often two. So now our 10k records might take only 15 seconds to load, or even less. This was what SQL CE was meant to be.
If SSCEDirect sounds familiar, that's probably because it is what MS later added to the SQL CE .NET interface via the SqlCeResultSet object, which effectively does exactly the same thing our product did in those days.
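For reference, a sketch of the SqlCeResultSet approach: a table-direct command that bypasses the query processor and writes rows straight into the base table (table name and column ordinals are hypothetical):

```csharp
using System.Data;
using System.Data.SqlServerCe;

class TableDirectDemo
{
    static void InsertRows()
    {
        using (SqlCeConnection conn = new SqlCeConnection(@"Data Source=\MyApp\app.sdf"))
        {
            conn.Open();
            using (SqlCeCommand cmd = conn.CreateCommand())
            {
                // TableDirect means CommandText names a base table, not a SELECT.
                cmd.CommandType = CommandType.TableDirect;
                cmd.CommandText = "Readings";

                SqlCeResultSet rs = cmd.ExecuteResultSet(ResultSetOptions.Updatable);
                for (int i = 0; i < 10000; i++)
                {
                    SqlCeUpdatableRecord rec = rs.CreateRecord();
                    rec.SetInt32(0, i);            // ordinal 0: Id
                    rec.SetString(1, "row " + i);  // ordinal 1: Value
                    rs.Insert(rec);
                }
            }
        }
    }
}
```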
OK, so now it's 2012 and I'm writing about all this. Why?
It is just amazing to me that SQL CE has grown up a lot over the years, and has even been repurposed as a lightweight in-process database meant for use in non-mobile, full-PC environments. I love the concept, but those old performance problems still haunt me today. Even running SQL CE 4.x on a fast desktop PC, if I try to insert tens or hundreds of thousands of records using Entity Framework, it still grinds to a halt. The full SQL Server handles this fine, and yet the stripped-down, single-user, in-process local database, with a massive CPU and plenty of RAM (which should be able to run circles around the big multi-user database), just dies on large numbers of inserts. Sure, there's the overhead of Entity Framework, but even dropping back to plain ADO.NET isn't much faster. And yes, I'm using indexing correctly, etc.
So what am I doing in my app to make it perform again? Yep, it's back to the direct-to-table hack we had to use back in the Windows CE days. Right now I'm actually using Erik's "SQL CE Bulk Copy" tool (at http://sqlcebulkcopy.codeplex.com) to help accomplish this.
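In case you haven't tried it, usage is a one-liner around your existing data. A sketch, with a hypothetical table and path (the ErikEJ.SqlCe namespace is my assumption from the library's docs; check the download):

```csharp
using System.Data;
using ErikEJ.SqlCe;  // SqlCeBulkCopy library from the CodePlex project

class BulkCopyDemo
{
    static void LoadRows()
    {
        // Any DataTable (or IDataReader) works as the source.
        DataTable table = new DataTable("Readings");
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Value", typeof(string));
        for (int i = 0; i < 100000; i++)
        {
            table.Rows.Add(i, "row " + i);
        }

        using (SqlCeBulkCopy bulkCopy =
            new SqlCeBulkCopy(@"Data Source=C:\Data\app.sdf"))
        {
            bulkCopy.DestinationTableName = "Readings";
            bulkCopy.WriteToServer(table);
        }
    }
}
```

Under the hood it does table-direct SqlCeResultSet inserts, which is exactly the point: same trick, nicer wrapper.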
Feels like 2003 again.