One of the most frequent questions on forums is how to back up MySQL databases directly from a .NET application. Unfortunately, unlike Access, SQLite, etc., it is not really a case of just copying a file and renaming it; you really have to get down and dirty and basically recreate the SQL that creates all tables and inserts all data back into them from scratch. There are applications that do this for you, obviously, such as mysqldump, mysqlhotcopy, etc., but these require some knowledge of MySQL and relational databases in general. Unfortunately, in my experience, 99% of my clients do not even know what a relational database is; they just want a computer program that can do their invoices and manage their stock, with pretty graphs to look at. When it comes to backing up, they just want to press a button that says Back Up and have everything taken care of. The obvious answer to this would be to use
Process.Start("mysqldump", filename), and hey presto, you have a full backup. Sadly, I have never had much success with this. Believe me, I tried, but it would always stuff up, especially when it was the client trying to do a backup with no one else around to recover the process for them. Then you have an irate client on your hands, telling you that you have the programming skills of a twit who couldn't even make a backup system that works. So I set out to make a bulletproof, fully managed backup class that I could use in my .NET apps.
I just thought, having completed most of what I set out to accomplish, that I would share the results of that effort with everyone else out there who needs something like this. First off though, I need to stress that I am in no way a database expert, rather I am just a humble application developer who has built up a business building custom business applications that just happen to have a database for the back-end. While I have fully tested this successfully on my own databases, those databases tend to be small and located on the machine that is doing the backup. For the database experts out there, I would really appreciate your comments and criticisms to optimise this and correct things that I have done incorrectly. I also need to apologise in advance if the code lines are too long to read without scrolling. I have widescreen monitors, and I hate line continuation characters in my code (except maybe to stress SQL - maybe), so there.
Other than the fact that you need access to a MySQL server with a database to play around with, and the .NET connector for MySQL which can be found here: MySql Connector for .NET, you will only need Visual Studio (I have used VS 2008) to run the project and test the backups.
Obviously, the first thing to do once you have established what the project is setting out to do, is to figure out what is necessary to accomplish those goals. When it comes to backing up a database schema, the only things to make a decision on are:
1. What is the optimal way to recreate a database from scratch?
Having done the research, mainly studying the way other people and organisations have accomplished this, I came to the conclusion that recreating the SQL for the
INSERTs for each table would be the most obvious way to go, especially as this matched our original conclusion. We also needed to decide how far to go in recreating the database. Would we need to back up Stored Procedures, Functions, and Views? What about recreating Indexes? And Foreign Keys?
I decided that it would be better to start off with the aim of being able to recreate any database, complete with Stored Procedures, Functions, Views, and Foreign Keys. As far as Indexes go, I would rely on MySQL itself to create Indexes on Unique, Primary, and Foreign Keys automatically when the tables are recreated on restoration. Any other Indexes would have to be created manually once the restoration is complete, by the SysAdmin.
2. What would be the best way to store the SQL statements created by the backup?
The most obvious choice here would be a simple text file that could just be read as a stream and the SQL parsed on the fly so to speak. But this opened up questions as to how to separate different sections such as Tables, Stored Procedures etc., as you would need to use Regular Expressions, or some sort of comma delimiter or something to find out where you were in the process of restoration. This option also led to questions regarding how to ensure the integrity of the data, and how to validate the backup file that was selected for restoration.
The solution which came to me was to use an XML file format. This solved all the questions above: you could just use XML tags to keep the sections apart without any hassle, and, using XML attributes, it would be possible not only to restore the whole database, but just the bits that you needed to. It also solved all validation issues, as you could create an XML schema that would be used to validate the whole file even as the user was selecting it. As I was using VB.NET to create the class, I could also use LINQ and XML literals (sorry C# guys - you will have to wait for .NET 5.0) to create and read the XML file with relative ease. Many thanks to Jorge Paulino for his article on XML literals, as I had never heard of them before then. Another advantage of this approach is that, since XML is an international standard, it would be relatively easy to recreate the database even without the application that created the file. Even just opening it up in Notepad, it would be easy to see the structure of the database and the data.
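To make the layout concrete, a stripped-down backup file might look something like this. The tag names follow the schema described later in the article; the database name and content shown here are purely illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!--MySql Backup File-->
<DataBase Name="MyDataBase">
  <Tables>
    <Table Name="Table1">CREATE TABLE `Table1` (`ID` int(10), `Name` varchar(45))</Table>
  </Tables>
  <Constraints></Constraints>
  <Inserts>
    <Insert>INSERT INTO `MyDataBase`.`Table1` (`ID`, `Name`) VALUES ( '1', 'Fred' )</Insert>
  </Inserts>
  <Views></Views>
  <Procedures></Procedures>
  <Functions></Functions>
</DataBase>
```

Each section of the restore can then be targeted simply by selecting the matching tag.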
3. How do we actually go about creating the SQL and saving it in an XML file?
One of the main targets was to ensure that we could re-use this code in all our database projects, and so the only way to go would be to create a standalone Class Library that could be imported into any other project as and when needed. The question then became whether to have a single class that handled both backups and restores, or to have two classes, one for backup and one for restore. I opted for two separate classes, mainly as I could then have the same constructor for each and yet still know whether it was a restore or a backup. I included a module which handled the XML creation and validation. This module also includes a function to validate a connection to a server, and a function that returns a list of databases that the particular user has permission to view on the selected server.
How it works
The BackUp class has three private variables: a
ConnectionStringBuilder to hold our server and user information, and also to construct our connection strings in the various backup routines; an
XDocument (which is just an in-memory representation of an XML file), to construct the backup file (I have arbitrarily given these files the extension .msb [MySQL Backup - very original I thought]); and a string which will hold the full path name of the backup file that will be saved. The class has a constructor and an exposed method -
BackUpDataBase (there goes my originality again). The constructor takes four strings as arguments (the connection details plus the FileName), which are all pretty self-explanatory. It uses these arguments to instantiate the
ConnectionStringBuilder and assign the path name. The constructor also calls a method in the Validation Module,
BuildNewBackUpFile, which creates a brand new .msb file. This method builds up an
XDocument using XML literals and then returns that doc.
Friend Function BuildNewBackUpFile() As XDocument
    Dim NewXDoc As New XDocument
    NewXDoc = <?xml version="1.0" encoding="UTF-8"?>
              <!--MySql Backup File-->
    Return NewXDoc 'Easy as that
End Function 'BuildNewBackUpFile
As you can see from the routine, it is really a simple layout. The various sections would then be saved under the appropriate tags like this:
<Table Name="MyTableName">CREATE TABLE `Table1` (`ID` int(10), `Name` varchar(45))</Table>
<Table Name="MyTableName2">CREATE TABLE `Table2` (`ID` int(10), `Name` varchar(45))</Table>
<Constraint Name="MyConstraint">ALTER TABLE `MyTable`
    ADD CONSTRAINT `MyConstraint` FOREIGN KEY (... etc.</Constraint>
The exposed method
BackUpDataBase takes an optional
BackGroundWorker as an argument, which makes it easy to report the progress of the backup as it happens. Basically, all this routine does is to call each separate section in order, one after the other, the different steps having been broken up into their own routines. I won't bore you with every detail on every routine, all of them being pretty similar in nature, and so I will go through the two most important routines:
the table backup and the data backup, CreateTable being an offshoot of the former.
This routine basically uses the Information_Schema database to query for the names of all tables that belong to the selected database, and then uses a
DataReader to iterate through the tables, calling the
CreateTable routine to build up the SQL to create each table. Once the
CREATE TABLE statement is returned, the routine then creates a new
XElement to hold the new XML, assigns the table name to the
Name attribute of the
Table tag, and adds the
XElement under the parent <Tables> node:
Dim newElement As New XElement("Table", strCreateTable)
newElement.@Name = strTableName
Dim parent As XElement = msbBackUp...<Tables>.FirstOrDefault
parent.Add(newElement)
It then calls the
BackUpData routine on each table to create the
Insert statements for each row of data in the table. Using the same query gives us the engine type and the
Auto_Increment value to tag on to the end of the
CREATE TABLE statement. As this is the most time consuming phase, the routine has the option of taking a
BackGroundWorker as an argument, allowing the worker to report progress as it goes along. This is calculated by a
Count query on the Information_Schema.Tables table, run only if the Worker is present.
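The Information_Schema queries involved are plain SQL. The column names are standard, though the exact query text below is my assumption of what the routine runs:

```sql
-- names (plus engine and next auto-increment) of all tables in the database
SELECT TABLE_NAME, ENGINE, AUTO_INCREMENT
FROM Information_Schema.Tables
WHERE TABLE_SCHEMA = 'MyDataBase';

-- progress total, only needed when a BackGroundWorker is present
SELECT COUNT(*)
FROM Information_Schema.Tables
WHERE TABLE_SCHEMA = 'MyDataBase';
```

Restricting on TABLE_SCHEMA is what keeps the backup limited to the selected database.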
This routine takes the table name, engine type, and
Auto_Increment as arguments, and builds up and returns the full
CREATE TABLE statement for each table. Initially, it starts off creating the
DROP TABLE statement, and then uses a
DESCRIBE TABLE query to build up the
CREATE TABLE statement, iterating through each field returned.
Dim strReturn As String = "DROP TABLE IF EXISTS `" & _
    strConnection.Database & "`.`" & Table & "`;"
Dim strPrimaryBuilder As String = String.Empty 'accumulates the primary key columns
Dim strPrimaryKey As String = String.Empty
Using conDetails As New MySqlConnection(strConnection.ToString)
    conDetails.Open()
    strReturn &= "CREATE TABLE `" & strConnection.Database & "`.`" & Table & "` ("
    Dim cmdRows As New MySqlCommand("DESCRIBE `" & Table & "`", conDetails)
    Dim dbrRows As MySqlDataReader
    dbrRows = cmdRows.ExecuteReader()
DESCRIBE returns six values for each column in the table: Field, Type, Null, Key, Default, and Extra. On receiving this information, it is merely a matter of building each column in this order:
Name DataType (max value) NOT NULL / NULL AUTO_INCREMENT. We separately build up a Primary Key as we go along, and if not empty, tag it on to the
Create Table. Lastly, we include information as to the Engine Type and the
Next Auto_Increment number to the end of the statement to ensure a restoration creates the right type of table (MySQL defaults to InnoDB, if not stated).
While dbrRows.Read()
    strReturn &= "`" & dbrRows.GetString("Field") & "` " & dbrRows.GetString("Type")
    If Not dbrRows.GetString("Null") = "YES" Then
        strReturn &= " NOT NULL"
    End If
    If Not IsDBNull(dbrRows.Item("Default")) Then
        strReturn &= " DEFAULT '" & dbrRows.GetString("Default") & "'"
    End If
    If Not dbrRows.GetString("Extra") = Nothing Then
        strReturn &= " " & dbrRows.GetString("Extra").ToUpper()
    End If
    If Not dbrRows.GetString("Key") = Nothing Then
        If dbrRows.GetString("Key") = "PRI" Then
            If strPrimaryBuilder = String.Empty Then
                strPrimaryBuilder = dbrRows.GetString("Field")
            Else
                strPrimaryBuilder &= "," & dbrRows.GetString("Field")
            End If
        End If
    End If
    strReturn &= ","
End While
If strPrimaryBuilder = String.Empty Then
    strReturn = strReturn.Remove(strReturn.Length - 1, 1) 'trim the trailing comma
Else
    strPrimaryKey = "PRIMARY KEY (" & strPrimaryBuilder & ")"
End If
strReturn &= strPrimaryKey & ") "
strReturn &= "ENGINE=" & TableEngine
If AutoIncrement >= 0 Then
    strReturn &= " AUTO_INCREMENT=" & AutoIncrement
End If
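Put together, the statement returned for a simple table ends up looking like this (a single line in practice; wrapped here for readability, with illustrative names and values):

```sql
DROP TABLE IF EXISTS `MyDataBase`.`Table1`;
CREATE TABLE `MyDataBase`.`Table1` (
  `ID` int(10) NOT NULL AUTO_INCREMENT,
  `Name` varchar(45) DEFAULT 'none',
  PRIMARY KEY (ID)) ENGINE=InnoDB AUTO_INCREMENT=4
```

Running exactly this statement on restore recreates the table with the same key, engine, and next auto-increment number.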
This section is probably the backbone of everything, and yet all we do is call a
SELECT * query from the table, pass this to a reader, and build an
Insert SQL statement with the results.
Dim strInsert As String = String.Empty
Using conData As New MySqlConnection(strConnection.ToString)
    conData.Open()
    Dim cmdRows As New MySqlCommand("SELECT * FROM `" & _
        strConnection.Database & "`.`" & Table & "`", conData)
    Dim dbrRows As MySqlDataReader = cmdRows.ExecuteReader
    If dbrRows.HasRows Then
I found it imperative to retrieve the data type of each value and then handle many of the data types on their own, as such things as apostrophes in strings, MySQL/.NET date conversions etc., could really stall things quite abruptly. One thing that surprised me was the need to convert booleans to ints, as when saving a boolean value, it would save it as literally True or False, and then when trying to restore, it would complain about a failure to convert to boolean.

One important aspect which I needed to figure out was how to save Blob columns. In the end, I did a hack by retrieving the bytes into a byte array, converting the bytes to Hex, and storing the Hex version. This works for pictures, Word, Excel, and text files, restoring all perfectly (Excel gives a warning about a wrong file format, but if you ignore that and just tell it to go ahead, it opens fine).

As I don't know the size of the blob field in advance, I had to initiate the byte array with an arbitrary value (which I limited to 1 MB - arbitrarily). This would have to be increased if the size of the returned document or picture is likely to be bigger than this. It would also depend heavily on the maximum size of the data able to be returned by your server as a resultset, which MySQL also sets to 1 MB by default. Doing some testing on a test database which I created specifically, I could get it up to 15 MB before I started having OutOfMemory exceptions (on a basic 32 bit system with 2 GB memory). This limit will obviously get lower and lower the more records you have in the table; remembering that this is all going into an in-memory XML file, you can easily see how your memory is quickly going to get used up.
strInsert = "INSERT INTO `" & conData.Database & "`.`" & Table & "` ("
Dim intFieldCount As Integer = dbrRows.FieldCount - 1
Dim columns As New List(Of String)
Dim values As New List(Of String)
For intColumns As Integer = 0 To intFieldCount
    If Not IsDBNull(dbrRows(intColumns)) Then
        columns.Add("`" & dbrRows.GetName(intColumns) & "`")
        Dim strType As String = dbrRows.Item(intColumns).GetType.ToString
        Select Case strType
            Case "System.DateTime"
                Dim dteValue As DateTime = dbrRows.GetMySqlDateTime(intColumns)
                Dim strValue As String = "'" & dteValue.Year & "-" & _
                                         dteValue.Month & "-" & _
                                         dteValue.Day & "'"
                values.Add(strValue)
            Case "System.Boolean"
                Dim intBoolean As Integer
                If dbrRows.Item(intColumns) = True Then
                    intBoolean = 1
                Else
                    intBoolean = 0
                End If
                values.Add("'" & intBoolean & "'")
            Case "System.String"
                Dim strValue As String = dbrRows.GetString(intColumns)
                strValue = strValue.Replace("'", "''")
                strValue = strValue.Replace(";", "")
                values.Add("'" & strValue & "'")
            Case "System.Byte[]"
                Dim bytBlob(1048576) As Byte
                Dim lngFileLength As Long = _
                    dbrRows.GetBytes(intColumns, 0, bytBlob, 0, 1048576)
                ReDim Preserve bytBlob(lngFileLength)
                values.Add("0x" & ByteArrayToHex(bytBlob))
            Case Else
                If IsNumeric(dbrRows.Item(intColumns)) Then
                    values.Add("'" & dbrRows.Item(intColumns).ToString & "'")
                End If
        End Select
    End If
Next
strInsert &= Join(columns.ToArray, ", ") & ") " & "VALUES ( "
strInsert &= Join(values.ToArray, ", ") & " )"
Once the INSERT statement is complete, the routine then creates an XML element under the tag <Inserts>, with the tag <Insert>, as explained above.
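The ByteArrayToHex helper called in the blob branch above isn't shown in the article, so the following is an assumed minimal implementation of what such a function would do:

```vbnet
' Converts a byte array to its hexadecimal string form, so a blob can be
' embedded in the INSERT statement as a 0x... literal.
' NOTE: an assumed implementation - the original helper's body is not shown.
Private Function ByteArrayToHex(ByVal bytes() As Byte) As String
    Dim sb As New System.Text.StringBuilder(bytes.Length * 2)
    For Each b As Byte In bytes
        sb.Append(b.ToString("X2")) 'two hex digits per byte
    Next
    Return sb.ToString()
End Function
```

MySQL accepts the resulting `0x...` literal directly in an INSERT, which is what makes this round-trip work.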
Backing up Constraints, Views, Procedures, and Functions
Each of the above sections has its own routine, e.g.,
BackUpViews etc., and each routine is very similar in nature. Basically, they all query the Information_Schema database to retrieve the relevant entity names related to the current database, then iterate through each found value and call
SHOW CREATE FUNCTION or
SHOW CREATE PROCEDURE etc., as the case may be. The results of that call are then passed to a reader, which retrieves the value from the right column and builds up an XML element, which it adds to the current backup document.
Finalising the backup
Once the initial
BackUpDataBase routine has completed all sections, it is just a matter of calling the
Save method, and the backup is done.
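Putting the whole backup together, the BackUpDataBase method then amounts to little more than calling each section in turn. The individual section routine names and the strFileName variable below are my assumptions, not necessarily the article's exact identifiers:

```vbnet
' Sketch of the backup orchestration. Each section routine writes its SQL
' into the in-memory XDocument (msbBackUp) under its own XML tag.
Public Sub BackUpDataBase(Optional ByVal worker As BackgroundWorker = Nothing)
    BackUpTables(worker)   'DROP/CREATE TABLE statements plus the row data
    BackUpConstraints()    'ALTER TABLE ... ADD CONSTRAINT statements
    BackUpViews()
    BackUpProcedures()
    BackUpFunctions()
    msbBackUp.Save(strFileName) 'flush the finished document to the .msb file
End Sub
```

Because every section works on the same XDocument, the final Save is the only disk operation in the entire backup.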
How it works
The Restore class has only two private variables, namely a ConnectionStringBuilder and an
XDocument. The constructor for the class takes five arguments: user name, password, server, database, and full path of the file to be restored. As in the case of the
BackUp class, the constructor uses these arguments to instantiate the
ConnectionStringBuilder and sets the
XDocument variable to the correct file using the
XDocument.Load(FileName) method. The
Restore class has two other public methods, one to validate that the selected backup file contains information relevant to the selected database, and another method which does all the work:
RestoreDataBase (again, extra originality points).
I included this routine to ensure that the user has selected the right backup file for the correct database. All it does is check whether the
Name attribute attached to the main database tag matches the name of the selected database. Using XML literals, this becomes very easy:
Public Function ValidateDataBases() As Boolean
    Dim strDataBaseName As String = msbRestoreFile...<DataBase>.@Name
    If strDataBaseName = strConnection.Database Then
        Return True
    Else
        Return False
    End If
End Function
This function needs to be called before the main call to
RestoreDatabase in the application that is being used to do the actual restoring, as will be explained below when I go through the demo app.
This routine is the crux of the whole class. It takes an optional
BackGroundWorker as an argument to facilitate progress reporting. Basically, this routine just calls other routines which actually restore the different relevant sections. It starts off restoring the tables first, then it recreates all the foreign key and unique key constraints before restoring the actual data. Doing it this way, I figured that the data will be validated against all constraints whilst being restored. Going on from there, it then recreates all functions, procedures, etc., if any are present. Because all the different sections are pretty much the same, I will only go through the RestoreTables and RestoreData routines in detail.
With all these routines, we just use LINQ and XML literals to extract each value from the relevant section of the backup file and then execute the value (which is a fully formed SQL statement) against the database. In the
RestoreTables routine, you can see that I first switch off all constraint checks so we don't have a problem dropping tables that have Foreign Key constraints. The beauty about using XML literals and LINQ to retrieve the values is that if there are no values, nothing happens. You don't get any errors or warnings or anything, it just moves on to the next section (although I don't quite see why anyone would back up a database with no tables).
Private Sub RestoreTables()
    Using conTables As New MySqlConnection(strConnection.ToString)
        conTables.Open()
        'switch off unique and foreign key checks while dropping/creating tables
        Dim cmdSetUp As New MySqlCommand( _
            "SET @OLD_UNIQUE_CHECKS=@@UNIQUE_CHECKS, UNIQUE_CHECKS=0;" & _
            "SET @OLD_FOREIGN_KEY_CHECKS=@@FOREIGN_KEY_CHECKS, FOREIGN_KEY_CHECKS=0;", conTables)
        cmdSetUp.ExecuteNonQuery()
        For Each table In From element In msbRestoreFile...<Table>
            Dim cmdCreateTable As New MySqlCommand(table.Value, conTables)
            cmdCreateTable.ExecuteNonQuery()
        Next
    End Using
End Sub
This is the basic procedure followed by all the Restore routines (
RestoreViews etc.), except when it comes to the
RestoreData routine where I use a
MySqlTransaction to execute the
Inserts, as that way, there is always the option of rolling back if there is a problem, as well as the fact that it is much, much faster. Using a transaction in all other cases was counter productive, as when executing statements such as
CREATE FUNCTION etc., MySQL does an implicit commit, and so a transaction would be redundant.
I let this routine accept an optional
BackGroundWorker as an argument so that we can report progress as we go along. Other than some code to calculate how far we are in the process, everything else is pretty much similar to what has gone on before, except we use a
Try ... End Try block to catch any exceptions and, if one occurs, roll back the transaction.
Private Sub RestoreData(Optional ByVal worker As BackgroundWorker = Nothing)
    Using conData As New MySqlConnection(strConnection.ToString)
        conData.Open()
        Dim transData As MySqlTransaction
        transData = conData.BeginTransaction
        Dim intInsertCount As Integer = msbRestoreFile...<Insert>.Count
        Dim intInsertsMade As Integer = 0
        Try
            For Each insert In From element In msbRestoreFile...<Insert>
                If Not worker Is Nothing Then
                    worker.ReportProgress(20 + ((70 / intInsertCount) * intInsertsMade))
                End If
                intInsertsMade += 1
                Dim cmdData As New MySqlCommand(insert.Value, conData)
                cmdData.ExecuteNonQuery()
            Next
            transData.Commit()
        Catch exMySql As MySqlException
            If Not transData Is Nothing Then
                transData.Rollback()
            End If
        End Try
    End Using
End Sub
If any errors are encountered, then I roll back all the data inserts, but let the rest of the procedures carry on. That way, the database structure will be in place at the end, rather than a half-restored load of tables. In all my applications, I use a pretty decent logging system which logs all SQL executed against the database in a separate XML file, and this can be used to reconstruct all transactions since the start of the use of the program, if necessary (very hard going, but possible). I also use this log to recreate transactions since the last backup was done (normally, the last 24 hours' worth), if there is any need to restore.
Finalising the restore
Once tables have been reconstructed, constraints recreated, and data restored to perfection, we just go on to restore the rest of the Schema, the Functions, Procedures, Views etc., the same way. Then it's done, and I think the client will now be much happier with me.
Using the classes - The demo application
The demo application which I included is a very basic app that does two things - backs up and restores databases using these two classes. It consists of two forms: a form to log in to the app, and the app's main form. Logging in requires the user to enter the server, user name, and password, which are then checked using the
ValidateConnection function. If the user is validated, the main form opens up. Users have the option of ticking a checkbox which tells the app to remember them the next time it is started; this stores the current server name and the user's user name in the Application Settings file (users will still have to provide their password on restarting). The main form consists of a combobox which is loaded with the databases on which the user has sufficient permissions to create a backup. It has a tab control with two tabs: one to back up and the other to restore the database selected in the combobox. To back up, the user simply selects the database in the combobox, enters a filename in the textbox provided, and presses the button marked Back Up. To restore a database, the user selects the database in the combobox, selects the relevant backup file using the Browse button, and, if the file is validated, presses Restore. In both cases, a progress bar comes into view to let the user know the progress of events. Once the operation is complete, a message box opens to let the user know. If an error occurs, the user is informed of the potential cause and given advice on what to do next.
Main form loading
In the main form's Load event, we just retrieve the names of the databases on the server using our
GetDataBases method, which returns a
List(of String), and set the
ComboBox.DataSource to the list of names returned. This ensures that a database selected by the user actually exists and that the user is able to extract the back up.
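The GetDataBases function in the Validation Module isn't listed in full in this article; the following is a minimal sketch of what it might look like, assuming it simply runs SHOW DATABASES (which only returns schemas the connected user has some privilege on), with the parameter name being my own:

```vbnet
' Returns the databases visible to the current user. SHOW DATABASES only
' lists schemas the user holds privileges on, which is exactly what we need.
' NOTE: a sketch of the routine, not the article's exact code.
Friend Function GetDataBases(ByVal connectionString As String) As List(Of String)
    Dim dataBases As New List(Of String)
    Using conServer As New MySqlConnection(connectionString)
        conServer.Open()
        Dim cmdShow As New MySqlCommand("SHOW DATABASES", conServer)
        Using dbrNames As MySqlDataReader = cmdShow.ExecuteReader()
            While dbrNames.Read()
                dataBases.Add(dbrNames.GetString(0))
            End While
        End Using
    End Using
    Return dataBases
End Function
```

Binding the returned list straight to the ComboBox keeps the UI in step with the server's actual permissions.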
It is vital at this point to explain the error handling strategy I've used in the backup classes. Throughout the classes, I have relied on Using blocks to connect to databases, write to files, and so forth, and I don't catch exceptions unless I need a very specific error (and even then, I just rethrow it). This ensures that any error encountered is thrown back to the calling function, while still making sure my database connections, etc., get closed and disposed of in the proper manner. It also means that all potential errors need to be handled at the UI level: we wrap any call to the classes' methods in Try ... End Try blocks, then handle the error by informing the user about what is going on and suggesting what they can do to try again. This entails studying the method to be called, working out which errors are likely to occur, and handling each potential error. Catching the general Exception is just not good practice in my view, unless you have handled all other likely exceptions first.
When the user presses the Back Up button, we first check to make sure that they have entered a valid name for the file. If so, we then disable the Backup and Exit buttons and show the progress bar (setting it to 0). We then open up a
FolderBrowserDialog so the user can select the folder to save the backup file into, then using the result together with the filename entered by the user to get the full path string (I use a .msb extension for my backup files). We then instantiate a new BackUp class using the current user's user name, password, the selected database, and the full path as arguments to the constructor.
Private Sub btnBackUp_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) _
        Handles btnBackUp.Click
    If txtFileName.Text = String.Empty Then
        MsgBox("Please enter a valid File Name")
        Exit Sub
    End If
    btnBackUp.Enabled = False
    btnExit.Enabled = False
    pbBackUp.Visible = True
    pbBackUp.Value = 0
    Dim fbdBackup As New FolderBrowserDialog
    If fbdBackup.ShowDialog() = DialogResult.OK Then
        BackUpFileName = fbdBackup.SelectedPath & "/" & txtFileName.Text & ".msb"
        Try
            CurrentBackUp = New BackUp(CurrentServer, _
                                       ...)
        Catch exMysql As MySqlException
            MsgBox("There was an error connecting with the server. etc.")
        Catch exXML As Xml.XmlException
            MsgBox("Could not save BackUp File to Disk. Please try again. etc.")
        End Try
    End If
End Sub
I created two
BackGroundWorkers at design time, one to handle each operation. This is optional, but recommended, and so after creating my
BackUp class, I call my
RunWorkerAsync method. The
DoWork event is where we make the call to our
BackUpDataBase routine, passing on the current worker as an argument, so we can receive progress reports. In the
ProgressChanged event, we merely set the backup progress bar's value to the worker's progress percent. When either the worker has completed the backup, or there was an error, it will raise its
RunWorkerCompleted event, which is where we either need to analyse the error and present the user with useful info, or we let the user know everything was a success and reset the UI.
On the Restore tab, the Restore button is disabled until the user has selected a valid backup file.
Selecting a backup file
Upon clicking the Browse button, the user will be presented with an
OpenFileDialog to select the backup file. Once the user has selected a file, it is immediately checked with our
This method creates a new
XmlDocument class and loads the selected document using the class'
Load method. We then call a method
GetSchema, which creates an XML schema on the fly, so to speak. As we saw earlier, the actual schema for the XML file is very simple and so is creating its schema:
Private Function GetSchema() As XmlSchema
    Dim BackUpSchema As New Xml.Schema.XmlSchema
    Dim DatabaseElement As New XmlSchemaElement
    DatabaseElement.Name = "DataBase"
    Dim ctDataBase As New XmlSchemaComplexType()
    DatabaseElement.SchemaType = ctDataBase
    Dim sqDataBase As New XmlSchemaSequence
    ctDataBase.Particle = sqDataBase
    Dim eDataBaseName As New XmlSchemaAttribute()
    eDataBaseName.Name = "Name"
    eDataBaseName.Use = XmlSchemaUse.Required
    eDataBaseName.SchemaTypeName = _
        New XmlQualifiedName("string", "http://www.w3.org/2001/XMLSchema")
    ctDataBase.Attributes.Add(eDataBaseName)
    Dim xseTables As New XmlSchemaElement()
    xseTables.Name = "Tables"
    sqDataBase.Items.Add(xseTables)
    Dim eConstraints As New XmlSchemaElement()
    eConstraints.Name = "Constraints"
    sqDataBase.Items.Add(eConstraints)
    Dim eInserts As New XmlSchemaElement()
    eInserts.Name = "Inserts"
    sqDataBase.Items.Add(eInserts)
    Dim eViews As New XmlSchemaElement()
    eViews.Name = "Views"
    sqDataBase.Items.Add(eViews)
    Dim eProcedures As New XmlSchemaElement()
    eProcedures.Name = "Procedures"
    sqDataBase.Items.Add(eProcedures)
    Dim eFunctions As New XmlSchemaElement()
    eFunctions.Name = "Functions"
    sqDataBase.Items.Add(eFunctions)
    Dim eEvents As New XmlSchemaElement()
    eEvents.Name = "Events"
    sqDataBase.Items.Add(eEvents)
    Dim eTriggers As New XmlSchemaElement()
    eTriggers.Name = "Triggers"
    sqDataBase.Items.Add(eTriggers)
    BackUpSchema.Items.Add(DatabaseElement)
    Return BackUpSchema
End Function
As can be seen, this is pretty self explanatory. We just create the root DataBase element with a complex type, create a new XML sequence, and set that sequence as the complex type's particle. We then add an attribute to the database element to hold the name of the database that was backed up. Then it is just a matter of creating simple elements for the other nodes and adding them to the sequence.
We then add this schema to the
XmlDocument's schema collection and then use it to validate the selected file. To handle any schema errors encountered, we need to declare a validation event handler with its delegate. This handler just throws the error, expecting it to be caught in the main UI.
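The wiring for this validation is only a few lines. Something like the following, using the XmlDocument.Schemas collection and its Validate method (the variable and handler names here are hypothetical):

```vbnet
' Validate the selected backup file against the generated schema.
Dim xdcBackUp As New XmlDocument()
xdcBackUp.Load(strSelectedFile)
xdcBackUp.Schemas.Add(GetSchema())
xdcBackUp.Validate(AddressOf OnValidationError)

' Any schema violation is rethrown so the UI's Try block can report it.
Private Sub OnValidationError(ByVal sender As Object, _
                              ByVal e As Xml.Schema.ValidationEventArgs)
    Throw e.Exception
End Sub
```

Because the handler rethrows, an invalid file surfaces as an ordinary exception at the point where the file was selected.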
Once the selected backup file has been validated, we set a label to display the selected file and enable the Restore button. If any errors are encountered, then we trap them with a
Try block, catching potential errors by type and handling them appropriately.
When the backup file has been validated, the Restore button is enabled. When the user clicks it, we first check (just in case) that the file is still valid (remembering that the label holding the filename would only ever contain a valid file name), and we ask the user to confirm, in case they clicked by mistake. If all is OK, we instantiate a new
Restore class using the current user's details, the selected database, and the selected file. The file's database is then checked to make sure it corresponds with the selected database; we disable the Restore and Exit buttons, and show the restore progress bar. We then call on our restore
BackGroundworker to do its thing, calling our new
RestoreDatabase method in its
DoWork routine. We monitor progress using our restore worker's
ProgressChanged routine, and when it is complete or there is an error, we will pick it up in the
RunWorkerCompleted routine. And there we have it. Restore Complete == Smiling Client.
Points to ponder
I have pretty much thrown this class at every database that I could lay my hands on, and so far, it seems to handle everything I managed to throw at it. Well, let me put it this way: the errors generated during testing were ironed out during testing, hence the data type analysis on each column of the tables and so forth. This in no way means it is the bulletproof class that I wanted, as I have yet to test it on very large databases; I suspect it would not stand up to them. With that in mind, the largest database I tried ended up with a 456 MB XML file, which took two minutes to create and five minutes to restore. Trying to open that file in Notepad nearly froze my computer. My clients' databases never come anywhere near this size, and so considerations beyond this are unnecessary at this stage. If I were to need a facility for recreating larger databases, I would think about separating the different sections into different physical files, especially the
INSERTs, where I would probably restrict each insert file to no more than 50,000 records. I would then use the main XML file as a guide to which files should go where, maybe just using it to list the generated files in order.
- Initial post - 07 June, 2010.