Posted 22 Feb 2006

Dynamic Creation of DTS-packages in VB

Updated 29 Sep 2009, CPOL
Example dynamic configuration of DTS-packages for transferring DBF files to SQL Server with VB


Most developers working in the data access area face the task of moving data between the databases their applications use. This program solves one specific task: copying data from table(s) in DBF format to a Microsoft SQL Server database. Its special feature is that the DTS package is created dynamically for the given DBF file(s). It is realized as a DCOM client/server application (class dbf_dynaimport).

The solution works as follows. The server part, realized as a DCOM component, receives three arguments from the client part: the path to the DBF file(s) to transfer, the name of the target SQL Server, and the name of the target database. The path argument may point either to a folder or to a single DBF table; if it points to a folder, the package is configured for all DBF files in that folder. The server component uses the Microsoft Visual FoxPro driver through ADO to obtain the structure of the DBF file(s), and uses this information to configure and execute a DTS package by means of the Microsoft DTS Package Object Library; therefore the server part of the application requires SQL Server. This does not mean, however, that the server component must be installed on every SQL Server: one component can transfer data to other SQL Servers reachable over the local network, provided security is configured accordingly (installation and configuration of the server part are described below). Let's consider the creation of the class in more detail.

I. Server Application

I.I Obtaining Structure of DBF File with ADO

For this purpose the class uses the internal function get_struct, which accepts as its argument the path to the folder (or file) where the DBF table(s) reside. The function returns a dynamic two-dimensional array: the first column holds the table name, the second holds the part of a CREATE TABLE statement that describes the columns of that table, with their names, types, and sizes.
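For instance (a hypothetical table, not taken from the article), the returned array might look like this:

```vb
' Sketch of what get_struct might return for a folder containing a single
' hypothetical table, customers.dbf:
'
'   f(1, 1) = "customers"
'   f(1, 2) = "[dbo].[customers] ( [id] [int] , [name] [varchar] (30) )"
'
' The second column is later appended to "CREATE TABLE [db]." to produce
' the full CREATE TABLE statement.
```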

Private Function get_struct(path As String) As String()
On Error GoTo Error_Handler
  Dim db As New ADODB.Connection, rs As New ADODB.Recordset
  Dim cSql As String, table_name As String, tbl As String
  Dim d As New Collection, c As Variant, f() As String, i As Integer
    If Right(path, 1) <> "\" Then
      tbl = Replace(Right(path, Len(path) - InStrRev(path, "\")), ".dbf", "")
      path = Left(path, InStrRev(path, "\"))
    End If

Here we determine whether the argument path points to a single DBF file or to a folder.

db.CursorLocation = adUseClient
db.Open "DRIVER={Microsoft FoxPro VFP Driver (*.dbf)};" & _
    "SourceDB=" & path & ";SourceType=DBF;deleted=no"
Set rs = db.OpenSchema(adSchemaColumns, Array(Empty, Empty, tbl, table_name))
If rs.RecordCount = 0 Then Err.Raise vbObjectError + 513, "dbf_dynaimport", _
    "Open " & path & tbl & " failed!"

We use the ADO OpenSchema method to obtain database schema information; it returns a Recordset containing information about columns. The second argument of this method specifies the constraint criteria: the TABLE_NAME element limits the result of the query when path points to a single DBF file. If path points to a folder, the variable tbl is empty, so the returned recordset is not limited and contains information about all columns of all DBF files in that folder. If OpenSchema returns no records at all, there is a problem retrieving information from the file or folder; for example, there are no DBF files in the given directory. Therefore, whenever RecordCount equals zero, we raise an error to the client application.
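As a side note (a sketch mirroring the call above), the restriction array for adSchemaColumns follows the OLE DB order TABLE_CATALOG, TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME; an element passed as Empty leaves that restriction unconstrained:

```vb
' Restriction order: TABLE_CATALOG, TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME.
Dim rsAll As ADODB.Recordset
' All columns of all tables in the folder (no restrictions):
Set rsAll = db.OpenSchema(adSchemaColumns, Array(Empty, Empty, Empty, Empty))
' Only columns of one table (tbl holds the table name):
Set rs = db.OpenSchema(adSchemaColumns, Array(Empty, Empty, tbl, Empty))
```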

On Error Resume Next
While Not rs.EOF
    d.Add CStr(rs("TABLE_NAME")), CStr(rs("TABLE_NAME"))
    rs.MoveNext
Wend
On Error GoTo Error_Handler

In the returned recordset, the TABLE_NAME field contains the table name. Unfortunately, I have not found a way to filter the result so that it contains only unique values of that field, so I use the collection d for this purpose: it gets filled with unique values only, because adding a duplicate key raises an error that On Error Resume Next skips.
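The deduplication trick can be seen in isolation in this minimal sketch (not from the article): Collection.Add raises run-time error 457 for a duplicate key, which On Error Resume Next swallows, so only unique keys remain.

```vb
Private Function unique_names() As Collection
    Dim d As New Collection
    On Error Resume Next           ' duplicate keys raise error 457; skip them
    d.Add "orders", "orders"
    d.Add "orders", "orders"       ' ignored: key already present
    d.Add "customers", "customers"
    On Error GoTo 0
    Debug.Print d.Count            ' prints 2
    Set unique_names = d
End Function
```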

    ReDim Preserve f(d.Count, 2)
    For i = 1 To d.Count
        f(i, 1) = d(i)
        rs.Filter = "TABLE_NAME = '" & d(i) & "'"
        cSql = "[dbo].[" & d(i) & "] ( "
        While Not rs.EOF
            cSql = cSql & "[" & CStr(rs("COLUMN_NAME")) & "] "
            Select Case CInt(rs("DATA_TYPE"))
                Case 3
                  cSql = cSql & "[int] "
                Case 5
                  cSql = cSql & "[float] "
                Case 11
                  cSql = cSql & "[bit] "
                Case 128
                  cSql = cSql & "[image] "
                Case 129
                  If CLng(rs("CHARACTER_MAXIMUM_LENGTH")) = 2147483647 Then _
                  cSql = cSql & "[text] " Else cSql = cSql & "[varchar] (" _
                  & CStr(rs("CHARACTER_MAXIMUM_LENGTH")) & ") "
                Case 131
                  cSql = cSql & "[numeric] " & IIf(CInt(rs("NUMERIC_SCALE")) _
                  = 0, "(" & CInt(rs("NUMERIC_PRECISION")) & ", ", "(" & _
                  CInt(rs("NUMERIC_PRECISION")) - 1 & ", ") & _
                  CInt(rs("NUMERIC_SCALE")) & ") "
                Case 133, 135
                  cSql = cSql & "[datetime] "
            End Select
            cSql = cSql & ", "
            rs.MoveNext
        Wend
        cSql = Left(cSql, Len(cSql) - 2) & ")"
        f(i, 2) = cSql
'        Debug.Print f(i, 1) & "     " & f(i, 2)
    Next i
    get_struct = f
    Set rs = Nothing
    Set db = Nothing
Exit Function
Error_Handler:
Err.Raise Err.Number, "dbf_dynaimport", Err.Description
End Function

We loop over the collection d and use the Filter property with a criterion on TABLE_NAME, which limits the recordset to records of the same table. We then move through the filtered records and look at the DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, NUMERIC_SCALE, and NUMERIC_PRECISION columns. Based on this information, we map the OLE DB type indicators to Microsoft SQL Server data types.
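The mapping can be summarized in a small helper (a sketch, not part of the article's class); the numeric codes are ADO DataTypeEnum values:

```vb
' Sketch of the type mapping used above, with the ADO DataTypeEnum names:
Private Function map_type(ByVal dt As Integer) As String
    Select Case dt
        Case 3:        map_type = "[int]"       ' adInteger
        Case 5:        map_type = "[float]"     ' adDouble
        Case 11:       map_type = "[bit]"       ' adBoolean
        Case 128:      map_type = "[image]"     ' adBinary
        Case 129:      map_type = "[varchar]"   ' adChar; becomes [text] when
                                                ' the max length is 2147483647
        Case 131:      map_type = "[numeric]"   ' adNumeric; precision/scale appended
        Case 133, 135: map_type = "[datetime]"  ' adDBDate / adDBTimeStamp
    End Select
End Function
```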

I.II Creating DTS-Package with Microsoft DTS Package Object Library

For this purpose I used the public method trans, which receives its arguments directly from the client application. According to these arguments, the DTS package is configured and executed. On successful execution we get the table(s) on SQL Server; if execution fails, the server component raises an error to the client application.

Public Sub trans(pvd As ProviderType, path As String, srv As String, _
db As String)
On Error GoTo Error_Handler

    Dim f() As String
    f = get_struct(path)

The first action is to call get_struct to obtain the structure of the table(s).

Dim oPackage As New DTS.Package
Dim oConnection As DTS.Connection
Dim oTask As DTS.Task
Dim oStep As DTS.Step
Dim oTransform As DTS.Transformation
Dim oPumpTask As DTS.DataPumpTask
Dim oEsqlTask As DTS.ExecuteSQLTask
Dim oPrecConstraint As DTS.PrecedenceConstraint
Dim i As Integer, j As Integer, n As Integer, tbl As String

With oPackage
    .Name = "DTS Package"
    .FailOnError = True
    .WriteCompletionStatusToNTEventLog = True
End With

We instantiate the variable oPackage as DTS.Package. The DTS.Package object is the topmost object in the hierarchy of the Microsoft DTSPackage Object Library.

Select Case pvd
       Case jet
         Set oConnection = oPackage.Connections.New("Microsoft.Jet.OLEDB.4.0")
         oConnection.ConnectionProperties("Extended Properties") = "dBase 5.0"
       Case vfpole
           Set oConnection = oPackage.Connections.New("VFPOLEDB")
End Select
    oConnection.ID = 1
    oConnection.DataSource = path
    oPackage.Connections.Add oConnection
    Set oConnection = Nothing

We set up the source connection. The server component can use either the Jet OLE DB provider or the Visual FoxPro OLE DB provider; the choice is made by the pvd argument.
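The ProviderType enum used for the pvd argument is not shown in the article; it is presumably a public enum in the server class, declared along these lines:

```vb
' Assumed declaration (not listed in the article):
Public Enum ProviderType
    jet = 0     ' Microsoft.Jet.OLEDB.4.0 with the dBase ISAM
    vfpole = 1  ' Visual FoxPro OLE DB provider (VFPOLEDB)
End Enum
```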

Set oConnection = oPackage.Connections.New("SQLOLEDB")
With oConnection
    .ID = 2
    .DataSource = srv
    .Catalog = db
    .UseTrustedConnection = True
End With
oPackage.Connections.Add oConnection
Set oConnection = Nothing

Setting up destination connection:

For n = 1 To UBound(f)   ' row 0 of f is unused; data starts at row 1
    tbl = f(n, 1)
i = i + 1

We loop over the array f. If path points to a single file, the loop body executes once; if it points to a folder, it executes once per DBF file. We use the loop to set up the Task and Step objects of the configured package, since their number depends on the number of transferred files.

Set oTask = oPackage.Tasks.New("DTSExecuteSQLTask")
Set oEsqlTask = oTask.CustomTask
With oEsqlTask
    .Name = "SQLTask" & i
    .SQLStatement = "if exists(SELECT [name] FROM [" & db & _
    "].[dbo].[sysobjects] WHERE [name] = '" & tbl & _
    "' AND type = 'U') DROP TABLE [" & db & "].[dbo].[" & tbl _
    & "] CREATE TABLE [" & db & "]." & f(n, 2)
    .ConnectionID = 2
End With
oPackage.Tasks.Add oTask
Set oEsqlTask = Nothing
Set oTask = Nothing

We add the create-table task to the Tasks collection. Its ExecuteSQLTask object is created by calling the Tasks.New method with the class name argument DTSExecuteSQLTask. This task creates an empty table in the database according to the value of the oEsqlTask.SQLStatement property. In addition to the usual CREATE TABLE statement, I added an IF EXISTS check: if a table with the same name already exists, it is dropped first. This behaviour can be changed, but that is how it is done in this implementation. oEsqlTask is linked to connection 2.
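For a hypothetical table customers in a database pubs (names assumed for illustration), the assembled SQLStatement would read, reformatted for readability:

```vb
' The SQLStatement built above, for a hypothetical table "customers"
' in database "pubs":
'
'   if exists(SELECT [name] FROM [pubs].[dbo].[sysobjects]
'             WHERE [name] = 'customers' AND type = 'U')
'       DROP TABLE [pubs].[dbo].[customers]
'   CREATE TABLE [pubs].[dbo].[customers] ( [id] [int] , [name] [varchar] (30) )
```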

Set oTask = oPackage.Tasks.New("DTSDataPumpTask")
Set oPumpTask = oTask.CustomTask
With oPumpTask
    .Name = "PumpTask" & i
    .SourceConnectionID = 1
    .SourceSQLStatement = "select * from " & tbl
    .DestinationConnectionID = 2
    .DestinationObjectName = "[" & db & "].[dbo].[" & tbl & "]"
End With

We add the second Task object to the Tasks collection. Its Transform Data Task object is created by calling the Tasks.New method with the class name argument DTSDataPumpTask. The Transform Data Task transfers data between the source and destination connections: the source is specified by the SourceConnectionID property, the destination by DestinationConnectionID. The oPumpTask.SourceSQLStatement property is the SELECT statement run against the DBF file.

   Set oTransform = oPumpTask.Transformations.New("DTS.DataPumpTransformCopy")
   With oTransform
       .Name = "Transform"
       .TransformFlags = DTSTransformFlag_Default
   End With
   oPumpTask.Transformations.Add oTransform
   oPackage.Tasks.Add oTask
   Set oTask = Nothing
j = j + 1

A Transformation object is added to the oPumpTask.Transformations collection; it describes how the fields of the source table are converted to the fields of the destination table. The DTSTransformFlag_Default constant is used for the TransformFlags property.

    Set oStep = oPackage.Steps.New
    oStep.Name = "Step" & j
    oStep.TaskName = "SQLTask" & i
    oPackage.Steps.Add oStep
    Set oStep = Nothing
j = j + 1
    Set oStep = oPackage.Steps.New
    oStep.Name = "Step" & j
    oStep.TaskName = "PumpTask" & i
    oPackage.Steps.Add oStep

Two Step objects are added to the Steps collection; they carry information about the flow of the package and the tasks to be completed. Both steps are added to the package by the line oPackage.Steps.Add oStep.

    Set oPrecConstraint = oStep.PrecedenceConstraints.New("Step" & j - 1)
    oStep.PrecedenceConstraints.Add oPrecConstraint
    Set oPrecConstraint = Nothing
    Set oStep = Nothing
Next n

A PrecedenceConstraint object is added to the PrecedenceConstraints collection; it determines the conditions that must be met before the package executes the step it is attached to. The conditions are defined by the PrecedenceBasis and Value properties; here the default values are used, so these properties are omitted in the code. By default, PrecedenceBasis equals the DTSStepPrecedenceBasis_ExecResult constant, i.e. the constraint is based on the execution result, and Value equals the DTSStepExecResult_Success constant, i.e. "step execution succeeded". Thus, for each table, the data-pump step is executed only if its create-table step completed successfully.
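If the defaults were written out explicitly, the constraint setup would look like this (equivalent to the code above):

```vb
Set oPrecConstraint = oStep.PrecedenceConstraints.New("Step" & j - 1)
' These two assignments are the defaults and are therefore omitted in the
' article's code; shown here for clarity:
oPrecConstraint.PrecedenceBasis = DTSStepPrecedenceBasis_ExecResult
oPrecConstraint.Value = DTSStepExecResult_Success
oStep.PrecedenceConstraints.Add oPrecConstraint
```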

If usr = "" Then oPackage.Execute Else oPackage.SaveToSQLServer srv, _
usr, pwd
Set oPackage = Nothing
Exit Sub

Finally the package is configured and ready to be executed or saved to SQL Server. The choice is made by the usr member: if it is empty, the package is executed; otherwise it is saved to the server under the given credentials. Afterwards the object is destroyed and there is a normal exit from the sub.

Error_Handler:
Err.Raise Err.Number, "dbf_dynaimport", "Execution of Package failed, " & _
"Description: " & Err.Description & vbCrLf & sAccumStepErrors(oPackage)
End Sub

And at the end is the Error_Handler, which raises the error, together with the accumulated step errors, back to the client.
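The helper sAccumStepErrors is referenced but not listed in the article. A hypothetical reconstruction, using the Step object's GetExecutionErrorInfo method, might look like this:

```vb
' Hypothetical reconstruction of sAccumStepErrors (not listed in the
' article): collect the error description of every failed step.
Private Function sAccumStepErrors(pkg As DTS.Package) As String
    Dim oStep As DTS.Step
    Dim lErr As Long, sSource As String, sDesc As String
    For Each oStep In pkg.Steps
        If oStep.ExecutionResult = DTSStepExecResult_Failure Then
            oStep.GetExecutionErrorInfo lErr, sSource, sDesc
            sAccumStepErrors = sAccumStepErrors & oStep.Name & ": " & _
                sDesc & vbCrLf
        End If
    Next
End Function
```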

II. Client Application

A small sample of code from a possible client application:

Dim c As New trans_svr.dbf_dynaimport
c.trans vfpole, "C:\", "strong_server", "pubs"


c.trans jet, "\\rrp\d\test\1.dbf", "strong_server", "pubs"

Or, saving the package to SQL Server by first setting the component's user name and password:

c.usr = "sa"
c.pwd = "pass"
c.trans jet, "\\rrp\d\test\", "strong_server", "pubs"

III. Deployment Application

For development and deployment of this application as a DCOM application, I used a very good Microsoft article.

Building on the "Set the Server's Security" part of that article, here is how I suggest tuning the security settings for this application only:

  1. On the server computer, click the Start button, and then select Run. In the Run dialog box, type Dcomcnfg, and then click OK.
  2. Select the Default Properties tab, and verify that Enable Distributed COM on this computer is checked.
  3. Select the Applications tab, highlight server, trans_svr.dbf_dynaimport, and then click the Properties button.
  4. Select the General tab, set the Authentication Level to Default, and then select the Location tab. The only option that is checked should be Run application on this computer.
  5. Select the Security tab and check the Use custom access permissions and Use custom launch permissions options. Click Edit... for the access (launch) permissions and use the Add button to add the users or groups who will access (launch) the application, for example Domain Users. Also add the SYSTEM account to both the access and launch permissions.
  6. Select the Identity tab. On this tab you specify the account under which the server part of the application runs; this is the account that SQL Server will trust. It is enough to set up a Windows user account and add it to the SQL Server security logins, then specify which databases this login may access and grant it the db_ddladmin, db_datareader, and db_datawriter database roles for those databases. The server component can also be used with other SQL Servers on the local network in the domain; just add this Windows user account to those SQL Servers, as shown above.

Remember that the server part of the application has exactly the access rights of the Windows account under which it runs. File paths passed as arguments to the server component may not be visible to that account; in such cases, use Universal Naming Convention (UNC) paths for the arguments.


If I have some time, I might create a similar component for transferring data from Excel or Access to SQL Server. I might also develop the same for modern versions of .NET and SQL Server.


This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


About the Author

Andrey Chuvahin
Team Leader
Ukraine

Article Copyright 2006 by Andrey Chuvahin