
Deployment made simple using PowerShell

14 Dec 2006, CPOL
PowerShell scripts allow you to take advantage of .NET libraries and write scripts which are almost as powerful as the .NET code itself.

Introduction

PowerShell scripts allow you to take advantage of .NET libraries and write scripts which are almost as powerful as the .NET code itself. You can perform many powerful operations: call external DLLs, use .NET namespaces like System.IO and System.Net, run processes and intercept their output, call web services, and so on. The possibilities are endless. Here, I will show you a PowerShell script which assists in the day-to-day deployment of websites. Every day, we make changes to web projects which need to be deployed to development servers, sometimes to beta servers, and finally to the production server. Using this script, you can automate all the manual work that you repeat on your deployment package every time you upload your website to a server. We use this script at Pageflakes every single day for local development server uploads, beta releases, and final production releases. All we do is run the script, go to the server, and extract a zip file into the web folder, and that's all. The new version gets deployed within two minutes without any manual work, which removes the possibility of human error during deployment.

Automating deployment

The Powershell script does the following for you:

  • Maintains different configuration information for different deployments. For example, different connection strings for development servers and production servers (one or more production servers).
  • Creates a deployment folder using the deployment date, time, and version so that you have a separate folder for each deployment and can keep track of things deployed on a day, e.g., 20061214-1.
  • Copies only the changed files, plus some predefined files, to the deployment folder, so you don't deploy the whole website every day.
  • Copies the web.config and customizes the <appSettings>, <connectionString>, <assemblies> etc., as per the deployment configuration. For example, you can have different connection strings for different servers.
  • Updates all JavaScript files with a version number so that in every deployment, a new file gets downloaded by client browsers.
  • Updates default.aspx automatically with the modified script file name.
  • Compresses all JavaScript files that get deployed.
  • Compresses all static HTML files using Absolute HTML Optimizer.
  • Creates a zip file which contains the deployment package.
  • FTPs the zip file to a target server.

After the deployment script runs, all you need to do is extract the zip file on the server and that's all!

You can easily FTP the modified files individually instead of a single zip file by changing the FTP part at the end of the script.

Configuring the script

The script takes some command line parameters:

Param (
    $CONFIG = "alpha",  # Configuration setting.
                        # e.g. alpha, beta, release
    $VERSION = 1,     # Version number 
                    # for the deployment. Increase as you will
    [DateTime] $LAST_CHANGE_DATETIME = 
       [DateTime]::Now.AddDays(-10), # File change date to consider
    $FROM ="SampleWeb", # Location of web site.
                        # Relative path to the script
    $FTP = 1 ) # 1 - FTP to host. 0 - Don't FTP

The parameters are:

  • CONFIG - Defines the configuration, e.g., "alpha", "beta", "release".
  • VERSION - Version number that you increase on every deployment. Script files are suffixed with the version number.
  • LAST_CHANGE_DATETIME - Files modified after this date and time are deployed.
  • FROM - Relative path to the website folder.
  • FTP - If 1, FTP the zip, otherwise don't.
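Putting these together, a typical invocation might look like the following. The script file name Deploy.ps1 is illustrative; use whatever you saved the script as:

```powershell
# Deploy the "beta" configuration as version 12, picking up files
# changed in the last 3 days, and upload the package over FTP.
# "Deploy.ps1" is an assumed name for the script file.
.\Deploy.ps1 -CONFIG "beta" -VERSION 12 `
    -LAST_CHANGE_DATETIME ([DateTime]::Now.AddDays(-3)) `
    -FROM "SampleWeb" -FTP 1
```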

Then, define the FTP configuration in the following block:

# FTP Configuration where the package is uploaded
$HOSTNAME = "ftp.pageflakes.com";
$UPLOAD_DIR = "updates"; # Folder where the final
                         # zip package is uploaded
$USER = "UserName"; # User name for the FTP server
$PASS = "Password"; # Password for the FTP server

After that, some handy configurations:

# Exclude the following folders while copying files
$EXCLUDE_FOLDERS = @('\App_Data', '\temp');
    
$DEFAULT_ASPX = "Default.aspx"; # The aspx file where script tag
                                # references are updated with version no
$SCRIPT_FILES = @('Script.js'); # Script files which get
                                # a version suffix added

$ASSEMBLIES_TO_REMOVE = 
  @('Microsoft.Build.Framework'); 
  # Unwanted assemblies inside <assemblies>

You can define folders that are always excluded from the deployment. For example, there is no need to deploy App_Data.

Also define the ASPX file name which gets modified with new script references automatically. The $SCRIPT_FILES array contains the list of script files that are suffixed with the version number on every deployment, and the corresponding <SCRIPT Src="..."> tags are modified accordingly.

$ASSEMBLIES_TO_REMOVE specifies the assembly names which are removed from the <assemblies> block of web.config. Sometimes it is necessary to remove assemblies which are not needed on the production server, for example, nunit.framework.dll.

After that, you define the configurations for different deployment targets:

if( $CONFIG -eq "alpha" )
{
    $DB_HOST = "(local)";
    $DB_NAME = "AlphaDatabase";
    $CREDENTIAL = "trusted_connection=false;";
}
elseif( $CONFIG -eq "beta" )
{
    $DB_HOST = "(local)";
    $DB_NAME = "BetaDatabase";
    $CREDENTIAL = "trusted_connection=false;";
}
elseif( $CONFIG -eq "release" )
{
    $DB_HOST = "(local)";
    $DB_NAME = "ReleaseDatabase";
    $CREDENTIAL = "trusted_connection=false;";
}

Here, I have specified different connection strings for different deployment targets.
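The article does not show how these pieces are combined into the $CONNECTION_STRING that is later written into web.config, so the exact format below is an assumption based on the variable names:

```powershell
# Sketch only: the composition of $CONNECTION_STRING is not shown in
# the article; this is a guess from $DB_HOST, $DB_NAME and $CREDENTIAL.
$DB_HOST = "(local)";
$DB_NAME = "BetaDatabase";
$CREDENTIAL = "trusted_connection=false;";
$CONNECTION_STRING = "server=$DB_HOST;database=$DB_NAME;$CREDENTIAL";
# e.g. "server=(local);database=BetaDatabase;trusted_connection=false;"
```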

Copying only modified files

The script first gets all the modified files:

# Get the files which were changed
# after the $LAST_CHANGE_DATETIME
$changedFiles = get-childitem $srcPath -exclude "*.log" -Recurse | 
    where { ($_ -is [System.IO.FileInfo] -and 
             $_.LastWriteTime -ge $LAST_CHANGE_DATETIME) -or 
            $_.Name -eq $DEFAULT_ASPX -or 
            $_.Name -eq "web.config" }

The query finds all files modified after the specified date, but it always includes Default.aspx and web.config because these two files are always deployed. This is a Pageflakes requirement, which is why I have put it there; you can remove it if you don't need it.

The next step is to decide the relative path for each file inside the deployment folder and copy the files from the source folder to the destination folder:

foreach( $file in $changedFiles )
{
    [string]$filePath = $file.ToString();
    [string]$relativePath = 
      $file.DirectoryName.substring($srcPath.Length);
    
    # if the file is in one of the excluded folders, don't copy
    $canCopy = 1;
    foreach( $excludeFolder in $EXCLUDE_FOLDERS )
    {
        if( $relativePath.StartsWith($excludeFolder) )
        {
            $canCopy = 0;
        }
    }

Before copying a file, it checks if the relative path contains any of the excluded folders.

If the folder is valid, copy the file. It also ensures the directory structure is created before copying the file:

if( $canCopy -eq 1 )
{
    # if the relative path contains a subdirectory,
    # then create a subdirectory under the deploy path
    [string]$copyPath = [System.IO.Path]::Combine( 
         $deployPath, $relativePath.TrimStart('\') );    

    if( ![System.IO.Directory]::Exists($copyPath) )
    { 
        $newDir = 
          [System.IO.Directory]::CreateDirectory($copyPath);
    }

    copy $filePath $copyPath;    
}

The next step is to configure the web.config according to the deployment configuration, for example, setting the connection string for the production server.

Configure web.config automatically

First, it loads the web.config into an XmlDocument (System.Xml.XmlDocument):

$webConfigFilePath = 
  [System.IO.Path]::Combine( $deployPath, "web.config" );
if( [System.IO.File]::Exists( $webConfigFilePath ) )
{
    "    Web config found";
    [System.Xml.XmlDocument]$doc = 
       new-object System.Xml.XmlDocument;
    $doc.Load($webConfigFilePath);

Then it changes some settings: it sets the compilation mode to debug="false", which is highly recommended for production servers, and sets the connection string.

$root = $doc.get_DocumentElement();

# Change compilation mode to debug="false"    
$root."system.web".compilation.debug = "false";

# Set deployment configuration specific connection string
$root.connectionStrings.add.connectionString = $CONNECTION_STRING

You can easily set multiple connection strings here. Then it runs through all the <appSettings> keys and modifies the values according to the deployment configuration:

foreach( $item in $root.appSettings.add )
{
    if( $item.key -eq "Proxy" )
    { $item.value = ""; }
    if( $item.key -eq "Version" )
    { $item.value = "" + $VERSION; }
}

The next step is to remove unwanted assemblies which should not be deployed to servers.

foreach( $item in  $root."system.web".compilation.assemblies.add )
{
    foreach( $assemblyName in $ASSEMBLIES_TO_REMOVE )
    {
        if( $item.assembly.Contains($assemblyName) )
        { 
            $removedNode = 
              $root."system.web".compilation.assemblies.RemoveChild( $item ); 
        }
    }
}

This gives us a nicely prepared web.config. You have probably been doing all of this manually before uploading fixes to a server or making a new release. See how PowerShell can automate it completely and eliminate human error.

Update .js file names and <script> tags

The next step is to update all JavaScript references. At Pageflakes, we have a problem: when we upload modified JavaScript files, existing users do not download them because the files are cached in the client's browser. So, we need to change the file name every time we modify those scripts and upload them to servers. This is error-prone manual work. After changing the file name, we have to go to Default.aspx and update every script reference with the new name. That is even more manual work, and we used to make serious mistakes and break production servers at least once in every three deployments. So, we completely automated this process with the PowerShell script, and now we never worry about the script file problem at all.

foreach( $scriptFile in $SCRIPT_FILES )
{
    updateReferenceWithNewVersion $scriptFile `
       $DEFAULT_ASPX $VERSION $srcPath $deployPath
}

The actual work is done inside the function, where we look for the old file name and replace it with the new one.

function updateReferenceWithNewVersion( $fileName, 
       $referenceFile, $versionNo, $srcPath, $destPath )
{
    $filePath = [System.IO.Path]::Combine( $destPath, $fileName );
    if( [System.IO.File]::Exists( $filePath ) )
    {
        "$fileName exists. Upgrading its version in $referenceFile"
        
        $referencePath = 
          [System.IO.Path]::Combine( $destPath, $referenceFile );
        if( -not [System.IO.File]::Exists( $referencePath ) )
        {
            "    Copying $referenceFile because it's not updated"
            $referenceSrcPath = 
              [System.IO.Path]::Combine( $srcPath, $referenceFile );
            $result = copy $referenceSrcPath $referencePath
        }
        
        $newFileName = $fileName.Split('.')[0] + "-" + 
             $versionNo + "." + $fileName.Split('.')[1];
        
        ren $filePath $newFileName
        
        $referenceContent = 
          [System.IO.File]::ReadAllText( $referencePath );
        $referenceContent = 
          $referenceContent.Replace( $fileName, $newFileName );
        [System.IO.File]::WriteAllText( 
          $referencePath, $referenceContent );
    }
    else
    {
        "$fileName not available"
    }
}
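One caveat: the new name is built with Split('.'), which keeps only the first two dot-separated parts. For a hypothetical file with more than one dot in its name, say jquery.min.js, that would produce jquery-12.min and lose the .js extension. A suggested safer variant (not part of the original script) uses System.IO.Path:

```powershell
# Safer name derivation for files with multiple dots in the name.
# This is a suggested alternative, not the article's original code.
$fileName = "jquery.min.js";   # hypothetical file name
$versionNo = 12;
$newFileName = [System.IO.Path]::GetFileNameWithoutExtension($fileName) +
    "-" + $versionNo + [System.IO.Path]::GetExtension($fileName);
$newFileName;   # jquery.min-12.js
```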

JSMIN all script files

We use JSMIN to compress all JavaScript files before deploying. This lets us keep uncompressed JS files full of comments and whitespace during development without any reservation. When a script file gets deployed, JSMIN removes all comments and spaces and generates a very compact version of the script which browsers understand without any problem. This reduces script download time significantly and improves the overall site load speed.

$jsFiles = get-childitem $deployPath -include *.js -Recurse | 
    where { $_ -is [System.IO.FileInfo] }

$p = new-object System.Diagnostics.Process;
$si = new-object System.Diagnostics.ProcessStartInfo;
$si.FileName = $jsminPath;
$si.CreateNoWindow = $true;

$p.StartInfo = $si;

if( $null -eq $jsFiles )
{
    "    No .js files modified"
}
foreach( $file in $jsFiles )
{
    [string]$filePath = $file.ToString();
    [string]$newFilePath = $filePath + ".new";

    $si.Arguments = "`"" + $filePath + "`" `"" + $newFilePath + "`"";

    $result = $p.Start();
    $p.WaitForExit();

    if( $result )
    {
        "    Minified: $filePath";
        copy $newFilePath $filePath;
        del $newFilePath;
    }
    else
    {
        del $newFilePath;
    }
}
$p.Dispose();

The idea is to launch jsmin.exe using the System.Diagnostics.Process class from the .NET library for each and every .js file. This looks pretty bad on screen when you have lots of .js files, as a new command prompt window pops up for a second per file. But we don't mind at all, given the immense favor JSMIN does us.
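If the flashing windows bother you, the usual fix is to disable shell execution on the ProcessStartInfo: CreateNoWindow is only honored when UseShellExecute is false, and UseShellExecute defaults to true in the .NET Framework. A minimal sketch:

```powershell
# Suppress the per-file command prompt windows when launching jsmin.exe.
$si = new-object System.Diagnostics.ProcessStartInfo;
$si.FileName = $jsminPath;          # path to jsmin.exe, as in the script
$si.UseShellExecute = $false;       # CreateNoWindow is ignored without this
$si.CreateNoWindow = $true;         # no window flashes per .js file
```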

The source code attached to this article contains a special version of JSMIN which does not remove line breaks. We collect JavaScript errors from the client side, so we need to know the line number and the code on that line. When JSMIN removes the line breaks, it becomes impossible to trace the error. This is why we modified JSMIN to prevent it from removing line breaks.

Compress HTML files

We compress all HTML files mercilessly to reduce their size as much as possible. Absolute HTML Optimizer is an amazing tool which does it for us. The idea is the same as with JSMIN: launch ahc.exe with some command line parameters and it does the job.

$p = new-object System.Diagnostics.Process;

$si = new-object System.Diagnostics.ProcessStartInfo;
$si.FileName = $htmlCompressorPath;
$si.CreateNoWindow = $false; # note: the cast [Boolean]"false" would
                             # actually evaluate to $true in PowerShell
$compressorParam = "`"" + $deployPath + 
                   "`" " + $HTML_COMPRESSOR_PARAMS;
"$compressorParam"
$si.Arguments = $compressorParam;
$p.StartInfo = $si;
$result = $p.Start();
$p.WaitForExit();

if( $result )
{
    "    Compressed";
}
else
{
    "    Error occured";
}
$p.Dispose();

Zip up files using SharpZip

The deployment folder is zipped using SharpZipLib, which is a free zip library for .NET. Here's how the DLL is loaded and initialized:

$zipLibraryPath = 
  [System.IO.Path]::GetFullPath(
  [System.IO.Path]::Combine( $pwd.ToString(), 
  "ICSharpCode.SharpZipLib.dll" ));
[void][System.Reflection.Assembly]::LoadFile($zipLibraryPath);
$zip = new-object ICSharpCode.SharpZipLib.Zip.FastZip;

Then one single line zips the files:

$zip.CreateZip( $deployZipFilePath, 
    $deployPath, $true, [string]::Empty );
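On the server side, the same FastZip class can extract the package, in case you want to script that step as well. A minimal sketch, assuming SharpZipLib is also available on the server; all paths here are illustrative:

```powershell
# Extract the deployment package on the server with SharpZipLib.
# The DLL path, zip path and web folder path are hypothetical examples.
[void][System.Reflection.Assembly]::LoadFile(
    "C:\tools\ICSharpCode.SharpZipLib.dll" );
$zip = new-object ICSharpCode.SharpZipLib.Zip.FastZip;
# ExtractZip( zipFileName, targetDirectory, fileFilter )
$zip.ExtractZip( "C:\updates\20061214-1.zip",
    "C:\inetpub\wwwroot", [string]::Empty );
```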

FTP deployment package

Once we have the zip file, we use the Windows FTP command line tool to upload it to the server. .NET 2.0 has an FTP class in System.Net (FtpWebRequest), but I could not make it work properly from PowerShell, so I used the plain Windows FTP command line tool.

# FTP the Zip file
"Uploading file to FTP server..."
$FtpCommandFilePath = 
  [System.IO.Path]::GetFullPath(
  [System.IO.Path]::Combine( $pwd.ToString(), 
  "FTPCommand.txt" ) );
$FtpUploadCommand = "PUT `"" + $deployZipFilePath + "`"";
if( $UPLOAD_DIR.Length -gt 0 ) 
   { $FtpChdirCommand = "CD " + $UPLOAD_DIR; }
$FtpCommands = @( $USER, $PASS, 
  $FtpChdirCommand, "BINARY", $FtpUploadCommand, "QUIT" );
$FtpCommand = [String]::Join( "`r`n", $FtpCommands );
set-content $FtpCommandFilePath $FtpCommand
    
    ftp "-s:$FtpCommandFilePath" $HOSTNAME
    
    del $FtpCommandFilePath
    
    "FTP Complete." 

The idea is to build an FTP command file which contains the FTP commands like sending user name, password, and upload instructions. Then the command file is passed to ftp.exe which runs the commands and uploads the zip file.
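If you would rather avoid ftp.exe, the System.Net.FtpWebRequest class introduced in .NET 2.0 can do the upload directly from PowerShell. A sketch reusing the script's $HOSTNAME, $UPLOAD_DIR, $USER, $PASS and $deployZipFilePath settings; I have not battle-tested this the way the ftp.exe approach has been:

```powershell
# Upload the zip with System.Net.FtpWebRequest instead of ftp.exe.
$uri = "ftp://$HOSTNAME/$UPLOAD_DIR/" +
       [System.IO.Path]::GetFileName($deployZipFilePath);
$request = [System.Net.WebRequest]::Create($uri);
$request.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile;
$request.Credentials =
    new-object System.Net.NetworkCredential($USER, $PASS);
$request.UseBinary = $true;

$bytes = [System.IO.File]::ReadAllBytes($deployZipFilePath);
$stream = $request.GetRequestStream();
$stream.Write($bytes, 0, $bytes.Length);
$stream.Close();

$response = $request.GetResponse();
"Upload status: " + $response.StatusDescription;
$response.Close();
```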

Conclusion

Copying modified files, renaming files, configuring web.config, compressing HTML and JavaScript files, zipping, and FTPing are all done by one simple PowerShell script. It has saved us hundreds of hours of manual labor over the last year. It took a long time to get the script working perfectly, but the time spent on it was worth it. It prevents human error during deployment and greatly speeds up day-to-day patch releases on production servers. Enjoy!

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


About the Author

Omar Al Zabir
Architect BT, UK (ex British Telecom)
United Kingdom United Kingdom

