
Deployment made simple using PowerShell

14 Dec 2006 · CPOL · 8 min read
PowerShell scripts allow you to take advantage of .NET libraries and write scripts which are almost as powerful as the .NET code itself.

Introduction

PowerShell scripts allow you to take advantage of .NET libraries and write scripts which are almost as powerful as the .NET code itself. You can do many powerful operations: call external DLLs, use .NET namespaces like System.IO and System.Net, run processes and intercept their output, call web services, and so on. The possibilities are endless. Here, I will show you a PowerShell script which assists you in the day-to-day deployment of websites.

Every day, we make changes to web projects which need to be deployed to development servers, sometimes to beta servers, and finally to the production server. Using this script, you can automate all the manual work that you repeat on your deployment package every time you upload your website to a server. We use this script at Pageflakes every single day for our local development server uploads, beta releases, and final production server releases. All we do is run the script, go to the server, and extract a zip file into the web folder. The new version gets deployed within two minutes without any manual work at all, which completely removes the possibility of human error during deployment.

Automating deployment

The PowerShell script does the following for you:

  • Maintains different configuration information for different deployments; for example, different connection strings for development servers and production servers (one or more production servers).
  • Creates a deployment folder using the deployment date, time, and version, so that you have a separate folder for each deployment and can keep track of what was deployed on a given day, e.g., 20061214-1.
  • Copies only the changed files, plus some predefined files, to the deployment folder, so you don't deploy the whole website every day.
  • Copies web.config and customizes <appSettings>, <connectionStrings>, <assemblies>, etc., as per the deployment configuration. For example, you can have different connection strings for different servers.
  • Updates all JavaScript files with a version number so that on every deployment, a new file gets downloaded by client browsers.
  • Updates Default.aspx automatically with the modified script file names.
  • Compresses all JavaScript files that get deployed.
  • Compresses all static HTML files using Absolute HTML Optimizer.
  • Creates a zip file which contains the deployment package.
  • FTPs the zip file to a target server.

After the deployment script runs, all you need to do is extract the zip file on the server and that's all!
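For illustration, the dated deployment folder name mentioned above (e.g., 20061214-1) can be built from the date and version in one step. This is only a sketch; the variable names ($deployFolderName, $deployPath) are assumptions, as the actual script constructs its deployment path elsewhere:

```powershell
# Sketch: build a folder name like 20061214-1 from today's date and the version
$VERSION = 1  # hypothetical value; normally comes from the Param() block
$deployFolderName = [DateTime]::Now.ToString("yyyyMMdd") + "-" + $VERSION
$deployPath = [System.IO.Path]::Combine( $pwd.ToString(), $deployFolderName )
"$deployFolderName"
```

Because the date is part of the folder name, each day's deployment lands in its own folder and older packages remain available for rollback.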

You can easily FTP the modified files individually instead of only the zip file, by changing the FTP part at the end of the script.
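If you do want to upload the changed files one by one, the FTP command file could be built with one PUT per file rather than a single PUT for the zip. A sketch, where the credentials and sample paths are placeholders rather than the article's actual values:

```powershell
# Sketch: one PUT command per changed file instead of a single zip upload
$USER = "UserName"; $PASS = "Password"        # hypothetical credentials
$changedFiles = @( "C:\web\Default.aspx",
                   "C:\web\web.config" )      # sample paths for illustration
$FtpCommands = @( $USER, $PASS, "BINARY" )
foreach( $file in $changedFiles )
{
    $FtpCommands += "PUT `"" + $file + "`""
}
$FtpCommands += "QUIT"
$FtpScript = [String]::Join( "`r`n", $FtpCommands )
```

Note that plain PUT does not recreate server-side subdirectories, so a per-file upload would also need CD/MKDIR commands for files in nested folders.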

Configuring the script

The script takes some command line parameters:

Param (
    $CONFIG = "alpha",    # Configuration setting,
                          # e.g. alpha, beta, release
    $VERSION = 1,         # Version number for the deployment.
                          # Increase as you wish
    [DateTime] $LAST_CHANGE_DATETIME = 
       [DateTime]::Now.AddDays(-10), # File change date to consider
    $FROM = "SampleWeb",  # Location of the web site.
                          # Relative path to the script
    $FTP = 1 )            # 1 - FTP to host. 0 - Don't FTP

The parameters are:

  • CONFIG - Defines the configuration, e.g., "alpha", "beta", "release".
  • VERSION - Version number that you increase on every deployment. Script files are suffixed with the version number.
  • LAST_CHANGE_DATETIME - Files modified after this date and time are deployed.
  • FROM - Relative path to the website folder.
  • FTP - If 1, FTP the zip, otherwise don't.

Then, define the FTP configuration in the following block:

# FTP Configuration where the package is uploaded
$HOSTNAME = "ftp.pageflakes.com";
$UPLOAD_DIR = "updates"; # Folder where the final
                         # zip package is uploaded
$USER = "UserName"; # User name for the FTP server
$PASS = "Password"; # Password for the FTP server

After that, some handy configurations:

# Exclude the following folders while copying files
$EXCLUDE_FOLDERS = @('\App_Data', '\temp');
    
$DEFAULT_ASPX = "Default.aspx"; # The aspx file where script tag
                                # references are updated with version no
$SCRIPT_FILES = @('Script.js'); # Script files which get
                                # a version suffix added

$ASSEMBLIES_TO_REMOVE = 
  @('Microsoft.Build.Framework'); 
  # Unwanted assemblies inside <assemblies>

You can define which folders are always excluded from the deployment; for example, there is no need to deploy App_Data.

Also define the ASPX file name which gets updated with new script references automatically. The $SCRIPT_FILES array contains the list of script files which are suffixed with the version number on every deployment; the corresponding <SCRIPT Src="..."> tags are modified accordingly.

$ASSEMBLIES_TO_REMOVE specifies the assembly names which are removed from web.config's <assemblies> block. Sometimes it is necessary to remove assemblies which are not needed on the production server; for example, nunit.framework.dll.

After that, you define the configurations for different deployment targets:

if( $CONFIG -eq "alpha" )
{
    $DB_HOST = "(local)";
    $DB_NAME = "AlphaDatabase";
    $CREDENTIAL = "trusted_connection=false;";
}
if( $CONFIG -eq "beta" )
{
    $DB_HOST = "(local)";
    $DB_NAME = "BetaDatabase";
    $CREDENTIAL = "trusted_connection=false;";
}
if( $CONFIG -eq "release" )
{
    $DB_HOST = "(local)";
    $DB_NAME = "ReleaseDatabase";
    $CREDENTIAL = "trusted_connection=false;";
}

Here, I have specified different connection strings for different deployment targets.
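The $CONNECTION_STRING that gets written into web.config later in the script is presumably assembled from these per-target pieces. The exact format is not shown in the excerpts, so this is only a plausible sketch, not the author's actual code:

```powershell
# Sketch: assemble the connection string from the per-target settings
$DB_HOST = "(local)";
$DB_NAME = "AlphaDatabase";
$CREDENTIAL = "trusted_connection=false;";
$CONNECTION_STRING = "server=$DB_HOST;database=$DB_NAME;$CREDENTIAL"
"$CONNECTION_STRING"
```

Keeping the host, database, and credential fragments separate makes it easy to vary only the pieces that differ between alpha, beta, and release.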

Copying only modified files

The script first gets all the modified files:

# Get the files which were changed
# after the $LAST_CHANGE_DATETIME
$changedFiles = get-childitem $srcPath `
   -exclude "*.log" -Recurse | 
   where { $_ -is [System.IO.FileInfo] -and
           ( $_.LastWriteTime -ge $LAST_CHANGE_DATETIME -or
             $_.Name -eq $DEFAULT_ASPX -or
             $_.Name -eq "web.config" ) }

The query finds all the files modified after the specified change date, but it always includes Default.aspx and web.config, as these two files are always deployed. This is a requirement in Pageflakes, which is why I put it there; you can remove it if you don't need it.

The next step is to decide the relative path for each file inside the deployment folder and copy the files from the source folder to the destination folder:

foreach( $file in $changedFiles )
{
    [string]$filePath = $file.ToString();
    [string]$relativePath = 
      $file.DirectoryName.substring($srcPath.Length);
    
    # if the file is in one of the excluded folders, don't copy
    $canCopy = 1;
    foreach( $excludeFolder in $EXCLUDE_FOLDERS )
    {
        if( $relativePath.StartsWith($excludeFolder) )
        {
            $canCopy = 0;
        }
    }

Before copying a file, it checks whether the file's relative path starts with any of the excluded folders.

If the folder is valid, copy the file. It also ensures the directory structure is created before copying the file:

if( $canCopy -eq 1 )
{
    # if the relative path contains a subdirectory,
    # then create a subdirectory under the deploy path
    [string]$copyPath = [System.IO.Path]::Combine( 
         $deployPath, $relativePath.TrimStart('\') );    

    if( ![System.IO.Directory]::Exists($copyPath) )
    { 
        $newDir = 
          [System.IO.Directory]::CreateDirectory($copyPath);
    }

    copy $filePath $copyPath;    
}

The next step is to configure web.config according to the deployment configuration; for example, setting the connection string for the production server.

Configure web.config automatically

First, it loads web.config into an XmlDocument (System.Xml.XmlDocument):

$webConfigFilePath = 
  [System.IO.Path]::Combine( $deployPath, "web.config" );
if( [System.IO.File]::Exists( $webConfigFilePath ) )
{
    "    Web config found";
    [System.Xml.XmlDocument]$doc = 
       new-object System.Xml.XmlDocument;
    $doc.Load($webConfigFilePath);

Then it changes some settings: it switches the compilation mode to debug="false", which is highly recommended for production servers, and sets the connection string.

$root = $doc.get_DocumentElement();

# Change compilation mode to debug="false"    
$root."system.web".compilation.debug = "false";

# Set deployment configuration specific connection string
$root.connectionStrings.add.connectionString = $CONNECTION_STRING

You can easily set multiple connection strings here. The script then runs through all the <appSettings> keys and modifies the values according to the deployment configuration:

foreach( $item in $root.appSettings.add )
{
    if( $item.key -eq "Proxy" )
    { $item.value = ""; }
    if( $item.key -eq "Version" )
    { $item.value = "" + $VERSION; }
}

The next step is to remove unwanted assemblies which should not be deployed to servers.

foreach( $item in  $root."system.web".compilation.assemblies.add )
{
    foreach( $assemblyName in $ASSEMBLIES_TO_REMOVE )
    {
        if( $item.assembly.Contains($assemblyName) )
        { 
            $removedNode = 
              $root."system.web".compilation.assemblies.RemoveChild( $item ); 
        }
    }
}

This gives us a nicely prepared web.config. I bet you have been doing all of this manually before uploading fixes to a server or making new releases. See how PowerShell can automate it completely and remove the possibility of human error.

Update .js file names and <script> tags

The next step is to update all JavaScript references. At Pageflakes, we have a problem: when we upload modified JavaScript files, they do not get downloaded by existing users because they are cached in the client's browser. So we need to change the file names every time we modify those scripts and upload them to the servers. This is error-prone manual work. After changing a file name, we need to go to Default.aspx and update every script reference with the new file name. That is even more manual work, and we used to make serious mistakes and break production servers at least once every three deployments. So we completely automated this process using this PowerShell script, and now we never worry about the script file problem at all.

foreach( $scriptFile in $SCRIPT_FILES )
{
    updateReferenceWithNewVersion $scriptFile 
       $DEFAULT_ASPX $VERSION $srcPath $deployPath
}

The actual work is done inside the function, where we look for the file name and replace it with the new, versioned name:

function updateReferenceWithNewVersion( $fileName, 
       $referenceFile, $versionNo, $srcPath, $destPath )
{
    $filePath = [System.IO.Path]::Combine( $destPath, $fileName );
    if( [System.IO.File]::Exists( $filePath ) )
    {
        "$fileName exists. Upgrading its version in $referenceFile"
        
        $referencePath = 
          [System.IO.Path]::Combine( $destPath, $referenceFile );
        if( -not [System.IO.File]::Exists( $referencePath ) )
        {
            "    Copying $referenceFile because it's not updated"
            $referenceSrcPath = 
              [System.IO.Path]::Combine( $srcPath, $referenceFile );
            $result = copy $referenceSrcPath $referencePath
        }
        
        $newFileName = $fileName.Split('.')[0] + "-" + 
             $versionNo + "." + $fileName.Split('.')[1];
        
        ren $filePath $newFileName
        
        $referenceContent = 
          [System.IO.File]::ReadAllText( $referencePath );
        $referenceContent = 
          $referenceContent.Replace( $fileName, $newFileName );
        [System.IO.File]::WriteAllText( 
          $referencePath, $referenceContent );
    }
    else
    {
        "$fileName not available"
    }
}

JSMIN all script files

We use JSMIN to compress all JavaScript files before deploying. This allows us to keep uncompressed JS files which contain lots of comments and spaces inside them without any reservation. When a script file gets deployed, JSMIN removes all comments and spaces from it, and generates a very compact version of the script which browsers understand without any problem. This reduces script download time significantly, and improves the overall site download speed.

$jsFiles = get-childitem $deployPath `
  -include *.js -Recurse | where { $_ -is [System.IO.FileInfo] }

$p = new-object System.Diagnostics.Process;
$si = new-object System.Diagnostics.ProcessStartInfo;
$si.FileName =     $jsminPath;
$si.CreateNoWindow = $true;

$p.StartInfo = $si;

if( $jsFiles -eq $null )
{
    "    No .js files modified"
}
foreach( $file in $jsFiles )
{
    [string]$filePath = $file.ToString();
    [string]$newFilePath = $filePath + ".new";

    $si.Arguments = "`"" + $filePath + "`" `"" + $newFilePath + "`"";

    $result = $p.Start();
    $p.WaitForExit();

    if( $result )
    {
        "    Minified: $filePath";
        copy $newFilePath $filePath;
        del $newFilePath;
    }
    else
    {
        del $newFilePath;
    }
}
$p.Dispose();

The idea is to launch jsmin.exe using the System.Diagnostics.Process class from the .NET library for each and every .js file. This looks pretty bad on screen when you have lots of .js files, as a new command prompt window pops up for a second for each one. But we don't mind it at all, given the immense favor JSMIN does for us.

The attached source code with the article contains a special version of JSMIN which does not remove line breaks. We collect JavaScript errors from the client side, so we need to know the line number and the code on that line. When JSMIN removes line breaks, it becomes impossible to trace the error. This is why we have modified JSMIN to prevent it from removing line breaks.

Compress HTML files

We compress all HTML files mercilessly in order to reduce their size as much as possible. Absolute HTML Optimizer is an amazing tool which does this for us. The idea is the same as with JSMIN: launch ahc.exe with some command-line parameters, and it does the job for us.

$p = new-object System.Diagnostics.Process;

$si = new-object System.Diagnostics.ProcessStartInfo;
$si.FileName =     $htmlCompressorPath;
$si.CreateNoWindow = $false;
$compressorParam = "`"" + $deployPath + 
                   "`" " + $HTML_COMPRESSOR_PARAMS;
"$compressorParam"
$si.Arguments = $compressorParam;
$p.StartInfo = $si;
$result = $p.Start();
$p.WaitForExit();

if( $result )
{
    "    Compressed";
}
else
{
    "    Error occurred";
}
$p.Dispose();

Zip up files using SharpZip

The deployment folder is zipped using SharpZip which is a free zip library for .NET. Here's how the DLL is loaded and initialized:

$zipLibraryPath = 
  [System.IO.Path]::GetFullPath(
  [System.IO.Path]::Combine( $pwd.ToString(), 
  "ICSharpCode.SharpZipLib.dll" ));
[void][System.Reflection.Assembly]::LoadFile($zipLibraryPath);
$zip = new-object ICSharpCode.SharpZipLib.Zip.FastZip;

Then one single line zips the files:

$zip.CreateZip( $deployZipFilePath, 
    $deployPath, $true, [string]::Empty );

FTP deployment package

Once we have the zip file, we use the Windows FTP command-line tool to upload it to the server. There's an FTP class available in .NET 2.0's System.Net namespace, but I could not make it work properly from PowerShell, so I used the plain, simple Windows FTP command-line tool.

# FTP the Zip file
"Uploading file to FTP server..."
$FtpCommandFilePath = 
  [System.IO.Path]::GetFullPath(
  [System.IO.Path]::Combine( $pwd.ToString(), 
  "FTPCommand.txt" ) );
$FtpUploadCommand = "PUT `"" + $deployZipFilePath + "`"";
if( $UPLOAD_DIR.Length -gt 0 ) 
   { $FtpChdirCommand = "CD " + $UPLOAD_DIR; }
$FtpCommands = @( $USER, $PASS, 
  $FtpChdirCommand, "BINARY", $FtpUploadCommand, "QUIT" );
$FtpCommand = [String]::Join( "`r`n", $FtpCommands );
set-content $FtpCommandFilePath $FtpCommand

ftp "-s:$FtpCommandFilePath" $HOSTNAME

del $FtpCommandFilePath

"FTP Complete." 

The idea is to build an FTP command file which contains the FTP commands like sending user name, password, and upload instructions. Then the command file is passed to ftp.exe which runs the commands and uploads the zip file.
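For completeness, the System.Net approach the author mentions can be scripted along these lines with FtpWebRequest. This is an untested sketch, not the method used in the article; the function name, host, path, and credentials are all placeholders:

```powershell
# Sketch: uploading with System.Net.FtpWebRequest instead of ftp.exe
function Upload-ZipViaFtp( $zipPath, $uri, $user, $pass )
{
    $request = [System.Net.FtpWebRequest]::Create( $uri )
    $request.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile  # "STOR"
    $request.Credentials = 
        new-object System.Net.NetworkCredential( $user, $pass )
    $bytes = [System.IO.File]::ReadAllBytes( $zipPath )
    $stream = $request.GetRequestStream()
    $stream.Write( $bytes, 0, $bytes.Length )
    $stream.Close()
    $request.GetResponse().Close()
}

# Example call (placeholder values):
# Upload-ZipViaFtp $deployZipFilePath `
#     "ftp://ftp.example.com/updates/package.zip" "UserName" "Password"
```

The upside would be no temporary command file and no external process; the downside, as the author found, is that getting the request and stream plumbing right from PowerShell takes more effort than a six-line FTP script.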

Conclusion

Copying modified files, renaming files, configuring web.config, compressing HTML and JavaScript files, zipping, and FTPing: all of it is done by one simple PowerShell script. It has saved us hundreds of hours of manual labor over the last year. It took a long time to get the script to work perfectly, but the time spent on it was worth it. We can prevent human errors during deployment, and we can greatly speed up day-to-day patch releases on production servers using this script. Enjoy!

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


Written By
Architect, BT, UK (ex British Telecom)
United Kingdom
