INDIVIRTUAL - TECHNICAL PARTNER IN DIGITAL SERVICES

Tridion's new API for Content Porting

February 19, 2014

Our old friend the Content Porter

SDL Content Porter! You can’t live with it, and you can’t live without it. Many of us have served our time making exports and imports as we moved our Tridion templates through the DTAP street, or even nursemaiding long-running “ports” of actual content from one system to another. Content Porter has been through several releases over the years, and sometimes we’ve been painfully aware that getting it right has been one of the most difficult challenges for Tridion R&D to solve. Let’s face it: if you can design and implement all that dependency management without getting tied in knots, you’re doing pretty well. The commonest use-cases for Content Porter may not make much use of the dependency-management features, but those features still have to work for the occasions when you do need them.

Over time, we’ve also seen the architecture evolve. In the beginning, far more of the logic lived in the client; recent releases have moved most of it to the server, with much more robust verification to prevent problems with imports, not to mention support for transactions. These days the Content Porter client doesn’t do much more than let the user specify their selections and download or upload a Content Porter package created on the server.

The new core service API

With SDL Tridion 2013 SP1, the API that handles this on the server has now been exposed as part of the core service, and the next version of the Content Porter client will use this API. I picked up on this via a blog post from Eric Huiza, and then came across a series of posts by Anton Minko in which he walks through some simple scenarios for using the API. Anton’s examples are done in C# - which is great for some things, but I can’t help feeling that this all becomes a bit more useful if you can just do it from script. Scripting is closer to the comfort zone for the kinds of tasks this API will probably be used for.

So is this the end for Content Porter? Of course not - mostly it’s a question of the implementation being moved more and more into the core, but I also think the Content Porter client will remain very useful. Further down the page, I’m going to show a scripted approach, but you will immediately see that it’s at a “quick and dirty” level. I’m not handling service faults or anything like that. For anything approaching a sizeable port where you need reliability, there’s a lot to be said for using a client that comes as a properly shipped product from R&D. Otherwise the first little network glitch will probably have you cursing. Of course, there’s nothing to stop you from firing up Visual Studio and building a highly robust client to suit the needs of your own architecture, but you’ve got to walk before you can run. Here goes:

A Powershell approach

Perhaps the first thing to note is that although this is described as a core service API, the new features aren’t supported by the existing core service client assembly. There’s a new client assembly, and new endpoints. Actually, there are two assemblies - the second is referenced by the main client assembly, and presumably implements types which are common to server and client. On my system these were at “C:\Program Files (x86)\Tridion\bin\client\ImportExport”, but YMMV.

OK - so to begin with, we’ll need to load those two assemblies, and do some other general setup.

function InitImportExport{
    if (-not $global:ImportExportIsInitialized) {
        Add-Type -assemblyName System.ServiceModel
        Add-Type -Path 'C:\Program Files (x86)\Tridion\bin\client\ImportExport\Tridion.ContentManager.ImportExport.Client.dll'
        Add-Type -Path 'C:\Program Files (x86)\Tridion\bin\client\ImportExport\Tridion.ContentManager.ImportExport.Common.dll'
        Import-Module Reflection
        Import-Namespace Tridion.ContentManager.ImportExport
        Import-Namespace Tridion.ContentManager.ImportExport.Client
        $global:ImportExportIsInitialized = $true
    }
}

In addition to the ImportExport assemblies we also need System.ServiceModel. You can also see that I’m using the Poshcode Reflection module as described here. I’m using this to import two namespaces, which helps to keep the code readable. The Powershell Import-Module function is aware of modules that are already loaded, but Import-Namespace executes fully every time you call it, which can take a second or two, so I set a global flag to ensure we can call this init routine from most of our other functions without incurring unnecessary overhead.
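If you’d rather not take a dependency on the Poshcode Reflection module, the same setup works with fully-qualified type names instead of imported namespaces. This is just a sketch, assuming the same assembly locations as in the function above:

```powershell
# Alternative init without the Reflection module: load the assemblies only,
# and spell out the full namespaces at the call sites instead.
function InitImportExportPlain {
    if (-not $global:ImportExportIsInitialized) {
        Add-Type -AssemblyName System.ServiceModel
        Add-Type -Path 'C:\Program Files (x86)\Tridion\bin\client\ImportExport\Tridion.ContentManager.ImportExport.Client.dll'
        Add-Type -Path 'C:\Program Files (x86)\Tridion\bin\client\ImportExport\Tridion.ContentManager.ImportExport.Common.dll'
        $global:ImportExportIsInitialized = $true
    }
}

# Without Import-Namespace, later calls look like this (more verbose, but no
# extra module dependency):
# $selection = New-Object Tridion.ContentManager.ImportExport.SubTreeSelection 'tcm:3-1005-2',$false
```

The trade-off is purely readability: every `new-object` call in the later examples would then need its full `Tridion.ContentManager.ImportExport` prefix.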

So now we have the client proxy loaded, we can call the service, or we could if the config file were located in the right place. For Powershell work this is a bit awkward, so the usual approach is to do the same configuration from code. Here’s the standard configuration shipped by SDL so that we can see what we need to do:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <system.serviceModel>
    <bindings>
      <basicHttpBinding>
        <binding name="ImportExport_basicHttpBinding" maxReceivedMessageSize="41943040">
          <readerQuotas maxStringContentLength="41943040" maxArrayLength="41943040" />
          <security mode="TransportCredentialOnly">
            <!-- For LDAP or SSO authentication of transport credentials, use clientCredentialType="None" -->
            <transport clientCredentialType="Windows" />
          </security>
        </binding>
        <binding name="ImportExport_StreamDownload_basicHttpBinding" maxReceivedMessageSize="2147483648" transferMode="StreamedResponse" messageEncoding="Mtom" sendTimeout="00:30:00">
          <security mode="TransportCredentialOnly">
            <!-- For LDAP or SSO authentication of transport credentials, use clientCredentialType="None" -->
            <transport clientCredentialType="Windows" />
          </security>
        </binding>
        <binding name="ImportExport_StreamUpload_basicHttpBinding" maxReceivedMessageSize="2147483648" transferMode="StreamedRequest" messageEncoding="Mtom" receiveTimeout="00:30:00">
          <security mode="None" />
          <readerQuotas maxStringContentLength="2147483647" maxArrayLength="2147483647" />
        </binding>
      </basicHttpBinding>
    </bindings>
    <client>
      <endpoint name="basicHttp_2013" address="http://localhost/webservices/ImportExportService2013.svc/basicHttp" binding="basicHttpBinding" bindingConfiguration="ImportExport_basicHttpBinding" contract="Tridion.ContentManager.ImportExport.Client.IImportExportService" />
      <endpoint name="streamDownload_basicHttp_2013" address="http://localhost/webservices/ImportExportService2013.svc/streamDownload_basicHttp" binding="basicHttpBinding" bindingConfiguration="ImportExport_StreamDownload_basicHttpBinding" contract="Tridion.ContentManager.ImportExport.Client.IImportExportStreamDownload" />
      <endpoint name="streamUpload_basicHttp_2013" address="http://localhost/webservices/ImportExportService2013.svc/streamUpload_basicHttp" binding="basicHttpBinding" bindingConfiguration="ImportExport_StreamUpload_basicHttpBinding" contract="Tridion.ContentManager.ImportExport.Client.IImportExportStreamUpload" />
    </client>
  </system.serviceModel>
</configuration>

Here you can see the various bindings and endpoints - we’ll need all three bindings and all three endpoints. Note also that the configuration defaults to localhost, which may not be so useful in real life, but still - for this example, I’ve stuck with it. So here’s the same thing, wrapped up in a function that will hand me a client object of the correct type.

function get-ImportExportServiceClient {
param(
[parameter(Mandatory=$false)]
[AllowNull()]
[ValidateSet("Service","Upload","Download")]
[string]$type="Service"
)
	InitImportExport

	$binding = new-object System.ServiceModel.BasicHttpBinding
	$binding.MaxBufferPoolSize = [int]::MaxValue
	$binding.MaxReceivedMessageSize = [int]::MaxValue
	$binding.ReaderQuotas.MaxArrayLength = [int]::MaxValue
	$binding.ReaderQuotas.MaxBytesPerRead = [int]::MaxValue
	$binding.ReaderQuotas.MaxStringContentLength = [int]::MaxValue
	$binding.ReaderQuotas.MaxNameTableCharCount = [int]::MaxValue
	
	switch($type)
	{
		"Service" {
			$binding.Security.Mode = [System.ServiceModel.BasicHttpSecurityMode]::TransportCredentialOnly
			$binding.Security.Transport.ClientCredentialType = [System.ServiceModel.HttpClientCredentialType]::Windows
			$endpoint = new-object System.ServiceModel.EndpointAddress http://localhost/webservices/ImportExportService2013.svc/basicHttp
			new-object Tridion.ContentManager.ImportExport.Client.ImportExportServiceClient $binding,$endpoint
		}

		"Download" {
			$binding.Security.Mode = [System.ServiceModel.BasicHttpSecurityMode]::TransportCredentialOnly
			$binding.Security.Transport.ClientCredentialType = [System.ServiceModel.HttpClientCredentialType]::Windows
			$binding.TransferMode = [System.ServiceModel.TransferMode]::StreamedResponse
			$binding.MessageEncoding = [System.ServiceModel.WSMessageEncoding]::Mtom
			$endpoint = new-object System.ServiceModel.EndpointAddress http://localhost/webservices/ImportExportService2013.svc/streamDownload_basicHttp	
			new-object Tridion.ContentManager.ImportExport.Client.ImportExportStreamDownloadClient $binding,$endpoint
		}

		"Upload" {
			$binding.Security.Mode = [System.ServiceModel.BasicHttpSecurityMode]::None
			$binding.TransferMode = [System.ServiceModel.TransferMode]::StreamedRequest
			$binding.MessageEncoding = [System.ServiceModel.WSMessageEncoding]::Mtom
			$endpoint = new-object System.ServiceModel.EndpointAddress http://localhost/webservices/ImportExportService2013.svc/streamUpload_basicHttp
			new-object Tridion.ContentManager.ImportExport.Client.ImportExportStreamUploadClient $binding,$endpoint
		}
	}
}

The only difference here is that I’ve maxed-out the quotas. Feel free to use SDL’s values in your own implementation.  We’ll also need a few more utility functions, because the stream processing gets a bit verbose, and we also need to do some polling to wait for the imports and exports to be finished before we rely on them. The coding logic is lifted more or less directly from Anton’s examples.

function DownloadPackageToFile($packageId, $filePath){
	
	InitImportExport

	$downloadService = get-ImportExportServiceClient -type Download
	try {
		$packageStream = $downloadService.DownloadPackage($packageId, $true)	
		$fileStream = [IO.File]::Create($filePath)
		$packageStream.CopyTo( $fileStream)
	}	
	finally { 
		if ($fileStream -ne $null) {
			$fileStream.Dispose()
		}
		if ($packageStream -ne $null) {
			$packageStream.Dispose()
		}
	}
}

function UploadPackageFromFile($packageLocation){
	
	InitImportExport

	$uploadService = get-ImportExportServiceClient -type Upload
	try {
		$packageStream = [IO.File]::OpenRead($packageLocation)
		$uploadService.UploadPackageStream($packageStream)
	}
	finally {
		if ($packageStream -ne $null){
			$packageStream.Dispose()	
		}
	}
}

function WaitForImportExportFinish( $serviceClient, $processId){
	
	InitImportExport

	do {
		$processState = $serviceClient.GetProcessState($processId)
		if ([ProcessState]::Finished,[ProcessState]::Aborted,[ProcessState]::AbortedByUser -contains $processState){
			break;
		}
		sleep 1
	} while ($true)
	$processState.Value
}
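Polling forever is fine at an interactive console, but for unattended scripts you may want a timeout. Here’s a hedged variation on the polling function above - the timeout parameter is my own addition, not anything in the API:

```powershell
# Variation on WaitForImportExportFinish with a timeout, so an unattended
# script fails loudly instead of polling forever.
function WaitForImportExportFinishWithTimeout($serviceClient, $processId, $timeoutSeconds = 600){

	InitImportExport

	$deadline = (Get-Date).AddSeconds($timeoutSeconds)
	do {
		$processState = $serviceClient.GetProcessState($processId)
		if ([ProcessState]::Finished,[ProcessState]::Aborted,[ProcessState]::AbortedByUser -contains $processState){
			break;
		}
		if ((Get-Date) -gt $deadline){
			throw "Process $processId did not finish within $timeoutSeconds seconds"
		}
		sleep 1
	} while ($true)
	$processState.Value
}
```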

So - with the boilerplate code out of the way, we can continue on to the interesting parts. What does it take to execute an export and an import?

$impexp = get-ImportExportServiceClient
$contentFolderSelection = new-object SubTreeSelection "tcm:3-1005-2",$false
$exportInstruction = new-object ExportInstruction
$exportInstruction.BluePrintMode = [BluePrintMode]::ExportSharedItemsFromOwningPublication
($processId = $impexp.StartExport( @($contentFolderSelection),$exportInstruction))

WaitForImportExportFinish $impexp $processId

DownloadPackageToFile $processId "package.zip"
$uploadId = UploadPackageFromFile "package.zip"
$impexp = get-ImportExportServiceClient
$processId = $impexp.StartImport($uploadId, (new-object ImportInstruction))
$impexp.GetProcessState($processId)

WaitForImportExportFinish $impexp $processId

That’s just enough code to export the contents of my content folder, and then re-import it back in exactly the same place. Obviously, this is a trivial example, but the API gives you the possibility of doing pretty much anything you can do using Content Porter. The only thing I’ve noticed so far is that it’s not obvious how you would make a selection from within your import package. Anton suggests that it isn’t necessary, although he doesn’t say it’s impossible. If you find out how, please let me know and I’ll update this. (Actually, this limitation doesn’t bother me, because I always create the package exactly as I want to have it imported. There’s such a thing as turning too many of the dials at once. Still - I know some people who do use it.)
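As I said earlier, none of this handles service faults. For anything beyond a quick experiment you’d at least want to catch the fault, and close or abort the WCF client properly. Something along these lines - a sketch, not production code, reusing the `$contentFolderSelection` and `$exportInstruction` objects from the example above:

```powershell
# Minimal fault handling around an export. WCF clients should be closed when
# healthy, and aborted when faulted, rather than just left to the garbage
# collector.
$impexp = get-ImportExportServiceClient
try {
	$processId = $impexp.StartExport( @($contentFolderSelection),$exportInstruction)
	WaitForImportExportFinish $impexp $processId
}
catch [System.ServiceModel.FaultException] {
	Write-Error "The ImportExport service reported a fault: $($_.Exception.Message)"
}
finally {
	if ($impexp.State -eq [System.ServiceModel.CommunicationState]::Faulted) {
		$impexp.Abort()
	} else {
		$impexp.Close()
	}
}
```

A robust client would go further - retries, logging, transactional cleanup - which is exactly why the shipped Content Porter client still earns its keep.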

Conclusions

So how useful is this? I know we’ve been asking for it for years, but in principle you could achieve more or less the same thing with a saved configuration file and Content Porter. I think it’s an improvement to have a publicly supported API though - if only because it’s consistent with Tridion’s long-standing ethos of creating fully functional APIs which their own user interfaces then use. (I’m also hoping that mappings will be easier to do this way.) But in any case, just because it was theoretically possible to automate the management of your environments with the old system, that doesn’t mean that it isn’t a huge improvement to have the ability to work with imports and exports in a much cleaner way. The credibility of a system like SDL Tridion as an enterprise offering relies heavily on the notion that infrastructure support like this should “just work”. It’s great to see the Import-Export functionality confirmed as a first-class citizen.

Dominic Cronin
