
Adventures in Migrating Content from Lotus Notes and SharePoint Backwards Compatible Document Libraries to WSSv3

I recently completed a project where I migrated Lotus Notes databases and SharePoint Portal Server 2003 Backwards Compatible Document Libraries (SPS2003BCDL) to WSSv3. I learned a lot through the process and wanted to share how the experience went and where I ended up. For those who aren't familiar, SharePoint Portal Server 2001 (SPS2001) used a document library based on the Exchange Web Storage System, sometimes called WSS, which became a problem when Windows SharePoint Services became the new name for SharePoint Team Services. The Web Storage System had its issues, but it did have some great features. Being based on the Exchange engine, it supported security, versioning, and user-specified document metadata. What WSSv3 calls content types the Web Storage System called document profiles.

The problem with the web storage system was that it didn't perform well and it didn't scale well. As a result, in SharePoint Portal Server 2003 Microsoft shifted to a SQL-based storage engine. However, in the process they lost per-item security and the ability to have any sort of profile for documents. Because of the substantial feature removals in going to SQL, they made the web storage system available as an optional install, the "Backwards Compatible Document Libraries" (BCDL), and provided a set of migration tools from the BCDL to the new document libraries. But the limitations and the lack of profiles stopped many customers, including my customer, from migrating their documents over.

So once WSSv3 was released with support for content types, it was possible to migrate from the web storage system to WSSv3. Thus the impetus for the migration project I just completed.

We tackled the migration as two separate pieces.

Notes Conversion

First up was the Lotus Notes migration. There were five databases to be moved, two of which were really the same database with different statuses of the data. They got collapsed into a single list with an extra status field, because moving items from list to list isn't as easy in SharePoint as moving documents from database to database is in Notes. Although I outsourced the actual migration, the tool used was Proposion. Some interesting issues arose that I think everyone should consider when doing this kind of migration.

  1. Make sure that the vendor you select to do the migration understands content types.  Content types are important when you consider how you’re going to find data in the long run.  The vendor I chose wasn’t initially familiar with content types.  You can read the whitepaper I wrote, Managing Enterprise Metadata with Content Types, if you want to know more about them and get a sense for how to use them.
  2. Make sure the vendor you select understands how to use features and solutions to deploy and provision content types.  If you just go into the user interface and create a content type, you'll find it difficult to move it between development and production.  If you create a feature that defines the content types, all you have to do is test the feature in the dev environment and then deploy it into the production environment.  Why a SharePoint solution?  That's the repeatable way of deploying a feature.  The vendor I selected didn't understand this.  (The first sketch after this list shows the general shape of provisioning a content type from a feature.)
  3. Be clear about what your expectations are.  One of the databases was a collection of inspection forms.  I requested that the Notes database be moved into documents with properties/quick parts so that those properties would expose themselves to SharePoint.  The idea was that each record would be a real, honest-to-goodness form that was searchable via SharePoint (just like the one I built in Managing Enterprise Metadata with Content Types), but in the end they couldn't figure it out, and I decided that it wasn't worth pushing.  Apparently all of the conversion tools convert from Notes databases into list items.
  4. One tricky bit about this particular migration was that one of the Notes databases had links between the documents.  We ended up doing a two-pass migration where we converted the links to SharePoint links once we knew what the IDs of the SharePoint records were.
  5. Consider what the final appearance should look like.  Ultimately I created a set of web parts to get the XML of an item and then transform it with XSLT.  One would think that there would be a way to do that with out-of-the-box web parts; however, the XML web part doesn't support data connections or parameter substitution, so it wouldn't work.  Instead I created a web part that emitted the XML for the record requested in the URL via a web part connection.  I connected that to the connectable XML web part I wrote and everything fit together.  (The second sketch after this list shows the shape of that connection.)
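
For the feature point above (item 2), the content type is normally declared in the feature's element manifest and shipped in a solution package. As a rough illustration of the same idea in code, here is a minimal sketch of a feature receiver that provisions a content type when the feature is activated. The content type name, group, and status field are hypothetical, not the ones from this project.

```csharp
using System;
using Microsoft.SharePoint;

// Sketch only: provisions a hypothetical "Inspection Record" content type when a
// web-scoped feature is activated. In a real solution the content type would
// usually be declared in the feature's elements.xml; this is the same idea in code.
public class ContentTypeProvisioner : SPFeatureReceiver
{
    public override void FeatureActivated(SPFeatureReceiverProperties properties)
    {
        SPWeb web = properties.Feature.Parent as SPWeb;
        if (web == null)
        {
            return;
        }

        // A site column to carry the migrated status value (hypothetical name).
        if (!web.Fields.ContainsField("InspectionStatus"))
        {
            web.Fields.Add("InspectionStatus", SPFieldType.Text, false);
        }

        // The content type itself, derived from the built-in Document type.
        if (web.ContentTypes["Inspection Record"] == null)
        {
            SPContentType ct = new SPContentType(
                web.AvailableContentTypes[SPBuiltInContentTypeId.Document],
                web.ContentTypes,
                "Inspection Record");
            ct.Group = "Migration Content Types";

            ct = web.ContentTypes.Add(ct);  // persist it on the web
            ct.FieldLinks.Add(new SPFieldLink(web.Fields["InspectionStatus"]));
            ct.Update();                    // push the field link to the stored type
        }
    }

    public override void FeatureDeactivating(SPFeatureReceiverProperties properties) { }
    public override void FeatureInstalled(SPFeatureReceiverProperties properties) { }
    public override void FeatureUninstalling(SPFeatureReceiverProperties properties) { }
}
```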
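
For the web part point (item 5), the pair boiled down to a provider that emits an item's XML and a consumer that pushes it through XSLT over a web part connection. The sketch below only illustrates that connection pattern on the ASP.NET web part framework; the interface name, the ID query string parameter, and the XSLT path are assumptions, not the actual parts I built.

```csharp
using System;
using System.IO;
using System.Web;
using System.Web.UI;
using System.Web.UI.WebControls.WebParts;
using System.Xml.XPath;
using System.Xml.Xsl;
using Microsoft.SharePoint;

// Hypothetical connection contract: the provider hands raw item XML to the consumer.
public interface IItemXmlProvider
{
    string ItemXml { get; }
}

// Provider: looks up the list item identified by the ID query string parameter
// (an assumption about how the record is addressed) and exposes its XML.
public class ItemXmlProviderPart : WebPart, IItemXmlProvider
{
    public string ListName;   // configured through the tool pane in a real part

    public string ItemXml
    {
        get
        {
            int id;
            if (string.IsNullOrEmpty(ListName) ||
                !int.TryParse(HttpContext.Current.Request.QueryString["ID"], out id))
            {
                return "<item/>";
            }

            SPListItem item = SPContext.Current.Web.Lists[ListName].GetItemById(id);
            return item.Xml;   // the ows_* attribute XML for the row
        }
    }

    [ConnectionProvider("Item XML")]
    public IItemXmlProvider GetItemXmlProvider()
    {
        return this;
    }
}

// Consumer: applies an XSLT file to whatever XML the provider supplies.
public class XsltRenderPart : WebPart
{
    private IItemXmlProvider provider;

    public string XsltPath;   // a physical path on the server; also an assumption

    [ConnectionConsumer("Item XML")]
    public void SetItemXmlProvider(IItemXmlProvider value)
    {
        provider = value;
    }

    protected override void RenderContents(HtmlTextWriter writer)
    {
        if (provider == null || string.IsNullOrEmpty(XsltPath))
        {
            return;
        }

        XslCompiledTransform xslt = new XslCompiledTransform();
        xslt.Load(XsltPath);

        XPathDocument doc = new XPathDocument(new StringReader(provider.ItemXml));
        xslt.Transform(doc, null, writer);
    }
}
```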

All in all, the conversion was successful; however, it was certainly a lot more painful than it should have been.  Hopefully the next one won't be so hard, particularly since I've already got a lot of the tools built for it.

SharePoint Conversion

When we finished the Notes conversion I thought, great, the hard part is behind us; what's left is just time consuming.  The library we were moving was 22 GB in the Web Storage System (about 17 GB on disk).  I thought it would be easy: I'd export the library using the open-source SPMigration toolkit that Kimmo Forss built and made available via CodePlex.  There were some packaging issues that wouldn't allow the installer to run, but after several attempts I managed to get a version of the tool running.

So I started running the tool against the data store in a Virtual PC environment and it ran for more than 48 hours … so I gave up.  I bought a server to loan to the client for the migration.  That server was VERY fast and did manage to create an export in 24 hours or so, but it clearly wasn't exporting everything.  Try after try led me from one error to another until finally I had to give up.  I found out from Kimmo that the dataset I was migrating was the largest single-shot migration the tools had ever been tried on, so far as he knew.

At this point I decided to evaluate Tzunami Deployer.  Without getting into too many specifics, I can say that its export tool for the SharePoint 2001-style store (the SPS2003 BCDL) just didn't export all of the data I had.  I was instructed to do smaller migrations; however, that wasn't feasible given the structure of the data and my project constraints.

I took the code that Kimmo had written and copied out everything I needed to make the system function, and I managed to get a prototype exporter created.  However, there were still folders where the export utility took too long and eventually timed out.  After some work I was able to adapt the structure to ask the web storage system to just give me the URLs for all of the items in a folder; from there I could make individual requests to get the metadata for each item.  One would think that this would be much slower, but in the end it was radically faster.  When I say radically, I mean somewhere between 6x and 12x faster.  Apparently the web storage system doesn't like figuring out what properties to return dynamically, so folder-wide requests take it a long time.  Ultimately the new server I loaned the client did the entire 22 GB (17 GB on disk) migration in a little more than 2 hours.  The two-step pattern is sketched below.
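
For anyone curious what that two-step export looks like, the Web Storage System is reachable over WebDAV, so the pattern can be sketched that way. Treat this strictly as an illustration: the real exporter is built from Kimmo's SPMigration code, and the URLs, credentials, and property requests here are placeholders rather than what the tool actually does.

```csharp
using System;
using System.IO;
using System.Net;
using System.Text;
using System.Xml;

// Sketch of the two-step export pattern against a WebDAV-capable store such as
// the Web Storage System: step 1 asks a folder only for the URLs of its items,
// step 2 fetches each item's properties individually.
public static class WssExportSketch
{
    static string PropFind(string url, string body, string depth)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "PROPFIND";
        request.Credentials = CredentialCache.DefaultCredentials;
        request.Headers.Add("Depth", depth);
        request.ContentType = "text/xml";

        byte[] bytes = Encoding.UTF8.GetBytes(body);
        using (Stream stream = request.GetRequestStream())
        {
            stream.Write(bytes, 0, bytes.Length);
        }

        using (WebResponse response = request.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        {
            return reader.ReadToEnd();
        }
    }

    public static void ExportFolder(string folderUrl)
    {
        // Step 1: ask only for the children's hrefs (Depth: 1), nothing else.
        const string listBody =
            "<?xml version=\"1.0\"?>" +
            "<D:propfind xmlns:D=\"DAV:\"><D:prop><D:displayname/></D:prop></D:propfind>";
        string listing = PropFind(folderUrl, listBody, "1");

        XmlDocument doc = new XmlDocument();
        doc.LoadXml(listing);
        XmlNamespaceManager ns = new XmlNamespaceManager(doc.NameTable);
        ns.AddNamespace("D", "DAV:");

        // Step 2: one small request per item for its properties.
        foreach (XmlNode href in doc.SelectNodes("//D:response/D:href", ns))
        {
            string itemUrl = href.InnerText;
            if (itemUrl.TrimEnd('/').Equals(folderUrl.TrimEnd('/'),
                StringComparison.OrdinalIgnoreCase))
            {
                continue;   // WebDAV also returns the folder itself; skip it
            }

            const string itemBody =
                "<?xml version=\"1.0\"?>" +
                "<D:propfind xmlns:D=\"DAV:\"><D:allprop/></D:propfind>";
            string properties = PropFind(itemUrl, itemBody, "0");

            // The real tool writes the properties to a metadata file alongside the
            // exported document; here we just note the item.
            Console.WriteLine("Exported metadata for {0} ({1} bytes of XML)",
                itemUrl, properties.Length);
        }
    }
}
```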

I was originally going to help Kimmo update the tools by migrating the changes into the core code but ultimately decided that I had changed the structure enough that the changes would cause architectural ripples in the export program that I didn’t have time to retrofit.  Kimmo added it to his work list since he now knew why the program took so long to export.

With my new export in hand I wrote an import tool to process the metadata files and upload them to SharePoint, which led me to defects in the SPFileCollection.Add method.  They've documented that it doesn't work correctly, but the core API issue remains the same.  In the end, though, the import tool works: it imported versions, metadata, and document profiles.  A trimmed-down sketch of the upload step follows.
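
The upload step itself boils down to adding the file bytes and then stamping the metadata onto the resulting item, which is also what a couple of the comments below are asking about. This is only a trimmed-down sketch: the site URL, library path, content type, and field names are placeholders, and the version replay is omitted.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using Microsoft.SharePoint;

// Minimal sketch of the upload step: add the document, then set its metadata
// through the list item. The real tool also replays versions and maps the old
// document profiles onto content types.
public static class ImportSketch
{
    public static void ImportDocument(string siteUrl, string libraryFolder,
        string localFile, IDictionary<string, object> metadata)
    {
        using (SPSite site = new SPSite(siteUrl))
        using (SPWeb web = site.OpenWeb())
        {
            SPFolder folder = web.GetFolder(libraryFolder);
            byte[] contents = File.ReadAllBytes(localFile);

            // Add (or overwrite) the file. Another Add() overload takes the created
            // and modified users and dates, which is how the original timestamps
            // from the export can be preserved.
            string fileUrl = folder.Url + "/" + Path.GetFileName(localFile);
            SPFile file = folder.Files.Add(fileUrl, contents, true);

            // Set the metadata through the item rather than passing a hashtable to
            // Add(); the columns have to exist on the library already.
            SPListItem item = file.Item;

            SPContentType ct = item.ParentList.ContentTypes["Inspection Record"];
            if (ct != null)
            {
                item["ContentTypeId"] = ct.Id;   // hypothetical content type
            }

            foreach (KeyValuePair<string, object> pair in metadata)
            {
                item[pair.Key] = pair.Value;     // e.g. "InspectionStatus" = "Approved"
            }

            item.SystemUpdate(false);   // persist without creating another version
        }
    }
}
```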

Just in case anyone else is having the same problems, I’m available to help with migration projects from the web storage system to WSSv3 and MOSS.  I’ve got a set of import tools that work – and are quick.  (They were always designed to be flexible on the document profiles/content types they work with.) We ended up doing our migration during an evening – not even over a weekend.  I’d probably recommend a weekend migration, but our migration worked out just fine.

10 Comments

  1. Great article. I am doing a large Lotus migration to MOSS. Will you please send me or give me the links' URLs?

    I appreciate it.

    thanks,

    –syed

  2. You must make sure that the columns you’re trying to add are on the list. Also, make sure you call SPListItem.Update().

  3. Nice job :) I’m currently in a MOSS conversion project and my import tool is stuck on the SPFileCollection.Add bugs. Any suggestions on getting the metadata in? I’m currently using hashtables; the files are flowing in but the metadata isn’t.
    -Geirs

  4. Don’t try to do it with hashtables. Set the values through the properties of the item, and use the Add() overload that takes the Created and Modified dates to handle those.

  5. Hi,
    It’s a great article, but one thing I’m not very clear about: would this work for converting any kind of data that resides in Lotus Notes, or only something specific? What if I have a custom application in Lotus Notes and I want to migrate its data into a SharePoint list? Will I be able to migrate it with the above procedure?

  6. Great Job!!! This article is very nice. I will be doing Migrations shortly and will keep in mind the points mentioned in this article. Thanks.

  7. Looking at moving from Lotus Notes 6.5.5 to SharePoint WSS 3.0. Will try to use some of your notes.

  8. Great insight, great article, and thanks for sharing it.
    How do I subscribe to your blog?

