Blog - Robert Bogue [MVP]
Rob's Notebook
Thursday, January 31, 2008

Garbage on a SharePoint Site’s Main Page

Occasionally I get a call from clients saying that the main page for a site has been "corrupted."  Upon further review I can see that it's actually a copy of a Word document.  It appears that they've accidentally saved a Word document as default.aspx.  Unfortunately the clients rarely have SharePoint Designer – within SPD it's a simple right-click (Revert to Site Definition) to fix this.

Fortunately, however, there's a way to fix this without SharePoint Designer.  There is a page in _layouts called reghost.aspx that allows you to reghost – or uncustomize – a page.  Append _layouts/reghost.aspx to the site's URL to get to it.  If you have a site at http://wss/foobarred, you would type http://wss/foobarred/_layouts/reghost.aspx.  On the page that appears you can reghost (or uncustomize, or revert) a single page – or all the pages in the site.  Enter the URL of the bad page, or ask for all of the pages, hit the Reset button, and the page will return to the format on the file system.
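If you'd rather script the fix – say, across a batch of sites – the object model exposes the same operation through SPFile.RevertContentStream.  Here's a minimal console sketch; it assumes WSSv3, a reference to Microsoft.SharePoint.dll, and placeholder URLs, so treat it as a starting point rather than a finished tool.

using System;
using Microsoft.SharePoint;

class ReghostPage
{
    static void Main(string[] args)
    {
        // Open the site and web that hold the customized (unghosted) page.
        using (SPSite site = new SPSite("http://wss/foobarred"))
        using (SPWeb web = site.OpenWeb())
        {
            // Grab the page that was overwritten.
            SPFile page = web.GetFile("default.aspx");

            // Revert it to the definition on the file system -- the same thing
            // reghost.aspx or SPD's Revert to Site Definition does.
            page.RevertContentStream();
        }
    }
}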

It’s a nice way to work around the cursing that happens when a page is accidentally overwritten.  (Both what appears on the screen and what happens behind the keyboard.)


Categories: Professional | 2 Comments
 
Monday, January 21, 2008

STSADM strikes again: Failed to extract the cab file in the solution...

Having more fun with STSADM ... I tried doing:

STSADM -o addsolution -filename XmlWebParts.wsp

and I received back this message:

Failed to extract the cab file in the solution.

When I looked, I had accidentally duplicated a line in the DDF file, so a file was being included in the cab twice with the same name – apparently SharePoint doesn't like that.
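For anyone who hasn't bumped into a DDF file, here's roughly what the offending file looked like.  The file names here are made up for illustration – the point is the repeated file line, which makecab was happy to build into the cab but which addsolution refused.

.OPTION EXPLICIT
.Set CabinetNameTemplate=XmlWebParts.wsp
.Set DiskDirectoryTemplate=CDROM
.Set CompressionType=MSZIP
.Set Cabinet=on
manifest.xml manifest.xml
XmlWebParts.dll XmlWebParts.dll
XmlWebPart.webpart XmlWebPart.webpart
; the next line is the accidental duplicate
XmlWebPart.webpart XmlWebPart.webpart

Delete the duplicate line, rebuild the cab with makecab /f, and addsolution is happy again.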

 


Categories: Professional | 10 Comments
 
Monday, January 21, 2008

And Now for Something Completely Different - How capitalization affects STSADM -o addsolution

Earlier this evening I was working on a project and put together a quick batch file to uninstall and reinstall a solution.  I issued the following command (in a batch file):

STSADM -o addsolution -filename XmlWebParts.WSP

and I got back in response:

“xmlwebparts.wsp” has an unsupported extension, and cannot be added to the solution store.

Having never seen this particular error from STSADM, I was intrigued.  I decided to try the following command:

STSADM -o addsolution -filename XmlWebParts.wsp

and received:

Operation completed successfully.

Say what?  It turns out that, for some reason, STSADM -o addsolution wants the .wsp extension in lowercase.

 


Categories: Professional | 2 Comments
 
Saturday, January 19, 2008

[ASP.NET] Provider Logging Project

Preparing for conferences is fun.  It forces me to put things together – to take concepts that I've used in developing client applications and package them in ways that other people can use.  That's why I'm happy to announce the availability of the Provider Logging project on CodePlex.  The Provider Logging project is a set of providers for ASP.NET.  These providers encapsulate another provider so that you can monitor the interaction between ASP.NET and your provider – or one of the out-of-the-box providers.

The Provider Logging project currently includes logging providers for Membership, Roles, and Site Map.  The intent is that logging providers for the other ASP.NET provider model options will be written too (I'm calling for volunteers).  All of the logging providers write their output via System.Diagnostics.Trace.  From there you can route the output through whatever trace listener you like; there's a sample ASP.NET web site included in the source code whose web.config is configured with a file-based listener.  Because they use System.Diagnostics.Trace you can choose where the output goes, and you can even filter it to get only the events you're interested in.
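To give a rough sense of the wiring involved: the logging provider gets registered as the default and pointed at the real provider, and a trace listener gives the Trace output somewhere to land.  The type and attribute names below are hypothetical stand-ins (check the project on CodePlex for the real ones), and the assembly-qualified type names are trimmed for readability.

<configuration>
  <system.web>
    <membership defaultProvider="LoggingMembershipProvider">
      <providers>
        <!-- Hypothetical names: the logging provider wraps the real one -->
        <add name="LoggingMembershipProvider"
             type="ProviderLogging.LoggingMembershipProvider"
             wrappedProviderName="AspNetSqlMembershipProvider" />
        <add name="AspNetSqlMembershipProvider"
             type="System.Web.Security.SqlMembershipProvider, System.Web"
             connectionStringName="LocalSqlServer" />
      </providers>
    </membership>
  </system.web>
  <system.diagnostics>
    <trace autoflush="true">
      <listeners>
        <!-- Send System.Diagnostics.Trace output to a text file -->
        <add name="providerLog"
             type="System.Diagnostics.TextWriterTraceListener"
             initializeData="provider.log" />
      </listeners>
    </trace>
  </system.diagnostics>
</configuration>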

One might wonder why the Provider Logging project was necessary.  There are two key reasons:

1)      A training tool – There's nothing like seeing how the interaction really works to help you build better providers.  The provider model and the calls that are made are documented, but the normal sequences of events are not documented very well; with the Provider Logging providers you can see the sequences as they happen.

2)      A diagnostic tool – ASP.NET isn't the best about explaining why it took, or didn't take, an action based on the response of the provider.  It blindly carries on.  If you're not getting the behavior you want, you're left guessing without much hope of figuring out what's going on.  This is particularly true in tools like Microsoft SharePoint that leverage ASP.NET.  (In fact, all of the providers in the initial release were written to debug problems with various providers as they were used in SharePoint.)

The Provider Logging project represents the core of the custom authentication presentations that I'm giving at the Office Developers Conference and SharePoint Connections.  We'll be tearing apart what happens when authentication providers are called by the ASP.NET framework – and what happens when they're called in SharePoint.  If you need to write a custom authentication provider and can't wait for those presentations, check out the Provider Logging project.  It's not a replacement for attending my custom authentication sessions, but it's a step in the right direction.


Categories: Professional | 0 Comments
 
Friday, January 18, 2008

Adventures in Migrating Content from Lotus Notes and SharePoint Backwards Compatible Document Libraries to WSSv3

I recently completed a project where I migrated Lotus Notes databases and SharePoint Portal Server 2003 Backwards Compatible Document Libraries (SPS2003BCDL) to WSSv3.  I learned a lot through the process and wanted to share how the experience went and where I ended up.  For those who aren't familiar, SharePoint Portal Server 2001 (SPS2001) used a document library based on the Exchange Web Storage System, sometimes called WSS – a name that became a problem when Windows SharePoint Services became the new name for SharePoint Team Services.  The Web Storage System had its issues, but it did have some great features.  Being based on the Exchange engine, it supported security, versioning, and user-specified document metadata.  What WSSv3 calls Content Types the Web Storage System called document profiles.

The problem with the Web Storage System was that it didn't perform well and it didn't scale well.  As a result, in SharePoint Portal Server 2003 Microsoft shifted to a SQL-based storage engine.  In the process, however, they lost per-item security and the ability to have any sort of profile for documents.  Because of the substantial feature removals in going to SQL, Microsoft kept the Web Storage System available as the "Backwards Compatible Document Libraries" and provided a set of migration tools from the BCDL to the new document libraries, but the limitations and the lack of profiles stopped many customers, including mine, from migrating their documents over.

So once WSSv3 was released with support for content types it was possible to migrate from the web storage system to WSSv3.  Thus the impetus for the migration project I just completed.

We tackled the migration as two separate pieces.

Notes Conversion

First up was the Lotus Notes migration.  There were five databases to be moved, two of which were really the same database with different statuses of the data.  They got collapsed into one database with an extra status field, because moving items from list to list isn't as easy in SharePoint as moving from database to database is in Notes.  Although I outsourced the actual migration, the tool used was Proposion.  Some interesting issues arose that I think everyone should consider when doing a migration:

1)      Make sure that the vendor you select to do the migration understands content types.  Content types are important when you consider how you’re going to find data in the long run.  The vendor I chose wasn’t initially familiar with content types.  You can read the whitepaper I wrote, Managing Enterprise Metadata with Content Types, if you want to know more about them and get a sense for how to use them.

2)      Make sure the vendor you select understands how to use features and solutions to deploy and provision content types.  If you just go into the user interface and create a content type, you'll find it difficult to move it between development and production.  If you create a feature that defines the content types, all you have to do is test the feature in the dev environment and then deploy it into the production environment.  Why a SharePoint solution?  That's the repeatable way of deploying a feature.  The vendor I selected didn't understand this either.  (There's a sketch of what such a feature looks like after this list.)

3)      Be clear about what your expectations are.  One of the databases was a collection of inspection forms.  I requested that the Notes database be moved into documents with properties/quick parts so that those properties would expose themselves to SharePoint.  The idea was that it would be a real, honest-to-goodness form that was searchable via SharePoint (just like the one I built in the Managing Enterprise Metadata with Content Types whitepaper), but in the end they couldn't figure it out, and I decided it wasn't worth pushing.  Apparently all the conversion tools convert Notes databases into list items.

4)      One tricky bit about this particular migration was that one of the Notes databases had links between the documents.  We ended up doing a two-pass migration where we converted the links to SharePoint links once we knew what the IDs of the SharePoint records were.

5)      Consider what the final appearance should look like.  Ultimately I created a set of web parts to get the XML of an item and then transform it with XSLT.  One would think there would be a way to do that with out-of-the-box web parts; however, the XML Web Part doesn't support data connections or parameter substitution, so it wouldn't work.  Instead I created a web part that emitted the XML for the record requested in the URL and exposed it via a web part connection.  I connected that to the connectable XML web part I wrote and everything fit together.
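To make item 2 concrete, here's a minimal sketch of a content type provisioned from a feature's element manifest.  The ID, column, and group names are hypothetical; the point is that the same feature can be activated in the dev environment and then deployed to production inside a solution instead of being hand-built in the UI.

<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <!-- Hypothetical site column used by the migrated inspection records -->
  <Field ID="{B2A6C2E1-4C3D-4E58-9C1A-0F4D2E7A1C11}"
         Name="InspectionStatus"
         DisplayName="Inspection Status"
         Type="Text"
         Group="Migration Columns" />
  <!-- Hypothetical content type derived from Document (0x0101) -->
  <ContentType ID="0x010100A1B2C3D4E5F60718293A4B5C6D7E8F90"
               Name="Inspection Record"
               Group="Migration Content Types">
    <FieldRefs>
      <FieldRef ID="{B2A6C2E1-4C3D-4E58-9C1A-0F4D2E7A1C11}" Name="InspectionStatus" />
    </FieldRefs>
  </ContentType>
</Elements>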

All in all the conversion was successful; however, it was certainly a lot more painful than it should have been.  Hopefully the next one won't be so hard, particularly since I've got a lot of the tools built for it already.

SharePoint Conversion

When we finished the Notes conversion I thought, great – the hard part is behind us; what's left is just time-consuming.  The library we were moving was 22 GB in the Web Storage System (about 17 GB on disk).  I thought it would be easy: I'd export the library using the open source SPMigration toolkit that Kimmo Forss built and made available via CodePlex.  After several attempts I managed to get a version of the tool that would run; there were some packaging issues which wouldn't allow the installer to run.

So I started running the tool against the datastore in a Virtual PC environment and it ran for more than 48 hours … so I gave up.  I bought a server to loan to the client for the migration.  That server was VERY fast, and did manage to create an export in 24 hours or so, but it clearly wasn't exporting everything.  Try after try led me from one error to another until finally I had to give up.  I found out from Kimmo that the dataset I was migrating was the largest single-shot migration the tools had ever been tried on, so far as he knew.

At this point I decided to evaluate Tzunami Deployer.  Without getting into too many specifics, I can say that its export tool for SharePoint 2001 (SPS2003BCDL) just didn't export all of the data I had.  I was instructed to do smaller migrations; however, this wasn't feasible given the structure of the data and my project restrictions.

I took the code that Kimmo had written, copied out everything I had to have to make the system function, and managed to get a prototype exporter created.  However, there were still folders where the export utility took too long and eventually timed out.  After some work I was able to adapt the structure to ask the Web Storage System to just give me the URLs for all of the items in a folder; from there I could make individual requests to get the metadata for each item.  One would think that this would be much slower; in the end it was radically faster – somewhere between 6x and 12x.  Apparently the Web Storage System doesn't like figuring out which properties to return dynamically, so that takes it a long time.  Ultimately the new server I loaned the client did the entire migration of the 17/22 GB in a little more than 2 hours.
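I won't claim this is how the SPMigration code talks to the store, but since the Web Storage System speaks WebDAV, the list-the-URLs-first-then-fetch-each-item pattern can be sketched roughly like this (the server URL, folder, and file names are placeholders):

using System;
using System.IO;
using System.Net;
using System.Text;

class ExportSketch
{
    // Issue a WebDAV PROPFIND against the Web Storage System and return the raw XML.
    static string PropFind(string url, string body, string depth)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "PROPFIND";
        request.Credentials = CredentialCache.DefaultCredentials;
        request.Headers.Add("Depth", depth);
        request.ContentType = "text/xml";

        byte[] bytes = Encoding.UTF8.GetBytes(body);
        using (Stream requestStream = request.GetRequestStream())
        {
            requestStream.Write(bytes, 0, bytes.Length);
        }
        using (WebResponse response = request.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        {
            return reader.ReadToEnd();
        }
    }

    static void Main()
    {
        // Pass 1: ask the folder for almost nothing -- just enumerate its children.
        string listing = PropFind(
            "http://sps2001/workspace/documents/",
            "<?xml version=\"1.0\"?><propfind xmlns=\"DAV:\"><prop><displayname/></prop></propfind>",
            "1");

        // Pass 2: for each href pulled out of the listing (parsing omitted here),
        // request the full property set for that one item only.
        string itemProperties = PropFind(
            "http://sps2001/workspace/documents/inspection42.doc",
            "<?xml version=\"1.0\"?><propfind xmlns=\"DAV:\"><allprop/></propfind>",
            "0");

        Console.WriteLine("Listing: {0} bytes, item: {1} bytes",
            listing.Length, itemProperties.Length);
    }
}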

I was originally going to help Kimmo update the tools by migrating the changes into the core code but ultimately decided that I had changed the structure enough that the changes would cause architectural ripples in the export program that I didn’t have time to retrofit.  Kimmo added it to his work list since he now knew why the program took so long to export.

With my new export in hand I wrote an import tool to process the metadata files and upload them to SharePoint – which led me to defects in the SPFileCollection.Add method.  The problems have been documented, but the core API issue remains the same.  In the end, though, the import tool works.  It imported versions, metadata, and document profiles.
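For the curious, the heart of an import like this boils down to something like the sketch below.  The site URL, file, and metadata names are placeholders, and the real tool does a great deal more (multiple versions, document profile to content type mapping, error handling); it's here just to show where SPFileCollection.Add and the metadata hashtable fit in.

using System;
using System.Collections;
using System.IO;
using Microsoft.SharePoint;

class ImportSketch
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://wss/sites/migrated"))
        using (SPWeb web = site.OpenWeb())
        {
            SPFolder folder = web.GetFolder("Shared Documents");

            // Metadata captured from the export; the field names are hypothetical.
            Hashtable properties = new Hashtable();
            properties["Title"] = "Inspection 42";
            properties["InspectionStatus"] = "Closed";

            byte[] contents = File.ReadAllBytes(@"C:\export\inspection42.doc");

            // Add the file along with its metadata, overwriting any existing copy.
            SPFile file = folder.Files.Add("inspection42.doc", contents, properties, true);

            // If the library requires check-out, commit the new file so it's visible.
            if (file.CheckOutStatus != SPFile.SPCheckOutStatus.None)
            {
                file.CheckIn("Imported from the Web Storage System");
            }
        }
    }
}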

Just in case anyone else is having the same problems, I’m available to help with migration projects from the web storage system to WSSv3 and MOSS.  I’ve got a set of import tools that work – and are quick.  (They were always designed to be flexible on the document profiles/content types they work with.) We ended up doing our migration during an evening – not even over a weekend.  I’d probably recommend a weekend migration, but our migration worked out just fine.

Categories: Professional | 10 Comments
 
Saturday, January 12, 2008

Come Play “Where’s Rob” at SharePoint Events

The next few months will be busy ones for me.  I've got a lot of great clients I'm working for, but that's not what is going to keep me busy – I've got a pretty jam-packed conference schedule.  I'm sharing it here just in case you want to try to play "Where's Rob?" (see Where's Waldo).  Here's the rundown of my spring:

February 1st-2nd: Sleepless in Chicago

·         Trainer/Presenter/Judge? (TBD)

February 4th-6th: SharePoint Information Worker Conference 2008

·         SharePoint Designer: When should you use it and how?

·         Connecting Metadata in Office and SharePoint

February 10th-13th: Office Developers Conference 2008

·         SharePoint Search and Office

·         Custom Authentication for SharePoint

March 3rd-6th: Microsoft Office SharePoint Conference 2008

·         [Tentative] SharePoint for the Developer and ITPro

April 20th-23rd: SharePoint Connections 2008 Spring

·         Workshop: SharePoint Workflows

·         Connect SharePoint Search and Office

·         Quick Integration from SharePoint to your Application

·         Custom Authentication for SharePoint
All of these events are going to deliver a great set of content.  It's amazing to see how the amount and depth of the content for SharePoint has grown over the last year.  I hope to see you at one of these events.


Categories: Professional | 0 Comments
 
Saturday, January 12, 2008

Import Profiles Only for Active Users

While working with a client recently we noticed that they were still seeing disabled accounts in the people search results.  That is, generally speaking, bad.  But it's actually pretty easy to fix with a tweak to the LDAP query being used to generate the profiles.  First we have to get there, so go to…

 

1)      Central Administration

2)      Shared Service Provider (the one that hosts user profiles)

3)      User profiles and properties

4)      View import connections

5)      Hover over the name of the connection you want to change and click Edit

There's an option in the Search Settings section titled User filter that probably contains:

(&(objectCategory=Person)(objectClass=User))

What we want is that, plus a part of the query that says "not account disabled."  It happens that account disabled is part of the userAccountControl bitmapped field in AD – which means it's not simple to determine whether a bit is set or not.  However, it's possible.  There is a TechNet "Hey, Scripting Guy!" article which answers the question "How Can I Get a List of All the Disabled User Accounts in Active Directory?"  It turns out the post has in it the magic key we need.

(userAccountControl:1.2.840.113556.1.4.803:=2)

If we wrap this up in a not and add it to our query, we get the results we want.  By the way, the funny number in the middle of that statement is just telling LDAP to use a bitwise AND – meaning only items where the account-disabled bit is set would be returned.  Since we want the reverse, we wrap it in a not, and we get a query that looks like this:

(&(objectCategory=Person)(objectclass=user)(!(userAccountControl:1.2.840.113556.1.4.803:=2)))
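If you'd like to sanity-check the filter before saving the import connection, one option – assuming the Windows AD command-line tools are on the server – is to run it through dsquery and confirm that the disabled accounts have dropped out of the results:

dsquery * domainroot -filter "(&(objectCategory=Person)(objectclass=user)(!(userAccountControl:1.2.840.113556.1.4.803:=2)))" -attr sAMAccountName -limit 0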

Immediately after doing this and running a profile import you may be thinking that the disabled users should be gone – unfortunately, no.  But that's an artifact of search.

Search doesn't remove an entry until the entry has been missing for three full imports in a row.  The thinking is that a site might be temporarily offline during the index, and it would be bad to remove it from the index just for a bit of bad timing.  So if you want the disabled users out of the search results, do three full imports and they should disappear.


Categories: Professional | 0 Comments
 
 
Tuesday, January 01, 2008

Rebuild the WSSv2 Full Text Index

I found an old walk through for rebuilding the WSSv2 Full Text Index.  I've wrapped it up into a PDF and uploaded it here.  (http://www.thorprojects.com/files/RebuildtheWSSv2FullTextIndex.pdf)


Categories: Professional | 0 Comments
 
Tuesday, January 01, 2008

Why SharePoint isn't Perfect!

I rarely engage in debates with lunatics.  A wise man once told me that anyone who argues with an insane person is themselves insane.  As a result I've had a natural bias against engaging in religious wars directly.  When I do engage in religious wars (see "Open Source Software on the Desktop – Is it Right for You") I generally do my homework.  I try to see as many sides of the equation as I can.  In the case of Linux vs. Windows, I ran both on my servers and on my workstations.  I ran them both for years before writing about it.  When I had reached my conclusions I validated them with someone who held a seemingly contrary position.  I did all of this because I try to avoid the insanity and stick with the facts.

I do currently host a few DotNetNuke sites on one of my servers.  Nothing really big, but certainly enough to give me a sense for what the technology does.  I also did a tech edit for Beginning DotNetNuke Skinning and Design.  I certainly am not the foremost authority on DotNetNuke; however, I do know a few people now who are authorities on the technology.  I learned a lot from my experiences with the product while getting it set up – and in working with it for the book.

I have spent a bit of time with Community Server – not nearly enough – but I have great respect for Rob Howard and the guys at Telligent.  Do I believe that I can accurately articulate what Community Server does and does not do well – nope.  However, am I still trying to learn more about it – yep!

Why am I making a point to tell you about the related and semi-related products that I'm paying attention to – and how I'm paying attention to them?  Honestly, because I'm hoping that before writing another "SharePoint Sucks" blog post someone will pause and really understand the platform first.  Even the short snippets I write about SharePoint I try to write with a balanced hand.  My End Bracket article "You Should Learn SharePoint" in MSDN Magazine didn't just mention the positives; I also highlighted some of the real issues that developers face.

I guarantee that there are issues with SharePoint.  I’ve got a list longer than my arm of things I want to see changed.  Some of them I communicate to MS – and some of them I silently live with – because I realize that there’s a reason why they did something the way they did and it’s better for the product to do it the way that they did – even if it inconveniences me, my scenario, or my customer.

I'm not saying that there aren't truly bonehead things in SharePoint – there are.  I invite you to try to find them – and to tell the product team about them when you do.  (If you can't reach them, I'm happy to hear from you and pass along reasonable, well-articulated issues for you.)  What I'm saying is that for every bonehead thing I find, I find dozens of other things that exist for a reason.

The thing is that product development isn’t about absolutes.  No product team knows what everyone will do – or want to do – with the product.  They have finite resources and finite input on a product.  They have to compromise and build the best product they can.  It will have warts.  It will have gaps.  It will have bad spots.  The trick isn’t creating a product without these – the trick is putting them in the least intrusive places.

So let me hit a few common complaints about SharePoint and provide my canned responses, so you can skip the time of writing a half-thought-out blog post if that's your intention.

1)      It's too complex – You're right.  Anything that is flexible is.  SharePoint is very flexible and is therefore necessarily complex.  I'm not going to insult your intelligence by telling you that it's easy – it's not.  I learn more about it every day.  However, it's a big product family, and you can stay tactically focused and only learn the things that apply to your situation – if you want.  Here's a quick list of questions and answers on the complexity of SharePoint, in no particular order.

a.       Do you have to know XSLT to work with SharePoint? – Not really, unless you want to customize the out-of-the-box web parts.

b.      Do you have to know XML to work with SharePoint? – Probably.

c.       Does that XML knowledge surpass most people's understanding? – At times, yes.  Most folks don't understand namespaces in XML, and in some places they're important to SharePoint.  This occurs most notably in web part configuration (.webpart) files.

d.      Do you have to understand Code Access Security? – Yes, but frankly if you're developing a web application in ASP.NET you should know this anyway.  You don't have to write CAS policies if you're willing to deploy everything to the GAC.

e.      Do you have to know ISA Server, Active Directory, Indexing, SQL Server, IIS, etc.? – No, not really.  Sure, it helps, but I wouldn't say that you "have" to know these things.

f.        Do you have to know Master Pages? – Only if you want to brand the pages or change the layout.  If you don't worry about that for your ASP.NET applications, then you don't have to worry about it with SharePoint.

g.       Is it hard to install? – Nope.  It's a standard Windows installer where you hit Next a bunch of times and it finishes.  Sure, if you install it in a complex configuration it will prompt you for a fair amount of data, but by and large it will configure itself if you're willing to take the defaults.

2)      It doesn't do X as well as Y – One of the things that I run into repeatedly is the thinking that SharePoint should be better than targeted applications.  These applications have one niche (or at least narrow) market that they're targeted at.  They solve one problem.  They're not flexible.  They can't be configured or reconfigured by users.  However, they do their one thing better than any other application.  When you compare SharePoint side-by-side with applications like this you can find all sorts of ways where SharePoint doesn't measure up.  However, that misses the point.  SharePoint is a general purpose tool that can be made to do several things – in many cases well above the "good enough" bar that makes it a viable solution.  Sure, the dedicated application may do a better job with some things.  There's no doubt.  However, what's the better answer from an organizational perspective: one product that solves many problems, is flexible, and has tight integration with Office – or a set of individual products that may each do slightly better at solving one or two problems but come with a set of integration, synchronization, and disaster recovery issues?  There isn't one answer here – just a realization that you have to think about the greater good.

3)      SharePoint didn't eliminate my need to organize my information – I love this one.  It always leads me to Fred Brooks' "No Silver Bullet: Essence and Accidents of Software Engineering."  The reason is that there are no silver bullets.  SharePoint won't eliminate the need to organize information.  It will minimize the need for fully defined content taxonomies by leveraging full-text search, and it will create new opportunities for organization by allowing multi-dimensional, metadata-based organization and by improving context.  If you have a bad taxonomy today, SharePoint will make it easier to find the information you're looking for – but it won't make it possible to find everything every time.

4)      SharePoint didn't automate my company – No, SharePoint doesn't do the truly hard work; it allows the hard work to happen.  SharePoint won't magically define all of your workflows.  It won't define metadata for each of your document types.  It won't establish information rights management policies across the organization.  However, since it can do these things, it becomes possible to have the hard conversations that lead to the results you want.  With traditional network share technologies there's no opportunity for workflow, added metadata, or automatic assignment of information rights management policies, so there's little reason to even entertain the conversations.  It's honestly a chicken-and-the-egg problem: you can't define the processes until you have a tool that can implement them, and you can't fully implement the tool until you have the processes.  Jason Mraz has a song, Life Is Wonderful, that has much better examples than the chicken and the egg, but they're less well known.

 

I'm assuming that if you've read this far you're saying "but you're just defending SharePoint."  I'll take that criticism, but I would ask that you look back at my responses – I'm really just defending the process of building software.  SharePoint is a flexible, general purpose, highly integrated tool.  It's not for every situation, it won't do the hard work of building taxonomies, and it won't automate your company.

 

For those developers who want to insist that it’s a hard platform to develop for, I agree within context.  Today we’re used to IDEs that do all of the heavy lifting for us.  We absolutely expect that we can hit F5 and compile, deploy, and test our work.  It’s not that way with SharePoint.  Absolutely we need better development tools for SharePoint.  However, that’s fixable without condemning SharePoint.

 

I “grew up” professionally programming on a VAX and on a PC in C.  I learned how to use print statements to know what my software was doing.  I didn’t have interactive debuggers.  I didn’t have the ability to set breakpoints.  I learned how to develop without these tools.  The problem is that most developers today never learned these skills.  Because of that if there isn’t an interactive debugger developers don’t know what to do.  I feel sorry for them.  I don’t like debugging with print statements – but I feel like it makes me a better developer to know how – and when – to do this.

 

So SharePoint isn’t perfect, but what is?


Categories: Professional | 0 Comments