
Sending HTML Emails with System.Net.Mail.MailMessage is more than IsBodyHtml

It seems like lately I’ve been running into a series of things that should be easier than they’re turning out to be. I’m not sure exactly why that is – but I’ve got another fun one to share.

So System.Net.Mail.MailMessage is the object (along with SmtpClient) used to send emails. There's a single property on MailMessage called IsBodyHtml that is supposed to indicate that the body of the message should be treated as HTML. That's all fine and good except that, in my experience, it doesn't work consistently. In fact, it rarely works for me. So how do you make it work? Well, you add alternate views. MailMessage has a property, AlternateViews, which is a collection of alternate renderings that can be displayed for the message. In order to get an HTML view I had to add a new AlternateView with the content type text/html – that is, System.Net.Mime.MediaTypeNames.Text.Html. So when I want HTML I add this code to the bottom of the place where I'm sending the message:

AlternateView av = AlternateView.CreateAlternateViewFromString(msg.Body, new System.Net.Mime.ContentType(System.Net.Mime.MediaTypeNames.Text.Html));

av.TransferEncoding = System.Net.Mime.TransferEncoding.SevenBit;

msg.AlternateViews.Add(av);

I also generally add a plain-text view for those email readers that can't cope with HTML, which can be done by replacing the content type with text/plain (MediaTypeNames.Text.Plain) like this:

AlternateView avPlain = AlternateView.CreateAlternateViewFromString(msg.Body, new System.Net.Mime.ContentType(System.Net.Mime.MediaTypeNames.Text.Plain));

avPlain.TransferEncoding = System.Net.Mime.TransferEncoding.SevenBit;

msg.AlternateViews.Add(avPlain);
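Putting both views together, here's a minimal end-to-end sketch. The addresses and SMTP host are placeholders, and I'm assuming the body string contains the HTML; note that in a multipart/alternative message clients prefer the last view they can render, so the HTML view is added after the plain-text one:

```csharp
using System.Net.Mail;
using System.Net.Mime;

class HtmlMailSample
{
    static void Main()
    {
        // Placeholders -- substitute real addresses and server.
        MailMessage msg = new MailMessage("from@example.com", "to@example.com");
        msg.Subject = "Status report";
        msg.Body = "<html><body><b>Hello</b></body></html>";

        // Plain-text view first: clients pick the last alternative
        // they understand, so the HTML view should go last.
        AlternateView avPlain = AlternateView.CreateAlternateViewFromString(
            "Hello", new ContentType(MediaTypeNames.Text.Plain));
        avPlain.TransferEncoding = TransferEncoding.SevenBit;
        msg.AlternateViews.Add(avPlain);

        AlternateView avHtml = AlternateView.CreateAlternateViewFromString(
            msg.Body, new ContentType(MediaTypeNames.Text.Html));
        avHtml.TransferEncoding = TransferEncoding.SevenBit;
        msg.AlternateViews.Add(avHtml);

        new SmtpClient("smtp.example.com").Send(msg);
    }
}
```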

Hopefully this will make it easier to figure out what is wrong when you are trying to send HTML email messages and it isn’t working.

OnWorkflowItemChanged and Workflow Event Delivery problems

One of the problems that I recently ran across was that I had a workflow that would stop getting events delivered to it while it was running. It wasn’t clear what was going on but with some help from Eilene Hao Klaka and Gabe Hall we were able to sort out the root issue. So let’s take a rather simple workflow:

The key here is that the workflow watches the item using OnWorkflowItemChanged, then stops watching the item while it works on watching some tasks, and then resumes watching the item. This looks pretty harmless and it is – right up to the point where someone makes a change to the document while the workflow is looking for task updates – but not item updates. When this happens the workflow gets the event, can't process it – and ultimately it's no longer runnable. It won't respond to any more events or take any more actions. Making matters worse, you don't so much as get a message in the ULS log that this has happened.

So what's going on? The short of it is that the workflow host and Workflow Foundation believe that you need events on the item in the middle of this workflow, because the scope of the correlation token for OnWorkflowItemChanged is the workflow itself. So even though there's no activity listening – the WF host and runtime believe there should be. The first thought might be to change the scope of the correlation token for OnWorkflowItemChanged to something smaller – but that won't work because OnWorkflowItemChanged expects the token created at the scope of the workflow.

However, there is another approach that will work. You can set up a subscription for your own event by using the CallExternalMethodActivity, selecting the InterfaceType of Microsoft.SharePoint.Workflow.IListItemService and selecting the method name of InitializeForEvent. You'll also need to provide an id ([Updated] the Guid of the item the workflow is running against, workflowProperties.Item.Guid), the itemId (which in our case we can get from workflowProperties.ItemId), and the listId (again, we can get this from workflowProperties.ListId). This sets up a subscription for events. The second activity we need is HandleExternalEventActivity. In this activity we select the same InterfaceType of Microsoft.SharePoint.Workflow.IListItemService and select an EventName of OnItemChanged. This will get signaled when the item is changed. The beauty of this is that you can place these two activities inside of a sequence activity and set the correlation token to the scope of the sequence activity. When the correlation token falls out of scope, the subscription for the events will automatically be removed.
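For those who'd rather see the shape of this in code than in the designer, here's a rough sketch of the activity pair. The parameter bindings and correlation-token wiring are normally generated by the designer, so they're only indicated in comments here:

```csharp
using System.Workflow.Activities;
using Microsoft.SharePoint.Workflow;

// Sketch only: the designer normally generates this wiring, including
// the ParameterBindings for id, itemId, and listId.
SequenceActivity watchItem = new SequenceActivity("watchItem");

CallExternalMethodActivity subscribe = new CallExternalMethodActivity();
subscribe.InterfaceType = typeof(IListItemService);
subscribe.MethodName = "InitializeForEvent";
// Bind id to workflowProperties.Item.Guid, itemId to
// workflowProperties.ItemId, and listId to workflowProperties.ListId.

HandleExternalEventActivity onChanged = new HandleExternalEventActivity();
onChanged.InterfaceType = typeof(IListItemService);
onChanged.EventName = "OnItemChanged";

watchItem.Activities.Add(subscribe);
watchItem.Activities.Add(onChanged);
// Scope the correlation token for both activities to watchItem so the
// event subscription is removed when the sequence completes.
```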

I've bundled this into a custom sequence activity. The sequence activity that I've prepared will allow you to bind the SPWorkflowActivationProperties (workflowProperties), from which the activity will automatically get the listId and itemId – or you can bind them individually. I've also allowed you to bind the subscription id – while I don't know why you would need this, I was trying to be complete. Finally, I also added an Invoked method so you can run code based on the event happening. CAUTION: I'm not setting the sender when I call your method, so you won't want to use this for anything that requires the sender object (i.e. for those cases when you need to know what branch you're in). This is the same problem that OnWorkflowItemChanged/OnTaskChanged has – so I didn't see this as a big issue. The activity looks like this:

The code is available here – use it at your own risk.  All you need to do is replace your OnWorkflowItemChanged with this activity and the rest of your workflow should remain undisturbed.

[Update 2010-07-27] There are a few issues with the initial post.  I originally called out that the subscription ID be a random Guid — this isn't correct; the Guid needs to be the Guid of the document that the workflow is running against.  Second, you need to make sure that you actually wait for the event that you're subscribing for.  If you quickly subscribe for an event and then exit the scope you'll have problems.  Finally, I adapted this approach to put the listen inside a while loop to ensure that I wasn't subscribing and falling out of scope repeatedly.  Unfortunately, this adaptation can't be packaged as an activity because of a bug in Visual Studio 2008 (the problem doesn't exist in Visual Studio 2010).  The bug prevents you from providing the condition to the while activity in your custom activity.  The workaround is to simply replicate the pattern directly in the main workflow.  Although this is tedious, it works. — rlb

OnTaskCreated, DelayActivity, and PersistOnClose – How you can force the creation of a task

I’ve been working on a rather complex SharePoint workflow and I’ve run into a few problems. The workflow does a parallel approval of a form – and well, I’ve discovered a few issues.

First, there aren't many examples of how to do parallel approvals. This is particularly true when you need to keep unique instance data per iteration of the replicator loop. However, by scoping the correlation token to the inner sequence activity in the replicator, using a custom sequence activity to hold the additional parameters you need, and using the ChildInitialized event of the replicator, it can be done.

However, I also ran into some odd problems that only occur when a workflow is doing parallel execution. But before I get there, I have to explain how CreateTask works. The CreateTask activity doesn't actually create a task immediately; it creates a request to create the task when the workflow is serialized. The idea is to minimize disk I/O on the SQL Server, so if you want to change things after creating the task – but before it's written to disk in the list – you can. However, there are many situations where you need to force the task to be created so you can start to use it in your loop conditions. Commonly you want to know whether the item exists and isn't completed. If you put this condition at the top of a while loop immediately after a CreateTask, you'll never enter the while loop, because the task will be missing from the point of view of the condition.

To every problem there is a solution; enter the OnTaskCreated activity. But wait – I have to mention that Microsoft recommends that you not use OnTaskCreated; if you don't believe me, check out KB 970548. They say to use DelayActivity. I've commented on DelayActivity in the past, particularly about the fact that you can't have a DelayActivity that runs for less than a minute. Well, I was wrong. There's a situation where you can have a DelayActivity fire in less than a minute.

If you’re running a parallel situation (say inside of a replicator) and you hit a DelayActivity the wait is put on a timer queue inside of the workflow – just like any other workflow foundation workflow. The other branches will continue to run while the DelayActivity quietly ticks off the time. So it is technically possible to have DelayActivity sit for less than a minute.

In my case, I had set my DelayActivity to one second and didn't think anything about it. That is, until I got a NullReferenceException thrown back at me from the workflow. Why? Well, it seems that during the serialization process for the workflow the delay's event fired, and the SharePoint workflow host didn't know what to do with it. (Reportedly this is fixed in SharePoint 2010, but I've not tested it.) So now what do I do?

Well, enter the PersistOnClose attribute. This is an attribute in Windows Workflow Foundation that signals to WF that the workflow should be serialized immediately after completing the activity. This sounds pretty good… I can do a CreateTask, then an activity that has this attribute, and all is well. Of course, if I had hundreds of parallel branches in a workflow this would be sort of abusive on the system, forcing it to serialize the workflow hundreds of times – but for my case, where I'm only ever a dozen or so branches wide at the same time, it works fine.

All I did was create a new activity that does nothing – except that it has the PersistOnClose attribute on it. I put this immediately behind my CreateTask and voila – I get my task created. There's no crazy eventing going on. There's no delay while the system goes to sleep and wakes back up – just a little extra overhead on the system.
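The do-nothing activity is only a few lines; here's a sketch (the class name is mine):

```csharp
using System.Workflow.ComponentModel;

// An activity that does no work. [PersistOnClose] tells the WF runtime
// to persist (serialize) the workflow as soon as this activity closes,
// which forces the pending CreateTask batch to be committed to the list.
[PersistOnClose]
public class ForcePersistActivity : Activity
{
    protected override ActivityExecutionStatus Execute(
        ActivityExecutionContext executionContext)
    {
        // Nothing to do -- closing (and the persist it triggers)
        // is the whole point.
        return ActivityExecutionStatus.Closed;
    }
}
```

Drop this immediately after the CreateTask activity and the task exists by the time the next condition is evaluated.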

Problem solved. It’s more than a bit crazy how you solve something like this – but it works and I’ve got one less problem to worry about.

[Update: You can download a copy of my code (use at your own risk) here.]

SharePoint Saturday Wrap Up and Thank You

Last Saturday, January 30th 2010, we did a SharePoint Saturday event here in Indianapolis. From the perspective of most folks it was a roaring success. Kevin Dostalek posted his recap already. David Petersen posted a few pictures before we got rolling. Woody Windischman and Chris Geier posted about their experiences, and both Enrique Lima and Rob Wilson posted their slide decks.

For me I felt like I was in the center of the storm. Officially the SharePoint Users Group of Indiana executed the event. (i.e. the users group held the money) However, I can honestly say that my introduction to the steering committee was correct – I didn’t really feel like I did that much. In my conversations with the other members of the steering committee they felt largely the same way. That – to me – says a lot about the group of folks committed to making the event a success. We all compete for business in Indianapolis – and at the same time we worked together to deliver a community event that most people walked away from happy.

I wanted to say thank you again to the sponsors: Ambassador, Apparatus, CDW, Idera, K2, Microsoft, SHI, Wrox. Without their generous support the event simply wouldn’t have been possible.

I also wanted to thank everyone for coming. A SharePoint event in Indianapolis with 372 registrations and 250 attendees should demonstrate what a great community we have in Indianapolis.

Inside the Way Back Machine, Inside Access

This talk of Access Services got me to dig up my copy of Inside Access and take a few pictures:

You may notice my name in the upper right – and the date near the lower left. This is one of the first two books that I worked on. (I think it was second but I’m actually not sure.)

Some of you may not recognize what that second picture is of – it's a picture of a 5.25″ floppy disk. That's right, it's flexible. This one is writable (see the notch on the upper right). With 360K of storage and a 300 RPM spindle speed this baby was all the rage – 17 years ago.

In case anyone’s wondering – the book is a museum piece now – I don’t refer to it daily.

SharePoint 2010 and Access Services Place

I'm going to take a break in this post from what tends to be some very technical detail that you need to know. Instead, I'm going to talk about one of the new features in the Office 2010 wave and the way it may impact the market. Of course, this is a bit of prognosticating on my part, but I think my perception is on firm ground. Let me start by explaining.

Back when Microsoft released Microsoft Access 1.0, I was working for Woods Industries as a network manager. At the time we were more interested in Borland's Paradox for Windows. However, I was interested enough in both to perform a technical edit on the New Riders Publishing books Inside Paradox for Windows and Inside Microsoft Access. Thus my experience with Microsoft Access spans more than 15 years. Since my time at Woods I've run across Access dozens, if not hundreds, of times. I've seen it in use at small organizations and large organizations. I've seen it used as a way to transform data and as a complete solution. There are folks who have made their entire careers building Access databases. The platform is robust enough to create a career around. There used to be magazines and conferences dedicated to Microsoft Access development. I used to speak at Advisor Media conferences where Access was covered as a track. (I was speaking on SharePoint.) I've had many a conversation with folks who only did Access development at these conferences.

Access sits in a spot in the market where organizations (or departments) don't need or can't afford a fully custom solution. As much as .NET applications are, I believe, easier and quicker to build than with other technologies, there's still a great deal you need to know in order to build a scalable application. Access tends to be what people use when they need to get something done and don't have another way to do it. That doesn't make it bad. As long as you don't let the scale of the application get too far out of whack, it's a cost-effective way to build things.

However, Access isn’t perfect. The speed of development comes with a cost. Access has well known classic issues with corruption and difficulty with recoverability. There are solutions to these issues; however, most of the databases that are created are created by those that don’t know how to solve these issues. Access also has the inherent limitations of being a tool that requires the client application be installed. In some cases it’s not possible to implement solutions where Access is required on the client. Access is the application that SharePoint resembles most from an adoption standpoint. Access’ initial adoption was viral. SharePoint’s adoption is viral. Both are tools that users and managers in organizations can use to create solutions.

Still, the relationship between Access 2007 and SharePoint 2007 isn't that great. It's possible for Access 2007 to consume SharePoint 2007 lists, but because of locking issues on the SharePoint side it's not possible to host an Access .MDB database in SharePoint. It's a love-hate relationship. I won't quite go so far as to call out relationship issues like I did with InfoPath. However, the relationship between Access and SharePoint in 2007 isn't the best.

With that background I want to break down the folks that use Access into a set of categories, so that we can see where Access Services fits – and where Access itself fits with SharePoint 2010. I see SharePoint cannibalizing some of Access's core market. I see organizations implementing SharePoint where the issues of having a client installation aren't acceptable. While SharePoint doesn't have nearly the flexibility that Access has for customization without code, SharePoint is flexible enough for a broad array of applications which used to require Access.

I see two key sizes of organization that use Access. There's the small business that doesn't have a centralized IT department, or whose IT department is one or two people. Access tends to be used because it's easy and is something that the IT people who work for the organization can deal with. They have to do everything, so they don't have time to become an expert at development. They use Access to help the business when buying a package isn't cost effective – and neither is hiring a professional programmer.

The other size of organization that I see using Access is the very large organization, where there is a centralized IT department of dozens, hundreds, or thousands of people. In these organizations Access is used because there are dozens of small projects that can't get prioritized because they don't have enough value to the organization. Not that they're not useful. Not that they won't be valuable for the organization. Instead, the project's value (and maybe cost) is too small to get scheduled. It also may be that there are factions in the organization that dislike or distrust the central IT department and therefore want to work on their own. Access is a perfect tool because they are likely to have it and it's generally powerful enough to accomplish the goal.

Certainly there are mid-sized organizations that use Access but I see it in use more in large organizations and small organizations than in the mid-sized organization. Generally mid-sized organizations are trying to “grow up” to be big enterprises and start that awful adolescent phase where they are too big to accept the same level of risk they used to and too small to cope with the bureaucracy that a lower risk tolerance requires. As a result they believe that they have to develop everything “big” and Access temporarily takes a back seat to more traditional development languages like .NET.

In SharePoint 2010 one of the new services is the Access Service. Through this service and the Access 2010 client it’s possible to upload entire Access applications to SharePoint. This allows the application to be run from SharePoint. That means that the application – or part of the application – can be run by those clients who don’t have the Access client installed on their desktop. This can dramatically increase the reach of Access Applications.

Access Services isn’t without its limitations. Like InfoPath Forms Services there are some things that just don’t make sense or work that well in a web world. One buddy of mine quoted 82 limitations in Access Services – 82 things that don’t move from Access to Access Services. Certainly there are some things that you can do in Access that don’t make sense on the web (think special characters in field and table names). If you can, however, live with those limitations you can start to create a way for users to quickly work with data via the web.

There are a few key things that you should know about Access Services 2010. First, all of the design objects are stored in SharePoint, so if you lose the .ACCDB file it's not a big deal – you can regenerate it from the ribbon in SharePoint. This applies to all of the design objects, including those which are designed for client-only use. It also applies to the data files that aren't linked. The second thing to know is that not every object that you use from Access Services must be accessible from the web. If you have some reports that need client features, you can still have those – they just won't be visible on the web. So even if you have items that can't be converted to run on the web (think of the 82 from above), you can keep them in the application as client-only objects.

I should also mention that VBA code isn't supported running on the server, but Access macros are. These aren't the same old macros from years ago; they're a brand new set with a brand new macro editor. The reason for this is one of the primary tenets of SharePoint – that code from one user shouldn't be able to impact another. Macros are made up of a set of trusted components which are specifically designed and tested to prevent side effects that might allow one user to access another's information. As a result they can be run on the server – where VBA code cannot. That means that you'll need to plan to build your logic with macros if you want it to run server side. Access also added data-level macros, which can eliminate the need to copy validation logic from one form to another. On the server these are implemented as QuickFlows – basically a workflow that can't persist. This is one of the features added to SharePoint 2010.

One way to think about this is the same way we think about workflows in SharePoint Designer. They’re available for anyone (with permission) to create because they are declarative workflows – they stitch together a set of known components. They’re trusted because the components themselves are trusted. In the same way Access Macros components are trusted and thus can be used by anyone.

From my perspective there will be one key place where Access Services will really excel. That is for the Enterprise scenarios where a large organization has deployed SharePoint Enterprise. This should make sense given that SharePoint Enterprise is required to get Access Services and it’s generally the largest organizations that have made this investment. Because of the cost of the enterprise licensing it’s unlikely that smaller organizations will leverage Access Services.

In addition to being used as a tool to create solutions for business units and groups inside of an organization, there's one other key reason why Access Services may be used: Access Services will generate the RDL used by SQL Server Reporting Services (SSRS). This means that you can use Access to quickly and easily report on SharePoint data – and reporting is a huge hole in SharePoint. In order to do reports you either have to do a lot of work generating reports in SSRS (including the use of third-party components), export the data to Excel and make the spreadsheet pretty, or attach Access to the SharePoint lists and build your reporting from there. In 2007 none of these solutions is very palatable.

In 2010 we have another enhancement that is designed to protect the system but it also makes it harder to do reporting. Query Throttling will prevent queries asking for too many records (administrator controlled, defaulted to 5,000 records) from being run during the day. There’s an administrator setting for when large queries can be run (i.e. the middle of the night.) However, if you need a report before the end of your day – trying this from SharePoint directly may be difficult. Access Services works around this issue and allows you to build reports on large datasets. The net result is that you’re able to do reporting on the larger data sets – and the report design experience is good.

So I believe a key area of use for Access in the Enterprise will be for the development of reporting even on applications not originally created in Access.

InfoPath contacted a data source but failed to receive data

I’m doing some testing on a form including some security testing. One of the things that I ran into was the following error (details shown):

The form uses Universal Data Connections (UDCX). The user I’m testing doesn’t have write access to some of the reference lists. When I dug into the issue, I got tons of hits for Kerberos issues and some for the LSA Loopback check issue. However, the issue that I found was that my UDCX file included entries for update as well as reading the data. This was the default UDCX file that InfoPath created – take a look:

<?xml version="1.0" encoding="UTF-8"?>
<?MicrosoftWindowsSharePointServices ContentTypeID="0x010100B4CBD48E029A4ad8B62CB0E41868F2B0"?>
<udc:DataSource MajorVersion="2" MinorVersion="0" xmlns:udc="http://schemas.microsoft.com/office/infopath/2006/udc">
<udc:Name>Sites</udc:Name>
<udc:Description>Format: UDC V2; Connection Type: SharePointList; Purpose: ReadOnly; Generated by Microsoft Office InfoPath 2007 on 2009-10-19 at 07:55:36 by DEMO\Administrator.</udc:Description>
<udc:Type MajorVersion="2" MinorVersion="0" Type="SharePointList">
<udc:SubType MajorVersion="0" MinorVersion="0" Type=""/>
</udc:Type>
<udc:ConnectionInfo Purpose="ReadOnly" AltDataSource="">
<udc:WsdlUrl/>
<udc:SelectCommand>
<udc:ListId>{7159DFE5-D236-472A-81B8-D031EAE14F59}</udc:ListId>
<udc:WebUrl>http://wss/sites/moc/</udc:WebUrl>
<udc:ConnectionString/>
<udc:ServiceUrl UseFormsServiceProxy="false"/>
<udc:SoapAction/>
<udc:Query/>
</udc:SelectCommand>
<udc:UpdateCommand>
<udc:ServiceUrl UseFormsServiceProxy="false"/>
<udc:SoapAction/>
<udc:Submit/>
<udc:FileName>Specify a filename or formula</udc:FileName>
<udc:FolderName AllowOverwrite=""/>
</udc:UpdateCommand>
<!--udc:Authentication><udc:SSO AppId='' CredentialType='' /></udc:Authentication-->
</udc:ConnectionInfo>
</udc:DataSource>

All I did to resolve the problem was to remove the <udc:UpdateCommand> element so that the UDC didn't have an update option. That didn't in and of itself resolve the issue — but as soon as I re-uploaded and checked in the UDC files they started working — the UDCs were not checked in in my connections library.  Doh!

Hyper-V, NVidia, Lenovo T61p, and how LinkedIn came to the rescue

It's no secret that I do a fair number of presentations. It's also no secret that I am a big SharePoint user. (In case you missed it, the Pope is Catholic too!) With SharePoint 2010 being 64-bit only I really had two choices – Hyper-V or VMware Workstation. Since I've been an avid user of VMware for a while that's not a problem – until you consider that I'll be presenting at some Microsoft-run conferences and that I've got a few projects with the product team. So I had to get Hyper-V working.

In my case that meant taking my Lenovo T61p and installing Windows Server 2008 R2 on it. That process went fine, right up to the point where I enabled Hyper-V. This turned my system into what I like to refer to as a flying brick. Hyper-V disables certain things, like suspend and hibernate, on machines that have the role enabled. What's worse – much worse – is that there's a design flaw (you might say the virtualization team and I disagree on this particular issue) where if your driver allocates memory with write combining, the translation lookaside buffer on the processor has to be flushed. In practical terms, this surfaces as an issue with high-performance video cards (or high-performance video drivers) in machines running Hyper-V. The result is substantially reduced performance. The Microsoft recommendation? Run with the out-of-the-box VGA drivers. (See for yourself.) Where's my problem?

Well, the fact that I do presentations means that I sort of need to be able to control what video I send to the external monitor port – which the out-of-the-box drivers don't support. The driver that NVidia shipped used the write-combining flag (as it should have). When I contacted the virtualization product team at Microsoft the recommendation eventually became… use two machines — one machine to drive the monitor, making a remote desktop connection into the other. It's that advice that drove me to purchasing my Lenovo X200 Tablet (which I still love).

Of course, I wasn't satisfied. The more I dug into the problem the more stonewalling I got. I was once told to buy new hardware – of course, the hardware wasn't actually available yet. So I decided to reach out and see if I could find someone at NVidia to help. If they could allow me to turn off write combining when they allocated a page of memory, I could eliminate the performance problem. (I have to say again, they were doing the right thing; it's Hyper-V that has the problem.) So I fired up LinkedIn and realized that I had two different second-degree connections to NVidia. I sent emails to both of the folks I knew who reportedly had connections to NVidia. Both of those connections ended up making it to NVidia. I won't bore you with the details and the people, but I will say that ultimately NVidia made a fix to their driver for Hyper-V.

I can’t explain what it means to me to hear that a company like NVidia would go out of their way to help their customers – even when the problem they’re solving isn’t really one that they created. I had made a decision (because of some seriously negative interactions with ATI years ago) that I’d be buying all NVidia graphics cards. It’s not often that I get proof that the decisions I make are the right ones – but this definitely falls in this category.

I still use VMware Workstation more than I use Hyper-V. However, at least I don't feel like stabbing a fork in my eye when I'm running Hyper-V on my laptop anymore.

SharePoint Saturday Indianapolis Jan 30th, 2010

It's now official: we're going to be having a SharePoint Saturday in Indianapolis on Jan 30th, 2010. It will be at the JA center on North Keystone and should be a great day. We're planning three tracks with five sessions each. The tracks are going to be: IT Professional, Developer, and Business and End User. There are details on the site on how you can submit session proposals, as we're hoping to close speaker submissions soon.

I look forward to seeing you all there.
