Monday, May 7, 2012

Error occurred in deployment step Add Solution

Despite what I say about myself on my About page (this is a no-code blog), I shall be writing more and more development-oriented posts :-). I tried to deploy a solution with CAS policies directly from Visual Studio 2010 to SharePoint using the new SharePoint Developer Tools.


The deploy failed with the error “Error occurred in deployment step Add Solution: Property set method not found”.


Thanks to my colleague Waldek Mastykarz this is resolved. Go to his blog here to see the solution.


Tags: SP2010, VS2010


View the original article here

Output cache and load balancer problem

At the moment I am troubleshooting a very slow SharePoint 2007 intranet at a customer (response times of 10+ seconds, monitored with Fiddler). At first we thought that the problems were caused by network traffic from SharePoint to SQL being routed through the load balancer (Windows 2008 NLB). We adjusted the route tables on the SharePoint servers, but this brought only a slight improvement in performance.


We also saw that neither the object cache nor the output cache was configured on the site collections, so we configured both: the object cache with a size of 200 MB per site collection and a refresh interval of 600 seconds, and the output cache switched on with the authenticated cache profile set to Intranet. Almost immediately after these changes were made, the helpdesk received lots of phone calls from users reporting that they appeared to be logged on as another user. Switching the output cache off solved this.


This Technet article also describes the limitations of NLB combined with output caching:


When used with two or more Web servers, output caching might affect consistency. You can configure a cache profile not to check for updates for each request and, for example, configure the profile to ignore changes to the version of the Web page in the output cache until 60 seconds after the original page is updated. If you have two Web servers in your topology, and depending on the load balancer used to route the user’s request, a reader might see inconsistent content if the page is rendered by one server and then a later request is routed to a second server within that 60-second window.
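
If you need to review how the output cache is set up on a publishing site collection, the cache profiles can be dumped from the object model. A minimal sketch, assuming the default list title "Cache Profiles" and the field display names as they appear in the browser UI (the site URL is a placeholder):

using System;
using Microsoft.SharePoint;

class DumpCacheProfiles
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://intranet")) // placeholder URL
        {
            // Publishing site collections keep their output cache profiles as list items
            SPList profiles = site.RootWeb.Lists["Cache Profiles"];
            foreach (SPListItem profile in profiles.Items)
            {
                Console.WriteLine("{0}: Duration={1}, Perform ACL Check={2}",
                    profile.Title, profile["Duration"], profile["Perform ACL Check"]);
            }
        }
    }
}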


Tags: Infrastructure, performance


View the original article here

SharePoint backup restore and virtualization support

At the moment I have to write a SharePoint backup and restore document and plan for my client's SharePoint farms. All together, production and non-production, there are 10 SharePoint farms.


At first I thought, this is a very straightforward document about which parts of SharePoint and/or the file system to back up and restore. After some search queries on the internet I read that virtualization is also very important. I knew that already of course, but I did not know what is and is not supported by Microsoft.


Some quotes on the internet:


“Do not use the Hyper-V snapshot feature on virtual servers that are connected to a SharePoint Products and Technologies server farm. This is because the timer services and the search applications might become unsynchronized during the snapshot process.”


“As a best practice, we recommend that you do not use the snapshot feature on virtual machines in a production environment.”


There are all sorts of SharePoint farms at the client: complete virtual server farms, combined virtual and physical server farms, and complete physical server farms. Considering this, there should also be different recovery scenarios. Maybe I will discuss the different scenarios in another article later on, but for the moment I will give you all the information I have read about virtualization support.


SharePoint farm Backup/Restore with VMware Snapshots: http://social.technet.microsoft.com/Forums/en/sharepoint2010setup/thread/e5abf633-9023-4f24-a707-2680cced28e8
Virtualizing SharePoint Server 2007 Series:
http://blogs.msdn.com/b/uksharepoint/archive/2009/02/26/virtualizing-sharepoint-series-introduction.aspx
Best practices for virtualization (SharePoint Server 2010):
http://technet.microsoft.com/en-us/library/hh295699.aspx
Virtual machine guidance (SharePoint Server 2010):
http://technet.microsoft.com/en-us/library/ff621103.aspx


Update 12/22/2011


Resource Center Virtualization for SharePoint Server 2010:
http://technet.microsoft.com/en-us/sharepoint/ff602849.aspx
Server Virtualization Validation Program:
http://www.windowsservercatalog.com/svvp.aspx?svvppage=svvp.htm


Tags: backup & restore, Virtualization


View the original article here

Sunday, May 6, 2012

Converting SharePoint license trouble

At a customer I was building a demo site to convince the management of the power of SharePoint 2010. The plan was to show some fancy stuff with Visio Graphics Services, InfoPath forms, workflows, etc. For this we needed, obviously, the Enterprise license. So I opened CA and browsed to the Manage service applications part to make a new Visio Graphics Service Application… The next image indicated this was not possible. My conclusion was that a Standard license was installed (duh).


New Service Application


But I’m stubborn and looked at the Upgrade and Migration part of CA because I was not sure that the Standard license had been used at installation. The Convert License Type page showed me this:


convert license type
Hmmmm…strange, I thought that the license type should show up here.


So I also looked at Upgrade and Migration > Enable Enterprise Features (see below, first image) and Enable Features on existing sites (below, second image).


Enable Enterprise features


Enable features


The page “Enable features on existing sites” says that the Standard features can be enabled, which should indicate that a Standard license is used, but the “Enable Enterprise features” page indicates the opposite. Confused…
So I wanted to know another way to check the license type. My colleague Waldek Mastykarz (http://blog.mastykarz.nl) told me to look into the registry on the SharePoint server at this key:


HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office\14.0\Registration\{90140000-110D-0000-1000-0000000FF1CE}


regkey


Opening the value DigitalProductID indicated the license is a MOSS FIS Enterprise license (see below).


regkeydetail
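
If you want to do this check without opening regedit, here is a minimal sketch that lists the SKU GUIDs registered under that key (each subkey name is the GUID of an installed SKU):

using System;
using Microsoft.Win32;

class ListRegisteredSkus
{
    static void Main()
    {
        using (RegistryKey reg = Registry.LocalMachine.OpenSubKey(
            @"SOFTWARE\Microsoft\Office\14.0\Registration"))
        {
            // Each subkey is the product GUID of an installed SKU
            foreach (string guid in reg.GetSubKeyNames())
                Console.WriteLine(guid);
        }
    }
}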


But… this still should not explain all the vague licensing information SharePoint shows. Waldek then found a KB article (http://support.microsoft.com/kb/2143810) about FIS licenses that do not activate all product features. When I executed the PowerShell cmdlet from the KB article on the SharePoint server, one of the GUIDs it lists showed up. So all this indicated a (more or less) valid license was in use, but it was too old.
The client will take further steps and will reinstall the SharePoint Farm with another license.


Tags: license, SP2010


View the original article here

Move IIS7 root script

In my previous post I indicated that the VirtualDirectory Path of the SharePoint Central Administration Web Application in a configuration file was incorrect. I have found the source of this error.


Before I started the SP2010 installation I decided to move the inetpub directory to another drive. I searched for this on the internet and found a script to do it. Unfortunately there was an error in this script which added a second backslash ("\") to the VirtualDirectory path.


At first I did not know what caused this error so I searched in the registry. There I found the registry key (HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\InetStp) where IIS stores the location of the WWWROOT. (see below)


regkey


The double backslashes were caused by an error in the script that I used to move the inetpub directory from C: to D:. If you also used the moveIIS7root.bat script which I found here, you can download the corrected version here. I made changes in lines 46 and 47 of the script (changed %moveto%\inetpub to %moveto%inetpub).
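
If you want to verify your own server, here is a small sketch that reads the key and flags a doubled backslash (the value name PathWWWRoot is an assumption; check the key in regedit if it differs):

using System;
using Microsoft.Win32;

class CheckWwwRoot
{
    static void Main()
    {
        using (RegistryKey key = Registry.LocalMachine.OpenSubKey(@"SOFTWARE\Microsoft\InetStp"))
        {
            string root = key.GetValue("PathWWWRoot") as string; // assumed value name
            Console.WriteLine(root);
            if (root != null && root.Contains(@"\\"))
                Console.WriteLine("Warning: double backslash in the WWWROOT path");
        }
    }
}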


UPDATE August 31, 2011


Also check the Advanced Settings in IIS per website (see below).


advanced


Here you also have to change the Physical Path (see below)


pp


Tags: IIS, Registry, SP2010


View the original article here

Timer service terminated unexpectedly error 7031 7024

Errors in the System log

Log Name:      System
Source:        Service Control Manager
Date:          8/31/2011 11:53:35 AM
Event ID:      7031
Task Category: None
Level:         Error
Keywords:      Classic
User:          N/A
Computer:
Description:
The SharePoint 2010 Timer service terminated unexpectedly. It has done this 479 time(s). The following corrective action will be taken in 30000 milliseconds: Restart the service.

Log Name:      System
Source:        Service Control Manager
Date:          8/31/2011 11:53:35 AM
Event ID:      7024
Task Category: None
Level:         Error
Keywords:      Classic
User:          N/A
Computer:
Description:
The SharePoint 2010 Timer service terminated with service-specific error 2147500037 (0x80004005).

Both errors appeared every 3 minutes.


Cause


Well in our particular case a developer asked me to change the owstimer.exe.config file (and restart the timer service) to configure a specific timer job.


Solution


In this case it was obvious, I think: restoring the owstimer.exe.config file to its original settings, which are:


Other causes/solutions:


When I searched the internet for solutions I also found this post by Yorick here. His problem was a missing GUID-named folder, and the solution was recreating that folder. Check his blog post if my solution did not solve anything for you.


Tags: error, SP2010, timer service


View the original article here

SharePoint 2010 Document Management - Part 6

In this last part of my document management series I want to talk about workflow. You have three options for using workflow with documents in SharePoint:

Versioning settings - content approval workflow
Workflow settings - browser workflow
SharePoint Designer 2010 workflow

Let's look a bit closer at these options.


You can activate content approval in the versioning settings menu. This enables a new column:


2012-05-02-DocMgmt-Part06-01.png


A document awaiting approval is only viewable by the author of the document and by users with approval permissions. You have to create a new SharePoint group, or use a default group, with approval permission for this document library. The major downside of this approach is that there is no notification for the approvers, which will result in documents sitting in pending approval status for a long time.
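
If documents piling up in pending status becomes a problem, one workaround is a small scheduled console application that lists (or mails out) everything still waiting for approval. A minimal sketch using the server object model; the site URL and library name are placeholders:

using System;
using Microsoft.SharePoint;

class PendingApprovalReport
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://intranet"))  // placeholder URL
        using (SPWeb web = site.OpenWeb())
        {
            SPList library = web.Lists["Documents"];         // placeholder library name
            SPQuery query = new SPQuery();
            // _ModerationStatus 2 = Pending (SPModerationStatusType.Pending)
            query.Query = "<Where><Eq><FieldRef Name='_ModerationStatus'/>"
                        + "<Value Type='ModStat'>2</Value></Eq></Where>";
            query.RowLimit = 100;
            foreach (SPListItem item in library.GetItems(query))
            {
                Console.WriteLine("{0} is still pending approval", item.Name);
                // from here you could e-mail the approvers group
            }
        }
    }
}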


The second option is to create a new workflow through the workflow settings of the document library. There is the approval workflow that has the following description:


“Routes a document for approval. Approvers can approve or reject the document, reassign the approval task, or request changes to the document.”


You have a lot more options than with the first workflow option:


2012-05-02-DocMgmt-Part06-02.png


You can give the workflow a name, select the task and history lists and define the start options. Click Next to see more options:


2012-05-02-DocMgmt-Part06-03.png


You can now assign the workflow to one or more users/groups, create a custom message and determine the duration options. This gives you a lot more flexibility in creating workflows. The major downside to this approach is that you cannot work with conditions. For example: you want to create a workflow task based on the type of document for a specific group. You cannot configure this with the browser workflows. This is where the power of SharePoint Designer will help you!


The last option I will discuss is workflow in SharePoint Designer. Let’s open the site and click Workflows in the navigation bar:


2012-05-02-DocMgmt-Part06-04.png


There are so many options here that it can be a bit overwhelming in the beginning. This is one of the reasons why end users should not use SharePoint Designer without proper training. There are a lot of topics on the net about workflow in SharePoint Designer; I will just focus on creating a new workflow for our document library. Click List Workflow and select the document library. Give the workflow a name and description and you can start building your workflow:


2012-05-02-DocMgmt-Part06-05.png


You can now start building your own workflow from scratch. The great thing is that you have a lot of Conditions and Actions to enhance your workflow. The following links are really useful in finding your way through these conditions and actions:


Actions
http://office.microsoft.com/en-us/sharepoint-designer-help/workflow-actions-in-sharepoint-designer-2010-a-quick-reference-guide-HA010376961.aspx


Conditions
http://office.microsoft.com/en-us/sharepoint-designer-help/workflow-conditions-in-sharepoint-designer-2010-a-quick-reference-guide-HA010376962.aspx


Let’s say our document library contains a field Department; you can then use the condition ‘If field equals value’ to assign the approval action to different approvers. The other great thing about SharePoint Designer is that you can create your own e-mail notifications or change existing ones. I won’t discuss this in detail in this article, but I am planning to dedicate an article to this subject soon!


I hope you've enjoyed my document management series and this last article. I want to end this series with my last section of tips and tricks.


Moving documents

The workflows described in this article are bound to one site; it is not possible to move documents to other sites. You need Visual Studio or Nintex Workflow for these types of actions.

Content type

Are you using the same document template on multiple sites? Do you need the same workflow for all the sites? Create a content type workflow.

E-mail notifications

You can use SharePoint Designer to create your own e-mail notifications. This means you can personalize it.

View the original article here

Thursday, April 19, 2012

SharePoint 2010 Upgrade Videos


I’ve recently started a SharePoint 2010 upgrade project for a client. As preparation for this upgrade, I performed the upgrade scenario on my test environments and captured a series of videos.


Here is a link to SharePoint 2010 Upgrade playlist: http://www.youtube.com/view_play_list?p=7B18675060D89566


http://www.youtube.com/p/7B18675060D89566?hl=en_US&fs=1



View the original article here

SharePoint Outlook Connector

Over the last few months I’ve had the privilege of being one of the contributors to the Sobiens SharePoint Outlook Connector project on CodePlex.

The project is a VSTO package that gives Outlook a new tree view panel showing connected SharePoint sites, providing a drag and drop interface to save emails and their attachments to the document libraries of your choice.

image

The Outlook Connector lets you drag emails from Outlook into document libraries in SharePoint.

The new release gives users a context menu when you right-drag email across, giving options to Copy or Move the email as an Outlook .msg file, or to Copy the email and attachments as Word documents.

The new version also shows all document library types, not only those made from the Document Library template.

Hopefully, we’ve also closed a few bugs without creating too many new ones!

Technorati Tags: Outlook, SharePoint, SharePoint 2010, SharePoint Development


View the original article here

Wednesday, April 18, 2012

SharePoint 2010 Google Maps Web Part now Free!

In a fit of community spirit, I’ve decided to share my Google Maps Web Part, complete with source, on CodePlex.

Example of joelblogs.co.uk Google Map Web Part

If you want a Sandbox Solution or a Farm Solution with a simple way of placing a Google Map onto a Web Part page, this is for you.

Check out the project here: http://sp2010googlemaps.codeplex.com/.

Technorati Tags: Google Maps, SharePoint 2010, SharePoint Development, Solution Sandbox, WebParts


View the original article here

The resource cannot be found error starting SP2010 Central Admin

First, let me begin with the error that started the idea for this blog post. After running the SharePoint 2010 Configuration Wizard on a Windows 2008 SP2 server, the following error appeared where the Central Admin participation dialog should have appeared:

Description: HTTP 404. The resource you are looking for (or one of its dependencies) could have been removed, had its name changed, or is temporarily unavailable.  Please review the following URL and make sure that it is spelled correctly.
Requested URL: /_admin/adminconfigintro.aspx


Version Information: Microsoft .NET Framework Version:2.0.50727.4214; ASP.NET Version:2.0.50727.4209


What I did to get to this error and how I resolved it:


With a PowerShell script I installed all prerequisites. Also with PS I installed SharePoint and created the configuration and administration databases and SharePoint Central Administration.


The CA site opened with the error shown above. Several events appeared in the eventlog:


Log Name:      Application
Source:        COM+ SOAP Services
Event ID:      0
Task Category: None
Level:         Warning
Keywords:      Classic
User:          N/A
Computer:     
Description:
Installation in the global assembly cache failed:  C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\policy\Policy.11.0.Microsoft.SharePoint.Security.dll


Log Name:      Application
Source:        SharePoint 2010 Products Configuration Wizard
Event ID:      107
Task Category: None
Level:         Warning
Keywords:      Classic
User:          N/A
Computer:     
Description:
Unable to create a Service Connection Point in the current Active Directory domain. Verify that the SharePoint container exists in the current domain and that you have rights to write to it.
Microsoft.SharePoint.SPException: The object LDAP://CN=Microsoft SharePoint Products,CN=System,DC=xxxx,DC=yyyy,DC=nl doesn’t exist in the directory.
at Microsoft.SharePoint.Administration.SPServiceConnectionPoint.Ensure(String serviceBindingInformation)
at Microsoft.SharePoint.PostSetupConfiguration.CentralAdministrationSiteTask.ProvisionAdminVs()


Solutions:
On this blog http://blogs.technet.com/b/wbaer/archive/2009/12/11/common-microsoft-sharepoint-server-2010-installation-issues-and-resolutions.aspx I found a possible solution for the policy error (issue number 5). The suggested solution was to delete all files in the policy directory mentioned in the event and run the Wizard again. So I deleted all the files.
A possible solution to resolve the Service Connection Point event I found on TechNet here: http://technet.microsoft.com/en-us/library/ff730261.aspx. Because Active Directory is handled by another department I am not able to solve this myself, so a call was sent to the AD team. But it is only a warning event, so it should not be a critical requirement for the installation of SP2010.


The next step was to run the Configuration Wizard again to see if the error was solved. It wasn’t, although the policy event had not returned. But now other, more serious errors/events showed up:


In the eventlog
Log Name:      Application
Source:        Microsoft-SharePoint Products-SharePoint Foundation
Event ID:      8306
Task Category: Claims Authentication
Level:         Error
User:          \
Computer:     
Description:
The description for Event ID 8306 from source Microsoft-SharePoint Products-SharePoint Foundation cannot be found. Either the component that raises this event is not installed on your local computer or the installation is corrupted. You can install or repair the component on the local computer.
If the event originated on another computer, the display information had to be saved with the event.
The following information was included with the event:
Could not connect to http://localhost:32843/SecurityTokenServiceApplication/securitytoken.svc. TCP error code 10061: No connection could be made because the target machine actively refused it 127.0.0.1:32843.
The publisher has been disabled and its resource is not available. This usually occurs when the publisher is in the process of being uninstalled or upgraded.


In the ULS
Critical An exception occurred when trying to issue security token: Could not connect to http://localhost:32843/SecurityTokenServiceApplication/securitytoken.svc. TCP error code 10061: No connection could be made because the target machine actively refused it 127.0.0.1:32843. .f8f078d7-fcaf-443e-8dc2-32de426a9c78


This blog http://velavans.blogspot.com/2010/06/sharepoint-2010-exception-occurred-when.html says to start the “Claims to Windows Token Service”, but even after doing that, running the Wizard again did not resolve any of the issues.


At this point I took the original error and read it again. The path /_admin/ with the file adminconfigintro.aspx is unavailable. So I checked the website in IIS to see if the path was there.


It was (see below)
_admin


Exploring this VirtualDirectory opened the path C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\TEMPLATE\ADMIN and here the adminconfigintro.aspx file is present (see below)


aspx


So, that is strange. Now comes the most absurd part of this story. I just started to explore my inetpub directory, hoping to find some clues to this strange problem.


Finally I ended up in the directory D:\inetpub\temp\appPools (see below), where I found a file called “SharePoint Central Administration v4.config”. This sounded promising. I opened the file in Notepad and began scrolling down, just to see what was in it…


 apppools


I came to a section listing all the VirtualDirectory paths (see below) for the Admin website. Scanning through this section, I saw that the physical path for the root site / was D:\\ with a double backslash. That did not look right, so I changed it to a single backslash and saved the file.


path


I browsed again to http:///_admin/adminconfigintro.aspx?scenarioid=adminconfig&welcomestringid=farmconfigurationwizard_welcome
And Praise the Lord… the Central Admin screen appeared.


As soon as I got into CA, another file appeared in the folder D:\inetpub\temp\appPools: SecurityTokenServiceApplicationPool.config. And guess what, in this file the root path was also D:\\, so I changed it here too.
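
If you suspect more of these generated config files are affected, scanning the directory is quick. A sketch; the physicalPath attribute name is an assumption based on what I saw in the files:

using System;
using System.IO;

class FindDoubleBackslash
{
    static void Main()
    {
        foreach (string file in Directory.GetFiles(@"D:\inetpub\temp\appPools", "*.config"))
        {
            foreach (string line in File.ReadAllLines(file))
            {
                // flag any physicalPath that contains the bad D:\\ root
                if (line.Contains("physicalPath") && line.Contains(@"D:\\"))
                    Console.WriteLine("{0}: {1}", Path.GetFileName(file), line.Trim());
            }
        }
    }
}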


Now I still have to figure out where this setting is originally configured, because it seems that it is put into every config file showing up in this directory. If you can help me, please let me know!


View the original article here

Google Wave RIP

So Google officially came out and killed Wave. Here is yet another technology I've learned that has gone away. They blamed adoption; frankly, I was still waiting for its release! I was planning on speaking about it at the SPEVO conference in London, but I got volcanoed out, though I was going to switch topics anyway.

Some of the problems stemmed from the browser war. Chrome loosely interpreted an HTML tag while IE strictly interpreted it, which meant things worked in Chrome and not in IE. That kind of killed the whole Google Wave to SharePoint integration. Another thing that got worse was the playback. It sometimes got out of sequence and didn't play back character for character like they promised. In the beginning it was great because it was different, providing the user with the context of time. Towards the end it was just another forum or discussion board.

It's really too bad the technology didn't make it. I think if someone were to deliver what Google had advertised, it would have been a great success.


View the original article here

Tuesday, April 17, 2012

Functions Used In Calculated Columns In SharePoint Lists

Posted by: Stacy Draper

Thursday, December 17, 2009

Category: SharePoint

A question that I've had on the back burner for quite some time is what functions are available in SharePoint calculated columns. I've taken a cursory look here and there and never found the answer. I've recently been passing direct messages via twitter to Andrew Clark, @sharepointac, and he sent me a message that had the answer I've been looking for: http://office.microsoft.com/en-us/sharepointtechnology/CH011711171033.aspx
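
For a taste of what these functions look like, here are a couple of typical calculated column formulas (the column names are made up):

=IF([Status]="Completed","Done","In Progress")
=TEXT([Created],"yyyy-mm")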

View the original article here

Monday, April 16, 2012

Experiences with my SharePoint development farm

So I’ve been building out a new development farm and trying a few things out on it here and there. I’ve experimented with the dynamic RAM in Hyper-V. The database server got so messed up it didn’t even know it was running SQL. I set it back to a static amount, rebooted, and all was right with the world. I had two web front ends, and the surprise there was that both web front ends have to be up if you want the .wsp solution to install correctly. I spent all my time today getting BGInfo set up just the way I wanted it. I guess the first thing to do is describe my environment.

First off I had to make a choice between Hyper-V and VMware. At the time I built the host my client was mostly using Hyper-V, or so I was told. It turned out later that they were really using more VMware. The other deciding factor was that it came with my OS. There are converters to go from VMware to Hyper-V and vice versa (never the same tool, though), but they do exist and they aren’t terrible. Long story short, I went with Hyper-V.

I have AD1 (Active Directory) and it’s still on a dynamic allocation of RAM, 768MB – 1792MB; right now it’s at 953MB assigned and it’s using 686MB. I only have one user. Me. If I ever have a situation where I’m testing users I’ll bump it up to 2 or 4GB.

The DB1 (Database) has been set to a static 4096MB.  Before the incident it was set dynamically at 2G – 4G and it was assigned 3.5G and using 3G.

WFE1 (Web Front End) is set to use 4 processors and 8 GB of RAM. I was messing around with the processors, giving 2 here and 4 there. Turns out I was overthinking it. They are virtual processors and I can only assign up to 4. Having 12 available, it just didn’t make sense. I couldn’t say physically which processor it should or shouldn’t use. So what the heck? I gave everyone 4 processors. 8 GB of RAM because there’s Visual Studio and everything else under the sun on that box. That’s where I spend a lot of my time so I want it to run real smooth.

WFE2 is still on dynamic RAM with 2-8 GB; it runs around 2 GB assigned and 512 MB used. The reason I went with 2 WFEs is so that I could do a little bit more testing before my code leaves me. I never really felt like I was missing out on anything before, but I have it there just in case. I used to have it connected all the time, but then I learned that when it’s not booted up the .wsp never deploys. Not wanting that kind of problem on my development box, I took it out of the farm with PSConfig. I’ll put it back in with PSConfig if I ever need it.

An interesting thing about this PSConfig exercise is that WFE2 hosted Central Admin as well as WFE1. I removed WFE2 through Central Admin and added WFE2 back to the farm with PSConfig, and I told it not to host Central Admin. This caused a problem to appear: I was getting messages about not upgrading properly. So I removed WFE2, through PSConfig this time, and added it back to the farm telling WFE2 to host Central Admin again. Then I removed it with PSConfig and added it back telling it NOT to host Central Admin. That finally worked. I guess I wanted my way? Maybe I didn’t want to forget that I had it hosting Central Admin? Who knows; all that matters is it’s the way I want it and I’m happy.

Fabian Williams convinced me to have a client machine. I don’t know if I’ll ever use this one but we’ll see. As a matter of fact I just looked and seem to have messed it up. I created an iSCSI drive and I disconnected it, but it was a Windows 7 box with the recommended RAM and probably 4 processors.

I’ve spent the last few days trying to find a project that I could do for community.  I couldn’t find anything that wasn’t done a few different ways so I went on a detour today.  I thought hey, I’m like a real live sysadmin now.  Now I too can have those cool desktops with the text on it that tells me all about the server I’m on like in Figure 1.

image

So I hopped on Google and didn’t know what to search for. So I hopped on twitter and blurted out something like, “What do admins use to write on the backgrounds of their desktops?” As always Todd Klindt made sense of my question and suggested that some people like to use BGInfo. Several people agreed, except Mark Rackley. He somehow came up with Sharpie, but I wanted something more dynamic than that.

BGInfo is pretty cool and time tested so I’m not going to get into all of its awesomeness. I saw that I could get values via VBScript. I thought that should be fun. I was once the best I knew at VBScript and was wondering if I still had the touch. Turns out I don’t. I spent all day today trying to figure out how to call PowerShell from VBScript. You can do it, but you’re not going to be calling SharePoint cmdlets: those are 64-bit. VBScript, as I became painfully re-aware, is 32-bit. There are a couple of reasons why this wouldn’t have worked; the calling application that creates the app domain in the first place is 32-bit.

However I did come up with some good ideas. To use VBScript to call PowerShell and then pass data back to VBScript I came up with a few things that worked and a few that don’t. So you’ll see some stuff about environment variables. That doesn’t work because environment variables are session specific and I couldn’t find anything that would let you rewrite the system environment variables. So that was out. What I finally liked was writing to a file, because that’s super easy in PowerShell, and reading it back in VBScript, which is only about 5 lines of code that you can copy and paste. Passing information to PowerShell by parameters worked pretty well too.

So I was really stuck. How was I going to get the SharePoint version? I rarely use it. But I wanted it. I had to have it on the desktop. I don’t know why either, but just bear with me. I was going to use PowerShell to get the version number, because again I blurted out a question like, “How can I get the version number from SharePoint” on twitter. @usher and @toddklindt responded with get-spfarm | select BuildVersion or something similar. Their approaches were a little different but basically the same idea. This other guy, @sstrnager, came up with something around get-wmiobject win32_product and this got me to thinking. Since BGInfo has a built-in WMI query interface, maybe I could just use that? I was right, for the first time today. It felt pretty good. Until I spent 2 hours trying to get a scheduled task to work, which then started working for no reason at all. I finally used:

SELECT Version FROM Win32_Product WHERE name = "Microsoft SharePoint Server 2010 "
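
For what it’s worth, the same query can also be run from C# via System.Management; the trailing space in the product name is copied deliberately from the WQL above:

using System;
using System.Management; // add a reference to System.Management.dll

class SPVersionCheck
{
    static void Main()
    {
        var searcher = new ManagementObjectSearcher(
            "SELECT Version FROM Win32_Product WHERE Name = 'Microsoft SharePoint Server 2010 '");
        foreach (ManagementObject product in searcher.Get())
            Console.WriteLine(product["Version"]);
    }
}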

So at the end of a very long day I was able to put some text on a desktop or 4 and leave myself some notes. I do wish I could centrally manage this. I dread having to go to each machine to make a little change; leaving myself another little note about my farm, or adding other information I'd like to see, is going to be a major chore. Maybe I can put the configuration file on a network drive? I've never had much luck with that and I've had enough rejection for one day.


View the original article here

Cheap office 365 pricing

Alls you gotta do is . . . .

Become a registered partner or higher, then join Online Services as a competency, then Cloud Services, and you get Office 365 free for a year; after that it's half price if you don't meet the sales quota.


View the original article here

Slow People Picker

So maybe you have a slow people picker?  We did.  The network guy said “Let’s check which ports are being blocked.”  I thought “well, it’s working so I don’t know what good that’s going to do?  It’s just super slow.” but I bit my tongue and went through the motions.

We saw that port 3268 was being blocked from the SharePoint servers to AD. We unblocked it and the people picker ran like the wind. I was amazed. Now that we knew the port number things became easier to find. First thing I did was tweet about it. Then we went and did some research.

Our infrastructure was a little weird, they are . . . let’s say, firewall friendly.  Between the 3 servers in the SharePoint farm there are 3 firewalls and then between all that and AD there’s one going out and one before active directory.  The outbound firewall rule was prohibiting 3268 from going out.

Last night I had an interview and it came up, and I could hardly remember the port, so I thought I’d blog about it. An hour ago I sat down to write this post and couldn’t find my tweet. I never knew how hard it was to find old tweets. Good thing for me I have cool friends who know stuff. Dan Usher sent me this link http://support.microsoft.com/kb/832017 and I saw the port number and it jogged my memory.

The long and the short of it is that LDAP on port 389 is not as graceful at searching as the global catalog on port 3268. If you have a slow people picker, try to find someone while watching the traffic and you’ll see a wealth of information.
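
If you suspect the same problem, a quick connectivity check from one of the SharePoint servers is easy to script. A minimal sketch; the domain controller name is a placeholder:

using System;
using System.Net.Sockets;

class PortCheck
{
    static void Main()
    {
        // 389 = plain LDAP, 3268 = global catalog
        foreach (int port in new[] { 389, 3268 })
        {
            try
            {
                using (var client = new TcpClient("dc01.example.com", port)) // placeholder DC
                    Console.WriteLine("Port {0} open", port);
            }
            catch (SocketException)
            {
                Console.WriteLine("Port {0} blocked", port);
            }
        }
    }
}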


View the original article here

Ferrari Case Study

I just read the Ferrari case study and was surprised by this statement, “Site traffic has increased by 237 percent, with a 150 percent increase in unique visitors.” SharePoint by itself doesn’t increase traffic.  Understanding how search engines look at your site and being able to manage the content search engines look at increases traffic. 

The article did state that it’s now easier to manage content than it was in their previous Java-based version. It almost sounded like they had to have a developer make all content changes. I’m not a Java fan, but I do think this is a cheap shot at a whole platform. There are plenty of bad content management solutions in .NET technologies too. The increased traffic was a combination of a few things.

When the content, meta tags, titles and other asset areas are easy to manage it’s easier to make the search engines like the site.  It’s also important to know what they like.  So it’s important to know what to do and to have the ability to do it, and this is what a properly implemented SharePoint brings to the table.  Believe me you can set up SharePoint so that it doesn’t cater to search engines at all.  It’s just content, like anything else it can be managed poorly or it can be managed well.

These numbers do seem modest, so I tend to believe they aren’t from a marketing campaign (because even a bad one would more than double your traffic) and are really from search engine optimization, but it’s hard to know that for sure.

The bottom line is that the case study didn’t clearly point out why traffic increased. I think they are trying to say that because the content was more easily managed they were better able to optimize the content for higher ranks on the search engines.


View the original article here

Lessons Learned from Launch of Intranet on SharePoint 2007

This post provides some lessons learned from the launch of our Corporate Intranet.  After about two weeks of poor performance and stability issues we stabilized the site and resolved most of the issues.  The lessons learned here are common and I'm sure our team was not the first (nor the last) to make these mistakes. 

Our Corporate Intranet supports about 21 Business Area / Business Units (BA/BUs).  When I say Intranet I am referring to a content publishing web site that provides announcements, latest news, corporate policies and other information that is important for employees to consider.  It is not a place for employees to collaborate as teams, this is done by another set of SharePoint 2007 sites.

The Intranet is hosted on something we call the Common Web Platform (CWP).  What makes it common is it is one set of features / functionality that powers Intranet, Extranet and Internet publishing sites.

In 2007 I started working on a project to upgrade CWP from its current infrastructure (SharePoint 2003 / MS Content Management Server 2002) to SharePoint 2007.  The first major component to rollout under the new SharePoint 2007 version of CWP is the Intranet site.

Our intranet is not small. It contains approximately 67,000 webs, 650,000 documents and 70,000 web pages. The business requirements for sharing content between BA/BUs led us to determine that putting all this content in one site collection was the best choice. I still believe this was the right decision, but it did cause us to create a SharePoint content DB that is around 330 GB.

Launch day was actually quite calm from my perspective. Yes, we had a large site, but I felt we had done an excellent job with performance testing, so launch would actually go quite smoothly. I do not want to go into specifics, but the performance testing we had done showed that the new SharePoint 2007 site would be able to scale about 3X higher in number of page views and users than the existing platform.

Everything wasn't perfect, in fact far from it.  We had quite a number of lingering issues from content migration.  We also had some application bugs that just would not go away.  But everyone agreed that these could be solved so we decided to go forward with the launch.

So at approximately noon Eastern US time on March 25th, 2009 we had the DNS team flip the switch and all traffic rolled off the old environment and onto the new. It was one of the smoothest cutovers I have ever been associated with; I even heard some people saying that they did not know we had flipped the switch.

The next morning as Europe came online the proverbial $hit hit the fan.  I'm not going to go into the blow by blow details, but I will say a dedicated team of engineers that wanted nothing more than to see this new platform succeed went to work along with MS Premier Support.  On Tuesday April 7th the task force was closed as everyone agreed that while the new Intranet had some problems it was stable and performance was acceptable to end users.

This was a really tough one to troubleshoot.  The thing that made it tough was just inconsistency with the crashes.  We could never tie it back to one specific event or one set of clear patterns.  The only consistency was the fact that it crashed during peak traffic loads (from 2 AM - 9 AM Eastern US time).  The Intranet availability dipped to about 60% during these two weeks.

As I stated after about two weeks of pure hell we got things stable.  During that time we did a lot of analysis and a few changes.  So in no particular order here is the things we changed and why and what I personally learned.

One of our field controls makes calls to a web service that in turn makes calls to a database to retrieve some data. It is pretty basic stuff. Well, to make a long story short, the web service ended up in our SharePoint solution package, and our field control ended up making a call back into the same Application Domain to pull data from a database. Yes, I know, not a very smart thing to do.

Anyway, during the performance testing we put together specific KPIs to watch for this web service. We saw no major problems with it, but put it on a list of things to change once the application went into maintenance mode.

While we never linked any outages specifically to calls to this web service, we did see a major improvement in overall stability when the web service was moved to a separate application pool.

So the lesson learned is to keep the Application Pools that host SharePoint sites dedicated to SharePoint sites (do not have those Application Pools host non SharePoint IIS Sites).

One of the problems that definitely caused outages was table locks at the SQL Server level.  We traced the table locks back to SQL that was being generated by a CAML query we used to show documents associated with a given web page. 

SQL will lock a table if it thinks a query will return more than 5000 rows. So it is very important that you set a row limit when using SPQuery and CrossListQueryCache objects. When SharePoint generates the SQL for CrossListQuery it will set a default row limit of 2 million items. I’m not sure if it does the same thing for SPQuery, but better safe than sorry.

So the lesson learned here is always set a row limit that is less than 5000 when using SPQuery and CrossListQueryCache. 
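
For illustration, a minimal sketch of what that looks like with SPQuery (the list name and CAML filter are placeholders):

using Microsoft.SharePoint;

static SPListItemCollection GetRelatedDocuments(SPWeb web)
{
    SPQuery query = new SPQuery();
    query.Query = "<Where><Eq><FieldRef Name='PageId'/>"   // hypothetical filter field
                + "<Value Type='Text'>123</Value></Eq></Where>";
    // Keep the limit well below 5000 to stay clear of SQL lock escalation
    query.RowLimit = 1000;
    return web.Lists["Documents"].GetItems(query);         // placeholder list name
}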

The CAML query referenced in item 2 above was using the FileRef field to filter the result list. Unfortunately FileRef is a special field inside of SharePoint, meaning it doesn’t lend itself to being indexed (see Index List Field). So the SQL queries that were generated from the CAML were doing full table scans, which is another big performance hit and can cause unwanted database locks.

So in the end we abandoned using a CAML query to get the documents and instead pulled them with the SPWeb.GetListItem method. At first there was a huge debate on our team, because fundamentally it is better to reduce communication with the DB. So we were going from essentially one call to the DB to two calls per file in our document list field control (note: SPWeb.GetListItem results in at least 2 calls to the DB, one to get the list field info and one to pull the list item data).

Our control has a limit of 200 documents that can be displayed.  So we knew the maximum number of times we would call GetListItem per page would be 200.  We also knew that the average number of documents per page was 3.  So most pages had very few documents to display.

Our team is looking at alternate approaches.  One idea is to add a field to each document that has a GUID.  Then index that field and go back to doing queries using that new field.  We have a lot of testing to do before we make a decision to go in that direction.

So the lesson learned was: do not write CAML queries that use FileRef as the primary field to filter the results.

Okay, this one requires a separate blog post. I promise to post a blog entry with this information very soon. In the meantime I can say that the key mistake made with performance testing was not taking into consideration user sessions and think times. We had the right URLs (we took these straight from the production machine's logs), but we ran them through too fast, which created a situation where requests used output-cached versions of the pages that under normal load would not have come from the cache.

Granted, we had a rough launch because the performance testing did not catch the critical application issues. I do not want to leave people with the impression that everything we did was wrong. Our team did a lot of stuff right, and often these things get forgotten when things go wrong. So here is a short list of the things we did right:

We used 64 bit hardware for all our servers (SQL and Web Front Ends).
We used the caching options with Publishing sites effectively (Output Cache, Object Cache and BLOB Cache).
We discovered a major memory leak in our code with performance testing and fixed it before going live.
We put together a well defined set of Solutions and Features for our application (so we can deploy easily).
We created a team of people that have some really deep knowledge on building SharePoint Publishing sites.

MSDN: Best Practices: Common Coding Issues When Using the SharePoint Object Model

Microsoft TechNet: Tune Web server performance (Office SharePoint Server)

SharePoint for End Users: Manage large SharePoint lists for better performance

Reza Alirezaei’s Blog: 20 key Points Arising, or Inferred, From “Working with large lists in MOSS 2007” Paper


View the original article here

Managing Key-Value Pair Settings in SharePoint

Recently my team discussed various ways to manage application configuration settings in SharePoint.  We wanted to avoid using Web.Config files for Key-Value Pair data because of the complexity of managing this data in SharePoint.

SharePoint contains a decent API (SPWebConfigModification) for managing web.config settings across the farm. It works pretty well, but we have had some trouble with it in certain situations. If you are interested in a good article on SPWebConfigModification, check out this one by Mark Wagner.

Thankfully the folks on the SharePoint development team delivered just what we needed to have a simple solution for managing key-value pair settings.  It is the SPPropertyBag class.

SharePoint provides a properties collection for SPFarm, SPWebApplication, SPWebService and SPWeb objects (none for SPSite, but site settings can be stored @ SPSite.RootWeb).

The properties collections for SPFarm, SPWebApplication and SPWebService are not really based on the SPPropertyBag class. Their properties collection comes from the base class SPPersistedObject and is actually a Hashtable, but it works exactly the same way as SPPropertyBag.

It is really easy to manage the properties settings.  Here is some sample code to update settings for SPFarm, SPWebApplication or SPWebService.

public static void SetPropertyValue(SPPersistedObject spObject, string name, string value)
{
    if (spObject.Properties.ContainsKey(name))
    {
        // Overwrite the existing entry
        spObject.Properties[name] = value;
    }
    else
    {
        // First use of this key
        spObject.Properties.Add(name, value);
    }
    // Persist the change to the configuration database
    spObject.Update();
}

The SPWeb class actually uses SPPropertyBag for its properties.  The code to manage SPWeb.Properties is similar.


public static void SetPropertyValue(SPWeb webSite, string name, string value)
{
    if (webSite.Properties.ContainsKey(name))
    {
        webSite.Properties[name] = value;
    }
    else
    {
        webSite.Properties.Add(name, value);
    }
    // Note: Update() is called on the property bag itself, not on the SPWeb
    webSite.Properties.Update();
}
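
For completeness, a quick usage sketch of both helpers (the key names and URL are made up; SPFarm lives in Microsoft.SharePoint.Administration):

public static void Example()
{
    // Farm-scoped setting (works because SPFarm derives from SPPersistedObject)
    SetPropertyValue(SPFarm.Local, "MyApp.SmtpServer", "smtp.example.com");
    string smtp = SPFarm.Local.Properties["MyApp.SmtpServer"] as string;

    // Web-scoped setting
    using (SPSite site = new SPSite("http://intranet")) // placeholder URL
    using (SPWeb web = site.OpenWeb())
    {
        SetPropertyValue(web, "MyApp.Theme", "Blue");
        string theme = web.Properties["MyApp.Theme"];
    }
}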

View the original article here

Sunday, April 15, 2012

Internet Explorer Discussion Toolbar and SharePoint Publishing Sites.

The Internet Explorer Discussion Toolbar will probe your web site to see if it is using SharePoint (or FrontPage Server Extensions). If it finds that you are using SharePoint, it will enable the toolbar's discussion feature, which will most likely result in an Access Denied or some other error message for the user.

While this is a very minor thing, you may want to consider blocking access to the URLs the Internet Explorer Discussion Toolbar uses to determine whether a site is using SharePoint. This can easily be done by using an ISAPI filter to block traffic to /_vti_bin/owssvr.dll and /MSOffice/cltreq.asp.

Recently my team launched some public facing SharePoint Publishing Sites and discovered a small issue with the Internet Explorer Discussion Toolbar. When we browsed our guest (anonymous) access URL we would be prompted for a login. We were only seeing it from certain test clients using Internet Explorer. By installing Fiddler on one of the test clients we could quickly see traffic going to /_vti_bin/owssvr.dll, which would return an HTTP 401 message indicating that the client was not authorized.

Below is some sample traffic I collected using Fiddler and http://sharepoint.microsoft.com/. As you can see the 11th request (3rd line below) is a call to /_vti_bin/owssvr.dll.

Fiddler Traffic Capture

With a little trial and error we were able to quickly figure out that this toolbar was generating those requests to /_vti_bin/owssvr.dll. I’m not an Internet Explorer Discussion Toolbar expert, but it appears to send that request every time a request is made to the server.

If it receives a 200 then it enables discussions for the page. Below is a screenshot from http://sharepoint.microsoft.com/. The Discussion Toolbar is enabled and ready to go.

discussion toolbar enabled

Just because the toolbar is enabled does not mean people will be able to attach comments to your web pages. I tried this and discovered that the toolbar will fail with an Access Denied error since it is trying to write to the SharePoint site collection.

During the testing I discovered that if the request to /_vti_bin/owssvr.dll fails then the toolbar will display a message stating that discussions are not allowed for this page. Below is a screenshot of the discussion toolbar disabled.

discussion toolbar disabled

To stop this activity we used ISAPI_Rewrite to deny all requests going to /_vti_bin/owssvr.dll and /MSOffice/cltreq.asp through the IIS sites that support browsing (we have a separate IIS site for content editing). We did NOT want to block traffic to /_vti_bin/owssvr.dll through our editing site because we were concerned it would break some of the Office client integration features.
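
We used ISAPI_Rewrite, but if you would rather stay in managed code, roughly the same blocking can be done with a small HttpModule registered on the browsing sites. A sketch, not what we actually deployed:

using System;
using System.Web;

public class BlockDiscussionProbesModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += delegate(object sender, EventArgs e)
        {
            string path = app.Request.Path.ToLowerInvariant();
            // The two URLs the Discussion Toolbar probes
            if (path.StartsWith("/_vti_bin/owssvr.dll") ||
                path.StartsWith("/msoffice/cltreq.asp"))
            {
                app.Response.StatusCode = 404;
                app.CompleteRequest();
            }
        };
    }

    public void Dispose() { }
}

// Register it in the browsing site's web.config, e.g.
// <httpModules><add name="BlockDiscussionProbes" type="BlockDiscussionProbesModule"/></httpModules>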


View the original article here

FieldID class

I came across a very useful class in the Publishing library.

Microsoft.SharePoint.Publishing.FieldId

The FieldId class contains a list of commonly used field IDs. I like it because it gives me an easy way to reference fields without having to worry about Internal Name vs. Display Name.

Since this is in the Publishing library you will have to have SharePoint 2007 to take advantage of it.
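
A quick usage sketch (assuming a publishing page list item):

using Microsoft.SharePoint;
using Microsoft.SharePoint.Publishing;

static string GetPageContent(SPListItem pageItem)
{
    // The SPListItem indexer accepts the field's Guid directly,
    // so there is no internal vs. display name lookup to worry about
    return pageItem[FieldId.PublishingPageContent] as string;
}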



View the original article here

Exciting year to be in IT

I’ve been in IT for a little while now and I have to say that this is the most excited I’ve been since 2000. Why? Well, because of all the great stuff Microsoft plans to ship this year.

Here is a short list of what has me so excited.

I know I’m late to the Silverlight party since a lot of people felt like version 3 was a good product for developers. Well, I’m late on purpose. I remember taking a look at Silverlight 1 and 2 and thinking, hmmmm, I wonder where this will go. Nothing there for me to go back to the business and say we have to take a hard look at this now. With Silverlight 3 I finally started to see some real potential for the business, but I wanted to see if the adoption rate would be good enough.

Now with Silverlight 4 getting ready to ship I finally feel comfortable standing up and saying let’s take a hard look at Silverlight for business application development.

I consider this to really be the 3rd major release of the .Net Framework stack.  I guess I’m most excited about the new parallel features that are coming with this version of the framework.  But, what is more important is the fact that the framework continues to grow and get better. 

I believe MVC has shown that it is here to stay.  The latest improvements in MVC 2 have really addressed some of the rough edges that were in the MVC 1 release. 

I see some debates raging about MVC vs. WebForms. Frankly I think the debates are a little silly as each technology has its niche. It reminds me of the old VB vs. C++ debates for doing Windows Forms development. Although MVC web development is nowhere near as complicated as building Windows Forms applications in C++ :D.

Just noticed that @scottgu published an article about this very subject as I am finishing up this blog posting. Isn’t it ironic. :D

Visual Studio 2010, all I can say is wow.  Some people will think I’m full of it because on the surface it does not look like Visual Studio 2010 has a lot of improvements.  I agree that a lot of the improvements are in specific areas (ex. SharePoint development), but the new Extension Manager model should not be overlooked.

I’ve seen some work coming out of the SharePoint camps that are taking advantage of the new Extension Manager.  One great example is the work being done by Waldek Mastykarz.

The other thing that really has me excited is the new enhancements inside of Visual Studio 2010 Team System. Last week I watched a Channel 9 video about the new Test Lab Manager. The more I learn about these “little” enhancements the more I can envision software development teams increasing productivity and quality. Good stuff if you are a manager of a software development team.

I sort of saved the best for last in this case.  While SharePoint 2010 will not be taking advantage of a lot of the new technology from Microsoft (MVC 2, .Net 4) there are some new features coming that make development a much better experience.

The new SharePoint Tools for SharePoint 2010 are great.  While there is still room for improvement these show that Microsoft got the message about development experience with SharePoint.

I’m also really excited about the new client object model.  This makes connecting to SharePoint data from AJAX, Javascript and Silverlight a palatable experience. 

I am also really excited about the Developer Dashboard technology. I got to see this very early on and I almost made a mess in my pants. The reason is that I had just finished going through a painstaking process of “find the bottleneck” with SharePoint.

Finally I’m pumped about the new Services architecture.  This is the only major architecture change I can see in SharePoint 2010 (perhaps I am being myopic).  This is a good thing as I think the upgrade from 2003 to 2007 was a lot to chew on.  Anyway the new Services architecture shows a lot of promise for building new extensions to SharePoint.  As soon as I saw the new model I thought of 2 new services that could add value to anyone running Publishing sites. 

The team at Microsoft is getting ready to ship a lot of products this year.  Big hats off to everyone involved. 


View the original article here

More Lessons learned from Performance Testing SharePoint

Performance testing with SharePoint, or any web-based application, can be quite tricky. Recently my team launched an upgraded Corporate Web Site based on SharePoint 2007. The launch was quite challenging, mainly due to mistakes made during performance testing (see Lessons Learned from Intranet Launch).

This post is dedicated to the lessons learned from the performance testing of Corporate Web Site. 

Prior to launch we ran through our performance test scenarios 3 times. Each time the output showed that we could scale way beyond the existing implementation of our Corporate Web Site (referred to as Violin from here on).

The performance test scenarios had been chosen based on traffic patterns and pages determined to be high risk for performance (This was good). 

Our key performance requirements stated that the web servers must support 38 page views / sec with response time < 5 sec (This was good).  This is a nice well defined requirement, although some could argue that 38 page views needs to be broken down into specific types of pages (ex. 10 home page views, 7 chapter page views, …). 

We also had a performance goal stating that processor utilization should not go above 80% on web servers for more than 5 seconds (This was good).

For the final test we replayed traffic from IIS logs that were taken during the peak traffic window (when we received the most requests / sec). This was a bit tricky because my Load Runner resource told me that this was not supported by Load Runner. So he and I had to massage the data inside the IIS logs so that Load Runner would support running the tests (this felt wrong at the time, but I cannot say if it was a mistake).

We used Load Runner (sorry I do not know version) for all of the performance tests.  The Load Runner clients were located within the same data center as our web servers, but they were on different network segments.

When we ran the tests we engaged several people from operations team (Network, Windows Server, SQL Server DBA and SharePoint Admin).  These people were tasked with monitoring components related to their area of expertise.  They were also required to collect performance statistics and report those back so they could be included in overall performance test report (This was good).

So each time we ran the tests we were able to reach levels of about 90 page views / sec on one server with avg. response time < 5 seconds (we have 4 load-balanced WFEs in our farm). So we were high-fiving and slapping each other on the back. As far as we were concerned, performance requirements were met; check them off, we are done.

We did notice an occasional spike in CPU, but we were able to correlate this back to pages expiring in the output cache. So this was not a concern.

Well once we went live we discovered that something was gravely wrong.

After going live we discovered that the output cache hit ratio was not aligned with the numbers we were seeing during performance testing. We were getting a LOT fewer output cache hits. This resulted in the servers having to do a lot more work than originally anticipated.

What could have happened? We thought we did everything right with the performance tests.  What went wrong?

Well, after much soul searching (and re-reading the basics of performance testing) it hit me.

Oh $hit we didn’t model user variations and think times. 

Yeah, it matters: we ran a high number of requests, but the proportion of cached requests vs. un-cached requests was out of balance. Had we taken into consideration user think times and other variations (browser type, user location), we would have had fewer hits against the output cache.

Classic 101 Performance Testing Mistake.  Oh well, you pick yourself up, dust yourself off and vow not to make the same mistake again.

User think times are critical when doing performance testing (especially for web applications that rely on ASP.Net Output Caching to meet performance goals).
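To make the idea concrete, here is a minimal C# sketch of pacing a simulated user with a randomized think time.  This is illustrative only, not our Load Runner scripts; the URL and the 8-12 second think-time window are made-up values.

using System;
using System.Net;
using System.Threading;

class ThinkTimeSketch
{
    static readonly Random Rng = new Random();

    // Simulates one user viewing a page several times with a pause between
    // requests.  Without the pause, requests arrive back-to-back, pages stay
    // hot in the output cache, and the measured cache hit ratio ends up much
    // higher than real traffic would produce.
    static void SimulateUser(string url, int pageViews)
    {
        using (WebClient client = new WebClient())
        {
            for (int i = 0; i < pageViews; i++)
            {
                client.DownloadString(url);
                Thread.Sleep(TimeSpan.FromSeconds(8 + Rng.Next(0, 5))); // 8-12 sec think time
            }
        }
    }

    static void Main()
    {
        // Hypothetical page; substitute a real URL when experimenting.
        SimulateUser("http://intranet.example.com/Pages/Default.aspx", 10);
    }
}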

Just as important as think times, you need to look at the IIS logs (or your web analytics reports) to understand browser and locale differences.  This is extremely critical if your Output Cache is configured to vary by these differences, because each variation is effectively a separate, un-cached page request.
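Under the covers SharePoint output caching is ASP.NET output caching, and browser variation is exactly the kind of thing that multiplies cache entries.  As a plain ASP.NET illustration (SharePoint configures this through cache profiles rather than page directives):

<%-- Illustrative ASP.NET page directive, not a SharePoint cache profile.
     VaryByCustom="browser" is built into ASP.NET and keeps a separate
     cached copy per browser name and major version, so every distinct
     browser in your test mix becomes a potential cache miss. --%>
<%@ OutputCache Duration="60" VaryByParam="None" VaryByCustom="browser" %>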

One last variable is where the test traffic comes from.  While this is not as important as think times and end-user variations, it is important if you are doing performance testing through a load balancer configured with session affinity. 

All of the tests we ran looked like they were coming from 2 IPs.  While I cannot prove this invalidated the test results, it looks like there was some sort of caching efficiency realized somewhere in the stack (Switch, NIC, IIS, …). 

References

Microsoft Patterns and Practices: Performance Testing Guidance for Web Applications

Microsoft Office Server Online: Configure page output cache settings

MSDN: Output Caching and Cache Profiles


View the original article here

Missing Method Exception (or why my Unit Tests would not work with my SharePoint project)

I'm using some test driven development concepts on a little SharePoint development project.

I noticed that I kept running into a really strange error every time I added a new Method to my SharePoint project.

Whenever I would run my new Unit Test I would get a System.MissingMethodException error saying that the method in my SharePoint project could not be found.

Turns out that the problem was caused by the fact that the Unit Test project was loading my SharePoint assembly from the GAC.  So to get the tests to work I had to install the latest version of my SharePoint assembly into the GAC and then close and reopen Visual Studio.

I like to take very small steps when I develop (add some functionality, test it, add some functionality, test it).  So the idea of constantly updating the GAC and closing Visual Studio was not appealing.

I considered removing my SharePoint assembly from the GAC, but I put it in there for some legitimate reasons.

After much trial and error I found a solution that works okay (but is not ideal).  I configured my Unit Test project to install the SharePoint dll in the GAC whenever the project is compiled.  And I run the Unit Tests in Debug mode.  With this setup I am able to get instant feedback from my changes.

As I said it is not ideal, but works.

To install my SharePoint dll in the GAC I added a Post Build event to my Unit Test project.  It just runs the gacutil command.
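For reference, the post-build event amounts to a one-liner along these lines (the gacutil path varies by machine and SDK version, and the project path and dll name here are placeholders):

rem Post-build event: force-install the freshly built assembly into the GAC.
rem /i = install, /f = overwrite an existing copy with the same version.
"C:\Program Files\Microsoft SDKs\Windows\v6.0A\bin\gacutil.exe" /if "$(SolutionDir)MySharePointProject\bin\Debug\MySharePointProject.dll"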

To run my Unit Tests in debug mode I launch my tests using the Visual Studio Debug option rather than Run option.

[Screenshot: Visual Studio 2008 Test project, launching the unit tests in Debug mode]

I am using Visual Studio 2008 Test project for my Unit Tests.


View the original article here

Load Testing SharePoint (Lessons Learned) (Part 2)

In part 1 I discussed the setup of my load testing environment and tests.  In part 2 I want to focus on the tests runs and what configuration I had to make to get them to work.

Bottlenecks oh Bottlenecks, wherefore art thou Bottlenecks

Right out of the gate I ran into trouble.  My tests showed a bottleneck at 4 requests per second.  The CPU was running @ 25 - 35% no matter what user load VSTS used (remember I was using goal based testing that would keep loading users until the goal was met).  I tried several different tests and they all had the same result.  So I knew there was a bottleneck somewhere.

I started by looking at the network.  Specifically I focused on the virtual machine's network adapter.  I was worried that there was some sort of VMware configuration problem.  To test the network I used a file copy test (I uploaded and downloaded a large file to and from the web server).  The test showed that the network was working just fine.

Then it hit me: the web server was configured to support only Integrated Windows authentication (NTLM).  So I configured SharePoint (and IIS) to support Anonymous authentication.  Bam, the bottleneck was gone.

Size Matters

So I ran my first set of tests.  Unfortunately I noted a fairly significant performance difference between the out-of-the-box Article page and our custom page.

Next I used the SharePoint Test Data Population Tool to create a site collection that contained 22,000 sites and about 50,000 pages.

Then I ran the tests again with some very surprising results.  The out-of-the-box Article page went from 114 requests per second to 56 requests per second.  That's right, throughput was cut nearly in half just due to the size of the site collection. 

Our custom pages did not experience the same percentage of slowdown (actually their performance improved, but that was due to improvements we made to the code).

Summing Up

Performance testing SharePoint is a tedious but necessary task.  Here are my lessons learned from the exercise.

1. Plan, Plan, Plan

Establish the test goals

Determine which tests you need to run

Determine which tools you will use

Determine the setup/configuration of your SharePoint and Load test environments.

2. Dry run the tests (leave plenty of time to work through issues)

3. Don't test with an empty site collection


View the original article here

Saturday, April 14, 2012

Creating a Feature to update the Web.Config


I just wrapped up a feature that will update the web.config with custom settings. I did it by creating a feature receiver that loads web.config modifications from an xml configuration file and then applies them using the SPWebConfigModification class.

The overall process is straightforward, but there are a few subtle nuances to SPWebConfigModification that can make it tricky.

Below is some sample code from my Feature. The ApplyWebConfigChange method is responsible for creating (or updating) an entry in the web.config.

private void ApplyWebConfigChange(SPWebApplication webApp, string ConfigurationChangeIdentifier, string ConfigurationName, string ConfigurationXPath, string ConfigurationValue)
{
    // Reference the web application's collection of web.config modifications.
    Collection<SPWebConfigModification> modsCollection = webApp.WebConfigModifications;

    // LookupModificationEntry is a helper (not shown) that finds an existing
    // modification by owner, name and path.
    SPWebConfigModification configMod = LookupModificationEntry(modsCollection, ConfigurationChangeIdentifier, ConfigurationName, ConfigurationXPath);
    if (configMod == null)
    {
        configMod = new SPWebConfigModification(ConfigurationName, ConfigurationXPath);
        configMod.Owner = ConfigurationChangeIdentifier;
        configMod.Type = SPWebConfigModification.SPWebConfigModificationType.EnsureChildNode;
        modsCollection.Add(configMod);
    }

    configMod.Value = ConfigurationValue;

    // Push the change out to the web.config on every web front end.
    webApp.Farm.Services.GetValue<SPWebService>().ApplyWebConfigModifications();
    webApp.Update();
}


The first step is to reference the collection that contains a list of SPWebConfigModification objects. This collection is accessed via the SPWebApplication.WebConfigModifications property.

The next step is to see if the change already exists in the WebConfigModifications collections. If it does then all I want to do is update it. If it is not there then I want to create it.

When creating new entries it is extremely important to make sure the SPWebConfigModification.Name is unique within the SPWebConfigModification.Path. If the name is not unique within the context of the path then you could end up with duplicate entries and you will not be able to remove the entry with SPWebConfigModification.

Tip: Get very comfortable with XPath since both the Path property and Name property have to be valid XPath syntax.
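For example, an appSettings entry might look like this (the key and value are purely illustrative); note how the Name is an XPath predicate that uniquely identifies the node within the Path:

// Illustrative values only.
SPWebConfigModification mod = new SPWebConfigModification(
    "add[@key='CompanyName']",     // Name: XPath that is unique within the path
    "configuration/appSettings");  // Path: XPath of the parent node
mod.Owner = "MyFeature.CompanyNameSetting";
mod.Type = SPWebConfigModification.SPWebConfigModificationType.EnsureChildNode;
mod.Value = "<add key=\"CompanyName\" value=\"Contoso\" />";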


Make sure the Owner property is set to a unique value for your change. Later I will provide a quick and easy way to remove changes from the SPWebConfigModification collection. My technique assumes that the Owner is unique for each modification.

Tip: Always set the Type property to SPWebConfigModification.SPWebConfigModificationType.EnsureChildNode. The two other options, "EnsureAttribute" and "EnsureSection", will permanently change the web.config (as in you can never remove the items; even if you manually remove them they will return).


The last step is to call the SPWebService.ApplyWebConfigModifications and SPWebApplication.Update. Since my web application is deployed across multiple web front ends I need to call the ApplyWebConfigModifications. This routine will make sure the web.config change is made on all web front ends. If I only had one web front end then I could have just called SPWebApplication.Update.
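To tie it together, here is a sketch (placeholder identifiers, not my production code) of how ApplyWebConfigChange and the RemoveWebConfigChange routine shown below might be wired into a web-application-scoped feature receiver:

using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

// Assumes ApplyWebConfigChange and RemoveWebConfigChange (shown in this
// post) are members of this class.  All names and values are placeholders.
public class WebConfigFeatureReceiver : SPFeatureReceiver
{
    public override void FeatureActivated(SPFeatureReceiverProperties properties)
    {
        SPWebApplication webApp = (SPWebApplication)properties.Feature.Parent;
        ApplyWebConfigChange(
            webApp,
            "MyFeature.CompanyNameSetting",                   // owner / change identifier
            "add[@key='CompanyName']",                        // name (XPath predicate)
            "configuration/appSettings",                      // path (parent node)
            "<add key=\"CompanyName\" value=\"Contoso\" />"); // value
    }

    public override void FeatureDeactivating(SPFeatureReceiverProperties properties)
    {
        RemoveWebConfigChange(
            (SPWebApplication)properties.Feature.Parent,
            "MyFeature.CompanyNameSetting");
    }

    // WSS 3.0 declares these as abstract, so they must be overridden even if empty.
    public override void FeatureInstalled(SPFeatureReceiverProperties properties) { }
    public override void FeatureUninstalling(SPFeatureReceiverProperties properties) { }
}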

Removing Web.Config Changes

Removing entries is really simple if you can assume that the Owner property is unique per change. In the routine below you can see that I go through each item in the WebConfigModifications collection. When I find an entry with the Owner set to a certain value I remove it.

private void RemoveWebConfigChange(SPWebApplication webApp, string ConfigurationChangeIdentifier)
{
    Collection<SPWebConfigModification> modsCollection = webApp.WebConfigModifications;

    SPWebConfigModification deleteModification = null;

    // Find the modification whose Owner matches the change identifier.
    foreach (SPWebConfigModification mod in modsCollection)
    {
        if (mod.Owner == ConfigurationChangeIdentifier)
        {
            deleteModification = mod;
            break;
        }
    }

    if (deleteModification != null)
    {
        modsCollection.Remove(deleteModification);
        // Re-apply so the removal is propagated to all web front ends.
        webApp.Farm.Services.GetValue<SPWebService>().ApplyWebConfigModifications();
        webApp.Update();
    }
}


Here is an article that covers some of the problems with SPWebConfigModification in more detail.

I learned a lot of great information about SPWebConfigModification here.


View the original article here

Big Thanks to Office / SharePoint Teams for TAP Airlift

I had the privilege of attending the Office 14 TAP Air Lift this week in Seattle.  This is my second time coming out to a SharePoint Air Lift and I must say they never disappoint.  While I cannot share any information from the Air Lift, I can say these are exciting times to be working with SharePoint. 

I really want to extend a big thank you to Microsoft for hosting a great event.  The folks in Office / SharePoint development teams have some big deadlines in front of them.  For them to take time out of their busy schedules to spend some one on one time with customers says a LOT. 


View the original article here

Adventures in extending the Publishing.Fields.LinkValue class

Recently I was working on a SharePoint application that required me to extend the out-of-box Publishing Link field.  In the end I found an acceptable solution, but it took a little while to get there.

The first thing I tried was creating a new link field value class that extended the LinkFieldValue class.  The LinkFieldValue class is not sealed so I thought I would be able to work with it.

LinkFieldValue inherits from the Microsoft.SharePoint.Publishing.Fields.HtmlTagValue class.  HtmlTagValue is more or less a dictionary that provides a structured way to create an HTML tag, and if you look at the guts of LinkFieldValue you will see that HtmlTagValue does all the heavy lifting; the LinkFieldValue properties call back into the HtmlTagValue.Item property. 

Unfortunately HtmlTagValue has only two public routines, and all of the routines I needed are marked Internal (or private).  So that pretty much ended my idea of inheriting from LinkFieldValue. 

Next I decided to store the link data in a SPFieldMultiColumnValue field.  I have created plenty of these in the past so I knew exactly what to do.

Everything was great until I ran a test for Link fix-up.

In case you did not know WSS has built-in functionality that supports fixing site collection links that get broken because the WSS object (image, document, publishing page) is moved. 

Turns out WSS link fix-up only works for certain field types (and Text is not one of them).  It also requires that the link be stored as an anchor tag.  So that pretty much ended my idea of using the SPFieldMultiColumnValue field.

Using XML to store the Link data

Finally I decided to store the link using the same technique as the Summary Link control.  The Summary Link control provides a way of storing a variable number of links in a publishing field.  It does this by serializing the data into XHTML and storing it in a rich text field.

First I created a base field class that would act as a parent for any field that I wanted to store as XHTML. 

public class XmlAsHtmlField : SPFieldMultiLineText
{
    public XmlAsHtmlField(SPFieldCollection fields, string fieldName)
        : base(fields, fieldName) { }

    public XmlAsHtmlField(SPFieldCollection fields, string typeName, string displayName)
        : base(fields, typeName, displayName)
    {
        // Store the serialized XHTML in a full-HTML rich text field.
        base.RichText = true;
        base.SetRichTextMode(SPRichTextMode.FullHtml);
    }
}

Next I created an XmlAsHtmlValue class that would act as a base class for all field values that use the XmlAsHtmlField class.  The base field value class implements a dictionary that provides a mechanism for all child classes to store their property values.  The class assumes that any properties stored in the dictionary translate their value into XHTML.


public class XmlAsHtmlValue
{
    private Dictionary<string, XmlStorageItem> storageItems = new Dictionary<string, XmlStorageItem>();

    /// <summary>
    /// Initializes the XmlAsHtmlValue class
    /// </summary>
    public XmlAsHtmlValue()
    {
    }

    /// <summary>
    /// Initializes the XmlAsHtmlValue class and populates the Dictionary
    /// based on the provided XML
    /// </summary>
    /// <param name="HtmlValue">The serialized XHTML stored in the field</param>
    public XmlAsHtmlValue(string HtmlValue)
    {
        // SharePoint strips quotes out of the stored HTML; put them back
        // before parsing (FixQuotesInXML is a helper discussed below).
        HtmlValue = Utility.FixQuotesInXML(HtmlValue);

        System.Xml.XmlDocument xmlDocument = new XmlDocument();
        xmlDocument.LoadXml(HtmlValue);

        // Each stored property is serialized as a nested div: the id
        // attribute holds the key and the class attribute holds the type.
        XmlNodeList storageItemNodes = xmlDocument.SelectNodes("/div/div");
        foreach (XmlNode storageItemNode in storageItemNodes)
        {
            XmlAttribute classValue = storageItemNode.Attributes["class"];
            string id = storageItemNode.Attributes["id"].Value;
            XmlStorageItemType itemType = (XmlStorageItemType)System.Enum.Parse(typeof(XmlStorageItemType), classValue.Value);
            storageItems.Add(id, new XmlStorageItem(itemType, storageItemNode.InnerXml, id));
        }
    }

    /// <summary>
    /// Returns the dictionary that is used to store properties
    /// </summary>
    public Dictionary<string, XmlStorageItem> StorageItems
    {
        get { return storageItems; }
    }

    /// <summary>
    /// Creates an XML representation of the Dictionary
    /// </summary>
    public virtual string ToXml()
    {
        StringBuilder returnValue = new StringBuilder();
        returnValue.Append("<div>");

        foreach (KeyValuePair<string, XmlStorageItem> item in storageItems)
        {
            XmlStorageItem storageItem = item.Value;
            returnValue.AppendFormat("<div id=\"{0}\" class=\"{1}\">{2}</div>",
                storageItem.ID, storageItem.StorageItemType, storageItem.StorageItemValue);
        }
        returnValue.Append("</div>");

        return returnValue.ToString();
    }

    public override string ToString()
    {
        return this.ToXml();
    }
}

One really ugly problem I ran into was the fact that Microsoft will strip out double and single quotes from the HTML.  I am not sure why this is done, but I had to create a routine to put the quotes back in.
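A minimal sketch of such a fix-up routine, assuming the quotes around attribute values are simply dropped (e.g. <div id=link1 class=Hyperlink>), might look like this; the assumption may not match every case SharePoint produces:

using System.Text.RegularExpressions;

static class Utility
{
    // Hypothetical sketch only: re-quote attribute values that have had
    // their quotes stripped, so the string parses as XML again.
    public static string FixQuotesInXML(string html)
    {
        return Regex.Replace(html, @"(\w+)=([^\s>""']+)", "$1=\"$2\"");
    }
}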

Next I created an extended link value class that used the out-of-box link field value plus another class I created with the new link properties.  I stored both of the classes in the XmlAsHtmlValue class.

Because I used the out-of-box Link Value class I ran into some problems when I created unit tests.  When I tried to generate the tests in Visual Studio.Net 2008 I would receive the following error.

Could not resolve member reference: Microsoft.SharePoint.ISPConversionProcessor::PostProcess

To fix the problem I added the following DLL as a reference in my project: Microsoft.HtmlTrans.Interface (it is located in the GAC).


In the end I had something that would allow me to create fairly extensible field value classes.  I am not sure how well this solution scales (a topic for a future posting), and I am not sure how easy this solution will be to upgrade with future versions of SharePoint.


View the original article here

SharePoint Community Kit Wiki (or Lipstick on a Pig)

Recently the development team at work started using a Wiki to keep track of information that is important to the team.  Since our company has standardized on SharePoint for collaboration solutions we used the Wiki feature delivered by Microsoft (via the Community Kit for SharePoint).

I must say I am very unimpressed by the SharePoint Wiki.  I don't want to take anything away from the community effort that developed the Wiki template for SharePoint.  I'm sure the community has done the best they can to develop a Wiki application on top of SharePoint (although it does appear to be missing some basic functionality one would expect from a Wiki: categories, page discussions).  Perhaps SharePoint is not the best platform to host a Wiki.

Here is a short list of my main concerns with using the SharePoint Wiki.

1. Content is stored in a single list

As most know, SharePoint starts to perform badly whenever a list contains more than 1000 items (the exact number is debated; what is not debated is that SharePoint performance degrades as list size grows).

The Wiki appears to store all pages directly in the root of a document library.  So after my Wiki has 1000 pages I will start to see a major slowdown.  I am really surprised that categories (acting as folders) were not implemented.  To me this would have been a natural way to help reduce the impact of having a large list of Wiki pages.

2. Poor Search

This is my #1 complaint with our new Wiki.  In my opinion a Wiki is only as good as its search engine, and unfortunately SharePoint search is still not as good as it could be. 

Conclusion

When I compare the SharePoint Wiki's features and functionality to something like ScrewTurn I am left with one thought: the SharePoint Wiki sucks.


View the original article here