Writing to ULS from within C#

Event Receivers and Feature Receivers are notoriously hard to debug.  Here’s a little gem to ease your debugging migraines:

Here’s the function I use, just to simplify subsequent calls:

private void LogOut(TraceSeverity level, string outStr)
{
    // Pass the caller's severity through to WriteTrace so the level parameter actually takes effect
    SPDiagnosticsService.Local.WriteTrace(
        0,
        new SPDiagnosticsCategory("SuperRouter Updated", level, EventSeverity.Error),
        level,
        outStr,
        null);
}

Here’s what the call looks like. Note you can set the severity level. I sprinkle these throughout the code to dump data and trace execution:

LogOut(TraceSeverity.High, "Item Updated Event Trapped!");

 Here’s the reference to use in the code:

using Microsoft.SharePoint.Utilities;
using Microsoft.SharePoint.Administration;


If you’d like a class to reuse, here it is:

public class MyLogger
{
    public static void LogOut(TraceSeverity level, string outStr)
    {
        SPDiagnosticsService.Local.WriteTrace(
            0,
            new SPDiagnosticsCategory("JoelER", level, EventSeverity.Error),
            level,
            outStr,
            null);
    }
}

Taxonomy Internals: A rose by any other name…

The SharePoint Managed Metadata Service centrally manages Taxonomies within SharePoint.  More than one service application can exist within a farm.  Each uses the traditional service connection proxy to associate with a web application.  Each service application has its own database where terms are stored.

To navigate using the object model, first we grab a session and a termstore:

TaxonomySession session = new TaxonomySession(site); //site is an SPSite object
TermStore defaultKeywordStore = session.DefaultKeywordsTermStore;

// Get the default site collection TermStore associated with the provided site.
TermStore defaultSiteCollectionStore = session.DefaultSiteCollectionTermStore;
TermStoreCollection termStores = session.TermStores;

To get a term, we have a lot of options:

TermCollection terms = session.GetTerms(prefix, true, StringMatchOption.StartsWith, 4, true);
// parameter 2 (true): search only in default labels; false does a broader scan
// parameter 4 (4): the maximum number of terms returned from each TermStore
// parameter 5 (true): trim unavailable terms from the results

We can even search by a custom property name:

TermCollection terms = session.GetTermsWithCustomProperty(customPropertyName, true);

terms.Count will give you the total number of terms returned.  If you know you are doing a precise lookup, then the term you are looking for is found in terms[0].
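As a sketch, a precise lookup using the session object created earlier might look like this (the term label is purely illustrative):

```csharp
// Sketch: exact-match lookup for a single term.
// 'session' is the TaxonomySession created earlier; the label is illustrative.
TermCollection terms = session.GetTerms("Invoices", true, StringMatchOption.ExactMatch, 1, true);
if (terms.Count > 0)
{
    Term term = terms[0];
    Console.WriteLine(term.Name);  // the default label
    Console.WriteLine(term.Id);    // the term's GUID
}
```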

Switching to PowerShell, here’s how to add a term set:

$taxonomySession = Get-SPTaxonomySession -Site $TaxSite
$termStore = $taxonomySession.TermStores["Managed Metadata Service"]
$group = $termStore.Groups["Claims"]

$termSet = $group.TermSets | Where-Object { $_.Name -eq $termSetName }
if ($termSet -eq $null) {
    try {
        $termSet = $group.CreateTermSet($termSetName)
        Write-Host "Created $($termSetName) TermSet successfully"
    }
    catch {
        Write-Host "Whoops, could not create $($termSetName) TermSet"
    }
}

# make it available for tagging
$termSet.IsAvailableForTagging = $true
# set a description
$termSet.Description = "..."
# persist the changes
$termStore.CommitAll()

Renaming terms

There’s a reason why there’s no rename method in the taxonomy object model. Instead, there’s a way to “move” a term.  To move one term under another, make sure both are Term objects, and do:

$SourceTermObj.Move($DestTermObj)

Move() is overloaded to also allow you to move the term to the top level of a term set.

Rather than “rename” a term, instead a new label is applied to the term, and the new label can be made the default value.

In this case, I added a new label for the term, then made it the default for our language (1033):
$Lev1TermObj.CreateLabel("Joel new Term",1033,$true)

Note the UI does not let you set a new label as the default. Your only option there is to exchange the values of the two labels.

The resulting labels can then be seen in the Term Store Management Tool.


There is a Timer Job that runs hourly called the Taxonomy Update Scheduler.

This updates Site Collections with the latest term changes made to the Enterprise Metadata Service. The amazing thing is this Timer Job updates the term, even if a document is checked out.  The document remains checked out, but the term value changes.

The wonderful thing about adding a new default label rather than renaming the term is that users can find the term by searching for any of its labels, yet it is the default label that appears in the user interface.

Smart filtering in BCS

Business Connectivity Services allows the rapid creation of reference connections to live legacy data such as tables or views in SQL Server.  The wildcard filtering is great, but what if you want to customize it?

Smart MyFilterParm Filtering in BCS

Using a stored procedure requires deviating from the easy out-of-box dynamic SQL, and defining the input parameter(s).

Here’s the pseudocode T-SQL for the smart MyFilterParm filtering.  Note that first, up to 10 rows are returned based on a prefix match on a specific field, then up to 50 more generic wildcard matches are appended.  This keeps the response fast even if a user enters just the letter “a” for the search, making long-distance (inter-farm) lookups more responsive.  For the Union, note the fields from each Select need to match precisely in sequence, name, and type; ideally, return exactly the same fields and names already in use.

CREATE PROCEDURE dbo.sp_MySmartSearch
    @MyFilterParmSmart nvarchar(255) = NULL
AS
BEGIN
    SELECT TOP 10 * FROM [MyDataBase].[dbo].[CompanyView]
    WHERE MyFilterParm LIKE @MyFilterParmSmart + '%'
    UNION
    SELECT TOP 50 * FROM [MyDataBase].[dbo].[CompanyView]
    WHERE CompanyNM LIKE '%' + @MyFilterParmSmart + '%'
END

Once this Stored Procedure is written, export the BDCM (using SPD) and edit the XML to provide a hard-coded reference to the above Stored Procedure, the MyFilterParm filter parameter, and the fields returned.  The BDCM import is not done in SPD, but is instead done in Central Admin in the BCS service app config.  Here’s the XML pseudocode to replace within the Methods XML group in the BDCM:

<Method IsStatic="false" Name="NamesByWildcardProcedure">
    <Property Name="BackEndObject" Type="System.String">sp_MySmartSearch</Property>
    <Property Name="BackEndObjectType" Type="System.String">SqlServerRoutine</Property>
    <Property Name="RdbCommandText" Type="System.String">[dbo].[sp_MySmartSearch]</Property>
    <Property Name="RdbCommandType" Type="System.Data.CommandType, System.Data, Version=, Culture=neutral, PublicKeyToken=b77a5c561934e089">StoredProcedure</Property>
    <Property Name="Schema" Type="System.String">dbo</Property>
    <AccessControlEntry Principal="MYDOMAIN\administrator">
      <Right BdcRight="Edit" />
      <Right BdcRight="Execute" />
      <Right BdcRight="SetPermissions" />
      <Right BdcRight="SelectableInClients" />
    </AccessControlEntry>
    <AccessControlEntry Principal="MYDOMAIN\user1">
      <Right BdcRight="Edit" />
      <Right BdcRight="Execute" />
      <Right BdcRight="SelectableInClients" />
    </AccessControlEntry>
    <AccessControlEntry Principal="MYDOMAIN\user2">
      <Right BdcRight="Edit" />
      <Right BdcRight="Execute" />
    </AccessControlEntry>
    <FilterDescriptor Type="Wildcard" FilterField="LastName" Name="Wildcard">
        <Property Name="CaseSensitive" Type="System.Boolean">false</Property>
        <Property Name="IsDefault" Type="System.Boolean">false</Property>
        <Property Name="UsedForDisambiguation" Type="System.Boolean">false</Property>
    </FilterDescriptor>
    <Parameter Direction="In" Name="@MyFilterParmSmart">
      <TypeDescriptor TypeName="System.String" AssociatedFilter="Wildcard" Name="@MyFilterParmSmart">
          <Property Name="Order" Type="System.Int32">0</Property>
          <DefaultValue MethodInstanceName="NamesByWildcardProcedure" Type="System.String">a</DefaultValue>
      </TypeDescriptor>
    </Parameter>
</Method>

You’ll find the XML much easier to edit in Visual Studio (any version) as the nesting is a bit much to handle in Notepad.

MSDN offers a similar example of a stored procedure, in this case designed to return precisely one row.
Scot Hillier’s BCS book is also an excellent reference.


SharePoint Document IDs

The SharePoint Document ID Service is a new feature of SharePoint 2010 that offers a number of useful capabilities, but carries some limitations.  Let’s dig a bit deeper and see what it does and how it works.

One challenge for SharePoint users is that links tend to break easily. Rename a file or folder, or move the document, and a previously saved or shared link will no longer work.  By tagging a document with an ID, SharePoint can start referencing documents using this ID, even when the underlying structure beneath it has changed.  SharePoint can accept a link with this ID by referencing a dedicated page on each site that takes care of finding the document.  This page is named DocIDRedir.aspx.  Here’s what such a URL might look like (server, site, and ID are illustrative):

http://server/sites/docs/_layouts/DocIDRedir.aspx?ID=PREFIX-123-456
There’s also a Document ID web part that’s available for users to enter a Document ID.  This is used most prominently when creating a Records Center site, which is based on an out-of-box website template.

The Document ID Service is enabled at the Site Collection level, and assigns Document IDs that are unique only within the site collection.  A configurable prefix is available; assigning a unique prefix to each Site Collection ensures uniqueness across your web application and even the farm.  If you have more than one farm, it makes sense to embed a farm indicator in the prefix to ensure global uniqueness.

Setting Document ID

Once the Document ID Service is enabled, every new or edited document instantly gets a Document ID assigned.  However, historical documents do not get an immediate assignment.  Document IDs for documents uploaded before the service was enabled are assigned by a Timer Job called the “Document ID assignment job”, which exists at the Web Application level and by default runs nightly.  This is one of two jobs associated with the Document ID Service, the other being the “Document ID enable/disable job”.
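If you need the assignment to happen sooner than the nightly schedule, the job can be kicked off on demand.  Here’s a hedged PowerShell sketch; the job’s display name may differ slightly on your farm, so match loosely:

```powershell
# Sketch: run the Document ID assignment job on demand.
# Match on DisplayName, since the exact name may vary by build/language.
Get-SPTimerJob |
    Where-Object { $_.DisplayName -like "Document ID assignment*" } |
    ForEach-Object { $_.RunNow() }
```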

When the Document ID Service is enabled for a Site Collection, Event Receivers are automatically installed in each Document Library.  Actually, a set of Event Receivers is installed for each and every Content Type configured within that document library.  The Event Receiver is called “Document ID Generator” and is configured to be fired synchronously.  There is a separate Event Receiver for each of the following events:

  • ItemAdded
  • ItemUpdated
  • ItemCheckedIn
  • ItemUncheckedOut
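To verify these registrations on a given library, a quick sketch is to enumerate the list’s event receivers (the library name is hypothetical; assumes an SPWeb named web is in scope):

```csharp
// Sketch: enumerate the event receivers registered on a document library.
SPList list = web.Lists["Documents"];   // hypothetical library name
foreach (SPEventReceiverDefinition def in list.EventReceivers)
{
    // Type is the event (e.g. ItemAdded); Name is the receiver's display name
    Console.WriteLine(def.Type + " -> " + def.Name);
}
```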

Once a Document ID is assigned, it is changeable through the Object Model, although do so at your own risk.  Before the Document ID Service is enabled, the Document ID field does not exist to be assigned.  If you are migrating from a legacy system that has existing Document IDs, you can first migrate the documents, then enable the Document ID Service; this adds the internal Document ID field.  Then, before the daily Document ID assignment job runs (better yet, disable it during this process), we can programmatically assign the legacy Document ID values to the SharePoint IDs.  With the Document ID field populated, the Document ID Service will not overwrite the already-set Document IDs.
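Here’s a minimal PowerShell sketch of that migration step.  The site URL, library name, and legacy field name are all hypothetical; it assumes the Document ID Service is already enabled and the assignment job is disabled while it runs:

```powershell
# Sketch: copy legacy Document IDs into the SharePoint _dlc_DocId field.
$web = Get-SPWeb "http://server/sites/docs"   # hypothetical URL
$list = $web.Lists["Documents"]               # hypothetical library
foreach ($item in $list.Items)
{
    $legacyId = $item["LegacyDocID"]          # hypothetical legacy field
    if ($legacyId -ne $null)
    {
        $item["_dlc_DocId"] = $legacyId
        $item.SystemUpdate($false)            # no version bump, Modified/Editor unchanged
    }
}
$web.Dispose()
```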

Note that part of the Document ID Service is to redirect URLs referencing the Document ID.  It turns out that if you manually assign duplicate Document IDs (something that in theory should never occur), the daily Document ID assignment job detects this situation, and DocIDRedir.aspx redirects to a site-based search page that passes in the Document ID.

Under the covers there are three internal components to a Document ID:

  • _dlc_DocIdUrl: fully qualified URL for document referencing the DocIDRedir.aspx along with the lookup parameter
  • _dlc_DocId: The Document ID.  This is the internal property you can directly address and assign as $item[“_dlc_DocId”]
  • _dlc_DocIdItemGuid: DocID related GUID

That completes our tour of the Document ID Service.  I look forward to hearing of others’ experience with it.

Item Level permissions

SharePoint has a robust object model supporting security at each level of the farm.  Let’s take a quick tour of some relevant methods and properties around item level reporting.

All securable objects have a method named GetUserEffectivePermissionInfo, which is defined in the base class SPSecurableObject. This method returns an SPPermissionInfo object which we can use to inspect the role definition bindings and corresponding permission levels. SPSecurableObject is implemented at the SPWeb, SPList, and SPListItem class level, hence how we assign permissions, if needed, at the site level.

We can loop through the SPRoleAssignment objects via the RoleAssignments property. This tells us how the user is given access to the resource: each assignment exposes the Member (the account or group) and the RoleDefinitionBindings (the permission levels). This is an excellent place to start if you are looping through each item.

Next, we can look at the RoleDefinitionBindings property, which returns a collection of SPRoleDefinition objects that tell us about the type of access granted.

 Other important properties for reporting security include:

  • HasUniqueRoleAssignments, or the method returning the same thing: get_HasUniqueRoleAssignments()
  • RoleDefinitionBindings: returns a collection of SPRoleDefinition objects.
  • IsSiteAdmin: a property of the user that indicates whether the user is a Site Collection Admin, which includes explicit permissions to everything
  • SPListItem.FirstUniqueAncestorSecurableObject: retrieves the item itself if it has unique role assignments; otherwise, returns the first parent object (folder, list, or Web site) that has unique role assignments.
  • SPSecurableObject.AllRolesForCurrentUser
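Putting those pieces together, a reporting loop over a single item might be sketched like this (assumes an SPListItem named item is already in scope):

```csharp
// Sketch: report who has access to an item, and at what permission level.
if (item.HasUniqueRoleAssignments)
{
    foreach (SPRoleAssignment ra in item.RoleAssignments)
    {
        // Member is the user or group; RoleDefinitionBindings are the levels
        Console.Write(ra.Member.Name + ": ");
        foreach (SPRoleDefinition rd in ra.RoleDefinitionBindings)
        {
            Console.Write(rd.Name + " ");   // e.g. "Contribute", "Read"
        }
        Console.WriteLine();
    }
}
else
{
    Console.WriteLine("Permissions inherited from a parent object.");
}
```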

For a more general view of Security permissions in SharePoint, please see this TechNet article.