Azure Function using a Managed Identity to call SharePoint Online

This post will detail how to use a Managed Identity from an Azure Function to make calls to the SharePoint Online REST API.

Azure Managed Identities offer a way to allow Azure Services to “self authenticate” against other Azure Services. I tend to think of them as “service accounts in the cloud”.

1. The first step is to create an Azure Function app; let’s call ours testmitosharepoint.

2. Next we click Platform features->Identity, stay on the default System assigned tab, switch the status to On and click the Save button:

3. You’ll see a prompt explaining that you are creating an object in Azure Active Directory; click the Yes button:

4. You’ll then see the object id of the service principal that has just been created. Copy this id as we’ll need it later when we add the permissions.

5. At this point we have configured our Function App with an identity in Azure AD, and we can use this identity to retrieve a bearer token to pass on our calls to the SharePoint Online REST API.

6. The Managed Identity is basically an Azure AD Service Principal. We can see the details of this Service Principal (and modify it) in various ways. Using the Microsoft Graph Explorer is one such way. Browse to https://aka.ms/ge and sign in (if this is the first time you’ve used the GE then you’ll have to consent to the app).

7. Once signed in we want to use the servicePrincipals endpoint (at the time of writing part of the beta API):

https://docs.microsoft.com/en-us/graph/api/serviceprincipal-list?view=graph-rest-beta

8. Note that to call this API you might need to grant yourself further permissions:


9. We can retrieve the details of the Managed Identity Service Principal using the following query (substitute the object id you copied above at step 4):

https://graph.microsoft.com/beta/serviceprincipals/29f7ec8c-07eb-4c94-983b-acd806e910c7

You should see the details of your service principal in the results:

{
    "@odata.context": "https://graph.microsoft.com/beta/$metadata#servicePrincipals/$entity",
    "id": "29f7ec8c-07eb-4c94-983b-acd806e910c7",
    "deletedDateTime": null,
    "accountEnabled": true,
    "appDisplayName": null,
    "appId": "ee26a3db-e89c-440e-94a7-1ba20b58380f",
    "appOwnerOrganizationId": null,
    "appRoleAssignmentRequired": false,
    "displayName": "testmitosharepoint",
    "errorUrl": null,
    "homepage": null,
    "info": null,
    "logoutUrl": null,
    "publishedPermissionScopes": [],
    "preferredTokenSigningKeyThumbprint": null,
    "publisherName": null,
    "replyUrls": [],
    "samlMetadataUrl": null,
    "servicePrincipalNames": [
        "ee26a3db-e89c-440e-94a7-1ba20b58380f",
        "https://identity.azure.net/nFoc2cyj+BaqZHpNGXdazrwSbFP50bLwojc994uTgLI="
    ],
    "signInAudience": null,
    "tags": [],
    "addIns": [],
    "appRoles": [],
    "keyCredentials": [
        {
            "customKeyIdentifier": "1EEBD51734B62D816F3839CD214A525B0F39DE35",
            "endDateTime": "2019-06-15T19:03:00Z",
            "keyId": "2b3761fc-1e0b-4816-b955-e45245bc9aef",
            "startDateTime": "2019-03-17T19:03:00Z",
            "type": "AsymmetricX509Cert",
            "usage": "Verify",
            "key": null,
            "displayName": "CN=/subscriptions/06c69857-8582-4e2f-9a83-6d2840327e73/resourcegroups/testmitosharepointrg"
        }
    ],
    "passwordCredentials": []
}

Of particular note above is that this service principal has an X509 certificate credential. This certificate is managed by Azure: we don’t have to worry about it expiring or being trusted, as Azure takes care of rolling it over for us. This matters because to make App Only calls to the SharePoint Online REST API we need to present certificate credentials.

10. Let’s check this service principal using the AzureAD PowerShell cmdlets. First log in to Azure AD from PowerShell and then run the following:

Get-AzureADServicePrincipal -SearchString "testmitosharepoint"


11. So we can see the service principal from the Graph Explorer and we can also see the Service Principal from AzureAD Powershell. But what we want to do is not just see the service principal, we also want to add the appropriate App Only permissions so it can be used to call the SharePoint Online REST API. First we need to see the details of what App Only permissions are available.

12. We want to get the list of App Only roles that are available for SharePoint Online. SharePoint Online in fact has its own service principal. We can get the basic details of this by running the following PowerShell (make a note of the ObjectId as we’ll be using it later on as the ResourceId parameter we pass to the New-AzureADServiceAppRoleAssignment cmdlet):

Get-AzureADServicePrincipal -SearchString "Office 365 SharePoint"

13. Next we want to see all of the App Only roles that this service principal offers. We can do that from PowerShell by running the following:

Get-AzureADServicePrincipal -SearchString "Office 365 SharePoint" | %{$_.AppRoles} | fl

This will present us with the following output:

AllowedMemberTypes : {Application}
Description        : Allows the app to read user profiles without a signed in user.
DisplayName        : Read user profiles
Id                 : df021288-bdef-4463-88db-98f22de89214
IsEnabled          : True
Value              : User.Read.All

AllowedMemberTypes : {Application}
Description        : Allows the app to read and update user profiles and to read basic site info without a signed in user.
DisplayName        : Read and write user profiles
Id                 : 741f803b-c850-494e-b5df-cde7c675a1ca
IsEnabled          : True
Value              : User.ReadWrite.All

AllowedMemberTypes : {Application}
Description        : Allows the app to write enterprise managed metadata and to read basic site info without a signed in user.
DisplayName        : Read and write managed metadata
Id                 : c8e3537c-ec53-43b9-bed3-b2bd3617ae97
IsEnabled          : True
Value              : TermStore.ReadWrite.All

AllowedMemberTypes : {Application}
Description        : Allows the app to read enterprise managed metadata and to read basic site info without a signed in user.
DisplayName        : Read managed metadata
Id                 : 2a8d57a5-4090-4a41-bf1c-3c621d2ccad3
IsEnabled          : True
Value              : TermStore.Read.All

AllowedMemberTypes : {Application}
Description        : Allows the app to read, create, update, and delete document libraries and lists in all site collections without a signed in user.
DisplayName        : Read and write items and lists in all site collections
Id                 : 9bff6588-13f2-4c48-bbf2-ddab62256b36
IsEnabled          : True
Value              : Sites.Manage.All

AllowedMemberTypes : {Application}
Description        : Allows the app to have full control of all site collections without a signed in user.
DisplayName        : Have full control of all site collections
Id                 : 678536fe-1083-478a-9c59-b99265e6b0d3
IsEnabled          : True
Value              : Sites.FullControl.All

AllowedMemberTypes : {Application}
Description        : Allows the app to read documents and list items in all site collections without a signed in user.
DisplayName        : Read items in all site collections
Id                 : d13f72ca-a275-4b96-b789-48ebcc4da984
IsEnabled          : True
Value              : Sites.Read.All

AllowedMemberTypes : {Application}
Description        : Allows the app to create, read, update, and delete documents and list items in all site collections without a signed in user.
DisplayName        : Read and write items in all site collections
Id                 : fbcd29d2-fcca-4405-aded-518d457caae4
IsEnabled          : True
Value              : Sites.ReadWrite.All

14. We now need to select the Application Role we want to assign to our Managed Identity service principal and make a note of it. I want to grant my “testmitosharepoint” principal the App Only role Sites.Read.All, so from the above I need to make a note of the Id d13f72ca-a275-4b96-b789-48ebcc4da984 as this will be passed to the New-AzureADServiceAppRoleAssignment cmdlet as the Id parameter.

15. So we now have the details of our service principal “testmitosharepoint”, the details of the SharePoint Online service principal, and the details of the App Only role we want to grant. Next we call the New-AzureADServiceAppRoleAssignment cmdlet (for both the ObjectId and the PrincipalId we pass our Managed Identity service principal ObjectId):

New-AzureADServiceAppRoleAssignment -ObjectId 29f7ec8c-07eb-4c94-983b-acd806e910c7 -PrincipalId 29f7ec8c-07eb-4c94-983b-acd806e910c7 -ResourceId 60ce75ac-957c-4f93-b382-d7f85cbbe649 -Id d13f72ca-a275-4b96-b789-48ebcc4da984

Note: You will see an error at this point, but it appears that the operation succeeds. See the following StackOverflow post for more details:

https://stackoverflow.com/questions/52557766/assigning-microsoft-graph-permissions-to-azure-managed-service-identity?noredirect=1&lq=1

Update 19th March 2019: Arturo Lucatero (Senior Program Manager at Microsoft) tweeted a follow-up on this error.

16. To check that the permissions have been added we can run the following PowerShell:

Get-AzureADServiceAppRoleAssignedTo -ObjectId 29f7ec8c-07eb-4c94-983b-acd806e910c7

17. Now that we have granted the correct permissions on the service principal, we can return to the Azure Portal and add an Azure Function that will use the Managed Identity’s token to call the SharePoint Online REST API.

18. Back in the Azure Portal, browse to the testmitosharepoint function app and add a new function using the defaults of HTTP trigger and an authorization level of Function (the authorization level doesn’t matter for now as we are just testing; of course you’d want to configure this to what you require).

19. Add the following two lines of code immediately under the first log.LogInformation line, save and run the function and inspect the output:

log.LogInformation("MSI_ENDPOINT: " + System.Environment.GetEnvironmentVariable("MSI_ENDPOINT"));
log.LogInformation("MSI_SECRET: " + System.Environment.GetEnvironmentVariable("MSI_SECRET"));

20. You should see something like the following output:

21. What you are seeing are the “local” endpoint details that the underlying infrastructure puts in place when you configure your Function to use a Managed Identity. Note that what we get is basically an HTTP endpoint listening on 127.0.0.1 on some port, local to the App Service the Function is hosted on. If you had instead configured a Managed Identity on a VM, you could write a simple command line app and make the exact same call, as the VM would “host” this local endpoint and have the environment variables configured in the same way.

22. Next we will add the following code to the function immediately under the two log.LogInformation lines we just added (make sure you enter your tenant name):

HttpClient client = new HttpClient();
string sharePointResourceId = "https://<yourtenant>.sharepoint.com";
string apiVersion = "2017-09-01";
client.DefaultRequestHeaders.Add("Secret", System.Environment.GetEnvironmentVariable("MSI_SECRET"));
var tokenResponse = await client.GetAsync(String.Format("{0}/?resource={1}&api-version={2}", System.Environment.GetEnvironmentVariable("MSI_ENDPOINT"), sharePointResourceId, apiVersion));
var rawContent = await tokenResponse.Content.ReadAsStringAsync();
log.LogInformation("MI Response: " + rawContent);
dynamic managedIdentityApiResponseData = JsonConvert.DeserializeObject(rawContent);
string managedIdentityBearerToken = managedIdentityApiResponseData?.access_token;
log.LogInformation("Managed Identity Access Token: " + managedIdentityBearerToken);

23. What we “should” see is an output similar to the below:

2019-03-18T17:35:35.458 [Information] Script for function 'HttpTrigger1' changed. Reloading.
2019-03-18T17:35:35.541 [Information] Compilation succeeded.
2019-03-18T17:35:36.264 [Information] Executing 'Functions.HttpTrigger1' (Reason='This function was programmatically called via the host APIs.', Id=416bacbe-9ec9-4246-8ed4-f53803cb7d4a)
2019-03-18T17:35:36.404 [Information] C# HTTP trigger function processed a request.
2019-03-18T17:35:36.404 [Information] MSI_ENDPOINT: http://127.0.0.1:41031/MSI/token/
2019-03-18T17:35:36.404 [Information] MSI_SECRET: DA23480116914E918792DD4A6CDC16B0
2019-03-18T17:35:36.421 [Information] MI Response: {"access_token":"eyJ<truncated for brevity>ubVHVxsw","expires_on":"3/19/2019 1:25:16 AM +00:00","resource":"https://finarne.sharepoint.com","token_type":"Bearer"}
2019-03-18T17:35:36.429 [Information] Managed Identity Access Token: eyJ<truncated for brevity>ubVHVxsw
2019-03-18T17:35:36.432 [Information] Executed 'Functions.HttpTrigger1' (Succeeded, Id=416bacbe-9ec9-4246-8ed4-f53803cb7d4a)

24. We can see the raw output from the Managed Identity “local” API – a JSON object that contains various properties of our service principal token (formatted for clarity):

{
    "access_token": "eyJ<truncated for brevity>ubVHVxsw",
    "expires_on": "3/19/2019 1:25:16 AM +00:00",
    "resource": "https://<yourtenant>.sharepoint.com",
    "token_type": "Bearer"
}

25. We have also stored the access_token value in the variable managedIdentityBearerToken; we will use this value when we call the SharePoint Online REST API. Before we do that, one way to “check” this token is to browse to https://jwt.ms and paste in the access_token value:

26. We can also click the Claims tab to see additional details (just showing the basic details here, lots more if you scroll down the page):
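If you’d rather inspect the claims in code than in the browser, here is a minimal sketch that decodes the token payload (the middle, base64url encoded segment of the JWT) from within the function – note it performs no signature validation and is for inspection only:

string[] tokenParts = managedIdentityBearerToken.Split('.');
string payload = tokenParts[1].Replace('-', '+').Replace('_', '/');
// base64url strips padding, so add it back before decoding
payload = payload.PadRight(payload.Length + (4 - payload.Length % 4) % 4, '=');
string claimsJson = System.Text.Encoding.UTF8.GetString(Convert.FromBase64String(payload));
log.LogInformation("Token claims: " + claimsJson);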

27. Now we can go back to our Azure Function and test calling into the SharePoint Online REST API. Add the following code immediately under the log.LogInformation("Managed Identity Access Token: " + managedIdentityBearerToken); line (we are simply calling back to the root site on the tenant and getting the details of the _api/web call):

string sharePointOnlineRESTAPIEndPoint = "https://<yourtenant>.sharepoint.com/_api/web";
HttpClient sharePointRESTAPIClient = new HttpClient();
sharePointRESTAPIClient.DefaultRequestHeaders.Add("Authorization", "Bearer " + managedIdentityBearerToken);
sharePointRESTAPIClient.DefaultRequestHeaders.Add("Accept", "application/json");
var sharePointRESTAPIResponse = await sharePointRESTAPIClient.GetAsync(sharePointOnlineRESTAPIEndPoint);
var sharePointRESTAPIRawContent = await sharePointRESTAPIResponse.Content.ReadAsStringAsync();
log.LogInformation("SharePoint REST API Response: " + sharePointRESTAPIRawContent);

28. Save and run the code, you should see output from the SharePoint REST API.
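As an aside, if you want to pull a specific value out of that response, a minimal sketch (assuming the plain application/json response shape, where the web properties sit at the top level of the JSON) would be:

dynamic webData = JsonConvert.DeserializeObject(sharePointRESTAPIRawContent);
log.LogInformation("Web title: " + webData?.Title);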

29. If you want to add other API permissions to the Managed Identity (for example to call back to the Microsoft Graph endpoints) you can use the PowerShell in steps 12 & 13 to discover what App Only roles are available; all you need to do is change the -SearchString parameter from “Office 365 SharePoint” to “Microsoft Graph”:

30. You will see a LOT of results, but the same principles apply. You choose the App Only permissions you want, then using the details in steps 14 & 15 you grant the permissions. When you request the Managed Identity access token at step 22 you would pass in “https://graph.microsoft.com” as the resource parameter:

HttpClient client = new HttpClient();
string microsoftGraphResourceId = "https://graph.microsoft.com";
string apiVersion = "2017-09-01";
client.DefaultRequestHeaders.Add("Secret", System.Environment.GetEnvironmentVariable("MSI_SECRET"));
var tokenResponse = await client.GetAsync(String.Format("{0}/?resource={1}&api-version={2}", System.Environment.GetEnvironmentVariable("MSI_ENDPOINT"), microsoftGraphResourceId, apiVersion));

31. You can repeat the same steps to examine the token on the https://jwt.ms website, and you can then make calls to the Microsoft Graph the same way you called the SharePoint Online REST API; all you need to do is change the HttpClient calls to use the correct endpoint, as in the sketch below.
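A minimal sketch (assuming you granted a Graph App Only role such as User.Read.All using the earlier steps, and requested the token with the Graph resource id as above):

HttpClient graphClient = new HttpClient();
graphClient.DefaultRequestHeaders.Add("Authorization", "Bearer " + managedIdentityBearerToken);
var graphResponse = await graphClient.GetAsync("https://graph.microsoft.com/v1.0/users");
log.LogInformation("Graph response: " + await graphResponse.Content.ReadAsStringAsync());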

Summary
This post has looked at how you can add App Only permissions to a Managed Identity service principal using the most basic steps. There are SDKs that help with calling the “local” endpoint and with calling the SharePoint Online REST API; all I wanted to do here was look at doing all of this at the most basic level.

YMMV

Spfx 1.6 AadHttpClient OAuth Permissions fun – part 1 – prerequisites

Article series
  1. Spfx 1.6 AadHttpClient OAuth Permissions fun – part 1 – prerequisites (this article)
  2. Spfx 1.6 AadHttpClient OAuth Permissions fun – part 2 – Sample webpart (coming soon)
  3. Spfx 1.6 AadHttpClient OAuth Permissions fun – part 3 – Call Azure Function (coming soon)
  4. Spfx 1.6 AadHttpClient OAuth Permissions fun – part 4 – Differences between SharePoint Admin API Management and Azure AD App Reg blade (coming soon)
  5. Spfx 1.6 AadHttpClient OAuth Permissions fun – part 5 – What should be fixed (coming soon)
This is the first in a series of articles that will do a deep dive into the SharePoint Spfx configuration that allows an spfx webpart to call backend APIs such as the Microsoft Graph in a seamless manner using Azure AD authentication via the OAuth2 Implicit Flow.

This article series was inspired by a troubling issue I encountered as I was trying to grok the SharePoint Admin API management screen. I started a dialogue on this issue on the sp-dev-docs github site here:

https://github.com/SharePoint/sp-dev-docs/issues/2516

This first article checks the prerequisites, ensuring the spfx 1.6 service principal is correctly configured on your tenant.

The SharePoint Framework 1.6 is the version that made the AadHttpClient (amongst other goodies) GA (Generally Available).

With this release it becomes almost trivial to write spfx webparts that make calls to the Microsoft Graph or other REST APIs such as Azure Functions.

The technical details that make this functionality work all seem (on the surface) to be almost magical, but it is important to understand what is configured at a low level in SharePoint and Azure AD to make it work, as there are inconsistencies between the various admin screens in SharePoint and Azure.

This is all best explained by following a set of steps; in this first article we will check that the prerequisites are in place on your tenant.

The most fundamental check of whether your tenant has the necessary functionality in place is to see if the spfx 1.6 service principal has been deployed to it. The spfx 1.6 service principal has a display name of SharePoint Online Client Extensibility Web Application Principal.

If you have the AzureAD PowerShell module you can check by running the following script:

Connect-AzureAD
Get-AzureADServicePrincipal -SearchString "SharePoint Online Client Extensibility Web Application Principal"

If you see a result then you have the service principal already provisioned.
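Alternatively you can look the service principal up via the Microsoft Graph Explorer servicePrincipals endpoint (part of the beta API at the time of writing) with a filter query along these lines:

https://graph.microsoft.com/beta/servicePrincipals?$filter=displayName eq 'SharePoint Online Client Extensibility Web Application Principal'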

Alternatively you can browse to https://portal.azure.com and then:
  1. Click on Azure Active Directory
  2. Click on App registrations
  3. Click on View all applications
If you see the following you’re good to go. If you don’t see the spfx 1.6 service principal then you’ll need to visit the preview SharePoint admin API management page.

To do this:
  1. Browse to https://<yourtenant>-admin.sharepoint.com
  2. Click the Try the preview button
  3. Click the API management menu option on the left.
At this point you “should” see a message that tells you the SharePoint Framework is coming soon…even if you don’t, just wait a few minutes and then run the above PowerShell and/or navigate to the Azure Portal App registrations blade.

Now that you know you’ve got the correct spfx service principal configured on your tenant there is one additional step that you need to execute (note at the time of writing this additional step is supposedly going to be fixed soon so you may well not have to do this).

Go back to the Azure Portal https://portal.azure.com
  1. Click on Azure Active Directory
  2. Click on App registrations
  3. Click on View all applications
  4. Click on SharePoint Online Client Extensibility Web Application Principal
  5. Click on Settings
  6. Click on Required permissions
At this point you “should” see something like the following:
If you see this then your tenant should have the required permissions assigned to the spfx service principal…but the permissions might not yet be granted (again at the time of writing all of these steps had to be done manually).

To double-check that the spfx service principal has the required permissions granted you can go to the SharePoint admin screen and click the API management screen, where you should see the following. If you see the Approved entry for Windows Azure Active Directory then you are good to go.

Alternatively if you have the Microsoft.Online.SharePoint.PowerShell module installed you can run the following PowerShell:

Connect-SPOService -Url https://<yourtenant>-admin.sharepoint.com
Get-SPOTenantServicePrincipalPermissionGrants
If you see results similar to the following you should be good to go:
If at this point you don’t see the granted permissions then you can do the following back in the Azure Portal:
  1. Click the Grant permissions button
  2. Click the Yes button when prompted
  3. Wait for the Azure portal to report a success message, and then go back to the SharePoint Admin API management screen (refresh it) and you should now see the above permission.
At this stage you should be happy that you can now test out creating a simple spfx webpart that calls the Microsoft Graph. We’ll use a sample app to test this all out and then check what low level configuration has been written to the Azure AD backend by running PowerShell, using the Microsoft Graph Explorer, the SharePoint admin API management screen, and the Azure AD App registrations blade.

Until next time.

403 Forbidden from /_api/contextinfo when using Chrome Postman REST App

tl;dr
The Postman app was sending an Origin header to /_api/contextinfo and that was generating a 403 Forbidden. Using a Fiddler rule I removed the Origin HTTP header, and the call to the /_api/contextinfo endpoint then worked.

I’ve recently been trying to grok the SharePoint Online REST API, particularly executing requests that require an HTTP POST and therefore an X-RequestDigest header.

To help me understand the interaction with the REST API I installed the Google Chrome Postman REST App and started to test.

If you open Google Chrome, log in to your Office 365 site, then launch Postman, requests to the REST API will be sent with the appropriate FedAuth and rtFa cookies to authenticate.

So all good then, I was able to execute simple GET requests such as https://*myo365site*/_api/web and get back results as expected.

I wanted to start experimenting with the REST APIs for custom permissions that are “documented” here:

https://msdn.microsoft.com/en-us/library/office/dn495392.aspx

So the first thing I needed to do was get a Request Digest to add as a header to my POST requests from Postman. Of course to get a Request Digest you need to have issued a POST request – but for POST requests you need an X-RequestDigest header – chicken & egg.

The process to follow is to issue a POST request to the https://*myo365site*/_api/contextinfo endpoint with an empty body and two headers:

Accept: application/json;odata=verbose
Content-Length: 0

This “should” return a 200 status code and a body such as the following:


{
    "d": {
        "GetContextWebInformation": {
            "__metadata": {
                "type": "SP.ContextWebInformation"
            },
            "FormDigestTimeoutSeconds": 1800,
            "FormDigestValue": "<FORMDIGESTVALUE>",
            "LibraryVersion": "16.0.4107.1226",
            "SiteFullUrl": "https://*myo365site*",
            "SupportedSchemaVersions": {
                "__metadata": {
                    "type": "Collection(Edm.String)"
                },
                "results": [
                    "14.0.0.0",
                    "15.0.0.0"
                ]
            },
            "WebFullUrl": "https://*myo365site*"
        }
    }
}

Instead I was getting a 403 Forbidden status code with no body. I then started up Fiddler to see what was being sent through by Postman. Postman was sending through a few extra headers:

POST https://*myo365site*/_api/contextinfo HTTP/1.1
Host: *myo365site*
Connection: keep-alive
Content-Length: 0
Accept: application/json;odata=verbose
Origin: chrome-extension://fdmmgilgnpjigdojojpjoooidkmcomcm
CSP: active
User-Agent: Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/43.0.2357.81 Safari/537.36
Accept-Encoding: gzip, deflate
Accept-Language: en-GB,en-US;q=0.8,en;q=0.6
Cookie: WSS_FullScreenMode=false; rtFa=*cookievalue*; FedAuth=*cookievalue*

I copied all of the above headers into Fiddler’s Compose tab, then started to remove the additional headers one by one to see if it made any difference.

When I removed the Origin: chrome-extension://fdmmgilgnpjigdojojpjoooidkmcomcm header, the request succeeded!

I then added a rule to Fiddler in the static function OnBeforeRequest(oSession: Session) method:

oSession.RequestHeaders.Remove("origin");

And my requests from Postman to /_api/contextinfo now succeeded; I was able to obtain the FormDigestValue from the JSON response and use it as the value for the X-RequestDigest HTTP header on subsequent HTTP POST calls to REST endpoints.
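For completeness, here is a minimal C# sketch of the same dance outside Postman (it assumes cookieContainer already holds valid FedAuth and rtFa cookies for the site, and uses Newtonsoft.Json for brevity):

var handler = new HttpClientHandler { CookieContainer = cookieContainer };
var client = new HttpClient(handler);

// POST an empty body to contextinfo to obtain a request digest
var digestRequest = new HttpRequestMessage(HttpMethod.Post, "https://myo365site/_api/contextinfo");
digestRequest.Headers.Add("Accept", "application/json;odata=verbose");
digestRequest.Content = new StringContent(string.Empty); // Content-Length: 0
var digestResponse = await client.SendAsync(digestRequest);
dynamic digestData = JsonConvert.DeserializeObject(await digestResponse.Content.ReadAsStringAsync());
string formDigest = digestData.d.GetContextWebInformation.FormDigestValue;

// the digest can now be sent on subsequent POSTs
var listsRequest = new HttpRequestMessage(HttpMethod.Post, "https://myo365site/_api/web/lists");
listsRequest.Headers.Add("X-RequestDigest", formDigest);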

Now…should I be leaving this Fiddler rule in place “for all requests”? I don’t know. All I know for now is that in my specific environment this is what I was seeing, and I was able to get the call to /_api/contextinfo to execute successfully by removing the Origin header.

I suspect that once I start issuing calls that do require the Origin header (guessing here, but calls from an AppWeb to a HostWeb for example) then I’ll maybe run into issues. Need to investigate further.

YMMV

Fixing the Save site as a template error when you have provisioned custom site columns in SharePoint 2013

***UPDATE: 18/03/2015***
I’ve updated this article to discuss the specifics of when this issue can arise

I had a report of a user experiencing an error using Save site as a template on a straightforward Team Site in 2013.

tl;dr
If you provision a custom site column with the Overwrite="TRUE" attribute using SPFieldCollection.AddFieldAsXml, then the Save site as template function will fail: the underlying code that generates the wsp file adds a duplicate Overwrite="TRUE" attribute to the site column and therefore produces invalid xml. You need to update any provisioned site column’s SchemaXml property to remove the Overwrite="TRUE" attribute.

I was able to re-create the error on a test server and began the investigation.

Note: the farm was patched up to the August 2013 CU only, but I’ve had a quick check of the codebase in SP1 and it appears that the issue discussed below persists.

In ULS I saw the following error for the correlation id:

[Forced due to logging gap, Original Level: Monitorable] System.Xml.XmlException: 'Overwrite' is a duplicate attribute name. Line 1, position 327.
at System.Xml.XmlTextReaderImpl.Throw(String res, String arg, Int32 lineNo, Int32 linePos)
at System.Xml.XmlTextReaderImpl.AttributeDuplCheck()
at System.Xml.XmlTextReaderImpl.ParseAttributes()
at System.Xml.XmlTextReaderImpl.ParseElement()
at System.Xml.XmlTextReaderImpl.ParseDocumentContent()
at Microsoft.SharePoint.SPSolutionExporter.WriteXmlToWriter(XmlWriter output, String xml, Boolean skipDocumentElement, Boolean addCdata)
at Microsoft.SharePoint.SPSolutionExporter.ExportFields(SPFieldCollection fields, String partitionName)
at Microsoft.SharePoint.SPSolutionExporter.ExportListsManifest(ListInstancesExportSummaryInfo exportSummary, ModuleExportSummaryInfo moduleExportSummary, List`1 workflowContentTypes, String workflowForm, String serverRelativeworkflowForm)
at Microsoft.SharePoint.SPSolutionExporter.ExportLists()
at Microsoft.SharePoint.SPSolutionExporter.GenerateSolutionFiles()
at Microsoft.SharePoint.SPSolutionExporter.ExportWebAsSolution()

and

System.InvalidOperationException: Error generating solution files in temporary directory.
at Microsoft.SharePoint.SPSolutionExporter.ExportWebAsSolution()
at Microsoft.SharePoint.SPSolutionExporter.ExportWebToGallery(SPWeb web, String solutionFileName, String title, String description, ExportMode exportMode, Boolean includeContent, String workflowTemplateName, String destinationListUrl, Action`1 solutionPostProcessor, Boolean activateSolution)
at Microsoft.SharePoint.ApplicationPages.SaveAsTemplatePage.BtnSaveAsTemplate_Click(Object sender, EventArgs e)
at System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument)
at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)

The first clue here is that Overwrite is a duplicate attribute, together with this part of the stack trace:

at Microsoft.SharePoint.SPSolutionExporter.ExportFields(SPFieldCollection fields, String partitionName)

My first thought was: were there any custom site columns on this site? There were several. The custom site columns had been provisioned first to the farm Content Type Hub and then published to the site via a custom content type.

The custom site columns were deployed to the farm as part of a wsp, created in Visual Studio. The Visual Studio solution contained a few features that included your typical Elements.xml file containing some site columns defined using CAML.

Now if you’re like me you’ll “probably” have got into the habit of adding the Overwrite="TRUE" attribute to your site columns. This is usually so that your develop, deploy, test workflow continuously deploys the most up to date version of your custom artefacts. Also, it is a “requirement” for certain types of site columns to have Overwrite="TRUE" as part of their definition.

As an example, the following are typical blog posts stating the requirement of adding the Overwrite=”TRUE” attribute:

https://mdashiqur.wordpress.com/2014/02/09/lookup-field-as-a-site-column-using-caml-sharepoint-lookup-field/

https://andrewonsoftware.wordpress.com/2011/10/27/how-to-define-lookup-column-via-caml-in-sharepoint-2010-and-avoid-errors/

We have established that our test site has custom columns containing the Overwrite="TRUE" attribute, and that this seems to be causing an issue with Save site as a template.

Taking a step back down memory lane, what Save site as a template actually does is create a wsp containing all the artefacts required to create a “copy” of a site. This can be very useful for Site Collection owners to define what each subsite should “look like”. Save site as a template is also a useful feature for developers to learn how to “write CAML”, and in fact what a lot of developers used to do was start off any custom site definition/web template by first prototyping in the UI, using Save site as a template, downloading the wsp from the Site Collection solution gallery, and then cracking open the wsp and adding the files into Visual Studio.

The biggest issue with this prototyping approach was that the wsp generated by Save site as a template didn’t create “round-trippable” CAML definitions for fields. A developer would add all the wsp files into their solution, try to deploy it and it would “appear” to have worked but site columns such as Lookup columns wouldn’t get provisioned correctly. So “it is known” that using Save site as a template and Visual Studio together requires more tweaking of the generated files.

Now going back to the stack trace of the error in ULS, I opened ILSpy and started to work my way through the codebase of the SPSolutionExporter class, specifically the Microsoft.SharePoint.SPSolutionExporter.ExportFields(SPFieldCollection fields, String partitionName) method.

Here’s what ILSpy decompiles the method to:

// Microsoft.SharePoint.SPSolutionExporter
private void ExportFields(SPFieldCollection fields, string partitionName)
{
	if (fields.Count <= 0 || this.WorkflowExportModeIsEnabled)
	{
		ULS.SendTraceTag(894035u, ULSCat.msoulscat_WSS_SolutionExporter, ULSTraceLevel.Verbose, "There are no fields to export for partition \"{0}\" so the feature will not be included in this solution.", new object[]
		{
			partitionName
		});
		return;
	}
	SPSolutionExporter.FieldsExportSummaryInfo fieldsExportSummaryInfo = new SPSolutionExporter.FieldsExportSummaryInfo();
	foreach (SPField sPField in fields)
	{
		string title = sPField.Title;
		try
		{
			SPSolutionExporter.FieldExportSummaryInfo fieldExportSummaryInfo = SPSolutionExporter.ExportField(sPField, this.web);
			fieldsExportSummaryInfo.FieldExportSummaryInfoEntries.Add(fieldExportSummaryInfo.SortedName, fieldExportSummaryInfo);
		}
		catch (Exception ex)
		{
			string strMessage = string.Format(this.web.UICulture, SPResource.GetString("SitePackaging_ErrorExportingField", new object[0]), new object[]
			{
				title
			});
			ULS.SendTraceTag(894036u, ULSCat.msoulscat_WSS_SolutionExporter, ULSTraceLevel.Monitorable, ex.ToString());
			throw new SPException(strMessage);
		}
	}
	string text = SPSolutionExporter.ConvertWebRelativeUrlToPartitionedRelativePath("ElementsFields.xml", partitionName);
	ULS.SendTraceTag(894037u, ULSCat.msoulscat_WSS_SolutionExporter, ULSTraceLevel.Verbose, "Creating field feature manifest file '{0}'", new object[]
	{
		text
	});
	using (ScopedXmlWriter scopedXmlWriter = new ScopedXmlWriter(this.CreateXmlWriterInStagingArea(text), text))
	{
		using (new ScopedXmlWriterElement(scopedXmlWriter.Value, "", "Elements", "http://schemas.microsoft.com/sharepoint/"))
		{
			foreach (KeyValuePair<string, SPSolutionExporter.FieldExportSummaryInfo> current in fieldsExportSummaryInfo.FieldExportSummaryInfoEntries)
			{
				SPSolutionExporter.FieldExportSummaryInfo value = current.Value;
				SPSolutionExporter.WriteXmlToWriter(scopedXmlWriter.Value, value.SchemaXml);
			}
		}
	}
	string text2 = SPSolutionExporter.ConvertWebRelativeUrlToPartitionedRelativePath("Feature.xml", partitionName);
	fieldsExportSummaryInfo.FeatureFileRelativePath = text2;
	ULS.SendTraceTag(894038u, ULSCat.msoulscat_WSS_SolutionExporter, ULSTraceLevel.Verbose, "Creating fields feature file '{0}'", new object[]
	{
		text2
	});
	using (ScopedXmlWriter scopedXmlWriter2 = new ScopedXmlWriter(this.CreateXmlWriterInStagingArea(text2), text2))
	{
		using (new ScopedXmlWriterElement(scopedXmlWriter2.Value, "", "Feature", "http://schemas.microsoft.com/sharepoint/"))
		{
			SPSolutionExporter.WriteXmlAttribute(scopedXmlWriter2.Value, string.Empty, "Id", null, fieldsExportSummaryInfo.FeatureId);
			SPSolutionExporter.WriteXmlAttribute(scopedXmlWriter2.Value, string.Empty, "Title", null, "Fields feature of exported web template \"" + this.web.Title + "\"");
			SPSolutionExporter.WriteXmlAttribute(scopedXmlWriter2.Value, string.Empty, "Version", null, "1.0.0.0");
			SPSolutionExporter.WriteXmlAttribute(scopedXmlWriter2.Value, string.Empty, "Scope", null, "Web");
			SPSolutionExporter.WriteXmlAttribute(scopedXmlWriter2.Value, string.Empty, "Hidden", null, true);
			SPSolutionExporter.WriteXmlAttribute(scopedXmlWriter2.Value, string.Empty, "RequireResources", null, true);
			using (new ScopedXmlWriterElement(scopedXmlWriter2.Value, string.Empty, "ElementManifests", null))
			{
				using (new ScopedXmlWriterElement(scopedXmlWriter2.Value, string.Empty, "ElementManifest", null))
				{
					SPSolutionExporter.WriteXmlAttribute(scopedXmlWriter2.Value, string.Empty, "Location", null, Path.GetFileName(text));
				}
			}
		}
	}
	ULS.SendTraceTag(894039u, ULSCat.msoulscat_WSS_SolutionExporter, ULSTraceLevel.Verbose, "Exported {0} fields into partition \"{1}\".", new object[]
	{
		fieldsExportSummaryInfo.FieldExportSummaryInfoEntries.Count,
		partitionName
	});
}

The key part of this is the try block inside the foreach loop:

		try
		{
			SPSolutionExporter.FieldExportSummaryInfo fieldExportSummaryInfo = SPSolutionExporter.ExportField(sPField, this.web);
			fieldsExportSummaryInfo.FieldExportSummaryInfoEntries.Add(fieldExportSummaryInfo.SortedName, fieldExportSummaryInfo);
		}

This is the code that generates the xml that is eventually written to disk as an xml file. The SPSolutionExporter.ExportField method calls SPSolutionExporter.GetFieldSchemaXml, which in turn calls the SPSolutionExporter.GenerateSchemaXmlForExport method:

internal string GenerateSchemaXmlForExport(bool addOverWriteAttribute, bool removeSealedAttribute)
{
	string text = this.SchemaXml;
	text = SPUtility.RemoveXmlAttributeWithNameFromFirstNode(text, "Field", "Version");
	if (removeSealedAttribute)
	{
		text = SPUtility.RemoveXmlAttributeWithNameFromFirstNode(text, "Field", "Sealed");
	}
	if (addOverWriteAttribute)
	{
		string attributeNameValue = "Overwrite" + "=\"TRUE\"";
		text = SPUtility.AddXmlAttributeToFirstNode(text, "Field", attributeNameValue);
	}
	return text;
}

As we can see from the above, the snippet of interest is:

	if (addOverWriteAttribute)
	{
		string attributeNameValue = "Overwrite" + "=\"TRUE\"";
		text = SPUtility.AddXmlAttributeToFirstNode(text, "Field", attributeNameValue);
	}

The culprit is these two lines inside the if block. The Overwrite="TRUE" attribute is added to each field’s xml EVEN IF IT ALREADY EXISTS. I’ll say that again: even if the current field’s SchemaXml property already contains the Overwrite="TRUE" attribute, SPSolutionExporter.GenerateSchemaXmlForExport adds it in again. And this is the reason the error is generated: we have custom site columns that contain the Overwrite="TRUE" attribute, and it is these custom site columns that are causing the Save site as a template functionality to fail.

Once this code unwinds, an attempt is eventually made to write out an xml document, and the string passed to the XmlWriter contains an element with two Overwrite="TRUE" attributes and therefore fails.
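For example, a hypothetical custom column provisioned with the attribute would be exported looking something like this, which is not well formed xml:

<Field Type="Text" Name="MyColumn" DisplayName="My Column" Overwrite="TRUE" Overwrite="TRUE" />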

If we think about what is happening here, what “I think” is that Microsoft received feedback that the wsp file created using Save site as a template did not produce correct “round-trippable” xml for certain types of columns, and so they added this code to the SPSolutionExporter class to mitigate the scenario of a developer using the UI to prototype, saving the site as a template, and adding the wsp files into Visual Studio.

Unfortunately it seems that the very people they’re trying to help – developers – are most likely to be impacted by this bug, as it is developers who will be creating custom site columns in CAML (and migrations from 2010 will almost certainly have custom site columns defined via CAML).

***UPDATE: 18/03/2015***
I’ve identified that the cause of this issue is when custom columns are added using the SPFieldCollection.AddFieldAsXml method, and not as a declarative field via a feature. The codebase for SPFieldCollection.AddFieldAsXml must be taking the xml passed to it and copying it to the SPField.SchemaXml property.
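To make that concrete, here is a hypothetical provisioning call of the kind that triggers the issue (the field details are illustrative):

// CAML for a custom column carrying the Overwrite attribute
string fieldXml = "<Field ID=\"{6c1a2f3e-4b5d-4a6e-8f7a-1b2c3d4e5f60}\" Type=\"Text\" " +
    "Name=\"MyColumn\" DisplayName=\"My Column\" Group=\"Custom Columns\" Overwrite=\"TRUE\" />";

// AddFieldAsXml copies the supplied xml, Overwrite attribute included, into SPField.SchemaXml
web.Fields.AddFieldAsXml(fieldXml); // web is an SPWeb, e.g. SPContext.Current.Web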

Now that we have identified the cause, is there a solution to this issue? The first solution would be to never deploy any site columns that are defined via CAML and contain the Overwrite=”TRUE” attribute.

But that’s perhaps not a practical solution: there are already many site columns defined in this way, and it might not be feasible to re-deploy them (in particular, a lot of farms will have deployed custom content types to the content type hub, and the admins of these farms may well have scenarios where they cannot re-publish the content types).

Also, (and I’ve not tested this) it might not be possible to get custom site columns deployed if they are defined with CAML and they are of type lookup.

Going back to the codebase of SPSolutionExporter, the SPField.SchemaXml property is what is used to build up the xml that will eventually be written to disk:

internal string GenerateSchemaXmlForExport(bool addOverWriteAttribute, bool removeSealedAttribute)
{
	string text = this.SchemaXml;
	text = SPUtility.RemoveXmlAttributeWithNameFromFirstNode(text, "Field", "Version");

So is there a way for us to (1) identify the fields that will cause the error and (2) update the fields to remove it? The answers are yes and yes. The following PowerShell snippet can be used to identify fields that will cause the error:

$w = Get-SPWeb -Identity https://test.company.internal/sites/testsite
$w.Fields | ?{$_.SchemaXml -like "*Overwrite=*"} | select Title, StaticName | ft

And the following PowerShell can be used to update the SchemaXml property to “remove” the issue:

$w = Get-SPWeb -Identity https://test.company.internal/sites/testsite
$fs = $w.Fields | ?{$_.SchemaXml -like "*Overwrite=*"}
foreach($f in $fs){
    $f.Title
    $f.SchemaXml = $f.SchemaXml.replace("Overwrite=`"TRUE`"", "")
}

You can run this on an individual site to allow you to save the site as a template. If you have a content type hub, then run the snippet against the content type hub, but then (importantly) make sure you publish all content types that use the custom site columns.

YMMV

SharePoint 2013 error after creating a view

There have been several reports of users experiencing an error when creating a view on an out of the box SharePoint 2013 Team Site.

tl;dr 
Requests made via a load balancer that strips out the Accept-Encoding header cause an error creating a view on a SharePoint 2013 site with MDS activated.

My current environment is SharePoint 2013 patched up to the August 2013 CU, on a Windows 2012 server, and I too had reports from users that when they were creating a view on a document library they would see a page with the details:

Error
Cannot complete this action.
Please try again.
Troubleshoot issues with Microsoft SharePoint Foundation.
GO BACK TO SITE

The view was created successfully, but the user is presented with this error page and that is not something we want to be seeing.

Of interest here is that the url of the page ends with the following (more on this later), and that there is no correlation id presented either:

/_vti_bin/owssvr.dll?CS=65001

A SharePoint 2013 Team Site has the MDS (Minimal Download Strategy) feature enabled by default, and I had read that with the MDS feature deactivated the creation of a view would succeed. So to test, I created a simple team site and tried creating a view on the Documents library; I was presented with the error as above. I then deactivated the MDS feature on the site, tried creating the view, and this time it succeeded.

So, we have a scenario where it appears that deactivating the MDS feature removes the error from creating a view. I was not satisfied with this, as the MDS feature is not something I think we should just deactivate, and so I started down the route of seeing if there were any errors reported on the server.

In our environment we use a load balancer in front of SharePoint (F5 BigIP version 11.3.0 build 3144.0 Hotfix HF8), so the first thing I wanted to do was point my browser directly at a specific SharePoint server so I could examine the logs on that server.

So I modified my local HOSTS file to point my SharePoint hostnames directly at a server, bypassing the load balancer. And…I no longer experienced the error!

So, it would appear that the load balancer was “getting in the way” of the request to create a view. The next thing I did was fire up Fiddler and execute a request first via the load balancer (and therefore see an error) and then a request bypassing the load balancer (with no error), and examine the raw requests and responses.

The key request to examine was the POST request to the endpoint:

/_vti_bin/owssvr.dll?CS=65001

Both the request and response bodies were almost identical, but for the request via the load balancer Fiddler in fact reported a protocol violation. On further examination, the response headers from the load balancer seemed strange; the header:

Content-Length: 0

Was causing the protocol violation, as the response does in fact have a body. Of note here is that the response code with MDS activated is a 200 with a response body containing some | separated values, including the view page url to redirect to.

In the scenario mentioned earlier where we deactivate the MDS feature and create a view with no error, the Fiddler trace shows that the POST request to owssvr.dll is met with a 302 redirect response (so the code path inside owssvr.dll must choose a different response type based on the fact that it has received an MDS request). All MDS requests have an HTTP form variable:

_MINIMALDOWNLOAD=1

This must be used by the code path inside owssvr.dll to decide whether the response is a 200 with a response body, or a 302 with no response body (more on this at the end of this article).

Now, given that the response contains a Content-Length: 0 header, I’m guessing the client side code in the browser “gets its knickers in a twist”, as we would say in Scotland. And that is why the user is presented with an error: the browser then attempts to make subsequent incorrect requests (the subsequent incorrect requests are not the issue; it’s the response to the POST to owssvr.dll that is the issue).

I then went to our load balancer team and we set up a temporary load balancer configuration where the load balancer would only target one server (just to isolate the investigation of log files to one server).

On the single server that the load balancer was pointing to I started Wireshark to capture the incoming requests. I then repeated my earlier set of steps: make a request to the server directly, bypassing the load balancer (no error), and make a request via the load balancer (see an error). I then started to examine the requests as captured by Wireshark to compare both.

The difference between the requests was as follows: for the POST to owssvr.dll, the request that bypassed the load balancer had the following header:

Accept-Encoding: gzip, deflate

The request sent via the load balancer did not have this header. Of note here is that earlier, when I was examining the requests via Fiddler, the Accept-Encoding header was present in both, so it would appear that the load balancer is removing the Accept-Encoding header.

We then proceeded to look at the load balancer configuration and lo and behold there is a setting:

Keep Accept Encoding

And this setting was disabled (which, from what I understand, is the default), so the Accept-Encoding header was not being passed on by the load balancer.

We then enabled the Keep Accept Encoding setting and re-tested – and this time, creating a view with MDS enabled and going via the load balancer succeeded.

Further examination of the headers in Fiddler and Wireshark showed that the Accept-Encoding header was sent through as part of the request and that the Content-Length header was returned with the correct value.

So this would appear to be a bug in the code for owssvr.dll, specifically concerning the case where no Accept-Encoding header is present on the code path that handles MDS requests. It would appear that the owssvr.dll response to a request with no Accept-Encoding header is to send back a Content-Length: 0 header, even though it does in fact send back a response body.

At the browser end of things, I can only assume that the low level error handling in the XMLHttpRequest object (MDS makes use of XMLHttpRequest to make server side calls) has found a Content-Length: 0 header that does not match up with the fact that there is actually a response body, and it then tries to make the request again using an HTTP GET to owssvr.dll, which the server responds to with an error.

Going back to the start of the article, my current patch level is the August 2013 CU; we do have a plan to apply SP1 but just haven’t gotten around to it yet.

Would be useful to know if anyone else has experienced the same issue with SP1 applied.

YMMV

SharePoint 2013 Audit Log Trimming – remember to edit your timer jobs to fit your log retention

SharePoint 2013 offers the option of audit logs:

Configure audit settings for a site collection

One of the settings available is to trim the audit logs and optionally save the trimmed data into a library. As the above article states the default is every 30 days, but you can change the log retention to some other value.

If you do want to change your log retention to something other than the default (say 7 days) then you might find that your logs don’t appear to be getting trimmed. Make sure that you also set the schedule of the corresponding timer job to match your retention schedule.

Each web application will have its own Audit Log Trimming job. The default schedule for the timer job is monthly, so if you want to have a retention of 7 days then set the schedule of the timer jobs to weekly and test the results.
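If you would rather script the schedule change than click through Central Administration, here is a minimal server object model sketch (run on a farm server; the web application url and the schedule string are assumptions to adapt):

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

// look up the web application whose Audit Log Trimming job we want to retime
SPWebApplication webApp = SPWebApplication.Lookup(new Uri("https://intranet.company.internal"));

foreach (SPJobDefinition job in webApp.JobDefinitions)
{
    if (job.DisplayName.Contains("Audit Log Trimming"))
    {
        // align the timer job with a 7 day audit retention
        job.Schedule = SPSchedule.FromString("weekly at sat 5:00");
        job.Update();
    }
}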

YMMV

SharePoint 2013 Content Search Web Part and filtering on User Profile Properties

I am working on a new SharePoint 2013 implementation for an organisation that has a particular desire to have as much personalised content as possible on the Intranet home page. The idea is to have, say, one main area on the home page for company news, and then other areas that surface news targeted at Your Department and Your Job Title.

In order to achieve this we need a publishing site, and Content Types that have Site Columns of type Managed Metadata mapped to the People->Department and People->Job Title term sets. Then we simply add some Content Search Web Parts to the publishing site home page and configure the KQL to return only pages that have matches for the appropriate User Profile property.
Note: I’m going to assume that you’ve got the farm set up with a functioning Managed Metadata service, Search service (with continuous crawling configured), and a User Profile service that is populated with several users, all of which have values in their Department and Job Title properties.

The corresponding Managed Metadata People->Department and People->Job Title term sets should also have their values populated from the User Profile service.

In my examples below I am using Departments such as Finance and Product Development, and Job Titles such as Finance Manager, Accountant, Senior Project Manager, Project Manager and Developer. If you don’t have the exact same values in your user profiles then make sure you substitute the values in step 14 below with values from your profiles.

1. In Central Admin, create a site collection based on the Publishing Portal site definition.

2. Once the site collection is created, browse to the site.

3. Give Everyone permission to at least read the site.

4. Go to Site Actions->Site Settings->Site columns and create the following Site Columns:

Name: FilterOnDepartment
Type: Managed Metadata
Term Set: People->Department

Name: FilterOnJobTitle
Type: Managed Metadata
Term Set: People->Job Title

5. Go to Site Actions->Site Settings->Site content types and create a Content Type using the following settings:

Name: FilterOnUserProfile
Parent Content Type: Article Page (from the Page Layouts Content Type parent)

6. Add the following existing site columns to the FilterOnUserProfile content type:

FilterOnDepartment
FilterOnJobTitle

7. Go to Site Actions->Site Settings->Site libraries and lists
8. Click on Customise “Pages”
9. Click on Add from existing site content types
10. Add the following content type:

FilterOnUserProfile

11. Select Site Actions->Site Contents
12. Click on Pages
13. From the Ribbon, select the FILES tab
14. You will now add a series of pages by clicking on the New Document drop-down and selecting FilterOnUserProfile for each of the following sets of values (once you create a page you’ll be returned to the Pages library; for the page you just created select Edit Properties):

Title: Page For Department Finance
Page Layout: (Article Page) Body Only
(once page is created and you’ve clicked on Edit Properties)
Content Type: FilterOnUserProfile
Comments: This page has a FilterOnDepartment column value of Finance.
FilterOnDepartment: Finance

Title: Page For Department Product Development
Page Layout: (Article Page) Body Only
(once page is created and you’ve clicked on Edit Properties)
Content Type: FilterOnUserProfile
Comments: This page has a FilterOnDepartment column value of Product Development.
FilterOnDepartment: Product Development

Title: Page For Job Title Developer
Page Layout: (Article Page) Body Only
(once page is created and you’ve clicked on Edit Properties)
Content Type: FilterOnUserProfile
Comments: This page has a FilterOnJobTitle column value of Developer.
FilterOnJobTitle: Developer

Title: Page For Job Title Finance Manager
Page Layout: (Article Page) Body Only
(once page is created and you’ve clicked on Edit Properties)
Content Type: FilterOnUserProfile
Comments: This page has a FilterOnJobTitle column value of Finance Manager.
FilterOnJobTitle: Finance Manager

Title: Page Finance and Accountant
Page Layout: (Article Page) Body Only
(once page is created and you’ve clicked on Edit Properties)
Content Type: FilterOnUserProfile
Comments: This page has a FilterOnDepartment column value of Finance AND a FilterOnJobTitle column value of Accountant.
FilterOnDepartment: Finance
FilterOnJobTitle: Accountant

15. Once you’ve created all of the pages you need to check in and publish a major version of each page.

16. Now you’ll need to wait for search to crawl the newly added content. If you have continuous crawling then you’ll need to wait at the most 15 minutes. If you don’t have continuous crawling you’ll need to execute a full crawl.
Note: If you’re impatient and want to know if the continuous crawl has found your new pages, in Central Admin go to the Search Service application, click on Search Schema and then enter owstaxid into the Managed property filter and click the -> button. You should see your site columns.

17. Once you’re happy that the crawl has finished you now need to check that the managed properties have been created and populated with the values. Browse to the home page of your publishing site and enter the following search criteria into the Search this site box:

owstaxidFilterOnDepartment:Finance
(for each site column you created in step 4, a new managed property named owstaxid followed by the column name – e.g. owstaxidFilterOnDepartment – will have been created and populated by the crawl)

18. If you are seeing results from the previous search then you can proceed with the next steps of adding the web parts to the home page of the publishing site; if you’re not seeing results from the search then spend some time examining your search configuration and maybe do a full crawl.

19. Browse to the home page of the publishing site
20. Choose Site Actions->Edit Page
21. In the Page Content area edit to your taste (I added the text “This page will allow you to see articles based on your User Profile values for Department and Job Title.”).
22. Delete all the existing web parts from the Top Left and Top Right zones.
23. In the Top Left zone click Add a Web Part and add a Content Search web part (from the Content Rollup category).
24. For the Content Search web part you just added select the Edit Web Part menu option.
25. In the Web Part editor on the right, click the Change Query button.
26. You will now be in the Build Your Query dialogue with the BASICS tab selected. Click the Switch to Advanced Mode option.
27. In the Query text box clear all the content and add the following:
owstaxidFilterOnDepartment:{User.Department}
Note: In a production environment you’d want to add several other search criteria here, such as the content type, but for the purposes of this small demo this query text will suffice.
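For example, a more targeted query might also filter on the content type (ContentType here is one of the default managed properties; treat this as a sketch to adapt):
owstaxidFilterOnDepartment:{User.Department} ContentType:FilterOnUserProfile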

28. Click the OK button
Note: If you’re carrying out all of these steps as, say, the setup account (like me) then you might be tempted to click the Test Query button in the previous step. If you do, you’ll probably not see any results returned unless the setup account has a Department value set in its User Profile.

29. In the Web Part editor on the right, in the Display Templates section choose the Two lines option from the Item drop down.
30. Expand the Property Mappings section, select the Change the mapping of managed properties… check box and then select the CommentsOWSMTXT option from the Line 2 drop down.
31. Expand the Appearance section and enter Department News for the Title and then select the Title Only option from the Chrome Type drop-down.
32. Click the OK button.
33. Save, check-in and publish the page.
34. Now log in as a user from the Finance department and check to see if you see two articles in the Department News web part.
35. You can now add another web part to the Top Right zone; this time you’re going to set the filter for Job Title. The key difference from the previous steps is the Query Text; use the following:
owstaxidFilterOnJobTitle:{User.SPS-JobTitle}
36. Finally, add another web part to the Header zone; this web part will display news for Department OR Job Title (just to demonstrate that you can combine results). The Query Text to use is:
(owstaxidFilterOnDepartment:{User.Department} OR owstaxidFilterOnJobTitle:{User.SPS-JobTitle})
Note: The default operator is AND and so you *could* change this query to reduce the articles presented by removing the OR.

YMMV

Number Mysteries – Considering Bases Exercise

Here’s a sample C# program for working out the Considering Bases exercise:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace NumberMysteries
{
	/// <summary>
	///  Wrapper for the Number Mysteries course.
	/// </summary>
	class Program
	{
		/// <summary>
		/// Standard entry point.
		/// </summary>
		/// <param name="args">Ignored.</param>
		static void Main(string[] args)
		{
			int knutsInSickle = 29;
			int sicklesInGalleon = 17;
			int knutsInGalleon = knutsInSickle * sicklesInGalleon;

			// Answer A: 1 Galleon, 14 Sickles and 3 Knuts expressed in Knuts.
			int answerA = (1 * knutsInGalleon) + (14 * knutsInSickle) + 3;

			// Answer B: 1509 Knuts expressed as Galleons, Sickles and Knuts.
			int numGalleons = 1509 / knutsInGalleon;
			int knutsLeftOver1 = 1509 % knutsInGalleon;
			int numSickles = knutsLeftOver1 / knutsInSickle;
			int knutsLeftOver2 = knutsLeftOver1 % knutsInSickle;

			Console.WriteLine("For answer A:\nNumber of knuts:{0:d}", answerA);
			Console.WriteLine("For answer B:\nNumber of Galleons:{0:d}, Number of Sickles:{1:d}, Knuts left over:{2:d}", numGalleons, numSickles, knutsLeftOver2);
		}
	}
}


SharePoint EventReceiver AfterProperties and Date fields

I had a brief to write an EventReceiver that looked for changes to certain fields and, if changes were detected, wrote an entry to another list. Seems simple enough, but I ran into an issue with Date fields.

In the event receiver ItemUpdating event I was checking a Date field by first extracting the current date from properties.ListItem["MyDateField"] and then checking this against AfterProperties["MyDateField"]. Of course the data stored in AfterProperties["MyDateField"] is not a Date but a string.

I created a DateTime object from AfterProperties["MyDateField"] using the Convert.ToDateTime() method. All should have been fine, but it turned out the two Date values were different – even though I was not changing MyDateField during my testing, when I tested the two values for equality the comparison returned false.

The reason for this was that my current time zone is BST (British Summer Time) and therefore the two date values were one hour out from each other. The solution was to take the DateTime from the Convert.ToDateTime() method and then get *another* DateTime by calling the DateTime.ToUniversalTime() method. This made sure that the DateTime I got from Convert.ToDateTime(AfterProperties["MyDateField"]) was in the same time zone as the ListItem["MyDateField"] value.
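Putting that together, here is a minimal sketch of the comparison inside an SPItemEventReceiver subclass (the field name MyDateField is illustrative):

public override void ItemUpdating(SPItemEventProperties properties)
{
    // the value already stored on the item
    DateTime before = Convert.ToDateTime(properties.ListItem["MyDateField"]);

    // AfterProperties holds the incoming value as a string in local (web) time,
    // so normalise it to UTC before comparing
    DateTime after = Convert.ToDateTime(properties.AfterProperties["MyDateField"]).ToUniversalTime();

    if (before != after)
    {
        // the date genuinely changed; write the entry to the other list here
    }
}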

This is down to how SharePoint internally stores DateTimes; one of the CAML attributes for a Field definition pertaining to Date fields is the StorageTZ attribute, some details here:

http://msdn.microsoft.com/en-us/library/ms437580(v=office.12).aspx

I suspect that UTC storage is the default for a Date field.

I can only imagine the havoc this type of code problem could wreak if I had written this code in the depths of a December evening (when the time zone would be GMT, which matches UTC), got everything working, passed testing and gone into production. Then, come the dawn of BST on the last Sunday of March, the bug would appear.

It’s often the case that we developers outside the US complain about settings being defaulted to US based values; this is one case where the default setting matches the UK. Luckily for me it’s summer time and I picked up the issue.

A sore one indeed.