On the 70-488 – SharePoint Core Solutions Exam

No brain-dumps or anything here; if you want the answers, go and bloody study. However, it’s worth knowing about the structure of the exam.

Right, so this is one of the 4 exams you’ll need for the MCSD SharePoint Applications. We decided to fling me at this one without preparation – I’m a SharePoint developer, so shouldn’t I be good at this one without lots of learning?


Website Bandwidth Usage

Off topic a little for SharePoint, but we all know the value of a Content Delivery Network (CDN), right? In particular, using services that host commonly used files, like jQuery.js, etc.? This has the advantage that other sites that use that CDN may have already cached that file in your visitor’s browser, but it also reduces the bandwidth used by your site.

Well, I found that my site was spending a lot of bandwidth serving jQuery.

[Screenshot: my site's monthly bandwidth breakdown, with jQuery near the top of the list]

Yup, 10% of my site’s bandwidth was being spent on… serving jQuery. That’s not efficient, so I found this post, which describes how to make the site use a CDN instead. Note that the functions.php file it mentions is the one in your theme.

Hopefully, that’ll reduce the bandwidth used. I also minified the theme’s CSS files; that should save another 200 MB per month. In total, that should be about a 12% saving on bandwidth.

It’s funny how this all mounts up!

Observations on Office 365 – Part 2

Following on from the customer-related issues in Office 365, there are a number of technology issues that give me concern. I’ll caveat this: it’s based on my last project, and by the time you read this the Office 365 SharePoint platform may have changed to address some of these problems.

Technology Issues

No options for dealing with Query Throttle Exceptions. Consider this – you’ve been using a list for a while, and have lots of useful data in it. Suddenly, some of your views start to return Query Throttle exceptions. You’ve just tipped over the magic 5,000-item limit on a large list. What do you do? Well, in normal SharePoint you’ve a number of options, the main one being to create an index or indices on the list so that your queries can be more efficient, and thus avoid throttling. However, that action is itself throttled – so you’d have to either a) get a Farm Admin to do it, or b) define a “happy hour” window late at night when throttling is turned off, and add the index during that window.

The problem is that neither of these options is available in Office 365. You don’t have a Farm Admin, and you don’t have a happy hour. One might expect that you could raise a ticket with Microsoft and that their support would be able to add an index for you – but we (or rather, our customer) tried this, and Microsoft said that they couldn’t. Essentially, if you’ve not got your design exactly right and you want to make changes to a list that contains 5000 items or more, you’re fucked.

As this includes changes like ‘Adding a new column’, which isn’t that rare an activity, this is a show-stopper for me. Hopefully Microsoft will introduce some approach for dealing with this.
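In the meantime, one partial mitigation you can apply from the client side is to page your reads so that no single request asks for more than the threshold allows. This is only a sketch – it helps when you need to walk the whole list (items come back in ID order, which needs no extra index), not when you need an arbitrary filtered view; the list and page size here are examples:

```csharp
using Microsoft.SharePoint.Client;

internal static void PageThroughLargeList(ClientContext clientContext, List list)
{
    CamlQuery query = new CamlQuery();
    // Keep each page comfortably under the 5,000-item threshold
    query.ViewXml = "<View><RowLimit>1000</RowLimit></View>";
    ListItemCollectionPosition position = null;
    do
    {
        // Resume from where the previous page finished (null = start)
        query.ListItemCollectionPosition = position;
        ListItemCollection items = list.GetItems(query);
        clientContext.Load(items);
        clientContext.ExecuteQuery();
        position = items.ListItemCollectionPosition;
        // ... process this page of items ...
    }
    while (position != null);
}
```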

Search Latency in SharePoint Online can be very long. I blogged about this previously, and I still don’t think that your typical business user would expect to wait 8 hours for new content to get indexed. I mean, in an on-prem SharePoint system the Continuous Crawl kicks in every 5 minutes – that’s more like the timescale we need. Worse, this compounds the problems with large list queries described above, as it makes it difficult to use the Search Service (which is probably better suited to that kind of query) instead.

Browser Support. Office 365 no longer supports Internet Explorer 8. This is unfortunate because, although IE8 is junk, it is still used by a lot of our customers. In short, they’ll lose the ability to avoid upgrading their browsers for years on end. (I might rejoice, but from their point of view it’s an expense with little benefit.)

Logging and Fault Diagnosis. Essentially, Office 365 doesn’t have any. Take a moment, and read that again – Office 365 doesn’t have any logging for fault diagnosis. There is a PowerShell API for retrieving log information from SharePoint Online’s logging service – but the only thing that writes to the log is Business Connectivity Services.

It’s possible that Microsoft have more access to logs than we do – I did wonder if the issue is that all customers sharing a tenant machine would be able to read each other’s ULS log entries – but the point remains that we have very limited logging information to go on. Mostly, it’ll just be the standard error message “An exception has occurred”. Great.

Reliability. While developing with SharePoint Online I found I was getting more errors than I’d expect to see. For example, when migrating documents using the client side object model, I would occasionally receive errors like “The server did not respond”, which I don’t expect to see normally. Browsing around was similar – you’d occasionally get “Server did not respond” errors. And sometimes I’d see weird transient errors that I couldn’t repeat, such as problems changing the appearance of a site. Now, these problems might not be due to the platform itself – I sometimes question our Internet connection – but still, reliability was worse than having our own server on the same network.

No easy way to do ‘Timed Jobs’. You know how it is, sometimes you just want to be able to perform a timed job for something. For example, it might be ‘Archive completed tasks at the end of the month’. There’s no easy way to do this, though. I’ve had a play with using Azure Scheduler to perform such actions, and it kind of works – but it was hard to get working, and password expiry breaks it. I really miss timer jobs.
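For a flavour of what this ends up looking like, here’s a sketch of a console app that an external scheduler (Windows Task Scheduler, Azure Scheduler, etc.) could run monthly. The site URL, account, list name and field values are all placeholders – and note that the stored password is exactly the thing that expires and breaks the job:

```csharp
using System.Security;
using Microsoft.SharePoint.Client;

class ArchiveCompletedTasks
{
    static void Main(string[] args)
    {
        using (ClientContext ctx = new ClientContext("https://example.sharepoint.com/sites/team"))
        {
            // Password would come from a secure store in practice; it still expires
            SecureString password = new SecureString();
            foreach (char c in "password-from-secure-store") password.AppendChar(c);
            ctx.Credentials = new SharePointOnlineCredentials("service@example.onmicrosoft.com", password);

            List tasks = ctx.Web.Lists.GetByTitle("Tasks");
            CamlQuery query = new CamlQuery();
            query.ViewXml = "<View><Query><Where><Eq><FieldRef Name='Status'/>" +
                            "<Value Type='Choice'>Completed</Value></Eq></Where></View>";
            ListItemCollection completed = tasks.GetItems(query);
            ctx.Load(completed);
            ctx.ExecuteQuery();

            foreach (ListItem item in completed)
            {
                // ... copy the item to an archive list, then delete it ...
            }
        }
    }
}
```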

Unconvinced about Apps. To be honest, that could be a post of its own, so that’s what I’ll do.

Conclusion

I suppose my conclusion is simply that there are a number of issues with running SharePoint Online projects, and I might not rush to do too many projects in it. In particular, the lack of ways of handling the Query Throttle, and the surprisingly long search latency are, to my mind, major issues. SharePoint Online might still be a good way of doing your project – but caveat emptor.

Observations on Office 365 – Part 1

We recently completed another Office 365 project, and I must confess, I’m still not sure about it. The project was reasonably successful – but there were a lot of tears shed, and we still have issues that are proving alarmingly difficult to deal with.

Broadly, I think, these issues can be broken down into two categories – problems with the customer, and problems with the technology.

My Remote Event Receiver works in Debug, but not when published

Okay, I had exactly this issue – my SharePoint App supporting my Remote Event Receivers would work when run under an F5 debug deploy, but not when published properly. Annoyingly, I’ve found that I have to make a number of changes when moving between a ‘proper’ build and an F5 deploy; I’ve mentioned these before: turning off the AppUninstalling event, and replacing the ClientId in the app manifest with a *.

However, I’d checked these things, and my code still didn’t work.

Turn off the Minimum Download Strategy feature in CSOM

I have found that the Minimum Download Strategy can cause issues with some of the JavaScript/JQuery I used in some of my pages – particularly when using Display Templates. I’m not the only person to have problems with it, either. Well, here’s the CSOM to turn it off:

private static void RemoveMinimalDownload(ClientContext clientContext, Web web)
{
    // Feature ID of the web-scoped Minimal Download Strategy feature
    Guid MDSfeature = new Guid("87294C72-F260-42f3-A41B-981A2FFCE37A");
    FeatureCollection features = web.Features;
    clientContext.Load(features);
    clientContext.ExecuteQuery();
    // The second parameter forces removal even if the feature errors
    features.Remove(MDSfeature, true);
    clientContext.ExecuteQuery();
}

Working with fields in CSOM

I’ve already detailed how to create a new Taxonomy Field in CSOM – here’s the more generic case of how to create a general field on a list:

internal static void CreateFields(ClientContext clientContext, List targetList, string xmlDef)
{
    // AddFieldInternalNameHint makes SharePoint use the Name attribute
    // from the XML as the field's internal name
    targetList.Fields.AddFieldAsXml(xmlDef, true, AddFieldOptions.AddFieldInternalNameHint);
    clientContext.ExecuteQuery();
}
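A call to this might look like the following – the field definition and list name here are made-up examples:

```csharp
// Hypothetical usage: add a simple text column to a list
string xmlDef = "<Field Type='Text' Name='ProjectCode' StaticName='ProjectCode' " +
                "DisplayName='Project Code' Required='FALSE' />";
List targetList = clientContext.Web.Lists.GetByTitle("Documents");
CreateFields(clientContext, targetList, xmlDef);
```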

And as a bonus, here’s how to set a field to be indexed in the client side object model:

internal static void SetIndex(ClientContext clientContext, List list, string fieldName)
{
    Field f = list.Fields.GetByInternalNameOrTitle(fieldName);
    clientContext.Load(f);
    clientContext.ExecuteQuery();
    f.Indexed = true;
    f.Update();
    list.Update();
    clientContext.ExecuteQuery();
}

Upload a File with CSOM

This is an example of how to upload a file with the C# Client Side Object Model (CSOM):

internal static File UploadFile(ClientContext clientContext, Web web, string filePath, Folder folder, string fileName, string title)
{
    string target = folder.ServerRelativeUrl + "/" + fileName;
    FileCreationInformation fci = new FileCreationInformation();
    fci.Overwrite = true;
    fci.Url = target;
    fci.Content = System.IO.File.ReadAllBytes(filePath);
    File uploadedFile = folder.Files.Add(fci);
    uploadedFile.ListItemAllFields["Title"] = title;
    uploadedFile.ListItemAllFields.Update();
    // Load the file before reading CheckOutType, or CSOM will throw a
    // PropertyOrFieldNotInitializedException
    clientContext.Load(uploadedFile);
    clientContext.ExecuteQuery();
    if (uploadedFile.CheckOutType != CheckOutType.None)
    {
        uploadedFile.CheckIn("Initial Upload", CheckinType.MajorCheckIn);
        clientContext.ExecuteQuery();
    }
    return uploadedFile;
}

Note that this method also lets you set a title for the document, as well as the file name, and it checks the document in for you if required.
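Calling it might look like this – the library name and file path are examples only:

```csharp
// Hypothetical usage: upload a local file into a library's root folder
Web web = clientContext.Web;
List library = web.Lists.GetByTitle("Documents");
clientContext.Load(library.RootFolder);
clientContext.ExecuteQuery();
File uploaded = UploadFile(clientContext, web, @"C:\temp\report.docx",
                           library.RootFolder, "report.docx", "Monthly Report");
```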

What does TaxonomyItem.NormalizeName() do?

So, the client side application I’ve been working on has to sync a LOT of terms to the term store, and I’ve mentioned how I had problems with Managed Metadata labels and the ampersand – and how I fixed them using TaxonomyItem.NormalizeName().

Well, that was fine, but my application was slow – so I started looking at what I could do to eliminate client side object model calls (CSOM). My suspicion was that, as the function was static, it wasn’t doing anything that I couldn’t do in my application, and save a round trip to the server.

So, I opened up Reflector and decompiled Microsoft.SharePoint.Taxonomy.dll. Inside I found code equivalent to the following:

Regex _trimSpacesRegex = new Regex(@"\s+", RegexOptions.Compiled | RegexOptions.IgnoreCase);
//Normalize name as Taxonomy Service does
string name = _trimSpacesRegex.Replace(termName, " ").Replace('&', '\uff06').Replace('"', '\uff02').Trim();

That’s much, much faster than a round trip to the server, and I learnt that quotation marks are also converted from ASCII to their full-width Unicode equivalents.
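Wrapped up as a local helper, based on the decompiled code above (at your own risk if Microsoft change the service’s behaviour), it looks like this:

```csharp
using System.Text.RegularExpressions;

internal static class TermNames
{
    // Collapse runs of whitespace to a single space, as the Taxonomy Service does
    private static readonly Regex _trimSpacesRegex = new Regex(@"\s+", RegexOptions.Compiled);

    internal static string Normalize(string termName)
    {
        return _trimSpacesRegex.Replace(termName, " ")
            .Replace('&', '\uff06')   // full-width ampersand
            .Replace('"', '\uff02')   // full-width quotation mark
            .Trim();
    }
}
```

For example, `TermNames.Normalize("Legal  & General")` gives `"Legal ＆ General"` – no server round trip needed.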

Remote Event Receivers: Identify when a document’s metadata is first completed

Document events are a perennial problem in SharePoint. This is, in part, due to the way documents are put into SharePoint:

  • You upload the document into SharePoint…
  • …which fires ItemAdded …
  • …then you complete the metadata…
  • …which fires ItemUpdated.

So, my problem was that our customer wanted an email sent when a document was first ‘added’ to SharePoint – except that by added they meant “has been uploaded and its metadata completed for the first time”. While SharePoint does, technically, fire the correct events at the correct times, it’s pretty easy to see this ‘business event’ is probably more useful.
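For a flavour of the kind of approach I mean (not necessarily the one I settled on), an ItemUpdated remote event receiver could check whether the required metadata is now filled in and whether it has already flagged the item. The field names here – “Classification” as a required column and “MetadataComplete” as a hidden flag column – are invented for illustration:

```csharp
using Microsoft.SharePoint.Client;
using Microsoft.SharePoint.Client.EventReceivers;

internal static class MetadataCompletion
{
    // Sketch only: called from ProcessOneWayEvent on an ItemUpdated notification
    internal static void HandleItemUpdated(ClientContext ctx, SPRemoteEventProperties properties)
    {
        List list = ctx.Web.Lists.GetById(properties.ItemEventProperties.ListId);
        ListItem item = list.GetItemById(properties.ItemEventProperties.ListItemId);
        ctx.Load(item);
        ctx.ExecuteQuery();

        bool alreadyFlagged = (item["MetadataComplete"] as bool?) == true;
        bool metadataFilled = item["Classification"] != null;

        if (metadataFilled && !alreadyFlagged)
        {
            // Mark the item so we only ever do this once
            item["MetadataComplete"] = true;
            item.Update();
            ctx.ExecuteQuery();
            // ... this is the 'business event': send the notification email here ...
        }
    }
}
```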