Channel: Custom Development – yuriburger.net

WCF Service Factory in SharePoint 2010


This post shows how to create a WCF service and host it in SharePoint Foundation (or higher) using the available Service Factory. I use CKSDev for these steps, but you can accomplish the same results with the standard Visual Studio tools; CKSDev just takes care of all the initial plumbing and is a real timesaver. Quite a few articles and posts already describe this process, but I like to keep this as a personal note. As a bonus, I will be cleaning out the dreadful http://tempuri.org namespaces!

Overview:

1. Create WCF Service based on CKSDev template
2. Deploy to SharePoint
3. Change the http://tempuri.org namespaces

Create a WCF Service and deploy to SharePoint

1. Create a new solution and project.

Template: Empty SharePoint Project

Target: .NET Framework 3.5

2. Deploy it as a farm solution, since the assembly should go in the GAC.


3. Add a new item to the empty SharePoint project:

WCF Service (CKSDev)


4. This will create the following structure for you:


IHostedWCFService.cs: interface and WCF service contract
HostedWCFService.svc.cs: WCF service class
HostedWCFService.svc: WCF service declaration
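The .svc file is where the Service Factory comes in: instead of endpoint configuration in web.config, SharePoint 2010 supplies a service host factory that generates the endpoints and bindings at runtime. The declaration looks roughly like this (a sketch; the Service attribute must carry your own assembly's full name and public key token):

<%@ ServiceHost Language="C#" Debug="false"
    Service="Blog.Examples.HostWCF.HostedWCFService, Blog.Examples.HostWCF, Version=1.0.0.0, Culture=neutral, PublicKeyToken=<your token>"
    Factory="Microsoft.SharePoint.Client.Services.MultipleBaseAddressBasicHttpBindingServiceHostFactory, Microsoft.SharePoint.Client.ServerRuntime, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>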

5. Build and deploy

6. Access the service using the browser. To do this, you will need to use the MEX endpoint (see note below): just append “/mex” to the service URL in your browser.


Next steps: change the http://tempuri.org namespaces

First we remove the one in the WCF service implementation (it corresponds to the targetNamespace of the wsdl:definitions element):

<wsdl:definitions name="HostedWCFService" targetNamespace="http://tempuri.org/" />

1. Add using System.ServiceModel; to the service class file.

2. Decorate the implementation class with a ServiceBehavior attribute:

namespace Blog.Examples.HostWCF
{
    [BasicHttpBindingServiceMetadataExchangeEndpoint]
    [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Required)]
    [ServiceBehavior(Namespace = "http://schema.demo.loc/HostWCF")]
    public class HostedWCFService : IHostedWCFService
    {
        // To test this service, use the Visual Studio WCF Test client
        // set the endpoint to http://<Your server name>/_vti_bin/Blog.Examples.HostWCF/HostedWCFService.svc/mex
        public string HelloWorld()
        {
            return "Hello World from WCF and SharePoint 2010";
        }
    }
}
<wsdl:definitions name="HostedWCFService" targetNamespace="http://schema.demo.loc/HostWCF" />

Then we remove the one in the WCF service contract (it corresponds to the wsdl:portType namespace) by adding a namespace to the ServiceContract attribute.

<wsdl:portType name="IHostedWCFService">
  <wsdl:operation name="HelloWorld">
  <wsdl:input wsaw:Action="http://tempuri.org/IHostedWCFService/HelloWorld" message="tns:IHostedWCFService_HelloWorld_InputMessage" />
using System.ServiceModel;

namespace Blog.Examples.HostWCF
{
    // NOTE: You can use the "Rename" command on the "Refactor" menu to change the interface name "IHostedWCFService" in both code and config file together.
    [ServiceContract(Namespace = "http://schema.demo.loc/HostWCF")]
    public interface IHostedWCFService
    {
        [OperationContract]
        string HelloWorld();
    }
}
<wsdl:input wsaw:Action="http://schema.demo.loc/HostWCF/IHostedWCFService/HelloWorld" />

And finally, we control the Action header for the method mapping by adding our URI to the OperationContract attribute:

using System.ServiceModel;

namespace Blog.Examples.HostWCF
{
    // NOTE: You can use the "Rename" command on the "Refactor" menu to change the interface name "IHostedWCFService" in both code and config file together.
    [ServiceContract(Namespace = "http://schema.demo.loc/HostWCF")]
    public interface IHostedWCFService
    {
        [OperationContract(Action = "http://schema.demo.loc/HostWCF/HelloWorld")]
        string HelloWorld();
    }
}
<wsdl:input wsaw:Action="http://schema.demo.loc/HostWCF/HelloWorld" />

The one http://tempuri.org namespace that remains is the wsdl:import namespace. It enables the import of the WSDL binding information generated by the Service Factory. Unfortunately, I have not found a way to change it.

<wsdl:import namespace="http://tempuri.org/" location="http://yuriburger.demo.loc/_vti_bin/Blog.Examples.HostWCF/HostedWCFService.svc/mex?wsdl=wsdl0" />

That’s all!

CKSDev: Community Kit for SharePoint: Development Tools Edition

From the CodePlex site:

This project extends the Visual Studio 2010 SharePoint project system with advanced templates and tools. Using these extensions you will be able to find relevant information from your SharePoint environments without leaving Visual Studio. You will have greater productivity while developing SharePoint components and you will have greater deployment capabilities on your local SharePoint installation.

http://cksdev.codeplex.com/

MSDN Article about WCF in SharePoint

http://msdn.microsoft.com/en-us/library/ff521586.aspx

This excellent article contains a warning about custom WCF services and claims-based authentication. Be sure to read it.

Note on MEX

WCF uses a Metadata Exchange (MEX) endpoint to expose metadata about the service. The CKSDev template takes care of this by adding the [BasicHttpBindingServiceMetadataExchangeEndpoint] attribute to the service class.



“Code-easy” modifications for SharePoint 2010: Part 1, the Validating Email Field


I always wanted to do a series about easy to build, real life customer requests for SharePoint modifications. And as with most ideas, never really found the time to do it. Well, now’s as good a time as any!

So here is part 1, the Validating Email Field.

SharePoint Foundation 2010 finally enables field validation through formulas, which is great. But this may not be the best solution if you want to apply custom (or more complex) validation rules. One common customer request is validating email address input, and for this simple request I personally prefer regular expressions over calculated formulas. I have seen a couple of nice solutions using a more generic regular expression custom field, but in this case I would like to keep it clean and simple and hide the complexity of the regular expression itself.

Something like this, a standard text input field that throws an error if you input an invalid email address:


And have the error message configurable, because we want to support different scenarios:


Couple of things we need for our custom field:

  1. A class for our Validating Email Field. Since email addresses tend to consist of text characters only, we start by inheriting from the SPTextField class.
  2. A class for our custom validation, based on the PresentationFramework ValidationRule class. This way, we can reuse it in later projects.
  3. Deployment files, in this case we need a FLDTYPES_*.XML file defining our new field.

To start, create a Visual Studio 2010 Empty SharePoint solution. Since the assembly needs to go in the GAC (see my earlier post about Sandbox solutions), we select a farm solution.


Our class needs the following:

  • Since we want to configure our validation error message through the field’s properties, we need a property for this in our class.
  • An override for the GetValidatedString method where we can perform our email validation.

Create a class for our custom field; in this case we call it the ValidatingEmailField and add a property for the error message:

namespace Blog.Examples.ValidatingFields
{
    public class ValidatingEmailField : SPFieldText
    {
        private string _validatedErrorMessage;
        public string ValidatedErrorMessage
        {
            get
            {
                return _validatedErrorMessage;
            }
            set
            {
                _validatedErrorMessage = value;
            }
        }

        private void Init()
        {
            object validatedErrorMessageValue = GetCustomProperty("ValidatedErrorMessage");

            if (validatedErrorMessageValue != null)
            {
                ValidatedErrorMessage = validatedErrorMessageValue.ToString();
            }
        }

        public ValidatingEmailField(SPFieldCollection fields, string fieldName)
            : base(fields, fieldName)
        {
            this.Init();
        }

        public ValidatingEmailField(SPFieldCollection fields, String typeName, String displayName)
            : base(fields, typeName, displayName)
        {
            this.Init();
        }
    }
}

Then we add our override method to this class. This method uses our custom EmailValidationRule class which we will create next.

public override string GetValidatedString(object value)
{
    if (value == null)
    {
        return string.Empty;
    }
    else
    {
        // Use our custom EmailValidationRule class.
        // Because it is based on .NET's ValidationRule, we pass the current CultureInfo
        CultureInfo cultureInfo = SPContext.Current.Web.UICulture;
        EmailValidationRule emailValidationRule = new EmailValidationRule();
        ValidationResult emailValidationResult = emailValidationRule.Validate(value, cultureInfo);

        if (emailValidationResult.IsValid)
        {
            return base.GetValidatedString(value);
        }
        else
        {
            throw new SPFieldValidationException(ValidatedErrorMessage);
        }
    }
}

Next: the EmailValidationRule class. There are a few ways you can validate an email address, but as said earlier I like to use a regular expression for this.

namespace Blog.Examples.ValidatingFields
{
    public class EmailValidationRule : ValidationRule
    {
        public EmailValidationRule()
        {
        }

        public override ValidationResult Validate(object value, CultureInfo culture)
        {
            // Only UpperCase matches, so make sure you set IgnoreCase
            const string EMAILREGEX = "^[A-Z0-9._%+-]+@(?:[A-Z0-9-]+\\.)+[A-Z]{2,4}$";

            if (value != null)
            {
                Regex regex = new Regex(EMAILREGEX, RegexOptions.IgnoreCase);

                if (regex.IsMatch(value.ToString()))
                {
                    return new ValidationResult(true, "Valid email address.");
                }
                else
                {
                    return new ValidationResult(false, "Invalid email address.");
                }
            }
            else
            {
                return new ValidationResult(false, "No value entered.");
            }
        }
    }
}

A small note on the expression used: it will match most common email addresses, but there are some that will not. For example, addresses with a top-level domain of more than 4 characters will fail.
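To see that cutoff in practice, here is a minimal sketch using plain .NET, outside SharePoint (the sample addresses are made up for illustration):

using System;
using System.Text.RegularExpressions;

class EmailRegexDemo
{
    static void Main()
    {
        // Same expression as in EmailValidationRule above
        const string EMAILREGEX = "^[A-Z0-9._%+-]+@(?:[A-Z0-9-]+\\.)+[A-Z]{2,4}$";
        Regex regex = new Regex(EMAILREGEX, RegexOptions.IgnoreCase);

        Console.WriteLine(regex.IsMatch("john.doe@example.com"));   // True
        Console.WriteLine(regex.IsMatch("jane@example.museum"));    // False: 6-character TLD
        Console.WriteLine(regex.IsMatch("not-an-address"));         // False: no @ at all
    }
}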

And finally the deployment file.

Add a SharePoint mapped folder to your project pointing to “{SharePointRoot}\Template\XML”


Add a new XML file to this mapped folder: Fldtypes_<your unique field name>.xml.

Important: Make sure the filename is unique, because every custom field you will ever add will end up in this folder.

<?xml version="1.0" encoding="utf-8" ?>
<FieldTypes>
  <FieldType>
    <Field Name="TypeName">ValidatingEmailField</Field>
    <Field Name="ParentType">Text</Field>
    <Field Name="TypeDisplayName">Email address</Field>
    <Field Name="UserCreatable">TRUE</Field>
    <Field Name="ShowInListCreate">TRUE</Field>
    <Field Name="ShowInDocumentLibraryCreate">TRUE</Field>
    <Field Name="ShowInSurveyCreate">TRUE</Field>
    <Field Name="FieldTypeClass">Blog.Examples.ValidatingFields.ValidatingEmailField, Blog.Examples.ValidatingFields, Version=1.0.0.0, Culture=neutral, PublicKeyToken=fd1fbc97e8ceddac</Field>
  </FieldType>
</FieldTypes>
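For the ValidatedErrorMessage property that the Init method reads through GetCustomProperty, the field definition can also declare a PropertySchema, which makes the error message editable on the column settings page. A sketch (the display name and default value here are assumptions):

<PropertySchema>
  <Fields>
    <Field Name="ValidatedErrorMessage" DisplayName="Validation error message" Type="Text">
      <Default>Please enter a valid email address.</Default>
    </Field>
  </Fields>
</PropertySchema>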

That’s it, package and deploy!

You can download the Visual Studio 2010 solution here.

Part 2 will be about building and deploying your custom Content Query Web Part Item Style.


To Sandbox or not to sandbox?


Microsoft SharePoint (Foundation) 2010 introduced the concept of running sandboxed code. This code can be deployed using WSP solutions, but the difference is that authorized users can upload it at the site collection level. Sandboxed solutions run in a limited SharePoint execution context, and farm administrators can monitor and manage them.

Because of the limited context, only a subset of the SharePoint server object model is available. To quickly figure out whether a sandboxed solution is possible (or even required), I created a small chart. The chart and information are far from complete, and I will probably update them regularly.

For extensive information about the Sandbox in SharePoint 2010: http://msdn.microsoft.com/en-us/magazine/ee335711.aspx

Oh, one more thing: Microsoft recommends sandboxed solutions by default. So only use farm solutions if you really need to.

SharePoint 2010 design chart

The following Microsoft.SharePoint subset can be used with sandboxed solutions:

Namespace    Remark
Microsoft.SharePoint    All, except the SPSite constructor, SPSecurity, SPWorkItem and SPWorkItemCollection, SPAlertCollection.Add, SPAlertTemplateCollection.Add, SPUserSolution and SPUserSolutionCollection, SPTransformUtilities
Microsoft.SharePoint.Navigation    All
Microsoft.SharePoint.Utilities    All, except SPUtility.SendEmail and SPUtility.GetNTFullNameandEmailFromLogin
Microsoft.SharePoint.WebPartPages    All, except SPWebPartManager, SPWebPartConnection, WebPartZone, WebPartPage, ToolPane and ToolPart
Microsoft.SharePoint.Workflow    All
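To make the first row concrete: sandboxed code cannot call the SPSite constructor, so it has to reach the site collection through the ambient context instead. A minimal sketch of a sandbox-safe web part (class and output are made up for illustration):

using Microsoft.SharePoint;

// In the sandbox, "new SPSite(url)" is blocked, but SPContext is allowed.
public class SandboxExampleWebPart : System.Web.UI.WebControls.WebParts.WebPart
{
    protected override void CreateChildControls()
    {
        SPWeb web = SPContext.Current.Web;   // current site, no SPSite constructor needed

        foreach (SPList list in web.Lists)
        {
            // Reading list data works fine inside the sandbox
            this.Controls.Add(new System.Web.UI.WebControls.Literal { Text = list.Title + "<br/>" });
        }
    }
}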

The following items cannot be deployed using a sandboxed solution:

  • Site Definition
  • Application Pages
  • Farm and Web Application scoped features
  • Custom Actions Group
  • Custom Fields
  • Visual Studio Visual Web Part
  • Workflows with code
  • HideCustomAction
  • SharePoint Timer Jobs
  • Feature Receivers for Farm and Web scoped features

Visual Studio 2010: Sandbox Deployment Errors


So you created a nice sandboxed SharePoint solution and need to deploy it. You entered the correct site URL and hit deploy. Unfortunately, the output shows a failed deployment step:


Couple of deployment errors Visual Studio might throw at you at this stage:

  • Error occurred in deployment step ‘Activate Features’: Cannot start service SPUserCodeV4 on computer <name>. Check that the “SharePoint 2010 User Code Host” Windows service is not disabled. Better yet, set it to “Automatic” on your development machine.
  • Error occurred in deployment step ‘Activate Features’: This feature cannot be activated at this time. The contents of the feature’s solution requires the Solution Sandbox service to be running. Your development environment should have at least one server running the “Microsoft SharePoint Foundation Sandboxed Code Service”. Check Central Administration / Services on Server.
  • Error occurred in deployment step ‘Add Solution’: Sandboxed code execution request failed. This can have a couple of causes, but the one I encounter most is lack of rights. I normally enable “god-mode” on my dev machines (see below).
  • Error occurred in deployment step ‘Activate Features’: Attempted to perform an unauthorized operation. If you are running in “god-mode” this shouldn’t happen. Just make sure you are Site Collection Administrator.
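For the first error, you can fix the Windows service from an elevated PowerShell prompt (a sketch; SPUserCodeV4 is the service name behind the “SharePoint 2010 User Code Host” display name — the sandboxed code service in the second error is started in Central Administration instead):

Set-Service SPUserCodeV4 -StartupType Automatic
Start-Service SPUserCodeV4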

This list is by no means complete and I will extend it as I encounter other issues, hopefully not too many :) .

Running in “God-mode”

Important: Not recommended in production environments. Since you are deploying with Visual Studio, this is obviously not the case.

Update 1: if you are a local admin on your dev box (and most of us are), make sure you test your solutions using lesser privileged accounts!

To enable easy access on your development machine, make sure the following is true:

  • Disable User Account Control (in Windows 2008 R2 this means lowering the setting to “Never notify”).


  • Microsoft Visual Studio runs as Administrator (see title bar).


  • You are a member of the local Administrators group
  • You are a member of the local WSS_ADMIN_WPG group
  • You are database owner (dbo) of both content databases and the SharePoint_Config database
  • You are Farm Administrator
  • You are Site Collection Administrator

Update 2: this post needs revising, since this can be done with fewer permissions than mentioned here.


“Code-easy” modifications for SharePoint 2010: Part 2, the custom Content Query Web Part Item Style


I always wanted to do a series about easy to build, real life customer requests for SharePoint modifications. And as with most ideas, never really found the time to do it. Well, now’s as good a time as any!

So here is part 2, the custom Content Query Web Part Item Style

Aaahh, the Content Query Web Part, or CQWP for short: the Swiss Army knife no real SharePoint solution architect can do without. If you are not familiar with the Content Query Web Part in SharePoint, here is a small introduction. I won’t go in depth on all the advanced options you have when configuring the CQWP, because there are lots of good articles about that out there. Instead, I focus on how you would package and deploy such a customization using Visual Studio 2010.

Introduction

The Content Query Web Part (CQWP) was first introduced in SharePoint (MOSS) 2007. It allows for querying and presenting SharePoint data in a very flexible and customizable way and is considered one of the most important Web Parts in SharePoint. It can roll up SharePoint items:

  • Throughout your site collection.
  • In one specific site and all of its sub sites.
  • In one specific list.

As an example, say you want to aggregate (“roll up”) all news articles from your news site and show the latest 3 with a thumbnail image on your home page. Easy with the help of the CQWP:


You start with configuring the CQWP through the settings in the Web Part toolbox.


The Query part determines where you will get your items from:

  • Source: the scope of the query.
  • List Type: choose the list type your items are in. In this case we pick Pages Library because our news articles are based on Publishing Pages stored in the default Pages library.
  • Content Type: we only want to return Article Pages and not Web Part pages.

The Presentation part determines how the items will be presented on the page:

  • Grouping and Sorting: speaks for itself. We pick 3 items, sorted by Article Date Descending.
  • Styles: the group style is used when we group the items and is not used in this example. The Item style is the most interesting part and controls how the items are rendered. There are several options, so be sure to check them all out. For now we pick the default “Image on left”.

Fields to display:

This part is a new feature of the CQWP in SharePoint 2010. These fields are called “Slots” and are used to bind item fields to XSL properties. Every item style template can have its own set of configurable fields.

Customization

This post is about configuring and customizing the CQWP for when you are not *that* happy with the standard behavior of the available item styles. My most requested style, for instance, is not available out of the box:


  • the title on top
  • the image on the left, just beneath the title
  • the byline to the right of the image
  • the first 200 characters of the article content, beneath the byline
  • a read more link

The standard CQWP gets its configuration from files in the Site Collection Style Library. These files are shared among all default Content Query Web Parts in your site. Let’s take a closer look at these files and the best way to do this is using SharePoint Designer. Just open up your site and navigate to the All Files section. Here you will find the Style Library with the XSL Style Sheets:


The file we are interested in is ItemStyle.xsl. If you take a peek at its contents, you will find several <xsl:template> elements that correspond to the different style options you see when configuring the CQWP. The template names should mostly speak for themselves, with one exception: “Image Left” is called “Default”.

Important: If you are not familiar with XSL, you might want to read up on the subject. We keep our customization fairly simple, so you will just need a basic understanding.

We could of course modify the OOTB file using SharePoint Designer and create our customization that way. That is a perfectly supported scenario, but it has three major downsides:

  • This is an OOTB file and if it is modified it is saved to the content database. If someone chooses to reset it to the site definition all your changes would be lost.
  • Coexistence with other applications could be hard since you would have to share the default ItemStyle.xsl among them.
  • If you mess up the default ItemStyle.xsl it could affect other Content Query Web Parts in the Site Collection.

So the best way around these issues is to supply our own style sheet with our Web Part.

To start, create a Visual Studio 2010 Empty SharePoint solution. Since we don’t have an assembly that needs Full Trust (see my earlier post about Sandbox solutions), we select a sandboxed solution.


Add a SharePoint module:


After the module is added to the project, delete the Sample.txt

Next we add the XSLT Style Sheet: Add New Item/ Data / Style Sheet. Remember to give it a .xsl extension instead of .xslt to be in line with OOTB SharePoint files.

Your Solution should now look like this:


My next step is to copy the content of the OOTB file into my new file as a starting point.

Open the default file in SharePoint Designer and answer “No” when asked if you want to check the file out (see downside no. 1). Copy and paste the contents into your new file in Visual Studio, clearing the file’s current contents.

You now effectively made a copy of the OOTB file and are ready to create your own rendering template (the actual CQWP Item Style customization).

To give yourself a head start, it is easiest to copy a similar one. In this case that would be “Image Left”, which is called “Default” in ItemStyle.xsl. Just copy and paste the complete template (everything between and including the <xsl:template name="Default" match="*" mode="itemstyle"> and </xsl:template> tags).

Rename the template to something like “ImageWithArticleContent” and replace the wildcard match (“*”) with our specific style name:

<xsl:template name="ImageWithArticleContent" match="Row[@Style='ImageWithArticleContent']" mode="itemstyle">

The next step is to reorganize the template to match our requested design. Basically it starts with the title, then the image, and finally the byline (“description”). We then add a new property, “Content”, display its first 200 characters, and append the Read More link. Because we add a property, a new slot will be created in the CQWP toolbox, as we’ll see later.

 

<xsl:template name="ImageWithArticleContent" match="Row[@Style='ImageWithArticleContent']" mode="itemstyle">
    <xsl:variable name="SafeLinkUrl">
      <xsl:call-template name="OuterTemplate.GetSafeLink">
        <xsl:with-param name="UrlColumnName" select="'LinkUrl'"/>
      </xsl:call-template>
    </xsl:variable>
    <xsl:variable name="SafeImageUrl">
      <xsl:call-template name="OuterTemplate.GetSafeStaticUrl">
        <xsl:with-param name="UrlColumnName" select="'ImageUrl'"/>
      </xsl:call-template>
    </xsl:variable>
    <xsl:variable name="DisplayTitle">
      <xsl:call-template name="OuterTemplate.GetTitle">
        <xsl:with-param name="Title" select="@Title"/>
        <xsl:with-param name="UrlColumnName" select="'LinkUrl'"/>
      </xsl:call-template>
    </xsl:variable>
    <div class="item">
      <div class="link-item">
        <xsl:call-template name="OuterTemplate.CallPresenceStatusIconTemplate"/>
        <a href="{$SafeLinkUrl}" title="{@LinkToolTip}">
          <xsl:if test="$ItemsHaveStreams = 'True'">
            <xsl:attribute name="onclick">
              <xsl:value-of select="@OnClickForWebRendering"/>
            </xsl:attribute>
          </xsl:if>
          <xsl:if test="$ItemsHaveStreams != 'True' and @OpenInNewWindow = 'True'">
            <xsl:attribute name="onclick">
              <xsl:value-of disable-output-escaping="yes" select="$OnClickTargetAttribute"/>
            </xsl:attribute>
          </xsl:if>
          <xsl:value-of select="$DisplayTitle"/>
        </a>
      </div>
      <xsl:if test="string-length($SafeImageUrl) != 0">
        <div class="image-area-left">
          <a href="{$SafeLinkUrl}">
            <xsl:if test="$ItemsHaveStreams = 'True'">
              <xsl:attribute name="onclick">
                <xsl:value-of select="@OnClickForWebRendering"/>
              </xsl:attribute>
            </xsl:if>
            <xsl:if test="$ItemsHaveStreams != 'True' and @OpenInNewWindow = 'True'">
              <xsl:attribute name="onclick">
                <xsl:value-of disable-output-escaping="yes" select="$OnClickTargetAttribute"/>
              </xsl:attribute>
            </xsl:if>
            <img class="image" src="{$SafeImageUrl}" title="{@ImageUrlAltText}">
              <xsl:if test="$ImageWidth != ''">
                <xsl:attribute name="width">
                  <xsl:value-of select="$ImageWidth" />
                </xsl:attribute>
              </xsl:if>
              <xsl:if test="$ImageHeight != ''">
                <xsl:attribute name="height">
                  <xsl:value-of select="$ImageHeight" />
                </xsl:attribute>
              </xsl:if>
            </img>
          </a>
        </div>
      </xsl:if>
      <div class="description">
        <xsl:value-of select="@Description" />
      </div>
      <div class="content">
        <xsl:value-of select="substring(@Content, 1, 200)" disable-output-escaping="yes"/>
        <a href="{$SafeLinkUrl}">...Read More</a>
      </div>
    </div>
  </xsl:template>
  

Important: Because we are cutting off the output rendering we potentially leave open HTML tags! See this post about a solution.

Unfortunately, there is no GUI way in SharePoint to link our custom item style to the Content Query Web Part. The way around this is to deploy a pre-configured .webpart file with our solution.

Add a Web Part to our project:


Delete the Web Part class file, since we are only interested in the .webpart file.

Add an empty (unconfigured) CQWP to your SharePoint Page and export this Web Part.

Open the exported file in notepad or Visual Studio and copy the contents.

Paste this in the .webpart file in our Visual Studio project.

  • Set the Title and Description to the required values.
  • Set the ItemLimit to 3

Modify the ItemXslLink property to link our custom style sheet:

<property name="ItemXslLink" type="string">/Style%20Library/Xsl/CustomItemStyle.xsl</property>

Modify the ItemStyle property to match our custom style:

<property name="ItemStyle" type="string">ImageWithArticleContent</property>
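Put together, the relevant properties in the .webpart file end up looking something like this (a sketch; the Title value is just an example, and the surrounding properties come from your own exported file):

<property name="Title" type="string">Latest news</property>
<property name="ItemLimit" type="int">3</property>
<property name="ItemXslLink" type="string">/Style%20Library/Xsl/CustomItemStyle.xsl</property>
<property name="ItemStyle" type="string">ImageWithArticleContent</property>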

Save the file.

Important: since our solution doesn’t need an assembly, we need to exclude it from the package. Select your project and set the “Include Assembly In Package” to false.

We can now deploy our solution and test our work.


You can download the Visual Studio 2010 solution here.


Speed up SharePoint 2010 development


SharePoint Development tends to be a wee bit more time consuming compared to “standard” ASP.NET development. Although Microsoft really helped us a lot by providing seriously improved integration with Visual Studio 2010, building, packaging, deploying and debugging can still be annoyingly slow.

Here are a couple of things you could do to speed things up a bit.

1. Fast hardware

Of course the most important part. Be sure to have a lot of RAM in your development machine, at least 4 GB. With the 64-bit OS requirement for SharePoint 2010 we can address more than 3 GB, so make sure it is available. Even current generation laptops can handle 8 GB of RAM, so no excuses anymore (apart from budget, that is).

Because you probably run all required software on one machine, fast disks are equally important. This component improves the overall experience, but MSSQL Server benefits the most. If you are running virtual machines, you can go for fast external disks using eSATA (or maybe USB 3.0). If you are more like me (and don’t like the external disk hassle), be sure to examine the disk specs when shopping for new hardware.

2. The right OS

I tried running Windows 7 Ultimate with SharePoint 2010 installed locally. Yes, it works, but it is still a desktop OS, and you really can’t run things like domain controllers locally. If you really want things to go fast, boot a regular server OS, say Windows Server 2008 R2. It handles server software far better than any desktop OS. If you can, install all needed components on one server (i.e. MSSQL Server, SharePoint 2010, Visual Studio, Office, domain controller, DNS, etc.). And Windows 7 VHD boot makes dual booting a breeze!

If you do need more than one server, think about installing Windows Server 2008 R2 with the Hyper-V role and running your servers on top of that. You will probably need at least 8 GB of RAM (host + 2 guests), but it will give you ultimate flexibility.

 

3. Tweak your OS and server software

Same old story:

  • Don’t run unneeded services or components. No, you don’t need Adobe Auto Updater or the Windows Media Player Online Subscription blah blah. Check the Windows Services and registry “run” keys.
  • If you run MSSQL Server (and you probably do), be sure to set its maximum memory to a value you consider adequate. This way it won’t hog your machine. Also, if you have more than one disk, store “tempdb” on a different disk than the content or configuration databases. Other “production” performance tweaks apply as well, of course.
  • Regarding SharePoint IIS configuration, I normally run most parts on the same Application Pool. This saves valuable memory, but makes recycling take a bit longer. If you run many Web Applications but only develop in one, dedicate one AppPool to your development application and share the rest.
  • Keep your content databases lean. There is normally no need to develop against a 50+ GB database. Just use a default empty one or with limited content. You can of course switch later on if you need to test your solution.

 

4. Be smart

When deploying and debugging SharePoint solutions, application pool recycles are a pain! Be smart when laying out your Visual Studio projects and decide which parts must be farm solutions and which parts can be sandboxed solutions. Remember: farm solutions require an IIS recycle, whereas sandboxed solutions do not (see my earlier post about sandbox solutions). You can always merge them later when heading towards production.

If you are developing loads of Web Parts or User Controls (used by Visual Web Parts), there is another thing you could do. Since most of them are plain ASP.NET components, you can stick them on a simple ASPX page and develop them using IIS. Especially handy during the early stages and you can add them to your SharePoint project when they need a SharePoint environment.

5. Time savers

You probably already have the “CKS: Development Tools Edition” installed. If not, please do. It also contains several time savers regarding deployment, like Quick Deploy, Copy Assemblies and Copy Files. There is even a feature that warms up your site after an IIS recycle!

You probably have other tricks up your sleeve, but I thought I’d share mine.


Output Escaping Rich Text in a Content Query Web Part


If you use a SharePoint (2010) Content Query Web Part to display content from a rich text field, you will end up with escaped HTML. Because it is escaped, it is rendered as literal content on your page, which is probably not what you want.


To fix this, you need to edit ItemStyle.xsl (preferably a custom copy, see this post). You just have to add the following attribute to the element outputting the HTML:

<xsl:value-of disable-output-escaping="yes" select="$blah"/>

One other thing to watch out for, is when creating summary descriptions from these rich text fields. Say you have this text:

<h1 class="ms-rteElement-H1">Cras porta pharetra magna eget consectetur. </h1> <p class="ms-rteElement-H1"> </p> <p>Etiam eget nibh eu libero dictum congue. Etiam tempor auctor lectus, at porta dolor molestie vitae.</p>

And you want to create a summary of 150 characters with a “Read More” link. Cutting off the output there would potentially leave open HTML tags, and that could really mess up the user experience. Another option is to strip all HTML from your output. This way you lose the rich text markup (H1, bold and so on), but ensure valid HTML output.

To accomplish this, you would also need to edit the ItemStyle.xsl (again preferably a custom one) and include some kind of HTML stripping function.

<xsl:template name="Functions.RemoveHtml">
    <xsl:param name="String"/>
    <xsl:choose>
      <xsl:when test="contains($String, '&lt;')">
        <xsl:value-of select="substring-before($String, '&lt;')"/>
        <xsl:call-template name="Functions.RemoveHtml">
          <xsl:with-param name="String"
            select="substring-after($String, '&gt;')"/>
        </xsl:call-template>
      </xsl:when>
      <xsl:otherwise>
        <xsl:value-of select="$String"/>
      </xsl:otherwise>
    </xsl:choose>
</xsl:template>
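With the template in place, a summary can first strip the markup and then truncate safely. A hypothetical usage inside ItemStyle.xsl (the @Description attribute is an assumption; use whatever column your item style binds to):

```xml
<!-- Strip the HTML first, then truncate the plain text to 150 characters -->
<xsl:variable name="PlainText">
  <xsl:call-template name="Functions.RemoveHtml">
    <xsl:with-param name="String" select="@Description"/>
  </xsl:call-template>
</xsl:variable>
<xsl:value-of select="concat(substring($PlainText, 1, 150), '...')"/>
```

Because the markup is removed before the substring is taken, no opening tag can ever be cut in half.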

Core Results Web Part with configurable Ranking Model


Ranking Models are cool. If you don’t know what they can do for you, here’s a summary:

SharePoint 2010 Enterprise Search uses ranking models to determine how data is weighed and results are ranked. Out of the box SharePoint 2010 provides different models, but you can also create your own. Creating your own ranking model allows you to tailor the results your query returns by:

  • assigning different weights to metadata properties.
  • including hits on custom managed properties in general search (as opposed to searching them exclusively through advanced search).

There are several nice blogs out there that describe in detail how you can add your own custom ranking model. For instance this excellent blog post, which describes the process quite clearly: http://calvisblog.wordpress.com/2010/06/21/custom-ranking-models-with-sharepoint-2010-background-value-and-administrative-overview/ (by Shaun O’Callaghan).

Applying your custom ranking model requires you to look up the GUID, export the standard OOB Core Results Web Part, modify a web part property and upload it again (see this previous post). Now, that works of course, but I wanted a true end-user solution where you can configure the ranking model through the web part properties. Something like this:

EditorPart

Extending the Core Results Web Part

Note: we need to develop a SharePoint Farm Solution, because the Microsoft.SharePoint.WebPartPages.DataFormWebPart class (referenced
by the CoreResultsWebPart) is not available in sandbox solutions.

To start, we create a new web part deriving from the Core Results Web Part. The web part includes a property that allows setting the RankingModelID. It also inherits from IWebEditable needed for implementing an EditorPart (see further down).

public class EnhancedCoreResults : CoreResultsWebPart, IWebEditable
{
     [WebBrowsable(false)]
     [Personalizable(PersonalizationScope.Shared)]
     public string rankingModelID { get; set; }


 }

I want the web part properties to show the available ranking models so the user can select one from a dropdown list. For this to work, we cannot use regular properties and have to implement an EditorPart. The reason is that regular properties don’t let you load values dynamically from code behind, and I don’t want to hardcode the values for the dropdown list.

Next, create the EditorPart:

The EditorPart needs to get the names and values of the available ranking models. For this to work, we need to connect to our Search Service Application and fetch the available models:

class EnhancedCoreResultsEditorPart:EditorPart
    {
        private DropDownList rankingModelList = new DropDownList(); 

        protected override void CreateChildControls()
        {
            base.CreateChildControls();

            RankingModelCollection rankingModels = null;

            SPServiceContext context = SPServiceContext.GetContext(SPContext.Current.Site);
            SearchServiceApplicationProxy searchApplicationProxy = context.GetDefaultProxy(typeof(SearchServiceApplicationProxy)) as SearchServiceApplicationProxy;
            SearchServiceApplicationInfo searchApplicationInfo = searchApplicationProxy.GetSearchServiceApplicationInfo();
            Guid searchApplicationID = searchApplicationInfo.SearchServiceApplicationId;
            SearchServiceApplication searchApplication = SearchService.Service.SearchApplications.GetValue<SearchServiceApplication>(searchApplicationID);

            Ranking ranking = new Ranking(searchApplication);
            rankingModels = ranking.RankingModels;

            foreach (RankingModel rankingModel in rankingModels)
            {
                ListItem item = new ListItem();
                item.Text = rankingModel.Name;
                item.Value = rankingModel.ID.ToString();
                rankingModelList.Items.Add(item);
            }

            this.Controls.Add(new LiteralControl("<div class=\"UserSectionHead\">Ranking Model</div>"));
            this.Controls.Add(new LiteralControl("<div class=\"UserSectionBody\">"));
            this.Controls.Add(rankingModelList);
            this.Controls.Add(new LiteralControl("<br/><br/></div>"));

            this.ChildControlsCreated = true;
        }
}

The EditorPart sets the web part RankingModelID using the value from our dropdown list.

     public override bool ApplyChanges()
        {
            EnsureChildControls();

            EnhancedCoreResults enhancedCoreResultsWebPart = (EnhancedCoreResults)this.WebPartToEdit;
            if (enhancedCoreResultsWebPart != null)
            {
                enhancedCoreResultsWebPart.rankingModelID = rankingModelList.SelectedValue; 
            }
            else
            {
                return false;
            }
            return true;
        }

        public override void SyncChanges()
        {
            EnsureChildControls();

            EnhancedCoreResults enhancedCoreResultsWebPart = (EnhancedCoreResults)this.WebPartToEdit;
            if (enhancedCoreResultsWebPart != null)
            {
                rankingModelList.SelectedValue = enhancedCoreResultsWebPart.rankingModelID;
            }
        }

Wire up the EditorPart in our custom Core Results class:

        EditorPartCollection IWebEditable.CreateEditorParts()
        {
            List<EditorPart> editors = new List<EditorPart>();
            EnhancedCoreResultsEditorPart editorPart = new EnhancedCoreResultsEditorPart();
            editorPart.ID = "EnhancedCoreResults_editorPart";
            editors.Add(editorPart);
            return new EditorPartCollection(editors);
        }

        object IWebEditable.WebBrowsableObject
        {
            get { return this; }
        }

Create the logic that actually sets the RankingModelID by overriding the ConfigureDataSourceProperties() method:

     protected override void ConfigureDataSourceProperties()
        {
            if (this.ShowSearchResults)
            {
                base.ConfigureDataSourceProperties();

                if (!string.IsNullOrEmpty(rankingModelID))
                {
                    CoreResultsDatasource dataSource = this.DataSource as CoreResultsDatasource;
                    if (dataSource != null)
                    {
                        dataSource.RankingModelID = rankingModelID;
                    }
                }
            }

        }

Note: I found several articles on the web showing similar functionality by overriding the GetXPathNavigator() method and setting the ranking model through the SharedQueryManager instance. Somehow I couldn’t get this to work the way I wanted to so I decided to do it differently.

That’s it, build, package and deploy.

You can download the Visual Studio solution here.



Custom SharePoint Health Analyzer Rule for the ViewFormPagesLockdown feature


SharePoint 2010 has a special hidden feature that ensures anonymous users with read access cannot view list and library form pages (i.e. “Forms/AllItems.aspx”). This feature is also known as SharePoint Lockdown mode and is automatically enabled on publishing sites. Especially on internet-facing SharePoint sites, this feature keeps anonymous users out of certain areas of a site. Since it is not automatically enabled on non-publishing sites, we need a way to check the feature’s state on a regular basis. We could of course create a scheduled PowerShell script for this, but I prefer a more solid approach for this one.

More info on the Lockdown feature on this MSDN blog (MOSS 2007, but still applicable. By Ryan Duguid)

As we all know, SharePoint 2010 includes the Health Analyzer. This component uses a set of rules to check the farm for predefined issues and report on any it finds:


Fortunately, the rule set is extensible and we can add our custom ones. This is not a difficult task and a good candidate for our feature check!

To start, we need to create a class that inherits from either SPHealthAnalysisRule or SPRepairableHealthAnalysisRule. From these abstract classes we must implement the string properties “Summary”, “Explanation” and “Remedy” (used to explain any issues found), plus the “Category” and “ErrorLevel” properties. The “Check()” method performs the actual analysis, so it must be implemented too. If you want users to be able to execute a repair right from the Central Administration interface, SPRepairableHealthAnalysisRule includes a “Repair()” method just for that. Finally, we can implement the “AutomaticExecutionParameters” property to enable automatic Timer Job execution for our rule, and in this case we do.

More detailed info on this on MSDN.

1. We start with an Empty SharePoint Project, and deploy it as a Farm Solution. We add our Central Administration web as our Site Url.

2. Next we add the class that inherits from SPHealthAnalysisRule and add the necessary using statements. After implementing the abstract class our rule looks like this:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;
using Microsoft.SharePoint.Administration.Health;

namespace Blog.Examples.HealthAnalysisRule
{
     public sealed class ViewFormPagesLockDown : SPHealthAnalysisRule
    {
        public override SPHealthCategory Category
        {
            get { throw new NotImplementedException(); }
        }

        public override SPHealthCheckStatus Check()
        {
            throw new NotImplementedException();
        }

        public override SPHealthCheckErrorLevel ErrorLevel
        {
            get { throw new NotImplementedException(); }
        }

        public override string Explanation
        {
            get { throw new NotImplementedException(); }
        }

        public override string Remedy
        {
            get { throw new NotImplementedException(); }
        }

        public override string Summary
        {
            get { throw new NotImplementedException(); }
        }
    }
}

3. Next we add our logic to the Check() method to do the actual testing. This method uses two helper functions, both included in the downloadable solution:

  • isAnonymousEnabled: checks if the Web Application has Anonymous Access turned on.
  • isFeatureActivated: checks if the FormPagesLockDown feature is activated.
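For illustration, the helpers might look roughly like this. These are hypothetical sketches (the downloadable solution contains the actual implementations); FEATUREGUID is assumed to hold the ID of the ViewFormPagesLockDown feature:

```csharp
// Hypothetical sketches of the two helper functions used by Check().
private bool isAnonymousEnabled(SPWebApplication webApp)
{
    // Anonymous access is configured per zone; this only checks the default zone.
    return webApp.IisSettings[SPUrlZone.Default].AllowAnonymous;
}

private bool isFeatureActivated(SPSite site, Guid featureId)
{
    // The site feature collection indexer returns null when the feature is not activated.
    return site.Features[featureId] != null;
}
```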

The method itself stores each failing Web Application in a dictionary, together with a counter we can use for reporting the issues.

public override SPHealthCheckStatus Check()
{
            SPHealthCheckStatus ret = SPHealthCheckStatus.Passed;
            SPFarm farm = SPFarm.Local;
            foreach (SPService service in farm.Services)
            {
                if (service is SPWebService)
                {
                    SPWebService webService = (SPWebService)service;
                    foreach (SPWebApplication webApp in webService.WebApplications)
                    {
                        if (isAnonymousEnabled(webApp))
                        {
                            foreach (SPSite site in webApp.Sites)
                            {
                                try
                                {
                                    if (!isFeatureActivated(site, new Guid(FEATUREGUID)))
                                    {
                                        //Anonymous Access is enabled on the Web Application, but the ViewFormsLockDown feature was not found.
                                        //Increment the counter if the Web Application was already recorded; Add() would throw on a duplicate key.
                                        if (this.failingWebApplications.ContainsKey(webApp.Name))
                                        {
                                            this.failingWebApplications[webApp.Name]++;
                                        }
                                        else
                                        {
                                            this.failingWebApplications.Add(webApp.Name, 1);
                                        }
                                    }
                                }
                                finally
                                {
                                    if (site != null)
                                    {
                                        site.Dispose();
                                    }
                                }
                            }
                        }
                    }
                }
            }
            if (this.failingWebApplications.Count == 0)
            {
                ret = SPHealthCheckStatus.Passed;
            }
            else
            {
                ret = SPHealthCheckStatus.Failed;
            }
            return ret;
 }

4. Finally, implement the “AutomaticExecutionParameters” property to enable automatic Timer Job schedules.

public override SPHealthAnalysisRuleAutomaticExecutionParameters AutomaticExecutionParameters
{
            get
            {
                SPHealthAnalysisRuleAutomaticExecutionParameters retval = new SPHealthAnalysisRuleAutomaticExecutionParameters();
                retval.Schedule = SPHealthCheckSchedule.Daily;
                retval.Scope = SPHealthCheckScope.All;
                retval.ServiceType = typeof(SPTimerService);
                retval.RepairAutomatically = false;
                return retval;
            }
 }     

5. We could now build and deploy, but we would still have to register the custom Health Analyzer rules through PowerShell. This process can be automated through SharePoint’s feature framework.

 public override void FeatureInstalled(SPFeatureReceiverProperties properties)
 {
            try
            {
                Assembly currentAssembly = Assembly.GetExecutingAssembly();
                IDictionary exceptions = SPHealthAnalyzer.RegisterRules(currentAssembly);
            }
            catch (Exception ex)
            {
                throw new Exception("There was an error registering the Health Analysis rules: " + ex.Message); 
            }
}

If we compile and deploy, we can see our rule in action.



If you want the rule to report any errors, make sure you enable anonymous access on a Web Application and disable the (hidden) feature. Of course we should use PowerShell for this:

Disable-SPFeature -Identity ViewFormPagesLockDown -Url http://public.demo.loc
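To turn the feature back on afterwards, the corresponding Enable-SPFeature cmdlet works the same way (same hypothetical site URL as above):

```powershell
Enable-SPFeature -Identity ViewFormPagesLockDown -Url http://public.demo.loc
```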

Download the complete solution here.


SharePoint Connections Amsterdam 2012 Slide Deck available


The slide deck from my session for SharePoint Connections Amsterdam 2012 was posted on the SPC site.

Creating Sustainable Solutions With SharePoint 2013

Creating sustainable solutions is always daunting, whether you are a single developer or working as a member in a large team. Every type of project requires a specific approach, there are no silver bullets. Still, there are some considerations you should make with every new SharePoint development project to ensure proper developer workflow and the delivery of maintainable solutions.
In this session we will look at several new tools and topics around SharePoint 2013 and Team Foundation Server 2012 and how you can use them to enhance your development projects.
Topics covered:
• The correct SharePoint solution type
• Solution Lifecycle Management
• TFS supported development
• Quality Assurance

Download my slide deck here: Yuri-BurgerSustainable Solutions with SharePoint 2013

To view a listing of all available slide decks from SPC Amsterdam 2012, visit the SPC site here.


Deploying and activating SharePoint 2013 themes using Visual Studio


I won’t deny it… I never was a big fan of theming SharePoint sites. In MOSS it was implemented absolutely awfully, and the SharePoint 2010 *slash* PowerPoint approach didn’t do it for me either. Glad to see both annoyances are gone now. Let’s take a look at how SharePoint 2013 changes the game and what we need to do to change its looks!

Themes in SharePoint 2013 consist of two things:

  • A theme, as in a color scheme
  • Optionally, a font scheme

Furthermore, a theme is part of what is called a “Composed Look”. These Composed Looks are essentially what is available to you when you click the “Change the look” under Site Settings.

SiteSettings1

Besides a color and font scheme, the following items are part of a composed look:

  • A master page, this can also be a custom master page
  • Optionally a background image

The solution

Things I like to have changed in my custom look:

  1. Some UI colors for specific elements, in this case the SuiteBar at the very top
  2. Some of the fonts used
  3. Add a custom background image

In the end, it should look like this:

Result

I like the default look, but also want to include a background image. This default look is called the Office Look and it doesn’t use any specific font. For fonts, I favour the one used for “Sea Monster”. BTW, don’t you love that theme’s name? If you want to take a look at all the different OOTB themes, take a peek under “Change the look” in your site settings.

Most of these tasks can be done through the UI by uploading the artifacts manually. In this case (and almost any case) I like to use Visual Studio to ensure the correct deployment of these files. Theming, at least for me, is usually done for on premise or dedicated SharePoint solutions, so I don’t mind creating Farm solutions for this task.

Since the schema is pretty important, it is best to start with some of the OOTB parts:

  1. First we copy the color palette used by the Office Look, since we like most of this OOTB theme
  2. Then we copy the fonts used by Sea Monster
  3. And finally grab your custom background logo. Make sure you keep the size of this file under 150KB!

You can download the color palette and fontscheme from the links in the “Composed Looks” settings section.

ComposedLook.1png

To set this up using Visual Studio 2012, we create a new Solution and make it a Farm Solution.

FarmSolution

Next, we add a mapped “Images” folder for our background image.

Then, we add a module (or two if you are including a custom master page) to deploy our theme and font scheme. Add the artifacts and rename them appropriately (companylook-palette001.spcolor and companylook-fontscheme001.spfont in my case).
Modify the elements.xml so we deploy to the correct catalogs. Don’t forget the GhostableInLibrary attribute!

This table lists the possible deployment options for our artifacts:

Master Pages
<Module Name="[Module Name]" Url="_catalogs/masterpage">
  <File Path="[Module Name]\[Master Page Name].master" Url="[Master Page Name].master" Type="GhostableInLibrary" />
</Module>

Themes
<Module Name="[Module Name]" Url="_catalogs/theme/15">
  <File Path="[Module Name]\[Theme Name].spcolor" Url="[Theme Name].spcolor" Type="GhostableInLibrary" />
</Module>

Font schemes
<Module Name="[Module Name]" Url="_catalogs/theme/15">
  <File Path="[Module Name]\[Theme Name].spfont" Url="[Theme Name].spfont" Type="GhostableInLibrary" />
</Module>

Note: The attribute Type=”GhostableInLibrary” indicates that the item is added to the content database. The Url attribute of the module specifies where to store the file in the content database.
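Putting this together for the files in this post, the theme module’s elements.xml could look like the following sketch (the module name “CompanyLookTheme” is hypothetical; the file names match the ones used above):

```xml
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <Module Name="CompanyLookTheme" Url="_catalogs/theme/15">
    <File Path="CompanyLookTheme\companylook-palette001.spcolor"
          Url="companylook-palette001.spcolor" Type="GhostableInLibrary" />
    <File Path="CompanyLookTheme\companylook-fontscheme001.spfont"
          Url="companylook-fontscheme001.spfont" Type="GhostableInLibrary" />
  </Module>
</Elements>
```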

This is what your solution structure should look like right now:

Structure

To create a “Composed Look” we actually need to insert a list item into the “Composed Looks” list. To do this we use a Feature Receiver that adds the list item with the correct field values:

Add a feature receiver to our solution’s feature. Set the scope to “Web”

Then add the following code to insert our custom list item:

public override void FeatureActivated(SPFeatureReceiverProperties properties)
        {
            var web = properties.Feature.Parent as SPWeb;
            var list = web.GetList(string.Format("{0}/_catalogs/design",web.ServerRelativeUrl));
            var item = list.AddItem();

            item["Title"] = "CompanyLook";
            item["Name"] = "CompanyLook";
            item["DisplayOrder"] = 1;

            var masterPageUrl = new SPFieldUrlValue();
            masterPageUrl.Url = masterPageUrl.Description = "/sites/dev/_catalogs/masterpage/seattle.master";
            item["MasterPageUrl"] = masterPageUrl;

            var themeUrl = new SPFieldUrlValue();
            themeUrl.Url = themeUrl.Description = "/sites/dev/_catalogs/theme/15/companylook-palette001.spcolor";
            item["ThemeUrl"] = themeUrl;

            var fontSchemeUrl = new SPFieldUrlValue();
            fontSchemeUrl.Url = fontSchemeUrl.Description = "/sites/dev/_catalogs/theme/15/companylook-fontscheme001.spfont";
            item["FontSchemeUrl"] = fontSchemeUrl;

            var imageUrl = new SPFieldUrlValue();
            imageUrl.Url = imageUrl.Description = "/_layouts/15/images/CompanyLook/Background.png";
            item["ImageUrl"] = imageUrl;

            item.Update();
        }

Remember to set the DisplayOrder, otherwise you will get the default value of 100 and your look will be hidden behind the “Grey” look.

This takes care of the deployment of our custom theme. Site owners can now activate it using the site settings, but you can also do it using code behind with another feature receiver.

Add a feature and feature receiver to our solution. Again, set the scope to “Web”. Add the following code to activate the custom look:

 public override void FeatureActivated(SPFeatureReceiverProperties properties)
        {
            var web = properties.Feature.Parent as SPWeb;
            var list = web.GetList(string.Format("{0}/_catalogs/design", web.ServerRelativeUrl));

            SPQuery query = new SPQuery();
            query.ViewFields = "<FieldRef Name='ThemeUrl' /><FieldRef Name='ImageUrl' /><FieldRef Name='FontSchemeUrl' />";
            query.Query = "<Where><Eq><FieldRef Name='Name' /><Value Type='Text'>CompanyLook</Value></Eq></Where>";

            SPListItemCollection listItems = list.GetItems(query);

            if (listItems.Count >= 1)
            {
                var theme = new SPFieldUrlValue(listItems[0]["ThemeUrl"].ToString());
                var font = new SPFieldUrlValue(listItems[0]["FontSchemeUrl"].ToString());
                var image = new SPFieldUrlValue(listItems[0]["ImageUrl"].ToString());

                if (theme != null && font != null && image != null)
                {
                    // In production code, make sure the files exist before applying the theme!
                    var themeUrl = (new Uri(theme.Url)).AbsolutePath;
                    var fontSchemeUrl = (new Uri(font.Url)).AbsolutePath;
                    var imageUrl = (new Uri(image.Url)).AbsolutePath;

                    web.ApplyTheme(themeUrl, fontSchemeUrl, imageUrl, true);
                }
            }
        }

That’s it! You can download the complete Visual Studio 2012 solution here.


Errors resizing a SharePoint 2013 App Part (Client Web Part)


Since SharePoint Client Web Parts are simply iFrames, resizing them from within your App logic (i.e. App.js) can be a little challenging. Fortunately Microsoft provides a way to do this dynamically using postMessage. More info on this can be found on MSDN.

Basically, how this works is that you send a special ‘resize’ message to your iFrame’s parent (the SharePoint page hosting the App Part). Based on this message, the iFrame is resized to the supplied dimensions:

window.parent.postMessage("<message senderId={your ID}>resize(120, 300)</message>", this.location.hostname);
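For illustration, the message string could be assembled by a small helper. This is a hypothetical sketch; senderId would come from the SenderId token SharePoint passes to the app part page:

```javascript
// Hypothetical helper that builds the resize message shown above.
// height and width are pixel values, or '100%' for full width.
function buildResizeMessage(senderId, height, width) {
  return '<message senderId=' + senderId + '>resize(' + height + ', ' + width + ')</message>';
}

// Inside the app part page you would then call:
// window.parent.postMessage(buildResizeMessage(senderId, 120, 300), '*');
```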

This only works if your Client Web Part has the Web Part title enabled (via the Client Web Part chrome settings). If you don’t, the resizing function breaks, because it cannot find the Web Part’s title DIV:

Chrome

SCRIPT5007: Unable to get property ‘style’ of undefined or null reference

The code throwing the error:

if (resizeWidth)
{
document.getElementById(webPartDivId + '_ChromeTitle').style.cssText = widthCssText;
cssText = 'width:100% !important;'
}

Unfortunately, there is not much we can do about this. If you really need the App Part title gone and still want the dynamic resizing, my choice would be to hide the title using CSS. Meanwhile, we wait for the fix.


Application Lifecycle Management – Improving Quality in SharePoint Solutions


Introduction

“Application Lifecycle Management (ALM) is a continuous process of managing the life of an application through governance, development and maintenance. ALM is the marriage of business management to software engineering made possible by tools that facilitate and integrate requirements management, architecture, coding, testing, tracking, and release management.”

In this and future blog posts we will look at how ALM and the tools that MS provides support us in ensuring high quality solutions. Specifically, we explore a few different types of testing and how they relate to our SharePoint solutions.

  • Manual Tests (this post)
  • Load Tests
  • Code Review/ Code Analysis
  • Unit Tests
  • Coded UI Tests

To get things straight, I like testing. I think it is by far the best (academic) method to prove you did things right. And the best part, even before the UATs start!

This post is not meant to be exhaustive, nor used as the perfect recipe for integrating testing in your product’s lifecycle. It is aimed at getting you started with some of the testing basics and how to set them up for your SharePoint projects. Whether you start a project with the design of your tests (yes, those need designing too) or use them to sign off your project is of course completely up to you.

Note: it is perfectly feasible to have different tests target the same area. For instance, a load test might show memory degradation over time because of memory leaks in your custom solution. A sanity test could warn you about the same issue, but does so by analyzing your custom code and looking for incorrect disposal of objects.

The Visual Studio 2012 Start Screen contains a lot of how-to videos related to testing, so make sure you check those out too!

Videos

Core Concepts

Before we start, let’s look at some of the core concepts.

Tests are part of a test plan.

Sounds pretty simple, right? You cannot start implementing any kind of realistic testing unless you at least determine the following:

Goal you need to set the bar at a certain level. Even if you are just going to test a small part of the products functionality, state so in your test plan goals.
Exceptions highlight areas or components that are not part of your tests. Examples are external line-of-business systems, interfaces and third party assemblies. Or make it opt-in, something along the lines of “anything not addressed here is not part of the test…”.
Tests covered which types of tests do you cover (as opposed to what your tests cover)?
Software used list the tools you need to implement and execute your test plan.
Test data describe your test dataset if you need one.
Test strategy how and in what order are we going to execute these tests? Describe how we will report on the findings and results, and answer questions like what depth is needed and who the actors in our play are.
Test environment describe the required conditions needed for correct execution of the tests.

Tests need a design.

Every test, even the modest ones need some form of design:

Name Descriptive name or title plus a brief description.
Link to a requirement if possible, link a test to a requirement (or set of requirements), a user story, product backlog item or functional description.
Test data used what is needed from your test data set to perform this test? Link or attach the items to your test case.
Measurement what indicators do we use to determine a pass or a fail?
Expected result(s) For example, in case of a load test you would expect the environment to stay in the specified “Green Zone”. In case of a manual UI test every step could have its expected result:
  • Step 1: Login as user x with password y.
  • Expected result: Intranet landing page should render without errors.
  • Step 2: Navigate to news by clicking the Global Navigation Menu Item labeled “News”.
  • Expected result: News overview page should render, with a list of articles ordered by date descending.
Pass/ fail criteria

Manual Tests

Simple but powerful. These types of tests are easy to set up (as in non-complex), but usually require a decent amount of effort to work out. My advice: keep them simple by design, with small descriptive steps. You could administer the different tests using Microsoft Excel, but of course Microsoft offers tooling for this.

Microsoft Test Manager allows you to plan, manage, design and execute manual tests. It can also be used to run exploratory test sessions and automated tests, directly from a test plan. Connected to TFS (or Team Foundation Service) it enables the logging of bugs and can provide trace information and snapshot capabilities. It enables the tester to provide as much valuable information using recorded actions and a commenting system.

Microsoft Test Manager requires one of the following Visual Studio editions: Visual Studio Ultimate, Visual Studio Premium or Visual Studio Test Professional.

More information: http://msdn.microsoft.com/en-us/library/jj635157.aspx

To get you started using Microsoft Test Manager, check the hyperlink mentioned above or follow these steps.

  1. Fire up Test Manager, either from a Visual Studio test case or directly from the program group.
  2. Connect it to Team Foundation Server or Team Foundation Service. For more information about TFS and Team Foundation Service, see previous blog posts:
  3. Create a test plan (see “Core Concepts”)
m1 m2

Your test plan should contain at least one Test Suite that will hold the actual tests to be run. You can add Test Suites directly by hand or automatically by adding requirements to your plan. As a bonus, a default static Test Suite is created automatically for you but you can also nest Test Suites if you like.


  4. Add Test Suites by adding requirements to your plan. The query by default shows all items from the “Requirements” category and thus includes bugs, backlog items, etc.
m3a m3
  5. Finally, we can start adding Test Cases to our Test Suite. If you already have Test Cases set up through Visual Studio or TFS web, you can use the query-based “Add” method. There is also an option to create them directly from Microsoft Test Manager through the “New” button:

m4

These Test Cases are stored in TFS and automatically linked to the Product Backlog Item. You can also attach or link additional items or documents for the Tester to use.

The actual test run is also performed from Microsoft Test Manager. It shows the tester the steps and expected outcome using a “split-screen”:


The real fun starts when testers provide feedback whenever a step fails. From the results of this test, a bug can be created and stored in TFS. This bug then contains the steps performed and any extra information (comment, video, screenshot, etc.) the tester provided.


More posts to come :)

Update: Part two is ready, so now it is officially a series (albeit a small one).

  • Load Tests
  • Code Review/ Code Analysis
  • Unit Tests
  • Coded UI Tests

/Y.


Slide deck SharePoint Saturday Stockholm


I had the pleasure of speaking at the SharePoint Saturday event in Stockholm. This (perfectly organized) event was held on January 25th 2014. The slide deck from my session on creating sustainable solutions with SharePoint 2013 is now available on SlideShare.

Thanks to all who joined my session, and thanks to the organizers for a great event!

Session details: Creating sustainable solutions with SharePoint 2013

Creating sustainable solutions is always daunting, whether you are a single developer or working as a member in a large team. Every type of project requires a specific approach, there are no silver bullets. Still, there are some considerations you should make with every new SharePoint development project to ensure proper developer workflow and the delivery of maintainable solutions.

In this session we will look at several tools and topics around SharePoint 2013, the different solution types and Team Foundation Server and how you can use them to enhance your development projects. Topics covered:

  • The correct SharePoint solution type, basically “Farm v.s. Sandbox v.s. the App Model”
  • JavaScript and HTML frameworks, why and when to use
  • Solution Lifecycle Management
  • TFS supported development

/Y.


SharePoint 2013 Solution Type Diagram


WhatToBuild.png

A couple of years ago I wrote this post about the Solution Type in SharePoint 2010. Since the introduction of SharePoint 2013 we now have the new App Model to consider so I thought it was time to create an updated version.

I use the Solution Type Diagram in my day-to-day decisions to get an idea of what kind of solution I would like to architect, or at least start out with. Remember, as with every SharePoint project, circumstances may differ and it always depends :). So this is my version and you probably need to adapt it a little to suit your needs and your customer’s needs (your mileage may vary).

Decision.png

Let me walk through the considerations on this fairly simple chart:

Decision: “Just Artifacts?”

We all probably know by now that Microsoft has deprecated the Sandbox Solution model. But as MSDN states, this is true only for the use of custom managed code within the sandbox solution. If you just need to deploy SharePoint artifacts like lists, content types, images or JavaScript code, Sandbox solutions are still a viable option and in fact not deprecated. More information on this subject can be found on MSDN:

http://msdn.microsoft.com/en-us/library/jj163114.aspx

So if we can stick with just declarative markup for deploying our SharePoint artifacts, the Sandbox solution is usually the fastest way to go. In all other cases it is not and we need to move further down the Solution Type diagram.

Decision: “CAM compatible?”

Compatible with the Cloud Application Model, as Microsoft likes to call it (or App Model). First, look at the technical side of the question: can we deliver and meet the requirements with App Model components? That means no legacy artifacts like Timer Jobs, Application Pages, etc. If we do need those, we could split off the legacy bits and try to come up with a hybrid solution. But we should also look at it on a more conceptual level. The App Model is pretty much intended for on-demand installation, like picking an app from the store and having an end user install it him or herself. So if you have an Intranet branding solution complete with My Site branding, an app might just not be the right delivery model. But if you have a collection of Web Parts, an app is probably the right way to go.

My bottom line: Apps are intended for End Users, Full Trust and Sandbox solutions are for Administrators.

Since the App Model is new and still sort of version 1.0, there are a lot of caveats. Most of these are technical, like the issues with apps published through Microsoft UAG, or apps reading or writing lists on an anonymous Office 365 public-facing internet site.

Decision: “Office 365?”

Do we target Office 365 or SharePoint Online? And I mean SharePoint Online or Office 365 multi-tenant (not the Office 365 dedicated version). I tend to look a little bit further down the road with this one: you might have customers asking for an on-premises solution, but if you know the customer is considering or even evaluating O365, you might want to answer yes to this question. You don’t want to design a solution that could potentially block a customer from migrating to the cloud or implementing a hybrid environment.

The next couple of decisions are when we actually are compatible with the App model.

Decision: “Workflow, scheduled tasks?”

If we are just using HTML, CSS, JavaScript, local SharePoint lists and external data through REST, a SharePoint Hosted App is probably what we are looking for. Remember, no server-side custom code is available! If we do need things like custom workflow (or workflow behavior), scheduled tasks or remote/app event receivers, SharePoint Hosted is not an option and we need to move further down the diagram.

Decision: “Office 365?”

Again we look if we need to target SharePoint Online or Office 365. If we don’t, Auto Hosted is not an option and we need to go with a Provider Hosted app.

Decision: “Sell through the store?”

Do we need to sell through the Microsoft Store? This option is currently only available for SharePoint Hosted and Provider Hosted Apps. Auto Hosted Apps are not permitted in the store (yet). In that case you can of course opt to license, sell and distribute the app yourself.

So these decisions should eventually get you to a solution type:

Sandbox Solution (No Code Sandbox Solution):  Still a viable option and O365 compatible
Sandbox Solution (with custom managed code):  Deprecated, use with caution
Full Trust Solution:                          Avoid if possible since it blocks customers from moving to the cloud
Apps:                                         The preferred solution model, but watch out for technical and architectural caveats
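The decision path above can also be sketched as code. The following is only an illustration of the flowchart: the function and property names are invented here, and your own chart may add branches.

```javascript
// A sketch of the solution-type decision diagram. Property names are
// invented for illustration; adapt the branches to your own chart.
function chooseSolutionType(req) {
  if (req.justArtifacts) {
    // Declarative markup only: no-code sandbox solutions are not deprecated.
    return "Sandbox Solution (no custom managed code)";
  }
  if (!req.appModelCompatible) {
    // Legacy bits (timer jobs, application pages, ...) need server-side code.
    return req.targetsOffice365
      ? "Reconsider: split off the legacy bits into a hybrid solution"
      : "Full Trust Solution";
  }
  if (!req.needsWorkflowOrScheduledTasks) {
    // HTML, CSS, JS, local lists and REST only
    return "SharePoint Hosted App";
  }
  // Auto Hosted requires Office 365 and is not permitted in the store.
  if (!req.targetsOffice365 || req.sellThroughStore) {
    return "Provider Hosted App";
  }
  return "Auto Hosted App";
}

console.log(chooseSolutionType({ justArtifacts: true }));
// A collection of Web Parts with remote event receivers, sold via the store:
console.log(chooseSolutionType({
  appModelCompatible: true,
  needsWorkflowOrScheduledTasks: true,
  targetsOffice365: true,
  sellThroughStore: true
}));
```

Encoding the chart like this is handy for discussing edge cases with a customer, but it is the reasoning that matters, not the code.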

And while you are at it, remember to check the availability of your choice if you are targeting Office 365 or SharePoint online. Luckily not much diversity there, although it never hurts to check if this has been changed. See TechNet for more information: http://technet.microsoft.com/en-us/library/5e1ee081-cab8-4c1b-9783-21c38ddcb8b0

Developer features                       O365 Small Business   O365 Midsize Business   O365 Enterprise E* / Education A* / Government G*
App Deployment: Autohosted Apps          Yes                   Yes                     Yes
App Deployment: Cloud-Hosted Apps        Yes                   Yes                     Yes
App Deployment: SharePoint-Hosted Apps   Yes                   Yes                     Yes
Full-Trust Solutions                     No                    No                      No
REST API                                 Yes                   Yes                     Yes
Sandboxed Solutions                      Yes                   Yes                     Yes

/Y.



Versioning your SharePoint Apps using Team Foundation Server Team Builds


One of the cool features of Team Foundation Server is the ability to schedule and execute builds. This feature is simply known as “Team Builds” (or even shorter, just “Builds”) and provides some interesting extensibility points for doing Build Management.

When it comes to build management, enabling versioning on my SharePoint Apps is usually the first thing I do when creating a new build.

Quick introduction to Builds

Builds allow you to create a Build Definition and configure the Build Process. After the definition is ready and saved back to TFS, you can queue new builds on demand, scheduled or as part of your Continuous Integration setup:

Trigger

The second neat thing about Builds is, they can output your build to a drop location, ready for you to collect and distribute:

Drop

On top of all this, builds provide a way to trace your output back to a certain build. So if you have an application that contains an assembly (like a SharePoint Full Trust Solution or a SharePoint Provider Hosted App), Builds can brand these files with a configurable build number. Usually this number contains things like a build date and sequence number and matches the build report name:

Report

Note: in case of assemblies it was common practice for SharePoint custom solutions to only update the AssemblyFileVersion attribute of an assembly. If we kept the AssemblyVersion attribute at 1.0.0.0 we steered away from unneeded web.config SafeControl modifications.

The supplied script also uses this technique if you supply the “-FixedVersion” build argument. See Modification 2 for how to achieve this.

The Script

Before Team Foundation Server 2013, customized assembly versioning was usually done by creating a custom build activity and updating (or creating your own) Build Process Template. With TFS 2013 we still have those options, but we also have the ability to provide pre- and post-build PowerShell scripts. These configurable build process parameters are the ideal place to extend and customize our build process. No heavy lifting required, all PowerShell!

See MSDN for more information on these parameters: http://msdn.microsoft.com/en-us/library/vstudio/dn376353.aspx

There is an awesome project on Codeplex that provides a great starting point for our customized build: Community TFS Build Extensions. These extensions also include PowerShell scripts with many, many activities. Most of these are easy to read, well documented and their usage obvious.

See Codeplex for more information: https://tfsbuildextensions.codeplex.com/documentation

So, with those scripts at hand we can start to customize our build process. Since I only made 2 small modifications, I will jump right in.

Modification 1: Version your AppManifest.xml (internally)

Versioning your app is pretty simple; you can of course do that through the Visual Studio UI.

AppVersioning

But there are a couple of issues with this:

  • This value can be changed by developers through the Visual Studio UI. You could of course crack open the AppManifest and overwrite this value upon team build, but…
  • …updating this value triggers the app update cycle, and that might not always be what you want.

This usually only concerns me if I am dealing with SharePoint Hosted Apps. Because those types of apps don’t have an assembly and commonly contain only HTML, CSS and JS, it is pretty hard to trace an app back to a certain build.

So we use the Team Foundation Server pre-build script option to version our AppManifest.xml internally by inserting an XML comment. In this case “BuildNumber”.

<?xml version="1.0" encoding="utf-8"?>
<!--Published:920014D5-5E5A-442D-934A-437E4FB0A4DA-->
<!--Created:cb85b80c-f585-40ff-8bfc-12ff4d0e34a9-->
<!--BuildNumber: Demo App MAIN Build_2014.04.18.1-->
<App xmlns="http://schemas.microsoft.com/sharepoint/2012/app/manifest" Name="DemoApp" ProductID="{91a961f1-88fd-4675-b3b4-48d0de3ed086}" Version="1.0.0.0" SharePointMinVersion="15.0.0.0">
  <Properties>
    <Title>Demo App</Title>
    <StartPage>~appWebUrl/Pages/Default.aspx?{StandardTokens}</StartPage>
  </Properties>
  <AppPrincipal>
    <Internal />
  </AppPrincipal>
</App>

To get the comment into the AppManifest, I started with the scripts from the Community TFS Build Extensions mentioned earlier. The kit already contains an ApplyVersionToFiles.ps1, but that script only targets assemblies. By adding a few lines, we make it SharePoint App aware :)

# Modification: support SharePoint App Manifest versioning
# Apply the build number to the manifest files
$NewBuildNumber = $Env:TF_BUILD_BUILDNUMBER

$files = Get-ChildItem $Env:TF_BUILD_SOURCESDIRECTORY -Recurse -Include AppManifest.xml
if ($files)
{
    Write-Verbose "Will apply $NewBuildNumber to $($files.Count) files."

    foreach ($file in $files)
    {
        # $Disable comes from the parameters of the original script
        if (-not $Disable)
        {
            [System.Xml.XmlDocument] $doc = New-Object System.Xml.XmlDocument
            $doc.Load($file.FullName)

            # Clear the read-only flag so the file can be saved
            attrib $file.FullName -r

            $newComment = $doc.CreateComment("BuildNumber: $NewBuildNumber")
            $root = $doc.DocumentElement
            $doc.InsertBefore($newComment, $root)

            $doc.Save($file.FullName)

            Write-Verbose "$($file.FullName) - version applied"
        }
    }
}
else
{
    Write-Warning "No manifest files found."
}

This block of code locates the AppManifest.xml files in the Build Sources directory and adds the build number as an XML comment.
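The essence of that step is small enough to illustrate outside PowerShell. The sketch below is only a demonstration: the helper name is made up, and the string-based shortcut stands in for the System.Xml calls the real script uses.

```javascript
// Illustration of the manifest-stamping step: insert a BuildNumber comment
// right after the XML declaration, before the root element. The real build
// script uses System.Xml; this string-based shortcut is just for demonstration.
function stampBuildNumber(manifestXml, buildNumber) {
  const comment = "<!--BuildNumber: " + buildNumber + "-->";
  // Capture the XML declaration and re-emit it followed by the comment.
  return manifestXml.replace(/(<\?xml[^?]*\?>)/, "$1\n" + comment);
}

const manifest =
  '<?xml version="1.0" encoding="utf-8"?>\n' +
  '<App Name="DemoApp" Version="1.0.0.0" />';

console.log(stampBuildNumber(manifest, "Demo App MAIN Build_2014.04.18.1"));
```

Because the build number lives in a comment rather than in the Version attribute, it never triggers the app update cycle.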

Modification 2: Fixed versioning your assembly

Next modification we make to the build process is the ability to set the assembly file version and to leave the assembly version unchanged. This is of course not targeted at SharePoint Hosted Apps (since there is no assembly), but mainly for the legacy (Full Trust/ Sandbox) stuff. As mentioned, it was common practice for SharePoint custom solutions to only update the AssemblyFileVersion attribute of an assembly and keep the AssemblyVersion at 1.0.0.0. We still like to track back an assembly to a certain build, so we create a second modification:

# Modification: support SharePoint fixed assembly versioning
# We don't want to match the AssemblyVersion, just the AssemblyFileVersion
if ($FixedVersion)
{
    $VersionRegex = "\d+\.\d+\.\d+\.\d+(?<!AssemblyVersion.+)"
}
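You can see the effect of that negative lookbehind with a quick experiment. It is shown here in JavaScript (which also supports variable-length lookbehind); the sample AssemblyInfo lines and the replacement value are made up for illustration.

```javascript
// Only the AssemblyFileVersion line should be rewritten; the negative
// lookbehind rejects any version preceded by a literal "AssemblyVersion".
const versionRegex = /\d+\.\d+\.\d+\.\d+(?<!AssemblyVersion.+)/g;

const assemblyInfo = [
  '[assembly: AssemblyVersion("1.0.0.0")]',
  '[assembly: AssemblyFileVersion("1.0.0.0")]'
].join('\n');

console.log(assemblyInfo.replace(versionRegex, '2014.4.18.1'));
// The AssemblyVersion line stays at 1.0.0.0; only AssemblyFileVersion changes.
```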

The supplied script enables this if you supply the “-FixedVersion” build argument. After you build your solution and fetch it from the drop location, your assembly properties will look like this:

Version1

But the assembly version will stay at 1.0.0.0 as shown here (using JetBrains dotPeek which is a great tool by the way):

Version2

The Setup

So how do we get all this to work?

Step 1: get the scripts from Codeplex: https://tfsbuildextensions.codeplex.com/documentation

Step 2: apply the modifications or get the scripts attached to this post.

Step 3: upload the script files to source control. I usually place them in a BUILD folder outside of code branches.

Build3

Step 4: Create a new build definition. There are many good posts about how to do this so I will only include the basic steps here.

Supply a build definition name

Build1

Configure the drop location

Build2

Set build process parameters

Build

Step 5: kick off your team build and check the results in your DROP folder.

Happy building!

You can download the scripts here.

/Y.


JSLink and DataTables


jsdatatables

Whenever I have to work with client side tabular data, I like to pull in DataTables since it is such a powerful plugin. For those of you who are not familiar with DataTables check the link or the info at the end of the post.

SharePoint itself of course has its default way of presenting tabular data, mainly through the well-known OOTB List View:

1

This default view has some nice options, like paging and search but lacks in UI goodness and some advanced options like multi column ordering. So how would we go from this OOTB view to a much nicer DataTables view like this one?

2

Before SharePoint 2013 we had to use SharePoint Designer, custom XSLT List Views or even entire custom Web Parts to achieve the aforementioned DataTables view. SharePoint 2013 adds the ability to leverage Client Side Rendering, a totally new concept. It allows you to render your own output for list views, fields and forms. The rendering is based on HTML, CSS and JavaScript and can be deployed using simple files.

In this post, I will create a custom List View and present the list data using DataTables. What do we need?

  • Since DataTables is a jQuery plugin, you need jQuery loaded on the page where the view is used. You can use a Content Editor Web Part, custom master page or CustomAction to reference the file from a document library or CDN. I prefer the CustomAction, but any option should work.
  • DataTables library. The actual library containing the plugin.
  • Optionally a theme integration JavaScript file. In this example I used the jQuery UI plugin, but you can also style the DataTable yourself with or without the use of a HTML framework like Foundation or Bootstrap.
  • Optionally a DataTables CSS file if you want to control the table style.
CSS      jquery-ui.dataTables.min.css   DataTables CSS with jQuery UI theme
Images   <several>                      Images used by the jQuery UI theme
JS       jquery.dataTables.min.js       DataTables library
JS       dataTables.jqueryui.min.js     jQuery UI integration file

Once the files are available and loaded on the page, we need to create our JSLink file. Please note that the inner workings of JSLink are beyond the scope of this article. Fortunately, there is quite a lot of information available on the web. See this link for more information: http://msdn.microsoft.com/en-us/magazine/dn745867.aspx

The Solution

We need our solution to be a bit generic, because we do not want to hardcode column names and column types. So in this case, our default view supplies several columns and we have to take care of the correct rendering.

We need the following code to set up (“register”) our DataTables template and handle the dynamic column names:

function registerDataTables() {
	var itemCtx = {};

	itemCtx.Templates = {};
	itemCtx.Templates.Header = "<table class='display' id='datatablesListView'>";
	itemCtx.Templates.Item = ItemOverrideDataTables;
	itemCtx.Templates.Footer = "</table>";
	itemCtx.ListTemplateType = 100;
	itemCtx.OnPostRender = [];

	// The render context is passed to OnPostRender callbacks
	itemCtx.OnPostRender.push(function (ctx) {
		var columns = [];
		var index, len;
		for (index = 0, len = ctx.ListSchema.Field.length; index < len; ++index) {
			columns.push({ "title": ctx.ListSchema.Field[index].DisplayName });
		}

		$("#datatablesListView").dataTable({
			"columns": columns
		});
	});

	SPClientTemplates.TemplateManager.RegisterTemplateOverrides(itemCtx);
}

This code sets up the DataTables container (the HTML table structure) and the list item override. On Post-Render it scans the available column names (from ctx.ListSchema.Field) and initializes the dataTable plugin.
For the list item override we have the following code:

function ItemOverrideDataTables(ctx) {
	var rowItem = "<tr>";

	var index, len, index1, len1;
	for (index = 0, len = ctx.ListSchema.Field.length; index < len; ++index) {
		var cell = "";

		// Test for an array (covers most multi-select field types)
		if (Object.prototype.toString.call(ctx.CurrentItem[ctx.ListSchema.Field[index].RealFieldName]) === '[object Array]') {
			for (index1 = 0, len1 = ctx.CurrentItem[ctx.ListSchema.Field[index].RealFieldName].length; index1 < len1; ++index1) {
				cell += ctx.CurrentItem[ctx.ListSchema.Field[index].RealFieldName][index1].title + " ";
			}
		}
		else if (ctx.ListSchema.Field[index].Name === "LinkTitle") {
			// Render a link to the item display form
			cell = "<a href='" + ctx.displayFormUrl + "&ID=" + ctx.CurrentItem.ID + "'>";
			cell += ctx.CurrentItem[ctx.ListSchema.Field[index].RealFieldName];
			cell += "</a>";
		}
		else {
			cell = ctx.CurrentItem[ctx.ListSchema.Field[index].RealFieldName];
		}

		rowItem += "<td>" + cell + "</td>";
	}

	rowItem += "</tr>";

	return rowItem;
}

This function is responsible for constructing the table rows and individual cells. In my example I take care of a couple of specific field renderings:

  • Arrays, to handle most of the SharePoint multi-select field types
  • LinkTitle, to render a link to the item (from the displayFormUrl property)
  • Generic fields that just need to display a text value
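Because the override only builds plain HTML strings from the render context, you can exercise the same row logic outside SharePoint with a stubbed ctx. The field names and item values below are invented for illustration, and Array.isArray stands in for the Object.prototype.toString check for brevity.

```javascript
// Stubbed render context to exercise the row-building logic outside SharePoint.
function buildRow(ctx) {
  var rowItem = "<tr>";
  for (var i = 0; i < ctx.ListSchema.Field.length; ++i) {
    var field = ctx.ListSchema.Field[i];
    var value = ctx.CurrentItem[field.RealFieldName];
    var cell;
    if (Array.isArray(value)) {
      // Multi-select fields: join the titles
      cell = value.map(function (v) { return v.title; }).join(" ");
    } else if (field.Name === "LinkTitle") {
      // Title column: link back to the item's display form
      cell = "<a href='" + ctx.displayFormUrl + "&ID=" + ctx.CurrentItem.ID + "'>" + value + "</a>";
    } else {
      cell = value;
    }
    rowItem += "<td>" + cell + "</td>";
  }
  return rowItem + "</tr>";
}

var fakeCtx = {
  displayFormUrl: "/sites/demo/Lists/Tasks/DispForm.aspx?PageType=4",
  CurrentItem: { ID: 7, Title: "Review", AssignedTo: [{ title: "Alice" }, { title: "Bob" }] },
  ListSchema: {
    Field: [
      { Name: "LinkTitle", RealFieldName: "Title" },
      { Name: "AssignedTo", RealFieldName: "AssignedTo" }
    ]
  }
};

console.log(buildRow(fakeCtx));
// <tr><td><a href='/sites/demo/Lists/Tasks/DispForm.aspx?PageType=4&ID=7'>Review</a></td><td>Alice Bob</td></tr>
```

Testing the markup this way is a lot quicker than redeploying the JSLink file for every tweak.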

You will have to adjust this if you want the template to behave differently. We save this code to a separate JSLink JavaScript file so we can reference it from our Web Part properties.

The Setup

The setup of the solution is quite easy since you just need to copy the files to a shared location (for instance the Style Library, Asset Library or Master Page gallery). The most important part is to have the JavaScript and CSS files loaded on the page you enable this view on. As mentioned earlier, there are multiple ways to achieve this so you should be able to get this to work. Just remember, you only need the support files pre-loaded on the page and not the actual JSLink file. That file we reference from our Web Part properties:

3

Once saved and published, the page should show the new and shiny data view. Oh, and dataTables has a ton of options so make sure you check out the links below if you are interested.

Download the sample files here: Blog.Examples.DataTables.zip

More info on Datatables:

DataTables is a plug-in for the jQuery Javascript library. It is a highly flexible tool, based upon the foundations of progressive enhancement, and will add advanced interaction controls to any HTML table.

  • Highly configurable
  • Super-fast
  • Pagination, instant search and multi-column ordering
  • Supports almost any data source: DOM, Javascript, Ajax and server-side processing
  • Easily theme-able: DataTables, jQuery UI, Bootstrap, Foundation
  • Wide variety of extensions
  • Extensive options and a beautiful, expressive API
  • Fully internationalisable
  • Professional quality
  • Free open source software

https://datatables.net/


Slidedeck SharePoint Artifact Provisioning

Building your SharePoint Apps using the new scriptable build in VSO


BuildHeader

This post is about Team Builds and the new build system for Visual Studio (Online) that is currently in preview. If you are new to team builds or not sure why you should use them, you might want to check my earlier post about Application Lifecycle Management for SharePoint People. Especially the part about local builds being plain evil :)

Visual Studio (Online) has a new build system!

Although still in preview, this new system has some compelling features compared to the current XAML build system. For one, it is truly web-based, but more importantly it targets cross-platform applications in a way the XAML build definitions never did. If you would like to know more, see the feature overview: https://msdn.microsoft.com/Library/vs/alm/Build/feature-overview

If you would still like to use the XAML build definitions, you can… for now at least.
Compared to the XAML build definitions, creating a build for SharePoint Apps is quite a lot easier. You don’t need any custom build tasks or post-build PowerShell scripts to create the app package, and you can extend the process with customizable build steps. In this post I show you how to set up a simple SharePoint App build, but you can extend it to manage more complex build scenarios such as CI or to support your Release Management process.

To be able to make use of the build system, you need a Visual Studio Online account. You can create one for free on https://www.visualstudio.com/ if you do not already own one. Each of these free accounts will get you:

  • 5 free Basic user licenses
  • Unlimited stakeholders
  • Unlimited eligible MSDN subscribers
  • Unlimited team projects and private code repos
  • Free work item tracking for all users
  • Free 60 minutes/month of build
  • Free 20K virtual user minutes/month of load testing

After creating your account, the first thing you need is to set up a project to hold at least the source code and build definitions. You only need to supply a few basic settings, like project name and version control system. Pick anything you like; the process is the same for all process templates and version control systems. If you don’t know what to use, I would suggest Scrum and Git.

New VSO Project

Preparation

If you already have a SharePoint App project in Visual Studio you need to add it to source control to be able to build it using the Hosted Build Controllers. See https://msdn.microsoft.com/Library/vs/alm/Build/github/index for more information on how to accomplish this task. If you don’t, you can set one up right from the Visual Studio IDE:

NewProjectVS

In this case I will pick a simple SharePoint Hosted App, but you can use the same process for Provider Hosted Apps. After the first “Commit” and “Sync” (Git Pull + Push) the source code should be available on Visual Studio Online:

CodeView

Now that our source code is available, we can start creating our first scripted build.

Step 1: Create new build
You access the build system from the navigation menu at the top:

BuildMenu

This page allows you to manage both your XAML builds and the new scripted builds. If you now click the “+” sign, you have several build templates to choose from (including an empty one), but the Visual Studio template will do just fine for most projects:

ChooseTemplate

We are now presented with the actual build configuration. From the template we already have 4 steps present, but for the sake of this sample we can remove both the Visual Studio Test and the Index Sources & Publish Symbols step. This leaves us with only 2 steps to configure:

Step 2: Visual Studio Build Configuration

Here we need to supply some specific build arguments to be able to build and package SharePoint Apps. If you are already familiar with building SharePoint Apps with XAML templates, you will probably recognize them. One thing to note: you do not need to specify the publish directory (“/p:PublishDir”) because that is taken care of by the next step.

The following arguments are needed:

  • /p:IsPackaging=true
  • /p:AppSpecificPublishOutputs=true
  • /p:VisualStudioVersion=12.0

BuildConfig

Step 3: The Publish Build Artifacts Configuration

The next step takes care of publishing the build output to the drop location so we can access the results. In our case, we at least need to add the “.app” package.

PublishConfig

Now we can save and run our build!
Step 4: Running the build
Now that we are all done configuring, we can run our build right from the Visual Studio Online UI. You can do this by choosing “Queue build…” from the context menu or page:

QueueBuild

Here you will notice one of the new features of this new scripted build system: instant detailed feedback on the build process through a nice looking console!

FeedbackBuild

If all goes well, you should now have a SharePoint App built by Visual Studio Online! If not, you have access to highly detailed logs, right from this Build Result page. To access the artifacts, navigate to the build summary by clicking the top left build item:

BuildResult

If you then navigate to “Artifacts” you can download or explore your drop folder which should contain your SharePoint App package:

Artifacts

Happy building (with VSO)!

/Y.


Slidedeck SharePoint Saturday Oslo 2015 now available
