Materials From My MVVM using Knockout.js Session at E4D Expert Days Conference

Yesterday I gave my second session at E4D’s Expert Days conference: Single Page Applications fundamentals using Knockout.js.


The session dealt with the aspects of creating a SPA: the ins and outs of using Knockout.js, how to use Require.js for dependency injection, and using HTML5 offline storage. We finished up with a small but cute overview of Underscore.js, the Swiss Army knife of JavaScript utility functions.

I would like to thank everyone who joined me for this exciting session, and I’m looking forward to seeing some of you tomorrow.

If you would like to take another look at the session’s presentation – click here

To download the code samples – click here

To learn more about JavaScript the language check out my previous presentation on the subject – here

Materials From My SharePoint 2013 Session at E4D Expert Days Conference

Today was the first day of the annual E4D Solutions Expert Days conference. My first session at the conference was about SharePoint 2013, Microsoft’s biggest evolution of the SharePoint platform to date.


The session dealt with many of the new aspects of SharePoint development. We started the session with an overview of SharePoint 2013 and then deep-dived into MDS (Minimal Download Strategy), Remote Event Receivers, the awesome new feature of Client-side Item Rendering, and finished off with the new array of RESTful OData services and the new and improved CSOM (Client-Side Object Model) (and a few words about Apps, of course).

Since the SharePoint development model is shifting toward the client side, I also gave a quick introduction to Knockout – a JavaScript MVVM framework, and a personal favorite of mine.

During the rest of the week I’ll host two more sessions – SPA Applications with Knockout (on Tuesday, 25/12/2012) and SignalR & NodeJS Development (on Thursday, 27/12/2012).

If you would like to take another look at today’s presentation on SharePoint 2013 – click here

To download the code samples – click here

For my presentation about Knockout and jsRender – click here


SharePoint 2010’s Social Web Parts and Web templates/Site definitions

In today’s segment of “Things that should work out of the box but don’t” I’m going to discuss adding any of the social web parts (Tag Cloud, Note Board etc.) to a page/site definition or a web template.

Coming into this, I thought it was going to be as simple as exporting the web part I wished to add (Tag Cloud in my case), copying its .webpart file’s content into my pages module’s elements.xml file and deploying my lovely solution. Boy, was I wrong…

Doing the above, I was greeted with the informative error message: “Cannot import this web part“. After double (and triple) checking the XML data I had pasted, I realized this wasn’t going to work. A quick search on Google revealed the following article from Microsoft itself:

“You receive the error “Cannot import this web part” when you try to import an exported TagCloud Web Part in SharePoint 2010” (article link)

To summarize the article: the social web parts are based on Microsoft.SharePoint.WebPartPages.WebPart and not System.Web.UI.WebControls.WebParts.WebPart, and as such, the v3 schema (the .webpart file’s content) that SharePoint exports when you click Export is not valid for web parts based on the older namespace (Microsoft.SharePoint.WebPartPages.WebPart). These web parts need to be exported using the v2 schema, but sadly SharePoint 2010 doesn’t give you this option. In order to get the v2 schema (a .dwp file), the article suggests you head over to the web part gallery of the site collection and export the web part from there.
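For reference, a v2 .dwp file has roughly the following shape – the assembly and type names below are from memory and shown purely for illustration; always treat the .dwp you export from the web part gallery as the source of truth:

```xml
<?xml version="1.0" encoding="utf-8"?>
<WebPart xmlns="http://schemas.microsoft.com/WebPart/v2">
  <Title>Tag Cloud</Title>
  <!-- Assembly/TypeName shown as an example; copy yours from the exported file -->
  <Assembly>Microsoft.SharePoint.Portal, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c</Assembly>
  <TypeName>Microsoft.SharePoint.Portal.WebControls.TagCloudWebPart</TypeName>
</WebPart>
```

Note the flat element structure – unlike the v3 .webpart format, which nests everything inside a webParts/webPart/data hierarchy.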

Success! Doing this exports the web part with the v2 schema, providing the much-needed .dwp file. I added the file’s content to my pages module’s elements.xml file for the page it should appear on, and the skies were all smiling and happy again. That is, until you try to set ANY property of that web part that is specific to it (i.e. the tag cloud’s scope property). In my particular case, I needed to have the Tag Cloud web part scoped to “Under the current URL by all users“. No matter what I tried to add to the schema, nothing worked.

Long story short, I decided to handle this change at the code level.

Changing web part properties programmatically:

At the heart of working with web parts programmatically lies the SPLimitedWebPartManager object. From the MSDN documentation for this object we learn that “The SPLimitedWebPartManager provides a limited set of Web Part operations that can be performed in object model scenarios when there is no HttpContext and no instantiated Page object.”

Since we are not going to instantiate any Page object here, this is perfect for us.

One ‘gotcha’ to note about the SPLimitedWebPartManager object: always make sure you check out the page whose web part properties you are going to change – before – you initiate the object!

Checking out the desired page can be done with the following code:

SPList pagesList = web.Lists["Pages"];
SPQuery query = new SPQuery();
query.Query = @"<Where><Eq><FieldRef Name='LinkFilename' /><Value Type='Text'>Default.aspx</Value></Eq></Where>";
query.ViewFields = "<FieldRef Name='LinkFilename' />";

SPListItemCollection page = pagesList.GetItems(query);
if (page != null && page.Count > 0)
{
    // Check out the page before initiating the SPLimitedWebPartManager
    page[0].File.CheckOut();
}

Once the page is checked out, we can move forward and initiate the SPLimitedWebPartManager object:

SPLimitedWebPartManager webparts = web.GetLimitedWebPartManager(web.Url + "/Pages/Default.aspx", System.Web.UI.WebControls.WebParts.PersonalizationScope.Shared);

Now that we have the SPLimitedWebPartManager ready, we can loop through all of the web parts on the page, looking for the specific one we need, and change its properties! In my example I will change the scope of a Tag Cloud web part:

foreach (var wp in webparts.WebParts)
{
    if (wp is TagCloudWebPart)
    {
        (wp as TagCloudWebPart).UserScope = TagCloudUserScope.UnderUrlEveryone;
        webparts.SaveChanges(wp as TagCloudWebPart);

        // Check in and publish the page that hosts the web part
        SPFile page = web.GetFile(web.Url + "/Pages/Default.aspx");
        page.CheckIn("Web part properties updated");
        page.Publish("Web part properties updated");
        break;
    }
}
The above code loops through the WebParts collection of the SPLimitedWebPartManager object and looks for a web part of type TagCloudWebPart. Once it finds it, it sets its user scope property, checks in and publishes the page that hosts it, and breaks out of the loop.

You can use this method to programmatically change the properties of any kind of web part you wish, though I would recommend using the v3 schema for that where possible. It’s much, much easier.

And with this optimistic message we conclude this segment of “Things that should work out of the box but don’t“.

Materials from my Knockout.js and JSRender session

Yesterday I gave my next session of E4D Solutions’ new client side development course over at E4D’s offices (you can find the presentation of the first session – What can html5 actually do? – right here, the second session – JavaScript the language – right here, and the third session – Bootstrap, tips and other animals – right here).

Last night’s session was all about Knockout.JS and jsRender. Knockout.JS is an MVVM (Model-View-ViewModel) framework for JavaScript which, in my humble opinion, is an essential tool in today’s JavaScript development world. jsRender, on the other hand, is an awesome templating engine for JavaScript which is both powerful and easy to use.
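To give a flavor of the MVVM idea, here is a tiny, hand-rolled sketch of the observable pattern that sits at the heart of Knockout. This is an illustration of the concept only, not Knockout’s actual implementation – in real code you would simply use ko.observable and let Knockout update the DOM for you:

```javascript
// Minimal observable: holds a value and notifies subscribers on change.
// This is the core mechanism behind Knockout-style two-way binding.
function observable(initialValue) {
  let value = initialValue;
  const subscribers = [];
  const accessor = function (newValue) {
    if (arguments.length === 0) return value; // read: viewModel.firstName()
    value = newValue;                         // write: viewModel.firstName('Jane')
    subscribers.forEach((cb) => cb(value));   // notify all "bindings"
  };
  accessor.subscribe = (cb) => subscribers.push(cb);
  return accessor;
}

// A view-model with a single observable property
const viewModel = { firstName: observable('John') };

// A "binding" reacting to changes (Knockout would update the DOM here)
let rendered = '';
viewModel.firstName.subscribe((v) => { rendered = 'Hello, ' + v; });

viewModel.firstName('Jane');
console.log(rendered); // → "Hello, Jane"
```

The read/write-through-one-function style is exactly why Knockout observables are invoked as functions rather than assigned to directly.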

I would like to thank everyone who joined me last night and took some time off their busy schedules. I hope you enjoyed the session and the hands-on labs, and I can already sense some of you putting Knockout to use in your next projects 🙂

If you like to take another look at my presentation about JavaScript head over to

See you next week when we discuss real time communication with SignalR and how it can help us build better client side applications!

Adventures with SharePoint 2010 ActivityManager, Newsfeed and other animals…

So this week I was tasked with creating a custom newsfeed and social wall (think Facebook) for a client using SharePoint 2010’s built-in activity feed. One aspect of the project was to take the OOTB activities from the newsfeed and save them in an external database, so custom-built solutions would be able to display them. While it may seem like a simple task at first, it turned out to be, what you may call, a learning experience.

In the following post I’ll walk you through all the steps and findings from this experience, so hopefully you’ll save precious time if you ever have to deal with the same things I did.

Chapter 1: The case of the missing activities.

Before getting to the development part, I started with the basics. After creating a handful of users, I created some activities for them (tagging, creating a new blog post etc.). Soon after, I realized no activities were shown on any of the users’ newsfeeds. The first thing that came to my mind was:

Thankfully, the solution was quick and easy.

By default, SharePoint doesn’t enable the newsfeed, and as such, the timer job responsible for collecting all the data for the newsfeed and recent activities is disabled.

To fix this unfortunate issue, simply follow these instructions:

1)  Head over to central admin, click on Manage service applications and then click on your User profile service application. Once there, click on Setup my sites under the My site settings section:

2) When the new page opens, scroll down and find the Newsfeed section. Make sure the Enable newsfeed on My sites checkbox is checked and click OK.

3) Click on Monitoring on the left side quick navigation menu, and then click on Review job definitions under the Timer Jobs section.

4) Find the User Profile Service Application – Activity Feed Job (first part of the job name might vary in your environment), click on it and then click on Enable.

5) Set the desired time frame for the job to run (usually set to run every hour) and click Run Now.

6) Once the job finishes its run (you can see it via the Job History link under the Timer Links section of the left menu), all the activities are shown on the users’ My Sites.

Chapter 2: Who are you ActivityManager, and what have you done with my activities??

So with the activities showing, and everything sunny and bright, I moved forward to create my custom timer job that would copy the activities from the newsfeed to my own database.

Things seemed pretty straightforward from this point on. All I had to do was create a timer job that would loop through all the users in the User Profile Service (UPS for short), initialize an ActivityManager object for each one of them and use the GetActivitiesByMe method to get all of the specified user’s activities.

Creating the timer job was a breeze. Just create a new class and inherit from the SPJobDefinition base class. Inside the Execute method I get access to the UserProfileManager using the following code:

var currentContext = SPServiceContext.GetContext(webApp.ServiceApplicationProxyGroup, SPSiteSubscriptionIdentifier.Default);
UserProfileManager userProfMan = new UserProfileManager(currentContext);

So far so good. Inside the foreach loop over the user profiles, it’s time to initialize the ActivityManager for each profile and get its activities. The ActivityManager object requires the following arguments:

Pay special attention to the last line: “userProfile: The UserProfile object representing the user who will be treated as the current user for this…”. We will get back to this line shortly.

So with this info in mind, I created my ActivityManager class instance as follows:

foreach (UserProfile profile in userProfMan)
{
    ActivityManager activityMan = new ActivityManager(profile, currentContext);
}

Now, to get all of the selected user’s activities, I make a call to the GetActivitiesByMe method of the ActivityManager as follows:

ActivityEventsCollection myActs =  activityMan.GetActivitiesByMe();

So, all smiling and happy, I deployed my solution, set a breakpoint right at the end of the foreach loop and ran the timer job from SharePoint’s central admin page.

When the first user returned zero activities I thought “OK, I might have forgotten to add activities for that one”; when the second returned zero I started to get suspicious; when the third, fourth and fifth returned zero activities – I knew something had gone wrong.

After a few hours of debugging, googling and using reflector, I found this key piece of evidence:

public ActivityEventsCollection GetActivitiesForMe(DateTime minEventTime, int maxEvents)
{
    return new ActivityEventsCollection(this, -1L, false, minEventTime, maxEvents);
}

This is the method that gets called once you try to get events for a user and can be found under Microsoft.Office.Server.UserProfile.dll.

When the method gets called, it calls the EnsureViewerInfo method, which contains the following code snippet:

this.GetViewerAndPublisherInfo(this.m_UserInfoRetrieved ? null : UserProfileGlobal.GetCurrentUserName(), null, out person, out person2);

The method checks whether the m_UserInfoRetrieved private member is true. If so, it passes null; otherwise it calls the GetCurrentUserName method.

I couldn’t find ANYWHERE in the code that sets this member, so from what I can understand it is always false and, as such, GetCurrentUserName is always called!

The GetCurrentUserName method uses the current HTTP context and Windows authentication to get the current user. It doesn’t care, nor want to care, about whatever UserProfile you passed to the ActivityManager constructor.

Trying to find confirmation of my theory, I found the following thread on the MSDN forums:
Sharepoint 2010 social network API Activity feeds
On this thread, a member named Daniel Larson wrote the following:

…You cannot impersonate the activity manager… not with elevated priveledge or other means! In fact, nowhere in the Social APIs do they support impersonation.

>> Do you know why ActivityManager needs user profile object to create a new object?

Just guessing, but when they wrote the code they probably thought they’d support impersonation, but then decided against it for security reasons.

This is the exact same conclusion I reached.

Seeing this, I realized I would have to find another way to get a user’s activity feed.

Chapter 3: The case of the missing ‘New blog post’ activity!

As I realized the GetActivitiesByMe method wasn’t going to cut it, I went ahead and used the GetActivitiesByUser method of the same ActivityManager object. This method takes a UserProfile object (or a string representing the user’s login name) as a single argument and returns an ActivityEventsCollection object. After the change, my code looks as follows:

foreach (UserProfile profile in userProfMan)
{
    ActivityManager activityMan = new ActivityManager(profile, currentContext);
    var activities = activityMan.GetActivitiesByUser(profile);
}

Once the solution was deployed and debugged, progress was made! Instead of getting zero activities for users, I now got some of them. A few more checks and I realized that some types of activities just refused to show up. The one that caught my eye the most was the “New blog post” activity. I’ll spare you the description of the head-banging against the wall/desk/bowl of fruit etc. and just show you how to fix this.

Under the Setup My Sites area of My Site Settings (if you need a reminder on where that is – check chapter 1 of this article) you’ll find a section called Security Trimming Options. This property lets you choose whether each link in a user’s activity feed is checked against the user trying to view it, to determine whether they have sufficient permissions to see the activity:

By default, Check all links for permissions is selected, and as such, if the user fetching the activities (the system account, in my timer job’s case) doesn’t have direct access permissions to the area where the activity happened (the user’s blog in this example) – no activity from that area will be returned. By changing the property to Show all links regardless of permissions, all of the user’s activities are now returned in my custom timer job and I can happily move forward to get the correct template for each activity and save it to my database!

Chapter 4: “I could template you all day if I wanted to!”

So now I have a collection of ActivityEvent objects and I wish to template them the same way SharePoint does before showing them on a user’s My Site or newsfeed. Sadly, an ActivityEvent object doesn’t come with a built-in property or method that will template the object for us. Instead, it has several properties (keywords) that are used by the templating engine to build the activity text:

  • Name
  • Link
  • Link2
  • Value
  • Publisher
  • Owner

An example of activity templates is as follows:

{Publisher} published a new blog post.<br/>{Link}

{Publisher} rated {Link} as {Value} of {Name}.

{Publisher} tagged {Link} with your interest.<br/>{Link2}

All of the OOTB templates are stored in the osrvcore resource file located at: C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\Resources.

What I need is the full activity text in order to store it in a database. Therefore, my plan of action is to load the templates resource file, get the right template for my ActivityEvent activity and use a super-simple regular expression to replace the keyword (i.e: Publisher, Link etc.) with its value from the ActivityEvent object.

To load the templates resource file I used Neo Assyrian’s post Retrieve Activities from the SharePoint 2010 Activity Feeds as reference:

ActivityType activityType = activityMan.ActivityTypes[activity.ActivityTypeId];
ActivityTemplate activityTemplate = activityType.ActivityTemplates[bool.FalseString];
var templateStr = SPUtility.GetLocalizedString("$Resources:" + activityTemplate.TitleFormatLocStringName, activityTemplate.TitleFormatLocStringResourceFile, (uint)CultureInfo.CurrentUICulture.LCID);

By getting the activity type (based on the ActivityTypeId of the current activity) we can get the ActivityTemplate object which, among other things, holds the title of the template. This title is actually the key in the resource (resx) file and its value is the template text.

Now that we have templateStr, which holds the template text, it’s time to use my little non-scalable helper method to build the complete HTML text for that template:

private string RenderEvent(string format, ActivityEvent actEvent)
{
    Regex regx = new Regex(@"\{([^}]*)\}", RegexOptions.IgnoreCase);
    MatchCollection matches = regx.Matches(format);

    foreach (Match match in matches)
    {
        switch (match.Value.ToLower())
        {
            case "{publisher}":
                format = format.Replace(match.Value, BuildUrl(actEvent.Publisher.Href, actEvent.Publisher.Name));
                break;
            case "{value}":
                format = format.Replace(match.Value, actEvent.Value);
                break;
            case "{link}":
                format = format.Replace(match.Value, BuildUrl(actEvent.Link.Href, actEvent.Link.Name));
                break;
            case "{link2}":
                format = format.Replace(match.Value, BuildUrl(actEvent.Link2.Href, actEvent.Link2.Name));
                break;
            case "{owner}":
                format = format.Replace(match.Value, BuildUrl(actEvent.Owner.Href, actEvent.Owner.Name));
                break;
            case "{name}":
                format = format.Replace(match.Value, actEvent.Name);
                break;
        }
    }
    return format;
}

The method accepts two arguments: a template format and an ActivityEvent object that represents the activity. It then uses a simple regular expression to extract all the template keywords (words between curly brackets) and a simple switch/case to replace each keyword with HTML content built by the BuildUrl method:

private string BuildUrl(string href, string name)
{
    return string.Format("<a href='{0}' title='{1}'>{1}</a>", href, name);
}

And that wraps the entire thing up. I now have the template fully rendered as HTML, just like SharePoint shows it on the user’s My Site and newsfeed!
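The same keyword-replacement idea, stripped of the SharePoint-specific types, can be sketched (and tested) in a few lines of plain JavaScript – the sample template and values below are illustrative only:

```javascript
// Replace {Keyword} tokens in a template with values from a lookup table,
// mirroring the regex + switch/case approach of RenderEvent above.
function renderEvent(format, values) {
  return format.replace(/\{([^}]*)\}/g, function (token, keyword) {
    const value = values[keyword.toLowerCase()];
    return value !== undefined ? value : token; // leave unknown keywords untouched
  });
}

function buildUrl(href, name) {
  return "<a href='" + href + "' title='" + name + "'>" + name + "</a>";
}

const html = renderEvent('{Publisher} published a new blog post.<br/>{Link}', {
  publisher: buildUrl('/my/person.aspx', 'Dana'),
  link: buildUrl('/blog/post1.aspx', 'My first post'),
});
console.log(html);
```

Using a lookup table instead of a switch/case also sidesteps the “non-scalable” issue: adding a new keyword means adding one entry, not another case branch.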


So after all of this, I finally got my timer job working. Every hour it takes all the new activities and saves them in an external database that I use with my custom web parts to show these activities. This just shows that even though the social API is not the best or the neatest one out there, with a little bit of work we can bend it to do as we wish.

If you want to read a bit more about the newsfeed, check out my earlier post on the subject Working with SharePoint 2010 User Activity NewsFeed.

Materials from my JavaScript session

Yesterday I gave the second session of E4D Solutions’ new client side development course over at E4D’s offices (you can find the presentation of the first session – What can html5 actually do? – right here).

Last night’s session was all about JavaScript, starting from basics such as types and functions and all the way up to advanced topics such as design patterns and dependency injection using require.js.

I would like to thank everyone who joined me last night, and even though it wasn’t an easy session, I hope you enjoyed it and left the classroom armed with new and exciting knowledge about JavaScript.

If you’d like to take another look at my presentation about JavaScript, head over to

See you next week when we discuss Web.API and how it can help us build better client side applications!

Materials from my HTML5 session

Yesterday I had the pleasure of kick-starting the new and improved client side development course over at E4D Solutions with my session about HTML5. The course was built from the ground up to cover all the important aspects of client side development nowadays, including topics such as HTML5 fundamentals, JavaScript code structuring, developing SPA applications using KnockoutJS/Backbone.js and AngularJS, and even a touch of ASP.NET Web API and real time communication using SignalR.

I would like to thank everyone who participated in my session last night. You were awesome, and I truly hope you are already exploring what we discussed during the session in preparation for our next meetup on JavaScript next Wednesday 🙂

If you’d like to take another look at my presentation about HTML5, fire up your favorite browser (IE9+ if you’re using IE) and head over to

My book is published!

You might have noticed that I haven’t updated my blog in a while. There are two (happy) reasons for that:

1) My son was born about two weeks ago and as you might have guessed, things are hectic (but happy) 🙂

2) The book I’ve been working on for the last 9 months for Packt Publishing is finally released!

Since I’m sure you are not here to hear about my newborn son, let’s talk about the book 🙂

A few months ago I was approached by Packt Publishing to see whether a preparation guide for the Silverlight 4 MCTS exam could be written. Work started a short while after that.

Nine months later, I’m proud to announce that the book is published and can be purchased as a physical book, Kindle edition or PDF/EPUB from Amazon or straight from Packt Publishing’s website.

And what is this book all about you might ask?

The book is a hands-on certification guide with practical examples and Q&As to help .NET developers prepare for and pass the (70-506): TS: Microsoft Silverlight 4 Development exam.

The book deals with all of the subjects the MCTS exam requires and uses a handful of tutorials to help the reader learn, and put to practical use, the topics discussed.

The book’s table of contents is as follows:

Chapter 1: Overview of Silverlight
Chapter 2: Laying out Our User Interface
Chapter 3: Enhancing the User Interface
Chapter 4: Implementing Application Logic
Chapter 5: Working with Data
Chapter 6: Interacting with the Host Platform
Chapter 7: Structuring Applications
Chapter 8: Deploying Applications

I would like to use this stage to thank everyone at Packt Publishing for publishing this book, and a special thanks to my editors Vishal Bodwani and Kerry George. Without you two this book would never have become a reality!

Integrating a custom processor in FAST pipeline – part 1 of 2

FAST for SharePoint 2010 allows us to integrate our own custom processor into the search engine’s processing pipeline. While attempting to modify the default processing pipeline may sound like a hazardous task, it’s actually not that bad. In part 1 of this series we will set the stage for our custom processor by creating a BCS connector, setting the managed properties and adding a refiner to the refinement panel.

For the purpose of this article, we’ll use BCS as a connector for a database with the following columns:

There is nothing special about the ProductID, ProductName or ProductDescription columns. These are just your regular, run-of-the-mill int and nvarchar columns. ProductTags, on the other hand, is a bit different.

ProductTags is going to represent a set of tags, separated by the | (pipe) character. Once crawled, this property will serve as a refinement property. The problem with this column is that we need to split the tags at the pipe character so each tag is counted individually. If our search results return 3 products with the tag Home Styling, the user expects the refinement panel to show Home Styling (3).

So how do we get this result? How do we separate the tags? The short answer: by integrating a custom pipeline processor that will receive the tags as input and replace the pipe separator character with a character FAST recognizes as a separator – the character u2029.
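To make the goal concrete, the transformation the custom processor will have to perform can be sketched in a couple of lines – shown here in JavaScript purely for illustration; the actual processor we build in part 2 runs inside the FAST pipeline:

```javascript
// FAST treats U+2029 (the paragraph separator) as a multi-value delimiter,
// so replacing each pipe with it turns every tag into an individual refiner value.
function toFastMultiValue(productTags) {
  return productTags.split('|').join('\u2029');
}

console.log(toFastMultiValue('Home Styling|Gadgets|Sale'));
```

After this substitution, a result tagged Home Styling is counted separately from one tagged Gadgets, which is exactly what the refinement panel needs.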

Before we go on with creating the custom processor, make sure you have some products in your database. Pay special attention to the ProductTags column and make sure you add data to it in the following format: tag|tag|tag etc.

To get the data from the SQL database into SharePoint we are going to use a BCS connector built in SharePoint Designer 2010. Using SharePoint Designer 2010 is a fast and simple way to create BCS connections, and it’s great to use when the data you wish to represent is simple, as in our case.

Creating the BCS connection

To keep this post focused on the subject of item processing, follow the great walkthrough by Ingo Karstein – Create a simple BCS connection with SharePoint Designer 2010

Once you have the BCS created we can go ahead and create our crawling content source.

Creating a crawl content source

Fire up SharePoint 2010 central admin and choose Manage service applications. Choose your FAST content SSA and click Content Sources on the left side menu. We are now in the Manage Content Sources page.

Click on the New Content Source button on the upper horizontal bar to add a new content source to crawl. The Add Content Source page shows up. Name the new content source to your liking. At the Content Source Type area, make sure you select Line of Business Data and select the name of the BCS connector you created earlier.

In my system the form looks as follows:

Once all the fields are filled in, scroll to the bottom of the page, check the Start full crawl of this content source box and click the OK button. Wait for the crawler to finish its thing and check the crawl log by clicking on Crawl Log on the left menu. If your content source shows successes or warnings, you are clear to move on to the next step.

Mapping managed properties

If we try to perform a search right now, we will see that the title of the result items is wrong. For example, if we search for the term Xbox, which I have in my database as a product name, we will get the following result:

Not the most informative title, is it? In order to show the right title, we have to set the managed property Title to include data from the crawled property that represents ProductName in our database. This process is called mapping: we map a managed property to include data from a crawled property.

Fire up the ol’ central admin and click on Manage service applications. Click on your FAST Query SSA application and then click on FAST Search Administration on the top part of the left menu. Under Property Management, click on Managed properties:

In the Managed properties page, search for the managed property Title. Once found, click on it and choose Edit Managed Property. In the Mappings to Crawled Properties section of the page, click on the Add Mapping button. Change the category to Business Data, find the ProductName crawled property, mark it and click Add. The name of the crawled property might differ a bit in your setup, with an additional prefix such as read listelement or something similar.

Once added, click the OK button. Again under the Mappings to Crawled Properties section, highlight the newly added property and, using the Move Up button, move it all the way to the top. That will make sure FAST uses the data from this property.

Finally, click on the OK button and perform a full crawl again on our content source. If we perform the same search for Xbox now, the result will be as follows:

Now that we got this issue out of the way, let’s add the ProductTags property to the refinement panel.

Adding a refinement property

Before we can add our tags property to the refinement panel, we first have to create, well, a property for it. Once again, fire up the central admin, click on Manage service applications, then click on your FAST Query SSA application and finally click on FAST Search Administration at the top part of the left menu.

Under Property Management, click on Managed properties and then click on Add Managed Property. Name the new property as you wish. In my example I called it ProductTags.

Next, under the Mappings to Crawled Properties section, make sure the first radio button is selected and click on Add Mapping. Once again, change the category to Business Data and look for an entity with the name ProductTags. Add it and click OK.

Under the Refiner Property section check both Refiner property and Deep Refiner.

Under the Query Property section check the Query property check box and click OK.

For FAST to recognize our new property we must perform a crawl again, so go ahead and perform another full crawl on our content source. Once done, get back to the search results page and, under Site Settings, click on Edit Page.

Click on Edit Web Part for the Refinement Panel web part and expand the Refinement section. Copy the XML from the Filter Category Definition property into your favorite text editor and add the following snippet above the closing FilterCategories tag (the last line):

<Category Title="Product Tags" Description="Tags for product" Type="Microsoft.Office.Server.Search.WebControls.ManagedPropertyFilterGenerator" MetadataThreshold="1" NumberOfFiltersToDisplay="4" MaxNumberOfFilters="20" ShowMoreLink="True" MappedProperty="ProductTags" MoreLinkText="show more" LessLinkText="show fewer" ShowCounts="Count" />

Copy the XML back to the Filter Category Definition property and make sure you uncheck the Use Default Configuration check box!

Once done click OK and Save & Close to finish editing the page. The result of the change is as follows:

Well, we are halfway through… We have the tags showing in the refinement panel, but they aren’t separated – all the tags are shown on the same line. As stated above, the reason for this is that FAST doesn’t recognize the pipe character as a separator and as such treats the text as one big line.

To solve this issue we will need to build a custom processor that replaces the pipe character with FAST’s separator character. We will build this processor in part 2 of the series, so take a short rest (you deserve it after all these settings) and check back soon for part 2 🙂

SharePoint Repository for NuGet is alive!

I’m happy to announce that my new open source project, SharePoint Repository for NuGet, is now live on Codeplex!

SPRNuGet is a powerful integration between NuGet, the renowned .NET package management system, and Microsoft SharePoint 2010.

SPRNuGet combines SharePoint 2010’s powerful features such as security, publishing and UI to help you organize your code packages using built-in features of SharePoint such as folders, item-level forms, ratings and enterprise keywords.

SPRNuGet is a site-collection-level feature; once installed, a new folder will be created to hold the NuGet packages. When a package is uploaded and successfully verified as a NuGet package, you’ll be able to change its metadata using the following form:

The form will allow you to set the following metadata fields for a NuGet package:

  • Title
  • Description
  • Summary
  • Language
  • Copyrights
  • Version
  • Icon URL
  • Both the Latest Version and Absolute Latest version fields
  • The project URL
  • The project’s License URL
  • Release notes
  • The license acceptance field
  • Tags (from a SPRNuGet term store created during the installation process)
  • Rating

When completed, the package is added to the SharePoint folder as follows:

You can edit the package metadata at any given time, add tags or change its rating.

When connected to Visual Studio 2010, SPRNuGet will show packages the exact same way you are used to from NuGet:

So what are you waiting for? Go have a look at SPRNuGet right now at and don’t forget to follow SPRNuGet on Twitter for the latest news and announcements!