SharePoint Saturday New York – Wrap-up

SharePoint Saturday New York was a lot of fun, with a great set of speakers, volunteers, and attendees.

A big thank you to everyone who sat through my presentation on Developing Reusable Workflow Features. The slide deck is available here: http://www.slideshare.net/nextconnect/developing-reusable-workflow-features

And the Visual Studio project is available here:  SPBlueprints.Activities

I hope that the information was helpful!

Bulk Updates of User Profile Properties

This past week, fellow SharePoint MVP Yaroslav Pentsarskyy posted an excellent PowerShell script for doing bulk updates of User Profile properties.  His Bulk Update SharePoint 2010 User Profile Properties script makes it extremely easy to populate any new fields that are not set to synchronize.

My team has been doing a lot of client work promoting the use of User Profiles within customizations or to drive business processes.  For a quick overview, check out my blog post User Profiles – Driving Business Process, or sit in on my Developing Reusable Workflow Features presentation at SharePoint Saturday NY on July 30th or SharePoint Saturday The Conference 2011, August 11-13th.

This is also another great example of the value that PowerShell can bring to building and maintaining a high-functioning SharePoint environment.
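To give a sense of what a script like that does under the covers, here is a minimal sketch of the SharePoint 2010 server object-model pattern for stamping a profile property in bulk. This is not Yaroslav's script, just the general approach; the site URL, account names, and the "Department" property below are placeholders.

```powershell
# Minimal sketch (not the referenced script): bulk-set a user profile
# property via the SharePoint 2010 server object model. Run on a farm
# server; the URL, accounts, and "Department" property are placeholders.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
[void][Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Server.UserProfiles")

$site    = Get-SPSite "http://portal.contoso.com"    # any site served by the User Profile Service Application
$context = Get-SPServiceContext $site
$upm     = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($context)

# Placeholder data: accounts and the value to write to each profile
$updates = @{ "CONTOSO\jsmith" = "Finance"; "CONTOSO\mjones" = "Operations" }

foreach ($account in $updates.Keys) {
    if ($upm.UserExists($account)) {
        $profile = $upm.GetUserProfile($account)
        $profile["Department"].Value = $updates[$account]   # a field not set to synchronize
        $profile.Commit()                                    # persist the change to the profile store
    }
}
$site.Dispose()
```

The real value of a script like Yaroslav's is wrapping this pattern with input from a CSV or other source so the whole population can be updated in one pass.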

A Portfolio Approach to Developing Workflows and Processes

One of the common pitfalls I see with process optimization projects is that they tend to focus on one specific process at a time.  This may be okay when you are just starting out or working with informal processes, but as the number of complex processes grows it is important to take a step back and look at things from an overall portfolio perspective.  In many cases processes overlap or are interrelated.  I most often see this in finance processes because they are so common in all organizations.  Something like a Check Request process should be a pretty standard, well-defined process, but it is often part of a number of other process flows.  It is easy to ignore the fact that the same steps and activities are followed elsewhere, but that leads to a lot of extra work for the process designers and administrators, as well as non-standard activities for your process workers to follow.

To overcome this, it is important to consider the following points when analyzing and designing the process:

  • Is there a natural collection of steps or activities?
  • Are these steps also done to support another process?
  • Is a different group or department responsible for those steps?

While performing the process analysis and design, some activities may form a natural grouping.  It could be a set of steps that are referred to under a particular label and that are likely to be assigned to or processed by a specific group of users.  For large, complex workflows, it may be a good idea to make that a sub-process that can be referred to as a single unit.  It is important to talk to the stakeholders that perform those tasks and understand whether those same tasks or processes are also performed to support another process.  If they are, then it is better to design a standard sub-process that is called from the other processes than to build the specific steps into each process.

The Check Request example I mentioned before is one that I have seen come up in more than one organization.  There is a set of common steps where requests have to be approved, logged, and then processed.  Standard compliance activities are another example of a common central process that may be leveraged by a number of other processes.  Sometimes these opportunities present themselves early, but other times you have to dig to identify these sub-processes.  In cases where there are multiple groups involved, the main process owner or stakeholder may not fully understand the details of how every step is executed, so it is important to interview the actual process participants to understand what they are doing and what other processes may use those steps.  Within the Check Request example, it is unlikely that a process owner in Operations understands all of the various corporate activities that may generate a check request; they only understand that it is part of their one process.  By talking to the process workers in Finance, the other perspective can be considered.

By taking a Portfolio Approach in this case, you can potentially make real improvements that extend the process design and automation benefits not just to the one process, but to multiple processes across the entire organization.  Those processes will also get easier to expand and manage as they can leverage common sub-processes and existing functionality.

SharePoint Saturday The Conference DC

I have been confirmed as a speaker at SharePoint Saturday The Conference in DC this year.  The event will take place at the Northern Virginia Community College in Annandale, VA, from Thursday, August 11th through Saturday, August 13th.  This marks the first multi-day SharePoint Saturday event, and it is expected that over 2,500 people will be in attendance, which definitely makes this one of the premier events of the year.  The best news is that advance registration costs a meager $39.

I will be presenting two sessions.

Developing Reusable Workflow Features

Session Level:  200

Session Type:  Developer

Key components of developing and maintaining a suite of workflow processes are consistency and reusable features. This session will provide an overview of how to approach reusable features that can be used in SharePoint workflows and then walk through some specific examples.

Getting the Most from User Profiles

Session Level: 300

Session Type: IT Pro / Architect

This session will help you maximize the power and value of the SharePoint User Profiles available in SharePoint Server. We will do a brief feature review and then cover strategies for defining and maintaining custom attributes, synchronizing content with other business systems, and leveraging the information to drive business processes both inside and outside of SharePoint.

If you are going to attend the event, stop by and say hello. 

Quota Management and Storage Reports in SharePoint 2010

A few years ago I wrote an article about how to enable and work with the Quota Management features in SharePoint 2007 (click here for article), which proved to be a popular post.  Quota Management is a pretty important topic when it comes to SharePoint Governance and overall maintenance of the platform.  While the Quota Management features were largely carried forward in SharePoint 2010, one big feature was left out when SharePoint 2010 first shipped: the “Storage space allocation” page, also known as the StoreMon.aspx page, that was available to Site Collection administrators from the Site Settings page.

New Storage Metrics

With the release of SharePoint 2010 SP1 (download here), the feature returns, but in a much different and vastly improved format.  The page has been renamed “Storage Metrics” and it is a gold mine of information: it lets administrators navigate through the content locations on the site and shows each item’s Total Size, % of Parent, % of Site Quota, and Last Modified Date.  This makes it easy for administrators to identify where content is concentrated, and it can also surface exceptionally large lists, libraries, folders, and documents.

There was one aspect of the 2007 version that I found helpful which is no longer supported: the ability to view the number of versions of a given document right from the report.  In many cases I’ve seen versioning turned on without any limits, and some popular documents might have thousands of versions.  The report used to provide a way to find those exceptions so that they could be cleaned up.

 

Performance Improvements

From what I understand, it was removed because it proved to be extremely resource intensive; the information was gathered in real time, so it could cause service stability issues in very large environments.  Its return brings a completely revamped gathering process that relies on timer jobs, titled Storage Metrics Processing, resulting in much faster page loads and no risk of crashing the server just by viewing the report.  These jobs pull data every 5 minutes, but like all timer jobs, the frequency can be adjusted to better meet your needs and environment.  For larger environments, it might be a good idea to reduce that frequency to avoid the extra overhead.
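If you want to check the current schedule or dial it back, something along these lines should work from the SharePoint 2010 Management Shell. The filter below assumes the job's display name matches the "Storage Metrics Processing" title mentioned above; verify the exact name in your own farm first.

```powershell
# Sketch: locate the storage metrics timer job(s) and stretch the schedule.
# Assumes the display name contains "Storage Metrics Processing"; confirm
# the exact name in Central Administration or with Get-SPTimerJob first.
$jobs = Get-SPTimerJob | Where-Object { $_.DisplayName -like "*Storage Metrics*" }

# Review the current (default every-5-minutes) schedule
$jobs | Select-Object DisplayName, Schedule

foreach ($job in $jobs) {
    # Example: run hourly instead of every 5 minutes to reduce overhead
    Set-SPTimerJob -Identity $job -Schedule "hourly between 0 and 59"
}
```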

Configuring Quotas

As with the 2007 version, this feature is only available if quotas are enabled.  In cases where quotas are not currently being used and proper limits are not being managed, the safest bet is to establish a quota limit high enough that no site will realistically reach it.  This enables the feature without the risk of triggering a warning or locking a site that exceeds the thresholds.  Locking the site is the only risk with quotas; there is no risk of data loss.
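As a rough illustration, a deliberately unreachable quota can be applied to a site collection from PowerShell along these lines. The URL and the 1 TB / 900 GB figures are placeholders, and a quota template defined in Central Administration works just as well.

```powershell
# Sketch: enable quota-based storage reporting by applying a limit that
# sites will not realistically reach. URL and sizes are placeholders;
# -MaxSize and -WarningSize are specified in bytes.
Set-SPSite -Identity "http://portal.contoso.com/sites/teamsite" `
           -MaxSize 1TB -WarningSize 900GB
```

Keep the warning level below the maximum so the feature lights up without generating warning emails for sites that are nowhere near the limit.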

Summary

Both Farm and Site Collection Administrators should review this functionality and add it to their content review and cleanup processes.
