Category: Administration

SharePoint Administration related posts

Integrating Open Source Software and Components

Over the past few years there has been tremendous progress in making many components and solutions available through open source outlets like CodePlex, SourceForge, or even individual blogs and sites.  Many of these are community driven projects supported by individuals, with some backed by commercial companies.  For some reason many companies avoid or strictly limit allowing these solutions to be installed or used.  When approached properly I think these solutions should be considered.  This article will cover the advantages as well as outline an approach to take when evaluating which solutions to install and how to test them.

Advantages

The first advantage of adopting an existing solution is that it should make solving a business problem much quicker.  In some cases it might be a complete solution, like commercial software components, but at the very least it should act as an “accelerator,” offering you a springboard to the final solution.  One advantage it has over commercially packaged systems is that you have the source code in case changes, additions, or fixes need to be made.

Another advantage is that it can expose the development team to different coding techniques so that they can better address similar problems in the future.  Microsoft has been providing sample databases and applications for years for this very purpose.  People tend to learn more by seeing examples in action versus class diagrams.

Review Project Status and Activities

Not all solutions are of the same quality and grade.  In some cases there are just a few quick samples, while in others there is a more formal project with full QA that has gone through multiple release cycles.  It is therefore important to review things like the project status and release number.  It is also important to review the Issues log to see how much activity there is and how quickly issues have been addressed.  You can typically get a good idea about how widely the solution is used.  In some cases you may be familiar with the developers, which offers the advantage of discussing questions and issues directly with them.

Approach to Testing

To increase your chance of success and mitigate risk, it is important to always fully test the components against both functional and performance requirements.  With commercial software the expectations are normally pretty high, but I would approach this as if a member of the project team produced it, even if no changes were made.  This includes specialized testing like the “DisposeCheck” tests typically applied to custom code that interacts with SharePoint’s API.
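
As an illustration, the most common issue this type of review catches is SPSite and SPWeb objects that are created but never disposed.  Below is a minimal sketch of the pattern to look for; the site URL is just a placeholder.

    using System;
    using Microsoft.SharePoint;

    class DisposalExample
    {
        static void Main()
        {
            // SPSite and SPWeb objects you create yourself implement IDisposable
            // and should be wrapped in using blocks so their resources are released.
            using (SPSite site = new SPSite("http://sharepoint/sites/team"))
            using (SPWeb web = site.OpenWeb())
            {
                Console.WriteLine(web.Title);
            }

            // Objects handed to you by the framework (e.g. SPContext.Current.Web)
            // should not be disposed.
        }
    }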

Final Notes

When expectations are properly set, and the solution is fully reviewed and tested, including these open source solutions can contribute to the team’s ability to effectively deliver solutions quickly.

MySite Provisioning Methods

A number of times over the past few years I have stumbled into discussions (in person or online) about how to automate the creation of MySites for all users in the organization.  Creating the sites programmatically is actually pretty simple, but the real question is “Why do you want to do that?”  There are advantages and disadvantages to automating the process, but for me it almost always comes down to two big things: governance and business reason.
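
For context on just how simple the mechanics are, here is a minimal sketch using the MOSS-era user profile API; the account name is purely illustrative and error handling is omitted.

    using Microsoft.Office.Server;
    using Microsoft.Office.Server.UserProfiles;
    using Microsoft.SharePoint;

    class MySiteProvisioner
    {
        static void Main()
        {
            // Connect to the default Shared Services user profile store.
            ServerContext context = ServerContext.Default;
            UserProfileManager profileManager = new UserProfileManager(context);

            // Look up the user and create the personal site if it does not exist yet.
            UserProfile profile = profileManager.GetUserProfile(@"DOMAIN\jsmith");
            using (SPSite personalSite = profile.PersonalSite)
            {
                if (personalSite == null)
                {
                    profile.CreatePersonalSite();
                }
            }
        }
    }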

MySite Governance

MySites present an interesting challenge with regards to governance.  While most governance topics are outside the scope of this article, a few relate directly to the number of MySites within an organization.

Storage Considerations – Even with quotas in place it is easy to see rapid growth in storage requirements; for example, 5,000 users with even a modest 500 MB quota represent up to roughly 2.5 TB of potential content.  In larger environments with thousands of users, serious planning needs to take place to build out a SQL environment that can support the site collections.  Planning should also be done to manage the number of sites per content database to ensure long term maintainability.

When you provision all of the sites at once, all of the planning has to be done up front.  If you instead provision the sites gradually, you spend a little time planning out the long term assumptions and then tweak the strategy over time as the sites and their usage evolve.  It is much easier to make corrections with the slower approach.

User Support and Training – A MySite is very different from an email account, which is something nearly all computer users are familiar with at this point.  The average SharePoint user has never received any formal training and has little understanding of the capabilities of a site collection.  Without proper training it is unlikely that users will be able to take advantage of any of the real benefits of the MySite, leaving them to just use it as a replacement for a personal network share (see Storage Considerations above).

In my experience site owners or administrators for traditional collaboration or department sites are much more likely to have success and less likely to need extra support.  That narrower group of people is a much better starting point, and they are also sophisticated enough to initiate the automatic provisioning process themselves.

Business Reason

Each organization should develop a user story for what the purpose of a MySite is within their organization.  Like any site collection, it can be used for many different purposes: landing page, dashboard, personal site, etc.  The user story may help establish how the MySite will be used, who is expected to use it, and ultimately whether customization is needed to provide the functionality and content.  The answers to those questions should help guide the decision about how to provision the sites.

Closing

While I tend to prefer the go-slow-and-make-adjustments path, there are valid reasons for needing to auto-provision sites for large groups of users.  Hopefully the guidance here will help the team plan properly so that the implementation can be successful.

Keys to Establishing a Disaster Recovery Plan

Today I put the finishing touches on a SharePoint Disaster Recovery presentation I’ll be delivering at the Triangle SharePoint User Group meeting this week.  Part of the presentation goes into the regular technical aspects of backing up and restoring SharePoint, but the second half goes into how I approach a Disaster Recovery plan. 

Prioritize Your Content

Not all sites, nor all types of content, are the same.  In the environments where I have seen SharePoint be very successful, it was successful because it was used to manage or store business critical content.  When looking at any medium to large implementation (100 GB+ of content), it really pays to spend some time working with the business to prioritize the content so that a restore sequence can be established.  Getting the most critical content back online first will likely buy you some time and ease the pressure while you work on the rest of the content.

When you practice your recovery, be sure to practice it based on your recovery sequence to validate that it can be executed in that order and that you are not overlooking any dependencies.

Track Customizations and Solutions

I’ve mentioned a few times in the past how important it is to keep a running list of all the customizations in place along with the actual solutions or install files.  If you have to rebuild the environment this is crucial to getting the system back in working order.  Review the list regularly and be sure to save it to separate media since keeping it on the SharePoint server may not help you much during a catastrophic failure.
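
One way to help keep that list current, sketched below with an illustrative output path, is to periodically dump the farm’s solution store and save it alongside the actual .wsp files and install notes.

    using System.IO;
    using Microsoft.SharePoint.Administration;

    class SolutionInventory
    {
        static void Main()
        {
            // Write a simple inventory of every solution in the farm's solution store.
            using (StreamWriter writer = new StreamWriter(@"C:\DR\solution-inventory.txt"))
            {
                foreach (SPSolution solution in SPFarm.Local.Solutions)
                {
                    writer.WriteLine("{0}  Deployed={1}  LastOperation={2}",
                        solution.Name, solution.Deployed, solution.LastOperationEndTime);
                }
            }
        }
    }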

Identify Multiple Recovery Paths

There are different types of failures, and the same path isn’t appropriate for each.  In addition, sometimes unknowns come up that either complicate or block the desired recovery path.  It is always a good idea to have a secondary recovery path.  When it comes to Disaster Recovery, redundancy is a very good thing.

Know Your Restore Rate

It is important to know how long it will take you to restore the systems.  Base the rate on actual tests, not on the stated rate of a given network or tape technology.  If the company has a DR agreement with a facility like SunGard, it would be a good idea to include the expected rates for both your home facility and the disaster facility.  Once the restore rate is established it can be used for ongoing planning as your content databases grow.

It also pays to establish a worst case rate.  If the entire data center has to be restored or rebuilt, every tape drive is in overdrive, and the network is saturated, you are not going to see the same performance that you will when recovering just a single system.
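
To make the math concrete, here is a tiny sketch with purely hypothetical numbers; substitute the rates you actually measure during practice restores.

    using System;

    class RestoreEstimate
    {
        static void Main()
        {
            // Hypothetical figures for illustration only.
            double contentSizeGb = 450;          // total content to recover
            double normalRateGbPerHour = 60;     // measured rate, single-system restore
            double worstCaseRateGbPerHour = 20;  // measured rate when everything restores at once

            Console.WriteLine("Normal recovery:     {0:F1} hours", contentSizeGb / normalRateGbPerHour);
            Console.WriteLine("Worst case recovery: {0:F1} hours", contentSizeGb / worstCaseRateGbPerHour);
        }
    }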

Ensure the Plan Meets the Business Objectives

All of this planning is to ensure that you can recover the system in a way that can get the business back up and running.  Be sure to get validation on the recovery plan and do not make assumptions.  If the business really does require quicker recovery times there may be justification for additional tools to help speed up the recovery process.  The cost of those tools is typically negligible when compared to the cost of business interruption.

Service Applications in SharePoint 2010

I’ve spent a fair amount of time recently getting up to speed with SharePoint 2010.  One of the features that surprised me the most is the new Service Applications framework. 

TechNet defines service applications as “a resource that can be shared across sites within a farm or, in some cases, across multiple farms.”  Previously, resources were shared between sites using the Shared Service Provider (SSP) model.  The SSP model worked fairly well for simple deployments, but little information was available for complicated distributed deployments, which led to a lot of frustration.  In addition, one of the big complaints was that the SSP was only available in MOSS versions and was not available for WSS.  The Service Applications framework has been built directly into Foundation services, which makes it available to all flavors of SharePoint.

A Modular Approach

SharePoint Server 2010 ships with services for all of the main MS Office applications, Search, User Profiles, Managed Metadata, PerformancePoint, and BCS.  Not only can you define which applications are configured, but you can also manage the associations so that the applications are exposed only to specific SharePoint web applications.  Since it is possible to deploy the same service multiple times, you can support different configurations and authentication models for more complex deployments.  For example, you can use FAST Search for some applications and regular Search for others in the same farm.  The number of deployment possibilities is nearly unlimited.

SharePoint 2010 Service Applications

Since the framework is open, it provides real opportunities for ISVs to build vertical applications in just about every area.  Some of the ECM and BPM vendors have been heavily integrated with SharePoint for years, so I think this can only make their offerings that much more powerful.

Another advantage of the modular approach is that it is possible for both Microsoft and the ISVs to provide new services or service versions outside of the typical long release cycle.  I’m hopeful that this will enable some of the currently evolving technologies and features to catch up.

Planning Considerations

In some recent articles I outlined some considerations for Site Topology Planning [Article 1, Article 2].  In addition to segmenting the applications and sites therein, you will also want to start to draw out the service associations.  For example, you can determine how many search or user profile service instances you need.

All of this segmentation does have a potentially negative side effect.  Each of these services has its own database and potentially its own IIS application pool, which best practice dictates should have its own dedicated domain account.  It is easy to see how the number of accounts, application pools, and databases can get out of hand.  This was a common complaint I heard in many smaller environments with MOSS, and it is even more pronounced with 2010.  In my opinion the new model has many more advantages than disadvantages, and I will not let a few extra databases and accounts influence my application and service association decisions.

Summary

The Service Application model in SharePoint 2010 offers up a lot of flexibility and expandability allowing implementers to tailor the deployment to the specific business needs.  I encourage everyone to evaluate the new model thoroughly before making any upgrade or migration plans in order to maximize the value of the new services.

Troubleshooting Page Loading Issues

This article will cover some techniques that I typically use for troubleshooting performance issues when loading a particular page in SharePoint.  Troubleshooting the overall platform and member servers are outside the scope of this article.

On occasion I have run into situations where a specific page or collection of pages take longer than expected to load.  I say longer than expected because everything is relative to your content, user activity and your underlying server architecture.  If most pages load in under 10 seconds, but one in particular takes 30-40 seconds or consistently times out then there is clearly something that needs to be reviewed.

Review the SharePoint Page

The first place I start is by reviewing the actual SharePoint page(s) having the trouble.  It is possible, and somewhat common, to have content on a given page that is not displayed. 

Look for Hidden Web Parts

It is possible to have hidden web parts on the page, and while there are some valid reasons for this, in most cases it is done by mistake.  Some users do not understand the difference between the Close and Delete options in the web part menu.  Close essentially hides the web part; Delete actually removes it from the page.  Since in both cases it is no longer displayed, most users do not think anything more about it after it is gone.  That content is still being loaded even though it is not being displayed.  I have seen a few cases where multiple instances of a large list were being loaded in this manner, which slowed the page load down by 20+ seconds.
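
If you have server access, one way to spot these in bulk is to enumerate the web parts on a page and flag anything closed or hidden.  A rough sketch follows; the site and page URLs are placeholders.

    using System;
    using System.Web.UI.WebControls.WebParts;
    using Microsoft.SharePoint;
    using Microsoft.SharePoint.WebPartPages;

    class ClosedWebPartCheck
    {
        static void Main()
        {
            using (SPSite site = new SPSite("http://sharepoint/sites/team"))
            using (SPWeb web = site.OpenWeb())
            {
                // Shared scope shows the web parts every visitor receives.
                SPLimitedWebPartManager manager =
                    web.GetLimitedWebPartManager("default.aspx", PersonalizationScope.Shared);

                foreach (System.Web.UI.WebControls.WebParts.WebPart part in manager.WebParts)
                {
                    if (part.IsClosed || part.Hidden)
                    {
                        Console.WriteLine("{0} (Closed={1}, Hidden={2})",
                            part.Title, part.IsClosed, part.Hidden);
                    }
                }
            }
        }
    }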

To check for this through the browser, add “?contents=1” to the end of the URL (for example, http://sharepoint/sites/team/default.aspx?contents=1).  This will redirect you to the Web Part Maintenance page displayed below.

Web Part Maintenance Page

From there you can remove the web part by selecting the checkbox and clicking the Delete button.  If you want to restore it, return to the page, go into Edit Mode, and select the Add a Web Part button.  For sites with Publishing features enabled you will then need to go into the Advanced Web Part gallery options to see the regular web part menu.  Select the “Closed Web Parts” gallery listed at the top and add the web part back to the appropriate zone on the page.

Closed Web Part Gallery

Use of Audiences

Check to see if Audiences are being used.  While this can be a very useful feature in some instances, it can also be used inappropriately.  Just like the closed web parts mentioned above, this content is loaded for everyone but only displayed to people in the appropriate audience.  It is a good idea to avoid this on top level or heavily trafficked pages.

Trace the Requests

I’ll then use a tool like Fiddler, which is able to track all of the network requests from the client computer to the servers.  It is important to note that when running this you will want to close any other applications that might be making network requests.  Things like your email client or even add-ons like the Google Toolbar can be pretty chatty and make the results a little harder to read.

Fiddler will let you know exactly how long it takes to load the page, and break down each and every resource that is loaded in order to render it.

Review Content Locations

From the trace results, take note of where the requested resources are located.  If everything is being requested from the same server it should be pretty straightforward, but in some cases there may be images, styles, scripts, or other resources referenced on other servers. 

Review each of the sources and see if there are any connectivity issues with that particular server.   If any of those servers are outside of your network it may be even harder to have consistent page load times.  In some cases it might be possible to relocate some of those resources closer to the application.  Images, styles, and scripts that are frequently used, but that do not change very often are good candidates for that.  

Review What Is Cached

This is also a great tool for determining what is and what is not being cached.  During SharePoint design or branding projects it is a good idea to clear your browser cache and compare the first request versus a subsequent request where images and related resources are cached.  In some cases new or infrequent users may be having trouble getting resources to load that regular users already have cached.

Summary

Hopefully these tips will help you isolate any performance problems and bottlenecks in order to keep your page load times as quick as possible.

Site Topology Planning and Taxonomies

In the previous article, SharePoint Site Topology Planning, I discussed some of the technical implications of organizing the sites within one or more applications, site collections, and sub-sites.  That article started to get pretty long, so I decided to save the taxonomy part of the discussion for a separate article.

Organizing Sites

In the previous article I addressed different types of content and using that to help segment the sites across applications and site collections.  Within a given application it is possible to provide some meaningful segmentation by configuring managed paths. 

For example, you may segment collaboration sites with the following url structure:

  • http://collaborate.company.com – Collaboration Application
    • /Communities – Communities of Practice Managed Path
      • /Proposals – Proposal Community of Practice Site Collection
      • /Procurement – Procurement Community of Practice Site Collection
    • /Projects – Projects Managed Path
      • /Alpha – Alpha Project Site Collection
      • /Omega – Omega Project Site Collection
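
Managed paths like /Communities and /Projects are normally defined in Central Administration, but as a rough sketch they can also be added through the server object model, here using the web application URL from the example above.

    using System;
    using Microsoft.SharePoint.Administration;

    class ManagedPathSetup
    {
        static void Main()
        {
            // Add wildcard managed paths so multiple site collections can live
            // under the /Communities and /Projects segments.
            SPWebApplication webApp =
                SPWebApplication.Lookup(new Uri("http://collaborate.company.com"));

            webApp.Prefixes.Add("Communities", SPPrefixType.WildcardInclusion);
            webApp.Prefixes.Add("Projects", SPPrefixType.WildcardInclusion);
        }
    }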

Organizational Hierarchies

Most people understand hierarchies, and most businesses (at least in the west) have been organized in hierarchies for many years.  It is natural for people to think of their organization in this manner, but this may not be the best way to plan for the topology of your sites.

Traditional intranets tend to go from the largest organizational unit down to the smallest.  There may be multiple divisions, with multiple business units, with multiple departments, with multiple teams, with the people that actually do the work at the bottom.  Sites or portals that go five or more levels deep can become very difficult to manage and even harder to use.  Modern businesses need to remain agile, with teams constantly being redefined, combined, and split up.  In most situations it is a good idea to fight the hierarchy tendencies and strive for a flatter structure.

From a SharePoint perspective, a flatter structure with more site collections will make it easier to reorganize sites and structure than a single site collection nested five or more sub-site layers deep.  As previously discussed, Site Collections can be backed up and restored with a high level of fidelity (completeness) compared to a sub-site’s export and import options.  The key to usability and manageability is to find the right amount of segmentation and site collection structure.

Finding Sites and Content in a Flat World

An alternative to a rigid hierarchy is adopting flexible taxonomies with tagging.  Tagging provides a flexible and dynamic method of describing content and sites that can evolve over time.  A great example of this is a site like StackOverflow compared with the rigid structure of the MSDN/TechNet forums.  The flat structure decreases the chance of duplication and provides opportunities to view the data in new and unique ways.

SharePoint 2010 fully supports tagging without the need for custom or third party add-on components.  I fully expect this to be a popular feature within the new version.

Summary

Following the guidance in these two articles you should be able to properly plan your site topology.  Assumptions and business decisions do change, but if you establish the right level of granularity with applications and site collections you will be able to migrate and relocate things as needed.

SharePoint Site Topology Planning

This is the next in a series of articles addressing core SharePoint implementation topics.  I hope that this is valuable both to groups looking to implement SharePoint for the first time and to groups planning for an upgrade or migration to SharePoint 2010.

A Series of Containers

I tend to think about SharePoint from a container perspective.  Each of the containers has a set of settings and features that can be configured and administered for all of the containers within.  It is important to understand the boundaries of each level so that you can create a site topology that meets all of your objectives.  I’ll cover a sub-set of these I typically consider while planning the site topology.  The containers I’m going to address include:

Farm – In the context of this article, all of the Applications, Site Collections, and Sites hosted by one or more connected SharePoint servers.  I say “connected” because it is possible, and fairly likely, that most organizations will have more than one farm (Dev, Staging, and Production for things like Intranet, Extranet, Internet, etc.).  A farm has one or more content Applications plus additional applications for things like Central Administration.

Application – An application is the top level address which maps to an application in IIS, for example http://sharepoint.  An application has one or more Site Collections.  The application level is also where you have the opportunity to identify one or more content databases for storing the content on the SQL server.

Site Collection – A site collection has one or more Sites also referred to as Webs or Sub-Sites.

Sub-Sites or Webs – This is the smallest container which resides under and can be managed by the Site Collection.
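
To make the nesting concrete, here is a small sketch that walks each container level in the farm; it is only an illustration and would need to run on a farm server.

    using System;
    using Microsoft.SharePoint;
    using Microsoft.SharePoint.Administration;

    class ContainerWalk
    {
        static void Main()
        {
            // Farm -> Applications -> Site Collections -> Sub-Sites (Webs)
            foreach (SPWebApplication app in SPWebService.ContentService.WebApplications)
            {
                Console.WriteLine("Application: {0}", app.DisplayName);
                foreach (SPSite siteCollection in app.Sites)
                {
                    using (siteCollection)
                    {
                        Console.WriteLine("  Site Collection: {0}", siteCollection.Url);
                        foreach (SPWeb web in siteCollection.AllWebs)
                        {
                            using (web)
                            {
                                Console.WriteLine("    Web: {0}", web.Url);
                            }
                        }
                    }
                }
            }
        }
    }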

Types of Content

The first step is to identify what types of content or site(s) you expect to host.

  • Internet
  • Extranet
  • Intranet
  • MySites
  • Company or Divisional Portal(s)
  • Web Content Management (Publishing)
  • Electronic Content Management (Document Management)
  • Business Intelligence
  • Project Management
  • Team Collaboration
  • Application Hosting

Not all of those types of content apply to every organization, but it should be pretty clear that the content, the update frequency and the manner in which it is used and managed can vary quite a bit from type to type. 

Considerations of Multiple Applications

Very small companies or workgroups may be able to get by with all of the types of content hosted in a single Application, but for anything larger there should be plans to segment it to multiple Applications in the farm.

Here are some considerations when using multiple Applications:

Authentication Model – What type of authentication will be used?  The answer can dictate additional applications, and the options differ between versions.

Edit:  Anders Rask was kind enough to point out changes to the authentication model in 2010 that made the original statement, which only covered 2007, incorrect.

In 2007:  Options include Anonymous, Windows Integrated (NTLM or Kerberos), and Forms Based Authentication (FBA).  Since an Application can only support a single authentication mode, that can dictate additional applications.

In 2010:  There are two high level options: Classic, which supports Windows authentication only, and Claims Based, which supports one or more different providers.  Additional information can be found in TechNet’s article Plan Authentication Methods.

Sessions – When a user visits an application a session is created.  It is important to understand that the session is for a specific IIS application, so if a user visits the main company portal and then clicks through to their MySite they may be asked to authenticate again.  With Anonymous authentication this should not be an issue.  With Windows Integrated it should not be an issue if 1) users are using Internet Explorer, 2) they access the site(s) with the same account they use to log on to their computer, and 3) their browser is properly configured to pass the logged on user’s credentials.

Application Scoped Solutions – Solutions can be scoped to specific Applications which can provide some flexibility in deploying new features.  In a large environment it is important to only show features to the areas where they apply.

Considerations for Site Collections

The site collection has some important boundaries to consider when deciding to use one versus multiple site collections.  They include:

Amount of Data – It is important to keep your site collections at a maintainable size.  There is no hard limit, but as site collections grow beyond 40 GB they can become more difficult to maintain and will take longer to restore.  SQL Server tuning also becomes more critical with larger databases.  It is a good idea to segment your content across multiple site collections (and across multiple content databases) in a manner that makes sense.

SharePoint Groups – SharePoint Groups can be a good way to organize users, especially when they do not map to functional areas or groups otherwise managed in Active Directory.  These groups are defined and contained within a given Site Collection which means if you want to use them in multiple site collections they have to be duplicated and maintained separately which can be problematic.

Site Collection Administrator – A site collection administrator has great power and control within the site collection container.  A Site Collection administrator can choose themes, manage all security within the site collection, manage activated solutions and potentially deploy other customizations if policy permits.  Enabling content owners should be a top priority, and the Site Collection provides a good container for that.

Quota Management – Quotas are set at the Site Collection level so if granular quota management is required for billing or charge back purposes it may have some impact on how site collections are segmented. 

Navigation – The default SharePoint navigation provider does not span site collections (and therefore applications) which can make a standardized or unified navigation scheme difficult to maintain.  This tends to work fine for team collaboration sites, but can be cumbersome when you need to link many site collections.

Content Types – I think Content Types were one of the most important changes introduced with WSS 3.0 and MOSS.  An entire overview of content types is outside the scope of this article, but keep in mind that within the current release Content Types are created and maintained within a Site Collection.  If a Content Type applies to multiple site collections then it needs to be duplicated.  SharePoint 2010 will support farm level content types which will remove the need to duplicate them.

Profiles (WSS Only) – If you are using WSS it is important to understand that the profiles are stored at the Site Collection level.  If you add custom attributes they will need to be added to all site collections they apply to.

Implications on Backup and Recovery

There are a few different methods to backup and recover SharePoint content.  Within the context of this article it is important to understand the difference between the commands that stsadm provides. 

Site Collection Backup and Restore – Considering the boundaries previously discussed, there is a lot of extra content stored in the top level site of a site collection.  Doing an stsadm backup will provide a high fidelity snapshot of the content, configuration, workflows, and other customizations.  It is important to note that if you are moving the site collection between applications or farms that you will need to install any solutions or dependencies referenced in the current location.
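
If you need to script it, the same full-fidelity site collection backup is available through the object model.  A minimal sketch follows; the URL and file path are illustrative, and the stsadm command is the more common route.

    using System;
    using Microsoft.SharePoint.Administration;

    class SiteCollectionBackup
    {
        static void Main()
        {
            // Programmatic equivalent of an stsadm site collection backup.
            SPWebApplication webApp =
                SPWebApplication.Lookup(new Uri("http://collaborate.company.com"));
            webApp.Sites.Backup("http://collaborate.company.com/Projects/Alpha",
                @"C:\Backups\alpha.bak", true);
        }
    }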

Sub-Site or Web Export and Import – The Export process offered for Sub-Sites is great for archiving, but it does not provide the fidelity needed to move sites around.  It will not save workflow, features, solutions or alerts.  I’ve also had inconsistent results with DataViews on sites being migrated in this manner.

If you think you will need to migrate the content or want flexibility, it would be in your best interest to consider using more Site Collections rather than deeply nested Sub-Sites.

Upgrade and Migration Considerations

The purpose and content within a site collection or site can evolve over time.  Some sites that started with a very narrow purpose may change and now warrant their own Site Collection or Application.  When preparing for a migration or upgrade it is a good time to run through this exercise again to validate the assumptions and decisions that were previously made and to catch anything that was overlooked.  While it may make the move more difficult, the changes will pay dividends over the coming years, offering a system better tuned to the users’ needs and more maintainable by the site owners and farm administrators.
