I'm currently working on a website based on SharePoint 2007 (MOSS) which had an issue with some specific Publishing pages not having the right content in their fields after Content Deployment. The symptom was that a content editor modified a page, but after a Full Content Deployment to another farm the page did not show the changes and still had the old content (version 1.0 in the Version History).
After much searching I noticed in SharePoint Manager 2007 that some properties of the list item were duplicated; the only difference was that the duplicate names were XML-escaped. On one page I found both the property "Basis tekst" and "Basis_x0020_tekst". On a newly created page the escaped property "Basis_x0020_tekst" would not be added to the list item.
I'm not quite sure how this duplication was introduced, but it might have something to do with the pages being provisioned via Feature Stapling with "SimplePublishing" set to False on the web. Apparently Content Deployment overwrites the "Basis tekst" value in the destination farm with the "Basis_x0020_tekst" value, and the "Basis_x0020_tekst" property still held the content from the time of provisioning.
I whipped together a simple tool to check for and remove properties whose name, after running it through XmlConvert.DecodeName, was already present on the list item.
private void CheckListItem(SPListItem listitem, bool fix)
{
    // Copy the keys first; removing entries while enumerating the live Properties collection would throw.
    string[] propertykeys = new string[listitem.Properties.Count];
    listitem.Properties.Keys.CopyTo(propertykeys, 0);
    foreach (string propertykey in propertykeys)
    {
        string decodedname = XmlConvert.DecodeName(propertykey);
        // An escaped name (e.g. "Basis_x0020_tekst") whose decoded form ("Basis tekst") also exists marks the corruption.
        if (!decodedname.Equals(propertykey) && listitem.Properties.ContainsKey(decodedname))
        {
            string url = SPUrlUtility.CombineUrl(listitem.Web.ServerRelativeUrl, listitem.Url);
            Console.WriteLine(string.Format("Corrupt property found with name \"{0}\" on listitem \"{1}\"", propertykey, url));
            if (fix)
            {
                // Remove the escaped duplicate and persist without creating a new version.
                listitem.Properties.Remove(propertykey);
                listitem.SystemUpdate(false);
                Console.WriteLine(string.Format("Fixed by removing property \"{0}\"", propertykey));
            }
        }
    }
}
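A small driver like the one below (the url is just an example) can sweep an entire site collection; with fix set to false it only reports the corrupt properties:
// Illustrative driver: check every list item in every web of the site collection.
using (SPSite site = new SPSite("http://cms.website.local"))
{
    foreach (SPWeb web in site.AllWebs)
    {
        using (web)
        {
            foreach (SPList list in web.Lists)
            {
                foreach (SPListItem item in list.Items)
                {
                    CheckListItem(item, false); // pass true to actually remove the duplicates
                }
            }
        }
    }
}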
Of course this never would have occurred if the field names had not contained spaces.
Sander Schutten
Wednesday, April 20, 2011
Thursday, August 12, 2010
Custom Timer Job Server
I've just finished a solution which provides two ways of modifying the server on which a timer job will run. The solution implements Microsoft-unsupported hacks, but they seem to work at this moment.
This solution is realized as a last resort to force (Custom) Timer Jobs to run on a specific server.
The WebApplication Feature `Macaw.Moss2007.TimerJobServerSelectorSolution: TimerJobServerSelectorFeature` should be activated on the `Central Administration` web application. It registers a Control Adapter that enables changing the server of a timer job, copies "Microsoft.SharePoint.ApplicationPages.Administration.dll" from "_app_bin" to the "bin" folder, and modifies the trust level of the Central Administration web application.
I also provided an STSADM extension to be able to set the server of a timer job. An example is:
stsadm -o jobserver -url http://cms.website.local -job DependencySpiderJob -server MOSSWFE02
or
stsadm -o jobserver -url http://cms.website.local -job DependencySpiderJob -server
The solution can be downloaded from:
https://mrcl.svn.codeplex.com/svn/Moss2007/Macaw.Moss2007.TimerJobServerSelectorSolution
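To verify which server a job is currently bound to (for example before and after running the stsadm command above), a small read-only sketch like this suffices; actually changing the binding is what the unsupported hacks in the solution are for:
// Sketch: list the timer jobs of a web application and the server each one is bound to.
SPWebApplication webapp = SPWebApplication.Lookup(new Uri("http://cms.website.local"));
foreach (SPJobDefinition job in webapp.JobDefinitions)
{
    string server = job.Server != null ? job.Server.Name : "(any)";
    Console.WriteLine("{0} -> {1}", job.Name, server);
}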
Friday, July 9, 2010
Moss2007: Adding button to Publishing Edit Console via Feature
To add a button to the Publishing Edit Console it's advised to modify the CustomQuickAccess.xml file of your site collection. Most articles describe the procedure of overwriting the default file. When multiple solutions use that overwrite method, they discard each other's modifications. It is much better to parse the XML file and only add or remove your own modifications. The FeatureReceiver code posted here does just that: it uploads its own .xml file and adds a reference to that file to CustomQuickAccess.xml. Upon deactivation of the Feature, all modifications are reverted.
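For reference, the receiver only adds a single customfile element, so after activation CustomQuickAccess.xml ends up containing at least the following (registrations made by other solutions remain untouched):
<Console>
  <customfile FileName="DependenciesQuickAccessButton" />
</Console>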
class DependencyQuickAccessButtonFeatureReceiver : SPFeatureReceiver
{
    public override void FeatureActivated(SPFeatureReceiverProperties properties)
    {
        SPSite site = (SPSite)properties.Feature.Parent;

        // Upload the button definition from an embedded resource to the "Editing Menu" folder.
        SPFolder editingmenufolder = site.RootWeb.GetFolder("/_catalogs/masterpage/Editing Menu/");
        Stream dqabstream = this.GetType().Assembly.GetManifestResourceStream("Macaw.Wss3.DependencySpiderSolution.FeatureCode.DependenciesQuickAccessButton.xml");
        SPFile dqabfile = editingmenufolder.Files.Add("DependenciesQuickAccessButton.xml", dqabstream, true);
        dqabfile.CheckIn("Added by DependencySpiderSolution");
        try
        {
            dqabfile.Publish("Added by DependencySpiderSolution");
        }
        catch { }

        // Load the existing CustomQuickAccess.xml and add a reference to our file, leaving other entries intact.
        SPFile file = site.RootWeb.GetFile("/_catalogs/masterpage/Editing Menu/CustomQuickAccess.xml");
        Stream stream = file.OpenBinaryStream();
        XmlDocument document = new XmlDocument();
        document.Load(stream);
        stream.Close();
        XmlNode consolenode = document["Console"];
        if (consolenode == null)
        {
            consolenode = document.AppendChild(document.CreateElement("Console"));
        }
        XmlNode customfilenode = consolenode.SelectSingleNode("/Console/customfile[@FileName='DependenciesQuickAccessButton']");
        if (customfilenode == null)
        {
            customfilenode = consolenode.AppendChild(document.CreateElement("customfile"));
            XmlAttribute filenameattribute = customfilenode.Attributes.Append(document.CreateAttribute("FileName"));
            filenameattribute.Value = "DependenciesQuickAccessButton";
        }

        // Write the modified document back; the file is stored as ASCII (see the referenced blog post on file encoding).
        stream = new System.IO.MemoryStream();
        StreamWriter writer = new StreamWriter(stream, Encoding.ASCII);
        document.Save(writer);
        writer.Flush();
        file.CheckOut();
        file.SaveBinary(stream);
        file.CheckIn("Added DependencyQuickAccessButton");
        try
        {
            file.Approve("Added DependencyQuickAccessButton");
        }
        catch { }
    }

    public override void FeatureDeactivating(SPFeatureReceiverProperties properties)
    {
        SPSite site = (SPSite)properties.Feature.Parent;

        #region Restore "CustomQuickAccess.xml"
        // Remove only our own customfile reference; other entries stay untouched.
        SPFile file = site.RootWeb.GetFile("/_catalogs/masterpage/Editing Menu/CustomQuickAccess.xml");
        Stream stream = file.OpenBinaryStream();
        XmlDocument document = new XmlDocument();
        document.Load(stream);
        stream.Close();
        XmlNode customfilenode = document.SelectSingleNode("/Console/customfile[@FileName='DependenciesQuickAccessButton']");
        if (customfilenode != null)
        {
            customfilenode.ParentNode.RemoveChild(customfilenode);
        }
        stream = new System.IO.MemoryStream();
        StreamWriter writer = new StreamWriter(stream, Encoding.ASCII);
        document.Save(writer);
        writer.Flush();
        file.CheckOut();
        file.SaveBinary(stream);
        file.CheckIn("Removed DependencyQuickAccessButton");
        try
        {
            file.Approve("Removed DependencyQuickAccessButton");
        }
        catch { }
        #endregion

        #region Delete "DependenciesQuickAccessButton.xml"
        SPFile dqabfile = site.RootWeb.GetFile("/_catalogs/masterpage/Editing Menu/DependenciesQuickAccessButton.xml");
        if (dqabfile != null && dqabfile.Exists)
        {
            dqabfile.Delete();
        }
        #endregion
    }

    public override void FeatureInstalled(SPFeatureReceiverProperties properties)
    {
    }

    public override void FeatureUninstalling(SPFeatureReceiverProperties properties)
    {
    }
}
I used these sites as information sources:
http://community.zevenseas.com/Blogs/Robin/archive/2010/06/07/sharepoint-file-encoding-and-uploading-the-customquickaccess-xml.aspx
Thursday, May 27, 2010
XmlSitemap Solution for SharePoint 2007
I've just finished an initial version of an XmlSitemap generator solution for SharePoint 2007. XmlSitemaps help search engines like Google index your website better, and for many companies they are very important for SEO.
The following XmlSitemap solutions already existed for SharePoint 2007:
http://blog.mastykarz.nl/imtech-xml-sitemap-free-sharepoint-feature/
http://www.thesug.org/Blogs/lsuslinky/archive/2009/04/17/SharePoint_SiteMap_Generator__Version_2.aspx.aspx
http://www.kwizcom.com/ProductPage.asp?ProductID=737&ProductSubNodeID=738
I played mix & match to create my own solution.
I started out by generating an XmlDocument and uploading it as a file (sitemap.xml) to the root web of a site collection. Unfortunately I had to drop this easy solution when I realized that the urls in the XmlSitemap protocol have to be absolute urls. This poses a problem when:
* Content Deployment is enabled, because the generated file would be deployed to a different farm which most likely has a different DNS name.
* Alternate Access Mappings are used, because only one url would be present in the sitemap file. It would be technically possible to store all AAM urls in one file, since search engines should select only the urls which match the sitemap url, but this is not very neat as internal DNS entries would also be exposed.
I ended up with a solution consisting of a Job which runs periodically (once a day, at night). I also had to implement two HttpHandlers for serving the sitemap.xml files.
The Job creates an XmlSitemap index file and, if needed, multiple sitemap.xml files. The files are stored as a Persisted Object under the WebApplication as a set. For each Site Collection and AAM a set of sitemap files is generated and stored.
The HttpHandlers look up which set of sitemap files should be used and write the stored xml out directly.
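The serving side can therefore stay trivial. A minimal sketch (the class and the SitemapStore helper are placeholder names; the real solution resolves the stored set from the WebApplication's persisted object):
// Minimal sketch: return the pre-generated xml for the requested sitemap url as-is.
// SitemapStore.GetSitemapXml is a placeholder for the persisted-object lookup.
public class XmlSitemapHttpHandler : IHttpHandler
{
    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        string xml = SitemapStore.GetSitemapXml(context.Request.Url);
        context.Response.ContentType = "text/xml";
        context.Response.Write(xml);
    }
}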
This is an example of a generated XmlSitemap Index file:
<?xml version="1.0" encoding="utf-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://hbrmosdev01:41238/sitemap.0.xml</loc>
    <lastmod>2010-05-27</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://hbrmosdev01:41238/sitemap.1.xml</loc>
    <lastmod>2010-05-27</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://hbrmosdev01:41238/sitemap.2.xml</loc>
    <lastmod>2010-05-27</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://hbrmosdev01:41238/sitemap.3.xml</loc>
    <lastmod>2010-05-27</lastmod>
  </sitemap>
</sitemapindex>
As you might notice, multiple separate sitemap.xml files are referenced. The XmlSitemap protocol requires that the files do not get too large (max. 50,000 urls or 10 MB per file).
This is a snippet of sitemap.0.xml:
<?xml version="1.0" encoding="utf-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://hbrmosdev01:41238/Pages/default.aspx</loc>
    <lastmod>2010-04-21</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
  <url>
    <loc>http://hbrmosdev01:41238/de/Pages/default.aspx</loc>
    <lastmod>2010-03-10</lastmod>
    <priority>1.0</priority>
  </url>
  ...
</urlset>
The priority is set to "0.5" by default; the priority of welcome pages is set to "1.0". The change frequency is calculated from the list item versions: the average interval between modification dates is determined and an algorithm maps it to daily, weekly, monthly or yearly. I've decided not to implement always and never.
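The mapping itself is a simple threshold function, roughly like this (the thresholds shown are illustrative, not the exact ones used in the solution):
// Rough sketch: map the average number of days between item versions to a changefreq value.
private static string GetChangeFrequency(double averagedays)
{
    if (averagedays <= 1.0) return "daily";
    if (averagedays <= 7.0) return "weekly";
    if (averagedays <= 31.0) return "monthly";
    return "yearly";
}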
For my needs it was important that no processing is needed when the sitemap files are requested, as that would not scale very well for very large websites.
This solution should be compatible with multiple WFEs: only one machine generates the sitemap files and stores them in the configuration database.
Code is available here
The sitemap can be fed to search engines, but it can also be referenced from the robots.txt file.
Friday, May 21, 2010
Google Analytics Tracking in SharePoint 2010
It's quite easy to embed a Google Analytics tracking script into a SharePoint site by adding it to a MasterPage or a Page Layout. The main disadvantage of this approach is that content editors don't have any control whatsoever over the tracking.
I already had a Google Analytics solution for Wss3 lying around and upgraded it to Wss4. Instead of a simple upgrade, I buffed it up a little to experiment with the features of Mss2010.
I've added two buttons to the Publishing Page Action Group for managing the Google Analytics settings for a specific page and library.
The Ribbon buttons show a Dialog for managing the User Interface of the Google Analytics tracking.
The solution can be downloaded from here
Thursday, May 20, 2010
Adding buttons to Mss2010 Ribbon PageActions for Publishing sites
After trying for hours to get a button added to the PageActions group of the Ribbon, I managed to find out what I was doing wrong.
Two important things:
* You may need to clear your browser's cache to see your Ribbon button appear
* On publishing pages one should not use "Ribbon.WikiPageTab.PageActions" or "Ribbon.WikiPageTab.Actions" as Location. The valid Location is "Ribbon.WikiPageTab.PubPageActions".
This is an example:
<CustomAction Id="Macaw.Wss4.GoogleAnalyticsSolution.GoogleAnalyticsConfigurationPageSettings" Location="CommandUI.Ribbon">
  <CommandUIExtension>
    <CommandUIDefinitions>
      <CommandUIDefinition Location="Ribbon.WikiPageTab.PubPageActions.Controls._children">
        <Button
          Id="Ribbon.WikiPageTab.PubPageActions.GoogleAnalyticsConfigurationPageSettings"
          Sequence="100"
          Command="GoogleAnalyticsConfigurationPageSettings"
          LabelText="Google Analytics"
          Image16by16="/_layouts/1033/images/formatmap16x16.png" Image16by16Top="-120" Image16by16Left="-32"
          Image32by32="/_layouts/1033/images/formatmap32x32.png" Image32by32Top="-192" Image32by32Left="-224"
          ToolTipTitle="Google Analytics"
          ToolTipDescription="Google Analytics"
          TemplateAlias="o2" />
      </CommandUIDefinition>
    </CommandUIDefinitions>
    <CommandUIHandlers>
      <CommandUIHandler Command="GoogleAnalyticsConfigurationPageSettings" CommandAction="SP.UI.Notify.addNotification('Hello from the notification area');" EnabledScript="true" />
    </CommandUIHandlers>
  </CommandUIExtension>
</CustomAction>
Tuesday, April 27, 2010
SharePoint 2007 Search Enhancements
Until now I've always configured Search for SharePoint Publishing websites simply by enabling Search. I did notice the search results weren't of very good quality, but it sufficed.
The client of a recent project I'm working on is much more critical, so the search should provide good and clean results. To achieve this, I've implemented the SearchEnhancements Solution.
The first and hardest problem to resolve was that the indexer also indexed terms like `Site Actions` and `Publishing`. These terms belong to the SharePoint web user interface and you obviously don't want to show them to anonymous users as search results. The solution is to define a set of exclusion crawl rules in the SSP which indexes your Publishing website. The Solution contains a WebApplication Feature which creates the crawl rules in the associated SSP.
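For illustration only, creating such an exclusion rule through the MOSS search administration object model looks roughly like this; the site url and the excluded path are assumptions, and the Feature in the solution derives its own rules from the web application:
// Illustrative sketch: add an exclusion crawl rule to the SSP that crawls the publishing site.
// The excluded path is an example, not one of the exact rules the Feature creates.
using (SPSite site = new SPSite("http://cms.website.local"))
{
    SearchContext searchcontext = SearchContext.GetContext(site);
    Content sspcontent = new Content(searchcontext);
    CrawlRule rule = sspcontent.CrawlRules.Create(CrawlRuleType.ExclusionRule, "http://cms.website.local/_layouts/*");
}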
Another problem I noticed with search was that unwanted parts of the content were getting indexed, for instance the menus and some deep-link components. The problem with this is that these unwanted parts might be displayed as the summary text of a search result. Also, when searching for a specific term which is present in a deep-link component, all pages would be returned as search results. For this I've implemented a solution which I found on the internet: a custom control which cloaks the content for the MOSS indexer.
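In a nutshell that control simply skips rendering its children when the indexer requests the page. A stripped-down sketch (the class name and the UserAgent check are my assumptions; the control I found is more elaborate):
// Sketch: a panel that suppresses its content for the MOSS crawler so menus and
// deep-link components are not indexed. The "MS Search" UserAgent match is an assumption.
public class CrawlerCloakPanel : PlaceHolder
{
    protected override void Render(HtmlTextWriter writer)
    {
        HttpContext context = HttpContext.Current;
        string useragent = (context != null && context.Request.UserAgent != null) ? context.Request.UserAgent : string.Empty;
        bool iscrawler = useragent.IndexOf("MS Search", StringComparison.OrdinalIgnoreCase) >= 0;
        if (!iscrawler)
        {
            base.Render(writer); // regular visitors get the full markup
        }
    }
}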
The third, minor problem was that ASP.NET would not recognize the MOSS indexer as a crawler, because the standard UserAgent string of SharePoint does not include the term "Crawler". For this I've created a Farm Feature which modifies the UserAgent string in the registry. The WFEs probably have to be restarted for the registry modification to take effect.
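The registry change boils down to appending "Crawler" to the gatherer's UserAgent value, along these lines (the key path is the commonly documented one for MOSS 2007; treat it as an assumption and note the existing value before changing it):
// Sketch: append "Crawler" to the search gatherer's UserAgent so ASP.NET's browser
// detection treats the indexer as a crawler. Key path assumed from common MOSS 2007 documentation.
using (RegistryKey key = Registry.LocalMachine.OpenSubKey(
    @"SOFTWARE\Microsoft\Office Server\12.0\Search\Global\Gathering Manager", true))
{
    string useragent = (string)key.GetValue("UserAgent", string.Empty);
    if (useragent.IndexOf("Crawler", StringComparison.OrdinalIgnoreCase) < 0)
    {
        key.SetValue("UserAgent", useragent + " Crawler");
    }
}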
The solution is available here at CodePlex.