Sunday, April 19, 2015

Perform search requests in Sharepoint via javascript object model

In one of my previous posts I showed how to get Sharepoint user profile properties via the JavaScript object model: here. In this post I will show how to perform another common task via JavaScript: search requests. It is useful e.g. if you develop a search-driven solution for Sharepoint Online. Here is the JavaScript code:

   1: var newsItems = [];
   2:  
   3: var NewsItem = function (title, url, ingress, newsDate) {
   4:     this.Title = title;
   5:     this.Url = url;
   6:     this.Ingress = ingress;
   7:     this.NewsDate = newsDate;
   8: }
   9: SP.SOD.executeFunc('sp.search.js',
  10: 'Microsoft.SharePoint.Client.Search.Query.KeywordQuery', function () {
  11:     var Search = Microsoft.SharePoint.Client.Search.Query;
  12:     var ctx = SP.ClientContext.get_current();
  13:     var site = ctx.get_site();
  14:     ctx.load(site);
  15:  
  16:     var query = new Search.KeywordQuery(ctx);
  17:     query.set_queryText("..."); // search query
  18:     query.set_enableSorting(true);
  19:  
  20:     var sortproperties = query.get_sortList();
  21:     sortproperties.add("NewsDate", 1);
  22:     query.set_rowLimit(100);
  23:     query.get_selectProperties().add("NewsIngress");
  24:     query.get_selectProperties().add("Path");
  25:     query.get_selectProperties().add("NewsDate");
  26:     query.set_trimDuplicates(false);
  27:  
  28:     var executor = new Search.SearchExecutor(ctx);
  29:     var result = executor.executeQuery(query);
  30:  
  31:     ctx.executeQueryAsync(function () {
  32:  
  33:         var tableCollection = new Search.ResultTableCollection();
  34:         tableCollection.initPropertiesFromJson(result.get_value());
  35:         var rows = tableCollection.get_item(0).get_resultRows();
  36:         var enumItems = rows;
  37:         var currentRow = 0;
  38:         var rowCount = rows.length;
  39:  
  40:         while (currentRow < rowCount) {
  41:             var row = rows[currentRow];
  42:             newsItems.push(new NewsItem(row["Title"], row["Path"], row["NewsIngress"],
  43:                 row["NewsDate"]));
  44:             currentRow++;
  45:         }
  46:     },
  47:     function (sender, args) {
  48:         console.log(args.get_message());
  49:     });
  50: });

At first we prepare the query object (lines 16-26). Here we set the actual query text (line 17) and various properties, including the managed properties which should be retrieved from the search index. After that we execute the query against the search index asynchronously using the SearchExecutor object (lines 28-45). Search results are saved to an array of news items, which may then be used e.g. for binding to a UI component. Having this example you will be able to easily adapt it to your scenario.
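The mapping from raw result rows to news items can be tried out in isolation. The sketch below uses mock row objects instead of a real search response; the managed property names (NewsIngress, NewsDate) and the rowsToNewsItems helper are my own illustration, following the same assumptions as the code above:

```javascript
// Constructor mirroring the NewsItem object from the example above.
function NewsItem(title, url, ingress, newsDate) {
    this.Title = title;
    this.Url = url;
    this.Ingress = ingress;
    this.NewsDate = newsDate;
}

// Converts an array of result rows (plain objects keyed by managed
// property name, as returned by get_resultRows()) into NewsItem instances.
function rowsToNewsItems(rows) {
    var items = [];
    for (var i = 0; i < rows.length; i++) {
        var row = rows[i];
        items.push(new NewsItem(row["Title"], row["Path"],
            row["NewsIngress"], row["NewsDate"]));
    }
    return items;
}

// Mock rows shaped like search result rows:
var mockRows = [
    { "Title": "News 1", "Path": "http://contoso/news/1",
      "NewsIngress": "First ingress", "NewsDate": "2015-04-19" }
];
var items = rowsToNewsItems(mockRows);
```

Factoring the loop out like this also makes the success callback easier to unit-test without a live Sharepoint context.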

Sunday, April 12, 2015

Create search crawl rules for Sharepoint search service application via PowerShell

In one of my previous articles I showed how we may exclude system pages like AllItems.aspx from search results: Exclude AllItems.aspx from search results in Sharepoint 2013. In this post I will show how to create search crawl rules via PowerShell. It may be useful when you need to exclude a lot of content from search crawling and doing it manually would mean a lot of work (e.g. when you restored a large content database from production, but don't need to crawl all sites). Here is the script:

# Ensure SharePoint PowerShell Snapin
if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null)
{
    Add-PSSnapin "Microsoft.SharePoint.PowerShell"
}

[xml]$xmlinput = (Get-Content "CrawlRules.xml")

foreach ($WebApplication in $xmlinput.SelectNodes("Build/WebApplication"))
{
    foreach ($SearchService in $WebApplication.SelectNodes("SearchService"))
    {
        # Get search service application
        $strServiceName = $SearchService.Name;
        $spService = Get-SPEnterpriseSearchServiceApplication -Identity $strServiceName;

        # Clear existing rules if needed
        $Rules = $SearchService.SelectNodes("Rules");
        $strClearRules = $Rules.Item(0).Clear;
        if ($strClearRules -eq "True")
        {
            $spRules = Get-SPEnterpriseSearchCrawlRule -SearchApplication $spService;
            foreach ($spRule in $spRules)
            {
                if ($spRule -ne $null)
                {
                    Write-Host "Deleting rule:" $spRule.Path -ForegroundColor Yellow
                    $spRule.Delete();
                }
            }
        }

        # Add new rules
        foreach ($CrawlRule in $SearchService.SelectNodes("Rules/Rule"))
        {
            $FollowComplexUrls = $false;
            if ($CrawlRule.FollowComplexUrls -eq "True")
            {
                $FollowComplexUrls = $true;
            }

            if ($CrawlRule.Type -eq "ExclusionRule")
            {
                # For exclusion rules FollowComplexUrls actually means "Exclude complex URLs"
                $FollowComplexUrls = !$FollowComplexUrls;
                New-SPEnterpriseSearchCrawlRule -Path $CrawlRule.URL -SearchApplication $spService `
                    -Type $CrawlRule.Type -FollowComplexUrls $FollowComplexUrls
            }
            else
            {
                $CrawlAsHttp = $false;
                if ($CrawlRule.CrawlAsHttp -eq "True")
                {
                    $CrawlAsHttp = $true;
                }

                $SuppressIndexing = $false;
                if ($CrawlRule.SuppressIndexing -eq "True")
                {
                    $SuppressIndexing = $true;
                }

                New-SPEnterpriseSearchCrawlRule -Path $CrawlRule.URL -SearchApplication $spService `
                    -Type $CrawlRule.Type -FollowComplexUrls $FollowComplexUrls `
                    -CrawlAsHttp $CrawlAsHttp -SuppressIndexing $SuppressIndexing
            }
        }
    }
}

Rules are defined in CrawlRules.xml file which has the following structure:

<?xml version="1.0" encoding="utf-8"?>
<Build>
  <WebApplication>
    <SearchService Name="Search Service Application">
      <Rules Clear="True">
        <Rule URL="*://*/_layouts/*" Type="ExclusionRule" FollowComplexUrls="False" />
        <Rule URL="*://*/_catalogs/*" Type="ExclusionRule" />
        <Rule URL="*://*/_vti_bin/*" Type="ExclusionRule" />
        <Rule URL="*://*/forms/AllItems.aspx*" Type="ExclusionRule" />
        <Rule URL="*://*/forms/DispForm.aspx*" Type="ExclusionRule" />
        <Rule URL="*://*/forms/EditForm.aspx*" Type="ExclusionRule" />
        <Rule URL="*://*/forms/NewForm.aspx*" Type="ExclusionRule" />
      </Rules>
    </SearchService>
  </WebApplication>
</Build>

As a result the script will create exclusion rules for layouts pages, for pages from _catalogs and _vti_bin, and for the list forms AllItems.aspx, DispForm.aspx, EditForm.aspx and NewForm.aspx. You may generate this xml file programmatically if you have a lot of sites which should be excluded, and then pass it to the script above. It will simplify administrative work which otherwise would have to be done manually.
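Generating the XML programmatically can be sketched as follows. The buildCrawlRulesXml helper below is my own illustration (any language would do, since the PowerShell script only consumes the file), and it assumes the URL patterns contain no characters that need XML escaping:

```javascript
// Builds a CrawlRules.xml string (matching the structure above) from a
// list of URL patterns that should be excluded from crawling.
function buildCrawlRulesXml(serviceName, excludedUrls) {
    var lines = [];
    lines.push('<?xml version="1.0" encoding="utf-8"?>');
    lines.push('<Build>');
    lines.push('  <WebApplication>');
    lines.push('    <SearchService Name="' + serviceName + '">');
    lines.push('      <Rules Clear="True">');
    excludedUrls.forEach(function (url) {
        lines.push('        <Rule URL="' + url + '" Type="ExclusionRule" />');
    });
    lines.push('      </Rules>');
    lines.push('    </SearchService>');
    lines.push('  </WebApplication>');
    lines.push('</Build>');
    return lines.join('\n');
}

var xml = buildCrawlRulesXml('Search Service Application',
    ['*://*/_layouts/*', '*://*/_catalogs/*']);
```

The resulting string can then be written to CrawlRules.xml and fed to the PowerShell script above.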

Saturday, April 4, 2015

Fix 401 Unauthorized error for anonymous users when use owssvr.dll

The owssvr.dll library is used for performing various operations against Sharepoint content (see URL Protocol). E.g. you may retrieve data from a particular list using this dll like this:

http://example.com/[sites/][Site_Name/]_vti_bin/owssvr.dll?Cmd=Display&List=GUID&XMLDATA=TRUE
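For illustration, such a request URL can be assembled with a small helper. The buildOwssvrUrl function below is hypothetical (not part of Sharepoint); it just concatenates the site URL, the owssvr.dll path and the query string:

```javascript
// Builds an owssvr.dll request URL that exports list data as XML.
// siteUrl: absolute URL of the site (e.g. "http://example.com/sites/news"),
// listId: identifier of the list (the GUID from the URL template above).
function buildOwssvrUrl(siteUrl, listId) {
    return siteUrl.replace(/\/$/, '') +            // drop trailing slash
        '/_vti_bin/owssvr.dll?Cmd=Display&List=' +
        encodeURIComponent(listId) +
        '&XMLDATA=TRUE';
}

var url = buildOwssvrUrl('http://example.com/sites/news', 'abc');
```

The returned URL can then be requested with any HTTP client to get the list data as XML.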

There is however a problem with using owssvr.dll by anonymous users, who may get an HTTP 401 Unauthorized error. In order to make owssvr.dll work for anonymous users a few things should be done. At first we need to enable anonymous state on the particular SPWeb and add SPBasePermissions.ViewFormPages and SPBasePermissions.UseRemoteAPIs to the Limited Access role definition (SPRoleType.Guest), which is used for anonymous users:

web.RoleDefinitions.BreakInheritance(true, true);
var rd = web.RoleDefinitions.GetByType(SPRoleType.Guest);
rd.BasePermissions |= SPBasePermissions.ViewFormPages |
    SPBasePermissions.UseRemoteAPIs;
rd.Update();
web.AnonymousState = SPWeb.WebAnonymousState.On;
web.Update();

This is however not enough. We also need to add the SPBasePermissions.UseRemoteAPIs permission to the list from which we will retrieve data via owssvr.dll. The main problem is that when you enable anonymous access (grant read permissions for anonymous users) for the list from the UI (List Settings > List Permissions > Anonymous Access):

(screenshot: Anonymous Access settings page)

only the following permissions are added to SPList.AnonymousPermMask64:

SPBasePermissions.ViewListItems | SPBasePermissions.ViewVersions | SPBasePermissions.ViewFormPages | SPBasePermissions.Open | SPBasePermissions.ViewPages | SPBasePermissions.UseClientIntegration

and SPBasePermissions.UseRemoteAPIs is not there, as you can see. Even though we added it on the SPWeb level, without it on the list anonymous users will still get the 401 Unauthorized error when they try to use the owssvr.dll library. So in order to make it work we need to grant SPBasePermissions.UseRemoteAPIs on the list level as well:

SPList list = ...;
list.BreakRoleInheritance(true);
list.AnonymousPermMask64 = SPBasePermissions.ViewListItems |
        SPBasePermissions.ViewVersions | SPBasePermissions.ViewFormPages |
        SPBasePermissions.Open | SPBasePermissions.ViewPages |
        SPBasePermissions.UseClientIntegration | SPBasePermissions.UseRemoteAPIs;
list.Update();

After that owssvr.dll should work for anonymous users.
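AnonymousPermMask64 is just a 64-bit bit field, so granting the missing permission is a bitwise OR. The sketch below illustrates that logic in JavaScript with BigInt flags; the numeric values are made up for the example and are not the real SPBasePermissions values:

```javascript
// Illustrative 64-bit permission flags (example values only, not the
// real SPBasePermissions numbers).
var ViewListItems = 0x1n;
var ViewFormPages = 0x2n;
var UseRemoteAPIs = 0x4n;

// A mask like the one the UI produces: UseRemoteAPIs is not included.
var mask = ViewListItems | ViewFormPages;

// The owssvr.dll call fails because the bit is missing:
var hasRemoteApis = (mask & UseRemoteAPIs) !== 0n;  // false

// Granting the permission is a bitwise OR, as in the code above:
mask |= UseRemoteAPIs;
hasRemoteApis = (mask & UseRemoteAPIs) !== 0n;      // true
```

This is why assigning AnonymousPermMask64 in code works where the UI checkbox does not: the UI simply never sets that particular bit.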

Wednesday, April 1, 2015

Sharepoint MVP 2015

Hello dear readers of my blog. I'm pleased to announce that I've become a Sharepoint MVP for 2015: the 5th time in a row since 2011. So this award is a little anniversary for me, and I'm glad to have been with the community all these years. I hope that my humble contribution helps people in their work and makes the world a little bit better. Community work has become part of my life, and it is not really work for me, but rather a pleasure and a way to give something to people regardless of countries, cultures, religions and any other differences. In our not-so-simple time I think it is very important to keep hearts and minds open and help other people without expecting any compensation. It is good that MS recognizes it, but I would continue to do that even without the MVP award. Often people ask how to become an MVP: my answer is to learn to give, not only to take.