Sharing the experience search


Wednesday, October 9, 2013

SharePoint Online: Aha moments! How to get a log?

Welcome to the bizarre world of SharePoint Online!

Do you think you can access the logs?

Here is what Microsoft says about it, as of October 2013:

The Get-SPOTenantLogEntry cmdlet cannot retrieve all SharePoint Online errors. This cmdlet retrieves a subset of errors that happen due to external systems.

For Beta 2, the only company logs available are for Business Connectivity Services (BCS).
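For what it's worth, here is a minimal sketch of pulling those BCS-related entries (the admin URL below is a placeholder for your own tenant):

```powershell
# Connect to the tenant admin site first (the URL is a placeholder).
Connect-SPOService -Url "https://contoso-admin.sharepoint.com" -Credential (Get-Credential)

# Retrieve the external-system (BCS) log entries for the last 24 hours.
Get-SPOTenantLogEntry -StartTimeInUtc (Get-Date).ToUniversalTime().AddDays(-1) `
                      -EndTimeInUtc (Get-Date).ToUniversalTime()
```

That's it - and, per the note above, don't expect anything beyond BCS errors to show up.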

SharePoint Online and PowerShell: How to Get-SPWeb?

Finally, your SharePoint is Online! I mean, you have got your farm running in SharePoint online.

You still have some administration/development/maintenance left for SharePoint Online. And of course, the main tool you have always used to do your job is PowerShell.

Now, a tricky question (who would have thought!) - how do you run Get-SPWeb in SharePoint Online?

Don't we have PowerShell in SharePoint Online, you might wonder?
Yes, you are right (kind of): we have the SharePoint Online Management Shell.
But the module Microsoft.Online.SharePoint.PowerShell only includes some basic manipulation commands at the site-collection level.
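You can verify how short that list is by dumping the module's cmdlets yourself (assuming the SharePoint Online Management Shell is installed):

```powershell
# Load the SPO management module and list everything it exports -
# only a few dozen cmdlets, all scoped at the tenant/site-collection level.
Import-Module Microsoft.Online.SharePoint.PowerShell -DisableNameChecking
Get-Command -Module Microsoft.Online.SharePoint.PowerShell | Sort-Object Name
```

No Get-SPWeb anywhere in that output.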

Check this cool visual tool for PowerShelling in SharePoint to see what's available in the SharePoint PowerShell modules.

So, how do you proceed with the Get-SPWeb task in SharePoint Online?
Through the SharePoint client-side object model (CSOM), of course!

Here are 3 ways to deal with it:
1. Gary Lapointe's module - Lapointe.SharePointOnline.PowerShell.msi
Example:
Get-SPOWeb [-Detail [<SwitchParameter>]] [-Identity [<SPOWebPipeBind>]]

2. Using the standard Microsoft.SharePoint.Client DLL

Example:

# Load the CSOM assemblies
$loadInfo1 = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
$loadInfo2 = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.Runtime")
$webUrl = Read-Host -Prompt "HTTPS URL for your SP Online 2013 site"
$username = Read-Host -Prompt "Email address for logging into that site"
$password = Read-Host -Prompt "Password for $username" -AsSecureString

# Build a client context with SharePoint Online credentials
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($webUrl)
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($username, $password)
$web = $ctx.Web
$webs = $web.Webs
$lists = $web.Lists

# Nothing is fetched from the server until the query is executed
$ctx.Load($lists)
$ctx.Load($webs)
$ctx.Load($web)
$ctx.ExecuteQuery()

$lists | Select-Object -Property Title
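To get closer to an on-prem Get-SPWeb -Limit All, you can recurse over the subwebs with the same context. A sketch, reusing the $ctx built above:

```powershell
# Walks the web hierarchy recursively - roughly what Get-SPWeb gives you on-prem.
function Get-SPOWebRecursive($web) {
    $ctx.Load($web)
    $ctx.Load($web.Webs)
    $ctx.ExecuteQuery()
    $web
    foreach ($subWeb in $web.Webs) {
        Get-SPOWebRecursive $subWeb
    }
}

Get-SPOWebRecursive $ctx.Web | Select-Object Title, ServerRelativeUrl
```

Keep in mind every level of recursion is a round trip to the server, so on deep hierarchies this is slow.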



3. Using the ClaimsAuth.dll sample (claims-based authentication)

Example:

#You need to run PowerShell in single-threaded mode (the authentication requires it).
powershell -STA
#Import the DLL.
[Reflection.Assembly]::LoadFile("{local path to the dll ClaimsAuth.dll}")
#Instantiate a new SPOContext providing your SharePoint Online site URL. (Don’t forget the https)
$ctx = new-object SPOContext("{Url of the site}")

#let's test it

$web = $ctx.Web
$lists = $web.Lists 

$ctx.Load($lists)
$ctx.Load($web)
$ctx.Load($web.Webs)
$ctx.ExecuteQuery()

$lists | Select-Object -Property Title, ItemCount | Format-Table -AutoSize

Now it's finally THE TIME, SharePoint Online administrators, to get to know the Client Side Object Model for SharePoint!

SharePoint Online manual migration: Managed Metadata issues

I have just finished a SharePoint Online manual migration. And this post is all about migrating the Term Store and the managed metadata columns.

So, you have got a term set on-prem.

How do you migrate the term set to SharePoint Online? 

In the case of a manual migration to SharePoint Online, the only option you are left with is importing a CSV file of the term set into the destination - SPOnline.

The provided PowerShell script exports the term set into a CSV file on-prem in order to import it in SharePoint Online.
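For reference, here is a rough sketch of what such an export can look like on-prem, using the server-side taxonomy object model. The site URL, group name, and term set name are placeholders, and only two term levels are handled:

```powershell
# Run on an on-prem SharePoint server.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$site = Get-SPSite "http://onprem/sites/portal"              # placeholder URL
$session = New-Object Microsoft.SharePoint.Taxonomy.TaxonomySession($site)
$termSet = $session.TermStores[0].Groups["MyGroup"].TermSets["MyTermSet"]

# Header expected by the SharePoint "Import Term Set" action.
$lines = @('"Term Set Name","Term Set Description","LCID","Available for Tagging","Term Description","Level 1 Term","Level 2 Term"')

foreach ($term in $termSet.GetAllTerms()) {
    # Rebuild the term path by walking up the parents.
    $names = @(); $t = $term
    while ($t -ne $null) { $names = ,$t.Name + $names; $t = $t.Parent }
    $lines += '"{0}","","","TRUE","","{1}","{2}"' -f $termSet.Name, $names[0], $names[1]
}

$lines | Out-File "terms.csv" -Encoding UTF8
```

Extend the column list for deeper hierarchies; the import format goes up to seven levels.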

The pitfall of importing a CSV file: you can't specify parent-child relationships between term sets.

Another issue I discovered is that there is no way to copy over Keywords, since it's a system term set. So if you need them from on-prem, you either have to re-type them manually or (a much better option) buy a 3rd party tool for SharePoint migration.

And finally, the BIG ISSUE of manual migration of managed metadata:

Imported terms get a different GUID than the terms on-prem. That means every list that uses managed metadata has to be updated manually with the valid imported terms after migration.

To sum up, if you have used taxonomy and folksonomy extensively on-prem, the best approach is to migrate with 3rd party tools for SharePoint migration.

Taxonomy, Folksonomy. What are they and why should we bother?

A taxonomy is a hierarchical classification of words, labels, or terms that are organized into groups based on similarities.

You can represent your corporate taxonomy by using managed metadata.
Managed metadata is a hierarchical collection of centrally managed terms that you can define, and then use as attributes for items in Microsoft SharePoint Server 2010.
- Taxonomy is the key: Managed Metadata can drive navigation, discovery, relation, and re-usability of content.
- Managed Metadata provides a common vocabulary and can connect people to social networks.
- Managed Metadata columns promote the consistent use of metadata across sites because they provide users with a list of terms that they can apply to their content.
- Taxonomy tags are added by the content creator or author. Most typical blogs have a tag cloud - this is a good example of a taxonomy.
- Folksonomy tags are added by the consumer or reader (not the content creator). So Flickr keywords - where users can add their own tags/keywords describing a photo - are a good example of a folksonomy.
- A folksonomy-based approach to metadata can be useful because it taps the knowledge and expertise of site users and content creators, and it enables content classification to evolve with the users' changing business needs and interests.

SharePoint Online migration: Manual migration. Why, How?

I have just finished a manual migration from SharePoint on-prem to SharePoint Online.

And this post is about:
 - Why did I choose manual migration?
 - How is it done?


Just to make sure that you are aware of the supported migration scenarios, let me post this link for you - Migrate to Office 365 – Supported Migration Scenarios. That blog obviously exists to support a 3rd party migration tool, in this case Sharegate. I am not related to this product in any way; moreover, I have never used it. But I like how the author presents the information regarding SPOnline migration simply and clearly.

So, why did I choose manual migration?

I took some time to analyze the leaders in 3rd party tools for SharePoint migration, and came away with the impression that they are worth the money if the content is bigger than 50 GB. In my case, it wasn't.
So, I have decided to do a manual migration.

NOTE: MANUAL MIGRATION WILL REPLACE CREATED BY|MODIFIED BY with the user doing the migration, and CREATED DATE|MODIFIED DATE will be set to the current date when the migration is done.
In case you have to preserve this metadata, you need to use 3rd party migration tools.

How is manual migration done?

You have several options here, depending on what you want to copy over.
The fastest and easiest way is:

1. Save site as a site template

This way you can copy the entire site with content (if you check the option "Include content").

You can't save a site template if:
 - The Publishing feature is activated on the site;
 - A list is too large to save as a template (the size of a template cannot exceed 52428800 bytes, i.e. 50 MB);
 - Sometimes lookup and managed metadata columns prevent saving the site as a site template.
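As an aside, on the on-prem side (where you actually save the template) the 50 MB cap can be raised; in SharePoint Online itself it cannot be changed. A sketch:

```powershell
# Raise the template size limit on the source on-prem farm to 100 MB.
# The value is in bytes; this does not apply to SharePoint Online.
stsadm -o setproperty -propertyname max-template-document-size -propertyvalue 104857600
```

That only helps if the destination can accept the larger template, so treat it as a source-side workaround.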

2.  Save list as a list template

In case the site template won't work out, you can create a blank site in SharePoint Online and then transfer the content list by list.

The easy way to transfer a list is via a list template with content (List Settings -> Save list as template).
This way you can save a list as well as a library.

You can't save a list template if:
 - The list is too large to save as a template (the size of a template cannot exceed 52428800 bytes, i.e. 50 MB);
 - The list has a lookup or managed metadata column.

Special case: how to save a Wiki as a list template.
There is no "Save as template" option under the Wiki library settings. BUT, you can still save a Wiki as a template.
You can do that via SharePoint Designer: open the site in SharePoint Designer, select the wiki library, and in the ribbon above you will find the option "Save as Template".

3. Copy data via Access
In some cases, you have to use Access to bring the data over into the cloud. You can find "Open with Access" in the list menu.
You have two options for linking the data in Access:
 - Link to data;
 - Export.

How do you choose one over the other?

If you don't need to write back to the source (on-prem), and especially when you want to transfer managed metadata, user, or lookup columns, Export is the perfect candidate: the export converts managed metadata, user, and lookup columns into text fields.

On the contrary, for the destination, "Link to data" brings the best result most of the time.

How to migrate lookup, user, managed metadata columns?

Before migrating a list with a lookup, you have to create a list for the lookup values in the destination site.
Then you can create the destination list with the lookup and link that list to Access. When you copy the data of the source list to Access, the lookup values are converted to text values.
Then copy the data from the source table to the destination table.
After the data are copied over via Access, update the lookup column in Quick Edit mode in the list.
Quick Edit also lets you update Managed Metadata and User columns manually pretty fast.

Here are the cases when I use Access:
 - Copying a Links list;
 - Exporting from the source a list with a user column;
 - Exporting from the source a list with a lookup;
 - Exporting from the source a list with a managed metadata column.

4. Copy files via Explorer
Access allows you to copy lists over to the cloud, but you can't use it for libraries.
To copy files, use "Open with Explorer".

Exception: files larger than 50 MB give the error "The file size exceeds the limit allowed and cannot be saved" while copying.


You can still copy these files by dragging and dropping them directly into the document library.
Special bonus: how to set a list to read-only?
And this tip is kind of obvious, but I want to share it anyway to complete the picture of manual migration.

Once you have transferred the data from the source, you want to make sure the source data will not change. Manual migration is tedious, so you want to limit incremental updates as much as possible.
First, of course, you want to start with the data that are not updated frequently. This way, the chances that anybody notices that the list/site is read-only are really low.
To set a list/library/site to read-only, break the permission inheritance and change the permissions to "Read".
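If you prefer doing that from PowerShell rather than the UI, here is a sketch via CSOM, reusing a ClientContext like the $ctx from the earlier Get-SPWeb post (the list title is a placeholder):

```powershell
# Break permission inheritance on the list, keeping a copy of the current
# role assignments; afterwards, swap the groups to "Read" in the UI or via CSOM.
$list = $ctx.Web.Lists.GetByTitle("Documents")   # placeholder list title
$list.BreakRoleInheritance($true, $false)        # $true = copy existing assignments
$ctx.ExecuteQuery()
```

BreakRoleInheritance with $true preserves the existing assignments as a starting point, so nobody loses access mid-migration before you downgrade them to Read.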

Summary:

The described manual migration steps are exactly what I did to migrate the portal.

Please note that if you have customizations, you need to figure out whether you want to leave them behind; if not, would it be possible to convert them to sandbox solutions?
In my case, I got rid of all customizations before the actual migration, back during the migration from SP2010 to SP2013 on-prem. This way I had prepared the farm in advance.

Migrating 4 GB of content across 16 subsites (1 site collection, with no user site collection content migrated) took 80 hours - 2 weeks of solo work.

Welcome to the cloud! Once you have done it, find a new job)


Monday, October 7, 2013

SharePoint Online migration: a quick analysis of 3rd party tools for migration

Recently I have finished manual migration to SharePoint Online. 


And this quick post is my first-glance analysis of the leading 3rd party tools for migration.
Note: I am not affiliated with any 3rd party SharePoint migration tools. This analysis was done in order to choose (if necessary) tools for my migration projects.

1. Quest Migration Suite for SharePoint - Quest’s Migration Suite for SharePoint is the complete solution for simplifying SharePoint migrations whether to SharePoint on-premises or SharePoint Online. It virtually eliminates the risks of downtime and data loss. With a single, agentless tool requiring just one install, you can seamlessly move your entire SharePoint environment, as well as Microsoft Exchange Public Folders, and Windows files, to on-premises versions of SharePoint, SharePoint Online, or a hybrid of the two. And if you decide to wait to migrate to SharePoint Online down the road, you’ll already have a tool in place to proceed immediately.

   I used the trial version to trim the content before the actual migration. 

Pro: I liked the easy installation and easy manipulation of SP objects. I used in-place tagging extensively. I love that they offer a trial version with full functionality; the only limit is the expiration date.

Cons: UI is ugly. 

 Price: Unknown.

   I haven't tried it, but I have seen a live demonstration and got a chance to ask questions.

Pro: The product has features for comparing objects and manipulating them (incremental update/migrate capabilities).
It seems more robust than Quest.
I like that they have an online version (for SPOnline farms) of their product for control and governance - ControlPoint.

Cons: I haven't got a price sheet after presentation. The whole deal with hiding price is a big annoyance.

 Price: Unknown.

I haven't tried it, but I have seen a live demonstration and got a chance to ask questions.

Pro: A BIG BIG PRO - THE TRUST. AvePoint's VP is Jeremy Thake. He is a well-known expert in the SharePoint world and has lots of intelligent, hot-topic articles on SharePoint. Because of his involvement in the product, I feel a huge inclination to trust these guys.
Prices are not hidden.

Cons: The concept of the DocAve 6 Platform is complicated, in my opinion. I couldn't figure out which product I needed for migration until I had seen a live presentation.

Price: Content Manager - 995 per SP server (web front-end, app server)

Project Server 2013: Bugs

Recently, I participated in a Project Server upgrade from 2010 to 2013. The migration process was typical and well described in Upgrade to Project Server 2013.

After the migration was over, I faced several issues in Project Server 2013 which were confirmed as bugs by Microsoft and, as of October 2013, mostly not resolved.

Here is a small collection of them:

Project Server 2013: Can't close the period - already resolved


Project Server 2013: Timesheet.aspx. The view failed to load.


Project Server 2013: TS Approval history won't show when filtered by Resource name


Project Server 2013: <%$Resources:PWA,ADMIN_ADDMODIFYUSER_BROWSE%>

After the migration to Project Server 2013, I noticed several bugs that have already been confirmed by Microsoft but, as of October 2013, haven't been resolved yet.

One of them is following:

On the Approval page, when a timesheet is selected and you click Accept,
the Confirm Approval pop-up shows the label <% <%$Resources:PWA,ADMIN_ADDMODIFYUSER_BROWSE%>> instead of the "Browse" button.

Here is a response from Microsoft:
RESOLUTION:

·         When selecting a different timesheet manager from approval center, instead of "Browse" on the button you are seeing a label with "<% <%$Resources:PWA,ADMIN_ADDMODIFYUSER_BROWSE%>>".
·         But still you will be able to browse and select a different timesheet manager.
·         But the label name part is broken and issue is reproducible with latest CU also.
·         I have filed an internal bug for this case and closing case as bug so that you will not be charged.

Project Server 2013: TS Approval history won't show when filtered by Resource name

After migration to Project Server 2013, I have found several bugs that have been confirmed but not yet resolved by Microsoft.

One of them:

For timesheets in the Approval Center, when applying the "Resources" filter and fetching timesheets by "Resource Name", Project Server 2013 does not show any timesheets.

As of October 2013, the RESOLUTION from Microsoft:

 -  Resource filter issue in 2013 is reproducible with latest CU in-house.
 - This looks to be a known bug in project server 2013 and we have already filed an internal bug.

 - But, as an alternate solution, you can uncheck both "Date" and "Resource" filters and then use resource name to get the timesheet.

Project Server 2013: Can't close the period

After the migration to Project Server 2013 was over, I noticed several issues.

One of them was:

I can't close a Time Reporting Period: the page TimePeriod.aspx gets reloaded without saving the data.

As it turned out, if any timesheets exist for a period, it's not possible to close it.

That bug was fixed in the Project Server CU of June 26, 2013.

Project Server 2013: Timesheet.aspx. The view failed to load.

Recently, I participated in a Project Server upgrade from 2010 to 2013. The migration process was typical and well described in Upgrade to Project Server 2013.

After the migration was over, I faced several issues in Project Server 2013 which were confirmed as bugs by Microsoft and, as of October 2013, were not resolved.

One of them is following:

Issue:
When trying to access a timesheet that has a deleted task, you get the error below:
"The view failed to load. Press Ok to reload this view with default settings. Press Cancel to select another view." The issue appears when Single Entry Mode is ON.


Workaround:
To remove the offending task, set Single Entry Mode to Off and remove the task from the timesheet. If necessary, set Single Entry Mode back to On.

Thursday, October 3, 2013

SharePoint: Search analytics or How to Improve Search

Recently, I have been diving deeper into Information Architecture.

One of the key issues in governance planning is Search.

Google search and SharePoint search shouldn't work the same.

When you google something, you get lots of something.

When you search in SharePoint, your intention is to get less, but more relevant, information in order to make business decisions quicker. You are confined by the business jargon used in your organization, or by the author you are searching for.


Typical search problems:
 - Garbage in, garbage out
 - Poor metadata
 - Too much noise
   - Irrelevant content on top
   - Useless content
   - Duplicate content
 - Bad document authoring (PDF, MS Office)
 - Misalignment with / misunderstanding of users' conceptions
 - No improvement over time


Actions to improve search:
 - Use Site Collection Web Reports
 - Check statistics weekly for the first 3 months
 - Create 20-50 Best Bets
 - Check monthly and adjust
 - Improve titles and descriptions
 - Add metadata
 - Map existing metadata


How to use Site Collection Web Reports to analyze Search?

How do you know if search needs improvement?

Start with "Failed Queries": /_layouts/WebAnalytics/Report.aspx?t=SearchFailureReport&l=sc
Look at the number of queries that failed and compare it with the number of visitors.
If the failed numbers are high, this is the first indication that you have to improve search.

What the "Percentage Abandoned" value indicates:
 - Returned no results: see if you can improve content exposure (via metadata) to show results based on the criteria.
 - 100%: users didn't follow any link shown on the result page. Check the results yourself and analyze why the content is not used. Is the metadata describing the content wrong, or do users use different terminology?


How do you analyze UX via Search?

If visitor numbers are high (/_layouts/WebAnalytics/Report.aspx?t=UniqueVisitorsTrend&l=sc) but people rarely use search overall (_layouts/WebAnalytics/Report.aspx?t=SearchTrafficTrendReport&l=sc), it might mean that your navigation is great, or that users go only to specific places without exploring what else in the portal could be useful - maybe you need to promote the search feature more.

If users use search intensely, it might indicate that you have to improve your navigation.

How can you reduce failed queries?
Add a search keyword with a Best Bet.
A Best Bet is a promoted result. Analyze the use of Best Bets via "Best Bets usage": _layouts/WebAnalytics/Report.aspx?t=BestBetPerformanceReport&l=sc


This note is based on Search Analytics in SharePoint 2010



SharePoint 2013: Architectural changes. Pain Points

As I have launched a migration project to SharePoint 2013, I have started analyzing the architectural changes in SharePoint 2013 farm and other related products.

My goal is to migrate SharePoint 2010 with Project Server 2010 to SharePoint 2013 with Project Server 2013.

Here are my earliest discoveries:

1. Hardware recommendations differ for SharePoint 2013 and for Project Server 2013, but Project Server lives inside the SP farm.
For a small SP farm, you need around 12 GB for the front end and 8 GB for SQL.
But in case you want to include Project Server in the farm, MS recommends 16 GB for the front end and 16 GB for SQL.
NOTE: This 16 GB recommendation doesn't cover the additional memory needed by other SP services:
"The minimum hardware requirements in this section are recommended in which only the required services to run Project Server 2013 are enabled. Be aware that enabling additional SharePoint Server 2013 features in the farm may require more resources."
So, it seems like MS intends to sell lots of Azure VM services!
At this moment, I am struggling to find on-prem VMs for a new demanding farm. Realistically, I will not get more than 10 GB RAM for SP + Project Server 2013.


PAIN POINT: You have to boost hardware not only for SP2013 but also to accommodate Project Server 2013.


2. The Office Web Apps server (or OWA farm) must be installed on a dedicated server (NOT ON THE SP SERVER).
And by the way, are you acquainted with PowerShell? The OWA server doesn't have a UI for settings; you have to use PowerShell 3.0.
Once you have installed the OWA farm (server), you have to bind SP to it (SPWopiBinding).
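The binding itself is just a couple of cmdlets - a sketch with placeholder host names:

```powershell
# On the dedicated OWA server (requires the OfficeWebApps module):
New-OfficeWebAppsFarm -InternalUrl "https://owa.contoso.com" -EditingEnabled

# Then, on a SharePoint 2013 server, bind SharePoint to the OWA farm:
New-SPWOPIBinding -ServerName "owa.contoso.com"
Get-SPWOPIBinding | Select-Object Application, Extension, Action   # verify the bindings
```

Note that -EditingEnabled requires the appropriate Office licensing; viewing works without it.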

Now, think twice in case of migration.
Not only do you have to boost hardware for the servers, you also have to find an additional server for Office Web Apps. In SP2010, the Office Web Apps product was supposed to be installed on the SP server itself.

PAIN POINT: You have to find an additional server for OWA to provide functionality that was "hardware-strain" free in the previous SP version.

3. The Web Analytics service is removed from SP2013 as a separate service. Now it's a part of the Search service.
"When you upgrade to SharePoint 2013, do not attach and upgrade the databases that contain the data from Web Analytics in SharePoint Server 2010. We recommend that you turn off Web Analytics in the SharePoint Server 2010 environment before you copy the content databases that you want to upgrade to SharePoint 2013."
PAIN POINT: Web Analytics is the best service for cleaning your farm before you go to SharePoint 2013: it allows you to find unused SPWebs. But you have to turn the service off before you copy your content DB for a test upgrade. So you need to clean the environment (with the help of Web Analytics) before even your first test upgrade.

4. The new Access Services requires SQL 2012.

In case you want (or have) to provide Access Services in your SP2013 farm, you have to point the service to a SQL 2012 server.

PAIN POINT: For Access Services you have to have SQL 2012. So if you have planned to stick with SQL 2008, make sure you don't need Access Services. In case you do need it, you have 2 choices: 1. have a SQL 2012 server installed in the first place, or 2. extend your farm and include an additional SQL 2012 server to accommodate the Access Services application.