Sharing the experience


Monday, April 21, 2014

#SP24: SharePoint IaaS

@kenmaglio in #SP24S018 "Azure IaaS - Easy Mode SharePoint 2013 Development Environment" showed how to set up SP2013 on an Azure VM.
Here are resources to make it happen:
 Walkthrough to setup SP2013 on Azure VM
 Configure the VMs for external access to SharePoint

More on SaaS vs IaaS


Wednesday, October 9, 2013

SharePoint Online: Aha moments! How to get a log?

Welcome to the bizarre world of SharePoint Online!

Do you think you can access the logs?

Here is what Microsoft says about it, as of October 2013:

The Get-SPOTenantLogEntry cmdlet cannot retrieve all SharePoint Online errors. This cmdlet retrieves a subset of errors that happen due to external systems.

For Beta 2, the only company logs available are for Business Connectivity Services (BCS).
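
If you do want to pull what little is exposed, here is a minimal sketch using the SharePoint Online Management Shell (the "contoso" tenant admin URL is a hypothetical example):

# A minimal sketch, assuming the SharePoint Online Management Shell is installed.
Import-Module Microsoft.Online.SharePoint.PowerShell -DisableNameChecking

# Connect to the tenant admin site (you will be prompted for credentials)
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"

# Pull the last two weeks of tenant log entries (mostly BCS-related, as noted above)
Get-SPOTenantLogEntry -StartTimeInUtc (Get-Date).AddDays(-14).ToUniversalTime() `
                      -EndTimeInUtc (Get-Date).ToUniversalTime() `
                      -MaxRows 100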

SharePoint Online and PowerShell: How to Get-Spweb?

Finally, your SharePoint is Online! I mean, you have got your farm running in SharePoint Online.

You still have some administration\development\maintenance left to do for SharePoint Online. And of course, the main tool you are used to doing your job with is PowerShell.

Now, a tricky question (who would have thought!) - how do you run Get-SPWeb in SharePoint Online?

Isn't it true that we have PowerShell in SharePoint Online, you might wonder?
Yes, you are right (kind of), we have the SharePoint Online Management Shell.
But the module microsoft.online.sharepoint.powershell only includes some basic manipulation commands at the site collection level.

Check this cool visual tool for PowerShelling in SharePoint to see what's available in the SharePoint PowerShell modules.

So, how do you accomplish the Get-SPWeb task in SharePoint Online?
Through the SharePoint client-side object model, of course!

Here are 3 ways to deal with it:
1. Gary Lapointe's module - Lapointe.SharePointOnline.PowerShell.msi
Example:
Get-SPOWeb [-Detail [<SwitchParameter>]] [-Identity [<SPOWebPipeBind>]]

2. Using the standard Microsoft.SharePoint.Client dll

Example:

# Load the SharePoint client (CSOM) assemblies
$loadInfo1 = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
$loadInfo2 = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.Runtime")

# Collect the site URL and credentials
$webUrl = Read-Host -Prompt "HTTPS URL for your SP Online 2013 site"
$username = Read-Host -Prompt "Email address for logging into that site"
$password = Read-Host -Prompt "Password for $username" -AsSecureString

# Build the client context with SharePoint Online credentials
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($webUrl)
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($username, $password)

# Queue up the web, its subwebs, and its lists, then execute a single round trip
$web = $ctx.Web
$webs = $web.Webs
$lists = $web.Lists

$ctx.Load($lists)
$ctx.Load($webs)
$ctx.Load($web)
$ctx.ExecuteQuery()

$lists | Select-Object -Property Title



3. Using a claims authentication helper DLL (ClaimsAuth.dll with SPOContext)

Example:

#You need to run PowerShell in single-threaded mode (the authentication requires it).
powershell -STA
#Import the DLL.
[Reflection.Assembly]::LoadFile("{local path to the dll ClaimsAuth.dll}")
#Instantiate a new SPOContext providing your SharePoint Online site URL. (Don’t forget the https)
$ctx = new-object SPOContext("{Url of the site}")

#let's test it

$web = $ctx.Web
$lists = $web.Lists 

$ctx.Load($lists)
$ctx.Load($web)
$ctx.Load($web.Webs)
$ctx.ExecuteQuery()

$lists | Select-Object -Property Title, ItemCount | Format-Table -AutoSize

Now it is finally THE TIME, SharePoint Online Administrators, to get to know the Client-Side Object Model for SharePoint!

Taxonomy, Folksonomy. What are they and why should we bother?

A taxonomy is a hierarchical classification of words, labels, or terms that are organized into groups based on similarities.

You can represent your corporate taxonomy by using managed metadata.
Managed metadata is a hierarchical collection of centrally managed terms that you can define, and then use as attributes for items in Microsoft SharePoint Server 2010.

- Taxonomy is the key: Managed Metadata can drive navigation, discovery, relation, and re-usability of content.
- Managed Metadata provides a common vocabulary and can connect people to social networks.
- Managed Metadata columns promote the consistent use of metadata across sites because they provide users with a list of terms that they can apply to their content.
- Taxonomy tags are added by the content creator or author. Most typical blogs will have a tag cloud - this is a good example of a taxonomy.
- Folksonomy tags are added by the consumer or reader (not the content creator). So Flickr keywords - where the user can add their own tags / keywords describing a photo - are a good example of a folksonomy.
- A folksonomy-based approach to metadata can be useful because it taps the knowledge and expertise of site users and content creators, and it enables content classification to evolve with the users' changing business needs and interests.

Thursday, October 3, 2013

SharePoint: Search analytics or How to Improve Search

Recently, I have been diving deeper into Information Architecture.

One of the key issues in governance planning is search.


Google search and SharePoint search shouldn't work the same way.

When you google something, you get lots of results.

When you search in SharePoint, your intention is to get fewer but more relevant results so you can make business decisions quicker. You are confined to the business jargon used in your organization, or to the author you are searching for.


Typical Search problems:
 - Garbage in, Garbage out
 - Poor metadata
 - Too much noise
   - Irrelevant content on top
   - Useless content
   - Duplicate content
 - Bad document authoring (pdf, MSOffice)
 - Misalignment with / misunderstanding of users' conceptions
 - No Improvement over time


Actions to improve search:
 - Use Site Collection Web Reports
 - Check statistics weekly in the first 3 months
 - Make 20-50 Best Bets
 - Check monthly and adjust
 - Improve titles and descriptions
 - Add metadata
 - Map existing metadata


How to use Site Collection Web Reports to analyze Search?

How do you know if search needs improvement?

Start with "Failed Queries" /_layouts/WebAnalytics/Report.aspx?t=SearchFailureReport&l=sc
Look at the number of failed queries and compare it with the number of visitors.
If the failed numbers are high, this is the first indication that you have to improve search.

"Percentage Abandoned" value indications:
 - Returned no results - see if you can improve content exposure (via metadata) so results show up for those criteria.
 - 100% - users didn't follow any link shown on the result page. Check the results yourself and analyze why the content is not used. Is it wrong metadata describing the content, or do users use different terminology?


How do you analyze UX via Search?

If visitor numbers are high /_layouts/WebAnalytics/Report.aspx?t=UniqueVisitorsTrend&l=sc
, but people rarely use search overall _layouts/WebAnalytics/Report.aspx?t=SearchTrafficTrendReport&l=sc
, it might mean that your navigation is great,
or that users go only to specific places without exploring what else could be useful in the portal - in which case maybe you need to promote the search feature more.

If users use search intensively, it might indicate that you have to improve your navigation.

How can you reduce failed queries?
Add a search keyword with a Best Bet.
A Best Bet is a promoted result. Analyze use of the Best Bets via "Best Bets usage" - _layouts/WebAnalytics/Report.aspx?t=BestBetPerformanceReport&l=sc


This note is based on Search Analytics in SharePoint 2010



Tuesday, June 18, 2013

Alert error: You do not have an e-mail address.


You do not have an e-mail address.
Alert has been created successfully but you will not receive notifications until valid e-mail or mobile address has been provided in your profile

Got this error on your on-prem SP2013 while setting alerts?
The reason, in my case, was that the User Profile property "Work Email" wasn't set up correctly.

Don't worry, it is easy to set straight, and here is how:
1. Go to Manage User Properties, find "Work Email", and click Edit.
2. In the "Work Email" properties, add a new mapping, "mail".
3. Make sure the Policy Setting is "Replicable" and the Edit Settings allow users to edit values for this property.

4. Start Full synchronization.

Done!
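
As a quick sanity check after the synchronization completes, you can confirm that the e-mail actually made it down to the site level. A minimal sketch, run in the SharePoint 2013 Management Shell on a farm server (the URL and account below are hypothetical):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web = Get-SPWeb "http://portal/sites/teamsite"
# Use the claims login name (e.g. i:0#.w|domain\jdoe) if the web application uses claims authentication
$user = Get-SPUser -Web $web -Identity "DOMAIN\jdoe"
$user | Select-Object DisplayName, Email     # Email should now show the Work Email value
$web.Dispose()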




Friday, May 24, 2013

Is SharePoint rubbish? Trash talk on SharePoint architecture

I recently bought a book by one of my favorite knowledgeable guys in the SharePoint world - Todd Klindt - Professional SharePoint 2013 Administration.

The book is hilarious and very decent to have on the shelf if you are a SharePoint Administrator, Developer, or Architect.

Here is my favorite paragraph so far, which is very metaphorical and easy to memorize for someone who is new to this big and messy SharePoint world.



"Try this analogy to understand how [all] pieces work together: Web applications are the landfill. Content databases are giant dumpsters. A site collection is a big, black 50-gallon garbage bag. Webs, lists, and items are pieces of trash. Your users spend all week creating garbage, continuously stuffing it in the garbage bags, with each piece of trash occupying only one garbage bag at a time. Each garbage bag can hold only 50 gallons of trash (quotas) before it is full, after which the user has to either ask for a new garbage bag or get a bigger garbage bag.  That full garbage bag is placed in a dumpster, and it is not possible to put a garbage bag in more than one dumpster without destroying it.  Dumpsters are serviced only by one landfill but that landfill can handle thousands of dumpsters without issue. "

Friday, March 22, 2013

SharePoint 2010 on Cloud. SaaS vs IaaS

Ever wonder how you can move your on-prem SP farm to the cloud? And what is the cloud anyway?
As of 03/22/2013 you have two cloud-solution options for SharePoint:
- Office 365, which delivers a bunch of MS products, one of which is SharePoint Online.
This is a classical example of SaaS.
- Windows Azure VM.
This is a classical example of IaaS.
Here is a good picture describing SaaS (Software as a Service), PaaS (Platform as a Service) and IaaS (Infrastructure as a Service):

In the context of migrating SharePoint to the cloud, you need to understand that SharePoint Online delivers the latest version of SharePoint.
What is the implication? You have to be totally ready to move to SP2013. MS doesn't offer a means to move data from your on-prem farm to SharePoint Online.
At this moment I haven't tried any third-party tools for this purpose yet.
Also, your farm should not have farm solutions. SharePoint Online doesn't support farm solutions. And it doesn't support Reporting Services at this moment.

Windows Azure VM gives you more flexibility. You can choose what version of SharePoint you want to run, and you have a full control to administer the system. More on SharePoint Deployment on Windows Azure Virtual Machines

But it also means you have to have the resources to support the system.
And by the way, wondering how you can cut the cost? I wonder too)
At this moment, it's interesting to know that:
Virtual machines continue to incur compute charges even when they are stopped. You can avoid these charges by deleting the virtual machine.
BTW, in case you want to automate the SP installation process, try this - http://autospinstaller.codeplex.com/

Thursday, March 21, 2013

SharePoint 2010: Confirm site use and deletion. How to restore Site Collection ?

One big part of the migration to a new version of SharePoint is the preparation phase.
This phase emphasizes cleaning or pruning the environment before moving content databases to the new version (in the case of a SharePoint 2013 upgrade you don't have an in-place upgrade option).

I recommend cleaning from top to bottom.
First, identify unused web applications (SP Migration: Phase I "Cleaning". Unused web applications).
Then check unused site collections.
At this step, I recommend making use of "Confirm site use and deletion".




Of course, you can still use this option even if you are not planning an upgrade but still feel the urge to get rid of "dead" structure and content.

I found a useful article that explains in detail how this feature works - A Closer Look At “Site use confirmation and deletion”…
As that post mentions, it's good to know that:
As of the June 2012 CU for SharePoint 2010, the Dead Site Delete timer job now calls proc_DeleteSiteAsync which is explained below - this is good news, and now means that sites deleted by site use confirmation and deletion will be subject to the site Recycle Bin.
Let's explore the June 2012 CU (build 14.0.6123.5002):
It resolved issue 2598348 - "Description of the SharePoint Foundation 2010 hotfix package (Wss-x-none.msp): July 2, 2012"

From http://support.microsoft.com/kb/2598348
If a site is deleted by a dead site delete timer job, the site is deleted permanently. Instead, the site should be sent to the Site Collection Recycle Bin.

So, based on this knowledge, you can now check 'Automatically delete the site collection if use is not confirmed' with fewer struggles in your mind.

And btw, where is the Site Collection Recycle Bin? Here is the answer:
SharePoint 2010: SP1 Site Collection Recycle Bin (en-US)
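
And if a site collection has already been swept away by the Dead Site Delete timer job, here is a minimal sketch (SharePoint 2010 SP1 or later) for finding it in the site collection Recycle Bin and bringing it back; the URL is hypothetical:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# List site collections currently sitting in the site collection Recycle Bin
Get-SPDeletedSite | Select-Object Url, DeletionTime, SiteId

# Restore one of them by its server-relative URL
Restore-SPDeletedSite -Identity "/sites/old-project"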

SharePoint 2010 to 2013 Upgrade: Phase I "Cleaning". Unused web applications

As you may know, Microsoft recommends cleaning up the farm before migrating to a new version of SharePoint (Clean up an environment before an upgrade to SharePoint 2013).
For my SharePoint 2007 to SharePoint 2010 migration story, please refer to the "SharePoint 2007 to 2010 Upgrade" online project.

Now I am in the process of migrating SharePoint 2010 to SharePoint 2013.

I recommend starting the cleaning from top to bottom.

First, check for unused web applications.

Web analytics reports are a really good help for that.
They show you the Total Number of Page Views and the Total Number of Daily Unique Visitors.
Based on that information, it's fairly easy to find abandoned web applications.
So, the next step is to stop the related application pool and notify users.
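
To map a web application to its IIS application pool and stop that pool, a minimal sketch (run on a WFE; the pool name below is hypothetical):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
Import-Module WebAdministration

# See every web application together with the name of its application pool
Get-SPWebApplication | Select-Object DisplayName, Url, @{n='AppPool';e={$_.ApplicationPool.Name}}

# Stop the pool that belongs to the abandoned web application
Stop-WebAppPool -Name "SharePoint - intranet80"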

Thursday, October 11, 2012

SharePoint Limits

This is a very short post on SharePoint limitations.

Often when you negotiate software requirements for a SharePoint project, you need to have at hand information about what SharePoint can't do and what it can (especially Boundaries: static limits that cannot be exceeded by design).

Here are a few links that will help at least me to reach this info easily:

Software boundaries and limits for SharePoint Server 2013

SharePoint Online: software boundaries and limits (SP2010 Online; at the time this post was created, SP2013 was still in Preview)

SharePoint Server 2010 capacity management: Software boundaries and limits

Plan for software boundaries (Office SharePoint Server 2007) , and a short version of SharePoint 2007 Limits

And my old posts on:

Large list limits Sharepoint 2007/2010

"Column Limit Exceeded" Message When You Add a New Column to a SharePoint Services List

Happy analyzing to you!

Friday, September 14, 2012

Term store migration to another farm

Your task is to migrate a content db to another farm. For this, you can refer to the post How to restore a SharePoint 2010 content database on a different farm.

But in case you have managed metadata columns somewhere in your web application, you will soon discover that they have lost their related term sets, which is expected.

The term store is not stored in the content database. It is the responsibility of the Managed Metadata Service.

Your question: how can I migrate the term store from one farm to another?

Here is what I found useful to read and use:

Migrating managed metadata term sets to another farm on another domain - the article clearly explains what happens to the metadata values in the columns when you restore a content db on another farm. Essentially, you will not lose the metadata values in the lists, but you can't use them further until you migrate the related term store.

The article also provides a means to migrate your term set and shows how to reconnect the term store to the content via PowerShell - http://sptermstoreutilities.codeplex.com/


Here are my 2 cents:

Instead of using http://metadataexportsps.codeplex.com/ for UI export\import of the term store, you can use PowerShell from Lapointe - a PowerShell SharePoint guru - Exporting and Importing SharePoint 2010 Terms. The navigation on his blog is a little bit confusing - the extended PowerShell commands are bundled in one Lapointe.SharePoint2010.Automation.wsp on http://blog.falchionconsulting.com/index.php/downloads/

In case you want to know how to use PowerShell in SharePoint -Simple concept: How to use SharePoint cmdlets in PowerShell ISE


ATTENTION:
If you have reused terms in the term set, make sure the order in the export file is the following:
Parent first, then Child.
In this example, the term set Customers reuses terms from the first level of the term set Projects. In the initial file that I exported, the Customers set came before Projects.
If I try to import this file via Import-SPTerms, I get an error.
I need to change the order of the terms beforehand and then import the file.
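
To see which terms in the store are reused (and therefore order-sensitive) before you import, here is a minimal sketch using the server-side taxonomy API; the site URL and the service application name "Managed Metadata Service" are assumptions, so adjust them to your farm:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$site = Get-SPSite "http://portal"
$session = Get-SPTaxonomySession -Site $site
$store = $session.TermStores["Managed Metadata Service"]

# Walk every group / term set / term and flag the reused ones
foreach ($group in $store.Groups) {
    foreach ($termSet in $group.TermSets) {
        foreach ($term in $termSet.GetAllTerms()) {
            New-Object PSObject -Property @{
                Group    = $group.Name
                TermSet  = $termSet.Name
                Term     = $term.Name
                IsReused = $term.IsReused
            }
        }
    }
}
$site.Dispose()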


P.S.

For SPOnline migration, I believe you have only one option: export via CSV. I wonder about exporting reused terms - I can't see how to specify that in the CSV file....

Monday, June 25, 2012

SharePoint: High Availability

Recently, I was asked to develop a plan to make a SharePoint 2010 farm highly available.
This short post outlines the main phases that the farm needs to undergo in order to be highly available.

In my case, we have 2 datacenters. One of them is Primary, and the other is Secondary. In case the Primary goes down, we should provide a seamless switch to the Secondary without loss of functionality and, preferably, performance.


In my personal opinion, I prefer to use the available resources for a performance boost, even if it means that end-users will notice some performance degradation in case of failover.

Here is a schema to implement for High Availability:




Here are key notes:

1. Web Front Ends (WFEs) - when the Primary is up, all 4 WFEs (primary and secondary) serve the requests. If the Primary goes down, the Secondary WFEs will get all requests.
Here I can see that this may impact performance, since in the usual scenario end-users use all 4 WFEs (load-balanced via ISA).
One option to keep WFE performance steady is to keep only 2 WFEs available while the Primary is up.
The seamless switch will be provided via ISA.

2. SharePoint application servers - both of them are engaged while the Primary is up. In case of failover, the second application server should have the same services running as the Primary to maintain the same functionality as before.
The weakest point is timer jobs. I can imagine that in some scenarios workflows that were being served by the primary server at the time of failover will never come back.
One more note on this - plan your search architecture: Search Service Application: Architecture in one page


3. SSRS servers - while the Primary is active, engage both. In case of failover, the second SSRS server will get all requests. The seamless switch will be provided via ISA.


4. Often an enterprise SharePoint solution interacts with external systems via BDC (Trying to figure out what's the difference between 2007 BDC and 2010 BCS?).
We need to plan how we can provide access to these external systems in case of failover.
That means extensive communication with the teams who support those systems.
In my example, we funnel all external system calls through web services developed on a BizTalk server. From our side, we need to configure ISA to have an additional BizTalk server available in the Secondary datacenter.


5. On the SQL side - we are implementing async mirroring. In case the primary goes down, we don't have any loss and are ready to switch to the Secondary.
I prefer to have the witness in the Secondary, based on the assumption that we use the Secondary when the Primary is down, not vice versa.

If the report db servers fail in the Primary, we need some extra work on the front-end side. Report connection files have connection information inside them. We need to make sure that all connection files use an alias name instead of the actual server name. In case of failover, we just modify the alias on the Report Server side. Keep in mind that the SSAS type of connection file won't work with a SQL alias; for those we can do it via the hosts file.

We need to set up failover settings for the SharePoint databases.
Refer to this post for what can be mirrored in SharePoint 2010:
The 2010 SharePoint databases, purposes and mirroring supportability
You will find that some of the SharePoint DBs are not required and are not designed to be set up for failover, since they are not critical and are easy to re-create.
Based on the recovery time you have, decide for yourself what needs to be configured with failover and what can be omitted. If a SharePoint db is not configured with failover, plan ahead what actions you should perform to bring the db back in case of failure, and what the impact will be if that db is not ready right away. As an example, most likely the StateService will not be in high demand right after switching to the reserve (Secondary) datacenter.
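
For the databases you do want to fail over, the failover partner can be registered on the SharePoint side as well. A minimal sketch (the database and mirror server names are hypothetical):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Point a content database at its mirrored failover partner
$db = Get-SPContentDatabase "WSS_Content_Portal"
$db.AddFailoverServiceInstance("SQLMIRROR01")
$db.Update()

# Verify which databases already have a failover server configured
Get-SPDatabase | Select-Object Name, FailoverServer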


Here is an outline of a plan for introducing HA in a SharePoint 2010 farm gradually:


1. Decide how many additional servers are needed, and their configuration.
2. WFEs. Test first without including them in ISA. Then include them in the prod farm.
3. App server. Build the server cautiously. I believe that once the app server is joined, it will be used by the prod WFEs. No ISA configuration is needed - all requests go through the SharePoint farm configuration directly.
4. Work on external systems HA.
5. Work on SQL servers and failover settings on the SharePoint side.
6. SSRS servers.

Wishing you happy HA.


Friday, June 22, 2012

SharePoint: SharePoint 2007 Administration and PowerShell 2.0



During my last months of working with SharePoint 2007, I felt a strong inclination to use PowerShell 2.0.
I envied the happy people who were already on SharePoint 2010 and could enjoy SharePoint 2010 administration via SharePoint.ps1.


PowerShell and SharePoint: What, Why and How
Simple concept: How to use SharePoint cmdlets in PowerShell ISE


I also wanted to have the beautiful commands that ship with SharePoint.ps1 for SharePoint 2010:
Get-SPFarm,
Get-SPWeb,
Get-SPWebApplication


So, my problem was that we didn't have SharePoint.ps1 for SharePoint 2007.
To satisfy that desire, I created my own functions:
Get-SPFarmV3,
Get-SPWebV3,
Get-SPWebApplicationV3.


And some additional:
Get-SitesWithMissingTemplate, Get-SSProvider, Get-SPVersionV3.


All of the above I packed into one module, SPv3Adapter.mdl.
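
The module's code isn't reproduced in this post; purely as an illustration, a Get-SPWebV3-style helper against the 2007 server object model could look roughly like this sketch (not the actual module code):

# A sketch only - requires the SharePoint 2007 object model on the machine.
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

function Get-SPWebV3 {
    param([Parameter(Mandatory=$true)][string]$Url)
    # Open the site collection and return the SPWeb addressed by the URL.
    # The caller should Dispose() the returned web and its Site when done.
    $site = New-Object Microsoft.SharePoint.SPSite($Url)
    $site.OpenWeb()
}

# Usage:
# $web = Get-SPWebV3 "http://moss2007/sites/teamsite"
# $web.Title
# $web.Dispose(); $web.Site.Dispose()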


Windows PowerShell Module Concepts


Module Installation Best Practices:


Do not install modules for Windows PowerShell in the system location at %Windir%\System32\WindowsPowerShell\v1.0\Modules. Only modules included with Windows are installed to the system location.


http://msdn.microsoft.com/en-us/library/dd878350(v=vs.85).aspx






And some additional PowerShell files:
Helper functions: Start-CustomTranscript, Get-CustomAPPLog, Get-SolutionDeployed


I used Get-SolutionDeployed a lot during the SP2007 to SP2010 upgrade:


The module Helper.mdl has a dependency on SPv3Adapter.mdl.

Please install the SPv3Adapter.mdl module first in order to enjoy the SP2007 helper functions.
One favorable option for me is to use a manifest (you are welcome to use the attached customModulesLoader.psm1 and Manifest.mdl) to ship the PS modules.


Wednesday, June 20, 2012

SharePoint: Is there life after development? Transition plan for SharePoint support team

Finally, my 2-year project on SharePoint 2007\2010 is finished!
We started with 9 brave people in the team.
Our tasks were:
1. Build SOA infrastructure;
2. Bring data from disparate data sources into one application for measurement and management;
3. Build SharePoint farm;
4. Organize development and deployment process;
5. Develop custom business solutions based on SharePoint technology;
6. Build a report system to make use of the data in SharePoint;
It took us roughly 8 months of initial and hectic work.

And over another year we (now 7 people) were asked to:
 7. Extend functionality;
 8. Do maintenance;
 9. Develop a mobile solution to work with the custom SharePoint sites;
 10. Upgrade 2007 to 2010.

We finished it all. Now it's time to transition support to the support team.
So, here is my suggested transition plan for the SharePoint support team.


The plan should be outlined by a person who holds the knowledge, but the content should be filled in by the support team. The support team also needs an estimate for every phase; the estimate should be recommended by a knowledgeable person.
They need to collaborate with the developers to get the document done.
Once the document is ready, a team lead (or some other person who knows the whole picture) needs to verify the document and correct it if necessary.

Here are steps for transition plan:

1. Knowledge holders 
The main knowledge holder can be the team lead on the project. He should provide contact information to the support team.
The contact info should cover all major areas needed to fully support the SharePoint farm, along with a responsible\knowledgeable person for each area.

Here is an example of areas:
SharePoint Developer
SharePoint Architect
External system (BDC) administrators
DBA
Report developer
AD Administrator
ISA Administrator
SharePoint trainers

2. Environment awareness
The support team should become aware of:
 - what environments were built (e.g. Dev, QC, Training, Staging, Prod);
 - the purpose of each environment;
 - the environment topology (how many servers with what roles, load balancing (e.g. ISA) configuration).

An access plan should be developed: when to ask for access, and to what.

These will be milestones for getting the support team up to speed.

3.  Code awareness
- what the custom solutions are and where they live;
- solution dependencies;
- how to build and deploy;

4. Development and deployment approaches
- is it possible to go with a no-downtime deployment, and how? (SharePoint : Farm solution deployment with no downtime. Update-SPSolution -local)


Approaches to deploy:
Content-based and code-based deployment, and hybrid (refer to previous posts).
When it makes sense to use:
- Sandboxed solution - SharePoint 2010: Sandboxed solution restrictions and considerations;
- Hybrid sandboxed solution;
- Farm solution;
- Content deployment.

Each of these has its own appropriate context, with its own advantages and disadvantages.

Note: This step is for education purposes. You never know what knowledge a new member of the SharePoint support team has.


5. Troubleshooting: common areas
As an example, for our farm these are:
- SSRS
- External system checks (BDCM)

6. Backup and restore strategy


Happy supporting


Tuesday, June 19, 2012

SharePoint : Farm solution deployment with no downtime. Update-SPSolution -local

Wondering if it's possible to re-deploy a farm solution wsp file without downtime?

In some circumstances, it's possible.


You can do a no-downtime deployment if the following is true:
 - Your modified wsp file doesn't contain new files or features. In the most common scenario, you want to change some code logic in an existing dll. Once you deploy your modified wsp, the new dll will be placed in the GAC (in case you don't have assembly versioning enabled).


- You have load-balanced WFEs (ISA with SharePoint Farm - a slightly more complicated scenario with off-box SSL termination).



Here is how it works:
1. ISA test.
Test first whether the site stays up with one server drained. Test this for all servers in the ISA farm object for the portal.

2. ISA switch.
Drain the first WFE.

3. WFE work. Update-SPSolution
Copy the new wsp file to the local drive.
Run:
Update-SPSolution -Identity {name}.wsp -LiteralPath "{path}\{name}.wsp" -Local -GACDeployment

Make sure that you specify the -Local parameter.
It deploys the files locally and restarts IIS locally.

After such an update you will see the following:


I have 2 boxes in the QC environment: SOA-MOSS01-QC and SOA-MOSS02-QC.
In the last operation result I can see that the operation was performed on only one box, SOA-MOSS01-QC.
The second box hasn't been updated yet and at this time is actively serving user requests with the old functionality in place.
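
You can also check this from PowerShell instead of the UI; a minimal sketch (the wsp name is hypothetical):

# Inspect the last deployment operation for the solution
$solution = Get-SPSolution -Identity "mysolution.wsp"
$solution.LastOperationResult        # e.g. DeploymentSucceeded
$solution.LastOperationDetails       # shows which servers the operation actually touched
$solution.DeployedServers | Select-Object Name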

NOTE:

The Update-SPSolution cmdlet upgrades a deployed SharePoint solution in the farm. Use this cmdlet only if a new solution contains the same set of files and features as the deployed solution. If files and features are different, the solution must be retracted and redeployed by using the Uninstall-SPSolution and Install-SPSolution cmdlets, respectively.

Uninstall-SPSolution also has a -Local parameter.
But the Install-SPSolution cmdlet deploys an installed SharePoint solution in the farm, and you use the Add-SPSolution cmdlet to install a SharePoint solution package to the farm.
I haven't tested this option. And I can see a caveat here: if we need to run Add-SPSolution, we need to run Remove-SPSolution first, and that command doesn't have a -Local option. Most likely, in the scenario where we need to re-install a solution instead of updating it, we have to bring the portal down.

4. Test the result locally on the drained and already updated WFE.

5. ISA switch.
Re-switch the servers (drain the second WFE and bring the first one back).

6. Repeat steps 3-4 on the second WFE.

Done.

P.S.
This is how it looks from an end-user's perspective:
They can still access the portal while IIS on that specific WFE is down.



SSRS: authentication error 14


I hate the SSRS error: authentication error 14 in the SSRS HTTP log.
From time to time we have this issue, even though the farm is already established and we haven't changed anything in the topology.

We have 2 SharePoint WFEs and one SSRS server, SSRS is in integrated mode.

Here are a few posts I have found that describe the error with different causes:


But, unfortunately, none of these helped to resolve the authentication error 14.


What we noticed is that we can have this error on one WFE while the other WFE can still call SSRS to render the reports.
We saw a correlation between free memory and SharePoint's ability to call SSRS.


Even though we can have 5 GB of available memory, free memory can be lower than 1 GB.
In that case SharePoint can't call the SSRS server.
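
To watch free (not just available) memory on the WFE while the issue is happening, a minimal sketch:

# Poll free physical memory once a minute (WMI reports the value in KB)
while ($true) {
    $os = Get-WmiObject Win32_OperatingSystem
    $freeGB = [math]::Round($os.FreePhysicalMemory / 1MB, 2)   # KB -> GB
    "{0}  Free physical memory: {1} GB" -f (Get-Date), $freeGB
    Start-Sleep -Seconds 60
}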








The easiest (and not so smart) actions to make SSRS work on this WFE again are:


1. On the WFE: IIS reset;
2. SSRS: server reboot;
3. SSRS: SSRS service re-start.

I would be glad to hear your comments on this issue and suggestions on how I can resolve it more intelligently.

Monday, June 18, 2012

Performance issue: CAML against a large list

How do you measure and tune performance on a custom form of a SharePoint list?

Recently, I was lucky enough to take part in a performance issue investigation.
We have a huge custom form in which we pull data from 6 lists.

Here are the steps that I took to investigate performance issue on the custom form:

1. I turned the Developer Dashboard on.

It allowed me to see the total execution time and the specific actions within that execution.
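
In case it helps anyone, the Developer Dashboard can be switched on from PowerShell; a minimal sketch for SharePoint 2010:

# Turn the Developer Dashboard on ("OnDemand" adds a toggle icon; "Off" disables it)
$dds = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.DeveloperDashboardSettings
$dds.DisplayLevel = "OnDemand"
$dds.Update()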

Since I knew we were dealing with related items loading from different lists, I focused on "EnsureListItemsData" on the left side.

EnsureListItemsData relates exactly to list item loading.

How do you know which EnsureListItemsData# relates to which list?


2. I turned the stack trace on in the web.config file.


In the trace you can search for a specific EnsureListItemsData#[number] to find the SQL query where one of the parameters is the list name.


3. Before changing anything, I documented the average execution time for every related list.

4. I identified the most expensive list items load.


5. Then I moved to the code to investigate how the form was written with regard to this list.


I looked at the code and found it useful to analyze this piece:

<sharepoint:spdatasource runat="server" id="dsDocument" datasourcemode="List" useinternalname="true"
                            scope="Recursive" selectcommand="<View><Query><Where><Eq><FieldRef Name=' Mngr_Id' LookupId='TRUE'></FieldRef><Value Type='Integer'>{Mngr_Id}</Value></Eq></Where></Query><ViewFields><FieldRef Name='ID'/><FieldRef Name='ContentType'/><FieldRef Name='EncodedAbsThumbnailUrl'/><FieldRef Name='FileRef'/><FieldRef Name='FileLeafRef'/><FieldRef Name='Title'/></ViewFields></View>">
                          <SelectParameters>
                              <asp:Parameter Name="ListName" DefaultValue="Documents" />
                              <asp:QueryStringParameter runat="server" Name="Mngr_Id" QueryStringField="ID" />
                          </SelectParameters>
                        </sharepoint:spdatasource>


Since the CAML was trying to get the data from the list by the field Mngr_Id, I wanted to check whether the field is indexed.

It turned out that it is not indexed!

That means that every time the CAML query runs, it selects all items from the list and then applies the filter by the provided value in the field Mngr_Id.

Here is an article on MSDN that discusses Query Throttling and Indexing. "How Does Indexing Affect Throttling?" precisely describes the situation in my custom form that uses a CAML query.

Recommendations:

1. Optimize the select.
Here are the options that I can see:

Option 1. Set an index on the column Mngr_Id in the Documents list.
Here is a PowerShell script to set up the index on a SharePoint list automatically (see the sketch below).

The script helps in case you have an array of lists that you want to index,
or if you want an automated process for changes in the SharePoint farm.
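
The linked script isn't reproduced here; the core idea is simply to flip the field's Indexed property, roughly like this minimal sketch (the web URL is hypothetical; the list and field names are from this example):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Mark the lookup column as indexed so the CAML filter can use the index
$web = Get-SPWeb "http://portal/sites/teamsite"
$list = $web.Lists["Documents"]
$field = $list.Fields.GetFieldByInternalName("Mngr_Id")
$field.Indexed = $true
$field.Update()
$web.Dispose()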

I have tested the index on Mngr_Id:
Measurement before setting the index:
EnsureListItemsData#6 (1575.84 ms)
Measurement after setting the index:
EnsureListItemsData#6 (675.89 ms)

Option 2. Change the CAML query:
2a. Use subfolders. That calls for re-organizing the items in the document library - a folder name should match the Manager Title. This way you may be able to scope what the spdatasource control selects from;


2. Analyze the changed web application settings.
We should keep the default settings:

Default:
List View Lookup Threshold: 8
List View Threshold: 5,000

instead of the modified threshold settings currently in place:
List View Lookup Threshold: 36
List View Threshold: 500,000

This allows us to restrict developers beforehand and optimize the structure and code.

Wish you a fast SharePoint site and calm administrative days)

Friday, June 15, 2012

SharePoint: End users access audit. How to pull users and their groups for the site


Your SharePoint audit requires cleaning up the users on the sites. How do you proceed with this?

In case you don't have a third-party tool, you have to be creative. You want a report that shows which user is in which group in which subsite.

Here is what I did.

I used an automated Excel report, since we need to change the membership and we want to reflect those changes quickly in the report. Once I had created the report, I put the Excel file on SharePoint; this way people can see the current situation with user membership.

Be mindful that, in case you have assigned users directly on a site, the report below doesn't show them (see the PowerShell sketch further down for a way to catch those).

Assumptions are:
 - All users have been given access via group membership;
 - The spweb (subsite) uses unique permissions (site collection inheritance is broken);
 - The report shows the data for a specific site collection.

In case you need to go beyond these limitations, you need to change the SQL query.
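
For the users assigned directly on a site (which the SQL report below does not show), a minimal sketch using the server object model can fill the gap; the site collection URL is hypothetical:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$site = Get-SPSite "http://portal/sites/teamsite"
foreach ($web in $site.AllWebs) {
    if ($web.HasUniqueRoleAssignments) {
        foreach ($ra in $web.RoleAssignments) {
            # Keep only role assignments made directly to a user, not to a group
            if ($ra.Member -is [Microsoft.SharePoint.SPUser]) {
                New-Object PSObject -Property @{
                    Site  = $web.Url
                    User  = $ra.Member.LoginName
                    Roles = ($ra.RoleDefinitionBindings | ForEach-Object { $_.Name }) -join ", "
                }
            }
        }
    }
    $web.Dispose()
}
$site.Dispose()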

Here is what I did:
1. SQL
I created a view:
ALTER view [dbo].[UserGroupAssignment] as
select UserInfo.tp_Title AS [User],
UserInfo.tp_Login AS [User ID],
UserInfo.tp_Email AS Email,
Groups.Title as [Group],
WebGroups.WebName as [Site]
from WSS_Content_EP..UserInfo (nolock)
join WSS_Content_EP..GroupMembership (nolock) on UserInfo.tp_SiteID=GroupMembership.SiteId and UserInfo.tp_ID=GroupMembership.MemberId 
join WSS_Content_EP..Groups (nolock) on Groups.SiteId=UserInfo.tp_SiteID and groups.ID=GroupMembership.GroupId
join
(
SELECT Min(webs.title)  AS WebName,
       Min(groups.title)AS GroupName ,
       Groups.ID
FROM   WSS_Content_EP..[roles] (nolock)
       JOIN WSS_Content_EP..roleassignment (nolock)
         ON roleassignment.siteid = roles.siteid
            AND roleassignment.roleid = roles.roleid
            AND roleassignment.scopeid IN (SELECT [perms].[scopeid]
                                           FROM   WSS_Content_EP..[perms] (nolock)
                                                  JOIN WSS_Content_EP..webs (nolock)
                                                    ON
                                          perms.scopeurl = webs.fullurl
                                          AND webs.siteid = perms.siteid
                                          AND webs.id = perms.webid
                                           WHERE
                    [perms].siteid = '55FE9630-5420-48F3-9099-210AEEEF43A8'
                    AND webs.fullurl NOT LIKE 'apps')
       INNER JOIN WSS_Content_EP..webs (nolock)
               ON roles.siteid = webs.siteid
                  AND roles.webid = webs.id
                  AND webs.scopeid = roleassignment.scopeid
       INNER JOIN WSS_Content_EP..groups (nolock)
               ON roleassignment.siteid = groups.siteid
                  AND roleassignment.principalid = groups.id
WHERE  roles.siteid = '55FE9630-5420-48F3-9099-210AEEEF43A8'
GROUP  BY groups.id

) as WebGroups on WebGRoups.ID=Groups.ID
where  tp_SiteID='55FE9630-5420-48F3-9099-210AEEEF43A8'
and UserInfo.tp_IsActive=1
--order by WebName,Title
GO

P.S. 
siteid is the id of the site collection you want the report for.
WSS_Content_EP - the content db where your site collection resides.
webs.fullurl not like 'apps' - I am not including the root site for site collection apps.

I created a report account that has read-only access to the view and the underlying tables.


The query doesn't include the spwebs that have permission levels inherited from the site collection level. The subquery WebGroups currently cuts them off, probably because we don't have these webs in the table "roles" (webid).



2. Excel
Excel has a data connection to the view.

The data on the tab "Report_Data" gets refreshed every time the workbook is opened in the Excel application.
ATTENTION: if you put the workbook in SharePoint 2010, it will NOT get refreshed in the browser via xlviewer.aspx.
You need to open it in the Excel application to refresh the data.


The tab "Report" hosts a pivot table based on the tab "Report_Data".



Good luck with auditing!