Sharing the experience

Showing posts with label Lesson learned.

Friday, March 28, 2014

Managed metadata: 5 things you wish you knew before using the Term Store management tool in SharePoint

My recent posts are more on the darker side: I see the mistakes, issues and flaws in SharePoint, as well as in the way people work with SharePoint.
It doesn't mean that SharePoint is getting worse, it just means I am getting better). I know more, I see more.
For the last couple of years I have been working on understanding how to use SharePoint right: where SharePoint is helpful, which feature is more appropriate for a specific use case.
Finally, I have figured out where Managed metadata can bring value. 
More on that:

I have implemented managed metadata and have used it successfully to improve findability in the portal.
Nevertheless, working with managed metadata in a real-world portal highlights some flaws that currently exist in the Term Store management implementation.

1. No history, no versioning
There is no way to see the changes made in the term store, nor can you revert them.

2. No author of the term
A term doesn't have an author field, so there is no way to know who created or modified a term, or when.

3. No usage (where the value is used)
From within the term store you can't trace where a term has been used.

4. No UI to back up/restore
What you can do is reach out to PowerShell and export the terms to back them up (see the sketch after this list).

5. No sync options in a hybrid cloud architecture
There is no way to keep the term store in sync between the two farms: on-premises and SharePoint Online.
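Since there is no backup UI, here is the kind of PowerShell I mean for point 4. It is a rough, hypothetical sketch (the site URL and output path are placeholders you would replace) that flattens the term store into a CSV file using the server-side taxonomy object model:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Taxonomy") | Out-Null

# Any site collection attached to the Managed Metadata Service proxy will do.
$site = Get-SPSite "http://portal.contoso.local"
$session = New-Object Microsoft.SharePoint.Taxonomy.TaxonomySession($site)

# Walk every term store / group / term set and flatten the terms into rows.
$rows = foreach ($store in $session.TermStores) {
    foreach ($group in $store.Groups) {
        foreach ($set in $group.TermSets) {
            foreach ($term in $set.GetAllTerms()) {
                [pscustomobject]@{
                    TermStore = $store.Name
                    Group     = $group.Name
                    TermSet   = $set.Name
                    Term      = $term.Name
                    TermId    = $term.Id
                }
            }
        }
    }
}

$rows | Export-Csv "C:\Backup\TermStore.csv" -NoTypeInformation
$site.Dispose()

It is only a poor man's backup (the restore would still have to be scripted separately), but at least you have a snapshot of what the term store looked like.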

These 5 things can easily become significant in a large SharePoint implementation. I hope you know about them before making the decision. It doesn't mean that these facts should stop you, but you will be better equipped to set the expectations right.

Common mistakes in SharePoint 2013 Architecture

I am participating in an upgrade from SharePoint 2007 to SharePoint 2013 on premises. It is nice to be back in upgrade work, since it was my main activity for the last 6 years). This time it's a little bit different.
First of all, the upgrade is from SharePoint 2007 to SharePoint 2013. We do it through a transitional SharePoint 2010 farm.
And second of all, I wasn't involved in the architecture phase of building the farm; I jumped on the upgrade bandwagon just before the final prod upgrade. So, what I mainly do is verify that the new farm is in working condition and test the content migration once again.
Along the way I see how people built the farm and streamlined the upgrade process. It's a good experience to observe a different style of working with SharePoint. I am free of making architectural decisions, but at the same time I am experienced enough to see the outcome of decisions made by others.
This post is about my observations on common SharePoint 2013 architectural mistakes.

I can get the "editing in the browser" feature with no additional configuration
No, two conditions have to be met to have Office Web Apps on the farm:
1. An Office Web Apps server
2. A claims-based web application

You need an OWA server, and you can't place OWA on an app server: Office Web Apps requires its own dedicated server. Plan for an additional server in the farm.
Office Web Apps can be used only by SharePoint 2013 web applications that use claims-based authentication.
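For reference, this is roughly how the wiring looks once the dedicated server exists. A sketch only, with placeholder server names (http is used just to keep it short); run the first part on the Office Web Apps server and the rest on a SharePoint 2013 server:

# On the Office Web Apps server:
Import-Module OfficeWebApps
New-OfficeWebAppsFarm -InternalUrl "http://owa.contoso.local" -AllowHttp -EditingEnabled

# On a SharePoint 2013 server: bind the farm to the OWA farm and pick the zone.
New-SPWOPIBinding -ServerName "owa.contoso.local" -AllowHTTP
Set-SPWOPIZone -Zone "internal-http"
Get-SPWOPIZone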


OWA server will handle all Excel calculations
Office Web Apps Server enables you to view workbooks that contain Data Models that use native data. However, you can’t explore data in items such as PivotTable reports, PivotChart reports, and timeline controls that use a Data Model as the data source.

Excel Web App runs in one of two modes:
- SharePoint view mode: Excel Services is used to view workbooks in the browser.
- Office Web Apps Server view mode: Excel Web App is used to view workbooks in the browser.

Excel Services and Excel Web App have a lot in common, but they are not exactly the same. These applications differ in which workbook features are supported for viewing in a browser.
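If your workbooks depend on Excel Services features (Data Model interactivity, PowerPivot), you can keep SharePoint view mode for Excel files even with Office Web Apps bound to the farm. A small hedged example:

# Tell SharePoint to keep rendering Excel files with Excel Services
# instead of handing them to Office Web Apps for viewing.
New-SPWOPISuppressionSetting -Extension "XLSX" -Action "view"
New-SPWOPISuppressionSetting -Extension "XLS" -Action "view"
Get-SPWOPISuppressionSetting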

More on SharePoint 2013 architectural pain points

The SP farm doesn't use a SQL alias
An old and common mistake. I have just recently jumped on a project after the farm was already configured, and the first thing that made me sick about the farm is that there is no alias for SQL.
So, if your SQL server dies, you have to make sure you name the new server exactly the same way as the previous one.
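Creating the alias is a five-minute job, so there is no excuse to skip it. A sketch of doing it with PowerShell on each SharePoint server (the alias and server names are placeholders; the same thing can be done with cliconfg.exe):

$alias  = "SPSQL"
$target = "DBMSSOCN,RealSqlServer01,1433"   # TCP/IP, real server name, port

# 64-bit and 32-bit client alias locations in the registry.
$paths = "HKLM:\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo",
         "HKLM:\SOFTWARE\Wow6432Node\Microsoft\MSSQLServer\Client\ConnectTo"

foreach ($path in $paths) {
    if (-not (Test-Path $path)) { New-Item -Path $path -Force | Out-Null }
    New-ItemProperty -Path $path -Name $alias -Value $target -PropertyType String -Force | Out-Null
}

# Then use "SPSQL" as the database server name when you build the farm.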

Tuesday, July 16, 2013

SharePoint 2013: AD group membership changes are not reflected on the site

[What you have]:

A web application with claims-based authentication (which is the default in SharePoint 2013), and a site to which you gave access via an AD group.
You have added\removed a user from the AD group, but the site permissions don't reflect the changes for this user.

[Why]:

When a domain user logs on to SharePoint, the server creates a token that contains information about that user, along with any domain groups they may be a member of. By default, SharePoint 2010 (as does SharePoint 2013) hangs on to this data for 24 hours, at which point the token expires, and the next user logon forces a fresh token to be created.

[What to do]:
Run stsadm to change the default value of 1440 minutes (24 hours); for example, set it to 1 hour (60 minutes):

stsadm.exe -o setproperty -propertyname token-timeout -propertyvalue 60
iisreset
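If you prefer PowerShell over stsadm, the same value should be reachable through the content web service object. This is a sketch I haven't battle-tested, so verify the property on your own farm before relying on it:

$cs = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
$cs.TokenTimeout = New-TimeSpan -Minutes 60   # same 60 minutes as above
$cs.Update()
iisreset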


You may also run into an additional problem related to the claims token cache.
To resolve this, check this out

Friday, May 24, 2013

Is SharePoint rubbish? Trash talk on SharePoint architecture

I have recently bought a book by one of my favorite knowledgeable guys in the SharePoint world, Todd Klindt: Professional SharePoint 2013 Administration.

The book is hilarious and very decent to have on the shelf if you are a SharePoint administrator, developer or architect.

Here is my favorite paragraph so far, which is very metaphorical and easy to memorize for someone who is new to this big and messy SharePoint world.



"Try this analogy to understand how [all] pieces work together: Web applications are the landfill. Content databases are giant dumpsters. A site collection is a big, black 50-gallon garbage bag. Webs, lists, and items are pieces of trash. Your users spend all week creating garbage, continuously stuffing it in the garbage bags, with each piece of trash occupying only one garbage bag at a time. Each garbage bag can hold only 50 gallons of trash (quotas) before it is full, after which the user has to either ask for a new garbage bag or get a bigger garbage bag.  That full garbage bag is placed in a dumpster, and it is not possible to put a garbage bag in more than one dumpster without destroying it.  Dumpsters are serviced only by one landfill but that landfill can handle thousands of dumpsters without issue. "

Thursday, June 21, 2012

SharePoint: How to build a SharePoint application wrong. SharePoint Karma



Whenever I refer to SharePoint development strategies, I always end up with SharePoint deployment strategies.


I am aware that in the SharePoint world there are people who prefer to do customization via SPD, without a traditional development cycle.


This post is for the others... those who want to preserve a development cycle and who want a fully automated and predictable deployment:
- via wsp files (or sandboxed solutions)
- and specific actions (like a feature activation) shipped as PowerShell\stsadm commands (Best Practices: Sharepoint Application Development Life Cycle). A minimal deployment sketch follows this list.
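To make that concrete, a minimal deployment sketch (the solution name, path, web application URL and feature name are placeholders):

Add-SPSolution -LiteralPath "C:\Deploy\Contoso.Intranet.wsp"
Install-SPSolution -Identity "Contoso.Intranet.wsp" -GACDeployment -WebApplication "http://portal.contoso.local"
# Install-SPSolution runs as a timer job, so wait for the deployment to finish before the next step.

# The "specific actions" part, e.g. a feature activation:
Enable-SPFeature -Identity "Contoso.Intranet.Branding" -Url "http://portal.contoso.local"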


How do you choose between content customization and a code-based approach?
If you need to deploy anything on the server, that's a sign that your approach is code-based customization.


Here is a quick SharePoint karma guide for code-driven SharePoint development. Each wrong action will lead to a specific, strong deployment reaction.
All the described actions I saw done by developers. I hope this quick guide helps you avoid the pain of SharePoint deployment.


Action: Create a site locally and manually, and then develop some other features on top of it.
Reaction: Deployment to other environments will be hard. You will need to re-create the site manually and hope that you haven't skipped any steps.
Recommended action: Create a custom site definition (onet.xml file). It has been the recommended approach since 2007, although I heard it's not what will be recommended in SharePoint 15. In SharePoint 2010 you have 2 options:
- a web template that is shipped via a sandboxed solution and placed in the content db;
- a site definition that is shipped via a wsp file and deployed on the server.
Custom web templates are the preferred approach; custom site definitions are what you must use in some scenarios.
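Once the site definition (or web template) is deployed via the wsp, creating a new site becomes a repeatable one-liner instead of a manual checklist. A hypothetical example (the template name "CONTOSODEF#0", URL and owner are made up):

New-SPSite -Url "http://portal.contoso.local/sites/project" -OwnerAlias "CONTOSO\spadmin" -Template "CONTOSODEF#0" -Name "Project site"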


Action: Create an stp to deploy the first version of the site, knowing that in the future you will need to change the site structure.
Reaction: Once you have created a site from the template, there is no way to change its structure via the stp.
Recommended action: Use the site definition and feature stapling approach for further releases. In SP2010 you can make changes to features shipped via a site definition using Feature Upgrade (sketch below).
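For the Feature Upgrade part, a sketch of how you could run the pending upgrades after the new wsp version is deployed (the URL and scope are placeholders):

$webApp = Get-SPWebApplication "http://portal.contoso.local"
# Get only the web-scoped features that actually have a pending upgrade.
$toUpgrade = $webApp.QueryFeatures([Microsoft.SharePoint.SPFeatureScope]::Web, $true)
foreach ($feature in $toUpgrade) {
    $feature.Upgrade($false)   # $false = don't force features that are already up to date
}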


Action: Build a wsp and place in it only event receivers for a specific list, with the expectation that the list will already have the appropriate structure (someone should change the structure of the list and then register the event receiver).
Reaction: As you can guess from the action, you will not get stable behavior across environments, since half of the deployment has to be performed manually. I can only hope that you keep your deployment documentation up to date.
Recommended action: If you need to develop an event receiver for a list that should have specific fields or a content type, describe the list in a list template feature. Package the list template, the list instance and the event receiver registration features as one bundle in the wsp file.
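And after the deployment it is easy to verify that the list really got its structure and the event receiver is registered. A small check script (the web URL and list name are hypothetical):

$web  = Get-SPWeb "http://portal.contoso.local"
$list = $web.Lists["Orders"]
$list.Fields | Where-Object { -not $_.Hidden } | Select-Object Title, TypeAsString
$list.EventReceivers | Select-Object Name, Type, Assembly, Class
$web.Dispose()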

Action: Every time it is time for deployment - create a new blank site, create all objects manually, copy files from the original site to the destination and then replace the hard-coded ids in the files - manually.
Reaction: It's messy and prone to errors. There is no way to make such a site automatically deployable.
Recommended action: Consider the best practices of SharePoint development: use packages to deploy and Feature Upgrade for further releases. In case you need to make some data structure changes - use the SharePoint object model, write the code and package it as a release feature (feature stapling). This approach will allow you to keep track of all your structure changes on the existing site; plus, you can still use the site definition to create a new site, and after the creation the stapled feature will do the same change work as it did for the existing one.


Action: Write code that sends emails\interacts with end users without environment variables.
Reaction: If you miss configuration settings for your interactive logic, you lose control over it. You may end up with an end-user notification sent while you are testing the application.
Recommended action: Choose the approach that suits your needs best (example below):
- a property bag on a specific object (site, list, item, etc.) (one of the ways to handle the property bag - SharePoint Property Bag Settings 2010);
- a hidden list with environment variables.
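For the property bag option, a sketch of what the environment settings could look like (the key names and values are made up; your code would read them before sending anything to end users):

$web = Get-SPWeb "http://portal.contoso.local"
$web.AllProperties["Contoso_Environment"] = "QA"
$web.AllProperties["Contoso_NotificationsEnabled"] = "false"
$web.Update()

# ...and later, read the value before notifying anyone:
$web.AllProperties["Contoso_Environment"]
$web.Dispose()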


Action: Manually copy to the environment all custom master pages and customized OTB SPD web parts.
Reaction: It's messy and prone to errors. There is no way to make such a site automatically deployable.
Recommended action: Package the master pages in a feature to deliver them to the server.
Regarding an OTB web part customized in SPD - there is no easy way, since the web part holds the id of the specific object from which the data is pulled, and that object will not be valid in another environment (but if you have developed against a copy of the prod content db, you are fine).
We have decided to develop a custom feature receiver to deploy such web parts - File provisioning: a hard-coded list id replacement.
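The idea behind that feature receiver, shown as a simplified PowerShell sketch (the file path, list name and old GUID are placeholders, and the GUID format in a real exported file may differ): before provisioning the exported .webpart file, swap the hard-coded list id for the one that exists in the target environment.

$web        = Get-SPWeb "http://portal.contoso.local"
$targetList = $web.Lists["Announcements"]
$oldListId  = "b2f6b6f0-1111-2222-3333-444455556666"   # id baked into the exported file

$file = "C:\Deploy\WebParts\AnnouncementsView.webpart"
(Get-Content $file) -replace $oldListId, $targetList.ID.ToString() | Set-Content $file
$web.Dispose()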


Action: Deploy report files whose data source is a list asmx with a hard-coded list ID.
Reaction: If the data source (a list) in the target environment doesn't have the same id as in prod, you will have to manually change the data source after deployment.
Recommended action: To make it transportable, you can think of an msi installer with parameters that swaps the hard-coded value for the new one.


Action: Create site groups manually and have logic that gets these groups by their IDs.
Reaction: It's prone to errors. If someone deletes a group and then creates a new one with the same name, your hard-coded value in the logic will break.
Recommended approach: Try this approach:
- create a custom code feature which creates the groups. In the logic, verify that the custom groups exist and that the permissions for them are right; if the groups don't exist, re-create them in code (sketch below).
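A sketch of that "ensure groups" logic in PowerShell (the group names and owner account are examples; in a real solution the same idea lives in a feature receiver):

$web   = Get-SPWeb "http://portal.contoso.local"
$owner = $web.EnsureUser("CONTOSO\spadmin")

foreach ($groupName in "Contoso Approvers", "Contoso Contributors") {
    $group = $web.SiteGroups | Where-Object { $_.Name -eq $groupName }
    if (-not $group) {
        $web.SiteGroups.Add($groupName, $owner, $owner, "Created by provisioning script")
    }
}
$web.Dispose()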

Action: Create your own field as a counter that looks at the id and has an increment value in a config file. Every time a user deletes records, you have to change the increment value in the config file.
Reaction: What???
Recommendation: I saw such an application working for several years in prod, with regular requests from the developer to change the increment. Last time I checked, the increment was gravitating toward the number 42)).
So never do such a thing. This approach is plain wrong. If you need, for some reason, to keep the ids of list items in strict order and you don't want to skip a number even though a user deletes an item (so the next SharePoint id will not be consecutive to the previous one), consider:
- creating a calculated field;
- implementing an event receiver on item created\updated events to keep the counter up to date.


And overall bad actions:


Action: Developer's objection: "It's working on my local machine, so it is supposed to work everywhere."
Reaction: Depending on how much thought went into the implementation plan (if any), you may end up with code that works nowhere except your local machine.
Recommendation: Before you start coding for SharePoint, think through how you expect your changes to be deployed.


Action: Administrator's easy fix: "If a feature on prod gives you a headache - uninstall the feature."
Reaction: Some sites may lose functionality that they needed from the uninstalled feature.
Recommendation: Analyze what uses this feature first.
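A sketch of such an analysis: find the webs where the feature is actually activated before you uninstall it (the feature name is a placeholder; site-scoped features would be checked against $site.Features in the same way):

$featureDef = Get-SPFeature "Contoso.Intranet.Branding"
foreach ($site in Get-SPSite -Limit All) {
    foreach ($web in $site.AllWebs) {
        if ($web.Features[$featureDef.Id] -ne $null) { $web.Url }
        $web.Dispose()
    }
    $site.Dispose()
}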



Action: Project manager's accusation\question: "This is a sign of careless and sloppy work. How can we fix it?"
Reaction: The team will not collaborate to fix the issue, since they are considered careless and lazy.
Recommendation: Respect the work of others and be aware the world is not perfect; we are here to make it work to some extent. Ask your question in a polite way to get a faster and more constructive answer.



Good deeds to you and light consequences.

Wednesday, February 29, 2012

Lesson learned: list corruption after stsadm -o import


The case:
The site is created from an STP template with some data included. Additional data is imported through stsadm –o import.

The result:
List corruption on the site.

The recommendation:

Don't combine the 2 approaches: an STP with data and stsadm –o import.
Either go with:
- an STP with fresh data;
- stsadm –o import onto a blank site template.
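If you go the import route, keep it scripted end to end. Export-SPWeb/Import-SPWeb are the PowerShell counterparts of stsadm -o export/import; a sketch with placeholder URLs and file name:

Export-SPWeb -Identity "http://portal.contoso.local/sites/source" -Path "C:\Backup\source.cmp" -IncludeUserSecurity -IncludeVersions All

# Import into a freshly created blank site, not into a site built from an STP with data:
Import-SPWeb -Identity "http://portal.contoso.local/sites/target" -Path "C:\Backup\source.cmp" -IncludeUserSecurity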

Consideration:

STSADM.EXE -O EXPORT:
It exports data using the object model and generates a new GUID for every object.
It doesn't copy workflow history and task lists.

This means that if you have some hard-coded logic (ids) in the site, it will be broken after an stsadm.exe export.

(For further analysis, you are welcome to read my blog entry - Sharepoint: How to backup site: stsadm export and SPD backup.)

P.S.
Some of the errors that you will find in the import log that wreak havoc:


Importing File Version 1.0
Invalid file name. The file name you specified could not be used



Item does not exist. It may have been deleted by another user.
   at Microsoft.SharePoint.SPListItem.EnsureItemIsValid()