Sharing the experience

Showing posts with label Tips.

Monday, May 19, 2014

@SPC_ORG by @susanhanley on User Adoption. Observe, Listen and Communicate

A keynote on 5/19/2014 at #SharePointConference.ORG on User Adoption, by Susan Hanley (http://susanhanley.com/). Susan is local to me; she lives in Bethesda, MD.

Notes from the keynote.
The factors of Successful User Adoption:

1. WIIFM - what's in it for me.
Make sure you hear your stakeholders.

2. The SharePoint solution should be tied to a critical company goal.

3. An adoptable solution solves a real problem.

4. Give people a voice.

SharePoint 2013 features that engage users:
 - likes and ratings in the RIGHT context;
 - search. Create a Tips web part with tips and feedback (Sue thinks it's crucial).

5. Engaging content
 - well-written content for the web;
 - relevant to the user.

6. Keep it simple

7. Start small.
Turn out a solution in 90 days.

8. Roll out in a gradual manner.

9. Engage leaders
"No involvement by leaders, no commitment by employees. No exceptions"

A great tip: "To reduce email attachments, set a rule in the leader's mailbox to reject attachments and ask the sender to provide a link to the document instead."

10. Look for "early adopters"

11. Train to make people comfortable
Plan a training roadmap - for comfort.
Effective training suits people's needs. Use different formats for different people.
12. COMMUNICATION
  • be persistent
  • get feedback whenever possible
  • Think about how to communicate THE LAUNCH

Ideas for launch videos:

Don't call the SharePoint portal "SharePoint portal".

Another idea:
a "30 minutes for 30 things you didn't know about SharePoint" session
"Get Sharp on SharePoint" - a webcast

13. OBSERVE the users

Friday, April 18, 2014

#SP24: SPOnline public site branding issues


@gusfrases presented #SP24S067 "Office 365 Public Websites: The Good, the Bad and
the Ugly" at the #SP24 online conference. The ugly part was the most attractive one.

In that part he showed the challenges and issues around branding a SharePoint Online public-facing site.

Here are the most prominent challenges and issues:

Mobile
No customizable device channels

Anonymous user issues
Workaround:

Sandbox solution
JSLink (and yes, you can have multiple JSLinks - see "ListView Web Part issues with JSLink and Display Templates – A solution?")
  • No cache

Not a real workaround for the cache issue:
 - CDN for jQuery;
 - client-side options for caching;
 - lazy load;
 - content optimization.

>> Takeaway >> Say NO to a search-driven application on SharePoint Online public anonymous sites.



Site management issues

No UI for site columns, content types, search schema, or subsite creation
Some default site templates are not available

Workaround:
Some features are accessible through direct links, some via SharePoint Designer.

The User Code Service is not reliable in O365.
Instead of a sandbox solution, consider JSLink when possible.

>> Takeaway >> Why and why not use SharePoint Online for public-facing sites?

*Search = OOTB Search

Resources:
Blog by Gus Frases
An SPOnline public site with search (which is possible), built by Gus Frases



Wednesday, October 9, 2013

SharePoint Online: Aha moments! How to get a log?

Welcome to the bizarre world of SharePoint Online!

Do you think you can access the log?

Here is what Microsoft said about it as of October 2013:

The Get-SPOTenantLogEntry cmdlet cannot retrieve all SharePoint Online errors. This cmdlet retrieves a subset of errors that happen due to external systems.

For Beta 2, the only company logs available are for Business Connectivity Services (BCS).
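
For what it does cover, here is a minimal sketch of pulling those entries with the SharePoint Online Management Shell (the admin URL and account are placeholders):

# A minimal sketch; the admin URL and account below are placeholders.
Connect-SPOService -Url "https://contoso-admin.sharepoint.com" `
                   -Credential "admin@contoso.onmicrosoft.com"

# Pull up to 500 entries from the last two weeks (remember: only a subset of
# errors, mostly BCS-related, is surfaced here).
Get-SPOTenantLogEntry -StartTimeInUtc (Get-Date).ToUniversalTime().AddDays(-14) `
                      -EndTimeInUtc   (Get-Date).ToUniversalTime() `
                      -MaxRows 500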

SharePoint Online manual migration: Managed Metadata issues

I have just finished a manual SharePoint Online migration. This post is all about migrating the Term Store and managed metadata columns.

So, you have a term set on-prem.

How do you migrate the term set to SharePoint Online?

When migrating to SharePoint Online manually, the only option you are left with is to import a CSV file of the term set into the destination, SPOnline.

The provided PowerShell script exports the term set into a CSV file on-prem so it can be imported into SharePoint Online.
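
For illustration, here is a rough sketch of that kind of export (not the provided script itself; the site URL, group and term set names are placeholders, and the CSV layout should be checked against Microsoft's ImportTermSet.csv sample):

# A rough sketch; site URL, group and term set names are placeholders.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$site      = Get-SPSite "http://onprem/sites/portal"
$session   = Get-SPTaxonomySession -Site $site
$termStore = $session.TermStores[0]
$termSet   = $termStore.Groups["Corporate"].TermSets["Project Tags"]

# Header expected by the SharePoint Online "Import Term Set" CSV
$lines = @('"Term Set Name","Term Set Description","LCID","Available for Tagging","Term Description","Level 1 Term","Level 2 Term","Level 3 Term","Level 4 Term","Level 5 Term","Level 6 Term","Level 7 Term"')
$lines += ('"' + $termSet.Name + '","",,TRUE' + (',' * 8))

foreach ($term in $termSet.GetAllTerms()) {
    # Rebuild the label path from the root term down to this term
    $path = @(); $t = $term
    while ($t -ne $null) {
        $path = ,('"' + $t.Name + '"') + $path
        $t = if ($t.IsRoot) { $null } else { $t.Parent }
    }
    while ($path.Count -lt 7) { $path += '""' }   # pad out the 7 level columns
    $lines += ',,,TRUE,,' + ($path -join ',')
}

$lines | Set-Content "C:\Temp\ProjectTags.csv"
$site.Dispose()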

The pitfall of importing a CSV file: you can't specify parent-child relationships between term sets.

Another issue I discovered is that there is no way to copy over Keywords, since it's a system term set. So, if you need them from on-prem, you either have to re-type them manually or (a much better option) buy a 3rd-party tool for SharePoint migration.

And finally, the BIG ISSUE of manual migration of managed metadata:

Imported terms get different GUIDs than the terms on-prem. That means every list that has managed metadata must be updated manually with the valid imported terms after migration.

To sum up, if you have used taxonomy and folksonomy extensively on-prem, the best approach is to migrate with 3rd-party SharePoint migration tools.

Friday, May 24, 2013

Is SharePoint rubbish? Trash talk on SharePoint architecture

I have recently bought a book by one of my favorite knowledgeable guys in the SharePoint world, Todd Klindt: Professional SharePoint 2013 Administration.

The book is hilarious and very decent to have on the shelf if you are a SharePoint administrator, developer or architect.

Here is my favorite paragraph so far, which is very metaphorical and easy to memorize for someone who is new to this big and messy SharePoint world.



"Try this analogy to understand how [all] pieces work together: Web applications are the landfill. Content databases are giant dumpsters. A site collection is a big, black 50-gallon garbage bag. Webs, lists, and items are pieces of trash. Your users spend all week creating garbage, continuously stuffing it in the garbage bags, with each piece of trash occupying only one garbage bag at a time. Each garbage bag can hold only 50 gallons of trash (quotas) before it is full, after which the user has to either ask for a new garbage bag or get a bigger garbage bag.  That full garbage bag is placed in a dumpster, and it is not possible to put a garbage bag in more than one dumpster without destroying it.  Dumpsters are serviced only by one landfill but that landfill can handle thousands of dumpsters without issue. "

Thursday, November 15, 2012

Search result won't show Default form for the list


I encountered a problem where crawled search results for list items gave URLs based on a display form other than the default one, which is unexpected.
So I decided to investigate the logic of how the search engine decides which display form to use for search results.

I found a similar problem described here:

After playing for a while with renaming the forms available in the list and changing their created dates, I came up with the following conclusion:

The search engine takes the OLDEST display form (.aspx) of the list to construct search result URLs for list items.

[The note is humbly provided by http://www.facebook.com/yury.shpakov]


Thursday, September 13, 2012

Simple Concept: How to search by managed metadata field or term?

[Question]: I have created a term and used it in a managed metadata column. How can I search by the term value to find every place where that term has been used?

For example:
I have a managed metadata column "Project Tags" in a library.
I have an item in the library with the value "eReview" in the Project Tags column.
I want to be able to search for eReview and find that item.




[Answer]:
Once you have created a managed metadata column, you have to run a crawl.
The crawl will automatically create a crawled property and a managed property that look at the managed metadata column.

What you have to do is check "Include values for this property in the search index". That gives you the ability to search by the term title.

In case you need to search by term id, query the managed property, where owsTaxIdProject is the managed property and the value is the actual id of the term (in this case, the id of eReview).

You may ask two questions at this point:
1. Why would I need to search by term id?
 - You may want to build a custom web part that shows all related content based on the selected term.
2. How do I find the term id?
One way is to go to the site collection hidden list Lists/TaxonomyHiddenList.
Another popular option is to use a search refiner with the term you are interested in:
select the value in the refiner and then look at the URL of the search page.

An easy way to decode it is a URL Decoder/Encoder.
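
If you have server-side access on-prem, PowerShell is another option; a small sketch (the site URL is a placeholder):

# A small sketch; the site URL is a placeholder.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$site    = Get-SPSite "http://portal"
$session = Get-SPTaxonomySession -Site $site

# Find terms whose label is "eReview" and print their ids
$session.GetTerms("eReview", $true) | Select-Object Name, Id

$site.Dispose()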



ATTENTION:
If you have several metadata columns in several lists that look at the same term set, but you name them differently (the internal names in the different lists are not the same), the crawler will create several managed properties.
You want to unite them. Find all the crawled properties that have been created from your managed metadata columns.

Hint: the name of a crawled property follows a pattern. The ows prefix means custom metadata; taxId refers to the column type, Managed Metadata. So you can find all of your crawled managed metadata columns by searching within crawled properties for the name taxId.

Then map the crawled properties to the one managed property.
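
On-prem, this can also be scripted with the search cmdlets; a sketch (the property names below are placeholders):

# A sketch for an on-prem search service application; property names are placeholders.
$ssa = Get-SPEnterpriseSearchServiceApplication

# Find the crawled properties generated from the managed metadata columns
$crawled = Get-SPEnterpriseSearchMetadataCrawledProperty -SearchApplication $ssa -Name "taxId" |
           Where-Object { $_.Name -like "*ProjectTags*" }

# Reuse (or create) a single managed property of type Text to unite them
$managed = Get-SPEnterpriseSearchMetadataManagedProperty -SearchApplication $ssa -Identity "owsTaxIdProjectTags" -ErrorAction SilentlyContinue
if (-not $managed) {
    $managed = New-SPEnterpriseSearchMetadataManagedProperty -SearchApplication $ssa -Name "owsTaxIdProjectTags" -Type 1   # 1 = Text
}

# Map every crawled property to that one managed property
foreach ($cp in $crawled) {
    New-SPEnterpriseSearchMetadataMapping -SearchApplication $ssa -ManagedProperty $managed -CrawledProperty $cp
}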



Good luck with Taxonomy!

Tuesday, September 11, 2012

Get Term Id from Managed Metadata Field in a Sandbox Solution


You are developing a sandbox solution where you want to work with Microsoft.SharePoint.Taxonomy.TaxonomyFieldValue.

You notice that you get an exception:

The exception is : System.RunTime.Serialization.SerializationException
Message : Type 'Microsoft.SharePoint.Taxonomy.TaxonomyFieldValue' in Assembly 'Microsoft.SharePoint.Taxonomy, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c' is not marked as serializable.
Server stack trace: 
   at System.Runtime.Serialization.FormatterServices.InternalGetSerializableMembers(RuntimeType type)
   at System.Runtime.Serialization.FormatterServices.GetSerializableMembers(Type type, StreamingContext context)
   at System.Runtime.Serialization.Formatters.Binary.WriteObjectInfo.InitMemberInfo()
   at System.Runtime.Serialization.Formatters.Binary.WriteObjectInfo.InitSerialize(Object obj, ISurrogateSelector surrogateSelector, StreamingContext context, SerObjectInfoInit serObjectInfoInit, IFormatterConverter converter, ObjectWriter objectWriter)
   at System.Runtime.Serialization.Formatters.Binary.ObjectWriter.Serialize(Object graph, Header[] inHeaders, __BinaryWriter serWriter, Boolean fCheck)
   at System.Runtime.Serialization.Formatters.Binary.BinaryFormatter.Serialize(Stream serializationStream, Object graph, Header[] headers, Boolean fCheck)
   at System.Runtime.Remoting.Channels.BinaryServerFormatterSink.SerializeResponse(IServerResponseChannelSinkStack sinkStack, IMessage msg, ITransportHeaders& headers, Stream& stream)
   at System.Runtime.Remoting.Channels.BinaryServerFormatterSink.ProcessMessage(IServerChannelSinkStack sinkStack, IMessage requestMsg, ITransportHeaders requestHeaders, Stream requestStream, IMessage& responseMsg, ITransportHeaders& responseHeaders, Stream& responseStream)


Your question: how do I get a value from a managed metadata field?

Microsoft.SharePoint.Taxonomy.dll is not allowed in sandbox solutions, and that's why you're getting the exception. You can get the string value of a managed metadata field by calling GetFormattedValue.

But doing so, you only get the display value; to get the Term Id, read the related hidden note field:

// The hidden note field shares the taxonomy field's static name, suffixed with "0"
var relatedNoteFldStaticName = {your metadata column static name} + "0";
var rawValue = item[relatedNoteFldStaticName];   // the field where the Term Id is stored

What it does is pull the related hidden note field where the Term Id is stored (see "SPSiteDataQuery in a sandboxed solution with a multi-valued metadata column").


Monday, July 16, 2012

Are you missing a using directive or an assembly reference


The type or namespace name  could not be found (are you missing a using directive or an assembly reference)?

What should you check if you are missing an assembly reference?

1. Check that you have the reference in References.
2. Check the Build Order and Project Dependencies.
3. Finally, you may experience strange behavior when the reference is added and the dependency and build order are right. You can even see that Visual Studio recognizes your DLL while you are developing, but when you hit Build it loses the reference.
   If this is your case, check the Target framework:
The target framework should be the same for your project and the reference that you use.
Sometimes VS creates a project with a different target framework (based on the project type) than the other projects in the solution.
In my case, a console application project defaulted to .NET Framework 4 Client Profile, while the other projects in the solution targeted .NET Framework 4.

Wednesday, July 11, 2012

MVC: Htmlhelper. Make the markup clean

This is a quick post, mostly for myself, to remember a great concept in MVC: HtmlHelper.

Since I had never worked with MVC before, I am really pleased to see how much cleaner web pages look in MVC than in ASP.NET Web Forms.

One way to achieve that cleanness on a page is to use the HtmlHelper class from System.Web.Mvc
instead of standard HTML markup.

Here is a quick example:

You want to show a picture on your page. The old way to do it:

@foreach (var item in Model)
{
     @: Review
     <div class="review">
     <h4>@item.Restaurant.Name</h4>
     <img src="@item.Restaurant.ImageUrl" alt="@item.Restaurant.Name" />
     </div>
}


This approach adds more markup to your page, and you have to deal with the automatic encoding of strings inside the code block.

Here is a nicer way to go:

@foreach (var item in Model)
{
     @: Review
     <div class="review">
     <h4>@item.Restaurant.Name</h4>
     @Html.Image(item.Restaurant.ImageUrl, item.Restaurant.Name)
     </div>
}

     
Wait a minute, there is no HtmlHelper.Image method....
Here is the perk: write your own extension method for HtmlHelper:

public static class MyHelpers
{
    // Renders a self-closing <img> tag with the given source and alt text
    public static MvcHtmlString Image(this HtmlHelper helper, string src, string altText)
    {
        var builder = new TagBuilder("img");
        builder.MergeAttribute("src", src);
        builder.MergeAttribute("alt", altText);
        return MvcHtmlString.Create(builder.ToString(TagRenderMode.SelfClosing));
    }
}


Friday, June 22, 2012

SharePoint: SharePoint 2007 Administration and PowerShell 2.0



During my last months of working with SharePoint 2007, I felt a strong inclination to use PowerShell 2.0.
I envied the happy people who were already on SharePoint 2010 and could enjoy SharePoint 2010 administration via SharePoint.ps1.


PowerShell and SharePoint: What, Why and How
Simple concept: How to use SharePoint cmdlets in PowerShell ISE


I also wanted the beautiful commands shipped with SharePoint.ps1 for SharePoint 2010:
Get-SPFarm,
Get-SPWeb,
Get-SPWebApplication


So, my problem was that there is no SharePoint.ps1 for SharePoint 2007.
To satisfy my desire, I created my own functions:
Get-SPFarmV3,
Get-SPWebV3,
Get-SPWebApplicationV3.


And some additional ones:
Get-SitesWithMissingTemplate, Get-SSProvider, Get-SPVersionV3.


All of the above I have packed into one SPv3Adapter.mdl module.
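
To give a flavor, here is a minimal sketch of what such a wrapper function might look like (illustrative only, not the actual module code; the URL is a placeholder):

# Illustrative only - not the actual module code; function name and URL are placeholders.
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

function Get-SPWebV3 {
    param([Parameter(Mandatory = $true)][string]$Url)
    # Open the site collection and return the SPWeb addressed by the URL.
    # The caller is responsible for disposing the returned SPWeb and its Site.
    $site = New-Object Microsoft.SharePoint.SPSite($Url)
    $site.OpenWeb()
}

# Usage
$web = Get-SPWebV3 -Url "http://portal/sites/hr"
$web.Title
$web.Site.Dispose()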


Windows PowerShell Module Concepts


Module Installation Best Practices:


Do not install modules for Windows PowerShell in the system location at %Windir%\System32\WindowsPowerShell\v1.0\Modules. Only modules included with Windows are installed to the system location.


http://msdn.microsoft.com/en-us/library/dd878350(v=vs.85).aspx


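In practice that means copying the module to a per-user (or shared) module path; a minimal sketch, assuming the module file is packaged as SPv3Adapter.psm1:

# A sketch of installing to the per-user module path instead of System32
# (assumes the module is packaged as SPv3Adapter.psm1).
$modulePath = Join-Path ([Environment]::GetFolderPath('MyDocuments')) 'WindowsPowerShell\Modules\SPv3Adapter'
New-Item -ItemType Directory -Path $modulePath -Force | Out-Null
Copy-Item .\SPv3Adapter.psm1 -Destination $modulePath

Import-Module SPv3Adapter
Get-Command -Module SPv3Adapter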




And some additional PowerShell files with helper functions: Start-CustomTranscript, Get-CustomAPPLog, Get-SolutionDeployed.


I used Get-SolutionDeployed a lot during the SP2007 to SP2010 upgrade.


The Helper.mdl module has a dependency on SPv3Adapter.mdl.

Please install the SPv3Adapter.mdl module first in order to enjoy the SP2007 helper functions.
One of my favorite options is to use a manifest (you are welcome to use the attached customModulesLoader.psm1 and Manifest.mdl) to ship the PS modules.


Visual Studio 2010 and BDCM: How to build and deploy



During the upgrade from SP2007 to SP2010, we decided to convert the 2007 BDC files into 2010 BDCM files.


We did it using the hybrid BDC upgrade approach.


I added the exported BDC files to a SharePoint project in VS 2010:
Creating a Business Data Connectivity Model


The article says each SharePoint project can contain only one model.


It's not true.


You can have as many models as you want in one Visual Studio 2010 SharePoint project.
Just create one project item per feature.


For example, if you want to ship 10 BDCM models, create 10 features in the project.
Place one project item with a BDCM model in each feature.


Identical feature property values from all project items are merged together in the feature manifest. However, if two different project items specify the same feature property key with non-matching values, a validation error occurs.



<Property Key="ModelFileName" Value="BdcModel1\BdcModel1.bdcm" />
So, remember - One project item per feature.
 
Note:
As you may know, SharePoint Foundation 2010 allows you to use BDCM. But there is one caveat with developing BDCM models in VS 2010 for SharePoint Foundation:

BDC deployment relies on a feature event receiver to take care of actions like model import on feature activation. Unfortunately, the assembly where that feature event receiver lives does not ship with SharePoint Foundation, because the same assembly contains other functionality Microsoft decided not to ship in Foundation.

But there is a way to make it work:
 Deploy a BDC Model project to SharePoint Foundation 2010 using Visual Studio 2010
 Publish BDC Model project to SharePoint Foundation 2010 with Visual Studio 2010
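
Those posts cover the details. One possible workaround, sketched here (not necessarily what the linked posts do, and assuming the BCS cmdlets are available in your Foundation farm; the path and URL are placeholders), is to import the exported model with PowerShell instead of relying on the feature event receiver:

# A sketch; model path and site URL are placeholders.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

Import-SPBusinessDataCatalogModel -Path "C:\Deploy\BdcModel1.bdcm" `
                                  -ServiceContext "http://foundationserver" `
                                  -Verbose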

Tuesday, June 19, 2012

SSRS: authentication error 14


I hate the SSRS error: authentication error 14 in the SSRS HTTP log.
From time to time we have this issue, even though the farm is already established and we haven't changed anything in the topology.

We have 2 SharePoint WFEs and one SSRS server; SSRS is in integrated mode.

Here are a few posts I found that describe the error with different causes:


Unfortunately, none of these helped resolve authentication error 14.


What we noticed is that we can have this error on one WFE while the other WFE can still call SSRS to render the reports.
We saw a correlation between free memory and SharePoint's ability to call SSRS.


Even though we may have 5 GB of available memory, free memory can be lower than 1 GB.
In that case SharePoint can't call the SSRS server.

The easiest (and not so smart) actions to make SSRS work on this WFE are:


1. On the WFE: IIS reset;
2. On SSRS: server reboot;
3. On SSRS: restart the SSRS service.

I would be glad to hear your comments on this issue and suggestions on how to resolve it more intelligently.

Monday, June 18, 2012

Performance issue: CAML against a large list

How do you measure and tune performance of a custom form for a SharePoint list?

Recently, I was lucky enough to take part in a performance issue investigation.
We have a huge custom form in which we pull data from 6 lists.

Here are the steps I took to investigate the performance issue on the custom form:

1. I turned the Developer Dashboard on.

It allowed me to see the total execution time and the specific actions within that execution.
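
One common way to switch the Developer Dashboard on in SharePoint 2010 is a short PowerShell snippet like this (a sketch; run it on a farm server):

# A sketch: enable the Developer Dashboard in "OnDemand" mode
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$svc = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
$dds = $svc.DeveloperDashboardSettings
$dds.DisplayLevel = [Microsoft.SharePoint.Administration.SPDeveloperDashboardLevel]::OnDemand
$dds.Update()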

Since I knew we were dealing with related items loading from different lists, I focused on "EnsureListItemsData" on the left side.

EnsureListItemsData relates exactly to list item loading.

How do you know which EnsureListItemsData# relates to which list?


2. I turned the stack trace on in the web.config file.


In the trace you can search for a specific EnsureListItemsData#[number] to find the SQL query where one of the parameters is the list name.


3. Before changing anything, I documented the average execution time for every related list.

4. I identified the most expensive list items load.


5. Then I moved to the code to investigate how the form was written with regard to this list.


I looked at the code and found it useful to analyze this piece:

<sharepoint:spdatasource runat="server" id="dsDocument" datasourcemode="List" useinternalname="true"
                            scope="Recursive" selectcommand="<View><Query><Where><Eq><FieldRef Name=' Mngr_Id' LookupId='TRUE'></FieldRef><Value Type='Integer'>{Mngr_Id}</Value></Eq></Where></Query><ViewFields><FieldRef Name='ID'/><FieldRef Name='ContentType'/><FieldRef Name='EncodedAbsThumbnailUrl'/><FieldRef Name='FileRef'/><FieldRef Name='FileLeafRef'/><FieldRef Name='Title'/></ViewFields></View>">
                          <SelectParameters>
                              <asp:Parameter Name="ListName" DefaultValue="Documents" />
                              <asp:QueryStringParameter runat="server" Name="Mngr_Id" QueryStringField="ID" />
                          </SelectParameters>
                        </sharepoint:spdatasource>


Since the CAML was trying to get the data from the list by the field Mngr_Id, I wanted to check whether the field is indexed.

It turned out that it is not indexed!

That means that every time, the CAML query selects all items from the list and then applies the filter by the provided value of the field Mngr_Id.

There is an article on MSDN that discusses Query Throttling and Indexing. Its section "How Does Indexing Affect Throttling?" precisely describes the situation in my custom form that uses a CAML query.

Recommendations:

1. Optimize the select.
Here are the options that I can see:

1. Set an index on the column Mngr_Id in the Documents list.
Here is a PowerShell script to set up the index on a SharePoint list automatically (see the sketch below).

The script helps in case you have an array of lists that you want to index,
or if you want an automated process for changes in the SharePoint farm.
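
The full script is not reproduced here; a minimal sketch of setting the index on a single field, using this post's list and field names (the web URL is a placeholder):

# A minimal sketch; the web URL is a placeholder, list/field names are from this example.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web   = Get-SPWeb "http://portal/sites/projects"
$list  = $web.Lists["Documents"]
$field = $list.Fields.GetFieldByInternalName("Mngr_Id")

if (-not $field.Indexed) {
    $field.Indexed = $true
    $field.Update()
    $list.Update()
}
$web.Dispose()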

I have tested index Mngr_Id :
Measurement before setting the index:
EnsureListItemsData#6 (1575.84 ms)
Measurement after setting the index:
EnsureListItemsData#6 (675.89 ms)

2. Change the CAML query:
2a. Use subfolders. That calls for reorganizing the items in the document library; a folder name should match the manager title. This way you MAY be able to specify which view the spdatasource control selects from;


2. Analyze the changed web application settings:
We should keep the default settings.

Default:
List View Lookup Threshold: 8
List View Threshold: 5,000

Instead of the modified threshold settings:
List View Lookup Threshold: 36
List View Threshold: 500,000

This allows us to restrict developers beforehand and optimize the structure and code.
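
For reference, these throttling settings live on the web application object and can be checked or restored from PowerShell (a sketch; the URL is a placeholder):

# A sketch; the web application URL is a placeholder.
$wa = Get-SPWebApplication "http://portal"

$wa.MaxItemsPerThrottledOperation   # List View Threshold, default 5000
$wa.MaxQueryLookupFields            # List View Lookup Threshold, default 8

# Restore the defaults if they have been raised
$wa.MaxItemsPerThrottledOperation = 5000
$wa.MaxQueryLookupFields = 8
$wa.Update()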

Wish you a fast SharePoint site and calm administrative days)

Friday, June 15, 2012

SharePoint: End users access audit. How to pull users and their groups for the site


Your SharePoint audit requires you to clean up the users on your sites. How do you proceed?

If you don't have a third-party tool, you have to be creative. You want a report that shows which user is in which group in which subsite.

Here is what I did.

I used an automated Excel report, since we need to change the membership and we want to reflect those changes in the report quickly. Once I had created the report, I put the Excel file on SharePoint so people can see the current state of user membership.

Be mindful that if you have assigned users directly on a site, the report below doesn't show them.

The assumptions are:
 - all users have been given access via group membership;
 - the SPWeb (subsite) uses unique permissions (site collection inheritance is broken);
 - the report shows the data for a specific site collection.

If you need to go beyond these limitations, you need to change the SQL query.

Here is what I did:
1. SQL
I created a view:
ALTER view [dbo].[UserGroupAssignment] as
select UserInfo.tp_Title AS [User],
UserInfo.tp_Login AS [User ID],
UserInfo.tp_Email AS Email,
Groups.Title as [Group],
WebGroups.WebName as [Site]
from WSS_Content_EP..UserInfo (nolock)
join WSS_Content_EP..GroupMembership (nolock) on UserInfo.tp_SiteID=GroupMembership.SiteId and UserInfo.tp_ID=GroupMembership.MemberId 
join WSS_Content_EP..Groups (nolock) on Groups.SiteId=UserInfo.tp_SiteID and groups.ID=GroupMembership.GroupId
join
(
SELECT Min(webs.title)  AS WebName,
       Min(groups.title)AS GroupName ,
       Groups.ID
FROM   WSS_Content_EP..[roles] (nolock)
       JOIN WSS_Content_EP..roleassignment (nolock)
         ON roleassignment.siteid = roles.siteid
            AND roleassignment.roleid = roles.roleid
            AND roleassignment.scopeid IN (SELECT [perms].[scopeid]
                                           FROM   WSS_Content_EP..[perms] (nolock)
                                                  JOIN WSS_Content_EP..webs (nolock)
                                                    ON
                                          perms.scopeurl = webs.fullurl
                                          AND webs.siteid = perms.siteid
                                          AND webs.id = perms.webid
                                           WHERE
                    [perms].siteid = '55FE9630-5420-48F3-9099-210AEEEF43A8'
                    AND webs.fullurl NOT LIKE 'apps')
       INNER JOIN WSS_Content_EP..webs (nolock)
               ON roles.siteid = webs.siteid
                  AND roles.webid = webs.id
                  AND webs.scopeid = roleassignment.scopeid
       INNER JOIN WSS_Content_EP..groups (nolock)
               ON roleassignment.siteid = groups.siteid
                  AND roleassignment.principalid = groups.id
WHERE  roles.siteid = '55FE9630-5420-48F3-9099-210AEEEF43A8'
GROUP  BY groups.id

) as WebGroups on WebGRoups.ID=Groups.ID
where  tp_SiteID='55FE9630-5420-48F3-9099-210AEEEF43A8'
and UserInfo.tp_IsActive=1
--order by WebName,Title
GO

P.S.
siteid is the id of the site collection you want the report for.
WSS_Content_EP is the content database where your site collection resides.
webs.fullurl NOT LIKE 'apps' - I am excluding the root site for site collection apps.

I also created a report account that has read-only access to the view and the underlying tables.


The query doesn't include the SPWebs that inherit permissions from the site collection level. The WebGroups subquery currently cuts them off, probably because these webs are not present in the "roles" table (webid).
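
As a side note, the same report can be built with the server object model instead of querying the content database directly; a rough sketch (the site URL and output path are placeholders):

# A rough sketch of an object-model alternative; site URL and output path are placeholders.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$site = Get-SPSite "http://portal/sites/ep"
$report = foreach ($web in $site.AllWebs | Where-Object { $_.HasUniqueRoleAssignments }) {
    foreach ($group in $web.Groups) {
        foreach ($user in $group.Users) {
            New-Object PSObject -Property @{
                Site  = $web.Title
                Group = $group.Name
                User  = $user.Name
                Login = $user.LoginName
                Email = $user.Email
            }
        }
    }
    $web.Dispose()
}
$report | Export-Csv "C:\Temp\UserGroupAssignment.csv" -NoTypeInformation
$site.Dispose()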



2. Excel
Excel has a connection to the view.

Data on the "Report_data" tab gets refreshed every time the workbook is opened in the Excel application.
ATTENTION: IF YOU PUT THE WORKBOOK IN SHAREPOINT 2010, IT WILL NOT GET REFRESHED IN THE BROWSER via xlviewer.aspx.
You need to open it in the Excel application to refresh the data.


The "Report" tab hosts a pivot table based on the "Report_Data" tab.



Good luck with auditing!