Wednesday, June 27, 2012

SharePoint 2010 strange behavior of Taxonomies under Migration from one server to another

Microsoft.SharePoint.dll version 14.0.4762.1000

Microsoft.SharePoint.Taxonomy.dll  version 14.0.4756.1000

 

We migrate only published and approved content from staging to production. In one of our scenarios, we want to introduce new terms into the system. Here is the strange behavior we are facing, which might be reported to Microsoft for improvement in future service packs:

  1. Create a new term on staging under the desired term set.

  2. Reproduce the same term with the same GUID and parent term set on production (see the sketch after this list):

    • By using the TermSetItem.CreateTerm(String, Int32, Guid) method

    • By stopping the Managed Metadata Service and then replacing the DB on production with the one from staging

    • By using the Export-SPMetadataWebServicePartitionData / Import-SPMetadataWebServicePartitionData PowerShell cmdlets

  3. Create an unpublished SPItem on staging which refers to the new term.

  4. Let the content deployment proceed. This will migrate only the term entry in TaxonomyHiddenList.

  5. Publish any SPDoc which refers to a few of the old terms and the new term.

  6. Let the content deployment migrate the newly published item.

  7. Now, on production, the newly migrated SPDoc has only the new term referenced.

  8. If we migrate an SPDoc which has only old terms, not the new one, it migrates well and all the terms are visible under the display item form > SPField.
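
Here is a minimal sketch of the CreateTerm option from step 2, assuming a small console app run on the production farm; the site URL, term store, group, and term set names are placeholders, and the GUID must be the one the term has on staging:

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Taxonomy;

class RecreateTermSketch
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://production-site")) // placeholder URL
        {
            TaxonomySession session = new TaxonomySession(site);
            TermStore store = session.TermStores["Managed Metadata Service"]; // placeholder name
            TermSet termSet = store.Groups["MyGroup"].TermSets["MyTermSet"];  // placeholder names

            // The GUID must match the term's GUID on staging, so that
            // migrated TaxonomyHiddenList entries keep resolving to it.
            Guid stagingTermId = new Guid("11111111-2222-3333-4444-555555555555");
            Term newTerm = termSet.CreateTerm("NewTerm1", 1033, stagingTermId); // 1033 = en-US LCID

            store.CommitAll(); // persist the change to the term store
            Console.WriteLine("Created term {0} with Id {1}", newTerm.Name, newTerm.Id);
        }
    }
}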


How you could help me with this:

I retrieved the tp_ListId and tp_DocId for my migrated target document from the [AllDocs] database table, using the LeafName.

Using tp_ListId and tp_DocId on [AllUserData], I was able to observe that content deployment has actually created the entry correctly there.

The entry under ntext2 looks like: "OldTerm1|OldTerm1GUID;NewTerm1|NewTerm1GUID;OldTerm2|OldTerm2GUID"

Using all the SharePoint APIs, U2U, etc., my migrated item returns only the new term. It seems the old terms are wiped off, but they are not; they are still there in the database.

The output of U2U is:

NewTerm1|NewTerm1GUID
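
For comparison, this is roughly how the item's managed metadata field can be read through the object model; the site URL, list name, item ID, and field name below are placeholders, and a multi-value column is assumed:

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Taxonomy;

class ReadTaxonomyFieldSketch
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://production-site")) // placeholder URL
        using (SPWeb web = site.OpenWeb())
        {
            SPListItem item = web.Lists["myLibrary"].GetItemById(53); // placeholder list/item

            // For a multi-value managed metadata column, the value is a
            // TaxonomyFieldValueCollection of Label|GUID pairs.
            TaxonomyFieldValueCollection values =
                item["MyManagedMetadataField"] as TaxonomyFieldValueCollection;

            foreach (TaxonomyFieldValue value in values)
                Console.WriteLine("{0}|{1}", value.Label, value.TermGuid); // e.g. "NewTerm1|NewTerm1GUID"
        }
    }
}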

Now I have a question for you: what else besides the Taxonomy Update Scheduler timer job could be the culprit? Is it that the account under which the SharePoint Timer Service runs must have full control on the Metadata DB and the site collection content DB?

 

How Microsoft should help us with this:

When a new term is referenced by an item, the internally exposed migration/deployment APIs should handle it so that:

  • A new term is created under the same term set as on the source. Right now it tends to be created under the System > Keywords term set if the term is missing (if I skip step 2 above).

  • No multiple entries are allowed in TaxonomyHiddenList with the same title/GUID, name, parent term set, etc. If we migrate an item which is published under step 4 above, two entries are created in TaxonomyHiddenList.

  • The Taxonomy Update Scheduler is made efficient enough to handle multiple entries for the same term created in TaxonomyHiddenList by external processes.


 Reply 1


A.

 

After doing all the above steps once:

The Taxonomy Hidden List has all the terms as desired, the term store is up to date, and the SQL content DB has the entry as it should be.

This means the Taxonomy Update Scheduler is not able to update the content DB so that it can display the right values.

 

Do we need some kind of service pack here? We have:

Microsoft.SharePoint.dll version 14.0.4762.1000

Microsoft.SharePoint.Taxonomy.dll  version 14.0.4756.1000

 

I tried TaxonomySession.SyncHiddenList(mySiteCollection); but it did not help.

 

 

B.

If I run my Managed Metadata Service, web application pool, and SharePoint Timer Job with an "Admin Everywhere" account on a production replica and follow the steps above along with TaxonomySession.SyncHiddenList(mySiteCollection); it still does not help.

C.

If I run all the app pools and Windows services on the server using an account that has admin rights everywhere, along with TaxonomySession.SyncHiddenList(mySiteCollection); it helps. I had to insert SyncHiddenList along with a manual run of the Taxonomy Update Scheduler timer job between steps 4 and 5 above, i.e., after the Taxonomy Hidden List item migration and before the actual content comes in.


Is there some shorter way to avoid all the mess described above? Or could you at least point out what else besides my Managed Metadata Service app pool, web application pool, and SharePoint Timer Job is involved in taxonomies?

 

Reply 2


Temporary Solution:
Everywhere on MSDN and in blogs, technology geeks have suggested making the taxonomy store common for the staging and production environments. But in our case we cannot maintain this. So with our version of SharePoint (Microsoft.SharePoint.dll version 14.0.4762.1000, Microsoft.SharePoint.Taxonomy.dll version 14.0.4756.1000), we have tested the following to work as an alternative:

 

0. On staging and production, go to Central Admin > Security > Configure Service Accounts.

a. Select the Farm Account and press OK. The account you previously chose as the farm account will be granted full control in many places in the DB. If the DBs were backed up and restored offline, this important coupling between the farm account and the database is generally lost. This step makes sure the timer service, which runs under the farm account, can do its work across the whole farm without errors.

b. Do the same for the app pool accounts of the Managed Metadata Service and your target web applications.

1. Run an export/import so that staging and production are in sync. Halt changes on staging.
2. Create the new term on staging under the desired term set. Extract the GUID of this new term and keep it safe.
3. Create an unpublished SPItem on staging which refers to the new term (to create an entry in TaxonomyHiddenList).
4. Run TaxonomySession.SyncHiddenList(mySiteCollectionStaging) (see the sketch after these steps). Before the sync, please make sure your SharePoint Timer Job is running with an account which has full rights on the site collection and the taxonomy database. This can be done by setting it as an admin for the site collection and the taxonomy service. Afterwards, please wait for the "Enterprise Metadata site data update" and "Taxonomy Update Scheduler" jobs to run once as scheduled.

5. Delete the unpublished item of step 3 if you wish.

6. Run the export process and discard this export package, because it contains all the terms that were just updated by the sync process.
A. Reproduce the same term with the same GUID and parent term set on production by using the TermSetItem.CreateTerm(String, Int32, Guid) method. Take the GUID and name from step 2 above.
B. Create an unpublished dummy item on production with the new term.
C. Run TaxonomySession.SyncHiddenList(mySiteCollectionProd) (see the sketch after these steps). Afterwards, please wait for the "Enterprise Metadata site data update" and "Taxonomy Update Scheduler" jobs to run once as scheduled.
D. Delete the unpublished item if you wish.

Now the servers are ready for future normal imports and exports without any error, until the next time you introduce new terms.
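
As a minimal sketch of the sync call used in steps 4 and C above, assuming a console app run on the farm (the site URL is a placeholder, and matching the timer jobs by title is an assumption, not a documented contract):

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;
using Microsoft.SharePoint.Taxonomy;

class TaxonomySyncSketch
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://production-site")) // placeholder URL
        {
            // Re-synchronize this site collection's TaxonomyHiddenList with the term store.
            TaxonomySession.SyncHiddenList(site);

            // Optionally trigger the two jobs right away instead of waiting for their schedule.
            foreach (SPJobDefinition job in site.WebApplication.JobDefinitions)
            {
                if (job.Title.Contains("Taxonomy Update Scheduler") ||
                    job.Title.Contains("Enterprise Metadata"))
                {
                    job.RunNow();
                }
            }
        }
    }
}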

 

Reply 3


Permanent Solution:

 

How Microsoft should help us with this:

1. Make the content deployment APIs smart enough that the source Taxonomy Hidden List is not marked for migration while exporting the site collection; when the SPItem which refers to a term is created on the target, the right (in-sync) term would automatically be created under the target Taxonomy Hidden List.

2. In case the target term store is not updated, there should be explicit messages in the import log that a term is referenced which may be missing in the target term store: "Import is unsuccessful. Please update your term store and run the import again." Or let the SharePoint timer job create one in the right place and raise an error if the parent term set is missing.

Friday, June 22, 2012

User Profiles in custom SPGridView with Pagination for SharePoint 2010

Hi

My User Profile store has around 10,000 records.

 

I want to display them in a custom SharePoint grid. If the profile count were only 50, my dev machine would take around 40 seconds to display them all, with all the custom processing I have.

So I want a page size of, say, 20, and on each Next click I should get the next 20 records from the profile DB. I should also be able to get page numbers at the bottom, for direct navigation.

 

Please suggest if you have something in mind or have done this already in the past.

 

I am going to rely on Microsoft.Office.Server.UserProfiles.ProfileManagerBase.GetEnumerator for this.

 

Any pointers?

Reply 1

Hi hemantrhtk,

It may not be easy to apply paging to user profiles. In this situation, you may consider storing the profile information in a DataTable and using paging on the DataTable. Here is an example about this; please refer to it for more information:
http://www.codeproject.com/Articles/14017/using-cache-to-store-Data-in-datatable-for-custom

Thanks,




 

Qiao Wei

Reply 2

With http://msdn.microsoft.com/en-us/library/ee581591.aspx under Microsoft.Office.Server.UserProfiles

we can at least implement Next/Previous under pagination. But I am looking for page numbers with First and Last page buttons.

Anyway, we are targeting the membership provider as the base for pagination, which has public abstract MembershipUserCollection GetAllUsers(int pageIndex, int pageSize, out int totalRecords);. After that, for each page, fetch the records from User Profiles, as in the sketch below.
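
A minimal sketch of that approach, assuming a configured membership provider and a server-side context (the site URL and the displayed property are placeholders):

using System;
using System.Web.Security;
using Microsoft.Office.Server;
using Microsoft.Office.Server.UserProfiles;
using Microsoft.SharePoint;

class ProfilePagingSketch
{
    static void LoadPage(int pageIndex, int pageSize)
    {
        // Page through the accounts via the membership provider.
        int totalRecords;
        MembershipUserCollection page =
            Membership.Provider.GetAllUsers(pageIndex, pageSize, out totalRecords);

        using (SPSite site = new SPSite("http://my-site")) // placeholder URL
        {
            SPServiceContext context = SPServiceContext.GetContext(site);
            UserProfileManager manager = new UserProfileManager(context);

            // Resolve only the current page against the profile store.
            foreach (MembershipUser user in page)
            {
                if (manager.UserExists(user.UserName))
                {
                    UserProfile profile = manager.GetUserProfile(user.UserName);
                    Console.WriteLine(profile[PropertyConstants.PreferredName].Value);
                }
            }
        }

        // totalRecords drives the page-number links, including First/Last buttons.
        int totalPages = (totalRecords + pageSize - 1) / pageSize;
        Console.WriteLine("Page {0} of {1}", pageIndex + 1, totalPages);
    }
}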

Tuesday, June 12, 2012

Quickly get all files inside an SPFolder recursively

Follow these steps:

  • Open the SPFolder in IE.

  • Click on "Open with Windows Explorer".

  • Use this URL to map a drive in My Computer.

  • Use the PowerShell below to extract the list:


Get-ChildItem "z:\" -Recurse | ForEach-Object {
    Write-Host $_.FullName
    Add-Content "C:\hemant\sometextfile.txt" $_.FullName
}

Friday, June 1, 2012

Central Admin Content Deployment Jobs and Path by Microsoft (SharePoint 2010) VS SharePoint Content Deployment Wizard by chrisobrien

Hi

A few of our projects have been using the SharePoint Content Deployment Wizard by chrisobrien since the old days of MOSS 2007, and others use the Central Admin Content Deployment Jobs and Paths by Microsoft.

Microsoft keeps updating its out-of-the-box features in SharePoint, as usual.

 

I am starting this discussion to encourage folks here to use the best option available. I would request everyone to share their views, like:

____________________________________________________________________________________________________________

My preference : Central Admin Content Deployment Jobs and Path  by Microsoft (SharePoint 2010) (Option 1)

Why :

  1. I prefer option 1 because it is something for which I can go back to Microsoft and request support. I get almost all the features in option 1 that option 2 gives.


OR

My preference :  SharePoint Content Deployment Wizard  by chrisobrien   (Option 2)

Why :

  1. I prefer option 2 for the level of customization it allows with a custom API wrapper; I can control the whole migration process. Option 1 is kind of a black box for us.


_____________________________________________________________________________________________________________

 

You are welcome to share links where the tech world already has such comparison threads available, like SharePoint Content Deployment Wizard.

________________________________________________________________________________________________________________________________________________________________________________________________________________________

 

 

Are there alternatives, if we don't want to upgrade from Microsoft SharePoint Server 2010 (14.0.4763.1000), so that we still don't face the issues mentioned below:

  1. An incremental content deployment of a package in SharePoint Server 2010 fails if the following conditions are true:

    • The package contains a renamed site.

    • A sub-site contains a link to the renamed site.




Additionally, you receive the following error message:

Value does not fall within the expected range.

References: http://support.microsoft.com/kb/2459108

  2. You perform a content deployment from a source web application to a destination web application in SharePoint Foundation 2010.

    • You select languages under the Alternate language section in the source web application.

    • You perform an incremental content deployment.




In this scenario, the alternative language setting is not changed in the destination web application.

References: http://support.microsoft.com/kb/2536591

  3. Assume that you perform an incremental content deployment on a destination SharePoint Server 2010 farm. In this situation, the changes for the alternative language settings on the source farm are not reflected on the destination farm.


References: http://support.microsoft.com/kb/2536591

  4. When you use the Managed Metadata columns together with document sets in SharePoint Foundation 2010, you cannot perform a content deployment successfully. Additionally, you receive the following error message:


FatalError: Specified data type does not match the current data type of the property.

References: http://support.microsoft.com/kb/2598304

  5. You paste HTML markup into the comments field of a publishing page in a SharePoint site.


You change the page order by using the Site Navigation Settings page in the SharePoint site.

You perform a content deployment from this site to another site.

In this scenario, the content deployment fails, and you receive the following error message:

Cannot complete this action.

References: http://support.microsoft.com/kb/2598304

  6. One of the issues you may face with a deployment job is: How to Fix – Publishing Site Content Deployment Error – Duplicate First Name Column

  7. You should verify that relative links are migrated correctly.


Nevertheless, we can avoid many issues not listed anywhere by following the guidance below:

Best practices for content deployment (SharePoint Server 2010)

You are invited to add the issues you keep facing in SharePoint content deployment using paths and jobs, with links to resolutions or explanations.

SPWebService.CollectSPRequestAllocationCallStacks property

There was a time when you had to restart a lot of stuff after editing registry entries like HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\HeapSettings (SPRequestStackTrace = 1) to get to the root cause of memory leaks and all that stuff.

Now you have the SPWebService.CollectSPRequestAllocationCallStacks property. This is a diagnostic setting that facilitates the debugging of leaks of SPSite and SPWeb objects. If not disposed of properly, these objects can hold onto large amounts of memory. By enabling this setting, traces in the trace log that report leaks will contain call stacks. However, call stack collection is expensive, so this should only be enabled during a brief diagnostic period.

PowerShell:

$MySvc = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
$MySvc.CollectSPRequestAllocationCallStacks = $true   # set back to $false when done
$MySvc.Update()

 

Monday, May 28, 2012

Microsoft.SharePoint.Taxonomy Migration of Taxonomy

Hi



I have server1 with Microsoft.SharePoint.DLL version 14.0.5123.5000 in farm1.

  1. I took a content DB backup of site collection SC1 on server 1, along with a taxonomy DB backup on server 1.

  2. I restored these DBs on the SQL Server of Server2 (Microsoft.SharePoint.DLL version 14.0.4762.1000).

  3. Created a new Managed Metadata Service on Farm2 with the restored metadata service DB backup.

  4. Created a new web application and site collection on Farm2 with the content DB restored from Server1.

  5. Manually ran the Taxonomy Update Scheduler timer job.

  6. When I create a new item / edit an existing one on Server 2, a new entry is created in TaxonomyHiddenList and a new entry under the "System" term store.

  7. Even though I have the same term present in a term set in my custom term store!

The ideal behavior would be that, on server 2, the new item is tagged to my custom term in the predefined term set and term store.



Please guide me on which step I am missing.

Reply 1

Let's try a different approach. Instead of trying to create the service from a backup, we can export the data and import it into a new metadata service:

  1. Delete the Managed Metadata service in Farm 2.

  2. Create a new Managed Metadata service in Farm 2. Note the application pool account.

  3. In farm 1, from an elevated SharePoint Management Shell window, run the Export-SPMetadataWebServicePartitionData cmdlet. This will export the managed metadata into a cabinet file (.CAB):

Export-SPMetadataWebServicePartitionData -Identity "http://sharepointsite" -ServiceProxy "ManagedMetadata Service Proxy Name" -Path "C:\Temp\ManagedMetadata.cab"

  4. Copy the .CAB file from farm 1 to a folder on the farm 2 SQL Server. Share this folder and give Everyone Full Control sharing permission (we will clean this up later).

  5. In farm 2 SQL Server Management Studio, give the Managed Metadata application pool account (from step 2) the bulkadmin SQL Server role (expand the server instance -> expand Security -> find the account -> right-click, Properties -> Server Roles page -> check bulkadmin -> click OK).

  6. On the farm 2 SharePoint server, from an elevated SharePoint Management Shell window, run the Import-SPMetadataWebServicePartitionData cmdlet to import the data:

Import-SPMetadataWebServicePartitionData -Identity "http://sharepointsitefarm2" -ServiceProxy "New Managed Metadata Service Proxy Name" -Path "\\SQLServerName\Share\ManagedMetadata.cab"

You'll need to update the URLs and names to reflect your environments.






JASON WARREN



Reply 2



Hello Jason



Thanks for your great help. Even with the migration commands you shared, I was getting the same problem.

The source of the problem in my case was: after the recreation / DB restoration / migration of the taxonomies, a few of the properties in my User Profiles had lost their mappings, and I had to remap them manually to get things working.

Reply 3 

After everything, I am facing the following (to sum up, I will rewrite all the steps I did):



  1. I have server1 with Microsoft.SharePoint.DLL version 14.0.5123.5000 in farm1. On Farm 2, the Microsoft.SharePoint.DLL version is 14.0.4762.1000.

  2. Farm1 and Farm2 are exact replicas (created from the same content DB), and the taxonomies are also the same, with the same GUIDs for the term store, term sets, and terms.

  3. I created a new term under my custom term set on Farm1, and used Export-SPMetadataWebServicePartitionData and Import-SPMetadataWebServicePartitionData (OverwriteExisting) to migrate the term changes to Farm 2.

  4. Edited existing content in Site Collection 1 to refer to the new term.

  5. Then I used the Deployment APIs to take an incremental export of Site Collection 1.

  6. The import process on Site Collection 2 on farm 2 creates two entries in TaxonomyHiddenList! These two entries have the same Title, IdForTermStore, IdForTerm, IdForTermSet, etc. The only difference is the SPItem ID.

  7. All the functionality of my site works fine. But I am not sure why these two entries should be there in TaxonomyHiddenList.



After each step above, I ran the Taxonomy Update Scheduler timer job on the Site Collection 2 web application. A manual TaxonomySession.SyncHiddenList also did not help.



It seems one entry in TaxonomyHiddenList came from the content migration of TaxonomyHiddenList itself, and the second entry came in as an included dependency of the edited list item which refers to this new term. Might that be the case?



As the import log shows, the import process on Site Collection 2 refers to TaxonomyHiddenList twice:

[5/30/2012 11:08:46 AM] Start Time: 5/30/2012 11:08:46 AM.
[5/30/2012 11:08:46 AM] Progress: Initializing Import.
[5/30/2012 11:08:46 AM] Progress: Starting content import.
[5/30/2012 11:08:46 AM] Progress: De-Serializing Objects to Database.
[5/30/2012 11:08:46 AM] [Folder] [Person] Progress: Importing
[5/30/2012 11:08:46 AM] [Folder] [Person] Verbose: Source URL: _catalogs/users/Person
[5/30/2012 11:08:46 AM] [Folder] [Item] Progress: Importing
[5/30/2012 11:08:46 AM] [Folder] [Item] Verbose: Source URL: Lists/TaxonomyHiddenList/Item
[5/30/2012 11:08:46 AM] [Folder] [myLibrary] Progress: Importing
[5/30/2012 11:08:46 AM] [Folder] [myLibrary] Verbose: Source URL: myLibrary/Forms/myLibrary
[5/30/2012 11:08:46 AM] [File] [sitemap.xml] Progress: Importing
[5/30/2012 11:08:46 AM] [File] [sitemap.xml] Verbose: Source URL: sitemap.xml
[5/30/2012 11:08:46 AM] [File] [sitemap.xml] Verbose: Destination URL: /sitemap.xml
[5/30/2012 11:08:46 AM] [ListItem] [xyz.pdf] Progress: Importing
[5/30/2012 11:08:46 AM] [ListItem] [xyz.pdf] Verbose: List URL: /myLibrary
[5/30/2012 11:08:46 AM] [ListItem] [xyz.pdf] Verbose: Deleting...
[5/30/2012 11:08:47 AM] [ListItem] [53_.000] Progress: Importing
[5/30/2012 11:08:47 AM] [ListItem] [53_.000] Verbose: List URL: /Lists/TaxonomyHiddenList
[5/30/2012 11:08:47 AM] [ListItem] [272_.000] Progress: Importing
[5/30/2012 11:08:47 AM] [ListItem] [272_.000] Verbose: List URL: /_catalogs/users
[5/30/2012 11:08:47 AM] Verbose: Performing final fixups.
[5/30/2012 11:08:47 AM] Progress: Import completed.
[5/30/2012 11:08:47 AM] Finish Time: 5/30/2012 11:08:47 AM.
[5/30/2012 11:08:47 AM] Duration: 00:00:00
[5/30/2012 11:08:47 AM] Total Objects: 11
[5/30/2012 11:08:47 AM] Finished with 0 warnings.
[5/30/2012 11:08:47 AM] Finished with 0 errors.

My migration process exports only content which is published. If I give the process a tweak, it works:



1. Create an item which is not published, using the new term, on server 1. This populates an entry in TaxonomyHiddenList. Let this term in TaxonomyHiddenList migrate in the next export/import cycle.



2. Now, before the next migration cycle, publish the item which was referring to the new term on server 1. After the next migration, the updated item goes to server 2, and there is no extra term in TaxonomyHiddenList on server 2.

Reason: Probably the import process on Server 2 is handled by SharePoint as a transaction. The TaxonomyHiddenList import and the referring item's import are then part of a single transaction, so the item does not get a reference to the newly created / imported Taxonomy Hidden List entry, and hence triggers the creation of a new SPItem in the TaxonomyHiddenList.

If we don't want to use PowerShell for taxonomy migration and want to stick to DB migration, there is a way out: restart SQL Server just before the DB restore on the target, so that there is no live connection to the managed metadata DB at the time of the restore.

If we instead try to point the old service to a new DB (on the target, where the new DB is restored from the source metadata DB), some mappings of term sets to columns in lists may be lost.


Wednesday, May 16, 2012

Microsoft.SharePoint.Deployment SPExportSettings

Hi

 

The ExportChangeToken property in SPExportSettings is what tells the SharePoint APIs the point in the past from which an incremental export should start.

 

CurrentChangeToken (read-only) is set internally and referred to by future exports as a milestone for the export process. Is this property set when some change occurs in the site, or when the incremental batch is complete?

 

I have a batch which runs perfectly under ideal conditions.

 

 

When I run this batch manually, I want the value of CurrentChangeToken not to be persisted in the system, so that the automatic batch process keeps running as-is and my manual run is free to export without any effect on the automatic one.

 

 

Please help.

1. Is it OK to create my own persisted object store of SPChangeToken, let the batch use it to get the last value, and add a new token there with the CurrentChangeToken value (sketched just below)? The manual process would not update the custom persisted object, so the last token in the persisted object is always the one from the automatic process.
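
A minimal sketch of this idea, assuming a simple file-based token store for illustration (the paths, URL, and the store itself are placeholders, not the real batch):

using System;
using System.IO;
using Microsoft.SharePoint.Deployment;

class IncrementalExportSketch
{
    const string TokenFile = @"C:\Exports\lastChangeToken.txt"; // placeholder store

    static void RunExport(bool isManualRun)
    {
        SPExportSettings settings = new SPExportSettings
        {
            SiteUrl = "http://staging-site", // placeholder URL
            BaseFileName = "incremental.cmp",
            FileLocation = @"C:\Exports"
        };

        if (File.Exists(TokenFile))
        {
            // Start from the token persisted by the last automatic run.
            settings.ExportMethod = SPExportMethodType.ExportChanges;
            settings.ExportChangeToken = File.ReadAllText(TokenFile);
        }
        else
        {
            settings.ExportMethod = SPExportMethodType.ExportAll; // first run: full export
        }

        using (SPExport export = new SPExport(settings))
        {
            export.Run();
        }

        // CurrentChangeToken is populated internally (the question above asks
        // exactly when). Manual runs skip persisting it, so the automatic
        // batch is unaffected.
        if (!isManualRun)
        {
            File.WriteAllText(TokenFile, settings.CurrentChangeToken.ToString());
        }
    }
}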

 

 

 

2.

I have one last resort, if I don't find direct APIs in the Deployment namespaces: the SPSite.GetChanges(SPChangeToken, SPChangeToken) method, as sketched below.

In this case, I want to know whether the Microsoft.SharePoint.Deployment.SPExportSettings.CurrentChangeToken (read-only) property is set internally after the export has run, or whether it is independent of when the export runs but dependent on when the actual change occurs.
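
A rough sketch of that fallback, assuming the token from the last automatic run was persisted as a string (the URL and the token value are placeholders):

using System;
using Microsoft.SharePoint;

class ChangeInspectionSketch
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://staging-site")) // placeholder URL
        {
            // Token string persisted by the last automatic run (placeholder value).
            SPChangeToken from = new SPChangeToken("<persisted token string>");
            SPChangeToken to = site.CurrentChangeToken;

            SPChangeCollection changes = site.GetChanges(from, to);
            Console.WriteLine("{0} changes since the last automatic run", changes.Count);
        }
    }
}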

Reply 1

Hi

 

Under content migration scenarios from staging to production on a daily basis, how do the Microsoft.SharePoint.Deployment APIs perform better than http://technet.microsoft.com/en-us/library/ee721058.aspx ?