Thursday, August 2, 2012

Access "Pages" Library for Chinese Locale

Hi

My site collection has 5 language variations.

 

For every variation except the Chinese one I am able to access the Pages library like:

web.Lists["Pages"] or web.Lists.TryGetList("Pages").

 

But for the Chinese sub-site:

  1. If I iterate the web.Lists collection using a foreach loop, there is a list whose SPList.Title is "Pages". For this list, under SPList.SchemaXml, the title is Title="頁面".

  2. web.Lists["Pages"] gives the error "List 'Pages' does not exist at site with URL 'XXXXXXXXXXXXXXXXXXXXXXXXX'", and web.Lists.TryGetList("Pages") returns null.

  3. But web.Lists["頁面"] returns the right SPList object.


Is it that the SharePoint APIs which retrieve a list by name resolve against the SchemaXml title rather than SPList.Title when looking up the list internally?

 

 

 

Reply 1 by Ivan Vagunin

Hi!

I would propose using the SPWeb.GetList function - it returns a list by URL (e.g. web.GetList("/myweb/Pages")), and the URL is usually the same for all locales.

http://msdn.microsoft.com/ru-ru/library/microsoft.sharepoint.spweb.getlist(v=office.12).aspx
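A minimal C# sketch of this approach, assuming the Pages library keeps the "Pages" URL segment on the Chinese variation as well (the site and web URLs below are placeholders):

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Utilities;

class GetPagesByUrl
{
    static void Main()
    {
        // Placeholder URLs - replace with your site collection and the Chinese variation web.
        using (SPSite site = new SPSite("http://server/sites/pub"))
        using (SPWeb web = site.OpenWeb("zh-tw"))
        {
            // GetList expects a server-relative URL, so build it from the web URL
            // instead of relying on the localized list title.
            string pagesUrl = SPUrlUtility.CombineUrl(web.ServerRelativeUrl, "Pages");
            SPList pagesList = web.GetList(pagesUrl);   // throws if no list exists at that URL

            Console.WriteLine(pagesList.Title);          // prints the localized title, e.g. 頁面
        }
    }
}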

Reply 2 by Aseem Sood

All the options that I can come up with are:

 

  1. Web.GetList(<URL from the SharePoint resource file>) - preferred over the others for an intranet.

  2. PublishingWeb.PagesList (since the question is specifically about the Pages library).

  3. SPWeb.GetListsOfType (with template type 850 for the Pages library; if that does not work, use 101 - Document Library - and loop through the results).

  4. Guid pagesListId = PublishingWeb.GetPagesListId(web); SPList pagesLib = web.Lists[pagesListId];


Also, just wanted to add that Web.GetList(URL) normally performs better than Web.Lists[ID/Title].
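A rough C# sketch of options 2 and 4 from the list above, assuming the SharePoint Server Publishing feature is active on the web (the URL is a placeholder):

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Publishing;

class PagesListOptions
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://server/sites/pub"))   // placeholder URL
        using (SPWeb web = site.OpenWeb())
        {
            if (PublishingWeb.IsPublishingWeb(web))
            {
                // Option 2: the publishing API exposes the Pages library directly.
                PublishingWeb pubWeb = PublishingWeb.GetPublishingWeb(web);
                SPList pagesList = pubWeb.PagesList;

                // Option 4: resolve the library by its ID, which is locale independent.
                Guid pagesListId = PublishingWeb.GetPagesListId(web);
                SPList pagesById = web.Lists[pagesListId];

                Console.WriteLine("{0} / {1}", pagesList.Title, pagesById.Title);
            }
        }
    }
}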

Thanks,

Aseem Sood

Monday, July 9, 2012

SharePoint 2010 | Content Deployment using Path and Jobs from Central Admin | Errors

Hi

We have a staging source and a production target. For the first migration onto a blank site collection:

1. Even if we migrate content to a blank site collection, we get many warnings like:

a.

[WebFeature] [Publishing]   Warning: Provisioning did not succeed. Details: Failed to create the 'Images' library. OriginalException: The feature failed to activate because a list at 'PublishingImages' already exists in this site.  Delete or rename the list and try activating the feature again.

b.

Warning: User or group 19 cannot be resolved.

 

 Can we use a backup of staging as the target instead of a blank site collection? Apart from the initial migration failure, what other drawbacks should I expect from this?

 

2.

Content deployment deploys the most recent major and minor versions of a content item. For example, if version 2.7 of a Web page is being deployed, the most recent major version (2.0) of the page, and the most recent minor version (2.7), will be deployed to the destination site.

Is it possible to limit the migration to last major version only ?

  1. We have started with a blank site collection created using PowerShell and are still getting the errors in point 1.

  2. The problem statements above refer to the Content Deployment Paths and Jobs page, http://centralAdmin/_admin/Deployment.aspx.


Reply 1 by Jason Warren

Microsoft's Stefan Goßner has a great post about content deployment that I recommend you review.

 

In the section about the requirements for a successful content deployment, he recommends using an empty site collection for the destination and not a site collection created with the Blank Site template (STS#1):
3) Use an empty site collection as the destination of your content deployment job

As already discussed in Part 5 content deployment will fail if the destination database contains conflicting content. To avoid this it is required that the initial deployment is done into an empty site collection.

Be aware that the only way to create an empty site collection is to use the following STSADM command:

STSADM.EXE -o createsite -url <url-to-site-collection> -ownerlogin domain\user -owneremail <email-address>

Using the "Blank Site" template will NOT create an empty site collection! It will actually create a site collection with content. You can see the difference if you create a site collection using both methods and then inspect the content of the created sites using SharePoint designer.

I personally recommend always using the STSADM command with the syntax above to ensure that you really have an empty site collection as the destination.

Impact: If the site collection has been created using a different method or already contains data the content deployment job will fail.

How to resolve: Deploy into an empty site collection

For your second question, content deployment will deploy the latest published version. If you don't want a minor version to be deployed (it's a draft, perhaps), simply do not publish it.




 

JASON WARREN

Infrastructure Specialist



 


Reply 2 by Neal McFee [MCT]

To create a site collection that does not have a template (an empty site template) you omit the -template parameter in the New-SPSite cmdlet.

If you create a blank site using the STS#0, this will not work for the task you are trying to complete.

 

 

Try deleting the target site collection and recreating it using New-SPSite but do not specify the template.

Then re-run your publishing job.




If this helps you then please mark the post as helpful.
If this answers your question then mark it as the answer.
If another contributor in the thread answers your question then please do the right thing.
And as always most answers for SharePoint are based on "It depends"

 

Sunday, July 1, 2012

VariationsFixupTool

The variationsfixuptool STSADM operation is used to correct variations system data on publishing sites and pages. If you want to analyze the actual site structure and the data stored in the relationships list, this tool is a good option.

Syntax

stsadm -o variationsfixuptool

   -url <source variation site URL>

   [-scan]

   [-recurse]

   [-label]

   [-fix]

   [-spawn]

   [-showrunningjobs]


Parameters


-url <a valid URL, such as http://server_name> (required)
The URL of a site in the source variation where variations system data is being analyzed or corrected.

-scan (optional)
Analyzes the variations hierarchy and reports findings. This parameter provides functionality that cannot be accessed through the Central Administration web site.
For each site/page, the report shows:
  • Whether the source site is marked as being in the source variation hierarchy (SPWeb.AllProperties["__InSourceHierarchy"] == True). If the site is marked as being in the source hierarchy, the page is part of the source hierarchy.
  • The variation group ID of the source site or page (SPWeb.AllProperties["Variation Group Id"] or PublishingPage.ListItem[FieldId.VariationGroupId]).
  • Which peers are registered in the relationships list with the same variation group ID.
  • Whether a variation peer exists in the configured labels (default is all spawned labels). The tool first checks whether a peer is configured in the relationships list with the same variation group ID for the given label; if no peer can be found there, it tries to look up the peer using the default URL that would be used when creating a variant in that label.
  • The variation group ID of the variation peers in the target labels (SPWeb.AllProperties["Variation Group Id"] or PublishingPage.ListItem[FieldId.VariationGroupId]).
  • Whether the source site/page is configured in the relationships list.
  • In addition, the command reports the variation labels used for the peer check (default is all spawned labels).
The tool does not identify issues on its own - you have to analyze the report in detail and interpret it in order to find problems.

-recurse (optional)
Scans or fixes all subsites of the site specified by the url parameter.

-label <a valid label name, such as "English"> (optional)
Name of the label of the variation target.

-fix (optional)
Corrects invalid variations system data that is found. If the recurse parameter is used, fixes are applied recursively to all subsites. This parameter provides functionality that cannot be accessed through the Central Administration web site. Fix mode will not create missing variation peers.
The following issues are fixed automatically:
  • Missing variation group ID on the source site or source page
  • Missing relationships list entry for the source or target page or site
  • Missing variation group ID on the target site or target page
  • Different variation group IDs on the source site/page and target site/page (the source ID wins)

-spawn (optional)
Creates new site variations of the source variation site specified by the url parameter for all target variation labels. If the recurse parameter is used, variations for subsites and pages are also created. This parameter is equivalent to the New Variation Site setting on the Site Content and Structure page. This operation mode cannot be used to spawn pages in already spawned sites.
The command initiates the site spawn operation by creating a scheduled work item for the SpawnSiteJobDefinition timer job; the spawn is performed as soon as the timer job runs. The difference between the scheduled work item created by the CPVAreaEventReceiver and the one created by the variationsfixuptool is that the tool can create a work item which spawns the site to one specific variation label rather than to all spawned variation labels.
A common usage scenario is to configure the source variation label with the Hierarchy Creation option set to create only the root site and not the complete hierarchy, and later to spawn the hierarchy (or parts of it) from the source label using the STSADM command. In MOSS 2007 this command was a major benefit compared to the automatic hierarchy creation in the UI, as it was the only way to force the creation to run in OWSTIMER rather than W3WP. In SharePoint 2010, where all of these actions already run in OWSTIMER, the benefit of using the STSADM command rather than the UI is rather limited.

-showrunningjobs (optional)
Displays the current status of the Variations Propagate Page Job Definition and Variations Propagate Site Job Definition timer jobs, which are listed on the Timer Job Status page of SharePoint Central Administration. It does not provide information about the Variations Create Hierarchies Job Definition, Variations Create Page Job Definition, or Variations Create Site Job Definition timer jobs - you only get the status of the two propagate jobs.
Example :
stsadm -o variationsfixuptool -scan -url http://server/sites/pub/vhome/source -recurse > C:\report1.html
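If you want to look at the same data the scan report is built from, here is a hedged C# sketch that reads the variation properties quoted in the parameter description above from a web and its pages (the URL is a placeholder, and it assumes Microsoft.SharePoint.Publishing is referenced):

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Publishing;

class VariationDataPeek
{
    static void Main()
    {
        // Placeholder URL - point this at a web in the source variation hierarchy.
        using (SPSite site = new SPSite("http://server/sites/pub/vhome/source"))
        using (SPWeb web = site.OpenWeb())
        {
            // Web-level flags that the -scan report is based on.
            Console.WriteLine("__InSourceHierarchy = {0}", web.AllProperties["__InSourceHierarchy"]);
            Console.WriteLine("Variation Group Id  = {0}", web.AllProperties["Variation Group Id"]);

            // Page-level variation group ID, per page in the Pages library.
            PublishingWeb pubWeb = PublishingWeb.GetPublishingWeb(web);
            foreach (PublishingPage page in pubWeb.GetPublishingPages())
            {
                Console.WriteLine("{0}: {1}", page.Name, page.ListItem[FieldId.VariationGroupId]);
            }
        }
    }
}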

Source of information :

SharePoint Variations – The complete Guide – Part 11 – Variations Fixup Tool (blogs.technet.com/b/stefan_gossner/archive/2011/11/28/sharepoint-variations-the-complete-guide-part-11-variations-fixup-tool.aspx)

Use the variationsfixuptool operation (SharePoint Server 2010) [technet.microsoft.com/en-us/library/dd789633(v=office.14).aspx]


Friday, June 29, 2012

System KeyWords for Managed MetaData Service

Hi

 

under the "System" Group >  "Keywords" Termset  I can see various terms which are named as various combination of digits like  :

1,2,3,4,5,6

1,2,3,6,4,5

1,2,5,6,4,3

1,3,4,2,5,6

1,3,5,2,4,6

1,3,6,2,4,5

1,4,2,3,5,6

1,4,3,6,2,5

1,5,2,3,4,6

1,5,2,3,6,4

1,6,2,3,4,5

 

If you know why they are there, kindly share it with all of us. Please explain the purpose of these digit terms (enterprise keywords) and, if we lose a few of them, what the consequences may be.

 

The source of the question is that one of our servers has a few extra ones, like: 18,19,20,21,22,23

Reply 1

Hello,

I checked and I wasn't able to reproduce this issue with content deployment between two SharePoint environments, so it seems like an environment-specific issue; in-depth engagement is not feasible through forum replies, and this issue requires a more in-depth level of support.
Please visit the link below to see the various paid support options that are available to better meet your needs: http://support.microsoft.com/default.aspx?id=fh;en-us;offerprophone. If you are an MSDN/TechNet subscriber, you can also contact our support by using your free support incidents.

However, other members of the community may still have encountered the issue you're seeing, and have a solution to offer!

The keywords visible under the "System" group are added by tagging documents and list items with predefined (taxonomy) and user-defined (folksonomy) terms. There are no pre-populated digit keywords, so I do not believe there would be any impact other than tagged items losing these tags.
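If you just want to see these digit terms programmatically, here is a hedged C# sketch that lists the contents of the "Keywords" term set under the "System" group (the site URL is a placeholder):

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Taxonomy;

class ListEnterpriseKeywords
{
    static void Main()
    {
        // Placeholder URL - any site collection connected to the Managed Metadata Service.
        using (SPSite site = new SPSite("http://server/sites/pub"))
        {
            TaxonomySession session = new TaxonomySession(site);
            TermStore store = session.DefaultKeywordsTermStore;

            // KeywordsTermSet is the "Keywords" term set under the "System" group.
            foreach (Term term in store.KeywordsTermSet.GetAllTerms())
            {
                Console.WriteLine("{0}  (Id: {1})", term.Name, term.Id);
            }
        }
    }
}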

Please see the following articles
http://office.microsoft.com/en-us/sharepoint-server-help/CH010372669.aspx
http://technet.microsoft.com/en-us/library/ee424402.aspx
http://technet.microsoft.com/en-us/library/ee424403.aspx
Regards,
Nishant Shah

 

Reply 2

 

I suspect that if staging has a dummy term and a corresponding TaxonomyHiddenList entry, the content deployment/migration from staging to production creates this type of term.

 

What Microsoft could do to fix it:

1. Suppose staging has a term (myGuid) under Termset1 (Guid1), under Group1 (Guid2), under Store1 (Guid3), and while migrating the site collection it finds a term in the TaxonomyHiddenList which is missing on the target.

2. It should then search for Guid1 under Guid2 under Guid3 and create a term with myGuid there.

As of now, under the current implementation of the migration APIs, the entry seems to be created under the System group > Keywords term set.

Wednesday, June 27, 2012

SharePoint 2010 strange behavior of Taxonomies under Migration from one server to another

Microsoft.SharePoint.dll version 14.0.4762.1000

Microsoft.SharePoint.Taxonomy.dll  version 14.0.4756.1000

 

We migrate only published and approved content from staging to production. Under one of the scenarios, we want to introduce new terms into our system. Here is the strange behavior we are facing, which might be reported to Microsoft for improvements in future service packs:

  1. Create a new term on staging under the desired term set.

  2. Reproduce the same term, with the same GUID and parent term set, on production (a sketch of the first option follows this list):

    • By using the TermSetItem.CreateTerm method (String, Int32, Guid)

    • By stopping the Managed Metadata Service and then replacing the DB on production with the one from staging

    • By using the Export-SPMetadataWebServicePartitionData / Import-SPMetadataWebServicePartitionData PowerShell cmdlets

  3. Create an unpublished SPItem on staging which refers to the new term.

  4. Let the content deployment proceed. This migrates only the term entry in the TaxonomyHiddenList.

  5. Publish an spDoc which refers to a few of the old terms and the new term.

  6. Let the content deployment migrate the newly published item.

  7. Now, on production, the newly migrated spDoc has only the new term referenced.

  8. If we migrate an spDoc which has only old terms, not the new one, it migrates well and all the terms are visible on the display item form > SPField.
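Here is a hedged C# sketch of the first option in step 2 above: re-creating the term on production with the same GUID and under the same parent term set as on staging. The URL, term store/group/term set names, and GUID below are placeholders:

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Taxonomy;

class ReproduceTermOnProduction
{
    static void Main()
    {
        // Placeholders - use the real production URL, group/term set names, and the GUID
        // captured on staging for the new term.
        Guid stagingTermId = new Guid("00000000-0000-0000-0000-000000000000");

        using (SPSite site = new SPSite("http://production/sites/pub"))
        {
            TaxonomySession session = new TaxonomySession(site);
            TermStore store = session.TermStores["Managed Metadata Service"];   // placeholder name
            TermSet termSet = store.Groups["Group1"].TermSets["Termset1"];      // placeholder names

            // Same name, LCID and - crucially - the same GUID as the term on staging.
            Term newTerm = termSet.CreateTerm("NewTerm1", 1033, stagingTermId);
            store.CommitAll();   // nothing is persisted until CommitAll is called

            Console.WriteLine("Created {0} with Id {1}", newTerm.Name, newTerm.Id);
        }
    }
}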


How you could help me with this:

I got the tp_ListId and tp_DocId for my migrated target document from the [AllDocs] database table using LeafName.

Using tp_ListId and tp_DocId on [AllUserData] I was able to observe that content deployment has actually created the entry correctly there.

The entry under ntext2 is like: "OldTerm1|OldTerm1GUID;NewTerm1|NewTerm1GUID;OldTerm2|OldTerm2GUID"

Using all the SharePoint APIs, U2U, etc., my migrated item returns only the new term. It seems the old terms were wiped off, but they are not - they are still there in the database.

The output of U2U is:

NewTerm1|NewTerm1GUID

Now I have a question for you: what else besides the Taxonomy Update Scheduler timer job could be the culprit? Is it that the account the SharePoint Timer Service runs under must have full control on the Managed Metadata DB and the site collection content DB?

 

How Microsoft should help us with this:

For a new term referenced by an item, the internally exposed migration/deployment APIs should handle the following:

  • A new term should be created under the same term set as on the source. Right now it tends to be created under the System > Keywords term set if the term is missing (if I skip step 2 above).

  • No multiple entries should be allowed in the TaxonomyHiddenList with the same title/GUID, name, parent term set, etc. If we migrate an item which is published as in step 4 above, two entries are created in the TaxonomyHiddenList.

  • The Taxonomy Update Scheduler should be made efficient enough to handle multiple entries for the same term created in the TaxonomyHiddenList by external processes.


 Reply 1


A.

 

After doing all the above steps once:

The TaxonomyHiddenList has all the terms as desired, the term store is up to date, and the SQL content DB has the entry as it should be.

This means the Taxonomy Update Scheduler is not able to update the content DB so that it displays the right values.

 

Do we need some kind of service pack here? We have:

Microsoft.SharePoint.dll version 14.0.4762.1000

Microsoft.SharePoint.Taxonomy.dll  version 14.0.4756.1000

 

I tried TaxonomySession.SyncHiddenList(mySiteCollection); but it did not help.

 

 

B.

If I run my Managed Metadata Service, the web application pool, and the SharePoint Timer Service with an "admin everywhere" account on the production replica and follow the steps above along with TaxonomySession.SyncHiddenList(mySiteCollection); it still does not help.

C.

If I run all the app pools and Windows services on the server using an account which has admin rights everywhere, along with TaxonomySession.SyncHiddenList(mySiteCollection); it helps. I had to insert the SyncHiddenList call, along with a manual run of the taxonomy timer job, between steps 4 and 5 above, i.e. after the TaxonomyHiddenList item migration and before the actual content comes in (see the sketch below).
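A hedged C# sketch of forcing that sync: call SyncHiddenList and then kick the taxonomy timer job. The URL is a placeholder, and I am assuming the job can be found by a display title containing "Taxonomy Update Scheduler", as shown in Central Administration:

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;
using Microsoft.SharePoint.Taxonomy;

class ForceTaxonomySync
{
    static void Main()
    {
        // Placeholder URL - the site collection that received the TaxonomyHiddenList items.
        using (SPSite site = new SPSite("http://production/sites/pub"))
        {
            // Push term store data into the hidden list of this site collection.
            TaxonomySession.SyncHiddenList(site);

            // Then ask the taxonomy timer job of this web application to run now.
            // Assumption: the job's display title contains "Taxonomy Update Scheduler".
            foreach (SPJobDefinition job in site.WebApplication.JobDefinitions)
            {
                if (job.Title != null && job.Title.Contains("Taxonomy Update Scheduler"))
                {
                    job.RunNow();
                }
            }
        }
    }
}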

 

 

 

 

 

 

Is there a shorter way to avoid all of the mess described above? Or could you at least point out what, other than my Managed Metadata Service app pool, web application pool, and SharePoint Timer Service, is involved in taxonomies?

 

Reply 2


Temporary solution:
Everywhere on MSDN and in blogs, technology experts suggest making the taxonomy store common to the staging and production environments. But in our case we cannot do this. So, with our version of SharePoint (Microsoft.SharePoint.dll version 14.0.4762.1000, Microsoft.SharePoint.Taxonomy.dll version 14.0.4756.1000), we have tested the alternative below and found it to work:

 

0. On staging and production, go to Central Administration > Security > Configure Service Accounts.

a. Select the farm account and press OK. The account you previously chose as the farm account will be granted full control in many places in the databases; if the DBs were backed up and restored offline, this important coupling between the farm account and the databases is usually lost. This step makes sure the timer service, which runs as the farm account, can perform operations across the whole farm without errors.

b. Do the same for the app pool accounts of the Managed Metadata Service and your target web applications.

1. Run an export/import so that staging and production are in sync. Halt the changes on staging.

2. Create the new term on staging under the desired term set. Extract the GUID of this new term and keep it somewhere safe.

3. Create an unpublished SPItem on staging which refers to the new term (to create an entry in the TaxonomyHiddenList).

4. Run TaxonomySession.SyncHiddenList(mySiteCollectionStaging). Before the sync, please make sure your SharePoint Timer Service is running with an account which has full rights on the site collection and the taxonomy database; this can be done by making it an administrator of the site collection and the taxonomy service. Afterwards, wait for the "Enterprise Metadata site data update" and "Taxonomy Update Scheduler" jobs to run once as scheduled.

5. Delete the unpublished item from step 3 if you wish to.

6. Run the export process and discard this export package, because it contains all the terms (they were updated by the sync process).

A. Reproduce the same term, with the same GUID and parent term set, on production by using the TermSetItem.CreateTerm method (String, Int32, Guid), taking the GUID and name from step 2 above.

B. Create an unpublished dummy item on production with the new term.

C. Run TaxonomySession.SyncHiddenList(mySiteCollectionProd). Afterwards, wait for the "Enterprise Metadata site data update" and "Taxonomy Update Scheduler" jobs to run once as scheduled.

D. Delete the unpublished item if you wish to.

Now the servers are ready for normal future exports and imports without errors, until you introduce new terms again.

 

Reply 3


Permanent Solution:

 

How Microsoft should help us in this:

1. Make the content deployment APIs smart enough that the source TaxonomyHiddenList is not marked for migration when exporting the site collection. When an SPItem which refers to a term is created on the target, the correct (in-sync) term would then automatically be created in the target TaxonomyHiddenList.

2. In case the target term store has not been updated, there should be explicit messages in the import log: a term is referenced which appears to be missing in the target term store, the import is unsuccessful, please update your term store and run the import again. Alternatively, let the SharePoint timer job create the term in the right place and raise an error if the parent term set is missing.

Friday, June 22, 2012

User Profiles in custom SPGridView with Pagination for SharePoint 2010

Hi

My User Profiles have around 10,000 records.

 

I want to display them in a SharePoint Grid ( custom) . If the count of profile were only 50 , to display them all my dev machine takes around 40 seconds with all custom processing I have.

So, I want page size say 20 , and on each next click I should   get  from profile DB next 20 records. I should be able to get page numbers also in the bottom. For direct navigation.

 

Please make a suggestion if you have something in mind or have already done this in the past.

 

I am going to rely on Microsoft.Office.Server.UserProfiles.ProfileManagerBase.GetEnumerator for this.

 

Any pointers?

Reply 1

Hi hemantrhtk,

It may not be easy to apply paging to user profiles directly. In this situation, you may consider storing the profile information in a DataTable and paging the DataTable. Here is an example of this approach; please refer to it for more information:
http://www.codeproject.com/Articles/14017/using-cache-to-store-Data-in-datatable-for-custom
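A rough C# sketch of the DataTable approach suggested above, assuming the profiles are loaded once (and ideally cached, e.g. in HttpRuntime.Cache) and then sliced per page. The My Site host URL and the columns are placeholders:

using System;
using System.Data;
using Microsoft.SharePoint;
using Microsoft.Office.Server.UserProfiles;

class ProfilesToDataTable
{
    // Load all profiles into a DataTable once; cache the result so the expensive
    // enumeration is not repeated on every page request.
    static DataTable LoadProfiles(string mySiteHostUrl)
    {
        DataTable table = new DataTable("Profiles");
        table.Columns.Add("AccountName", typeof(string));
        table.Columns.Add("DisplayName", typeof(string));

        using (SPSite site = new SPSite(mySiteHostUrl))
        {
            SPServiceContext context = SPServiceContext.GetContext(site);
            UserProfileManager manager = new UserProfileManager(context);
            foreach (UserProfile profile in manager)
            {
                table.Rows.Add(
                    Convert.ToString(profile[PropertyConstants.AccountName].Value),
                    profile.DisplayName);
            }
        }
        return table;
    }

    // Return one page of rows for binding to the grid.
    static DataTable GetPage(DataTable all, int pageIndex, int pageSize)
    {
        DataTable page = all.Clone();
        int start = pageIndex * pageSize;
        for (int i = start; i < start + pageSize && i < all.Rows.Count; i++)
        {
            page.ImportRow(all.Rows[i]);
        }
        return page;
    }
}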

Thanks,




 

Qiao Wei

Reply 2

With http://msdn.microsoft.com/en-us/library/ee581591.aspx under Microsoft.Office.Server.UserProfiles we can at least implement Next/Previous pagination. But I am looking for page numbers together with First and Last page buttons.

Anyway, we are targeting the membership provider as the base for pagination, since it exposes public abstract MembershipUserCollection GetAllUsers(int pageIndex, int pageSize, out int totalRecords); after that, for each page, we fetch the corresponding records from the user profiles.
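A hedged C# sketch of that combination: page through the membership provider, then resolve each account against the profile store. The site URL is a placeholder, the grid binding is omitted, and any claims-vs-classic account-name translation is glossed over:

using System;
using System.Web.Security;
using Microsoft.SharePoint;
using Microsoft.Office.Server.UserProfiles;

class PagedProfiles
{
    // Prints the user profiles for one page of membership users.
    static void ShowPage(string siteUrl, int pageIndex, int pageSize)
    {
        int totalRecords;
        MembershipUserCollection users =
            Membership.Provider.GetAllUsers(pageIndex, pageSize, out totalRecords);

        using (SPSite site = new SPSite(siteUrl))
        {
            SPServiceContext context = SPServiceContext.GetContext(site);
            UserProfileManager manager = new UserProfileManager(context);

            foreach (MembershipUser user in users)
            {
                // UserExists avoids an exception for accounts without a profile yet.
                if (manager.UserExists(user.UserName))
                {
                    UserProfile profile = manager.GetUserProfile(user.UserName);
                    Console.WriteLine(profile.DisplayName);
                }
            }
        }
        Console.WriteLine("Total users: {0}", totalRecords);
    }
}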

Tuesday, June 12, 2012

Quickly get all files inside a spfolder recursive

Follow these steps:

  • Open the SPFolder in Internet Explorer.

  • Click "Open with Windows Explorer".

  • Use that UNC path to map a drive (for example Z:) in My Computer.

  • Use the PowerShell below to extract the list:


 Get-ChildItem "z:\" -Recurse  | Foreach-Object 
  {
       write-host $_.fullname;
       Add-Content  "C:\hemant\sometextfile.txt"  $_.fullname;
  }
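If you would rather do it server side without mapping a drive, here is a hedged C# sketch of the same idea using the object model (the site URL and starting folder are placeholders):

using System;
using Microsoft.SharePoint;

class DumpFolderFiles
{
    static void Main()
    {
        // Placeholder URL and folder path - point these at your web and starting folder.
        using (SPSite site = new SPSite("http://server/sites/pub"))
        using (SPWeb web = site.OpenWeb())
        {
            SPFolder root = web.GetFolder("Shared Documents");
            PrintFilesRecursive(root);
        }
    }

    static void PrintFilesRecursive(SPFolder folder)
    {
        foreach (SPFile file in folder.Files)
        {
            Console.WriteLine(file.ServerRelativeUrl);
        }
        foreach (SPFolder child in folder.SubFolders)
        {
            PrintFilesRecursive(child);   // recurse into each sub-folder
        }
    }
}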