

Thursday, March 14, 2013

Refiners in SharePoint 2013 Search

Let's try to understand how refiners work in SharePoint 2013.

The Refinement web part passes a JSON object in the URL.



1. Go to View All Site Content (http://servername/sites/var/_layouts/15/viewlsts.aspx).

2. Create a custom list similar to the one shown below:




The column TermSetColumnPOC (allow multiple values) is linked to a term set with the terms:

Term1, Term2, Term3, Term4, Term5, Term6, Term7, Term8, Term9

I want my search results page to display results when "Term8" is present in either the TermSetColumnPOC or NormalColumnPOC column. I will also provide a refiner which hides the results that don't have "Term8" in TermSetColumnPOC.

3. Populate the custom list with data:



Searching "Term1" should give items hk1 , hk4 , hk7, hk9

Searching "Term8" should give hk1, hk6,hk8

Now applying Refiner on "TermSetColumnPOC" equal to value  "Term8"  should give only hk1,hk8

4. Create a content source under Search Administration in Central Administration which contains only your test web application. This is not mandatory, but it will expedite testing the refiners, as it drastically reduces crawl time.

Run a full crawl on this search content source.



5. Go to Search Service Application > Search Schema.

Create a managed property "TermSetColumnPOCManaged" as shown below.











6. Run a full crawl again.



7. After this crawl is over, you can test refiners in your search center. I don't want one, so I will create my own simple page.

8. Enable the site collection feature "Search Server Web Parts and Templates".





9. Create a simple web part page and insert the following web parts on it:

Search Box, Search Results, Refinement



10. In the web part properties of the Refinement web part, choose TermSetColumnPOCManaged as one of the available refiners.



11. Publish and approve this page.

12. Type "Term1" in the search box, or type the URL http://......./search1.aspx#k=Term1



12.1 Type "Term8" in the search box, or type the URL http://......./search1.aspx#k=Term8



12.2 Type "Term8" in the search box and click "Term8" in the refiners.

(Use URL decoding to understand what is being passed in the URL.)

or

http://......./search1.aspx#Default={"k":"Term8","r":[{"n":"TermSetColumnPOCManaged","t":["\"ǂǂ5465726d38\""],"o":"and","k":false,"m":null}]}

(here the equivalent JSON object is passed as a parameter)

or

http://......./search1.aspx#Default={"k":"Term8","r":[{"n":"TermSetColumnPOCManaged","t":["\"Term8\""],"o":"and","k":false,"m":null}]}

(here the equivalent JSON object is passed as a parameter, but with the term's display value)
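For reference, the ǂǂ-prefixed value in the first JSON URL above is simply the term text hex-encoded: 5465726d38 is "Term8" in ASCII hex. A small C# sketch to reproduce the token (the helper class is mine, not a SharePoint API):

using System.Text;

static class RefinementTokens
{
    // "Term8" -> "ǂǂ5465726d38", the token format seen in the refinement URL
    public static string ToToken(string term)
    {
        var sb = new StringBuilder("\u01C2\u01C2"); // the "ǂǂ" prefix
        foreach (byte b in Encoding.UTF8.GetBytes(term))
            sb.Append(b.ToString("x2"));            // two lowercase hex digits per byte
        return sb.ToString();
    }
}

As the two URLs above suggest, the refinement panel accepts either the encoded token or the plain display value.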



12.3 Search "TermSetColumnPOCManaged:Term8", or

type the URL http://......./search1.aspx#k=TermSetColumnPOCManaged:Term8





Another example of JSON with refiners and the search results web part:

On the same page, search "Term1" and, in the refiners, select "Term1".



or type the URL:

http://......./search1.aspx#Default={"k":"Term1","r":[{"n":"TermSetColumnPOCManaged","t":["\"Term1\""],"o":"and","k":false,"m":null}]}

The result is the same both ways:





One more example of JSON with the search results page:

On the same page, search "Term1" and, in the refiners, select "Term9".

or type the URL:

http://......./search1.aspx#Default={"k":"Term1","r":[{"n":"TermSetColumnPOCManaged","t":["\"Term9\""],"o":"and","k":false,"m":null}]}

The result both ways comes out to be:






Friday, February 8, 2013

Asynchronous call to WebService / WCF using jQuery

In one of our implementations, we had to make asynchronous calls to multiple SharePoint lists for a better UI and user experience.

For better performance and a more manageable solution, we exposed a custom web service which takes care of all the data manipulation and returns the desired result set / exposes methods for the required operations.

Here is a proof-of-concept code snippet which might help you call web services using jQuery.ajax():

$.ajax({
    // type: Type,         // GET, POST, PUT or DELETE verb (defaults to GET)
    url: Uri,              // location of the service
    // data: Data,         // data sent to the server
    dataType: DataType,    // expected data format from the server, e.g. "json"
    cache: false,          // no-cache
    success: function (msg) {  // on successful service call
        ServiceSucceeded(msg);
    },
    error: ServiceFailed   // when the service call fails
});

Full Sample Code :
asynchronous-call-to-web-service_1
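For context, here is a minimal sketch of the server side of such a call: an ASMX service that jQuery.ajax can reach. The class and method names are hypothetical, not taken from the sample above:

using System.Web.Services;
using System.Web.Script.Services;

[WebService(Namespace = "http://tempuri.org/")]
[ScriptService] // allows the methods to be called from client script
public class ListDataService : WebService
{
    // returns JSON when requested with dataType "json" from jQuery.ajax
    [WebMethod]
    [ScriptMethod(ResponseFormat = ResponseFormat.Json)]
    public string GetStatus()
    {
        // a real implementation would aggregate data from SharePoint lists here
        return "OK";
    }
}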

Saturday, January 12, 2013

Finding features in a content database in SharePoint 2010 using PowerShell or tools

Sometimes we have a feature ID and want to know all the places where it is active, even as a dummy (orphaned) one. The script and tools linked below might be helpful.

http://get-spscripts.com/2011/06/removing-features-from-content-database.html

or

http://featureadmin.codeplex.com/downloads/get/290833

or

http://archive.msdn.microsoft.com/WssAnalyzeFeatures

 

Sunday, October 14, 2012

Feature activation code using PowerShell does not pick up appSettings values

This is a very common scenario where deployment guides become useless for automation: features activate fine through the site / site collection settings UI, but fail when activated using PowerShell.

Here is the solution:


using System.Configuration;
using System.Web.Configuration;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

public override void FeatureActivated(SPFeatureReceiverProperties properties)
{
    SPWeb web = properties.Feature.Parent as SPWeb;
    SPWebApplication webApp = web.Site.WebApplication;

    // There is no HttpContext when the feature is activated from PowerShell,
    // so open the web application's web.config explicitly by site name.
    Configuration config = WebConfigurationManager.OpenWebConfiguration("/", webApp.Name);

    // App settings are retrieved this way
    string appSetting = config.AppSettings.Settings["someAppSettingKey"].Value;

    // Connection string settings are retrieved this way
    string connectionString =
        config.ConnectionStrings.ConnectionStrings["someDBConnectionString"].ConnectionString;
}


 

Reference :

Read web.config on FeatureActivated event while activating Feature using PowerShell

 

Wednesday, October 10, 2012

Get Dependent files from HTML response

Hi

I am getting the response of a page like this:

// requires System.Net, System.IO and System.Text; the url, credentials and
// user agent are placeholders for values elided in the original post
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
request.Credentials = CredentialCache.DefaultCredentials;
request.Method = "GET";
request.UserAgent = "Mozilla/5.0";

HttpWebResponse response = (HttpWebResponse)request.GetResponse();
Stream receiveStream = response.GetResponseStream();
StreamReader readStream = new StreamReader(receiveStream, Encoding.UTF8);
string pageContent = readStream.ReadToEnd();

Now, out of this string pageContent, I want to extract all sub-requests (dependent files) and request each of them. Is there something better than HTML Agility Pack for this?

Reply 1 By http://social.msdn.microsoft.com/profile/joel%20engineer/?ws=usercard-mini

I usually read it into an HtmlDocument class and then use either GetElementById() or GetElementsByTagName().

 

 

 

// load the fetched markup into a WebBrowser control's document, then
// query it with GetElementById / GetElementsByTagName
webBrowser1.Document.Write(pageContent);

 




jdweng

 

Reply 2 


You should try http://htmlagilitypack.codeplex.com/

I am looking for something better.
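For reference, here is a minimal HTML Agility Pack sketch of the extraction being asked about, assuming pageContent holds the markup fetched above:

using System;
using System.Linq;
using HtmlAgilityPack;

// collect the URLs of dependent files (images, scripts, stylesheets)
var doc = new HtmlDocument();
doc.LoadHtml(pageContent);

var nodes = doc.DocumentNode.SelectNodes("//img[@src] | //script[@src] | //link[@href]");
foreach (HtmlNode node in nodes ?? Enumerable.Empty<HtmlNode>())
{
    string url = node.GetAttributeValue("src", null)
              ?? node.GetAttributeValue("href", null);
    Console.WriteLine(url); // each of these can then be requested ("made a hit")
}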

Thursday, August 2, 2012

Access "Pages" Library for Chinese Locale

Hi

My site collection has 5 language variations.

 

Except for the Chinese variation, I am able to access the Pages library like:

web.Lists["Pages"] or web.Lists.TryGetList("Pages")

 

But for the Chinese subsite:

  1. If I iterate the web.Lists collection using a foreach loop, there is a list present whose SPList.Title is "Pages". For this list, under SPList.SchemaXml, the title is Title="頁面".

  2. web.Lists["Pages"] gives the error: List 'Pages' does not exist at site with URL 'XXXXXXXXXXXXXXXXXXXXXXXXX', and web.Lists.TryGetList("Pages") gives null.

  3. But web.Lists["頁面"] returns the right SPList object.


Is it the case that SharePoint APIs which retrieve a list by name internally refer to SchemaXml rather than SPList.Title?

 

 

 

Reply 1 by http://social.technet.microsoft.com/profile/ivan%20vagunin/?ws=usercard-mini

Hi!

I would propose using the SPWeb.GetList function - it returns a list by URL (e.g. web.GetList("/myweb/Pages")), and the URL is usually the same for all locales.

http://msdn.microsoft.com/ru-ru/library/microsoft.sharepoint.spweb.getlist(v=office.12).aspx

Reply 2 by http://social.technet.microsoft.com/profile/aseem%20sood/?ws=usercard-mini

All the options that I can come up with are:

 

  1. Web.GetList(<URL from SharePoint resource file>) - preferred over the others for intranet scenarios.

  2. PublishingWeb.PagesList (because it is specifically meant for the Pages library).

  3. SPWeb.GetListsOfType (with template 850 for the Pages library; if that does not work, use 101 - Document Library - and loop through).

  4. Guid pagesListId = PublishingWeb.GetPagesListId(web); SPList pagesLib = web.Lists[pagesListId];


Also, just wanted to add that Web.GetList(URL) normally performs better than Web.Lists[ID/Title]; a short sketch of both locale-independent lookups follows this reply.

Thanks,

Aseem Sood
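A minimal sketch of the two locale-independent lookups suggested in this thread, assuming an open SPWeb named web:

using Microsoft.SharePoint;
using Microsoft.SharePoint.Publishing;
using Microsoft.SharePoint.Utilities;

// URL-based lookup: the "Pages" URL segment stays the same even when
// SPList.Title is localized (e.g. 頁面 on the Chinese variation)
SPList pagesByUrl = web.GetList(SPUrlUtility.CombineUrl(web.ServerRelativeUrl, "Pages"));

// Publishing API lookup: resolves the Pages library regardless of locale
PublishingWeb pubWeb = PublishingWeb.GetPublishingWeb(web);
SPList pagesByApi = pubWeb.PagesList;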

Sunday, July 1, 2012

VariationsFixupTool

The variationsfixuptool operation is used to correct variations system data on publishing sites or pages. If you want to analyze the actual site structure and the data stored in the relationships list, this tool is a good option.

Syntax

stsadm -o variationsfixuptool

   -url <source variation site URL>

   [-scan]

   [-recurse]

   [-label]

   [-fix]

   [-spawn]

   [-showrunningjobs]


Parameters

url (required) - A valid URL, such as http://server_name. The URL of a site in the source variation where variations system data is being analyzed or corrected.

scan (optional) - Analyzes the variations hierarchy and reports findings. This parameter provides functionality that cannot be accessed through the Central Administration web site. For each site/page, it reports:
  • Whether the source site is marked as being in the source variation hierarchy (SPWeb.AllProperties["__InSourceHierarchy"] == True?). If the site is marked as being in the source hierarchy, the page is part of the source hierarchy.
  • The variation group ID of the source site or page (SPWeb.AllProperties["Variation Group Id"] or PublishingPage.ListItem[FieldId.VariationGroupId]).
  • Which peers are registered in the relationships list with the same variation group ID.
  • Whether a variation peer exists in the configured labels (the default is all spawned labels), by first checking whether a peer with the same variation group ID is configured in the relationships list for the given label. If no peer can be found using the relationships list, it tries to look up the peer using the default URL that would be used when creating a variant in the given label.
  • The variation group ID of the variation peers in the target labels (SPWeb.AllProperties["Variation Group Id"] or PublishingPage.ListItem[FieldId.VariationGroupId]).
  • Whether the source site/page is configured in the relationships list.
  • In addition, the command reports the variation labels used for the peer check (the default is all spawned labels).
Note that the tool will not identify issues by itself - you have to analyze the report in detail and interpret it in order to find problems.

recurse (optional) - Scans or fixes all subsites of the site specified by the url parameter.

label (optional) - A valid label name, such as "English". The name of the label of the variation target.

fix (optional) - Corrects invalid variations system data that is found. If the recurse parameter is used, fixes are applied recursively to all subsites. This parameter provides functionality that cannot be accessed through the Central Administration web site. Fix mode will not create missing variation peers. The following issues are automatically fixed:
  • Missing variation group ID on a source site or source page
  • Missing relationships list entry for a source or target page or site
  • Missing variation group ID on a target site or target page
  • Different variation group IDs on the source site/page and the target site/page (the source ID wins)

spawn (optional) - Creates new site variations of the source variation site specified by the url parameter for all target variation labels. If the recurse parameter is used, variations for subsites and pages are also created. This parameter is equivalent to the New Variation Site user interface setting on the Site Content and Structure page. This operation mode cannot be used to spawn pages in already spawned sites. The command initiates the site spawn operation by creating a scheduled work item for the SpawnSiteJobDefinition timer job; the spawn is performed as soon as the timer job runs. The difference between the scheduled work item created by CPVAreaEventReceiver and the one created by variationsfixuptool is that the tool can create a work item which spawns the site to one specific variation label rather than to all spawned labels. A common usage scenario is to configure the source variation label with the Hierarchy Creation option to create only the root site, not the complete hierarchy, and later spawn the hierarchy (or parts of it) from the source label using this STSADM command. In MOSS 2007 this command was a major benefit compared to the automatic hierarchy creation in the UI, as it was the only way to force the creation to run in OWSTIMER rather than W3WP. With SharePoint 2010, where all these actions already run in OWSTIMER, the benefit of the STSADM command over the UI is rather limited.

showrunningjobs (optional) - Displays the current status of the Variations Propagate Page Job Definition and Variations Propagate Site Job Definition timer jobs, as found on the Timer Job Status page of the SharePoint Central Administration web site. It does not provide information about the Variations Create Hierarchies Job Definition, Variations Create Page Job Definition or Variations Create Site Job Definition timer jobs.
Example:
stsadm -o variationsfixuptool -scan -url http://server/sites/pub/vhome/source -recurse > C:\report1.html
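Building on the parameter descriptions above, a fix pass and a label-specific spawn over the same hierarchy would look like this (the URLs and label name are placeholders):

stsadm -o variationsfixuptool -fix -url http://server/sites/pub/vhome/source -recurse
stsadm -o variationsfixuptool -spawn -url http://server/sites/pub/vhome/source -label "English"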

Source of information :

SharePoint Variations – The complete Guide – Part 11 – Variations Fixup Tool (blogs.technet.com/b/stefan_gossner/archive/2011/11/28/sharepoint-variations-the-complete-guide-part-11-variations-fixup-tool.aspx)

Use the variationsfixuptool operation (SharePoint Server 2010) [technet.microsoft.com/en-us/library/dd789633(v=office.14).aspx]


Friday, June 29, 2012

System KeyWords for Managed MetaData Service

Hi

 

under the "System" Group >  "Keywords" Termset  I can see various terms which are named as various combination of digits like  :

1,2,3,4,5,6

1,2,3,6,4,5

1,2,5,6,4,3

1,3,4,2,5,6

1,3,5,2,4,6

1,3,6,2,4,5

1,4,2,3,5,6

1,4,3,6,2,5

1,5,2,3,4,6

1,5,2,3,6,4

1,6,2,3,4,5

 

If you know why they are there, kindly share it with all of us. Please explain the purpose of these digit terms (enterprise keywords), and if we lose a few of them, what the consequences may be.

 

The source of the question is that one of our servers has a few extra ones, like 18,19,20,21,22,23.

Reply 1

Hello,

I checked, and I wasn't able to reproduce this issue with content deployment between two SharePoint environments, so this seems like an environment-specific issue; in-depth engagement is not feasible through forum replies, and this issue requires a more in-depth level of support.
Please visit the link below to see the various paid support options that are available to better meet your needs: http://support.microsoft.com/default.aspx?id=fh;en-us;offerprophone. If you are an MSDN / TechNet subscriber, you can also contact our support using your free support incidents.

However, other members of the community may still have encountered the issue you're seeing, and have a solution to offer!

The keywords visible under the "System" group are added by tagging documents / list items with predefined (taxonomy) and user-defined (folksonomy) terms. There are no pre-populated digit keywords, so I do not believe there would be any impact other than tagged items losing these tags.

Please see the following articles
http://office.microsoft.com/en-us/sharepoint-server-help/CH010372669.aspx
http://technet.microsoft.com/en-us/library/ee424402.aspx
http://technet.microsoft.com/en-us/library/ee424403.aspx
Regards,
Nishant Shah

 

Reply 2

 

I suspect:

If staging has a dummy term and a corresponding TaxonomyHiddenList entry, content deployment / migration from staging to production creates these kinds of terms.

 

What Microsoft could do to fix it:

1. Suppose staging has a term (myGuid) under Termset1 (Guid1), under Group1 (Guid2), under Store1 (Guid3), and during migration of the site collection it finds a term in TaxonomyHiddenList which is missing on the target.

2. It should then search for Guid1 under Guid2 under Guid3 and create a term with myGuid there.

As of now, the current implementation of Microsoft's migration APIs seems to create the entry under the System group's Keywords term set instead.

Friday, June 22, 2012

User Profiles in custom SPGridView with Pagination for SharePoint 2010

Hi

My User Profiles have around 10,000 records.

 

I want to display them in a custom SharePoint grid. Even with a profile count of only 50, displaying them all takes my dev machine around 40 seconds with all the custom processing I have.

So I want a page size of, say, 20, and on each Next click I should fetch the next 20 records from the profile DB. I should also get page numbers at the bottom for direct navigation.

 

Please suggest if you have something in mind or have done this already in the past.

 

I am going to rely on Microsoft.Office.Server.UserProfiles.ProfileManagerBase.GetEnumerator for this.

 

Any pointers?

Reply 1

Hi hemantrhtk,

It may not be easy to apply paging to user profiles. In this situation, you may consider storing the profile information in a DataTable and paging over the DataTable. Here is an example; please refer to it for more information:
http://www.codeproject.com/Articles/14017/using-cache-to-store-Data-in-datatable-for-custom

Thanks,




 

Qiao Wei

Reply 2

With http://msdn.microsoft.com/en-us/library/ee581591.aspx under Microsoft.Office.Server.UserProfiles,

we can at least implement Next/Previous pagination, but I am looking for page numbers with first- and last-page buttons.

Anyway, we are targeting the membership provider as the base for pagination, which has public abstract MembershipUserCollection GetAllUsers(int pageIndex, int pageSize, out int totalRecords); after that, for each page, we fetch the records from the user profiles, as sketched below.
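A rough C# sketch of that membership-provider approach (the method name and the grid binding are hypothetical):

using System;
using System.Web.Security;
using Microsoft.Office.Server.UserProfiles;
using Microsoft.SharePoint;

static void BindProfilePage(SPSite site, int pageIndex, int pageSize)
{
    // page through the membership provider first...
    int totalRecords;
    MembershipUserCollection pageOfUsers =
        Membership.Provider.GetAllUsers(pageIndex, pageSize, out totalRecords);

    // ...then fetch only this page's records from the profile store
    var upm = new UserProfileManager(SPServiceContext.GetContext(site));
    foreach (MembershipUser user in pageOfUsers)
    {
        if (upm.UserExists(user.UserName))
        {
            UserProfile profile = upm.GetUserProfile(user.UserName);
            // bind the needed profile properties to the SPGridView row here
        }
    }

    // totalRecords gives the page count needed for numbered pager links
    int totalPages = (int)Math.Ceiling((double)totalRecords / pageSize);
}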

Tuesday, June 12, 2012

Quickly get all files inside an SPFolder recursively

Follow these steps:

  • Open the SPFolder in IE.

  • Click "Open with Windows Explorer".

  • Use that URL to map a drive in My Computer.

  • Use the PowerShell below to extract the list:


Get-ChildItem "z:\" -Recurse | ForEach-Object {
    Write-Host $_.FullName
    Add-Content "C:\hemant\sometextfile.txt" $_.FullName
}
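If you prefer the server-side API instead, here is a minimal C# sketch of the same recursive walk (assuming you already have an SPFolder object):

using System;
using Microsoft.SharePoint;

static void ListFilesRecursive(SPFolder folder)
{
    foreach (SPFile file in folder.Files)
        Console.WriteLine(file.ServerRelativeUrl);
    foreach (SPFolder subFolder in folder.SubFolders)
        ListFilesRecursive(subFolder); // recurse into child folders
}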

Monday, May 28, 2012

Microsoft.SharePoint.Taxonomy Migration of Taxonomy

Hi



I have Server1 with Microsoft.SharePoint.DLL version 14.0.5123.5000 in Farm1.

  1. I took a content DB backup of site collection SC1 on Server1, along with a taxonomy DB backup on Server1.

  2. I restored these DBs on Server2's (Microsoft.SharePoint.DLL version 14.0.4762.1000) SQL instance.

  3. Created a new managed metadata service on Farm2 with the restored metadata service DB backup.

  4. Created a new web application and site collection on Farm2 with the restored content DB from Server1.

  5. Manually ran the Taxonomy Update Scheduler timer job.

  6. When I create a new item / edit one on Server2, a new entry is created in TaxonomyHiddenList and a new entry under the "System" term store.

  7. Even though I have the same term present in a term set in my custom term store!

The ideal behavior should be that on Server2 the new item is tagged to my custom term in the predefined term set and term store.



Please guide me on what step I am missing.

Reply 1

Let's try a different approach. Instead of trying to create the service from a backup, we can export the data and import it into a new metadata service:

  1. Delete the Managed Metadata service in Farm 2.

  2. Create a new Managed Metadata service in Farm 2. Note the application pool account.

  3. In Farm 1, from an elevated SharePoint Management Shell window, run the Export-SPMetadataWebServicePartitionData cmdlet. This will export the managed metadata into a cabinet file (.CAB):

Export-SPMetadataWebServicePartitionData -Identity "http://sharepointsite" -ServiceProxy "ManagedMetadata Service Proxy Name" -Path "C:\Temp\ManagedMetadata.cab"

  4. Copy the .CAB file from Farm 1 to a folder on the Farm 2 SQL Server. Share this folder and give Everyone Full Control sharing permission (we will clean this up later).

  5. In Farm 2 SQL Server Management Studio, give the Managed Metadata application pool account (from step 2) the bulkadmin SQL Server role (expand the server instance -> expand Security -> find the account -> right-click, Properties -> Server Roles page -> check bulkadmin -> click OK).

  6. On the Farm 2 SharePoint server, from an elevated SharePoint Management Shell window, run the Import-SPMetadataWebServicePartitionData cmdlet to import the data:

Import-SPMetadataWebServicePartitionData -Identity "http://sharepointsitefarm2" -ServiceProxy "New Managed Metadata Service Proxy Name" -Path "\\SQLServerName\Share\ManagedMetadata.cab"

You'll need to update the URLs and names to reflect your environments.






JASON WARREN



Reply 2



Hello Jason



Thanks for your great help. Even with the migration commands you shared, I was getting the same problem.

The source of the problem in my case was that after the recreation / DB restoration / migration of the taxonomies, a few of the properties in my user profiles had lost their mappings; I had to manually remap them to get things working.

Reply 3 

After everything, I am facing the following (to sum up, I will restate all the steps I did):



  1. I have Server1 with Microsoft.SharePoint.DLL version 14.0.5123.5000 in Farm1. On Farm2, the Microsoft.SharePoint.DLL version is 14.0.4762.1000.

  2. Farm1 and Farm2 are exact replicas (created from the same content DB), and the taxonomies are also the same, with the same GUIDs for the term store, term sets and terms.

  3. I created a new term under my custom term set on Farm1, and used Export-SPMetadataWebServicePartitionData and Import-SPMetadataWebServicePartitionData (OverwriteExisting) to migrate the term changes to Farm2.

  4. Edited existing content in Site Collection 1 to refer to the new term.

  5. Then I used the Deployment APIs to take an incremental export of Site Collection 1.

  6. The import process into Site Collection 2 on Farm2 creates two entries in TaxonomyHiddenList! These two entries have the same Title, IdForTermStore, IdForTerm, IdForTermSet, etc. The only difference is the SPItem ID.

  7. All the functionality of my site works fine, but I am not sure why these two entries should be in TaxonomyHiddenList.



After each step above I ran the Taxonomy Update Scheduler timer job on the Site Collection 2 web application. A manual TaxonomySession.SyncHiddenList also did not help.



It seems one entry in TaxonomyHiddenList came from the content migration of the TaxonomyHiddenList itself, and the second entry came as an included dependency of the edited list item which refers to this new term. Might that be the case?



As the import log shows, the import process on Site Collection 2 refers to TaxonomyHiddenList twice:

[5/30/2012 11:08:46 AM] Start Time: 5/30/2012 11:08:46 AM.
[5/30/2012 11:08:46 AM] Progress: Initializing Import.
[5/30/2012 11:08:46 AM] Progress: Starting content import.
[5/30/2012 11:08:46 AM] Progress: De-Serializing Objects to Database.
[5/30/2012 11:08:46 AM] [Folder] [Person] Progress: Importing
[5/30/2012 11:08:46 AM] [Folder] [Person] Verbose: Source URL: _catalogs/users/Person
[5/30/2012 11:08:46 AM] [Folder] [Item] Progress: Importing
[5/30/2012 11:08:46 AM] [Folder] [Item] Verbose: Source URL: Lists/TaxonomyHiddenList/Item
[5/30/2012 11:08:46 AM] [Folder] [myLibrary] Progress: Importing
[5/30/2012 11:08:46 AM] [Folder] [myLibrary] Verbose: Source URL: myLibrary/Forms/myLibrary
[5/30/2012 11:08:46 AM] [File] [sitemap.xml] Progress: Importing
[5/30/2012 11:08:46 AM] [File] [sitemap.xml] Verbose: Source URL: sitemap.xml
[5/30/2012 11:08:46 AM] [File] [sitemap.xml] Verbose: Destination URL: /sitemap.xml
[5/30/2012 11:08:46 AM] [ListItem] [xyz.pdf] Progress: Importing
[5/30/2012 11:08:46 AM] [ListItem] [xyz.pdf] Verbose: List URL: /myLibrary
[5/30/2012 11:08:46 AM] [ListItem] [xyz.pdf] Verbose: Deleting...
[5/30/2012 11:08:47 AM] [ListItem] [53_.000] Progress: Importing
[5/30/2012 11:08:47 AM] [ListItem] [53_.000] Verbose: List URL: /Lists/TaxonomyHiddenList
[5/30/2012 11:08:47 AM] [ListItem] [272_.000] Progress: Importing
[5/30/2012 11:08:47 AM] [ListItem] [272_.000] Verbose: List URL: /_catalogs/users
[5/30/2012 11:08:47 AM] Verbose: Performing final fixups.
[5/30/2012 11:08:47 AM] Progress: Import completed.
[5/30/2012 11:08:47 AM] Finish Time: 5/30/2012 11:08:47 AM.
[5/30/2012 11:08:47 AM] Duration: 00:00:00
[5/30/2012 11:08:47 AM] Total Objects: 11
[5/30/2012 11:08:47 AM] Finished with 0 warnings.
[5/30/2012 11:08:47 AM] Finished with 0 errors.

My migration process exports only content which is published. If I tweak the process, it works:



1. Create an item which is not published, using the new term, on Server1. This populates an entry in TaxonomyHiddenList. Let this term in TaxonomyHiddenList migrate in the next export/import cycle.



2. Before the next migration cycle, publish the item referring to the new term on Server1. After the next migration, the updated item goes to Server2 and there is no extra term in TaxonomyHiddenList on Server2.

Reason: the import process on Server2 is probably handled by SharePoint as a transaction, so the TaxonomyHiddenList import and the referring item's import happen within a single transaction. The item therefore does not get a reference to the newly imported taxonomy hidden list entry, and hence triggers the creation of a new SPItem in TaxonomyHiddenList.

If we don't want to use PowerShell for taxonomy migration and want to stick to DB migration, there is a way out: restart SQL Server just before the DB restore on the target, so that no live connection to the managed metadata DB exists at the time of the restore.

If we instead try to point the old service to a new DB (on the target, where the new DB is restored from the source metadata DB), some of the term set to list column mappings may be lost.


Wednesday, May 16, 2012

Microsoft.SharePoint.Deployment SPExportSettings

Hi

 

The ExportChangeToken property in SPExportSettings tells the SharePoint APIs the point in the past from which an incremental export should start.

 

CurrentChangeToken (read-only) is set internally and is referred to by future exports as a milestone for the export process. Is this property set when some change occurs in the site, or when the incremental batch completes?

 

I have a batch which runs perfectly under ideal conditions.

 

 

I want the value of CurrentChangeToken not to be persisted in the system when I run this batch manually, so that the automatic batch process keeps running as-is and my manual run is free to export without any effect on the automatic one.

 

 

Please help.

1. Is it OK to create my own persisted object store of SPChangeToken, let the batch use it to get the last value, and add a new token there with the CurrentChangeToken value? The manual process would not update the custom persisted object, so the last token in the store is always the one from the automatic process. (A sketch of this idea follows option 2 below.)

 

 

 

2. I have one last resort if I don't find direct APIs in the Deployment namespaces: the SPSite.GetChanges(SPChangeToken, SPChangeToken) method.

In this case, I want to know whether the Microsoft.SharePoint.Deployment.SPExportSettings.CurrentChangeToken (read-only) property is set internally after the export has run, or whether it is independent of when the export runs and depends only on when the actual change occurs.
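For what it's worth, a minimal sketch of option 1; the LoadLastToken/SaveToken helpers and the manualRun flag are hypothetical, standing in for the custom persisted object store:

using Microsoft.SharePoint.Deployment;

var settings = new SPExportSettings
{
    SiteUrl = "http://server/sites/source",          // placeholder URL
    ExportMethod = SPExportMethodType.ExportChanges, // incremental export
    ExportChangeToken = LoadLastToken(),             // last token saved by the automatic batch
    BaseFileName = "incremental.cmp",
    FileLocation = @"C:\Exports"
};

new SPExport(settings).Run();

// only the automatic batch persists the new milestone, so a manual run
// never disturbs the automatic schedule
if (!manualRun)
    SaveToken(settings.CurrentChangeToken);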

Reply 1

Hi

 

In daily content migration scenarios from staging to production, how do the Microsoft.SharePoint.Deployment APIs perform better than http://technet.microsoft.com/en-us/library/ee721058.aspx ?

Friday, March 23, 2012

SharePoint 2010 | Login over HTTPS from HTTP pages

Hello Friends

 

Did you ever have a chance to work on a login control for SharePoint which sends the user's credentials over SSL and leaves the rest of the traffic unsecured? I found a few good links which suggest creating our own custom cookie handler.

The Microsoft SharePoint team has written its own cookie handler specific to SharePoint which does not allow an authentication token generated on a secured connection to be used on a non-secured one. First, do you see any harm in overriding this behavior of SharePoint's cookie handler with our own custom one? Second, I was able to transfer the cookies back using the approach in the links below, but the current context was lost and the user remained logged off!

 

References:

  1.        http://www.sp2010hosting.com/Lists/Posts/Post.aspx?ID=5

  2.        http://blogs.visigo.com/chriscoulson/mixed-http-and-https-content-with-sharepoint-2010/

  3.        http://www.sharepointconfig.com/2010/04/partial-ssl-sharepoint-sites-login-over-http-from-http-pages/


 

Steps I followed:

1. Extended the current setup to SSL and made it work like the default zone application (including resource files, web.config entries, manual DLLs, etc.).

2. Set the postback URL for the login button.

3. Made the URL rewrite entries as suggested in both zones (these are spread across multiple links; the REQUEST_METHOD rule is in addition to the HTTPS ON rule). URL rewriting can be done very easily with an IIS extension provided by Microsoft.

4. Created the custom cookie handler as suggested and swapped it in for the SharePoint one.

5. Set "require SSL" for cookies to false, so that they can also be used over non-SSL.

 

Reply 1

The two bindings should be placed under the same zone; only then does it work. I was putting the alternate access mappings under different zones.

So with the custom cookie handler and both endpoints under the same zone, it works!

 

Reply 2

Hello hemantrhtk

 

Thank you for your post.

 

This is a quick note to let you know that we are performing research on this issue.

 

Thanks,




 

Pengyu Zhao

 


Friday, March 2, 2012

Migration from MOSS 2007 to SharePoint 2010: InfoPath does not work

We are facing an issue connecting our SharePoint 2010 application with InfoPath Designer.

The site collection for which we are getting this error was migrated from MOSS 2007 to SharePoint 2010. For other applications created in SharePoint 2010 itself (no migration), everything works fine.

While trying to connect to the migrated application, we get the error:

 

‘This feature requires SharePoint 2010 or greater with InfoPath Forms Services enabled’

 

We tried the points below to troubleshoot this issue but didn't have any success (http://sharepointbloggin.com/2010/12/ and http://sharepointbloggin.com/):

 

  • Activated the "SharePoint Server Enterprise Site Collection features" at the site / web level.

  • Checked the Enterprise CAL licence on the server.


 

However, when we tried to connect to another web application in the same farm, that worked fine.

 

 

Please suggest a workaround; if anyone has faced this kind of problem before, we would greatly appreciate their input.

Reply 1

Hi hemantrhtk,

Maybe you haven't activated it on your top-level site. Go to the top-level site of wherever your site is, go to Site Settings > Site Collection Features, and ensure the Enterprise feature is activated there.

This may also be an InfoPath compatibility problem; you can delete the InfoPath-related item, then recreate it in your SharePoint 2010 environment.

Thanks,

Jack

Hello Jack

 

We have tried this as well, as per the links shared above.

Reply 2 by http://social.msdn.microsoft.com/profile/shuklabond/?ws=usercard-mini

Hi Hemant ,

Find your solution at :

http://social.msdn.microsoft.com/Forums/en-US/sharepoint2010general/thread/9f11dc60-4f51-4114-8cc1-fa21115d0de7

Regards,
Vivek Shukla

Wednesday, February 8, 2012

SharePoint as Multilingual Solution - Comparison with other products

Hi

I need a comparison sheet of SharePoint 2010 against other products with respect to multilingual support. Other products could be Google Sites, HyperOffice, Documentum, Alfresco, Oracle Beehive, Jive, O3Spaces, MindTouch, Lotus, Vignette, Drupal, Salesforce and so on.

 

Reply 1

I don't think a sheet like that exists. I found a couple of interesting links for you:

 

http://technet.microsoft.com/en-us/library/cc262055.aspx

 

 

http://sharepoint.microsoft.com/blogs/GetThePoint/Lists/Posts/Post.aspx?ID=366

 

 

 

http://blog.consejoinc.com/2011/01/creating-multilingual-sites-in.html

Hopefully they contain useful information for you. One good thing to remember is that users always have to translate content to the other language themselves; only the system settings (such as settings and default columns) are translated. I don't know how other products work compared to SharePoint.

With SharePoint you also buy a platform, so you can use it for a lot more than a multilingual solution.




Blog: www.jasperoosterveld.com Twitter: @JasITConsultant

 

Thursday, February 2, 2012

List web service returns empty result if anonymous is enabled for site collection using list GUID

I am using MOSS 2007 SP2. I enabled anonymous access on the site collection for lists only. When I call GetListItems on the Lists web service for a list that does not have anonymous access enabled, using the list GUID in the request, I get back 0 items. If I use the list name instead, I get back list items. If I disable anonymous access for the site collection, I also get back list items. Any idea what's going on?

 

Reply 1

Wrap your request with Authentication.asmx.

Use the Login method and, based on its LoginResult, assign the CookieContainer of the Authentication web service to the Lists web service, as sketched below.
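A minimal sketch of this suggestion for a forms-authentication site; Authentication and Lists are the proxy classes generated from /_vti_bin/Authentication.asmx and /_vti_bin/Lists.asmx, and the URL, credentials and list GUID are placeholders:

var auth = new Authentication();
auth.Url = "http://server/_vti_bin/Authentication.asmx";
auth.CookieContainer = new System.Net.CookieContainer();

LoginResult result = auth.Login("user", "password");
if (result.ErrorCode == LoginErrorCode.NoError)
{
    var lists = new Lists();
    lists.Url = "http://server/_vti_bin/Lists.asmx";
    lists.CookieContainer = auth.CookieContainer; // carry the auth cookie over
    System.Xml.XmlNode items = lists.GetListItems(
        "{LIST-GUID}", null, null, null, null, null, null);
}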

Delegate Control to Inject Google Analytics in MOSS

Hi Guys,

I want to load the JS for Google Analytics for some sites and not for others, so I am thinking of doing this using a delegate control, so users can control it through feature activation and deactivation. I did something like this for a banner image; will this approach work to load JS too?

If there is another way, please advise.

Thanks

Ronak

Reply 1

You can opt for two sets of master pages: one without the scripts included, the other with the Google Analytics implementation.

 

Reply 2

Guys,

I am trying to add JS to the master page using delegate controls, but it's not working. Here is the code I have written.

Please advise

 

using System;
using System.Web.UI;
using Microsoft.SharePoint;

namespace WSPBuilderProject1.SharePointRoot.Template.ControlTemplates.PhilaGov
{
    public partial class PhilaGovGA : UserControl
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            string _url = SPContext.Current.Web.Url;
        }

        protected override void CreateChildControls()
        {
            base.CreateChildControls();

            // emit the Google Analytics bootstrap as a literal script block
            this.Controls.Add(new LiteralControl(@"<script type='text/javascript'>
var _gaq = _gaq || [];
_gaq.push(['_setAccount', '#12323232']);
_gaq.push(['_trackPageview']);
</script>"));
        }
    }
}


 

Once I enable the feature, I don't see this code in the head section of the master page.

Please advise

 

Thanks

Ronak

Friday, January 20, 2012

ResolvePrincipal

Hi

 

I am trying to call ResolvePrincipal using People.asmx in two separate farms. Both farms are single-server farms with the same configuration. Both farms target the same AD for user information, and the AD endpoint used by both farms is the same.

On one server it takes 3.5 seconds on average to run this web service method, but on the other it takes 43 seconds.

 

What could be the reason behind this?

Reply 1:

Capture a network trace using the Netmon tool.

http://www.microsoft.com/download/en/details.aspx?id=4865




Varun Malhotra

Saturday, May 28, 2011

Analyse event logs for Service Control Manager activities from Windows services

Sometimes you might want to pull start/stop-type activities for Windows services out of the event logs. Here is a PowerShell script for this:

$eventL = Get-EventLog -LogName "System" -Source "Service Control Manager"
Get-Service | ForEach-Object {
    $status = $_.Status
    $displayName = $_.DisplayName
    $name = $_.Name

    # print the service, then its most recent Service Control Manager entry
    $eventL | Where-Object { $_.Message.Contains($displayName) } | Select-Object -First 1 | ForEach-Object {
        Write-Host $status $displayName $name -ForegroundColor Cyan
        Write-Host $_.Message $_.TimeGenerated
    }
}
$eventL = $null

Now that you have identified the culprit Windows service, use this script to get its whole available history:

$eventL = Get-EventLog -LogName "System" -Source "Service Control Manager"

Get-Service | ForEach-Object {
    if ($_.Name -eq "AAAAAAAAAAAA") {   # put the culprit service name here
        $displayName = $_.DisplayName

        # print every Service Control Manager entry for this service
        $eventL | Where-Object { $_.Message.Contains($displayName) } | ForEach-Object {
            Write-Host $_.Message $_.TimeGenerated
        }
    }
}
$eventL = $null

It might be the case that you only want to list which services are stopped and which ones are running:

Get-Service | ForEach-Object {
    Write-Host $_.Status "," $_.DisplayName "," $_.Name -ForegroundColor Cyan
}
