Hi
My User Profile store has around 10,000 records.
I want to display them in a custom SharePoint grid. Even with only 50 profiles, displaying them all takes around 40 seconds on my dev machine with all the custom processing I have.
So I want a page size of, say, 20, and on each Next click I should fetch the next 20 records from the profile DB. I should also get page numbers at the bottom for direct navigation.
Please suggest an approach if you have something in mind or have done this in the past.
I am going to rely on Microsoft.Office.Server.UserProfiles.ProfileManagerBase.GetEnumerator for this.
Any pointers?
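A rough sketch of what I have in mind (PowerShell for brevity; the URL and numbers are placeholders). Note that the enumerator is forward-only, so every page still walks past all earlier records:
$site = Get-SPSite "http://mysite"
$ctx = Get-SPServiceContext $site
$upm = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($ctx)
$pageSize = 20
$pageIndex = 3   # zero-based page requested by the grid
# $upm.Count gives the total record count for computing page numbers.
$enum = $upm.GetEnumerator()
$toSkip = $pageIndex * $pageSize
while ($toSkip -gt 0 -and $enum.MoveNext()) { $toSkip-- }
$page = @()
while ($page.Count -lt $pageSize -and $enum.MoveNext()) { $page += $enum.Current }
$page | ForEach-Object { $_.DisplayName }
$site.Dispose()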
Reply 1
Hi hemantrhtk,
It may not be easy to apply paging to user profiles directly. In this situation, you may consider storing the profile information in a DataTable and paging over the DataTable. Here is an example of this approach; please refer to it for more information:
http://www.codeproject.com/Articles/14017/using-cache-to-store-Data-in-datatable-for-custom
Thanks,
Qiao Wei
Reply 2
With http://msdn.microsoft.com/en-us/library/ee581591.aspx under Microsoft.Office.Server.UserProfiles
we can at least implement Next/Previous pagination, but I am looking for page numbers with First and Last page buttons.
Anyway, we are targeting the membership provider as the base for pagination, since it exposes public abstract MembershipUserCollection GetAllUsers(int pageIndex, int pageSize, out int totalRecords); after that, for each page, we fetch the corresponding records from the user profiles. A rough sketch of the idea follows below.
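Something like this (the provider name is a placeholder, and this assumes the code runs where that membership provider is configured and that it shares account names with the profile store):
Add-Type -AssemblyName System.Web
$pageSize = 20
$total = 0
$provider = [System.Web.Security.Membership]::Providers["MyMembershipProvider"]   # placeholder name
# GetAllUsers pages at the provider level; the out parameter comes back via [ref].
$usersPage = $provider.GetAllUsers(2, $pageSize, [ref]$total)   # page index 2
$pageCount = [Math]::Ceiling($total / $pageSize)   # drives the page-number links
# Resolve each member of the page against the profile store ($upm as in the sketch above).
foreach ($u in $usersPage) {
    $profile = $upm.GetUserProfile($u.UserName)
    $profile.DisplayName
}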
Friday, June 22, 2012
Tuesday, June 12, 2012
Quickly get all files inside an SPFolder recursively
Follow these steps:
- Open the SPFolder in IE.
- Click on "Open with Windows Explorer".
- Use the resulting URL to map a drive in My Computer.
- Use the PowerShell below to extract the list:
Get-ChildItem "Z:\" -Recurse | ForEach-Object {
    Write-Host $_.FullName
    Add-Content "C:\hemant\sometextfile.txt" $_.FullName
}
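If mapping a drive is not an option, the same list can be produced server-side with the object model; a rough sketch (the URL and folder name are placeholders):
$web = Get-SPWeb "http://myserver/sites/mysite"
# Walk the SPFolder tree recursively and record every file's URL.
function Get-SPFolderFiles([Microsoft.SharePoint.SPFolder]$folder) {
    foreach ($file in $folder.Files) { $file.ServerRelativeUrl }
    foreach ($sub in $folder.SubFolders) { Get-SPFolderFiles $sub }
}
Get-SPFolderFiles $web.GetFolder("Shared Documents") |
    Add-Content "C:\hemant\sometextfile.txt"
$web.Dispose()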
Friday, June 1, 2012
Central Admin Content Deployment Jobs and Path by Microsoft (SharePoint 2010) VS SharePoint Content Deployment Wizard by chrisobrien
Hi
A few of our projects have been using the SharePoint Content Deployment Wizard by Chris O'Brien since the old days of MOSS 2007, and others use the Central Admin Content Deployment Jobs and Paths by Microsoft.
Microsoft keeps updating its out-of-the-box feature in SharePoint as usual.
I am starting this discussion to encourage folks here to use the best option available. I would request everyone to share their views, like:
____________________________________________________________________________________________________________
My preference : Central Admin Content Deployment Jobs and Path by Microsoft (SharePoint 2010) (Option 1)
Why :
- I prefer Option 1 because it is something for which I can go back to Microsoft and request support. Option 1 gives me almost all the features that Option 2 does.
OR
My preference : SharePoint Content Deployment Wizard by chrisobrien (Option 2)
Why :
- I prefer Option 2 for the level of customization it allows; with a custom API wrapper I can control the whole migration process. Option 1 is a kind of black box for us (though see the scripting sketch near the end of this post).
_____________________________________________________________________________________________________________
You are also welcome to share links to threads where the tech world has already made this comparison handy, such as the ones covering the SharePoint Content Deployment Wizard.
________________________________________________________________________________________________________________________________________________________________________________________________________________________
Are there alternatives so that, even if we don't want to upgrade from Microsoft SharePoint Server 2010 (14.0.4763.1000), we still don't face the issues mentioned below?
- An incremental content deployment of a package in SharePoint Server 2010 fails if the following conditions are true:
- The package contains a renamed site.
- A sub-site contains a link to the renamed site.
Additionally, you receive the following error message:
Value does not fall within the expected range.
References: http://support.microsoft.com/kb/2459108
- You perform a content deployment from a source web application to a destination web application in SharePoint Foundation 2010.
- You select languages under the Alternate language section in the source web application.
- You perform an incremental content deployment.
In this scenario, the alternative language setting is not changed in the destination web application.
References: http://support.microsoft.com/kb/2536591
- Assume that you perform an incremental content deployment on a destination SharePoint Server 2010 farm. In this situation, the changes for the alternative language settings on the source farm are not reflected on the destination farm.
References: http://support.microsoft.com/kb/2536591
- When you use the Managed Metadata columns together with document sets in SharePoint Foundation 2010, you cannot perform a content deployment successfully. Additionally, you receive the following error message:
FatalError: Specified data type does not match the current data type of the property.
References: http://support.microsoft.com/kb/2598304
- You paste HTML markup into the comments field of a publishing page in a SharePoint site.
You change the page order by using the Site Navigation Settings page in the SharePoint site.
You perform a content deployment from this site to another site.
In this scenario, the content deployment fails, and you receive the following error message:
Cannot complete this action.
References: http://support.microsoft.com/kb/2598304
- One of the issues you may face with a deployment job: How to Fix – Publishing Site Content Deployment Error – Duplicate First Name Column.
- You should verify that relative links are migrated correctly.
Nevertheless, we can avoid many issues not listed anywhere by following the guidance below:
Best practices for content deployment (SharePoint Server 2010)
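For what it's worth, the out-of-the-box Paths and Jobs are not a complete black box either: an existing job can be triggered from script. A rough sketch (the job name is a placeholder; this assumes the SharePoint 2010 Publishing assembly is available):
Add-Type -AssemblyName "Microsoft.SharePoint.Publishing, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
# Find a Central Admin content deployment job by name and run it with its configured path and scope.
$job = [Microsoft.SharePoint.Publishing.Administration.ContentDeploymentJob]::GetAllJobs() |
    Where-Object { $_.Name -eq "Staging to Production" }   # placeholder job name
$job.Run()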
You are invited to add the issues you keep facing in SharePoint content deployment using Paths and Jobs, with links to resolutions or explanations.
SPWebService.CollectSPRequestAllocationCallStacks property
There was a time when you had to restart a lot of services after editing registry entries like HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\HeapSettings SPRequestStackTrace = 1
to get to the root cause of memory leaks and the like.
Now there is the SPWebService.CollectSPRequestAllocationCallStacks property. This is a diagnostic setting that facilitates the debugging of leaks of SPSite and SPWeb objects; if not disposed properly, these objects can hold onto large amounts of memory. When this setting is enabled, traces in the trace log that report leaks will contain call stacks. However, call-stack collection is expensive, so it should only be enabled during a brief diagnostic period.
PowerShell:
$MySvc = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
$MySvc.CollectSPRequestAllocationCallStacks = $true   # set back to $false once diagnostics are done
$MySvc.Update()
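Once the flag is on and the leak is reproduced, the resulting traces can be pulled from the ULS log; a rough sketch (the message filter is an assumption, adjust it to the actual trace text):
# Pull recent ULS entries that mention SPRequest; with the flag on, leak reports include allocation call stacks.
Get-SPLogEvent -StartTime (Get-Date).AddMinutes(-30) |
    Where-Object { $_.Message -like "*SPRequest*" } |
    Select-Object Timestamp, Message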
Monday, May 28, 2012
Microsoft.SharePoint.Taxonomy Migration of Taxonomy
Hi
I have Server1 with Microsoft.SharePoint.dll version 14.0.5123.5000 in Farm1.
- I took a content DB backup of site collection SC1 on Server1, along with a backup of the taxonomy DB on Server1.
- I restored these DBs on the Server2 SQL instance (Microsoft.SharePoint.dll version 14.0.4762.1000).
- Created a new managed metadata service on Farm2 from the restored metadata service DB backup.
- Created a new web application and site collection on Farm2 with the content DB restored from Server1.
- Manually ran the Taxonomy Update Scheduler timer job.
- When I create a new item or edit one on Server2, a new entry is created in TaxonomyHiddenList and a new entry appears under the "System" term store.
- This happens even though the same term is present in a term set in my custom term store!
The ideal behavior: on Server2, the new item should be tagged with my custom term in the predefined term set and term store.
Please guide me on which step I am missing.
Reply 1
Let's try a different approach. Instead of trying to create the service from a backup, we can export the data and import it into a new metadata service:
- Delete the Managed Metadata service in Farm 2.
- Create a new Managed Metadata service in Farm 2. Note the application pool account.
- In farm 1, from an elevated SharePoint Management Console window, run the Export-SPMetadataWebServicePartitionData cmdlet. This will export the managed metadata into a cabinet file (.cab):
Export-SPMetadataWebServicePartitionData -Identity "http://sharepointsite" -ServiceProxy "ManagedMetadata Service Proxy Name" -Path "C:\Temp\ManagedMetadata.cab"
- Copy the .cab file from farm 1 to a folder on the farm 2 SQL Server. Share this folder and give Everyone Full Control sharing permission (we will clean this up later).
- In farm 2 SQL Server Management Studio, give the Managed Metadata application pool account (from step 2) the bulkadmin SQL Server role (expand the server instance -> expand Security -> find the account -> right-click, Properties -> Server Roles page -> check bulkadmin -> click OK).
- In farm 2 SharePoint server, from an elevated SharePoint Management Console window, run the Import-SPMetadataWebServicePartitionData cmdlet to import the data:
Import-SPMetadataWebServicePartitionData -Identity "http://sharepointsitefarm2" -ServiceProxy "New Managed Metadata Service Proxy Name" -Path "\\SQLServerName\Share\ManagedMetadata.cab"
You'll need to update the URLs and names to reflect your environments.
JASON WARREN
Reply 2
Hello Jason
Thanks for your great help. Even with the migration commands you shared, I was getting the same problem.
The source of the problem in my case was:
After the recreation/DB restoration/migration of taxonomies, a few of the properties in my user profiles had lost their mappings; I had to manually remap them to get things working.
Reply 3
After everything, I am facing the following (to sum up, I will restate all the steps I did):
- I have Server1 with Microsoft.SharePoint.dll version 14.0.5123.5000 in Farm1; on Farm2 the Microsoft.SharePoint.dll version is 14.0.4762.1000.
- Farm1 and Farm2 are exact replicas (created from the same content DB), and the taxonomies are also the same, with the same GUIDs for the term store, term sets, and terms.
- I created a new term under my custom term set on Farm1, then used Export-SPMetadataWebServicePartitionData and Import-SPMetadataWebServicePartitionData (with OverwriteExisting) to migrate the term changes to Farm2.
- Edited existing content in site collection 1 to refer to the new term.
- Then I used the Deployment APIs to take an incremental export of site collection 1.
- The import process for site collection 2 on Farm2 creates two entries in TaxonomyHiddenList! These two entries have the same Title, IdForTermStore, IdForTerm, IdForTermSet, etc.; the only difference is the SPItem ID.
- All the functionality of my site works fine, but I am not sure why these two entries should be there in TaxonomyHiddenList.
After each step above I ran the Taxonomy Update Scheduler timer job on the site collection 2 web application. A manual TaxonomySession.SyncHiddenList did not help either (see the sketch below for how it was invoked).
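For reference, this is roughly how the manual sync was invoked (a minimal sketch; the URL is a placeholder):
# Force TaxonomyHiddenList to re-sync against the term store for one site collection.
$site = Get-SPSite "http://myserver/sites/sitecollection2"
[Microsoft.SharePoint.Taxonomy.TaxonomySession]::SyncHiddenList($site)
$site.Dispose()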
It seems one entry in TaxonomyHiddenList came from the content migration of TaxonomyHiddenList itself, and the second came in as an included dependency of the edited list item that refers to this new term. Might that be the case?
As the import log mentions, the import process on site collection 2 refers to TaxonomyHiddenList twice:
[5/30/2012 11:08:46 AM] Start Time: 5/30/2012 11:08:46 AM.
[5/30/2012 11:08:46 AM] Progress: Initializing Import.
[5/30/2012 11:08:46 AM] Progress: Starting content import.
[5/30/2012 11:08:46 AM] Progress: De-Serializing Objects to Database.
[5/30/2012 11:08:46 AM] [Folder] [Person] Progress: Importing
[5/30/2012 11:08:46 AM] [Folder] [Person] Verbose: Source URL: _catalogs/users/Person
[5/30/2012 11:08:46 AM] [Folder] [Item] Progress: Importing
[5/30/2012 11:08:46 AM] [Folder] [Item] Verbose: Source URL: Lists/TaxonomyHiddenList/Item
[5/30/2012 11:08:46 AM] [Folder] [myLibrary] Progress: Importing
[5/30/2012 11:08:46 AM] [Folder] [myLibrary] Verbose: Source URL: myLibrary/Forms/myLibrary
[5/30/2012 11:08:46 AM] [File] [sitemap.xml] Progress: Importing
[5/30/2012 11:08:46 AM] [File] [sitemap.xml] Verbose: Source URL: sitemap.xml
[5/30/2012 11:08:46 AM] [File] [sitemap.xml] Verbose: Destination URL: /sitemap.xml
[5/30/2012 11:08:46 AM] [ListItem] [xyz.pdf] Progress: Importing
[5/30/2012 11:08:46 AM] [ListItem] [xyz.pdf] Verbose: List URL: /myLibrary
[5/30/2012 11:08:46 AM] [ListItem] [xyz.pdf] Verbose: Deleting...
[5/30/2012 11:08:47 AM] [ListItem] [53_.000] Progress: Importing
[5/30/2012 11:08:47 AM] [ListItem] [53_.000] Verbose: List URL: /Lists/TaxonomyHiddenList
[5/30/2012 11:08:47 AM] [ListItem] [272_.000] Progress: Importing
[5/30/2012 11:08:47 AM] [ListItem] [272_.000] Verbose: List URL: /_catalogs/users
[5/30/2012 11:08:47 AM] Verbose: Performing final fixups.
[5/30/2012 11:08:47 AM] Progress: Import completed.
[5/30/2012 11:08:47 AM] Finish Time: 5/30/2012 11:08:47 AM.
[5/30/2012 11:08:47 AM] Duration: 00:00:00
[5/30/2012 11:08:47 AM] Total Objects: 11
[5/30/2012 11:08:47 AM] Finished with 0 warnings.
[5/30/2012 11:08:47 AM] Finished with 0 errors.
My migration process exports only content that is published. If I tweak the process, it works:
1. On Server1, create an item that uses the new term but is not published. This populates an entry in TaxonomyHiddenList. Let this term in TaxonomyHiddenList migrate in the next export/import cycle.
2. Before the next migration cycle, publish the item referring to the new term on Server1. After the next migration, the updated item goes to Server2 and there is no extra entry in TaxonomyHiddenList on Server2.
Reason: the import process on Server2 is probably handled by SharePoint as a transaction. The TaxonomyHiddenList import and the referring item's import then fall under a single transaction, so the item does not get a reference to the newly imported TaxonomyHiddenList entry and hence triggers the creation of a new SPItem in TaxonomyHiddenList.
If we don't want to use PowerShell for taxonomy migration and want to stick to DB migration, there is a way out: restart SQL Server just before the DB restore on the target, so that no live connection to the managed metadata DB exists at the time of restore.
If we instead try to point the old service to a new DB (on the target, where the new DB is restored from the source metadata DB), some mappings from term sets to list columns may be lost.
Wednesday, May 16, 2012
Microsoft.SharePoint.Deployment SPExportSettings
Hi
The ExportChangeToken property in SPExportSettings tells the SharePoint APIs the point in the past from which an incremental export should start.
CurrentChangeToken (read-only) is set internally and is referred to by future exports as a milestone for the export process. Is this property set when some change occurs in the site, or when the incremental batch completes?
I have a batch which runs perfectly under ideal conditions.
I want the value of CurrentChangeToken not to be persisted in the system when I run this batch manually,
so that the automatic batch process keeps running as-is and my manual run is free to export without any effect on the automatic one.
Please help.
1. Is it OK to create my own persisted object store of SPChangeToken values, and let the batch use it to read the last value and add a new token with the CurrentChangeToken value? The manual process would not update the custom persisted store, so the last token in the store is always the one from the automatic process. (A rough sketch of this idea follows below.)
2. I have one last resort if I don't find direct APIs in the Deployment namespaces: the SPSite.GetChanges(SPChangeToken, SPChangeToken) method.
In this case I want to know whether the Microsoft.SharePoint.Deployment.SPExportSettings.CurrentChangeToken (read-only) property is set internally after the export has run, or whether it is independent of when the export runs and depends only on when the actual change occurs.
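To make option 1 concrete, here is its rough shape (a sketch only; the URL and paths are placeholders, and the $manualRun flag and Save-TokenToCustomStore helper are hypothetical names of mine, not real APIs):
$settings = New-Object Microsoft.SharePoint.Deployment.SPExportSettings
$settings.SiteUrl = "http://myserver/sites/source"
$settings.ExportMethod = [Microsoft.SharePoint.Deployment.SPExportMethodType]::ExportChanges
$settings.ExportChangeToken = $lastTokenFromCustomStore   # last token saved by the automatic batch
$settings.FileLocation = "C:\Exports"
$settings.BaseFileName = "incremental.cmp"
$export = New-Object Microsoft.SharePoint.Deployment.SPExport($settings)
$export.Run()
# After Run(), CurrentChangeToken marks where this export stopped.
# Only the automatic batch persists it; a manual run skips this step.
if (-not $manualRun) { Save-TokenToCustomStore $settings.CurrentChangeToken }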
Reply 1
Hi
Under content migration scenarios from staging to production on a daily basis, how do the Microsoft.SharePoint.Deployment APIs perform better than http://technet.microsoft.com/en-us/library/ee721058.aspx ?
Friday, March 23, 2012
SharePoint 2010 | Login over HTTPS from HTTP pages
Hello Friends
Did you ever have a chance to work on a login control for SharePoint which sends the user's credentials over SSL while the rest of the traffic stays non-secured? I found a few good links which suggest creating our own custom cookie handler.
The Microsoft SharePoint team has written their own cookie handler specific to SharePoint which does not allow an authentication token generated on a secured connection to be used on a non-secured one. First, do you see any harm in overriding this behavior of SharePoint's cookie handler with a custom one? Second, I was able to transfer the cookies back using the approach in the links below, but the current context was lost and the user remained logged off.
References:
- http://www.sp2010hosting.com/Lists/Posts/Post.aspx?ID=5
- http://blogs.visigo.com/chriscoulson/mixed-http-and-https-content-with-sharepoint-2010/
- http://www.sharepointconfig.com/2010/04/partial-ssl-sharepoint-sites-login-over-http-from-http-pages/
Steps I followed:
1. Extended the current web application to SSL and made it work like the default zone application (including resource files, web.config entries, manually deployed DLLs, etc.).
2. Set the postback URL for the login button.
3. Made the URL rewrite entries as suggested in both zones (these are spread across the links above; REQUEST_METHOD is in addition to the HTTPS ON rule). URL rewriting can be done very easily with an IIS extension provided by Microsoft.
4. Created the custom handler as suggested and replaced the SharePoint one with it.
5. Set "require SSL" for cookies to false, so that they can also be used over non-SSL connections.
Reply 1
The two bindings should be defined under the same zone; only then does it work. I was putting the alternate access mappings under different zones.
So with the custom cookie handler and both endpoints under the same zone, it works! A rough sketch of the same-zone mapping follows below.
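For illustration, this is roughly how both bindings end up in the same zone (a minimal sketch; the URLs are placeholders):
# The public URL of the web application stays HTTP in the Default zone;
# the HTTPS binding is added as an internal URL of the SAME zone,
# rather than being extended into a separate zone.
New-SPAlternateURL -Url "https://portal.contoso.com" -WebApplication "http://portal.contoso.com" -Zone Default -Internal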
Reply 2
Hello hemantrhtk
Thank you for your post.
This is a quick note to let you know that we are performing research on this issue.
Thanks,
Pengyu Zhao