Friday, February 8, 2013

Asynchronous call to a web service / WCF using jQuery

In one of our implementations, we had to make asynchronous calls to multiple SharePoint lists to keep the UI responsive and improve the user experience.

For better performance and a more manageable solution, we exposed a custom web service that takes care of all the data manipulation and exposes methods returning the desired result sets.

Here is a proof-of-concept snippet that might be helpful for calling web services using jQuery.ajax():

$.ajax({
    // type: Type,       // GET, POST, PUT or DELETE verb
    url: Uri,            // location of the service
    // data: Data,       // data sent to the server
    dataType: DataType,  // expected data format from the server (e.g. "json")
    cache: false,        // no-cache
    success: function (msg) { // on successful service call
        ServiceSucceeded(msg);
    },
    error: ServiceFailed // when the service call fails
});

Full Sample Code :
asynchronous-call-to-web-service_1
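
For context, here is a rough sketch of what the server side of such a call could look like. This is not the actual service we built; the class, method, and list names are illustrative, and it assumes an ASMX service marked so that ASP.NET AJAX can return JSON to the jQuery call above.

using System.Web.Script.Services;
using System.Web.Services;
using Microsoft.SharePoint;

[WebService(Namespace = "http://tempuri.org/")]
[ScriptService] // allows the methods to be called from client-side script
public class ListDataService : WebService
{
    // Illustrative method: returns item titles from a list so the jQuery
    // success callback receives a simple JSON array
    [WebMethod]
    [ScriptMethod(ResponseFormat = ResponseFormat.Json)]
    public string[] GetItemTitles(string listName)
    {
        SPWeb web = SPContext.Current.Web;
        SPList list = web.Lists[listName];
        SPListItemCollection items = list.Items;

        string[] titles = new string[items.Count];
        for (int i = 0; i < items.Count; i++)
        {
            titles[i] = items[i].Title;
        }
        return titles;
    }
}

On the client side, the call above would then use type: "POST", contentType: "application/json; charset=utf-8" and dataType: "json" so that the ASMX script method serializes its result as JSON.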

Saturday, January 12, 2013

finding features in a content database in SharePoint 2010 using PowerShell or tools

Sometimes we have a feature ID and want to know all the places where it is active, even if only as an orphaned or dummy reference. The scripts and tools below might be helpful; a rough server object model sketch follows them.

http://get-spscripts.com/2011/06/removing-features-from-content-database.html

or

http://featureadmin.codeplex.com/downloads/get/290833

or

http://archive.msdn.microsoft.com/WssAnalyzeFeatures
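
If you prefer the server object model to the scripts and tools above, a rough C# sketch of the same idea looks like this (the site collection URL and the feature GUID are placeholders; to cover a whole content database you would loop over its Sites collection the same way):

using System;
using Microsoft.SharePoint;

class FindFeatureUsage
{
    static void Main()
    {
        // Placeholder feature ID: replace with the GUID you are looking for
        Guid featureId = new Guid("00000000-0000-0000-0000-000000000000");

        // Placeholder URL: replace with your site collection
        using (SPSite site = new SPSite("http://sharepoint/sites/example"))
        {
            // Site collection scoped activation
            if (site.Features[featureId] != null)
                Console.WriteLine("Active at site collection: " + site.Url);

            // Web scoped activation, checked on every web in the site collection
            foreach (SPWeb web in site.AllWebs)
            {
                try
                {
                    if (web.Features[featureId] != null)
                        Console.WriteLine("Active at web: " + web.Url);
                }
                finally
                {
                    web.Dispose(); // webs returned by AllWebs must be disposed
                }
            }
        }
    }
}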

 

Tuesday, January 8, 2013

SharePoint 2010 Enterprise Search | SharePoint Crawl Exceptional Behaviour

Security issues to take care of while configuring SharePoint Search for public-facing portals.

When the search crawler (which calls SPSCrawl.asmx and sitedata.asmx) comes to a site, how does it know whether the target is a SharePoint site or an ordinary website?

There is a custom header defined by Microsoft on SharePoint web applications: name MicrosoftSharePointTeamServices, with a value like 14.0.0.4762.

This header tells the crawler to treat the target as a SharePoint site and to dig all the way down to item level in SharePoint lists.



What if this custom header is removed on the target? The search crawler will then crawl only down to the list level. If you watch the crawl in Fiddler, you can observe that no calls are made to SPSCrawl.asmx or sitedata.asmx; the web application is no longer treated as a SharePoint website by the search crawl.

Now, to keep your site secure, you want to hide these custom headers from attackers, but the search crawler needs them.

There is a way out: target the search crawl at a different web application than the public-facing one, and on the public-facing web application use <clear /> under the HTTP response headers configuration to hide internal server information from the outside world.
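
As a quick sanity check of what the outside world actually sees, a small sketch like the following (the URL is a placeholder) requests the public-facing site and prints the MicrosoftSharePointTeamServices header if it is still being returned:

using System;
using System.Net;

class CheckSharePointHeader
{
    static void Main()
    {
        // Placeholder URL: point this at the public-facing web application
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://www.example.com/");
        request.Method = "HEAD";

        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            string header = response.Headers["MicrosoftSharePointTeamServices"];
            Console.WriteLine(header ?? "MicrosoftSharePointTeamServices header not present");
        }
    }
}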


Tuesday, November 20, 2012

Show / hide content on a page using jQuery

On page load, the first div ("div1") is shown and the rest are hidden. On subsequent link clicks, the div whose id matches the class of the clicked list item is shown and the other divs are hidden.

// Assumed markup: a #nav list whose <li> elements carry a class matching the id
// of a div inside #container, e.g. <li class="div1"><a href="#">Show div1</a></li>
jQuery(function () {
    $('#nav a').click(function () {
        // the class of the clicked link's parent <li> names the div to show
        var myClassLinked = $(this).parent().attr('class');
        $('#container > :not(#' + myClassLinked + ')').hide();
        $('#' + myClassLinked).fadeIn();
    });
});

$(document).ready(function () {
    // initial state: show only div1, hide everything else in the container
    $('#container > :not(#div1)').hide();
    $('#div1').fadeIn();
});

Full example code

Sunday, October 14, 2012

Feature activation code using PowerShell does not pick up appSettings values

This is a very common scenario where deployment guides end up useless for automation: features activate fine from the UI in site / site collection settings, but fail when activated using PowerShell. The reason is that a feature receiver launched from PowerShell runs inside powershell.exe rather than the IIS worker process, so plain ConfigurationManager calls read the PowerShell host configuration instead of the web application's web.config.

Here is the solution:


// Requires: using System.Configuration; using System.Web.Configuration;
// using Microsoft.SharePoint; using Microsoft.SharePoint.Administration;
// Inside the FeatureActivated override of a web-scoped feature receiver.
// Note: do not dispose properties.Feature.Parent - SharePoint owns that object.
SPWeb web = properties.Feature.Parent as SPWeb;
SPWebApplication webApp = web.Site.WebApplication;
Configuration config = WebConfigurationManager.OpenWebConfiguration("/", webApp.Name);

// App settings are retrieved this way
string someValue = config.AppSettings.Settings["someAppSettingKey"].Value;

// Connection string settings are retrieved this way
string connectionString =
    config.ConnectionStrings.ConnectionStrings["someDBConnectionString"].ConnectionString;
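
For contrast, a plain ConfigurationManager lookup like the one below is what typically fails when the feature is activated from PowerShell, because it reads the host process configuration (powershell.exe.config) rather than the web application's web.config; the key name is just a placeholder:

// Usually returns null under PowerShell activation - the host process is powershell.exe, not w3wp.exe
string value = System.Configuration.ConfigurationManager.AppSettings["someAppSettingKey"];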


 

Reference :

Read web.config on FeatureActivated event while activating Feature using PowerShell

 

Wednesday, October 10, 2012

Get Dependent files from HTML response

Hi

I am getting the response of a page like this:

// Placeholder values fill the gaps ("...") from the original fragment
// Requires: using System.IO; using System.Net; using System.Text;
string url = "http://example.com/page.aspx"; // placeholder page URL

HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
request.Credentials = CredentialCache.DefaultCredentials; // placeholder credentials
request.Method = "GET";                                   // placeholder verb
request.UserAgent = "Mozilla/5.0";                        // placeholder user agent

HttpWebResponse response = (HttpWebResponse)request.GetResponse();
Stream receiveStream = response.GetResponseStream();
StreamReader readStream = new StreamReader(receiveStream, Encoding.UTF8);
string pageContent = readStream.ReadToEnd();

Now, out of this string pageContent, I want to extract all the sub-requests (dependent files) and request each of them. Is there something better than Html Agility Pack for this?

Reply 1 By http://social.msdn.microsoft.com/profile/joel%20engineer/?ws=usercard-mini

I usually read it into an HtmlDocument class and then use either GetElementById() or GetElementsByTagName().

string pageContent;
// load the page HTML into the WebBrowser control's document to get an HtmlDocument
webBrowser1.Document.Write(pageContent);

jdweng

 

Reply 2 


You should try http://htmlagilitypack.codeplex.com/

I am searching for something better, dear.
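
For what it is worth, until something better turns up, a minimal Html Agility Pack sketch for pulling the dependent resource URLs out of pageContent could look like this (it assumes the HtmlAgilityPack assembly is referenced; the XPath only covers images, scripts, and stylesheets):

using System;
using HtmlAgilityPack;

class DependentFileExtractor
{
    // pageContent is the HTML string read from the response stream above
    static void PrintDependentUrls(string pageContent)
    {
        HtmlDocument doc = new HtmlDocument();
        doc.LoadHtml(pageContent);

        // img/src, script/src and link/href cover most dependent files
        HtmlNodeCollection nodes = doc.DocumentNode.SelectNodes(
            "//img[@src] | //script[@src] | //link[@href]");

        if (nodes == null)
            return; // SelectNodes returns null when nothing matches

        foreach (HtmlNode node in nodes)
        {
            string url = node.GetAttributeValue("src", null)
                         ?? node.GetAttributeValue("href", null);
            Console.WriteLine(url);
        }
    }
}

Each extracted URL can then be resolved against the page URL and fetched with another HttpWebRequest, the same way the page itself was requested.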

Tuesday, September 25, 2012

Dummy Workflow References

Hi

 


    • For my document library, WorkflowAssociations.Count is zero.

    • For this document library, EnableModeration is false.

    • Still, I can see a column in the document library named "Approval Status" (internal name "_ModerationStatus").



When I try to set EnableModeration to true for this document library, I get an exception.

It seems that at some point in the past a workflow was enabled on this list, or EnableModeration was true, but turning these off again did not remove the Approval Status column.

How can I fix this document library without deleting it, so that enabling workflows or setting EnableModeration = true in the future does not throw an exception?
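
For reference, a small sketch like the one below (the site URL and library title are placeholders) can be used to confirm the state described above before attempting any fix:

using System;
using Microsoft.SharePoint;

class InspectLibraryState
{
    static void Main()
    {
        // Placeholders: site collection URL and document library title
        using (SPSite site = new SPSite("http://sharepoint/sites/example"))
        using (SPWeb web = site.OpenWeb())
        {
            SPList library = web.Lists["Documents"];

            Console.WriteLine("WorkflowAssociations.Count: " + library.WorkflowAssociations.Count);
            Console.WriteLine("EnableModeration: " + library.EnableModeration);

            // Check whether the orphaned Approval Status column is still present
            SPField moderationField = library.Fields.TryGetFieldByStaticName("_ModerationStatus");
            Console.WriteLine("_ModerationStatus present: " + (moderationField != null));
        }
    }
}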