Showing posts with label powershell. Show all posts

Monday, April 8, 2019

Create terms using SharePointPnP.PowerShell

Line 13: example of how to connect to SharePoint Online using SharePointPnP.PowerShell
Line 19: example of how to create a site collection level term group
Line 23: a possible alternative to "Import-PnPTermSet" using "Import-Csv", "Export-Csv" and "New-PnPTermSet"

sample here
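Since the linked sample may not be handy, below is a minimal sketch of the same three points, assuming the legacy SharePointPnP.PowerShell module; the site URL, group/term set names and CSV path are placeholders, not the original script:

# connect to SharePoint Online with SharePointPnP.PowerShell
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/demo" -UseWebLogin

# create a site collection level term group
$group = New-PnPTermGroup -GroupName "Site Collection - contoso.sharepoint.com-sites-demo"

# alternative to Import-PnPTermSet: build the term set and terms from a CSV
$termSet = New-PnPTermSet -Name "Departments" -TermGroup $group
Import-Csv "C:\gm\terms.csv" | ForEach-Object {
    New-PnPTerm -Name $_.Name -TermSet $termSet -TermGroup $group
}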

Wednesday, March 8, 2017

Create database and content using psql





This post is related to https://hemantrohtak.blogspot.com/2016/03/is-entity-framework-best-performing.html


Sunday, February 26, 2017

Start/Stop/Get details for Windows Services

If you are juggling multiple assignments across projects, or have a laptop with limited resources, you may want to manage the Windows services on it with Windows PowerShell.
For example, you can keep a .csv handy for each assignment: when you start working on that assignment, simply stop the other bulky Windows services and start the ones you are interested in.

Some helpful commands:


#Get-Service | Select-Object Status -Unique
#Get-Service | Where-Object {$_.Status -eq "Running"}

# export details of all windows services to a csv:
Get-Service | Select-Object Name, CanPauseAndContinue, CanShutdown, CanStop, DisplayName, MachineName, ServiceName, ServiceHandle, Status, ServiceType, StartType, Site, Container, @{Name='RequiredServices';Expression={[string]::Join(";", ($_.RequiredServices.Name))}}, @{Name='DependentServices';Expression={[string]::Join(";", ($_.DependentServices.Name))}}, @{Name='ServicesDependedOn';Expression={[string]::Join(";", ($_.ServicesDependedOn.Name))}} | Export-Csv c:\gm\servs.csv;

#After analysing the output, say you want to start 3 services and stop 2, but leave one untouched while still keeping it in the csv. Create a csv something like:
#(consider the dependencies listed in the output above to decide the order)
#"Name","Start","Stop"
#"MSSQL$HK2012SQL","TRUE","FALSE"
#"SQLAgent$HK2012SQL","TRUE","FALSE"
#"postgresql-x64-9.5","TRUE","FALSE"
#"wuauserv","FALSE","TRUE"
#"wlidsvc","FALSE","TRUE"
#"RpcSs","FALSE","FALSE"

Import-Csv "C:\gm\input.csv" | ForEach-Object { Write-Host $_.Name; if([bool]::Parse($_.Start)){ Start-Service $_.Name }; if([bool]::Parse($_.Stop)){ Stop-Service $_.Name }; }

Sample commands at: bitbucket.org/hemantup/orm/src/HEAD/Scripts/services.txt

But if you care about security and want tighter control over what is happening on your laptop, the above might not be sufficient.
You might need to dive deeper into processes, for example:

Get-Process | Export-csv "c:\gm\pr.csv"
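As a possible next step (purely illustrative; the calculated column name and output path below are my own, not from the post), sort the processes by memory so the heavy ones stand out before you decide which services to stop:

Get-Process |
    Sort-Object WorkingSet64 -Descending |
    Select-Object -First 10 Name, Id, @{Name='WorkingSetMB';Expression={[math]::Round($_.WorkingSet64/1MB,1)}} |
    Export-Csv "c:\gm\pr-top10.csv" -NoTypeInformation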

Thursday, January 1, 2015

Send bulk email using gmail or any other server



1. Visit https://www.google.com/settings/security/lesssecureapps and temporarily enable access for less secure apps.

2. Verify that the machine where PowerShell is available has the latest version of the .NET Framework installed.

3. Verify that "Send-MailMessage" is available in PowerShell.

(in the PowerShell window, type: Get-Command Send-MailMessage)

4. Export your gmail (or any other email account) contacts as a csv. Say it has the contact name column as "name" and the email column as "email".

5. Create the sample html you want to send, say some dummy html.

6. Here is a sample ps1 using "Send-MailMessage" with gmail:

sample script
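In case the linked script is not at hand, here is a minimal sketch along the same lines (the file paths, addresses and sleep interval are placeholders, not the original script):

# a minimal sketch, assuming a contacts csv with "name" and "email" columns (step 4)
$cred = Get-Credential                          # your gmail address and password / app password
$body = Get-Content "C:\gm\message.html" -Raw   # the dummy html from step 5

Import-Csv "C:\gm\contacts.csv" | ForEach-Object {
    Send-MailMessage -From "you@gmail.com" -To $_.email `
        -Subject ("Hello " + $_.name) -Body $body -BodyAsHtml `
        -SmtpServer "smtp.gmail.com" -Port 587 -UseSsl -Credential $cred
    Start-Sleep -Seconds 30                     # keep well inside the limits from step 7
}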

7. Set the sleep time in your PowerShell script with respect to the Gmail send message limits: https://support.google.com/a/answer/166852?hl=en and https://support.google.com/a/answer/175121?hl=en

8. Don't forget to disable less secure apps at https://www.google.com/settings/security/lesssecureapps after you are done.







Monday, April 14, 2014

Should I Pay Money to Google AdSense? Prove Using PowerShell

Last week I finalized a new hosting plan. It offers me $100 worth of credit for Google AdSense and similar programs. This was the first time I looked at Google AdSense from the angle of an advertiser. (Probably I also got these codes when I last renewed my hosting, but I never used them and was not even aware of what they were at the time.)

The first question that comes to me: are they fooling us? Is this $100 really worth something? After all the brainstorming, my conclusion is: if you are selling a product, invest only in a program that charges you per actual product sale. Never invest in Google AdSense and similar programs that only take responsibility for bringing the user to your door on the web; if they don't promise / convert into an actual sale, don't pay them.

Well, I am not just saying this in the air. I have valid points to prove it; if you agree with what is mentioned below, you must admit what I said above:

Thursday, April 3, 2014

How to Post on all of your Facebook Groups Using PowerShell?

The PowerShell script is saved as text here (auth codes removed).


#1. This PowerShell script demonstrates how to use the Facebook APIs in PowerShell.
#2. How to load a newer-framework dll. (Details: How to load .Net 4 compiled dll in PowerShell?)
#3. How to generate an Auth Access Token in Facebook. (Details: How to generate Facebook token ?)
#4. How to read an online xml file, which may be a sitemap. (How to read online xml using PowerShell?  $onlineXMLSitemap = "http://sharepoint.asia/sitemap.xml" ;$doc = New-Object System.Xml.XmlDocument;$doc.Load($onlineXMLSitemap);)
#5. How to run C# code from PowerShell directly. (Add-Type -ReferencedAssemblies $mAs -TypeDefinition $cSource -Language CSharp)
#6. How to solve the overloaded-function problem in PowerShell. (The Post method of the Facebook API has two overloads, one with Path, one without. We are using the first; that is why we called C# code from PowerShell. See the sketch below.)
#7. How to select random entries in a collection. ($doc.urlset.url | Get-Random -count 1 | ForEach-Object)
#8. Business purpose resolved: randomly post on Facebook in all my groups. This script picks one random url from the provided Sitemap.XML and posts it to 100 FB groups and your own wall. You can reduce this number though: $Groups.data | Get-Random -count 100 | ForEach-Object {
#Related terms: Bulk Post on Facebook, Post on All Groups, Use Facebook API in PowerShell, Use Facebook API in C#
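To illustrate points 5 and 6 above (a sketch only: the class and method names here are hypothetical, and the real script passes the compiled helper what it needs so it can call the exact Post overload it wants), a tiny C# type can be compiled with Add-Type and invoked from PowerShell:

$cSource = @"
public static class FbHelper
{
    // in the real script this would call the specific overload, e.g. client.Post(path, payload)
    public static string PostTo(string path, string message)
    {
        return path + " <- " + message;
    }
}
"@
Add-Type -TypeDefinition $cSource -Language CSharp
[FbHelper]::PostTo("/me/feed", "hello from PowerShell")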


How to load .Net 4 compiled dll in PowerShell?

Do you get an error like:

Exception calling "LoadFile" with "1" argument(s): "This assembly is built by a runtime
newer than the currently loaded runtime and cannot be loaded. (Exception from HRESULT:
0x8013101B)"
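This error (HRESULT 0x8013101B) usually means PowerShell 2.0 is running on CLR 2.0 and cannot load an assembly built for .NET 4. A commonly used workaround, sketched below under that assumption, is a powershell.exe.config that allows the .NET 4 runtime; on PowerShell 3.0 or later (which already runs on CLR 4) this should not be needed. Back up any existing config file and run elevated.

$config = @"
<?xml version="1.0"?>
<configuration>
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <supportedRuntime version="v4.0.30319" />
    <supportedRuntime version="v2.0.50727" />
  </startup>
</configuration>
"@
$config | Out-File "$PSHOME\powershell.exe.config" -Encoding ascii
# restart PowerShell, then try [System.Reflection.Assembly]::LoadFile("C:\path\to\your.dll") again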

How to read online xml using PowerShell?

This post demonstrates how you can read an xml file and manipulate it using PowerShell.
# I have taken this site's sitemap as sample xml

$onlineXMLSitemap = "https://hemantrohtak.blogspot.com/sitemap.xml" ;
$doc = New-Object System.Xml.XmlDocument;
$doc.Load($onlineXMLSitemap);

# you could manipulate this xml just like an ordinary xml in object model
#below mentioned: Print a random url from my sitemap

$doc.urlset.url | Get-Random -count 1 | ForEach-Object {

Write-Host $_.loc ;

}


Saturday, February 22, 2014

Can I stop my screen from locking due to inactivity?

Tonight I was watching a movie on my TV, broadcast via HDMI from my laptop.

It was annoying to have to keep my laptop from going to sleep every now and then. (Somehow I was not able to change the power management settings to never sleep, I don't know why!)



So I quickly wrote this PowerShell script to keep clicking at random places on my screen:
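A minimal sketch of the same idea (this is not the original script; it substitutes a harmless periodic keypress for the random clicks, and the interval is just an example):

$shell = New-Object -ComObject "WScript.Shell"
while ($true) {
    $shell.SendKeys("{SCROLLLOCK}")   # tap Scroll Lock so the session never looks idle
    Start-Sleep -Milliseconds 200
    $shell.SendKeys("{SCROLLLOCK}")   # tap it again to restore its original state
    Start-Sleep -Seconds 240          # repeat well inside the screen-lock timeout
}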

Sunday, September 1, 2013

List View Threshold



Sometimes you might face a situation where you get this error:

9239 items (list view threshold is 5000)

The number of items in list exceeds the list view threshold, which is 5000 items.  Tasks that cause excessive server load (such as those involving all list items) are currently prohibited.

Even though it is recommended to stay within the prescribed limits in your queries,
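One common way to stay under the threshold in code (a sketch of the general technique only, reusing the example site and list names from the SPQuery post further down this page; it is not necessarily what the rest of this post describes) is to page through the list with SPQuery.RowLimit and ListItemCollectionPosition:

$spWeb = Get-SPWeb "http://mysitecollection:100/myWeb"
$sList = $spWeb.GetList("/myWeb/Lists/myList")

$spQuery = New-Object Microsoft.SharePoint.SPQuery
$spQuery.RowLimit = 2000                      # well under the 5000 item threshold per round trip
do {
    $items = $sList.GetItems($spQuery)
    Write-Host ("fetched " + $items.Count + " items")
    $spQuery.ListItemCollectionPosition = $items.ListItemCollectionPosition
} while ($spQuery.ListItemCollectionPosition -ne $null)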

Wednesday, June 5, 2013

cache user accounts

For an FBA web application using claims, we should specifically use FBA users like 0#.f|abc|username for the SuperReader and SuperUser accounts.

Choose two such accounts, preferably new ones.

Visit the user policy for the web application under Central Admin.

With the option "all zones", give SuperUser Full Control and SuperReader Read rights.

Set SuperReader and SuperUser for caching with the below mentioned PowerShell commands:

$myWebApp = Get-SPWebApplication -Identity "<WebApplication>"
$myWebApp.Properties["portalsuperuseraccount"] = "<SuperUser>"
$myWebApp.Properties["portalsuperreaderaccount"] = "<SuperReader>"
$myWebApp.Update()

 

Now do an IISRESET.

Saturday, June 1, 2013

Offset Today in SPQuery Example

Consider a hypothetical situation:

You have no archival / expiration policies set up for your list, and every now and then you want to delete everything older than 30 days. What will you do?

This sample PowerShell solves that purpose and serves as an example of the OffsetDays parameter in a CAML query:

$WebURL = "http://mysitecollection:100/myWeb";
$spWeb = Get-SPWeb -Identity $WebURL;

$sList = $spWeb.GetList("/myWeb/Lists/myList");
Write-Host $sList.Title;
$camlQuery = "<Where><Lt><FieldRef Name='Modified' /><Value Type='DateTime'><Today OffsetDays='-30' /></Value></Lt></Where>";
$spQuery = New-Object Microsoft.SharePoint.SPQuery;
$spQuery.Query = $camlQuery;
$spQuery.ViewFields = "<FieldRef Name='ID' />";

$ToBeDeleted = $sList.GetItems($spQuery);
$i = 0;
$ToBeDeleted | ForEach-Object {
    $i++;
    Write-Host $_.ID -ForegroundColor Cyan;

    $deadItem = $sList.GetItemById($_.ID);

    $deadItem.Delete();   # if you wish, call $deadItem.Recycle() instead to send it to the recycle bin
}

Write-Host $i;

 

 

Friday, May 31, 2013

Include specific folder content in BlobCache SharePoint

It is quite possible to instruct the blob cache framework to include only specific SP document libraries. (I have mentioned document libraries specifically because, as you must be aware, it does not work well with custom folders created using Designer etc.)

Suppose I have a few heavy .jpeg and .dhx files in a SPDocumentLibrary called HeavyContent.

I want to cache all .pdf, .doc, .docx, .flv, .f4v, .swf in my web application, but .jpeg and .dhx only from the doc lib "HeavyContent".

Search for a tag similar to the one below in web.config and edit the path parameter:

<BlobCache location="C:\myBlob\Public" path="(\.(doc|docx|pdf|swf|flv|f4v)$|HeavyContent.*\.(dhx|jpeg)$)" maxSize="10" max-age="86400" enabled="true" />

This will mark the items matched by the path parameter regex to be cached for 24 hrs (max-age=86400 seconds).

\.(doc|docx|pdf|swf|flv|f4v)$    <---->  everything which ends with extension mentioned

|                                            <---->   or

HeavyContent.*\.(dhx|jpeg)$ <----> content inside HeavyContent ending with dhx or jpeg



Disclaimer : I am not sure how performance will be affected by making path parameter complex.


Mime Types with BlobCache

Now suppose your IIS does not support some mime types like dhx and f4v.  You have two options :

Option 1 : add mime type at server level in IIS.

Option 2: use browserFileHandling = "Nosniff" in the web.config section discussed above. [Ref: http://blogs.msdn.com/b/ie/archive/2008/09/02/ie8-security-part-vi-beta-2-update.aspx]

How to flush the BLOB cache



  1. IISRESET [Recommended: increase the startup and shutdown time limits on the web application to accommodate the extra time it takes to initialize or serialize the cache index for very large BLOB caches]

  2. PowerShell:



Write-Host -ForegroundColor White " - Enabling SP PowerShell cmdlets..."
If ((Get-PSSnapin | ?{$_.Name -eq "Microsoft.SharePoint.PowerShell"}) -eq $null)
{
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue | Out-Null
}

# flush the BLOB cache of every web application in the farm
$webAppall = Get-SPWebApplication
foreach ($webApp in $webAppall) {
    [Microsoft.SharePoint.Publishing.PublishingCache]::FlushBlobCache($webApp)
    Write-Host "Flushed the BLOB cache for:" $webApp.Url
}
3. Change enabled to false in web.config, change the location parameter, then set enabled back to true.



Saturday, January 12, 2013

finding features in a content database in SharePoint 2010 using PowerShell or tools

Sometimes we have a feature id and we want to know the places where it is active, even as a dummy one. The links below might be helpful, and there is a small PowerShell sketch after them.

http://get-spscripts.com/2011/06/removing-features-from-content-database.html

or

http://featureadmin.codeplex.com/downloads/get/290833

or

http://archive.msdn.microsoft.com/WssAnalyzeFeatures
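If you only need a quick check rather than the full scripts and tools above, a rough sketch like the one below can walk the farm and report where a feature is activated (the feature id is a placeholder; run it from the SharePoint 2010 Management Shell):

$featureId = [Guid]"00000000-0000-0000-0000-000000000000"   # put your feature id here
Get-SPWebApplication | Get-SPSite -Limit All | ForEach-Object {
    if ($_.Features[$featureId]) { Write-Host "Site collection:" $_.Url }
    $_ | Get-SPWeb -Limit All | ForEach-Object {
        if ($_.Features[$featureId]) { Write-Host "Web:" $_.Url }
    }
}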

 

Friday, June 1, 2012

SPWebService.CollectSPRequestAllocationCallStacks property

There was a time when you had to restart a lot of stuff after editing registry entries like HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\HeapSettings SPRequestStackTrace = 1

to get to the root cause of memory leaks and all that stuff.

Now you have property  SPWebService.CollectSPRequestAllocationCallStacks .  This is a diagnostic setting that facilitates the debugging of leaks of SPSite and SPWeb objects. If not disposed properly these objects can hold onto large amounts of memory. By enabling this setting, traces in the trace log that report leaks will contain callstacks. However, callstack collection is expensive so this should only be enabled during a brief diagnostic period.

Powershell :

$MySvc = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
$MySvc.CollectSPRequestAllocationCallStacks = $true   # or $false to turn it back off
$MySvc.Update()

 

Monday, May 28, 2012

Microsoft.SharePoint.Taxonomy Migration of Taxonomy

Hi



I have server1 with Microsoft.SharePoint.DLL version 14.0.5123.5000 in farm1.

  1. I took a Content DB backup of Site Collection SC1 on this server 1, along with a Taxonomy DB backup on server 1.

  2. I restored these DBs on the Server2 (Microsoft.SharePoint.DLL version 14.0.4762.1000) SQL DB.

  3. Created a new Taxonomy Meta Service on Farm2 with the restored MetaService DB backup.

  4. Created a new web application and site collection on Farm2 with the restored Content DB from Server1.

  5. Manually ran the Taxonomy Update Scheduler Timer Job.

  6. When I create a new item / edit an item on Server 2, a new entry is created in TaxonomyHiddenList and a new entry under term store "System".

  7. Even though I have the same term present in a term set in my custom term store!

The ideal behavior should be: on server 2, the new item should be tagged to my custom term in the predefined term set and term store.



Please guide me on which step I am missing.

Reply 1

Let's try a different approach. Instead of trying to create the service from a backup, we can export the data and import it into a new metadata service:

  1. Delete the Managed Metadata service in Farm 2.

  2. Create a new Managed Metadata service in Farm 2. Note the application pool account.

  3. In farm 1, from an elevated SharePoint Management Console window, run the Export-SPMetadataWebServicePartitionData cmdlet. This will export the managed metadata into a cabinet file (.CAB):

Export-SPMetadataWebServicePartitionData -Identity "http://sharepointsite" -ServiceProxy "ManagedMetadata Service Proxy Name" -Path "C:\Temp\ManagedMetadata.cab"

  4. Copy the .CAB file from farm 1 to a folder on the farm 2 SQL Server. Share this folder and give Everyone Full Control sharing permission (we will clean this up later).

  5. In farm 2 SQL Server Management Studio, give the Managed Metadata application pool account (from step 2) the bulkadmin SQL Server role (expand the server instance -> Expand Security -> Find the account -> Right click, Properties -> Server Roles page -> Check bulkadmin -> Click OK).

  6. On the farm 2 SharePoint server, from an elevated SharePoint Management Console window, run the Import-SPMetadataWebServicePartitionData cmdlet to import the data:

Import-SPMetadataWebServicePartitionData -Identity "http://sharepointsitefarm2" -ServiceProxy "New Managed Metadata Service Proxy Name" -Path "\\SQLServerName\Share\ManagedMetadata.cab"

You'll need to update the URLs and names to reflect your environments.






JASON WARREN



Reply 2



Hello Jason



Thanks for your great help. With the migration commands you shared, I was also getting the same problem.

The source of the problem in my case was:

After recreation / DB restoration / migration of the taxonomies, a few of the properties in my User Profiles had lost their mappings; I had to manually remap them to get things working.

Reply 3 

After everything, I am facing the below mentioned (to sum up, I will rewrite all the steps I did):



  1. I have server1 with Microsoft.SharePoint.DLL version 14.0.5123.5000 in farm1. On Farm2 the Microsoft.SharePoint.DLL version is 14.0.4762.1000.

  2. Farm1 and Farm2 are exact replicas (created from the same content DB), and the taxonomies are also the same, with the same GUIDs for the term store, term sets and terms.

  3. I created a new term under my custom term set on Farm1, then used Export-SPMetadataWebServicePartitionData and Import-SPMetadataWebServicePartitionData (OverwriteExisting) to migrate the term changes to Farm2.

  4. Edited existing content in Site Collection 1 to refer to the new term.

  5. Then I used the Deployment APIs to take an incremental export of site collection 1.

  6. The import process on Site Collection 2 on farm 2 creates two entries in TaxonomyHiddenList!! These two entries have the same Title, IdForTermStore, IdForTerm, IdForTermSet etc. The only difference is the SPItem ID.

  7. All the functionality of my site works fine. But I am not sure why these two entries should be there in TaxonomyHiddenList!

After each step above I ran the Taxonomy Update Scheduler Timer Job on the Site Collection 2 web application. A manual TaxonomySession.SyncHiddenList also did not help.

It seems one entry in TaxonomyHiddenList came from the content migration of TaxonomyHiddenList itself, and the second entry came in as an included dependency of the edited list item which refers to this new term. Might that be the case?



As the import log mentions, the import process on Site Collection 2 refers to TaxonomyHiddenList twice:

[5/30/2012 11:08:46 AM] Start Time: 5/30/2012 11:08:46 AM.
[5/30/2012 11:08:46 AM] Progress: Initializing Import.
[5/30/2012 11:08:46 AM] Progress: Starting content import.
[5/30/2012 11:08:46 AM] Progress: De-Serializing Objects to Database.
[5/30/2012 11:08:46 AM] [Folder] [Person] Progress: Importing
[5/30/2012 11:08:46 AM] [Folder] [Person] Verbose: Source URL: _catalogs/users/Person
[5/30/2012 11:08:46 AM] [Folder] [Item] Progress: Importing
[5/30/2012 11:08:46 AM] [Folder] [Item] Verbose: Source URL: Lists/TaxonomyHiddenList/Item
[5/30/2012 11:08:46 AM] [Folder] [myLibrary] Progress: Importing
[5/30/2012 11:08:46 AM] [Folder] [myLibrary] Verbose: Source URL: myLibrary/Forms/myLibrary
[5/30/2012 11:08:46 AM] [File] [sitemap.xml] Progress: Importing
[5/30/2012 11:08:46 AM] [File] [sitemap.xml] Verbose: Source URL: sitemap.xml
[5/30/2012 11:08:46 AM] [File] [sitemap.xml] Verbose: Destination URL: /sitemap.xml
[5/30/2012 11:08:46 AM] [ListItem] [xyz.pdf] Progress: Importing
[5/30/2012 11:08:46 AM] [ListItem] [xyz.pdf] Verbose: List URL: /myLibrary
[5/30/2012 11:08:46 AM] [ListItem] [xyz.pdf] Verbose: Deleting...
[5/30/2012 11:08:47 AM] [ListItem] [53_.000] Progress: Importing
[5/30/2012 11:08:47 AM] [ListItem] [53_.000] Verbose: List URL: /Lists/TaxonomyHiddenList
[5/30/2012 11:08:47 AM] [ListItem] [272_.000] Progress: Importing
[5/30/2012 11:08:47 AM] [ListItem] [272_.000] Verbose: List URL: /_catalogs/users
[5/30/2012 11:08:47 AM] Verbose: Performing final fixups.
[5/30/2012 11:08:47 AM] Progress: Import completed.
[5/30/2012 11:08:47 AM] Finish Time: 5/30/2012 11:08:47 AM.
[5/30/2012 11:08:47 AM] Duration: 00:00:00
[5/30/2012 11:08:47 AM] Total Objects: 11
[5/30/2012 11:08:47 AM] Finished with 0 warnings.
[5/30/2012 11:08:47 AM] Finished with 0 errors.

My migration process exports only the content which is published. If I give the process a tweak, it works:

1. Create an item which is not published, using the new term, on server 1. This populates an entry in TaxonomyHiddenList. Let this term in TaxonomyHiddenList migrate in the next export/import migration.

2. Now, before the next migration cycle, publish the item which was referring to the new term on server 1. After the next migration, the updated item goes to server 2 and there is no extra term in TaxonomyHiddenList on server 2.

Reason: probably the import process on server 2 is handled by SharePoint as a transaction. Now the TaxonomyHiddenList import and the referring item import are under a single transaction, so the item does not get a reference to the newly imported taxonomy hidden list entry and hence triggers creation of a new SPItem in the TaxonomyHiddenList.

If we don't want to use PowerShell for taxonomy migration and want to stick to DB migration, there is a way out: restart SQL Server just before the DB restore on the target, so that there is no live connection to the managed metadata DB at the time of the restore.

If we instead try to point the old service to a new DB (on the target, where the new DB is restored from the source metadata DB), some mappings from the term sets to the columns in the lists may be lost.

You may also like:

System KeyWords for Managed MetaData Service


SharePoint 2010 strange behavior of Taxonomies under Migration from one


Saturday, May 28, 2011

Analyse event logs for Service Control Manager activities from windows services

Sometimes you might want to pull start/stop kinds of activities for Windows services out of the event logs. Here is a PowerShell script for this:
$eventL = Get-EventLog -LogName "System" -Source "Service Control Manager" ;
Get-Service | ForEach-Object {
$st1 = "";
$st2 = "";
$st3 = "";
$st1 = $_.Status;
$st2 = $_.DisplayName;
$st3 = $_.Name;

#Write-Host $st1 $st2 $st3 -foregroundcolor cyan;

$eventL | where { $_.Message.Contains($st2) } | Select-Object -First 1 | ForEach-Object {

Write-Host $st1 $st2 $st3 -foregroundcolor cyan;
$st4 = "";
$st5 = "";
$st4 = $_.Message;
$st5 = $_.TimeGenerated;
Write-Host $st4 $st5;

}

}
$eventL = $null;

Now that you have identified the culprit Windows service, use this script to get its whole available history:

$eventL = Get-EventLog -LogName "System" -Source "Service Control Manager" ;

Get-Service | ForEach-Object {

if($_.Name -eq "AAAAAAAAAAAA"){

$st1 = "";
$st2 = "";
$st3 = "";
$st1 = $_.Status;
$st2 = $_.DisplayName;
$st3 = $_.Name;

#Write-Host $st1 $st2 $st3 -foregroundcolor cyan;

$eventL | where { $_.Message.Contains($st2) } | ForEach-Object {

#Write-Host $st1 $st2 $st3 -foregroundcolor cyan;
$st4 = "";
$st5 = "";
$st4 = $_.Message;
$st5 = $_.TimeGenerated;
Write-Host $st4 $st5;

}

}

}
$eventL = $null;

It might be the case that you only want to list which services are stopped and which are active:

Get-Service | ForEach-Object {

$st1 = "";
$st2 = "";
$st3 = "";
$st1 = $_.Status;
$st2 = $_.DisplayName;
$st3 = $_.Name;

Write-Host $st1 "," $st2 "," $st3 -foregroundcolor cyan;

}

You may also like:


get last app pool recycle timings

Tuesday, April 19, 2011

get last app pool recycle timings

c:\Windows\System32\inetsrv\appcmd list wp   gives you all the worker process IDs

and then in PowerShell go for:

(Get-Process -Id [processID]).StartTime
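If you prefer to stay in PowerShell for both steps (a small sketch, not from the original post; it assumes the IIS worker processes are named w3wp and that the shell runs elevated), you can list every worker process with its start time and its command line, which contains the app pool name:

Get-Process w3wp -ErrorAction SilentlyContinue |
    Select-Object Id, StartTime,
        @{Name='CommandLine';Expression={(Get-WmiObject Win32_Process -Filter "ProcessId=$($_.Id)").CommandLine}}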

You may also like:

Analyse event logs for Service Control Manager activities from windows

Thursday, February 17, 2011

List Level backups for multiple sub sites - PowerShell

In SharePoint projects, there are scenarios where a number of sub sites are created using the same template (now a WSP).

We come across requirements where these templates need updating in due course of time, and hence the sub sites already created may require an update. But it is not feasible to go to each and every sub site created and make the changes.

So the best way could be to take a backup of the variable components (lists), then recreate all the sub sites, and then restore the lists back to the newly created sub sites.

The below mentioned PowerShell script can be used to back up / export the content from a specific named list into a shared folder structure, where each sub site gets a specific folder holding a backup in .cmp format:

 

$spsite="http://abc:1104"                  # site collection URL (same value as in the restore script below)
$mainBackuppath="\\abc2010\exportimport"
$File="export.txt"

Get-SPSite -Identity $spsite | Get-SPWeb -limit all | ForEach-Object {

    if(($_.URL -eq $spsite+"/xyz") -or ($_.URL -eq $spsite+"") -or ($_.URL -eq $spsite+"/lmn"))
    {
        write-host -f Green "Skipping " $_.URL
        "Skipping " + $_.URL | Out-File $File -append
    }
    else
    {
        write-host -f Green "processing for " $_.URL
        "processing for " + $_.URL | Out-File $File -append

        $currentSubSiteTitlewithoutspace = $_.Title.Replace(" ","")

        write-host -f Blue "Creating Directory: "$mainBackuppath"\"$currentSubSiteTitlewithoutspace
        "Creating Directory: "+$mainBackuppath+"\"+$currentSubSiteTitlewithoutspace | Out-File $File -append

        $error.clear()
        [IO.Directory]::CreateDirectory($mainBackuppath+"\"+$currentSubSiteTitlewithoutspace)
        if($error.count -gt 0){$error | out-file $File -append}

        $filesystemPath = $mainBackuppath+"\"+$currentSubSiteTitlewithoutspace

        if(Test-Path $filesystemPath)
        {
            $filesystemPath = $mainBackuppath+"\"+$currentSubSiteTitlewithoutspace+"\"+$currentSubSiteTitlewithoutspace+".cmp"
            $DocLibPath = "/"+$_.Title+"/Shared Documents"

            write-host -f Yellow "Taking Backup for "$DocLibPath " at location " $filesystemPath "...."
            "Taking Backup for "+$DocLibPath+" at location "+$filesystemPath+"...." | Out-File $File -append

            $error.clear()
            Export-SPWeb -Identity $_.URL -Path $filesystemPath -ItemUrl $DocLibPath -Force -IncludeUserSecurity -IncludeVersions All | Out-File $File -append
            if($error.count -gt 0){$error | out-file $File -append}

            write-host -f Yellow "Backup for "$DocLibPath " at location " $filesystemPath "Completed."
            "Backup for "+$DocLibPath+" at location "+$filesystemPath+"Completed." | Out-File $File -append
        }
        else
        {
            write-host -f Red $filesystemPath " does not exist."
            $filesystemPath + " does not exist." | Out-File $File -append
        }

        write-host -f Green "End of " $_.URL"`n`n`n`n"
        "End of "+$_.URL+"`n`n`n`n" | Out-File $File -append
    }
}

 

 

The below mentioned PowerShell script can be used to restore the backed up / exported content (from the shared folder structure where each sub site has a specific folder with a .cmp backup) to the sub sites created using the new template:

 

 

$spsite="http://abc:1104"
$mainBackuppath="\\abc2010\exportimport"
$File="export.txt"

Get-SPSite -Identity $spsite | Get-SPWeb -limit all | ForEach-Object {

    if(($_.URL -eq $spsite+"/xyz") -or ($_.URL -eq $spsite+"") -or ($_.URL -eq $spsite+"/lmn"))
    {
        write-host -f Green "Skipping " $_.URL
        "Skipping " + $_.URL | Out-File $File -append
    }
    else
    {
        write-host -f Green "processing for " $_.URL
        "processing for " + $_.URL | Out-File $File -append

        $currentSubSiteTitlewithoutspace = $_.Title.Replace(" ","")

        write-host -f Blue "Checking if directory exists for current web"
        "Checking if directory exists for current web" | Out-File $File -append

        $filesystemPath = $mainBackuppath+"\"+$currentSubSiteTitlewithoutspace+"\"+$currentSubSiteTitlewithoutspace+".cmp"

        if(Test-Path $filesystemPath)
        {
            $DocLibPath = "/"+$_.Title+"/Shared Documents"

            write-host -f Yellow "Restoring Backup at "$DocLibPath " from " $filesystemPath "...."
            "Restoring Backup at "+$DocLibPath+" from "+$filesystemPath+"...." | Out-File $File -append

            $error.clear()
            Import-SPWeb -Identity $_.URL -Path $filesystemPath -IncludeUserSecurity -Force -UpdateVersions Overwrite | Out-File $File -append
            if($error.count -gt 0){$error | out-file $File -append}

            write-host -f Yellow "Restore for "$DocLibPath " from " $filesystemPath "Completed."
            "Restore for "+$DocLibPath+" from "+$filesystemPath+"Completed." | Out-File $File -append
        }
        else
        {
            write-host -f Red $filesystemPath " does not exist."
            $filesystemPath + " does not exist." | Out-File $File -append
        }

        write-host -f Green "End of " $_.URL"`n`n`n`n"
        "End of "+$_.URL+"`n`n`n`n" | Out-File $File -append
    }
}

 

___________________________________________________________________________________________________________________

Description of the commands being used:

 

 

1.  Get-SPSite : creates an object for the SPSite given by the identity parameter.

2.  Get-SPWeb -limit all : gets all the SPWebs in the site collection.

3.  Out-File $File -append : puts an entry in a log file.

4.  write-host -f Green "processing for " $_.URL : writes text on screen.

5.  $error.clear() : clears the current error collection.

6.  [IO.Directory]::CreateDirectory() : used to create a folder on the shared path.

7.  if($error.count -gt 0){$error | out-file $File -append} : if the error count is greater than 0, the errors are logged in the log file.

8.  Test-Path $filesystemPath : verifies that the folder has been created.

9.  Export-SPWeb -Identity $_.URL -Path $filesystemPath -ItemUrl $DocLibPath -Force -IncludeUserSecurity -IncludeVersions All | Out-File $File -append : the heart of all the commands, which actually creates the cmp file.

10. Import-SPWeb -Identity $_.URL -Path $filesystemPath -IncludeUserSecurity -Force -UpdateVersions Overwrite | Out-File $File -append : the heart of all the commands, which actually restores the cmp file.