We keep a lot of our Ops documentation in a SharePoint wiki, including the procedures for getting SharePoint itself running again and other emergency runbooks. The problem: if the SharePoint instance is down, we can't get at our docs. Here is a simple script that dumps all the wiki pages to HTML files in a directory. The links won't work, but at least you have the text. That's sufficient for our purposes; please let me know if you extend it.
The script needs to run on the SharePoint server (or somewhere else that can load the SharePoint libraries). It grabs each item and basically dumps the div that makes up the body of your wiki page.
blogspot has trouble with the H1 tags near the end of the script, so double-check that the heading line reads <H1>$title</H1> rather than an escaped version of it.
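If you're not running from a shell that already has the SharePoint assemblies loaded, something along these lines at the top of the script should take care of it (a minimal sketch; the snap-in line only applies to SharePoint 2010 and later):

# Load the server object model if it isn't already in the session
# (on SharePoint 2010+ you could register the snap-in instead:
#   Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue)
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")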
$uri = "http://SHAREPOINT/sites/wiki"
$list = "Wiki
Pages"
$outputDir = "\\server\backups\Wiki"
$site = New-Object Microsoft.SharePoint.SPSite($uri)
$web = $site.openweb()
$list = $web.lists[$list]
$i = 0
"exporting
to: $outputDir"
foreach ($page in $list.Items) {
[xml]$objXML = $page.xml
$body = $objXML.row.ows_WikiField
$title = $page.DisplayName
"creating File $i : $title"
$i++
$body = "< H1 > $title < / H1 >" + $body
$body | Out-File -FilePath "$outputDir\$title.html"
}
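If you want the backup folder to be easy to browse, one small extension is to write an index page alongside the dumps. This is just a sketch, not part of the original script; it reuses the $list and $outputDir variables and would need to go in before the Dispose calls:

# Possible extension (sketch): build a bare-bones index of the exported pages
$index = $list.Items | ForEach-Object {
    "<a href=""$($_.DisplayName).html"">$($_.DisplayName)</a><br/>"
}
$index | Out-File -FilePath "$outputDir\index.html"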