
I have the following PowerShell script, which downloads documents (preserving the folder structure) from a document library to a local drive. It also exports each document's metadata to a CSV file. How do I add the folder path of each document to the same CSV file?

if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell") -eq $null) {
    Add-PSSnapin Microsoft.SharePoint.PowerShell
}

$destination = "C:\FolderName\"
$web = Get-SPWeb -Identity "http://XYZ:2010/"
$list = $web.GetList("http://XYZ:2010/Shared Documents/")

function ProcessFolder {
    param($folderUrl)
    $folder = $web.GetFolder($folderUrl)
    foreach ($file in $folder.Files) {
        # Ensure the destination directory exists
        $destinationfolder = $destination + "/" + $folder.Url
        if (!(Test-Path -Path $destinationfolder)) {
            $dest = New-Item $destinationfolder -Type directory
        }
        # Download the file
        $binary = $file.OpenBinary()
        $stream = New-Object System.IO.FileStream(($destinationfolder + "/" + $file.Name), [System.IO.FileMode]::Create)
        $writer = New-Object System.IO.BinaryWriter($stream)
        $writer.Write($binary)
        $writer.Close()
    }
}

$exportlist = @()
$list.Items | ForEach-Object {
    $obj = New-Object PSObject -Property @{
        "Title"         = $_["Title"]
        "Name"          = $_["Name"]
        "Modified Date" = $_["Modified"]
        "Modified By"   = $_["Modified By"]
        "Size"          = $_["File Size"]
    }
    $exportlist += $obj
}
$exportlist | Export-Csv -Path 'C:\FolderName\MyList.csv' -NoTypeInformation

# Download root files
ProcessFolder($list.RootFolder.Url)
# Download files in folders
foreach ($folder in $list.Folders) {
    ProcessFolder($folder.Url)
}
#DownloadMetadata ($list.RootFolder.Url)

    1 Answer

    Try SPListItem.File.Url to get the site-relative URL of the file, and SPWeb.Url to get the URL of the web, then combine them. In your code, use:

    "Path" = $web.Url + "/" + $_.File.Url 

    The code may look something like this. Note: I have not tested it, but it should give you an idea of how to approach your scenario:

    if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell") -eq $null) {
        Add-PSSnapin Microsoft.SharePoint.PowerShell
    }

    $destination = "C:\FolderName\"
    $web = Get-SPWeb -Identity "http://XYZ:2010/"
    $list = $web.GetList("http://XYZ:2010/Shared Documents/")
    $exportlist = @()

    function ProcessFolder {
        param($folderUrl)
        $folder = $web.GetFolder($folderUrl)
        foreach ($file in $folder.Files) {
            $item = $file.ListItem
            # Ensure the destination directory exists
            $destinationfolder = $destination + "/" + $folder.Url
            if (!(Test-Path -Path $destinationfolder)) {
                $dest = New-Item $destinationfolder -Type directory
            }
            # Download the file
            $binary = $file.OpenBinary()
            $stream = New-Object System.IO.FileStream(($destinationfolder + "/" + $file.Name), [System.IO.FileMode]::Create)
            $writer = New-Object System.IO.BinaryWriter($stream)
            $writer.Write($binary)
            $writer.Close()
            # Build the metadata row, including the full path of the document
            $obj = New-Object PSObject -Property @{
                "Title"         = $item["Title"]
                "Name"          = $item["Name"]
                "Modified Date" = $item["Modified"]
                "Modified By"   = $item["Modified By"]
                "Size"          = $item["File Size"]
                "Path"          = $web.Url + "/" + $file.Url
            }
            # Use script scope so the append is visible outside the function
            $script:exportlist += $obj
        }
    }

    # Download root files
    ProcessFolder($list.RootFolder.Url)
    # Download files in folders
    foreach ($folder in $list.Folders) {
        ProcessFolder($folder.Url)
    }

    $exportlist | Export-Csv -Path 'C:\FolderName\MyList.csv' -NoTypeInformation
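    As an aside, SPFile also exposes a ServerRelativeUrl property, and the parent folder is available via ParentFolder. If you prefer not to hand-assemble the URL, an untested sketch (same variables as the script above, to be placed inside the foreach loop) might look like:

    ```powershell
    # Alternative: build the absolute URL from the server-relative URL.
    # SPSite.MakeFullUrl prepends the site's scheme and host, avoiding
    # the manual "$web.Url + '/' + $file.Url" concatenation.
    $fullUrl = $web.Site.MakeFullUrl($file.ServerRelativeUrl)

    # If you want only the folder path (without the file name) in the CSV,
    # it can be taken from the file's parent folder instead:
    $folderPath = $web.Url + "/" + $file.ParentFolder.Url
    ```

    Either value can then be assigned to the "Path" property of the exported object.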
    • Thanks a lot. Can I maintain a log as well in the same CSV file using the above code? – Mar 1, 2014 at 6:52
    • What do you mean by log? – Mar 1, 2014 at 6:54
    • I mean I have to maintain a log of which documents were downloaded successfully or not. – Mar 1, 2014 at 6:55
    • The code for downloading the files and the code for exporting the info to CSV run in two different loops. You may have to move the export code inside the download code to be able to update a flag in the CSV file once a document is downloaded. Remember to get the list item from the file in the download code using $file.ListItem. – Mar 1, 2014 at 6:59
    • See my updated answer. As mentioned, it is not tested. – Mar 1, 2014 at 7:19
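    Following up on the log discussion in the comments: once the export row is built inside the download loop, one way to record success or failure is to wrap the download in try/catch and add a status column. A rough, untested sketch (assuming the same $file, $item, $destinationfolder, and $web variables as the answer's script):

    ```powershell
    $downloaded = $false
    try {
        # Download the file; any failure jumps to the catch block
        $binary = $file.OpenBinary()
        $stream = New-Object System.IO.FileStream(($destinationfolder + "/" + $file.Name), [System.IO.FileMode]::Create)
        $writer = New-Object System.IO.BinaryWriter($stream)
        $writer.Write($binary)
        $writer.Close()
        $downloaded = $true
    }
    catch {
        Write-Warning ("Failed to download " + $file.Name + ": " + $_.Exception.Message)
    }

    # The status flag becomes a True/False column in the exported CSV
    $obj = New-Object PSObject -Property @{
        "Name"       = $item["Name"]
        "Path"       = $web.Url + "/" + $file.Url
        "Downloaded" = $downloaded
    }
    $script:exportlist += $obj
    ```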
