There is a SharePoint site at http://mySPSite.com . It normally takes about 12 seconds to download fully on the client side, including images, CSS and JavaScript files. I want to monitor the complete request using PowerShell so that it simulates downloading all the pages exactly the way a browser would.
What would be an appropriate way to achieve this?
I guess you can use the approach described here:
http://www.howtogeek.com/124736/stupid-geek-tricks-extract-links-off-any-webpage-using-powershell/
to collect all the links and images on your site into an array, then download them in a loop with this script:
$url = "http://website.com/downloads/Iwantthisfile.txt"
$path = "C:\temp\thisisthefile.txt"
# Alternatively, accept these as script parameters:
# param([string]$url, [string]$path)

# If the target directory does not exist, save to the current directory instead
if (!(Split-Path -Parent $path) -or !(Test-Path -PathType Container (Split-Path -Parent $path))) {
    $path = Join-Path $pwd (Split-Path -Leaf $path)
}
"Downloading [$url]`nSaving at [$path]"
$client = New-Object System.Net.WebClient
$client.DownloadFile($url, $path)
$path
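Putting the two pieces together, here is a minimal sketch of the whole workflow: fetch the page with Invoke-WebRequest (PowerShell 3.0+), collect the link and image URLs it parsed, download each one, and wrap the run in Measure-Command to get the total elapsed time you want to monitor. The `$siteUrl` and `$outDir` values are placeholders, and `-UseDefaultCredentials` is an assumption for a SharePoint site that uses Windows authentication.

```powershell
$siteUrl = "http://mySPSite.com"       # placeholder site URL
$outDir  = "C:\temp\sitemonitor"       # placeholder download folder
New-Item -ItemType Directory -Force -Path $outDir | Out-Null

$elapsed = Measure-Command {
    # Fetch and parse the page; Links/Images expose the parsed hrefs and srcs
    $page = Invoke-WebRequest -Uri $siteUrl -UseDefaultCredentials
    $resources = @($page.Links.href) + @($page.Images.src) |
        Where-Object { $_ } | Select-Object -Unique

    $client = New-Object System.Net.WebClient
    $client.UseDefaultCredentials = $true
    foreach ($res in $resources) {
        # Resolve relative URLs against the site root
        $uri  = New-Object System.Uri((New-Object System.Uri($siteUrl)), $res)
        $name = [System.IO.Path]::GetFileName($uri.LocalPath)
        if (-not $name) { continue }   # skip URLs with no file component
        try   { $client.DownloadFile($uri, (Join-Path $outDir $name)) }
        catch { Write-Warning "Failed to download $uri" }
    }
}
"Complete download took $($elapsed.TotalSeconds) seconds"
```

Note this only fetches resources referenced directly in the HTML, not ones loaded later by JavaScript, so it approximates rather than exactly reproduces a browser's behavior.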