
PowerShell script to move IIS logs to AWS S3 bucket

I'm trying to write a script that will move IIS logs and archives older than 1 day from my instances to an S3 bucket (named logs, for example). Target S3 path: logs/iislogs/instance-ID/W3SVC1, /W3SVC2, etc.

Import-Module 'C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1'
$bucket = 'logs'
$source = "c:\inetpub\logs\LogFiles"

$wc = New-Object System.Net.WebClient
$instanceIdResult = $wc.DownloadString("http://169.254.169.254/latest/meta-data/instance-id")

foreach ($i in Get-ChildItem $source)
{
    if ($i.CreationTime -lt ($(Get-Date).AddDays(-1)))
    {
        Write-S3Object -BucketName $bucket -File $i.FullName -Key iislogs/$instanceIdResult/$i
    }
}

As a result I'm getting this error:

Write-S3Object : The file indicated by the FilePath property does not exist!
At line:12 char:15
+ Write-S3Object <<<< -BucketName $bucket -File $i.FullName -Key iislogs/$instanceIdResult/$i
    + CategoryInfo          : InvalidOperation: (Amazon.PowerShe...eS3ObjectCmdlet:WriteS3ObjectCmdlet) [Write-S3Object], InvalidOperationException
    + FullyQualifiedErrorId : System.ArgumentException,Amazon.PowerShell.Cmdlets.S3.WriteS3ObjectCmdlet

Also, in S3 all the files copied from the subfolders land flat under logs/iislogs/instance-ID/; the W3SVC1/W3SVC2 structure is lost.

Please help

After some research I'm able to copy the log files older than 1 day to S3 and then delete them from the source machine. The problem is that the S3 key path includes ...\c:\inetpub\logs\LogFiles. How do I cut that off so the files land in logs/iislogs/instance-ID/W3SVC1, /W3SVC2? (My current attempt is below; see also the folder-upload sketch after the code.)

Import-Module 'C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1'
$bucket = 'logs'
$source = "c:\inetpub\logs\LogFiles\*"
$wc = New-Object System.Net.WebClient
$instanceIdResult = $wc.DownloadString("http://169.254.169.254/latest/meta-data/instance-id")

foreach ($i in Get-ChildItem $source -Include *.txt -Recurse)
{
    if ($i.CreationTime -lt ($(Get-Date).AddDays(-1)))
    {
        Write-S3Object -BucketName $bucket -Key iislogs/$instanceIdResult/$i -File $i
    }
}
Get-ChildItem -Path $source -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt ($(Get-Date).AddDays(-1)) } | Remove-Item -Force
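
As an aside: instead of building each key by hand, Write-S3Object can also upload a whole directory tree and keep the relative paths under a key prefix (the -Folder/-KeyPrefix/-Recurse parameters are part of AWSPowerShell). A minimal sketch, reusing the variables above:

# Upload everything under the IIS log root, preserving the W3SVC1/W3SVC2
# subfolders in the S3 keys: iislogs/<instance-id>/W3SVC1/u_ex....log
$logRoot = 'C:\inetpub\logs\LogFiles'
Write-S3Object -BucketName $bucket -Folder $logRoot -Recurse -KeyPrefix "iislogs/$instanceIdResult"

Note this uploads every file regardless of age, so a per-file loop is still needed for the 1-day filter; the accepted answer below builds the key from the parent folder name instead.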

The answer to my question is below. This script does what I needed, plus it saves the PowerShell console output to a file and sends that file to me as an email attachment.

# This script will copy all .log files from $source older than 3 days into the AWS S3
# bucket under $bucket/iislogs/<instance-id>/<W3SVCx>/ using the $AKey and $SKey
# credentials, then delete the copied files and send an email report.
# !!! Install the AWS Tools for Windows PowerShell (AWSToolsAndSDKForNet) before running !!!
# Make sure that you have access to the C:\inetpub\logs\LogFiles\... folders
# Created 26 Aug 2014 by Nick Sinyakov

Import-Module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"
$bucket = "YOUR-AWS-S3-BUCKET"
$source = "C:\inetpub\logs\LogFiles\*"
$outputpath = "C:\temp\log.txt"
$wc = New-Object System.Net.WebClient
$instanceId = $wc.DownloadString("http://169.254.169.254/latest/meta-data/instance-id")
$AKey = "AWS access key"
$SKey = "AWS secret key"

Set-AWSCredentials -AccessKey $AKey -SecretKey $SKey -StoreAs For_Move
Initialize-AWSDefaults -ProfileName For_Move -Region YOUR-AWS-REGION

Start-Transcript -Path $outputpath -Force
foreach ($i in Get-ChildItem $source -Include *.log -Recurse)
{
    if ($i.CreationTime -lt ($(Get-Date).AddDays(-3)))
    {
        $fileName = $i.Name
        # Use the parent folder name (W3SVC1, W3SVC2, ...) as part of the S3 key
        $parentFolderName = Split-Path (Split-Path $i -Parent) -Leaf
        Write-S3Object -BucketName $bucket -Key iislogs/$instanceId/$parentFolderName/$fileName -File $i
    }
}
Stop-Transcript
Send-MailMessage -To email@domain.com -From email@domain.com -Subject "IIS Log move to S3 report" -SmtpServer yoursmtpserver -Attachments $outputpath
# Delete only the .log files that were candidates for upload
Get-ChildItem -Path $source -Include *.log -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt ($(Get-Date).AddDays(-3)) } | Remove-Item -Force

Hope it will help somebody
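
If you want to sanity-check the upload before the delete step removes the source files, you can list what landed in the bucket from the same session (a quick check; Get-S3Object and its -KeyPrefix parameter are part of AWSPowerShell):

# List the first uploaded keys for this instance
Get-S3Object -BucketName $bucket -KeyPrefix "iislogs/$instanceId" |
    Select-Object Key, Size, LastModified -First 20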

Here is a script that I run daily to keep IIS logs archived in S3. It scans all IIS websites, finds their log folders, pushes the logs to S3, and marks processed log filenames with a trailing underscore. Hope it helps.

Import-Module AWSPowerShell
# Set the script variables
$accessKey = "XXXXXXXXXXXXXXXXXXXX"
$secretKey = "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
$bucketName = "bucketname"
$today = get-date

# Add a function for iterating log files in a directory and pushing them to S3
function processLogDirectory {
  Param($logDir, $bucketName)
  # Log directories are only created after a site is first accessed
  # Check if the log directory exists
  if(Test-Path $logDir) {
    # Get all .log files from the folder except the ones we've processed previously
    $logs = Get-ChildItem -Path $logDir  -Exclude "*.log_"

    # Iterate the logs for pushing to S3
    foreach($log in $logs) {
      # Make sure we don't try to upload today's log file
      # (note: $log_today and $site come from the script scope below)
      if($log.name -ne $log_today) {
        # Push the log file to the S3 bucket, in a folder based on the site's name
        Write-S3Object -BucketName $bucketName -Key "$($site.name)/$($log.name)" -File $log.FullName
        # As a safety, rename the files instead of deleting them. If the originals
        # are left untouched they will be re-uploaded on the next run; re-uploads
        # overwrite the logs in the S3 bucket, and new versions are created if
        # versioning is enabled on the bucket.
        Rename-Item $log.FullName "$($log.name)_"
        # To delete the logs permanently instead, replace the Rename-Item line above
        # with the next line (remove the -WhatIf to really delete the logs). Use one
        # or the other: Remove-Item references the original file name, so it throws
        # an exception if it runs after Rename-Item.
        # Remove-Item $log.FullName -WhatIf
      }
    }
  }
}

# Create an AWS Credentials object 
Set-AWSCredentials -AccessKey $accessKey -SecretKey $secretKey

# Get filename for Today's log
# We won't be able to access it due to lock from IIS
$log_today = "u_ex$('{0:yy}' -f $today)$('{0:MM}' -f $today)$('{0:dd}' -f $today).log"

# Get All websites
$websites = (Get-Website) 

# Iterate through the sites
foreach($site in $websites) {
  # Check if there is an FTP site started
  if($site.ftpserver.state -eq "started") {
    # Get the FTP site's log directory
    $log_dir = $site.ftpserver.logfile.directory.replace("%SystemDrive%",$env:SystemDrive)
    $svc = "FTPSVC$($site.id)"
    # Add trailing slash if needed - needed more often than you would expect 
    if($log_dir[-1] -ne "\") {
      $log_dir = "$($log_dir)\"
    }

    # Concatenate the full log directory
    $svclog_dir = "$($log_dir)$($svc)"
    Write-Host "processing $($site.name)"
    processLogDirectory -logDir $svclog_dir -bucketName $bucketName
  } else {
    # Process the W3 site
    if($site.elementtagname -eq "site") {
      # Get the W3 site's log directory
      $log_dir = $site.logfile.directory.replace("%SystemDrive%",$env:SystemDrive)
      $svc = "W3SVC$($site.id)"
      # Add trailing slash if needed - needed more often than you would expect 
      if($log_dir[-1] -ne "\") {
        $log_dir = "$($log_dir)\"
      }

      # Concatenate the full log directory
      $svclog_dir = "$($log_dir)$($svc)"
      Write-Host "processing $($site.name)"
      processLogDirectory -logDir $svclog_dir -bucketName $bucketName
    }
  }
}
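
Since this is meant to run daily, one way to schedule it is a Windows scheduled task; a minimal sketch, assuming the script above is saved as C:\scripts\Move-IisLogsToS3.ps1 (the path, task name, and time are placeholders):

# Register a daily task that runs the log-mover script as SYSTEM
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -ExecutionPolicy Bypass -File C:\scripts\Move-IisLogsToS3.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName 'Move-IisLogsToS3' -Action $action -Trigger $trigger -User 'SYSTEM' -RunLevel Highest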
