
PowerShell script to move IIS logs to AWS S3 bucket

I'm trying to get a script which will move IIS logs and archives older than 1 day from my instances to the S3 bucket (for example, logs). S3 path: logs/iislogs/instance-ID/W3SVC1, /W3SVC2, etc.

Import-Module 'C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1'
$bucket = 'logs'
$source = "c:\inetpub\logs\LogFiles"

# Get the instance id from the instance metadata service
$wc = New-Object System.Net.WebClient
$instanceIdResult = $wc.DownloadString("http://IP/latest/meta-data/instance-id")

foreach ($i in Get-ChildItem $source)
{
    if ($i.CreationTime -lt ($(Get-Date).AddDays(-1)))
    {
        Write-S3Object -BucketName $bucket -File $i.FullName -Key iislogs/$instanceIdResult/$i
    }
}

As a result I'm getting this error:

Write-S3Object : The file indicated by the FilePath property does not exist!
At line:12 char:15
+ Write-S3Object <<<< -BucketName $bucket -File $i.FullName -Key iislogs/$instanceIdResult/$i
    + CategoryInfo          : InvalidOperation: (Amazon.PowerShe...eS3ObjectCmdlet:WriteS3ObjectCmdlet) [Write-S3Object], InvalidOperationException
    + FullyQualifiedErrorId : System.ArgumentException,Amazon.PowerShell.Cmdlets.S3.WriteS3ObjectCmdlet

Also, in S3 all the files copied from the subfolders end up flat under logs/iislogs/instance-ID/.

Please help.
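
For reference, the most likely cause of that error is that Get-ChildItem on c:\inetpub\logs\LogFiles returns the W3SVC1/W3SVC2 directories themselves, and Write-S3Object fails when -File is given a directory path. A minimal sketch that only hands real files to the cmdlet (the files-only filter is an assumption about the intended behaviour):

# Recurse and skip directories so only real files reach Write-S3Object
foreach ($i in Get-ChildItem $source -Recurse | Where-Object { -not $_.PSIsContainer })
{
    if ($i.CreationTime -lt (Get-Date).AddDays(-1))
    {
        Write-S3Object -BucketName $bucket -File $i.FullName -Key "iislogs/$instanceIdResult/$($i.Name)"
    }
}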

After some research I'm able to copy log files older than 1 day to S3 and then delete them from the source PC. But the problem is that the S3 key ends up including ...\c:\inetpub\logs\LogFiles. How can I cut that off and copy to logs/iislogs/instance-ID/W3SVC1, /W3SVC2?

Import-Module 'C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1'
$bucket = 'logs'
$source = "c:\inetpub\logs\LogFiles\*"
$wc = New-Object System.Net.WebClient
$instanceIdResult = $wc.DownloadString("http://IP/latest/meta-data/instance-id")

foreach ($i in Get-ChildItem $source -include *.txt -recurse)
{
    if ($i.CreationTime -lt ($(Get-Date).AddDays(-1)))
    {
        Write-S3Object -BucketName $bucket -Key iislogs/$instanceIdResult/$i -File $i
    }
}

# Delete the source files that are older than 1 day
Get-ChildItem -Path $source -Recurse -Force |
    Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt ($(Get-Date).AddDays(-1)) } |
    Remove-Item -Force
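
One way to keep the local path out of the key is to build the key yourself from the W3SVCx folder name and the file name, which is what the answer below does; a minimal sketch of the upload step inside the loop:

# Use the parent folder name (e.g. W3SVC1) plus the file name, not the full local path
$parentFolder = Split-Path (Split-Path $i.FullName -Parent) -Leaf
Write-S3Object -BucketName $bucket -Key "iislogs/$instanceIdResult/$parentFolder/$($i.Name)" -File $i.FullName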

The answer to my question is below. This script does what I needed; in addition it saves the PowerShell console output to a file and sends that file to me as an email attachment.

# This script copies all log files from $source older than 3 days into AWS S3 under $bucket/iislogs/<instance-id>/<W3SVCx>/, using the $AKey and $SKey credentials, then deletes the copied files and sends an email report
# !!! Install the AWS Tools for Windows PowerShell (AWSToolsAndSDKForNet_sdk) before running !!!
# Make sure that you have access to the C:\inetpub\logs\LogFiles\... folders
# Created 26 Aug 2014 by Nick Sinyakov


Import-Module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"
$bucket="YOUR AWS S3 BUSKET"
$source="C:\inetpub\logs\LogFiles\*"
$outputpath="C:\temp\log.txt"
$wc = New-Object System.Net.WebClient
$instanceId = $wc.DownloadString("http://IP/latest/meta-data/instance-id")
$AKey="AWS access key"
$SKey="AWS secret key"

Set-AWSCredentials -AccessKey $AKey -SecretKey $SKey -StoreAs For_Move
Initialize-AWSDefaults -ProfileName For_Move -Region YOUR-AWS-REGION

Start-Transcript -path $outputpath -Force
foreach ($i in Get-ChildItem $source -include *.log -recurse)
{
if ($i.CreationTime -lt ($(Get-Date).AddDays(-3)))
{
$fileName = (Get-ChildItem $i).Name
$parentFolderName = Split-Path (Split-Path $i -Parent) -Leaf
Write-S3Object -BucketName $bucket -Key iislogs/$instanceId/$parentFolderName/$fileName -File $i
}
}
Stop-Transcript
Send-MailMessage -To email@domain.com -From email@domain.com -Subject "IIS Log move to S3 report" -SmtpServer yoursmtpserver -Attachments $outputpath
Get-ChildItem -Path $source -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt ($(Get-Date).AddDays(-3))} | Remove-Item -Force
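
To run this unattended every day, the script could be registered as a scheduled task; a minimal sketch, assuming Windows Server 2012 or later and that the script is saved as C:\scripts\Move-IisLogs.ps1 (path and run time are placeholders):

# Register a daily 3 AM task that runs the log-move script as SYSTEM
$action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-ExecutionPolicy Bypass -File C:\scripts\Move-IisLogs.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 3am
Register-ScheduledTask -TaskName "Move IIS logs to S3" -Action $action -Trigger $trigger -User "SYSTEM" -RunLevel Highest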

Hope it will help somebody.

Here is a script that I run daily to keep IIS logs in S3. It scans all IIS websites, finds their log folders, pushes the logs to S3, and marks processed log filenames with an underscore. Hope it helps.

Import-Module AWSPowerShell
# Set the script variables
$accessKey = "XXXXXXXXXXXXXXXXXXXX"
$secretKey = "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
$bucketName = "bucketname"
$today = get-date

# Add a function for iterating log files in a directory and pushing them to S3
function processLogDirectory {
  Param($logDir, $bucketName)
  # Log directories are only created after a site is first accessed
  # Check if the log directory exists
  if(Test-Path $logDir) {
    # Get all .log files from the folder except the ones we've processed previously
    $logs = Get-ChildItem -Path $logDir  -Exclude "*.log_"

    # Iterate the logs and push each one to S3
    foreach($log in $logs) {
      # Skip today's log file - IIS keeps it open and locked
      if($log.name -ne $log_today) {
        # Push the log file to the S3 bucket, keyed under a folder named after the site
        # ($site comes from the calling loop's scope below)
        Write-S3Object -BucketName $bucketName -Key "$($site.name)/$($log.name)" -File $log.FullName
        # As a safety, rename processed files instead of deleting them by uncommenting the next line.
        # Files left with their original name are re-uploaded on every run; re-uploads overwrite the
        # objects in the bucket (or add new versions if versioning is enabled on the bucket).
        # Rename-Item $log.FullName "$($log.name)_"
        # To delete processed logs permanently instead, uncomment the next line
        # (and drop -whatif to really delete them):
        # Remove-Item $log.FullName -whatif
        # Uncomment either Rename-Item or Remove-Item, not both: Remove-Item references the original
        # file name and will throw if the file has already been renamed.
      }
    }
  }
}

# Create an AWS Credentials object 
Set-AWSCredentials -AccessKey $accessKey -SecretKey $secretKey

# Get filename for Today's log
# We won't be able to access it due to lock from IIS
$log_today = "u_ex$('{0:yy}' -f $today)$('{0:MM}' -f $today)$('{0:dd}' -f $today).log"

# Get All websites
$websites = (Get-Website) 

# Iterate through the sites
foreach($site in $websites) {
  # Check if there is an FTP site started
  if($site.ftpserver.state -eq "started") {
    # Get the FTP site's log directory
    $log_dir = $site.ftpserver.logfile.directory.replace("%SystemDrive%",$env:SystemDrive)
    $svc = "FTPSVC$($site.id)"
    # Add trailing slash if needed - needed more often than you would expect 
    if($log_dir[-1] -ne "\") {
      $log_dir = "$($log_dir)\"
    }

    # Concatenate the full log directory
    $svclog_dir = "$($log_dir)$($svc)"
    Write-Host "processing $($site.name)"
    processLogDirectory -logDir $svclog_dir -bucketName $bucketName
  } else {
    # Process the W3 site
    if($site.elementtagname -eq "site") {
      # Get the W3 site's log directory
      $log_dir = $site.logfile.directory.replace("%SystemDrive%",$env:SystemDrive)
      $svc = "W3SVC$($site.id)"
      # Add trailing slash if needed - needed more often than you would expect 
      if($log_dir[-1] -ne "\") {
        $log_dir = "$($log_dir)\"
      }

      # Concatenate the full log directory
      $svclog_dir = "$($log_dir)$($svc)"
      Write-Host "processing $($site.name)"
      processLogDirectory -logDir $svclog_dir -bucketName $bucketName
    }
  }
}
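
To check what was uploaded, you could list the objects under one site's prefix afterwards; a quick sketch ("Default Web Site" is a placeholder for whatever site name Get-Website returns):

# List the keys pushed for one site; the key prefix is the site name used above
Get-S3Object -BucketName $bucketName -KeyPrefix "Default Web Site/" |
    Select-Object Key, Size, LastModified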
