
Powershell script to move IIS logs to AWS S3 bucket

I am trying to get a script that will move IIS logs older than 1 day from my instance to an S3 bucket (named, for example, logs). The S3 path should be: logs/iislogs/instance-ID/W3SVC1, /W3SVC2, etc.

Import-Module 'C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1'
$bucket='logs'
$source="c:\inetpub\logs\LogFiles"

$wc = New-Object System.Net.WebClient;
$instanceIdResult = $wc.DownloadString("http://IP/latest/meta-data/instance-id")

foreach ($i in Get-ChildItem $source)
{
    if ($i.CreationTime -lt ($(Get-Date).AddDays(-1)))
    {
        Write-S3Object -BucketName $bucket -File $i.FullName -Key iislogs/$instanceIdResult/$i
    }
}

As a result I get this error:

Write-S3Object : The file indicated by the FilePath property does not exist!
At line:12 char:15
+ Write-S3Object <<<< -BucketName $bucket -File $i.FullName -Key iislogs/$instanceIdResult/$i
    + CategoryInfo          : InvalidOperation: (Amazon.PowerShe...eS3ObjectCmdlet:WriteS3ObjectCmdlet) [Write-S3Object], InvalidOperationException
    + FullyQualifiedErrorId : System.ArgumentException,Amazon.PowerShell.Cmdlets.S3.WriteS3ObjectCmdlet

Also, in S3 all files copied from the subfolders end up directly under logs/iislogs/instance-ID/.

Please help.

After some research I was able to copy log files older than 1 day to S3 and then delete them from the source PC. But there is a problem: the S3 key contains ...\c:\inetpub\logs\LogFiles. How can I cut that off and copy to logs/iislogs/instance-ID/W3SVC1, /W3SVC2?

Import-Module 'C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1'
$bucket='logs'
$source="c:\inetpub\logs\LogFiles\*"
$wc = New-Object System.Net.WebClient;
$instanceIdResult = $wc.DownloadString("http://IP/latest/meta-data/instance-id")


foreach ($i in Get-ChildItem $source -include *.txt -recurse)
{
    if ($i.CreationTime -lt ($(Get-Date).AddDays(-1)))
    {
        Write-S3Object -BucketName $bucket -Key iislogs/$instanceIdResult/$i -File $i
    }
}
Get-ChildItem -Path $source -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt ($(Get-Date).AddDays(-1))} | Remove-Item -Force
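One way to drop the local prefix is to rebuild the key from only the file's name and its immediate parent folder name instead of the full path. A minimal sketch (the example path is illustrative; it assumes the default IIS layout c:\inetpub\logs\LogFiles\W3SVCn\ and that $instanceIdResult holds the instance id fetched earlier):

```powershell
# Sketch: turn a full local log path into the desired S3 key.
$path   = 'c:\inetpub\logs\LogFiles\W3SVC1\u_ex140826.log'    # example path
$folder = Split-Path (Split-Path $path -Parent) -Leaf         # parent folder, e.g. W3SVC1
$name   = Split-Path $path -Leaf                              # file name only
$key    = "iislogs/$instanceIdResult/$folder/$name"           # iislogs/<id>/W3SVC1/u_ex140826.log
```

Passing $key to Write-S3Object's -Key parameter then preserves the per-site subfolder without the local drive path.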

The answer to my question is below. The script does what I need, saves the Powershell console output to a file, and sends that file to me as an email attachment.

# This script will copy all log files from $source older than 3 days into AWS S3 $bucket/iislogs/instanceid/C:/inetpub/logs/LogFiles using $Akey and $SKey credentials. Then delete copied files and send email with report
# !!! install AWSToolsAndSDKForNet_sdk before run !!!
# Make sure that you have access to C:\inetpub\logs\LogFiles\... folders
# Created 26 Aug 2014 by Nick Sinyakov


Import-Module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"
$bucket="YOUR AWS S3 BUCKET"
$source="C:\inetpub\logs\LogFiles\*"
$outputpath="C:\temp\log.txt"
$wc = New-Object System.Net.WebClient
$instanceId = $wc.DownloadString("http://IP/latest/meta-data/instance-id")
$AKey="AWS access key"
$SKey="AWS secret key"

Set-AWSCredentials -AccessKey $AKey -SecretKey $SKey -StoreAs For_Move
Initialize-AWSDefaults -ProfileName For_Move -Region YOUR-AWS-REGION

Start-Transcript -path $outputpath -Force
foreach ($i in Get-ChildItem $source -include *.log -recurse)
{
    if ($i.CreationTime -lt ($(Get-Date).AddDays(-3)))
    {
        $fileName = (Get-ChildItem $i).Name
        $parentFolderName = Split-Path (Split-Path $i -Parent) -Leaf
        Write-S3Object -BucketName $bucket -Key iislogs/$instanceId/$parentFolderName/$fileName -File $i
    }
}
Stop-Transcript
Send-MailMessage -To email@domain.com -From email@domain.com -Subject "IIS Log move to S3 report" -SmtpServer yoursmtpserver -Attachments $outputpath
Get-ChildItem -Path $source -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt ($(Get-Date).AddDays(-3))} | Remove-Item -Force
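To spot-check that the objects landed under the expected prefix before the local copies are deleted, the AWSPowerShell module's Get-S3Object cmdlet can list them. A sketch, using the $bucket and $instanceId variables from the script above:

```powershell
# List the keys uploaded for this instance (sketch).
Get-S3Object -BucketName $bucket -KeyPrefix "iislogs/$instanceId/" |
    Select-Object -ExpandProperty Key
```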

Hope this helps someone else.

Here is the script I run daily to keep the logs from IIS in S3. It scans all IIS websites, finds their log folders, pushes the logs to S3, and marks processed log file names with an underscore. Hope it helps.

Import-Module AWSPowerShell
# Set the script variables
$accessKey = "XXXXXXXXXXXXXXXXXXXX"
$secretKey = "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
$bucketName = "bucketname"
$today = get-date

# Add a function for iterating log files in a directory and pushing them to S3
function processLogDirectory {
  Param($logDir, $bucketName)
  # Log directories are only created after a site is first accessed
  # Check if the log directory exists
  if(Test-Path $logDir) {
    # Get all .log files from the folder except the ones we've processed previously
    $logs = Get-ChildItem -Path $logDir  -Exclude "*.log_"

    # Iterate the logs for pushing to S3
    foreach($log in $logs) {
      # Make sure we don't try to upload today's log file
      if($log.name -ne $log_today) {
        # Push the log file to the S3 Bucket specified in a folder based on the site's name
        Write-S3Object -BucketName $bucketName -Key "$($site.name)/$($log.name)" -File $log.FullName
        # As a safety, rename processed files instead of deleting them.
        # The trailing "_" makes them match the -Exclude "*.log_" filter above,
        # so they are skipped on the next run. If the originals were left
        # untouched they would be re-uploaded, overwriting the copies in the
        # S3 bucket (new versions are created if bucket versioning is enabled).
        Rename-Item $log.FullName "$($log.name)_"
        # To delete the logs permanently instead, remove the Rename-Item line
        # and uncomment the following (drop -whatif to really delete).
        # Enabling both would throw, because Remove-Item references the
        # original file name, which no longer exists after the rename.
        # Remove-Item $log.FullName -whatif
      }
    }
  }
}

# Create an AWS Credentials object 
Set-AWSCredentials -AccessKey $accessKey -SecretKey $secretKey

# Get filename for Today's log
# We won't be able to access it due to lock from IIS
$log_today = "u_ex$('{0:yy}' -f $today)$('{0:MM}' -f $today)$('{0:dd}' -f $today).log"

# Get All websites
$websites = (Get-Website) 

# Iterate through the sites
foreach($site in $websites) {
  # Check if there is an FTP site started
  if($site.ftpserver.state -eq "started") {
    # Get the FTP site's log directory
    $log_dir = $site.ftpserver.logfile.directory.replace("%SystemDrive%",$env:SystemDrive)
    $svc = "FTPSVC$($site.id)"
    # Add trailing slash if needed - needed more often than you would expect 
    if($log_dir[-1] -ne "\") {
      $log_dir = "$($log_dir)\"
    }

    # Concatenate the full log directory
    $svclog_dir = "$($log_dir)$($svc)"
    Write-Host "processing $($site.name)"
    processLogDirectory -logDir $svclog_dir -bucketName $bucketName
  } else {
    # Process the W3 site
    if($site.elementtagname -eq "site") {
      # Get the W3 site's log directory
      $log_dir = $site.logfile.directory.replace("%SystemDrive%",$env:SystemDrive)
      $svc = "W3SVC$($site.id)"
      # Add trailing slash if needed - needed more often than you would expect 
      if($log_dir[-1] -ne "\") {
        $log_dir = "$($log_dir)\"
      }

      # Concatenate the full log directory
      $svclog_dir = "$($log_dir)$($svc)"
      Write-Host "processing $($site.name)"
      processLogDirectory -logDir $svclog_dir -bucketName $bucketName
    }
  }
}
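The answer above mentions running the script daily; one way to do that is a Windows scheduled task. A minimal sketch via schtasks (the script path C:\scripts\Push-IisLogs.ps1 and the task name are hypothetical placeholders):

```powershell
# Register a daily 3 AM task that runs the upload script (sketch).
schtasks /create /tn "PushIisLogsToS3" /sc daily /st 03:00 `
    /tr "powershell.exe -NoProfile -ExecutionPolicy Bypass -File C:\scripts\Push-IisLogs.ps1"
```

Running the task under an account with read access to the IIS log folders (and with the AWS credentials stored for that account) matches the script's prerequisites.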

