
Creating directories in Azure Data Lake Gen 2 automatically - PowerShell

I want to create directories in Azure Data Lake Gen2 in this format: YYYY = 2020 (current year) -> MM = 10 (current month) -> DD = 28 (current day), and inside the day folder I need to place the data file (Data.csv). That is: /YYYY=2020/MM=10/DD=28/Data.csv.
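For reference, a date-partitioned path like the one above can be built from `Get-Date` format strings (a minimal sketch; the file name is taken from the question, everything else is a placeholder):

```powershell
# build the /YYYY=2020/MM=10/DD=28/Data.csv style path from today's date
$YY = (Get-Date).ToString('yyyy')   # e.g. 2020
$MM = (Get-Date).ToString('MM')     # zero-padded month, e.g. 10
$DD = (Get-Date).ToString('dd')     # zero-padded day, e.g. 28
$filePath = "YYYY=$YY/MM=$MM/DD=$DD/Data.csv"
$filePath
```

On ADLS Gen2 the intermediate folders do not need to be created separately; creating a file at this path creates them implicitly.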

Code used to write Data.csv to the root folder directory, with the help of @Jim Xu:

$username = "xyz@abc.com"
$password = ConvertTo-SecureString "" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PsCredential($username,$password)


#Connect-AzureRmAccount -Credential $credential | out-null

Connect-AzAccount -Credential $credential
$dataFactoryName=""
$resourceGroupName=""
# get dataFactory triggers
$triggers=Get-AzDataFactoryV2Trigger -DataFactoryName $dataFactoryName  -ResourceGroupName $resourceGroupName
$datas = @()
foreach ($trigger in $triggers) {
    # get the trigger run history for the last 24 hours
    $today = Get-Date
    $yesterday = $today.AddDays(-1)
    $splat = @{
        ResourceGroupName       = $trigger.ResourceGroupName
        DataFactoryName         = $trigger.DataFactoryName
        TriggerName             = $trigger.Name
        TriggerRunStartedAfter  = $yesterday
        TriggerRunStartedBefore = $today
    }

    $historys = Get-AzDataFactoryV2TriggerRun @splat
    if ($null -ne $historys) {
        # collect the run details
        foreach ($history in $historys) {
            $obj = [PsCustomObject]@{
                'TriggerRunTimestamp' = $history.TriggerRunTimestamp
                'ResourceGroupName'   = $history.ResourceGroupName
                'DataFactoryName'     = $history.DataFactoryName
                'TriggerName'         = $history.TriggerName
                'TriggerRunId'        = $history.TriggerRunId
                'TriggerType'         = $history.TriggerType
                'Status'              = $history.Status
            }
            # add the record to the array
            $datas += $obj
        }
    }
}

# convert the data to a CSV string
$contents = (($datas | ConvertTo-Csv -NoTypeInformation) -join [Environment]::NewLine)

 # upload to Azure Data Lake Store Gen2

 #1. Create a SAS token
 $accountName = "testadls05"
 $fileSystemName = "test"
 $filePath = "data.csv"
 $account = Get-AzStorageAccount -ResourceGroupName andywin7 -Name $accountName
 $sas = New-AzStorageAccountSASToken -Service Blob -ResourceType Service,Container,Object `
      -Permission "racwdlup" -StartTime (Get-Date).AddMinutes(-10) `
      -ExpiryTime (Get-Date).AddHours(2) -Context $account.Context
# the token starts with '?', so it can be appended directly as the query string
$baseUrl = "https://{0}.dfs.core.windows.net/{1}/{2}{3}" -f $accountName, $fileSystemName, $filePath, $sas
#2. Create the file
$endpoint = $baseUrl + "&resource=file"

Invoke-RestMethod -Method Put -Uri $endpoint -Headers @{"Content-Length" = 0} -UseBasicParsing

#3. Append the data
# note: the append/flush positions are byte offsets; $contents.Length matches only for single-byte (ASCII) content
$endpoint = $baseUrl + "&action=append&position=0"
Invoke-RestMethod -Method Patch -Uri $endpoint -Headers @{"Content-Length" = $contents.Length} -Body $contents -UseBasicParsing

#4. Flush the data
$endpoint = $baseUrl + ("&action=flush&position={0}" -f $contents.Length)
Invoke-RestMethod -Method Patch -Uri $endpoint -UseBasicParsing

# Check the result (get the data)

Invoke-RestMethod -Method Get -Uri $baseUrl -UseBasicParsing

If anyone has suggestions, please do post. Thanks.

Here is my solution:

$username = "xyz@abc.com"
$password = Get-Content D:\Powershell\new\passwords\password.txt | ConvertTo-SecureString -Key (Get-Content D:\Powershell\new\passwords\aes.key)
$credential = New-Object System.Management.Automation.PsCredential($username,$password)

 Connect-AzAccount -Credential $credential
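For completeness, the `password.txt` and `aes.key` files read above have to be generated once beforehand. A sketch of that one-time setup (the paths match the ones used in the script; the 32-byte key length is my assumption, giving AES-256):

```powershell
# one-time setup: create an AES key and an encrypted password file
$key = New-Object byte[] 32   # 256-bit AES key
[Security.Cryptography.RandomNumberGenerator]::Create().GetBytes($key)
$key | Set-Content D:\Powershell\new\passwords\aes.key

# prompt for the password and store it encrypted with the key
Read-Host "Enter password" -AsSecureString |
    ConvertFrom-SecureString -Key $key |
    Set-Content D:\Powershell\new\passwords\password.txt
```

With the key stored separately, the encrypted password file can be decrypted on any machine that has the key, unlike the default DPAPI encryption, which is tied to the user and machine.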

# Input Variables

 $dataFactoryName="dna-production-gen2"
 $resourceGroupName="DataLake-Gen2"

# get dataFactory triggers

$triggers=Get-AzDataFactoryV2Trigger -DataFactoryName $dataFactoryName  -ResourceGroupName $resourceGroupName
$datas = @()
foreach ($trigger in $triggers) {
    # get the trigger run history for the last 24 hours
    $today = Get-Date
    $yesterday = $today.AddDays(-1)
    $splat = @{
        ResourceGroupName       = $trigger.ResourceGroupName
        DataFactoryName         = $trigger.DataFactoryName
        TriggerName             = $trigger.Name
        TriggerRunStartedAfter  = $yesterday
        TriggerRunStartedBefore = $today
    }

    $historys = Get-AzDataFactoryV2TriggerRun @splat
    if ($null -ne $historys) {
        # collect the run details
        foreach ($history in $historys) {
            $obj = [PsCustomObject]@{
                'TriggerRunTimestamp' = $history.TriggerRunTimestamp
                'ResourceGroupName'   = $history.ResourceGroupName
                'DataFactoryName'     = $history.DataFactoryName
                'TriggerName'         = $history.TriggerName
                'TriggerRunId'        = $history.TriggerRunId
                'TriggerType'         = $history.TriggerType
                'Status'              = $history.Status
            }
            # add the record to the array
            $datas += $obj
        }
    }
}

# convert the data to a CSV string

$contents = (($datas | ConvertTo-Csv -NoTypeInformation) -join [Environment]::NewLine)

 # upload to Azure Data Lake Store Gen2

 #1. Create a SAS token

 $accountName = "dna2020gen2"

# $path = New-Item -ItemType Directory -Path ".\$((Get-Date).ToString('yyyy-MM-dd'))"

 # build the date-partitioned path components (zero-padded to match YYYY/MM/DD)
 $YY = (Get-Date).ToString('yyyy')
 $MM = (Get-Date).ToString('MM')
 $DD = (Get-Date).ToString('dd')

 $fileSystemName = "dev"
 $filePath = "triggers/YYYY=$YY/MM=$MM/DD=$DD/data.csv"
 $account = Get-AzStorageAccount -ResourceGroupName 'DataLake-Gen2' -Name $accountName
 $sas = New-AzStorageAccountSASToken -Service Blob -ResourceType Service,Container,Object `
      -Permission "racwdlup" -StartTime (Get-Date).AddMinutes(-10) `
      -ExpiryTime (Get-Date).AddHours(2) -Context $account.Context
$baseUrl = "https://{0}.dfs.core.windows.net/{1}/{2}{3}" -f $accountName, $fileSystemName, $filePath, $sas

#2. Create the file
$endpoint = $baseUrl + "&resource=file"

Invoke-RestMethod -Method Put -Uri $endpoint -Headers @{"Content-Length" = 0} -UseBasicParsing

#3. Append the data
# note: the append/flush positions are byte offsets; $contents.Length matches only for single-byte (ASCII) content
$endpoint = $baseUrl + "&action=append&position=0"
Invoke-RestMethod -Method Patch -Uri $endpoint -Headers @{"Content-Length" = $contents.Length} -Body $contents -UseBasicParsing

#4. Flush the data
$endpoint = $baseUrl + ("&action=flush&position={0}" -f $contents.Length)
Invoke-RestMethod -Method Patch -Uri $endpoint -UseBasicParsing

# Check the result (get the data)

Invoke-RestMethod -Method Get -Uri $baseUrl -UseBasicParsing
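As an alternative to the final REST call, the created folder structure can also be verified with the Az.Storage Gen2 cmdlets (a sketch, assuming the same account, resource group, and file system names as above):

```powershell
# list everything under the date-partitioned triggers folder
$ctx = (Get-AzStorageAccount -ResourceGroupName 'DataLake-Gen2' -Name 'dna2020gen2').Context
Get-AzDataLakeGen2ChildItem -Context $ctx -FileSystem 'dev' -Path 'triggers' -Recurse |
    Select-Object Path, IsDirectory, Length
```

This avoids hand-building the SAS URL and shows the intermediate `YYYY=`/`MM=`/`DD=` directories that were created implicitly along with the file.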

