
Azure SQL Server JSON without DTU usage

I have an S0 Standard Azure SQL database into which I would like to bring 100 GB of JSON data without smashing through my DTU allowance and getting charged a huge amount of money. I have £20 of Azure credit per month from my Developer Program Benefit subscription.

Is there any way of attaching uploaded MDF / LDF files to the Azure SQL Database (PaaS) service?

Unless anyone has a better suggestion for staying within my DTU allowance, I wanted to try importing the JSON into a local install of SQL Server 2016 and then attaching the database files to the Azure server, so that the data arrives already processed.
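For context, the local import step I had in mind is roughly the following (the file path, target table and columns are just placeholders for my real schema); OPENROWSET(BULK ... SINGLE_CLOB) reads the whole file as one value and OPENJSON shreds it into rows:

-- Load the whole JSON file as a single value (hypothetical path),
-- then shred the array of objects into rows with OPENJSON.
SELECT j.Id, j.Name
INTO dbo.ImportedJson
FROM OPENROWSET(BULK 'C:\data\import.json', SINGLE_CLOB) AS raw
CROSS APPLY OPENJSON(raw.BulkColumn)
            WITH ( Id   INT           '$.id',
                   Name NVARCHAR(100) '$.name' ) AS j;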

This would be a long upload and would incur costs for new file storage, so does anyone know whether this will work, or how I might do it once the files are uploaded?

Any info on how to avoid DTU usage for a JSON upload would be appreciated.

So, the actual answer to this question is: DTUs aren't a prepaid resource. DTUs are allocated according to your plan, and you consume them like CPU/Memory/Disk "cycles". When you exceed your DTU quota nothing "bad" happens, you are simply throttled, so think of it as a capacity limit, not a spending limit.

Migrate the raw JSON file to Azure Storage and use the JSON features of Azure SQL: https://blogs.msdn.microsoft.com/sqlserverstorageengine/2015/10/07/bulk-importing-json-files-into-sql-server/
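A minimal sketch of that approach, assuming a hypothetical storage account, container, SAS token and column layout (all names here are illustrative, following the OPENROWSET(BULK ...) / OPENJSON pattern described in the linked post):

-- A database master key is required before creating a scoped credential.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

-- SAS token for the container, without the leading '?'.
CREATE DATABASE SCOPED CREDENTIAL JsonBlobCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET   = '<SAS token>';

-- Points at the blob container that holds the uploaded JSON files.
CREATE EXTERNAL DATA SOURCE JsonBlobStorage
WITH ( TYPE       = BLOB_STORAGE,
       LOCATION   = 'https://<storageaccount>.blob.core.windows.net/json-imports',
       CREDENTIAL = JsonBlobCredential );

-- Read one JSON file as a single value and shred it into rows with OPENJSON.
SELECT doc.Id, doc.Title
FROM OPENROWSET(
         BULK 'data-0001.json',
         DATA_SOURCE = 'JsonBlobStorage',
         SINGLE_CLOB) AS raw
CROSS APPLY OPENJSON(raw.BulkColumn)
            WITH ( Id    INT           '$.id',
                   Title NVARCHAR(200) '$.title' ) AS doc;

This keeps the heavy lifting (the upload) outside the database; the import itself still consumes DTUs, but as noted above that only means it may be throttled, not billed extra.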
