
Azure SQL Server Error: The elastic pool has reached its storage limit. The storage usage for the elastic pool cannot exceed (204800)

When I try to update or insert multiple rows, either from the application or directly in the SQL Server database, I get the error below.

Msg 1132, Level 16, State 1, Line 1
The elastic pool has reached its storage limit. The storage usage for the elastic pool cannot exceed (204800) MBs.

I do not know how to handle this. Please help.

You should proactively check the current size quota for your databases to make sure it is set as expected. To do this, the following statement can be used in the context of the target database:

SELECT DATABASEPROPERTYEX(DB_NAME(), 'MaxSizeInBytes');
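If the quota looks correct, it can also help to compare used versus allocated space for each database, since allocated-but-unused space counts against the pool's storage limit. A minimal sketch using the `sys.database_files` catalog view, run in the context of the database you want to inspect:

```sql
-- Used vs. allocated data-file space for the current database, in MB
SELECT
    SUM(CAST(FILEPROPERTY(name, 'SpaceUsed') AS bigint)) * 8 / 1024 AS used_mb,
    SUM(CAST(size AS bigint)) * 8 / 1024 AS allocated_mb
FROM sys.database_files
WHERE type_desc = 'ROWS';
```

A large gap between `allocated_mb` and `used_mb` suggests that shrinking the database (as described further down) may free pool storage without scaling up.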

To solve this issue, scale up to a service objective with a larger maximum size quota, or explicitly change the quota by using the ALTER DATABASE … MODIFY (MAXSIZE = …) command shown below (unless a lower quota is desired, to guarantee being able to scale down in the future). The change is executed online.

ALTER DATABASE DB1 MODIFY (MAXSIZE = 10 GB);

In the documentation you will find a table that shows the resources available at each service tier, including the maximum storage.

Error when inserting data: "The elastic pool has reached its storage limit. The storage usage for the elastic pool cannot exceed (153600) MBs"

Scaling the database itself up and down did not fix the error.

The pool was scaled from 100 GB to 150 GB, while the size stated in the error is 153.6 GB. After scaling the elastic pool to 250 GB, the data was inserted successfully. It was then scaled back down to 100 GB for cost purposes, and inserts still worked.

Scaling the elastic pool up beyond the limit in the error message, then scaling it back down, fixed the issue.
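That scale-up/scale-down cycle can also be scripted with the Azure CLI rather than the portal. This is a sketch only: the resource group, server, and pool names are placeholders, and it assumes the `--max-size` parameter of `az sql elastic-pool update` accepts a unit suffix such as `GB`:

```shell
# Hypothetical resource names; scale the pool's max storage past the limit
# reported in the error message
az sql elastic-pool update \
  --resource-group my-rg \
  --server my-server \
  --name my-pool \
  --max-size 250GB

# ...then scale back down for cost once the inserts succeed
az sql elastic-pool update \
  --resource-group my-rg \
  --server my-server \
  --name my-pool \
  --max-size 100GB
```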

Shrinking the database can be seen as an interim solution, since it releases allocated space.

DBCC SHRINKDATABASE ('DB-Name', 10);  -- leave 10% free space in the files after the shrink

This was straightforward to fix in the Azure portal. I selected the elastic pool and navigated to Settings > Configure. There is a slider where you can increase the size of the data. I doubled this for a small increase in cost, and it fixed the issue.
