
I/O error while writing to an SQL database

I have a database into which I am inserting rows and populating them with values. I am currently using sqlite3 on Python 3. What I find surprising is that if I simply insert the rows/values one at a time (e.g. iterations = 1), this works. Additionally, if I keep iterations below (approximately) 100, it works as well! But as I increase the number of iterations, there tends to be some randomly varying limit (typically below 1000) that I cannot exceed for some reason, and each time I observe the error copied below.

What is causing this error? How can it be overcome so that I can make iterations as large as necessary (e.g. 1000000)? Below is a small, simplified snippet from a larger code:


import sqlite3

# sqlite_file, table_name and shot are defined elsewhere in the larger code
column1 = 'id'
column2 = 'shot'
column3 = 'time'
column4 = 'psi'
column5 = 'temp'
column6 = 'dens'
column7 = 'temp_err'
column8 = 'dens_err'
iterations = 1000
timeout = 100
i = 0 
while i < iterations:
    try:  
        time = 3
        psi = 2 
        unique_id = 232
        temp = 0.4
        dens = 0.2
        temp_err = 0.02
        dens_err = 0.01
        values = [str(unique_id),str(shot),time,psi,temp,dens,temp_err,dens_err]
        conn = sqlite3.connect(sqlite_file,timeout=timeout)
        cursor = conn.cursor()
        cursor.execute("INSERT INTO {tn} ({c1},{c2},{c3},{c4},{c5},{c6},{c7},{c8}) VALUES ({o1},{o2},{o3},{o4},{o5},{o6},{o7},{o8})".\
                    format(tn=table_name,c1=column1,c2=column2,c3=column3,c4=column4,c5=column5,c6=column6,c7=column7,c8=column8,o1=values[0],\
                           o2=values[1],o3=values[2],o4=values[3],o5=values[4],o6=values[5],o7=values[6],o8=values[7]))
        conn.commit() 
        conn.close() 
    except sqlite3.IntegrityError:
        print('ERROR: ID already exists in PRIMARY KEY column {}'.format(column1)) 
    i = i + 1    

The error I observe is:

Traceback (most recent call last):

  File "<ipython-input-27-59d2691987a1>", line 18, in <module>
    o2=values[1],o3=values[2],o4=values[3],o5=values[4],o6=values[5],o7=values[6],o8=values[7]))

OperationalError: disk I/O error

I've tried increasing timeout and connecting/committing/closing the connection to the database outside of the loop, but these approaches have not worked.


Another possible solution that worked for me:

# as above, sqlite_file, table_name and shot are defined elsewhere in the larger code
column1 = 'id'
column2 = 'shot'
column3 = 'time'
column4 = 'psi'
column5 = 'temp'
column6 = 'dens'
column7 = 'temp_err'
column8 = 'dens_err'
iterations = 10000
timeout = 100 
with sqlite3.connect(sqlite_file,timeout=timeout) as conn:
    cursor = conn.cursor()
    i = 0 
    while i < iterations:
        try:  
            time = 3
            psi = 2 
            unique_id = 232
            temp = 0.4
            dens = 0.2
            temp_err = 0.02
            dens_err = 0.01
            values = [str(unique_id),str(shot),time,psi,temp,dens,temp_err,dens_err]
            cursor.execute("INSERT INTO {tn} ({c1},{c2},{c3},{c4},{c5},{c6},{c7},{c8}) VALUES ({o1},{o2},{o3},{o4},{o5},{o6},{o7},{o8})".\
                        format(tn=table_name,c1=column1,c2=column2,c3=column3,c4=column4,c5=column5,c6=column6,c7=column7,c8=column8,o1=values[0],\
                               o2=values[1],o3=values[2],o4=values[3],o5=values[4],o6=values[5],o7=values[6],o8=values[7]))
        except sqlite3.IntegrityError:
            print('ERROR: ID already exists in PRIMARY KEY column {}'.format(column1)) 
        i = i + 1  
        print(i)
    conn.commit()  

How is the database set up on the drive? It's possible that the drive is full and it can't write anymore, or your system is having trouble handling the throughput.

Check to see if the drive sqlite3 uses is full, and if not, try adding in a short delay between insertions. Alternatively, you can try pre-calculating all the data you want to insert and running one large bulk insert rather than multiple small ones.
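
As a rough illustration of both suggestions (the database path, table name, and row values below are placeholders, not taken from the original code), you could check the free space on the drive with shutil.disk_usage and replace the per-row commits with a single executemany call:

import shutil
import sqlite3

sqlite_file = 'database.db'   # placeholder path for illustration
table_name = 'results'        # placeholder table name for illustration

# 1) Check whether the drive holding the database file is (nearly) full
total, used, free = shutil.disk_usage('.')  # directory containing sqlite_file
print('Free space: {:.1f} MB'.format(free / 1024 ** 2))

# 2) Pre-calculate all rows, then do one bulk insert with a single commit
rows = [(i, 232, 3, 2, 0.4, 0.2, 0.02, 0.01) for i in range(1000)]  # dummy data

conn = sqlite3.connect(sqlite_file, timeout=100)
with conn:  # commits on success, rolls back on error
    conn.executemany(
        "INSERT INTO {tn} (id, shot, time, psi, temp, dens, temp_err, dens_err) "
        "VALUES (?, ?, ?, ?, ?, ?, ?, ?)".format(tn=table_name),
        rows)
conn.close()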

It shouldn't be related to your Python code; it's more on the sqlite3/hardware side of things.

Edit

As per @Jonathan Willcock's comment, you can also try wrapping all the inserts into one transaction.

To do this, you would have to shuffle some things around. The flow is supposed to be like this:

open connection > start transaction > run queries > commit transaction on success / rollback transaction on error > close connection

try:
    conn = sqlite3.connect(sqlite_file, timeout=timeout, isolation_level=None)
    cursor = conn.cursor()
    cursor.execute("BEGIN")  # isolation_level=None means autocommit, so start the transaction explicitly
    while i < iterations:
        # your loop here
    conn.commit()
except sqlite3.IntegrityError:
    conn.rollback()
    # other error handling here
finally:
    conn.close()
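
For reference, here is a hedged, self-contained sketch of that flow using the question's columns and parameterized ? placeholders (sqlite_file, table_name, and the row values are placeholder assumptions, not the answerer's exact code):

import sqlite3

sqlite_file = 'database.db'   # placeholder path for illustration
table_name = 'results'        # placeholder table name for illustration
iterations = 1000000
timeout = 100

conn = sqlite3.connect(sqlite_file, timeout=timeout, isolation_level=None)
try:
    cursor = conn.cursor()
    cursor.execute("BEGIN")   # one explicit transaction around all the inserts
    i = 0
    while i < iterations:
        values = (i, 232, 3, 2, 0.4, 0.2, 0.02, 0.01)  # dummy per-row data
        cursor.execute(
            "INSERT INTO {tn} (id, shot, time, psi, temp, dens, temp_err, dens_err) "
            "VALUES (?, ?, ?, ?, ?, ?, ?, ?)".format(tn=table_name),
            values)
        i = i + 1
    conn.commit()
except sqlite3.IntegrityError:
    conn.rollback()
    print('ERROR: ID already exists in PRIMARY KEY column id')
finally:
    conn.close()

A single transaction means SQLite only has to sync the file once at commit time, which is typically much faster than committing after every row.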
