Long-running database operations in .NET

2023-09-05 04:07:10 Author: 荒芜了青春又负了谁

I have a delimited file that I use to insert/update records in a SQL Server table via a .NET application. The file has about 80,000 records and is processed daily. My question: is it safe or even sensible to keep the connection to the db open while I spin through each of the 80,000 rows, or should I be closing the connection and reopening with each iteration of the loop? That sounds cumbersome in itself. However, I am concerned about holding an open connection for a long time, holding locks and using up memory unnecessarily. What would be a more scalable, safe and sensible way to do this?

Recommended Answer

First, no, you should not open and close the connection for every row. For 80,000 rows that would take forever and would just add overhead. You could consider batching the rows instead (resetting the connection, say, every 10-500 rows), as in the sketch below. Fortunately, there is a better option:
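
As a rough illustration of that batching idea (not the recommended path), here is a minimal sketch that reuses one connection per batch instead of one per row. The record type, connection string, and table/column names are placeholders I've introduced, not details from the original question:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Linq;

// Hypothetical record type for the parsed file rows.
public record FileRow(string Col1, string Col2);

public static class BatchedInserts
{
    // Re-open the connection once per batch of rows rather than once per row.
    // "StagingTable", "Col1" and "Col2" are placeholder names.
    public static void Run(string connectionString, IReadOnlyList<FileRow> rows, int batchSize = 500)
    {
        for (int offset = 0; offset < rows.Count; offset += batchSize)
        {
            using var conn = new SqlConnection(connectionString);
            conn.Open();

            foreach (var row in rows.Skip(offset).Take(batchSize))
            {
                using var cmd = new SqlCommand(
                    "INSERT INTO StagingTable (Col1, Col2) VALUES (@c1, @c2)", conn);
                cmd.Parameters.AddWithValue("@c1", row.Col1);
                cmd.Parameters.AddWithValue("@c2", row.Col2);
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```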

Secondly, the proper way to insert/update that many rows into a database from a .NET application is to use SqlBulkCopy, not individual INSERT or UPDATE commands. You should use SqlBulkCopy to load the data rows into a holding/staging table, and then use a SQL stored procedure to do the insert/update into the actual table(s) en masse.
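
Here is a minimal sketch of that pattern, assuming the parsed file has already been loaded into a DataTable whose columns match the staging table. The table and stored-procedure names (dbo.ImportStaging, dbo.usp_MergeImport) are hypothetical, and newer projects may use Microsoft.Data.SqlClient instead of System.Data.SqlClient:

```csharp
using System.Data;
using System.Data.SqlClient;

public static class BulkLoader
{
    // Bulk-load the file rows into a staging table, then let a stored
    // procedure do the set-based insert/update into the real table(s).
    public static void BulkLoadAndMerge(string connectionString, DataTable rows)
    {
        using var conn = new SqlConnection(connectionString);
        conn.Open();

        // 1. Stream all ~80,000 rows into the staging table in one bulk operation.
        using (var bulk = new SqlBulkCopy(conn))
        {
            bulk.DestinationTableName = "dbo.ImportStaging";
            bulk.WriteToServer(rows);
        }

        // 2. One stored-procedure call merges staging into the actual table(s).
        using var merge = new SqlCommand("dbo.usp_MergeImport", conn)
        {
            CommandType = CommandType.StoredProcedure
        };
        merge.ExecuteNonQuery();
    }
}
```

Inside the stored procedure, a single set-based statement (for example a MERGE, or an UPDATE followed by an INSERT of the rows that don't yet exist) handles the insert/update in one pass rather than 80,000 round trips.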

If you are concerned about the sustained load of the SqlBulkCopy operation, it has batching options built in.
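
For instance, the BatchSize, BulkCopyTimeout and NotifyAfter members control how the copy is committed and reported. The snippet below reuses the hypothetical staging table and DataTable from above; the numbers are illustrative, not tuning recommendations:

```csharp
// 'connectionString' and the DataTable 'rows' are the same assumptions as above.
using (var bulk = new SqlBulkCopy(connectionString))
{
    bulk.DestinationTableName = "dbo.ImportStaging";
    bulk.BatchSize = 5000;        // send/commit rows to the server in 5,000-row batches
    bulk.BulkCopyTimeout = 120;   // seconds before the operation times out
    bulk.NotifyAfter = 10000;     // raise SqlRowsCopied every 10,000 rows
    bulk.SqlRowsCopied += (sender, e) =>
        Console.WriteLine($"{e.RowsCopied} rows copied so far");
    bulk.WriteToServer(rows);
}
```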

Using this technique, the initial upload of the data should be at least 5x faster, and the inserts/updates into the actual table should take only a matter of seconds.