Monday, March 12, 2012

I'm flooding my SQL Server with INSERT/UPDATE requests. How do I optimize?

I have an application that calculates a batch of numbers and then inserts them into a table (or updates the matching record if it already exists). In my test environment it issues about 100 insert-or-update requests to the server per second, and it could run for several hours.
After the first several hundred requests, SQL Server bogs down (processor at 90-100%) and the application slows down while it waits for SQL Server to update the database.
What would be the best way to optimize this app? Would it help to collect all my insert/update requests and send them to the server as one big batch of statements (say, every 1,000 requests)? Is there a better way of doing this?
Thanks!

Here are two approaches we have used:

1. Queue up all the inserts/updates and run them as a batch job in one transaction over one connection (a minimal sketch follows this list). The next release of our Lattice.DataMapper will support running the batch job on a separate thread at a time you define, say midnight tonight.
2. Send an XML document containing all of your insert/update data over to SQL Server and process it in a stored procedure using SQL Server's XML support, so a single stored procedure call does all the work (see the second sketch below).
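
For the first approach, here is a minimal T-SQL sketch of what the batch sent over the connection might look like. The table dbo.Results(Id, Value) and the literal values are assumptions for illustration; the point is one explicit transaction around many statements instead of one implicit commit (and log flush) per request.

    BEGIN TRANSACTION;

    -- Upsert one calculated row; the real batch would repeat this block
    -- (with different values) for every queued request.
    -- dbo.Results(Id, Value) is a hypothetical table for this sketch.
    UPDATE dbo.Results SET Value = 42.0 WHERE Id = 1;
    IF @@ROWCOUNT = 0
        INSERT INTO dbo.Results (Id, Value) VALUES (1, 42.0);

    COMMIT TRANSACTION;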
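For the second approach, here is a sketch of a stored procedure that shreds an XML parameter and upserts every row in one call. The procedure name, the dbo.Results table, and the XML shape (<rows><row id="..." value="..."/></rows>) are all assumptions; MERGE requires SQL Server 2008 or later (on older versions you could use OPENXML with separate UPDATE/INSERT statements instead).

    CREATE PROCEDURE dbo.UpsertResults
        @doc xml
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Shred the XML into a rowset and upsert all rows in one statement.
        MERGE dbo.Results AS t
        USING (
            SELECT r.value('@id',    'int')   AS Id,
                   r.value('@value', 'float') AS Value
            FROM @doc.nodes('/rows/row') AS x(r)
        ) AS s
        ON t.Id = s.Id
        WHEN MATCHED THEN
            UPDATE SET t.Value = s.Value
        WHEN NOT MATCHED THEN
            INSERT (Id, Value) VALUES (s.Id, s.Value);
    END;

The application then makes a single call per batch, for example:

    EXEC dbo.UpsertResults
        @doc = N'<rows><row id="1" value="42.0"/><row id="2" value="7.5"/></rows>';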
