# SQL Server – Is It Okay to Leave the Connection Open and Set CommandTimeout for Long Durations

Tags: connections, query, sql-server

I have to process daily files in a long-running job that deletes, imports, and updates data. It takes literally 20 to 30 minutes per file, for several files a day.

**EDIT:** I also have some other processes that look like they are going to run for 90+ minutes each, and I'm worried about those too. Is this the best strategy?

I have a console application running with something like this:

For each file:

    using (SqlConnection cx = new SqlConnection(connectionString))
    {
        // Parameterize both values; the original concatenated fileName into the SQL
        // and referenced @date without ever supplying it.
        SqlCommand cmd = new SqlCommand(
            "delete from " + tableName + " where fileName = @fileName and date > @date", cx);
        cmd.Parameters.AddWithValue("@fileName", fileName);
        cmd.Parameters.AddWithValue("@date", date);
        cmd.CommandTimeout = 1280;
        cx.Open();
        cmd.ExecuteNonQuery();
    }


And then:

    using (var sbc = new SqlBulkCopy(connectionString))
    {
        sbc.DestinationTableName = tableName;
        sbc.BatchSize = 750;
        sbc.BulkCopyTimeout = 640;
        sbc.WriteToServer(dt);
    }


And finally:

    System.Data.DataTable dt = new System.Data.DataTable();

    using (SqlConnection cx = new SqlConnection(connectionString))
    {
        // Fixed the duplicated "=" and declared the SqlDataAdapter,
        // which the original referenced as "da" without creating it.
        SqlCommand cmd = new SqlCommand(
            "SELECT * from " + tableName + " where fileName = @fileName", cx);
        cmd.Parameters.AddWithValue("@fileName", fileName);
        cmd.CommandTimeout = 1280;
        cx.Open();
        var da = new SqlDataAdapter(cmd);
        da.Fill(dt);
    }

// this is followed by reporting on the data


So this is a low-priority task, but it has to get done, and each file has to be processed sequentially. Is there any problem with setting the command timeout so high? We are on an internal network. I've run the Database Engine Tuning Advisor (DTA) and applied all of its indexing advice.
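For what it's worth, the timeout I'm setting is `CommandTimeout` (how long the client waits for a single command to finish executing), which is separate from the connection string's `Connect Timeout` (how long the client waits to establish the connection). A value of 0 means wait indefinitely. A minimal sketch of the distinction, with hypothetical server/table names:

```csharp
// "Connect Timeout" only bounds how long it takes to open the connection.
using (var cx = new SqlConnection(
    "Server=.;Database=MyDb;Integrated Security=true;Connect Timeout=15"))
using (var cmd = new SqlCommand(
    "delete from MyTable where fileName = @fileName", cx))
{
    cmd.Parameters.AddWithValue("@fileName", "daily.csv");
    // CommandTimeout bounds the query's execution; 0 = no limit.
    cmd.CommandTimeout = 0;
    cx.Open();
    cmd.ExecuteNonQuery();
}
```

So even with a short `Connect Timeout`, a long `CommandTimeout` (or 0) lets an individual delete or import run as long as it needs.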

Any other thoughts, advice and ideas welcome.