There are many parts of this that are potential bottlenecks, so you probably want to rethink your architecture here and take care of a couple of basics while you're at it. Expect to do a lot of investigative work, identifying slow areas section by section.

Some things I would consider: you create a List without a capacity. You know you're going to be adding 1000 records to it at a time, so create the list sized to 1000 entries up front and avoid the repeated internal resizing. Something else I would look at is not reusing the same list across batches. What you could do is create, effectively, a List of Lists and build all of your batches before you actually update the database. Something like this:
// Pre-size both lists so they don't repeatedly resize as they grow.
List<List<TblDispense>> batches = new List<List<TblDispense>>(1000);

List<TblDispense> tblDispense = new List<TblDispense>(1000);
batches.Add(tblDispense);

// ... do the work of adding data into tblDispense, but don't clear it
// down at the end; start a fresh List<TblDispense> for the next batch
// and Add that one to batches as well.
At the end, just iterate over your batches and add them.
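As a minimal sketch of that final pass, assuming you're inserting through an Entity Framework-style context (the MyDbContext and TblDispenses names here are placeholders; substitute whatever your actual data access layer uses):

using (var context = new MyDbContext()) // hypothetical context name
{
    foreach (List<TblDispense> batch in batches)
    {
        context.TblDispenses.AddRange(batch); // queue the whole batch
        context.SaveChanges();                // one round trip per batch
    }
}

Keeping SaveChanges per batch, rather than per record, is usually where most of the win comes from.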
However, playing around with this isn't going to get around the fact that your biggest bottleneck is going to be physically loading and reading the file. You might want to investigate faster file-reading facilities in the framework, and use something like a ReadOnlySpan<char> over the contents of the file so you can slice out lines and fields without allocating strings for each one (if you're using a late enough version of .NET).
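As a rough sketch of that span approach, assuming the file is text and fits comfortably in memory (the path is a placeholder):

using System;
using System.IO;

// Load the file once, then walk it as a ReadOnlySpan<char> so
// splitting into lines doesn't allocate a string per line.
string text = File.ReadAllText(@"C:\data\dispense.txt"); // placeholder path
ReadOnlySpan<char> remaining = text.AsSpan();

while (!remaining.IsEmpty)
{
    int newline = remaining.IndexOf('\n');
    ReadOnlySpan<char> line = newline >= 0 ? remaining.Slice(0, newline) : remaining;
    remaining = newline >= 0 ? remaining.Slice(newline + 1) : ReadOnlySpan<char>.Empty;

    line = line.TrimEnd('\r'); // cope with Windows line endings
    // ... parse fields out of 'line' here, slicing instead of calling Substring
}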
Note: when I am investigating performance, I normally use JetBrains dotTrace (a .NET profiler).