Step 1 - Parallelize the processing
var bag = new ConcurrentBag<Tuple<int, int>>();
Parallel.For(0, ds.Tables[s].Rows.Count, (i) =>
{
    for (int j = 0; j < ds.Tables[s].Columns.Count; j++)
    {
        if (ds.Tables[s].Rows[i][j].ToString().ToUpper().Trim().Contains("NO OF UNITS"))
        {
            int t = j + 1;
            // Guard against "NO OF UNITS" appearing in the last column
            if (t < ds.Tables[s].Columns.Count &&
                ds.Tables[s].Rows[i][t].ToString() == "")
            {
                bag.Add(Tuple.Create(i, j));
            }
        }
    }
});
Step 2 - Perform a bulk operation
var indexes = bag.ToList();
How you perform the bulk operation depends heavily on which database server you're using and how you connect to the data locally. Inserting rows one at a time is expensive, so batch the work instead. If your DataTable is purely disconnected, you can simply add the rows to it sequentially and push them all at once.
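As one sketch of what that batch push could look like, assuming SQL Server: SqlBulkCopy streams an entire DataTable to the server in large batches rather than one INSERT per row. The connection string, the destination table name, and the "results" DataTable below are placeholders you'd replace with your own.

    using System.Data;
    using Microsoft.Data.SqlClient;

    // Assumes SQL Server. "results" is a DataTable whose columns match
    // the destination table's schema; all names here are placeholders.
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "dbo.MyStagingTable";
            bulkCopy.BatchSize = 5000;       // rows sent per round trip
            bulkCopy.WriteToServer(results); // streams the whole table
        }
    }

For other database servers the same idea applies under a different API (e.g. a bulk loader or multi-row INSERT); the point is to pay the round-trip cost once per batch, not once per row.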
Step 3 - Design the database better
I put this as the last step because, realistically, changing the schema of a deployed or legacy application can be challenging and risky, while the first two steps are easier to implement and worth doing anyway. Searching the data for meaningful hard-coded values (like "NO OF UNITS") is fragile. You want that meaning captured as metadata that can be enforced with referential integrity, so the results this process produces can come from an optimized query instead of a full scan.