130,000 rows is a lot to load in one go: I assume that you aren't trying to present this to the user directly!
There are a couple of things you can do to speed it up, but with that number of rows they probably aren't going to have a major effect - they are simple to try, though. Start by looking at your query: are you retrieving information you don't need? For example, 130,000 text fields of 100 characters each need 13,000,000 bytes of bandwidth just to transfer, and an image field is going to be even bigger. If your query is
SELECT * FROM MyTable
Then you may be able to save significant time by fetching only the data you need.
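For example (the column names here are illustrative, not from your schema), restrict the query to just what the display needs:

```
-- Fetch only the columns the UI actually shows;
-- leave the wide text and image columns out entirely.
SELECT Id, Name, Modified FROM MyTable
```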
There is also the option of "on-demand" loading - you fetch the minimum data necessary in one go (just the row IDs for example) and then only fetch the row detail when you actually need it. I use this system for images to good effect:
On-demand loading of images from a database (it's in C#, but it's pretty obvious what is going on)
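A minimal sketch of the two-pass idea, assuming a hypothetical `MyTable` with `Id` and `Description` columns (not the code from the article):

```
using System.Collections.Generic;
using System.Data.SqlClient;

class RowLoader
{
    private readonly string connectionString;
    public RowLoader(string connectionString) { this.connectionString = connectionString; }

    // Pass 1: fetch just the IDs - cheap even for 130,000 rows.
    public List<int> LoadIds()
    {
        var ids = new List<int>();
        using (var con = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("SELECT Id FROM MyTable", con))
        {
            con.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read()) ids.Add(reader.GetInt32(0));
            }
        }
        return ids;
    }

    // Pass 2: fetch one row's detail only when the user actually needs it,
    // e.g. when the row scrolls into view or is selected.
    public string LoadDescription(int id)
    {
        using (var con = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT Description FROM MyTable WHERE Id = @id", con))
        {
            cmd.Parameters.AddWithValue("@id", id);
            con.Open();
            return (string)cmd.ExecuteScalar();
        }
    }
}
```

The up-front cost drops to one narrow query, and each detail fetch is a single-row lookup by primary key.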
The biggest performance improvement (from a user perspective) is the hardest to implement: move your loading code into a separate thread and retrieve row-by-row, updating as needed. It's not too complex to actually do this - a BackgroundWorker is pretty simple - but it can mean substantial changes to other code, depending on how you are using things. I tend to try to load in the background at startup when the user isn't actually doing anything yet, if I can.
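The threaded approach might look something like this - `MyRow`, `FetchRowsOneByOne`, `myGrid`, and `statusLabel` are all hypothetical names standing in for your own data-access code and controls:

```
using System.ComponentModel;

var worker = new BackgroundWorker { WorkerReportsProgress = true };

worker.DoWork += (s, e) =>
{
    // Runs on the background thread - never touch UI controls here.
    foreach (MyRow row in FetchRowsOneByOne())   // your data-access code
    {
        ((BackgroundWorker)s).ReportProgress(0, row);
    }
};

worker.ProgressChanged += (s, e) =>
{
    // Marshalled back to the UI thread - safe to update controls.
    myGrid.Rows.Add((MyRow)e.UserState);
};

worker.RunWorkerCompleted += (s, e) =>
{
    statusLabel.Text = "Loading complete";
};

worker.RunWorkerAsync();
```

The key point is that `DoWork` runs off the UI thread while `ProgressChanged` and `RunWorkerCompleted` run on it, so the grid fills in row-by-row and the form stays responsive throughout.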