Just a random question, since I already have a different solution, but I ran into the following situation.
I have a table with a billion rows (actually probably about 1.1 billion).
I was trying to scan the table, read every row, and do an analysis.
I certainly couldn't load the entire table into memory, so I was using a paged query (LIMIT with an offset and row count). Each page took about 90 minutes for the query itself, so it wasn't really something that was going to allow me to do much analysis.
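Roughly what my paging looked like (table and column names here are placeholders; the real schema is below):

```sql
-- Sketch of the paged scan; names are hypothetical.
-- MySQL has to walk past every skipped row to satisfy the OFFSET,
-- so later pages get progressively more expensive.
SELECT id, col_a, col_b
FROM big_table
ORDER BY id
LIMIT 10000 OFFSET 500000000;
```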
Any other ideas on scanning it or speeding it up?
At one point I was even considering just dumping it and writing an app to do the analysis outside of the database. That was about the only other solution I had.
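For reference, the dump idea was roughly this (again placeholder names; plain INTO OUTFILE writes to the database server's own filesystem, so on Aurora it would have to be the INTO OUTFILE S3 variant or a client-side dump such as mysqldump):

```sql
-- Sketch of the "dump it and analyze outside the database" option.
-- Placeholder names; stock MySQL writes the file on the server itself,
-- so Aurora would need the S3 variant instead.
SELECT id, col_a, col_b
FROM big_table
INTO OUTFILE '/var/lib/mysql-files/big_table_dump.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';
```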
The database was MySQL (AWS Aurora actually).
The relevant parts of the table were as follows, and the id column is the primary key. (I didn't design the table.)