I am using NPOI to export data to an Excel sheet. It gives the error "Invalid row number (1048576) outside allowable range (0..1048575)" when there are more than a million rows of data. Is there any way to divide the data across multiple sheets stored in the same Excel file with NPOI?

What I have tried:

var workbook = new XSSFWorkbook();
var sheet = workbook.CreateSheet("USERReport");

string json = JsonConvert.SerializeObject(dataTable);
var items = JsonConvert.DeserializeObject&lt;List&lt;USERExcelSheet&gt;&gt;(json);

var columns = new[] { "DATETIME", "NAME", "ADDRESS", "MOBILE" };

// Header row: the cells were being created but never given a value.
var headerRow = sheet.CreateRow(0);
for (int i = 0; i &lt; columns.Length; i++)
{
    headerRow.CreateCell(i).SetCellValue(columns[i]);
}

for (int i = 0; i &lt; items.Count; i++)
{
    var row = sheet.CreateRow(i + 1);
    var item = items[i];
    for (int j = 0; j &lt; columns.Length; j++)
    {
        row.CreateCell(j).SetCellValue(
            Convert.ToString(item.GetType().GetProperty(columns[j]).GetValue(item, null)));
    }
}

var directory = Path.Combine(_env.WebRootPath, "USERReport");
if (!Directory.Exists(directory))
    Directory.CreateDirectory(directory);

var filePath = Path.Combine(directory, FileName + ".xlsx");
using (var file = new FileStream(filePath, FileMode.CreateNew, FileAccess.Write))
{
    workbook.Write(file); // the stream was opened but the workbook was never written to it
}
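Since the question asks specifically about splitting the data across multiple sheets, here is a minimal sketch of how that could look with NPOI. It assumes the same `items` list and `columns` array as the code above; `SheetCapacity` is an assumed chunk size, kept below Excel's hard limit of 1,048,576 rows per sheet to leave room for a header row on each sheet:

```csharp
// Sketch only: split rows across several sheets of one workbook.
// `SheetCapacity` is an assumed chunk size under Excel's per-sheet limit.
const int SheetCapacity = 1_000_000;

var workbook = new XSSFWorkbook();
// Ceiling division: how many sheets are needed for items.Count rows.
int sheetCount = (items.Count + SheetCapacity - 1) / SheetCapacity;

for (int s = 0; s < sheetCount; s++)
{
    var sheet = workbook.CreateSheet("USERReport_" + (s + 1));

    // Repeat the header on every sheet so each one is readable on its own.
    var headerRow = sheet.CreateRow(0);
    for (int c = 0; c < columns.Length; c++)
        headerRow.CreateCell(c).SetCellValue(columns[c]);

    int start = s * SheetCapacity;
    int end = Math.Min(start + SheetCapacity, items.Count);
    for (int i = start; i < end; i++)
    {
        var row = sheet.CreateRow(i - start + 1);
        var item = items[i];
        for (int j = 0; j < columns.Length; j++)
            row.CreateCell(j).SetCellValue(
                Convert.ToString(item.GetType().GetProperty(columns[j]).GetValue(item, null)));
    }
}
```

The workbook can then be written to a `FileStream` exactly as before; only the row-to-sheet mapping changes.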
Updated 13-Sep-21 22:12pm
Patrice T 14-Sep-21 2:45am
What is your problem with this code?
Member 14512875 14-Sep-21 2:56am
It gives an error when there are more than 1 million rows.
Patrice T 14-Sep-21 3:00am
Please give the exact error message. Use "Improve question" to update your question so that everyone can see this information.
Member 14512875 14-Sep-21 3:03am
Invalid row number(1048576) outside allowable range (0.. 1048575)
Richard MacCutchan 14-Sep-21 4:20am
The message could not be clearer. Excel worksheets have limits on the number of rows and the number of columns. You need to break your data into smaller pieces.
PIEBALDconsult 14-Sep-21 9:39am
Don't use Excel as a data store. Just write a CSV file and if someone wants to load it into Excel, they are welcome to.
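To illustrate PIEBALDconsult's suggestion, here is a minimal sketch (not from the thread) that writes the same `DataTable` out as CSV instead of .xlsx; a CSV file has no row limit, and Excel can still open it. The quoting here is deliberately simple: fields containing commas, quotes, or newlines are wrapped in double quotes, with embedded quotes doubled.

```csharp
using System;
using System.Data;
using System.IO;
using System.Linq;

static class CsvExport
{
    static string Escape(object value)
    {
        var s = Convert.ToString(value) ?? "";
        // Quote only when the field contains a delimiter, quote, or newline.
        return s.IndexOfAny(new[] { ',', '"', '\n', '\r' }) >= 0
            ? "\"" + s.Replace("\"", "\"\"") + "\""
            : s;
    }

    public static void WriteCsv(DataTable table, string path)
    {
        using (var writer = new StreamWriter(path))
        {
            // Header line from the column names, then one line per row.
            writer.WriteLine(string.Join(",",
                table.Columns.Cast<DataColumn>().Select(c => Escape(c.ColumnName))));
            foreach (DataRow row in table.Rows)
                writer.WriteLine(string.Join(",", row.ItemArray.Select(Escape)));
        }
    }
}
```

Because the rows are streamed straight to disk, this also avoids holding a million-row workbook object in memory the way NPOI does.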

Excel has limitations: Excel specifications and limits[^] - specifically in your case it is limited to a maximum of 1,048,576 rows by 16,384 columns per sheet.

There is no way round that, and while it is possible to break your data into multiple sheets I have to say that a 1,000,000 row (or 16,000 column for that matter) spreadsheet is probably an abomination.
While it is possible to use Excel as a database or a data transfer method, it's not a good idea when the data volume starts to grow - and your whole workflow should be reorganised around a better mechanism (an actual database for example). All you are doing is creating difficulties for yourself later, and probably using way too much memory and processing power in bodging around the real problem!
Member 14512875 14-Sep-21 3:07am
Is there any built-in method in NPOI to create sheets at run time?
OriginalGriff 14-Sep-21 3:53am
Did you read what I said, at all?
Member 14512875 14-Sep-21 4:00am
Yes, I read that, but with the above solution I need to do a lot of manual things.
OriginalGriff 14-Sep-21 4:39am
And you would have to do much the same manual things if you split the data to multiple sheets because any software which processes it (and sure as heck no human will ever look at it!) will be equally unaware that the second sheet exists.

Which means that it won't fail, but won't process all the data either - which is much, much worse!
Moving to a DB means you have to fix every app that touches the data, or it becomes obvious that you missed one.
TheRealSteveJudge 14-Sep-21 4:26am
Even using Excel 64-bit (which is not limited to 2 GB) on a high-end CPU workstation loaded with RAM, you can't do this.

FYI: 64-bit Excel can address up to 8 terabytes of memory ... that does not mean you can use that much memory!

The limits OriginalGriff mentions are for one worksheet: you can, of course, have multiple worksheets per workbook, and multiple workbooks per project.
The limit of virtual address space for 32-bit editions of Windows-based applications is 2 gigabytes (GB). For Excel, this space is shared by the Excel application itself together with any add-ins that run in the same process. The size of the worksheet itself also affects the usage of virtual address space. Because Excel loads the worksheet into addressable memory, some worksheets that have a file size of less than 2 GB may still require Excel to use more than 2 GB of addressable memory.

The 64-bit edition of Office does not impose hard limits on file size. Instead, workbook size is limited only by available memory and system resources.
Even if you could ... you'd most likely have a mess that would take forever to recalculate.

Assuming you don't have the funds to buy a supercomputer, you need to use another strategy to create your solution. A strategy that breaks the computational process into steps each of which can work on a subset of the data.

Do investigate Power Pivot, and Power Query, as possible tools that would allow you to use very large data with Excel (I have not used either one): [^],[^].

Read these: [^], [^] for other ideas.

And, as OriginalGriff suggested, consider using a database.
This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)