Hi,
I'm trying to send large files (~4 GB) from an HTTP server to some clients. My server implementation is based on Boost's HTTP Server 3 example.
The problem is that the files are first loaded completely into memory and then sent to the client.
I would like to read chunks of about 5-10 MB and send each one before reading the next, instead of reading the whole file into one string (rep.content).
This is what my code looks like:
std::ifstream is(full_path_no_params.c_str(), std::ios::in | std::ios::binary);
if (!is)
{
  rep = reply::stock_reply(reply::not_found);
  std::cout << "File not found\n";
  return;
}
rep.status = reply::ok;
char buf[512];
while (is.read(buf, sizeof(buf)).gcount() > 0)
  rep.content.append(buf, is.gcount());
rep.headers.resize(2);
rep.headers[0].name = "Content-Length";
rep.headers[0].value = boost::lexical_cast<std::string>(rep.content.size());
rep.headers[1].name = "Content-Type";
rep.headers[1].value = mime_types::extension_to_type(extension);
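I guess I could at least set Content-Length without buffering the whole body, since the file size can be obtained up front by seeking to the end of the stream. A rough sketch (file_size is just a made-up helper name):

```cpp
#include <cassert>
#include <cstdio>
#include <fstream>
#include <string>

// Hypothetical helper: determine a file's size by seeking to its end,
// so Content-Length can be set before any body data is read.
std::size_t file_size(const std::string& path)
{
  std::ifstream is(path.c_str(), std::ios::in | std::ios::binary);
  if (!is)
    return 0;
  is.seekg(0, std::ios::end);
  return static_cast<std::size_t>(is.tellg());
}
```

That way the headers could go out first, and the body could follow in chunks.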
The data is sent using boost::asio::async_write:
boost::asio::async_write(socket_, reply_.to_buffers(),
    strand_.wrap(
        boost::bind(&connection::handle_write, shared_from_this(),
            boost::asio::placeholders::error)));
and this is the definition of reply_.to_buffers():
std::vector<boost::asio::const_buffer> reply::to_buffers()
{
  std::vector<boost::asio::const_buffer> buffers;
  buffers.push_back(status_strings::to_buffer(status));
  for (std::size_t i = 0; i < headers.size(); ++i)
  {
    header& h = headers[i];
    buffers.push_back(boost::asio::buffer(h.name));
    buffers.push_back(boost::asio::buffer(misc_strings::name_value_separator));
    buffers.push_back(boost::asio::buffer(h.value));
    buffers.push_back(boost::asio::buffer(misc_strings::crlf));
  }
  buffers.push_back(boost::asio::buffer(misc_strings::crlf));
  buffers.push_back(boost::asio::buffer(content));
  return buffers;
}
Any ideas on how to do that?
Could I use something like boost::asio::streambuf?
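To make clearer what I'm after, here is a rough sketch of the loop I have in mind (send_file_chunked and send_chunk are made-up names, and the sketch leaves out Asio entirely; in the real server, send_chunk would have to be the socket write, e.g. issuing the next read from the previous async_write's completion handler):

```cpp
#include <cassert>
#include <cstdio>
#include <fstream>
#include <functional>
#include <string>
#include <vector>

// Read the file in fixed-size chunks and hand each chunk to send_chunk
// as soon as it is read, instead of accumulating everything in one string.
// Returns the total number of bytes read (0 if the file could not be opened).
std::size_t send_file_chunked(
    const std::string& path,
    std::size_t chunk_size,
    const std::function<void(const char*, std::size_t)>& send_chunk)
{
  std::ifstream is(path.c_str(), std::ios::in | std::ios::binary);
  if (!is)
    return 0;

  std::vector<char> buf(chunk_size);
  std::size_t total = 0;
  while (is.read(buf.data(), static_cast<std::streamsize>(buf.size())).gcount() > 0)
  {
    std::size_t n = static_cast<std::size_t>(is.gcount());
    send_chunk(buf.data(), n);  // each chunk is sent before the next read
    total += n;
  }
  return total;
}
```

This way only one chunk (e.g. 5-10 MB) is in memory at a time, rather than the whole 4 GB file.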
Greets,
Chris