
C++: Saving a large binary file (2GB to 4GB) to RAM in a char*?

I am working on a file transfer program in C++ which converts a file to binary, saves the bytes in a char*, then sends that char* through a TCP connection to another computer. The other computer then recreates the file locally. The program does work, but I run into a big problem with large files! I cannot allocate enough array indexes to contain the bytes! For example, if I want to send a 600MB file, I need a char* with 600 MILLION indexes. This works. But once I go any higher, the program simply cannot allocate the memory and I get errors.

A friend of mine suggested that I split the file into chunks and do the transfer chunk by chunk; however, this creates a plethora of other challenges and would require me to basically rewrite the entire program.

Is there any way to get around this?

A friend of mine suggested that I split the file into chunks and do the transfer chunk by chunk; however, this creates a plethora of other challenges and would require me to basically rewrite the entire program.

This is why it's called computer science, and why once you have mastered these challenges, you can head to the city and earn the big bucks.

I don't know what you mean by "converting to binary", but you shouldn't have to allocate 600MB+ of memory; work with buffering instead.

For example, to send a file from disk (a sketch of this loop follows the list):

  • open file
  • read part of file in buffer
  • send buffer over TCP connection (repeat until done)
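A minimal sketch of that loop, assuming a connected POSIX socket descriptor sock; the 64 KiB chunk size is an arbitrary tuning choice, and on Windows only the headers and socket types differ, not the loop itself:

    // Minimal sketch of the buffered send loop. Assumes a connected POSIX
    // socket descriptor "sock"; the 64 KiB chunk size is an arbitrary choice.
    #include <fstream>
    #include <vector>
    #include <sys/socket.h>
    #include <sys/types.h>

    // send() may transmit fewer bytes than requested, so loop until the
    // whole chunk has gone out.
    bool send_all(int sock, const char* data, size_t n)
    {
        while (n > 0) {
            ssize_t sent = send(sock, data, n, 0);
            if (sent <= 0) return false;          // error or peer closed
            data += sent;
            n    -= static_cast<size_t>(sent);
        }
        return true;
    }

    bool send_file(int sock, const char* path)
    {
        std::ifstream in(path, std::ios::binary);
        if (!in) return false;

        std::vector<char> buffer(64 * 1024);      // one chunk, not the whole file
        while (in) {
            in.read(buffer.data(), static_cast<std::streamsize>(buffer.size()));
            std::streamsize got = in.gcount();    // short (or zero) on the last read
            if (got > 0 &&
                !send_all(sock, buffer.data(), static_cast<size_t>(got)))
                return false;
        }
        return true;
    }

Memory use stays at one chunk no matter how large the file is. The receiving side mirrors this: recv() into a small buffer and append to the output file until the sender closes the connection (or an agreed-upon length header is exhausted).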

You could also use memory mapping (or TransmitFile() in Windows).
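If memory mapping fits better, a rough POSIX sketch using mmap follows (the platform choice is an assumption; on Windows the equivalent is CreateFileMapping/MapViewOfFile, or TransmitFile() as mentioned above). The OS pages the file in on demand, so no single 2-4 GB heap allocation is made up front; a 64-bit build is needed for files that large:

    // Rough POSIX memory-mapping sketch, again assuming a connected socket
    // descriptor "sock".
    #include <sys/mman.h>
    #include <sys/socket.h>
    #include <sys/stat.h>
    #include <fcntl.h>
    #include <unistd.h>

    bool send_file_mmap(int sock, const char* path)
    {
        int fd = open(path, O_RDONLY);
        if (fd < 0) return false;

        struct stat st;
        if (fstat(fd, &st) < 0 || st.st_size == 0) { close(fd); return false; }

        // Map the file read-only; pages are faulted in as send() walks them.
        void* data = mmap(nullptr, static_cast<size_t>(st.st_size),
                          PROT_READ, MAP_PRIVATE, fd, 0);
        if (data == MAP_FAILED) { close(fd); return false; }

        const char* p = static_cast<const char*>(data);
        size_t left   = static_cast<size_t>(st.st_size);
        while (left > 0) {                        // send() may write partially
            ssize_t sent = send(sock, p, left, 0);
            if (sent <= 0) break;
            p    += sent;
            left -= static_cast<size_t>(sent);
        }

        munmap(data, static_cast<size_t>(st.st_size));
        close(fd);
        return left == 0;
    }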

In case your data needs to be converted (again, see the sketch after this list):

  • open file
  • read part in buffer
  • convert buffer
  • send buffer (repeat)
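A sketch of that variant: convert_chunk is a purely hypothetical placeholder for whatever transformation the data needs, and send_all is the partial-send helper from the first sketch above.

    // Sketch of the convert-then-send variant.
    #include <fstream>
    #include <vector>

    void convert_chunk(char* data, std::streamsize n);      // hypothetical placeholder
    bool send_all(int sock, const char* data, size_t n);    // see the first sketch

    bool convert_and_send(int sock, const char* path)
    {
        std::ifstream in(path, std::ios::binary);
        if (!in) return false;

        std::vector<char> chunk(64 * 1024);
        while (in) {
            in.read(chunk.data(), static_cast<std::streamsize>(chunk.size()));
            std::streamsize got = in.gcount();
            if (got == 0) break;
            convert_chunk(chunk.data(), got);               // transform this chunk only
            if (!send_all(sock, chunk.data(), static_cast<size_t>(got)))
                return false;
        }
        return true;
    }

If the conversion changes the size of the data, convert into a second buffer and pass that buffer's actual length to send_all instead of the raw read count.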
