For UTF-16 input, I can read the file and convert it to wchar_t
at the same time. For example:
std::wifstream* file = new std::wifstream(name, std::ifstream::binary);
std::locale lo(file->getloc(), new std::codecvt_utf16<wchar_t, 0x10ffff, std::little_endian>);
file->imbue(lo);
How could I do the same for UTF-32 input?
You may want to use the classic C++ pattern of allocating the wifstream
on the stack instead of the heap (new). That is, instead of
std::wifstream* file = new std::wifstream(name, std::ifstream::binary);
write
std::wifstream file(name, std::ifstream::binary);
For the codecvt part, I'd try std::codecvt_utf16<char32_t>
. Note that this facet still converts a UTF-16 byte stream; it just widens the code points to char32_t internally — the standard library ships no codecvt facet whose external encoding is UTF-32. Also note that the whole <codecvt> header is deprecated since C++17.
PS Note that wchar_t
has different sizes on different platforms (16 bits on Windows, 32 bits on most Unix-like systems). So it may be better for you to use std::u16string
for UTF-16 and std::u32string
for UTF-32.