
iconv library on Mac OS X: strange behavior

I am porting an application from CentOS 6 to Mac OS X. It depends on iconv and works normally on CentOS. However, on Mac OS X it does not. I see the following behavior:

const char *codePages[] = { "MAC", "LATIN1", "ISO_8859-1", "WINDOWS-1252", "ASCII" };
int codePagesCount = 5;
iconv_t converter1 = iconv_open("UTF-32", codePages[0]);// Works
if(converter1 != (iconv_t)-1)
   iconv_close(converter1);
iconv_t converter2 = iconv_open("UTF−32", "MAC");// Fails, returns -1
if(converter2 != (iconv_t)-1)
   iconv_close(converter2);

This piece of code looks trivial: the first iconv_open creates a converter and takes the code page name from the codePages array; its zeroth element is "MAC", so it seems logical that Mac OS X must support conversion from its own code page to Unicode. And indeed the first call to iconv_open works. However, the second call to iconv_open does the same thing: it also creates a converter from the Mac encoding to Unicode. Yet for some reason it fails and returns -1. What could cause a situation where calling the same function with what look like the same arguments (one an element of a hardcoded array, the other a hardcoded string) succeeds on the first call and fails on the second?
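One way to narrow this down (a minimal diagnostic sketch, not part of the original code; the try_open helper is hypothetical) is to reproduce both calls and report errno, which POSIX iconv_open sets to EINVAL when the requested conversion is not supported:

#include <errno.h>
#include <iconv.h>
#include <stdio.h>
#include <string.h>

/* Try one conversion and report why iconv_open rejected it. */
static void try_open(const char *to, const char *from)
{
    iconv_t cd = iconv_open(to, from);
    if (cd == (iconv_t)-1) {
        fprintf(stderr, "iconv_open(\"%s\", \"%s\") failed: %s\n",
                to, from, strerror(errno));
        return;
    }
    printf("iconv_open(\"%s\", \"%s\") succeeded\n", to, from);
    iconv_close(cd);
}

int main(void)
{
    try_open("UTF-32", "MAC");   /* the working literal from the question */
    try_open("UTF−32", "MAC");   /* the failing literal, copied verbatim */
    return 0;
}

On macOS this links with -liconv; the failing call should report "Invalid argument", i.e. the encoding name itself is not recognized.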

The second "UTF−32" is not the same as the first: my guess is that the first uses a plain ASCII hyphen-minus, while the second uses a different, non-ASCII minus/dash character.
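A quick way to verify this (a minimal sketch, not from the original answer, assuming the source file is saved as UTF-8) is to dump the bytes of the two literals: the ASCII hyphen-minus is the single byte 0x2D, while the Unicode minus sign U+2212 is encoded in UTF-8 as the three bytes 0xE2 0x88 0x92, which is why iconv does not recognize the second "UTF−32" as an encoding name.

#include <stdio.h>
#include <string.h>

/* Print each byte of a string literal in hex. */
static void dump_bytes(const char *label, const char *s)
{
    printf("%s:", label);
    for (size_t i = 0; i < strlen(s); ++i)
        printf(" %02X", (unsigned char)s[i]);
    printf("\n");
}

int main(void)
{
    dump_bytes("first  literal", "UTF-32");  /* ASCII hyphen-minus, 0x2D */
    dump_bytes("second literal", "UTF−32");  /* non-ASCII minus, three bytes in UTF-8 */
    return 0;
}

Expected output is 55 54 46 2D 33 32 for the first literal and 55 54 46 E2 88 92 33 32 for the second, making the hidden difference visible.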
