
Retrieving 16-bit integer from LWIP server response

On the server side I have the following loop. It takes each 16-bit integer (values 0 to 639) and splits it into two 8-bit chars to fill the buffer (1280 bytes), which is then sent to the client via TCP/IP.

.c

unsigned int data2[1000];
unsigned char char_out[2];
char *p;
int j, len;

/* Write the HTTP header, then append the payload right after it. */
len = generate_http_header(buf, "js", 1280);
p = buf + len;

for (j = 0; j < 640; j++)
{
    /* Split each 16-bit value into two bytes, high byte first. */
    char_out[0] = (unsigned char)((data2[j] >> 8) & 0x00FF);  /* high byte */
    char_out[1] = (unsigned char)( data2[j]       & 0x00FF);  /* low byte  */

    *p = char_out[0];
    p = p + 1;
    *p = char_out[1];
    p = p + 1;
}
....
tcp_write(pcb, buf, len, 1);
tcp_output(pcb);

On the client side I want to retrieve the 16-bit integers from the JSON object. I came up with this solution, but something is happening and I cannot get all of the integer values (0 to 639).

.js
var bin = o.responseText;
var fin = "";

for (i = 0; i < 1000; i = i + 2)
{
    a = bin.charCodeAt(i);      // character code of the high byte
    b = bin.charCodeAt(i + 1);  // character code of the low byte

    // Get the binary representation.
    a = parseInt(a).toString(2);
    a = parseInt(a);
    //alert('a(bin) before:' + a);

    b = parseInt(b).toString(2);
    b = parseInt(b);

    // Pad with zeros on the left (pad() is my helper that left-pads to 8 digits).
    a = pad(a, 8);
    b = pad(b, 8);

    // Concatenate and convert to string.
    a = a.toString();
    b = b.toString();
    c = a + b;

    // Convert to decimal.
    c = parseInt(c, 2);
    //alert('DECIMAL FINAL NUMBER:' + c)
    fin = fin + c.toString();
}

alert('FINAL NUMBER'+fin);

I used Firebug to inspect the HTTP response from the server:

[Raw response body as displayed by Firebug: runs of printable ASCII characters ( !"#$%&'()*+,-./0123456789:;<=>?@A...Z[\]^_`a...z{|}~) interleaved with the replacement character � wherever a byte is not printable ASCII.]

After running the .js code I get the right numbers from 0 to 127 (0, 1, 2, ... 127) as expected, but from 128 to 256 every number comes out as 255 instead of (128, 129, 130, ... 256). After 256 every number is fine and in sequence (257, ... 639). I think the problem is related to the function charCodeAt(), which returns the Unicode value of the character. For some reason it always returns 255, as if I had the same character every time, but that is impossible because the server is sending "129, 130, 131, ... 255". Any idea what could be happening?

Before using the current solution I tried to retrieve the 16-bit integer directly from the JSON object, but could not remove the dependency on a LUT. How can I get the 8 bits of each char in o.responseText = "abcdefgh..." without using a LUT to find the equivalent ASCII code and then the binary representation? I think it is possible using a bitwise operator &, but in that case I would still need to convert to the binary equivalent first and then to an integer. How can I perform bitwise operations directly on strings in JavaScript?
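For reference, this is roughly what I am hoping to write, assuming charCodeAt() really hands back the raw byte values sent by the server (untested sketch; the values array is only there for illustration):

.js
// Sketch: rebuild each 16-bit value with shift/OR instead of binary strings.
// Assumes bin.charCodeAt(i) returns the raw byte (0-255) from the server.
var values = [];
for (var i = 0; i < bin.length; i = i + 2)
{
    var hi = bin.charCodeAt(i);       // high byte (sent first by the server)
    var lo = bin.charCodeAt(i + 1);   // low byte
    values.push((hi << 8) | lo);      // recombine into the original 16-bit integer
}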

It looks like your data is being decoded as UTF-8. UTF-8 is ASCII-compatible, so all ASCII characters (values up to 127) are displayed fine, but the remaining bytes are not valid UTF-8 on their own, so the program that displays the data replaces those invalid bytes with the replacement character (�). Try changing the client (receiving program) encoding to ISO-8859-1.
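If it helps, here is a rough sketch of that idea with XMLHttpRequest (not tested against your server; the charset=x-user-defined trick maps each raw byte into the 0xF700-0xF7FF range, so you mask with 0xFF to recover the byte; url and the synchronous open() call are just placeholders for whatever you use today):

.js
// Sketch: force the browser to hand back raw bytes instead of decoding as UTF-8.
var o = new XMLHttpRequest();
o.open("GET", url, false);                                  // 'url' is a placeholder
o.overrideMimeType("text/plain; charset=x-user-defined");   // bytes come back as 0xF700 + byte
o.send(null);

var bin = o.responseText;
var values = [];
for (var i = 0; i < bin.length; i += 2)
{
    var hi = bin.charCodeAt(i) & 0xFF;       // mask off the 0xF700 offset -> raw high byte
    var lo = bin.charCodeAt(i + 1) & 0xFF;   // raw low byte
    values.push((hi << 8) | lo);             // rebuild the 16-bit integer
}

On newer browsers you could instead set o.responseType = "arraybuffer" and read the bytes from a Uint8Array or DataView, which avoids the string decoding step entirely.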
