
Deserializing a UTF-8 encoded byte[] in JavaScript, in the browser or in a Node.js application?

I have an event class in C#:

public class AreaInterventoCreata {
    // public properties
}
var message = new AreaInterventoCreata();

I create an instance of this class server side. My aim then is to communicate this creation to the clients that have subscribed to this kind of event.

Therefore I am passing the following byte[] to my broker, RabbitMQ:

var responseBody = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(message));
responseChannel.BasicPublish(
                             "",                 // exchange 
                             properties.ReplyTo, // routingKey
                             responseProperties, // basicProperties 
                             responseBody);      // body

A Node.js server has subscribed to this kind of message:

q.subscribe(function (message) { 
    for (var i = 0; i < sockets.length; i++) {
        console.log('_sockets[' + i + '] emitted');
        sockets[i].emit(event, message);
    }
});

This Node.js server has itself received socket.io connections coming from browsers, and can therefore push to these sockets, i.e. to these browsers:

sockets[i].emit(event, message);

Finally, I receive this message in my browser:

var socket = io.connect('http://localhost:8091');
socket.on('Events_AreaIntervento_AreaInterventoCreata:Events', function (data) {
    var json = Utf8.decode(data);
});

When I inspect data through Firebug, it is an object with an array of numbers exposed as data.data.

I assumed this array of numbers to be the byte array I handed over at the beginning of the process. Am I wrong in this assumption?
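A quick way to test that assumption in the browser console (a hypothetical check, assuming data.data really holds the raw bytes of the JSON string):

// If data.data is the UTF-8 byte array of the serialized JSON,
// the first byte should be 123, i.e. the '{' that opens the object.
console.log(data.data[0], String.fromCharCode(data.data[0])); // expected: 123 "{"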

  • How can I deal with it so that I can work with a normal object in JavaScript? (found it! see below)
  • Should I convert it to JSON directly at the Node.js server level instead? Is that more efficient? Is it the more appropriate place to do it?

OK, I found it (at least part of it) with the help of this website:

var Utf8 = {
    // Decode an array of UTF-8 byte values into a JavaScript string.
    // Handles 1-, 2- and 3-byte sequences (no 4-byte sequences / surrogate pairs).
    decodeArray: function (utfArray) {
        var string = "";
        var i = 0;
        var c, c2, c3;

        while (i < utfArray.length) {

            c = utfArray[i];

            if (c < 128) {
                // single-byte (ASCII) character
                string += String.fromCharCode(c);
                i++;
            }
            else if ((c > 191) && (c < 224)) {
                // two-byte sequence: 110xxxxx 10xxxxxx
                c2 = utfArray[i + 1];
                string += String.fromCharCode(((c & 31) << 6) | (c2 & 63));
                i += 2;
            }
            else {
                // three-byte sequence: 1110xxxx 10xxxxxx 10xxxxxx
                c2 = utfArray[i + 1];
                c3 = utfArray[i + 2];
                string += String.fromCharCode(((c & 15) << 12) | ((c2 & 63) << 6) | (c3 & 63));
                i += 3;
            }

        }
        return string;
    }
};

To be used like this:

<script>
    var socket = io.connect('http://localhost:8091');
    socket.on('Events_AreaIntervento_AreaInterventoCreata:Events', function (msg) {
        var jsonString = Utf8.decodeArray(msg.data);
        alert(jsonString);
    });
</script>
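
As a side note, modern browsers expose TextDecoder, which makes the manual decoder above unnecessary on the client. A minimal sketch, assuming the payload arrives as msg.data (a plain array of byte values) and the browser supports TextDecoder:

<script>
    var socket = io.connect('http://localhost:8091');
    socket.on('Events_AreaIntervento_AreaInterventoCreata:Events', function (msg) {
        // Wrap the numeric array in a typed array, decode it as UTF-8,
        // then parse the resulting JSON string into a normal object.
        var bytes = new Uint8Array(msg.data);
        var jsonString = new TextDecoder('utf-8').decode(bytes);
        var obj = JSON.parse(jsonString);
        console.log(obj);
    });
</script>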

But I still wonder whether it is better to do it on the client or on the Node.js server, and how would I do it on the Node.js server?
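One way this could be done on the Node.js server is to turn the byte Buffer into a string and parse it there, so the browsers receive a plain object. A sketch, assuming the subscribe callback receives the raw body as a Buffer under message.data (which is what node-amqp does for non-JSON content types):

q.subscribe(function (message) {
    // message.data is assumed to be a Buffer of UTF-8 bytes
    var obj = JSON.parse(message.data.toString('utf8'));
    for (var i = 0; i < sockets.length; i++) {
        console.log('_sockets[' + i + '] emitted');
        sockets[i].emit(event, obj); // clients now receive a normal JavaScript object
    }
});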

I have never used RabbitMQ before, but it looks like there is something wrong with the way you are passing your JSON string from .NET to RabbitMQ. Your "responseProperties" are probably incorrect, as you shouldn't need to decode a UTF-8 array before using the JSON in Node.js or on the client.
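For what it's worth, if the publisher sets the message content type to application/json, an AMQP client such as node-amqp typically hands the subscribe callback an already-parsed object instead of a wrapped Buffer, so no manual decoding is needed on either side. A sketch under that assumption (the exact behaviour depends on the AMQP library in use):

// Assuming the C# side sets ContentType = "application/json" on the message
// properties, and the Node.js AMQP library deserializes JSON bodies itself.
q.subscribe(function (message) {
    // message is already a plain JavaScript object here, not { data: <Buffer> }
    for (var i = 0; i < sockets.length; i++) {
        sockets[i].emit(event, message); // browsers receive normal JSON
    }
});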
