
Sending WebSocket message over C# Socket

I was looking at Create "Hello World" WebSocket example and began trying to adapt it to send a custom message, entered at the console, over the WebSocket, but I'm having a few issues understanding the code.

My question is: based on the answer linked above, what is the correct way to encode my own message so that it can be sent with a call like this?

client.Send(my-own-message);

So, first of all: Console.Read() reads only one character and returns an int representing that character.

If you want to send a whole message, you probably want Console.ReadLine(), which returns a string.

string msg = Console.ReadLine();
client.Send(GetBytes(msg));

static byte[] GetBytes(string str)
{
    // Copies the string's raw UTF-16 code units, so the result is
    // UTF-16LE encoded (two bytes per char), not UTF-8.
    byte[] bytes = new byte[str.Length * sizeof(char)];
    System.Buffer.BlockCopy(str.ToCharArray(), 0, bytes, 0, bytes.Length);
    return bytes;
}

If encoding matters:

byte[] ascii = System.Text.Encoding.ASCII.GetBytes (msg);
byte[] utf8 = System.Text.Encoding.UTF8.GetBytes (msg);
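
For instance, here is a quick illustration of the difference (the sample string is my own; the byte counts follow from the encodings):

string msg = "héllo";                                      // 'é' is U+00E9

byte[] utf16 = GetBytes(msg);                              // 10 bytes: raw UTF-16LE code units
byte[] utf8  = System.Text.Encoding.UTF8.GetBytes(msg);    // 6 bytes: 'é' encodes as 0xC3 0xA9
byte[] ascii = System.Text.Encoding.ASCII.GetBytes(msg);   // 5 bytes: 'é' is replaced with '?'

Since WebSocket text frames must contain UTF-8 (see the next answer), Encoding.UTF8.GetBytes is the one you want for this use case.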

You should really go to the source: the WebSocket specification (RFC 6455) is actually fairly straightforward to read, and it tells you exactly how your messages should be formatted.

But in short, and assuming you've already completed the initial handshake establishing the connection, here is what a WebSocket frame should contain:

  • a first byte combining the FIN flag with the opcode: 0x81 if the message is a single frame of UTF-8 text, 0x82 if it is binary data (note that a couple of browsers do not support the latter)
  • a length field of one or more bytes, describing the length of the payload. The most significant bit of its first byte must be set on messages sent by the client: it is the mask bit, which must be set on client-to-server messages and must not be set on server-to-client messages. The length encoding itself is variable: if the payload is shorter than 126 bytes, the length is simply encoded in the remaining 7 bits of that first byte; if it fits in 16 bits (at most 65,535 bytes), those 7 bits take the value 126 and the two subsequent bytes contain the length as a big-endian 16-bit integer; otherwise the 7 bits take the value 127 and the subsequent 8 bytes contain the length as a big-endian 64-bit integer
  • a 4-byte masking key, which must be picked randomly for every message
  • and finally, the actual message you wish to send. This must be masked using the masking key, simply by XOR'ing each byte of the payload with a byte of the key: byte i of the message is XOR'ed with byte i % 4 of the masking key.

Do this, and you've created a valid WebSocket frame containing either UTF-8 text or raw binary data. As you can see, there are a few steps involved, but each is relatively straightforward. (And again, please check against the RFC linked above, because I wrote all of this from memory, so there may be minor inaccuracies.)
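
Putting those steps together, here is a minimal sketch of client-side framing in C#. This is my own illustration, checked against the description above rather than the full RFC: BuildClientFrame is a made-up name, it covers only a single unfragmented data frame, and it assumes the opening handshake has already completed.

using System;
using System.IO;
using System.Security.Cryptography;

static byte[] BuildClientFrame(byte[] payload, bool isText)
{
    var frame = new MemoryStream();

    // First byte: FIN flag plus opcode (0x1 = text, 0x2 = binary).
    frame.WriteByte((byte)(isText ? 0x81 : 0x82));

    // Length field; the mask bit (0x80) is always set on client frames.
    if (payload.Length <= 125)
    {
        frame.WriteByte((byte)(0x80 | payload.Length));
    }
    else if (payload.Length <= 65535)
    {
        frame.WriteByte(0x80 | 126);
        frame.WriteByte((byte)(payload.Length >> 8));   // 16-bit length, big-endian
        frame.WriteByte((byte)payload.Length);
    }
    else
    {
        frame.WriteByte(0x80 | 127);
        for (int i = 7; i >= 0; i--)                    // 64-bit length, big-endian
            frame.WriteByte((byte)((long)payload.Length >> (8 * i)));
    }

    // 4-byte masking key, picked randomly for every frame.
    byte[] mask = new byte[4];
    using (var rng = RandomNumberGenerator.Create())
        rng.GetBytes(mask);
    frame.Write(mask, 0, 4);

    // Mask the payload: XOR byte i with byte i % 4 of the key.
    for (int i = 0; i < payload.Length; i++)
        frame.WriteByte((byte)(payload[i] ^ mask[i % 4]));

    return frame.ToArray();
}

With that helper, the console example from the first answer becomes:

string msg = Console.ReadLine();
client.Send(BuildClientFrame(System.Text.Encoding.UTF8.GetBytes(msg), isText: true));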
