
C# BinaryWriter start and end of string

I'm sending a large string, 0.443+0.064+-0.120+-0.886+0.15167+-0.26754+0.95153, over a TCP socket connection.

The message I receive is not the same as the string I send. It is cut off at seemingly random points, e.g. 43+0.064+-0.120+-0.886+0.15167+-0.26754+0

How can I make sure the full string is read?

This is the client code:

public static void SendMessage(string message)
{
    if (socketConnection == null)
    {
        return;
    }

    using (BinaryWriter writer = new BinaryWriter(socketConnection.GetStream(), Encoding.ASCII, true))
    {
        writer.Flush();
        writer.Write(message);
        writer.Flush();
    }
}

This is my server code:

private void ListenForIncommingRequests()
{
    tcpListener = new TcpListener(IPAddress.Parse("127.0.0.1"), 8080);
    tcpListener.Start();
    connectedTcpClient = tcpListener.AcceptTcpClient();

    using (BinaryReader reader = new BinaryReader(connectedTcpClient.GetStream()))
    {
        while (true)
        {
            string clientMessage = reader.ReadString();
        }
    }
}

As @NineBerry pointed out in the comments, you're writing ASCII-encoded bytes but reading them with the BinaryReader's default encoding (UTF-8). Make sure to use the same encoding on both ends: either remove Encoding.ASCII when instantiating your BinaryWriter, so both sides fall back to the same default, or pass the same encoding, e.g. Encoding.Unicode, to both your BinaryWriter AND your BinaryReader.
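As a minimal sketch of that suggestion (not the only way to fix it), the writer and reader below are both constructed with Encoding.Unicode, so the length-prefixed string written by BinaryWriter.Write(string) is decoded the same way by BinaryReader.ReadString(). It reuses the field names from the question (socketConnection, tcpListener, connectedTcpClient) and assumes System.IO, System.Net, System.Net.Sockets and System.Text are imported.

// Client side: pass an explicit encoding so both ends agree.
public static void SendMessage(string message)
{
    if (socketConnection == null)
    {
        return;
    }

    using (BinaryWriter writer = new BinaryWriter(socketConnection.GetStream(), Encoding.Unicode, true))
    {
        // Write(string) emits a length prefix followed by the encoded bytes.
        writer.Write(message);
        writer.Flush();
    }
}

// Server side: construct the reader with the same encoding.
private void ListenForIncommingRequests()
{
    tcpListener = new TcpListener(IPAddress.Parse("127.0.0.1"), 8080);
    tcpListener.Start();
    connectedTcpClient = tcpListener.AcceptTcpClient();

    using (BinaryReader reader = new BinaryReader(connectedTcpClient.GetStream(), Encoding.Unicode, true))
    {
        while (true)
        {
            // ReadString reads the length prefix and then exactly that many bytes,
            // so the whole message comes back in one call when both sides share an encoding.
            string clientMessage = reader.ReadString();
        }
    }
}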
