Sunday, December 29, 2013

Detecting Timeout on NetworkStream

Hi guys,



I currently have a custom network protocol, written on top of the NetworkStream class. Within the protocol I have implemented a method that waits for incoming data and doubles as a timeout check. This seemed to be working fine until a client uploads an image of approx 24kb or larger, at which point the timeout method returns false (a timeout) and the upload fails.




Does anybody have any pointers as to where I am going wrong? The code I have is below:



public bool WaitForData(int TimeoutSeconds)
{
    if (TimeoutSeconds < 0) // No timeout
    {
        while (netstream.DataAvailable == false)
            Thread.Sleep(1);
        return true;
    }
    else
    {
        Stopwatch timer = new Stopwatch();
        timer.Start();
        do
        {
            if (netstream.DataAvailable)
                return true;
            Thread.Sleep(1);
        } while (timer.Elapsed.Seconds < TimeoutSeconds);
        timer.Stop();
        return false;
    }
}
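For reference, here is a rough, untested sketch of the same wait written against Socket.Poll, which blocks inside the call until the socket is readable or the timeout expires, instead of sleeping and re-checking DataAvailable. It assumes you keep a reference to the Socket the NetworkStream was built from; the helper name is made up for illustration.

using System.Net.Sockets;

// Sketch only: returns true when the socket becomes readable within the timeout.
// Note that Poll also returns true when the peer has closed the connection,
// so the following Read may return 0 bytes.
public bool WaitForDataPoll(Socket socket, int timeoutSeconds)
{
    if (timeoutSeconds < 0)
        return socket.Poll(-1, SelectMode.SelectRead); // -1 = wait indefinitely

    // Poll takes microseconds (watch for int overflow on very large timeouts).
    return socket.Poll(timeoutSeconds * 1000 * 1000, SelectMode.SelectRead);
}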



This is how WaitForData is (or was) used:



public byte[] ReadBytes(int size)
{
    return ReadBytes(size, null);
}

public byte[] ReadBytes(int size, Action<int> ProgressBarCallback)
{
    byte[] barr = new byte[size];
    for (int i = 0; i < barr.Length; )
    {
        // the source of problems uploading invoices over approx 24kb:
        //if (WaitForData(timeout) == false)
        //    throw new Exception("RawSocket ReadBytes() Timed Out: Total bytes read: " + i);

        int read = netstream.Read(barr, i, Math.Min(barr.Length - i, 8192));
        i += read;

        if (i % 10 == 0 && ProgressBarCallback != null)
            ProgressBarCallback(i * 100 / barr.Length); // report percentage complete
    }
    if (ProgressBarCallback != null)
        ProgressBarCallback(100);
    return barr;
}
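One possible direction (a sketch only; the method name and signature below are just for illustration): NetworkStream supports a ReadTimeout, so Read itself will throw an IOException if no data arrives within the window, which removes the need to poll DataAvailable at all.

using System;
using System.IO;
using System.Net.Sockets;

// Sketch: rely on NetworkStream.ReadTimeout rather than a separate wait loop.
// Read() throws IOException if nothing arrives within timeoutMs, and returns 0
// if the remote side closes the connection cleanly.
public byte[] ReadBytesWithTimeout(NetworkStream netstream, int size, int timeoutMs)
{
    byte[] barr = new byte[size];
    netstream.ReadTimeout = timeoutMs; // applies to each individual Read() call

    int i = 0;
    while (i < size)
    {
        int read = netstream.Read(barr, i, Math.Min(size - i, 8192));
        if (read == 0)
            throw new IOException("Connection closed before all " + size + " bytes were received.");
        i += read;
    }
    return barr;
}

The timeout here applies per Read call, so a slow but steadily transferring client never trips it, while a genuinely stalled connection does.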



Any ideas on improving this? At the moment I am relying on the client not disconnecting during the upload.
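For what it's worth, a common (though imperfect) way to notice that the other side has closed the connection is a zero-length poll: if the socket reports readable but has no bytes available, the peer has shut down. This is only a sketch and assumes direct access to the Socket; it will not catch an abrupt network drop, which only shows up once a send fails or a timeout fires.

using System.Net.Sockets;

// Sketch: detects a graceful close by the peer (readable socket with zero bytes).
public static bool PeerHasDisconnected(Socket socket)
{
    bool readable = socket.Poll(0, SelectMode.SelectRead); // non-blocking check
    return readable && socket.Available == 0;
}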



Thanks.