There is nothing special built into .NET for streaming as such.
When a producer produces content, a subscriber consumes the content
without having to wait until all of the data has been transferred.
What .NET/C# gives you is the ability to transfer data in both directions.
I am not sure why you want to transfer video from one camera to another
camera. Cameras produce video; as far as I know, they don't consume it.
In your case:
a) Capture frames from the camera at the rate you want.
b) Open a Socket or TcpClient connection to a server.
c) Begin transferring the content.
d) Hopefully you can do the same from the other side.
e) You may also want to display the video that is received. I don't know
offhand how you would do that, but you could probably figure it out.
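The steps above can be sketched with a TcpClient. Note this is only a sketch: the frame-capture call is a hypothetical placeholder (frame grabbing is outside System.Net, e.g. DirectShow via interop), and the length-prefix framing is one simple convention you could use, not a fixed protocol.

```csharp
using System;
using System.Net.Sockets;

class FrameStreamer
{
    // Length-prefix each frame so the receiver knows where one frame
    // ends and the next begins (TCP is a byte stream, not a message stream).
    public static byte[] Frame(byte[] payload)
    {
        byte[] header = BitConverter.GetBytes(payload.Length); // 4-byte length
        byte[] packet = new byte[4 + payload.Length];
        Buffer.BlockCopy(header, 0, packet, 0, 4);
        Buffer.BlockCopy(payload, 0, packet, 4, payload.Length);
        return packet;
    }

    // Sketch of steps (b) and (c): connect to the server and push frames.
    // captureFrame is a stand-in for whatever capture API you choose (step a).
    public static void Stream(string host, int port, Func<byte[]> captureFrame)
    {
        using (TcpClient client = new TcpClient(host, port))
        using (NetworkStream stream = client.GetStream())
        {
            while (true)
            {
                byte[] packet = Frame(captureFrame());
                stream.Write(packet, 0, packet.Length);
            }
        }
    }
}
```

The receiving side (steps d and e) would read four bytes, decode the length, then read exactly that many bytes to recover one frame before handing it to whatever renders the video.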
To add to the points above, you can use the System.Net namespace to send and receive data from managed code. That said, I assume you want to stream this content over the internet? Currently, the System.Net namespace does not have much QoS (Quality of Service) support other than TOS marking in the IP header. We are strongly considering exposing QoS functionality in a future release. For more details on the unmanaged QoS APIs please see:
QWave Download, and the MSDN Blogs post on the System.Net namespace.
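To illustrate the TOS marking mentioned above: a socket can set the TypeOfService option on the IP level. This is a sketch, not a recommendation; the value 0xB8 (the TOS byte for DSCP EF, commonly used for latency-sensitive media) is just an example, and on newer versions of Windows the stack may ignore a per-socket TOS value unless the system is configured to allow it.

```csharp
using System.Net.Sockets;

class TosExample
{
    static void Main()
    {
        Socket s = new Socket(AddressFamily.InterNetwork,
                              SocketType.Stream, ProtocolType.Tcp);
        // Ask the stack to mark outgoing packets' IP TOS byte.
        // 0xB8 = DSCP EF (expedited forwarding), an example value only.
        s.SetSocketOption(SocketOptionLevel.IP,
                          SocketOptionName.TypeOfService, 0xB8);
        s.Close();
    }
}
```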