I'm building a streaming video server. I need to transfer the sequence of data packets of a video file to the client, but I don't know what timer interval to use between packets. If I transfer too fast, the client doesn't have enough time to decode and display the data. Does the interval depend on the bitrate or on other information in the video file? The video file is WMV, the streaming protocol is Windows Media HTTP Streaming, and the programming language is C#.

Information about the video file:

Audio: Windows Media Audio 48000Hz stereo 64Kbps [Raw Audio 0]

Video: Windows Media Video 9 320x240 24.00fps 230Kbps [Raw Video 1]

Is there a formula to calculate the time interval between packet transfers?

1 solution

I'm not familiar with streaming video, but instead of computing an interval from the bitrate, I'd first check whether the client can give you buffer notifications (e.g. "buffer 80% full", the way serial ports can). Pacing purely by a calculated play rate may lead to glitches in the video presentation, depending on the stability and load of the underlying network.
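
That said, if you want a starting point for the interval you asked about: the stream's total bitrate (64 kbps audio + 230 kbps video ≈ 294 kbps) tells you how much playback time each packet represents, so the interval is roughly packetBits / bitrate. A minimal sketch in C# follows; the 4 KB packet size and the readNextPacket/sendPacket delegates are assumptions standing in for your own file-reading and transport code, not part of any Windows Media API.

```csharp
using System;
using System.Threading;

// A rough sketch of bitrate-based pacing, usable as a starting point
// before any client buffer feedback is available. The packet size is an
// assumption; the bitrate is the sum of the audio and video rates from
// the file header (64 kbps + 230 kbps = 294 kbps).
class BitratePacer
{
    const int PacketSizeBytes = 4096;    // assumed fixed packet size
    const int TotalBitrateBps = 294000;  // 64 kbps audio + 230 kbps video

    // Each packet covers packetBits / bitrate seconds of playback:
    // 4096 * 8 / 294000 ≈ 0.111 s, i.e. roughly 111 ms per 4 KB packet.
    static readonly TimeSpan Interval =
        TimeSpan.FromSeconds(PacketSizeBytes * 8.0 / TotalBitrateBps);

    public static void Stream(Func<byte[]> readNextPacket, Action<byte[]> sendPacket)
    {
        DateTime next = DateTime.UtcNow;
        byte[] packet;
        while ((packet = readNextPacket()) != null)
        {
            sendPacket(packet);

            // Schedule against absolute time so per-iteration overhead
            // doesn't accumulate into drift over a long stream.
            next += Interval;
            TimeSpan delay = next - DateTime.UtcNow;
            if (delay > TimeSpan.Zero)
                Thread.Sleep(delay);
        }
    }
}
```

In practice you'd pre-fill the client's buffer faster than real time and then throttle down to roughly this rate, adjusting as buffer notifications arrive, rather than relying on the calculated interval alone.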
 