100GbE To Slash the Cost of Producing Live Television

New submitter danversj writes "I'm a Television Outside Broadcast Engineer who wants to use more IT and Computer Science-based approaches to make my job easier. Today, live-produced TV is still largely a circuit-switched system. But technologies such as 100 Gigabit Ethernet and Audio Video Bridging hold the promise of removing kilometres of cable and thousands of connectors from a typical broadcast TV installation. 100GbE is still horrendously expensive today — but broadcast TV gear has always been horrendously expensive. 100GbE only needs to come down in price just a bit — i.e. by following the same price curve as for 10GbE or 1GbE — before it becomes the cheaper way to distribute multiple uncompressed 1080p signals around a television facility. This paper was written for and presented at the SMPTE Australia conference in 2011. It was subsequently published in Content and Technology magazine in February 2012. C&T uses issuu.com to publish online so the paper has been re-published on my company's website to make it more technically accessible (not Flash-based)."
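
A rough back-of-the-envelope on the headline claim (my numbers, not the paper's): assuming 1080p60 at 10-bit 4:2:2 and ignoring Ethernet/AVB framing overhead, one uncompressed feed is about 2.5Gb/s of active video, so a single 100GbE link could in principle carry dozens of them.

# Uncompressed 1080p60 over 100GbE, back of the envelope (10-bit 4:2:2 assumed, no overhead).
width, height, fps = 1920, 1080, 60
bits_per_pixel = 20                      # 10-bit luma + 10-bit chroma per pixel (4:2:2)
feed_gbps = width * height * bits_per_pixel * fps / 1e9
print(f"one 1080p60 feed: {feed_gbps:.2f} Gb/s of active video")   # ~2.49 Gb/s
print(f"feeds per 100GbE link: {int(100 / feed_gbps)}")            # ~40, before overhead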

  • by zbobet2012 ( 1025836 ) on Monday September 10, 2012 @02:57AM (#41285483)
    100GbE is in huge demand from core infrastructure people because backbones everywhere are being strained by the explosion of online video. Tier 1 providers are asking for more of it than current foundries can come close to supplying, so no one has an incentive to slash prices.
  • by YesIAmAScript ( 886271 ) on Monday September 10, 2012 @04:13AM (#41285687)

    Great article. You don't see that all the time on Slashdot.

    I think many are getting confused here and think that this article is about reducing the cost of producing live TV on a shoestring. The figures in this article are very high, but for professional video production, existing figures are also very high.

    If you take into account that this could allow production trucks to shrink in size a bit (RG6 takes up a lot of space), the effective cost of the new approach could be even lower.

  • by SmallFurryCreature ( 593017 ) on Monday September 10, 2012 @04:52AM (#41285809) Journal

    Replacement tech rarely catches up. 1080p signal? Please, that is so last year. 4K is the new norm. No TVs for it yet? Actually, they are already on sale, which means that if you are not recording your repeatable content in 4K right now, you will have a hard time selling it again in the future. That is why some smart people recorded TV shows they hoped to sell again and again on film rather than videotape: film had "wasted" resolution in the days of VHS, but when DVD and then Blu-ray came out, those shows could simply be re-scanned from the original footage and voila, something new to flog to the punters.

    I don't know exactly how much data a 100GbE link can truly handle, but the fact is that trying to catch up to current tech means that by the time you are finished, you are obsolete. The 4K standard created by the Japanese (and gosh, doesn't that say a lot about the state of the West) isn't just about putting more pixels on a screen; it is about all the infrastructure needed to create such content. And you had better be ready for it now, because if you are not, you will be left behind by everyone else.
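
    For what it's worth, a rough sketch of the 4K arithmetic (my assumptions, not the parent's: UHD 3840x2160 at 60 fps, 10-bit 4:2:2, no transport overhead):

    # Rough UHD "4K" bandwidth: 3840x2160, 60 fps, 10-bit 4:2:2 (20 bits/pixel assumed).
    uhd_gbps = 3840 * 2160 * 20 * 60 / 1e9
    print(f"one UHD feed: {uhd_gbps:.1f} Gb/s")                 # ~10 Gb/s
    print(f"UHD feeds per 100GbE link: {int(100 / uhd_gbps)}")  # ~10, before overhead

    So even at 4K, one link still carries on the order of ten uncompressed feeds before any overhead is accounted for.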

    The future may not be now, but it sure needs to have been planned for yesterday.

  • by SimonTheSoundMan ( 1012395 ) on Monday September 10, 2012 @06:18AM (#41286061)

    This is about live TV. Live TV is different. The infrastructure relies on point-to-point circuit switching: one video signal is sent down one coax cable. Eight cameras means eight coax cables, so with a 1km run that's 8km of cable just for the live camera feeds to the OB truck. 100GbE means one cable. 8km of coax or fibre optic isn't cheap, and it usually requires a truck and a team of sparks just to transport and rig it all.
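
    (Rough numbers for that example, assuming plain HD-SDI at the nominal 1.485Gb/s per feed and ignoring packetisation overhead:)

    # Eight point-to-point camera feeds vs. one packet-switched trunk (assumed figures).
    cameras, run_km, hdsdi_gbps = 8, 1.0, 1.485
    print(f"coax to rig: {cameras * run_km:.0f} km, plus a connector pair per feed at each end")
    print(f"aggregate video: {cameras * hdsdi_gbps:.1f} Gb/s, "
          f"about {cameras * hdsdi_gbps:.0f}% of one 100GbE link")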

    Back to caferace's conversation. It is indeed a bottleneck for content that is not live. Digitising rushes to intermediate codecs takes time: tape is usually played back at normal or double speed, output via HD-SDI from the deck, and the workstation transcodes on the fly in real time. Tapeless workflows speed the process up because you can import faster than 1-2x, but it still takes time to transcode. However, this slowdown is not really a problem: the rushes have to be logged anyway, and that logging can be done manually while they are converting.

    Cinematic filming has the workflow sorted to some extent. High-end cameras shoot direct to an intermediate codec, a DIT works on set and logs as the footage is shot, and the sound and continuity departments can now log electronically to the same system too. The problem at the moment is that it is not one system; the many systems have to come down to one. I work in the sound department in film as an assistant. One of my responsibilities is keeping timecode correct on set: I have to go round each department three times a day* and "jam" each system (recorder, slate, camera, etc.) so they are correctly in sync when all the data is put together by the DIT. One day they will get unified.

    Logging while shooting cannot be done for news or reality TV because everything happens too quickly.

    * Three times a day, because Sony can't make a $100,000 camera with an internal clock that doesn't drift by +/- 2-3 frames a day.
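
    (To put that drift in perspective, a quick conversion assuming a 25fps project and the 2-3 frames/day figure above:)

    # How tight does a camera's free-running clock have to be? (25 fps assumed)
    fps, seconds_per_day = 25, 86400
    for frames in (2, 3):
        ppm = (frames / fps) / seconds_per_day * 1e6
        print(f"{frames} frames/day of drift = {ppm:.1f} ppm clock error")

    That works out to roughly a 1 ppm clock error, and it is still enough drift to need re-jamming several times a day.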

  • by Controlio ( 78666 ) on Monday September 10, 2012 @10:46AM (#41287595)

    HD-SDI uncompressed video is 1.5Gb/s. That is the standard for moving uncompressed video around inside a TV truck, whether 720p or 1080i. It rises to 3Gb/s if you're doing multiple phases of video (3D video, super slo-mo, etc.). Within that 1.5Gb/s there is still more than enough headroom to embed multiple datastreams and channels of audio (8 stereo pairs is the norm, some streams do up to 16). So I fail to see why 100Gb/s is necessary to transmit uncompressed video.
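
    (One answer, per the article's aggregation argument: a 100Gb/s pipe is sized for many feeds, not one. A rough count, ignoring Ethernet/AVB packetisation overhead:)

    # SDI-rate feeds per 100 Gb/s pipe, packetisation overhead ignored (assumed figures).
    link_gbps = 100
    for name, rate_gbps in (("HD-SDI (720p/1080i)", 1.485),
                            ("3G-SDI (1080p, 3D, slo-mo)", 2.97)):
        print(f"{name}: ~{int(link_gbps / rate_gbps)} feeds per link")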

    It's also a chicken-and-egg scenario. I'm a broadcast engineer and audio specialist. I had Ma Bell contact me about 7 years ago asking about how important uncompressed video transmission was, as they were trying to gauge a timeframe for a network rebuild to allow for uncompressed video transmission. My answer hasn't changed much in 7 years, because although moving uncompressed video from site to (in the case of Fox) Houston and then back to your local affiliate would be nice, it's completely unnecessary because by the time it reaches your house your local cable or satellite operator has compressed your 1.5Gb/s signal down to between 4Mb/s and 10Mb/s typically, making the quality gains negligible.

    It will solve one problem, which is image degradation due to multiple passes of compression. Think about it... the 1.5Gb/s leaves our TV truck and gets ASI compressed into 270Mb/s (best case scenario, satellite transmission is significantly lower bandwidth, and most networks don't use an entire 270M circuit, they use less). It then arrives at the network hub, where it gets decompressed. If it's live it then goes through several switchers and graphics boxes, then gets re-compressed to ASI and sent either to another hub or to your local affiliate. (If not live, it gets put into a server which re-compresses the video even harder before playout.) Your local affiliate then decompresses it, it passes through more switchers and graphics boxes, then it gets either broadcast using 8VSB, or it gets re-compressed and passed on to your cable or satellite provider, who then un-compresses it, processes it into MPEG or some other flavor, and re-compresses it into its final 3-12Mb/s data stream for your receiver to decompress one final time.
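
    (Laid out as data, that chain looks roughly like this; the bitrates are the ones quoted above, best case, and the ratios are just arithmetic:)

    # Compression generations between the truck and the viewer (figures from the comment above).
    hops = [
        ("truck output (HD-SDI)",         1500),   # Mb/s
        ("ASI contribution circuit",       270),
        ("re-compressed to the next hop",  270),
        ("final cable/satellite stream",    12),   # best case of the 3-12 Mb/s range
    ]
    source_mbps = hops[0][1]
    for name, mbps in hops:
        print(f"{name:32s} {mbps:5d} Mb/s   (~{source_mbps / mbps:.0f}:1 vs. the truck feed)")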

    This would eliminate several compression steps, and mean a better final image quality because you're not recompressing compression artifacts over and over and over again. A real 1.5Gb/s video frame looks like staring out a window compared to the nastiness you see when you hit pause on your DVR during a football game (also a best-case scenario, most cable/broadcast/sat providers ramp up the bitrate to the max for live sports and then set it back down shortly thereafter).

    But the 100Gb/s makes no sense to me. Are you (crazy) overcompensating for latency? Are you sending 100% redundant data for error correction? Why in the world would you need that much overhead? I can't imagine it's to send multiple video feeds; the telcos don't want you to do that, because then you order fewer circuits from them. Plus you'd want at least two circuits anyway in case your primary circuit goes down for some reason.

    (Side note: The one benefit to a TV truck using Ethernet as a transmission medium is the fact that these circuits are bi-directional. Transmission circuits nowadays are all unidirectional, meaning you need to order more circuits if you need a return video feed, meaning higher transmission costs. The ability to send return video or even confidence return signals back down the same line would be huge for us and a big money saver.)

  • by swillden ( 191260 ) <shawn-ds@willden.org> on Monday September 10, 2012 @11:24AM (#41288099) Journal

    Why do people in this industry need 6 simultaneous unbuffered streams?

    A typical broadcast studio has dozens, if not hundreds, of simultaneous streams: several editing suites running at once, a few people reviewing incoming feeds and selecting content from a variety of other sources, a couple of studios with 3-4 cameras each, plus the actual output streams for each of the channels being produced, with large master control panels mixing the inputs to make them.

    I spent a couple of years working for Philips Broadcast Television Systems (BTS), which makes equipment to run these systems. I worked on the router control systems, a bunch of embedded 68K boxes (this was almost 20 years ago) that control big video and audio switchers, many with hundreds of inputs and outputs (technical terms: "gazintas" and "gazoutas"). It's unbelievable how many video and audio streams even a small studio manages, and the wiring to support it all is massive, as in foot-thick bundles routed all over under the raised floor. It makes your typical data center cable management problem look like child's play.

    Besides just cabling costs, I could see packet-switched video enormously simplifying the engineering effort required to build and maintain these facilities. And it would also eliminate the need for lots of very expensive hardware like the switches BTS sold. Even with 100GbE, I'll bet large studios will still end up with cable bundles and link aggregation, but it would be vastly better than what can be done now.
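
    (A very rough sense of that scale, assuming a facility with a couple of hundred HD-SDI-rate streams in flight at once; the figures are purely illustrative:)

    import math

    # How many 100GbE trunks would a facility-wide packet-switched core need? (assumed figures)
    streams, hdsdi_gbps, link_gbps = 200, 1.485, 100
    headroom = 0.7                      # don't run links hot; leave room for bursts, audio and data
    links = math.ceil(streams * hdsdi_gbps / (link_gbps * headroom))
    print(f"aggregate: {streams * hdsdi_gbps:.0f} Gb/s -> {links} x 100GbE trunks")

    A handful of aggregated fibre trunks in place of foot-thick coax bundles is exactly the kind of simplification being described, even if the big facilities still end up bonding links.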
