AV SYSTEMS DESIGN, Summer 2011 – Feature Story (www.avsystemsdesign.com)
Every day, video over IP technology reaches farther
into our professional and personal lives. As networks
expand and streaming technologies mature, we find
ourselves with greater access to information and an
improved ability to communicate using video. Some streaming
applications like videoconferencing have been commonplace in
AV systems for many years. New applications continue to emerge
every day, as streaming technologies provide new capabilities
or greater economy to existing applications. As the variety of
streaming products increases, AV professionals will be challenged
with the task of selecting the ideal streaming products and
compression codecs for their applications.
The term “codec” has several meanings. It is the combination
of two words, encoder and decoder. The word “codec” is used
as a generic name for a hardware or software-based product,
such as a videoconferencing codec, which encodes and
decodes video simultaneously. It is also used to describe the
analog-digital-analog conversion process. As we explore the
Selecting a Streaming Codec
Factors to consider when choosing the right codec for your streaming application
By Karl Johnson
[Chart: Relative expectations, industrial vs. consumer – convenience; mission-critical availability; price sensitivity; network bandwidth abundance; network quality of service; network management and security; low communication delay; range of adjustments and controls; asset management (metadata, search tools); scalability; resolution and image quality; simplicity of user interface.]
variety of codecs that are available for streaming AV media and
consider the rapid advancement of technology in this area, many
questions emerge, such as:
• Why are there so many different codecs?
• What are the pros and cons of these codecs?
• Is there one codec that is ideal for every application?
• What differences in technical performance exist between
codecs, and in what application environments might they
be important?
• Standards-based and proprietary codecs exist. When do
I use one versus the other?
Consumer and Business Use

Experiences at home with personal electronics or computers can
shape expectations in industrial applications. It’s not uncommon
to hear customers express their frustration comparing a tried
and true office videoconferencing product with a similar
communication tool they use at home such as Skype. I’ve heard
individuals say, “Why is it that when I’m at home, I can Skype
with my friends and see and hear them so clearly, yet when I’m
in the office with all this expensive equipment, I get such poor
quality picture and sound?”
A lot of factors can contribute to this experience. For starters,
the individual may be using a dated, standard-definition
videoconferencing system in the office. The system may use a
low bandwidth communication link that delivers a fraction of
the resolution that is available on the large, flat panel display
it is used with. At home they may be operating Skype in a
small application window on a moderately sized desktop flat
panel. The quality experienced at home is perceived to be higher
because the video is presented at a higher pixel density in an
application window that matches the streaming resolution. The
Skype video would also appear grainy and pixelated if it was
presented on a large flat panel. Streaming quality is often a
matter of perspective, and gauging “good enough” for a specific
application is important. Regardless of the specific conditions,
customers can have experiences at home that raise expectations
for the office.
Furthermore, applications used on a personal computer
at home will not support the same mission-critical reliability
or provide the security that industrial equipment must.
Industrial and consumer solutions are faced with a different set
of requirements.
The impact and excitement created by the rapid uptake of
tablet PCs and video communication on mobile phones is likely to
widen this expectation gap, as users want to connect professional
AV systems with the convenience of mobile computing devices
and smartphones.
Codec Standards

Many codecs have been developed by international or industry
standards organizations. Some codecs have been developed by
independent organizations with intentions for use in open or
closed application environments. Of the codecs that have been
developed independently, some have made a transition from
proprietary to becoming a standard.
The most commonly known codecs have originated from
standards organizations. The Joint Photographic Experts
Group – JPEG developed the JPEG and JPEG2000 standards for still
image compression, and these have also been applied in motion video
applications. JPEG2000, the most recent standard, has been
adopted as the compression standard used in the digital cinema
industry for playback of motion pictures from hard disk players.
The Moving Picture Experts Group, or MPEG, of the
International Organization for Standardization – ISO developed
the first MPEG-1 codec. It continued with MPEG-2 to provide
increased image quality and support for high definition video.
MPEG-2 has been used in a wide range of products from DVDs,
cable and satellite distribution, as well as high quality broadcast
contribution applications.
The International Telecommunication Union - ITU
recommended the H.323 audiovisual communication protocols,
which included the H.261, H.263, and H.264 video codecs intended
for specific use in interactive videoconferencing applications used
on communication links such as ISDN, T1, E1, or the Internet.
The ITU and ISO came together to form the Joint Video Team
– JVT and developed the H.264, MPEG-4 Part 10, Advanced Video
Coding, or AVC codec. More commonly referred to as H.264, these
days, this codec has replaced MPEG-2 as the most commonly
used codec applied in new streaming media hardware encoders.

[Chart caption: Comparison of expectations for industrial and consumer communications applications – business applications require greater attention to a broader set of capabilities than consumer ones, particularly reliability and security.]

Codecs and the Developer Organization

Organization          Developer              Codecs
ISO                   JPEG                   JPEG, JPEG 2000
ISO                   MPEG                   MPEG-1, MPEG-2, MPEG-4
ITU                   H.323 Recommendation   H.261, H.262, H.264
ISO and ITU           JVT                    H.264, MPEG-4 Part 10/AVC
Microsoft             Microsoft              VC-1, SMPTE VC-1
BBC                   BBC Research           Dirac, Dirac Pro, SMPTE VC-2
Panasonic             Panasonic              AVC-Intra
Xiph.Org Foundation   Xiph.Org Foundation    Theora
On2 Technologies      On2 Technologies       VPx, VP6, VP8 (WebM)

NOTE: Google acquired On2 Technologies in 2010.

[Diagram: Live media collection → encoder (HD-SDI or DVI input) → transport → media exchange point → decoder → HD-SDI or DVI output → presentation and production systems.]
In the broadcast television industry, the Society of Motion
Picture and Television Engineers – SMPTE and the European
Broadcasting Union – EBU have a vested interest in defining
recommended platforms for broadcast video, interchange, and
interoperability for transmission and production systems. They
recommend and define standards for their industry, and their
decisions extend into other industries.
Both Microsoft and Panasonic have produced video codecs
independently, targeted at specific applications. Microsoft’s VC-1
was developed to support a more efficient and higher quality
compression of interlaced video, making it more attractive to
broadcast applications where use of interlaced video is common.
VC-1 started as a proprietary codec, but was later established as
a standard by SMPTE. Microsoft has also applied the VC-1 codec
technology into the WMV3, WMVA, and WVC1 codecs used in
Windows Media Player. The Panasonic AVC-Intra codec is fully
compliant with the H.264 MPEG-4/AVC standard and SMPTE
recommended practices. It is targeted for capturing production
quality 10-bit video collected in camcorders at bit rates typically
used in Electronic News Gathering applications. The format is
also used in storage and archive applications. Development of this
codec aligns with the performance needs of related equipment
that Panasonic manufactures: cameras, decks, and camcorders.
As you can see, both of these organizations developed a codec
to fulfill targeted applications. One, VC-1 started as proprietary,
while the other, AVC-Intra is branded as unique, but is based
on standards. As illustrated with these real world examples,
applications drive codec development.
Open and Closed Environments

Many streaming applications apply codecs in closed network
systems or on managed connections where control exists over the
hardware or software used at the endpoints. Hardware encoding
and decoding solutions will use an electronic signal interface such
as HD-SDI, DVI, or other formats to transfer the media. If software
decoding is used, installation of plug-ins can be managed in the
closed system. Here, the initial concerns for the AV integrator are
interfacing the correct signal format to the encoder input and the
decoder output, and establishing and maintaining the required
communications link or network connection. Beyond that, the
quality, bit rate requirements, and technical performance of the
codec are assessed individually for each project.
[Figure: Streaming solutions using an electronic signal for media exchange – the electronic image signal is delivered from a managed decoder.]

Open network environments, such as the Internet or even large
enterprises, will have very large numbers of potential endpoints.
Use of decoding hardware at every potential endpoint will not be
practical, and supporting software at each endpoint may not be
practical either. If we consider streaming to PCs or decoders that
are not managed, the media exchange for the streamed content
needs to be as a video bitstream. The endpoints must support
codecs, protocols, or applications that are compatible with the
video bitstream.
Codecs in the Computing Environment

Professional AV systems pull together and manage the presentation
of media from three different environments: 1) telecommunications –
videoconferencing systems; 2) video based on broadcast standards;
and 3) PC-originated media. The PC environment creates another
variable for the world of codecs.
variable for the world of codecs.
Standards in this environment are frequently established
based on market forces and the strength, weakness, or ubiquity
established by commercial organizations or coalitions. The most
dominant players establish de facto standards. Industry giants
such as Microsoft, Apple, Adobe, and Google each pursue different
product and technology paths to support their customers, and as
commercial organizations, they make investments in technology
for the purpose of establishing competitive barriers and achieving
financial objectives.
In the PC streaming environment, a broader set of variables
must often be considered beyond the codec, including the
container format, transport method, media player, Digital Rights
Management, operating systems, and computing devices.
Today, Adobe’s Flash Player delivers the vast majority of
online video. Installed as a plug-in in Web browsers for viewing
video on PCs, it uses a proprietary container format – Flash Video.
The ubiquity of Flash represents a competitive
advantage for Adobe products, but many developers
in the PC environment would prefer to use an open-source
technology rather than conform to the operational requirements
of using Flash.
[Figure: Video bitstream delivered in an open environment – streaming of video to unmanaged or unknown endpoints requires exchange of media as a video bitstream.]

What is a “de facto” standard?
A de facto standard is a product, specification, industry convention, or
method that has achieved dominant use in an industry based on market
forces. Users, suppliers, and developers all follow the de facto standard
in order to conform to the solution environment. De facto standards are
not established by standards bodies like ISO or ITU.

A great deal of attention has been paid to the very public
battle between Adobe and Apple concerning use of the Flash
Player on Apple’s iOS operating system used on iPhone and iPod
Touch. Adobe had been pursuing a programming environment
that crossed platforms and developed software that converted
Flash applications into native iPhone apps. Apple then changed
its developer agreement blocking Flash-derived apps. Adobe
chose to add support for Apple’s HTTP Live Streaming in Flash
Media Server and introduced this capability at the 2011 National
Association of Broadcasters Show, adapting to the restriction Apple
placed on its platform. Both Apple and Adobe pursued their
own interests in hopes of establishing a competitive advantage.
An industry battle such as this creates risks for customers
making investments in software platforms, applications, and
endpoint devices.
A great deal of news has also been generated over the past
year concerning use of codecs in HTML5 – the new Web
standard. HTML5 represents the first opportunity to
embed video into the Web experience and workflow as
an object just like text or graphics. Members of the World Wide
Web Consortium HTML Working Group could not agree on a
common codec for use in HTML5. The ideal situation would be
for the group to select a single codec providing interoperability
for all endpoint applications. But this is not what occurred. A
whole series of codec choices has taken place. Initially, Theora
with the Ogg container, developed by the Xiph.Org Foundation,
appeared to be the codec of choice; it was royalty-free. H.264 was favored
by Microsoft and it provided strong compression performance.
Members such as Mozilla and Google could not agree to use
H.264 due to licensing requirements, and in 2010 Google
acquired On2 Technologies, developer of the WebM format. This
format includes the VP8 codec, and Google elected to use it with
their Chrome browser. Mozilla also chose to use WebM in Firefox,
as did Opera. VP8 has also been gaining greater use by YouTube.
In 2010, Microsoft announced Internet Explorer 9 support
for H.264 and a new Windows Media Player HTML5 Extension
for Chrome, allowing Windows 7 Chrome users to view H.264
encoded content. This means operating system suppliers can offer
H.264 plug-ins for use in browsers that don’t support it. Likewise,
Google can offer plug-ins for WebM. Most recently, in May 2011,
Microsoft, who has been committed to using H.264, purchased
Skype, which uses the VP8 codec.
Nearly all digital video that is delivered to end users is
developed as an end-to-end solution. Most of the concerns
about codec standards for video distribution are focused on the
development of content and delivery through connections across
the Internet, cable, satellite, or physical media. Even if the video
content is encoded with a standards based codec, it will likely use
a proprietary container, transport protocol, media player, or Digital
Rights Management scheme. In order to view standards-based
content, the endpoint device must use a browser based plug-in or
a mini application embedded into an appliance or other media
accessory device. Here, proprietary influence has been exerted by
the PC industry members, creating a barrier to the interoperability
that a standards-based codec is envisioned to deliver.
The standards war played out by PC industry giants creates
interesting news; however, this pursuit of competitive advantage
prohibits use of a single codec, and simple interoperability for
Internet streaming. It also demonstrates how popular video
streaming will be on PCs and other devices, and how critical
competitive barriers are to their future profitability.
That “Standards” Topic, Again

Standards are established by industry, national, or international
working groups. They can also be established by unofficial
consortiums or organizations. Standards are essential to
interoperability in communications technologies, particularly
one-to-many or many-to-many applications. Examples of
standards include:
• IEEE 802.3 100BASE-T – Ethernet networking interface
• NTSC or ATSC - Video broadcast standards
• ITU H.323 – Videoconferencing standard
Products that employ these standards are truly
interoperable. You plug an NTSC signal into a display with
a BNC connector labeled NTSC, and you get a video picture.
What is a container format?
A container or wrapper format holds the encoded audio and video
media as well as certain information pertaining to the media, such
as the compression codec, number and types of streams, subtitles,
metadata, and sync information. A container is more important to media
playback files than to live streamed content, but media streamed in the PC
environment must factor in the container and media player applications
that are used to view them. Examples of media container formats include
FLV and F4V used with Adobe Flash Video, ASF and AVI from Microsoft,
QuickTime from Apple, the MPEG-2 transport stream, and MP4, which is
based on ISO’s MPEG-4 Part 12.
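The box structure a container imposes can be sketched in a few lines. The following illustrative Python snippet (not from the article) parses the top-level box headers of an ISO base media file – the structure underlying MP4 – where each box begins with a 4-byte big-endian size followed by a 4-byte type code:

```python
import io
import struct

def read_boxes(stream):
    """Yield (type, size) for top-level boxes of an ISO base media file:
    each box starts with a 4-byte big-endian size and a 4-byte ASCII type."""
    while True:
        header = stream.read(8)
        if len(header) < 8:
            return
        size, box_type = struct.unpack(">I4s", header)
        yield box_type.decode("ascii"), size
        stream.seek(size - 8, io.SEEK_CUR)   # skip the box payload

# A minimal, hand-built two-box file: an 'ftyp' box and an empty 'mdat' box.
ftyp = struct.pack(">I4s4sI4s", 20, b"ftyp", b"isom", 512, b"isom")
mdat = struct.pack(">I4s", 8, b"mdat")
boxes = list(read_boxes(io.BytesIO(ftyp + mdat)))
print(boxes)   # [('ftyp', 20), ('mdat', 8)]
```

A real demultiplexer walks the same structure recursively to locate the track tables and the compressed samples inside `mdat`.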
H.264 Profiles and Levels
The H.264 compression standard includes 17 profiles and 16 levels for
encoding video for different applications. Each profile uses different
techniques with different complexity, requiring different processing power
from decoding devices. Each level specifies the data rate and resolution.
Different profiles and levels can be used to target performance specific
to different classes of streaming applications. The H.264 compression
standard is used widely in many different streaming encoders; however,
the variety of profiles and levels available means it is important to select
devices applying a profile and level appropriate for the application, as well
as to confirm that encoding and decoding device performance is matched.
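The level limits mentioned above are essentially throughput budgets. As a hedged sketch (the two constants are the published Level 4.0 limits from the standard's level table; the helper name is this author's own), one can check whether a resolution and frame rate fit a level by counting 16×16 macroblocks:

```python
# H.264 Level 4.0 limits (per the standard's level table):
MAX_FRAME_SIZE_MBS = 8192      # max macroblocks per frame
MAX_MB_RATE = 245760           # max macroblocks per second

def fits_level_4(width, height, fps):
    # H.264 codes 16x16-pixel macroblocks; round dimensions up.
    mbs_per_frame = ((width + 15) // 16) * ((height + 15) // 16)
    return (mbs_per_frame <= MAX_FRAME_SIZE_MBS
            and mbs_per_frame * fps <= MAX_MB_RATE)

print(fits_level_4(1920, 1080, 30))   # 1080p30 fits Level 4.0 -> True
print(fits_level_4(1920, 1080, 60))   # 1080p60 needs a higher level -> False
```

This is why matching encoder and decoder capability matters: a decoder certified only to Level 4.0 has no obligation to handle a 1080p60 stream.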
The same applies to Ethernet communications or the H.323
videoconferencing standard.
Codec standards typically apply to the data format and
the decoding, leaving opportunities for different product
implementations to optimize or simplify the encoding for a
specific application. Standards are not established for quality;
they are established with a priority to facilitate interoperability.
However, “best solutions” are solicited from the research
community, and these make their way into the standard.
As discussed earlier, the AV industry uses broadcast video
standards every day, but there is also a vast array of connections,
resolutions, signal formats, digital rights management, and
encryption methods used by computer-video and consumer video
equipment, which are based on specifications or de facto standards.
As these are not “ratified” standards, the specifications for these
interfaces may not always be followed by manufacturers to a
degree that will deliver the interoperable performance a “ratified”
standard offers. Integration of this type of equipment represents
an ongoing challenge for AV manufacturers and integrators.
Standards-based codecs are more likely to exploit the cost
curve of popular technology. For example, increased use of
MPEG-2 or H.264 codecs will create more suppliers, increasing
competition, resulting in a greater variety of products. The
combined effort of many users of the standard will also advance
and improve the capability of the technology over time. However,
use of a standard codec does not guarantee interoperability
between products. The processing capability of the streaming
devices, transport protocols, or container formats used are often
proprietary. Compatibility between products may need to be
audited or proven in field tests.
Extron’s H.264 Streaming Technology – A Platform
Supporting the Broad Requirements of AV Systems

In 2011, Extron introduces the SME 100 Streaming Media Encoder,
its first streaming product based on the MPEG-4/AVC H.264
compression standard. The SME 100 interfaces commonly used DVI,
RGB, and standard definition and high definition video and audio,
providing advanced interfacing and signal processing features
common to Extron signal processing products. It features the high
quality scaling technology found in many Extron products and offers
a wide range of audio and video compression controls, providing the
ability to stream at a variety of resolutions and frame rates. The
combination of compression and bit rate controls and the flexibility
to choose from a variety of streaming protocols makes the product
flexible for use in a variety of network environments.

Selection of the H.264 standard as the codec for AV streaming
products supports an open technology environment, offering a high
degree of interoperability and compatibility for the SME 100 with
media servers, as well as the flexibility to decode live streams on
hardware or software platforms.

The SME 100 is used in corporate, education, government, and
other enterprise applications. It can be used to stream AV system
sources to PCs for monitoring purposes. Presentations or
videoconferences can be streamed to desktop PCs or overflow
rooms, extending the reach of AV systems. It can also be integrated
with streaming media servers or content delivery networks to
provide scalable distribution across an enterprise network or
the Internet.

Model      Version                          Part No.
SME 100    H.264 Streaming Media Encoder    60-1061-01

Interoperability is highly desirable in applications with open,
unmanaged endpoints, with one-to-many or many-to-many
delivery paradigms, or production workflows where the video
must be delivered as a digital file or bitstream in a common user
format. A natural conclusion may be to always select a standard.
However, arbitrarily selecting a commonly used codec because
it is “a standard” may or may not be the best choice for every
application due to technical or performance reasons.
Image Transforms and Codecs

Video codecs use one of two primary transforms: the Discrete
Cosine Transform – DCT and the Discrete Wavelet Transform –
DWT. These transforms convert the video data from a spatial
domain into a frequency domain where the image data may be
compressed more easily. Each transform has different strengths
and weaknesses. Use of DCT is partly traditional. Limited
computer processing power at the time of its creation made use of
the DCT desirable because it consumed minimal processing power.
The DWT is used in many applications but has seen broader use
with the advent of the JPEG2000 compression standard.
The DCT is used broadly in codecs employing temporal
compression, using data processed across a Group of Pictures –
GOP. This includes codecs used to stream video on the Internet
such as: H.264, Theora, and VP8. These codecs achieve bit rates
below 10 and even 5 Mbps streaming high definition video, and
below 1 Mbps for standard definition video or lower resolutions,
such as 176 × 144, QCIF – Quarter Common Intermediate Format.
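To see why such bit rates imply aggressive compression, a quick back-of-envelope calculation (Python, illustrative only) compares raw 1080p30 video against a 5 Mbps stream:

```python
# Raw vs. compressed bit rate for 1080p30 video, illustrating why codecs
# in this class target single-digit Mbps for HD streaming.
width, height, fps = 1920, 1080, 30
bits_per_pixel = 12          # 4:2:0 chroma subsampling, 8-bit samples
raw_bps = width * height * fps * bits_per_pixel
print(f"raw: {raw_bps / 1e6:.1f} Mbps")                       # ~746.5 Mbps

target_bps = 5e6             # a typical H.264 HD streaming rate
ratio = raw_bps / target_bps
print(f"compression ratio needed at 5 Mbps: {ratio:.0f}:1")   # ~149:1
```

Intra-frame coding alone cannot reach a ratio of roughly 150:1 at good quality; the temporal compression across a GOP described above supplies most of the gain.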
The DWT, on the other hand, provides very high efficiency in
still image compression, with more continuous or graceful image
degradation as compression ratios are increased on still frames.
Applied in JPEG2000, it has utility streaming high definition video
on networks that can support bit rates of 100 Mbps or higher,
maintaining very high image qualities and low encode-to-decode
latency below 100 ms.
Exceptions to this summary include: 1) four H.264 High profiles
using intra-compression, which can be applied to low delay
applications providing comparable qualities to JPEG2000, 2) the
Dirac codec, which includes a temporal compression applying a
Group of Pictures scheme, and 3) the PURE3 codec from Extron,
which uses the DWT, but offers a unique form of temporal
compression capable of achieving bit rates from 10 Mbps down
to 1 Mbps with high resolution computer-video inputs in
certain applications.
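For readers who want to see the DCT's energy compaction concretely, here is a small illustrative NumPy sketch (not from the article) that applies an orthonormal 8×8 DCT-II to a smooth luminance ramp and measures how much of the signal energy lands in the low-frequency corner:

```python
import numpy as np

def dct2(block):
    """2-D DCT-II with orthonormal scaling, built from the cosine basis."""
    n = block.shape[0]
    k = np.arange(n)
    # C[i, j] = cos(pi * (2j + 1) * i / (2n)), then orthonormal scaling
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0, :] *= 1 / np.sqrt(2)
    C *= np.sqrt(2 / n)
    return C @ block @ C.T

# A smooth 8x8 luminance ramp, typical of low-detail natural image content
block = np.add.outer(np.arange(8), np.arange(8)).astype(float) * 16
coeffs = dct2(block)

# Energy compaction: the low-frequency corner holds nearly all the energy,
# so most coefficients can be quantized to zero with little visible loss.
total = np.sum(coeffs ** 2)
low_freq = np.sum(coeffs[:2, :2] ** 2)
print(f"energy in 4 low-frequency coefficients: {low_freq / total:.4f}")
```

For this smooth block, over 99 percent of the energy sits in four of the 64 coefficients; quantizing the rest away is what makes the high compression ratios above possible.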
Proprietary Codecs

Proprietary codecs are used every day in industrial and consumer
applications. Situations that motivate development of a proprietary
codec include:
1. A developer or manufacturer has established user
requirements that cannot be fulfilled with available
standards. An investment is made to develop a codec
that fulfills a unique need. This is more likely to occur in
closed applications.
2. An organization maintains a degree of control over a
significant user group and produces a unique codec,
container format, or viewer application from which it can
manage the quality delivered to endpoints. This situation is
more likely to exist in broad use applications such as PCs.
3. A developer or manufacturer creates a codec to provide
a competitive advantage and barrier to entry for
alternative products.
4. A company creates a codec to avoid licensing fees or legal
risk associated with patents.
Enterprise applications with defined, manageable endpoints
are strong candidates to consider proprietary codecs. The delivery
interface, endpoints, economics, and expansion of the system can
be planned. Specific performance or quality requirements may be
based on bit rate targets or error resilience operating on certain
types of networks. They may be based on support for a specific
input or output signal format, resolution, or picture quality. They
may also be based on very low encoding and decoding delay,
below 100ms for instance.
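Such a delay target is usually reasoned about as a budget across the whole chain. The figures below are assumptions chosen purely for illustration, not measurements from any product:

```python
# Illustrative end-to-end latency budget for a low-delay streaming link.
# All figures are assumed values for the sketch, not measured numbers.
budget_ms = {
    "capture/scaling": 17,          # roughly one frame period at 60 Hz
    "encode": 35,
    "network + jitter buffer": 20,
    "decode + display": 25,
}
total = sum(budget_ms.values())
print(f"total: {total} ms, meets sub-100 ms target: {total < 100}")
```

The point of the exercise is that every stage consumes part of the budget; a codec whose encoder alone buffers several frames cannot meet a sub-100 ms target no matter how fast the network is.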
When applied to enterprise applications, dedicated hardware
designed alongside a proprietary video codec can provide
superior performance relative to standards-based video codecs
that often rely on a blend of intellectual property or technology
from different suppliers. Proprietary encoding algorithms and
dedicated hardware can often be optimized and tailored to special
requirements found in unique applications.

DCT and DWT Comparison

                                                  DCT        DWT
Spatial compression efficiency                    Moderate   High
Degradation with increased spatial compression    Discrete   Continuous
Applied with temporal compression using a GOP     Common     Rare
Applied in low-delay applications                 Common     Common

Transform Use in Different Codecs

DCT: JPEG, MPEG-1, MPEG-2, H.264, VP6, VP8, VC-1
DWT: JPEG2000, Dirac, Dirac Pro, PURE3

Standards-based hardware and software applications may be fixed or preconfigured
for delivery of specific video content such as entertainment or
video teleconferencing at bandwidths targeted below 5 Mbps.
These low bandwidth solutions may not be suitable for the image
quality or low latency requirements of enterprise level applications
such as remote video contribution links, telemedicine, command
and control, surveillance, or simulation environments.
Proprietary codecs can also provide greater security when the
technology platform is not easily accessible to the public at large.
The foundation for security in any communication application
starts with physical security, communications architecture and
network policies for authentication, and encryption. Use of a
proprietary codec provides opportunity for an additional layer
of protection, since the codec will not be easily accessible to
unintended users.
The PURE3 Codec – Fulfilling the Most Challenging
Streaming Requirements

The Extron PURE3 codec has been developed for customers
operating on private networks with real-time, mission-critical
applications. It has been developed to support the most
demanding image quality requirements streaming computer
or video inputs, including maintaining native source resolution
and frame rate for virtually any computer or video input up
to 1920×1200 or 1080p resolutions. Visually lossless image
quality and 4:4:4 color quantization are retained, ensuring that
video delivered to production systems or large projection
displays will not exhibit compression artifacts. The codec
supports use in real-time, interactive, collaborative applications
where dispersed participants can be confident they are acting
on identical visual information and bidirectional communication
will not be hampered by delay of any significance.

The PURE3 codec provides highly efficient compression
and is intended for use on commonly used LAN and WAN
infrastructures. An error concealment system in the PURE3
codec makes it highly immune to network errors, preserving
high image quality even under conditions of heavy packet loss,
without requiring the additional delay or bandwidth used by
error correction technology.

The PURE3 codec was developed because the performance
available from MPEG-2 or JPEG2000 technology was unable
to support this combination of demanding requirements.
JPEG2000 was capable of supporting the quality, but at high
bit rates; MPEG-2 supported more efficient bit rates but would
not maintain the quality with low encoding delays. The PURE3
codec has been implemented in Extron’s VN-Matrix 225
codecs, which support RGB or DVI signals and digital audio,
and the VN-Matrix 300 codec, which supports SDI, HD-SDI,
and 3G-SDI video formats and embedded audio. VN-Matrix
products employing the PURE3 codec are in use by customers
in the following quality-critical applications:
• Video contribution and collaboration – for broadcast,
post-production, scientific, military, product design, and
oil/gas exploration
• Control rooms – for broadcast, surveillance, and
command and control applications
• Training, education, and documentation – for visualization
and simulation environments

Model          Version                                  Part No.
VNC 225 DVI    Codec for DVI-I, Audio/Keyboard/Mouse    60-1118-02
VNE 225 DVI    Encoder for DVI-I and Digital Audio      60-1119-02
VND 225 DVI    Decoder for DVI-I and Digital Audio      60-1120-02
A real-world example of vulnerability using an open standard
was experienced in 2008 when Iraqi insurgents used SkyGrabber,
an openly available $26 satellite snooping software program,
to intercept and monitor video from US Predator drones. A
proprietary codec would have made it considerably more difficult
for the insurgents to extract this information.
Network and User Environments

It’s valuable to consider the network environment in which
your streaming product will be deployed. Let’s examine three
network environments:
• Private – Private networks can be designed and
configured to support streaming traffic requirements.
Their performance can be measured and managed.
Where the network infrastructure is shared among data,
voice, and video traffic, greater attention must be paid to
bandwidth use.
• Public – Streaming delivered across the Internet presents
more challenges, as it is a publicly shared network. Limits
to Quality of Service (QoS) and available bandwidth at
endpoints restrict the types of streaming applications that
can be served. Security is also a concern.
• Virtual Private Networks or VPNs – VPNs provide
encryption and greater security for connections crossing
the Internet.
Private, managed networks can be designed to support codecs
requiring bandwidths of 50 Mbps or higher. Public networks or
shared-use private networks typically have less bandwidth
available, making streaming at bit rates of 5 Mbps or lower
desirable. Fewer guarantees for QoS on the Internet
may require error correction systems or transport and streaming
protocols designed to deliver reliable performance. For more
information concerning transport and streaming protocols, read
“Different Methods for Streaming Media” in the Spring 2011 issue
of AV Systems Design.
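The bandwidth guidance above can be turned into a quick feasibility check. The thresholds (50 Mbps for private managed networks, 5 Mbps for public or shared links) follow the figures cited here; the function name and the treatment of those figures as simple ceilings are illustrative assumptions, not part of any product API.

```python
def network_supports_stream(network_type, stream_bitrate_mbps):
    """Rough feasibility check for a streaming bit rate, using the
    rule-of-thumb figures cited in the article: private managed
    networks can be engineered for 50 Mbps or more, while public
    or shared-use networks favor 5 Mbps or less. The ceilings are
    illustrative; real networks must be measured."""
    ceilings = {
        "private_managed": 50.0,   # can be designed for high-bit-rate codecs
        "public_or_shared": 5.0,   # Internet or shared-use private links
    }
    return stream_bitrate_mbps <= ceilings[network_type]

# A 20 Mbps visually lossless stream fits a managed private network...
print(network_supports_stream("private_managed", 20))    # True
# ...but not a typical public Internet connection.
print(network_supports_stream("public_or_shared", 20))   # False
```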
Deciding on a Codec - Standards-Based or Proprietary?
The discussion thus far has presented a broad set of topics,
including a variety of codecs, both proprietary and
standards-based, targeted at different applications; open and
closed systems; different network environments; and the
performance delivered by codecs using different transforms. Some streaming
applications require delivery of electronic signals to presentation
or production systems, and others require direct delivery of a
video bitstream to a PC or hardware decoders in open systems.
Finally, the PC world is currently subject to a dynamic playing
field influenced by powerful organizations. Where codecs are
concerned, it’s clear that:
• There are many codecs in use, some of which have
targeted different applications and requirements
• Certain applications requiring interoperability for
communications or media exchange use codecs that
have been ratified or endorsed as standards by official
standards bodies
• Audiovisual applications must provide solutions for media
sources that often do not offer a standard resolution,
interface, or interoperability
• Use of a standard offers the opportunity to support many
endpoints through interoperability and transfer of digital
media in live streaming or production workflows
• Proprietary solutions can offer unique performance and
potentially increased security
• Plug-ins and conversion tools exist for many codecs and
proprietary container formats, providing compatibility with
different decoding devices
Decision criteria for selecting a codec are presented on the
facing page. They consider: 1) the user and system environment,
2) exchange format, 3) delivery paradigm, 4) endpoints, and 5)
performance. These criteria are not absolute, but they provide
a guidepost for selecting a codec in a manner that avoids
over-simplified thinking, marketing hype, and the politics of
industrial markets.
A final word of wisdom - stay informed. The Joint Collaborative
Team on Video Coding (JCT-VC) of ITU-T and ISO/IEC is currently
developing High Efficiency Video Coding (HEVC) as a successor to
H.264. The goal is to cut the bit rate of H.264 in half.
Expect continued editorial discussion, white papers, and
promotion of different codecs in the future. Rapid growth and
advancements in both consumer and industrial streaming video
applications will continue to bring change and the emergence
of new codecs.
Karl Johnson is Director of Product Marketing at Extron Electronics for streaming
technologies and videowall processing systems. He worked at Electrosonic for over 20 years,
most recently as General Manager of the Electrosonic Product Division.
How many could many-to-many be? It could be thousands to millions of endpoints.
How many could few-to-few be? It could be hundreds of endpoints.
[Decision chart: five questions - Endpoint device? Exchange format?
Delivery paradigm? Endpoint positions? Performance? - position an
application between open, unmanaged endpoints and closed, manageable
endpoints. Open applications (personal computers or consumer devices;
a video bitstream with software decoding; one-to-many or many-to-many
delivery; de facto or real standards) point to column A,
standards-based. Closed applications (hardware decoding or managed
PCs; an electronic signal or software decoding; one-to-one or
few-to-few delivery; defined endpoints with unique performance) point
to column C, proprietary. Applications in between fall into column B.]
A - Use a Standards-Based Codec
For applications with:
1. An open network environment delivering media to PCs or consumer devices
2. A one-to-many or many-to-many delivery paradigm
A standards-based codec should be selected that originates from a standards body such as ISO. However, certain applications may target endpoints that are restricted to a proprietary codec or container and media players. Best practice is not to let the endpoint device drive the technology selection. A solution using a standards-based codec offers a future with greater interoperability with new media systems and endpoint devices.
B - Evaluate Standards-Based Codecs and Proprietary Codecs
If the application:
1. Is a closed system, or the opportunity exists to manage the endpoints
2. Is applied to a private or managed connection
3. Uses an electronic signal interface such as HD-SDI or DVI, or decoder plug-ins can be used on decoding devices
4. Has a one-to-one or few-to-few delivery paradigm
If endpoints are not managed or part of a closed system and interoperability is required, then a standards-based codec is recommended. Compatibility between encoders and decoders must still be confirmed based on the class of application served and the use of common protocols.
C - Use a Proprietary Codec
A proprietary codec should be considered for applications that:
1. Are closed systems
2. Reside on a private network or offer the ability to manage endpoints
3. Use an electronic signal interface such as HD-SDI or DVI, or decoder plug-ins can be used on decoding devices
4. Have a one-to-one or few-to-few delivery paradigm
5. Require unique streaming performance that is not available from standards-based codecs
When security is an important consideration, a proprietary codec may contribute additional protection for an application.
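The three-column rubric above can be sketched as a simple rule. The parameter names are hypothetical labels for the conditions listed in the columns, and the sketch is a reading aid for the chart, not a substitute for evaluating specific products.

```python
def recommend_codec(closed_or_managed_endpoints,
                    private_or_managed_network,
                    few_endpoints,
                    needs_unique_performance=False,
                    needs_interoperability=False):
    """Encode the A/B/C decision rubric from the chart.
    Returns 'standards-based', 'evaluate both', or 'proprietary'.
    Parameter names are illustrative, not from any standard API."""
    # Column A: open environments or many endpoints favor standards.
    open_application = not closed_or_managed_endpoints and not few_endpoints
    if open_application or needs_interoperability:
        return "standards-based"
    # Column C: closed, private, few endpoints, and a performance
    # need that standards-based codecs cannot meet.
    if (closed_or_managed_endpoints and private_or_managed_network
            and few_endpoints and needs_unique_performance):
        return "proprietary"
    # Column B: mixed cases warrant evaluating both.
    return "evaluate both"

# An open one-to-many webcast to consumer PCs:
print(recommend_codec(False, False, False))      # standards-based
# A closed one-to-one link on a private network that needs
# performance unavailable from standards-based codecs:
print(recommend_codec(True, True, True, True))   # proprietary
```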