Volicon At IBC – Stand 7.G23

At IBC2014, Volicon will demonstrate five powerful new applications within the company’s Observer® Media Intelligence Platform™ for the first time in Europe. The Media Intelligence Platform is an enterprise-wide solution that records a station’s media from ingest to playout, as well as multiple on-air broadcasts. In addition to enabling multiple users to stream, analyze, and review content from anywhere at any time, the platform supports a range of applications including compliance, quality assurance, competitive analysis, production, and repurposing for multiple platforms and social media outlets. With these tools, Media Intelligence Platform users are equipped to capitalize on new opportunities to create compelling content, raise viewer ratings, and generate new ad revenue.

STAND 7.G23, Hall 7

Capture
Today’s broadcaster must capture media from a variety of sources to produce compelling content for viewers, whether delivered via on-air broadcast or a digital platform such as Web, mobile, streaming, and OTT services. Serving as a cost-effective alternative to expensive and cumbersome capture stations, Volicon’s new Capture application allows broadcasters to capture media from any source at any time, ingesting media according to a schedule, on demand in real time, or in continuous 24/7 recording. The application supports a fast, simple edit workflow by enabling the user to review content as it is captured, immediately clip high-value content, and push it directly to editing and MAM systems without time-consuming transcoding steps. Because a low-resolution proxy is created along with the True HD (1080i 25/30, 720p 50/60) recording, both local and remote Observer® Media Intelligence Platform™ users — remote staff, partners, consultants, outside talent, and reporters in the field — can quickly and easily collaborate to access, review, and clip content to create valuable footage for distribution.

Share
Broadcasters today need an agile way to publish exciting and compelling content to a multitude of digital platforms including the Web and social media outlets. This typically is a cumbersome and expensive process, but Volicon’s new Share application allows the broadcaster to repurpose existing content quickly and efficiently and subsequently push it to digital platforms and social media sites. One-button publishing profiles facilitate rapid processing of content to meet appropriate container and encoding requirements for an array of platforms, including target websites, YouTube, Twitter, and Facebook. Share also makes use of existing closed captioning text to publish content that is compliant with government regulations.

Review
The new Volicon Review application provides broadcasters, networks, and playout service providers with a fast and intuitive solution for reviewing on-air content, validating ad placement, and performing competitive analysis. This application facilitates rapid access to broadcast content for users working centrally and across geographically distributed sites, thus giving all key stakeholders the ability to keep an eye on their own broadcasts, as well as those of their competitors, and associated ratings data within a single GUI. Making high-resolution live and historical broadcast content available locally and lower-resolution proxy versions available on any device, the application gives users the ability to review and analyze their broadcasts at any time, from anywhere. The application interfaces with the playout automation system to provide as-run log data for comparison with the frame-accurate recording of the broadcast output, thus making it easy for users to show advertisers what they’re getting for their money.

Comply
Volicon’s new Comply application enables users to record, clip, and export their broadcasts to meet regulatory and licensing requirements. Addressing a complete array of regulations, ranging from decency to closed captioning to loudness, this scalable and highly reliable application allows users to respond quickly and unambiguously to compliance requests. Leveraging Volicon’s proven compliance monitoring technology, Comply lays critical A/V metadata over frame-accurate video to create a clear visual affidavit of compliance.

Monitor
Built on Volicon’s acclaimed digital video monitoring technology, the new Monitor application allows users to monitor video quality, respond to fault reports, and use a full recording of the on-air broadcast for instant review of errors and their impact. While continuously analyzing logged content for a variety of faults such as black or static screen, loss of video or closed captions, and incorrect audio levels, this application provides flexible, individually configurable alert thresholds, with notifications delivered via email or SNMP traps. Quality measurement thresholds may be configured per channel to optimize performance and error reporting. To further simplify network monitoring and troubleshooting, the application provides an integrated multiviewer feature that enables Observer® Media Intelligence Platform™ users to use their standard displays as multiviewers or record the output of a traditional multiviewer. With multiple streams presented on a network wall, users can respond immediately to any issues, instantly grabbing the suspect stream via their desktop interfaces to begin resolving the problem.


TrueVisions Extends Volicon Observer® Monitoring and Logging System

TrueVisions, Thailand’s leading cable and satellite television operator, has expanded its Observer® digital video monitoring and logging system to simplify compliance verification across a total of 120 channels in its direct-to-home (DTH) platform. Local distributor and system integrator Trinergy provided a new Observer TS® (transport stream) monitoring and logging system that allows TrueVisions to perform continual monitoring and recording of both baseband and compressed signals from within a unified user interface.

“The original installation of Volicon technology allowed us to capture consistent high-quality recorded video that is incredibly easy to access and use, and it also helped us to realize cost savings associated with the shift away from tape-based compliance monitoring,” said Vichai Sernvongsat, chief technology officer at TrueVisions. “Now, with the addition of Observer TS, which logs the full MPEG transport stream, our staff can examine or export content from a recently captured transport stream, or go back further in time and view a low bit rate proxy version of aired content.”

The complete Observer system continuously monitors and records aired content across the TrueVisions lineup, providing real-time fault detection, as well as effortless clip identification and extraction for easy proof of compliance. Installed at TrueVisions’ Bangkok facilities, the Volicon system provides a simple and efficient means of verifying that advertising and program content have been aired properly and at the right time.

The Observer system captures, stores, and streams aired content, giving authorized users at TrueVisions instant access to live and recorded content from an easy-to-use Web-based GUI. Using this interface, desktop users can search, retrieve, analyze, and export video clips with metadata. Volicon’s As-Run-Log Integration module allows users to search and sort the as-run log via ID or commercial/program name for quick and easy ad verification with a direct link to video content. The Observer system’s quality of experience module provides real-time alarms for faulty video, audio, and closed captioning by issuing alerts via email/SNMP with a direct link to content and a master fault log.

“Compliance verification is a critical part of any broadcast business, and the Observer offers a flexible, cost-effective, and intuitive tool for meeting this requirement,” said Russell Wise, vice president global sales at Volicon. “Because it also is a modular system, it provides TrueVisions with a scalable foundation for monitoring additional channels or for bringing additional functions into its monitoring operations.”

Today’s Remote Monitoring Technology Is Primed for OTT and Streaming Services

by Gary Learner, Chief Technology Officer, Volicon
For VideoEdge Magazine

Today’s broadcasters face continued consolidation, centralization of staff, and ever-increasing pressure to improve efficiency — challenges that can take their toll on quality of experience (QoE) for customers. Fortunately, there are affordable, reliable, flexible solutions for remote monitoring that can help overcome these challenges. These solutions perform all-important proactive quality checks at audio/video service handoffs to ensure maximum QoE. The latest generation of compact, low-cost remote-monitoring solutions has an expanded range of functions such as QoE-based content monitoring with recording, remote viewing, and troubleshooting of A/V feeds across linear, on-demand, and interactive services. This functionality is especially important given the addition of over-the-top (OTT) and Internet streaming content to broadcasters’ already long list of services requiring monitoring.

A Complicated Monitoring Landscape

Just as content distribution has evolved in recent history — from over the air to digital and now to OTT and Internet streaming services — so too has the need for and complexity of content monitoring. In the past, aired content was recorded and reviewed after the fact, a tedious and time-consuming process in which problems often went undetected for hours or even days. It was never a practical approach, to be sure, but at least it could be done. Today, given the scope and complexity of services requiring monitoring, that manual method would be impossible even for the most well-staffed, well-funded operations.

The introduction of OTT and streaming services boosts the number of portals through which content can be consumed. Add to this the plethora of viewing devices — PCs, tablets, and smartphones — and associated “flavors” of content they require, and the challenge of assuring the best possible experience (within the constraints of all components) can be enormous.

OTT and Streaming Services Present Special Challenges

With media being delivered directly to viewers, who might be scattered across the country or even across different countries, there is no longer a “middle man” to share responsibility for QoE. Thus every operation, regardless of size, must be able to monitor the availability and quality of services across platforms, CDNs, and video service providers — all from a single location — in order to offer a high standard of quality. Centralized monitoring solutions of this kind are especially important for operations that lack dedicated monitoring staff and budgets.

Sophisticated Tools for Proactive Remote Monitoring

Fortunately the industry has already addressed remote monitoring of linear, on-demand, and interactive services, eliminating the need for expensive, time-consuming manual and visual channel inspections. With these tools, broadcasters can proactively identify and respond to faults rather than waiting for customer complaints.

Advanced monitoring solutions today can scan hundreds of channels around the clock and automatically test signal integrity, issue alerts (via email and SNMP), and capture the problematic content when channels do not conform to preset limits. Positioned “behind” the set-top box (STB), such solutions give operators a single system and location from which to access and monitor the video output continuously. Remote monitoring capabilities enable engineers to review video and audio for issues such as static or black screen, as well as errors in closed captions and audio levels.
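The detection loop described above can be sketched roughly as follows. This is a minimal illustration, not any product’s actual API: the channel names, threshold fields, and measurement dictionary are assumptions made for the example, and printing stands in for the email/SNMP transports.

```python
# Hypothetical sketch of per-channel fault detection with configurable
# thresholds. Channel names, threshold values, and measurement fields
# are illustrative assumptions, not a real monitoring product's API.
from dataclasses import dataclass

@dataclass
class Thresholds:
    max_black_secs: float      # tolerated continuous black screen
    min_audio_dbfs: float      # below this, audio counts as silent
    captions_required: bool

# Per-channel configuration: limits can differ from channel to channel.
THRESHOLDS = {
    "news-hd":  Thresholds(max_black_secs=2.0, min_audio_dbfs=-50.0, captions_required=True),
    "movies-1": Thresholds(max_black_secs=5.0, min_audio_dbfs=-60.0, captions_required=False),
}

def check_channel(name, measurement):
    """Compare one measurement snapshot against the channel's limits.

    `measurement` is a dict such as:
    {"black_secs": 0.0, "audio_dbfs": -23.0, "captions_present": True}
    """
    t = THRESHOLDS[name]
    faults = []
    if measurement["black_secs"] > t.max_black_secs:
        faults.append("black screen exceeded limit")
    if measurement["audio_dbfs"] < t.min_audio_dbfs:
        faults.append("audio level below limit")
    if t.captions_required and not measurement["captions_present"]:
        faults.append("closed captions missing")
    return faults

def dispatch_alerts(name, faults):
    # In a real deployment these would go out via SMTP and an SNMP trap;
    # printing stands in for both transports in this sketch.
    for fault in faults:
        print(f"ALERT [{name}]: {fault}")
```

A fault list per channel, rather than a single pass/fail flag, mirrors the way such systems attach each alert to the specific measurement that tripped it.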

For on-demand content, today’s sophisticated monitoring systems can verify content availability and system capacity, record DPI ad insertions to prove ad conformance, or monitor interactive STB guides to confirm the customer experience. With complete STB command access, broadcasters can perform troubleshooting more effectively and use a historical review of content and services to address intermittent yet chronic issues. Refined for intuitive use, such systems often combine familiar VCR-like controls with color cues that clearly indicate the channels being displayed, whether they are live or recorded. Layouts and source selection are managed with simple mouse clicks, and a built-in clock facilitates navigation to the desired time stamp.

Remote monitoring technology is already deployed in applications ranging from competitive news analysis to monitoring of “out-of-footprint” broadcast and distribution channels. Now broadcasters are applying the technology to OTT services.

A New Challenge: Remote Monitoring for Streaming and OTT Services

Long gone are the days of monitoring the quality of a single, linear broadcast. Now broadcasters not only must assure video quality for multiple content streams in multiple formats to multiple devices, but, through OTT services, they must also deliver a personalized user experience alongside that video content. It’s a scenario that effectively multiplies their outputs. On top of that, they’re working with a variety of distribution platforms and might need to deliver content via a number of CDNs. The situation is further complicated by the fact that the groups of files delivered to each CDN will need to accommodate a wide range of devices, each with its own profile. It’s easy to see why it’s a significant QoE challenge. There is no plausible way to monitor all of these outputs and versions all the time, but today’s monitoring solutions make it possible for broadcasters to institute OTT service monitoring strategies that work.

Monitoring Content at Every Key Point

The most viable monitoring strategy makes assessments at key points in the delivery workflow: ingest, encoding, packaging, delivery, and distribution to the viewer (albeit in a controlled environment free of the vagaries of ISP service). It’s a mostly passive monitoring process that can give broadcasters reasonable confidence that the content they are delivering is packaged correctly in the formats compatible with target devices.

Monitoring ingested files is critical because each one frequently serves as the high-bit-rate reference for all future transcodes of that content. Monitoring at ingest is straightforward, as it typically requires continuous monitoring of just one feed.

Monitoring becomes more demanding in the encoding stage because of the number of files that result from this stage. Working with as many as a dozen files, broadcasters must shift to passive monitoring methods and begin examining data about the file rather than the video itself — a far more cost-effective approach when dealing with large numbers of files. By looking at bit rates, syntax, reference timestamps, and alignment of each of the files, the broadcaster can make a sound determination of the files’ integrity.
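A data-level check of this kind can be sketched briefly, assuming each rendition’s metadata has already been extracted into a dictionary; the field names and drift tolerance below are assumptions for illustration only.

```python
# Illustrative passive check of an encoding ladder: rather than decoding
# video, inspect per-rendition metadata for bit rate, reference
# timestamp, and duration alignment. Field names and the 40 ms
# tolerance are assumptions made for this sketch.
def check_ladder(renditions, drift_tolerance=0.040):
    """`renditions` is a list of dicts, one per encoded file, e.g.
    {"bitrate_kbps": 3500, "first_pts": 0.0, "duration": 600.0}.
    Returns a list of problems found (empty means the ladder looks sound).
    """
    problems = []
    # Bit rates should descend down the ladder, one tier per file.
    rates = [r["bitrate_kbps"] for r in renditions]
    if rates != sorted(rates, reverse=True):
        problems.append("bitrate ladder not in descending order")
    # All renditions should share the same reference timestamp and run
    # for the same duration, or segment boundaries will not align when
    # a player switches tiers.
    base = renditions[0]
    for i, r in enumerate(renditions[1:], start=1):
        if abs(r["first_pts"] - base["first_pts"]) > drift_tolerance:
            problems.append(f"rendition {i} start timestamp misaligned")
        if abs(r["duration"] - base["duration"]) > drift_tolerance:
            problems.append(f"rendition {i} duration differs")
    return problems
```

Comparing metadata rather than pixels is what keeps this approach cheap enough to run across every file the encoder emits.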

Once files have been packaged in the appropriate formats for the target CDNs and platforms, they are delivered along with a manifest describing those files. The simplest way for the broadcaster to confirm that the packaging is being performed properly and that delivery is correct and on schedule is to “accept” files in the same way that a CDN would. After that point, the broadcaster no longer has control over the content and must hope that the CDNs — and, subsequently, ISPs — will carry it to the viewer without introducing faults or compromising quality.
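“Accepting” a package the way a CDN would can be sketched as a short script; the flat manifest format and file names here are simplifying assumptions for illustration, not any real packager’s output.

```python
# Hypothetical sketch of accepting a delivered package as a CDN would:
# read the manifest, then confirm that every segment it references is
# present and non-empty. The manifest format (one segment file name per
# line, "#" lines ignored) is a simplified stand-in.
import os

def accept_package(manifest_path):
    """Return a list of missing or empty segments named by the manifest."""
    base = os.path.dirname(manifest_path)
    problems = []
    with open(manifest_path) as f:
        for line in f:
            name = line.strip()
            if not name or name.startswith("#"):
                continue  # skip blank lines and comment/tag lines
            seg = os.path.join(base, name)
            if not os.path.exists(seg):
                problems.append(f"missing: {name}")
            elif os.path.getsize(seg) == 0:
                problems.append(f"empty: {name}")
    return problems
```

Running such a check on a schedule confirms both that packaging completed and that delivery is landing on time, which is precisely the hand-off point after which the broadcaster loses control.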

Streaming and OTT Services Require Active Monitoring

When it comes to monitoring the quality of streaming content, passive monitoring won’t work. Instead, broadcasters must apply sampled active monitoring on the tail end of the chain. This approach acknowledges not only the complexity of multiplatform media delivery, but also the exponential leap in the volume of media being delivered.

Actively capturing a sampling of outputs can give broadcasters an accurate idea of the quality that most of their OTT viewing audience is experiencing. Thus, for a relatively modest investment of time and money, particularly as compared with the cost of monitoring all outputs all the time, broadcasters can assure that most of their customers are enjoying quality service most of the time.

Besides active monitoring at the end of the delivery chain, broadcasters are also taking advantage of “round robin” emulation with different devices, rates/resolutions, and CDNs. In round robin monitoring, the broadcaster alternately checks the lower, middle, and upper bit rates; examines Apple HLS, Microsoft Smooth Streaming, and Adobe HDS formats; and monitors content for quality. By taking these measurements in a controlled environment, broadcasters can easily separate the issues they can control from the ones that occur during ISP delivery.
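The round-robin rotation itself is simple to express. In this sketch the formats, bit-rate tiers, CDN names, and the probe function are stand-ins for a real player emulator and a real service lineup.

```python
# Sketch of a round-robin monitoring schedule: cycle through every
# combination of streaming format, bit-rate tier, and CDN, sampling one
# session per check interval. All names here are illustrative.
import itertools

FORMATS = ["HLS", "Smooth Streaming", "HDS"]
TIERS = ["low", "mid", "high"]
CDNS = ["cdn-a", "cdn-b"]

def round_robin_schedule():
    """Yield (format, tier, cdn) tuples forever, one per check interval,
    visiting every combination before repeating any."""
    return itertools.cycle(itertools.product(FORMATS, TIERS, CDNS))

def run_checks(n, probe):
    """Run the first `n` scheduled checks using `probe(fmt, tier, cdn)`,
    a stand-in for a player emulator that returns True when the sampled
    session plays back cleanly. Returns the combinations that failed."""
    failures = []
    for fmt, tier, cdn in itertools.islice(round_robin_schedule(), n):
        if not probe(fmt, tier, cdn):
            failures.append((fmt, tier, cdn))
    return failures
```

With three formats, three tiers, and two CDNs, one full rotation covers all 18 combinations, so every variant gets sampled on a predictable cadence without monitoring everything at once.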

With a combination of active sampling and round robin emulation, a broadcaster can effectively become a consumer of its own OTT services. When these monitoring tasks are automated and alerts have been configured to warn engineers of any issues, the broadcaster can maintain a proactive approach to monitoring across its full complement of services.

Required Infrastructure

For this model of active sampling to work, the broadcaster’s video monitoring and logging system must touch on a multitude of MPEG transport stream points and monitor adaptive bit rate streaming of both encrypted and unencrypted media. In order to monitor OTT media delivered through apps, the system can employ an external device to emulate and record app behavior. This method accommodates both decryption and authentication while illustrating the user experience. With this functionality, the broadcaster can effectively monitor streaming media (encrypted or unencrypted) in any format.

Monitoring Streaming Content for Compliance

Federal Communications Commission (FCC) regulations demand that both broadcast and OTT content include descriptive video and captioning, so monitoring OTT content for compliance purposes is just as important as maintaining QoE. Fortunately, the very monitoring tools and techniques that support QoE monitoring of OTT services also enable broadcasters to make sure that their services comply with regulations from the FCC and others.

Conclusion

Advertising, encoding, delivery mechanisms, target devices, and other variables combine to make monitoring across OTT services a challenging but necessary task. The simplest and most cost-effective means of monitoring to address new OTT services is to extend installed monitoring and logging technology. In this way, broadcasters can take advantage of proven technology and workflows to assure that they are delivering the high-quality personalized content today’s media consumers desire.

# # #


This paper was first presented at the 2014 NAB Broadcast Engineering Conference on Wednesday, April 9, 2014 in Las Vegas, Nevada. You can find additional papers from the 2014 NAB Broadcast Engineering Conference by purchasing a copy of the 2014 BEC Proceedings at www.nabshow.com.

India’s NSTPL Uses Volicon’s Observer® Monitoring Technology to Support Headend-in-the-Sky Platform

Observer Monitoring and Logging System Enables NSTPL to Confirm Compliance, Quality, and Availability of Content on New JAINHITS Platform

NSTPL (Noida Software Technology Park Limited), part of India’s Jain TV Group, is using the Volicon Observer® Media Intelligence Platform™ digital video monitoring and logging system to enable efficient, effective compliance and quality of service (QoS) monitoring for over 200 channels being aggregated, processed, and uplinked via the company’s Headend-in-the-Sky (HITS) platform, JAINHITS. This platform, the first of its kind in India, offers cable operators across India a straightforward and cost-effective means of meeting the country’s mandatory shift from analog to addressable digital systems.

NSTPL, already an established provider of TV broadcasting, newsgathering, and video up-link services, launched JAINHITS in October 2012 to help cable operators meet the December 2014 digitization deadline set by the Indian Parliament. Through this platform, the company downlinks content from different broadcasters, processes the signals, and uplinks them via satellite for download by its customers and cable operators across India.

The Observer Media Intelligence Platform continuously captures and stores this content, enabling NSTPL to maintain a visual record of the content that has been processed and uplinked. Through an intuitive Web-based interface, the Volicon system also provides easy access both to live streams and recorded media. Monitoring staff and other users at the desktop can thus monitor the content going out to customers or go back days or months to find and provide proof that uplinked content met all appropriate regulations, standards, and quality parameters.

Volicon Works With Astro Malaysia to Roll Out Monitoring and Off-Air Logging System for More Than 200 Channels

Malaysian pay-TV operator Astro is using an Observer® Enterprise video monitoring and logging system to enable off-air logging for more than 200 channels. Focusing on key points in the transmission path, the Volicon system monitors and logs incoming and outgoing feeds in a variety of formats. With broad format support and the capacity for high-density monitoring and logging applications, the Observer system serves as a reliable and flexible solution that addresses the needs of Astro departments ranging from engineering to media sales.

“Volicon’s Observer system gives us an integrated off-air logging system built on proven technology and equipped with features that give our staff a high degree of flexibility in working with aired media,” said Chris McMillan, vice president, production services, Astro. “By allowing simultaneous users across our operations to access logged content quickly and with ease, the Observer system has enabled us to improve our efficiency and responsiveness in assuring the quality and compliance of Astro services.”

With a customer base of more than 3.9 million residential customers (representing approximately 56 percent of Malaysian TV households), Astro offers 171 TV channels, including 39 HD channels, delivered via direct-to-home satellite TV, IPTV, and OTT platforms.

Installed in Astro’s main DTH broadcast center, the All Asia Broadcast Centre located in Kuala Lumpur, the Observer Enterprise accepts and monitors signal types including composite, component, HD/SD SDI, and transport stream inputs. The system is equipped with Volicon’s quality of experience (QoE) module, as well as an as-run log module that allows users to search and sort the as-run log via ID or commercial/program name for quick and easy ad verification with a direct link to video content. A content export module makes it easy for Observer users to extract and share select clips from recorded content.

The Observer’s multiview display feature enables users to watch multiple programs on a network wall and use the desktop interface to target and begin inspecting or troubleshooting a suspect stream without delay. In executive offices and board rooms, this capability opens up a host of valuable monitoring and review opportunities for both real-time and recorded broadcasts.

“The complexity of large-scale pay-TV operations demands a monitoring and logging system that is robust yet intuitive,” said Gary Learner, CTO at Volicon. “The system must meet the technical requirements of the engineers responsible for maintaining the integrity and quality of the service output, as well as the various needs of the staff who use logged content in other areas of the business. The Observer Enterprise does it all, thereby simplifying critical tasks across the broadcast facility.”

More at: volicon.com

How Advances In Remote Monitoring Benefit Even The Smallest Operations

By Gary Learner, CTO

With the continued consolidation of broadcast operations, centralization of staff, and ever-increasing pressure to improve efficiency, broadcasters need affordable, reliable, and flexible solutions for remote monitoring. As always, the ability to perform quality checks proactively at A/V service handoffs is key in ensuring the highest quality of experience (QoE) for customers. The definition of comprehensive remote monitoring, however, and the challenges inherent in establishing such technology within broadcast operations, have undergone radical changes since the first solutions were implemented.

Not all that long ago, the monitoring operations of broadcast stations were focused on viewing aired content across the conventional distribution chain. To establish some form of remote monitoring, such facilities often resorted to either recording the remote transmission to tape or using the equivalent of a modified TiVo system to capture aired audio and video. While neither approach offered an ideal solution, both were preferable to dedicating staff to monitoring at remote sites.

Using one of these recording methods, engineers gained access to all aired content. They could go back and review the content that had been broadcast on a specific channel on a particular date and time, determining both video quality and compliance through visual inspection. The process of finding specific content, however, was inefficient, requiring a significant investment of a staff member’s time. So tedious and time-consuming was this approach that it often took hours or even days before problems in aired advertising and program content were detected.

Given the practical and financial difficulties of making this approach work, the introduction of digital video monitoring and logging systems was a welcome advance. Enabled by improvements in broadcast and network infrastructure, as well as increased use of IP networks for communications, such monitoring systems allowed broadcasters and other media companies to establish much more efficient and cost-effective remote monitoring models and to gain real-time visibility into the end-to-end transmission path.

Though the emergence of digital television (DTV) brought with it several high-definition (HD) resolutions, the industry did standardize on 720p and 1080i with a number of different frame rates. With the adoption of MPEG-2 compression for video and AC-3 and AAC compression for audio, broadcasts came to carry a still wider range of formats and metadata. Nevertheless, the broadcast product itself often remained a single product delivered to a specific region. Consequently, the shift to DTV made relatively little impact on how broadcasters performed monitoring. Whether analog or digital, many broadcasters chose to apply a “fire-and-forget” approach; once content was aired or delivered to the cable operator, QoE was out of their hands, and the job was done.

As the Internet emerged as a means of delivering content, broadcasters recognized the potential this groundbreaking technology held in enabling them to extend their reach worldwide and to countless viewing devices. While IP-based content delivery — OTT and Internet streaming services — opened the door to a host of new services and revenue-generation opportunities, it also presented significant new challenges in assuring QoE. Thus, in taking advantage of the opportunity afforded by OTT, broadcasters considerably expanded the scope and complexity of services requiring monitoring.

Practical Need for Comprehensive Monitoring

Today, the services being monitored by broadcast operations both large and small include not only linear broadcasts, on-demand content, and interactive services, but also a growing volume of OTT services. While on-demand services cause exponential jumps in the amount of content being provided, the introduction of OTT services boosts the number of portals through which content can be consumed. Add to this the plethora of viewing devices — PCs, tablets, and smartphones — and associated “flavors” of content they require, and the challenge of assuring the best possible experience (within the constraints of all components) can be enormous.

Even as broadcasters provide a growing number of services and a rising volume of content in an increasingly complex environment, tough competitive and economic conditions are pushing operations of all sizes to do more with fewer resources. The need for effective monitoring across all these linear and nonlinear services is especially keen for smaller broadcasters, which lack the sizeable infrastructure (or a whole division dedicated to running a proprietary CDN) that makes internal monitoring of all services both practical and economically feasible.

Smaller facilities must manage all monitoring tasks themselves; no one else will do it. The very nature of OTT services calls for centralized monitoring. With media being delivered directly to viewers, who may be scattered across the country or even across different countries, there no longer is a “middle man” to share responsibility for QoE. Thus, even a smaller operation must be able to monitor, from a single location, the availability and quality of services across platforms, CDNs, and video service providers.

OTT service providers must offer QoE that meets a high standard. This is not easy, however, when the service relies on a heterogeneous delivery chain, with different companies controlling elements that will affect the viewer experience. Without an effective means of monitoring all services, the quality of delivery could be compromised for an extended period before engineering is alerted to the fact. Given the limited patience of the consumer, who today has plenty of other options available, the service provider cannot count on a phone call about poor video quality and other such issues; the viewer will simply go elsewhere. Thus, even with a multitude of different delivery points, the content provider must implement some monitoring mechanism that addresses them all.

Without staff and budgets dedicated to monitoring, it’s important that smaller broadcasters capitalize on solutions that can offer a window into all different portals — not just over-the-air, cable, and OTT, but even the different CDNs and how they deliver content. This investment also needs to justify itself over the long haul, offering the flexibility and extensibility necessary to adapt to the continued evolution of media delivery. Doing business in a dynamic marketplace, with existing standards evolving and new standards emerging, particularly for OTT, broadcasters must adopt monitoring solutions that can adapt and continue to support the full range of current service offerings.

Technical Challenges of Comprehensive Monitoring

Many of the technical challenges associated with remote monitoring of linear, on-demand, and interactive services already have been addressed by the industry. Advanced monitoring solutions today can scan hundreds of channels around the clock and automatically test signal integrity, issue alerts (via email and SNMP), and capture the problematic content when channels do not conform to prespecified limits. Positioned “behind” the set-top box (STB), such solutions give operators a single system and location from which to access and monitor the video output continuously. Engineers typically enjoy remote monitoring capabilities that enable them to review video and audio for issues such as static or black screen, as well as errors in closed captioning and audio levels.

Operators also can use today’s sophisticated monitoring systems to access and play out on-demand content to ensure content availability and system capacity, record DPI ad insertions to prove ad conformance, or monitor interactive STB guides to confirm the customer experience. With complete STB command access, broadcasters can perform troubleshooting more effectively and use a historical review of content and services to address intermittent yet chronic issues. Refined to enable intuitive use, such systems often combine controls familiar from VCR units (and now also from video player software) with color cues that clearly indicate the channels being displayed, whether they are live or recorded. Layouts and source selection are managed with simple mouse clicks, and a built-in clock facilitates navigation to the desired time stamp.

With these tools, remote monitoring systems have eliminated the need for expensive and time-consuming manual and visual channel inspections. Broadcasters thus have been able to adopt active approaches to remote monitoring and respond to faults they identify rather than waiting for customer complaints. Given its benefits, remote monitoring technology already has been employed in applications ranging from competitive news analysis to monitoring of “out-of-footprint” broadcast and distribution channels. Now, the challenge facing broadcasters is to apply that technology to their OTT services.

Extending Monitoring to OTT 

In the linear broadcast world, operators have been able to monitor their services passively, assessing the customer’s experience and, for high-value services, continuously logging and recording them to monitor the viewer’s experience of metadata and A/V quality. Working within this model, broadcasters needed to monitor only a single output, but much has changed with the rise of Internet-delivered content. The days in which a preset, pre-packaged schedule of programming was consumed only on a TV set, typically within the confines of the living room, are a thing of the past.

Broadcasters now must ensure delivery of a personalized user experience alongside video content. This task poses a significant QoS/QoE (quality of service/quality of experience) challenge. When providing OTT services, broadcasters effectively multiply their outputs, sometimes generating dozens of versions that require monitoring. Working with a variety of distribution platforms, broadcasters may need to deliver content to consumers via a number of CDNs. The situation is further complicated by the fact that the groups of files delivered to each CDN will need to accommodate a wide range of devices, each with its own profile. There is no plausible way to monitor all of these outputs and versions all of the time, but that does not mean broadcasters cannot institute OTT service monitoring strategies that work.

The most viable monitoring strategy looks at key areas of the overall delivery workflow: ingest, encode, packaging, delivery, and the signal received by the viewer (albeit in a controlled environment free of the vagaries of ISP service).

Monitoring at ingest is straightforward, as it typically requires continuous monitoring of just one feed, most likely an uncompressed studio SDI feed or an MPEG transport stream. It is essential to monitor the files generated at ingest because they frequently serve as the high-bit-rate reference for all future transcodes of that content.
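
A basic integrity check on an ingested MPEG transport stream can be done passively at the file level. The sketch below relies only on the transport stream framing itself (188-byte packets, each beginning with the 0x47 sync byte); anything deeper, such as PID continuity or PCR analysis, is left to a real analyzer.

```python
# Sketch: verify MPEG-TS packet framing in an ingested file.
# A transport stream is a sequence of 188-byte packets, each
# starting with the sync byte 0x47 (ISO/IEC 13818-1).

TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def verify_transport_stream(data: bytes):
    """Return (packet_count, [byte offsets of packets missing the sync byte])."""
    bad_offsets = []
    count = len(data) // TS_PACKET_SIZE
    for i in range(count):
        offset = i * TS_PACKET_SIZE
        if data[offset] != SYNC_BYTE:
            bad_offsets.append(offset)
    return count, bad_offsets
```

A monitoring process would read the file (or a live capture buffer) in chunks and raise an alert whenever `bad_offsets` is non-empty, since lost sync usually indicates corruption upstream.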

It is during the encoding stage that monitoring becomes more demanding; this is the point at which the broadcaster creates multiple versions of the reference file at the different bit rates and resolutions required for multiplatform distribution. Working with anywhere from several to more than a dozen files, the broadcaster must shift to passive monitoring methods. Examination of data about each file, rather than the video itself, is a far more cost-effective approach when dealing with large numbers of files. By looking at the bit rates, syntax, reference timestamps, and alignment of each of the files, the broadcaster can make a sound determination of the files’ integrity.
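
The alignment check mentioned above can be expressed simply once segment timing has been extracted from each rendition. The sketch assumes the metadata has already been parsed into per-rendition lists of segment start times (the extraction step itself is omitted); it confirms that all bit-rate versions share the same segment boundaries, which is what lets players switch renditions cleanly.

```python
# Sketch: confirm that all encoded renditions of one piece of content
# share the same segment boundaries. Input format is an assumption:
# {rendition_name: [segment start times in seconds]}.

def renditions_aligned(renditions, tolerance=0.001):
    """Return True if every rendition has the same number of segments
    and matching start times within the given tolerance (seconds)."""
    timelines = list(renditions.values())
    reference = timelines[0]
    for timeline in timelines[1:]:
        if len(timeline) != len(reference):
            return False
        if any(abs(a - b) > tolerance for a, b in zip(reference, timeline)):
            return False
    return True
```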

Once files have been packaged in the appropriate formats for the target CDNs and platforms, they are delivered along with a manifest describing those files. The simplest way for the broadcaster to confirm that packaging is being performed properly and that delivery is correct and on schedule is to “accept” files in the same way that a CDN would.
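
"Accepting" files the way a CDN would starts with reading the manifest. For an HLS package, that means parsing the media playlist and pairing each declared segment duration with its segment URI; the segments can then be fetched and checked against the declarations. The minimal parser below handles only the `#EXTINF` tag and is a simplification of the full playlist grammar defined in RFC 8216.

```python
# Sketch: minimal HLS media-playlist parse. Each #EXTINF line declares
# the duration of the segment whose URI appears on the next
# non-comment line. Only this one tag is handled here.

def parse_media_playlist(text):
    """Return a list of (segment_uri, declared_duration) pairs."""
    segments = []
    pending_duration = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXTINF:"):
            # "#EXTINF:6.0,optional title" -> 6.0
            pending_duration = float(line[len("#EXTINF:"):].split(",")[0])
        elif line and not line.startswith("#"):
            segments.append((line, pending_duration))
            pending_duration = None
    return segments
```

A delivery check would then request each listed URI and compare actual segment durations and arrival times against these declared values.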

After this point, the broadcaster loses control over file-based content. Having worked to assure that the content delivered to each CDN is packaged correctly in the formats compatible with target devices, the broadcaster must hope that the CDNs and, subsequently, ISPs that carry media to the viewer will do so without introducing any faults or compromising video quality.

Though they can offer details on bandwidth usage, few CDNs provide the depth of reporting that would be necessary to uncover an error in packaging or a video quality issue. Once the CDN hands files off to the consumer’s ISP for delivery, the viewing experience can be influenced by a variety of factors ranging from a poor or slow mobile connection to a weak or spotty network in the home.

Adopting a Practical OTT Monitoring Model

The rise of OTT service delivery brings a widespread move from strictly passive monitoring models toward models that apply sampled active monitoring at the tail end of the chain. This approach acknowledges not only the complexity of the multiplatform media delivery chain, but also the exponential leap in the volume of media being delivered.

A sampling of outputs can yield an accurate representation of service quality as experienced by the majority of the broadcaster’s OTT viewing audience. Thus, for a relatively modest investment of time and money, particularly as compared with the cost of monitoring all outputs all of the time, the broadcaster can assure that most of its customers are enjoying quality service most of the time.

At the end of the delivery chain, broadcasters are also taking advantage of “round robin” emulation with different devices, rates/resolutions, and CDNs. In this instance, the broadcaster alternately checks the lower, middle, and upper bit rates; examines Apple HLS, Microsoft Smooth Streaming, and Adobe HDS formats; and monitors content for quality. (Naturally, the greatest value is realized when the bulk of sampling involves the most popular viewing devices.) When these tasks are undertaken in a controlled environment, the broadcaster can easily separate the issues over which it has control from the issues that can occur once content is subject to the variable conditions of ISP delivery.
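
The round-robin rotation itself is just an endless cycle over every combination of bit-rate tier, streaming format, and CDN. The sketch below shows one way to generate that schedule; the tier, format, and CDN names are placeholders, and weighting toward the most popular viewing devices would be layered on top.

```python
# Sketch: round-robin sampling schedule over every combination of
# bit-rate tier, streaming format, and CDN. Names are illustrative.
from itertools import cycle, product

def round_robin_schedule(bitrate_tiers, formats, cdns):
    """Yield (tier, format, cdn) tuples in an endless rotation, so each
    output variant is sampled in turn rather than monitored continuously."""
    return cycle(product(bitrate_tiers, formats, cdns))
```

A monitoring worker would call `next()` on the schedule at each sampling interval, emulate a player for that combination, and record the measured quality.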

The combination of active sampling and round robin emulation allows the broadcaster to set itself up as a consumer of its own OTT services. When these monitoring tasks are automated and alerts have been configured to warn engineers of any issues, the broadcaster is able to maintain a proactive approach to monitoring across its full complement of services.

Supporting this model, the broadcaster’s video monitoring and logging system must touch on a multitude of MPEG transport stream points and monitor adaptive bit rate streaming of both encrypted and unencrypted media. To facilitate monitoring of media delivered through apps, the system can employ an external device to emulate and record app behavior. This method accommodates both decryption and authentication while illustrating the user experience. With this functionality, the broadcaster can effectively monitor streaming media (encrypted or unencrypted) in any format.

Though maintaining QoE is a must for OTT service providers, the need to monitor for compliance purposes extends beyond conventional services and into the OTT realm. Fortunately, the very monitoring tools and techniques that support QoE monitoring of OTT services also enable broadcasters to make sure that their services comply with regulations governing accessibility to media — namely the FCC regulations stemming from the Twenty-First Century Communications and Video Accessibility Act of 2010 (CVAA).

The FCC’s implementation of the CVAA demands that both broadcast and OTT content include video description and captioning. With the aforementioned approach to monitoring OTT services, broadcasters can be proactive in ensuring that streamed content includes captions in the requisite SMPTE Timed Text (SMPTE-TT) format.
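
One cheap, proactive caption check at the packaging stage is to confirm that the master playlist even declares a subtitles rendition. The sketch below does this for HLS, where a subtitles rendition is announced with an `#EXT-X-MEDIA:TYPE=SUBTITLES` tag; the equivalent check for a DASH or Smooth Streaming package (where SMPTE-TT is more commonly carried) would inspect the manifest's text-track declarations instead.

```python
# Sketch: does an HLS master playlist declare any subtitles rendition?
# This only verifies that a caption track is advertised, not that its
# contents are correct; deeper validation would fetch the track itself.

def has_subtitle_rendition(master_playlist_text):
    """Return True if any #EXT-X-MEDIA line declares TYPE=SUBTITLES."""
    return any(
        line.startswith("#EXT-X-MEDIA:") and "TYPE=SUBTITLES" in line
        for line in master_playlist_text.splitlines()
    )
```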

Conclusion

Variables including advertising, encoding, delivery mechanisms, and target devices combine to make monitoring across OTT services a challenging but necessary task. Broadcasters and other providers must have the means to log and monitor all of their outputs, and the simplest and most cost-effective solution is, where possible, the extension of installed monitoring and logging technology to address new OTT services. In this way, broadcasters can take advantage of proven technology and workflows to assure that they are delivering the high-quality personalized content desired by today’s media consumers.

Observer Media Intelligence Platform Brings Agility to Key Broadcast Applications

Broadcasters today can gain a competitive edge by quickly providing timely, high-quality content to media consumers on multiple platforms. Designed to enable and accelerate this process, the Observer Media Intelligence Platform (MIP)™ from Volicon equips users to capitalize on new opportunities not only to create compelling content, but also to raise viewer ratings and generate new ad revenue. Through a series of purpose-built applications, the Volicon platform supports compliance, quality assurance, competitive analysis, production, and repurposing for multiple platforms and social media outlets.

The Observer MIP is unique in capturing content from any source and enabling users at the desktop to edit and clip that content in real time, from anywhere, at any time. With the platform’s Capture application, broadcasters can easily capture media from a variety of sources and rapidly bring it into the media asset management (MAM) system and editing workflow, whether for on-air broadcast or a digital platform such as Web, mobile, and OTT services. With the capacity to ingest media according to a schedule, in real time, or via 24×7 recording, the platform serves as a cost-effective alternative to expensive dedicated capture stations. Because a low-resolution proxy is created along with the True HD (1080i 25/30, 720p 50/60) recording, local and remote Observer MIP users can collaborate in creating valuable footage for distribution.

To speed the publishing of content to a multitude of digital platforms including the Web and social media outlets, Volicon offers the Share application for the Observer MIP. This is normally a cumbersome and expensive process, but Share allows the broadcaster to repurpose content without an edit suite and push it to these targets quickly using one-button publishing profiles. As media is delivered to YouTube, Twitter, Facebook, and other popular targets, Share uses existing closed captioning text to provide content that is compliant with government regulations.

Also available on the Observer MIP, the highly scalable Review application gives users a fast and intuitive solution for reviewing on-air content, validating ad placement, and performing competitive analysis. Making high-resolution live and historical broadcast content available locally and lower-resolution proxy versions available with ratings data on any device, the application allows users to review and analyze their broadcasts, and those of competitors, at any time.

With the Observer MIP Monitor application, users can monitor video quality, respond to fault reports, and use a full recording of the on-air broadcast for instant review of errors and their impact. Flexible, individually configurable alert thresholds trigger notifications delivered via email or SNMP traps, and an integrated multiviewer feature enables users to drive their standard displays as multiviewers and instantly grab and evaluate a suspect stream via their desktop interfaces. At the same time, the Comply application for the Observer MIP enables users to record, clip, and export their broadcasts to meet regulatory and licensing requirements, ranging from decency to closed captioning to loudness, with a clear visual affidavit of compliance.

Intelligent and intuitive, the Observer MIP makes it easy for broadcasters to deliver timely content to virtually any platform.


Volicon and Kaltura Introduce End-to-End Workflow for Rapid Publishing of Captioned Content to the Web

Volicon and Kaltura have announced a technology partnership that enables an end-to-end workflow for bringing live broadcast capture and captioning to the full range of digital and social media platforms quickly and cost-effectively. The companies’ integrated solution unites the unique content capture, streaming, review, and clip-extraction capabilities of Volicon’s Observer® Media Intelligence Platform™ with the highly automated video transcoding, publishing, and management tools of Kaltura’s video platform to facilitate delivery of “rights approved” and captioned content to all digital platforms with unparalleled speed and efficiency.

“Our joint customers — TV stations and networks — need a seamless, fast, and reliable process for generating FCC-compliant captioned content that can be viewed on any display, and that is exactly what this partnership provides,” said Gary Learner, chief technology officer at Volicon. “By simplifying and accelerating the creation, packaging, and distribution of this content, we help our customers gain a competitive advantage in providing timely content across virtually every outlet and viewing device.”

“Rapid time to market is a critical element in the effective monetization of broadcast assets, and this is one key benefit that makes our joint solution so valuable in today’s media marketplace,” said Ron Yekutiel, Kaltura chairman and CEO. “The speed, efficiency, and simplicity of this approach, along with its reliance on approved, compliant broadcast content, make it an exceptionally powerful solution.”

The Observer Media Intelligence Platform records aired broadcast content 24/7/365 and provides instant access to this content through an easy-to-use Web-based interface. Working at a desktop, multiple users can rapidly locate, stream, and review content immediately following its broadcast; quickly mark “in” and “out” points; and extract and deliver H.264-encoded 1080i clips along with metadata — including caption information — to the Kaltura system.

Using the high-resolution clip provided by the Volicon system, the Kaltura platform creates versions of the video in all “flavors” necessary for the target distribution platforms and viewing devices. Captioning is incorporated into each version to ensure that content meets the accessibility requirements of the FCC’s Twenty-First Century Communications and Video Accessibility Act (CVAA).

Eliminating the need to take broadcast content back into a costly edit suite for editing, the solution from Volicon and Kaltura saves broadcasters both time and money. Because the aired content captured by the Observer Media Intelligence Platform has been fully produced and approved, users of all technical skill levels can confidently clip and upload it to Kaltura for processing. Pre-built profiles within the Observer system not only streamline this process, but also assure that content is prepared correctly.

Learn More at: http://www.volicon.com/solutions/media-intelligence-platform

Volicon Debuts Review, Comply, and Monitor Applications for Observer® Media Intelligence Platform™ at the 2014 NAB Show

During the 2014 NAB Show, we will also feature the new Review, Comply, and Monitor applications within our Observer® Media Intelligence Platform™. While the platform itself records a station’s media from ingest to playout, as well as multiple on-air broadcasts, these three applications provide powerful tools for evaluating and validating the content of advertising, promos, and programming; assuring the adherence of audio, video, and required metadata to applicable standards and regulations; and monitoring and maintaining the integrity of content and service delivery.

“The Review, Comply, and Monitor applications within the Observer Media Intelligence Platform capitalize on proven Volicon technology to address critical concerns within the broadcast enterprise,” said Russell Wise, vice president of global sales at Volicon. “While the Review application provides convenient tools for assessing and improving the quality, substance, and timing of aired content, the Comply application supplies the combination of recorded content and metadata that is essential for effective demonstration of regulatory compliance. The Monitor application provides the robust toolset that has made our Observer systems the top choice of engineers at media facilities worldwide.”

Our Media Intelligence Platform boasts unique content-recording capabilities and an intuitive user interface that enables multiple users to stream, analyze, and review content from anywhere at any time. When equipped with the Volicon Review application, the platform gives broadcasters, networks, and playout service providers a fast and intuitive solution for reviewing on-air content, validating ad placement, and performing competitive analysis.

The Review application makes high-resolution live and historical broadcast content available locally and lower-resolution proxy versions available on any device. With immediate access to broadcast content, users working centrally and across geographically distributed sites can keep an eye on their own broadcasts, as well as those of their competitors, and the associated ratings data within a single GUI. Because the application interfaces with the playout automation system to provide as-run log data for comparison with the frame-accurate recording of the broadcast output, users can easily show advertisers what they’re getting for their money.

Our Comply application enables users to record, clip, and export their broadcasts to meet regulatory and licensing requirements. Addressing a complete array of regulations, ranging from decency to closed captioning to loudness, this scalable and highly reliable product allows users to respond quickly and unambiguously to compliance requests. Leveraging our proven compliance monitoring technology, the Comply application lays critical A/V metadata over frame-accurate video to create a clear visual affidavit of compliance.

Built on our acclaimed digital video monitoring technology, the new Monitor application allows users to monitor video quality, respond to fault reports, and use a full recording of the on-air broadcast for instant review of errors and their impact. While continuously analyzing logged content for a variety of faults such as black or static screen, loss of video or closed captions, and incorrect audio levels, this application provides flexible, individually configurable alert thresholds, with notifications delivered via email or SNMP traps.

Quality measurement thresholds may be configured per channel to optimize performance and error reporting. To further simplify network monitoring and troubleshooting, the Monitor application provides an integrated multiviewer feature that enables Media Intelligence Platform users to use their standard displays as multiviewers or record the output of a traditional multiviewer. With multiple streams presented on a network wall, users can respond immediately to any issues, instantly grabbing the suspect stream via their desktop interfaces to begin resolving the problem.

We will demonstrate the Observer Media Intelligence Platform™, running the Review, Comply, and Monitor applications, at our booth (SU7121) during the 2014 NAB Show.

Stop by and see us!

Volicon Unveils ‘Capture’ and ‘Share’ Applications for New Observer® Media Intelligence Platform™ at 2014 NAB Show

Volicon will showcase the new Capture and Share applications for the company’s Observer® Media Intelligence Platform™ at the 2014 NAB Show. Designed to help broadcasters capitalize on new opportunities to capture and distribute new content quickly, these applications build on the Media Intelligence Platform’s unique content-recording capabilities and intuitive user interface to speed and streamline multiplatform content creation and delivery.

“The new Observer Media Intelligence Platform makes it easy and cost-effective for broadcasters to capture media from a variety of sources and quickly produce and deliver compelling content to viewers via on-air broadcast, as well as digital and social media platforms,” said Gary Learner, chief technology officer at Volicon. “Our Capture and Share applications for this innovative platform provide powerful, intuitive tools that simplify and accelerate this process.”

Serving as a cost-effective alternative to expensive dedicated capture stations, Volicon’s new Media Intelligence Platform allows broadcasters to capture media from any source at any time, ingesting media either according to a schedule, in real time, or via 24×7 recording. Equipped with the Capture application, the platform supports a fast, simple edit workflow by enabling the user to review content as it is captured, immediately clip high-value content, and push it directly to editing and media asset management systems without time-consuming transcoding steps. Because a low-resolution proxy is created along with the True HD (1080i 25/30, 720p 50/60) recording, both local and remote Observer Media Intelligence Platform users — remote staff, partners, consultants, outside talent, and reporters in the field — can quickly and easily collaborate to access, review, and clip content to create valuable footage for distribution.

Further enabling speed and agility in publishing content to a multitude of digital platforms, Volicon’s Observer Share application within the Media Intelligence Platform allows the broadcaster to repurpose existing content quickly and efficiently and subsequently push it to digital platforms and social media sites. One-button publishing profiles facilitate rapid processing of content to meet appropriate container and encoding requirements for an array of platforms, including target websites, YouTube, Twitter, and Facebook. Observer Share also makes use of existing closed-captioning text to publish content that is compliant with government regulations.

Volicon will demonstrate the Media Intelligence Platform, equipped with both the Capture and Share applications, at its booth SU7121 during the 2014 NAB Show.