How Advances In Remote Monitoring Benefit Even The Smallest Operations

By Gary Learner, CTO

With the continued consolidation of broadcast operations, centralization of staff, and ever-increasing pressure to improve efficiency, broadcasters need affordable, reliable, and flexible solutions for remote monitoring. As always, the ability to perform quality checks proactively at A/V service handoffs is key in ensuring the highest quality of experience (QoE) for customers. The definition of comprehensive remote monitoring, however, and the challenges inherent in establishing such technology within broadcast operations, have undergone radical changes since the first solutions were implemented.

Not all that long ago, the monitoring operations of broadcast stations were focused on viewing aired content across the conventional distribution chain. To establish some form of remote monitoring, such facilities often resorted either to recording the remote transmission to tape or using the equivalent of a modified TiVo system to capture aired audio and video. While neither approach offered an ideal solution, both were preferable to dedicating staff to monitoring at remote sites.

Using one of these recording methods, engineers gained access to all aired content. They could go back and review the content that had been broadcast on a specific channel on a particular date and time, determining both video quality and compliance through visual inspection. The process of finding specific content, however, was inefficient, requiring a significant investment of a staff member’s time. So tedious and time-consuming was this approach that it often took hours or even days before problems in aired advertising and program content were detected.

Given the practical and financial difficulties of making this approach work, the introduction of digital video monitoring and logging systems was a welcome advance. Enabled by improvements in broadcast and network infrastructure, as well as increased use of IP networks for communications, such monitoring systems allowed broadcasters and other media companies to establish much more efficient and cost-effective remote monitoring models and to gain real-time visibility into the end-to-end transmission path.

Though the emergence of digital television (DTV) brought with it several high-definition (HD) resolutions, the industry standardized on 720p and 1080i at a number of different frame rates. By embracing MPEG-2 compression for video and AC-3 and AAC compression for audio, the industry further expanded the resolutions and metadata involved in broadcasting. Nevertheless, the broadcast product itself often remained a single product delivered to a specific region. Consequently, the shift to DTV made relatively little impact on how broadcasters performed monitoring. Whether analog or digital, many broadcasters chose to apply a “fire-and-forget” approach; once content was aired or delivered to the cable operator, QoE was out of their hands, and the job was done.

As the Internet emerged as a means of delivering content, broadcasters recognized the potential this groundbreaking technology held in enabling them to extend their reach worldwide and to countless viewing devices. While IP-based content delivery — OTT and Internet streaming services — opened the door to a host of new services and revenue-generation opportunities, it also presented significant new challenges in assuring QoE. Thus, in taking advantage of the opportunity afforded by OTT, broadcasters considerably expanded the scope and complexity of services requiring monitoring.

Practical Need for Comprehensive Monitoring

Today, the services being monitored by broadcast operations both large and small include not only linear broadcasts, on-demand content, and interactive services, but also a growing volume of OTT services. While on-demand services cause exponential jumps in the amount of content being provided, the introduction of OTT services boosts the number of portals through which content can be consumed. Add to this the plethora of viewing devices — PCs, tablets, and smartphones — and associated “flavors” of content they require, and the challenge of assuring the best possible experience (within the constraints of all components) can be enormous.

Even as broadcasters provide a growing number of services and a rising volume of content in an increasingly complex environment, tough competitive and economic conditions are pushing operations of all sizes to do more with fewer resources. That said, the need for effective monitoring across all these linear and nonlinear services is especially keen for smaller broadcasters. Such operations lack the sizeable infrastructure — or whole division dedicated to running a proprietary CDN — that makes internal monitoring of all services both practical and economically feasible.

Smaller facilities must manage all monitoring tasks themselves; no one else will do it. The very nature of OTT services calls for centralized monitoring. With media being delivered directly to viewers, who may be scattered across the country or even across different countries, there no longer is a “middle man” to share responsibility for QoE. Thus, even a smaller operation must be able to monitor, from a single location, the availability and quality of services across platforms, CDNs, and video service providers.

OTT service providers must offer QoE that meets a high standard. This is not easy, however, when the service relies on a heterogeneous delivery chain, with different companies controlling elements that will affect the viewer experience. Without an effective means of monitoring all services, the quality of delivery could be compromised for an extended period of time before engineering is alerted to that fact. Given the limited patience of the consumer, who today has plenty of other options available, the service provider cannot expect to get a phone call about poor video quality and other such issues. Thus, even with a multitude of different delivery points, the content provider must implement some monitoring mechanism that addresses them all.

Without staff and budgets dedicated to monitoring, it’s important that smaller broadcasters capitalize on solutions that can offer a window into all different portals — not just over-the-air, cable, and OTT, but even the different CDNs and how they deliver content. This investment also needs to justify itself over the long haul, offering the flexibility and extensibility necessary to adapt to the continued evolution of media delivery. Doing business in a dynamic marketplace, with existing standards evolving and new standards emerging, particularly for OTT, broadcasters must adopt monitoring solutions that can adapt and continue to support the full range of current service offerings.

Technical Challenges of Comprehensive Monitoring

Many of the technical challenges associated with remote monitoring of linear, on-demand, and interactive services already have been addressed by the industry. Advanced monitoring solutions today can scan hundreds of channels around the clock and automatically test signal integrity, issue alerts (via email and SNMP), and capture the problematic content when channels do not conform to prespecified limits. Positioned “behind” the set-top box (STB), such solutions give operators a single system and location from which to access and monitor the video output continuously. Engineers typically enjoy remote monitoring capabilities that enable them to review video and audio for issues such as static or black screen, as well as errors in closed-captioning and audio levels.
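
As a rough illustration of how such automated checks might be wired together, the following Python sketch scans a channel list, compares hypothetical measurements against prespecified limits, and emails an alert when a channel falls out of bounds. The probe function, thresholds, and addresses are all assumptions for illustration, not any product's implementation; SNMP trap delivery would follow the same pattern.

```python
# Minimal sketch (assumptions throughout): scan channels, compare measurements
# against prespecified limits, and email an alert on any violation.
import smtplib
from email.message import EmailMessage

LIMITS = {"audio_lufs_min": -31.0, "black_frame_ratio_max": 0.02}  # example limits

def probe_channel(channel_id):
    """Hypothetical stand-in for the measurement backend."""
    return {"audio_lufs": -24.0, "black_frame_ratio": 0.01}

def send_alert(channel_id, violations, smtp_host="mail.example.com"):
    msg = EmailMessage()
    msg["Subject"] = f"Monitoring alert: channel {channel_id}"
    msg["From"] = "monitor@example.com"
    msg["To"] = "engineering@example.com"
    msg.set_content("\n".join(violations))
    with smtplib.SMTP(smtp_host) as smtp:
        smtp.send_message(msg)

def scan(channels):
    for ch in channels:
        m = probe_channel(ch)
        violations = []
        if m["audio_lufs"] < LIMITS["audio_lufs_min"]:
            violations.append(f"audio level {m['audio_lufs']} LUFS below limit")
        if m["black_frame_ratio"] > LIMITS["black_frame_ratio_max"]:
            violations.append(f"black-frame ratio {m['black_frame_ratio']} above limit")
        if violations:
            send_alert(ch, violations)

scan(["WXYZ-1", "WXYZ-2"])
```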

Operators also can use today’s sophisticated monitoring systems to access and play out on-demand content to ensure content availability and system capacity, record DPI ad insertions as proof of conformance, or monitor interactive STB guides to verify the customer experience. With complete STB command access, broadcasters can perform troubleshooting more effectively and use a historical review of content and services to address intermittent yet chronic issues. Refined to enable intuitive use, such systems often combine controls familiar from VCR units (and now also from video player software) with color cues that clearly indicate the channels being displayed, whether they are live or recorded. Layouts and source selection are managed with simple mouse clicks, and a built-in clock facilitates navigation to the desired time stamp.

With these tools, remote monitoring systems have eliminated the need for expensive and time-consuming manual and visual channel inspections. Broadcasters thus have been able to adopt active approaches to remote monitoring and respond to faults they identify rather than waiting for customer complaints. Given its benefits, remote monitoring technology already has been employed in applications ranging from competitive news analysis to monitoring of “out-of-footprint” broadcast and distribution channels. Now, the challenge facing broadcasters is to apply that technology to their OTT services.

Extending Monitoring to OTT 

In the linear broadcast world, operators have been able to monitor their services passively, assessing the customer’s experience and, for high-value services, continuously logging, recording, and monitoring them to gauge the viewer’s experience of metadata and A/V quality. Working within this model, broadcasters needed to monitor only a single output, but much has changed with the rise of Internet-delivered content. The days in which a preset and pre-packaged schedule of programming is consumed only on a TV set, typically viewed within the confines of the living room, are a thing of the past.

Broadcasters now must ensure delivery of a personalized user experience alongside video content. This task poses a significant QoS/QoE (quality of service/quality of experience) challenge. When providing OTT services, broadcasters effectively multiply their outputs, sometimes generating dozens of versions that require monitoring. Working with a variety of distribution platforms, broadcasters may need to deliver content to consumers via a number of CDNs. The situation is further complicated by the fact that the groups of files delivered to each CDN will need to accommodate a wide range of devices, each with its own profile. There is no plausible way to monitor all of these outputs and versions all of the time, but this doesn’t mean that it is impossible for broadcasters to institute OTT service monitoring strategies that work.

The most viable monitoring strategy looks at key areas of the overall delivery workflow: ingest, encode, packaging, delivery, and the signal received by the viewer (albeit in a controlled environment free of the vagaries of ISP service).

Monitoring at ingest is straightforward, as it typically requires continuous monitoring of just one feed, most likely an uncompressed studio SDI feed or an MPEG transport stream. It is essential to monitor the files generated at ingest because they frequently serve as high-bit-rate reference files for all future transcodes of that content.
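
For the transport-stream case, a basic integrity check can be performed at the packet level. The sketch below is a simple Python example rather than any product's method: it verifies 188-byte packet alignment by looking for the 0x47 sync byte and counts packets flagged with the transport error indicator. The file path is illustrative.

```python
# Minimal sketch: sanity-check an MPEG transport stream capture at ingest by
# verifying 188-byte packet alignment (sync byte 0x47) and counting
# transport_error_indicator flags.
PACKET_SIZE = 188
SYNC_BYTE = 0x47

def check_ts(path, max_packets=100_000):
    sync_errors = tei_flags = packets = 0
    with open(path, "rb") as f:
        while packets < max_packets:
            pkt = f.read(PACKET_SIZE)
            if len(pkt) < PACKET_SIZE:
                break
            packets += 1
            if pkt[0] != SYNC_BYTE:
                sync_errors += 1
            elif pkt[1] & 0x80:          # transport_error_indicator bit
                tei_flags += 1
    return {"packets": packets, "sync_errors": sync_errors, "tei_flags": tei_flags}

print(check_ts("ingest_capture.ts"))   # illustrative file name
```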

It is during the encoding stage that monitoring becomes more demanding; this is the point at which the broadcaster creates multiple versions of the reference file at the different bit rates and resolutions required for multiplatform distribution. Working with anywhere from several files to more than a dozen files, the broadcaster must shift to passive monitoring methods. Examination of data about the file, rather than the video itself, is a far more cost-effective approach when dealing with large numbers of files. By looking at bit rates, syntax, reference timestamps, and alignment of each of the files, the broadcaster can make a sound determination of the files’ integrity.
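
The sketch below illustrates that kind of metadata-level check in Python: each rendition's measured bit rate is compared against its target, and start timestamps and durations are compared across renditions. The field names, tolerances, and sample values are assumptions for illustration only.

```python
# Minimal sketch of passive, metadata-level checks on a set of encoded renditions.
def check_renditions(renditions, bitrate_tolerance=0.10, max_drift_s=0.04):
    problems = []
    ref = renditions[0]
    for r in renditions:
        if abs(r["measured_kbps"] - r["target_kbps"]) > bitrate_tolerance * r["target_kbps"]:
            problems.append(f"{r['name']}: bit rate {r['measured_kbps']} kbps off target {r['target_kbps']}")
        if abs(r["start_pts_s"] - ref["start_pts_s"]) > max_drift_s:
            problems.append(f"{r['name']}: start timestamp misaligned with {ref['name']}")
        if abs(r["duration_s"] - ref["duration_s"]) > max_drift_s:
            problems.append(f"{r['name']}: duration differs from {ref['name']}")
    return problems

# Illustrative sample data: the 720p bit rate and 480p start time would be flagged.
renditions = [
    {"name": "1080p", "target_kbps": 6000, "measured_kbps": 5890, "start_pts_s": 0.00, "duration_s": 1800.02},
    {"name": "720p",  "target_kbps": 3000, "measured_kbps": 3410, "start_pts_s": 0.00, "duration_s": 1800.02},
    {"name": "480p",  "target_kbps": 1200, "measured_kbps": 1195, "start_pts_s": 0.08, "duration_s": 1800.02},
]
for p in check_renditions(renditions):
    print(p)
```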

Once files have been packaged in the appropriate formats for the target CDNs and platforms, they are delivered along with a manifest describing those files. The simplest way for the broadcaster to confirm that packaging is being performed properly and that delivery is correct and on schedule is to “accept” files in the same way that a CDN would.
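
One simple way to exercise that handoff is sketched below: fetch an HLS media playlist the way a CDN edge or player would and confirm that every referenced segment is reachable. The playlist URL is hypothetical, and authentication, retries, and the equivalent checks for other packaging formats are omitted for brevity.

```python
# Minimal sketch of "accepting" packaged output the way a CDN would: download an
# HLS media playlist and issue a HEAD request for every segment it references.
import urllib.parse
import urllib.request

def check_hls_delivery(playlist_url):
    playlist = urllib.request.urlopen(playlist_url).read().decode("utf-8")
    unreachable = []
    for line in playlist.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue                        # skip tags and comments, keep segment URIs
        seg_url = urllib.parse.urljoin(playlist_url, line)
        req = urllib.request.Request(seg_url, method="HEAD")
        try:
            urllib.request.urlopen(req, timeout=10)
        except Exception as exc:
            unreachable.append((seg_url, str(exc)))
    return unreachable

# Hypothetical origin URL for illustration.
for url, err in check_hls_delivery("https://origin.example.com/channel1/index.m3u8"):
    print("unreachable:", url, err)
```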

After this point, the broadcaster loses control over file-based content. Having worked to assure that the content delivered to each CDN is packaged correctly in the formats compatible with target devices, the broadcaster must hope that the CDNs and, subsequently, ISPs that carry media to the viewer will do so without introducing any faults or compromising video quality.

Though they can offer details on bandwidth usage, few CDNs provide the depth of reporting that would be necessary to uncover an error in packaging or a video quality issue. Once the CDN hands files off to the consumer’s ISP for delivery, the viewing experience can be influenced by a variety of factors ranging from a poor or slow mobile connection to a weak or spotty network in the home.

Adopting a Practical OTT Monitoring Model

The rise of OTT service delivery brings a widespread move from strictly passive monitoring models toward models that apply sampled active monitoring on the tail end of the chain. This approach acknowledges not only the complexity of the multiplatform media delivery chain, but also the exponential leap in the volume of media being delivered.

A sampling of outputs can yield an accurate representation of service quality as experienced by the majority of the broadcaster’s OTT viewing audience. Thus, for a relatively modest investment of time and money, particularly as compared with the cost of monitoring all outputs all of the time, the broadcaster can assure that most of its customers are enjoying quality service most of the time.

At the end of the delivery chain, broadcasters are also taking advantage of “round robin” emulation with different devices, rates/resolutions, and CDNs. In this instance, the broadcaster alternately checks the lower, middle, and upper bit rates; examines Apple HLS, Microsoft Smooth Streaming, and Adobe HDS formats; and monitors content for quality. (Naturally, the greatest value is realized when the bulk of sampling involves the most popular viewing devices.) When these tasks are undertaken in a controlled environment, the broadcaster can easily separate the issues over which it has control from the issues that can occur once content is subject to the variable conditions of ISP delivery.
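
A minimal version of such a rotation is sketched below in Python: it cycles through combinations of CDN, packaging format, and bit-rate tier, oversampling the most popular profile. The CDN names, dwell time, and sampling hook are placeholders.

```python
# Minimal sketch of a round-robin sampling schedule across CDNs, formats, and tiers.
import itertools
import time

CDNS = ["cdn-a", "cdn-b"]
FORMATS = ["HLS", "Smooth Streaming", "HDS"]
TIERS = ["low", "mid", "high"]
POPULAR_EXTRA = [("cdn-a", "HLS", "mid")] * 2   # oversample the most-watched profile

def sample(cdn, fmt, tier):
    """Hypothetical hook: play a short stretch of the stream and log QoE metrics."""
    print(f"sampling {fmt} / {tier} tier via {cdn}")

schedule = list(itertools.product(CDNS, FORMATS, TIERS)) + POPULAR_EXTRA
# Limited to 20 samples for illustration; a real monitor would cycle indefinitely.
for cdn, fmt, tier in itertools.islice(itertools.cycle(schedule), 20):
    sample(cdn, fmt, tier)
    time.sleep(1)   # dwell time per sample; tune to the size of the lineup
```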

The combination of active sampling and round robin emulation allows the broadcaster to set itself up as a consumer of its own OTT services. When these monitoring tasks are automated and alerts have been configured to warn engineers of any issues, the broadcaster is able to maintain a proactive approach to monitoring across its full complement of services.

Supporting this model, the broadcaster’s video monitoring and logging system must touch on a multitude of MPEG transport stream points and monitor adaptive bit rate streaming of both encrypted and unencrypted media. To facilitate monitoring of media delivered through apps, the system can employ an external device to emulate and record app behavior. This method accommodates both decryption and authentication while illustrating the user experience. With this functionality, the broadcaster can effectively monitor streaming media (encrypted or unencrypted) in any format.

Though maintaining QoE is a must for OTT service providers, the need to monitor for compliance purposes extends beyond conventional services and into the OTT realm. Fortunately, the very monitoring tools and techniques that support QoE monitoring of OTT services also enable broadcasters to make sure that their services comply with regulations — namely the FCC regulations stemming from the Twenty-First Century Communications and Video Accessibility Act of 2010 (CVAA) — governing accessibility to media.

The FCC’s implementation of the CVAA demands that both broadcast and OTT content include descriptive video and captioning. With the aforementioned approach to monitoring OTT services, broadcasters can be proactive in ensuring that streamed content includes captions in the requisite SMPTE Timed Text (SMPTE-TT) format.

Conclusion

Variables including advertising, encoding, delivery mechanisms, and target devices combine to make monitoring across OTT services a challenging but necessary task. Broadcasters and other providers must have the means to log and monitor all of their outputs, and the simplest and most cost-effective solution is, where possible, the extension of installed monitoring and logging technology to address new OTT services. In this way, broadcasters can take advantage of proven technology and workflows to assure that they are delivering the high-quality personalized content desired by today’s media consumers.

Observer Media Intelligence Platform Brings Agility to Key Broadcast Applications

Broadcasters today can gain a competitive edge by quickly providing timely, high-quality content to media consumers on multiple platforms. Designed to enable and accelerate this process, the Observer Media Intelligence Platform (MIP)™ from Volicon equips users to capitalize on new opportunities not only to create compelling content, but also to raise viewer ratings and generate new ad revenue. Built from a series of purpose-built applications, the Volicon platform supports compliance, quality assurance, competitive analysis, production, and repurposing for multiple platforms and social media outlets.

The Observer MIP is unique in capturing content from any source and enabling users at the desktop to edit and clip that content in real time, from anywhere, at any time. With the platform’s Capture application, broadcasters can easily capture media from a variety of sources and rapidly bring it into the media asset management (MAM) system and editing workflow, whether for on-air broadcast or a digital platform such as Web, mobile, and OTT services. With the capacity to ingest media according to a schedule, in real time, or via 24×7 recording, the platform serves as a cost-effective alternative to expensive dedicated capture stations. Because a low-resolution proxy is created along with the True HD (1080i 25/30, 720p 50/60) recording, local and remote Observer MIP users can collaborate in creating valuable footage for distribution.

To speed the publishing of content to a multitude of digital platforms including the Web and social media outlets, Volicon offers the Share application for the Observer MIP. This is normally a cumbersome and expensive process, but Share allows the broadcaster to repurpose (without the edit suite) and push content to these targets quickly using one-button publishing profiles. As media is delivered to YouTube, Twitter, Facebook, and other popular targets, Share uses existing closed captioning text to provide content that is compliant with government regulations.

Also available on the Observer MIP, the highly scalable Review application gives users a fast and intuitive solution for reviewing on-air content, validating ad placement, and performing competitive analysis. Making high-resolution live and historical broadcast content available locally and lower-resolution proxy versions available with ratings data on any device, the application allows users to review and analyze their broadcasts, and those of competitors, at any time.

With the Observer MIP Monitor application, users can monitor video quality, respond to fault reports, and use a full recording of the on-air broadcast for instant review of errors and their impact. Flexible, individually configurable alert thresholds trigger notifications delivered via email or SNMP traps, and an integrated multiviewer feature enables users to drive their standard displays as multiviewers and instantly grab and evaluate a suspect stream via their desktop interfaces. At the same time, the Comply application for the Observer MIP enables users to record, clip, and export their broadcasts to meet regulatory and licensing requirements, ranging from decency to closed captioning to loudness, with a clear visual affidavit of compliance.

Intelligent and intuitive, the Observer MIP makes it easy for broadcasters to deliver timely content to virtually any platform.

 

Volicon and Kaltura Introduce End-to-End Workflow for Rapid Publishing of Captioned Content to the Web

Volicon and Kaltura have announced a technology partnership that enables an end-to-end workflow for bringing live broadcast capture and captioning to the full range of digital and social media platforms quickly and cost-effectively. The companies’ integrated solution unites the unique content capture, streaming, review, and clip-extraction capabilities of Volicon’s Observer® Media Intelligence Platform™ with the highly automated video transcoding, publishing, and management tools of Kaltura’s video platform to facilitate delivery of “rights approved” and captioned content to all digital platforms with unparalleled speed and efficiency.

“Our joint customers — TV stations and networks — need a seamless, fast, and reliable process for generating FCC-compliant captioned content that can be viewed on any display, and that is exactly what this partnership provides,” said Gary Learner, chief technology officer at Volicon. “By simplifying and accelerating the creation, packaging, and distribution of this content, we help our customers gain a competitive advantage in providing timely content across virtually every outlet and viewing device.”

“Rapid time to market is a critical element in the effective monetization of broadcast assets, and this is one key benefit that makes our joint solution so valuable in today’s media marketplace,” said Ron Yekutiel, Kaltura chairman and CEO. “The speed, efficiency, and simplicity of this approach, along with its reliance on approved, compliant broadcast content, make it an exceptionally powerful solution.”

The Observer Media Intelligence Platform records aired broadcast content 24/7/365 and provides instant access to this content through an easy-to-use Web-based interface. Working at a desktop, multiple users can rapidly locate, stream, and review content immediately following its broadcast; quickly mark “in” and “out” points; and extract and deliver H.264-encoded 1080i clips along with metadata — including caption information — to the Kaltura system.

Using the high-resolution clip provided by the Volicon system, the Kaltura platform creates versions of the video in all “flavors” necessary for the target distribution platforms and viewing devices. Captioning is incorporated into each version to ensure that content meets the accessibility requirements of the FCC’s Twenty-First Century Communications and Video Accessibility Act (CVAA).

Eliminating the need to take broadcast content back into a costly edit suite for editing, the solution from Volicon and Kaltura saves broadcasters both time and money. Because the aired content captured by the Observer Media Intelligence Platform has been fully produced and approved, users of all technical skill levels can confidently clip and upload it to Kaltura for processing. Pre-built profiles within the Observer system not only streamline this process, but also assure that content is prepared correctly.

Learn More at: http://www.volicon.com/solutions/media-intelligence-platform

Volicon Debuts Review, Comply, and Monitor Applications for Observer® Media Intelligence Platform™ at the 2014 NAB Show

During the 2014 NAB Show, we will also feature the new Review, Comply, and Monitor applications within our Observer® Media Intelligence Platform™. While the platform itself records a station’s media from ingest to playout, as well as multiple on-air broadcasts, these three applications provide powerful tools for evaluating and validating the content of advertising, promos, and programming; assuring the adherence of audio, video, and required metadata to applicable standards and regulations; and monitoring and maintaining the integrity of content and service delivery.

“The Review, Comply, and Monitor applications within the Observer Media Intelligence Platform capitalize on proven Volicon technology to address critical concerns within the broadcast enterprise,” said Russell Wise, vice president of global sales at Volicon. “While the Review application provides convenient tools for assessing and improving the quality, substance, and timing of aired content, the Comply application supplies the combination of recorded content and metadata that is essential for effective demonstration of regulatory compliance. The Monitor application provides the robust toolset that has made our Observer systems the top choice of engineers at media facilities worldwide.”

Our Media Intelligence Platform boasts unique content-recording capabilities and an intuitive user interface that enables multiple users to stream, analyze, and review content from anywhere at any time. When equipped with the Volicon Review application, the platform gives broadcasters, networks, and playout service providers a fast and intuitive solution for reviewing on-air content, validating ad placement, and performing competitive analysis.

The Review application makes high-resolution live and historical broadcast content available locally and lower-resolution proxy versions available on any device. With immediate access to broadcast content, users working centrally and across geographically distributed sites can keep an eye on their own broadcasts, as well as those of their competitors, and the associated ratings data within a single GUI. Because the application interfaces with the playout automation system to provide as-run log data for comparison with the frame-accurate recording of the broadcast output, users can easily show advertisers what they’re getting for their money.

Our Comply application enables users to record, clip, and export their broadcasts to meet regulatory and licensing requirements. Addressing a complete array of regulations, ranging from decency to closed captioning to loudness, this scalable and highly reliable product allows users to respond quickly and unambiguously to compliance requests. Leveraging our proven compliance monitoring technology, the Comply application lays critical A/V metadata over frame-accurate video to create a clear visual affidavit of compliance.

Built on our acclaimed digital video monitoring technology, the new Monitor application allows users to monitor video quality, respond to fault reports, and use a full recording of the on-air broadcast for instant review of errors and their impact. While continuously analyzing logged content for a variety of faults such as black or static screen, loss of video or closed captions, and incorrect audio levels, this application provides flexible, individually configurable alert thresholds, with notifications delivered via email or SNMP traps.

Quality measurement thresholds may be configured per channel to optimize performance and error reporting. To further simplify network monitoring and troubleshooting, the Monitor application provides an integrated multiviewer feature that enables Media Intelligence Platform users to use their standard displays as multiviewers or record the output of a traditional multiviewer. With multiple streams presented on a network wall, users can respond immediately to any issues, instantly grabbing the suspect stream via their desktop interfaces to begin resolving the problem.

We will demonstrate the Observer Media Intelligence Platform™, running the Review, Comply, and Monitor applications, at our booth (SU7121) during the 2014 NAB Show.

Stop by and see us!

Volicon Unveils ‘Capture’ and ‘Share’ Applications for New Observer® Media Intelligence Platform™ at 2014 NAB Show

Volicon will showcase the new Capture and Share applications for the company’s Observer® Media Intelligence Platform™ at the 2014 NAB Show. Designed to help broadcasters capitalize on new opportunities to increase their ability to capture and distribute new content quickly, these applications build on the Media Intelligence Platform’s unique content-recording capabilities and intuitive user interface to speed and streamline multiplatform content creation and delivery.

“The new Observer Media Intelligence Platform makes it easy and cost-effective for broadcasters to capture media from a variety of sources and quickly produce and deliver compelling content to viewers via on-air broadcast, as well as digital and social media platforms,” said Gary Learner, chief technology officer at Volicon. “Our Capture and Share applications for this innovative platform provide powerful, intuitive tools that simplify and accelerate this process.”

Serving as a cost-effective alternative to expensive dedicated capture stations, Volicon’s new Media Intelligence Platform allows broadcasters to capture media from any source at any time, ingesting media either according to a schedule, in real time, or via 24×7 recording. Equipped with the Capture application, the platform supports a fast, simple edit workflow by enabling the user to review content as it is captured, immediately clip high-value content, and push it directly to editing and media asset management systems without time-consuming transcoding steps. Because a low-resolution proxy is created along with the True HD (1080i 25/30, 720p 50/60) recording, both local and remote Observer Media Intelligence Platform users — remote staff, partners, consultants, outside talent, and reporters in the field — can quickly and easily collaborate to access, review, and clip content to create valuable footage for distribution.

Further enabling speed and agility in publishing content to a multitude of digital platforms, Volicon’s Observer Share application within the Media Intelligence Platform allows the broadcaster to repurpose existing content quickly and efficiently and subsequently push it to digital platforms and social media sites. One-button publishing profiles facilitate rapid processing of content to meet appropriate container and encoding requirements for an array of platforms, including target websites, YouTube, Twitter, and Facebook. Observer Share also makes use of existing closed-captioning text to publish content that is compliant with government regulations.

Volicon will demonstrate the Media Intelligence Platform, equipped with both the Capture and Share applications, at its booth SU7121 during the 2014 NAB Show.

 

Proven Platform Simplifies Content Capture, Repurposing, and Publishing to Social Media

By Russell Wise, Vice President of Global Sales

Still operating in an environment that demands they do more with less, broadcasters must consider every new opportunity to create compelling content, earn higher ratings, and generate new ad revenue. To take advantage of these opportunities, they must be able to capture media efficiently, bring it into an edit workflow to create content, and deliver that content to broadcast, Web, mobile, and social media platforms.

Broadcasters gain a competitive edge when they can create timely content not only for air, but also for live streaming via the Internet. The content may be a news promo or replay of a breaking news story, sports coverage such as a highlights reel, or a full replay of a live event. Whatever the subject matter, this capability gives the broadcaster a chance to leverage valuable content more fully.

Key to this task is the use of a cost-effective solution to capture, share, review, and extract content from the aired broadcast and other sources. For this reason, content repurposing has become one of the most popular applications for Volicon’s Observer® video monitoring and logging technology in the broadcast realm. This is true in part because staff members already are familiar with the Observer’s operation, but also because the system allows frame-accurate content to be reviewed very quickly and easily. The difference today is that many broadcasters use the system and the high-quality aired content it captures to repurpose and provide timely content to viewers through alternative viewing platforms.

Seeing the “capture and create” trend explode across the Observer user base, Volicon has unveiled a product engineered to better facilitate the capture of content and multiplatform media delivery. At the 2014 NAB Show, Volicon is showcasing the new Create product, which provides a cost-effective alternative to expensive dedicated capture stations. With this solution, broadcasters can capture media from any source at any time, ingesting media according to a schedule, in real time, or via 24×7 recording.

The Volicon product supports a fast, simple edit workflow by enabling the user to review content as it is captured, immediately clip high-value content, and push it directly to editing and MAM systems without time-consuming transcoding steps. Because a low-resolution proxy is created along with the True HD (1080i 25/30, 720p 50/60) recording, both local and remote Observer users — remote staff, partners, consultants, outside talent, and reporters in the field — can quickly and easily collaborate to access, review, and clip content to create valuable footage for distribution.

To simplify and accelerate publishing of this content, Volicon has introduced the Share product, which enables users to package and move content to Web, mobile, and social media platforms with speed and ease. One-button publishing profiles facilitate rapid processing of content to meet appropriate container and encoding requirements for an array of platforms, including target websites, YouTube, Twitter, and Facebook. Using this new application to distribute timely content to a multitude of platforms and destinations quickly, broadcasters and networks can cost-effectively extend the utility and value of their content while establishing a meaningful presence across popular digital and social media platforms.

During the 2014 NAB Show, we will demonstrate how our flagship Observer product suite has evolved to support capture and publishing, along with compliance monitoring, ad verification, competitive analysis, and quality-of-service monitoring. We will show how these products can provide the intelligence, functionality, and intuitive operation required for broadcasters to be successful in delivering content to their target audiences on virtually any platform.

Stop by our booth at the 2014 NAB Show (SU7121) for ongoing product demonstrations throughout the show.

 

 

Volicon Products and Applications at the 2014 NAB Show

STOP BY AND SEE WHAT’S NEW AT NAB - BOOTH SU7121

At the 2014 NAB Show, Volicon will unveil five powerful new applications within the company’s Observer® Media Intelligence Platform (MIP)™, an enterprise-wide solution that records a station’s media from ingest to playout, as well as multiple on-air broadcasts. In addition to enabling multiple users to stream, analyze, and review content from anywhere at any time, the platform supports a range of applications including compliance, quality assurance, competitive analysis, production, and repurposing for multiple platforms and social media outlets. With these tools, MIP users are equipped to capitalize on new opportunities to create compelling content, raise viewer ratings, and generate new ad revenue.

NEW Capture

Today’s broadcaster must capture media from a variety of sources to produce compelling content for viewers, whether delivered via on-air broadcast or a digital platform such as Web, mobile, streaming, and OTT services. Serving as a cost-effective alternative to expensive and cumbersome capture stations, Volicon’s new Capture application allows broadcasters to capture media from any source at any time, ingesting media according to a schedule, in real time, or via 24×7 recording. The application supports a fast, simple edit workflow by enabling the user to review content as it is captured, immediately clip high-value content, and push it directly to editing and MAM systems without time-consuming transcoding steps. Because a low-resolution proxy is created along with the True HD (1080i 25/30, 720p 50/60) recording, both local and remote Observer® Media Intelligence Platform (MIP)™ users — remote staff, partners, consultants, outside talent, and reporters in the field — can quickly and easily collaborate to access, review, and clip content to create valuable footage for distribution.

NEW Share

Broadcasters today need an agile way to publish exciting and compelling content to a multitude of digital platforms including the Web and social media outlets. This typically is a cumbersome and expensive process, but Volicon’s new Share application allows the broadcaster to repurpose existing content quickly and efficiently and subsequently push it to digital platforms and social media sites. One-button publishing profiles facilitate rapid processing of content to meet appropriate container and encoding requirements for an array of platforms, including target websites, YouTube, Twitter, and Facebook. Share also makes use of existing closed captioning text to publish content that is compliant with government regulations.

NEW Review

The new Volicon Review application provides broadcasters, networks, and playout service providers with a fast and intuitive solution for reviewing on-air content, validating ad placement, and performing competitive analysis. This application facilitates rapid access to broadcast content for users working centrally and across geographically distributed sites, thus giving all key stakeholders the ability to keep an eye on their own broadcasts, as well as those of their competitors, and associated ratings data within a single GUI. Making high-resolution live and historical broadcast content available locally and lower-resolution proxy versions available on any device, the application gives users the ability to review and analyze their broadcasts at any time, from anywhere. The application interfaces with the playout automation system to provide as-run log data for comparison with the frame-accurate recording of the broadcast output, thus making it easy for users to show advertisers what they’re getting for their money.

NEW Comply

Volicon’s new Comply application enables users to record, clip, and export their broadcasts to meet regulatory and licensing requirements. Addressing a complete array of regulations, ranging from decency to closed captioning to loudness, this scalable and highly reliable application allows users to respond quickly and unambiguously to compliance requests. Leveraging Volicon’s proven compliance monitoring technology, Comply lays critical A/V metadata over frame-accurate video to create a clear visual affidavit of compliance.

NEW Monitor

Built on Volicon’s acclaimed digital video monitoring technology, the new Monitor application allows users to monitor video quality, respond to fault reports, and use a full recording of the on-air broadcast for instant review of errors and their impact. While continuously analyzing logged content for a variety of faults such as black or static screen, loss of video or closed captions, and incorrect audio levels, this application provides flexible, individually configurable alert thresholds, with notifications delivered via email or SNMP traps. Quality measurement thresholds may be configured per channel to optimize performance and error reporting. To further simplify network monitoring and troubleshooting, the application provides an integrated multiviewer feature that enables Observer® Media Intelligence Platform (MIP)™ users to use their standard displays as multiviewers or record the output of a traditional multiviewer. With multiple streams presented on a network wall, users can respond immediately to any issues, instantly grabbing the suspect stream via their desktop interfaces to begin resolving the problem.

Presentation: “How Advances in Remote Monitoring Benefit Even the Smallest Operations”

Gary Learner, chief technology officer at Volicon, has been selected to present a paper as part of the Broadcast Engineering Conference at the 2014 NAB Show. Learner’s presentation, titled “How Advances in Remote Monitoring Benefit Even the Smallest Operations,” will be held on April 8 at 5 p.m. in room S228 at the Las Vegas Convention Center.

Powerful New Applications of Monitoring Technology in News and Sports Broadcasting

By Russell Wise, Vice President of Global Sales at Volicon

Digital video monitoring and logging solutions have been adopted worldwide for compliance and monitoring applications. Users often record the transport stream to enable immediate review of aired content at the highest possible quality and then also maintain a lower-resolution version of aired content for months or even years. Feeding as-run logs from automation to the monitoring system, media companies can very quickly and easily dial back and export a clip that demonstrates compliance with loudness, closed-captioning, and decency regulations, as well as with advertising contracts. While this capability is of enormous value, broadcasters have discovered that they can leverage this technology investment in even more important applications.

A number of leading broadcasters across the globe are putting monitoring and logging systems to work to improve news operations through review of aired content, to verify ads and analyze competitors, and to extend content delivery to digital platforms spanning the Web, mobile, social media, and the fourth screen.

In news operations, the system enables fast, easy review of recent newscasts. Thus, when working with news consultants to improve the on-air product, the network can simply provide browser-based access to a proxy version of content and export a high-quality transport stream for areas requiring closer attention. Because video can be viewed remotely with the functions typical of a DVR, it is easy for the network and its consultants to work collaboratively in enriching both production and news quality. This convenient capability is used on a daily basis.

Ad review has proved to be another valuable application supported by advanced monitoring and logging systems. The network’s stations often field questions about whether or not ads were delivered as required. With the ability to locate and extract clips from a browser-based GUI, authorized users at the desktop can offer proof of performance and more, quickly compiling and delivering to the client a sequence of clips that showcases a full ad campaign. The result is a powerful ad sales tool — particularly at the local level — that helps to persuade clients that their past, current, and future investment with the station is worthwhile.

Finally, the network is using its monitoring and logging system daily to drive content to other digital platforms, including the Web, mobile, social media, and the fourth screen. For example, one leading TV station uses its logging system to repurpose content for viewing in taxi cabs. Again using as-run logs from automation, the network can clip and export segments to which it has rights and publish them to the fourth-screen system, complete with accurate captions.

The same technology has been implemented by a leading network’s news website to facilitate rapid content creation and sharing. Clips extracted from logged content are encoded at a high bit rate (720p at 6 Mbps) and delivered, complete with SMPTE-TT-compliant closed captions, to the CDN for streaming to website visitors. Offering exceptionally fast access to aired content, as well as the captioning now required for Internet-delivered content, the monitoring and logging system supports timely mobile and Web publishing — within as few as seven seconds of the actual air time.

As staff members create the companion stories for the website, they can review a proxy version of the broadcast to write copy, check quotes, and identify still images for posting along with it. The solution also offers a convenient means of sharing content on social media sites such as YouTube, Twitter, and Facebook. Unencumbered by the need for clients or by complex access requirements, this approach provides a lightweight, cost-effective, and highly portable solution for cross-platform publishing.

Volicon’s Observer platform, already widely deployed in broadcast facilities around the world, offers a uniquely fast, simple, and economical solution for all of these applications. Combining media intelligence with video monitoring and logging technology, the solution provides functionality that is appropriate not only for engineering and operations, but also for production and promotions, new media, news, sales and traffic, media relations, and executive and legal departments. Observer users thus can leverage the video, audio, and data continuously being captured for compliance purposes and employ those resources in critical applications including content archiving, content repurposing for new media outlets, competitive analysis, producer and talent evaluation, media sales, and executive review of content.

Cost-Effective Monitoring With Enterprise-Grade Place-Shifting and Time-Shifting

By Andrew Sachs – VP Product Management

Because video operators are large and geographically diverse and deliver complex services, their ability to check QoE and view their services across the entire footprint is critical. Place-shifting is such a useful capability within operations that consumer-grade appliances, until recently the only devices to offer this functionality, have been deployed widely by networks and multichannel video programming distributors (MVPDs). What was intended as a consumer place-shifting device became an important part of their operations, for better and worse.

Consumer place-shifting devices were introduced in 2004 as a means of watching subscribed content when the consumer was not at home. Using a small device located in the home, a single user could stream video over the Internet. By limiting the technology to a single user, the consumer electronics (CE) industry allayed the concerns of content licensors that the technology would lead to widespread violations of the time- and place-specific licenses granted to the broadcaster or MVPD. This, combined with the relatively low penetration of Slingbox, allowed the company to avoid the litigation crosshairs and grow, albeit slowly. For networks and MVPDs, however, place-shifting devices flourished for the operational reasons described above.

While the benefits were clear, the deployment of a consumer product in an enterprise setting presented a number of different challenges. The first of these challenges was maintaining the system. Because the device was designed for a single user, with a single username and password, video service providers had no way to control or limit logins — who logged in, from where, when, and for what purpose — and no means of controlling what happened to the devices. Without a central management system, the provider was forced to log in to each device separately, and only upon login was there any indication of whether or not the device was actually functioning.

Reliability issues also plagued MVPDs that took this approach. With any piece of equipment engineered for consumer use, there is a compromise between cost and reliability, and this became a significant problem with consumer-grade place-shifting devices. Issues caused by technical hiccups such as set-top reboots also threatened continuous operation. When these devices were deployed remotely, accessing and fixing them was an expensive proposition, as many offices were dark, unmanned facilities. Whether repairing or replacing, each additional visit made this cheap solution expensive.

Many of the costs associated with this approach were hidden, but considered cumulatively, they added up, not only in actual purchasing and maintenance costs but also in the price paid for being unable to troubleshoot issues efficiently. Given other shortcomings with respect to limited access, control, management, and functionality, operators began looking at professional-grade alternatives that could facilitate both place- and time-shifting.

Professional Place- and Time-Shifting

To realize more flexible and reliable remote viewing and monitoring capabilities, MVPDs have turned to hardware-based single-channel systems built on components — the RAM, solid-state drive, processor, motherboard, and power supply — designed to be incorporated into an enterprise-grade device. Equipped to handle all common set-top box interfaces, this type of solution not only controls the set-top box channel setting and power through IR or IP link, but also provides from three to seven days of storage. With these features, the enterprise-grade solution overcomes the limitations of the consumer-grade device and far exceeds its performance.

Deployed as part of a larger system, the remotely positioned hardware unit can be centrally managed along with other tools used for monitoring and troubleshooting. Logging in just once through a central server, the user can view simultaneous streams from units in the field. Because this type of system can be deployed on a VPN rather than a less-secure Internet link, enterprises can eliminate concerns over access and security. Active Directory integration enables enterprises not only to configure user groups and access privileges, but also to maintain a log of all use and activity on each remote unit. When used for both monitoring and troubleshooting, individual boxes can be set to deliver notifications — via SNMP, email, or within the viewing interface — about issues with a particular channel.

Use Cases: Interactive Troubleshooting, Proactive Monitoring

To perform manual interactive troubleshooting within the browser-based system interface, the user can simply open up the live stream from a specific unit and pull up the virtual remote control, on which control functions are mapped much as they are on a common handheld remote. Along with these controls are buttons that may be configured to trigger commonly used command combinations.

In addition to providing control over the live stream, the virtual remote control gives the user access to recorded content. This combination of place-shifting and time-shifting capabilities, enabled by built-in solid-state storage, offers significant benefits to the enterprise. If, for example, the provider performed a test the day before at a specific time, a staff member working at the desktop interface can use the virtual remote to dial back the clock and watch the live broadcast from that time.

With the remote, the user can fast-forward, pause, and rewind video. For applications such as troubleshooting ad insertion or determining the timing of specific ads, the user can employ an integrated frame counter to move forward and backward at the frame level. This tool allows users to get down to the point of counting black frames between ad insertion events. A tool for marking in and out points not only allows the user to define a problem area, but also to create and export clips for further review.
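
The arithmetic behind such a frame counter is straightforward; the short sketch below, offered only as an illustration, converts non-drop-frame HH:MM:SS:FF timecodes to absolute frame indices and reports the number of frames between two hypothetical ad-insertion events.

```python
# Minimal sketch: convert non-drop-frame timecodes to frame indices and count
# the frames falling between two events. Timecodes and frame rate are illustrative.
def tc_to_frames(tc, fps=30):
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return (hh * 3600 + mm * 60 + ss) * fps + ff

splice_out = tc_to_frames("14:32:10:05")
splice_in = tc_to_frames("14:32:10:17")
print("frames between events:", splice_in - splice_out - 1)
```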

Key data, such as time code or loudness measurements, can be burned into the video frame-accurately to provide a more comprehensive picture of the problem. The resulting file, which can be played on the desktop or emailed for further review, helps to speed the troubleshooting and resolution process. Because streamed video is full-frame-rate video, the recipient gets all the detail necessary for effective visual evaluation.

Because the user can watch live or recorded streams from multiple boxes at once, he or she can evaluate multiple synchronized streams of previously aired content to determine if a problem affecting one channel in fact affected others, as well. Likewise, the user can monitor a single channel that is distributed across multiple geographic areas. In this way, identification of issues introduced at the local level becomes much simpler.

When manual mode isn’t required for close evaluation and/or troubleshooting, proactive monitoring provides continual scanning of live broadcasts. The system dials each channel in the lineup, spending a few seconds on each to check for issues such as static or black screen. The rotation across channels can include linear channels, as well as on-demand and interactive services.
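
The per-channel checks made during each dwell can be approximated with simple frame statistics. The Python sketch below is an illustration rather than the system's actual algorithm: it flags a black screen when mean luma falls below a threshold and a static screen when consecutive frames barely differ, assuming frames arrive as 8-bit luma arrays from a decoder.

```python
# Minimal sketch of black-screen and static-screen detection on decoded luma frames.
import numpy as np

def is_black(frame, luma_threshold=16):
    # Very low average luma suggests a black screen.
    return frame.mean() < luma_threshold

def is_static(prev_frame, frame, diff_threshold=1.0):
    # Negligible frame-to-frame difference suggests a frozen (static) picture.
    return np.abs(frame.astype(int) - prev_frame.astype(int)).mean() < diff_threshold

# Illustrative use with synthetic frames:
prev = np.zeros((1080, 1920), dtype=np.uint8)
cur = np.zeros((1080, 1920), dtype=np.uint8)
print("black:", is_black(cur), "static:", is_static(prev, cur))
```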

To enable monitoring of interactive services, the system dials into menus and navigates applications in the same way the home viewer might use the remote control. In this manner, the service provider can proactively and automatically test provisioning, the deployment of software on the set-top boxes, the availability of interactive applications or video-on-demand, and other interactive elements.

Advanced Use Cases

The enterprise-grade place- and time-shifting system can serve as the foundation for numerous advanced use cases, thus providing value beyond interactive troubleshooting and proactive monitoring.

Scheduled recording allows the user to record a particular channel at a particular date and time. In this case, the system scans the channel lineup until that event, stays on the specified channel to record the event in full, and then returns to scanning the full lineup. This is a very useful tool for looking at high-value events such as sports events or presidential debates. It also makes it easy for the provider to perform the 24 hours of scheduled recording now required for a spot check procedure.

The spot check is a new requirement for loudness compliance in the United States, and it tasks cable operators with performing periodic checks on uncertified channels to verify loudness. The test itself specifies a 24-hour period of loudness measurement. Leveraging the recording capabilities of the enterprise-grade place- and time-shifting system, as well as optional loudness measurement capabilities, the operator can very simply fulfill this requirement. Across the 24-hour recording, the user can navigate to particular points, look at loudness levels, determine if there is a violation, and see if a commercial spot is the content responsible for that violation. If a violation is identified, the user can create and export a clip that not only offers audio and video, but also the loudness measurements associated with the content. As loudness requirements continue to evolve, system software updates can ensure continued support for the latest loudness standards and measurement techniques.
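
Conceptually, the spot-check review amounts to scanning a day's worth of per-asset loudness measurements against the target. The sketch below assumes the ATSC A/85 target of -24 LKFS and an illustrative +/-2 dB tolerance; the record format and event names are hypothetical.

```python
# Minimal sketch: flag loudness measurements that fall outside an assumed tolerance
# around the ATSC A/85 target of -24 LKFS.
TARGET_LKFS = -24.0
TOLERANCE_DB = 2.0   # illustrative tolerance, not a regulatory value

def find_violations(measurements):
    return [m for m in measurements if abs(m["lkfs"] - TARGET_LKFS) > TOLERANCE_DB]

# Hypothetical as-run records paired with measured loudness values.
measurements = [
    {"event": "PROMO-0412", "start": "02:14:30:00", "lkfs": -23.6},
    {"event": "SPOT-78821", "start": "02:15:00:00", "lkfs": -19.8},   # too loud
]
for v in find_violations(measurements):
    print(f"{v['event']} at {v['start']}: {v['lkfs']} LKFS outside "
          f"{TARGET_LKFS} +/- {TOLERANCE_DB} dB")
```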

When an API supports the running of scripts on the unit, a variety of functions can be triggered over IP, with SNMP, or via HTTP messages. As a result, the enterprise can integrate the monitoring system with other systems to achieve even greater visibility into the video delivered to customers. Integration with transport stream analyzers enables the user to look at the visual record of errors; integration with ad servers supports the logging of one or more specific ad insertion events; and integration with emergency alert or caller ID systems could be used to verify those applications.
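
An HTTP-triggered script might be invoked along the lines of the following sketch; the endpoint, payload fields, and response format are hypothetical placeholders, since an actual unit would define its own API.

```python
# Minimal sketch: trigger a recording script on a remote unit over HTTP.
# The URL and parameters are hypothetical, for illustration only.
import json
import urllib.request

def trigger_recording(unit_host, channel, duration_s):
    payload = json.dumps({"action": "record", "channel": channel,
                          "duration_s": duration_s}).encode("utf-8")
    req = urllib.request.Request(
        f"http://{unit_host}/api/scripts/run",       # hypothetical endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)

print(trigger_recording("unit-42.example.net", channel=702, duration_s=120))
```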

Summary

Until the introduction of an enterprise-grade place- and time-shifting solution, operators were forced to deploy consumer-grade remote video troubleshooting solutions that left them limited not only by stringent user and network restrictions, but also by a lack of features, reliability, and management capabilities. Today, however, flexible time-shifting and place-shifting technology is available in low-cost, professional-grade alternatives that enable operators to reduce the time and cost associated with chronic network troubleshooting. Going beyond standard monitoring, this technology can be leveraged to support a variety of other valuable applications, such as loudness monitoring for compliance, spot checks, ad monitoring, and interactive application testing.


Volicon Hosting Three Webinars Addressing Monitoring in Evolving Delivery and Regulatory Environment

Volicon will host three webinars in the coming month that target critical topics in A/V monitoring. Led by Andrew Sachs, vice president of product management at Volicon, each one-hour session offers participants expert insight into challenges and solutions for monitoring OTT/Web content; establishing flexible, highly automated monitoring and ad verification; and maintaining compliance with the latest recommended practices for loudness monitoring.

“Ensuring OTT and Web Streaming Services: How to Compete in a New Nonlinear World” is scheduled for Thursday, May 30, at 11 a.m. EDT. Topics will include:

  • The evolution and delivery of A/V content
  • The challenges of OTT monitoring, content streaming, and handling authenticated content
  • Advertising, encoding, delivery mechanisms, and target devices across OTT services
  • Regulatory shifts applying to OTT streamed content

Learn more or register at https://www1.gotomeeting.com/register/731520432

“Throwing Away the Sling: Enterprise Replacement and a Whole Lot More” will be held Thursday, June 6, at 11 a.m. EDT. The presentation will include a discussion of how Volicon provides an affordable time-shifting, place-shifting, and continuous-monitoring solution for enterprise-wide use. The webinar will also touch on:

  • Automating testing rather than just doing it manually by tweaking the remote
  • Uniting continuous monitoring and troubleshooting on one platform
  • Improving the workflow for validating local and targeted advertising insertion
  • Validating STB configuration and channel plans
  • Leveraging additional platform features such as user management and multiuser access

Learn more or register at https://www1.gotomeeting.com/register/888563977

“Assuring Compliance With the Revised ATSC A/85 and the SCTE 197 Spot Check Loudness RPs” will take place on Thursday, June 20, at 11 a.m. EDT. The webinar will address key challenges presented by evolving loudness standards and conclude with a Q&A. Topics will include:

  • Key changes in the ATSC A/85 Recommended Practice, including 5.1 downmix and BS.1770-3 measurement
  • How to perform simultaneous 5.1 and 5.1 downmix measurements, and how to select the new BS.1770-3 standard
  • SCTE’s Spot Check Recommended Practice, Spot Check applicability, and best practices for execution
  • How to integrate with the programmer’s automation as-run log to measure every asset precisely for the Spot Check requirement
  • How to provide unambiguous proof of (non)compliance
  • Hierarchical responsibilities in the broadcast chain
  • Best practices for ensuring compliance in specific regions and markets

Learn more or register at https://www1.gotomeeting.com/register/128265817