Paving a Smooth Road to Multiplatform Content Delivery

Broadcasters today face a complex mix of challenges and new opportunities. While current economic conditions continue to pressure broadcasters to do more with less, resulting in the consolidation of operations and the reduction of human and technical resources, consumer demand for multiplatform media has escalated, presenting new possibilities for leveraging content already being created for air.

To strengthen brand awareness and even generate additional revenues from aired content, broadcasters must establish a fast and cost-effective model for capturing the content they air; altering that content for consumption on computers, tablets, and smartphones; and delivering the resulting output to the appropriate outlet or platform in the proper format. Addressing all of these requirements, Volicon has enhanced its digital video monitoring and logging technology to provide a flexible foundation for quickly building successful and competitive multiplatform content distribution services.

Volicon’s new Observer® Media Intelligence Platform (MIP)™, along with its five applications (Capture, Share, Review, Monitor, and Comply), is an example of how powerful the unique content-recording capabilities and highly refined user interfaces of advanced video monitoring and logging systems can be in supporting multiplatform services. Volicon’s Observer technology, which already supports thousands of broadcast channels worldwide, has been tailored within the new Media Intelligence Platform to accelerate the creation and delivery of quality content across multiple industry platforms.

A series of complementary applications lends versatility to the platform and its deployment. Users can customize MIP to their business and technical needs, taking advantage of the platform in one or many different applications — from compliance, quality assurance, and competitive analysis to production and repurposing for digital and social media outlets.

Within the repurposing workflow enabled by this platform, broadcasters can now capture media from a variety of sources and quickly produce and deliver compelling content to viewers who watch on-air broadcast service or access video via Web, mobile, and OTT services. At the front end of this workflow, the Capture application for MIP makes it easy to capture media from a variety of sources and rapidly bring it directly into the media asset management (MAM) system and editing workflow.

With flexible media ingest options, MIP with the Capture application offers a robust yet cost-effective alternative to expensive dedicated capture stations. Low-resolution proxy versions of content are created along with the True HD (1080i 25/30, 720p 50/60) recording, so authorized users can access, review, and edit content at any time, from anywhere, and quickly create content with quality suitable for virtually any outlet.

The Share application within MIP allows the broadcaster to streamline and accelerate the process of publishing content to a diverse array of digital platforms, supporting a wide variety of container and encoding requirements. One-button publishing profiles ensure that content is processed appropriately to meet these requirements, making it easy for broadcasters to target content for websites, YouTube, Twitter, and Facebook. The Share application can also incorporate the closed-captioning text that accompanies aired content into the multiplatform output, so that compliance with accessibility regulations is maintained. Guaranteeing compatibility not only with the requirements of key outlets but also with those of government regulations, the platform allows desktop users to create and distribute multiplatform content with speed and confidence.

For other valuable use cases such as on-air content review, ad placement verification, and competitive analysis, Volicon offers the Review application. Equipped with the Review application, MIP makes live and historical broadcast content and ratings data available to users in the high- or low-resolution format appropriate to their viewing device and network connection capabilities. With such flexible access to media, broadcasters and their partners can conveniently review and analyze their broadcasts, and those of competitors.

Drawing on the robust monitoring capabilities developed over years for the Observer product line, the Monitor application for MIP supports video quality monitoring, provides fault reports and automated alarms, and supplies the full recording of the on-air broadcast for instant review of errors and their impact. Volicon has newly incorporated a Multiviewer feature with recording capability into the application to allow viewing of multiple channels side by side on a standard display. If a suspect stream is detected, users can immediately pull up that stream – live or recorded – on the desktop interface for further evaluation.

Finally, Volicon offers the Comply application, which enables broadcasters to record, clip, and export their broadcast outputs and, in turn, provide an unambiguous visual affidavit of compliance with regulatory and licensing requirements including decency, closed captioning, and loudness.

Yielding the many valuable capabilities developed by Volicon in response to industry demand — and in conjunction with its customers within the television broadcast industry — the Observer Media Intelligence Platform supports an end-to-end workflow that can give the user a competitive edge in publishing content to its own website and services, as well as to other popular websites and platforms.

TV Tech: Audio Loudness Still a Hot Topic

Customers continue to have questions

NEW YORK—Time flies. It’s been nearly four years since Congress passed H.R. 1084/S. 2847, commonly referred to as the CALM Act. International legislation requiring broadcasters to limit the variance in level between programming and advertisements has also been established. Has loudness monitoring reached full maturity, or can we expect refinements to both the laws and the technology that ensures broadcasters remain in compliance with them?

TV Technology asked a number of major industry players for their views on this subject, including Chris Shaw, executive vice president of sales and marketing for Cobalt Digital; Andrew Sachs, vice president of product management at Volicon; Martin Winsemius, sustaining engineering manager at Wohler; Peter Pörs of Jünger; and Tim Carroll, chief technology officer for the Telos Alliance.

TV Technology: Was loudness monitoring a hot topic at the 2014 NAB show?


Chris Shaw: Most U.S. broadcasters believe they provide CALM-compliant programming, but that’s debatable. Beyond a number of broadcasting groups adding loudness processing cards (Cobalt’s 9085, 9985, and options for cards already in their possession) into their openGear frames, the initial impetus has slowed.

This is certainly not the case in many other areas of the world. We continually receive requests for both broadband and loudness-over-IP solutions. At this time, Latin America, South Korea, Southeast Asia, and India are growth markets for loudness processing. Broadcasters in these regions understand that they have to be compliant when transmitting programming to major markets such as the U.S. and Europe. We expect interest from these markets during the upcoming IBC Show.

Peter Pörs: Loudness is still a hot topic; a lot of misunderstanding remains, and a lot of misalignment. Make it pleasant for the ears and you will see: you are almost compliant with the recommendations!

Andrew Sachs: There is still confusion. Between measuring the full mix or the downmix, changes to the referenced BS-1770 version, and content that is technically compliant but still contains objectively loud commercials, we are not done with the task.

Martin Winsemius: My general impressions were that loudness is more understood now, and accepted as a necessary evil by our customers, a problem to be dealt with rather than avoided. For loud commercials in the U.S., and everything elsewhere, pushback to the provider is common practice now. It should have been all along, but now the agency enforcement threat outweighs business pressures.

TV Technology: What are the most common misperceptions regarding loudness monitoring?


Tim Carroll: One of the most common misconceptions about loudness metering is that the loudness value should never vary from the target. The only way to accomplish that is to remove the dynamic range (i.e., the life) from a mix, or to accidentally broadcast test tones. Neither is a career-expanding move.

Another misconception is that a loudness meter is intended to replace all other metering; it is not. A loudness meter is like a speedometer in a car: you glance at it as needed, but mostly you focus on the road. Similarly, live loudness metering is integrated over a period of time, often 10 seconds; it is useless for catching sudden changes but very useful for maintaining a comfortable balance.
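Carroll’s point about time integration can be sketched in a few lines. The following is an illustrative simplification, not any vendor’s actual meter: it averages per-block signal power over a sliding window (25 blocks of 400 ms, approximating the 10-second figure above) and converts the result to LKFS, with no gating applied.

```python
import math
from collections import deque

def sliding_loudness(block_powers, window=25):
    """Yield a windowed loudness reading for each new 400 ms block.

    block_powers: mean-square power of each block, assumed already
    K-weighted and channel-summed. Averaging happens in the power
    domain, then converts to LKFS; a sudden one-block change barely
    moves the reading, which is exactly the smoothing described above.
    """
    buf = deque(maxlen=window)
    for p in block_powers:
        buf.append(p)
        yield -0.691 + 10 * math.log10(sum(buf) / len(buf))
```

Feeding a constant-level signal produces a flat reading at that level; a short spike is diluted across the whole window.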


Andrew Sachs: One common misperception is that it is desirable to be alerted the instant an output is found to be too loud. The reality is that the ‘too high loudness’ of specific assets is usually an audio production chain problem that is best fixed with updated processes.

Another common misconception is that being in compliance will translate to an absence of ‘loud’ commercials. In fact, the different loudness yardsticks used for content (speech) and commercials (full mix), combined with the ability to ‘deliver content low,’ enable fully compliant 5-6 dB jumps to occur at the transition from program to commercial.
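The arithmetic behind such a compliant jump is simple; the numbers below are purely illustrative, not taken from any actual measurement or regulation:

```python
# A program whose dialogue ('speech-gated') loudness hits the target may
# carry a full mix that is several LU quieter, while a commercial is
# measured on its full mix and can sit exactly at the target.
program_full_mix_lkfs = -29.0     # program 'delivered low' overall
commercial_full_mix_lkfs = -24.0  # commercial exactly at target

jump_lu = commercial_full_mix_lkfs - program_full_mix_lkfs
assert jump_lu == 5.0  # a fully compliant 5 LU jump at the transition
```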

Chris Shaw: There remains a misunderstanding about the fact that loudness is measured (averaged) over the length of a program or session, whether a commercial or a complete production. In actuality, loudness can be above or below the LKFS (LUFS) limit at any given moment, but it is averaged over the complete session. The term ‘loudness’ refers not only to the highest volume but also to the quieter segments of a session.


Peter Pörs: People are mixing up numbers from the peak world with loudness numbers. Both values are represented the same way, and that makes it difficult for some people to clearly identify which is which.

TV Technology: Is it possible that Congress, or international bodies, will further refine the regulations that govern loudness?

Andrew Sachs: It’s unlikely the algorithm (BS-1770-3) used for loudness measurement or the CALM Act itself will change significantly, but the application regarding gating (level or speech) and 5.1 tracks (downmix or full mix) specified in the ATSC A/85 RP is still under some flux.

Martin Winsemius: Taking into consideration a variety of factors—amplitude, frequency, and time for all channels within a program—the ITU BS.1770 LKFS (loudness K-weighted relative to full scale) standard is designed to yield an accurate loudness number that can be used effectively in content creation and content monitoring applications. The standard has since been refined, with the latest release being ITU BS.1770-3.
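The two-stage gated averaging that BS.1770-3 adds on top of the base measurement can be sketched roughly as follows. This is a simplified illustration that assumes the K-weighting filter and channel-weighted summation have already been applied, leaving one mean-square power value per 400 ms block:

```python
import math

def block_loudness(power):
    # BS.1770 loudness of one block, from its K-weighted mean-square power
    return -0.691 + 10 * math.log10(power)

def integrated_loudness(block_powers):
    """Two-stage gated average in the spirit of ITU-R BS.1770-3.

    Stage 1 drops silent blocks (below -70 LKFS); stage 2 drops blocks
    more than 10 LU below the stage-1 average, so long quiet stretches
    don't drag the program measurement down.
    """
    abs_gated = [p for p in block_powers if block_loudness(p) > -70.0]
    if not abs_gated:
        return float("-inf")
    rel_threshold = block_loudness(sum(abs_gated) / len(abs_gated)) - 10.0
    rel_gated = [p for p in abs_gated if block_loudness(p) > rel_threshold]
    if not rel_gated:
        return float("-inf")
    return block_loudness(sum(rel_gated) / len(rel_gated))
```

A steady -24 LKFS signal measures -24 LKFS whether or not silent blocks are appended, since the absolute gate removes them.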

Peter Pörs: I don’t think another kind of regulation can improve the scenario. More important is that everyone accept the new way of creating audio programs and that their “artist ego” be focused on audio quality rather than maximum loudness. If everybody trusts his own natural, untrained listening impression, it should become easier to stay compliant with loudness recommendations or regulatory requirements.

TV Technology: What distinguishes your company’s product line from the competition?

Tim Carroll: Linear Acoustic and the entire Telos Alliance will continue to innovate products and technologies to support audio for broadcast, no matter how content is created, delivered, and consumed. Our unique value add is our people: we are proud to be part of every dimension.

Martin Winsemius: Hardware loudness metering/monitoring can take many forms, either standalone or integrated with other necessary audio functions; the latter is the strong suit of Wohler’s AMP2-16V audio and video monitoring unit. The AMP2-16V combines VU, PPM, and loudness metering configurable to popular or custom scales with Wohler’s renowned acoustic performance, plus a wealth of I/O conversion, routing, mixing, de-embedding, and re-embedding functions at no extra charge.

Chris Shaw: From the initial days of the CALM Act and the need for loudness processing, we have worked closely with Linear Acoustic, using the AeroMax algorithms, Cobalt’s preferred choice and, we believe, the best solution on the market. We have also listened to our customers, many of whom had major concerns regarding compliance. The combination has enabled us to provide loudness processing products to meet all budgetary and technical needs.

The Cobalt 9000 series COMPASS cards offer single-path processing for HD/SD-SDI requirements. The 9900 series of FUSION3G cards provides multiple processing paths on a single card. A selection of processing options, including multiple stereo, 5.1 audio, and upmixing (Linear Acoustic upMax), is available on these cards to meet end users’ needs. Cobalt’s LMNTS provides multichannel loudness processing over IP/ASI, offering end users both financial savings and rack-space economy.

Peter Pörs: All Jünger solutions are real-time algorithms. All of our circuits are wideband, with no unwanted side effects: no pumping, no breathing, no distortion, no coloration.

Andrew Sachs: Volicon is unique in combining real-time monitoring, streaming A/V with a variety of measurements, asset-specific program loudness compliance reporting, affidavit production with burned-in measurements, and both full mix and downmix measurements. Together, these capabilities make the Observer Media Intelligence Platform the most complete, easy-to-use monitoring and compliance solution.

See more at: http://www.tvtechnology.com/article/audio-loudness-still-a-hot-topic/271595

Volicon and Grass Valley Partner to Accelerate and Streamline Collaborative Content Repurposing Workflow

Volicon Observer® Capture and Share Applications and Grass Valley EDIUS® Video Editing Software Support New Highly Efficient Workflow for Timely Content Creation and Delivery

Volicon’s partnership with Grass Valley enables the companies’ customers to realize a rapid repurposing workflow in which high-value content is quickly and easily captured, clipped, and edited for delivery to any platform. The Capture and Share applications, part of Volicon’s Observer® Media Intelligence Platform™, and Grass Valley EDIUS® video editing software together enable fast, efficient collaborative production of compelling content for TV, Web, mobile, and social media platforms.

Today’s competitive and economic factors make it critical that content providers establish an efficient and cost-effective way to leverage all available media sources to enhance their service offerings, whether content is broadcast or delivered over the Internet. The combination of our Capture and Share applications with the EDIUS editing solution from Grass Valley provides an accelerated production pipeline to support the rapid creation and delivery of compelling content.

The Capture application facilitates the continuous real-time capture of high-quality (up to 720p) content from any source — cable set-top boxes, studios, and live camera feeds — at any time and makes it immediately available to users across the enterprise. The Capture application pairs HD H.264 encodes with proxy versions so that both local and remote users, such as remote staff, partners, consultants, outside talent, and reporters in the field, can collaborate seamlessly to review live or previously captured content, clip high-value segments, and push them directly into Grass Valley’s EDIUS video editing software.

Because the Capture application captures a composite feed that marries closed caption data to both the high- and low-resolution versions of the content, users can perform complex searches of all captured content. Capture returns a list of results, each with a representative image and a short excerpt from the closed captioning. The footage is immediately viewable, or can be frame-accurately sub-clipped and made available to the MAM system for rebroadcast or to the Grass Valley editing system for repurposing.
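The kind of time-aligned caption search described above can be illustrated with a toy index. The data model here (timestamped caption lines, substring matching, a short excerpt per hit) is an assumption for illustration, not Volicon’s actual implementation:

```python
def search_captions(captions, term, excerpt_chars=40):
    """Search time-aligned caption entries for a term.

    captions: list of (seconds_from_start, caption_text) tuples.
    Returns (timestamp, excerpt) pairs, mimicking a result list where
    each hit carries a short excerpt for review and sub-clipping.
    """
    results = []
    for ts, text in captions:
        pos = text.lower().find(term.lower())
        if pos >= 0:
            start = max(0, pos - excerpt_chars // 2)
            results.append((ts, text[start:start + excerpt_chars]))
    return results
```

A hit’s timestamp is what would seed a frame-accurate sub-clip in the workflow above.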

EDIUS empowers editors with superior real-time workflows in all resolutions. With no rendering and no limitations with respect to the number of audio, video, graphics, and title tracks it supports, the Grass Valley software enables a fluid and rapid creative process that yields a more engaging finished product.

To accelerate content distribution, the Share application provides single-click publishing profiles that facilitate rapid processing of content to meet appropriate container and encoding requirements for an array of platforms, including target websites, YouTube, Twitter, and Facebook. The Share application also makes use of existing closed captioning text to publish content that is compliant with government regulations.

Remote Troubleshooting for Enterprise, Cable, and IPTV Operators: Consumer Place-Shifting Devices and What Volicon Has to Offer

Transcribed from a webinar hosted by Itamar Levin, Product Manager


So, consumer place-shifting devices started coming out in the early 2000s, and what drove their adoption was the increase in internet speeds. These devices hooked up to the set-top box and allowed the owner to watch TV from anywhere with an internet connection. This was really groundbreaking technology at the time, very innovative; it set the expectation for how video could be delivered and really paved the way for OTT. Soon afterwards, organizations took notice and began deploying these devices widely across their networks, whether in their labs or at remote locations, with obvious value for troubleshooting. They increased edge visibility and helped troubleshoot interactive and local channels. They saved a lot of time and money; however, it became clear that these devices lacked reliability and manageability and carried hidden costs.

These costs emerged as early as the installation and deployment phase, as well as during device software updates, which required knowledgeable technicians to be present. Besides the device itself, there were other pieces involved that needed to be configured for things to work. Everything from LAN IP changes to reporting had to be reconfigured, all resulting in frequent downtime, downtime that is usually only detected when you need to use the device. All in all, with the amount of effort required to manage and maintain these devices, the total cost of ownership is significantly higher than first expected. In fact, there is no cap on these costs over the lifetime of the device.

Fortunately, we have a product, the RPM, that was designed and built for enterprise use. The RPM probe can connect to anywhere from one to six set-top boxes (models RPM 100 through RPM 600). Each set-top box has an IR blaster and sits on an IR-isolating shelf to prevent external interference, and as the content flows from the set-top box, it is analyzed and recorded locally on the probe. The probe stores a minimum of 7 days of recordings, but that can be configured to pretty much any preferred duration. The probe can work with multiple inputs: it can take composite or component inputs, or in more advanced cases, for example if you want to record AC-3, we can take the component signal, pass it through a converter, and input it as HDMI. We can also take HDMI directly from the set-top box.

The Scout is the most affordable RPM probe platform there is. It has a 1RU, 14-inch chassis, and it can be put in a rack or placed anywhere an organization may currently have its consumer place-shifting device, such as on a shelf or workbench. The Scout works with the same inputs as the RPM and comes with 3 days of storage by default.

RPM deployments over a large geographic area can mix and match the different RPMs that we have. We can have RPM 400s, RPM 300s, and Scouts placed throughout the system, depending on the configuration required. There’s essentially a browser-based central server that is used as a single point of access to the system and manages all the probes and configurations. It is also responsible for aggregating the various health and service messages from the probes and sending them out via SNMP to a fault management system, for example, or via email and SMS. The entire deployment is connected over a network. It can be connected over the internet, like current place-shifting devices are, or it can run over a VPN, which makes it more secure and easier to deploy and maintain.

Users connect first to the central server. They log in and see the resources available to them. From there, a user can initiate streaming directly from the probe. One user can have remote control access while another user watches what’s going on. Each user has an independent video stream, so one user can be paused while another is fast-forwarding or playing, and of course, if one user changes the channel, the other users will see that channel change. Users can also stream from multiple probes at the same time and see multiple players together at once. When the system is idle, when no users are logged in and no streaming has been initiated, no bandwidth is being used.

The RPM captures and records full-frame-rate video, whether SD or HD, and provides a virtual remote control interface that allows users to replicate the customer’s experience. On the RPM system, we can pull up the video and take remote control ownership by clicking an icon. This gives us remote control ownership and locks out other users from changing the channel at the same time. Once the player is up, we can open our remote control interface and press a button to change the channel, for example, or we can access the shortcuts menu. With the shortcuts menu, we can program long sequences of commands, for example to pull up diagnostic screens or any kind of engineering interface on your system and quickly recall them in the future. We can also emulate long presses, where you need to hold a button down for a few seconds, through this interface.
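The shortcut idea, a stored sequence of key presses with optional long holds, can be sketched as below. The `send_ir` function, the key names, and the menu path are all invented stand-ins, not the RPM’s actual API:

```python
import time

def send_ir(code, hold_seconds=0.0):
    """Stand-in for driving the probe's IR blaster (hypothetical API)."""
    print(f"IR {code}" + (f" held {hold_seconds}s" if hold_seconds else ""))

# A shortcut is a recorded sequence of (key, hold-duration) steps, e.g.
# the presses needed to reach a set-top box diagnostics menu.
DIAGNOSTICS_SHORTCUT = [("MENU", 0.0), ("9", 0.0), ("OK", 3.0)]

def run_shortcut(steps, send=send_ir, gap_seconds=0.1):
    """Replay a stored shortcut, including emulated long presses."""
    for key, hold in steps:
        send(key, hold)
        time.sleep(gap_seconds)  # small gap between key presses
```

Recalling a stored sequence is then just `run_shortcut(DIAGNOSTICS_SHORTCUT)`.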

Another feature is the ability to view multiple streams at the same time in the same interface, which allows us to quickly troubleshoot an entire area. Over a certain region, we can open all these players and look at what is playing at a certain time. Take CNN: if we hear that CNN is having issues in one city, we can open up CNN in the cities around it and see exactly where the problem lies.

Another feature that differentiates us from classic consumer place-shifting devices is the number of users that can access the system at the same time. Current consumer devices are limited to one user logged in at a time. The Observer RPM allows multiple people to troubleshoot together; people anywhere on the internet can connect to the devices to help troubleshoot a problem.

One thing about the storage: because we store all the content coming from the set-top box, we can go back and recall it, so think of it as a DVR. Users can go back into the system, request any time period from the past few days, and look at exactly what was playing. This also lets us go back in time with multiple players at the same time and see exactly what was going on then. We have a feature called Sync-To-Me: once we navigate with one player to a certain time, we can click Sync-To-Me, which synchronizes the rest of the players to that specific time. It’s like a time machine.

This configuration is probably the most affordable solution if you’re looking for a way to remotely troubleshoot multiple set-top boxes that are all in the same location. It allows us to hook up essentially an unlimited number of set-top boxes to a single RPM, the Scout for example, and through the interface we can switch between those set-top boxes.

Another thing that differentiates us from these devices is that the RPM is built for the enterprise, so when users log into our system, all the user management is done in the central server rather than per device. We have role-based permissions, and we can integrate further with your existing authentication systems to provide features like single sign-on. If you’re not integrating with your LDAP or Active Directory, we can use our own user management, editing each of the groups and giving them permissions to specifically model your company’s or organization’s hierarchy.

We also offer constant health monitoring and service monitoring. For example, if communication is lost with a probe, the central server will send out an alarm. The same goes if video is lost on the Scout: if the set-top box is disconnected, whether a technician was there or somebody knocked out the set-top box, we can detect that, send a message via SNMP or email, and notify you that the resource is no longer online. If communication is restored or the video recovers, we’ll also send a clear message.
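The alert/clear behavior described here is essentially a small state machine over up/down observations. The sketch below is a generic illustration of that pattern, not the RPM’s code; a real system would also debounce, timestamp, and dispatch via SNMP or email:

```python
def health_events(samples):
    """Turn a stream of (resource, is_up) samples into alert/clear events.

    An 'alert' fires when a resource transitions to down, and a matching
    'clear' fires when it recovers, so each outage produces exactly one
    alert/clear pair no matter how many down samples are seen.
    """
    state = {}   # resource -> last known up/down status
    events = []
    for resource, is_up in samples:
        prev = state.get(resource, True)  # assume up until seen otherwise
        if prev and not is_up:
            events.append((resource, "alert"))
        elif not prev and is_up:
            events.append((resource, "clear"))
        state[resource] = is_up
    return events
```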

Finally, I’ll talk a little bit about RPM reliability compared to place-shifting devices. The RPM is built to last; it’s professional equipment, and we’ve designed it to work 24/7. It has enterprise ECC memory, enterprise hard drives, and server-grade motherboard components, all of which extend the lifetime of the device well past any of the consumer place-shifting devices you have today.

Okay, so this is where it gets interesting. When you’re not using the devices, they’re essentially just sitting there doing nothing. What we offer is proactive scanning: the ability to go through all your channels and scan them for black screens, static screens, and various audio levels. If they cross a certain threshold that we specify, we will send out an alert. This puts your idle equipment to use, and you can find problems before your customers do. We also offer advanced interactive channel scanning through a pattern matching system. This system allows you to navigate the complex VOD menus you may have, so you can quickly access an asset or test your VOD playout. It can also be used, for example, to test the delivery of new OTT and cloud services to the set-top box.
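Black-screen scanning of the kind mentioned above usually reduces to a threshold on average brightness sustained over time. The sketch below is an illustrative assumption, not Volicon’s detector: the luma threshold of 16 (broadcast black level) and the 150-frame run (about 5 seconds at 30 fps) are invented parameters.

```python
def is_black_frame(luma_pixels, luma_threshold=16):
    """Flag a frame as 'black' if its average 8-bit luma is below a
    threshold. luma_pixels: iterable of luma samples for one frame."""
    return sum(luma_pixels) / len(luma_pixels) < luma_threshold

def scan_for_black(frames, min_consecutive=150):
    """Return True once `min_consecutive` black frames occur in a row,
    the point at which a probe might raise an alert; brief fades to
    black between programs stay below the run length and are ignored."""
    run = 0
    for frame in frames:
        run = run + 1 if is_black_frame(frame) else 0
        if run >= min_consecutive:
            return True
    return False
```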

Another monitoring feature is the ability to automatically determine whether an issue is isolated to a probe, a group, or the entire system. If we see the same problem on a channel or across an area, we can increase the priority of the alert to let you know the issue is not isolated to one location. And because we have this DVR capability and record everything coming out of the set-top box, we have a record of every service visit and every fault that happened. You’re not just getting an alert saying, “Oh, okay, we had a static screen.” You get an image of it, and you can even log in and view the actual video as it happened. For example, users can view existing open issues and recoveries over time: if a service goes out at night but comes back, we’ll recover and send a recovery message for it.

Another one of our features is the ability to do loudness monitoring and loudness measurements. The strength of this is that we have both the content and the measurements. We’re able to do seven short-form measurements and seven long-form measurements at the same time, which makes it almost ridiculously easy to do a spot check. Over a 24-hour period, we can quickly find spikes where the output is potentially out of compliance, then zoom in on each spike to determine whether it was a commercial or regular content. You can do a full 24-hour spot check in a matter of minutes using this method.
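The spot-check step of finding spikes in a day of loudness readings can be sketched as a simple filter. The -24 LKFS target and 2 LU tolerance below are illustrative placeholders, not regulatory text or product defaults:

```python
def find_loudness_spikes(measurements, target_lkfs=-24.0, tolerance_lu=2.0):
    """Flag measurement points that exceed the target by more than the
    allowed tolerance.

    measurements: list of (timestamp_seconds, lkfs) pairs, e.g. one
    long-form reading per minute over 24 hours. Each flagged timestamp
    is a candidate for zooming in to see whether it was a commercial.
    """
    limit = target_lkfs + tolerance_lu
    return [(t, l) for t, l in measurements if l > limit]
```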

A few last things. I’d also like to discuss triggering. Triggering allows us to accept external triggers, that is, external events, and react to them. We have an API that can accept SNMP and HTTP, which lets us react to external events from, for example, an off-the-shelf transport stream analyzer.
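An HTTP trigger endpoint of this kind can be sketched with Python’s standard library. The path, JSON payload shape, and 204 response are illustrative assumptions, not Volicon’s actual API:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class TriggerHandler(BaseHTTPRequestHandler):
    """Minimal sketch: an external device (e.g. a transport-stream
    analyzer) POSTs a JSON event, and the handler records it for the
    monitoring system to react to (start a clip, raise an alert)."""
    events = []  # shared list of received trigger events

    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        TriggerHandler.events.append(json.loads(body))
        self.send_response(204)  # accepted, no response body
        self.end_headers()

    def log_message(self, *args):  # silence default request logging
        pass

def serve(port=8080):
    HTTPServer(("127.0.0.1", port), TriggerHandler).serve_forever()
```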

The RPM is a software platform, which means we’re inherently flexible in making changes to the system. The company has been around for a long time, 10 years now, and we consistently deliver software updates and enhancements to the RPM. We have a lot of new features on the roadmap, and we’re excited to work with you and grow with your changing needs. Please contact us for more information: info@volicon.com

A Brief History of FCC Closed Captioning Regulations, Where We Are Today, and the Forthcoming Changes

Excerpted from our closed captioning webinar

There’s a new law that came out, issued by the FCC, and we are getting a lot of questions from our customers: network stations, cable, satellite, and IPTV operators. We decided to do a quick informational session about that law to help you understand what it is and how it could affect your organization.

I’m going to go over a little of the history of closed captioning regulations. When did it start? When you think about CC quality, it’s really the outcome of something that started way back in 1976, and we’ll look at some of the steps along the way. Why closed captioning quality regulations now? Why not 5 years ago, or 10 years ago? I’ll go over that, and then get specifically into the February 2014 Report and Order.

The actual rules were amended in March, but the FCC put out the Closed Captioning Quality Order, and it is composed of a few main pieces. The Report and Order discusses the rules that go into effect in February of 2015. It includes some declaratory rulings, which are rules that take effect immediately, and some notices of proposed rulemaking, areas where the FCC is looking for comments about how to further the law and make it more practical. Finally, it discusses the rules themselves, the updates to the FCC regulations.

In 1976, the FCC and some technology members got together and established the analog closed captioning standard on Line 21 of the Vertical Blanking Interval, or VBI; this is now referred to as 608 captioning. It is relatively simple by today’s standards, but it allowed a certain bit rate of metadata to be carried inside the video, frame-accurately, and allowed for analog presentation of closed captioning. TVs larger than 13 inches had to have a 608 decoder built in, so you couldn’t manufacture a set that couldn’t decode captions. But at this point there really was no regulation, nothing other than voluntary compliance, requiring networks and programmers to actually caption their content.

That started to change in about 1997, when the FCC laid out percentages, depending on market size, for how much of your content needed to be captioned; obviously the larger markets had the greater quantities. In 2000, the ATSC closed caption standard, 708, was developed, and then in 2004 some groups petitioned the FCC for what amounted to a closed captioning quality mandate, arguing that closed captioning is supposed to be there, but if it’s not correct, it doesn’t fulfill its purpose. That effort started in 2004, and now, in February of 2015, the closed captioning quality rules become law; the effort that began in 2004 and spent 10 years in a proposed-rulemaking and modification phase is finally happening. This finishes the job that was started almost 40 years ago.

So, why now?  Certainly there are economic pressures on stations, and these push down into the captioning services, forcing captioning providers to lower their prices; to a certain extent that competitive pressure has degraded closed captioning quality. Something has to give. At the same time, closed captioning isn't just for the hard of hearing; many people use it as a form of audio correction. If something was said but wasn't loud enough and you didn't catch it, you can look at the screen and see what was actually said. It has turned out to be a tool not just for the deaf or hard of hearing but for everyone, and often, at home or out in large venues, we consume video with no audio at all. With multiple users in multiple scenarios, everyone now cares about closed captioning quality.

What happened in February of this year?  The FCC issued the Closed Captioning Quality Order, a.k.a. FCC 14-12 (if you look at the PDF, it's 14-12A1, and it has a nice long official name). We'll call it FCC 14-12, or the Closed Captioning Quality Order, for the rest of this blog post. It has four main sections. There is the Report and Order section, which contains the rules that are coming. There is a Declaratory Ruling, which contains clarifications and rules that are immediately effective. There is a Further Notice of Proposed Rulemaking. And then there are the Final Rules, which are what was actually added in March: the actual updates to the law, the text the lawyers are going to read and work from. So it's those four sections that are important. It's a big document: some of it is immediately effective, some of it takes effect in February of next year, and a large part is still up for study and will probably be clarified by the FCC, hopefully soon.

So what's in the Report and Order section of FCC 14-12? It establishes four non-technical standards; these are not objective measurements. The first is Accuracy: the closed captioning, unlike teletext, needs to reflect dialogue, sounds, music, and speaker identification. It's really for those who can't hear the audio track at all, so the captions need to provide that extra context. The second is Synchronicity: captions must be in time with the content (we'll talk a little later about live versus non-live captions and what leniency is allowed) and must be displayable at a speed that can reasonably be read. The third is Completeness: captions must be present for the complete duration of the program. The fourth is Placement: captions must be positioned so they do not obscure faces, key graphics, or key on-screen text. So if you have a news program with a banner across the bottom, that is not where you'd place the captions. When we get into the discussions in the FNPRM, there's a lot of energy around how much the new caption standard allows organizations to relocate captions or change the font or opacity, and how much leniency that flexibility gives you in complying with the placement requirement. Because these are non-technical standards, there's no objective measurement defined in the document; that is something addressed in the FNPRM.

So those are the four standards; the question is where they apply. There are slightly different standards for different programming. Full adherence to the law, with no leniency, is required for pre-recorded programming delivered more than 24 hours in advance. Anything live or near-live is given more leniency on timing and quality. The FCC didn't really define how much more, but the idea is obvious: if you're live-captioning something, you shouldn't be held to the same timing alignment as content that was pre-recorded, where the captioning could be done offline. There's a special technique used by news organizations, specifically those in smaller markets, called the Electronic Newsroom Technique (ENT), in which news scripts are turned directly into the closed captioning feed, so there is no need for live captioning services. The challenge is that ENT frequently resulted in poor quality during live events, interviews, weather, or breaking news, anything that wasn't actually scripted. So the FCC made this an exception to the rest of the Report and Order: there are more restrictions on the use of ENT as of July of this year. If you are using ENT, look at the law; there are specific sections and exemptions you can still use, but for the most part, the order restricts the use of ENT in order to improve the quality of caption services during live events.

Inside the Report and Order, the FCC holds the video programming distributors (VPDs) accountable: anybody delivering television, meaning broadcast TV stations, cable, satellite, and IPTV. It is not directly holding the networks, the video programmers, responsible. This is very similar to the loudness rules. The VPDs are also required to monitor and keep records: they have to verify periodically that their captioning equipment is actually working, keep records of those checks, and execute corrective action as they notice deficiencies. There's a lot of detail, and I'll talk about it a little later, but the document describes some very specific practices for what counts as acceptable monitoring. If you're responsible for that area, I suggest your lawyers or your engineers look at that section and make sure you are complying with those monitoring best practices.

There are some exceptions in the Report and Order. Basically, if a channel has less than $3 million in revenue, it's exempt from most of the rules. If an over-the-air station has a .1 and a .2 channel, the FCC calls that a multicast. This is not IP multicast; it means the station is putting out more than one program, which describes pretty much every TV station, and in that case the revenue test is applied per channel. If your .1 channel has $10 million in revenue, you have to follow the rules on that channel; but if your .2 channel has $2 million in revenue, that channel is exempt, and it doesn't have to meet the standards you meet on the .1.
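To make the per-channel revenue test concrete, here's a minimal sketch. The $3 million threshold comes from the order itself; the function name, station data, and call signs are our own illustration, not anything defined by the FCC:

```python
EXEMPTION_THRESHOLD = 3_000_000  # USD, per the Closed Captioning Quality Order

def channel_is_exempt(annual_revenue: float) -> bool:
    """The revenue test applies per channel, not per station."""
    return annual_revenue < EXEMPTION_THRESHOLD

# A station multicasting a .1 and a .2 evaluates each channel separately.
station = {"WXYZ.1": 10_000_000, "WXYZ.2": 2_000_000}
exempt = {ch: channel_is_exempt(rev) for ch, rev in station.items()}
# The .1 must follow the rules; the .2 falls under the exemption.
```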

The document gets into potential violations and what they might cost: the FCC established a fine of up to $8,000 per hour of programming for a flagrant violation. I don't know exactly what constitutes a flagrant violation, but I'm pretty sure that if you never want to be called flagrant, the best approach is to follow the best practices the document outlines. It's a rather large document, 132 pages, but it has a best-practices section starting at paragraph 51, on page 31. That's a very important section, about 16 pages, and it goes into the things TV stations and programming distributors need to do in order to be in compliance: a lot of monitoring recommendations, checking, and record keeping, with specifics for program distributors, for stations, and for the MVPD side. Those following the best practices will be given time to correct any violation before enforcement starts. The idea is that if you're following those practices and you have a violation, the $8,000-an-hour fine doesn't kick in; you just have to monitor, do some spot checking, and improve your practices, and you avoid the flagrant-violation category.

The second section… there are actually four sections in the document, the first being the Report and Order. The second is the Declaratory Ruling, where the FCC clarified some rules and made some rules effective immediately. One area is mixed-language content: there are specific rules about programming that is, say, mostly Spanish with a little English; you have to caption the English portions. The ruling is also specific that on-demand programming is not exempt. I'm still trying to understand how content that was never broadcast but is available on-demand qualifies in the new IP world, so we have questions about some pieces of it. It also makes clear that there is no low-power TV exemption. The less-than-$3-million revenue exemption still applies, but if you're a low-power TV station broadcasting in New York City, you could easily have more than $3 million in revenue.

The third section is the Further Notice of Proposed Rulemaking (FNPRM). These are areas of study: things the FCC has examined and knows need further clarification, but hasn't yet decided to put into actual law. The first question is whether the video programmers should be held accountable. They are ultimately the people who produce the content, and some level of responsibility may need to be pushed more directly back to them, so this is an area for study. Another is more objective rules, rather than just the subjective standards of accuracy, synchronicity, completeness, and placement; the FCC may define actual objective measures for compliance, so that, say, 98.7% or 99.8% of the words must be captioned properly. That would be a quantitative measure. Another is rebroadcast content: if content originally captioned live (with the leniency live captioning is allowed on accuracy and timeliness) is re-aired, should there be additional rules saying, in effect, "You need to clean up the captions; you can't just rebroadcast what you captioned live"? Related to this, what about non-broadcast content, such as IPTV delivery? Should it be subject to the same rules? There are questions about the specific checks used to monitor and ensure compliance: should there be set time periods, checklists, and reporting of those checklists? Should the long-standing broad exemption allowing ads to air without captions be continued? As I noted earlier, there are questions about closed captioning font, color, size, and opacity: how do those options contribute to compliance efforts, and do they earn additional leniency? And then there are 3D and Ultra HD: where do you put the captions in 3D, and at what depth? They can't simply sit flat on the screen, because that would break the 3D effect. In Ultra HD, how big should they be? The size and positioning might well differ from regular HD or SD content.

If your organization is active in this area, you probably know this already, but the comment period has been extended and additional comments are accepted until August 8th. What does that mean? The FCC is not done with this yet; they're still taking comments from the public. Once that process finishes, they'll likely come back with more clarifications.
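To make the idea of an objective, quantitative accuracy measure concrete, here is a rough sketch of a word-level accuracy score comparing caption text against a reference transcript. This is purely illustrative: the FNPRM does not prescribe any particular metric, and real caption-quality models weight errors by severity rather than counting matched words.

```python
import difflib

def caption_word_accuracy(reference: str, captions: str) -> float:
    """Fraction of reference words the captions reproduce, in order."""
    ref_words = reference.lower().split()
    cap_words = captions.lower().split()
    matcher = difflib.SequenceMatcher(a=ref_words, b=cap_words)
    matched = sum(block.size for block in matcher.get_matching_blocks())
    return matched / len(ref_words) if ref_words else 1.0

score = caption_word_accuracy(
    "thunderstorms are expected across the metro area tonight",
    "thunderstorms expected across the metro area tonite",
)
# A rule of the kind the FNPRM contemplates might then require score >= 0.987.
```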

The last section of the document is the Final Rules, pages 100 to 108. It's for the lawyers: the actual amendment to the FCC rules, Part 79 of Title 47. In a sense, it's the only part that really matters. Because this is legal text, it's mostly for your lawyers, and it's worth knowing that whatever you see there is likely to change as a result of the FNPRM.

So, in summary, the goal is clear: closed captioning quality really matters. The effort started in 1976, and the FCC's goal here is to finish the job that began back then. Some rules, such as the restrictions on the Electronic Newsroom Technique, are in effect now; some are coming in February; and many more are still to be written. If you deliver TV, you're responsible for it. The exact methods for determining compliance are still to be determined. Certainly they include monitoring, periodic checks, and keeping records of those checks, but the FNPRM is likely to bring additional modifications and specifics about what compliance means and what you can do to ensure it. We're obviously key players in the compliance realm; it's in our DNA, it's how the company was started, and our goal is to help our customers achieve and quickly improve compliance so they can spend their time delivering content instead of worrying about FCC rules.

Disclaimer: we're not lawyers, and we don't claim that this is legal advice. When it comes time to determine exactly what you need to do, consult your own counsel.

Volicon At IBC – Stand 7.G23

At IBC2014, Volicon will demonstrate five powerful new applications within the company’s Observer® Media Intelligence Platform™ for the first time in Europe. The Media Intelligence Platform boasts an enterprise-wide solution that records a station’s media from ingest to playout, as well as multiple on-air broadcasts. In addition to enabling multiple users to stream, analyze, and review content from anywhere at any time, the platform supports a range of applications including compliance, quality assurance, competitive analysis, production, and repurposing for multiple platforms and social media outlets. With these tools, Media Intelligence Platform users are equipped to capitalize on new opportunities to create compelling content, raise viewer ratings, and generate new ad revenue.

STAND 7.G23, Hall 7

Capture
Today’s broadcaster must capture media from a variety of sources to produce compelling content for viewers, whether delivered via on-air broadcast or a digital platform such as Web, mobile, streaming, and OTT services. Serving as a cost-effective alternative to expensive and cumbersome capture stations, Volicon’s new Capture application allows broadcasters to capture media from any source at any time, ingesting media according to a schedule, on demand in real time, or continuously on a 24/7 basis. The application supports a fast, simple edit workflow by enabling the user to review content as it is captured, immediately clip high-value content, and push it directly to editing and MAM systems without time-consuming transcoding steps. Because a low-resolution proxy is created along with the True HD (1080i 25/30, 720p 50/60) recording, both local and remote Observer® Media Intelligence Platform™ users — remote staff, partners, consultants, outside talent, and reporters in the field — can quickly and easily collaborate to access, review, and clip content to create valuable footage for distribution.

Share
Broadcasters today need an agile way to publish exciting and compelling content to a multitude of digital platforms including the Web and social media outlets. This typically is a cumbersome and expensive process, but Volicon’s new Share application allows the broadcaster to repurpose existing content quickly and efficiently and subsequently push it to digital platforms and social media sites. One-button publishing profiles facilitate rapid processing of content to meet appropriate container and encoding requirements for an array of platforms, including target websites, YouTube, Twitter, and Facebook. Share also makes use of existing closed captioning text to publish content that is compliant with government regulations.

Review
The new Volicon Review application provides broadcasters, networks, and playout service providers with a fast and intuitive solution for reviewing on-air content, validating ad placement, and performing competitive analysis. This application facilitates rapid access to broadcast content for users working centrally and across geographically distributed sites, thus giving all key stakeholders the ability to keep an eye on their own broadcasts, as well as those of their competitors, and associated ratings data within a single GUI. Making high-resolution live and historical broadcast content available locally and lower-resolution proxy versions available on any device, the application gives users the ability to review and analyze their broadcasts at any time, from anywhere. The application interfaces with the playout automation system to provide as-run log data for comparison with the frame-accurate recording of the broadcast output, thus making it easy for users to show advertisers what they’re getting for their money.

Comply
Volicon’s new Comply application enables users to record, clip, and export their broadcasts to meet regulatory and licensing requirements. Addressing a complete array of regulations, ranging from decency to closed captioning to loudness, this scalable and highly reliable application allows users to respond quickly and unambiguously to compliance requests. Leveraging Volicon’s proven compliance monitoring technology, Comply lays critical A/V metadata over frame-accurate video to create a clear visual affidavit of compliance.

Monitor
Built on Volicon’s acclaimed digital video monitoring technology, the new Monitor application allows users to monitor video quality, respond to fault reports, and use a full recording of the on-air broadcast for instant review of errors and their impact. While continuously analyzing logged content for a variety of faults such as black or static screen, loss of video or closed captions, and incorrect audio levels, this application provides flexible, individually configurable alert thresholds, with notifications delivered via email or SNMP traps. Quality measurement thresholds may be configured per channel to optimize performance and error reporting. To further simplify network monitoring and troubleshooting, the application provides an integrated multiviewer feature that enables Observer® Media Intelligence Platform™ users to use their standard displays as multiviewers or record the output of a traditional multiviewer. With multiple streams presented on a network wall, users can respond immediately to any issues, instantly grabbing the suspect stream via their desktop interfaces to begin resolving the problem.

TrueVisions Extends Volicon Observer® Monitoring and Logging System

TrueVisions, Thailand’s leading cable and satellite television operator, has expanded its Observer® digital video monitoring and logging system to simplify compliance verification across a total of 120 channels in its direct-to-home (DTH) platform. Local distributor and system integrator Trinergy provided a new Observer TS® (transport stream) monitoring and logging system that allows TrueVisions to perform continual monitoring and recording of both baseband and compressed signals from within a unified user interface.

“The original installation of Volicon technology allowed us to capture consistent high-quality recorded video that is incredibly easy to access and use, and it also helped us to realize cost savings associated with the shift away from tape-based compliance monitoring,” said Vichai Sernvongsat, chief technology officer at TrueVisions. “Now, with the addition of Observer TS, which logs the full MPEG transport stream, our staff can examine or export content from a recently captured transport stream, or go back further in time and view a low bit rate proxy version of aired content.”

The complete Observer system continuously monitors and records aired content across the TrueVisions lineup, providing real-time fault detection, as well as effortless clip identification and extraction for easy proof of compliance. Installed at TrueVisions’ Bangkok facilities, the Volicon system provides a simple and efficient means of verifying that advertising and program content have been aired properly and at the right time.

The Observer system captures, stores, and streams aired content, giving authorized users at TrueVisions instant access to live and recorded content from an easy-to-use Web-based GUI. Using this interface, desktop users can search, retrieve, analyze, and export video clips with metadata. Volicon’s As-Run-Log Integration module allows users to search and sort the as-run log via ID or commercial/program name for quick and easy ad verification with a direct link to video content. The Observer system’s quality of experience module provides real-time alarms for faulty video, audio, and closed captioning by issuing alerts via email/SNMP with a direct link to content and a master fault log.

“Compliance verification is a critical part of any broadcast business, and the Observer offers a flexible, cost-effective, and intuitive tool for meeting this requirement,” said Russell Wise, vice president global sales at Volicon. “Because it also is a modular system, it provides TrueVisions with a scalable foundation for monitoring additional channels or for bringing additional functions into its monitoring operations.”

Today’s Remote Monitoring Technology Is Primed for OTT and Streaming Services

by Gary Learner, Chief Technology Officer, Volicon
For VideoEdge Magazine

Today’s broadcasters face continued consolidation, centralization of staff, and ever-increasing pressure to improve efficiency — challenges that can take their toll on quality of experience (QoE) for customers. Fortunately, there are affordable, reliable, flexible solutions for remote monitoring that can help overcome these challenges. These solutions perform all-important proactive quality checks at audio/video service handoffs to ensure maximum QoE. The latest generation of compact, low-cost remote-monitoring solutions offers an expanded range of functions, such as QoE-based content monitoring with recording, remote viewing, and troubleshooting of A/V feeds across linear, on-demand, and interactive services. This functionality is especially important given the addition of over-the-top (OTT) and Internet streaming services to broadcasters’ already long list of services requiring monitoring.

A Complicated Monitoring Landscape

Just as content distribution has evolved in recent history — from over the air to digital and now to OTT and Internet streaming services — so too has the need for and complexity of content monitoring. It used to be that aired content was recorded and watched back afterward to look for faults, an inefficient, tedious, time-consuming task that often took hours or even days to detect problems. It was never a practical approach, to be sure, but at least then it could be done. Today, given the scope and complexity of services requiring monitoring, that manual method would be impossible even for the most well-staffed, well-funded operations.

The introduction of OTT and streaming services boosts the number of portals through which content can be consumed. Add to this the plethora of viewing devices — PCs, tablets, and smartphones — and associated “flavors” of content they require, and the challenge of assuring the best possible experience (within the constraints of all components) can be enormous.

OTT and Streaming Services Present Special Challenges

With media being delivered directly to viewers, who might be scattered across the country or even across different countries, there is no longer a “middle man” to share responsibility for QoE. Thus every operation, regardless of size, must be able to monitor the availability and quality of services across platforms, CDNs, and video service providers — all from a single location — in order to offer a high standard of quality. Such solutions are especially important for operations that lack dedicated monitoring staff and budgets.

Sophisticated Tools for Proactive Remote Monitoring

Fortunately the industry has already addressed remote monitoring of linear, on-demand, and interactive services, eliminating the need for expensive, time-consuming manual and visual channel inspections. With these tools, broadcasters can proactively identify and respond to faults rather than waiting for customer complaints.

Advanced monitoring solutions today can scan hundreds of channels around the clock and automatically test signal integrity, issue alerts (via email and SNMP), and capture the problematic content when channels do not conform to preset limits. Positioned “behind” the set-top-box (STB), such solutions give operators a single system and location from which to access and monitor the video output continuously. Remote monitoring capabilities enable engineers to review video and audio for issues such as static or black screen, as well as errors in closed captions and audio levels.
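The per-channel threshold idea described above can be sketched simply. Assume each channel reports periodic measurements (say, the fraction of black frames and an integrated loudness value); the monitor compares them to configured limits and collects alerts, which a real system would dispatch by email or SNMP trap. All names and threshold values here are illustrative assumptions, not any particular product's API:

```python
from dataclasses import dataclass

@dataclass
class Thresholds:
    max_black_ratio: float = 0.95   # fraction of black frames tolerated
    min_audio_lufs: float = -40.0   # below this we suspect silence

def check_channel(name, black_ratio, audio_lufs, th=Thresholds()):
    """Return a list of alert strings for one measurement interval."""
    alerts = []
    if black_ratio > th.max_black_ratio:
        alerts.append(f"{name}: black screen ({black_ratio:.0%} black frames)")
    if audio_lufs < th.min_audio_lufs:
        alerts.append(f"{name}: audio silence ({audio_lufs} LUFS)")
    return alerts  # a real monitor would email these or send SNMP traps

alerts = check_channel("News-HD", black_ratio=0.99, audio_lufs=-52.0)
```

Because the thresholds travel with the channel, each feed can be tuned independently, which matches the per-channel configuration the text describes.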

In terms of on-demand content, today’s sophisticated monitoring systems can ensure content availability and system capacity, record DPI ad insertions to prove ad conformance, or monitor interactive STB guides to ensure a customer’s experience. With complete STB command access, broadcasters can perform troubleshooting more effectively and use a historical review of content and services to address intermittent yet chronic issues. Refined for intuitive use, such systems often combine familiar VCR-like controls with color cues that clearly indicate the channels being displayed, whether they are live or recorded. Layouts and source selection are managed with simple mouse clicks, and a built-in clock facilitates navigation to the desired time stamp.

Remote monitoring technology is already deployed in applications ranging from competitive news analysis to monitoring of “out-of-footprint” broadcast and distribution channels. Now broadcasters are applying the technology to OTT services.

A New Challenge: Remote Monitoring for Streaming and OTT Services

Long gone are the days of monitoring the quality of a single, linear broadcast. Now broadcasters not only must assure video quality for multiple content streams in multiple formats to multiple devices, but, through OTT services, they must also deliver a personalized user experience alongside that video content. It’s a scenario that effectively multiplies their outputs. On top of that, they’re working with a variety of distribution platforms and might need to deliver content via a number of CDNs. The situation is further complicated by the fact that the groups of files delivered to each CDN will need to accommodate a wide range of devices, each with its own profile. It’s easy to see why it’s a significant QoE challenge. There is no plausible way to monitor all of these outputs and versions all the time, but today’s monitoring solutions make it possible for broadcasters to institute OTT service monitoring strategies that work.

Monitoring Content at Every Key Point

The most viable monitoring strategy makes assessments at key points in the delivery workflow: ingest, encoding, packaging, delivery, and distribution to the viewer (albeit in a controlled environment free of the vagaries of ISP service). It’s a mostly passive monitoring process that can give broadcasters reasonable confidence that the content they are delivering is packaged correctly in the formats compatible with target devices.

Monitoring ingested files is critical because they are frequently used as a high-bit-rate coded reference file for all future transcodes of that content. Monitoring at ingest is straightforward, as it typically requires continuous monitoring of just one feed.

Monitoring becomes more demanding in the encoding stage because of the number of files that result from this stage. Working with as many as a dozen files, broadcasters must shift to passive monitoring methods and begin examining data about the file rather than the video itself — a far more cost-effective approach when dealing with large numbers of files. By looking at bit rates, syntax, reference timestamps, and alignment of each of the files, the broadcaster can make a sound determination of the files’ integrity.
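A passive check of that kind might look like the following sketch: given metadata already extracted from each rendition (bit rate, duration, first presentation timestamp), cross-check that the encoding ladder is internally consistent without decoding any video. The field names and drift tolerance are assumptions for illustration:

```python
def validate_renditions(renditions, max_drift=0.04):
    """Cross-check encoder outputs using file metadata only.

    renditions: list of dicts with 'bitrate', 'duration', 'start_pts' keys,
    ordered from lowest to highest intended bit rate.
    Returns a list of human-readable problems (empty list means OK).
    """
    problems = []
    bitrates = [r["bitrate"] for r in renditions]
    if bitrates != sorted(bitrates):
        problems.append("bitrate ladder is not monotonically increasing")
    base = renditions[0]
    for r in renditions[1:]:
        if abs(r["duration"] - base["duration"]) > max_drift:
            problems.append(f"duration drift at {r['bitrate']} bps")
        if r["start_pts"] != base["start_pts"]:
            problems.append(f"misaligned start timestamp at {r['bitrate']} bps")
    return problems

ladder = [
    {"bitrate": 800_000, "duration": 600.02, "start_pts": 900_000},
    {"bitrate": 2_400_000, "duration": 600.03, "start_pts": 900_000},
    {"bitrate": 6_000_000, "duration": 600.02, "start_pts": 900_000},
]
issues = validate_renditions(ladder)  # empty list: ladder looks consistent
```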

Once files have been packaged in the appropriate formats for the target CDNs and platforms, they are delivered along with a manifest describing those files. The simplest way for the broadcaster to confirm that the packaging is being performed properly and that delivery is correct and on schedule is to “accept” files in the same way that a CDN would. After that point, the broadcaster no longer has control over the content and must hope that the CDNs — and, subsequently, ISPs — will carry it to the viewer without introducing faults or compromising quality.
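"Accepting" files the way a CDN would largely amounts to fetching the manifest and confirming that every segment it references is present and non-empty. A minimal local-filesystem sketch follows; a real check would fetch over HTTP and might verify checksums, and the manifest here is assumed to be a plain HLS-style media playlist:

```python
from pathlib import Path

def verify_package(manifest_path: str) -> list:
    """Confirm each media segment listed in an HLS-style playlist exists."""
    manifest = Path(manifest_path)
    missing = []
    for line in manifest.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#"):  # skip tags and comments
            continue
        segment = manifest.parent / line
        if not segment.is_file() or segment.stat().st_size == 0:
            missing.append(line)
    return missing  # empty list means the package is deliverable
```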

Streaming and OTT Services Require Active Monitoring

When it comes to monitoring the quality of streaming content, passive monitoring won’t work. Instead, broadcasters must apply sampled active monitoring on the tail end of the chain. This approach acknowledges not only the complexity of multiplatform media delivery, but also the exponential leap in the volume of media being delivered.

Actively capturing a sampling of outputs can give broadcasters an accurate idea of the quality that most of their OTT viewing audience is experiencing. Thus, for a relatively modest investment of time and money, particularly as compared with the cost of monitoring all outputs all the time, broadcasters can ensure that most of their customers are enjoying quality service most of the time.

Besides active monitoring at the end of the delivery chain, broadcasters are also taking advantage of “round robin” emulation with different devices, rates/resolutions, and CDNs. In round robin monitoring, the broadcaster alternately checks the lower, middle, and upper bit rates; examines Apple HLS, Microsoft Smooth Streaming, and Adobe HDS formats; and monitors content for quality. By taking these measurements in a controlled environment, broadcasters can easily separate the issues they can control from the ones that occur during ISP delivery.
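The round-robin rotation is easy to express: cycle through the cross-product of rate tiers and streaming formats so that every combination gets sampled over time. This sketch only yields the schedule; a real monitor would point a player or device emulator at each combination in turn, and could add CDNs and device profiles as further dimensions. The tier and format labels mirror the ones mentioned above:

```python
import itertools

TIERS = ["low", "middle", "upper"]            # bit-rate rungs to sample
FORMATS = ["HLS", "Smooth Streaming", "HDS"]  # packaging formats to sample

def round_robin_schedule():
    """Endlessly cycle through every (bitrate tier, format) combination."""
    return itertools.cycle(itertools.product(TIERS, FORMATS))

schedule = round_robin_schedule()
first_checks = [next(schedule) for _ in range(4)]
# → [('low', 'HLS'), ('low', 'Smooth Streaming'), ('low', 'HDS'), ('middle', 'HLS')]
```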

With a combination of active sampling and round robin emulation, a broadcaster can effectively become a consumer of its own OTT services. When these monitoring tasks are automated and alerts have been configured to warn engineers of any issues, the broadcaster can maintain a proactive approach to monitoring across its full complement of services.

Required Infrastructure

For this model of active sampling to work, the broadcaster’s video monitoring and logging system must touch on a multitude of MPEG transport stream points and monitor adaptive bit rate streaming of both encrypted and unencrypted media. In order to monitor OTT media delivered through apps, the system can employ an external device to emulate and record app behavior. This method accommodates both decryption and authentication while illustrating the user experience. With this functionality, the broadcaster can effectively monitor streaming media (encrypted or unencrypted) in any format.

Monitoring Streaming Content for Compliance

Federal Communications Commission (FCC) regulations demand that both broadcast and OTT content include descriptive video and captioning, so monitoring OTT content for compliance purposes is just as important as maintaining QoE. Fortunately, the very monitoring tools and techniques that support QoE monitoring of OTT services also enable broadcasters to make sure that their services comply with regulations from the FCC and others.

Conclusion

Advertising, encoding, delivery mechanisms, target devices, and other variables are combined to make monitoring across OTT services a challenging but necessary task. The simplest and most cost-effective means of monitoring to address new OTT services is to extend installed monitoring and logging technology. In this way, broadcasters can take advantage of proven technology and workflows to assure that they are delivering the high-quality personalized content today’s media consumers desire.

 

# # #

 

This paper was first presented at the 2014 NAB Broadcast Engineering Conference on Wednesday, April 9, 2014 in Las Vegas, Nevada. You can find additional papers from the 2014 NAB Broadcast Engineering Conference by purchasing a copy of the 2014 BEC Proceedings at www.nabshow.com.

India’s NSTPL Uses Volicon’s Observer® Monitoring Technology to Support Headend-in-the-Sky Platform

Observer Monitoring and Logging System Enables NSTPL to Confirm Compliance, Quality, and Availability of Content on New JAINHITS Platform

NSTPL (Noida Software Technology Park Limited), part of India's Jain TV Group, is using the Volicon Observer® Media Intelligence Platform™ digital video monitoring and logging system to enable efficient, effective compliance and quality of service (QoS) monitoring for more than 200 channels being aggregated, processed, and uplinked via the company's Headend-in-the-Sky (HITS) platform, JAINHITS. This platform, the first of its kind in India, offers cable operators across India a straightforward and cost-effective means of meeting the country's mandatory shift from analog to addressable digital systems.

NSTPL, already an established provider of TV broadcasting, newsgathering, and video up-link services, launched JAINHITS in October 2012 to help cable operators meet the December 2014 digitization deadline set by the Indian Parliament. Through this platform, the company downlinks content from different broadcasters, processes the signals, and uplinks them via satellite for download by its customers and cable operators across India.

The Observer Media Intelligence Platform continuously captures and stores this content, enabling NSTPL to maintain a visual record of the content that has been processed and uplinked. Through an intuitive Web-based interface, the Volicon system also provides easy access both to live streams and recorded media. Monitoring staff and other users at the desktop can thus monitor the content going out to customers or go back days or months to find and provide proof that uplinked content met all appropriate regulations, standards, and quality parameters.

Volicon Works With Astro Malaysia to Roll Out Monitoring and Off-Air Logging System for More Than 200 Channels

Malaysian pay-TV operator Astro is using an Observer® Enterprise video monitoring and logging system to enable off-air logging for more than 200 channels. Focusing on key points in the transmission path, the Volicon system monitors and logs incoming and outgoing feeds in a variety of formats. With broad format support and the ability to support high-density monitoring and logging applications, the Observer system serves as a reliable and flexible solution that addresses the needs of Astro departments ranging from engineering to media sales.

“Volicon’s Observer system gives us an integrated off-air logging system built on proven technology and equipped with features that give our staff a high degree of flexibility in working with aired media,” said Chris McMillan, vice president, production services, Astro. “By allowing simultaneous users across our operations to access logged content quickly and with ease, the Observer system has enabled us to improve our efficiency and responsiveness in assuring the quality and compliance of Astro services.”

With more than 3.9 million residential customers (representing approximately 56 percent of Malaysian TV households), Astro offers 171 TV channels, including 39 HD channels, delivered via direct-to-home satellite TV, IPTV, and OTT platforms.

Installed in Astro’s main DTH broadcast center, the All Asia Broadcast Centre located in Kuala Lumpur, the Observer Enterprise accepts and monitors signal types including composite, component, HD/SD SDI, and transport stream inputs. The system is equipped with Volicon’s quality of experience (QoE) module, as well as an as-run log module that allows users to search and sort the as-run log via ID or commercial/program name for quick and easy ad verification with a direct link to video content. A content export module makes it easy for Observer users to extract and share select clips from recorded content.

The Observer’s multiview display feature enables users to watch multiple programs on a monitor wall and use the desktop interface to target and begin inspecting or troubleshooting a suspect stream without delay. In executive offices and board rooms, this capability opens up a host of valuable monitoring and review opportunities for both real-time and recorded broadcasts.

“The complexity of large-scale pay-TV operations demands a monitoring and logging system that is robust yet intuitive,” said Gary Learner, CTO at Volicon. “The system must meet the technical requirements of the engineers responsible for maintaining the integrity and quality of the service output, as well as the various needs of the staff who use logged content in other areas of the business. The Observer Enterprise does it all, thereby simplifying critical tasks across the broadcast facility.”

More at: volicon.com