Media Publishing Glossary

Let's get technical with our media publishing glossary.

4K Streaming

4K streaming is high-resolution video, also known as Ultra HD (UHD), with four times the pixels of Full HD (1080p) video. 4K videos have more than 8 million pixels in the video display and a horizontal pixel count of approximately 4,000. 4K content is becoming more widely available online, including on Apple TV, YouTube, Netflix, Hulu, and Amazon.

 

Common 4K video resolutions include:

  • 4096 × 2160 (full-frame DCI 4K, a 256∶135 or ≈1.90∶1 aspect ratio)
  • 3840 × 2160, also called 2160p or UHD (a 1.77∶1 (16∶9) aspect ratio)
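As a quick sanity check on the figures above, a few lines of Python confirm the pixel counts behind the two 4K resolutions, and the comparison with Full HD:

```python
# Quick check of the pixel counts behind the "4K" label.
def pixel_count(width, height):
    """Total pixels in a frame of the given dimensions."""
    return width * height

dci_4k = pixel_count(4096, 2160)   # full-frame DCI 4K
uhd_4k = pixel_count(3840, 2160)   # consumer UHD (2160p)
full_hd = pixel_count(1920, 1080)  # Full HD for comparison

print(dci_4k)              # 8847360
print(uhd_4k)              # 8294400
print(uhd_4k // full_hd)   # 4 -- UHD has four times the pixels of 1080p
```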

Adaptive Streaming

Adaptive streaming (adaptive bitrate streaming, or ABR) is a method of video streaming over HTTP where the source content is encoded at multiple bit rates. Each bit rate stream is segmented into multi-second parts; segment sizes are typically between two and ten seconds. The adaptive bitrate (ABR) algorithm in the client decides which bit rate segments to download based on the current state of the user's network, resulting in smoother playback and less buffering.
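The client-side decision described above can be sketched as a simple throughput-based rule. This is a minimal illustration, not any particular player's algorithm; the rendition ladder and safety margin below are assumed values:

```python
# Minimal sketch of a throughput-based ABR decision.
def choose_bitrate(available_kbps, measured_throughput_kbps, safety=0.8):
    """Pick the highest rendition that fits within a safety margin
    of the measured network throughput."""
    budget = measured_throughput_kbps * safety
    candidates = [b for b in sorted(available_kbps) if b <= budget]
    # Fall back to the lowest rendition if even that exceeds the budget.
    return candidates[-1] if candidates else min(available_kbps)

renditions = [400, 800, 1600, 3200, 6000]  # kbps ladder (illustrative)
print(choose_bitrate(renditions, 2500))    # 1600
print(choose_bitrate(renditions, 300))     # 400
```

Real ABR algorithms also weigh buffer occupancy and throughput history, but the core trade-off is the same: download the best quality the network can sustain without stalling.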

 

Implementations of adaptive streaming include:

  • MPEG-DASH
  • Adobe HTTP Dynamic Streaming
  • Apple HTTP Live Streaming (HLS)
  • Microsoft Smooth Streaming
  • QuavStreams Adaptive Streaming over HTTP
  • Self-learning clients

Aspect Ratio

An aspect ratio is the proportional relationship between the width and height of a video or image, expressed as a ratio.

 

Common broadcast aspect ratios are:

  • 4:3 for SD video. Sometimes expressed as 1.33:1 ("full screen", the shape of older TV sets).
  • 16:9 for HD widescreen formats. Sometimes expressed as 1.78:1 (widescreen).
  • 1.85:1 or 2.35:1 for film (CinemaScope, TohoScope and other cinematic formats).
  • 2.39:1, known as the anamorphic widescreen format, often used for shooting scenic landscapes.
  • 2.76:1 (Ultra Panavision 70), a 70mm format developed in the 1950s and used in the Best Picture-winning film Ben-Hur.
  • 1.37:1 (Academy ratio), only slightly wider than the 4:3 ratio used throughout the silent film era.
  • 2.59:1 to 2.65:1 (Cinerama), a super widescreen format involving three synchronised 35mm projectors that project the film onto a curved screen.
  • 2.35:1 to 2.66:1 (CinemaScope), developed by the head of research at 20th Century Fox; it required only one projector, which made it much less complex than Cinerama.
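The relationship between a pixel resolution and its aspect ratio can be computed directly. A small Python sketch using the greatest common divisor:

```python
from math import gcd

# Reduce a pixel resolution to its aspect ratio, both as whole numbers
# (e.g. 16:9) and as a decimal (e.g. 1.78:1).
def aspect_ratio(width, height):
    d = gcd(width, height)
    return (width // d, height // d, round(width / height, 2))

print(aspect_ratio(1920, 1080))  # (16, 9, 1.78)
print(aspect_ratio(640, 480))    # (4, 3, 1.33)
print(aspect_ratio(4096, 2160))  # (256, 135, 1.9)
```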

Bandwidth

Bandwidth is the maximum amount of data that can be transferred from the video source location to the receiver or viewer in a period of time. Bandwidth is a measure of internet connection speed or the amount of data consumed, and is typically measured in kilobits or megabits per second, or in gigabytes. Bandwidth may be characterized as network bandwidth, data bandwidth, or digital bandwidth. A viewer's bandwidth may change due to network conditions, so modern adaptive streaming protocols were developed to adjust the data bit rate to a user's fluctuating bandwidth, ensuring streaming stability and less buffering.

Bitrate

Bitrate is the number of bits per second that can be transmitted along a digital network. The data is measured in bits, not to be confused with bytes. Since data is measured in bits, bitrate measurements are presented in bits, kilobits per second (Kbps), megabits per second (Mbps), gigabits, or terabits per second. For example, the bitrate of a 720p video stream may be approximately 2 Mbps, while a standard audio bitrate is 96–128 Kbps.
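Bitrate also determines roughly how much data a stream consumes. A back-of-the-envelope sketch, assuming a constant bitrate and decimal megabytes:

```python
# Rough size of a stream from its bitrate: bits per second x seconds,
# divided by 8 to get bytes.
def stream_size_mb(bitrate_mbps, duration_seconds):
    """Approximate download size in megabytes (1 MB = 10**6 bytes)."""
    total_bits = bitrate_mbps * 1_000_000 * duration_seconds
    return total_bits / 8 / 1_000_000

# A 10-minute 720p stream at ~2 Mbps:
print(stream_size_mb(2, 600))  # 150.0 MB
```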

Buffering

Buffering is the process of loading data into memory (a buffer). In live and on-demand streaming audio or video over the Internet, buffering refers to pre-downloading chunks or segments of data before the player starts and then throughout playback. The preloading of the data buffer can assist with smooth playback. Visible buffering during real-time playback (playback interruption) is caused by a delay in the data preloading process from low bandwidth, weak Wi-Fi, or issues with the streaming origin.
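The interaction between download speed and playback can be illustrated with a toy buffer model. This is a deliberately simplified sketch: it assumes one-second ticks and a fixed download rate, and ignores adaptive bitrate switching:

```python
# Toy simulation of a playback buffer: each second the player downloads
# `download_rate` seconds of video and plays back one second. A stall
# (visible buffering) happens whenever the buffer is empty.
def simulate(download_rate, duration, startup_buffer=2.0):
    buffered, stalls = startup_buffer, 0
    for _ in range(duration):
        buffered += download_rate   # seconds of video fetched this tick
        if buffered >= 1.0:
            buffered -= 1.0         # one second played back
        else:
            stalls += 1             # nothing to play: playback stalls
    return stalls

print(simulate(download_rate=1.2, duration=30))  # 0 -- the network keeps up
print(simulate(download_rate=0.5, duration=30))  # repeated stalls
```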

 

Techniques for smooth, uninterrupted video and audio playback include the use of adaptive streaming. Adaptive streaming adjusts to low or high bandwidth conditions by delivering an optimised bitrate (lower or higher) to the data buffer. GluePlyr supports all leading adaptive streaming technologies including HLS and MPEG-DASH.

Captions

Closed Captions (CC) display text over a video to provide dialogue and descriptions of background audio such as sound effects and other audio-only events. Closed Captions may be turned on or off by the viewer.

 

Open Captions differ from Closed Captions as they are always in view and cannot be turned off.

 

Subtitles differ from Closed Captions in that they do not describe non-dialogue audio and assume the viewer can hear the audio. Subtitles are often used to translate the dialogue.

Codec

A codec is a compression and decompression technology used to encode and decode a signal or data stream. The name codec is a portmanteau of coder-decoder (also expressed as encode-decode or compress-decompress).

 

A codec is a coding standard capable of encoding and decoding audio or video for transmission over a data network by reducing the file sizes of video, audio, and other media formats.

 

The most common video codecs for streaming are:

  • H.264/AVC
  • VP9
  • H.265/HEVC
  • AV1
  • H.266/VVC

Codecs should not be confused with containers. A codec applies lossy compression to reduce the source video file size, discarding unnecessary data, and decompresses the stream during playback.

 

Containers, on the other hand, store the video codec, audio codec, and metadata such as subtitles or preview images. So the container holds all the components together and determines which programs can accept the stream.

Container

Containers store the video codec, audio codec, and metadata such as subtitles or preview images. The container holds all the components together and determines which programs can accept the stream.

Containers should not be confused with codecs: a codec compresses and decompresses the audio or video data itself, while the container is the file format that wraps the compressed streams and metadata together.

 

Containers are also referred to as the file format. Common containers are:

  • 3GP (Third Generation Partnership Project)
  • ADTS (Audio Data Transport Stream)
  • FLAC (Free Lossless Audio Codec)
  • MPEG / MPEG-2 (Moving Picture Experts Group 1 and 2)
  • MPEG-4 (MP4)
  • Ogg
  • QuickTime (MOV, Apple QuickTime movie)
  • WebM

Deinterlace

Deinterlacing combines the two alternating fields found in interlaced video to form a clean frame in a progressive video. Interlacing was developed to reduce the transmission bandwidth of traditional broadcast television systems. Without deinterlacing, interlaced content will often display motion with a line-like (combing) appearance.
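A minimal sketch of the idea, with a frame represented as rows of pixel values: one field is kept and the missing lines are interpolated (a simple "bob" approach; real deinterlacers are considerably more sophisticated):

```python
# Minimal "bob" deinterlacing sketch on a frame represented as a list of
# scan lines. Even lines come from one field; the missing odd lines are
# interpolated by averaging the lines above and below.
def bob_deinterlace(even_field):
    """Rebuild a full frame from the even field only (illustrative)."""
    frame = []
    for i, line in enumerate(even_field):
        frame.append(line)
        if i + 1 < len(even_field):
            # Interpolate the missing odd line between two even lines.
            frame.append([(a + b) / 2 for a, b in zip(line, even_field[i + 1])])
        else:
            frame.append(line[:])  # duplicate the last line
    return frame

field = [[0, 0], [10, 10]]     # two even scan lines
print(bob_deinterlace(field))  # [[0, 0], [5.0, 5.0], [10, 10], [10, 10]]
```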

Embedding

Embedding a video or video player is the process of placing a video player on a website or application. Normally, this is done by copying an “embed code” for a live stream or video-on-demand file from an OVP and inserting it into the HTML code of your website. Although the video player appears on your website, it is hosted on the OVP or another website and delivered through the HTML embed code.

 

An iframe embed of a video player is the most common method of embedding a video player.

Encoding & Transcoding

Video encoding is the process of compressing raw, uncompressed video. Video transcoding, by contrast, is the process of re-encoding already compressed files, and therefore involves an additional step to decode the incoming video before encoding it.

 

Video and audio transcoders are more commonly referred to as encoders; however, they typically perform transcoding processes in addition to encoding, such as:

  • decoding different container formats, including MP4 and TS
  • decoding bitstreams that use different video codecs, including H.264/AVC, HEVC, AV1 and VP9
  • changing the resolution of the video to produce outputs of different resolutions (critical to ABR stream production)

 

In addition to encoding and transcoding, other terms are used in the process of preparing audio and video files for streaming:

  • Transrating: changing the bitrate of the video
  • Transmuxing: changing the container format without re-encoding the streams

 

Video encoders can be software or hardware and process live or on-demand video sources by compressing (encoding) and converting (transcoding) the video format. Encoding, by definition, takes an analog source and digitizes that content, whereas transcoding takes an existing digital format, decodes it, then compresses and encodes it to a different digital format.

 

An example is a .mov file being transcoded to an H.264/AAC HLS stream to be played in a live stream player or on a mobile device.

 

The transcoding process is a vital function to ensure playback compatibility with target devices and browsers that do not support the current format of your media. Encoding and transcoding also improve streaming performance by applying compression techniques to reduce file size and by creating adaptive streaming formats that adjust to the user’s available bandwidth.

 

During the encoding process, additional security measures may be applied, such as DRM packaging with license keys to prevent unauthorised video playback.

Frame Rate

Frame rate, or frame frequency, is the frequency at which consecutive images are captured or displayed. The number of images displayed per second is measured in frames per second (fps).

Essentially, video is made up of a series of still pictures, or frames, displayed one after the other. The most common digital video frame rate is 30 fps; however, fast-action sports and online gaming streams often use 60 fps, and VR targets even higher frame rates of 90 fps.

 

At 30 fps, 30 distinct images appear in succession within one second. If the fps is too low, movement will appear jagged and jerky; 24 fps is generally considered the lowest frame rate that still achieves smooth motion.
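The arithmetic behind these figures is straightforward; for example:

```python
# Frames shown over a clip's duration, and how long each frame is on screen.
def total_frames(fps, seconds):
    """Number of distinct images displayed in `seconds` of video."""
    return fps * seconds

print(total_frames(30, 1))   # 30 frames in one second at 30 fps
print(total_frames(60, 10))  # 600 frames in a 10-second 60 fps clip
print(round(1000 / 30, 1))   # 33.3 -- milliseconds each frame is shown at 30 fps
```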

 

It is common for the source frame rate to be matched during the video encoding process to achieve efficient encoding and optimised playback; frame rates therefore vary slightly between the different broadcast standards adopted around the world.

Geo-blocking, Geo-filtering & Geo-targeting

Geo-blocking restricts access to video content entirely based on a user's geographical location.

 

Geo-filtering restricts access to specific items of video content based on a user's geographical location.

 

Geo-targeting tailors the video content presented to a user based on their geographical location, for example by serving region-specific catalogues or advertising.

HTTP Live Streaming (HLS)

HTTP Live Streaming (HLS) is an HTTP-based adaptive bitrate streaming protocol developed by Apple. The HLS protocol enables responsive live streaming by delivering an adaptive bitrate set that adjusts to a user's bandwidth to achieve smoother playback. HLS creates small video chunks and delivers them to the player via a manifest file that describes the available bitrates and the order of the video chunks. The quality can differ from chunk to chunk based on the user’s internet connection at the time the data is sent.
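The manifest file mentioned above is a plain-text playlist. An illustrative master playlist is shown below; the tags follow the HLS specification (RFC 8216), while the bandwidth figures and paths are made-up examples. Each `EXT-X-STREAM-INF` entry advertises one bitrate rendition for the player to choose between:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2000000,RESOLUTION=1280x720
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
high/index.m3u8
```

Each referenced media playlist then lists the multi-second video chunks for that rendition in playback order.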

 

Despite its name, HLS delivers both live and on-demand audio and video to iPhone, iPad, Mac, Apple Watch, Apple TV, and PC.

 

By using the same HTTP protocol that powers the web, HLS lets you stream video and audio content using ordinary web servers and content delivery networks.

 

Support for the protocol is widespread in media players, web browsers, mobile devices, and streaming media servers.

Video Streaming

Glue Media Publishing System is tailored for video streaming production workflows. GlueMPS integrates with broadcast production systems to ingest video content and metadata, processing your media assets for cross-device playback.


Akamai

Akamai is the largest content delivery network (CDN) platform in the world, with more than 240,000 servers that cache video content to reduce latency, resulting in an improved video viewing experience. Akamai also benefits media publishing systems by offloading traffic from origin services, improving performance and availability, and by providing cyber security such as DDoS attack mitigation. GlueMPS integrates with the Akamai CDN platform.

Application Programming Interface (API)

An API is an application programming interface that facilitates the request and response of data between applications. APIs provide a software-to-software service for applications to add, edit and delete data, keeping two or more systems in sync. Typically, APIs give frontend systems such as websites and native apps an interface to request content or data to display and to make updates to backend systems such as database records.

The Glue APIs are organized around REST. Our RESTful API has resource-oriented URLs and uses HTTP response codes in combination with API status codes to indicate API errors. We use built-in HTTP features.

JSON is returned in all API responses including errors.
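How a client might combine the HTTP status code with a JSON error body can be sketched as follows. The response shapes here are illustrative assumptions, not the actual GlueMPS API:

```python
import json

# Sketch of interpreting a JSON API response: the HTTP status code plus
# a JSON body that carries an application-level error code (hypothetical
# response shape, for illustration only).
def parse_response(status_code, body):
    payload = json.loads(body)  # JSON is returned for successes and errors
    if 200 <= status_code < 300:
        return payload
    # Surface both the HTTP status and the API's own error code.
    raise RuntimeError(f"{status_code}: {payload.get('error', 'unknown')}")

print(parse_response(200, '{"items": []}'))   # {'items': []}
try:
    parse_response(404, '{"error": "not_found"}')
except RuntimeError as exc:
    print(exc)                                # 404: not_found
```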

 

Use the GlueMPS APIs to build simple or complex frontend or backend applications. The base APIs allow you to locate, view, filter, sort and paginate content, user, and asset data stored in the system.

Article Schema

Article Schema is a web form in the GlueMPS administration console used to enter metadata, images, assets and publishing rules that are assigned to an asset such as an audio or video file. Article schemas have a taxonomy, or metadata structure, that provides context and relationships between the data. Taxonomies of metadata (“categories”, “channels”, “movies”) allow the GlueMPS content administrator to add, edit and publish content metadata to the audio and video application, optimised for easy information discovery. Article schema content is available via the Glue RESTful content API. The content API is valuable because it contains not only tags that tell an application where the title, main content and subheads are, but also information that gives meaning and context to the data, so a user can quickly find a channel, show or movie by a favourite actor.

Content Delivery Network (CDN)

A CDN is a content delivery network: a distributed system of servers that store (cache) and serve content based on a user's location. Caching content in a CDN offloads file requests that would otherwise go directly to the media platform's origin servers, which assists with scalability and improves load speeds by reducing latency. A CDN is typically distributed globally and is designed to increase the speed and reliability of delivering digital content by caching content dynamically at edge servers located close to the audience.

 

Live and on-demand video streaming is a resource-intensive process therefore most video streaming platforms utilize a CDN. The process of caching the content (video files, web pages, etc.) offloads the file requests from a single server to a large network of distributed servers. CDNs work in combination with cloud-based hosting infrastructure to provide scalability by dynamically increasing resources to meet traffic demands.

 

Glue Media Publishing System supports all leading CDNs including Akamai, Amazon CloudFront, Fastly, Cloudflare, Alibaba Cloud CDN, Limelight Networks, Microsoft Azure CDN, KeyCDN and more.

Content Management System (CMS)

A Content Management System, or CMS, provides a software interface to add, store and edit digital content and publish it to frontend websites and apps. A content management system typically consists of a database, an API and a user interface (administration console).

Content is delivered from a CMS to websites and applications via APIs.

 

Content Management Systems differ from Online Video Platforms (OVP) and Media Publishing Systems (MPS) in that they do not feature media publishing workflows tailored to audio and video assets, or tools for managing the metadata specific to audio and video.

 

OVP and MPS have additional features beyond a CMS, including:

  • video broadcasting ingest workflow
  • audio broadcasting ingest workflow
  • video encoding
  • audio encoding
  • video metadata schemas
  • audio metadata schemas
  • digital asset management (audio, video, images)
  • video players
  • audio players
  • closed captions
  • subtitles
  • video entitlements
  • audio entitlements
  • digital rights management (DRM)
  • geo-targeting video & audio
  • geo-filtering video & audio
  • video paywalls (SVOD, TVOD)
  • audio paywalls (SVOD, TVOD)
  • media monetisation pricing
  • media monetisation transaction logging
  • media monetisation revenue reporting
  • media metadata management
  • API for media asset delivery
  • API for media tracking and reporting
  • audience user management
  • audience content personalisation
  • audience user-generated content
  • audience subscription and payment history
  • web and native applications for OTT platforms
  • web and native applications for video streaming platforms
  • web and native applications for music streaming platforms
  • web and native applications for podcast streaming platforms
  • web and native applications for radio streaming platforms
  • web and native applications for sports streaming platforms
  • web and native applications for online education platforms
  • QoS video monitoring system
  • QoS audio monitoring system
  • dynamically scalable infrastructure for responding to large video streaming events
  • CDN infrastructure for responding to large video streaming events


Glue Media Publishing System is a platform as a service for your online video platform or online music platform.

Turnkey OVP & OMP solutions to get you streaming video, music or podcasts to the largest possible audience.

Video content management systems frequently lack the end-to-end features necessary for managing an online video platform's entire publishing workflow. GlueMPS is a platform as a service providing infrastructure, software and applications to stream live and on-demand television, radio, music, podcasts, events, education, sport, news and government content.

Glue Media Publishing System's cloud-based software features modules to ingest, manage, distribute and monetise content.