A Brief History of TV and Closed Captioning in Canada

By the mid-1940s, in some Canadian cities neighbouring the US border, over-the-air television signals from American stations could be captured with elevated antennae and amplifiers, spurring the development of Canadian content transmission through the late 1940s and into the 1950s. Because Canadian programming began its rise in tandem with American programming, the Canadian transmission system mirrored the technical standards set out by the US Federal Communications Commission (FCC).

In Canada, television began under the regulatory oversight of the Canadian Broadcasting Corporation (CBC); however, the dual responsibility of program development and national distribution was too much for one organization to tackle alone, and private licences were issued to produce national programs for the CBC. In the late 1950s, the Broadcasting Act introduced by the Conservative government of John Diefenbaker instituted the Board of Broadcast Governors (BBG) to oversee the regulation of private and public broadcasting. Because of lingering ambiguities in that act, a new Broadcasting Act was passed in 1968 delegating licensing authority to a new organization, the Canadian Radio-Television Commission, renamed the Canadian Radio-television and Telecommunications Commission (CRTC) in 1976. Since its inception, the CRTC has worked to ensure that radio and television stations and networks operated under Canadian ownership “safeguard, enrich and strengthen the cultural, political, social and economic fabric of Canada,” and that the programming provided should be “of high standard, using predominantly Canadian creative and other resources.”

Radio and television broadcasting technology was developing at a fast pace in the US, and Canada was caught in a race to keep up. By the 1970s, the Public Broadcasting Service in the United States was offering select television programs with open captioning, but closed captioning was still an experimental technology. Washington’s public television station successfully tested a closed captioning system in 1973, using line 21 of the television signal to carry caption data that could be activated by the viewer. Encouraged by the success of this test, the FCC reserved line 21 for the transmission of closed captions in the United States in 1976. In 1979, the National Captioning Institute (NCI) was formed with the sole purpose “to promote and provide access to television programs for the deaf and hard-of-hearing community through closed captioning technology.” Following this example, the Canadian Association of the Deaf petitioned the CRTC to initiate captioning in CBC programming, and in 1981 closed captioning was introduced in Canadian programming.
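The line 21 system described above was later formalized as the CEA-608 standard: each video field carries two caption bytes, and each byte holds seven bits of data plus an odd-parity bit for error detection. As a minimal illustrative sketch (the function names here are our own, not part of any broadcast library), the following Python shows how a decoder checks parity and recovers printable characters from a byte pair:

```python
# Minimal sketch of line 21 (CEA-608-style) caption byte decoding.
# Each caption byte carries 7 data bits plus an odd-parity bit (the MSB).

def has_odd_parity(byte: int) -> bool:
    """Return True if the byte, including its parity bit, has odd parity."""
    return bin(byte).count("1") % 2 == 1

def decode_pair(b1: int, b2: int) -> str:
    """Decode a two-byte caption pair into printable text.

    Bytes failing the odd-parity check are discarded; control codes
    (0x10-0x1F after parity stripping) are ignored in this sketch.
    """
    chars = []
    for b in (b1, b2):
        if not has_odd_parity(b):
            continue              # transmission error: drop the byte
        data = b & 0x7F           # strip the parity bit
        if 0x20 <= data <= 0x7E:  # printable basic character range
            chars.append(chr(data))
    return "".join(chars)

# Example: 0xC8 is 'H' (0x48) with its parity bit set to make the
# bit count odd; 0xE9 is 'i' (0x69) with its parity bit set.
print(decode_pair(0xC8, 0xE9))  # → Hi
```

A real decoder also interprets control-code pairs for caption positioning, colour, and pop-on/roll-up display modes, which this sketch omits.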

Live, or ‘real-time’, captioning took longer to appear in Canada because of significant technical hurdles. Until networks could provide in-house captioning, live programs in Canada would be fed to CCDA to be captioned, then transmitted back to the network in real time for broadcast.

In 2007, the CRTC announced a new policy that required all Canadian broadcasters to caption 100% of their programs, except for advertisements and promotions.

Numerous websites offer more information on the history of television broadcasting.


CRTC Regulations

The Canadian Radio-television and Telecommunications Commission (CRTC) is an arm’s-length quasi-judicial body accountable to the federal Department of Canadian Heritage. Its mandate is to interpret, create and enforce policy under the Broadcasting Act and the Telecommunications Act. Specifically, Parliament’s objectives for broadcasting policy in section 3(1) of the Broadcasting Act include accessibility, stating that programming accessible by disabled persons should be provided within the Canadian broadcasting system as resources become available for that purpose. The CRTC therefore regulates and licenses all Canadian television and radio stations, including all accessibility requirements.

Broadcasters must complete several steps to obtain and keep their licence: they must obtain a broadcasting frequency from Innovation, Science and Economic Development Canada and broadcast certification from the CRTC; they may undergo a competitive process to obtain their licence; and they must submit several forms outlining the type of broadcasting proposed. Applications must meet certain minimum criteria in the areas of ownership, financial capacity, technical capacity, and programming before a licence to broadcast can be awarded.

These requirements allow the CRTC to ensure that there is a need for the proposed broadcaster and that the requirements for Canadian content and accessibility are being met. If issues arise from a broadcaster’s programming that are not addressed, the CRTC can follow up on complaints received in writing.

For more information on the CRTC, please visit their website: http://www.crtc.gc.ca/eng/home-accueil.htm

For more information on how the CRTC regulates accessibility features in television broadcasting, please visit the following webpage: http://www.crtc.gc.ca/eng/tv_radio4.htm?_ga=1.119770888.1685172260.1484153535


Accessible Media

The technology is called closed captioning (CC) because it can be turned on and off: captions appear on the television screen only if enabled by the viewer. On some televisions, the viewer can access captioning through a CC button on the remote control; on newer Smart TVs, captioning can be activated through a designated menu in the service provider’s setup options. Although all televisions manufactured after 1993 are equipped with captioning capability, the method of accessing it varies.

[Image: Closed Caption button on a universal remote]

There are a few different methods for encoding captions:

  • Live captioning is usually performed by a steno-captioner or a shadow captioner in real time, and the text appears on the television screen during a live broadcast such as news or sports. There is little or no time for errors to be corrected, and there will always be a lag between when something is said and when the caption appears. For live programming, the broadcaster is usually responsible for providing the captions.

    [Image: Captionist typing on a stenotype machine]

  • Post-production captioning is done after a show has been made by a production studio but before it is broadcast. There is some time for errors to be corrected and for the captions to be checked before they are broadcast. The most common display style for post-production captions is called ‘pop-on’, where all lines of captioning appear and disappear at once rather than rolling up the screen. Sometimes the production studio is responsible for captioning, and sometimes it is the broadcaster’s responsibility.

  • Speech Recognition (SR) captioning uses computers to recognize spoken language and transcribe it into text. The quality of captions produced through speech recognition software varies a great deal and currently introduces more errors than captions produced by a human steno-captioner. Speech recognition is a rapidly advancing sector of software development, however, and has seen great improvements in recent years; in the future, this technology may be used instead of steno-captioning.

  • Described Video (also known as Audio Description or Video Description) is audio-narrated descriptions of a television program's key visual elements. These descriptions are inserted into natural pauses in the program's dialogue. Described Video makes television programming more accessible to individuals who are blind or have low vision. (https://www.fcc.gov/general/video-description, http://www.crtc.gc.ca/ENG/INFO_SHT/b322.htm)

  • Shadow Captioning (also known as Re-speaking) uses trained human speakers who listen to the original audio, interpret it and repeat it to the computer’s speech recognition system. The computer then produces the captions for broadcast. (https://www.crim.ca/Publications/2006/documents/plein_texte/PAR_BouGals_Interspeech06.pdf)


Assistive Listening Device (ALD) Options

While captions are one way to provide hearing accessibility, audio quality is crucial to viewer comprehension. Especially important is being able to hear dialogue over the background noise of the program.

If the television speakers do not provide sufficient volume, clarity or speech comprehension, assistive devices can bring the sound directly to your ears, through headphones, hearing aids or cochlear implants. This section will provide information on several different types of systems, such as proprietary wireless protocol communication systems, FM and Infrared, Induction Loop, as well as some low-cost alternatives.

Discreet Wireless Microphones

Bluetooth, Phonak Roger Link and Apple systems are examples of proprietary wireless communication systems that can be synchronized to hearing aids or cochlear implants by hearing healthcare professionals, helping the individual understand more while listening to speech, multimedia, smartphones and TV. These devices must be paired with your hearing aids or cochlear implant in the hearing clinic.

Once they are synchronized with a device, the user must return to the hearing clinic to make any changes (e.g., changing to a new device), and there are some limits on how many devices can be synced at any one time.

FM and Infrared Systems

Most FM and Infrared systems consist of a transmitter/emitter that connects the television’s audio system to a receiver and headset worn by the user. Sometimes, the receiver is built into the headset itself.

FM systems use radio waves to transmit sound and Infrared systems use light to transmit the sound. The FM and Infrared systems are wireless, providing some mobility for the user. However, there are some considerations before deciding on which system to use.

[Image: Example of an FM system]

[Image: Infrared system with a television]

Induction Loop Systems

Induction loop systems (also known as hearing loops, or loop systems) use an electromagnetic (EM) field to transmit sound. They work with small receiving coils, technically called flux coils but more commonly known as telecoils (or t-coils), built into personal amplification devices such as hearing aids, bone-anchored hearing aids, and cochlear implant processors. The telecoil picks up the modulated electromagnetic signal and converts it back into sound through the personal amplification device.

For most induction loop systems, a wire is taped to the floor or baseboards around the perimeter of the room, and acts as an antenna to transmit the audio signal through the EM field. One benefit of using an induction loop system with personal amplification devices (such as hearing aids or cochlear implants) that have a telecoil, is that no additional headphones or receivers are needed.

While induction loop systems require very little maintenance once set up, they can be susceptible to electrical interference, so it is important that the system is installed correctly to produce an adequate signal for a telecoil to pick up. Unlike proprietary systems such as Bluetooth, telecoils do not drain the batteries on a hearing aid or cochlear implant any faster than the microphone program, and telecoils work with all induction loop systems, regardless of the brand of the hearing aid or cochlear implant.

A hearing healthcare practitioner can provide information and support on telecoils and systems that will help with listening to the TV.

[Image: Induction loop system]

Audio Streaming

Audio Streaming works by sending the audio signals from a pre-amplifier or mixer digitally over standard Wi-Fi to Android or Apple smartphones and tablets. A low-cost audio streamer processor replaces the old-style analog transmitters (such as FM or IR systems). Many people who wear hearing aids and cochlear implants are able to stream audio from their smartphones using Bluetooth technology, allowing them to listen to the audio with the amplification level prescribed by their audiologist. With multiple Wi-Fi Access Points the system is scalable for virtually any venue.

[Diagram: Audio Streaming system. Audio signals are sent from a pre-amplifier through a Wi-Fi network to smartphones and tablets, allowing people to listen through their personal devices.]

Lower cost accessibility options

Sound energy weakens over distance, so bringing the audio source closer to the listener is a simple way to improve the listening experience. This can be done with small speakers placed close to the listener. Wireless, portable television speakers such as the TV Soundbox, for example, have a range of approximately 30 metres and a volume control that works independently of the television’s speakers.

There are speakers that bring sound close to where the listener sits and are even designed to blend into the furniture with customizable colour options, such as the Audio Fox brand wireless TV speakers. Once the speakers are set up, the volume can be controlled by a remote provided with them. These systems come with single or dual speakers that can be draped over the back of a chair, couch or bed.

For more information on Assistive Listening Devices, please refer to the booklet “Full Access: A Guide for Broadcasting Accessibility for Canadians living with hearing loss”.
