Closed Caption vs. Subtitles: Understanding the Differences


What are captions

 

Captions are a text representation of what is happening on the screen. They presume that the viewer cannot hear the video’s audio track.

 

Today, video has become the main source of education, entertainment, and news. Different people watch videos in different circumstances: the environment can be noisy or sound-sensitive, the video can be on mute, and some viewers may have a hearing impairment. Captions aim to make videos accessible to all of these viewers. This means that the transcription must include more than the dialogue. Captions denote noises, music, and certain actions (such as sneezing or laughing). There is also character differentiation, so that the viewer knows who is talking.


Types of captions

 

There are two types of captions, open and closed.

 

Open captions

 

Open captions are also known as burned-in, baked-on, or hard-coded captions. They are a permanent part of the video, and you cannot turn them on or off. Open captions accompany films at scheduled screenings for the deaf and hearing-impaired, because not all cinemas have the equipment needed for closed captions. The same is true for online videos: not every online video player has the necessary functionality, so many online videos use open captions. Social media videos also usually start playing on mute, which has made open captions a staple of social media content.

 

Open captions come with several advantages:

 

  • You can style the captions, choosing the font, color, and size before embedding them;
  • Open captions require no special functionality from the media player;
  • By embedding captions into the video, you don’t have to worry about keeping track of two separate files.

But there are some limitations as well:

  • The viewer cannot disable open captions, and that may provoke a negative reaction from some;
  • Once embedded into the video, open captions cannot be edited or removed.

Closed captions

 

Closed captions are the most common type. They come as an option for almost any video viewing experience. What does ‘closed caption’ mean? ‘Closed’ in what regard? The name ‘closed captions’ refers to the viewer’s ability to turn them on and off, in other words, to “close” them. Closed captions require special functionality from online media players, and on television viewers need special equipment in the form of decoders. So, if you want to make your video as accessible as possible, open captions may be the best bet. This way, you can be sure that the audience can view and understand your video on any website or device.

 

That said, closed captions have important advantages:

 

  • Can be switched on and off, making the video viewing experience customizable;
  • Because they exist as a separate file, they can be re-edited and re-uploaded with the video;
  • Closed captions can be created in a range of formats, making them suitable for a variety of video players.

The main problems with closed captions have to do with the fact that not all media players support them, and that it is up to the viewers to find the right toggle and turn the captions on.

Where are captions used

 

Today, captions are virtually everywhere. In the US, the Americans with Disabilities Act of 1990 prohibited discrimination against people with disabilities. Video captions were not named in the Act, but multiple lawsuits have since established legal precedents on the issue. Today, all public media content, whether it is shown on TV or in a classroom, has to have captions. The 2012 case against Netflix established the platform as a ‘place of public accommodation’. For this reason, it has to provide captions on all its shows.

 

Captions or SDH (subtitles for the deaf and hard of hearing) are used in movies, hard-coded to appear on the screen when a character speaks in a foreign language or when an important piece of text has to be made clear – and, obviously, to offer translation for foreign-language films, for example from English into French or from German into English.

 

They are also used on social media videos that are often watched on mute.

 

What are subtitles


So, what does “subtitle” mean? We hear this term all the time. Subtitles are a text transcription of the dialogue in a video. They are based on the premise that the people watching the video can hear its audio track but do not understand it. For this reason, subtitles only include the spoken text.

 

Subtitling is primarily used to translate the video’s dialogue into other languages, thus expanding its audience. As such, subtitles are a common feature of foreign-language films.

 

But subtitles can also be a great learning tool, when same-language subtitles accompany a film, a learning video, or content for kids. Studies show that subtitles can enhance language acquisition and literacy in children.

 

What’s the difference between closed captioning and subtitles?

 

The main difference between closed captions and subtitles is their presumed audience. Closed captions are for those who are deaf or unable to hear the video’s audio track for another reason. Subtitles are for those who can hear the audio track but do not understand it. This can happen because the language is foreign to them or the person in the video speaks with a strong accent.

 

Closed captions contain information about all components of the audio track. These include the actual dialogue, names of the speakers, their actions, background music, and sound effects. Subtitles contain only the dialogue and rarely specify the speakers.

 

Open captions vs subtitles: which is best

 

The choice between open captions and subtitles is not about whether one is better than the other; it is more a question of your target audience and screening context. If you want to ensure that the video is accessible to everyone, open captions are generally the best choice. This is especially true when you are unsure whether the media player or playback equipment has captioning functionality.

 

That said, viewers cannot turn open captions off, so they may prove to be an unwanted distraction to some. Others may be unhappy that the captions hide part of the picture. Nonetheless, choose open captions if you aim for the widest audience possible, or if you don’t want to risk viewers or their equipment being unable to turn on closed captions.

 

Subtitles are primarily used to translate foreign language films and videos. They rely on people hearing the audio without understanding the language. For this reason, they only contain the dialogue text. That said, there is now a new SDH format of subtitling, created for deaf and hard-of-hearing people. It provides both the translation of a foreign-language video and the captions of all the other audio.


Subtitle and caption file formats

 

SRT

 

The SRT (*.srt) subtitle format stands for SubRip Text. This is the most elementary and easy-to-use format. The name comes from the format’s original functionality: it was used to extract captions and subtitles from media files with DVD-ripping software. An SRT file is a sequence of numbered subtitles, each with start and end timestamps and the subtitle text. Each entry has the following format:

 

  • A numeric counter indicating the position of the subtitle in the sequence;
  • The start and end time of the subtitle, separated by the --> characters;
  • The subtitle text, in one or more lines;
  • A blank line indicating the end of the subtitle.
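A minimal SRT cue might look like this (the timestamps and dialogue below are invented purely for illustration):

    1
    00:00:01,000 --> 00:00:04,200
    - Hello, and welcome back to the show.

    2
    00:00:04,500 --> 00:00:06,800
    - Glad to be here!

Note that SRT timestamps use a comma before the milliseconds and that every cue ends with a blank line.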

 

WebVTT


VTT (*.vtt) is the next most popular caption and subtitle file format. VTT is shorthand for WebVTT, which in turn is short for Web Video Text Tracks. The format was created in 2010 as an extension of SRT compatible with HTML5 video players. It is like SRT, but richer in functionality: the timestamps can be more precise, the subtitles’ position on the screen can be defined, and there is optional character differentiation as well as font, color, and text formatting. WebVTT can also carry the video’s metadata. But these options come at a cost: the more features a format has, the harder it is for media players to support it, so *.srt still enjoys greater prevalence.
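For comparison, a minimal WebVTT file might look like this (again, the timestamps, speaker names, and cue settings are invented for illustration):

    WEBVTT

    00:00:01.000 --> 00:00:04.200 line:90% align:center
    <v Anna>Hello, and welcome back to the show.

    00:00:04.500 --> 00:00:06.800
    <v Ben>Glad to be here!

The WEBVTT header, the dot instead of a comma in the timestamps, the optional cue settings after the timing line, and the voice tags that name the speakers are some of the visible differences from SRT.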

 

Importance of multi-language captions/SDH

 

We live in a connected world, and reaching people who live on different continents is easier than ever. All you need is an internet connection – that, and offering people the ability to understand what is happening in your video. This is why multi-language captions (also known as SDH, subtitles for the deaf and hard of hearing) are any creator’s best friend.

 

  • Multi-language captions remove the language barrier, expanding your audience manifold;
  • They serve as “wheelchair access” to your video for those who both lack knowledge of the foreign language and are hearing-impaired;
  • They help to retain viewers who may be watching your video in a sound-sensitive or very noisy environment, since they can follow it even on mute;
  • They provide a community service by boosting language learning: SDH help people gain proficiency by combining foreign-language audio and subtitles;
  • As an additional bonus, SDH improve your video’s SEO. The algorithms of Google, YouTube, and Facebook index video files through the text information associated with them and treat different languages as separate search results. Several SRT files with subtitles in different languages increase the video’s visibility in search.

The benefits of using a subtitling company

 

We’ve already established that SDH are a necessary feature for any video that seeks to reach the broadest audience possible. Why use a subtitling company for that? The benefits are numerous:

 

  • Outsourcing all the technical work, such as time-coding, syncing, encoding, etc.
  • Customization of subtitles
  • Time savings
  • Extensive range of languages

At vidby, we offer translation, subtitling, and dubbing services in 80 languages. For languages that are in use in different countries (think Portuguese spoken in Portugal and Brazil or French in France, Switzerland, Belgium, and Canada), we offer translations and dubbing in the dialect of your choice. What’s more, we do it superfast and save you a lot of money in the process. How do we do it? Well, we use state-of-the-art AI-powered translation and dubbing software. It enables us to translate one minute of video and dub it (with AI voices of your choice, available in different genders and age groups) in just two minutes.


We offer different rates depending on the purpose and desired quality of your final video. For the highest-quality video, human experts verify all AI-powered translations and professional actors do the dubbing. Such packages are perfect for expanding the audiences of your films, cartoons, and advertising campaigns. Other packages offer human-checked AI translation, unaudited AI translation, or AI dubbing, depending on the audience and content of your video. Join such giants as Samsung, Siemens, and McDonald’s, who use our services, and save money, time, and effort with our help!

 

FAQs

 

Can you download captions and subtitles on YouTube?

 

Yes, you can download caption and subtitle text from YouTube as a text file if the video’s creator added those subtitles to their video file.

 

To do so, go to the YouTube video, look at the menu below the video, and click the three dots next to the Save button. Select ‘Open Transcript’ to see the interactive script in a window next to the video. Click the dropdown menu at the bottom of the script to select the necessary language. Click the three dots at the top of the transcript to toggle the timestamps (you can choose to save the transcript with or without them). Once you have set these options, highlight the transcript text, copy it, and paste it into a new document.

 

Do closed captions mean subtitles?

 

Asking “Is closed captioning the same as subtitles?” is a totally valid question because many people use the two names interchangeably. More often than not, when you think you are seeing/using subtitles, it is actually closed captions. If the text you see on the screen includes more than the dialogue lines, you are watching a video with closed captions. If the text on the screen assumes that you hear the sounds, but cannot understand or decipher them, you are seeing subtitles.

 

That said, there are geographical differences in the two names as well. The distinction above is applicable in the United States and Canada. In the UK and many other countries, the word ‘subtitles’ is used to refer to both. An additional format, called SDH (subtitles for the deaf and hard of hearing), has appeared in recent years to differentiate between subtitles and closed captions.

 

What are subtitles?

 

Subtitles are a text script of all speech in a video. Subtitles assume that the viewers can hear the sounds but are unable to understand them. They are, therefore, created to translate the video’s audio into a language that the viewer understands. This is true both for foreign-language films and for films that include some foreign-language dialogue or shots of newspaper headlines, shop signs, and the like that are relevant to the story.

 

Another use of subtitles is for language learning, when people watch a show or a movie in a foreign language with subtitles in the same language. Studies show that as long as the difficulty of the text matches the user’s skill level, such viewing can improve language comprehension.

 

Subtitles usually have a transparent background. SDH text is placed on a darker background, similar to closed captions.

What are closed captions?

 

Closed captions are a text script of all sounds in a video. Closed captions assume that the viewers cannot hear the sounds. They provide a step-by-step narration on the screen, with character differentiation, descriptions of sounds such as laughter and crying, information about background music, etc.

 

Closed captions are also useful when the user watches a video in noisy or sound-sensitive environments. Another example of use is on social media where the sound is often muted.

 

Closed captions are usually displayed in white letters against a dark-gray or black background. They use square brackets or parentheses to display additional information about sounds rather than dialogue, for example [door slams] or (laughing).

Why are subtitles different from the audio track?

 

When a movie is translated with subtitles, the idea is to stay as close to the original text as possible. (Obviously, things like idioms should not be translated word for word.) When a movie is dubbed over the original language, other considerations come to the fore: dubbed lines must be as close in length to the original lines as possible in order to match the scene timing and mouth movements. Therefore, if you watch the same movie with both dubbing (audio) and subtitles, you may very well discover discrepancies between them. It also happens that different people do the dubbing and subtitle translations. Their translations may not be matched, so the same line can be translated differently.

 

Closed captions can also differ from the audio track, albeit for different reasons. Captioning may be edited with the target audience in mind: if you are watching a kids’ show with a lot of repetition, you might see some of that repetition removed from the captions. If captioning is done live, the captioner may mishear (or even mistype) something. Sometimes the captioning is fully automated, and that can also produce mistakes.

 

Why is it called closed captions?

 

Closed captions got their name because – you guessed it – they can be closed (and opened) at will. Unlike open captions, which are hard-coded into the video and cannot be removed, closed captions can be shown or hidden whenever the viewer chooses.
