Search completed in 0.96 seconds.
<video>: The Video Embed element - HTML: Hypertext Markup Language
The HTML video element (<video>) embeds a media player which supports video playback into the document.
... You can use <video> for audio content as well, but the <audio> element may provide a more appropriate user experience.
... The above example shows simple usage of the <video> element.
...And 65 more matches
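As a minimal sketch of the embed described above (the file names clip.webm and clip.mp4 are hypothetical placeholders):

```html
<!-- Minimal <video> embed; the browser uses the first <source> it supports. -->
<video controls width="640">
  <source src="clip.webm" type="video/webm">
  <source src="clip.mp4" type="video/mp4">
  <!-- Fallback text shown only by browsers without <video> support. -->
  Sorry, your browser doesn't support embedded videos.
</video>
```

Listing multiple <source> children is the usual way to cover browsers with different format support.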
HTMLVideoElement.videoHeight - Web APIs
The HTMLVideoElement interface's read-only videoHeight property indicates the intrinsic height of the video, expressed in CSS pixels.
... Syntax: height = htmlVideoElement.videoHeight; Value: an integer value specifying the intrinsic height of the video in CSS pixels.
... If the element's readyState is HTMLMediaElement.HAVE_NOTHING, then the value of this property is 0, because neither video nor poster frame size information is yet available.
...And 7 more matches
HTMLVideoElement.videoWidth - Web APIs
The HTMLVideoElement interface's read-only videoWidth property indicates the intrinsic width of the video, expressed in CSS pixels.
... Syntax: width = htmlVideoElement.videoWidth; Value: an integer value specifying the intrinsic width of the video in CSS pixels.
... If the element's readyState is HTMLMediaElement.HAVE_NOTHING, then the value of this property is 0, because neither video nor poster frame size information is yet available.
...And 7 more matches
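A small sketch of reading the two intrinsic-size properties above; "vid" stands in for an HTMLVideoElement, and any object exposing videoWidth/videoHeight works the same way:

```javascript
// Sketch: read a video's intrinsic size. Both properties are 0 while
// readyState is HAVE_NOTHING (no metadata has been loaded yet).
function intrinsicSize(vid) {
  return { width: vid.videoWidth, height: vid.videoHeight };
}

// In a page you would typically wait for metadata first, e.g.:
// vid.addEventListener('loadedmetadata', () => console.log(intrinsicSize(vid)));
```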
VideoPlaybackQuality.corruptedVideoFrames - Web APIs
The VideoPlaybackQuality interface's read-only corruptedVideoFrames property returns the number of corrupted video frames that have been received since the <video> element was last loaded or reloaded.
... Syntax: corruptedFrameCount = videoPlaybackQuality.corruptedVideoFrames; Value: the number of corrupted video frames that have been received since the <video> element was last loaded or reloaded.
... It is up to the user agent to determine whether or not to display a corrupted video frame.
...And 3 more matches
HTMLVideoElement.getVideoPlaybackQuality() - Web APIs
The HTMLVideoElement method getVideoPlaybackQuality() creates and returns a VideoPlaybackQuality object containing metrics including how many frames have been lost.
... The data returned can be used to evaluate the quality of the video stream.
... Syntax: videoPQ = videoElement.getVideoPlaybackQuality(); Return value: a VideoPlaybackQuality object providing information about the video element's current playback quality.
...And 2 more matches
VideoPlaybackQuality.droppedVideoFrames - Web APIs
The read-only droppedVideoFrames property of the VideoPlaybackQuality interface returns the number of video frames which have been dropped rather than being displayed since the last time the media was loaded into the HTMLVideoElement.
... Syntax: value = videoPlaybackQuality.droppedVideoFrames; Value: an unsigned 64-bit value indicating the number of frames that have been dropped since the last time the media in the <video> element was loaded or reloaded.
... This information can be used to determine whether or not to downgrade the video stream to avoid dropping frames.
...And 2 more matches
VideoPlaybackQuality.totalVideoFrames - Web APIs
The VideoPlaybackQuality interface's totalVideoFrames read-only property returns the total number of video frames that have been displayed or dropped since the media was loaded.
... Syntax: value = videoPlaybackQuality.totalVideoFrames; Value: the total number of frames that the <video> element has displayed or dropped since the media was loaded into it.
... Example: this example calls getVideoPlaybackQuality() to obtain a VideoPlaybackQuality object, then determines what percentage of frames have been lost by either corruption or being dropped.
...And 2 more matches
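The percentage-lost calculation described in these VideoPlaybackQuality entries can be sketched as a small pure function over an object with the three frame counters (any object with those property names works, which is how the browser's quality object is shaped):

```javascript
// Sketch: fraction of frames lost to corruption or dropping, from a
// VideoPlaybackQuality-shaped object. Returns 0 before any frames exist
// to avoid dividing by zero.
function lostFrameRatio(q) {
  if (q.totalVideoFrames === 0) return 0;
  return (q.corruptedVideoFrames + q.droppedVideoFrames) / q.totalVideoFrames;
}

// In a page: lostFrameRatio(videoElement.getVideoPlaybackQuality())
```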
HTMLVideoElement.msInsertVideoEffect() - Web APIs
The HTMLMediaElement.msInsertVideoEffect() method inserts the specified video effect into the media pipeline.
... Syntax: str = htmlMediaElement.msInsertVideoEffect(activatableClassId: DOMString, effectRequired: boolean, config); Parameters: activatableClassId, a DOMString defining the video effects class.
... effectRequired, a boolean which, if set to true, requires a video effect to be defined.
... Example: var oVideo1 = document.getElementById("video1"); oVideo1.msInsertVideoEffect("Windows.Media.VideoEffects.VideoStabilization", true, null); See also: HTMLVideoElement, Microsoft API extensions ...
Web video codec guide - Web media technologies
Due to the sheer size of uncompressed video data, it's necessary to compress it significantly in order to store it, let alone transmit it over a network.
... Imagine the amount of data needed to store uncompressed video: a single frame of high definition (1920x1080) video in full color (4 bytes per pixel) is 8,294,400 bytes.
... At a typical 30 frames per second, each second of HD video would occupy 248,832,000 bytes (~249 MB).
...And 111 more matches
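The arithmetic quoted in that excerpt can be reproduced directly:

```javascript
// Reproduces the codec guide's uncompressed-HD-video arithmetic.
const width = 1920;
const height = 1080;
const bytesPerPixel = 4; // full color, 4 bytes per pixel
const fps = 30;

const frameBytes = width * height * bytesPerPixel; // bytes in one frame
const secondBytes = frameBytes * fps;              // bytes per second of video

console.log(frameBytes);  // 8294400  (8,294,400 bytes per frame)
console.log(secondBytes); // 248832000 (~249 MB per second)
```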
Video and audio content - Learn web development
Now that we are comfortable with adding simple images to a webpage, the next step is to start adding video and audio players to your HTML documents!
... In this article we'll look at doing just that with the <video> and <audio> elements; we'll then finish off by looking at how to add captions/subtitles to your videos.
... Objective: to learn how to embed video and audio content into a webpage, and add captions/subtitles to video.
...And 64 more matches
Creating a cross-browser video player - Developer guides
This article describes a simple HTML5 video player that uses the Media and Fullscreen APIs and works across most major desktop and mobile browsers.
... Working example: our example video player displays a clip from an open-source movie called Tears of Steel, and includes typical video controls.
... The video: first of all, the <video> element is defined, contained within a <figure> element that acts as the video container.
...And 50 more matches
Audio and Video Delivery - Developer guides
We can deliver audio and video on the web in a number of ways, ranging from 'static' media files to adaptive live streams.
... The audio and video elements: whether we are dealing with pre-recorded audio files or live streams, the mechanism for making them available through the browser's <audio> and <video> elements remains pretty much the same.
... To deliver video and audio, the general workflow is usually something like this: check what format the browser supports via feature detection (usually a choice of two, as stated above).
...And 49 more matches
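The feature-detection step mentioned above can be sketched as a helper that picks the first source the element reports it can play; "mediaElement" is anything with a canPlayType(type) method, as <video> and <audio> provide:

```javascript
// Sketch: pick the first playable source by MIME type.
// canPlayType() returns "", "maybe", or "probably"; "" means unsupported.
function pickPlayableSource(mediaElement, sources) {
  return sources.find((s) => mediaElement.canPlayType(s.type) !== "") || null;
}

// In a page (illustrative):
// const src = pickPlayableSource(videoElement, [
//   { src: "clip.webm", type: "video/webm" },
//   { src: "clip.mp4",  type: "video/mp4" },
// ]);
```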
Signaling and video calling - Web APIs
This tutorial will guide you through building a two-way video call.
... WebRTC is a fully peer-to-peer technology for the real-time exchange of audio, video, and data, with one central caveat.
... In this article, we will further enhance the WebSocket chat first created as part of our WebSocket documentation (this article link is forthcoming; it isn't actually online yet) to support opening a two-way video call between users.
...And 44 more matches
Video and Audio APIs - Learn web development
HTML5 comes with elements for embedding rich media in documents, <video> and <audio>, which in turn come with their own APIs for controlling playback, seeking, etc.
... Prerequisites: JavaScript basics (see First steps, Building blocks, JavaScript objects), the basics of client-side APIs. Objective: to learn how to use browser APIs to control video and audio playback.
... HTML5 video and audio: the <video> and <audio> elements allow us to embed video and audio into web pages.
...And 38 more matches
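A common first use of the playback API described in that entry is a play/pause toggle; as a sketch, this works with any object exposing the paused flag and play()/pause() methods:

```javascript
// Sketch: toggle playback using the media element API's paused flag.
function togglePlayback(media) {
  if (media.paused) {
    media.play(); // note: play() returns a Promise in modern browsers
  } else {
    media.pause();
  }
}

// In a page (illustrative):
// playButton.addEventListener('click', () => togglePlayback(videoElement));
```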
Multimedia: video - Learn web development
As we learned in the previous section, media, namely images and video, account for over 70% of the bytes downloaded for the average website.
... This article looks at optimizing video to improve web performance.
... Objective: to learn about the various video formats, their impact on performance, and how to reduce video impact on overall page load time while serving the smallest video file size based on each browser's file type support.
...And 32 more matches
Adding captions and subtitles to HTML5 video - Developer guides
In other articles we looked at how to build a cross-browser video player using the HTMLMediaElement and Window.fullScreen APIs, and also at how to style the player.
... Captioned video example: in this article, we will refer to the video player with captions example.
... HTML5 and video captions: before diving into how to add captions to the video player, there are a number of things that we will first mention, which you should be aware of before we start.
...And 25 more matches
HTMLVideoElement - Web APIs
The HTMLVideoElement interface provides special properties and methods for manipulating video objects.
... You should either provide your video in a single format that all the relevant browsers support, or provide multiple video sources in enough different formats that all the browsers you need to support are covered.
... [SVG inheritance diagram: HTMLVideoElement inherits from HTMLMediaElement] Properties: inherits p...
...And 19 more matches
Audio and video manipulation - Developer guides
Having native audio and video in the browser means we can use these data streams with technologies such as <canvas>, WebGL or the Web Audio API to modify audio and video directly, for example adding reverb/compression effects to audio, or grayscale/sepia filters to video.
... Video manipulation: the ability to read the pixel values from each frame of a video can be very useful.
... Video and canvas: the <canvas> element provides a surface for drawing graphics onto web pages; it is very powerful and can be coupled tightly with video.
...And 19 more matches
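The per-frame pixel manipulation described above can be sketched as a filter over RGBA pixel data, the flat array shape returned by CanvasRenderingContext2D.getImageData().data; a simple channel average is used here, though real grayscale filters often weight channels by luma:

```javascript
// Sketch: in-place grayscale filter over RGBA pixel data.
// Each pixel occupies 4 array slots: red, green, blue, alpha.
function toGrayscale(pixels) {
  for (let i = 0; i < pixels.length; i += 4) {
    const avg = (pixels[i] + pixels[i + 1] + pixels[i + 2]) / 3;
    pixels[i] = pixels[i + 1] = pixels[i + 2] = avg; // alpha untouched
  }
  return pixels;
}

// In a page: draw a video frame to a canvas, getImageData(), run the
// filter over .data, then putImageData() back onto the canvas.
```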
Video player styling basics - Developer guides
In the previous cross-browser video player article we described how to build a cross-browser HTML5 video player using the Media and Fullscreen APIs.
... Preliminary modifications from the original example: this section summarises the modifications that were made to the original video player example to make the styling task easier, before the bulk of the work was started.
... The custom video controls and <progress> element are now contained within <div> elements, rather than residing inside unordered list items.
...And 16 more matches
Manipulating video using canvas - Web APIs
By combining the capabilities of the video element with a canvas, you can manipulate video data in real time to incorporate a variety of visual effects to the video being displayed.
... html> <head> <style> body { background: black; color: #cccccc; } #c2 { background-image: url(media/foo.png); background-repeat: no-repeat; } div { float: left; border: 1px solid #444444; padding: 10px; margin: 10px; background: #3b3b3b; } </style> </head> <body> <div> <video id="video" src="media/video.mp4" controls="true" crossorigin="anonymous"/> </div> <div> <canvas id="c1" width="160" height="96"></canvas> <canvas id="c2" width="160" height="96"></canvas> </div> <script type="text/javascript" src="processor.js"></script> </body> </html> The key bits to take away from this are: this document establishes two canvas elements, with th...
... Canvas c1 is used to display the current frame of the original video, while c2 is used to display the video after performing the chroma-keying effect; c2 is preloaded with the still image that will be used to replace the green background in the video.
...And 13 more matches
DASH Adaptive Streaming for HTML 5 Video - HTML: Hypertext Markup Language
This means that it allows for a video stream to switch between bit rates on the basis of network performance, in order to keep a video playing.
... Browser support: Firefox 21 includes an implementation of DASH for HTML5 WebM video which is turned off by default.
... Firefox 23 removed support for DASH for HTML5 WebM video.
...And 13 more matches
DisplayMediaStreamConstraints.video - Web APIs
The DisplayMediaStreamConstraints dictionary's video property is used to configure the video track in the stream returned by getDisplayMedia().
... Since a video track must always be included, a value of false results in a TypeError exception being thrown.
... More precise control over the format of the returned video track may be exercised by instead providing a MediaTrackConstraints object, which is used to process the video data after obtaining it from the device but prior to adding it to the stream.
...And 9 more matches
MediaStreamConstraints.video - Web APIs
The MediaStreamConstraints dictionary's video property is used to indicate what kind of video track, if any, should be included in the MediaStream returned by a call to getUserMedia().
... Syntax: var videoConstraints = true | false | MediaTrackConstraints; Value: the value of the video property can be specified as either of two types. Boolean: if a boolean value is specified, it simply indicates whether or not a video track should be included in the returned stream; if it's true, a video track is included; if no video source is available or if permission is not given to use the video source, the call to getUserMedia() will fail.
... If false, no video track is included.
...And 9 more matches
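A sketch of the MediaTrackConstraints form of the video property described above; the resolution values here are purely illustrative:

```javascript
// Sketch: a MediaStreamConstraints object requesting a video-only stream
// with a preferred (not required) resolution.
const constraints = {
  audio: false,
  video: { width: { ideal: 1280 }, height: { ideal: 720 } },
};

// In a page, inside an async function:
// const stream = await navigator.mediaDevices.getUserMedia(constraints);
```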
VideoTrackList - Web APIs
The VideoTrackList interface is used to represent a list of the video tracks contained within a <video> element, with each track represented by a separate VideoTrack object in the list.
... Retrieve an instance of this object with HTMLMediaElement.videoTracks.
... Event handlers: onaddtrack, an event handler to be called when the addtrack event is fired, indicating that a new video track has been added to the media element.
...And 9 more matches
HTMLVideoElement.msZoom - Web APIs
msZoom is a read/write property which gets or sets whether the video frame is trimmed, on the top and bottom or left and right, to fit the video display.
... Syntax: htmlVideoElement.msZoom; Value: a boolean value; set to true, trims the video frame to the display space.
... Set to false, the video frame uses letterbox or pillarbox to display video.
...And 8 more matches
Live streaming web audio and video - Developer guides
Streaming audio and video on demand: streaming technology is not used exclusively for live streams.
... It can also be used instead of the traditional progressive download method for audio and video on demand. There are several advantages to this: latency is generally lower, so media will start playing more quickly; adaptive streaming makes for better experiences on a variety of devices; media is downloaded just in time, which makes bandwidth usage more efficient. Streaming protocols: while static media is usually served over HTTP, there are several protocols for serving adaptive streams; let's take a look at the options.
... Important: although the <audio> and <video> tags are protocol agnostic, no browser currently supports anything other than HTTP without requiring plugins, although this looks set to change.
...And 8 more matches
VideoTrack.kind - Web APIs
The kind property contains a string indicating the category of video contained in the VideoTrack.
... See video track kind strings for a list of the kinds available for video tracks.
... Syntax: var trackKind = videoTrack.kind; Value: a DOMString specifying the type of content the media represents.
...And 7 more matches
VideoTrack - Web APIs
The VideoTrack interface represents a single video track from a <video> element.
... The most common use for accessing a VideoTrack object is to toggle its selected property in order to make it the active video track for its <video> element.
... Properties: selected, a boolean value which controls whether or not the video track is active.
...And 7 more matches
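The selected-property toggling described above can be sketched as a helper that activates one track by index. A real VideoTrackList deselects the previously active track automatically when selected is set, but clearing the flags explicitly, as below, also works on plain arrays of track-shaped objects:

```javascript
// Sketch: make exactly one track in a list active.
function selectOnly(tracks, index) {
  for (let i = 0; i < tracks.length; i++) {
    tracks[i].selected = i === index;
  }
  return tracks[index];
}

// In a page: selectOnly(videoElement.videoTracks, 1)
// (indexing a VideoTrackList works like indexing an array)
```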
MediaStream.getVideoTracks() - Web APIs
The getVideoTracks() method of the MediaStream interface returns a sequence of MediaStreamTrack objects representing the video tracks in this stream.
... Syntax: var mediaStreamTracks[] = mediaStream.getVideoTracks(); Parameters: none.
... Return value: an array of MediaStreamTrack objects, one for each video track contained in the media stream.
...And 6 more matches
VideoConfiguration - Web APIs
The VideoConfiguration dictionary of the Media Capabilities API is used to define the video file being tested when calling the MediaCapabilities methods encodingInfo() and decodingInfo() to determine whether or not the described video configuration is supported, and how smoothly and how power-efficiently it can be handled.
... Properties: the VideoConfiguration dictionary is made up of five video properties, including: contentType, a valid video MIME type.
... See our web video codec guide for types which may be supported.
...And 6 more matches
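A sketch of such a dictionary with its five properties; the codec string and numbers are illustrative and would need to match your actual media:

```javascript
// Sketch: a VideoConfiguration dictionary for MediaCapabilities queries.
const videoConfig = {
  contentType: 'video/webm; codecs="vp9"', // a valid video MIME type
  width: 1920,
  height: 1080,
  bitrate: 2000000,  // bits per second
  framerate: 30,
};

// In a page, inside an async function:
// const info = await navigator.mediaCapabilities.decodingInfo({
//   type: "file",
//   video: videoConfig,
// });
// info.supported / info.smooth / info.powerEfficient
```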
VideoPlaybackQuality - Web APIs
A VideoPlaybackQuality object is returned by the HTMLVideoElement.getVideoPlaybackQuality() method and contains metrics that can be used to determine the playback quality of a video.
... Properties: the VideoPlaybackQuality interface doesn't inherit properties from any other interfaces.
... droppedVideoFrames (read only): an unsigned long giving the number of video frames dropped since the creation of the associated HTMLVideoElement.
...And 5 more matches
Web Video Text Tracks Format (WebVTT) - Web APIs
Web Video Text Tracks Format (WebVTT) is a format for displaying timed text tracks (such as subtitles or captions) using the <track> element.
... The primary purpose of WebVTT files is to add text overlays to a <video>.
... Within site CSS: video::cue { background-image: linear-gradient(to bottom, dimgray, lightgray); color: papayawhip; } video::cue(b) { color: peachpuff; } Here, all video elements are styled to use a gray linear gradient as their backgrounds, with a foreground color of "papayawhip".
...And 5 more matches
HTMLVideoElement.msIsLayoutOptimalForPlayback - Web APIs
msIsLayoutOptimalForPlayback is a read-only property which indicates whether the video can be rendered more efficiently.
... Syntax: htmlVideoElement.msIsLayoutOptimalForPlayback: DOMString; Value: a boolean value; set to true indicates that video is being rendered optimally (better performance and using less battery power).
... For msIsLayoutOptimalForPlayback to be true, avoid the following: video elements with Cascading Style Sheets (CSS) outlines set.
...And 4 more matches
VideoTrackList.length - Web APIs
The read-only VideoTrackList property length returns the number of entries in the VideoTrackList, each of which is a VideoTrack representing one video track in the media element.
... A value of 0 indicates that there are no video tracks in the media.
... Syntax: var trackCount = videoTrackList.length; Value: a number indicating how many video tracks are included in the VideoTrackList.
...And 4 more matches
VideoTrackList.onaddtrack - Web APIs
The VideoTrackList property onaddtrack is an event handler which is called when the addtrack event occurs, indicating that a new video track has been added to the media element whose video tracks the VideoTrackList represents.
... Syntax: videoTrackList.onaddtrack = eventHandler; Value: set onaddtrack to a function that accepts as input a TrackEvent object which indicates in its track property which video track has been added to the media.
... Usage notes: the addtrack event is fired whenever a new track is added to the media element whose video tracks are represented by the VideoTrackList object.
...And 4 more matches
HTMLMediaElement.videoTracks - Web APIs
The read-only videoTracks property on HTMLMediaElement objects returns a VideoTrackList object listing all of the VideoTrack objects representing the media element's video tracks.
... Once you have a reference to the list, you can monitor it for changes to detect when new video tracks are added or existing ones removed.
... See Event handlers in VideoTrackList to learn more about watching for changes to a media element's track list.
...And 3 more matches
VideoTrackList.onremovetrack - Web APIs
The VideoTrackList onremovetrack event handler is called when the removetrack event occurs, indicating that a video track has been removed from the media element, and therefore also from the VideoTrackList.
... The event is passed into the event handler in the form of a TrackEvent object, whose track property identifies the track that was removed from the media element's VideoTrackList.
... Syntax: videoTrackList.onremovetrack = eventHandler; Value: set onremovetrack to a function that accepts as input a TrackEvent object which indicates in its track property which video track has been removed from the media element.
...And 3 more matches
Digital video concepts - Web media technologies
In this article, we explore important concepts that are useful to understand in order to fully grasp how to work with video on the web.
... Encoding color: representing the colors in an image or video requires several values for each pixel.
... There are several color models, and video codecs make use of one or more of these to represent their pixels during the encoding process as well as after decoding the video frames.
...And 3 more matches
VideoTrack.label - Web APIs
The read-only VideoTrack property label returns a string specifying the video track's human-readable label, if one is available; otherwise, it returns an empty string.
... Syntax: var videoTrackLabel = videoTrack.label; Value: a DOMString specifying the track's human-readable label, if one is available in the track metadata.
... Example: this example returns an array of track kinds and labels for potential use in a user interface to select video tracks for a specified media element.
...And 2 more matches
VideoTrackList.onchange - Web APIs
The VideoTrackList property onchange is an event handler which is called when the change event occurs, indicating that a VideoTrack in the VideoTrackList has been made active.
... To determine the new state of the media's tracks, you'll have to look at their VideoTrack.selected flags.
... Syntax: videoTrackList.onchange = eventHandler; Value: set onchange to a function that should be called whenever a track is made active.
...And 2 more matches
Media type and format guide: image, audio, and video content - Web media technologies
Originally, these capabilities were limited, and were expanded organically, with different browsers finding their own solutions to the problems around including still and video imagery on the web.
... Web video codec guide: this article provides basic information about the video codecs supported by the major browsers, as well as some that are not commonly supported but that you might still run into.
... Codecs used by WebRTC: WebRTC doesn't use a container, but instead streams the encoded media itself from peer to peer using MediaStreamTrack objects to represent each audio or video track.
...And 2 more matches
HTMLVideoElement.msHorizontalMirror - Web APIs
msHorizontalMirror is a read/write property which gets or sets whether a video element is flipped horizontally in the display.
... Syntax: htmlVideoElement.msHorizontalMirror: boolean; Value: a boolean value; set to true flips the video playback horizontally.
... Video perspective is flipped on a horizontal axis - this may be useful for playback of a webcam video, providing the user with better mirroring of their real behaviors (ie.
... Example: var myVideo = document.getElementById("videoTag1"); myVideo.msHorizontalMirror = true; myVideo.play(); Example #2: var flip = document.querySelector('#flip'); flip.addEventListener('click', function() { video.msHorizontalMirror = true; }); See also: HTMLVideoElement, Microsoft API extensions ...
VideoPlaybackQuality.creationTime - Web APIs
The read-only creationTime property on the VideoPlaybackQuality interface reports the number of milliseconds that elapsed between the creation of the browsing context and the recording of this quality sample.
... Syntax: value = videoPlaybackQuality.creationTime; Value: a DOMHighResTimeStamp object which indicates the number of milliseconds that elapsed between the time the browsing context was created and the time at which this sample of the video quality was obtained.
... Example: this example calls getVideoPlaybackQuality() to obtain a VideoPlaybackQuality object, then determines what percentage of frames have been lost by either corruption or being dropped.
... var videoElem = document.getElementById("my_vid"); var quality = videoElem.getVideoPlaybackQuality(); if ((quality.corruptedVideoFrames + quality.droppedVideoFrames) / quality.totalVideoFrames > 0.1) { lostFramesThresholdExceeded(); } Specifications: Media Playback Quality, the definition of 'VideoPlaybackQuality.corruptedVideoFrames' in that specification.
VideoTrack.id - Web APIs
The id property contains a string which uniquely identifies the track represented by the VideoTrack.
... This id can be used with the VideoTrackList.getTrackById() method to locate a specific track within the media associated with a media element.
... Syntax: var trackID = videoTrack.id; Value: a DOMString which identifies the track, suitable for use when calling getTrackById() on a VideoTrackList such as the one specified by a media element's videoTracks property.
... Specifications: HTML Living Standard, the definition of 'VideoTrack: id' in that specification.
VideoTrack.language - Web APIs
The read-only VideoTrack property language returns a string identifying the language used in the video track.
... For tracks that include multiple languages (such as a movie in English in which a few lines are spoken in other languages), this should be the video's primary language.
... Syntax: var videoTrackLanguage = videoTrack.language; Value: a DOMString specifying the BCP 47 (RFC 5646) format language tag of the primary language used in the video track, or an empty string ("") if the language is not specified or known, or if the track doesn't contain speech.
... Specifications: HTML Living Standard, the definition of 'VideoTrack: language' in that specification.
VideoTrack.selected - Web APIs
The VideoTrack property selected controls whether or not a particular video track is active.
... Syntax: isVideoSelected = videoTrack.selected; videoTrack.selected = true | false; Value: the selected property is a boolean whose value is true if the track is active.
... Only a single video track can be active at any given time, so setting this property to true for one track while another track is active will make that other track inactive.
... Specifications: HTML Living Standard, the definition of 'VideoTrack: selected' in that specification.
msSetVideoRectangle - Web APIs
The HTMLVideoElement.msSetVideoRectangle() method sets the dimensions of a sub-rectangle within a video.
... Syntax: htmlVideoElement.msSetVideoRectangle(); Parameters: left, a number representing the left-side position.
... Example: htmlVideoElement.msSetVideoRectangle(left: 2, top: 0, right: 4, bottom: 4); See also: HTMLVideoElement, Microsoft API extensions ...
MediaRecorder.videoBitsPerSecond - Web APIs
The videoBitsPerSecond read-only property of the MediaRecorder interface returns the video encoding bit rate in use.
... Syntax: var videoBitsPerSecond = mediaRecorder.videoBitsPerSecond; Value: a number (unsigned long).
... Example: // TBD. Specifications: MediaStream Recording, the definition of 'videoBitsPerSecond' in that specification.
SourceBuffer.videoTracks - Web APIs
The videoTracks read-only property of the SourceBuffer interface returns a list of the video tracks currently contained inside the SourceBuffer.
... Syntax: var myVideoTracks = sourceBuffer.videoTracks; Value: a VideoTrackList object.
... Example: TBD. Specifications: Media Source Extensions, the definition of 'videoTracks' in that specification.
VideoTrack.sourceBuffer - Web APIs
The read-only VideoTrack property sourceBuffer returns the SourceBuffer that created the track, or null if the track was not created by a SourceBuffer or the SourceBuffer has been removed from the MediaSource.sourceBuffers attribute of its parent media source.
... Syntax: var sourceBuffer = videoTrack.sourceBuffer; Value: a SourceBuffer or null.
... Specifications: Media Source Extensions, the definition of 'VideoTrack: sourceBuffer' in that specification.
VideoTrackList: change event - Web APIs
The change event is fired when a video track is made active or inactive, for example by changing the track's selected property.
... Bubbles: no; Cancelable: no; Interface: Event; Event handler property: onchange. Examples: Using addEventListener(): const videoElement = document.querySelector('video'); videoElement.videoTracks.addEventListener('change', (event) => { console.log(`'${event.type}' event fired`); }); // changing the value of `selected` will trigger the `change` event const toggleTrackButton = document.querySelector('.toggle-track'); toggleTrackButton.addEventListener('click', () => { const track = videoElement.videoTracks[0]; track.selected = !track.selected; }); Using the onchange event handler property: const videoElement = document.querySelector('video'); videoElement.videoTracks.onchange = (event) => { console.log(`'${event.type}' event fired`...); }; // changing the value of `selected` will trigger the `change` event const toggleTrackButton = document.querySelector('.toggle-track'); toggleTrackButton.addEventListener('click', () => { const track = videoElement.videoTracks[0]; track.selected = !track.selected; }); Specifications: HTML Living Standard, the definition of 'change' in that specification.
VideoTrackList.selectedIndex - Web APIs
The read-only VideoTrackList property selectedIndex returns the index of the currently selected track, if any, or -1 otherwise.
... Syntax: var index = videoTrackList.selectedIndex; Value: a number indicating the index of the currently selected track, if any, or -1 otherwise.
... Specifications: HTML Living Standard, the definition of 'VideoTrackList: selectedIndex' in that specification.
Video presentations - Archive of obsolete content
Mozilla is actively working to produce video presentations that can help you learn how the Mozilla codebase works and how to take advantage of its technology in your own applications and extensions.
... (as QuickTime; 105 MB and 34 MB) Other videos (might be obsolete): Mozilla video presentations (2001-2002), SeaMonkey brownbag training series (2000) ...
Anatomy of a video game - Game development
This article looks at the anatomy and workflow of the average video game from a technical point of view, in terms of how the main loop should run.
... Present, accept, interpret, calculate, repeat: the goal of every video game is to present the user(s) with a situation, accept their input, interpret those signals into actions, and calculate a new situation resulting from those acts.
HTMLVideoElement.msFrameStep() - Web APIs
The HTMLVideoElement.msFrameStep() method steps the video by one frame forward or one frame backward.
... Syntax: htmlVideoElement.msFrameStep(forward); Parameters: forward, a boolean which, if set to true, steps the video forward by one frame; if false, steps the video backwards by one frame.
HTMLVideoElement.msIsStereo3D - Web APIs
msIsStereo3D is a read-only property which determines whether the system considers the loaded video source to be stereo 3-D or not.
... Syntax: HTMLVideoElement.msIsStereo3D: boolean; Value: a Boolean value; true indicates that the video source is stereo 3-D.
onMSVideoFormatChanged - Web APIs
onMSVideoFormatChanged is an event which occurs when the video format changes.
... Syntax: event property: object.onMSVideoFormatChanged = handler; attachEvent method: object.attachEvent("onMSVideoFormatChanged", handler); addEventListener method: object.addEventListener("", handler, useCapture). Event handler parameters: val [in], type=function. See also: HTMLVideoElement, Microsoft API extensions ...
onMSVideoFrameStepCompleted - Web APIs
onMSVideoFrameStepCompleted is an event which occurs when the video frame has been stepped forward or backward one frame.
... Syntax: event property: object.onMSVideoFrameStepCompleted = handler; attachEvent method: object.attachEvent("onMSVideoFrameStepCompleted", handler); addEventListener method: object.addEventListener("", handler, useCapture). Event handler parameters: val [in], type=function. See also: HTMLVideoElement, Microsoft API extensions ...
onMSVideoOptimalLayoutChanged - Web APIs
onMSVideoOptimalLayoutChanged is an event which occurs when the msIsLayoutOptimalForPlayback state changes.
... Syntax: event property: object.onMSVideoOptimalLayoutChanged = handler; attachEvent method: object.attachEvent("onMSVideoOptimalLayoutChanged", handler); addEventListener method: object.addEventListener("", handler, useCapture). Synchronous: no; bubbles: no; cancelable: no. See also: msIsLayoutOptimalForPlayback, HTMLVideoElement, Microsoft API extensions ...
videoCapabilities - Web APIs
The MediaKeySystemConfiguration.videoCapabilities read-only property returns an array of supported video type and capability pairs.
... Syntax: var videoCapabilities[{contentType: 'contentType', robustness: 'robustness'}] = mediaSystemConfiguration.videoCapabilities; Specifications: Encrypted Media Extensions, the definition of 'videoCapabilities' in that specification.
VideoPlaybackQuality.totalFrameDelay - Web APIs
The VideoPlaybackQuality.totalFrameDelay read-only property returns a double containing the sum of the frame delays since the creation of the associated HTMLVideoElement.
... Syntax: value = videoPlaybackQuality.totalFrameDelay; Example: var videoElt = document.getElementById('my_vid'); var quality = videoElt.getVideoPlaybackQuality(); alert(quality.totalFrameDelay); ...
VideoTrackList: addtrack event - Web APIs
The addtrack event is fired when a track is added to a VideoTrackList.
... Bubbles: no; cancelable: no; interface: TrackEvent; event handler property: onaddtrack. Examples using addEventListener(): const videoElement = document.querySelector('video'); videoElement.videoTracks.addEventListener('addtrack', (event) => { console.log(`Video track: ${event.track.label} added`); }); Using the onaddtrack event handler property: const videoElement = document.querySelector('video'); videoElement.videoTracks.onaddtrack = (event) => { console.log(`Video track: ${event.track.label} added`); }; Specifications: HTML Living Standard, the definition of 'addtrack' in that specification.
VideoTrackList: removetrack event - Web APIs
The removetrack event is fired when a track is removed from a VideoTrackList.
... Bubbles: no; cancelable: no; interface: TrackEvent; event handler property: onremovetrack. Examples using addEventListener(): const videoElement = document.querySelector('video'); videoElement.videoTracks.addEventListener('removetrack', (event) => { console.log(`Video track: ${event.track.label} removed`); }); Using the onremovetrack event handler property: const videoElement = document.querySelector('video'); videoElement.videoTracks.onremovetrack = (event) => { console.log(`Video track: ${event.track.label} removed`); }; Specifications: HTML Living Standard, the definition of 'removetrack' in that specification.
Video textures - Web APIs
« Previous. This example demonstrates how to use video files as textures for WebGL surfaces.
... Textures from video « Previous ...
Using audio and video in HTML - Web media technologies
The HTML <audio> and <video> elements let you embed audio and video content into a web page.
... We don't have a particularly good guide to using these objects offscreen at this time, although Audio and video manipulation may be a good start.
Guide to streaming audio and video - Web media technologies
In this guide, we'll examine the techniques used to stream audio and/or video media on the web, and how you can optimize your code, your media, your server, and the options you use while performing the streaming to bring out the best quality and performance possible.
... For example, HLS lets the server stream a video with multiple audio streams which the user can choose from, in order to hear their own language.
Index - Web APIs
121 AudioConfiguration (API, Audio, AudioConfiguration, Experimental, Interface, Media Capabilities API, Reference, Video): The AudioConfiguration dictionary of the Media Capabilities API defines the audio file being tested when calling MediaCapabilities.encodingInfo() or MediaCapabilities.decodingInfo() to query whether a specific audio configuration is supported, smooth, and/or power efficient.
... 185 AudioScheduledSourceNode: ended event (Audio, HTML DOM, HTMLMediaElement, Media, Media Streams API, Reference, Video, Web Audio API, ended, events): The ended event of the AudioScheduledSourceNode interface is fired when the source node has stopped playing.
... 186 AudioTrack (Audio, AudioTrack, HTML, HTML DOM, Interface, Media, Reference, track): The AudioTrack interface represents a single audio track from one of the HTML media elements, <audio> or <video>.
...And 297 more matches
Media container formats (file types) - Web media technologies
The format of audio and video media files is defined in two parts (three if a file has both audio and video in it, of course): the audio and/or video codecs used and the media container format (or file type) used.
... Instead, it streams the encoded audio and video tracks directly from one peer to another, using MediaStreamTrack objects to represent each track.
... Some support only audio while others support both audio and video.
...And 35 more matches
Codecs used by WebRTC - Web media technologies
The WebRTC API makes it possible to construct web sites and apps that let users communicate in real time, using audio and/or video as well as optional data and other information.
... However, RFC 7742 specifies that all WebRTC-compatible browsers must support VP8 and H.264's Constrained Baseline profile for video, and RFC 7874 specifies that browsers must support at least the Opus codec as well as G.711's PCMA and PCMU formats.
... While compression is always a necessity when dealing with media on the web, it's of additional importance when videoconferencing, in order to ensure that the participants are able to communicate without lag or interruptions.
...And 35 more matches
The "codecs" parameter in common media types - Web media technologies
At a fundamental level, you can specify the type of a media file using a simple MIME type, such as video/mp4 or audio/mpeg.
... However, many media types, especially those that support video tracks, can benefit from the ability to more precisely describe the format of the data within them.
... For instance, just describing a video in an MPEG-4 file with the MIME type video/mp4 doesn't say anything about what format the actual media within takes.
...And 34 more matches
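The "codecs" parameter described above can be sketched with a small helper; `withCodecs` is a hypothetical function for illustration, not a web API.

```javascript
// Hypothetical helper that appends a "codecs" parameter to a base MIME
// type, producing the more precise form described above.
function withCodecs(baseType, codecs) {
  return `${baseType}; codecs="${codecs.join(', ')}"`;
}

// An MP4 container carrying H.264 (AVC) video and AAC audio:
const fullType = withCodecs('video/mp4', ['avc1.42E01E', 'mp4a.40.2']);
console.log(fullType); // video/mp4; codecs="avc1.42E01E, mp4a.40.2"
```

A string like this is what you would pass to, for example, MediaSource.isTypeSupported() when probing playback support.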
Capabilities, constraints, and settings - Web APIs
The constraint exerciser lets you experiment with the results of different constraint sets being applied to the audio and video tracks coming from the computer's A/V input devices (such as its webcam and microphone).
... For example: let constraints = { width: 1920, height: 1080, aspectRatio: 1.777777778 }; myTrack.applyConstraints(constraints); In this case, the constraints indicate that any values are fine for nearly all properties, but that a standard high definition (HD) video size is desired, with the standard 16:9 aspect ratio.
... Applying constraints: the first and most common way to use constraints is to specify them when you call getUserMedia(): navigator.mediaDevices.getUserMedia({ video: { width: { min: 640, ideal: 1920 }, height: { min: 400, ideal: 1080 }, aspectRatio: { ideal: 1.7777777778 } }, audio: { sampleSize: 16, channelCount: 2 } }).then(stream => { videoElement.srcObject = stream; }).catch(handleError); In this example, constraints are applied at getUserMedia() time, asking for an ideal set of options with fallbacks for the video.
...And 32 more matches
Client-side storage - Learn web development
This can be used for things from complete sets of customer records to even complex data types like audio or video files.
... You can store videos, images, and pretty much anything else in an IndexedDB instance.
... You can store just about anything you want, including complex objects such as video or image blobs.
...And 28 more matches
Accessible multimedia - Learn web development
Previous | Overview: Accessibility | Next. Another category of content that can create accessibility problems is multimedia: video, audio, and image content need to be given proper textual alternatives so they can be understood by assistive technologies and their users.
... Images, videos, <canvas> elements, Flash movies, etc., aren't as easily understood by screen readers or navigated by the keyboard, and we need to give them a helping hand.
... For example: <img src="dinosaur.png" alt="A red Tyrannosaurus rex: a two legged dinosaur standing upright like a human, with small arms, and a large head with lots of sharp teeth."> Accessible audio and video controls: implementing controls for web-based audio/video shouldn't be a problem, right?
...And 24 more matches
Taking still photos with WebRTC - Web APIs
The first panel on the left contains two components: a <video> element, which will receive the stream from WebRTC, and a <button> the user clicks to capture a video frame.
... <div class="camera"> <video id="video">Video stream not available.</video> <button id="startbutton">Take photo</button> </div> This is straightforward, and we'll see how it ties together when we get into the JavaScript code.
... (function() { var width = 320; // we will scale the photo width to this var height = 0; // this will be computed based on the input stream var streaming = false; var video = null; var canvas = null; var photo = null; var startbutton = null; Those variables are: width: whatever size the incoming video is, we're going to scale the resulting image to be 320 pixels wide.
...And 18 more matches
Using the Media Capabilities API - Web APIs
You can, therefore, test for the presence of the API like so: if ("mediaCapabilities" in navigator) { // mediaCapabilities is available } else { // mediaCapabilities is not available } Taking video as an example, to obtain information about video decoding abilities, you create a video decoding configuration which you pass as a parameter to the MediaCapabilities.decodingInfo() method.
... This returns a promise that fulfills with information about the media capabilities: whether the video can be decoded, and whether decoding will be smooth and power efficient.
... You can also test audio decoding, as well as video and audio encoding.
...And 17 more matches
Using the Screen Capture API - Web APIs
For example, if you specify a width constraint for the video, it's applied by scaling the video after the user selects the area to share.
... Capturing shared audio: getDisplayMedia() is most commonly used to capture video of a user's screen (or parts thereof).
... However, user agents may allow the capture of audio along with the video content.
...And 17 more matches
Perceivable - Accessibility
Non-text content refers to multimedia such as images, audio, and video.
... Multimedia content (i.e., audio or video) should at least have a descriptive identification available, such as a caption or similar.
... See Text alternatives for static caption options, and Audio transcripts, Video text tracks, and Other multimedia content for other alternatives.
...And 17 more matches
Recording a media element - Web APIs
While the article Using the MediaStream Recording API demonstrates using the MediaRecorder interface to capture a MediaStream generated by a hardware device, as returned by navigator.mediaDevices.getUserMedia(), you can also use an HTML media element (namely <audio> or <video>) as the source of the MediaStream to be recorded.
... HTML content: <p>Click the "Start" button to begin video recording for a few seconds.
... You can stop the video by clicking the creatively-named "Stop" button.
...And 16 more matches
Animating textures in WebGL - Web APIs
« Previous. In this demonstration, we build upon the previous example by replacing our static textures with the frames of an MP4 video file that's playing.
... Getting access to the video: the first step is to create the <video> element that we'll use to retrieve the video frames: // will set to true when video can be copied to texture var copyVideo = false; function setupVideo(url) { const video = document.createElement('video'); var playing = false; var timeupdate = false; video.autoplay = true; video.muted = true; video.loop = true; // waiting for these 2 events ensures // there is data in the video video.addEventListener('playing', function() { playing = true; checkReady(); }, true); video.addEventListener('timeupdate', function() { timeupdate = true; checkReady(); }, true); video.src = url; video.play(); function checkReady() { if (playing && timeupdate) { copyVideo = true; } } return video; } First we create a video element.
...And 15 more matches
From object to iframe — other embedding technologies - Learn web development
Previous | Overview: Multimedia and embedding | Next. By now you should really be getting the hang of embedding things into your web pages, including images, video and audio.
... A little while later (late 90s, early 2000s), plugin technologies became very popular, such as Java applets and Flash; these allowed web developers to embed rich content into webpages such as videos and animations, which just weren't available through HTML alone.
... Finally, the <iframe> element appeared (along with other ways of embedding content, such as <canvas>, <video>, etc.). This provides a way to embed an entire web document inside another one, as if it were an <img> or other such element, and is used regularly today.
...And 14 more matches
Introduction to events - Learn web development
A video is played, paused, or finishes.
... It makes sense to use onplay only on specific elements, such as <video>.
... The MediaRecorder API, for example, has a dataavailable event, which fires when some audio or video has been recorded and is available for doing something with (for example, saving it, or playing it back).
...And 14 more matches
MediaDevices.getUserMedia() - Web APIs
That stream can include, for example, a video track (produced by either a hardware or virtual video source such as a camera, video recording device, screen sharing service, and so forth), an audio track (similarly, produced by a physical or virtual audio source like a microphone, A/D converter, or the like), and possibly other track types.
... The constraints parameter is a MediaStreamConstraints object with two members: video and audio, describing the media types requested.
... The following requests both audio and video without any specific requirements: { audio: true, video: true } If true is specified for a media type, the resulting stream is required to have that type of track in it.
...And 14 more matches
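The MediaStreamConstraints shape described above can be written out as a plain object. getUserMedia() itself is browser-only, so this sketch only illustrates the structure of the constraints argument; the min/ideal values are example numbers, not requirements.

```javascript
// A MediaStreamConstraints object: `audio: true` means "any audio track",
// while the `video` member requests ranges and ideal values per property.
const constraints = {
  audio: true,
  video: {
    width: { min: 640, ideal: 1920 },
    height: { min: 400, ideal: 1080 },
  },
};

// In a browser, this object would be passed as:
//   navigator.mediaDevices.getUserMedia(constraints)
//     .then((stream) => { /* attach stream to a <video> element */ })
//     .catch((err) => { /* handle rejection, e.g. NotAllowedError */ });
console.log(constraints.video.width.ideal); // 1920
```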
Index - Learn web development
3 Accessible multimedia (Accessibility, Article, Audio, Beginner, CodingScripting, HTML, Images, JavaScript, Learn, Multimedia, Video, captions, subtitles, text tracks): This chapter has provided a summary of accessibility concerns for multimedia content, along with some practical solutions.
... Every time a web page does more than just sit there and display static information for you to look at (displaying timely content updates, interactive maps, animated 2D/3D graphics, scrolling video jukeboxes, or more) you can bet that JavaScript is probably involved.
... 63 Video and audio APIs (API, Article, Audio, Beginner, CodingScripting, Guide, JavaScript, Learn, Video): I think we've taught you enough in this article.
...And 13 more matches
MIME types (IANA media types) - HTTP
No whitespace is allowed in a MIME type: type/subtype. The type represents the general category into which the data type falls, such as video or text.
... Discrete types are types which represent a single file or medium, such as a single text or music file, or a single video.
... video (list at IANA): video data or files, such as MP4 movies (video/mp4).
...And 13 more matches
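The type/subtype structure described above can be sketched with a minimal splitter. Real MIME parsing (parameters, whitespace rules) is more involved; `parseMimeType` is a hypothetical helper covering only the two mandatory parts.

```javascript
// Split a MIME type into its general category (type) and specific
// format (subtype), per the type/subtype structure described above.
function parseMimeType(mimeType) {
  const [type, subtype] = mimeType.split('/');
  return { type, subtype };
}

console.log(parseMimeType('video/mp4'));  // { type: 'video', subtype: 'mp4' }
console.log(parseMimeType('audio/mpeg')); // { type: 'audio', subtype: 'mpeg' }
```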
Configuring servers for Ogg media - HTTP
HTML <audio> and <video> elements allow media presentation without the need for the user to install any plug-ins or other software to do so.
... Serve media with the correct MIME type: *.ogg and *.ogv files containing video (possibly with an audio track as well, of course) should be served with the video/ogg MIME type.
... If you don't know whether the Ogg file contains audio or video, you can serve it with the MIME type application/ogg, and the browser will treat it as a video file.
...And 13 more matches
Index - MDN Web Docs Glossary: Definitions of Web-related terms
47 CMS (CMS, Composing, Content management system, Glossary): A CMS (content management system) is software that allows users to publish, organize, change, or remove various kinds of content, not only text but also embedded images, video, audio, and interactive code.
... The most common examples of continuous media are audio and motion video.
... 213 ICE (CodingScripting, Glossary, Networking, Protocols, WebRTC): ICE (Interactive Connectivity Establishment) is a framework used by WebRTC (among other technologies) for connecting two peers to each other, regardless of network topology (usually for audio and/or video chat).
...And 12 more matches
Autoplay guide for media and Web Audio APIs - Web media technologies
Automatically starting the playback of audio (or videos with audio tracks) immediately upon page load can be an unwelcome surprise to users.
... Autoplay blocking is not applied to <video> elements when the source media does not have an audio track, or if the audio track is muted.
... That means that both of the following are considered autoplay behavior, and are therefore subject to the browser's autoplay blocking policy: <audio src="/music.mp4" autoplay> and audioElement.play(); The following web features and APIs may be affected by autoplay blocking: the HTML <audio> and <video> elements, and the Web Audio API. From the user's perspective, a web page or app that spontaneously starts making noise without warning can be jarring, inconvenient, or off-putting.
...And 12 more matches
Web media technologies
Over the years, the web's ability to present, create, and manage audio, video, and other media has grown at an increasing pace.
... <video>: The <video> element is an endpoint for video content in a web context.
... It can be used to simply present video files, or as a destination for streamed video content.
...And 12 more matches
The building blocks of responsive design - Progressive web apps (PWAs)
We've written a simple-but-fun prototype for an application called Snapshot, which takes a video stream from your webcam (using getUserMedia()), then allows you to capture stills from that video stream (using HTML5 <canvas>), and save them to a gallery.
... For a start, let's have a look at what happens when we include the <video> and <img> elements inside our first two columns, naked and unstyled.
... This is pretty horrible, but generally this kind of problem is easily fixed with some simple CSS: img, video { max-width: 100%; } This tells the replaced elements to remain constrained inside their container's widths, no matter what.
...And 11 more matches
MediaTrackSettings - Web APIs
Properties of video tracks: aspectRatio, a double-precision floating point value indicating the current value of the aspectRatio property, specified precisely to 10 decimal places.
... The value will be one of: "user", a camera facing the user (commonly known as a "selfie cam"), used for self-portraiture and video calling.
... frameRate, a double-precision floating point value indicating the current value of the frameRate property, specifying how many frames of video per second the track includes.
...And 10 more matches
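The aspectRatio settings value described above (a double specified to 10 decimal places) can be illustrated by computing it from a width and height; `aspectRatio` is a hypothetical helper, not part of the MediaTrackSettings API.

```javascript
// Compute an aspect ratio rounded to 10 decimal places, matching the
// precision the aspectRatio setting is described with above.
function aspectRatio(width, height) {
  return Number((width / height).toFixed(10));
}

console.log(aspectRatio(1920, 1080)); // 1.7777777778 (standard 16:9)
console.log(aspectRatio(640, 480));   // 1.3333333333 (4:3)
```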
RTCPeerConnection.addTrack() - Web APIs
See Starting negotiation in Signaling and video calling for details.
... For example, if all you're sharing with the remote peer is a single stream with an audio track and a video track, you don't need to deal with managing what track is in what stream, so you might as well just let the transceiver handle it for you.
... Here's an example showing a function that uses getUserMedia() to obtain a stream from a user's camera and microphone, then adds each track from the stream to the peer connection, without specifying a stream for each track: async openCall(pc) { const gumStream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true }); for (const track of gumStream.getTracks()) { pc.addTrack(track); } } The result is a set of tracks being sent to the remote peer, with no stream associations.
...And 10 more matches
Setting up adaptive streaming media sources - Developer guides
In your video source (src) attribute you point to the MPD instead of to the media file, as you would with non-adaptive media.
... We're going to take a look at the On-Demand profile for video on demand (VOD) and the Live profile.
... Here's a simple example that provides an audio track representation and four separate video representations.
...And 10 more matches
HTML attribute reference - HTML: Hypertext Markup Language
autoplay (<audio>, <video>): the audio or video should play as soon as possible.
... buffered (<audio>, <video>): contains the time range of already buffered media.
... controls (<audio>, <video>): indicates whether the browser should show playback controls to the user.
...And 8 more matches
<audio>: The Embed Audio element - HTML: Hypertext Markup Language
Note: sites that automatically play audio (or videos with an audio track) can be an unpleasant experience for users, so should be avoided when possible.
... Also available is a guide to the codecs supported for video.
... <audio> elements can't have subtitles or captions associated with them in the same way that <video> elements can.
...And 8 more matches
Test your skills: Multimedia and embedding - Learn web development
The aim of this skill test is to assess whether you've understood our Video and audio content and From object to iframe — other embedding technologies articles.
... Multimedia and embedding 2: in this task we want you to mark up a slightly more complex video player, with multiple sources, subtitles, and other features besides.
... Add some appropriate fallback text for browsers that don't support <video>.
...And 7 more matches
MediaTrackConstraints - Web APIs
Properties of video tracks: aspectRatio, a ConstrainDouble specifying the video aspect ratio or range of aspect ratios which are acceptable and/or required.
... height: a ConstrainLong specifying the video height or range of heights which are acceptable and/or required.
... width: a ConstrainLong specifying the video width or range of widths which are acceptable and/or required.
...And 7 more matches
RTCInboundRtpStreamStats - Web APIs
This value is only available for video streams.
... framesDecoded: a long integer value indicating the total number of frames of video which have been correctly decoded so far for this media source.
... Only valid for video streams.
...And 7 more matches
RTCOutboundRtpStreamStats - Web APIs
Valid only for video streams.
... Only valid for video streams.
... pliCount: an integer specifying the number of times the remote receiver has notified this RTCRtpSender that some amount of encoded video data for one or more frames has been lost, using Picture Loss Indication (PLI) packets.
...And 7 more matches
RTCPeerConnection.createOffer() - Web APIs
offerToReceiveVideo (optional, legacy): a legacy Boolean option which used to control whether or not to offer the remote peer the opportunity to try to send video.
... If this value is false, the remote peer will not be offered to send video data, even if the local side will be sending video data.
... If this value is true, the remote peer will be offered to send video data, even if the local side will not be sending video data.
...And 7 more matches
Index - Developer guides
6 Audio and video delivery (Audio, Guide, HTML, HTML5, Media, Video): whether we are dealing with pre-recorded audio files or live streams, the mechanism for making them available through the browser's <audio> and <video> elements remains pretty much the same.
... 7 Adding captions and subtitles to HTML5 video (HTML5, Media, WebVTT, captions, subtitles, track): in other articles we looked at how to build a cross-browser video player using the HTMLMediaElement and Window.fullScreen APIs, and also at how to style the player.
... 8 Creating a cross-browser video player (Apps, HTML5, Video, full screen): this article describes a simple HTML5 video player that uses the Media and Fullscreen APIs and works across most major desktop and mobile browsers.
...And 7 more matches
Introduction to automated testing - Learn web development
The app then configures a new VM with the OS and browser you specified, and returns the test results in the form of screenshots, videos, logfiles, text, etc.
... You can also record a video of your test session by hitting the recorder button in your test session.
... Note: all the videos and images captured inside a test session are captured inside the gallery, test logs, and issue tracker at LambdaTest.
...And 6 more matches
MediaDevices.ondevicechange - Web APIs
It displays in the browser window two lists: one of audio devices and one of video devices, with both the device's label (name) and whether it's an input or an output device.
... HTML content: <p>Click the start button below to begin the demonstration.</p> <div id="startButton" class="button"> Start </div> <video id="video" width="160" height="120" autoplay></video><br> <div class="left"> <h2>Audio devices:</h2> <ul class="deviceList" id="audioList"></ul> </div> <div class="right"> <h2>Video devices:</h2> <ul class="deviceList" id="videoList"></ul> </div> <div id="log"></div> CSS content: body { font: 14px "open sans", "arial", sans-serif; } video { margin-top: 20px; border: 1px solid black; } .button { cursor: pointer; width: 160px; border: 1px solid black; font-siz...
... let videoElement = document.getElementById("video"); let logElement = document.getElementById("log"); function log(msg) { logElement.innerHTML += msg + "<br>"; } document.getElementById("startButton").addEventListener("click", function() { navigator.mediaDevices.getUserMedia({ video: { width: 160, height: 120, frameRate: 30 }, audio: { sampleRate: 44100, samp...
...And 6 more matches
Media Source API - Web APIs
Using MSE, media streams can be created via JavaScript, and played using <audio> and <video> elements.
... Media Source Extensions concepts and usage: playing video and audio has been available in web applications without plugins for a few years now, but the basic features offered have really only been useful for playing single whole tracks.
... Streaming media has up until recently been the domain of Flash, with technologies like Flash Media Server serving video streams using the RTMP protocol.
...And 6 more matches
Learning and getting help - Learn web development
Some of the articles will be tutorials, to teach you a certain technique or important concept (such as "learn how to create a video player" or "learn the CSS box model"), and some of the articles will be reference material, to allow you to look up details you may have forgotten (such as "what is the syntax of the CSS background property?"). MDN Web Docs is very good for both types: the area you are currently in is great for learning techniques and concepts, and we also have several giant reference sections allowing you to ...
... Videos: there are also a number of sites that have video learning content on them.
... YouTube is an obvious one, with channels such as Mozilla Layout Land, MozillaDeveloper, and Google ChromeDevelopers providing many useful videos.
...And 5 more matches
Handling common HTML and CSS problems - Learn web development
In general, most core HTML and CSS functionality (such as basic HTML elements, CSS basic colors and text styling) works across most browsers you'll want to support; more problems are uncovered when you start wanting to use newer features such as Flexbox, or HTML5 video/audio, or even more nascent, CSS Grids or -webkit-background-clip: text.
... More complex elements like HTML <video>, <audio>, and <canvas> (and other features besides) have natural mechanisms for fallbacks to be added, which work on the same principle as described above.
... For example: <video id="video" controls preload="metadata" poster="img/poster.jpg"> <source src="video/tears-of-steel-battle-clip-medium.mp4" type="video/mp4"> <source src="video/tears-of-steel-battle-clip-medium.webm" type="video/webm"> <!-- offer download --> <p>Your browser does not support HTML5 video; here is a link to <a href="video/tears-of-steel-battle-clip-medium.mp4">view the video</a> directly.</p> </video> This example includes a simple link allowing you to download the video if even the HTML5 video player doesn't work, so at least the user can still access the video.
...And 5 more matches
MediaConfiguration - Web APIs
Properties: a valid configuration includes a valid encoding configuration type or decoding configuration type, and a valid audio configuration or video configuration.
... Video configurations must include a valid video MIME type as contentType, the bitrate, and framerate, along with the width and the height of the video file.
... A valid video configuration includes: contentType: a valid video MIME type.
...And 5 more matches
msIsBoxed - Web APIs
msIsBoxed is a property which gets or sets whether the video player control is in boxed (letterbox or pillarbox) mode.
... Returns true if the video is in letterbox or pillarbox mode.
... Letterbox format displays black bars on the top and bottom of a video to fill in between the wide screen format of a video and the aspect ratio of the screen.
...And 5 more matches
<source>: The Media or Image Source element - HTML: Hypertext Markup Language
The HTML <source> element specifies multiple media resources for the <picture>, the <audio> element, or the <video> element.
... Permitted parents: a media element, <audio> or <video>, and it must be placed before any flow content or <track> element.
... src: required for <audio> and <video>; the address of the media resource.
...And 5 more matches
HTTP Index - HTTP
A complete document is reconstructed from the different sub-documents fetched, for instance text, layout description, images, videos, scripts, and more. 4 Basics of HTTP (Guide, HTTP, Overview): HTTP is a pretty extensible protocol.
... HTTP has evolved from an early protocol to exchange files in a semi-trusted laboratory environment, to the modern maze of the internet, now carrying images and videos in high resolution and 3D.
... 10 Incomplete list of MIME types (Audio, File Types, Files, HTTP, MIME, MIME Types, PHP, Reference, Text, Types, Video): here is a list of MIME types, associated by type of documents, ordered by their common extensions.
...And 5 more matches
Web audio codec guide - Web media technologies
additionally, webrtc implementations generally use a subset of these codecs for their encoding and decoding of media, and may support additional codecs as well, for optimal cross-platform support of video and audio conferencing, and to integrate better with legacy telecommunication solutions.
... g.711 pulse code modulation (pcm) of voice frequencies rtp / webrtc g.722 7 khz audio coding within 64 kbps (for telephony/voip) rtp / webrtc mp3 mpeg-1 audio layer iii mp4, adts, mpeg1, 3gp opus opus webm, mp4, ogg vorbis vorbis webm, ogg [1] when mpeg-1 audio layer iii codec data is stored in an mpeg file, and there is no video track in the file, the file is typically referred to as an mp3 file, even though it's still an mpeg format file.
... latency 5 ms to 66.5 ms browser compatibility feature chrome edge firefox internet explorer opera safari opus support 33 14 15 no 20 11[2] this information refers to support for opus in html <audio> and <video> elements, and not to webrtc.
...And 5 more matches
Multimedia: Images - Learn web development
media, namely images and video, account for over 70% of the bytes downloaded for the average website.
...this article looks at optimizing images and video to improve web performance.
... for the average website, 51% of its bandwidth comes from imagery, followed by video at 25%, so it's safe to say it's important to address and optimize your multimedia content.
...And 4 more matches
HTMLMediaElement.srcObject - Web APIs
examples basic example in this example, a mediastream from a camera is assigned to a newly-created <video> element.
... const mediastream = await navigator.mediadevices.getusermedia({video: true}); const video = document.createelement('video'); video.srcobject = mediastream; in this example, a new mediasource is assigned to a newly-created <video> element.
... const mediasource = new mediasource(); const video = document.createelement('video'); video.srcobject = mediasource; supporting fallback to the src property the examples below support older browser versions that require you to create an object url and assign it to src if srcobject isn't supported.
...And 4 more matches
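The fallback the last sentence describes can be sketched as a small helper (hypothetical name `attachSource`; this is not the page's own code):

```javascript
// Hypothetical helper sketching the fallback described above: use srcObject
// where supported, otherwise create an object URL and assign it to src.
function attachSource(videoEl, source) {
  if ("srcObject" in videoEl) {
    videoEl.srcObject = source;
  } else {
    // Older browsers: wrap the MediaSource/MediaStream in an object URL.
    videoEl.src = URL.createObjectURL(source);
  }
}
```

In a browser, `videoEl` would be a `<video>` element, e.g. from `document.createElement('video')`.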
The HTML DOM API - Web APIs
management of media connected to the html media elements (<audio> and <video>).
...for example, the <audio> and <video> elements both present audiovisual media.
... the corresponding types, htmlaudioelement and htmlvideoelement, are both based upon the common type htmlmediaelement, which in turn is based upon htmlelement and so forth.
...And 4 more matches
Key Values - Web APIs
keyboardevent.key value description virtual keycode windows mac linux android "avrinput" [3] changes the input mode on an external audio/video receiver (avr) unit.
... vk_dimmer gdk_key_brightnessadjust (0x1008ff3b) "displayswap" cycles among video sources.
... vk_display_swap "dvr" switches the input source to the digital video recorder (dvr).
...And 4 more matches
MediaRecorder() - Web APIs
this source media can come from a stream created using navigator.mediadevices.getusermedia() or from an <audio>, <video>, or <canvas> element.
... options optional a dictionary object that can contain the following properties: mimetype: a mime type specifying the format for the resulting media; you may simply specify the container format (the browser will select its preferred codecs for audio and/or video), or you may use the codecs parameter and/or the profiles parameter to provide detailed information about which codecs to use and how to configure them.
... videobitspersecond: the chosen bitrate for the video component of the media.
...And 4 more matches
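The options described above can be sketched as follows; the concrete mime type and bitrate are illustrative assumptions, not values from the page:

```javascript
// Illustrative MediaRecorder options: a container with explicit codecs and a
// requested bitrate for the video component of the media.
const recorderOptions = {
  mimeType: 'video/webm; codecs="vp9,opus"', // container format plus codecs
  videoBitsPerSecond: 2500000                // requested video bitrate
};

// In a browser, a stream (from getUserMedia(), an <audio>/<video>/<canvas>
// element, etc.) would then be recorded like this (guarded for non-browsers):
if (typeof MediaRecorder !== "undefined") {
  // const recorder = new MediaRecorder(stream, recorderOptions);
  // recorder.start();
}
```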
MediaStream Image Capture API - Web APIs
the mediastream image capture api is an api for capturing images or videos from a photographic device.
... mediastream image capture concepts and usage the process of retrieving an image or video stream happens as described below.
...the example below simply says give me whatever video device is available, though the getusermedia() method allows more specific capabilities to be requested.
...And 4 more matches
MediaTrackSettings.facingMode - Web APIs
the mediatracksettings dictionary's facingmode property is a domstring indicating the direction in which the camera producing the video track represented by the mediastreamtrack is currently facing.
... syntax var facingmode = mediatracksettings.facingmode; value a domstring whose value is one of the strings in videofacingmodeenum.
... videofacingmodeenum the following strings are permitted values for the facing mode.
...And 4 more matches
Media Capabilities API - Web APIs
'' : 'not ') + 'power efficient.') }) .catch(() => { console.log("decodinginfo error: " + contenttype) }); } media capabilities api concepts and usage there are a myriad of video and audio codecs.
... whether a device uses hardware or software decoding impacts how smooth and power efficient the video decoding is and how efficient the playback will be.
... to test support, smoothness, and power efficiency of a video or audio file, you define the media configuration you want to test, and then pass the audio or video configuration as the parameter of the mediacapabilities interface's encodinginfo() and decodinginfo() methods.
...And 4 more matches
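The test flow described above — define a configuration, then pass it to `decodingInfo()` — can be sketched for several candidate codecs at once. The content types and dimensions below are illustrative, and the guard lets the sketch load outside a browser:

```javascript
// Candidate codecs to probe; these strings are illustrative examples.
const candidates = ['video/webm; codecs="vp9"', 'video/mp4; codecs="avc1.42E01E"'];

// Probe each candidate with navigator.mediaCapabilities.decodingInfo(),
// collecting the supported/smooth flags; returns [] outside a browser.
async function probeDecoding() {
  if (typeof navigator === "undefined" || !navigator.mediaCapabilities) return [];
  const results = [];
  for (const contentType of candidates) {
    const info = await navigator.mediaCapabilities.decodingInfo({
      type: "file",
      video: { contentType, width: 1280, height: 720, bitrate: 1000000, framerate: 30 }
    });
    results.push({ contentType, supported: info.supported, smooth: info.smooth });
  }
  return results;
}
```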
Transcoding assets for Media Source Extensions - Web APIs
to check if the browser supports a particular container, you can pass a string of the mime type to the mediasource.istypesupported method: mediasource.istypesupported('audio/mp3'); // false mediasource.istypesupported('video/mp4'); // true mediasource.istypesupported('video/mp4; codecs="avc1.4d4028, mp4a.40.2"'); // true the string is the mime type of the container, optionally followed by a list of codecs.
... currently, mp4 containers with h.264 video and aac audio codecs have support across all modern browsers, while others don't.
... because the audio codec in the mov container is already aac and the video codec is h.264, we can instruct ffmpeg not to perform transcoding.
...And 4 more matches
Media Capture and Streams API (Media Stream) - Web APIs
the media capture and streams api, often called the media streams api or simply mediastream api, is an api related to webrtc which provides support for streaming audio and video data.
... concepts and usage the api is based on the manipulation of a mediastream object representing a flux of audio- or video-related data.
... see an example in get the video.
...And 4 more matches
Page Visibility API - Web APIs
for example, if your web app is playing a video, it can pause the video when the user puts the tab into the background, and resume playback when the user returns to the tab.
... the user doesn't lose their place in the video, the video's soundtrack doesn't interfere with audio in the new foreground tab, and the user doesn't miss any of the video in the meantime.
... example view live example (video with sound).
...And 4 more matches
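The pause-on-hide pattern described above can be sketched as follows; the pause decision is pulled into a small pure helper, and the element lookup and listener names are illustrative:

```javascript
// Pure helper: pause only if the tab just became hidden while playing.
function shouldPause(visibilityState, isPlaying) {
  return visibilityState === "hidden" && isPlaying;
}

// Browser wiring (guarded so the sketch also loads outside a browser):
if (typeof document !== "undefined") {
  const video = document.querySelector("video");
  document.addEventListener("visibilitychange", () => {
    if (video && shouldPause(document.visibilityState, !video.paused)) {
      video.pause();
    } else if (video && document.visibilityState === "visible") {
      video.play();
    }
  });
}
```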
RTCRtpStreamStats - Web APIs
kind a domstring whose value is "audio" if the associated mediastreamtrack is audio-only or "video" if the track contains video.
...this statistic is only available to the device which is receiving the stream and is only available for video tracks.
... plicount the number of times the receiving end of the stream sent a picture loss indication (pli) packet to the sender, indicating that it has lost some amount of encoded video data for one or more frames.
...And 4 more matches
Establishing a connection: The WebRTC perfect negotiation pattern - Web APIs
const constraints = { audio: true, video: true }; const config = { iceservers: [{ urls: "stun:stun.mystunserver.tld" }] }; const selfvideo = document.queryselector("video.selfview"); const remotevideo = document.queryselector("video.remoteview"); const signaler = new signalingchannel(); const pc = new rtcpeerconnection(config); this code also gets the <video> elements using the classes "selfview" and "remoteview"; these will conta...
... async function start() { try { const stream = await navigator.mediadevices.getusermedia(constraints); for (const track of stream.gettracks()) { pc.addtrack(track, stream); } selfvideo.srcobject = stream; } catch(err) { console.error(err); } } this isn't appreciably different from older webrtc connection establishment code.
...then, finally, the media source for the self-view <video> element indicated by the selfvideo constant is set to the camera and microphone stream, allowing the local user to see what the other peer sees.
...And 4 more matches
WebRTC API - Web APIs
webrtc (web real-time communication) is a technology which enables web applications and sites to capture and optionally stream audio and/or video media, as well as to exchange arbitrary data between browsers without requiring an intermediary.
... webrtc concepts and usage webrtc serves multiple purposes; together with the media capture and streams api, they provide powerful multimedia capabilities to the web, including support for audio and video conferencing, file exchange, screen sharing, identity management, and interfacing with legacy telephone systems including support for sending dtmf (touch-tone dialing) signals.
... media streams can consist of any number of tracks of media information; tracks, which are represented by objects based on the mediastreamtrack interface, may contain one of a number of types of media data, including audio, video, and text (such as subtitles or even chapter names).
...And 4 more matches
WebRTC Statistics API - Web APIs
erconnection(pcoptions); statsinterval = window.setinterval(getconnectionstats, 1000); /* add event handlers, etc */ } catch(err) { console.error("error creating rtcpeerconnection: " + err); } function getconnectionstats() { mypeerconnection.getstats(null).then(stats => { var statsoutput = ""; stats.foreach(report => { if (report.type === "inbound-rtp" && report.kind === "video") { object.keys(report).foreach(statname => { statsoutput += `<strong>${statname}:</strong> ${report[statname]}<br>\n`; }); } }); document.queryselector(".stats-box").innerhtml = statsoutput; }); } when the promise returned by getstats() is fulfilled, the resolution handler receives as input an rtcstatsreport object containing the statistics information...
... this example specifically looks for the report whose type is inbound-rtp and whose kind is video.
... this way, we look only at the video-related statistics for the local rtcrtpreceiver responsible for receiving the streamed media.
...And 4 more matches
HTML documentation index - HTML: Hypertext Markup Language
5 dash adaptive streaming for html 5 video guide, html, html5 dynamic adaptive streaming over http (dash) is an adaptive streaming protocol.
...property values are either a string or a url and can be associated with a very wide range of elements including <audio>, <embed>, <iframe>, <img>, <link>, <object>, <source>, <track>, and <video>.
... 39 html attribute: crossorigin advanced, attribute, cors, html, needscontent, reference, security the crossorigin attribute, valid on the <audio>, <img>, <link>, <script>, and <video> elements, provides support for cors, defining how the element handles crossorigin requests, thereby enabling the configuration of the cors requests for the element's fetched data.
...And 4 more matches
Common Firefox theme issues and solutions - Archive of obsolete content
miscellaneous issues html 5 media controls html 5 media controls are not styled the html5 video control bar is not styled.
... please go to mozilla's mission page using both your theme and the default theme and try the video on that page.
... html 5 media control bar is missing full screen button the fullscreen icon is missing from the html5 video control bar.
...And 3 more matches
Mozilla splash page - Learn web development
in this assessment, we'll test your knowledge of some of the techniques discussed in this module's articles, getting you to add some images and video to a funky splash page all about mozilla!
... objective: to test knowledge around embedding images and video in web pages, frames, and html responsive image techniques.
...unfortunately, no images or video have been added yet — this is your job!
...And 3 more matches
Graceful asynchronous programming with Promises - Learn web development
let's consider a hypothetical video chat application.
... the application has a window with a list of the user's friends, and clicking on a button next to a user starts a video call to that user.
... the code that the video chat application would use might look something like this: function handlecallbutton(evt) { setstatusmessage("calling..."); navigator.mediadevices.getusermedia({video: true, audio: true}) .then(chatstream => { selfviewelem.srcobject = chatstream; chatstream.gettracks().foreach(track => mypeerconnection.addtrack(track, chatstream)); setstatusmessage("connected"); }).
...And 3 more matches
Handling common accessibility problems - Learn web development
people with hearing impairments relying on captions/subtitles or other text alternatives for audio/video content.
...you might have inherited a site where the semantics are not very good (perhaps you've ended up with a horrible cms that generates buttons made with <div>s), or you are using a complex control that does not have keyboard accessibility built in, like the html5 <video> element (amazingly, opera is the only browser that allows you to tab through the <video> element's default browser controls).
...see creating a cross-browser video player for some good examples of this.
...And 3 more matches
RTCRtpEncodingParameters.scaleResolutionDownBy - Web APIs
the rtcrtpencodingparameters dictionary's scaleresolutiondownby property can be used to specify a factor by which to reduce the size of a video track during encoding.
... this property is only available for tracks whose kind is video.
... syntax rtpencodingparameters.scaleresolutiondownby = scalingfactor; rtpencodingparameters = { scaleresolutiondownby: scalingfactor }; value a double-precision floating-point number specifying the amount by which to reduce the size of the video during encoding.
...And 3 more matches
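Applying the property can be sketched via `RTCRtpSender.setParameters()`; a factor of 2 halves each dimension, so a small helper can show the resulting encoded size. The sender is assumed to come from elsewhere (e.g. `pc.addTrack()`), and the helper name is illustrative:

```javascript
// Illustrative helper: the encoded size resulting from a given scale factor.
function scaledSize(width, height, scaleResolutionDownBy) {
  return {
    width: Math.round(width / scaleResolutionDownBy),
    height: Math.round(height / scaleResolutionDownBy)
  };
}

// Sketch of applying the factor to a video sender obtained elsewhere.
async function halveResolution(sender) {
  const params = sender.getParameters();
  if (!params.encodings) params.encodings = [{}];
  params.encodings[0].scaleResolutionDownBy = 2; // send at half width and height
  await sender.setParameters(params);
}
```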
TextTrackList.onaddtrack - Web APIs
syntax texttracklist.onaddtrack = eventhandler; value set onaddtrack to a function that accepts as input a trackevent object which indicates in its track property which video track has been added to the media.
... usage notes the addtrack event is called whenever a new track is added to the media element whose video tracks are represented by the texttracklist object.
... this happens when tracks are added to the element when the media is first attached to the element; one addtrack event will occur for each video track in the media resource.
...And 3 more matches
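A handler matching the description above can be sketched as follows; the factory shape and names are illustrative, not from the page:

```javascript
// Sketch of an addtrack handler: the handler receives a TrackEvent whose
// track property is the newly added track.
function makeAddTrackHandler(log) {
  return function onAddTrack(event) {
    // event.track is the track that was just added to the list.
    log.push(event.track.label || "(unnamed track)");
  };
}

// Browser usage (illustrative):
//   textTrackList.onaddtrack = makeAddTrackHandler(labels);
```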
VTTCue - Web APIs
vttcue.region a vttregion object describing the video's sub-region that the cue will be drawn onto, or null if none is assigned.
... vttcue.snaptolines returns true if the vttcue.line attribute is an integer number of lines or a percentage of the video size.
...this can be the string auto or a number representing the percentage of the vttcue.region, or the video size if vttcue.region is null.
...And 3 more matches
VTTRegion - Web APIs
the vttregion interface—part of the api for handling webvtt (text tracks on media presentations)—describes a portion of the video to render a vttcue onto.
... vttregion.width a double representing the width of the region, as a percentage of the video.
... vttregion.viewportanchorx a double representing the viewport anchor x offset, as a percentage of the video.
...And 3 more matches
WebGL best practices - Web APIs
estimate a per-pixel vram budget webgl doesn't offer apis to query the maximum amount of video memory on the system because such queries are not portable.
... teximage/texsubimage uploads (particularly with videos) can cause pipeline flushes most texture uploads from dom elements will incur a processing pass that will temporarily switch gl programs internally, causing a pipeline flush.
... useprogram(prog1) <pipeline flush> bindframebuffer(target) drawarrays() bindtexture(webgl_texture) teximage2d(htmlvideoelement) drawarrays() ...
...And 3 more matches
How can we design for all types of users? - Learn web development
if you want an elastic/responsive website, and you don't know what the browser's default width is, you can use the max-width property to allow up to 70 characters per line and no more: div.container { max-width: 70em; } alternative content for images, audio, and video websites often include stuff besides plain text.
... audio/video you must also provide alternatives to multimedia content.
... subtitling/closed-captioning you should include captions in your video to cater to visitors who can't hear the audio.
...And 2 more matches
How much does it cost to do something on the Web? - Learn web development
media editors if you want to include video or audio in your website, you can either embed online services (for example youtube, vimeo, or dailymotion), or include your own videos (see below for bandwidth costs).
...likewise, video-editing software can be free (pitivi, openshot for linux, imovie for mac), less than $100 (adobe premiere elements), or several hundred dollars (adobe premiere pro, avid media composer, final cut pro).
... of course, you'll need a more serious computer if you want to produce complicated designs, touch up photos, or produce audio and video files.
...And 2 more matches
What is accessibility? - Learn web development
let's consider video: hearing impairment how does a hearing-impaired person benefit from a video?
... visual impairment again, provide a text transcript that a user can consult without needing to play the video, and an audio description (an off-screen voice that describes what is happening in the video).
... pausing capacity users may have trouble understanding someone in a video.
...And 2 more matches
Images in HTML - Learn web development
our above code would give us the following result: note: elements like <img> and <video> are sometimes referred to as replaced elements.
... this is because the element's content and size are defined by an external resource (like an image or video file), not by the contents of the element itself.
... a figure could be several images, a code snippet, audio, video, equations, a table, or something else.
...And 2 more matches
Setting up your own test automation environment - Learn web development
y = '{accesskey}'; // gridurl: gridurl can be found at automation dashboard const grid_host = 'hub.lambdatest.com/wd/hub'; function searchtextongoogle() { // setup input capabilities const capabilities = { platform: 'windows 10', browsername: 'chrome', version: '67.0', resolution: '1280x800', network: true, visual: true, console: true, video: true, name: 'test 1', // name of the test build: 'nodejs build' // name of the build }; // url: https://{username}:{accesstoken}@hub.lambdatest.com/wd/hub const gridurl = 'https://' + username + ':' + key + '@' + grid_host; // setup and build selenium driver object const driver = new webdriver.builder() .usingserver(gridurl) .withcapabilities(capabilities) ...
... now if you go to your lambdatest automation dashboard, you'll see your test listed; from here you'll be able to see videos, screenshots, and other such data.
...you will also find a video recording of your selenium script execution.
...And 2 more matches
nsIDOMHTMLSourceElement
the nsidomhtmlsourceelement interface is the dom interface to the source child of the audio and video media elements in html.
...note that dynamically manipulating this value after the page has loaded has no effect on the containing element; instead, change the src attribute of that element (audio or video).
... example <video controls> <source src="foo.webm" type='video/webm; codecs="vp9, opus"'> your browser does not support the <code>video</code> element.
...And 2 more matches
Guide to the Fullscreen API - Web APIs
activating full-screen mode given an element that you'd like to present in full-screen mode (such as a <video>, for example), you can present it in full-screen mode by simply calling its requestfullscreen() method.
... let's consider this <video> element: <video controls id="myvideo"> <source src="somevideo.webm"></source> <source src="somevideo.mp4"></source> </video> we can put that video into full-screen mode as follows: var elem = document.getelementbyid("myvideo"); if (elem.requestfullscreen) { elem.requestfullscreen(); } this code checks for the existence of the requestfullscreen() method before calling it.
...to get the same fullscreen behavior in webkit, you need to add your own "width: 100%; height: 100%;" css rules to the element yourself: #myvideo:-webkit-full-screen { width: 100%; height: 100%; } on the other hand, if you're trying to emulate webkit's behavior on gecko, you need to place the element you want to present inside another element, which you'll make fullscreen instead, and use css rules to adjust the inner element to match the appearance you want.
...And 2 more matches
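The feature check shown above can be extended into a small wrapper that also tries the legacy prefixed method names older engines shipped (a sketch; the wrapper name is illustrative):

```javascript
// Sketch: request fullscreen using the standard method when present, falling
// back to the legacy prefixed names used by older WebKit, Gecko, and IE/Edge.
function requestFullscreenCompat(elem) {
  const fn = elem.requestFullscreen
    || elem.webkitRequestFullscreen
    || elem.mozRequestFullScreen   // note the capital S in the Gecko name
    || elem.msRequestFullscreen;
  if (fn) return fn.call(elem);
  return Promise.reject(new Error("Fullscreen API not available"));
}
```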
HTMLMediaElement.play() - Web APIs
example this example demonstrates how to confirm that playback has begun and how to gracefully handle blocked automatic playback: let videoelem = document.getelementbyid("video"); let playbutton = document.getelementbyid("playbutton"); playbutton.addeventlistener("click", handleplaybutton, false); playvideo(); async function playvideo() { try { await videoelem.play(); playbutton.classlist.add("playing"); } catch(err) { playbutton.classlist.remove("playing"); } } function handleplaybutton() { if (videoelem.paused) { playvideo(); } else { videoelem.pause(); playbutton.classlist.remove("playing"); } } in this example, playback of video is toggled off and on by the async playvideo() function.
... it tries to play the video, and if successful sets the class name of the playbutton element to "playing".
...And 2 more matches
ImageCapture - Web APIs
constructor imagecapture() creates a new imagecapture object which can be used to capture still frames (photos) from a given mediastreamtrack which represents a video stream.
... imagecapture.takephoto() takes a single exposure using the video capture device sourcing a mediastreamtrack and returns a promise that resolves with a blob containing the data.
... imagecapture.grabframe() takes a snapshot of the live video in a mediastreamtrack, returning an imagebitmap, if successful.
...And 2 more matches
MediaDevices.getDisplayMedia() - Web APIs
since getdisplaymedia() requires a video track, the returned stream will have one even if no video track is expressly requested by the constraints object.
... return value a promise that resolves to a mediastream containing a video track whose contents come from a user-selected screen area, as well as an optional audio track.
... notfounderror no sources of screen video are available for capture.
...And 2 more matches
MediaStream Recording API - Web APIs
overview of the recording process the process of recording a stream is simple: set up a mediastream or htmlmediaelement (in the form of an <audio> or <video> element) to serve as the source of the media data.
... once the source media is playing and you've reached the point where you're ready to record video, call mediarecorder.start() to begin recording.
...var stream = canvas.capturestream(25); var recordedchunks = []; console.log(stream); var options = { mimetype: "video/webm; codecs=vp9" }; mediarecorder = new mediarecorder(stream, options); mediarecorder.ondataavailable = handledataavailable; mediarecorder.start(); function handledataavailable(event) { console.log("data-available"); if (event.data.size > 0) { recordedchunks.push(event.data); console.log(recordedchunks); download(); } else { // ...
...And 2 more matches
MediaTrackConstraints.cursor - Web APIs
the mediatrackconstraints dictionary's cursor property is a constraindomstring describing the requested or mandatory constraints placed upon the value of the cursor constrainable property, which is used to specify whether or not the cursor should be included in the captured video.
... syntax var constraintsobject = { cursor: constraint }; constraintsobject.cursor = constraint; value a constraindomstring which specifies whether or not the mouse cursor should be rendered into the video track in the mediastream returned by the call to getdisplaymedia().
... usage notes you can check the setting selected by the user agent after the display media has been created by getdisplaymedia() by calling getsettings() on the display media's video mediastreamtrack, then checking the value of the returned mediatracksettings object's cursor object.
...And 2 more matches
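The constraint and the read-back check described above can be sketched together; the chosen cursor value is an illustrative example, and the guard lets the sketch load outside a browser:

```javascript
// Illustrative getDisplayMedia() constraints using the cursor property;
// "motion" asks the user agent to render the cursor only while it moves.
const displayConstraints = {
  video: { cursor: "motion" }
};

// Request the capture, then read back the cursor setting the user agent
// actually chose; resolves to null where getDisplayMedia() is unavailable.
async function captureWithCursor() {
  if (typeof navigator === "undefined" || !navigator.mediaDevices?.getDisplayMedia) return null;
  const stream = await navigator.mediaDevices.getDisplayMedia(displayConstraints);
  return stream.getVideoTracks()[0].getSettings().cursor;
}
```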
MediaTrackConstraints.facingMode - Web APIs
syntax var constraintsobject = { facingmode: constraint }; constraintsobject.facingmode = constraint; value an object based on constraindomstring specifying one or more acceptable, ideal, and/or exact (mandatory) facing modes for a video track.
... "user" the video source is facing toward the user; this includes, for example, the front-facing camera on a smartphone.
... "environment" the video source is facing away from the user, thereby viewing their environment.
...And 2 more matches
MediaTrackSettings.displaySurface - Web APIs
syntax displaysurface = mediatracksettings.displaysurface; value the value of displaysurface is a string that comes from the displaycapturesurfacetype enumerated type, and is one of the following: application the stream's video track contains all of the windows belonging to the application chosen by the user.
... the windows are aggregated into a single video track, with any empty space filled with a backdrop; that backdrop is selected by the user agent.
... browser the stream's video track presents the entire contents of a single browser tab which the user selected during the getdisplaymedia() call.
...And 2 more matches
Web Audio playbackRate explained - Developer guides
the playbackrate property of the <audio> and <video> elements allows us to change the speed, or rate, at which a piece of web audio or video is playing.
...next we set playbackrate to 0.5, which represents half normal speed (the playbackrate is a multiplier applied to the original rate). a complete example let's create a <video> element first, and set up video and playback rate controls in html: <video id="myvideo" controls> <source src="https://udn.realityripple.com/samples/6f/08625b424a.m4v" type='video/mp4' /> <source src="https://udn.realityripple.com/samples/5b/8cd6da9c65.webm" type='video/webm' /> </video> <form> <input id="pbr" type="range" value="1" min="0.5" max="4" step="0.1" > <p>playback rate <span...
... id="currentpbr">1</span></p> </form> and apply some javascript to it: window.onload = function () { var v = document.getelementbyid("myvideo"); var p = document.getelementbyid("pbr"); var c = document.getelementbyid("currentpbr"); p.addeventlistener('input', function() { c.innerhtml = p.value; v.playbackrate = p.value; }, false); }; finally, we listen for the input event firing on the <input> element, allowing us to react to the playback rate control being changed.
...And 2 more matches
HTML5 - Developer guides
multimedia: making video and audio first-class citizens in the open web.
... using html5 audio and video the <audio> and <video> elements embed and allow the manipulation of new multimedia content.
... webrtc this technology, where rtc stands for real-time communication, allows connecting to other people and controlling videoconferencing directly in the browser, without the need for a plugin or an external application.
...And 2 more matches
Introduction to Web development - Developer guides
crockford on javascript — an in-depth video series on the javascript language.
... advanced learning advanced javascript — john resig's guide to advanced javascript crockford on advanced javascript — a three-part video series on advanced javascript concepts javascript garden — documentation of the most quirky parts of javascript.
... google's html, css, and javascript from the ground up these easily digestible video tutorials from google's expert web developers cover the basics of html, css, and javascript.
...And 2 more matches
HTML elements reference - HTML: Hypertext Markup Language
image and multimedia html supports various multimedia resources such as images, audio, and video.
... <track> the html <track> element is used as a child of the media elements, <audio> and <video>.
... <video> the html video element (<video>) embeds a media player which supports video playback into the document.
...And 2 more matches
Preloading content with rel="preload" - HTML: Hypertext Markup Language
larger images and video files.
... video: video file, as typically used in <video>.
... you can see an example of this in our video example (see the full source code, and also the live version): <head> <meta charset="utf-8"> <title>video preload example</title> <link rel="preload" href="sintel-short.mp4" as="video" type="video/mp4"> <link rel="preload" href="sintel-short.webm" as="video" type="video/webm"> </head> <body> <video controls> <source src="sintel-short.mp4" type="video/mp4"> <source src="sintel-...
...And 2 more matches
HTML: Hypertext Markup Language
html markup includes special "elements" such as <head>, <title>, <body>, <header>, <footer>, <article>, <section>, <p>, <div>, <span>, <img>, <aside>, <audio>, <canvas>, <datalist>, <details>, <embed>, <nav>, <output>, <progress>, <video>, <ul>, <ol>, <li> and many others.
... multimedia and embedding this module explores how to use html to include multimedia in your web pages, including the different ways that images can be included, and how to embed video, audio, and even entire other webpages.
... use html to solve common problems provides links to sections of content explaining how to use html to solve very common problems when creating a web page: dealing with titles, adding images or videos, emphasizing content, creating a basic form, etc.
...And 2 more matches
Common MIME types - HTTP
this table lists some important mime types for the web: extension kind of document mime type .aac aac audio audio/aac .abw abiword document application/x-abiword .arc archive document (multiple files embedded) application/x-freearc .avi avi: audio video interleave video/x-msvideo .azw amazon kindle ebook format application/vnd.amazon.ebook .bin any kind of binary data application/octet-stream .bmp windows os/2 bitmap graphics image/bmp .bz bzip archive application/x-bzip .bz2 bzip2 archive application/x-bzip2 .csh c-shell script application/x-csh ...
...tps://datatracker.ietf.org/doc/draft-ietf-dispatch-javascript-mjs/ .json json format application/json .jsonld json-ld format application/ld+json .mid .midi musical instrument digital interface (midi) audio/midi audio/x-midi .mjs javascript module text/javascript .mp3 mp3 audio audio/mpeg .mpeg mpeg video video/mpeg .mpkg apple installer package application/vnd.apple.installer+xml .odp opendocument presentation document application/vnd.oasis.opendocument.presentation .ods opendocument spreadsheet document application/vnd.oasis.opendocument.spreadsheet .odt opendocument text document application/vnd.oasis.opendocument.text .oga...
... ogg audio audio/ogg .ogv ogg video video/ogg .ogx ogg application/ogg .opus opus audio audio/opus .otf opentype font font/otf .png portable network graphics image/png .pdf adobe portable document format (pdf) application/pdf .php hypertext preprocessor (personal home page) application/x-httpd-php .ppt microsoft powerpoint application/vnd.ms-powerpoint .pptx microsoft powerpoint (openxml) application/vnd.openxmlformats-officedocument.presentationml.presentation .rar rar archive application/vnd.rar .rtf rich text format (rtf) application/rtf .sh bourne shell script application/x-sh .svg scalable vecto...
...And 2 more matches
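The flattened table above pairs file extensions with MIME types. As an illustration only (a hypothetical helper covering just a few of the entries listed, not a complete registry), the mapping can be made concrete like this:

```javascript
// Minimal sketch: map a file extension to its MIME type, using a
// handful of entries from the table above. A real application would
// consult a complete registry such as IANA's media types list.
const MIME_BY_EXTENSION = {
  aac: "audio/aac",
  avi: "video/x-msvideo",
  json: "application/json",
  mp3: "audio/mpeg",
  mpeg: "video/mpeg",
  ogv: "video/ogg",
  png: "image/png",
  pdf: "application/pdf",
};

// Look up a filename's MIME type, falling back to octet-stream,
// the conventional default for unknown binary data.
function mimeTypeFor(filename) {
  const ext = filename.split(".").pop().toLowerCase();
  return MIME_BY_EXTENSION[ext] || "application/octet-stream";
}
```

For example, `mimeTypeFor("clip.avi")` yields `video/x-msvideo`, while an unrecognized extension falls back to `application/octet-stream`.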
Handling media support issues in web content - Web media technologies
One of the realities of working with audio and video presentation and manipulation on the web is that there are a number of media formats available, of varying degrees of popularity and with a variety of capabilities.
... Using poster frames: a poster frame is a still image that's representative of the content of a video.
... This may be simply the first frame of video; however, in many cases the first frame is blank, or contains nothing but the logo of a business, or some other image that doesn't give the reader any context for the video's contents.
...And 2 more matches
Web Performance
We cover them in this section: key performance guides. Animation performance and frame rate: animation on the web can be done via SVG, JavaScript (including <canvas> and WebGL), CSS animation, <video>, animated GIFs and even animated PNGs and other image types.
... Multimedia: images and video. The lowest-hanging fruit of web performance is often media optimization.
...Additional tips like removing audio tracks from background videos can improve performance even further.
...And 2 more matches
Index - Archive of obsolete content
212 StringView: as web applications become more and more powerful, adding features such as audio and video manipulation, access to raw data using WebSockets, and so forth, it has become clear that there are times when it would be helpful for JavaScript code to be able to quickly and easily manipulate raw binary data.
...the actual keys are: 489 Introducing the Audio API extension (deprecated): the Audio Data API extension extends the HTML5 specification of the <audio> and <video> media elements by exposing audio metadata and raw audio data.
... 688 Video presentations: Mozilla is actively working to produce video presentations that can help you learn how the Mozilla codebase works and how to take advantage of its technology in your own applications and extensions.
...For example, the Adobe Flash plug-in is used to access Flash content (including videos and certain interactive applications), and the QuickTime and RealPlayer plugins are used to play special format videos in a web page.
HTML: A good basis for accessibility - Learn web development
For example, a control button to play a video on your site could be marked up like this: <div>play video</div> but as you'll see in greater detail later on, it makes sense to use the correct element for the job: <button>play video</button> not only do HTML <button>s have some suitable styling applied by default (which you will probably want to override), they also have built-in keyboard accessibility — users can navigate between button...
... You can find a nice explanation of the importance of proper text labels, and how to investigate text label issues using the Firefox Accessibility Inspector, in the following video. Accessible data tables: a basic data table can be written with very simple markup, for example: <table> <tr> <td>name</td> <td>age</td> <td>gender</td> </tr> <tr> <td>gabriel</td> <td>13</td> <td>male</td> </tr> <tr> <td>elva</td> <td>8</td> <td>female</td> </tr> <tr> <td>freida</td> <td>5</td> <td>female</td> </tr> </table> bu...
... Text alternatives: whereas textual content is inherently accessible, the same cannot necessarily be said for multimedia content — image and video content cannot be seen by visually-impaired people, and audio content cannot be heard by hearing-impaired people.
... We cover video and audio content in detail in the Accessible multimedia article, but for this article we'll look at accessibility for the humble <img> element.
Introduction to web APIs - Learn web development
Audio and video APIs like HTMLMediaElement, the Web Audio API, and WebRTC allow you to do really interesting things with multimedia such as creating custom UI controls for playing audio and video, displaying text tracks like captions and subtitles along with your videos, grabbing video from your web camera to be manipulated via a canvas (see above) or displayed on someone else's computer in a web conference, or ...
... The YouTube API, which allows you to embed YouTube videos on your site, search YouTube, build playlists, and more.
... The Twilio API, which provides a framework for building voice and video call functionality into your app, sending SMS/MMS from your apps, and more.
... Overview: Client-side web APIs. Next in this module: Introduction to web APIs, Manipulating documents, Fetching data from the server, Third-party APIs, Drawing graphics, Video and audio APIs, Client-side storage ...
Third-party APIs - Learn web development
YouTube example: we also built another example for you to study and learn from — see our YouTube video search example.
... This uses two related APIs: the YouTube Data API to search for YouTube videos and return results.
... The YouTube IFrame Player API to display the returned video examples inside IFrame video players so you can watch them.
... Previous Overview: Client-side web APIs. Next in this module: Introduction to web APIs, Manipulating documents, Fetching data from the server, Third-party APIs, Drawing graphics, Video and audio APIs, Client-side storage ...
HTML performance features - Learn web development
Complications can occur when, for example, the file size of a <video> embed is too large, or when a webpage is not optimized for mobile devices.
... Elements & attributes impacting performance: the <picture> element, the <video> element, the <source> element, the <img> srcset attribute, responsive images, preloading content with rel="preload" (https://w3c.github.io/preload/), async / defer attributes, <iframe>, <object>, <script>, rel attribute. Conclusion Previous Overview: Performance. Next in this module: The "why" of web performance, What is web performance?
... Measuring performance, Multimedia: images, Multimedia: video, JavaScript performance best practices.
... HTML performance features, CSS performance features, Fonts and performance, Mobile performance, Focusing on performance. See also: the <picture> element, the <video> element, the <source> element, the <img> srcset attribute, responsive images, preloading content with rel="preload" (https://w3c.github.io/preload/) ...
Handling common JavaScript problems - Learn web development
For example, when showing a video stream, make sure it is turned off when you can't see it.
... Typed arrays allow JavaScript code to access and manipulate raw binary data, which is necessary as browser APIs, for example, start to manipulate streams of raw video and audio data.
... WebRTC API for multi-person, real-time video/audio connectivity (e.g. video conferencing).
HTMLMediaElement.load() - Web APIs
Usage notes: calling load() aborts all ongoing operations involving this media element, then begins the process of selecting and loading an appropriate media resource given the options specified in the <audio> or <video> element and its src attribute or child <source> element(s).
... This is described in more detail in Supporting multiple formats in Video and audio content.
... Example This example finds a <video> element in the document and resets it by calling load().
... var mediaElem = document.querySelector("video"); mediaElem.load(); Specifications Specification Status Comment HTML Living Standard: the definition of 'HTMLMediaElement.load()' in that specification.
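The resource-selection step that load() kicks off walks the candidate <source> children and picks the first one the player claims it can play. A minimal sketch of that idea as a pure helper (hypothetical function name; in a browser the selection is done internally, and `canPlay` stands in for HTMLMediaElement.canPlayType(), which returns "", "maybe", or "probably"):

```javascript
// Pick the first candidate source whose MIME type the player
// reports as playable ("" means "definitely not playable").
function pickSource(sources, canPlay) {
  for (const candidate of sources) {
    if (canPlay(candidate.type) !== "") {
      return candidate.src;
    }
  }
  return null; // no playable source: the element reports an error
}
```

For instance, with candidates `[{src: "a.webm", type: "video/webm"}, {src: "a.mp4", type: "video/mp4"}]` and a player that only answers "probably" for MP4, the helper returns `"a.mp4"`.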
HTMLMediaElement.textTracks - Web APIs
You can detect when tracks are added to and removed from an <audio> or <video> element using the addtrack and removetrack events.
... Examples We start with a <video> that has several <track> children: <video controls poster="/images/sample.gif"> <source src="sample.mp4" type="video/mp4"> <source src="sample.ogv" type="video/ogv"> <track kind="captions" src="sampleCaptions.vtt" srclang="en"> <track kind="descriptions" src="sampleDescriptions.vtt" srclang="en"> <track kind="chapters" src="sampleChapters.vtt" srclang="en"> <track kind="subtitles" s...
..._en.vtt" srclang="en"> <track kind="subtitles" src="sampleSubtitles_ja.vtt" srclang="ja"> <track kind="subtitles" src="sampleSubtitles_oz.vtt" srclang="oz"> <track kind="metadata" src="keyStage1.vtt" srclang="en" label="Key Stage 1"> <track kind="metadata" src="keyStage2.vtt" srclang="en" label="Key Stage 2"> <track kind="metadata" src="keyStage3.vtt" srclang="en" label="Key Stage 3"> </video> HTMLMediaElement.textTracks returns a TextTrackList through which we can iterate.
... var tracks = document.querySelector('video').textTracks; for (var i = 0, l = tracks.length; i < l; i++) { /* tracks.length == 10 */ if (tracks[i].language == 'en') { console.dir(tracks[i]); } } Properties & methods Properties length Returns the number of text tracks in the TextTrackList object.
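The loop above filters the track list by language. The same idea as a standalone, testable helper (hypothetical name; it works on any array-like list of objects with a `language` field, which is how a real TextTrackList is iterated):

```javascript
// Collect the tracks whose language matches, mirroring the
// filtering loop above over an array-like track list.
function tracksByLanguage(trackList, language) {
  const matches = [];
  for (let i = 0; i < trackList.length; i++) {
    if (trackList[i].language === language) {
      matches.push(trackList[i]);
    }
  }
  return matches;
}
```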
HTMLMediaElement - Web APIs
The HTMLMediaElement interface adds to HTMLElement the properties and methods needed to support basic media-related capabilities that are common to audio and video.
... The HTMLVideoElement and HTMLAudioElement interfaces both inherit this interface.
... HTMLMediaElement.videoTracks Read only Returns the list of VideoTrack objects contained in the element.
... ended Fired when playback stops because the end of the media (<audio> or <video>) is reached or because no further data is available.
msStereo3DPackingMode - Web APIs
msStereo3DPackingMode is a read/write property which gets or sets the frame-packing mode for stereo 3-D video content.
... Syntax HTMLVideoElement.msStereo3DPackingMode(topbottom, sidebyside, none); Value The following values return, or set, the stereo 3-D content packing as "topbottom", "sidebyside", or "none" for regular 2-D video.
... none (0): specifies regular 2-D video.
... See also HTMLVideoElement, Microsoft API extensions ...
ImageCapture() constructor - Web APIs
Syntax const imageCapture = new ImageCapture(videoTrack) Parameters videoTrack A MediaStreamTrack from which the still images will be taken.
... This can be any source, such as an incoming stream of a video conference, a playing movie, or the stream from a webcam.
... Return value A new ImageCapture object which can be used to capture still frames from the specified video track.
... navigator.mediaDevices.getUserMedia({ video: true }) .then(mediaStream => { document.querySelector('video').srcObject = mediaStream; const track = mediaStream.getVideoTracks()[0]; imageCapture = new ImageCapture(track); }) .catch(error => console.log(error)); Specifications Specification Status Comment MediaStream Image Capture: the definition of 'ImageCapture' in that specification.
MediaSource - Web APIs
MediaSource.activeSourceBuffers Read only Returns a SourceBufferList object containing a subset of the SourceBuffer objects contained within MediaSource.sourceBuffers — the list of objects providing the selected video track, enabled audio tracks, and shown/hidden text tracks.
... Examples The following simple example loads a video with XMLHttpRequest, playing it as soon as it can.
... This example was written by Nick Desaulniers and can be viewed live here (you can also download the source for further investigation.) var video = document.querySelector('video'); var assetURL = 'frag_bunny.mp4'; // need to be specific for Blink regarding codecs // ./mp4info frag_bunny.mp4 | grep Codec var mimeCodec = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"'; if ('MediaSource' in window && MediaSource.isTypeSupported(mimeCodec)) { var mediaSource = new MediaSource(); //console.log(mediaSource.readyState); // closed video.src = URL.createObjectURL(mediaSource); mediaSource.addEventListener('sourceopen', sourceOpen); } else { console.error('Unsupported MIME type or codec: ', mimeCodec); } function sourceOpen (_) { //console.log(this.readyState); // open var mediaSource = this; var sourceBuffer = mediaSource.addSourceBuffer(mimeCodec); fetchAB(assetURL, function (buf) { sourceBuffer.addEventListener('updateend', function (_) { mediaSource.endOfStream(); video.play(); //console.log(mediaSource.readyState); // ended }); sourceBuffer.appendBuffer(buf); }); }; function fetchAB (url, cb) { console.log(url); var xhr = new XMLHttpRequest; xhr.open('get', url); xhr.responseType = 'arraybuffer'; xhr.onload = function () { cb(xhr.response); }; xhr.send(); }; Specifications Specification Status Comment Media Source Extensions: the definition of 'MediaSource' in that specification.
MediaTrackSettings.cursor - Web APIs
The MediaTrackSettings dictionary's cursor property indicates whether or not the cursor should be captured as part of the video track included in the MediaStream returned by getDisplayMedia().
... Syntax cursorSetting = mediaTrackSettings.cursor; Value The value of cursor comes from the CursorCaptureConstraint enumerated string type, and may have one of the following values: always — the mouse should always be visible in the video content of the MediaStream, unless the mouse has moved outside the area of the content.
... motion — the mouse cursor should always be included in the video if it's moving, and for a short time after it stops moving.
... never — the mouse cursor is never included in the shared video.
MediaTrackSupportedConstraints.frameRate - Web APIs
The frameRate constraint can be used to establish acceptable upper and lower bounds on the video frame rate for a new video track, or to specify an exact frame rate that must be provided for the request to succeed.
... Checking the value of this property lets you determine if the user agent allows constraining the video track configuration by frame rate.
...If the property isn't present, the user agent doesn't allow specifying limits on the frame rate for video tracks.
... Example This simple example looks to see if your browser supports constraining the frame rate when requesting video tracks.
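The support check described above can be sketched as pure logic: only include frameRate bounds in the constraints object when the user agent reports support. The helper name and the width/height defaults below are hypothetical; in a browser, `supported` would come from navigator.mediaDevices.getSupportedConstraints().

```javascript
// Build a video constraints object, adding frameRate bounds only
// if the supported-constraints dictionary lists frameRate.
function buildVideoConstraints(supported, minFps, maxFps) {
  const constraints = { width: 1280, height: 720 }; // illustrative defaults
  if (supported.frameRate) {
    constraints.frameRate = { min: minFps, max: maxFps };
  }
  return constraints;
}
```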
msPlayToSource - Web APIs
Syntax ptr = object.msPlayToSource; Value PlayTo is a means through which an app can connect local playback/display for audio, video, and img elements to a remote device.
... msPlayToSource is used in the sourcerequested handler -- get the PlayToSource object from an audio, video, or img element using the msPlayToSource property and pass it to e.setSource, then set the PlayToSource.next property to the msPlayToSource of another element for continual playing.
... Example <video id="videoplayer" src="http://www.contoso.com/clip.mp4" controls autoplay /> <script type="text/javascript"> // Step 1: Obtain PlayToManager object for app's current view.
... e.sourceRequest.setSource(document.getElementById("videoplayer").msPlayToSource); // The media will then be streamed to the device chosen by the user in the UI.
Navigator.getUserMedia() - Web APIs
The deprecated Navigator.getUserMedia() method prompts the user for permission to use up to one video input device (such as a camera or shared screen) and up to one audio input device (such as a microphone) as the source for a MediaStream.
... If permission is granted, a MediaStream whose video and/or audio tracks come from those devices is delivered to the specified success callback.
...Your callback can then assign the stream to the desired object (such as an <audio> or <video> element), as shown in the following example: function(stream) { var video = document.querySelector('video'); video.srcObject = stream; video.onloadedmetadata = function(e) { // Do something with the video here.
... navigator.getUserMedia = navigator.getUserMedia || navigator.webkitGetUserMedia || navigator.mozGetUserMedia; if (navigator.getUserMedia) { navigator.getUserMedia({ audio: true, video: { width: 1280, height: 720 } }, function(stream) { var video = document.querySelector('video'); video.srcObject = stream; video.onloadedmetadata = function(e) { video.play(); }; }, function(err) { console.log("The following error occurred: " + err.name); } ); } else { console.log("getUserMedia not supported"); ...
RTCRtpCapabilities - Web APIs
That means that, for instance, if there are two entries for the H.264 codec (as identified by the mimeType being "video/H264"), there are other values in the capabilities objects indicating how they're different in some way.
...Those components are: RED (REDundant audio data) — the media type of a RED entry may vary due to there being several versions of it, but it will end with red, such as video/red or video/fwdred.
...One possible value is video/ulpfec (a generic error connection model).
... RTX (retransmission) — this component is responsible for retransmission of data; its media type should be video/rtx.
RTCRtpEncodingParameters - Web APIs
scaleResolutionDownBy Only used for senders whose track's kind is video, this is a double-precision floating-point value specifying a factor by which to scale down the video during encoding.
... The default value, 1.0, means that the sent video's size will be the same as the original.
... A value of 2.0 scales the video frames down by a factor of 2 in each dimension, resulting in a video 1/4 the size of the original.
... The value must not be less than 1.0 (you can't use this to scale the video up).
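The arithmetic above (each dimension divided by the factor, so 2.0 yields a frame with 1/4 the pixels) can be checked with a small helper. The function name is hypothetical; the validation mirrors the spec rule that values below 1.0 are rejected.

```javascript
// Compute the encoded frame size for a given scaleResolutionDownBy
// factor. Upscaling (factor < 1.0) is forbidden, matching the spec.
function scaledResolution(width, height, scaleResolutionDownBy) {
  if (scaleResolutionDownBy < 1.0) {
    throw new RangeError("scaleResolutionDownBy must be >= 1.0");
  }
  return {
    width: width / scaleResolutionDownBy,
    height: height / scaleResolutionDownBy,
  };
}
```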
RTCRtpReceiver.getCapabilities() static function - Web APIs
All browsers support the primary media kinds: audio and video.
... Example The function below returns a Boolean indicating whether or not the device supports receiving H.264 video on a WebRTC connection.
... Since RTCRtpReceiver.getCapabilities() actually only indicates probable support, attempting to receive H.264 video might still fail even after getting a positive response from this function.
... function canReceiveH264() { let capabilities = RTCRtpReceiver.getCapabilities("video"); return capabilities.codecs.some((codec) => codec.mimeType === "video/H264"); } Specifications Specification Status Comment WebRTC 1.0: Real-time Communication Between Browsers: the definition of 'RTCRtpReceiver.getCapabilities()' in that specification.
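The codec check above can be separated from the capability lookup so the logic is testable without a browser. This is a hypothetical helper; in a browser you would pass it the result of RTCRtpReceiver.getCapabilities("video"), which may be null when the kind is unsupported.

```javascript
// Return true if a capabilities object lists an H.264 codec.
// A positive result only indicates probable support: actually
// receiving H.264 can still fail, as noted above.
function capabilitiesIncludeH264(capabilities) {
  if (!capabilities || !Array.isArray(capabilities.codecs)) {
    return false;
  }
  return capabilities.codecs.some(
    (codec) => codec.mimeType === "video/H264"
  );
}
```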
RTCRtpSendParameters.encodings - Web APIs
scaleResolutionDownBy Only used for senders whose track's kind is video, this is a double-precision floating-point value specifying a factor by which to scale down the video during encoding.
... The default value, 1.0, means that the sent video's size will be the same as the original.
... A value of 2.0 scales the video frames down by a factor of 2 in each dimension, resulting in a video 1/4 the size of the original.
... The value must not be less than 1.0 (you can't use this to scale the video up).
RTCRtpSender.replaceTrack() - Web APIs
The new track must be of the same media kind (audio, video, etc.) and switching the track should not require negotiation.
... The new track is a video track and its raw or pre-encoded state differs from that of the original track.
... Examples Switching video cameras // Example of changing video camera; suppose the selected value is saved into window.selectedCamera navigator.mediaDevices .getUserMedia({ video: { deviceId: { exact: window.selectedCamera } } }) .then(function(stream) { let videoTrack = stream.getVideoTracks()[0]; pcs.forEach(function(pc) { var sender = pc.getSenders().find(function(s) { return s.track.kind == videoTrack.kind; }); console.log('found sender:', sender); sender.replaceTrack(videoTrack); }); }) .catch(function(err) { console.error('Error occurred:', err); }); Specifications Specification Status Comment WebRTC 1.0: Real-time Communication Between Browsers: the definition of 'RTCRtpSender.replaceTrack()' in that specification.
SourceBuffer - Web APIs
SourceBuffer.videoTracks Read only A list of the video tracks currently contained inside the SourceBuffer.
... Examples The following simple example loads a video chunk by chunk as fast as possible, playing it as soon as it can.
... This example was written by Nick Desaulniers and can be viewed live here (you can also download the source for further investigation.) var video = document.querySelector('video'); var assetURL = 'frag_bunny.mp4'; // need to be specific for Blink regarding codecs // ./mp4info frag_bunny.mp4 | grep Codec var mimeCodec = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"'; if ('MediaSource' in window && MediaSource.isTypeSupported(mimeCodec)) { var mediaSource = new MediaSource(); //console.log(mediaSource.readyState); // closed video.src = URL.createObjectURL(mediaSource); mediaSource.addEventListener('sourceopen', sourceOpen); } else { console.error('Unsupported MIME type or codec: ', mimeCodec); } function sourceOpen (_) { //console.log(this.readyState); // open var mediaSource = this; var sourceBuffer = mediaSource.addSourceBuffer(mimeCodec); fetchAB(assetURL, function (buf) { sourceBuffer.addEventListener('updateend', function (_) { mediaSource.endOfStream(); video.play(); //console.log(mediaSource.readyState); // ended }); sourceBuffer.appendBuffer(buf); }); } function fetchAB (url, cb) { console.log(url); var xhr = new XMLHttpRequest; xhr.open('get', url); xhr.responseType = 'arraybuffer'; xhr.onload = function () { cb(xhr.response); }; xhr.send(); } Specifications Specification Status Comment Media Source Extensions: the definition of 'SourceBuffer' in that specification.
TextTrack.mode - Web APIs
Safari additionally requires the default Boolean attribute to be set to true when implementing your own video player controls in order for the subtitle cues to be shown.
...In general: tracks whose kind is "subtitles" or "captions" are rendered with the cues overlaid over the top of the video.
... Tracks whose kind is "descriptions" are presented in a non-visual form (for example, the text might be spoken to describe the action in the video).
... Example In this example, we configure the text track's cues so that every time a cue is finished, the video automatically pauses playback.
TrackEvent - Web APIs
Events based on TrackEvent are always sent to one of the media track list types: events involving video tracks are always sent to the VideoTrackList found in HTMLMediaElement.videoTracks; events involving audio tracks are always sent to the AudioTrackList specified in HTMLMediaElement.audioTracks; events affecting text tracks are sent to the TextTrackList object indicated by HTMLMediaElement.textTracks.
...If not null, this is always an object of one of the media track types: AudioTrack, VideoTrack, or TextTrack.
... Example This example sets up a function, handleTrackEvent(), which is called for any addtrack or removetrack event on the first <video> element found in the document.
... var videoElem = document.querySelector("video"); videoElem.videoTracks.addEventListener("addtrack", handleTrackEvent, false); videoElem.videoTracks.addEventListener("removetrack", handleTrackEvent, false); videoElem.audioTracks.addEventListener("addtrack", handleTrackEvent, false); videoElem.audioTracks.addEventListener("removetrack", handleTrackEvent, false); videoElem.textTracks.addEventListener("addtrack", handleTrackEvent, false); videoElem.textTracks.addEventListener("removetrack", handleTrackEvent, false); function handleTrackEvent(event) { var trackKind; if (event.target instanceof VideoTrackList) { trackKind = "video"; } else if (event.target instanceof AudioTrackList) { trackKind = "audio"; } else if (event.target instanceof TextTrackList) { trackKind = "tex...
getTrackById - Web APIs
The VideoTrackList method getTrackById() returns the first VideoTrack object from the track list whose id matches the specified string.
... Syntax var theTrack = videoTrackList.getTrackById(id); Parameters id A DOMString indicating the ID of the track to locate within the track list.
... Return value A VideoTrack object indicating the first track found within the VideoTrackList whose id matches the specified string.
... Specifications Specification Status Comment HTML Living Standard: the definition of 'VideoTrackList.getTrackById()' in that specification.
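The behavior described above, first match by id or null when none matches, can be sketched over a plain array (hypothetical standalone helper; a real VideoTrackList is a live, array-like object, but the matching semantics are the same):

```javascript
// Return the first track whose id matches, or null when none does,
// mirroring VideoTrackList.getTrackById() semantics.
function getTrackById(tracks, id) {
  for (const track of tracks) {
    if (track.id === id) {
      return track;
    }
  }
  return null;
}
```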
WebGLRenderingContext.texImage2D() - Web APIs
pixels); void gl.texImage2D(target, level, internalformat, format, type, HTMLVideoElement?
...et, level, internalformat, width, height, border, format, type, GLintptr offset); void gl.texImage2D(target, level, internalformat, width, height, border, format, type, HTMLCanvasElement source); void gl.texImage2D(target, level, internalformat, width, height, border, format, type, HTMLImageElement source); void gl.texImage2D(target, level, internalformat, width, height, border, format, type, HTMLVideoElement source); void gl.texImage2D(target, level, internalformat, width, height, border, format, type, ImageBitmap source); void gl.texImage2D(target, level, internalformat, width, height, border, format, type, ImageData source); void gl.texImage2D(target, level, internalformat, width, height, border, format, type, ArrayBufferView srcData, srcOffset); Parameters target A GLenum specifying t...
... RGBA I16 I16 I16 I16 ● RGBA16UI RGBA UI16 UI16 UI16 UI16 ● RGBA32I RGBA I32 I32 I32 I32 ● RGBA32UI RGBA UI32 UI32 UI32 UI32 ● Possible values in WebGL2 for the versions of texImage2D that take a texture source such as an HTMLImageElement, HTMLCanvasElement, HTMLVideoElement, ImageBitmap, or ImageData: gl.ALPHA discards the red, green and blue components and reads the alpha component.
... ImageData, HTMLImageElement, HTMLCanvasElement, HTMLVideoElement, ImageBitmap.
Compressed texture formats - Web APIs
These are useful to increase texture detail while limiting the additional video memory necessary.
... If supported, textures can be stored in a compressed format in video memory.
... This allows for additional detail while limiting the added video memory necessary.
... Note that WebGL makes no functionality available to compress or decompress textures: they must already be in a compressed format and can then be directly uploaded to video memory.
Using DTMF with WebRTC - Web APIs
In order to more fully support audio/video conferencing, WebRTC supports sending DTMF to the remote peer on an RTCPeerConnection.
... let dialString = "12024561111"; let callerPC = null; let receiverPC = null; let dtmfSender = null; let hasAddTrack = false; let mediaConstraints = { audio: true, video: false }; let offerOptions = { offerToReceiveAudio: 1, offerToReceiveVideo: 0 }; let dialButton = null; let logElement = null; These are, in order: dialString — the DTMF string the caller will send when the "Dial" button is clicked.
...We want an audio-only connection, so video is false, while audio is true.
...In this case, we state that we want to receive audio but not video.
Web Audio API - Web APIs
These could be either computed mathematically (such as OscillatorNode), or they can be recordings from sound/video files (like AudioBufferSourceNode and MediaElementAudioSourceNode) and audio streams (MediaStreamAudioSourceNode).
...An HTML <audio> or <video> element), audio destination, intermediate processing module (e.g.
... MediaElementAudioSourceNode The MediaElementAudioSourceNode interface represents an audio source consisting of an HTML5 <audio> or <video> element.
...udio is routed through your app. Controlling multiple parameters with ConstantSourceNode: this article demonstrates how to use a ConstantSourceNode to link multiple parameters together so they share the same value, which can be changed by simply setting the value of the ConstantSourceNode.offset parameter. Example and tutorial: Simple synth keyboard — this article presents the code and working demo of a video keyboard you can play using the mouse.
Content categories - Developer guides
>, <h4>, <h5>, <h6>, <header>, <hgroup>, <hr>, <i>, <iframe>, <img>, <input>, <ins>, <kbd>, <keygen>, <label>, <main>, <map>, <mark>, <math>, <menu>, <meter>, <nav>, <noscript>, <object>, <ol>, <output>, <p>, <picture>, <pre>, <progress>, <q>, <ruby>, <s>, <samp>, <script>, <section>, <select>, <small>, <span>, <strong>, <sub>, <sup>, <svg>, <table>, <template>, <textarea>, <time>, <ul>, <var>, <video>, <wbr> and text.
...Belonging to this category are <abbr>, <audio>, <b>, <bdo>, <br>, <button>, <canvas>, <cite>, <code>, <command>, <data>, <datalist>, <dfn>, <em>, <embed>, <i>, <iframe>, <img>, <input>, <kbd>, <keygen>, <label>, <mark>, <math>, <meter>, <noscript>, <object>, <output>, <picture>, <progress>, <q>, <ruby>, <samp>, <script>, <select>, <small>, <span>, <strong>, <sub>, <sup>, <svg>, <textarea>, <time>, <var>, <video>, <wbr> and plain text (not consisting only of whitespace characters).
...Elements that belong to this category include: <audio>, <canvas>, <embed>, <iframe>, <img>, <math>, <object>, <picture>, <svg>, <video>.
... Some elements belong to this category only under specific conditions: <audio>, if the controls attribute is present; <img>, if the usemap attribute is present; <input>, if the type attribute is not in the hidden state; <menu>, if the type attribute is in the toolbar state; <object>, if the usemap attribute is present; <video>, if the controls attribute is present. Palpable content: content is palpable when it's neither empty nor hidden; it is content that is rendered and is substantive.
Index - HTTP
Browsers set adequate values for this header depending on the context where the request is done: when fetching a CSS stylesheet, a different value is set for the request than when fetching an image, video, or script.
... 40 CSP: media-src — CSP, Directive, HTTP, Reference, Security. The HTTP Content-Security-Policy (CSP) media-src directive specifies valid sources for loading media using the <audio> and <video> elements.
...The autoplay attribute on <audio> and <video> elements will be ignored.
... 65 Feature-Policy: camera — Directive, Feature Policy, Feature-Policy, HTTP, Reference, camera. The HTTP Feature-Policy header camera directive controls whether the current document is allowed to use video input devices.
List of Mozilla-Based Applications - Archive of obsolete content
owser for unix kirix strata data browser kiwix offline version of wikipedia kneemail prayer, praise, and journal application komodo and komodo edit and open komodo development tools mozilla-based application (pre-xulrunner style), xul ui kompozer wysiwyg html editor unofficial bug-fix release of nvu kylo video browser uses gecko biofortis labmatrix web-accessible software application used for information management and integration of patient clinical, specimen, genetic and molecular assay data based on xul liaison groupware client for Novell's email and collaboration server previously called mozngw linbox kiosk browser (fr) web browser dedicated browser...
...system makes use of some mpl files such as libsecurity_asn1 maemo browser browser for maemo internet tablet development name is microb magooclient business process management tool uses mozilla rhino mantra security tool mccoy secure update tool for add-ons xulrunner application mediacoder media converter transcoder for video, audio, and even devices such as zen, zune, pocketpcs, ipods, and psps mekhala browser part of the khmeros linux distro midbrowser mobile web browser mockery mockup creation tool built on xulrunner mongodb database project uses spidermonkey moyura email client part of the khmeros linux distro mozcards, jolis...
...widgets desktop widgets uses mozilla spidermonkey yoono desktop social networking app standalone version of yoono firefox add-on zap sip client status update from august 2008 zimbra desktop email and calendar application uses prism zinc video browser according to faq the standalone version is based on firefox zk web application framework makes use of xul zotero reference manager firefox extension and xulrunner application note: this page was previously hosted on mozpad.org and the history for that page can be found on that site.
Plug-in Development Overview - Gecko Plugin API Reference
d resource of the plug-in DLL should contain the following set of string/value pairs: MIMEType: for MIME types; FileExtents: for file extensions; FileOpenName: for file open template; ProductName: for plug-in name; FileDescription: for description; Language: for language in use. In the MIME types and file extensions strings, multiple values are separated by the "|" character, for example: video/quicktime|audio/aiff|image/jpeg. The version stamp will be loaded only if it has been created with the language set to "US English" and the character set to "Windows Multilingual" in your development environment.
...for example: 'STR#' 128 mime type: string 1 video/quicktime, string 2 mov, moov, string 3 audio/aiff, string 4 aiff, string 5 image/jpeg, string 6 jpg. Several other optional strings may contain useful information about the plug-in.
...for example, this description list corresponds to the types in the previous example: string 1: "quicktime video", string 4: "aiff audio", and string 5: "jpeg image format." 'STR#' 126: string 1 can contain a descriptive message about the plug-in.
What is accessibility? - Learn web development
The video below also provides a brief example of what the experience is like.
...captions) that can be displayed along with video.
... A good foundation of accessibility for people with cognitive impairments includes: delivering content in more than one way, such as by text-to-speech or by video; easily-understood content, such as text written using plain-language standards; focusing attention on important content; minimizing distractions, such as unnecessary content or advertisements; consistent webpage layout and navigation; familiar elements, such as underlined links, blue when not visited and purple when visited; dividing processes into logical, essential steps with progress ind...
Images, media, and form elements - Learn web development
Replaced elements: images and video are described as replaced elements.
... Certain replaced elements, such as images and video, are also described as having an aspect ratio.
...this technique will work with other replaced elements such as <video>s, or <iframe>s.
How do I start to design my website? - Learn web development
Teach music through videos.
... Teach music through videos.
...mail?) define how people will find those contact channels from your website; sell goodies: create the goodies, store the goodies, find a way to handle shipping, find a way to handle payment, make a mechanism on your site for people to place orders; teach music through videos: record video lessons, prepare video files viewable online (again, could you do this with existing web services?), give people access to your videos on some part of your website. Two things to notice.
What is a URL? - Learn web development
On an HTML document, for example, the browser will scroll to the point where the anchor is defined; on a video or audio document, the browser will try to go to the time the anchor represents.
... The HTML language — which will be discussed later on — makes extensive use of URLs: to create links to other documents with the <a> element; to link a document with its related resources through various elements such as <link> or <script>; to display media such as images (with the <img> element), videos (with the <video> element), sounds and music (with the <audio> element), etc.; to display other HTML documents with the <iframe> element.
... Note: when specifying URLs to load resources as part of a page (such as when using the <script>, <audio>, <img>, <video>, and the like), you should generally only use HTTP and HTTPS URLs, with few exceptions (one notable one being data:; see data URLs).
Creating hyperlinks - Learn web development
Note: a URL can point to HTML files, text files, images, text documents, video and audio files, or anything else that lives on the web.
... Linking to non-HTML resources — leave clear signposts: when linking to a resource that will be downloaded (like a PDF or Word document), streamed (like video or audio), or has another potentially unexpected effect (opens a popup window, or loads a Flash movie), you should add clear wording to reduce any confusion.
... Let's look at some examples, to see what kind of text can be used here: <p><a href="http://www.example.com/large-report.pdf"> download the sales report (pdf, 10mb) </a></p> <p><a href="http://www.example.com/video-stream/" target="_blank"> watch the video (stream opens in separate tab, hd quality) </a></p> <p><a href="http://www.example.com/car-game"> play the car game (requires flash) </a></p> Use the download attribute when linking to a download: when you are linking to a resource that's to be downloaded rather than opened in the browser, you can use the download attribute to provide a default save...
Multimedia and Embedding - Learn web development
This module explores how to use HTML to include multimedia in your web pages, including the different ways that images can be included, and how to embed video, audio, and even entire webpages.
... Video and audio content: next, we'll look at how to use the HTML5 <video> and <audio> elements to embed video and audio on our pages, including basics, providing access to different file formats to different browsers, adding captions and subtitles, and how to add fallbacks for older browsers.
... Assessments: the following assessments will test your understanding of the HTML basics covered in the guides above: Mozilla splash page: in this assessment, we'll test your knowledge of some of the techniques discussed in this module's articles, getting you to add some images and video to a funky splash page all about Mozilla!
Drawing graphics - Learn web development
These can be simple images, frames from videos, or the content of other canvases.
... Note: in our GitHub repo you can also find another interesting 3D cube example — three.js video cube (see it live also).
... This uses getUserMedia() to take a video stream from a computer web cam and project it onto the side of the cube as a texture!
What is JavaScript? - Learn web development
A high-level definition: JavaScript is a scripting or programming language that allows you to implement complex features on web pages — every time a web page does more than just sit there and display static information for you to look at — displaying timely content updates, interactive maps, animated 2D/3D graphics, scrolling video jukeboxes, etc.
... HTML is the markup language that we use to structure and give meaning to our web content, for example defining paragraphs, headings, and data tables, or embedding images and videos in the page.
... Audio and video APIs like HTMLMediaElement and WebRTC allow you to do really interesting things with multimedia, such as play audio and video right in a web page, or grab video from your web camera and display it on someone else's computer (try our simple snapshot demo to get the idea).
JavaScript performance - Learn web development
While images and video account for over 70% of the bytes downloaded for the average website, byte per byte, JavaScript has a greater negative impact on performance.
... Similar to images and video, the best way to improve performance is to omit what is not, in fact, necessary.
... Measuring performance; multimedia: images; multimedia: video; JavaScript performance best practices.
Embedding API for Accessibility
ndFile); setCharPref("alert.audio.script_waiting", pathToSoundFile); setCharPref("alert.audio.redirect_waiting", pathToSoundFile); setCharPref("alert.audio.refresh_waiting", pathToSoundFile); setCharPref("alert.audio.plugin_content_waiting", pathToSoundFile); setCharPref("alert.audio.video_waiting", pathToSoundFile); setCharPref("alert.audio.audio_waiting", pathToSoundFile); setCharPref("alert.audio.timed_event_waiting", pathToSoundFile); /* these alerts will also be mirrored visually, either on the status bar or elsewhere */ no background images ...
...directs", acceptRedirects); no content refreshes setBoolPref("browser.accept.refreshes", acceptRefreshes); no plugin content setBoolPref("browser.accept.plugin_content.[plugin_name_goes_here]", acceptPluginContent); no video setBoolPref("browser.accept.video", acceptVideo); no audio setBoolPref("browser.accept.audio", acceptAudio); no timed events setBoolPref("browser.accept.timed_events", acceptTimedEvents); no ...
...the audio file could be a clip of a human voice saying "you've got video" or even a simple beep.
Experimental features in Firefox
Nightly 64: no; Developer Edition 64: no; Beta 64: no; Release 64: no. Preference name: media.setsinkid.enabled. HTMLMediaElement properties: audioTracks and videoTracks. Enabling this feature adds the HTMLMediaElement.audioTracks and HTMLMediaElement.videoTracks properties to all HTML media elements.
... However, because Firefox doesn't currently support multiple audio and video tracks, the most common use cases for these properties don't work, so they're both disabled by default.
...This is a still image file format that leverages the capabilities of the AV1 video compression algorithms to reduce image size.
mozbrowsercontextmenu
In the case of an image or video context menu, this is the src of the image or video clicked on to get the context menu.
... hasVideo: a boolean.
... In the case of a video context menu, this returns true if the video has metadata and is bigger than 0 x 0, or false if not.
Plug-in Basics - Plugins
For example, this object element calls a plug-in that displays video: <object data="newave.avi" type="video/avi" width="320" height="200" autostart="true" loop="true"> </object> A hidden plug-in is a type of embedded plug-in that is not drawn on the screen when it is invoked.
... For example, a plug-in that displays video could have private attributes that determine whether to start the plug-in automatically or loop the video automatically on playback, as in the following embed element: <embed src="myavi.avi" width="100" height="125" autostart="true" loop="true"> With this embed element, Gecko passes the values to the plug-in, using the arg parameters of the NPP_New call that creates the plug-in instance.
... The plug-in must scan its list of attributes to determine whether it should automatically start the video and loop it on playback.
Plug-in Development Overview - Plugins
d resource of the plug-in DLL should contain the following set of string/value pairs: MIMEType: for MIME types; FileExtents: for file extensions; FileOpenName: for file open template; ProductName: for plug-in name; FileDescription: for description; Language: for language in use. In the MIME types and file extensions strings, multiple values are separated by the "|" character, for example: video/quicktime|audio/aiff|image/jpeg. The version stamp will be loaded only if it has been created with the language set to "US English" and the character set to "Windows Multilingual" in your development environment.
...for example: 'STR#' 128 mime type: string 1 video/quicktime, string 2 mov, moov, string 3 audio/aiff, string 4 aiff, string 5 image/jpeg, string 6 jpg. Several other optional strings may contain useful information about the plug-in.
...for example, this description list corresponds to the types in the previous example: string 1: "quicktime video", string 4: "aiff audio", and string 5: "jpeg image format." 'STR#' 126: string 1 can contain a descriptive message about the plug-in.
AudioContext.createMediaStreamSource() - Web APIs
Example: in this example, we grab a media (audio + video) stream from navigator.getUserMedia, feed the media into a <video> element to play, then mute the audio, but also feed the audio into a MediaStreamAudioSourceNode.
... The range slider below the <video> element controls the amount of gain given to the lowpass filter — increase the value of the slider to make the audio sound more bass heavy!
... var pre = document.querySelector('pre'); var video = document.querySelector('video'); var myScript = document.querySelector('script'); var range = document.querySelector('input'); // getUserMedia block - grab stream // put it into a MediaStreamAudioSourceNode // also output the visuals into a video element if (navigator.mediaDevices) { console.log('getUserMedia supported.'); navigator.mediaDevices.getUserMedia({audio: true, video: true}) .then(function(stream) { video.srcObject = stream; video.onloadedmetadata = function(e) { video.play(); video.muted = true; }; // create a MediaStreamAudioSourceNode // feed the HTMLMediaElement into it va...
AudioTrack.enabled - Web APIs
function swapCommentaryMain() { var videoElem = document.getElementById("main-video"); var audioTrackMain; var audioTrackCommentary; for (var i = 0; i < videoElem.audioTracks.length; i++) { var track = videoElem.audioTracks[i]; if (track.kind === "main") { audioTrackMain = track; } else if (track.kind === "commentary") { audioTrackCommentary = track; } } if (audioTrackMain && audioTrackCommentary) { var commentaryEnabled = audioTrackCommentary.enabled; ...
... audioTrackCommentary.enabled = audioTrackMain.enabled; audioTrackMain.enabled = commentaryEnabled; } } The swapCommentaryMain() function above finds, within the audio tracks of the <video> element "main-video", the audio tracks whose kind values are "main" and "commentary".
... Note: this example assumes that there is only one of each kind of track in the video, but this is not necessarily the case.
Using images - Web APIs
HTMLVideoElement: using an HTML <video> element as your image source grabs the current frame from the video and uses it as an image.
... Using frames from a video: you can also use frames from a video being presented by a <video> element (even if the video is not visible).
... For example, if you have a <video> element with the id "myvideo", you can do this: function getMyVideo() { var canvas = document.getElementById('canvas'); if (canvas.getContext) { var ctx = canvas.getContext('2d'); return document.getElementById('myvideo'); } } This returns the HTMLVideoElement object for the video, which, as covered earlier, is one of the objects that can be used as a CanvasImageSource.
DisplayMediaStreamConstraints - Web APIs
The DisplayMediaStreamConstraints dictionary is used to specify whether or not to include video and/or audio tracks in the MediaStream to be returned by getDisplayMedia(), as well as what type of processing must be applied to the tracks.
... video: if true (the default), the display contents are included in a MediaStreamTrack within the stream provided by getDisplayMedia().
... Optionally, a MediaTrackConstraints object may be given, providing options specifying processing to be performed on the video data before adding it to the stream.
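A constraints dictionary of this shape is just a plain object built before the getDisplayMedia() call. The sketch below shows one way to construct it; `buildDisplayConstraints` is a hypothetical helper (not part of the API), and the `cursor` constraint is one example of per-track processing options a user agent may support:

```javascript
// Hypothetical helper: build a DisplayMediaStreamConstraints-style dictionary.
// video is true by default; passing a cursor mode swaps in a
// MediaTrackConstraints object instead of the plain boolean.
function buildDisplayConstraints(withAudio, cursorMode) {
  return {
    video: cursorMode ? { cursor: cursorMode } : true,
    audio: Boolean(withAudio)
  };
}
```

In a page, the result would be passed straight through, e.g. `navigator.mediaDevices.getDisplayMedia(buildDisplayConstraints(false, 'motion'))`.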
Fullscreen API - Web APIs
Example: in this example, a video is presented in a web page.
... Pressing the Return or Enter key lets the user toggle between windowed and full-screen presentation of the video.
...Switching to full-screen mode is done by calling Element.requestFullscreen() on the <video> element.
msAudioCategory - Web APIs
msAudioCategory specifies the purpose of the audio or video media, such as background audio or alerts.
...Examples include the following local media playback scenarios: local playlist, streaming radio, streaming playlist, music videos, streaming audio/radio, YouTube, Netflix, etc.
... Yes. Communications: for streaming communication audio such as the following: VoIP, real-time chat, or other types of phone calls. Should not be used in non-real-time or non-communication scenarios, such as audio and/or video playback, as playback startup latency is affected.
HTMLMediaElement.audioTracks - Web APIs
The media element may be either an <audio> element or a <video> element.
... <video id="video" src="somevideo.mp4"></video> JavaScript: the JavaScript code handles muting the video element's audio tracks.
... var video = document.getElementById("video"); for (var i = 0; i < video.audioTracks.length; i += 1) { video.audioTracks[i].enabled = false; } Specifications: HTML Living Standard, the definition of 'HTMLMediaElement.audioTracks' in that specification.
HTMLMediaElement.autoplay - Web APIs
Note: sites which automatically play audio (or videos with an audio track) can be an unpleasant experience for users, so it should be avoided when possible.
... Note: some browsers offer users the ability to override autoplay in order to prevent disruptive audio or video from playing without permission or in the background.
... <video id="video" controls> <source src="https://player.vimeo.com/external/250688977.sd.mp4?s=d14b1f1a971dde13c79d6e436b88a6a928dfe26b&profile_id=165"> </video> // disable autoplay (recommended); false is the default value: document.querySelector('#video').autoplay = false; Specifications: HTML Living Standard, the definition of 'HTMLMediaElement.autoplay' in that specification.
HTMLMediaElement: loadstart event - Web APIs
Bubbles: no. Cancelable: no. Interface: Event. Event handler property: onloadstart. Examples (live example). HTML: <div class="example"> <button type="button">Load video</button> <video controls width="250"></video> <div class="event-log"> <label>Event log:</label> <textarea readonly class="event-log-contents"></textarea> </div> </div> CSS: .event-log-contents { width: 18rem; height: 5rem; border: 1px solid black; margin: .2rem; padding: .2rem; } .example { display: grid; grid-template-areas: "button log" "video log"; } button { grid-area: button; width: 10rem; margin: .5rem 0; } video { grid-area: video; } .event-log { grid-area: log; } .event-log>label { display: block; } JS: const loadVideo = document.querySelector('button'); const video = document.querySelector('video'); const eventLog = document.querySelector('.event-log-contents'); let source = null; function handleEvent(event) { eventLog.textContent = eventLog.textContent + `${event.type}\n`; } video.addEventListener('loadstart', handleEvent); video.addEventListener('progress', handleEvent); video.addEventListener('canplay', handleEvent); video.addEventListener('canplaythrough', handleEvent); loadVideo.addEventListener('click', () => { if (source) { document.location.reload(); } else { loadVideo.textContent = "Reset example"; source = document.createElement('source'); source.setAttribute('src', 'https://interactive-examples.mdn.mozilla.net/media/examples/flower.webm'); source.setAttribute('type', 'video/webm'); video.appendChild(source); } }); Result. Specifications: HTML Living Standard, the definition of 'loadstart media event' in that specification.
HTMLMediaElement: progress event - Web APIs
Bubbles: no. Cancelable: no. Interface: Event. Event handler property: onprogress. Examples (live example). HTML: <div class="example"> <button type="button">Load video</button> <video controls width="250"></video> <div class="event-log"> <label>Event log:</label> <textarea readonly class="event-log-contents"></textarea> </div> </div> CSS: .event-log-contents { width: 18rem; height: 5rem; border: 1px solid black; margin: .2rem; padding: .2rem; } .example { display: grid; grid-template-areas: "button log" "video log"; } button { grid-area: button; width: 10rem; margin: .5rem 0; } video { grid-area: video; } .event-log { grid-area: log; } .event-log>label { display: block; } JavaScript: const loadVideo = document.querySelector('button'); const video = document.querySelector('video'); const eventLog = document.querySelector('.event-log-contents'); let source = null; function handleEvent(event) { eventLog.textContent = eventLog.textContent + `${event.type}\n`; } video.addEventListener('loadstart', handleEvent); video.addEventListener('progress', handleEvent); video.addEventListener('canplay', handleEvent); video.addEventListener('canplaythrough', handleEvent); loadVideo.addEventListener('click', () => { if (source) { document.location.reload(); } else { loadVideo.textContent = "Reset example"; source = document.createElement('source'); source.setAttribute('src', 'https://interactive-examples.mdn.mozilla.net/media/examples/flower.webm'); source.setAttribute('type', 'video/webm'); video.appendChild(source); } }); Result. Specifications: HTML Living Standard, the definition of 'progress media event' in that specification.
HTMLMediaElement.seekToNextFrame() - Web APIs
This method lets you access frames of video media without the media being played in real time.
...Possible uses for this method include filtering and editing of video content.
... If there is no video on the media element, or the media isn't seekable, nothing happens.
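Since seekToNextFrame() is promise-based, frame-by-frame processing is naturally written as an async loop. The sketch below steps through a fixed number of frames; the element is passed in as a parameter so the control flow can be exercised with a stub, and note that seekToNextFrame() itself is non-standard and Firefox-only:

```javascript
// Step through n frames, one at a time. Each call to seekToNextFrame()
// returns a promise that resolves once the next frame has been sought to,
// so frames are processed strictly in order, not in real time.
async function stepFrames(videoElem, n) {
  let stepped = 0;
  for (let i = 0; i < n; i++) {
    await videoElem.seekToNextFrame(); // advance exactly one frame
    stepped++;                          // a filter/edit pass would go here
  }
  return stepped;
}
```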
HTMLMediaElement: timeupdate event - Web APIs
User agents are encouraged to vary the frequency of the event based on the system load and the average cost of processing the event each time, so that the UI updates are not any more frequent than the user agent can comfortably handle while decoding the video.
... Using addEventListener(): const video = document.querySelector('video'); video.addEventListener('timeupdate', (event) => { console.log('The currentTime attribute has been updated.
... again.'); }); Using the ontimeupdate event handler property: const video = document.querySelector('video'); video.ontimeupdate = (event) => { console.log('The currentTime attribute has been updated.
MediaCapabilities.encodingInfo() - Web APIs
Syntax: mediaCapabilities.encodingInfo(mediaEncodingConfiguration). Parameters: mediaEncodingConfiguration, a valid MediaEncodingConfiguration dictionary containing a valid media encoding type of record or transmission and a valid media configuration: either an AudioConfiguration or VideoConfiguration dictionary.
... Return value: a promise fulfilling with a MediaCapabilitiesInfo interface containing three boolean attributes: supported, smooth, powerEfficient. Exceptions: a TypeError is raised if the MediaConfiguration passed to the encodingInfo() method is invalid, either because the type is not video or audio, the contentType is not a valid codec MIME type, or any other error in the media configuration passed to the method, including omitting any of the media encoding configuration elements.
... Example: // create media configuration to be tested const mediaConfig = { type : 'record', // or 'transmission' video : { contentType : "video/webm;codecs=vp8.0", // valid content type width : 1920, // width of the video height : 1080, // height of the video bitrate : 120000, // number of bits used to encode 1s of video framerate : 48 // number of frames making up that 1s.
MediaDevices - Web APIs
getUserMedia(): with the user's permission through a prompt, turns on a camera and/or a microphone on the system and provides a MediaStream containing a video track and/or an audio track with the input.
...var video = document.querySelector('video'); var constraints = window.constraints = { audio: false, video: true }; var errorElement = document.querySelector('#errormsg'); navigator.mediaDevices.getUserMedia(constraints) .then(function(stream) { var videoTracks = stream.getVideoTracks(); console.log('Got stream with constraints:', constraints); console.log('Using video device: ' + videoTracks[0].label); stream.onremovetrack = function() { console.log('Stream ended'); }; window.stream = stream; // make variable available to browser console video.srcObject = stream; }) .catch(function(error) { if (error.name === 'ConstraintNotSatisfiedError') { errorMsg('The resolution ' + constraints.video.width.exact + 'x' + constraints.video.height.exact + ' px is not supported by your device.'); } else if (error.name === 'PermissionDeniedError') { errorMsg('Permissions have not been granted to use your camera and ' + 'microphone, you need to allow the page access to your devices in ' + 'order for the demo to work.'); } errorMsg('getUserMedia error: ' + error.name, error); }); function errorMsg(msg, error) { errorElement.innerHTML += '<p>' + msg + '</p>'; if (typeof error !== 'undefined') { console.error(error); } } Specifications: Media Capture and Streams, the definition of 'MediaDevices' in that specification.
MediaRecorder.mimeType - Web APIs
if (navigator.mediaDevices) { console.log('getUserMedia supported.'); var constraints = { audio: true, video: true }; var chunks = []; navigator.mediaDevices.getUserMedia(constraints) .then(function(stream) { var options = { audioBitsPerSecond: 128000, videoBitsPerSecond: 2500000, mimeType: 'video/mp4' } var mediaRecorder = new MediaRecorder(stream, options); m = mediaRecorder; m.mimeType; // would return 'video/mp4' ...
... }) .catch(function(error) { console.log(error.message); }); Changing line 14 to the following causes MediaRecorder to try to use AVC Constrained Baseline Profile Level 4 for video and AAC-LC (low complexity) for audio, which is good for mobile and other possible resource-constrained situations.
... mimeType: 'video/mp4; codecs="avc1.424028, mp4a.40.2"' Assuming this configuration is acceptable to the user agent, the value returned later by m.mimeType would then be video/mp4; codecs="avc1.424028, mp4a.40.2".
MediaRecorder - Web APIs
Options are available to do things like set the container's MIME type (such as "video/webm" or "video/mp4") and the bit rates of the audio and video tracks, or a single overall bit rate.
...If this attribute is false, MediaRecorder will record silence for audio and black frames for video.
... MediaRecorder.videoBitsPerSecond (read only): returns the video encoding bit rate in use.
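Because container/codec support varies between user agents, a common pattern is to probe a list of candidate MIME types with MediaRecorder.isTypeSupported() before constructing the recorder. This sketch injects the support check as a parameter so the selection logic can run without a browser; the candidate list is an illustrative assumption, not a recommendation:

```javascript
// Pick the first supported container/codec combination and attach bit rates.
// isTypeSupported: pass MediaRecorder.isTypeSupported in a real page.
function pickRecorderOptions(isTypeSupported, videoBits, audioBits) {
  const candidates = ['video/webm;codecs=vp9,opus', 'video/webm', 'video/mp4'];
  for (const mimeType of candidates) {
    if (isTypeSupported(mimeType)) {
      return {
        mimeType,
        videoBitsPerSecond: videoBits,
        audioBitsPerSecond: audioBits
      };
    }
  }
  return {}; // nothing matched: let the user agent choose everything
}
```

In a page: `new MediaRecorder(stream, pickRecorderOptions(MediaRecorder.isTypeSupported, 2500000, 128000))`.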
MediaStreamAudioSourceNode - Web APIs
Example: in this example, we grab a media (audio + video) stream from navigator.getUserMedia, feed the media into a <video> element to play, then mute the audio, but also feed the audio into a MediaStreamAudioSourceNode.
... The range slider below the <video> element controls the amount of gain given to the lowpass filter — increase the value of the slider to make the audio sound more bass heavy!
... var pre = document.querySelector('pre'); var video = document.querySelector('video'); var myScript = document.querySelector('script'); var range = document.querySelector('input'); // getUserMedia block - grab stream // put it into a MediaStreamAudioSourceNode // also output the visuals into a video element if (navigator.mediaDevices) { console.log('getUserMedia supported.'); navigator.mediaDevices.getUserMedia({audio: true, video: true}) .then(function(stream) { video.srcObject = stream; video.onloadedmetadata = function(e) { video.play(); video.muted = true; }; // create a MediaStreamAudioSourceNode // feed the HTMLMediaElement into it va...
MediaStreamConstraints - Web APIs
video: either a boolean (which indicates whether or not a video track is requested) or a MediaTrackConstraints object providing the constraints which must be met by the video track included in the returned MediaStream.
... If constraints are specified, a video track is inherently requested.
...Streams isolated in this way can only be displayed in a media element (<audio> or <video>) where the content is protected just as if CORS cross-origin rules were in effect.
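The boolean-or-object shape described above means constraints dictionaries are just plain objects. A minimal sketch of building one that requests an exact video resolution and no audio (`exactResolutionConstraints` is a hypothetical helper, not part of the API):

```javascript
// Build a MediaStreamConstraints dictionary: no audio track, and a video
// track whose resolution must match exactly (the call fails otherwise).
function exactResolutionConstraints(width, height) {
  return {
    audio: false,
    video: {
      width: { exact: width },
      height: { exact: height }
    }
  };
}
```

In a page: `navigator.mediaDevices.getUserMedia(exactResolutionConstraints(1280, 720))`. Using `ideal` instead of `exact` would let the user agent fall back to the closest available resolution rather than rejecting.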
MediaStreamTrack.enabled - Web APIs
For video tracks, every frame is filled entirely with black pixels.
...Empty video frames have every pixel set to black.
... Usage notes: if the MediaStreamTrack represents the video input from a camera, disabling the track by setting enabled to false also updates device activity indicators to show that the camera is not currently recording or streaming.
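A typical use of this property is a "camera off" button that flips `enabled` on every video track in a stream. The sketch below works on any object exposing getVideoTracks(), so it can be exercised with a stub stream outside a browser; in a page you would pass the MediaStream from getUserMedia():

```javascript
// Mute or unmute every video track in a stream by flipping `enabled`.
// While disabled, the track renders only black frames, and camera
// activity indicators are updated to show the device is not in use.
function setVideoEnabled(stream, enabled) {
  for (const track of stream.getVideoTracks()) {
    track.enabled = enabled;
  }
  return stream;
}
```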
MediaStreamTrack - Web APIs
The MediaStreamTrack interface represents a single media track within a stream; typically, these are audio or video tracks, but other track types may exist as well.
... MediaStreamTrack.kind (read only): returns a DOMString set to "audio" if the track is an audio track and to "video" if it is a video track.
... MediaStreamTrack.readonly (read only): returns a boolean value with a value of true if the track is readonly (such as a video file source or a camera whose settings can't be modified), false otherwise.
MediaStreamTrackAudioSourceNode - Web APIs
Example: in this example, we grab a media (audio + video) stream from navigator.getUserMedia, feed the media into a <video> element to play, then mute the audio, but also feed the audio into a MediaStreamAudioSourceNode.
... The range slider below the <video> element controls the amount of gain given to the lowpass filter — increase the value of the slider to make the audio sound more bass heavy!
... var pre = document.querySelector('pre'); var video = document.querySelector('video'); var myScript = document.querySelector('script'); var range = document.querySelector('input'); // getUserMedia block - grab stream // put it into a MediaStreamAudioSourceNode // also output the visuals into a video element if (navigator.mediaDevices) { console.log('getUserMedia supported.'); navigator.mediaDevices.getUserMedia({audio: true, video: true}) .then(function(stream) { video.srcObject = stream; video.onloadedmetadata = function(e) { video.play(); video.muted = true; }; // create a MediaStreamAudioSourceNode // feed the HTMLMediaElement into it va...
MediaTrackConstraints.displaySurface - Web APIs
Usage notes: you can check the setting selected by the user agent after the display media has been created by getDisplayMedia() by calling getSettings() on the display media's video MediaStreamTrack, then checking the value of the returned MediaTrackSettings object's displaySurface property.
... For example, if your app needs to know that the surface being shared is a monitor or application, meaning that there's possibly a non-content backdrop, it can use code similar to this: let mayHaveBackdropFlag = false; let displaySurface = displayStream.getVideoTracks()[0].getSettings().displaySurface; if (displaySurface === "monitor" || displaySurface === "application") { mayHaveBackdropFlag = true; } Following this code, mayHaveBackdropFlag is true if the display surface contained in the stream is of type monitor or application; either of these may have non-content backdrop areas.
... Later code can use this flag to determine whether or not to perform special processing, such as to remove or replace the backdrop, or to "cut" the individual display areas out of the received frames of video.
Network Information API - Web APIs
This example would be called soon after page load to check for a connection type where preloading a video may not be desirable.
... If a cellular connection is found, then the preloadVideo flag is set to false.
... let preloadVideo = true; var connection = navigator.connection || navigator.mozConnection || navigator.webkitConnection; if (connection) { if (connection.effectiveType === 'slow-2g') { preloadVideo = false; } } Interfaces: NetworkInformation provides information about the connection a device is using to communicate with the network and provides a means for scripts to be notified if the connection type changes.
RTCInboundRtpStreamStats.pliCount - Web APIs
A PLI packet indicates that some amount of encoded video data has been lost for one or more frames.
...These are sent by the receiver's decoder to notify the encoder (the sender) that an undefined amount of coded video data, which may span frame boundaries, has been lost.
... This information is only available for video streams.
RTCInboundRtpStreamStats.qpSum - Web APIs
The qpSum property of the RTCInboundRtpStreamStats dictionary is a value generated by adding the quantization parameter (QP) values for every frame sent or received to date on the video track corresponding to this RTCInboundRtpStreamStats object.
... In general, the higher this number is, the more heavily compressed the video data is.
... Note: this value is only available for video media.
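Since qpSum is cumulative, it is usually interpreted relative to a frame count taken from the same stats snapshot. A minimal helper, assuming you have already pulled qpSum and framesDecoded out of a stats report:

```javascript
// Average quantization parameter per frame; a higher average generally
// means heavier compression. Dividing by framesDecoded assumes both
// values come from the same RTCInboundRtpStreamStats snapshot.
function averageQp(qpSum, framesDecoded) {
  if (!framesDecoded) return 0; // no frames decoded yet; avoid division by zero
  return qpSum / framesDecoded;
}
```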
RTCOutboundRtpStreamStats.pliCount - Web APIs
A PLI packet indicates that some amount of encoded video data has been lost for one or more frames.
...These are sent by the receiver's decoder to notify the sender's encoder that an undefined amount of coded video data, which may span frame boundaries, has been lost.
... Note: this property is only used for video streams.
RTCOutboundRtpStreamStats.qpSum - Web APIs
The qpSum property of the RTCOutboundRtpStreamStats dictionary is a value generated by adding the quantization parameter (QP) values for every frame this sender has produced to date on the video track corresponding to this RTCOutboundRtpStreamStats object.
... In general, the higher this number is, the more heavily compressed the video data is.
... Note: this value is only available for video media.
RTCPeerConnection.addStream() - Web APIs
The obsolete RTCPeerConnection method addStream() adds a MediaStream as a local source of audio or video.
... Example: this simple example adds the audio and video stream coming from the user's camera to the connection.
... navigator.mediaDevices.getUserMedia({ video: true, audio: true }) .then(function(stream) { var pc = new RTCPeerConnection(); pc.addStream(stream); }); Migrating to addTrack(): compatibility allowing, you should update your code to instead use the addTrack() method: navigator.mediaDevices.getUserMedia({ video: true, audio: true }) .then(function(stream) { var pc = new RTCPeerConnection(); stream.getTracks().forEach(function(track) { pc.addTrack(track, stream); }); }); The newer addTrack() API avoids confusion over whether later changes to the track makeup of a stream affect a peer connection (they do not).
RTCPeerConnection.createAnswer() - Web APIs
Example: here is a segment of code taken from the code that goes with the article Signaling and video calling.
...In this case, a WebSocket connection is used to send a JSON message with a type field with the value "video-answer" to the other peer, carrying the answer to the device which sent the offer to connect.
... See Handling the invitation in Signaling and video calling to see the complete code, in context, from which this snippet is derived; that will help you understand the signaling process and how answers work.
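The message shape described above can be sketched as a small builder. The "video-answer" type string follows the article's convention; sendToServer is a hypothetical signaling function, not a standard API.

```javascript
// Build the JSON signaling message carrying an SDP answer, matching the
// { type: "video-answer", sdp } shape used in Signaling and video calling.
function makeAnswerMessage(sdp) {
  return { type: "video-answer", sdp: sdp };
}

// Browser usage (sketch): create the answer, apply it locally, then signal it.
// pc.createAnswer()
//   .then((answer) => pc.setLocalDescription(answer))
//   .then(() => sendToServer(makeAnswerMessage(pc.localDescription)));
```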
RTCPeerConnection: negotiationneeded event - Web APIs
See Signaling transaction flow in Signaling and video calling for a description of the signaling process that begins with a negotiationneeded event.
... pc.addEventListener("negotiationneeded", ev => { pc.createOffer() .then(offer => pc.setLocalDescription(offer)) .then(() => sendSignalingMessage({ type: "video-offer", sdp: pc.localDescription })) .catch(err => { /* handle error */ }); }, false); After creating the offer, the local end is configured by calling RTCPeerConnection.setLocalDescription(); then a signaling message is created and sent to the remote peer through the signaling server, to share that offer with the other peer.
... You can also set an event handler for the negotiationneeded event by assigning the event handler function to the RTCPeerConnection.onnegotiationneeded property: pc.onnegotiationneeded = ev => { pc.createOffer() .then(offer => pc.setLocalDescription(offer)) .then(() => sendSignalingMessage({ type: "video-offer", sdp: pc.localDescription })) .catch(err => { /* handle error */ }); }; For a more detailed example, see Starting negotiation in Signaling and video calling.
RTCPeerConnection.ontrack - Web APIs
Example: this example, taken from the code for the article Signaling and video calling, connects the incoming track to the <video> element which will be used to display the incoming video.
... pc.ontrack = function(event) { document.getElementById("received_video").srcObject = event.streams[0]; document.getElementById("hangup-button").disabled = false; }; The first line of our ontrack event handler takes the first stream in the incoming track and sets the srcObject attribute to that.
... This connects that stream of video to the element so that it begins to be presented to the user.
RTCPeerConnection - Web APIs
Its methods, from EventTarget: addIceCandidate(): when a web site or app using RTCPeerConnection receives a new ICE candidate from the remote peer over its signaling channel, it delivers the newly-received candidate to the browser's ICE agent by calling RTCPeerConnection.addIceCandidate(). addStream(): the obsolete RTCPeerConnection method addStream() adds a MediaStream as a local source of audio or video.
...If no stream matches, it returns null. getTransceivers(): the RTCPeerConnection interface's getTransceivers() method returns a list of the RTCRtpTransceiver objects being used to send and receive data on the connection. removeStream(): the RTCPeerConnection.removeStream() method removes a MediaStream as a local source of audio or video.
... Constant description: "balanced": the ICE agent initially creates one RTCDtlsTransport for each type of content added: audio, video, and data channels.
RTCRtpSender.getCapabilities() static function - Web APIs
All browsers support the primary media kinds: audio and video.
... Example: the function below returns a boolean indicating whether or not the device supports sending H.264 video on an RTCRtpSender.
... function canSendH264() { let capabilities = RTCRtpSender.getCapabilities("video"); return capabilities.codecs.some((codec) => codec.mimeType === "video/H264"); } Specifications: WebRTC 1.0: Real-Time Communication Between Browsers, the definition of 'RTCRtpSender.getCapabilities()' in that specification.
RTCRtpStreamStats.kind - Web APIs
The kind property of the RTCRtpStreamStats dictionary is a string indicating whether the described RTP stream contains audio or video media.
... Its value is always either "audio" or "video".
... Syntax: mediaKind = RTCRtpStreamStats.kind; Value: a DOMString whose value is "audio" if the track whose statistics are given by the RTCRtpStreamStats object contains audio, or "video" if the track contains video media.
RTCRtpStreamStats.pliCount - Web APIs
A PLI packet indicates that some amount of encoded video data has been lost for one or more frames.
... A PLI message is used by video decoders (running on the receiving end of the stream) to notify the encoder (the sender) that an undefined amount of coded video data, which may span frame boundaries, has been lost.
... Note: this value is only available on the receiver, and only for video media.
RTCRtpStreamStats.qpSum - Web APIs
The qpSum property of the RTCRtpStreamStats dictionary is a value generated by adding the quantization parameter (QP) values for every frame sent or received to date on the video track corresponding to this RTCRtpStreamStats object.
... In general, the higher this number is, the more heavily compressed the video data is.
... Note: this value is only available for video media.
RTCStatsReport - Web APIs
The statistics object is an RTCAudioReceiverStats object if kind is audio; if kind is video, the object is an RTCVideoReceiverStats object.
...If kind is "audio", this object is of type RTCAudioSenderStats; if kind is "video", this is an RTCVideoSenderStats object.
... track: the object is one of the types based on RTCMediaHandlerStats: for audio tracks, the type is RTCSenderAudioTrackAttachmentStats and for video tracks, the type is RTCSenderVideoTrackAttachmentStats.
RTCStatsType - Web APIs
The statistics object is an RTCAudioReceiverStats object if kind is audio; if kind is video, the object is an RTCVideoReceiverStats object.
...If kind is "audio", this object is of type RTCAudioSenderStats; if kind is "video", this is an RTCVideoSenderStats object.
... track: the object is one of the types based on RTCMediaHandlerStats: for audio tracks, the type is RTCSenderAudioTrackAttachmentStats and for video tracks, the type is RTCSenderVideoTrackAttachmentStats.
RTCTrackEvent - Web APIs
You can add a track event listener to be notified when the new track is available so that you can, for example, attach its media to a <video> element, using either RTCPeerConnection.addEventListener() or the ontrack event handler property.
... Example: this simple example creates an event listener for the track event which sets the srcObject of the <video> element with the id videobox to the first stream in the list passed in the event's streams array.
... peerConnection.addEventListener("track", e => { let videoElement = document.getElementById("videobox"); videoElement.srcObject = e.streams[0]; }, false); Specifications: WebRTC 1.0: Real-Time Communication Between Browsers, the definition of 'RTCTrackEvent' in that specification.
Screen Capture API - Web APIs
To start capturing video from the screen, you call getDisplayMedia() on navigator.mediaDevices: captureStream = await navigator.mediaDevices.getDisplayMedia(displayMediaOptions); The promise returned by getDisplayMedia() resolves to a MediaStream which streams the captured media.
... MediaTrackConstraints.logicalSurface indicates whether or not the video in the stream represents a logical display surface (that is, one which may not be entirely visible onscreen, or may be completely offscreen).
... MediaTrackSettings.logicalSurface: a boolean value which is true if the video being captured doesn't directly correspond to a single onscreen display area.
SourceBuffer.abort() - Web APIs
A buffer is being appended but the operation has not yet completed) and the user "scrubs" the video, seeking to a new point in time.
... In this case you would want to manually call abort() on the source buffer to stop the decoding of the current buffer, then fetch and append the newly requested segment that relates to the new position of the video.
... You can see something similar in action in Nick Desaulnier's bufferWhenNeeded demo; in line 48, an event listener is added to the playing video so a function called seek() is run when the seeking event fires.
TextTrackCue - Web APIs
TextTrackCue.startTime: a double that represents the video time that the cue will start being displayed, in seconds.
... TextTrackCue.endTime: a double that represents the video time that the cue will stop being displayed, in seconds.
... TextTrackCue.pauseOnExit: a boolean for whether the video will pause when this cue stops being displayed.
WebGLRenderingContext.texSubImage2D() - Web APIs
pixels); void gl.texSubImage2D(target, level, xoffset, yoffset, format, type, HTMLVideoElement?
... 2: void gl.texSubImage2D(target, level, xoffset, yoffset, format, type, GLintptr offset); void gl.texSubImage2D(target, level, xoffset, yoffset, width, height, format, type, HTMLCanvasElement source); void gl.texSubImage2D(target, level, xoffset, yoffset, width, height, format, type, HTMLImageElement source); void gl.texSubImage2D(target, level, xoffset, yoffset, width, height, format, type, HTMLVideoElement source); void gl.texSubImage2D(target, level, xoffset, yoffset, width, height, format, type, ImageBitmap source); void gl.texSubImage2D(target, level, xoffset, yoffset, width, height, format, type, ImageData source); void gl.texSubImage2D(target, level, xoffset, yoffset, width, height, format, type, ArrayBufferView srcData, srcOffset); Parameters: target, a GLenum specifying the bindin...
... ImageData, HTMLImageElement, HTMLCanvasElement, HTMLVideoElement, ImageBitmap.
Lifetime of a WebRTC session - Web APIs
WebRTC lets you build peer-to-peer communication of arbitrary data, audio, or video, or any combination thereof, into a browser application.
...See Signaling and video calling for an actual example with a step-by-step explanation of what the code does.
...This code should connect the tracks to its consumer, such as a <video> element.
Web accessibility for seizures and physical reactions - Accessibility
Web technologies that use video, animated GIFs, animated PNGs, animated SVGs, canvas, and CSS or JavaScript animations are all capable of content that can induce seizures or other incapacitating physical reactions.
... Certain video games or TV broadcasts containing rapid flashes or alternating patterns of different colors.
... You don't even need an image or video to cause harm.
Web Accessibility: Understanding Colors and Luminance - Accessibility
This is evolving, and new methods for measuring color involve measurements using other color spaces, but color measurements in the RGB color space still predominate, and this includes video production.
... (See the video, The Photosensitive Epilepsy Analysis Tool.) Measuring color & luminance, methods that measure color: exploring the RGB color space further, as it is the color space used by the data type <color>, note that there are actually multiple "versions" of the RGB color space, such as sRGB, scRGB, and RGBA.
... From IEC 61966-2-2:2003(en), "video systems approximate the lightness response of vision by computing a luma component y′ as a weighted sum of nonlinear r′g′b′ primary components: each RGB signal is comparable to the 1/3 power function with an offset defined by L*.
Media buffering, seeking, and time ranges - Developer guides
Sometimes it's useful to know how much <audio> or <video> has downloaded or is playable without delay; a good example of this is the buffered progress bar of an audio or video player.
... This will work with <audio> or <video>; for now let's consider a simple audio example: <audio id="my-audio" controls src="music.mp3"> </audio> We can access these attributes like so: var myAudio = document.getElementById('my-audio'); var bufferedTimeRanges = myAudio.buffered; TimeRanges object: TimeRanges are a series of non-overlapping ranges of time, with start and stop times.
... for (var i = 0; i < myAudio.buffered.length; i++) { var startX = myAudio.buffered.start(i) * inc; var endX = myAudio.buffered.end(i) * inc; var width = endX - startX; context.fillRect(startX, 0, width, myCanvas.height); context.rect(startX, 0, width, myCanvas.height); context.stroke(); } }); } This works better with longer pieces of audio or video, but press play and click around the player progress bar and you should get something like this.
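A common use of these ranges is computing how much of the media is buffered overall. A minimal sketch, assuming the ranges have already been read out of the buffered TimeRanges object into [start, end] pairs:

```javascript
// Fraction of the total duration covered by buffered ranges (0..1).
// `ranges` is an array of [start, end] pairs in seconds, assumed
// non-overlapping, as TimeRanges guarantees.
function bufferedFraction(ranges, duration) {
  if (!duration) return 0; // duration unknown or zero: nothing meaningful to report
  const buffered = ranges.reduce((sum, [start, end]) => sum + (end - start), 0);
  return buffered / duration;
}

// Browser usage (sketch):
// const b = myAudio.buffered;
// const pairs = [];
// for (let i = 0; i < b.length; i++) pairs.push([b.start(i), b.end(i)]);
// const fraction = bufferedFraction(pairs, myAudio.duration);
```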
Developer guides
Audio and video delivery: we can deliver audio and video on the web in several ways, ranging from 'static' media files to adaptive live streams.
... Audio and video manipulation: the beauty of the web is that you can combine technologies to create new forms.
... Having native audio and video in the browser means we can use these data streams with technologies such as <canvas>, WebGL or Web Audio API to modify audio and video directly, for example adding reverb/compression effects to audio, or grayscale/sepia filters to video.
HTML attribute: accept - HTML: Hypertext Markup Language
tified, so a site that accepts Word files might use an <input> like this: <input type="file" id="docpicker" accept=".doc,.docx,application/msword,application/vnd.openxmlformats-officedocument.wordprocessingml.document"> Whereas if you're accepting a media file, you may want to include any format of that media type: <input type="file" id="soundfile" accept="audio/*"> <input type="file" id="videofile" accept="video/*"> <input type="file" id="imagefile" accept="image/*"> The accept attribute doesn't validate the types of the selected files; it simply provides hints for browsers to guide users towards selecting the correct file types.
... <p> <label for="soundfile">Select an audio file:</label> <input type="file" id="soundfile" accept="audio/*"> </p> <p> <label for="videofile">Select a video file:</label> <input type="file" id="videofile" accept="video/*"> </p> <p> <label for="imagefile">Select some images:</label> <input type="file" id="imagefile" accept="image/*" multiple> </p> Note the last example allows you to select multiple images.
... The string video/* meaning "any video file".
<input type="file"> - HTML: Hypertext Markup Language
Additional attributes: in addition to the common attributes shared by all <input> elements, inputs of type file also support the following attributes: accept, one or more unique file type specifiers describing file types to allow; capture, what source to use for capturing image or video data; files, a FileList listing the chosen files; multiple, a boolean which, if present, indicates that the user may choose more than one file. accept: the accept attribute value is a string that defines the file types the file input should accept.
...nstance, there are a number of ways Microsoft Word files can be identified, so a site that accepts Word files might use an <input> like this: <input type="file" id="docpicker" accept=".doc,.docx,application/msword,application/vnd.openxmlformats-officedocument.wordprocessingml.document"> capture: the capture attribute value is a string that specifies which camera to use for capture of image or video data, if the accept attribute indicates that the input should be of one of those types.
... The string video/* meaning "any video file".
itemprop - HTML: Hypertext Markup Language
Property values are either a string or a URL and can be associated with a very wide range of elements including <audio>, <embed>, <iframe>, <img>, <link>, <object>, <source>, <track>, and <video>.
... If the element is a meta element, the value is the value of the element's content attribute; if the element is an audio, embed, iframe, img, source, track, or video element, the value is the resulting URL string that results from parsing the value of the element's src attribute relative to the node document (part of the microdata DOM API) of the element at the time the attribute is set; if the element is an a, area, or link element, the value is the resulting URL string that results from parsing the value of the element's href attribute relati...
...The URL property elements are the a, area, audio, embed, iframe, img, link, object, source, track, and video elements.
Compression in HTTP - HTTP
If text can typically have as much as 60% redundancy, this rate can be much higher for some other media like audio and video.
... Video formats on the web are lossy; the JPEG image format is also lossy.
... As compression brings significant performance improvements, it is recommended to activate it for all files except already compressed ones, like images, audio files and videos.
List of default Accept values - HTTP
Source: Safari, */*; Chrome, image/webp,image/apng,image/*,*/*;q=0.8 (source); Internet Explorer 8 or earlier, */* (see IE and the Accept header, IEInternals' MSDN blog); Internet Explorer 9, image/png,image/svg+xml,image/*;q=0.8,*/*;q=0.5 (see Fiddler is better with Internet Explorer 9, IEInternals' MSDN blog). Values for a video: when a video is requested, via the <video> HTML element, most browsers use specific values.
... User agent, value, comment: Firefox earlier than 3.6, no support for <video>; Firefox 3.6 and later, video/webm,video/ogg,video/*;q=0.9,application/ogg;q=0.7,audio/*;q=0.6,*/*;q=0.5 (see bug 489071, source); Chrome, */* (source); Internet Explorer 8 or earlier, no support for <video>. Values for audio resources: when an audio file is requested, like via the <audio> HTML element, most browsers use specific values.
... User agent, value, comment: Firefox 3.6 and later, audio/webm,audio/ogg,audio/wav,audio/*;q=0.9,application/ogg;q=0.7,video/*;q=0.6,*/*;q=0.5 (see bug 489071, source); Safari, Chrome, */* (source); Internet Explorer 8 or earlier, no support for <audio>; Internet Explorer 9, ?
Feature-Policy - HTTP
The autoplay attribute on <audio> and <video> elements will be ignored.
... camera: controls whether the current document is allowed to use video input devices.
... picture-in-picture: controls whether the current document is allowed to play a video in a picture-in-picture mode via the corresponding API.
An overview of HTTP - HTTP
A complete document is reconstructed from the different sub-documents fetched, for instance text, layout description, images, videos, scripts, and more.
...Due to its extensibility, it is used not only to fetch hypertext documents, but also images and videos, or to post content to servers, like with HTML form results.
...It then parses this file, making additional requests corresponding to execution scripts, layout information (CSS) to display, and sub-resources contained within the page (usually images and videos).
Image file type and format guide - Web media technologies
Its animation support caused a resurgence in its popularity in the social media era, when animated GIFs began to be widely used for short "videos", memes, and other simple animation sequences.
... WebP image: WebP supports lossy compression via predictive coding based on the VP8 video codec, and lossless compression that uses substitutions for repeating data.
...[table of supported modes: indexed color, greyscale with alpha, true color with alpha; compression: lossless; licensing: open source] Choosing an image format: picking the best image format for your needs is likely easier than choosing a video format, as there are fewer options with broad support, and each tends to have a specific set of use-cases.
Animation performance and frame rate - Web Performance
Animation on the web can be done via SVG, JavaScript (including <canvas> and WebGL), CSS animation, <video>, animated GIFs and even animated PNGs and other image types.
... For animated media, such as video and animated GIFs, the main performance concern is file size; downloading the file fast enough to not negatively impact performance is the greatest issue.
... There's also a video version of this walkthrough: Animating using margin. Leaving the "use margin" option set, start the animation, open the Performance tool, and make a recording.
Using shadow DOM - Web Components
Think for example of a <video> element, with the default browser controls exposed.
... All you see in the DOM is the <video> element, but it contains a series of buttons and other controls inside its shadow DOM.
...This is the case with built-in elements that contain shadow DOMs, such as <video>.
2015 MDN Fellowship Program - Archive of obsolete content
In 2015, MDN will expand the impact of this content by developing kits of key learning materials, including such elements as code samples, videos and other elements being finalized.
... Activities and deliverables: act as lead curator for technical curriculum addressing a key web technology, developing code samples, videos, interactive exercises and other components to be determined.
Archived Mozilla and build documentation - Archive of obsolete content
Introducing the Audio API extension: the Audio Data API extension extends the HTML5 specification of the <audio> and <video> media elements by exposing audio metadata and raw audio data.
... Video presentations: Mozilla is actively working to produce video presentations that can help you learn how the Mozilla codebase works and how to take advantage of its technology in your own applications and extensions.
Index - Game development
2 Anatomy of a video game (Games, JavaScript, main loop, requestAnimationFrame): I want to be clear that any of the above, or none of them, could be best for your game.
... 38 WebRTC data channels (API, Games, NeedsContent, Network, P2P, WebRTC, data channels): the WebRTC (Web Real-Time Communications) API is primarily known for its support for audio and video communications; however, it also offers peer-to-peer data channels.
Media (Audio-visual presentation) - MDN Web Docs Glossary: Definitions of Web-related terms
The term media (more accurately, multimedia) refers to audio, video, or combined audio-visual material such as music, recorded speech, movies, TV shows, or any other form of content that is presented over a period of time.
... Learn more: general knowledge: Multimedia on Wikipedia; technical reference: Web media technologies, a guide to all the ways media can be used in web content; Multimedia and embedding in the MDN learning area; <audio> and <video> elements, used to present media in HTML documents ...
SDP - MDN Web Docs Glossary: Definitions of Web-related terms
SDP contains the codec, source address, and timing information of audio and video.
... Here is a typical SDP message: v=0 o=alice 2890844526 2890844526 IN IP4 host.anywhere.com s= c=IN IP4 host.anywhere.com t=0 0 m=audio 49170 RTP/AVP 0 a=rtpmap:0 PCMU/8000 m=video 51372 RTP/AVP 31 a=rtpmap:31 H261/90000 m=video 53000 RTP/AVP 32 a=rtpmap:32 MPV/90000 SDP is never used alone, but by protocols like RTP and RTSP.
WebVTT - MDN Web Docs Glossary: Definitions of Web-related terms
WebVTT (Web Video Text Tracks) is a W3C specification for a file format marking up text track resources in combination with the HTML <track> element.
... WebVTT files provide metadata that is time-aligned with audio or video content, like captions or subtitles for video content, text video descriptions, chapters for content navigation, and more.
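WebVTT cue timings use the form hh:mm:ss.mmm. A small helper for producing that form from a time in seconds; this is an illustrative sketch, not part of any WebVTT API.

```javascript
// Format a time in seconds as a WebVTT timestamp (hh:mm:ss.mmm).
function toVttTimestamp(seconds) {
  const ms = Math.round(seconds * 1000);           // work in whole milliseconds
  const h = Math.floor(ms / 3600000);
  const m = Math.floor((ms % 3600000) / 60000);
  const s = Math.floor((ms % 60000) / 1000);
  const frac = ms % 1000;
  const pad = (n, w) => String(n).padStart(w, "0");
  return `${pad(h, 2)}:${pad(m, 2)}:${pad(s, 2)}.${pad(frac, 3)}`;
}
```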
Accessibility - Learn web development
The following video provides a nice introduction to it.
... Accessible multimedia: another category of content that can create accessibility problems is multimedia; video, audio, and image content need to be given proper textual alternatives, so they can be understood by assistive technologies and their users.
Pseudo-classes and pseudo-elements - Learn web development
:playing matches an element representing an audio, video, or similar resource that is capable of being “played” or “paused”, when that element is “playing”.
... :paused matches an element representing an audio, video, or similar resource that is capable of being “played” or “paused”, when that element is “paused”.
Document and website structure - Learn web development
Basic sections of a document: webpages can and will look pretty different from one another, but they all tend to share similar standard components, unless the page is displaying a fullscreen video or game, is part of some kind of art project, or is just badly structured. Header: usually a big strip across the top with a big heading, logo, and perhaps a tagline.
... Main content: a big area in the center that contains most of the unique content of a given webpage, for example the video you want to watch, or the main story you're reading, or the map you want to view, or the news headlines, etc.
Structuring the web with HTML - Learn web development
Multimedia and embedding: this module explores how to use HTML to include multimedia in your web pages, including the different ways that images can be included, and how to embed video, audio, and even entire other webpages.
... Solving common HTML problems: Use HTML to solve common problems provides links to sections of content explaining how to use HTML to solve very common problems when creating a webpage: dealing with titles, adding images or videos, emphasizing content, creating a basic form, etc.
Perceived performance - Learn web development
Images and video should be served in the most optimal format, compressed, and in the correct size.
... Measuring performance; multimedia: images; multimedia: video; JavaScript performance best practices.
The business case for web performance - Learn web development
Defining and promoting a budget helps performance advocates make the case for good user experience against competing interests, such as marketing, sales, or even other developers who may want to add videos, third-party scripts, or even frameworks.
... Measuring performance; multimedia: images; multimedia: video; JavaScript performance best practices.
The "why" of web performance - Learn web development
Building websites requires HTML, CSS, and JavaScript, typically including binary file types such as images and video.
... Measuring performance; multimedia: images; multimedia: video; JavaScript performance best practices.
Web performance - Learn web development
Multimedia: video: the lowest-hanging fruit of web performance is often media optimization.
... In this article we discuss the impact video content has on performance, and cover tips like how removing audio tracks from background videos can improve performance.
Mozilla’s UAAG evaluation report
(P1) C: Preferences, Appearance, Colors, "use my chosen colors, ignoring the colors and background image specified". 3.2 Toggle audio, video, animated images.
... (P1) P: animated images can be made still with the Escape key; animated images can be made still as a preference under Preferences, Privacy & Security, Images, "animated images should loop". Mozilla has no preference or command to toggle audio or video. 3.3 Toggle animated/blinking text.
HTML parser threading
Speculative loads: when the tree builder on the parser thread encounters HTML script, stylesheet link, video (with a poster frame), base or img elements, or SVG script, style or image elements, preload operations are appended to a speculative load queue.
...When the executor acts on speculative loads, it starts speculative HTTP fetches for images (including video poster frames), style sheets and scripts.
Creating localizable web content
happens for things like FAQ or video pages, for example.
... For video, use the code to display the video in an overlaid video player on the page instead of linking to a video section in English.
Index
498 nsIDOMHTMLSourceElement (DOM, HTML, HTML5, Interfaces, Interfaces:Scriptable, Media, XPCOM, XPCOM Interface Reference): the nsIDOMHTMLSourceElement interface is the DOM interface to the source child of the audio and video media elements in HTML.
... 512 nsIDOMProgressEvent (Interfaces, Interfaces:Scriptable, Reference, XMLHttpRequest, XPCOM Interface Reference, nsIDOMProgressEvent, Progress): the nsIDOMProgressEvent is used in the media elements (<video> and <audio>) to inform interested code of the progress of the media download.
AbortController.AbortController() - Web APIs
Examples: in the following snippet, we aim to download a video using the Fetch API.
... var controller = new AbortController(); var signal = controller.signal; var downloadBtn = document.querySelector('.download'); var abortBtn = document.querySelector('.abort'); downloadBtn.addEventListener('click', fetchVideo); abortBtn.addEventListener('click', function() { controller.abort(); console.log('download aborted'); }); function fetchVideo() { ...
AbortController.abort() - Web APIs
Examples: in the following snippet, we aim to download a video using the Fetch API.
... var controller = new AbortController(); var signal = controller.signal; var downloadBtn = document.querySelector('.download'); var abortBtn = document.querySelector('.abort'); downloadBtn.addEventListener('click', fetchVideo); abortBtn.addEventListener('click', function() { controller.abort(); console.log('download aborted'); }); function fetchVideo() { ...
AbortController.signal - Web APIs
Examples: in the following snippet, we aim to download a video using the Fetch API.
... var controller = new AbortController(); var signal = controller.signal; var downloadBtn = document.querySelector('.download'); var abortBtn = document.querySelector('.abort'); downloadBtn.addEventListener('click', fetchVideo); abortBtn.addEventListener('click', function() { controller.abort(); console.log('download aborted'); }); function fetchVideo() { ...
AbortController - Web APIs
Examples: in the following snippet, we aim to download a video using the Fetch API.
... var controller = new AbortController(); var signal = controller.signal; var downloadBtn = document.querySelector('.download'); var abortBtn = document.querySelector('.abort'); downloadBtn.addEventListener('click', fetchVideo); abortBtn.addEventListener('click', function() { controller.abort(); console.log('download aborted'); }); function fetchVideo() { ...
AbortSignal - Web APIs
AudioTrack - Web APIs
The AudioTrack interface represents a single audio track from one of the HTML media elements, <audio> or <video>.
... Usage notes: to get an AudioTrack for a given media element, use the element's audioTracks property, which returns an AudioTrackList object from which you can get the individual tracks contained in the media:

var el = document.querySelector("video");
var tracks = el.audioTracks;

You can then access the media's individual tracks using either array syntax or functions such as forEach().
AudioTrackList: change event - Web APIs
Bubbles: no. Cancelable: no. Interface: Event. Event handler property: onchange.

Examples using addEventListener():

const videoElement = document.querySelector('video');

videoElement.audioTracks.addEventListener('change', (event) => {
  console.log(`'${event.type}' event fired`);
});

// changing the value of `enabled` will trigger the `change` event
const toggleTrackButton = document.querySelector('.toggle-track');
toggleTrackButton.addEventListener('click', () => {
  const track = videoElement.audioTracks[0];
  track.enabled = !track.enabled;
});

Using the onchange event handler property:

const videoElement = document.querySelector('video');

videoElement.audioTracks.onchange = (event) => {
  console.log(`'${event.type}' event fired`);
};

// changing the value of `enabled` will trigger the `change` event
const toggleTrackButton = document.querySelector('.toggle-track');
toggleTrackButton.addEventListener('click', () => {
  const track = videoElement.audioTracks[0];
  track.enabled = !track.enabled;
});

Specifications: HTML Living Standard, the definition of 'change' in that specification.
AudioTrackList.length - Web APIs
Example: this snippet gets the number of audio tracks in the first <video> element found in the DOM by querySelector().

var videoElem = document.querySelector("video");

var numAudioTracks = 0;
if (videoElem.audioTracks) {
  numAudioTracks = videoElem.audioTracks.length;
}

Note that this sample checks to be sure HTMLMediaElement.audioTracks is defined, to avoid failing on browsers without support for AudioTrack.
CanvasRenderingContext2D.drawImage() - Web APIs
The specification permits any canvas image source (CanvasImageSource), specifically: a CSSImageValue, an HTMLImageElement, an SVGImageElement, an HTMLVideoElement, an HTMLCanvasElement, an ImageBitmap, or an OffscreenCanvas.
... The same goes for videoWidth and videoHeight if the element is a <video> element, and so on.
Canvas API - Web APIs
Among other things, it can be used for animation, game graphics, data visualization, photo manipulation, and real-time video processing.
... Manipulating video using canvas: combining <video> and <canvas> to manipulate video data in real time.
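A per-frame pipeline of that kind can be sketched as follows. This is a minimal, hypothetical example (the invertPixels filter is an assumption, not part of the Canvas API): draw the current video frame to the canvas with drawImage(), read its pixels back with getImageData(), transform them, and write them out with putImageData().

```javascript
// invertPixels is pure; it operates on an RGBA byte array such as
// ImageData.data, inverting the color channels and leaving alpha alone.
function invertPixels(data) {
  for (let i = 0; i < data.length; i += 4) {
    data[i] = 255 - data[i];         // red
    data[i + 1] = 255 - data[i + 1]; // green
    data[i + 2] = 255 - data[i + 2]; // blue
  }
  return data;
}

// Browser-only glue: repeatedly copy the video frame into the canvas,
// filter it, and schedule the next frame.
function drawFrame(video, ctx) {
  ctx.drawImage(video, 0, 0, ctx.canvas.width, ctx.canvas.height);
  const frame = ctx.getImageData(0, 0, ctx.canvas.width, ctx.canvas.height);
  invertPixels(frame.data);
  ctx.putImageData(frame, 0, 0);
  requestAnimationFrame(() => drawFrame(video, ctx));
}
```

In a page you would start the loop from the video's play event, e.g. `video.addEventListener('play', () => drawFrame(video, canvas.getContext('2d')))`.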
Document.fullscreenEnabled - Web APIs
Example: in this example, before attempting to request full-screen mode for a <video> element, the value of fullscreenEnabled is checked, in order to avoid making the attempt when not available.

function requestFullscreen() {
  if (document.fullscreenEnabled) {
    videoElement.requestFullscreen();
  } else {
    console.log('Your browser cannot use fullscreen right now');
  }
}

Specifications: Fullscreen API, the definition of 'Document.fullscreenEnabled' in that specification.
DocumentOrShadowRoot.fullscreenElement - Web APIs
Example: this example presents a function, isVideoInFullscreen(), which looks at the value returned by fullscreenElement; if the document is in full-screen mode (fullscreenElement isn't null) and the full-screen element's nodeName is VIDEO, indicating a <video> element, the function returns true, indicating that the video is in full-screen mode.

function isVideoInFullscreen() {
  if (document.fullscreenElement && document.fullscreenElement.nodeName == 'VIDEO') {
    return true;
  }
  return false;
}

Specifications: Fullscreen API, the definition of 'Document.fullscreenElement' in that specification.
Element.requestFullscreen() - Web APIs
Examples: this function toggles the first <video> element found in the document into and out of full-screen mode.

function toggleFullscreen() {
  let elem = document.querySelector("video");

  if (!document.fullscreenElement) {
    elem.requestFullscreen().catch(err => {
      alert(`Error attempting to enable full-screen mode: ${err.message} (${err.name})`);
    });
  } else {
    document.exitFullscreen();
  }
}

If the document isn't already in full-screen mode (detected by looking to see if document.fullscreenElement has a value), we call the video's requestFullscreen() method.
GlobalEventHandlers.onplay - Web APIs
Example:

<p>This example demonstrates how to assign an "onplay" event to a video element.</p>

<video controls onplay="alertPlay()">
  <source src="mov_bbb.mp4" type="video/mp4">
  <source src="mov_bbb.ogg" type="video/ogg">
  Your browser does not support HTML5 video.
</video>

<p>Video courtesy of <a href="http://www.bigbuckbunny.org/" target="_blank">Big Buck Bunny</a>.</p>

<script>
function alertPlay() {
  alert("The video has started to play.");
}
</script>

Specification: HTML Living Standard, the definition of 'onplay' in that specification.
HTMLMediaElement.canPlayType() - Web APIs
Syntax: canPlayResponse = audioOrVideo.canPlayType(mediaType);
Parameters: mediaType, a DOMString containing the MIME type of the media.
... Example:

var obj = document.createElement('video');
console.log(obj.canPlayType('video/mp4')); // "maybe"

Specifications: HTML Living Standard, the definition of 'canPlayType' in that specification.
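Because canPlayType() returns "" (empty string), "maybe", or "probably", it lends itself to choosing the first usable source from a list of candidates. A minimal sketch, with the probe parameter standing in for the element's own canPlayType method so the selection logic can be exercised without a DOM (the function name and parameter are assumptions for illustration):

```javascript
// Return the first MIME type the probe reports as playable ("maybe" or
// "probably"), or null if every candidate yields the empty string.
function pickPlayableType(candidates, probe) {
  for (const type of candidates) {
    if (probe(type) !== '') {
      return type;
    }
  }
  return null; // nothing playable
}

// In a browser you would pass the element's own method, e.g.:
// pickPlayableType(['video/webm', 'video/mp4'], t => videoElem.canPlayType(t));
```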
HTMLMediaElement: canplaythrough event - Web APIs
Using addEventListener():

const video = document.querySelector('video');

video.addEventListener('canplaythrough', (event) => {
  console.log('I think I can play through the entire video without ever having to stop to buffer.');
});

Using the oncanplaythrough event handler property:

const video = document.querySelector('video');

video.oncanplaythrough = (event) => {
  console.log('I think I can play through the entire video without ever having to stop to buffer.');
};

Specifications: HTML Living Standard, the definition of 'canplaythrough media event' in that specification.
HTMLMediaElement.captureStream() - Web APIs
Return value: a MediaStream object which can be used as a source for audio and/or video data by other media processing code, or as a source for WebRTC.
... The stream can then be used for other purposes, like a source for streaming over WebRTC, to allow sharing prerecorded videos with another person during a video call.
HTMLMediaElement.controls - Web APIs
Syntax: var ctrls = video.controls; audio.controls = true;
Value: a Boolean.
... Example:

var obj = document.createElement('video');
obj.controls = true;

Specifications: HTML Living Standard, the definition of 'HTMLMediaElement.controls' in that specification.
HTMLMediaElement.defaultMuted - Web APIs
Syntax: var dMuted = video.defaultMuted; audio.defaultMuted = true;
Value: a Boolean.
... Example:

var videoEle = document.createElement('video');
videoEle.defaultMuted = true;
console.log(videoEle.outerHTML); // <video muted=""></video>

Specifications: HTML Living Standard, the definition of 'HTMLMediaElement.defaultMuted' in that specification.
HTMLMediaElement.defaultPlaybackRate - Web APIs
Syntax: var dSpeed = video.defaultPlaybackRate; audio.defaultPlaybackRate = 1.0;
Value: a double.
... Example:

var obj = document.createElement('video');
console.log(obj.defaultPlaybackRate); // 1

Specifications: HTML Living Standard, the definition of 'HTMLMediaElement.defaultPlaybackRate' in that specification.
HTMLMediaElement: emptied event - Web APIs
Using addEventListener():

const video = document.querySelector('video');

video.addEventListener('emptied', (event) => {
  console.log('Uh oh. ... did you call load()?');
});

Using the onemptied event handler property:

const video = document.querySelector('video');

video.onemptied = (event) => {
  console.log('Uh oh. ...
HTMLMediaElement: ended event - Web APIs
HTMLMediaElement (<audio> and <video>) fires the ended event when playback of the media reaches the end of the media.
... Using addEventListener():

const video = document.querySelector('video');

video.addEventListener('ended', (event) => {
  console.log('Video stopped either because 1) it was over, or 2) no further data is available.');
});

Using the onended event handler property:

const video = document.querySelector('video');

video.onended = (event) => {
  console.log('Video stopped either because 1) it was over, or 2) no fur...
HTMLMediaElement.error - Web APIs
Example: this example establishes a video element and adds an error handler to it; the error handler simply logs the details to console.

var videoElement = document.createElement('video');

videoElement.onerror = function() {
  console.log("Error " + videoElement.error.code + "; details: " + videoElement.error.message);
}

videoElement.src = "https://example.com/bogusvideo.mp4";

Specifications: HTML Living Standard, the definition of 'HTMLMediaElement.error' in that specification.
HTMLMediaElement.fastSeek() - Web APIs
Example: this example quickly seeks to the 20-second position of the video element.

let myVideo = document.getElementById("myVideoElement");

myVideo.fastSeek(20);

Specifications: HTML Living Standard, the definition of 'fastSeek()' in that specification.
HTMLMediaElement: loadeddata event - Web APIs
Using addEventListener():

const video = document.querySelector('video');

video.addEventListener('loadeddata', (event) => {
  console.log('Yay! The readyState just increased to HAVE_CURRENT_DATA or greater for the first time.');
});

Using the onloadeddata event handler property:

const video = document.querySelector('video');

video.onloadeddata = (event) => {
  console.log('Yay! ...
HTMLMediaElement: loadedmetadata event - Web APIs
Using addEventListener():

const video = document.querySelector('video');

video.addEventListener('loadedmetadata', (event) => {
  console.log('The duration and dimensions of the media and tracks are now known.');
});

Using the onloadedmetadata event handler property:

const video = document.querySelector('video');

video.onloadedmetadata = (event) => {
  console.log('The duration and dimensions of the media and tracks are now known.');
};
HTMLMediaElement.loop - Web APIs
Syntax: var loop = video.loop; audio.loop = true;
Value: a Boolean.
... Example:

var obj = document.createElement('video');
obj.loop = true; // true

Specifications: HTML Living Standard, the definition of 'HTMLMediaElement.loop' in that specification.
HTMLMediaElement.muted - Web APIs
Syntax: var isMuted = audioOrVideo.muted; audio.muted = true;
Value: a Boolean.
... Example:

var obj = document.createElement('video');
console.log(obj.muted); // false

Specifications: HTML Living Standard, the definition of 'HTMLMediaElement.muted' in that specification.
HTMLMediaElement: pause event - Web APIs
Using addEventListener():

const video = document.querySelector('video');

video.addEventListener('pause', (event) => {
  console.log('The Boolean paused property is now true. Either the pause() method was called or the autoplay attribute was toggled.');
});

Using the onpause event handler property:

const video = document.querySelector('video');

video.onpause = (event) => {
  console.log('The Boolean paused property is now true. ...
HTMLMediaElement.paused - Web APIs
Syntax: var isPaused = audioOrVideo.paused
Value: a Boolean.
... Example:

var obj = document.createElement('video');
console.log(obj.paused); // true

Specifications: HTML Living Standard, the definition of 'HTMLMediaElement.paused' in that specification.
HTMLMediaElement: play event - Web APIs
Using addEventListener():

const video = document.querySelector('video');

video.addEventListener('play', (event) => {
  console.log('The Boolean paused property is now false. Either the play() method was called or the autoplay attribute was toggled.');
});

Using the onplay event handler property:

const video = document.querySelector('video');

video.onplay = (event) => {
  console.log('The Boolean paused property is now false. ...
HTMLMediaElement.playbackRate - Web APIs
Syntax:

// video
video.playbackRate = 1.5;
// audio
audio.playbackRate = 1.0;

Value: a double. (Default: 1.0)

Example:

var obj = document.createElement('video');
console.log(obj.playbackRate); // expected output: 1

Specifications: HTML Living Standard, the definition of 'HTMLMediaElement.playbackRate' in that specification.
HTMLMediaElement.readyState - Web APIs
Syntax: var readyState = audioOrVideo.readyState;
Value: an unsigned short.
... HAVE_FUTURE_DATA (3): data for the current playback position as well as for at least a little bit of time into the future is available (in other words, at least two frames of video, for example).
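The five numeric readyState values (0 through 4, per the HTML spec) can be mapped to their constant names for friendlier logging. A small sketch; the helper name is an assumption for illustration:

```javascript
// HTMLMediaElement.readyState constants, indexed by their numeric value.
const READY_STATE_NAMES = [
  'HAVE_NOTHING',      // 0: no information about the media resource
  'HAVE_METADATA',     // 1: duration and dimensions are known
  'HAVE_CURRENT_DATA', // 2: data for the current position, but nothing beyond it
  'HAVE_FUTURE_DATA',  // 3: at least a little data past the current position
  'HAVE_ENOUGH_DATA',  // 4: enough data to likely play through without stalling
];

function readyStateName(state) {
  return READY_STATE_NAMES[state] ?? 'UNKNOWN';
}
```

Usage in a browser might look like `console.log(readyStateName(video.readyState))`.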
HTMLMediaElement.seekable - Web APIs
Syntax: var seekable = audioOrVideo.seekable;
Value: a TimeRanges object.
... Examples:

var video = document.querySelector("video");
var timeRangesObject = video.seekable;
var timeRanges = [];

// go through the object and output an array
for (let count = 0; count < timeRangesObject.length; count++) {
  timeRanges.push([timeRangesObject.start(count), timeRangesObject.end(count)]);
}

Specifications: HTML Living Standard, the definition of 'HTMLMediaElement' in that specification.
msStereo3DRenderMode - Web APIs
Syntax: HTMLVideoElement.msStereo3DRenderMode(mono, stereo);
Value: the following values set the stereo display to mono or stereo.
... See also: HTMLVideoElement, Microsoft API extensions ...
LocalMediaStream - Web APIs
See "Stopping a video stream" in MediaStreamTrack to learn how.
... See "Stopping a video stream" in MediaStreamTrack to learn how to stop an entire stream.
MediaCapabilities.decodingInfo() - Web APIs
Syntax: mediaCapabilities.decodingInfo(mediaDecodingConfiguration)
Parameters: mediaDecodingConfiguration, a valid MediaDecodingConfiguration dictionary containing a valid media decoding type of file or media-source and a valid media configuration: either an AudioConfiguration or a VideoConfiguration.
... Return value: a Promise fulfilling with a MediaCapabilitiesInfo interface containing three Boolean attributes: supported, smooth, powerEfficient. Exceptions: a TypeError is raised if the MediaConfiguration passed to the decodingInfo() method is invalid, either because the type is not video or audio, the contentType is not a valid codec MIME type, the media decoding configuration is not a valid value for the media decoding type, or any other error in the media configuration passed to the method, including omitting values required in the media decoding configuration.
MediaDecodingConfiguration - Web APIs
A media configuration: a VideoConfiguration or AudioConfiguration dictionary.
... Examples:

// create media configuration to be tested
const mediaConfig = {
  type: 'file', // or 'media-source'
  video: {
    contentType: "video/webm;codecs=vp8", // valid content type
    width: 800,     // width of the video
    height: 600,    // height of the video
    bitrate: 10000, // number of bits used to encode 1s of video
    framerate: 30   // number of frames making up that 1s
  }
};
MediaDeviceInfo - Web APIs
MediaDeviceInfo.kind (read only): returns an enumerated value that is either "videoinput", "audioinput" or "audiooutput".
...

navigator.mediaDevices.enumerateDevices()
  .then(function(devices) {
    devices.forEach(function(device) {
      console.log(device.kind + ": " + device.label + " id = " + device.deviceId);
    });
  })
  .catch(function(err) {
    console.log(err.name + ": " + err.message);
  });

This might produce:

videoinput: id = cso9c0ypaf274oucpua53cne0yhlir2yxci+sqfbzz8=
audioinput: id = rkxxbyjnabbadgqnnzqlvldmxls0yketycibg+xxnvm=
audioinput: id = r2/xw1xupiyzunfv1lgrkoma5wtovckwfz368xcndm0=

or, if one or more media streams are active or persistent permissions have been granted:

videoinput: FaceTime HD Camera (Built-in) id=cso9c0ypaf274oucpua53cne0yhlir2yxci+sqfbzz8=
audioinput: default (Built-in Micro...
MediaEncodingConfiguration - Web APIs
A media configuration: a VideoConfiguration or AudioConfiguration dictionary.
... Examples:

// create media configuration to be tested
const mediaConfig = {
  type: 'record', // or 'transmission'
  video: {
    contentType: "video/webm;codecs=vp8", // valid content type
    width: 800,     // width of the video
    height: 600,    // height of the video
    bitrate: 10000, // number of bits used to encode 1s of video
    framerate: 30   // number of frames making up that 1s
  }
};
MediaError.code - Web APIs
Example: this example creates a <video> element, establishes an error handler for it, and then sets the element's src attribute to the video resource to present in the element.
... The error handler simply outputs a message:

var obj = document.createElement('video');
obj.onerror = function() { console.log("Error with media: " + obj.error.code); }
obj.src = "https://example.com/blahblah.mp4";

Specifications: HTML Living Standard, the definition of 'MediaError.code' in that specification.
msExtendedCode - Web APIs
Example:

var video1 = document.getElementById("video1");

video1.addEventListener('error', function () {
  var error = video1.error.msExtendedCode;
  //...
}, false);

video1.addEventListener('canplay', function () {
  video1.play();
}, false);
...
MediaRecorder.onerror - Web APIs
bufferList = [];

try {
  recorder = new MediaRecorder(stream);
} catch(err) {
  return err.name; /* return the error name */
}

recorder.ondataavailable = function(event) {
  bufferList.push(event.data);
};

recorder.onerror = function(event) {
  let error = event.error;

  switch(error.name) {
    case 'InvalidStateError':
      showNotification("You can't record the video right now. Try again later.");
      break;
    case 'SecurityError':
      showNotification("Recording the specified source is not allowed due to security restrictions.");
      break;
    default:
      showNotification("A problem occurred while trying to record the video.");
      break;
  }
};

/* this would be a good place to create a worker to handle writing the buffers to disk periodically */
recorder.start(100); /* 100ms time slices per buffer */
return recorder;
}

Specifications: MediaStream Recording, the definition of 'MediaRecorder.onerror' in that specification.
MediaSource.activeSourceBuffers - Web APIs
The activeSourceBuffers read-only property of the MediaSource interface returns a SourceBufferList object containing a subset of the SourceBuffer objects contained within sourceBuffers: the list of objects providing the selected video track, enabled audio tracks, and shown/hidden text tracks.
... readyState); // open

var mediaSource = this;
var sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);

fetchAB(assetURL, function (buf) {
  sourceBuffer.addEventListener('updateend', function (_) {
    mediaSource.endOfStream();
    console.log(mediaSource.activeSourceBuffers);
    // will contain the source buffer that was added above,
    // as it is selected for playing in the video player
    video.play();
    //console.log(mediaSource.readyState); // ended
  });
  sourceBuffer.appendBuffer(buf);
});
};
...
MediaSource.addSourceBuffer() - Web APIs
Example: the following snippet is from a simple example written by Nick Desaulniers (view the full demo live, or download the source for further investigation).

var assetURL = 'frag_bunny.mp4';
// need to be specific for Blink regarding codecs
// ./mp4info frag_bunny.mp4 | grep Codec
var mimeCodec = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';

if ('MediaSource' in window && MediaSource.isTypeSupported(mimeCodec)) {
  var mediaSource = new MediaSource;
  //console.log(mediaSource.readyState); // closed
  video.src = URL.createObjectURL(mediaSource);
  mediaSource.addEventListener('sourceopen', sourceOpen);
} else {
  console.error('Unsupported MIME type or codec: ', mimeCodec);
}

function sourceOpen (_) {
  //console.log(this.readyState); // open
  var mediaSource = this;
  var sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);

  fetchAB(assetURL, function (buf) {
    sourceBuffer.addEventListener('updateend', function (_) {
      mediaSource.endOfStream();
      video.play();
      //console.log(mediaSource.readyState); // ended
    });
    sourceBuffer.appendBuffer(buf);
  });
};

Specifications: Media Source Extensions, the definition of 'addSourceBuffer()' in that specification.
MediaSource.endOfStream() - Web APIs
Their SourceBuffer.updating property is true.) Example: the following snippet is from a simple example written by Nick Desaulniers (view the full demo live, or download the source for further investigation); it uses the same code as the MediaSource.addSourceBuffer() snippet above. Specifications: Media Source Extensions, the definition of 'endOfStream()' in that specification.
MediaSource.isTypeSupported() - Web APIs
Example: the following snippet is from an example written by Nick Desaulniers (view the full demo live, or download the source for further investigation); it uses the same code as the MediaSource.addSourceBuffer() snippet above. Specifications: Media Source Extensions, the definition of 'isTypeSupported()' in that specification.
MediaSource.readyState - Web APIs
Example: the following snippet is from a simple example written by Nick Desaulniers (view the full demo live, or download the source for further investigation); it uses the same code as the MediaSource.addSourceBuffer() snippet above. Specifications: Media Source Extensions, the definition of 'readyState' in that specification.
MediaStream.getAudioTracks() - Web APIs
Example: this example gets a webcam's audio and video in a stream using getUserMedia(), attaches the stream to a <video> element, then sets a timer that, upon expiring, will stop the first audio track found on the stream.

navigator.mediaDevices.getUserMedia({audio: true, video: true})
  .then(mediaStream => {
    document.querySelector('video').srcObject = mediaStream;
    // stop the audio stream after 5 seconds
    setTimeout(() => {
      const tracks = mediaStream.getAudioTracks()
      tracks[0].stop()
    }, 5000)
  })

Specifications: Media Capture and Streams, the definition of 'getAudioTracks()' in that specification.
MediaStream - Web APIs
A stream consists of several tracks, such as video or audio tracks.
... MediaStream.getVideoTracks(): returns a list of the MediaStreamTrack objects stored in the MediaStream object that have their kind attribute set to "video".
MediaStreamTrack.applyConstraints() - Web APIs
For example, you may prefer high-density video but require that the frame rate be a little low to help keep the data rate low enough not to overtax the network.

const constraints = {
  width: {min: 640, ideal: 1280},
  height: {min: 480, ideal: 720},
  advanced: [
    {width: 1920, height: 1280},
    {aspectRatio: 1.333}
  ]
};

navigator.mediaDevices.getUserMedia({ video: true })
  .then(mediaStream => {
    const track = mediaStream.getVideoTracks()[0];
    track.applyConstraints(constraints)
      .then(() => {
        // do something with the track such as using the Image Capture API.
...
MediaStreamTrack.kind - Web APIs
The MediaStreamTrack.kind read-only property returns a DOMString set to "audio" if the track is an audio track and to "video" if it is a video track.
... "video": the track is a video track.
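Since kind is just a string on each track, grouping a stream's tracks is a one-liner per kind. A minimal sketch, with plain objects standing in for real MediaStreamTrack instances (the helper name is an assumption for illustration):

```javascript
// Partition a list of tracks into audio and video groups by their
// kind attribute; anything else is simply ignored.
function splitTracksByKind(tracks) {
  const audio = tracks.filter(t => t.kind === 'audio');
  const video = tracks.filter(t => t.kind === 'video');
  return { audio, video };
}

// In a browser: splitTracksByKind(stream.getTracks())
```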
MediaStreamTrack.stop() - Web APIs
Examples: stopping a video stream. In this example, we see a function which stops a streamed video by calling stop() on every track on a given <video>.

function stopStreamedVideo(videoElem) {
  const stream = videoElem.srcObject;
  const tracks = stream.getTracks();

  tracks.forEach(function(track) {
    track.stop();
  });

  videoElem.srcObject = null;
}

This works by obtaining the video element's stream from its srcObject property.
MediaTrackConstraints.aspectRatio - Web APIs
Syntax:

var constraintsObject = { aspectRatio: constraint };

constraintsObject.aspectRatio = constraint;

Value: a ConstrainDouble describing the acceptable or required value(s) for a video track's aspect ratio.
... For example, the standard high-definition video aspect ratio of 16:9 can be computed as 1920/1080, or 1.7777777778.
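The computation above generalizes directly: the constraint value is just width divided by height. A minimal sketch (the helper name is an assumption for illustration):

```javascript
// Derive the aspectRatio constraint value from a target resolution.
function aspectRatioFor(width, height) {
  return width / height;
}

const hdRatio = aspectRatioFor(1920, 1080); // 16:9, i.e. ~1.7777777778
// e.g. navigator.mediaDevices.getUserMedia({ video: { aspectRatio: { ideal: hdRatio } } })
```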
MediaTrackConstraints.logicalSurface - Web APIs
Usage notes: you can check the setting selected by the user agent after the display media has been created by getDisplayMedia() by calling getSettings() on the display media's video MediaStreamTrack, then checking the value of the returned MediaTrackSettings object's logicalSurface property.
... For example, if your app needs to know if the selected display surface is a logical one:

let isLogicalSurface = displayStream.getVideoTracks()[0].getSettings().logicalSurface;

Following this code, isLogicalSurface is true if the display surface contained in the stream is a logical surface; that is, one which may not be entirely onscreen, or may even be entirely offscreen.
MediaTrackSupportedConstraints.cursor - Web APIs
Capturing is then started by calling getDisplayMedia() and attaching the returned stream to the video element referenced by the variable videoElem.

async function captureWithCursor() {
  let supportedConstraints = navigator.mediaDevices.getSupportedConstraints();
  let displayMediaOptions = {
    video: {
      displaySurface: "browser"
    },
    audio: false
  };

  if (supportedConstraints.cursor) {
    displayMediaOptions.video.cursor = "always";
  }

  try {
    videoElem.srcObject = await navigator.mediaDevices.getDisplayMedia(displayMediaOptions);
  } catch(err) {
    /* handle the error */
  }
}

Specifications: Screen Capture, the definition of 'MediaTrackSupportedConstraints.cursor' in that specification.
MediaTrackSupportedConstraints.displaySurface - Web APIs
Capturing is then started by calling getDisplayMedia() and attaching the returned stream to the video element referenced by the variable videoElem.

async function capture() {
  let supportedConstraints = navigator.mediaDevices.getSupportedConstraints();
  let displayMediaOptions = {
    video: { },
    audio: false
  };

  if (supportedConstraints.displaySurface) {
    displayMediaOptions.video.displaySurface = "monitor";
  }

  try {
    videoElem.srcObject = await navigator.mediaDevices.getDisplayMedia(displayMediaOptions);
  } catch(err) {
    /* handle the error */
  }
}

Specifications: Screen Capture, the definition of 'MediaTrackSupportedConstraints.displaySurface' in that specification.
MediaTrackSupportedConstraints.logicalSurface - Web APIs
Capturing is then started by calling getDisplayMedia() and attaching the returned stream to the video element referenced by the variable videoElem.

async function capture() {
  let supportedConstraints = navigator.mediaDevices.getSupportedConstraints();
  let displayMediaOptions = {
    video: { },
    audio: false
  };

  if (supportedConstraints.logicalSurface) {
    displayMediaOptions.video.logicalSurface = true;
  }

  try {
    videoElem.srcObject = await navigator.mediaDevices.getDisplayMedia(displayMediaOptions);
  } catch(err) {
    /* handle the error */
  }
}

Specifications: Screen Capture, the definition of 'MediaTrackSupportedConstraints.logicalSurface' in that specification.
Microsoft API extensions - Web APIs
Touch APIs: Element.msZoomTo(), msContentZoom, MSManipulationEvent, msManipulationStateChanged, msManipulationViewsEnabled, msPointerHover.

Media APIs: HTMLVideoElement.msFrameStep(), HTMLVideoElement.msHorizontalMirror, HTMLVideoElement.msInsertVideoEffect(), HTMLVideoElement.msIsLayoutOptimalForPlayback, HTMLVideoElement.msIsStereo3D, HTMLVideoElement.msZoom, HTMLAudioElement.msAudioCategory, HTMLAudioElement.msAudioDeviceType, HTMLMediaElement.msClearEffects(), HTMLMediaElement.msInsertAudioEffect(), MediaError.msExtendedCode, msGraphicsTrust, msGraphicsTrustStatus, msIsBoxed, msPlayToDisabled, msPlayToPreferredSourceUri, msPlayToPrimary, msPlayToSource, msRealTime, msSetMediaProtectionManager, msSetVideoRectangle, msStereo3DPackingMode, msStereo3DRenderMode, onMSVideoFormatChanged, onMSVideoFrameStepCompleted, onMSVideoOptimalLayoutChanged, msFirstPaint.

Pinned sites APIs: MSSiteModeEvent, msSiteModeJumpListItemRemoved, msThumbnailClick.

Other APIs: x-ms-aria-flowfrom, x-ms-acceleratorkey, x-ms-format-detection, msCaching, msCachingEnabled, msCapsLockWarningOff, Event.msConvertURL(), msElementResize, Document.msElementsFromRect(), msIsStaticHTML, Navigator.msLaunchUri(), msLaunchUriCallback, Element.msMatchesSelector(), msProtocols, msPutPropertyEnabled, msWriteProfilerMark ...
msPlayToPreferredSourceUri - Web APIs
Example:

<video src="http://www.contoso.com/videos/video.mp4" msPlayToPreferredSourceUri="http://www.contoso.com/catalogid=1234" />

var video = document.createElement('video');
document.body.appendChild(video);

video.src = "http://www.contoso.com/videos/video.mp4";
video.msPlayToPreferredSourceUri = "http://www.contoso.com/catalogid=1234";

See also: Microsoft PlayReady content access and protection technology is a set of technologies that can be used to distribute audio/video content more securely over a network, and help prevent the unauthorized use of this content.
RTCConfiguration.bundlePolicy - Web APIs
This string, which must be a member of the RTCBundlePolicy enumeration, has the following possible values: "balanced": the ICE agent begins by creating one RTCDtlsTransport to handle each type of content added: one for audio, one for video, and one for the RTC data channel, if applicable.
... If the remote peer isn't BUNDLE-aware, the ICE agent chooses one audio track and one video track and those two tracks are each assigned to the corresponding RTCDtlsTransport.
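As a sketch of how the policy described above is supplied (the STUN server URL below is a placeholder, not from the source), bundlePolicy is one field of the configuration object passed to the RTCPeerConnection constructor:

```javascript
// Hypothetical configuration object; the ICE server URL is an example value.
const config = {
  bundlePolicy: "balanced", // one RTCDtlsTransport per content type (audio, video, data)
  iceServers: [{ urls: "stun:stun.example.org" }]
};

// In a browser this would be handed to the constructor:
//   const pc = new RTCPeerConnection(config);
console.log(config.bundlePolicy); // → balanced
```

The other members of the RTCBundlePolicy enumeration ("max-compat" and "max-bundle") would be supplied the same way.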
RTCDtlsTransport - Web APIs
If the connection was created using max-compat mode, each transport is responsible for handling all of the communications for a given type of media (audio, video, or data channel).
... Thus, a connection that has any number of audio and video channels will always have exactly one DTLS transport for audio and one for video communications.
RTCInboundRtpStreamStats.receiverId - Web APIs
The receiverId property of the RTCInboundRtpStreamStats dictionary specifies the ID of the RTCAudioReceiverStats or RTCVideoReceiverStats object representing the RTCRtpReceiver receiving the stream.
... Syntax: var receiverStatsId = rtcInboundRtpStreamStats.receiverId; value: a DOMString which contains the ID of the RTCAudioReceiverStats or RTCVideoReceiverStats object which provides information about the RTCRtpReceiver which is receiving the streamed media.
RTCInboundRtpStreamStats.trackId - Web APIs
The trackId property of the RTCInboundRtpStreamStats dictionary indicates the ID of the RTCReceiverAudioTrackAttachmentStats or RTCReceiverVideoTrackAttachmentStats object representing the MediaStreamTrack which is receiving the incoming media.
... Syntax: var trackStatsId = rtcInboundRtpStreamStats.trackId; value: a DOMString containing the ID of the RTCReceiverAudioTrackAttachmentStats or RTCReceiverVideoTrackAttachmentStats object representing the track which is receiving the media from this RTP session.
RTCOutboundRtpStreamStats.framesEncoded - Web APIs
Syntax: var framesEncoded = rtcOutboundRtpStreamStats.framesEncoded; value: an integer value indicating the total number of video frames that this sender has encoded so far for this stream.
... Note: this property is only valid for video streams.
RTCOutboundRtpStreamStats.trackId - Web APIs
The trackId property of the RTCOutboundRtpStreamStats dictionary indicates the ID of the RTCSenderAudioTrackAttachmentStats or RTCSenderVideoTrackAttachmentStats object representing the MediaStreamTrack which is being sent on this stream.
... Syntax: var trackStatsId = rtcOutboundRtpStreamStats.trackId; value: a DOMString containing the ID of the RTCSenderAudioTrackAttachmentStats or RTCSenderVideoTrackAttachmentStats object representing the track which is the source of the media being sent on this stream.
RTCPeerConnection.onaddstream - Web APIs
Example: this code, based on an older version of our signaling and video calling sample, responds to addstream events by setting the video source for a <video> element to the stream specified in the event, and then enabling a "hang up" button in the app's user interface.
...
pc.onaddstream = function(event) {
  document.getElementById("received_video").srcObject = event.stream;
  document.getElementById("hangup-button").disabled = false;
};
You can also use addEventListener() to add a handler for addstream events to an RTCPeerConnection.
RTCPeerConnection.removeStream() - Web APIs
The RTCPeerConnection.removeStream() method removes a MediaStream as a local source of audio or video.
... Example:
var pc, videoStream;
navigator.getUserMedia({video: true}, function(stream) {
  pc = new RTCPeerConnection();
  videoStream = stream;
  pc.addStream(stream);
});
document.getElementById("closeButton").addEventListener("click", function(event) {
  pc.removeStream(videoStream);
  pc.close();
}, false);
...
RTCPeerConnection.removeTrack() - Web APIs
Example: this example adds a video track to a connection and sets up a listener on a close button which removes the track when the user clicks the button.
...
var pc, sender;
navigator.getUserMedia({video: true}, function(stream) {
  pc = new RTCPeerConnection();
  var track = stream.getVideoTracks()[0];
  sender = pc.addTrack(track, stream);
});
document.getElementById("closeButton").addEventListener("click", function(event) {
  pc.removeTrack(sender);
  pc.close();
}, false);
Specifications: WebRTC 1.0: Real-Time Communication Between Browsers, the definition of 'RTCPeerConnection.removeTrack()' in that specification.
RTCPeerConnection.setRemoteDescription() - Web APIs
This code is derived from the example and tutorial in the article Signaling and video calling; take a look at that for more details and a more in-depth explanation of what's going on.
...
function handleOffer(msg) {
  createMyPeerConnection();
  myPeerConnection.setRemoteDescription(msg.description).then(function () {
    return navigator.mediaDevices.getUserMedia(mediaConstraints);
  })
  .then(function(stream) {
    document.getElementById("local_video").srcObject = stream;
    return myPeerConnection.addStream(stream);
  })
  .then(function() {
    return myPeerConnection.createAnswer();
  })
  .then(function(answer) {
    return myPeerConnection.setLocalDescription(answer);
  })
  .then(function() {
    // send the answer to the remote peer using the signaling server
  })
  .catch(handleGetUserMediaError);
}
After creating our RTCPeerConnection and saving it as myPeerConnection, we pass the description included in the received offer message, msg, directly into setRemoteD...
RTCPeerConnection: track event - Web APIs
pc = new RTCPeerConnection({
  iceServers: [
    { urls: "turn:fake.turnserver.url", username: "someusername", credential: "somepassword" }
  ]
});
pc.addEventListener("track", e => {
  videoElement.srcObject = e.streams[0];
  hangupButton.disabled = false;
}, false);
The event handler assigns the new track's first stream to an existing <video> element, identified using the variable videoElement.
...
pc.ontrack = e => {
  videoElement.srcObject = e.streams[0];
  hangupButton.disabled = false;
  return false;
}
Specifications: WebRTC 1.0: Real-Time Communication Between Browsers, the definition of 'track' in that specification.
Streams API - Web APIs
This is something browsers do anyway when receiving assets to be shown on webpages: videos buffer and more is gradually available to play, and sometimes you'll see images display gradually as more is loaded.
... Previously, if we wanted to process a resource of some kind (be it a video, or a text file, etc.), we'd have to download the entire file, wait for it to be deserialized into a suitable format, then process the whole lot after it is fully received.
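The incremental-processing idea above can be sketched with a ReadableStream and its reader. The stream here is built from in-memory strings purely for illustration; a real use would read a response body from the network instead.

```javascript
// A stream that delivers three chunks, standing in for a network resource.
const stream = new ReadableStream({
  start(controller) {
    ["chunk-1 ", "chunk-2 ", "chunk-3"].forEach(c => controller.enqueue(c));
    controller.close();
  }
});

// Read and handle each chunk as it arrives instead of waiting for the whole body.
async function readAll(readable) {
  const reader = readable.getReader();
  let result = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;   // no more chunks
    result += value;   // incremental processing happens here
  }
  return result;
}

readAll(stream).then(text => console.log(text)); // → chunk-1 chunk-2 chunk-3
```

With a fetch() response, the same loop would run over response.body, processing bytes long before the download completes.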
TextTrackList.onchange - Web APIs
The TextTrackList property onchange is an event handler which is called when the change event occurs, indicating that a change has occurred on a TextTrack in the TextTrackList.
...
var trackList = document.querySelector("video, audio").textTracks;
trackList.onchange = function(event) { ....
TextTrackList - Web APIs
Examples. Getting a video element's text track list: to get a media element's TextTrackList, use its textTracks property.
...
var textTracks = document.querySelector("video").textTracks;
Monitoring track count changes: in this example, we have an app that displays information about the number of channels available.
TrackEvent() - Web APIs
The TrackEvent() constructor creates and returns a new TrackEvent object describing an event which occurred on a list of tracks (AudioTrackList, VideoTrackList, or TextTrackList).
... eventInfo (optional): an optional dictionary providing additional information configuring the new event; it can contain the following fields in any combination: track (optional): the track to which the event refers; this is null by default, but should be set to a VideoTrack, AudioTrack, or TextTrack as appropriate given the type of track.
TrackEvent.track - Web APIs
This will be an AudioTrack, VideoTrack, or TextTrack object.
... Syntax: track = trackEvent.track; value: an object which is one of the types AudioTrack, VideoTrack, or TextTrack, depending on the type of media represented by the track.
WebGL2RenderingContext.texImage3D() - Web APIs
...dth, height, depth, border, format, type, GLintptr offset);
void gl.texImage3D(target, level, internalformat, width, height, depth, border, format, type, HTMLCanvasElement source);
void gl.texImage3D(target, level, internalformat, width, height, depth, border, format, type, HTMLImageElement source);
void gl.texImage3D(target, level, internalformat, width, height, depth, border, format, type, HTMLVideoElement source);
void gl.texImage3D(target, level, internalformat, width, height, depth, border, format, type, ImageBitmap source);
void gl.texImage3D(target, level, internalformat, width, height, depth, border, format, type, ImageData source);
void gl.texImage3D(target, level, internalformat, width, height, depth, border, format, type, ArrayBufferView?
... gl.INT, gl.HALF_FLOAT, gl.FLOAT, gl.UNSIGNED_INT_2_10_10_10_REV, gl.UNSIGNED_INT_10F_11F_11F_REV, gl.UNSIGNED_INT_5_9_9_9_REV, gl.UNSIGNED_INT_24_8, gl.FLOAT_32_UNSIGNED_INT_24_8_REV (pixels must be null). Source: one of the following objects can be used as a pixel source for the texture: ArrayBufferView, ImageBitmap, ImageData, HTMLImageElement, HTMLCanvasElement, HTMLVideoElement.
WebGL2RenderingContext.texSubImage3D() - Web APIs
pixels); void gl.texSubImage3D(target, level, xoffset, yoffset, zoffset, width, height, depth, format, type, HTMLVideoElement?
... ImageBitmap, ImageData, HTMLImageElement, HTMLCanvasElement, HTMLVideoElement.
Viewpoints and viewers: Simulating cameras in WebXR - Web APIs
To do this, WebXR asks your renderer to draw the scene twice for each frame of video, once for each eye.
... Next, we go ahead and queue up the request to render the next frame of video, so we don't have to worry about doing it later, by simply calling requestAnimationFrame() again.
WebXR Device API - Web APIs
Each XRView corresponds to the video display surface used to present the scene to the user.
... Playing video in a 3D environment: in this guide, we examine how to play video into a 3D scene.
Web APIs
...ent HTMLQuoteElement HTMLScriptElement HTMLSelectElement HTMLShadowElement HTMLSlotElement HTMLSourceElement HTMLSpanElement HTMLStyleElement HTMLTableCaptionElement HTMLTableCellElement HTMLTableColElement HTMLTableElement HTMLTableRowElement HTMLTableSectionElement HTMLTemplateElement HTMLTextAreaElement HTMLTimeElement HTMLTitleElement HTMLTrackElement HTMLUListElement HTMLUnknownElement HTMLVideoElement HashChangeEvent Headers History HkdfParams HmacImportParams HmacKeyGenParams I IDBCursor IDBCursorSync IDBCursorWithValue IDBDatabase IDBDatabaseException IDBDatabaseSync IDBEnvironment IDBEnvironmentSync IDBFactory IDBFactorySync IDBIndex IDBIndexSync IDBKeyRange IDBLocaleAwareKeyRange IDBMutableFile IDBObjectStore IDBObjectStoreSync IDBOpenDBRequest IDBRequ...
...event ULongRange URL URLSearchParams URLUtilsReadOnly USB USBAlternateInterface USBConfiguration USBDevice USBEndpoint USBInTransferResult USBInterface USBIsochronousInTransferPacket USBIsochronousInTransferResult USBIsochronousOutTransferPacket USBIsochronousOutTransferResult USBOutTransferResult USVString UserDataHandler UserProximityEvent V VTTCue VTTRegion ValidityState VideoConfiguration VideoPlaybackQuality VideoTrack VideoTrackList VisualViewport W WEBGL_color_buffer_float WEBGL_compressed_texture_astc WEBGL_compressed_texture_atc WEBGL_compressed_texture_etc WEBGL_compressed_texture_etc1 WEBGL_compressed_texture_pvrtc WEBGL_compressed_texture_s3tc WEBGL_compressed_texture_s3tc_srgb WEBGL_debug_renderer_info WEBGL_debug_shaders WEBGL_depth_texture WEBGL_draw...
ARIA - Accessibility
Videos of screen readers using ARIA: see both real and simplified examples from around the web, including "before" and "after" ARIA videos.
... Videos: the following talks are a great way to understand ARIA: ARIA, accessibility APIs and coding like you give a damn!
Cognitive accessibility - Accessibility
Another form it can take is depression, such as when mourning the loss of a loved one, or being momentarily saddened by a tweet or video they just saw online.
... Cognitive skills include: attention, memory, processing speed, time management, letters and language, numbers, symbols and math, understanding and making choices. A solid approach to providing accessible solutions for people with cognitive impairments includes: delivering content in more than one way, such as by text-to-speech or by video; providing easily-understood content, such as text written using plain-language standards; focusing attention on important content; minimizing distractions, such as unnecessary content or advertisements; providing consistent web page layout and navigation; incorporating familiar elements, such as underlined links that are blue when not visited and purple when visited; dividing processes int...
::backdrop - CSS: Cascading Style Sheets
Syntax: ::backdrop
Examples: styling the backdrop for full-screen video. In this example, the backdrop style used when a video is shifted to full-screen mode is configured to be a grey-blue color rather than the black it defaults to in most browsers.
...
video::backdrop {
  background-color: #448;
}
The resulting screen looks like this: note the dark grey-blue letterbox effect above and below where the backdrop is visible.
@document - CSS: Cascading Style Sheets
media-document(), with the parameter of video, image, plugin or all.
... Formal syntax: @document [ <url> | url-prefix(<string>) | domain(<string>) | media-document(<string>) | regexp(<string>) ]# { <group-rule-body> }
Examples: specifying document for CSS rule:
@document url("http://www.w3.org/"),
          url-prefix("http://www.w3.org/style/"),
          domain("mozilla.org"),
          media-document("video"),
          regexp("https:.*") {
  /* CSS rules here apply to:
     - the page "http://www.w3.org/"
     - any page whose URL begins with "http://www.w3.org/style/"
     - any page whose URL's host is "mozilla.org" or ends with ".mozilla.org"
     - any standalone video
     - any page whose URL starts with "https:" */
  /* make the above-mentioned pages really ugly */
  body { color...
Demos of open web technologies
Canvas (code demos): 3D raycaster, processing.js, p5js, 3D on 2D canvas, miniPaint: image editor (source code), Zen photon garden (source code), multi touch in canvas demo (source code). SVG: bubblemenu (visual effects and interaction), HTML transformations using foreignObject (visual effects and transforms), phonetics guide (interactive), 3D objects demo (interactive), blobular (interactive), video embedded in SVG (or use the local download), summer HTML image map creator (source code).
Video: 3D animation "mozilla constantly evolving", 3D animation "floating dance", streaming anime, movie trailer and interview, billy's browser firefox flick, virtual barber shop, transformers movie trailer, a scanner darkly movie trailer (with built in controls), events firing and volume control, dragable and sizable videos.
3D graphics: WebGL: web audio fireworks, ioquake3 (source code), escher puzzle (source code), kai 'opua (source code). Virtual reality: the polar sea (source code), sechelt fly-through (source code). CSS: CSS zen garden, CSS floating logo "mozilla", paperfold, CSS blockout, rubik's cube, pure CSS slides, planetarium (source code), loader with blend modes, text reveal with clip-path, ambient shadow with custom properties, luminiscent vial, CSS-based single page application (source code). Transformations: impress.js (source code). Games: ioquake3 (source code), kai 'opua (source code). Web APIs: notifications API: HTML5 notifications (source code); web audio API: web audio fireworks, oscope.js - javascript oscilloscope, HTML5 web audio ...
Cross-browser audio basics - Developer guides
Auto-playing audio (and video) is usually really annoying.
...xample, MediaElement.js includes Flash fallbacks, which are used something like this:
<audio controls>
  <source src="audiofile.mp3" type="audio/mpeg">
  <source src="audiofile.ogg" type="audio/ogg">
  <!-- fallback for non supporting browsers goes here -->
  <a href="audiofile.mp3">download audio</a>
  <object width="320" height="30" type="application/x-shockwave-flash" data="mediaelement-flash-video.swf">
    <param name="movie" value="mediaelement-flash-video.swf" />
    <param name="flashvars" value="controls=true&isVideo=false&file=audiofile.mp3" />
  </object>
</audio>
Note: you should be aware that Flash and Silverlight code require that the user has the appropriate plugin installed, and that the browser cannot guarantee the security aspects of code running on those plugin platforms.
Media events - Developer guides
Various events are sent when handling media that are embedded in HTML documents using the <audio> and <video> elements; this section lists them and provides some helpful information about using them.
... You can easily watch for these events, using code such as the following:
var v = document.getElementsByTagName("video")[0];
v.addEventListener("seeked", function() { v.play(); }, true);
v.currentTime = 10.0;
This example fetches the first video element in the document and attaches an event listener to it, watching for the seeked event, which is sent whenever a seek operation completes.
Graphics on the Web - Developer guides
Video. Using HTML5 audio and video: embedding video and/or audio in a web page and controlling its playback.
... WebRTC: the RTC in WebRTC stands for real-time communications, a technology that enables audio/video streaming and data sharing between browser clients (peers).
User input and controls - Developer guides
Fullscreen: you might need to present an element of your application (such as a <video>, for example) in fullscreen mode.
... Bear in mind that many browsers still implement this with a vendor prefix, so you will probably need to fork your code something like this:
var elem = document.getElementById("myvideo");
if (elem.requestFullscreen) {
  elem.requestFullscreen();
} else if (elem.msRequestFullscreen) {
  elem.msRequestFullscreen();
} else if (elem.mozRequestFullScreen) {
  elem.mozRequestFullScreen();
} else if (elem.webkitRequestFullscreen) {
  elem.webkitRequestFullscreen();
}
Note: to find out more about adding fullscreen functionality to your application, read our documentation about using fullscreen mode.
HTML attribute: capture - HTML: Hypertext Markup Language
The capture attribute takes as its value a string that specifies which camera to use for capture of image or video data, if the accept attribute indicates that the input should be of one of those types.
...
<p>
  <label for="soundfile">what does your voice sound like?:</label>
  <input type="file" id="soundfile" capture="user" accept="audio/*">
</p>
<p>
  <label for="videofile">upload a video:</label>
  <input type="file" id="videofile" capture="environment" accept="video/*">
</p>
<p>
  <label for="imagefile">upload a photo of yourself:</label>
  <input type="file" id="imagefile" capture="user" accept="image/*">
</p>
Note: these work better on mobile devices; if your device is a desktop computer, you'll likely get a typical file picker.
<embed>: The Embed External Content element - HTML: Hypertext Markup Language
Examples:
<embed type="video/quicktime" src="movie.mov" width="640" height="480" title="title of my video">
Accessibility concerns: use the title attribute on an embed element to label its content so that people navigating with assistive technology such as a screen reader can understand what it contains.
... This context shift can be confusing and time-consuming, especially if the embed element contains interactive content like video or audio.
Content Security Policy (CSP) - HTTP
There are specific directives for a wide variety of types of items, so that each type can have its own policy, including fonts, frames, images, audio and video media, scripts, and workers.
...-src 'self'
Example 2: a web site administrator wants to allow content from a trusted domain and all its subdomains (it doesn't have to be the same domain that the CSP is set on): Content-Security-Policy: default-src 'self' *.trusted.com
Example 3: a web site administrator wants to allow users of a web application to include images from any origin in their own content, but to restrict audio or video media to trusted providers, and all scripts only to a specific server that hosts trusted code.
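A header matching the description in example 3 might look like the following sketch; the media-provider and script-server host names here are placeholders, not values from the source:

```http
Content-Security-Policy: default-src 'self'; img-src *; media-src media1.example.com media2.example.com; script-src userscripts.example.com
```

img-src * permits images from any origin, media-src limits <audio> and <video> sources to the two listed hosts, and script-src pins scripts to one trusted server, with default-src 'self' as the fallback for everything else.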
CSP: media-src - HTTP
The HTTP Content-Security-Policy (CSP) media-src directive specifies valid sources for loading media using the <audio> and <video> elements.
... Examples: violation cases. Given this CSP header:
Content-Security-Policy: media-src https://example.com/
the following <audio>, <video> and <track> elements are blocked and won't load:
<audio src="https://not-example.com/audio"></audio>
<video src="https://not-example.com/video">
  <track kind="subtitles" src="https://not-example.com/subtitles">
</video>
Specifications: Content Security Policy Level 3, the definition of 'media-src' in that specification.
Mixed content - Web security
Passive content list: this section lists all types of HTTP requests which are considered passive content: <img> (src attribute), <audio> (src attribute), <video> (src attribute), <object> subresources (when an <object> performs HTTP requests). Mixed active content: mixed active content is content that has access to all or parts of the Document Object Model of the HTTPS page.
... Some common examples of mixed content include JavaScript files, stylesheets, images, videos, and other media.
Tutorials
Intermediate level: multimedia and embedding. This module explores how to use HTML to include multimedia in your web pages, including the different ways that images can be included, and how to embed video, audio, and even entire other webpages.
... JavaScript videos: a collection of JavaScript videos to watch.
StringView - Archive of obsolete content
...e object StringView.prototype to create a collection of methods for such string-like objects (since now: StringViews) which work strictly on arrays of numbers rather than on creating new immutable JavaScript strings, to work with Unicode encodings other than JavaScript's default UTF-16 DOMStrings. Introduction: as web applications become more and more powerful, adding features such as audio and video manipulation, access to raw data using WebSockets, and so forth, it has become clear that there are times when it would be helpful for JavaScript code to be able to quickly and easily manipulate raw binary data.
Index of archived content - Archive of obsolete content
Tuning pageload, URIScheme, URIs and URLs, URILoader, using addresses of stack variables with NSPR threads on Win16, using cross commit, using GDB on wimpy computers, Venkman: using breakpoints in Venkman, Venkman internals, Venkman introduction, video presentations, when to use ifdefs, writing textual data, XML in Mozilla, XPInstall: creating XPI installer modules, install wizards (aka: stub installers), Mac stub installer, Unix stub installer, Windows stub installer ...
List of Former Mozilla-Based Applications - Archive of obsolete content
2010, rw/w reports Jolicloud is on Chrome/Chrome OS. Joost (TV over internet): switched from XULRunner-based client to a web application. Liferea (news aggregator): switched to WebKit in version 1.6. Manyone browser (browser): originally Mozilla-based but now I believe they have a web-based tool (need reference for that). Miro (formerly Democracy Player) (video): switched from XULRunner to WebKit in version 3.0.2. Moblin browser (browser): when Moblin became MeeGo it switched from a custom Gecko-based browser to Chrome. Nautilus (file manager): hasn't used Mozilla code since version 2.0. Raptr client (gaming client): was a XULRunner app initially but now uses Adobe AIR. Rift technologies: software installation...
MMgc - Archive of obsolete content
Incremental collection: the Flash player is frequently used for animations and video that must maintain a certain framerate to play properly.
ActiveX Control for Hosting Netscape Plug-ins in IE - Archive of obsolete content
Usage: insert some HTML like this into your content:
<object classid="clsid:dbb2de32-61f1-4f7f-beb8-a37f5bc24ee2" width="500" height="300">
  <param name="type" value="video/quicktime"/>
  <param name="src" value="http://www.foobar.com/some_movie.mov"/>
  <!-- custom arguments -->
  <param name="loop" value="true"/>
</object>
The classid attribute tells IE to create an instance of the plug-in hosting control, the width and height specify the dimensions in pixels.
JavaScript basics - Learn web development
These include: browser Application Programming Interfaces (APIs) built into web browsers, providing functionality such as dynamically creating HTML and setting CSS styles; collecting and manipulating a video stream from a user's webcam, or generating 3D graphics and audio samples.
HTML Cheatsheet - Learn web development
Embedded video:
<video controls src="https://udn.realityripple.com/samples/1f/6cc66528b8.mp4">The <code>video</code> element is unsupported.</video>
The video element is unsupported.
Use HTML to solve common problems - Learn web development
Abbreviations and make them understandable; how to add quotations and citations to web pages; how to define terms with HTML. Hyperlinks: one of the main reasons for HTML is making navigation easy with hyperlinks, which can be used in many different ways: how to create a hyperlink; how to create a table of contents with HTML. Images & multimedia: how to add images to a webpage; how to add video content to a webpage. Scripting & styling: HTML only sets up document structure.
Responsive images - Learn web development
Like <video> and <audio>, the <picture> element is a wrapper containing several <source> elements that provide different sources for the browser to choose from, followed by the all-important <img> element.
Introducing asynchronous JavaScript - Learn web development
related to blocking), many Web API features now use asynchronous code to run, especially those that access or fetch some kind of resource from an external device, such as fetching a file from the network, accessing a database and returning data from it, accessing a video stream from a web cam, or broadcasting the display to a VR headset.
Fetching data from the server - Learn web development
In this case we want to return our response as an image file, and the data format we use for that is Blob (the term is an abbreviation of "binary large object" and can basically be used to represent large file-like objects, such as images or video files).
Client-side web APIs - Learn web development
Video and audio APIs: HTML5 comes with elements for embedding rich media in documents, <video> and <audio>, which in turn come with their own APIs for controlling playback, seeking, etc.
JavaScript object basics - Learn web development
When you accessed the Document Object Model using lines like this:
const myDiv = document.createElement('div');
const myVideo = document.querySelector('video');
you were using methods available on an instance of the Document class.
JavaScript — Dynamic client-side scripting - Learn web development
Every time a web page does more than just sit there and display static information for you to look at (displaying timely content updates, interactive maps, animated 2D/3D graphics, scrolling video jukeboxes, or more), you can bet that JavaScript is probably involved.
Theme concepts
return;
  }
  currentTheme = theme;
  browser.theme.update(themes[theme]);
}
Learn more about dynamic themes and see an additional example in the following video. If you have not built a browser extension before, check out Your first extension for a step-by-step guide.
Developer guide
Firefox development video tutorials: Brian Bondy's video tutorials on Firefox development.
MathML Accessibility in Mozilla
[Screen reader output comparison table (columns: MathML, Windows, Mac, Linux, AccessFu), with rows such as "diagonal of a regular pentagon" and sample readings, e.g. x+2 read as "x plus 2" and the fraction a/b read as "a over b" or "fraction start, a over b, end of fraction".]
Mozilla projects on GitHub
codefirefox: the codefirefox site, with videos and tutorials about how to contribute to the Firefox project and Mozilla code in general.
Index
the JAR archive itself or all entries in the archive) field #2: the name of the data you are specifying; for example: install-script. Field #3: data corresponding to the name in field #2. For example, the -i option uses the equivalent of this line: + install-script: script.js. This example associates a MIME type with a file: movie.qt mime-type: video/quicktime. For information about the way installer script information appears in the manifest file for a JAR archive, see the JAR format on Netscape DevEdge.
nss tech note1
...lass: SEC_ASN1_BOOLEAN, SEC_ASN1_INTEGER, SEC_ASN1_BIT_STRING, SEC_ASN1_OCTET_STRING, SEC_ASN1_NULL, SEC_ASN1_OBJECT_ID, SEC_ASN1_OBJECT_DESCRIPTOR,† SEC_ASN1_REAL, SEC_ASN1_ENUMERATED, SEC_ASN1_EMBEDDED_PDV, SEC_ASN1_UTF8_STRING, SEC_ASN1_SEQUENCE, SEC_ASN1_SET, SEC_ASN1_NUMERIC_STRING, SEC_ASN1_PRINTABLE_STRING, SEC_ASN1_T61_STRING, SEC_ASN1_TELETEX_STRING, SEC_ASN1_T61_STRING, SEC_ASN1_VIDEOTEX_STRING, SEC_ASN1_IA5_STRING, SEC_ASN1_UTC_TIME, SEC_ASN1_GENERALIZED_TIME, SEC_ASN1_GRAPHIC_STRING, SEC_ASN1_VISIBLE_STRING, SEC_ASN1_GENERAL_STRING, SEC_ASN1_UNIVERSAL_STRING, SEC_ASN1_BMP_STRING. Note that for SEC_ASN1_SET and SEC_ASN1_SEQUENCE types, you must also include the method type macro SEC_ASN1_CONSTRUCTED to construct a fully valid tag, as defined by the ASN.1 standard.
NSS tools : signtool
the JAR archive itself or all entries in the archive) field #2: the name of the data you are specifying; for example: install-script. Field #3: data corresponding to the name in field #2. For example, the -i option uses the equivalent of this line: + install-script: script.js. This example associates a MIME type with a file: movie.qt mime-type: video/quicktime. For information about the way installer script information appears in the manifest file for a JAR archive, see the JAR format on Netscape DevEdge.
Zest
Zest topics: usecases, reporting security vulnerabilities to developers, reporting security vulnerabilities to companies, defining active and passive scanner rules, deep integration with security tools, runtimes (the runtime environments that support Zest), tools (the tools that include support for Zest), implementation (the state of Zest development), videos. Simon demoed Zest at AppSec USA in November 2013, and the full video of my talk is available on YouTube.
The Rust programming language
To learn more about Rust, you can: watch the videos below for a closer look at the power and benefits Rust provides.
AT APIs Support
Open Komodo - platform for building developer environments; Miro - a free, open source internet TV and video player; any XULRunner application (Songbird media player, etc.); extensions to other XUL apps (e.g.
AudioContext.createMediaElementSource() - Web APIs
The createMediaElementSource() method of the AudioContext interface is used to create a new MediaElementAudioSourceNode object, given an existing HTML <audio> or <video> element, the audio from which can then be played and manipulated.
AudioContext.createMediaStreamTrackSource() - Web APIs
navigator.mediaDevices.getUserMedia({audio: true, video: false})
.then(function(stream) {
  audio.srcObject = stream;
  audio.onloadedmetadata = function(e) {
    audio.play();
    audio.muted = true;
  };
  let audioCtx = new AudioContext();
  let source = audioCtx.createMediaStreamSource(stream);
  let biquadFilter = audioCtx.createBiquadFilter();
  biquadFilter.type = "lowshelf";
  biquadFilter.frequency.value = 3000;
  biquadFilter.gain.value = ...
AudioContextLatencyCategory - Web APIs
"interactive": the audio is involved in interactive elements, such as responding to user actions or needing to coincide with visual cues such as a video or game action.
AudioContextOptions - Web APIs
"interactive": the audio is involved in interactive elements, such as responding to user actions or needing to coincide with visual cues such as a video or game action.
AudioNode - Web APIs
an HTML <audio> or <video> element, an OscillatorNode, etc.), the audio destination, intermediate processing module (e.g.
AudioTrack.language - Web APIs
for tracks that include multiple languages (such as a movie in English in which a few lines are spoken in other languages), this should be the video's primary language.
AudioTrackList: addtrack event - Web APIs
bubbles: no; cancelable: no; interface: TrackEvent; event handler property: onaddtrack. examples, using addEventListener(): const videoElement = document.querySelector('video'); videoElement.audioTracks.addEventListener('addtrack', (event) => { console.log(`audio track: ${event.track.label} added`); }); using the onaddtrack event handler property: const videoElement = document.querySelector('video'); videoElement.audioTracks.onaddtrack = (event) => { console.log(`audio track: ${event.track.label} added`); }; specifications: HTML Living Standard, the definition of 'addtrack' in that specification.
AudioTrackList.getTrackById() - Web APIs
function disableCharacter(videoElem, characterName) { videoElem.audioTracks.getTrackById(characterName).enabled = false; } this short function gets the AudioTrackList containing the video's audio tracks using HTMLMediaElement.audioTracks, then calls getTrackById() on it, specifying the character's name.
AudioTrackList.onaddtrack - Web APIs
document.querySelector("video").audioTracks.onaddtrack = function(event) { addToTrackList(event.track); }; specifications: HTML Living Standard, the definition of 'AudioTrackList.onaddtrack' in that specification.
AudioTrackList.onchange - Web APIs
var trackList = document.querySelector("video").audioTracks; trackList.onchange = function(event) { trackList.forEach(function(track) { updateTrackEnabledButton(track.id, track.enabled); }); }; the updateTrackEnabledButton(), in this example, should be a function that finds a user interface control using the track's id (perhaps the app uses the track id as the control element's id) and the track's enabled flag to determine which s...
AudioTrackList.onremovetrack - Web APIs
document.querySelector("my-video").audioTracks.onremovetrack = function(event) { myTrackCount = document.querySelector("my-video").audioTracks.length; }; the current number of audio tracks remaining in the media element is obtained from the AudioTrackList property length.
AudioTrackList: removetrack event - Web APIs
bubbles: no; cancelable: no; interface: TrackEvent; event handler property: onremovetrack. examples, using addEventListener(): const videoElement = document.querySelector('video'); videoElement.audioTracks.addEventListener('removetrack', (event) => { console.log(`audio track: ${event.track.label} removed`); }); using the onremovetrack event handler property: const videoElement = document.querySelector('video'); videoElement.audioTracks.onremovetrack = (event) => { console.log(`audio track: ${event.track.label} removed`); }; specifications: HTML Living Standard, the definition of 'removetrack' in that specification.
AudioTrackList - Web APIs
var audioTracks = document.querySelector("video").audioTracks; monitoring track count changes: in this example, we have an app that displays information about the number of channels available.
CanvasCaptureMediaStreamTrack.requestFrame() - Web APIs
example: // find the canvas element to capture var canvasElt = document.getElementsByTagName("canvas")[0]; // get the stream var stream = canvasElt.captureStream(25); // 25 fps // send the current state of the canvas as a frame to the stream stream.getVideoTracks()[0].requestFrame(); specifications: Media Capture from DOM Elements, the definition of 'CanvasCaptureMediaStream.requestFrame()' in that specification.
CanvasCaptureMediaStreamTrack - Web APIs
the CanvasCaptureMediaStreamTrack interface represents the video track contained in a MediaStream being generated from a <canvas> following a call to HTMLCanvasElement.captureStream().
CanvasImageSource - Web APIs
the interfaces that it allows to be used as image sources are the following: HTMLImageElement, SVGImageElement, HTMLVideoElement, HTMLCanvasElement, ImageBitmap, OffscreenCanvas. specifications: HTML Living Standard, the definition of 'CanvasImageSource' in that specification.
CanvasPattern - Web APIs
the CanvasPattern interface represents an opaque object describing a pattern, based on an image, a canvas, or a video, created by the CanvasRenderingContext2D.createPattern() method.
CanvasRenderingContext2D.createPattern() - Web APIs
it can be any of the following: HTMLImageElement (<img>), SVGImageElement (<image>), HTMLVideoElement (<video>, by using the capture of the video), HTMLCanvasElement (<canvas>), ImageBitmap, OffscreenCanvas. repetition: a DOMString indicating how to repeat the pattern's image.
A basic ray-caster - Web APIs
not exactly a new member of the id Software family, but pretty decent considering it's a fully interpreted environment, and I didn't have to worry about memory allocation or video modes or coding inner routines in assembler or anything.
Basic animations - Web APIs
for more information about the animation loop, especially for games, see the article anatomy of a video game in our game development zone.
Basic usage of canvas - Web APIs
fallback content: the <canvas> element differs from an <img> tag in that, like the <video>, <audio>, or <picture> elements, it is easy to define some fallback content to be displayed in older browsers not supporting it, like versions of Internet Explorer earlier than version 9 or textual browsers.
Hit regions and accessibility - Web APIs
partially overlaying the circle is a green <a href="http://en.wikipedia.org/wiki/square" onfocus="drawsquare();" onblur="drawpicture();">square</a> and a purple <a href="http://en.wikipedia.org/wiki/triangle" onfocus="drawtriangle();" onblur="drawpicture();">triangle</a>, both of which are semi-opaque, so the full circle can be seen underneath.</p> </canvas> see the video of how NVDA reads this example, by Steve Faulkner.
ContentIndex.add() - Web APIs
homepage, article, video, audio. icons: optional; an array of image resources, defined as an object with the following data: src: a URL string of the source image.
ContentIndex.getAll() - Web APIs
homepage, article, video, audio. icons: optional; an array of image resources, defined as an object with the following data: src: a URL string of the source image.
Binary strings - Web APIs
the reason that led to the use of UTF-16 code units as placeholders for uint8 numbers is that as web applications become more and more powerful (adding features such as audio and video manipulation, access to raw data using WebSockets, and so forth) it has become clear that there are times when it would be helpful for JavaScript code to be able to quickly and easily manipulate raw binary data.
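As a minimal sketch of the idea described above, each UTF-16 code unit of a "binary string" can stand in for one byte, convertible to and from a typed array (the helper names are my own, not from the source):

```javascript
// Convert a "binary string" (every code unit assumed to be 0-255)
// into a Uint8Array, one byte per code unit.
function binaryStringToBytes(str) {
  const bytes = new Uint8Array(str.length);
  for (let i = 0; i < str.length; i++) {
    bytes[i] = str.charCodeAt(i);
  }
  return bytes;
}

// Inverse conversion: bytes back to a binary string.
function bytesToBinaryString(bytes) {
  return String.fromCharCode(...bytes);
}

const bytes = binaryStringToBytes("ABC");
console.log(Array.from(bytes)); // [65, 66, 67]
console.log(bytesToBinaryString(bytes)); // "ABC"
```

For large buffers, typed arrays (Uint8Array over an ArrayBuffer) are the preferred representation; binary strings mainly survive in older APIs.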
Document.mozSyntheticDocument - Web APIs
the document.mozSyntheticDocument property indicates whether or not the document is a synthetic one; that is, a document representing a standalone image, video, audio, or the like.
Document - Web APIs
document.mozSyntheticDocument returns a boolean that is true only if this document is synthetic, such as a standalone image, video, audio file, or the like.
Element.onfullscreenchange - Web APIs
function toggleFullscreen() { let elem = document.querySelector("video"); elem.onfullscreenchange = handleFullscreenChange; if (!document.fullscreenElement) { elem.requestFullscreen().then({}).catch(err => { alert(`error attempting to enable full-screen mode: ${err.message} (${err.name})`); }); } else { document.exitFullscreen(); } } function handleFullscreenChange(event) { let elem = event.target; let isFullscreen = document.fullscre...
Element.onfullscreenerror - Web APIs
since full-screen mode changes are only permitted in response to a user input, this causes an error to occur, which triggers the delivery of the fullscreenerror event to the error handler: let elem = document.querySelector("video"); elem.onfullscreenerror = function(event) { displayWarning("unable to switch into full-screen mode."); }; //....
Event - Web APIs
(for example, a webpage with an advertising module and a statistics module both monitoring video-watching.) when there are many nested elements, each with its own handler(s), event processing can become very complicated, especially where a parent element receives the very same event as its child elements because "spatially" they overlap so the event technically occurs in both, and the processing order of such events depends on the event bubbling and capture settings of each handler trigge...
Using files from web applications - Web APIs
here is how to preview an uploaded video: const video = document.getElementById('video'); const obj_url = URL.createObjectURL(blob); video.src = obj_url; video.play(); URL.revokeObjectURL(obj_url); specifications: HTML Living Standard, the definition of 'file upload state' in that specification.
File.type - Web APIs
moreover, file.type is generally reliable only for common file types like images, HTML documents, audio and video.
Using the Gamepad API - Web APIs
technologies like <canvas>, WebGL, <audio>, and <video>, along with JavaScript implementations, have matured to the point where they can now support many tasks previously requiring native code.
HTMLCanvasElement.captureStream() - Web APIs
the HTMLCanvasElement captureStream() method returns a MediaStream which includes a CanvasCaptureMediaStreamTrack containing a real-time video capture of the canvas's contents.
HTMLCanvasElement - Web APIs
HTMLCanvasElement.captureStream() returns a CanvasCaptureMediaStream that is a real-time video capture of the surface of the canvas.
HTMLLinkElement.as - Web APIs
the as property of the HTMLLinkElement interface returns a DOMString representing the type of content being loaded by the HTML link, one of "script", "style", "image", "video", "audio", "track", "font", "fetch".
HTMLMediaElement: abort event - Web APIs
bubbles: no; cancelable: no; interface: Event; event handler property: onabort. examples: const video = document.querySelector('video'); const videoSrc = 'https://path/to/video.webm'; video.addEventListener('abort', () => { console.log(`abort loading: ${videoSrc}`); }); const source = document.createElement('source'); source.setAttribute('src', videoSrc); source.setAttribute('type', 'video/webm'); video.appendChild(source); specifications: HTML Living Standard (living standard), HTML5 (recommendation) ...
HTMLMediaElement.buffered - Web APIs
example: var obj = document.createElement('video'); console.log(obj.buffered); // TimeRanges { length: 0 } specifications: HTML Living Standard, the definition of 'HTMLMediaElement.buffered' in that specification.
HTMLMediaElement: canplay event - Web APIs
using addEventListener(): const video = document.querySelector('video'); video.addEventListener('canplay', (event) => { console.log('video can start, but not sure it will play through.'); }); using the oncanplay event handler property: const video = document.querySelector('video'); video.oncanplay = (event) => { console.log('video can start, but not sure it will play through.'); }; specifications: specification s...
HTMLMediaElement.currentSrc - Web APIs
example: var obj = document.createElement('video'); console.log(obj.currentSrc); // "" specifications: HTML Living Standard, the definition of 'HTMLMediaElement.currentSrc' in that specification.
HTMLMediaElement.currentTime - Web APIs
example: var video = document.createElement('video'); console.log(video.currentTime); usage notes, reduced time precision: to offer protection against timing attacks and fingerprinting, browsers may round or otherwise adjust the value returned by currentTime.
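The precision reduction mentioned above can be pictured as snapping the time to a fixed grid. The sketch below is purely illustrative; the 2 ms step is an assumption for demonstration, not a value any browser documents for currentTime:

```javascript
// Illustrative only: quantize a time value (in seconds) to a fixed step,
// roughly the way a browser might reduce timer precision. The 2 ms default
// step is a made-up example value, not a documented browser setting.
function reducePrecision(seconds, stepMs = 2) {
  const step = stepMs / 1000;
  return Math.round(seconds / step) * step;
}

console.log(reducePrecision(12.3456789)); // ≈ 12.346
```

The practical consequence is that code comparing currentTime values should tolerate small jumps rather than expect microsecond continuity.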
HTMLMediaElement.duration - Web APIs
examples: var obj = document.createElement('video'); console.log(obj.duration); // NaN specifications: HTML Living Standard, the definition of 'HTMLMediaElement.duration' in that specification.
HTMLMediaElement: durationchange event - Web APIs
using addEventListener(): const video = document.querySelector('video'); video.addEventListener('durationchange', (event) => { console.log('not sure why, but the duration of the video has changed.'); }); using the ondurationchange event handler property: const video = document.querySelector('video'); video.ondurationchange = (event) => { console.log('not sure why, but the duration of the video has changed.'); }; specificatio...
HTMLMediaElement.ended - Web APIs
example: var obj = document.createElement('video'); console.log(obj.ended); // false specifications: HTML Living Standard, the definition of 'HTMLMediaElement.ended' in that specification.
HTMLMediaElement: error event - Web APIs
bubbles: no; cancelable: no; interface: Event; event handler property: onerror. examples: const video = document.querySelector('video'); const videoSrc = 'https://path/to/video.webm'; video.addEventListener('error', () => { console.error(`error loading: ${videoSrc}`); }); video.setAttribute('src', videoSrc); specifications: HTML Living Standard (living standard), HTML5 (recommendation) ...
HTMLMediaElement: playing event - Web APIs
using addEventListener(): const video = document.querySelector('video'); video.addEventListener('playing', (event) => { console.log('video is no longer paused'); }); using the onplaying event handler property: const video = document.querySelector('video'); video.onplaying = (event) => { console.log('video is no longer paused.'); }; specifications: HTML Living Standard, the definiti...
HTMLMediaElement: ratechange event - Web APIs
using addEventListener(): const video = document.querySelector('video'); video.addEventListener('ratechange', (event) => { console.log('the playback rate changed.'); }); using the onratechange event handler property: const video = document.querySelector('video'); video.onratechange = (event) => { console.log('the playback rate changed.'); }; specifications: HTML Living Standard, the de...
HTMLMediaElement: seeked event - Web APIs
using addEventListener(): const video = document.querySelector('video'); video.addEventListener('seeked', (event) => { console.log('video found the playback position it was looking for.'); }); using the onseeked event handler property: const video = document.querySelector('video'); video.onseeked = (event) => { console.log('video found the playback position it was looking for.'); }; specifications: specification ...
HTMLMediaElement: seeking event - Web APIs
using addEventListener(): const video = document.querySelector('video'); video.addEventListener('seeking', (event) => { console.log('video is seeking a new position.'); }); using the onseeking event handler property: const video = document.querySelector('video'); video.onseeking = (event) => { console.log('video is seeking a new position.'); }; specifications: HTML Living Standard, the...
HTMLMediaElement.src - Web APIs
example: var obj = document.createElement('video'); console.log(obj.src); // "" specifications: HTML Living Standard, the definition of 'HTMLMediaElement.src' in that specification.
HTMLMediaElement: stalled event - Web APIs
using addEventListener(): const video = document.querySelector('video'); video.addEventListener('stalled', (event) => { console.log('failed to fetch data, but trying.'); }); using the onstalled event handler property: const video = document.querySelector('video'); video.onstalled = (event) => { console.log('failed to fetch data, but trying.'); }; specifications: HTML Living Standard, t...
HTMLMediaElement: suspend event - Web APIs
using addEventListener(): const video = document.querySelector('video'); video.addEventListener('suspend', (event) => { console.log('data loading has been suspended.'); }); using the onsuspend event handler property: const video = document.querySelector('video'); video.onsuspend = (event) => { console.log('data loading has been suspended.'); }; specifications: HTML Living Standard, the...
HTMLMediaElement.volume - Web APIs
syntax: var volume = video.volume; // 1 value: a double; values must fall between 0 and 1, where 0 is effectively muted and 1 is the loudest possible value.
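Because assigning a value outside [0, 1] to HTMLMediaElement.volume throws an IndexSizeError, a small clamping helper like the sketch below (my own, not from the source page) is a common safeguard before assignment:

```javascript
// Clamp a requested volume into the valid [0, 1] range before assigning
// it to HTMLMediaElement.volume, which rejects out-of-range values.
function clampVolume(value) {
  return Math.min(1, Math.max(0, value));
}

console.log(clampVolume(1.5)); // 1
console.log(clampVolume(-0.2)); // 0
console.log(clampVolume(0.4)); // 0.4
```

Usage would be `video.volume = clampVolume(requested);` for a requested value from, say, a slider.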
HTMLMediaElement: volumechange event - Web APIs
using addEventListener(): const video = document.querySelector('video'); video.addEventListener('volumechange', (event) => { console.log('the volume changed.'); }); using the onvolumechange event handler property: const video = document.querySelector('video'); video.onvolumechange = (event) => { console.log('the volume changed.'); }; specifications: HTML Living Standard, the defini...
HTMLMediaElement: waiting event - Web APIs
using addEventListener(): const video = document.querySelector('video'); video.addEventListener('waiting', (event) => { console.log('video is waiting for more data.'); }); using the onwaiting event handler property: const video = document.querySelector('video'); video.onwaiting = (event) => { console.log('video is waiting for more data.'); }; specifications: HTML Living Standard, t...
HTMLScriptElement - Web APIs
JavaScript files should be served with the application/javascript MIME type, but browsers are lenient and block them only if the script is served with an image type (image/*), video type (video/*), audio type (audio/*), or text/csv.
HTMLSourceElement - Web APIs
the HTMLSourceElement.src property has a meaning only when the associated <source> element is nested in a media element that is a <video> or an <audio> element.
HTMLTrackElement: cuechange event - Web APIs
if the track is associated with a media element, using the <track> element as a child of the <audio> or <video> element, the cuechange event is also sent to the HTMLTrackElement.
HTMLTrackElement - Web APIs
this element can be used as a child of either <audio> or <video> to specify a text track containing information such as closed captions or subtitles.
ImageCapture.getPhotoCapabilities() - Web APIs
const input = document.querySelector('input[type="range"]'); var imageCapture; navigator.mediaDevices.getUserMedia({video: true}) .then(mediaStream => { document.querySelector('video').srcObject = mediaStream; const track = mediaStream.getVideoTracks()[0]; imageCapture = new ImageCapture(track); return imageCapture.getPhotoCapabilities(); }) .then(photoCapabilities => { const settings = imageCapture.track.getSettings(); input.min = photoCapabilities.imageWidth.min; input.max = photoCapabilities.imag...
ImageCapture.getPhotoSettings() - Web APIs
const input = document.querySelector('input[type="range"]'); var imageCapture; navigator.mediaDevices.getUserMedia({video: true}) .then(mediaStream => { document.querySelector('video').srcObject = mediaStream; const track = mediaStream.getVideoTracks()[0]; imageCapture = new ImageCapture(track); return imageCapture.getPhotoCapabilities(); }) .then(photoCapabilities => { const settings = imageCapture.track.getSettings(); input.min = photoCapabilities.imageWidth.min; input.max = photoCapabilities.imag...
ImageCapture.grabFrame() - Web APIs
the grabFrame() method of the ImageCapture interface takes a snapshot of the live video in a MediaStreamTrack and returns a promise that resolves with an ImageBitmap containing the snapshot.
ImageCapture.takePhoto() - Web APIs
the takePhoto() method of the ImageCapture interface takes a single exposure using the video capture device sourcing a MediaStreamTrack and returns a promise that resolves with a Blob containing the data.
MSGraphicsTrust - Web APIs
the MSGraphicsTrust() constructor returns an object that provides properties with information on protected video playback.
MediaDeviceInfo.kind - Web APIs
the kind read-only property of the MediaDeviceInfo interface returns an enumerated value that is either "videoinput", "audioinput" or "audiooutput".
MediaDevices.enumerateDevices() - Web APIs
navigator.mediaDevices.enumerateDevices() .then(function(devices) { devices.forEach(function(device) { console.log(device.kind + ": " + device.label + " id = " + device.deviceId); }); }) .catch(function(err) { console.log(err.name + ": " + err.message); }); this might produce: videoinput: id = cso9c0ypaf274oucpua53cne0yhlir2yxci+sqfbzz8= audioinput: id = rkxxbyjnabbadgqnnzqlvldmxls0yketycibg+xxnvm= audioinput: id = r2/xw1xupiyzunfv1lgrkoma5wtovckwfz368xcndm0= or if one or more MediaStreams are active or persistent permissions are granted: videoinput: facetime hd camera (built-in) id=cso9c0ypaf274oucpua53cne0yhlir2yxci+sqfbzz8= audioinput: default (built-in microphone) id=...
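Device lists like the one above are often bucketed by their `kind` field before building a picker UI. The grouping helper below is a sketch of mine; plain objects stand in for MediaDeviceInfo entries so the logic runs anywhere:

```javascript
// Sketch: group device-info records by their `kind` field
// ("videoinput", "audioinput", "audiooutput").
function groupByKind(devices) {
  const groups = {};
  for (const device of devices) {
    (groups[device.kind] ??= []).push(device);
  }
  return groups;
}

// Plain objects stand in for real MediaDeviceInfo entries:
const devices = [
  { kind: "videoinput", label: "FaceTime HD Camera" },
  { kind: "audioinput", label: "Built-in Microphone" },
  { kind: "audioinput", label: "USB Headset" },
];
const groups = groupByKind(devices);
console.log(groups.audioinput.length); // 2
```

In a browser, the array would come from `await navigator.mediaDevices.enumerateDevices()` instead.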
MediaError - Web APIs
the MediaError interface represents an error which occurred while handling media in an HTML media element based on HTMLMediaElement, such as <audio> or <video>.
MediaImage - Web APIs
its contents can be displayed by the user agent in appropriate contexts, like a player interface, to show the currently playing video or audio track.
MediaRecorder.isTypeSupported - Web APIs
example: var types = ["video/webm", "audio/webm", "video/webm;codecs=vp8", "video/webm;codecs=daala", "video/webm;codecs=h264", "audio/webm;codecs=opus", "video/mpeg"]; for (var i in types) { console.log("is " + types[i] + " supported?
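A typical use of such a candidate list is to pick the first type the feature test accepts. The helper below is a sketch of mine; in a browser you would pass `MediaRecorder.isTypeSupported` as the predicate, while here a stub predicate keeps the logic testable anywhere:

```javascript
// Sketch: return the first container/codec string accepted by a
// feature-test predicate, or null if none qualifies.
function pickSupportedType(candidates, isSupported) {
  return candidates.find(isSupported) ?? null;
}

// Stub predicate standing in for MediaRecorder.isTypeSupported:
const candidates = ["video/webm;codecs=vp9", "video/webm;codecs=vp8", "video/mp4"];
const stubIsSupported = (type) => type === "video/webm;codecs=vp8";
console.log(pickSupportedType(candidates, stubIsSupported)); // "video/webm;codecs=vp8"
```

The chosen string would then be passed as `{ mimeType }` when constructing a MediaRecorder.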
MediaSettingsRange - Web APIs
const input = document.querySelector('input[type="range"]'); var imageCapture; navigator.mediaDevices.getUserMedia({video: true}) .then(mediaStream => { document.querySelector('video').srcObject = mediaStream; const track = mediaStream.getVideoTracks()[0]; imageCapture = new ImageCapture(track); return imageCapture.getPhotoCapabilities(); }) .then(photoCapabilities => { const settings = imageCapture.track.getSettings(); input.min = photoCapabilities.imageWidth.min; input.max = photoCapabilities.imag...
MediaSource.MediaSource() - Web APIs
example: the following snippet is taken from a simple example written by Nick Desaulniers (view the full demo live, or download the source for further investigation.) var video = document.querySelector('video'); var assetURL = 'frag_bunny.mp4'; // need to be specific for blink regarding codecs // ./mp4info frag_bunny.mp4 | grep codec var mimeCodec = 'video/mp4; codecs="avc1.42e01e, mp4a.40.2"'; if ('MediaSource' in window && MediaSource.isTypeSupported(mimeCodec)) { var mediaSource = new MediaSource(); //console.log(mediaSource.readyState); // closed video.src = URL.create...
MediaSource.duration - Web APIs
the full demo live, or download the source for further investigation.) function sourceOpen (_) { //console.log(this.readyState); // open var mediaSource = this; var sourceBuffer = mediaSource.addSourceBuffer(mimeCodec); fetchAB(assetURL, function (buf) { sourceBuffer.addEventListener('updateend', function (_) { mediaSource.endOfStream(); mediaSource.duration = 120; video.play(); //console.log(mediaSource.readyState); // ended }); sourceBuffer.appendBuffer(buf); }); }; ...
MediaSource.sourceBuffers - Web APIs
on.) function sourceOpen (_) { //console.log(this.readyState); // open var mediaSource = this; var sourceBuffer = mediaSource.addSourceBuffer(mimeCodec); fetchAB(assetURL, function (buf) { sourceBuffer.addEventListener('updateend', function (_) { mediaSource.endOfStream(); console.log(mediaSource.sourceBuffers); // will contain the source buffer that was added above video.play(); //console.log(mediaSource.readyState); // ended }); sourceBuffer.appendBuffer(buf); }); }; ...
active - Web APIs
var promise = navigator.mediaDevices.getUserMedia({ audio: true, video: true }); promise.then(function(stream) { var startBtn = document.querySelector('#startbtn'); startBtn.disabled = stream.active; }); specifications: Media Capture and Streams, the definition of 'active' in that specification.
MediaStream.getTrackById() - Web APIs
example: this example activates a commentary track on a video by ducking the audio level of the main audio track to 50%, then enabling the commentary track.
MediaStream.getTracks() - Web APIs
example: navigator.mediaDevices.getUserMedia({audio: false, video: true}) .then(mediaStream => { document.querySelector('video').srcObject = mediaStream; // stop the stream after 5 seconds setTimeout(() => { const tracks = mediaStream.getTracks() tracks[0].stop() }, 5000) }) specifications: Media Capture and Streams, the definition of 'getTracks()' in that specification.
MediaStream.id - Web APIs
syntax: var id = mediaStream.id; example: var p = navigator.mediaDevices.getUserMedia({ audio: true, video: true }); p.then(function(stream) { console.log(stream.id); }) specifications: Media Capture and Streams, the definition of 'MediaStream.id' in that specification.
MediaStream.onaddtrack - Web APIs
example: this example adds a listener which, when a new track is added to the stream, appends a new item to a list of tracks; the new item shows the track's kind ("audio" or "video") and label.
MediaStreamAudioSourceNode() - Web APIs
// define variables var audioCtx = new (window.AudioContext || window.webkitAudioContext)(); // getUserMedia block - grab stream // put it into a MediaStreamAudioSourceNode if (navigator.mediaDevices.getUserMedia) { navigator.mediaDevices.getUserMedia ( // constraints: audio and video for this app { audio: true, video: false }).then(function(stream) { var options = { mediaStream : stream } var source = new MediaStreamAudioSourceNode(audioCtx, options); source.connect(audioCtx.destination); }).catch(function(err) { console.log('the following gUM error occurred: ' + err); }); } else { co...
MediaStreamTrack.muted - Web APIs
when a track is disabled by setting enabled to false, it generates only empty frames (audio frames in which every sample is 0, or video frames in which every pixel is black).
MediaStreamTrackAudioSourceNode() - Web APIs
let audioCtx = new (window.AudioContext || window.webkitAudioContext)(); if (navigator.mediaDevices.getUserMedia) { navigator.mediaDevices.getUserMedia ( { audio: true, video: false }).then(function(stream) { let options = { mediaStreamTrack: stream.getAudioTracks()[0] } let source = new MediaStreamTrackAudioSourceNode(audioCtx, options); source.connect(audioCtx.destination); }).catch(function(err) { console.log('the following gUM error occurred: ' + err); }); } else { console.log('new getUserMedia not supported on ...
MediaStreamTrackAudioSourceOptions.mediaStreamTrack - Web APIs
let audioCtx = new (window.AudioContext || window.webkitAudioContext)(); if (navigator.mediaDevices.getUserMedia) { navigator.mediaDevices.getUserMedia ( { audio: true, video: false }).then(function(stream) { let options = { mediaStreamTrack: stream.getAudioTracks()[0] } let source = new MediaStreamTrackAudioSourceNode(audioCtx, options); source.connect(audioCtx.destination); }).catch(function(err) { console.log('the following gUM error occurred: ' + err); }); } else { console.log('new getUserMedia not supported on ...
MediaTrackConstraints.frameRate - Web APIs
syntax: var constraintsObject = { frameRate: constraint }; constraintsObject.frameRate = constraint; value: a ConstrainDouble describing the acceptable or required value(s) for a video track's frame rate, in frames per second.
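A ConstrainDouble can be a bare number or an object with `min`, `max`, and `exact` members. The checker below is a simplified reading of those matching rules, written by me for illustration only (it ignores `ideal`, which expresses preference rather than a hard requirement):

```javascript
// Sketch: test a numeric value against a simplified ConstrainDouble,
// i.e. either a bare number or an object with min/max/exact members.
function matchesConstraint(value, constraint) {
  if (typeof constraint === "number") return value === constraint;
  if (constraint.exact !== undefined && value !== constraint.exact) return false;
  if (constraint.min !== undefined && value < constraint.min) return false;
  if (constraint.max !== undefined && value > constraint.max) return false;
  return true;
}

console.log(matchesConstraint(30, { min: 24, max: 60 })); // true
console.log(matchesConstraint(15, { min: 24 })); // false
```

In a real call the browser performs this matching itself; you would simply pass `{ video: { frameRate: { min: 24, ideal: 30 } } }` to getUserMedia().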
MediaTrackSettings.groupId - Web APIs
however, it can be used to ensure that audio input and output are both being performed on the same headset, for example, or to ensure that the built-in camera and microphone on a phone are being used for video conferencing purposes.
MediaTrackSettings.height - Web APIs
syntax: var height = mediaTrackSettings.height; value: an integer value indicating the height, in pixels, of the video track as currently configured.
MediaTrackSettings.logicalSurface - Web APIs
syntax: isLogicalSurface = mediaTrackSettings.logicalSurface; value: a boolean value which is true if the video track in the stream of captured video is taken from a logical display surface.
MediaTrackSettings.width - Web APIs
syntax: var width = mediaTrackSettings.width; value: an integer value indicating the width, in pixels, of the video track as currently configured.
MediaTrackSupportedConstraints - Web APIs
properties specific to shared screen tracks: for tracks containing video sources from the user's screen contents, the following additional properties may be included in addition to those available for video tracks.
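Before applying constraints it is common to drop keys the browser does not recognize. The filter below is a sketch of mine: in a browser the `supported` map would come from `navigator.mediaDevices.getSupportedConstraints()`, while here a stub map keeps it runnable anywhere:

```javascript
// Sketch: keep only the constraint keys a supported-constraints map
// marks as true, mirroring how getSupportedConstraints() is used.
function filterConstraints(requested, supported) {
  const result = {};
  for (const [key, value] of Object.entries(requested)) {
    if (supported[key]) result[key] = value;
  }
  return result;
}

// Stub standing in for navigator.mediaDevices.getSupportedConstraints():
const supported = { width: true, height: true, frameRate: true };
const requested = { width: 1280, frameRate: 30, displaySurface: "monitor" };
console.log(filterConstraints(requested, supported)); // { width: 1280, frameRate: 30 }
```

Unsupported keys are simply ignored by most browsers anyway, but filtering makes the intent explicit and simplifies feature detection.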
MimeTypeArray - Web APIs
var mimeTypes = navigator.mimeTypes; var flashPlugin = mimeTypes['video/x-flv']; if (typeof flashPlugin === "undefined") { var vid = document.createElement('video'); // use vid.canPlayType() to test for a supported mime type.
msRealTime - Web APIs
msRealTime should not be used in non-real-time or non-communication scenarios, such as audio and/or video playback, as this can affect the playback startup latency of audio and video playback.
msSetMediaProtectionManager - Web APIs
the MediaProtectionManager class can be passed as an input to a media playback API or to the mediaProtectionManager property inside the tag's video or audio.
Navigator - Web APIs
navigator.getUserMedia(): after having prompted the user for permission, returns the audio or video stream associated to a camera or microphone on the local computer.
RTCSessionDescription.sdp - Web APIs
syntax: var value = sessionDescription.sdp; sessionDescription.sdp = value; value: the value is a DOMString containing an SDP message like this one: v=0 o=alice 2890844526 2890844526 IN IP4 host.anywhere.com s= c=IN IP4 host.anywhere.com t=0 0 m=audio 49170 RTP/AVP 0 a=rtpmap:0 PCMU/8000 m=video 51372 RTP/AVP 31 a=rtpmap:31 H261/90000 m=video 53000 RTP/AVP 32 a=rtpmap:32 MPV/90000 example: // the remote description has been set previously on pc, an RTCPeerConnection alert(pc.remoteDescription.sdp); specifications: WebRTC 1.0: Real-time Communication Between Browsers, the definition of 'RTCSessionDescription.sdp' in that specif...
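An SDP message like the one above is line-oriented, with each media section introduced by an "m=" line (RFC 4566). A minimal sketch of pulling out the media types, useful for inspecting a description's contents:

```javascript
// Sketch: list the media types of an SDP string's media sections by
// scanning for "m=" lines; each carries "<media> <port> <proto> <fmts>".
function listMediaSections(sdp) {
  return sdp
    .split(/\r?\n/)
    .filter((line) => line.startsWith("m="))
    .map((line) => line.slice(2).split(" ")[0]);
}

const sdp = [
  "v=0",
  "m=audio 49170 RTP/AVP 0",
  "m=video 51372 RTP/AVP 31",
].join("\r\n");
console.log(listMediaSections(sdp)); // ["audio", "video"]
```

In WebRTC code you would pass `pc.remoteDescription.sdp` in; mutating SDP by hand is generally discouraged in favor of the transceiver APIs.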
RTCTrackEventInit.receiver - Web APIs
syntax: var trackEventInit = { receiver: rtpReceiver, track: mediaStreamTrack, streams: [videoStream], transceiver: rtpTransceiver }; var rtpReceiver = trackEventInit.receiver; value: the RTCRtpReceiver being used to receive the media for the track associated with the RTCTrackEvent.
RTCTrackEventInit.streams - Web APIs
syntax: var trackEventInit = { receiver: rtpReceiver, track: mediaStreamTrack, streams: [videoStream], transceiver: rtpTransceiver }; var streamList = trackEventInit.streams; value: an array of MediaStream objects, one for each stream which makes up the track.
RTCTrackEventInit.track - Web APIs
syntax: var trackEventInit = { receiver: rtpReceiver, track: mediaStreamTrack, streams: [videoStream], transceiver: rtpTransceiver }; var track = trackEventInit.track; value: a MediaStreamTrack representing the track with which the event is associated.
RTCTrackEventInit.transceiver - Web APIs
syntax: var trackEventInit = { receiver: rtpReceiver, track: mediaStreamTrack, streams: [videoStream], transceiver: rtpTransceiver }; var rtpTransceiver = trackEventInit.transceiver; value: the RTCRtpTransceiver which pairs the receiver with a sender and other properties which establish a single bidirectional SRTP stream for use by the track associated with the RTCTrackEvent.
ReadableStream.pipeThrough() - Web APIs
for example, a TextDecoder has bytes written to it and strings read from it, while a video decoder has encoded bytes written to it and uncompressed video frames read from it.
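As a sketch of that write-one-form/read-another pattern, pipeThrough() can chain a ReadableStream through a TransformStream (available in modern browsers and in Node 18+); the uppercasing transform below stands in for a real decoder, and the helper names are invented for the example:

```javascript
// Build a fresh source stream from an array of string chunks.
function makeSource(chunks) {
  return new ReadableStream({
    start(controller) {
      for (const c of chunks) controller.enqueue(c);
      controller.close();
    },
  });
}

// A trivial transform standing in for a decoder: uppercases each chunk.
function makeUpper() {
  return new TransformStream({
    transform(chunk, controller) {
      controller.enqueue(chunk.toUpperCase());
    },
  });
}

// Drain a stream into a single string using its reader.
async function collect(stream) {
  const reader = stream.getReader();
  let out = "";
  for (;;) {
    const { value, done } = await reader.read();
    if (done) return out;
    out += value;
  }
}

collect(makeSource(["hello ", "stream"]).pipeThrough(makeUpper()))
  .then((s) => console.log(s)); // "HELLO STREAM"
```

Each stream is single-use, which is why the sketch uses factory functions rather than shared stream instances.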
Request.context - Web APIs
the context of a request is only relevant in the ServiceWorker API; a service worker can make decisions based on whether the url is for an image, or an embeddable object such as a <video>, <iframe>, etc.
Request.mode - Web APIs
however, for requests created other than by the Request() constructor, no-cors is typically used as the mode; for example, for embedded resources where the request is initiated from markup, unless the crossorigin attribute is present, the request is in most cases made using the no-cors mode; that is, for the <link> or <script> elements (except when used with modules), or <img>, <audio>, <video>, <object>, <embed>, or <iframe> elements.
ShadowRoot.mode - Web APIs
when the mode of a shadow root is "closed", the shadow root's implementation internals are inaccessible and unchangeable from JavaScript, in the same way the implementation internals of, for example, the <video> element are inaccessible and unchangeable from JavaScript.
TextTrack: cuechange event - Web APIs
if the track is associated with a media element, using the <track> element as a child of the <audio> or <video> element, the cuechange event is also sent to the HTMLTrackElement.
TextTrackList: addtrack event - Web APIs
bubbles no cancelable no interface TrackEvent event handler property onaddtrack examples using addEventListener(): const mediaElement = document.querySelector('video, audio'); mediaElement.textTracks.addEventListener('addtrack', (event) => { console.log(`text track: ${event.track.label} added`); }); using the onaddtrack event handler property: const mediaElement = document.querySelector('video, audio'); mediaElement.textTracks.onaddtrack = (event) => { console.log(`text track: ${event.track.label} added`); }; specifications specification status html living standard, the definition of 'addtrack' in that specification.
TextTrackList: change event - Web APIs
bubbles no cancelable no interface Event event handler property onchange examples using addEventListener(): const mediaElement = document.querySelectorAll('video, audio')[0]; mediaElement.textTracks.addEventListener('change', (event) => { console.log(`'${event.type}' event fired`); }); using the onchange event handler property: const mediaElement = document.querySelector('video, audio'); mediaElement.textTracks.onchange = (event) => { console.log(`'${event.type}' event fired`); }; specifications specification status html living standard, the definition of 'change' in that specification.
TextTrackList.length - Web APIs
var mediaElem = document.querySelector("video, audio"); var numTextTracks = 0; if (mediaElem.textTracks) { numTextTracks = mediaElem.textTracks.length; } note that this sample checks to be sure HTMLMediaElement.textTracks is defined, to avoid failing on browsers without support for TextTrack.
TextTrackList.onremovetrack - Web APIs
document.querySelectorAll("video, audio")[0].textTracks.onremovetrack = function(event) { myTrackCount = document.querySelectorAll("video, audio")[0].textTracks.length; }; the current number of text tracks remaining in the media element is obtained from the TextTrackList property length.
TextTrackList: removeTrack event - Web APIs
bubbles no cancelable no interface TrackEvent event handler property onremovetrack examples using addEventListener(): const mediaElement = document.querySelector('video, audio'); mediaElement.textTracks.addEventListener('removetrack', (event) => { console.log(`text track: ${event.track.label} removed`); }); using the onremovetrack event handler property: const mediaElement = document.querySelector('video, audio'); mediaElement.textTracks.onremovetrack = (event) => { console.log(`text track: ${event.track.label} removed`); }; specifications specification status html living standard, the definition of 'removetrack' in that specificati...
TimeRanges.end() - Web APIs
example given a video element with the id "myVideo": var v = document.getElementById("myVideo"); var buf = v.buffered; var numRanges = buf.length; if (buf.length == 1) { // only one range if (buf.start(0) == 0 && buf.end(0) == v.duration) { // the one range starts at the beginning and ends at // the end of the video, so the whole thing is loaded } } this example looks at the time ranges to see if the entire video has been loaded.
TimeRanges.length - Web APIs
syntax length = timeRanges.length; example given a video element with the id "myVideo": var v = document.getElementById("myVideo"); var buf = v.buffered; var numRanges = buf.length; if (buf.length == 1) { // only one range if (buf.start(0) == 0 && buf.end(0) == v.duration) { // the one range starts at the beginning and ends at // the end of the video, so the whole thing is loaded } } this example looks at the time ranges to see if the entire video has been loaded.
TimeRanges.start() - Web APIs
example given a video element with the id "myVideo": var v = document.getElementById("myVideo"); var buf = v.buffered; var numRanges = buf.length; if (buf.length == 1) { // only one range if (buf.start(0) == 0 && buf.end(0) == v.duration) { // the one range starts at the beginning and ends at // the end of the video, so the whole thing is loaded } } this example looks at the time ranges to see if the entire video has been loaded.
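The whole-video check in the example above is plain range arithmetic, so it can be sketched and exercised without a browser; the `isFullyBuffered` helper is hypothetical, and the mock object merely mimics the TimeRanges interface (length, start(i), end(i)):

```javascript
// Returns true when a TimeRanges-like object holds a single range
// spanning the entire duration: the same test as in the example above.
function isFullyBuffered(ranges, duration) {
  return (
    ranges.length === 1 &&
    ranges.start(0) === 0 &&
    ranges.end(0) === duration
  );
}

// A mock standing in for video.buffered, for illustration only:
// one range covering 0..30 seconds.
const buffered = {
  length: 1,
  start: (i) => [0][i],
  end: (i) => [30][i],
};

console.log(isFullyBuffered(buffered, 30)); // true
console.log(isFullyBuffered(buffered, 60)); // false
```

In a page you would pass `v.buffered` and `v.duration` from a real media element instead of the mock.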
TimeRanges - Web APIs
the TimeRanges interface is used to represent a set of time ranges, primarily for the purpose of tracking which portions of media have been buffered when loading it for use by the <audio> and <video> elements.
TrackDefault.type - Web APIs
audio, video, or text track.) syntax var myType = trackDefault.type; value a DOMString: one of audio, video, or text.
TrackDefault - Web APIs
audio, video, or text track.) TrackDefault.byteStreamTrackID read only returns the id of the specific track that the SourceBuffer should apply to.
URL.createObjectURL() - Web APIs
using object urls for media streams: in older versions of the media source specification, attaching a stream to a <video> element required creating an object url for the MediaStream.
VTTCue() - Web APIs
var cue = new VTTCue(2, 3, 'cool text to be displayed'); specifications specification status comment webvtt: the web video text tracks format, the definition of 'VTTCue()' in that specification.
WebGL by example - Web APIs
miscellaneous advanced examples. video textures: this example demonstrates how to use video files as textures.
WebGL tutorial - Web APIs
animating textures in webgl shows how to animate textures; in this case, by mapping an ogg video onto the faces of a rotating cube.
WebRTC coding guide - Web APIs
how do you create a web application that uses two-way video or data streams without having to do all the hard work of compressing frames, building streams, and so forth by yourself?
High-level guides - Web APIs
webrtc (web real-time communications) is a broad, multi-component system for setting up and operating complex audio, video, and data channels across networks among two or more peers on the web.
A simple RTCDataChannel sample - Web APIs
this method accepts, optionally, an object with constraints to be met for the connection to meet your needs, such as whether the connection should support audio, video, or both.
Fundamentals of WebXR - Web APIs
field of view: the term field of view (fov) is one which applies to any visual technology, from old film cameras to modern digital video cameras, including the cameras in computers and mobile devices.
Geometry and reference spaces in WebXR - Web APIs
while you can use webxr for everything from augmenting the world with annotations to 360° video playback to scientific simulations to virtual reality training systems or anything else you can imagine, let's take a 3d video game as an example of a typical webxr application.
Lighting a WebXR setting - Web APIs
another scenario in which lighting estimation can be used to obtain information about the user without permission: if the light sensor is close enough to the user's display to detect lighting changes caused by the contents of the display, an algorithm could be used to determine whether or not the user is watching a particular video, or even to potentially identify which of a number of videos the user is watching.
self.createImageBitmap() - Web APIs
syntax const imageBitmapPromise = createImageBitmap(image[, options]); const imageBitmapPromise = createImageBitmap(image, sx, sy, sw, sh[, options]); parameters image an image source, which can be an <img>, svg <image>, <video>, <canvas>, HTMLImageElement, SVGImageElement, HTMLVideoElement, HTMLCanvasElement, Blob, ImageData, ImageBitmap, or OffscreenCanvas object.
XRSession: select event - Web APIs
examples of common kinds of primary action are users pressing triggers or buttons, tapping a touchpad, speaking a command, or performing a recognizable gesture when using a video tracking system or handheld controller with an accelerometer.
XRSession: selectend event - Web APIs
primary actions include things like users pressing triggers or buttons, tapping a touchpad, speaking a command, or performing a recognizable gesture when using a video tracking system or handheld controller with an accelerometer.
XRSession: selectstart event - Web APIs
primary actions include things like users pressing triggers or buttons, tapping a touchpad, speaking a command, or performing a recognizable gesture when using a video tracking system or handheld controller with an accelerometer.
XRSession: squeeze event - Web APIs
examples of common kinds of primary action are users pressing triggers or buttons, tapping a touchpad, speaking a command, or performing a recognizable gesture when using a video tracking system or handheld controller with an accelerometer.
XRSession: squeezeend event - Web APIs
primary squeeze actions include things like users pressing triggers or buttons, tapping a touchpad, speaking a command, or performing a recognizable gesture when using a video tracking system or handheld controller with an accelerometer.
ARIA: figure role - Accessibility
description any content that should be grouped together and consumed as a figure (which could include images, video, audio, code snippets, or other content) can be identified as a figure using role="figure".
ARIA: img role - Accessibility
<div role="img" aria-label="description of the overall image"> <img src="graphic1.png" alt=""> <img src="graphic2.png"> </div> description any set of content that should be consumed as a single image (which could include images, video, audio, code snippets, emojis, or other content) can be identified using role="img".
Understandable - Accessibility
see video and audio content, and pronunciation guide for english dictionary. note: also see the wcag description for guideline 3.1 readable: make text content readable and understandable.
Accessibility
accessible multimedia. another category of content that can create accessibility problems is multimedia: video, audio, and image content need to be given proper textual alternatives so they can be understood by assistive technologies and their users.
::cue-region - CSS: Cascading Style Sheets
t-stretch font-style font-variant font-weight line-height opacity outline outline-color outline-style outline-width ruby-position text-combine-upright text-decoration text-decoration-color text-decoration-line text-decoration-style text-decoration-thickness text-shadow visibility white-space specifications specification status comment webvtt: the web video text tracks format, the definition of 'the ::cue-region pseudo-element' in that specification.
::cue - CSS: Cascading Style Sheets
::cue { color: #fff; background-color: rgba(0, 0, 0, 0.6); } specifications specification status comment webvtt: the web video text tracks format, the definition of '::cue' in that specification.
Subgrid - CSS: Cascading Style Sheets
see also on the mozilla developer youtube channel, see the videos "laying out forms using subgrid", "don't wait to use subgrid for better card layouts", and "hello subgrid!"
CSS Grid Layout - CSS: Cascading Style Sheets
layout using named grid lines auto-placement in css grid layout box alignment in css grid layout css grid, logical values and writing modes css grid layout and accessibility css grid and progressive enhancement realising common layouts using css grid subgrid external resources css grid and ie11 (polyfill) examples from jen simmons grid by example - a collection of usage examples and video tutorials codrops grid reference firefox devtools css grid inspector css grid playground grid garden - a game for learning css grid specifications specification status comment css grid layout module level 2 working draft added subgrids.
Replaced elements - CSS: Cascading Style Sheets
replaced elements typical replaced elements are: <iframe> <video> <embed> <img> some elements are treated as replaced elements only in specific cases: <option> <audio> <canvas> <object> <applet> the html spec also says that an <input> element can be replaced, because <input> elements of the "image" type are replaced elements similar to <img>.
aspect-ratio - CSS: Cascading Style Sheets
in firefox, the internal stylesheet rule looks like this: img, input[type="image"], video, embed, iframe, marquee, object, table { aspect-ratio: attr(width) / attr(height); } specifications specification status comment css box sizing module level 4, the definition of 'aspect-ratio' in that specification.
resize - CSS: Cascading Style Sheets
resize does not apply to the following: inline elements, block elements for which the overflow property is set to visible. formal definition initial value none applies to elements with overflow other than visible, and optionally replaced elements representing images or videos, and iframes inherited no computed value as specified animation type discrete formal syntax none | both | horizontal | vertical | block | inline examples disabling resizability of textareas in many browsers, <textarea> elements are resizable by default.
Overview of events and handlers - Developer guides
due to resizing the browser, the window.screen object, such as due to changes in device orientation, the document object, including the loading, modification, user interaction, and unloading of the page, the objects in the dom (document object model) tree including user interactions or modifications, the XMLHttpRequest objects used for network requests, and the media objects such as audio and video, when the media stream players change state.
Event developer guide - Developer guides
two common styles are: the generalized addEventListener() and a set of specific on-event handlers. media events: various events are sent when handling media that are embedded in html documents using the <audio> and <video> elements; this section lists them and provides some helpful information about using them. mouse gesture events: gecko 1.9.1 added support for several mozilla-specific dom events used to handle mouse gestures.
A hybrid approach - Developer guides
we used some elements of responsive web design to give the site a mobile layout, along with user-agent detection to provide mobile-friendly videos and to re-order the demos if the user is on a phone.
HTML attribute: crossorigin - HTML: Hypertext Markup Language
the crossorigin attribute, valid on the <audio>, <img>, <link>, <script>, and <video> elements, provides support for cors, defining how the element handles cross-origin requests, thereby enabling the configuration of the cors requests for the element's fetched data.
<link>: The External Resource Link element - HTML: Hypertext Markup Language
font: css @font-face image: <img> and <picture> elements with srcset or imageset attributes, svg <image> elements, css *-image rules object: <object> elements script: <script> elements, worker importScripts style: <link rel=stylesheet> elements, css @import track: <track> elements video: <video> elements worker: worker, SharedWorker crossorigin this enumerated attribute indicates whether cors must be used when fetching the resource.
Inline elements - HTML: Hypertext Markup Language
<bdi> <bdo> <big> <br> <button> <canvas> <cite> <code> <data> <datalist> <del> <dfn> <em> <embed> <i> <iframe> <img> <input> <ins> <kbd> <label> <map> <mark> <meter> <noscript> <object> <output> <picture> <progress> <q> <ruby> <s> <samp> <script> <select> <slot> <small> <span> <strong> <sub> <sup> <svg> <template> <textarea> <time> <u> <tt> <var> <video> <wbr> see also block-level elements html element reference display content categories block and inline layout in normal flow ...
Microdata - HTML: Hypertext Markup Language
commonly used vocabularies: creative works: CreativeWork, Book, Movie, MusicRecording, Recipe, TVSeries embedded non-text objects: AudioObject, ImageObject, VideoObject event health and medical types: notes on the health and medical types under MedicalEntity organization person place, LocalBusiness, Restaurant product, Offer, AggregateOffer review, AggregateRating action thing intangible major search engine operators like google, microsoft, and yahoo!
Evolution of HTTP - HTTP
http has evolved from an early protocol to exchange files in a semi-trusted laboratory environment, to the modern maze of the internet, now carrying images, videos in high resolution and 3d.
Identifying resources on the Web - HTTP
on an html document, for example, the browser will scroll to the point where the anchor is defined; on a video or audio document, the browser will try to go to the time the anchor represents.
Content negotiation - HTTP
the accept header is defined by the browser, or any other user-agent, and can vary according to the context, like fetching an html page or an image, a video, or a script: it is different when fetching a document entered in the address bar or an element linked via an <img>, <video> or <audio> element.
Feature Policy - HTTP
examples of what you can do with feature policy: change the default behavior of autoplay on mobile and third party videos.
Accept - HTTP
browsers set adequate values for this header depending on the context where the request is done: when fetching a css stylesheet a different value is set for the request than when fetching an image, video, or a script.
Expect - HTTP
PUT /somewhere/fun HTTP/1.1 Host: origin.example.com Content-Type: video/h264 Content-Length: 1234567890987 Expect: 100-continue the server now checks the request headers and may respond with a 100 (continue) response to instruct the client to go ahead and send the message body, or it will send a 417 (expectation failed) status if any of the expectations cannot be met.
Feature-Policy: camera - HTTP
the http feature-policy header camera directive controls whether the current document is allowed to use video input devices.
Feature-Policy: fullscreen - HTTP
it can do so by delivering the following http response header to define a feature policy: feature-policy: fullscreen 'self' then include an allow attribute on the <iframe> element: <iframe src="https://other.com/videoplayer" allow="fullscreen"></iframe> iframe attributes can selectively enable features in certain frames, and not in others, even if those frames contain documents from the same origin.
Feature-Policy: picture-in-picture - HTTP
the http feature-policy header picture-in-picture directive controls whether the current document is allowed to play a video in a picture-in-picture mode via the corresponding api.
Save-Data - HTTP
a value of on indicates explicit user opt-in into a reduced data usage mode on the client, and when communicated to origins allows them to deliver alternative content to reduce the data downloaded, such as smaller image and video resources, different markup and styling, disabled polling and automatic updates, and so on.
Sec-Fetch-Dest - HTTP
audioworklet sec-fetch-dest: document sec-fetch-dest: embed sec-fetch-dest: empty sec-fetch-dest: font sec-fetch-dest: image sec-fetch-dest: manifest sec-fetch-dest: nested-document sec-fetch-dest: object sec-fetch-dest: paintworklet sec-fetch-dest: report sec-fetch-dest: script sec-fetch-dest: serviceworker sec-fetch-dest: sharedworker sec-fetch-dest: style sec-fetch-dest: track sec-fetch-dest: video sec-fetch-dest: worker sec-fetch-dest: xslt values audio audioworklet document embed empty font image manifest object paintworklet report script serviceworker sharedworker style track video worker xslt nested-document examples todo specifications specification ...
HTTP headers - HTTP
it is a structured header whose value is a token with possible values audio, audioworklet, document, embed, empty, font, image, manifest, object, paintworklet, report, script, serviceworker, sharedworker, style, track, video, worker, xslt, and nested-document.
Indexed collections - JavaScript
however, as web applications become more and more powerful, adding features such as audio and video manipulation, access to raw data using websockets, and so forth, it has become clear that there are times when it would be helpful for javascript code to be able to quickly and easily manipulate raw binary data in typed arrays.
JavaScript typed arrays - JavaScript
however, as web applications become more and more powerful, adding features such as audio and video manipulation, access to raw data using websockets, and so forth, it has become clear that there are times when it would be helpful for javascript code to be able to quickly and easily manipulate raw binary data.
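A minimal sketch of the typed-array idea described above: one ArrayBuffer of raw bytes, viewed through two different typed-array views that share the same memory.

```javascript
// One 8-byte ArrayBuffer, two views over the same memory: a Uint8Array
// over the raw bytes and a Uint32Array seeing the same bytes as two
// 32-bit unsigned integers.
const buffer = new ArrayBuffer(8);
const bytes = new Uint8Array(buffer);
const words = new Uint32Array(buffer);

bytes[0] = 255;                          // write through the byte view
console.log(bytes[0]);                   // 255
console.log(bytes.length, words.length); // 8 2

// Because both views wrap the same buffer, the write through `bytes`
// is visible through `words` (the exact value depends on endianness).
console.log(words[0] !== 0);             // true
```

This shared-buffer model is what makes typed arrays suitable for audio/video manipulation and binary protocol work: the data is decoded once and reinterpreted freely.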
Mapping the width and height attributes of media container elements to their aspect-ratio - Web media technologies
this appears in the browser's internal ua stylesheet, similar to the following: img, input[type="image"], video, embed, iframe, marquee, object, table { aspect-ratio: attr(width) / attr(height); } this actually affects any element that acts as a container for complex or mixed visual media: <embed>, <iframe>, <marquee>, <object>, <table>, and <video>, in addition to actual images (<img> and <input type="image">).
Lazy loading - Web Performance
<img src="image.jpg" loading="lazy" alt="..." /> <iframe src="video-player.html" loading="lazy"></iframe> the load event fires when the eagerly-loaded content has all been loaded; at that time, it's entirely possible (or even likely) that there may be lazily-loaded images that are within the visual viewport that haven't yet loaded.
Progressive web app structure - Progressive web apps (PWAs)
the streams api allows developers to have direct access to data streaming from the server: if you want to perform an operation on the data (for example, adding a filter to a video), you no longer need to wait for all of it to be downloaded and converted to a blob (or whatever); you can start right away.
Mobile first - Progressive web apps (PWAs)
our html looks like this: <article> <nav> <ul> <li><a href="#">home</a></li> <li><a href="#">articles</a></li> <li><a href="#">videos</a></li> <li><a href="#">work</a></li> <li><a href="#">about</a></li> <li><a href="#">contact</a></li> </ul> </nav> <header> <a id="top" href="#bottom">jump to menu</a> <h1>my article</h1> </header> <div class="main"> <p>lorem ipsum … </p> <a id="bottom" href="#top">back to top</a> </div> </article> <button id="install-btn">install</button> ...
opacity - SVG: Scalable Vector Graphics
as a presentation attribute, it can be applied to any element but it has effect only on the following elements: <a>, <audio>, <canvas>, <circle>, <ellipse>, <foreignObject>, <g>, <iframe>, <image>, <line>, <marker>, <path>, <polygon>, <polyline>, <rect>, <svg>, <switch>, <symbol>, <text>, <textPath>, <tspan>, <use>, <unknown>, and <video> html, body, svg { height: 100%; } <svg viewBox="0 0 200 100" xmlns="http://www.w3.org/2000/svg"> <defs> <linearGradient id="gradient" x1="0%" y1="0%" x2="0" y2="100%"> <stop offset="0%" style="stop-color:skyblue;" /> <stop offset="100%" style="stop-color:seagreen;" /> </linearGradient> </defs> <rect x="0" y="0" width="100%" height="100%" fill="url(#gradient)" /> ...
systemLanguage - SVG: Scalable Vector Graphics
35 elements are using this attribute: <a>, <altGlyph>, <animate>, <animateColor>, <animateMotion>, <animateTransform>, <audio>, <canvas>, <circle>, <clipPath>, <cursor>, <defs>, <discard>, <ellipse>, <foreignObject>, <g>, <iframe>, <image>, <line>, <mask>, <path>, <pattern>, <polygon>, <polyline>, <rect>, <set>, <svg>, <switch>, <text>, <textPath>, <tref>, <tspan>, <unknown>, <use>, and <video> usage notes value <language-tags> default value none animatable no <language-tags> the value is a set of comma-separated tokens, each of which must be a language-tag value, as defined in bcp 47.
visibility - SVG: Scalable Vector Graphics
as a presentation attribute, it can be applied to any element but it has effect only on the following nineteen elements: <a>, <altGlyph>, <audio>, <canvas>, <circle>, <ellipse>, <foreignObject>, <iframe>, <image>, <line>, <path>, <polygon>, <polyline>, <rect>, <text>, <textPath>, <tref>, <tspan>, <video> html, body, svg { height: 100%; } <svg viewBox="0 0 220 120" xmlns="http://www.w3.org/2000/svg"> <rect x="10" y="10" width="200" height="100" stroke="black" stroke-width="5" fill="transparent" /> <g stroke="seagreen" stroke-width="5" fill="skyblue"> <rect x="20" y="20" width="80" height="80" visibility="visible" /> <rect x="120" y="20" width="80" height="80" visibility="...
SVG 2 support in Mozilla - SVG: Scalable Vector Graphics
non-rendered elements not included in addressable characters implementation status unknown unknown elements in text render as unpositioned spans implementation status unknown offset distances of text positioned along a transformed path measured in text elements coordinate system implementation status unknown embedded content change notes <video> implementation status unknown <audio> implementation status unknown <iframe> implementation status unknown <canvas> implementation status unknown <source> implementation status unknown <track> implementation status unknown painting change notes paint-order implemented (bug 828805) will-cha...
WebAssembly Concepts - WebAssembly
we have run into performance problems, however, when trying to use javascript for more intensive use cases like 3d games, virtual and augmented reality, computer vision, image/video editing, and a number of other domains that demand native performance (see webassembly use cases for more ideas).