Commit cb974f8

Browse files
committed
Fix internal links in Media Source Extensions
1 parent 1af9e66 commit cb974f8

Sources/WebKit/WebKit.docc/InDepth/MediaSourceExtensions.md

Lines changed: 9 additions & 9 deletions
@@ -4,7 +4,7 @@

The Media Source Extensions specification defines a set of classes which allows clients to implement their own loading, buffering, and variant switching behavior, rather than requiring the UA to handle this itself.

-Clients `fetch()` media initialization segments and media segments, typically subsets of a single [fragmented MP4 file](https://www.w3.org/TR/mse-byte-stream-format-isobmff/) or [WebM file](https://www.w3.org/TR/mse-byte-stream-format-webm/), and append those segments into a SourceBuffer object, which is associated with a HTMLMediaElement through a [MediaSource](#mediasource) object.
+Clients `fetch()` media initialization segments and media segments, typically subsets of a single [fragmented MP4 file](https://www.w3.org/TR/mse-byte-stream-format-isobmff/) or [WebM file](https://www.w3.org/TR/mse-byte-stream-format-webm/), and append those segments into a SourceBuffer object, which is associated with a HTMLMediaElement through a <doc:MediaSourceExtensions#MediaSource> object.
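To make that flow concrete, here is a minimal sketch in page JavaScript, assuming a hypothetical segment URL and codec string (error handling omitted):

```js
// Minimal sketch of the fetch-and-append flow; the URL and codec string below
// are placeholders, and error handling is omitted.
const video = document.querySelector("video");
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", async () => {
  const sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
  // Fetch an initialization segment (or a combined init + media segment) and
  // hand its bytes to the SourceBuffer.
  const response = await fetch("/media/init-and-first-segment.mp4");
  sourceBuffer.appendBuffer(await response.arrayBuffer());
});
```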

## Relevant Classes

@@ -16,9 +16,9 @@ MediaSource serves two purposes:

* Creating SourceBuffer objects.
* Associating those SourceBuffer objects with a HTMLMediaElement.

-Once created, clients can query for container and codec support via `isTypeSupported(type)`, create [SourceBuffer](#sourcebuffer) objects via `addSourceBuffer(type)`, explicitly set the MediaSource's `duration`, and signal the end of the stream via `endOfStream(error)`.
+Once created, clients can query for container and codec support via `isTypeSupported(type)`, create <doc:MediaSourceExtensions#SourceBuffer> objects via `addSourceBuffer(type)`, explicitly set the MediaSource's `duration`, and signal the end of the stream via `endOfStream(error)`.

-Before creating any [SourceBuffer](#sourcebuffer) objects, the MediaSource must be associated with a HTMLMediaElement.
+Before creating any <doc:MediaSourceExtensions#SourceBuffer> objects, the MediaSource must be associated with a HTMLMediaElement.
The MediaSource can be set directly as the HTMLMediaElement's `srcObject`. Alternatively, an [extension](https://github.com/WebKit/WebKit/blob/main/Source/WebCore/Modules/mediasource/DOMURL%2BMediaSource.idl) to DOMURL allows an ObjectURL to be created from a MediaSource object, and that ObjectURL can be set as the HTMLMediaElement's `src`.

A MediaSource object will fire a `"sourceopen"` event when successfully associated with a HTMLMediaElement, and a `"sourceclose"` event when disassociated.
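As a hedged sketch of the association options and lifecycle events described above (the MIME string is a placeholder, and whether `srcObject` accepts a MediaSource depends on the UA):

```js
// Sketch of MediaSource association and lifecycle; the MIME type is a placeholder.
const mime = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';
if (!MediaSource.isTypeSupported(mime)) {
  throw new Error("container/codec combination not supported");
}

const video = document.querySelector("video");
const mediaSource = new MediaSource();

// Either set the MediaSource directly as srcObject (where supported)...
// video.srcObject = mediaSource;
// ...or create an ObjectURL from it and set that as src.
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", () => {
  // readyState is now "open"; SourceBuffer objects may be created.
  const sourceBuffer = mediaSource.addSourceBuffer(mime);
  mediaSource.duration = 60; // explicitly set the duration, in seconds

  // After the final segment has been appended (appends omitted in this sketch),
  // signal the end of the stream.
  sourceBuffer.addEventListener("updateend", () => mediaSource.endOfStream(), { once: true });
});

mediaSource.addEventListener("sourceclose", () => {
  // Fired when the MediaSource is disassociated from the element.
});
```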
@@ -29,30 +29,30 @@ The state of the MediaSource object can be queried via its `readyState` property

([.idl](https://github.com/WebKit/WebKit/blob/main/Source/WebCore/Modules/mediasource/SourceBuffer.idl), [.h](https://github.com/WebKit/WebKit/blob/main/Source/WebCore/Modules/mediasource/SourceBuffer.h), [.cpp](https://github.com/WebKit/WebKit/blob/main/Source/WebCore/Modules/mediasource/SourceBuffer.cpp))

SourceBuffer accepts buffers of initialization segments and media segments, which are then parsed into media tracks and media samples. Those samples are cached within the SourceBuffer (inside its [SourceBufferPrivate](https://github.com/WebKit/WebKit/blob/main/Source/WebCore/platform/graphics/SourceBufferPrivate.h) object)
-and enqueued into platform-specific decoders on demand. The primary storage mechanism for these samples is a [SampleMap](#samplemap), which orders those samples both in terms of each sample's DecodeTime and PresentationTime. These two times can differ for codecs that support frame reordering, typically MPEG video codecs such as h.264 and HEVC.
+and enqueued into platform-specific decoders on demand. The primary storage mechanism for these samples is a <doc:MediaSourceExtensions#SampleMap>, which orders those samples both in terms of each sample's DecodeTime and PresentationTime. These two times can differ for codecs that support frame reordering, typically MPEG video codecs such as h.264 and HEVC.

Clients append these segments via `appendBuffer()`, which sets an internal `updating` flag, fires the `"updatestart"` event, and subsequently fires the `"updateend"` event and clears the `updating` flag once parsing is complete. The results of the append are visible by querying the `buffered` property, or by querying the `audioTracks`, `videoTracks`, and `textTracks` TrackList objects.
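That append cycle can be wrapped as a small promise-returning helper; this is only a sketch, and both arguments are assumptions (a SourceBuffer created via `addSourceBuffer()`, and an ArrayBuffer of fetched segment data):

```js
// Append one segment and report the result once `updating` has been cleared.
// `sourceBuffer` and `segmentBytes` are assumed to exist; see the sketches above.
function appendSegment(sourceBuffer, segmentBytes) {
  return new Promise((resolve, reject) => {
    sourceBuffer.addEventListener("updateend", () => {
      // The results of the append are now visible via `buffered` and the track lists.
      const ranges = sourceBuffer.buffered;
      for (let i = 0; i < ranges.length; i++) {
        console.log(`buffered: ${ranges.start(i)}s to ${ranges.end(i)}s`);
      }
      resolve();
    }, { once: true });
    sourceBuffer.addEventListener("error", reject, { once: true });
    sourceBuffer.appendBuffer(segmentBytes); // sets `updating`, fires "updatestart"
  });
}
```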
### MediaSourcePrivate

([.h](https://github.com/WebKit/WebKit/blob/main/Source/WebCore/platform/graphics/MediaSourcePrivate.h))

-MediaSourcePrivate is an abstract base class which allows [MediaSource](#mediasource) to communicate through the platform boundary to a platform-specific implementation of MediaSource.
+MediaSourcePrivate is an abstract base class which allows <doc:MediaSourceExtensions#MediaSource> to communicate through the platform boundary to a platform-specific implementation of MediaSource.

When the GPU Process is enabled, the MediaSourcePrivate in the WebContent process is typically a [MediaSourcePrivateRemote](https://github.com/WebKit/WebKit/blob/main/Source/WebKit/WebProcess/GPU/media/MediaSourcePrivateRemote.cpp), which will pass commands and properties across the WebContent/GPU process boundary.

For Apple ports, the MediaSourcePrivate is typically a [MediaSourcePrivateAVFObjC](https://github.com/WebKit/WebKit/blob/main/Source/WebCore/platform/graphics/avfoundation/objc/MediaSourcePrivateAVFObjC.h).

For GStreamer-based ports, the MediaSourcePrivate is typically a [MediaSourcePrivateGStreamer](https://github.com/WebKit/WebKit/blob/main/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourcePrivateGStreamer.h).

-When running in DumpRenderTree/WebKitTestRunner, a "mock" MediaSourcePrivate can be enabled, and a [MockMediaSourcePrivate](https://github.com/WebKit/WebKit/blob/main/Source/WebCore/platform/mock/mediasource/MockMediaSourcePrivate.h) can be created. This is useful for writing platform-independent tests which exercise the platform-independent [MediaSource](#mediasource) and [SourceBuffer](#sourcebuffer) objects directly.
+When running in DumpRenderTree/WebKitTestRunner, a "mock" MediaSourcePrivate can be enabled, and a [MockMediaSourcePrivate](https://github.com/WebKit/WebKit/blob/main/Source/WebCore/platform/mock/mediasource/MockMediaSourcePrivate.h) can be created. This is useful for writing platform-independent tests which exercise the platform-independent <doc:MediaSourceExtensions#MediaSource> and <doc:MediaSourceExtensions#SourceBuffer> objects directly.

### SourceBufferPrivate

([.h](https://github.com/WebKit/WebKit/blob/main/Source/WebCore/platform/graphics/SourceBufferPrivate.h), [.cpp](https://github.com/WebKit/WebKit/blob/main/Source/WebCore/platform/graphics/SourceBufferPrivate.cpp))

SourceBufferPrivate is a semi-abstract base class which accepts initialization segment and media segment buffers, parses those buffers with platform-specific parsers, and enqueues the resulting samples into platform-specific decoders.
-SourceBufferPrivate is also responsible for caching parsed samples in a [SampleMap](#samplemap).
+SourceBufferPrivate is also responsible for caching parsed samples in a <doc:MediaSourceExtensions#SampleMap>.

### MediaTime

@@ -72,8 +72,8 @@ MediaTime offers convenience methods to convert from (`createTimeWithDouble()`)

([.h](https://github.com/WebKit/WebKit/blob/main/Source/WebCore/platform/MediaSample.h))

-MediaSample is an abstract base class representing a sample parsed from a media segment. MediaSamples have `presentationTime()`, `decodeTime()`, and `duration()`, all of which are [MediaTime](#mediatime) values,
-which are used to order these samples relative to one another in a [SampleMap](#samplemap).
+MediaSample is an abstract base class representing a sample parsed from a media segment. MediaSamples have `presentationTime()`, `decodeTime()`, and `duration()`, all of which are <doc:MediaSourceExtensions#MediaTime> values,
+which are used to order these samples relative to one another in a <doc:MediaSourceExtensions#SampleMap>.
For codecs which support frame reordering, `presentationTime()` and `decodeTime()` for each sample may differ.

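To make that distinction concrete, a small illustrative sketch (plain JavaScript objects standing in for parsed samples, not WebKit's C++ types) shows how a reordered group of frames sorts differently by each timestamp:

```js
// Illustrative only: with B-frames, decode order and presentation order diverge.
const samples = [
  { type: "I", decodeTime: 0, presentationTime: 0 },
  { type: "P", decodeTime: 1, presentationTime: 3 }, // decoded early, displayed late
  { type: "B", decodeTime: 2, presentationTime: 1 },
  { type: "B", decodeTime: 3, presentationTime: 2 },
];

const byDecode = [...samples].sort((a, b) => a.decodeTime - b.decodeTime);
const byPresentation = [...samples].sort((a, b) => a.presentationTime - b.presentationTime);

console.log(byDecode.map((s) => s.type).join(" "));        // "I P B B"
console.log(byPresentation.map((s) => s.type).join(" "));  // "I B B P"
```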
### SampleMap
