articles/media-services/latest/frequently-asked-questions.md
editor: ''
ms.service: media-services
ms.workload:
ms.topic: article
ms.date: 04/07/2020
ms.author: juliako
---

It is now recommended to use server-side storage encryption, which is on by default. For more information, see [Azure Storage Service Encryption for Data at Rest](https://docs.microsoft.com/azure/storage/common/storage-service-encryption).

## Offline streaming

### FairPlay Streaming for iOS

The following frequently asked questions provide assistance with troubleshooting offline FairPlay Streaming (FPS) for iOS:

#### Why does only audio play but not video during offline mode?

This behavior seems to be by design of the sample app. When an alternate audio track is present (which is the case for HLS) during offline mode, both iOS 10 and iOS 11 default to the alternate audio track. To compensate for this behavior for FPS offline mode, remove the alternate audio track from the stream. To do this on Media Services, add the dynamic manifest filter "audio-only=false". In other words, the HLS URL ends with .ism/manifest(format=m3u8-aapl,audio-only=false).
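
For example, with placeholder values for the streaming endpoint host, locator ID, and manifest name, the filtered HLS URL looks like this:

```
https://<streaming-endpoint-host>/<locator-id>/<manifest-name>.ism/manifest(format=m3u8-aapl,audio-only=false)
```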

#### Why does it still play audio only without video during offline mode after I add audio-only=false?

Depending on the content delivery network (CDN) cache key design, the content might still be cached. Purge the cache.

#### Is FPS offline mode also supported on iOS 11 in addition to iOS 10?

Yes. FPS offline mode is supported for iOS 10 and iOS 11.

#### Why can't I find the document "Offline Playback with FairPlay Streaming and HTTP Live Streaming" in the FPS Server SDK?

As of FPS Server SDK version 4, this document has been merged into the "FairPlay Streaming Programming Guide."

#### What is the downloaded/offline file structure on iOS devices?

The downloaded file structure on an iOS device looks like the following. The `_keys` folder stores downloaded FPS licenses, with one store file for each license service host. The `.movpkg` folder stores audio and video content. The first folder, with a name that ends with a dash followed by a numeric value, contains video content; the numeric value is the PeakBandwidth of the video renditions. The second folder, with a name that ends with a dash followed by 0, contains audio content. The third folder, named "Data", contains the master playlist of the FPS content. Finally, boot.xml provides a complete description of the `.movpkg` folder content.
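
A sketch of this layout, with illustrative names (the actual folder names are generated by iOS and vary by device and content):

```
<content-name>.movpkg/
    <name>-650000/    # video content; the numeric suffix is the PeakBandwidth of the video renditions
    <name>-0/         # audio content; the suffix is 0
    Data/             # master playlist of the FPS content
    boot.xml          # complete description of the .movpkg folder content
_keys/
    <license-service-host>    # one store file per license service host
```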

#### How can I deliver persistent licenses (offline-enabled) for some clients/users and non-persistent licenses (offline-disabled) for others? Do I have to duplicate the content and use a separate content key?

Media Services v3 allows an Asset to have multiple StreamingLocators, so you can have:

* One ContentKeyPolicy with license_type = "persistent", a ContentKeyPolicyRestriction with a claim on "persistent", and its StreamingLocator;
* Another ContentKeyPolicy with license_type = "nonpersistent", a ContentKeyPolicyRestriction with a claim on "nonpersistent", and its StreamingLocator;
* The two StreamingLocators have different ContentKeys (see the sketch below).
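
For illustration, here is a minimal sketch of creating the two StreamingLocators on the same Asset with the azure-mgmt-media Python SDK (a recent version is assumed). The resource names and the two ContentKeyPolicy names are placeholders, and the policies are assumed to already exist with the appropriate token restrictions:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.media import AzureMediaServices
from azure.mgmt.media.models import StreamingLocator

client = AzureMediaServices(DefaultAzureCredential(), "<subscription-id>")
RG, ACCOUNT, ASSET = "<resource-group>", "<media-services-account>", "<asset-name>"

# Locator for offline-enabled users; its default policy issues persistent licenses.
client.streaming_locators.create(
    RG, ACCOUNT, "locator-persistent",
    StreamingLocator(
        asset_name=ASSET,
        streaming_policy_name="Predefined_MultiDrmStreaming",
        default_content_key_policy_name="policy-persistent",
    ),
)

# Locator for streaming-only users; its default policy issues non-persistent licenses.
# Because no content keys are supplied, each locator gets its own ContentKey.
client.streaming_locators.create(
    RG, ACCOUNT, "locator-nonpersistent",
    StreamingLocator(
        asset_name=ASSET,
        streaming_policy_name="Predefined_MultiDrmStreaming",
        default_content_key_policy_name="policy-nonpersistent",
    ),
)
```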

Depending on the business logic of your custom secure token service (STS), different claims are issued in the JWT token. With the token, only the corresponding license can be obtained and only the corresponding URL can be played.
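
A minimal sketch of how such a custom STS might issue the two kinds of tokens with PyJWT; the claim name "persistent", the signing key, and the issuer/audience values are illustrative and must match the token restriction configured for each ContentKeyPolicy:

```python
import time
import jwt  # PyJWT

SIGNING_KEY = "<replace-with-your-symmetric-key>"
ISSUER = "https://sts.example.com"
AUDIENCE = "urn:media-offline-demo"

def issue_token(offline_enabled: bool) -> str:
    claims = {
        "iss": ISSUER,
        "aud": AUDIENCE,
        "exp": int(time.time()) + 3600,
        # The claim that the matching ContentKeyPolicyRestriction checks.
        "persistent": "true" if offline_enabled else "false",
    }
    return jwt.encode(claims, SIGNING_KEY, algorithm="HS256")

# The player presents this token to the license service; only the license
# allowed by the matching policy can be obtained with it.
print(issue_token(offline_enabled=True))
```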

#### What is the mapping between the Widevine and Media Services DRM security levels?

Google's "Widevine DRM Architecture Overview" defines three different security levels. However, the [Azure Media Services documentation on the Widevine license template](widevine-license-template-overview.md) outlines five different security levels. This section explains how the security levels map.

Google's "Widevine DRM Architecture Overview" document defines the following three security levels:

* Security Level 1: All content processing, cryptography, and control are performed within the Trusted Execution Environment (TEE). In some implementation models, security processing may be performed in different chips.
* Security Level 2: Performs cryptography (but not video processing) within the TEE: decrypted buffers are returned to the application domain and processed through separate video hardware or software. At level 2, however, cryptographic information is still processed only within the TEE.
* Security Level 3: Does not have a TEE on the device. Appropriate measures may be taken to protect the cryptographic information and decrypted content on the host operating system. A Level 3 implementation may also include a hardware cryptographic engine, but that only enhances performance, not security.

At the same time, the [Azure Media Services documentation on the Widevine license template](widevine-license-template-overview.md) specifies that the security_level property of content_key_specs can have the following five values (client robustness requirements for playback):

* Software-based white-box crypto is required.
* Software crypto and an obfuscated decoder are required.
* The key material and crypto operations must be performed within a hardware backed TEE.
* The crypto and decoding of content must be performed within a hardware backed TEE.
* The crypto, decoding, and all handling of the media (compressed and uncompressed) must be handled within a hardware backed TEE.

Both sets of security levels are defined by Google Widevine. The difference is in their usage level: architecture level or API level. The five security levels are used in the Widevine API. The content_key_specs object, which contains security_level, is deserialized and passed to the Widevine global delivery service by the Azure Media Services Widevine license service. The table below shows the mapping between the two sets of security levels, and an example license template follows the table.

|**Security Levels Defined in Widevine Architecture**|**Security Levels Used in Widevine API**|
|---|---|
|**Security Level 1**: All content processing, cryptography, and control are performed within the Trusted Execution Environment (TEE). In some implementation models, security processing may be performed in different chips.|**security_level=5**: The crypto, decoding, and all handling of the media (compressed and uncompressed) must be handled within a hardware backed TEE.<br/><br/>**security_level=4**: The crypto and decoding of content must be performed within a hardware backed TEE.|
|**Security Level 2**: Performs cryptography (but not video processing) within the TEE: decrypted buffers are returned to the application domain and processed through separate video hardware or software. At level 2, however, cryptographic information is still processed only within the TEE.|**security_level=3**: The key material and crypto operations must be performed within a hardware backed TEE.|
|**Security Level 3**: Does not have a TEE on the device. Appropriate measures may be taken to protect the cryptographic information and decrypted content on the host operating system. A Level 3 implementation may also include a hardware cryptographic engine, but that only enhances performance, not security.|**security_level=2**: Software crypto and an obfuscated decoder are required.<br/><br/>**security_level=1**: Software-based white-box crypto is required.|
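
For reference, security_level is set per track inside content_key_specs in the Widevine license template. A simple example follows; the values are illustrative, and the full schema is described in the Widevine license template article linked above:

```json
{
  "allowed_track_types": "SD_HD",
  "content_key_specs": [
    {
      "track_type": "SD",
      "security_level": 1,
      "required_output_protection": {
        "hdcp": "HDCP_NONE"
      }
    }
  ],
  "policy_overrides": {
    "can_play": true,
    "can_persist": true,
    "can_renew": false
  }
}
```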

#### Why does content download take so long?

There are two ways to improve download speed:

* Enable CDN so that end users are more likely to hit the CDN instead of the origin/streaming endpoint for content download. If a user hits the streaming endpoint, each HLS segment or DASH fragment is dynamically packaged and encrypted. Even though this latency is on the millisecond scale for each segment or fragment, when you have an hour-long video, the accumulated latency can be large and cause a longer download.
* Provide end users the option to selectively download video quality layers and audio tracks instead of all of the content. For offline mode, there is no point in downloading all of the quality layers. There are two ways to achieve this:

    * Client controlled: either the player app automatically selects, or the end user selects, the video quality layer and audio tracks to download.
    * Service controlled: you can use the Dynamic Manifest feature in Azure Media Services to create a (global) filter that limits the HLS playlist or DASH MPD to a single video quality layer and selected audio tracks. The download URL presented to end users then includes this filter, as in the example URL below.

See [Why does only audio play but not video during offline mode?](frequently-asked-questions.md#why-does-only-audio-play-but-not-video-during-offline-mode).
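
For the service-controlled approach, the download URL carries the filter name. For example, with a hypothetical filter named myDownloadFilter and placeholder host, locator ID, and manifest name, an HLS download URL would look like this:

```
https://<streaming-endpoint-host>/<locator-id>/<manifest-name>.ism/manifest(format=m3u8-aapl,filter=myDownloadFilter)
```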