Revamp Device Profile Builder #519

Merged · 35 commits merged into jellyfin:main on Jan 13, 2024
Conversation

holow29 (Contributor) commented Aug 8, 2022

Draft related to #296.

Background

While it might theoretically make more sense to query a given media item at play time for support (I thought about investigating the MediaInfo API) and then send a decision about the play method to the server, I don't believe the API/server provides that capability. It seems to be architected the opposite way, around these device profiles.

Previously, the device profile builder mapped the device to codec capabilities via an external mapping of device model to CPU chip. That requires maintenance each time a new device is released, and it appears to be unnecessary given that, in many cases, we can query directly to see whether a given codec is supported.

I encountered some difficulty here because I wasn't sure where to draw the codec names for containers, audio codecs, video codecs, etc. from. I initially assumed ffmpeg would be the source, but it appears to be some combination of ffmpeg, file extensions, and possibly some hard-coded strings server-side. For example, ffmpeg does not use "mkv" but rather "matroska" or "matroska,webm"; for the container, however, we use "mkv." To further complicate matters, AVFoundation's AVURLAsset.isPlayableExtendedMIMEType() relies on the MIME type of the codec/container. Some of these are standardized with the IANA, some come from the MP4 registration authority, and others come from querying Apple's Uniform Type Identifiers (UTIs) with UniformTypeIdentifiers functions such as UTType.types() and preferredMIMEType. To make things easier, I have come up with the following list (a sketch of the UTI query follows the list):

MIME Types by format/codec
// Create dictionary with "ffmpeg codec" : ["MIME", "type", "alternate MIME in container"]
// ffmpeg codecs from ffmpeg -codecs
// ffmpeg formats from ffmpeg -formats and ffprobe -formats
// [unsupported] if not supported by native player

let possibleCodecs = [

    // Containers
    "mov" : ["video/quicktime", "container"],
    // What about an fmp4 container? Is that necessary? Don't see it in codecs list
    "mp4" : ["video/mp4", "container"],
    "m4v" : ["video/x-m4v", "container"],
    "avi" : ["video/avi", "container"],
    "3gp" : ["video/3gpp", "container"],
    "3g2" : ["video/3gpp2", "container"],
    // "ts" : ["video/mp2t", "container"], // Might only be supported in HLS as fragments? Same thing as mpeg2TransportStream I think...but no MIME types associated apparently I think [unsupported] Should this be mpegts or just ts?
    "mpegts" : ["video/mp2t", "container"], // .ts or .m2ts? (BDMV) Supported?
    "webm" : ["video/webm", "container"], // [unsupported]
    "mkv" : ["video/x-matroska", "container"], // [unsupported] // should this be mkv or matroska?
    "ogg" : ["video/ogg", "container"], // is this also ogv? [unsupported]
    "asf" : ["video/x-ms-asf", "container"], // [unsupported]
    "wmv" : ["video/x-ms-wmv", "container"], // [unsupported]
    "mpeg" : ["video/mpeg", "container"], // also video/x-mpeg? [unsupported]
    "mpg" : ["video/mpg", "container"], // also video/x-mpg? [unsupported]
    // "wtv" : ["", "container"], // [unsupported] Proprietary with very little support. Not sure it even has a proper MIME type
    // "iso" : ["application/x-iso9660-image", "container"], // [unsupported]
    "flv" : ["video/x-flv", "container"], // [unsupported]

    // Audio
    "aac" : ["audio/aac", "audioCodec", "mp4a.40"], // Many different levels...might have to use codec profiles to specify...I think it is actually just seeing mp4a? but that doesn't make sense because it fails with just mp4a...
    "mp3" : ["audio/mp3", "audioCodec", "mp3"], // Seems like it might have different ones? mp4a.40.34 for example or mp4a.69 or mp4a.6B ??????
    "ac3" : ["audio/ac3", "audioCodec", "ac-3"],
    "eac3" : ["audio/eac3", "audioCodec", "ec-3"], // audio/eac3 returns false and video/mp4; codecs="ec-3" returns true?? Is this even supported for anything but specific apps?
    "flac" : ["audio/flac", "audioCodec", "flac"],
    "alac" : ["audio/mp4; codecs=alac", "audioCodec", "alac"],
    "opus" : ["audio/opus", "audioCodec", "opus"], // It recognizes support for some of these but not all. Basically, opus can be in mp4 container but isn't supported in .ogg?
    "aiff" : ["audio/aiff", "audioCodec", "aiff"],
    "wav" : ["audio/wav", "audioCodec", "wav"],
    "amr" : ["audio/amr", "audioCodec", "samr"], // this is only narrowband...doesn't deal with wideband or extended wideband... ???
    // "pcm_s24le" : ["", "audioCodec", ""], // LPCM type...there are so many others though [?]
    // "mp4a" : ["audio/mp4; codecs=mp4a", "audioCodec"], // - this seems quite general?
    "vorbis" : ["audio/vorbis", "audioCodec", "vorbis"], // (or audio/ogg) [unsupported]
    "truehd" : ["audio/mp4; codecs=mlpa", "audioCodec", "mlpa"], // Dolby TrueHD (MLP) [unsupported]
    "mlp" : ["audio/mp4; codecs=mlpa", "audioCodec", "mlpa"], // [unsupported] Is this correct MIME?
    "ac4" : ["audio/mp4; codecs=ac-4", "audioCodec", "ac-4"], // Dolby AC-4 [unsupported] //not supported in FFMPEG yet?
    "dts" : ["audio/mp4; codecs=dtsc", "audioCodec", "dtsc"], // [unsupported] DCA (DTS Coherent Acoustics)
    "dtshd" : ["audio/mp4; codecs=dtsh", "audioCodec", "dtsh"], // DTS-HD (I believe this is the same as DTS-HD HR) [unsupported] ????
    // DTS-HD MA : ["audio/mp4; codecs=dtsl", "audioCodec", "dtsl"], [unsupported]
    // DTS-X : ["audio/mp4; codecs=dtsx", "audioCodec", "dtsx"], [unsupported]
    // what about 3gpp2 and 3gpp audio codecs?
    // ape, mp1, mp4als, pcm variants?, wma ones?
    // Tested unsupported natively: mp2, wmav2

    // Video
    "h264" : ["video/mp4; codecs=avc1", "videoCodec", "avc1"], //avc1 but also avc3 support - I don't think I can differentiate here? Maybe can differentiate in codecProfiles. Apple does not recommend using avc3
    "h263" : ["video/mp4; codecs=h263", "videoCodec", "h263"],
    // "flv1" : ["", "videoCodec", ""], // [unsupported] A variant of h263?
    "mpeg4" : ["video/mp4; codecs=mp4v", "videoCodec", "mp4v"],
    // dvhe - HEVC-based "Dolby Vision" Apple does not recommend using dvhe
    // dvh1 and dvh3 - dvh1 based on hvc1 and dvh3 based on hev1. Apple does not recommend use of dvh3
    "hevc" : ["video/mp4; codecs=hvc1", "videoCodec", "hvc1"], //hvc1 and hev1 are very similar but different way parameters (metadata) is stored. Apple does not recommend using hev1
    "vc1" : ["video/vc1", "videoCodec", "vc-1"], // [unsupported]
    "vp8" : ["video/mp4; codecs=vp08", "videoCodec", "vp08"], // [unsupported]
    "vp9" : ["video/mp4; codecs=vp09", "videoCodec", "vp09"], // [unsupported]
    "av1" : ["video/mp4; codecs=av01", "videoCodec", "av01"], // [unsupported]
    "vvc" : ["video/mp4; codecs=vvc1", "videoCodec", "vvc1"] // [unsupported]
    // msmpeg4v1, theora, wmv3
    // h265, wmv ??? are these still in use as is
    // tested unsupported natively: mpeg1video, wmv1, wmv2, msmpeg4v2, msmpeg4v3, mpeg2video
]
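For reference, here is a minimal sketch of how the UTI-derived MIME candidates above can be gathered programmatically. It assumes the UniformTypeIdentifiers framework; the helper name is purely illustrative and not part of the PR.

import UniformTypeIdentifiers

// Illustrative helper (not in the PR): list every MIME type Apple associates with a
// filename extension. For example, "avi" yields several candidates, and not all of
// them are actually playable (see the notes further below).
func candidateMIMETypes(forExtension ext: String) -> [String] {
    UTType.types(tag: ext, tagClass: .filenameExtension, conformingTo: nil)
        .flatMap { $0.tags[.mimeType] ?? [] }
}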

When deciding if media can be played, we must keep in mind the following sources of limitations:

  • Device support: This matters for the native player as well as for hardware decoding in VLCKit (e.g. HEVC); additionally, it has an interplay with container support since, for example, mp3 is technically supported in mp4, but Apple does not appear to support that combination
  • VLCKit support: what VLCKit itself can decode
  • Container support: With HLS, we are looking at either mpegts or fmp4. Each of these formats has limitations about which codecs are supported in them.
  • ffmpeg/server support: Even if a container can technically contain a codec, ffmpeg might not be able to mux it.

Therefore, querying for support alone is unfortunately insufficient.

Changes to Function

Codec querying

  • Whereas before the profile builder checked whether AC3/EAC3 are supported, we don't need to do that anymore: iOS 15 (and family) is a prerequisite (only devices with an A8 chip or newer), and any such device (A7 chip or newer, in fact) supports these two codecs.
  • Similarly, there were some errors in the device builder (such as using truehd for Atmos, when the Apple-supported version of Atmos is lossy EAC3+JOC). Unfortunately, TrueHD is not supported natively or by VLCKit because it requires Dolby licensing, so we do not need to check for it anymore.
  • Currently, it does not check for Dolby Vision (or HDR) support because I am not actually sure the check would work; it is an item to investigate.

Profile separation

There is now a separate device profile for the native player and the VLCKit one. They share codec conditions and response profiles for now because I didn't see a need to separate those as well, but containers, video codecs, audio codecs, and subtitle codecs can theoretically differ. In the current implementation, the VLCKit player only queries for HEVC support, since it can software-decode FLAC without issue; the native player has had blocks on software decoding (which might happen on chips prior to A10). VLCKit can also software-decode HEVC, but the experience will be poor unless the chip supports hardware decoding (A9 or newer, I think), so we check for it (a minimal sketch of that query follows).
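A minimal sketch of that HEVC query, using the hvc1 extended MIME string from the table above (the builder's actual condition may differ):

import AVFoundation

// Sketch: ask AVFoundation whether hvc1-in-mp4 is considered playable on this device.
let supportsHEVC = AVURLAsset.isPlayableExtendedMIMEType("video/mp4; codecs=hvc1")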

To Note

MIME querying

  • Not every MIME type returned for a UTI is playable. For example, for AVI the UTI returns the MIME types ["video/avi", "video/msvideo", "video/x-msvideo"], but only "video/avi" returns true from isPlayableExtendedMIMEType.
  • Some MIME types seem to imply a container. For example, "audio/opus" returns unsupported, but "video/mp4; codecs=opus" returns supported. I think this is because audio/opus is expected to be in an .ogg container, which is unsupported by the native player (see the probe sketch after this list).
  • At the moment, every codec supported in one container is reported as supported in every container. This is wrong, but it might not make a practical difference since 1) media trying to DirectPlay would have to be corrupted or truly abnormal to hit this issue and 2) everything being transcoded is put into either mpegts or fmp4.
  • isPlayableExtendedMIMEType does not differentiate between codec levels, such as AVC/HEVC levels or avc1 vs. avc2 vs. avc3, etc.
  • VP9 support was added in iOS 14, but it seems to be restricted to the YouTube app? The YouTube binary has a special entitlement (com.apple.developer.coremedia.allow-alternate-video-decoder-selection)
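As a small illustration of the container dependence noted above, the same codec string can be probed against different container MIME types (a sketch; the true/false results are whatever the current OS reports, not guarantees):

import AVFoundation

// Sketch: the same codec can report differently depending on the container MIME type.
// Per the notes above, "audio/opus" tends to report false while opus-in-mp4 reports true.
let opusStandalone = AVURLAsset.isPlayableExtendedMIMEType("audio/opus")
let opusInMP4 = AVURLAsset.isPlayableExtendedMIMEType("video/mp4; codecs=opus")
print("opus: standalone=\(opusStandalone), in mp4=\(opusInMP4)")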

Container/HLS limitations

  • HLS is used to deliver the media to the app. I am not sure whether it is used during DirectPlay with VLCKit, but it appears that any kind of DirectPlay with the native player forces a remux into an HLS stream as either mpegts or fmp4 (controllable in BaseItemDto+VideoPlayerViewModel? Though sometimes the remux uses the container defined in the transcoding profile...maybe only when transcoding is disabled for the user server-side?). It seems HLS has its own codec restrictions, and these might be different from the support reported by isPlayableExtendedMIMEType.
  • Need to abide by container limitations for transcoding/remuxes (outlined below):
    We have the following examples of limitations based on container (italics mean I am having trouble with ffmpeg muxing):
    • mpegts unsupported natively (might need format identifier by client): flac, h263, flv1, vp8, vorbis, theora, msmpeg4v2, msmpeg4v3, wmv1, wmv2, wmv3, wnv1, wmav2, alac, pcm_s24le, certain other types of pcm audio
    • fmp4 unsupported: h263, flv1, vp8, vorbis, theora, msmpeg4v2, msmpeg4v3, wmv1, wmv2, wmv3, wnv1, wmav2, mlp, pcm_s24le
      • mp3 unsupported for playback in mp4 on Apple devices, it seems

To Investigate

Needs testing

  • Does querying for support for FLAC actually work? On iPhone 6S simulator, it comes back supported, but this seems wrong.
  • VLCKit playback: I don't currently have a way to build this and test it on one of my devices, so I have been relying on simulators for testing as well as the VLC app. I would appreciate it if someone could test on a real device. For example, I don't seem to be able to play anything using VLCKit on the simulator. (That might be me doing something wrong!)

Meta question/question for devs

  • [Native player] Is the behavior of DirectPlay supposed to be a remux (into .ts or fmp4) as opposed to serving the file itself? This was a bit confusing because the server still remuxes even when the user has remuxing disabled server-side, since it treats the request as DirectPlay. If so, this changes the definition of the DirectPlay profiles for the native player, since remuxing into these two formats can be problematic given their codec support/ffmpeg limitations. It means we must be attentive to restrictions based on the containers used to deliver the remux, not just restrictions based on the device. For example, while a file with flac audio can be played by the hardware, flac cannot be delivered in mpegts or it will not play.
  • How is HDR indicated in a device profile? ffmpeg doesn't seem to use dvh1, hvc1, etc.? (Also, VLCKit might not support HDR? https://code.videolan.org/videolan/vlc/-/issues/18618) I think tvOS does tonemapping when using the native player; it is unclear whether tvOS 16 supports HDR10+, though I think it does on certain Apple TV models. It does support certain Dolby Vision profiles (native player). With tvOS 17, DV Profile 8.1 will be supported (in addition to the previously supported Profile 5 and Profile 8.4).
  • Transcoding: It appears that transcoding profiles will prefer the codecs in order for video and audio transcoding. However, it seems there is something else at play because this isn't universally true - it won't prefer ALAC even if it is listed first, for example. After we figure this out, we should order the codecs. Also, it appears that we need to have all supported codecs present in the profile or it will transcode. For example, if video isn't supported but audio is, it will need to transcode video; however, if the audio codec is in the directplay profile but not the transcode profile, it will then transcode the audio as well? Need to be careful because ffmpeg can decode but not encode some codecs, so order would matter here.
  • Transcoding: What is the point of having multiple transcoding profiles? How does the server choose which one to use?
  • What do container profiles do?
  • Can maxAudioChannels be boosted to 8 for 7.1 instead of 6? tvOS can handle 7.1, but I'm unsure about iOS. Additionally, even if it is set to 8, ffmpeg seems to specify -ac 6 (for example trying to transcode 7.1 TrueHD to FLAC). If -ac is unset in ffmpeg, FLAC defaults to 5.1(side) vs. 5.1.
  • Can we have the remux/transcode container switch based on what is supported in the remux container? Maybe this is unnecessary? Regardless, we need to decide on fmp4 vs. ts as the transcode container and remux container. This decision will affect which codecs must be disabled for DirectPlay on the native player (assuming remux for HLS) as well as which must be disabled for transcoding.

Bugs/Unexpected behavior

  • [Native player] h264-to-h264 transcoding for no apparent reason (to reproduce: try to play an h264-encoded .ts file and it will transcode h264 to h264; this can be mitigated by turning off video transcoding permissions for the user on the server)
  • [Native player] Potentially related to the above: for an h264 + DTS m2ts file, it is not transcoding the audio and instead remuxes it, even though dts is not in the DirectPlay profile
  • Transcoding: ffmpeg encounters an error when trying to mux flac into fmp4; flac is not supported in mpegts, so if we want to transcode anything lossless into flac, I think we need to use fmp4 for HLS.
  • VOBs don’t work because the server passes the dvd_nav_packet and the video, but no audio, to the HLS muxer

Codec support

  • Figure out PCM audio support in both native player and VLCKit - there are many different strings for PCM audio in ffmpeg - and add supported ones to device profile audioCodec list. LPCM is supported by both mpegts and mp4, I believe. Here are some sample PCM codec strings (pcm_s8,pcm_s16be,pcm_s16le,pcm_s24le,pcm_s32le,pcm_f32le,pcm_alaw,pcm_mulaw)
    • pcm_s24le can represent LPCM, but ffmpeg can't seem to mux into mp4 and it is a private stream in mpegts...
  • [Native player] Is mpeg4 videocodec actually supported? Seems like it is more of a container that can hold multiple codecs like divx type ones? (https://wiki.videolan.org/MPEG-4/)
  • Does HEVC have to be in fmp4? Apple HLS specs say yes. If so, that means that codecs like mp3 that are unsupported in mp4 should not be in directplay profile or there needs to be multiple directplay profiles. (Of course, under current behavior where directplay is always remuxing for HLS)
  • Look over potential VLC codec resources to make sure all are appropriately included in device profile (divX, for example; also all of the other codecs in the array I made like mpeg2video, mpeg1video, etc. though mpeg1video might actually be an issue not sure. need to test). What about mlp? There might be some that are supported in native player as well.
  • Reorder codecs to be consistent and correct if transcoding preference is involved. (For example, HEVC before or after h264? There are server settings related to transcoding preference and would need to see how they influence this. Might be specific to video and not audio.)

Future Work

Subtitles

  • Investigate subtitles for both native player and VLCKit. I have done no work on this yet.

Audio transcoding to lossless

  • I would like lossless audio that is unsupported by VLCKit or the native player (like TrueHD) to be converted to another supported lossless format, such as FLAC, so it can play. This relies on fixing some issues above, such as FLAC in fmp4 (which is not standard, but might be supported by ffmpeg? It is at least supported by Apple's devices) as well as maxAudioChannels not being overwritten by ffmpeg, and then reordering the transcoding profile and changing the transcoding container to mp4. ALAC is another option, but it doesn't seem to support default 7.1 audio, only 7.1(wide) if specified. The default for 8-channel input to the ALAC encoder seems to be 6.1(back).

Meta optimization/decisions

  • Profiles will not change unless the app is updated, iOS/tvOS is updated, or the device changes. Therefore, there is no reason to build these profiles more than once, at app launch or another infrequent event; right now, the profiles are built every time we go into the media view (a caching sketch follows).
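A minimal caching sketch with hypothetical names: buildNativeProfile()/buildVLCKitProfile() stand in for whatever the real builder entry points are. Static stored properties are initialized lazily in Swift, so each profile would be built once on first access and reused thereafter.

enum CachedDeviceProfiles {
    // Hypothetical builder calls; built lazily on first access, then reused.
    static let native = DeviceProfileBuilder.buildNativeProfile()
    static let vlcKit = DeviceProfileBuilder.buildVLCKitProfile()
}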

Querying for support

  • [Native player] Decide whether to query by container and build native player device profiles that way
Code sample
for (codec, mimeWithType) in possibleCodecs
    where mimeWithType[1] == "container" && AVURLAsset.isPlayableExtendedMIMEType(mimeWithType[0]) {

    // The container itself is playable; now probe each audio/video codec inside it.
    let supportedCont = codec
    var supportedAud = ""
    var supportedVid = ""

    for (codec2, mimeWithType2) in possibleCodecs {
        switch mimeWithType2[1] {
        case "audioCodec":
            if AVURLAsset.isPlayableExtendedMIMEType(mimeWithType[0] + "; codecs=" + mimeWithType2[2]) {
                supportedAud += ",\(codec2)"
            }
        case "videoCodec":
            if AVURLAsset.isPlayableExtendedMIMEType(mimeWithType[0] + "; codecs=" + mimeWithType2[2]) {
                supportedVid += ",\(codec2)"
            }
        default:
            break
        }
    }

    // Trim the leading commas from the accumulated codec lists.
    if !supportedAud.isEmpty {
        supportedAud.remove(at: supportedAud.startIndex)
    }
    if !supportedVid.isEmpty {
        supportedVid.remove(at: supportedVid.startIndex)
    }
}
  • [Native player] Decide whether to query for known unsupported MIME types now, hoping they will be supported by Apple in the future
  • Investigate alternative methods of querying for support, such as using VideoToolbox/CoreMedia; this can also give information on hardware vs. software decoding capabilities (a sketch follows).
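A sketch of the VideoToolbox idea. Note this is an assumption to verify: VTIsHardwareDecodeSupported has historically been documented for macOS, so its availability on the iOS/tvOS deployment target would need to be checked.

import VideoToolbox

// Sketch: ask VideoToolbox whether the system has a hardware HEVC decoder.
// Verify that VTIsHardwareDecodeSupported is available on the iOS/tvOS target first.
let hevcHardwareDecode = VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC)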

Resources

Possible codecs

Apple HLS

Apple codec support

MIME types

Potential VLCKit codecs

Supported formats in containers

Removed device-specific identifier and replaced with built-in AVFoundation check to see if codec is playable.
All devices that support iOS 15 (minimum required) support AC3/EAC3 so only check for FLAC (in native player) and HEVC (in both native and VLCKit).
Suppressed compile warning and did linting
Native player profile:
- Remove h263 from native direct play profile because it is unsupported in mpegts and ffmpeg does not support muxing it into mp4.
- Remove alac from native direct play profile because it is unsupported in mpegts, which is currently being used for HLS direct play streams.
- Remove flac check and flac from native profile because it is unsupported in mpegts.

VLCKit player profile:
- Add h263 to direct play profile because it was removed from base.
- Add in missing flv1 to direct play profile.
- Add alac to direct play profile because it was removed from base.
- Remove dca from direct play and transcoding profiles because I believe dts covers that codec.
- Remove flac, vp8, theora, msmpeg4v2, msmpeg4v3, wmv1, wmv2, vorbis, wmav2, pcm_s24le from transcoding profile because they are unsupported in mpegts.
WAV is an audio container format and should not have been in the video profile
LePips added the "enhancement" (New feature or request) label on Aug 9, 2022
LePips (Member) commented Aug 9, 2022

Thanks for the very comprehensive write-up! I've CC'd the JF devs to potentially get some server insights into many of the questions about what the server expects, profiles, transcoding expectations, and such. You know a lot more about this than I do, so I'm going to be playing a lot of catch-up. I'll give a more in-depth comment later, but regarding some of the development stuff:

There is now a separate device profile for the native player and the VLCkit one.

Yes and thank you, this is how it should be.

Does querying for support for FLAC actually work? On iPhone 6S simulator, it comes back supported, but this seems wrong.

It seems like FLAC was introduced in iOS 11? This discussion says that it supports it 🤷

For example, I don't seem to be able to play anything using VLCKit on the simulator.

VLCKit via the simulator is slow overall and useful only to prove that playback can happen. It's laggy, but it should work just fine since you can build the application, meaning you have the correct libraries. Regarding the simulator in this and the previous point, I've also asked my local dev community whether the simulator is an actual "simulator" of the device (and would return the device's correct corresponding types from audiovisualTypes() or audiovisualMIMETypes()) or is just the same simulator with different identifiers that returns the Mac's supported types. If the simulator does properly reflect the device it's simulating, we're also given the safety to deploy this without much testing. I've never had to think about that before.
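For what it's worth, a quick probe along these lines (just a sketch) could be run on the simulator and on a real device to compare what each reports:

import AVFoundation

// Sketch: dump what this runtime claims to support, to compare simulator vs. device.
print(AVURLAsset.audiovisualMIMETypes())
print(AVURLAsset.audiovisualTypes().map(\.rawValue))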

Investigate subtitles for both native player and VLCKit. I have done no work on this yet.

We should be fine on the VLCKit front as we haven't received any issues regarding subtitle support yet. (this differs from the other subtitle issues we currently have)

Profiles will not change unless the app is updated, iOS/tvOS is updated, or the device changes. Therefore, there is no reason to build these profiles more than once at app launch or another infrequent event. Right now, the profiles are built everytime we go into the media view.

We can just have this be a computed variable on UIDevice but ultimately this is the most trivial thing about all this.

Add FLAC DirectPlay support
Fix string builders for codec strings
Native player only supports VTT external subs
Get rid of duplicates in VLCKit player
holow29 (Contributor, Author) commented Feb 5, 2023

Thanks for the offer - the issue with my testing was initially having a Mac with a new-enough version of Xcode, but I've solved that! I have answered and completed some of the tasks here; I need to update the list. I'm running into a few issues that I think are server-side, and I'm eyeing some in-progress PRs that I hope will solve them. (I don't want to push a device profile that will cause less functionality because there is an active issue server-side.)
Which is all to say there is some amount of progress being made on this, and I will try to make that clear so I can at least get something workable out even if it isn't perfect.

Change maxAudioChannels from 6 to 8 in Transcoding Profiles
Change transcodingMaxAudioChannels to 8 from 6 in HLS builder
holow29 (Contributor, Author) commented Feb 18, 2023

Okay I think this is ready for review/comments. (And could use some testing from someone else probably.)

Changes from original idea:

Whereas before I was going to query for HEVC and FLAC, I've decided against that. Querying can be messy and imprecise as I detailed in the OP. As far as I can tell, FLAC can be software or hardware decoded by all devices without issue and HEVC only affects three devices that support iOS/tvOS 15 but have an A8 chip (i.e. no HEVC hardware decoding): Apple TV HD, iPad Air 2, iPad Mini 4 - but it can software decode at least on Apple TV HD. I'm assuming it can either software decode on the others as well or those devices are old enough (7+/8+ years) that it won't be an issue. This means that device profiles are static!

I've left some comments in the code, but I have made the DirectPlay profile for VLCKit only define audio codecs; basically any container/video codec should pass through for DirectPlay. The way I see it, anything I know of that Jellyfin can detect (and ffmpeg can transcode) is supported by VLCKit other than TrueHD/mlp. If someone finds another codec that can be decoded server-side but not in VLCKit, we can add it or change strategy there.

Other changes

I've gotten rid of the fmp4 experimental option and just switched to mp4 for all HLS transcoding/remuxing. While ts supports more of the older formats, there are really only a few cases where transcoding will actually happen for the VLCKit player: 1) when TrueHD/mlp audio is selected and 2) when the server forces a transcode for bitrate reasons. In case 1, it is unlikely the video codec won't be able to pass through, so we would get DirectStream. In case 2, video needs to be transcoded anyway, so we might as well transcode it to h264 or hevc, and the mp4 container fits well there. It shouldn’t happen with external subtitles on VLCKit because they can be passed externally as well. As the DirectPlay profile is defined right now, I don't think remuxing will ever happen for VLCKit.

I've changed maxAudioChannels from 6 to 8 because iOS/tvOS can handle 8 channels, and tvOS can output 8-channel LPCM. This enables us to output DTS-HD MA up to 7.1 losslessly, as well as TrueHD/mlp (transcoded server-side to FLAC) losslessly. It is waiting for my PR (jellyfin/jellyfin#9266) to land server-side, though. Until then, the channel limit is stuck at 6 server-side.

I've changed it so the native player will actually DirectPlay, rather than requesting an HLS stream, for things it can DirectPlay. I believe this is important for content like MP4 with HEVC HDR and compatible audio (maybe AC3), because then the file can be played on SDR screens and the native player will tonemap. Perhaps more importantly, it maintains the meaning of DirectPlay: if the user does not have permission to remux server-side, it will no longer do so.

Why choose VLCKit or Native player?

  • HDR Content: VLCKit does not support HDR; it will tonemap. The native player does support HDR, so if you want HDR, you should use the native player to play the file. However, as mentioned above, the native player will not tonemap to SDR displays unless it is DirectPlaying the file (not remuxing or DirectStreaming, because those use HLS); in the case of an SDR display with remux or DirectStream/transcoding, the server will tonemap if that is enabled server-side.
  • Refresh Rate Switching: the VLCKit player does not currently support automatic refresh rate switching; the native player does.
  • Codec support: VLCKit supports almost any codec that Jellyfin can detect; native player codec support is quite limited. For example, native player does not support VC-1 or mpeg2video, codecs commonly used for DVDs and old Blu-rays.

What doesn't work yet?

  • Blu-rays and DVDs are not properly passed to the client. I haven't tested it, but I am hopeful that this PR will fix it: Fix DVD and BD folder playback jellyfin#9254. This manifests in various ways:
    • For VOB files, it means that server is trying to send dvd_nav_packet+video in the stream instead of the proper video+audio.
    • Blu-rays in BDMV won't play with DTS track; an issue with AudioStreamIndex perhaps ([Issue]: Player Error Encountered attempting to play UHD content jellyfin#9101)
    • When Blu-rays remux, ffmpeg chooses audio that does not match user preference (e.g. preferring supported AC3 track to unsupported main audio even if in a different language or commentary)
  • The server treats the mp4, m4v, mov, 3gp, and 3g2 containers as all the same. This really only has an impact for the native player, in the case where something in one of those containers is not actually supported in the DirectPlay profile for its container. In reality, I think this only ever happens with mp3 audio in mp4, which Apple for some reason does not support. I'm not sure whether Container profiles or Response profiles could solve this.

Misc. Notes

Would like at some point to figure out VP8/VP9 support for native player...very unclear what is actually supported.
I am probably still missing some PCM formats from directplay profiles as well as some other audio formats from VLCKit one, but I have put in quite a few, and I doubt people will have any issues - if they do, we can always add whatever obscure/ancient codec after.
There is also probably more work to be done on the native player side of things in terms of subtitle support in the HLS stream - I haven't had the chance to test it at least. The hope would be that subrips and other text-based subs could be transcoded easily to webvtt and then passed in the HLS stream per Apple's specs.

Please ask if there are questions. While the code changes are not that large themselves, there is so much at play here in terms of what is actually limited/defined server-side or by Apple or by ffmpeg or by HLS, etc. I'll say I found it extremely difficult to find documentation about these profiles.

@holow29 holow29 marked this pull request as ready for review February 18, 2023 02:54
schrock commented Feb 26, 2023

This looks great! I'm hoping this will be merged and released soon.

LePips (Member) commented Feb 26, 2023

Thank you for working on this! This will require testing and will be merged after #593. I don't think there will be many conflicts.

holow29 (Contributor, Author) commented Jul 13, 2023

Here are the main issues I am encountering now (on tvOS - have not tested on iOS):

  • Subtitles are still a mess. It appears the server is always trying to extract subtitles before playback, at least on tvOS. I'm not sure whether this is because of the device profile or because of some other issue with subtitle selection on tvOS (which doesn't appear to work well in the VLCKit/Swiftfin player).
  • On the unstable server, DVD/Blu-ray playback works thanks to Fix DVD and BD folder playback jellyfin#9254. However, it doesn't seem to pick the default audio track, instead defaulting to AC3, even in a different language. I'm not entirely sure whether this is server-side (since the selection goes through libbluray/ffmpeg) or due to the HLS restrictions implemented since TrueHD/DTS were shifted. However, the web player seems to request the correct (first) audio track, so this is probably client-side. It also seemed to transcode DTS audio instead of passing it through, for some reason, when no other audio tracks are available that would DirectPlay.

Other issues that I believe are unrelated to this PR but that nonetheless are problematic:

Might also want to investigate native AV1 decoding support (introduced with the iPhone 15 Pro/Pro Max).

ShadowJonathan commented
I can confirm that this fixes #790, after toggling this patch on my local build, it seems to automatically "work" for my Apple TV HD.


Additionally, I'm not too sure about the approach of not conditionally enabling better codec options for future devices; while the "lowest" profile for VLCKit seems to be alright, I'm not sure it will satisfy 4K HDR video and the like.

But as a way to create a proper codec base for Apple devices and continue from there, this will work.

@LePips LePips self-requested a review December 13, 2023 22:57
Review thread on Shared/Objects/DeviceProfileBuilder.swift (outdated, resolved)
LePips (Member) commented Dec 16, 2023

So with this, I've always wanted to learn more about how the server works and test on a variety of media configurations. I was sitting down to do this but I just don't have the time or expertise to focus on this system of the app/server. Due to that, assume my answer to all of these questions is 🤷.

I know that it may be disappointing/upsetting that this PR has lived for so long and I can't provide any direction about whether this is correct or not, but I do like it a lot better than the manual specification that we did before. I am confident that this is a very large step in the right direction and we can, obviously, fix any issues that arise.

I know that as we all learn more about the server we can answer some of these questions (ex: why remux if not necessary, how is HDR specified, codec ordering) and implement the work accordingly.

I am comfortable with this implementation and luckily it's very localized so we can revert if necessary. If you are comfortable as well, this can be merged.

AhsanFazal (Contributor) commented
@holow29 and @LePips I just created this repository https://github.com/AhsanFazal/avplayer-supported-media-types/.

This repository contains the static files that MediaToolbox.framework uses to check playback capabilities when called by AVFoundation functions, for example:

class func isPlayableExtendedMIMEType(_ extendedMIMEType: String) -> Bool

The function above calls a static function of MediaToolbox.framework, which in turn parses the supplied string parameter and checks the "playability" of the MIME type against the [DEVICE_ID]/MediaValidator.plist file.

With these files, I think we now have all the information we need to determine the EXACT native playback capabilities of each device.

@holow29 I would love to work together with you on finishing this PR. Let me know if you'd like that! :)

holow29 (Contributor, Author) commented Jan 10, 2024

@AhsanFazal Thanks for your work extracting those files. I was using isPlayableExtendedMIMEType originally, but it didn't prove to be extremely helpful. I left some notes in the OP explaining why (under the list of MIME types I put together and in the "To Note" section); the gist is that unless something has changed in iOS 16/17 there is no way to properly query for playability of a codec/MIME type in a container using isPlayableExtendedMIMEType AFAIK - and Apple is extremely picky about being able to play some codecs only in certain containers. If you've found otherwise, I would obviously be interested in that. Additionally, for practically all codecs apart from newer ones like AV1 or AC4, the limitation of iOS 15/tvOS 15 normalizes the codec limitations across almost all supported devices. The exception is HEVC support for ~2 iPad models, but it was similarly unclear if isPlayableExtendedMIMEType would be helpful in this instance because I believe software-decoding of HEVC was added even though it isn't performant; I don't recall what is returned on these. Regardless, the MediaValidator.plist file might still be helpful in determining some of the more specific limitations such as HEVC profiles, etc.
I am sorry I haven't had much time to devote to this. I will see if I can quickly get something together that can be merged and maybe leave some questions for future work.

@holow29 holow29 marked this pull request as ready for review January 12, 2024 21:28
holow29 (Contributor, Author) commented Jan 12, 2024

I believe the linting check is failing because of files without 2024 in the header. The iPhone 15 Pro simulator (even though it should support AV1 hardware decode) does not report success from isPlayableExtendedMIMEType, so I left AV1 out.

Issues that might be solved by merging:

Refer to my above comment for more information about things that are not solved/potential issues that might become more apparent because of this PR. In addition to that, there might be more to investigate re: HDR and any signalling that must go on from OS(/TV) -> App -> Server.

@holow29 holow29 requested a review from LePips January 12, 2024 22:01
LePips (Member) commented Jan 13, 2024

The build is failing on tvOS, so we need to fix that; however, I can approve the lint without it succeeding and we can get that later. I think I will quickly refactor this into DeviceProfile instead of it being its own type.

I will consider this the work done for #296, so this will close that and ongoing work is always implied.

Here is my stance on all of the playback issues: I have always wanted to close them. We don't control playback, we hand that off to VLCKit or AVPlayer and all we do is construct the URLs, so I never thought of them as actually a concern of ours. I actually will close them sometime but I have deferred that to when I get to implementing mpv and will update how we accept issues around anything playback related. While this does solve some issues, my policy would have always been: direct play or transcode your media to something more usable.

I'm grateful for all the work that you've done here and the research done by @AhsanFazal to get this through!

holow29 (Contributor, Author) commented Jan 13, 2024

The build is failing on tvOS so we need to fix that

I am having difficulty seeing why it is failing here; I expected to see a log, but it just says the job failed. The build succeeds on my end, and tvOS has been my main testing ground. Please let me know if there is something I am missing!

Here is my stance on all of the playback issues:[...]

I think that is sound. Best maybe to make sure that the URL is being constructed correctly and passed off to the players properly to ensure they support everything they should, which this works towards. I think the player issues are somewhat helpful in giving an idea of the player limitations, some of which are documented in this PR thread. Obviously they will change with MPV, though.

LePips (Member) commented Jan 13, 2024

Hm, the tvOS failed build seems to just have been a GH runner issue. I'll go ahead and merge this.

LePips merged commit e2d6237 into jellyfin:main on Jan 13, 2024 (5 of 6 checks passed)
pwntester commented
@LePips Thanks for implementing this. How can I try it on my ATVHD 4? It seems like the latest version on the App Store is one year old, and I can't seem to find a TestFlight version.

Thanks!

ericswpark (Contributor) commented
@pwntester there is no TestFlight build, and I think @LePips mentioned there won't be one for tvOS specifically until some more changes can be done. In the meantime you can build and sideload it yourself with Xcode.

pwntester commented
Thanks @ericswpark. Is it possible to build Swiftfin without an Apple developer account? I seem to need a provisioning profile, which I don't have.

ericswpark (Contributor) commented Apr 7, 2024

@pwntester you should be able to build and sign Swiftfin with any Apple account. I'm hazy on the details, but I think you may have to sign up with the developer "program" and accept some ToSes, but you definitely don't have to pay to deploy Swiftfin.

Do keep in mind that non-paid accounts can only sign builds that last 7 days, compared to paid developer accounts that can go up to 1 year.

Once your developer profile is set up you should be able to select the development team in Xcode, by clicking on the "Swiftfin" project in the sidebar and selecting the profile.
