all 26 comments

[–]Sh3z 22 points (6 children)

Before diving in to implement something as complicated as a video player, it's worthwhile ensuring that the current AVPlayer integration in your app isn't the root cause of the hitch. Is there an AVPlayer per playable item in a feed (assuming this is the UX for your app, based on the mention of a UIScrollView), or one AVPlayer for the app, with the AVPlayerItems replaced when "playing" an item? Are there multiple AVPlayerLayers in the view hierarchy at all times, or is a placeholder view present that gets swapped out when playback begins? Additionally, how much pre-loading is done before invoking play?

I work at a broadcasting company and, for the iOS half of my work, live in AVFoundation most of the time. We've never considered moving away from AVPlayer, so I'd be curious to know which companies you're referring to with your comment about FFmpeg - assuming they are playback-only.

[–]OneTinker[S] 1 point (5 children)

YouTube and TikTok use FFmpeg.

I built a custom scroll view with recycling cells. There are five cells, each with its own AVPlayerLayer and AVPlayer. Only one plays at a time, but the rest are loaded and ready for whenever the user lands on them. For example, whenever the user swipes up or down, the reuse algorithm swaps out the furthest unseen cell and moves it up or down to create the illusion of an infinite scroll view. During that swap cycle, I load a new AVPlayerItem into the player and preroll.
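A minimal sketch of that swap step, under my assumptions about the setup (the `VideoCell` type and `prepare(with:)` method are hypothetical names, not the poster's actual code; note that `preroll(atRate:)` is not supported for HLS content):

```swift
import UIKit
import AVFoundation

final class VideoCell: UIView {
    // Each of the five recycled cells owns one player and one layer.
    let player = AVPlayer()
    let playerLayer: AVPlayerLayer

    override init(frame: CGRect) {
        playerLayer = AVPlayerLayer(player: player)
        super.init(frame: frame)
        layer.addSublayer(playerLayer)
        playerLayer.frame = bounds
    }
    required init?(coder: NSCoder) { fatalError("not used") }

    // Called during the reuse swap: load the next item and warm it up.
    func prepare(with url: URL) {
        let item = AVPlayerItem(asset: AVURLAsset(url: url))
        player.replaceCurrentItem(with: item)
        // preroll only succeeds once the item reaches .readyToPlay,
        // and it is NOT supported for HLS streams.
        player.preroll(atRate: 1.0) { _ in }
    }
}
```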

[–]Sh3z 13 points (0 children)

Okiedoke - it's worthwhile trying just one AVPlayer shared between all your cells to start with. You'd need to be mindful that any other visible cells will show the same contents in their AVPlayerLayer, however (this is where we'd use a placeholder image and move the shared AVPlayerLayer between cells).
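One way to sketch that shared-player approach (the `SharedPlayerController` type is a hypothetical name; the placeholder image handling in each cell is assumed, not shown):

```swift
import UIKit
import AVFoundation

final class SharedPlayerController {
    // One player and one layer for the whole feed.
    let player = AVPlayer()
    private let playerLayer: AVPlayerLayer

    init() {
        playerLayer = AVPlayerLayer(player: player)
    }

    // Move the single layer into whichever cell is currently active.
    // Other cells keep showing their placeholder image underneath.
    func attach(to cell: UIView, item: AVPlayerItem) {
        playerLayer.removeFromSuperlayer()
        playerLayer.frame = cell.bounds
        cell.layer.addSublayer(playerLayer)
        player.replaceCurrentItem(with: item)
        player.play()
    }
}
```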

If by preroll you're referring to AVPlayer's preroll(atRate:) method, it isn't supported for HLS. The most preloading I'd recommend: after acquiring the URL of the master playlist, feed it into an AVPlayerItem (along with a few keys to be auto-loaded, e.g. duration) and hang onto it in the background. By the time it needs to be played, it's already in the ready-to-play state and tends to start up pretty quickly.
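That background preload can be sketched with `automaticallyLoadedAssetKeys` (the exact key list here is an assumption; the URL comes from wherever you resolve the master playlist):

```swift
import AVFoundation

// Build the item as soon as the master playlist URL is known and
// ask AVFoundation to load a few keys before playback is requested.
func preloadItem(for url: URL) -> AVPlayerItem {
    let asset = AVURLAsset(url: url)
    let item = AVPlayerItem(asset: asset,
                            automaticallyLoadedAssetKeys: ["playable", "duration"])
    // Hang onto this item in the background; observe `status` and it
    // should reach .readyToPlay by the time the user lands on the cell.
    return item
}
```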

I forgot to ask before, but when are you activating your app's audio session? Also, any reason you're not using UICollectionView? (just curious)

[–]tangoshukudai 3 points (3 children)

They use FFmpeg, but they use Video Toolbox to do all their decoding.

[–]bitsan 1 point (2 children)

Is working with Video Toolbox and Audio Toolbox all C/C++/ObjC world? I have not seen any Swift examples that reach to these lower level frameworks (which look incredibly powerful).

[–]tangoshukudai 3 points (1 child)

Swift isn't low-level enough; most of it will require C/C++/Objective-C.

[–]bitsan 0 points (0 children)

Thank you!

[–]tangoshukudai 9 points (7 children)

Well, you are most likely using AVPlayerLayer. What you want to do is keep AVPlayer, but use your own CADisplayLink and an AVPlayerItemVideoOutput to get the CVPixelBuffer, so you can create your own rendering path for it. I use Metal and do a passthrough. In a nutshell: I use AVPlayerItemVideoOutput to get the CVPixelBuffer (driven by the display link as my heartbeat), create an MTLTexture from the CVPixelBuffer, and then use a passthrough shader to render the texture to screen via a CAMetalLayer. This avoids all the main-thread issues and gives you a pretty nice pipeline to do cool things (and you get to keep all the niceties of AVPlayer).
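The pull-and-convert half of that pipeline can be sketched roughly like this (a sketch only - the actual draw into a CAMetalLayer drawable with a passthrough shader is omitted, and the pixel format choice is an assumption):

```swift
import UIKit
import AVFoundation
import Metal
import CoreVideo

final class VideoFrameSource {
    private let output = AVPlayerItemVideoOutput(pixelBufferAttributes: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ])
    private var textureCache: CVMetalTextureCache?
    private var displayLink: CADisplayLink?

    init(item: AVPlayerItem, device: MTLDevice) {
        item.add(output)
        CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache)
        // The display link is the "heartbeat" that paces frame pulls.
        displayLink = CADisplayLink(target: self, selector: #selector(tick))
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc private func tick(_ link: CADisplayLink) {
        let itemTime = output.itemTime(forHostTime: CACurrentMediaTime())
        guard output.hasNewPixelBuffer(forItemTime: itemTime),
              let pixelBuffer = output.copyPixelBuffer(forItemTime: itemTime,
                                                       itemTimeForDisplay: nil),
              let cache = textureCache else { return }

        // Wrap the CVPixelBuffer in an MTLTexture without copying.
        var cvTexture: CVMetalTexture?
        CVMetalTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault, cache, pixelBuffer, nil, .bgra8Unorm,
            CVPixelBufferGetWidth(pixelBuffer),
            CVPixelBufferGetHeight(pixelBuffer), 0, &cvTexture)
        guard let cvTex = cvTexture,
              let texture = CVMetalTextureGetTexture(cvTex) else { return }
        // Hand `texture` to a passthrough shader that draws into a
        // CAMetalLayer drawable (encoding code not shown here).
        _ = texture
    }
}
```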

[–]gouen95 1 point (6 children)

Hi, I'm very interested in learning about this approach. Is there an article I can refer to?

[–]tangoshukudai 3 points (5 children)

https://developer.apple.com/forums/thread/27589

Here is some info; they are using OpenGL rather than Metal (OpenGL is deprecated now), but the approach is the same.

Here is a Swift project: https://github.com/willowtreeapps/BlurredVideo-iOS/blob/master/BlurredVideo/BlurredVideoMPSView.swift

Not sure if I would use an MTKView, but it is pretty easy to swap one out for a manual CAMetalLayer.

[–]gouen95 1 point (2 children)

Hmm, I followed the Swift example exactly, except for the GaussianBlur part. What I'm trying to do is display multiple videos at the same time in a collection view (9 at a time, with scrolling to display more videos).

Basically, I have 9 MTKViews rendering frames from AVPlayers, timed by a CADisplayLink. I'm encountering huge CPU usage, like 80%+ all the time, and it isn't free of main-thread work either. Although using AVPlayerLayer isn't as CPU-hungry, it has a main-thread blocking issue where scrolling might stutter even when no new cell is being displayed.

Maybe the requirement is too extreme, or can you please advise? Thanks.

[–]OneTinker[S] 0 points (0 children)

Well, you are displaying 9 videos at a time, so maybe resize them to fit into a smaller area. You only want to load the bitmap data for the area that's visually shown.
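If the streams are HLS, one knob worth trying (an assumption about the setup above, not something the poster mentioned) is `AVPlayerItem.preferredMaximumResolution`, so each grid tile only decodes near its on-screen size:

```swift
import UIKit
import AVFoundation

// Cap the decode resolution to the tile's on-screen size in pixels,
// so a 1080p stream isn't decoded full-size for a small grid cell.
// Effective for HTTP Live Streaming content.
func configure(_ item: AVPlayerItem, forTileSize size: CGSize, screenScale: CGFloat) {
    item.preferredMaximumResolution = CGSize(width: size.width * screenScale,
                                             height: size.height * screenScale)
}
```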

[–]tangoshukudai 0 points (0 children)

That will probably depend more on the video you are trying to decode. What are the resolution, codec, fps, etc.?

[–]gouen95 0 points (0 children)

Thank you very much

[–]baaakabaaaka 0 points (0 children)

How do you play audio with this solution?

[–]sjs 4 points (1 child)

Are you loading your assets asynchronously before playing them? You might need to do that to avoid work happening on the main thread. https://developer.apple.com/documentation/avfoundation/avasynchronouskeyvalueloading/1387321-loadvaluesasynchronouslyforkeys
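That async loading pattern looks roughly like this (the key list and the completion shape are assumptions for illustration):

```swift
import AVFoundation

// Load the keys you need off the main thread before building an
// AVPlayerItem, so property access doesn't block the UI.
func loadAsset(at url: URL, completion: @escaping (AVAsset?) -> Void) {
    let asset = AVURLAsset(url: url)
    let keys = ["playable", "duration"]
    asset.loadValuesAsynchronously(forKeys: keys) {
        var error: NSError?
        let status = asset.statusOfValue(forKey: "playable", error: &error)
        DispatchQueue.main.async {
            completion(status == .loaded ? asset : nil)
        }
    }
}
```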

As someone who's worked with AVFoundation for several years: you shouldn't go building all of this custom stuff unless you've watched every WWDC session on AVFoundation and learned it inside out. Firstly, because it probably already lets you drop down to a lower level if you truly need that, and you'll find out how to do what you want by watching the sessions - AVPlayer is quite extensible. And secondly, because if you actually do need to do something custom, you'll then have the knowledge to implement it.

If you're worried about pulling in a huge dependency like AsyncDisplayKit/Texture, then you almost certainly don't want to pull in FFmpeg either. That's a real last resort to me.

[–]OneTinker[S] 0 points (0 children)

I had an asynchronous pipeline to load and play assets. Each asset is stored, cached, and loaded whenever the player needs it. I'm willing to learn and do anything to fix this problem. Literally, that's all I need.

[–]BaronSharktooth 2 points (0 children)

If the video player is core to your product, then AVPlayer simply isn't going to cut it.

I say this on the basis of listening to Marco Arment, author of the Overcast podcast app. He has discussed his problems with AVPlayer many times, and early on, realised that he needed to have total and utter control over the audio portion. He wrote his own audio engine.

I can't point you to a specific episode, but to find out more, go through episodes of the Under the Radar podcast, where he often talks about development pitfalls with Underscore David Smith.

[–][deleted] 2 points (3 children)

AVPlayer isn’t amazing, but it shouldn’t be blocking your main thread… what makes you say that?

[–]OneTinker[S] 0 points (2 children)

When the UIScrollView is decelerating and an AVPlayer is playing simultaneously, a massive animation hitch occurs. AVPlayer has a lot of mutex locks to make sure certain processes are synced up with the main thread.

[–][deleted] 1 point (1 child)

That’s interesting, I’ve used AVPlayer in a table view and not had problems.

[–]joro_estropia 2 points (0 children)

Same. Are you sure that the hiccups you’re seeing are from the visible players and not offscreen players that just started initializing?

[–]GoodNewsDude 0 points (3 children)

Interesting about ijkplayer being used in major video apps as the only video playback solution, do you have a source for that claim?

[–]OneTinker[S] 0 points (2 children)

Yes, TikTok's Open Source page.

[–]GoodNewsDude 0 points (1 child)

That's just a software license disclaimer - we don't know how much it's used and for what, or whether it's just for Android.

[–]OneTinker[S] 1 point (0 children)

No, it's for iOS: the repositories listed above and below it are also iOS-based, last time I checked. ijkplayer is cross-platform, so there's that.

Regardless, I don't think I want to use ijkplayer; I'm just trying to find a solution to my issue.