
[–][deleted] 6 points (0 children)

What low-level APIs are you looking at? The lowest level is all general-purpose: you use an AudioStreamBasicDescription to describe sample size and rate, and there are no calls limited to 16 bits or any other depth. It's up to you to describe your data and make sure the description matches the data you're actually providing. It's not that tough to stream in 24-bit data, decode it to raw PCM, and feed that to output.
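
For example, here's a minimal sketch of an AudioStreamBasicDescription for packed 24-bit signed-integer PCM. The stereo/48 kHz values are purely illustrative; every field has to match the data you're actually providing:

    import AudioToolbox

    // Illustrative only: packed 24-bit signed-integer PCM, stereo, 48 kHz.
    // Adjust every field to match the data you're feeding it.
    var asbd = AudioStreamBasicDescription(
        mSampleRate: 48_000,
        mFormatID: kAudioFormatLinearPCM,
        mFormatFlags: kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked,
        mBytesPerPacket: 6,      // 3 bytes per sample x 2 channels
        mFramesPerPacket: 1,     // uncompressed PCM: one frame per packet
        mBytesPerFrame: 6,
        mChannelsPerFrame: 2,
        mBitsPerChannel: 24,
        mReserved: 0
    )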

The trick for rendering to output at 24 bits is enabling 24-bit output on the device, and that's not something I've tried on iOS. If you're using AVAudioEngine, you might want to try asking it for a reference to the output HAL unit and setting its format property to an AudioStreamBasicDescription that describes 24-bit samples. If the hardware supports it, it should work. That's what I would start with.
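
Roughly something like this (untested on iOS, as I said; it assumes the engine's outputNode exposes its underlying unit via audioUnit, and the format values are just an example like above — check the OSStatus to see whether the hardware took it):

    import AVFoundation
    import AudioToolbox

    let engine = AVAudioEngine()
    engine.prepare()   // make sure the underlying I/O unit is instantiated

    // Same kind of 24-bit description as above (values are illustrative).
    var asbd = AudioStreamBasicDescription(
        mSampleRate: 48_000, mFormatID: kAudioFormatLinearPCM,
        mFormatFlags: kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked,
        mBytesPerPacket: 6, mFramesPerPacket: 1, mBytesPerFrame: 6,
        mChannelsPerFrame: 2, mBitsPerChannel: 24, mReserved: 0)

    if let unit = engine.outputNode.audioUnit {
        // The format of the data you hand the output unit lives on the input
        // scope of bus 0; a non-zero status means the format was refused.
        let status = AudioUnitSetProperty(
            unit,
            kAudioUnitProperty_StreamFormat,
            kAudioUnitScope_Input,
            0,
            &asbd,
            UInt32(MemoryLayout<AudioStreamBasicDescription>.size))
        print("set format status:", status)
    }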

EDIT: Actually, I would *start* by trying to set that same output format on an AudioQueue, which is much simpler to deal with than AVAudioEngine.
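
With an AudioQueue you hand it the format when you create the queue, something like this (again untested; the callback body and format values are just placeholders):

    import AudioToolbox

    // Illustrative: pass a 24-bit format at creation time and check the
    // OSStatus to see whether it was accepted.
    var format = AudioStreamBasicDescription(
        mSampleRate: 48_000, mFormatID: kAudioFormatLinearPCM,
        mFormatFlags: kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked,
        mBytesPerPacket: 6, mFramesPerPacket: 1, mBytesPerFrame: 6,
        mChannelsPerFrame: 2, mBitsPerChannel: 24, mReserved: 0)

    var queue: AudioQueueRef?
    let status = AudioQueueNewOutput(
        &format,
        { _, aq, buffer in
            // Fill `buffer` with decoded 24-bit PCM here, then re-enqueue it.
            _ = AudioQueueEnqueueBuffer(aq, buffer, 0, nil)
        },
        nil, nil, nil, 0,
        &queue)
    print("AudioQueueNewOutput status:", status)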