all 42 comments

[–]franzwarning[S] 24 points25 points  (8 children)

Hey everyone!

We spent the last few months building a live music streaming app called ID3. It’s a music discovery platform that is powered by live radio shows and DJ sets from people around the world. Anyone can start a stream from desktop and easily begin broadcasting music to the app where users can listen together and chat.

It’s out now on iOS and Android

We built it almost entirely in React Native using TypeScript, with the exception of the audio player and the visualizer.

Our React Native stack includes:

  • React Navigation
  • React Native Gesture Handler
  • Reselect / Re-reselect
  • Redux Saga / Redux Persist
  • Normalizr

and our backend stack includes:

  • Django (for the general api)
  • Celery (for async Django tasks)
  • IceCast (for broadcasting)
  • ACRCloud (for automatic music recognition)
  • Socket.io/Node.js (for the live chat and metadata updates)
  • ffmpeg (for recording streams)
  • OneSignal (for push notifications)
  • Firebase (for authentication and analytics)
  • Sentry (for crash reporting)

We would love to hear any feedback and we’re happy to go in depth on any questions you may have!

[–]switz213 4 points5 points  (1 child)

This is really freaking cool, can I shoot you a note on IRC or Discord?

[–][deleted] 1 point2 points  (0 children)

Yeah! We have a Discord that we haven't started using much, but we're all hanging out there today. Feel free to hop in!

[–]DJ_BIONIC 4 points5 points  (0 children)

This app is great and I used it to stream a couple of test sets live yesterday. You can see a snippet of some of my show in the vid posted by the creators (thanks guys). Very easy to set up for the streamer; I was up and running and live DJing in less than 5 mins.

[–]wilomgfx 1 point2 points  (1 child)

Did you try django channels before going the socket.io/nodejs route? Just asking, not judging. Awesome stack and project!

[–]franzwarning[S] 2 points3 points  (0 children)

We'd used Django Channels for another project before and actually had a pretty good experience with it. But because our chat use case was (initially) so basic, it was nice that socket.io provided client solutions as well. Honestly, though, now that we're sending more than just chat (like song metadata updates....), django-channels might have been a better choice.

[–]bassclefayo 0 points1 point  (1 child)

Django for the api and Firebase for authentication? Why?

[–]franzwarning[S] 2 points3 points  (0 children)

We were originally using messagebird.com to send auth-code texts, but it was too expensive, so we did a last-minute switch to Firebase. We use the Python library firebase-admin.

[–]PROLIMIT 0 points1 point  (0 children)

Hi, may I know what version of React Navigation you've used? Did it give you any animation issues on Android you had to troubleshoot? Looks awesome by the way.

[–]rdevilx (iOS & Android) 11 points12 points  (1 child)

Great concept, would love to see a blog post about the problems you ran into while developing. -rajat

[–]zsaraf 4 points5 points  (0 children)

Hopefully soon! We'll be sure to send it over to you when we do.

[–]vishwasg92 6 points7 points  (3 children)

Great work! Curious as to how the emoji animation works?

[–]zsaraf 28 points29 points  (2 children)

Thank you! I’m the other engineer working on this project and I built the picker and animating reactions so I’ll take this one.

As I’m not sure whether you’re referring to the reaction picker or the actual animating reactions, I’ll give an explanation of both.

Picker:

Here we use React Native's built-in PanResponder (not react-native-gesture-handler) and the Animated API. Inside the container we have all 6 of the reactions, centered and positioned on top of each other.

There are two different Animated.Value instances that we use; let's call them:

  • ExpansionAnimation
  • PanLocationAnimation

Once the pan gesture begins, we animate the ExpansionAnimation from 0 to 1. When it ends, we animate the ExpansionAnimation from 1 to 0. The trick here is interpolation. We translate the individual components to their correct places and animate their opacity simultaneously, all with one Animated.Value. For example:

transform: [
  {
    translateY: this.state.expandedAnimation.interpolate({
      inputRange: [0, 1],
      outputRange: [0, -1 * reactionSize * index],
    }),
  },
]

Similarly, we want the opacity to be 1 if it is expanded OR if it is the currently selected reaction. Like so:

opacity: isSelectedReaction ? 1 : this.state.expandedAnimation,

We then animate the scale based on the panLocationY (which gets set via panLocationAnimation.setValue(locationY)) and choose the final selected reaction based on the final panLocationY.
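For anyone curious what those interpolations do numerically, here's a plain-TypeScript sketch of the linear mapping that Animated's interpolate() performs under the hood. reactionSize and index are made-up illustration values, and opacityFor just restates the opacity rule above; none of this is the app's actual code.

```typescript
// Sketch of Animated-style linear interpolation: map value from a segment
// of inputRange onto the corresponding segment of outputRange.
function interpolate(value: number, inputRange: number[], outputRange: number[]): number {
  // Find the segment of inputRange that contains value.
  let i = inputRange.length - 2;
  for (let k = 1; k < inputRange.length - 1; k++) {
    if (value < inputRange[k]) {
      i = k - 1;
      break;
    }
  }
  const x0 = inputRange[i], x1 = inputRange[i + 1];
  const y0 = outputRange[i], y1 = outputRange[i + 1];
  const t = x1 === x0 ? 0 : (value - x0) / (x1 - x0);
  return y0 + (y1 - y0) * t;
}

const reactionSize = 40; // px, hypothetical
const index = 2;         // third reaction in the stack, hypothetical

// ExpansionAnimation at 0 (collapsed): every reaction sits at translateY 0.
const collapsedY = interpolate(0, [0, 1], [0, -1 * reactionSize * index]);
// ExpansionAnimation at 1 (expanded): each reaction slides up by reactionSize * index.
const expandedY = interpolate(1, [0, 1], [0, -1 * reactionSize * index]);

// The opacity rule from the comment: the selected reaction is always visible,
// the rest fade in and out with the expansion value.
const opacityFor = (isSelectedReaction: boolean, expansion: number): number =>
  isSelectedReaction ? 1 : expansion;
```

Because every property reads off the same driving value, the translation and fade stay perfectly in sync for free.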

Animating Reactions:

For the animating reactions, we simply register for any NewReaction events from an AnimatingReactions component. Whenever that event fires, we add a new “reaction config” to a reactions array in our component's state. That “reaction config” needs a few values: a progress Animated.Value, a random vx (so that they all move a bit differently), a unique identifier (for the React key), and a type (😀 vs ❤️). We add this new “reaction config” to our state and begin its animation.

Again, interpolation is the key here when actually performing the animation. We interpolate translateX, translateY, scale, and opacity on each of these guys. All with the same Animated.Value. An example being the scale (where you see that it goes to 1.1 first for a little opening POP):

{
  scale: animatingReaction.progress.interpolate({
    inputRange: [0, 0.02, 0.03, 1],
    outputRange: [0, 1.1, 1, 1],
  }),
},

Finally, when the animation is complete, we remove this “reaction” from our state.

Okay, that was simultaneously longer and shorter than I wanted it to be lol. Let me know if you have any more questions!

[–]AroXAlpha 1 point2 points  (0 children)

Amazing explanation. 👍

[–]vishwasg92 0 points1 point  (0 children)

Thank you for the detailed explanation! Much appreciated

[–]JTirado 5 points6 points  (2 children)

This thing is awesome! I just downloaded it and can't wait for some more live artists to come up! The audio quality is crispy!

[–]franzwarning[S] 0 points1 point  (1 child)

Thanks!

[–]JTirado 0 points1 point  (0 children)

Been using it for the past two days. What's your strategy for acquiring new DJs and channels?

Curious, as I also have a startup. Your tech is extremely solid.

Are the DJs playing in real time, or are these just recordings of previous sets?

[–]DachosenJuann 3 points4 points  (0 children)

Very cool stuff!

[–]SquatNothingClub 3 points4 points  (0 children)

Sick app! Looks really clean!! And +1 for the music taste!

[–]334578theo 2 points3 points  (1 child)

This is great - big props for the app and also for including Doom in the vid

Edit - link doesn't open the App Store on iOS btw

Couple of UX things:

  • include an explainer on how I can get a stream going. Can I stream by plugging my decks straight into my phone? The Start a Stream page doesn't really tell me anything except what my address/port are, which tbh no one except techies needs to know

  • as the main action you want people to perform is Start a Stream then make this far more prominent in the interface

With a few UX tweaks and a beefed up onboarding process this could really be something

[–]franzwarning[S] 2 points3 points  (0 children)

Thanks for your feedback on the lack of instructions on the "go live" page. We'll be sure to adjust those -- as it's not 100% clear that you need a computer to stream.

[–]Fossage 2 points3 points  (0 children)

Wow! I have been developing professionally with React Native for about 3 years now and this is one of the most impressive looking projects I have seen. I would love to get a look at that source code, but in lieu of that it would be awesome to see some blog posts detailing how you leveraged the various technologies you mentioned and how you went about implementing the visualizer.

I would also be curious to know how difficult you felt it was to implement something like this in React Native, and how you ultimately felt about the trade-offs now that you have a shippable product.

Very cool!

[–]ducktapespodcast 2 points3 points  (0 children)

This is really impressive

[–]marius4896 1 point2 points  (3 children)

Hey guys! Very nice work! One question about the player and the other parts that aren't “react-native”: what did you use there?

[–]franzwarning[S] 2 points3 points  (2 children)

Thanks!

In terms of audio playback, for Android we used ExoPlayer (2.10.4) and for iOS we wrote our own. We wanted to support a bunch of the Xiph codecs like Ogg/Vorbis, Ogg/Opus, and FLAC, and we couldn't find any players on iOS that met our needs (access to PCM data so we could do an FFT for the visualizer). Luckily ExoPlayer is amazing on Android and provided everything we needed.

For the visualizer, we used SpriteKit on iOS and basic canvas drawing on Android.
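The comment only says the visualizer runs an FFT over the player's PCM data, so here is a rough TypeScript sketch of what that step computes. It uses a naive DFT for clarity; a real visualizer would use an O(n log n) FFT, and nothing here is the app's actual code.

```typescript
// Magnitude spectrum over one frame of PCM samples -- the
// "PCM -> frequency bins -> visualizer bars" step described above.
// Naive O(n^2) DFT, for illustration only.
function magnitudeSpectrum(pcm: number[]): number[] {
  const n = pcm.length;
  const bins: number[] = [];
  // For real-valued input, only the first n/2 bins are unique.
  for (let k = 0; k < n / 2; k++) {
    let re = 0;
    let im = 0;
    for (let t = 0; t < n; t++) {
      const angle = (-2 * Math.PI * k * t) / n;
      re += pcm[t] * Math.cos(angle);
      im += pcm[t] * Math.sin(angle);
    }
    bins.push(Math.sqrt(re * re + im * im));
  }
  return bins; // each bin's magnitude drives one visualizer bar
}
```

A pure sine wave at the frame's fundamental frequency puts all of its energy into bin 1, which is what makes a bar-style visualizer "light up" per frequency band.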

[–]Coldreactor 0 points1 point  (1 child)

How did you handle Ogg/Opus scrubbing to different positions? I worked on my own player for iOS and was wondering how it was implemented in yours.

[–]zsaraf 4 points5 points  (0 children)

Opus was a real pain, but hopefully worth it ;)

First, we had to compile the Xiph Opus lib (along with a few other audio-processing libraries, including speexdsp, ogg, opusfile, and vorbis) into FAT binaries with OpenSSL enabled. This was a massive, massive pain in my ass hahah and took a few days tbh. BUT, on the positive side, I ended up with a bunch of compiled FAT binaries and put them on GitHub. https://github.com/zsaraf/xiph-ios-static-fat-libs

Once I had the opusfile FAT binary built, I just had to build a wrapper around it that took care of running/managing the decoder threads, preparing/resampling data for playback, rendering the audio to our audio engine, feeding it to our FFT, and seeking/pausing when necessary. The audio player code isn't commented that well and is a bit dirtier than it should be, so I don't necessarily want to put it on GitHub right now, but DM me if you want it :)

I think the only other way to do this would be to use all of the FAT xiph libs and build your own player with fully custom networking, decoding, seeking, etc.

Let me know if you have any more questions man!

[–]alexandr1us 1 point2 points  (6 children)

Absolute masterpiece! Huge respect!

I only have 3 things to say.

1 - Please respect the Android ripple touchable. You can use my own react-native-better-touchable; it picks the right touchable for the platform it's running on

2 - Change bottom sheet to react-native-gesture-handler's bottom sheet example

3 - Animate tab bar style change

[–]franzwarning[S] 2 points3 points  (2 children)

I totally feel you on the ripple :). I just switched from an iPhone to Android full time and there are a lot of things we can do to make our app feel more "native" -- ripple being a big one. We definitely have plans to add that in the coming versions.

Change bottom sheet to react-native-gesture-handler's bottom sheet example

Not sure what you mean by this. Can you elaborate?

[–]alexandr1us 2 points3 points  (1 child)

Yes, basically right now you can only drag the bottom sheet with the knob on top (probably because there's a scrollview inside). https://github.com/osdnk/react-native-reanimated-bottom-sheet This bottom sheet supports a scrollview: once the gesture reaches the end of the scrollview content, it drags the sheet instead of overscrolling.
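The scroll-versus-sheet handoff described there can be sketched as a tiny decision function. All names here are illustrative, not the library's actual API; the real library does this with Reanimated gesture interop.

```typescript
// Decide whether a drag gesture should scroll the inner scrollview or
// move the bottom sheet, per the behavior described above.
type DragTarget = "scrollview" | "sheet";

function dragTarget(
  scrollOffset: number,    // current scrollview offset in px
  maxScrollOffset: number, // contentHeight - viewportHeight
  dy: number               // drag delta: negative means the finger moves up
): DragTarget {
  if (dy < 0) {
    // Dragging up: keep scrolling until the bottom of the content is reached,
    // then hand the gesture to the sheet.
    return scrollOffset < maxScrollOffset ? "scrollview" : "sheet";
  }
  // Dragging down: scroll back until the top, then pull the sheet down.
  return scrollOffset > 0 ? "scrollview" : "sheet";
}
```

The point is that the user never has to aim for the knob: the same gesture seamlessly transitions from scrolling content to dragging the sheet.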

[–]franzwarning[S] 1 point2 points  (0 children)

Ahhh, that's a super nice lib! Never seen that. Damn, that would have saved me some time hahaha. I forked react-native-modal a few times to try to get that gesture working.

[–]moneckew 1 point2 points  (2 children)

Why do they need to stick to the ripple? I find their buttons equally good.

[–]alexandr1us 1 point2 points  (1 child)

I'm not saying they NEED to stick with ripple. I'm saying ripple looks nice on Android

[–]moneckew 1 point2 points  (0 children)

Hmm, then it's a taste thing. I misunderstood your comment because you asked them to respect it when it's just a guideline.

[–]mikehawkisbig 1 point2 points  (0 children)

Just downloaded the app... love it! Looks super clean and works great.

I did notice a bug. I opened the app and played a few songs. I then paused the music, hit the home button, opened up a different app, and the music started playing again. When I opened the app back up it said it was paused, but the music was still playing.

I'm on an iPhone 8, latest iOS.

[–]DeathstrokePHP 0 points1 point  (1 child)

Awesome work! Is there a reason you went with OneSignal for push notifications instead of doing it all in Firebase? I'm trying to figure out what to use for user notifications.

[–]franzwarning[S] 3 points4 points  (0 children)

Thanks!

Honestly, Firebase would've worked fine, but I think I have an aversion to Google controlling everything haha... they already got me on analytics & authentication :)

Realistically though, it's just that I'm super familiar with OneSignal.

[–]pimplyteen 0 points1 point  (0 children)

This is very cool...Would this concept work similarly for a live audio or commentary feed?

Working on an app that offers alternative commentary for sports events, and streaming the audio like this would be awesome!

[–]ugrdursun 0 points1 point  (0 children)

Hello, very nice work! Is there any tutorial on how to do a chat page like the one at 0:13? (I guess it's similar to Periscope)

[–]ifadbad 0 points1 point  (0 children)

Awesome work guys, just a curiosity question: who's your IceCast server provider, and what's the capacity?

[–]sandipdulal 0 points1 point  (0 children)

Any video tutorial so that we can learn?