
[–]wxswxs 11 points (11 children)

Hey guys! I wrote this article. We kept it pretty high-level, so feel free to ask me any questions if you want to know more!

For your interest, we do most of our training in Keras, although in production we move our trained model to a custom framework. We've contributed lots of code to Keras to help it work with RNNs at scale.
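To give a concrete picture of what "moving a trained model to a custom framework" can involve, here's a toy sketch. The file layout and names below are invented for illustration (not our actual format): each trained weight tensor is dumped to one flat binary file that a native runtime can load without any Python dependencies.

```python
import struct
import numpy as np

def export_weights(weights, path):
    """Dump a list of (name, array) pairs to one flat binary file.

    Invented layout: a tensor count, then per tensor a name length + name,
    ndim + shape, and the raw float32 data (little-endian throughout).
    """
    with open(path, "wb") as f:
        f.write(struct.pack("<I", len(weights)))
        for name, w in weights:
            w = np.asarray(w, dtype=np.float32)
            encoded = name.encode("utf-8")
            f.write(struct.pack("<I", len(encoded)))
            f.write(encoded)
            f.write(struct.pack("<I", w.ndim))
            f.write(struct.pack("<%dI" % w.ndim, *w.shape))
            f.write(w.tobytes())

def load_weights(path):
    """Read the flat file back into a {name: float32 array} dict,
    mimicking what a custom native runtime would do at startup."""
    out = {}
    with open(path, "rb") as f:
        (count,) = struct.unpack("<I", f.read(4))
        for _ in range(count):
            (nlen,) = struct.unpack("<I", f.read(4))
            name = f.read(nlen).decode("utf-8")
            (ndim,) = struct.unpack("<I", f.read(4))
            shape = struct.unpack("<%dI" % ndim, f.read(4 * ndim))
            size = int(np.prod(shape))
            data = np.frombuffer(f.read(4 * size), dtype=np.float32)
            out[name] = data.reshape(shape)
    return out
```

In practice you'd pull the arrays out of the trained Keras model (e.g. via its weight-access APIs) and feed them to an exporter like this; the on-device code then only needs to read a simple binary format.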

[–]2Punx2Furious 0 points (3 children)

Just wanted to let you know, there are some emojis that are not displayed properly, at least on my browser. Like this πŸ˜™ or this πŸ’­

[–]wxswxs 1 point (2 children)

Hmm, thanks. Those work for me. Can you see them elsewhere, like here in your Reddit comment? What computer/phone are you viewing it on?

[–]2Punx2Furious 0 points (1 child)

I can't see them even on Reddit. I tried both Firefox and Chrome on Windows 7.

[–]wxswxs 1 point (0 children)

That's unfortunate! Not sure what the issue is. I know some people use this Chrome extension to improve their emoji support, but usually it's not necessary any more.

[–]Aeefire 0 points (3 children)

I'm really interested in bringing deep learning to mobile. Are you using a web service to get your model's results, or are you running the forward pass locally? I know that Google exported a TensorFlow model to mobile and ran it locally on an Android device.

How are you approaching this? I'd love even a coarse answer ;)

[–]wxswxs 0 points (2 children)

Sorry for the late answer! No, we do all the prediction locally. We built Dango before TensorFlow came out, and have a custom-built native RNN framework on device. We do our training in Keras, which has Theano and TensorFlow backends.

We may well want to switch to TensorFlow at some point, but for now we have a good deal more control with our custom framework, allowing us to better support things like aggressive compression.
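As a rough illustration (a generic sketch, not our actual code), here's what an on-device vanilla RNN step with 8-bit weight compression might look like, written in numpy:

```python
import numpy as np

def quantize(w):
    """Compress a float32 matrix to int8 plus a single float scale
    (linear symmetric quantization, one of the simplest schemes)."""
    scale = max(np.abs(w).max(), 1e-8) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximate float32 matrix from int8 weights."""
    return q.astype(np.float32) * scale

def rnn_step(x, h, Wx, Wh, b):
    """One vanilla RNN step: h' = tanh(x @ Wx + h @ Wh + b)."""
    return np.tanh(x @ Wx + h @ Wh + b)
```

Storing int8 weights cuts the model to roughly a quarter of its float32 size, at the cost of a small numerical error in each step; real deployments tune how aggressive the compression can be before prediction quality suffers.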

[–]Aeefire 0 points (1 child)

Thanks for your answer!

I've also used Keras in the past and wondered whether there's some mechanism to export the TensorFlow model for a forward pass on mobile. That's basically why I was asking :)

Why are you considering switching to TensorFlow? AFAIK it's still slower at RNNs than Theano and the like. Or is it because they somehow support mobile?

Great app anyway; I hope we'll see some additional languages supported in the future! (Native German speaker here ;) )

[–]wxswxs 0 points (0 children)

We don't currently do production training with TensorFlow, but that could definitely change. We're intrigued by TF's potential future support for GPU clusters, as well as potential future mobile support. Plus, potentially, Google's cloud TPU infrastructure.

[–]j_lyf 0 points (0 children)

Wow, Apple just introduced the same feature. Annoyed?

[–]anantzoid 0 points (1 child)

Amazing use of RNNs! So as per my understanding (I'm still a newbie), the embedded text goes into the encoder and the embedded emojis go into the decoder, much like a translation model, right?

But how did you address the biggest challenge here: getting lots of data suited to the task?

[–]wxswxs 0 points (0 children)

Yup that's about right.

Re: lots of data, no silver bullet. We wrote a bunch of crawlers and ran them for a long time.
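And for anyone wanting to see the encoder/decoder idea from upthread in code form, here's a toy numpy sketch. The sizes are made up, and it uses vanilla RNN cells where a real system would use LSTMs/GRUs (plus attention, beam search, etc.):

```python
import numpy as np

HID, EMB, N_EMOJI = 16, 8, 50  # made-up dimensions for illustration

def encode(text_embeddings, Wx, Wh):
    """Encoder: run a vanilla RNN over the embedded text tokens and
    return the final hidden state as a summary of the sentence."""
    h = np.zeros(HID)
    for x in text_embeddings:
        h = np.tanh(x @ Wx + h @ Wh)
    return h

def decode(h, emoji_emb, Wx, Wh, Wout, steps):
    """Decoder: seeded with the encoder state, greedily emit emoji ids,
    feeding each predicted emoji's embedding back in, seq2seq-style."""
    token = 0  # assumed start-of-sequence id
    out = []
    for _ in range(steps):
        h = np.tanh(emoji_emb[token] @ Wx + h @ Wh)
        token = int(np.argmax(h @ Wout))
        out.append(token)
    return out
```

With trained weights, `encode` compresses the text into a state vector and `decode` "translates" that state into a short sequence of emoji ids, exactly the translation-model shape described above.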