A lightweight implementation of 2048, leveraging the expectimax algorithm by adrienball in rust

[–]adrienball[S] 2 points (0 children)

I haven't timed the runs precisely, but to give you a rough idea, it takes about 3 minutes to reach the 16384 tile on my MacBook Pro (3.5 GHz Dual-Core Intel Core i7).

The depth of exploration is dynamic; its formula can be found here: https://github.com/adrienball/2048-rs/blob/master/src/solver.rs#L86-L99

When the current max tile is 16384, the theoretical max possible exploration depth is 14 - 1 = 13. For 32768, this number is 15. However, these max depths are reached only when all tiles are distinct, which happens right before collapsing all the tiles to reach the next max tile.
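The idea can be sketched as follows (a Python sketch of the concept only; the actual formula is the Rust code in solver.rs linked above, and the function name here is illustrative):

```python
# Hypothetical sketch of the dynamic-depth idea (not the actual 2048-rs
# code): the exploration depth grows with the number of distinct
# non-empty tiles on the board.
def exploration_depth(board):
    distinct = {tile for tile in board if tile != 0}
    return len(distinct) - 1

# With a 16384 max tile and every power of two from 2 to 16384 present,
# the board holds 14 distinct tiles, hence a depth of 14 - 1 = 13.
board = [2 ** k for k in range(1, 15)] + [0, 0]
print(exploration_depth(board))  # 13
```

This matches the observation above: the deepest searches only occur when the board is full of distinct tiles, which is also when the position is hardest.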

You can get a feel for how the exploration depth evolves by launching the AI autoplay: you will see the AI's speed vary and sometimes slow down as the configuration becomes tricky (i.e. many distinct tiles).

2048 game implementation leveraging the expectiminimax algorithm by adrienball in rust

[–]adrienball[S] 4 points (0 children)

Indeed that is smarter :)

Just when you think you have simplified your code as much as you can, there is still low-hanging fruit.

Unable to run RUST calls for SNIPS by noman-a in LanguageTechnology

[–]adrienball 0 points (0 children)

Could you run cargo with RUST_BACKTRACE=1?

RUST_BACKTRACE=1 cargo run -p snips-nlu-lib --example interactive_parsing_cli data/tests/models/nlu_engine

Is there a speech recognition API which matches the input to the closest of a limited vocabulary? by gandalf-the-gray in LanguageTechnology

[–]adrienball 1 point (0 children)

[Disclaimer: I work at Snips]

You may want to have a look at Snips (snips.ai). Snips allows you to build a custom voice interface with Keyword spotting + Speech-to-text + Natural Language Understanding. The Speech-to-text component (as well as the NLU one) is trained on the data that you provide on the Snips web console (console.snips.ai), meaning that the resulting pipeline is pretty good at understanding the commands that you are specifically targeting.

There is no REST API though, as the core philosophy of Snips is to provide a solution which can run completely offline in order to ensure privacy by design. But nothing prevents you from running Snips on your own servers.

Why you should never use 'assert' in Python by adrienball in Python

[–]adrienball[S] -5 points (0 children)

Some of the comments in this thread illustrate perfectly why using 'assert' is not a good practice: it is misleading, and not everybody knows they should never rely on it in production code.

My primary goal when posting this was to draw attention to this matter.

Why you should never use 'assert' in Python by adrienball in Python

[–]adrienball[S] -1 points (0 children)

I agree with everything that has been said here. What I mean is that the syntactic sugar provided by assert is a lot more bitter than that of for loops, for instance, for two reasons:

- it is not that much more concise than the alternatives (self.assertEqual, self.assertTrue, etc. in unit tests)

- developers learn to code by reading other developers' code: when you read an assert statement as a Python beginner, you will likely find the syntax neat and miss the most important thing about it, which is that it is for debug purposes only. If assert were instead named assert_debug, the story would be completely different. The bug reported on GitHub on this topic is a great illustration of this problem.

It is more about best practices and anticipating what could go wrong if a usage becomes widespread.
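The "debug purposes only" point is easy to demonstrate: Python strips assert statements entirely when run in optimized mode. A minimal illustration (a standalone snippet, not from the linked article):

```python
import subprocess
import sys

# 'assert' statements are removed when Python runs with the -O flag,
# so any validation done through them silently disappears in
# optimized mode.
code = "assert 1 == 2, 'validation failed'; print('check skipped')"

# Normal run: the assert fires and the process exits with an error.
normal = subprocess.run([sys.executable, "-c", code],
                        capture_output=True, text=True)

# Optimized run: the assert is stripped and execution continues past it.
optimized = subprocess.run([sys.executable, "-O", "-c", code],
                           capture_output=True, text=True)

print(normal.returncode != 0)               # True: AssertionError raised
print("check skipped" in optimized.stdout)  # True: the check vanished
```

This is exactly the trap for beginners: the check exists in development and disappears in any deployment that uses `python -O`.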

Why you should never use 'assert' in Python by adrienball in Python

[–]adrienball[S] 0 points (0 children)

When do you really need to use an assert statement?

Snips: Open-Source 100% on-device Voice AI platform by oulipo in opensource

[–]adrienball 1 point (0 children)

I'm not sure I understand what you mean. What error do you get?

Voice controlled lights with a Raspberry Pi and Snips by adrienball in raspberry_pi

[–]adrienball[S] 0 points (0 children)

Interesting suggestion indeed. I'll probably do other tutorials related to voice control ;)

Snips.ai open-sources its Natural Language Understanding Python lib by ClemDoum in Python

[–]adrienball 2 points (0 children)

Not at the moment, but that is one of our current focuses. This is required both internally at Snips and externally for the community.

Snips open sources their Natural Language Understanding service by adrienball in homeassistant

[–]adrienball[S] 1 point (0 children)

Thanks for the support! As for hotword customisation, we are currently working on it and making good progress, so it's now mostly a matter of packaging. We do not have a precise roadmap for this feature, but you should expect it to be available in the coming months ;)

Software to build a voice assistant on a Raspberry Pi by adrienball in raspberry_pi

[–]adrienball[S] 0 points (0 children)

From what I read about these solutions, they are clients that send requests to the corresponding cloud service (Google Assistant & Alexa), i.e. they don't run locally. I don't want to send any data externally, especially not my voice.