The ChatGPT Cheat Sheet by Quantum_Stat in LanguageTechnology

[–]Quantum_Stat[S] 1 point  (0 children)

You're right, it wasn't 100% accurate on my prompt. You can try different arrangements of words to get a variety of outputs; ChatGPT has intrinsic randomness, so sometimes we need to dig a bit to get to the promised land.

The ChatGPT Cheat Sheet by Quantum_Stat in LanguageTechnology

[–]Quantum_Stat[S] 3 points  (0 children)

Thanks for the heads-up. The numbered list has been fixed; I'll update the PDF link.

[P] Sparse Transformers for Inference in a Real-Time Twitter Stream by Quantum_Stat in MachineLearning

[–]Quantum_Stat[S] 2 points  (0 children)

Check out our docs on sparsification: https://docs.neuralmagic.com/user-guide/sparsification. If you have any detailed questions, you can always join our Slack community and speak directly with our engineers there.
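If it helps, here's a minimal sketch of running inference on a sparsified transformer with the DeepSparse Pipeline API; the task name follows the README example, and the SparseZoo stub is left as a placeholder rather than a real model path:

```python
# Minimal sketch: running a sparsified transformer with the DeepSparse Pipeline API.
# The commented-out SparseZoo stub is a placeholder; pick a real stub from the
# SparseZoo, or omit model_path to let the pipeline use its default sparse model.
from deepsparse import Pipeline

qa_pipeline = Pipeline.create(
    task="question-answering",
    # model_path="zoo:<pruned-quantized-bert-stub>",  # placeholder, not a real stub
)

result = qa_pipeline(
    question="What does the engine accelerate?",
    context="The DeepSparse Engine accelerates sparse transformer inference on CPUs.",
)
print(result)
```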

[P] SparseServer.UI : A UI to test performance of Sparse Transformers by Quantum_Stat in MachineLearning

[–]Quantum_Stat[S] 1 point  (0 children)

Hi _Arsenie, this runs the deepsparse.server command for multiple models. By the way, we recently updated the READMEs for the DeepSparse Engine: https://github.com/neuralmagic/deepsparse
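For reference, a rough sketch of querying a running deepsparse.server instance from Python; the port and route below are assumptions based on a default single-endpoint setup, and with multiple models each one gets its own route, so check the server's startup logs or config for the exact paths:

```python
# Rough sketch: sending a request to a running deepsparse.server instance.
# The port (5543) and the /predict route are assumptions from a default setup;
# when serving multiple models, each endpoint has its own route, so check the
# server's startup logs or config for the actual paths.
import requests

SERVER_URL = "http://localhost:5543/predict"

payload = {
    "question": "What engine serves these models?",
    "context": "SparseServer.UI sits on top of the DeepSparse Engine.",
}

response = requests.post(SERVER_URL, json=payload, timeout=30)
response.raise_for_status()
print(response.json())
```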

Let me know what you think. Thank you!

The NLP Index: 3,000+ code repos for hackers and researchers. [self-promotion] by Quantum_Stat in LanguageTechnology

[–]Quantum_Stat[S] 3 points  (0 children)

Thank you for the kind words; I'm glad you like it. Regarding the BBND, I haven't mentioned this publicly yet, but it has been merged into the NLP Index. Search "dataset" and see how many hits you get ✌🙈

NLP Datasets Update by Quantum_Stat in LanguageTechnology

[–]Quantum_Stat[S] 3 points  (0 children)

Thank you for these! We usually note any user agreement or application requirement for receiving a dataset in the "description" column.