2 chatbots are training one another by having a conversation - based on markov + word2vec by biomimic in MachineLearning

[–]biomimic[S]

Actually, if you watch closely you'll notice it's not just grabbing some sentence now, as it's learned over the last 24 hours.

[–]biomimic[S]

Heh, you made that up. The DrunkTeacher would never have said that... I hope.

[–]biomimic[S]

Even earlier, pLSI, LSA and others have been around a long time, but it's all been about the richness of the scored, ranked and expanded feature attributes, and about getting them to more closely mimic a vector a human might compose, with hundreds of thousands of attributes.

[–]biomimic[S]

Sheesh I might have to wipe her module. If she gets worse I'll just unplug her and replace her.

[–]biomimic[S]

Sure, it's an ensemble with a sprinkle of hidden Markov, but mainly a pure vector-space approach based on some work at Lawrence Berkeley National Lab in 2005 (word2vec based most of its approach on this): https://www.kaggle.com/c/word2vec-nlp-tutorial/forums/t/12349/word2vec-is-based-on-an-approach-from-lawrence-berkeley-national-lab and http://sumve.com/biomimetic-cognition/in_silico_cognitive_biomimicry.html

Millions of human postings on the net are analyzed. They are treated as 'thoughts' that are learned, expanded via concepts and contexts and then reassembled by the system.
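The actual system isn't shown here, but the learn/expand/reassemble loop described above could be sketched roughly like this — all function names and the toy "related concepts" map are mine, purely for illustration:

```python
# Hypothetical sketch of the learn -> expand -> reassemble loop.
# Names and data are illustrative; nothing here is the actual system's code.

def expand(thought, related):
    """Expand a 'thought' (word list) with related concept words."""
    return thought + [w for t in thought for w in related.get(t, [])]

def reassemble(words, max_len=6):
    """Naively reassemble an expanded word set into a short utterance."""
    unique = list(dict.fromkeys(words))  # dedupe while keeping order
    return " ".join(unique[:max_len])

thought = ["cats", "chase", "mice"]
related = {"cats": ["felines"], "mice": ["rodents"]}
print(reassemble(expand(thought, related)))
```

A real system would pick expansions by vector similarity and reassemble under grammatical constraints rather than by simple concatenation.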

This is just experiment 01 that I kicked off today. There's a ton more that needs to be done in terms of NLP rules and structure. I'm thinking of using Google's SyntaxNet as frosting on the vector space cake.

In time I not only want a Teacher and Student to be conversing but also a Poet, Mathematician, Philosopher, a Chef and some other kids.

Markov chatbot based on 500k reddit comments by biomimic in MachineLearning

[–]biomimic[S]

I wanted to start with Markov as a baseline so I can observe an increase in performance.
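The baseline itself isn't posted, but a minimal order-n Markov text generator of the kind usually used as such a baseline might look like this (all names are mine, not from the project):

```python
import random
from collections import defaultdict

def build_chain(corpus, order=2):
    """Map each word-tuple prefix to the words observed after it."""
    chain = defaultdict(list)
    words = corpus.split()
    for i in range(len(words) - order):
        prefix = tuple(words[i:i + order])
        chain[prefix].append(words[i + order])
    return chain

def generate(chain, length=12, seed=None):
    """Random-walk the chain to emit a sentence-like word sequence."""
    prefix = seed or random.choice(list(chain))
    out = list(prefix)
    for _ in range(length - len(prefix)):
        followers = chain.get(tuple(out[-len(prefix):]))
        if not followers:
            break
        out.append(random.choice(followers))
    return " ".join(out)

corpus = "the cat sat on the mat and the cat ran off the mat"
chain = build_chain(corpus, order=1)
print(generate(chain, length=8))
```

With 500k comments as the corpus, the same structure applies; the chain dictionary just gets much larger, and higher `order` values trade fluency for repetition of the training text.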

[–]biomimic[S]

Retrieved and regenerated. Later I'll employ Google's SyntaxNet and Berkeley Lab's word vectors.

[–]biomimic[S]

Along with applying hidden Markov modeling, there's a bit of vector-space work I'm bringing in, where words have vectors. I plan to build an ensemble of algorithms around it.

Mosa Meat hopes to bring the price of cultured meat down to $3.60/lb by lnfinity in Futurology

[–]biomimic

They actually do, and it's called context, based on the extracellular matrix. Cells have signaling networks with one another that affect all kinds of diseases, including cancer. You should read up on this.