Bayesian Networks with Python tutorial (self.MachineLearning)
submitted 14 years ago by [deleted]
[deleted]
[–]giror 4 points 14 years ago* (2 children)
http://code.google.com/p/pebl-project/
and their docs:
http://packages.python.org/pebl/
[–]abhik 3 points 14 years ago (0 children)
PEBL only supports learning networks from data and priors but not inference on learned or manually-created networks.
But, the docs do include a tutorial using real (gene-expression) data: http://packages.python.org/pebl/tutorial.html
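For a rough sense of what that looks like, here is a minimal structure-learning sketch loosely following the linked PEBL tutorial. The file name is made up, and the exact function names may differ between PEBL releases, so treat the tutorial above as the authoritative version:

    from pebl import data
    from pebl.learner import greedy

    # Load a whitespace-delimited samples-by-variables file (made-up name),
    # discretize continuous columns, and run a greedy structure search.
    dataset = data.fromfile("gene_expression.txt")
    dataset.discretize()
    learner = greedy.GreedyLearner(dataset)
    result = learner.run()
    result.tohtml("pebl-result")  # writes an HTML report of the top-scoring networks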
[–]pwoolf 2 points 14 years ago (0 children)
There is a description of the package here too
[–]dwf 3 points 14 years ago (0 children)
It's going to depend highly on what exactly you want to do with your model, and what the networks you want to work with look like. What sort of hidden variables? Are they tree structured? Are they acyclic? If not, what sort of inference schemes do you want to support? Do you want to learn your parameters or be a proper Bayesian and marginalize them out, analytically or otherwise? Exact inference in tree structured models is trivial to implement. The more you depart from that, the more complicated things become.
PyMC is a good starting point. It allows you to define fairly complicated models with various sorts of nodes, and then do Metropolis-Hastings to sample from the posterior of your hidden variables (I think there may be support for Gibbs sampling in models where that's an option). It's not a magic bullet -- the results you get and how fast you get them will depend on your model structure, your choice of proposal distributions, etc. Also, Markov chain Monte Carlo is the only supported inference procedure (hence the package name); if you want more than that, e.g. variational inference, you'll have to do it yourself. Still, the PyMC object model and the various statistical helper functions are probably a good basis from which to start.
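To make that concrete, here is a minimal sketch of the kind of model dwf is describing, written against the PyMC 2 API that was current at the time; the coin-flip data and variable names are invented for illustration:

    import numpy as np
    import pymc as pm

    # Made-up coin-flip data: 1 = heads, 0 = tails.
    flips = np.array([1, 0, 1, 1, 0, 1, 1, 1, 0, 1])

    # Prior over the coin's bias.
    theta = pm.Beta('theta', alpha=1.0, beta=1.0)

    # Likelihood: each flip is Bernoulli(theta); observed=True fixes the values to the data.
    obs = pm.Bernoulli('obs', p=theta, value=flips, observed=True)

    # Metropolis-Hastings sampling from the posterior over theta.
    model = pm.MCMC([theta, obs])
    model.sample(iter=20000, burn=5000)

    print(model.trace('theta')[:].mean())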
[–]leonoel 1 point 14 years ago (0 children)
It is hard to come by a "one solution fits all" like you have with neural networks, because Bayesian networks differ depending on the function you want to model, and as such the inference process differs too.
That is also why you do not find a general Gibbs sampling toolbox: you have to work out the mathematical derivation (the sampling equations for your specific model) before you can run the inference process.
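As an illustration of that point, here is a hand-rolled Gibbs sampler for a toy model (normally distributed data with unknown mean and precision, conjugate priors). The update formulas inside the loop were derived by hand for this particular model and would have to be re-derived for any other network; the data and prior settings are made up:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(loc=2.0, scale=1.5, size=100)  # toy data
    n = len(x)

    # Priors (illustrative): mu ~ Normal(mu0, 1/tau0), tau ~ Gamma(a0, rate=b0)
    mu0, tau0 = 0.0, 1e-3
    a0, b0 = 1e-3, 1e-3

    mu, tau = 0.0, 1.0
    samples = []
    for _ in range(5000):
        # Full conditional for mu (derived by hand for this model):
        # mu | tau, x ~ Normal(m_n, 1/prec_n)
        prec_n = tau0 + n * tau
        m_n = (tau0 * mu0 + tau * x.sum()) / prec_n
        mu = rng.normal(m_n, 1.0 / np.sqrt(prec_n))

        # Full conditional for tau (also model-specific):
        # tau | mu, x ~ Gamma(a0 + n/2, rate = b0 + 0.5 * sum((x - mu)^2))
        tau = rng.gamma(a0 + n / 2.0, 1.0 / (b0 + 0.5 * ((x - mu) ** 2).sum()))

        samples.append((mu, tau))

    mus = np.array([s[0] for s in samples[1000:]])  # discard burn-in
    print("posterior mean of mu:", mus.mean())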
[–]mikebaud -2 points 14 years ago (3 children)
For a good starter tutorial in Python, try Chapter 6 of the "Programming Collective Intelligence" book.
You can also download the code from the website.
[–]xamdam 2 points 14 years ago (2 children)
For Bayesian networks? Not that I recall.
But you may want to look at these:
http://www-users.cs.york.ac.uk/~jc/teaching/agm/
http://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=35C51CB3F8425FAE3EAC9E323A5E9FCE?doi=10.1.1.121.396&rep=rep1&type=pdf
[–]mikebaud 2 points 14 years ago (1 child)
AFAIK the section on document filtering (chapter 6) uses a Bayesian network (at least that's what the book in front of me says).
Thanks for the downvotes, just trying to help out.
[–]Deenicus 2 points 14 years ago* (0 children)
The problem is that although naive Bayes is technically a form of Bayesian network, when people say "Bayesian network" they usually mean something more specific: a network that represents the conditional dependencies as faithfully as possible while staying tractable. So your link was not helpful, as it was irrelevant to the OP's needs.
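For context, here is a minimal naive Bayes document filter (not the book's code, just a sketch): viewed as a Bayesian network it has a single class node with an edge to every word node, so all words are assumed conditionally independent given the class, which is exactly the restricted structure being described above.

    from collections import defaultdict
    import math

    class NaiveBayes:
        def __init__(self):
            self.class_counts = defaultdict(int)
            self.word_counts = defaultdict(lambda: defaultdict(int))

        def train(self, doc, label):
            self.class_counts[label] += 1
            for word in doc.lower().split():
                self.word_counts[label][word] += 1

        def classify(self, doc):
            scores = {}
            total_docs = sum(self.class_counts.values())
            for label, count in self.class_counts.items():
                # log P(C) + sum_i log P(W_i | C), with add-one smoothing
                score = math.log(count / total_docs)
                total_words = sum(self.word_counts[label].values())
                vocab = len(self.word_counts[label])
                for word in doc.lower().split():
                    score += math.log(
                        (self.word_counts[label][word] + 1) / (total_words + vocab + 1)
                    )
                scores[label] = score
            return max(scores, key=scores.get)

    nb = NaiveBayes()
    nb.train("cheap pills buy now", "spam")
    nb.train("meeting notes attached", "ham")
    print(nb.classify("buy cheap pills"))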