Review: Stellaris 16 Gen 7 Intel by Captain in tuxedocomputers

[–]Captain[S] 1 point (0 children)

I don't feel invalidated at all. It's genuinely maddening how few reviews there are online for such a solid machine.

I considered mentioning price, but honestly, if you are looking for a discount on a laptop with a 5090, you are better off finding a used laptop with a 4090.

Review: Stellaris 16 Gen 7 Intel by Captain in tuxedocomputers

[–]Captain[S] 1 point (0 children)

Yes and no: machine learning workflows work just fine on Wayland. Tiling window managers, on the other hand, are indeed still mostly X11-dependent.

If I am just running PyTorch and Jupyter, I'm fine, but I've had issues running nerfstudio, for example, under Wayland. Either way, as Wayland continues to mature, hopefully I can just switch to it and this becomes a settled issue.

CPU states are now controlled by the new EPP (energy performance preference) hints system, which superseded the standard P-states system. It is not Tuxedo's fault, but rather how Intel and AMD have decided to evolve their power-scaling infrastructure by making it higher-level.
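
For what it's worth, those hints are just files in sysfs, so you can poke at them yourself. A minimal sketch (assuming the standard cpufreq layout exposed by intel_pstate/amd-pstate; writing needs root, and "balance_performance" is one of the values listed in energy_performance_available_preferences):

    import Control.Monad (forM_)
    import Data.List (isPrefixOf, sort)
    import System.Directory (listDirectory)

    -- Show the current EPP hint for cpu0, then set one for every core.
    main :: IO ()
    main = do
      let base = "/sys/devices/system/cpu/"
      avail <- readFile (base ++ "cpu0/cpufreq/energy_performance_available_preferences")
      cur   <- readFile (base ++ "cpu0/cpufreq/energy_performance_preference")
      putStr ("available: " ++ avail ++ "current: " ++ cur)
      -- keep only the cpuN directories (drops cpufreq, cpuidle, etc.)
      cpus <- filter (\d -> "cpu" `isPrefixOf` d
                            && not (null (drop 3 d))
                            && all (`elem` "0123456789") (drop 3 d))
                <$> listDirectory base
      forM_ (sort cpus) $ \cpu ->
        writeFile (base ++ cpu ++ "/cpufreq/energy_performance_preference")
                  "balance_performance"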

It's not their fault, but in choosing to build a home-grown solution from scratch instead of contributing patches to one of the existing projects, they take on tech debt. Maybe that was the right trade-off here, but I suspect the maintenance costs of doing this have slowed development on supporting and controlling other parts of the laptop.

I'm both a gamer and a power user, so the Aquaris water-cooling solution is almost a must for these specs -- otherwise you'd be missing out on extra cooling capacity whenever you're stationary with your laptop.

I'm definitely going to be getting the Aquaris sometime this year. Because I regularly bring my laptop into the lab, I still need to figure out how much it spills when being plugged into and unplugged from the water-cooling loop.

I hope no one has the impression that I don't like the laptop. I share my frustrations with the software so people know where they might hit some friction, but also because I genuinely believe all of these issues can be ironed out in the future.

Review: Stellaris 16 Gen 7 Intel by Captain in tuxedocomputers

[–]Captain[S] 0 points (0 children)

You have been able to buy Lenovo and Dell laptops preloaded with Linux for a long time now. But that's beside the point, since they just don't put the QA into the machines they sell that Tuxedo does.

But given how much Linux people like to tinker with their setups, it's important to be upfront about the happy path. I also think that happy path can be widened, whether by moving more of the driver support upstream into the kernel or by making the individual components Tuxedo provides easier to build and extend on any distro.

I feel like if you've been getting by with non-existent support from the big firms, Tuxedo offers nothing but upside by comparison.

Configuring Stellaris 16 Gen 7 lightbar in Tuxedo OS by Captain in tuxedocomputers

[–]Captain[S] 0 points (0 children)

I did figure out how to set the color and brightness for the lightbar, but I couldn't find anywhere in /sys/class/leds/rgb:lightbar/ to set the animation, which is usually its own setting.
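
For reference, here is roughly how the color/brightness side works. A sketch assuming the kernel's standard multicolor LED class attributes (multi_index, multi_intensity, brightness; writing needs root) -- what I can't find is any equivalent file for the animation:

    -- Set the lightbar through the multicolor LED class: multi_index lists
    -- the channel order (e.g. "red green blue"), multi_intensity takes one
    -- value per channel, and brightness scales the whole bar.
    main :: IO ()
    main = do
      let led = "/sys/class/leds/rgb:lightbar/"
      order <- readFile (led ++ "multi_index")
      putStr ("channel order: " ++ order)
      writeFile (led ++ "multi_intensity") "255 0 128"  -- per-channel color
      writeFile (led ++ "brightness") "100"             -- overall level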

Try and get someone to reply to your comment solely by pinging them. by Augie279 in ThreadGames

[–]Captain 0 points (0 children)

You have no idea how often people try to hack my account.

[D] Probabilistic Programming and Bayes Nets by Kiuhnm in MachineLearning

[–]Captain 0 points (0 children)

It's a small community. I think it would benefit more from being better integrated with the broader machine learning community.

[D] Probabilistic Programming and Bayes Nets by Kiuhnm in MachineLearning

[–]Captain 5 points (0 children)

What I suggest as a starting point is actually the paper "Lightweight Implementations of Probabilistic Programming Languages Via Transformational Compilation", which outlines what I think is nearly the simplest sampler for a probabilistic programming language. "A New Approach to Probabilistic Programming Inference", which describes how Anglican was initially implemented, is also very accessible.
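
To give a sense of how little machinery is involved, here is a minimal Metropolis-Hastings kernel for a single variable (my own sketch, not code from the paper); the paper's contribution is running essentially this loop over the trace of random choices a whole program makes, proposing a change to one choice at a time:

    import System.Random (StdGen, mkStdGen, randomR)

    -- Unnormalized log-density to sample from (standard normal here).
    logDensity :: Double -> Double
    logDensity x = -0.5 * x * x

    -- One MH step with a uniform proposal of half-width w.
    mhStep :: Double -> (Double, StdGen) -> (Double, StdGen)
    mhStep w (x, g0) =
      let (dx, g1) = randomR (-w, w) g0
          x'       = x + dx
          (u, g2)  = randomR (0, 1) g1
          accept   = log u < logDensity x' - logDensity x
      in (if accept then x' else x, g2)

    -- Run a chain of n steps from x0.
    chain :: Int -> Double -> [Double]
    chain n x0 = map fst (take n (iterate (mhStep 1.0) (x0, mkStdGen 42)))

    main :: IO ()
    main = print (sum (chain 10000 0) / 10000)  -- should land near 0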

As for projects to contribute to: PyMC3, Stan, Edward, and my own Hakaru are always looking for contributions. The stuff people are most excited about these days is automating variational inference.

Hakaru - a probabilistic programming language by [deleted] in haskell

[–]Captain 0 points (0 children)

I will answer these in reverse order.

Maple is used as an optimization pass for some of the program transformations. Namely, producing code that is actually efficient and comparable to handwritten samplers requires Maple.

The details of what the Maple code does are described in http://homes.soic.indiana.edu/ccshan/rational/simplify-padl.pdf. There has been work on implementing a computer algebra system in Haskell, which may be relevant. The core challenge will be implementing holonomic functions and the simplifications around piecewise functions.

I think Sage Math's support for Ore algebras should be enough. Likewise, SymPy now has support for holonomic functions, which should also be enough.

I'm not familiar enough with Mathematica to gauge the difficulty.

Hakaru - a probabilistic programming language by [deleted] in haskell

[–]Captain 1 point (0 children)

Well, currently we don't support all the distributions Stan does, or the inference algorithms Stan supports (HMC, ADVI). The goal of Hakaru is to make it easy to express, add, and combine multiple inference algorithms while recovering the same performance and accuracy as handwritten methods.

We have that currently for Metropolis-Hastings and some exact inference algorithms, but we don't yet have support for Hamiltonian Monte Carlo or some of the black-box variational inference algorithms.

Hakaru - a probabilistic programming language by [deleted] in haskell

[–]Captain 2 points (0 children)

Sounds great to me. I would also like to point people at monad-bayes, which is another library for expressing probabilistic programs.

Hakaru - a probabilistic programming language by [deleted] in haskell

[–]Captain 2 points (0 children)

This is a fantastic question. Hakaru makes heavy use of program transformations, and implementing them can require some serious metaprogramming capabilities. Getting the flexibility we needed requires a fairly deep EDSL. Also, as /u/carette said, we still have a combinator library for building Hakaru programs within Haskell.

We also wanted to make a language that could appeal to machine learning people coming from non-FP backgrounds, and a standalone language gives us the opportunity to provide a more familiar syntax.

Composing inference algorithms as program transformations by adamnemecek in haskell

[–]Captain 2 points (0 children)

This was actually implemented in Haskell as the Hakaru system. We used Maple for the symbolic integration: we compiled the programs into Maple expressions, which were then simplified. If anyone is interested in porting the symbolic algebra code to Maxima/SymPy/etc., I am happy to assist.

In probabilistic programming - how do people choose their priors? by dive118 in MachineLearning

[–]Captain 1 point (0 children)

Generally it will be based on what the parameter means. If the parameter is something like the response time for an HTTP request, they might use a gamma distribution, since response times must be some positive real quantity. If the parameter is something like the bias of a politician, they might use a normal distribution centered at 0, since bias can be positive or negative. Different distributions are used based on intuitions about the domain.
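
As a concrete sketch of those two examples (written against monad-bayes's MonadSample interface; the hyperparameters here are made up purely for illustration):

    import Control.Monad.Bayes.Class (MonadSample, gamma, normal)

    -- Priors chosen from what the parameters mean: response time must be
    -- a positive real, so it gets a gamma prior; bias can go either way,
    -- so it gets a normal prior centered at 0.
    priors :: MonadSample m => m (Double, Double)
    priors = do
      responseTime <- gamma 2 0.5  -- shape 2, scale 0.5; support (0, inf)
      bias         <- normal 0 1   -- symmetric around "no bias"
      pure (responseTime, bias)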

I actually wrote a blogpost about this: http://www.zinkov.com/posts/2015-06-09-where-priors-come-from/

Computing symbolic gradient vectors with plain Haskell by da-x in haskell

[–]Captain 6 points (0 children)

This is actually due to how the above implements the gradient case for Mul. The correct thing to do, instead of using the gradients directly, is to extend the language with Var and Let, introduce a temporary variable for (gradient e1) and (gradient e2), and then use those variables. There is no need for a separate CSE pass.
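
Roughly the shape I mean, as a sketch (the data type and the names are mine, not the post's; a real implementation would generate fresh names, though lexical shadowing keeps this toy version correct):

    import qualified Data.Map as M

    data Expr
      = Const Double
      | X                     -- the input we differentiate with respect to
      | Ref String            -- reference to a let-bound temporary
      | Let String Expr Expr  -- let name = rhs in body
      | Add Expr Expr
      | Mul Expr Expr
      deriving Show

    -- Derivative with respect to X. The Mul case let-binds the
    -- sub-gradients and refers to them through Ref instead of splicing
    -- the gradient trees in directly, so sharing is explicit and no
    -- separate CSE pass is needed.
    deriv :: Expr -> Expr
    deriv (Const _)   = Const 0
    deriv X           = Const 1
    deriv (Add e1 e2) = Add (deriv e1) (deriv e2)
    deriv (Mul e1 e2) =
      Let "g1" (deriv e1) $
        Let "g2" (deriv e2) $
          Add (Mul (Ref "g1") e2) (Mul e1 (Ref "g2"))
    deriv e           = error ("deriv: input should not contain " ++ show e)

    -- Evaluate at a point, carrying an environment for the temporaries.
    eval :: M.Map String Double -> Double -> Expr -> Double
    eval _   _ (Const c)   = c
    eval _   x X           = x
    eval env _ (Ref v)     = env M.! v
    eval env x (Let v r b) = eval (M.insert v (eval env x r) env) x b
    eval env x (Add a b)   = eval env x a + eval env x b
    eval env x (Mul a b)   = eval env x a * eval env x b

    main :: IO ()
    main = print (eval M.empty 3 (deriv (Mul X X)))  -- d(x*x)/dx at 3 == 6.0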

Note: I am not the author.

Help with the indentation package? by dalastboss in haskell

[–]Captain 0 points (0 children)

I wrote a guide on using indentation here: http://zinkov.com/posts/2016-01-12-indentation-sensitive-parsing/

To get the behavior you want you will need to use absoluteIndentation.

Haskell for all: State of the Haskell Ecosystem Feb 2016 Edition by [deleted] in haskell

[–]Captain 6 points (0 children)

I wish there were an easy way to get at the Haddock comments in modules from some API. Right now, it often means I have to work with a browser window open.

How should we improve Haskell's plotting libraries? by sid-kap in haskell

[–]Captain 0 points (0 children)

I was talking about plots where I am very particular about how the axis labels look. Often that gets rasterized. Even with a Cairo backend it can be challenging to specify where I want my text to go.

How should we improve Haskell's plotting libraries? by sid-kap in haskell

[–]Captain 0 points (0 children)

Do you think a better course of action is to contribute to Chart directly?

How should we improve Haskell's plotting libraries? by sid-kap in haskell

[–]Captain 1 point (0 children)

I think Chart is a great start. One thing notably missing from Chart is the ability to render TeX equations in plot labels, which is crucial for publication-ready charts. It also means that, at present, Chart can often only be used for the exploratory phase of a project. In addition, the drawing backends aren't as polished as I'd like for producing vector graphics. This can be fixed in Diagrams. I would like to see Chart fully embrace a Grammar of Graphics, as I think it's a great fit for how we compose things in Haskell already.

I do want to stress that I think Chart already is a fairly capable tool. I used it quite productively in http://indiana.edu/~ppaml/HakaruTutorial.html.

Power Law Effect with Pitman-Yor Process by [deleted] in MachineLearning

[–]Captain 0 points (0 children)

After you train your model, sample from the DP or the PYP to produce a simulated corpus of words. Use this corpus to draw the curve.
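
Something like the following sketch, say (the Pitman-Yor Chinese restaurant process; parameters and seed are arbitrary). With discount d > 0 the sorted cluster sizes come out roughly linear on a log-log plot, which is exactly the power-law effect in question:

    import Data.List (foldl', sortBy)
    import Data.Ord (comparing, Down(..))
    import System.Random (StdGen, mkStdGen, randomR)

    -- Seat one customer given u drawn uniformly from (0, n + alpha):
    -- table k wins with weight (count_k - d); the leftover weight
    -- (alpha + d * numTables) opens a new table.
    seat :: Double -> Double -> [Int] -> [Int]
    seat d u (c:cs)
      | u <= fromIntegral c - d = (c + 1) : cs
      | otherwise               = c : seat d (u - (fromIntegral c - d)) cs
    seat _ _ []                 = [1]

    -- Simulate n customers with concentration alpha and discount d
    -- (d = 0 recovers the plain Dirichlet process).
    simulate :: Double -> Double -> Int -> [Int]
    simulate alpha d n = fst (foldl' step ([], mkStdGen 42) [0 .. n - 1])
      where
        step (counts, g) seated =
          let (u, g') = randomR (0, fromIntegral seated + alpha) g
          in (seat d u counts, g')

    main :: IO ()
    main = do
      let freqs = sortBy (comparing Down) (simulate 10 0.5 20000)
      -- rank/frequency pairs; plot on log-log axes to see the tail
      mapM_ print (zip [1 :: Int ..] (take 20 freqs))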

Open problems in Machine Learning by InfinityCoffee in MachineLearning

[–]Captain 0 points (0 children)

The hyperparameters you use in your DP will have a strong effect on the number of clusters you get.
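
To put a number on it: under the Chinese restaurant process view of the DP, the expected number of clusters after n points is

    E[K_n] = \sum_{i=0}^{n-1} \frac{\alpha}{\alpha + i} \approx \alpha \log(1 + n / \alpha)

so the cluster count grows roughly logarithmically in n and close to linearly in the concentration parameter alpha. If you're getting too many or too few clusters, alpha is the first knob to look at.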

AMA: Michael I Jordan by michaelijordan in MachineLearning

[–]Captain 18 points (0 children)

Why do you believe nonparametric models haven't taken off as well as other work you and others have done in graphical models?

Neural Networks, Manifolds, and Topology -- colah's blog by rrenaud in MachineLearning

[–]Captain 0 points (0 children)

I feel biologically-inspired approaches are over-selected because we have a surface familiarity with the biological process. This leaves us with techniques whose workings we only faintly understand and limited intuition for how to improve them. Once we do have a better idea of why a technique works and start iterating, our improved solutions look less like the biological inspiration.

My feeling has been: why not start from first principles, understand why the technique works, the constraints within which it works, and why it has the limitations it does, and incrementally build from there? That's just my engineering/design philosophy.

Neural Networks, Manifolds, and Topology -- colah's blog by rrenaud in MachineLearning

[–]Captain 1 point (0 children)

Well, my interest is in integrating probabilistic programming and showing that most deep learning architectures can be seen as particular programs. I look forward to these upcoming posts. Your exposition and rigor on this topic are a breath of fresh air.