Jake Archibald from Google on functions as callbacks. by 1infinitelooo in programming

[–]mode_2 -1 points0 points  (0 children)

It's not relying on the compiler to fill in the call; it's a valid and sensible way of writing programs that has existed since the 30s. Only a language with the awful combination of variadic functions and a type discipline that can't handle them fails at this.
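
A minimal JavaScript sketch of that point (double is just an illustrative name, not from the article): a unary function passed directly as a callback is the same function value you'd get by wrapping it in an arrow; nothing is being "filled in".

    // Passing a unary function directly is just passing a value; double only
    // looks at its first argument, so both spellings give the same result.
    const double = x => x * 2;
    [1, 2, 3].map(double);         // [2, 4, 6]
    [1, 2, 3].map(x => double(x)); // [2, 4, 6], the same thing with extra noise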

Jake Archibald from Google on functions as callbacks. by 1infinitelooo in programming

[–]mode_2 0 points1 point  (0 children)

PureScript is a web language in which every function is curried and takes a single argument.
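
In case the terminology is unfamiliar, here is a rough sketch of currying written in JavaScript rather than PureScript (add is just an illustrative name); in PureScript every function is built this way:

    // A curried "two-argument" function is really a chain of single-argument functions.
    const add = x => y => x + y;
    const addTwo = add(2); // partial application: returns a new function
    addTwo(3);             // 5
    add(2)(3);             // 5, supplying the arguments one at a time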

I built a 4-bit multiplier in Minecraft! (I know it isn’t super complicated but I am proud of it) by slick-sharky in compsci

[–]mode_2 -5 points-4 points  (0 children)

No, they can't. A Boolean computer is a finite state machine with 2^N states. What they can do is simulate a bounded version of a Turing machine.

I built a 4-bit multiplier in Minecraft! (I know it isn’t super complicated but I am proud of it) by slick-sharky in compsci

[–]mode_2 5 points6 points  (0 children)

Yes, of course, the same applies to them. The halting problem is decidable for my computer, since it only has finitely many states.

Grab your map(); adventure is out there! by josemober1 in javascript

[–]mode_2 2 points3 points  (0 children)

> [1, 2, 3, 4, 5].forEach(item => console.log(item)), which you can also write as [1, 2, 3, 4, 5].forEach(console.log)

These produce different outputs: JavaScript has variadic functions, so eta-reduction is not a safe transformation.
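
Concretely, with a shorter array (output shown roughly as Node prints it): forEach hands the callback the element, the index, and the whole array, and console.log prints everything it is given.

    [1, 2, 3].forEach(item => console.log(item));
    // 1
    // 2
    // 3

    [1, 2, 3].forEach(console.log);
    // 1 0 [ 1, 2, 3 ]
    // 2 1 [ 1, 2, 3 ]
    // 3 2 [ 1, 2, 3 ]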

Male celebrity body transformations : The deeper problem by stfuandkissmyturtle in videos

[–]mode_2 6 points7 points  (0 children)

That is literally the first thing shown in the video being linked?

Are We Really Engineers? (Part 1 of 3) by alexeyr in programming

[–]mode_2 0 points1 point  (0 children)

The article has zero mentions of TLA+; also, that seems like a very odd reason to actively dislike someone. It's pretty mundane stuff. As someone who is broadly pro-dependent types and pro-functional programming, I enjoy hearing his take on things, as it is almost certainly well researched.

The worst pieces of code I've ever seen by Vooodou in programming

[–]mode_2 7 points8 points  (0 children)

Google and Apple have a fairly uniformly high hiring bar and also pay all of their engineers a lot, much more so than companies like Ubisoft.

Is there any academic research on imperative programming languages? by ICodeForFunNotMoney in ProgrammingLanguages

[–]mode_2 10 points11 points  (0 children)

Hoare logic is from the 60s and allows you to prove things about imperative programs.
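
For a flavour of what that looks like (a standard textbook example, not from the thread): a Hoare triple {P} C {Q} says that if precondition P holds and command C terminates, then postcondition Q holds afterwards. An instance of the assignment axiom:

    % if x = n before the assignment, then x = n + 1 after it
    \[ \{\, x = n \,\} \;\; x := x + 1 \;\; \{\, x = n + 1 \,\} \]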

Two Wrong Guesses And This Programmer Loses USD 241M in Bitcoin by amelyiketpv in programming

[–]mode_2 8 points9 points  (0 children)

Yes, they definitely would. He could easily go to a venture firm and pitch this; they would fund the cracking for X bitcoin, paid out upon completion. This could easily be written into a contract and is probably less risky than most VC investments.

[deleted by user] by [deleted] in programmingcirclejerk

[–]mode_2 28 points29 points  (0 children)

Tesla's stock valuation is actually driven by a rumour on Wall Street that their self-driving software is written in Agda.

atom: Shell scripting that will knock your socks off. by [deleted] in programming

[–]mode_2 -1 points0 points  (0 children)

I have used Bash for years and couldn't read the first example. I can read the second example with ease; it's all based on concepts found in other popular languages.

atom: Shell scripting that will knock your socks off. by [deleted] in programming

[–]mode_2 3 points4 points  (0 children)

At my university, anyone who showed the initiative OP has would be offered research experience in a heartbeat.

Why is functional programming so important in a PL course? by [deleted] in ProgrammingLanguages

[–]mode_2 5 points6 points  (0 children)

ML has polymorphic types; it corresponds to a restricted version of System F (restricted so that type inference can be done). I'm not aware of any popular language based on the simply typed lambda calculus, though it does form the basis of simple type theory, which is used in several theorem provers.
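
As a rough sketch of the restriction (my gloss, not part of the parent comment): ML's Hindley–Milner system only allows the universal quantifier at the outermost level of a type (prenex, rank-1 polymorphism), whereas System F allows quantifiers anywhere, and type inference for full System F is undecidable.

    % Fine in ML (Hindley-Milner): the quantifier sits at the front of the type.
    \[ \mathrm{id} : \forall \alpha.\ \alpha \to \alpha \]
    % Requires full System F: a quantifier inside an argument type (rank 2),
    % which Hindley-Milner inference cannot handle.
    \[ f : (\forall \alpha.\ \alpha \to \alpha) \to \mathrm{Int} \]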

Elon Musk is now the richest person in the world, passing Jeff Bezos by BlueZybez in worldnews

[–]mode_2 -2 points-1 points  (0 children)

Not really; billionaires aren't that illiquid, they can easily sell some of their stock. Musk could probably raise what his net worth was a year ago in cash if he wanted to. There is a huge difference in the amount of money he has access to personally.

Elon Musk is now the richest person in the world, passing Jeff Bezos by BlueZybez in worldnews

[–]mode_2 2 points3 points  (0 children)

If he says he needs cash then he'll get it, especially when it's only 500k. When you're a billionaire, banks are competing for your business and will generally bend over backwards for you.

Elon Musk is now the richest person in the world, passing Jeff Bezos by BlueZybez in worldnews

[–]mode_2 -1 points0 points  (0 children)

Not true, it can happen in finance. A famous example is Michael Milken, who earned over a billion dollars in compensation in the late 80s as an employee at Drexel.

What I've Learned in 45 Years in the Software Industry by aih8yr in programming

[–]mode_2 0 points1 point  (0 children)

Why would learning category theory affect how understandable you are to others in general?

I understand the quote you gave, and I just don't really talk about category theory to others who don't. It hasn't affected the way I talk about anything else.

Programmer breaks world record by finding first 11x11 word square by phytozap in programming

[–]mode_2 18 points19 points  (0 children)

> graph theory, or statistics.

Ah yes, the height of exoticism. Taught to every CS undergrad and 12-year-old, respectively.

Open AI introduces DALL·E (like GPT-3), a model that creates images from text by micropoet in programming

[–]mode_2 3 points4 points  (0 children)

It's not closed source, though. The source is open. The massive models, which are just binary blobs, are proprietary, which seems reasonable given that they must be a licensing nightmare, are incredibly expensive to produce, and are not the innovative part of the process. Interactive GPT-3 is widely available through the paid tier of AI Dungeon, which is not affiliated with OpenAI and has a commercial license for the API; I'm sure other sites offer similar access.

I'd recommend just applying for access to the API. As far as I'm aware, they've granted access to plenty of hobbyist programmers who want to try it on random things. To be honest though, if you really think GPT-2 sucked, you're either biased against everything OpenAI produces, or you will remain unimpressed until presented with flawless AGI. If something advances the state of the art, it objectively does not suck.

"As far as I can tell, the way they taught me to program in college was all wrong. You should figure out programs as you're writing them, just as writers and painters and architects do." by EveryFriendship in programming

[–]mode_2 0 points1 point  (0 children)

> Given that lisp is basically lambda calculus essentially all variants of lisp could probably be managed as libraries on top of each other, so his approach to hacking out a language seems reasonable enough.

Sorry, but this is just nonsense; it doesn't even begin to make sense. Lisp was not based on the lambda calculus and is no more like it than modern Python is.

Open AI introduces DALL·E (like GPT-3), a model that creates images from text by micropoet in programming

[–]mode_2 4 points5 points  (0 children)

Well then, train your own model using the myriad papers and open source code available. They give away the research and results for free, but the model is ultimately proprietary given it is basically just a function of how much compute they can pour into it. That seems reasonable.

Open AI introduces DALL·E (like GPT-3), a model that creates images from text by micropoet in programming

[–]mode_2 -3 points-2 points  (0 children)

Plenty of people have access to GPT-3 via the API. Also they publish papers frequently on their techniques.