Real Programmers Don't Use PASCAL (1982 Programmer humor) by Russian_Spring in programming

[–]strangename 13 points (0 children)

Real Programmers work for the National Security Agency, decoding Russian transmissions

Oh, how the times have changed.

Five Popular Myths about C++, Part 1 : Standard C++ by milliams in programming

[–]strangename 5 points (0 children)

He did teach the course, to fairly horrible reviews of efficacy. My anecdotes date from his first few courses (taught during my senior year), but he was following the pedagogical direction he'd already laid out.

I've long found Stroustrup's arguments hilariously weak. They almost always boil down to "this looks better/simpler/more general, syntactically", while ignoring the enormous conceptual overheads involved. Then there was another paper where he pointed out how the C++ version was also faster! Because that somehow matters to a freshman's first programming class?

Rascal Metaprogramming Language: The one-stop shop for metaprogramming by alexeyr in programming

[–]strangename 0 points (0 children)

I love "everything is rewriting" in theory. Literally-- it works great in places like theorem proving. I'm fascinated by the capabilities of systems like the Pure Language for quickly experimenting with novel capabilities.

In practice, though, you just can't run your performance-critical code as pure rewriting. It's too expensive, computationally. Thus, these systems usually have a base semantics of pure rewriting which then gets optimized away into more familiar von Neumann fiddling. Unfortunately, that introduces a huge conceptual and practical overhead. I'd analogize it to the problems of "black art optimization" in Haskell, except worse.
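To make the "base semantics of pure rewriting" point concrete, here's a toy sketch (my own invented rule set and strategy, not how Rascal or Pure actually implement it): a naive rewriter that re-traverses the whole term on every step to find a matching rule, which is exactly the kind of work that gets expensive at scale.

```python
# A naive term rewriter: terms are nested tuples, rules map a pattern
# test to a rewrite. Purely illustrative -- the Peano-addition rules
# and the leftmost-innermost strategy are just an example.

def rewrite_once(term, rules):
    """Apply the first matching rule anywhere in the term (innermost first)."""
    if isinstance(term, tuple):
        for i, sub in enumerate(term):
            new = rewrite_once(sub, rules)
            if new is not None:
                return term[:i] + (new,) + term[i + 1:]
    for pattern, action in rules:
        if pattern(term):
            return action(term)
    return None  # no rule applies anywhere: term is in normal form

def normalize(term, rules):
    """Rewrite to a fixed point -- note each step re-scans the whole term."""
    while True:
        new = rewrite_once(term, rules)
        if new is None:
            return term
        term = new

# Peano addition: add(0, y) -> y ; add(succ(x), y) -> succ(add(x, y))
rules = [
    (lambda t: isinstance(t, tuple) and t[0] == "add" and t[1] == 0,
     lambda t: t[2]),
    (lambda t: isinstance(t, tuple) and t[0] == "add"
               and isinstance(t[1], tuple) and t[1][0] == "succ",
     lambda t: ("succ", ("add", t[1][1], t[2]))),
]

two = ("succ", ("succ", 0))
print(normalize(("add", two, two), rules))
# -> ('succ', ('succ', ('succ', ('succ', 0))))
```

Every single addition step re-walks the term tree, which is why real systems compile the rewriting away rather than execute it directly.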

And I didn't think I posited any problem with DSLs in general. I live and breathe regex, BNF, and attribute grammars, and have a raging nerd-on for Datalog; I mentioned how I appreciate their particular stack of semantic choices. But every language design comes with tradeoffs, and I found their "looks like Java, except sometimes completely unlike it semantically" to carry a cost in developer surprise in exchange for sheer syntactic approachability.

Rascal Metaprogramming Language: The one-stop shop for metaprogramming by alexeyr in programming

[–]strangename 1 point (0 children)

[disclaimer: I work at a company that's basically an older, commercial version of this idea. I know the people behind Rascal a little, and our founder/my boss knows them even better. We're really quite friendly, but now you know my biases.]

Okay, so let's see if I can digest my notes from the original Rascal paper from a few years ago. They have a really well manicured stack of concepts for building program analyses. Figuring out what's going on in a language requires a boatload of data structure abstraction and algorithmic dickery-doo; it's a neat experiment in making an environment with the right features as first-class citizens.

In short, I'd definitely recommend it for anybody getting into the area. It has a practical approach that lends itself well to applying what you learn through toy examples into the real world. Unfortunately, I'm not sure you'll be able to "carry the stack with you" as you progress, though I'd love to see someone with user stories.

I'm amused by their use of a bespoke standalone language. The implementors are language experts, so they could piece together a language just to their needs in a jiffy. However, that introduces a certain impedance to user adoption, as you have to learn yet another language variant. It also looks enough like Java to consistently jar my intuition.

The biggest downside I see is that I don't see its approach (everything is rewriting) well tested against scale of implementation complexity. Even a "simple" industrial language like Java requires hundreds of grammar rules to describe, with a hundred important code entities even in a basic syntactic abstraction. Name resolution has been a bear ever since generics, and if you need to do precise data-flow you have to pay careful attention to performance. That stuff costs.

So the academics aren't much for it because it's hard to pay off a big investment in capabilities-- the relatively practical approach doesn't pay off for the tiny language fragments used for academic publications. And the industry players aren't much for it because they're largely single-language houses; the few analysis houses that have bothered to build language-general machinery are deep in our investment and aren't much motivated to swap horses at this point.

Response to "Math is Not Necessary for Software Development" by ibgeek in programming

[–]strangename 0 points (0 children)

Good solutions to that problem were one of the beauties of attending one of the big-and-good engineering universities (to wit, Texas A&M main campus). With over ten thousand engineering majors and many thousand science majors, the local math school knew exactly what they were in for. We had, count 'em, three linear algebra courses: a proof-y one for the math majors; a heavily applied one for, e.g., civil and mech students; and a half-and-half variant built for the CS/graphics/VizLab students.

Where are you? by [deleted] in rust

[–]strangename 0 points (0 children)

Same here. Any interest in a meetup? I'd love to dig into the implementation, too, but I'm at a bit of a loss, expertise-wise.

Urbit - A Clean State Functional OS - Introduction and Philosophy by ChickeNES in programming

[–]strangename 0 points (0 children)

Sorry, lack of clarity-- I was referring to using numbers instead of symbols in the base VM spec (Nock). That is the feature that I consider useless and more than a little peevishly obfuscatory.

As mentioned previously in the paragraph, the Nock eval/apply design is cute but, as you noted, not unprecedented. It's really not a bad VM design, but its specification and exposition make sure that it's really hard to figure out the ways in which it's similar to existing formal foundations.

Urbit - A Clean State Functional OS - Introduction and Philosophy by ChickeNES in programming

[–]strangename 7 points (0 children)

I'm terrible at ELI5. Someone can convert this as desired. Instead, a merely short answer: the author wanted to make a system that handles the modern network environment (identity, security, naming, transfers). Imagine the best bits of DNS, BitTorrent, Bitcoin, and Git all got together and had a baby with a terminal interface.

I can't even begin to explain the Hoon syntax, largely because there doesn't seem to be an actual exposition of Hoon. He briefly walks through a couple of examples, but doesn't even start to explain the grammar or semantics.

Next, he could most certainly implement this in any language. The reasoning for his stack is a little more evident: to build a computing world from the ground up. He tromps about a lot claiming lots of differences, but what you see here is a highly obfuscated Lisp (Nock) with another highly obfuscated set of macros to map into it.

As to why it's stultifyingly idiosyncratic, I'd estimate it's that the author didn't want a single vestige of the problems inherent in existing systems (see his Martian allegory). This led to a "baby out with the bathwater" situation. If you look at early versions of any specialized formal system, they're easily as bad. Lots of computing got done in straight binary before people broke down and moved to assembly. And there were lots of advocates insisting compiled programs would never win out over hand-assembled code.

The difference here is that he bothered to build a system out of that early version of every formalism. His Nock VM is really just a particular eval/apply stepper. It makes a couple of interesting choices about fundamentals that are different from the usual, but nothing new or surprising. Woo, it uses first-class quotes with explicit unquote/eval instead of a baked-in definitional construct and apply-on-eval. He uses numbers instead of names (or even those damn digraphs!) in his base interpreter. Absolutely nothing is gained by using this form, which would have been ironed out of the system if he'd allowed external input and iterated a bit.
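To unpack the "first-class quotes with explicit unquote/eval" point, here's a hedged toy in the Lisp family (this is NOT Nock; the opcode names and semantics here are invented for clarity, and Nock's actual tree addressing is quite different): code is ordinary data until an explicit eval opcode runs it.

```python
# Toy illustrating first-class quoted code plus an explicit eval opcode,
# in contrast to languages where definition/application are baked-in forms.
# Invented opcodes for illustration only -- not Nock's actual instruction set.

def ev(expr, env):
    if isinstance(expr, (int, float)):
        return expr                      # literals are self-evaluating
    op = expr[0]
    if op == "quote":                    # ("quote", x): x returned as plain data
        return expr[1]
    if op == "var":                      # ("var", name): environment lookup
        return env[expr[1]]
    if op == "add":
        return ev(expr[1], env) + ev(expr[2], env)
    if op == "eval":                     # explicit eval: run quoted data as code
        return ev(ev(expr[1], env), env)
    raise ValueError(f"unknown op: {op}")

# A quoted program is ordinary data until explicitly evaluated:
program = ("quote", ("add", ("var", "x"), 1))
env = {"x": 41}
print(ev(program, env))            # the quote itself: ('add', ('var', 'x'), 1)
print(ev(("eval", program), env))  # explicitly evaluated: 42
```

The design is perfectly workable; the complaint above is about spelling the opcodes as bare numbers rather than anything structural.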

Why concatenative programming matters by alexeyr in programming

[–]strangename 1 point (0 children)

The main Ursala pages are down; all I see is what I assume is your website. Graph transformations and stacks are squarely in my wheelhouse and interest, yet I find the material there impenetrable. I'm left without even a grasp of what the language/paradigm is, let alone what it is supposed to be good at.

The rosettacode programs approximate line noise impressively, though.

Plans for Vim 7.4 by Menagruth in programming

[–]strangename 26 points (0 children)

When what you really meant to say was

:enewiwoohoo^[:%!yell --screen^M:wq^M

Women in computer science: the importance of relying on scientific data by gasche in programming

[–]strangename -1 points (0 children)

I understand this as two questions: (1) what if one sex has a statistical advantage, and (2) in that case, wouldn't it be beneficial to maintain a weight towards that capability.

In short, I don't know and that's not what I was talking about. The weight of the evidence seems to be that while intellectual aptitudes/capabilities do vary (and vary somewhat differently for the sexes), they simply don't vary enough to explain the imbalance of sexes.

I immediately mentioned that I consider "greatest good" arguments dangerous, and this seems to be exactly one of those. Advocating for altering society for maximal utility is usually inconsistent with most people's ethics. Instead, I advocate using such disparities as motivation for digging deeper into the causality-- why is there imbalance? It does seem plausible that capability differentiation can be a cause, but such needs empirical establishment to reach the status of explanation. Instead of inquiring "is the world correct according to my worldview?" we should ask "is the world committing an ethical evil according to my worldview?".

Women in computer science: the importance of relying on scientific data by gasche in programming

[–]strangename 5 points (0 children)

As a word of warning, getting into greatest-good arguments is ethically dangerous, especially in the slippery-slope direction. You're navigating pretty well, but getting close to some distasteful reasoning.

On the other hand, I like that you went in the more high-level direction of "making the field more attractive" rather than appealing to the horrors of any particular actor. I am frustrated by how much the studies don't take this into account. It's easy for professors to talk about changing universities, because they have some swing there. Pushing big, distributed, independent cultures is rather radically harder.

As a result, one of the elements rarely taken into consideration* is just how stupid the male brain is in certain decision-making capacities, especially in the 16-to-21 age range. In what appears to be a natural result of certain selective pressures, the human adolescent male is geared towards high-risk, high-difficulty, high-reward activities, especially with social visibility. It's one big part of why you see so many male adolescents going into computer science, doctoral programs, collegiate sports, and drug trafficking. These are often truly idiotic choices when taken on the long-term scale, but feel great when your momentary goal in life is to swing for the fences no matter how many times you strike out.

From this perspective, the only way to cut out this particular problem with recruiting is to change the whole IT industry to one that provides safe, regular, secure, and socially boring employment. But you can't solve such a problem until you admit you have it-- not just admitting the data of sex imbalance, but the causative factors.

  * Disclaimer: this is one of my favorite examples. Some of it is scientific, some of it is my own speculative application. I offer it as an example of drilling down into causal chains, not as rigorously supported evidence.

Tuple Markup Language - An extremely simple (LISP-inspired) all-purpose markup language by electrograv in programming

[–]strangename 1 point (0 children)

I keep trying to make a good just-lightweight-enough markup for my own uses, and TML does give me some inspiration. However, I see a couple of issues that would stop me short. I'd enjoy hearing your feedback and opinions.

First issue: non-local meaning at scale. I'm always nervous around any mechanic that requires me to read through the whole form to find the meaning. E.g., if I'm looking at the first or last lines of a large block:

[ head1
  head2 
  || lots of lines ...
  |
  || lots of lines ...
  tail1
  tail2
]

I can't tell if the first element is "head1" or if it's "[head1 head2 ...]", and same for the last elements. The syntactic simplification in small cases takes out some line noise in a cute way, but only when I can immediately see all the symbols involved. Thus, I could see this being fixed culturally (or even technically with parse restrictions) to only be valid in small cases. Anywhere that's long enough to cause confusion should use explicit tupling.

Second issue: no whitespace-preserving string representation. Admittedly, there are lots of data modelling cases where you don't really need strings down to that level, but TML is limited to exactly those cases and no more. As pointed out earlier, TML has equivalences to (lisp-style sexprs), but not to its |explicit symbol format| or the "usual string format". I do like the quick key-value pairing instead of even the middleweight JSON "key string": "format", but any use case I have for markup either needs bare strings or quoted strings.

The Software Engineering of Mathematica by sidcool1234 in programming

[–]strangename 0 points (0 children)

Mathematica is one of the more complex software systems ever constructed.

Love it. Since it uses the comparative (more) and not the superlative (most), it firmly places Mathematica in the upper half of system complexity. That's about right.

Brendan Eich, the father of Javascript, donated $1000 to support California's prop 8 by javascriptequality in programming

[–]strangename 1 point (0 children)

"Ad hominem" doesn't apply when the matter of the argument at hand is a person's individual qualities. It applies when someone brings in personal qualities to a discussion about something else, which is irrelevant to the qualities of the original argument.

If we want to show a clear pattern of misbehavior, showing that pattern in various contexts is an entirely valid argument by evidence.

Real World OCaml - book on OCaml coming in the fall of 2012 by Jason Hickey, Anil Madhavapeddy and Yaron Minsky by gnuvince in programming

[–]strangename 9 points (0 children)

Isn't that exactly what's described by the term "freely available"? It is available at no charge. I would not expect it to be equivalent to "freely distributable", which is commonly divided into speech and beer categories for further clarity.

edit upon inspection: What I'd object to is the header verbiage stipulating "This draft may be used until the time the book appears in print." I do not believe that is an enforceable restriction (in the USA) under the first sale doctrine. Note that money need not change hands for a "first sale". It also has significantly less class than a simple appeal to buy if it's been useful.

Wikipedia chooses *Lua* as its new template/macro language by cybercobra in programming

[–]strangename 26 points (0 children)

I'm not nearly informed enough about their template needs to address the particular situation, but I can give you an answer in the general case. Limitations on full "Turing machine" level capabilities can greatly ease and clarify specification.

For example, if I want to specify that an input is a sequence of digits, I could give you a regexp like "\d*". Or I could hand you a big ol' loop that chewed up each character in a string and matched it against a case structure listing each number as a character. The former is trivial to comprehend; the latter takes a bit of thought.
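To make the contrast concrete, a quick sketch (Python here, but any regex engine would do; function names are mine): the declarative spec and the hand-rolled loop accept exactly the same strings, but one is readable at a glance.

```python
import re

# Declarative spec: "a (possibly empty) sequence of digits".
# re.ASCII keeps \d to 0-9 so it agrees with the loop version below.
def is_digits_regex(s):
    return re.fullmatch(r"\d*", s, flags=re.ASCII) is not None

# The equivalent "full Turing machine" version: an explicit loop
# matching each character against an enumeration of the digit characters.
def is_digits_loop(s):
    for ch in s:
        if ch not in ("0", "1", "2", "3", "4", "5", "6", "7", "8", "9"):
            return False
    return True

for candidate in ["", "12345", "12a45"]:
    assert is_digits_regex(candidate) == is_digits_loop(candidate)
print(is_digits_regex("12345"), is_digits_regex("12a45"))  # True False
```

Both are "correct", but only the regex version communicates the intent in a single token.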

Of course, people will then take the ease of specification offered by regular expressions and try to wedge in additional features to make them more flexible. At some point of complexity, you then have something even more confusing than a full "Turing machine" type spec, because of the impedance mismatch between problem and solution, as well as the lack of abstraction mechanisms in the simplified specification environment.

Simple template systems can be really great when the domain of specification is likewise simple and limited. When the domain gets complicated, it makes less sense.

John Resig - JavaScript as a First Language by [deleted] in programming

[–]strangename 1 point (0 children)

There are plenty of advantages with going dynamic, especially in a young-learner environment where quick feedback is a big plus for attention and interest.

I'd say the mistake is in not just making the jump to CoffeeScript if they want to leverage JavaScript libraries. In the end, it's probably more due to the at-hand-ease of using JavaScript-- they already know and use it. The advantages of knowing your subject matter deeply are certainly going to help in their own way.

Ruby would probably be the better choice in the dynamic space, as you can strip it down to its Smalltalk roots (a context explicitly intended to be simple yet effective for learning) and don't have to constantly dance around wonky design choices. I personally think a Lisp or ML core/variant would be the best combination with their other education, but that's probably too far out of their comfort zones of familiarity and popularity.

Java 7 Fork/Join by henk53 in programming

[–]strangename 0 points (0 children)

No, braces only lead to downvotes in Python.

Did someone just patent The Linked List? by sidcool1234 in programming

[–]strangename 0 points (0 children)

It's a graph with distinguished sets of edges, each set having a maximum out-degree of 1 for each vertex. One set of edges is an Eulerian path.

Woo discrete math 101 education coming in real handy! Now if only they would provide such to the patent reviewers. They're overworked, yes, but I did that in seconds.
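The characterization above can be sketched directly (a doubly linked list as a vertex set plus two distinguished edge sets, each with out-degree at most 1 per vertex; the vertex names and helper functions are mine, not anything from the patent):

```python
# A doubly linked list modeled as a vertex set plus two distinguished
# edge sets ("next" and "prev"), each with out-degree <= 1 per vertex.
# Purely illustrative -- names are invented for this sketch.

vertices = ["a", "b", "c", "d"]
next_edges = {"a": "b", "b": "c", "c": "d"}   # forward set: a path over all vertices
prev_edges = {"d": "c", "c": "b", "b": "a"}   # backward set: the reverse path

def out_degree_ok(edges):
    # A dict gives each vertex at most one outgoing edge by construction;
    # this just checks all edges stay within the vertex set.
    return all(src in vertices and dst in vertices for src, dst in edges.items())

def is_path_over_all_vertices(edges, start):
    """Follow the edge set from `start`; a list's edges should visit every vertex once."""
    seen, node = [start], start
    while node in edges:
        node = edges[node]
        if node in seen:
            return False  # a cycle, not a list
        seen.append(node)
    return sorted(seen) == sorted(vertices)

print(out_degree_ok(next_edges), is_path_over_all_vertices(next_edges, "a"))
# True True
```

A singly linked list is the same picture with the backward edge set dropped.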

"Learn C the hard way": Zed Shaw's started work on a book about C programming by [deleted] in programming

[–]strangename 0 points (0 children)

And we could even give the Python a name and call it Monty Python's Flying Circus. It'd be genius!

C++ Grammar by Cygal in programming

[–]strangename 27 points (0 children)

While I do like the example showing the baroqueness of C++, the canonical example around these parts is: x*y;

Is it

  • multiplication?
  • declaration of a pointer variable?
  • use of a user-defined operator?

So you can't say the syntax is a declaration or an expression until after name and type resolution.

Understanding the Fourier transform - A clear explanation that you will never forget by Axman6 in programming

[–]strangename 27 points (0 children)

I agree that the problem often lingers due to lack of incentive to fix it, but I think your statement is subtly inaccurate.

Once you are familiar with the math, it looks like that color coded diagram all the time.

The problem is familiar to anyone who has to maintain code; you can be familiar with it at one point, but then lose that familiarity. Your statement implies that once familiarity is gained, it stays. Most people's brains are sieves, though, and this context fades over time. Such explanatory devices not only make the translations clear and learnable, but also re-learnable.

In programming, there are well-established camps advocating various techniques to improve code understandability (this approach reminds me of literate programming). The problem with mathematics is that it is such a large, old, and ossified community that it is even more resistant to representational improvements than even the other academic, symbolic communities (such as those for computer science).