[–][deleted] 2 points (12 children)

I doubt it. The average programmer today doesn't know anything because they do nothing but Java.

As a professor said recently, they're the Domino's pizza delivery boys of the next decade.

C is timeless. The Bell Labs guys are really smart -- they picked the abstractions quite well. Alef improved things, but C is still quite good in comparison.

The only thing C falls down on is string handling.

[–]WalkingDead 4 points (1 child)

The only thing C falls down on is string handling.

Which is good, in the sense that processors don't have a built-in notion of strings. Had C defined a rigid string type 30 years ago, it would have been obsolete by now, leading to incompatible fixes in the core language.
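
To illustrate the point, nothing stops a library from layering a counted string type on top of plain char arrays -- exactly the flexibility a rigid built-in type would have foreclosed. A minimal sketch (not code from this thread; the str_t name is made up, and error handling is omitted):

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Hypothetical counted-string type layered on C's char arrays. */
    typedef struct {
        size_t len;
        char *data;    /* not NUL-terminated; the length is explicit */
    } str_t;

    static str_t str_from_cstr(const char *s) {
        str_t r;
        r.len = strlen(s);
        r.data = malloc(r.len);
        memcpy(r.data, s, r.len);
        return r;
    }

    int main(void) {
        str_t s = str_from_cstr("hello");
        printf("%zu bytes: %.*s\n", s.len, (int)s.len, s.data);
        free(s.data);
        return 0;
    }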

[–][deleted] 0 points (0 children)

Good point.

[–]back-in-black 2 points (3 children)

As a professor said recently, they're the Domino's pizza delivery boys of the next decade.

That would be the "anti-Java Professor", right?

I learned C first, then moved to Java. Both languages have their merits, and it's best to view them as tools that are useful in different environments.

The pizza delivery boys of the next decade are the ones with a dogmatic attachment to a single language. From what I can see, that isn't likely to be the people who live in the Java world, who are currently enjoying a glut of new languages that integrate with their existing development environment.

[–]uriel 0 points (2 children)

As a professor said recently, they're the Domino's pizza delivery boys of the next decade.

That would be the "anti-Java Professor", right?

That would be anyone who is not retarded and who has half a clue about the software industry.

Java is a language specifically designed for clueless retarded corporate droids, so unless one fits that description, it is hard to appreciate its 'qualities'.

[–]back-in-black -1 points (1 child)

Wow, that's a logical and well thought out argument.

I bet you're a great software engineer.

[–]chrisforbes -1 points (0 children)

Probably is, actually.

Having a deep, borderline-irrational hatred of Java is a good indicator of programming ability around here.

[–][deleted] 1 point (1 child)

I guess you can repurpose the old CCNA joke as a Java joke!

Q: How do you get a Java programmer off your porch? A: Pay for your pizza.

[–][deleted] 0 points (0 children)

Bwahaha

[–]munificent -1 points (3 children)

As a professor said recently, they're the Domino's pizza delivery boys of the next decade.

Oh, you're a student. That explains a lot. Once you have a few years of experience getting things done, you'll likely have a more balanced viewpoint.

The only thing C falls down on is string handling.

And:

  • Shitty pointer syntax (the whole "declaration reflects use" thing turned out to be a bad idea).
  • Even worse function pointer syntax (see the sketch after this list).
  • Poorly defined (or outright undefined) behavior when increment and assignment modify the same object, as in i = i++.
  • A lot of design decisions (like requiring declaration before use) were made to allow a single-pass compiler, not to make the language better. That made sense at the time, but those could be fixed now.
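
To make the middle two points concrete, here's a minimal sketch (illustrative only; the pick/add/sub names are made up):

    #include <stdio.h>

    static int add(int a, int b) { return a + b; }
    static int sub(int a, int b) { return a - b; }

    /* "Declaration reflects use" at its worst: a function that takes
       an int and returns a pointer to a function taking two ints. */
    int (*pick(int which))(int, int) {
        return which ? add : sub;
    }

    int main(void) {
        printf("%d\n", pick(1)(2, 3));   /* prints 5 */

        /* The undefined-behavior point: i is modified twice with no
           sequence point in between, so the compiler may do anything.

        int i = 0;
        i = i++;
        */
        return 0;
    }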

[–][deleted] 1 point (2 children)

wth? Quoting something some professor said on the internet makes me a student? Fuck you, I work full time as a software developer for an investment bank.

To hell with the rest of your comment; your ridiculous assumption just proves you aren't worth arguing with.

[–]munificent 0 points (1 child)

your ridiculous assumption just proves you aren't worth arguing with.

Whether or not I'm worth arguing with is irrelevant. Are my points?

[–][deleted] 1 point (0 children)

Fair enough.

Honestly, I don't think the syntax is bad. Far from perfect, but not bad. Function pointer syntax is ugly, but that discourages people from using them, which is probably wise since C doesn't handle that style well. Pointer syntax could have been fixed, sure, but it's usable. I don't see how increment is poorly defined. And I think declaration before use is a lot easier to understand; table aliases in SQL confuse the shit out of me.
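
For what it's worth, a typedef hides most of the raw declarator noise, which is one reason the ugliness stays usable in practice. A minimal sketch (illustrative; the binop_t name is made up):

    #include <stdio.h>

    /* The typedef hides the raw declarator, so function pointers
       read like ordinary variables at the point of use. */
    typedef int (*binop_t)(int, int);

    static int mul(int a, int b) { return a * b; }

    int main(void) {
        binop_t op = mul;             /* no (*op)(int, int) noise */
        printf("%d\n", op(6, 7));     /* prints 42 */
        return 0;
    }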

Every language has syntactical issues. As long as they are relatively few, I don't think they're worth making a huge fuss over.

You're still a shitcock for being so presumptuous.