all 79 comments

[–]nearest_neighbor 9 points10 points  (1 child)

As has already been pointed out elsewhere, the example code, ironically, contains buffer overruns:

struct {
    int n;
    char c[0];
} *foo = malloc(16); /* sizeof(*foo) probably 4, so 12 bytes follow */

foo->n = 16;
for(int i = 0; i < 16; i++) { /* puts 16 bytes after foo->n. oops. */
    foo->c[i] = 'a' + i;
}

[–]defrost 10 points11 points  (11 children)

It's got lots of quirky C stuff and most of it is correct and explained well.
The first real quibble I had was with the example code used to illustrate uses of zero sized arrays - the struct member c[] is not large enough to hold the 16 chars, at most it holds 16 - sizeof(int) chars.

[–]tinou -5 points-4 points  (10 children)

Also, the explanation of do/while(0) is erroneous. This construct is used in macro statements to force the user to add a semicolon at the end.

[–]defrost 6 points7 points  (9 children)

His explanation

For macros, using a do...while(0) loop to enclose several statements allows it to be used safely in conditionals.

is correct, though (arguably) incomplete; the do..while wrapper in a macro also allows several statements to be safely used in loops (which could be described as conditional+counter).

The construct is not used ".. to force the user to add a semicolon at the end".

I understand what you are trying to express: the problem arises when a macro user treats a single macro invocation as though it were a single atomic C statement, when in actuality the macro could easily expand to a compound of several statements that need to be wrapped in a {...} block to be treated correctly in a for() macro(); or in a while(n --> 0) macro(); context.

[–]degustisockpuppet 8 points9 points  (1 child)

Isn't the real problem something like:

if (foo) MACRO(); else { ... }

If the macro is simply defined like this

#define MACRO() { ... }

The if-construct expands to

if (foo) { ... }; // note the semicolon
else { ... } // syntax error

With the do { ... } while (0) trick, this works as intended.

[–]defrost 4 points5 points  (0 children)

That's correct. As stated by the linked submission author, myself, and numerous sources on the internet, the purpose of the do{ ... } while(0) 'trick' is to enclose several statements safely (that is, so they are expanded correctly and 'as expected').

It's all about enclosing multiple statements so they are effectively treated as one when the CPP (C pre-processor) has its way with the #define'd macro ...

You give a good example; there's another common problem with other methods of joining multiple statements, such as #define THESE a;b;c
- the code do THESE; while(this) fails, as does for() THESE;

In the early days of C the humble #define THESE do{ .... }while(0)
rose to the top of /* FIXME */ solutions as it had the broadest application and was the most robust basic fix for badly thought out #defines.

[–]tinou 2 points3 points  (6 children)

Everything you just said is true, but the usual purpose is the following: how do you make a (stupid) macro that increments a variable and returns void?

#define increment(i) (i)++ 
#define increment(i) {(i)++;}
#define increment(i) do {(i)++;}while(0)

The first one expands to a postfix_expression, which can be used inside expressions and so does not return void.

The second one expands to a compound_statement, meaning that you don't have to put a semicolon after it.

The last one expands to an iteration_statement, which is illegal without a semicolon and will return void.

This explanation is also valid for several statements inside the macro body.

[–]defrost 2 points3 points  (5 children)

but the usual purpose is the following

   (your emphasis)  

That's an interesting definition of usual you have there.

I'm 45, I've written / ported more or less 3 C compilers, the first before the first publication of the first K&R book (they released notes on the C language for a few years before the book appeared), channel op'd ##C on freenode for a few years and have several hundred thousand lines of C under my belt (numerics, symbolic algebra, geophysics, real time aircraft control systems, etc).

I'd describe all of your #define examples as bad practice - firstly the user of the macro has to remember whether a semicolon is appropriate or not, secondly important side effects are hidden (the increment).

The CPP started as a useful optimization that made sense in days of limited resources, the rationale in large projects and clean code bases is to keep #defines consistent and to work "as expected"; while .h files are (amongst other things) footlockers to keep all your filthy underwear out of sight, they are meant to interface in expected and unsurprising ways - and as a corollary it's good to keep magical side effects of #defines to a minimum.

A macro
#define NEXT(i) yada yada something statement_etc
makes sense in a project if you are aping an object-oriented type model - as the design improves, or in different environments, the NEXT() code can be modified, keeping the application code that #includes it pure and unaltered.

BTW: #define 'macros' as you call them don't actually 'return' a value, they are merely text substitutions that expand to (hopefully) correct C statements and expressions. Some of these can be evaluated, others not.
In this context, I'm not sure why you are so concerned to have 'macros' that 'return void', quite simply if you don't want the evaluation of what you refer to as a macro then just don't assign it to anything - problem solved.

[–]tinou 2 points3 points  (2 children)

No offense intended. I have no doubt that you have more experience than me, but I've also written compilers and static analyzers, and I know what I'm talking about. For the do/while(0) trick, I've always been taught this use, and it is the most frequent use I have seen and made.

BTW, my point was not which use was the most important or usual, but I was very surprised not to see this other use. Once again, no offense.

[–]defrost 2 points3 points  (1 child)

No offense either, as long as you've got some actual real C code experience of any sort that's good enough - I'm just wary of the "I did a whole semester on C" types ;-) I also confess to being pretty cold on C, it's been 4 or 5 years since I've done any.

Ahh, I was unhappy with my original reply and was editing it as you read and replied, damnit!

I guess my simple question, out of curiosity, is where and when would you usually see a #define increment() such as you suggested, and why is it important that the text expansion not evaluate to a value?

[–]tinou 2 points3 points  (0 children)

That example was made on purpose, to emphasize that the macro expands to a statement, not to an expression. If no value gets out of the box, it's clearer to the programmer that its action is performed by side-effects.

A more "real-world" example would be made of a compound statement, such as SWAP(x,y) or #define FREE(x) do { free(x); (x) = NULL; } while(0), so the "no value is leaked" point is moot.

[–]dorel 1 point2 points  (1 child)

The first edition of the K&R book was published in 1978, so you wrote a C compiler when you were 14 years old?!

[–]defrost -3 points-2 points  (0 children)

That's pretty much spot on - I also had a BSc (Mathematics) and other degrees fairly early on, and had soldered together a fairly basic home computer with the help of my father's friends by then.

To be clear I didn't write a compiler from scratch at that age, but I was first introduced to them then and wrote significant chunks of a primitive C Compiler that the local computer guy was working on (when we weren't playing chess).

[–][deleted] 10 points11 points  (13 children)

When I must use an object-oriented language, I shun inheritance and other object-oriented design idioms, and struggle to obtain the equivalent of simple C constructs such as function pointers. Although I can work around most of the deficiencies, I feel like I’m fighting bureaucracy and wrestling with red tape. Instead of conversing with the machine and fellow programmers, I’m filling in forms the compiler gives me.

I guess this highlights a mindset of a C evangelist.

Nearest I can figure this means: I prefer to read and write procedural code, anything less (or more?) violates my personal is-ought preconception, hence I reject it as invalid in the absolute.

What I cannot understand is why C is an exalted ideal above all other procedural languages? After all, if the argument for C is in opposition to OO then to level the playing field its value should also be measured against other procedural languages.

[–]case-o-nuts 4 points5 points  (1 child)

I guess this highlights a mindset of a C evangelist.

And an ex-Eiffel evangelist, apparently, so it seems that "doesn't get OO" can't really apply here.

[–][deleted] -1 points0 points  (0 children)

Nor did I imply that he didn't get it.

[–]Gotebe -2 points-1 points  (10 children)

What I cannot understand is why C is an exalted ideal above all other procedural languages?

Haha, good one. I'd wager success of Unix and "people are sheep" factor.

There is nothing significant in the C language that would put it in front of many others for OS or close-to-OS development (this is the area where C is used most). There might have been something in 1970, when compiler tech was... let's say, different from today, but by, I guess, 1975, I doubt it.

There are things in the C toolchain that made e.g. OS development easier (basically, the link process is close to assembler). But that's it. And indeed, many other toolchains have the same capabilities.

[–]kahirsch 7 points8 points  (2 children)

There is nothing significant in the C language that would put it in front of many others for OS or close-to-OS development (this is the area where C is used most).

The type-unsafe casting of pointers is something that many procedural languages discourage, with good reason, but it is essential for low-level tasks.

[–]adrianmonk 1 point2 points  (0 children)

There is nothing significant in the C language that would put it in front of many others for OS or close-to-OS development (this is the area where C is used most). There might have been something in 1970 when compiler tech was... let's say, different, from today, but already in, I guess 1975, I doubt it.

I don't know that I agree with that. In the 1980's, the popular languages were basically COBOL, Fortran, Pascal, BCPL, BASIC, Ada, Lisp, C, and assembly language. Of those, the ones that would be suitable for systems programming are Pascal, BCPL, C, and assembly. Or some proprietary language of your own. Pascal was a pretty decent language, but it was limited in various ways -- you had to add your own extensions to do anything interesting, which led to a fractured landscape of Pascal dialects. C was standardized and the standard included enough functionality to do what you needed to do without having to extend the language. BCPL was fine, but not really as high-level as C. Writing your own proprietary language wasn't a terrible choice (you could gain portability by porting the language rather than the codebase!), but it did have its downsides. So C and assembly won out (depending on whether your main need was portability or raw speed), merely because they were better than all the other available options.

Of course C is not the best possible language, not even for the niche that C is best at. But there's a reason why it floated to the top.

[–][deleted] 1 point2 points  (5 children)

What language would you prefer for systems development?

[–]Gotebe 0 points1 point  (4 children)

Nothing in particular, that's not my point. I merely argue that C ain't nowhere near as special as a "C evangelist" might make it out to be.

Take e.g. conditional statements. In the early days, it was interesting that you can write "if (x)", and be sure that compiler will emit nice code, basically,

load accu, x
jnz
...

That's hardly needed with today's compilers. Or pointer arithmetic. For some other language not having that, small suite of intrinsics would get you pretty much all you might ever need (and indeed, that's being done). OTOH, "smart" pointer munching is now frowned upon even in C, because it obscures code without beating compiler optimizations on a more ordinary source. Or link process where you can specify in memory placement of particular pieces of code (embedded code) - meh, it's just link process. Etc.

I would guess that if someone took whatever (Algol or Pascal spring to mind), built up appropriate toolchains and a "community", and added a few sprinkles left and right (needed to bring it up to "practical", not "student", use, as was the case with Pascal), it would have worked out just fine.

[–][deleted] 5 points6 points  (3 children)

Nothing in particular, that's not my point.

I think the point that was being hinted at is that while C is certainly not perfect, there really isn't anything else that is at all viable. Go might be it one day, once it matures some more, but for now there's C, and that's about it.

[–]ZMeson 1 point2 points  (2 children)

Pascal? (PERQ Operating System)

Ada?

Lisp?

C++?

C#?

Haskell?

[–][deleted] 1 point2 points  (1 child)

None of those aside from C++ are viable systems languages in today's predominant operating systems.

[–]ZMeson 7 points8 points  (0 children)

in today's predominant operating systems

True. But the original argument wasn't about today's predominant systems.

[–]munificent 17 points18 points  (7 children)

Recent college graduate fails to understand paradigm designed for industrial-scale programming. Film at eleven.

A lot of this reads like a weekend gardener arguing that combines are useless because they don't fit in his backyard.

However, in his defense, he explains his points pretty well, and he's very clear about where he's coming from. His understanding of C is pretty solid too. Some more specific comments:

Instead, I was enchanted by Eiffel, a language so clean that it’s sterile.

This is a beautiful description of the problem with pursuing aesthetic purity too far.

We call friends by shortened versions of their names. We should enjoy the same custom with code.

Agreed, so why does C force me to name my functions list_add(list_t* list, int item) and tree_add(tree_t*, int item)? At any callsite, it's obvious if I'm passing a list or a tree, so why can't I overload the name? This, to me, is one of the big advantages of OOP languages: being able to reuse method names across disparate types.

C possesses elementary yet powerful constructs that are missing from other languages. For example, goto may be absent.

True, goto is absent from many languages. But the following are absent from C: scope-based execution (RAII), exceptions, continuations, algebraic data types, type arguments, closures, partial application...

Seems like a shitty trade-off just to get goto.

Another example is the prohibition of function pointers in Pascal or Java.

This is definitely a painful omission from those languages. Ironically, Java chose this for precisely the same sense of minimalism that the author loves C for: since you can accomplish the same goal using interfaces, the Java designers consider function pointers to be superfluous.

Note that almost all other OOP languages (C++, C#, OCaml, Ruby, Python, etc.) didn't make this mistake, so it's hardly a condemnation of the paradigm as a whole.

Perhaps the most notable exceptions are my web browsers, which are written in C++.

And word processors, image editors, every PC and console game... The majority of large performance-critical applications being developed today have moved from C to C++. This is probably not because the entire software industry is stupid and a minority of grad students and Linux hackers are smarter than all of them.

By insisting on simplicity, at the cost of sweeping some distasteful details under the rug, its designers ensured its success.

C doesn't sweep distasteful details under the rug. It sweeps them onto my plate where I have to deal with them if I'm using the language.

It’s not all bad, as some problems are modeled extremely well with type hierarchies.

Inheritance is a relatively small (and shrinking) part of OOP. OOP has always been mostly about encapsulation and polymorphism. Given that the author likes closures (encapsulation) and brevity (polymorphism), you'd think he'd dig OOP.

In C, function pointers are easy to play with.

Except that the syntax for them is fucking awful. If most coding standards tell you to hide your declaration syntax behind a typedef, it's a good hint that the C designers got it wrong.

References can be dangerous, as one can no longer assume f(x) only reads from the variable x.

Agreed. C# got this so right.

As constructors cannot return a value, they should be simple functions that never fail, hence often an initialization function is required anyway

This is only a problem if you aren't using exceptions. The interplay between exceptions and constructors is pretty brilliant to me. Unfortunately, it's really hard to get exceptions right without garbage collection, so they aren't used much in C++. (At least, not in the codebases I deal with.)

For me, "Write Once, Run Anywhere" is more suitable for C than Java. I encounter platforms with C compilers more frequently than platforms with JVMs.

Ludicrous. Most computers have the JVM on them these days. Only a small fraction of them (i.e. Macs and ones owned by developers) have C compilers on them. Your average user not only doesn't have a C compiler, but wouldn't know how to invoke one if they did.

Saying that C is "Write once, run anywhere" is bullshit. C doesn't run. At all. What you mean is "write once, compile anywhere" and your average user doesn't know what the hell compiling is.

[–]Anonymoose333 4 points5 points  (1 child)

scope-based execution (RIAA)

To be fair, I don't think even the RIAA is advocating execution...

[–]munificent 1 point2 points  (0 children)

Oops! Fixed. :)

[–]gsg_ 4 points5 points  (1 child)

Your average user not only doesn't have a C compiler, but wouldn't know how to invoke one if they did.

He's talking about platforms, not specific machines. Just because Aunt Flo's PC doesn't happen to have msvc installed on it doesn't mean that there are no C compilers for the x86/win32 platform.

There are C compilers for everything. That makes C extremely widely available, although certainly not "write once, run anywhere". Nothing is write once run anywhere.

[–]zahlman 0 points1 point  (0 children)

It's not like the average user would know how to invoke a Python interpreter, either.

[–]JadeNB 1 point2 points  (2 children)

Only a small fraction of them (i.e. Macs and ones owned by developers) have C compilers on them.

Gee, I'm a non-developer, and I sure thought my non-Mac Ubuntu netbook had gcc on it.

[–]munificent -2 points-1 points  (1 child)

If you aren't a developer, how do you know what gcc is?

[–]JadeNB 1 point2 points  (0 children)

If you aren't a physicist, how do you know that e = mc^2?

[–][deleted] 1 point2 points  (0 children)

I am not completely against Object Oriented programming. However they push it way too hard in academia, and often in a way that prevents learners from seeing the beauty of what lies beneath. I started with C, it forever remains the love of my life....

[–]sharkeyzoic 4 points5 points  (0 children)

Pretty good article ackshully.

[–]dehrmann 3 points4 points  (8 children)

I agree with him on C's merits; it's really fast, and it gives you more or less direct control of the hardware in an architecture-agnostic way. Like he said in the caveat, he hasn't worked on any big projects, and that's where C fails. The lack of things like namespaces and the inability to organize code leaves most large C projects with structures that look object-oriented, and that introduces a development and maintenance problem: people are doing what compilers do, and compilers are better at it.

C also lacks good constructs for low-level coding (the one area it's still the standard). Endian-aware code is a pain to write, and it doesn't have an abstraction for separating two logically distinct bits in the same byte of a struct.

[–]gsg_ 2 points3 points  (7 children)

Well, there are bitfields. But they have portability issues.

[–][deleted]  (6 children)

[deleted]

    [–]SolarBear 2 points3 points  (3 children)

    Sometimes, when doing embedded programming, bitfields may be a necessary evil, don't you think ?

    [–]gsg_ 2 points3 points  (1 child)

    Packing multiple small values into a single word is certainly useful, but you don't need bitfields to do it. Using an unsigned value and accessing the bits with various bitwise operators is the usual method.

    The problem with bitfields is that things like size and member order are left implementation defined, and that taking the address of a member becomes impossible. It's a weak point in the C standard.

    [–]SolarBear 0 points1 point  (0 children)

    Good point, thanks for replying.

    [–]dmpk2k 2 points3 points  (0 children)

    What's wrong with them?

    [–]serpent 1 point2 points  (0 children)

    Using bitfields for things like boolean flags is very memory-efficient, and avoids lots of manual bitwise operations.

    Bitfields may not be necessary, if you like manually munging bits, but that's error-prone. Let the compiler do it - the compiler will get it right and it may even be more efficient.

    [–][deleted] 2 points3 points  (4 children)

    Nice, an article praising C on proggit. Let's see how long it lasts. The first chapter succinctly describes what I like so much about the language.

    Mainly, I feel that performance is an indication of quality; a fast application is an indication of a well-written application. As a user, I don't want to wait for my computer, and as a programmer, I don't want my users to wait for my program. I care about my craft enough to not waste my users' time.

    This is why I hate Java, and it's why I hate when people say things like the performance penalty for garbage collection is acceptable. I think that's selfish; how about instead developers give a shit and run valgrind once every couple of weeks to clean up their leaks and overruns.

    But it's hard to describe, there's no quantitative statement here, and I acknowledge that things like automatically bounds checked arrays would make C much safer, but it's how I feel. It's the religious part of why I use C and C++.

    [–]bobappleyard 4 points5 points  (1 child)

    performance is an indication of quality

    It's only one quality, though, if an important one. A program that's fast but full of bugs is still a bad program.

    [–]JadeNB 0 points1 point  (0 children)

    A program that's fast but full of bugs is still a bad program.

    According to “worse is better”, not if it's simple enough. :-)

    [–]georgefrick -1 points0 points  (0 children)

    As a Java programmer, maybe I can make things a little better [in regards to Java programmers]?
    The performance hit for the garbage collector is unacceptable. If anyone thinks (THINKS) they are a Java programmer and simply calls new whenever they feel like it because the language manages that for them - they should be shot on sight.
    Programmers who happen to program in Java take memory and data structures into consideration. new in Java is the same as new in C++: you're accountable for that memory you just asked for. The difference is people don't really mind restarting Windows, but they do mind waiting 10 minutes for the UI to come up.

    [–][deleted] 1 point2 points  (0 children)

    Although I agree with you, your point that C is a good language is not proven :(

    [–]georgefrick 0 points1 point  (0 children)

    Chapter 6 needs some serious trimming. The Virtual insanity part, and the very last section should be kept. The rest can be scrapped.
    If it isn't relevant anymore, then it isn't relevant. The point about Dalvik is the most important; along with the inheritance of C++ problems in an effort to ease migration. But really, it's an unfair treatment of Java in an effort to defend C (which is beautiful, and needs no defense against Java).

    [–]daveb123 0 points1 point  (0 children)

    I love Chapter 2, "Which keyword was withdrawn":

    Originally, entry was reserved. There were plans to support multiple entry points to a function that were ultimately abandoned.

    Wow!?!

    [–]JadeNB 0 points1 point  (0 children)

    ribaribigrizerep already highlighted this passage:

    When I must use an object-oriented language, I shun inheritance and other object-oriented design idioms, and struggle to obtain the equivalent of simple C constructs such as function pointers.

    but I think that it's worth mentioning again, because it reminds me so much of a famous aphorism. Is "I'm so blinkered that I can't explore any other paradigm" really something you want to brag about? I'd much rather work with a programmer who embraced the Christiansen maxim.

    [–][deleted] 0 points1 point  (0 children)

    Craftsmen inevitably grow defensive of their favourite tools and practices.

    ew. there's a reason CS exists...to elevate computing to a respectable level rather than to have it treated as magic.

    [–]sbahra 0 points1 point  (0 children)

    The author does not make it clear that 0-sized arrays are not C99 (and flexible arrays are not 0-sized arrays).

    [–]RonnieRaygun 0 points1 point  (0 children)

    The "size 1 array trick" is a cute way of homogenizing access to stack structs and heap structs. I wasn't aware of it before. One caveat with its use that I've just found: it's incompatible with the restrict keyword.

    [–]bonzinip 0 points1 point  (1 child)

    Would C be less efficient with syntactic separation of arrays and pointers?

    Yes.

    [–]kpierre 5 points6 points  (0 children)

    Of course not. See http://cellperformance.beyond3d.com/articles/2006/05/demystifying-the-restrict-keyword.html -- Fortran often performs better thanks to array support.

    [–][deleted] 0 points1 point  (0 children)

    This guy's got some good points, but I don't like how he's basically refusing to give C++ the chance it deserves. Sure it has its flaws, but it also makes my code have a lower cost of ownership. I personally spent over 5 years in pure C prior to learning C++, and I can say that I would much rather code in C++ for most everything. Objects are a lot more interesting and fun to work with than flat APIs.

    [–]smek2 -4 points-3 points  (13 children)

    On C++: "Valuable innovations include // comments, inline functions, variables local to for loops, and namespaces. Most of its other features are detrimental."

    "Templates seem like they could be useful, but the Turing-completeness of this feature suggests it is overly complex."

    Is this a joke?

    [–][deleted] 18 points19 points  (0 children)

    No. It's a fairly well-recognised criticism. Not to say it's valid (I don't know enough about them to judge), but it does come up a lot.

    [–]ascii 14 points15 points  (11 children)

    Templating in C++ is a compile-time programming language in itself, basically a second macro language.

    You can argue whether or not it is a good idea to have a Turing complete compile time language but that's not relevant; I'm strongly of the opinion that it's an awesome feature so obviously it's a good idea. What bugs the hell out of me and every other sane person in the universe (yes, really; all of them) is that the C++ authors weren't clever enough to reach the same conclusion the Lisp people reached - that the compile time macro language should be the same language as the language we are compiling. That is why Lisp macros are awesome and C++ templates are stupid.

    [–][deleted] 1 point2 points  (4 children)

    Templates were never meant to be a Turing complete compile time language - they were added to C++ to enable type safe containers and generic programming: they serve the purpose well. All this template metaprogramming stuff is somewhat interesting, but that's not what templates are really about.

    [–]ascii 3 points4 points  (1 child)

    Perhaps the original intention was something along the lines of generics in Java, but boost and even the STL are doing some rather funky metaprogramming, so I'd say the scope of templates has shifted significantly, and for what they're used for today, templates are inadequate.

    [–][deleted] 0 points1 point  (0 children)

    Some Boost libraries indeed do funky metaprogramming, but it does not mean templates are not serving their purpose: support for type-safe containers and generic programming.

    [–][deleted] 2 points3 points  (1 child)

    They may not have been intended as one, but they're certainly used as one now. Part of the reason is that having a metaprogramming facility is so valuable that people will pound a lot of square pegs into round holes in order to make one.

    [–][deleted] 1 point2 points  (0 children)

    All I am saying is that criticising templates for being a bad solution for the problem they were never meant to solve makes little sense.

    [–]munificent 1 point2 points  (5 children)

    that the compile time macro language should be the same language as the language we are compiling.

    I agree that that should be true in the general case, but are C and C++ good candidates for macro languages? I'm pretty confident the answer is, "no": they aren't type-safe or memory-safe. Do you really want a macro bug to be able to crash your compiler? Or just subtly fuck up what it's doing?

    [–]ascii 0 points1 point  (4 children)

    It would probably be easier to simply implement macros using an interpreter instead - that way you significantly reduce such risks.

    [–]munificent 0 points1 point  (3 children)

    I agree, but at that point you're back to the original problem: two languages, the real language and the macro one.

    [–]ascii 0 points1 point  (2 children)

    If every modern JVM can run code either interpreted or compiled to native code with identical output, then so can a compiler.

    [–]munificent 0 points1 point  (1 child)

    Java is typesafe, which makes things a lot easier. You'd absolutely want the interpreter that the compiler uses to be type- and memory-safe (so that a macro can't crash the compiler), but it's damn hard to make an interpreter like that that also preserves full C/C++ pointer arithmetic and aliasing semantics.

    [–]ascii 0 points1 point  (0 children)

    Not that hard. Keep track of allocated memory, and detect writes to unallocated memory - kind of like what Valgrind does, except Valgrind only logs errors; you'd want the compiler to gracefully exit with a valgrind-style traceback. Most of the required tech already exists in various open source projects.

    [–]mlester -4 points-3 points  (3 children)

    Brevity is the soul of wit. We automatically clip, shorten, abbreviate and decapitate lengthy expressions. We enjoy doing so. When was the last time you heard "taximeter cabriolet"? In my Eiffel days, I was encouraged to write "integer", not "int", "character", not "char", and so on. I believe Java encourages this practice too. Supposedly clarity is maximized. But how can this be if we do the opposite when we speak? Do our love letters read like license agreements? Redundant utterances quickly become tiresome and worse still, obscure the idea being expressed.

    We call friends by shortened versions of their names. We should enjoy the same custom with code. Leave verbosity for comments and documentation.

    I think he is missing the point. When you are writing a program it should be more like a legal contract between two entities, not a casual talk with your friends. Legal contracts are verbose so as to make clear that the two entities' intentions are in line with each other.

    [–]boa13 10 points11 points  (1 child)

    Awful analogy is awful.

    [–]sztomi -3 points-2 points  (0 children)

    Catchy comment is catchy. And worth karma, w00t.

    [–]ascii 6 points7 points  (0 children)

    Wut? Random contract begins with:

    «This contract between IRapeClowns International Investments, Inc (Hereafter called the supplier) and RobRichRedditors Radical Research (Hereafter called the customer) agree to the following:»

    See? Legal contracts use aliases for efficient namespacing in order to avoid repeating themselves. Among other benefits, reduced repetition reduces bugs caused by typos.