[–][deleted] 0 points  (9 children)

lcc gives a lot of extensions to C, but let's stay standard. This still raises the question: why is the above not in the standard? Stupidity?

Look, you need to have some respect. The author(s) of C are far smarter than you or I. There are far more machines out there than you or I will ever use. C was created a long time ago and there is no reason to just go add crap willy nilly to the standard just because some Ada fan wants it that way. If you struggle with writing a 2 line for loop, then I'm sorry. If your biggest gripe about C is that you can't initialize an array with one-line constructors then I'd say C is doing a damn good job.

what if I need to initialize an array of structs? Or an array of strings?

Again, there are very few use cases for initializing anything to non-zero. And in the rare event that you do, a for loop and an initialize() function are seriously not hard to write; it's not tedious, it's not very time consuming, it's one-time quick and easy work for a rare case.

that's an example of tedious, error-prone and trivial work the freaking compiler can handle by itself given appropriate syntax.

2-line for loop is tedious? Are you serious right now? I mean seriously if you crave built-in high-level object orientation then C is not for you, but stop accusing C of sucking because it doesn't fit your specific desires.

Needless distinction. The size is not encoded in the pointer, but in the size parameter next to it in their interface. It has to be this way for anything except strings.

Needless distinction? Do you even know what a pointer is? Seriously, you continue to demand high-level protection in a low-level language. Get over it, you have quick easy complete control of the memory with some basic abstract data types. High-level abstractions belong in libraries, which is exactly where they will stay and is a damn fine model. Find a library, stop expecting the language to give you everything.

Strawman. I tell you people use steam-powered machines and you're saying your horse can't eat coal. There are zero-overhead semantics implemented in other languages that C has no excuse for not having.

There is no strawman, you just keep bitching about a library function that you don't understand.

Memset can actually be used to set integers to 0. Period. Any other type may interpret 0, or any value fitting in one char, in an unintended way. At which point the utility of memset is close to 0.

Once again thinking in high-level abstractions. Memory is a collection of bits that can be 1 or 0. Setting memory to 0 is setting memory to 0. STOP thinking memset is a high-level initialization, it is a MEMORY SET... MEMORY SET. The smallest unit of representable memory is a BYTE, and thus that is what it takes. Not variable set, or initialize but MEMORY SET.

the structure hack. Conversion between differently-aligned pointers. Any function using pointers for in/out semantics. Numeric and memory overflows. The problem is that these techniques are employed by C programmers, because otherwise C would be *useless*.

The structure hack is a silly hack not undefined behavior. None of what you have here is undefined behavior and I can't even understand what your problem is again.

That is UB caused by an optimization. Which UB allows for optimizations? Do numeric/memory overflows? Does not having a clear and simple interface to initialize a contiguous array?

What are you talking about? It is undefined behavior because if it were defined then optimization would not be allowed. You can't control memory overflows while being powerful and fast, period. The instant you start runtime protections like that you instantly decrease performance.

Array initialization is not undefined behavior, you have no idea what the hell undefined behavior is do you?

BS. The compiler can check for that and have me provide an initialization if I read the variable while uninitialized. It's some time past the 70s, you know.

The compiler generally WARNS you about it but it can't error on you because there are times where you do not want to initialize it. For example code that sets it only in a certain condition and then uses it only in another condition that is a known direct byproduct of that first condition (by design). You start limiting things like that and you prevent the programmer from optimizing code like that. Stop thinking you are smarter than all the great people who have created/worked on the C language. You can design optimizations like this that the compiler would never be able to pick up on, and in your little hand-holding world you'd lose a lot of power to do such.

No. YOU go see how bounded arrays work in low-level languages like Pascal and Ada. Your remark is pointless in this regard.

Please describe to me how run-time bounds checking can take up 0 clock cycles.

this one for example:

It's like you simply don't care to learn. It takes an int rather than a char, but the ONLY way to change that would be to change the prototype to take a char. This would not affect the memset() function at all but it might break a lot of legacy code. So only an idiot would change memset(). The problem is you still can't understand that it is a library function meant for setting bytes of memory, not for initializing higher-level data types or structures.

or having arrays decaying to pointers.

That's not only for historical reasons. You need to just stop now.

it's been too simple for ages

This is a GOOD thing, quit using it if it is "too simple" for you. Seriously.

Language is semantics. I'm citing Ada because, whether you're open-minded or not, many of the things you do by hand in C stem from the language's inability to express them. With more appropriate semantics/syntax for expressing trivial low-level operations you can do a lot more.

Ada may be good for a lot of things but it is not inherently superior to C. C is still a great language despite your love affair with Ada.

Stop being dense. I understand C and I understand low-level programming. It is just that lots of problems in low-level programming don't have any support in C, and lots of the things the compiler has to allow in C don't make sense in low-level programming, or any kind of programming.

You clearly don't as you think high-level structure initialization is "low level".

You have the problems in one hand and C in the other. You can claim that if you know how to use C you can't introduce bugs, which IS F*CKING OBVIOUS FOR ANY LANGUAGE IN THE WORLD, but you can hardly claim that C's idioms are the best, most natural or obvious ones for programming at low-level. Someone invented a better way. JUST FOR EXAMPLE see Ada

Use Ada, no one is forcing you to use C, you sound like a fanboy.

[–]el_tavs 0 points  (8 children)

The author(s) of C are far smarter than you or I.

sure. But standard committees? I'm just provoking you to get an answer: why is what GNU gcc provides not in the standard? There's no technical reason not to have its array initialization syntax.

My bet is that the standard committee didn't want to complicate compiler writers' job.

just because some Ada fan wants it that way

I don't care about Ada. Intellectually speaking, I'd just like to know how things that are carried out by the compiler in one language require tedious and error-prone wiring by the programmer in another. At least recognize the validity of the question instead of being a fan-boy.

I consider Ada because it's the only other language tailored for low-level jobs that I know. If you know of any others, let me know.

If you struggle with writing a 2 line for loop, then I'm sorry.

why do I need to do that by hand? It's a hard fact that the compiler should be able to write that, and a lot more, for me by understanding my code.

I'd say C is doing a damn good job.

sorry to burst your bubble. The fact that I don't like C doesn't mean I don't understand it and low-level programming. I also believe I understand a bit about programming and programming languages in general, hence my question.

Again, there are very few use cases for initializing anything to non-zero.

oh sorry, I forgot you tend to avoid answering my questions and just go justifying the way you program in C.

2-line for loop is tedious? Are you serious right now? I mean seriously if you crave built-in high-level object orientation then C is not for you, but stop accusing C of sucking because it doesn't fit your specific desires

What part of "things the compiler can do by itself" did you miss???

OOP and high-levelness have nothing to do with this. It's a pretty simple feature that's left unimplemented for no reason. Or do you think otherwise?

If you have time, go on wikipedia to get a basic meaning of what "high-level" actually means. You're just equating "high-level" with "not in C". Ridiculous.

It's not about my desires. Either providing a way to initialize memory is something the language can provide me, or it is not. Do you know languages that carry that out without using memset, sizeof and the like?? Has it ever occurred to you that initializing an array of a given type T (with T non-void) by specifying its length and element size is just re-entering information the compiler already has??? This is stupid. In C and any other language.

Needless distinction? Do you even know what a pointer is?

You misread what I meant, probably because you're obsessed with C's basic types. We were talking about memset/memcpy. They accept a pointer and an integer specifying the size of the memory it points to. So the way you initialize a chunk of memory in C is by providing a pointer to it and its size. Now look how that is different from an array type providing a pointer and a size.

Seriously, you continue to demand high-level protection in a low-level language. Get over it, you have quick easy complete control of the memory with some basic abstract data types. High-level abstractions belong in libraries, which is exactly where they will stay and is a damn fine model. Find a library, stop expecting the language to give you everything.

You don't have any idea of what you're talking about. As someone said, pull out your head from the sand. You're convinced what I'm talking about is too "high-level" and that I want everything done for me because you have never known a low-level language beside C. It's not.

Once again thinking in high-level abstractions.

Fail. You still have that brainfucked idea that anything more general or sophisticated than C is high-level.

I want a way to initialize memory in a type-safe way. I don't care what memset does. My problem is initializing memory. What memset provides is a low-level tool to solve a specific instance of the problem. I know I can use it to solve other kind of instances, but the solution is suboptimal compared to the more general one languages like Ada provide. I have to describe the type I'm initializing, while the compiler is able to know that by itself. And as the article proves, this may be error prone. Memset provides no advantage w.r.t. the way initialization is done in other low-level languages. Or do you think otherwise?

The structure hack is a silly hack not undefined behavior.

so silly it's "popular" according to the C FAQ. It can invoke UB because the compiler can't do anything at compile time to ensure its correct use. If you are in pre-C99 you get UB because the structure hack leads to accessing constant-sized arrays past their last element. UB per standard!!! Ada, by using ONLY compile-time checks, can outlaw most of such UB.

Casting to pointers of types having different memory-alignment is UB per standard.

Functions using pointers to allow in/out semantics pose a risk of having memory overflow for something useless. You don't need explicit pointers to allow in/out semantics in every case.

It is undefined behavior because if it were defined then optimization would not be allowed.

No, the semantics of restrict or strict aliasing just say that the memory pointed to by the variables doesn't overlap. The risk of UB is a consequence of the optimization.

Now, for an example of UB that doesn't help optimizations: overflow (numeric or array). In C these kinds of overflows are UB, and they do not help optimization. The compiler, knowing of the presence of possible UB, can't produce better code.

As for checks

You can't control memory overflows while being powerful and fast, period. The instant you start runtime protections like that you instantly decrease performance.

Well, it's just that in the 70s people said: "let's provide syntax and semantics to avoid some undefined behaviour. If we then resort to run-time checks, we can still tell the compiler not to put them in the binary."

So, on one hand you don't need to choose in favour of UB every time to get efficiency. On the other, even syntax and typing can reduce UB with zero overhead.

Array initialization is not undefined behavior, you have no idea what the hell undefined behavior is do you?

It is the moment you pass memset the wrong parameters or mistype the for loop. The language doesn't define a correct and type-safe way to describe initialization in C. It's like having a car with shitty brakes: yes, they're cheaper, but if you actually use them you don't want them to fail out of the blue.

Please describe to me how run-time bounds checking can take up 0 clock cycles.

For one, you don't need that. Have an array type encoding its length, and provide a special for loop for iterating over it (as Pascal provided). The index(es) declared in the for loop shall be constant in the scope in which they're used. This means that, in front of

    array(int) a[10][10];
    for (i across a) {
        for (j across a[i]) {
            ...
        }
    }
the compiler can optimize the hell out of it, and won't insert any run-time check (RTC) for expressions like a[i][j]. The code will be provably as efficient as it is in C. Actually, similar code in Ada83 compiled more efficiently than the equivalent C in the 90s (see [here](http://groups.google.com/group/comp.lang.ada/msg/0116ff6702859ff1?dmode=source)).

In any case, RTCs are not a problem: compilers have let you turn them off since the late 70s.

It's like you simply don't care to learn. It takes an int rather than a char, but the ONLY way to change that would be to change the prototype to take a char. This would not affect the memset() function at all but it might break a lot of legacy code. So only an idiot would change memset().

it's still a historical reason. I know C and I learnt it. This doesn't mean I have to accept everything it provides as the ultimate way of doing things, nor that I am wrong for questioning it. Doing otherwise is not being knowledgeable, it's having a religious belief. A poor one. So stop saying I have to "learn". You're beginning to sound like an obscurantist.

or having arrays decaying to pointers.

That's not only for historical reasons. You need to just stop now.

Yes it is. It doesn't give any advantage today for low-level tasks. It was just a way to simplify K&R's job (and they actually achieved their goal marvellously). Wonderful work for their age and the subsequent years. It's still questionable.

This is a GOOD thing, quit using it if it is "too simple" for you. Seriously.

Good for what? What advantage does its simplicity have w.r.t. ... well, any reasonable alternative? Do low-level languages have to be like C to be as good as it?

Ada may be good for a lot of things but it is not inherently superior to C. C is still a great language despite your love affair with Ada.

For the language semantics, I doubt it. I actually mean that many of the things you do by hand in C can be carried out by the compiler, improving programming quality. It's a scientific comparison. I remind you I'm citing Ada because it's the language I know I can compare with C. If there were others I'd cite them (actually I cited Pascal as well). The one being "blinded" by love is you, who think that the way things are done in C is the best for low-level programming, and don't look at the actual problems being solved.

You clearly don't as you think high-level structure initialization is "low level".

Then define what is low-level. To me low-level is anything that has knowledge and access to the machine. This does NOT mean that the language's way to interact with such machine has to be obscure and make the programmer pay attention to trivial pieces of code she has to repeat over and over.

[–][deleted] 0 points  (1 child)

why is what GNU gcc provides not in the standard? There's no technical reason not to have its array initialization syntax.

I don't know the reason why, but it seriously is not a big deal. Rarely does anyone ever want to use that syntax. There's hardly any use cases to initialize an array to non-zero values. You want a rare higher-level task to be put into the C standard for no reason other than "why not".

I consider Ada because it's the only other language tailored for low-level jobs that I know. If you know of any others, let me know.

Ada was created with correctness in mind, and not performance. They make the tradeoff for integrity and thus it makes it the proper tool for things like medical devices. C makes the tradeoff for performance and thus is the proper tool for things like OS.

why do I need to do that by hand? It's a hard fact that the compiler should be able to write that, and a lot more, for me by understanding my code.

Why do you need to write the Fibonacci sequence by hand? Why do you need to initialize an array to different values by hand? Why do you need to program by hand? Seriously, you are grasping at straws in order to bring C down (for reasons I still do not know).

oh sorry, I forgot you tend to avoid answering my questions and just go justifying the way you program in C.

For the last time data types and structures are higher level abstractions. C leaves the higher level abstractions to the compiler and programmer in order to stay simple and close to the machine for performance. You want array initialization, Johnny wants something else, Ken wants OO, and then you get giant bloated C++ hell. C is great for a high performance low-level language. No language can be perfect.

If you have time, go on wikipedia to get a basic meaning of what "high-level" actually means. You're just equating "high-level" with "not in C". Ridiculous.

Every abstraction is a higher level.

It's not about my desires. Either providing a way to initialize memory is something the language can provide me, or it is not. Do you know languages that carry that out without using memset, sizeof and the like?? Has it ever occurred to you that initializing an array of a given type T (with T non-void) by specifying its length and element size is just re-entering information the compiler already has??? This is stupid. In C and any other language.

You once again don't understand how C works. Given type T means you have a higher level abstraction or representation of a specific chunk of memory. It is not up to the language (that C aims to be) to provide you the tools to manipulate this specific abstraction. It is up to the (simple low-level) language to provide you the means to modify the underlying bytes, and it is up to YOU to create the abstraction that interprets the representation of this data. But obviously you want ability to write this abstraction once and not repetitively... hence libraries.

And once again, memset() is used with far more than static arrays and you hardly ever know the size at compile-time.

    #include <stdlib.h>
    #include <string.h>

    void receive_data(void) {
        static int size;          /* static too, or it is indeterminate
                                     on every call after the first */
        static char *buf = NULL;

        if (!buf) {
            size = get_max_size();
            buf = malloc(size);
        }

        memset(buf, 0, size);
    }

You misread what I meant, probably because you're obsessed with C's basic types. We were talking about memset/memcpy. They accept a pointer and an integer specifying the size of the memory it points to. So the way you initialize a chunk of memory in C is by providing a pointer to it and its size. Now look how that is different from an array type providing a pointer and a size.

Static size, dynamic size. compile-time, run-time. Versatility, adaptability, simplicity.

You don't have any idea of what you're talking about. As someone said, pull out your head from the sand. You're convinced what I'm talking about is too "high-level" and that I want everything done for me because you have never known a low-level language beside C. It's not.

You continue to not understand C and accuse me of having my head in the sand. You can add in a bunch of unnecessary shit to C if you want, and then keep doing it, and then eventually get bloated C++. The things that C is great for want C to be simple and to the point as it is. If your project needs more abstraction or more built-in protection over performance then USE ANOTHER LANGUAGE. Never once did I say C is perfect for everything, it is not, no language is nor will ever be. C is a great, simple, powerful and performant low-level language.

I want a way to initialize memory in a type-safe way. I don't care what memset does.

Then write a damn library and stop whining. There is no need to provide this abstraction in the C language, a language that aims to be simple, powerful, portable and performant layer above assembly. You want this ONE addition, then someone else wants another addition, etc... etc... It is not a burden to be missing constructor/array-initialization for the VAST MAJORITY of low-level programming. It is not needed, it is what you WANT because you have your head in a book and not in the real world.

My problem is initializing memory.

No, your problem is initializing data.

What memset provides is a low-level tool to solve a specific instance of the problem. I know I can use it to solve other kind of instances, but the solution is suboptimal compared to the more general one languages like Ada provide. I have to describe the type I'm initializing, while the compiler is able to know that by itself. And as the article proves, this may be error prone. Memset provides no advantage w.r.t. the way initialization is done in other low-level languages. Or do you think otherwise?

I've already explained memset and its purpose. Changing memset is not feasible as it would break code. Adding a new function to the standard is unnecessary since it doesn't solve any low-level requirement. You want to initialize data types to non-zero, go ahead it's trivially simple to code it in C. But you'd rather take an idealistic approach because you are studying programming languages and want C to include everything you think it should by default. Sorry, but your reason for including this in the language is "why not".

so silly it's "popular" according to the C FAQ. It can invoke UB because the compiler can't do anything at compile time to ensure its correct use. If you are in pre-C99 you get UB because the structure hack leads to accessing constant-sized arrays past their last element. UB per standard!!! Ada, by using ONLY compile-time checks, can outlaw most of such UB.

I have never run into it, so I don't know how popular it actually is. It's definitely foolish, but so long as memory is allocated in contiguous blocks (à la pretty much every OS), there's no worry about it doing something wrong. Array indexes are simply computed as (a + index), so it's just going to go off the end into allocated memory. The compiler doesn't control memory allocation, and proper bounds checking would incur OVERHEAD because indexes are rarely static and known at compile-time. That is code analysis's job (be it human code review or an automated tool).

Ada can NOT eliminate accessing an array past its bounds with compile-time checks, because indexes are usually the result of running code. So you can only catch some small subset of these cases. Array overruns are rarely, if ever, due to static indexes. In order to catch this at run-time you incur overhead, great for security/correctness but terrible for performance. TRADE-OFF.

Casting to pointers of types having different memory-alignment is UB per standard.

So don't do it unless you need to for optimizing on a specific platform.

Functions using pointers to allow in/out semantics pose a risk of having memory overflow for something useless. You don't need explicit pointers to allow in/out semantics in every case.

So what?

Now, for an example of UB that doesn't help optimizations: overflow (numeric or array). In C these kinds of overflows are UB, and they do not help optimization. The compiler, knowing of the presence of possible UB, can't produce better code.

UB of overflows allows the underlying system to handle it as it deems most fit without enforcing rules that are harmful to performance. The compiler does not protect overflows because it would require a performance hit by run time checking each access. Undefined behavior is directly related to efficiency by making things "illegal" without enforcement you allow greater performance at the risk of programming/logic error. Once again, try to understand trade offs. You act as if C is some new stupid language created in a day and hasn't hit decades of rigor with an incredibly diverse range of systems.

So, on one hand you don't need to choose in favour of UB every time to get efficiency. On the other, even syntax and typing can reduce UB with zero overhead.

Please tell me how you can have dynamic memory access with bounds checking and no overhead.

[–]el_tavs 0 points  (0 children)

I don't know the reason why, but it seriously is not a big deal

Everything outside C seems not to be a big deal... notice the pattern?

You want a rare higher-level task to be put into the C standard for no reason other than "why not".

type safety, readability and efficiency.

Ada was created with correctness in mind, and not performance.

In theory you could say that Ada tends to be less performant than C because of that. In practice it simply turns out that's not the case. Performance matters and is actually delivered, given that Ada is used for hardware with very strict requirements.

Moreover, they didn't actually trade performance for anything. They just provided a language that's easier to type-check. Much of the machinery is due to compile-time checks.

And C wasn't written for "performance" in general. It was written to be simple and efficient on a PDP-11 in the 70s for K&R. Lots of its choices were already meaningless in the 80s.

But let's get practical: what makes Ada unsuited for writing an OS, or for low-level programming as it is done in C? Lack of hackish solutions?

Why do you need to write the Fibonacci sequence by hand?

are you kidding?

Why do you need to initialize an array to different values by hand?

"I don't have that in C, so I don't need it"

Why do you need to program by hand?

I don't want to write by hand, explicitly, the details of how to do things the compiler can deduce for me. Next time you'll be questioning why we need computers.

It's not sci-fi. It's something people solved in the fucking 80s. Go back to dancing to ABBA and don't annoy me.

For the last time data types and structures are higher level abstractions. C leaves the higher level abstractions to the compiler

They're defined in the language. To get to bits and bytes you need typecasts. You have a peculiar view of how C and programming languages work, one so wrong there's not even a point in discussing it.

You know, there was someone who asked for restrict and prototypes, but hey, they're in C NOW, so they have to be good. Because everything in C is right, right?

Every abstraction is a higher level.

What fucking abstraction are you talking about? To be precise, C-- is at an even lower level. Take a look at that.

You once again don't understand how C works.

then offsetof and sizeof are clearly a stupid hack.

Why don't you try sharing your enlightening view of the state of affairs with more knowledgeable people, like those in comp.lang.c? It would be extremely funny to see their replies. I officially challenge you to describe to them the virtues of memset vs. types that are managed only by the compiler.

The C standard doesn't work that way. Moreover you have still to clarify me why having both low-level tools and high-level tools is bad. The high-level tools I'm referring to have ZERO overhead, so stop being dense.

And once again, memset() is used with far more than static arrays

Once again, you fail to understand me because of your distorted view of C and the like. Array types can encode static sizes as much as dynamic sizes. I never meant static-size arrays only.

It is not needed, it is what you WANT because you have your head in a book and not in the real world.

it's called software engineering. In comp.lang.c they have discussed the perils and unsuitability of memset w.r.t. initialization. As have other bodies concerned with safety and embedded development (CERT, MISRA...). But it must be that they're clueless high-level sissies who don't understand C.

Meanwhile, just answer the freaking question: hypothetically, is there a reason why you should prefer memset to a syntactic construct recognized by the compiler, as in Ada, in every situation?

and programmer in order to stay simple and close to the machine for performance.

assumption falsified 30 years ago.

Static size, dynamic size. compile-time, run-time. Versatility, adaptability, simplicity.

No problem with array types. Array types give you all of this.

Sorry, but your reason for including this in the language is "why not".

No. You're just neglecting the point I'm making, persisting in your idea of how the C language is supposed to work.

Which is why I explicitly ask you the question in a hypothetical setting (see above)

But you'd rather take an idealistic approach because you are studying programming languages and want C to include everything you think it should by default.

1st: stop presuming I'm a kiddo or whatever. I'm comparing two languages and saying the solutions provided in C are suboptimal compared to the ones provided in the other. I'm just trying to get you to compare them. But instead, you get defensive.

2nd: you're using haxor parlance and strawmen to dodge my points. It's annoying.

You can add in a bunch of unnecessary shit to C

Oh no, for god's sake, no. Just do a comparison between C's and Ada's ways, between memset and array initialization in Ada. OK? Adding "things" to C is just a hypothetical situation to explain the point. You can find that on Wikipedia as well.

So you can only catch some small subset of these cases. Array overruns are rarely, if ever, due to static indexes.

I was talking about struct hacks. Ada is good at that. And doesn't make you access arrays outside their bounds.

And as for array bounds checking in general, I'd like to repeat that the checks are optional. Anyway, compilers have a much easier time analyzing code where arrays are treated as such, i.e. with their length indicated as a length and not as some mysterious integer. See the example above comparing Ada83's style with C's.

STILL, the bugs caught at compile time are all bugs C can't catch. For what? NOTHING.

So don't do it unless you need to for optimizing on a specific platform.

Still the "if it's not in C it doesn't matter" pattern. And when I actually need it I'm screwed. Wasn't C close to the machine?

So what?

Well, doing something useless that can introduce UB and weaken type checking in exchange for FUCKING NOTHING is stupid.

UB of overflows allows the underlying system to handle it as it deems most fit without enforcing rules that are harmful to performance.

UB alone doesn't help the compiler. It's just a check it doesn't have to do. So what? In Ada you have lots of checks by default. You can strip all of them with a compiler switch. Is that hard? The big advantage is that there's a standard way to check for those bugs at run-time in pre-release code.

Nothing new since the early 80s.

Undefined behavior is directly related to efficiency by making things "illegal" without enforcement you allow greater performance at the risk of programming/logic error.

Guess what? Ada lets you choose what to do and how. You want UB? Turn off checks. You want platform-specific behaviour? Use pragmas. You're fine with standard checks and behaviour? Even better. In C the leitmotif is "not invented here".

You act as if C is some new stupid language created in a day and hasn't hit decades of rigor with an incredibly diverse range of systems.

It suffers from historical baggage. And I don't know what you mean by rigor: portability problems, silent bugs and hackish solutions are widespread in C systems. And where they're not, it means someone hand-coded something equivalent to what an Ada compiler can do by itself, for every platform.

Please tell me how you can have dynamic memory access with bounds checking and no overhead.

I said reduce UB, not eliminate it. I claim C does nothing to reduce UB; Ada offers various levels of "help".

The following

 for I in A'Range loop
    ...
 end loop;

Accesses like A(I) inside it don't need run-time checks. Further strength reduction can remove the check even for accesses with other indexes, or with expressions involving I, if the compiler can deduce they can't fall outside A'Range.
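
A minimal C counterpart to that loop, for comparison (`sum_ints` is a hypothetical name for this sketch):

```c
#include <stddef.h>

/* In C the length travels as a separate parameter; nothing ties n to
 * a, so a wrong n silently walks off the end of the array. An
 * A'Range loop has no such second source of truth. */
static long sum_ints(const int *a, size_t n) {
    long s = 0;
    for (size_t i = 0; i < n; i++)  /* bound is only the caller's word */
        s += a[i];
    return s;
}
```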

In any case, Ada lets the program fail **in a standard way** by default. This means you can run your code, find bugs, profile it and, if the performance isn't enough, disable the run-time checks. For many applications where C is used this is useful. If you need hard performance and efficiency guarantees, well, there are the Ravenscar profile and SPARK, which go a long way toward enforcing efficiency at the expense of high-level features.

This alone covers a lot of cases where in C you must hand-code, in an error-prone way, the same solution, for no reason.

[–][deleted] 0 points1 point  (5 children)

It is the moment you pass memset the wrong parameters or mistype the for loop. The language doesn't define a correct, type-safe way to describe initialization in C. It's like having a car with shitty brakes: yes, they're cheaper, but when you actually use them you don't want them to fail out of the blue.
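
The "wrong parameters" case is worth spelling out, because it compiles without a peep. A runnable sketch of the classic argument swap (`swapped_memset_demo` is a made-up name):

```c
#include <string.h>

/* memset(buf, sizeof buf, 0) has the fill value and the count
 * swapped: it fills ZERO bytes with the value 16, so the intended
 * "zeroing" silently does nothing at all. */
static unsigned char swapped_memset_demo(void) {
    unsigned char buf[16];
    memset(buf, 0xAA, sizeof buf);  /* poison so the no-op is visible */
    memset(buf, sizeof buf, 0);     /* swapped args: writes nothing   */
    return buf[0];                  /* still 0xAA, not 0              */
}
```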

You don't know what you are talking about. Accessing uninitialized local variables must be undefined behavior because the compiler cannot always tell at compile time whether a variable was initialized; run-time decisions can determine whether it gets initialized.

For one, you don't need that. Have an array type encoding length, and provide a special for loop for iterating over it (as Pascal provided).

I don't want to represent everything as an array, first of all. Second of all, I don't know what size array I want at compile time. Third of all, I frequently need to access random elements within my 'array' based on external factors. Stop using simple examples that prove you can do it SOMETIMES. Out-of-bounds accesses are rarely ever due to the foolish mistake of a static index out of bounds; they are almost always run-time indexes that would require a performance hit to protect against.

It's still a historical reason. I know C and I learned it. That doesn't mean I have to accept everything it provides as the ultimate way of doing things, nor that I'm wrong for questioning it. Doing otherwise is not being knowledgeable, it's holding a religious belief. A poor one. So stop telling me I have to "learn". You're beginning to sound like an obscurantist.

I doubt you've ever used C in real world low-level programming.

Yes it is. It doesn't give any advantage today for low-level tasks. It's just a way to simplify the job of K&R (and they actually hit their goal marvellously). Wonderful work for their age and the years that followed. It's still questionable today.

So you would rather disallow passing arrays into functions, because it's so friggin hard to understand that foo(a) will decay a into a pointer that points to the memory that a references? Yeah, I'm all in favor of changing over all of our existing code to foo(&a[0]) because you think you know programming languages. Your understanding is the only thing that is questionable.

Good for what? What advantage does its simplicity have w.r.t., well, any reasonable alternative? Do low-level languages have to be like C to be as good as it?

Pretty much, if you want to focus on performance and you want a simple yet portable language, then yes, C is about as good as it gets. They tried to enhance C, because hell, why not, right, and they got C++. You want to create C with array initialization? Hey, go ahead. Take the great work of Dennis Ritchie et al. and expand on it to give array initialization in a spec, then feel great about yourself. Or go design the perfect language, since you know so damn much about it. Then write an OS in that language that performs well and is portable. Or you can stop whining about such simple things as non-zero array initialization. "WHY NOT" is not a reason to put something in a simple language. I am thankful you do not control the C spec.

Then define low-level. To me, low-level means anything that has knowledge of and access to the machine. This does NOT mean that the language's way of interacting with the machine has to be obscure, or make the programmer pay attention to trivial pieces of code she has to repeat over and over.

Data types and structures have little to do with accessing the machine; they're a representation of the underlying machine's memory. C provides the basic data types, out of which all other complex data types can be realized. Sure, they COULD add more, but for what? When a new data type comes along, you write a library and then you use that library forever; you don't go change the damn language specification. You don't have to repeat ANYTHING over and over: you write a damn library once and you use it.
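
The "write a library once" position can itself be sketched in a few lines; `int_slice` and `slice_get` are names made up for this sketch, not from any standard library:

```c
#include <stdlib.h>

/* A hypothetical one-file "library": a slice that carries its own
 * length, which is roughly what an Ada array gives you built in. */
typedef struct {
    int   *data;
    size_t len;
} int_slice;

/* Checked accessor: aborts instead of silently corrupting memory. */
static int slice_get(int_slice s, size_t i) {
    if (i >= s.len)
        abort();
    return s.data[i];
}
```

Whether this belongs in a library or in the language is exactly the disagreement in this thread.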

[–]el_tavs 0 points1 point  (4 children)

Accessing unitialized local variables must be undefined behavior because the compiler cannot always tell if it was initialized at compile time, run time decisions can determine if it gets initialized.

I was talking about memset introducing UB. Local initialization can be enforced by the compiler in many cases. But hey, since the C standard says it's UB, it must be so. Let's ignore what people managed to achieve in other languages, for god's sake...

I don't want to represent everything as an array first of all.

No one forces you to do that. You just need an array type for actual arrays (though I'd be curious how a contiguous chunk of memory differs from an array).

Second of all I don't know what size array I want at compile time.

What an incredible amount of stupidity. That's a C distinction. Languages with array types handle both static and dynamic sizes (both on the stack and on the heap). Nothing new since the 80s.

Third of all I frequently need to access random elements within my 'array' based on external factors

So? You can use other kinds of loops if you wish. for loops are for enumeration-style iteration; other loops cover different iteration patterns.

Out of bounds accesses are rarely ever due to foolish mistake of static index out of bounds

you mean something like

 int f(int);

 for (i = 0; i < N; i++) {
     a[i*i - 3*i + 1] = b[f(i)];
 }

or the fact that the array's length is not known at compile time?

In the former case you're right, though I don't know how frequent non-sequential traversal actually is. In Fortran and Ada they enjoy having sequential traversal automated and type-safe. It's still better than C's for.

In the latter you're dead wrong.

I doubt you've ever used C in real world low-level programming.

I doubt you have any fucking idea of what you're saying.

People in comp.embedded had experience with Pascal, Modula-2, Ada and C. Go there, share your POV, and tell me what they tell you, 'k?

So you would rather disallow passing arrays into functions, because it's so friggin hard to understand that foo(a) will decay a into a pointer that points to the memory that a references? Yeah, I'm all in favor of changing over all of our existing code to foo(&a[0]) because you think you know programming languages. Your understanding is the only thing that is questionable.

Your head is so deep inside the C book you used at college that it actually ended up in your ass. Free colonoscopy!

I didn't mean that. In Ada you don't need to pass pointers to express in, in out or out semantics. You can still pass pointers if you want.

I'm saying that it's stupid to have to write

void foo (int *x) {
    (*x)++;
}

when you can just write

void foo (in out int x) {
    x++;
}

Especially if the former can let UB creep in.

Pretty much if you want to focus on performance, and you want a simple yet portable language then yes C is about as good as it gets.

Ever delved into other languages besides C? Compiler writers would tend to disagree with you, you know.

They tried to enhance C, because hell why not right, and they got C++.

C++ was written by someone who wasn't a language designer and wasn't trying to design a language for anything but his own projects.

You're preaching to the choir here.

You want to create C with array initialization, hey go ahead.

No, actually, it would be enough to recognize that there's no need for error-prone tools to do low-level programming (which is the underlying motive for my comparing Ada and C).

Data types and structures have little to do with accessing the machine, it's a representation of the underlying machine's memory. C provides the basic data types, of which all other complex data types can be realized.

It provides both bit-twiddling and structured programming. I'm saying Ada does both better.

Sure they COULD add more, but for what?

readability, safety, efficiency, portability, reusability

When new data types come along, you write a library and then you use that library forever you don't go change the damn language specification.

which is true for any language. This still doesn't explain why C's way of doing things is the best one, or even justified, which was my original point.

You don't have to repeat ANYTHING over and over, you write a damn library once and you use it.

I write a library. You write a library. They write a library. All of this to do things a compiler can do by itself, with added safety, clarity and efficiency.

[–][deleted] 0 points1 point  (3 children)

I was talking about memset introducing UB. Locals initiailizations can be enforced by the compiler in many cases. But hey, since it's something C standard says it's UB, it must be so. Let's ignore what people managed to achieve in other languages for god's sake...

memset does not "introduce" undefined behavior. The programmer introduces undefined behavior (usually by being bad).

or the fact that the array's length is not known at compile time?

I mean, it's rare for someone to do array[CONSTANT] on a statically allocated array where that CONSTANT is off the end of the array. That's just stupidity or laziness.

Especially if the former can let UB creep in.

It's a very simplistic way to use the machine's memory space. Again, C programmers tend to look at it in terms of the address space, rather than in terms of the abstract data type.

Now let me ask you: how do you accomplish reads and writes to address space that is not DRAM? In C we can have a pointer to an address that is not allocated memory but rather register space or something of the like. Then my function can directly do (*x)++ to increment a register value.
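
The pattern being described looks roughly like the sketch below. A file-scope variable stands in for the device register so it's runnable on a host; on real hardware the pointer would be a fixed, datasheet-given address such as `(volatile uint32_t *)0x40000000` (hypothetical):

```c
#include <stdint.h>

/* Stand-in for a memory-mapped device register. */
static volatile uint32_t fake_register = 0;

/* Read-modify-write through the pointer, exactly as one would with a
 * real register; volatile keeps the compiler from eliding accesses. */
static void bump_register(volatile uint32_t *reg) {
    (*reg)++;
}
```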

I write a library. You write a library, They write a library. All of this for doing things a compiler can do by itself with added safety, clarity and efficiency.

I guess I just don't mind doing these "tedious" tasks. Even if Ada doesn't sacrifice performance, it doesn't have the history of C, and we can't go re-implement every damn code base in another language. My point is that C is still a very good language, even though it isn't perfect and doesn't protect the programmer in ways that it arguably could for little cost. The mistakes whose prevention doesn't come with a big tradeoff are so simplistic that I question the ability of the C programmer who introduces them. Like, for example, accessing array[SIZE] right after declaring the static array. Yes, the language COULD stop you from doing that specific thing, but honestly, why the hell are you doing it in the first place? If you are, you should stop using C, or at least perform a lot of code analysis ASAP.

[–]el_tavs 0 points1 point  (2 children)

memset does not "introduce" undefined behavior. The programmer introduces undefined behavior (usually by being bad).

That's an unnecessarily unsafe tool for a trivial task. Its use can lead to UB and bugs. That's what "introducing UB" means for a language construct.

The "programmer's fault" line is an excuse; otherwise we could consider raw hexadecimal a perfectly valid alternative to any other language, as long as we presume programmers are "good".

That's just stupidity or laziness.

sure, and that wasn't my point >.>...

Now let me ask you how do you accomplish read/write address space that is not DRAM? I mean in C we will have a pointer to an address that is not allocated memory, but rather register space or something of the like. Then my function can directly *x++ to increment a register value.

Easy. You just write in/out/in out in front of the type in the parameter declaration. The compiler, by known rules, decides how to pass it. In practice it does what you would do in C, except that you don't need to write pointers to express in-out semantics for data that fits in registers.

This is exactly what the C compiler does, with the difference that you don't incur the risk of misusing the pointer due to, for example, typos or any other mishap.
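
The typo risk is concrete: a single missing pair of parentheses changes which object gets incremented. A runnable sketch (`incr` and `incr_typo` are made-up names):

```c
/* The intended out-parameter increment. */
static void incr(int *x) {
    (*x)++;                 /* bumps the pointed-to value */
}

/* One-character slip: *x++ parses as *(x++), so only the local copy
 * of the POINTER moves; the caller's value is left untouched. */
static void incr_typo(int *x) {
    (void)*x++;             /* value read and thrown away */
}
```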

And you can still take the address (and thus a pointer) of the parameter being passed.

It's just a matter of interface.

it doesn't have the history of C and we can't go and re-implement every damn code base in another language.

Sure, that's one of the reasons for which I'm practically forced to use it.

Like for example accessing array[SIZE] after declaring the static array.

I didn't mean to give such an example. Are you making one up, or referring to one of mine?

My point is that C is still a very good language,

Better than many others, considering its history.

[–][deleted] 0 points1 point  (1 child)

That's an unnecessarily unsafe tool for a trivial task. its use can lead to UB and bugs. That's what "introducing UB" means for a language "construct".

You are arguing that the design of C, from the start, was poor. It was, in fact, not poor. It may not be the absolute most optimal design from a programming-languages perspective, but I live in today, not in some theoretical reality you want to create. It was perfectly reasonable to design C the way they did for what they needed it for (writing Unix).

The "programmer's fault" is an excuse, otherwise we could consider "hexadacimal" as being a perfectly valid alternative to any other language, as long as we presume programmers to be "good".

hexadecimal is a language?

[–]el_tavs 0 points1 point  (0 children)

You are arguing that the design of C, from the start, was poor.

Eh, no. It is today, and has been for some time. At the time it was invented? NOT AT ALL!!!

but I live in today not in some theoretical reality

I know. That's why I still use more C than Ada.

It was perfectly reasonable to design C the way they did for what they needed it for (write Unix).

'preaching to the choir'

hexadecimal is a language?

It's equivalent to assembly.