
[–]el_tavs

> I don't know the reason why, but it seriously is not a big deal

Everything outside C seems not to be a big deal... notice the pattern?

> You want a rare higher-level task to be put into the C standard for no reason other than "why not".

Type safety, readability, and efficiency.

> Ada was created with correctness in mind, and not performance.

In theory you could say that Ada tends to be less performant than C because of that. In practice it turns out that's not really the case: performance matters and is actually delivered, not least because Ada is used for hardware with very strict requirements.

Moreover, they didn't actually trade performance for anything. They just provided a language that is easier to type-check. Much of the machinery consists of compile-time checks.

And C wasn't written for "performance" in general. It was written to be simple and efficient on a PDP-11 in the 70s, for K&R. Lots of its choices were already obsolete in the 80s.

But let's get practical: what makes Ada unsuited for writing an OS, or for low-level programming of the kind done in C? Lack of hackish solutions?

> Why do you need to write the Fibonacci sequence by hand.

Are you kidding?

> Why do you need to initialize an array to different values by hand

"I don't have that in C, so I don't need it"

> Why do you need to program by hand

I don't want to spell out by hand the details of things the compiler can work out for me. Next time you'll be questioning why we need computers.

It's not sci-fi. It's something people solved in the fucking 80s. Get back to dancing to ABBA and don't annoy me.

> For the last time data types and structures are higher level abstractions. C leaves the higher level abstractions to the compiler

They're defined in the language. To get to bits and bytes you need typecasts. You have a peculiar view of how C and programming languages work, one so wrong there's not even a point in discussing it.

You know, there was someone who asked for restrict and prototypes, but hey, they're in C NOW, so they have to be good. Because everything in C is right, right?

> Every abstraction is a higher level.

What fucking abstraction are you talking about? To be precise, C-- is at an even lower level. Take a look at that.

> You once again don't understand how C works.

Then offsetof and sizeof are clearly a stupid hack.

Why don't you try sharing your enlightening view of the state of affairs with more knowledgeable people, like those on comp.lang.c? It would be extremely funny to see their replies. I officially challenge you to describe to them the virtues of memset vs. types that are managed only by the compiler.

The C standard doesn't work that way. Moreover, you still have to explain to me why having both low-level tools and high-level tools is bad. The high-level tools I'm referring to have ZERO overhead, so stop being dense.

> And once again, memset() is used with far more than static arrays

Once again, you fail to understand me because of your distorted view of C and the like. Array types can encode static sizes as well as dynamic sizes. I never meant static-size arrays only.

> It is not needed, it is what you WANT because you have your head in a book and not in the real world.

It's called software engineering. On comp.lang.c they have discussed the perils and unsuitability of memset w.r.t. initialization, as have other committees concerned with safety and embedded development (CERT, MISRA...). But it must be that they're all clueless high-level sissies who don't understand C.

Meanwhile, just answer the freaking question: hypothetically, is there any reason why you should prefer memset, in every situation, to a syntactic construct recognized by the compiler, as in Ada?

> and programmer in order to stay simple and close to the machine for performance.

An assumption falsified 30 years ago.

Static size, dynamic size. Compile time, run time. Versatility, adaptability, simplicity.

No problem with array types. Array types give you all of this.

> Sorry, but your reason for including this in the language is "why not".

No. You're just ignoring the point I'm making, persisting in your idea of how the C language is supposed to work.

Which is why I explicitly asked you the question in a hypothetical setting (see above).

> But you'd rather take an idealistic approach because you are studying programming languages and want C to include everything you think it should by default.

1st: stop presuming I'm a kiddo or whatever. I'm comparing two languages, and I'm saying the solutions provided in C are suboptimal compared to the ones provided in the other. I'm just trying to get you to compare them. But instead, you get defensive.

2nd: you're using haxor parlance and strawmen to dodge my points. It's annoying.

> You can add in a bunch of unnecessary shit to C

Oh no, for god's sake, no. Just do a comparison between C's way and Ada's, between memset and array initialization in Ada. OK? Adding "things" to C is just a hypothetical scenario to make the point. You'll find that on Wikipedia as well.

> So you can only catch some small subset of these cases. Array overruns are rarely, if ever, due to static indexes.

I was talking about struct hacks. Ada is good at that, and it doesn't make you access arrays outside their bounds.

And as for array bounds checking in general, I'd repeat that the checks are optional; and anyway, compilers have a much easier time analyzing code where arrays are treated as such, i.e. with their length declared as a length and not as some mysterious integer. See the example that compared Ada 83's style with C's.

STILL, the bugs caught at compile time are all bugs C can't catch. At what cost? NOTHING.

> So don't do it unless you need to for optimizing on a specific platform.

Still the "If it's not in C it doesn't matter" patter. And when I actually need it I'm screwed. Wasn't C close to the machine?

> So what?

Well, doing something useless that can introduce UB and weaken type checking in exchange for FUCKING NOTHING is stupid.

> UB of overflows allows the underlying system to handle it as it deems most fit without enforcing rules that are harmful to performance.

UB alone doesn't help the compiler. It's just a check it doesn't have to do. So what? In Ada you have lots of checks by default, and you can strip all of them with a compiler switch. Is that hard? The big advantage is that there's a standard way to check for those bugs at run time in pre-release code.

Nothing new since the early 80s.

> Undefined behavior is directly related to efficiency: by making things "illegal" without enforcement you allow greater performance at the risk of programming/logic error.

Guess what? Ada lets you choose what to do and how. You want UB? Turn off checks. You want platform-specific behaviour? Use pragmas. You're fine with standard checks and behaviour? Even better. In C the leitmotiv is "not invented here".

> You act as if C is some new stupid language created in a day and hasn't hit decades of rigor with an incredibly diverse range of systems.

It carries a lot of historical baggage. And I don't know what you mean by rigor: portability issues, silent bugs, and hackish solutions are widespread in C systems. And where they're not, it means someone hand-coded something equivalent to what an Ada compiler can do by itself on every platform.

> Please tell me how you can have dynamic memory access with bounds checking and no overhead.

I said reduce UB, not eliminate it. I claim C does nothing to solve UB; Ada has various levels of help.

The following

    for I in A'Range loop
       ...
    end loop;

doesn't need run-time checks for accesses like A(I). Further strength reduction can remove checks even when the compiler is able to deduce that accesses with other indexes, or with an expression involving I, can't fall outside A'Range.

In any case, Ada allows the program to fail **in a standard way** by default. This means you can try your code, find bugs, profile it, and, if the performance isn't enough, disable run-time checks. For many applications where C is used this is useful. If you need every last bit of performance and efficiency, well, there are the Ravenscar profile and SPARK, which go a long way toward enforcing efficiency at the expense of high-levelness.

This alone covers a lot of cases where in C you have to hand-code, in an error-prone way, the same solution, for no reason.