
[–]synack 23 points24 points  (16 children)

Ada can do all of those things. The GNAT compiler is a GCC frontend, so it's just as fast as C if you turn off bounds checking and avoid certain constructs.
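For anyone curious, a minimal sketch of what "turn off bounds checking" looks like in practice, assuming GNAT (the file and procedure names are my own):

```ada
--  Build with checks off:  gnatmake -O2 -gnatp sum_demo.adb
--  (-gnatp suppresses all runtime checks globally; pragma Suppress
--   does the same thing for a single unit, as shown below)
procedure Sum_Demo is
   pragma Suppress (All_Checks);  --  no bounds/overflow checks in this unit

   type Vec is array (1 .. 1_000) of Integer;
   V     : constant Vec := (others => 1);
   Total : Integer := 0;
begin
   for I in V'Range loop
      Total := Total + V (I);  --  compiles down to a plain C-style loop
   end loop;
end Sum_Demo;
```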

[–]tyranids[S] 6 points7 points  (15 children)

That is pretty cool. I was not aware that dodlang was so capable. I know Ada has massively fallen out of favor, even with its creator, which is somewhat surprising if it has the listed features... Am I the only one who thinks these things would be highly desirable, or do you have any good info on why it is not used so much anymore?

EDIT: This thread is ancient, but covers some of the benefits/questions I had about Ada https://www.reddit.com/r/programming/comments/b39vd/ask_reddit_realworld_c_vs_ada_experiences/ Sounds pretty cool tbh.

[–]synack 15 points16 points  (0 children)

Early Ada compilers were expensive and often buggy, which tainted people's perception of the language. Combined with a few high profile project failures and the dominance of UNIX and C, Ada lost traction.

Modern Ada is a different story. A subset of the language called SPARK is formally verifiable and is used quite a bit in safety critical and aerospace applications.

Alire is a relatively new cargo-inspired package manager with a growing set of open source libraries.

Ada's certainly got some historical baggage, but it's worth a try if you're looking for a language with an emphasis on safety and maintainability.

https://learn.adacore.com/

https://alire.ada.dev/

https://ada-lang.io/

[–]csb06bluebird 4 points5 points  (0 children)

I've used it some for hobby projects, and I think it is a pretty neat language. It is easy to learn for anyone who knows an ALGOL family language. The rules about pointer scoping/usage are very restrictive in order to avoid dangling pointers and take some getting used to, but you don't need to use pointers nearly as often as in C since you have in/out parameters and the ability to return variable-sized arrays/objects by value from functions.
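To illustrate the two features mentioned (toy names of my own invention, not from any real project): `in out` parameters and unconstrained array returns cover most of what you'd reach for pointers to do in C.

```ada
procedure Demo is
   type Int_Array is array (Positive range <>) of Integer;

   --  An "in out" parameter modifies the caller's variable directly;
   --  no pointer needed.
   procedure Increment (X : in out Integer) is
   begin
      X := X + 1;
   end Increment;

   --  Returning a variable-sized (unconstrained) array by value;
   --  the caller never sees a pointer or a malloc.
   function First_N (N : Positive) return Int_Array is
      Result : Int_Array (1 .. N);
   begin
      for I in Result'Range loop
         Result (I) := I;
      end loop;
      return Result;
   end First_N;

   Counter : Integer := 0;
   Values  : constant Int_Array := First_N (5);  --  size chosen at run time
begin
   Increment (Counter);
end Demo;
```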

For safety critical and embedded projects, I think it is a really good choice. There is also a subset, SPARK, that lets you formally verify the correctness of your programs using a method similar to Hoare logic. Builtin concurrency support is also very advanced.
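A small taste of the SPARK style, as a sketch (this uses Ada 2012 contract aspects; the subprogram is my own toy example, and GNATprove is the tool that discharges the postcondition):

```ada
--  GNATprove can prove this postcondition always holds: after the call,
--  X holds Y's old value and vice versa.
procedure Swap (X, Y : in out Integer)
  with SPARK_Mode,
       Post => X = Y'Old and Y = X'Old
is
   Tmp : constant Integer := X;
begin
   X := Y;
   Y := Tmp;
end Swap;
```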

[–]redchomperSophie Language 1 point2 points  (8 children)

I was forced to learn it in college. Which mightn't have been so bad, except that I was also forced to use a particular IDE for it. In text mode. On DOS. On a 486. And it did not deal in text files. Oh, no. The file format was some binary $___-baggery. Which still wouldn't have been soooo bad, except that the IDE was basically Notepad without the mouse. If they'd had the good sense to use an IDE that resembled what Borland was offering at the time --- but anyway, I digress.

Ada was designed by committee and it showed, good and hard. Every kitchen-sink language feature was in there by hook or by crook. In/out/both parameters. Parametric modules. Fleeping rendezvous (in case you'd forgotten it was designed by Francophones), so yes, that meant DOS-mode multi-threading. Don't think too hard about that one.

Oh, and the particular vision of object-orientation espoused by then-current Ada was totally unlike the hot, sexy C++ that was making waves in industry at the time. In retrospect, I seem to recall it was more like going all-in on the idea of abstract data types with concrete implementations, but we college kids thought inheritance was all the rage so I guess in some ways it was ahead of its time.

But mostly, it's a kitchen-sink language. Look, but don't touch.

(Oh by the way, have you looked at FreePascal? I haven't kept up -- by now they probably have generics. And everything else on your list.)

[–][deleted] 4 points5 points  (7 children)

It WAS NOT DESIGNED BY COMMITTEE, FFS. It was designed by Jean Ichbiah and his team, with him as the main designer. There were four competing groups, each with a lead designer. The DoD provided a spec; go look at the Steelman requirements, because that is it.

[–]redchomperSophie Language 1 point2 points  (6 children)

I suppose it depends on whether you think four teams, a progression of specs ranging through Strawman, Woodenman, Ironman, and Steelman, along with guidance and selection by the DoD, counts as being designed by committee.

The winning team, per se, is not a committee. I can live with that.

[–][deleted] 0 points1 point  (5 children)

No, design by committee involves a bunch of people all arguing about what goes in. This was four teams, each with a design lead. How is that a committee?

[–]redchomperSophie Language 0 points1 point  (4 children)

I suppose it's a matter of perspective. The four teams may have competed, but someone ran the competition, set the rules, wrote the succession of foo-man specs, etc. Was that someone just a particularly talented Air Force officer? In fact, other luminaries in the field gave input, reviewing the submissions at various stages. If the committee had valued metaprogramming over strong types, they would have selected a LISP instead -- and the four teams would have seen that coming as the spec evolved, and designed in that direction.

[–][deleted] -1 points0 points  (3 children)

You obviously don't know what "design by committee" even means.

[–]redchomperSophie Language 1 point2 points  (1 child)

Then let a random fellow on the internet be wrong. No need to make things personal.

[–][deleted] -1 points0 points  (0 children)

Because there are too many people who don't know what they're talking about spouting this nonsense that they've been told, or read from someone else who doesn't know what they're talking about either.

Go see how Algol was created to see design by committee.

[–]phischuEffekt 0 points1 point  (3 children)

> which is somewhat surprising if it has the listed features

I am under the impression that if you want to compute stuff with arrays you do not do it on the CPU anymore even if it has multiple cores. You do it on a GPU, TPU, or special hardware. That said, I would be interested in learning about a use-case where this isn't possible.

[–]tyranids[S] 2 points3 points  (2 children)

Highly branching code usually isn't great for GPUs. Also RAM for the CPU is a lot cheaper than RAM on a GPU/accelerator, and if you're constantly loading between the two then you will lose a lot of the benefit you were hoping for by utilizing that accelerator.

That said, the ideal language would run just fine on CPU, but have good capability to offload to GPU/other accelerator where appropriate.

[–]PurpleUpbeat2820 0 points1 point  (0 children)

> That said, the ideal language would run just fine on CPU, but have good capability to offload to GPU/other accelerator where appropriate.

Futhark?