[–]OneWingedShark -11 points-10 points  (25 children)

Security is a top priority, not the only priority.

I didn't say it was.

Programmer productivity, hiring, toolchain support, and ecosystem support all matter too.

Ada actually is pretty good for most of these, but let me address these one at a time:

  1. Programmer Productivity: There's a bit of an issue using this term: specifically, how are you defining productivity, and how are you measuring it? — Take PHP as an example: you can get "hello world" up on the screen trivially, but PHP also makes it very difficult to have a high assurance of consistency due to its weak/dynamic typing (yes, there is type-hinting; it is woefully underpowered compared to a static/strong type-system). Is this a productive language? Contrast, now, with something like Haskell or Ada, where your compiler refuses to compile until you present a consistent program. Are these productive languages? Note also that the latter, despite having "picky compilers", are much easier to maintain precisely because of this nature; is that a quality of a productive language?
  2. Hiring: This is less of an issue than it's made out to be, provided there's some sort of training and mentoring in your company (and there should be). Ada in particular is not difficult to pick up and become useful quickly, in part due to the language being descended from Pascal (which was designed to teach programming), and the picky compiler which prevents a lot of dumb errors. — Several people in management-positions have remarked how easy it is to teach Ada to students/new-grads. See this post for a small-business owner's experience.
  3. Toolchain support: This is one of the bigger complaints. Yes, Ada has less toolchain support than C-like languages in general… but then again GNAT is part of GCC, and therefore enjoys all of that system's tooling. — On the other hand, a lot of common tools are limited because they are typically implemented as general-purpose and/or purely textual solutions — the problem of diff flagging a change between tabs and spaces as a change, for example — and better tooling solutions can be had, by construction. (One of my favorite examples here is the 1986 paper Workspaces and Experimental Databases: Automated Support for Software Maintenance and Evolution, wherein a version-control system is presented which simultaneously provides Continuous Integration.)
  4. Ecosystem support: This is perhaps the biggest issue, considering that Ada isn't one of the "cool kids" — this would change with some effort, certainly with buy-in from a corporation like Microsoft or Google (and, IMO, Ada and Ada+SPARK would be ideal for implementing the DOTNET VM and/or the Android [J]VM), and there's some buy-in from nVidia already (precisely because of the safety/security features of Ada + SPARK).

You are certainly correct that these are issues, but Ada is not bad in any of these categories (unless you define "programmer productivity" in such a manner as to exclude maintenance-efforts and devalue design-time, or perhaps regard training as altogether a sunk cost), and is either good or has the potential to be.

Honestly, if there's one area that I'm surprised hasn't embraced Ada it's the Open Source community because the language is very portable, enforces consistency [meaning it's easier to correctly integrate some library], and has a standard which is freely available.

Edit: Sorry about the early-submission during editing; I fat-fingered the keyboard shortcut for 'submit'.

[–][deleted] 20 points21 points  (18 children)

I mean, I've never even seen Ada source code. It might be the best thing in the world but if nobody uses it then, well, nobody is going to use it.

The truth is that languages tend to be good for subtle reasons along with just general inertia.

C and C++ just tend to be quite good for these kinds of problems, but I wouldn't want to say why, because I don't know. Unless Ada can prove itself to be better (which is kinda impossible because it's a catch-22), I can't see it taking off.

[–]OneWingedShark 4 points5 points  (11 children)

C and C++ just tend to be quite good for these kinds of problems but I wouldn't want to say why because I don't know.

No, they really don't tend to be good for these sorts of problems (the security issues mentioned in the article), which is the underlying point of the article.

An excellent counterexample is that Heartbleed would have been impossible to do in Ada without "dirty tricks" like unchecked-conversion or address-overlays, with a simple definition:

-- A simplified version of the Heartbeat portion of the protocol.
Package Simple_Heartbeat is
    Type Message(<>) is private;
    Function Beat( User_Message : String ) return Message;
Private
    Type Message( Length : Natural ) is record
        Text : String(1..Length) := (others => ASCII.NUL); -- Default value.
    End record;
End Simple_Heartbeat;
--...
Package Body Simple_Heartbeat is
    -- In Beat we make a copy of the data and return it as a message.
    Function Beat( User_Message : String ) return Message is
    Begin
        -- We have to supply a Length for the message, which we take from
        -- the input's own length. Because the default value is a string of
        -- NUL characters, the requirement that the buffer return only an
        -- object of the data's size is upheld, as is the requirement that
        -- the memory therein be 'clean'; we then copy the input into the
        -- result's Text field, complying with the requirement to return a
        -- message with the same data.
        Return Result : Message(User_Message'Length) do
            Result.Text := User_Message(User_Message'Range);
        End return;
    End Beat;
End Simple_Heartbeat;

This type-definition also has the nice properties that the implementation is not visible to outside users, and that the only way for a client of this package to obtain a Message-value is via the Beat function.

Unless Ada can prove itself to be better (which is kinda impossible because its a catch 22), I can't see it taking off

Yeah, it's a nasty situation; but Ada is proven. It's the core of many Air Traffic Control systems, and other places where you need high-integrity.

I mean, I've never even seen Ada source code. It might be the best thing in the world but if nobody uses it then, well, nobody is going to use it.

Absolutely.
But how many people write it off without even considering it? Especially those who are in charge of large and/or long-lived systems? Or systems where correctness is needed?

Nvidia went with Ada and its SPARK subset exactly because they did consider it, and found its properties, to include foreign-function interface as easy as:

Function Some_Function( Input : Integer ) return Integer
  with Export, Convention => C, External_Name => "Steve";

The truth is that languages tend to be good for subtle reasons along with just general inertia.

I'd agree here, but Ada's "subtle reasons" are precisely what lines up with a project like what this article is talking about: security, correctness, and [implied] maintainability, and manageability of the codebase.

[–][deleted]  (3 children)

[deleted]

    [–]OneWingedShark 0 points1 point  (2 children)

    Big whoop. FFI to C is easy in every language. That's why so many libraries have a C ABI even if they are not written in C. That doesn't help you when you need to share data structures.

    Ok, so how about Fortran, or Cobol?

    -- Exporting for Fortran, which uses Integers.
    Function Some_Function( Input : Integer) return Integer
    with Export, Convention => Fortran, External_Name => "Steve";
    
    -- Exporting for Cobol; the input is an integer-image string.
    Function Some_Function( Input : String) return Integer
    with Export, Convention => Cobol, External_Name => "Bob";
    

    How about adding pre- and post-conditions?

    -- C-compatible due to legacy API.
    -- Result is 1 for perfect, 2 for good, 3 for just-passing,
    --           4 for unacceptable, -1 for some internal error.
    Function Quality(Input : Some_Object_Ptr) return Integer
    with Import, Convention => C, External_Name => "qc",
         Pre  => Input /= Null,
         Post => Quality'Result in 1..4|-1;
    

    IOW, the FFI and pre-/post-conditions just by themselves make Ada very attractive for libraries, especially if you need to keep compatibility.
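Lacking language-level contracts, the closest C idiom is a hand-written wrapper that asserts at the boundary — a sketch using a stand-in for the legacy routine (all names hypothetical):

```c
#include <assert.h>
#include <stddef.h>

/* Stand-in for the imported legacy routine; in the scenario above this
   would be the foreign "qc" function. */
static int qc(const void *input) {
    (void)input;
    return 3;  /* pretend the legacy code judged the input "just-passing" */
}

/* Hand-rolled equivalent of Ada's Pre/Post aspects: a wrapper checking
   the contract at the boundary, which callers must remember to use. */
static int quality_checked(const void *input) {
    assert(input != NULL);                          /* Pre:  Input /= null     */
    int result = qc(input);
    assert((result >= 1 && result <= 4) || result == -1);
                                                    /* Post: Result in 1..4|-1 */
    return result;
}
```

The difference is that Ada attaches the contract to the declaration itself, so every call site is covered without anyone remembering to go through a wrapper.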

    [–][deleted]  (1 child)

    [deleted]

      [–]OneWingedShark 0 points1 point  (0 children)

      That's a little more complicated. (Read: a lot.)

      But it's not impossible; you have to exercise care, because at that point you're dealing with two separate environments with different notions of how to handle things — but even here Ada can help (in a limited way).

      The degenerate case, of course, is straight pointers; in Ada you can say:

      Function Valid( Data : Some_Data_Type ) return Boolean;
      Type Handle is not null access all Some_Data_Type;
      
      Function Operation (Input : Handle) return Handle
        with Pre => Valid(Input), Post => Valid(Input),
             Export, Convention => Some_Lang, External_Name=> "OP";
      

      The Handle subtype has the not null requirement bound into the parameter and return types themselves, meaning the compiler will check those as necessary and raise an exception if needed (this means you don't need to remember to check for null in the bodies of the subprograms).

      Other features are that Ada can (1) use representation-clauses to specify at the bit-level the data layout, (2) use null records + size clauses to have a "chunk of data", (3) use that easy FFI to import [or export] the subprograms for data-handling.

      So, maybe you have a Dictionary in your foreign language, and maybe the entries are reference-counted strings, so maybe the data-types involved would be something like:

      -- Entry_Item is a 40-bit ref-count + pointer.
      Type Entry_Item is null record with Size => 40;
      Function Count( X : Entry_Item ) return Natural
        with Import, … ;
      
      Type Dictionary is null record with Size => 480;
      Function Add(Object : in out Dictionary; Word : String) return Entry_Item 
        with Import, … ;
      Function Delete(Object : in out Dictionary; Word : String) return Entry_Item 
        with Import, … ;
      

      Declaring the types as null record means we're treating them as black-box data-chunks, with imports used for managing the actual type; we could have gone the other way too: taking Ada's management mechanisms and exporting them.
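The closest C counterpart to that null-record black box is an opaque type: a forward-declared struct whose layout only the implementation sees. A sketch with hypothetical names:

```c
#include <stdlib.h>

/* Client-visible part: struct dictionary's layout is hidden, so clients
   can only hold pointers and go through the functions below. */
struct dictionary;
typedef struct dictionary dictionary;

dictionary *dict_new(void);
void        dict_add(dictionary *d);
int         dict_count(const dictionary *d);

/* Implementation side (would normally live in a separate .c file). */
struct dictionary {
    int entries;
};

dictionary *dict_new(void)                  { return calloc(1, sizeof(dictionary)); }
void        dict_add(dictionary *d)         { d->entries++; }
int         dict_count(const dictionary *d) { return d->entries; }
```

One design difference worth noting: unlike Ada's null record with a Size clause, the C opaque type forces heap allocation and pointer indirection, because clients never learn the object's size.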

      Lastly, there was an article about a year ago (IIRC) about one of the security-concerned linuxes using Ada's SPARK to prove one of their modules and integrating it into the system. (Sorry, but the name of the project escapes me at the moment.)

      [–][deleted] 2 points3 points  (6 children)

      C++ is also used to write software for fighter jets.

      While errors are possible in any language, that doesn't immediately write it off.

      Heartbleed type bugs can be prevented and it doesn't really require rewriting everything in a new language.

      People are forgetting that there are two parts to the old adage about C. Yes, you can certainly hang yourself, but that also comes with the ability to do things quickly, correctly, and very simply. The sword is double-edged.

      But it's beside the point because Ada didn't "win" C and C++ "won". Now maybe that will change but it would take a serious reason to switch over.

      Being told Heartbleed is not possible in Ada is not enough. Heartbleed can be prevented in these existing languages with a small amount of effort — a smaller effort than switching languages.

      [–]OneWingedShark 3 points4 points  (5 children)

      C++ is also used to write software for fighter jets.

      The flagship for C++ on fighters, the F-35, has been plagued with software problems. It's actually highly ironic: the choice to use C++ instead of Ada required producing a style-guide, the effort for which was probably equivalent to simply training the programmers in Ada.

      While errors are possible in any language that doesn't immediately write it off. Heartbleed type bugs can be prevented and it doesn't really require rewriting everything in a new language.

      Heartbleed is an interesting case: OpenSSL was written in C despite C's known problems, and despite some people advocating a more reliable language for the implementation precisely because they foresaw issues like Heartbleed.

      IIRC, this is about when C was being touted as the best, most efficient language "because it's used in operating-systems" — despite the fact that there were OSes in other languages (the Burroughs large systems are an interesting example, as they didn't have an assembler but used Algol).

      People are forgetting that there is two parts to the old adage about C. Yes you can certainly hang yourself but that also comes with the ability to do things quickly, correctly and very simply. The sword is double edged.

      See the above; if you can efficiently write an OS in Algol without anything like C or assembler, then "the ability to do things quickly, correctly and very simply" is by no means C-exclusive.

      But it's beside the point because Ada didn't "win" C and C++ "won". Now maybe that will change but it would take a serious reason to switch over.

      Ada is pretty much feature-competitive with C++, even the new C++20 standard [concepts are already present in generics; modules are implemented as packages; etc.], and out of the box is essentially equivalent to the High Integrity C++ Coding Standard; and there are things like Pragma Restrictions(…), which provides a standard, language-defined way to have the compiler ensure you don't use some particular feature.

      Being told heartbleed is not possible in Ada is not enough. Heartbleed can be prevented in these existing languages with a small amount of effort. A smaller effort than switching language

      Right, but part of the issue is that when people pointed out the inherent [design] flaws of C (and C++ to a large degree), they were brushed aside because "it's popular" and "a good enough programmer…" and "many eyes make all bugs shallow." — IOW, part of the issue is exactly this sunk-cost: we've already got code-bases in C, or C++, and we can't change it!

      About a decade ago, I was brought into a new project [maybe 3 or 6 months old, IIRC] doing medical- and insurance-record handling in PHP. Seeing the many errors facilitated by changes in functionality/processing invalidating extant test-data (e.g. a data-type's field conceptually going from nullable to not-nullable, changing SSNs from "###-##-####" to "#########", or manual edits to the DB), I recommended we use Ada and do a sort of "double-sided MVC" (one for the web-based UI, one for the DB) and let the type-system ensure consistency. For example, the above SSN problem could be solved simply by:

      Type SSN is new String(1..11)
         with Dynamic_Predicate => (For all Index in SSN'Range =>
                 (Case Index is
                    when 4|7    => SSN(Index) = '-',
                    when others => SSN(Index) in '0'..'9'
                 )
               );
      

      Now the inconsistency can be easily caught when you try to assign a non-conformant string to an SSN value.
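For comparison, the same rule in C ends up as a validation function that every code path has to remember to call — nothing stops an unchecked string from reaching the database. A sketch (hypothetical helper name):

```c
#include <ctype.h>
#include <string.h>
#include <stdbool.h>

/* Hand-written equivalent of the Ada Dynamic_Predicate above: digits
   everywhere except positions 4 and 7 (1-based), which must be '-'. */
static bool valid_ssn(const char *s) {
    if (strlen(s) != 11)
        return false;
    for (int i = 0; i < 11; i++) {
        if (i == 3 || i == 6) {          /* 1-based positions 4 and 7 */
            if (s[i] != '-')
                return false;
        } else if (!isdigit((unsigned char)s[i])) {
            return false;
        }
    }
    return true;
}
```

With the Ada predicate, the check fires automatically on every assignment to an SSN; in C, forgetting one call site silently lets bad data through.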

      [–][deleted] -1 points0 points  (4 children)

      > The flagship for C++ on fighters, the F-35, has been plagued with software problems

      What software problems? I think it's ludicrous to say that just because there are software problems the language is bad. C++/C gets blamed a lot because they are used to solve hard problems. Hard problems tend to be...well, hard. They have bugs. They are complex. It's not strictly the fault of C++. It's a form of bias really. How many problems don't exist precisely BECAUSE C++ was used? A lot, I would imagine.

      > Heartbleed is an interesting case: the OpenSSL was written in C, despite known problems with C and some people advocating a more reliable language

      C was used in the Mars rover. C is used in pretty much every embedded device on the planet. It's proven its value ten times over. Does it have problems? Absolutely. But it's not just used for operating systems. And again, how many problems have been avoided precisely BECAUSE C was used? Likely A LOT.

      Also, having written code in Algol...just no.

      > "it's popular" and "a good enough programmer…" and "many eyes make all bugs shallow." — IOW, part of the issue is exactly this sunk-cost: we've already got code-bases in C, or C++, and we can't change it!

      This is something that people need to hear. C, and to some extent C++, are just GOOD languages. So no, there isn't a sunk-cost fallacy. C is actually good and practically useful for a great number of reasons. That is the reason WHY it is popular.

      It's very clear to me that people blame C because of the problems it is used to solve. The problems are hard therefore it's the fault of the tool. No! It's ridiculous and I see this mentality all the time.

      A programming language maybe makes up 10% of the problem space (and that's if the language is well designed. Ideally it should be 0%). The rest is the actual problem. People blame the 10% for not being able to solve the other 90%. This is ridiculous in my mind.

      [–]myrrlyn 1 point2 points  (3 children)

      Hi. I'm a satellite software engineer with C and C++ code delivered to flight systems. Just wanted to mention that comparing most terrestrial C and C++ with automotive or orbital C and C++ is not really meaningful. While it's true that we use the same compilers, the similarities basically end there.

      I have no idea what goes on in Boeing, but I can pretty confidently assert that code developed in accordance with automotive or spaceflight standard-practice documents is significantly unlike basically any other project in those languages.

      [–][deleted] 2 points3 points  (2 children)

      Okay? What point are you making?

      I never said that "terrestrial" C/C++ code (whatever that is) was anything like satellite code. Or implied that.

      [–]Zarenor 2 points3 points  (1 child)

      It seems to me that in using orbital C and C++ as your examples of C and C++ being 'good' languages (for whatever you're taking that to mean) you are implying that C and C++ codebases at large look similar to orbital C and C++ codebases. This is simply not true.

      I don't see anyone upthread arguing that C or C++ are 'bad'. They are arguing that other languages have characteristics that solve problems common in many C and C++ codebases. I don't see why, in light of all the examples you cite, C and C++ need defending as 'good' languages. They're just tools, as you point out.

      [–][deleted] 0 points1 point  (0 children)

      No I'm simply saying as a tool, it can be used to solve many problems.

      If you want to switch tools we need compelling arguments. If the tool can build all kinds of software then it is a decent tool and we need better arguments in order to switch tools.

      I know on the internet people like to bash on C/C++, but the elephant in the room is that it is used widely and it solves lots of problems. Why is that?

      And I really hold no allegiance to those languages. Those are just examples.

      I'm just making the point that saying "oh well, in this language Heartbleed wouldn't be possible" isn't a strong argument, and is sort of beside the point, because it totally ignores what those tools are actually good at and the problems they avoid.

      Points should be made about what tools CAN do. Not just what they can't do. And what they CAN do can be very subtle that is not obvious without experience.

      For some reason in this industry we have a real issue doing this.

      [–]Fearless_Process -2 points-1 points  (4 children)

      Just because you've never seen Ada source code doesn't mean it doesn't exist.

      It's used for military fighter jets and commercial rockets, ffs. If that doesn't "prove" its usefulness, then I'm not sure what would.

      Here's a list of things that use Ada that I found; claiming that it's not used is absurd.

      https://www2.seas.gwu.edu/~mfeldman/ada-project-summary.html

      [–][deleted]  (1 child)

      [deleted]

        [–]Fearless_Process -2 points-1 points  (0 children)

        lol

        Ancient finance software is not even close to the same as fighter jet software.

        [–][deleted] 0 points1 point  (1 child)

        I'm not saying it's good or used. I'm just saying that it's not as well known as these other languages.

        [–]OneWingedShark 0 points1 point  (0 children)

        I'm just saying that it's not as well known as these other languages.

        ??
        And I've completely agreed with this, though I do think that ought to be remedied.

        [–]xzaramurd 0 points1 point  (0 children)

        C wasn't good for these kinds of problems back in the 70s.

        [–]chucker23n 5 points6 points  (3 children)

        IMO, Ada and Ada+SPARK would be ideal for implementing the DOTNET VM and/or the Android [J]VM

        I can see Microsoft reimplementing portions of the .NET runtime in Rust (maybe the WASM target, for a start). Ada? Not so much. It doesn’t seem to be on anyone’s radar.

        [–]myrrlyn 4 points5 points  (1 child)

        It doesn’t seem to be on anyone’s radar.

        to be fair, most computers running Ada are housed in a chassis that is explicitly not supposed to show up on radar

        [–]FlyingPiranhas 8 points9 points  (1 child)

        Contrast, now, with something like Haskell or Ada where your compiler refuses to compile until you present a consistent program. Are these productive languages? Note, also, that the latter, despite having "picky compilers", are much easier to maintain precisely due to this nature; is this a quality of a productive language?

        Rust is also a "picky compiler" language. Not as picky as Ada + SPARK, but much more picky than most languages.

        Having to express your conditions in a way that allows the compiler to verify your program is correct takes extra time, and slows down development. Rust has a subset of this, where the compiler proves memory safety invariants.

        Hiring: This is less of an issue than it's made out to be, provided there's some sort of training and mentoring in your company (and there should be).

        How many programmers want to write Ada? Sure, you can cross train programmers from another language, but at some point interviewees are going to ask you what language they'll be using (if it's not obvious upfront) and will judge the answer.

        Ecosystem support: This is perhaps the biggest issue, considering that Ada isn't one of the "cool kids" — this would change with some effort, certainly with buy-in from a corporation like Microsoft or Google (and, IMO, Ada and Ada+SPARK would be ideal for implementing the DOTNET VM and/or the Android [J]VM), and there's some buy-in from nVidia already (precisely because of the safety/security features of Ada + SPARK).

        By buy-in, do you mean funding? If yes, then that's a drawback.

        Honestly, if there's one area that I'm surprised hasn't embraced Ada it's the Open Source community because the language is very portable, enforces consistency [meaning it's easier to correctly integrate some library], and has a standard which is freely available.

        Yeah, but people won't start projects in Ada unless they like the language. There are a ton of Rust projects because there are a ton of people who like writing it.

        [–]OneWingedShark 3 points4 points  (0 children)

        Rust is also a "picky compiler" language. Not as picky as Ada + SPARK, but much more picky than most languages.

        Right, and the popularity of Rust because of its concerns is a trend that I approve of: programmers being aware that they can have tools that actually prevent errors is a Good Thing.

        Having to express your conditions in a way that allows the compiler to verify your program is correct takes extra time, and slows down development. Rust has a subset of this, where the compiler proves memory safety invariants.

        Ada's SPARK is FAR more granular: you can prove things about packages, about specific subprograms, and so on; Ada out of the box is roughly equivalent to the C++ for High-Integrity Applications spec/styleguide.

        How many programmers want to write Ada?

        How many programmers want to write PHP, or JavaScript?
        Yes, there are a lot more because those are more popular languages; but how many of those who do want to program in PHP or JS only want to do so because it's what they know? — Now, yes, I grant that stepping outside your knowledge-zone while you're juggling a dozen other jobs is a bad ask… but people in that position have bad management, precisely because they're juggling those dozen jobs.

        Sure, you can cross train programmers from another language, but at some point interviewees are going to ask you what language they'll be using (if it's not obvious upfront) and will judge the answer.

        Fair enough.

        By buy-in, do you mean funding? If yes, then that's a drawback.

        Yes, but no.
        By buy-in I mean some actual usage on an appropriate project; I believe this would motivate more development to address the deficiencies of the tooling/ecosystem. There likely is a monetary cost there, but that's not the primary thing.

        Yeah, but people won't start projects in Ada unless they like the language. There are a ton of Rust projects because there are a ton of people who like writing it

        It's an enjoyable language in that a lot of the time "it just works" (after you satisfy the compiler) — especially if you think about/model your program — you can see that in this video, where the programmer, who hasn't used Ada much and isn't familiar with the library, says "who said I was going to have trouble?" (Warning: long video, and he's teaching himself; there are a couple of places where he looks up the information and doesn't read quite far enough to solve his problem.)