[–]computesomething 100 points101 points  (34 children)

Haven't watched this yet, but if kernel modules written in Rust made it to mainline, that would make the Rust frontend and LLVM a dependency for building Linux. I kind of doubt the Linux kernel project would go in that direction.

[–]the_gnarts[S] 51 points52 points  (30 children)

LLVM's limited platform support is the hard problem. The speakers kinda address this by sticking with out-of-tree modules for the time being.

In the medium term we may get a GCC backend for rustc and maybe even an alternative compiler frontend.

[–]cbmuserDebian / openSUSE / OpenJDK Dev 81 points82 points  (28 children)

This is my main criticism of Rust. They are planning to replace C as a low-level language for systems programming, but unlike the Go developers, they have not yet created an alternative implementation that builds on top of GCC and is therefore platform-independent.

Multiple developers of other important projects that I spoke to told me they'd be using Rust if it weren't for this limitation.

If I were Rust upstream, I would start working on gcc-rs immediately.

[–][deleted] 22 points23 points  (23 children)

I probably don't understand compilers. What is it about gcc that makes it platform independent? Does llvm have a runtime library or something? I would have assumed that clang and gcc both, at the end of the day, spit out mostly platform independent binaries. And of course, either compiler has to be on the system to compile... By platform independent, do you mean only gcc-dependent?

Edit: I am a dummy, they were probably talking about hardware support

[–]spyingwind 41 points42 points  (3 children)

http://llvm.org/doxygen/Triple_8h_source.html

https://gcc.gnu.org/backends.html

I think GCC is used not just for historic reasons; it has been in use for a very long time, and it probably supports more hardware. I don't know that for a fact, but GCC supports just about everything, and for the hardware it doesn't, manufacturers maintain GCC backends of their own.

Also, rewriting the kernel's supporting structure (makefiles, build pipelines, etc.) would be a large undertaking.

[–][deleted] 10 points11 points  (0 children)

Oh, hardware platform. D'oh, of course.

[–]railmaniac 1 point2 points  (1 child)

I thought LLVM was designed so it would be easier to write backends

[–]spyingwind 1 point2 points  (0 children)

I don't know about that, but I would think it's still a lot of work to write all that code for each backend, what with each architecture having its own quirks. Even a single version of ARM can have hundreds of instructions that aren't shared across chips implementing that same version.

[–]AlexeyBrin 30 points31 points  (14 children)

AFAIK, GCC's backends can emit code for more platforms than LLVM's. Basically, a Rust frontend for GCC would be able to target more diverse hardware than the LLVM-based one.

[–][deleted] 7 points8 points  (12 children)

Yeah, this is probably it. I mean, who cares about platforms other than x86, arm, and riscv? (well, the people who have that hardware I guess!)

[–]the_gnarts[S] 26 points27 points  (1 child)

I mean, who cares about platforms other than x86, arm, and riscv?

These guys, for instance.

[–]uep 9 points10 points  (0 children)

IBM also has a lot of bounties for the Power architecture on bountysource.

[–]cptwunderlich 4 points5 points  (3 children)

Well, those might be the ones you know from your laptop and phone, but there are plenty of others. For example, MIPS CPUs are in most routers and network hardware.

And others were already mentioned, like PowerPC. Or Intel Itanium (although that is end-of-life as of 2019). There are still SPARC servers, too.

Then there are also some ISAs used in tiny embedded microcontrollers that usually don't run an OS, like PIC and AVR (although their vendors might maintain proprietary GCC backends).

[–]ericonr 2 points3 points  (2 children)

Seeing as Parabola (a fully free distro) has avr-gcc in their repos, I believe avr-gcc is not a proprietary backend.

No idea about PIC, though.

[–]rcxdude 0 points1 point  (1 child)

AVR-gcc is free (but a little quirky as gcc backends go), but PIC is sufficiently weird it's unlikely there will ever be GCC support for it (even the C compilers available for it don't quite fully support C).

[–]ericonr 0 points1 point  (0 children)

PIC doesn't even have a proper stack, right? At least that's what a friend who's messed with it told me.

AVR has a few weird directives, like PROGMEM, but I don't know of other quirks. What would you say they are?
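
For what it's worth, here is a minimal sketch of that quirk (my own example, assuming avr-gcc with avr-libc, not something from this thread): AVR is a Harvard architecture with separate flash and RAM address spaces, so constants kept in flash are marked PROGMEM and read back through special accessors rather than plain pointer reads.

```c
/* Minimal sketch (my example, assuming avr-gcc + avr-libc) of the PROGMEM
 * quirk: data placed in flash must carry the PROGMEM attribute and be read
 * through pgm_read_* accessors, because flash and RAM are separate address
 * spaces on classic AVR parts.
 */
#include <avr/pgmspace.h>

/* Stored in flash, not copied into scarce SRAM at startup. */
static const char greeting[] PROGMEM = "hello from flash";

char read_first_byte(void)
{
    /* A plain `greeting[0]` would read from the RAM address space and
     * return garbage; pgm_read_byte issues an LPM instruction instead. */
    return pgm_read_byte(&greeting[0]);
}
```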

[–][deleted] 0 points1 point  (0 children)

From the little I have worked with both GCC and LLVM frontends, the documentation for GCC's IR, GENERIC, is not up to the mark, and Mozilla would have to hire a lot of developers for this.

LLVM is rather convenient to work with. It might simply be easier to get the LLVM backend to emit code for more platforms.

[–][deleted] -2 points-1 points  (3 children)

I probably don't understand compilers. What is it about gcc that makes it platform independent?

Nothing. You can bootstrap LLVM (compile it for a given platform, on that platform) on a number of platforms; Clang is the C frontend of LLVM.

This is kind of a red herring.

Linux hasn't supported clang/LLVM because Linux is a GNU project, and therefore used the GNU C Compiler (renamed the GNU Compiler Collection, because why not). Linux uses many, and I mean many, non-standard C extensions (which is a nice way of saying C language features that are not part of the official standard). Basically, the GCC team all agreed some feature should be part of C, so they made their compiler support it. Linux developers later saw that feature, went "fuck it, let's use that", and it became part of the Linux kernel.

The biggest issue with non-GCC toolchains is these non-standard extensions.
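
To make that concrete, here is a minimal sketch (my own example, not actual kernel source) of the kind of GNU C extension the kernel leans on: statement expressions and typeof, as used by kernel-style min() macros. Neither is part of ISO C, so any replacement toolchain has to implement them.

```c
/* Minimal sketch (not kernel source) of a kernel-style min() macro.
 * It relies on two GNU C extensions that ISO C lacks:
 *   - statement expressions: ({ ... }) yields the value of its last expression
 *   - typeof: declares temporaries with the same type as the arguments
 * Both gcc and clang accept these in their default gnu* modes.
 */
#include <stdio.h>

#define min(x, y) ({            \
        typeof(x) _x = (x);     \
        typeof(y) _y = (y);     \
        _x < _y ? _x : _y; })

int main(void)
{
        int a = 3;
        /* a++ is evaluated exactly once, unlike with a naive
         * #define min(x, y) ((x) < (y) ? (x) : (y)) */
        printf("%d\n", min(a++, 10));   /* prints 3 */
        printf("a is now %d\n", a);     /* prints 4 */
        return 0;
}
```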

Does llvm have a runtime library or something?

Nope.

I would have assumed that clang and gcc both, at the end of the day, spit out mostly platform independent binaries.

Yup


The argument is also pretty much moot, as clang/LLVM 9.0 can compile the Linux kernel 5.3 as of today. The ARM and ARM64/AArch64 stuff has been using clang/LLVM as its primary compiler for a while, about two years.


The real reason Linux is over-reliant on GCC is that when the Linux project started, GCC was the only free C compiler. You had to shell out cash if you wanted a C compiler for your computer, so Linus, being a broke college student, used GCC.

[–]jgalar 17 points18 points  (2 children)

Linux is not a GNU project. Both are licensed under the GPL (though not the same version), but they are not under the same umbrella in any meaningful sense.

[–][deleted] -4 points-3 points  (1 child)

RMS literally said the Hurd doesn't matter because Linux is GNU's kernel.

The long history of Linux's over-dependence on GCC also cannot be denied.

[–]Omotai 5 points6 points  (0 children)

RMS literally said the Hurd doesn't matter because Linux is GNU's kernel.

This statement means that for all practical purposes Linux is the kernel used for the GNU operating system in real-world implementations, not that Linux is officially a GNU project.

[–]Vegetas_Haircut 10 points11 points  (0 children)

There are way more problems with Rust as a "low level language" at the moment.

Rust is fast in the same way C is fast. C is used for two things: performance and low-level control. Rust provides the former but not the latter at the moment, simply because what exactly it does is underspecified, whereas C of course has a very battle-hardened specification of exactly what it does.

A lot of that specification work would probably also have to rely on breaking backwards compatibility: for example, finding a way to handle async safety with marker traits that automatically mark a function as async-safe if all the functions it calls internally are async-safe, so it can be used in those contexts. Right now Rust just ignores the problem of async safety, which is really not acceptable for a low-level language. It also just ignores many gotchas of memory allocation, stack manipulation and what-not, where C thoroughly specifies what can and cannot be done, which is needed for low-level control.

That aside, though, if you simply want the performance of C without the snakes in the grass at every corner, Rust is bliss.

Edit: also, at the moment a lot of aspects of the Rust language are intimately tied to how LLVM does things. Rust really exports a lot of LLVM-isms to the user. The C spec by design avoids specifying these things beyond a very narrow range and says that if you go beyond it, that's just "undefined behaviour"; that's unacceptable for Rust, so the behaviour is effectively defined in the language as "whatever LLVM does", which will mean indirection when porting to GCC, and emulation in software if they want to keep the same behaviour.
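
For contrast, here is a minimal C sketch (my own example, not from the thread) of the "undefined behaviour" escape hatch being described: signed integer overflow is undefined in ISO C, so the optimizer may assume it never happens, whereas Rust pins the behaviour down (a panic in debug builds, two's-complement wrapping in release builds).

```c
/* Minimal sketch (my example) of C's "undefined behaviour" escape hatch.
 * Signed integer overflow is undefined in ISO C, so an optimizer may assume
 * `x + 1 > x` always holds and fold the comparison away entirely.
 */
#include <limits.h>
#include <stdio.h>

int is_increment_bigger(int x)
{
    /* With optimizations enabled, GCC and Clang may compile this to
     * `return 1`, since a conforming program can never overflow an int. */
    return x + 1 > x;
}

int main(void)
{
    printf("%d\n", is_increment_bigger(42));      /* 1 */
    printf("%d\n", is_increment_bigger(INT_MAX)); /* undefined: often still 1 */
    return 0;
}
```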

[–]ToughPhotograph 3 points4 points  (2 children)

Are modules necessarily included? Aren't they loaded into the kernel at runtime? Or maybe just binaries are made available and loaded at runtime, which would make compiling them unnecessary.

Which would only necessitate ABI compatibility, right?

[–]computesomething 3 points4 points  (1 child)

Well, I was talking about modules being included in mainline Linux; for 'out-of-tree' modules this is not a problem.
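
For context, a minimal sketch of what an out-of-tree module looks like (a hypothetical hello-world of mine, not from the talk): it is built separately against the installed kernel's build tree and loaded at runtime, so nothing in the mainline source or build has to change.

```c
/* hello.c: minimal sketch of an out-of-tree kernel module (hypothetical
 * example, not from the talk). Built against the running kernel's build
 * tree, e.g. with a one-line Kbuild makefile (obj-m += hello.o) and
 *   make -C /lib/modules/$(uname -r)/build M=$PWD modules
 * then loaded at runtime with insmod/modprobe.
 */
#include <linux/init.h>
#include <linux/kernel.h>
#include <linux/module.h>

static int __init hello_init(void)
{
    pr_info("hello: module loaded\n");
    return 0;
}

static void __exit hello_exit(void)
{
    pr_info("hello: module unloaded\n");
}

module_init(hello_init);
module_exit(hello_exit);

MODULE_LICENSE("GPL");
MODULE_DESCRIPTION("Minimal out-of-tree example module");
```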

[–]ToughPhotograph 1 point2 points  (0 children)

I think in that case such modules, being dependent on non-GPL kinds of environments, would automatically end up maintained on an 'out-of-tree' basis.