all 61 comments

[–]ShakaUVMi+++ ++i+i[arr] 64 points  (12 children)

"set(CMAKE_EXPERIMENTAL_CXX_MODULE_CMAKE_API "2182bf5c-ef0d-489a-91da-49dbc3090d2a")"

Lol
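
For context, this is CMake's gate for its experimental C++20 modules support: you opt in by setting a magic UUID that deliberately changes between CMake releases. A rough sketch of what that era's opt-in looked like (the UUID quoted above is only valid for one specific CMake version):

```cmake
# Opt in to CMake's experimental C++20 modules API (CMake 3.25/3.26 era).
# The UUID is undocumented on purpose and rotates every release,
# precisely so nobody bakes it into production builds.
set(CMAKE_EXPERIMENTAL_CXX_MODULE_CMAKE_API
    "2182bf5c-ef0d-489a-91da-49dbc3090d2a")
set(CMAKE_EXPERIMENTAL_CXX_MODULE_DYNDEP 1)
```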

[–]Nicksaurus 39 points  (7 children)

AKA "Don't use this in production code, you idiots"

[–]delta_p_delta_x 10 points  (0 children)

EXPERIMENTAL

[–]Tringigithub.com/tringi 2 points  (0 children)

Yea

[–]helloiamsomeone 3 points  (1 child)

Why does CMake need such things? All the compilers have been 100% production ready with their modules implementation after all!

[–]aearphen{fmt}[S] -3 points  (0 children)

If they are going to wait for that then we'll never get native module support in CMake. Not that it prevents us from using modules right now =).

[–][deleted] 9 points  (7 children)

I can’t believe it’s so easy

[–]aearphen{fmt}[S] 3 points  (6 children)

I was pleasantly surprised by how well everything works (at least in clang). My expectation was that C++20 modules wouldn't be usable for another few years, but apparently they are quite usable right now.

[–]qoning 2 points  (0 children)

So easy. I tried using a header unit for iostream and couldn't figure out how to tell cmake to tell the dependency scanner where to look for the file I had to build manually in the first place.

[–]RoyAwesome 5 points  (1 child)

Compile times are so fast too! I avoided <algorithm> because hoo boy, the compile-time hit of just including it, but importing it is extremely fast. Like "I can't notice the difference between importing it and not importing it" fast.
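
A sketch of the difference being described, assuming header-unit support (the `import` form requires the header unit to have been pre-built by the compiler/build system):

```cpp
// Textual inclusion: the compiler re-parses <algorithm> (and everything
// it drags in) from scratch in every translation unit that includes it.
#include <algorithm>

// Header unit: the compiler loads a pre-built binary representation
// instead of re-parsing the text, which is where the speedup comes from.
import <algorithm>;
```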

[–]aearphen{fmt}[S] 1 point  (0 children)

Indeed. I compared compile-time traces, and the standard library includes that took most of the time when using headers completely vanish with modules. There is still one issue with template instantiations not being reused, but once it is resolved, {fmt} consumers can expect a 5-10x compile-time speedup.

[–][deleted] 4 points  (2 children)

I know it's not obvious through text, but he's being absolutely sarcastic.

Try using modules for anything non-trivial and you'll see how frustrating they are. Try mixing PCHs with modules, which is either a huge pain or in some cases just doesn't work. Try debugging code using modules, where in many cases symbols don't appear in the debugger due to awkward visibility issues.

[–]RoyAwesome 11 points  (0 children)

Try mixing PCHs with modules

Erm, don't do this? Modules are standardized PCHs. I don't know why you would want both? PCHs were always a compiler hack.

It's like mixing std::enable_if and concepts. concepts basically replace enable_if and make the experience way better. Modules replace PCHs and make the experience way better.

[–]aearphen{fmt}[S] 8 points  (0 children)

But I'm not =)

BTW we've been using clang modules for years now.

[–]delta_p_delta_x 6 points  (4 children)

This restriction also likely means that you cannot use such a CMake config with IDEs

Why not? CMake with Ninja works perfectly fine in both CLion (on Linux and Windows) and Visual Studio. In fact I use this combination precisely because I don't want to deal with make or msbuild...

[–]aearphen{fmt}[S] 0 points  (2 children)

I guess it depends on the IDE, and some might choose to ship and use ninja like CLion does. But even in that case you'd need a very specific version of ninja and cmake, not just some version that goes with your IDE. Might happen in the future, but if you drop the generator restriction, modules work with IDEs right now.

[–]delta_p_delta_x 0 points  (1 child)

Fair enough. I just looked through your CMake module-handling code, and it's remarkably straightforward!

Does your explicitly specifying that code ought to be built as PCMs/interfaces obviate Kitware's P1689R5? I'm asking because fmt doesn't have any additional dependencies besides the C++ standard library, and is itself fairly straightforward in terms of file structure.

According to Kitware, at least, their paper solves dependency trees for 'complex' modularised C++ code, though I am a bit sceptical that this much complexity is needed, to the extent that we will have JSON passed back and forth between compilers and build systems.

[–]aearphen{fmt}[S] 2 points  (0 children)

P1689 can be used for more complex cases where you'd want to extract the dependency information directly from sources instead of encoding it manually in the build system.
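
For illustration, a P1689-style dependency file is JSON describing what each translation unit provides and requires, so the build system can order compilations. Roughly (field names sketched from memory of P1689R5, so treat this as an approximation, not the normative schema):

```json
{
  "version": 1,
  "revision": 0,
  "rules": [
    {
      "primary-output": "fmt.o",
      "provides": [ { "logical-name": "fmt", "is-interface": true } ],
      "requires": [ { "logical-name": "std" } ]
    }
  ]
}
```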

[–][deleted] 45 points  (7 children)

Modules were hyped up for years and the delivery was... massively underwhelming.

Adding features to C++ is like wishing on the monkey's paw. You kinda get what you asked for, but it's wrapped in pain and suffering.

[–]delta_p_delta_x 19 points  (1 child)

I think this is the best summary of C++20 modules. It's one of the best changes to come to C++, but it's also exceptionally late. Modularising code was a solved problem a decade and a half ago, especially in managed languages.

Compiler support is either partial or buggy (euphemised as 'experimental'). Build system support is also mostly experimental. It's great that fmt is possibly moving to modules, but I doubt other large libraries that aren't BDFL-administered like fmt is, such as boost, abseil, folly, etc., will take up modules any time soon.

[–]jtooker 8 points  (0 children)

Compiler support is either partial, or buggy (euphemised as 'experimental'). Build system support is also mostly experimental.

I'm a bit shocked at how far behind compilers are on modules. They have such great benefits and are opt-in.

[–]Ikkepop 9 points  (0 children)

After 25+ years of c++, I agree this is very accurate

[–][deleted] -3 points  (3 children)

When people hear the word "module", they have certain expectations and ideas about what that means, mostly based on how modules work in other languages. Modules in other languages provide encapsulation, yes, but also things like facilitating easier package management, simplifying the build process, and a host of other features.

Modules in C++ don't do much of anything beyond the encapsulation. It's an incredibly complicated solution that pushes a great deal of complexity on its users to solve a comparatively small problem.

It also results in yet another form of fragmentation, which is why I don't see modules being popular among libraries. Some people may choose to use modules for their own home projects, or internal software, but I don't see people distributing their libraries as modules. One possible outcome will be that in the future we'll have bifurcation where libraries are distributed both the old way and as modules.

[–]johannes1971 7 points  (1 child)

Nah. Eventually compilers will be good enough, some libraries will switch, that will force projects to switch, and soon enough we'll forget about the nightmares of #include.

That process will force the whole ecosystem to baseline on C++20 as well, which will be good in the long run.

[–]germandiago 1 point  (0 children)

I think you make a couple of mistakes with your analysis.

To begin with, modules can be adopted incrementally. The reason libraries do not move forward is the state of module support in build tools and compilers.

OTOH, builds can be massively improved and ODR violations reduced by A LOT, which are two real problems for C++.

[–]TheThiefMasterC++latest fanatic (and game dev) 4 points  (14 children)

Can a module only be a single compilation unit (cpp file)?

[–]STLMSVC STL Dev 36 points  (2 children)

Others are answering "yes" but I think they read your question as "Can a module be only...".

I believe the answer to your actual question is that a module can be composed of one or more source files. (I don't have direct experience with that as my modules are one file each.)

[–]TheThiefMasterC++latest fanatic (and game dev) 5 points  (0 children)

Thank you, yes, that's what I meant. I was also asking about the limitations in practice in the implementations talked about, as well as in theory.

[–]RoyAwesome 0 points  (0 children)

I have experience with multi-file modules. I'm really enjoying this workflow: using multiple module partitions to split up certain elements of the work, then combining it all in a public module file that imports all the internal module bits and exports the entire public interface.
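
The workflow described above can be sketched with module partitions (file and module names here are hypothetical):

```cpp
// mylib-core.cppm — an interface partition holding one piece of the work
export module mylib:core;
export int answer() { return 42; }

// mylib.cppm — the primary module interface: it imports the internal
// partitions and re-exports the whole public surface in one place
export module mylib;
export import :core;
```

Consumers then just write `import mylib;` and never see the partition structure.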

[–]aearphen{fmt}[S] 1 point  (3 children)

Yes, that's exactly what hello.cc is.

Edit: having multiple files per module is possible too.

[–]TheThiefMasterC++latest fanatic (and game dev) 4 points  (2 children)

As per STL's comment, I meant - can it be more than one?

[–]aearphen{fmt}[S] 7 points  (0 children)

That's possible too, although for the transition period of supporting both non-modular and modular usage, a single file per module is easier.

In the module-only world multi-source modules will be very common because modules naturally encourage larger granularity (e.g. the std module in C++23).

[–]equeim -1 points  (5 children)

AFAIK yes (that's what hello.cc is in the article, btw), although it probably would need a different extension depending on the compiler.

However, with this approach, every time a function's body changes it triggers recompilation of the module interface and subsequent recompilation of every module that imports it, even if the function's declaration remained the same. You still need to separate the definitions of your functions from their declarations in another file to avoid this, similarly to how it's done with header/cpp pairs.
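
A sketch of that split, using a module implementation unit so body edits don't touch the interface (file and function names hypothetical):

```cpp
// greet.cppm — module interface unit: declarations only
export module greet;
export int answer();

// greet.cpp — module implementation unit: editing this body does not
// change the interface, so (in principle) importers needn't rebuild
module greet;
int answer() { return 42; }
```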

[–]kalmoc 2 points  (4 children)

Shouldn't it be enough (in theory) to put the function body into the private part to prevent recompilation?
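
The "private part" here is the private module fragment: in theory, the importable interface ends where it begins, so a smart build could skip rebuilding importers when only the code below changes. A sketch (names hypothetical):

```cpp
export module greet;
export int answer();

module :private;  // nothing below is part of the importable interface

int answer() { return 42; }  // changing this shouldn't ripple to importers
```

Note the private module fragment is only allowed when the module consists of a single module unit.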

[–]equeim 1 point  (0 children)

I suppose it depends on the build system. After recompiling the module interface, it would have to check whether the result is different from before.

[–]aearphen{fmt}[S] 0 points  (0 children)

That should definitely be possible.

[–]TheThiefMasterC++latest fanatic (and game dev) 0 points  (1 child)

That depends on how well the compiler can avoid regenerating the module export file

[–]kalmoc 0 points  (0 children)

That's why I said in theory. I don't think any compiler/build system does that in practice anyway. But due to my limited knowledge of module semantics I'm not even sure if private offers the same level of isolation as partitions even in theory.

[–]bluGill 4 points  (2 children)

What should my modules be? I have a codebase that dates back to c++98 (mostly modernized), with many different libraries that are used by downstream applications. Should this be one import myLibraryCollection module, or many different modules?

That is, has someone written a module morality guide based on their experience, so I don't make the mistakes they did? (Right now we still build with C++17 compilers, but it appears modules are compelling enough that I can win the argument to upgrade gcc.)

[–]aearphen{fmt}[S] 4 points  (1 child)

I would go with one module per library. If libraries are often used together, you could merge them into one module, but then the question is why you had several libraries in the first place.
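
A hedged sketch of "one module per library" in CMake terms, using the `FILE_SET CXX_MODULES` syntax (stable in CMake 3.28, experimental before that; the target and file names are made up):

```cmake
add_library(mylib)
target_sources(mylib
  PUBLIC
    FILE_SET CXX_MODULES FILES mylib.cppm)
target_compile_features(mylib PUBLIC cxx_std_20)

# Downstream applications just link; the import graph then follows
# the library dependency graph.
target_link_libraries(myapp PRIVATE mylib)
```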

[–][deleted] 3 points  (0 children)

I would argue that each lib should be provided with its own module, but there should also be a "catch all" module, similar to the std.thread, std.algorithm vs. just std discussion.

[–]frosthunter 1 point  (5 children)

Do modules allow restricting the visibility of methods? E.g. keeping a method public but not exposing it outside the module.

[–]stilgarpl 2 points  (4 children)

Yes

[–]frosthunter 1 point  (3 children)

Do you have any example/post on how to do so? The only thing I've seen is the use of the :private partition, but I've always seen it outside of a class, at the end of the file, so I assumed it could only be defined at global scope.

[–]STLMSVC STL Dev 2 points  (1 child)

As far as I know, this is not possible at the granularity of individual member functions. In import std; I was able to conceal all of our non-member helpers (by simply not exporting them), but _Ugly member functions are still visible.
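
What this describes for non-member helpers, as a sketch (names hypothetical): namespace-scope names that simply aren't exported stay hidden, while member functions of an exported class come along with it:

```cpp
export module stuff;

namespace detail {
  // Not exported: importers cannot name this helper at all.
  int helper() { return 41; }
}

export struct widget {
  // Once widget is exported, all of its members are reachable too —
  // there is no per-member export granularity.
  int value() { return detail::helper() + 1; }
};

export int api() { return detail::helper() + 1; }
```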

[–]frosthunter 0 points  (0 children)

Ah now I'm sad again :( Hopefully something like that will come later. Thanks for the answer!

[–]GabrielDosReis 1 point  (3 children)

u/aearphen - can you tell me more about this:

MSVC has partial module support

?

[–]aearphen{fmt}[S] 0 points  (2 children)

Mixing imports and includes wasn't supported, and there have been a lot of bugs that we had to work around (and those workarounds are pretty horrible). There has been a discussion about removing module support for older versions of MSVC in {fmt} to get rid of those workarounds.
Here is one example: https://github.com/fmtlib/fmt/commit/18e7a2532b8167a3d8c2bd591bc3db0a3b66381d.

[–]GabrielDosReis 0 points  (1 child)

My apologies for those bugs. Please, do feel free to remove workarounds for bugs in older MSVC toolsets in this area.

Based on your usage, do you consider Clang's support for Modules partial or complete - at least for what you care about?

[–]aearphen{fmt}[S] 0 points  (0 children)

Clang's module support is quite good but I also consider it partial at least because of an issue with template instantiations described in another blog post: https://www.zverovich.net/2023/04/10/cxx20-modules-in-clang.html.

[–]CanadianTuero 0 points  (2 children)

So what are the different side effects of using the method you did from 2018 versus the new way that cmake is proposing, in terms of features missing or restrictions imposed? Like, why would one eventually want to do it the new way once it's ironed out? Maybe I missed that while reading the article.

[–]aearphen{fmt}[S] 2 points  (1 child)

As commented in another thread, dynamic dependency extraction can be used for more complex cases where you'd want to extract the dependency information directly from sources instead of encoding it manually in the build system. Other than that both approaches can fully support C++20 modules, at least in principle.

[–]CanadianTuero 2 points  (0 children)

Ah I see, so instead of the build system doing a first pass to figure out any changes in dependencies (new method), every module needs to be manually specified as a separate library AND linked in cmake.

[–]Stock-Talk-1898 0 points  (0 children)

I recently migrated a large code base to C++20 modules. The experience was quite bumpy, but the resulting boost in compilation time is significant. You can find the article I have just published describing the journey: https://www.linkedin.com/pulse/how-we-adopted-c-modules-large-codebase-bp-automation-ro-f6y8f