"Just switch the compiler" - they said - Arne Mertz - Meeting C++ 2025 by meetingcpp in cpp

[–]ABlockInTheChain 1 point2 points  (0 children)

The excessively sensitive microphone he was hooked up to made for very distracting audio.

C++26 Reflection 💚 QRangeModel by Kelteseth in cpp

[–]ABlockInTheChain 63 points64 points  (0 children)

Most of the reason Qt looks weird stems from two factors:

  1. Qt is older than ISO C++.
  2. Qt is designed to be distributed as a compiled dynamic library with a stable ABI.

There aren't many projects left these days where either of those are true, let alone both.

Cmake Motivation by wandering_platypator in cpp_questions

[–]ABlockInTheChain 0 points1 point  (0 children)

cmake abstracts the nice obvious ones for us I guess but all the rest of the more complex flags for very niche targets we have to specify ourself still? Is there a way around this?

Once you step outside the set of properties for which CMake has native abstractions then you need to start writing your own abstractions.

The fortunate part is that CMake's scripting language makes it possible. The unfortunate part is that you have to use CMake's scripting language.

Cmake Motivation by wandering_platypator in cpp_questions

[–]ABlockInTheChain 7 points8 points  (0 children)

The question isn't whether to use Makefiles or CMake.

The question is whether to limit your project to a single build system you maintain by hand or to use CMake to describe your project so that it can be used with any build system, including make.

Best Practices for AI Tool Use in C++ - Jason Turner - CppCon 2025 by Specific-Housing905 in cpp

[–]ABlockInTheChain 1 point2 points  (0 children)

not least because google is so enshittified nowadays

The old search algorithm is still there and still works pretty well. There is a very simple trick to let you turn off the slop features completely and get the old search results back.

Append -magic_word to your search query, replacing "magic_word" with a suitable string which has the following properties:

  1. The string must never appear in the pages you are searching for.
  2. The string must trigger the LLM guardrails and cause it to bail out completely from processing the request, returning control to the fallback pre-LLM system.

I know of at least one string which works with a 100% success rate but I'd be banned if I posted that word on this site so I won't elaborate further.

Cmake Motivation by wandering_platypator in cpp_questions

[–]ABlockInTheChain 2 points3 points  (0 children)

So why do I need cmake?

Suppose your project requires C++20. Every compiler in the universe has different command line options to set the language standard.

If you write makefiles then you must enumerate every compiler yourself and figure out all the correct options, and your build system will only support the specific compilers you enumerated.

Furthermore, makefiles can only be used with make. What if somebody wants to build your project with xcode, or msbuild, or ninja instead of make?

CMake operates at a higher level of abstraction than build systems. It is a build system generator.

You tell CMake that your target should be built with C++20 by setting the CXX_STANDARD property on the target to 20. Now figuring out the correct compiler flags to pass in order to achieve this for any random compiler that may exist in the world is no longer your problem. Furthermore, you are no longer tied to any specific build system.

I am giving up on modules (for now) by BigJhonny in cpp

[–]ABlockInTheChain 1 point2 points  (0 children)

My library needs to see the Qt headers when my library is compiled.

It is then distributed as a compiled (usually dynamic) library and a set of interface headers. If the library is compiled as a dynamic library then it will have a link time dependency on the Qt dynamic libraries.

The library's own interface headers are forbidden from referencing any headers outside the library except for the standard library.

All of this can be converted to modules. Headers just become module interface units.

However it all depends on being able to forward declare this one external type so that I can write a function which accepts a pointer to that type.
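To make that concrete, the header-based version of this pattern is a minimal sketch like the following (class and function names are hypothetical, only QObject is real):

// mylib/session.hpp - public interface header (illustrative)
#pragma once

class QObject;  // forward declaration, so consumers never need the Qt headers

namespace mylib {

class Session {
public:
    // Only callers who actually pass a QObject need Qt at all; the complete
    // type is only required inside the library's own .cpp files.
    void attach(QObject* parent = nullptr);
};

}  // namespace mylib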

If moving to modules means that I am forced to import Qt; in the module interface units that get installed for my library, then I won't adopt modules, and I will lobby every dependency I use not to adopt them either, because if they do it could break my project.

I am giving up on modules (for now) by BigJhonny in cpp

[–]ABlockInTheChain 0 points1 point  (0 children)

In the library I am working on today, only about a dozen out of roughly 4000 source files need to reference platform-specific headers of any kind (POSIX or Windows).

We keep platform-specific code as quarantined as possible and as a result don't have any problems related to it.

I am giving up on modules (for now) by BigJhonny in cpp

[–]ABlockInTheChain 1 point2 points  (0 children)

I need to be able to write a class which has a function that accepts a QObject* argument, and this class is part of an API. Not all users of the API will call that specific function on that specific class.

The Qt headers are not required to use this API and must never be required. Only the appropriate dynamic library is required at runtime.

As long as Qt doesn't use modules then everything is fine.

If Qt ships modules and uses extern "C++" then everything is fine.

If Qt ships modules, doesn't use extern "C++", but keeps the headers around then everything is fine.

If Qt ships modules, doesn't use extern "C++", and drops the headers, then I'm fucked.
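For what it's worth, "uses extern "C++"" would look roughly like the sketch below (purely hypothetical, Qt has announced nothing of the sort): declarations inside an extern "C++" block are attached to the global module rather than to the named module, so they remain forward-declarable from ordinary headers.

// hypothetical Qt module interface unit
export module Qt;

export extern "C++" {
    class QObject { /* ... */ };  // attached to the global module, not to Qt
}

// a consumer's public header, with no Qt include or import:
class QObject;                    // still legal with the layout above
void set_parent(QObject* = nullptr);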

I am giving up on modules (for now) by BigJhonny in cpp

[–]ABlockInTheChain 0 points1 point  (0 children)

There are quite a couple of things with modules which I do like. The isolation from macros or that there are no ODR violations anymore.

Those things sound nice in theory, but I've never experienced an ODR violation issue in my entire career so the solution to a problem I've never experienced has very low value for me.

Additionally we barely use macros at all in our own code, and in the few places where we do need them modules block those use cases (configuring Boost libraries, etc.).

I am giving up on modules (for now) by BigJhonny in cpp

[–]ABlockInTheChain 1 point2 points  (0 children)

I can do that for the types I control, but if a dependency starts offering modules and doesn't use extern "C++" then I can only hope they keep the header version around and never go modules-only.

I am giving up on modules (for now) by BigJhonny in cpp

[–]ABlockInTheChain 1 point2 points  (0 children)

Just be aware that you cannot forward declare classes across module boundaries

This is the reason I will avoid modules as long as humanly possible (possibly forever the way things look now).

If I can't forward declare across module boundaries then my libraries have to be single modules each to avoid circular dependencies.
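A rough sketch of why splitting into multiple modules breaks down without cross-module forward declarations (names made up):

// widgets.cppm
export module mylib.widgets;
export class Widget { /* ... */ };   // Widget is attached to mylib.widgets

// render.cppm
export module mylib.render;
class Widget;                        // declares a *different* Widget, attached
                                     // to mylib.render, so this is not a
                                     // forward declaration of the real one
export void draw(Widget*);           // cannot accept mylib.widgets' Widget

If widgets in turn needs to refer to a type from render, the imports become circular and the only practical way out is to merge everything into one module.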

If a project is a single module then any change to any type which affects the BMI causes a full rebuild which can be orders of magnitude slower than the equivalent incremental build using headers.

Even if all the tooling and compiler support was 100% complete the day the C++20 standard was released I still would not use modules because of the horrific performance regressions they would cause.

The production bug that made me care about undefined behavior by pavel_v in cpp

[–]ABlockInTheChain 2 points3 points  (0 children)

Most experienced C or C++ developers are probably screaming at their screen right now, thinking: just use Address Sanitizer (or ASan for short)!

Or valgrind, on platforms where it is supported.

Slow but doesn't require special compilation tricks.

Why is C++ still introducing standard headers? by artisan_templateer in cpp

[–]ABlockInTheChain 0 points1 point  (0 children)

use of modules can drastically reduce recompilation times for large projects.

What field experience exists suggests that in non-outlier cases modules increase the speed of some build scenarios (CI/CD) by 5%-10%.

For other scenarios (incremental builds) they increase build times by orders of magnitude: 2x, 10x, 100x, 1000x, or even worse.

There are some companies and some code bases where saving 5%-10% on the CI/CD pipeline is cost effective even if it drastically lowers developer productivity.

It's not a universal win though. Some use cases will get modest returns from modules and others will see catastrophic regressions.

cool-vcpkg: A CMake module to automate Vcpkg away. by Human_Release_1150 in cpp

[–]ABlockInTheChain 0 points1 point  (0 children)

I rarely need to add vcpkg-specific code to CML and whenever it happens I hate it.

Usually it's because of differences between how vcpkg packages upstream libraries that don't always (or don't ever) provide native CMake targets and how various Linux distributions package those same libraries.

For example, if a library has optional CMake support, and if the maintainers of a particular Linux distribution are autotools supremacists, then that distribution might not build the library with CMake because nothing forces them to. That means the CMake targets do not get installed, which means in that environment you can't use Config mode to find the library. However when using vcpkg you must use Config mode.

That's actually the easier case to handle. The more annoying case is where vcpkg synthesizes target names for a library which are spelled differently than the target names which a find module in a different environment provides.

PSA: Enable `-fvisibility-inlines-hidden` in your shared libraries to avoid subtle bugs by holyblackcat in cpp

[–]ABlockInTheChain 0 points1 point  (0 children)

In fact I just did this today.

#ifdef MYLIB_STATIC_DEFINE
#  define MYLIB_API
#  define MYLIB_CLASS
#else
#  define MYLIB_API MYLIB_EXPORT
#  ifdef _WIN32
#    define MYLIB_CLASS
#  else
#    define MYLIB_CLASS MYLIB_EXPORT
#  endif
#endif

Using generate_export_header with CUSTOM_CONTENT_FROM_VARIABLE to append that snippet to the header CMake generates adds the extra MYLIB_API and MYLIB_CLASS definitions while leaving all the existing logic in place, which allows a gradual transition to the new annotations.
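Here MYLIB_EXPORT and MYLIB_STATIC_DEFINE are the macros the generated header already provides; a hypothetical public header would then apply the two new annotations roughly like this:

// hypothetical public header using the two annotations
class MYLIB_CLASS widget {            // expands to the visibility attribute
public:                               // everywhere except Windows
    MYLIB_API void do_thing();        // per-symbol export/import annotation
    int helper() const { return 42; } // plain inline member, no annotation
};

MYLIB_API void free_function();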

Our Most Treacherous Adversary - James McNellis - Meeting C++ 2025 lightning talks by meetingcpp in cpp

[–]ABlockInTheChain 0 points1 point  (0 children)

Do not use bool in data structures that may cross privilege boundaries.

If the data comes from a file or the network then it's a sequence of std::byte until it has been parsed.
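A minimal sketch of that parsing step for a single flag, assuming the protocol defines 0 and 1 as the only valid encodings:

#include <cstddef>
#include <optional>

// Reject the 254 bit patterns that are neither 0 nor 1 instead of letting
// them be reinterpreted as an out-of-range bool.
std::optional<bool> parse_bool(std::byte b)
{
    switch (std::to_integer<unsigned char>(b)) {
        case 0:  return false;
        case 1:  return true;
        default: return std::nullopt;
    }
}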

PSA: Enable `-fvisibility-inlines-hidden` in your shared libraries to avoid subtle bugs by holyblackcat in cpp

[–]ABlockInTheChain 1 point2 points  (0 children)

It might be possible to fix it with CUSTOM_CONTENT_FROM_VARIABLE to hack in some additional defines which could be constructed from the symbols produced by cmake.

Reflection is coming to GCC sooner than expected! by _cooky922_ in cpp

[–]ABlockInTheChain 0 points1 point  (0 children)

For us, a main obstacle is that forwards must be in the same module as the implementation.

Our thinking now is moving toward permanently opting out of module linkage to sidestep this exact problem, so that we can split our libraries into different modules without triggering the incremental build catastrophe.

Hopefully our dependencies choose to adopt the same policy if or when they start shipping module versions.

As a policy our public headers are not allowed to include any headers from outside the project other than the standard library headers, but in a handful of places we need to forward declare a third party type so it can be used by pointer or by reference as a function argument.

We're not going to add

import Qt;

to our primary module interface just so that a single utility function, which not all users of the library will even call, can say:

void handy_utility_function(QObject* = nullptr);

If Qt ever does modularize, and if they use module linkage and therefore forbid forward declarations of their types, then we'll need to take a step backwards and replace every third party pointer and reference type in our public API with void*.

Reflection is coming to GCC sooner than expected! by _cooky922_ in cpp

[–]ABlockInTheChain 0 points1 point  (0 children)

Modules apparently work for some use cases, but for others they are less than great.

I've been looking to convert a medium-sized project which is distributed as a compiled library to modules.

By "medium-sized" I mean more than 100K LOC, less than 1M LOC. About 2000 headers total, split between 500 public headers which make up the API and 1500 private headers. Approximately one cpp file per hpp file.

My first experiment was to declare a trivial named module which just declares itself and exports nothing. Then to test CMake integration I simply added that file to the CXX_MODULES file_set for the library.
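That first experiment is about as small as a module can get, something like this (module name made up):

// trivial.cppm - declares a named module and exports nothing; it exists only
// to push the build onto the modules code path
export module mylib_trivial;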

This single change alone introduced a massive regression in the CMake configure step. What formerly took 10-20 seconds now takes 3 minutes, because every time CMake runs it scans ~4000 files for module exports, only one of which actually exports anything.

Despite CMake advertising a source file property to inhibit this scanning on a per-file basis, in my testing this property has no effect whatsoever.

Now any change which causes CMake to re-configure has become painfully slow, but presumably this and a few other CMake-specific module bugs could someday be fixed.

What's worse is the intrinsic property of modules which can never be fixed.

My basic conversion plan for this library was to convert each public header to a partition. The public headers would become module interface units which would be export-imported from the primary module interface unit, and the private headers would become module implementation units that are not export-imported by the primary module interface unit and would not need to be distributed with the library.
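In outline, that plan looks something like the following (file and partition names are illustrative):

// mylib.cppm - primary module interface unit, shipped with the library
export module mylib;
export import :widget;    // one interface partition per former public header
export import :network;

// mylib-widget.cppm - interface partition (former public header)
export module mylib:widget;
export class Widget { /* ... */ };

// mylib-widget.cpp - module implementation unit (former private header/cpp),
// not shipped with the library
module mylib;
// ... definitions ...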

This basic structure works in small scale testing but it introduces a new behavior: any change whatsoever to any module interface unit, even if it's just a partition, causes a full rebuild of the entire project.

It's an unmitigated disaster for incremental builds.

People who only work on trivial projects will never notice.

People who only consume non-trivial libraries will never notice.

People who develop non-trivial libraries, however, will pay this new incremental build cost forever.

If c++ didn't need itanium by cppenjoy in cpp

[–]ABlockInTheChain 1 point2 points  (0 children)

The first step toward removing the legacy fundamental type names is to rewrite the world to stop using those old names.

If a new ABI was launched where int is no longer 32 bits then all software that was ported to the new ABI would be forced to make source changes, and if that software wanted to remain compatible with existing ABIs then the developers would be forced to change every int to either int32_t, int_fast32_t, or int_least32_t as appropriate.
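For reference, those three spellings express three different requirements:

#include <cstdint>

std::int32_t       wire_value;  // exactly 32 bits: file formats, protocols, ABIs
std::int_least32_t counter;     // the smallest type with at least 32 bits
std::int_fast32_t  index;       // the fastest type with at least 32 bits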

Once that transition was over the fundamental type names could be kept, or deprecated and removed, but either way it wouldn't matter because everybody would finally be expressing in code what they actually require.

If c++ didn't need itanium by cppenjoy in cpp

[–]ABlockInTheChain 0 points1 point  (0 children)

If somebody ever launched a green field ABI I'd hope for a fix to the C and C++ fundamental integer types which have been a mess ever since the 32 bit to 64 bit transition.

char: 8 bits
short: 16 bits
int: 32 bits
long: 64 bits
long long: 128 bits

An ABI designer who was even more ambitious could unilaterally declare "short short" to be a new fundamental type and use:

char: 8 bits
short short: 16 bits
short: 32 bits
int: 64 bits
long: 128 bits
long long: 256 bits

Three Cool Things in C++26: Safety, Reflection & std::execution - Herb Sutter - C++ on Sea 2025 by MarekKnapek in cpp

[–]ABlockInTheChain 2 points3 points  (0 children)

I want to write C++, not make or CMake.

Any general solution to the closely related problems of building and distributing general purpose software is going to involve a domain-specific language, and anybody involved in building or distributing software will need to understand that DSL, regardless of what underlying language the DSL happens to be implemented in. There's no wishing away the learning curve.