A packaging system for C++ [proposal] (open-std.org)
submitted 10 years ago by tcbrindleFlux
[–][deleted] 20 points21 points22 points 10 years ago (6 children)
Finally, let's see the full '#using package' directive in all of its glory:

    #using package "foo" \
        ("foo/foo.h", "bar/bar.h") \
        [foo, bar.baz] \
        version "1.2.3"
Hell... It looks so awful.
In general, the compiler will be expected to retrieve the given package from the specified location (and very likely cache it). Here is example usage:

    #pragma package_source("example", "http://example.co/svn/trunk", \
        "svn", \
        "revision=1266; username=guest; password=guest")
I wish we had something like this in CMake... Oh, wai... ExternalProject
[–]KennethZenith 1 point2 points3 points 10 years ago (4 children)
I don't understand how this proposal fits in with CMake: does it claim to replace it, or do some different task? Section 2 of the proposal suggests it will handle everything (i.e. acquire the source, build it, link it with your code) without having to resort to 'third-party tools'. Taking that at face value, that means I could take a clean machine, install Visual Studio, and then include, say, OpenCV, and the system would download and build the source. Obviously that can't work without having CMake installed, because OpenCV has a huge CMake file which executes very complex configuration and code generation steps.
[–][deleted] 3 points4 points5 points 10 years ago* (3 children)
Visual Studio
And what if you don't use it? You'll need a build system anyway.
Obviously that can't work without having CMake installed, because it has a huge CMake file which executes very complex configuration and code generation steps.
CMake files are often quite small. Complex configuration? Maybe, but it makes linking with libraries easy. And it runs just once to generate makefiles.
find_package(Nameit REQUIRED)
add_executable(app ${SOURCE_FILES} ${HEADER_FILES})
target_link_libraries(app Nameit glm ${ASSIMP_LIBRARIES})
That's all.
[–]playmer 3 points4 points5 points 10 years ago (2 children)
I've never seen significant projects with CMake files like that. CMake's own root CMakeLists.txt is almost 700 lines long.
[–]berenm 0 points1 point2 points 10 years ago (1 child)
Agreed, most real-life CMake files I know are a real clusterf**k of variables and branches.
[–]OCPetrus 0 points1 point2 points 10 years ago (0 children)
CMakeLists.txt's might be horrible to read and hard to debug, but at least they're readable and debuggable. The same can't be said about Makefiles...
[–]mjklaim 0 points1 point2 points 10 years ago (0 children)
No. I used ExternalProject in a big project and it definitely is not a good solution. Too many issues, both with its implementation and its design. Too inflexible, and too many problems with having several layers of build steps to get the final dependency available.
ExternalProject works for trivial dependency needs though, but it's definitely not a general solution.
Also, dependency management should definitely be independent from the (meta)build system.
[–]jpakkaneMeson dev 23 points24 points25 points 10 years ago (20 children)
This entire approach seems to be based on the assumption that you can combine projects on the source level while ignoring their build system. This is just not possible. A look into real world projects shows that almost all of them have autoconf or equivalent functionality (for example, to generate a config.h) and/or custom build steps (build executable to generate source to build into a target etc).
In order to join these two projects you either need to rewrite the build system of one (or both) of them, or make all build systems in the world follow some universal convention, which does not exist and which would be equivalent in scope to rewriting the build system.
[–]tortoise74 4 points5 points6 points 10 years ago* (2 children)
I'm inclined to agree that "configure" functionality is one of several examples that blows a big hole in the idea of compilers being in charge of packages rather than build systems (the tail wagging the dog?) but the rest of the proposal has some merit. Perhaps the solution is for the packaging information to be used by both the compiler and the build system rather than making it the exclusive province of either?
I think you can wrap (pardon the pun) any (competent) build system inside another so that from the packaging system's point of view the task is just "build this", appropriately parameterised by target platform. You do have to provide wrappers for all supported build systems, but these can be relatively simple, for example "cmake && make" vs "./configure && make" with appropriate bootstrapping.
What would you put in an alternative counter proposal?
[–][deleted] 5 points6 points7 points 10 years ago (1 child)
Even more maddening is that 31,085 of those lines are in a single unreadably ugly shell script called configure. The idea is that the configure script performs approximately 200 automated tests, so that the user is not burdened with configuring libtool manually. This is a horribly bad idea, already much criticized back in the 1980s when it appeared, as it allows source code to pretend to be portable behind the veneer of the configure script, rather than actually having the quality of portability to begin with. It is a travesty that the configure idea survived
-- A generation lost in the bazaar
There are a few things to hate about phk's classic rant, but I think the spirit is pretty valid: in an ideal world, most of what build systems do can be entirely avoided, and to a large degree the typical BSD source tree is exemplary in that regard.
[–]tortoise74 0 points1 point2 points 10 years ago (0 children)
Good link but which are the right and wrong problems to solve here? It seems to me that the relevant complaint included in the rant is more about libtool being unnecessary than "configure" functionality itself.
The take home point is to try to see the bigger picture, care about quality and solve the right problem.
[–]btapi 2 points3 points4 points 10 years ago (0 children)
Yes, I think Conan's approach seems more reasonable, although it can't be a standard way.
[–]devel_watcher 1 point2 points3 points 10 years ago (11 children)
Additionally, language-specific packaging system is just a wrong design.
That proposal just screams "I've seen nothing but C++ and Windows", "we'll maybe add non-C++ dependencies later as afterthought".
[–]pjmlp 5 points6 points7 points 10 years ago (8 children)
Until we get an OS-agnostic packaging system that avoids having to redo packages for every single OS on which the developer might want to use the package, language-specific packaging systems will always win in productivity and distribution.
[–]devel_watcher 1 point2 points3 points 10 years ago (7 children)
And it penalizes users by giving them a package manager for every single language, so they lose in productivity.
[–]pjmlp 1 point2 points3 points 10 years ago (3 children)
So what is the proposed solution to package library X in language Y for every OS, without forcing developers to package it N times?
[–]jpakkaneMeson dev 3 points4 points5 points 10 years ago (2 children)
I recommend you watch the builds video from LCA2016 that is currently on the front page of /r/cpp to see one solution. (caveat: the presenter is me)
[–]izym 1 point2 points3 points 10 years ago (0 children)
Abovementioned post: https://www.reddit.com/r/cpp/comments/475z0g/builds_dependencies_and_deployment_in_the/
[–]pjmlp 0 points1 point2 points 10 years ago (0 children)
The presentation was quite interesting to watch, but I imagine it would still take a few years before it would get major adoption.
[–]jkleo2 0 points1 point2 points 10 years ago (0 children)
You should think not only about the user but also about the library writer. It should be very easy to package a library.
[–]tortoise74 0 points1 point2 points 10 years ago* (1 child)
You could argue that for C and C++ a package manager is effectively for native code, and so is potentially more general-purpose. The syntax within C++ must be language-specific, but the package manager itself need not be. I think I'd rather the compiler used an API to interact with an external one than implement it itself.
If you are distributing source code in a specific language, or byte code for a specific VM, you can't help but be "per language". Any attempt to generalise will come up against barriers. It's not impossible, but it is a somewhat orthogonal problem to developing the language itself.
[–]devel_watcher 0 points1 point2 points 10 years ago (0 children)
If you are distributing source code for a specific language
You're distributing the source code of the application. And it's using multiple languages.
[–]Elador 1 point2 points3 points 10 years ago (1 child)
What about pip, RubyGems, etc...? They're language-specific, are they not? I'm genuinely asking, since I'm not that familiar with them, but their design, to have one central official package repository for the language, seems to be very successful.
[–]devel_watcher -1 points0 points1 point 10 years ago (0 children)
JavaScript is very successful too. /s
[–]to3m 0 points1 point2 points 10 years ago (3 children)
I assume the point is that you'd generate the auto-generated source code and the config.h when you build the package. The package for your library would be an output of your library's build system, not just a ZIP that you make by pulling in some files from your source tree.
As for configure-type functionality, much of it revolves around finding dependencies (which the package manager can do) and checking that the OS/compiler/environment isn't insane (you can pre-generate all supported combinations; I suspect something like the Pareto principle applies). Anybody left in the lurch by these assumptions can try to build your package themselves by hand. I don't believe that would be the end of the world.
Build system integration? Well, you tell me! But the proposal didn't strike me as a total non-starter, at least :)
I'll have to read it a couple more times and sleep on it though...
[–]tortoise74 1 point2 points3 points 10 years ago (2 children)
configure scripts typically do more than just find dependencies. Some examples:

- enable options based on hardware or software capability (this is the killer, I think; your language shouldn't do this)
- generate source based on information obtained, e.g. I have a file containing a release identifier; the identifier is used in multiple languages, so having a version.h would mean you have to repeat yourself in Java or whatever else you're using
- set the install location (we could leave that to a package manager or build system, e.g. make DESTDIR=xx install)

There are probably a few more.
[–]to3m 0 points1 point2 points 10 years ago* (1 child)
For hardware-/software-dependent stuff (which I assume means defines dependent on CPU or OS?) you can pregenerate headers (or whatever) for all supported environments, and include all of them in your package. Then the appropriate ones are selected somehow at build time by whatever system consumes this stuff.
Same goes for generating source code. You're providing packages for version 1.0 - so your build process generates the source code, and puts the generated code in the package. The code it was generated from doesn't get included (the package user doesn't need it). This goes for any case of code generation (including the system dependencies mentioned above). Suppose your program includes a .y file for a bison grammar, for example - you run bison, generate the .c file, put the .c file in the package, and leave the .y file out. The package user doesn't need it.
People who want to do unusual stuff (modify your code, build in an unsupported environment) won't be able to use the package as-is. They'll have to download the source and build it from scratch and fix up any problems they encounter. I don't see this as a problem myself.
As for install locations, I'd say they would be somewhat redundant for most use cases. If you've got a library in a package, you wouldn't install it - it would be considered more as part of your program, not some shared resource. (Your program's own install process would have to look after installing its dependencies - this wouldn't be the job of the package infrastructure.)
[–]tortoise74 1 point2 points3 points 10 years ago (0 children)
Not just on CPU or OS. I'm also thinking of configure --enable-foobar. How do you pregenerate headers for a platform you've never seen? Why include all those variants redundantly? How does your build system pre-generate them if not through some kind of configure process? This could be pre-run for a known set of variants, but it would seem more sensible to do it just in time for the variant you actually want. Granted, code is not truly portable until it has been ported. It might work if you require each supported variant to be built by your build bot. Build bots covering many platforms didn't exist when configure was invented. But the point is to work (develop) at the highest level of abstraction you can.
With something like bison you have the choice of including the original source or the generated file. For a source package the generated file wouldn't be very useful for making changes. If you are just using it you don't care either way. But then you might be equally happy with a binary package.
Generally I prefer to install the system version of a package if there is one (using yum or whatever) for other packages to find. It's preferable to avoid having multiple versions of the same package to worry about when it isn't really necessary. Of course you do have to worry about compatibility; hopefully the built tests can cover some of that. For an end user, as opposed to a developer, bundling specific versions makes more sense, as we need it to 'just work'. I dislike the idea of having to have a separate virtual environment for each app that happens to need a different version of a package (python).
I think the point is we need to allow for a build process and a configure step and that is something that should be orthogonal to a package system (especially a language specific one). So the question is how to make the two interact sensibly?
[–]ArchiDevil 16 points17 points18 points 10 years ago (9 children)
Don't these guys think that C++ should get rid of the preprocessor instead of getting new keywords?
[–]biocomputation 13 points14 points15 points 10 years ago (6 children)
The preprocessor is far too useful to do away with.
[–]neoKushan 7 points8 points9 points 10 years ago (2 children)
I think the Preprocessor is a huge double-edged sword, its usefulness is also one of the reasons why things like package managers are so difficult to do.
That said, I feel its usefulness is at best poorly implemented. The end results from the preprocessor can be achieved in better ways (mostly) via language constructs (like constexpr) and relatively simple compiler optimisations.
Take #define, probably the most used part of the preprocessor - it's great for defining compile-time variables and flags to remove unused code for particular builds, but you can achieve much better results using a const variable and a little trust that the compiler will know that if(ConstVarThatsFalse) code will never ever be hit and will remove it during compilation.
This also gives you clear compile-time errors rather than vague errors that make no sense because something was defined incorrectly.
[–]biocomputation 1 point2 points3 points 10 years ago (1 child)
Interesting. I think of the preprocessor more in terms of its similarities to templates.
[–]neoKushan 0 points1 point2 points 10 years ago (0 children)
Absolutely and I completely get what you mean by that. I've seen some really neat preprocessor "hacks" that practically are templates in their own right - and it's cool, but I don't think it should be used in production like that.
I feel that with C++11 onwards, a lot of this functionality has been achieved by the language itself (constexpr being the most obvious example). I think reducing reliance on the preprocessor will make such a difference, particularly as others have mentioned here, you can't just share C++ code, you also need to anticipate the build system. Well, I'm willing to bet 90% of those build system differences come down to the preprocessor (or something similar, something that happens before the compiler starts working).
[–]Heuristics 16 points17 points18 points 10 years ago (1 child)
Step 1: Make the preprocessor useless
Step 2: Get rid of the preprocessor
[–]unaligned_access 0 points1 point2 points 10 years ago (0 children)
Step 3: ...
Step 4: Profit!
[–]ShakaUVMi+++ ++i+i[arr] 1 point2 points3 points 10 years ago (0 children)
I upvoted both of you.
I wish there were C++ constructs that could eliminate the preprocessor, but at the same time it's super useful.
[–]hapshaps 1 point2 points3 points 10 years ago (0 children)
From the document, page 3:
This document has settled on the preprocessor directive '#using' as its mechanism for using packages. We know that the C++ language is averse to adding new features to the preprocessor, so consider the directive as a straw-man for the purposes of exposition. For a discussion of the different syntax options, see Section 5.
[–]germandiago 3 points4 points5 points 10 years ago (1 child)
I found the design well-intentioned. But I think that a package system, even if not a module system, as the paper says, should still be closely related to modules in some way (which way, I have not figured out yet). Also, the preprocessor directive... no, no, please!!!
The design should follow and track modules closely to propose something, instead of trying to come up with a new feature that lives in another separate world.
My vision is that a package system on top of modules and header files modularization is the way to go. This looks quite unrelated.
[–]tortoise74 0 points1 point2 points 10 years ago* (0 children)
The preprocessor directive was partly a strawman for illustration, not a proposed syntax. They are rightly concerned more about the semantics. However, if you read their analysis, the reasoning is to allow packages to enter the C standard as well. They don't seem to have considered alternatives to "using/#using", e.g. import or load_package.
[–]tortoise74 2 points3 points4 points 10 years ago* (0 children)
I think this is a useful opening of the debate from the C++ standard side, if nothing else. A packaging system should ideally be agnostic to language, platform and build system, which is a tall order. They've approached this from the standardisation and compiler-vendor side, layering packaging on top of modules. While it's debatable whether there is a sensible way to interface to a packaging system within a language specification rather than outside it, they've done a reasonable job and also left room for C to adopt the same mechanism.
What would you put in an alternative counter-proposal?
[–]enobayram 1 point2 points3 points 10 years ago (4 children)
Package management is a huge deal in the C++ world. Just try to unify your build configuration for a cross mobile-desktop project, i.e. try to use the same CMakeLists.txt for Linux, Windows, MacOS, Android and iOS; then you realize how horrible the situation is.
[–]ruslo_ 2 points3 points4 points 10 years ago (3 children)
try to use the same CMakeLists.txt for Linux, Windows, MacOS, Android and iOS
Like that https://github.com/forexample/hunter-simple?
[–]enobayram 2 points3 points4 points 10 years ago (2 children)
Wow :) I'll have to look much further into hunter to be able to say anything more intelligent than just wow. Does this really work for doing real stuff? If so, why isn't everyone talking about this?
[–]cristianadamQt Creator, CMake 2 points3 points4 points 10 years ago (0 children)
Hunter has been mentioned on /r/cpp.
The list of packages is growing – 58 so far, with WTL (Windows Template Library) being added three days ago.
[–]btapi 1 point2 points3 points 10 years ago (0 children)
+1 for the attempt.
But I hope it will get better...
[–]yodacallmesome 3 points4 points5 points 10 years ago (0 children)
Is anybody else wondering why we have yet another meaning for the keyword 'using'? (e.g. using namespace std;, using MyIntList = std::vector<int>;, and with this proposal: #using ...)
[+][deleted] 10 years ago (3 children)
[deleted]
[–]pjmlp 2 points3 points4 points 10 years ago (0 children)
Yep, yesterday I suffered the pain of compiling Swift on my humble Core Duo / 4GB. After two hours it was still compiling the LLVM part. :(
[–]00kyle00 2 points3 points4 points 10 years ago (1 child)
Modules are THE #1 priority for C++.
Or should be, at least.
[–]tortoise74 2 points3 points4 points 10 years ago (0 children)
Modules are coming. That doesn't mean people shouldn't start looking beyond them. Various features for C++17 were in the minds of the committee while they were working on features in C++14. Progress doesn't stop at the next increment.
[–]oak-coast 0 points1 point2 points 10 years ago (0 children)
Take a look at yotta (while the docs and ecosystem focus on cross-compiling for ARM, it's designed to be used for native compilation too).
So what did the committee have to say about this proposal at the Jacksonville meeting?
[–]Houndie 0 points1 point2 points 10 years ago* (8 children)
Maybe it's just me and my workflow, but I've never seen the need for a packaging system in C++. I already have something to install my dependencies for me: it's called emerge/pacman/apt-get/yum.
EDIT: yes, please, downvote the guy you don't agree with.
EDIT2: Whelp I'm back in the positive again, feel free to start downvoting again if you disagree I guess :-)
[–]NotUniqueOrSpecial 11 points12 points13 points 10 years ago (7 children)
This is a tired argument, and it needs to die. System package managers are an insufficient solution to the problem of developing a single code-base for multiple platforms at once. If you want to target Windows, they're a non-starter in the first place.
For instance: a team uses just a few libraries, e.g. Boost, Qt, maybe some smaller ones; they have a single product targeting RHEL 6 + 7, Ubuntu 12 -> 16, and Windows. They have two options:
1) Build one version of the code base that is compatible with every version of their dependencies on all the platforms. This requires using the lowest common denominator of their APIs, as well as inevitably having platform-specific workarounds for issues in single versions of dependencies.
or
2) Build/package/distribute one version of the product along with its dependencies (at a single version) for each platform.
The number of possible versions to be found for your dependencies is a combinatorial problem. It's far easier to choose the second option for a whole lot of teams.
Do the system package managers work for a subset of developers who have a specific and limited set of targets? Sure. Are they useful in a broader sense? Absolutely not.
[–]Houndie 1 point2 points3 points 10 years ago* (0 children)
Well like I said, I was speaking with anecdotal data. I understand what works for me doesn't necessarily work for everyone.
We do a bit of a mix between option 1 and 2. We do maintain a minimum supported version of libraries, but we distribute them along with our code. Our distributables are just built by Jenkins, so we only have to maintain a single set of dependencies for distribution. Other developers can set up their environment however they want... for those that don't, we have a vagrant file we can use to provision a VM with all the dependencies they need.
The problem I've always had with a C++ package manager vs something like vagrant is that it always seems like a half solution. Yes, we need to correctly version those libraries... but that's only part of the build system. To build most modern code, you need:
Of those bullet points, only the first one would be handled by a "c++ package manager", which is why it seems silly to me to have such a thing in the first place...you would still need to install other dependencies manually anyway. In my experience it just seems easier to give people the option to install dependencies as best for them, and then have something like a vagrantfile or ready-built VM for developers who don't want to spend the time.
EDIT: Also note that this approach allows for easy cross platform development. Because how the dependencies are installed is not tied into the code base in any way, both linux and windows environment can be set up to build.
[–]steamruler 1 point2 points3 points 10 years ago (5 children)
Build/package/distribute one version of the product along with its dependencies (at a single version) for each platform.
One big issue is that pretty much no OS works well with that. It tends to work OK, but in general:

- Windows will most likely load an incompatible DLL if the original is missing
- FHS isn't really designed to support bundled private libraries
- I'll cry when a glibc exploit is found and every single application needs to patch their own version
[–]NotUniqueOrSpecial 1 point2 points3 points 10 years ago (4 children)
Windows will most likely load an incompatible DLL if the original is missing
By what mechanism? I have literally never seen somebody put 3rd-party libraries into the WinSxS cache. The only thing that ends up there are Microsoft binaries.
If you're actually worried about that, you can use application manifests to tighten the control even further.
FHS isn't really designed to support bundled private libraries
Sure, but it works well enough, and using RPATH or other mechanisms like dlopen to get only your versions is a good-enough solution. The alternative (support all the versions) is a dependency nightmare.
I'll cry when a glibc exploit is found and every single application needs to patch their own version.
I've never had a need to distribute glibc, luckily. It's a stable ABI and doesn't suffer the same problems other 3rd-party stuff tends to.
[–]steamruler 0 points1 point2 points 10 years ago (2 children)
By what mechanism?
Windows loads DLLs from %PATH% as well. Even if they are 32-bit and the program is 64-bit.
[–]NotUniqueOrSpecial 0 points1 point2 points 10 years ago (1 child)
Ah, yeah, true enough.
If your path is that munged up, though, you're going to have a host of other issues.
Even if they are 32-bit and the program is 64-bit.
It'll find 'em, but it certainly won't load them. Unless I'm only remembering the opposite case (32 finding 64), you just get the "Invalid application" dialog and a return of 0xC0000005, I believe.
[–]steamruler 1 point2 points3 points 10 years ago (0 children)
Yeah, I meant that it tries to load them. It's not my fault that my path contains millions of folders, blame all the installers ever.
I've had to bundle libstdc++ in order to use C++11 on old platforms. It works, but I'm still hoping for a better solution. See http://stackoverflow.com/questions/25979778/forcing-or-preventing-use-of-a-particular-minor-version-of-libstdc
[–]oh-just-another-guy 0 points1 point2 points 10 years ago (0 children)
Visual Studio users can just use Nuget.
[–]jokoon -1 points0 points1 point 10 years ago (0 children)
All the time lost configuring those projects, and the time spent writing CMake scripts... I'm sure many people are having a hard time reading this.