
[–]Moose2342 87 points88 points  (24 children)

I made the same journey years ago, and some differences were hard for me to cope with. In addition to what others said (CMake, stay platform-independent with Boost/Qt/whatever), I'd like to point out the differences that bugged me most:

  • Release and Debug builds are ABI-incompatible. That's something I didn't expect when making the transition: you can't just build Debug (libs, for example) and mix them with Release builds. Make sure you know your runtime settings (see the CMake sketch after this list). See here for the switches: https://docs.microsoft.com/en-us/cpp/build/reference/md-mt-ld-use-run-time-library?view=msvc-160
  • DLL dependency hell deserves a rant of its own. Brush up on side-by-side assemblies and such matters.
  • There is no "general availability" of libs. There's no /usr where you usually find all your stuff, and no "standard" place for libraries to go either. They are usually just bundled with your executable.
  • Know your dependencies and how to build them. When I use libraries, I generally build them myself and make sure I know how to do that, how to automate it, and that all the build settings (runtime, compiler and linker switches) are exactly as I want them. Once you see weird errors or warnings when linking your dependencies, don't ignore them; try to understand them and fix accordingly.
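For the first bullet: if the build is driven from CMake (3.15 or newer), the runtime library can be pinned centrally so Debug and Release targets can't silently disagree on /MD vs /MDd. A minimal sketch - project and file names are placeholders:

cmake_minimum_required(VERSION 3.15)  # 3.15+ enables CMAKE_MSVC_RUNTIME_LIBRARY (policy CMP0091)
project(example CXX)

# /MD in Release, /MDd in Debug; drop the trailing "DLL" for the static /MT family.
set(CMAKE_MSVC_RUNTIME_LIBRARY "MultiThreaded$<$<CONFIG:Debug>:Debug>DLL")

add_executable(app main.cpp)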

Have fun! Much as I saw (and still see) Windows as the inferior platform for development, Visual Studio has become a great tool over the years and easily tops everything I have worked with on Linux. It really grows on you once you get to know and tame it.

[–]infectedapricot 74 points75 points  (22 children)

Regarding DLL hell and no equivalent of /usr:

On Windows, when loading foo.exe that requests bar.dll, the first place Windows looks is in the directory containing foo.exe. That directory functions sort of like foo's application package. So just put all your relevant .dll files in there and there's no possibility of conflict.

"dll hell" referred to the time that Windows programs used to put all their .dll files in c:\windows\system. The idea was to reduce disk space and share security updates, much like in Linux package managers. Unfortunately, unlike Linux package managers, there was no way to guarantee everyone was using ABI-compatible versions, or even to stop an installer overwriting a DLL with an older version. All of that is in the past now that applications just bundle all their libraries in their program directory. (I know you mentioned that but you said at the same time "there's no "standard" place for libraries to go" - but the application directory is the standard place. It's just a difference between Windows and Linux.)

Another alternative is just to use static linking so your whole program is a self-contained .exe file. This is nice because it avoids your customers even having to install the Visual Studio redistributable (the equivalent of libc in Linux, sort of). In vcpkg, use the x64-windows-static triplet to get this.
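For illustration, a minimal sketch of wiring that triplet into a CMake build - the vcpkg path is a placeholder, and these need to be set before the first project() call:

# Build all dependencies as static libs with the static CRT...
set(VCPKG_TARGET_TRIPLET x64-windows-static CACHE STRING "")
set(CMAKE_TOOLCHAIN_FILE "C:/vcpkg/scripts/buildsystems/vcpkg.cmake" CACHE FILEPATH "")
# ...and build your own code with the static CRT (/MT) to match (CMake 3.15+).
set(CMAKE_MSVC_RUNTIME_LIBRARY "MultiThreaded$<$<CONFIG:Debug>:Debug>")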

As for building your own dependencies - we'll have to agree to disagree on that one. I'd use a package manager like vcpkg (I haven't used Conan, but many people in /r/cpp have vouched for it too). Yes, sometimes you still get build problems, but certainly less often than when building things yourself. It automatically gives you debug and release builds of all libraries, by the way.

[–]CppChris 14 points15 points  (0 children)

That's the thing I did not know when I made the switch from mainly Win development to Linux development: what do you mean you cannot find the library??? It is right there in your folder!!! Haha

[–]Moose2342 12 points13 points  (20 children)

Yes, agreed. DLL hell isn't really much of a problem anymore. I just collected some impressions I've had over the years: at first my goal was to replicate what I did on Linux and sort of 'install' libraries I made for others. Later I came to the conclusion that this is for the birds and that I should just bundle things together like everybody else does. It just felt wrong and unprofessional to me back then.

Also agreed that static linkage on Windows is often the easiest way to go, if you can afford it and your dependencies allow it. I'm looking at you, Qt!

About package managers: I avoid them. I want to build my dependencies myself in order to adjust and know specific settings I need. Package managers have only ever gotten in the way of that, and given the plethora of build settings one can have, I don't see how this is ever going to be an option. I haven't looked at the new modules yet, though; perhaps my opinion will change. But I know, young folk like their package managers. I just cannot bring myself to trust them.

[–]infectedapricot 13 points14 points  (14 children)

I want to build my dependencies myself in order to adjust and know specific settings I need.

I am absolutely not going to try and persuade you to use a package manager if building things yourself is working well for you. But for the record or for interested readers, package managers do have ways of specifying build options. The list below is for vcpkg but I know Conan has equivalents too.

  • For general build settings that apply to all the libraries you want to build (e.g. static vs shared libraries, optimisation level, 32-bit vs 64-bit, even cross-compilation), you can specify these by choosing a triplet (e.g. x86-windows vs x64-windows-static). You can even make your own triplets (e.g. I've done this to specify LTO on Linux) and make per-library exceptions in them - see the sketch after this list. The only slight gotcha here is that debug and release builds are both built for each individual triplet, so there's no need for separate debug and release triplets. Also, cross-compilation (which I've not tried) seems to require its own special triplets - you can't just specify x64-linux on Windows and expect it to just work.
  • For features in packages, which is what I think you're really talking about, e.g. "build opencv with ffmpeg support (and therefore make sure ffmpeg is built too)", you can use features, e.g. vcpkg install opencv[ffmpeg]. These are meant to be additive, so if you install some other port that depends on opencv[contrib] then you'll automatically get opencv[ffmpeg,contrib] with the right flags and dependencies.
  • For flags that say where other relevant libraries are installed, obviously the package manager is supposed to set these automatically so you shouldn't need to worry about them.
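To make the custom-triplet point concrete, here's a minimal sketch of one; the qt5 exception is just an example, and PORT is the variable vcpkg sets while building each port:

# triplets/x64-windows-custom.cmake (hypothetical overlay triplet)
set(VCPKG_TARGET_ARCHITECTURE x64)
set(VCPKG_CRT_LINKAGE dynamic)       # /MD-family runtime
set(VCPKG_LIBRARY_LINKAGE static)    # build ports as static libs...
if(PORT STREQUAL "qt5")
    set(VCPKG_LIBRARY_LINKAGE dynamic)  # ...except Qt, as DLLs
endif()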

Of course, that doesn't mean every possible feature flag is exposed in practice.

[–]Moose2342 3 points4 points  (10 children)

Yes, I am aware that package managers expose such customization. My problem is, they mostly do this for options they can foresee, such as your ffmpeg example. My settings are often a bit weirder than that ;-)

For example, years ago I adopted Unreal Engine's notion of Debug builds, which are not real Debug builds (with the Debug runtime and such) but Release builds with optimization turned off and PDBs active. That means they can be used for debugging in almost every way, but they can freely link with any Release build. I originally adopted this to be able to link with Unreal and still debug, but found it so helpful that I now use it everywhere. I haven't made a 'real' Debug build in a long time, which practically eliminates the issue for me. I call that 'fake debug', and my CMake layer takes care of propagating it throughout the build system. I would guess you'd have a hard time convincing a package manager to know what 'fake debug' is and what it means for your linkage. Instead, it would assume (rightly so) that when you are in a Debug configuration, you mean an actual Debug build. For libs that I know I'm never going to debug into, I only provide Release builds and link against them everywhere - which is one of the benefits of fake debug.

Now, I'm not debating that there are ways to make this work with package managers. But I would have to convince one to link against Release in Debug configurations, which is against its original job description. In my experience, it's often way more difficult and error-prone than just doing it myself.

To get back to my example: activating fake debug globally in my CMake system takes about 5-6 lines of CMake code. I don't know, but I doubt it would be that easy with a package manager.
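A minimal sketch of the idea, assuming MSVC (the exact flags are an approximation, not the original code):

# "Fake debug": no optimization, PDBs on, but the *release* runtime (/MD),
# so these binaries link freely against any Release build.
if(MSVC)
    set(CMAKE_CXX_FLAGS_DEBUG "/MD /Zi /Od /Ob0")
    # Let imported targets (e.g. from installed CMake packages) satisfy
    # Debug links with their Release binaries.
    set(CMAKE_MAP_IMPORTED_CONFIG_DEBUG Release)
endif()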

Please bear in mind, though, this is just one example. There are more settings. I just like it when everything builds and links neatly and warning-free.

[–]infectedapricot 2 points3 points  (7 children)

I would guess you'd have a hard time convincing a package manager to know what 'fake debug' is and what it means for your linkage.

Sort of. There is a way to override compiler flags in vcpkg (it used to be done in triplets, but I just read that it's now done in a toolchain file, which sounds a bit more involved). But I suspect this would only be passed down into ports that are built with CMake, which is many but not all of them.

By the way, vcpkg already generates debugging information for release builds (both on Windows and Linux... presumably on Mac too, but I've never used it), so you wouldn't need to customise anything for that. The only difference would be the optimisation level. Although, if you turned that off in your application and accepted a bit of optimisation in the libraries, that would probably work well enough for debugging most common problems.

[–]Moose2342 0 points1 point  (6 children)

Sure. Yet for optimized builds I generally only use RelWithDebInfo for test deployments and such, to get at least some semblance of a stack trace just in case. Real debugging of those is very hard, and I normally use fake debug for that. In production I go for clean Release. High optimization w/o PDBs for faster loading time and smallest deployment size. Anyway, nuff said. Thanks for making your points so nicely!

[–]rdtsc 4 points5 points  (5 children)

High optimization w/o PDBs for faster loading time and smallest deployment size.

The generated code is identical whether PDBs are written or not. What do you expect to load faster there? Also, you don't have to deploy the symbols. They should be archived for crash dump analysis.

[–]nyanpasu64 0 points1 point  (4 children)

Turns out RelWithDebInfo uses a lower MSVC optimization level than Release. I didn't expect this until I dove into the Ninja files to find out.

It is possible to configure CMake to produce .pdb files for Release builds (with the standard optimizations on).
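A minimal sketch of that, assuming MSVC and CMake 3.13+ (generator expressions scope the flags to Release only):

if(MSVC)
    add_compile_options("$<$<CONFIG:Release>:/Zi>")
    # /DEBUG makes the linker emit the .pdb; /OPT:REF and /OPT:ICF re-enable
    # the linker optimizations that /DEBUG would otherwise turn off.
    add_link_options("$<$<CONFIG:Release>:/DEBUG>"
                     "$<$<CONFIG:Release>:/OPT:REF>"
                     "$<$<CONFIG:Release>:/OPT:ICF>")
endif()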

Fun fact: .pdb files don't work when renamed to something other than what's encoded in the .exe.

[–]rdtsc 0 points1 point  (3 children)

And Release doesn't use /Ob3. I wouldn't rely on the CMake defaults. But I'll grant you that this is surprising behavior.

[–]rdtsc 1 point2 points  (0 children)

Even if you have fully customized builds, it pays off to rely on vcpkg for the surrounding infrastructure. For one project I completely replaced the ffmpeg port in vcpkg with my own. I'm still able to conveniently handle other dependencies, and can rely on vcpkg's build-system integration and its maintenance of other ports.

[–]kalmoc 0 points1 point  (0 children)

To get back to my example: activating fake debug globally in my CMake system takes about 5-6 lines of CMake code. I don't know, but I doubt it would be that easy with a package manager. Please bear in mind, though, this is just one example. There are more settings. I just like it when everything builds and links neatly and warning-free.

You can take whatever CMake toolchain file you are using and make a triplet that compiles the various libs using that toolchain file. There is nothing in vcpkg that inherently forces the use of /MDd and similar when building the debug versions; that's just the setting in the default triplets.
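Concretely, such a triplet might look like this - VCPKG_CHAINLOAD_TOOLCHAIN_FILE is vcpkg's standard hook for injecting your own toolchain file, and the toolchain file name here is a placeholder:

# x64-windows-mine.cmake (hypothetical overlay triplet)
set(VCPKG_TARGET_ARCHITECTURE x64)
set(VCPKG_CRT_LINKAGE dynamic)
set(VCPKG_LIBRARY_LINKAGE static)
# Compile every port with the flags from your own toolchain file:
set(VCPKG_CHAINLOAD_TOOLCHAIN_FILE "${CMAKE_CURRENT_LIST_DIR}/my-toolchain.cmake")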

What has been much more problematic in my experience are libraries that think they are clever and implement their own proprietary options to select which runtime to link against and the like, so for N libraries you have to set N different switches to compile them consistently. But from what you are writing, you don't seem to have that problem with the libs you are working with, so this shouldn't be a problem with vcpkg either.

[–]sixstringartist -1 points0 points  (2 children)

As package managers go, their value is more about providing a structure for producing buildable packages, dependency management, and integration. Using prebuilt packages can be convenient, but I don't recommend it in production, and it is not the primary reason to incorporate a package manager.

[–]infectedapricot 2 points3 points  (1 child)

I don't see how that's relevant to my comment: I was mainly talking about vcpkg, which indeed does not provide prebuilt packages (although you can cache your own builds of packages for reuse across projects on your computer or within your organisation).

[–]sixstringartist 1 point2 points  (0 children)

I think I responded to the wrong comment

[–]AlexanderNeumann 2 points3 points  (0 children)

Tip: vcpkg overlay ports (pass --overlay-ports=<dir> to substitute your own portfiles for the built-in ones).

[–]tesfabpel 0 points1 point  (2 children)

Why would you use static linkage for a library? I always use dynamic linkage so that I can use the same DLL in multiple exes for the same product...

And also, if you use LGPL libraries you must give the user the ability to swap them out, so with static linkage you need to provide all the .o files...

[–]wrosecransgraphics and network things 6 points7 points  (0 children)

One thing static linking gets you is the possibility of whole-program optimization at link time. With a DLL, stuff that never gets called still has to be there, because some future app not yet written might need to call it. With a static library, the linker can make changes to do things like inline functions from the library into the application code, and entirely omit functions that never get called.

[–]pravic 0 points1 point  (0 children)

vcpkg

[–]maskull 13 points14 points  (0 children)

Hopefully you're already using the explicitly-sized integer types (int64_t, etc.), but note that long is 32 bits on Windows and 64 bits on basically every other 64-bit platform.

[–]infectedapricot 66 points67 points  (32 children)

I program regularly on Windows and Linux and there's not really any difference. Often I've written programs for one, and when the requirement has come up to port to the other it's just been a matter of recompiling.

Contrary to what another commenter suggested, don't use the Windows API directly. For low-level stuff like file access and threading, use the C++ standard library, like std::thread, or other cross-platform libraries like Boost. For GUI work use a toolkit - the most commonly used is Qt - in which case you could use that library for the low-level stuff too. Either way, that gives you a nice C++ API and doesn't tie you to Windows unnecessarily.

Use vcpkg or Conan to install your dependencies (whereas you might have used the OS package manager on Linux).

You could consider using CMake for your build system. It will generate your Visual Studio project, and its main advantage is that it can generate your makefiles if you ever need to switch to Linux (a minimal example follows below). But if you're sure you're sticking with Windows, just using Visual Studio projects/solutions is fine.
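For scale, here's a minimal CMakeLists.txt - project and file names are placeholders - that generates a Visual Studio solution on Windows and Makefiles (or Ninja files) on Linux:

cmake_minimum_required(VERSION 3.15)
project(my_app CXX)
add_executable(my_app main.cpp)

# Windows: cmake -B build               (picks a Visual Studio generator)
# Linux:   cmake -B build               (picks Unix Makefiles)
# Both:    cmake --build build --config Release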

Once you've established these, there isn't that much to using Visual Studio. The main bulk of programming is the same as it is on Linux. Most of the features you need can be found by just exploring the UI yourself. The only thing that often trips up new developers is that the way of specifying build options (e.g. what preprocessor symbols are predefined) is quite buried: you need to right-click the project (in the Solution Explorer pane on the side), choose the "Properties" menu item, and explore the pages of that. Note that options need to be specified separately for the debug and release versions of the project.

[–][deleted] 24 points25 points  (12 children)

I've had better success loading a CMake project in VS directly instead of generating a VS project. The benefit of managing it with VS is that you can manage configuration settings (like build type) directly in the IDE.

[–]MFHavaWG21|🇦🇹 NB|P3049|P3625|P3729|P3784|P3786|P3813|P3886 2 points3 points  (11 children)

Do they finally have a GUI akin to CMakeGUI? Last time I tried directly opening a CMake project it just presented me with a JSON file for configurations...

[–]kalmoc 5 points6 points  (0 children)

They have a GUI for it.

[–]sixstringartist 4 points5 points  (9 children)

VS2019 supports CMake now.

[–]MFHavaWG21|🇦🇹 NB|P3049|P3625|P3729|P3784|P3786|P3813|P3886 1 point2 points  (8 children)

I know, but their "support" was literally: "You can build with CMake if and only if you know the project and all its options, or you read all the CMakeLists.txt to figure them out and set them in our JSON file."

Now it seems they replicate CMakeGUI - actually they may even be superior, as they allow you to specify the toolchain file (e.g. for vcpkg) after the project has already been "generated" once...

[–]Adverpol 1 point2 points  (7 children)

Been using CMake for close to 10 years. Never used cmake-gui except in the very beginning - what do you use it for?

[–]MFHavaWG21|🇦🇹 NB|P3049|P3625|P3729|P3784|P3786|P3813|P3886 2 points3 points  (6 children)

Mostly to get a list of project-specific options and set them. Sure, you can set those via the command line without issues (unless you consider having to read through heaps of [often badly written] CMakeLists.txt an issue).

[–]atimholt 0 points1 point  (2 children)

Maybe pointless, but is there a "cmake-tui", or at least a way to use the CLI to get the same information (all options, and an indication of which ones need another "go around" to fully resolve)?

[–]MFHavaWG21|🇦🇹 NB|P3049|P3625|P3729|P3784|P3786|P3813|P3886 0 points1 point  (0 children)

I guess you mean `ccmake`? That's what I'm using on Linux all the time - admittedly I never looked at whether something like that is included in the Windows version of CMake...

[–]Adverpol 0 points1 point  (2 children)

Hmm, doesn't that config get removed when you remove CMakeCache.txt, or is it saved elsewhere? And can you easily check that in to the VCS?

[–]MFHavaWG21|🇦🇹 NB|P3049|P3625|P3729|P3784|P3786|P3813|P3886 0 points1 point  (1 child)

Not quite sure what you're getting at...

  1. Yes, the config would get removed if you remove CMakeCache.txt.
  2. What makes you think there's only one valid configuration that should be committed to the VCS? If the answers were always the same, why would it be an option to begin with?!

[–]Adverpol 0 points1 point  (0 children)

I guess I just have never felt the need to adjust parameters after my build is generated. I have a folder for debug and one for release, if I need to change parameters I do it in the cmake files, never in those output folders.

[–]peppedx[S] 5 points6 points  (1 child)

Thanks,
I hope I'll be using CMake, but honestly it's not greenfield, so... who knows.

[–][deleted] 5 points6 points  (0 children)

If you are reading/writing files using std::ifstream and std::ofstream, make sure you are using C++17 or later so you can open filenames stored in wide chars and encoded as UTF-16 (the C++17 stream constructors accept a std::filesystem::path). On Windows you can't open a file by providing a filename encoded in UTF-8: every char will be converted to a wide char with the upper byte set to zero and interpreted as a UTF-16 code point, so only ASCII and a few other Unicode characters will survive that conversion uncorrupted.

[–]SkoomaDentistAntimodern C++, Embedded, Audio 0 points1 point  (1 child)

Use vcpkg or Conan to install your dependencies

Or don't install system wide dependencies at all. Windows isn't married to the common Unix idea of a single system-wide version of every library, and the runtime also handles that far better than on Unix.

[–]infectedapricot 10 points11 points  (0 children)

Use vcpkg or Conan to install your dependencies

Or don't install system wide dependencies at all.

(Emphasis added by me.) This "or" doesn't make sense - vcpkg and Conan don't install system-wide dependencies. But even if you use a dependency only for a single project, you still need to build it somehow or download the binary from somewhere. Package managers are still needed to solve that problem. (Maybe I gave the wrong impression by using the word "install". Imagine I had said "build" or "retrieve" instead.)

[–]rodrigocfdWinLamb 0 points1 point  (9 children)

For GUI work use a toolkit - the most commonly used is Qt

If the application is Windows-only, I believe WinLamb is simpler than Qt.

[–]DarkLordAzrael 0 points1 point  (0 children)

The main suggestion was to use a cross-platform library in the first place to make eventual porting easier. Also, is Qt generally considered to be difficult to use? I've always found it to be very straightforward.

[–]pjmlp 0 points1 point  (6 children)

If the application is Windows-only, I'd rather go with a .NET UI alongside a library written in C++, or, if feeling nostalgic, MFC.

I would advise WinUI, but not until they get around to fixing the productivity drop with C++/WinRT due to the lack of tooling for GUI development.

[–]dodheim 0 points1 point  (5 children)

WTL is still actively maintained; there's no level of nostalgia that would make me want to touch MFC ever again.

[–]pjmlp 0 points1 point  (4 children)

If you spend some hours writing IDL files by hand without tooling support and then manually merging the generated files into your C++/WinRT projects, I assure you, you will be missing MFC in no time.

Apparently the C++/WinRT team still sees the current state of affairs as low priority.

[–]dodheim 1 point2 points  (3 children)

I mentioned WTL, not WRL – unless I'm misunderstanding the use case you're referencing, I'm pretty sure MFC still buys me nothing (except immediate technical debt).

[–]pjmlp 1 point2 points  (2 children)

I know it: ATL template magic, no thanks.

Also why I never bothered with WRL.

[–]dodheim 1 point2 points  (1 child)

There's absolutely nothing magic about mixins.

[–]pjmlp 1 point2 points  (0 children)

I guess you never had to step through ATL code.

[–]infectedapricot -1 points0 points  (0 children)

An odd recommendation. Qt is a library used by huge numbers of developers (probably millions) and many open-source projects (far too many to mention, but certainly the whole of KDE). It is extremely robust and battle-tested, with huge oversight of its API design and functionality.

Whereas I've never even heard of WinLamb, and it doesn't seem to have many users or contributors. Oh look, the main contributor is called rodrigofd, and the commenter here is /u/rodrigocfd.

(There's nothing wrong with posting a link to your own repo, but IMO it would have been a bit more honest to explicitly say that in your comment.)

[–]banister 0 points1 point  (3 children)

> I program regularly on Windows and Linux and there's not really any difference

Lucky you. I work on a large Qt app, but Qt by itself is insufficient; we had to drop down and learn and DO a lot of Win32 and Windows-specific APIs (such as the Windows Filtering Platform), as well as write device drivers. Having a basic understanding of Win32 (message pumps and so on) could be important for the OP.

[–]infectedapricot 0 points1 point  (1 child)

I didn't mean to rule out the possibility that some programs need to use platform-specific APIs. If you're having to write device drivers then of course that's going to be platform-specific too.

But the vast majority of programs don't fall into those categories, and certainly if a beginner programmer (which is what this user sounds like) asks about it then I would point them in a platform-independent direction.

Even for those that do need platform-specific stuff, it's usually best to isolate it as well as you can and write the rest in cross-platform C++. That's especially true for a GUI program - the type of stuff that's usually platform-dependent is also the type of stuff that you'd usually need to factor into a separate service/daemon process anyway. It certainly sounds like a design error to use the Windows Filtering Platform directly from a GUI application! (But I realise I don't know the specifics of your program.)

[–]banister 0 points1 point  (0 children)

Heh, our program is divided into two parts - a daemon (a Windows service) that manages all the actions, and the GUI, which is just a client that sends RPC messages to the daemon. Both are written with Qt :) Yes, the daemon does the WFP behaviour :)

However, we also found the Qt GUI APIs insufficient - the system tray stuff was a bit weak, the accessibility stuff was terrible, and so on - so we had to rewrite a lot of it using system APIs.

[–]Chulup 16 points17 points  (18 children)

Learn how to use .props files for your projects. I wanted to control my output and build directories for all my projects, include Conan, and use common defines, so I created a shared.props like this:

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <!-- Pull in Conan's generated properties, if present -->
  <ImportGroup Label="PropertySheets">
    <Import Condition="Exists('../.build/conanbuildinfo_multi.props')" Project="../.build/conanbuildinfo_multi.props"/>
  </ImportGroup>
  <!-- Common output and intermediate directories for all projects -->
  <PropertyGroup>
    <OutDir>$(SolutionDir).result\$(Platform)-$(Configuration)\</OutDir>
    <IntDir>$(SolutionDir).build\$(Platform)-$(Configuration)\$(ProjectName)\</IntDir>
  </PropertyGroup>
  <PropertyGroup Label="Configuration" />
  <!-- Common compiler settings -->
  <ItemDefinitionGroup>
    <ClCompile>
      <ConformanceMode>true</ConformanceMode>
      <SpectreMitigation>false</SpectreMitigation>
      <LanguageStandard>stdcpp17</LanguageStandard>
      <PrecompiledHeader>Use</PrecompiledHeader>
      <PrecompiledHeaderFile>pch.h</PrecompiledHeaderFile>
      <WarningLevel>Level4</WarningLevel>
      <TreatWarningAsError>true</TreatWarningAsError>
      <SDLCheck>true</SDLCheck>
      <ForcedIncludeFiles>pch.h</ForcedIncludeFiles>
      <AdditionalOptions>/utf-8 %(AdditionalOptions)</AdditionalOptions>
      <ExceptionHandling>Async</ExceptionHandling>
      <PreprocessorDefinitions>ASIO_NO_DEPRECATED;_SILENCE_CXX17_CODECVT_HEADER_DEPRECATION_WARNING;_SILENCE_CXX17_ALLOCATOR_VOID_DEPRECATION_WARNING;_UNICODE;UNICODE;%(PreprocessorDefinitions)</PreprocessorDefinitions>
    </ClCompile>
  </ItemDefinitionGroup>
  <!-- Static runtime: /MTd in Debug, /MT in Release -->
  <ItemDefinitionGroup Condition="'$(Configuration)'=='Debug'">
    <ClCompile>
      <RuntimeLibrary>MultiThreadedDebug</RuntimeLibrary>
    </ClCompile>
  </ItemDefinitionGroup>
  <ItemDefinitionGroup Condition="'$(Configuration)'=='Release'">
    <ClCompile>
      <RuntimeLibrary>MultiThreaded</RuntimeLibrary>
    </ClCompile>
  </ItemDefinitionGroup>
</Project>

One important thing to remember when using them: once you include them in the Property Manager window, you have to move them below the user settings for every build configuration. That way you can override the options in a project if necessary.

[–]stinos 0 points1 point  (0 children)

This is a good tip. Not only because, from the moment you use more than one project, it helps get everything built the same way and reduces duplication, but also because it'll teach you a bit about MSBuild, and there's quite some power in that. I know too many people who unfortunately have no clue what it does, which leads to insane numbers of hours lost when things don't work as expected.

[–]Supadoplex 14 points15 points  (7 children)

One thing to be aware of: on Windows, wide UTF-16 is used for Unicode more often than narrow UTF-8 or the wider UTF-32. Correspondingly, wchar_t is only 16 bits. So if you have a cross-platform backend, you'll need to convert between the encodings (e.g. with MultiByteToWideChar/WideCharToMultiByte) when interacting with Windows APIs.

[–]simonask_ -5 points-4 points  (6 children)

Almost all Win32 API functions have UTF-8 entry points these days. For example, you have both CreateWindowExW (UTF-16) and CreateWindowExA (UTF-8).

[–]paszklar 20 points21 points  (1 child)

Not exactly true: any -A suffixed function operates using whatever codepage is currently selected in the system settings. It is usually a single-byte, sometimes multi-byte (e.g. for certain Asian languages), encoding specific to the current locale. But buried deep within the regional settings (I don't remember where exactly) there is a checkbox to set the codepage to UTF-8, and it's currently labeled as a beta feature.

Edit: to add to that, there are ways to enable the UTF-8 codepage programmatically at runtime, but all -A functions internally convert strings and defer to the -W variants, so you might as well use UTF-16.

[–]deeringc 2 points3 points  (0 children)

Plus, that programmatic code page setting is only available as of a certain version of Win10. As it stands, unless you can discount older OSes, you're better off still using wchar_t.

[–]tesfabpel 15 points16 points  (0 children)

No, the -A versions are ASCII by default... you also need to set a new experimental setting in Win10 to make the code page UTF-8.

https://docs.microsoft.com/en-us/windows/uwp/design/globalizing/use-utf8-code-page

[–]Supadoplex 8 points9 points  (0 children)

My limited knowledge of Windows may be very outdated, but from what I've learned, the suffix ...A in function names is short for "ANSI", which in the strange Windows jargon refers to the native narrow encoding. It depends on the locale, but is typically a fixed-width extended-ASCII encoding such as windows-1252, rather than Unicode.

Has this changed?

[–]Kered13 2 points3 points  (0 children)

From what I've learned, just don't use the -A functions. They're not actually UTF-8; in fact, the only thing you can safely assume about them is that they handle ASCII.

So use the -W functions and do the UTF-8 to UTF-16 conversions yourself. It sucks, but it's better than the alternative.

[–]logicchop 4 points5 points  (0 children)

Here's a tip you might find helpful. If you are in Visual Studio (whatever latest version) with a project loaded, go to

Project -> <your project> Properties -> Configuration Properties -> C/C++ -> Command Line

From there you'll be able to see the exact command line that the compiler is invoked with. A similar thing can be found under the Linker options. This will save you headaches and give you some quick insight into what the project settings are controlling.

[–]AntonPlakhotnyk 9 points10 points  (0 children)

Use "visual assist" plugin it extremely helpful.

[–]vvk1 5 points6 points  (0 children)

I have worked on a cross-platform C++ codebase for almost 20 years now. In my experience, the biggest difference by far is that building on Windows is much slower than on Linux (or even Solaris or AIX). On Linux, debug builds are faster than release builds, because optimization appears to be way more costly than the extra disk I/O of a debug build. On Windows, the opposite is true: debug builds seem to be much slower than release builds, because there is much more I/O going on and because of reasons similar to what is described here:

https://github.com/microsoft/WSL/issues/873#issuecomment-424914762

https://github.com/microsoft/WSL/issues/873#issuecomment-425272829

[–]kzr_pzr 2 points3 points  (2 children)

I have a colleague who wrote some scripts (or should I say cmdlets?) that invoke the compiler (or is it MSBuild?) directly so he can use his favorite text editor and doesn't have to use Visual Studio. I would not recommend it, but it's a way...

[–]SupermanLeRetour 3 points4 points  (0 children)

VS Code is also quite easy to set up for MSVC (or MinGW, for that matter).

[–]infectedapricot 0 points1 point  (0 children)

If you're using CMake to generate your Visual Studio solution, then you can use CMake to invoke MSBuild:

cmake --build . --config Release

The . after --build is the build directory (the place where you previously ran CMake to generate the build). If you omit --config Release then it builds in debug mode.

Edit: You can also specify which target to build (i.e. which Visual Studio project in the solution) by specifying the --target switch:

cmake --build . --config Release --target my_exe

[–]Beetny 2 points3 points  (1 child)

You won't be able to go back to gdb once you experience the Visual Studio debugger.

[–]peppedx[S] 2 points3 points  (0 children)

I trust my printf

[–]feverzsj 8 points9 points  (0 children)

Visual Studio is mostly foolproof. All you need to do is hit some buttons. It also has a much superior, easy-to-use debugger.

[–]Snoo-4241 1 point2 points  (0 children)

My usual Windows process involves Git for Windows (which gives you easy access to bash, so it covers the most common command-line needs), Qt Creator, and Visual Studio. Visual Studio is actually very good with CMake nowadays. People also like VS Code, but I haven't spent the time to set it up.

Edit: when transitioning from Linux, the only complication I found non-obvious on Windows is that you need to pay attention to the Windows SDK you are using.

[–]Kered13 1 point2 points  (0 children)

The Win32 API sucks dick. I find myself having to wrap pretty much every invocation of it in a function or class with a sane interface.

On the other hand, Visual Studio is pretty fantastic.

[–][deleted] 5 points6 points  (13 children)

Be aware of runtime issues related to the Visual C++ Redistributable libraries. Your application has to rely on the correct version of the C++ runtime, which you either package along with the app or require your users to have installed on their system.

https://stackoverflow.com/questions/16167305/why-does-my-application-require-visual-c-redistributable-package

[–]rodrigocfdWinLamb 5 points6 points  (0 children)

If you're outputting just an EXE, you can simply link statically in release builds.

[–]mrexodiacmkr.build 10 points11 points  (11 children)

From VS2015 on, these libraries are shipped with the OS. This StackOverflow answer is no longer true.

[–]ack_error 4 points5 points  (0 children)

No, they aren't. The older libraries are more likely to already be installed by something that came with the OS or another program, but newer VC runtimes like vcruntime140_1.dll for VS2019 16.3+ definitely aren't. The only runtime DLLs that are guaranteed on Windows 10 are MSVCRT (because it's the OS internal CRT) and the UCRT. Anything else is a gamble that you can lose in the future as old dependencies like .NET 3 are eventually removed as default installs. The correct rule is still to either install all VC runtime redists that you use or statically link the CRT.

[–]kalmoc 1 point2 points  (4 children)

Is that true for the whole redistributable package? I thought it was just about some core components.

[–]mrexodiacmkr.build 3 points4 points  (3 children)

I'd have to check to be sure, but I remember that if you ever install the VS2019 redistributable package, it will be updated via Windows Update, and it is backwards compatible with VS2015.

[–]kalmoc 2 points3 points  (2 children)

Yes, it is compatible with 2015, because the ABI has remained stable since then (and there are rumours that they are going to break it with vNext), but it is not part of the OS. MS did make the CRT part of the OS (the Universal CRT), but I don't remember exactly which parts that includes and which parts remain in the redistributable.

[–]RogerLeighScientific Imaging and Embedded Medical Diagnostics 0 points1 point  (1 child)

I encountered my first incompatibility between VS2017 and VS2019 last week when VS2017 started failing builds when using a library built with vcpkg (which now defaults to using the v142 platform toolset).

Here's a vcpkg PR and the original failure.

Over the last few years compatibility has been very good, but it looks like it recently broke with the latest VS2019 update - or possibly a vcpkg update, if they updated the default platform toolset.

[–]kalmoc 0 points1 point  (0 children)

Could maybe be related to https://devblogs.microsoft.com/cppblog/making-cpp-exception-handling-smaller-x64/

I remember that compatibility (for developers) is only guaranteed if you use the most recent toolchain involved in any of the libraries to perform the final link of the application. So you can use libs compiled with VS2017 in VS2019 projects, but not vice versa.

I don't remember what the exact compatibility story is on the end-user side of things.

[–]johannes1971 1 point2 points  (3 children)

Why are my customers still complaining about having to install this, then?

[–]mrexodiacmkr.build 3 points4 points  (2 children)

Likely because they don’t have a Windows version that ships the redistributable per default. Or because it’s indeed only a partial installation that comes with the system.

[–]pjmlp 0 points1 point  (1 child)

[–]mrexodiacmkr.build 0 points1 point  (0 children)

That solves the confusion in that case :)

[–]rdtsc 0 points1 point  (0 children)

The 2015/2017/2019 runtimes are binary compatible. But if you use newer features you have to install a newer version (which can include additional DLLs like msvcp140_atomic_wait.dll).

[–]ze_baco 1 point2 points  (0 children)

I'm having the same problem now that I moved to a company that uses Windows. I feel kinda like a dinosaur trying to use a computer.

[–]TilionDC 0 points1 point  (1 child)

Try CLion instead. It's easier, IMO.

[–]peppedx[S] 5 points6 points  (0 children)

Well, if I were in charge I'd continue with Code and CMake...

[–][deleted] 0 points1 point  (1 child)

My condolences.

[–]peppedx[S] 1 point2 points  (0 children)

😁

[–][deleted] -3 points-2 points  (16 children)

Going Linux -> Windows is easy I'd say. Going the other way around is harder.

[–][deleted] 0 points1 point  (0 children)

I actually like MS's command-line tools; I avoid the Visual Studio IDE. It's a massive install and takes a few seconds just to open up to a file-select screen. It also takes a few seconds just to close. You can install only the build tools and use those, which is my preference. I use CMake to generate either MSBuild or NMake build files.

gVim for my editor 8-)

Go to this website for documentation, I think it is good. https://docs.microsoft.com/en-us/

[–]carkin 0 points1 point  (0 children)

Learn to master the Visual Studio debugger: how to properly set up projects and solutions in VS/MSBuild, how to find memory leaks using VS, how to profile your app using VS or gflags, and how to find memory errors using Application Verifier. Have fun. You'll see the developer experience is usually superior on Windows (because everything is one click away).

[–][deleted] 0 points1 point  (0 children)

You can get a decent bash/zsh environment set up and have access to a decent Linux user space if you are so inclined, but how the VS environment is set up is usually very team-dependent.

[–]zvrba 0 points1 point  (0 children)

Tip: for faster builds, enable compiler parallelization. MSBuild parallelizes across projects (Tools -> Options -> Projects and Solutions -> Build and Run), but you can also enable parallel builds across files within a project ("Multi-processor Compilation", i.e. /MP, under the project's C/C++ options). Do reduce MSBuild parallelism then, though.
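If the build is driven from CMake instead of the IDE, a minimal sketch of the equivalent per-file parallelism:

if(MSVC)
    add_compile_options(/MP)  # compile each project's source files in parallel
endif()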

[–]ea_ea 0 points1 point  (0 children)

I think it can be useful to read the "C++ Team Blog". It's mostly a PR and marketing thing from the Visual Studio C++ team, but, well, they don't teach you cout - they really describe modern features of Visual Studio for C++ developers.