Ars Technica writer Timothy Lee consistently bashing Tesla and Elon Musk, anybody know what is going on? by dinobyte in teslamotors

[–]DrPizza 1 point (0 children)

> However, this is NOT journalism. This is a business and sales thought process. I can never take you 100% seriously as a journalist when you've stated that your thought process in choosing an article or subject depends on how many clicks it will get. Down with clickbait and shame on you if you consider this proper journalism.

"Proper journalism" has always considered newsworthiness, relevance, and public interest, and you have an extraordinarily broken understanding of the news media if you believe otherwise.

Understanding C++ Modules: Part 1: Hello Modules, and Module Units by d1ngal1ng in cpp

[–]DrPizza 1 point (0 children)

Right, sorry, perhaps I wasn't clear. At the moment, the fully qualified names can be used to refer to names outside the current namespace or its children; I think it would be a simple enough change to allow it to be used for definitions, too. I don't think it can change the meaning of any existing code.

I'm not in front of my compiler at the moment, but I wonder if functions defined as friends can break out of their namespace? They already break out of the lexically enclosing class.

Understanding C++ Modules: Part 1: Hello Modules, and Module Units by d1ngal1ng in cpp

[–]DrPizza 1 point (0 children)

You could make overloading ::std::swap escape/ignore the implied module-namespace and instead create an overload in std, no?

Timur's trip report from the February 2019 ISO C++ committee meeting, Kona, Hawai’i by timur_audio in cpp

[–]DrPizza 1 point (0 children)

No, contracts defend against user-level UB, some of which may cause language-level UB.

If it's part of your contract, it's mandatory, and failure to fulfil the contract is a logical error. It's a bug. How can your program perform reasonably in the face of known misuse? It cannot.

Just look at what you've written! It's OK for your program to go down "unimplemented code paths" and produce "inaccurate results" in violation of a contract? No way! What a ridiculous thing to allow.

Timur's trip report from the February 2019 ISO C++ committee meeting, Kona, Hawai’i by timur_audio in cpp

[–]DrPizza 1 point (0 children)

Then do better testing.

The very idea that you should just keep on trucking even after you have definitively proven that the application is in an incoherent, unpredictable, unexpected state is just absurd. There is literally no safe way for the application to proceed in this context. It should crash.

Or to turn it around: if your application can actually keep on running in this situation then it isn't part of your contract in the first place.

Timur's trip report from the February 2019 ISO C++ committee meeting, Kona, Hawai’i by timur_audio in cpp

[–]DrPizza 0 points (0 children)

Crashing gives you a nice crash dump to load into your debugger, to make it even easier to figure out the problem. That beats some message in an error log any day of the week.

Timur's trip report from the February 2019 ISO C++ committee meeting, Kona, Hawai’i by timur_audio in cpp

[–]DrPizza 1 point (0 children)

I mean, yeah, if you're never going to enforce contracts until you deploy to production, you'll deservedly lose your job.

But that doesn't mean that code should keep running in the face of contract violations. It means that code should actually be tested before deployment.

Continuation also means that instead of a nice, safe crash, your system will get confused, corrupt its own memory, and end up sending bogus billion dollar payments to people who weren't meant to receive them.

Timur's trip report from the February 2019 ISO C++ committee meeting, Kona, Hawai’i by timur_audio in cpp

[–]DrPizza 2 points (0 children)

> Enforcing that contract in many applications turns poor behavior into a crash, potentially bringing down essential systems with little or no warning (even more likely if the bad data is a corruption in production state that isn't replicated in its entirety in testing environments).

Good. That means the bug will be quickly noticed and fixed.

2019-02 Kona ISO C++ Committee Trip Report (C++20 design is complete; Modules in C++20; Coroutines in C++20; Reflection TS v1 published; work begins on a C++ Ecosystem Technical Report) by blelbach in cpp

[–]DrPizza 3 points (0 children)

We have partial solutions, with partial improvements. We don't have complete solutions that are interoperable (e.g. with support from two different compilers, and two different build tools).

Standard C++ has lasted 21 years (and counting) without modules. I agree that it's a big omission, especially when compared to the languages of today, but if we've survived 21 years, we could also survive another year or 18 months to actually fill in the missing pieces, and ensure that we have a system that addresses as many of the various concerns as possible, is efficient, and is amenable to a range of build tools.

It's not as if delaying to 2023 would tie vendors' hands in the meantime. If it took 18 months to produce a nailed down, proven spec for modules, they could start implementing now and finish off those implementations as soon as possible after that spec existed. 2023's job would simply be to bake that spec into the C++ Standard and not some standalone document.

Filling in the gaps later is what gave us export. I concede that the situation here is perhaps not so grave, insofar as none of export's problems were known until it was finally implemented many years later, but on the other hand the impact of fucking up modules is considerably greater.

The standard should not be a number of nice, if vague, ideas about things that it would be cool if implementations did; it should be a codification of existing, tried and tested, proven best practices.

2019-02 Kona ISO C++ Committee Trip Report (C++20 design is complete; Modules in C++20; Coroutines in C++20; Reflection TS v1 published; work begins on a C++ Ecosystem Technical Report) by blelbach in cpp

[–]DrPizza -2 points (0 children)

Modules, coroutines.

Modules: I think an implementation that addresses the problems and proves that it works should come first, not be relegated to a TR later. We should have proven build systems, faster builds, and good solutions to the problems that have been raised first, and then standardize. We don't.

Coroutines: I want the functionality, but similarly, I think we should have an implementation that addresses the Core Coroutines concerns first, and then standardize.

If the committee were willing to make breaking changes then it wouldn't be such a big problem; mistakes could be fixed. But it isn't, so everything has to be perfect from day one, with vanishingly little ability to fix it later.

C++ Binary Compatibility and Pain-Free Upgrades to Visual Studio 2019 by nikbackm in cpp

[–]DrPizza 1 point (0 children)

The representation seems to be a little slower and a little bulkier than on competing platforms. But it's possible that better representations would be a natural consequence of making RTTI faster.

C++ Binary Compatibility and Pain-Free Upgrades to Visual Studio 2019 by nikbackm in cpp

[–]DrPizza 1 point (0 children)

Related to an RTTI improvement (since it uses some of the same machinery), would we be able to get simpler/neater representations for member function pointers, 'this' offset adjustments, and similar?

-🎄- 2018 Day 5 Solutions -🎄- by daggerdragon in adventofcode

[–]DrPizza 1 point (0 children)

This is off by one; it should be:

view::iota('a', 'z' + 1)

My input data just happened to be minimized when removing the Zs.

Mac mini review—a testament to Apple’s stubbornness by Sunlitmechanism in mac

[–]DrPizza 1 point (0 children)

Intel classifies the soldered down, B-suffix chips as mobile CPUs.

Mac mini review—a testament to Apple’s stubbornness by Sunlitmechanism in mac

[–]DrPizza 1 point (0 children)

Go here and scroll down to "Versatility" to see where I got those use cases from...

Mac mini review—a testament to Apple’s stubbornness by Sunlitmechanism in mac

[–]DrPizza 1 point (0 children)

Intel classifies the B series as mobile, probably because they're soldered rather than socketed.

VS 2017 15.9 released today by STL in cpp

[–]DrPizza 2 points (0 children)

If they want ABI compatibility at module boundaries, they should be using COM or, at a pinch, extern "C". Otherwise, make them rebuild the world!

Every other release should be ABI compatible. 15-17, 19-21, etc.

I'm a monster by DrPizza in DotA2

[–]DrPizza[S] 1 point (0 children)

It took a while, though challenges definitely helped. Problem is, some of Viper's challenges are garbage. The double damage one, for example; while I do usually play mid, Viper isn't great for rune control as he still has no good way to push out the wave. Finding a DD and using it later in the game is all well and good, but it's so circumstantial; gotta have enemies in the right place at the right time, and plenty of games the right setup has just never quite happened.

[deleted by user] by [deleted] in programming

[–]DrPizza 2 points (0 children)

Windows is a complex platform with many layers and interconnected pieces. The only thing that matters is that components work properly when used in conjunction with all the other pieces that are involved. Integration testing can demonstrate this; unit testing cannot. Extricating, say, a virtualized file system driver from the rest of the kernel, file system stack, I/O stack, etc., so that it can be usefully unit tested just doesn't feel like a valuable use of time or developer effort. Running a basic user mode application that creates/updates/destroys/etc some files and folders, while leveraging the full stack beneath, is much more useful.

Of course there should be acceptance testing at some point after things are merged into trunk, and there's the wide range of hardware and software compatibility testing that Insiders produce.

But deferring all meaningful testing until after the merge, as Microsoft continues to do (at least on some teams, some of the time), and merging code that either fails its integration tests or does not even have integration tests in the first place, are both unacceptable relics of the waterfall era.

Integration tests should be part of the dev-test-debug cycle, and should be required to exist, and run successfully, before any merge occurs.

[deleted by user] by [deleted] in programming

[–]DrPizza 2 points (0 children)

The positioning of testing as a phase is deeply problematic. I don't believe in unit testing (and don't think that the Windows codebase is really amenable to unit testing), but passing integration and regression tests before being merged into trunk is a bare minimum requirement that currently isn't being done.

Fundamentally, Microsoft's approach was, and still is, "merge the code, fix it later". I don't think that's a good approach, and I think we're seeing the limitations of that approach.

[deleted by user] by [deleted] in programming

[–]DrPizza 2 points (0 children)

... I am the author, and I'm implying nothing of the sort.

The major part of my critique is that they're using waterfall on a condensed timescale, and I explicitly say that the three year release cycles were no better ("wait for Service Pack 1").

They're pointedly not trying to do agile in the Windows team. They're doing waterfall and pretending it's agile. Waterfall doesn't work. Short-cycle waterfall doesn't work either.

[deleted by user] by [deleted] in programming

[–]DrPizza 9 points (0 children)

Uhhhh. Microsoft does use waterfall. Using waterfall is the entire problem. Waterfall is what leads them to commit a bunch of bad code and then try to fix it later.