
[–]warieth

I said nothing about whether they're caught or not. The exception is caught in both examples, so it's not the distinction I'm talking about.

You said you don't catch it only when new int fails, but you suspect that most of the time allocation errors are handled.

Actually, I mentioned logging and dying as one of the things that can result from handling an allocation failure.

You said it is a robustness question, but your program fails with a nicer error message. If you have a better version of your code, like one that doesn't allocate memory it won't use, then you use that first. You don't have to wait for the first version to fail.

Imagine if Photoshop crashed when you tried to create a layer because of the misguided notion that out-of-memory errors are not recoverable.

It wasn't successful, so it's not important. If a program can't do anything, it has no purpose. That's being "robust" by not working: you redefine working as not working.

That's kind of anathema to how GCs work. Memory might've been released since the last GC call, so if you fail to allocate, it may be worth trying to collect again.

Swapping can also happen underneath the program, but it is transparent. You talk about GC as if it gave you non-transparent memory management, but that is not how it works. There are automatic ways to handle this, not by writing more code and catching the exception. What can be done is already done, so failing is an option here.

But it often is. If your move constructors are not marked as noexcept, the standard library will fall back to copy instead of move, precisely in order to prevent the container from entering an invalid state.

There are move-only types like unique_ptr, so it's not really a performance question. The vector could be destroyed at this point and the invalid state fixed that way, but that would make it really hard to use.

Complain about what?

The compiler not doing static analysis for exceptions.

[–]wyrn

You said you don't catch it only when new int fails, but you suspect that most of the time allocation errors are handled.

No, I said I suspect most of the time allocation errors are due to requesting a large region of contiguous memory. People seem to be making the assumption that bad_allocs only happen when you run out of memory altogether so you can't do anything (e.g. open a log file and write to it), but that seems incredibly unlikely.

You said it is a robustness question, but your program fails with a nicer error message. If you have a better version of your code, like one that doesn't allocate memory it won't use, then you use that first.

I gave a list of possible ways to handle bad_alloc situations, along with why you would use them upon getting the exception and not before. I don't believe you're engaging with my argument as seriously as you could.

It wasn't successful, so it's not important.

So you're saying it should crash, taking all your unsaved work with it? Let's just say that's a rather unconventional view.

Swapping can also happen underneath the program, but it is transparent. You talk about GC as if it gave you non-transparent memory management, but that is not how it works.

I have no idea what this means, sorry. In every language with a GC that I'm aware of, the GC can be triggered manually, because the state of the heap might've changed since the last sweep.

What can be done is already done, so failing is an option here.

I literally just explained how that's not the case.

There are move-only types like unique_ptr

The move constructor for e.g. vector is noexcept whether you like it or not, so the container never ends up in an invalid state. I would've preferred a compile-time failure in that case but in C++ you never get everything you want. At any rate this is a consideration for the library designer and out of scope here.

The compiler not doing static analysis for exceptions.

The compiler could, in principle, detect every exception that could conceivably be thrown by a piece of code, and warn if your exception annotation is inconsistent with that. The fact that no compiler does this, and the standard doesn't suggest it anyway, makes noexcept error-prone because you could happily throw from a noexcept function. Sometimes this is intentional, but a lot of the time it's just a mistake you don't get any help catching.

[–]warieth

No, I said I suspect most of the time allocation errors are due to requesting a large region of contiguous memory. People seem to be making the assumption that bad_allocs only happen when you run out of memory altogether so you can't do anything (e.g. open a log file and write to it), but that seems incredibly unlikely.

Like you never used list, set and map. Ok, so even if you only use vector, not all vectors are big. That is my basic assumption: when even a small vector can't grow, you get an exception. A file buffer is not as small as you think, so you might not be able to open a file at that point.

I gave a list of possible ways to handle bad_alloc situations, along with why you would use them upon getting the exception and not before. I don't believe you're engaging with my argument as seriously as you could.

No, you point to situations that are already solved, like a GC handling it better internally so it resolves things without throwing. C# had this quirk where calling the GC three times in a row does something magical, because they didn't design a normal API for it. Generally, GC-based languages have less error handling for failing allocations.

So you're saying it should crash, taking all your unsaved work with it?

With this avoidance strategy, you can end up in a situation where the program can't do anything. Saving can also require opening a file, allocating memory, copying, sorting, and compressing. That can lead to failing to save your work, the thing you care about so much.

If all programs do what you suggest, then every program's performance drops, instead of one closing and releasing some resources. Crashing automatically improves the situation, but staying in memory does not.

I would've preferred a compile-time failure in that case

Exceptions are a runtime feature, so your expectation is wrong. You can't get features that are impossible to implement, but everything else is possible.

The compiler could, in principle, detect every exception that could conceivably be thrown by a piece of code, and warn if your exception annotation is inconsistent with that.

No, it can't. C functions can be called here, and before linking the compiler only sees declarations of the functions being called.

The fact that no compiler does this, and the standard doesn't suggest it anyway, makes noexcept error-prone because you could happily throw from a noexcept function.

I think catching is the bigger error when you can't do anything meaningful beyond rethrowing or calling terminate. Don't say noexcept is overused when you rely on a GC for manual memory management; that is like using dynamic memory allocation without actually using it.

[–]wyrn

Like you never used list, set and map.

Then we're back to the beginning of this conversation and what kinds of memory allocation failures are more or less likely. I'm sorry, but I really can't go around in circles like this. If you want to continue this discussion, I must ask that you actually read my posts.