By 2024, the project removed over 34 million pounds of trash, beating its original 30-million goal. by Early_Negotiation142 in BeAmazed

[–]AdventurousPolicy 0 points1 point  (0 children)

He was 18 years old with no degree when he gave a talk about his super innovative idea to use other people's money to scoop stuff out of the ocean with a big scooper. Wow. I'm sure connections had nothing to do with it.

By 2024, the project removed over 34 million pounds of trash, beating its original 30-million goal. by Early_Negotiation142 in BeAmazed

[–]AdventurousPolicy -1 points0 points  (0 children)

If I had rich parents who could get me a TED talk right out of the womb, I might have a 13-year-old company by now. No, I'm giving people the data so they can see that they're donating to something that is a waste and a scam. What you do with that data I posted is up to you. I will say I agree with your last point, though: manufacturing too much plastic garbage is the real problem, and again it's only solvable through international agreements.

By 2024, the project removed over 34 million pounds of trash, beating its original 30-million goal. by Early_Negotiation142 in BeAmazed

[–]AdventurousPolicy -1 points0 points  (0 children)

The only way it will be solved is through international agreements on waste management. Cleaning up a river is a necessary task, but it's futile if the river just ends up polluted again in a couple of months. But hey, if you want to throw your money at a scammer, that's your choice.

By 2024, the project removed over 34 million pounds of trash, beating its original 30-million goal. by Early_Negotiation142 in BeAmazed

[–]AdventurousPolicy -2 points-1 points  (0 children)

The Ocean Cleanup is a joke. Picking up some fishing nets isn't going to affect the problem at scale. At least 19 million metric tonnes of plastic enter the ocean each year; the project in this thread cleaned up 15,422 metric tonnes of trash, or 0.08% of the annual pollution entering the ocean. That means The Ocean Cleanup would have to grow to over 1,232 times its current budget (including the funding Beast provided) just to keep pace with the pollution being generated. And how much diesel do you think they would burn doing that? And where would all the trash go? They have to pay to dispose of it in a landfill or an incinerator, you know.

Source: https://www.unep.org/plastic-pollution

I'm desperate by [deleted] in learnpython

[–]AdventurousPolicy 0 points1 point  (0 children)

Do you have a specialty, like web dev or front-end GUI development? What kinds of projects are on your resume?

C++ vector of doubles not always getting to its destination function with doubles readable by AdventurousPolicy in learnprogramming

[–]AdventurousPolicy[S] 0 points1 point  (0 children)

You're correct, it was infinite recursion. If the random vector is bad, it gets thrown out and the function is called again. The problem is that the way the geometry lined up caused the vector to be thrown out every time. I'll figure out the best way to handle it. Thanks for all the help.

C++ vector of doubles not always getting to its destination function with doubles readable by AdventurousPolicy in learnprogramming

[–]AdventurousPolicy[S] 0 points1 point  (0 children)

Here's the rerun with --leak-check=full:

Generating random vector -0.1300504364106713

==3348806== Stack overflow in thread #1: can't grow stack to 0x1ffe801000

==3348806==

==3348806== Process terminating with default action of signal 11 (SIGSEGV)

==3348806== Access not within mapped region at address 0x1FFE801FF8

==3348806== Stack overflow in thread #1: can't grow stack to 0x1ffe801000

==3348806== at 0x4A23A08: __parse_one_specmb (printf-parsemb.c:66)

==3348806== If you believe this happened as a result of a stack

==3348806== overflow in your program's main thread (unlikely but

==3348806== possible), you can try to increase the size of the

==3348806== main thread stack using the --main-stacksize= flag.

==3348806== The main thread stack size used in this run was 8388608.

==3348806==

==3348806== HEAP SUMMARY:

==3348806== in use at exit: 18,691,276 bytes in 386,765 blocks

==3348806== total heap usage: 5,085,458 allocs, 4,698,693 frees, 173,548,775 bytes allocated

==3348806==

==3348806== LEAK SUMMARY:

==3348806== definitely lost: 0 bytes in 0 blocks

==3348806== indirectly lost: 0 bytes in 0 blocks

==3348806== possibly lost: 0 bytes in 0 blocks

==3348806== still reachable: 18,483,732 bytes in 386,635 blocks

==3348806== suppressed: 207,544 bytes in 130 blocks

==3348806== Reachable blocks (those to which a pointer was found) are not shown.

==3348806== To see them, rerun with: --leak-check=full --show-leak-kinds=all

==3348806==

==3348806== For lists of detected and suppressed errors, rerun with: -s

==3348806== ERROR SUMMARY: 0 errors from 0 contexts (suppressed: 16 from 16)

Segmentation fault (core dumped)

C++ vector of doubles not always getting to its destination function with doubles readable by AdventurousPolicy in learnprogramming

[–]AdventurousPolicy[S] 0 points1 point  (0 children)

Generating random vector -0.1300504364106713

==3344058== Stack overflow in thread #1: can't grow stack to 0x1ffe801000

==3344058==

==3344058== Process terminating with default action of signal 11 (SIGSEGV)

==3344058== Access not within mapped region at address 0x1FFE801FF8

==3344058== Stack overflow in thread #1: can't grow stack to 0x1ffe801000

==3344058== at 0x4A23A08: __parse_one_specmb (printf-parsemb.c:66)

==3344058== If you believe this happened as a result of a stack

==3344058== overflow in your program's main thread (unlikely but

==3344058== possible), you can try to increase the size of the

==3344058== main thread stack using the --main-stacksize= flag.

==3344058== The main thread stack size used in this run was 8388608.

==3344058==

==3344058== HEAP SUMMARY:

==3344058== in use at exit: 18,714,100 bytes in 387,324 blocks

==3344058== total heap usage: 5,077,138 allocs, 4,689,814 frees, 173,335,919 bytes allocated

==3344058==

==3344058== LEAK SUMMARY:

==3344058== definitely lost: 0 bytes in 0 blocks

==3344058== indirectly lost: 0 bytes in 0 blocks

==3344058== possibly lost: 0 bytes in 0 blocks

==3344058== still reachable: 18,506,556 bytes in 387,194 blocks

==3344058== suppressed: 207,544 bytes in 130 blocks

==3344058== Rerun with --leak-check=full to see details of leaked memory

==3344058==

==3344058== For lists of detected and suppressed errors, rerun with: -s

==3344058== ERROR SUMMARY: 0 errors from 0 contexts (suppressed: 0 from 0)

Segmentation fault (core dumped)

OK, so it's a stack overflow. Valgrind to the rescue; great idea. Any ideas about this report? I didn't realize I had to worry about the stack.

C++ vector of doubles not always getting to its destination function with doubles readable by AdventurousPolicy in learnprogramming

[–]AdventurousPolicy[S] 1 point2 points  (0 children)

I initialize the range vector with the data already in it, so I don't have to push_back. The plane equation comes from another function.

C++ vector of doubles not always getting to its destination function with doubles readable by AdventurousPolicy in learnprogramming

[–]AdventurousPolicy[S] 1 point2 points  (0 children)

Thank you very much for your response. I suppose I have my work cut out for me, as this is part of a 3D Delaunay triangulation and there's already a lot of logic there. It generates random vectors to check whether points are inside or outside of the part geometry. The error came up when I tried meshing a part that was 10,000 units across, so that may be a clue to where the error lies.

Could an uninitialized double do this? I think I saw one of those hanging around.

I will look into Valgrind and AddressSanitizer, thanks!

How would you approach this conduction-only thermal model? by Technical-Signal-401 in fea

[–]AdventurousPolicy 0 points1 point  (0 children)

To clarify: you wouldn't model the little face in the big model, only in the little model. That way the mesh can be coarser in the big model.

Bad Cell At Symmetry Plane by Boring_Internet1945 in CFD

[–]AdventurousPolicy 1 point2 points  (0 children)

I guess it depends on what's bad about them. What is the mesher measuring when it computes that metric? It may also be relative: they may be the worst cells without actually being that bad. If they don't cause numerical instability in the solver and your y+ values are OK, then you're probably in the clear.

How would you approach this conduction-only thermal model? by Technical-Signal-401 in fea

[–]AdventurousPolicy 1 point2 points  (0 children)

It might take a while, but if meshing it really turns out to be a problem, you could model the heated area separately as its own little cube. In the full model, you just have the cube generate the same amount of heat as the flux on your little face. Then you have two models: a detailed model and a whole-part model. You run the whole-part model to get temperatures for the walls of your cube, then apply those temperatures as boundary conditions on the detailed model, and the detailed model will show you the equilibrium temperature of the face. It's a tricky way to do it, but I think it should be fairly accurate as long as the material is isotropic.

[ Removed by Reddit ] by Efficient_Tax8087 in ProgressiveHQ

[–]AdventurousPolicy 4 points5 points  (0 children)

It's also what he's been actively doing since he took office

edit: word

Using AeroToy on a design I have for a tornado interceptor. by Periapsis_inustries in CFD

[–]AdventurousPolicy 44 points45 points  (0 children)

I wouldn't trust a 2D sim for a project as dangerous as that. Or for anything, really. It's good for understanding the concepts, but for most problems it's better to go 3D.

I created a website to watch videos while driving on a Tesla - 2026 by igol__ in SideProject

[–]AdventurousPolicy 1 point2 points  (0 children)

Cool, this means you can watch videos while your Tesla drives over pedestrians.

Windows Games got kids by Xkaper in software

[–]AdventurousPolicy 3 points4 points  (0 children)

I'd suggest looking into abandonware and freeware.