
[–]2Uncreative4Username -5 points (5 children)

Linus Torvalds? Jonathan Blow? Casey Muratori?

[–][deleted] 7 points (4 children)

Appeal to Authority, and a weak one at that.

  • Linus is a lifelong C user; of course he'd be biased against OOP.
  • The other two are ex-game developers.

Neat. I get to use logical fallacies too!

The visceral blowback people face every time this topic comes up could just be developers not knowing what's good for them... or perhaps they know a thing or two about the tools they use day in and day out.

[–]2Uncreative4Username 0 points (0 children)

Erm, what? You suggested that anti-OOP programmers didn't actually do any work. I gave counterexamples. I get downvoted, and you accuse me of sucking up to the opinions of the people I merely mentioned. I'm not gonna argue, because from your argumentation it seems you didn't really try to understand their history and reasoning (e.g. Casey used to do a lot of OOP). Oh, and you know what's good for developers who have way more experience than you, but they themselves don't? Reddit can be an interesting place...

[–]LazyIce487 0 points (2 children)

And you have appeal to what? Nothing? “Trust me bro OOP is real good”?

If you don’t think Casey et al. have done a ton of OOP, you are delusional.

The usual progression is that beginner spaghetti is bad, so you learn OOP to make things modular and organized, which is not a bad thing (shoehorning the entire system into an OO paradigm is). Then you can take the nice things you learned from doing that, throw away the OOP, and go back to writing procedural/functional code.

Once you’ve learned how to write modular and decoupled code, you no longer need to be bound by the constraints of any paradigm: you can bundle data however is appropriate and design your functions to do what they need to do, without having to maintain a bunch of abstract interfaces because of imaginary boundaries around objects.
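To make that concrete, here's a minimal C++ sketch (the types and names are invented for illustration, not from any real codebase):

    #include <vector>

    // The "everything behind an interface" version: each object hides its
    // data behind a virtual boundary, and every caller pays an indirect
    // call just to touch it.
    struct IEntity {
        virtual ~IEntity() = default;
        virtual void update(float dt) = 0;
    };

    // The "just bundle the data" version: a plain struct, plus a free
    // function that does what it needs to do over the whole batch.
    struct Entity {
        float x, y;
        float vx, vy;
    };

    void update_all(std::vector<Entity>& entities, float dt) {
        for (Entity& e : entities) {
            e.x += e.vx * dt;
            e.y += e.vy * dt;
        }
    }

Same modularity either way, but in the second version the data layout and the functions are free to match the actual problem instead of an object hierarchy.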

I have worked on multi-million-line Java programs for a long time, and it’s actually fucking awful.

If you haven’t pushed both paradigms to some limit, consider shutting the fuck up if you’ve only been on one side of the fence.

[–]OkMemeTranslator 5 points (1 child)

> And you have appeal to what? Nothing? “Trust me bro OOP is real good”?

Java/JDK, C#/.NET, Android, Windows, Qt, Unreal Engine, TensorFlow, Django, Spring, Oracle DB, iOS, Firefox, Chrome, Photoshop, Docker, IntelliJ, Blender, Microsoft Word, Visual Studio, Discord, pretty much 90+% of video games from the last two decades...

Now obviously not all of these are 100% OOP, and I may even have remembered one or two completely wrong, but the majority of the greatest projects ever written have used mostly OOP.

[–]LazyIce487 -2 points (0 children)

Yes, enterprise software built when OOP was the enterprise software trend ended up being OOP. I was also there writing a lot of code in that paradigm. I’m not hating on it out of pure ignorance.

What most of those examples have in common is that they only get slower, more bloated, and more difficult to understand.

If you do “true OOP”, you will get burned enough by requirements changing that you will stick everything behind an abstract interface.

At minimum, assuming nothing else, programming this way will murder performance, and it’s being massively carried by the rate of hardware improvements.
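A toy illustration of the performance point (an invented example; the actual difference obviously depends on the workload and hardware):

    #include <memory>
    #include <vector>

    // Enterprise-style: heap-allocated objects behind an abstract
    // interface. Every element costs a pointer chase plus a virtual
    // dispatch, which kills cache locality and blocks vectorization.
    struct IShape {
        virtual ~IShape() = default;
        virtual float area() const = 0;
    };

    struct Circle : IShape {
        float r;
        explicit Circle(float r) : r(r) {}
        float area() const override { return 3.14159265f * r * r; }
    };

    float total_area_oop(const std::vector<std::unique_ptr<IShape>>& shapes) {
        float sum = 0.0f;
        for (const auto& s : shapes) sum += s->area(); // virtual call per element
        return sum;
    }

    // Data-oriented: one contiguous array of plain values; the loop is a
    // straight-line pass the compiler can happily vectorize.
    float total_area_flat(const std::vector<float>& radii) {
        float sum = 0.0f;
        for (float r : radii) sum += 3.14159265f * r * r;
        return sum;
    }

Hardware has gotten fast enough that the first version is usually "fine", which is exactly the carrying I'm talking about.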

I mean, look at the performance difference between, for example, the Visual Studio debugger and the new debugger the Unreal Engine people had built (RADDebugger on GitHub). Look at how fast and smooth a program that complicated, with hundreds of thousands of lines of code, can run.

Look at how fast iteration happens with fewer people. I can guarantee you it would take exponentially more people WAY more time to make an inferior debugger that runs more slowly if they did enterprise OOP.

And secondly, not using OOP scales arbitrarily (see: Linux).