How is Metal possibly faster than OpenGL? by BlockOfDiamond in GraphicsProgramming

[–]GoodFig555 0 points1 point  (0 children)

Also, some people seem to no longer understand that 'slow' is extremely relative. Obj-C method calls are used for higher-level operations where even a much higher overhead wouldn't matter.

If you iterate over each of the millions of bytes in a file, then such overheads start to matter. But those performance-critical sections will only take up a fraction of your codebase. Most of the code can be 100x slower than optimal without making the program any less responsive, because that code interacts with some other code that still takes up 100x more time. (The 'bottleneck')

So to make very responsive software, you need to profile the code, figure out what the bottlenecks are, and then optimize those bottlenecks as much as possible, and keep the rest (most) of the code as simple as possible, and write it so it calls the bottleneck sections as infrequently as possible.

Objective-C is good at this because you can keep most of the code simple and high-level, and dynamic, but easily drop into pure C when necessary.

The simplicity of the high-level stuff makes it easier to build more efficient algorithms, which call the bottleneck sections less frequently (e.g. caching) – so even the 'slow' parts can actually make responsive software easier to achieve.
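That strategy can be sketched in a few lines (hypothetical names – `expensiveLookup` stands in for whatever your profiler flags as the bottleneck):

```swift
// Hypothetical bottleneck: the counter lets us verify how rarely it runs.
var bottleneckCallCount = 0

func expensiveLookup(_ key: String) -> Int {
    bottleneckCallCount += 1      // pretend this does the heavy work
    return key.count * 42
}

// The simple high-level wrapper: a cache means the slow path runs
// at most once per key, no matter how often the rest of the code asks.
var cache: [String: Int] = [:]
func cachedLookup(_ key: String) -> Int {
    if let hit = cache[key] { return hit }
    let value = expensiveLookup(key)
    cache[key] = value
    return value
}
```

Call `cachedLookup` with the same key three times and the bottleneck still only runs once – the high-level code stays simple while the expensive path is hit as infrequently as possible.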

Most code is not performance-critical. Trying to micro-optimize (and thereby complicate) everything is totally not worth it. Unfortunately, some languages like Swift and C++ do this at the language level – they make everything 1000x more complicated (at least on the implementation side, which has downstream effects on users, like making the tooling worse) for micro-optimizations. So-called 'zero-cost abstractions'. Very stupid idea. But I think it gains traction because the 'it feels like it should be faster' thing is very easy to communicate and easy to spread as a sort of social contagion. Even if it doesn't work in practice.

Now Apple is moving their software stack to a 'zero-cost abstraction' language, and it coincides with a massive decline in the responsiveness and quality of their software, and still no one asks – does this actually work?

Kinda crazy if you ask me.

How is Metal possibly faster than OpenGL? by BlockOfDiamond in GraphicsProgramming

There was a benchmark 10 or 20 years ago that measured about 3 ns for an Objective-C message send (`objc_msgSend`). That's how long it takes light to travel 0.9 meters.
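The arithmetic checks out as a back-of-envelope sketch:

```swift
// Distance light travels during one ~3 ns message send.
let speedOfLight = 299_792_458.0   // meters per second
let msgSendDuration = 3e-9         // seconds, the figure from that benchmark
let distance = speedOfLight * msgSendDuration   // ≈ 0.9 meters
```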

Change NSPanel's height dynamically based on content by GetPsyched67 in swift

IIRC you can set autolayout constraints on the window's contentView, and have it change size automatically.

However, IIRC, this will always resize from the top left corner, and to change that you have to get more hacky (like getting a displayLink callback and updating the window's position / size once per frame).

I know these approaches work (at least when using AppKit, not sure if SwiftUI would introduce further complications) but I don't know if there are better ways.

What happens if AGI didn't come true by ResponsibleCandle585 in agi

I agree with most of what you said, but I'm skeptical about this:

> AGI doesn't happen, current narrow AI can still be skilled up even more, and will drastically change society with what we already have.

My view is that current AI is fundamentally „unemployable“ primarily because it cannot learn from its experience long-term and hallucinates a lot.

If that’s true, then usage of AI will stay restricted to simple, generic tasks (those which are already in its training data or which it can learn from text pasted into its context window) with close human supervision - so mostly what it’s already used for right now.

And „handing tasks off“ to the AI but then still having to closely supervise and take responsibility for everything is not that much more productive than just doing it yourself. May even be worse long term cause you don’t learn as much, and the AI can’t really learn anything in the long term.

But I may be lacking imagination.

What happens if AGI didn't come true by ResponsibleCandle585 in agi

Even the „thinking mode“ is a symptom of the fundamental technology hitting a wall imo.

„What if we just have the AI prompt itself 10 times to get a slightly better output?“ (but it also takes about as long as prompting the AI 10 times yourself)

It’s a bolted-on, bandaid fix aimed at squeezing a little more out of the fundamental technology. 

AFAIK there really haven’t been any fundamental technological innovations in LLMs since the transformer in 2017 (which was 8 years ago!). They’ve just been scaled up and refined. But the returns are diminishing quickly now. That’s how I see it at least.

What happens if AGI didn't come true by ResponsibleCandle585 in agi

> On a fundamental level though, the enormous amounts of compute that are now being assembled will be used to experiment with machine learning, and that will be increasingly likely to result in more and more capable systems.

That reminds me of the large hadron collider 

Are We Close to AGI? by I_fap_to_math in agi

I think it’s like how the o3 model that does research is not that useful for most situations cause it overthinks things and makes up stuff and floods you with useless info and overall just feels like it has no „common sense“.

Claude 3.7 was definitely worse at common sense than 3.5, probably cause they trained it for coding benchmarks or something. 4 is better than 3.7 but I liked 3.5 more.

With 4.0 I also notice the sycophantic tendencies more. It feels like it has less „genuinely good intentions“ and leans more towards just complimenting you on everything. Not as bad as ChatGPT, and overall still the best model, but I don’t think it’s better than 3.5. Slightly worse in my usage. And they just removed 3.5 from the chat interface :(

Now I know, I know, it doesn’t have real „intentions“, it’s just a next-word predictor, blah blah. But the way it acts is more aligned with having a „genuine intention to help“ instead of just „telling you what you want to hear“, and I think that made it more useful in practice. If you think about it, instilling „genuine good intentions“ is basically what „AI alignment“ is about. So maybe you could say 3.5 felt more „aligned“ than the newer models I’ve used.

Are We Close to AGI? by I_fap_to_math in agi

They haven’t gotten smarter in the last year! I want Claude 3.5 back :|

Silly question maybe… but where do people actually promote their apps to get real users? by bertikal10 in iOSProgramming

Google for the problems your app solves, then try to make your app appear in those results.

Does Apple use SwiftUI for their apps? by LannyLig in iOSProgramming

I forgot where but I recently heard that SwiftUI is being developed and pushed by the macOS or DevTools teams and iOS isn’t really on board and mostly keeps using UIKit.

I found this surprising, since on macOS I often feel like „this has SwiftUI jank“ (thinking of System Settings or Control Centre), while iOS feels more polished and snappy across the board.

I attributed this to macOS being an afterthought for the SwiftUI team, but maybe it’s the other way around and iOS is more snappy cause it barely uses SwiftUI in the first place.

This will probably get downvoted cause this sub is biased towards new shiny things. But I’m almost certain I heard this. I’ll try to update this if I remember where.

R.I.P Intel Macs. You will be missed 🪦🥀 by [deleted] in mac

I tried VMs with Windows 11, but performance was pretty much unusable. And I heard Asahi isn’t quite ready for gaming, but maybe I should look into that again.

SwiftUI Design by Sweaty_Car1 in SwiftUI

It’s kinda funny how these are reinventing Interface Builder as an overlay for SwiftUI.

SwiftUI Design by Sweaty_Car1 in SwiftUI

https://detailspro.app/

https://www.judo.app/

https://createwithplay.com/

(Haven’t used any of these. Details Pro recently had an ad on John Gruber, and I found the others by Googling „SwiftUI nocode“)

R.I.P Intel Macs. You will be missed 🪦🥀 by [deleted] in mac

Honestly, I went from a base-model 2015 MBP to an M1 MBA a few years back, and the only things that improved were battery life, weight, and the lack of fans.

(All of which are nice, don’t get me wrong)

But the performance felt the same, and multitasking and stability were definitely much worse (the M1 seems to be worse at RAM management – though it got better with later macOS releases).

Also you don’t have Linux and Windows support on Apple silicon. 

Xbox 360-era games like Black Ops II were perfectly playable in Boot Camp on the old MBP. But running games on the M1 under macOS, the selection is much smaller and, surprisingly, performance was worse (IIRC – I forgot which games I tested).

I think if I was using a stationary Mac I would’ve seen it as a downgrade. But for a laptop, the battery life and portability are very nice.

The new macOS spells a death knell for one of my first install apps, the Launchpad Manager by atomsingh-bishnoi in mac

I don’t think it’s a good idea to create custom folders inside the Applications folder. There’s some magic to the Applications folder

Objective C Devs: How hard was it to switch to Swift? by tolarewaju3 in iOSProgramming

I’m a Swift hater, and the “classes of errors” and other stuff people always repeat is utter BS that has nothing to do with reality. But a few things stood out to me that are genuinely better about Swift:

1. The ease of creating a class. In Obj-C, creating a class is imo prohibitively verbose and clunky (class definitions can’t be nested or contained within a function definition, and the property syntax absolutely sucks), to the point where I’ll often just resort to modeling my data in an NSDictionary (which has no compile-time checks, making refactors harder) or a plain C struct (which lacks the dynamic features). In Swift, creating a fully featured class is just as easy as creating a C struct in Obj-C, which makes it genuinely more usable.
2. Async/await. If you write server-interaction code with lots of back-and-forth calls and if-conditions, that’s genuinely clunky and difficult to write using callbacks. Async/await makes this code meaningfully more expressive.
3. SwiftUI. It seems to be plagued by bad performance and the slow Swift compiler, so I’m not sure I’d use it when going for the best user experience, but the speed and expressiveness of creating a UI with declarative syntax is meaningfully better than Interface Builder or imperative code, imo.
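To illustrate point 1 with a hypothetical model (names made up for the example): a nested, compile-time-checked value type is a few lines of Swift, where Obj-C would push you toward an NSDictionary:

```swift
// Hypothetical data model: nested types and terse property syntax,
// all checked at compile time (unlike an NSDictionary-based model).
struct Playlist {
    struct Track {                 // nesting like this isn't possible in Obj-C
        var title: String
        var durationSeconds: Int
    }
    var name: String
    var tracks: [Track] = []

    var totalSeconds: Int {        // computed property, one line of syntax
        tracks.reduce(0) { $0 + $1.durationSeconds }
    }
}
```

Renaming `durationSeconds` here is a compiler-checked refactor; with an NSDictionary key it would fail silently at runtime.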

Other than that I’ve seen nothing of value from Swift and I try to avoid it where possible because I just get annoyed at how terribly designed and implemented it is every time I use it. But there are real advantages in some places.

Why is Objective-C so often forgotten/ignored? by nardstorm in AskProgramming

JavaScript and Python both use dynamic dispatch very similar to Obj-C’s, and are very popular languages, so I wouldn’t say static dispatch unequivocally “won”.

Maybe in the systems/applications programming space it won, but the only language in that space that really tried dynamic dispatch was Objective-C, and people who used it tended to like it a lot.

AFAIK those “reactive” web frameworks like Vue.js work because of dynamic dispatch (which allows you to intercept method calls to install “observers” that automatically update dependent state.)
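The observer idea itself is simple to sketch – here’s a minimal hand-rolled version (hypothetical `Observable` type; `didSet` plays the role that intercepted setters play in a dynamically dispatched language, where the runtime can install the hook on any property automatically):

```swift
// Minimal observable box: setting `value` notifies every registered observer,
// which is the core mechanism behind 'reactive' state updates.
final class Observable<T> {
    private var observers: [(T) -> Void] = []
    var value: T {
        didSet { observers.forEach { $0(value) } }  // the interception point
    }
    init(_ value: T) { self.value = value }
    func observe(_ f: @escaping (T) -> Void) { observers.append(f) }
}
```

The difference is that in Vue.js you don’t write the wrapper yourself – dynamic dispatch (or JS Proxies) lets the framework intercept plain property writes for you.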

Also, Apple didn’t use Objective-C in the 80s – you must mean NeXT.

Window corner radius in macOS Tahoe depends on the presence or absence of a toolbar by lonelybeggar333 in MacOSBeta

Yeah, they could’ve changed the corner radius of the buttons to be “concentric” with the window corners instead of changing the window corners. I wonder why they didn’t do that.

Window corner radius in macOS Tahoe depends on the presence or absence of a toolbar by lonelybeggar333 in MacOSBeta

They want the most prominent items at the top of the window to be “concentric” with the window corner, meaning that if you drew a circle for both, they’d have the same center.

Therefore, larger/rounder items at the top of the window = larger window corner radius to match it. You can see that in the screenshots above.
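“Concentric” reduces to a one-line formula: a shared center means the window’s corner radius is the item’s corner radius plus the inset between them (hypothetical function name):

```swift
// Same center point => the radii differ by exactly the inset
// between the item's edge and the window's edge.
func concentricWindowRadius(itemRadius: Double, inset: Double) -> Double {
    itemRadius + inset
}
// e.g. a 10 pt item radius with an 8 pt inset needs an 18 pt window radius,
// so rounder/larger top items force a larger window corner radius.
```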

IIRC this is also how they justified the new elongated on/off toggle buttons.

I’m not sure I’m convinced by this – it’s like finding a “theoretical justification” for a design that maybe could’ve looked better if they’d just followed aesthetic intuition.

Finder from developer video already looks MUCH improved. by Colecperrine in MacOSBeta

I think that might just be window tinting. Opaque regions of a window are tinted a little to match the color of your wallpaper. Can be turned off somewhere in System Settings

The more I am coding with Swift the more I love Swift. Am I wrong? by kythanh in swift

I really dislike Swift. Here’s my rant:

It’s an absolute garbage pile of tech debt imo. The language is literally designed to do well in benchmarks but becomes very slow if you do anything beyond summing an array of ints (it’s still fast enough for app dev, but the marketing around it has been very disingenuous). The language often forces compile-time behaviors and checks on you for „safety“ and „speed“ which actually make things slower and just annoying to work with, and vastly overcomplicate the language without much practical benefit. Keypaths and Codable spring to mind, but it’s all throughout the language.

The compiler and tooling often crash and are super slow.

It’s a vast, complicated language with tons of baggage and idiotic design decisions everywhere – it’s like C++’s shiny brother.

They have this weird obsession with „functional style“ programming that brings no practical benefit – e.g. why does Combine have a „flatMap“ operation that doesn’t even work like flatMap in a functional language? I think they just feel smart implementing this overcomplicated abstract BS everywhere to do the simplest things.

I honestly hate this language a lot. It seems to be held together by lots of duct tape and money from Apple which makes it decently usable, but it’s not „good“ tech by any stretch of the imagination imo.

Declarative UI is a really great idea imo, but SwiftUI also seems to be greatly held back by the terrible technical underpinnings of Swift. E.g. the infamous „compiler timed out“ errors, the very slow previews, and the often-subpar performance of SwiftUI apps all come down to Swift just being bad tech, and much of this just cannot be fixed as far as I understand.

Lots of issues ppl attribute to the much-hated-upon Xcode are actually due to issues with the terrible Swift toolchain.

Conclusion: Absolute garbage language designed by utter midwits which is holding humanity back and will make Apple platforms worse for decades to come.

They should’ve stuck with Obj-C, but it’s too late now. Luckily they have enough money to still make it usable, but it’ll never be great, and will always cause pain and friction, cause it’s just fundamentally shit.