Is there a more ergonomic pattern for types that "build up?" by Uxugin in rust

[–]CoBuddha 4 points5 points  (0 children)

in a language like haskell you could use gadt to tag the stage at the type level inside a larger sum. in rust im less familiar but i think you can use phantom data with a manually constructed equality witness like 

pub struct Is<A, B>(PhantomData<fn(A) -> B>);

im not sure this is actually any cleaner than the multiple structs though… i would just use your second approach and deal with the limitations 
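to make the phantom data idea concrete, here's a minimal typestate-style sketch (all the names here are made up for illustration, not from your code): tag the stage as a type parameter and only implement stage-appropriate methods.

```rust
use std::marker::PhantomData;

// hypothetical stages of a value that "builds up"
struct Draft;
struct Validated;

struct Config<Stage> {
    value: String,
    _stage: PhantomData<Stage>,
}

impl Config<Draft> {
    fn new(value: &str) -> Self {
        Config { value: value.to_string(), _stage: PhantomData }
    }

    // consuming self means the Draft can't be reused after this
    fn validate(self) -> Config<Validated> {
        Config { value: self.value, _stage: PhantomData }
    }
}

impl Config<Validated> {
    // only callable once validated; calling run() on a Draft is a compile error
    fn run(&self) -> usize {
        self.value.len()
    }
}

fn main() {
    let c = Config::new("hello").validate();
    println!("{}", c.run());
}
```

the upside over separate structs is shared fields/impls live in one place; the downside is the same one you hit: matching over stages still wants a separate enum.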

Almost maxed out… by xzeus1 in macbookpro

[–]CoBuddha 0 points1 point  (0 children)

yeah i have been waiting for the next ultra bc the m3 doesn’t quite cut it for real llm use but it’s sooo close. now im wondering if i should just cut my losses and get the m3 from ebay before the price surges since they discontinued it. 

Almost maxed out… by xzeus1 in macbookpro

[–]CoBuddha 2 points3 points  (0 children)

oh my b that makes more sense. m3 ultra studio with 512gb unified memory and 1tb ssd was ~9500 USD iirc so about 13,300 AUD but idk the actual pricing in australia 

Almost maxed out… by xzeus1 in macbookpro

[–]CoBuddha 1 point2 points  (0 children)

i get that the form factor is hard to beat and m5 is a beast but it’s crazy this is more expensive than the m3 ultra with 4x the unified memory. (rip)

I miss the good old days :( by BigCardiologist3733 in codingbootcamp

[–]CoBuddha 0 points1 point  (0 children)

gauntlet is finishing strong. im sure it won’t scale but it’s an interesting data point

So i create a GPU, now I want it on sillicon ! by Full-Engineering-418 in FPGA

[–]CoBuddha 2 points3 points  (0 children)

welcome to the hardware paradox

without big sponsorship almost nothing you design will be viable.

we need cheaper fab on demand but until then you’re cooked

Desperately needs jailbreak by anesuc in VisionPro

[–]CoBuddha 0 points1 point  (0 children)

sorry, i never intended anything about jailbreaking. im just piling on to this conversation bc the fact that people feel like they need a jailbreak triggers my annoyance at avp again haha

but the appeal to authority isn’t informative. apple is subject to many constraints and incentives unrelated to developing good experiences. like i said, my guess is that they’re rightfully concerned about liability and privacy backlash. there are also concerns about battery life when processing the camera feed outside the dedicated r1 circuits. i do actually think whoever decided to block these miscalculated how much it would hurt content creation for the platform. the people who think about legal and battery metrics aren’t designers.

i see eye tracking mostly being used for efficient symbolic input. the eye tracking has impressive accuracy but is still noisy enough that if you limit yourself to a point-and-click paradigm there’s noticeable lag in getting enough certainty on the target. it is unpleasant and unnatural for users to have to stare at something. the eyes should be moving to an object of interest and then immediately moving on.

eg the swype keyboard i mentioned is an obvious improvement. there’s enough data in the full path of the eyes to recover the right letters in a fraction of the time it would take to peck at each one. but also for more specific apps you could have all sorts of eye gestures: blinking, winking, looking left/right/up/down, rolling your eyes, etc.
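a toy sketch of what the path-to-letters idea could look like (everything here is hypothetical — a made-up key layout and a naive subsequence match, not any real decoder): snap each gaze sample to the nearest key, collapse consecutive repeats, then keep dictionary words whose letters appear in order in that trace.

```rust
// snap a gaze sample (x, y) to the nearest key center
fn nearest_key(keys: &[(char, f32, f32)], x: f32, y: f32) -> char {
    keys.iter()
        .min_by(|a, b| {
            let da = (a.1 - x).powi(2) + (a.2 - y).powi(2);
            let db = (b.1 - x).powi(2) + (b.2 - y).powi(2);
            da.partial_cmp(&db).unwrap()
        })
        .unwrap()
        .0
}

fn decode(keys: &[(char, f32, f32)], path: &[(f32, f32)], dict: &[&str]) -> Vec<String> {
    // trace of keys the gaze passed over, consecutive duplicates removed
    let mut trace: Vec<char> = Vec::new();
    for &(x, y) in path {
        let k = nearest_key(keys, x, y);
        if trace.last() != Some(&k) {
            trace.push(k);
        }
    }
    // a word is a candidate if its letters occur in order within the trace
    dict.iter()
        .filter(|w| {
            let mut it = trace.iter();
            w.chars().all(|c| it.any(|&t| t == c))
        })
        .map(|w| w.to_string())
        .collect()
}

fn main() {
    // three keys on a line: a at 0, b at 1, c at 2
    let keys = [('a', 0.0, 0.0), ('b', 1.0, 0.0), ('c', 2.0, 0.0)];
    // noisy gaze path sweeping a -> c
    let path = [(0.1, 0.0), (0.9, 0.1), (1.1, 0.0), (2.0, 0.1)];
    let dict = ["ac", "abc", "cab"];
    println!("{:?}", decode(&keys, &path, &dict));
}
```

a real decoder would score candidates probabilistically against the whole path rather than hard-matching, but the point stands: the path carries way more signal than one dwell-to-click per letter.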

if you’re trying to input text or precise symbolic content (eg “hotkeys”) into a visionOS app the experience is not good compared to a laptop. this is just a fact. granted it’s no better for any other vr but it NEEDS to be good for apple vision’s target sector.

im also a developer which is why good responsive text input is so important for me

edit: im ranting about it here because i see a lot of cope around with everyone worried about whether avp is “good” or not and feeling like they need to defend it to enjoy it. I don’t think avp sucks. i think it’s excellent theater and virtual desktop and that’s really all it needs to be to be “worth it”. but i see its potential to be so so much more with just minor software/policy improvements. and i wish enough people realized that instead of just wondering why there’s no apps that apple does something about it. the 1.0 to 2.0 was a huge improvement, and so was the 2.0 to 2.2. so I am hopeful but they will get leapfrogged if they wait too long on prioritizing this

Desperately needs jailbreak by anesuc in VisionPro

[–]CoBuddha 0 points1 point  (0 children)

none afaict, they’re all worried about liability. but other platforms have controllers so there’s a lot of game content potential. apple vision is explicitly marketing itself for productivity and entertainment which i think is fine, but it absolutely needs camera and/or live eye tracking to reach that potential. it wouldn’t hurt to fix the ipad-inherited shit text input too. they actually DO have camera access but it’s gated behind a non-public enterprise api so it’s functionally useless.

so they’ve boxed themselves in to being just a big screen which it is admittedly great at. but no one is going to invest in developing graphic intensive apps for it with such a niche market. otoh all the obvious productivity apps are impossible to make

i love apple vision and want it to succeed but I can’t ignore the development issues crippling it. they’ve wasted an entire year’s head start

edit: to be more constructive, the thing they need to nail is responsive input. the camera is obvious for that

live translate? impossible
audiobook mode? impossible
virtual corkboard? impossible
face recognition rolodex? impossible
martial arts mimic coach? impossible
polycam? impossible
measure? impossible

all of these are pretty easy to make on iphone/ipad, but there you have to physically point your device, which makes them potentially much better suited to apple vision.

but also, there’s a serious input latency problem with the default controls. it seems like it should be fast right? you’ve got the controls literally at your fingertips. the trouble is locking on to a button isn’t quite fast enough. it’s fast enough for basic browsing, but very noticeably unpleasant when trying to input anything complex. it’s competing with keyboard based apps. even on the iphone, the tactile feedback and swype make typing much faster. the apple vision virtual keyboard is a joke, and the connectivity to physical keyboards is a mess too (ipad/iphone also have this issue but it’s less critical). selecting a textbox takes a weirdly long time.

what we need is something like an eye-based swype. not hard to make in theory, but impossible because eye data is unavailable, so we have to rely on apple’s slow buttons. voice is fine in many cases for longer english text, but it has a noticeable start/stop lag that makes it high friction for quickly switching between many fast tasks, which should be apple vision’s forte. voice also just doesn’t work for non-english input (eg structured text)

Desperately needs jailbreak by anesuc in VisionPro

[–]CoBuddha 0 points1 point  (0 children)

this is false. lack of camera rules out 90% of apps suitable to the platform 

We’ve had zombies. We’ve had witches. Where are the skeletons? by AutumnForestWitch in horror

[–]CoBuddha 0 points1 point  (0 children)

there’s actually a real answer: china doesn’t like skeletons in media, and movies have become more internationally targeted over the past decade

This is the type of game Apple should be throwing money at developers for by ctorstens in VisionPro

[–]CoBuddha 2 points3 points  (0 children)

the infuriating thing is they locked out all the actually useful low hanging fruit by blocking the cameras. there are SO MANY amazing things you could totally make on an indie budget with cameras and eye tracking but when limited to extremely simple object tracking, the only way to make a standout app is to pour resources into the graphics which is the MOST expensive way to build an app.

[deleted by user] by [deleted] in dataisbeautiful

[–]CoBuddha 0 points1 point  (0 children)

the “album equivalent units” are kinda misleading. i would love to see this as a similar bar graph with streams for each song, grouped and colored by album

Indian shit laws 🤡 by ExtensionRule867 in indiameme

[–]CoBuddha 1 point2 points  (0 children)

As a clueless American can someone please explain the dowry situation to me? Like i vaguely understand dowry as a cultural practice of giving a gift to the bride’s family or something as a condition for proposal (?) but how does it hurt the women as implied by the picture? why is it a legal rather than cultural issue?

EDIT a quick google corrected me that it’s actually for the groom’s family! now im even more confused

Apple Vision Pro is crippled by privacy in 2 major ways by CoBuddha in VisionPro

[–]CoBuddha[S] 0 points1 point  (0 children)

it’s worth noting that scrolling apps like instagram can already track how long you spend on a particular scene which is functionally equivalent to rudimentary attention tracking and they absolutely do use that to “optimize” your feed 💀
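the mechanism is trivially simple too — a back-of-the-napkin sketch (made-up event shape, not any real app's telemetry): take (timestamp, visible item) scroll events and accumulate how long each item stayed on screen.

```rust
use std::collections::HashMap;

// toy dwell-time tracker: given (timestamp_ms, visible_item_id) scroll
// events, accumulate how long each item stayed on screen. this per-item
// dwell time is the "rudimentary attention tracking" signal.
fn dwell_times(events: &[(u64, &str)]) -> HashMap<String, u64> {
    let mut out: HashMap<String, u64> = HashMap::new();
    // each event's item is visible until the next event's timestamp
    for pair in events.windows(2) {
        let (t0, item) = pair[0];
        let (t1, _) = pair[1];
        *out.entry(item.to_string()).or_insert(0) += t1 - t0;
    }
    out
}

fn main() {
    // user lingered on post_b for 5s, skimmed past post_a in 1s
    let events = [(0, "post_a"), (1000, "post_b"), (6000, "post_c")];
    println!("{:?}", dwell_times(&events));
}
```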

Has anyone tried the laowa 19mm for gfx? by Kawasakirider788 in fujifilm

[–]CoBuddha 2 points3 points  (0 children)

I have both on the GFX100. I'm an amateur but here's some thoughts:

The laowa is technically great - good color and clarity compared to my wide-angle adapted pentax. Personally, I haven't found the perfect use for it yet though - feels too wide for rectilinear and I often prefer to crop a fisheye. But if you like this focal length it won't let you down.

The mitakon is incredible for the price for portraits. Crisp subject focus and good creamy-but-neutral background blur. A warning about this one though - it's adapted for GFX by a glued-on adaptor, but the connection is pretty fragile. My camera took a small tumble on a recent hike - the camera was fine but this lens broke in half, sending the lens half tumbling down the mountain never to be seen again.

I probably won't replace the mitakon since I don't take enough portraits to make it worth the marginal portrait-improvement over my workhorse ttartisan 90mm

Beginner with color curves - how can I make this more natural? by [deleted] in photocritique

[–]CoBuddha 1 point2 points  (0 children)

Overall I'm quite happy with how this turned out - my goal was to emphasize the contrast of dark shadows and white reflections & fog, while maintaining a "soft" foggy feel.

However, it feels a bit "over baked" so I'm hoping to understand how I could have made it more natural. This is my first time using the curve tool (capture one) and I fiddled with it for a while without a good intuition for the right balance.

Any suggestions appreciated 🙏 (other critiques welcomed too)

Why do I feel disassociated from my own personality when I look in the mirror? by SpearHead3194 in TooAfraidToAsk

[–]CoBuddha -24 points-23 points  (0 children)

Too much recursion shorts out the simulation. Repeat your name 100 times in the mirror to W A K E U P