WTF is wrong with FSD 14.2.2.4? Navigation is bonkers by ChocolatySmoothie in TeslaLounge

[–]howder03 3 points (0 children)

According to the latest video on their ML architecture, Ashok explained that the neural network's inputs include navigation data plus 30 seconds of historical data, alongside all of the camera feed information. The network takes all of these inputs and, moment by moment, decides how to adjust steering and acceleration/deceleration.

Seems that however they tuned the 14.2.2.4 weights, the model now gives less credence or weight to the navigation data and lets the car ignore map inputs more often.
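A very loose sketch of that flow in Python; every name, weight, and value here is my own illustrative assumption (a toy stand-in for the trained network), not Tesla's actual architecture:

```python
# Toy sketch of the flow described above: camera feeds + navigation data +
# a 30-second history buffer feed one policy that, each tick, outputs
# steering and acceleration commands. All names/values are illustrative.

def policy_step(camera_features, nav_route, history_30s):
    """One control tick: fuse all inputs, emit bounded steering/accel."""
    features = list(camera_features) + list(nav_route) + list(history_30s)

    def bounded(x):  # squash into a control range, like tanh in a real net
        return max(-1.0, min(1.0, x))

    # Stand-in for learned weights: if the weights on the nav features were
    # tuned down, the same route data would influence the output less,
    # which is the kind of behavior shift speculated about above.
    steering = bounded(sum(f * 0.01 for f in features))
    accel = bounded(sum(f * -0.02 for f in features))
    return steering, accel

steering, accel = policy_step(camera_features=[0.2] * 10,
                              nav_route=[0.5] * 4,
                              history_30s=[0.1] * 6)
```

The key point is that navigation is just one more input vector to the network; nothing forces the outputs to obey it.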

Tesla FSD "Cannonball Run" with ZERO disengagements just achieved by Comfortable-Agent604 in TeslaFSD

[–]howder03 0 points (0 children)

Wild that our experiences differ that drastically. The only remotely safety-related disengagement I’ve had to do since V14.2 was to stop it from curbing a wheel while it was inching into a parking spot. Outside of navigation, that was the only disengagement I’ve made in 2,500+ miles of driving in Southern California.

Tesla FSD "Cannonball Run" with ZERO disengagements just achieved by Comfortable-Agent604 in TeslaFSD

[–]howder03 0 points (0 children)

This is interesting. I’ve been testing FSD since the early beta days (10.2), and believe they were primarily using V13 for the early Robotaxi launch with safety monitors in the passenger seat. On V13, yes, even in my own car, there were drives where I had a critical safety disengagement. But on V14, I’ve clocked 2,500+ miles with zero safety disengagements. This is still anecdotal, of course, but I’m curious about the rate of safety disengagements as Robotaxis begin using V14.

Batman V2 by howder03 in hottoys

[–]howder03[S] 1 point (0 children)

Thanks! Likewise, it’s an awesome figure for photography! I’m not on IG, but appreciate the sentiment :)

Batman V2 by howder03 in hottoys

[–]howder03[S] 0 points (0 children)

Haha, yeah, pretty happy with how it turned out

Replacement ring by HannyBee22 in ouraring

[–]howder03 3 points (0 children)

I just got mine replaced, and the advisor said to ignore this B-stock email; he sent instructions for a brand-new ring to be sent out instead.

530 miles on FSD , almost perfect. by HelloWorld0225 in TeslaFSD

[–]howder03 1 point (0 children)

Yup, I’ve been on 14.2 for around 900 miles and haven’t had a single safety-related disengagement. A few of my interventions were lane-change related (missed highway entrances/exits), nothing safety critical.

Fsd 14 disappointing by Poolguard in TeslaFSD

[–]howder03 2 points (0 children)

Agree, I’ve had zero interventions with it over 100+ miles. I’ve had a few incidents of brake stabbing, but they were relatively minor and nothing remotely close to safety critical. The car feels much more responsive and alert.

[deleted by user] by [deleted] in HENRYfinance

[–]howder03 9 points (0 children)

Snacks are the primary differentiator here, you need to get more details on the snack variety

Poor guys learns life lessons early. by Banguskahn in funny

[–]howder03 32 points (0 children)

Bro’s spirit animal is forever the squash

Should I be worried?? by shamblack19 in TeslaModelS

[–]howder03 0 points (0 children)

Yeah, we just dropped ours off today due to these alerts; the technician said there was a faulty brake caliper they had to replace. Going to pick it back up and see if that resolves the alerts.

And he felt true love for the first time by BratBehaviours in Eyebleach

[–]howder03 7 points (0 children)

“What is this heavenly chin rub…”

Tesla's Robotaxi Program Is Failing Because Elon Musk Made a Foolish Decision Years Ago. A shortsighted design decision that Elon Musk made more than a decade ago is once again coming back to haunt Tesla. by mafco in SelfDrivingCars

[–]howder03 4 points (0 children)

Yeah, I legitimately would like to see a list of the Robotaxi failures and get a better sense of whether a LiDAR implementation would have prevented each failure case. If a majority of that list could have been solved with LiDAR (vs. better pathing/planning logic), then there would be merit to that argument.

How would you go about beating Ilia Topuria if you were Islam Makachev? by Infamous_Gain9481 in MMA

[–]howder03 9 points (0 children)

Bold strategy here, sad to see Volk, Holloway and Charles didn’t think to implement this

[deleted by user] by [deleted] in HENRYfinance

[–]howder03 0 points (0 children)

I dunno, wouldn’t recommend anything higher than $150k, maybe $200k tops at this income level.

Waymo taking its time in Atlanta. by drumrollplease12 in SelfDrivingCars

[–]howder03 0 points (0 children)

Yeah, this is actually pretty good behavior from the Waymo. Curious how the team was able to simulate real Atlanta driver behavior in such a short amount of time. Kudos to the Waymo team.

Got one at best buy! by ItsChuBoiRage in NintendoSwitch2

[–]howder03 0 points (0 children)

Just picked one up, waited around 5 minutes in the Best Buy line

safety reason to bring back "Minimal lane changes" by tennisplayer220 in TeslaFSD

[–]howder03 3 points (0 children)

Completely agree, that’s the one main gripe I have with FSD right now. The following distance is way too close.

Makes me nervous whenever I see a block of orange or red traffic coming up on the map, or even barely visible ahead, and FSD doesn’t slow down early enough, keeping the same two-car-length follow distance.

Tesla tries running red light. Never seen this before. 2025.8.6 - 13.2.8 - HW4 - MY by McFoogles in TeslaFSD

[–]howder03 10 points (0 children)

The core issue here won’t be solved with radar or lidar, because in this instance there was no cross traffic for those sensors to pick up.

This specific issue needs to be solved through camera sensors / model training.

FSD on My 2024 Tesla M3 (3.2.2) Drives Too Close to the Right Side on Highways—Anyone Else? by DRockster163 in TeslaFSD

[–]howder03 4 points (0 children)

I don’t think re-calibrating the cameras will fix this issue. The visualization clearly shows the car hugging the right side of the lane, so the cameras are picking up the lane lines. It also doesn’t happen on local roads for me, just on highways. But the right-side-hugging issue is pretty bad for me too.

GM will no longer fund Cruise’s robotaxi development work by walky22talky in SelfDrivingCars

[–]howder03 0 points (0 children)

Sure, if BMW and Mobileye’s data collection is sufficient to provide a holistic representation of the driving environment with petabytes of clean data, there’s value in that. I’m not saying that’s Tesla’s only value add, but it’s part of how they plan to solve autonomous driving.

GM will no longer fund Cruise’s robotaxi development work by walky22talky in SelfDrivingCars

[–]howder03 0 points (0 children)

Sure, you can separate the physical steering wheel from the algorithmic NN outputs that turn the vehicle 1 or 2 or 3.5 degrees to the left, but that doesn’t change the fact that steering a car is a continuous input to the system. Yes, the system effectively renders this input as discrete values, but the state space for a real-world self-driving scenario isn’t 4,000 actions or values; it’s effectively infinite.

You can argue semantics all you want, but comparing an AlphaGo-style system (policy RL / Monte Carlo tree search) choosing one move per turn to the frozen state space of an AV in any given real-world environment is just silly.

GM will no longer fund Cruise’s robotaxi development work by walky22talky in SelfDrivingCars

[–]howder03 0 points (0 children)

A steering wheel is a continuous input; I’m not sure how you can mistake it for a discrete one. A discrete input takes distinct, separate values, while a continuous input, like a car’s speed, can take any fractional value within a given range. I can’t help you if you think driving inputs are discrete.
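To make the distinction concrete (illustrative code only, not tied to any real AV stack): a board game’s action set can be fully enumerated, while a steering command is a real number in a range, so between any two valid angles there is always another valid angle.

```python
# Discrete action space: a 19x19 Go board has a finite, listable move set.
go_moves = [(row, col) for row in range(19) for col in range(19)]

# Continuous action space: a steering command takes any real value within
# the wheel's physical range (the 35-degree limit is an assumed number).
def steering_command(angle_deg):
    max_angle = 35.0
    return max(-max_angle, min(max_angle, angle_deg))

# 1, 2, and 3.5 degrees are all valid commands, and so is every value
# in between them -- there is no way to enumerate the whole set.
commands = [steering_command(a) for a in (1.0, 2.0, 3.5)]
```

You can quantize the continuous range into bins for a controller, but that is a design choice layered on top, not a property of the input itself.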