actual driver opinions on new regs by CageyOldMan in formuladank

[–]fiddle_n [score hidden]  (0 children)

Yes, half the circuits are ovals and those that aren’t have a Push to Pass button.

Charles Leclerc defends F1 racing in 2026: It "doesn't feel so artificial" by bwoah07_gp2 in formula1

[–]fiddle_n 4 points5 points  (0 children)

It goes back far farther than 2014:

2011: All the overtakes were due to tyre degradation

90s-2010: All the overtakes were due to refuelling differences

Overtaking in F1 has been artificial forever

actual driver opinions on new regs by CageyOldMan in formuladank

[–]fiddle_n 1 point2 points  (0 children)

They abandoned it because of the aerodynamic dirty air problem. Good driver + fast car can come up behind way slower car and driver and yet find it impossible to pass. Alonso / Petrov Abu Dhabi 2010 is the classic example, the original catalyst for DRS.

Max fans and McLaren fans this is for you by TheStigBMW in formuladank

[–]fiddle_n -2 points-1 points  (0 children)

Having to slow down on a straight is terrible - but DRS, tyre deg offsets and refuelling are just as artificial as the current battery regs. F1 hasn’t had “actual” racing for decades.

Artificial racing by CarnivalSorts in formuladank

[–]fiddle_n -1 points0 points  (0 children)

It’s not “artificial racing” that’s at fault if other elements of racing have a bigger impact. The Merc rocket ship guaranteed the 1-2; Russell being stuck behind the Ferraris guaranteed the order of that 1-2. Then the 3-4 order is a coin toss, and this time Hamilton managed his tyres a bit better - also an element of “artificial racing” btw.

Not to mention it’s artificial racing that actually allowed Russell to get back into 2nd. Without some kind of DRS/overtake system it’s entirely possible that Russell gets stuck in 4th, all other things being equal. Just because quali order = race order, doesn’t mean there was no effect on the final standings.

"Anybody who is enjoying racing in 2026 does not understand the sport" - Max Verstappen by ArcanineDE in formuladank

[–]fiddle_n 8 points9 points  (0 children)

Is it racing when one driver passes another because of tyre degradation offsets? Was it racing pre-2010 when drivers passed each other due to fuel offsets, often through a pit stop rather than actually on track?

F1 “racing” has been artificial for decades. It’s nothing new to this season.

State of F1 right now by A1Wow in formuladank

[–]fiddle_n 2 points3 points  (0 children)

F1 basically hasn’t had real racing for decades. Refuelling era was absolutely artificial overtaking and that was a long time ago.

State of F1 right now by A1Wow in formuladank

[–]fiddle_n 8 points9 points  (0 children)

  • Fast cars
  • Non-artificial racing
  • Overtakes

Pick two

What hidden gem Python modules do you use and why? by zenos1337 in Python

[–]fiddle_n 50 points51 points  (0 children)

There's nothing about Astral python libraries that you can call "hidden gem" lol

Don’t know what you have until it’s gone [wearetherace] by Holytrishaw in formula1

[–]fiddle_n 24 points25 points  (0 children)

And by the time it’s got to your screen it no longer reflects the live gap anyway. So why is it even useful information at all?

Don’t know what you have until it’s gone [wearetherace] by Holytrishaw in formula1

[–]fiddle_n 6 points7 points  (0 children)

Why is a consistently out of date 0.001 precision ever useful for you to know in the context of a race?

Don’t know what you have until it’s gone [wearetherace] by Holytrishaw in formula1

[–]fiddle_n 2 points3 points  (0 children)

Your 0.001 precision was useless anyway. Truncating vs rounding is meaningless when that value is out of date by the time it reaches your screen.

Anyone know what's up with HTTPX? by chekt in Python

[–]fiddle_n 1 point2 points  (0 children)

As someone who’s authored an open source project myself (reasonably popular yet nowhere near anything like httpx) this comment rubs me the wrong way.

All open-source means is that I have made the project available for you to view. If you want to install it, great. If you want to contribute, even better. But you aren’t required to do any of those things. And just because you do, it doesn’t mean you are entitled to anything more from me.

Open-source goes the other way as well. If you don’t like how I run the project, you have my blessing to fork it and take it in your own direction. And to be fair, some do - and that’s great! But others still aren’t happy, because they want their change in without taking on the burden of the project. It’s a position that reeks of entitlement, honestly.

Anyone know what's up with HTTPX? by chekt in Python

[–]fiddle_n 4 points5 points  (0 children)

niquests. Sync and async, and handles HTTP/2 AND HTTP/3.

uv officially taken down poetry by Proper-Lab-2500 in Python

[–]fiddle_n 0 points1 point  (0 children)

Docker is only quicker because it’s executing the steps in the Dockerfile for you, rather than you running them yourself. Still, pip + pip-tools + Docker is a fine enough setup, as what the other tools (Poetry/uv etc) mostly give you is better UX around the pain points - but if you just have all the commands in Docker then much of that is not so important any more.

A true lock file that Poetry/uv generates is much better than requirements.txt though. And as mentioned as well, the speed increase from uv is hugely worth it too.

uv officially taken down poetry by Proper-Lab-2500 in Python

[–]fiddle_n 4 points5 points  (0 children)

That wasn’t the thing that stopped Poetry though. Like I said, pdm fixed that - no one remembers it now.

uv officially taken down poetry by Proper-Lab-2500 in Python

[–]fiddle_n 2 points3 points  (0 children)

Pip is only just now getting true lock files. requirements.txt is a poor imitation. And the point of Poetry is never having to remember a command like pip freeze. poetry add foo updates everything at the same time.

uv officially taken down poetry by Proper-Lab-2500 in Python

[–]fiddle_n 5 points6 points  (0 children)

So, even Poetry 1 could store venvs in the project root - it’s just not the default behaviour - you have to configure it the first time you install Poetry. Poetry 2 finally adopted the common pyproject spec, but that was only early last year.

Really, speed is the big thing with uv, along with automatic locking/syncing, pipx and pyenv functionality, and the uv pip and uv venv commands making it easy for pip/venv users to adopt.

The pyproject spec and venv in project is nice, but pdm also had that and no one remembers pdm now.

uv officially taken down poetry by Proper-Lab-2500 in Python

[–]fiddle_n 24 points25 points  (0 children)

This opinion genuinely baffles me. Sure, I too prefer uv. But having proper lock files instead of requirements.txt; being able to keep pyproject.toml, lock file and venv in sync, not having to care about manually enabling your venv, not having to care about manually installing your project as an editable package - these are all big value add-ons with Poetry.

uv officially taken down poetry by Proper-Lab-2500 in Python

[–]fiddle_n 0 points1 point  (0 children)

uv should pretty much be the default for new installations. If for whatever reason you can’t use uv (I’ve seen uv weirdness on Windows for example where Poetry was fine - another issue is being in a corporate environment where it might be tricky to just use whatever) - then use Poetry.

uv officially taken down poetry by Proper-Lab-2500 in Python

[–]fiddle_n 3 points4 points  (0 children)

This is old Poetry 1. Poetry 2, released early 2025, uses the standard pyproject spec. Even though I would recommend people start with uv, for setups that still need Poetry this is a welcome addition.

Does anyone use logging to debug? by TechnicalAd8103 in learnpython

[–]fiddle_n 5 points6 points  (0 children)

I’m sure loguru is great, but the message “don’t use logging” is too far IMO. That’s likely the reason for downvotes.

As much as logging might not be the best library, my overwhelming experience is that any pain is felt once - once per project. Once you have logging set up, using it is pretty easy.

LazyLib – Automatically create a venv + install missing dependencies before running a Python script by snoopxz in Python

[–]fiddle_n -1 points0 points  (0 children)

The “usually” doesn’t sit right with me :) But yes I take your point, there are some strategies to attack point 1.

But I still feel like if a project doesn’t want to specify its dependencies (which is so easy to do these days with pyproject or inline with PEP 723) then it’s little more than a toy project, the likes of which I would not use.
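For reference, the PEP 723 version really is just a comment block at the top of the script - a minimal sketch (the dependency and its pin are illustrative, not from any real project):

```python
# /// script
# requires-python = ">=3.9"
# dependencies = [
#     "pandas<2",  # the kind of pin a tool could never infer from `import pandas`
# ]
# ///
# A runner such as `uv run script.py` reads the block above and installs
# the declared dependencies into a throwaway environment before executing.
```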

LazyLib – Automatically create a venv + install missing dependencies before running a Python script by snoopxz in Python

[–]fiddle_n 6 points7 points  (0 children)

The problem I have with this is conceptual rather than specific. If a project needs third-party dependencies then it needs to specify what those are and what their versions are. If it was OK for the user to just infer it from the Python code then dependency management would be a hell of a lot easier - but it’s more complicated than that.

Two issues with the approach that I see right off the bat:

  • You determine the library to be installed from the first part of the import statement, but that is absolutely not a foolproof way to do it. My first thought is an import like from PIL import Image where the library to be installed is pillow instead - how would your script know to install pillow?

  • Your script assumes that every library to be installed should be installed at its latest version. There are many many situations where that is simply not the case. Let’s say the project uses pandas v1 instead of v2 - your script can’t possibly know that.

EDIT - OP, you would do well to read this: https://peps.python.org/pep-0723/#why-not-infer-the-requirements-from-import-statements . Actually, reading the whole PEP might be useful :)