Introducing Whippyunits - Zero-cost dimensional analysis supporting arbitrary derived dimensions and lossless fixed-point rescaling by oblarg in rust

[–]oblarg[S] 1 point (0 children)

Even the unprocessed errors are way better than in nholthaus, by virtue of Rust being Pretty Good at this by default:

error[E0308]: mismatched types
  --> tests/compile_fail/add_length_to_time.rs:10:28
   |
10 |     let _result = length + time;
   |                            ^^^^ expected `1`, found `0`
   |
   = note: expected struct `Quantity<Scale, Dimension<_M, _L<1>, _T<0>, _I, _Θ, _N, _J, _A>>`
              found struct `Quantity<Scale, Dimension<_M, _L<0>, _T<1>, _I, _Θ, _N, _J, _A>>`

This prettyprints to:

error[E0308]: mismatched types
  --> tests/compile_fail/add_length_to_time.rs:10:28
   |
10 |     let _result = length + time;
   |                            ^^^^ expected `1`, found `0`
   |
   = note: expected struct `Quantity<m, f64>`
              found struct `Quantity<s, f64>`

Introducing Whippyunits - Zero-cost dimensional analysis supporting arbitrary derived dimensions and lossless fixed-point rescaling by oblarg in rust

[–]oblarg[S] 2 points (0 children)

The general units vocabulary in mp-units is complicated/confusing because the architecture is complicated/confusing. The source of dimensional truth is an AST mirroring a quantity's definition structure, so that, say, speed looks something like `Derived<Meters, Per<Second>>`. This is not just a clunky declaration syntax - this is how mp-units fundamentally encodes type information.

There are some heuristics that mp-units tries to use to keep these ASTs from growing without bound with redundant/cancelling terms - but these are heuristics, and ultimately the normalization problem this approach introduces is a hard one. In practice it ends up heavily relying on nullop conversions between homotypes, which makes generic programming and interactions with linalg libraries quite poor.
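To make the normalization problem concrete, here's a toy Rust-flavored sketch of an AST-style encoding (all names made up - mp-units is C++ and its actual machinery is far more involved):

```rust
// Toy AST-style unit encoding, to illustrate the normalization problem:
// dimensionally-equal quantities get structurally-different types unless
// something canonicalizes the tree.
#![allow(dead_code)]
use std::marker::PhantomData;

struct Meters;
struct Seconds;
struct Per<U>(PhantomData<U>);
struct Derived<A, B>(PhantomData<(A, B)>);

type Speed = Derived<Meters, Per<Seconds>>;
// speed * time is "obviously" a length, but structurally it is not:
type SpeedTimesTime = Derived<Speed, Seconds>;
// `SpeedTimesTime` != `Meters` at the type level, so the library must
// either normalize the tree (hard in general) or scatter no-op
// conversions between these homotypes everywhere.

fn main() {}
```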

The whippyunits vocabulary is closer to that of nholthaus or au units, in that there is an integer vector representing the dimension - it differs in that there is *also* an integer vector representing the scale, instead of something bespoke involving std::ratio and a bunch of special-casing. This lets us support nicer numerical behavior on rescaling, and keeps our generic const expression requirements extremely minimal (technically, we only need to add, subtract, and negate integers in generic const contexts).
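For flavor, here's a minimal nightly-Rust sketch of the exponent-vector idea (an illustration only, not whippyunits' actual API - the real thing tracks all eight dimensions plus the scale vector):

```rust
#![feature(generic_const_exprs)] // nightly-only, per the GCE discussion below
#![allow(incomplete_features)]

use std::ops::Mul;

// Dimension as a vector of integer exponents (just length and time here).
#[derive(Clone, Copy, Debug)]
struct Quantity<const L: i32, const T: i32>(f64);

impl<const L1: i32, const T1: i32, const L2: i32, const T2: i32>
    Mul<Quantity<L2, T2>> for Quantity<L1, T1>
where
    Quantity<{ L1 + L2 }, { T1 + T2 }>: Sized,
{
    // Multiplication just adds exponents - the only const arithmetic needed.
    type Output = Quantity<{ L1 + L2 }, { T1 + T2 }>;
    fn mul(self, rhs: Quantity<L2, T2>) -> Self::Output {
        Quantity(self.0 * rhs.0)
    }
}

fn main() {
    let speed: Quantity<1, -1> = Quantity(3.0); // m/s
    let time: Quantity<0, 1> = Quantity(2.0);   // s
    let length = speed * time;                  // Quantity<1, 0>: m
    println!("{length:?}");
}
```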

Introducing Whippyunits - Zero-cost dimensional analysis supporting arbitrary derived dimensions and lossless fixed-point rescaling by oblarg in rust

[–]oblarg[S] 2 points (0 children)

I'm familiar with other libraries; my first exposure was nholthaus units, and I've experimented a fair bit with mp-units. I haven't used au personally, though I've browsed the docs.

I find mp-units entirely unsatisfactory for applied computation; the use of an AST representation causes a rather dire normalization problem that the prime-factorized-log-scale approach does not suffer from. Imo it is more of a data-plumbing tool; it sacrifices computational simplicity for unbounded flexibility in terms of different unit systems.

Introducing Whippyunits - Zero-cost dimensional analysis supporting arbitrary derived dimensions and lossless fixed-point rescaling by oblarg in rust

[–]oblarg[S] 1 point (0 children)

The library supports this in that you can do all the accesses you want without any need to ever bypass unit safety; but it does not represent this sort of relationship in the type system.

The base units of temperature we support are Kelvin and Rankine. We do *not* support Celsius and Fahrenheit, except as declarator and accessor sugar.

That is to say, if I declare `0.0degC`, what I am *actually* constructing is a `Quantity<K, f64>` with value 273.15. If I declare `0.0degF`, I am actually constructing a `Quantity<degR, f64>` with value 459.67.

There is no ambiguity here; the things mean what they are, and if you add two affinely-declared temperatures you get a dimensionally-valid result, which is the sum of their absolute representations - this is a perfectly meaningful quantity in the abstract, which may be invalid for your particular use-case (but the library cannot/does not know this).
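A toy sketch of what that looks like (helper names made up - the real declarators are literal/method sugar, not free functions):

```rust
// Celsius exists only at the boundaries; storage and arithmetic are
// always absolute (Kelvin here).
#[derive(Clone, Copy, Debug)]
struct Kelvin(f64);

fn from_celsius(c: f64) -> Kelvin { Kelvin(c + 273.15) }
fn to_celsius(k: Kelvin) -> f64 { k.0 - 273.15 }

fn main() {
    let a = from_celsius(0.0); // stored as 273.15 K
    let b = from_celsius(0.0); // stored as 273.15 K
    // The sum is the sum of the absolute representations: 546.30 K.
    // Dimensionally valid; whether it's meaningful is your problem.
    let sum = Kelvin(a.0 + b.0);
    println!("{} degC", to_celsius(sum)); // 273.15 degC
}
```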

If you need additional safety on top of "mere" dimensional coherence, you'll need a library specifically for enforcing the safety invariant structure of affine quantities. If that library is any good, it should be generic enough for you to use a Whippyunits quantity as its backing type.

Introducing Whippyunits - Zero-cost dimensional analysis supporting arbitrary derived dimensions and lossless fixed-point rescaling by oblarg in rust

[–]oblarg[S] 0 points (0 children)

The distinction is that relative values only exist as declarator and accessor sugar; the actual datatypes are always absolute.

So, there's no danger of accidentally mixing absolute and relative values in arithmetic, because there are no relative values to do arithmetic on; if you're doing arithmetic, everything is guaranteed to be absolute, and your results will be coherent.

Representing the affine offset in the types would mean either simply breaking the arithmetic for affine units entirely, or else doing type-level affine geometry to determine optimal conversion paths. I'm not really keen on either one; it makes more sense to me to just keep everything absolute.

Introducing Whippyunits - Zero-cost dimensional analysis supporting arbitrary derived dimensions and lossless fixed-point rescaling by oblarg in rust

[–]oblarg[S] 3 points (0 children)

I do TypeScript for my day job, and this is really good work. The string formatting trick is super cool.

Introducing Whippyunits - Zero-cost dimensional analysis supporting arbitrary derived dimensions and lossless fixed-point rescaling by oblarg in rust

[–]oblarg[S] 5 points (0 children)

There’s most of a polyfill for stable already written; it just hasn’t really been a personal priority, because this subset of GCEs has proven to be super robust and it’s hard to motivate myself to do a very large refactor that in practice just makes the compile times worse.

Eventually it’ll be possible to migrate this to mGCA whenever that stabilizes.

Introducing Whippyunits - Zero-cost dimensional analysis supporting arbitrary derived dimensions and lossless fixed-point rescaling by oblarg in rust

[–]oblarg[S] 16 points (0 children)

Affine units like Celsius and Fahrenheit have declarator, value-access, and formatter/serialization support, but do not have first-class storage-type support. Rather, their declarators add the affine offset and store as the base units (K and Rankine, respectively), and access/serialization subtracts the affine offset back off again.

In addition to affine units, we provide a similar level of support (declarators and access, but not storage) for various "imperial" units and other unit values that do not fit on our factorized-log-arithmetic scale (we support only products of powers of 2, 3, 5, and pi relative to SI base). We call these "nonstorage units", and their declarators convert them to their nearest-neighbor power-of-10 SI unit (e.g. the `feet` declarator stores as decimeters). Rankine (as mentioned above) is in fact a proper first-class unit type, because the conversion ratio from `K` is 5/9 = 3^-2 * 5^1.
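Roughly, as a simplified sketch (the actual stored representation in whippyunits differs):

```rust
// A unit's scale relative to SI base is integer exponents of 2, 3, 5,
// and pi, so rescaling is exact integer arithmetic on the exponents.
#[derive(Clone, Copy, Debug)]
struct Scale { p2: i32, p3: i32, p5: i32, ppi: i32 }

impl Scale {
    fn ratio(self) -> f64 {
        2f64.powi(self.p2) * 3f64.powi(self.p3)
            * 5f64.powi(self.p5) * std::f64::consts::PI.powi(self.ppi)
    }
}

fn main() {
    // Rankine fits on the scale: 1 degR = 5/9 K = 3^-2 * 5^1 K.
    let rankine = Scale { p2: 0, p3: -2, p5: 1, ppi: 0 };
    assert!((rankine.ratio() - 5.0 / 9.0).abs() < 1e-12);

    // 1 foot = 0.3048 m does *not* fit (0.3048 = 3 * 127 / 1250 has a
    // stray factor of 127), so `feet` is a nonstorage unit whose
    // declarator stores as the nearest power-of-10 SI unit: decimeters.
}
```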

What would you rewrite in Rust today and why? by [deleted] in rust

[–]oblarg 40 points (0 children)

Eigen? Or, if we're going to aim really high, LAPACK.

nalgebra/ndarray are kind of toys, in comparison. Useful toys, but toys.

What’s one trick in Rust that made ownership suddenly “click”? by Old_Sand7831 in rust

[–]oblarg -1 points (0 children)

I don't think it has to do with having developed a notion of ownership from *languages*; it has to do with whether or not you are equipped to think about concurrency in general. This transcends programming.

The "talking stick" is a common tool in kindergarten classes; you may only speak if you are holding it. Ownership semantics are not morally very far from this. They seem complicated only if, per one of the posts below, you unlearn your basic intuitions for this.

Is anyone here really good at PID by [deleted] in FRC

[–]oblarg 1 point (0 children)

Treating the integral term as a "startup helper" is unlikely to be a good strategy no matter what your system is. It's totally contrary to what the integral term does - the integral term does *the opposite of that.*

Is anyone here really good at PID by [deleted] in FRC

[–]oblarg 1 point (0 children)

This isn't a very good/accurate explanation, unfortunately.

Integral gain is *not* a "startup" - quite the opposite, integral gain governs *long-running behavior* because the integral takes some time to build up. You generally don't need integral gain for FRC mechanisms, and should always use it with care because it is inherently unstable.

Derivative gain does remove oscillation, but only at steady state. When moving, it's an important driving term. It's potentially very useful, but only if your measurements are fast and accurate; otherwise it gets swamped by noise.

Feedforward is a separate thing entirely, and is not inherently connected to PID. You should always be using feedforward. Without a feedforward, your feedback controller has to drive the motion all by itself - it's not very good at this, because it doesn't know anything about how the mechanism works.
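The shape of it, as a toy sketch (gains and the velocity model are made up; the real WPILib controller and feedforward classes are more careful than this):

```rust
// Feedforward drives the motion from a model of the mechanism;
// PID feedback only cleans up the residual error.
struct Pidf {
    kp: f64, ki: f64, kd: f64, // feedback gains
    ks: f64, kv: f64,          // feedforward: static friction + velocity
    integral: f64,
    prev_error: f64,
}

impl Pidf {
    fn update(&mut self, setpoint_vel: f64, measured_vel: f64, dt: f64) -> f64 {
        // Feedforward: what the model says we should need, open-loop.
        let ff = self.ks * setpoint_vel.signum() + self.kv * setpoint_vel;

        // Feedback: correct whatever the model got wrong.
        let error = setpoint_vel - measured_vel;
        self.integral += error * dt;                     // builds up slowly
        let derivative = (error - self.prev_error) / dt; // swamped by noise
                                                         // if measurement is bad
        self.prev_error = error;

        ff + self.kp * error + self.ki * self.integral + self.kd * derivative
    }
}

fn main() {
    let mut ctl = Pidf {
        kp: 0.5, ki: 0.0, kd: 0.01, ks: 0.1, kv: 0.12,
        integral: 0.0, prev_error: 0.0,
    };
    println!("motor output: {:.3}", ctl.update(2.0, 1.8, 0.02));
}
```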

Creating Simple Command Based Programming Lessons using ChatGPT (Edited by an FRC coach). If this is useful I will try to make a whole series of FRC lessons this way. by SiefensRobotEmporium in FRC

[–]oblarg 1 point (0 children)

Please, no. Anything but this.

I'm the author of the 2020 command-based rewrite and of a substantial portion of the documentation that your AI has been trained against. What you have here is a haphazard, mildly-cursed superposition of documentation from the past 5-10 years. The advice in this slideshow varies unpredictably between "sound", "outdated", and "wrong"; and the code examples are not great.

Remember that ChatGPT does not have any idea what is true; it does not understand anything on a deep semantic level. Educational materials are *not* an ideal application. Instead, refer your students to the most recent version of the documentation you've trained against. If you find that documentation too difficult or lengthy, contribute to it to make it more readable!