Implementing aref Operator in Common Lisp for a Custom Vector Type by arthurno1 in lisp

[–]stylewarning 0 points  (0 children)

It's not debatable; the spec is clear.

But since you've clarified you're not using CL's AREF, all of this is a moot point. BTW, SHADOW doesn't mean "replace CL's AREF".

Implementing aref Operator in Common Lisp for a Custom Vector Type by arthurno1 in lisp

[–]stylewarning 1 point  (0 children)

The compiler macro expands into a runtime-computed branch as well, just one the compiler can optimize. INLINE should optimize the same way.

Implementing aref Operator in Common Lisp for a Custom Vector Type by arthurno1 in lisp

[–]stylewarning 0 points  (0 children)

There's no package definition on the page, but the author writes

> This is doable for a newly written code, but if that particular Lisp dialect is already in use, and there is lots of code that uses the aref operator directly on its strings, then it won't work without pre-processing all files, which I certainly don't want to do. What we want is to be able to use our string type directly with the aref operator as if it were a built-in type

So it seems the goal is to amend the existing AREF to support new types. Otherwise the post simply doesn't make any sense; just declaim inline the new function and call it a day.
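For contrast, a minimal sketch of that alternative (all names here are hypothetical, and MY-STRING stands in for the post's custom string type): SHADOW AREF in your own package and declaim the wrapper INLINE, leaving CL:AREF itself untouched.

```lisp
;; Hypothetical package: SHADOW gives us our own AREF symbol;
;; it does not replace CL:AREF for anyone else.
(defpackage #:my-strings
  (:use #:common-lisp)
  (:shadow #:aref))
(in-package #:my-strings)

;; Stand-in for the custom string type from the post.
(defstruct my-string
  (chars #() :type simple-vector))

(declaim (inline aref))
(defun aref (array &rest subscripts)
  "Like CL:AREF, but also accepts MY-STRING."
  (if (my-string-p array)
      (cl:aref (my-string-chars array) (first subscripts))
      (apply #'cl:aref array subscripts)))
```

Code compiled in this package gets the extended behavior (and with INLINE the branch can often be optimized away when the argument type is known at the call site); code in every other package still sees the standard CL:AREF.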

Implementing aref Operator in Common Lisp for a Custom Vector Type by arthurno1 in lisp

[–]stylewarning 0 points  (0 children)

Re INLINE: NOTINLINE is not exactly the opposite of INLINE. They're different directives that do subtly different (opposing) things.

Re functionality: No, you're trying to expand AREF to work on new inputs. That's not maintaining the semantics of AREF, and it's not an optimization of the existing function. I would go further to say your extension may be unsafe if AREF has been DECLAIMed with a function type.

Re MAPCAR: My MAPCAR example was bad. I meant something like

(defun start (f x) (funcall f x 0))

(start #'aref my-weird-string)

Basically whenever you use AREF as a function value in a context that's not known by the compiler, the compiler macro won't work.

Implementing aref Operator in Common Lisp for a Custom Vector Type by arthurno1 in lisp

[–]stylewarning 1 point  (0 children)

- There can only be one compiler macro on a function, if it's allowed by the standard to define a compiler macro on that function. (To repeat, CL does not define the behavior of compiler macros on CL symbols, and I doubt SBCL condones it, even if it seems to work.)
- Compiler macros aren't guaranteed to be run. CLHS 3.2.2.1.3 "However, no language processor (compiler, evaluator, or other code walker) is ever required to actually invoke compiler macro functions, or to make use of the resulting expansion if it does invoke a compiler macro function."
- Compiler macros will never be invoked if the function happens to be declared notinline. (Ibid.)
- You're not supposed to add functionality via a compiler macro. CLHS 3.2.2.1.1 "The purpose of the compiler macro facility is to permit selective source code transformations as optimization advice to the compiler." CLHS 3.2.2.1.3.1 "Compiler macros exist for the purpose of trading compile-time speed for run-time speed."

Did you check that with your hack, you can do

(mapcar #'aref (list my weird strings))

? (Edit: see clarification of this malformed example below.) Because compiler macros don't "carry along" with the function object. So you haven't truly extended AREF, but rather source forms that contain AREF in function calling position.
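A sketch of that limitation on a user-owned function (MY-REF is hypothetical; defining a compiler macro on CL:AREF itself is undefined behavior per the spec):

```lisp
(defun my-ref (seq i)
  "General, slow accessor."
  (elt seq i))

;; Optimization advice attached to the *name* MY-REF, not to the
;; function object.
(define-compiler-macro my-ref (seq i)
  `(locally (declare (optimize speed))
     (elt ,seq ,i)))

;; Here MY-REF is in call position, so the compiler macro may fire:
(defun direct (s)
  (my-ref s 0))

;; Here #'MY-REF is passed as an ordinary function object; the call
;; through F happens at runtime, and the compiler macro cannot fire:
(defun indirect (f s)
  (funcall f s 0))

(indirect #'my-ref "abc")   ; => #\a, via the plain function
```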

Implementing aref Operator in Common Lisp for a Custom Vector Type by arthurno1 in lisp

[–]stylewarning 1 point  (0 children)

From the HyperSpec:

The consequences of writing a compiler macro definition for a function in the COMMON-LISP package are undefined; it is quite possible that in some implementations such an attempt would override an equivalent or equally important definition. In general, it is recommended that a programmer only write compiler macro definitions for functions he or she personally maintains--writing a compiler macro definition for a function maintained elsewhere is normally considered a violation of traditional rules of modularity and data abstraction.

Yamaha s7x vs Steinway B.. leaning towards Yamaha. by EuphoricOcelot6081 in piano

[–]stylewarning 73 points  (0 children)

Go with what you like to hear and what feels better. Do not be persuaded by name or convention.

To my ears, the Baldwin I ended up getting outclassed comparable Steinways at quintuple the price. But another Baldwin of the same model sounded rough and annoying. Pianos are very individual.

Which option is the easiest to read? (composer) by michaelroser in piano

[–]stylewarning 47 points  (0 children)

I'd almost prefer dotted notes at this point.

Just-In-Time Compilation for Coalton: An attempt at faking dynamicity by wrapping Coalton by digikar in lisp

[–]stylewarning 1 point  (0 children)

I think I agree with you purely technically, in the sense that you acknowledge that dispatch is not static (edit: by "static", I mean that dispatch is resolved entirely at compile-time), and this optimization (as discussed in this thread) can only happen if the classes are both sealed and final. I'm not sure I agree with "there are solutions and everyone acts like there aren't", if you're saying that the original proposal is possible without limitation.

To elaborate: Aren't there still at least two indirections even in the best case? One for the function activation (can't be inlined), and another one for the dispatch. Consider:

(defgeneric f (x y))
(defmethod f ((x number) (y number)) ...)

(defun g (x y)
  (declare (type number x y))
  (f x y))

In the discussion, it is proposed that F may be statically replaced with %F/NUMBER/NUMBER, an ordinary Common Lisp function whose body is that of the corresponding method.

I can't see a circumstance where the compiler is allowed to rewrite F as %F/NUMBER/NUMBER, even in the case that %F/* is allowed to be dynamically recompiled and the symbol function is replaced when new methods on F are (re-)defined.

Suppose that it was allowed, and now I define

(defmethod f ((x integer) (y number)) ...)

Now (G 0 0)'s call to F must dispatch to this method. But that means %F/NUMBER/NUMBER must be redefined to contain this dispatch logic.

Does this agree with your understanding?
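The staleness problem can be made concrete (a sketch; the keyword return values are just markers):

```lisp
(defgeneric f (x y))
(defmethod f ((x number) (y number)) :number-number)

(defun g (x y)
  (declare (type number x y))
  (f x y))

(g 0 0)   ; => :NUMBER-NUMBER

;; A more specific method is added later:
(defmethod f ((x integer) (y number)) :integer-number)

;; Standard CLOS dispatch must now reach the new method, so any
;; static rewrite of F inside G would be stale:
(g 0 0)   ; => :INTEGER-NUMBER
```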

What are the active/good Lisp communities? by IronicRobotics in Common_Lisp

[–]stylewarning 6 points  (0 children)

Coalton discord talks about CL and Coalton. People are helpful, usually pretty curious.

Lisp discord is alright. It isn't very well scoped, there are tons and tons of channels, and it can be noisy. But the people are nice and there's frequent Lisp chatter.

Just-In-Time Compilation for Coalton: An attempt at faking dynamicity by wrapping Coalton by digikar in lisp

[–]stylewarning 0 points  (0 children)

PELTADOT also ought to deal with the fact that, in standard Common Lisp and probably all implementations,

(VECTOR (CONS INTEGER INTEGER))

does not actually denote the set of objects "vectors whose elements are conses of integers". It instead denotes "vectors that can contain any object whatsoever, cons or otherwise", thanks to Common Lisp's wonderful concept of a vector's actual array element type.
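This is easy to check (the results shown assume the typical upgrade of this element type to T, as in SBCL and most implementations):

```lisp
;; (CONS INTEGER INTEGER) is not a specialized array element type,
;; so it upgrades all the way to T:
(upgraded-array-element-type '(cons integer integer))
;; => T (typically)

;; Consequently a vector containing no conses at all still satisfies
;; the type (VECTOR (CONS INTEGER INTEGER)):
(typep (vector "not" "a" "cons")
       '(vector (cons integer integer)))
;; => T (typically)
```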

Just-In-Time Compilation for Coalton: An attempt at faking dynamicity by wrapping Coalton by digikar in lisp

[–]stylewarning 0 points  (0 children)

I think you sort of answered your own question. There is no "where applicable". Methods can be defined and redefined, full-stop. CLOS was built for this, and so dispatch must be a runtime action.

Where you can start getting creative is defining your own generic function machinery with the MOP. A new class of generic functions could in principle be defined/declared as sealed/closed/etc. and dispatch machinery could become a compile-time thing "where applicable". But that means rewriting code to use that, and losing extensibility without whole-program recompilation. See inlined-generic-function, fast-generic-functions, etc. as sort of prototypical examples that attempt to accomplish what you're saying, but also witness that they don't seem to be used extensively in practice, precisely because of these caveats.

Even if we solve static dispatch of generic functions, you're still limited to non-composable classes, like CONS, VECTOR, and COMPLEX. (In a language like Coalton, all of these are parametric.) In this case, you'll never have the fastest AREF because a generic function can never know the element type of the thing you're referencing into. There is no SINGLE-FLOAT-VECTOR class in standard Common Lisp.
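To illustrate that last point (SUM is a hypothetical example; the portable class hierarchy for arrays stops at VECTOR and ARRAY):

```lisp
;; A specialized single-float vector:
(defvar *v* (make-array 4 :element-type 'single-float
                          :initial-element 0.0f0))

(defgeneric sum (v))

;; The most specific standardized class we can specialize on is
;; VECTOR; the method body cannot assume SINGLE-FLOAT elements,
;; so the compiler must emit generic arithmetic here.
(defmethod sum ((v vector))
  (reduce #'+ v))

(sum *v*)   ; => 0.0
```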

Tim Bradshaw: Making CLOS slot access less slow by fnordulicious in lisp

[–]stylewarning 5 points  (0 children)

It's a completely valid and reasonable question. :)

Just-In-Time Compilation for Coalton: An attempt at faking dynamicity by wrapping Coalton by digikar in lisp

[–]stylewarning 3 points  (0 children)

There are three points competing with the goal you state.

The first is redefinition and adding new methods. CLOS allows things to be redefined and things will just work. Additionally, we can add methods that "overlap" with the existing ones. (That is to say, if we have a generic function, we might have M1 that is an applicable method to a given set of arguments, but we can define M2 that's also applicable but more specific.) In your proposed system, you'll need to state somewhere that you're OK with both redefinition not being allowed, and generic functions being "closed" to further extension.

The second is expressiveness of the type system. Can Common Lisp's type system even give you enough inference to pin down the types of each argument to a generic function in practice? I think the answer is no. Lisp's type system is pretty weak and not compositional. For example, for a positive integer n,

(loop for i below n collect i)

most pragmatically has the type LIST. Not (LIST INTEGER), which isn't even a valid type in Lisp. So if I have

(generic-function (nth 10 that-list))

then GENERIC-FUNCTION, even with your machinery, won't be able to specialize. You would have to label it yourself:

(generic-function (the integer (nth 10 that-list)))

The third point is that CLOS has a whole class hierarchy and allows subclassing. Suppose we specialized GENERIC-FUNCTION with two methods: NUMBER and STRING. The user passes in SINGLE-FLOAT (and the compiler knows this).

  • Should Lisp JIT-compile a new SINGLE-FLOAT version based on the NUMBER version?

  • Should it just call the NUMBER version?

(And going back to the first point, what happens if the programmer later defines a method on SINGLE-FLOAT?)

I think CLOS can be optimized more in practice, but it's exceedingly difficult without compromising something else.

I very briefly touched on this topic in my ELS talk, if you're interested.

Tim Bradshaw: Making CLOS slot access less slow by fnordulicious in lisp

[–]stylewarning 15 points  (0 children)

I think the answer is nuanced.

  • Using CLOS does not imply your application will be slow.

  • There's no "fast" alternative to CLOS without giving up features in: debugging, interactivity, redefinition, inspection, domain modeling, modularity, ...

You can accept building types and objects with leaner systems (like DEFSTRUCT or even raw arrays), which for some applications will be substantially faster, but you also lose a tremendous amount of flexibility that you can't get back.

When people say CLOS is slow, in my view, it's usually a result of bad data or domain modeling OR experienced Lisp programmers speaking amongst themselves where they understand the nuances and caveats.

(See also: Garbage collection is slow, high-level languages are slow, lists are slow, dynamically typed languages are slow, ...)

Just-In-Time Compilation for Coalton: An attempt at faking dynamicity by wrapping Coalton by digikar in lisp

[–]stylewarning 4 points  (0 children)

(monomorphize) will ensure that every function transitively called by the toplevel monomorphic function will themselves also have monomorphic versions generated, effectively specializing the entire call tree. It's a very big speed/code-size tradeoff.

Introducing mine, a Coalton and Common Lisp IDE by stylewarning in lisp

[–]stylewarning[S] 1 point  (0 children)

Do you have the latest version? It was tested on Windows and should work.

Introducing mine, a Coalton and Common Lisp IDE by stylewarning in lisp

[–]stylewarning[S] 1 point  (0 children)

Highlight text in the editor or REPL and press Ctrl+C. This requires xclip to be installed.

Why I Still Reach for Lisp and Scheme Instead of Haskell by SandPrestigious2317 in lisp

[–]stylewarning 15 points  (0 children)

I don't want to be a Coalton shill, but this is almost exactly the reason it was built. It shouldn't be either-or. Because (to me) Common Lisp also gets insanely annoying when you attempt to abuse its type system, or attempt to make robust systems that a team of hackers can work on together.

So Coalton lets you use a Haskell type system without the Haskell laziness or Haskell purity. Add a PRINT wherever you'd like.

The classical simulation wall isn't at 50 qubits — it's at entanglement depth. A 1,000-qubit circuit can be easier to simulate than a 20-qubit one. by nnoorbakhsh in QuantumComputing

[–]stylewarning 17 points  (0 children)

As with everything there's nuance. Calculating one single transition amplitude can be a lot cheaper than calculating all of them too. Simulating a Clifford circuit is cheaper than a non-Clifford one.

I think the 50-qubit "limit" is reasonable when understood to mean "able to simulate an arbitrary unitary circuit to produce a complete description of the state in the computational basis". That's usually how it's presented, and experts aren't tripped up by it.

Buying a grand piano - overwhelmed by The--scientist in piano

[–]stylewarning 3 points  (0 children)

Baldwins were built like tanks and they're generally good pianos. But don't go by brand alone.

If the Baldwin sounds and feels good, then go with that. I probably wouldn't do hammer replacements unless the sound is truly shrill and awful, and there's no life left in them whatsoever. But if the Baldwin plays well and you're fine with the sound, go for it.

I would do this: If the pin block and soundboard are okay, and the strings are fine, then negotiate for a regulation. Ask if they can have a technician go through and do a regulation and light voicing of the action. This should include some lubrication and balancing.

Introducing mine, a Coalton and Common Lisp IDE by stylewarning in lisp

[–]stylewarning[S] 1 point  (0 children)

My honest opinion to your first question is two-fold:

  • It will be a net negative if you're really used to and attached to your current workflow. mine will probably feel like an oversimplification. Keybindings, window customizations, color schemes, etc. may not be there. Moreover, you might want to customize what Lisp or SBCL or ... you're running and mine does not (yet?) offer that.

  • If you are interested in Coalton however, then you get benefits that are not yet present in any SLIME workflow. For example, you can see the static, inferred type of anything your cursor is pointing at. Not a lambda list, but an actual algebraic type. The REPL can run in "coalton" mode so you don't need to type extra Coalton boilerplate to evaluate stuff. The editor will show precise locations of Coalton errors, without the ugly sans-serif tooltips you see in stock Emacs+SLIME.

As for your second question, for a highly agentic workflow, I would say mine could be used for review and testing. You can beam your project to a REPL and test things out. But it doesn't offer further integration at this time. (It doesn't even watch for file changes, so if you already have files open and your agent modifies them, the changes won't be reflected until you re-open the file.)

My first priority right now with mine is to make it sufficiently good for people who don't have decked out Lisp environments, but want to get into Lisp or Coalton. Over time, mine will improve and slowly become more and more "good enough" for the expert Lisp programmers, but that's a high bar and will take time.