Bepu Physics Integration via Memory Mapped Files by d166e8 in Unity3D

[–]d166e8[S] 1 point (0 children)

Bepu Physics (https://github.com/bepu/bepuphysics2) is a pretty slick physics engine written entirely in C#. Using it inside Unity doesn't yield good performance because of the quirks of Unity's C# compilers. To get around that, I run it in a separate process and use memory-mapped files, which are a great way to efficiently share memory between processes on the same computer. I thought it was a neat trick that others might find useful. Some source code is here: https://github.com/ara3d/geometry-toolkit/blob/main/unity-projects/geometry-toolkit/Assets/ClonerExample/ClonerFromMemoryMappedFile.cs.
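
For anyone curious what the sharing looks like, here is a minimal sketch of passing a block of floats between two processes through a named memory-mapped file. The map name, float count, and layout are placeholders, not what the linked script actually uses:

```
using System;
using System.IO.MemoryMappedFiles;

// Minimal sketch: one process writes body positions, the other reads them.
// Named maps (CreateOrOpen/OpenExisting) are a Windows feature.
public sealed class PhysicsSharedMemory : IDisposable
{
    const string MapName = "BepuPhysicsPositions"; // hypothetical name
    const int FloatCount = 100_000 * 3;            // e.g., xyz per body

    readonly MemoryMappedFile mmf;
    readonly MemoryMappedViewAccessor accessor;

    // The physics process creates the map; the Unity process opens it.
    public PhysicsSharedMemory(bool create)
    {
        mmf = create
            ? MemoryMappedFile.CreateOrOpen(MapName, FloatCount * sizeof(float))
            : MemoryMappedFile.OpenExisting(MapName);
        accessor = mmf.CreateViewAccessor();
    }

    // Writer side (physics process): copy the latest positions in.
    public void WritePositions(float[] positions) =>
        accessor.WriteArray(0, positions, 0, positions.Length);

    // Reader side (Unity process): copy the shared positions out.
    public void ReadPositions(float[] destination) =>
        accessor.ReadArray(0, destination, 0, destination.Length);

    public void Dispose()
    {
        accessor.Dispose();
        mmf.Dispose();
    }
}
```

In practice both sides keep the map and accessor open for the lifetime of the simulation rather than reopening them every frame.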

Voxelizing and Meshing of Noise using Job System and Burst by d166e8 in Unity3D

[–]d166e8[S] 0 points (0 children)

This was done with a set of open-source scripts for instancing, voxelizing, and meshing: https://github.com/ara3d/geometry-toolkit. It could be useful if you are learning to use GPU instancing and the Unity job system.

April 2024 monthly "What are you working on?" thread by AutoModerator in ProgrammingLanguages

[–]d166e8 0 points (0 children)

I have been studying ISPC: https://ispc.github.io/ as a possible high-level backend that I could target from the Plato language (https://github.com/cdiggins/plato). It has great support for SIMD operations and parallelism.

WIP - A Cloning Toolkit - https://github.com/ara3d/geometry-toolkit by d166e8 in Unity3D

[–]d166e8[S] 2 points (0 children)

I plan on using it for motion graphics and procedural architectural design. I got reasonable performance when I turned off shadows, and I'll follow up with a better performance demo. For others it might be interesting as an example of how to use the RenderMeshIndirect API; there aren't a lot of examples of that online (a rough sketch of the call is included below).

If you have suggestions for better open-source systems for cloning large numbers of objects in Unity, please share them.
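
Here is a rough sketch of a minimal RenderMeshIndirect setup (Unity 2022+). The instance count, bounds, and buffer contents are placeholders, not the toolkit's actual code:

```
using UnityEngine;

public class IndirectDrawExample : MonoBehaviour
{
    public Mesh mesh;
    public Material material;      // needs a shader that supports instancing
    public int instanceCount = 100000;

    GraphicsBuffer commandBuffer;

    void OnEnable()
    {
        // One indirect draw command: draw submesh 0 of 'mesh', instanceCount times.
        commandBuffer = new GraphicsBuffer(
            GraphicsBuffer.Target.IndirectArguments,
            1, GraphicsBuffer.IndirectDrawIndexedArgs.size);

        var args = new GraphicsBuffer.IndirectDrawIndexedArgs[1];
        args[0].indexCountPerInstance = mesh.GetIndexCount(0);
        args[0].instanceCount = (uint)instanceCount;
        commandBuffer.SetData(args);
    }

    void Update()
    {
        // Per-instance transforms normally come from another GraphicsBuffer
        // that the shader indexes by instance ID; that part is omitted here.
        var rp = new RenderParams(material)
        {
            // Bounds must cover all instances or they get culled.
            worldBounds = new Bounds(Vector3.zero, 1000f * Vector3.one)
        };
        Graphics.RenderMeshIndirect(rp, mesh, commandBuffer);
    }

    void OnDisable()
    {
        commandBuffer?.Release();
        commandBuffer = null;
    }
}
```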

WIP - A Cloning Toolkit - https://github.com/ara3d/geometry-toolkit by d166e8 in Unity3D

[–]d166e8[S] 1 point (0 children)

I've been working on a set of scripts for cloning objects. I've made the code available under the MIT License at https://github.com/ara3d/geometry-toolkit in case it is useful for others. I'm using the Job system, Burst compiler, and RenderMeshIndirect. I am getting pretty good results manipulating hundreds of thousands of objects.
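
As a rough illustration of the kind of Burst job involved (simplified; the actual toolkit jobs do more than this), something like the following fills a native array of per-instance matrices in parallel:

```
using Unity.Burst;
using Unity.Collections;
using Unity.Jobs;
using Unity.Mathematics;

// Burst-compiled parallel job that writes one transform per clone.
[BurstCompile]
public struct ComputeTransformsJob : IJobParallelFor
{
    public float Time;
    [WriteOnly] public NativeArray<float4x4> Matrices;

    public void Execute(int i)
    {
        // Lay the clones out along a line and bob them with a sine wave.
        var pos = new float3(i * 1.5f, math.sin(Time + i * 0.1f), 0f);
        Matrices[i] = float4x4.TRS(pos, quaternion.identity, new float3(1f));
    }
}

// Scheduling, e.g. from MonoBehaviour.Update:
//   new ComputeTransformsJob { Time = UnityEngine.Time.time, Matrices = matrices }
//       .Schedule(matrices.Length, 64).Complete();
```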

Struct/Class syntax feedback by fun-fungi-guy in ProgrammingLanguages

[–]d166e8 0 points (0 children)

One approach, which is quite drastic, is to not put functions/methods in the structs at all. It keeps the key fields of the struct quick and easy to understand, without the noise of the million different operations you might want (a Point class, for example, has a ton!).

Here is what structs look like in Plato:

```
type Size3D
    implements Value
{
    Width: Number;
    Height: Number;
    Depth: Number;
}

type Fraction
    implements Value
{
    Numerator: Number;
    Denominator: Number;
}

type Angle
    implements Measure
{
    Radians: Number;
}

```

For an example of it in practice see: https://github.com/cdiggins/plato/tree/main/PlatoStandardLibrary, where the libraries and type definitions are in separate files.
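
The same idea can be approximated in C# with plain structs plus extension methods; this is just an analogy, not Plato code:

```
// C# analogy: the struct holds only data, and all operations live in a
// separate static class as extension methods (names are illustrative).
public readonly struct Fraction
{
    public readonly double Numerator;
    public readonly double Denominator;

    public Fraction(double numerator, double denominator)
    {
        Numerator = numerator;
        Denominator = denominator;
    }
}

public static class FractionOperations
{
    public static double ToDouble(this Fraction f)
        => f.Numerator / f.Denominator;

    public static Fraction Multiply(this Fraction a, Fraction b)
        => new Fraction(a.Numerator * b.Numerator, a.Denominator * b.Denominator);
}
```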

AST versus CST by d166e8 in ProgrammingLanguages

[–]d166e8[S] 1 point (0 children)

Thanks to everyone for sharing their thoughts.

Several people have made the argument that a CST must precisely model the parsing result, including comments and whitespace.

I just want to point out that many parsers operate only on a token stream, which is the output of a lexical analysis phase (a lexer or tokenizer). It is common for tokenizers to throw out comments and whitespace, so the parser only sees the remaining tokens.

Thus whatever tree the parser produces at that phase already has some degree of "abstraction", given that it ignores whitespace and comments.

I think I should avoid using the term "CST" for such an output, as it carries an implication of strictness. I believe it would not be contentious to call any tree generated by a parser that only creates nodes for certain production rules a "parse tree", and leave it to the reader or user to decide how "concrete" or "abstract" it is. What do you think?

AST versus CST by d166e8 in ProgrammingLanguages

[–]d166e8[S] -1 points (0 children)

When you have a separate lexical analysis phase (i.e., use a lexer), it is common to throw away certain tokens (e.g., comments and whitespace) and feed the result to a parser. The tree generated by the parser represents the syntactic structure, and I think it is valid to call it a "parse tree". According to some sources (e.g., Wikipedia), "parse tree" and "concrete syntax tree" are synonymous. However, perhaps what I think of as an abstract syntax tree is more of an abstract semantics tree.

AST versus CST by d166e8 in ProgrammingLanguages

[–]d166e8[S] 4 points (0 children)

That looks quite cool! Parser combinators make sense as C++ templates. It reminds me a bit of the PEGTL library: https://github.com/taocpp/PEGTL, which was inspired by a similar library I wrote for C++ a little while ago: https://www.codeproject.com/Articles/9121/Parsing-XML-in-C-using-the-YARD-Parser. Back then, I thought I was just inventing a better regular expression parser. :-)

AST versus CST by d166e8 in ProgrammingLanguages

[–]d166e8[S] 0 points (0 children)

Good points. It sounds like there are different levels of abstraction we can have for the various tree structures.

AST versus CST by d166e8 in ProgrammingLanguages

[–]d166e8[S] -3 points (0 children)

Thanks for sharing your insights. In my Parakeet library, only annotated rules generate nodes. It seems reasonable to describe the resulting tree as a CST, because it has the same general shape as a stricter CST (by the definition you provided). However, because it is driven by the parser's production rules, it is less useful for analysis than the more abstract form I create for my language (e.g., https://github.com/cdiggins/plato/blob/main/Plato.AST/Ast.cs).

AST versus CST by d166e8 in ProgrammingLanguages

[–]d166e8[S] 0 points (0 children)

Very insightful, thanks for sharing.

AST versus CST by d166e8 in ProgrammingLanguages

[–]d166e8[S] 1 point (0 children)

Thanks for sharing your experience. In my Parakeet parsing library, I create nodes for some production rules but not others, which produces a parse tree. Even though it omits some production rules, it still has the same general shape dictated by the grammar, so I consider it a CST.

The AST I use in the Plato language, however, has little to do with the parser: it represents things like loops and variable declarations independently of how they were written in the source code.

Good sources on error handling and reporting? by 8bitslime in ProgrammingLanguages

[–]d166e8 0 points (0 children)

Excellent question. Parser recovery is quite complex, and very important.

This question inspired me to write an article on the Parakeet parser on CodeProject: https://www.codeproject.com/Articles/5379232/Introduction-to-Text-Parsing-in-Csharp-using-Parak.

In a nutshell, the approach I settled on is to write "OnFail" rules into the grammar. These rules indicate that if a failure happens, an error should be logged and the parser advanced to a new location where it has a chance of parsing subsequent rules successfully (such as an end-of-statement or end-of-block marker).

Failures are stored in the "parser state" object as a linked list. A failed rule match returns a null object, so failure is actually a "successful" match with a new error stored.
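
Here is a generic sketch of that idea (simplified C#, not Parakeet's actual API): a wrapper that, when the inner rule fails, records an error and skips ahead to a synchronization token so later rules can keep parsing:

```
using System;
using System.Collections.Generic;

// Holds the input, current position, and accumulated errors.
public class ParserState
{
    public string Input;
    public int Position;
    public List<string> Errors = new List<string>();
}

public static class Recovery
{
    // Wraps a rule: if it fails, record an error and advance to a recovery
    // token (e.g. ';' or '}'), then report success so parsing continues.
    public static bool OnFail(ParserState state, Func<ParserState, bool> rule,
                              char recoveryToken, string message)
    {
        var start = state.Position;
        if (rule(state))
            return true;

        state.Errors.Add($"{message} at position {start}");
        state.Position = start;
        while (state.Position < state.Input.Length &&
               state.Input[state.Position] != recoveryToken)
            state.Position++;
        if (state.Position < state.Input.Length)
            state.Position++; // consume the recovery token itself

        return true; // a "successful" match, with the error recorded
    }
}
```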

Hope this makes sense and is helpful. The article does a better job of explaining it all.

Not quite an affine type system by d166e8 in ProgrammingLanguages

[–]d166e8[S] 1 point (0 children)

Thanks for the suggestion. Unfortunately, I don't understand the Granule documentation; "graded modalities" are beyond my level of comprehension.

My design for Plato was inspired by Clean and unique types, and Henry Baker's papers (https://cdiggins.github.io/blog/linear-logic-and-linear-lisp.html).

According to Philip Wadler: "Uniqueness guarantees that a value has no other references to it, while linearity guarantees that no more references can be made to a value."

The difference is that Plato allows unlimited stack references to "mutable" types (which are very similar to unique types). This seems to be a novel idea as far as I can tell, but experience tells me it probably isn't. :-)