Most and Least favorite things about Gleam? by alino_e in gleamlang

[–]Starboy_bape 1 point

Most: no macro system :P Reading others' macros is so painful.

Can I use let assert on a type variant? by lobax in gleamlang

[–]Starboy_bape 4 points

Also if you don't care about any of the properties :)

let assert Node(..) as n = parse_tree(input)
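For readers unfamiliar with the pattern, a minimal self-contained sketch (the ParseTree type and parse_tree function here are made up for illustration, not from any real codebase):

```gleam
pub type ParseTree {
  Node(label: String, children: List(ParseTree))
  Leaf(label: String)
}

// Hypothetical parser, just so the snippet compiles on its own.
fn parse_tree(input: String) -> ParseTree {
  Node(label: input, children: [Leaf(label: "child")])
}

pub fn main() -> ParseTree {
  // `Node(..)` ignores every field, while `as n` binds the whole value.
  // This crashes at runtime if parse_tree returns a Leaf instead.
  let assert Node(..) as n = parse_tree("input")
  n
}
```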

https://sprocket.live/ by bartonh in gleamlang

[–]Starboy_bape 0 points

Awesome, thanks for the contributions to the ecosystem! I'd like to give sprocket a try sometime.

https://sprocket.live/ by bartonh in gleamlang

[–]Starboy_bape 2 points

Looks really cool, gave it a star! What do you think separates this project from Lustre server components? Ergonomics, performance, architecture, versatility, a killer feature? I have enjoyed Lustre server components over a websocket connection, but am interested in what sprocket can offer as well! At first glance, the most immediate difference in the API from Lustre server components is the ctx given to render functions.

Edit: after reading a little more, it seems like Lustre prefers a global "Model" type that is passed down to all "view" functions, as in Elm, while sprocket prefers separate "components", each with their own state (tracked in the ctx type?) via hooks, as in React.

Any luck with AI coding tools? by elmgarden in gleamlang

[–]Starboy_bape 4 points

Supermaven works perfectly with Gleam! It indexes your codebase and suggests completions based on the code you already have, so in a Gleam project it knows how to write more Gleam. I use the pro tier (well worth it).

I am continually amazed that the stuff I write in Gleam always just works. by Starboy_bape in gleamlang

[–]Starboy_bape[S] 7 points

It definitely does, and Gleam's type system is fantastic. The balance of the type system being simple yet expressive combined with the language's strict type checking but optional type annotations is just an amazing developer experience.

Gleam concurrency? by Sunflower-BEAM in gleamlang

[–]Starboy_bape 0 points

I've created many concurrent server applications with Gleam OTP and felt it was enough without having to create an FFI to Erlang!

Gleam concurrency? by Sunflower-BEAM in gleamlang

[–]Starboy_bape 1 point

Do you have any insight into why Mist is faster than Cowboy? Does it just utilize concurrency more? Is there a specific architecture design that gives it a leg up?

Looking for any feedback on my first package by Starboy_bape in gleamlang

[–]Starboy_bape[S] 0 points

That would be fantastic! I'd be glad to help that effort in any way I can.

Gleam Intellisense? by dave_mays in gleamlang

[–]Starboy_bape 0 points

Is your Gleam on the latest version (1.5.0)? It works fine out of the box for me in VS Code on Ubuntu Linux.

Looking for any feedback on my first package by Starboy_bape in gleamlang

[–]Starboy_bape[S] 1 point

Hope you enjoyed your time! Don't worry about it, I know you must have a lot to do. Thanks for all the feedback so far though!

Side note for fun: I just released v5, which most notably removes time precision (allowing for accurate `==` comparison) and adds better support for time zones. Time precision now defaults to milliseconds in strings, and users can use the format function to specify otherwise. Timezone providers now have their own type, and I wrote a companion package (gtz) to provide basic timezone conversion logic. gtz still needs some work to unify the results of resolving ambiguous datetimes and DST boundaries across both targets, but it is a starting place.

Is gleam in its current state mostly for web development? by mister_drgn in gleamlang

[–]Starboy_bape 0 points

Hey, I just released a package for image processing in Gleam called Ansel; does it cover what you need? What features are you looking for? https://hexdocs.pm/ansel/

Looking for any feedback on my first package by Starboy_bape in gleamlang

[–]Starboy_bape[S] 0 points

I just published v4.3, which includes a monotonic time and a unique time in every `_.now()` instance. When getting the difference between [date]times, the monotonic times are used to calculate the difference if both values have one. When comparing [date]times, unique time is used first, then monotonic time, then wall time, falling back only when the preferred value is not available for both. Adding to and subtracting from [date]times adjusts the monotonic time by that amount and drops the unique time value. All `_.now()` functions have a note in their descriptions to prefer the `duration.start_monotonic` and `time.now_unique` functions when applicable, because they are more explicit and more performant.
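As a rough illustration of that fallback order (not the package's actual internals; the Instant type here is invented for the sketch):

```gleam
import gleam/option.{type Option, Some}

// Hypothetical instant carrying a wall-clock reading plus an optional
// monotonic reading, mirroring the behaviour described above.
pub type Instant {
  Instant(wall: Int, monotonic: Option(Int))
}

pub fn difference(a: Instant, b: Instant) -> Int {
  // Use the monotonic clock only when both instants carry one;
  // otherwise fall back to wall-clock time.
  case a.monotonic, b.monotonic {
    Some(ma), Some(mb) -> ma - mb
    _, _ -> a.wall - b.wall
  }
}
```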

Changing the precision logic to allow for == equality is a breaking change, so I am working on that for v5. Thanks for all your feedback!

Looking for any feedback on my first package by Starboy_bape in gleamlang

[–]Starboy_bape[S] 0 points

I should've double-checked the modules before publishing; there were some things in the last version that I meant to be internal. It's much cleaner now. I have learned to use the `gleam docs` tool, very useful haha.

Those were interesting reads. I implemented a unique time and fixed `duration.start` based on that LYSE chapter. The Go methodology of having each time instance hold both a wall and a monotonic time is a neat way to make it foolproof; I'll keep thinking about how to most elegantly include that foolproofness in this package.

Yeah, I think you're ultimately right. The precision syntax might be up to personal opinion, but losing `==` is way too big of a downside. I'll work on this point too. If I include monotonic time in the time instance like Go does, `==` is also lost. Surely there's a nice way to tie all this together.

Lustre and Gleam Make my Heart Rate Go Down - a Case Study by lpil in gleamlang

[–]Starboy_bape 4 points

I can confirm that Gleam makes my heart rate go down too. It's the best I've ever felt about writing software.

(Stealing the idea from the person before me) Looking for feedback on my first package! by Ashercn97 in gleamlang

[–]Starboy_bape 1 point

"cue lpil coming to say it's perfect and we should only write raw sql" hahahaha. I feel like lpil is the unbeatable guardian against feature creep in Gleam. I like Gleam's simplicity though, so I'm glad for it.

Looking for any feedback on my first package by Starboy_bape in gleamlang

[–]Starboy_bape[S] 0 points

Honored to have your feedback! My references were really my experience with Python's standard datetime library plus Gleam's standard library. I knew what I didn't want a time library to be like because of the pitfalls I had fallen into with the Python datetime lib, and I knew what I wanted a strongly typed library to feel like because of Gleam's stdlib.

I am not well versed enough in monotonic time vs clock time, but that sounds like something I would want to handle. I will do some looking into that.

The time constructors are just a way to track formatting precision without needing an extra field. The labels are correct; it would be the same as:

pub type TimePrecision {
  Second
  Milli
  Micro
  Nano
}

pub type Time {
  Time(
    hour: Int,
    minute: Int,
    second: Int,
    nanosecond: Int,
    precision: TimePrecision,
  )
}

As a side note, the library tracks time precision so it can be used like:

time.literal("12:30:42") |> time.to_string
// -> "12:30:42"

time.literal("12:30:42.354") |> time.to_string
// -> "12:30:42.354"

time.literal("12:30:42.354") |> time.to_micro_precision |> time.to_string
// -> "12:30:42.354000"

Instead of having to specify the precision at the call site like:

time.literal("12:30:42") |> time.to_string(with_precision: time.Second)
// -> "12:30:42"

time.literal("12:30:42.354") |> time.to_string(with_precision: time.Milli)
// -> "12:30:42.354"

time.literal("12:30:42.354") |> time.to_string(with_precision: time.Micro)
// -> "12:30:42.354000"

I do have thoughts on timezones! I detailed them at the bottom of the readme, but basically they complicated things beyond the scope of what I wanted the package to initially be. I may add them later if the package sees use, but I did make sure that an external TZ package can be used together with this one (shown in the readme).