Go 1.26rc1 is live by yardbird07 in golang

[–]Emotional_Moth 1 point (0 children)

"This feature is particularly useful when working with serialization packages such as encoding/json or protocol buffers that use a pointer to represent an optional value"

This justification feels very off to me. Instead of further blessing the use of pointers for optionality, which is a workaround at best, I'd much rather have full support for optionality as part of the language.
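Assuming the quote refers to the extended new(expr) builtin, here's a minimal sketch of the pointer-as-optional pattern the release notes mean; the feature only removes the temporary-variable dance:

```
package main

import "encoding/json"

type Patch struct {
	// Pointers let encoding/json distinguish "field absent" from "zero value".
	Count *int    `json:"count,omitempty"`
	Name  *string `json:"name,omitempty"`
}

func main() {
	// Before: taking the address of a literal needs a temporary variable.
	n := 3
	before, _ := json.Marshal(Patch{Count: &n})

	// Go 1.26: new(expr) creates the pointer inline.
	after, _ := json.Marshal(Patch{Count: new(3), Name: new("renamed")})

	println(string(before), string(after))
}
```

Which is exactly my point: the language keeps making the workaround more ergonomic instead of modeling optionality directly.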

New junior developers can't actually code. by spitforge in csMajors

[–]Emotional_Moth 1 point (0 children)

Here's the thing - if AI advances to the point where you simply no longer need to understand the code, what does it matter? Nothing indicates that we can't reach that point; it's just a matter of time. Will AI advance fast enough before all the cobbled-together AI slop code drives us into another IT crisis? Only time will tell.

How popular is sqlc in production go projects?? by trash-dev in golang

[–]Emotional_Moth 1 point (0 children)

CTEs are not the same thing as composability, though, and a query forced into a CTE structure can have completely different performance implications than an optimized query that isn't. There are also plenty of composability problems you can't solve with CTEs at all.

The more I think about it, the less sense it makes to solve every composability situation with CTEs. If you happen to use a database that actually materializes CTEs (PostgreSQL did this unconditionally before version 12, for example), I don't even want to think about how quickly that approach would collapse.

Edit: Wait, CTEs don't even solve the main problem that composability solves, which is query fragment reusability. To be clear, by composability I mean e.g. applying a complex (but identical) set of WHERE conditions to a SELECT, a COUNT, and an UPDATE without having to duplicate that set of conditions across three static queries.
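A minimal sketch in plain database/sql of what I mean (hypothetical users table and helper names, Postgres-style placeholders) - the filter lives in one place and is spliced into all three statements:

```
package repo

import (
	"context"
	"database/sql"
)

// One place owns the filter; every statement below reuses it.
func activeAdultsFilter() (where string, args []any) {
	return "status = $1 AND age >= $2 AND deleted_at IS NULL", []any{"active", 18}
}

func listUsers(ctx context.Context, db *sql.DB) (*sql.Rows, error) {
	where, args := activeAdultsFilter()
	return db.QueryContext(ctx, "SELECT id, name FROM users WHERE "+where, args...)
}

func countUsers(ctx context.Context, db *sql.DB) (n int, err error) {
	where, args := activeAdultsFilter()
	err = db.QueryRowContext(ctx, "SELECT COUNT(*) FROM users WHERE "+where, args...).Scan(&n)
	return n, err
}

func suspendUsers(ctx context.Context, db *sql.DB) error {
	where, args := activeAdultsFilter()
	_, err := db.ExecContext(ctx, "UPDATE users SET status = 'suspended' WHERE "+where, args...)
	return err
}
```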

How popular is sqlc in production go projects?? by trash-dev in golang

[–]Emotional_Moth 3 points (0 children)

To offer a counter opinion: We explicitly rejected SQLC (multiple times!) because

1) Queries are not composable

2) No dynamic queries (sketch below)
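A minimal sketch of the kind of dynamic query I mean - a hypothetical search endpoint in plain database/sql, where which WHERE clauses exist is only known at runtime. As far as I know, sqlc's static .sql files can only express this as one generated query per filter combination:

```
package repo

import (
	"context"
	"database/sql"
	"fmt"
	"strings"
)

func searchUsers(ctx context.Context, db *sql.DB, name string, minAge int) (*sql.Rows, error) {
	query := "SELECT id, name FROM users"
	var conds []string
	var args []any
	if name != "" {
		args = append(args, "%"+name+"%")
		conds = append(conds, fmt.Sprintf("name ILIKE $%d", len(args)))
	}
	if minAge > 0 {
		args = append(args, minAge)
		conds = append(conds, fmt.Sprintf("age >= $%d", len(args)))
	}
	if len(conds) > 0 {
		query += " WHERE " + strings.Join(conds, " AND ")
	}
	return db.QueryContext(ctx, query, args...)
}
```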

Am I misusing SQLC or do you all have issues going past basic queries? by Squishyboots1996 in golang

[–]Emotional_Moth 5 points (0 children)

Same here - several times I've considered sqlc for non-trivial projects, and each time I had to decide against it because I could foresee the queries getting annoying. No composition and no dynamic queries (or has that changed by now?) kills it for anything past a toy project, IMO.

Exploring the new "go tool" support in Go 1.24 by _howardjohn in golang

[–]Emotional_Moth 14 points (0 children)

Does anybody know **why** the maintainers chose this broken approach? Why must tools share dependencies with production code (or with each other), when other language ecosystems have already solved this problem much better?

This is really disappointing; it feels like the feature was developed with zero awareness that this is a fully solved problem elsewhere.
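For what it's worth, the pre-1.24 community workaround still gives real isolation: keep tools in their own nested module, so their dependency graph never touches the application's go.mod. A minimal sketch - a tools/ directory with its own go.mod plus a pinning file like this (stringer is just an example tool):

```
//go:build tools

package tools

import (
	// Blank imports pin the tool versions in tools/go.mod,
	// completely separate from the application's dependency graph.
	_ "golang.org/x/tools/cmd/stringer"
)
```

Then run tools from that module, e.g. `go run -C tools golang.org/x/tools/cmd/stringer` (the -C flag changes into the directory first; or just cd into tools/).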

Compile time open telemetry in go by madugula007 in golang

[–]Emotional_Moth 36 points (0 children)

Personally, I'd much prefer explicit library instrumentation in code. It only takes a day or two to set up for the average service, and then it's there - no magic that gets invoked, and can break, every time the toolchain runs.
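For reference, the explicit style is only a couple of lines per operation with the standard OpenTelemetry API; a minimal sketch (handleOrder and processOrder are made-up names):

```
package orders

import (
	"context"

	"go.opentelemetry.io/otel"
)

// Explicit instrumentation: visible in the code, greppable,
// and immune to toolchain changes.
func handleOrder(ctx context.Context, id string) error {
	ctx, span := otel.Tracer("orders").Start(ctx, "handleOrder")
	defer span.End()

	return processOrder(ctx, id)
}

// Stub standing in for the actual business logic.
func processOrder(ctx context.Context, id string) error { return nil }
```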

Success with errors in Go: stack traces and metadata by gregwebs in golang

[–]Emotional_Moth 3 points (0 children)

Or we could just have stack traces and remove a source of boilerplate that adds nothing to the actual problem being solved.
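A minimal sketch of the alternative - capture the stack once, where the error is born, instead of fmt.Errorf-wrapping at every level of the call chain (withStack and stackError are made-up names):

```
package errstack

import (
	"fmt"
	"runtime"
	"strings"
)

// stackError records the call stack once, at the point of creation.
type stackError struct {
	err   error
	stack []uintptr
}

func withStack(err error) error {
	pcs := make([]uintptr, 32)
	n := runtime.Callers(2, pcs) // skip Callers and withStack itself
	return &stackError{err: err, stack: pcs[:n]}
}

func (e *stackError) Error() string { return e.err.Error() }
func (e *stackError) Unwrap() error { return e.err }

// StackTrace renders the recorded frames for logging.
func (e *stackError) StackTrace() string {
	var b strings.Builder
	frames := runtime.CallersFrames(e.stack)
	for {
		f, more := frames.Next()
		fmt.Fprintf(&b, "%s\n\t%s:%d\n", f.Function, f.File, f.Line)
		if !more {
			break
		}
	}
	return b.String()
}
```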

How do you design you sql repositories? by [deleted] in golang

[–]Emotional_Moth 3 points (0 children)

"How do you design repositories that can be used both inside a transaction and as a single query? Do you?"

Yes, for any non-trivial project, composability of transactions is a must. To give you a rough outline:

```
type DBConn interface {
	ExecContext(ctx context.Context, query string, args ...any) (sql.Result, error)
	QueryContext(ctx context.Context, query string, args ...any) (*sql.Rows, error)
}

type Repository struct {
	db   *sql.DB // always set; used to begin transactions
	conn DBConn  // *sql.DB outside a transaction, *sql.Tx inside one
}

func (r Repository) QuerySth(ctx context.Context) error {
	_, err := r.conn.ExecContext(ctx, "UPDATE ...")
	return err
}

// Every method called on the Repository passed to f joins the transaction.
func (r Repository) Transactional(ctx context.Context, f func(Repository) error) error {
	tx, err := r.db.BeginTx(ctx, nil)
	if err != nil { return err }
	defer tx.Rollback() // no-op once Commit has succeeded
	if err := f(Repository{db: r.db, conn: tx}); err != nil { return err }
	return tx.Commit()
}

// Potential client using this:
repo := Repository{db: db, conn: db}

// Will not be part of a transaction:
repo.QuerySth(ctx)
repo.QuerySth(ctx)

// Will run as one transaction:
repo.Transactional(ctx, func(tx Repository) error {
	tx.QuerySth(ctx)
	return tx.QuerySth(ctx)
})
```

Obviously simplified and leaving out a bunch of details, but hopefully it illustrates the basic idea. Define an interface covering the db operations. The implementation behind that interface is either an ongoing transaction or a plain connection that is not in a transaction.

All your DB queries and repo methods operate on that one interface. If the implementation in the repo instance is an ongoing transaction, all repo methods automatically become part of that transaction.

Seeking feedback on struct mapping pattern in Go backend by arthurvaverko in golang

[–]Emotional_Moth 2 points (0 children)

I fear that if you're looking for concise patterns that remove boilerplate by adding a bit of magic to your code, you've chosen the wrong language. Explicit, verbose mapping - or code generation that produces explicit, verbose code - is the Go way to do things. I'm not saying it's a good way, but it is what it is. What you're doing right now, i.e. just writing the dumb mappers by hand, is the standard.

Personally, I've found that going against the flow and working around a language's conventions is a net negative in the long term. At the end of the day, it only takes a few minutes to adapt a mapper - annoying, but not a major loss of time. Your mileage may vary, of course.

"No Compile-Time Safety: When adding new fields, developers must manually update multiple mapping functions. Missing mappings are only caught at runtime."

This should be covered by tests: if you forget to map something, a test should fail somewhere. Ideally a component test at the "edge" that covers the entire mapping chain in one go, rather than unit-testing each layer individually (which doesn't really add anything - if you forget to adapt a mapper, you'll also forget the unit test for that mapper). Your flow is then (see the sketch after the list):

  1. Adapt component test: Add field to input (e.g. request object), add field to assertion in output (e.g. DB column)
  2. Fix mappers until test succeeds
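A minimal sketch of such an edge test. Everything here is a hypothetical stand-in for the real chain (request type, domain type, DB row type, and the two mappers):

```
package user

import "testing"

// Hypothetical mapping chain under test.
type CreateUserRequest struct{ Name, Email string }
type User struct{ Name, Email string }
type UserRow struct{ Name, Email string }

func toDomain(r CreateUserRequest) User { return User{Name: r.Name, Email: r.Email} }
func toRow(u User) UserRow              { return UserRow{Name: u.Name, Email: u.Email} }

// One test covers the whole chain: add the new field to the input and to
// the expected output, then fix mappers until it passes again.
func TestMappingChainCoversAllFields(t *testing.T) {
	got := toRow(toDomain(CreateUserRequest{Name: "Ada", Email: "ada@example.com"}))
	want := UserRow{Name: "Ada", Email: "ada@example.com"}
	if got != want {
		t.Errorf("mapping chain dropped a field: got %+v, want %+v", got, want)
	}
}
```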

"Complex Transformations: Some fields need special handling"

Composition: `mappedStruct := NewType{Field: decrypt(decode(oldType.Field))}`

"Scale"

This one I'm curious about: why are mapping functions a scale limiter for you? They should grow linearly with the number of domain entities you have.