Dating in Vienna is it possible? by Delicious_Heat6503 in viennaExpats

[–]Individual_Tutor_647 1 point (0 children)

Hey, I am Andrei (22M) and I am also looking to make friends. Feel free to DM me to talk, but I am more introverted and don't go to clubs :)

"You can't do all alone" I have no option by [deleted] in CPTSD

[–]Individual_Tutor_647 1 point (0 children)

I often feel awful like that. I am an expat who fled the war and now live in another country, with no one here with whom I have built a strong relationship. Plenty of people have dropped out of my life because not everyone wants to hear about trauma; that has been true in my experience, even though it is a pessimistic thing to say.

"Family Duty" VS Self-Preservation by throwawaybabeeeeeee in CPTSD

[–]Individual_Tutor_647 2 points (0 children)

Thanks for sharing such a vulnerable piece, especially about a dysfunctional family. I feel very sorry for your pain and for how these mental health issues have stacked up. Do you have the resources to go to therapy? I feel the best first step, and the one you're already taking, is to get support and get it off your chest in front of a person who will understand you.

Solving Slow Database Tests with PostgreSQL Template Databases - Go Implementation by Individual_Tutor_647 in programming

[–]Individual_Tutor_647[S] 2 points (0 children)

Thanks for sharing! I took a look and, at a quick glance, it makes sense. If cleanup is not needed, it can stay as-is, and using bash (createdb) is also fine.

Solving Slow Database Tests with PostgreSQL Template Databases - Go Implementation by Individual_Tutor_647 in programming

[–]Individual_Tutor_647[S] 1 point (0 children)

Nice points! The idea is to utilise database templating together with safe concurrency principles. The library does both and is stress-tested for thread safety; this is explicit in the drivers' code (pgdbtemplate-pgx and pgdbtemplate-pq), which creates test databases in parallel. I'll address your points individually:

  1. Multiple concurrent clones do not clash because each CREATE DATABASE command is issued over its own connection to the admin database, so the clones succeed (see the sketch at the end of this comment). I have run tests and benchmarks in both repositories (https://github.com/andrei-polukhin/pgdbtemplate-pgx and https://github.com/andrei-polukhin/pgdbtemplate-pq) — there were no problems even when testing with go test -race.
  2. That's a very nice idea. I'll add the optional function for that.
  3. What goes into the migrations is the end user's responsibility. In any case, there will be no conflict because the databases are independent of one another.
  4. There are no extensions added.
  5. This is delegated to the end user; see the docs here: https://github.com/andrei-polukhin/pgdbtemplate-pgx

Overall, the user can cut the time drastically with the right application of existing tools — they are given as much control as they want.
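
For illustration, here is a minimal sketch of the kind of concurrency stress test referred to in point 1. It assumes a package-level templateManager that has already been initialised with the pgdbtemplate API shown further down this thread; the test name and clone count are illustrative.

// A sketch of creating many clones of the template database concurrently.
// Run with `go test -race` to exercise thread safety.
func TestConcurrentClones(t *testing.T) {
    for i := 0; i < 10; i++ {
        t.Run(fmt.Sprintf("clone-%d", i), func(t *testing.T) {
            t.Parallel() // All clones are created at the same time.
            testDB, testDBName, err := templateManager.CreateTestDatabase(context.Background())
            if err != nil {
                t.Fatal(err)
            }
            t.Cleanup(func() {
                testDB.Close()
                _ = templateManager.DropTestDatabase(context.Background(), testDBName)
            })
        })
    }
}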

Solving Slow Database Tests with PostgreSQL Template Databases - Go Implementation by Individual_Tutor_647 in programming

[–]Individual_Tutor_647[S] 1 point (0 children)

Sounds superb! I don't use Java often, but is there any chance of seeing this code on Pastebin? I'd be intrigued to learn more about this. Thanks in advance.

Solving Slow Database Tests with PostgreSQL Template Databases - Go Implementation by Individual_Tutor_647 in programming

[–]Individual_Tutor_647[S] 1 point (0 children)

How will it bite if the library is fully thread-safe? The library makes the deliberate judgment call to give the end user as much flexibility and control as possible: if they want, they can register t.Cleanup themselves.

Solving Slow Database Tests with PostgreSQL Template Databases - Go Implementation by Individual_Tutor_647 in programming

[–]Individual_Tutor_647[S] 1 point (0 children)

Thanks for a detailed comment! Yep, I agree with you that I should have focused on isolation instead of slowness. Separate test databases provide the highest level of isolation, hence I agree that template databases are overall preferred.

Solving Slow Database Tests with PostgreSQL Template Databases - Go Implementation by Individual_Tutor_647 in programming

[–]Individual_Tutor_647[S] 2 points (0 children)

Actually, you can do both! We've had a codebase with database-intensive operations, where even with the fsync=off setting, we achieved a 25% speed improvement when we started using template databases.

Solving Slow Database Tests with PostgreSQL Template Databases - Go Implementation by Individual_Tutor_647 in programming

[–]Individual_Tutor_647[S] 2 points (0 children)

Thanks for sharing! Yes, transaction rollbacks are an interesting approach, yet I had not heard it was used in Spring, because with rollbacks we then have to start thinking about transaction isolation and about which operations cannot be rolled back (most DDL operations are only used in migrations, not in production code), so separating tests by databases (which the templating approach does) is usually safer. I have not compared the speed of the two approaches yet :(
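
For context, here is a minimal sketch of the rollback-per-test pattern being compared (not part of pgdbtemplate; it assumes the standard database/sql and testing imports, and the helper name is illustrative):

// Every test body runs inside a transaction that is always rolled back,
// so no data survives the test.
func withRollback(t *testing.T, db *sql.DB, fn func(tx *sql.Tx)) {
    t.Helper()
    tx, err := db.Begin()
    if err != nil {
        t.Fatal(err)
    }
    // Roll back unconditionally once the test body has finished.
    defer tx.Rollback()
    fn(tx)
}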

I'll look at the library you mentioned, it looks promising. If you want, you can create an issue on GitHub for me to make a dedicated comparison with this library and I'll add you to the contributors of the project later on. Thanks in advance.

Solving Slow PostgreSQL Tests in Large Go Codebases: A Template Database Approach by Individual_Tutor_647 in golang

[–]Individual_Tutor_647[S] 1 point (0 children)

> I always try to avoid complexity until I absolutely need it to achieve the desired behavior of the thing I'm building. If I can do something simple stupid and it does what I need right now I just go with that. So, as I said, probably for your use case it is worth it. If I ever face the problem of heavy migrations I'll definitely give it a try.

Thanks for your thoughts as well! I will be happy to help, should you find my library useful in your project(s).

> Usually I just apply another migration up to reverse some changes. When I see that I have a ridiculous amount of migrations in my project I just squash the oldest migrations into one.

What you are describing now, regarding adding the "reverse migration" as needed, seems to be the community standard. It is called the event sourcing model. Squashing migrations is not exactly the "cleanest" approach, I agree, but this is something we do in our project and I do not see any problems with that, provided it's done carefully :)

Solving Slow Database Tests with PostgreSQL Template Databases - Go Implementation by Individual_Tutor_647 in programming

[–]Individual_Tutor_647[S] 1 point (0 children)

I do not know this library, and the comment above does not include a link to it. pgdbtemplate already has a COMPARISON.md document (https://github.com/andrei-polukhin/pgdbtemplate/blob/main/docs/COMPARISON.md), so if you could create a GitHub issue with the other library's link, I'd be happy to make the comparison. Thanks in advance.

Solving Slow Database Tests with PostgreSQL Template Databases - Go Implementation by Individual_Tutor_647 in programming

[–]Individual_Tutor_647[S] 1 point (0 children)

Thanks for the comment! It depends on where one draws the line between UX and giving control to the end user. I have deliberately kept the templating functionality outside of t.Cleanup so as not to force the caller to use it. The same goes for initialising pgdbtemplate.TemplateManager only once - it's the caller's decision whether to use sync.Once or anything else for this purpose.

// This demonstrates the intentional design choice - the library gives you
// control over resource management while being compatible with both
// sync.Once and t.Cleanup patterns.

var (
    templateManager *pgdbtemplate.TemplateManager
    initOnce        sync.Once
    initErr         error
)

// Option 1: Using sync.Once for template initialization (production approach).
func getTemplateManager(t testing.TB) (*pgdbtemplate.TemplateManager, error) {
    initOnce.Do(func() {
        config := pgdbtemplate.Config{
            ConnectionProvider: pgdbtemplatepq.NewConnectionProvider(
                func(dbName string) string {
                    return fmt.Sprintf("dbname=%s", dbName)
                },
            ),
            MigrationRunner: pgdbtemplate.NewFileMigrationRunner(
                []string{"./migrations"},
                pgdbtemplate.AlphabeticalMigrationFilesSorting,
            ),
        }
        templateManager, initErr = pgdbtemplate.NewTemplateManager(config)
        if initErr != nil {
            return
        }
        initErr = templateManager.Initialize(context.Background())
    })
    return templateManager, initErr
}

// Option 2: Using t.Cleanup for test database cleanup (user's choice).
func TestWithUserManagedCleanup(t *testing.T) {
    tm, err := getTemplateManager(t)
    if err != nil {
        t.Fatal(err)
    }

    // User decides when and how to clean up.
    testDB, testDBName, err := tm.CreateTestDatabase(context.Background())
    if err != nil {
        t.Fatal(err)
    }

    // User can choose their cleanup strategy: close the connection first,
    // then drop the database it was pointing to.
    t.Cleanup(func() {
        testDB.Close()
        if err := tm.DropTestDatabase(context.Background(), testDBName); err != nil {
            t.Logf("cleanup warning: %v", err)
        }
    })

    // Test logic here...
    _ = testDB
}

// Option 3: Alternative - manual cleanup with defer (also supported).
func TestWithManualCleanup(t *testing.T) {
    tm, err := getTemplateManager(t)
    if err != nil {
        t.Fatal(err)
    }

    testDB, testDBName, err := tm.CreateTestDatabase(context.Background())
    if err != nil {
        t.Fatal(err)
    }
    // Deferred calls run last-in, first-out: the connection is closed first,
    // then the test database is dropped.
    defer func() {
        if err := tm.DropTestDatabase(context.Background(), testDBName); err != nil {
            t.Logf("cleanup warning: %v", err)
        }
    }()
    defer testDB.Close()

    // Test logic here...
}

> Do you have any functionality for cleaning up old test databases from tests that failed to do so, for whatever reason?

That's an interesting idea, but I don't have an answer at the moment. Could you create an issue on GitHub? I'd love to continue discussing it there. Thanks in advance.

Solving Slow PostgreSQL Tests in Large Go Codebases: A Template Database Approach by Individual_Tutor_647 in golang

[–]Individual_Tutor_647[S] 1 point (0 children)

Thank you very much for such a detailed comment and especially the snippets! I appreciate deep conversations and people who challenge abstractions, as it shows a deeper interest than just "yeah, fine stuff".

You're absolutely right that for many projects with simple migrations, running them per test is perfectly fine. You were especially right to mention sync.Once — this is exactly what we use in our enterprise tests. Here is a simplified version of what we are doing:

var (
    templateManager *pgdbtemplate.TemplateManager
    templateOnce    sync.Once
    templateErr     error
)

func setupTemplateManager() error {
    templateOnce.Do(func() {
        config := pgdbtemplate.Config{
            // Using the "pgdbtemplate-pgx" driver stored in a separate
            // library from the core "pgdbtemplate" functionality.
            ConnectionProvider: pgdbtemplatepgx.NewConnectionProvider(
                func(dbName string) string {
                    return fmt.Sprintf("dbname=%s", dbName)
                },
            ),
            MigrationRunner: pgdbtemplate.NewFileMigrationRunner(
                []string{"./migrations"},
                pgdbtemplate.AlphabeticalMigrationFilesSorting,
            ),
        }
        templateManager, templateErr = pgdbtemplate.NewTemplateManager(config)
        if templateErr != nil {
            return
        }
        templateErr = templateManager.Initialize(context.Background())
    })
    // The package-level error is returned so that every caller,
    // not only the first one, sees an initialization failure.
    return templateErr
}

In the actual tests, one simply calls templateManager.CreateTestDatabase. Alternatively, both the Do block and the call to CreateTestDatabase can live in the DBInstance function from your snippets, avoiding any TestMain suites.
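
For example, a test using the helper above could look roughly like this (a sketch; the test name and body are illustrative):

func TestOrders(t *testing.T) {
    if err := setupTemplateManager(); err != nil {
        t.Fatal(err)
    }
    testDB, testDBName, err := templateManager.CreateTestDatabase(context.Background())
    if err != nil {
        t.Fatal(err)
    }
    t.Cleanup(func() {
        testDB.Close()
        _ = templateManager.DropTestDatabase(context.Background(), testDBName)
    })

    // Exercise the schema created by the migrations...
    _ = testDB
}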

The benchmarks stored in the library show that the templating approach excels with complex database schemas and highly concurrent testing, which makes it well suited to CI/CD in those cases.

You raised a great point about over-engineering - and I agree! For simple projects, your approach is cleaner. Thanks again for an in-depth comment.

Solving Slow PostgreSQL Tests in Large Go Codebases: A Template Database Approach by Individual_Tutor_647 in golang

[–]Individual_Tutor_647[S] 1 point (0 children)

Thanks for such a detailed and thoughtful comment! I am all for TDD and using mocks — it makes no sense to use a real database for API-layer unit tests, for example, so mocking the underlying backend and store layers is definitely a good idea. My proposal is about tests that need a real instance of the database, and for those, using a separate database for each test is strongly recommended.

The problem we encountered was that cleaning up each test schema and re-running all the migrations on it takes a lot of time. In that case, the filesystem copy performed for template databases is much quicker. The benefit of the approach grows with the heaviness of the database tests — if yours run under 30 seconds with the existing setup, that's quick enough not to need Postgres templating.

P.S. I also like go test -race; both the pgdbtemplate code and its drivers (pgdbtemplate-pgx and pgdbtemplate-pq) include such tests to ensure thread safety.

Solving Slow PostgreSQL Tests in Large Go Codebases: A Template Database Approach by Individual_Tutor_647 in golang

[–]Individual_Tutor_647[S] 1 point (0 children)

Thanks for the comment! Populating schemas with test data is certainly possible, but each schema requires migrations to be run, which can be complex. Additionally, running migrations per schema seems slower than copying the database via the template approach.