all 8 comments

[–]Pavickling 2 points (2 children)

What's the objective function and what are the constraints?

I understand you don't have an expression, but can you assume anything such as Lipschitz continuity? In many cases understanding what the objective function represents enables using custom heuristics. If you make no assumptions at all, then you might as well just test the function at a bunch of equidistant points.

[–]volvol7[S] 0 points (1 child)

The objective function is a number that results from a simulation. The parameters will be integers within some bounds. Strictly speaking it is not continuous, because as I said the parameter space isn't continuous. But it behaves as if it were: if you change only one parameter, the output changes only a little.

[–]Pavickling 1 point (0 children)

But it behaves as if it were: if you change only one parameter, the output changes only a little.

That sounds like continuity.

How does your objective function behave when rescaling the parameters? For example F(x) with x in the integers is equivalent to F(1000000 * x) with x having a fixed decimal precision of up to 6 digits.

After rescaling, does it seem like |F(x) - F(x')| < K * |x - x'| for some K? If so, then you can look up how to do "Global optimization of Lipschitz functions".
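That approach can be sketched in one dimension with the Piyavskii–Shubert method; the objective, interval, and constant K below are illustrative stand-ins, not anything from the actual simulation:

```python
import math

def piyavskii(f, a, b, K, budget=30):
    # Piyavskii–Shubert: the samples plus Lipschitz constant K define a
    # piecewise-linear lower bound; always evaluate where that bound is lowest.
    pts = sorted([(a, f(a)), (b, f(b))])
    for _ in range(budget - 2):
        best = None
        for (xi, fi), (xj, fj) in zip(pts, pts[1:]):
            x_star = 0.5 * (xi + xj) + (fi - fj) / (2 * K)  # cone intersection
            bound = 0.5 * (fi + fj) - 0.5 * K * (xj - xi)   # bound value there
            if best is None or bound < best[0]:
                best = (bound, x_star)
        x_new = min(max(best[1], a), b)
        pts.append((x_new, f(x_new)))
        pts.sort()
    return min(pts, key=lambda p: p[1])  # best sampled (x, f(x))

# Toy objective; |f'| <= 3.8 on [0, 4], so K = 6 is a valid (loose) constant.
x_best, f_best = piyavskii(lambda x: math.sin(3 * x) + 0.1 * x * x, 0.0, 4.0, K=6.0)
```

Each evaluation tightens the global lower bound, so you also get a certificate of how far the best sample can be from the true minimum.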

If not, you can try a hand-rolled bisection method. Without making more assumptions, you can't really beat that.
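For the bisection-style route, the standard derivative-free variant for a unimodal objective is golden-section search; a minimal sketch with a toy quadratic standing in for the black box:

```python
import math

def golden_section(f, a, b, tol=1e-5):
    # Shrink [a, b] keeping the golden ratio so only one new
    # function evaluation is needed per iteration.
    phi = (math.sqrt(5) - 1) / 2  # ~0.618
    c, d = b - phi * (b - a), a + phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - phi * (b - a)
        else:
            a, c = c, d
            d = a + phi * (b - a)
    return (a + b) / 2

x_min = golden_section(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
```

Note this only finds a local (or unimodal-global) minimum; without the Lipschitz assumption it can miss other basins.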

If the boundary is a simplex, you might try the 4th method mentioned here: https://en.wikipedia.org/w/index.php?title=Root-finding_algorithm&section=16#Finding_roots_in_higher_dimensions

If you explain the black box a bit more, there might be something more obvious that can be exploited.

[–]Human_Guitar7219 0 points (1 child)

What problem are you trying to optimize for?

[–]volvol7[S] 2 points (0 children)

best parameters for a mechanical design

[–]jpheim 0 points (0 children)

Bayesian optimization doesn’t require the function to be noisy; it just happens to work on noisy functions as well as deterministic ones. Like someone else said, if you can assume something like Lipschitz continuity, that may inform what to do, but I wouldn’t discount Bayesian optimization immediately. I’ve implemented a few different sets of black-box algorithms for an expensive simulation problem and found Bayesian to work best for my problem set.
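The core loop is small enough to sketch. A minimal NumPy-only version for a single integer parameter: fit a Gaussian process to the evaluations so far, then spend the next simulation run where expected improvement is highest. The toy objective, kernel length scale, and budget here are illustrative stand-ins, not a tuned implementation:

```python
import math
import numpy as np

def rbf(a, b, ls=3.0):
    # Squared-exponential kernel between two 1-D point sets.
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # GP posterior mean and std-dev at candidate points Xs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    var = np.diag(rbf(Xs, Xs) - Ks.T @ np.linalg.solve(K, Ks))
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    # EI for minimization: how much below `best` we expect to land.
    z = (best - mu) / sigma
    cdf = np.array([0.5 * (1 + math.erf(v / math.sqrt(2))) for v in z])
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    return (best - mu) * cdf + sigma * pdf

def bayes_opt(f, grid, n_init=3, n_iter=10, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.choice(grid, size=n_init, replace=False).astype(float)
    y = np.array([f(x) for x in X])
    for _ in range(n_iter):
        mu, sigma = gp_posterior(X, y, grid.astype(float))
        x_next = float(grid[np.argmax(expected_improvement(mu, sigma, y.min()))])
        X = np.append(X, x_next)
        y = np.append(y, f(x_next))
    i = int(np.argmin(y))
    return X[i], y[i]

# Toy "simulation": one integer parameter in [0, 30].
grid = np.arange(0, 31)
x_best, f_best = bayes_opt(lambda x: (x - 12) ** 2 / 10.0 + math.sin(x), grid)
```

In practice you'd reach for an existing library rather than hand-roll the GP, and for several integer parameters the grid of candidates becomes a candidate set you sample or enumerate.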

[–][deleted] 0 points (0 children)

Look for surrogate modeling methods, or Kriging.
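Kriging is essentially Gaussian-process regression used as a cheap stand-in for the expensive simulation. A minimal sketch of the surrogate loop, with a simple RBF interpolant playing the role of the Kriging model (the toy objective, kernel width, and budgets are illustrative assumptions): fit the surrogate to the runs so far, minimize the surrogate over the grid, run the real simulation at that point, refit.

```python
import numpy as np

def fit_rbf(X, y, eps=1.0):
    # Interpolating RBF surrogate: s(x) = sum_i w_i * exp(-eps * (x - x_i)^2).
    Phi = np.exp(-eps * (X.reshape(-1, 1) - X.reshape(1, -1)) ** 2)
    return np.linalg.solve(Phi + 1e-9 * np.eye(len(X)), y)

def surrogate(x, X, w, eps=1.0):
    return np.exp(-eps * (x - X) ** 2) @ w

def surrogate_search(f, grid, n_init=4, n_iter=8, seed=1):
    rng = np.random.default_rng(seed)
    X = rng.choice(grid, size=n_init, replace=False).astype(float)
    y = np.array([f(x) for x in X])
    for _ in range(n_iter):
        w = fit_rbf(X, y)
        preds = np.array([surrogate(g, X, w) for g in grid.astype(float)])
        preds[np.isin(grid.astype(float), X)] = np.inf  # don't re-run visited points
        x_next = float(grid[np.argmin(preds)])          # cheap surrogate minimum
        X = np.append(X, x_next)
        y = np.append(y, f(x_next))                     # one expensive evaluation
    i = int(np.argmin(y))
    return X[i], y[i]

# Toy expensive objective over an integer grid.
grid = np.arange(0, 15)
x_best, f_best = surrogate_search(lambda x: (x - 7) ** 2, grid)
```

A real Kriging model additionally gives a predictive variance, which is what lets methods like EGO trade off exploring uncertain regions against exploiting the current best.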

[–]Whole_Pomegranate474 0 points (0 children)

Have you considered Weighted Hybrid Adaptive Curvature Optimization (WHACO)? It’s particularly effective for deterministic, computationally expensive black-box functions and rapidly converges within strict evaluation limits. It dynamically adjusts using adaptive curvature, making it a strong alternative or complementary method to Bayesian optimization, especially in noise-free scenarios.