Literature to look at
This issue is a "to-do list" of literature suggestions people have made.
- Ruppert's algorithm as suggested here
- Wavelet compression as suggested here
- compare performance with hyperopt or GPyOpt, as suggested here
- Importance Sampling and Information-Directed Sampling see this suggestion
- Kriging see this suggestion
- Bayesian Inference, NUTS, and Hamiltonian Monte Carlo see this suggestion
- cast the custom loss function that you're using for sampling into the variance function and the kernel. This might be a step towards demonstrating a type of optimality. Using something like the KL divergence would give you results like you'd get from kriging see this suggestion
- ZBrush's Sculptris Pro feature generates new polygons on a 3D model see this suggestion
- there is a similar program called MultiNest that is an adaptive MC sampler see this suggestion
- Quad trees see this suggestion
It's probably interesting to compare some of these methods to Adaptive for the paper.
> Ruppert's algorithm

Looks interesting. Does anyone know how this would differ from Bowyer-Watson?
AFAICT most other suggestions require global information, rather than the local information that adaptive uses (AFAIK all Bayesian inference works this way at least). Anton pointed this out on the Reddit thread.
> AFAICT most other suggestions require global information, rather than the local information
Which is not to say that we should not implement them, of course. It would be a good test of the abstractions to see what globally-updating algorithms would look like (I don't think that there should be too many pain points)
> Quad trees see this suggestion
I think this might be good if derivatives need to be computed, as the Redditor who posted this suggests. I think a quadtree should be able to reach an arbitrary point in parameter space in O(log D) evaluations, or something like that, where D is the dimension of the space. This means we don't have to worry too much about being restricted to a square grid (responding to Bas' point in the Reddit thread).
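To make the lookup-cost argument concrete, here is a minimal point-region quadtree sketch in plain Python (illustrative only; the class and method names are made up, not adaptive's API). Locating the leaf that contains a query point costs one quadrant comparison per level, so it scales with tree depth rather than with the total number of stored points.

```python
# A minimal point-region quadtree sketch (illustrative, not adaptive's API).
# Each node covers a square; inserting a second point into a leaf splits it
# into four quadrants. Locating the leaf containing a point then takes one
# comparison per level, i.e. O(depth) steps.

class QuadTree:
    def __init__(self, x, y, size):
        self.x, self.y, self.size = x, y, size  # lower-left corner, side length
        self.point = None       # stored point (leaves only)
        self.children = None    # four sub-quadrants (internal nodes only)

    def _child_for(self, px, py):
        half = self.size / 2
        # Quadrant index: bit 0 = right half, bit 1 = upper half.
        return self.children[(px >= self.x + half) + 2 * (py >= self.y + half)]

    def _split(self):
        half = self.size / 2
        self.children = [
            QuadTree(self.x, self.y, half),
            QuadTree(self.x + half, self.y, half),
            QuadTree(self.x, self.y + half, half),
            QuadTree(self.x + half, self.y + half, half),
        ]

    def insert(self, px, py):
        # Assumes distinct points; duplicates would recurse forever.
        if self.children is not None:
            self._child_for(px, py).insert(px, py)
        elif self.point is None:
            self.point = (px, py)
        else:
            old, self.point = self.point, None
            self._split()
            self._child_for(*old).insert(*old)
            self._child_for(px, py).insert(px, py)

    def leaf_for(self, px, py, depth=0):
        """Descend to the leaf containing (px, py); return (leaf, depth)."""
        if self.children is None:
            return self, depth
        return self._child_for(px, py).leaf_for(px, py, depth + 1)
```

Note the depth (and hence the lookup cost) depends on how tightly points cluster, not on how many there are in total.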
> Ruppert's algorithm
>
> Looks interesting. Does anyone know how this would differ from Bowyer-Watson?
If I am correct, Ruppert's algorithm is not about finding a triangulation for a given set of points, but rather about finding a mesh in which some polygon you inserted is embedded (something like a constrained Delaunay triangulation).
The goal of the Bowyer-Watson algorithm, in contrast, is just to triangulate a given set of points without adding new ones.
However, the algorithm is very similar to the point-choosing algorithm of the LearnerND:
- if the circumcenter is inside the simplex, add the circumcenter
- otherwise add the midpoint of the longest edge (similar to but slightly different from Ruppert's algorithm)
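The two-step rule above can be sketched in plain Python for the 2-D case (a hedged sketch with made-up function names, not the actual LearnerND code; triangles stand in for general simplices):

```python
# Sketch of the point-choosing rule described above (2-D case): return the
# circumcenter if it lies inside the triangle, otherwise the midpoint of the
# longest edge. Not the actual LearnerND implementation.

def circumcenter(a, b, c):
    (ax, ay), (bx, by), (cx, cy) = a, b, c
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy)

def inside(p, a, b, c):
    # Barycentric sign test: p is inside iff all three cross products
    # have the same sign (boundary points count as inside).
    def cross(p, q, r):
        return (p[0] - r[0]) * (q[1] - r[1]) - (q[0] - r[0]) * (p[1] - r[1])
    d1, d2, d3 = cross(p, a, b), cross(p, b, c), cross(p, c, a)
    return not (min(d1, d2, d3) < 0 and max(d1, d2, d3) > 0)

def choose_point(a, b, c):
    cc = circumcenter(a, b, c)
    if inside(cc, a, b, c):
        return cc
    # Circumcenter fell outside: bisect the longest edge instead.
    edges = [(a, b), (b, c), (c, a)]
    p, q = max(edges, key=lambda e: (e[0][0] - e[1][0])**2
                                    + (e[0][1] - e[1][1])**2)
    return ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)
```

For an acute triangle the circumcenter is returned; for an obtuse triangle the circumcenter falls outside the triangle, so the longest edge is bisected instead.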