Replies: 1 comment
-
Hello Angus, I have used CDT for creating TINs quite a bit, and the performance should be sufficient for input of that size. The best way to find out would be to benchmark CDT directly on your use cases. Please share your findings if you do :)

Recent performance improvements addressed corner cases where the vertex ordering was not optimal for the algorithm (huge speedups there) and sped up other cases by a good factor by using a more efficient data structure for the nearest-point query.

As for the constraint edges: they should not be an issue for this use case. Most of the time an edge will already be present in the triangulation, and inserting it is very cheap (one hash-table lookup); if not, the insertion is still not too costly.
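As a rough illustration, a minimal benchmarking sketch might look like the following. It assumes the public CDT C++ API as shown in the project README (CDT::Triangulation, insertVertices, insertEdges, eraseSuperTriangle) and uses a synthetic grid of points with row-wise constraint edges as placeholder input, not real contour data:

```cpp
#include <CDT.h>   // header-only CDT library
#include <chrono>
#include <cstddef>
#include <iostream>
#include <vector>

int main()
{
    // Placeholder input: a 1000 x 100 grid of points with constraint
    // edges between consecutive points in each row, standing in for
    // real contour vertices and poly-lines.
    std::vector<CDT::V2d<double>> points;
    std::vector<CDT::Edge> edges;
    const std::size_t n = 100000;
    for(std::size_t i = 0; i < n; ++i)
    {
        points.push_back(CDT::V2d<double>::make(
            static_cast<double>(i % 1000), static_cast<double>(i / 1000)));
        if(i % 1000 != 0) // connect to the previous point in the same row
            edges.push_back(CDT::Edge(CDT::VertInd(i - 1), CDT::VertInd(i)));
    }

    const auto start = std::chrono::steady_clock::now();
    CDT::Triangulation<double> cdt;
    cdt.insertVertices(points); // bulk vertex insertion
    cdt.insertEdges(edges);     // enforce constraint edges
    cdt.eraseSuperTriangle();   // remove the helper super-triangle
    const auto stop = std::chrono::steady_clock::now();

    std::cout << "Triangles: " << cdt.triangles.size() << ", time: "
              << std::chrono::duration<double>(stop - start).count()
              << " s\n";
    return 0;
}
```

Replacing the synthetic grid with your actual contour vertices and edges would give numbers that are directly representative of your use case.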
-
Hello, just hoping to get some guidance regarding the performance of CDT for reasonably sized data sets.
I need to find an alternative library to Triangle for use in a commercial application. CDT is looking like a great option so far.
I intend to use CDT for TINning 3D contour data collected from real terrain, to create meshes that will be rendered in a simulation.
This contour data can consist of over 20,000 poly-lines and 100,000 vertices. All vertices belong to poly-lines, so the total number of edges roughly equals the number of vertices.
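For context, the input I would feed to the triangulation would be built roughly like the following sketch (the Polyline type is a made-up stand-in for my contour data; each consecutive vertex pair in a poly-line becomes one constraint edge, which is why edges roughly equal vertices):

```cpp
#include <CDT.h>
#include <cstddef>
#include <vector>

// Made-up stand-in for one contour poly-line (2D positions only;
// elevations would be carried separately for the final mesh).
struct Polyline
{
    std::vector<CDT::V2d<double>> pts;
};

// Flatten poly-lines into the vertex and constraint-edge lists that a
// constrained triangulation expects. Each poly-line with k vertices
// contributes k vertices and k-1 edges, so edges ~= vertices overall.
void toTriangulationInput(
    const std::vector<Polyline>& contours,
    std::vector<CDT::V2d<double>>& vertsOut,
    std::vector<CDT::Edge>& edgesOut)
{
    for(const Polyline& pl : contours)
    {
        const std::size_t base = vertsOut.size();
        vertsOut.insert(vertsOut.end(), pl.pts.begin(), pl.pts.end());
        for(std::size_t i = 1; i < pl.pts.size(); ++i)
        {
            edgesOut.push_back(CDT::Edge(
                CDT::VertInd(base + i - 1), CDT::VertInd(base + i)));
        }
    }
}
```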
I read through issue #40, but I couldn't determine what the results of your improvements were. It would be great to know how much things have improved.
From the benchmarks provided before your improvements, it looked like CDT took between 1 s and 8.5 s for 100,000 vertices depending on the vertex distribution; this would be acceptable for my use case. It looks to me as though those benchmarks were taken without any constraint edges, however, so I am unsure how representative they will be of how I intend to use the library. How do the algorithms scale with the number of edges?
Cheers.