Point density - FIS #388
Have you looked into Whitebox Tools?
https://jblindsay.github.io/ghrg/WhiteboxTools/index.html
It has a point density tool (look within the "Lidar" suite of tools). I've personally only used WBT on raster data, but have found it straightforward, robust, and very well documented. It can be accessed via cmd, Python, QGIS, and it even has its own GUI. The maintainers are active and quite responsive. The tools for point clouds look quite good for freeware.
Will
Thanks Will (@fluviotect) for chiming in on this. Awesome to see community supporting community (especially when we're so slow). I didn't know about Whitebox; that looks promising.
An older workflow for this question is to use ToPCAT, as described here. ToPCAT is now part of TAT and is built specifically for dealing with large point clouds; it gives you a workflow to get the data into GCD in a way Arc can handle. You may also have better luck using stand-alone GCD, as it is 64-bit, but the simple point density algorithm in there is nowhere near as efficient as what ToPCAT does. Point count, which can be divided by cell area to get density, is a byproduct of ToPCAT's standard output and is what we used in all our point-cloud-derived FIS. That said, Whitebox could likely do it, as can PySESA with more sophistication. Another option …
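The point-count-divided-by-cell-area idea above can be sketched in a few lines (a minimal illustration of the approach, not ToPCAT's actual implementation; the 1 cm cell size just mirrors the raster grid mentioned in the question):

```python
# Sketch: bin XY points into square cells, then divide the per-cell count
# by the cell area to get point density. Illustrative only, not ToPCAT code.
from collections import Counter

def point_density(points, cell_size):
    """Return density (points per unit area) keyed by (col, row) cell index."""
    counts = Counter(
        (int(x // cell_size), int(y // cell_size)) for x, y in points
    )
    area = cell_size * cell_size
    return {cell: n / area for cell, n in counts.items()}

# Four points falling in one 1 cm cell -> 4 / 0.0001 m^2 = 40000 points/m^2
pts = [(0.002, 0.003), (0.004, 0.009), (0.001, 0.001), (0.008, 0.005)]
density = point_density(pts, 0.01)
print(density[(0, 0)])  # 40000.0
```

The density dictionary can then be written out as a gridded raster at the same cell size as the DEM, which is the form GCD's FIS inputs expect.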
Happy to share Joe! I had a feeling you'd dig Whitebox, especially the non-proprietary GIS processing ;-). Be advised, not all tools are available through all interfaces, so if you're accessing something through Python and either getting an error or a weird result, the maintainer probably just hasn't gotten around to porting it to Python yet. John is pretty good about porting tools over to the Python API if requested through GitHub.
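On the side issue of the exported .txt point file being too large to open: a generic workaround (not a GCD or CloudCompare feature; the filenames and keep-rate below are placeholders) is to stream the file line by line and keep every Nth point, which never loads the whole file into memory:

```python
# Hedged sketch: thin a huge XYZ text export by streaming it and keeping
# every `keep_every`-th line. Filenames below are placeholders.
def thin_xyz(src, dst, keep_every=100):
    """Copy every `keep_every`-th line from src to dst; return lines kept."""
    kept = 0
    with open(src) as fin, open(dst, "w") as fout:
        for i, line in enumerate(fin):
            if i % keep_every == 0:
                fout.write(line)
                kept += 1
    return kept

# e.g. thin_xyz("huge_cloud.txt", "thinned_cloud.txt", keep_every=100)
```

Note that thinning changes the point density itself, so a thinned cloud is only suitable for inspection or for getting the file into Arc, not for computing the density raster the FIS needs.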
I'm currently working with GCD 7.4.4.0 for my thesis on a hydraulic physical model and it is amazing. Unfortunately, I cannot solve a problem with the point density when calculating a non-uniform error surface with the FIS error model (TS_ZError_PD_SLPdeg).
I have a point cloud with a high density (1.6 mm at a distance of 10 m) and I process this data in CloudCompare.
From there I export a raster with a grid size of 1 cm to import into ArcGIS and GCD.
For my thesis I need to calculate change detection between two scans. Therefore, I would like to calculate an error surface for each scan. Everything works fine with the calculation of the slope degrees for the FIS; only the point density doesn't work.
I cannot simply calculate it in GCD because the original point cloud is too big to import into ArcGIS.
There is a way to export a point density file from CloudCompare as well, but unfortunately the error calculation with this file doesn't work properly.
Another problem is that the exported .txt file with the point density cannot be opened in a text editor because it is too big. It seems to me that it just stores the coordinate information of every single point.
The result after the change detection calculation is mostly NoData.
Is there a way to calculate the point density from the scan information (1.6 mm at a distance of 10 m), or does anyone know how to export a point density file from CloudCompare that works for GCD?
Thank you very much in advance!