Hello,
From what I understand, we can now query databases directly. However, many users have their data in files, or in databases that are not optimized for OLAP at all (think PostgreSQL, MySQL, MariaDB, etc.).
The idea would be to:
- extract the data from this "slow" data source;
- write it to a "fast" internal data source (DuckDB would be a great fit for this);
- allow scheduling the refresh and monitoring the refresh task.
Best regards,
Simon
Thanks again for this suggestion; it looks interesting, since from what I see we can query DuckDB in SQL.
CSV/Excel imports are already planned, with asynchronous import for big files.
That said, since we will include support for DuckDB, I can directly see where I could use this. I need to test it with a real use case first.