Auto Materialization of source tables #2410
-
Should likely limit the content as well, or this has the potential to blow up on huge tables, or cost a lot. If you use DataGrip or some other editor, they usually apply a fixed limit (500 rows or so) behind the scenes automatically when you run queries. You could have a default limit that is override-able.
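A rough sketch of how an override-able default could look in connection.yaml; the default_limit and limit keys here are made up purely to illustrate the idea:

```yaml
# connection.yaml -- hypothetical keys, sketching an override-able default limit
default_limit: 500      # applied to any table that does not set its own limit
tables:
  - orders              # materialized with the 500-row default
  - name: reviews
    limit: 100000       # per-table override of the default
```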
-
Agree that this could be dangerous and we should think about how not to destroy people's DBs. However, automatic (unspecified) limits can also be confusing and lead to errors in analysis: it's easy to miss dropped rows if the table had 505 rows and the last 5 were silently excluded. An alternative would be to refuse to run tables above x rows (e.g. x = 100,000) unless the user specifies limit=0 (or rows=all or similar), and throw a warning in that case?
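For illustration, with the same made-up keys as above, the explicit escape hatch could look like:

```yaml
# Hypothetical: tables above the row threshold are refused with a warning
# unless the user explicitly lifts the limit
tables:
  - orders              # refused (with a warning) if it exceeds 100,000 rows
  - name: reviews
    limit: 0            # explicit "materialize every row, I accept the cost"
```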
-
Although, now I think about it, that would require a count(*) on the table, which (depending on the DBMS) can be an expensive operation.
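If a pre-flight size check were added, there are cheaper options than a full count(*) on some engines. The two Postgres-flavoured sketches below read the planner's estimate from the catalog, or cap the scan at the threshold, so the check stays cheap even on huge tables:

```sql
-- Approximate row count from planner statistics (cheap, but only as fresh
-- as the last ANALYZE / autovacuum run)
select reltuples::bigint as approx_rows
from pg_class
where relname = 'orders';

-- Bounded count: scans at most 100,001 rows, so the cost is capped;
-- a result of 100001 means "over the threshold"
select count(*) as bounded_count
from (select 1 from orders limit 100001) t;
```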
-
Feature Description
Ability to get tables directly from the source, more concisely, for SQL-based databases, by declaring them in connection.yaml rather than writing one query file per table.
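A minimal sketch of the idea; the tables key and the exact syntax are illustrative rather than a settled proposal:

```yaml
# connection.yaml -- declare the source tables to materialize directly,
# instead of writing one .sql file per table
tables:
  - orders
  - reviews
```

or even

```yaml
# connection.yaml -- wildcard form: materialize every table in the source
tables:
  - "*"
```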
Goal of Feature
Current Solution / Workarounds
Today this means one query file per table, e.g. sources/orders.sql and sources/reviews.sql.
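Each of those files is presumably just a pass-through query, along the lines of:

```sql
-- sources/orders.sql
select * from orders;
```

```sql
-- sources/reviews.sql
select * from reviews;
```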
Considerations
- Materializing whole tables implies a `select *`, which may be undesirable
- The wildcard form (`tables: - *`) would materialize every table in the source