These are the two main container classes my project uses, and I noticed they don't make use of the Memory Model concept. Can I request that this be added for these classes so that the overhead of the size_type can be reduced?
Did you see this as a global or per-instance memory model?
If it is per-instance, then there is a backward compatibility issue for etl::istring and etl::ivector.
I was thinking per-instance. In my project, most uses of etl::string and etl::vector would have a capacity of less than 256, but probably not all.
This request is primarily due to the overhead of etl::string&lt;N&gt; and etl::vector&lt;char, N&gt; for N &lt; 256. Both would only need the char array and a uint8_t for the current size as their data members, so I was surprised to see that the actual size is 48 bytes. In the case of etl::string&lt;N&gt;, 8 bytes are used for current_size and another 8 for CAPACITY.
After looking at the base class more closely, I see how backwards compatibility would be difficult to maintain with such a change. Perhaps another option would be to introduce small_string and small_vector classes.
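For illustration, here is a rough sketch of what such a small_string could look like, with the size type chosen from the capacity at compile time. The names (small_string, size_type_for) and the layout are hypothetical, not ETL's actual API; the point is only that both CAPACITY (as a template parameter) and current_size avoid the size_t overhead when N < 256.

```cpp
// A sketch only: these names are illustrative, not part of ETL.
#include <cstddef>
#include <cstdint>
#include <type_traits>

// Pick the smallest unsigned type able to index a buffer of capacity N.
template <std::size_t N>
struct size_type_for
{
  typedef typename std::conditional<(N <= 255u), std::uint8_t,
          typename std::conditional<(N <= 65535u), std::uint16_t,
                                     std::size_t>::type>::type type;
};

// Hypothetical small_string: capacity is a template parameter, so the
// only data members are the buffer and a reduced-width current size.
template <std::size_t N>
class small_string
{
public:
  typedef typename size_type_for<N>::type size_type;

  small_string() : current_size(0) { buffer[0] = '\0'; }

  size_type size() const     { return current_size; }
  size_type capacity() const { return static_cast<size_type>(N); }
  const char* c_str() const  { return buffer; }

  void push_back(char c)
  {
    if (current_size < N)
    {
      buffer[current_size++] = c;
      buffer[current_size]   = '\0';
    }
  }

private:
  char      buffer[N + 1];  // characters plus null terminator
  size_type current_size;   // one byte when N < 256
};

// 16-character buffer + 1-byte size, no padding (alignment is 1).
static_assert(sizeof(small_string<15>) == 17, "size_t overhead removed");
```

Note that this sketch has no size-agnostic base class (the role etl::istring and etl::ivector play), which is exactly the backward compatibility trade-off mentioned above.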