To explain why a factor of 2 is a particularly bad choice: it sits exactly at the threshold that prevents previously allocated memory from ever being reused. In this respect, growing by a factor of 2 is about the worst allocation strategy one can pick.
Consider a scenario in which a vector grows several times in a row, with no other allocations happening in between. Typically there is one large free region of memory, and the vector is initially placed at its beginning. In the diagrams below, # denotes a byte currently occupied by the vector and - a byte that belonged to an earlier block and has since been freed.
When the vector grows by a factor of 2, the combined size of all previous blocks is always exactly one byte too small for the new block (1 + 2 + ... + 2^(n-1) = 2^n - 1). As a consequence, the memory freed by earlier allocations can never be reused, and the vector keeps wandering forward in memory.
Round   Allocated   Memory Usage
1       1           #
2       2           -##
3       4           ---####
4       8           -------########
5       16          ---------------################
As a result, a vector that has grown to capacity N has touched 2N-1 bytes of memory along the way. In the worst case, immediately after growing, only slightly more than 50% of that capacity actually holds elements, so barely 25% of the touched memory is in use and roughly 75% is overhead. About half of that overhead (the freed blocks) is at least available to later, unrelated allocations, but the growing process itself still drags an excessively large footprint behind it.
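The arithmetic behind this table can be checked with a short simulation. The sketch below is not taken from any STL implementation; it merely tracks how many bytes the already-freed blocks add up to and asks whether the next block would fit into them. With a factor of 2 the answer is never yes; repeating the same bookkeeping with the 1.5 sequence from the next table yields a yes in round 5.

    #include <cstdio>
    #include <cstddef>

    int main() {
        std::size_t freed   = 0;  // bytes freed by blocks the vector no longer occupies
        std::size_t current = 0;  // size of the block currently holding the elements
        std::size_t next    = 1;  // size of the block about to be allocated
        for (int round = 1; round <= 10; ++round) {
            // The current block cannot be reused yet: its contents still have to
            // be copied into the new block. Only the already-freed bytes count.
            bool fits = freed >= next;
            std::printf("round %2d: allocate %4zu, freed so far %4zu, fits: %s\n",
                        round, next, freed, fits ? "yes" : "no");
            freed  += current;      // the old block is released after the copy
            current = next;
            next    = current * 2;  // growth factor 2: the freed space never catches up
        }
    }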
The Dinkumware STL implementation shipped with MSVC uses a growth factor of 1.5 and therefore avoids this issue: after four allocations, the memory freed by the old blocks is large enough to hold the next one. More generally, any growth factor below the golden ratio (about 1.618) eventually allows such reuse. Which sequence a given standard library actually produces can be checked directly; see the sketch after the table.
Round   Allocated   Memory Usage
1       1           #
2       2           -##
3       3           ---###
4       4           ------####
5       6           ######----
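The growth sequence is not specified by the C++ standard, so it varies between implementations: MSVC's library grows by roughly 1.5x, while current libstdc++ and libc++ double the capacity. A minimal, standalone sketch for observing it (not part of any of the implementations mentioned above) is to watch capacity() and the buffer address while appending elements:

    #include <cstdio>
    #include <cstddef>
    #include <vector>

    int main() {
        std::vector<int> v;
        std::size_t last = 0;
        for (int i = 0; i < 1000; ++i) {
            v.push_back(i);
            if (v.capacity() != last) {
                // A new, larger block was allocated; print its size and address
                // to see whether the vector moved to a different region of memory.
                std::printf("size %4zu -> capacity %4zu at %p\n",
                            v.size(), v.capacity(), static_cast<const void*>(v.data()));
                last = v.capacity();
            }
        }
    }

Whenever the printed address changes, the vector had to move to a new block; whether that block ends up in previously freed space depends on both the growth factor and the behavior of the allocator.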