How to resize a large vector safely in C++

An answer to this question on Stack Overflow.

Question

I've made a prime finder in C++ that writes primes into a .txt file, but it crashed after finding the 102,144,001st prime (2,084,058,601). It also stores the found primes in a vector, so that it only has to divide each candidate by the primes less than its square root, not every smaller number. It resizes the vector that stores the primes after every (10,240n+1)th found prime, and 102,144,001 is of the form 10,240n+1, so it crashed while resizing. I use a `vector` of `unsigned __int64`, so it used about 780 megabytes when it crashed, but I have 8 GB of RAM. Should I use `vector.reserve()`? (I don't want another crash after 50 minutes...)

Answer

Well, available memory is not the same as usable memory.

cppreference says that a vector's elements are stored contiguously. Therefore, it's not just a question of how much RAM you have, but of how large a contiguous block of address space the allocator could find at the time the program was running. Since a vector must be contiguous, it cannot grow beyond the largest available contiguous region.

As a result you may also run into problems when the vector requires resizing. A standard vector cannot grow in place: when its capacity is exceeded, it must allocate a new, larger contiguous block, copy (or move) every element across, and only then free the old block. That means both blocks exist simultaneously during the resize, so the peak memory demand is the old size plus the new size, which is likely what pushed your program over the edge... and the copy itself takes time.

The deque container may be a better choice, as it does not require one contiguous storage region: it allocates fixed-size chunks and appends new chunks as it grows. This lets you use more of your available RAM and avoids the costly copy of all existing elements during resizing.

(If you're worried about whether the standard guarantees that a vector is contiguous (thus potentially leading to the OP's problems), reading this and this may help shed light on the matter.)