
vector's size in C++

#include <iostream>
#include <vector>
#include "mixmax.h"   // <- the MIXMAX random number generator lives here
#include <algorithm>
#include <cmath>
using namespace std;

int main()
{
    const unsigned long long int n = 10000000;
    vector<float> f(n);
    vector<float> distance_1(n);
    vector<float> distance_2(n);

    rng_state_t s;
    rng_state_t *x = &s;
    seed_spbox(x, 12345);                      // <- seed the random number generator

    for (unsigned long long i = 0; i < n; i++)
        f[i] = int(n * get_next_float(x));     // <- draw a random number in [0, n), like rand()

    sort(f.begin(), f.end());

    for (unsigned long long i = 0; i < n; i++)
    {
        // distances between the sorted values and the uniform reference points i and i+1
        distance_1[i] = f[i] - i;
        distance_2[i] = (i + 1) - f[i];
    }

    float discrep = max(*max_element(distance_1.begin(), distance_1.end()),
                        *max_element(distance_2.begin(), distance_2.end()));
    cout << "discrep= " << discrep << endl;
    cout << "sqrt(n)*discrep= " << discrep / sqrt(n) << endl;
}

When I print f.max_size() (for the vector f declared above in the code), it gives me this huge number 4611686018427387903, but when I take n=10000000000 it does not work; it gives this error:

terminate called after throwing an instance of 'std::bad_alloc'
  what():  std::bad_alloc
Aborted (core dumped). 

(I tried it in Visual Studio under Windows.)

What's the problem? If vectors do not work for big sizes, can anyone tell me how I can use vectors or arrays with very big sizes?

Quoting cplusplus.com,

std::vector::max_size

Returns the maximum number of elements that the vector can hold.

This is the maximum potential size the container can reach due to known system or library implementation limitations, but the container is by no means guaranteed to be able to reach that size: it can still fail to allocate storage at any point before that size is reached.

Hence, vector doesn't guarantee that it can hold max_size() elements; it is just an implementation limit.
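To see the gap between that theoretical limit and what the system will actually hand out, here is a minimal sketch (the 10^10 element count mirrors the failing n from the question; whether you get std::bad_alloc or the process is killed later depends on the OS's overcommit policy):

#include <iostream>
#include <vector>

int main()
{
    std::vector<float> v;
    std::cout << "max_size() = " << v.max_size() << '\n';   // theoretical implementation limit

    try
    {
        v.resize(10000000000ULL);   // 10^10 floats ~ 40 GB for this one vector alone
    }
    catch (const std::bad_alloc&)
    {
        std::cout << "allocation failed long before max_size() was reached\n";
    }
}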

Also, chris mentioned:

Well 10 GB * sizeof(float) * 3 is a ton of memory to allocate. I'm going to guess your OS isn't letting you allocate it all for a good reason.

The OP asks,

If vectors do not work for big sizes, can anyone tell me how I can use vectors or arrays with very big sizes?

Yes, you can. Try Roomy or STXXL.
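For instance, STXXL provides an external-memory vector whose blocks live in files on disk. The sketch below follows the VECTOR_GENERATOR pattern from STXXL's tutorial; pushing 10^10 elements will be slow, this is only to show the interface, and you should check the documentation of the version you install:

#include <iostream>
#include <stxxl/vector>

int main()
{
    // External-memory vector: the data is kept in files on disk and only a small
    // cache of blocks stays in RAM, so it can grow far beyond physical memory.
    typedef stxxl::VECTOR_GENERATOR<float>::result vector_type;
    vector_type v;

    const unsigned long long n = 10000000000ULL;   // 10^10 elements, too big for RAM
    for (unsigned long long i = 0; i < n; ++i)
        v.push_back(static_cast<float>(i));

    std::cout << "size = " << v.size() << std::endl;
}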

max_size() is different from size(), which in turn is different from capacity().

Here the size (and, right after construction, typically the capacity) is n=10000000, so the last element is distance_1[9999999].
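A small illustration of the three quantities (the exact capacity and max_size values are implementation-dependent):

#include <iostream>
#include <vector>

int main()
{
    std::vector<float> d(10000000);   // same size as distance_1 in the question
    d.reserve(20000000);              // capacity may grow beyond the current size

    std::cout << "size()     = " << d.size()     << '\n'    // 10000000 elements in use
              << "capacity() = " << d.capacity() << '\n'    // at least 20000000 after reserve()
              << "max_size() = " << d.max_size() << '\n';   // theoretical upper bound only
}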

What's the problem?

Presumably, the allocation fails because your computer doesn't have 120 GB of memory available for the three 10^10-element float vectors. max_size() tells you how many elements the implementation of vector can theoretically manage, given enough memory; it doesn't know how much memory will actually be available when you run the program.
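The back-of-the-envelope arithmetic behind that 120 GB figure, assuming 4-byte floats and the three vectors from the question:

#include <iostream>

int main()
{
    const unsigned long long n = 10000000000ULL;                // the failing case: 10^10 elements
    const unsigned long long bytes = 3ULL * n * sizeof(float);  // three vector<float>s of n elements
    std::cout << bytes << " bytes ~ "
              << bytes / (1024.0 * 1024 * 1024) << " GiB\n";    // about 112 GiB, i.e. roughly 120 GB
}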

If vectors do not work for big sizes, can anyone tell me how I can use vectors or arrays with very big sizes?

Increase the amount of RAM or swap space on your computer (and make sure the OS is 64-bit, though from the value of max_size() I guess it is), or use something like STXXL, which backs huge data structures with files on disk.
