#include <iostream>

int mainarr[500000050]; // ~2 GB of ints in the data/BSS segment
int P[500000050];       // another ~2 GB

int main()
{
    std::cout << P[100000]; // original had `cout << p[100000];` -- missing std:: and wrong case
}
The above code works fine on macOS but crashes on Windows. Why does macOS happen to have more memory space for global variables than Windows? (Not tested on Linux yet.) How can I increase the limit on Windows?
You shouldn't use that much global memory. macOS may simply allow more global memory than Windows does.
Global and static variables are placed in the data segment (or the BSS segment, if zero-initialized) of the program's memory; I suggest reading about it on Wikipedia. Allocating such big buffers, either on the stack or in the data segment, is really bad practice (sorry for not being more polite, but it should be avoided at all costs). macOS probably allows bigger data segments than other operating systems, but you should not rely on that!
I need to address a few more issues here:
#define fi(i, n), for example, is another way to make your code completely unreadable, error-prone, and unscalable. Just use the for loop as intended!
#define lli long long int obscures the type: understanding that lli is long long int isn't that simple. Since C++11 you can #include <cstdint> (look at the reference) and use int64_t, so the exact size of the integer is known. (Also, for non-negative numbers you should use either std::size_t or uintXX_t; I personally use uint64_t on 64-bit machines.) Prefer using lli = long long int; over #define....
const lli _MOD = 1e9 + 7; should be constexpr uint64_t _MOD = 1e9 + 7;