python - How do I work with a large, memory-hungry numpy array?


I have a program that creates an array:

import numpy as np

list1 = np.zeros((x, y), dtype=np.complex_)

Currently I am using x = 500, y = 1000000. I initialize the first column of the array with a formula, and each subsequent column calculates its own values based on the preceding column. After the array is filled, I display the multidimensional array using imshow().

The size of each value (item) in the array is 24 bytes. A sample value from the code is: 4.63829355451e-32

When I run the code with y = 10000000, it consumes all my RAM and the system stops running. How do I solve this problem? Is there a way to save RAM while still being able to process the array with imshow() easily? Also, how large an array can imshow() display?

There's no way to solve this problem (in a general way).

Computers (as commonly understood) have a limited amount of RAM, and they require elements to be in RAM in order to operate on them.

A complex128 array of size 10000000x500 (each element taking 16 bytes) would require around 74 GiB to store. You'll need to somehow reduce the amount of data you're processing if you hope to use a regular computer (as opposed to a supercomputer).
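The arithmetic behind that figure, as a quick check (a numpy complex128 element occupies 16 bytes in the array itself, even though a boxed Python complex reports 24):

```python
# 10000000 columns x 500 rows, 16 bytes per complex128 element
n_bytes = 10_000_000 * 500 * 16
print(n_bytes / 2**30)  # ~74.5 GiB
```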

A common technique is partitioning the data and processing each partition individually (possibly on multiple computers). Depending on the problem you're trying to solve, there may be special data structures you can use to reduce the amount of memory needed to represent the data, for example a sparse matrix.
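Since each column depends only on the preceding one, the partitioning idea can be sketched with a disk-backed np.memmap: fill the array one chunk of columns at a time, so only one chunk ever lives in RAM while the full result sits on disk. This is a minimal sketch under assumptions; step(), the initial column, and the sizes below are placeholders, not the asker's actual formula:

```python
import os
import tempfile

import numpy as np


def step(col):
    # Placeholder for the real column-to-column recurrence.
    return col * 0.5


def fill_memmap(path, x, y, chunk):
    # Disk-backed array: only the chunk currently being written is in RAM.
    out = np.memmap(path, dtype=np.complex128, mode="w+", shape=(x, y))
    col = np.ones(x, dtype=np.complex128)  # stand-in first column
    for start in range(0, y, chunk):
        width = min(chunk, y - start)
        block = np.empty((x, width), dtype=np.complex128)
        for j in range(width):
            block[:, j] = col
            col = step(col)
        out[:, start:start + width] = block
    out.flush()
    return out


# Small sizes for demonstration; scale x and y up as needed.
with tempfile.TemporaryDirectory() as d:
    arr = fill_memmap(os.path.join(d, "data.dat"), x=50, y=1000, chunk=200)
    first_vals = np.array(arr[0, :3])  # copy a few values out before cleanup
```

For display, you would then pass a downsampled, real-valued view to imshow() (e.g. np.abs(arr[:, ::1000])), since imshow() has to rasterize whatever it is given into screen pixels and cannot usefully show millions of columns anyway.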

It's unusual to need this much memory; make sure to consider whether it's actually needed before delving into extremely complex workarounds.

