Barnes I'm unsure of -- I don't recall the Barnes algorithm and data
structure well enough. Have a look at the code and see how much memory
it allocates.
Ocean is simpler... recall that the data structure for Ocean is an
(N+2)x(N+2) matrix. I don't recall the size of the elements (either 4 or
8 bytes; I'll assume 8 for this calculation). I also don't recall whether
Ocean uses a shadow copy of its data or updates the data in place (I'll
assume a shadow copy).
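To make those assumptions concrete, here's roughly the layout I have in
mind -- just a sketch, not the actual SPLASH-2 Ocean code, which
allocates a number of other arrays as well:

#include <stdio.h>

#define N   64            /* example interior grid dimension          */
#define DIM (N + 2)       /* (N+2), i.e. grid plus boundary rows/cols */

static double grid[DIM][DIM];    /* current values, 8-byte elements   */
static double shadow[DIM][DIM];  /* assumed shadow copy for updates   */

int main(void) {
    printf("one matrix = %zu bytes\n", sizeof(grid)); /* 34,848 for N = 64 */
    return 0;
}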
Size of Ocean working set ~=
matrix size x sizeof(matrix element) x number of matrices + a small
number of stack vars
= 66 x 66 x 8 bytes x 2 matrices + noise (where 66 = N+2 for an N = 64 grid)
~= 69,696 bytes total ...
divide that by, say, four L1 data caches (as Ocean is essentially
embarrassingly parallel, has great spatial and temporal locality, and
very little sharing):
~= 17,424 bytes per cache...
Thus, after the caches are warm, I'd expect to see only a few coherence
misses.
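To answer the scaling part of the question directly (Daniele's mail
quoted below), here is the same arithmetic as a throwaway C program so
you can plug in other grid sizes -- the 8-byte elements, the shadow
copy, and the four L1 caches are all assumptions on my part:

#include <stdio.h>

int main(void) {
    long n            = 64;     /* interior grid dimension (Ocean input)  */
    long dim          = n + 2;  /* (N+2) including boundary rows/columns  */
    long elem_bytes   = 8;      /* assumed element size (sizeof(double))  */
    long num_matrices = 2;      /* data plus the assumed shadow copy      */
    long num_caches   = 4;      /* e.g., four L1 data caches              */

    long total = dim * dim * elem_bytes * num_matrices;
    printf("total working set  ~= %ld bytes\n", total);              /* 69,696 for n = 64 */
    printf("per-L1 working set ~= %ld bytes\n", total / num_caches); /* 17,424 for n = 64 */
    return 0;
}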
Regards,
Dan Gibson
Daniele Bordes wrote:
Thank you very much for your quick reply, Dan. I have a question: how
can I determine the working set size of Ocean and Barnes as the input
parameters vary (for instance, the grid dimension for Ocean and the
number of particles for Barnes)?
Sorry to be a nuisance.
Thank you very much.
--
http://www.cs.wisc.edu/~gibson [esc]:wq!