Commit 95626a9

Sparse tensors for monitors explanation

1 parent 326d270
2 files changed: +79 −0 lines changed

docs/source/guide/guide_part_i.rst

Lines changed: 41 additions & 0 deletions
@@ -278,6 +278,47 @@ Similarly, one can get the contents of a network monitor by calling :code:`netwo
function takes no arguments; it returns a dictionary mapping network components to a sub-dictionary mapping state
variables to their tensor-valued recording.

:py:class:`bindsnet.network.monitors.AbstractMonitor` objects can also store sparse tensor-valued variables.
For example, spikes can be stored efficiently using a sparse monitor:

.. code-block:: python

    Monitor(
        network.layers[layer], state_vars=["s"], time=int(time / dt), device=device, sparse=True
    )

Note that using sparse tensors is advantageous only when fewer than 4% of the values are non-zero.
The table below compares memory consumption between sparse and dense tensors:

======================= ====================== ====================== ====================
Sparse (megabytes used) Dense (megabytes used) Ratio (Sparse/Dense) % % of non-zero values
======================= ====================== ====================== ====================
15                      119                    13                     0.5
30                      119                    25                     1.0
45                      119                    38                     1.5
60                      119                    50                     2.0
75                      119                    63                     2.5
89                      119                    75                     3.0
104                     119                    87                     3.5
119                     119                    100                    4.0
134                     119                    113                    4.5
149                     119                    125                    5.0
164                     119                    138                    5.5
179                     119                    150                    6.0
194                     119                    163                    6.5
209                     119                    176                    7.0
224                     119                    188                    7.5
239                     119                    201                    8.0
253                     119                    213                    8.5
268                     119                    225                    9.0
283                     119                    238                    9.5
======================= ====================== ====================== ====================

The tensor size does not affect the values in the third column: the sparse/dense ratio depends only on the density of non-zero values.
This table was generated by :code:`examples/benchmark/sparse_vs_dense_tensors.py`.
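The roughly 4% break-even point in the table can be estimated from how PyTorch's COO sparse format stores data: each non-zero element of a 3-D bool tensor costs three int64 indices (24 bytes) plus a 1-byte value, versus 1 byte per element for a dense bool tensor. The sketch below works through that arithmetic in plain Python; the 25-bytes-per-element figure is an assumption based on the COO layout and ignores allocator overhead:

```python
# Back-of-the-envelope storage estimate for a 500x500x500 bool spike tensor.
n = 500 ** 3  # total number of elements

# Dense: torch.bool stores 1 byte per element.
dense_mb = round(n / 1024 ** 2)

# Sparse COO (assumed layout): per non-zero element, 3 int64 indices
# (one per dimension, 24 bytes) plus the 1-byte bool value = 25 bytes.
density = 0.04  # the break-even density reported in the table
sparse_mb = round(int(n * density) * 25 / 1024 ** 2)

print(dense_mb, sparse_mb)  # → 119 119
```

At 4% density the two estimates coincide at about 119 MB, matching the crossover row of the table above.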
Running Simulations
-------------------

examples/benchmark/sparse_vs_dense_tensors.py

Lines changed: 38 additions & 0 deletions
@@ -0,0 +1,38 @@
import torch

assert torch.cuda.is_available(), "Benchmark works only on CUDA"
device = torch.device("cuda")


def create_spikes_tensor(percent_of_true_values, sparse):
    # Create a random boolean spike tensor with the given density of True values.
    spikes_tensor = torch.bernoulli(
        torch.full((500, 500, 500), percent_of_true_values, device=device)
    ).bool()
    if sparse:
        spikes_tensor = spikes_tensor.to_sparse()

    # Reset the peak to the currently allocated amount, so that
    # max_memory_allocated reports the memory held by the tensor, in megabytes.
    torch.cuda.reset_peak_memory_stats(device=device)
    return round(torch.cuda.max_memory_allocated(device=device) / (1024 ** 2))


print('======================= ====================== ====================== ====================')
print('Sparse (megabytes used) Dense (megabytes used) Ratio (Sparse/Dense) % % of non zero values')
print('======================= ====================== ====================== ====================')
percent_of_true_values = 0.005
while percent_of_true_values < 0.1:
    result = {}
    for sparse in [True, False]:
        result[sparse] = create_spikes_tensor(percent_of_true_values, sparse)
    percent = round((result[True] / result[False]) * 100)

    # Pad each column to line up with the reStructuredText table header above.
    row = [
        str(result[True]).ljust(23),
        str(result[False]).ljust(22),
        str(percent).ljust(22),
        str(round(percent_of_true_values * 100, 1)).ljust(20),
    ]
    print(' '.join(row))
    percent_of_true_values += 0.005
print('======================= ====================== ====================== ====================')
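A similar comparison can be made without a GPU by measuring tensor storage directly. The snippet below is a small CPU-side sketch (not part of the benchmark above) that sums the bytes of the index and value tensors of the COO result; the 100×100×100 shape and 1% density are illustrative choices:

```python
import torch

torch.manual_seed(0)

# Small CPU-side spike tensor with ~1% non-zero values.
dense = torch.bernoulli(torch.full((100, 100, 100), 0.01)).bool()
sparse = dense.to_sparse()  # COO format; to_sparse() returns a coalesced tensor


def nbytes(t):
    # Raw storage of a (dense) tensor: element count times element size.
    return t.numel() * t.element_size()


dense_bytes = nbytes(dense)  # 1 byte per bool element
# COO storage: an (ndim x nnz) int64 index tensor plus an nnz-long value tensor.
sparse_bytes = nbytes(sparse.indices()) + nbytes(sparse.values())

print(dense_bytes, sparse_bytes)
```

At 1% density the sparse form should use roughly a quarter of the dense tensor's bytes, consistent with the table in the documentation.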
