Pytorch-Memory-Utils

PyTorch memory tracking code

Results: 19 Pytorch-Memory-Utils issues

Hi, I use the following to track my GPU usage: `from gpu_mem_track import MemTracker; gpu_tracker = MemTracker(); gpu_tracker.track(); model = Model(); model.to("cuda:0"); gpu_tracker.track()`. The tracker tells me the total...
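To sanity-check what the tracker reports for the model-load step, here is a hedged sketch that computes the expected parameter memory directly from PyTorch for comparison. The `Model` class below is a hypothetical stand-in, since the issue does not show the real one:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the issue's `Model`; any nn.Module works the same way.
class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(1024, 1024)

model = Model().to("cuda:0")

# Expected parameter memory: number of elements times bytes per element.
param_bytes = sum(p.numel() * p.element_size() for p in model.parameters())
print(f"parameters: {param_bytes / 1024**2:.2f} MB")

# PyTorch's own allocator counter, for comparison with what the tracker prints.
print(f"allocated : {torch.cuda.memory_allocated('cuda:0') / 1024**2:.2f} MB")
```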

Hi, I'm using your code with torch 1.10.0+cu113. I used the example code as follows: ``` import torch import inspect from torchvision import models from gpu_mem_track import MemTracker # import the GPU memory tracking code...
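For reference, a runnable version of what that truncated example appears to be, sketched under some assumptions: the zero-argument `MemTracker()` constructor (older versions reportedly took `inspect.currentframe()`), and an illustrative `vgg19` model and input size that are not taken from the issue:

```python
import torch
import inspect
from torchvision import models
from gpu_mem_track import MemTracker  # GPU memory tracking code from this repository

device = torch.device("cuda:0")

gpu_tracker = MemTracker()   # assumption: no-argument constructor, as in the snippet above
gpu_tracker.track()          # baseline snapshot
cnn = models.vgg19(pretrained=True).features.to(device).eval()
gpu_tracker.track()          # after loading the model: parameter memory should appear here

dummy = torch.randn(1, 3, 224, 224, device=device)  # illustrative input size
with torch.no_grad():
    out = cnn(dummy)
gpu_tracker.track()          # after a forward pass: activation/workspace memory
```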

When I track the GPU usage, the used memory stays unchanged no matter where I put `gpu_tracker.track()`. Here is the output for one .py file: ``` GPU Memory Track |...
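One way to narrow down a report like this is to compare the tracker's numbers against PyTorch's own allocator counters at the same points in the script; a minimal hedged sketch:

```python
import torch

def report(tag, device="cuda:0"):
    """Print PyTorch's own allocator counters as a cross-check for the tracker's output."""
    torch.cuda.synchronize(device)                    # make sure pending kernels have finished
    alloc = torch.cuda.memory_allocated(device) / 1024**2
    resv = torch.cuda.memory_reserved(device) / 1024**2
    print(f"[{tag}] allocated={alloc:.1f} MB  reserved={resv:.1f} MB")

report("before")
x = torch.randn(4096, 4096, device="cuda:0")          # roughly 64 MB of float32
report("after tensor allocation")
```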

The total used memory displayed previously, plus the memory of the intermediate tensors, does not equal the total used memory displayed afterwards.
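A likely reason the numbers do not add up exactly is PyTorch's caching allocator, which rounds individual allocations and reserves memory from the device in larger blocks; a small hedged demonstration:

```python
import torch

device = "cuda:0"
torch.cuda.empty_cache()

before_alloc = torch.cuda.memory_allocated(device)
before_resv = torch.cuda.memory_reserved(device)

t = torch.randn(1000, 3, device=device)  # 12,000 bytes of float32 payload

# The allocated delta is rounded up by the allocator (small granularity),
# while the reserved delta is typically a whole cached block, so per-tensor
# sizes rarely sum exactly to the device-level totals.
print("allocated delta:", torch.cuda.memory_allocated(device) - before_alloc)
print("reserved  delta:", torch.cuda.memory_reserved(device) - before_resv)
```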

Hi, I want to ask about your code: `new_tensor_sizes = {(type(x), tuple(x.size()), ts_list.count(x.size()), np.prod(np.array(x.size()))*4/1000**2)` What does the `**2` operation mean?
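For context, `**2` there is ordinary Python exponentiation: `1000**2` is 1,000,000, so the expression converts an element count into megabytes, assuming 4 bytes per element (float32). A small sketch with an illustrative tensor:

```python
import numpy as np
import torch

x = torch.randn(256, 3, 32, 32)           # illustrative tensor: 786,432 elements

elements = np.prod(np.array(x.size()))    # total number of elements
size_mb = elements * 4 / 1000**2          # 4 bytes per float32 element, 1000**2 bytes per MB
print(size_mb)                            # 3.145728

# Equivalent computation using PyTorch directly:
print(x.numel() * x.element_size() / 1000**2)
```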

On a multi-GPU machine, how can I make the program show the memory of the other GPUs? I have tried CUDA_VISIBLE_DEVICES and os.environ['VCUDA_VISIBLE_DEVICES'], but neither was effective.
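A hedged sketch of two approaches that usually work. Note that the environment variable in the issue has a stray leading `V`; the correct name is CUDA_VISIBLE_DEVICES, and it must be set before CUDA is first initialized:

```python
import os

# Alternative A: restrict which physical GPU the process sees. The variable must be named
# CUDA_VISIBLE_DEVICES (no leading "V") and must be set before the first torch.cuda call,
# safest before importing torch at all.
# os.environ["CUDA_VISIBLE_DEVICES"] = "1"

import torch

# Alternative B: query every visible device explicitly by index.
for i in range(torch.cuda.device_count()):
    allocated = torch.cuda.memory_allocated(i) / 1024**2
    reserved = torch.cuda.memory_reserved(i) / 1024**2
    print(f"cuda:{i}  allocated={allocated:.1f} MB  reserved={reserved:.1f} MB")
```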

Good job on this project! Could you add a feature that prints each variable's name? (Intermediate variables do not belong to the network architecture.) When I want to track how my...
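This is not a feature of the tracker itself, but one hedged way to recover names for CUDA tensors is to scan the caller's local namespace; only tensors bound to named locals can be resolved this way, so intermediates held only inside containers or the autograd graph will not show up:

```python
import inspect
import torch

def name_cuda_tensors():
    """Best-effort: list CUDA tensors bound to names in the caller's local scope."""
    caller_locals = inspect.currentframe().f_back.f_locals
    for name, value in caller_locals.items():
        if torch.is_tensor(value) and value.is_cuda:
            size_mb = value.numel() * value.element_size() / 1024**2
            print(f"{name:>12}  {tuple(value.shape)}  {size_mb:.2f} MB")

x = torch.randn(1024, 1024, device="cuda:0")  # will be listed as "x"
name_cuda_tensors()
```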

Hi, why can't I find the output .txt file? It isn't in the current directory of my code. What I ran is your example code, and it runs...
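A quick way to see where relative-path output files actually land is to print the process's working directory, since a log written with a relative filename ends up there rather than next to the script. In the sketch below, the `*gpu_mem_track*.txt` pattern is only a guess based on the module name, not a confirmed filename:

```python
import os
import glob

# The process's current working directory: relative-path output files land here,
# which may differ from the directory containing your .py file.
print("cwd:", os.getcwd())

# Hypothetical filename pattern, guessed from the module name; adjust as needed.
print("found:", glob.glob(os.path.join(os.getcwd(), "*gpu_mem_track*.txt")))
```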