memory_reserved
paddle.device.cuda.memory_reserved(device=None)
Return the current size of GPU memory that is held by the allocator of the given device.
Parameters
    device (paddle.CUDAPlace or int or str, optional) – The device, the id of the device, or the string name of the device like 'gpu:x'. If device is None, the current device is used. Default: None.
Returns
    The current size of GPU memory that is held by the allocator of the given device, in bytes.
Return type
    int
Examples
>>> import paddle
>>> paddle.device.set_device('gpu')
>>> memory_reserved_size = paddle.device.cuda.memory_reserved(paddle.CUDAPlace(0))
>>> memory_reserved_size = paddle.device.cuda.memory_reserved(0)
>>> memory_reserved_size = paddle.device.cuda.memory_reserved("gpu:0")
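As a further usage sketch (not part of the original reference), the returned byte count can be converted to megabytes for a readable report; this assumes a CUDA-enabled Paddle build with at least one visible GPU:

>>> import paddle
>>> paddle.device.set_device('gpu')
>>> # memory_reserved returns bytes; divide by 1024**2 to report megabytes
>>> reserved_mb = paddle.device.cuda.memory_reserved(0) / (1024 ** 2)
>>> print(f"Reserved on gpu:0: {reserved_mb:.2f} MB")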