trailofbits.python.automatic-memory-pinning.automatic-memory-pinning

Author
unknown
If possible, it is better to rely on automatic pinning in PyTorch to avoid undefined behavior and for efficiency
Definition
rules:
  - id: automatic-memory-pinning
    message: "If possible, it is better to rely on automatic pinning in PyTorch
      to avoid undefined behavior and for efficiency"
    languages:
      - python
    severity: WARNING
    metadata:
      category: security
      cwe: "CWE-676: Use of Potentially Dangerous Function"
      subcategory:
        - audit
      confidence: HIGH
      likelihood: LOW
      impact: LOW
      technology:
        - pytorch
      description: "`PyTorch` memory not automatically pinned"
      references:
        - https://pytorch.org/docs/stable/data.html#memory-pinning
      license: CC-BY-NC-SA-4.0
    pattern-either:
      - patterns:
          - pattern: torch.utils.data.DataLoader(...)
          - pattern-not: torch.utils.data.DataLoader(..., pin_memory=$VALUE, ...)
      - pattern: torch.utils.data.DataLoader(..., pin_memory=False, ...)
Examples
automatic-memory-pinning.py
import torch
# ok: automatic-memory-pinning
loader = torch.utils.data.DataLoader(dataset, batch_size=2, collate_fn=collate_wrapper,
pin_memory=True)
# ok: automatic-memory-pinning
torch.utils.data.DataLoader(dataset, batch_size=2, collate_fn=collate_wrapper,
pin_memory=1)
# ok: automatic-memory-pinning
loader = torch.utils.data.DataLoader(dataset, batch_size=2, collate_fn=collate_wrapper,
pin_memory=3)
# ruleid: automatic-memory-pinning
loader = torch.utils.data.DataLoader(dataset, batch_size=2, collate_fn=collate_wrapper)
# ruleid: automatic-memory-pinning
loader = torch.utils.data.DataLoader(dataset, pin_memory=False, batch_size=2, collate_fn=collate_wrapper)
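For context on why the rule flags the cases above, a minimal sketch of compliant usage follows. The toy `TensorDataset` is a stand-in for the `dataset` referenced in the examples; `pin_memory=True` places fetched batches in page-locked host memory, which allows faster, asynchronous host-to-GPU copies via `non_blocking=True`:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy dataset for illustration (stands in for the `dataset` used above)
dataset = TensorDataset(torch.arange(8, dtype=torch.float32).reshape(4, 2))

# pin_memory=True lets the DataLoader place batches in pinned (page-locked)
# host memory, enabling faster transfers to CUDA devices.
loader = DataLoader(dataset, batch_size=2, pin_memory=True)

for (batch,) in loader:
    if torch.cuda.is_available():
        # non_blocking=True overlaps the copy with computation; it only
        # takes effect when the source tensor is in pinned memory.
        batch = batch.to("cuda", non_blocking=True)
```

On a machine without a CUDA device, PyTorch simply warns and skips pinning, so the same code runs on CPU-only hosts.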
Short Link: https://sg.run/jz5N