Inference failed with vLLM Error: "Too large swap space"

```
ValueError: Too large swap space. 16.00 GiB out of the 15.17 GiB total CPU memory is allocated for the swap space.
```

`submission_hash`: `5cf077872040ec65672a2b26fc8083c77e6c5309`

@helencheung: This is because you have not enabled GPU for your submission, so your code is running on CPU and is constrained by the memory available on the CPU nodes.

Please set `gpu: true` in your `aicrowd.json` file.
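For reference, a minimal `aicrowd.json` with GPU enabled might look like this (the other fields shown here are illustrative placeholders; keep whatever values your submission already uses):

```json
{
  "challenge_id": "your-challenge-id",
  "gpu": true
}
```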

Oh! I’m very sorry, I didn’t notice this. Thank you.