The requirements said: "Especially, participants can use or fine-tune the following Llama 3 models from https://llama.meta.com/llama-downloads:
- Llama 3.2 11B
- Llama 3.2 90B
Any other non-llama models used need to be under 1.5b parameter size limit."

Can we use Llama 4 models such as meta-llama/Llama-4-Scout-17B-16E-Instruct, or are we restricted to the Llama 3 models listed above?