Can we use any other LLM with around 1.5B parameters, like Qwen-1.5B?

The competition rules state that any other non-Llama models used need to be under the 1.5B parameter size limit.

I wonder if we can use Qwen-1.5B, whose parameter size is equal to 1.5B, i.e., right at the limit rather than strictly under it.
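For what it's worth, the exact parameter count can be checked directly. A minimal sketch, assuming the Hugging Face checkpoint `Qwen/Qwen2-1.5B` is the model in question (the specific checkpoint id is my assumption, not from the rules):

```python
# Sketch: count the parameters of a candidate model and compare against the 1.5B limit.
# "Qwen/Qwen2-1.5B" is an assumed checkpoint id; swap in whichever Qwen variant you mean.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2-1.5B")
total_params = sum(p.numel() for p in model.parameters())

print(f"Total parameters: {total_params:,}")
print("Under 1.5B limit:", total_params < 1_500_000_000)
```

Note that "1.5B" in a model name is usually a rounded figure, so the actual count may land slightly above or below 1.5 billion.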

Raised the question with the Meta organizers. Will get back to you when we have a response.