Can we use any other LLM with a parameter size of 1.5B, like Qwen-1.5B?

The competition rules say: any other non-Llama models used need to be under the 1.5B parameter size limit.

I wonder if we can use Qwen-1.5B, whose parameter count is exactly 1.5B.

I've raised the question with the Meta organizers and will get back to you when we have a response.

We received confirmation from the Meta organizers that you are allowed to use anything strictly under 1.6B parameters, so Qwen-1.5B is OK.
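
For anyone who wants to double-check a candidate model against the limit, here is a minimal sketch using the Hugging Face transformers library. The checkpoint ID `Qwen/Qwen2-1.5B` is just an example; substitute the exact model you plan to submit.

```python
# Minimal sketch: verify a model's parameter count against the < 1.6B limit.
# Assumes the Hugging Face transformers library; the checkpoint ID below is an
# example -- replace it with the model you actually intend to use.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2-1.5B")

# Count every parameter in the loaded model.
num_params = sum(p.numel() for p in model.parameters())
print(f"Total parameters: {num_params / 1e9:.3f}B")

# The organizers' rule: anything strictly below 1.6B parameters is allowed.
assert num_params < 1.6e9, "Model exceeds the 1.6B parameter limit"
```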
