It'd be foolish for FAIR to donate preconfigured homing missiles to OpenAI and others via one-way tech transfer.
No, they could GPL it, and I don't think they're worried about competitors taking the models anyway; there's nothing particularly special about the weights or training data, just the compute. I think part of it is pressure from AI "safety" hangers-on who pretend that AI is dangerous, so that only those who won't abide by license terms end up with unfettered access. The other commercial reasons are harder to understand. With PyTorch they became the standard that everyone builds on; they could have done the same with their recent AI, particularly LLaMA, but they chose this silly route instead.
Also, LLaMA has a more permissive license than this translation one, and is a more powerful model, so I don't really see the "homing missiles to open AI" angle.
> [...] I don't think they're worried about competition taking the models anyway, there's nothing particularly special about the weights or training data, just the compute
If that is the case, then what do you suppose is the reason most research outfits stopped releasing model weights, or impose more restrictions when they do?
Using the GPL won't prevent the larger AI competitors from using your model's outputs to tune their non-public models to consistently beat yours, but a non-commercial clause does.
> Also, LLaMA has a more permissive license than this translation one, and is a more powerful model, so I don't really see the "homing missiles to open AI" angle.
LLaMA lags GPT-4, but SeamlessM4T is ahead of WhisperX in some ways.