Max context length

#12
by jphme - opened

Hi,

The config.json states "max_position_embeddings": 32768, whereas mistral-small-latest (24-09) on La Plateforme gives 128k max tokens here. Is this the same model with RoPE scaling applied, or a different model trained for a longer context size?
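For context, this is how I'm reading the value (a minimal sketch with transformers; the repo id mistralai/Mistral-Small-Instruct-2409 is only my assumption for illustration):

```python
from transformers import AutoConfig

# Repo id assumed for illustration; adjust to the actual model repo.
config = AutoConfig.from_pretrained("mistralai/Mistral-Small-Instruct-2409")

# Context length declared in config.json (expected 32768 per the current config)
print(config.max_position_embeddings)

# RoPE scaling settings, if any (None means no scaling is configured)
print(getattr(config, "rope_scaling", None))
```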

Thanks for the release and your great work once again :)

jp
