Extending the context length to 1M tokens
(qwenlm.github.io)
After the release of Qwen2.5, we heard the community’s demand for processing longer contexts.