SUTD develops original mini AI model with performance that surpasses many existing open-source language models

31 Jan 2024

8 World News, 31 Jan 2024, "SUTD develops original mini AI model with performance surpassing many existing open-source language models" (translated from Chinese)
 
Researchers from a local university have developed an original mini artificial intelligence (AI) model whose performance exceeds that of many existing open-source language models.
 
A team led by Associate Professor Lu Wei, Associate Head (Research) of Information Systems Technology and Design at the Singapore University of Technology and Design, launched the AI model "TinyLlama", an open-source small language model with 1.1 billion parameters, pre-trained on 3 trillion tokens.
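
Because the model is open source, it can be run locally with standard tooling. The sketch below loads a TinyLlama checkpoint with the Hugging Face transformers library; the model id and generation settings are illustrative assumptions, not details from the article.

```python
# Minimal sketch: running a ~1.1B-parameter open-source model locally.
# The hub id below is an assumed published checkpoint name; substitute
# whichever TinyLlama checkpoint you intend to use.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # assumed checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision keeps the ~1.1B weights near 2 GB
)

prompt = "Explain what a small language model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```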
 
The research team said that current large language models (LLMs) such as ChatGPT or Google Bard run on tens of thousands of graphics processing units (GPUs) and require users to connect to their massive servers.
 
In comparison, TinyLlama was built using just 16 GPUs and occupies relatively little random-access memory (RAM). In other words, TinyLlama can be easily deployed on mobile devices, allowing everyone to carry a "mini ChatGPT" in their pocket.
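
To make the memory claim concrete, a rough back-of-envelope estimate (the precision choices below are illustrative assumptions, not figures from the article): 1.1 billion parameters stored at 16-bit precision come to roughly 2 GiB of weights, and common 8-bit or 4-bit quantization shrinks that further, which is why the model can fit in the few gigabytes of RAM available on a typical phone.

```python
# Back-of-envelope RAM estimate for a 1.1B-parameter model (illustrative only).
params = 1.1e9

for label, bytes_per_param in [("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    gib = params * bytes_per_param / 1024**3
    print(f"{label}: ~{gib:.1f} GiB of weights")

# fp16: ~2.0 GiB, int8: ~1.0 GiB, int4: ~0.5 GiB -- small enough for a
# typical phone's RAM, unlike models with tens of billions of parameters.
```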