Large language models can be squeezed onto your phone, rather than needing thousands of servers to run, after breakthrough

Powerful artificial intelligence (AI) models like ChatGPT require enormous amounts of computing power to run, so they are usually housed in vast data centers. But a new breakthrough could compress these AI models enough to fit on a smartphone or laptop.

A new algorithm, dubbed Calibration Aware Low-Precision Decomposition with Low-Rank Adaptation (CALDERA), compresses the massive amounts of data needed to run a large language model (LLM) by trimming redundant information from the weight data that makes up the model's layers and storing what remains at a lower numerical precision.
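
To give a rough sense of what "low-precision decomposition with low-rank adaptation" means, the sketch below approximates a weight matrix with a coarsely quantized part plus a small low-rank correction. This is only an illustration of the general recipe, not the authors' CALDERA implementation (which is calibration-aware and also quantizes the low-rank factors); the matrix size, bit-width, and rank here are made-up example values.

```python
# Illustrative sketch (not the CALDERA code): approximate a weight matrix W
# with a low-precision quantized backbone Q plus a low-rank correction L @ R.
# Sizes, bit-width, and rank are assumed example values.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((512, 512)).astype(np.float32)  # stand-in weight matrix

def quantize(x, bits=2):
    """Uniformly quantize x to the given bit-width (simple per-tensor scheme)."""
    levels = 2 ** bits
    lo, hi = x.min(), x.max()
    step = (hi - lo) / (levels - 1)
    return np.round((x - lo) / step) * step + lo

# 1) Low-precision backbone: keep only a coarse, few-bit version of W.
Q = quantize(W, bits=2)

# 2) Low-rank correction: capture the largest remaining structure in the
#    residual W - Q with a rank-r truncated SVD.
r = 32
U, s, Vt = np.linalg.svd(W - Q, full_matrices=False)
L = U[:, :r] * s[:r]   # shape (512, r)
R = Vt[:r, :]          # shape (r, 512)

W_hat = Q + L @ R      # compressed approximation of W
err = np.linalg.norm(W - W_hat) / np.linalg.norm(W)
print(f"relative reconstruction error: {err:.3f}")
```

The storage savings come from keeping Q at only a few bits per entry and L and R with far fewer entries than W itself, which is the same trade-off (precision and redundancy versus accuracy) the article describes.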
