- cross-posted to:
- hackernews@derp.foo
- technews@radiation.party
- technology@lemmy.ml
Context: Falcon is a popular family of free LLMs. This is their biggest model yet, and they claim it is currently the best open model available.
It’s made for researchers and engineers. Nothing is packaged in a form you can simply download and run on a stock PC; it assumes a high level of comfort configuring Python environments, GPU drivers, and GPU compute backends like CUDA.
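To give a sense of what "running it yourself" means in practice, here is a minimal sketch using the Hugging Face transformers library. It assumes the model in question is the 180B release published under the repo ID `tiiuae/falcon-180B` (the post doesn't name it), and that transformers, accelerate, and a CUDA-enabled PyTorch are already installed and working:

```python
# Minimal sketch of loading and prompting a large Falcon checkpoint.
# Assumes tiiuae/falcon-180B is the relevant release and that you have
# enough GPU memory to shard the weights across your cards.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-180B"  # assumed repo ID; adjust for the actual release

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision still means hundreds of GB of weights
    device_map="auto",           # shard across whatever GPUs are actually available
)

inputs = tokenizer("The Falcon model is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Even this "simple" path assumes you can get CUDA, PyTorch, and multi-GPU sharding all playing nicely together, which is exactly the kind of setup work the downstream projects hide from you.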
If you don’t know what all of that means, you would be better off looking downstream to projects like GPT4All that package some of this stuff into a simple installer that anyone can run.
As for Falcon in particular, you will not be able to run this on any consumer hardware. It requires at least 160GB of memory, ideally GPU memory. The largest consumer GPU on the market tops out at 24GB.
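For a rough sense of where numbers like that come from: weight memory is roughly parameter count times bytes per parameter. The sketch below assumes a 180B-parameter model (the post doesn't state the size) and ignores activation and KV-cache overhead, which adds more on top:

```python
# Back-of-the-envelope weight memory for an assumed 180B-parameter model.
PARAMS = 180e9

for label, bytes_per_param in [("fp16/bf16", 2), ("int8", 1), ("4-bit", 0.5)]:
    gb = PARAMS * bytes_per_param / 1e9
    print(f"{label:>9}: ~{gb:,.0f} GB just for the weights")

# fp16/bf16: ~360 GB, int8: ~180 GB, 4-bit: ~90 GB -- all far beyond a single
# 24 GB consumer card, which is why this needs multi-GPU servers (or aggressive
# quantization plus CPU offload) rather than a gaming PC.
```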