LLMs keep leaping with Llama 3, Meta’s newest open-weights AI model

Llama – the most significant of the open-weights large language models – has just reached its third release. From Ars Technica:

At the moment, Llama 3 is available in two parameter sizes: 8 billion (8B) and 70 billion (70B), both of which are available as free downloads through Meta’s website with a sign-up. Llama 3 comes in two versions: pre-trained (basically the raw, next-token-prediction model) and instruction-tuned (fine-tuned to follow user instructions). Each has an 8,192-token context limit.

That’s geek-speak for ‘freshly baked and out of the oven’.

Read the full report here.