Tom’s Guide
Nigel Powell

The new Llama 3.3 70B model has just dropped — here’s why it’s a big deal

Meta has just dropped its Llama 3.3 70B model, providing further proof that open models continue to close the gap with proprietary rivals.

It has only been released in a 70B parameter version, but its benchmark performance puts it not far off the older Llama 3.1 405B, and it even ranks above OpenAI’s GPT-4o and Google’s Gemini Pro 1.5 in some ratings.

The new model is available for download and installation from Ollama, Hugging Face or Meta’s official Llama site.
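For readers who want to try the model locally, a minimal sketch using the Ollama command-line tool might look like the following. This assumes Ollama is already installed and that your machine has enough memory for a 70B model — even quantized builds typically need roughly 40GB or more.

```shell
# Sketch only: assumes the Ollama CLI is installed and on your PATH,
# and that your hardware can hold a quantized 70B model (~40GB+).
# "llama3.3" is the model tag Ollama lists for this release.
ollama pull llama3.3   # download the model weights

# Start an interactive chat session with the model
ollama run llama3.3
```

Hugging Face hosts the raw weights for developers who prefer to load the model through their own inference stack instead of a packaged runner.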

Why is Llama 3.3 70B a big deal?

(Image credit: Meta AI)

For developers, and for those who want to run AI models on their own computers instead of the cloud, this is a big deal. Every new Llama release shows how small open models can compete with and even beat the best of the rest.

Meta has made no secret of the fact that it sees the open model paradigm as the best defense against potential abuse from proprietary products.

Smaller models mean users can get by with cheaper graphics cards carrying less VRAM and still receive decent performance from the AI. The key to the usability of AI on desktop computers lies in getting snappy responses. The best AI in the world is useless if it takes an hour to deliver an answer.

By also giving users the ability to customize and enhance the base Llama models, there’s every chance that open will continue to keep pace with closed in the long run.

The plan seems to be working: Llama models were downloaded over 20 million times in August alone, a 10x increase over the same month last year. A big factor behind these numbers is that each new Llama release has come with a significant decrease in cost alongside an increase in performance and capability.

What's new in Llama 3.3 70B?

The new model supports eight languages, including Spanish, Hindi and Thai, and has been deliberately designed so developers can fine-tune it and add capabilities or languages as they need.

Two points stand out from the success of these open models. First, there is a contingent of companies, large and small, that prefer to retain a measure of control over their AI product integration. Second, there is a growing group of AI enthusiasts and specialists looking to run smaller models on more modest consumer-level hardware.

There are more than 60,000 derivative models on Hugging Face, showing the strength of demand for fine-tuned Llama models. In addition, large enterprise users like Goldman Sachs, Accenture and Shopify are using Llama internally.

Much of that enterprise use runs on cloud-hosted versions, but the more powerful Llama models are also building a sizable fan base. Companies like Zoom and DoorDash, for instance, use Llama as part of their AI mix across a wide variety of tasks, including customer support, software engineering and data analysis.

Final thoughts

This growing Llama ecosystem is a clever play by Meta. Not only does it establish the company’s strength in general-purpose AI, but it also provides some strong marketing juice for its in-house Meta AI product.

With over 350 million downloads of Llama models across the world to date, Meta has grabbed a firm spot as one of the world’s top AI companies. Its AI assistant, Meta AI, has just topped 600 million monthly users. This number is likely to explode once Llama 4 is released early next year, as expected.
