And it's now available for commercial use.
Llama 2 was trained on 40% more tokens than LLaMA 1, its parameter count goes up to 70B (from 65B), and its context length doubles to 4K tokens (from 2K).
Llama 2 still lags its competitors from OpenAI and Google in parameter count.
Llama 2 outperforms all other open-source LLMs on AI benchmarks.
Microsoft is Meta's “preferred partner,” in a clear challenge to OpenAI, which received $10B+ in funding from Microsoft.
Llama 2 looks to consolidate Meta's position as the open-source leader and to challenge OpenAI's GPT-3.5-turbo.
It is hard to see how open-source startups like MosaicML or Stability AI can compete with Meta's deep war chest for open-source LLMs.
Meta can afford to subsidize open-source LLMs because its core business model is ads, not selling AI services.
If Llama 2 really is comparable to GPT-3.5-turbo, will we see a mass migration, given GPT-3.5-turbo's higher price?