
Bigger, Better, Bolder: The Unstoppable Surge of Large Language Models in 2024

4/14/24

Editorial team at Bits with Brains

The AI landscape continues to be reshaped by the emergence of yet more powerful large language models.

At Google Cloud Next 2024, the tech giant announced the global availability of Gemini 1.5 Pro, its latest language model, boasting a 1 million token context window. This enables users to process and generate vast amounts of text, making it a valuable tool for applications ranging from content creation to data analysis.

Not to be outdone, OpenAI unveiled a significantly improved GPT-4 Turbo model. While details remain scarce, the new model is said to excel in coding and mathematics, positioning it as a strong competitor in the AI race.

The open-source community is also making strides, with Stable LM 2 12B and Mistral's Mixtral 8x22B pushing the boundaries of what's possible with open models. Google further expanded its AI offerings with new Gemma models: CodeGemma, fine-tuned for coding applications, and RecurrentGemma, designed for efficient research, demonstrate Google's commitment to providing specialized AI tools for diverse use cases.


Details are sparse, but here's what we know about these new releases so far.


Google Gemini 1.5 Pro

Google's Gemini 1.5 Pro is now available in over 180 countries, offering native audio understanding, system instructions, a JSON mode, and more. JSON (JavaScript Object Notation) is a widely used data format that is easily readable by humans and parsed by machines, making it highly interoperable across various platforms and programming languages.
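To illustrate why a guaranteed-JSON output mode matters, here is a minimal sketch: if a model is constrained to emit valid JSON, its output can be parsed straight into native data structures with no brittle text scraping. The response below is invented for illustration, not an actual Gemini output.

```python
import json

# A hypothetical model response produced under JSON mode: because the
# output is guaranteed to be valid JSON, it parses directly into a dict.
model_output = '{"title": "Q1 sales summary", "sentiment": "positive", "topics": ["revenue", "growth"]}'

data = json.loads(model_output)   # JSON text -> Python dict
print(data["topics"])             # fields are now ordinary Python values

# Round-tripping back to JSON keeps the data interoperable with any
# other language or platform that speaks JSON.
serialized = json.dumps(data, indent=2)
```

The same parse-and-round-trip works identically in JavaScript, Go, Java, and virtually every other mainstream language, which is what makes JSON mode so useful for piping model output into downstream systems.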


However, its most impressive feature is the 1 million token context window. With one token corresponding to roughly three-quarters of a word, this translates to a combined input and output capacity of about 750,000 words. This vast context window enables users to process and generate extensive amounts of text, making Gemini 1.5 Pro a valuable tool for numerous applications, from content creation to data analysis. Developers can now access Gemini 1.5 Pro via its API.
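The back-of-the-envelope conversion above can be sketched in a few lines. Note that 0.75 words per token is only a rule of thumb for English text; the real ratio varies by language and tokenizer.

```python
WORDS_PER_TOKEN = 0.75  # rough rule of thumb; varies by language and tokenizer

def approx_word_capacity(context_tokens: int) -> int:
    """Estimate how many English words fit in a given token budget."""
    return int(context_tokens * WORDS_PER_TOKEN)

print(approx_word_capacity(1_000_000))  # Gemini 1.5 Pro's window -> 750000
print(approx_word_capacity(128_000))    # GPT-4 Turbo's window    -> 96000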


Google has also rolled out new versions of Gemma, its open-source large language models. CodeGemma is a model fine-tuned for coding applications, while RecurrentGemma uses a recurrent architecture designed for memory-efficient inference and research. CodeGemma in particular aims to compete with other open-source coding-specific large language models.


OpenAI GPT-4 Turbo

In the meantime, OpenAI also announced a significantly improved GPT-4 Turbo model. While details remain scarce, the new model is said to excel in coding and mathematics, positioning it as a strong competitor in the AI race.


The GPT-4 Turbo model is now available via the OpenAI API and is being rolled out inside ChatGPT. The model also supports JSON mode and function calling, and offers a 128,000-token context window with training data updated through December 2023.
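For developers, the JSON mode mentioned above is enabled per request via the `response_format` field of OpenAI's chat completions API. A minimal sketch of such a request body follows; the field names match OpenAI's documented API, but the prompt text is invented for illustration and no network call is made here.

```python
# Sketch of a chat completions request body using JSON mode.
# Field names follow OpenAI's chat completions API; the prompts are
# invented examples, not official sample content.
request_body = {
    "model": "gpt-4-turbo",
    # JSON mode: instructs the API to return syntactically valid JSON only.
    "response_format": {"type": "json_object"},
    "messages": [
        {"role": "system", "content": "You are a helpful assistant that replies in JSON."},
        {"role": "user", "content": "Summarize: revenue grew 12% year over year."},
    ],
}
```

With the official `openai` Python package, a dict like this maps directly onto `client.chat.completions.create(**request_body)`; note that OpenAI's docs require the prompt itself to mention JSON when JSON mode is enabled, as the system message above does.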


According to the Chatbot Arena leaderboard, the newest version of GPT-4 Turbo (the April 9th release) has surpassed Claude 3 Opus as the strongest model, as voted by the Chatbot Arena community.


Stability AI Stable LM 2

At the same time, Stability AI released Stable LM 2 12B, a 12-billion parameter open model that only slightly underperforms the state-of-the-art Mixtral 8x7B model on most benchmarks – a notable feat for a dense model of its size. However, despite being marketed as an open-source product, Stable LM 2 requires a Stability AI membership for commercial use, raising questions about its true open-source nature.


Mistral Mixtral 8x22B

Mistral AI, not to be outdone, released Mixtral 8x22B, a large language model built on its mixture-of-experts architecture. As per usual for Mistral, the model was released as a torrent link on X with almost no context; the weights file weighs in at 281 gigabytes. The new model features a 65,000-token context window and a combined total of up to 176 billion parameters. While information about Mixtral 8x22B is limited, it is expected to top current open-source models once more tests are conducted.
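The "8x22B" naming invites a quick bit of arithmetic. In a mixture-of-experts model, all experts must be stored, but each token is routed through only a few of them – Mixtral routes each token through 2 of its 8 experts. The sketch below computes the naive counts; actual reported figures are somewhat lower (roughly 141 billion total and 39 billion active parameters for Mixtral 8x22B) because attention and embedding layers are shared across experts rather than duplicated.

```python
def moe_naive_params(n_experts: int, expert_size_b: int, active_experts: int):
    """Naive mixture-of-experts parameter counts in billions,
    ignoring the layers that experts share."""
    total = n_experts * expert_size_b        # all experts kept in memory
    active = active_experts * expert_size_b  # experts actually used per token
    return total, active

# Mixtral routes each token through 2 of its 8 experts.
total_b, active_b = moe_naive_params(8, 22, 2)
print(total_b, active_b)  # 176 44
```

This is why mixture-of-experts models can punch above their weight: the memory footprint reflects the total count, but per-token compute scales with only the active experts.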


The first fine-tuned versions are already available on Hugging Face.


Meta Llama 3

Meanwhile, Meta is close to releasing Llama 3, the successor to Llama 2, its market-leading open-source large language model. Llama 3 is expected to be roughly on par with GPT-4.


As with Llama 2, Llama 3 will be made publicly available for anyone to use, fine-tune, and build upon. The model will come in several different versions, each fine-tuned for different purposes. According to TechCrunch, Llama 3 is expected to be released within the next month.


And these announcements have all happened within the last week or so!


The continuous leapfrogging in size and capability of large language models is pushing the boundaries of what's possible. As these models become more sophisticated and accessible, they are opening new opportunities for businesses and developers alike.


Sources:

[1] https://www.datacamp.com/blog/gpt4-turbo

[2] https://openai.com/blog/new-models-and-developer-products-announced-at-devday

[3] https://research.aimultiple.com/large-language-models/

[4] https://www.theverge.com/2023/11/6/23948426/openai-gpt4-turbo-generative-ai-new-model

[5] https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/gpt-with-vision

[6] https://www.zdnet.com/article/openai-unveils-a-less-lazy-gpt-4-turbo-price-drops-plus-new-and-updated-models/

[7] https://www.splunk.com/en_us/blog/learn/google-cloud-next.html

[8] https://www.wired.com/story/5-updates-gpt-4-turbo-openai-chatgpt-sam-altman/

[9] https://hatchworks.com/blog/gen-ai/large-language-models-guide/

[10] https://developers.googleblog.com/2024/02/google-cloud-next-24-session-library-now-available.html?m=1

[11] https://www.youtube.com/watch?v=RG7vEwfba-4

[12] https://deci.ai/blog/list-of-large-language-models-in-open-source/

[13] https://openai.com/blog/new-embedding-models-and-api-updates

[14] https://ai.google.dev/gemma/docs/gemma_cpp

[15] https://github.com/google/gemma.cpp

[16] https://aws.amazon.com/blogs/machine-learning/gemma-is-now-available-in-amazon-sagemaker-jumpstart/

[17] https://www.nature.com/articles/s41698-024-00573-2

[18] https://ai.google.dev/gemma

[19] https://www.forbes.com/sites/forbestechcouncil/2024/03/06/how-to-leverage-large-language-models-for-engineering-and-more/?sh=58d440982a58

