Despite fierce competition among AI model providers, our future is multi-model


Every week – sometimes every day – a new state-of-the-art AI model is born into the world. As we enter 2025, the pace at which new models are being released is staggering, if not exhausting. The curve of the rollercoaster keeps steepening, and exhaustion and wonder have become constant companions. Each release claims to show why this model is better than all the rest, with endless benchmarks and bar charts filling our feeds as we scramble to keep up.

The number of large-scale AI models released each year has been increasing since 2020.
Credit: Charlie Giattino, Edouard Mathieu, Veronika Samborska and Max Roser (2023) – “Artificial Intelligence”, published online at OurWorldinData.org.

Eighteen months ago, most developers and enterprises were building on a single AI model. Today, the opposite is true. It is rare to find a business of any scale that confines itself to the capabilities of a single model. Companies are wary of vendor lock-in, particularly for a technology that has quickly become a core part of both long-term infrastructure and short-term operating costs. And it is increasingly risky for teams to place all their bets on a single large language model (LLM).

But despite this fragmentation, many model providers still insist that AI will be a winner-take-all market. They claim that the expertise and compute needed to train best-in-class models are scarce, defensible and self-reinforcing. From their point of view, the hype bubble around building AI models will eventually collapse, leaving behind a single, giant artificial general intelligence (AGI) model that will be used for anything and everything. To exclusively own such a model would mean becoming the most powerful company in the world. The size of this prize has kicked off an arms race for ever more GPUs, with a new zero added to the amount of training compute every few months.

Deep Thought, the monolithic AGI of The Hitchhiker’s Guide to the Galaxy. Credit: BBC, The Hitchhiker’s Guide to the Galaxy television series (1981).

We believe this assumption is wrong. There will never be a single model to rule the universe, not next year and not next decade. Instead, the future of AI will be multi-model.

Language models are commoditizing

The Oxford Dictionary of Economics defines a commodity as a standardized good that is bought and sold at scale and whose units are interchangeable. Language models are commoditizing in two important senses:

  1. Models of similar capability are becoming increasingly interchangeable across a wide range of tasks;
  2. The research expertise needed to develop these models is becoming increasingly widespread and accessible, as frontier labs continue to multiply and independent researchers in the open-source community keep pace.
Credit: Not Diamond

But while language models are commoditizing, they are doing so unevenly. There is a large core of capability for which any model, from GPT-4 down to Mistral Small, is perfectly suited. At the same time, as we move toward the margins, we see greater and greater differentiation, with some model providers explicitly specializing in code, reasoning, retrieval-augmented generation (RAG) or math. This leads to endless hand-wringing, Reddit-searching, benchmarking and fine-tuning to find the right model for each job.

Models are commoditizing at the core of the capability distribution while differentiating at the frontier. Credit: Not Diamond

And so even though language models are commoditizing, they are best described as imperfectly interchangeable commodities. For most use cases, AI models will be interchangeable, with metrics like price and latency determining which one to use. But at the frontier of capability, the opposite will happen: models will continue to specialize, becoming more and more differentiated. As an example, DeepSeek-V2.5 is stronger than GPT-4o at coding in C#, despite being a fraction of the size and 50 times cheaper.

Both of these forces – commoditization and specialization – undermine the idea that a single model will ever be best suited to handle every possible problem. Instead, they point toward an increasingly fragmented AI landscape.

Multi-model systems and routing

There is an apt metaphor for the market dynamics of language models: the human brain. The structure of our brains has remained largely unchanged for 100,000 years, and brains are far more alike than they are different. For most of our time on Earth, most people learned the same things and held the same skills.

But then things changed. We developed the ability to communicate in language – first spoken, then written. Communication protocols enable networks, and as humans began to network with one another, we also began to specialize. We were freed from the burden of needing to be generalists across every domain, of being self-sufficient islands. Ironically, the collective wealth created by specialization also means that the average person today is a far more capable generalist than any of our ancestors.

Across a sufficiently broad problem space, the world always tends toward specialization. This is true all the way from molecular chemistry, to biology, to human society. Given sufficient diversity, distributed systems are always more efficient than monoliths. We believe the same will be true of AI. The more we can leverage the strengths of multiple models instead of relying on just one, the more those models can specialize, and the further the frontier of capability expands.

Multi-model systems allow for specialization, creativity and innovation. Credit: Not Diamond

The most important way to harness the strengths of different models is routing – dynamically sending each query to the best-suited model, and falling back to cheaper, faster models whenever doing so does not degrade quality. Routing lets us capture all of the advantages of specialization – higher accuracy alongside lower cost and latency – without giving up any of the robustness of generalization.
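To make the routing idea concrete, the sketch below shows a minimal quality-aware router in Python. The model names, prices and the keyword classifier are illustrative assumptions for the sketch, not any vendor's actual routing logic; production routers typically learn task classification and quality prediction from data.

# Minimal sketch of query routing: pick the cheapest model believed capable of the task.
# Model names, prices and the keyword heuristic are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Model:
    name: str
    strengths: set             # task categories the model handles well
    cost_per_1k_tokens: float  # hypothetical price, in dollars

CANDIDATES = [
    Model("small-fast-model", {"chat", "summarization"}, 0.0002),
    Model("code-specialist", {"coding"}, 0.0010),
    Model("frontier-model", {"coding", "reasoning", "chat", "summarization"}, 0.0100),
]

def classify(query: str) -> str:
    # Crude keyword classifier; real routers learn this mapping from data.
    q = query.lower()
    if any(k in q for k in ("def ", "class ", "bug", "compile", "c#")):
        return "coding"
    if any(k in q for k in ("prove", "step by step", "derive")):
        return "reasoning"
    return "chat"

def route(query: str) -> Model:
    # Among models whose strengths cover the task, choose the cheapest one.
    task = classify(query)
    capable = [m for m in CANDIDATES if task in m.strengths]
    return min(capable, key=lambda m: m.cost_per_1k_tokens)

print(route("Why does this C# method throw a null reference?").name)  # -> code-specialist
print(route("Summarize this meeting in three bullet points.").name)   # -> small-fast-model

Routed this way, most traffic lands on the cheap generalist, and only queries that genuinely need a specialist or frontier model pay for one.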

A simple demonstration of the power of routing is the fact that many of the world's most advanced models are themselves routers: they are built with mixture-of-experts architectures that route each token to a handful of specialized sub-networks. If it's true that LLMs are commoditizing, then routing is set to become an essential part of every AI stack.
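For a rough picture of what routing looks like inside such a model, here is a toy top-2 mixture-of-experts gating step for a single token, written in Python with NumPy. The dimensions, expert count and random weights are assumptions made for the sketch; real MoE layers add load balancing, capacity limits and fused kernels.

# Toy mixture-of-experts gating with top-2 routing for one token.
# Shapes, expert count and random weights are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 16, 8, 2

token = rng.standard_normal(d_model)  # one token's hidden state
experts = [rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(n_experts)]  # tiny expert FFNs
gate_w = rng.standard_normal((d_model, n_experts)) * 0.1  # gating matrix (would be learned)

logits = token @ gate_w                # score every expert for this token
top = np.argsort(logits)[-top_k:]      # keep only the top-k experts
weights = np.exp(logits[top]) / np.exp(logits[top]).sum()  # softmax over the chosen experts

# The token is processed only by the selected experts; outputs are mixed by the gate weights.
output = sum(w * (token @ experts[i]) for w, i in zip(weights, top))
print("routed to experts:", top, "with weights:", np.round(weights, 3))

The conditional computation is the point: each token touches only a small fraction of the model's parameters, which is the same cost-for-quality trade that routing makes across whole models.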

There is a notion that LLMs will plateau as they reach human-level intelligence – that once we saturate capability, we will consolidate around one general model, just as we consolidated around AWS or the iPhone. Neither of those platforms (nor their competitors) has 10X-ed its capabilities in the past few years – so we have grown comfortable in their ecosystems. We believe, however, that AI will not stop at human-level intelligence; it will push far beyond any limit we can imagine. As it does, it will become increasingly fragmented and specialized, just as every other natural system does.

It cannot be overstated that the fragmentation of AI models is a good thing. Fragmented markets are efficient markets: they empower consumers, drive innovation and reduce costs. And to the extent that we can lean on a network of smaller, more specialized models rather than routing everything through a single giant one, we move toward a safer, more interpretable and more steerable future for AI.

The greatest inventions have no owners. Ben Franklin's heirs do not own electricity. Turing's estate does not own all computers. AI is undoubtedly one of humanity's greatest inventions; we believe its future will be – and should be – multi-model.

Zack Kass is the former head of go-to-market at OpenAI.

Tomás Hernando Kofman is co-founder and CEO of Not Diamond.

DataDecisionMakers

Welcome to the VentureBeat community!

DataDecisionMakers is where experts, including data professionals, can share insights and innovations about data.

If you want to read about cutting-edge ideas, best practices, and the future of data and data tech, join us at DataDecisionMakers.

You might even consider contributing an article of your own!

Read more from DataDecisionMakers


