With AI, Do We Still Need Humans?
This article is drawn from What Builders Talk About When They Talk About AI, featuring insights on the current state and direction of AI development from 8 industry leaders, including OpenAI CTO Mira Murati and Microsoft AI EVP Kevin Scott.
1️⃣ The Third Era of Computing
LLMs (Large Language Models) will undoubtedly become the next-generation computing platform. They are still at a very early stage: the underlying Transformer architecture was invented only 6 years ago, and ChatGPT was released less than a year ago.
2️⃣ This AI Wave Will Drive Economic Development
Suppose you need to generate an image:
- Artist: Drawing fee $100/h, time spent 1h
- AI: Inference cost $0.01, time spent 1s
The price of generating an image drops by a factor of 10,000. Once you also account for the value of the time saved, the overall cost of producing an image falls by at least 4-5 orders of magnitude, as the rough calculation below shows. That is an enormous gain in economic efficiency.
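Here is a back-of-the-envelope check of those numbers (a sketch only; the figures are the illustrative ones from the example above, not measured costs):

```python
import math

# Illustrative figures from the example above, not measured costs.
artist_cost, artist_seconds = 100.0, 3600.0   # $100 fee, 1 hour of turnaround
ai_cost, ai_seconds = 0.01, 1.0               # ~$0.01 of inference, ~1 second

price_ratio = artist_cost / ai_cost           # how much cheaper
time_ratio = artist_seconds / ai_seconds      # how much faster

print(f"Price drop: {price_ratio:,.0f}x (~{math.log10(price_ratio):.0f} orders of magnitude)")
print(f"Speedup:    {time_ratio:,.0f}x (~{math.log10(time_ratio):.1f} orders of magnitude)")
# Valuing the hour of waiting time saved on top of the price drop is what pushes the
# total effective cost reduction to the 4-5 orders of magnitude claimed above.
```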
3️⃣ For Early Use Cases: Creativity > Correctness
Hallucination (plausible-sounding but invalid or incorrect output from an LLM) is a notorious problem today. But for certain use cases, such as virtual companions, art, and games, creativity matters more than correctness.
4️⃣ For Other Cases Like "Copilot", Correctness Gradually Improves with Usage
Copilot was the first widely adopted AI assistant for several reasons:
- Programmers are quick to adopt new technologies
- LLMs are trained on large amounts of code, so they perform well on coding questions
- The feedback loop is strong: programmers respond to every AI code suggestion immediately by accepting or rejecting it (see the sketch after this list)
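A hypothetical sketch of that feedback loop (the class and method names are invented for illustration; this is not how Copilot actually records telemetry):

```python
from dataclasses import dataclass, field

@dataclass
class SuggestionLog:
    """Collects accept/reject decisions on AI code suggestions as a feedback signal."""
    events: list[tuple[str, bool]] = field(default_factory=list)

    def record(self, suggestion: str, accepted: bool) -> None:
        # Each decision becomes a labeled example for evaluating and improving the model.
        self.events.append((suggestion, accepted))

    def acceptance_rate(self) -> float:
        if not self.events:
            return 0.0
        return sum(accepted for _, accepted in self.events) / len(self.events)

log = SuggestionLog()
log.record("def add(a, b): return a + b", accepted=True)
log.record("import os; os.remove('/')", accepted=False)
print(f"Acceptance rate: {log.acceptance_rate():.0%}")  # signal for the model team
```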
5️⃣ AI Combined with Biology Further Enhances Disease Treatment Capabilities
Biology is extremely complex, far beyond what a single human mind can hold. But with AI augmentation, providing cross-domain insights, diagnosing difficult diseases, and offering new treatment plans and earlier prevention all become feasible.
6️⃣ Give Large Models to Users and Let Them Discover More Use Cases
"You need to remember: large models are not products... AI is just a new and interesting piece of the puzzle in your infrastructure that can help you solve new types of problems or solve old problems in better ways." — Kevin Scott, Microsoft
7️⃣ AI Memory Capabilities Will Continue to Strengthen
LLMs are stateless by default; the context window is what gives them a form of short-term memory. Most large models today offer around 32k tokens of context. That number will keep growing, and with it the ability to process very large documents.
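As a rough illustration of what that short-term memory means in practice, here is a minimal sketch of keeping a conversation inside a fixed token budget (the 4-characters-per-token estimate and the trim_history helper are assumptions for illustration, not any particular model's API):

```python
def rough_token_count(text: str) -> int:
    # Crude approximation: ~4 characters per token for English text.
    return max(1, len(text) // 4)

def trim_history(messages: list[str], max_tokens: int = 32_000) -> list[str]:
    """Keep the most recent messages whose combined size fits the context budget."""
    kept: list[str] = []
    used = 0
    for message in reversed(messages):        # newest first
        cost = rough_token_count(message)
        if used + cost > max_tokens:
            break                             # older turns fall out of "memory"
        kept.append(message)
        used += cost
    return list(reversed(kept))               # restore chronological order

history = ["(long earlier conversation)...", "What did we decide yesterday?"]
print(trim_history(history, max_tokens=8))    # tiny budget: only the newest turn fits
```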
8️⃣ How We Interact with AI Is a Major Research Area
Today's interactions with large models are mostly chat-based, which is simply a convenient form for building applications, not necessarily the best form of interaction. Multimodality (text, images, voice, and other media) will enable richer and more varied forms of interaction.
9️⃣ Will We Use One General-Purpose Model, Multiple Domain-Specific Models, or Both?
The answer depends on your needs and your budget, and it will change over time. We're still on the eve of a massive explosion in AI applications.
1️⃣0️⃣ How Enterprises Use AI and Handle Datasets
Enterprise adoption of AI is still early. Enterprises generally move slowly and care deeply about their data; they are reluctant to hand valuable datasets to another company. That leaves them three options today: use an LLM provider, fine-tune an existing LLM themselves, or train a completely new LLM.
1️⃣1️⃣ Can Scaling Laws Lead Us to AGI?
AGI (Artificial General Intelligence) refers to a theoretical form of AI with human-level intelligence and cognitive abilities.
LLMs currently follow scaling laws: with the algorithm held fixed, performance improves as data and compute increase.
But when will scaling laws stop holding? Can they take us to AGI before then? There's no answer yet.
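For reference, a commonly cited Chinchilla-style form of the scaling law (Hoffmann et al., 2022) predicts loss from parameter count N and training tokens D; the sketch below uses illustrative constants, not the published fit:

```python
def predicted_loss(params: float, tokens: float,
                   E: float = 1.7, A: float = 400.0, B: float = 400.0,
                   alpha: float = 0.34, beta: float = 0.28) -> float:
    """L(N, D) = E + A/N^alpha + B/D^beta: loss falls as model size or data grows."""
    return E + A / params**alpha + B / tokens**beta

# Scaling up keeps pushing loss down, but only toward the irreducible floor E.
for scale in (1e9, 1e10, 1e11):
    print(f"N = D = {scale:.0e}: predicted loss ~ {predicted_loss(scale, scale):.3f}")
```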
1️⃣2️⃣ What New Capabilities Does AI Have?
On certain tasks, AI not only already performs well but also keeps improving over time.
1️⃣3️⃣ Will LLM Prices Drop?
The current chip shortage constrains supply and keeps LLM costs high. But reports indicate NVIDIA will produce more H100s next year, which should bring LLM costs down.
1️⃣4️⃣ How Do We Measure Progress Toward AGI?
It's difficult to measure. The original benchmark was the Turing Test, but getting an AI to pass as human in conversation is no longer hard. What's hard is getting AI to act in the real world the way humans do.
1️⃣5️⃣ Do We Still Need Humans?
New technologies replace some jobs while creating even more new ones. AI may well automate much of today's work, but the new classes of problems and possibilities it opens up are impossible to predict.
1️⃣6️⃣ Now Is the Best Time to Create an AI Company