Many of yesterday’s talks were littered with the acronyms you’d expect from this assemblage of high-minded panelists: YC, FTC, AI, LLMs. But threaded throughout the conversations (foundational to them, you might say) was boosterism for open source AI.
It was a stark left turn (or return, if you’re a Linux head) from the app-obsessed 2010s, when developers seemed happy to containerize their technologies and hand them over to bigger platforms for distribution.
The event also happened just two days after Meta CEO Mark Zuckerberg declared that “open source AI is the path forward” and released Llama 3.1, the latest version of Meta’s own open-source AI algorithm. As Zuckerberg put it in his announcement, some technologists no longer want to be “constrained by what Apple will let us build,” or encounter arbitrary rules and app fees.
Open source AI also just happens to be the approach OpenAI is not using for its biggest GPTs, despite what the multi-billion-dollar startup’s name might suggest. This means that at least part of the code is kept private, and OpenAI doesn’t share the “weights,” or parameters, of its most powerful AI systems. It also charges for enterprise-level access to its technology.
“With the rise of compound AI systems and agent architectures, using small but fine-tuned open source models gives significantly better results than an [OpenAI] GPT-4 or [Google] Gemini. This is especially true for enterprise tasks,” says Ali Golshan, cofounder and chief executive of Gretel.ai, a synthetic data company. (Golshan was not at the YC event.)
“I don’t think it’s OpenAI versus the world or anything like that,” said Dave Yen, who runs a fund called Orange Collective for successful YC alumni to back up-and-coming YC founders. “I think it’s about creating fair competition and an environment where startups don’t risk just dying the next day if OpenAI changes their pricing models or their policies.”
“That’s not to say we shouldn’t have safeguards,” Yen added, “but we don’t want to unnecessarily rate-limit, either.”
Open-source AI models have some inherent risks that more cautious technologists have warned about. The most obvious is that because the technology is open and free, people with malicious intent are more likely to use these tools for harm than they would a costly, private AI model. Researchers have pointed out that it’s cheap and easy for bad actors to train away any safety parameters present in these AI models.
“Open source” is also a myth in some AI models, as WIRED’s Will Knight has reported. The data used to train them may still be kept secret, their licenses might restrict developers from building certain things, and ultimately, they may still benefit the original model-maker more than anyone else.
And some politicians have pushed back against the unfettered development of large-scale AI systems, including California State Senator Scott Wiener. Wiener’s AI safety and innovation bill, SB 1047, has been controversial in technology circles. It aims to establish standards for developers of AI models that cost over $100 million to train, requires certain levels of pre-deployment safety testing and red-teaming, protects whistleblowers working in AI labs, and grants the state’s attorney general legal recourse if an AI model causes extreme harm.
Wiener himself spoke at the YC event on Thursday, in a conversation moderated by Bloomberg reporter Shirin Ghaffary. He said he was “deeply grateful” to people in the open source community who have spoken out against the bill, and that the state has “made a series of amendments in direct response to some of that critical feedback.” One change that’s been made, Wiener said, is that the bill now more clearly defines a reasonable path to shutting down an open source AI model that’s gone off the rails.
The celebrity speaker of Thursday’s event, a last-minute addition to the program, was Andrew Ng, the cofounder of Coursera, founder of Google Brain, and former chief scientist at Baidu. Ng, like many others in attendance, spoke in defense of open source models.
“This is one of those moments where [it’s determined] if entrepreneurs are allowed to keep on innovating,” Ng said, “or if we should be spending the money that would go towards building software on hiring lawyers.”