Open Source AI Leaves Founders and the FTC in the Dust
Many of yesterday’s talks were filled with the acronyms you’d expect from this group of high-profile experts: YC, FTC, AI, LLM. But running through the conversations—and arguably underlying them—was advocacy for open source AI.
This is a big turnaround (or return, if you’re a Linux user) from the app-obsessed 2010s, when developers seemed happy to containerize their technologies and hand them off to larger platforms for distribution.
The event also comes just two days after Meta CEO Mark Zuckerberg declared that “open source AI is the way forward” and released Llama 3.1, the latest version of Meta’s open-source AI model. As Zuckerberg said in his announcement, some technologists no longer want to be “constrained by what Apple lets us build” or face arbitrary app rules and fees.
Open source also happens to be the approach OpenAI does not use for its biggest GPT models, despite what the billion-dollar startup’s name might suggest. This means that at least some of the code is kept private, and OpenAI doesn’t share the “weights,” or parameters, of its most powerful AI systems. The company also charges for enterprise-level access to its technology.
“With the rise of compound AI systems and agent architectures, using small but finely tuned open source models will yield significantly better results than [OpenAI] GPT4, or [Google] Gemini. This is especially true for enterprise tasks,” said Ali Golshan, co-founder and CEO of Gretel.ai, a data aggregation company. (Golshan was not present at the YC event.)
“I don’t think it’s OpenAI versus the world or anything like that,” said Dave Yen, who runs Orange Collective, a fund through which successful YC alumni back emerging YC founders. “I think it’s about creating fair competition and an environment where startups don’t risk dying the next day if OpenAI changes their pricing model or their policies.”
“That doesn’t mean we shouldn’t have protections, but we also don’t want to limit rates unnecessarily,” Yen added.
Open-source AI models carry some inherent risks that more cautious technologists have warned about, the most obvious being that the technology is open and free. Malicious actors are more likely to use these tools to cause harm than they are to pay for a costly private AI model. Researchers have shown that it is cheap and easy for bad actors to strip away the safety parameters built into these AI models.
“Open source” is also something of a myth for some AI models, as WIRED’s Will Knight has reported. The data used to train them can still be kept secret, their licenses can restrict developers from building certain things, and ultimately, they can still benefit the original model-maker more than anyone else.
And some politicians have opposed the unrestrained development of large-scale AI systems, including California state senator Scott Wiener. Wiener’s AI Innovation and Safety bill, SB 1047, has been controversial in the tech world. It aims to set standards for developers of AI models that cost more than $100 million to train, require some level of pre-deployment safety testing and red-teaming, protect whistleblowers working in AI labs, and give the state attorney general legal recourse if an AI model causes extreme harm.
Wiener himself spoke at the YC event on Thursday, in a panel moderated by Bloomberg reporter Shirin Ghaffary. He said he was “very grateful” to those in the open-source community who spoke out against the bill, and that the state had “made a series of amendments in direct response to some of that important feedback.” One change that was made, Wiener said, is that the bill now more clearly defines a reasonable path to shutting down an open-source AI model that has gone astray.
The featured speaker at Thursday’s event, a last-minute addition to the program, was Andrew Ng, co-founder of Coursera, founder of Google Brain, and former chief scientist at Baidu. Ng, like many others in attendance, spoke in defense of open-source models.
“This is one of those moments where [it will be determined] if entrepreneurs are allowed to keep on innovating,” Ng said, “or if we should spend the money that would go toward building software on hiring lawyers.”