AI pioneer Yoshua Bengio urges Canada to build $1B public supercomputer
Yoshua Bengio has been thinking for a while about what happens if the technology he helped pioneer becomes smarter than humans — and escapes our control.
“We could basically create new types of living entities that have their own preservation as a more important value than our own,” he said.
Entities that, he worries, could one day “roam the planet” with the aid of robots.
But Bengio, who is scientific director of Mila, the Montreal-based artificial intelligence institute he founded in 1993, is increasingly contemplating political solutions to head off such a sinister scenario.
At home in his spacious but unpretentious 1950s residence on the edge of Mount Royal Park, the AI guru clearly feels time is of the essence, both for his own to-do list for 2024 and for governments trying to rein in increasingly powerful artificial intelligence systems.
“It’s going to be regulation first,” he says. “But eventually they will want to take back some control, maybe initially by building their own infrastructure.”
That infrastructure includes building much more powerful computers, stacked with thousands of graphics processing units (GPUs), components that are ideal for training or testing large AI language models like ChatGPT.
He’d like to see that class of machine built in Canada, funded by governments, so public entities have the digital firepower to keep up with the private tech giants they’ll be tasked with monitoring or regulating.
“I think government will need to understand at some point, hopefully as soon as possible, that it’s important for [them] to have that muscle,” said Bengio.
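For readers curious what that “muscle” looks like in practice, here is a minimal sketch of a single GPU training step, assuming the PyTorch library; the toy model and numbers are purely illustrative, not anything Mila or the public clusters actually run.

```python
# Illustrative sketch only (assumes PyTorch is installed). Training a frontier
# model repeats steps like this one across thousands of GPUs for weeks.
import torch
import torch.nn as nn

# Use a GPU if one is available; otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A toy stand-in for a language model: real LLMs have billions of parameters.
model = nn.Sequential(
    nn.Linear(512, 2048), nn.ReLU(), nn.Linear(2048, 512)
).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

# One synthetic training step on random data.
x = torch.randn(32, 512, device=device)
target = torch.randn(32, 512, device=device)

optimizer.zero_grad()
loss = loss_fn(model(x), target)
loss.backward()   # GPUs accelerate exactly this kind of dense matrix math
optimizer.step()
print(f"device: {device}, loss: {loss.item():.4f}")
```

The step itself is identical on any hardware; what a public supercomputer adds is the ability to run it across thousands of GPUs at once.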
Bengio says such a supercomputing resource would cost about a billion dollars, and when he pitched the idea to governments in Canada, the response so far has been, “We are listening.”
“It’s a lot of money,” he acknowledged.
U.K. investing big
Other governments around the world, though, are already spending a lot of money to build more powerful public computers for AI. Notably, the United Kingdom last fall announced one called Isambard-AI, to be built at the University of Bristol as part of a £900M plan to “transform the U.K.’s computing capacity.”
That computer would be 10 times faster than anything else now operating in the U.K., and about 20 times faster than the most powerful publicly accessible supercomputer in Canada: Narval, housed at Montreal’s École de technologie supérieure. That’s according to Suzanne Talon, director of Calcul Québec, the organization that runs it.
The non-profit, funded by Ottawa, the province and academia, is one of five National Host Sites, known as clusters, for Canadian public supercomputers, along with installations at universities in Victoria, Vancouver, Waterloo and Toronto.
Between its two supercomputers — Narval and Beluga — Calcul Québec currently has a total of 1,300 GPUs available to university researchers in Canada.
But that’s dwarfed by tech giants like Meta, which alone plans to have the equivalent of 600,000 of those GPUs on hand by the end of the year as it tries to develop artificial general intelligence.
Talon says while bigger isn’t necessarily better, there is a lack of public computing resources available to Canadian researchers, with Calcul Québec’s servers already running full time, except for maintenance.
“There’s no question there has not been a really massive investment in AI in Canada and so the main question is, ‘What is the right amount?'” she says.
Although Talon says that may mean working on more “frugal” models, it’s important that publicly funded research isn’t left behind.
“I mean we have to understand how AI works, so we need access,” she said.
Competition for GPUs
Siva Reddy, an assistant professor of linguistics and computer science at McGill University who is also a core academic member at Mila, estimates the total combined resources available for public AI research in Canada add up to about one-tenth of what a single big U.S. tech company has — just for itself.
He says although researchers do have access to GPUs through computing clusters like Narval, scale is an issue.
While Reddy says it is possible to run smaller AI models like Meta’s Llama with what’s currently available through public computing clusters in Canada, that’s not the case for larger ones like ChatGPT.
“That model, forget about it, we can’t run it in our clusters,” he said.
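To illustrate the difference in scale Reddy is describing, a smaller open model can be loaded on a single well-equipped machine with a sketch like the one below, assuming the Hugging Face transformers and accelerate libraries; the checkpoint name is illustrative, and Meta’s Llama weights are gated behind an access request.

```python
# Hedged sketch: running a smaller open model locally. Assumes the
# `transformers` and `accelerate` libraries and enough GPU memory; the
# checkpoint name is illustrative, and Llama weights require approved access.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Llama-2-7b-hf"  # illustrative; any causal LM works
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

prompt = "Public AI research in Canada needs"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

A model the size of the one behind ChatGPT simply won’t fit through this kind of single-machine workflow, which is Reddy’s point.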
When it comes to, say, research into discrimination or systemic bias, Reddy says analyzing AI models is possible with current public computing power. Building new models from scratch or fine-tuning existing ones isn’t, short of monopolizing resources shared by hundreds of researchers across the country.
He says training a large language model like ChatGPT requires continuous access to 1,000 GPUs for 34 days.
“So imagine if you want to take this entire cluster, it means nobody can do any work for a month,” says Reddy. “So we need a supercluster for priority projects on its own.”
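Reddy’s figures make the squeeze easy to check with back-of-envelope arithmetic; this sketch simply multiplies the numbers quoted in this article and makes no assumptions beyond them.

```python
# Back-of-envelope arithmetic using the figures in this article
# (illustrative estimates, not official capacity planning).
TRAINING_GPUS = 1_000           # GPUs Reddy cites for one training run
TRAINING_DAYS = 34              # continuous days of training
CALCUL_QUEBEC_GPUS = 1_300      # Narval + Beluga combined
META_GPU_EQUIVALENTS = 600_000  # Meta's planned fleet by year's end

gpu_hours = TRAINING_GPUS * TRAINING_DAYS * 24
print(f"One training run: {gpu_hours:,} GPU-hours")           # 816,000

share = TRAINING_GPUS / CALCUL_QUEBEC_GPUS
print(f"Share of Calcul Québec's GPUs tied up: {share:.0%}")  # ~77%

ratio = META_GPU_EQUIVALENTS / CALCUL_QUEBEC_GPUS
print(f"Meta's fleet vs. Calcul Québec: ~{ratio:,.0f}x")      # ~462x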
While he “absolutely” supports Bengio’s idea of building one or more public supercomputers for work on large language models, Reddy points out it’s important to recognize the environmental impact of the energy required to operate them.
“Running these systems also requires a lot of carbon emissions.”
Governments working toward solution
James Peltier, who oversees the Research Computing Group in IT Services at Simon Fraser University, site of the CEDAR supercomputer, thinks the five cluster sites are doing a “decent job” of serving Canadian researchers’ needs, but that demand greatly outstrips supply.
“We’re currently only able to meet about 20 per cent of the GPU need and 40 per cent of the CPU need,” he says. CPUs, or central processing units, are not as well suited for training AI models.
But Peltier says when it comes to spending on AI infrastructure, private companies, which are focused on profit, aren’t subject to the same fiscal constraints as governments, which have other public policy priorities to pay for, like fighting COVID-19 or dealing with the opioid crisis.
“They don’t have the same competing challenges,” he says.
The office of Industry Minister François-Philippe Champagne says the government is working with partners to support Canadian researchers with the “secure and affordable compute capacity” needed for AI.
Spokesperson Audrey Champoux says Champagne regularly speaks to Bengio, who is co-chair of the minister’s AI Council, and that Bengio has raised the issue of AI computers and safety.
In Quebec, a spokesperson for Economy Minister Pierre Fitzgibbon says that while Bengio hasn’t raised the issue of building an AI supercomputer on the scale of what’s being built in the U.K., the minister is available to discuss such a project with him.
Regulating for the present and for the future
Besides making the case for more public AI infrastructure, Bengio is also redoubling his calls for more democratic oversight and government regulation of the artificial intelligence sector.
Last summer he appeared before a U.S. Senate subcommittee hearing on AI regulation and in the fall was commissioned to chair a “State of the Science” report on the capabilities and risks of frontier AI, which came out of a global safety summit hosted by the U.K.
In Canada, Bengio has called out the federal government for moving too slowly to adopt Bill C-27, which includes provisions to partially regulate AI and is currently under consideration before the House of Commons standing committee on industry and technology.
That’s as some take issue with putting so much emphasis on the existential risks of AI, saying it distracts from the real-world problems the technology is already causing, or that excessive rules could stifle innovation.
Asked about critics who dismiss his warnings of catastrophic consequences as “AI hype,” Bengio says it’s possible to mitigate the current harms caused by the technology while also preparing for more hypothetical risks.
“For me, there’s no separation,” he says. “We need to manage the risks and minimize the harms. Now, tomorrow, five years and 10 years and 20 years [from now].”