This is an audio transcript of the FT News Briefing podcast episode: ‘Governments dip into AI regulation’

[MUSIC PLAYING]

Marc Filippino
Good morning from the Financial Times. Today is Wednesday, November 1st, and this is your FT News Briefing. Sam Bankman-Fried’s fate will soon rest with the jury. Eurozone inflation numbers are out, and people finally have something to celebrate. Plus, innovation in artificial intelligence is barrelling ahead. 

Madhumita Murgia
But if we do create artificial general intelligence, superintelligence, do we know how we would control such a system? 

Marc Filippino
I’m Marc Filippino, and here’s the news you need to start your day.

[MUSIC PLAYING]

Marc Filippino
Sam Bankman-Fried’s trial is set to wrap up today. A jury in a New York federal court will determine the fate of the man who founded the now-collapsed cryptocurrency giant FTX. If convicted of fraud, Bankman-Fried could face life in prison. He’s pleading not guilty. The FT’s Joe Miller is covering the trial. Hi, Joe. 

Joe Miller
Hi, Marc. 

Marc Filippino
So we are wrapping up a five-week trial, which you have covered extensively. What is happening today? 

Joe Miller
Well, today we finally get to closing arguments, which is the sort of blockbuster stage of a criminal case where the prosecution and the defence get to outline their various narratives of what exactly went on here, and particularly what happened in the final days of FTX before it collapsed with an $8bn hole in its balance sheet last November. And really what we’ll have is a collation of some of the extraordinary testimony that we’ve heard over the last five weeks. Most recently, Sam Bankman-Fried himself took to the witness stand for more than two days in his own defence, which is quite a rare move for white-collar defendants, and we heard his side of the story under oath. 

Marc Filippino
Yeah, it is kind of weird for the defendant in a case like this to take the stand. What exactly did we learn from what he said? 

Joe Miller
It is, and the reason it’s so weird is because defence counsel tend to advise their clients not to do this because it exposes them to cross-examination. And it was during that cross-examination, really, that we learned things that perhaps we didn’t know before. And what we heard in the last few days was that Sam Bankman-Fried testified that he was very surprised to learn of the $8bn hole that I mentioned earlier and essentially trying to deflect blame onto his subordinates and that perhaps he didn’t have his eye on the ball, to paraphrase some of his testimony, as much as he should have, but that the large mistakes and the oversights were made by others. 

Marc Filippino
Joe, do we have a sense of how this case is going to turn out? I mean, does it hinge on any one thing? 

Joe Miller
It hinges, like most criminal cases, on whether there was criminal intent behind any of these actions. And what the prosecution is going to try and prove is that contrary to Sam Bankman-Fried’s assertion that he maybe had some lax risk controls and was caught unawares by a downturn in the crypto market, that this was a scheme that was orchestrated from really the beginning of FTX to, in effect, borrow customer funds to make risky investments that then went wrong. 

Marc Filippino
Now, ever since the trial started and really ever since FTX imploded last year, there’s been a lot of speculation about what it all represents for this crypto sector, which has been pretty unregulated. And I’m curious if over the past five weeks we’ve gotten any answers to that. 

Joe Miller
I think we have, to the extent that it shows what an unruly world this whole industry was. We got an insight into how Sam Bankman-Fried was able, quite cheaply, to buy influence in Washington with political donations, to buy celebrity support. But more interestingly, and perhaps more importantly, he was also able to attract investors, serious venture capital investors, who were, to a greater or lesser degree, taken in by his pitch, really. And it kind of provided an insight into how much leeway was given to these people, who come along with a new innovation that perhaps not very many even sophisticated investors understood very well but were really willing to pour hundreds of millions, if not billions, into. And whether that party is now coming to an end. 

Marc Filippino
Joe Miller covers US legal affairs for the FT. Thanks, Joe. 

Joe Miller
Thanks very much. 

[MUSIC PLAYING]

Marc Filippino
Eurozone inflation has dropped like a rock. The report for October came out yesterday. It showed that the rate of inflation fell to 2.9 per cent. Last month’s inflation rate is the lowest we’ve seen in the eurozone since July 2021. And a lot of people hope this good news will convince central banks to stop raising interest rates. The European Central Bank kept rates flat when it met last week. The Federal Reserve is expected to do the same today. But this helping of good news comes with a side of caution. Economists think that the slide in inflation might not last. And that’s because they’re worried that the Israel-Hamas war will push up energy prices.

[MUSIC PLAYING]

Marc Filippino
It’s a big week for those hoping to rein in the potential risks of artificial intelligence. There was US president Joe Biden’s executive order on Monday. 

Joe Biden clip
We face a genuine inflection point in history, one of those moments where the decisions we make in the very near term are going to set the course for the next decades. 

Marc Filippino
And today the UK kicks off a massive summit where world leaders and tech executives will discuss potential ground rules for the development of AI. Madhumita Murgia is the FT’s AI editor, and she joins me now to talk about the summit and wider international action. So Madhu, why are we at a point where AI is in such need of regulation? 

Madhumita Murgia
So really, over the last year or so, we’ve seen this explosion of what now we’re calling generative AI. This is a specific subset of AI, and the way that you and I might have experienced it is through ChatGPT, for example. These are essentially tools, software that can generate language, images, video and audio that’s almost indistinguishable from human outputs. But alongside that increased sophistication has also come increased fear, really, which has come from fringe into mainstream over the last year, of if we do create artificial general intelligence, superintelligence, do we know how we would control such a system? 

Marc Filippino
Yeah, and it looks like the governments are at least trying to answer that question. Like I mentioned, we had Biden’s announcement earlier this week and now this summit in the UK. Do we know what the focus will be here? 

Madhumita Murgia
There are definitely disagreements about what should be the focus of the summit. There’s a group of researchers who believe we should be regulating the immediate, near-term harms that we’re already seeing today. For example, the use of AI deepfakes to propagate misinformation during elections. And then there’s another group who feels that more extreme risks also need to be focused on: existential risks to humanity, as they call it. These are also AI researchers, well-regarded, who feel that we need to be thinking about an off switch for these systems if and when it gets to a point where they’re more intelligent than humans. And among these factions, it seems like the UK government does want to focus also on these bigger or more future-facing risks and not just the immediate ones. 

Marc Filippino
OK. So what can we realistically expect this summit to accomplish? 

Madhumita Murgia
A lot of people we’ve spoken to say the first step, and a really hard one, was actually getting people in the same room who disagree. That means you can have that common discussion. A communiqué is one of the things they hope will come out of it. But there’s also talk of a new institute being set up within the UK to look at safety, and also a supranational body, which will include lots of different governments and which the companies will sort of subscribe to, to design regulation going forward. 

Marc Filippino
So Madhu, who exactly is attending this thing? Who should we be looking out for? 

Madhumita Murgia
In terms of attendees of the summit, we know that the US vice-president Kamala Harris is going to be there to talk about regulation from a US perspective. And President Joe Biden talked about, for example, deepfakes and their risks in elections. But also they’re really keen that the companies report back on safety tests that they do around weapons, whether these systems can be used, for example, to build weapons. So the focus there is a lot on safety and security. And then we also know that there’s going to be representatives from Canada, from France, and of course, the Chinese delegation will be attending also. 

Marc Filippino
Madhumita Murgia is the FT’s AI editor.

[MUSIC PLAYING]

Marc Filippino
Odey Asset Management is closing. The hedge fund has been in crisis for months. It started after the FT published allegations that the firm’s founder, Crispin Odey, sexually assaulted and harassed 13 women, many of whom had worked at the firm. And more came forward with similar allegations following the story. That brought the total number of women to 20. The allegations pushed many banking partners to cut ties with Odey Asset Management. Investors also pulled their money out of what was once one of Europe’s largest hedge funds. A woman who said Odey sexually assaulted her talked about the firm’s closure. She said she hoped it would, quote, serve as a warning to all those in positions of power.

[MUSIC PLAYING] 

Marc Filippino
You can read more on all of these stories at FT.com for free when you click the links in our show notes. This has been your daily FT News Briefing. Make sure you check back tomorrow for the latest business news.

[MUSIC PLAYING]

Copyright The Financial Times Limited 2024. All rights reserved.