OpenAI allows paying users to create custom versions of ChatGPT that other subscribers can then access through an online store

Chatbots that claim to predict future share prices and evade plagiarism detectors are among the most popular on OpenAI’s new app store, according to data that provides the first glimpse into the practical ways that millions are using generative artificial intelligence.

The Microsoft-backed start-up has allowed paying users to create custom versions of ChatGPT since November, with other subscribers then able to access these so-called “GPTs” through an online store.

According to new data from analytics group SimilarWeb, some of the most popular GPTs serve educational purposes, with the second most used app being Consensus, a tool to search and summarise academic papers. 

Other apps have surged in usage this year, including design tools that can instantly generate images, translate between languages or help with job applications by reviewing CVs and cover letters.

But a Financial Times analysis also found many popular GPTs could be in breach of OpenAI’s usage policies, which ban chatbots that provide financial, legal or medical advice without review by a qualified professional.

Five of the most-viewed GPTs are described as being able to produce content that bypasses the detection tools used by schools and universities to determine whether students have produced essays and answers using AI. These tools were viewed at least 3mn times in total, despite OpenAI barring apps that engage in or promote academic dishonesty.

Another app called Finance Wizard, which has been used more than 200,000 times, claims to reveal future stock movements. Its creator told the FT the app makes predictions based on historical data and contains disclaimers warning against using it as financial advice.

OpenAI chief Sam Altman said last year that the company would ensure GPTs “follow our policies before they’re accessible”. OpenAI said it uses a combination of automated systems, human review and user reports to find GPTs that may violate its policies. Users can also now rate and review GPTs.

Altman’s vision is to use customised versions of ChatGPT, OpenAI’s hit chatbot which has 100mn weekly users, to create a digital platform that will also supercharge its business. The strategy is seen as an attempt to replicate the success of Apple’s app store for the iPhone.

Altman has said top GPT creators will start to receive a slice of revenue from their apps later this year, a move intended to draw more developers into making GPTs.

SimilarWeb’s data is limited, as the figures do not include views from OpenAI’s ChatGPT mobile app, which has been downloaded 169mn times, according to analytics group data.ai. But the figures also suggest that customised GPTs so far have limited appeal.

GPTs created by subscribers accounted for just 1.5 per cent of desktop visits to ChatGPT’s website in February, the SimilarWeb data showed. Weekly traffic counts have also levelled off since the store launched.

“In some ways OpenAI is following a very standard ‘how to build a platform’ template that’s so predictable that it might have used ChatGPT to write it,” said Benedict Evans, an independent technology analyst. “So, we have a developer conference, an API and an app store. But it’s not clear to me whether this really has traction.”

OpenAI said millions of people had interacted with GPTs since the store launched, adding: “We know there are improvements to make to the GPT store and we appreciate the feedback from developers to help us improve.”

Some established organisations, including hiking app AllTrails, education non-profit Khan Academy and travel search engine Kayak, have created popular GPTs.

Canva, an Australian technology company, was among the first to launch a custom GPT, which allows users to create social media graphics and has racked up more than 4.4mn views.

“We see it as an opportunity to meet users where they are,” said Duncan Clark, head of Europe at Canva.

Copyright The Financial Times Limited 2024. All rights reserved.