
ChatGPT “Endangering Motorcyclists”

ChatGPT is putting bikers in danger, according to a motorcycle gear testing website. The online Artificial Intelligence (AI) service offers to answer any question, drawing on information gathered from the web and without any direct human oversight.

Roy Martin, founder of the US-based Motorcycle Gear Hub – a website which tests helmets, jackets, boots and gloves – says that ChatGPT is providing inaccurate information when asked questions about motorcycle protective gear. “We have received a huge number of questions from our readers over the last month regarding the inaccuracies of ChatGPT’s answers,” he said. “(It) is also posting inaccurate answers that are endangering the safety of motorcycle riders.”

Martin adds that on 9th April 2023 he spent an hour asking ChatGPT a range of simple questions about bike gear and that all of the answers were inaccurate. When asked ‘what is EN 17092’ (the European standard for jackets, trousers and suits), ChatGPT wrongly included gloves and boots, which have a different standard. Asked what the classes are for EN 17092, it listed four – A, B, C and D. There are actually five of them – A, AA, AAA, B and C. Of these, C has the lowest abrasion-resistance, but ChatGPT insisted that C was one of the highest standards. Other errors included an incorrect listing for CE Level 2 gloves.

“It’s impossible to know what kind of garbage ChatGPT has been fed to provide such wrong and inaccurate answers,” concluded Roy Martin.

“I’ve often wondered whether AI is related to military intelligence, often cited as a contradiction in terms,” said BMF Chair Jim Freeman.

Written by Peter Henshaw

Top image courtesy of Joyi Chang/Alamy
