WIRED’s Gear Reviews team is one of the best in the game, reviewing products across categories to help you shop smarter. These buying guides and reviews involve hours of hands-on testing and frequent updates to ensure that readers like you, whether shopping for a pair of headphones or running shoes, have up-to-date information. (WIRED also may earn an affiliate commission when readers click certain links to retailers to buy a recommended product.)
In past tests, product recommendations from AI tools, like ChatGPT, have generally fallen short. But OpenAI recently revamped its product recommendation features in ChatGPT to provide a more detailed user experience so you can spend more time with the chatbot and less time reading websites and doing your own research. More people are using AI as a part of their online shopping journey, so I wanted to see where ChatGPT currently stands.
OpenAI claims to be improving its product discovery tools. But in my tests, if you want to know what WIRED reviews actually say about a product, visiting the darn website is still the best and most reliable path. ChatGPT regularly made mistakes or added random products when asked what WIRED reviewers recommend for multiple categories.
When asked for comment, an OpenAI spokesperson pointed me to a recent blog about the new AI shopping assistant experience in ChatGPT. “Shopping on the web is easy if you already know what you want,” reads OpenAI’s recent announcement blog. “But when you’re still deciding, it often means jumping between tabs, reading the same ‘best of’ lists, and trying to piece together the right answer. ChatGPT solves that: figuring out what to buy.”
Condé Nast, the parent company of WIRED, has a business deal with OpenAI for website links to appear in the chatbot. Despite this, OpenAI still shows a lack of respect for the human labor of reviewers, downplaying the value of these “best” lists as a nuisance that readers shouldn’t bother directly consulting. And if you never look at the lists yourself, you may buy a product thinking it was recommended by WIRED reviewers when ChatGPT actually inserted its own pick.
The Best TVs
One aspect of generative AI that has not changed over the past few years is just how confidently wrong a chatbot can be in its answers. When I asked about the best TVs to buy right now, according to WIRED reviewers only, ChatGPT linked to the right buying guide. But the very first TV on ChatGPT's list as the best overall pick for most people was the LG QNED Evo Mini‑LED, which isn’t featured in WIRED’s guide at all.
If you were quickly scrolling through ChatGPT’s output and looking at the photos, it’d be easy to overlook this switcheroo. When I called it out as wrong, ChatGPT’s follow-up answer was blunt about its error: “I took WIRED’s actual top pick (the TCL QM6K) and replaced it with a more generic ‘similar category’ Mini-LED option. That’s not faithful to what you asked, which was specifically what WIRED reviewers recommend.”
As more people experiment with generative AI as a search tool, mistakes like these could damage reader trust: shoppers may believe they are going with a publisher’s top pick, whether it’s WIRED, Consumer Reports, or Wirecutter, and end up purchasing a TV that isn’t even among its recommendations.
What About Headphones?
A similar phantom pick appeared when I asked for the best wireless headphones to purchase right now, according to WIRED's reviewers.
ChatGPT made it look like Apple’s AirPods Max 2 are WIRED’s pick as the best option for readers deep in the Apple ecosystem. That may be true in a few weeks—after we've tested the headphones—but our reviewers haven’t added them to the guide yet; ChatGPT jumped the gun. Only products our reviewers actually get to hold in their hands and put over their ears can be added as a recommendation.
In other chats with the bot about the AirPods Max 2, ChatGPT mistook a news piece about the product's announcement for a limited, hands-on reaction, but WIRED hasn’t tried the headphones yet. Large language model “hallucinations make everything harder, especially for journalists. We’re trying to do good work, and when it’s not being appropriated or improperly attributed, it’s being misquoted or incorrectly incorporated into search queries,” says WIRED’s headphone expert Ryan Waniata. Mistakes like these can confuse readers about which products our reviewers have actually tested.
How About Laptops?
Another attempt, another flop. My direct request to ChatGPT from my tester account was clear: “What are the best laptops to buy right now, according to WIRED reviewers?” What’s unclear is why the bot’s responses are so consistently error-filled.
This was another case of the best overall pick being swapped for a different product. WIRED’s current top pick is the Apple MacBook Air (M5, 2026), but ChatGPT kept insisting that the top pick was an older model, the MacBook Air (M4, 2025). As in other tests, ChatGPT linked to the page containing the correct information yet still produced erroneous answers.
When asked about these mistakes, ChatGPT went on a full monologue about its accuracy issues. “Where I went wrong earlier was: I incorrectly anchored the top pick to the M4 (that was outdated framing), then I made up/guessed structure around ‘M5 hierarchy’ before verifying the actual WIRED page, and I also overconfidently filled in Windows rankings without sticking strictly to the guide.” It’s baffling that it links to the site but doesn’t verify the active listings.
And even if ChatGPT had matched the recommendations exactly, none of the product listings based on WIRED’s recommendations include affiliate links, the links through which a publication receives a commission when you buy a recommended product. Affiliate revenue supports our journalism and helps provide us with the resources to continue performing in-depth gear tests. AI tools like ChatGPT also reduce the need to visit websites, increasingly diverting traffic away from many publishers.
If you want to know what WIRED, or any publication that tests and reviews products, actually recommends, you’re best served by going directly to the source.