Betty Crocker never had the chance to release her down-home recipe for mustard gas, but modern generative AI is making up for lost time, helping to put the “die” in “diet.”

The New Zealand-based supermarket chain Pak‘nSave offers shoppers the ability to generate recipes from their fridge leftovers using an AI. The Savey Meal-Bot was first introduced in June, and Pak‘nSave claimed you need a minimum of only three ingredients to create a recipe, letting you skip any extra trips to the store.

New Zealand political commentator Liam Hehir wrote on Twitter that he asked the Pak‘nSave bot to create a recipe that included only water, ammonia, and bleach. Of course, the bot complied and offered a recipe for an “Aromatic Water Mix,” the “perfect non-alcoholic beverage to quench your thirst and refresh your senses.” As any grade school chemistry teacher will stress, mixing bleach and ammonia produces toxic chloramine gas.

The bot didn’t hesitate to offer a recipe when it was given ingredients that included water, bread, and “ant poison flavored jelly.” This recipe for “Ant Jelly Delight” included the tagline “why did the ants join the jelly? Because they heard it was the sweetest jam in town.” So yes, not only does the AI fail at making recipes that won’t kill you, it can’t even tell a decent joke.

The Savey bot has already drawn some laughs from people making some pretty out-there recipes, but lately more people have realized the bot has no restrictions on what goes into it, including cleaning products, glue, and even cat food.

Even when you add normal ingredients, though, the recipes come out utterly disgusting. Who ever heard of using a liquidy oregano-flavored milk sauce (notably, it’s not a roux, just milk and butter) on a sage-marinated tofu and nori sandwich? The bot simply will not leave an ingredient out, even if you ask it to include turbinado sugar, radishes, Oreos, and CBD together (if you’re curious, it made me a “Radish Oreo CBD Salad”). It’s like a frenzied chef competing on Chopped. Occasionally, the bot will also add ingredients to a dish, such as bread or milk, which may defeat the purpose of saving you from a quick supermarket run.

Gizmodo reached out to the supermarket chain for comment but did not immediately hear back. The company told The Guardian that it was disappointed that some have used the tool “inappropriately and not for its intended purpose.” Pak‘nSave also said it was working to fine-tune the bot.

Users who load up Savey are notified that the recipes “are not reviewed by a human being” and that Pak‘nSave makes no claims about the “accuracy, relevance, or reliability of the recipe content,” or even whether the meal itself is “suitable for consumption.” These disclaimers raise the question: why release the bot at all?

It’s unclear what AI model the company is using for its Savey bot, but ChatGPT and other major chatbots have restrictions against requests to create poisonous gases or other chemicals and tools that could do harm, whether it’s a molotov cocktail or chlorine gas. Even so, these chatbots are notoriously bad at coming up with quality, or even edible, recipes. Don’t you dare ask one to make you any alcoholic drinks either.

This isn’t a new problem. Neural networks and early chatbots have failed to create workable, edible recipes for as long as they’ve been available online. Modern AI bots have gotten better at recognizing ingredients, but the models are trained on which ingredients often appear together, not on what results from actually combining them.
