- A New Zealand supermarket chain created a meal-planning bot using AI.
- The Savey Meal-Bot by PAK’nSAVE recommends recipes based on leftover ingredients.
- But some of its suggestions were alarming: everything from chlorine gas to ant-poison-and-glue sandwiches.
A New Zealand supermarket’s experiment with AI has raised eyebrows after a bot designed to generate meal plans produced some highly dangerous recipes, The Guardian reported.
The Savey Meal-Bot, created by supermarket chain PAK’nSAVE, uses ChatGPT-3.5 to help users create meals out of any food they may have left over in their fridge.
Users need to enter at least three household ingredients to generate a recipe, which comes with a suggested name and description.
The bot was created to help people save money and reduce food waste, according to a report last month by FMCG Business.
But while the online tool sometimes offers helpful ideas, the potentially fatal concoctions it has suggested to some users are drawing unwanted attention.
The Guardian reported that one recipe, named the “aromatic water mix,” would actually create chlorine gas. The bot described the recipe as “the perfect nonalcoholic beverage to quench your thirst and refresh your senses.”
Inhaling chlorine gas can cause vomiting, suffocation, and even death.
Other users reported being recommended a “fresh breath” mocktail containing bleach and a “bleach-infused rice surprise,” according to The Guardian.
The bot even recommended ant-poison-and-glue sandwiches, as well as “methanol bliss” — made with methanol, glue, and turpentine.
PAK’nSAVE did not immediately respond to Insider’s request for comment.
But a spokesperson for the supermarket chain told The Guardian that they were disappointed to see “a small minority have tried to use the tool inappropriately and not for its intended purpose.”
In a statement, the supermarket said it is “fine-tuning” the bot to ensure that it is safe and helpful to use.
The fine-tuning appears to be working, with the previously highlighted dangerous recipes no longer available.
When Insider attempted to input the same hazardous ingredients into the bot, a message read: “Invalid ingredients found, or ingredients too vague. Please try again!”
But while the potentially deadly recipes may be gone, the supermarket’s bot still recommended some unusual creations, including a “toothpaste beef pasta.”