A Spanish influencer couple claimed they were denied boarding on a flight to Puerto Rico after relying on incorrect visa information obtained from an artificial intelligence chatbot.
One half of the couple, Caldass, said she had consulted the AI tool about visa requirements for Puerto Rico and was told that none were necessary. “I always do a lot of research, but I asked the chatbot, and it said no,” she stated in the footage.
The video, which has attracted over 6 million views on TikTok, shows the couple reacting emotionally to the situation. Between tears, Caldass suggested that the chatbot may have given misleading information in retaliation for previous insults she had directed at it. She added: “I sometimes insult it and call it useless… this must be revenge.”
Social media users criticised the couple for relying on unofficial sources for travel documentation. One comment read: “Natural selection, I suppose. If you are travelling across the ocean and you rely solely on a chatbot, you got off lightly.”
Another remarked: “Who uses a chatbot for visa advice?”
Some users defended the AI system, arguing that the couple may have asked an imprecise question. While Spanish citizens do not require a visa to enter Puerto Rico, they must obtain an Electronic System for Travel Authorization (ESTA) before travel, as the island is a United States territory.
In a separate case in the United States, a 60-year-old man reportedly experienced severe psychological symptoms after substituting a toxic chemical for table salt on the basis of dietary advice from an AI tool.
The man, who had no prior mental health history, spent three weeks in hospital suffering from hallucinations, paranoia and extreme anxiety. According to a medical journal report, he had replaced sodium chloride with sodium bromide, a compound historically used in sedatives and now found primarily in swimming pool cleaning agents.
He was later diagnosed with bromism, a rare condition that was once responsible for a significant proportion of psychiatric admissions in the late 19th and early 20th centuries. Symptoms include delusions, skin eruptions and gastrointestinal distress.
The man reportedly arrived at an emergency department claiming his neighbour was attempting to poison him.
Medical staff, concerned by the case, subsequently tested the chatbot and found it continued to recommend sodium bromide as a salt substitute, with no accompanying warning about potential health risks.