He sent numerous emails to Air Canada with screenshots of his conversation with the chatbot attached in an attempt to retrieve the money, the complaint said.

But on Feb. 8, 2023, an Air Canada representative informed him that the chatbot had provided “misleading words” and that the company’s bereavement policy did not apply discounts retroactively. Moffatt was told by the airline that it would update the chatbot so that its messages would align with the information posted on the company website.

The peeved passenger then filed suit against the airline, which claimed in court that the chatbot was a “separate legal entity” and thus responsible for its own actions.

Last week, a Canadian tribunal sided with Moffatt and ordered Air Canada to issue a refund of roughly $600.

“While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website,” wrote Christopher Rivers, a member of British Columbia’s Civil Resolution Tribunal.

“I find Air Canada did not take reasonable care to ensure its chatbot was accurate,” Rivers continued. “It makes no difference whether the information comes from a static page or a chatbot.”

While Air Canada argued that Moffatt could find the correct information on another part of its website, Rivers wrote, “it does not explain why the webpage titled ‘Bereavement travel’ was inherently more trustworthy than its chatbot. It also does not explain why customers should have to double-check information found in one part of its website on another part of its website.”

An Air Canada spokesperson said the airline will comply with the tribunal’s decision. “We consider this matter closed,” the rep told The Post on Monday. As of Monday, the chatbot, launched last year, was not available on Air Canada’s site.

© 2024 NYP Holdings, Inc.

‘Cited article was never written,’ claims Professor

While ChatGPT has people talking about the power of Artificial Intelligence, the OpenAI chatbot recently named an American law professor in a false sex scandal by including him in a generated list of legal scholars who had sexually harassed someone.

The accusation was based on a fabricated article in The Washington Post. However, the professor and the Post both confirmed that the article never existed.

The professor, who was falsely accused of sexual harassment by ChatGPT, has been identified as Jonathan Turley of George Washington University.

Professor Turley revealed that a lawyer had reportedly asked the AI chatbot to generate a list of legal scholars who had committed sexual harassment as part of a study. In creating the list, the chatbot included Turley’s name, accusing him of making sexually suggestive comments and attempting to touch a student inappropriately during a class trip to Alaska. The professor has said he never took a trip to Alaska while working at the school.

Following the accusations, the professor highlighted the accuracy and reliability issues with AI chatbots like ChatGPT. “It is only the latest cautionary tale on how artificial ‘Artificial Intelligence’ can be,” Turley was quoted as saying by the Independent.