
The voice you hear on the other end of your phone call may not be who you think it is, the person you’re texting with could really be a bot, and the face in a photo or video on your favourite dating app might not even exist.
Technological advancements in artificial intelligence are creating the potential to fuel romance scams, said Jeff Clune, an associate professor of computer science at the University of British Columbia.
Scammers now have “more tools in their toolbox to hoodwink people, especially people who are not aware of recent advances in technology,” Clune said in an interview.
Such advancements include voice simulators, face generators and deepfakes, in which an existing image or video is used to create fake but believable footage. Another is chatbots such as ChatGPT, which generate humanlike text responses across all sorts of online platforms.
Source: Canadian Broadcasting Corporation
Date: February 23rd, 2023
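
To give the discussion some technical footing, the sketch below shows how little code it now takes to produce humanlike text with a freely available model. It uses the open-source Hugging Face transformers library and the small GPT-2 model as illustrative assumptions; the article does not name any specific tool, and modern chatbots like ChatGPT are far more capable than this minimal example.

```python
# A minimal sketch of automated text generation, assuming the Hugging Face
# "transformers" library and the small GPT-2 model (illustrative choices,
# not anything named in the article).
from transformers import pipeline

# Load a small, freely available text-generation model.
generator = pipeline("text-generation", model="gpt2")

# Continue a short message the way a chatbot might.
prompt = "Hey, I really enjoyed talking with you yesterday. I was thinking"
result = generator(prompt, max_new_tokens=40, num_return_sequences=1)

print(result[0]["generated_text"])
```

Even this toy example illustrates the point Clune raises: generating plausible conversational text no longer requires specialized expertise, which lowers the barrier for misuse.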
Discussion
- For a company, it might be very useful to have an AI bot that can interact in a genuine way with customers. For a person, it could be incredibly damaging to have an AI bot act as a romantic partner. Where does the acceptable use end and the harmful use begin, and why?
- What possible steps could be taken to limit malicious AI bot use?