TECH NEWS – In an increasingly dystopian China, something we have been expecting for a while has happened: the misuse of artificial intelligence has landed someone in jail.
The South China Morning Post reports that Chinese police have arrested a man who allegedly used ChatGPT to create a fake news story and spread it online – possibly the first time anyone has been detained for misusing artificial intelligence. The man was handcuffed in the country’s northwestern province of Gansu, and police revealed only his surname, Hong. According to a police statement, he “used artificial intelligence to concoct false and untrue information.”
Hong’s article surfaced on April 25 and falsely claimed that a local train crash had killed nine people. According to cybersecurity officials, it was posted from more than 20 accounts on Baijiahao, a microblogging platform run by Baidu (the local Google equivalent), and read by at least 15,000 people. This is the first arrest in China since the country’s law on artificial intelligence and deepfakes took effect in January. The Administrative Provisions on Deep Synthesis for Internet Information Service target technologies that generate text, images, audio, or video and explicitly mention deep-learning models. The law does not outlaw content created with these technologies, but it does require such content to be “clearly labeled.”
The police discovered that the suspect ran a company, and ten days after the article was published came the arrest, the seizure of his PC, and a search of his house. According to their statement, Hong admitted that he had fed elements of older stories popular in China into ChatGPT, producing several versions of the fake story, which were then uploaded. Hong said friends on WeChat had told him he could make money from clicks. His alleged crime of “picking quarrels and provoking trouble” carries a sentence of up to five years in prison, and the authorities may impose it on Hong as a deterrent to others.
It sounds scary.
Source: PCGamer