Recently, an investigation by "Xinhua Viewpoint" reporters found that some operators in the online black market use e-commerce platforms to resell illegally obtained face data and other identity information, along with "photo activation" tools and tutorials. Face data sells for as little as 0.5 yuan per record, and the photo-modification software for 35 yuan.
That is right: a person's face data can be bought for 50 cents. Leaks of user information and personal privacy are already commonplace, but few would have imagined that face data is left in an almost "naked" state, with the people concerned completely in the dark. It is alarming to contemplate.
Face recognition is among the most visible applications of artificial intelligence. In recent years, railway stations, hotels, some residential communities, schools, and retail payment systems have all adopted "face swiping," and the trend is spreading rapidly. Undeniably, thanks to technological progress, face recognition brings real convenience to users' work and daily lives; but if abused, it can bring unexpected trouble.
At the end of last year, the artificial intelligence ethics research group of the Nandu Personal Information Protection Research Center released its "Observation Report on Face Recognition Deployment Scenarios (2019)." The report points out that face recognition is being quietly deployed in public places: most people have their faces scanned and their whereabouts recorded without ever knowing it.
When the group's researchers investigated three shopping malls equipped with face recognition systems, Xiushui Street in Beijing, Xidan Joy City, and Yintai in77 in Hangzhou, they found that although the malls scanned customers' faces and tracked their movements through the stores, none of them informed customers or obtained their consent. Customers had no idea their faces had been scanned or their whereabouts recorded.
As these cases show, face recognition devices in many settings provide no privacy policy or user agreement, so the public cannot give informed consent. Tan Jianfeng, president of the Shanghai Information Security Industry Association, pointedly observed that a considerable number of Internet companies focus only on traffic while neglecting user safety, user experience, and privacy protection.
"When face recognition products come out, many companies advertise their recognition accuracy, but in fact, the higher the accuracy, the greater the risk," Tan Jianfeng said. These companies fail to mention, and users rarely consider, what happens on the back-end servers: in an Internet environment, any biometric authentication system requires a feature database. All biometric data entered into a computer is converted into computer code, and code can be intercepted, replayed, and reconstructed.
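The risk Tan describes can be illustrated with a deliberately simplified sketch (not any real system; all names and the matching logic are hypothetical): a server that stores users' face feature vectors and authenticates by similarity will accept an attacker who simply replays an intercepted or leaked feature vector, because, unlike a password, a face cannot be changed after the data leaks.

```python
import math

# Hypothetical toy server-side template store: user_id -> feature vector.
FEATURE_DB = {}

def enroll(user_id, features):
    """Store the user's biometric template on the server."""
    FEATURE_DB[user_id] = list(features)

def _cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def authenticate(user_id, features, threshold=0.99):
    """Accept if the submitted vector is close enough to the stored template."""
    stored = FEATURE_DB.get(user_id)
    if stored is None:
        return False
    return _cosine(stored, features) >= threshold

# Enrollment: the kind of feature vector a face-recognition pipeline
# might extract from a camera image (values are made up).
alice_features = [0.12, 0.87, 0.44, 0.91]
enroll("alice", alice_features)

# An attacker who intercepts the feature data in transit, or obtains it
# from a leaked database, can replay it verbatim and be accepted.
intercepted = list(alice_features)
print(authenticate("alice", intercepted))  # prints True
```

The point of the sketch is that the template itself is the credential: once the feature database leaks, every replay of it succeeds, which is why Tan argues that higher accuracy alone does not reduce, and may even magnify, the damage from a breach.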
In his view, servers hold vast databases of users' biometric features, and once such a database falls into the hands of hackers or criminals, the consequences are unimaginable. Any application of face recognition must therefore take users' privacy into account and guard against abuse; otherwise the costs may far outweigh the convenience.
It has to be said that face recognition is a double-edged sword, and in some scenarios the disadvantages outweigh the advantages. To prevent abuse of the technology as far as possible, we should enact laws, regulations, and national standards that clarify enterprise qualifications and codes of conduct; in more sensitive scenarios, the use of face recognition should be restricted or even prohibited, backed by a sound access mechanism, regulatory measures, and an accountability system.
In my opinion, face recognition is a good thing, but supporting governance must keep pace. Only by combining technical standards with legal constraints can face recognition truly benefit the public and strengthen users' sense of security, rather than pairing convenience with risk while people's face data is sold off behind their backs.
Editor in charge: PJ