Facial recognition
Nowhere to hide
Facial recognition is not just another technology. It will change society
THE human face is a remarkable piece of work. The astonishing variety of facial features helps people recognise each other and is crucial to the formation of complex societies. So is the face’s ability to send emotional signals, whether through an involuntary blush or the artifice of a false smile. People spend much of their waking lives, in the office and the courtroom as well as the bar and the bedroom, reading faces, for signs of attraction, hostility, trust and deceit. They also spend plenty of time trying to dissimulate.
Technology is rapidly catching up with the human ability to read faces. In America facial recognition is used by churches to track worshippers’ attendance; in Britain, by retailers to spot past shoplifters. This year Welsh police used it to arrest a suspect outside a football game. In China it verifies the identities of ride-hailing drivers, permits tourists to enter attractions and lets people pay for things with a smile. Apple’s new iPhone is expected to use it to unlock the homescreen.
Set against human skills, such applications might seem incremental. Some breakthroughs, such as flight or the internet, obviously transform human abilities; facial recognition seems merely to encode them. Although faces are peculiar to individuals, they are also public, so technology does not, at first sight, intrude on something that is private. And yet the ability to record, store and analyse images of faces cheaply, quickly and on a vast scale promises one day to bring about fundamental changes to notions of privacy, fairness and trust.
The final frontier
Start with privacy. One big difference between faces and other biometric data, such as fingerprints, is that they work at a distance. Anyone with a phone can take a picture for facial-recognition programs to use. FindFace, an app in Russia, compares snaps of strangers with pictures on VKontakte, a social network, and can identify people with a 70% accuracy rate. Facebook’s bank of facial images cannot be scraped by others, but the Silicon Valley giant could obtain pictures of visitors to a car showroom, say, and later use facial recognition to serve them ads for cars. Even if private firms are unable to join the dots between images and identity, the state often can. China’s government keeps a record of its citizens’ faces; photographs of half of America’s adult population are stored in databases that can be used by the FBI. Law-enforcement agencies now have a powerful weapon in their ability to track criminals, but at enormous potential cost to citizens’ privacy.
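How such matching works in practice: most modern systems reduce each face to a numeric embedding and treat recognition as a distance comparison between embeddings. Below is a minimal sketch, assuming the open-source face_recognition Python package; the file names and the 0.6 distance cut-off are illustrative assumptions, not details from the article.

    # Minimal one-to-many face matching, assuming the open-source
    # `face_recognition` package. File names and the 0.6 cut-off are
    # illustrative only.
    import face_recognition

    # Encode a reference photo (e.g. a social-network profile picture).
    known = face_recognition.load_image_file("profile_photo.jpg")
    known_encoding = face_recognition.face_encodings(known)[0]  # 128-d vector

    # Encode a snap taken at a distance with a phone.
    snap = face_recognition.load_image_file("street_snap.jpg")
    snap_encoding = face_recognition.face_encodings(snap)[0]

    # Recognition is a distance comparison, not a certainty:
    # a smaller distance means the faces are more alike.
    distance = face_recognition.face_distance([known_encoding], snap_encoding)[0]
    print("match" if distance < 0.6 else "no match", round(float(distance), 3))

Scaled up to millions of stored encodings, something like this comparison is presumably what lets an app such as FindFace put a name to a stranger's face.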
The face is not just a name-tag. It displays a lot of other information—and machines can read that, too. Again, that promises benefits. Some firms are analysing faces to provide automated diagnoses of rare genetic conditions, such as Hajdu-Cheney syndrome, far earlier than would otherwise be possible. Systems that measure emotion may give autistic people a grasp of social signals they find elusive. But the technology also threatens. Researchers at Stanford University have demonstrated that, when shown pictures of one gay man and one straight man, an algorithm could attribute their sexuality correctly 81% of the time. Humans managed only 61%. In countries where homosexuality is a crime, software which promises to infer sexuality from a face is an alarming prospect.
Keys, wallet, balaclava
Less violent forms of discrimination could also become common. Employers can already act on their prejudices to deny people a job. But facial recognition could make such bias routine, enabling firms to filter all job applications for ethnicity and signs of intelligence and sexuality. Nightclubs and sports grounds may face pressure to protect people by scanning entrants’ faces for the threat of violence—even though, owing to the nature of machine-learning, all facial-recognition systems inevitably deal in probabilities. Moreover, such systems may be biased against those who do not have white skin, since algorithms trained on data sets of mostly white faces do not work well with different ethnicities. Such biases have cropped up in automated assessments used to inform courts’ decisions about bail and sentencing.
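The claim that such systems "inevitably deal in probabilities" can be made concrete. A recognition system emits a match score; the deployer picks a threshold; and the error rates that threshold produces can differ between demographic groups, which is exactly what a bias audit would measure. The sketch below uses entirely synthetic score distributions, an assumption for illustration rather than measured data.

    # Toy bias audit with synthetic data: the same threshold can yield
    # unequal error rates for two groups if the model separates genuine
    # and impostor pairs less cleanly for one of them.
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated match scores: genuine = same person, impostor = different people.
    # Group B is assumed (hypothetically) to be under-represented in training.
    scores = {
        "A": (rng.normal(0.80, 0.08, 10_000), rng.normal(0.40, 0.08, 10_000)),
        "B": (rng.normal(0.72, 0.12, 10_000), rng.normal(0.48, 0.12, 10_000)),
    }

    threshold = 0.60  # every deployment must pick one

    for group, (genuine, impostor) in scores.items():
        false_reject = np.mean(genuine < threshold)    # real matches missed
        false_accept = np.mean(impostor >= threshold)  # wrong people flagged
        print(f"group {group}: false-reject {false_reject:.1%}, "
              f"false-accept {false_accept:.1%}")

On numbers like these, group B suffers both more missed matches and more wrong flags at the same threshold; an auditor would ask a supplier to publish such per-group rates.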
Eventually, continuous facial recording and gadgets that paint computerised data onto the real world might change the texture of social interactions. Dissembling helps grease the wheels of daily life. If your partner can spot every suppressed yawn, and your boss every grimace of irritation, marriages and working relationships will be more truthful, but less harmonious. The basis of social interactions might change, too, from a set of commitments founded on trust to calculations of risk and reward derived from the information a computer attaches to someone’s face. Relationships might become more rational, but also more transactional.
In democracies, at least, legislation can help alter the balance of good and bad outcomes. European regulators have embedded a set of principles in forthcoming data-protection regulation, decreeing that biometric information, which would include “faceprints”, belongs to its owner and that its use requires consent—so that, in Europe, unlike America, Facebook could not just sell ads to those car-showroom visitors. Laws against discrimination can be applied to an employer screening candidates’ images. Suppliers of commercial face-recognition systems might submit to audits, to demonstrate that their systems are not propagating bias unintentionally. Firms that use such technologies should be held accountable.
Such rules cannot alter the direction of travel, however. Cameras will only become more common with the spread of wearable devices. Efforts to bamboozle facial-recognition systems, from sunglasses to make-up, are already being overtaken; research from the University of Cambridge shows that artificial intelligence can reconstruct the facial structures of people in disguise. Google has explicitly turned its back on matching faces to identities, for fear of its misuse by undemocratic regimes. Other tech firms seem less picky. Amazon and Microsoft are both using their cloud services to offer face recognition; it is central to Facebook’s plans. Governments will not want to forgo its benefits. Change is coming. Face up to it.
(Sep 9th 2017)