One theory holds that AI assistants with voice commands make life more convenient. But these bots are usually scripted to imitate everyday human conversation, and everyday conversation is full of pleasantries. Whether it's Amazon's Alexa or Apple's Siri, a bot making small talk just feels a bit odd. The culprit is skeuomorphism (design that mimics an existing model): designers are often trapped by the past, accustomed to building on what they already know, and afraid that novel designs will cause a shock, so they graft on useless features and the pace of innovation naturally slows. Conversational AI has fallen into exactly this rut. Bots talk like 19th-century butlers, when people just want some quick information, not a performance of an Oscar Wilde play. Perhaps it would be better, and simpler, to let bots sound like what they are: robots.
Stop the Chitchat. Bots Don't Need to Sound Like Us
The theory is that digital AI assistants with voice commands make life more convenient.
But these assistants are scripted to emulate everyday conversation. And everyday conversation is filled with little pauses and filler words, the "phatic" spackle of social interactions. That's why Amazon's Alexa says things like "Sorry, I'm not sure about that," or Siri says "OK, here's what I found…". It's how humans talk. But when a bot does it, something just doesn't feel quite right.
For years, sci-fi promised that one day we'd interact with machines as if they were people. But what if conversation turns out to be a lousy idea? We've been down this road before. It's the problem of so-called skeuomorphic design: In the early days of a new technology, designers mimic the look and feel of older media. Apple's first iPad calendar app resembled a paper day planner, including "pages" that you'd rip away as time passed.
Sometimes designers use skeuomorphs because they're imprisoned by the past, unable to imagine the demands of the new. Sometimes they do it on purpose, to ease future shock.
Either way, skeuomorphs slow things down by adding functionally useless interactions.
"Conversational AI" is suffering through these precise growing pains. Our bots talk like 19th-century butlers, while I'm looking for some quick information Siri sounds like she's auditioning for a role in an Oscar Wilde play. We're bothered by AIs that try to sound human. Let's see how we like it when they try to sound like robots.
▍Words & Expressions
the theory is that: the idea is that; the plan is that; the intention is that. (Common collocation: in theory, but not in practice, meaning the idea sounds good but it won't work in reality.)
the "phatic" spackle of social interactions /?fat?k/: "phatic" means the language used for general social interaction, not really language to convey information or ask questions. "Spackle" is sth. we use to fill a gap or a crack in a wall. And "phatic spackle" is another way to say "conversation filler".
something doesn't feel quite right: in this sentence, it means sth. feels strange. (Another common situation where this phrase is used: when we are planning to do sth. and we have a feeling that maybe this thing is wrong, unjust, dishonest, or it may cause problems for ourselves or others.)
turn out to be a lousy idea /?la?zi/: If sth. "turns out to be a lousy idea", then at first it seemed like a good idea, but later we realized it was a bad idea.
be imprisoned by the past /?m?pr?znd/: to have old-fashioned or outdated ideas and attitudes; to think too much about the past or to believe that things are the same way as they were in the past. (Common collocation: living in the past, often used with "move on", meaning to stop thinking about the past and focus on the present and future.)
growing pains: pains that occur in the limbs (legs) of some young children. (Extended meaning: the difficulties experienced in the early stages of a plan or a process.) (Common collocation: experience growing pains)
bother /?b?e?/: vt. to disturb; to annoy. (Common collocation: (I'm) not bothered, used for saying that sth. is not important to me and I don't really care.)