流利說 懂你英語 — personal notes, Level 8, Unit 1, Part 2: On Controlling AI

英語流利說 Level 8, Unit 1, Part 2: On Controlling AI
Sam Harris: Can we build AI without losing control over it?
TEDSummit · 14:27 · Posted September 2016

Can we build AI without losing control over it?
Can we build artificial intelligence without it getting out of our control?

1
I'm going to talk about a failure of intuition that many of us suffer from.
I'm going to talk about an intuition that many of us get wrong.
(Listening note) I guessed the opening "I'm going to" entirely; it sounded like "much of about".

2
It's really a failure to detect a certain kind of danger.
In terms of detecting a certain kind of danger, it really is a failure.
a certain kind of: a particular sort of ...; one specific kind of ...
e.g.: This is a certain kind of difficulty.

3
I'm going to describe a scenario that I think is both terrifying and likely to occur,
I'm going to describe a scenario that I think is both frightening and likely to happen,

4
and that's not a good combination, as it turns out.
As it turns out, that is not a good combination.
The "combination" here refers to "both terrifying and likely to occur".
(Listening note) "as it turns out" sounded like "as transout".

5
And yet rather than be scared, most of you will feel that what I'm talking about is kind of cool.
Rather than being scared, most of you will feel that what I'm about to say is kind of cool.

6
I'm going to describe how the gains we make in artificial intelligence could ultimately destroy us.
I will describe how the gains we make in artificial intelligence could ultimately destroy us.

7
And in fact, I think it's very difficult to say how they won't destroy us or inspire us to destroy ourselves.
In fact, I think it is very hard to see how they would not destroy us, or drive us to destroy ourselves.

8
And yet if you're anything like me, you'll find that it's fun to think about these things.
And yet, if you are anything like me, you will find it fun to think about these things.

9
And that response is part of the problem. OK? That response should worry you.
That response is part of the problem. Right? That response should worry you.
"That response" refers to "it's fun to think about these things": the prospect of AI destroying humanity is both intriguing and worrying.
(Listening note) The "OK" is spoken only for a split second.

10
If I were to convince you in this talk that we were likely to suffer a global famine, either because of climate change or some other catastrophe,
If I were to convince you in this talk that we are likely to suffer a global famine, either because of climate change or some other catastrophe,
famine n. an extreme scarcity of food; starvation; severe shortage
e.g.: Many old people lived through a national famine, so they are very thrifty.

11
and that your grandchildren, or their grandchildren, are very likely to live like this, you wouldn't think, "Interesting. I like this TED Talk."
— that your grandchildren, or their grandchildren, would very likely live like that, you would not think, "Interesting, I like this TED Talk."

12
Famine isn't fun. Death by science fiction, on the other hand, is fun,
Famine is not fun; death as depicted in science fiction, however, is.

13
and one of the things that worries me most about the development of AI at this point is that we seem unable to marshal
At this point, what worries me most about the development of AI is that we seem unable to marshal
marshal n. a marshal (the highest military rank); master of ceremonies; vt. to arrange or organize; to usher
e.g.: These things are in a mess, so how can I marshal them?

14
an appropriate emotional response to the dangers that lie ahead.
an appropriate emotional response to the dangers that lie ahead.
lie ahead: to be coming; to lie in the future
e.g.: Face the challenge directly, and success lies ahead.

15
I am unable to marshal this response, and I'm giving this talk.
I myself am unable to marshal this response — and I'm the one giving this talk.


16
It's as though we stand before two doors.
It is as though we stand before two doors.

17
Behind door number 1, we stop making progress in building intelligent machines.
Behind door number one, we stop making progress in building intelligent machines.

18
Our computer hardware and software just stops getting better for some reason.
For some reason, our computer hardware and software just stop getting better.

19
Now take a moment to consider why this might happen.
Now take a moment to consider why this might happen.

20
I mean, given how valuable intelligence and automation are, we will continue to improve our technology if we are all able to.
I mean, given how valuable intelligence and automation are, we will continue to improve our technology if we are at all able to.
Here "given" means "considering" or "taking into account".
e.g.: Given that he is a green hand, it is pardonable that he has made such a stupid mistake.

21
What could stop us from doing this?
What could stop us from doing this?

22
A full-scale nuclear war?
A full-scale nuclear war?

23
A global pandemic?
A global pandemic?

24
An asteroid impact?
An asteroid impact?

25
Justin Bieber becoming president of the United States?
Justin Bieber becoming president of the United States?

26
The point is, something would have to destroy civilization as we know it.
The point is that something would have to destroy civilization as we know it.
as we know it: in the form we are familiar with
e.g.: The point is, behavior like that leaves a bad impression on other people, as we all know.

27
You have to imagine how bad it would have to be to prevent us from making improvements in our technology permanently, generation after generation.
You have to imagine how bad things would have to be to stop us from improving our technology permanently, generation after generation.

28
Almost by definition, this is the worst thing that's ever happened in human history.
Almost by definition, this would be the worst thing that has ever happened in human history.

29
So the only alternative, and this is what lies behind door number two,
So the only alternative — and this is what lies behind door number two —

30
is that we continue to improve our intelligent machines, year after year after year.
is that we keep improving our intelligent machines, year after year after year.

31
At a certain point, we will build machines that are smarter than we are,
At some point, we will build machines that are smarter than we are,
at a certain point: at some point (in time)

32
and once we have machines that are smarter than we are, they will begin to improve themselves.
and once we have machines that are smarter than we are, they will begin to improve themselves.

33
And then we risk what the mathematician IJ Good called an "intelligence explosion," that the process could get away from us.
And then we risk what the mathematician I. J. Good called an "intelligence explosion": the process could get away from us.
Once intelligent machines reach the tipping point of human-level intelligence, humans will have no way to fight back, because machines can improve themselves far faster than we can.

34
Now, this is often caricatured, as I have here, as a fear that armies of malicious robots will attack us.
This is often caricatured, as I have here, as the fear that armies of malicious robots will attack us.
caricature v. to depict as a cartoon; to describe in an exaggerated, comical way
e.g.: The scenarios of many famous stories have been caricatured.

35
But that isn't the most likely scenario.
But that is not the most likely scenario.

36
It's not that our machines will become spontaneously malevolent.
It is not that our machines will become malevolent of their own accord.
spontaneously adv. of one's own accord; naturally; involuntarily
malevolent = malicious adj. wishing harm to others; spiteful; ill-natured
e.g.: If a person keeps doing unethical things, he or she may become malevolent without even realizing it.

37
The concern is really that we will build machines that are so much more competent than we are that the slightest divergence between their goals and our own could destroy us.
The real concern is that we will build machines so much more competent than we are that the slightest divergence between their goals and our own could destroy us.
divergence n. difference; disagreement; a spreading apart (of air currents or ocean waters)
e.g.: Divergence will always exist, but the attitude you take determines whether you can reduce it.

38
Just think about how we relate to ants.
Just think about how we relate to ants.

39
We don't hate them. We don't go out of our way to harm them.
We don't hate them, and we don't go out of our way to harm them.
go out of one's way: to make a special effort; to take extra trouble
e.g.: We don't go out of our way to harm anyone, but that doesn't mean we are afraid of trouble.

40
In fact, sometimes we take pains not to harm them. We step over them on the sidewalk.
In fact, sometimes we take pains not to harm them: we step over them on the sidewalk.
take pains: to make a careful effort; to go to great trouble
step over: to step across; (in a debugger) to execute a call without stepping into the function
e.g.: Take pains now, and you will step over all the difficulties.

41
But whenever their presence seriously conflicts with one of our goals, let's say, when constructing a building like this one,
But whenever their existence seriously conflicts with one of our goals — say, when we are putting up a building like this one —

42
we annihilate them without a qualm.
we annihilate them without a qualm.
annihilate vt. to destroy completely; to defeat utterly; to abolish
qualm n. a misgiving; a feeling of unease
e.g.: Annihilation and destruction run through human history, and moving forward with a qualm is the right way to make progress.
There is a line in Liu Cixin's The Three-Body Problem series: "毀滅你,與你何干" — roughly, "We destroy you; what is that to you?"

43
The concern is that we will one day build machines that, whether they're conscious or not, could treat us with similar disregard.
The concern is that one day we will build machines that, whether they are conscious or not, could treat us with similar disregard.
disregard v./n. to ignore; to pay no attention or respect to
e.g.: Treating people with disregard will earn you disregard in return.

44
Now, I suspect this seems far-fetched to many of you.
I suspect this seems far-fetched to many of you.
far-fetched adj. implausible; hard to believe
e.g.: This piece of news is a little far-fetched as evidence of whether aliens exist.

45
I bet there are those of you who doubt that super intelligent AI is possible, much less inevitable.
I bet some of you doubt that superintelligent AI is even possible, much less inevitable.
much less: let alone; not to mention
e.g.: We don't even understand what he means, much less help him.

46
But then you must find something wrong with one of the following assumptions.
But then you must find something wrong with one of the following assumptions.

47
And there are only three of them.
And there are only three of them.

48
Intelligence is a matter of information processing in physical systems.
Intelligence is a matter of information processing in physical systems.

49
Actually, this is a little bit more than an assumption.
Actually, this is a little bit more than an assumption.

50
We have already built narrow intelligence into our machines, and many of these machines perform at a level of super human intelligence already.
We have already built narrow (weak) intelligence into our machines, and many of these machines already perform at superhuman levels.

51
And we know that mere matter can give rise to what is called "general intelligence," an ability to think flexibly across multiple domains.
And we know that mere matter can give rise to what is called "general intelligence", the ability to think flexibly across multiple domains.
give rise to: to cause; to bring about
e.g.: Bad sleep can give rise to bad work.

52
because our brains have managed it.
because our brains have managed it.

53
There's just atoms in here, as long as we continue to build systems of atoms that display more and more intelligent behavior,
There are just atoms in here; as long as we keep building systems of atoms that display more and more intelligent behavior,

54
we will eventually, unless we are interrupted, we will eventually build general intelligence into our machines.
eventually, unless we are interrupted, we will build general intelligence into our machines.

55
It's crucial to realize that the rate of progress doesn't matter.
It is crucial to realize that the rate of progress doesn't matter,

56
because any progress is enough to get us into the end zone.
because any progress at all is enough to carry us into the end zone eventually.

57
We don't need Moore's law to continue.
We don't need Moore's law to continue.

58
We don't need exponential progress.
We don't need exponential progress.

59
We just need to keep going.
We just need to keep going.
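The point of items 55-59 is that the rate of progress is irrelevant so long as progress continues at all. A minimal sketch of that argument, using purely hypothetical numbers (the capability values, threshold, and yearly gain below are made up only to show that any constant positive rate eventually crosses any fixed bar):

```python
# Purely illustrative: even slow, non-exponential progress eventually crosses a fixed threshold.
capability = 1.0      # hypothetical current machine capability (arbitrary units)
human_level = 100.0   # hypothetical threshold for human-level general intelligence
annual_gain = 0.5     # a slow, constant (non-exponential) yearly improvement

years = 0
while capability < human_level:
    capability += annual_gain
    years += 1

print(f"Threshold crossed after {years} years")  # 198 years at this made-up rate, but crossed nonetheless
```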

60
The second assumption is that we will keep going.
The second assumption is that we will keep going.

61
We will continue to improve our intelligent machines.
We will continue to improve our intelligent machines.

62
And given the value of intelligence.
And considering the value of intelligence —

63
intelligence is either the source of everything we value or we need it to safeguard everything we value. It is our most valuable resource.
intelligence is either the source of everything we value, or we need it to safeguard everything we value; it is our most valuable resource.

64
So we want to do this. We have problems that we desperately need to solve.
So we want to do this. We have problems we desperately need to solve.

65
We want to cure diseases like Alzheimer's and cancer.
We want to cure diseases like Alzheimer's and cancer.

66
We want to understand economic systems. We want to improve our climate science.
We want to understand economic systems. We want to improve our climate science.

67
So we will do this, if we can.
So we will do this, if we can.

68
The train is already out of the station, and there's no brake to pull.
The train has already left the station, and there is no brake to pull — as the Chinese idiom goes, once the arrow leaves the bow, there is no calling it back.
brake n. a device for slowing or stopping; an obstacle

69
Finally, we don't stand on a peak of intelligence, or anywhere near it, likely.
Finally, it is likely that we do not stand on a peak of intelligence, or anywhere near it.

70
And this really is the crucial insight. This is what makes our situation so precarious,
And this really is the crucial insight. This is what makes our situation so precarious,
precarious adj. dangerously insecure or unstable; uncertain
e.g.: This forest is a precarious place, because there are wolves about at night.

71
and this is what makes our intuitions about risk so unreliable.
and this is what makes our intuitions about risk so unreliable.

72
Now, just consider the smartest person who has ever lived.
Now, just consider the smartest person who has ever lived.

73
On almost everyone's shortlist here is John von Neumann.
On almost everyone's shortlist here is John von Neumann.

74
I mean, the impression that von Neumann made on the people around him,
The impression that von Neumann made on the people around him,

75
and this included the greatest mathematicians and physicists of his time, is fairly well-documented.
and this included the greatest mathematicians and physicists of his time, is fairly well documented.
well-documented adj. supported by ample recorded evidence
e.g.: It was well documented that he had done it.

76
If only half the stories about him are half true, there's no question he's one of the smartest people who has ever lived.
If only half the stories about him are half true, there is no question that he is one of the smartest people who has ever lived.

77
So consider the spectrum of intelligence.
So consider the spectrum of intelligence.

78
Here we have John von Neumann.
Here we have John von Neumann.

79
Then we have you and me.
Here are you and me.

80
And then we have a chicken.
And here is a chicken.

81
Sorry, a chicken.
Sorry — a chicken.

82
There's no reason for me to make this talk more depressing than it needs to be.
There is no reason for me to make this talk more depressing than it needs to be.

83
It seems overwhelmingly likely, however, that the spectrum of intelligence extends much further than we currently conceive,
However, it seems overwhelmingly likely that the spectrum of intelligence extends much further than we currently conceive,
Here "overwhelmingly" is an intensifier, roughly "extremely".

84
and if we build machines that are more intelligent than we are,
and if we build machines that are more intelligent than we are,

85
they will very likely explore this spectrum in ways that we can't imagine and exceed us in ways that we can't imagine.
They are very likely to explore this spectrum in ways we cannot imagine, and to surpass us in ways we cannot imagine.


86
And it's important to recognize that this is true by virtue of speed alone.
And it is important to recognize that this is true by virtue of speed alone.
by virtue of: because of; on the strength of
e.g.: He got the offer from that company by virtue of a friend's introduction.

87
Right? So imagine if we just built a super intelligent AI that was no smarter than your average team of researchers at Stanford or MIT.
Right? So imagine we just built a superintelligent AI that was no smarter than your average team of researchers at Stanford or MIT.

88
Well, electronic circuits function about a million times faster than biochemical ones,
Well, electronic circuits function about a million times faster than biochemical ones,

89
so this machine should think about a million times faster than the minds that built it.
so this machine should think about a million times faster than the minds that built it.

90
So you set it running for a week, and it will perform 20,000 years of human-level intellectual work, week after week after week.
So if you set it running for a week, it will perform 20,000 years of human-level intellectual work, week after week after week.
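The "20,000 years per week" figure follows directly from the million-fold speed-up quoted above. A minimal back-of-the-envelope check, taking the 1,000,000× factor as Harris's round figure rather than a measured constant:

```python
# Back-of-the-envelope check of the "20,000 years per week" figure.
speedup = 1_000_000        # Harris's round figure: electronic vs. biochemical circuits
weeks_per_year = 52.18     # average number of weeks in a year

human_years_per_week = speedup / weeks_per_year
print(f"{human_years_per_week:,.0f} human-years of work per running week")  # ~19,164, i.e. roughly 20,000
```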

91
How could we even understand, much less constrain, a mind making this sort of progress?
How could we even understand, much less constrain, a mind making that sort of progress?

92
The other thing that's worrying, frankly, is that, imagine the best case scenario.
The other thing that is worrying, frankly, is this: imagine the best-case scenario.

93
So imagine we hit upon a design of super intelligent AI that has no safety concerns. We have the perfect design the first time around.
Imagine we hit upon a design for superintelligent AI that raises no safety concerns — we get the perfect design the very first time.
hit upon: to discover or come up with by chance
e.g.: I hit upon a better way to sleep well.

94
It's as though we've been handed an oracle that behaves exactly as intended.
It is as though we had been handed an oracle that behaves exactly as intended.

95
Well, this machine would be the perfect labor-saving device.
This machine would be the perfect labor-saving device.

96
It can design the machine that can build the machine that can do any physical work, powered by sunlight, more or less for the cost of raw materials.
It could design the machine that can build the machine that can do any physical work, powered by sunlight, more or less for the cost of raw materials.
more or less: approximately; roughly; to some extent
e.g.: You have some merits, more or less; just reflect on yourself carefully.

97
So we're talking about the end of human drudgery.
So we are talking about the end of human drudgery.

98
We're also talking about the end of most intellectual work.
We are also talking about the end of most intellectual work.

99
So what would apes like ourselves do in this circumstance?
So what would apes like ourselves do in such circumstances?
ape n. a (tailless) primate such as a gorilla or chimpanzee; a fool; a mimic

100
Well, we'd be free to play Frisbee and give each other massages.
Well, we would be free to play Frisbee and give each other massages.
Frisbee n. a flying disc used in throwing games

101
Add some LSD and some questionable wardrobe choices, and the whole world could be like Burning Man.
Add some LSD and some questionable wardrobe choices, and the whole world could be like Burning Man.
LSD: lysergic acid diethylamide, regarded as one of the most potent hallucinogens
wardrobe n. a closet for clothes; a person's whole collection of clothes; stage costumes
Burning Man: an annual festival celebrating self-expression, creativity, and community

102
Now, that might sound pretty good, but ask yourself what would happen under our current economic and political order?
Now, that might sound pretty good, but ask yourself what would happen under our current economic and political order.

103
It seems likely that we would witness a level of wealth inequality and unemployment that we have never seen before.
It seems likely that we would witness a level of wealth inequality and unemployment that we have never seen before.

104
Absent a willingness to immediately put this new wealth to the service of all humanity,
Absent a willingness to immediately put this new wealth to the service of all humanity,

105
a few trillionaires could grace the covers of our business magazines while the rest of the world would be free to starve.
a few trillionaires could grace the covers of our business magazines while the rest of the world would be free to starve.
be free to: to be at liberty to do something
"be free to" is ironic here: the rich get to look glamorous on magazine covers, while the poor are left "free" to go hungry.

106
And what would the Russians or the Chinese do if they heard that some company in Silicon Valley was about to deploy a super intelligent AI?
What would the Russians or the Chinese do if they heard that some company in Silicon Valley was about to deploy a superintelligent AI?

107
This machine would be capable of waging war, whether terrestrial or cyber, with unprecedented power.
This machine would be capable of waging war, whether terrestrial or cyber, with unprecedented power.
wage n. pay for work; v. to carry on or engage in (a war, campaign, etc.)
cyber adj. relating to computers or networks
unprecedented adj. never done or known before
e.g.: The company has waged an unprecedented cyber revolution in the application of blockchain.

108
This is a winner-take-all scenario.
This is a winner-take-all scenario.

109
To be six months ahead of the competition here is to be 500,000 years ahead, at a minimum.
Here, being six months ahead of the competition is being 500,000 years ahead, at a minimum.
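The 500,000-year figure is the same assumed million-fold speed-up applied to a six-month head start; a minimal sketch of that arithmetic, again treating the 1,000,000× factor as Harris's assumption:

```python
# The assumed 1,000,000x thinking speed applied to a six-month head start,
# expressed as human-equivalent years of intellectual work.
speedup = 1_000_000
lead_months = 6

lead_in_human_years = lead_months * speedup / 12   # convert months to years
print(f"{lead_in_human_years:,.0f} human-years of effective lead")  # 500,000
```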

110
So it seems that even mere rumors of this kind of breakthrough could cause our species to go berserk.
So it seems that even mere rumors of this kind of breakthrough could cause our species to go berserk.
berserk adj. out of control with anger or excitement; frenzied
e.g.: Don't cheat, or your teacher will go berserk.


111
Now, one of the most frightening things, in my view, at this moment, are the kinds of things that AI researchers say when they want to be reassuring.
At this moment, one of the most frightening things, in my view, is the kind of thing AI researchers say when they want to be reassuring.
reassuring adj. serving to remove doubts and fears; comforting
e.g.: Don't complain; you had better say something reassuring instead.

112
And the most common reason we're told not to worry is time.
The most common reason we are told not to worry is time.

113
This is all a long way off, don't you know. This is probably 50 or 100 years away.
This is all a long way off, don't you know; it is probably 50 or 100 years away.
a long way off: still far away in time or distance
e.g.: Because of the COVID-19 outbreak, an economic recovery is still a long way off.

114
One researcher has said, "Worrying about AI safety is like worrying about overpopulation on Mars."
One researcher has said, "Worrying about AI safety is like worrying about overpopulation on Mars."

115
This is the Silicon Valley version of "don't worry your pretty little head about it."
This is the Silicon Valley version of "don't worry your pretty little head about it."

116
No one seems to notice that referencing the time horizon is a total non sequitur.
No one seems to notice that referencing the time horizon is a total non sequitur.
non sequitur n. a conclusion that does not follow from the premises
e.g.: Many people believe their bad luck is down to fate; I think that is a total non sequitur.

117
If intelligence is just a matter of information processing and we continue to improve our machines, we will produce some form of super intelligence.
If intelligence is just a matter of information processing, and we continue to improve our machines, we will produce some form of superintelligence.

118
And we have no idea how long it will take us to create the conditions to do that safely.
And we have no idea how long it will take us to create the conditions to do that safely.

119
Let me say that again. We have no idea how long it will take us to create the conditions to do that safely.
Let me say that again: we have no idea how long it will take us to create the conditions to do that safely.

120
And if you haven't noticed, 50 years is not what it used to be. This is 50 years in months.
And if you haven't noticed, 50 years is no longer what it used to be; this is 50 years shown in months (600 months).

121
This is how long we've had the iPhone.
This is how long we have had the iPhone.

122
This is how long "The Simpsons" has been on television.
This is how long "The Simpsons" has been on television.

123
50 years is not that much time to meet one of the greatest challenges our species will ever face.
Fifty years is not much time to meet one of the greatest challenges our species will ever face.

124
Once again, we seem to be failing to have an appropriate emotional response to what we have every reason to believe is coming.
Once again, we seem unable to have an appropriate emotional response to what we have every reason to believe is coming.
Earlier, the failure was assuming superintelligent AI would never arrive; here, it is failing to register how short 50 years really is.
e.g.: Once again, I am unable to respond properly to what I have every reason to believe is right.

125
The computer scientist Stuart Russell has a nice analogy here. He said, imagine that we received a message from an alien civilization,
The computer scientist Stuart Russell has a nice analogy here. He said: imagine that we received a message from an alien civilization,
analogy n. a comparison; reasoning from parallel cases; similarity
e.g.: There is a nice analogy for what love feels like: one moment it is as if I am wearing armour, and the next it is as if I have a weak spot.
Stuart Russell also has a TED talk, "3 principles for creating safer AI", which can be seen as a continuation of this one.

3 principles for creating safer AI | Stuart Russell

126
which read: "People of Earth, we will arrive on your planet in 50 years. Get ready."
which read: "People of Earth, we will arrive on your planet in 50 years. Get ready."

127
And now we're just counting down the months until the mothership lands?
Would we now just be counting down the months until the mothership lands?

128
We would feel a little more urgency than we do.
We would feel a little more urgency than we do now.

129
Another reason we're told not to worry is that
Another reason we are told not to worry is that

130
these machines can't help but share our values because they will be literally extensions of ourselves. They'll be grafted onto our brains,
these machines cannot help but share our values, because they will literally be extensions of ourselves; they will be grafted onto our brains,
graft v. to join or implant (a shoot or living tissue); to work hard; to practice graft (corruption)
e.g.: Many TV dramas now have advertisements grafted onto them.

131
and we'll essentially become their limbic systems.
and we will essentially become their limbic systems.

132
Now take a moment to consider that the safest and only prudent path forward, recommended,
Now take a moment to consider that the safest and only prudent path forward, the one recommended,

133
is to implant this technology directly into our brains.
is to implant this technology directly into our brains.

134
Now, this may in fact be the safest and only prudent path forward,
Now, this may in fact be the safest and only prudent path forward,

135
but usually one's safety concerns about a technology have to be pretty much worked out before you stick it inside your head.
but usually one's safety concerns about a technology have to be pretty much worked out before you stick it inside your head.

136
The deeper problem is that building super intelligent AI on its own seems likely to be easier
The deeper problem is that building superintelligent AI by itself seems likely to be easier

137
than building super intelligent AI and having the completed neuroscience that allows us to seamlessly integrate our minds with it.
than building superintelligent AI and also having the completed neuroscience that would let us seamlessly integrate our minds with it.

138
And given that the companies and governments doing this work are likely to perceive themselves to be in a race against all others,
Given that the companies and governments doing this work are likely to see themselves as racing against all the others,

139
given that to win this race is to win the world, provided you don't destroy it in the next moment,
and given that winning this race is winning the world, provided you don't destroy it in the next moment,

140
then it seems likely that whatever is easier to do will get done first.
then it seems likely that whatever is easier to do will get done first.

141
Now, unfortunately, I don't have a solution to this problem, apart from recommending that more of us think about it.
Unfortunately, I have no solution to this problem either, other than recommending that more of us think about it.

142
I think we need something like a Manhattan Project on the topic of artificial intelligence.
I think we need something like a Manhattan Project on the topic of artificial intelligence.

143
Not to build it, because I think we'll inevitably do that,
Not to build it, because I think we will inevitably do that,

144
but to understand how to avoid an arms race and to build it in a way that is aligned with our interests.
but to understand how to avoid an arms race and to build the AI in a way that is aligned with our interests.

145
When you're talking about super intelligent AI that can make changes to itself,
When you are talking about superintelligent AI that can make changes to itself,

146
it seems that we only have one chance to get the initial conditions right,
it seems that we only have one chance to get the initial conditions right,

147
and even then we will need to absorb the economic and political consequences of getting them right.
and even then we will need to absorb the economic and political consequences of getting them right.

148
But the moment we admit that information processing is the source of intelligence,
But the moment we admit that information processing is the source of intelligence,

149
that some appropriate computational system is what the basis of intelligence is,
that some appropriate computational system is what the basis of intelligence is,

150
and we admit that we will improve these systems continuously,
and we admit that we will improve these systems continuously,

151
and we admit that the horizon of cognition very likely far exceeds what we currently know,
and we admit that the horizon of cognition very likely far exceeds what we currently know,

152
then we have to admit that we are in the process of building some sort of god.
then we have to admit that we are in the process of building some sort of god.

153
Now it would be a good time to make sure it's a god we can live with.
Now would be a good time to make sure it is a god we can live with.

154
Thank you very much.
Thank you very much.
