A computer model has been developed that can predict what word you are thinking of.
Question
A computer model has been developed that can predict what word you are thinking of. (41) Researchers led by Tom Mitchell of Carnegie Mellon University in Pittsburgh, Pennsylvania, "trained" a computer model to recognize the patterns of brain activity associated with 60 images, each of which represented a different noun, such as "celery" or "aeroplane".
(42) . Words such as "hammer", for example, are known to cause movement-related areas of the brain to light up; on the other hand, the word "castle" triggers activity in regions that process spatial information. Mitchell and his colleagues also knew that different nouns are associated more often with some verbs than with others--the verb "eat", for example, is more likely to be found in conjunction with "celery" than with "aeroplane". The researchers designed the model to try and use these semantic links to work out how the brain would react to particular nouns. They fed 25 such verbs into the model.
(43) . The researchers then fed the model 58 of the 60 nouns to train it. For each noun, the model sorted through a trillion-word body of text to find how it was related to the 25 verbs, and how that related to the activation pattern. After training, the models were put to the test. Their task was to predict the pattern of activity for the two missing words from the group of 60, and then to deduce which word was which. On average, the models came up with the right answer more than three-quarters of the time.
The team then went one step further, this time training the models on 59 of the 60 test words, and then showing them a new brain activity pattern and offering them a choice of 1,001 words to match it. The models performed well above chance when they were made to rank the 1,001 words according to how well they matched the pattern. The idea is similar to another "brain-reading" technique. (44) . It shouldn't be too difficult to get the model to choose accurately between a larger number of words, says John-Dylan Haynes.
An average English speaker knows 50,000 words, Mitchell says, so the model could in theory be used to select any word a subject chooses to think of. Even whole sentences might not be too distant a prospect for the model, says Mitchell. "Now that we can see individual words, it gives the scaffolding for starting to see what the brain does with multiple words as it assembles them," he says. (45)
Models such as this one could also be useful in diagnosing disorders of language or helping students pick up a foreign language. In semantic dementia, for example, people lose the ability to remember the meanings of things--shown a picture of a chihuahua, they can only recall "dog", for example--but little is known about what exactly goes wrong in the brain. "We could look at what the neural encoding is for this," says Mitchell.
[A] The team then used functional magnetic resonance imaging (fMRI) to scan the brains of 9 volunteers as they looked at images of the nouns
[B] The study can predict what picture a person is seeing from a selection of more than 100, reported by Nature earlier this year
[C] The model may help to resolve questions about how the brain processes words and language, and might even lead to techniques for decoding people’s thoughts
[D] This gives researchers the chance to understand the "mental chemistry" that the brain does when it processes such phrases, Mitchell suggests
[E] This research may be useful for a human computer interface but does not capture the complex network that allows a real brain to learn and use words in a creative way
[F] The team started with the assumption that the brain processes words in terms of how they relate to movement and sensory information
[G] The new model is different in that it has to look at the meanings of the words, rather than just lower-level visual features of a picture
Answer
D
Explanation
This blank comes at the end of the paragraph, so the answer has to be worked out from the preceding context. The paragraph discusses the significance of the model and the prospects for the research, noting that the average English speaker knows 50,000 words, so in theory the model could select any word a subject chooses to think of. The sentence immediately before the blank says that being able to see individual words provides the scaffolding for starting to see what the brain does with multiple words as it assembles them, so the missing sentence should pick up this idea of assembling words. Among the three options [B], [D] and [E], only [D] involves this information, so [D] is the correct answer. [B] states the overall significance of the study and [E] points out a limitation of the research; neither fits the content of this paragraph, so both are ruled out.
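As a supplement to the passage (not part of the exam item), the following is a minimal Python sketch of the kind of procedure it describes: each noun is represented by how often it co-occurs with 25 chosen verbs, a linear model maps those semantic features to a brain-activation pattern, the model is trained with two nouns held out, the two held-out patterns are matched by similarity, and a larger candidate vocabulary can be ranked against a new pattern. All data below are synthetic; the voxel count, feature values and word labels are made-up stand-ins, and only the overall leave-two-out and ranking procedure follows the passage.

import numpy as np

rng = np.random.default_rng(0)

# 60 nouns and 25 verbs as in the passage; the voxel count is an arbitrary stand-in.
N_NOUNS, N_VERBS, N_VOXELS = 60, 25, 500
nouns = [f"noun_{i:02d}" for i in range(N_NOUNS)]  # stand-ins for "celery", "aeroplane", ...

# Toy data: verb co-occurrence features and observed activation patterns.
# In the study these came from a trillion-word body of text and from fMRI scans.
features = rng.random((N_NOUNS, N_VERBS))
true_weights = rng.normal(size=(N_VERBS, N_VOXELS))
activations = features @ true_weights + 0.1 * rng.normal(size=(N_NOUNS, N_VOXELS))

def cosine(a, b):
    """Similarity between a predicted and an observed activation pattern."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Leave-two-out test described in the passage: fit a linear map from semantic
# features to activation patterns on 58 nouns, predict the two missing
# patterns, then decide which held-out word is which.
held_out = [0, 1]
train = [i for i in range(N_NOUNS) if i not in held_out]
W, *_ = np.linalg.lstsq(features[train], activations[train], rcond=None)
pred = features[held_out] @ W

straight = cosine(pred[0], activations[held_out[0]]) + cosine(pred[1], activations[held_out[1]])
swapped = cosine(pred[0], activations[held_out[1]]) + cosine(pred[1], activations[held_out[0]])
print("correct pairing wins:", straight > swapped)

# Follow-up experiment: rank a candidate vocabulary by how well each word's
# predicted pattern matches a newly observed pattern (here, that of noun_00).
scores = {w: cosine(features[i] @ W, activations[0]) for i, w in enumerate(nouns)}
print("best-matching candidates:", sorted(scores, key=scores.get, reverse=True)[:5])

In this toy setup the correct pairing should win and noun_00 should rank near the top; the real study's accuracy figures (above three-quarters for the pair test, well above chance for the 1,001-word ranking) come from the passage, not from this sketch.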