
Hume's internal bundle grows a few extra knots

April 5, 1996

Thinking about consciousness is no longer the preserve of philosophers; it is part of science too. Experts from a range of disciplines give their individual views

I am working on a programme involving neural networks. This research may enable an artificial system to have a point of view of its own, one that includes enough knowledge and experience to let it consider itself to be conscious.

This is not through some precise definition of consciousness, but through the same notions we all have about our own consciousness. We know we are conscious without being able to say exactly what consciousness means. I can say roughly that it means, first of all, that I am sufficiently awake to notice my surroundings. Then, even with my eyes closed in a quiet room, I still know that I am conscious because I have a sensation that I call "thought". This consists of a variety of internal sensations that are a bit like those I experience with my eyes open. I can even use natural language to describe my thoughts to someone else. I call this a "folk" description of consciousness.

The machine I am using, called Magnus, is a neural net. It can be configured by an experimenter to test hypotheses about how representations of sensation may be created that resemble the perceptual sensations themselves. These representations can also capture what the system itself might do with artificial actuators (hands, fingers, vocal cords) - that is, an awareness of self. Also central to our Magnus project is the absorption of natural language, as used by human beings. This then develops, with Magnus learning about named objects and more abstract concepts. Such activity includes the build-up of emotions from instincts and of representations of some philosophers' pet notion, "qualia".

The key to all this is a discovery that something we call "iconic learning" can take place in a neural net. Neurons fall into patterns of behaviour when they are exposed to sensory input. Iconic learning is the phenomenon whereby these patterns are sustained after the perceptual input is no longer there. Such patterns are called "states", and the world of the organism is represented by a rich structure of such states, which is the seat of the organism's mechanism for consciousness. So, if such things go on in Magnus, why not in living beings?
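For readers who want a concrete picture of a pattern being sustained once its input has gone, here is a minimal sketch in Python. It is not the Magnus architecture; it uses a generic Hopfield-style attractor network as an assumed stand-in, and the pattern size, noise level and update rule are all hypothetical choices for illustration only.

```python
import numpy as np

# A toy illustration (not Magnus itself): a recurrent net stores a
# "sensory" pattern, the input is then withdrawn, and the network's
# own dynamics settle back into that pattern as a stable state.

rng = np.random.default_rng(0)

# A toy sensory pattern of +1/-1 values standing in for perceptual input.
pattern = rng.choice([-1, 1], size=16)

# Hebbian weights store the pattern as an attractor; no self-connections.
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)

# The input is no longer present: start from a corrupted trace of it
# and let the recurrent dynamics run.
state = pattern.copy()
flip = rng.choice(len(state), size=4, replace=False)
state[flip] *= -1

for _ in range(10):                 # synchronous updates until stable
    state = np.sign(W @ state)
    state[state == 0] = 1

print("sustained the original pattern:", np.array_equal(state, pattern))
```

Run as written, the net recovers the stored pattern from the noisy trace and then holds it indefinitely, which is the flavour of a "state" persisting without its perceptual cause.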

Igor Aleksander is professor of neural systems engineering, Imperial College, London.
