Notice: This mailing list archive has been frozen since May 2001, i.e. it will stay online but will not be updated.
Re: Computer 3-goD
>Date: Wed, 30 Oct 1996 20:30:04 -0600
>From: "P3D Gregory J. Wageman" <gjw@xxxxxxxxxx>
>Subject: Re: Computer 3-goD
>Translating a single 2D image into a 3D image requires the computer to
>recognize the *content* of the image
Not necessarily - that's how a human would do it, but machines aren't
constrained to use the same techniques humans do. A photograph contains
information that a human viewer doesn't necessarily use.
>in order to determine what the
>various shapes in the image are, which are connected and which are
>overlapping, and thus decide which elements would naturally be in the
>foreground and which in the background. This requires reasoning and
>judgement. I'm not sure at what age a human being acquires these
>skills, but you're talking about a computer with at least the
>cognitive abilities of an older child!
It would be more productive to try to avoid phrases like "a computer knows
this" or "a computer does that" - though I'm sure you meant it as a shorthand
expression, it can be confusing to people who don't know much about computers.
In general, from the viewpoint of the outside world, a computer doesn't
*know* anything, and unless it's a specialized device, left to itself it
usually just sits there passively. It's the *program* (I'm including stored
data bases in this) that embodies the "knowledge" and implements the actions
of the computer. Most computers today are, in effect, universal machines in the
sense of the "Turing machine": anything that can be fully and explicitly
described, and doesn't require too much memory or too many computations,
can be done by them in a "reasonable" period of time. Fully and explicitly
stating what is to be done is what countless thousands of computer programmers
are being paid to do.
>Our brains are also nothing like computers. Did you ever "forget"
>something, only to recall it at some later time? A computer that did
>that would be considered broken.
On the other hand, humans have "instantaneous" knowledge of a vast amount
of information, including references to other information, and no computer
programs I am aware of do. Humans have an amazing ability to "know what they
know", and to quickly perceive whether something they know may be relevant to
a given situation. I was thinking a few days ago that if computer programs
could be made "aware" in this fashion, it might make them absent-minded
like humans. Of course, they would still retain the "lossless" storage of
data bases, in the same way that the tables in our reference books don't
spontaneously change just because we don't remember what we had for
breakfast yesterday or what we went to the store to buy.
>On the other hand, stored-program
>digital computers have absolutely no imagination or intuition. There's
>been some speculation that the human brain uses quantum phenomena at
>the lowest level, which is where our creativity and 'randomness'
>come from.
As I said, digital computers can generally do anything that can be fully
described in concrete terms. The realm of what can be fully described in terms
a computer program could potentially work with is gradually expanding,
and now includes some techniques to handle uncertainty (fuzzy logic)
and even to emulate chaotic behavior if the characteristics of the
chaotic behavior can be described. While the research continues, I am not
aware of any convincing evidence presently available that human thought and
perception are *inherently* indescribable.
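[As an aside, not part of the original exchange: here is a minimal Python
sketch of what "emulating chaotic behavior" can mean in practice. The logistic
map is a fully and explicitly described rule, yet in its chaotic regime two
nearly identical starting values quickly diverge. The function name and
parameter choices are invented for this illustration.]

```python
# The logistic map, x_{n+1} = r * x_n * (1 - x_n), is a fully and
# explicitly described rule that nevertheless behaves chaotically
# for r = 3.9: two starting points differing by one part in a
# million diverge until they are effectively uncorrelated.

def logistic_trajectory(x0, r=3.9, steps=60):
    """Iterate the logistic map from x0 and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000)
b = logistic_trajectory(0.200001)  # perturbed by one part in a million

# The tiny initial difference grows until the two runs bear no
# resemblance to each other.
print(max(abs(x - y) for x, y in zip(a, b)))
```

The rule itself is one line of arithmetic; the unpredictability comes
entirely from sensitivity to the starting value, which is exactly the
kind of behavior a program can describe and reproduce.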
>The closest thing to the human brain in computer-land is the
>neural network, and that is only a crude approximation of a few tens
>of neurons.
I attended a lecture on neural network computing last week, given by one of
the notable researchers in the field. Some of the work has not been published
yet, so it would be inappropriate to go into details, but in general,
much of the research being undertaken by many people in the field is to
create models of a human-like neuron, then run computer simulations of
neural networks on regular digital computers. Chaotic behavior can be
included or left out (there's ongoing debate on whether it's needed).
It's amazing how different neurons are from the elements in a digital
computer - for instance, instead of voltage levels as signals, they use
trains of pulses, where the timing of the pulses carries information.
But if it becomes possible to fully simulate them, and a sufficient amount
of storage is available, then it may indeed be possible to emulate human
thought and perception processes on a digital computer.
(Of course, if it should take a million years on a Cray to emulate five
seconds of human thought, that wouldn't be very useful from a practical
standpoint. Computational requirements have to be taken into consideration
in any actual design. It's very possible that an electronic neural net could
do the job more efficiently than a digital simulation.)
>An ant probably has more thinking ability than the most sophisticated
>computer.
That could be true. I also recently saw a web page at a university, in which
it was stated that if you consider the body of a Stradivarius violin as an
analog computer to convert the energy provided by the bow into specifically
configured acoustic energy, the digital equivalent of its computational
power would be on the order of gigaflops.
John R
------------------------------