AGI may be impossible to define, and that's a multibillion-dollar problem

4 months ago

jdale

Ars Legatus Legionis

I think this is actually still in "not even wrong" territory. Neurons in and of themselves didn't evolve for cognition; they merely co-opted a very old method of respiration into a means of rapid communication by modulating their voltage potential.

Animals like jellyfish and corals have neurons (and even eyes for some jellyfish), but no brain. Cnidarian intelligence is not a thing one could measure. In the vast majority of animals, the brain is little more than a ganglion in the head to process sensory input from that body segment, with a decentralized nervous system that follows any number of known patterns far different from ours.

Cephalopods show that it is possible to evolve intelligence via a decentralized invertebrate nervous system, with a true brain plus ganglia, or sub-brains, controlling each limb. But the vast majority of invertebrates are unlikely to possess anything we might recognize as intelligence. Insects especially seem to follow pre-programmed "instructions," even forgoing wound-tending in favor of carrying out impossible tasks.

There is nothing inevitable about intelligence arising from neural networks. It can happen; it is a possibility. But given its rarity (and metabolic toll: cephalopods rarely live more than a year), I'm not sure it's even fair to say that cognition is the primary purpose of neural networks.

I guess my point is that virtually any AI is just an emulation attempting to resemble human output; it's not built on anything resembling a human neural network, or any biological neural network.

That reinforces the point that it's extremely difficult, looking at the hardware, to find the dividing line. I certainly agree that insects and cnidarians are not good candidates for consciousness. But there are other animals whose brain structures are much more similar to our own, and any reasoned argument about whether or not they meet that threshold is going to fall down on our inability to define it.

What we can do pretty well is look at capabilities and performance, and I think it's a better idea to keep a definition of AGI in that space, because it is at least possible to evaluate things on that basis.
