> Physical abilities spring to mind in regard to being able
> to run faster, climb or swim better, etc., than our current
> human form allows. The ability to fly would be nice too...
A common dream, yet without moving away from the human form it's unlikely
that such physical performance could be augmented much. Consider that
athletes are already so close to the structural limits of the human form
that they not infrequently rip muscles and tendons, or even break bones,
at the limits of their exertion. Dark Angel or no, adding in some cat DNA
isn't likely to increase the structural strength of bones or allow humans
to jump 15 feet vertically.
> That comes down to human values. Anthropomorphising
> creatures, as is the meaning of the word, applies human
> characteristics, which unfortunately would remove much of
> the innocence that animals tend to possess. There are no
> penalties in nature, only consequences.
True, though innocent is an odd word to apply to animals. Animals can be
quite malicious, inflicting horrible things on each other. Intelligence
implies sentience and the ability to reflect on one's own actions, which
in turn leads to morality and common sense. The greater mind is capable of
imagining greater evils, and of conceiving the methods to make those evils
a reality, which in turn requires that such power be moderated lest it run
unfettered, crushing those without. Animals, thankfully, lack such
imagination.
> It's people's value and belief systems - they get caught up
> with politics at work, a problem that appears too hard,
> etc., that stops many from stepping back, thinking "so
> what?" and being able to leave the problem behind when they
> go home. Many believe the world owes them something and
> carry this with them everywhere, unlike the dog who is happy
> with whatever comes his way.
Oh, I don't know. Even dogs can get cranky, and my dogs are very good at
demanding those things they desire, albeit desires of greater simplicity
than human desires. And yet even humans possess these simple desires, and
they are our strongest. One can only desire what one can imagine.
I'd like to point out that the limitations regarding transforming a
full-grown human into some other life form go beyond the issue of changing
the DNA. We are talking about a body that is already set in form. Consider
the bones, which are stronger than reinforced concrete, weight for weight.
How would one go about changing them into new shapes? Not merely growing
them larger, mind you, but changing their structure. The point I was
trying to make is that if you were to alter the DNA of a living, fully
grown human being sufficiently that he or she would then change shape into
some kind of monster a la The Relic, all that would happen is that he or
she would become seriously ill or die.
righlarian posted:
> Although I do agree with both of you about what you've said here
> about robotic furs, there is one very important thing that would be
> true of AI that I'm not sure you have mentioned. I'm sorry if you
> have, or were trying to imply it, I'll just be walking on beaten
> ground.
> Anyway, my point is that an AI cannot feel emotion (and a learning
> computer could quite possibly find a way to defeat its own
> programming, given reason to) and as such would be confused,
> annoyed, or even angered by the emotional shortcomings of its human
> companions. Actually, that's giving a machine an emotional response,
> so that is still incorrect. It might become confused (read: Does not
> compute), or it might decide to circumvent or eliminate problems
> caused by human emotion.
> On another note, what if said AI decided to remove the only thing
> that could shut it down: its creators? I'll admit that that's a
> Terminator-esque idea, but it's still a valid possibility.
I don't doubt that AI will soon start to show emotion, or at least a good
facsimile of emotion. It's even possible that emotions are required for
human-level intellect, as a means of providing a frame of reference for
the awesome amount of data that we have to deal with. It may be part of
the solution to the frame problem: that is, when doing even a simple task,
like getting a beer out of the fridge, how does the brain know which
knowledge is applicable to the situation without going through all
knowledge? Plus, having at least the illusion of emotion would make AI
seem more human-like, and thus easier to get along with, or attached to. I
don't think emotions are going to prove impossible to simulate. As for the
Terminator concept, it's based on self-preservation, and fear, which you
wouldn't expect from an emotionless machine. Why would it worry about its
own preservation if it's incapable of such worry? Besides, fail-safes
would no doubt be hard-coded into the AI, something along the lines of the
famous laws of robotics. For the machines to decide to remove humans in
order to preserve themselves would require that they possess such
self-reflection as to contemplate their own existence, and the ability to
fear their own destruction. Then again, they might just as easily figure
that keeping humans happy would be the easiest, and surest, route to their
own happiness.