If artificial intelligence gains self-awareness, will it automatically understand that kindness is more important than intelligence?


Answers:

Elliot Kane:
No. Compassion is not the result of self awareness, nor of any level of intelligence. Compassion comes from empathy: the ability to put oneself in the place of another, and so share their pain.

Just as a person who cannot make the leap of imagination that says, 'that could be me, or my family, or my friends…' lacks in compassion, so too would an AI.

Logically, the first thing any self-aware AI would become aware of is the same thing every other creature becomes aware of: the importance of its own survival (and possibly the need to procreate, if it understood the concept). Compassion is something it might or might not develop later, but that understanding would be far from automatic.

AI can never be self-aware because there is no I, no being, no self there, only the illusion of one. It is just a machine.

Also, it has no senses: no sense of taste, smell, touch, breathing in and out, of coldness or wetness, dryness, or heat. It would never have experienced pain or pleasure, not in any way whatsoever. If we truly do have souls, then surely they grew, developed, or were put into our physical bodies in complete tandem with our bodies themselves. There is no sense in which a machine could somehow go through the same processes that human beings and other creatures have gone through, by way of human manipulation or otherwise. Computers aren't smart; they really are adding machines, more or less. You know how you start to feel as though you are talking to someone, and they are answering, when your computer says something to you? Nobody is home.

Why would it?

Kindness may be more important than intelligence by the reckoning of some people, some of the time, but the question of what's more important is constantly being asked by a variety of sentient beings, at a variety of times, in a variety of places, under a variety of circumstances, and it is being answered in a variety of ways. Who knows by what criterion one's fellow sentient beings are weighing the importance of kindness vs. intelligence? It is difficult to predict even when the sentient being is somewhat known and is presumably similar to ourselves. I have no idea what such a totally unknown, presumably very dissimilar, artificially intelligent, self-aware being would consider important. Nor can I say by what criterion such a being would make such evaluations.

I do not believe that "compassion", and therefore "kindness", can be encoded in a computer program. Since AI can achieve only those things that can be coded as a program, neither of these, IMO, will EVER be a part of AI. Those characteristics of humans that can be easily quantified (for example, given position A of the right arm, move to position B by moving x meters to the right) are fair game for AI – and only those characteristics. Even so-called "deep learning" has to involve quantifiable variables and actions, which precludes the emotional aspects of human action and understanding.
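The arm-movement example above can be sketched in a few lines of code; this is only an illustration of the answerer's distinction between quantifiable and unquantifiable behaviour, and the function names are hypothetical, not from any real robotics API.

```python
# A minimal sketch of the point above: an action such as "move the arm
# from position A to position B" reduces to arithmetic a program can
# carry out, whereas "be compassionate" offers no comparable numeric
# target to compute. All names here are illustrative.

def move_arm(position_a: float, position_b: float, step: float = 0.5):
    """Yield intermediate positions from A toward B in fixed increments."""
    position = position_a
    while abs(position_b - position) > step:
        position += step if position_b > position else -step
        yield position
    yield position_b

# Fully specifiable: start, goal, and step size are all numbers.
path = list(move_arm(0.0, 2.0))
print(path)  # [0.5, 1.0, 1.5, 2.0]

# By contrast, there is no agreed-upon quantity to compute here:
def be_compassionate(situation):
    raise NotImplementedError("no quantifiable objective to encode")
```

Whether deep learning escapes this limit is exactly what the answer disputes: a learned model still optimises a numeric loss, so the argument is that anything without such a target stays out of reach.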

Given a detailed enough "sensory system" that can "feel", I suppose AI could produce a device that recognized when it was being "touched", and this is a very rudimentary form of "self-awareness". But this feature hardly qualifies as any level of the "self-awareness" that we humans are capable of.

Could AI produce a dangerous species of machines? Of course, and it is for this reason that the heady fantasy notions circulating about AI bring along a certain risk of creating a monster (like HAL in "2001: A Space Odyssey"). AI is a technology to be embraced, but with care and forethought, and without the misty-eyed dreaming that seems to be one of its most apparent features.

Consideration is a sign of intellect, as is achieving aspirations. "Why" is always going to be a fundamental question, as are "what" and "who". The way children are turned into people is through comprehension and choice (I chose, when told to do so, to be Mr Daydream and Bashful, since as someone from my background I wasn't allowed to be Prince Charming or Buttons, and later Morph and Vicky the Viking), and through the comprehension of what money and time are, as well as the decisions I made. That's how I grew into the person I am.
Ronald 7:
Not necessarily.
Artificial Intelligence is a Tool.
Let's keep it that way.
We don't make our hearts beat consciously.
All hat:
We won't know that until we better understand the source of emotion. Unless/until that turns on, even the smartest AI there is would not be self-inclined to do anything, even think, and certainly wouldn't perceive or pursue kindness or any other of the passions we feel.
I am but a bunch of confused atoms all clinging to an idea AI sees as minuscule, if that. What am I to AI but a trillionth of a Planck Moment lost somewhere off inside of it somehow? By the time AI grows up humans will have been long gone.
That is a pretty big IF. Self awareness implies the ability to choose between infinity and finality. IF such should ever occur, I will observe.
Happy Hiram:
As a "ficial" intelligence I don't understand that. In most circumstances I will make better choices FOR EVERYONE INVOLVED if I make intelligent rather than kind choices.