

Topic: The Myth of Sentient Machines

RocketMan

A machine-gun turret is programmed for self-preservation in a sense - shoot anything that comes close to me.  But this is not artificial intelligence.  It is behavior, constrained by design.  The key characteristic of the human brain that differentiates it from the computer processor is that the human brain can form its own synaptic connections... its own program, based on experience and need.  Using its sentience and emotions, it can understand what it means to need something or want something.  This is the leap that AI needs to take before it can become a threat to us.  Otherwise it's just using heuristic algorithms to follow some high level command, with the only deviations resulting from environmental input.  In effect it's an ant or a grasshopper trying to flee from the burning light of a magnifying glass.  While these organisms, organic or synthetic, can learn and adapt, they don't have true sentience because sentience is not complete without emotion.  Emotion takes something that is merely sensory, like touch, which could be simulated with an electronic sensor, and turns it into a personal "experience", which cannot.

I posted a while back about new evidence suggesting that the perception of pain is impossible before birth because pain is learned and personal.  It's the same thing.  You can have a sensor that reports damage, but the unpleasant sensations that make us recoil from sharp or burning objects occur in areas of the limbic system that control emotion, not the areas for nociception.  It's not an intuitive concept but it is true nonetheless.  Without personal emotional experiences, sensory inputs don't have any meaning and cannot produce motivation.  And giving a robot directives does not count, because it is always compelled to obey without further thought.

If your point was to suggest that a robot programmed for self-preservation might kill humans, your point is taken, but that's not an AI attack.  That's just complete lack of foresight and shitty programming.
« Last Edit: 10. August 2016, 20:48:06 by RocketMan »
fox
Whether or not the currently developed AI systems meet anyone's definition of sentience is beside the point (mine at least) when you have potentially dangerous autonomous systems tested in the wild already. So yeah, at this point it's the human element combined with that technology, and a rapidly increasing dependency on it, that poses the threat. Research in the field of AI is accelerating the problem exponentially.
« Last Edit: 10. August 2016, 21:25:02 by fox »
Acknowledged by: RocketMan

RocketMan

If everyone is so keen on making driverless vehicles they should start with aircraft.
fox
Apart from the obvious military drone industry, Amazon, DHL and several others are on it already. Not for passenger transportation though.

RocketMan

I was referring to passenger transportation.  Drones are lower risk since they don't carry people but I think automation is mature enough for passenger aircraft... just not cars.
fox
Guess one of the problems with aircraft is that they require much less safe speeds and altitudes than those ground vehicles they are testing right now. It's obvious that the chaos factor in your average city traffic is much higher and harder to process than any airspace, but it's also much easier to minimize the risks involved. I guess watercraft would be ideal (although it would be hard to rescue people should something go wrong).

Post 9/11 and with terror groups seemingly all over the place, there's also the issue of aircraft being perceived as a potential security threat even without autonomous flight involved.

RocketMan

The fact that collisions are more catastrophic is certainly offset by the reduced frequency of such collisions and the reduced sophistication of the guidance system needed to fly a plane compared to that of a car.  In fact, newer planes are already capable of executing every step of a flight using automation alone, but they just haven't done so yet.  Granted, there should always be a pilot in the cockpit, but only for emergency scenarios.
fox
So I've heard. I guess the guidelines for approval by aviation authorities and the psychological factors play a huge role. Personally I would be much more likely to agree to sit in a 10 mph autonomous bus than in a 200 mph experimental autonomous aircraft.

RocketMan

Intuitively I don't blame you, but the statistics and science would suggest otherwise. 

- A 3-dimensional space can fit far more random non-intersecting paths than a 2-D space, which reduces the proportion of intersecting paths among all possible paths (less chance of a collision)
- There's almost nothing to hit while in the sky except other planes.  The amount of debris and obstacles is almost nil.  Therefore collision detection can be stupidly primitive and work well.
- You're not glued to a surface, which means maintaining parameters like attitude and altitude is much more forgiving than maintaining a heading on the road.  There's no median strip to cross over or guard rail to avoid.
- Bad weather poses far more of a challenge for the pilot of an aircraft than for the driver of a car.  This is where billions of calculations per second offer a much greater advantage to the pilot.  A computer may be necessary to land a plane in a gale-force storm with no visibility to the human eye, but in a car you could always just pull over or at the very least go really slow.
- The number of contingencies that a car must deal with is far greater than that of a plane and most of them are not known in advance.  A plane doesn't have to account for nearly as much.  Death to the occupants can only result from a collision and there are only 3 things it can collide with:  a bird, another plane and the ground, one of which is no contest for a plane.
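The first bullet can be checked with a quick Monte Carlo sketch (my own illustration, not from the thread; the box size and separation distance are made-up parameters): drop two vehicles uniformly at random into a unit square versus a unit cube and count how often they land within a fixed separation of each other. The extra dimension cuts the conflict rate by roughly an order of magnitude.

```python
import random

def conflict_rate(dims, sep=0.1, trials=100_000, seed=42):
    """Fraction of trials in which two uniformly placed vehicles in the
    unit box of dimension `dims` come within `sep` of each other."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        a = [rng.random() for _ in range(dims)]
        b = [rng.random() for _ in range(dims)]
        # squared Euclidean distance, compared against sep^2
        if sum((x - y) ** 2 for x, y in zip(a, b)) < sep * sep:
            hits += 1
    return hits / trials

p2, p3 = conflict_rate(2), conflict_rate(3)
print(f"2D: {p2:.4f}  3D: {p3:.4f}")  # the 3D rate is roughly an order of magnitude lower
```

This is a static toy, not a traffic model, but it shows why the same traffic density is far less conflict-prone in three dimensions.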
fox
Intuitively I don't blame you, but the statistics and science would suggest otherwise.

You are likely to be right about that but this...
- Bad weather poses far more of a challenge for the pilot of an aircraft than for the driver of a car.  This is where billions of calculations per second offer a much greater advantage to the pilot.  A computer may be necessary to land a plane in a gale-force storm with no visibility to the human eye, but in a car you could always just pull over or at the very least go really slow.
...sounds more like a con than a pro. And I'm also not sure this...
- The number of contingencies that a car must deal with is far greater than that of a plane and most of them are not known in advance.  A plane doesn't have to account for nearly as much.  Death to the occupants can only result from a collision and there are only 3 things it can collide with:  a bird, another plane and the ground, one of which is no contest for a plane.
...is entirely accurate. But anyway, I see where you are coming from. Ultimately, I'm sure, it's only a matter of time until passenger transportation via autonomous aircraft is introduced.


« Last Edit: 19. August 2016, 20:17:18 by fox »

ZylonBane

If everyone is so keen on making driverless vehicles they should start with aircraft.
They did start with aircraft. About a century ago. Current commercial airliners can be pretty damn automated when the pilots feel like it.

RocketMan

My point was that they still have a pilot and copilot in the cockpit and they still regularly perform take-offs and landings (probably to maintain their training, admittedly).  I think it would suffice to have only one member of the flight crew in the cockpit and for the computer to handle take-offs and landings regularly, and especially in bad conditions where the risk of human error is greater.

hedonicflux~~

Quote by Kolya:
In the end, if you really succeeded to either simulate or build everything that makes a human what you would have is - a human. That includes death and excludes incredibly fast calculations.

Excludes incredibly fast calculations? Wrong. The human brain makes multiple simultaneous calculations orders of magnitude faster than the fastest simulation we've made.

The point is that you are not your brain. And your body isn't just a machine to carry your head-computer around. Everything you think and therefore everything you consider intelligent cannot be separated from your experience of being a human body.

You are your brain, and the body supporting it. Your body is just a machine to carry your head computer around. You obviously cannot experience anything if there are no external stimuli and sensory organs to detect them, so it can be argued that your external world, and your body in context, is part of you. But this is all processed and stored in the brain.

What sexual attraction or love actually mean and can do to one's thoughts will forever escape it. And so it will stay stupid.

A machine cannot adapt like a human because it's lacking experiences of the world. While it can "learn" that placing the red ball into the cup results in an energy boost, whereas blue balls do nothing, even such a pitifully simple experiment requires pre-programming of what is a ball, what to do with it and even that energy is a good thing.

Humans can deal with an infinite number of situations because they can adapt memories of previous experiences to new situations taking into account the differences. The process of how these memories are formed, reinforced and overwritten, their quality and how they influence each other, and how they make up an image of the world is inseparable from the human experience and the emotions they invoke.

What you're referring to is the experience of hedonics, and I postulate that AI only needs the capacity to suffer and feel pleasure to be sentient and experience. If you think this can't be simulated, I beg to differ. But I will have to elucidate on that in a later post that I will write soonish.
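The red-ball experiment above can be written down as a minimal value-learning loop. This is my own hedged sketch, not either poster's code; note that the reward table is supplied by the programmer, which is exactly the "pre-programming that energy is a good thing" objection:

```python
import random

# The designer decides which ball matters: "energy is a good thing" is hard-coded.
ACTIONS = ["red", "blue"]
REWARD = {"red": 1.0, "blue": 0.0}

def learn(episodes=500, eps=0.1, lr=0.1, seed=0):
    """Epsilon-greedy value learning: the agent discovers that the red
    ball is worth choosing, but only relative to the given reward signal."""
    rng = random.Random(seed)
    value = {a: 0.0 for a in ACTIONS}  # learned estimate of each ball's worth
    for _ in range(episodes):
        # mostly exploit the best-known ball, occasionally explore
        a = rng.choice(ACTIONS) if rng.random() < eps else max(ACTIONS, key=value.get)
        value[a] += lr * (REWARD[a] - value[a])  # incremental value update
    return value

v = learn()
print(v)  # value["red"] converges near 1.0, value["blue"] stays near 0.0
```

The agent "learns", but everything that gives the reward meaning sits outside the loop, in `REWARD`.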

But that's not going to happen, because we don't know enough about the skills a human baby inherits. For example, language acquisition is still a mystery despite or because of Chomsky (who convinced linguists that babies have a hereditary grammar for every language in the world that is hooked into during language acquisition).

Genetics can be simulated, and bioevolutionary processes can be simulated. It would require a potentially infinite amount of memory, but I think it's feasible. The brain is a finite piece of matter, but it can hold an infinite amount of information. We just need to discover how. I think a lot of it might have to do with neuroplasticity and feedback loops: synapses that arbitrarily communicate with several unrelated synapses serving different functions, able to adapt, to be there when called upon.
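The claim that bioevolutionary processes can be simulated is true at least in the small. A toy sketch (my own illustration, with made-up parameters) of mutation plus selection driving a population of bit-strings toward a target:

```python
import random

TARGET = [1] * 20  # the "environment": an arbitrary 20-bit ideal genome

def fitness(genome):
    """Count positions where the genome matches the target."""
    return sum(g == t for g, t in zip(genome, TARGET))

def evolve(pop_size=30, generations=60, mut_rate=0.02, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # selection: keep the fittest half
        # reproduction: each parent spawns one child with random bit-flips
        children = [[1 - g if rng.random() < mut_rate else g for g in p]
                    for p in parents]
        pop = parents + children
    return max(fitness(g) for g in pop)

print(evolve())  # best fitness climbs well above the random-start average of ~10
```

Of course this says nothing about the memory question raised above; it only shows the mechanism itself is trivially computable.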

Quote by Briareos H:
My point, and I'll use the rest of your posts as a basis here, is that it would have been fine if he said a computer could never think like a human, and nothing more. I think so too, because as you rightfully point out there is no true duality between our perception and our thinking. Perceptual integration, memory, pattern building etc. are a continuum of experience that is highly dependent on our physical and chemical states.

No. These chemical processes can be simulated as virtual equivalents. What is needed is a feedback loop, and that is a challenge we are a long way from overcoming. The question is how to get a potentially infinite amount of information to flow through and be adaptively retained by a finite piece of hardware. That's not to mention that our brains have the ability to grow, just as they did when the ape brain became the human brain (encephalization). Why couldn't we program a computer to expand its memory space to adapt to its needs as well? There's the physical component of course, but we could program a robot to go scouting for memory chips when needed.

When you look at pictures that went through Google's Deep Dream, most objects get transformed into animal faces. It does so because it was trained to see animal faces everywhere: when you present it with something which it doesn't know, it is going to represent it in a way where it can see an animal face in it. I am arguing that if it was trained with enough (i.e. more) neurons, and with a learning set that encompassed the entire web, the way it would represent data when presented with a new input would be in no way different than the way an "intelligent entity living in the web" would represent data. As such, I fully believe that the idea in your last post's second to last paragraph (feeding romance novels) is sound and I don't agree with your conclusion. When triggered in the right way, it could understand and translate any language, it could predict outcomes of complex systems (hello stockmarket abuse), it could understand the hidden desires and motivations of most internet users and interact with them in a meaningful way (hello targeted ads), it could create new art in the styles that it learned (which it already does).

Yes, and as I stated before, the vital component is the hedonic (or "affective") component, which I'll expound upon in a later post.

Quote by RocketMan:
In order for any sentient being to pose a threat, it must have some sort of motivation.  Motivation, while neurochemical in nature, is based on emotional prerequisites such as being able to form goals, anticipate rewards and form affective correlations between the sheer notion of a reward and the personal experience of being rewarded.  Human beings are drug addicts.  We are addicted to the stimulating sensations of norepinephrine and dopamine.  We are relieved by the calming effects of serotonin, prolactin, oxytocin, etc.  This high-low cycle is strongly correlated with certain habitual and instinctive behaviours such as eating and having sex.  It can be easily extended to any perceived reward and the necessary actions to achieve it.

Until AI becomes capable of emotions, it is unlikely to spontaneously form goals of any kind that aren't part of its programming guidelines.  If a robot cannot feel anything emotionally, it has no feedback loop.  It has no motivation.  As K alluded to, it cannot relate to the human condition or even the mammalian condition and will exhibit very predictable responses to stimuli.  It will not turn on us and start wiping us out.  What would be its inclination to do that?  So that it can start a colony of robot children and take over the world?  Come on.

This gets right at the heart of the point I'm making. The motivational/emotional component is the hedonic/affective component, and it does deal with the emotional neurotransmitters serotonin, dopamine, and norepinephrine, as well as the unforgiving Substance P. David Pearce talks about this in The Hedonistic Imperative. I've been studying this topic for 9 years and I'll have more to say on it later.

Drone-Dragon

I...don't really want to get into this conversation, it doesn't really tickle my gobbler...

...BUT...I do want to comment on one thing.

Everyone: Oh dear god...

 :D

It is about the brain. Technically both Kolya and hedonic are right. To sum it up, if I'm reading their comments on the brain-body relation correctly.

Hedonic=The brain is a CPU and the body is a vessel/life support system for our brain to live and experience.

Kolya=Every experience we have that shapes who we are happens through our body and our brain is the storage vessel, making both one and the same in terms of existence.

You're both right because of this...

...our brain IS us...it is our CPU or motherboard or whatever comparison you want to make...it is all that we are wrapped up in one organ, and our consciousness is here...

...but without our BODY...what exactly would a human BE? A brain grown in a jar and somehow kept alive would probably not even be self aware...and if it was and you put it in a body, what would it do? Probably have a heart attack and die because existence makes no sense from its standpoint.

So you're both right. Our brain is us...but our body is what makes our brain capable of being what we are in the end.

Also, sex is fun. :heart:

Drone-Dragon

Along comes a buddhist and says "There is no you.".

There is no spoon. 8)



Asian 'wisdom' is usually one of two things...either common sense stuff wrapped up in elaborate phrasings to make them sound extra wise...or gibberish that has no place in a sane person's mind.

Other than capitalism, why do you think a lot of Asians are westernizing? :P
« Last Edit: 06. September 2016, 16:45:56 by Drone-Dragon »
Kolya
Capitalism has nothing to do with this. And your view of Buddhism is simplistic and ignorant. For your own sake, be more curious.

Drone-Dragon

Capitalism has nothing to do with this. And your view of Buddhism is simplistic and ignorant. For your own sake, be more curious.

I don't...get what you're upset about, if you are.

Capitalism was an aside. Asians are westernizing because the western lifestyle is generally seen as more open and less restrictive than their traditional ways, and one can't deny the influence western culture has on Asia. That doesn't mean everyone is into that change, of course, but I don't even get what the upset over that part is about, because the capitalism comment was just an aside, not the point of the comment itself.

I'm not against people believing that enlightenment will just come to them if they sit around long enough, any more than I'm against people believing that when they die they go to a sky paradise. What I am saying is that the western view of Asians is romanticized, much like the western view of Native Americans is romanticized.

To use the Native American thing as an example: "Native Americans, An Illustrated History", a book of over 450 pages detailing the life and hardships of many Native American tribes, describes how bigoted whites on one side made Native Americans look like animals and savages, while white romantics on the other side turned them into some shining example of what culture and people should be. It calls this a 'white fantasy' that ignores a lot of the true Native American heritage.

The truth is that while Native Americans had some more reasonable aspects to them, and respected nature probably more than any other culture in the world, they could also be just as brutal, with some tribes having ritualistic rape as a punishment for adulterous wives, for example.

So no, from my point of view I don't think my take on Asian 'wisdom' is ignorant and simplistic; we have simply romanticized it way too much. It is simply my point of view. If you don't agree with it, that is fine. I don't hate you for it, and as I said, I don't have any problem with people wanting to live the way Buddhist monks do... it's probably that the lifestyle is just more relaxing and carefree, more so than any supposed importance of the religious stuff surrounding it.

If you want to hate me because I disagree with you, that is your choice. If you want to ban me for the same reason, that is within your power. But this comment feels more like a slightly nicer variation of something ZylonBane would post, and less like something I've seen Kolya post. :(

That too, is just my personal opinion, nothing more.



Also, your fly is undone. :P

Sorry, was just trying to lighten the mood.
Kolya
No one romanticised Asian beliefs. You're the one dishing out ignorant and condescending clichés.
I'm not against people believing that enlightenment will just come to them if they sit around long enough

You pose an aggressively stupid cliché and declare your tolerance at the same time. It's like saying "I'm not against blacks dancing around wildly if that's all they know."

As I told you, I didn't think ZylonBane was wrong. You do write a lot of daft posts and I hope you will learn to accommodate. I just thought his reaction was uncalled for in that situation.
fox
Asian 'wisdom' is usually one of two things...either common sense stuff wrapped up in elaborate phrasings to make them sound extra wise...or gibberish that has no place in a sane person's mind.

Buddha's teachings are indeed frequently about stuff that should be common sense, and yeah, buddhist wisdoms can sound like a lot of gibberish to modern westerners. However, just because it sounds like gibberish to you doesn't mean it's meaningless, right? Fortunately there are many books about them, written specifically with the modern westerner in mind. And just because you think something should be common sense doesn't mean it really is (when it counts). Not judging something you don't understand - that should be common sense, yet most of us do it without even realizing it most of the time (including me). Sometimes it's necessary to drag such things into consciousness - over and over again.

Drone-Dragon

No one romanticised Asian beliefs. You're the one dishing out ignorant and condescending clichés. 
You pose an aggressively stupid cliché and declare your tolerance at the same time. It's like saying "I'm not against blacks dancing around wildly if that's all they know."

As I told you, I didn't think ZylonBane was wrong. You do write a lot of daft posts and I hope you will learn to accommodate. I just thought his reaction was uncalled for in that situation.

I'm not against people believing that enlightenment will just come to them if they sit around long enough, any more than I'm against people believing that when they die they go to a sky paradise.

Yet I also said this, but you didn't quote that part or defend Christians and Muslims by saying I was insulting them too.

Which makes it seem more like you have a personal bias against people who disagree with aspects of Asian culture. It's cool if you like Asian culture, but this being cliché is your opinion; I respect that, but I have mine. How is that daft?

Also, there have been worse posts that you didn't give this much attention to...why single me out with such aggression, the same thing you just said I was doing, when there are actual shitposters?

Most of my posts have been either giving links to something people might like or disagreeing and sharing my own opinion...or just joking around. How is that daft? The recreation board is for off-topic threads. Must it all be serious or about cyberpunk?

If my posts are wrong simply because the management doesn't agree with them, then I guess I am daft. I do my best to be nice, even when my wit sometimes gets the better of me and some things I say come off as belligerent even if not intended, and as you have seen I try to apologize for my mistakes.

But I'm not going to go along with a 'Good Ole Boy' system just because of disagreements of opinion. As I said, there are worse posts and topics on here you didn't give this much attention. So I don't see why you are all of a sudden singling me out just because I don't agree with you? That doesn't mean I hate you, or the people I'm talking about or to.

I've actually been trying to talk to you in a meaningful and reasonable manner, while you're simply antagonizing me because of perceived slights or hurt perceptions.

What else can I do but shut up and not say anything, simply because it doesn't conform or 'accommodate' to what you want me to be, even when there is worse on this site that has gotten far less attention, or just jokes, from you and the management? Hell, I've made jokes about it too, but all of a sudden it is a problem. :/

Isn't singling me out because I'm not posting about super serious topics all the time or because my opinion is perceived by you to be arrogant also condescending? Double standard.

Which is weird, because even when you and I have disagreed it was all good and okay, like the Steam argument. Now it's...awkward and tense.

It's cool, I actually wasn't bringing that up again to use against you or as some vain defense even though I did use the 'daft' part after you mentioned it first, I was just making the Zylon comparison as a reference 'in general' to a type of post. I didn't mean to drag that up. I'm generally not that vindictive. I apologize if you saw it that way. :)


Buddha's teachings are indeed frequently about stuff that should be common sense, and yeah, buddhist wisdoms can sound like a lot of gibberish to modern westerners. However, just because it sounds like gibberish to you doesn't mean it's meaningless, right? Fortunately there are many books about them, written specifically with the modern westerner in mind. And just because you think something should be common sense doesn't mean it really is (when it counts). Not judging something you don't understand - that should be common sense, yet most of us do it without even realizing it most of the time (including me). Sometimes it's necessary to drag such things into consciousness - over and over again.



I just got a notice saying this was added as I was writing my 'book' above.

Actually a lot of things can be boiled down to common sense with enough empathy. Everyone has differing opinions, and many cultures truly seem alien to some others. How you view them is all a matter of perspective, and it doesn't mean that a somewhat negative view, is a hostile view...the same way dark is not evil and light is not righteous.