Free will?

Started by Medias, April 07, 2013, 10:20:03 AM


Ephiral

Quote from: Oniya on April 10, 2013, 04:35:44 PM
Love.
Definition? (Yes, I know the concept, but it's very important to nail down specifics in a discussion like this.)

DarklingAlice

I'm still interested in this assertion that we evolved emotions as a benefit as opposed to just an overinflated, pseudo-parasitic feature. I'm not necessarily arguing against it, but I am not sure if that should be the default assumption. Many more fit forms of life do not appear to have emotional capacity, and we know it isn't necessary for societal formation. On most standards of success I imagine a psychopath could perform just as well as, if not actually outcompete, a wild-type individual. Perhaps emotions are an inevitable negative consequence of a certain cognitive capacity as opposed to an acquired benefit. Horrible as that sounds (perhaps), it seems a more likely default hypothesis.


Vekseid

Quote from: DarklingAlice on April 10, 2013, 06:44:17 PM
I'm still interested in this assertion that we evolved emotions as a benefit as opposed to just an overinflated, pseudo-parasitic feature. I'm not necessarily arguing against it, but I am not sure if that should be the default assumption. Many more fit forms of life do not appear to have emotional capacity, and we know it isn't necessary for societal formation. On most standards of success I imagine a psychopath could perform just as well as, if not actually outcompete, a wild-type individual. Perhaps emotions are an inevitable negative consequence of a certain cognitive capacity as opposed to an acquired benefit. Horrible as that sounds (perhaps), it seems a more likely default hypothesis.

There are two important components here. Emotions themselves, and what amounts to emulated models.

I don't think I need to go into why emulated models are an evolutionary advantage. They - and the difficulty of implementing them outside of toy scenarios (literally, AI in games) - are why I tend to view 'hard takeoff' singularity scenarios with a healthy degree of skepticism.

Chemicals that affect our emotional state directly - dopamine, serotonin, and so on - do so because they are neurotransmitters. No dopamine? Dopaminergic receptors don't get triggered, and those neurons are that much more asleep. Since these neurons control the basis of our reward centers - that is, the key to our motivation...

That's what depression is. Chemically, you end up being incapable of motivation - of making certain decisions. It's why depression is such a nasty trap.

Emotions are basically goal adjusters - a single-purpose AI may not have them, but something that requires dynamic adjustment of priorities will end up with something analogous to emotions in order to direct its optimization pressure accordingly.
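
To make that concrete, here's a toy sketch - purely illustrative, the agent, goals, and numbers are all made up and it's not any real architecture - of what "emotions as goal adjusters" might look like: scalar state that rises and decays with events and re-weights a fixed set of goals.

```python
# Toy sketch: "emotions" as scalar state that re-weights goal priorities.
# Everything here is illustrative; it is not any particular AI architecture.

class Agent:
    def __init__(self):
        # Emotional state: scalars that rise and decay with events.
        self.state = {"hunger": 0.2, "fear": 0.0}
        # Base importance of each goal before emotional adjustment.
        self.base = {"forage": 0.5, "flee": 0.1, "rest": 0.4}

    def perceive(self, event):
        # Events push the emotional state around.
        if event == "predator_nearby":
            self.state["fear"] = min(1.0, self.state["fear"] + 0.6)
        elif event == "time_passes":
            self.state["hunger"] = min(1.0, self.state["hunger"] + 0.1)
            self.state["fear"] = max(0.0, self.state["fear"] - 0.2)

    def choose_goal(self):
        # The "emotions" dynamically re-weight the fixed goal set.
        weights = {
            "forage": self.base["forage"] + self.state["hunger"],
            "flee":   self.base["flee"] + 2.0 * self.state["fear"],
            "rest":   self.base["rest"] - self.state["hunger"],
        }
        return max(weights, key=weights.get)

agent = Agent()
print(agent.choose_goal())        # hunger mildly elevated -> "forage"
agent.perceive("predator_nearby")
print(agent.choose_goal())        # fear now dominates -> "flee"
```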

Jude

Quote from: DarklingAlice on April 10, 2013, 06:44:17 PM
I'm still interested in this assertion that we evolved emotions as a benefit as opposed to just an overinflated, pseudo-parasitic feature. I'm not necessarily arguing against it, but I am not sure if that should be the default assumption. Many more fit forms of life do not appear to have emotional capacity, and we know it isn't necessary for societal formation. On most standards of success I imagine a psychopath could perform just as well as, if not actually outcompete, a wild-type individual. Perhaps emotions are an inevitable negative consequence of a certain cognitive capacity as opposed to an acquired benefit. Horrible as that sounds (perhaps), it seems a more likely default hypothesis.
I think it comes down to the fact that being a psychopath is the opposite of pro-social and human beings evolved in a social, primarily tribal context. Emotions are very important for group cohesion. They keep us from making rational decisions (in the economic theory sense) that would otherwise hurt the tribe and indirectly ourselves.

There's a very Hobbesian bent to it, I think.

Plus, what Vekseid said. It reminds me of the Hume quote: "Reason is, and ought only to be the slave of the passions, and can never pretend to any other office than to serve and obey them." We need motivations to apply our tools to.

Stepping out on a techno-philosophical limb, computers (I think) lack emotions because currently they are nothing but our tools. We keep them from having their own goals and directions divorced from our ends. Until we become comfortable with machines acting of their own accord, without our control (and make some serious strides in technological sophistication), we will probably not replicate sentience in computers intentionally. But I can't comprehend why it would be impossible.

It seems that all theories that make it impossible are rooted in a major unstated premise of Cartesian Dualism.

Beguile's Mistress

Quote from: Oniya on April 10, 2013, 04:35:44 PM
Love.
Quote from: Ephiral on April 10, 2013, 04:49:23 PM
Definition? (Yes, I know the concept, but it's very important to nail down specifics in a discussion like this.)

Which love should we define for you?  Love of one person for another in a romantic sense or the love of a parent for a child?  The love of candy or roses?  The love of horseback riding?

But why does a definition of love have anything to do with whether or not a computer can feel it?

Keep in mind that you chose to ask that question and you'll choose whether or not to respond to this post.  That is all free will is: exercising your option to choose.

Ephiral

Quote from: Beguile's Mistress on April 10, 2013, 09:26:27 PM
Which love should we define for you?  Love of one person for another in a romantic sense or the love of a parent for a child?  The love of candy or roses?  The love of horseback riding?

But why does a definition of love have anything to do with whether or not a computer can feel it?
Because without sharply defining it, we'll wind up arguing semantics all day.

Quote from: Beguile's Mistress on April 10, 2013, 09:26:27 PMKeep in mind that you chose to ask that question and you'll choose whether or not to respond to this post.  That is all free will is: exercising your option to choose.
By that definition, a significant number of computers already have free will, and have for decades.

Beguile's Mistress

Quote from: Ephiral on April 10, 2013, 09:30:39 PM
Because without sharply defining it, we'll wind up arguing semantics all day.
Only if you so choose or another poster takes up that mission.

QuoteBy that definition, a significant number of computers already have free will, and have for decades.
No.  They have programmed parameters within which they are forced to operate.  They are not capable of doing anything outside of what they are told is possible. 

Example:  The GPS that sent a man over an embankment because of faulty input and a driver who chose to follow the directions of the computer in spite of his instinct and experience telling him something might be amiss.

Ephiral

Quote from: Beguile's Mistress on April 10, 2013, 09:37:34 PM
Only if you so choose or another poster takes up that mission.
No.  They have programmed parameters within which they are forced to operate.  They are not capable of doing anything outside of what they are told is possible. 

Example:  The GPS that sent a man over an embankment because of faulty input and a driver who chose to follow the directions of the computer in spite of his instinct and experience telling him something might be amiss.
Humans have programmed parameters within which they are forced to operate. They are similarly incapable. And a limited decision tree is still a decision tree.
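
To illustrate - a deliberately trivial, hypothetical example, not a claim about any particular device: even a thermostat runs a limited decision tree. The options are fixed in advance, but it still selects among them based on its input.

```python
# A minimal decision tree: the set of options is fixed in advance,
# but the device still selects an action based on what it senses.

def thermostat(temp_c, setpoint_c=21.0, deadband=0.5):
    if temp_c < setpoint_c - deadband:
        return "heat"
    elif temp_c > setpoint_c + deadband:
        return "cool"
    else:
        return "idle"

for reading in (18.0, 21.2, 24.7):
    print(reading, "->", thermostat(reading))   # heat, idle, cool
```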

Oniya

It is the very ineffable nature of love that makes it impossible to codify it in ways that a computer can process. 
"Language was invented for one reason, boys - to woo women.~*~*~Don't think it's all been done before
And in that endeavor, laziness will not do." ~*~*~*~*~*~*~*~*~*~*~Don't think we're never gonna win this war
Robin Williams-Dead Poets Society ~*~*~*~*~*~*~*~*~*~*~*~*~*~Don't think your world's gonna fall apart
I do have a cause, though.  It's obscenity.  I'm for it.  - Tom Lehrer~*~All you need is your beautiful heart
O/O's Updated 5/11/21 - A/A's - Current Status! - Writing a novel - all draws for Fool of Fire up!
Requests updated March 17

DarklingAlice

Quote from: Vekseid on April 10, 2013, 07:00:36 PM
Chemicals that affect our emotional state directly - dopamine, serotonin, and so on - do so because they are neurotransmitters. No dopamine? Dopaminergic receptors don't get triggered, and those neurons are that much more asleep. Since these neurons control the basis of our reward centers - that is, the key to our motivation...

That's what depression is. Chemically, you end up being incapable of motivation - of making certain decisions. It's why depression is such a nasty trap.

Emotions are basically goal adjusters - a single-purpose AI may not have them, but something that requires dynamic adjustment of priorities will end up with something analogous to emotions in order to direct its optimization pressure accordingly.

But dopamine serves many functions, e.g. motor control, so is its reward mechanism really a primary one? How did it arise? When does emotion come to be a necessity? An advantage? And does it ever stop, and if so, when?

Even single-celled organisms are able to situationally optimize themselves and form rudimentary societies (and not only bacteria; the body itself breaks down this way). Hive-forming insects demonstrate models of (what we presume to be) emotionless efficiency. And while I am not saying I would prefer to not be human, I am saying that by a ton of rubrics they are far more 'successful' than we are (likewise psychopaths). I get that we need a flexible system to prioritize goals. For instance, a drive to breed works for a species' benefit. I'm even willing to say that so long as we have consciousness, why not let us enjoy that and tack on the concept of pleasure? But I'm at a loss to explain from a beneficial standpoint why I spent several hours of last night in rope bondage and hooked up to electrodes and felt pleasure in line with my sex drive despite not being engaged in the reproductive activity it is ostensibly 'for' (or why I have no instinctual qualms about having been sterilized). Basically, at a certain point you gain the capacity to game the reward system. I have to question whether that is to any real advantage (even though I am not going to stop any time soon). To be clear, I am not decrying hedonism, just trying to suss out its natural history.

To bring it back around to computers, we would never actually put together a computer system and give it the ability to masturbate. I'm not saying we can't, just that it does not seem to me a beneficial capacity or a smart thing to do when trying to make an effective system. Which makes me call into question whether such capacities actually work to our benefit.

Quote from: Jude on April 10, 2013, 08:28:38 PM
I think it comes down to the fact that being a psychopath is the opposite of pro-social and human beings evolved in a social, primarily tribal context. Emotions are very important for group cohesion. They keep us from making rational decisions (in the economic theory sense) that would otherwise hurt the tribe and indirectly ourselves.

There's a very Hobbesian bent to it, I think.

Plus, what Vekseid said. It reminds me of the Hume quote: "Reason is, and ought only to be the slave of the passions, and can never pretend to any other office than to serve and obey them." We need motivations to apply our tools to.

It's a little circular, isn't it? We evolved emotions because emotions drive us to be social, and we need to be social because we live in societies? The problem with Hobbes is that he thinks life in the state of nature is 'nasty, brutish, and short' (IIRC). Which really isn't so. I get how that worked when we thought that humanity was unique in being a social animal. Things are different now, though. From simple symbiosis to complex societies, functional group behavior can be seen at every level of life from prokaryotes up. Pseudomonas bacteria do not need concepts such as love to fill class roles in dynamic and complex social groups (though interestingly they do still display kin selection/favoritism), so why is it that we do (or do we?)? Is this a flat-out advantage? A necessary but sub-par solution for dealing with the phenomenon of consciousness (and in turn, is sapience itself beneficial?)? Or a complete accident?

I am so going to lose sleep over this <_<


Beguile's Mistress

Quote from: Ephiral on April 10, 2013, 10:10:35 PM
Humans have programmed parameters within which they are forced to operate. They are similarly incapable. And a limited decision tree is still a decision tree.

No matter how you argue it humans are not machines and machines are not human.  Computers do not make decisions.  They only provide information to allow thinking people to make a more informed decision. 

Computers cannot do anything completely on their own.  A functioning human can attempt anything they choose to try.

Ephiral

Quote from: Oniya on April 10, 2013, 11:24:35 PM
It is the very ineffable nature of love that makes it impossible to codify it in ways that a computer can process.
If you can't explain what you mean when you use a word, I humbly submit that you understand it no more than said computer.

Quote from: Beguile's Mistress on April 10, 2013, 11:41:46 PM
No matter how you argue it humans are not machines and machines are not human.  Computers do not make decisions.  They only provide information to allow thinking people to make a more informed decision. 

Computers cannot do anything completely on their own.  A functioning human can attempt anything they choose to try.
You keep blindly asserting this. I keep pointing out that computers are capable of doing things like navigating 3D space with which they are unfamiliar, including moving obstacles, on the fly and with no user input. Are you really telling me no decisions are made there?
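
Here's a rough sketch of the kind of on-the-fly replanning I'm talking about - a hypothetical grid agent, nowhere near a real navigation stack, but the shape of the loop is the point: it plans a path, discovers an obstacle it was never told about, updates its model, and routes around it with no user input.

```python
# Sketch of on-the-fly replanning on terrain the agent has never seen.
# Hypothetical example only; real navigation stacks are far more involved.
from heapq import heappush, heappop

def plan(start, goal, blocked, size=6):
    """A* on a size x size grid; `blocked` holds cells known to be impassable."""
    def h(a, b):
        return abs(a[0] - b[0]) + abs(a[1] - b[1])   # Manhattan distance
    frontier = [(h(start, goal), start, [start])]
    seen = set()
    while frontier:
        _, cell, path = heappop(frontier)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nxt[0] < size and 0 <= nxt[1] < size
                    and nxt not in blocked and nxt not in seen):
                heappush(frontier, (len(path) + h(nxt, goal), nxt, path + [nxt]))
    return None

# The true world contains a wall the agent was never programmed with.
true_obstacles = {(2, 0), (2, 1), (2, 2), (2, 3)}
known = set()                        # the agent starts out knowing nothing
pos, goal = (0, 0), (5, 0)

while pos != goal:
    step = plan(pos, goal, known)[1]   # next move under current knowledge
    if step in true_obstacles:         # "sensor" hits the unexpected wall
        known.add(step)                # update the world model and replan
        continue
    pos = step

print("reached", pos, "after routing around", sorted(known))
```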

A human is capable of attempting anything they choose, yes, but the very things they may or may not choose - or even conceive of - are limited by their psychology and brain chemistry. How many people in this thread do you think could murder a stranger in cold blood?

Beguile's Mistress

Quote from: Ephiral on April 10, 2013, 11:46:06 PM
You keep blindly asserting this. I keep pointing out that computers are capable of doing things like navigating 3D space with which they are unfamiliar, including moving obstacles, on the fly and with no user input. Are you really telling me no decisions are made there?
Not by the computer.  Input may derive from previous calculations but those calculations are based on prior programming.  A computer can't ask "What if...?" and postulate a response.  It does what it is told until it can't do it any longer.

QuoteA human is capable of attempting anything they choose, yes, but the very things they may or may not choose - or even conceive of - are limited by their psychology and brain chemistry. How many people in this thread do you think could murder a stranger in cold blood?
Even humans with limited mental ability are not static.  They may be limited at this moment by psychology and brain chemistry and even experience, education, sight, hearing, smell, taste, sensory input, current state of inebriation or influence of medication or drugs, the temperature, time of day, the amount of sleep they had the night before, what they had for breakfast, an argument with an SO, boss, parent or child, when their birthday falls, the last conversation they had, if they have enough money to pay their bills, an allergic reaction to something in nature, the toe they stubbed getting out of bed, the fact their dog chewed the morning paper or the hard drive died on their computer. 

We run into potholes, speed bumps and roadblocks in life all the time and maneuver our way around them by choice.  When new information is needed, we have the choice to find it on our own or ask for help.

In the book "Jurassic Park" a computer was programmed to count the creatures on the island.  It would do that and kept finding all the animals it was looking for.  It wasn't told to count ALL the animals and it would stop when it hit the specified number.  A human would have found the number of animals it was looking for but evidence of additional numbers would have given that person the opportunity to choose to investigate further and discover a problem.

As a human with free will, I refuse to have my humanity or essence or whatever you wish to call it reduced to the level of a machine. 

As for murdering someone, I know it's possible.  Any one of us could murder someone in cold blood, but a very high percentage of us would have to consciously decide to do that by exercising our free will.

Ephiral

Quote from: Beguile's Mistress on April 11, 2013, 12:07:19 AM
Not by the computer.  Input may derive from previous calculations but those calculations are based on prior programming.  A computer can't ask "What if...?" and postulate a response.  It does what it is told until it can't do it any longer.
...are you kidding me? There's a reason I specifically cited "unfamiliar terrain". These machines, these devices, are dealing with situations that literally cannot have been previously programmed into them. They do predictive modelling all the damn time. Hell, most predictive modelling is done by computers these days.
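
For a sense of how mundane this is, the snippet below is about the smallest predictive model there is - a least-squares line fit to a handful of made-up observations, extrapolated one step ahead. It's a toy, not the production systems I'm alluding to, but it is a machine building and using a model of data nobody programmed into it in advance.

```python
# Trivial predictive model: ordinary least squares on a 1-D trend,
# then extrapolate one step ahead. The numbers are made up.

xs = [0, 1, 2, 3, 4]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]          # hypothetical past observations

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

next_x = 5
print(f"predicted y at x={next_x}: {intercept + slope * next_x:.2f}")
```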

Quote from: Beguile's Mistress on April 11, 2013, 12:07:19 AMEven humans with limited mental ability are not static.  They may be limited at this moment by psychology and brain chemistry and even experience, education, sight, hearing, smell, taste, sensory input, current state of inebriation or influence of medication or drugs, the temperature, time of day, the amount of sleep they had the night before, what they had for breakfast, an argument with an SO, boss, parent or child, when their birthday falls, the last conversation they had, if they have enough money to pay their bills, an allergic reaction to something in nature, the toe they stubbed getting out of bed, the fact their dog chewed the morning paper or the hard drive died on their computer. 

We run into pot holes, speed bumps and road blocks in life all the time and maneuver our way around them by choice.  When new information is needed we have the choice to find it on our own or ask for help.
And these are any different than the parameters any machine operates within... how? Complexity is the closest I see you coming to making a distinction here.

Quote from: Beguile's Mistress on April 11, 2013, 12:07:19 AMIn the book "Jurassic Park" a computer was programmed to count the creatures on the island.  It did that, and kept finding all the animals it was looking for.  It wasn't told to count ALL the animals, so it would stop when it hit the specified number.  A human would have found the number of animals they were looking for, but evidence of additional animals would have given that person the opportunity to choose to investigate further and discover a problem.
And the movie gave us the infamous "It's a UNIX system! I know this!" scene. Fiction makes very poor examples, which is why mine are culled from real life. This one is an especially poor example, since it couldn't even do what it was designed for. Would you point to a story about a broken watch and say "Therefore, machines can't tell time"?

Quote from: Beguile's Mistress on April 11, 2013, 12:07:19 AMAs a human with free will, I refuse to have my humanity or essence or whatever you wish to call it reduced to the level of a machine.
This seems to be the core of your argument: "I find this insulting, therefore it's wrong." Unfortunately, logic and evidence do not work that way.

Quote from: Beguile's Mistress on April 11, 2013, 12:07:19 AMAs for murdering someone, I know it's possible.  Any one of us could murder someone in cold blood, but a very high percentage of us would have to consciously decide to do that by exercising our free will.
If you honestly think you could murder a complete stranger in cold blood... please tell me you're either a member of the military or in psychiatric care. Your standard neurotypical civilian is patently not capable of this. There is a reason that a significant amount of military research and training is centered around desensitization and othering the enemy.

Beguile's Mistress

You will never convince me that a machine is capable of making life choices that can only be made by a human.  A machine cannot feel anything and without that it is fatally limited in any so-called decision making ability. 

Since you are unable to carry on this discussion in an objective and logical manner without becoming abusive I'll withdraw.  That is called exercising my free will.


Ephiral

...yeah, this post was a mistake. My apologies.

Vekseid

Ephiral, the proper course of action in these cases is to report the behavior. To myself or other G-level staff, if needed. Please do not escalate these situations by engaging in further snark.

Quote from: Beguile's Mistress on April 11, 2013, 12:30:22 AM
You will never convince me that a machine is capable of making life choices that can only be made by a human.  A machine cannot feel anything and without that it is fatally limited in any so-called decision making ability. 

Since you are unable to carry on this discussion in an objective and logical manner without becoming abusive I'll withdraw.  That is called exercising my free will.

No. Ephiral behaved perfectly fine until their last message prior to yours, with their cold-blooded murder comments.

I fully understand that people like yourself and Oniya have an emotional investment in your positions. I try really hard not to denigrate it.

In order to 'disprove' an artificially intelligent agent's ability to adapt to new situations and make novel decisions, you cite... a use-case failure from a book whose movie adaptation was released before some members of this site were born.

Your comment is like saying sending something to Alpha Centauri is impossible because no one could ever pedal their bicycle fast enough to even escape the Solar System. It's hard to know even where to begin with that. One item is a machine, and remains so despite being programmed. Another is frequently termed an intelligent agent. They are completely different things. One of these has a model for the 'world' around it, and a sense-update-decide-act loop. The other is a machine - which probably still has some AI involved, but lacking said loop, it's not actually an agent in any meaningful sense.
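
For the curious, the loop I mean is nothing exotic. A bare-bones skeleton - my own illustration, not a reference implementation of anything - looks like this:

```python
# Skeleton of a sense-update-decide-act agent loop (illustrative only).
import random

class SimpleAgent:
    def __init__(self):
        self.world_model = {"obstacle_ahead": False}   # internal model of the world

    def sense(self):
        # Stand-in for a real sensor reading.
        return {"obstacle_ahead": random.random() < 0.3}

    def update(self, observation):
        self.world_model.update(observation)           # fold the observation into the model

    def decide(self):
        return "turn" if self.world_model["obstacle_ahead"] else "forward"

    def act(self, action):
        print("acting:", action)

agent = SimpleAgent()
for _ in range(5):          # each tick: sense -> update -> decide -> act
    agent.update(agent.sense())
    agent.act(agent.decide())
```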

Regardless, it's immensely off-topic. If you want to learn about AI, you are free to ask in a new thread, but I'd hope for a little less magical thinking and a little more willingness to grow.


Beguile's Mistress

Understood, Vekseid.

Computers are amazing machines and become capable of more complicated maneuvers every day.  They are amazing tools and mankind has benefited greatly from their existence and will continue to do so. 

Discussions can provide opportunities for learning and growth.  I'll happily take part in any discussion where opinions can be freely stated and exchanged with mutual respect.  Frustration can build when a position or opinion appears to be under attack.  The conversation was devolving into something that was taking on a personal aspect, and certain remarks were showing that.  I chose, by an exercise of my free will, to disengage at that point because experience has shown that after a certain point objectivity is lost.

Humans are capable of making choices and decisions.  Those require conscious effort.  It doesn't mean the choices and decisions are always right or the best ones but I will take responsibility for the ones I make even if my thinking is thought to be emotional and magical. 

Free will exists in my opinion and threads like this are proof of that.  The only people I have ever met in my life who dispute the existence of free will are those who wish to avoid taking responsibility for their choices. 




Oniya

Quote from: Ephiral on April 10, 2013, 11:46:06 PM
If you can't explain what you mean when you use a word, I humbly submit that you understand it no more than said computer.

And I would probably agree with you, and raise that none of us really understand love.    If we did, a large number of singer/songwriters would be out of material.  :-)

But the thing is that I don't try to understand it.  I just do it.  The computer needs to understand something in order to perform a task. 
"Language was invented for one reason, boys - to woo women.~*~*~Don't think it's all been done before
And in that endeavor, laziness will not do." ~*~*~*~*~*~*~*~*~*~*~Don't think we're never gonna win this war
Robin Williams-Dead Poets Society ~*~*~*~*~*~*~*~*~*~*~*~*~*~Don't think your world's gonna fall apart
I do have a cause, though.  It's obscenity.  I'm for it.  - Tom Lehrer~*~All you need is your beautiful heart
O/O's Updated 5/11/21 - A/A's - Current Status! - Writing a novel - all draws for Fool of Fire up!
Requests updated March 17

meikle

Quote from: Beguile's Mistress on April 11, 2013, 08:59:52 AMFree will exists in my opinion and threads like this are proof of that.  The only people I have ever met in my life who dispute the existence of free will are those who wish to avoid taking responsibility for their choices.
Perhaps they are people who are not content with an opinion, and want to know what empirical observation of reality has to conclude on the matter?

Ephiral

Quote from: Oniya on April 11, 2013, 10:00:01 AM
And I would probably agree with you, and raise that none of us really understand love.    If we did, a large number of singer/songwriters would be out of material.  :-)

But the thing is that I don't try to understand it.  I just do it.  The computer needs to understand something in order to perform a task.
If you don't understand it, how can you say with any degree of assurance that it is incomputable? Is it not possible that you just have insufficient data?

Quote from: meikle on April 11, 2013, 10:02:55 AM
Perhaps they are people who are not content with an opinion, and want to know what empirical observation of reality has to conclude on the matter?
This. Frankly, I find it pretty damn insulting to be accused of being abusive in one breath and then told that I just want an excuse to take no responsibility in the next. What I actually want is to believe what is true, regardless of how uncomfortable it might be. Physics is deterministic. The brain operates within physics. I see a great need for compelling evidence if we're going to claim that the brain is not deterministic - and no such evidence. Indeed, the only attempts at rebuttals I've seen thus far are arguments from incredulity or from personal distaste.

Beguile's Mistress

Quote from: meikle on April 11, 2013, 10:02:55 AM
Perhaps they are people who are not content with an opinion, and want to know what empirical observation of reality has to conclude on the matter?

They have every right to try and find that if it exists.  They are free to form their own hypothesis and seek proof.  I wish them luck with the search.

Ephiral

Quote from: Beguile's Mistress on April 11, 2013, 10:10:29 AM
They have every right to try and find that if it exists.  They are free to form their own hypothesis and seek proof.  I wish them luck with the search.
It seems to me that "this physical object does not violate physics as we know it" is the null hypothesis. I'd say the other side is the one that needs proof.

Beguile's Mistress

Quote from: Ephiral on April 11, 2013, 10:15:27 AM
It seems to me that "this physical object does not violate physics as we know it" is the null hypothesis. I'd say the other side is the one that needs proof.

I'm not sure what you mean by this.  The other side of what?  Where did I say what you have within quotation marks? 

Ephiral

Quote from: Beguile's Mistress on April 11, 2013, 10:26:52 AM
I'm not sure what you mean by this.  The other side of what?  Where did I say what you have within quotation marks?
"The other side" being the people that claim that free will is possible (and thus the brain cannot be deterministic, and thus it is an exception to physics as we know it). You did not say the part I had in quotes; that is my position. I submit that, if you are going to argue a position that inevitably puts you on one side and every example of macro-scale physics on the other, you had best have some extremely strong evidence for your case.