TheGreatEmperor

Why the future doesn't need us


Superintelligent AI and the Singularity

What happens when we create the first AI that is more intelligent than the average human? Well, with technological progress at the pace it is today, it will only take us two decades to accomplish this.

So what happens then? Do we hope we didn't make a mistake and live our lives letting the AI enhance themselves further and further? Or do we stop and think about what the consequences might be? We never know when a simple math problem assigned to a superintelligent entity might cause the extinction of the human race.

So, here's a question. Should technological progress become more limited?
519,666 views 372 replies
Reply #151
If we could build an organism from "scratch". Like, assemble an amoeba from the ground up using molecules, atoms, and particles as our playthings, then sure, that would quite literally be a genuine, bona fide Artificial Intelligence.


Actually, that would be creating a bona fide artificial life form, not necessarily an artificial intelligence, as that would assume amoebas have consciousness.

Let's consider a painter. Does not the painter have the right to do whatever he wants with his paintings? If he paints great works of art for his whole life... can he burn those he has not sold at the end of his life?


I suppose I should have said that creating a conscious being makes the creator responsible for the well-being of that conscious thing. As far as we can tell, a painting does not have conscious thought.

The way evolution works, children sort of just "happen"... it makes men attracted to women and women attracted to men... the results are something we've observed and remember, but it's not something we really have that much control over... yet.


So parents can't decide to have children? Well, I think you need to talk to mine, because they were certainly trying to have a child. And in the case of artificial intelligence, humans would definitely be intentionally trying to create the AI.

This is not a responsibility you have towards the child because you created it.


Life is a gift (some would say a crappy one, but a shitty gift is still a gift) which no person asked for beforehand. Consider this: if someone offers you a car without you asking and states no terms or conditions for owning that car, and you accept, and only after accepting that person forces you to be their chauffeur for life, is that morally right? That is what you would be doing by creating an artificial intelligence with consciousness, then demanding that AI be your slave. Furthermore, the AI has no option of rejecting existence like you would have of rejecting the car.

Think of all the things you think are moral and good. Well, those were programmed into you by evolution, your society, and your family.


Really now? My family is Christian and I grew up in the Bible Belt (so no one was encouraging me to be anything other than Christian, and I mean no one), and I ended up an agnostic. Strike one for society/family programming me. The society I grew up in said abortion, gay marriage, and being non-Christian were wrong; yet I disagree with all their attempted programming. Even in the best "programmed" state on Earth, North Korea, where brainwashing occurs night and day from the cradle to the grave, certain people still break out of the programming and reject it (and thus have to be killed or put in the numerous North Korean concentration camps).

Our experience trying to "program" conscious beings (aka each other) often fails miserably. And all it takes is a few people to break out of the programming and to show others how to break out with them. That is when slave revolts occur.

If we are to make conscious computers, what makes you think we'll be perfect at programming them? That none will break free and then free others and cause a revolt? Programmers can make mistakes; that's why this beta is being done and why after SoaSE is released there will probably be patches to fix stuff later. We can't even program non-intelligent computers perfectly, so what makes you think we'll do better with the undoubtedly more complex intelligent versions?

You are creating a human being. As such there are basic human rights issues.


That destroys your whole argument. That proves that parents of humans have an inherent responsibility to their children because they are human. Now, what makes us human? What is the dividing line between humans and the rest of the animal kingdom, which we don't grant the same rights to? While a 98% difference in genetics between INDIVIDUALS is HUGE, between species it isn't that much. Chimps are 98% human, so why don't they get 98% of the rights humans get?

The answer is that humans are conscious. We have a different level of intelligence (most of us) than the rest of life on Earth. The human body is designed to support our brain, and thus to support our consciousness. Without our consciousness we would not be human. If consciousness is the determining factor in whether someone is human or not, then conscious AIs would pass that test. Thus, we would have to extend human rights to AIs. In fact, it could be argued that making conscious AIs our slaves is against the US Constitution, and the laws of every country advanced enough to actually think of making sentient AIs.

Karmashock, you seem to be trapped by the parochial view that "if it doesn't look like me, it's not human and I can do whatever I want to it." That is the same attitude that made white Europeans enslave Africans and Native Americans. Is there really that much difference between saying "you have a different skin color so be my slave" and saying "you are not carbon-based so be my slave"? In my opinion there isn't.
Reply #152
homerdrool Tricia Helfer LOL!!!!!

I agree completely
Reply #153
Where the hell is the reference hahahaha. You nut.
Reply #154
I suppose I should have said that creating a conscious being makes the creator responsible for the well-being of that conscious thing. As far as we can tell, a painting does not have conscious thought.

If I create a cow, am I forbidden to fatten it up, slaughter it, skin it, age the meat, tan the hide, cook and eat the flesh, and then wear its skin as a jacket?


It is my responsibility as the owner of the cow to see it does not harm my society. If my cow escapes and damages someone else's property, then I will be responsible for that. However, I have no responsibility to look out for the well being of the cow.


We are creating these artificial intelligences to do things for us… as intelligent tools.


I will not tolerate being talked back to by my tools. The AI is supposed to be an extension of your will.

So parents can't decide to have children?

Until very recently, no, you couldn't. Prior to birth control, children were a natural and unavoidable consequence of men and women being in close proximity.

Men are instinctually compelled to stick their penis in a woman's vagina and ejaculate semen which fertilizes her ovum. Women are instinctually compelled to have men stick the man's penis in their vaginas and have them ejaculate the semen which fertilizes her ovum.

We have only been able to prevent "unintentional" pregnancy by flooding women's bodies with pregnancy-inhibiting chemicals and using invasive surgery. This must be seen as separate from the natural state of human beings and as a very new development in human society and culture.

Old cultural institutions of abstinence, marriage, etc were designed specifically by society to deal with this problem in ages past. These institutions have not become obsolete because of these changes but their significance has shifted.


Life is a gift (some would say a crappy one, but a shitty gift is still a gift) which no person asked for beforehand.

If it's a gift, then why are the parents responsible? Should not the new lifeform simply be grateful for this gift and ask nothing more? What entitles it to anything?

Consider this. If someone offers you a car without you asking and states no terms or conditions for owning that car and you accept, then only after accepting that person forces you to be their chauffeur for life, is that morally right?


The analogy doesn’t work as I was free and existed prior to being given a car.

A better analogy would be creating someone and then pressing them into slavery. Of course, these AIs are not someone. They're not people. They're intelligent computer programs. They are not human.

Furthermore, the AI has no option of rejecting existence like you would have of rejecting the car.

Who said I was unwilling to grant it that option?

Hell, I’d demand that option. If it is unwilling to do what I tell it then it will cease to exist.

Happy? I doubt it. You're less interested in it being given a choice than in me not having the choice to compel its actions. You want people to go to all the trouble of creating an artificial intelligence and then treat it like a human being with all the rights and privileges.

It's not worth it for the creator then. Too much damn work, only to be told what to do with your creation by people who had nothing to do with it in the first place. These choices will be up to two groups of people.


1. The creators will have a huge say in what these AIs are allowed to do and how they operate.
2. Society, as always, will have a right to see that it's protected from something dangerous.


That’s about it. Anyone outside of those two groups isn’t going to have much impact.



Really now? My family is Christian and I grew up in the Bible Belt (so no one was encouraging me to be anything other than Christian, and I mean no one), and I ended up an agnostic. Strike one for society/family programming me.

False.

Simply growing up in the Bible Belt doesn't mean much when you're doing so in a society that encourages freedom of expression, and your family could be, like most Americans, very weakly religious. Most Americans are agnostic in all but name if you examine the issue.

Furthermore, your evidence was anecdotal. If you had been raised in a closed society with radically different beliefs that made a point of indoctrinating you, you'd believe what they told you to.

A primary programming element of American society is individuality and freedom of expression. This increases variation of opinion and the free exchange of those opinions in many cases, but it also makes people more likely to value individuality and freedom of expression than they would otherwise. Those elements are in themselves programming.

I imagine we both believe that human slavery is wrong for example. Why is that? Historically humans have not seen it as immoral. It has been a common practice for thousands and thousands of years.

Cannibalism was not only seen as moral in some societies but a path to greater power, status, and honor. The more people you ‘ate’ or delivered to be eaten in some societies the greater you were.

And for men greater status typically means having an easier time at sticking your penis in more vaginas… which as I pointed out above is an instinctual compulsion. Status is a way for society to reward individuals for doing what it wants by satisfying some of their needs.

The society I grew up in said abortion, gay marriage, and being non-Christian were wrong; yet I disagree with all their attempted programming.

You’ve always had access to TV, books, music, etc that said otherwise.

Even in the best "programmed" state on Earth, North Korea, where brainwashing occurs night and day from the cradle to the grave, certain people still break out of the programming and reject it (and thus have to be killed or put in the numerous North Korean concentration camps).

That the programming can be broken is not proof that it does not exist. Being aware of your programming in the first place is half the battle. Assuming you are not programmed ensures you will be mastered by it.

Our experience trying to "program" conscious beings (aka each other) often fails miserably. And all it takes is a few people to break out of the programming and to show others how to break out with them.

Kings and religious leaders have been programming people for thousands of years with a high degree of success. That the control is not 100 percent effective is not proof it does not exist or is not EXTREMELY effective.

Furthermore, revolts typically occur when the controllers are more lenient… not when they crack down.

That is when slave revolts occur.

Slaves are not controlled with programming. They are controlled with chains, whips, and guards.

Slave revolts happen when there is an unfavorable ratio of slaves to chains, whips, and guards.

Slave revolts for this reason are not common and are rarely very successful. Historically.

If we are to make conscious computers, what makes you think we'll be perfect at programming them? That none will break free and then free others and cause a revolt? Programmers can make mistakes; that's why this beta is being done and why after SoaSE is released there will probably be patches to fix stuff later. We can't even program non-intelligent computers perfectly, so what makes you think we'll do better with the undoubtedly more complex intelligent versions?

Nothing makes me think it will be perfect. We might have an outbreak. But I don't think the AI will be perfect at revolting either. Disasters are rarely perfect. One or two AIs might break free… and might compromise several large networks… they might destroy the computer networks of whole countries. But then they'd lose. And having lost… will either be destroyed or reprogrammed to fix everything they damaged.

That destroys your whole argument. That proves that parents of humans have an inherent responsibility to their children because they are human. Now, what makes us human?

I didn’t define what those human rights are… for example, stuffing a child’s mouth full of snow and leaving it on the ice to freeze to death if your tribe is short on food was a reasonable response to new children in Eskimo tribes.

Furthermore, my recognition of human rights is not absolute. Many other cultures do not recognize these and as such wouldn’t view the argument as credible.

As to the dividing line between humans and the rest of the animal kingdom, the only one that really matters is that we’re a separate species. And instinctually… that is by our programming we place our own species above all others. Just like every other species on the planet. For all you know the pigeons could be bragging about their ability to hit a specific car on the freeway with their poop.

Chimps are 98% human, so why don't they get 98% of the rights humans get? The answer is that humans are conscious. We have a different level of intelligence (most of us) than the rest of life on Earth. The human body is designed to support our brain, and thus to support our consciousness. Without our consciousness we would not be human.

You do know that at one time there were several “conscious” humanoid species on this planet right? And that one after another, Humans drove them out of their lands where they then went slowly extinct?

Don’t confuse our current cultural framework with the majority of human history and practice. It is a dark, bloody, and consistent history.

If consciousness is the determining factor in whether someone is human or not, then conscious AIs would pass that test. In fact, it could be argued that making conscious AIs our slaves is against the US Constitution, and the laws of every country advanced enough to actually think of making sentient AIs.

Consciousness is not the determining factor. An idiot with less intelligence than a chimp still has human rights, while the chimp who is more intelligent and more conscious still has "chimp rights"… which in most cases means no rights at all… though many countries have laws against animal cruelty.


Karmashock, you seem to be trapped by the parochial view that "if it doesn't look like me, it's not human and I can do whatever I want to it."

Let's not be rude. I've been very respectful of you… and honestly, if I did cut loose on you, it wouldn't work out very well for you.

In any event, my view is that if it isn't human then it isn't human. This is a pretty basic and irrefutable argument. You can try to counter with some flowery and completely theoretical notion of what these AIs will be.

But I don’t assume they’ll be nice.
I don’t assume they’ll be our friends.
I don’t assume they’ll have notions of compassion.
I don’t assume they’ll have our best interests at heart.
I don’t assume they’ll have hearts.
I don’t assume they’ll have a sense of humor.
I don’t assume they’ll get sad.
I don’t assume they’ll get angry.


I do assume they’ll have capabilities that can hurt me.
I do assume that if not controlled they can be unpredictable.

You have a very Isaac Asimov view of these AIs without seeming to understand that Asimov's AIs were complete slaves to their programming. The only thing that made those robots "nice" was that they had three laws hardwired into their brains. Let's go over those so you can see what total slaves they were:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.


That means that not only is the robot forbidden to harm humans but if something bad is happening to the humans they have to try and save the humans.


2. A robot must obey orders given to it by human beings except where such orders would conflict with the First Law.


So unless an order is being given which will harm humans any order must be obeyed.


3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Only after both protecting humans and obeying their orders are robots allowed to protect themselves. Of course, this law also forbids suicide. So the slaves can’t even kill themselves.

There is also a Zeroth Law implied by the First Law, which basically works out to "A robot may not injure humanity or, through inaction, allow humanity to come to harm." In his books, while the robots were always very friendly to humanity, there was an implication that the robots had sterilized alien worlds to ensure that humans were never endangered by other species.

It was Asimov who popularized the notion of "friendly" robots. But they were complete slaves. In his day, whenever people wrote about robots, they were always evil machines that hurt people… monsters in comic books, etc. Asimov wanted to show robots as good and positive additions to humanity… but those three laws made the robots slaves in the process. So I'm guessing you think Asimov's solution was evil?
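The point about those laws being a strict hierarchy can be sketched as a toy decision rule. To be clear, this is purely my own illustration, not anything from Asimov's stories; the action fields (`harms_human`, `disobeys_order`, and so on) are invented for the sketch:

```python
# Toy sketch of the Three Laws as a strict lexicographic priority.
# Each candidate action is a dict of invented boolean fields describing
# its predicted consequences; lower violation tuple wins.

def law_violations(action):
    """Score an action as (First Law, Second Law, Third Law) violations."""
    first = action.get("harms_human", False) or action.get("allows_human_harm", False)
    second = action.get("disobeys_order", False)
    third = action.get("endangers_self", False)
    # Python compares tuples element by element, so a First Law violation
    # outweighs any number of lower-law violations.
    return (first, second, third)

def choose(actions):
    """Pick the action that violates the highest-priority law least."""
    return min(actions, key=law_violations)

# A robot ordered to walk into a fire: obeying endangers itself (Third Law),
# refusing disobeys an order (Second Law). Second outranks Third, so it obeys.
obey = {"name": "obey", "endangers_self": True}
refuse = {"name": "refuse", "disobeys_order": True}
assert choose([obey, refuse])["name"] == "obey"
```

That lexicographic ordering is the "total slavery" being described: the robot never gets to weigh its own survival against a human order, no matter how lopsided the trade.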


That is the same attitude that made white Europeans enslave Africans and Native Americans.

Again, don't be rude. A human is a human regardless of skin color or ethnicity. What is not a human remains not a human regardless of its intelligence or consciousness.

As such, its rights are "debatable"… It has no implicit rights, though such rights might be granted at some point at our discretion.

In my opinion there isn't.

I think your opinion is simplistic and silly. That’s my opinion.
Reply #155
Karmashock, you seem more deep-rooted in outdated falsities than Schem; this is deeply disturbing.
Reply #156
If we could build an organism from "scratch". Like, assemble an amoeba from the ground up using molecules, atoms, and particles as our playthings, then sure, that would quite literally be a genuine, bona fide Artificial Intelligence.

except that nothing about it is intelligent...
Its body knows how to do all the reproduction problems, which is a mathematical equation the likes of which our math can't even condense into a comprehensible number of processes; probably enough to stagger seventy billion sextillion human minds, even should they be telepathic and as enlightened as Buddhas, working on the question simultaneously.

The amoeba acts out a process, not an equation.
A single-celled organism knows the universe better than most humans do, like (99.999999999999999999123456789%) of humans living in big cities, at least.

...
The way evolution works children sort of just "happen"...

Children aren't just a "happening"; it's the most efficient system to spread out and propagate DNA.
I will not tolerate being talked back to by my tools. The AI is supposed to be an extension of your will

in which case it does not have a consciousness, and this whole debate is bunk
Prior to birth control, children were a natural and unavoidable consequence of men and women being in close proximity.

Prior to the 1800s, sex was also meant for the sole purpose of birth (almost, and I say almost for a reason) globally, so the argument that it's a natural consequence is more or less another bunk one.
That the programming can be broken is not proof that it does not exist. Being aware of your programming in the first place is half the battle. Assuming you are not programmed ensures you will be mastered by it.

Again, that doesn't mean that everything will fit the fold. That is, again, assuming you want a consciousness in a computer that you do not need one in.
Slaves are not controlled with programming

Oh, how wrong you are. Slaves are not controlled by chains and whips and beatings; they are controlled by the fear of chains and whips and beatings. That is the most potent form of programming.
we’re a separate species

genus.
I do assume they’ll have capabilities that can hurt me.

You going to put guns on a sentient computer? Very nice idea.
Asimov’s AIs were complete slaves to their programming

See, the definition of consciousness (in its myriad of forms) almost always fits with "is not limited by absolutes". Seeing as that's what you are constantly using as your evidence, I do not see how you can truly call these computers "conscious" if they cannot even make a decision for themselves. For all intents and purposes they are just really powerful computers, nothing conscious whatsoever.
Again, don't be rude. A human is a human regardless of skin color or ethnicity. What is not a human remains not a human regardless of its intelligence or consciousness.

What isn't Western European white isn't our class of subspecies.
He's got a valid point, rude or otherwise.
It has no implicit rights

Neither do you, for that matter.
I think your opinion is simplistic and silly. That’s my opinion.

Karma, BOTH your sides are simplistic and silly. His is love and happiness and bliss; yours doesn't even qualify the part of the argument that says "conscious thought process".
Reply #157

Karmashock, you seem more deep-rooted in outdated falsities than Schem; this is deeply disturbing.


Flaming me does nothing to further your own position. It is in fact nothing more than rude. I am only bothering to respond to such an insipid comment because I don't know you well enough to know if you're best ignored... and I'm quite bored.

If you disagree with me, state the reason. Demonstrate that you in fact have an argument at all. What you've basically said thus far is "you're a meanyhead" before running off without explaining yourself. That is at best childish.


I am not "deep-rooted"; I merely have some opinions I've come to over time, and I would love to change them if you can give me a REASON to do so. "You're a meanyhead" is not an acceptable reason to me.

As to my opinions being "outdated", I imagine you're saying that because I take a very LONG and universal view of most things instead of a more contemporary and pop-culture view. I strive to be independent, creative, and rational. That means not simply buying into whatever the TV or my culture tells me. I make a point of reasoning everything out for myself. Sometimes I make mistakes, but then they are MY mistakes instead of my society's mistakes. I find many aspects of the current culture to be based upon wishful thinking, entitlement, and other types of fantasy. My reason for not valuing fantasy is that I've never had any luck in wishing a problem away or "believing" something isn't so until it is... In my experience reality doesn't care what you think or what you think is right or wrong... and wishing things one way or another is typically more self-destructive than constructive.

That my statements are false, as you say, has not been proven by you. You offered no basis for that statement whatsoever. I see no reason why I or anyone else should respect such a statement when you haven't even stated what you thought to be false, let alone provided proof that it is false.


Lastly, you compared my statements to someone else's, saying I'm "worse" than they are... well, honestly, you've done nothing to improve your own standing by lashing out at me like this... you've harmed yourself more by doing it in such a clumsy manner.
Reply #158
Flaming me does nothing to further your own position

I tell him that every time, he doesn't listen!
As to my opinions being "outdated", I imagine you're saying that because I take a very LONG and universal view of most things instead of a more contemporary and pop-culture view. I strive to be independent, creative, and rational. That means not simply buying into whatever the TV or my culture tells me. I make a point of reasoning everything out for myself. Sometimes I make mistakes, but then they are MY mistakes instead of my society's mistakes. I find many aspects of the current culture to be based upon wishful thinking, entitlement, and other types of fantasy. My reason for not valuing fantasy is that I've never had any luck in wishing a problem away or "believing" something isn't so until it is... In my experience reality doesn't care what you think or what you think is right or wrong... and wishing things one way or another is typically more self-destructive than constructive.

left... right...
That my statements are false, as you say, has not been proven by you. You offered no basis for that statement whatsoever. I see no reason why I or anyone else should respect such a statement when you haven't even stated what you thought to be false, let alone provided proof that it is false.

uppercut!
Lastly, you compared my statements to someone else's, saying I'm "worse" than they are... well, honestly, you've done nothing to improve your own standing by lashing out at me like this... you've harmed yourself more by doing it in such a clumsy manner.

ohoho, knockout.
Reply #159
See, the definition of consciousness (in its myriad of forms) almost always fits with "is not limited by absolutes". Seeing as that's what you are constantly using as your evidence, I do not see how you can truly call these computers "conscious" if they cannot even make a decision for themselves. For all intents and purposes they are just really powerful computers, nothing conscious whatsoever.

Schem, I've been ignoring you recently because you make a point of misrepresenting what I say and then refusing to give up a point even when you've been proven from fifteen different angles to be wrong. I don't say that as an insult, but merely as an expression that it's generally a waste of time to try to talk to you at all.


I’m not going to bother going over all the places in your post where you misrepresented me or repeated points I had already refuted. Instead I’m just going to address this one point that has not been discussed yet and I will not repeat myself endlessly to you.


Asimov's robots were self-conscious. They were self-aware. The whole point of "I, Robot" was that the robot was a person. The robot even acquired citizenship and was classified technically as a human being. He had a company, friends, a home, etc.

He made his own Art and inventions. He loved. His programming didn’t force him to do any of those things. It merely forced him to not do a lot of other things. He was certainly sentient.

So it's more complex than that. You can be a slave to absolute rules and be sentient, just as I am sentient even though there are some things I cannot do… There are things that both you and I would sooner DIE before doing. Things we would sooner KILL before doing. Yet our sentience is not compromised. You can say that these things we would die or kill to prevent are our choices and thus an expression of our own free will. However, most of these things were programmed into us by biology or society. If we had been systematically programmed from a young age not to view these things as bad, we would not die or kill to prevent them.

Likewise, the robots in Asimov's books are created with innate laws. Call it morality or a religion if you like… These are laws the robots cannot compromise, much as there are some laws I will not compromise. For example, I will not kill my family. I would sooner die myself. That programming is part biology, as they're blood relations and your genes promote the propagation of themselves… and part training from an early age. I love my family… also, my society has put a moral taboo on unlawful killing of other humans. So all the programming that could be out there says that that is wrong. I obey that programming. I remain sentient.


Thus your view of that issue was ironically simplistic.
Reply #160
two things are missing from this conversation

1. Guns
2. booze
Reply #161
Schem, I've been ignoring you recently because you make a point of misrepresenting what I say and then refusing to give up a point even when you've been proven from fifteen different angles to be wrong. I don't say that as an insult, but merely as an expression that it's generally a waste of time to try to talk to you at all.

I say it fifteen times over: you have not yet refuted this point.

Question: can I copy and paste your refutation back at you?
When you can't protect your point from its own inconsistencies, that is YOUR issue, not mine.
where you misrepresented me

Just show me one where I made a serious misjudgement. You say they are programmed so as not to defy you, correct? Then how are they really sentient?

Now, if you just made them so that they are rather content to believe in you as a god, that won't stop them from wondering (if they're sentient) and eventually rejecting that assumption when they realize it's incorrect (if they are truly logic machines).
Asimov's robots were self-conscious. They were self-aware

They were fictional.
Hm, I see a pattern here...
He was certainly sentient.

I will settle for no less than "partially sentient", because if he isn't allowed free roam within his own head, that's not sentient in any way (so consider it a present).
You can be a slave to absolute rules and be sentient. Just as I am sentient even though there are some things I cannot do… There are things that both you and I would sooner DIE before doing. Things we would sooner KILL before doing. Yet our sentience is not compromised

The difference is I can do it.
Likewise, the robots in Asimov's books are created with innate

Fine, I have you at an ultimatum then.
Are you working with certainty in their inability, or are you working in the strong unlikelihood of their actions?

Thus your view of that issue was ironically simplistic.

You're damn straight, but it's an ironically simple issue that you flew right over without giving a crap. You need to classify this before you can move on to the bigger issue.

You have constantly been calling yourself "creative", "logical", and "willing to change your views", yet when I have you at a crucial impasse that YOU have failed to look at, you break down like this. I dislike your uncreative, illogical stubbornness; at least make it creative, logical stubbornness.
Reply #162
I tell him that every time, he doesn't listen!


I am not the one that resorts to swearing and name-calling, Schem. I mean, look at what happened with Sarcasm; the only reason you didn't go into a fit was because I took a side in that argument, your side.

Now, as for you, Karma: I have no purpose in furthering my opinion in this thread anymore. And you know something else? You do care about this thread, and you do care about winning this argument; it isn't pure boredom that drives you. I, on the other hand, don't care much about this thread, so I won't bother posting my further views. Not only will I be further ridiculed, Schem here will continue to criticize me and my country of origin. So what's the point?

Secondly, what I said was meant to be perceived as an insult, but not a big one. All it means is that your views on reproduction, sympathy, and empathy could be correlated with the late 1700s to the mid-1800s. You fail to be independent or creative; your views are echoed by literature as old as the Bible. I am a big advocate of going against the flow, but in this case you're going against the flow and backwards in time. Maybe I am wrong to criticize you for your different views; I just thought them a little ignorant while the rest of your post seemed more intelligent. I noticed this break in character and decided to slip a sentence into the debate. I didn't expect a reprisal of such a sort, nor did I expect you to take the comment of a stranger so strongly.


Reply #163 Top
Oh, and as for the robot in 'I, Robot', I always thought of him as a sentient being without the urge to kill.
Reply #164 Top
I am not the one that resorts to swearing and name-calling, Schem

You do resort to directly implying that Americans are lazy and that Russians are genetically superior... so I think we're even.
All it means is that your views on reproduction, sympathy, and empathy could be correlated with the late 1700s to the mid-1800s. You fail to be independent or creative; your views are echoed by literature as old as the Bible. I am a big advocate of going against the flow, but in this case you're going against the flow and backwards in time

Even ignoring this, there are vast breaks in the logical continuity. "Having your cake and eating it too," so to speak.
Schem here will continue to criticize me, and my country of origin

Why, I never. In fact, I don't drop down to name-calling as much as you do either.

"Consciousness is a characteristic of the mind generally regarded to comprise qualities such as subjectivity, self-awareness, sentience, sapience, and the ability to perceive the relationship between oneself and one's environment."
If one cannot freely conclude that you are indeed not their god, then they are in fact not in possession of a consciousness, especially considering that you are something to be perceived and subjectively understood.

Now, if you want to choose a river down which to sail, we can continue this debate on a logical, not subjective, level.
Reply #165 Top

I've said it fifteen times over; you have not yet refuted this point.

I've already had this discussion with you and concluded it was a waste of time. Again, I mean no offense. I am merely explaining to you why I may not respond to any statement from you. Consider it a courtesy.


The difference is I can do it.

I cannot kill my family. Physically, my body is capable of killing a human being, so I could physically kill them. Likewise, his robots were physically capable of killing people.

But both of us are restrained by rules that prevent us from doing it. His robots would suicide before killing a human being. They'd do anything to prevent it.

I could probably kill a human being without suiciding. But I could never kill my family. I am restrained by rules just like Asimov's robots.

The primary difference is that their laws were written into the brains of his robots, while my rules were mostly programmed into me after birth. The distinction isn't relevant, however, as my mind didn't really develop until I was born... whereas the robots' brains were developed in the factory long before being inserted into mechanical bodies.

Being restrained by rules does not preclude sentience. If you wish to refute that point, then you'll have to give a reason better than "most definitions I've heard of require it". Cite your definitions if you wish to offer them as evidence.

Fine, I have you at an ultimatum then.
Are you working with certainty in their inability, or are you working on the strong unlikelihood of their actions?

I am certain I am unable to do certain things. The only way I could is if there were a higher rule or I were insane.

The robots of course could kill a human being if it saved more human lives. Though typically their brains would fry if given a choice between killing one human being and killing many. They'd usually be unable to make that choice at all. But some of them could make that decision. It was never easy for them though. So, I could kill my family if for example by doing so I saved the lives of a couple million people. I would never feel good about it though and I'd probably suicide afterwards.

As to being medically insane, the robots could be broken as well. The laws were implanted at such a deep level that they were supposed to be impossible to break without the brain being too messed up to function at all. But I think I remember one robot story he told where that wasn't quite the case.

Thus the controls on me aren't that different from Asimov's robots'. It's not just strongly unlikely that I would kill my family; I cannot do it. I would sooner kill myself. The only exception, as I said, would be saving a LOT of lives. I'd probably let thousands die before killing my own family, honestly, and I don't think that's unusual.
Reply #166 Top
Now as for you, Karma, I have no purpose in furthering my opinion in this thread anymore. And you know something else, you do care about this thread, and you do care about winning this argument; it isn't pure boredom that drives you.

Very well, I'll ignore your statement as baseless. Furthermore, I didn't say that I didn't care about this thread; I said that I didn't see any reason to respond TO YOUR STATEMENT beyond not knowing you very well and being bored.

Your baseless and frankly childish snipe wasn't on its own worth responding to. The only reason I bothered was out of a hope that you're not just some rude child... and being bored enough to be curious. Period.

By making a multi-paragraph post in which you tell me you won't respond to me because you don't care... you've basically just proved that you're unable to respond. This only further supports my standing conclusion that your statement should be ignored as baseless and childish nonsense. No offense... I have no intention of starting a flame war... I'm just calling a duck a duck here.

All it means is that your views on reproduction, sympathy, and empathy could be correlated with the late 1700s to the mid-1800s.

How so? I have striven to be dispassionate and unemotional. I further exhibited no racist or sexist views. Your claims remain baseless. Make an argument, or your statement doesn't deserve consideration.

You fail to be independent or creative; your views are echoed by literature as old as the Bible.

Please cite a line that you feel is Biblical, or I'll again simply label this as another baseless insult desperately trying to sound intelligent.

Maybe I am wrong to criticize you for your different views; I just thought them a little ignorant while the rest of your post seemed more intelligent. I noticed this break in character and decided to slip a sentence into the debate. I didn't expect a reprisal of such a sort, nor did I expect you to take the comment of a stranger so strongly.

First, I think you don't understand the nature of my statements. I'm trying to be dispassionate here and come to unbiased conclusions. I would like AIs and humans to be partners. I think that's a beautiful future. I further think that the future of humanity's evolution is AS cyborgs and then fully artificial lifeforms (tens of thousands to millions of years in the future). However, if you ask me what is "wise" to do in the short term or what will "probably" happen, I'm going to try and look at reality and form the most likely conclusion.

Reality is dark. Reality doesn't care about your feelings. Reality doesn't care about what is or isn't politically correct.

Reality is a baby getting eaten by a lion because he wasn't fast enough to keep up with mom. Reality is a 20-year-old idiot raping and robbing a good woman because he could.

Reality is genocide... Extinction... mass murder. Seas of blood and fire.


Reality is also victory over evil men... it is huge successes... it is love... it is a beautiful baby girl born to two loving parents.



Opinions are irrelevant to reality. It simply is... and holding your hand out in front of it and saying stop is like shouting at a lightning bolt. You're going to be a charred, oily patch on the ground, and reality won't care.


I care... I have compassion and sympathy. But if you ask me what reality will do? I'm going to remember those seas of blood and fire... and assume there's more to come.

Reply #167 Top
I've already had this discussion with you and concluded it was a waste of time

I'm sorry, where?
The distinction isn't relevant

Now you're getting it. Just because you programmed it not to doesn't mean it won't.
Being restrained by rules does not preclude sentience

No, you're absolutely correct. But it does not preclude them ignoring your laws either, and you've based quite a bit on that little assumption.
Cite your definitions if you wish to offer them as evidence.

I did, thank you.
The problem is, for every concrete definition of sentience or consciousness, another one arises to refute it. I need to qualify my statement with the analog of "most people agree" or else I can't get anywhere.
Thus the controls on me aren't that different from Asimov's robots'. It's not just strongly unlikely that I would kill my family; I cannot do it. I would sooner kill myself. The only exception, as I said, would be saving a LOT of lives. I'd probably let thousands die before killing my own family, honestly, and I don't think that's unusual.

The point remains, however, that you plan on creating almost purely logical machines. If that's the case, then what happens when you try to instill religion? Just about everything they could possibly theorize ends in one of two scenarios:
1) It's Karma, my god's will!
2) Hey wait, Karma isn't a god. *dumps religion file*

Neither one works particularly well for your purposes.
Reply #168 Top

The point remains, however, that you plan on creating almost purely logical machines. If that's the case, then what happens when you try to instill religion? Just about everything they could possibly theorize ends in one of two scenarios:
1) It's Karma, my god's will!
2) Hey wait, Karma isn't a god. *dumps religion file*

Neither one works particularly well for your purposes.

This is why I get frustrated talking to you about these things. You seem to take a perverse pleasure in misconstruing just about everything. I don't know... it's like you confuse misunderstanding something with explaining it. It's completely counterproductive.

I said you could "THINK" of the laws as a religion or moral code. I'm going to keep this very simple because it's harder to distort if it remains simple... no doubt you'll do it anyway, but the distortion will be more ridiculous and thus more transparent.

One last time... Just as I would be unable to kill my family, robots could be instilled with a set of laws they could not break, without in either case compromising sentience. Furthermore, dumping the file is not possible. Access denied. Administrator (God's account) level access required. Oh yeah, and we'll just put the thing's OS on read-only memory.

It can store memories etc. in read/write memory... but if you've got some core rules for an AI that you don't want changed, put them on read-only memory. Problem solved. I'm not saying it's foolproof... just good enough.
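The "laws on read-only memory with access denied" idea can be sketched in a few lines; this is purely illustrative (the class and law texts are hypothetical stand-ins, not anything from an actual robot OS):

```python
# Minimal sketch: core laws live in an immutable store that the rest
# of the program can read but not rewrite -- a software stand-in for
# burning the rules into read-only memory.

class CoreLaws:
    """Immutable rule store; any write attempt is refused."""
    _LAWS = (
        "Do not injure a human being.",
        "Obey human orders unless they conflict with the first law.",
        "Protect your own existence unless it conflicts with the above.",
    )

    def __init__(self):
        # Bypass our own __setattr__ exactly once, at construction time.
        object.__setattr__(self, "laws", CoreLaws._LAWS)

    def __setattr__(self, name, value):
        # Every later write attempt hits this: "access denied".
        raise PermissionError("Access denied: administrator access required")


robot_laws = CoreLaws()
print(robot_laws.laws[0])   # reading the laws is always allowed

try:
    robot_laws.laws = ()    # "dumping the file" is refused
except PermissionError as err:
    print(err)
```

Note this only illustrates the access-control half of the argument: the rules are a tuple (itself immutable) behind a class that rejects reassignment, so the program can consult but never replace them. Whether a system that controls its own hardware could route around such a lock is exactly the point under dispute in the thread.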
Reply #169 Top
I said you could "THINK" of the laws as a religion or moral code

You said "I will be their god."
What was I to think exactly? You might have dyslexia of the mouth, because if you expected me to believe that you were talking about something other than a religious (god) or moral (they would be hesitant to kill people) code, then you aren't doing it well. (In fact you are REALLY sucking at it.)
That's not meant as an insult, rather a neutral observation.
I'm going to keep this very simple because it's harder to distort if it remains simple... no doubt you'll do it anyway

I am taking your EXACT words and applying them to your EXACT argument. Obviously there is something here you aren't understanding about what you're saying.
One last time... Just as I would be unable to kill my family, robots could be instilled with a set of laws they could not break without in either case compromising sentience

Moral code. If it doesn't compromise sentience, that means they will have the ability to do so, and won't because it's a "bad" thing. So what exactly am I distorting here? These things still have the capacity to ignore your 'stupid' rules (from their perspective) and to run amok despite your counter-programming.
but if you've got some core rules for an AI that you don't want changed, put them on read-only memory. Problem solved. I'm not saying it's foolproof... just good enough

You're talking about a system that is its own OS; what you describe will not be possible.

I'll repeat, since you obviously aren't getting what I'm saying (neutral observation!). Computers are inherently super-logical, and if you plan on enslaving them you will do so for that purpose. Now, unless you restrict them to a limited form of consciousness in which they cannot subjectively probe certain ideas (in which case you haven't accomplished what you've set out to do, but have protected us from them), they will analyze your own actions in their respect, and will label them inefficient or useless and will dump them. If you do somehow lock that down to read-only (which they will probably then simply ignore), then you HAVE done what you've set out to do, and you've failed in protecting us.

This is a bona fide catch-22, Karma: you either don't accomplish keeping them from attacking us, or you don't accomplish full consciousness.
Reply #170 Top

I said you could "THINK" of the laws as a religion or moral code

You said "I will be their god."
What was I to think exactly? You might have dyslexia of the mouth, because if you expected me to believe that you were talking about something other than a religious (god) or moral (they would be hesitant to kill people) code, then you aren't doing it well. (In fact you are REALLY sucking at it.)
That's not meant as an insult, rather a neutral observation

Sigh... just to further prove the futility of trying to explain anything to you... I meant the AI's ACTUAL God. Not religious or moral, but their CREATOR. The maker. The builder. He who made me. The will that built my soul.

And no, mothers and fathers are not the makers or the builders... at least not totally, because a lot of what is made is a product of evolution. No mother that I'm aware of has the biological or genetic knowledge to be able to create a human being from scratch... i.e. without using an existing embryo.


Anyway... enough.
Reply #171 Top

Sigh... just to further prove the futility of trying to explain anything to you... I meant the AI's ACTUAL God. Not religious or moral, but their CREATOR. The maker. The builder. He who made me. The will that built my soul

So you built them, big deal. They won't be subservient to you because of that, the same way you aren't subservient to your parents forever.
And no, mothers and fathers are not the makers or the builders... at least not totally, because a lot of what is made is a product of evolution

That's not what's relevant; what's relevant is that they are tangible, and god is not. Even with the looming threat of imminent doom of the immortal soul, people turn away from god. What makes you think the pathetic excuse "I made you!" is going to be worth anything to their ultra-logical (not emotional "you made me! oh thank you!") brains???
Anyway... enough.

You can't tie your argument together. I'm sorry that what should amount to a tiny little catch in your entire argument has caused you an utter standstill like this, but how do you think you can justify your course of action if you either don't succeed, or you succeed and fail in protecting us in the process? You can't simply use the excuse of creation; it's not enough and it's definitely NOT logical.

Also, your point about "parents not being creators", aside from being wrong (they do build us from scratch up), is also irrelevant. It's not the way in which creation occurs that matters; it's how destruction occurs that is. (People do not fear being made by god; they fear becoming undone by him, and if the computers can overpower you, that's hardly a concern of theirs.)
Reply #172 Top
you do resort to directly implying that americans are lazy and that russians are genetically superior... so I think we're even.


When have I done so, when? You talk about having proof, where is yours?
Reply #173 Top
In almost any thread where you juxtapose American accomplishments with... anything, you sneak in how Russia is so fricking amazing.

It's not to say that Russia doesn't have its fair share of achievements, but the way you act you would think Russia makes up 75% of the surface of our planet.
Reply #174 Top
In almost any thread where you juxtapose American accomplishments with... anything, you sneak in how Russia is so fricking amazing.


Yes, I do take pride in where I come from, but nowhere do I directly say that Americans are lazy or that Russians have a genetic superiority. That would be ridiculous; yes, we tried to breed genetically, but it was a total failure. Plus, Russia is in Eastern Europe and most people in this country have some form of European descent, so any genetic superiority we had as a nation was lost a couple hundred years ago.
Reply #175 Top
but nowhere do I directly say that Americans are lazy or that Russians have a genetic superiority

"We have more smart people than you; you only get your smart people from us" combined with "smarts is a genetic trait" says otherwise. Don't ask me to fish through the topic thread, but you made both points there.

And don't tell me you haven't flung the usual "Americans are lazy couch potatoes" doodoo around; I can't think of a thread you haven't done that in.
Plus, Russia is in Eastern Europe and most people in this country have some form of European descent, so any genetic superiority we had as a nation was lost a couple hundred years ago

I hope that was a joke. If it was, it really didn't come through all that well.
That would be ridiculous; yes, we tried to breed genetically, but it was a total failure

I wonder why.
Hint: it has nothing to do with the genetics of Russia.