Artificial Intelligence


36 replies to this topic

#21
Jengerer

    2009 Softest Hair Winner & Best Staff

  • Retired Staff
  • 3,243 posts
  • xfire:Jengerer
  • Gender:Male
  • Location:Toronto, Ontario
  • Interests:Video games. Who knew?
  • Steam ID:Jengerer
  • Rofl-Rupees:2
  • Gamer Army ID:4485
QUOTE (Jimmy Rabbitte @ Mar 24 2010, 03:55 PM) <{POST_SNAPBACK}>
/snip

Fair enough, I see your point. I'm curious whether an autistic savant would be able to solve a larger-scale problem faster than a computer, though.

On topic, I think I'd eventually begin to see them as persons. It'd be a difficult concept to handle initially, and there'd always be a part of me that would think of them differently, but eventually I could come to terms with the concept. I think somebody posted a Wikipedia link for a book about AI somewhere here, about how we would use an ever-expanding, self-assembling AI to scout out the rest of the universe. Interesting stuff.

#22
way2lazy2care
  • Members
  • 10,808 posts
  • Xbox / GFWL:way2lazy2care
  • PSN:A1R5N1P3R
QUOTE (Jengerer @ Mar 24 2010, 05:01 PM) <{POST_SNAPBACK}>
/snip

I remember seeing a show about a guy who could recall the page number and line number of a sentence in any book he'd ever read, in an instant. He could even do it while correcting for misquoting by the person asking about the line. That would take a computer a considerable amount of time for even a handful of books, even if it had the exact quote. It would take a computer even longer to correct for a misquote in a single book.
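As a rough illustration of why correcting a misquote is harder than exact lookup, here's a sketch using Python's standard difflib for fuzzy matching. The library contents, page/line numbers, and the similarity cutoff are all invented for the example:

```python
import difflib

# A toy "library": one sentence per (book, page, line). Locations invented.
library = {
    ("Moby Dick", 1, 1): "Call me Ishmael.",
    ("Moby Dick", 58, 12): "It is not down on any map; true places never are.",
    ("Dracula", 21, 4): "Listen to them, the children of the night.",
}

def find_quote(misquote, cutoff=0.6):
    """Return the (book, page, line) whose text best matches a possibly
    garbled quote, or None if nothing is close enough."""
    sentences = list(library.values())
    best = difflib.get_close_matches(misquote, sentences, n=1, cutoff=cutoff)
    if not best:
        return None
    # Map the matched sentence back to its location.
    for location, text in library.items():
        if text == best[0]:
            return location
    return None

print(find_quote("Listen to them, children of the nite"))  # ('Dracula', 21, 4)
```

An exact quote is a single dictionary lookup; a misquote forces a similarity comparison against every stored sentence, which is exactly why the fuzzy case scales so much worse.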

The problem isn't our ability to process information; it's that we process information differently, and more importantly, we access input and memory differently. Whereas a computer stores everything it is told to store, we store only the information we deem important enough to remember, and we store it in a multitude of priority-access memory that's hard-coded into our brains over time.

The problem is that a normal human brain isn't wired to access everything it has remembered in an instant, but it can access the most common things almost instantly. Look at a baseball player: they do a tremendous amount of subconscious dynamic calculation just to catch a ball, without even really having to think about it.
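The "common things stay instantly reachable, the rest fades" idea above can be loosely sketched as a least-recently-used cache. This is only a rough analogy; the class name, capacity, and stored items are made up for illustration:

```python
from collections import OrderedDict

class TinyLRU:
    """A tiny least-recently-used cache: frequently recalled items stay
    cheap to reach, rarely touched ones eventually fall out."""

    def __init__(self, capacity=3):
        self.capacity = capacity
        self.items = OrderedDict()

    def remember(self, key, value):
        self.items[key] = value
        self.items.move_to_end(key)          # mark as most recent
        if len(self.items) > self.capacity:  # forget the stalest memory
            self.items.popitem(last=False)

    def recall(self, key):
        if key not in self.items:
            return None                      # "forgotten"
        self.items.move_to_end(key)          # recalling refreshes the memory
        return self.items[key]

brain = TinyLRU(capacity=2)
brain.remember("phone", "555-0100")
brain.remember("pin", "4242")
brain.recall("phone")                        # refreshes "phone"
brain.remember("door code", "9 9 1")         # evicts "pin", the stalest
print(brain.recall("pin"))                   # None
```

The things you keep recalling stay resident; the rest quietly drops out, which is about as close as a few lines of code get to the "priority access" idea.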
SPAMBOTSTOOKOVERMYSITE D:
Give me LoL Referals.

QUOTE (Virus52 @ Mar 3 2008, 09:44 AM) <{POST_SNAPBACK}>
ALL HAIL THE GREAT AND MIGHTY MOTH!

QUOTE (SN3S @ May 6 2008, 08:27 AM) <{POST_SNAPBACK}>
No sensuality; this is all for fitness.

#23
Jimmy Rabbitte
  • GA Private
  • 1,639 posts
  • Gender:Male
  • Location:Seattle
  • Steam ID:100sph
  • Gamer Army ID:3335
  • Company:Delta
QUOTE (way2lazy2care @ Mar 24 2010, 03:29 PM) <{POST_SNAPBACK}>
/snip

This is off topic, but here is an article you guys might find enjoyable: The human brain is not built for thinking, it's built for memory.
The Blackman is God

#24
Jengerer
QUOTE (way2lazy2care @ Mar 24 2010, 06:29 PM) <{POST_SNAPBACK}>
/snip

From my very basic knowledge of how the brain works, I think it's a little like a hash table/map, if you have any clue what that is in programming. It has a key, or several keys, and when that key is activated/called, it calls up the value that's stored under those keys. So someone being able to remember what book, page, and line a certain quote is found on is simply using this referencing method.
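The hash-map analogy above can be sketched in a few lines. The entries and locations are invented for illustration; the point is just that keyed lookup jumps straight to the value without scanning anything:

```python
# A hash map keyed on the quote itself: one lookup returns the location
# directly, with no search through the rest of the "library".
quote_index = {
    "Call me Ishmael.": ("Moby Dick", "page 1", "line 1"),
    "It was a pleasure to burn.": ("Fahrenheit 451", "page 1", "line 1"),
}

book, page, line = quote_index["It was a pleasure to burn."]
print(book, page, line)  # Fahrenheit 451 page 1 line 1
```

Whether the brain really works like this is another matter, but it's a reasonable picture of how "quote in, location out" can be an effectively instant operation.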

However, that's not to say that memory is the entire extent of human intelligence; some savants are able to do incredible mathematical calculations in seconds or less. I can't recall whether they share the same dilemma as computers, though, in that they don't really understand what they are doing, but only how, like an algorithm in a computer.

That being said, I'm not sure how this would work out in computers in general. A funny thought I had once was that if computers had human-like neural memory, one might be able to forget where a file was, so you'd have to type arbitrary things to try to jog its memory.

Getting a little off-topic, though. I guess the bigger question I was asking in the OP was: what do you identify as a person? Is it based on religion, biology, just having the ability to rationalize, etc.?

#25
Dark.Matter
  • Members
  • 1,662 posts
  • Rofl-Rupees:3
QUOTE (Jimmy Rabbitte @ Mar 24 2010, 07:28 PM) <{POST_SNAPBACK}>
/snip

I am offended by that article. I solved the word problem on page 5 in about 30 seconds...including the time it took to read it. Am I a superhuman?

Edited by Dark.Matter, 24 March 2010 - 09:31 PM.

QUOTE (Dark.Matter @ Apr 23 2009, 02:22 PM) <{POST_SNAPBACK}>
This thread delivers.

#26
The_Zooloo_Master
  • Members
  • 912 posts
  • xfire:thezooloomaster
  • Gender:Male
  • Location:Pwnland-->Switzerland
  • Interests:Entertainment,____, watching pure pwnage, pwning noobs, philosophy.
QUOTE (Dark.Matter @ Mar 25 2010, 03:19 AM) <{POST_SNAPBACK}>
/snip


Haha, don't be offended by it -- it's obviously tosh, and probably aimed at primary school kids or something.
"If I continue with this shit I'm going to end up in jail, in a hospital, or dead. Or all 3." -- Joby

#27
clockwork1337
  • Members
  • 563 posts
  • Gender:Male
  • Location:Wisc. GMT-06:00
  • Interests:Starcraft, css, mw2, rockband, stuff.
  • Steam ID:santa24240
  • Xbox / GFWL:two asian kids
QUOTE (Jimmy Rabbitte @ Mar 24 2010, 06:28 PM) <{POST_SNAPBACK}>
/snip



I read this, and I liked it. It's not about the problem, it's about the idea behind the problem: basically showing us what's going on inside us, you know?
wtf is a signature? do people see this? i hope not... i hope its like a diary... well dear diary... today i found out im gay... :)
www.youtube.com/user/threeasiankids
QUOTE (ya_ba @ May 7 2010, 01:12 PM) <{POST_SNAPBACK}>
I like you :)

QUOTE (sG Core asheS @ Jun 2 2010, 12:59 PM) <{POST_SNAPBACK}>
unrelated, but clockwork's sig made me literally lol
*clap* *clap* *clap*

#28
way2lazy2care
QUOTE (Dark.Matter @ Mar 24 2010, 09:19 PM) <{POST_SNAPBACK}>
/snip

the word problem is on page 2 or 3.

I did the same though with a completely different answer. The box of tacks is 5 feet tall and you put the candle on top. Suck eggs for not being specific in your word problems article.

+------+
lOOOOl Me-> *
lOOOOl <- the box
+------+

Edited by way2lazy2care, 14 April 2010 - 02:43 PM.


#29
Jengerer
QUOTE (way2lazy2care @ Apr 14 2010, 03:38 PM) <{POST_SNAPBACK}>
/snip

I don't think that's how it works, haha. You can't solve an ambiguous problem by making a specific presumption.

#30
Goth Skunk
  • Members
  • 30 posts
  • Gender:Male
  • Location:Calgary, AB, Canada
  • Interests:Video Gaming, pwning n00bs, preparing for the zombie apocalypse, boobies, good friends, good laughs, and shared victory.
  • Xbox / GFWL:Goth Skunk
  • Wii:lol wii
QUOTE (Jengerer @ Mar 23 2010, 03:58 PM) <{POST_SNAPBACK}>
What are your thoughts on morality concerning the possibility of artificial intelligence?

If we ever come to a point where we will have constructed artificial sentient beings that replicate human emotion and act like one of us, how do you think they would be treated in society? As a "person" under the law and general moral view? Or as an inferior race, subject to segregation, slavery, etc.?

I had a strange thought concerning this topic: imagine if in 50 years or so, we achieve this type of intelligence, and video games started to implement this kind of technology into their characters. These characters actually believe that they are part of the story that they are set in, and that their world is real. Would you consider it immoral to kill those beings within the game? Does uninstalling the game begin to have moral repercussions? Would being able to revive the characters justify the possibility of their execution within the game? What would be considered a peaceful removal from the game without infringing on the rights we bestow on these beings?

Man, imagine the characters begin to find bugs in the programming and start to question their existence.

Pretty interesting to think about. Discuss!


Characters within a game, no matter the sophistication of their intelligence, will never be granted the rights and protections that a human outside the game benefits from. Characters in a video game are limited to the boundaries of their existence within said game. They are fictional entities in a fictional world. I would never consider it immoral to kill any of them off, just as I consider it amusing to kill off Sims.

If a platform* co-existing with humans was able to replicate human emotion, act like one of us, and achieve sentience, then we start getting into a grey area, morally and ethically. An A.I. could be constructed and programmed with a function to perform dangerous work, thereby eliminating the risk to human life. The A.I. would then understand this as its function and purpose, performing the task with no questions asked and with no acknowledgement of its own existence. It's when an A.I. demonstrates self-preservation behaviour that the line is crossed. An A.I. that refuses to perform dangerous work, or that fights back when faced with possible termination, should be considered a life form and treated just as any human being would be treated. Just like the Geth in the Mass Effect universe.
Veni, Vidi, Pwni.


#31
Jengerer
If self-preservation is your criterion for being considered a person or living thing, then why, even if an AI attempted to defend itself within the game, can't you consider a virtual being a person? If I try to shoot an AI in a video game and attempt to terminate them, and they duck for cover and try to eliminate the threat, are they not fending for their own life, thereby fulfilling your criterion? The only difference I see between what you consider an AI "person" and this is the existence of a body, which I think is a negligible difference. I don't think we'd call a composition of human parts without a brain a person, because we classify a person as one that can carry out rational thought.

EDIT: An important distinction here is the difference between life form and person. I notice now that the above-mentioned criterion was for being considered a life form, but that doesn't necessarily mean we would treat it as a human. We don't allow all animals the same liberties as human beings. Survival instinct is an animal trait, but it doesn't distinguish a person from a living being.

You can't compare killing off Sims to killing a sentient form that's communicating with you, I think. There's a barrier between you and a virtual intelligence (i.e., preprogrammed bots, like Sims) that can never be crossed, but I think this barrier is less evident with AI, because the element of rationality comes into play. Rather than following instructions, they're actually making considerations based on experience.

#32
Goth Skunk
Virtual characters cannot be considered people or living things because, once dead, they don't STAY dead. Their existence is both finite and infinite: finite within the boundaries of the game, and infinite in terms of post-gameplay. No matter how many times I complete Mass Effect and kill Saren, he's going to come right back when I start a new game, only to suffer the exact same fate again by his own hand. Ultimately, he's going to die.

Now if Saren's character somehow broke the Fourth Wall, turned to the camera and said to me, "Hey, Goth Skunk, this is the thirteenth time you've played this game, with an average of 40 hours each time. That's over 500 hours of life you've dedicated just to this game. What the hell, man! Get a life!" I would be forced to give pause. My first thought would be to wonder if it was an easter egg. Then I would attempt to have a conversation with him, a la Seaman. If that worked, then buy me a Cadillac and call me Elvis, I just found a new best friend. But at the same time, Mass Effect would no longer be Mass Effect, but rather My Friend Saren.

Furthermore, all enemy characters are fighting against the character I am puppeteering, not me. Their bullets are not coming out of the TV dangerously close to my head and putting my life at risk. It isn't real, therefore the characters cannot in good conscience be treated the same as I would treat another human being.

We don't allow animals the same liberties as we would a human being, that is true. But there are also laws that protect animals from cruel and unusual treatment, despite the fact they lack sentience. Like humans, when animals die, it's permanent.




#33
Nerd_Rage
  • Members
  • 26 posts
  • Gender:Female
  • Location:Toronto/GTA
  • Interests:Psychology, chocolate, Philosophy, Physics, Meditation, Yoga, Environmentalism, Video Games, Societal change, having teh fun, embracing alternative states of consciousness, laughing, intertubes, photography, gardening, food, cooking, baking, eating, smelling, tasting, touching, loving, living....
It's pretty hard to say how one would treat an AI today, when it is still so far from being a reality.

Not that I don't enjoy a little thought-fantasy, or what have you. Presumably, if these newfound AI beings with sentience etc. did exist, they would be treated the same way that humans are treated, or perhaps they would be revered... who the hell knows? I guess it depends on what their contribution is... I mean, let's be realistic. Humans are not peaceful, nor are they all rational. Sure, we have the capacity to be as such, but that doesn't mean any AI we develop will embody only those characteristics, unless we master the programming, I suppose.
In any case, it will be interesting to see how it all develops.

QUOTE
We don't allow animals the same liberties as we would a human being, that is true. But there are also laws that protect animals from cruel and unusual treatment, despite the fact they lack sentience


Er... I don't agree with that last part of your statement. It is because animals are argued to be sentient that those laws exist, and more are being pushed for as we... type.


Re: the OP
QUOTE
These characters actually believe that they are part of the story that they are set in, and that their world is real. Would you consider it immoral to kill those beings within the game? Does uninstalling the game begin to have moral repercussions? Would being able to revive the characters justify the possibility of their execution within the game? What would be considered a peaceful removal from the game without infringing on the rights we bestow on these beings?


I suppose it could be considered immoral to kill those beings within the game, but then the question is: why would there be programming to do such a thing? Who are these programmers, and what are they trying to do to my brain exactly? Now, if there was a way for these characters to become aware that they are in a game, and then make the realization that there is a way out of said game... that would change things entirely.

Assuming the characters are restricted to the game space and over time they do develop into sentient beings, then it would be trickier to say... Personally, I think I would respect it and leave it installed to enjoy a life that spans the life of the system it is on. Other than that, I would probably just flat-out ask if it realized the consequences of its being in a game. Then, if it seemed to understand, there wouldn't be an issue with uninstalling or killing it, for that matter. Just a part of life, right? But again, it would depend on a bunch of situational crap... ugh, ethics.


On a slightly unrelated note, this thread makes me think of Tron and Blade Runner, and a book called Sophie's World, where the main character makes that exact realization (that she is a character in a book). Pretty interesting read if you like philosophy, and perhaps even if you don't.


Nerd_Rage

embrace the shadow.
<3

#34
Jengerer
QUOTE (Goth Skunk @ Apr 19 2010, 03:58 PM) <{POST_SNAPBACK}>
/snip

I suppose games just wouldn't work the same way as they do now with real AI, because the presence of AI could mean an infinite amount of directions that the game could take. In your Mass Effect example, Saren could choose not to take sides with Sovereign and then you could team up with him to take the Reapers down if you're able to convince him.

I suppose another presupposition would be that you cannot convince the characters that they live in a fictitious world ('cause nobody would believe you anyway). The kinds of virtual universes I'm thinking of are where the programmers simply create the circumstances, provide you with an end goal, and then build a virtual universe and add in obstacles that will stop you from getting there.

This is where the AI comes into play, I suppose. Let's say that given these circumstances, each game or play-through generates a genuine personality for every person in the game (except for Sovereign, which is virtual intelligence and is always the antagonist). So Ashley through one play-through could be a nice warm-hearted individual, but in the next, be a completely different psychotic person.

This is where the permanence of death comes in; you kill the character, and they aren't coming back (as the same person, at least). You'd feel bad about killing somebody in real life because they die and they can't come back, and cloning that person wouldn't change that because they wouldn't be the same person, much like this example.

Then what?

P.S., I'm not trying to infuriate you. I just want to know how you'd think about this given different circumstances.

#35
Goth Skunk
QUOTE (Jengerer @ Apr 22 2010, 06:03 PM) <{POST_SNAPBACK}>
P.S, I'm not trying to infuriate you. I just want to know how you'd think about this given different circumstances.


I'm not easily infuriated. Although I tend to fly off the handle whenever the Canucks win... verymad.gif

I hope I'm not misinterpreting your point here, either... You're saying: assume the characters in Mass Effect were true AI, and each playthrough they would have different personalities. You're also saying that if I kill off a character in one playthrough, that character and that personality are permanently dead for all subsequent playthroughs. Am I on the right track?

I'll answer to that point assuming that I've interpreted your question correctly. If not, just let me know and I'll re-answer:

There's a critical decision in Mass Effect the player has to make: one of your squadmates is going to die, permanently, and you are forced to choose who that will be. Now, as a veteran video gamer familiar with the concept of the best possible ending, I initially saw this as a point of failure. At some point in the game, I made the wrong decision or said the wrong thing, and as a result I was forced into a no-win situation. After a few more playthroughs I realized there was no escaping it. Whether you like it or not, you have to choose between Ashley and Kaidan, and for one or the other, the story ends. That hit me hard. Never in a game had I been forced to save one squadmate and leave the other to die. And the fact that the one you leave behind is gone and never coming back did initially leave me feeling like I had somehow failed. You spend over half the game getting to know your characters on a personal level; they seem less and less like squadmates and more like actual people with a history, a personality, a life. A life cut short by a crucial decision that you, the player, had to make.

But they never will have life. They never break the Fourth Wall. They are and will always be a fictional character in a fantasy world. They will never be real.

No matter how sophisticated they ever get to be, characters in a video game, or in a book, or a movie will never be people. Harming them, killing them, will never yield any real-life consequences.


#36
LookingForward
  • Members
  • 8 posts
QUOTE (way2lazy2care @ Mar 23 2010, 05:18 PM) <{POST_SNAPBACK}>
computers aren't smarter. They are just better at doing what they are told.

Do you know how long it would take a computer to do the advanced risk reward computations our brains do every second? The human brain is still remarkably good at pattern recognition too.


You say this now, but you have to look at the growth of technology on an exponential scale, not a linear one. As our understanding of the human brain becomes clearer and as semiconductor computation increases, it will become less of a problem. My bet is that programmers will end up using models of the human brain as the basis for quite a lot of AI. Computers are an I/O system; so are we. The I/O mechanisms will just become more advanced.
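The exponential-versus-linear point can be made concrete with a toy doubling model. The two-year doubling period is purely an assumption for the sketch, not a claim about actual hardware trends:

```python
def doublings(start, years, period=2):
    """Capability after `years`, assuming it doubles every `period` years."""
    return start * 2 ** (years // period)

# Linear thinking says 20 years of progress is ~10x a 2-year step;
# a doubling model says it compounds to 2**10 = 1024x.
print(doublings(1, 20))  # 1024
```

Small-looking yearly gains compound into orders of magnitude over decades, which is why extrapolating from today's computers on a straight line undersells the future.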

#37
premiumseocompany
  • Members
  • 13 posts
  • Gender:Female
  • Location:Seattle
For me, humans are still smarter than robots... robots wouldn't exist without humans. Only humans created them. :-)



