View Poll Results: Should we build machines more intelligent than human beings if we could?

Voters: 9. You may not vote on this poll.
  • Yes, we should: 5 (55.56%)
  • No, we shouldn't: 0 (0%)
  • I don't know: 0 (0%)
  • The question doesn't make sense: 4 (44.44%)
Page 2 of 4
Results 11 to 20 of 38

Thread: Dilemma: Should we build machines more intelligent than human beings if we could?

  1. Top | #11
    Veteran Member PyramidHead's Avatar
    Join Date
    Aug 2005
    Location
    RI
    Posts
    4,596
    Archived
    4,389
    Total Posts
    8,985
    Rep Power
    58
    Quote Originally Posted by Speakpigeon View Post

    So, you are replying to a question that doesn't make sense?!
    Literally the first sentence of my reply was an explanation of why I chose that option, so maybe stop acting like a douchebag to everyone who replies

    Maybe we will need them badly. To solve global warming, improve the capabilities of our military, win the economic competition, make scientific discoveries, improve government, cure diseases, cure social ills like poverty and unemployment, make us live longer, and on and on and on.
    Specifically, would we need AIs that are to us what we are to rabbits, in terms of intelligence? I dispute that claim. The problems you listed either can be solved with our current level of technology, are not a matter of technology at all, or are bad things that we shouldn't be doing anyway.

    So now it seems you're saying AIs more intelligent than us would "probably open the doors to other scientific leaps", which sounds to me like a good motivation to build these machines...
    Only if enabling scientific progress is itself a good reason to do something, which I'm not convinced that it is, or at least not convinced that it couldn't be overcome by other considerations.

  2. Top | #12
    Contributor Speakpigeon's Avatar
    Join Date
    Feb 2009
    Location
    Paris, France, EU
    Posts
    6,309
    Archived
    3,662
    Total Posts
    9,971
    Rep Power
    46
    Quote Originally Posted by PyramidHead View Post
    Literally the first sentence of my reply was an explanation of why I chose that option, so maybe stop acting like a douchebag to everyone who replies


    Specifically, would we need AIs that are to us what we are to rabbits, in terms of intelligence? I dispute that claim. The problems you listed either can be solved with our current level of technology, are not a matter of technology at all, or are bad things that we shouldn't be doing anyway.

    So now it seems you're saying AIs more intelligent than us would "probably open the doors to other scientific leaps", which sounds to me like a good motivation to build these machines...
    Only if enabling scientific progress is itself a good reason to do something, which I'm not convinced that it is, or at least not convinced that it couldn't be overcome by other considerations.
    We may not absolutely need it. But you don't give any clear reason that we shouldn't do it.
    EB

  3. Top | #13
    Contributor Speakpigeon's Avatar
    Maybe another way to see the problem is to ask why we should not legislate, worldwide, against the creation of AIs more intelligent than us. There are already people warning us of the danger of more intelligent AIs and it is conceivable that they will convince all the major powers that building such machines should be stopped. Building that kind of machine probably can't be done by lone individuals or even small organisations. So if the big powers agree it shouldn't be done, it probably won't be.

    Assuming this, what do you think are the good reasons that we should, against opinions to the contrary, build such machines?
    EB

  4. Top | #14
    Veteran Member PyramidHead's Avatar
    Quote Originally Posted by Speakpigeon View Post
    Maybe another way to see the problem is to ask why we should not legislate, worldwide, against the creation of AIs more intelligent than us. There are already people warning us of the danger of more intelligent AIs and it is conceivable that they will convince all the major powers that building such machines should be stopped. Building that kind of machine probably can't be done by lone individuals or even small organisations. So if the big powers agree it shouldn't be done, it probably won't be.

    Assuming this, what do you think are the good reasons that we should, against opinions to the contrary, build such machines?
    EB
    I think you have a quaint and naive view of what "world powers" are willing to do if you don't see that literally every single one will immediately begin working on intelligent AI in secret the moment such legislation is passed.

  5. Top | #15
    Contributor Speakpigeon's Avatar
    Quote Originally Posted by PyramidHead View Post
    Quote Originally Posted by Speakpigeon View Post
    Maybe another way to see the problem is to ask why we should not legislate, worldwide, against the creation of AIs more intelligent than us. There are already people warning us of the danger of more intelligent AIs and it is conceivable that they will convince all the major powers that building such machines should be stopped. Building that kind of machine probably can't be done by lone individuals or even small organisations. So if the big powers agree it shouldn't be done, it probably won't be.

    Assuming this, what do you think are the good reasons that we should, against opinions to the contrary, build such machines?
    EB
    I think you have a quaint and naive view of what "world powers" are willing to do if you don't see that literally every single one will immediately begin working on intelligent AI in secret the moment such legislation is passed.
    That is an illogical reply. I assumed explicitly that the big powers would be convinced it shouldn't be done.

    So, stop evading the question. What do you think are the good reasons that we should, against opinions to the contrary, build such machines?
    EB

  6. Top | #16
    Super Moderator
    Join Date
    Jun 2002
    Location
    Toronto
    Posts
    16,587
    Archived
    42,293
    Total Posts
    58,880
    Rep Power
    86
    Quote Originally Posted by Speakpigeon View Post
    Quote Originally Posted by Tom Sawyer View Post
    If it's possible to do, someone will do it. That means the two choices available are to have the more intelligent machines yourself or to be in competition against those with more intelligent machines without having them yourself. Given the lack of a third option, building them yourself makes the most sense.
    Sure, AIs more intelligent than us would be the means to beat the competition and we will all want them. Yet, if AIs more intelligent than us would be a bad thing, I don't see why it would be impossible to convince all governments and big companies capable of developing AIs that it would be bad and consequently to agree on a moratorium. If they wouldn't be a bad thing, then there would be no reason not to do it. So, the question is whether they would be bad to begin with and whether we could stop ourselves creating them.
    EB
    Whether they're good or bad is irrelevant. It's like nuclear missiles. You have one group with nukes who dominates everybody because they're the only ones with nukes or you get a MAD situation where the nuclear capabilities cancel each other out. If you get nine groups saying "Hey, these are really dangerous, let's not build any" then all you've done is created an opportunity for the tenth group.

    Same with super intelligent AI. Somebody is going to do it if it's possible to do. You can be that somebody or you can be one of the people that somebody uses it against. The third option of not having any of those somebodies develop it in the first place isn't a real option.

  7. Top | #17
    Veteran Member
    Join Date
    Jan 2015
    Location
    West Hollywood
    Posts
    3,823
    Rep Power
    21
    Since the vast majority of humans are irrational and complete morons, I think it is imperative we build machines smarter than humans. My toaster is smarter than most of the fuckwits I work with.

  8. Top | #18
    Contributor Speakpigeon's Avatar
    Quote Originally Posted by Tom Sawyer View Post
    Quote Originally Posted by Speakpigeon View Post
    Quote Originally Posted by Tom Sawyer View Post
    If it's possible to do, someone will do it. That means the two choices available are to have the more intelligent machines yourself or to be in competition against those with more intelligent machines without having them yourself. Given the lack of a third option, building them yourself makes the most sense.
    Sure, AIs more intelligent than us would be the means to beat the competition and we will all want them. Yet, if AIs more intelligent than us would be a bad thing, I don't see why it would be impossible to convince all governments and big companies capable of developing AIs that it would be bad and consequently to agree on a moratorium. If they wouldn't be a bad thing, then there would be no reason not to do it. So, the question is whether they would be bad to begin with and whether we could stop ourselves creating them.
    EB
    Whether they're good or bad is irrelevant. It's like nuclear missiles. You have one group with nukes who dominates everybody because they're the only ones with nukes or you get a MAD situation where the nuclear capabilities cancel each other out. If you get nine groups saying "Hey, these are really dangerous, let's not build any" then all you've done is created an opportunity for the tenth group.

    Same with super intelligent AI. Somebody is going to do it if it's possible to do. You can be that somebody or you can be one of the people that somebody uses it against. The third option of not having any of those somebodies develop it in the first place isn't a real option.
    You are assuming someone will build them. Sure, in this case we had better all have them if we don't want to be history. But it is at least conceivable that all the major powers become convinced it would be bad and cooperate to make it impossible in actual fact. Assuming this, what would be your argument that we should or shouldn't build them?
    EB

  9. Top | #19
    Contributor Speakpigeon's Avatar
    Quote Originally Posted by TSwizzle View Post
    Since the vast majority of humans are irrational and complete morons, I think it is imperative we build machines smarter than humans. My toaster is smarter than most of the fuckwits I work with.
    Then we're toast.
    EB

  10. Top | #20
    Contributor Speakpigeon's Avatar
    Quote Originally Posted by TSwizzle View Post
    Since the vast majority of humans are irrational and complete morons, I think it is imperative we build machines smarter than humans. My toaster is smarter than most of the fuckwits I work with.
    It is actually not clear at all why you think it is "imperative". How could having machines more intelligent than any human being, which fuckwit morons will be able to use for their own ends, possibly be good news?
    EB
