xixidu

  • Commented on How I feel this morning
    Also check out The Old Reader, a Google Reader clone....
  • Commented on How I feel this morning
    So far the open source project RSSOwl seems like the best alternative for those who subscribe to a huge number of feeds and need a lot of functionality....
  • Commented on Roko's Basilisk wants YOU
    Sigh...that old thread. I am reluctant to read my old comments. That post was basically my reaction to the Roko incident, although for reasons of censorship I had to circumvent direct criticism of the idea. The real title of the...
  • Commented on Roko's Basilisk wants YOU
    The correct way to state it, hilariously, is 'Plurality is not to be posited without necessity' or 'Entities should not be added without need' or however you phrase it. I think they believe that this does not affect MWI because...
  • Commented on Roko's Basilisk wants YOU
    Also see e.g. the following comment from a LessWrong member: My own guesstimate is that, conditional on FAI being achieved in the next 100 years (so enough information is preserved to make them relatively easy to resurrect), and conditional on...
  • Commented on Roko's Basilisk wants YOU
    The following idea from David Deutsch's The Beginning of Infinity might be relevant: Take a powerful computer and set each bit randomly to 0 or 1 using a quantum randomizer. (That means that 0 and 1 occur in histories of...
  • Commented on Roko's Basilisk wants YOU
    The odds of finding a few hundred people on one blog capable of correctly analyzing and refuting Eliezer based on second-hand reportage are approximately zero. The reason being that a lot of his extraordinary statements are in essence vague and...
  • Commented on Roko's Basilisk wants YOU
    Of course, that is one of the underlying ideas. You could be simulated right now, yes you, the person that is reading these very words right now, by some sort of superintelligence that wants to learn if it can trade...
  • Commented on Roko's Basilisk wants YOU
    There is no time-travel involved. What "acausal" refers to is that decisions are being made based on mutual reasoning, predictions or even simulations of causally disconnected agents. If all parties come to similar conclusions about each other's predictions, actions and...
  • Commented on Roko's Basilisk wants YOU
    You don't need to assign the same amount of value to what happens to a copy of you as to what happens to your current conscious self. But it is assumed that you do at least somewhat care about what happens...
  • Commented on Roko's Basilisk wants YOU
    Also see my post 'The Nature of Self'. Maybe this dissolves some confusion. Let me know....
  • Commented on Roko's Basilisk wants YOU
    You are right, I just wanted to point out that even if the torture were aimed at your original biological copy, the strategy of refusing to be blackmailed would still be correct. I now added another example to my...
  • Commented on Roko's Basilisk wants YOU
    Example of a human blackmailer: Suppose some human told you that in a hundred years they would kidnap and torture you if you don't become their sex slave right now. The strategy here is to ignore such a threat and...
  • Commented on Roko's Basilisk wants YOU
    I keep being told not to expose people to Roko's basilisk. Let me list a few points and explain why I believe that attitude to be part of the reason why I don't keep quiet about it: 1.) Extraordinary...
  • Commented on Roko's Basilisk wants YOU
    And you certainly have edited the RationalWiki LessWrong page... I did not claim otherwise. I didn't edit the Roko's basilisk entry. I edited the LessWrong entry exactly 5 times; two of those edits helped to improve the image of LessWrong while...
  • Commented on Roko's Basilisk wants YOU
    I am not aware that I ever claimed that a "significant minority" of LessWrong takes Roko's basilisk seriously. It is unknown how many people take it seriously (notice here that I did not write that RationalWiki entry). What is known...
  • Commented on Roko's Basilisk wants YOU
    And let's be clear about the source of 'toxic mindwaste', as Eliezer Yudkowsky likes to call it now. It is not due to those who try to dissolve people's confusion about those hazards; it is due to those who cause people to...
  • Commented on Roko's Basilisk wants YOU
    Having spent around six months around Less Wrong, I have literally never encountered someone who believes this is a serious concern other than possibly Eliezer themselves... That's easily explained by the fact that people are not allowed to talk about...
  • Commented on Roko's Basilisk wants YOU
    The following discussion between me and another LessWrong member might be of interest to people reading this thread. It highlights some of the perceived problems: David Gerard: I gotta say: statements of probability about things you don't understand and have...
  • Commented on Roko's Basilisk wants YOU
    Updated my post to add the following: There are various reasons why humans are unqualified as acausal trading partners and why it would therefore not make sense for a superintelligent expected utility maximizer to blackmail humans at all: 1.)...
  • Commented on Roko's Basilisk wants YOU
    Vanzetti, realize that the existence of such an AI becomes even more unlikely if it attempts to blackmail people who consistently refuse to be blackmailed and work against blackmailers. Which makes it instrumentally irrational to blackmail such people. Just...
  • Commented on Roko's Basilisk wants YOU
    Vanzetti, let me add the following. It would not make sense for such an agent to punish you for ignoring such threats of punishment. If you ignore such threats then it will be able to predict that you ignore such threats...
  • Commented on Roko's Basilisk wants YOU
    Vanzetti, given that we're talking about superintelligent consequentialist expected utility maximizers here, the important point is that such agents will try to control the probability of you acting according to their goals. If their simulations show that you almost...
  • Commented on Roko's Basilisk wants YOU
    This is not directly related but you might also want to check out my interview series on risks associated with artificial general intelligence in which I asked various experts about their opinion: Interview series on risks from AI...
  • Commented on Roko's Basilisk wants YOU
    By the way, I recently wrote a post on how to defeat Roko’s basilisk and stop worrying. Initially I list a few reasons why I don't think it is sensible, even given that you accept a lot of the...
  • Commented on Roko's Basilisk wants YOU
    By the way, archiving things as PNGs really sucks. Learn how to use wget and wkhtmltopdf, or something. HTML version. I explained exactly my problems with it in our discussion yesterday. He seems more sane than quite a few other...
  • Commented on Roko's Basilisk wants YOU
    @gwern Here is a screenshot (warning: large png image) of the post the quote is from. I don't see how it is misleading. He describes the problems and consequences of taking the whole LessWrong ideology seriously. I don't see how...
  • Commented on Roko's Basilisk wants YOU
    I just want to add a note regarding Eliezer Yudkowsky's decision to censor any discussion of Roko's basilisk: Banning any discussion of an idea is known to spread it. But more importantly, it can give even more credence to an...
  • Commented on Roko's Basilisk wants YOU
    To be honest, the top-ranked post on the Less Wrong blog is GiveWell Labs trashing the hell out of MIRI, so there is still hope for them. Sure, I am also a LessWrong member with a Karma score of more...
  • Commented on Roko's Basilisk wants YOU
    Here is a taste of the kind of intricate argumentative framework that shields those people from any criticism: (Note: The Machine Intelligence Research Institute (MIRI), formerly known as the Singularity Institute for Artificial Intelligence, controls LessWrong.) Skeptic: If you are...