Comments on: The Philosophical Landscape of “Westworld” https://strangenotions.com/the-philosophical-landscape-of-westworld/ A Digital Areopagus // Reason. Faith. Dialogue. Fri, 12 Feb 2021 12:00:25 +0000 hourly 1 https://wordpress.org/?v=6.7.1 By: Luke Breuer https://strangenotions.com/the-philosophical-landscape-of-westworld/#comment-172300 Mon, 12 Dec 2016 18:34:00 +0000 http://strangenotions.com/?p=6769#comment-172300 In reply to Brian Green Adams.

Those are rather difficult to tackle in combox discussions. But if facts in those domains are critically important for the truth or falsity of what has been said so far, how does that bear on discussions such as these? Do they become mere entertainment, or exercises in logic at best?

By: Brian Green Adams https://strangenotions.com/the-philosophical-landscape-of-westworld/#comment-172294 Mon, 12 Dec 2016 13:28:00 +0000 http://strangenotions.com/?p=6769#comment-172294 In reply to Luke Breuer.

Thanks, I think I have your view and you mine. I'm not going to argue parallel histories or practical geopolitics.

By: Luke Breuer https://strangenotions.com/the-philosophical-landscape-of-westworld/#comment-172267 Fri, 09 Dec 2016 19:12:00 +0000 http://strangenotions.com/?p=6769#comment-172267 In reply to Brian Green Adams.

BGA: I am not asking you if you would do it, would you accept it as legal? Would you say it is permissible in a society? We are not talking about morality, but fundamental human rights.

LB: Sure, and yet I look at the world and I see that the "fundamental human rights" of the Tutsis in Rwanda did approximately jack to keep them alive, which is surely a prerequisite to "fundamental human rights" having any utility whatsoever.

BGA: I am not sure you kept track of the question you are saying "sure" to above. It was whether you would allow torture and harvesting the organs of the criminally insane paraplegic.

Apologies; the underlined was in response to the underlined.

I would say the first step is acknowledging what fundamental human rights are and who gets them. It is another question altogether how to prevent them from being violated.

To speak of what gets built on the foundation when one doesn't know what the foundation can support is, in my view, wishful thinking, and often does little to foster human flourishing.

Next is that if there is no God, as I believe is apparent, it is meaningless.

Actually, what's almost certainly the case if there is no God is that not all rights are satisfiable for all people. So you end up coming up with a rationalization for why the haves get to have, and why the have-nots don't deserve to be helped more than e.g. is documented in The Charitable–Industrial Complex. Remember: part of "Might makes right" is that power lets you come up with rationalizations you can shove down others' throats. Or you seize control of education and other institutions for forming citizens and indoctrinate them. All that is left is to explain why e.g. the 2015 Charlie Hebdo shooting was terrible while the 1999 NATO bombing of a Serbian news station was acceptable, but as it turns out, that's not very hard. Most humans accept the programming without too much fuss.

Next is if there is a God, identifying what his glory is would seem to be an extremely difficult problem and massively subjective.

Indeed, it might be like the blind men and an elephant, except with each person having enough overlap with enough others so that together, if they trust each other, they can reconstruct the entire elephant. It would be the most extreme form of subjectivity which could nevertheless contribute to a collective objectivity.

Ok, but the idea is to come up with rights that make such a scenario less likely to be carried out.

Without some semblance of everyone contributing to the common good, I don't see this working in any stable, long-term fashion. You might look at how in Nazi Germany, the Jews were seen as working against the common good.

But I think when you start making the rights of people dependent on your impression of whether they have the ability and willingness to maximize your version of the common good, we begin to enable this kind of thing.

Who said that I get to be God, that it's my impression, my version which is all-determining?

In other words, under your threshold, genocide is sometimes permissible. Under mine it never is.

I fail to see how your threshold would have done anything to prevent the Holocaust. I care about what works, not what suffices as a salve for my conscience while other people suffer horribly.

I also share your view that if "my attempts to plead with the powers that be fail, I'll go and become one of the slaughtered." But I would do so without any belief in the resurrection or that Justice will ultimately win.

Then evolutionary psychology would apparently be false:

BGA: If our inherent desires acted contrary to our survival, it would be evidence against evolutionary psychology.

Unless somehow you think that inherent desires can somehow give rise to non-inherent desires?

By: Brian Green Adams https://strangenotions.com/the-philosophical-landscape-of-westworld/#comment-172265 Fri, 09 Dec 2016 15:01:00 +0000 http://strangenotions.com/?p=6769#comment-172265 In reply to Luke Breuer.

"Sure, and yet I look at the world and I see that the "fundamental human rights" of the Tutsis in Rwanda did approximately jack to keep them alive, which is surely a prerequisite to "fundamental human rights" having any utility whatsoever."

I am not sure you kept track of the question you are saying "sure" to above. It was whether you would allow torture and harvesting the organs of the criminally insane paraplegic.

I would say the first step is acknowledging what fundamental human rights are and who gets them. It is another question altogether how to prevent them from being violated, and how to ensure they have utility. This is an enormous question that many people, myself included, are devoted to solving. But it is not the topic of this discussion.

"the extent to which Christianity shaped Europe and created the conditions for a great amount of cognitive diversity while the underlying virtues were sufficiently robust to maintain an awful lot of unity"

Again, different issue.

"One needs a more basic foundation, upon which rights can be built.'

Indeed, but this is a very difficult question. I have said it should lie in the capacity to suffer and the cognitive ability to be aware of one's existence and have a desire to continue living without suffering. Your threshold seems to be the potential ability to maximize God's glory. I find this enormously problematic. First, it fails to include any cognitive ability; perhaps this is implied. Next is that if there is no God, as I believe is apparent, it is meaningless. Next is if there is a God, identifying what his glory is would seem to be an extremely difficult problem and massively subjective. Next is how do we assess what would "maximize" this, especially given some theologies which have as a tenet that any such God is the perfection of all things that can be perfected. There is an enormous dispute even among those who identify as Christian about what is required to maximize God's glory, not to mention other major religions and then individual versions of religions such as your own.

"Oh come on, this isn't a psychopath torturing animals, this is the attempt to save everything you care about from utter destruction."

No, in that example it is a non-psychopath human torturing an entity (one we are trying to decide whether to treat as a thing like a human or a thing like a toaster) to literally save all humanity from extinction. The question is, can we treat this entity like a toaster, or do we need to treat it like a human, or an animal, or something else? Again, you brought up the Cylons, and this question we are discussing was raised specifically in that show. If something is "artificial" but is more or less exactly the same as us, does it get any rights, all rights, some rights?

"Suffice it to say that if a plan like that starts being carried out and my attempts to plead with the powers that be fail, I'll go and become one of the slaughtered. See, I believe in resurrection of the dead, that true Justice will ultimately win."

Ok, but the idea is to come up with rights that make such a scenario less likely to be carried out. Admittedly, the project is not working so well. But I think when you start making the rights of people dependent on your impression of whether they have the ability and willingness to maximize your version of the common good, we begin to enable this kind of thing. In other words, under your threshold, genocide is sometimes permissible. Under mine it never is.

I also share your view that if "my attempts to plead with the powers that be fail, I'll go and become one of the slaughtered." But I would do so without any belief in the resurrection or that Justice will ultimately win.

By: Luke Breuer https://strangenotions.com/the-philosophical-landscape-of-westworld/#comment-172255 Thu, 08 Dec 2016 20:59:00 +0000 http://strangenotions.com/?p=6769#comment-172255 In reply to Brian Green Adams.

I am not asking you if you would do it, would you accept it as legal? Would you say it is permissible in a society? We are not talking about morality, but fundamental human rights.

Sure, and yet I look at the world and I see that the "fundamental human rights" of the Tutsis in Rwanda did approximately jack to keep them alive, which is surely a prerequisite to "fundamental human rights" having any utility whatsoever. I see things like the extra-judicial assassination of Anwar al-Awlaki and the threat to rights that Hedges v. Obama represents and I ask: is the foundational need an agreement on "fundamental human rights", or do we need something deeper, such as a commitment to some common good? Have you ever looked at the 'rights' ostensibly guaranteed by the USSR? Of course they weren't, but I'm sure much fanfare was made of them.

What many people apparently fail to recognize is the extent to which Christianity shaped Europe and created the conditions for a great amount of cognitive diversity while the underlying virtues were sufficiently robust to maintain an awful lot of unity. That deposit of virtue is now eroding. If Brexit and Trump didn't convince you of this, just wait a few more decades. I personally have no interest in trying to continue propping up a failed political liberalism. Without a more robust concept of a common good than we currently have, unity just isn't possible. Yeah, exactly how to formulate and interact with said concept is difficult, but that's just a fact of life.

Last night I just finished watching Westworld, and I note something crucial: the AI and humans never managed to work together toward something they both wanted. You basically had temporary unity in sex and then disunity in war. The idea that merely attributing "fundamental human rights" will solve things—a position you haven't explicitly espoused, but which seems consistent with what you've said to date—is in my mind, utterly wrong. One needs a more basic foundation, upon which rights can be built.

The Cylon has information that will save thousands of lives, in fact save humanity from extinction and it will divulge this if you torture it. We torture animals for far less good outcomes, even vain outcomes. What is the harm in torturing this Cylon? The example is meant to draw out how you feel about the Cylon, whether it has a right to security of the person, or whether it doesn't, like an animal or a plant. And say you don't want to do it, but others do and your job is to stop them or not. What do you do?

Oh come on, this isn't a psychopath torturing animals, this is the attempt to save everything you care about from utter destruction. This is literally the worst example you could give of torture. I am inclined to accept that torture statistically doesn't work, but in the insane hypothetical you've presented, what risk is not worth taking? Only if you believe in resurrection from the dead would you think it is better to succumb than carry out said evil.

You seem to have dropped the idea of willingness and ability to assist the common good from your examination of the basis for rights and now it is the ability (?) intention (?) to glorify God.

If you think this, you need to re-read what I wrote and have multiply linked to:

LB: I hesitated on whether to use the term "God's glory". It fits because I believe God is a servant god—that is, he has the best interest at heart of not just humans, but all of creation. Where we might over-value ourselves and under-value rivers we are polluting, God values things appropriately. At least part of what glorifies God is to make plain how awesomely he did things. So to aim for maximization of God's glory instead of some other glory is to be just instead of partial.

[...] I can easily see politicians taking the position that homosexuals cannot have the ability to glorify god, as they claim to be essentially homosexual.

To live is to play with fire. (I don't see a sustainable alternative to having that as a danger.)

Now this seems extreme, maybe, but is it? Consider God's treatment of the Amalekites. Under your view God recognized that this human people could not glorify Him, so he ordered Saul to kill them, even the infants.

I'm not going to get side-tracked into one of these discussions; it would result in a conversation a mile wide and an inch deep. Suffice it to say that if a plan like that starts being carried out and my attempts to plead with the powers that be fail, I'll go and become one of the slaughtered. See, I believe in resurrection of the dead, that true Justice will ultimately win.

By: Brian Green Adams https://strangenotions.com/the-philosophical-landscape-of-westworld/#comment-172239 Thu, 08 Dec 2016 13:39:00 +0000 http://strangenotions.com/?p=6769#comment-172239 In reply to Luke Breuer.

" Given that, your statement of "a better explanation" appears 100% unscientific."

You are absolutely correct. We are not doing science. It has a much, much higher threshold before it accepts something as true. On that threshold, the origin of our psychology is unknown.

"Given that human brains have significant similarities to animal brains, either you think that in this world, evopsych is unfalsifiable, or this was a red herring."

No, had human brains been completely different from animal brains, if our psychology operated the same absent a brain, or irrespective of any physical damage, it would be evidence against evolutionary psychology. If our inherent desires acted contrary to our survival, it would be evidence against evolutionary psychology.

'No, I don't at all think of my actions as permitting me to do whatever I desire which is not prohibited by someone else's rights.'

I am not asking you if you would do it, would you accept it as legal? Would you say it is permissible in a society? We are not talking about morality, but fundamental human rights.

"I believe the former has the potential for becoming healthy, while psychopathy may be what is 'healthy' for the AI."

But that is why I gave you this example, it is someone who does not have the potential for becoming healthy. (And yes I acknowledge what I mean is no reasonable prospect of becoming healthy, which is all you can say about your AI too.)

"Explain to me how torturing a Cylon plausibly glorifies God. "

The Cylon has information that will save thousands of lives, in fact save humanity from extinction and it will divulge this if you torture it. We torture animals for far less good outcomes, even vain outcomes. What is the harm in torturing this Cylon? The example is meant to draw out how you feel about the Cylon, whether it has a right to security of the person, or whether it doesn't, like an animal or a plant. And say you don't want to do it, but others do and your job is to stop them or not. What do you do?

" Instead, I pointed out that essential psychopaths—such as Lore—don't get the same freedoms as you and I."

All persons have their freedom limited in a civil society, and I am in agreement with this, as I have said. The question is not whether we can or should limit freedom and rights, but which entities get rights in the first place that can then be limited.

"I would again return to the guiding ideal of glorifying God, from which one can derive rights when appropriate."

You seem to have dropped the idea of willingness and ability to assist the common good from your examination of the basis for rights and now it is the ability (?) intention (?) to glorify God. Well, why could the Sun not glorify God? It is part of His creation; without it no life could exist. One of the most famous miracles is the miracle of the sun. Dogs help the blind and the police, they comfort us, they appear to love.

Then this brings us to the problem of you providing a rigorous explanation of this threshold. It seems an extremely precarious threshold; I can easily see politicians taking the position that homosexuals cannot have the ability to glorify god, as they claim to be essentially homosexual. Consider people like Mike Pence, or Mike Flynn, the new national security advisor, who has publicly stated that fear of Muslims is rational. This gets us very close to "they have no ability to glorify god and never will." Once people are convinced of this (and it would seem many, many people are), they lose rights; this means it is okay to kill them, to commit genocide against them.

Now this seems extreme, maybe, but is it? Consider God's treatment of the Amalekites. Under your view God recognized that this human people could not glorify Him, so he ordered Saul to kill them, even the infants.


By: Luke Breuer https://strangenotions.com/the-philosophical-landscape-of-westworld/#comment-172217 Wed, 07 Dec 2016 21:41:00 +0000 http://strangenotions.com/?p=6769#comment-172217 In reply to Brian Green Adams.

Definitely not willing or able to give a rigorous sketch of evolutionary biology. Just think it is a better explanation than is proposed above.

That's not quite what I asked; I said "rigorously sketch out the limits of your evopsych explanation". But we're in the same territory here as your refusal to rigorously sketch out the limits of your materialist position. You could deny that God exists, but you couldn't present a single phenomenon you could possibly conceive of experiencing that your materialism could not explain. Given that, your statement of "a better explanation" appears 100% unscientific. Which is fine, as long as you don't play it off as scientific.

Evolutionary psychology isn't unfalsifiable. It would be falsified if human brains were nothing like animal brains, for example.

Given that human brains have significant similarities to animal brains, either you think that in this world, evopsych is unfalsifiable, or this was a red herring.

So you would deprive people of rights if they lack the ability to contribute to the common good? I would not. Say a criminally insane paraplegic? Would you say we could conduct experiments on them, harvest their organs?

No, I don't at all think of my actions as permitting me to do whatever I desire which is not prohibited by someone else's rights. That's why I originally spoke of glorifying God and not respecting others' rights. Loving others—agape—is so much more than merely respecting their rights.

The difference between a criminally insane paraplegic and an essentially psychopathic AI is that I believe the former has the potential for becoming healthy, while psychopathy may be what is 'healthy' for the AI. But the correct response to an essentially psychopathic AI would seem to be: how do I create an AI which isn't essentially psychopathic? Not... "How can I therefore torture the psychopathic AI for fun or utility?"

Ok, that's the difference between you and me. I would afford Cylons full rights. Good example. In the episode where they torture the Cylon, you would say there is nothing less moral about that than, say, torturing a cat?

Explain to me how torturing a Cylon plausibly glorifies God. It seems to me that torture is actually a way to force others to expiate the evil in the world. (I was hurt unfairly and someone else deserves to be hurt as a consequence. Preferably someone who I can convince myself deserves the hurt.) You might remember that Jesus showed how expiation works. See his command to deny oneself and take up one's cross if one wishes to become his disciple. See also Paul's descriptions.

I see a solution, the one practiced by liberal democracies. That rights be afforded on the basis of cognitive abilities and interests.

It seems to me that you may well be born in the right era to see how well liberal democracies survive without citizens fulfilling enough of their duties. (Recall that I said "Rights are unsustainable without duties.")

I think you are wrong that prioritizing individual good over common good is an ideology of liberalism.

Here's a pretty good authority on this aspect of political liberalism:

    Similarly, the problem of the common good which arises for liberalism has its analogue in at least some other traditions. Its most cogent recent statement has been by Robert A. Dahl in Dilemmas of Pluralist Democracy (New Haven, 1982). In what Dahl calls pluralist democracies, which are very much what I have called liberal political orders, individuals pursue a variety of goods, associating in groups to achieve particular ends and to promote particular forms of activity. None of the goods thus pursued can be treated as overriding the claims of any other. Yet if the good of liberalism itself, the good of the pluralist democratic polity rather than the goods of its constituent parts, is to be achieved, it will have to be able to claim an overriding and even a coerced allegiance. Or, to put the problem in another way, what good reasons could an individual find for placing him or herself at the service of the public good rather than of other goods? Dahl offers an acute and detailed account of "the extreme vulnerability of individualist civic virtue" and discusses possible remedies (op. cit., chapters 6 and 7), but, as he himself stresses, the problems are generated by the very forms of this kind of political order, and the task of institutionalizing any proposed remedies would confront the same set of questions which engendered the problems. (Whose Justice? Which Rationality?, 374)

We can also look at a prominent proponent of political liberalism, John Rawls. According to the IEP, "John Rawls was arguably the most important political philosopher of the twentieth century." What's interesting is a shift we see between two of his books, A Theory of Justice and Political Liberalism. Here's how the IEP describes it:

The seminal idea of PL is “overlapping consensus.” In an overlapping consensus, each citizen—no matter which of society’s many “comprehensive conceptions” he or she endorses—ends up endorsing the same limited, “political conception” of justice, each for his or her own reasons. The principal role of the overlapping consensus is to replace TJ’s description of wholehearted acceptance. Unlike TJ’s description, the overlapping consensus conceptually reconciles wholehearted acceptance with the fact of reasonable pluralism. (IEP: John Rawls)

This 'overlapping consensus' is somewhat deviously named, because as we read in the previous paragraph, "the kind of uniformity in fundamental moral and political beliefs that he imagined in Part Three of TJ can be maintained only by the oppressive use of state force." He describes this as "the fact of oppression". Why is this required for anyone but deviants if citizens have a sufficiently robust sense of the common good?

Here is a question. Who decides what the common good is? How? What is the test for whether someone has the ability to contribute to it? Do you think the more disabled you are the fewer rights you should have? As you would have less ability to contribute?

In my view, everyone plays a part in determining what the common good is, including God. Probably all of creation does, but that I think would broaden this conversation too much. We can work with approximations. Ultimately, I see no other option than this for making [knowably] false the maxim, "Might makes right." If there is nothing within a person which needs to be freely given to society, then that person can be arbitrarily fully manipulated. And that's just what is happening; for a bit on that, see Nina Eliasoph's Avoiding Politics: How Americans Produce Apathy in Everyday Life.

I didn't predicate rights on the quantity one can contribute to the common good. Instead, I pointed out that essential psychopaths—such as Lore—don't get the same freedoms as you and I. Furthermore, I would question my own ability to properly evaluate how much any given individual can contribute to the common good. For example, Joni Eareckson Tada has almost certainly contributed more than I ever will.

Also, would you grant rights to the sun? Clearly it has the ability to contribute to the common good. What about dogs? They definitely have this ability.

I would again return to the guiding ideal of glorifying God, from which one can derive rights when appropriate. I don't see how they are appropriate for the sun or dogs. But I do recognize that for political liberalism, 'rights' might be the only true moral force.

By: Brian Green Adams https://strangenotions.com/the-philosophical-landscape-of-westworld/#comment-172201 Wed, 07 Dec 2016 12:20:00 +0000 http://strangenotions.com/?p=6769#comment-172201 In reply to Luke Breuer.

Definitely not certain, but if you propose a psychopath robot, I assume you mean the same psychopathy as I am familiar with.

Definitely not willing or able to give a rigorous sketch of evolutionary biology. Just think it is a better explanation than is proposed above.

Evolutionary psychology isn't unfalsifiable. It would be falsified if human brains were nothing like animal brains, for example.

I don't expect Lore would consent. I guess you are right. I don't know the difference between a psychopath AI and an essentially psychopathic AI. I don't see how this is different than human psychopaths; I would say they have psychopathy by their very nature too.

So you would deprive people of rights if they lack the ability to contribute to the common good? I would not. Say a criminally insane paraplegic? Would you say we could conduct experiments on them, harvest their organs?

Ok, that's the difference between you and me. I would afford Cylons full rights. Good example. In the episode where they torture the Cylon, you would say there is nothing less moral about that than, say, torturing a cat?

I see a solution, the one practiced by liberal democracies. That rights be afforded on the basis of cognitive abilities and interests.

No the liberal perspective is that individual rights should be respected unless there is reason to violate them. This is done all the time with incarceration. The difference is that we do not strip people of rights, we recognize we are violating those rights to respect the rights of others and to avoid a greater harm. E.g. it is better to violate Paul Bernardo's right to freedom to an extent than to risk him killing more women. But he doesn't lose all rights. I think you are wrong that prioritizing individual good over common good is an ideology of liberalism.

Here is a question. Who decides what the common good is? How? What is the test for whether someone has the ability to contribute to it? Do you think the more disabled you are the fewer rights you should have? As you would have less ability to contribute?

Also, would you grant rights to the sun? Clearly it has the ability to contribute to the common good. What about dogs? They definitely have this ability.

By: Luke Breuer https://strangenotions.com/the-philosophical-landscape-of-westworld/#comment-172173 Tue, 06 Dec 2016 00:03:00 +0000 http://strangenotions.com/?p=6769#comment-172173 In reply to Brian Green Adams.

No idea, you are the one hypothesizing about psychopathic AIs.

You were quite certain that psychopathy in an AI would necessarily be a personality disorder. Are you no longer so certain?

I am just saying don't start believing it is true until there is reason to.

That's fine; my beef is more that you don't seem willing to rigorously sketch out the limits of your evopsych explanation. Remember: a claim which can account for any phenomena—which is unfalsifiable, Popper-style—isn't an empirical claim, but a metaphysical claim. Is your evopsych explanation empirical, or metaphysical?

Sure, reprogramming him with his consent, this would be analogous to medical treatment. The question here is do you care if he consents?

Nothing in Lore's demonstrated character indicates he would consent. What you're doing here is trying to break out of my posit of AI which are essentially psychopathic. You don't seem to want to deal with the possibility of an AI which is psychopathic by its very nature—such that non-psychopathy would be a personality disorder.

The question you need to ask is why do we grant rights to anyone at all in the first place.

Ability to contribute to the common good, somehow. I said this four days ago (last paragraph).

Is it because they have feelings, desires, and are aware of their own existence? To me this is it.

That isn't good enough; I would not give Cylons full rights, not while they are bent on exterminating humanity.

Keep in mind, in the context of rights, what you are essentially saying is that someone who does not have the willingness to contribute to the common good loses all human rights.

Incorrect. What I'm doing is looking at a world where we claim that everyone has rights, while they are systematically violated. (Ever hear of the Rohingya people?) I ask myself, what would it take for them to have meaningful rights, rights which are protected? I don't see a solution other than one where everyone contributes to the common good. By the way, I do know that an element of political liberalism is that no contribution to the common good is required. The individual good is more important than any common good. I think that we are seeing the breakdown of that ideology, in our world today. Surely this is a purely empirical matter, which can be explored by looking at the evidence?

By: Brian Green Adams https://strangenotions.com/the-philosophical-landscape-of-westworld/#comment-172172 Mon, 05 Dec 2016 22:32:00 +0000 http://strangenotions.com/?p=6769#comment-172172 In reply to Luke Breuer.

"Psychopathy is a personality disorder for humans; why need it be a personality disorder for AI? Who are you to say what a 'healthy' AI is like, what a 'normal' AI is like?"

No idea, you are the one hypothesizing about psychopathic AIs. I am saying we should treat AIs the same as humans if we consider them to be worthy of extending rights to. Humans do not lose this status because of psychopathy; neither should AIs.

"Given that this "no belief at all" can be responsible for people spending years of their lives studying its object, I'm not sure what you're saying, here."

I am simply saying a hypothesis has a very low standard of proof. For something to be a hypothesis you do not need to think it is more likely than not, or even that it is the best explanation.

"Perhaps that one must not impose that belief hypothesis on others, against their will?"

I don't see how one even could.

"I'm just challenging you to not shut down avenues to discovering more of reality by insisting on well-developed theories and copious evidence from the get-go"

I have not shut anything down, this all arose from you saying:

"A force of pure good countered either by a force of evil or imperfect finitude can yield the result of mediocrity, which is what you are saying comes from our evolved reptile brains. Future experimentation can help distinguish which ontology is better."

By all means go ahead and investigate this and perhaps you will destroy my hypothesis of evolutionary psychology. I am just saying don't start believing it is true until there is reason to. You seem to be saying that just because you hypothesize this, you shouldn't be shut down for acting as if you believe it. I agree!

"Curious. Lore is the quintessential psychopath, given repeated opportunities to either contribute to the common good or at least not damage it and other individuals, and he failed. He is the poster child for losing his rights—either via imprisonment or deactivation. Surely reprogramming him to be more sociable would violate the very principles you say a mind wipe violates?"

Sure, reprogramming him with his consent, this would be analogous to medical treatment. The question here is do you care if he consents?

I would say psychopathy already is a range.

"You have before you an AI which is arbitrarily good at fooling you, while being a psychopath underneath. And you want to just extend it rights. I see problems with that, and I think a good number of other people would, as well."

The question you need to ask is why do we grant rights to anyone at all in the first place. Is it because of pure biology? That would be arbitrary. Is it because they have an immaterial soul and unlike the rest of creation are endowed with rights by a Creator? This is unfounded, but if true would be a good reason. Is it because they have feelings, desires, and are aware of their own existence? To me this is it.

I then ask how I know they have these, and the only answer must be, because they seem like they do. Because I know I do, and they act just like me in the relevant respects.

So if I encounter an artificial mind with these attributes, I think I must conclude they have feelings, desires, and are aware of their own existence in the same way. Thus I must afford them rights as I would a human.

You get to a de facto protection of rights by balancing. This is why the symbol for justice is scales. For me it has nothing to do with the willingness or ability to contribute to the common good. I would afford full human rights to anyone even if they expressly said they had no willingness or ability to so contribute. You balance someone's right to freedom with the public's right to safety. It can play out either way.

Keep in mind, in the context of rights, what you are essentially saying is that someone who does not have the willingness to contribute to the common good loses all human rights. This would mean that they can be owned, killed, chopped up and fed to children against their will. Consider further: how do we evaluate the "common good" and whether there is a "willingness" to contribute? This is pretty much the rationale communist extremists used in purges of the bourgeoisie. It is pretty much what the Interahamwe said of the Tutsis, what the Nazis said about the Jews, and so on; we hear it reflected even today by politicians who say things like "those who burn the flag should lose their citizenship."
