Is Moral Value Primarily Determined by Ability to Suffer?

So if there were a pill that induced a 24/7 adrenaline rush, giving livestock animals the peak of their happiness potential, they would suddenly not be morally valuable? Sentience is the basis, not suffering
 

Well, that's arbitrary. Why? And what do you mean by "moral value"? If an animal can experience the world and their own inner states, then that is one thing. The question of what moral value we give to that is another. For example, if we could make farmed animals happy all the time, even when we kill them, doesn't that exhaust our moral obligation? They will not feel pain and suffer. Consider that this is largely the obligation those in favour of alleviating wild animal suffering believe exists. What more is demanded of us because of sentience? A right to be free? A right not to be exploited? Why?
 
Is this the "meat-bot" thing :rolleyes:

So if you were about to be killed, it would be better if you were given ecstasy first?
 
I would say that if someone planned to kill me and could give me some pill that prevented me worrying about it, then it seems of no great concern to me.
 
Well, that's arbitrary. Why?
Sentience (i.e. distinguishing between independent objects that have agency and those that don't) is not arbitrary... The tree is not like the squirrel...
And what do you mean by "moral value"?
Tearing the tree to shreds is fundamentally different from tearing the squirrel to shreds...
If an animal can experience the world and their own inner states, then that is one thing.
You are crazy
The question of what moral value we give to that is another.
That's not the idea; it's not about you, it's about them
For example, if we could make farmed animals happy all the time, even when we kill them, doesn't that exhaust our moral obligation?
Obviously not; they do not have the keys to their cages, and slavery is still slavery... it's about sentience, not suffering
They will not feel pain and suffer.
That is not relevant
Consider that this is largely the obligation those in favour of alleviating wild animal suffering believe exists. What more is demanded of us because of sentience? A right to be free? A right not to be exploited? Why?
Sure, if you're going to euthanize, then maximum-pleasure pills are obviously better than not, but that is a totally different topic from veganism...

Why do you demand any of those things? What makes it different for you than for the squirrel?
 
What I am getting at, JacobEdward, is that it isn't immediately obvious what kind of moral value sentience demands. The capacity to suffer suggests that, being empathic beings, we might prefer not to cause pain and suffering to other animals. That means we need to do what we can to assure good welfare for the animals we use. Beyond that, what does sentience demand? Yes, I understand that we can want to claim that other animals have a right to have their interests given fair treatment, because that's how we treat other humans. But there doesn't appear to be any particular a priori case that we have to grant that claim. When we talk about rights, most people take that to mean something like liberation, hence your mention of slavery. I would say, what does it matter if animals are owned and "enslaved", so long as their lives are generally without pain and suffering?

Other moral value accrues to animals and plants, for example in terms of species continuation or ecological value or even aesthetic value. But that doesn't only pivot on sentience.

In both cases, the moral value of these animals and plants depends upon a moral calculus by human beings. Absent humans and there are no morals, at least not such as we'd recognise them. Sentience may suggest some moral duty on our part but I don't see why it leads to liberation. Nor do I think moral value only rests on sentience, unless by moral value you mean welfare or even liberation.
 
Welcome to the Monkey House!
Assuming this means you disagree with me, can you outline your disagreement? If someone planned to kill me without my knowledge and they gave me a pill to ensure I did not suffer, of what concern is that to me? None at all, I suggest. For once I am dead I no longer exist to have any feelings about the matter and before I died I did not know that I would. Now, there might be some concern if I have a family or a critical job or some other value, but for me personally, it's not a worry of any kind.
 
It was a little bit of an inside joke.
"Welcome to the Monkey House" is a short story by Kurt Vonnegut. One could say that it explores some questionable moral arguments.

 
Sentience demands consent? How on earth do you get to that? Sentience doesn't mean a fully formed, rational mind with the capacity for language and abstract thought, it just means the capacity to experience the world. It's entirely possible that a machine could do that. The sentience of a cow does not demand consent; such sentience may be sufficient that humans might one day agree that is required but so far at least that is not a commonly agreed legal or moral view.
 