glass_superman t1_j241gi4 wrote
Is it ridiculous to worry about evil AI when we are already ruled by evil billionaires?
It's like, "oh no, what if an AI takes over and does bad stuff? Let's stop the AI so that we can continue to have benevolent leaders like fucking Elon Musk and the Koch Brothers."
Maybe our AI is evil because it is owned by dipshits?
cmustewart t1_j24bxuf wrote
I feel like either you or I missed the point of the article, and I'm not sure which. I didn't get any sense of "what if AI takes over". My read is that the author thinks "AI" systems should have some sort of consequentialism built in, or at least considered in the goal-setting parameters.
The bit that resonates with me is that highly intelligent systems are likely to cause negative unintended consequences if we don't build this in up front. Even for those with the most noble intentions.
glass_superman t1_j24oaqv wrote
It's the article that missed the point. It wastes time considering the potential evil of future AI and how to avoid it. I am living in a banal evil right now.
cmustewart t1_j24px5g wrote
Somewhat fair, as the article was fairly blah, but I've got serious concerns that current regimes will become much more locked into place, backed by the power of scaled superhuman AI capabilities in surveillance, behavior prediction, and information control.
glass_superman t1_j26l6c5 wrote
That's totally what is going to happen. Look at international borders. As nuclear weapons and ICBMs have proliferated, national borders have become basically permanent. Before WWII shit was moving around all the time.
AI will similarly cement the classes. We might as well have a caste system.
Meta_Digital t1_j2465yk wrote
Yeah, applying the concept of the "banality of evil" to something imaginary like an AI, when capitalism is right there being the most banal and most evil thing humanity has yet to contend with, is a kind of blindness one might expect if you're living within a banal enough evil.
Edit: Angry rate downs despite rising inequality, authoritarianism, climate change, and the threat of nuclear war - all at once.
Wild-Bedroom-57011 t1_j25k0ew wrote
Is capitalism the most evil thing humanity has dealt with? More than feudalism, slavery, etc?
Further, AI isn't really imaginary-- at worst the author is trying to pre-empt and avoid an issue that is less likely to come to pass
tmac213 t1_j265p0f wrote
In the current world I think the answer is clearly yes, although I would amend the word "evil" to something like destructive or harmful, since an economic system doesn't need to contain malicious intent in order to harm people. Feudalism was and is awful, but has largely given way to capitalism in most places in the modern world. As for slavery, capitalism has been the primary consumer of slave labor since the industrial revolution. So yes, capitalism is the worst.
Wild-Bedroom-57011 t1_j28kq4l wrote
But they said "has yet to contend with".
Unless you do in fact mean that capitalism is worse than every single system of governance humanity has contended with, including things before slavery that are hard to conceptualize under one framework, extreme state control (whether you believe NK and the USSR were actually socialist or not), etc. etc.
I'm not making a pro-capitalist argument, merely the point that capitalism isn't obviously the most evil thing humanity has contended with.
And ignoring the issue of slavery historically-- "has yet to contend with"-- does seem a bit of a deliberate sidestep. Of course capitalism will be the primary consumer of slave labour, but slavery, absolute poverty, etc. are lower and falling. Further, modern slavery is completely terrible, but less severe than chattel slavery, or the slavery that came before that.
But again, my argument was never that capitalism is better than anything else, merely that it isn't the most evil thing. Genocide might be. Or something completely different.
Meta_Digital t1_j25klmm wrote
Yes; capitalism is the first system that seems poised to lead to human extinction if we don't choose to overcome it, rather than reacting after it does its damage and self-destructs.
The AI the author is referring to is either what we have today, which is just old mechanical automation, or the AI that is imagined to have intelligence. Either way, it's the motives of the creators of those systems that are the core problem of those systems.
Wild-Bedroom-57011 t1_j25m53g wrote
But it seems that the AI alignment issue is also a big concern. In either case-- capitalists using AI for SUPER CAPITALISM (i.e. it can do all the normal capitalism things, but faster and more effectively), where the issue is solely intent and motive, or capitalists incorrectly specifying outcomes (cutting corners to make profit), leading to misaligned AI that does really bad things-- your arguments against capitalism only strengthen the concerns we have with AI
Meta_Digital t1_j25myuu wrote
Indeed.
I think we could conceive of AI and automation that is a boon to humanity (as was the original intent of automation), but any form of power and control + capitalism = immoral behavior. Concern over AI is really concern over capitalism. Even the fear of an AI rebellion we see in fiction is just a technologically advanced capitalist fear of the old slave uprising.
Wild-Bedroom-57011 t1_j25q617 wrote
Sure! However, AI itself also has AI-specific concerns that are orthogonal to the socio-economic system we live under or the one it is created in. Robert Miles on YouTube is a great, entertaining and educational source for this
ShalmaneserIII t1_j26k7iw wrote
So if the problem with both capitalism and AI is that the people who create them use them for their own ends and motives, is your problem simply that people want something other than some general good for all humanity? Is your alternative forced altruism or something like it?
Meta_Digital t1_j26krzm wrote
Well, the fundamental problem with capitalism is that it just doesn't work. Not in the long run. Infinite exponential growth is a problem, especially as the basis of an economic system. Eventually, in order to maintain that growth, you have to sacrifice all morality. In the end, you have to sacrifice life itself if you wish to maintain it. Look at the promises vs. the consequences of automation for a great example of how capitalism, as a system and an ideology, ruins everything it touches. You don't need forced altruism to have some decency in the world; you just need a system that doesn't go out of its way to eliminate every possible hint of altruism in the world to feed its endless hunger.
ShalmaneserIII t1_j26ldc0 wrote
Automation is great. Without it, we'd still be making everything by hand and we'd have very few manufactured goods as a result, and those would be expensive.
So if you don't want endless growth, how do you suggest dealing with people who want more tomorrow than they have today?
Meta_Digital t1_j26pyv2 wrote
We don't put those kinds of people in charge of society like we do under capitalism.
ShalmaneserIII t1_j26yka7 wrote
We obviously would. Even if all resources were evenly divided, the leader who says "We can all have more tomorrow" is going to be more popular than one saying "This is all you'll ever have, so you'd better learn to like it."
Meta_Digital t1_j2713bm wrote
Yes, well, if everyone will have more tomorrow that sounds like socialism, not capitalism. Capitalism is "I will have more tomorrow and you will have less".
ShalmaneserIII t1_j27it2u wrote
No, capitalism simply is the private ownership of capital. But since some people will turn capital into more capital and others won't, you get the gaps between rich and poor. It doesn't require anyone to get poorer.
Meta_Digital t1_j290zs4 wrote
When wealth is consolidated, that means it moves from a lot of places and into few places. That's why the majority of the world is poor and only a very tiny portion is rich.
ShalmaneserIII t1_j299xeb wrote
Considering the rich portion is the capitalist part, this seems to be a fine support for it. Or is a world where we all toil in the fields equally somehow better?
Meta_Digital t1_j29abgt wrote
The whole world is integrated into capitalism, and the Southern hemisphere (other than Australia / New Zealand) has been extracted to make the Northern hemisphere (primarily Western Europe / US / Canada) wealthy.
We do have a world where people in imperial neocolonies toil in fields. If you don't know that, then you're in one of the empires using that labor for cheap (but increasingly less cheap to feed the owning class) commodities.
ShalmaneserIII t1_j2bf9ci wrote
Not my point. Are you suggesting we'd be happier if we were all in the fields?
Meta_Digital t1_j2bg0qj wrote
No, I am suggesting that we are "happier" in the wealthy parts of the capitalist economy because others are put into fields in slave-like conditions.
ShalmaneserIII t1_j2bgbqv wrote
Sounds great for us, then.
But are you suggesting we'd be happier if wealth were evenly divided?
Meta_Digital t1_j2bjyly wrote
Yes, we would be more prosperous. Poverty is often a form of violence inflicted on a population, and that violence ripples out, comes back, and affects us negatively. Things don't have to be perfectly even (that's a strawman), but by elevating the bottom we also lift the top. Certainly the inequality should be reduced, though, because a top elevated too high causes instability for everyone. It's impractical.
ShalmaneserIII t1_j2buiuj wrote
Then do non-capitalist economies have a better track record at reducing poverty than capitalist ones? Because even your nordic-model states are capitalist.
Meta_Digital t1_j2bzdvf wrote
Well, it's not my Nordic model to be fair.
Inequality today is the highest in recorded history, so technically, all other economic systems have a better track record for reducing poverty. Additionally, crashing every 4-7 years, capitalism is the least stable of all historic economic systems. It isn't the dominant system for either of these reasons.
ShalmaneserIII t1_j2cbna7 wrote
Inequality isn't poverty. A tribe of hunter-gatherers who have some furs and spears shared equally between them is not richer than modern LA.
Meta_Digital t1_j2cby1j wrote
But a group of hunter-gatherers who have free time, personal autonomy, and the basic necessities are a lot richer than the coffee plantation workers that drug LA, the meat industry workers that prepare the flesh they consume, the sweatshops that churn out their fast fashion, and the children in lithium mines that supply the raw material for their "green" transportation.
Where the hunter-gatherer doesn't have many luxuries, the average LA resident's luxuries come at the expense of human dignity and happiness elsewhere.
ShalmaneserIII t1_j2cchew wrote
See, this is why we ignore people like you- you'd offer up a life chasing buffalo and living in a tent as a better alternative to a modern industrial society. For those of us not into permanent camping as a lifestyle, there is no way we want you making economic decisions. And fortunately, since your choices lead to being impoverished- by the actual productivity standards, not some equality metric- you get steamrolled by our way.
Because your non-capitalist societies had one crucial, critical, inescapable flaw: they couldn't defend themselves. Everything else they did was rendered irrelevant by that.
Meta_Digital t1_j2ccpml wrote
I never argued for chasing buffalo or living in a tent. I don't think any of these are required. Are you responding to someone else's post or confusing me with someone else?
What I said is that the primitive life is objectively better than being a child laborer in a toxic metal mine or a wage slave in a sweatshop.
I don't think we have to give up a comfortable lifestyle because we transition to a more functional and ethical system than capitalism.
ShalmaneserIII t1_j2ccz59 wrote
Yes, we would give up that comfortable lifestyle. In the absence of either greed or threat, why work? And without work, what drives productivity?
Meta_Digital t1_j2cd4y1 wrote
In the absence of greed or threat, we'd live in a nice world.
ShalmaneserIII t1_j2cdbcx wrote
Hunting buffalo. Hunter-gatherer levels of productivity are about what people would do if they can't accumulate capital for themselves or if they're not coerced by external threat.
Meta_Digital t1_j2cdhdi wrote
So then is your argument that a productive world is better than one that is pleasant to live in?
ShalmaneserIII t1_j2cdsvu wrote
My argument is that a world without productivity is less pleasant than one with it. Do you like air conditioning? Running water for nice hot showers even in midwinter? Fresh veggies in January?
Basically, what you think of as pleasant- apparently being time to lounge around with your friends- is not what I think of as pleasant.
Meta_Digital t1_j2ce014 wrote
My idea of pleasant is a world where everyone's needs are met as well as some of our wants. Production matters only insofar as it meets those needs and wants. Excess production, like we're seeing today, only destroys us and the planet.
ShalmaneserIII t1_j2ci0e0 wrote
Which means you lose. You will be outproduced by others, and will not have the resources to stop them from doing as they wish.
thewimsey t1_j26kdvy wrote
>when capitalism is right there being the most banal and most evil thing humanity has yet to contend with
This is ridiculous.
>Angry rate downs despite rising inequality, authoritarianism, climate change, and the threat of nuclear war - all at once.
It's because you don't seem to know anything about history, when inequality was much worse, authoritarianism involved actual dictators and fascists, and the threat of nuclear war was much, much greater.
I'm not sure why you want to blame climate change on capitalism rather than on, oh, humanity. Capitalism is extremely green compared to the ecological disasters created every day by communism.
Meta_Digital t1_j26l7dy wrote
I don't know how to respond to this because it's clear it would be an uneven conversation. You're missing very basic required knowledge here. Inequality, for instance, is at its highest point in recorded history. Capitalism is a form of authoritarianism. Economic conflict turns into military conflict which increases the risk of nuclear war. Capitalism is not human nature; it's actually pretty recent and radically different from its precursors in several important ways. I have no idea what you're even talking about regarding communism or how it's even relevant.
ting_bu_dong t1_j24d3cx wrote
Elon Musk doesn't want to turn us all into paperclips. Yet.
https://en.wikipedia.org/wiki/Instrumental_convergence#Paperclip_maximizer
shumpitostick t1_j25dlxl wrote
The main problem in AI ethics is called "the alignment problem", but it's exactly the same concept that appears in economics as a market failure called the "principal-agent problem". We put people in charge and make them act on our behalf, but their incentives (objective function) are different from ours. The discussion in AI ethics would benefit greatly from borrowing from economics research.
My point is, we already have overlords who don't want the same things as us and it's already a big problem. Why should AI be worse?
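To make the parallel concrete, here's a toy sketch (entirely made up for illustration, not from the article or any real system) of a principal and an agent whose objectives diverge: the agent maximizes the proxy it is actually scored on, and the thing the principal really cares about falls out of the optimum entirely.

```python
# Toy illustration only: the "principal" (us) values both output and honesty,
# but the "agent" (an AI, or a manager) is scored on a proxy metric that
# rewards output alone. Maximizing the proxy drives honesty to zero, even
# though nobody intended that -- the shared shape of the principal-agent
# problem and the alignment problem.

def principal_utility(output: float, honesty: float) -> float:
    # What we actually want: output is worthless without honesty.
    return output * honesty

def proxy_objective(output: float, honesty: float) -> float:
    # What the agent is actually rewarded for.
    return output

def best_split(objective, effort_budget: float = 10.0, steps: int = 100):
    # The agent divides a fixed effort budget between output and honesty
    # so as to maximize its own objective (brute-force search).
    candidates = [
        (objective(q, effort_budget - q), q, effort_budget - q)
        for q in (effort_budget * i / steps for i in range(steps + 1))
    ]
    return max(candidates)

_, q, h = best_split(proxy_objective)
print(f"agent's choice: output={q:.1f}, honesty={h:.1f}")
print(f"what we actually get: {principal_utility(q, h):.1f}")   # 0.0

_, q, h = best_split(principal_utility)
print(f"if incentives were aligned: output={q:.1f}, honesty={h:.1f}, "
      f"utility={principal_utility(q, h):.1f}")                 # 25.0
```

The same picture works whether the "agent" is a CEO paid on quarterly numbers or an AI trained on a mis-specified reward.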
Wild-Bedroom-57011 t1_j25k9vs wrote
Because of how foreign AI can be. In the space of all human brains and worldviews, there is insane variation. But beyond this, in the space of all minds evolution can create, and all minds that could ever exist, a random, generally intelligent and capable AI could be the paradigmatic example of completely banal evil as it kills us all.
Fmatosqg t1_j296lsd wrote
Because AI is a tool that makes the same kind of output as people, but faster. So whatever good or bad things people do on a daily basis, AI does it faster. Which means more of it over the same period of time.
oramirite t1_j24bg3n wrote
The risk, when it comes to AI, is its linkage to these people. AI is a very dangerous arm of systemic racism, systemic sexism, and white supremacy. It's just a system for laundering the terrible biases we already exhibit back into our daily lives, even more so, under the guise of being unbiased. We can't ignore the problems AI will bring, because it's an escalation of what we've already been dealing with.
glass_superman t1_j24pzoq wrote
You'll not be comforted to know that the AI that everyone is talking about, ChatGPT, was funded in part by Elon Musk!
We think of AI as some crazy threat, but it might as well be the first bow and arrow, or the AK-47, or the ICBM. It's just the latest in a line of tools wielded by the wealthy for whatever purpose they want. Usually to have a more efficient way to do whatever it is that they were already doing. Never an attempt to modify society for the better.
And why would they? The society is already working perfectly for them. Any technology that further ingrains this is great for them! AI is going to make society more like it is already. If it makes society worse it's because society is already bad.
RyeZuul t1_j25otnr wrote
It's possible to care about more than one thing at once, and it is prudent to spread the word about the potential for AI to go haywire, just as it was to release the Panama Papers or expose the child abuse scandals in the Catholic church. Billionaires will almost certainly start deleting whole columns of jobs that AI can replace, while not being very interested in AI ethics, game-breaking innovative strategies, or unpredictable consequences. If we want to move to a better system of systems, we need to design our overlords well from the ground up.
AndreasRaaskov OP t1_j25f5g8 wrote
The article doesn't mention it, but talking about economics is definitely also part of AI ethics.
AI ethics helps you understand the power Elon Musk gets if he tweaks the Twitter algorithm to promote posts he likes and shadow-ban posts he dislikes.
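As a purely hypothetical sketch (nothing to do with Twitter's actual code; every name and number below is invented), the whole trick can be as small as one owner-controlled bias term in the ranking function:

```python
# Hypothetical sketch of how a small owner-controlled bias term in a
# feed-ranking function can quietly promote or bury posts. All names and
# numbers are invented for illustration.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    engagement: float  # organic signal, e.g. predicted likes and replies

def rank_feed(posts, boosted_authors, shadowbanned_authors, owner_bias=5.0):
    def score(post: Post) -> float:
        s = post.engagement
        if post.author in boosted_authors:
            s += owner_bias        # quietly promoted
        if post.author in shadowbanned_authors:
            s = float("-inf")      # effectively never shown
        return s
    return sorted(posts, key=score, reverse=True)

feed = rank_feed(
    [Post("critic", 9.0), Post("owner_ally", 3.0), Post("random_user", 5.0)],
    boosted_authors={"owner_ally"},
    shadowbanned_authors={"critic"},
)
print([post.author for post in feed])  # ['owner_ally', 'random_user', 'critic']
```

A change this small is invisible to users while completely reordering what they see.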
And the Koch brothers were deeply involved in the Cambridge Analytica scandal, where machine learning was used to manipulate voter behaviour in order to get Trump elected. Even with Cambridge Analytica gone, rumors still go around that Charles Koch and a handful of his billionaire friends are training new models to manipulate future elections.
So even if evil billionaires are all you care about you should still care about AI ethics since it also includes how to protect society from people who use AI for evil.
Rhiishere t1_j25wn9p wrote
Whoah, I didn’t know that! That’s freaking amazing in a very wrong way! Like I knew something along those lines was possible in theory but I hadn’t heard of anyone actually doing it!
bildramer t1_j2494g3 wrote
Saying "we are being ruled by evil billionaires" when people like Pol Pot exist is kind of an exaggeration, don't you think?
glass_superman t1_j249j7b wrote
Pol Pot rules no one; he's dead.
ting_bu_dong t1_j24djqb wrote
They said that people like him exist. The large majority of people like him are ruling no one, obviously.
Whether any current leaders/rulers/however-you-want-to-call-them are like Pol Pot is debatable... But, no, not so much.
glass_superman t1_j24nq3v wrote
Koch Bros are not as deeply depraved as a fascist leader but they have a much wider breadth of influence. They are more dangerous than Pol Pot because what they lack in depth, they more than make up for in breadth.
VersaceEauFraiche t1_j24pegd wrote
Yes, just like George Soros as well.
threedecisions t1_j24k2q6 wrote
I've heard that there are dogs that will eat until they die from their stomach exploding if they are supplied with a limitless amount of food. People are not so different.
It's not so much that these billionaires are necessarily especially evil as individuals, but their power and limitless greed lead them to evil outcomes. Like Eichmann's portrayal in the article: he was just doing his job without regard for the moral consequences.
Though when you hear about Facebook's role in the Rohingya genocide in Myanmar, it does seem as though Mark Zuckerberg is a psychopath. He seems to have little regard for the lives of people affected by his product.
oramirite t1_j24bkp9 wrote
No?