billtowson1982 t1_jaa2f0n wrote

Reply to comment by net_junkey in So what should we do? by googoobah

You don't know anything about AIs, do you? I mean, you read an article in USA Today, and now I'm having to hear you repeat things from it, plus some stuff you imagined to be reasonable extrapolations based on what you read.

0

billtowson1982 t1_ja8xpvl wrote

Reply to comment by net_junkey in So what should we do? by googoobah

They're only better in the sense that Google's answers circa 2004 were better than the average human's - both had access to an extremely large database of reasonably well-written (by humans) information. ChatGPT just adds the ability to reorganize that information on the fly. It doesn't have any ability to understand the information or to produce truly new information - two abilities that literally every conscious human (and in fact every awake animal) has to varying degrees.

1

billtowson1982 t1_ja76gf1 wrote

Possible collapse: If your concern is that you don't want to live to see society regress to a primitive state - don't worry, you won't. We have 8 billion (and growing) people who depend on modern, often just-in-time supply chains for life. In a collapse, with those supply chains gone and people with guns wanting to eat, most people will die, not live to see how it plays out (and that's not even addressing whatever causes the collapse in the first place).

Climate change: It's not the greed of powerful people that is the main driver of climate change - having billions in a bank account (or really just a number that reflects ownership of large percentages of the businesses that produce the goods and services we all use) doesn't mean one burns a ton of fossil fuels or farts out a lot of methane. In reality, it's consumption that drives climate change: the massive fossil fuel use that enables our current, many-orders-of-magnitude-higher level of consumption per person, plus a world population multiple orders of magnitude larger than in the past.

Also, climate change is being addressed. Quickly enough to save most non-livestock major mammals? No. Quickly enough to prevent hundreds of millions of currently poor people from dying or being forced into refugee status? No. But quickly enough to allow most kids in comparatively well-off countries to live basically normal lives, plus a house-destroying disaster or two? Yes. The quicker the better, though - every delay costs everyone.

Economy: No, temporary shocks like COVID and the Russian war on Ukraine aside, the economy is not getting worse - whatever you or your friends and family may feel. The economy is humming along quite nicely, actually.

AI: Your first worry about AI - because it's already happening now - is how manipulated you and everyone else will be (and in fact already are) by AI that is trained to engage you in order to sell you shit. As it turns out, the best way to engage people is to make them angry, anxious, sad, vengeful, etc., and to draw them into conspiracy theories and lies. This is also increasingly used to define your politics for you (and for everyone else). The worst of it is that AI is turning you (and everyone) into the worst version of yourself - addicting you to rage, fear, conspiracy theories, etc., all in service of the primary goal of selling you more shit (i.e. the same consumption that drives climate change).

Your second worry about AI should be how it will replace a ton of jobs, including in fields like art that most of us really wish would remain human, and how that replacement will cause massive social disruption.

Your third concern is that you are young enough that you might live to see the development of AGI, and that will probably be the end for humanity.

4

billtowson1982 t1_ja74jxn wrote

Reply to comment by [deleted] in So what should we do? by googoobah

1.) The idea that humans will ALWAYS be economically productive despite all possible future technological developments is just as much blather as saying in 1950 "in all of history humans have never gone to space, so they never will!" Whether AI ever develops to the point of being able to do all jobs better than any human, I don't know. But the possibility can't be ruled out, and certainly not by "it didn't happen in the past so it never will!"

2.) Strengthening your moral and ethical character is a good thing to do. But it's silly to believe that that is the way to get ahead in a career - a weak moral character can be as much an asset, maybe even more of an asset, to a person's career as a strong one.

0

billtowson1982 t1_ja74aj4 wrote

Reply to comment by net_junkey in So what should we do? by googoobah

1.) Whether AI is sentient or not is almost irrelevant to its impact on jobs or pretty much any other aspect of society. Something can be plenty intelligent without being sentient, and even a rather dumb being can still be sentient. AI intelligence (or in other words, capability) will be the main thing that affects society, not sentience.

2.) No AI today has the complexity of a brain by any meaningful measure. Even a brief chat with ChatGPT is enough to show a person how stupid it is. Further, today's AIs are all absurdly specialized compared to biological actors: powerful, but in absurdly narrow ways.

1

billtowson1982 t1_ja73q1x wrote

Reply to comment by PO0tyTng in So what should we do? by googoobah

Most big company CEOs are not the biggest shareholders in their companies. If AI really does get to the point where it can do almost every single possible job more efficiently than any and all humans, then CEO jobs are no safer than anyone else's. Zero, one, a few, or in theory (but unlikely in reality) all people will make production decisions for the AI, and no one else will do anything of economic value whatsoever. That doesn't mean some humans won't "work" - humans will still be able to make nice wooden tables, for example, but in such a world the AI could make better tables faster, cheaper, and with less waste of resources. For a person to sell a table they made, the buyer would have to want it because it was made by a human - despite it being inferior in every other way.

2

billtowson1982 t1_j8eef8j wrote

I'm fine with doing it either way. If people want to volunteer for a Mars trip, knowing the severe risks, good for them. People with explorer spirits have been doing that since time immemorial. In general, it makes much more sense to risk the lives of a few volunteers than to spend tens or hundreds of millions extra on safety procedures for a few folks, when the same money could easily be spent on healthcare services for the poor that would save many more lives.

Hell, I'd probably do it myself. What's the value of a life here on Earth? You live, you die, it was all pointless, and your only legacy is the resources you used and the environment you wrecked along the way. Whereas if you die on Mars... well, at least you got to see Mars!

3

billtowson1982 t1_j8e6ryc wrote

Didn't Biosphere 2 fail due to CO2 off-gassing from the concrete they used to build it? And that was in the '90s. That seems like an avoidable problem, and in general we ought to be able to do somewhat better a quarter-century on anyway.

Of course it would be vastly more expensive in space regardless.

4

billtowson1982 t1_j223j2v wrote

Maybe. But I think the basic flaw remains: that sort of agriculture is extremely labor-intensive and therefore expensive. It's hard (and unlikely) for us to scale up a very expensive, labor-intensive way of doing things in a world where most people, even in well-off countries, neither want nor can afford to spend a lot more on food than they currently do. And while labor is cheap in poor countries, another important goal is helping poor countries become better off.

Also, we reap a ton of environmental benefits by having most people live in or around cities. Move a ton of people back to rural areas so that they can work on labor-intensive farms, and we'll pay a big environmental cost for that too.

1

billtowson1982 t1_j1yav5q wrote

The problem is that the type of farming you recommend is very labor-intensive - which translates to expensive. It also can't be scaled up to feed 8, then 10, then 12 billion people unless people start wanting to eat a lot less meat, and probably not even then. Finally, and I mean no offense because you seem like a good and well-intentioned person, but you really shouldn't tell vegans (or anyone else) how they should feel about something. Some people oppose meat because the vast majority of it is factory-farmed, some have an ethical opposition to taking a mammalian life for food, some have a religion-based opposition to meat, and so on. Plenty of those people might oppose any animal-based agriculture, and that's fine. Just as it's fine for you to eat animals if you like.

Also, having worked on sustainable farms where animals are treated pretty decently, I can say with certainty that most are still not treated as "pets." That's just not a realistic view of how this sort of farming works right now, let alone how it could ever, under any circumstances, scale.

11