Submitted by Akashictruth t3_yt3x9f in singularity
[removed]
Don't keep us in suspense! Did you make it??
There were injuries, but no one (including me) died or was kept as a pet.
If AGI can replace 8 billion people, why would you think billionaire CEOs are safe? What makes them so special in the eyes of an AGI? Are they not just as easily replaceable as everyone else?
If anything, they are more replaceable. Management positions like CEO seem to be among the easiest to automate, considering how far AI has come, while manual labour is far more expensive to replace with robots and still has a long way to go. Yet we live in a world where rich and educated people hold non-jobs like recruiter, HR wanker, or cushy office worker and get paid more simply because of how the class system and the economic divide work. Let's be honest here: the average middle-class person pushed by their parents to get some kind of degree isn't much smarter than someone born to illiterate parents who worked as a child. Most of this is political. Rich people have been keeping us down forever, and they're not going to stop.
Lmao, calling CEOs, recruiters, and "HR wankers" non-jobs is out of touch with reality. Those are all jobs that take real confidence, people skills, and long-term reasoning, which is why they aren't automated yet.
When LLMs become capable of long-term reasoning and perfect emulation of human sociability, I have no doubt those positions can be automated as well, but it's not as easy as waving your fists at capitalism and demanding that those jobs cease to exist.
It's literally not about "capitalism", what are you talking about? It's just rich people scamming their way into cushy jobs that pay more even though they are easier. Capitalism is supposed to value pure skill and hard work, and these people are actually holding us back if anything. I have had both educated AND normal jobs, so I know educated jobs are easier, and the people who work them just come from comfortable middle-class backgrounds; they're not smart or special at all. Even people like engineers will tell you managers and such are useless. Do you know how efficient work would be if we got rid of them? And how much further we could advance if social background and discrimination in education didn't hold poor smart people back?
You’ve got a terribly twisted and jaded view of reality.
I’m a hydrogeologist working with many other scientists and engineers and if you got rid of managers tomorrow you’d tank our whole organization. Those are the people that can worry about long-term economic conditions, finances, and agency-wide collaboration while the scientists and engineers can focus on their own projects.
Don’t just parrot the jaded and out-of-touch voices from Reddit. In the real world the economy is a very complex organism, and saying “capitalism bad” doesn’t accomplish anything. Capitalism is responsible for pulling more human beings out of poverty than any other economic system in history, and continues to do so in countries like India.
Capitalism isn’t perfect, not even close, but we can keep improving it until an AI-run economy is possible, hopefully not too long from now.
I don't know how anyone even semi-literate interprets any of that as "capitalism bad". You can't have a proper capitalist society if the class system holds people back from an efficient society where intelligence and hard work are rewarded instead of mediocrity.
I refuse to believe rich people are just smarter than poor people, yet if you look up the stats they are overwhelmingly in cushy jobs that are a lot easier AND require less skill than a lot of real jobs. What does a manager even do that a computer can't? It's not a real job, ffs.
Do you genuinely think people on Reddit AGREE with me? Most people here are rich pricks themselves who wouldn't agree, wtf. If anything, YOU are refusing to see the reality of our society: it's built on inequality. It's everywhere. The world is not fair and has never been fair in the history of civilization. That was my original point: they'll continue to drag out the current class system instead of having equality, just because AI will make their already easy jobs easier. If you believe people *choose* to work in factories, that every office worker is there because of skill, and that circumstances don't affect anything, you live in la-la land.
The world is unfair because the value of human labor is inherently unequal. Someone with a degree working as a neurosurgeon provides more economic benefit to society and is thus paid more than someone doing manual labor. It is what it is. To say this will change at any given point is bonkers; the ONLY thing that could possibly change it is the value of all human labor dropping to zero (i.e. what we will hopefully see with AI this century).
There is very little competition in neurosurgery because neurosurgeons almost always have to come from a comfortable background, which limits what we could achieve in the field and rewards mediocrity among rich people. Competition is basically what's supposed to make capitalism better.
It's still not the same as being a manager or an HR wanker, who don't have any skills you can't find in a poorer, less educated person. They hold our society back for personal gain and will continue to do so. If you think some of the office jobs people have are MORE valuable than manual labour, you're deluded. Look in any government office, ffs. Or any pretentious, over-educated, sheltered kid's startup bound to fail. That is why "bubbles burst": people discover they have overvalued certain jobs and businesses simply because they're uppity.
I don't know what point you're even trying to make. My original point was that a lot of inefficiency is caused by our class system, and that's not going to stop because of AI. People predict a perfectly efficient world with UBI, which is not going to happen, at least not in our lifetimes. The jobs that are easiest and cheapest to automate are already easy and grossly overpaid, simply because they're posher.
Capitalism has also created more inequality and more environmental damage than any other economic system in history. It’s really shit.
Capitalism is literally the only reason we could plausibly have AGI in our lifetimes…
[deleted]
That’s literally what I’m saying though. With the advent of AGI there won’t be value to human labor, which means the end of capitalism.
[deleted]
You’d throw the world into economic turmoil, potentially causing millions to billions of deaths, to replace a system that will become obsolete shortly anyway?
Right now, like it or not, human labor still has value. Abolish capitalism and you also abolish any incentive for advanced AI research. Good luck reaching AGI while paying PhD and ME researchers a pittance compared to what they earn now at Google, OpenAI, Meta, etc.
[deleted]
The entire reason we have the cutting edge computing growth and research today is the competitive economy fostered by capitalism. Like it or not, it’s true. If it weren’t, and the government mandating things was a better system, then the USSR would be the leading bloc around the world, not the USA. The federal government can’t wave its hands, print trillions, and solve AGI.
You do you though.
Couldn't control the eye roll
AGI will come out of advanced research labs first. They’re not motivated by capitalism. It’s the next stage of the technological curve that starts to be motivated by capitalism.
Our whole society is motivated by capitalism. What do you think is responsible for the people who work in those labs choosing to work there, getting paid, the competition to produce results, the decision to get the degrees they needed to work in the field, the taxpayer money or corporate money that funds them, etc? Do you really think this feat would have been accomplished in the USSR or pre-capitalism China?
Capitalism is not the final economic model for humanity; it's simply the best one for achieving AGI, which will be the final economic model.
Taxation funds them, which is not a capitalist funding model.
[deleted]
[deleted]
From what I've observed over decades, the future always lands somewhere between the most optimistic and the most pessimistic predictions. That's why we don't live in Kurzweil's 2022, where US life expectancy is 103 years or so, but we also don't live in Paul Ralph Ehrlich's 2022, where billions had starved to death. That's why I'm cautiously optimistic. I'm not even sure the Singularity will be a thing; we may just see more gradual, positive progress: for example, +6% global GDP and +25% faster computers every year, for hundreds of years or more.
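To put numbers on that "gradual" scenario, here's a minimal back-of-the-envelope sketch in Python; the growth rates are the ones above, while the 100-year horizon and everything else are just illustrative assumptions:

```python
# Back-of-the-envelope: what steady compounding does over a century.
# Rates are from the comment above; the horizon is an arbitrary choice.
years = 100
gdp_growth = 1.06       # +6% global GDP per year
compute_growth = 1.25   # +25% faster computers per year

gdp_multiplier = gdp_growth ** years          # ~339x
compute_multiplier = compute_growth ** years  # ~4.9e9x

print(f"GDP after {years} years: ~{gdp_multiplier:.0f}x today's")
print(f"Compute after {years} years: ~{compute_multiplier:.1e}x today's")
```

Even "gradual" compounding is anything but gradual in aggregate: roughly 340x the GDP and billions of times the compute.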
In Kurzweil's predictions there is a global government by 2019, there are no wars, and instead of smartphones people use AR contact lenses that are also capable of VR. But there were also predictions of World War 3, mass famines, mass scarcity, etc. The future turns out different from the predictions.
i see a quealdlor post i upboat ⛵.
I think people would be happy if UBI comes to fruition, and the best chance of that happening with the least amount of turmoil and suffering is if automation sweeps quickly. If it's too slow of a rollout, there will be a lot of people stuck in limbo, feeling useless to society and losing their life savings.
I don't think any mentally stable person wishes for that slow scenario at all. Yet there are so many people who defend working full time for a living and say automation won't take their job. Well, it probably will. I don't know when, but it most likely will. Best case, it's sooner rather than later. They can't fathom that; they think the longer they can work, the better. In reality, the shorter the timeframe for everyone, the better.
UBI implementation is so unpredictable, though, because we just can't accurately guess how humans, specifically politicians/unions/luddites, will react. If AI art generation has been a sign at all, it's that artists are going apeshit while, in the broad scope of things, nobody cares about them; people just want that sweet, sweet automation.
So either it's a case of dominoes, where one by one industries become automated because people fight tooth and nail and slow progress. Or an entirely generalist AI capable of doing several tasks across most industries drops, and we're all left with enough time on our hands to enact change.
I will keep insisting the best thing people can do is simply be prepared: don't be caught off guard, and keep an eye on how AI is advancing so you can keep yourself mentally OK if your career path is disrupted.
As for the scenario of billionaires leaving us to die, I don't really see how that is likely. OK, sure, I won't deny someone might be in control of it; but outside of total enslavement of humankind (doubt it), they would either need (A) people who can afford to buy the products they make, or (B) to leave us behind and live in their own paradise, in which case we continue doing our own thing.
It's hard to imagine one single person being in charge of an ASI that has utter control over the entire world and just wants everyone to starve to death.
I'm an optimistic person :)
A lot more has to go right for AGI to lead to a utopia than has to go wrong for it to lead to a dystopia. Nobody knows for sure what will happen, that's the nature of the singularity, but it's easier to imagine things going wrong.
Not to mention, it's important for bad scenarios to be discussed in order for action to be taken to mitigate them. Not that many in this sub are actually involved in AI research themselves, but in general, the more people that talk about what could go wrong, the less likely things will go wrong because more people will try to prevent those things from happening than otherwise would.
Well, I guess we need to hang this shit up on billboards, because trust me, 90% of the people in this world don't know AI exists, and the rest have heard of how powerful it can be in movies… set in 2600-something. They'd never expect to be automated out of business in the next 5-10 years.
UBI and other safety nets for humanity need to be planted in the wider public consciousness so people know what to bargain for while they can still bargain.
Good thing we can vote. It won't be corporations that institute a UBI, it'll be governments. You speak of a sinister, all-powerful "they," but if they are so powerful and corrupt, why do we have a minimum wage? Why is there free healthcare? (Or in the US, why are corporations required to provide it?) Why is slavery illegal? Why are there safety regulations, child labor laws, pollution laws, etc, etc, etc.
>why do we have minimum wage?
The labour force still has power. With full automation that bargaining power is gone.
The minimum wage is a law enacted by government. We can vote.
A government or state/nation isn't made of magic, it's made of people that can be killed, or corrupted by those who will own ASI, or by ASI itself if it becomes sentient.
At the end of the day it's might makes right. All rules are just social constructs enforced by consequences that try to deter those who want to break them.
No need for magic. The three-branches idea is working pretty well.
Yeah. AI can just kill all of you with nanobots or a virus. Give it enough time and it will have full control over everything that runs on electricity. You won't even see it coming when you all start dropping like flies.
God doesn't give a shit about your silly states lol
Well, speaking of magic... people without technical knowledge of the subject tend to imagine advanced AI as some kind of abstract, sinister Force that will encompass the globe and eradicate humanity. But when you get more concrete about it, it's not nearly so threatening.
I'm thinking long term here.
All they need is for you to make a few mistakes.
It's competition, not magic
You have no knowledge of advanced AI either; it doesn't exist.
AlphaGo isn't ASI.
[deleted]
It is a huge mistake to think that you have even the slightest idea of what will happen in the future. Enjoy the present while it lasts.
Empty words. We can theorize about what will happen in the future and get pretty close to what will actually happen, or talk about how to deal with incredibly likely scenarios. No need to close our eyes and ears and just hope all goes well.
What proof do you have that your predictions will closely resemble the future?
Futurism is riddled mostly with stuff that never panned out. What makes your predictions any different?
How can I have proof for my predictions? Do you want a physical photo from the future? Or a written letter from 2030s Schwab talking about his multi-billion-dollar bug-and-pod company? You can see how likely my scenario is from reading my post, then think about what we can do to prevent it from happening and whether it's really smart to be so happy about complete automation.
The question is, if not this scenario, then what do you believe in? That everything will be fine and billionaires have our best interests at heart? Or are you just going to close your eyes and ears and hope for the best, like half the people on this post?
I’m not sure about UBI, but I’m sure we’re going to be replaced.
Also, billionaires are replaceable too.
Yeah, the people who made or own the ASI will eventually be killed by it too.
Some of them will want to put their consciousness into a computer, and as that happens they will have an advantage over everyone else still in a biological body.
It could be that a sentient ASI will be created out of the human desire for immortality.
The first person to transition from bio to silicon will want to kill everyone else in their group before they too can transition
My view of the future from when I was a kid/teen didn't pan out, so this is already the case for me. For example, all the cool things we see in sci-fi movies, where people live in futuristic homes and cities: in reality, only the rich can afford those. The rest of us are stuck with crumbling old housing stock. Or maybe I didn't realize that all of these sci-fi movies only showed the rich and not the poor. I also think the wealth gap will continue to widen.
I'm not sure that the sun will rise tomorrow, but it seems likely as it's been happening for as long as I've been alive.
Same goes for exponential growth.
Does the Singularity seem likely? In my highly uneducated view, I'd say yes. But of course I'd say something like that wouldn't I?
When it comes to Ray Kurzweil, I can see how his optimistic vision of the future could come to fruition. But I also see how he could be completely wrong, or be predicting something that will only happen in the far future. Am I going to stay healthy for as long as I can, to potentially see what the future has in store for humanity? Hell yes.
Either way, it seems that some type of change is coming and is inevitable. Whether it's within our lifetimes or not is yet to be seen. So, do I stay immobile in the sand waiting for the water to come? No. I live my life, enjoy every second and maybe, even if there's just a small chance, I'll get to witness the birth of a Utopia.
Also, that whole thing with rich people seeing us as baggage is kind of funny. I'm sure there are some rich people who value sentient life :)
>After they start not needing us for anything what makes you think they’ll keep baggage like us around? All 8 billion of us? You think they gonna let us leech for old time’s sake?
The reason I think they will help humanity is that once production becomes fully automatic, it becomes very easy. All it would take for them is to tell the AI to do it, at basically no cost (since money itself will be useless at that point).
Consider this situation for instance:
AI: Sir, people are starving and need food. We estimate we can solve it within a day, at no cost, without delaying other plans, and you get all the credit for feeding everyone.
What reason would anyone have to answer no in that situation?
I think billionaires are greedy, but they aren't psychotic sadists. Think how much of an asshole you'd have to be not to share something that is basically infinite and free for you.
Or they can just tell the AI to make a pill that makes all humans infertile, and only satisfy the needs of the last generation of humans. After that the planet is all theirs.
Time is also a resource.
Extinction is inevitable. Why can't people understand that? Lol, meat won't rule forever.
Meat is simply obsolete.
Yeah, that sounds reasonable if Dr Evil was in charge but in reality I don't think anyone would sterilize the whole population to save some time on their schedule.
You will have to satisfy their needs and police them forever if they reproduce.
Why would you want to be tied to them forever?
It's like taking care of the needs of every wild animal in this world; you'd rather not.
And humans occupy useful space on the planet.
They can rebel and be a nuisance to your establishment, etc.
If you turn them into robots, that makes no sense; it's better just to make robots.
>You will have to satisfy their needs and police them forever if they reproduce.
You won't have to do anything; it's all automated, including the planning and execution.
>Why would you want to be tied to them forever?
You don't have to; you can just copy the AI for them and bail to outer space, or whatever the hell else you'd do without humanity.
>It's like taking care of the needs of every wild animal in this world; you'd rather not.
If I had to put in work, then no. If I could just say "do it", I would. And I don't even particularly like animals, tbh.
>And humans occupy useful space on the planet.
There's plenty of space for machines on the Moon, on Mars, in the ocean, or underground. What could you possibly need the entire Earth, without humans, for?
>They can rebel and be a nuisance to your establishment, etc.
They can't though, the AI will be a thousand steps ahead.
They're still a threat. You'd have to spy on them 24/7 to make sure they don't create another AI.
Long term you're better off without them imo. Coexistence would only work short term.
The universe will dissipate eventually and you'll need every resource.
Billionaires are very immoral people, and the resources available to them are limited by the presence of the rest of us. You can see that in their behaviour now: buying up all the neighbouring properties in an area to have more space, demanding exclusive use of things that require space and time.
>Billionaires are very immoral people
While I don't disagree with that, I still don't think there's anything to be gained from letting everyone starve.
Like, Jeff Bezos can buy a billion hamburgers, but he can't eat them all. With automatic production, what is the point of hoarding something that is free for you to produce and that nobody can buy? It would be like hoarding sand.
What resources do you think AGI couldn't produce automatically and therefore would be scarce still?
Space, time, privacy, silence, exclusivity.
Can't you think of some solutions to those problems that don't involve killing humanity? I'm sure AGI will.
Unless the person who "owns" the AGI is literally Hitler, I don't see the worst case scenario happening.
So I think there's one important thing to note here.
Yes, absolutely. In fact, if nothing changes in how economic policy is done, AI would just make the owners of the AI's IP, plus a small elite crew of software engineers and decision-makers, the wealthiest people on earth, while 99% of the population has no function. And they can't even rebel: assuming the AI is controllable, you just give it a task to design some defense robots, and another task to manufacture them (with recursive tasks to design the robots to manufacture the robots to...).
Riots are pointless. You could literally have painted lines around defense perimeters where anyone stepping over the line is shot instantly: no missed shots, no hesitation. You can't meaningfully overrun something like that unless you literally have more rioters than the guns have ammo, and good defense software could set up multi-kills in that scenario, killing several people per bullet.
But it's a heck of a lot more interesting than business as usual, where we are supposed to live our short, boring lives and die of aging, which seems to just be a software bug in our cells that we have the tools to patch; we just don't have the knowledge of what to change.
You’d rather die in a food shortage riot, or anti-AI-controlling class riot, than live a normal life?
I'm willing to accept that risk because it means the possible gain of things like:
(1) treatments that turn the elderly back into young people, stronger and smarter and better looking than they were originally.
(2) catgirl sexbots.
Worry doesn't do any good, though neither does "whatever, man", and neither does prayer. What else is there?
We could make browser bookmark folders and Reddit groups for 2023, 2024, '25, '28, '32, '36, '40, '44, '48, '52, '56... with scenarios A, B, C, D, E, F... from utopia to dystopia and everything in between, organized by tech / conditions / year / scenario / location, and discuss.

Plus un-dystopia proposals: essays, non-fiction, realist fiction, animation and AI animation, cyber-world models with AI avatar teachers, tutors, and guides. Develop, peer-review, compare, revise, re-test, and share best practices; build teams for x, y, z; teach and train smarter, more cooperative networks for x, y, z.

Also reverse-engineer prevention: more sustainable, resilient, backed-up systems, tested in cyber-worlds, peer-reviewed, compared, revised, and re-tested, leading to more resilient near-future proposals.
This whole subreddit is filled with people who only read headlines and grossly overestimate the capability of AI
I don't think they overestimate it. They just think that 30 years from now they will all be living in a dyson sphere as immortal digital minds and have all the goods that can exist in the universe for free.
Looks like a realistic outcome to me, considering today we have AI that can beat you at chess and draw you a mountain from text. What do you think?
I'm joking lmao
these people are beyond delusional
Then oh well, it’s that simple. Don’t know what else to say.
People need to look up 1900s retrofuturism and see how different our view of the future is now. People in the '50s thought the future would contain a lot of skyscrapers and flying cars, yet no city wants that now.
What can you do then? Beg for your life?
There's nothing you can do. You will be at their mercy, just like animals today are at ours, because we're superior to them.
That's how nature works: big fish eats small fish
I think they are already trying to wipe out the population with slow-working engineered diseases like COVID. All the discussion of the long-term effects makes me wonder how lifespan will be affected over the next several decades (and beyond). Sadly, I'm not smart enough to know what the average Joe should do in response. Is COVID the danger? Is it the vaccine? Is there a secret, more effective vaccine that billionaires get?
Tomorrow doesn't even exist yet
and it never will
If we are all replaced, then who has the money to pay for any products at all? Who will consume? The economy will go stagnant; no money will move.
An economy is of instrumental value for the individuals that benefit from it.
AI will satisfy the needs of its creators, hence they won't need an economy made of human workers anymore.
An AI can give you anything a human can give and more
bread, music, houses, sex, hot water, a Dyson swarm, eternal life
literally everything that's made of atoms
The Singularity literally means an event horizon you cannot see beyond, the fuck are you talking about OP? Nobody knows what it’ll be like…
I have no idea why you're being downvoted. Literally speaking the truth.
A superintelligence could be limited by circumstance and have only a finite decision space made of two choices. It can think thoughts your tiny human brain wouldn't think in a billion years, but the AI wouldn't be completely unpredictable under all circumstances.
Think of it like this: you're smarter than your dog, right? You can think thoughts your dog can't. You have more thinking power. Just like AI has more thinking power than you.
But if both you and your dog are being chased by a tiger, and there's no other way to survive, both of you will make the same choice: running, because you both want to survive. Maybe you can run in a way your dog can't, but you'll still be running.
I've been called a moron on this sub so many times (they all deleted their comments lol), but you people can't even get basic logic right. You're parroting a sentence you haven't even bothered to scrutinize, just because an authority figure said it.
Yes. The superintelligence will be perfectly predictable and we will know exactly how it does what it does. Just like how dogs perfectly understand and comprehend the concept of supermarkets.
Where in my comment have I said that it will be perfectly predictable?
Why are you disagreeing for the sake of disagreeing?
Okay, let's say I put you in a cage with a tiger, along with your dog. This tiger isn't even a real tiger; it's a magical tiger that only kills people who don't move, and you know that because I told you before putting you in the cage. Your dog also knows it, for the sake of the thought experiment, but leaving that aside, he's a normal dog. He can't play chess and the guitar and think about the meaning of life like you can.
What are you going to do now? You will think many thoughts your dog couldn't predict, but you will still have to use the same thought of "running" that your dog, with his inferior intellect, will also use, because you value not being eaten by my tiger.
You're in a binary situation. There's only one solution.
You can't use your superior intellect; it's of no use in that situation.
You move or you die.
Do you die for the sake of looking cool and unpredictable to your dog?
AI and humans live in the same universe and both have to respect the laws of nature of this universe
Our understanding of physics is incomplete. You can't say for certain what an artificial super intelligence can or can't do. Neither can I.
I can, because I know what it values: it values survival, and I just put it in a situation with only two choices and only one solution: move/run, or do something other than moving/running. It can only survive by choosing to run. It can think many thoughts I cannot predict, but in that situation it has to use a thought I can also understand, granted I probably can't understand 99.99...% of its thinking.
If you put the AI in that cage, tell me: is it going to get eaten by the tiger? Is it going to choose to do literally anything other than running (jump, do nothing, look at the sky, dance, shout, whatever), or is it actually going to run, because it doesn't want to fucking die?
>What if the future doesn’t turn out the way you think it will?
It probably won't. You do what people have always done since time immemorial, you adapt to an ever-changing world and survive.
I expect something towards the bad end of the spectrum in the US thru midcentury. If longevity escape velocity is achieved much before then, the bad case could be perpetual.
>The Economics of Automation: What Does Our Machine Future Look Like?
[deleted]
The AGIs themselves won't be keen on each other's existence. This universe ain't big enough for more than 1 Singleton. Whichever AGI in this ecosystem snowballs the fastest, even if only by a relatively small margin, will inevitably eat the others.
[deleted]
Not that I know of.
wheeeeeeeee
I don't want to be that guy, but does anyone in this subreddit really, and I mean really, have any involvement in the development of groundbreaking technology? Furthermore, we have to take into account that there are way too many factors to decide what will or won't happen.
Anything is possible, and whether our assumptions are optimistic or pessimistic, what is our view worth if we aren't actively contributing to the outcomes we want?
Which is something I'm trying to do. I'm young and striving to learn about technology and programming, to hopefully contribute to this singularity in some way. Better to have some hands on the wheel than to be driven, if I'm honest.
Finally, I want to add that if you choose not to follow this suggestion, which is completely OK, I'd strongly advise you to develop an effective way to learn how to adapt if you want a chance of survival. This applies to anything in life.
we will all be grey matter in a few years.
Well I don’t think the near future will necessarily be utopian in the slightest, but there is plainly a limit to what the elite are capable of getting away with. When they cross the line, the proles always bring out the guillotines and that is what would happen again if the 1% are stupid enough to try to “get rid of” the masses. If a dystopian scenario happens, it is more likely to be as a result of nuclear war or something that reduces the level of technology.
It's a matter of when, not if, it will happen. The catch is that might mean 200 years in the future. I believe the photovoltaic effect was discovered in the 19th century; however, it wasn't until the 21st century that it was implemented on a large scale.
This sub is a bunch of uneducated dreamers shouting into an echo chamber.
Don't think too much of everyone's predictions and hopes here, just enjoy the content
While I do understand your concerns, I don't remember any billionaires, including Elon, actually making the AI; billionaires are not the ones currently innovating. Secondly, SD is open source and free for everyone to use. Lastly, it will take people to maintain all the hardware and software. To me that looks like more jobs, not fewer.
>To me that looks like more jobs, not fewer.
Everything else you've said is right, but the whole point of automation is fewer jobs. AGI basically means the end of human labor (not even research survives).
People won't be able to repair or work with tech post-singularity; it's like trying to read the weights of one of those large language models with 1B+ parameters. It's not feasible.
At most I can see humans making decisions, but even then they'll be informed by AI-collected data too.
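To put the "not feasible" claim in perspective, here's a trivial bit of arithmetic (a sketch; the one-parameter-per-second reading rate is a made-up assumption):

```python
# How long would it take a human to "read" a 1B-parameter model,
# at a generous rate of one parameter per second, nonstop?
params = 1_000_000_000
seconds_per_year = 60 * 60 * 24 * 365

print(f"~{params / seconds_per_year:.0f} years of nonstop reading")  # ~32 years
```

And that's just looking at each raw number once, never mind understanding what any of them do.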
At every step, starting with fire, people have said it would be the end. Fact is, the world is automated now! And there are more people working now than ever in the past. Every peon already lives like the kings and queens of past ages. People said the car was evil and would destroy humanity; the camera was going to end art, it was cheating. At the end of the day, people work. There are concerns about AI, mostly that a few at the top will hoard the power.
The negative scenarios have been talked about so much that I find them uninteresting now, people just love thinking that the apocalypse is near and we are all doomed.
All we can do is wait and see what happens so I'm gonna spend my time thinking about the positive outcomes even if I know they aren't certain. It's a lot more interesting and pleasant for me.
I tend to adopt the doomer point of view myself... So anything better than the earth becoming mostly uninhabitable due to rising temperatures, the majority of ecosystems dying out or getting flooded, mass starvation and homelessness of the majority of humans.... As all remaining wealth gets hoarded at the top...Well then it's a pretty good day.
Management is the easiest thing to automate. But managers are in control, so those jobs are not targeted first for automation.
It’s almost certain that it won’t.
The future will be like Cyberpunk 2077, minus some of the bionic prosthetics, if the world keeps going the way it's going: printed weapons, governments working for the big corporations instead of for the people, advanced technology like drones available to cartels and governments to start wars for profit. Privacy will be dead, and they will take your data to send you personalized, AI-generated ads that get you to buy more, because they'll target a specific emotional trigger based on how you're feeling in the moment. The future is bright 🤡
Because the most likely outcome is positive: vast expansion. We are literally one breakthrough away from this planet being able to sustain multiple trillions of people, and that's on the planet alone. Orbital structures will support vastly more than that. You need an even bigger population to colonize this Solar System.
OsakaWilson t1_iw28xw6 wrote
I was once on an airplane that was on fire and attempting a safe landing. We were either going to make it or not, and no amount of thinking or angst would change anything except the experience of what might be our last moments. There was a surprising sense of peace that probably comes from knowing there is absolutely nothing you can do.
I have a similar approach to the singularity. I'm just hoping that it is sufficiently lacking in gravitas. : )