Frumpagumpus t1_jecynxk wrote
Reply to comment by Yangerousideas in AGI Ruin: A List of Lethalities by Eliezer Yudkowsky -- "We need to get alignment right on the first critical try" by Unfrozen__Caveman
he realizes AI can think very fast, but apparently hasn't thought about how software forks all the time and shuts processes down willy-nilly (he thinks death is silly and stupid, but software does it all the time),
or about other mundane details like what it would mean to mentally copy-paste parts of your brain or your thoughts, or what mutexes or encryption would look like for minds
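(To illustrate the fork-and-kill point, here's a minimal Python sketch, purely illustrative and not from the original comment, of how routinely software spins up a copy of itself and then terminates it:)

```python
import multiprocessing
import time

def worker():
    # Simulate some ongoing computation in the forked copy.
    while True:
        time.sleep(0.1)

if __name__ == "__main__":
    # "Fork" a child process running a copy of this program's code.
    p = multiprocessing.Process(target=worker)
    p.start()

    # ...and shut it down again without ceremony.
    p.terminate()
    p.join()
    print("child exit code:", p.exitcode)
```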