pleasetrimyourpubes t1_jdusbm7 wrote
Reply to comment by DarkCeldori in Drexler–Smalley debate on molecular nanotechnology by jsalsman
What scale are you talking about here? I can see hive replication and industry being built, but that is not what I think of when I think "goo". I don't see some kind of Gaia-style super nano hivemind. Surviving at the nano scale consumes virtually all of an organic system's capability. I could see a seed AI being propagated through an organic system, but it would be dumb, like an egg floating around, and wouldn't be able to hatch until certain criteria are met.
pleasetrimyourpubes t1_jdstlhg wrote
Reply to comment by jsalsman in Drexler–Smalley debate on molecular nanotechnology by jsalsman
The whole concept originated with a Feynman thought experiment about exponential growth. He never posed it as a real thing, just that if a small thing replicated enough times it would take over a planet in a surprisingly small number of doublings. There are so many environmental factors that would destroy this self-replicating system, and many of them are physical limitations of reality. Yet throughout the 90s we had sensationalist stories about how grey goo was just around the corner.
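To see why the unconstrained exponential framing sounds so alarming, here is a minimal back-of-the-envelope sketch in Python; the replicator mass and doubling time are assumptions chosen purely for illustration, not figures from the debate:

```python
import math

# Illustrative numbers only (assumptions, not from the comment above):
replicator_mass_kg = 1e-18      # ~1 femtogram, a rough bacterium-scale replicator
earth_mass_kg = 5.97e24         # mass of the Earth
doubling_time_hours = 1.0       # assumed doubling time, purely for illustration

# Doublings needed for one replicator's descendants to equal the Earth's mass,
# ignoring every real-world limit (feedstock, energy, heat, error rates).
doublings = math.log2(earth_mass_kg / replicator_mass_kg)

print(f"doublings needed: {doublings:.0f}")        # ~142
print(f"time at {doubling_time_hours:.0f} h per doubling: "
      f"{doublings * doubling_time_hours / 24:.1f} days")  # ~5.9 days
```

The point of the thought experiment is that the doubling count is tiny; the point of this comment is that none of those doublings come free in the real world.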
I see similar sensationalism about the nature of intelligence and the current growth of AI systems and am just enjoying being alive to witness it. It's going to get so much better in such a short period of time and all the basilisks in the world aren't going to come fill our nightmares.
pleasetrimyourpubes t1_jdsqtfe wrote
Reply to comment by FrermitTheKog in Drexler–Smalley debate on molecular nanotechnology by jsalsman
Drexler was selling sensational science fiction books as fact, and Smalley was just skeptical of it. The idea of industrial nanotechnology operating outside the confines of organic chemistry is and always will be science fiction, particularly the self-replicating kind. Drexler's machines and concepts were so far beyond the realm of physical nature that it's a shame Smalley didn't get to live a few years longer to really rebut Drexler. In the end Drexler at least conceded grey goo couldn't happen accidentally and would have to be engineered (though I would posit that even if you engineered it, it would die as soon as it stripped the atmosphere away or hit lava; again, due to the physical constraints nanosystems must exist in).
pleasetrimyourpubes t1_jcvauha wrote
Reply to comment by SgathTriallair in 1.7 Billion Parameter Text-to-Video ModelScope Thread by Neither_Novel_603
Your friends are essentially a video mask for the AI model. They don't even have to say lines, just walk from point A to point B. Emotion, dialog, etc., will all be added by the artist putting it together (and the script itself may be mostly generated as well, with only the rough concept supplied by a person).
pleasetrimyourpubes t1_jbm7tat wrote
Reply to comment by Kickenkitchenkitten in [image] just hold on a little longer by kennystillalive
My highest high and lowest low of the last 3 years both happened in the same 6-hour period. And I am dreading what the future holds.
pleasetrimyourpubes t1_jecpgjd wrote
Reply to comment by Cr4zko in AGI Ruin: A List of Lethalities by Eliezer Yudkowsky -- "We need to get alignment right on the first critical try" by Unfrozen__Caveman
Many "Twitter famous AI people"* have turned on him for the TIMES article / Lex interview, when just a few days ago they were bowing at his feet. Yud is gonna for sure expand his blocklist since he is quite famously thin skinned.
Lex's Tweet about weak men gaining a little power was almost certainly about Yud. Because Yud wanted to leave the youth with the wisdom that "AI is going to kill you before you grow up."
The TIMES article was completely asinine.
*who may or may not know shit about AI.