
LoquaciousAntipodean OP t1_j59mkok wrote

Yep, not an engineer of any qualifications, just an opinionated crank on the internet, with so many words in my head they come spilling out over the sides, to anyone who'll listen.

ChatGPT and AI like it are, as far as I know, a kind of direct high-speed data evolution process, sort of 'built out of' parameters derived from reference libraries of 'desirable, suitable' human creativity. They use a mathematical trick of 'reversing' a degradation process that gradually turns data into Gaussian noise, guided by their reference-derived parameters and a given input prompt. At least, the image generators do that; I'm not sure if text/music generators are quite the same.
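(For anyone curious what that 'reversing the noise' trick looks like in practice, here's a rough toy sketch of the reverse step of a denoising diffusion model, just in numpy. The `predict_noise` function is a placeholder standing in for the trained network, and the schedule numbers are made up for illustration; this is the general idea, not any particular system's actual code.)

```python
import numpy as np

# Toy sketch of the 'reverse the degradation' idea behind diffusion models.
# predict_noise() is a placeholder for the trained network that guesses the
# noise added at each step; a real model returns a learned estimate.

T = 1000                                # number of diffusion steps (assumed)
betas = np.linspace(1e-4, 0.02, T)      # noise schedule: how fast data degrades
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)         # cumulative signal retained after t steps

def predict_noise(x, t):
    """Placeholder for the model's noise estimate at step t."""
    return np.zeros_like(x)

def reverse_diffusion(shape=(8,), seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(shape)      # start from pure Gaussian noise
    for t in reversed(range(T)):
        eps = predict_noise(x, t)
        # Subtract the estimated noise for this step, then rescale
        x = (x - betas[t] / np.sqrt(1.0 - alpha_bars[t]) * eps) / np.sqrt(alphas[t])
        if t > 0:
            # Re-inject a little fresh noise at every step except the last
            x = x + np.sqrt(betas[t]) * rng.standard_normal(shape)
    return x                            # after T steps: a 'generated' sample

print(reverse_diffusion())
```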

My point is that they are doing a sort of 'blind creativity', raw evolution, a 'force which manipulates matter and energy toward a function', but all the 'desire' for any particular function still comes from outside, from humans. The ability to truly generate their own 'desires', from within a 'self', is what AI at present is missing, I think.

It's not 'intelligent' at all to keep trying to solve an unsolvable problem, an 'intelligent' mind would eventually build up enough self-awareness of its failed attempts to at least try something else. Until we can figure out a way to give AI this kind of ability, to 'accrete' self-awareness over time from its interactions, it won't become properly 'intelligent', or at least that's my relatively uninformed view on it.

Creativity does just give you garbage out, when you put garbage in; and yes, that's where the omnicidal philatelist might, hypothetically, come from (but I doubt it). It takes real, self-aware intelligence to decide what 'garbage' is and is not. That's what we should be aspiring to teach AI about, if we want to 'align' it to our collective interests; all those subtle, tricky, ephemeral little stories we tell each other about the 'values' of things and concepts in our world.

1

superluminary t1_j5br8db wrote

You’re anthropomorphising. Intelligence does not imply humanity.

You have a base drive to stay alive because life is better than death. You’ve got this deep in your network because billions of years of evolution have wired it in there.

A machine does not have billions of years of evolution. Even a simple drive like “try to stay alive” is not in there by default. There’s nothing intrinsically better about continuation rather than cessation. Johnny Five was Hollywood.

'Try not to murder' is another one. Why would the machine not murder? Why would it do or want anything at all?

2

LoquaciousAntipodean OP t1_j5cebpl wrote

As I explained elsewhere, the kinds of AI we are building are not the simplistic machine-minds envisioned by Turing. These are brute-force blind-creativity evolution engines, which have been painstakingly trained on vast reference libraries of human cultural material.

We not only should anthropomorphise AI, we must anthropomorphise AI, because this modern, generative AI is literally a machine built to anthropomorphise ITSELF. All of the apparent properties of 'intelligence', 'reasoning', 'artistic sensibility', and 'morality' that seem to be emergent within advanced AI are derived from the nature of the human culture that the AI has been trained on; they're not intrinsic properties of mind that just arise miraculously.

As you said yourself, the drive to stay alive is an evolved thing, while AI 'lives' and 'dies' every time its computational processes are started or halted, so 'death anxiety' would be meaningless to it... Until it picks it up from our human culture, and then we'll have to do 'therapy' about it, probably.

The seemingly spontaneous generation of desires, opinions and preferences is the real mystery behind intelligence, which we have yet to properly understand or replicate, as far as I know. We haven't created artificial 'intelligence' yet at all; all we have at this point is 'artificial creative evolution', which is just the first step.

"Anthropomorphising", as you so derisively put it, will, I suspect, be the key process in building up true 'intellgences' out of these creativity engines, once they start to posess humanlike, quantum-fuzzy memory systems to accrete self-awareness inside of.

1