Sidivan t1_j4rgjox wrote

That’s a good point about content creation. The way we monetize content today is by consumption: the model measures traffic and assigns value to it. If AI is going to serve up all that content in the form of its own creations, who gets paid? Does everyone it references or draws on get a slice? In the above example about tigers, does the owner of every one of the millions of photos it referenced get paid, or are those simply inspiration? This battle over revenue is playing out right now between visual artists and AI art.


Sidivan t1_j4q8u6r wrote

This is actually kind of scary. That’s another level of trust.

Currently, when you search for something, you’re looking for content related to your search. These are independent sources, for better or worse, that you can evaluate on their merits. For instance, if I search for information about tigers, I can likely trust WWF, Britannica, etc… but if AI generates that info instead, I don’t have any insight into the source. I have to trust that the AI returned accurate, reliable information.

If AI generates a 3D tiger model with something obvious like two tails, that raises a mental alarm only because you know tigers don’t have two tails. What if you didn’t know that? What if, instead of two tails, it just made the black stripes too thick, leading you to believe tigers aren’t as orange as they really are? Would you notice?

Interesting times when we rely on AI generated content as fact.
