
trash_burner12 t1_j6tgzyb wrote

The arms race between AI-identifying software and AI software is potentially going to make it impossible to really "know" whether the AI-identifying software is even working. I'm under the impression that most, if not all, developers don't understand exactly what is happening inside these models at training or inference time.

Most court systems are sluggish and highly bureaucratic. They won't be able to keep up, or to decide which AI-identifying software to trust, when new systems are coming out so rapidly.

As others have said, media evidence could be accepted if it's verified as coming from a trusted source. Unfortunately that creates a sort of "who watches the watchers" scenario. It's sad, because for so long video evidence was basically an oracle of truth.

7

trash_burner12 t1_ir0lewd wrote

The problem is more general: a culture of releasing new technology without caring much about the repercussions, because that's the way the economic winds blow.

On the internet, companies are competing for your attention because attention is money, and they'll push any emotional button to get it.

Would the younger generation be better off spending less time on social media? Yeah, I think that's pretty much common knowledge now.

7