Now, if said AI is generating foraging books more accurate than human-written ones, that’s fine by me. Until that’s the case, we should be marking AI-generated books in some clear way.
The problem is, the LLM AIs we have today literally cannot do this because they are not thinking machines. These AIs are beefed-up autocompletes without any actual knowledge of the underlying information being conveyed. The sentences are grammatically correct and (mostly) read like human-written text, but the actual factual content is non-existent. The appearance of correctness just comes from the fact that the model was trained on information that was (probably mostly) correct in the first place.
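To make the “beefed-up autocomplete” point concrete, here is a deliberately tiny sketch: a toy bigram model that picks each next word purely from co-occurrence counts in a made-up training snippet. Real LLMs use learned neural representations rather than raw counts, so this is only an analogy for the statistical-continuation idea, and the corpus, function names, and sample output below are all invented for illustration.

```python
# Toy "autocomplete": the next word is chosen only from which words followed
# which in the training text. There is no model of truth anywhere -- fluency
# comes entirely from the statistics of the corpus.
import random
from collections import defaultdict

# Invented example corpus (illustrative only).
training_text = (
    "chanterelle mushrooms are edible and prized by foragers "
    "false chanterelle mushrooms are not edible and are often confused "
    "with chanterelle mushrooms by novice foragers"
)

# Count which word follows which.
follows = defaultdict(list)
words = training_text.split()
for prev, nxt in zip(words, words[1:]):
    follows[prev].append(nxt)

def generate(start, length=10):
    out = [start]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:
            break
        # Pick a statistically plausible next word, nothing more.
        out.append(random.choice(options))
    return " ".join(out)

print(generate("chanterelle"))
# Possible output: "chanterelle mushrooms are not edible and prized by foragers ..."
# Grammatical-looking, possibly dangerous, and the model has no way to know which.
```

The failure mode is the same in spirit as the foraging-book worry: the generator can happily splice “edible” and “not edible” together because it only tracks what tends to follow what, not what is true.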
I mean, we should still be calling these things algorithms and not “AI,” because “AI” carries a lot of subtext in people’s minds. Most people understand “algorithms” to mean math, and that dehumanizes it. If you call something AI, all of a sudden people have sci-fi ideas of truly independent thinking machines. ChatGPT is not that, at all.
If you really want to know what will absolutely happen if he gets reelected, look into Project 2025.
https://www.newsweek.com/what-project-2025-trump-shadow-network-plans-overhaul-deep-state-1825780
The reason I think this will happen is that it doesn’t really rely on Trump at all. He’ll just do what he’s told by these fascists and go along with it. Same thing he did before with the court packing, but this time on an entirely different level.
Removing career employees from the federal government is how you actually destroy the government’s ability to function. These are the people who actually know the rules AND FOLLOW THEM, because their jobs are on the line if they don’t. Unlike politicians, they can’t just lie to people and get back in power.