An analogy occurred to me the other day.
Oct. 16th, 2024 11:36 am
You can print your own paper money at home, of course, and it can sometimes fool someone who isn't paying much attention. A really dedicated counterfeiter can fool a lot of people. The last mile of craft goes on more or less forever, in terms of accurate fiber content/distribution, ink consistency, foils and inclusions, etc., but generally you pick a use case for your counterfeits and determine a "good enough" fidelity level.
The thing is, though, the value of money doesn't come from the fidelity of the object to a physical standard; it comes from recognition of the authority and integrity of the central bank that issues it. In other words, it's just a symbol that refers to an external thing. If you produce a quantity of perfect, indistinguishable counterfeit bills, you can use them to fool nearly anyone, cause a lot of chaos, and greatly profit in the confusion. But what you can't do with them is magically control the central bank to change the quantity of money that's known to be in circulation. You haven't actually produced money at all.
This is also how "knowledge" works: language operates as a system of symbols in reference to actually-existing external things (or hypothetical things that in turn relate to actually-existing things by analogy, etc.). If you produce flawlessly plausible language but are not correlating the symbols to the proper external referents, you aren't exercising "knowledge;" you're doing something else. Specifically, what you're doing is counterfeiting. Maybe your counterfeit will be good enough to trick someone out of some drugs or a meal or a jug of Tide (figuratively speaking), or maybe you'll get caught with your pants down and find yourself in a world of hurt.
When you hear these LLM guys saying shit like "our next model will reason at a graduate school level" or whatever, they're asserting that if we can make our monopoly money realistic enough, it will transform into real money. But they don't command the central bank, because an LLM does not have any concept of symbols and referents that would allow it to "reason" or possess even a single fact's worth of "knowledge"; all it can do is correlate statistics about how language tokens have been placed together in its training corpus. It can print fancy paper, but it can't make money.
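If it helps to see what "correlating statistics about token placement" means in the most stripped-down form, here's a toy sketch: a bigram chain (nothing remotely like a real transformer; the tiny corpus and the `babble` function are my own invented example) that produces locally plausible word sequences purely from co-occurrence counts, with no connection between the symbols and anything they refer to.

```python
import random
from collections import defaultdict

# A tiny made-up corpus. The "model" below knows nothing about banks or
# money as referents; it only knows which token tends to follow which.
corpus = (
    "the bank issues money the bank backs money "
    "money is a symbol the symbol refers to the bank"
).split()

# Count successors: for each token, the list of tokens seen after it.
follows = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    follows[a].append(b)

def babble(start, n, seed=0):
    """Emit n tokens by repeatedly sampling a statistically likely
    successor. The output is fluent-looking paper, not knowledge."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n - 1):
        successors = follows.get(out[-1])
        if not successors:
            break
        out.append(rng.choice(successors))
    return " ".join(out)

print(babble("the", 8))
```

Every adjacent pair in the output really did occur in the corpus, so each local step looks "right" — which is exactly the counterfeiter's good-enough fidelity, scaled down to a party trick.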
Are real and counterfeit money actually different, though? In any way that matters? Well, if all you're trying to do is pass off a bill to acquire some goods, it can maybe seem like there isn't one. But keep doing it, and I bet you'll eventually hit a scenario where you will perceive a meaningful difference. Or maybe all you need to do is stand still and watch, because whenever some dipshit completely destabilizes the currency, the result is usually pretty tough to ignore.
no subject
Date: 2024-10-16 08:02 pm (UTC)
BTW I actually don't know if this analogy is novel. I don't think I've heard it before, but I could have forgotten, or I could have Done A Convergent Evolution with someone who got there first. LMK if you can think of prior art.
no subject
Date: 2024-10-18 10:04 pm (UTC)
I've been mulling this over and trying to think through the implications of "AI sludge as company scrip," and it's very challenging. But I might be getting there: what I'm imagining is sort of an economy of bullshit exchange...
...specifically, I'm thinking of an "annual performance review" season where all the employees just submit chatgpt spam for their own reviews and get equivalent garbage back, and managers end up promoting the people they've been working to promote regardless, but because the company HR department has these process requirements, they have to go through this whole exchange of meaningless placeholder content.