This seems like a glib one-liner, but I do think it is profoundly insightful about how some people approach thinking about LLMs.
It is almost like there is hardwiring in our brains that makes us instinctively correlate language generation with intelligence, and people cannot separate the two.
It would be like if the first calculators ever produced, instead of responding with 8 to the input 4 + 4 =, printed out "Great question! The answer to your question is 7.98", and that resulted in a slew of people proclaiming the arrival of AGI (or, more seriously, the ELIZA Effect is a thing).
Here's my old benchmark question and my new variant:
"When was the last time England beat Scotland at rugby union?"
New variant:
"Without using search, when was the last time England beat Scotland at rugby union?"
It is amazing how bad ChatGPT is at this question, and has been for years now across multiple models. It's not that it gets it wrong - no shade, I've told it not to search the web, so this is _hard_ for it - but how badly it reports the answer. Starting with the small stuff: it almost always reports the wrong year, wrong location and wrong score - that's the boring factual stuff I would expect it to stumble on. It often invents details of matches that never happened - cool, standard hallucinations. But even within the text it generates, it cannot keep things consistent with how reality works. It often reports draws as wins for England. It frequently states that the team it just said scored the most points lost the match, etc.
It is my ur-example for when people challenge my assertion that LLMs are stochastic parrots or fancy Markov chains on steroids.
> For LLMs, there is an added benefit. If you can formally specify what you want, you can make that specification your entire program. Then have an LLM driven compiler produce a provably correct implementation. This is a novel programming paradigm that has never before been possible; although every "declarative" language is an attempt to approximate it.
That is not novel, and every declarative language precisely embodies it.
I think most existing declarative languages still require the programmer to specify too many details to get something usable. For instance, Prolog often requires the use of 'cut' to get reasonable performance for some problems.
I'm currently playing a game that is a blatant rip-off of Stardew Valley, to the point where I frequently question why they were so obvious about it. (Or maybe those elements are rip-offs of Harvest Moon; I haven't played Harvest Moon to know.) Still, it's enjoyable. The design elements and places where it does diverge from Stardew Valley make it more enjoyable, in my opinion.
As the saying goes, "good artists borrow, great artists steal."
Harvest Moon defines the "Turning round a dilapidated farm in a small village where you give everyone gifts all the time" genre. It all comes from there.
EDIT: Stardew Valley has so many QoL improvements over Harvest Moon, though. The early HM games are punishing.
That is the opposite of what I usually see. Are you trading CPU for RAM by running a more aggressive GC like ZGC or Shenandoah? Usually, people starve the CPU to buy more RAM.
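If it helps, here's a rough sketch of how you'd flip between collectors to test that trade-off (the jar name and heap size are placeholders, not taken from the original comment):

    # Default collector (G1 on modern JDKs)
    java -Xmx4g -jar service.jar

    # ZGC: concurrent, low-pause, spends more CPU and wants heap headroom
    java -Xmx4g -XX:+UseZGC -jar service.jar

    # Shenandoah: similar trade-off, in JDK builds that include it
    java -Xmx4g -XX:+UseShenandoahGC -jar service.jar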
monofur - my monospaced programming font of choice for decades now - has an almost psychotic dedication to glyph disambiguation; every character is exceedingly distinct.
Yeah, there are a bunch of caveats about monofur, and ligatures is one of them. I don't care for ligatures myself, so it doesn't affect me, but it would obviously be a show-stopper for others.
The Grandia II battle system also breaks when you get Teo. An area-effect Critical? It basically allows you to completely control every battle, even if you haven't invested in a board-wiping uber character.