HN.zip

The Prompt() Function: Use the Power of LLMs with SQL

37 points by sebg - 8 comments
delichon 5 mins ago

  SELECT prompt('summarize: ' || title) AS summary
  FROM hn.hacker_news
  LIMIT 100
"Oops I forgot the limit clause and now owe MotherDuck and OpenAI $93 billion."
domoritz 5 mins ago
I love the simplicity of this. Hurray for small models for small tasks.
korkybuchek 5 mins ago
Interesting -- is there any impact from LLM outputs not being deterministic?
drdaeman 5 mins ago
SQL functions can be non-deterministic just fine. E.g. the SQL:2003 grammar defines a DETERMINISTIC | NOT DETERMINISTIC characteristic for CREATE FUNCTION. And, e.g., PostgreSQL has IMMUTABLE | STABLE | VOLATILE clauses for function volatility.
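A minimal sketch of both styles (dialects vary; the first form follows SQL:2003/MySQL syntax, the second is PostgreSQL -- the function name and die-roll body are illustrative, not from the article):

```sql
-- SQL:2003 / MySQL style: declare the function non-deterministic
CREATE FUNCTION roll_die() RETURNS INTEGER
  NOT DETERMINISTIC
  RETURN FLOOR(RAND() * 6) + 1;

-- PostgreSQL style: VOLATILE tells the planner the result
-- may differ on every call, so it must not cache or fold it
CREATE FUNCTION roll_die() RETURNS integer AS $$
  SELECT floor(random() * 6)::integer + 1
$$ LANGUAGE sql VOLATILE;
```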
korkybuchek 5 mins ago
Nice, TIL. Thanks!
xnx 5 mins ago
Aren't LLM outputs deterministic given the same inputs?
simonw 5 mins ago
Not at all. Even the ones that provide a "seed" parameter don't generally guarantee you'll get back the same result 100% of the time.

My understanding is that this is mainly down to how floating point arithmetic works. Any performant LLM will be executing a whole bunch of floating point arithmetic in parallel (usually on a GPU) - and that means that the order in which those operations finish can very slightly affect the result.
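The root cause is that floating-point addition isn't associative, so the order a parallel reduction finishes in can change the result. You can see the effect from SQL itself with plain IEEE 754 doubles (DuckDB-style cast shown; no LLM involved):

```sql
-- Same three numbers, different grouping, different doubles
SELECT (0.1::DOUBLE + 0.2) + 0.3 AS left_assoc,   -- 0.6000000000000001
       0.1::DOUBLE + (0.2 + 0.3) AS right_assoc;  -- 0.6
```

Scale that tiny discrepancy up to billions of parallel multiply-adds per token and the sampled output can diverge even with a fixed seed.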

korkybuchek 5 mins ago
They are not, necessarily -- especially with commercial providers, who may change models, fine-tunes, privacy layers, and all kinds of other non-foundational-model things without notice.