dcre | 4 months ago | on: Are OpenAI and Anthropic losing money on inference...
That's interesting, though you have to imagine the dataset is very low quality on average, and distilling high-quality training pairs out of it is very costly.
DenisM | 4 months ago
Hence the exponential increase in model training costs, and the hallucinations in the long tail of knowledge.