
> but the hallucinations still happen, and making things up (especially APIs) is unacceptable.

The newer models are much better at reading the codebase first and sticking to the APIs and libraries already included. For new libraries, there's context7, which brings in up-to-date docs, and newer models know how to use it (even gpt5-mini works fine with it).
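For context: context7 runs as an MCP server that the coding agent queries for current library docs. A minimal sketch of the usual client config, assuming the npm package name @upstash/context7-mcp (check the project's README for the exact name and any API key settings):

    {
      "mcpServers": {
        "context7": {
          "command": "npx",
          "args": ["-y", "@upstash/context7-mcp"]
        }
      }
    }

With that registered, the model can call the server's doc-lookup tools to fetch current documentation for a library instead of guessing at its API from training data.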



What size of codebases are we talking about here? I've had a lot of issues trying to do pretty much anything across a 1.7 million LOC codebase, and generally found it faster to use traditional IDE functionality.

I've had much more success with things under 20k LOC, but that isn't the stuff I really need assistance with.



