
The idea is that you give the libraries and APIs as context with your prompt.
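A minimal sketch of what that can look like in practice, just to show the shape: the file names and the task below are hypothetical placeholders, not from any particular project or API.

    # Sketch: paste the library's docs/source into the prompt itself.
    # LIBRARY_FILES are hypothetical placeholders for whatever library you use.
    from pathlib import Path

    LIBRARY_FILES = ["docs/api.md", "examples/basic_usage.py"]  # hypothetical paths

    def build_prompt(task: str) -> str:
        context_parts = []
        for name in LIBRARY_FILES:
            text = Path(name).read_text()
            context_parts.append(f"### {name}\n{text}")
        context = "\n\n".join(context_parts)
        return (
            "You are helping me write code against the library below.\n\n"
            f"{context}\n\n"
            f"Task: {task}\n"
        )

    prompt = build_prompt("Write a function that uses the library's client to fetch a record.")
    # Send `prompt` to whichever chat model/API you are using.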


There's a fairly low ceiling for max context tokens no matter the size of the model. Your hobby/small codebase may fit, but for large codebases you will need to do RAG, and retrieval is currently not great at absorbing a whole codebase and answering questions about it.
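To make the retrieval step concrete, here is a toy sketch that scores code chunks by keyword overlap with the question. Real RAG setups use an embedding model and a vector store; this only illustrates the shape, and the chunk size and glob pattern are arbitrary assumptions.

    # Toy illustration of the retrieval half of RAG over a codebase:
    # score every chunk by token overlap with the question and keep the top few.
    from pathlib import Path

    def chunk_file(path: Path, lines_per_chunk: int = 40):
        lines = path.read_text(errors="ignore").splitlines()
        for i in range(0, len(lines), lines_per_chunk):
            yield path.name, "\n".join(lines[i:i + lines_per_chunk])

    def retrieve(question: str, repo_root: str, top_k: int = 5):
        query_tokens = set(question.lower().split())
        scored = []
        for path in Path(repo_root).rglob("*.py"):
            for name, chunk in chunk_file(path):
                overlap = len(query_tokens & set(chunk.lower().split()))
                scored.append((overlap, name, chunk))
        scored.sort(reverse=True, key=lambda t: t[0])
        return scored[:top_k]

    # Only the top chunks get pasted into the prompt instead of the whole
    # codebase, which is how you stay under the context-token ceiling.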


Thank you. But that doesn't work for me.

If you mean just naming the version in the prompt? No way.

If you mean putting the whole library and my code in the context window?

Way too small.


Give it examples of the library being used in the way you need.

Here's an example transcript where I did that: https://gist.github.com/simonw/6a9f077bf8db616e44893a24ae1d3...
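The general shape of that approach, with a made-up library and snippet standing in for the real ones in the linked transcript:

    # Sketch of the "show it an example" approach: one known-good usage snippet
    # plus the new request. The snippet and names here are hypothetical
    # placeholders, not taken from the transcript above.
    EXAMPLE_USAGE = '''
    client = somelib.Client(api_key=KEY)   # hypothetical library
    rows = client.query("select * from items")
    '''

    prompt = (
        "Here is how I already use the library:\n"
        f"{EXAMPLE_USAGE}\n"
        "Following the same style, write a function that streams the results "
        "instead of loading them all at once."
    )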


Thank you, I experimented in that direction as well.

But for my actual codebase, which is sadly not 100% clean code, it would take a lot of work to give it enough examples and the right context to work well enough.

While working I jump around a lot between contexts and files. Where an LLM will hopefully be helpful one day is refactoring it all. But currently I would need to spend more time setting up the context than solving the problem myself.

With a limited scope, like in your example, I do use LLMs regularly.


Maybe the LLM could issue queries to fetch parts of your codebase as it needs to look at them, using something like GDB or cscope.
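Roughly what such a query tool could look like for cscope, assuming a cscope.out database has already been built for the repo. This is a sketch of the idea, not an existing integration, and the agent wiring around it is left out.

    # Sketch: expose a small tool that shells out to cscope and returns
    # matching locations, which the model can call instead of receiving
    # the whole codebase up front. Assumes a cscope.out database exists.
    import subprocess

    def find_definition(symbol: str) -> str:
        # cscope -d: use existing database, -L: line-oriented output,
        # -1: "find this global definition"
        result = subprocess.run(
            ["cscope", "-dL", "-1", symbol],
            capture_output=True, text=True, check=False,
        )
        return result.stdout or f"no definition found for {symbol}"

    # An agent loop would register find_definition as a callable tool and feed
    # the returned file/line snippets back into the conversation on demand.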


Play around with projects in Claude for an hour. You'll see.


Not _all_ the code. Just the relevant parts.



