Hi, I am wondering if you have best practices for using Cursor or GitHub Copilot to investigate code, triage bugs, and ramp up on a certain process.
I have been learning the codebase by making and breaking things to understand it, but I was wondering if there are better practices for using AI to do these triages / investigations.
Thank you
AI is getting increasingly good at understanding codebases now thanks to large context windows. Here are my suggestions:
https://docs.cursor.com/context/@-symbols/@-git
https://docs.cursor.com/guides/working-with-context (this is gold)
https://x.com/gdb/status/1878489681702310392 (how to write a prompt)
Sai gave a bunch of really great tactical advice. Mine is more high-level:
Here's another good thread: "How to optimally use GenAI to leverage my coding prowess and become a better software engineer?"
I also recommend watching this:
https://www.jointaro.com/lesson/gTQZbv0lmmypLPytqw5g/case-study-how-engineers-unlock-huge-impact-with-artificial-intelligence-ai/
But it seems these are tips for one-time prompts, for example "As a [position], give me 10 topics to ask my manager."
Is there a way to smartly write prompts against historically aware context? For example, I always put my notes and to-do lists in Obsidian. I want to chat with ChatGPT based on the notes and transcripts I have been recording.
Ideally it would have semantic knowledge of what's important based on my second brain: Notion and Obsidian.
Thank you
I think ChatGPT has a memory feature (if you turn it on, it will use context from other chats): https://help.openai.com/en/articles/6825453-chatgpt-release-notes#h_76ef02082f
It seems like ChatGPT has the concept of Projects too, so you can put all your relevant notes into a project and query against them.
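If you'd rather keep it outside ChatGPT, a common pattern is to embed your notes locally and retrieve the most relevant ones before asking a question. Here is a minimal sketch of that idea, assuming a folder of Obsidian markdown files and the OpenAI Python SDK; the vault path, model names, and the example question are placeholders, not anything ChatGPT Projects actually exposes.

```python
# Minimal sketch: retrieve relevant Obsidian notes and ask a question about them.
# Assumes the `openai` Python SDK and an OPENAI_API_KEY in the environment;
# the vault path and model names below are assumptions for illustration.
from pathlib import Path
import numpy as np
from openai import OpenAI

client = OpenAI()
VAULT = Path.home() / "ObsidianVault"  # hypothetical vault location

# 1. Load notes and embed them (truncating very long notes).
notes = [(p, p.read_text(encoding="utf-8")) for p in VAULT.rglob("*.md")]
embeddings = client.embeddings.create(
    model="text-embedding-3-small",
    input=[text[:8000] for _, text in notes],
)
vectors = np.array([e.embedding for e in embeddings.data])

def ask(question: str, top_k: int = 5) -> str:
    # 2. Embed the question and find the most similar notes (cosine similarity).
    q = np.array(
        client.embeddings.create(model="text-embedding-3-small", input=[question])
        .data[0].embedding
    )
    scores = vectors @ q / (np.linalg.norm(vectors, axis=1) * np.linalg.norm(q))
    best = [notes[i] for i in scores.argsort()[::-1][:top_k]]

    # 3. Ask the model, grounding it in the retrieved notes.
    context = "\n\n".join(f"# {p.name}\n{text[:4000]}" for p, text in best)
    reply = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer using only the provided notes."},
            {"role": "user", "content": f"Notes:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return reply.choices[0].message.content

print(ask("What are my open to-dos from this week's meeting transcripts?"))
```

This is roughly what a "chat with my second brain" setup does under the hood; tools built on Obsidian or Notion just automate the indexing and retrieval steps.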