Question on LLM #71

Open
BradKML opened this issue Feb 12, 2025 · 2 comments

Comments


BradKML commented Feb 12, 2025

  1. Why does it require external LLM calls instead of within Cursor?
  2. Can it also support the top models that are available to OpenRouter? Maybe Ollama with other models?
  3. Are there differences between reasoning models vs regular models? DeepSeek is getting stronger adoption: [master...guoyongchang:devin.cursorrules:master](https://github.com/grapeot/devin.cursorrules/compare/master...guoyongchang:devin.cursorrules:master) and [master...wzqwtt:devin.cursorrules:master](https://github.com/grapeot/devin.cursorrules/compare/master...wzqwtt:devin.cursorrules:master)
  4. Are there "read before write" standards like the ones here? [master...Toowiredd:devin.cursorrules:master](https://github.com/grapeot/devin.cursorrules/compare/master...Toowiredd:devin.cursorrules:master)
  5. Can agents/LLMs be mixed and matched such that routing can be simulated for more efficiency? [master...YcDon:devin.cursorrules:master](https://github.com/grapeot/devin.cursorrules/compare/master...YcDon:devin.cursorrules:master)
  6. How can Chat and Compose be separated? [master...devincarrick:devin.cursorrules:master](https://github.com/grapeot/devin.cursorrules/compare/master...devincarrick:devin.cursorrules:master)

grapeot (Owner) commented Feb 12, 2025

  1. Why does it require external LLM calls instead of within Cursor?

It doesn't.

2. Can it also support the top models that are available to OpenRouter? Maybe Ollama with other models?

Yes. OpenRouter is compatible with OpenAI. You can follow their docs to use their models with minimal code changes.
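
For reference, a minimal sketch of pointing an OpenAI-compatible client at OpenRouter (the base URL is OpenRouter's documented endpoint; the model slug and the exact integration point in this repo are just examples):

```python
from openai import OpenAI

# OpenRouter exposes an OpenAI-compatible API, so the official OpenAI SDK works
# once the base URL and API key are swapped out.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="YOUR_OPENROUTER_API_KEY",
)

response = client.chat.completions.create(
    model="deepseek/deepseek-chat",  # any model slug listed on OpenRouter
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)

# Ollama works the same way: it serves an OpenAI-compatible endpoint at
# http://localhost:11434/v1 (the api_key can be any non-empty string).
```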

3. Are there differences between reasoning models vs regular models? DeepSeek is getting stronger adoption [master...guoyongchang:devin.cursorrules:master](https://github.com/grapeot/devin.cursorrules/compare/master...guoyongchang:devin.cursorrules:master) and [master...wzqwtt:devin.cursorrules:master](https://github.com/grapeot/devin.cursorrules/compare/master...wzqwtt:devin.cursorrules:master)

I don't quite get the question. They are two different kinds of models and we support both.
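
To make that concrete, a rough sketch (assuming DeepSeek's OpenAI-compatible API; the model names follow their docs and are not something specific to this repo): switching between a regular model and a reasoning model is mostly a matter of the model string, and the reasoning model additionally returns a separate reasoning trace.

```python
from openai import OpenAI

# DeepSeek's API is OpenAI-compatible: "deepseek-chat" is the regular chat model,
# "deepseek-reasoner" the reasoning (R1) model.
client = OpenAI(base_url="https://api.deepseek.com", api_key="YOUR_DEEPSEEK_API_KEY")

resp = client.chat.completions.create(
    model="deepseek-reasoner",
    messages=[{"role": "user", "content": "Outline a plan to refactor a CLI tool"}],
)
# Per DeepSeek's docs, the reasoning model returns its chain of thought in a
# separate field alongside the final answer.
print(resp.choices[0].message.reasoning_content)
print(resp.choices[0].message.content)
```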

4. Are there "read before write" standards like the ones here? [master...Toowiredd:devin.cursorrules:master](https://github.com/grapeot/devin.cursorrules/compare/master...Toowiredd:devin.cursorrules:master)

What do you mean?

5. Can agents/LLMs be mixed and matched such that routing can be simulated for more efficiency? [master...YcDon:devin.cursorrules:master](https://github.com/grapeot/devin.cursorrules/compare/master...YcDon:devin.cursorrules:master)

I don't quite get the question. If you want to mix and match, you can say that in the cursorrules like the link you pasted does.
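
If "routing" means picking a different model per task, a plain-Python sketch of that idea could look like the following (the providers, model names, and the heuristic are illustrative, not part of this repo):

```python
from openai import OpenAI

# Illustrative endpoints; swap in whichever OpenAI-compatible providers you use.
cheap = OpenAI(base_url="https://openrouter.ai/api/v1", api_key="OPENROUTER_KEY")
strong = OpenAI(base_url="https://api.deepseek.com", api_key="DEEPSEEK_KEY")

def route(prompt: str) -> str:
    """Naive router: send long or planning-heavy prompts to the reasoning model."""
    if len(prompt) > 500 or "plan" in prompt.lower():
        resp = strong.chat.completions.create(
            model="deepseek-reasoner",
            messages=[{"role": "user", "content": prompt}],
        )
    else:
        resp = cheap.chat.completions.create(
            model="meta-llama/llama-3.1-8b-instruct",  # example cheap model on OpenRouter
            messages=[{"role": "user", "content": prompt}],
        )
    return resp.choices[0].message.content
```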

6. How can Chat and Compose be separated? [master...devincarrick:devin.cursorrules:master](https://github.com/grapeot/devin.cursorrules/compare/master...devincarrick:devin.cursorrules:master)

I don't understand. Chat and Composer are two separate features in Cursor. They are already separated.

BradKML (Author) commented Feb 12, 2025

For 1: so Cursor can already call models internally, without needing a separate API key? If so, then 2 is mainly for experimental purposes.
For 3, 5, and 6: the idea is whether it's possible to use both Chat and Compose on the same task, or to use two different models and have them switch without much user input.
For 4: it's a matter of prompt engineering, i.e. having the agent read the relevant code first to confirm its edit won't break something else, as a precaution.
