Feature request: Add support for self-hosted custom LLMs #150243
Replies: 2 comments 5 replies
-
Hey @markhasper, as you previously mentioned, Copilot doesn't yet support custom models in Visual Studio or VS Code, but there are a few alternatives. You can use a JetBrains IDE, which supports all of these models through the Copilot integration. If you prefer to keep using Visual Studio or VS Code, you can integrate some LLM models by installing the CodeGPT extension, which lets you connect to some OpenAI models; you still won't be able to connect to Sonnet 3.5, though. If you do install this extension, you'll need to modify your VS Code settings to route your requests through API calls. You'll find these settings in the settings.json file; modify it appropriately. Don't forget to get your API key from OpenAI before modifying settings.json.
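As a rough illustration, the settings.json entries might look something like the fragment below. The key names here are illustrative assumptions, not the extension's exact schema; check the CodeGPT extension's documentation for the real setting names in the version you install.

```json
{
    // Illustrative keys only -- confirm the exact names in the CodeGPT extension docs.
    // VS Code's settings.json is JSONC, so comments like this one are allowed.
    "codegpt.apiKey": "YOUR_OPENAI_API_KEY",
    "codegpt.model": "gpt-4o",
    "codegpt.apiBaseUrl": "https://api.openai.com/v1"
}
```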
-
Could you mark this discussion as 'Answered' if you're satisfied with my answer, @markhasper?
-
Topic Area: Question

Body
It was great to see Copilot add support for o1, 4o, and Sonnet recently.
It would be even better if there were also support for any custom LLM that follows the OpenAI API conventions, in the same way that Cursor supports this.
As self-hosted models become more powerful, I think there is a good case for fine-tuning an open-source LLM on our own code and conventions, then using it as the backend for the excellent GitHub Copilot IDE extensions.
We use both Visual Studio and VS Code extensively, so we really appreciate that GitHub Copilot is available in both IDEs, but for the moment that means we cannot train and use our own open-source LLM.
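For context, "OpenAI API conventions" here means a server exposing the standard `/v1/chat/completions` endpoint with the usual request shape, so a client only needs a configurable base URL to switch between api.openai.com and a self-hosted backend (vLLM and llama.cpp's server both speak this protocol). A minimal sketch, where the host, model name, and helper function are placeholders, not anything Copilot actually exposes today:

```python
import json
from urllib import request


def build_chat_request(base_url: str, model: str, prompt: str,
                       api_key: str = "unused") -> request.Request:
    """Build an OpenAI-style chat completion request.

    Any backend following the OpenAI API conventions accepts this same
    shape; only base_url changes between a hosted and a self-hosted model.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }
    return request.Request(
        url=f"{base_url.rstrip('/')}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Self-hosted servers often ignore the key but expect the header.
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


# Pointing the same client at a self-hosted endpoint instead of api.openai.com:
req = build_chat_request("http://localhost:8000", "my-finetuned-model", "Hello")
```

The point is that nothing in the request is provider-specific, which is why Cursor can offer a "custom base URL" option and why the same would be feasible for Copilot.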