Option to connect to locally running API(s). This can be done via the Fetch API in the browser.
Example: set up LM Studio -> activate its local API server -> Sudowrite then just has to connect to it via the local IP.
This could also be done with llama.cpp, Ollama, Faraday, KoboldCPP, oobabooga/text-generation-webui, and other locally running "servers" (direct input/output with the AI/LLM), so to speak.
Fetch is a browser-based API (supported in all modern browsers) that can send requests to models - local and/or remote.
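To show what I mean, here's a rough sketch of the kind of fetch call involved. LM Studio and Ollama both expose an OpenAI-compatible chat completions endpoint; the port (1234 is LM Studio's default) and the model name are just placeholders the user would configure:

```javascript
// Hypothetical local endpoint - LM Studio defaults to port 1234;
// Ollama would be http://localhost:11434/v1/chat/completions.
const LOCAL_API = "http://localhost:1234/v1/chat/completions";

// Build the request options for an OpenAI-style chat completion.
// "local-model" is a placeholder; the user picks the actual model.
function buildRequest(prompt, model = "local-model") {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
    }),
  };
}

// Send the prompt to the local server and return the reply text.
async function complete(prompt) {
  const res = await fetch(LOCAL_API, buildRequest(prompt));
  if (!res.ok) throw new Error(`Local API error: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```

No dependencies needed - fetch is built into every modern browser (and Node 18+).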
This would allow user-controlled AIs/LLMs tailored to specific use case(s).
I have already written "fetch code" (in JavaScript, with no dependencies) for another project, so I know this works.