WebAssembly-based open source LLM inference (API service and local hosting): https://github.com/second-state/llama-utils