this post was submitted on 21 Nov 2023

LocalLLaMA


Community to discuss about Llama, the family of large language models created by Meta AI.


GPT-4 surprisingly excels at Googling (Binging?) to retrieve up-to-date information about current issues. Tools like Perplexity.ai are impressive. Now that we have a highly capable smaller-scale model, I feel like not enough open-source research is being directed towards enabling local models to perform internet searches and retrieve online information.

Did you manage to add that functionality to your local setup, or know some good repo/resources to do so?

[–] iChrist@alien.top 1 points 11 months ago

There are 3 options that I have found, and they all work:

  1. TextGenerationWebui - the web_search extension (there is also a DuckDuckGo clone on GitHub)
  2. LoLLMs - there is an Internet persona which does the same: it searches the web locally and gives the results as context
  3. Chat-UI by Hugging Face - also a great option, as it is very fast (5-10 secs) and shows all of its sources; great UI (they recently added the ability to search the web and run LLM models locally)
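All three tools implement the same basic pattern: run a web search, then paste the retrieved snippets into the prompt as context before it reaches the local model. Here is a minimal sketch of that prompt-building step; `build_augmented_prompt` is a hypothetical helper (not from any of the repos above), and the snippets are hard-coded placeholders standing in for real search results:

```python
# Sketch of the "search, then give results as context" pattern used by
# the tools listed above. The snippets are placeholders; a real setup
# would fetch them from a search backend (e.g. DuckDuckGo).

def build_augmented_prompt(question: str, snippets: list[str]) -> str:
    """Prepend numbered web snippets to the question so a local LLM
    can answer with up-to-date information and cite its sources."""
    context = "\n\n".join(f"[{i + 1}] {s}" for i, s in enumerate(snippets))
    return (
        "Answer using the web results below. Cite sources by number.\n\n"
        f"Web results:\n{context}\n\n"
        f"Question: {question}\n"
    )

snippets = [
    "Example snippet from the first search result.",
    "Another retrieved snippet with current information.",
]
prompt = build_augmented_prompt("What happened today?", snippets)
print(prompt)
```

The numbered `[1]`, `[2]` markers are what lets a UI like Chat-UI link each claim in the answer back to the source it came from.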


GitHub - simbake/web_search: web search extension for text-generation-webui

GitHub - ParisNeo/lollms-webui: Lord of Large Language Models Web User Interface

GitHub - huggingface/chat-ui: Open source codebase powering the HuggingChat app

If you ask me, try all 3 of them!