Is your file named "guidance.py"?
I get a different error: in `guidance\library\_select.py`, line 121, in `recursive_select`, the line `for k,v in logprobs_result["top_logprobs"][0].items():` raises `TypeError: 'NoneType' object is not subscriptable` (Error in program: 'NoneType' object is not subscriptable).
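A minimal reproduction of that failure (not the actual guidance source, just the shape of it): LM Studio returned `"logprobs": null`, so guidance gets `None` where it expects a dict holding `"top_logprobs"`, and subscripting `None` raises exactly this TypeError.

```python
# What the server sent back, reduced to the two relevant fields
choice = {
    "text": "there is no safety,\nbut with many advisers there is success.",
    "logprobs": None,  # the JSON null from LM Studio
}

logprobs_result = choice["logprobs"]                      # None
for k, v in logprobs_result["top_logprobs"][0].items():  # TypeError: 'NoneType' object is not subscriptable
    print(k, v)
```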
LM Studio sent a response containing `"text": "there is no safety,\nbut with many advisers there is success.", "logprobs": null`.
I guess the `logprobs` being null is the problem here. (I have no idea what I'm doing, TBH.)
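One way to confirm that is to hit the server directly and see whether it ever returns logprobs. A sketch, assuming LM Studio's OpenAI-compatible completions endpoint is running on localhost:1234 and accepts the standard `logprobs` parameter (the model name is just whatever the server has loaded):

```python
import requests

resp = requests.post(
    "http://localhost:1234/v1/completions",
    json={
        "model": "text-davinci-003",
        "prompt": "Where there is no guidance, a people falls,",
        "max_tokens": 16,
        "logprobs": 5,   # ask for top-5 token logprobs, which select() relies on
    },
    timeout=60,
)
choice = resp.json()["choices"][0]
print(choice.get("logprobs"))  # if this prints None, guidance's select() will keep crashing
```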
I modified only Line 4 of the example to: `guidance.llm = guidance.llms.OpenAI("text-davinci-003", api_key="foobar", api_base="http://localhost:1234/v1")`
what is this script supposed to output?
I use ChatGPT all the time to help me with new projects and errors:
The error message "guidance is not callable" indicates that `guidance` is being used as if it were a function, but it isn't defined as one in the code you've provided. It seems like `guidance` is intended to be a module or package that contains certain functions or classes that you want to use.

If `guidance` is a module you've imported, you should be calling a function within that module instead of the module itself. You need to check the `guidance` module's documentation or source code to see what functions or classes it provides, and then use one of those.

For example, if `guidance` has a class or function named `create_program`, you would use it like this:

    program = guidance.create_program(...)

Alternatively, if `guidance` is supposed to be a callable object, then the issue might be with how you're importing or defining `guidance`. Make sure that `guidance` is imported correctly and that it refers to a callable object. If `guidance` is the name of a function you intended to define, ensure that you have actually defined it before trying to call it.
I'm far from having a very technical understanding of any of this, but I have had zero luck with guidance and some appreciable luck with llama.cpp grammars for constraining outputs. Might be worth looking into. Not sure how any of it works with LM Studio though, but see the sketch below.
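To illustrate the grammar route: a sketch assuming llama-cpp-python is installed and you have some local GGUF model file (the model path and the tiny yes/no grammar are placeholders, not anything from the thread).

```python
from llama_cpp import Llama, LlamaGrammar

# GBNF grammar that only allows the literal strings "yes" or "no"
gbnf = r'''
root ::= "yes" | "no"
'''

grammar = LlamaGrammar.from_string(gbnf)
llm = Llama(model_path="./model.gguf")  # placeholder path to a local GGUF model

result = llm(
    "Is Paris the capital of France? Answer yes or no: ",
    grammar=grammar,
    max_tokens=4,
)
print(result["choices"][0]["text"])  # constrained by the grammar to "yes" or "no"
```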