I wrote this code in Python (using LangChain) to load different types of documents and retrieve a proper AI answer to a question asked before. It was working fine until I decided to implement chat history, so that the AI model has information it can relate to. Now it throws an error saying 'pip install jq', but whenever I try that, installation fails with 'Failed to build jq. ERROR: Could not build wheels for jq, which is required to install pyproject.toml-based projects'. I think the package is not available for Windows: I have tried installing jq with winget and manually downloading the jq binary and adding it to PATH, but I need it as a Python package so that JSONLoader can call it in just one line. It needs to be a JSONLoader, since all the other document loaders are of type Loader.
I am sending a list of messages (the chat history), and in the PHP Laravel back-end I encode it into JSON using:
$jsonHistory = json_encode($request['chatHistory'], JSON_PRETTY_PRINT);
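For context, the encoded history can be parsed back on the Python side with the standard library alone; the "input"/"content" field names below match the jq_schema I am using, but the payload shown is simplified for illustration:

```python
import json

# Example payload roughly as produced by json_encode(..., JSON_PRETTY_PRINT);
# the "input"/"content" shape is an assumption for illustration.
chat_history = """
{
    "input": [
        {"role": "user", "content": "Hello"},
        {"role": "assistant", "content": "Hi, how can I help?"}
    ]
}
"""

messages = json.loads(chat_history)
# Plain-Python equivalent of the jq expression '.input[].content'
contents = [m["content"] for m in messages["input"]]
print(contents)  # ['Hello', 'Hi, how can I help?']
```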
After that I receive the chat history as a str in the Python back-end:
class Request(BaseModel):
    ai_input: str
    # these two are not important for that:
    company: str
    chatHistory: str

class AiModel:
    async def get_answer(request: Request):
        txt_loader = DirectoryLoader(f"{request.company}/", glob="**/*.txt", loader_cls=TextLoader)
        pdf_loader = DirectoryLoader(f"{request.company}/", glob="**/*.pdf", loader_cls=PyMuPDFLoader)
        csv_loader = DirectoryLoader(f"{request.company}/", glob="**/*.csv", loader_cls=CSVLoader)
        json_loader = JSONLoader(request.chatHistory, jq_schema='.input[].content')
        loadersList = [txt_loader, pdf_loader, csv_loader, json_loader]
        index = VectorstoreIndexCreator().from_loaders(loadersList)
        llm = ChatOpenAI()
        if request.ai_input:
            query = request.ai_input
            answer = index.query(query, llm=llm)
        return {"answer": answer}
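One workaround I am experimenting with is to skip JSONLoader entirely: parse the history with the standard library and dump it into a .txt file inside the company folder, so the existing txt_loader glob picks it up. A sketch (the "input"/"content" shape and the dump_history_as_txt name are my own, untested):

```python
import json
from pathlib import Path

def dump_history_as_txt(chat_history: str, company_dir: str) -> Path:
    """Parse the JSON chat history string and write its message contents
    to a plain-text file that the DirectoryLoader/TextLoader glob finds."""
    messages = json.loads(chat_history)
    # Same extraction the jq_schema '.input[].content' would perform
    lines = [m["content"] for m in messages["input"]]
    out = Path(company_dir) / "chat_history.txt"
    out.write_text("\n".join(lines), encoding="utf-8")
    return out
```

With this, txt_loader would already cover the history, the json_loader line could be dropped, and jq would never have to be installed. I am not sure whether indexing the history alongside the documents is the right design, though.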
Any ideas how I can proceed, so that I can load the JSON chat history along with the rest of the docs?