AI Digest

Multi-Agent RAG Super Bot: GraphRAG + AutoGen + Ollama + Chainlit

Author: PaperAgent | Source: PaperAgent

The open-source project Autogen_GraphRAG_Ollama is a fully local, free multi-agent RAG super bot that integrates Microsoft GraphRAG + AutoGen + Ollama + Chainlit.

The project is genuinely impressive: it covers several of the hottest topics in the LLM space, taking on GraphRAG, multi-agent systems, and more in one package.

  • Agentic-RAG: integrates GraphRAG's knowledge-search methods with AutoGen agents via function calling.

  • Offline LLM support: configures GraphRAG (local and global search) to use local models from Ollama for inference and embeddings.

  • Non-OpenAI function calling: extends AutoGen to support function calling with non-OpenAI LLMs served from Ollama through a Lite-LLM proxy server.

  • Interactive UI: deploys a Chainlit UI that handles continuous conversations, multi-threading, and user-input settings.
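To make the non-OpenAI function-calling point concrete: AutoGen's `llm_config` can simply point at the Lite-LLM proxy's OpenAI-compatible endpoint, which forwards requests to Ollama. The model name, port, and key below are illustrative assumptions, not the repo's actual values:

```python
# Hypothetical llm_config for AutoGen pointing at a local Lite-LLM proxy
# (which in turn routes to Ollama). All concrete values are illustrative.
llm_config_autogen = {
    "config_list": [
        {
            "model": "litellm-ollama",          # whatever model name the proxy serves
            "base_url": "http://0.0.0.0:4000",  # Lite-LLM's default proxy port is 4000
            "api_key": "not-needed-locally",    # the client requires a key; unused locally
        }
    ],
    "timeout": 120,
    "cache_seed": None,  # disable response caching for interactive chat
}
```

With this in place, no OpenAI account is needed: the agents speak the OpenAI wire protocol, and the proxy translates it for the local model.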

Demo (screenshots omitted here; see the GitHub repo linked below):

Code walkthrough:

Defining the AssistantAgent and ChainlitUserProxyAgent:


retriever = AssistantAgent(
    name="Retriever",
    llm_config=llm_config_autogen,
    system_message="""Only execute the function query_graphRAG to look for context.
                      Output 'TERMINATE' when an answer has been provided.""",
    max_consecutive_auto_reply=1,
    human_input_mode="NEVER",
    description="Retriever Agent",
)
user_proxy = ChainlitUserProxyAgent(
    name="User_Proxy",
    human_input_mode="ALWAYS",
    llm_config=llm_config_autogen,
    is_termination_msg=lambda x: x.get("content", "").rstrip().endswith("TERMINATE"),
    code_execution_config=False,
    system_message="""A human admin. Interact with the retriever to provide any context.""",
    description="User Proxy Agent",
)
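The `is_termination_msg` lambda above just checks whether a message ends with the sentinel string the Retriever was instructed to emit. It can be exercised in isolation:

```python
# The same termination check as in the agent definition, run standalone
is_termination_msg = lambda x: x.get("content", "").rstrip().endswith("TERMINATE")

print(is_termination_msg({"content": "The answer is 42.\nTERMINATE  "}))  # True
print(is_termination_msg({"content": "Still retrieving context..."}))     # False
print(is_termination_msg({}))                                             # False (no content key)
```

Note the `rstrip()`: models often append trailing whitespace or newlines after the sentinel, and the check still fires.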

Registering the agents' tool function:


async def query_graphRAG(
    question: Annotated[str, "Query string containing information that you want from RAG search"]
) -> str:
    # Route the query to GraphRAG's local or global search, stream the answer
    # to the Chainlit UI, then return it to the calling agent.
    if LOCAL_SEARCH:
        result = run_local_search(INPUT_DIR, ROOT_DIR, COMMUNITY, RESPONSE_TYPE, question)
    else:
        result = run_global_search(INPUT_DIR, ROOT_DIR, COMMUNITY, RESPONSE_TYPE, question)
    await cl.Message(content=result).send()
    return result


# Advertise the tool to the LLM on the caller side (the Retriever agent)...
for caller in [retriever]:
    d_retrieve_content = caller.register_for_llm(
        description="retrieve content for code generation and question answering.",
        api_style="function",
    )(query_graphRAG)

# ...and register it for execution on both agents.
for agent in [user_proxy, retriever]:
    agent.register_for_execution()(d_retrieve_content)
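The two-sided registration above is easy to misread. As a pure-Python analogy (this is not the AutoGen API, just an illustration of the pattern): one registry holds the tool schema the LLM sees, and a separate registry maps the tool name back to the actual callable at execution time.

```python
# Pure-Python analogy of AutoGen's caller/executor tool registration.
llm_tool_schemas = {}   # what the "caller" agent advertises to the model
executor_registry = {}  # what the "executor" agent uses to run a tool call

def register_for_llm(description):
    """Decorator: record a tool schema so the LLM knows the tool exists."""
    def deco(fn):
        llm_tool_schemas[fn.__name__] = {"name": fn.__name__, "description": description}
        return fn
    return deco

def register_for_execution(fn):
    """Map the tool's name to the real callable for dispatch."""
    executor_registry[fn.__name__] = fn
    return fn

@register_for_llm("retrieve context for question answering")
def query_graphRAG(question: str) -> str:
    return f"[context for: {question}]"   # stand-in for the real RAG search

register_for_execution(query_graphRAG)

# The executor dispatches a model-issued tool call by name:
result = executor_registry["query_graphRAG"]("what is GraphRAG?")
print(result)  # [context for: what is GraphRAG?]
```

The split matters because in AutoGen the agent that *decides* to call a tool (the Retriever's LLM) and the agent that *runs* it need not be the same, which is why the repo registers execution on both `user_proxy` and `retriever`.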

AutoGen's group-chat conversation logic:


groupchat = autogen.GroupChat(
    agents=[user_proxy, retriever],
    messages=[],
    max_round=MAX_ITER,
    speaker_selection_method=state_transition,
    allow_repeat_speaker=True,
)
manager = autogen.GroupChatManager(
    groupchat=groupchat,
    llm_config=llm_config_autogen,
    is_termination_msg=lambda x: x.get("content", "") and x.get("content", "").rstrip().endswith("TERMINATE"),
    code_execution_config=False,
)
  

# -------- Conversation logic. Edit the first message based on the task you want done. -------- #
if len(groupchat.messages) == 0:
    await cl.make_async(user_proxy.initiate_chat)(manager, message=CONTEXT)
elif len(groupchat.messages) < MAX_ITER:
    await cl.make_async(user_proxy.send)(manager, message=CONTEXT)
elif len(groupchat.messages) == MAX_ITER:
    await cl.make_async(user_proxy.send)(manager, message="exit")
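The `state_transition` function passed as `speaker_selection_method` is not shown in this excerpt. AutoGen calls such a function with the last speaker and the group chat and expects the next agent back; a minimal sketch of what it could look like here (the strict alternation is an assumption, not necessarily the repo's actual policy):

```python
from types import SimpleNamespace

# Stand-ins for the two agents defined earlier; only the names matter here.
user_proxy = SimpleNamespace(name="User_Proxy")
retriever = SimpleNamespace(name="Retriever")

def state_transition(last_speaker, groupchat):
    # Alternate turns: after the user proxy speaks, the retriever answers,
    # then control returns to the user proxy for the next question.
    if last_speaker.name == "User_Proxy":
        return retriever
    return user_proxy

next_speaker = state_transition(user_proxy, groupchat=None)
print(next_speaker.name)  # Retriever
```

Returning an agent (rather than letting an LLM pick the speaker) makes the two-agent loop deterministic, which suits a retrieve-then-ask workflow.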
https://github.com/karthik-codex/Autogen_GraphRAG_Ollama
