
Build Private Data Knowledge Q&A AIGC Business with Lindorm AI Engine

This article explains how to build a private data knowledge Q&A AIGC business with the Lindorm AI engine. It discusses a solution based on vector retrieval and Prompt Engineering and provides detailed operation steps and sample code, aiming to help users simplify the development of knowledge Q&A applications.
  • Main points

    • 1
      Explores in depth how to build private data knowledge Q&A
    • 2
      Provides detailed operation steps and sample code
    • 3
      Combines the latest AI technologies and practices
  • Unique insights

    • 1
      Introduces a solution based on vector retrieval and Prompt Engineering
    • 2
      Analyzes the limitations of the fine-tuning approach and its alternatives
  • Practical applications

    • The article gives users practical steps and examples to help them quickly build a knowledge Q&A system in real-world applications.
  • Key topics

    • 1
      Lindorm AI engine
    • 2
      Private data knowledge Q&A
    • 3
      Vector retrieval and Prompt Engineering
  • Key insights

    • 1
      Provides a one-stop solution that simplifies application development
    • 2
      Combines multiple AI models to improve the accuracy of knowledge Q&A
    • 3
      Detailed operation guide and code samples
  • Learning outcomes

    • 1
      Master the skills to build a knowledge Q&A system with the Lindorm AI engine
    • 2
      Understand the application of vector retrieval and Prompt Engineering
    • 3
      Be able to independently build a private data knowledge Q&A system

Introduction to Lindorm AI Engine for Knowledge Q&A

Lindorm AI Engine offers a one-stop solution for building private data knowledge Q&A AIGC applications. By integrating the Lindorm AI Engine with built-in vector search capabilities, users can easily construct knowledge Q&A functionalities with a single SQL statement, significantly simplifying application development. This eliminates the complexities associated with traditional methods, such as fine-tuning large language models (LLMs) or managing vector databases separately.

Background: Building Private Data Knowledge Q&A Systems

The demand for private data knowledge Q&A systems based on Large Language Models (LLMs) is growing. The goal is to enable LLMs trained on public corpora to answer questions using knowledge from a dedicated knowledge base, applicable to internal enterprise scenarios like intelligent work order Q&A. Existing solutions include fine-tuning LLMs on specific datasets or using vector retrieval to supplement user prompts with relevant documents from the dataset. The latter, based on 'vector retrieval + Prompt Engineering,' is more popular due to the high costs and poor timeliness of fine-tuning. This approach involves slicing documents, extracting embeddings, and managing document updates, all of which Lindorm AI Engine simplifies.
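The "vector retrieval + Prompt Engineering" flow described above can be sketched in plain Python. This is a minimal illustration only, not Lindorm's implementation: it uses a toy bag-of-words embedding in place of a real embedding model such as text2vec-base-chinese, and all function names are hypothetical.

```python
import math
from collections import Counter

def embed(text):
    # Toy embedding: bag-of-words term counts.
    # A real system would call an embedding model instead.
    return Counter(w.strip(".,?!").lower() for w in text.split())

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def build_prompt(question, docs, top_k=2):
    # Retrieve the document slices most similar to the question and
    # splice them into the prompt, so the LLM answers from private data.
    q = embed(question)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    context = "\n".join(ranked[:top_k])
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

docs = [
    "Lindorm is a cloud-native multi-model database service.",
    "The AI engine adds vector retrieval and model inference.",
    "Unrelated note about billing cycles.",
]
prompt = build_prompt("What is Lindorm?", docs)
```

Document slicing, embedding extraction, and index maintenance are exactly the steps that Lindorm AI Engine automates behind a single SQL statement.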

Prerequisites for Using Lindorm AI Engine

Before you begin, ensure that the Lindorm AI Engine is activated. Also, verify that your wide table engine is version 2.5.4.3 or later. If you're using an earlier version, consider upgrading or contacting Lindorm support for assistance. Additionally, confirm that the S3 protocol compatibility feature and the unstructured data vector retrieval function are enabled. These prerequisites ensure seamless integration and optimal performance of the Lindorm AI Engine.

Overview of AI Models Used

The private data knowledge Q&A solution involves several AI models. This example uses the BERT text segmentation model from ModelScope for text slicing, the text2vec-base-chinese model from Hugging Face for text vectorization, and the ChatGLM-6B-int4 model from Hugging Face as the LLM. It's important to note that Alibaba Cloud does not guarantee the legality, security, or accuracy of third-party models, and users are responsible for complying with the terms of use and relevant laws and regulations.

Data Preparation: Creating and Populating the Knowledge Base

First, connect to the wide table engine using a tool such as Lindorm-cli. Then create a table to store the knowledge base documents:

```sql
CREATE TABLE doc_table (
  id VARCHAR,
  doc_field VARCHAR,
  PRIMARY KEY(id)
);
```

Next, insert data into the table; this data serves as the knowledge base for the Q&A system. Example data includes information about Lindorm features, updates, and capabilities.

Full Volume Retrieval Q&A Implementation

To implement full volume retrieval Q&A, create a model with the `CREATE MODEL` statement, specifying the source table, target field, task, algorithm, and settings:

```sql
CREATE MODEL rqa_model
FROM doc_table
TARGET doc_field
TASK RETRIEVAL_QA
ALGORITHM CHATGLM3_6B
SETTINGS (doc_id_column 'id');
```

Then run a retrieval Q&A with the `ai_infer` function:

```sql
SELECT ai_infer('rqa_model', 'Lindorm是什么');
```

The result is an answer generated by the LLM based on the knowledge base.

Incremental Retrieval Q&A Implementation

To enable incremental processing, which automatically handles documents that are added to, modified in, or deleted from the knowledge base, activate the stream engine and data subscription. Create a data subscription channel via LTS in Pull mode, specifying the Lindorm table name and the Kafka topic name. Then create an incremental retrieval Q&A model:

```sql
CREATE MODEL rqa_model
FROM doc_table
TARGET doc_field
TASK RETRIEVAL_QA
ALGORITHM CHATGLM3_6B
SETTINGS (
  doc_id_column 'id',
  incremental_train 'on',
  lts_topic 'rqa_xxx_topic'
);
```

Execute the retrieval Q&A as before:

```sql
SELECT ai_infer('rqa_model', 'Lindorm是什么');
```

The result reflects the updated knowledge base.

Semantic Retrieval (Optional)

If you need to integrate with other LLMs, you can create a semantic retrieval model so that Lindorm performs only the knowledge base semantic retrieval functions (document slicing, vectorization, and vector retrieval). Create a semantic retrieval model that processes only full volume documents:

```sql
CREATE MODEL sr_model
FROM doc_table
TARGET doc_field
TASK SEMANTIC_RETRIEVAL
ALGORITHM TEXT2VEC_BASE_CHINESE
SETTINGS (doc_id_column 'id');
```

Execute semantic retrieval:

```sql
SELECT ai_infer('sr_model', 'Lindorm是什么');
```

Optionally, set the `score` parameter to return semantic similarity scores along with the retrieved slices.
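When Lindorm handles only semantic retrieval, your application stitches the returned slices (and, if requested, their similarity scores) into a prompt for whichever external LLM you call. A minimal sketch of that assembly step, assuming the retrieval output has been parsed into (text, score) pairs; the threshold and prompt template here are illustrative assumptions, not part of Lindorm's API:

```python
def assemble_prompt(question, hits, min_score=0.5):
    # hits: (slice_text, similarity_score) pairs, e.g. parsed from a
    # semantic retrieval result. Values and threshold are illustrative.
    kept = [text for text, score in hits if score >= min_score]
    if not kept:
        return f"Question: {question}"  # no sufficiently relevant knowledge
    context = "\n".join(f"- {t}" for t in kept)
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {question}"
    )

hits = [
    ("Lindorm is a cloud-native multi-model database service.", 0.83),
    ("Billing is settled monthly.", 0.12),
]
prompt = assemble_prompt("Lindorm是什么", hits)
# The assembled prompt can then be sent to any LLM of your choice.
```

Filtering by score keeps low-relevance slices out of the prompt, which both saves context window space and reduces the chance of the LLM being misled by unrelated text.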

Summary: Streamlining Knowledge Q&A with Lindorm AI Engine

Lindorm AI Engine provides a comprehensive and efficient solution for building private data knowledge Q&A AIGC applications. By leveraging its built-in vector search capabilities and simplified SQL interface, developers can create intelligent Q&A systems with ease, reducing development time and complexity. Whether you need full volume retrieval, incremental updates, or semantic search, Lindorm AI Engine offers the tools and flexibility to meet your needs.

 Original link: https://help.aliyun.com/document_detail/2401799.html
