Problem description
I want to extract multiple entities from a user input. Example: "Service httpd is not responding because of high CPU usage and DNS Error". Here I want to identify the following: httpd, high CPU usage, DNS Error.
I will then use these keywords to get a response from a database.
Recommended answer
Just annotate them accordingly, e.g.
```
## intent:query_error
- Service [httpd](keyword) is not responding because of [high CPU usage](keyword) and [DNS Error](keyword)
```
Given the sentence above, Rasa NLU would extract three entities of type `keyword`. You can then access these entities in a custom action and query your database.
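To illustrate, here is a minimal sketch of reading the extracted entities. In a real bot you would subclass `Action` from `rasa_sdk` and read entities via `tracker.get_latest_entity_values("keyword")`; to keep the snippet runnable standalone, the NLU parse result is simulated below in the shape Rasa NLU returns (the confidence value is made up).

```python
# Simulated Rasa NLU parse result for the annotated example sentence.
# In a custom action, the equivalent data comes from the Tracker object.
parse_result = {
    "text": "Service httpd is not responding because of high CPU usage and DNS Error",
    "intent": {"name": "query_error", "confidence": 0.93},  # confidence is illustrative
    "entities": [
        {"entity": "keyword", "value": "httpd"},
        {"entity": "keyword", "value": "high CPU usage"},
        {"entity": "keyword", "value": "DNS Error"},
    ],
}

def extract_keywords(result):
    """Collect the values of all entities of type 'keyword'."""
    return [e["value"] for e in result["entities"] if e["entity"] == "keyword"]

keywords = extract_keywords(parse_result)
print(keywords)  # ['httpd', 'high CPU usage', 'DNS Error']
```

These values can then be used as lookup keys in whatever database query your custom action performs.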
Regarding the number of training examples required: this depends on
- the NLU pipeline you are using. Typically `tensorflow_embedding` requires more training examples than `spacy_sklearn`, since it does not use pretrained language models.
- the number of different values your entities can have. If it is only `httpd`, `high CPU usage`, and `DNS Error`, then you don't need a lot of examples. However, if your entity can take a thousand different values, then you need more training examples.
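If the entity really can take many values, Rasa's markdown training format also supports lookup tables, which can help the extractor generalize without hand-annotating every value. A sketch, assuming the entity is named `keyword` as above:

```
## lookup:keyword
- httpd
- high CPU usage
- DNS Error
```

The list can also be kept in a separate file referenced from the training data; lookup tables supplement, rather than replace, annotated example sentences.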
One intent is enough if you always want to trigger the same custom action. However, if you want to classify different types of problems, e.g. server problems and client problems, and query different databases depending on the type of problem, you might consider having multiple intents.
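In the multi-intent case, the training data would simply annotate each problem type under its own intent heading (the intent names below are illustrative, not from the original answer):

```
## intent:query_server_problem
- Service [httpd](keyword) is down because of [high CPU usage](keyword)

## intent:query_client_problem
- My browser keeps showing a [DNS Error](keyword)
```

Each custom action can then be mapped to the intent it handles in the dialogue policy.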
Sorry for the vague answers, but in machine learning most things are highly dependent on the use case and the dataset.