Problem Description
Question: How can I stringify JSON object for SQL statement in my Appsync velocity template?
Explanation: I have an Aurora RDS table that has a column with data type JSON. AppSync API is already connected to RDS. My GraphQL schema looks like
input CreateServiceCatalogItemInput {
serviceName: String!
serviceConfig: ServiceConfigInput!
}
type Mutation {
createServiceCatalogItem(input: CreateServiceCatalogItemInput!): ServiceCatalogItem
}
type Query {
getAllServiceCatalogItem: [ServiceCatalogItem]
}
type ServiceCatalogItem {
serviceId: ID!
serviceName: String!
serviceConfig: ServiceConfig!
}
type ServiceConfig {
connectionType: String
capacity: Int
}
input ServiceConfigInput {
connectionType: String
capacity: Int
}
schema {
query: Query
mutation: Mutation
}
My resolver for createServiceCatalogItem mutation looks like
{
"version": "2018-05-29",
"statements": [
"INSERT INTO b2b_service_catalog(service_name, service_config) VALUES ('$ctx.args.input.serviceName', '$util.toString($ctx.args.input.serviceConfig)') RETURNING service_id AS \"serviceId\", service_name AS \"serviceName\", service_config AS \"serviceConfig\""
]
}
This throws the error:
Invocation of method 'toString' in class com.amazonaws.deepdish.transform.util.TransformUtils threw exception java.lang.IllegalArgumentException: wrong number of arguments at ...
If I do this:
{
"version": "2018-05-29",
"statements": [
"INSERT INTO b2b_service_catalog(service_name, service_config) VALUES ('$ctx.args.input.serviceName', '$util.toJson($ctx.args.input.serviceConfig)') RETURNING service_id AS \"serviceId\", service_name AS \"serviceName\", service_config AS \"serviceConfig\""
]
}
This throws the error:
Unable to parse JSON document: 'Unexpected character ('c' (code 99)): was expecting comma to separate ...
If I do this:
{
"version": "2018-05-29",
"statements": [
"INSERT INTO b2b_service_catalog(service_name, service_config) VALUES ('$ctx.args.input.serviceName', '$ctx.args.input.serviceConfig') RETURNING service_id AS \"serviceId\", service_name AS \"serviceName\", service_config AS \"serviceConfig\""
]
}
Then I get error which makes sense:
RDSHttp:{"message":"ERROR: invalid input syntax for type json\nDetail: Token \"connectionType\" is invalid.\nPosition: 222\nWhere: JSON data, line 1: {connectionType..."}
However, when I hardcode the JSON in my resolver, it works:
{
"version": "2018-05-29",
"statements": [
"INSERT INTO b2b_service_catalog(service_name, service_config) VALUES ('$ctx.args.input.serviceName', '{\"connectionType\":\"ftth\",\"capacity\":1}') RETURNING service_id AS \"serviceId\", service_name AS \"serviceName\", service_config AS \"serviceConfig\""
]
}
So how can I convert {connectionType=ftth, capacity=1} to {"connectionType":"ftth", "capacity":1}? What am I doing wrong or am I missing something? Any help would be highly appreciated.
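(For context on the question above: the `{connectionType=ftth, capacity=1}` form is just how a map renders when interpolated directly into the template, a `toString()`-style rendering rather than JSON. A minimal Python sketch of the difference, purely illustrative and not AppSync code:)

```python
import json

config = {"connectionType": "ftth", "capacity": 1}

# A toString()-style rendering, similar to what direct interpolation
# of the map produces in the template
naive = "{" + ", ".join(f"{k}={v}" for k, v in config.items()) + "}"

# Proper JSON serialization, which a Postgres json column will accept
proper = json.dumps(config)

print(naive)   # {connectionType=ftth, capacity=1}
print(proper)  # {"connectionType": "ftth", "capacity": 1}
```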
Recommended Answer
You can build a JSON variable like this:
#set($json = $util.toJson({
"connectionType": "$ctx.args.input.serviceConfig.connectionType",
"capacity": $ctx.args.input.serviceConfig.capacity
}))
and insert it in the query:
{
"version": "2018-05-29",
"statements": [
"INSERT INTO b2b_service_catalog(service_name, service_config) VALUES ('$ctx.args.input.serviceName', '$util.escapeJavaScript($json)') RETURNING service_id AS \"serviceId\", service_name AS \"serviceName\", service_config AS \"serviceConfig\""
]
}
The above is a bit challenging because of all the quotes and escaping, but I think the use of escapeJavaScript will do the trick.
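To see why the escaping matters: the resolver template must itself evaluate to a valid JSON document, so the double quotes inside the JSON payload would otherwise terminate the statement string early. A Python sketch of the effect (illustrative only; `$util.escapeJavaScript`'s behavior is approximated here with a plain quote replacement):

```python
import json

config = {"connectionType": "ftth", "capacity": 1}
payload = json.dumps(config)

# Unescaped, the payload's double quotes close the outer JSON string of
# the resolver document early, producing a parse failure like the
# "Unable to parse JSON document" error in the question.
broken_doc = '{"statements": ["INSERT ... \'' + payload + '\'"]}'
try:
    json.loads(broken_doc)
    parsed = True
except json.JSONDecodeError:
    parsed = False

# With the quotes escaped (roughly the effect of $util.escapeJavaScript
# on this payload), the document parses and the statement carries the
# JSON literal intact.
escaped = payload.replace('"', '\\"')
ok_doc = '{"statements": ["INSERT ... \'' + escaped + '\'"]}'
statement = json.loads(ok_doc)["statements"][0]
print(statement)  # INSERT ... '{"connectionType": "ftth", "capacity": 1}'
```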
Or directly:
{
"version": "2018-05-29",
"statements": [
"INSERT INTO b2b_service_catalog(service_name, service_config) VALUES ('$ctx.args.input.serviceName', '{\"connectionType\":\"$ctx.args.input.serviceConfig.connectionType\",\"capacity\": $ctx.args.input.serviceConfig.capacity}') RETURNING service_id AS \"serviceId\", service_name AS \"serviceName\", service_config AS \"serviceConfig\""
]
}