Question
I have the following XML document loaded in a MarkLogic database:
<x:books xmlns:x="urn:books">
  <book id="bk001">
    <author>Writer</author>
    <title>The First Book</title>
    <genre>Fiction</genre>
    <price>44.95</price>
    <pub_date>2000-10-01</pub_date>
    <review>An amazing story of nothing.</review>
  </book>
  <book id="bk002">
    <author>Poet</author>
    <title>The Poet's First Poem</title>
    <genre>Poem</genre>
    <price>24.95</price>
    <review>Least poetic poems.</review>
  </book>
</x:books>
I am new to XQuery. How would I retrieve the values from the XML document, the way I would retrieve rows from a SQL database?
Desired output:
BookID | Author | Title | Genre | price | pub_date | review
bk001 | Writer | The First Book | Fiction | 44.95 | 2000-10-01 | An amazing story of nothing.
bk002 | Poet | The Poet's First Poem | Poem | 24.95 | Least poetic poems.
Note: It doesn't have to be pipe-delimited; any collection or list form is fine.
Can someone share a link or help me write this XQuery? I am new to this.
Answer
XQuery's sequence construct holds multiple values, but it is not hierarchical - if you create a sequence of sequences, they are simply flattened into one large sequence.
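A quick illustration of that flattening behavior, as a standalone sketch:

```xquery
(: Nested sequence constructors collapse into one flat sequence :)
(1, (2, 3), ((4, 5)))   (: evaluates to the flat sequence (1, 2, 3, 4, 5) :)

count((1, (2, 3)))      (: returns 3, not 2 - the inner sequence does not
                           survive as a single nested item :)
```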
The expression below captures every child element and attribute value into one sequence, but because of the flattening property just mentioned, there is no built-in way to get, say, the first value of the second book. You would have to know that it is the 8th item, that the first value of a third book would be the 14th item, and so on:
$books/book/(*|@*)/string()
Just to demonstrate how you could produce a pipe-delimited list:
string-join($books/book[1]/(*|@*)/node-name() ! string(), ' | '), (: Create header row :)
for $book in $books/book
return string-join($book/(*|@*)/string(), ' | ')
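Putting it together as a complete, runnable query - a sketch that assumes the document was loaded at the URI /books.xml (adjust to wherever you inserted it), and that $books is bound to the root element. Note that the book elements themselves are in no namespace, since only the root carries the urn:books prefix:

```xquery
xquery version "1.0-ml";
declare namespace x = "urn:books";

(: Hypothetical document URI - replace with the URI you used at load time :)
let $books := fn:doc("/books.xml")/x:books
return (
  (: Header row, taken from the first book's attribute and element names :)
  string-join($books/book[1]/(*|@*)/node-name() ! string(), ' | '),
  (: One pipe-delimited row per book :)
  for $book in $books/book
  return string-join($book/(*|@*)/string(), ' | ')
)
```

Keep in mind that books with missing elements (bk002 has no pub_date) will produce rows with fewer columns than the header, since the values are joined positionally rather than matched by name.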