Problem description
This works:
create ["archive.html"] $ do
    route idRoute
    compile $ do
        posts <- (myRecentFirst gitTimes) =<< loadAll "posts/**"
        let archiveCtx =
                listField "posts" (postCtx allTags allCategories gitTimes) (return posts) `mappend`
                constField "title" "Archives" `mappend`
                (postCtx allTags allCategories gitTimes)
        makeItem ""
            >>= loadAndApplyTemplate "templates/archive.html" archiveCtx
            >>= loadAndApplyTemplate "templates/default.html" archiveCtx
            >>= relativizeUrls
to create a list of recent posts in archive.html; this is bog-standard, it came from one of the tutorials I think. Except for postCtx, which is a tad complicated but shouldn't be relevant here.
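myRecentFirst and gitTimes aren't shown here; roughly, myRecentFirst sorts items newest-first by timestamps pulled from git. A pure toy stand-in (hypothetical name, plain strings instead of Hakyll's Item and Compiler) sketching that assumed behaviour:

```haskell
import Data.List (sortBy)
import Data.Ord (Down (..), comparing)

-- Toy stand-in for myRecentFirst (hypothetical): the real one works on
-- Hakyll Items inside Compiler; this one sorts plain names newest-first
-- by a timestamp looked up per name. Names with no timestamp would sort
-- first here, a case the real code handles by erroring out instead.
recentFirst' :: Ord t => [(String, t)] -> [String] -> [String]
recentFirst' times = sortBy (comparing (Down . (`lookup` times)))
```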
However, I want to put a list of a few recent posts in the sidebar of normal posts. The problem is that the recent posts end up depending on themselves. I tried excluding the post itself from its own generated list, but I couldn't find a good place to do that. Here's what I've got so far:
match "posts/**" $ do
    route $ (gsubRoute "posts/" (const "")) `composeRoutes` setExtension "html"
    compile $ do
        recents <- (myRecentFirst gitTimes) =<< loadAll "posts/**"
        let postsContext = postCtx allTags allCategories gitTimes `mappend`
                           -- Show recent posts
                           recentsNotSelfField "recents" (postCtx allTags allCategories gitTimes) (return $ take 3 recents)
        pandocCompilerWithTransform hblogPandocReaderOptions hblogPandocWriterOptions (titleFixer titles)
            >>= loadAndApplyTemplate "templates/post.html" postsContext
            >>= loadAndApplyTemplate "templates/default.html" postsContext
            >>= relativizeUrls
recentsNotSelfField :: String -> Context a -> Compiler [Item a] -> Context b
recentsNotSelfField key context items = Context $ \k _ i ->
    if k == key then do
        let myId = itemIdentifier i
        strippedItems <- items
        let remains = filter (\x -> itemIdentifier x /= myId) strippedItems
        return $ ListField context remains
    else
        CA.empty
recentsNotSelfField should produce a field with all the recents except the post itself, but either it isn't working or it's the wrong place to do that, because:
Initialising...
Creating store...
Creating provider...
Running rules...
Checking for out-of-date items
Compiling
[ERROR] Hakyll.Core.Runtime.chase: Dependency cycle detected: posts/computing/contact.md depends on posts/computing/contact.md
I'm stuck.
I saw that Hakyll says "Dependency cycle detected: ..." and that it's the loadAll that causes it, so I tried this:
match "posts/**" $ do
    route $ (gsubRoute "posts/" (const "")) `composeRoutes` setExtension "html"
    compile $ do
        myId <- getUnderlying
        recents <- (myRecentFirst gitTimes) =<< loadAll ("posts/**" .&&. complement (fromList [myId]))
        let postsContext = postCtx allTags allCategories gitTimes `mappend`
                           -- Show recent posts
                           listField "recents" (postCtx allTags allCategories gitTimes) (return $ take 3 recents)
        pandocCompilerWithTransform hblogPandocReaderOptions hblogPandocWriterOptions (titleFixer titles)
            >>= loadAndApplyTemplate "templates/post.html" postsContext
            >>= loadAndApplyTemplate "templates/default.html" postsContext
            >>= relativizeUrls
But that got me:
[ERROR] Hakyll.Core.Runtime.chase: Dependency cycle detected: posts/computing/contact.md depends on posts/computing/general/ched.md depends on posts/computing/contact.md
In other words, I end up with the two most recent posts cycling around each other.
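Excluding just the current post can't break the loop here: with two or more posts, each one still depends on all the others, so the dependency graph keeps a cycle. A self-contained toy model (no Hakyll) of what the runtime is rejecting:

```haskell
import qualified Data.Map.Strict as M
import qualified Data.Set as S

-- Toy dependency graph: each key depends on the listed keys. Hakyll's
-- runtime refuses to build as soon as any item can reach a node twice
-- along one path, i.e. the graph contains a cycle.
hasCycle :: Ord a => M.Map a [a] -> Bool
hasCycle g = any (revisits S.empty) (M.keys g)
  where
    revisits seen k
      | k `S.member` seen = True
      | otherwise =
          any (revisits (S.insert k seen)) (M.findWithDefault [] k g)
```

With posts A and B each loading "all posts except me", A depends on B and B on A, which is exactly the two-post cycle in the error above.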
Recommended answer
Turns out the technique at https://stackoverflow.com/a/35651294/1461430 works for me, and in fact it works better for me than its author said it would. Here are the relevant bits of my code now:
match "posts/**" $ version "recents" $ do
    route $ (gsubRoute "posts/" (const "")) `composeRoutes` setExtension "html"
    compile $ do
        pandocCompilerWithTransform hblogPandocReaderOptions hblogPandocWriterOptions (titleFixer titles)
            >>= loadAndApplyTemplate "templates/post.html" (postCtx allTags allCategories gitTimes)
            >>= relativizeUrls

match "posts/**" $ do
    route $ (gsubRoute "posts/" (const "")) `composeRoutes` setExtension "html"
    compile $ do
        myId <- getUnderlying
        recents <- (myRecentFirst gitTimes) =<< loadAll ("posts/**" .&&. hasVersion "recents")
        let postsContext = postCtx allTags allCategories gitTimes `mappend`
                           -- Show recent posts
                           listField "recents" (postCtx allTags allCategories gitTimes) (return $ take 3 recents)
        pandocCompilerWithTransform hblogPandocReaderOptions hblogPandocWriterOptions (titleFixer titles)
            >>= loadAndApplyTemplate "templates/post.html" postsContext
            >>= loadAndApplyTemplate "templates/default.html" postsContext
            >>= relativizeUrls
I had to add the listField and the recents lookup to a few other places that use templates/default.html, but that was straightforward.
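For example, the archive rule from the question needs the same field so that templates/default.html can render the sidebar there too; a sketch of that adaptation (same helper names as above, not verbatim from my site):

```haskell
create ["archive.html"] $ do
    route idRoute
    compile $ do
        posts <- myRecentFirst gitTimes =<< loadAll ("posts/**" .&&. hasVersion "recents")
        let archiveCtx =
                listField "posts" (postCtx allTags allCategories gitTimes) (return posts) `mappend`
                listField "recents" (postCtx allTags allCategories gitTimes) (return $ take 3 posts) `mappend`
                constField "title" "Archives" `mappend`
                postCtx allTags allCategories gitTimes
        makeItem ""
            >>= loadAndApplyTemplate "templates/archive.html" archiveCtx
            >>= loadAndApplyTemplate "templates/default.html" archiveCtx
            >>= relativizeUrls
```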
I also had to modify a function that uses an Identifier to look up a list of times pulled from git:
-- Pull a file's time out of the list
getGitTimeUTC :: Identifier -> [GitTimes] -> (GitTimes -> UTCTime) -> UTCTime
getGitTimeUTC ident times typeF =
    -- The identifier for the things compiled with the "recents"
    -- version has the identifierVersion "recents", but we don't care
    -- about that since the only reason that exists is to avoid loops,
    -- so we strip it here for our lookup.
    let fixedIdent = ident { identifierVersion = Nothing }
        timeList = filter (\x -> fixedIdent == gtid x) times
    in if length timeList /= 1
       then
           -- It's not obvious to me how this could occur even in theory;
           -- I'd expect it to error out during getGitTimes
           error $ "getGitTimeUTC: Couldn't find the time for " ++ show fixedIdent ++ " in GitTimes list " ++ show times
       else
           typeF $ head timeList
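The key line is blanking identifierVersion before comparing. A self-contained model of that lookup (no Hakyll; the Ident type and Int timestamps are stand-ins for Identifier and UTCTime):

```haskell
-- Model of the lookup in getGitTimeUTC: identifiers compiled under the
-- "recents" version carry Just "recents", while the times list is keyed
-- by version-less identifiers, so we strip the version before comparing.
data Ident = Ident { idPath :: String, idVersion :: Maybe String }
  deriving (Eq, Show)

lookupTime :: Ident -> [(Ident, Int)] -> Maybe Int
lookupTime ident times =
    case [t | (i, t) <- times, i == ident { idVersion = Nothing }] of
        [t] -> Just t
        _   -> Nothing
```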
As a result I have no duplicated files, and I had no problem making links to the recents files; everything just works.