Feature Request: External label resolution

Sherwood Botsford sgbotsford at gmail.com
Sun Apr 20 11:47:26 EDT 2008


Fletcher T. Penney wrote:

> You could do something like:
>
> cat myfile.txt refs.txt | MultiMarkdown.pl > myfile.html
>
> F-
>
> On Apr 20, 2008, at 10:40 AM, John MacFarlane wrote:
>
>> Pandoc concatenates input from all files specified on the command
>> line. So you can just do:
>>
>> pandoc myfile.txt refs.txt > myfile.html
>>
>> Seems to me that this would be a reasonable default behavior for
>> Markdown.pl as well, but it doesn't seem to work that way now.
>>
>> John



Several people have suggested this as a valid solution. It makes
sense if you have a few dozen links. And if I were sufficiently
clever, I could break the site down so that references were kept in
a bunch of refs.txt files, and no refs.txt file would have more than
a few dozen links.

However, I don't know how to partition my site in such a way.

If refs.txt has 2000 links in it, and every file has to parse the
entire refs file, it takes a long time. As the site grows, processing
time will grow quadratically: if I double the number of files, I also
roughly double the number of links, so each pass scans twice as many
references across twice as many pages, which is about four times the
work.
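
The only sort of automatic partitioning I can think of would be to
filter refs.txt down, per page, to just the labels that page actually
uses before concatenating. A rough, untested sketch (assuming
refs.txt holds one "[label]: url" definition per line and the pages
use [text][label] style references; shortcut references like
[label][] would slip through):

    # pull out the labels this page refers to
    perl -ne 'print "[$1]:\n" while /\]\s*\[([^\]]+)\]/g' myfile.txt | sort -u > used-labels
    # keep only the matching definitions (labels are case-insensitive)
    grep -i -F -f used-labels refs.txt > myrefs.txt
    cat myfile.txt myrefs.txt | Markdown.pl > myfile.html

There is still one master refs.txt, but Markdown only ever sees the
handful of links each page actually uses; the grep still touches the
whole refs file, but that should be far cheaper than having Markdown
parse it.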

At present I have a small site with 117 pages and 183 links. In a
year, I figure the link count will be over a thousand.

At present I have one file that contains a 1000-line table (with all
the tags on separate lines). When TemplateToolkit / Markdown hits
this file there is a 10-15 second pause. Since TT is pulling it in as
an INSERT, not as an INCLUDE, TT isn't even looking at the file, so I
think it's Markdown scanning this file for tags that is causing the
delay. If a thousand-line file with no partial matches slows Markdown
down this much, I would expect a file with 997 non-matching label/url
lines and 3 matching label/url lines to cause a considerably greater
delay.
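
A quick way to test that guess (just a rough sketch; bigtable.txt
stands in for whatever the actual file is called) would be to time
Markdown by itself, outside of TT, with and without the refs file
appended:

    time cat bigtable.txt | Markdown.pl > /dev/null
    time cat bigtable.txt refs.txt | Markdown.pl > /dev/null

If the first run alone accounts for most of the 10-15 seconds, it's
Markdown and not TT; the second run would show how much the extra
label definitions add on top of that.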

So perhaps I should ask a more general question:

How do you deal with large numbers of links?

