[Image: Narutopedia's ParserSpeed page]

ParserSpeed is a special tool, available to administrators, which lists how long pages take to render. It can be found at Special:ParserSpeed.

Longer pages with lots of images and templates will naturally take longer to render than shorter articles. With this tool, you are able to identify which pages on your wikia are the slowest to render, and see specific details as to what might be the cause.

Identifying what causes the slow page loads is the first step towards speeding them up. It is, of course, in everyone's best interests to have articles that are quickly parsed - it's better for editors, visitors and, of course, Wikia's servers.

Why is page load speed important?

This may seem like a silly question. 'This is the internet! It must go faster, faster!' But it is remarkable how quickly we lose patience when a website takes even a single second longer to load than normal. Load speed makes a significant difference to how a website is perceived, and its effect can be split into three areas:

User experience
The speed at which the page loads has an impact on what the user gets out of that particular page. Since many users come from search, their first experience may be on a specific page. If that page is slow to load, chances are they will expect all pages on your community to be the same. This impression can be lasting, so keeping pages as fast as possible will reduce this risk.
User perception
This is how a user feels about a site after a number of interactions with it. Much like user experience, long-term perception involves a trade-off: users may accept some site lag in exchange for valuable content. The threshold is high, however (the content must prove to be immensely unique and valuable), before a user will completely forget or ignore the site's speed. Ultimately, if they find a site they feel has a better trade-off (speed vs. content), they will begin to rely upon that site more.
Technical perception
Google does factor site speed into its search ranking algorithms. That's why it's essential for wikias to make sure they achieve a balance between having a lot of information on a page that a user might land on from Google versus having so much information that it slows down page load, and thus drags down the page rank. Even if a page on a wikia has more useful content than the first four search rankings, chances are the searcher will either give up after four fruitless results or feel as though they've reached the maximum available information before ever finding result #5.

The ParserSpeed data

ParserSpeed gives you a list of pages on a wikia, ordered by how long it takes the servers to turn the wikitext into a rendered article ('parsing' a page). Note that this is not quite the same as the time the page takes to load, because parsing only happens when a page is modified or the cache expires; most views are served from the cache.

Avg. parsing time, Min. parsing time, Max. parsing time
These list the average, minimum and maximum times that it has taken to parse the page, based on the last 30 days of data.
Wikitext size and HTML size
Indications of how big the basic wikitext of the article and the rendered HTML currently are in kilobytes (KB). Note that templates can turn a very small amount of wikitext into a large amount of HTML.
'Exp. functions' (Expensive parser functions)
A count of the number of expensive parser functions used on the page.
A limit of 100 is enforced.
Read more details below and on Wikipedia.
'Node count' (Preprocessor node count)
A measure of the complexity of the page, roughly related to how complex the published HTML is.
A limit of 1,000,000 is enforced.
Read more details on Wikipedia.
'Post expand size' (Post expand include size)
The sum of the lengths of the expanded wikitext generated by templates, parser functions, etc.
A limit of 2,097,152 bytes is enforced.
Read more details on Wikipedia.
'Temp arg. size' (Template argument size)
A count of the total length of template arguments that have been substituted.
A limit of 2,097,152 bytes is enforced.
Read more details on Wikipedia.
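If you want to check these counters for a single page without ParserSpeed, MediaWiki embeds a 'NewPP limit report' as an HTML comment near the bottom of every rendered page, visible through your browser's view-source. A typical report looks like this (the numbers are illustrative):

```html
<!--
NewPP limit report
Preprocessor node count: 2846/1000000
Post-expand include size: 61043/2097152 bytes
Template argument size: 12711/2097152 bytes
Expensive parser function count: 3/100
-->
```

Each line shows the page's current value against the enforced limit described above, so you can see which limit a slow page is closest to hitting.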

How to make improvements

There are a few approaches you can take to reduce the parsing time and complexity of a page.

Simpler pages
Split long pages up into several shorter ones - besides the performance benefits, this may also be a better experience for visitors on small screens or slow connections.
Don't overload individual articles with very long lists or huge amounts of data.
Simpler templates
Avoid using heavily nested templates (templates within templates within templates, a.k.a. "Inceplates").
Don't make templates too generic - this can cause them to become full of irrelevant, complex functionality. Don't have parameters just for the sake of having parameters - hard-code as much as is reasonable.
Use fewer templates.
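As a sketch of the nesting problem (all template names here are hypothetical), each extra layer of nesting multiplies the parsing work:

```wikitext
<!-- Heavily nested: four template expansions for one value. -->
{{Infobox|hp={{StatRow|{{StatCell|{{StatColor|HP}}|35}}}}}}

<!-- Flatter: the row formatting is hard-coded inside {{Infobox}},
     so only one template needs to be expanded. -->
{{Infobox|stat1=HP|value1=35}}
```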
Avoid expensive operations
A number of functions are considered technically expensive to render, because they involve a database query that cannot be cached. These include {{#ifexist:}}, {{PAGESINCATEGORY}}, and {{PAGESIZE}} - avoid them when possible.
If you need to use DPL, keep it simple. Use caching in DPL at all times. If a table of information is fairly static, it may make more sense to go ahead and code that page manually as opposed to having a query slow the page down.
Converting a complex template to Lua can result in a much faster template that performs the same functions.
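As a rough sketch of the Lua approach (the module and function names here are hypothetical), a long {{#switch:}} chain in wikitext can be replaced with a Scribunto module that the template invokes once:

```lua
-- Module:TypeColor (hypothetical) - replaces a long {{#switch:}} chain.
local p = {}

local colors = {
    Electric = '#F8D030',
    Water    = '#6890F0',
    Fire     = '#F08030',
}

function p.color(frame)
    -- The template passes the type name as the first argument.
    local t = frame.args[1] or ''
    return colors[t] or '#A8A878' -- fallback colour for unknown types
end

return p
```

A template would then call it with {{#invoke:TypeColor|color|Electric}}; the lookup happens in one pass of Lua instead of being re-parsed as nested wikitext.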
Don't create complex MediaWiki namespace messages
Since MediaWiki messages may be loaded on many pages, don't use the MediaWiki namespace for parser functions or template calls. Keep that namespace simple.
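For example (the message content here is hypothetical), a sitewide message such as MediaWiki:Edittools appears beneath every edit form, so anything dynamic inside it is re-evaluated constantly:

```wikitext
<!-- Avoid: a parser function inside a MediaWiki: message. -->
{{#ifexist:Help:Symbols|{{SymbolList}}|Symbols unavailable}}

<!-- Prefer: plain, static content. -->
Insert: [[Category:]]  {{}}  <nowiki></nowiki>
```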
