
Wikipedia:Reference desk/Archives/Computing/2013 May 28

Computing desk
Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


May 28


Centering text and pictures


How do I center complex portions of a page (i.e. not just text) without using <center> tags (which are deprecated)? In particular, I want to center {{Related portals2}}. -- Ypnypn (talk) 16:12, 28 May 2013 (UTC)[reply]

use margin-left:auto and margin-right:auto
like this
-- Finlay McWalterTalk 16:29, 28 May 2013 (UTC)[reply]
Unfortunately, that doesn't work when the width is unknown. I managed to put it in a table, and used Help:TABLECENTER, but your solution is definitely neater :-) Ypnypn (talk) 16:38, 28 May 2013 (UTC)[reply]
Should work without a width if you add display: table;. ¦ Reisio (talk) 21:45, 28 May 2013 (UTC)[reply]
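Pulling the two suggestions together, a minimal sketch of the markup (the portal names passed to the template are only illustrative):
 <!-- display:table makes the box shrink-wrap its contents,
      and the auto side margins then centre that box in its parent -->
 <div style="display: table; margin-left: auto; margin-right: auto;">
 {{Related portals2|Science|Technology}}
 </div>
Because the box shrinks to fit whatever is inside it, no explicit width has to be specified.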

Stopping repeated loading of content, stylesheets and JavaScript to speed things up

Resolved

Hi! Each time I load a page on Wikipedia, I am loading a lot of stuff that is the same everywhere on this website, like the globe at the upper left, the sidebar and personal tools, the bottom bar with two more images, the scripts that make up the (Vector) skin, the CSS, and so on. These all waste a lot of time, especially for editors like me with poor internet connections. (I, with only 3k edits, must have loaded this stuff a million times; what about the regular Wikipedians with tens of thousands of edits?) So, is it possible (in theory) to save these unchanging files on the computer and load only the page-specific data and code each time? Thanks in advance···Vanischenu「m/Talk」 18:20, 28 May 2013 (UTC)[reply]

Yes, you can use a caching proxy server. Chances are very high that you are already doing so at several layers of abstraction, because you are using a modern web browser, a modern internet service provider, and you are connecting to servers of the Wikimedia Foundation, whose engineers have designed a fairly complicated system of caching and proxying. Each layer performs some degree of caching automatically. The layer closest to your user-experience - the web browser - has the biggest benefit/risk trade-off. The more it caches, the less total data is transferred. However, it is also the layer of abstraction with the least capability to determine when it needs to refresh its local cache with fresh content. Nimur (talk) 19:41, 28 May 2013 (UTC)[reply]
You're not loading everything you think you are - all the static stuff should be held in your browser's web cache, and not loaded again across the network. Let's look at the globe graphic as an example (http://upload.wikimedia.org/wikipedia/en/b/bc/Wiki.png). Wikimedia's Varnish server uses HTTP ETags to avoid resending resources which are already in your browser's web cache. The first time you load the page, it loads everything as normal. But the second time it loads the page, the HTTP request it sends to retrieve the globe contains:
 GET /wikipedia/en/b/bc/Wiki.png HTTP/1.1
 Host: upload.wikimedia.org
 If-None-Match: 81a7ce38ad250ee09209b1b2cdf50830

(note, incidentally, that 81a7ce38ad250ee09209b1b2cdf50830 is the MD5 checksum of Wiki.png). Your browser can do this because it has already loaded Wiki.png once, and it stored the Etag for it when it cached the image. So it's saying "please give me Wiki.png, but only if it has changed from being 81a7ce38ad250ee09209b1b2cdf50830". And the Varnish server replies
 HTTP/1.1 304 Not Modified
 Etag: 81a7ce38ad250ee09209b1b2cdf50830
which means "nah, you already have the right version". So it doesn't send the ~20 KB of PNG data for Wiki.png, because your browser already has it cached. -- Finlay McWalterTalk 19:58, 28 May 2013 (UTC)[reply]
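For comparison, the very first (uncached) fetch of the logo goes roughly as follows; apart from the ETag, the response headers here are illustrative rather than captured from Wikimedia's servers:
 GET /wikipedia/en/b/bc/Wiki.png HTTP/1.1
 Host: upload.wikimedia.org

 HTTP/1.1 200 OK
 Etag: 81a7ce38ad250ee09209b1b2cdf50830
 Content-Type: image/png
 (... followed by the PNG data itself ...)
The browser stores the image together with that Etag, which is what lets it send the conditional If-None-Match request described above on every later visit.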
You can see exactly what is being requested and downloaded by running a debugging proxy such as Fiddler (software) on your computer. It can be very illuminating. AndrewWTaylor (talk) 20:26, 28 May 2013 (UTC)[reply]
For those of us with dodgy internet connections, it is only the caching in the browser that helps the situation. Caching by the ISP doesn't help us significantly (unless they also have a dodgy link). Make sure that your browser is set for maximum caching. This will mean that, just occasionally, it will display an old page and you will have to tell it to refresh, but it will minimise data transfer over the slowest part of the link. Dbfirs 07:55, 29 May 2013 (UTC)[reply]

Thank you all. How do I set the browser for maximum caching? I am currently using Mozilla Firefox 17.0.1 and can also use Google Chrome 24.0.1312.56 m. Which is best for maximum caching?
Yes, http://upload.wikimedia.org/wikipedia/en/b/bc/Wiki.png is cached in my browser. To speed up the display of a page, I usually click the "disconnect" option in the net setter (otherwise the page keeps loading for hours and will not display until loading is complete; on disconnecting, the page is displayed with whatever content has loaded). In such cases the text loads completely, but the page lacks the images (the logo too), stylesheets and JavaScript effects. So I thought they were being loaded each time. If they are cached, then why are they not showing up on the page? ···Vanischenu「m/Talk」 14:12, 29 May 2013 (UTC)[reply]

In Firefox: Tools -> Options -> Advanced -> Network will tell you how much space is being used for caching. Check that it is not set to a low limit. I think the automatic caching is probably best, though you could try a large limit to see if it makes any difference. (I don't know how Firefox decides, but I assume that it caches most items if there is space available.) When my internet is running very slowly, I use Opera Turbo without images for Wikipedia (Tools -> Quick preferences -> Enable Opera Turbo). Turbo has a small overhead in load time, but this is more than compensated for on a slow connection by the reduced number of bytes downloaded. I don't know the answer to why images etc. are not showing. I get the same effect when I press Escape. I assume that the server has not reached the stage of checking the browser cache. Dbfirs 22:02, 29 May 2013 (UTC)[reply]
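For anyone comfortable with about:config, the same cache settings can be sketched as prefs.js entries; the capacity figure (in kilobytes) is just an example, not a recommendation:
 // prefs.js equivalents of the cache options mentioned above (values are examples)
 user_pref("browser.cache.disk.enable", true);               // keep the on-disk cache switched on
 user_pref("browser.cache.disk.smart_size.enabled", false);  // turn off automatic sizing so a manual limit applies
 user_pref("browser.cache.disk.capacity", 1048576);          // manual cache limit in KB (about 1 GB)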
Thank you very much. I changed it to 1 GB. The drive my OS is installed on had only 300 MB or so free, so I increased that as well, to around 650 MB. Yes, it has now increased the speed, and pages load completely. Thanks to all of you. Regards···Vanischenu「m/Talk」 00:12, 31 May 2013 (UTC)[reply]
However, some sites are just stubborn :( For example, tvtropes.org refuses to work offline, even if I loaded the page a minute ago (not sure what's going on; the cache is set to 500 in my Firefox). So there are some sites which are just unhelpful (Cracked is only slightly better), or the caching algorithm just plain sucks...
OTOH, Wikipedia is fine 99% of the time. 17:21, 1 June 2013 (UTC)
Oops. Failed to sign that one properly. - ¡Ouch! (hurt me / more pain) 06:03, 3 June 2013 (UTC)[reply]