Wikipedia:Reference desk/Archives/Computing/2016 September 6
Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
September 6
.bs7 file
How do I create and modify this file type? Is there software to make the task easier? 103.230.105.17 (talk) 18:46, 6 September 2016 (UTC)
- A .bs7 file is a boot skin for Windows 7 created by Windows 7 Boot Updater, a third-party program for modifying the boot animation. I also found a tutorial on DeviantArt regarding how to extract images/animations from this file. clpo13(talk) 19:08, 6 September 2016 (UTC)
- I'm failing to save the [modified] .bmp file inside the .dll file. Any idea why? 103.230.104.6 (talk) 11:20, 7 September 2016 (UTC)
Old technology
1. What was the largest amount of data stored in one place by 1980?
2. If that's not easily computer-readable (e.g. the Library of Congress, or millions of Saturday Night Fever Betamaxes in a warehouse), then how big is the biggest collection that is computer-readable? (e.g. a computer tape archive; the tapes don't have to be in tape readers)
3. I have a book © 1982 that mentions 2×10¹⁰-bit computer optical discs (which could "soon" replace an 8-acre 10¹⁵-bit tape archive with only 50,000 discs). When did these discs come out?
4. Could a 1980 minicomputer be programmed to display a 24+ bit color 4K image with only 1980 technology (changing only the display, software and I/O)? About how long would this take to set up (programming time etc.)? How many frames per second could it do? 0.000x fps? Would telling it what to draw be easier if you just programmed it to use 24,883,200 consecutive octets of its hard drive as the red, green and blue values of consecutive subpixels? Sagittarian Milky Way (talk) 18:57, 6 September 2016 (UTC)
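A quick check of the two numbers embedded in questions 3 and 4; this is only a sketch, and the 3840×2160 resolution is an assumption inferred from the 24,883,200-octet figure rather than something stated in the question.

```python
# Sanity-checking the figures in questions 3 and 4. The 3840x2160
# resolution is assumed (inferred from the octet count), not stated.
archive_bits = 10**15                  # the 8-acre tape archive
disc_bits = 2 * 10**10                 # one optical disc, per the 1982 book
print(archive_bits // disc_bits)       # 50000 discs, matching question 3

width, height = 3840, 2160             # assumed "4K" frame
print(width * height * 3)              # 24883200 octets, matching question 4
```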
- Some tricky questions. I have no idea about 1 or 2. For question 3, surely they're talking about the Compact Disc? 1982 happens to be the year CD players became commercially available. As for 4, I don't see any reason why not. The first digital photos were taken in the mid-70s. It might be horrifically slow, but a 1980 computer is just as Turing-complete as a modern computer. Vespine (talk) 22:41, 6 September 2016 (UTC)
- Just to add a little to 4: I actually have some experience writing a low-level display driver, albeit for a display with only 512 pixels. Obviously your minicomputer would lack the driver and protocol to "draw" the screen in the first place; there wasn't even VGA until 1987, and before then most computer screens were just TVs with analog inputs and a few hundred scan lines. So you would need to custom-build an interface, which would not be trivial, and write a display driver from scratch, which would also not be trivial. Have a read of Video display controller to get an idea of the challenges you might face. As for "how many frames per second", I think you'd be more likely counting how many frames per day or even per week. Vespine (talk) 23:32, 6 September 2016 (UTC)
- 0.0000something is still a frames-per-second figure... Sagittarian Milky Way (talk) 03:39, 7 September 2016 (UTC)
- I didn't say it wasn't, but in the history of the world I doubt anyone has ever actually used "ten-thousandths of a frame per second" to measure anything, ever. Vespine (talk) 04:08, 7 September 2016 (UTC)
- I should've written that in small type. Sagittarian Milky Way (talk) 04:54, 7 September 2016 (UTC)
- A Red Book CD stores about 4.3218 Mbps × 74 minutes ≈ 1.9×10¹⁰ raw bits, so they must have been talking about CDs. (That's about 2.2 GiB, but it includes the coding overhead. You can only store ~1 GiB of useful data on a CD with efficient coding, or 650 MiB with the inefficient coding actually used in the CD-ROM standard.) -- BenRG (talk) 23:49, 6 September 2016 (UTC)
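BenRG's figure follows directly from the Red Book channel bit rate and playing time; the snippet below just restates that arithmetic.

```python
# Restating the raw-capacity arithmetic from the comment above.
channel_bit_rate = 4.3218e6            # Red Book channel bits per second
seconds = 74 * 60                      # a 74-minute disc

raw_bits = channel_bit_rate * seconds
print(f"{raw_bits:.2e} raw bits")              # ~1.92e10
print(f"{raw_bits / 8 / 2**30:.2f} GiB")       # ~2.23 GiB, coding overhead included
```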
- I had no idea CDs held multiple gigabytes; that sure is inefficient. Sagittarian Milky Way (talk) 04:04, 7 September 2016 (UTC)
- Yes, it is inefficient in a sense, by design. Fault tolerance, error correction, error detection, encoding, etc. Trust me, you really wouldn't want a CD packed with 2 GiB of raw binary data - if it wasn't useless right out of the box, it would very quickly become so. The fine folks who made these standards knew that people would prefer products that work compared to products that have larger storage specs, but don't work. SemanticMantis (talk) 14:26, 7 September 2016 (UTC)
- I always guessed it was under a gig raw, but I'm sure they did the best they could at the time without sealing it like a tape or hard drive, which probably would've decreased profit and/or compactness beyond what the powers that be wanted. Sagittarian Milky Way (talk) 16:53, 7 September 2016 (UTC)
- They don't really hold multiple gigabytes. There are ~19 billion locations on the disc where the pickup can detect a pit-to-land transition, but the minimum length of a pit or land is three of those positions. The capacity of a run-length-limited channel with a minimum run length of 3 is about 0.55 bits per channel bit, so the disc can really only hold about 1.2 GiB even in clean-room conditions with no error correction. (There is also a maximum pit/land length of 11, but that has very little effect on the capacity.) -- BenRG (talk) 01:02, 8 September 2016 (UTC)
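The 0.55 capacity and the ~1.2 GiB ceiling can be reproduced from the run-length constraint alone. The sketch below computes the capacity as the log of the largest eigenvalue of the constraint's transfer matrix; the 19 billion channel positions is simply the figure quoted in the comment above, not an independently sourced value.

```python
# Reproducing the run-length-limited capacity figures quoted above.
# The 19e9 channel positions per disc is the number from the comment,
# not an independently sourced value.
import numpy as np

def rll_capacity(d, k):
    """Capacity (bits per channel bit) of a (d, k) run-length constraint,
    i.e. at least d and at most k zeros between consecutive ones, computed
    as log2 of the largest eigenvalue of the transfer matrix."""
    n = k + 1                           # state = zeros emitted since the last 1
    T = np.zeros((n, n))
    for s in range(n):
        if s < k:
            T[s, s + 1] = 1.0           # emit a 0: the zero run grows by one
        if s >= d:
            T[s, 0] = 1.0               # emit a 1: allowed after d zeros
    return float(np.log2(np.max(np.linalg.eigvals(T).real)))

# A minimum pit/land length of 3 corresponds to d = 2 (at least two zeros between ones).
print(round(rll_capacity(2, 60), 4))    # ~0.5515: min run 3, effectively no maximum
print(round(rll_capacity(2, 10), 4))    # ~0.5418: CD's actual (2, 10) constraint

positions = 19e9                        # transition positions per disc (quoted above)
print(f"{positions * rll_capacity(2, 10) / 8 / 2**30:.2f} GiB")   # ~1.2 GiB
```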
- In answer to (4): it would be NO. Firstly, the memory of the minicomputer would not be enough to hold a single image. Memories of half a megabyte were common, and you might have been able to upgrade to one megabyte. Secondly, the bandwidth and processing power would not be enough to do anything useful, not even enough to copy an image. In that era you could have had a standard-definition TV graphical output, or several 24×80 visual display unit terminals. Graeme Bartlett (talk) 13:02, 7 September 2016 (UTC)
- Not even if you told a TV to show what's in the upper-left corner, a TV placed physically adjacent to draw the next pixels, and so on, and waited through a lot of loading until all 4830(?) TVs have an image? How many frames per week could you get doing that? (Reusing frames is allowed, since the entire hard drive probably holds 10 uncompressed frames max.) Sagittarian Milky Way (talk) 16:53, 7 September 2016 (UTC)
- No. Adding rows of TV sets doesn't magically allow a minicomputer to access more memory than it has. --Guy Macon (talk) 01:07, 8 September 2016 (UTC)
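For a sense of the scale being discussed in this sub-thread, here is a quick back-of-the-envelope sketch; the memory and disk figures are the ones quoted in the comments above, and the 3840×2160 resolution is an assumption.

```python
# Rough scale of the mismatch: one uncompressed 24-bit "4K" frame versus
# the half-megabyte-to-one-megabyte main memory quoted above. All numbers
# come from this thread or simple arithmetic, not outside sources.
frame_bytes = 3840 * 2160 * 3              # assumed 4K frame, 3 octets per pixel
main_memory = 1 * 2**20                    # a generously upgraded minicomputer

print(f"{frame_bytes / 2**20:.1f} MiB per frame")       # ~23.7 MiB
print(f"{frame_bytes / main_memory:.0f}x main memory")  # ~24x
print(f"{10 * frame_bytes / 2**20:.0f} MiB for the 'ten frames' of disk")  # ~237 MiB
```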
- This article about the NYIT CG Lab says "The workhorse hardware during the early 80's [included] six programmable Ikonas graphics processors, the largest with 12 megabytes of image memory (an ungodly amount in that day: 2048x2048x24-bits), viewed with rare thousand line color monitors [...]". That's a bit after 1980, and it was a lab with a multimillion dollar budget, but the technology did exist then. SuperPaint had a 640x480x8bpp frame buffer with real time TV output way back in 1972. -- BenRG (talk) 01:38, 8 September 2016 (UTC)
- Ah, Ikonas Graphics Systems! I had forgotten about those. Nice writeup with loads of technical details at [ http://www.virhistory.com/ikonas/ikonas.html ]. We really need an article on this. --Guy Macon (talk) 02:04, 8 September 2016 (UTC)
- I said YES, Guy says NO. I was thinking about this a bit more and I think the answer is "it depends"... I don't think the memory thing is as much of a limitation as Guy thinks it is; as you pointed out, read it from disk if you have to. It might take orders of magnitude more time, but that doesn't make it impossible. Impractical, definitely; impossible, no. Just recently there was a link here to someone who managed to boot 32-bit Linux on an 8-bit microcontroller using an emulator, which is a similar kind of thing. It's "conventional wisdom" that 32-bit software won't run on an 8-bit computer since it "doesn't have the address space", but it's not impossible. It took half a day just to boot up, but it "worked". Vespine (talk) 00:18, 9 September 2016 (UTC)
- You may be interested in the IBM 3850 Mass Storage System, which held up to 472 GB. But I can't tell you who owned them. Graeme Bartlett (talk) 09:10, 11 September 2016 (UTC)
Opening lots of links at the same time
Is there a way to open all the links on a webpage in new tabs with a single action, rather than ctrl-clicking each one to open it in a new tab? I use Chrome but have other browsers too. Thanks. Anna Frodesiak (talk) 22:45, 6 September 2016 (UTC)
- Linkclump for Chrome and Snap Links Plus for Firefox will open all links that fall in a rectangular selection area in new tabs. -- BenRG (talk) 23:46, 6 September 2016 (UTC)
- Thank you very, very much BenRG!! Anna Frodesiak (talk) 01:58, 7 September 2016 (UTC)
- Beware that this may very well take your computer down, if it lacks the resources to open all those tabs at once. StuRat (talk) 02:37, 9 September 2016 (UTC)
- I've opened up to 200 tabs or more in the past. This is generally with Firefox, but I don't see any reason for Chrome to be that different. Opening such a large number of tabs can seriously slow down the browser and in some cases even make it crash, but it should not kill the computer. Frankly, it's questionable whether any regular operation should take down the computer. If your computer is killed by such things, I would suggest you either need to upgrade the computer or upgrade the operating system, or maybe dump the software as seriously flawed. The worst case would be if the browser uses all your memory. If that happens the whole OS may slow down, but it still should not die; you should eventually be able to close the browser and the OS will largely recover. With Firefox at least, I've found that even the 64-bit variant seems to top out at about 6 GB before the browser is so slow that it can be close to unusable or at risk of crashing, so if you have 8 GB your OS will generally be fine. However, Chrome is much more multi-process oriented, so it may be able to use more RAM. Nil Einne (talk) 07:47, 11 September 2016 (UTC)
- I have an older PC and my limit is more like 20-30 tabs. The typical behavior is that it slows down so much as to become unusable, and this is not just limited to the browser. Even clicking the Windows Start button to restart the PC can be unreasonably slow, causing me to hit the power button to do a hard reset. StuRat (talk) 20:24, 11 September 2016 (UTC)
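As a footnote for readers who would rather not install an extension: a short script can approximate what Linkclump and Snap Links Plus do by collecting a page's links and handing them to the default browser. This is only a sketch using Python's standard library; the URL is a placeholder, and the tab limit and pause are arbitrary values chosen in the spirit of the warnings above.

```python
# A minimal, extension-free sketch: open every link on a page in new tabs.
# The page URL is a placeholder; the limit and pause are arbitrary, added
# because opening too many tabs at once can bog down the browser.
import time
import webbrowser
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collects href attributes from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and not href.startswith("#"):
                self.links.append(href)

page = "https://example.org/"                     # placeholder page
limit = 30                                        # cap the number of tabs

html = urlopen(page).read().decode("utf-8", errors="replace")
collector = LinkCollector()
collector.feed(html)

for href in collector.links[:limit]:
    webbrowser.open_new_tab(urljoin(page, href))  # resolve relative links
    time.sleep(0.5)                               # brief pause so the browser keeps up
```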