Wikipedia:Reference desk/Archives/Computing/2015 January 7
Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
January 7
computer theory
Discuss three applications of information and communication technology in education. 41.219.82.210 (talk) 15:01, 7 January 2015 (UTC)
- Please do your own homework.
- Welcome to Wikipedia. Your question appears to be a homework question. I apologize if this is a misinterpretation, but it is our aim here not to do people's homework for them, but to merely aid them in doing it themselves. Letting someone else do your homework does not help you learn nearly as much as doing it yourself. Please attempt to solve the problem or answer the question yourself first. If you need help with a specific part of your homework, feel free to tell us where you are stuck and ask for help. If you need help grasping the concept of a problem, by all means let us know. --Tagishsimon (talk) 15:13, 7 January 2015 (UTC)
- While we're not allowed to do your homework for you, we can suggest some of our articles for you to read: I would think that Computers in the classroom and Distance education would be good starting points. When you're using Wikipedia to help you write academic reports, it's strongly recommended that (a) you don't just cut and paste from the article, because your teacher will undoubtedly check for that and won't be happy if you do, and (b) you use the references at the bottom of each article - they are far better places to find quotes and sources that you can cite in your own writing. SteveBaker (talk) 15:45, 7 January 2015 (UTC)
- Other general recommended reading would be educational technology. Recently, the MOOC has been much discussed. There are also correspondence courses, which used the older technology of postal mail ("correspondence course" is now a redirect to "distance education", but I linked to the historical section on the practice in early universities). As a teacher, the technologies I use the most are email, chalkboards, pencil and notebook. Two technologies that were very standard for a long time but are now seldom used: the overhead projector and the mimeograph. SemanticMantis (talk) 16:23, 7 January 2015 (UTC)
Materials, money, and technical expertise required to start a website from scratch?
Although I am aware of free webhosting services and make-your-own-web-server tutorials, I am wondering what would be required to start a website from scratch. By "from scratch", I mean developing your own hardware for the web server, creating your own web server software, coding your own website, and developing your own registrar to host a domain name. How much money would a person have to invest in order to start and maintain a website? What technical expertise would such a task require? 71.79.234.132 (talk) 16:37, 7 January 2015 (UTC)
- Do you really mean developing your own hardware and creating your own web server software? That sounds like it would take hundreds of millions of dollars and years of development. --Noopolo (talk) 18:20, 7 January 2015 (UTC)
- You are being very general. So, I could tell you that it will take you about a lifetime to develop your own hardware, beginning with learning what electrons are, then going on to understanding current flow, then learning how basic electronic components work, then learning digital logic, then building logic circuits, and then going into a completely different field to learn the basics of semiconductor chemistry and how to construct NPN or PNP layers on a wafer... However, I expect that you are asking about purchasing computer parts and building a computer to use as a webserver. That is very easy. You need a network connection, a storage device, a processor, a motherboard, and a power supply. It would be nice to have some sort of interface, such as a monitor and keyboard. Then, as far as the software goes, you could spend another lifetime learning how to do socket programming and then learning why you keep getting buffer overrun attacks. Alternatively, you could just install Apache. Now, coding a website is very easy. Learn HTML (it takes about 10 minutes for a normal person to learn that anything between < and > is an HTML tag and another 10 minutes to learn the 4 or 5 most commonly used HTML tags). Now, you can make a web page. You want to create a registrar? How much money do you have? It is very, very expensive to create a real registrar. You can make a fake one that only you use; it can be as easy as faking a lot of entries in your /etc/hosts file. But if you want to make a real one, you need to get accredited with ICANN. Then you have to get contracts with other registrars to work with them - and they won't want to work with you. You are better off trying to start your own bank than trying to start a registrar. Registering a domain name is much easier: go to a registrar (any registrar) and see if the domain name is available. If it is, register it. Finally, the technical expertise. It is low - very low - if you are simply trying to get a web page online on your own server. Get a computer (any computer with a network connection). Install Linux and Apache on it (there are literally thousands of how-to pages, books, videos, pictographs, etc. available). Make a web page. Plug it into the network. Register a domain name. Now the hard part that you skipped over: get a domain name service that points your domain name to the IP address of your webserver. Then you are done. — Preceding unsigned comment added by 209.149.113.90 (talk) 19:10, 7 January 2015 (UTC)
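As a rough illustration of the "learn socket programming versus just install Apache" point in the comment above, here is a minimal sketch of a hand-rolled web server in Python. Everything specific in it is an illustrative assumption rather than anything the commenters wrote: the port number (8080), the hard-coded HTML page, and the example hostname in the note below are made up, and a real server would need request parsing, error handling and security hardening that this sketch deliberately omits.

    # Minimal hand-rolled HTTP server: a sketch of the job Apache normally does for you.
    # Assumptions for illustration only: port 8080 and a single hard-coded page.
    import socket

    PAGE = (
        "<html><head><title>From scratch</title></head>"
        "<body><h1>Hello</h1><p>Served by a hand-written server.</p></body></html>"
    )

    def serve(port=8080):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind(("", port))          # listen on every local address
            srv.listen(5)
            while True:
                conn, _addr = srv.accept()
                with conn:
                    conn.recv(4096)       # read (and ignore) the browser's request
                    body = PAGE.encode("utf-8")
                    headers = (
                        "HTTP/1.1 200 OK\r\n"
                        "Content-Type: text/html; charset=utf-8\r\n"
                        f"Content-Length: {len(body)}\r\n"
                        "Connection: close\r\n"
                        "\r\n"
                    )
                    conn.sendall(headers.encode("ascii") + body)

    if __name__ == "__main__":
        serve()   # then browse to http://localhost:8080/

Pointing a name at such a server on your own machine is the "fake registrar" trick mentioned above: a line like "127.0.0.1 www.example.test" (a made-up hostname) in /etc/hosts does it locally, whereas a real registered domain needs a DNS record at your provider pointing at the server's public IP address.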
- The problem here is what is meant by "from scratch"? If you mean from the level of piles of rocks and trees, so you have to refine the silicon from which to make your circuits...then clearly you'd need an insane amount of time. If you also assume that you have to first discover electricity and invent the concept of a computer - then it's even longer! If you mean from a bare-bones PC, you'd need to at least rewrite Apache (or some kind of subset of it) and a large chunk of the operating system. If you have the right skills and knowledge, then you could probably make a web site with minimal functionality in under a year...maybe just a few months. But if you mean from a functional Linux machine with Apache already on it...then much, much less time. Really, it's not possible to come up with any kind of meaningful answer unless you first define your terms. SteveBaker (talk) 21:13, 7 January 2015 (UTC)
- From natural resources: rocks and trees. Explain the process. Or if it's too long, please help me find resources. I really need to be more tech-savvy, which involves knowing where the resources come from and how they are made into silicon chips and computers. 71.79.234.132 (talk) 21:43, 7 January 2015 (UTC)
- If you wish to make a website from scratch, you must first invent the universe. Vespine (talk) 21:47, 7 January 2015 (UTC)
- The reason I gave a flippant answer is that your question seems very frivolous. You KNOW what you are trying to find out; do you want us to google it for you? There's nothing stopping you from doing it yourself: start by searching "how are silicon chips made", then move on to "how was the transistor invented", perhaps "where does silicon come from"? history of computing is a good article, as is history of computing hardware and history of the internet; then basically be prepared to spend the rest of your life looking up random articles, clicking through links you find interesting and learning about stuff you've never dreamed of. There are no shortcuts. Welcome to the club. Vespine (talk) 21:53, 7 January 2015 (UTC)
- Really, 71.79.234.132, the previous commentators are right. Your question is too vague. What do you mean by "from scratch"? From basic elemental materials, from basic electronic components, or from a set of ready-bought computer components and an operating system? Please specify in more detail. JIP | Talk 21:58, 7 January 2015 (UTC)
- There is too much to learn. It would take lifetimes to learn it all; it would take years just to paraphrase it.
- Perhaps the original poster should read the famous essay, I, Pencil (1958), in which economist Leonard Read remarks that millions of humans are involved in the creation of even our simplest artifacts. Perhaps more accessible is the somewhat sensationalized 2010 TED Talk, When Ideas Have Sex, in which author Matt Ridley puts forward the premise that nobody on Earth knows how to build a computer mouse. For comparison, he shows a stone adze made by humans from the paleolithic era, and a similarly-sized computer mouse made in 2010. Many ancient humans knew how to build stone tools - each individual knew everything necessary to harness natural resources and produce state-of-the-art technology. Today, this is not true: instead, our collective - our society - knows how to build a computer mouse; but unlike our paleolithic predecessors, our technological knowledge does not reside in any individual. This is specialization of labor.
- These ideas paraphrase what you will read in any good book on the history of technology or anthropology. The body of knowledge that our species has accumulated is too large for you, as an individual, to know all of it.
- Our best and brightest students in modern schools begin diligent study at the age of 3 or 4. Students who wish to specialize in technological applications might not finish study until they are 22 or 25 or 30 years old. In other words, if you diligently study prior human accomplishments, it will take over twenty years of intense and difficult education, and you might just barely catch up to, say, elementary calculus - where our mathematicians were making new discoveries four hundred years ago. Even if we have an open mind and consider alternatives to conventional Western education, we have a meta-problem: we still do not know a method that is more efficient at transferring knowledge per unit time. While conventional education may have some flaws and inefficiencies, the alternatives tend to be categorically slower: you will not find any Montessori schools that crank out more practicing engineers per unit of time. In other words, we do not yet have technology to holistically educate our next generations about our existing technology. Instead, our system produces specialized students with areas of expertise, and the sum-total knowledge is only retained, piecemeal, among many different individuals.
- Extrapolating forward, it becomes clear that if our society keeps advancing our knowledge of science and technology, we will eventually reach a time when it takes so many years to learn the prerequisites that the average human will die before they can advance the species. Science fiction author Isaac Asimov explored this in his story Profession (1957), in which a super-advanced human society must figure out how to efficiently educate its citizens: first, students are selected for their psychological profile, and then a machine accelerates the transfer of knowledge from a computer-tape into their brain, because there is so much they need to learn. This trope has been emulated by lesser authors and screenwriters for decades to follow. I'm fairly certain that when we do invent such a device, it's going to look a lot like a free digital computer-based encyclopedia.
- Nimur (talk) 14:42, 8 January 2015 (UTC)
- So, what happens when there is a catastrophic global war that destroys most of the human knowledge stored in computers and electronic databases and kills many people with specialized expertise? Will that mean humans have to rebuild everything all over again? 71.79.234.132 (talk) 16:00, 8 January 2015 (UTC)
- Possibly. It depends on how complete the loss of knowledge is. Consider this: in much of the developing world, and even in some unfortunate parts of the developed world, basic facts of science - like the germ theory of disease - are not widely understood. It is not actually impossible to learn these facts; these facts are freely available and have not been "lost" to catastrophe; but nonetheless, there exist entire societies in which the critical mass of educated people is insufficient to shape national policy. I worry more about the climate of social behaviors that hinder progress, which are more dangerous than any "catastrophe-event" like an immense global war or a natural disaster. As long as humans are free to improve themselves, I am confident we will adapt whatever remaining natural resources we need to improve our condition, natural and manmade calamities notwithstanding. My optimism is not unique: if you'd like to spend an hour hearing great thinkers and innovators and policy-makers of our modern technological society discuss this topic, here is Ted Koppel moderating a 2006 "roundtable discussion": Seeing Beyond a World of Perpetual Threats. We're rebuilding our world every day. Nimur (talk) 16:33, 8 January 2015 (UTC)