Wikipedia:Reference desk/Archives/Science/2009 November 4
Welcome to the Wikipedia Science Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
November 4
Spiritual Science. Rectifying Hypotheses
can i create my own hyperlink-web here of my findings and studies with this 'spirituality'? i believe people would be interested in these things. i simply would like to be recognized as the author. please tell me my options. i simply want this question answered, i will not waste my time. —Preceding unsigned comment added by Love me alway (talk • contribs) 03:55, 4 November 2009 (UTC)
- Sorry, but Wikipedia is not a venue for original research. — Lomn 04:05, 4 November 2009 (UTC)
- Such a compilation may be acceptable on your user page or in a user-space sub-page. See the Help for User Subpages and guidelines for allowable subpages. Again, Wikipedia is not the place to conduct original research; but if you are merely compiling and organizing links to other Wikipedia articles that you find useful, you can put it in your user space. This area has a little less regulation for content, as it is not technically part of the encyclopedia. As far as ownership, all content must be submitted under the Creative Commons and GFDL licenses - keep this in mind. You do not own your user-page or the content you submit to it. Finally, remember that even though Wikipedia offers you some freedom in the user-page space, Wikipedia is not your web host. Our goal is to write an encyclopedia; your user-page space is really supposed to help you (and others) to make contributions to actual encyclopedia articles. Nimur (talk) 05:58, 4 November 2009 (UTC)
- Submitting material to Wikipedia is unwise if you wish to be recognized as the author. You could easily be accused later of copying (your own text) from Wikipedia. Cuddlyable3 (talk) 16:16, 4 November 2009 (UTC)
- You can generally prove that you were the author of a particular thing by examining the "history" of the item in question. However, there is nothing whatever you can do to prevent someone from changing what you wrote - or copying it - or deleting it - or anything else. The open licensing of Wikipedia not only allows that - it actively encourages it. However, your own "findings" about something are completely unwelcome here. The goal is for all information to be generated in a neutral manner by reference to respected source material. Nothing you think up for yourself is allowed into any Wikipedia article...so what you suggest would likely be vehemently opposed from all sides! SteveBaker (talk) 22:13, 4 November 2009 (UTC)
- I interpreted "findings" to mean "things the OP found on the web and wants to link to." If the "findings" are actually synthesis of original research, then as Steve has pointed out, they should not be published on Wikipedia. We have a strong policy against publishing original research here. Nimur (talk) 17:44, 5 November 2009 (UTC)
Name reactions in Chinese Wikipedia
Look at the articles about name reactions in Chinese Wikipedia. Why don't they translate the names into Chinese? --128.232.251.242 (talk) 10:37, 4 November 2009 (UTC)
- Chinese is in general iconic rather than sound-based. The problem with using Chinese designations, as they'd see it, is that it is us not using the proper icon but just a particular sound. Dmcq (talk) 10:54, 4 November 2009 (UTC)
- But almost all the people do have a Chinese name, with almost no exceptions, for instance see zh:Category:Living people. Adolph Wilhelm Hermann Kolbe translates to zh:阿道夫·威廉·赫尔曼·科尔贝 in Chinese. So why don't they translate the name Kolbe in Kolbe electrolysis into Chinese? --128.232.251.242 (talk) 11:12, 4 November 2009 (UTC)
- I didn't see any specific guideline there about it. I guess their chemists must just find it easier. You could always ask on a talk page there. Dmcq (talk) 12:33, 4 November 2009 (UTC)
- They don't translate the name because it's a proper noun specifically referring to the reaction. If it were also translated it would be a generic adjective of something. It looks odd because of the huge contrast between roman characters, but it's standard convention in most types of math and science to keep the names. Just because it was able to translate "Kolbe" doesn't mean it came out as "Kolbe"; in its eyes it translated "kolbe", which is a very important distinction. This all happens a whole lot in Japanese, too. Why don't we use their characters for their historical persons or theorems? It's "mostly" easier to move into a lettered format than from that into complex characters. I don't know the term for it in traditional Chinese, but it's "romaji" in Japanese, literally the romanization of the real language so that most anyone in the world can try to pronounce a word. A nod to European influence via early trading routes into the long-established Eastern cultures. ♪ daTheisen(talk) 15:53, 4 November 2009 (UTC)
- I think the reason is really because people are lazy and there are many different (conflicting) complex standards for transliterating foreign names. PRC, Hong Kong and Taiwan all have different transliteration schemes with different characters, which editors would need to deal with using templates, which has quite a learning curve. It's nowhere near as simple as romaji for Japanese, and it's further compounded by the great firewall of China (although I heard Wikipedia's been unblocked?). --antilivedT | C | G 06:17, 5 November 2009 (UTC)
- In fact our own languages reference desk would probably be best at WP:RD/L. My guess is that people read and write about chemistry but names of people and places are things one says. Even so you will quite often see some special name translated by meaning rather than sound into English. Dmcq (talk) 15:42, 4 November 2009 (UTC)
It's a myth that Chinese is iconic rather than sound-based. All languages are phonetic, but not all languages have a phonemic writing system. John Riemann Soong (talk) 16:39, 4 November 2009 (UTC)
- All spoken languages are phonetic. Sign languages are not, and it is possible to have an entirely written language which is not simply a representation of a spoken language. In fact, I gather written literary Chinese (Literary Sinitic) is quite close to this [1]. 86.142.224.71 (talk) 18:38, 4 November 2009 (UTC)
- Which is what I meant. It can be pronounced very differently in different dialects. Dmcq (talk) 22:27, 4 November 2009 (UTC)
- Not only pronounced differently: it is a different language to most (all?) of the 'dialects'. Children who have a mother tongue other than Mandarin (and even those who have Mandarin?) have to learn it in order to write. 86.142.224.71 (talk) 22:37, 4 November 2009 (UTC)
- Classical Chinese may be helpful here Nil Einne (talk) 10:01, 5 November 2009 (UTC)
- Excellent. And it links to Vernacular Chinese, which is the one that children have to learn Mandarin to be able to read. 86.142.224.71 (talk) 16:28, 5 November 2009 (UTC)
universities
can i know what are the top universities in uk for m pharmacy —Preceding unsigned comment added by Nagtej (talk • contribs) 13:39, 4 November 2009 (UTC)
- Sorry, I'm not sure how I can help. The same way that Wikipedia isn't a source of original research, we also can't try to create new ideas using information from Wikipedia. We hold to a standard of neutral point of view in articles, so there would be no way to make guesses like that even if we thought we should! Good luck... ♪ daTheisen(talk) 15:57, 4 November 2009 (UTC)
- The QUB School of Pharmacy has been rated as the top Pharmacy School in the UK in the 'Times Good University Guide 2010' [www.qub.ac.uk/schools/SchoolofPharmacy/dl/] and The University of Nottingham in the 2006 Times Good University Guide [www.nottingham.ac.uk/pharmacy/undergraduates/index.php]. Tracking down the mentioned guide will surely lead to others. 75.41.110.200 (talk) 16:13, 4 November 2009 (UTC)
- UCAS (Universities & Colleges Admissions Service) can help you with information on undergraduate degree programmes at UK universities and colleges.Cuddlyable3 (talk) 16:10, 4 November 2009 (UTC)
- Yet another best of list at the guardian. --Tagishsimon (talk) 16:14, 4 November 2009 (UTC)
- See here for the Times Good University Guide for Pharmacology and Pharmacy. --Tango (talk) 17:53, 4 November 2009 (UTC)
Was the discovery of evolution "inevitable" in the 19th Century? Why?
I visited an aquarium last week, and one of the exhibits said that because of changes in scientific thinking the discovery of evolution was "inevitable" in the 19th Century. It mentioned that Charles Darwin and Alfred Wallace discovered it independently, and said that even if they had not seen it the theory's time had come and someone else would have done so soon. Is this true, or, if not for these two discoverers, might we still not understand evolution today? If it is true, what changes in thinking and previous discoveries made the theory of evolution inevitable? As a side question do you "discover", "invent", or "author" a theory - I don't know quite what to write! -- Q Chris (talk) 15:01, 4 November 2009 (UTC)
- It's hard/impossible to speculate whether anything that already happened was "inevitable" but I'm fairly certain that someone else would have struck upon the idea before too long. A lot of the original theory was essentially "Wait a minute... That animal looks a lot like that other animal" and you don't need a rich guy or a traveler to notice that. The Great man theory is probably relevant here, but essentially it's not really possible to factually answer your question. ~ Amory (u • t • c) 15:14, 4 November 2009 (UTC)
- Oh, and as for your last question, you can say "constructed" or "formulated." ~ Amory (u • t • c) 15:16, 4 November 2009 (UTC)
- It was inevitable if you consider that genetic studies were independent of evolution. While the early studies didn't know anything about genes or DNA, they were tracing patterns of inheritance from parent to child, such as colors of kernels of corn and eye color in mice. Eventually, someone would have to recognize that traits were being passed from parent to child. As soon as someone were to stumble upon mutation, evolution would be evident. -- kainaw™ 15:20, 4 November 2009 (UTC)
- The idea of evolution generally was well-known and well-discussed well before Darwin and Wallace. The idea of natural selection being the mechanism of evolution probably would have come out at some point if both of those two had been hit by a train before their time. This distinction is rather important. It is not a matter of saying "this animal looks like this one"—people had already been doing that for a long time, even in the scientific sphere (see, e.g. Erasmus Darwin, Jean-Baptiste Lamarck, and less scientifically, Robert Chambers, etc.).
- There were a lot of people thinking along similar lines at the time—it was definitely "in the air". Darwin is especially well-known because he articulated it in a rather careful form, with lots of evidence, and with all the power of his already-established scientific name attached to it. (Wallace was, in this sense, very much at a disadvantage.) He had powerful friends who ensured that the theory would be taken seriously and given attention within the scientific community and not dismissed as just a variation on Lamarck or as just political claptrap (Cf. Vestiges of the Natural History of Creation).
- As for the distinction between "discoveries", "inventions", "authors"—it's a great question, one that professional historians have actually argued quite a lot about. It depends on your conception of authorship itself, and on your conception of what is to be authored. Are theories hanging out there "in the world", waiting to be discovered? Or is the process of articulating a theory a creative act as well as an objective one? I tend to think the answer is somewhere in between—there is a "core" reality to be "discovered", but the aspects of it that get written about, and the ways in which they are formulated in human language, and made convincing and compelling, are definitely part of an authorial intervention—an "invention", one could say. This is much easier to see as time goes on—Darwin's theory of natural selection, as he articulated it, contains quite a bit of "nature" in it, but it is very much a work of the particular man himself, and his formulation of it, his preoccupations, his way of arguing it, all reflect that very strongly. --Mr.98 (talk) 15:28, 4 November 2009 (UTC)
- (ec, and now mostly redundant. But I wrote it, so you better read it ;-): "Evolution" was evident quite a while before Darwin and Wallace from looking at fossils. See history of evolutionary thought. Erasmus Darwin described evolution in the late 18th century, and Lamarck formulated his idea of species evolving to better meet environmental conditions in 1809. What Darwin (and Wallace) added was the mechanism of natural selection working on variations of inherited traits, not the idea of evolution itself. --Stephan Schulz (talk) 15:32, 4 November 2009 (UTC)
- I would argue that yes, evolution was an inevitable 'discovery'. It's extremely elementary that weak things are removed over time and robust things remain. It applies to everything across the spectrum, from the erosion of mountains, to biological features, to ruling powers. I suppose it appeared when it did as rationality was starting to question the religious hegemony of past centuries, not because the idea itself was necessarily of profound importance. Vranak (talk) 16:09, 4 November 2009 (UTC)
- I think you underestimate the difficulty of making the scientific case (ignore the religious stuff for a moment—it mattered for some, not for others). Remember that when Darwin proposed evolution there was no good model for biological heredity at all, a relatively new knowledge of the fossil record, and no consensus over the age of the Earth. This makes making a compelling scientific argument about natural selection rather difficult. If the rate of change is too slow, and the Earth is too young, then it doesn't work. Without a model of heredity that allows for things like mutations, you don't have any way for speciation to take place. These are non-trivial concerns. There were plenty of people before Darwin who waved their hands and said "oh this is a common principle to all things so it applies to humans" (again, see Robert Chambers), but 1. they were more often than not wrong (because there are a lot of candidate "common principles"), and 2. they were totally uncompelling (because just because something works in one arena doesn't mean it works in another). Even in Darwin's case, it was not totally compelling—most scientists did not accept natural selection as the mechanism of evolution until long after he had died and the modern evolutionary synthesis was developed (some 70 years after Origin of Species!). (And this latter point has nothing to do with religion—they accepted evolution in general.) --Mr.98 (talk) 16:32, 4 November 2009 (UTC)
- If you read The Origin of Species it's quite clear that Darwin wasn't proposing an original idea. Rather he did a lot of careful synthesis and research in order to champion an idea. The Origin gave a confused set of ideas clarity and made it obvious how and why they had to be true. I think you can divide scientific "discoveries" into three kinds:
- "wtf is this" discoveries, finding something, such as a fossil or a strange unexpected residue, eg Ardi or teflon
- "eureka" discoveries, a sudden stroke of genius like the Dirac equation and Archimedes' principle
- "synthesis" "discoveries" like the theory of evolution or plate tectonics. These are the best and hardest.
- In the popular imagination, people think science is largely of the "eureka" type, which are actually probably the rarest. -Craig Pemberton (talk) 16:45, 4 November 2009 (UTC)
- I like that classification, and I will make the phrase "wtf discoveries" part of my working vocabulary ;-) --Stephan Schulz (talk) 18:10, 4 November 2009 (UTC)
self-ionisation of glacial acetic acid
How does it compare to water? How does the entropy contribution of the reaction change? What about enthalpy of self-ionisation? John Riemann Soong (talk) 16:37, 4 November 2009 (UTC)
Current and time threshold
I'm trying to design a circuit that activates a transistor when a photodiode has been illuminated at a certain intensity for a certain time, but only when at least a certain current is produced non-stop during that time, e.g. 5µA for 50ms. Essentially I want it to start a "stopwatch" the instant the current rises above 5µA, and reset that stopwatch the instant the current drops below 5µA. If the timer reaches 50ms before it is reset, a monostable LMC555 fires and holds a transistor (logic-level MOSFET or Darlington BJT - I haven't decided yet) high for about half a second. Ideally the entire procedure would be analogue, since I want to keep everything small and simple, and I also want it to use as little power as possible so I can run it off a minimal battery like a button cell. I'd also like to keep it as small as possible physically.
I've brainstormed about this for a while, but I can't figure out any good, simple ways to do it. Any ideas? --Link (t•c•m) 16:40, 4 November 2009 (UTC)
- First you need to convert the photodiode current into a voltage with either a current-to-voltage op-amp circuit or just a resistor. The output of the I-V converter needs to go to some sort of Schmitt trigger circuit with its threshold set to the equivalent of your desired photodiode current. This will give you a high level out when the current is above threshold. Now you need to use the rising edge of that pulse to start your timer and the falling edge to stop your timer. For the timer circuit I would tend to go for a gated square wave oscillator and a counter, but there must be other ways. Trouble is all the circuitry described probably wouldn't run under 5 V. Not sure if CMOS would work reliably at 1.5 V. --79.67.31.17 (talk) 18:31, 5 November 2009 (UTC)
[Circuit diagram attached in the original post; not reproduced in this archive.] Cuddlyable3 (talk) 20:50, 5 November 2009 (UTC)
In the circuit shown the transistor conducts for 0.5 s after the photodiode has been lit for 50 ms and can conduct longer if the photodiode is lit longer. Cuddlyable3 (talk) 21:05, 5 November 2009 (UTC)
- Cheers! I'll check it out later (I'm not exactly awake yet). --Link (t•c•m) 09:22, 6 November 2009 (UTC)
- Looks like a nice simple circuit that should work well. However, I think the photodiode is shown connected the wrong way round in this circuit as it needs to be used in the photoconductive (not photovoltaic) mode. Also the OP wanted operation from a single cell (1.5 V) which is far more complex unless a little inverter is used. —Preceding unsigned comment added by 79.75.83.17 (talk) 15:04, 7 November 2009 (UTC)
- I've actually decided to go about this a bit differently (using a microcontroller) since I need high noise immunity. I did notice the photodiode was connected the wrong way around. Also, I was actually planning to use a 3V button cell - there is very very little that happily runs off 1.5V. :) --Link (t•c•m) 17:40, 7 November 2009 (UTC)
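For readers who want to experiment with the logic itself, here is a minimal Python sketch of the threshold-and-hold behaviour discussed above (5 µA held for 50 ms starts a 0.5 s output pulse; any dip below threshold resets the "stopwatch"). It models only the timing logic, not the analogue circuit or any particular microcontroller's firmware, and the current trace at the end is invented purely for illustration.

```python
# Behavioral sketch of the timing logic described above, stepped in 1 ms
# increments. This is not firmware for any specific microcontroller; the
# current trace used at the bottom is made up for illustration.
I_THRESHOLD = 5e-6    # A, minimum photodiode current
HOLD_STEPS  = 50      # 50 ms of uninterrupted above-threshold current required
FIRE_STEPS  = 500     # 0.5 s output pulse once triggered

def simulate(current_trace_amps):
    """Yield the transistor state (True = driven on) for each 1 ms sample."""
    above_for = 0                       # the "stopwatch": consecutive ms above threshold
    fire_left = 0                       # ms remaining on the output pulse
    for i_pd in current_trace_amps:
        above_for = above_for + 1 if i_pd >= I_THRESHOLD else 0   # run or reset stopwatch
        if above_for >= HOLD_STEPS and fire_left == 0:
            fire_left = FIRE_STEPS                                # the "monostable" fires
        yield fire_left > 0
        fire_left = max(0, fire_left - 1)

# 30 ms of light, a 1 ms dropout (which resets the stopwatch), then 80 ms of
# light: only the second burst is long enough, so the output goes high 50 ms in.
trace = [6e-6] * 30 + [1e-6] + [6e-6] * 80
print(sum(simulate(trace)), "of the 111 samples had the transistor driven on")
```

On the analogue side, the same 5 µA threshold would sit at 1.0 V across an illustrative 200 kΩ load resistor (simple Ohm's law), which is a plausible comparator threshold on a 3 V button cell; the exact value would of course depend on the real design.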
reactivity esters v. carboxylic acids
[edit]Why are carboxylic acids classified as less reactive than esters? Alkoxides tend to be worse leaving groups than hydroxides, right? (Except methoxide, which has a lower pKa). Alkyl groups tend to be electron donating, right..? (Or do they also delocalise some of the negative charge on the ethoxy ester oxygen?) John Riemann Soong (talk) 16:51, 4 November 2009 (UTC)
- How do you mean "less reactive" anyway? In terms of cleaving the C-O bond? Hydroxide is a worse leaving group than any alkoxide, so clearly, in terms of -COOR -> -CO+ + -OR, R = alkyl is more favorable than R = H. In other ways, for example in terms of reactivity with bases, carboxylic acids are more reactive. You need to define what reaction you are trying to do! --Jayron32 21:12, 4 November 2009 (UTC)
- Acyl substitution, naturally. (Acid-base reactions are trivial...) How is -OH a bad leaving group compared to an alkoxide? The pKa of water is 15.7; the pKa of most alcohols is 16-18. (Save phenols) John Riemann Soong (talk) 21:41, 4 November 2009 (UTC)
- Alkoxides are more stable ions because they are "softer"... in other words, there is greater dispersion of electric charge across a larger ion. That makes it more kinetically favorable; i.e. it sticks around longer. There's more going on here than just looking at the pKa, which is basically "H+ affinity" and not much else. Thermodynamically, it is harder to remove a proton from an alcohol than from water (and thus, conversely, it is more energetically favorable to protonate an alkoxide anion than to protonate hydroxide). So, if one considers the controlling factor in the reaction to be "Le Chatelier's principle" ONLY (that is, the protonation of the leaving group driving the equilibrium towards completion), then it would appear that hydroxide would make a better leaving group. However, there are other factors to consider besides that, for example the equilibrium constant for the leaving-group process (i.e. RCOOR <-> RCO+ + -OR). The more OR is made, the faster it will be quenched by available H+ ions. The difference in the pKa's is probably not nearly as great as the difference in the K's for that process, for example if R = H vs. R = alkyl. You have a complex mix of processes here, and the kinetics of the slowest step is the driving force here. Protonation of the anion is a relatively fast step, so the difference in the pKa's is unlikely to be a major factor here. --Jayron32 05:58, 5 November 2009 (UTC)
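As a purely numerical aside on the pKa figures quoted above (water 15.7, typical alcohols roughly 16-18): because pKa is a base-10 logarithm, those gaps correspond to Ka ratios of only about 2x to 200x. The sketch below just does that arithmetic; it does not by itself settle the leaving-group argument in this thread.

```python
# pKa = -log10(Ka), so a pKa gap of d corresponds to a Ka ratio of 10**d.
# Values taken from the thread above; this only compares acidities and is not
# an argument about acyl substitution rates.
pKa_water = 15.7
for pKa_alcohol in (16.0, 17.0, 18.0):
    ka_ratio = 10 ** (pKa_alcohol - pKa_water)   # Ka(water) / Ka(alcohol)
    print(f"alcohol pKa {pKa_alcohol}: water is ~{ka_ratio:.0f}x the stronger acid")
```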
fetus/mother sharing
When a pregnant woman has an orgasm does the fetus experience pleasure or the orgasm as well? 71.100.13.177 (talk) 16:58, 4 November 2009 (UTC)
- The umbilical cord doesn't contain any nerves, so the fetus couldn't experience the orgasm itself. They might get some of the hormones that are released during orgasm (oxytocin, prolactin and maybe some others), which might give the fetus the same feelings of pleasure and relaxation following the orgasm. --Tango (talk) 17:59, 4 November 2009 (UTC)
- The main endogenous opioid associated with orgasm is β-endorphin, which can cross the placental barrier, so it seems plausible that the fetus would experience the opiate-high elements of orgasm, but that's a long way from the full and complex emotional and physical experience. In any case, I seriously doubt that the answer is known. --Sean 18:14, 4 November 2009 (UTC)
Antabus
In the film Skavabölen pojat, the family's father, already becoming slightly alcoholic when his sons are of pre-school age, is shown having taken an Antabus pill to try to cure his alcoholism. He shows his sons his bare stomach, where this Antabus pill has supposedly lodged itself firmly enough to be outwardly visible and tangible. A decade later, desperate to drink more alcohol, he is shown to surgically remove this pill from his stomach, so he can continue drinking without having any nauseous effects. Now I have never had to take Antabus myself, and I hope I never will. So therefore my question is out of scientific curiosity: The film gives the impression that once an Antabus pill is taken, it permanently lodges itself in the person's stomach, never dissolving. This is entirely unlike how I have come to understand pills to work - they should dissolve within days. How is this? How do Antabus pills work? Are they really permanent or have I just misunderstood the film's hints? JIP | Talk 19:59, 4 November 2009 (UTC)
- The Disulfiram article has information about the pharmacology of Antabuse. The effect is most certainly not permanent. The film appears to have taken some liberties with the mechanism of action. --- Medical geneticist (talk) 20:26, 4 November 2009 (UTC)
- According to our article, the half-life of Antabus (disulfiram) is somewhere around 60-120 hours, and it can have some effect for up to two weeks. Red Act (talk) 20:30, 4 November 2009 (UTC)
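As a rough sanity check on those numbers (a sketch assuming simple first-order elimination, which real disulfiram pharmacokinetics only approximates), the fraction of a single dose remaining after two weeks can be estimated as follows.

```python
# First-order decay: fraction remaining = exp(-ln(2) * t / t_half).
# The 60-120 h half-life figures are the ones quoted above; the two-week
# horizon matches the "some effect for up to two weeks" remark.
import math

t_hours = 14 * 24                         # two weeks, in hours
for t_half in (60, 120):                  # hours
    remaining = math.exp(-math.log(2) * t_hours / t_half)
    print(f"t1/2 = {t_half:3d} h -> {remaining:.1%} of a dose left after 14 days")
# Roughly 2% at the short end and 14% at the long end, so a lingering (if
# weak) effect two weeks out is at least plausible.
```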
Opting out of evolution in biology classes
Currently, many high-school biology classes allow parents to opt their students out of dissections if they are morally or religiously opposed to them. Why isn't a similar system put in place for the teaching of evolution? --J4\/4 <talk> 20:24, 4 November 2009 (UTC)
- Is studying evolution banned by any religion? I'm fairly sure actual dissection is banned by the religion itself, not just the parents. Vimescarrot (talk) 20:30, 4 November 2009 (UTC)
- In some schools, it is. In my California public high school in the 1990s, you could opt out of the evolution unit if your parents wanted you to. You had to go sit in study hall for those two weeks, or whatever length of time it was, while the rest of us stared at pictures of horses' feet. I'm not sure if anyone in my class did it; if anyone did, it was only one or two. (It was a very boring unit, incidentally—not nearly as racy as they had let on—but not as boring as being in study hall, probably.) I imagine that this issue, like all U.S. school-curricula issues, varies not only state by state, but probably even school district by school district. --Mr.98 (talk) 20:31, 4 November 2009 (UTC)
- There is in the UK. I once helped out on a school trip that involved a visit to a natural history museum and we had to be careful what we said because one of the children wasn't allowed to learn about evolution (there was only one exhibit that we had to gloss over, the rest was pretty safe). --Tango (talk) 20:32, 4 November 2009 (UTC)
- Schools are flexible, and should be. As a seven year old in the UK I adamantly came out as an atheist and refused to take part in anything related to Christmas or nativity because I said it was religious propaganda. I recall I was spared learning Carols and poems about Jesus and given parts of Old Possum's Book of Practical Cats to memorise instead. I doubt the school even bothered to check with my parents, and fifteen months later I softened and agreed to play Noah in a play (which didn't bother me because no one actually believes in Noah). I guess in an ideal world children should be able to opt themselves out of anything as soon as they know enough about it to feel they could make a decision. But I am far from sure that parents should be allowed to opt their children out of things, especially not for bad reasons. To really put the cat amongst the pigeons my reading of this as a Christian is that Genesis clearly explains "knowledge of Good and Evil corrupts" which is why I don't want my own children to see explicit violence or nastiness but there is no forbidden fruit on a tree of "knowledge of blindingly obvious consensus science" --BozMo talk 20:42, 4 November 2009 (UTC)
- IMO all teaching should be done in a neutral manner so there should be no need for anyone to opt out. Knowledge is always good, it is the application of knowledge that can be bad. --Tango (talk) 20:50, 4 November 2009 (UTC)
- Not clear what "knowledge" is then, and when it demerges from experience. "Knowledge of how extreme pain feels"? even if you do disagree with Einstein about weapon technology... --BozMo talk 21:04, 4 November 2009 (UTC)
- Personally, I'd say that one can be a well educated adult without having engaged in dissections. It takes some effort to get the same insights from books and such, but it is certainly possible. On the other hand, I would say that one's education is grossly deficient if you don't understand the basic principles of evolution and the evidence for it. In my opinion, that's true even if one chooses not to accept evolution as factually true. It would be like not learning the atomic theory of matter, or not learning the structure of the solar system. Sure one can survive without that knowledge, but there is rather something incomplete with an education that skips over such basic issues. So, personally, I would resist efforts to allow students to opt out of evolution discussions on the grounds that it really deprives the student of an important and basic science understanding. I would also point out that evolution is included on many standardized science tests, including those required for high school graduation in some areas. Dragons flight (talk) 20:52, 4 November 2009 (UTC)
- I kind of agree but does that mean I can also force them to learn Shakespeare or force them to learn to swim "one's education is grossly deficient"? I am just not sure I have the right to force it on people... --BozMo talk 21:00, 4 November 2009 (UTC)
- You can object to the whole principle of compulsory education if you want, but you won't find many people that agree with you (other than children that hate school!). --Tango (talk) 21:35, 4 November 2009 (UTC)
- No I am not amongst these folk Compulsory_education#Criticism but really we are talking compulsory syllabus, not compulsory education. You could drop biology 6 years younger than you were allowed to drop Latin at my school. How much evolution is really core education versus say Shakespeare? Pre genetics the careers advice used to be "if you can do maths and want to, do maths, if you can do maths and want to do science do Physics, if you cannot do maths and want to do science do Chemistry, if you cannot do science but want to do science do biology". Clearly I am outraged at having to do so much Latin, but I don't want biology to become the new Latin either... :) --BozMo talk 22:01, 4 November 2009 (UTC)
- Evolution is a more useful concept than Shakespeare. You can use it to find homologues, trace human migrations, do gene mapping, discover drugs, discover gene interactions, model ecological populations, disruptions and ecological balances. Plus, advanced evolution involves a lot of mathematical modelling not unlike that of say, finance. Evolution >> literature. John Riemann Soong (talk) 05:47, 6 November 2009 (UTC)
I don't see how you could practically skip the evolution section. As somebody said, nothing in biology makes sense except in light of evolution -- what would you do for the rest of the course? Looie496 (talk) 21:50, 4 November 2009 (UTC)
- There are two counteracting principles here:
- Firstly, you cannot intelligently criticize something unless you've learned a fair bit about it first. Children of parents who disbelieve in evolution are precisely the ones who NEED to be taught it because they'll never learn it any other way - and they'll be incapable of making their own minds up about it as adults unless they were taught it while their minds were still flexible enough to absorb it. That cuts both ways - I don't think atheist parents should be able to prevent their kids from doing at least a basic comparative religion class. That there are religions is a fact - and understanding a reasonable range of them is well worth-while. So long as the class sticks to the observable facts (in both cases) - there should be no problem.
- However, we have a problem. Kids are only in school for so long - and their capacity for maintaining focus is somewhat limited. So we can't go in and teach them absolutely everything about everything. Some things have to take a higher priority. Fundamentals - math, literacy, language, basic science - all have to have a certain amount of time assigned to them, it's just unavoidable. The amount of time left in the school year after the essentials determines how much exposure to these other topics one gets. It's crucial that they see a little of everything that the world of knowledge has to offer - but spending (say) an entire year on evolution or on comparative religion is far too much.
- IMHO, parents' opinions should count for very little indeed. It's flat out not right that a child should be forcibly prevented from learning things that their idiot parents failed to grasp. The entire reason that most countries in the world have mandatory education for children is because their parents can't be trusted to do the right thing otherwise. Being able to choose not to have their children learn evolution (or comparative religion) for a few weeks is no more defensible than allowing parents to not have their children be educated at all.
- If you allow parents to cherry-pick the courses they want their kids to attend - on religious grounds, or any other grounds for that matter - then pretty soon you're going to have parents who adhere to Sharia law deciding that their female children should not be educated at all past the age of 8. As a nation, we come to a consensus as to what needs to be taught - and everyone should be taught it.
- SteveBaker (talk) 22:02, 4 November 2009 (UTC)
- Apart from the "as a nation" bit which we are clearly not, I find it hard to disagree with any of this. Up until the child can make an informed refusal they should be taught everything. How much they get taught to make an informed refusal is subjective. --BozMo talk 22:05, 4 November 2009 (UTC)
- I'm not aware of any schools in the US that allow you to opt out of dissection based on religious reasons. Given separation of church and state, it won't be a decision they can make lightly. Most schools either fail the students for that assignment, or make the assignment "free" so there is no grade for it. — The Hand That Feeds You:Bite 22:22, 4 November 2009 (UTC)
- The reason I mentioned it is because it is that way at my (public) school. If a student brought in a signed note from a parent or guardian explaining why dissections are against his/her moral or religious principles, the student would be given an alternate assignment instead. ----J4\/4 <talk> 23:10, 4 November 2009 (UTC)
- Where I grew up, one could get out of the 9th grade dissections for any reason whatsoever (not just religious ones). However, one wasn't allowed to opt-out of the dissections in the AP Bio class (typically 12th grade), but then the AP class was optional while the 9th grade class was mandatory. I don't recall what they did about grades in 9th grade, but I don't recall it being a big deal. Dragons flight (talk) 00:33, 5 November 2009 (UTC)
- Another question that this raises is, what other subjects could one opt out of? Sex education is the common one—again for moral reasons. Whether that is something that the parent should get to determine or not is a pretty hot-button topic. What about select episodes in US history that one doesn't like the presentation of? Certain objectionable books? It's a rather nasty slippery slope to go down—which is why school boards usually set overall standards that are held to, rather than considering it on a case-by-case basis. In any case, there is no quick-and-easy answer, as it is not just a question about evolution, but about the rights of parents v. the rights of states, the goals of compulsory education, and so forth. --Mr.98 (talk) 23:16, 4 November 2009 (UTC)
- The objection against dissection runs deeper than simply "religion versus science" and I think it's useful to make note of that. Dissecting animals involves killing things. They're "just" animals, but let's not sugarcoat it. Living things are being killed so kids can look at them. Whether it stems from religious or personal morality, the concept that killing things - any things - is a "bad thing" is extremely ancient and cannot be lightly cast aside. Being able to poke around inside a dead animal to learn anatomy is arguably a good enough justification for doing it, but it's hardly a foregone conclusion - only a tiny percentage of the kids who partake in it will ever make any use of that knowledge ever again. Don't get me wrong; I did it and I had no qualms about it, but I'm honestly not sure how much I actually learned from it (that I didn't concurrently or later learn from models, videos, books, etc.) My later studies - which included studying human bodily remains - didn't hinge upon splitting open a frog and a worm in grade 10 Biology. Matt Deres (talk) 01:50, 5 November 2009 (UTC)
- But that's precisely the point. While 99% of students seemed to have gained nothing from it - and some of them were grossed-out, maybe had to run to the bathroom to puke, maybe just couldn't bring themselves to do it, maybe took that cue to become Vegans - the other 1% may have become so inspired by the process that they decided to become surgeons or to enter the field of anatomy, biology, zoology or whatever. The problem is that we don't know who those 1% are until we have them dissect something - and we need that early inspiration in order to get kids to be passionate about something (either way - for or against). The experience of doing it might well turn other kids off - but that's really the point of it. I bet nearly everyone who did that (we dissected earthworms and a cow's eye) remembers that hour of Biology class more vividly than almost any other day of their entire school lives. It's not about teaching anatomy any more than measuring the period of a pendulum in Physics classes is about learning that oh-so-not-vital length-versus-time equation that 99% of them will never use or remember again. It's an experience that no parent is ever likely to teach them...and that's precisely why you shouldn't be able to opt-out. That's also why we need to keep metalwork & woodwork class, art class and music classes, if kids never get to experience those things - how will they know whether they are passionate about them? SteveBaker (talk) 13:07, 5 November 2009 (UTC)
- It's hard to prove a counter-factual, though. How do we know that 1% wouldn't have found another path there? (Or that your 99% weren't so turned off by it that they decided science class wasn't for them?) Have we really proven in a rigorous way that dissection actually is a useful pedagogical tool, to the point where those who believe that it is just "unnecessary" killing of animals should be ignored? I don't know the answer, but that seems at issue here—you're assuming the memory of the spectacle itself translates into good pedagogy, but I'm not sure that necessarily follows. (Incidentally, my favorite take on classroom dissection is this one.) --Mr.98 (talk) 15:21, 5 November 2009 (UTC)
- Steve, you seem to have missed the central point of my post and commented solely on my anecdote. Killing things is usually seen as a "bad thing" unless the killing is justified. My point is that the line between "not justified" and "justified" is not the same for everyone at all times and people need to keep that in mind before they go off half-cocked about the choice regarding dissection being a "religion vs science" thing.
- Your comments about the impact high school dissection has on kids can also swing both ways - I would wager that for every kid that got turned on to biology by doing a dissection you would find at least an equal number of students who were traumatized at the very thought and therefore failed to go on to contribute to the studies of cladistics or ethology. While you and I obviously think back to those days as ones of discovery, there are a great number of people who think back to it and shudder at the very thought. Matt Deres (talk) 22:21, 5 November 2009 (UTC)
Bears can play hockey?
I watched an interesting video on the internet that shows a team of bears playing hockey. This is the internet so it can quite possibly be fake, but it at least looks real. What do you think? http://video.yahoo.com/network/100284668?v=6255496&l=4418225 if it says "video not available", bypass your cache. -- penubag (talk) 22:07, 4 November 2009 (UTC)
- Yes, according to ABC News, bears playing ice hockey is a standard stunt in the Russian circus.[2] Red Act (talk) 22:30, 4 November 2009 (UTC)
- That's crazy. I'd love to go see this in person! -- penubag (talk) 01:25, 5 November 2009 (UTC)
- Bypass your cache? 218.25.32.210 (talk) 01:23, 5 November 2009 (UTC)
- WP:Bypass your cache -- penubag (talk) 01:24, 5 November 2009 (UTC)
- I remember seeing a hockey-playing bear at a Budapest circus, so it isn't just Russians (or it was a Russian traveling circus) Rmhermen (talk) 03:05, 5 November 2009 (UTC)
- Are they playing hockey or just holding sticks and hitting pucks? DRosenbach (Talk | Contribs) 03:44, 5 November 2009 (UTC)
- Are we talking ice hockey then? Nil Einne (talk) 09:26, 5 November 2009 (UTC)
- There are some bears in Boston and Providence that can play ice hockey. --Mark PEA (talk) 17:41, 5 November 2009 (UTC)
- They play football in Chicago.—Preceding unsigned comment added by Googlemeister (talk • contribs)
- In Chicago, you can watch wolves playing hockey. Edison (talk) 15:36, 6 November 2009 (UTC)
- It's amazing how good they are. They're completing passes and everything, and actively trying to score goals. They seem reasonably comfortable ice-skating around in a bipedal fashion.
- White team needs a new goal-keeper, though. APL (talk) 17:48, 5 November 2009 (UTC)
- Could it be a combination of some real bears, people in costumes, computer graphics, and skillful editing? Today one sees lots of TV commercials with animals doing fake things. There are about 39 edits or cuts between shots, averaging one every 3.5 seconds, in this 2-minute video, which would be an opportunity to insert closeups of fakery, or to intercut shots from different times to make it look like continuous play. It is also very fuzzy for a slick production with that much editing. This is no camcorder shot. Edison (talk) 15:43, 6 November 2009 (UTC)
- To me the fuzziness looks like it's come from many generations of video tape duplication. Presumably this was on TV at some point. Since there's a live audience, and this is supposedly a common stunt in Russia, I doubt that it's literally fake. I'm sure the editing improves it, but the crowd is loving it, so even without editing it must be pretty good. APL (talk) 20:43, 6 November 2009 (UTC)
Harmful beryllium oxide in ceramic insulators
I was reading the article about microwave ovens here in Wikipedia, and it said that some microwave oven magnetrons have ceramic insulators with a piece of harmful beryllium oxide (beryllia) added. How will a person know if the ceramic insulator with beryllia is a bit broken? Will the microwave oven keep on working if it is a little broken (or a little crushed)? If the ceramic insulator should be a little broken or crushed, can the dust from it get inside the cooking chamber (or outside of the microwave oven)? JTimbboy (talk) 22:46, 4 November 2009 (UTC)
- The OP probably refers to this text in the article Microwave oven: Some magnetrons have ceramic insulators with a piece of beryllium oxide (beryllia) added—these ceramics often appear somewhat pink or purple-colored. The beryllium in such oxides is a serious chemical hazard if crushed and ingested (eg, inhaling dust). In addition, beryllia is listed as a confirmed human carcinogen by the IARC; therefore, broken ceramic insulators or magnetrons should not be handled. This is obviously only a danger if the microwave oven becomes physically damaged (ie, cracked ceramics) or upon opening and handling the magnetron directly, and as such should not occur during normal usage. Cuddlyable3 (talk) 01:37, 5 November 2009 (UTC)