
Wikipedia:Reference desk/Archives/Science/2010 December 19

Welcome to the Wikipedia Science Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


December 19

Properties of liquid diamond

In recent news articles there's been mention of the possibility of diamond oceans at the centre of ice-giant planets like Neptune. What would it be like to travel as a submariner in such an ocean, assuming I had a magically strong bathysphere? What properties does liquid diamond have that would make the journey unique?

Sober discussion and wild speculation are both welcome.

Thanks Adambrowne666 (talk) 00:08, 19 December 2010 (UTC)[reply]

[Image caption: phase diagram of carbon. The liquid phase isn't really "diamond".]
"Liquid diamond" cannot exist, given the type of bonding that exists in a diamond. Diamond is a "network solid" consisting of a single massive covalently bonded network of carbon atoms. In order to liquify, one would need to have discrete molecules of some sort, diamond just isn't organized that way. There could be liquid carbon, but it exists only at very high pressures and temperatures; under pressures less than 10 megapascals, (about 100x the pressure of Earth's atmosphere) it sublimes. Speaking of "liquid diamonds" is the fanciful sort of stuff you read in the science sections of newspapers, but such things don't really exist. --Jayron32 00:49, 19 December 2010 (UTC)[reply]

Yeah, good point; I didn't think of that. I suppose it's like calling water 'liquid ice'. I see here - carbon physics - that it's referred to as liquid carbon, as you say. Can we continue the discussion even so? Adambrowne666 (talk) 01:24, 19 December 2010 (UTC)[reply]

What do you want to discuss? The paper you just linked has a rather exhaustive and complete discussion of the properties of various phases of carbon. I am unclear what else you wish to learn... --Jayron32 01:29, 19 December 2010 (UTC)[reply]
Well, I think he was fairly clear the first time. He wants to know what it would be like to travel through it in a "magically strong bathysphere", and whether the answer would have to do with the medium's properties. I don't have any answer for that, but I don't see that it lacks sufficient specificity to be answered. --Trovatore (talk) 01:47, 19 December 2010 (UTC)[reply]
Jayron32: Not true, liquid diamond can exist: nanodiamonds. Consider dehydroadamantane, technically the smallest molecular structure which can be called a diamond; an unsaturated carbon structure with delocalized lone electrons which can have liquid properties under the right conditions. Think about it: a rock does not flow from a bucket, but crush it into fine silt and it flows freely from the bucket. It is not a true liquid, but it has similar properties. I can't give you any properties of liquid dehydroadamantane, as no one has actually studied it, but I can tell you that it should behave as a highly reducing metallic liquid. Liquid diamond is a metastable substance, as it should irreversibly polymerise into a hydrocarbon glass at STP. --Plasmic Physics (talk) 01:54, 19 December 2010 (UTC)[reply]
I don't see how you can call something diamond if it has (any significant amount of) hydrogen in it. Our article says dehydroadamantane is C10H14. --Trovatore (talk) 02:01, 19 December 2010 (UTC)[reply]
There are several degrees of unsaturation; C10H14 is not the most unsaturated adamantane frame. All diamonds have a "dirty" surface, covered by non-carbon groups. There is no formal cut-off ratio of carbon to other elements for a molecule to be differentiated from a diamond; it's like asking how many grains of sand make a heap. The ratio can be simplified to a surface-area-to-volume ratio. --Plasmic Physics (talk) 05:09, 19 December 2010 (UTC)[reply]
So, the question is now what it would be like to travel submerged through an ocean of liquid carbon. Anyone know if liquid carbon is opaque? It might just be very, very dark down there... WikiDao(talk) 18:12, 19 December 2010 (UTC)[reply]
I would guess that it would resemble graphite in its opacity. And at the high temperature it is at, it would be glowing white hot. Graeme Bartlett (talk) 03:23, 20 December 2010 (UTC)[reply]
Unlike in movies and cartoon representations, you can't see through magma or liquid diamond. Unlike magma, though, it would have a low viscosity, and you should be able to move through it quite freely, given that you have an indestructible bathysphere. --Plasmic Physics (talk) 08:40, 20 December 2010 (UTC)[reply]
Thanks everyone. I'm guessing you guys are right about the opacity etc. of molten carbon, but I'd like to be sure. According to this paper carbon melt, the optical properties of molten carbon can be "calculated based on the Drude theory [9]. Using the standard relations for the dependence of the optical constants on the complex dielectric function, the reflectivity R(ω) normal to the surface and the absorption coefficient α(ω) can be calculated. See Table I for the values of α(ω) and R(ω) for the frequencies ω of the pulsed lasers used to melt graphite. [1]" This is too technical for me. Is anyone here interested enough to look at the article and see if they can translate it for me? Also, if it is opaque, how would one navigate in it? Infrared? Sonar?
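In plain terms, the quoted passage says: treat the molten carbon as a simple free-electron metal (the Drude model), compute its complex dielectric function from two material parameters (a plasma frequency and a relaxation time), and from that derive both the reflectivity R(ω) and the absorption coefficient α(ω) at any laser frequency ω. Here is a minimal sketch of that calculation in Python; note that the plasma frequency and relaxation time used below are illustrative placeholders, not the paper's fitted values (those are in its Table I).

```python
import numpy as np

c = 2.998e8  # speed of light, m/s

def drude_optics(omega, omega_p, tau):
    """Return (normal-incidence reflectivity R, absorption coefficient alpha in 1/m)
    for angular frequency omega, given a Drude plasma frequency omega_p and
    relaxation time tau (all SI units). Parameter values here are assumptions."""
    eps = 1.0 - omega_p**2 / (omega**2 + 1j * omega / tau)  # complex dielectric function
    n_complex = np.sqrt(eps)                                # complex refractive index n + ik
    R = abs((n_complex - 1) / (n_complex + 1))**2           # Fresnel reflectivity at normal incidence
    alpha = 2 * omega * n_complex.imag / c                  # absorption coefficient
    return R, alpha

# Example: a 532 nm (green) laser line, with metal-like placeholder parameters
wavelength = 532e-9
omega = 2 * np.pi * c / wavelength
R, alpha = drude_optics(omega, omega_p=1.0e16, tau=1.0e-15)
print(f"R = {R:.2f}, penetration depth = {1e9 / alpha:.0f} nm")
```

With metal-like parameters this sketch gives a reflectivity near 0.8 and a penetration depth of a few tens of nanometres; that is, the melt would be utterly opaque and shiny like a liquid metal, consistent with the guesses above. So any navigation really would have to be by sonar or some other non-optical means.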

Halophiles

http://books.google.com/books?id=o8lOge_6TpsC&pg=PA101&dq=halophile+possibilities+for+industrial+process&hl=en&ei=NmYNTbaZG8L6lwfW16WeDA&sa=X&oi=book_result&ct=result&resnum=1&ved=0CCYQ6AEwAA#v=onepage&q&f=false

(Under the header Extracellular Enzymes)

1. What does the term "Washing" mean?

2. How would halophilic enzymes be used in food processing?

3. (Page 103)

These compounds have industrial applications as stabilizers of enzymes, nucleic acids, membranes...

What applications would require stabilizers of enzymes, membranes, etc.?

4. Reliable reference?

Thanks, Albacore (talk) 02:12, 19 December 2010 (UTC)[reply]

Your #1 and #2 seem straightforward to learn more about, since the sentence gives you a citation for them. :) Page 114 of your ref has a full bibliographic reference for the 2005 work by Ventosa. That author appears to have published extensively in the scientific journal world, so he's probably a reliable reference (but cf. #4, I don't know about the work you are reading that cites him). Also, the cites in the paragraph preceding the one you mention (main §5.4 intro) point to several other references that are described as in-depth reviews of these applications/fields (including an older one by the same Ventosa author). DMacks (talk) 04:37, 19 December 2010 (UTC)[reply]

Jet stream time lapse?

To understand how long-term changes in weather work, it would be very desirable to have some video showing the entire jet stream over the Northern Hemisphere, updated once a day, going on for at least a year, and preferably colorized as well with perhaps daytime high or average temperatures. Is anything like that in circulation? Can it exist, or is the jet stream too abstract to actually be placed on a map consistently in this way using data from multiple authorities? Wnt (talk) 05:23, 19 December 2010 (UTC)[reply]

Here are animated northern hemisphere jet stream maps with data available back to 2006. Red Act (talk) 06:17, 19 December 2010 (UTC)[reply]

networking

concept of bridge, switch and router plz help me someone... — Preceding unsigned comment added by Satraj2010 (talkcontribs) 07:53, 19 December 2010 (UTC)[reply]

Please see Routing, Bridging (networking), Network switch, and Router. Further questions might be better suited to the Computing Desk. Ginger Conspiracy (talk) 08:00, 19 December 2010 (UTC)[reply]
I've redacted the e-mail address per policy. shoy (reactions) 17:42, 20 December 2010 (UTC)[reply]

physics

In a structure of an atom, protons are positively charged, so why don't they repel each other, as every positive charge repels another positive charge? —Preceding unsigned comment added by 218.248.64.182 (talk) 12:25, 19 December 2010 (UTC)[reply]

According to Atomic nucleus#Forces, The nuclear force is highly attractive at very small distances, and this overwhelms the repulsion between protons which is due to the electromagnetic force, thus allowing nuclei to exist. 90.195.179.14 (talk) 12:34, 19 December 2010 (UTC)[reply]
The nuclear force is actually repulsive at very close range, but becomes attractive at typical distances between particles in the nucleus. The comparison with electrostatic repulsion is shown here. If a proton ever gains enough energy to separate from its neighbours by three times its normal distance, then the charge repulsion will take over and the proton will leave the nucleus. Dbfirs 12:56, 19 December 2010 (UTC)[reply]
There are two forces at work at the scale of the atom. One is the electromagnetic force you are talking about, where like repels like. But at shorter ranges the nuclear force is strong, where nucleon attracts nucleon; so protons are attracted to both protons and neutrons. Note that if you had too many protons in a nucleus without enough neutrons, the electromagnetic force would overwhelm the nuclear force. This is why nuclei require large numbers of neutrons (which are electromagnetically neutral) to balance things out and be more stable. You can get an idea of how much "balancing" is required when you look at uranium. With 92 protons and 146 neutrons, U-238 has a half-life of 4.468×10⁹ years, a very long time! Remove just six of those neutrons and you have U-232, which has a half-life of only 68.9 years, much less stable. So you can think of the inside of a nucleus as a battle between two contradictory forces. This helps to explain why nuclear fission requires big, not-very-stable nuclei, and why nuclear fusion, even of atoms with very minimal positive charges, is very hard to accomplish. Fission is an example of the electromagnetic force overwhelming the nuclear force (usually after being jostled by an extra neutron); fusion is an example of the nuclear force overcoming the repulsion of the electromagnetic force. --Mr.98 (talk) 20:46, 19 December 2010 (UTC)[reply]
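To make those two half-lives concrete: the fraction of a sample that survives after time t is N/N₀ = (1/2)^(t/t_half). A quick sketch of that arithmetic for the two uranium isotopes mentioned above:

```python
def surviving_fraction(t_years, half_life_years):
    """Fraction of the original nuclei remaining after t_years: 0.5**(t/t_half)."""
    return 0.5 ** (t_years / half_life_years)

for name, half_life in [("U-238", 4.468e9), ("U-232", 68.9)]:
    print(f"{name}: {surviving_fraction(1000, half_life):.3e} survives 1000 years")

# U-238 is essentially untouched after 1000 years (fraction ~0.9999998),
# while U-232 is down to roughly 4e-5 of its original amount.
```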
But removing protons or adding neutrons will also destabilize a stable nucleus. And nuclei with an even number of protons and neutrons are generally much more stable than otherwise similar nuclei with an odd number of protons and neutrons. And 68.9 years is an incredibly long time compared to the time scale of typical nuclear or chemical reactions (nanoseconds or less). Electromagnetic repulsion has some role in stability, but it's a lot more complicated than that. -- BenRG (talk) 09:03, 20 December 2010 (UTC)[reply]
A strong counter-example to Mr.98's idea of "enough neutrons to keep the protons from getting too repulsive, but an extra few would destabilize" is elements that do not have a "stability plateau" at a certain number of nucleons. Chlorine, for example: Cl-35 and Cl-37 are stable, whereas Cl-36 decays. Now, Cl-36 is still fairly stable as far as nuclei go (half-life is 308,000 years), but it seems contradictory from the simple pattern that if you take a stable isotope and add (or remove, depending whether you start with Cl-35 or Cl-37) a neutron you get an unstable one, and then if you do more of that same destabilizing change, you become... stable again. And the unstable isotope decays Cl-36 → Ar-36, so taking an unstable nucleus and adding even more charge but not more particles (the net fraction of neutrons decreases!) gives you... stable argon. The idea of parity solves a lot of these (looks like "even number of neutrons" is the key?), but then sulfur is stable as S-32, S-33, S-34, and S-36. Nuclei really only obey "the whole set of rules" (and we're only describing reality with rules that are based on it; we can't tell Nature what to do:), not any one in isolation. DMacks (talk) 14:45, 20 December 2010 (UTC)[reply]

Science and God

Has science absolutely proven that God cannot exist? I know for a fact that the big bang theory does not, since no one can say where the material for it came from. But what about life? The Primordial soup could have resulted in forming organic molecules, but could they combine or react to form living cells? Thanks. --119.155.7.74 (talk) 13:59, 19 December 2010 (UTC)[reply]

Absolutely not. Relationship between religion and science appears to give an overview of the issues. Clarityfiend (talk) 14:25, 19 December 2010 (UTC)[reply]
Your question betrays a logical error. You ask whether the primordial soup could have produced living cells. Suppose we knew of no way in which this could happen: I suppose you could use this as some kind of evidence for the existence of God. In fact we do know of ways in which this may happen. But this is not evidence for the non-existence of God at all; rather it's a lack of one potential piece of evidence. It's certainly not having "absolutely proven that God cannot exist". (But then of course you have Occam's razor to think about.) May I recommend our article on the conflict thesis? Marnanel (talk) 14:39, 19 December 2010 (UTC)[reply]
Science doesn't absolutely prove anything. Science is all about trying to find the simplest theory that accurately explains our observations. Theorising the existence of a supreme being doesn't explain any observations that can't be explained without a supreme being (at least, not enough to make up for the fact that you now need to explain the existence of the supreme being - it's one step forward and two steps backwards). Just because theories without supreme beings are simpler than theories with and explain our observations just as well doesn't mean they are right, but it tends to work well to assume the simpler theory is the correct one (this is Occam's razor).
It's a little different if you ask about a specific God. Science can disprove (beyond all reasonable doubt, at least) all sorts of claims made in the Bible, for example. It just can't disprove the existence of supreme beings in general. --Tango (talk) 14:47, 19 December 2010 (UTC)[reply]

So science doesn't deny the existence of God, but sidelines it as an unlikely explanation of the universe? Am I right? --119.155.7.74 (talk) 15:35, 19 December 2010 (UTC)[reply]

Science makes no claims about the existence or lack thereof of god/gods. As Tango said, it does contradict many parts of various holy books, Genesis being an obvious example. The modern conception of divinity is that it operates supernaturally and thus cannot be quantified. Science by definition deals with the natural world and cannot determine things about the (supposed) supernatural. The origin of the universe (pre-big bang) and of life is something that science has yet to produce firm answers for, but that does little to prove or disprove the existence of god. On the other hand, the "supernatural" is often used to explain things we don't understand. People used to think gods controlled many natural processes (like the weather), and as our understanding of the world expanded they were forced into ever smaller gaps in understanding. See God of the gaps. --Leivick (talk) 16:00, 19 December 2010 (UTC)[reply]
You could say that science "sidelines" the existence of God as "Unevidenced". Which is the same category alien abductions and ghost stories wind up in, but also some reasonably scientific theories like Panspermia. APL (talk) 19:16, 19 December 2010 (UTC)[reply]

I do understand the concept of God of the gaps. But can science go beyond the big bang? From Stephen Hawking's book 'A brief history of time', and from being a physics student myself, I perfectly understand the aim of almost all physicists in deriving the one single unified theory, but that too would only explain events after the big bang, since all laws break down at the big bang itself. So how can science try to understand events before that? I am not trying to imply that God did indeed make the universe, but just asking if science will be able to explain events before the big bang. --119.155.7.74 (talk) 17:26, 19 December 2010 (UTC)[reply]

I think every honest scientist will acknowledge that an omnipotent God could in principle have faked all the evidence we see, of evolution and everything else. Even creationists, for some reason, are not eager to back that idea, but there is certainly no way to rule it out scientifically. Looie496 (talk) 17:36, 19 December 2010 (UTC)[reply]
(ec) So science doesn't deny the existence of God, but sidelines it as an unlikely explanation of the universe? Am I right? - As stated above, science could never either prove or disprove the existence of God, so, no, science does not deny the existence of God, nor does it sideline it as an unlikely explanation of the universe. But certain scientists are more than happy to do these things. They trumpet their personal opinions as "the truth" and "science". It's a very human thing to do; after all, even scientists have egos. -- Jack of Oz ... speak! ... 17:38, 19 December 2010 (UTC)[reply]
For interest, just in case you missed it, there was this recently ( ...and this) along with numerous media reports.....and a wiki article, Conformal Cyclic Cosmology. Sean.hoyland - talk 17:58, 19 December 2010 (UTC)[reply]

The information provided in the first link exceeds my understanding of physics :P. However, after reading the first few lines of the Wikipedia article on CCC, I gather that a universe apparently 'dies' and gives birth to a new one. That still doesn't explain where the matter for the first one of this kind of universe could have come from. --119.155.7.74 (talk) 18:15, 19 December 2010 (UTC)[reply]

Note that even if science can't go beyond the Big Bang (which is not clear, in any case), it doesn't make "God did it" a very good answer from a scientific perspective. There may be epistemological limits to science. Putting God into those blank spaces is not a logical conclusion. I do not know what my mother is doing this very moment. That doesn't make it logical for me to assume she is eating a cake, or dancing ballet, or trying on hats, even if a very respected book asserts with the deepest solemnity that this is most certainly what she is doing. Hitting a limit on knowledge does not logically authorize an appeal to divine inspiration. --Mr.98 (talk) 18:16, 19 December 2010 (UTC)[reply]
I guess the matter for the first one came from the same place that produced the first god. :) Sean.hoyland - talk 18:25, 19 December 2010 (UTC)[reply]
Right, OP, saying "God" is not a very robust explanation of anything, even if it were to be in some sense ultimately "correct". Science is interested in understanding better how things actually work, in detail and in the actual world. Just shrugging and saying "God did it" isn't very useful to Science, but nothing about Science itself prevents individual scientists from saying that, or something similar, from time-to-time, if they want. ;) WikiDao(talk) 18:23, 19 December 2010 (UTC)[reply]

I know that saying 'God did it' is just plain arrogant, but will science somewhere down the line eventually come to a halt, when it can no longer explain some event or events, and when saying 'God did it' might not be so arrogant? --119.155.7.74 (talk) 18:57, 19 December 2010 (UTC)[reply]

Science comes to a halt on a regular basis, see eg. List of unsolved problems in physics. But the whole purpose of Science is to persist in trying to understand the workings of the actual world as well as possible. Vaguely invoking "God" as an explanation for how things work is never useful for that purpose. But note again that that is not the same as saying conclusively "God does not exist." Science does not say that, it just ignores that question altogether as irrelevant. WikiDao(talk) 19:11, 19 December 2010 (UTC)[reply]
Which god? You mean Odin, right? Or Zeus? Quetzalcoatl, perhaps? Or simply a god beyond all current human knowledge? Without evidence for one over the other a scientist should not name (or imply) a specific god.
Anyway, it's "arrogant" because without evidence you're just guessing. You don't need to be a scientist to guess. Any scientist that said "I personally can't think of an answer, so it must be unknowable, and therefore must be The Flying Spaghetti Monster." would be more than arrogant, he'd be missing the point. Better for him to say "I can't figure this out, maybe someone else can." APL (talk) 19:14, 19 December 2010 (UTC)[reply]

Well, I can honestly say that after this discussion my faith in God and my love for science have only increased. Thanks to everyone for contributing. And on that note, what books would you suggest that are similar to 'A Brief History of Time', not too complex, and which can slowly introduce me to a higher level of this study? Note that I study this alone in my spare time, as a hobby. Thanks. --119.155.7.74 (talk) 20:16, 19 December 2010 (UTC)[reply]

You might be interested in "God: The Failed Hypothesis" by Victor J. Stenger. The author argues that science can indeed have a say in the question of whether God exists. One of the first things he does is to make a distinction between the god (little g) of Deism and Pantheism and the Judeo-Christian-Islamic God (capital G). The former is what people probably think of when they say science ignores the question of "God". This is essentially the idea of Non-overlapping magisteria. But Victor argues that the latter assumes a God which actually plays an important part in the universe and therefore should be detectable by the methods of science. We can't prove "quarks"; we can, however, test the model of the universe which their existence implies, and we should in the same way be able to test the model of the universe which the existence of God (capital G) implies. In the same way that science cannot disprove the existence of unicorns or dragons, we can, after careful investigation, give an opinion on their likelihood with a certain level of certainty. Vespine (talk) 22:28, 19 December 2010 (UTC)[reply]
I agree, and well-stated. I don't mean to be saying above that "the existence of God", carefully defined, should not be an object of scientific investigation. And that existence or not, carefully defined, might have some relevance to other areas of scientific investigation at some point. At this point, though, and again see eg. List of unsolved problems in physics, the whole question seems mostly irrelevant to the other questions science is asking. WikiDao(talk) 22:46, 19 December 2010 (UTC)[reply]

Science as a whole must remain strictly agnostic on subjects such as omniscient, omnipotent beings like God who choose not to reveal themselves in a verifiable manner. Their existence is not a testable hypothesis by definition. However, many scientists choose to identify as atheists, because of the damage religious mythology (such as the 6,000 year age of the Earth implied by Judeo-Christian-Islamic doctrine) does to the fields of biochemistry, radiochemistry, astronomy, geology, paleontology, archeology and other fields in which empirical evidence implies a very much older Earth. This damage manifests as anti-vaccination and similar movements skeptical of medical science, for example. Similarly, many ethicists are skeptical of basing systems of morality on the threat of eternal torture in hell as punishment for wrongdoing, or the idea that an ethnically superior people have been chosen by God to deserve advantages such as the right to rule over others. Many are revolted by the endorsement of slavery which occurs in almost all the ancient religious texts of the world. On the other hand, anthropologists can explain why the many diverse cultures of the world might have come to embrace religions; for example, to reinforce social hierarchies, hygiene, and social order in the absence of secular institutions to do so. Ginger Conspiracy (talk) 03:54, 20 December 2010 (UTC)[reply]

I sometimes wonder about an innocent child, given a scientific explanation for everything from birth, or at least the time of verbal awareness, and never exposed to religious views. Would such a child invent religion for themselves? HiLo48 (talk) 04:08, 20 December 2010 (UTC)[reply]
It's possible. "There are no atheists in foxholes", as the saying goes, meaning that belief in the supernatural is associated with imminent danger. Religious faith is defined in terms of doubt, after all. But in my estimation, the agnostics have a far more consistent position than atheists. For example, a loving creator of hominids may not want to reveal itself for the same reason that loving parents want their children to be able to support themselves independently. Ginger Conspiracy (talk) 04:12, 20 December 2010 (UTC)[reply]
Yeah, but that hypothetical, loving creator of hominids has consequently created a heck of a lot of confusion along the way. HiLo48 (talk) 04:27, 20 December 2010 (UTC)[reply]
Well, I'm not a perfect example (as I was attending mass at least once a week at the time), but my parents apparently neglected to mention or explain to me anything about our religion for the first few years of my life, and I distinctly remember believing in reincarnation (not using that word, because nobody had ever mentioned it to me), and even discussing it with my playgroup friends, all before I started school. My parents had encouraged logical thinking, investigation, and a love of knowledge of the world: I'd done experiments with growing beans, and so on, so certainly they had worked to instill a certain amount of scientific reasoning. Once people started actually explaining my religion to me, I got terribly confused and somehow thought girls became angels and boys became devils, when they died. I would definitely say, from my own experience, that the absence of religious indoctrination in many people does not lead to an absence of religious belief. 86.163.0.221 (talk) 15:43, 20 December 2010 (UTC)[reply]
No, I'm hypothesising a far more extreme situation. One where religion doesn't exist in the world of the child as it grows up. While you may not have been indoctrinated, you were surrounded by stories of mystical beings that other people took seriously as creators and destroyers, etc. If the child could be totally removed from that environment, but given everything science has to offer, would they need to invent religion? HiLo48 (talk) 00:16, 21 December 2010 (UTC)[reply]

I don't know about Christianity or Judaism, but Islam does not state the age of Earth as 6,000 years; it doesn't imply any specific date for the creation of the Earth. I understand that science will not benefit from keeping God as a possible explanation for any events that it cannot yet explain. What I meant by my original question was whether the discoveries made by science have so far in any way removed, beyond all doubt, the idea of God. The answer to that is clearly no, since science is not trying to prove anything, rather just studying and understanding more of how the universe works. What conclusions people draw from those studies regarding God is up to them. --119.155.31.74 (talk) 05:08, 20 December 2010 (UTC)[reply]

Sadly, Islamic scholars are endorsing Young Earth creationism myths more often these days, especially in Turkey, although Muslims were early proponents of evolution. See Tawrat and Islamic creationism. Ginger Conspiracy (talk) 11:47, 20 December 2010 (UTC)[reply]
I generally disagree with the conclusion "What conclusions people draw from those studies regarding God is up to them". Substitute "gravity", or "heliocentricity", or "evolution" for God. I also disagree with "Science as a whole must remain strictly agnostic ... not to reveal themselves in a verifiable manner". The God of the main theisms isn't a God of "the background" who never reveals himself; those two are mutually exclusive, and science can have an opinion on that. Vespine (talk) 22:09, 20 December 2010 (UTC)[reply]
It's been said that science is questions that may never be answered, and religion answers that may never be questioned. There are some things in every religion that the vast majority of adherents will never dare to question honestly. 66.108.223.179 (talk) 02:01, 21 December 2010 (UTC)[reply]

If science as a whole doesn't try to understand God, as it is something supernatural and not beneficial to it, how can it have an opinion on God? It is scientists that individually have their own opinions. I've never read or heard anywhere of science coming to a conclusion regarding the existence of God. --119.155.112.106 (talk) 05:09, 21 December 2010 (UTC)[reply]

"If science as a whole doesn't try to understand God" Since when?, there are lots and lots of scientists that challenge claims made about God in the bible and other holy books. Richard Dawkins is one that comes instantly to mind. It is a fallacy to assume that science is somehow "prohibited" from observing or explaining something supernatural. Yes we always look for a natural explanation, but that's only because in the course of all human history, nothing supernatural has ever been empirically observed. If we did legitimately observe supernatural phenomena, then science would accept it and study it in as great a detail as possible. This would lead to whole new fields of study and no doubt lead to big public research grants, it would be a massive boon to scientists, most scientists would welcome it not fear it. I've never read or heard anywhere science coming to a conclusion regarding the existence of God. Well then I can only assume you haven't looked in the right places. I already linked God: The Failed Hypothesis above, and The God Delusion is Dawkins' contribution, two books fairly recently written about God from a scientific perspective. No these aren't "peer reviewed articles in scientific journals", but then you won't find peer reviewed scientific articles about the evidence for the non existence of unicorns either. Doesn't mean you can't come to a scientific conclusion about the likelihood of unicorns. Vespine (talk) 23:18, 21 December 2010 (UTC)[reply]

The article on 'God: The Failed Hypothesis' says that the author concludes that the existence of God is not impossible, only improbable. Those scientists usually challenge the claims of holy books after being masters of their profession, when they feel they have enough knowledge to scientifically challenge the question of God's existence and have scientific proof to back it. In the end, all that I can conclude is that science so far has only enough data to label it improbable. --116.71.62.183 (talk) 05:13, 22 December 2010 (UTC)[reply]

sigh Yes, because that's what science does. As stated above more than once, science does not absolutely prove, nor disprove, ANYTHING; it just gives varying degrees of probability which asymptotically approach certainty but never actually reach it. NOTHING in science is beyond question; that's what makes it different from religion. Vespine (talk) 05:45, 22 December 2010 (UTC)[reply]

Degeneration Redux

Because medical treatment prolongs life, inheritable susceptibility to disease will spread at an increasing rate. This growth rate would be exacerbated by the following medical advances:

1. Medical science would further prolong the life of these patients, permitting them to have more children.

2. Medical science might find that more diseases can have inheritable susceptibility.

Perhaps the growth rate of the number of these patients could become very steep, perhaps something like an exponential curve. That would eventually make the burden of supporting and treating these patients intolerable, and force action about the birth of such persons.

The great loss of life of native Americans due to “European” diseases shows how susceptibility to disease can be widespread. Modern medicine was not available to save the native Americans. But now susceptible people can live on and beget children who are also susceptible. - Diatom. 173.189.136.110 (talk) 15:27, 19 December 2010 (UTC)[reply]

You just repeat the assertions that were already discussed and refuted above. Do you have a question? --Stephan Schulz (talk) 16:11, 19 December 2010 (UTC)[reply]
I have been repetitive because people are not responding directly to what I say. For instance, there is the question of how long it would take for the problem to become serious. So I expanded on that point in my last post. Also, I don't think I have been refuted. Diatom. 173.189.136.110 (talk) 16:36, 19 December 2010 (UTC)[reply]
No one can say 'how long it would take for the problem to become serious' because 1) none of us has a crystal ball, or if anyone does, they've never shared it with us (I'm not aware any of us are billionaires, which you would expect if we had access to one); 2) given the major changes in social structure and major advances in biotechnology, trying to predict what will happen in 50 years is problematic in itself, let alone 300 years; 3) 'the problem' and 'become serious' aren't really defined in a way that would let any meaningful predictions be made even if we tried. (In particular, it's questionable whether there is any problem that will become serious.) Nil Einne (talk) 17:44, 19 December 2010 (UTC)[reply]
We can say that the statistics of globally increasing lifespans, decreasing infant mortality, and leveling population growth rates after decades of mass intercontinental air travel seem to imply that most of what were only decades ago thought to be the most difficult hurdles have been cleared. In the 1960s, for example, there were few reasons not to believe that the human population would dangerously exceed the planet's carrying capacity by now. What has actually happened, however, is astonishingly optimistic in comparison. There is still a lot of suffering, but as a proportion of human experience it is decreasing more rapidly than ever. The character of the diseases which have been emerging doesn't rule out the possibility of another substantial human plague, but the probability does seem to be decreasing. I recommend Doctors without Borders epidemiologist Hans Rosling's interactive Gapminder statistics browser and his TED lectures using it.[1][2][3][4][5] Ginger Conspiracy (talk) 04:35, 20 December 2010 (UTC)[reply]
Ginger, thanks for the interesting and informative leads. There's lots and lots of good information there. Diatom. 173.189.136.110 (talk) 14:39, 20 December 2010 (UTC)[reply]

It has been suggested that gene modification could be a solution to the problem of inherited susceptibility to disease in human beings. However, I don't think GM people are a good idea. First of all, in view of the opposition to GM crops, there would be very great public opposition to creating GM people. There might be prejudice against GM people if they were created. Perhaps they would be jeered at as "Frankensteins", and their children taunted at school as "The Son of Frankenstein".

Secondly, it would start off with a beneficial purpose: the elimination of disease susceptibility. Then people would want their offspring to be gene-modified in other ways, such as to be super-intelligent boys or super sports figures, or very beautiful girls. There would undoubtedly be doctors prepared to do that for a price. It would also take a more sinister turn. Some would want their son to be very greedy and hence a great moneymaker (very "successful"). Dictators would want to create super soldiers who would be very aggressive, ruthless, and fearless. Another suggestion was to use amniocentesis to detect disease-prone fetuses. That is a much better idea if the procedure can be done early enough to permit legal abortion. It is being done now to detect fetuses that will be born with serious birth defects. It could be extended to fetuses that would apparently be normal at birth, but that would be susceptible to a certain disease later in life. – Diatom. 173.189.136.110 (talk) 16:15, 19 December 2010 (UTC)[reply]

What's the question there? Note that your personal dislike of genetic modification or of using amniocentesis is somewhat beside the point. If it happens, it happens, whether you personally like it or not, and it majorly screws up any predictions you try to make of the future. Nil Einne (talk) 17:30, 19 December 2010 (UTC)[reply]
I think in general the main difficulty with your line of argument here is that you seem to treat genetic susceptibility as some kind of rigid determination. It's generally not. If I had your DNA in hand today, I could tell you that you had some markers which gave you a certain risk factor of being susceptible to tuberculosis, breast cancer, prostate cancer, and Alzheimer's. I could probably not tell you for certain whether any of those things would develop. (In some rare cases, where the single mutation is the cause of the disease, I could do that. But those are the exception.) So should I let you reproduce? You tell me. What's the cutoff point? 50%? 60%? 80%? Does it matter if all of those things become easily treatable in the future? None of these issues are simple from either a medical or a social view.
The other major difficulty is that you vastly overestimate the importance of these kinds of things on the health expenditure of the world. Exotic diseases make up a very small part of our world's woes. Banal things related to public health and socio-economic conditions consume the vast majority of our resources. Sterilization programs will not have any impact on these matters. You would save more money by banning fast food and cigarettes than you would through any kind of eugenics program, and with a significant decrease in the ethical, legal, and social quandaries. (This is just an aside, but I find it odd how most of those who I have met these days who are in favor of heavy-handed state measures like eugenics are usually not inclined to take heavy-handed state measures in other realms of public health or the economy. This was, of course, not the case with the Germans, who were heavy-handed all around.) --Mr.98 (talk) 18:33, 19 December 2010 (UTC)[reply]
The measures I am suggesting would be taken only when the burden of supporting and treating susceptible patients becomes intolerable. When that point is reached, there will be no point discussing whether it will be done. Public opinion will insist on it. Some members of the public will not want to take action, so there will need to be at least a majority in favor of it. With the situation getting worse all the time, the no-action people will gradually shift to the other camp.
The situation would be taken on a disease by disease basis. Diseases which are not yet a serious problem would need no action.
Also, if personal susceptibility varies for a particular disease, and it can be measured, that would be taken into consideration. The cutoff point for each individual would of necessity have to be decided somewhere, and would be based on his percentage of susceptibility.
If an easy, low cost treatment or cure for a disease is developed after a person has been sterilized, he will simply be out of luck. The sterilization program cannot be delayed in the hope that low cost treatment or a cure will be found. Necessity will not allow delay.
Growing disease susceptibility is one of a number of dire situations that can be seen ahead for the human race, and that will need heavy-handed government action if the human race is to survive. Diatom173.189.136.110 (talk) 21:52, 19 December 2010 (UTC)[reply]
Diatom. You need to re-read Mr.98's post, which is spot-on. Your understanding of the genetic underpinnings of disease is incomplete and simplistic, which is causing you to make assertions that are false. Your predictions about the need for (and inevitability of) governmental sterilization programs are naive and misplaced... the human race has far greater problems to face in the near future, global warming among them. --- Medical geneticist (talk) 23:21, 19 December 2010 (UTC)[reply]
If a person has no children, he will not pass on a harmful inheritable genetic trait. It does not take a geneticist to see that. Diatom. 173.189.136.110 (talk) 05:42, 20 December 2010 (UTC)[reply]

Stephen Hawking's condition is a genetic one. He has benefited a lot from medical help. Screening his parents or even him as a foetus could probably these days prevent people like him being born and the subsequent load on society. But then we wouldn't have had Stephen Hawking. HiLo48 (talk) 00:04, 20 December 2010 (UTC)[reply]

I will have to say again that I am talking about a situation that has reached an extreme state. Conditions will have to be like the great plagues that swept Europe in the Middle Ages. As I have said, the growth of disease susceptibility may be somewhat like a geometric progression, bringing dire conditions sooner than might be expected. Yes, geniuses will not be born if their parent is made sterile. But the same can be said about present legal abortion. In order that a parent will not have the bother and expense of raising a child, millions of fetuses have been destroyed. That undoubtedly included many geniuses. The same can be said about potential parents who die in war. Diatom. 173.189.136.110 (talk) 05:20, 20 December 2010 (UTC)[reply]

Well, if we're going to keep on with this speculation, I might as well voice my own crank opinion - which is that genetic degradation doesn't matter, not just for the conventional reason that people can fix the damage with genetic manipulation, nor the more macabre expectation that humanity may not survive long enough for it to matter, but because of a third factor, relating back to how our human race came to be. Homo sapiens burst forth in a wave from Africa, supplanting its predecessor almost entirely, as has happened, it would appear, for wave after wave of human races before it. The cradle of mankind, sensu lato, where its genetic diversity is maintained in full in the context of its native environment, remains the only place where this diversity remains, and thus remains the most likely place for any fresh wave of newly adapted humans to emerge. Thus races of men like the Twa, combining the full power of the human mind with a more efficient body, expressing the full range of human genetic diversity, and already displaying a great deal of reproductive isolation, await only the implementation of interstellar travel to evolve and emerge as Homo apotelesmatis, as I would dub them, and to begin an inevitable, exponential growth beyond all the humans that have ever come before them. Wnt (talk) 06:18, 20 December 2010 (UTC)[reply]
The point you are missing is that it will never reach an extreme state. In a large, well-mixed population, genes that are neither selected for nor selected against tend to maintain a roughly constant frequency over time. They do not proliferate. This is not like a geometric progression at all. In a natural setting, negative genes are selected against and a human population would be expected to get "healthier" over time (though the history of any particular gene is strongly influenced by random fluctuations and founder effects in small subpopulations). Medical intervention can plausibly remove the negative pressure in many cases. This prevents evolutionary pressure from making us "healthier" over time, but it doesn't inherently provide any selective advantage to the negative traits. Hence the "healthiness" of the human population would stop evolving and remain roughly constant. Dragons flight (talk) 06:35, 20 December 2010 (UTC)[reply]
I will have to repeat what I have already said 3 or 4 times. Medical treatment can often extend the life span of persons with inheritable susceptibility to disease. As a result, these persons beget more children who will also have the problem. Thus, the proportion of these people in the population continually increases. That sort of thing is a geometric progression. Also, advances in medical science can be expected to give these people even longer life, exacerbating the problem.
Thus, because of medical intervention the positive and negative factors do not remain in balance. Diatom.173.189.136.110 (talk) 13:54, 20 December 2010 (UTC)[reply]
Your understanding of the nature of genetic conditions is not even Mendelian, much less post-Mendelian. You don't seem to get that screening out the phenotype does not screen out the genotype, and that trying to screen out the genotype means screening out people who are phenotypically healthy. You don't seem to get that most genetic conditions are not simple Mendelian traits anyway and can't be determined with anything more than a probabilistic assessment. You don't seem to get that the catastrophes you fear are probably not going to happen, ever, and that the present and future medical problems of the human race are quite different from them. You don't seem to understand the Hardy-Weinberg principle, which is key to understanding how genes proliferate in a population.
Your ignorance of even the fundamental concepts of genetics is going to impede you from actually having an informed discussion on this topic. I think everyone in this thread and the previous one has tried to point out exactly where your ignorance lies, but you've stubbornly resisted assimilating new information. I'm not sure the Reference Desk is going to be able to help you with your inquiries if you aren't actually going to take the time to really understand our responses. I'd like to help, but you don't seem eager to listen, and the Reference Desk is neither a debate society nor a place to arbitrarily air your views. --Mr.98 (talk) 14:47, 20 December 2010 (UTC)[reply]
(EC with below) This is probably a bit futile, but I'm not sure if anyone has mentioned this yet. "Also, advances in medical science can be expected to give these people even longer life, exacerbating the problem": this is a confusing statement. Once someone is no longer able to breed (have children), how a longer life 'exacerbates the problem' is far from clear cut. (Of course, the concept of being unable to have children can be somewhat fuzzy nowadays, even for females. However, I mention this for a reason: most people, even those with some susceptibility to various cancers, degenerative brain disorders, etc., tend to only get cancer after they've had all their kids.) Humans do have a long adolescence, and even in adulthood there is an advantage to having living parents and even living grandparents. But with the increasing level of social care there's likely to be less of an advantage; furthermore, there are things like the cost of medical care for those living longer, particularly in societies where there are strong expectations of children helping their parents when they are elderly, which complicate any advantage of this longer life to the children. Also, you may next criticise the trend of delaying reproduction, but note that AFAIK there is a fair amount of evidence that in other animals delaying reproduction tends to increase the average lifespan (of course, this is unlikely to be very clear cut in humans, given my earlier point about fuzzy reproductive ages as well as all the points I and others have made earlier)... Nil Einne (talk) 16:39, 20 December 2010 (UTC)[reply]
Medical advances that allow someone with a negative genetic condition to live a normal life span (and procreate normally) will mean that the genes are no longer selected against. It does not however mean that these genes will magically proliferate. You've repeated the same fallacy several times, but repetition does not magically make your point true. Consider the case of a simple recessive genetic condition caused by a single gene. Let's label the healthy, dominant gene as type "X" and disease causing recessive gene as type "x". Further, let's arbitrarily say that at an initial time 10% of genes are type "x", and 90% are the healthy type "X". A simple Mendelian cross tells us that of all babies born 81% will be type "XX" (completely healthy), 18% will be type "Xx" (disease carriers), and 1% will be type "xx" (disease victims). If all the disease victims are eliminated, then the gene pool in the next generation becomes 90.91% type "X" and 9.09% type "x". This is the natural effect of a deleterious gene being selected against. On the other hand, if medical intervention saves all of the "xx" people and allows them to live and procreate normally, then the next generation of the gene pool becomes... 90% type "X" and 10% type "x", exactly the same as it was before medical intervention. Your assertions to the contrary are simply wrong. If healthy and sickly people are given an equal opportunity to procreate, then their genes will maintain a constant relative abundance in the gene pool. Under ordinary processes, evolution works to eliminate deleterious genes. Medical interventions can arrest that process, and prevent negative genes from being eliminated, but it takes more than that for a gene to proliferate in the gene pool. In order for type "x" to become more common, there must be some reason that people having it are even more likely to procreate than people with type "X" genes. Giving them an equal shot at life simply isn't enough to do anything other than maintain the genetic status quo. Dragons flight (talk) 16:17, 20 December 2010 (UTC)[reply]
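The genotype bookkeeping above is easy to check numerically. A minimal sketch using the same numbers (10% of alleles are the recessive disease type "x"):

```python
def next_generation(q, xx_survives):
    """Frequency of the recessive allele "x" in the next generation under
    random mating (Hardy-Weinberg). If xx_survives is False, affected "xx"
    individuals are removed before reproducing (selection against them)."""
    p = 1.0 - q
    XX, Xx, xx = p * p, 2 * p * q, q * q   # genotype frequencies among newborns
    if not xx_survives:
        xx = 0.0                           # disease victims eliminated
    # each Xx carries one "x" allele, each xx carries two
    return (Xx + 2 * xx) / (2 * (XX + Xx + xx))

q = 0.10
print(next_generation(q, xx_survives=True))   # 0.1000: with medicine, frequency unchanged
print(next_generation(q, xx_survives=False))  # 0.0909: with selection, slowly removed
```

Iterating the function shows the point made above: medical intervention holds the allele frequency constant from one generation to the next; it never makes it grow.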
This is a zero-order approximation, but an overly optimistic picture, since random mutations accumulate in the genome. In the past, a 20-year generation cycle of humans, with occasional deaths from genetic factors, was sufficient to remove these mutations at the same rate as they occurred. The point that other participants made is that without this cycle, they will accumulate. I should note that the delay in the age at reproduction itself is sufficient to increase the mutation burden (especially in men, since oocytes undergo a dictyate arrest that prevents nucleic acid changes; conditions like Down syndrome associated with old eggs involve such large chunks of misplaced chromosome that modern medicine doesn't really keep them in the gene pool). And let's not even start on the extra problems that come up when representatives of the international idiot community decide to repeatedly nuke their own country or shut off all a reactor's safety precautions to see if they're truly necessary... As a side issue, certain genes subject to triplet repeat expansion (causing trinucleotide repeat disorders) actually become more vulnerable to severe mutations when milder mutations persist. More importantly (I think) there are proteins such as helicases which when mutated can lead to increased overall mutation rates. I am not advocating some kind of eugenic solution here - among other good reasons, "turning up natural selection and making it twice as effective" really is not an easy thing to do, no matter how smart the selector thinks he is - but recognize that the problem is a real one. Wnt (talk) 16:30, 21 December 2010 (UTC)[reply]

OK everybody. Thanks for all the input. There's more to it than I thought. My education in science ended at the age of 13, except for electronics training in the Air Force. But at least give me credit for the original thought about medical intervention in the situation. Diatom. 173.189.136.110 (talk) 22:50, 21 December 2010 (UTC)[reply]

Megapnosaurus name meaning

It's me again. Except with a theropod question this time, which is actually quite simple:

Why the heck does Megapnosaurus mean "big dead lizard?" It's not especially big, and they're all dead (except for the birds of course :D)...Crimsonraptor (talk) 15:30, 19 December 2010 (UTC)[reply]

Well, it means that because that's how the Greek translates. But I suspect you don't want to know that, but rather why that name was chosen. All I can say is that palaeontologists have a weird sense of humor... --Stephan Schulz (talk) 16:18, 19 December 2010 (UTC)[reply]
They have weird senses of humor alright. Take Colepiocephale. I looked it up one day and it said the genus name (from Greek) meant "knuckle head." What makes it more funny is that it actually makes sense: it's a pachycephalosaur. Crimsonraptor (talk) 18:05, 19 December 2010 (UTC)[reply]

Unusual Vision?

This is a medical question, but not a question for medical advice. Whenever I look through my right eye only, everything appears slightly reddish. When I look through my left eye only, everything appears slightly bluish. With both eyes open, everything appears to be its normal color. What explains the differences in my vision?24.88.86.197 (talk) 18:02, 19 December 2010 (UTC)[reply]

I assume it's not these. Sean.hoyland - talk 18:33, 19 December 2010 (UTC)[reply]
I have the same situation as the OP. My eyes see slightly different colors: my right eye sees a little redder and my left a little bluer. I had always assumed this wasn't a pathological problem, merely a result of subtle variation in the distribution of cone cells between the two eyes. It's a situation I have always been curious about, but have never asked about myself... --Jayron32 18:50, 19 December 2010 (UTC)[reply]
I have something similar and came to a similar conclusion as Jayron. I've looked into it briefly before but never really found anything useful. I also asked an optometrist once (during an eye examination, although this was a student optometrist at the University of Auckland optometry clinic), but they didn't seem to be sure what I was asking about; since it didn't bother me (I was simply asking out of curiosity) and I suspected they wouldn't know, I didn't really try to explain. Nil Einne (talk) 19:06, 19 December 2010 (UTC)[reply]
Yes, I've noticed a similar effect, but only very slight, and it seems to vary. I've put it down to a difference in blood flow, but perhaps someone somewhere has done some research, or maybe an expert might have an informed opinion. The condition seems to be common, so it wouldn't be medical advice. I have also wondered whether it might be just a difference in the way the brain processes the two sets of three signals, rather like two identical photographic films, exposed identically, but being processed side-by-side with slightly different chemicals (pre-digital). There would be nothing wrong with either film or its exposure, just a matter of opinion about which colour development looked best, so perhaps there is nothing "wrong" with either eye. Apologies for the speculation, but it is well-known that colour is "seen" in the brain, and the eye doesn't combine signals. Dbfirs 20:56, 19 December 2010 (UTC)[reply]
Cerebral achromatopsia "is a type of color-blindness that is caused by damage to the cerebral cortex of the brain, rather than abnormalities in the cells of the eye's retina".
Color blindness, or color vision deficiency, "is the decreased ability to perceive differences between some of the colors that others can distinguish. It is most often of genetic nature, but may also occur because of some eye, nerve, or brain damage, or exposure to certain chemicals."
I'm not sure what is going on with differences between the color vision of one eye and the other, but would recommend getting a professional vision test for those that have it. WikiDao(talk) 21:20, 19 December 2010 (UTC)[reply]
Except that it isn't color blindness, which is an inability to distinguish different colors, such as being presented with two pictures, one red and the other green, and being unable to tell which is red and which is green. This is not that at all. It's just a slight difference in color perception between each eye. None of us has any trouble identifying colors. --Jayron32 21:25, 19 December 2010 (UTC)[reply]
After reading a bit about colour perception, I think my analogy of chemical processing might be closer to the truth than I realised. The cones use a chemical pigment that is sensitive to a narrowish wavelength range (a normal distribution with mean at the wavelengths of red, green and blue for the respective cones). If one eye happens to be renewing the pigment for a particular colour, then the concentration of the pigment in the cones of that eye might be slightly different, so the signal will be weaker or stronger accordingly. If the effect is permanent and always the same difference between eyes, then there must be a permanent difference between the pigment concentrations (rather like some people have different iris pigmentations). A temporary effect would occur if one eye was renewing pigment when the other wasn't, or after stronger exposure of one eye to a strong colour when sensitivity is reduced so the opposite colour is seen more strongly. Medical advice would be recommended if there is a large permanent difference that happens suddenly, otherwise most of us are happy to accept that our colour vision is slightly imperfect, but not as imperfect as the one in twelve males who have one of the many forms of colour blindness (usually caused by the wrong pigment in the cones). Dbfirs 22:06, 19 December 2010 (UTC)[reply]
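As a toy illustration of that pigment idea (the peak wavelengths, curve widths, and the 5% concentration difference below are illustrative assumptions, not measured values), one can model each cone class as a Gaussian sensitivity curve and see how scaling one pigment shifts the colour balance between the two eyes:

```python
import numpy as np

wavelengths = np.linspace(400, 700, 301)  # visible range, nm

def cone_signal(peak_nm, width_nm, concentration=1.0):
    """Integrated response of one cone class to flat-spectrum (white) light,
    modelled as a Gaussian sensitivity curve scaled by pigment concentration."""
    sensitivity = np.exp(-0.5 * ((wavelengths - peak_nm) / width_nm) ** 2)
    return concentration * sensitivity.sum()

for eye, red_concentration in [("left", 1.00), ("right", 1.05)]:
    r = cone_signal(565, 40, red_concentration)  # long-wavelength ("red") cones
    g = cone_signal(535, 40)                     # medium ("green") cones
    b = cone_signal(445, 30)                     # short ("blue") cones
    print(f"{eye} eye R:G:B = {r:.1f} : {g:.1f} : {b:.1f}")

# The right eye's slightly higher "red" pigment concentration tilts its
# R:G:B balance, so the same white scene looks a little warmer through it.
```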
I noticed in childhood that I saw things slightly differently coloured with my two eyes. I have never worried about it. It's not clear to me that it is even necessarily a physical or organic difference in my eyes. Since seeing is something that the brain actually learns to do, it seems to me that there is no need for the pathways from the two eyes to learn identical processing of colour: the brain learns to cope with the fact that the field of view from the two eyes is different, so there is no prima facie reason why it should have to identify the "colour signals" from the two eyes at that level. --ColinFine (talk) 00:40, 20 December 2010 (UTC)[reply]
Yes, it seems that different people (and possibly different eyes in the same person) detect colour very differently, with some eyes having forty times as many cones as others (though not so large a difference within the same person). The brain does all of the compensation work (like a good digital image processor), and we only notice the effect when we confuse it by rapidly switching eyes. What is surprising, given the differences in physiology, is that most people seem to have roughly the same final perception of colour, and agree on shades, etc. This seems to be because we subconsciously learn to adjust our perception to match that of the majority. Dbfirs 08:46, 20 December 2010 (UTC)[reply]
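To illustrate the "digital image processor" analogy above: the toy Python sketch below shows how a small per-channel gain difference between two "eyes" shifts the raw colour response, and how a simple von-Kries-style normalisation against a shared white largely cancels it. The 5% gain figure and the stimulus values are arbitrary assumptions for illustration, not physiology.

```python
# Toy illustration of the white-balance analogy above.
# Assumption (arbitrary, not physiology): one eye's red channel
# responds 5% more strongly than the other's.

def response(rgb, gains):
    """Raw channel responses: stimulus scaled by per-channel gain."""
    return [c * g for c, g in zip(rgb, gains)]

def von_kries(rgb, white_response):
    """Von-Kries-style adaptation: rescale each channel so a known
    white stimulus produces equal responses in all channels."""
    return [c / w for c, w in zip(rgb, white_response)]

stimulus = [0.60, 0.50, 0.40]       # an arbitrary mid-grey-ish colour
left_gains = [1.00, 1.00, 1.00]     # reference eye
right_gains = [1.05, 1.00, 1.00]    # "warmer" eye: red channel +5%

left_raw = response(stimulus, left_gains)
right_raw = response(stimulus, right_gains)
print(left_raw, right_raw)          # raw responses differ in red

# After each eye adapts to the same white surface, the gap closes:
white = [1.0, 1.0, 1.0]
left_adapted = von_kries(left_raw, response(white, left_gains))
right_adapted = von_kries(right_raw, response(white, right_gains))
print(left_adapted, right_adapted)  # effectively identical
```

The point is only that a fixed downstream rescaling can hide a fixed upstream difference, which is consistent with the effect mainly being noticed when switching eyes quickly.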
I have done this before: close one eye and keep the other open in a bright room for a minute, then open the closed eye. Each eye sees somewhat differently. --Chemicalinterest (talk) 12:32, 20 December 2010 (UTC)[reply]
I have sometimes observed this effect being caused by the *position* of my eyes. For some reason much warmer light was being reflected into one eye than the other, and moving my head shifted that light to the other eye. --Sean 17:25, 20 December 2010 (UTC)[reply]

[For Concentration] ADD meds (stimulants) vs. benzodiazepines/anxiolytic meds (depressants)

[edit]

I am looking for some research articles/papers (via Google Scholar etc.), but I am not exactly sure what search terms to use. (Edit: a general (knowledgeable) response to the questions posed below is okay; I will try to find citations to confirm later. Thanks.)

Specifically, I am looking at users who have Benzodiazepine dependence, i.e. who are addicted to benzodiazepines like clonazepam, and who are also addicted to stimulants like Ritalin.

Now my assumption is this: if the user takes both of them together, for example Ritalin with clonazepam, the net effect to the user will be the same as taking a placebo, since one is a stimulant and the other is a depressant. Is it as simple as that?

Currently, medical professionals diagnose patients with both ADD and anxiety disorders, i.e. co-morbid disorders.

So, if the treatment for ADD is stimulants and the treatment for anxiety disorders is benzodiazepines, has any research been done on which would be more effective for concentration?

As an example, say they decide to withhold benzodiazepines from the test subjects and give only stimulants; or, the other way around, they withhold stimulants from the test subjects and give only benzodiazepines?

--Vicgarin (talk) 18:16, 19 December 2010 (UTC)[reply]

I don't have any papers to cite right now, but as a quick preliminary response: no, it is not as simple as that. WikiDao(talk) 18:32, 19 December 2010 (UTC)[reply]
PS. I edited my original question, i.e. a general (knowledgeable) response to the questions posed below is okay; I will try to find citations to confirm later. So if you would please elaborate, I would appreciate it. Thanks. --Vicgarin (talk) 19:01, 19 December 2010 (UTC)[reply]
I'm just saying that the two varieties of drug do not "cancel each other out" as if the net effect of taking both were equivalent to taking neither. Taking both is different than taking either alone and is different than taking neither, in terms of a wide range of psychopharmacological effects, as far as I know, and I'd be interested too in any studies that have looked into that. WikiDao(talk) 19:28, 19 December 2010 (UTC)[reply]
It would help if you explained exactly why you need this. These drugs just mimic natural neurotransmitters, of which there are many, so there is no simple stimulant/depressant pairing. Naloxone is used as a treatment for heroin overdose. Does that run parallel with your query? --Aspro (talk) 20:30, 19 December 2010 (UTC)[reply]
Start with PMID 17338593, find it in a library and/or online, read it carefully, read the citations relevant to your questions, do a citation search on those specific references trying to find other secondary literature reviews (ask a reference librarian in a medical library to help if possible) and then email your specific questions to the corresponding author, Dr. Kenna, whose email address is given in that citation. This is a very specific medical question and you shouldn't be asking the Reference Desk, you need to ask an expert. You probably also want PMID 18384709, PMID 9680053, PMID 20667290, PMID 20861593, PMID 17915180, PMID 15491232, and PMID 19476419, among others. Please remember to stick with the secondary literature ("reviews" in PubMed) instead of the ordinary journal articles which are much less accurate. Ginger Conspiracy (talk) 05:20, 20 December 2010 (UTC)[reply]
Alternatively, consider the analogous and well-known Speedball (drug) effect: "Cocaine acts as a stimulant, whereas heroin acts as a depressant. Coadministration provides an intense rush of euphoria with a high that combines both effects of the drugs, while excluding the negative effects, such as anxiety and sedation." WikiDao(talk) 11:23, 20 December 2010 (UTC)[reply]
Excluding the negative effects... until one or both of the drugs starts to wear off? Ginger Conspiracy (talk) 15:18, 20 December 2010 (UTC)[reply]

The reason I brought up addiction/dependence in my original question was the assumption that those treated for anxiety disorders (panic attacks, social anxiety, generalized anxiety, etc.) will have developed a tolerance to benzodiazepines (clonazepam).

And similarly, those treated for ADD or ADHD will have developed a tolerance to stimulants (Ritalin, Dexedrine, Adderall).

Not specifically substance abuse.

I was looking for research articles on how patients with the co-morbid disorders anxiety disorder and ADD are treated. I can't find an article that deals with just these two disorders.

There is atomoxetine (not a stimulant) for use in ADD, and/or SSRIs for use in anxiety disorders. But has no research been done on the efficacy of stimulants and benzodiazepines together, as I mentioned in my original question? --Vicgarin (talk) 21:06, 22 December 2010 (UTC)[reply]

Please address your question to medical experts; especially the corresponding authors whose email addresses are given in the PMID review links above, and let us know what they tell you. Thank you. Ginger Conspiracy (talk) 01:29, 23 December 2010 (UTC)[reply]

Birds' ability to map the night sky

[edit]

Hi,

In a TV programme shown recently in the UK ("The Zoo"), it was said that corncrakes map the stars in the sky to enable them to return home after migration. For that reason, the chicks in a captive breeding programme were prevented from seeing the night sky at the breeding site because that would not be their eventual home.

Although the pattern of the stars obviously varies with latitude, the difference in latitude between the breeding site and the eventual release site appears to be only about one degree, which basically means that the stars are shifted in the sky by about one degree. Is it really possible that corncrakes can reliably detect differences that small? I find it very hard to believe. 21:54, 19 December 2010 (UTC) —Preceding unsigned comment added by 81.159.79.39 (talk)

I don't have an answer for you, but wanted to note that the sky also appears to rotate overnight, so the star locations, relative to the nest, change. I wonder how they deal with that. StuRat (talk) 02:51, 20 December 2010 (UTC)[reply]
Maybe they can locate the Pole Star? 81.159.79.39 (talk) 02:55, 20 December 2010 (UTC)[reply]
The ability of animals to find their migratory homes is still one of the great mysteries of science. It is unknown what mechanisms MANY animals use to find their homes, and it is likely that different animals use different methods, or combinations of methods, to help them. --Jayron32 02:59, 20 December 2010 (UTC)[reply]
Bird migration#Orientation and navigation might be interesting if not exactly helpful for this particular question. I would suggest that olfactory (smell) and magnetic field effects might be more useful for the birds than celestial navigation. Most birds don't prefer to fly in the dark, for reasons you might expect. Ginger Conspiracy (talk) 05:40, 20 December 2010 (UTC)[reply]
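For a sense of scale on the one-degree question above: the altitude of the celestial pole above the horizon equals the observer's latitude, so a one-degree change in latitude tilts the whole star field by about one degree. A minimal Python sketch, with both site latitudes invented purely for illustration:

```python
# The altitude of the celestial pole equals the observer's latitude,
# so the star field tilts by exactly the latitude difference.
# Both latitudes below are invented for illustration only.
breeding_site_lat = 52.0   # degrees north (hypothetical)
release_site_lat = 53.0    # degrees north (hypothetical)

shift_deg = abs(release_site_lat - breeding_site_lat)
print(f"Star field tilts by {shift_deg:.1f} degree(s)")

# The full Moon subtends about 0.5 degrees, so a one-degree shift
# is roughly two Moon-widths: small, but not microscopic.
print(f"That is about {shift_deg / 0.5:.0f} full-Moon widths")
```

Whether a corncrake can actually register a shift of that size is a separate, open question, but the shift itself is about two Moon-widths, not an imperceptibly small angle.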

Taking high dose vitamin D supplements every 5 days

[edit]
This question has been removed. Per the reference desk guidelines, the reference desk is not an appropriate place to request medical, legal or other professional advice, including any kind of medical diagnosis, prognosis, or treatment recommendations. For such advice, please see a qualified professional. If you don't believe this is such a request, please explain what you meant to ask, either here or on the Reference Desk's talk page.

If you have questions about adjusting the dosages or timings of supplements, you should consult a qualified medical professional — your physician or pharmacist should be able to advise you. TenOfAllTrades(talk) 23:29, 19 December 2010 (UTC)[reply]

Our article on Plateau principle says: "Once hydroxylated, the vitamin has a half-life of about 2 months", but we cannot give medical advice except to warn that a daily dosage of 50,000 IU would probably be toxic. As advised above, please consult a professional rather than relying on Wikipedia articles, however well-written. Dbfirs 23:35, 19 December 2010 (UTC)[reply]
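For the curious, the plateau principle mentioned above is just arithmetic: with first-order elimination, repeated doses accumulate toward a plateau set by the half-life and the dosing interval. A minimal Python sketch, taking the two-month half-life from the quoted article and the every-five-days schedule from the heading; the units are arbitrary "dose equivalents", and this illustrates the principle only, not dosing guidance:

```python
# Plateau principle sketch: repeated doses with first-order elimination
# accumulate toward a plateau. The ~60-day half-life is from the article
# quoted above; the 5-day interval is from the question's heading.
half_life_days = 60.0
dose_interval_days = 5.0
decay_per_interval = 0.5 ** (dose_interval_days / half_life_days)

level = 0.0
for dose_number in range(1, 73):               # about one year of dosing
    level = level * decay_per_interval + 1.0   # decay, then one unit dose
    if dose_number in (1, 6, 18, 36, 72):
        print(f"after dose {dose_number:2d}: {level:5.2f} dose equivalents")

# Geometric-series limit of the accumulation:
plateau = 1.0 / (1.0 - decay_per_interval)
print(f"theoretical plateau: {plateau:.2f} dose equivalents")
```

The striking point is that with a half-life this long relative to the interval, the steady-state level is many times a single dose, and it takes several half-lives (the better part of a year here) to get there.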
Thanks. B.t.w., this is not a medical question at all, as the supplements I'm taking are available without prescription. I don't plan to go to a doctor, as I've read that the science on vitamin D is controversial and many doctors still stick to the guideline that 200 IU per day is the recommended dose. So, the likely "medical advice" I would get is to stop taking the supplements; medical advice on this issue would thus be irrelevant. Count Iblis (talk) 01:06, 20 December 2010 (UTC)[reply]
Please don't restore requests for medical advice to this page, Count Iblis. It is still medical advice you're requesting, even if it deals with medications or supplements taken off-prescription. Please consult your pharmacist for information about supplements; don't ask Wikipedians to give you their advice. TenOfAllTrades(talk) 02:09, 20 December 2010 (UTC)[reply]
The advice your doctor would give is in fact relevant because it is based on Science. Just because you choose to ignore it, doesn't make it irrelevant. If you want advice on megadosing or some other unscientific modality, then this isn't the place to ask. Vespine (talk) 03:09, 20 December 2010 (UTC)[reply]
Counting on medical doctors to base their advice on science is a bit naive. --Trovatore (talk) 08:51, 20 December 2010 (UTC)[reply]
Indeed, doctors are not scientists, and they stick to guidelines (which they legally have to do). In the case of vitamin D, the official guidelines are, according to the science on vitamin D, hopelessly outdated. According to the science, 10,000 IU per day is a normal dose, not a megadose at all. You only have to be careful if you also get a large amount of vitamin D from the Sun (which can supply 10,000 IU per day on its own). In fact, it is likely that the official guideline that 200 IU per day is enough is dangerous advice that is best ignored. Count Iblis (talk) 13:11, 20 December 2010 (UTC)[reply]
The vitamin D from the sun is not the same as ingested vitamin D, nor is the physiology involved the same. 92.15.26.185 (talk) 16:26, 22 December 2010 (UTC)[reply]
Or just read today's newspaper, or read this. Count Iblis (talk) 14:06, 20 December 2010 (UTC)[reply]
Hypervitaminosis D and Vitamin_d#Overdose_by_ingestion say that overdosing on vitamin D causes premature aging and heart disease. Do not make the mistake of thinking that because small doses are good, bigger doses must be even better - it does not work like that. There is only a narrow range between benefit and harm. Taking megadoses of vitamins is foolish and does you harm. Taking five days' worth of doses or more every five days is really crazy. 92.15.13.152 (talk) 15:28, 20 December 2010 (UTC)[reply]
Oh my goodness. A newspaper is NOT a peer-reviewed journal. The popular press has a major problem distinguishing between good science and total bullshit. If you want to prove your point, you're going to have to find better sources than the Montreal Gazette and The Globe and Mail. I am sure they are fine papers, but they are not a source for scrupulously reliable scientific information. --Jayron32 15:27, 20 December 2010 (UTC)[reply]
I know that about newspapers; all I'm saying is that we can read what the experts themselves are saying, and the point is that what they are saying is totally incompatible with the official recommendation regarding vitamin D. The facts regarding vitamin D are that toxicity likely starts at 40,000 IU per day taken over a period of many weeks. There are no known cases of vitamin D overdose caused by consuming 10,000 IU per day or less. Recently, the maximum safe dose was increased from 2,000 IU to 4,000 IU for people older than 9 in the US. Note that this means everyone in the US can count on taking 4,000 IU per day without harm, regardless of other factors like getting a lot of vitamin D from the Sun. What is clear is that as long as you make sure your total intake is less than 10,000 IU per day, you cannot get ill effects, unless you suffer from certain rare diseases (in which case you can't spend a lot of time in the Sun either). Count Iblis (talk) 16:18, 20 December 2010 (UTC)[reply]
What experts? Papers are well known to spin stories. I have no idea whether the majority of 'experts' on vitamin D are arguing for megadoses, but I damn well wouldn't trust a random paper to tell me, particularly not the Globe & Mail, which from memory doesn't have a good track record with science stories (at least re: climate change). The paper itself quotes one 'cautionary' scientist, and this seems common in spin stories: if you come across an article on a car that runs on water (as I actually did once, in somewhat respectable media sources, although I can't recall whether they quoted a 'cautionary' scientist), they may likewise quote a sole 'cautionary' scientist. Also, define 'ill effects'. 10k IU may not kill you or cause such obviously harmful effects, but if you are taking it to ward off various cancers or whatever else, you need to be sure such (difficult to detect and complicated) negative effects aren't being missed. I would guess there's a very good chance 10k IU does have some harmful effects in far more people than you describe, even if it's likely to be a net positive for most people (I'm not saying it is; as should be clear, I have no idea). Nil Einne (talk) 17:09, 20 December 2010 (UTC)[reply]
Count Iblis is correct in claiming that some experts are investigating the benefits of high dosages of vitamin D (there was a BBC programme about this recently), but I don't think they have ruled out the possibility of harm at levels this close to toxicity. The best advice would be to wait for further research before risking very high dosages. Dbfirs 22:12, 20 December 2010 (UTC)[reply]
The harm is likely to be insidious: it silently accumulates and then suddenly you have something seriously wrong. There was a Scientific American article about this, in the November 2009 or 2008 issue, I think. 92.29.124.17 (talk) 22:39, 20 December 2010 (UTC)[reply]

From what I've read (and I don't want to push my view here, so I won't give my sources; everyone can seek out the sources he/she trusts best and agree/disagree with me), I formed the following conclusion about 2 years ago. Until a few centuries ago, the normal vitamin D intake was of the order of 10,000 IU per day, mainly from exposure to UV radiation. This should, i.m.o., be considered the normal physiological vitamin D intake for humans. However, what has happened gradually during the last few centuries, and more dramatically in the last few decades, is that we spend so much time indoors that we get extremely low doses of vitamin D. That leads to problems with the bones. To avoid those bone problems, you need a small daily dose of vitamin D, of the order of a few hundred IU.

Now, there is reasonably strong evidence that vitamin D does a lot more than promote bone health. This evidence comes from various independent research groups and from different types of investigation (epidemiological studies, studies looking at the way immune cells work, etc.). My thinking is that, while this is not rigorously proven, the burden of evidence should be on the hypothesis that extremely low doses, just enough to maintain healthy bones, are good enough for overall health. Compare with humans stopping eating fruits and vegetables and then having to take some small dose of vitamin C to prevent getting scurvy. If everyone did this for a few centuries and it became accepted practice, you can imagine that a grapefruit would be considered to contain a "megadose" of vitamin C. People could then consider eating fruits and vegetables potentially unsafe, and would only recommend it if there were rigorous proof that it reduces the risk of cancer and if adverse health risks were ruled out. The thing is, despite not having such proof (and not just w.r.t. vitamin C; we don't even know all the compounds contained in, say, a cauliflower), we do eat fruits and vegetables.

About the dose at which toxicity starts: I have read that you need to take something of the order of 100,000 IU per day for a few weeks to get ill. But for some people toxicity may start at a lower dose, and some experts have estimated that toxicity likely starts in some individuals at 40,000 IU per day. It is not likely that this toxicity dose is near 10,000 IU, because that is considered a normal dose that you can get from the Sun. There is a dispute about a study in which volunteers were given 20,000 IU per day, over whether the results show a small elevated calcium level. Now, I don't think 10,0000 IU/day should be considered close to a dangerous dose. Compare, e.g., the recommended calcium dose and the dose at which you get toxicity; these two doses are also apart by a factor of 3 to 4. Or compare the recommended dose for H2O and the toxic H2O dose.

I take 10,000 IU per day during winter, when I know for sure that I'm not getting any vitamin D from the Sun at all. In the summer I take 5,000 IU per day. That way I stay an order of magnitude below the threshold at which toxicity may begin in some individuals. The reason I do this is not per se to reduce the chance of getting cancer. I don't think cancer is all that relevant; it wasn't the main cause of death for our pre-historic ancestors. If vitamin D at, say, 2,000 IU per day does reduce the risk of getting cancer, that is likely just a side effect of the lack of its main function. Presumably, the optimal dose is around 10,000 IU per day, which one should compare to an F1 car that is fully tuned for optimal performance. If you move a bit away from the optimal settings, the car won't fall apart; but move away by a significant amount, and you may see it malfunctioning.

Anyway, all this is my personal opinion on this matter... Count Iblis (talk) 00:24, 21 December 2010 (UTC)[reply]

If you bother to read the articles, you will see that the body has its own mechanisms to prevent overdose from skin-created vitamin D, but it does not have an overdose-prevention mechanism for ingested vitamin D. You say "Now, I don't think 10,0000 IU/day should be considered close to a dangerous dose." So your opinion is going to alter physiology? You must be a god. It's like saying "Now, I don't think driving down Main Street at 10000 miles per hour should be considered close to a dangerous speed". I do believe you are going to do yourself permanent, irreversible damage. 92.24.188.27 (talk) 15:28, 21 December 2010 (UTC)[reply]
I think most experts would say something like: "10,000(0) IU/day is probably safe for most people", but perhaps Count Iblis will let us know in a few years whether he has suffered benefit or harm from this dosage. Personally, I would prefer to err on the lower side in the absence of regular medical monitoring. If I had some 50,000(0) IU tablets, and wanted to use myself as a guinea pig, I would cut them in half and take half every three days, but please don't take this as medical advice because I have no medical expertise. Dbfirs 21:26, 21 December 2010 (UTC)[reply]
He won't be able to let us know, as he will be dead or disabled. 92.15.26.185 (talk) 16:26, 22 December 2010 (UTC)[reply]
400 IU is the current RDA. 100000/400 is two hundred and fifty times the RDA. It's almost like taking a year's worth in a day. 92.15.15.127 (talk) 22:07, 21 December 2010 (UTC)[reply]
... later note: sorry, I meant 10,000 which is 25 times the RDA. Dbfirs 21:30, 22 December 2010 (UTC)[reply]
True, but the current recommendation is considered by some experts to be unreasonably low compared with the normal production in human skin exposed to summer sunlight. Dbfirs 22:30, 21 December 2010 (UTC)[reply]
You put an unusual number of zeroes after the comma, which 92.15 may have taken literally. --Trovatore (talk) 22:38, 21 December 2010 (UTC)[reply]
... oops! I must have been too tired to count zeros, both in my post (where I have now stricken the accidental zero) and in 92.15's reply. I foolishly copied from an earlier error. Apologies for causing confusion, but this just goes to illustrate the danger of taking advice from unqualified people who give replies here! Dbfirs 21:13, 22 December 2010 (UTC)[reply]
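To keep the zeros straight, here is a quick check of both ratios discussed above, using the 400 IU RDA figure quoted by 92.15:

```python
# Quick check of the two dose ratios discussed above, using the
# 400 IU RDA figure quoted earlier in this thread.
rda_iu = 400
for dose_iu in (10_000, 100_000):
    print(f"{dose_iu:,} IU is {dose_iu / rda_iu:.0f} times the {rda_iu} IU RDA")
# Output: 10,000 IU -> 25x the RDA; 100,000 IU (the misread figure) -> 250x.
```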
Even 25 times the normal dose would do you serious harm if you took other vitamins at that rate. I don't see why vitamin D should be an exception, especially bearing in mind that the body limits skin production. But it's the OP's serious medical problems, disability, slow death, and funeral. 92.29.126.195 (talk) 11:48, 22 December 2010 (UTC)[reply]
What you seem to have missed is that the body limits skin production to somewhere between 10,000 IU and 20,000 IU per day; toxicity is believed to start at a factor of 2 to 4 above this. So, in this respect, it actually is very much like other vitamins/minerals. You seem to take the RDA very seriously, but the logic behind the RDA for vitamin D is totally flawed. If you applied that same logic to vitamin C, you would take the RDA to be just enough to prevent scurvy. If I were to argue that eating grapefruits is healthy, you would object to my statement on the grounds that a grapefruit contains an order of magnitude more vitamin C than the RDA. Count Iblis (talk) 14:08, 22 December 2010 (UTC)[reply]
You might benefit from reading Ben Goldacre's book, Bad Science, where he talks about the crazy things nutritionists say about vast overdoses of vitamins. It's a very approachable work that doesn't require a lot of time to read. He discusses several of the objections above in more detail, and includes references. -- JSBillings 14:30, 22 December 2010 (UTC)[reply]
I don't think that would apply here, because we are not talking about taking large doses here. Count Iblis (talk) 15:51, 22 December 2010 (UTC)[reply]
I'd certainly call a dose 25 times the recommended one a large dose; I'd call it an extremely large dose. Many years ago I went through the same thing with vitamin C: everyone was saying how megadoses were the new wonder pill, so I took a megadose tablet every day. I got bad pains in my kidneys, although I don't think I made the connection at the time. It wasn't until some years later that I found out that megadoses of vitamin C damage your kidneys. More recently I stopped taking a multivitamin pill every day, as I read a scientific paper showing that people who overdosed on vitamins had higher mortality than those who didn't. In other words, high doses of vitamins are toxic. 92.15.26.185 (talk) 16:21, 22 December 2010 (UTC)[reply]
The point being made is that the recommended dose is under review and is considered by some to be unreasonably low. I agree that self-administered experimentation without medical monitoring is not to be advised. Dbfirs 21:30, 22 December 2010 (UTC)[reply]