
Wikipedia:Reference desk/Archives/Science/2012 June 4

From Wikipedia, the free encyclopedia
Science desk
Welcome to the Wikipedia Science Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


June 4


What is the word for different outcome probabilities?


What is the basic phenomenon called whereby anything in the universe at different locations will have different outcome probabilities? Such that a solid block that melts will disperse into droplets that move in different directions and have different sizes, and not be absolutely uniform? Electron9 (talk) 01:31, 4 June 2012 (UTC)[reply]

Usually a solid block that melts turns into a pool of liquid. Could you make it a bit more clear what you are asking, please? Looie496 (talk) 03:11, 4 June 2012 (UTC)[reply]
Chaos? — kwami (talk) 03:40, 4 June 2012 (UTC)[reply]
See order and disorder (physics). StuRat (talk) 05:43, 4 June 2012 (UTC)[reply]
Second law of thermodynamics or Entropy. Phase space is used to represent all configurations & states a system can have. SkyMachine (++) 07:22, 4 June 2012 (UTC)[reply]
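As a toy illustration of the entropy and phase-space point above (an editorial sketch, not part of the original answer): counting the configurations of a simple two-state system shows why "disordered" macrostates dominate, which is the combinatorial core of the second law.

```python
from math import comb, log

# Toy model: N two-state particles ("up"/"down"). The number of
# microstates with k ups is C(N, k); Boltzmann's S = k_B ln W says
# entropy is largest where the microstate count W is largest --
# the maximally mixed, "disordered" middle.
N = 100
for k in (0, 10, 50):
    W = comb(N, k)
    print(f"k={k:3d}  W={W:.3e}  ln W={log(W):.1f}")
```

The count for k=50 dwarfs the others, which is why an unconstrained system overwhelmingly ends up near that macrostate.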

Transit of Earth, as seen from Mars?


I am only a basic amateur in this, so tell me if my reasoning is correct. As all the planets pass in great arcs around the Sun, then each planet must experience eclipse events with all other planets closer to the Sun than it is. (This would be true even if the planets did not move in a plane, as they do in reality.) Thus, Earth has both Transits of Venus and of Mercury. I am arguing that Mars would experience transits of both these, as well as a Transit of Earth. Is this correct?


Similarly, Jupiter would experience transits of Mercury, Venus, Earth and Mars. And Pluto would have transit events of all the planets. Have these events been considered and calculated? Which would be the rarest eclipse, for that is what they are? Would there be total eclipses with some transits, as the Sun appears smaller for the outer planets? When all the moons are taken into account, how many transits and partial / total eclipses are there altogether in our Solar System? What is the rarest and most spectacular? There must be cases where there are simultaneous eclipses involving 4 or more bodies. Myles325a (talk) 08:03, 4 June 2012 (UTC)[reply]

You are not the first person here to have thought about this. Have a look at Transit of Earth from Mars and, more generally, Astronomical transit and the navigation box "Transit visibility from planets superior to the transiting body" near the end of each article. None produces anything like a total eclipse, Jupiter seen from Saturn being the greatest at 5-6 percent, according to Transit_of_Jupiter_from_outer_planets#Saturn. Thincat (talk) 09:49, 4 June 2012 (UTC)[reply]
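The 5-6 percent figure can be roughly sanity-checked (an editorial sketch assuming circular, coplanar orbits at mean distances and the small-angle approximation; the diameters and orbital radii are standard textbook values):

```python
# Fraction of the Sun's disc area covered by Jupiter as seen from
# Saturn during a transit, using mean distances -- a simplification,
# since both orbits are actually eccentric.
AU = 1.496e8                 # km
d_sun = 1.3914e6             # solar diameter, km
d_jup = 1.398e5              # Jupiter diameter, km
a_jup, a_sat = 5.20 * AU, 9.58 * AU

ang_sun = d_sun / a_sat                # apparent sizes (small-angle)
ang_jup = d_jup / (a_sat - a_jup)
coverage = (ang_jup / ang_sun) ** 2    # ratio of disc areas
print(f"{coverage:.1%}")               # roughly 5%
```

The result lands at the low end of the quoted 5-6 percent range, consistent with the article's figure.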
... and according to this, as seen from Earth, there will be a simultaneous transit of Venus and Mercury on 26 July 69163. Thincat (talk) 10:36, 4 June 2012 (UTC)[reply]
I'll be sure to mark my calendar so I don't miss it. :-) StuRat (talk) 00:47, 5 June 2012 (UTC) [reply]
We have a series of articles covering several of the planets: Solar eclipses on Pluto, Solar eclipses on Jupiter, Solar eclipses on Mars, Solar eclipses on Uranus and Solar eclipses on Saturn. Apparently, most of the planets can experience total eclipses from their respective moons -- Ferkelparade π 10:56, 4 June 2012 (UTC)[reply]
Without having done the math or research, I'd expect that a Neptunian transit viewed from Pluto (and its corollary, the Plutonian transit viewed from Neptune) won't occur due to the 2:3 orbital resonance between those bodies and Pluto's high inclination. Specifically, while you'll be able to draw a rough Sun-Neptune-Pluto line on a regular basis, it'll be in pretty much the same spot every orbit and that spot is not likely to coincide with the line where the Sun, Neptune, and Pluto are actually co-planar. — Lomn 14:08, 4 June 2012 (UTC)[reply]

"Seed for one year, weed for seven"


As I struggle this Jubilee weekend to weed, in the rain, the wilderness that counts as my back garden, my partner can be heard from the kitchen quoting an old saying that is meant to spur me into action. In truth it just annoys the bejesus out of me. The old saying "seed for one year, weed for seven": how true is this? How many times do I have to pull up weeds before they finally give up the ghost and decide that they're not welcome in my garden? -- roleplayer 09:34, 4 June 2012 (UTC)[reply]

What's your life expectancy? ←Baseball Bugs What's up, Doc? carrots→ 09:38, 4 June 2012 (UTC)[reply]
37×10⁹ should cover it. Benyoch ...Don't panic! Don't panic!... (talk) 09:46, 4 June 2012 (UTC)[reply]
I was going to say "6.02×10²³", but then I realized your yard was infested with weeds not moles. DMacks (talk) 09:55, 4 June 2012 (UTC)[reply]
*Facepalm* Oh dear. -- roleplayer 09:59, 4 June 2012 (UTC)[reply]
Bravo! Brammers (talk/c) 13:09, 4 June 2012 (UTC)[reply]
I've never heard of that saying, but it reminds me of Exodus 23:10,11. Plasmic Physics (talk) 13:21, 4 June 2012 (UTC)[reply]
Exodus 23:10,11 is not related in any way except that it mentions the number 6. --169.232.178.111 (talk) 07:39, 6 June 2012 (UTC)[reply]
What exactly does it say in Exodus 23:10,11? Not all of us are Christians (I'm not) - I assume this is from one of the Christian bibles? Wickwack121.221.26.41 (talk) 15:03, 4 June 2012 (UTC)[reply]
Surprisingly, the internet is not just a porn machine. -RunningOnBrains(talk) 15:14, 4 June 2012 (UTC)[reply]
Speak for yourself. Reading Old Testament passages gives me screaming orgasms. Throw-away-account-02783457342 (talk) 16:55, 4 June 2012 (UTC)[reply]
There is one Bible, but many translations, not all coherent. Exodus is one Book contained within the Bible. Plasmic Physics (talk) 23:44, 4 June 2012 (UTC)[reply]
I think the saying is that perennials will only need to be planted every seven years or so, but need to be weeded constantly. I'm not sure the seven is meant to be a scientifically stringent and tested number, just that when you garden, planting happens much less often than weeding. Indeed, the major job of a gardener, above all else, is weeding. The actual planting of the seed is a minor amount of the labor involved in the endeavour. So, again, don't look for a scientifically-proven truth to the aphorism; instead look at it as a general idea behind gardening; when you plant a garden expect most of your time and effort to be spent pulling out undesirable plants. --Jayron32 13:55, 4 June 2012 (UTC)[reply]
The only way I've heard of that saying is in relation to the poppy, not to all seeds. That may put your mind at rest. Or you can learn to love your weeds, which after all are mainly wild flowers in unwanted places :) --TammyMoet (talk) 15:02, 4 June 2012 (UTC)[reply]
I had a friend once who used to say that weeds were just flowers that grow where you don't want them to. Imagine my pleasure when the question of what the difference was between weeds and flowers came up on Simon Mayo's drivetime programme on BBC Radio 2, and the expert they brought on to answer the question basically said what my friend had always said in jest. -- roleplayer 18:04, 4 June 2012 (UTC)[reply]
So as your friend and the BBC guest dude are correct, all you need to do is to convert your interest to the non-preferred flower/s or plant/s (aka weed/s) and promote its/their growth in lieu of your present choice, and you will be guaranteed a successful albeit useless crop that doesn't need weeding. Please, don't thank me, just send cash. Benyoch ...Don't panic! Don't panic!... (talk) 01:03, 5 June 2012 (UTC)[reply]
My garden has spent the last twelve years of my tenure doing its own thing in spite of any effort on my part to do anything different with it. In some parts of the world ivy is considered rustic and quaint. In my garden it's bloody annoying. And don't get me started on bloody bindweed. The only thing preventing me from taking a flametorch to it is the slim chance I might accidentally burn my own house down. -- roleplayer 01:12, 5 June 2012 (UTC)[reply]
I wonder whether anyone has ever considered genetically engineering a fungus which kills only ivy, with a double kill switch coded into it? Plasmic Physics (talk) 01:21, 5 June 2012 (UTC)[reply]
Some years ago I saw this discussed on Gardeners World on the BBC. It refers to what happens once you leave the weeds to set seed, and the guy reckoned there was a lot of truth in the adage. His explanation was that only about 50% of the weed seeds that drop onto the soil germinate in any one year - the rest are left in the soil. So, assuming you pull up all the new weeds every year, then each year the number of seeds left in the soil will reduce by 50%. After seven years this will reduce the remaining seeds to a negligible amount. He demonstrated this by dividing a pile of seeds by half and then half again and, after doing this seven times, there were indeed very few left. Who says you can't learn from watching television? And as for bindweed - glyphosate gel painted on the leaves is the only answer, I believe. Richerman (talk) 01:51, 5 June 2012 (UTC)[reply]
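The halving demonstration works out as a few lines of arithmetic (an editorial sketch of the presenter's argument, not from the programme itself):

```python
# If only half of the dormant seed bank germinates (and is pulled up)
# each year, the fraction still in the soil after n years is 0.5**n.
remaining = 1.0
for year in range(1, 8):
    remaining *= 0.5
    print(f"after year {year}: {remaining:.1%} of the seed bank left")
# After year 7 under 1% remains -- hence "weed for seven".
```

So the adage matches a simple exponential-decay model, which is presumably all the rhyme was ever claiming.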
How does that relate to a genetically engineered fungus? Plasmic Physics (talk) 03:14, 5 June 2012 (UTC)[reply]
ok, I've outdented it - is that better?
An indent on the same level will do, that way it doesn't look like an answer to my comment. Plasmic Physics (talk) 08:47, 5 June 2012 (UTC)[reply]

Body Vs Face


I have been to many cold places. When we dress for the cold, we cover our entire body below the chin, plus the ears and the head. The only parts that are exposed are the cheeks, nose, eyes and the majority of the forehead. We do not feel any shivering sensation or extremely cold sensation when these parts are exposed, but when you remove one sweater, you start shivering. Why is it so? What is so special about the face? How is it able to thermoregulate so well? — Preceding unsigned comment added by 117.193.137.186 (talk) 12:17, 4 June 2012 (UTC)[reply]

The face is just small, so not much heat is lost through it. Shivering isn't localised, it happens when your core body temperature is too low. It doesn't matter where you are losing heat from, just the total amount of heat you are losing. --Tango (talk) 12:53, 4 June 2012 (UTC)[reply]
Agreed. The face still becomes quite cold much like your feet if you happen to be in a cold basement without shoes or socks on. But you don't shiver then either even though your feet feel very cold to the touch. Core body temp being lowered is what triggers shivering. Dismas|(talk) 12:56, 4 June 2012 (UTC)[reply]
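To put a rough number on "the face is just small" (an editorial sketch using the clinical rule-of-nines from burn assessment; the split of the head into an exposed-face half is my own illustrative assumption):

```python
# Rule-of-nines body surface fractions (approximate, adults):
# head + neck ~9%, trunk front + back ~36%. If skin heat loss scales
# roughly with exposed area, an uncovered face is a minor term
# compared with removing a sweater from the trunk.
head_and_neck = 0.09
trunk = 0.36
exposed_face = head_and_neck / 2   # assumption: half the head is face
print(f"exposed face ~{exposed_face:.1%} of skin area; "
      f"a sweater covers ~{trunk:.0%}")
```

On that crude estimate, baring the trunk exposes roughly eight times the area of the face, consistent with the point about total heat loss driving shivering.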

Well, I mean, why doesn't it feel uncomfortable? I am sure you would feel really uncomfortable if your feet were in the basement. — Preceding unsigned comment added by 117.193.137.186 (talk) 13:53, 4 June 2012 (UTC)[reply]

If your nose and your cheeks don't feel uncomfortable when you're out in the cold, then there's something wrong with you :P 109.97.179.91 (talk) 14:56, 4 June 2012 (UTC)[reply]
To a degree. Several mitigating factors apply: (a) It is what you are used to. We don't cover our faces unless conditions are arctic, so we are used to the face being colder. When I was in primary school, we didn't wear shoes - we went barefooted. Winter temperatures routinely got down to + or -1 C, summer +40 C. But at high school, they compelled us to wear shoes - I've worn them ever since when out of the house. But I still don't sense heat or cold in my feet - if I need to go outside briefly, I don't bother with anything on my feet. (b) As alluded to above, we put sufficient clothing on to keep the whole body comfortable. This means that although the face is a small area, while wearing winter clothing it is the only skin available for the body to regulate the temperature, so more blood flows in facial skin than would be the case if you took a coat or whatever off so heat can be lost elsewhere. So, even though the face is exposed to cold air, the skin is warmer than it would be if you took some clothing off. (c) Humans have been wearing clothing of some sort in cold areas, even if just animal furs, for a very long time. Plenty of time for any slight advantage in not being bothered by a cold face to be an evolutionary pressure. Wickwack 121.221.26.41 (talk) 14:58, 4 June 2012 (UTC)[reply]
Interesting. Where and when were you allowed to go to school barefoot ? I'd think you'd have lots of cases of plantar warts and foot injuries. StuRat (talk) 18:51, 4 June 2012 (UTC) [reply]
Australia, semi-rural area, late 1950s. Attitudes were very relaxed back then. Children were allowed to be children and were expected to walk to school. Nowadays, Australia is over-the-top safety conscious like the USA. All school kids are compelled to wear a complete school uniform including proper shoes and large floppy hat, + suncream, and parents drive them to school. I'd never heard of plantar warts until your post. I've never had what is shown in the WP article. Foot injuries certainly did occur, but no real problem. It's compensated by having feet completely and properly grown. Those of us who went barefoot to school went barefoot everywhere else and have broader feet, larger toes, and higher arches. It was always considered a bad idea to go barefoot on farms with animals. Wickwack 23:50, 4 June 2012 — Preceding unsigned comment added by 120.145.46.131 (talk)
On top of blood flow there is also the hot air you exhale which helps to warm your face. I have never heard of anybody having got frostbitten on their nose. --80.112.182.54 (talk) 16:33, 4 June 2012 (UTC)[reply]
Well Medline says it's one of the most vulnerable parts of the body! There are plenty of pictures on Google Images. --TammyMoet (talk) 16:53, 4 June 2012 (UTC)[reply]
Yes, when we have to go out in extremely cold weather we usually cover our faces, too, as far as possible, or wear a hood that keeps the coldest air away from our faces. Because breathing out warms our noses, the problem is usually only serious when there is a strong wind along with the cold temperature. Dbfirs 17:01, 4 June 2012 (UTC)[reply]
The nose is definitely subject to frostbite, as some mountain climbers have learned. Those are pretty extreme conditions, though. ←Baseball Bugs What's up, Doc? carrots→ 23:39, 4 June 2012 (UTC)[reply]

In addition to the overall heat loss, there is the concern about localized cooling causing frostbite. I think the face has evolved to have more blood flow, specifically because it's so exposed. Also, the cheeks may be warmed a bit by exhaled air in the mouth. In addition to extremely low temperatures, wind can also play a major factor in frostbite. You may find yourself turning your face away from the wind to prevent that. StuRat (talk) 18:48, 4 June 2012 (UTC)[reply]

Some very good answers already that have covered almost all of this. To focus on one aspect of the original question, I would say that you do feel a cold sensation on the face, but you are accustomed to ignore it (almost certainly by habit, and perhaps by evolution too).

To make that clearer, take a look at the two pics in this BBC news article about British attitudes towards clothing and the cold. In the top pic, the schoolboys are obviously greatly enjoying themselves despite large parts of their bare legs being exposed to icy winds, and snow and ice all around them. In the pic lower down, young ladies enjoy freezing rain on a "night out", despite just about all their legs being exposed. In some times, places, recreations, or professions, it's pretty much de rigueur for ladies to wear extremely short skirts even while their upper bodies are carefully wrapped in warm clothing.

If you dressed me in a short skirt and suggested I go on a "night out" in the middle of winter, I would certainly feel an "extremely cold sensation" in unexpected areas. I'm just not used to such attire. But on the other hand, in recent very cold (by British standards) times, I happily donned a few extra sweaters, coat, hat, scarf and gloves, but didn't think about trying to wear a second pair of trousers or trying to obtain long woollen underwear. (The latter was used extensively by the British during major wars in cold climates when decent shelter was not easily found, but isn't common these days).

Why do I think we're accustomed to ignore cold sensations in certain areas partly by evolution? Well, as others have said, the shivering response is to deal with the core temperature dropping. Cold in your legs, hands or face is not a problem (from an evolutionary perspective) unless either it contributes to a core temperature problem, or it results in frostbite, or it interferes with the function of the body parts in question.

When humans first learned to use animal furs to shield themselves against cold, draping something over the shoulders, hanging down to protect the whole torso (and thus the core temperature), perhaps secured with a simple belt, would have been the first easy step. Making the equivalent of trousers, or decent boots, would take a whole lot more expertise, skill, and time; and useful gloves even more so. (Note that the ancient Greeks, who sometimes gave up on major military operations because the weather was cold and rainy, mostly fought barefoot and considered trousers to be something that women and barbarians wore. In their world, if you're a hoplite and therefore important, you own a cloak and you wrap yourself in it when the weather gets cold.)

So from an evolutionary point of view, the human body perhaps isn't really expecting the face, hands, and maybe knees or feet to be shielded from cold. (Incidentally, the first of those two pics linked above shows that the boy on the ice is well kitted out with hat and gloves, but the one trying to pull him off the ice has neither). The face has to be clear to allow vision (more important than hearing), and covering the nose and mouth not only slightly impedes breathing, but also promotes build-up of damp from the exhaled air. Just like bare legs, humans become used to bare face easily enough. --Demiurge1000 (talk) 23:54, 4 June 2012 (UTC)[reply]

I somewhat agree with Demi here that learning to ignore it is a factor. I grew up in Malaysia, but when I came to Auckland, even during winter I usually just wear a short-sleeve lightweight cotton or polyester T-shirt and long pants. Particularly in Auckland, our winters aren't really that cold (although our houses are potentially colder than in colder areas due to poor insulation), but still cold enough that I think even people from colder areas will often wear some sort of jacket. Going from a warm area (say a restaurant, or from under a duvet on a bed) out into the cold can lead to a bit of discomfort at first, perhaps even some shivering, but usually after a while I don't notice it much. Nil Einne (talk) 08:46, 6 June 2012 (UTC)[reply]

Blood donation as a free medical test?


Question 1: Donated blood is screened for various infectious diseases. When a disease is detected, is the donor contacted? (I realize high-sensitivity screening tests produce a large number of false positives and require specific tests for confirmation.)

If the answer to the above question is "yes", then I have a follow-up question.

Question 2: Please consider the following two sets of infectious diseases:

Set A: Infectious diseases not detected during regular medical check-ups

Set B: Infectious diseases screened for in donated blood

The intersection of Set A and Set B can be screened for "free" by simply donating blood. Of course it's not really free, you're actually paying with your blood, but it's essentially a good deed and a free preventive check-up. Why isn't this more popularized? I see it as an excellent sales pitch for blood donation. Is there any legal, medical, or moral reason against it?

Standard disclaimer: I am not asking for medical advice. I do not have any of the above-mentioned infectious diseases. I have not donated, nor will I in the future donate, blood (needles frighten me terribly). I am not advocating for the abuse of the blood donation system in any shape, way, or form. I'm asking purely out of curiosity after reading WP's excellent article on blood donation. Throw-away-account-02783457342 (talk) 16:51, 4 June 2012 (UTC)[reply]
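The questioner's Set A / Set B framing is literally a set intersection. A toy sketch (the disease names here are illustrative placeholders only, not a claim about what any particular check-up or blood service actually screens):

```python
# Hypothetical examples for illustration -- not medical data.
set_a = {"hepatitis C", "HTLV", "West Nile virus"}   # assumed: not in routine check-ups
set_b = {"hepatitis B", "hepatitis C", "HIV",
         "syphilis", "HTLV"}                          # assumed: blood-donation screen

# Diseases that donating blood would screen "for free" under this framing.
free_screening = set_a & set_b
print(sorted(free_screening))
```

The question is then whether anything in that intersection justifies the "free check-up" pitch, which is what the replies below address.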

Our article, which hasn't been modified since May, seems to more or less answer a lot of what you asked, so if you read it I'm not sure why you're asking. For example, it says 'The donor is generally notified of the test result'. Also, you seem to be missing the obvious, which our article also mentions: besides false positives, screening can have false negatives (e.g. if the infection can't be detected yet), even if they are rare. To put it a different way, there's a reason in many countries there are restrictions on who can donate, even if the extent of these restrictions is sometimes controversial (both of these are mentioned in our article); I think it's rare people suggest there should be no restrictions. This leads to an obvious reason (which ironically our article also mentions, in a fashion) why donor agencies would not want 'a free preventive check-up' to be a sales pitch. They don't want to be primarily encouraging people who think they might have an infectious disease to donate. What they want is people who are unlikely to have an infectious disease. So all in all, it doesn't sound like you read the article properly or are thinking this through. Why would it be so difficult for people to get a screening test without having to donate blood? If these tests are fairly expensive, then this has implications for the cost of the blood donation process, so unfortunately that means they're probably not done (another thing our article mentions). So in reality, in most countries the same screening tests are available without donating blood, and no one needs to encourage people to risk infecting recipients by using free screening as a carrot for blood donation. (Another thing our article mentions: many people think blood donation should rely on unpaid volunteers. While it doesn't really discuss the reasons, one of them is the fear that paid donations would encourage donors to lie, or would otherwise attract riskier donors.) Nil Einne (talk) 17:43, 4 June 2012 (UTC)[reply]
Thank you for the prompt response. Yes, I read the sentence "The donor is generally notified of the test result". It doesn't sound very certain to me, so I'm asking for clarification. "Generally" has the meaning of "in disregard of specific instances" according to Merriam-Webster, and I'm precisely asking about one of these "specific instances", so I thought the general statement might not apply to this specific instance.
I used Set A and Set B because I thought it better to have the discussion in the abstract, but now it seems like the discussion can't continue until I name the disease. The only disease I'm aware of that's in both Set A and Set B is Hepatitis C. According to the WP article, transmission of Hepatitis C from blood transfusion is 1 in 10,000,000, which is essentially the false negative rate you mentioned. Personally I think 1 in 10,000,000 is negligible and can be left out of the discussion entirely. If I have misinterpreted that statistic (quite likely) or you feel the false negatives are still an issue worth discussing then please feel free to point it out.
Regarding your points on donor restrictions and "primarily encouraging people who think they might have an infectious disease to donate", I perfectly understand where you're coming from. My initial question sounded very much like suggesting "let's invent a loophole so that people who abuse drugs intravenously and people who engage in risky sexual behaviors can screw the system." Please take my word that that was not my intention at all. I learned in a recent news article[1] that Hepatitis C is quite prevalent in North America and was shocked to find out that it's not commonly screened for in the US. Consequently, most carriers of Hepatitis C do not even know they have it, and no amount of annual check-ups will reveal it, until it's too late. Donor restrictions won't have any effect on these silent carriers (since they are completely oblivious of it). Setting aside the people who caught Hepatitis C through injection drug use, the other 40% of Hepatitis C carriers won't even have a reason to suspect they could have caught an infectious disease, so they are definitely not "people who think they might have an infectious disease". Again, I'm not advocating for people lying on their blood donation forms or abusing the blood donation system in any way. I'm saying people who have no reason whatsoever to suspect they have an infectious disease, who have never injected drugs, who have never engaged in risky sexual behavior (just normal folks like you and me) should donate blood. It helps both society and themselves. This group is roughly 2.5 million people in the US.
Regarding your question "Why would it be so difficult for people to get a screening test without having to donate blood?", the answer is that in North America it's almost impossible to get tested for Hepatitis C unless you specifically ask for it. And since most people aren't even aware of the disease, they don't know to ask for it. After all, their physician, who has years of medical experience, didn't bring it up; it's kind of presumptuous and Chicken Little to say "test me for X" just because they read the WP article on X. Throw-away-account-02783457342 (talk) 18:26, 4 June 2012 (UTC)[reply]
You can't really have complicated policies, and expect the general public to follow them. "Don't donate just to get a test" is manageably simple. "Do donate to get tested only if you have no reason to believe that you need to be tested" is too much of a mixed message. I was once deferred from donating blood for a year because I had gotten a rabies vaccination. Now, they consider preemptive vaccinations (like, before travelling) to be fine, but not vaccinations in response to exposure to a potentially-rabid animal (the vaccine is safe, they're just worried that you might be infected still, I guess). However, they don't have an exception for if you later find out that the animal didn't actually have rabies. I assume that the policymakers knew how to formulate a policy that would've worked better if it were followed to the letter, but they were probably also worried about making a policy that people could follow in the first place. Paul (Stansifer) 00:09, 5 June 2012 (UTC)[reply]
You need really to specify what country you're talking about. In the UK potential blood donors are specifically told at http://www.blood.co.uk/can-i-give-blood/donor-health-check/ question 11: Please don’t give blood if you THINK you need a test for HIV or Hepatitis or if you have had sex in the past year with someone you think may be HIV positive or Hepatitis positive. Although the chances of infected blood getting past our screening tests is very small, our tests do not always show if you are infected. This is why we must take care in choosing donors and why you must not give blood if you are infected. We rely on your help and co-operation. Other countries have other rules. The UK system does say it will inform you of positive results. Myself, I gave up blood donation a couple of months ago and took up platelet donation instead (the UK requires either/or). I can recommend it. Tonywalton Talk 00:41, 5 June 2012 (UTC)[reply]
I'm interested in answers all over the world. AFAIK most developed and developing countries screen for Hepatitis C in donated blood, yet none of those countries screen the general population for Hepatitis C. Throw-away-account-02783457342 (talk) 10:59, 5 June 2012 (UTC)[reply]
Just out of interest, I gave a platelet donation today and noted down what it says in the Notes to Donors here in the UK. They test donations (whole blood as well as platelets) for hepatitis B, hepatitis C, HIV, syphilis and HTLV. They also test for such diseases as malaria and West Nile virus if the donor has been in parts of the world where these are prevalent within certain time frames. If any of these are detected they do inform the donor and "offer advice and support". The caveat above ("Please don’t give blood if you THINK you need a test…") is repeated. Tonywalton Talk 21:31, 7 June 2012 (UTC)[reply]

Geologist versus engineer


Which type of scientist do environmental consulting firms hire more of? Would a hydrogeology concentration in a geology degree make me as desirable to environmental consulting firms like Parametrix as an environmental engineering degree? Thanks a lot. — Preceding unsigned comment added by 99.146.124.35 (talk) 17:29, 4 June 2012 (UTC)[reply]

It's hard to say. If you want to know, call the company directly and ask them what you should study if you want to be considered for a job there or at a similar firm. They will tell you what they are looking for. After all, asking the person who knows the answer directly is more likely to produce the correct answer than asking random strangers on the internet; the odds of finding a hiring manager from the company in question that way are vanishingly small. --Jayron32 18:06, 4 June 2012 (UTC)[reply]

Why didn't anyone try to look for the transit of Venus in the pre-telescopic era?


Geocentric theory presumes the two inner planets to be closer to the Earth than the Sun is. They pass the Sun hundreds of times a century. Venus never misses by more than c. 17 sun-widths. Therefore it might be worth trying to catch one. Maybe all those epicycles didn't predict well enough; still, every time it looked like it might happen I would look to see if anything interesting happened. Maybe they assumed that it couldn't be seen against the Sun because its glow would be overpowered by the Sun (it's not obvious even against blue sky, after all), and that it had no size. They would not be looking for a shadow. Still, I'd just wonder if anything magic happens. Very powerful astrology going on. Sagittarian Milky Way (talk) 21:27, 4 June 2012 (UTC)[reply]
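The "hundreds of times a century" figure can be checked from the synodic periods (an editorial calculation using standard sidereal periods; inferior conjunctions recur once per synodic period):

```python
# Synodic period relative to Earth: 1 / (1/P_planet - 1/P_earth),
# with sidereal orbital periods in days. Each synodic period brings
# one inferior conjunction (a pass "in front of" the Sun's direction).
P_earth = 365.256
for name, P in [("Mercury", 87.969), ("Venus", 224.701)]:
    synodic = 1 / (1 / P - 1 / P_earth)
    per_century = 36525 / synodic
    print(f"{name}: synodic period {synodic:.1f} d, "
          f"~{per_century:.0f} conjunctions per century")
```

Mercury contributes roughly 315 passes per century and Venus roughly 63, so the combined total is indeed in the hundreds; almost all miss the solar disc because of the orbital inclinations.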

In fact our article Transit of Venus states that in the Dresden Codex the Mayans charted Venus's full cycle, yet despite their precise knowledge of its course they never mentioned even the theoretical possibility of a transit (I don't know if the Mayans thought it had angular size or not). Sagittarian Milky Way (talk) 22:27, 4 June 2012 (UTC)[reply]

I don't think they would have had the means to calculate when a transit would happen, so they would need to spend a lot of time looking before they saw one. Even then, they would struggle to see it without magnification. The best thing they could look through would be some kind of smoked glass, which wouldn't give a very clear view. --Tango (talk) 21:59, 4 June 2012 (UTC)[reply]
Eh, I saw it at sunrise with no magnification (heck, eyeglasses for nearsightedness even make the view slightly smaller). Still, they only had to look once at sunrise and once at sunset every 0.8 years for... oh, up to 120 years. A 1 out of 2 chance of catching it. But there are two of them. Sagittarian Milky Way (talk) 22:17, 4 June 2012 (UTC)[reply]
Were you using blurry smoked glass? --Tango (talk) 23:29, 4 June 2012 (UTC)[reply]
No, luckily it was pretty hazy and it was right at rising over a relatively low horizon. Of course, if you used blurry smoked glass (which you shouldn't do unless the sun is very low) I don't think you could keep it still enough for a glass blemish to fool you. It was so long ago my memory might not be reliable anymore, but now that I think of it, it may have been only an indication, though it was absolutely there. Gray. And I bet the "minification" is only somewhere around 80-90%. Sagittarian Milky Way (talk) 17:53, 5 June 2012 (UTC)[reply]
Maybe it is because they did not have a heliocentric theory of the Solar System. The Earth was the center of their universe. Two blobs in the same region of sky may not have excited them. SkyMachine (++) 23:45, 4 June 2012 (UTC)[reply]
The locations of celestial objects definitely did excite people in the geocentric days. That's what astrology is all about. --Tango (talk) 12:48, 5 June 2012 (UTC)[reply]
Maybe they did see it but went blind before they could enter it into the historical record. But your answer on the imprecision of calculating exactly when it would occur is most probably the reason no one prior to Horrocks is known to have observed it. It helps to have a reason to observe it, which is what Horrocks had: to attempt to work out the geometry of the solar system. SkyMachine (++) 10:10, 6 June 2012 (UTC)[reply]

Why didn't they just put a color filter on the eyepiece?

[edit]
1673 woodcut illustration of Johannes Hevelius's 8 inch telescope with an openwork wood-and-wire "tube" that had a focal length of 150 feet to limit chromatic aberration.

They made these up to 600 feet long, with a tower to hold the lens and a 600 foot string to keep the eyepiece pointed at the right place. Sagittarian Milky Way (talk) 22:05, 4 June 2012 (UTC)[reply]

A color filter will dramatically reduce the brightness of any image obtained. In modern times, you can simply increase the exposure time to capture an image, but back then it wasn't an option. If you couldn't see the image with your eyes, you couldn't see it. Incidentally, there were high quality color filters back then, in the form of stained glass. Someguy1221 (talk) 23:19, 4 June 2012 (UTC)[reply]
In any case, chromatic aberration cannot be corrected merely by using coloured filters, as the question assumes. If it could, amateur and professional astronomers would not spend considerable sums of money on telescopes with achromatic and (better) apochromatic lenses, which are much more expensive than comparable telescopes without such lenses. While "coloured fringes" are the most obvious symptom of chromatic aberration, it also degrades the entire image. In addition, astronomers often want to see the unaltered colour(s) of what they're looking at, or use very particular colours of filter to increase the contrast of otherwise difficult-to-see details. {The poster formerly known as 87.81.230.195} 90.197.66.109 (talk) 01:33, 5 June 2012 (UTC)[reply]
A good enough colour filter will solve chromatic aberration. If there isn't a range of colours then they can't be focused differently. The reason they aren't used is because, as Someguy says, it would dramatically reduce the brightness. The less chromatic aberration you want, the narrower a range of colours you need to use, and the dimmer the image would be. --Tango (talk) 12:51, 5 June 2012 (UTC)[reply]
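(Editorial note: the trade-off above can be sketched numerically. This is a rough illustration, not from the thread: the Cauchy coefficients and design focal length below are assumed values roughly like crown glass, and the thin-lens scaling f ∝ 1/(n − 1) is a simplification.)

```python
# Sketch: how narrowing the wavelength band shrinks the focal spread
# (chromatic aberration) of a simple lens. All numbers are illustrative.

def refractive_index(wavelength_um, A=1.5046, B=0.00420):
    """Cauchy approximation n(lambda) = A + B / lambda^2 (lambda in microns).
    A and B here are assumed, crown-glass-like values."""
    return A + B / wavelength_um**2

def focal_length_mm(wavelength_um, f_design_mm=1000.0, n_design=1.5168):
    """Thin-lens focal length scales as 1/(n - 1); f_design_mm is the
    assumed focal length at the design index n_design."""
    n = refractive_index(wavelength_um)
    return f_design_mm * (n_design - 1) / (n - 1)

def focal_spread_mm(band):
    """Difference in focal length across a wavelength band (in microns)."""
    lo, hi = band
    return abs(focal_length_mm(lo) - focal_length_mm(hi))

full_visible = (0.40, 0.70)  # violet to red: a big rainbow blur
yellow_only = (0.57, 0.59)   # narrow band, as through deep yellow glass

print(focal_spread_mm(full_visible))  # tens of millimetres of spread
print(focal_spread_mm(yellow_only))   # a couple of millimetres: sharp but dim
```

The point of the sketch is just the ratio: cutting the band from the full visible range to a narrow yellow slice reduces the focal spread by more than an order of magnitude, at the cost of throwing away most of the light.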
[2] See, they make moon filters with far less transmission than the deepest yellow filter. Can't they at least get something out of seeing in yellow? Or.. tie three telescope tubes together, give them red, green, and blue color filters and use mirrors and/or lenses to join the beams together. Maybe that was too much for the early 18th century. But somehow aerial telescopes were more practical. Sagittarian Milky Way (talk) 18:26, 5 June 2012 (UTC)[reply]
You would have the same difficulty with the mirrors and/or lenses that you are using to join the beams together as you did with the original telescope. You wouldn't get much more brightness either, since you're just getting three narrow parts of the spectrum rather than one, which still adds up to being a very small amount of the spectrum. And you wouldn't get anything close to true colour - you can make pretty much any colour by mixing red, green and blue, but the amounts of red, green and blue you need aren't the same as the amounts in the original image (for example, if there is some yellow in the image, you'll need to replace it with equal parts red and green, which your device wouldn't do). --Tango (talk) 12:17, 6 June 2012 (UTC)[reply]
Then add more tubes and colors, that's still probably more practical and/or cheaper than this. Sagittarian Milky Way (talk) 20:00, 7 June 2012 (UTC)[reply]
The OP seems to be confusing the state of optical science in the 16th-17th century with its current state, around 4 centuries later. Those hugely long telescopes (long because of their long-focal-length objectives) were used because at the time they hadn't discovered, or lacked the craftsmanship and/or money to obtain, a better way to minimise chromatic aberration. Compound objective lenses (with components of two different glasses which somewhat cancelled out each other's effects) were one solution; another was the reflecting telescope, invented by optical pioneer Isaac Newton for this very reason. {The poster formerly known as 87.81.230.195} 90.197.66.109 (talk) 21:33, 6 June 2012 (UTC)[reply]
The achromat lens had to wait for the mid-1700s, but the article doesn't seem to mention anyone trying to look at at least the very bright Moon or planets through yellow stained glass (that color appearing brightest and closest to white due to the luminosity function), or Newtonian reflectors obsoleting 300-foot refractors. Somehow it took a very long time after its invention by Newton for the reflector to be presented to a scientific society, as shown in the article. Maybe they couldn't make reflective enough mirrors back then (but they could still look at the Moon), or parabolic mirrors the size of those aerial objectives were much harder to make or silver than sphere-surfaced lenses. [www.stargazing.net/naa/scopemath.htm] If 1-10% of the light was transmitted they could still see 9th-11th magnitude stars with their 8 inch (a magnitude is a 5th root of 100). That's still a lot of stars, more than many binoculars show. The darkest extended objects you could see with a near-perfect telescope are maybe magnitude 22 per square arcsecond of surface brightness. Compared to that, the Moon and the discovered planets are about a million to a billion times brighter. So somehow they saw such an unfocused rainbow blur of colors at sane focal ratios, but no one thought of using colored eyepiece glass or something? Sagittarian Milky Way (talk) 20:00, 7 June 2012 (UTC)[reply]
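(Editorial note: the magnitude arithmetic above checks out. Since one magnitude is a factor of 100^(1/5) ≈ 2.512 in flux, a filter passing only a fraction T of the light costs 2.5·log10(1/T) magnitudes. The unfiltered limiting magnitude of ~14 for an 8-inch telescope used below is a commonly quoted ballpark, assumed here, not a figure from the thread's link.)

```python
import math

def magnitude_penalty(transmission):
    """Magnitudes of depth lost when only a fraction `transmission`
    of the light gets through (1 magnitude = 100**(1/5) in flux)."""
    return -2.5 * math.log10(transmission)

# Assumed unfiltered limiting magnitude for an 8-inch telescope (~14).
limit_unfiltered = 14.0

# 10% transmission costs 2.5 magnitudes; 1% costs 5 magnitudes,
# leaving roughly the 9th-11th magnitude limit mentioned above.
for t in (0.10, 0.01):
    print(t, round(limit_unfiltered - magnitude_penalty(t), 1))
```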

Peregrine Falcon G-forces

[edit]

I could use some help figuring out how Peregrine Falcons withstand G-forces when pulling out of a dive. I have been talking about this on the talk page of WikiProject Birds, under the section Peregrine Falcon. Please help. Nhog (talk) 16:48, 4 June 2012 (UTC)[reply]

Moved here from the ref desk talk page. ←Baseball Bugs What's up, Doc? carrots→ 22:18, 4 June 2012 (UTC)[reply]
Existing discussion of this subject can be found here, at WP:BIRDS. I suggested that Nhog ask about it here... --Kurt Shaped Box (talk) 22:27, 4 June 2012 (UTC)[reply]

There's surprisingly little information on this topic; I can understand why User:Nhog is having trouble.

I tried a google search on "Peregrine Falcon acceleration" which yielded a couple of barely-relevant hits, but they seem to talk mostly about acceleration and speed, not deceleration.

Do you have an estimate of the actual deceleration, in g?

It may be that the deceleration experienced by the falcon, though seemingly "high" (in other words, substantially greater than 1, or substantially greater than that encountered by any other bird) is not actually high enough to be dangerous. For example, our G-force article says that ordinary (non test pilot) humans can tolerate up to maybe 5g without too much trouble. Is a falcon pulling out of a dive experiencing a lot more than that? (And yes, I know, birds are not humans; I'm just talking rough orders of magnitude here.)

Steve Summit (talk) 01:13, 8 June 2012 (UTC)[reply]

It would not be hard to estimate the maximum g in two different ways: (1) measure the trajectory from a video; (2) estimate the maximum force the wing can generate and the mass of the bird (big hint: Cd is unlikely to exceed 1 by any significant margin). A high school student might struggle with the maths involved, or they might not. Greglocock (talk) 02:36, 8 June 2012 (UTC)[reply]
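(Editorial note: a crude version of method (1) is just the centripetal formula a = v²/r for a circular pull-out. The dive speed and turn radii below are assumed round numbers for illustration, not measurements of any actual bird.)

```python
# Order-of-magnitude estimate of pull-out load factor, a = v^2 / r.
# Assumed inputs: ~90 m/s (roughly 320 km/h) dive speed and a
# pull-out turn radius somewhere between 30 and 100 metres.

G = 9.81  # standard gravity, m/s^2

def pullout_g(speed_m_s, turn_radius_m):
    """Centripetal load factor, in g, for a circular pull-out arc."""
    return (speed_m_s**2 / turn_radius_m) / G

for r in (30, 50, 100):
    print(f"radius {r} m -> {pullout_g(90, r):.1f} g")
```

Even with these rough assumed numbers the answer lands in the high single digits to a few tens of g, which is why the tolerance question above is interesting; a measured trajectory from video would pin down the actual radius.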

Thanks for all this. I will try to ask National Geographic to see if they have anything, and if not maybe they can research it. If you can help please do. Nhog (talk) 18:43, 11 June 2012 (UTC)[reply]

1.15 AU to survive RGB tip

[edit]

Drs. Schroeder and Smith point out that in order for a planet to survive to the tip of the RGB, its current orbit has to be 1.15 AU or greater. Earth will be swallowed up because of tidal interactions, which basically counteract the Sun's loss of gravitational mass, and the diagram actually points out that Earth will be swallowed up when the Sun extends to 0.9 AU. If the Sun hits 1.2 AU, then how will a 1.15 AU planet survive the Sun's tidal interaction, the thing which slows down a planet's velocity? Did they actually make a mistake in the calculation? Is it supposed to be 1.30 AU, or is 1.15 right? It seems weird and confusing. They never showed us the variables they used.--69.226.45.43 (talk) 22:26, 4 June 2012 (UTC)[reply]

While tidal effects cause the Earth's orbit to decay, constant mass loss from the sun (which will accelerate as it approaches the red giant phase) causes the orbit to expand. The authors' claim is that present day Earth would need an orbit of 1.15AU in order to move far enough away from the sun to avoid engulfment. Someguy1221 (talk) 23:04, 4 June 2012 (UTC)[reply]
(Edit conflict; I see that Someguy1221 has answered the main question, but I put too much work into this not to post it dammit! :D)
First off, you are neglecting the fact that the Sun is going to lose a substantial amount of mass as it expands, and so Earth's orbit will also expand in response. However, remember that any scientific discourse about events so far in the future is going to involve speculation and large uncertainties. There are many different factors competing, and science can't say for sure which factors will win out (i.e., whether Earth will be consumed or not).
These factors will serve to expand Earth's orbit:
  1. the Sun entering the Red Giant phase will give off much more intense solar wind, which will impart an outward force
  2. this intense solar wind will result in an overall reduction of the Sun's mass, leading to wider planetary orbits
  3. Yarkovsky effect (though I believe this will be negligible even as the sun/earth distance decreases dramatically)
While these will work to contract Earth's orbit:
  1. Tidal decay
  2. Increased drag as the Sun's corona expands
And there are some factors that will have unknown effects:
  1. The orbits of the terrestrial planets may be unstable on these timescales, leading to possibilities of collision with other planets, switching of orbits, or even escape from the solar system
  2. Unknown unknowns!
We know that there will likely be a net expansion of the orbits of the terrestrial planets, but to what degree remains highly uncertain. So, in general, I really have a problem with any definitive statements being made about an event which is forecast to happen further in the future than the solar system is old. The best answer is "Earth might be consumed by the sun in its Red Giant phase".
However, in response to your original question, they state "any hypothetical planet would require a present-day minimum orbital radius of about 1.15 au" (emphasis mine). Thus they are talking about a planet which would today have a 1.15 AU orbit, and so in the future would have a much larger orbit. -RunningOnBrains(talk) 23:19, 4 June 2012 (UTC)[reply]
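(Editorial note: the "much larger orbit" follows from the standard adiabatic mass-loss result assumed here, found in stellar-evolution texts: for slow, isotropic mass loss, the product a·M stays roughly constant, so orbits widen as a_new = a_old · (M_old / M_new). This ignores the tides and drag listed above.)

```python
# Sketch of orbital expansion under slow solar mass loss, using
# conservation of a * M (an idealization that neglects tides and drag).

def expanded_orbit_au(a_now_au, mass_fraction_remaining):
    """Orbit radius (AU) after the Sun shrinks to the given mass fraction."""
    return a_now_au / mass_fraction_remaining

# With roughly a third of the Sun's mass lost by the RGB tip (the
# figure discussed in this thread), orbits grow by about a factor 1.5:
print(round(expanded_orbit_au(1.00, 0.67), 2))  # Earth's orbit today
print(round(expanded_orbit_au(1.15, 0.67), 2))  # the paper's survival threshold
```

So a planet starting at 1.15 AU today would end up around 1.7 AU by the RGB tip under this idealization, which is the sense in which "present-day minimum orbital radius" should be read.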
Do we know exactly how big the Sun will be at the tip of the RGB, and how much of the Sun's mass will be lost by then? Can the 33% mass-loss figure be pinned down, or is it more of a speculative guesstimate? Is it possible that the Sun at the tip of the RGB loses up to 45% of its mass, which might put Venus's and Earth's orbits at 1.30 and 1.80 AU? Can the Sun lose more than 45% of its original mass by the tip of the RGB? Yes, I definitely know the Sun's loss of mass will cause the planets' orbits to change. I was wondering how it is possible for Venus to escape destruction: if Venus escapes engulfment by a wider orbit, that would only be 1.08 AU, which is not enough to avoid destruction of the planet; that is too low. Can the Sun end up much bigger than 1.2 AU? Can we really pin 1.2 AU down? The site says Mars will most likely survive, not 100% guaranteed, so there is a chance it won't, but clearly better than 50/50.--69.226.45.43 (talk) 00:26, 5 June 2012 (UTC)[reply]
As I alluded to above, there is no way to know such things for certain. We can make estimates based upon theory and models and observed behavior of presumably similar stars, but all that only gets us just that: estimates. We really don't have many direct observations of star radius; all our knowledge of star sizes is based upon estimates of luminosity and other observable factors. Only 3 other stars besides the sun are actually close enough to be resolved by telescopes (see List of stars with resolved images)! It is also important to note that at the apex of the AGB, stars similar to the Sun will undergo extreme variations in temperature and luminosity, which implies that radius will vary greatly as well. So it's in these "pulses" that are mentioned in the article you link above that the Earth will have the greatest possibility of being consumed; and these pulses are likely to be rather chaotic in nature. As I said before; there really is no definite way to know. Perhaps as we get more and better observations of other stars and computer models get better our certainty one way or the other will rise, but I suspect it will never be 100% certain. Well, at least until 8 billion years from now ;) -RunningOnBrains(talk) 02:08, 5 June 2012 (UTC)[reply]