Talk:Chernoff bound
This article is rated Start-class on Wikipedia's content assessment scale.
Merge with Chernoff Bounds
The other page just presents a bit more technical stuff. Could probably be merged pretty easily. Thoughts?
I think the Chernoff bound is just a special case of the Chernoff bounds, so it should be presented appropriately. They should definitely be merged in either case; otherwise it's too confusing. —Preceding unsigned comment added by 24.47.50.165 (talk) 15:46, 18 December 2007 (UTC)
Merge it. Having two articles is really confusing, and I don't see how they differ or relate to each other. 71.163.207.130 19:43, 14 October 2007 (UTC)
The charts should have a log ordinate scale, and that would help a lot. jontalle@ieee.org — Preceding unsigned comment added by 50.40.105.194 (talk) 04:09, 10 December 2011 (UTC)
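If anyone regenerates the charts, switching to a log ordinate is a one-line change in most plotting tools. Here is a minimal matplotlib sketch; the plotted curve, exp(-2n(p - 1/2)^2) with p = 0.6, is just an illustrative stand-in for the article's actual data:

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative stand-in for the article's charts: a Chernoff-type tail
# bound decays exponentially in n, so a log ordinate renders it as a
# straight line and keeps small probabilities readable.
n = np.arange(1, 201)
plt.plot(n, np.exp(-2 * n * (0.6 - 0.5) ** 2), label="exp(-2n(p - 1/2)^2), p = 0.6")
plt.yscale("log")  # the suggested log ordinate scale
plt.xlabel("n")
plt.ylabel("tail probability bound")
plt.legend()
plt.show()
```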
Possibly old stuff
I don't understand how to apply the Chernoff theorem. Please discuss some views on it. Thank you.
There's a mistake! The bound P(sum_i X_i > n/2) = 1 - epsilon > 1 - exp(-...) (where X_i = 1 if the event E_i happened and X_i = 0 otherwise) gives an UPPER bound on n, NOT a lower bound as given in the second formula on the page. The interpretation is wrong too: it takes no more than 150 coin flips to guess the side of the biased coin with 95% accuracy. This error is also evident from the first graph (it isn't entirely clear what it represents, but it evidently shows that the Chernoff bound is an upper bound on n relative to other strategies).
I suggest adding the formula P(sum_i X_i > n/2) = 1 - epsilon > 1 - exp(-2n(p - 1/2)^2) (where X_i = 1 if the event E_i happened and X_i = 0 otherwise) to make the bound more obvious.
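As a sanity check on the 150-flip figure, here is a small Python computation. The bias p = 0.6 is an assumption, chosen because it reproduces the quoted number:

```python
import math

# Assumed bias of the coin; p = 0.6 makes the quoted "150 flips" come out.
p = 0.6
eps = 0.05  # allowed error probability (95% accuracy)

# Majority vote fails with probability at most exp(-2*n*(p - 1/2)**2),
# so n >= ln(1/eps) / (2*(p - 1/2)**2) flips suffice.
n = math.log(1 / eps) / (2 * (p - 0.5) ** 2)
print(math.ceil(n))  # -> 150
```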
Addition 13 June 2008
This seems to have left some problems:
- What is a "Poisson trail": is this defined somewhere? (link?)
- Appears to duplicate material later in article, under "Theorem (relative error)".
- Incompleteness: e.g. possibly needs an "=1" in it, depending on what a "Poisson trail" is.
Melcombe (talk) 09:11, 13 June 2008 (UTC)
Missing proof/example
The formulas in the lead of this article seem very useful, but there is no indication in the main body of where they come from. Not being familiar with the subject, I can't add it myself, but I know I would find it rather useful to see where those formulas came from. me_and (talk) 15:10, 25 November 2008 (UTC)
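For what it's worth, the generic formula in the lead follows from Markov's inequality applied to e^{tX}. A sketch of the standard derivation:

```latex
% Generic Chernoff bound: Markov's inequality applied to e^{tX}.
% For any t > 0, the map x \mapsto e^{tx} is increasing, so
\[
  \Pr[X \ge a] = \Pr\!\left[e^{tX} \ge e^{ta}\right]
             \le \frac{\mathbb{E}\!\left[e^{tX}\right]}{e^{ta}},
\]
% and since this holds for every t > 0, one may take the infimum:
\[
  \Pr[X \ge a] \le \inf_{t > 0} e^{-ta}\, \mathbb{E}\!\left[e^{tX}\right].
\]
```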
Where does the inequality with p(1-p) come from?
I would love to use the inequality:
However, I haven't been able to verify it (I've looked at other inequalities, multiple books, etc.). Is it true? Any references / proofs?
Thanks! — Preceding unsigned comment added by 128.32.44.125 (talk) 02:47, 8 February 2014 (UTC)
It could be related to a binomial distribution. What are your m, p, and x's? --Thomasda (talk) 08:57, 4 June 2014 (UTC)
General Proof
In the intro, Chernoff bounds are described as only requiring independence (and, I believe, bounded expectation). However, the proof given in section 4.2 only accepts Poisson variables, rather than general positive random variables. Does anyone know a more general proof? — Preceding unsigned comment added by Thomasda (talk • contribs) 16:26, 2 June 2014 (UTC)
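A pointer toward a general proof: independence is only needed to factor the moment generating function of the sum, after which any per-variable MGF bound can be substituted. A sketch:

```latex
% Where independence enters: the MGF of a sum of independent X_i factors,
\[
  \mathbb{E}\!\left[e^{t\sum_i X_i}\right]
    = \prod_i \mathbb{E}\!\left[e^{tX_i}\right],
\]
% so the generic bound gives, for any t > 0,
\[
  \Pr\!\Big[\sum_i X_i \ge a\Big]
    \le e^{-ta} \prod_i \mathbb{E}\!\left[e^{tX_i}\right].
\]
% Bounding each factor (e.g., via Hoeffding's lemma when a_i <= X_i <= b_i)
% and optimizing over t then yields bounds for general independent
% variables, not only Poisson trials.
```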
Overlap with a book
The first two sections of the article are almost identical to section 17.2 of the book Algebraic and Stochastic Coding Theory by Dave K. Kythe and Prem K. Kythe. I wonder if this is appropriate for Wikipedia, given that the article does not even cite this book.
On a more technical level, the Chernoff bound in the Example section does not agree with equation (17.2.2) from the book. Which one is correct?
--Marozols (talk) 16:10, 21 October 2014 (UTC)
- I just found this post, which states that the current version in the article should be correct.
- --Marozols (talk) 16:24, 21 October 2014 (UTC)
Dead Links
Hey, I noticed that the two links to the book "Probability and Computing: Randomized Algorithms and Probabilistic Analysis", pointing to Google Books pages, are dead.
I looked it up, but am not 100% sure what the appropriate steps are to replace them, or how to carry them out.
The book seems to be available online in PDF form here: http://bayanbox.ir/view/340405828745623973/probability-and-computing-mitzenmacher-upfal.pdf
OldMacDonalds (talk) 19:32, 4 February 2015 (UTC)
Coin tossing example
There used to be a coin-tossing example on the page, which was great except that it was not consistent with the formula for the Chernoff bound given in the present version of the page. I decided that doing without the section was better than including one that was at best confusing and at worst wrong. It would be great if someone reworked it, or perhaps found a better version of the Chernoff bound that makes the argument true. MathHisSci (talk) 15:43, 20 August 2015 (UTC)
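If someone does rework the example, a quick Monte Carlo check along these lines can confirm that the worked numbers are consistent with the additive form of the bound. The parameters p = 0.6 and n = 150 are assumptions carried over from the coin discussion earlier on this page:

```python
import math
import random

# Assumed parameters, carried over from the earlier coin discussion.
p, n, trials = 0.6, 150, 20_000

# Additive (Hoeffding-type) Chernoff bound on the majority vote failing:
# P(heads <= n/2) <= exp(-2 * n * (p - 1/2)^2).
bound = math.exp(-2 * n * (p - 0.5) ** 2)

# Empirical failure rate: fraction of experiments in which at most half
# of the n flips of a p-biased coin come up heads.
failures = sum(
    sum(random.random() < p for _ in range(n)) <= n / 2
    for _ in range(trials)
)
print(f"empirical: {failures / trials:.4f}  bound: {bound:.4f}")
```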
Why does the minimum exist?
Formula (1) claims that a minimum is attained for s > 0. This does not seem obvious to me, as I don't see why the limit as s tends to zero cannot be lower, i.e. why the infimum couldn't lie outside the set. I've looked at a few proofs, but I've been unable to find one that proves this. ShearedLizard (talk) 15:46, 2 March 2018 (UTC)
- Formula (1) assumes that a > E(X); otherwise the derivative has the wrong sign at t = 0, as you say. There is also another problem, namely that E(e^{tX}) might not be finite for all positive t. If we allow infinite expectations and assume a > E(X), we get closer to the truth. Perhaps using inf instead of min is needed too, but I suspect not, since the argument looks continuous in t. This is all too messy, so I propose removing (1) and similar expressions, and also adding the condition that E(e^{tX}) is finite. The common practice of minimizing over t can be demonstrated in the examples. McKay (talk) 05:18, 7 March 2018 (UTC)
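To illustrate the attainment issue numerically: for a standard normal X, the MGF E(e^{tX}) = exp(t^2/2) is finite for all t, and the Chernoff objective exp(-ta) E(e^{tX}) has an interior minimum exactly when a > E(X) = 0. A short sketch:

```python
import numpy as np

# For X ~ N(0, 1) the MGF is E[e^{tX}] = exp(t^2 / 2), so the Chernoff
# objective is f(t) = exp(-t*a + t^2/2), minimized at t = a when a > E[X] = 0.
def objective(t, a):
    return np.exp(-t * a + t ** 2 / 2)

t = np.linspace(1e-6, 5, 50_000)
for a in (2.0, -1.0):  # a > E[X] vs. a < E[X]
    f = objective(t, a)
    print(f"a = {a:+.1f}: min over t > 0 ~ {f.min():.4f} at t ~ {t[np.argmin(f)]:.3f}")

# For a = 2 the minimum exp(-a^2/2) ~ 0.1353 is attained at t = 2;
# for a = -1 the infimum 1 is only approached as t -> 0, not attained.
```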