
Wikipedia:Reference desk/Archives/Mathematics/2007 December 31

Mathematics desk
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


December 31


"Compressibility" in mathematics


What does "compressibility" mean in describing an algorithm, or a series of numbers derived from an algorithm?

Here is the context where I encountered the usage:

If I hand you the first 3,000,000,000 terms of this sequence without giving you the generator as a program or purely in words, it will be impossible, for all practical purposes, for you to tell me whether this is a “random” sequence or a “compressible” one (it is, in fact, highly compressible), and still less possible for you to specify a generating algorithm.

There is a stub Compression (functional analysis) which may be relevant, but it is so brief and recondite as to tell me nothing. Dybryd (talk) 06:51, 31 December 2007 (UTC)[reply]

This reminds me of the concept of Kolmogorov complexity, which is the minimum amount of information necessary to represent something. Take a look at that article, especially the "compression" section. — Kieff | Talk 08:17, 31 December 2007 (UTC)[reply]


(ec) The basic idea is, can the sequence be generated by a computer program significantly shorter than the sequence itself? Of course any finite sequence can be generated by a computer program, because you can just write the sequence into the program as data and have the program print it out, but if there's an underlying regularity, you may be able to do much better. That's compression.
Now, you might reasonably ask, a computer program in what language? It might, after all, make a difference. That's why the concept is usually applied to infinite sequences, and the question is whether every finite initial subsequence of length n can be generated by a program shorter than n − C, for some fixed C. See Kolmogorov randomness for more details. --Trovatore (talk) 08:24, 31 December 2007 (UTC)[reply]
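A quick way to build intuition for this is to use an off-the-shelf compressor as a crude, computable stand-in for the (uncomputable) Kolmogorov notion: compare how well zlib compresses a sequence with an obvious regularity against random bytes. A minimal Python sketch:

```python
import os
import zlib

# A highly regular sequence (generated by a tiny program) vs. random bytes.
regular = b"0110" * 25_000        # 100,000 bytes with an obvious pattern
random_ = os.urandom(100_000)     # 100,000 bytes with (almost surely) no pattern

c_regular = zlib.compress(regular, 9)
c_random = zlib.compress(random_, 9)

print(len(c_regular))  # a few hundred bytes: the regularity is exploited
print(len(c_random))   # about 100,000 bytes: nothing to exploit
```

Note that a practical compressor only ever gives an upper bound on description length; a sequence zlib fails to compress may still be compressible by a smarter program.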

Thanks, guys. So, mathematical compressibility always happens in the context of data compression in programming? The quote above seemed to imply that it was a more general concept. Dybryd (talk) 18:30, 31 December 2007 (UTC)[reply]

Well, no, it doesn't have to be programming specifically. The computer in this case is an abstract device anyway, not a real piece of iron. The formal definitions are likely to be phrased in terms of universal Turing machines, which hardly anyone ever bothers writing real "software" for; they're just useful abstractions for capturing a certain very restrictive notion of definability. --Trovatore (talk) 18:45, 31 December 2007 (UTC)[reply]
Ummm ... surely the Turing machine concept captures a very general notion of definability, since any general recursive function can be implemented by a Turing machine, and the Church-Turing thesis says that any function that is, in an intuitive sense, "definable" or "computable" can be implemented as a general recursive function. So if you can define a given sequence using fewer bits than the sequence itself, then there is a Turing machine that encapsulates that definition. Gandalf61 (talk) 19:13, 31 December 2007 (UTC)[reply]
No, "definable" in general is much more general than "computable". For a more-or-less minimal example, consider the binary sequence that has a zero at position n just in case the nth Turing machine halts (and a one if it doesn't). This sequence is not computable (by the unsolvability of the halting problem) but everyone but a few extremists would consider it definable. There's an entire hierarchy (in fact hierarchies of hierarchies) of complexity above that. In descriptive set theory, computability is about the most restrictive notion of definability that is ever considered. --Trovatore (talk) 19:22, 31 December 2007 (UTC)[reply]
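The halting sequence just described is not computable, but it is "limit computable": run every machine with ever-larger step budgets and the guessed bits converge, although you can never be sure a 1 is final. A toy sketch, with Python generators as hypothetical stand-ins for an enumeration of Turing machines (not a real enumeration):

```python
import itertools

def halts_after(k):
    """Toy 'machine' that runs for k steps and then halts."""
    def run():
        for _ in range(k):
            yield
    return run

def runs_forever():
    """Toy 'machine' that never halts."""
    def run():
        while True:
            yield
    return run

# Hypothetical, tiny stand-in for an enumeration of all Turing machines.
machines = [halts_after(0), runs_forever(), halts_after(100), runs_forever()]

def approx_sequence(budget):
    """Guess the sequence: 0 = seen to halt within `budget` steps, 1 = not yet.
    Zeros are final once observed; ones are only ever provisional."""
    bits = []
    for m in machines:
        steps = sum(1 for _ in itertools.islice(m(), budget + 1))
        bits.append(0 if steps <= budget else 1)
    return bits

print(approx_sequence(10))    # [0, 1, 1, 1] -- machine 2 not yet seen to halt
print(approx_sequence(1000))  # [0, 1, 0, 1] -- converged on this toy list
```

The provisional 1s are exactly where the unsolvability of the halting problem bites: no budget ever certifies them.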
Hmmm ... but you can only give me, say, the first billion bits of your binary sequence if you use an oracle that can tell you whether the nth Turing machine halts. And I can see how you then start a hierarchy because there is a halting problem for Turing machines + Oracle A, which can only be solved by a more powerful Oracle B etc. etc. But surely you have only "defined" your sequence as far as the boundaries of the oracle, after which you have to say "here a miracle occurs". That type of definability does not seem to me to be at all useful. Note that I am not advocating a constructivist view - I am happy to grant that your binary sequence exists - I just don't think that it is "definable" in any effective sense. If I say "X is a random real number between 0 and 1; I can recite the binary digits of X starting at any place that you like" then is X defined ? I would say X is only defined in a useful sense if I can tell you exactly how to calculate the binary digits of X - in other words, X is only effectively defined if it is computable. Gandalf61 (talk) 11:16, 1 January 2008 (UTC)[reply]
Okay, I just read the definable real number article, and I see that it does indeed say there are real numbers such as the binary representation of the halting problem that are definable but not computable. I did not realise that "definable" had such a broad definition. Gandalf61 (talk) 12:56, 1 January 2008 (UTC)[reply]
Yes, well that article is really quite bad, and I say that having written perhaps most of its current incarnation. But you should have seen it before I got there. I still don't know what to do about that article -- the basic problem is that having a mathematics article with that title gives the impression that there's a unique and mathematically well-defined concept answering to that name, which there isn't--there are hierarchies of definability, but no clear demarcation between "definable" and "undefinable" simpliciter. But there's a lot of very interesting mathematics associated with the concept, so one doesn't want to just delete it.
Let me respond a little to your philosophical remarks, though. It's true that the mere fact that a number is definable doesn't necessarily mean you can say any other specific thing about it, such as, say, what its actual digits are. But that's nothing particularly new or surprising. We don't (I imagine) know on what day of the week Attila the Hun's mother's father's father's mother's father died, and it's probably hopeless that we can ever find out by human means. But that doesn't invalidate "the day of the week on which ... died" as a definition; barring quibbles about time zones and a death agony extending past midnight, most of us would agree that it's a well-defined day, even if we can never know which day. --Trovatore (talk) 07:42, 2 January 2008 (UTC)[reply]

Multivariable limit


http://img91.imageshack.us/img91/1050/limitedz7.jpg

I have that limit, with the proof that it does not exist, since for m = 1 the limit is 1 and for m ≠ 1 it's 0. However, when I first tried to calculate it, what I did was this: since x^2 + y^2 is positive, the above limit is less than or equal to x^2*y^2/(xy(xy-2)) (so I have it bounded). I simplify xy in both the numerator and denominator and I get xy/(xy-2); now (I suppose) I can safely plug in x=0, y=0 and get 0/(0-2) = 0. But this is wrong, according to the above resolution. I've been thinking about this for two days without any progress. Could you please tell me what I did wrong? --Taraborn (talk) 16:24, 31 December 2007 (UTC)[reply]

You seem to have implicitly (and incorrectly) assumed that $\frac{a}{b+c} \le \frac{a}{c}$ whenever $b > 0$. Without this assumption, not only is the assertion wrong, but also useless - what you need is a bound $\left| f(x,y) \right| \le g(x,y)$ with $g(x,y) \to 0$. -- Meni Rosenfeld (talk) 17:07, 31 December 2007 (UTC)[reply]
Hmmm... that's what I meant, sorry - but please now check the solution. It still doesn't work; why? (Also, it should be less than or equal to, not strictly less than as you typed.) --Taraborn (talk) 17:16, 31 December 2007 (UTC)[reply]
*sigh* Come on, it's as if you're not even trying. $\frac{a}{b+c} \le \frac{a}{c}$ (or any variation thereof) is incorrect, e.g. $a = 1$, $b = 2$, $c = -1$, and then $\frac{a}{b+c} = 1 > -1 = \frac{a}{c}$. You need a, b and c to all be positive if you want to deduce $\frac{a}{b+c} \le \frac{a}{c}$ (or anything similar). -- Meni Rosenfeld (talk) 18:03, 31 December 2007 (UTC)[reply]
Haha... thanks, now I get it. Sorry, but I guess it was mainly my teacher's fault, since he wasn't very rigorous about this... Thanks again, now everything makes sense. --Taraborn (talk) 22:12, 31 December 2007 (UTC)[reply]
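The path argument in this thread can be checked numerically. Assuming the function behind the (now dead) image link was $f(x,y) = \frac{x^2 y^2}{x^2 y^2 + (x-y)^2}$, which is consistent with the bound quoted in the question because the denominator expands to $x^2 + y^2 + xy(xy - 2)$, a short sketch:

```python
# Candidate function, assumed from the bound quoted in the question;
# the denominator expands to x^2 + y^2 + x*y*(x*y - 2).
def f(x, y):
    return (x**2 * y**2) / (x**2 * y**2 + (x - y)**2)

for t in (1e-1, 1e-2, 1e-3, 1e-4):
    # along y = x the value is identically 1; along y = 2x it tends to 0
    print(f(t, t), f(t, 2 * t))
```

Since the two families of paths approach different values, the two-variable limit cannot exist.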

Dirichlet series

Hi. I'm studying some analytic number theory, and I have a question. First, to set up some notation, suppose that f, g and h are arithmetic functions, and suppose F(s), G(s) and H(s) are the associated Dirichlet series. In other words:

$F(s) = \sum_{n=1}^{\infty} \frac{f(n)}{n^s}$, $G(s) = \sum_{n=1}^{\infty} \frac{g(n)}{n^s}$, and $H(s) = \sum_{n=1}^{\infty} \frac{h(n)}{n^s}$.

Now, I know, and it's easy to prove, that

$h = f * g \implies F(s)\,G(s) = H(s)$,

where the star (*) represents Dirichlet convolution. Is the converse also true? In other words, if $F(s)\,G(s) = H(s)$, can we be certain that $f * g = h$? -GTBacchus(talk) 20:48, 31 December 2007 (UTC)[reply]

This is probably the first time I've encountered any of these concepts, but I'll give it a try. Let $i = f * g$. Then by your stated result, you have $I(s) = F(s)\,G(s)$. It is given that $F(s)\,G(s) = H(s)$, so $I(s) = H(s)$. I think it then follows that $i = h$, that is, $f * g = h$. Does this sound okay? -- Meni Rosenfeld (talk) 21:21, 31 December 2007 (UTC)[reply]
Maybe. It comes down to a question of uniqueness. Using your notation, we know that both i and h have the same Dirichlet series, I = H. However, it might be possible for two different arithmetic functions to give rise to Dirichlet series that produce the same function. So, another way of asking my question is this: Is the mapping $f \mapsto F(s)$ a one-to-one mapping? -GTBacchus(talk) 21:37, 31 December 2007 (UTC)[reply]
With the right assumptions, this shouldn't be too hard to prove. If we consider the behavior of the function as $s \to \infty$, we should be able to extract the coefficients of the series. -- Meni Rosenfeld (talk) 21:47, 31 December 2007 (UTC)[reply]
Indeed,
$f(1) = \lim_{s \to \infty} F(s)$, and in general $f(n) = \lim_{s \to \infty} n^s \left( F(s) - \sum_{k=1}^{n-1} \frac{f(k)}{k^s} \right)$. --Lambiam 22:56, 31 December 2007 (UTC)[reply]
See Apostol's Introduction to Analytic Number Theory Theorem 11.3 - a uniqueness theorem for Dirichlet series. Gandalf61 (talk) 10:11, 1 January 2008 (UTC)[reply]
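The forward identity (the Dirichlet series of a convolution is the product of the series) can be sanity-checked with truncated sums. Taking f = g = 1 purely for illustration, so that f * g is the divisor-count function, a minimal sketch:

```python
def dirichlet_convolution(f, g, n):
    """(f*g)(n) = sum over divisors d of n of f(d) * g(n/d)."""
    return sum(f(d) * g(n // d) for d in range(1, n + 1) if n % d == 0)

def dirichlet_series(f, s, N):
    """Truncated Dirichlet series: sum_{n<=N} f(n) / n^s."""
    return sum(f(n) / n**s for n in range(1, N + 1))

one = lambda n: 1  # f = g = 1, so f*g counts the divisors of n

def h(n):
    return dirichlet_convolution(one, one, n)

s, N = 3.0, 2000
lhs = dirichlet_series(one, s, N) ** 2  # F(s) * G(s)
rhs = dirichlet_series(h, s, N)         # H(s)
print(abs(lhs - rhs))  # tiny: only truncation error remains
```

The small discrepancy is pure truncation error; at s = 3 the neglected tails of both series are negligible.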

Wow. Thank you all very much, and Happy New Year. -GTBacchus(talk) 11:57, 1 January 2008 (UTC)[reply]