
Wikipedia:Reference desk/Archives/Mathematics/2006 August 9

Humanities | Science | Mathematics | Computing/IT | Language | Miscellaneous | Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions at one of the pages linked to above.

< August 8 Mathematics desk archive August 10 >


Let's play with algebra

What other ways can I express this inequality: $\|Av\| \geq \epsilon \|v\|$, where $A$ is a matrix, $v$ is a vector, and $\epsilon$ is a positive scalar? Is there something I can say about the relationship of $v$ to the nullspace of $A$? I also tried playing with the triangle inequality, but that didn't get me anywhere. Is there a way to express the inequality linearly (strangely enough, in the elements of $A$) or in some other form that works well as a constraint in an optimization problem in $A$?
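Here is a minimal numerical sketch of one way to read the inequality; the matrix, the tolerance, and the test vectors are all assumptions made up for illustration. It checks that $\|Av\| \geq \epsilon \|v\|$ can hold in every direction only if $\epsilon$ is at most the smallest nonzero singular value of $A$, and that the inequality fails precisely in directions lying in (or near) the nullspace of $A$:

import numpy as np

rng = np.random.default_rng(0)

n_dir = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)   # unit vector spanning the intended nullspace
A = rng.standard_normal((3, 3))
A = A - np.outer(A @ n_dir, n_dir)                 # force A @ n_dir == 0

sigma = np.linalg.svd(A, compute_uv=False)
eps = sigma[sigma > 1e-12].min()                   # smallest nonzero singular value of A

v_null = n_dir                                         # lies in the nullspace
v_perp = np.array([1.0, -1.0, 0.0]) / np.sqrt(2.0)     # orthogonal to the nullspace

for name, v in [("nullspace direction", v_null), ("orthogonal direction", v_perp)]:
    ratio = np.linalg.norm(A @ v) / np.linalg.norm(v)
    print(name, ratio, ratio >= eps - 1e-9)
# The nullspace direction gives ratio ~ 0, so ||A v|| >= eps ||v|| fails there
# for every eps > 0; the orthogonal direction satisfies it with eps = sigma_min.

So, roughly, the inequality for a single $v$ says that $v$ stays a bounded distance away from the nullspace of $A$, relative to its own length.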

What if $v$ is the gradient of a function? I have this idea that if we're projecting a function $f$ into a new function $g$ (so that $\nabla g = A \, \nabla f$), ensuring that $\|A \, \nabla f\| \geq \epsilon \, \|\nabla f\|$ will ensure that critical points of $g$ will also be critical points of $f$, and that generally, if we're performing optimization, we can make some progress in minimizing $f$ by performing optimization over $g$ and then setting $x$ to the result, then maybe starting again. Any graphical or intuitive understanding of what this inequality would mean, or a better one to choose to accomplish that purpose, would be helpful.
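A hedged sketch of that project-optimize-map-back loop, under assumptions that are illustrative rather than anything stated above: $f$ is a strongly convex quadratic, the "projection" is $g(y) = f(x + Py)$ for a random orthonormal $n \times k$ matrix $P$ (so the $A$ above plays the role of $P^T$, since $\nabla_y g = P^T \nabla f$), and restarting means drawing a fresh $P$:

import numpy as np

rng = np.random.default_rng(1)
n, k = 6, 2
Q = rng.standard_normal((n, n))
Q = Q @ Q.T + n * np.eye(n)            # positive definite, so f is strongly convex
b = rng.standard_normal(n)

grad_f = lambda x: Q @ x - b           # f(x) = 0.5 x^T Q x - b^T x

x = np.zeros(n)
for _ in range(20):
    P = np.linalg.qr(rng.standard_normal((n, k)))[0]   # fresh random subspace basis
    # Minimize g(y) = f(x + P y) exactly; it is a small quadratic in y
    # whose gradient is P^T grad_f(x + P y).
    y = np.linalg.solve(P.T @ Q @ P, -(P.T @ grad_f(x)))
    x = x + P @ y                                      # map the result back
print(np.linalg.norm(grad_f(x)))       # typically shrinks toward 0 over the restarts

Note where the inequality enters: at the subspace minimizer, $P^T \nabla f = 0$, so if $\|P^T \nabla f\| \geq \epsilon \|\nabla f\|$ held there, $\nabla f$ itself would have to vanish; when it fails, the loop stalls until a restart picks a better subspace.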

If I have second-order information, would I do better setting $v = -H^{-1} \, \nabla f$ (a Newton's method step), where $H$ is the Hessian matrix or some approximation to it?
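A small sketch contrasting the two choices, with an assumed quadratic $f$ so everything is explicit; for a quadratic, the Newton step $v = -H^{-1} \nabla f$ reaches the critical point in one move, while a plain gradient step does not:

import numpy as np

rng = np.random.default_rng(2)
n = 5
H = rng.standard_normal((n, n))
H = H @ H.T + n * np.eye(n)           # plays the Hessian of the quadratic below
b = rng.standard_normal(n)
grad_f = lambda x: H @ x - b          # f(x) = 0.5 x^T H x - b^T x

x0 = rng.standard_normal(n)
x_newton = x0 - np.linalg.solve(H, grad_f(x0))   # step v = -H^{-1} grad f(x0)
x_grad = x0 - 0.1 * grad_f(x0)                   # gradient step, assumed step size 0.1

print(np.linalg.norm(grad_f(x_newton)))   # ~ 0: the Newton step lands on the critical point
print(np.linalg.norm(grad_f(x_grad)))     # generally still far from zero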

I know my question is confusing. Please just answer whatever small parts you can and go off on any tangents you think might be helpful. 18.252.5.40 08:30, 9 August 2006 (UTC)

The obvious first suggestion is to read our article on matrix norms. --KSmrqT 09:25, 9 August 2006 (UTC)
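To connect matrix norms back to the inequality, a quick hedged check with an arbitrary example matrix: the induced 2-norm $\|A\|_2$ is the largest possible ratio $\|Av\| / \|v\|$, while the best possible $\epsilon$ in $\|Av\| \geq \epsilon \|v\|$ is the smallest singular value:

import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))
sigma = np.linalg.svd(A, compute_uv=False)   # singular values, largest first

V = rng.standard_normal((4, 10000))          # many random directions
ratios = np.linalg.norm(A @ V, axis=0) / np.linalg.norm(V, axis=0)

print(np.linalg.norm(A, 2), sigma[0], ratios.max())   # all approximately sigma_max
print(sigma[-1], ratios.min())                        # the ratio never drops below sigma_min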