Birch's theorem
In mathematics, Birch's theorem,[1] named for Bryan John Birch, is a statement about the representability of zero by odd degree forms.
Statement of Birch's theorem
Let $K$ be an algebraic number field, $k$, $l$ and $n$ be natural numbers, $r_1, \ldots, r_k$ be odd natural numbers, and $f_1, \ldots, f_k$ be homogeneous polynomials with coefficients in $K$ of degrees $r_1, \ldots, r_k$ respectively in $n$ variables. Then there exists a number $\psi(r_1, \ldots, r_k, l, K)$ such that if

$$n \geq \psi(r_1, \ldots, r_k, l, K),$$

then there exists an $l$-dimensional vector subspace $V$ of $K^n$ such that

$$f_1(x) = \cdots = f_k(x) = 0 \quad \text{for all } x \in V.$$
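In particular, taking $k = l = 1$ shows that a single homogeneous form $f$ of odd degree $r$ over $K$ in

$$n \geq \psi(r, 1, K)$$

variables has a nontrivial zero in $K^n$, since a $1$-dimensional subspace of $K^n$ on which $f$ vanishes is spanned by such a zero.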
Remarks
The proof of the theorem is by induction on the maximal degree of the forms $f_1, \ldots, f_k$. Essential to the proof is the following special case, which can be proved by an application of the Hardy–Littlewood circle method: if $n$ is sufficiently large and $r$ is odd, then the equation

$$c_1 x_1^r + \cdots + c_n x_n^r = 0, \qquad c_i \in \mathbb{Z}, \quad i = 1, \ldots, n,$$

has a solution in integers $x_1, \ldots, x_n$, not all of which are $0$.
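For $r = 3$, for instance, this special case asserts that a diagonal cubic equation

$$c_1 x_1^3 + c_2 x_2^3 + \cdots + c_n x_n^3 = 0, \qquad c_i \in \mathbb{Z},$$

has a nontrivial integer solution once $n$ is sufficiently large. By homogeneity this is the same as saying that the diagonal form on the left-hand side has a nontrivial rational zero, which is the case $k = l = 1$, $K = \mathbb{Q}$ of the theorem restricted to diagonal forms.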
The restriction to odd $r$ is necessary, since even-degree forms, such as positive definite quadratic forms, may take the value $0$ only at the origin.
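For example, over $K = \mathbb{Q}$ the positive definite quadratic form

$$f(x_1, \ldots, x_n) = x_1^2 + x_2^2 + \cdots + x_n^2$$

satisfies $f(x) > 0$ for every $x \neq 0$, so there is no nonzero subspace of $\mathbb{Q}^n$ on which $f$ vanishes, however large $n$ is.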
References
1. Birch, B. J. (1957). "Homogeneous forms of odd degree in a large number of variables". Mathematika. 4: 102–105. doi:10.1112/S0025579300001145.