Thursday, September 3, 2009

3.2-3.3

1. I took Number Theory during the Summer term, so most of this is really fresh in my mind. I can never remember the proof that if (a,n) = d and d divides b, then ax = b (mod n) has exactly d solutions that are distinct mod n (I've put a little brute-force sketch of this below, after these notes). Leaving fractions in congruence relations is new to me.

2. I was brought up with the doctrine that math was a tool and that the motivation for inventing/discovering new math was solving problems. Certainly Newton's development of calculus jibed with this. Recently in my math courses, I have heard professors say that math is something where you invent rules and see what ramifications those rules have. I have also had two (old) math professors talk about how the vast majority of dissertations are on such esoteric topics that there is no foreseeable application for any results obtained in the research. These two ideas have instilled a bit of cynicism in me. Am I naive in thinking that these "invented" rules are not arbitrary--that there is some real-world motivation for them? And if we assume that none of a set of topics has any application, what does it matter which one gets researched, since it's all mind-games anyway?... Now the moral: when I learned about congruence and divisibility, I wavered between finding it obvious and (more often) finding it inconsequential. Same with primes. But modern cryptography rests on these concepts and (essentially) truths. Did the mathematicians who developed them see their potential? I suppose they didn't, but then why did these ideas spread among mathematicians and become fixed in the curriculum?
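Here is the little sketch I mentioned in note 1. It's just brute-force Python I wrote to keep the statement of the theorem straight (the function name and the example are mine, not from the text):

    from math import gcd

    def solve_linear_congruence(a, b, n):
        # All x in [0, n) with a*x = b (mod n).
        # With d = (a, n): no solutions unless d divides b, otherwise
        # exactly d solutions, spaced n/d apart.
        d = gcd(a, n)
        if b % d:
            return []
        # Brute-force the smallest solution, then step by n/d for the rest.
        x0 = next(x for x in range(n) if (a * x - b) % n == 0)
        return [x0 + k * (n // d) for k in range(d)]

    # Example: 6x = 9 (mod 15); (6, 15) = 3 divides 9, so 3 solutions.
    print(solve_linear_congruence(6, 9, 15))  # prints [4, 9, 14]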

1 comment:

  1. Congruences are very useful for proving things like the fact that the equation x^2 + y^2 = 39 has no integer solutions: just look at the equation modulo 4 to get x^2 + y^2 = 3, and note that the only squares mod 4 are 0 and 1, so they can't add up to 3. If any integer solutions to the original equation existed, they'd have to reduce (mod 4) to solutions to the new equation.
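     A quick brute-force check in Python (my own rough illustration of the same argument, nothing from the book):

         # The only squares mod 4 are 0 and 1:
         print(sorted({(x * x) % 4 for x in range(4)}))  # prints [0, 1]

         # So x^2 + y^2 = 39 (note 39 = 3 mod 4) can have no integer solutions.
         # Any solution would need |x|, |y| <= 6, so a finite search settles it:
         solutions = [(x, y) for x in range(-6, 7) for y in range(-6, 7)
                      if x * x + y * y == 39]
         print(solutions)  # prints []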
