Wikipedia:Reference desk/Archives/Mathematics/2012 April 16

Mathematics desk
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


April 16


Is this already known by another label? Dru of Id (talk) 01:07, 16 April 2012 (UTC)[reply]

I'm not sure exactly what's being described as the novel or useful insight. Is it that n^2 = (2n-1) + (n-1)^2? This is a trivial application of the FOIL method. The article (which won't last long) says "the way the numbers are laid out is the main significance", but I can't imagine what that means. In any case I would say that whatever is being presented is probably too elementary to have a specific name. Staecker (talk) 02:19, 16 April 2012 (UTC)[reply]
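Spelled out, the expansion being referred to is just (standard algebra, added for clarity):

$(n-1)^2 = n^2 - 2n + 1, \qquad \text{so} \qquad (2n-1) + (n-1)^2 = n^2.$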
The fact that the sum of the first n odd numbers is equal to n^2 is well known, although I don't know that it has any particular name, or warrants one. For example, when teaching mathematical induction, this is invariably an example or exercise. Rckrone (talk) 02:29, 16 April 2012 (UTC)[reply]
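For reference, one standard one-line verification of that identity (not part of the original post):

$\sum_{k=1}^{n}(2k-1) = 2\sum_{k=1}^{n}k - n = n(n+1) - n = n^2.$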
I remembered it can be used for generating Pythagorean triples and in this context is apparently called "Fibonacci's method". See Formulas_for_generating_Pythagorean_triples#Fibonacci.27s_method. Rckrone (talk) 02:34, 16 April 2012 (UTC)[reply]
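A standard worked instance of that method (added for illustration): pick an odd square, say 9 = 3^2; the odd numbers before it already sum to a square, and including it gives the next square:

$1 + 3 + 5 + 7 = 4^2, \qquad 1 + 3 + 5 + 7 + 9 = 5^2, \qquad \text{hence } 3^2 + 4^2 = 5^2.$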
My comment is off-topic and perhaps superfluous, but can I encourage everyone to be a little gentle in commenting on this article? I haven't seen any really unpleasant comments yet, but this sort of contribution does sometimes attract them, and there may be a real live seventh-grader on the other end of them. There isn't any danger of long-term damage to the project; as Staecker says, the fate of the article is not in doubt. --Trovatore (talk) 02:42, 16 April 2012 (UTC) [reply]

multiplicative functions

  Resolved

Is there a term for multiplicative functions that aren't Completely multiplicative? Bubba73 You talkin' to me? 05:14, 16 April 2012 (UTC)[reply]

I understand the functions you want as simply being termed 'multiplicative functions', where the defining identity f(ab) = f(a)f(b) is only required for coprime a and b. Completely (or totally, as I learnt it) multiplicative functions are the case where a, b do not have to be coprime. 131.111.184.11 (talk) 13:52, 16 April 2012 (UTC)[reply]
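For contrast, two standard examples not named at this point in the thread: the identity function n ↦ n is completely multiplicative, while the divisor-count function d(n) is multiplicative but not completely multiplicative:

$d(2 \cdot 3) = d(6) = 4 = d(2)\,d(3), \qquad \text{but} \qquad d(2 \cdot 2) = d(4) = 3 \neq 4 = d(2)\,d(2).$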
So if you say just "multiplicative functions" that means ones that aren't totally or completely multiplicative? Bubba73 You talkin' to me? 14:16, 16 April 2012 (UTC)[reply]
Yes, that's the usage I was taught (in the UK). Perhaps for a wider audience though, you could initially define your multiplicative function to be as you require, just for added clarity? meromorphic [talk to me] 16:24, 16 April 2012 (UTC)[reply]
Or use the rather awkward multiplicative but not completely multiplicative function. Too bad the terminology weakly multiplicative is not in common use. Duoduoduo (talk) 17:57, 16 April 2012 (UTC)[reply]
Is it really that useful to have a term for this? How often do you really want to know specifically that a function is multiplicative but not completely multiplicative?
Usually "weak foo" includes "foo". If I say that an algebraic structure is a skew field, for example, I do not assert that its multiplication is noncommutative. I merely decline to assert that it is commutative. I am not aware of any standard terminology for "skew field that is not a field", and don't believe there is any pressing need for such. --Trovatore (talk) 19:25, 16 April 2012 (UTC)[reply]
Euler's totient function? meromorphic [talk to me] 20:13, 16 April 2012 (UTC)[reply]
What's your point? Yes, the totient function is multiplicative, and not completely multiplicative. I still don't see any need for a single term that expresses both those facts. --Trovatore (talk) 20:33, 16 April 2012 (UTC)[reply]
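For concreteness, a quick numerical check of that statement about the totient:

$\varphi(2 \cdot 3) = \varphi(6) = 2 = \varphi(2)\,\varphi(3), \qquad \text{but} \qquad \varphi(2 \cdot 2) = \varphi(4) = 2 \neq 1 = \varphi(2)\,\varphi(2).$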
Sometimes I have wanted to refer to multiplicative functions that aren't total/complete, and the only way I know to say it is awkward. Bubba73 You talkin' to me? 14:39, 17 April 2012 (UTC)[reply]
Is there some quotable property of multiplicative-but-not-completely-multiplicative functions, that is not instantly reducible to having some property of multiplicative functions, and failing to have some stronger property of completely multiplicative ones? --Trovatore (talk) 20:21, 17 April 2012 (UTC)[reply]
(This sort of thing does sometimes happen. For example, sets that are computably enumerable but not computable form a single Turing degree. Sometimes these sets are called renorec, short for "recursively enumerable not recursive". But this strikes me as the exception. Usually "thus far but no further" is not that useful a concept.) --Trovatore (talk) 20:24, 17 April 2012 (UTC) Whoops, now that I think about it, that's not true — I was thinking of the fact that sets (in Baire space, say) that are   but not   (boldface) are all Wadge reducible to one another. Let's go with that as the example of a useful thus-far-no-further notion. --Trovatore (talk) 20:29, 17 April 2012 (UTC)[reply]
If I understand your question, yes, there is a property. With completely/totally multiplicative functions, you can calculate f(a*b) from f(a)*f(b). With multiplicative functions that aren't completely/totally multiplicative, a and b have to be relatively prime. So multiplicative functions that are not completely/totally multiplicative cannot be calculated as readily as completely/totally multiplicative functions. Bubba73 You talkin' to me? 21:34, 17 April 2012 (UTC)[reply]
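As an illustration of that point (again using the totient, which is not in the original post here): a factorisation into coprime parts can be used directly, a non-coprime one cannot:

$\varphi(12) = \varphi(3)\,\varphi(4) = 2 \cdot 2 = 4, \qquad \text{whereas} \qquad \varphi(2)\,\varphi(6) = 1 \cdot 2 = 2 \neq \varphi(12).$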
No, that's not the kind of thing I was asking for. That's an example of "it has this nice property, but it doesn't have this nicer property". I'm asking for an example of a single quotable property that you can conclude if you know the function is multiplicative-but-not-completely-multiplicative, but that you can't conclude just from knowing it's multiplicative. -Trovatore (talk) 21:42, 17 April 2012 (UTC)[reply]
I don't know what "quotable property" means. Bubba73 You talkin' to me? 00:31, 18 April 2012 (UTC)[reply]
Something you would be interested in knowing. I can't figure out why you would be interested in knowing the conjunction of a nice property and the negation of a nicer property. Generally, it doesn't help you to know that an object fails to have a nice property. --Trovatore (talk) 00:37, 18 April 2012 (UTC)[reply]
I would be interested in knowing (or communicating) the fact that multiplicative functions that aren't complete can't be computed as readily as complete multiplicative functions. Now, I already know that, but if I didn't, I would like to know that. And I would like to be able to tell someone that in a less awkward way. Bubba73 You talkin' to me? 01:08, 18 April 2012 (UTC)[reply]
Oh, I see. Well, I don't think that justifies a whole mathematical term. That's kind of a one-off. You just say that multiplicative functions have a special property that simplifies their computation, and completely multiplicative functions, an even more special property that simplifies it even more. -Trovatore (talk) 01:12, 18 April 2012 (UTC)[reply]

Definitions of logs and exponentials


I'm trying to find a consistent set of definitions for logs and exponentials, and for their derivatives. I like to define the natural logarithm as:

$\ln x := \int_1^x \frac{dt}{t}, \qquad x > 0$

From this, it's elementary to show that $\ln(xy) = \ln x + \ln y$ and in turn $\ln(x^n) = n \ln x$. Euler's number e is the unique value of x for which ln(x) = 1. This definition is fine for defining the natural log, the exponential function and the derivative of the natural log. Things start to come unstuck when we try to differentiate the exponential function:

$\frac{d}{dx}e^x = \lim_{h \to 0}\frac{e^{x+h} - e^x}{h} = e^x \lim_{h \to 0}\frac{e^h - 1}{h}$

As far as I can see, the evaluation of this right hand limit involves the substitution $u = e^h - 1$, whence

$\lim_{h \to 0}\frac{e^h - 1}{h} = \lim_{u \to 0}\frac{u}{\ln(1+u)} = \lim_{u \to 0}\frac{1}{\ln\left((1+u)^{1/u}\right)}$

The evaluation of this final limit relies upon the fact that

$\lim_{u \to 0}(1+u)^{1/u} = e$

But here's the first problem: the proof of this limit relies upon one knowing the Taylor series of the exponential function. So the proof that the first derivative of the exponential function is the exponential function itself seems to rely upon one knowing the first, second, third, ..., derivatives of the exponential function. The second problem is how one might reconcile the definitions:

$e = \lim_{n \to \infty}\left(1 + \frac{1}{n}\right)^n \qquad \text{and} \qquad \ln x = \int_1^x \frac{dt}{t}$

I would like to have a single definition (preferably the integral definition of the natural log) from which all of the properties and derivatives follow. Fly by Night (talk) 19:07, 16 April 2012 (UTC)[reply]

Let me begin by saying I don't really know how to make fancy looking math symbols on wiki; so I'm writing ln(x), exp(x), using primes for derivatives, slashes for division, etc. That said: you can show ln(exp(1)) = 1, so ln(exp(x)) = x; differentiate both sides to get, exp'(x) * ln'(exp(x)) = 1; you know ln'(x) = 1/x, thus, exp'(x) * 1/exp(x) = 1, which easily supplies exp' = exp. Phoenixia1177 (talk) 19:46, 16 April 2012 (UTC)[reply]
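Typeset, the argument above reads (same steps, just in symbols):

$\ln(\exp x) = x \;\Rightarrow\; \exp'(x) \cdot \frac{1}{\exp x} = 1 \;\Rightarrow\; \exp'(x) = \exp x.$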
For the second question, what do you mean by reconcile definitions?Phoenixia1177 (talk) 19:48, 16 April 2012 (UTC)[reply]
Here's another way to get exp'. From what you have shown, exp'(x) = a * exp(x), a some constant. For any function f: f'/f = (ln f)', hence, a = (ln exp x)' = x' = 1; thus, exp' = exp. Phoenixia1177 (talk) 20:13, 16 April 2012 (UTC)[reply]
Thanks Phoenixia1177 for your replies. I know I didn't mention it, and so I hold my hand up, but I didn't want to use the chain rule. I just wanted to use the definition of a derivative:
$f'(x) = \lim_{h \to 0}\frac{f(x+h) - f(x)}{h}$
Your reply is a nice one and I appreciate it. I'm tempted to think that the "bridge" I was looking for was the chain rule. In reality, I would rather keep things as simple and as close to "first principles" as possible. (I'll be coaching some very bright 16-18 year olds next month, and I've been asking myself questions about what they might ask me. I realised that my knowledge is a bit of a patchwork quilt.)
Regarding how to reconcile the definitions, well, how does one start with
$\ln x = \int_1^x \frac{dt}{t}$ and arrive at $e = \lim_{n \to \infty}\left(1 + \frac{1}{n}\right)^n$? Fly by Night (talk) 21:13, 16 April 2012 (UTC)[reply]
(Thank you for your response on my talk page, I'll play around with it and see what I can work out, though I'm still going to refrain from using it here since it takes me a while to get used to such things.) It's clear that exp'(x) = exp(x) * exp'(0); if you go back to your original derivation of exp'(0), it's not hard to show that the limit definition of e holds iff exp'(0) = 1 (some algebra). So, if you know the derivative of exp at 0, you can show both the limit definition and that exp = exp' are true. Honestly, I think that the chain rule is the clearest way to do this, or just graphing exp(x) and showing that exp'(0) is 1 (obviously, not entirely rigorous, but I don't think it would be out of line; probably could find a way to do this rigorously, though it wouldn't be as easy as the chain rule). Phoenixia1177 (talk) 22:40, 16 April 2012 (UTC)[reply]
If you wanted to show the limit identity another way, though this isn't as simple in my opinion, you can do it as follows: since ln is continuous, it commutes with the limit, so you get lim( ln(1 + 1/t) / (1/t)); change the 1/t to x and take the limit as x goes to 0, you have lim (ln (1 + x) / x), use l'Hôpital's rule and get lim (1/(1 + x)), which is clearly 1. Then, calling the limit L, we have ln(L) = 1, which means L = e. Technically, you could argue that we used the chain rule to differentiate ln(1 + x), but for the purpose at hand, I would say that it's such a weak use that it can be glossed over or explained directly (if not, put u in place of 1 + x, now do lim (ln u / (u - 1)) as u approaches 1, use the rule and get lim (1 / u), which works out the same; just as easy actually). So, using nothing but that ln is continuous and l'Hôpital's rule you can get the limit, which gives you the derivative. Phoenixia1177 (talk) 23:03, 16 April 2012 (UTC)[reply]
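In display form, the argument sketched above (same substitution, $x = 1/t$):

$\ln\!\left(\lim_{t \to \infty}\left(1 + \frac{1}{t}\right)^{t}\right) = \lim_{t \to \infty}\frac{\ln\left(1 + \frac{1}{t}\right)}{1/t} = \lim_{x \to 0^+}\frac{\ln(1 + x)}{x} \;\overset{\text{l'H}}{=}\; \lim_{x \to 0^+}\frac{1}{1 + x} = 1,$

so the limit L satisfies ln(L) = 1, i.e. L = e.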
(an aside on your goal) In my experience, very bright young students are more comfortable with the chain rule than with the limit definition of differentiation. Especially if disallowing the chain rule forces you through additional dry formal derivations. Of course your mileage may vary :) SemanticMantis (talk) 13:21, 17 April 2012 (UTC)[reply]

Define $\ln x = \int_1^x \frac{dt}{t}$ and $e = \lim_{n \to \infty}\left(1 + \frac{1}{n}\right)^n$. Then show that: (1) $\ln(x^n) = n \ln x$, (2) $\ln$ is continuous and strictly increasing, (3) ln x is differentiable with derivative $1/x$ (by FTC). So

$\ln\left(\lim_{n \to \infty}\left(1 + \frac{1}{n}\right)^n\right) = \lim_{n \to \infty} n \ln\left(1 + \frac{1}{n}\right) = \lim_{n \to \infty} \frac{\ln\left(1 + \frac{1}{n}\right) - \ln 1}{1/n} = \ln'(1) = 1$

as required. Sławomir Biały (talk) 13:39, 17 April 2012 (UTC)[reply]

Thanks Sławomir, perfect. I'd not spotted the trick of putting 1/n as the denominator instead of just having a factor of n. I'm happy now! Fly by Night (talk) 14:59, 18 April 2012 (UTC)[reply]

(ec) If you begin with ln(x) being defined and you are sure the logarithm properties hold, you can argue this way:

  • ln(x) is continuous and definitely has an inverse, f(x)
  • f(x) = f(1)^x for all x.
  • $\ln\left(\left(1 + \frac{1}{n}\right)^n\right) = n \ln\left(1 + \frac{1}{n}\right) = \frac{\ln\left(1 + \frac{1}{n}\right)}{1/n}$
  • By continuity, $\ln\left(\lim_{n \to \infty}\left(1 + \frac{1}{n}\right)^n\right) = \lim_{n \to \infty}\frac{\ln\left(1 + \frac{1}{n}\right)}{1/n}$
  • The far right expression is indeterminate so that you can apply l'Hopital's rule to find that the limit is 1, hence ln({limit definition of e})=1
  • Applying f on both sides, {limit definition of e}=f(1).

Overall, this is just substitution to avoid the chain rule. Really I agree with SemanticMantis' aside that you should jump in the pool and make use of more conventional paths. The pedagogical route you have chosen seems rather idiosyncratic, and while your idea makes for good exercises, it probably isn't the best classroom method. Rschwieb (talk) 13:59, 17 April 2012 (UTC)[reply]
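As a rough numerical sanity check of the key limit used in the last two replies (values rounded, added for illustration):

$10\,\ln(1.1) \approx 0.9531, \qquad 1000\,\ln(1.001) \approx 0.9995, \qquad n \ln\!\left(1 + \tfrac{1}{n}\right) \to 1 \text{ as } n \to \infty.$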

I didn't intend to teach it this way, it was for my own contentment. Fly by Night (talk) 14:59, 18 April 2012 (UTC)[reply]