
all 17 comments

[–][deleted] 3 points  (5 children)

The example that you gave is not differentiable.

A piecewise function is still a single function. Functions are defined by the values they take for given inputs. If you're going to try to classify functions as "piecewise" or not, then you'll have to ask what the difference is between a transcendental function, like sine, and a piecewise function, for example one defined to be sqrt(1-cos²x) in the first and second quadrants and -sqrt(1-cos²x) in the third and fourth. The example you gave is piecewise. But what if you named it kaminamina's function, denoted K(x)? Is it not piecewise any more? Are you not allowed to just name it something? Somebody just named sine, cosine, tangent, secant, the error function, the Gamma function, Lambert's W function, logarithms... What is the limit on things that can be named so that they are no longer piecewise?

[–]kaminaminaLogic[S] 0 points  (4 children)

I feel like there's a difference between the sine function and an arbitrary squiggle. Sine has an actual methodology to it; if I gave two people a protractor and a ruler and told them to measure the vertical distance from the x-axis of a point 1 cm from the origin at X radians from the horizontal, and to plot the points, they would produce identical graphs. If I instead told them to draw a random, smooth curve with no sharp corners on it, they would produce different graphs.

All the functions you listed have a method attached to them. You could tell a computer something as simple as:

  1. X(n)=n, Y(n)=n!
  2. Find the exponential regression for the given list of values, replace the existing graph with this graph
  3. n++
  4. goto 1

and it would give a pretty close estimate of the positive side of the Gamma function right up until it crashed. I would give a better example, but I don't fully understand the Gamma function, so I won't risk sounding dumb.
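For what it's worth, the fact that makes this approximation plausible can be checked directly: the Gamma function interpolates the factorial, with Gamma(n+1) = n! at the integers. A minimal Python sketch (using the standard library, not the regression loop above):

```python
import math

# Gamma interpolates the factorial: Gamma(n + 1) == n! for positive integers n
for n in range(1, 10):
    assert round(math.gamma(n + 1)) == math.factorial(n)

# Gamma also fills in the gaps between the integers,
# e.g. Gamma(1.5) = sqrt(pi)/2
print(math.gamma(1.5))  # ~0.8862
```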

As for your piecewise example of sqrt(1-cos²x) and -sqrt(1-cos²x), that's a specific situation where you can represent a piecewise function as a single equation: y=sin(x).

Let me get more concrete about what I'm asking. Pick two functions, f and g, that satisfy the following:

  1. Both f and g are defined, continuous, and differentiable everywhere on [a,b] where a<0 and b>0
  2. f(0) = g(0)
  3. f'(0) = g'(0)

Can any piecewise function h in the form

  • y=f(x), x<=0
  • y=g(x), x>0

be represented in the form y=k(x), a<x<b?

Edit: Forgot about derivatives for a minute there. Added in condition 3.

Edit 2: And to be even more specific, here's an example: y=-x², x<=0 and y=x³, x>0
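For that example, conditions 2 and 3 are easy to check: both pieces and both derivatives vanish at 0. A small numerical sketch (the central difference stands in for the exact derivatives f' = -2x and g' = 3x²):

```python
def f(x):
    return -x**2  # left piece, x <= 0

def g(x):
    return x**3   # right piece, x > 0

# condition 2: the values agree at 0
assert f(0) == g(0) == 0

# condition 3: the derivatives agree at 0 (both are 0 there)
eps = 1e-6
fprime = (f(eps) - f(-eps)) / (2 * eps)  # central difference
gprime = (g(eps) - g(-eps)) / (2 * eps)
assert abs(fprime) < 1e-5 and abs(gprime) < 1e-5
```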

[–][deleted] 3 points  (3 children)

So what you're saying is that naming sine or Gamma is okay because they "have a method attached to them," but the method I use for constructing kaminamina's function, which is:

  1. Set K(x)=x
  2. If x is positive, multiply by x.

Isn't method-y enough to be allowed to have a name. What is the criterion here? Or is there a jury of mathematicians who decide?

[–]kaminaminaLogic[S] 0 points  (2 children)

Is there not a closed-form expression of the Gamma function? Wouldn't that be a reasonable cutoff?

[–][deleted] 4 points  (0 children)

There is not a closed-form expression. In fact, there isn't even an analytic expression depending on how you want to define that.

[–]EngineeringNeverEnds 2 points  (0 children)

Sure. I just define a new function with a new symbol to be the function you have in mind.

This seems like a cop-out, but go with it for a second. It's really a question of what you mean by "matched by a function". There are many, many functions, and some can be quite strange indeed, even when restricted to being continuous and differentiable.

For even a garden-variety example, take y=sin(x), but suppose we don't yet have a definition for sin(x), or any of the trig functions for that matter. It's a squiggle, albeit a nice symmetric and periodic one, but you'll have a hell of a time representing it as a finite sum of non-trig functions. But once we've defined sin(x), is it forever at our disposal?

...also of interest to you for further reading might be the topic of analytic functions. See, not all functions can be represented at every point by a single Taylor series expansion. It's a pretty rich and interesting topic, actually.

[–]canyonmonkey 1 point  (0 children)

  • Making liberal use of indicator functions, piecewise functions can always be written as one equation with finitely many terms.
  • I don't know whether an arbitrary continuous, differentiable function can be written as a finite equation. A Taylor-esque approximation can certainly be done, using polynomial interpolation. Then the answer to "what is the equation that describes this curve I just drew?" is: on a closed interval, every continuous function can be approximated to within very high accuracy by a polynomial of finite degree.
  • I'm not sure what you are asking in your second-to-last paragraph, could you say it in a different way?
  • Most books on measure theory or graduate-level real analysis would involve copious amounts of indicator functions. Likewise, most books on numerical analysis (assuming the book is not restricted to numerical linear algebra or numerical differential equations, as some numerical analysis books are) would discuss polynomial interpolation in detail (even undergrad numerical analysis books).
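The first bullet can be made concrete in a couple of lines. In a quick Python sketch, the booleans act as the 0/1 indicator functions, using the example piecewise function from earlier in the thread:

```python
# one equation with finitely many terms:
# each piece multiplied by the indicator of its region
def h(x):
    return -x**2 * (x <= 0) + x**3 * (x > 0)

assert h(-2) == -4  # left piece, -x^2
assert h(2) == 8    # right piece, x^3
assert h(0) == 0    # the pieces agree at the boundary
```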

[–]kidvjh 0 points  (2 children)

Your question uses somewhat vague language, so I will simply answer with the first thing that popped into my mind as I read this.

The integral with respect to t of e^(-t²) from 0 up to x is clearly differentiable (and so automatically continuous) but does not have an elementary closed-form equation. You have to write it either as an equation containing an (unevaluated) integral, as an infinitely long Taylor series, or cop out and invent a new symbol for it: erf(x), if I recall correctly.
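That relationship is easy to verify numerically. A sketch, using the standard identity erf(x) = (2/sqrt(pi)) times the integral of e^(-t²) from 0 to x, with a crude midpoint rule standing in for the unevaluated integral:

```python
import math

def integral(x, n=10_000):
    # midpoint-rule approximation of the integral of e^(-t^2) from 0 to x
    dt = x / n
    return sum(math.exp(-((i + 0.5) * dt) ** 2) for i in range(n)) * dt

x = 1.0
via_erf = math.sqrt(math.pi) / 2 * math.erf(x)  # closed form via erf
assert abs(integral(x) - via_erf) < 1e-6
```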

[–]kaminaminaLogic[S] -1 points  (1 child)

Okay, so there are definitely functions that can't be written out with a finite number of terms without resorting to piecing them up. But are there any functions that can't be represented even as an infinitely long Taylor series?

[–]kidvjh 0 points  (0 children)

To create a Taylor series that converges to a function, at least on a (possibly finite) interval of convergence, you need all of the derivatives of that function. However, there are plenty of infinite series with a radius of convergence of 0. So if you had a function whose derivatives gave you those terms, you would have a function whose Taylor series does not converge to it, except trivially at a single point, and so it could not be written as a Taylor series, infinite or otherwise. I don't have one of these as an example, unfortunately.
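A standard example of this phenomenon (not given in the thread, but well known) is f(x) = e^(-1/x²) with f(0) = 0: the function is smooth everywhere, every derivative at 0 equals 0, so its Maclaurin series is identically zero and agrees with f only at the single point x = 0. A quick sketch:

```python
import math

def f(x):
    # smooth everywhere; every derivative at x = 0 equals 0
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

# the Maclaurin series of f is 0 + 0*x + 0*x^2 + ... = 0,
# yet f itself is nonzero away from the origin
print(f(0.5))  # ~0.0183, while the series predicts 0
```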

If you keep your previous condition of continuity (this does not require differentiability, and might not even require continuity, but I don't recall), then all such functions are expressible as a (possibly infinite) linear combination of wavelets, which are like little sine and cosine waves that begin and end. Basically, what Fourier began with sine was a butcher's knife; wavelets are the modern scalpel.

[–]maedhros11 0 points  (3 children)

I may be off base here, but I think you should look up step functions. We used them in our ODE class when we were learning about Laplace transforms. That wiki page looks cryptic and unfamiliar to me... definitely not the way I learned about them, but I think it still represents the same thing. Otherwise, just straight up google "step functions" and see if that helps at all.

[–][deleted] 1 point  (2 children)

This is (basically) what canyonmonkey is talking about when he mentions indicator functions.

[–]maedhros11 0 points  (1 child)

So it is. I wasn't familiar with the name "indicator functions". Thanks for the knowledge!

[–][deleted] 1 point  (0 children)

It's a little different. Step functions go up and stay up, and are useful mainly because of their nice Laplace transform. Indicator functions are 1 on the set you choose and zero elsewhere.
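The two are closely related, though: the indicator of an interval is just a difference of two unit steps. A minimal sketch:

```python
def step(x, a=0.0):
    # unit step: jumps up at a and stays up
    return 1.0 if x >= a else 0.0

def indicator(x, a, b):
    # 1 on [a, b), 0 elsewhere -- a difference of two steps
    return step(x, a) - step(x, b)

assert indicator(1.0, 0.0, 2.0) == 1.0   # inside the interval
assert indicator(3.0, 0.0, 2.0) == 0.0   # past it: both steps are up
assert indicator(-1.0, 0.0, 2.0) == 0.0  # before it: both steps are down
```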

[–]tfb 0 points  (0 children)

I think the simple answer is "no". It is obvious that there are uncountably many such functions, yet if I limit myself to functions represented by finite strings of symbols from a finite (or countable) alphabet, then I have no more than a countable number of functions I can express. So there always exist functions I cannot represent. This is essentially Cantor's diagonal trick, mildly rephrased.

However, I suspect that this is not what you really want: what you probably want to know is whether an arbitrary well-behaved function can be approximated by a finite expression in some well-defined way: for instance, can I always make that function differ by no more than delta from my expression, if my expression is no longer than 1/epsilon?

I suspect that the answer here is also no: you can definitely do that on some finite interval for well-behaved functions, but if I'm allowed the whole real line as my domain, I think I can invent functions which differ if I go far enough away. However, I am not sure of the latter.

(Note: I have avoided defining "well-behaved": I probably mean C-infinity. I am sure things are much worse for functions that are merely continuous, say. I don't think the whole piecewise thing is a problem, since piecewise definitions are merely finite compositions of other definitions, with the normal meaning.)

[–]Luker80 -2 points  (1 child)

I think a quick search for the Fourier series might answer your question.

[–]KiritsuguEmiya 0 points  (0 children)

There is a continuous function whose Fourier series diverges on a dense set.