Suppose we have f1(X) + f2(X), where f1 and f2 are functions, and X is an interval.
On which interval is f1+f2 defined?
f1 + f2 is generally defined where both f1 and f2 are, as with real-valued functions.
Is this interval (f1+f2)(X)?
If f1 and f2 are evaluated by replacing the symbol X in each expression by the interval X, then f1(X) + f2(X) contains the range of f1+f2 over X, just as f1(X) contains the range of f1 over X and f2(X) contains the range of f2 over X. In that sense, (f1+f2)(X) is equivalent to f1(X) + f2(X): both are enclosures of the same range. However, unlike with real-valued functions, different ways of writing down f1 and f2 that are equivalent in real arithmetic generally give rise to different interval values.
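A minimal sketch of this point, using hypothetical helper functions (iadd, isub) rather than any particular interval library: the sum of the interval evaluations encloses the range of the sum, but when X appears more than once the enclosure can be wider than the true range (the dependency effect).

```python
# Minimal interval sketch: an interval is a pair (lo, hi).
# These helpers are illustrative only, not a real library's API.

def iadd(x, y):
    """[a,b] + [c,d] = [a+c, b+d]."""
    return (x[0] + y[0], x[1] + y[1])

def isub(x, y):
    """[a,b] - [c,d] = [a-d, b-c]."""
    return (x[0] - y[1], x[1] - y[0])

X = (0.0, 1.0)

# Take f1(x) = x and f2(x) = -x, so (f1+f2)(x) = 0 for every real x.
# Evaluating f1(X) + f2(X) term by term still encloses the range {0},
# but the enclosure is wider, because X enters the expression twice:
print(isub(X, X))  # (-1.0, 1.0): contains 0, but is much wider than {0}
```

Rewriting the expression so that X appears only once (here, simplifying x + (-x) to 0 before evaluating) would give the tight enclosure; that is exactly the sense in which equivalent real expressions yield different interval values.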
On which intervals are f'1(X), f'2(X), f'1+f'2, and (f1+f2)' defined?
Generally, these functions are defined where the corresponding real-valued functions are defined. My explanation above for the sum also applies to the sum of the derivatives.
Think of interval values as BOUNDS ON THE RANGE that are not unique.
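To illustrate the non-uniqueness of the bounds (again with sketch helpers, not a real library's API): f(x) = x*(x-1) and f(x) = x^2 - x are the same real function, with true range [-0.25, 0] over [0, 1], yet the two forms produce different valid enclosures.

```python
# Interval = pair (lo, hi); illustrative helpers only.

def isub(x, y):
    """[a,b] - [c,d] = [a-d, b-c]."""
    return (x[0] - y[1], x[1] - y[0])

def imul(x, y):
    """[a,b] * [c,d]: min/max over the four endpoint products."""
    p = (x[0]*y[0], x[0]*y[1], x[1]*y[0], x[1]*y[1])
    return (min(p), max(p))

X = (0.0, 1.0)
one = (1.0, 1.0)

# Two algebraically equivalent forms of the same f:
factored = imul(X, isub(X, one))  # x*(x-1)   -> [-1, 0]
expanded = isub(imul(X, X), X)    # x**2 - x  -> [-1, 1]
print(factored, expanded)
```

Both results contain the true range [-0.25, 0], so both are correct bounds; the factored form is simply tighter. Neither is "the" value of f over X.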
If you have a question related to validated computing, interval analysis, or related matters, I recommend

R. Baker Kearfott, http://interval.louisiana.edu/kearfott.html