This is false. You could have any 1,000 values, and the mean does not have to be one of them. For instance, all the values could be integers while the mean is not an integer. Or the set could contain only two distinct values, 0 and 1: with 500 zeros and 500 ones, the mean is 0.5, which is neither value.
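As a quick sanity check, here is a small Python sketch of that 0/1 example (the 500/500 split is just the case described above):

```python
from statistics import mean

# 500 zeros and 500 ones: 1,000 values, all drawn from {0, 1}
values = [0] * 500 + [1] * 500

m = mean(values)      # (500*0 + 500*1) / 1000
print(m)              # 0.5
print(m in values)    # False: the mean is not one of the 1,000 values
```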
You can test these with a couple of real numbers. 1/(3 + 4) = 1/7, but 1/3 + 1/4 = 7/12, which is much bigger. So the first one can't be right. The second one is false too, as you say; 3(a + b) is 3a + 3b.
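The same check with exact fractions in Python (a sketch; a = 3 and b = 4 are just the test values used above):

```python
from fractions import Fraction as F

a, b = F(3), F(4)

print(F(1) / (a + b))                 # 1/7
print(F(1) / a + F(1) / b)            # 7/12, so 1/(a+b) != 1/a + 1/b
print(3 * (a + b) == 3 * a + 3 * b)   # True: distributing 3 gives 3a + 3b
```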
Comments
Counterexample:
Let's say the set is all the integers from 1 to 1,000. The average of this set is 500.5, which is not an element of the set.
==> The statement is false.
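The same counterexample, checked numerically in Python (a sketch using the 1-to-1,000 set above):

```python
# The integers 1..1000: 1,000 values whose mean is 500.5
values = list(range(1, 1001))

m = sum(values) / len(values)
print(m)              # 500.5
print(m in values)    # False: the mean is not in the set
```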
False.