The issue is a bit more... complicated... In fact, using a linear scale (whether from 0 - 127 or 0 - 100 is irrelevant) is
not a good choice for volume. Perceptually, we "feel" volume in a logarithmic fashion, and therefore a logarithmic scale (typically in dB, decibels) would be the best choice. And this wouldn't be a 0 - 100 scale, either. It would probably be something like 0 dB for "standard" volume, up to, let's say, +20 dB for some gain, and then down to minus infinity for complete attenuation...
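To make the logarithmic idea concrete, here is a minimal sketch of one mapping from 7-bit MIDI volume values to decibels. The 40·log10(v/127) curve is a convention that appears in the GM2/DLS specifications; real instruments may use other curves, so treat this as an illustration rather than a definitive formula:

```python
import math

def midi_volume_to_db(value: int) -> float:
    """Convert a 7-bit MIDI volume value (0-127) to decibels.

    Uses the 40*log10(v/127) curve from the GM2/DLS specs as an
    example; actual synths may differ. 127 -> 0 dB, 0 -> -infinity.
    """
    if not 0 <= value <= 127:
        raise ValueError("MIDI values are 7-bit: 0-127")
    if value == 0:
        return float("-inf")  # complete attenuation
    return 40.0 * math.log10(value / 127.0)

print(midi_volume_to_db(127))           # 0.0 dB ("standard" full volume)
print(round(midi_volume_to_db(64), 1))  # about -11.9 dB at half scale
```

Note that half the MIDI range is nowhere near "half as loud" in dB terms, which is exactly why a linear 0 - 100 display is misleading.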
But the fact is, these values were defined by the MIDI standard, based on binary representations of numbers, many years ago. At that time, the definitions had to contend with several tradeoffs: keep it simple, as processors didn't have much computing power (a logarithm was completely out of the question), keep it small (few bits) to minimize transmission time (MIDI used a
very slow transmission speed... by today's standards. At the time, it was cutting edge!). Therefore, MIDI is based on an 8-bit representation for everything, but uses one bit to differentiate between commands (like "note played") and values (like which note was played). As a result, everything ended up limited to the 7-bit binary value range: 0 - 127 (or -64 to +63 for signed values). This is simply a consequence of the binary system used by computers and the number of bits assigned, and has nothing to do with specific units of measurement. This impacts everything in MIDI: note numbers (0 - 127!), instruments (0 - 127, although Yamaha prefers to number them 1 - 128, simply adding one to the number to avoid confusing people with an instrument 0... and ending up adding to the confusion!), volume (as you noticed), velocity, filter parameters and everything else.
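That "one bit to differentiate commands from values" is simply the top bit of each byte. A small sketch of the convention described above:

```python
def classify_midi_byte(b: int) -> str:
    """The top bit of an 8-bit MIDI byte says what it is:
    1 -> a status byte (a command, e.g. 0x90 = Note On on channel 1)
    0 -> a data byte (a value, hence limited to the 7-bit range 0-127)
    """
    if not 0 <= b <= 0xFF:
        raise ValueError("MIDI bytes are 8-bit: 0-255")
    return "status (command)" if b & 0x80 else "data (value)"

print(classify_midi_byte(0x90))  # Note On command -> status
print(classify_midi_byte(60))    # middle C note number -> data
print(classify_midi_byte(127))   # largest possible data value -> data
```

Since data bytes must have the top bit clear, 127 (binary 0111 1111) is the hard ceiling for every value in classic MIDI.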
(Incidentally, the same happens in computers with memory: that's why "1 MB" is actually 1,048,576 bytes, and not one million bytes, and why the IEC has proposed the use of MiB instead of MB to differentiate M = mega = million in the metric system from Mi = mebi = 1,048,576 in the computer industry. For more information on this, refer to: https://en.wikipedia.org/wiki/Binary_prefix.)
Could this be changed? Technically, yes, of course. But realistically, and logistically, it would be a nightmare. The MIDI standard is
THE standard to interconnect musical instruments, DAWs, and many, many more devices (from film editing and synchronization, to sound and video recorders, and even lighting control systems in theatres). Changing the range of values, even if it was only on one instrument and only for display (i.e., the values stay the same internally and are merely displayed differently), would confuse everything! Imagine setting the volume to, let's say, 70 on the new "metric" scale, and finding that in the DAW you are using to record and control your keyboard, the value appears as 89... And vice versa: you set the volume on your DAW to 100, and find that on your keyboard the volume is now 79...
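You can check those mismatched numbers yourself. A quick sketch of a hypothetical 0 - 100 "display" scale mapped onto the real 0 - 127 range (simple rounding assumed):

```python
def percent_to_midi(pct: float) -> int:
    """Map a hypothetical 0-100 display value onto the 7-bit 0-127 range."""
    return round(pct / 100 * 127)

def midi_to_percent(value: int) -> int:
    """Map a 7-bit MIDI value back onto a 0-100 display scale."""
    return round(value / 127 * 100)

print(percent_to_midi(70))   # set 70 on the "metric" keyboard...
print(midi_to_percent(100))  # ...set 100 in the DAW, see 79 on the keyboard
```

The two displays can never agree except at the endpoints, which is exactly the confusion described above.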
And changing the MIDI standard itself is out of the question. It is used to control not only new equipment, where a transition could be done, but also old (very old!) equipment, still used and appreciated today. This is also why the evolution of MIDI has been so slow: it needs to remain backwards compatible. The first really big change is rolling out right now, with MIDI 2.0, but this is an opt-in standard (i.e., unless and until both interconnected pieces of equipment agree to use the new standard, everything works as usual). And MIDI 2.0 is, still, a binary standard (although the number of bits, and with it the value ranges, have been greatly expanded).
Sorry for the long explanation, but I didn't want this to end up as a discussion of metric vs. non-metric, which this issue really isn't about. And, for the record, I'm from a fully metric country, and US customary units baffle me...
tl;dr: in e-music and MIDI, the standard value ranges are determined by the binary nature of the standard, and this is what everyone and everything uses, understands, and has adapted to.