Over the years, there has been a very wide variety of signal levels in use between pieces of equipment. On the professional side there are a few standards, but domestic equipment has used whatever the manufacturer felt like as they designed it. Sometimes they pick something that's useful when connecting other equipment together, but that's not a given. So connecting random bits of equipment together can be a bit of a problem, especially if you don't know their specifications.
Even connecting a stereo system made by the same company can be a signal level nightmare. You may find that the tuner has a different signal level than the tape deck, and different again from the CD player. And, to make things worse, they mightn't preset each amplifier input to balance out the levels, so that as you change from one input source to another, you have to turn the volume up or down, sometimes by radical amounts. Anyone who's used domestic players with domestic audio mixers will encounter a similar problem: some things need to be operated with the faders most of the way down, while others are all the way up and still not loud enough.
In the olden days, way back when dinosaurs roamed the earth, professional equipment used a very high nominal signal level of +8 dBm (approx 2 volts rms), as a high signal level gives you good protection against outside noise (the signal being much greater than the noise). Later on, professional equipment generally opted for a slightly lower nominal level of +4 dBu (approx 1.23 volts rms), and this is the current norm.
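As a rough sketch of where those voltage figures come from, here's a small Python conversion assuming the standard 0.775 volt rms reference (the voltage of 1 mW into 600 ohms); the function names are just illustrative:

```python
import math

# The dBu/dBm reference voltage: 1 mW into 600 ohms = sqrt(0.6) = ~0.775 V rms
DBU_REF_VOLTS = 0.7746

def dbu_to_volts(dbu):
    """RMS voltage corresponding to a dBu level."""
    return DBU_REF_VOLTS * 10 ** (dbu / 20)

def volts_to_dbu(volts):
    """dBu level corresponding to an RMS voltage."""
    return 20 * math.log10(volts / DBU_REF_VOLTS)

print(round(dbu_to_volts(4), 2))  # +4 dBu -> ~1.23 V rms
print(round(dbu_to_volts(8), 2))  # +8 dBm into 600 ohms -> ~1.95 V rms
```

The same arithmetic works for +8 dBm here only because, into 600 ohms, the dBm and dBu references happen to share the same voltage.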
Those are the main input and output levels, but special signals (such as insert points in the middle of a mixer channel, auxiliary sends and returns, and monitor outputs) might use different levels. That makes things difficult when you want to connect a special piece of equipment, such as a reverb unit, in the middle. It has to be able to handle signal levels that differ from the main studio levels (usually they were lower), and return similar levels back to the main equipment.
And those were the nominal levels. Nominal, in that you were expected to drive the equipment with signals at those levels most of the time, and it was designed to give its best performance around them. It could handle somewhat higher levels (there was “headroom” in reserve for those occasional louder peaks), and much lower levels, too, but it worked best when operated around the nominal level. The metering was set to register normally around those levels as well, with 0 VU at the nominal signal level, whatever it was. Rarely does 0 VU mean 0 dBu at the input or output, since very few things use that as their nominal level. Think of 0 VU as the fill-marker on a water container: it simply indicates the optimum level to fill to, not how much water is in it.
As a practical example of nominal levels: if you were to play a modern rock recording, the whole album would be at about the same volume level, and, on average, that would be at the nominal level of the equipment.
For an audio operator setting up equipment with test tones, your reference tone would be sent to the equipment at the expected nominal level, the equipment's levels would be adjusted until the meter indicated the reference level, and the equipment would then output a signal at the reference level. This lines everything up to work at its best performance.
But, for domestic gear, signal levels were commonly around 0.1 to 0.3 volts rms, nominally. That's much lower than the professional levels, and was often the cause of problems when you tried to mix the two together.
To feed domestic equipment into professional gear, you'd need to be able to increase the pro gear's input sensitivity to get enough signal in. That usually wouldn't be a problem for something like connecting a cassette player to an audio mixer, as most mixers have a wide range of input gain adjustment. But it could well be a problem if you tried to connect a domestic player directly to a professional recorder, as the recorder may have only a limited range of adjustment, or no external adjustment controls at all. You'd need to insert a pre-amp into the signal path to boost the levels up.
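To get a feel for how much boost that is, here's a quick sketch of the gain (in dB) needed to lift a domestic-level signal up to a +4 dBu nominal input; the function name and example voltages are illustrative:

```python
import math

def gain_db_needed(source_v, target_v):
    """Gain, in dB, needed to bring one RMS voltage up to another."""
    return 20 * math.log10(target_v / source_v)

# A domestic source at ~0.3 V rms feeding a +4 dBu (~1.23 V rms) input:
print(round(gain_db_needed(0.3, 1.23), 1))  # ~12.3 dB of boost required
```

So even the hotter end of the domestic range is roughly 12 dB shy of pro nominal, and a 0.1 volt source needs over 20 dB, which is well into pre-amp territory.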
Conversely, plugging professional equipment outputs into domestic gear could send far too much signal into the domestic equipment and cause terrible distortion. For something like connecting the outputs of a professional recorder to the inputs of a domestic audio amplifier, you may get away with simply turning down the amplifier's volume control (it depends on the pre-amp circuitry between the inputs and the volume control). But if it couldn't handle it, you'd need to place an attenuator between the two pieces of equipment.
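The simplest such attenuator is a two-resistor voltage divider. As a sketch (the resistor values below are hypothetical, chosen only to land near domestic levels, and this ignores source and load impedances):

```python
def divider_output(vin, r_series, r_shunt):
    """Unloaded output voltage of a simple two-resistor voltage divider."""
    return vin * r_shunt / (r_series + r_shunt)

# Knock a +4 dBu (~1.23 V rms) pro output down toward domestic levels,
# using e.g. a 10k series resistor and a 3.3k shunt resistor:
print(round(divider_output(1.23, 10_000, 3_300), 2))  # ~0.31 V rms
```

That's roughly 12 dB of attenuation, bringing the pro-level signal down to about the 0.3 volt region domestic inputs expect. A real pad would also be chosen with the equipment's impedances in mind.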
Domestic gear started having to cope with higher signal levels when the CD player came out, as CD players had a 2 volt rms output signal level. Albeit that was 2 volts at the player's maximum output level rather than its nominal output, so it wasn't quite as bad as trying to monitor a signal that was nominally 2 volts but occasionally much higher; it still had the potential for problems, though. And since the early CDs had a fair bit of headroom, their nominal level wasn't that much greater than the rest of the domestic audio equipment, so you could usually just turn the volume control down a bit and not suffer any problems. But if you listened to your CDs very loudly, any of those above-nominal moments could be extremely loud, much greater than the louder peaks coming from records and tape decks (around 10 dB more).
Later on, as the recording loudness wars raged (recording companies trying to make their recordings sound louder than the competition's), the nominal levels of CDs went up, right up to just below the maximum level the CD could reproduce. So modern domestic amplifiers really did need to be able to handle strong input signals.
So that's the background information. And although it's often spread around the internet, it's misinformation that CDs are the origin of the 2 volt rms signal level; the +8 dBm professional level predates them by some twenty to thirty years.
Now down to the nitty-gritty of levels. Signals are generally referenced as some form of decibel, which is a ratio rather than an absolute level: a comparison of one signal against another, much like saying a football team scored twice as many goals as the other side, unlike an absolute measurement, such as how many centimetres tall you are. This was due to the nature of how audio signals were put through equipment and cabling, where, for example, signal loss needed to be compensated for, and all you needed to do was amplify the signal by a certain amount to bring it back to normal, e.g. maybe having to double it. Something that doubles a signal doubles it no matter what its actual level is, so there's a convenience factor in using a system that works that way.
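That ratio idea can be sketched numerically: doubling a voltage is always about +6 dB, regardless of the starting level. The function names here are just illustrative:

```python
import math

def ratio_to_db(ratio):
    """Express a voltage ratio in decibels."""
    return 20 * math.log10(ratio)

def apply_gain(volts, db):
    """Apply a gain in dB to a voltage; the relative change is the
    same no matter what the starting level is."""
    return volts * 10 ** (db / 20)

print(round(ratio_to_db(2), 2))  # doubling a voltage -> ~+6.02 dB
print(round(apply_gain(0.1, 20), 2))  # +20 dB turns 0.1 V into 1.0 V
```

A +6 dB stage doubles a 0.1 volt signal to 0.2 volts and a 1 volt signal to 2 volts alike, which is exactly why dB figures are so convenient for chaining equipment together.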
In the early days, dBm was used, which was referenced against 1 milliwatt into 600 ohms, the standard termination impedance at the time. Later, different impedances became more common, so dBm became inapplicable and dBu became the most common reference. This time it's a voltage-based reference instead of a power-based one, where the impedance isn't part of the equation. And, just to make life easy (for a change), it's the same voltage as the old dBm reference (about 0.775 volts rms). So what used to be +4 dBm is now +4 dBu, the difference being that +4 dBu can be into any impedance, and it's still the same level.
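The distinction can be sketched in a few lines: dBm is power-referenced, so the corresponding voltage depends on the impedance, while dBu is voltage-referenced and doesn't care. Function names here are illustrative:

```python
import math

def dbm_to_volts(dbm, impedance_ohms):
    """dBm is referenced to 1 mW of power, so the RMS voltage
    depends on the impedance it's delivered into."""
    power_watts = 1e-3 * 10 ** (dbm / 10)
    return math.sqrt(power_watts * impedance_ohms)

def dbu_to_volts(dbu):
    """dBu is referenced to ~0.775 V rms, independent of impedance."""
    return 0.7746 * 10 ** (dbu / 20)

# Into 600 ohms the two agree...
print(round(dbm_to_volts(4, 600), 2), round(dbu_to_volts(4), 2))
# ...but into a different impedance the dBm voltage changes; dBu doesn't:
print(round(dbm_to_volts(4, 150), 2), round(dbu_to_volts(4), 2))
```

Into 600 ohms both give about 1.23 volts, which is why the old +4 dBm figures carried over to +4 dBu unchanged; into 150 ohms, the same +4 dBm would mean a rather different voltage.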