Analog/POTS line level calibration


I’m looking for resources / information on how to check the signal voltages on an analog phone line.
So far I’ve found the Digital milliwatt article on Wikipedia and implemented such a test signal (in A-law encoding) on our OCTOI hub (030 1234 9995 now returns this ear-piercing 1 kHz, 0 dBm test tone).
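A sketch of how such a test tone can be constructed, under the usual assumptions: 8 kHz sampling, so a 1 kHz sine repeats every 8 samples, and A-law full scale sitting about +3.14 dBm0 above the 0 dBm0 reference. The π/8 phase offset keeps the samples off the exact zero crossings and peaks, which is how the digital milliwatt sequence is conventionally laid out. This generates linear PCM only; the wire format would be the G.711 A-law encoding of these samples.

```python
import math

# One period (8 samples at 8 kHz) of a 1 kHz sine at 0 dBm0,
# as 16-bit linear PCM. Assumption: A-law full scale is ~+3.14 dBm0,
# so a 0 dBm0 sine sits 3.14 dB below full scale.
FULL_SCALE = 32767
amplitude = FULL_SCALE * 10 ** (-3.14 / 20)

samples = [
    round(amplitude * math.sin(2 * math.pi * 1000 * n / 8000 + math.pi / 8))
    for n in range(8)
]
print(samples)  # 8-sample period; A-law encode per G.711 for transmission
```

The exact byte values of the standardized digital milliwatt sequence are defined in G.711 itself, so check there rather than trusting this reconstruction.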

On an ideal phone line this should apparently result in 0.775 V RMS (into 600 ohms), but I’m unsure what the best way to measure that would be. Terminate the line with a phone and measure the AC voltage?
Measuring the upstream/ADC levels would probably require getting a signal generator and feeding the same 0.775 V into the line?
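For reference, the dBm/Vrms relationship behind these numbers is just P = V²/R with 0 dBm defined as 1 mW. A small conversion helper (assuming a purely resistive 600 ohm reference impedance):

```python
import math

def dbm_to_vrms(dbm: float, impedance_ohms: float = 600.0) -> float:
    """RMS voltage of a tone at the given dBm level into the given impedance."""
    power_watts = 1e-3 * 10 ** (dbm / 10)   # 0 dBm = 1 mW
    return math.sqrt(power_watts * impedance_ohms)

def vrms_to_dbm(vrms: float, impedance_ohms: float = 600.0) -> float:
    """Level in dBm of the given RMS voltage across the given impedance."""
    power_watts = vrms ** 2 / impedance_ohms
    return 10 * math.log10(power_watts / 1e-3)

print(dbm_to_vrms(0))   # ~0.7746 V RMS across 600 ohms
```

This also works in the other direction: whatever AC voltage a multimeter shows across a known termination can be converted back to dBm to compare against the expected 0 dBm.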

I’m using these 8x a/b channel modules in my PBX to convert from ISDN/PRI to a POTS line.
They use Infineon PEB 2466H chips, and the datasheet (page 24) claims an output of 1.095 Vrms with the A-law test sequence described in G.711.
There are a couple of op-amps in the circuit after that, so those might shift the levels again…
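One arithmetic cross-check on that datasheet figure (assuming it refers to a 600 ohm load): 1.095 Vrms into 600 ohms works out to roughly +3 dBm, i.e. about √2 × 0.775 V. That would suggest the quoted level corresponds to a tone near A-law full scale rather than to the 0 dBm0 digital milliwatt, which would be worth verifying against the datasheet before comparing measurements.

```python
import math

# Assumption for this cross-check: the 1.095 Vrms figure is into 600 ohms.
vrms = 1.095
power_mw = vrms ** 2 / 600 * 1000
level_dbm = 10 * math.log10(power_mw)
print(f"{level_dbm:.2f} dBm")  # about +3.0 dBm, i.e. ~sqrt(2) x 0.775 V
```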

In the end, I’d like to ensure the best possible signal/line quality for analog modems.
I wonder whether it’s worthwhile to test/try another ISDN BRI->a/b line converter.

Any experiences? Tips & tricks to share?

If you apply the digital milliwatt to a 600 ohm resistor, you should measure 0.775 volts AC (RMS). Note that the line impedance is sometimes specified around 1050 ohms, so you may measure a bit more.
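To put numbers on that remark: if the same 1 mW is delivered into the termination, the voltage scales with the square root of the impedance (a simplification — real line impedances are complex, not purely resistive):

```python
import math

# Same 1 mW (0 dBm) delivered into different resistive terminations:
for ohms in (600, 1050):
    print(ohms, round(math.sqrt(1e-3 * ohms), 3), "Vrms")
# 600 ohms -> 0.775 Vrms, 1050 ohms -> 1.025 Vrms
```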

The analog side of the PSTN is not really my domain of expertise. I would expect you might find plenty of people skilled in that art on the C*net/ckts mailing list.