I have lived since the days when the real reason for this was still alive and kicking. When I explain to youngsters today why ASCII was 7 bits and US digital channels 56 kbps, they stare open-mouthed at me: Is that really true?
For those who do not know: Digital phone lines in the US were 8 bit, and the raw bit rate was 64 kbps. But the system designers decided to pack 24 of those into a 1.5 Mbps T1 connection, leaving no room for the signalling to/between the phone switches - dialling and such. (When running ISDN over such lines, only 23 phone channels were used, with the last one reserved for signalling.)
So to communicate e.g. number dialling (and a lot of other control signals), every sixth byte in the "user channel" had to sacrifice its least significant bit for that use. The recognized term for this is "bit robbing", which I find quite descriptive. The five bytes in between were intact 8-bitters. Aside from the problem of every 6th byte being 7 bits only, the real problem was that the equipment preparing the data stream couldn't know which of the bytes was the 6th. (Equipment connected directly to the line, such as a digital phone, could synchronize by "frame hunting", recognizing frame delimiters in one byte out of six, but not in the other five.) So no LSB was trustworthy. The byte rate was 8000 bytes/sec and the byte size 8 bits, but only 7 were reliable. You couldn't even use the LSB for parity: on average, half of the robbed bytes - one byte in twelve, if you prefer - would be received with the wrong parity.
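The mechanism above can be sketched in a few lines. This is a minimal illustration, not real T1 code: the function name and the choice of signalling bit value are mine, and the frame is just a flat byte string. The point it demonstrates is that since the sender cannot tell which byte will be robbed, useful data must fit in the top 7 bits of every byte, which is where 8000 × 7 = 56 kbps comes from.

```python
def rob_bits(stream: bytes) -> bytes:
    """Simulate bit robbing: the network overwrites the LSB of every
    6th byte with a signalling bit, so no LSB can be trusted."""
    robbed = bytearray(stream)
    for i in range(5, len(robbed), 6):  # every 6th byte (0-indexed)
        robbed[i] &= 0xFE               # LSB discarded...
        robbed[i] |= 1                  # ...and replaced by signalling
                                        # (arbitrary value here)
    return bytes(robbed)

# Data must survive in the top 7 bits of *every* byte:
usable_bps = 8000 * 7   # 8000 bytes/s, only 7 reliable bits each
assert usable_bps == 56_000
```

Running `rob_bits` on twelve zero bytes corrupts bytes 5 and 11 and leaves the other ten intact, which is exactly the "five clean 8-bitters between each robbed byte" pattern described above.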
Bit robbing was never used in European telephone networks. Our 2 Mbps E1 channels had room for 30 user channels, one for signalling according to the international standard and one for national or proprietary signalling. Channels were clean 8 bit / 8000 Hz, 64 kbps. So the switch to ISDN went much more smoothly in Europe than in the USA. Here in Norway, ISDN was The line standard from around 1995 up until the present (although most subscribers didn't know: the analog signals, a standard from 1930 or thereabouts, from the old phones they clung to were digitized and transformed to ISDN format before entering the phone switch).
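The channel arithmetic behind the two hierarchies works out like this. A hedged back-of-the-envelope sketch, assuming the nominal figures in the text (24 payload slots on T1, 32 slots on E1, 8-bit samples at 8 kHz) and ignoring T1's extra framing bit:

```python
slot_bps = 8 * 8000               # one 64 kbps timeslot

t1_payload = 24 * slot_bps        # 1.536 Mbps of timeslots on T1
e1_raw = 32 * slot_bps            # 2.048 Mbps on E1

# E1 dedicates two slots to overhead, leaving 30 clean user channels -
# which is why no user bits ever had to be robbed for signalling.
e1_user = (32 - 2) * slot_bps

assert slot_bps == 64_000
assert e1_user == 30 * 64_000
```

The design choice is plain from the numbers: E1 pays for signalling with whole dedicated timeslots, while T1 squeezed 24 full channels into the pipe and then had to steal user bits to signal at all.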
(The single phone company providing fixed phone lines here in Norway announced two days ago that they will no longer accept orders for new fixed phone lines; they will tear down their old lines within a couple of years. People who insist on having a desktop phone will be equipped with an adapter transforming the signals from the old rotary dial phone into cellular phone signals. ... I guess that those insisting have retained the rotary dial, never upgrading to a pushbutton DTMF model.)
You know, it's positively embarrassing when an Australian, indentured in Canada, working for an international web site, knows more Internet slang du jour created by the American Orangemeister than an American(*) does.