On Tuesday, Jan 20, 2004, at 20:29 America/Denver, Jake Anderson wrote:

> my understanding is that standard phone calls run at about 64kbps data
> rate.
>
> in Australia at least the carriers only guarantee the call to 9600bps
> the improvement in modem speeds has been finding better ways to utilise
> the line, better modulations etc. The compression and error checking
> runs on top of that. I've had uploads of 30k/s on a 56k modem (which
> only connect at 33.6k up stream btw) due to the compression the modem
> was doing.

Welllll... kinda. This is going to ramble a bit, but ride along for the fun...

Now you've also got to take into account the history of the telco side of things. Analog lines up through the 1970's generally ran all the way back to the Central Office... perhaps getting amplified and echo-cancelled and all sorts of fun stuff along the way. (Can you say Bridge Taps, boys and girls? Sure I knew you could! Party Line, anyone?)

Obviously this wasn't very efficient, so Bell Labs came up with some toys in the 60's to do A/D conversions and accurately cram those into a single large synchronously-clocked signal in timeslices... Time-Division Multiplexing was born! That little thing they called a "transistor" seems to have helped in this process. (GRIN)

Early analog-to-digital gear gave a standard phone call a 64Kb/s timeslice in a larger synchronous circuit, enough to cover the mathematical requirements of Nyquist's Theorem and the desired "usable bandwidth" of an analog circuit at the time. Those A/D conversions would be stuffed into timeslices of a faster synchronously-clocked circuit... the T1. You'd put 24 x 64Kb/s in... synchronous, clocked to one end or the other, all that fun stuff.

Other physical challenges awaited the early T1 makers... Framing bits were "stuffed" in to keep the two ends in step, and because the early electronic line repeaters literally stole their power from the "AC" of the ones and zeros coming down the line, you couldn't send too many 0's in a row... the AMI (Alternate Mark Inversion) line coding alternated the polarity of the ones, and ones-density rules kept the line-powered repeaters from losing power... wheee. And a down loop was always forced to send "All 1's" (which was "alternated" by AMI, so there was always enough power present to keep the repeaters that weren't chopped out of the loop by a back-hoe... working!) All-1's is also known as "Blue Alarm".

Other alarm types were created... "Red Alarm" was simply that the whole bloody thing had come out of sync and crashed. Yellow Alarm was more interesting... bits were robbed from the audio path and forced to a fixed alarm pattern in the second bit of every 64Kb/s timeslice, as an indication that something was wrong at the far end but that the circuit was still up and synchronized. (Usually an indication that the CO switch had dropped ALL calls over the trunk, usually due to an unusually high bit-error-rate.)

As clock sources got better, less of this bit stuffing needed to happen, lines could be driven further, and the line-powered repeaters were slowly removed from the network. So the extra framing bits were stolen to move the end-to-end alarm bits outside of the audio portion of the signal... and this is Extended Super-Frame, or ESF. Another little trick to keep things clocked up right, B8ZS (Bipolar with Eight-Zero Substitution), was introduced as the "normal" way to keep ones density up on an ESF T1 or "T-span"... span being leftover AT&T Long Lines terminology for exactly that... huge spans of cable on the early telephone long-distance network.
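(If you want to see where those magic numbers come from, here's a quick back-of-the-envelope -- plain textbook arithmetic, scribbled in Python just because it's handy:)

# Where the standard telco numbers come from (textbook figures only):
sample_rate = 8000                   # Hz -- Nyquist: twice the ~4 kHz "usable" voice band
bits_per_sample = 8                  # one PCM byte per voice sample
ds0 = sample_rate * bits_per_sample  # 64,000 b/s -- the standard voice timeslice
t1_payload = 24 * ds0                # 24 timeslices per frame
t1_framing = sample_rate * 1         # 1 framing bit per 193-bit frame, 8000 frames/sec
print(ds0, t1_payload, t1_payload + t1_framing)  # 64000 1536000 1544000 -- the T1 line rate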
Anyway, enough about how early T1's worked... Telcos figured out rapidly that end-subscribers couldn't really tell the difference in audio quality between a full 64Kb/s A/D conversion and, say, a 16Kb/s conversion... especially if they did some tweaking to the power levels of the mid-range and lows... so as time went on, they figured out how to multiplex more and more analog sampled stuff into a single T1... thus saving on trunking costs between Central Offices. Less cable in the ground, more money in their pockets. Good for everyone.

As clock sources and oscillators and everything got better, the available real bandwidth of an analog phone line, from the home end to the far end, actually went DOWN. Well... as they say, timing affects the outcome of the raindance.

All of these technologies collided in the Age of Modems (GRIN), when modem manufacturers who were counting on that "standard" analog line being able to carry certain frequencies were also competing (though most of the time they didn't know it) with the local telco, who wanted to avoid putting more copper in the ground in the Outside Plant.

The telco side of this worked its way out to neighborhoods in the form of the Channel Bank. Instead of just saving money between Central Offices, the telco now wanted to run only a small amount of copper wire all the way from the CO to your neighborhood. They realized they could do this by taking the signal to the neighborhood digitally and doing the A/D conversion closer to your house. They came up with a device that would take a whole bunch of analog residential and business circuits and cram them into as few T1's as possible, thus saving the telco money on putting more copper into the ground as a neighborhood grew.

Even beyond the Channel Bank, some devices started implementing protocols like GR-303, where the device in the "field" would route individual analog lines to the Central Office ONLY after they had gone to an off-hook state or had an incoming call. The CO switch and this "smart device" would steal a single 64K channel to communicate with a serial packet protocol, and the smart box could then know when a call was coming in for one of thousands of analog lines and connect that line to the appropriate channel number of the digital trunk(s) to the CO.

[This means that if all the people in your neighborhood picked up the phone at the same time, anywhere from 30-60% of you would NOT get a dial-tone if you live in an area serviced by one of these boxes -- depending on how hard the telco tweaked the line usage algorithm in the multiplexers and remote switches. Some regulatory agencies oversee this percentage and don't let it get too far out of control in the telco's favor. Ever get a fast-busy signal IMMEDIATELY upon picking up the phone on a busy holiday? You're serviced by one of these gadgets then.]

The modem engineers fought back... smart engineers started coming up with ingenious plans to use all of that big beautiful original analog pipe, not knowing that what they really had was an analog pipe for three city blocks crammed through a SLC-96 or similar early-model Channel Bank, at which point all of their headroom outside of the standard voice frequencies was stripped to get rid of problems with good old Mr. Nyquist's theorem... they created faster and faster modems, and the general population bit. They wanted speed. But then they also started complaining. "I never get a 56K connection!" Yep.
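[Side note, with totally made-up numbers: if you want a feel for how sensitive that dial-tone percentage is to how many trunks the telco hangs off the remote box, the classic Erlang B blocking formula gives you the flavor. The line count and per-line traffic below are hypothetical, purely for illustration:

# Rough sketch: odds of a fast-busy when lots of analog lines are
# concentrated onto a few digital trunks (the channel bank / GR-303 idea).
# All of the numbers below are invented for illustration.

def erlang_b(channels, offered_erlangs):
    """Erlang B: probability a new call finds every trunk channel busy."""
    b = 1.0
    for n in range(1, channels + 1):
        b = (offered_erlangs * b) / (n + offered_erlangs * b)
    return b

lines = 600                  # analog subscriber lines hanging off the remote box
erlangs_per_line = 0.1       # guess: each line busy ~6 minutes in the busy hour
offered = lines * erlangs_per_line

for t1s in (2, 3, 4):        # trunks back to the CO, 24 channels per T1
    channels = 24 * t1s
    print(f"{t1s} x T1 ({channels} channels): "
          f"{erlang_b(channels, offered):.1%} of attempts blocked")

On an ordinary afternoon the blocking is down in the noise; on Mother's Day the offered load blows way past what the box was engineered for, and that's your instant fast-busy.]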
And depending on where you live, that's generally where it stands today. If you're lucky enough to be close to a Central Office and have copper that runs from the analog card in the C.O. switch all the way to your house, you're probably in the minority these days. Surprisingly, your chances of having this are BETTER if you're rural, as long as you're not serviced by an ANALOG CO switch... not many of those beasts left today, but they were common in the Western U.S. even into the late 1980's. Suburbanites in new neighborhoods? Forget it. You're getting crammed through a mux somewhere.

Stuff like V.90 is so sensitive that it not only requires the end-user not be routed through a tight mux, it also really requires the head-end (ISP) modem bank be fed directly with digital (usually a T1, or in large deployments a multiplexed DS-3) so the fewest number of A/D conversions take place. (The spec calls for ONLY ONE A/D conversion in the ENTIRE path from end-user to modem pool for maximum performance. If the ISP claims "V.90 modems" and you take a look in their POP and it's an analog modem bank with a bunch of RJ11's for connectivity... beware. You'll never get full data rate out of a connection to it. Ever. Not physically possible.)

Take a breath for some air if you made it this far...

Next, you start to realize this entire telco network is synchronous and has to be clocked very accurately, and you can see clearly where many of the wonderful advances we all see today in oscillators, "elastic buffers" and other fun stuff came from. You can also see how Europe's E1 standard evolved slightly later than the T1, and how it was less expensive for telcos in Europe to "just start with" a broader pipe that took advantage of the better clock sources available at the time.

[Don't even get me started on ISDN... great ideas, way ahead of its time... died a slow and painful death because it was too expensive to deploy into the old network. Oh, I do love seeing it relabeled as "iDSL" these days, though, when you're too far out for regular DSL technology: they take an ISDN chipset, run it in a raw 144Kb/s data mode with alarming and no signalling, and call it "DSL"... heh. Awesome marketing!]

Boy, we're up to geek party time now! Whoo hoo. Hey! Who robbed my bits!

The "generic" claim from most telcos that they'll only "guarantee" 9600 bps is silly -- none of the technologies they've widely deployed have ever really stolen so much audio quality from the line in muxing it down that 9600 is the best the line will do. But line noise and other contributing factors made them all confer with their lawyers and come up with the 9600 bps claims -- it's a Cover Your Ass(ets) type of thing. :-)

So the circle of life goes around, and now the telco folks patch a DSLAM card onto your somewhat beleaguered little analog line and use up that "overhead" that was always there... all that beautiful analog copper wire bandwidth that "no one" was using is again in use. ;-)

Throw in fun like the switch from D4 signalling on T1's to ESF (D4 -- alarm and signalling bits are robbed from bits inside the audio timeslices themselves; ESF -- the alarm bits are moved out of the audio frames into the framing overhead), etc. etc. etc. It's all very "evolutionary", with some of the analog-era technologies clashing at the very end of the timeline of analog telco.
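(And since we're on the subject of robbed bits, here's the usual simplified arithmetic behind the "56K" number itself -- robbed-bit signalling means a modem can't trust the least significant bit of the PCM samples, so it only gets 7 of the 8 bits per sample. Same back-of-the-envelope style as before:)

samples_per_second = 8000   # the standard telco sampling rate
usable_bits = 7             # LSB written off because of robbed-bit signalling
print(samples_per_second * usable_bits)   # 56000 -- the famous "56K"
# Line noise, extra A/D conversions, and transmit power limits all shave
# that down further in practice, which is why nobody ever sees 56000.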
Imagine if you will what happens to a sensitive analog signal like a 56K modem, in the early days of such modems when D4 was prevalent... you rob bits out and set them to ones when they're supposed to be zeros, and things sound a little different on the far end... eh? Modem doesn't like that so much. (Seen it in the lab... not happy at ALL.) Luckily D4 spans are almost a thing of the past... anyone using them for trunking anymore should be hung up by their shoelaces and given twenty lashes with a wet noodle... unless there's some god-awful outside plant that still has line-powered repeaters somewhere?! EEEEEK.

Now if I could just get Qwest convinced that I don't NEED analog telco service to have DSL service on the line, and I could switch to using something like Vonage (http://www.vonage.com) for any analog phone line desires I have... life would be good. About time to fire up the pen and scribble off a note to the local Public Utilities Commission stating that Qwest's rule that one must have dial-tone to have DSL is outdated and holding back proper competition in the local dial-tone market! :-)

Ain't all this stuff FUN? ;-)

Telco geek for many years turned Unix geek, but still love telco, as it's been such a cool "natural" progression of technology for 30 years...

Nate Duehr, nate@natetech.com