>>Sometimes it takes a half of one clock to get a transition, and
>>other times it takes 1, or even 1 and a half clocks before you get a
>>signal transition. This makes it difficult to recover the clock
>>that you need to decode the data.
>
>This is not Manchester data then. In your earlier post, you said
>correctly that "manchester data has a transition in the center of each
>bit". There is either 1/2 or one full bit time between transitions.
>However, special sequences which violate the protocol (missing a required
>transition) are often used for sync marker bits. The decoder would need
>extra logic to detect these.

You're correct. I got it confused with MFM. I believe MFM is the scheme I
described earlier regarding the 3 carrier frequencies. I remember playing
around with it, but the terminology is not fresh in my mind.

>This is the basis of RLL encoding. It uses a basic time unit of 1/3 bit
>time, with a guarantee of at least 2 but not more than 7 basic time units
>with no transition after each transition. Conforming to these rules, it
>is possible to store 8 bits every 16 basic time units. To record on the
>same hardware that previously used MFM, every 3 basic time units is
>equivalent to 1 MFM bit. Thus the overall bit rate is 1.5 times that
>possible from MFM. This format is used on nearly all modern hard disks
>as well as CD-ROMs (On CD, 17 basic time units store 8 bits. The extra
>time is supposedly used to "reduce DC content").

Indeed, RLL is one type of Group Code Recording. The lookup tables ensure
that you never go too long without a transition, and never get transitions
packed too close together. I'll try to come up with a better explanation
of GCR.

As others have noted, Manchester itself is not so hard to decode: you are
guaranteed one transition for every bit of data. That must be why it's
used on TV remotes - not necessarily because it's efficient, but because
it's easy to work with.

Eric Engler
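
To make the "easy to work with" point concrete, here's a minimal sketch of
Manchester encode/decode in C. It assumes the half-bit clock has already
been recovered, so each array element is one half-bit-time sample. The
polarity convention (0 -> high-then-low, 1 -> low-then-high) is just one of
the two common ones, and all names here are illustrative, not from any
particular decoder:

    #include <stdio.h>

    /* Encode nbits data bits (MSB first from each byte) into 2*nbits
     * half-bit levels (0 or 1). */
    void manchester_encode(const unsigned char *data, int nbits,
                           unsigned char *levels)
    {
        for (int i = 0; i < nbits; i++) {
            int bit = (data[i / 8] >> (7 - (i % 8))) & 1;
            levels[2 * i]     = bit ? 0 : 1;  /* first half-bit            */
            levels[2 * i + 1] = bit ? 1 : 0;  /* second half always differs */
        }
    }

    /* Decode 2*nbits half-bit levels back into data bits.
     * Returns 0 on success, -1 if a bit cell has no mid-bit transition
     * (a real decoder might treat that as a sync mark rather than an
     * error, as noted above). */
    int manchester_decode(const unsigned char *levels, int nbits,
                          unsigned char *data)
    {
        for (int i = 0; i < nbits; i++) {
            unsigned char a = levels[2 * i], b = levels[2 * i + 1];
            if (a == b)
                return -1;                    /* missing transition */
            int bit = (a == 0 && b == 1) ? 1 : 0;
            if (bit)
                data[i / 8] |=  (1 << (7 - (i % 8)));
            else
                data[i / 8] &= ~(1 << (7 - (i % 8)));
        }
        return 0;
    }

    int main(void)
    {
        unsigned char msg[] = { 0xA5 };       /* 10100101 */
        unsigned char levels[16];
        unsigned char out[1] = { 0 };

        manchester_encode(msg, 8, levels);
        for (int i = 0; i < 16; i++)
            printf("%d", levels[i]);
        printf("\n");

        if (manchester_decode(levels, 8, out) == 0)
            printf("decoded: 0x%02X\n", out[0]);
        return 0;
    }

The decode loop also shows why the sync-mark trick mentioned above works:
a pair of equal half-bit levels can never come from normal data, so it is
unambiguous as a marker.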