Hey guys, I was talking to a friend of mine about the DD-1 the other day, and we stumbled upon the fact that it only measures distortion at 40 Hz. Here's what got us thinking: I have an RF T2500 bdcp amplifier, which comes with an output meter, so I thought I could trust it to at least tell me if I'm near clipping. I used a 40 Hz test tone to set everything by ear and eye (using the output needle). It sounded pretty loud at that frequency and didn't look distorted, so I was happy for the day, went to bed, and slept.

The following day I was listening at a lower volume, 15 (I had set my gains at volume 25), to some low-bass music (Woofer Cooker, Decaf, etc.), and I noticed the bass was extremely intense. I took a look at the output needle and it was ONLY red (maximum output). I was very confused. If I apply the same principle the DD-1 uses, frequency shouldn't make that much difference... I was probably very near clipping at volume 15 even though I'd set my gains at 25. If I had done the same with the DD-1, the same thing would have happened.

Am I right, or can the RF's output needle not be trusted? So... why 40 Hz for setting gains with the DD-1, and does that really work? I mean, couldn't I have a clean signal at 40 Hz and a clipped signal at 28 Hz? Thanks
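For anyone curious about the underlying idea: a clipped sine wave gets its tops flattened, and that flattening shows up as harmonic energy (multiples of the test-tone frequency) that a clean sine doesn't have, which is the kind of thing a distortion detector keys on. Here's a rough Python sketch of that principle. This is NOT how the DD-1 is actually implemented, just an illustration using a hard clip and an FFT; the 1.5x overdrive factor and the 5-harmonic THD sum are arbitrary choices for the demo.

```python
import numpy as np

def thd(signal, fs, f0, n_harmonics=5):
    """Rough THD estimate via FFT: harmonic energy relative to the
    fundamental, as a percentage."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)

    def peak(f):
        # magnitude near the bin closest to f (tolerate slight smearing)
        idx = int(np.argmin(np.abs(freqs - f)))
        return spectrum[max(idx - 2, 0):idx + 3].max()

    fundamental = peak(f0)
    harmonics = [peak(f0 * k) for k in range(2, 2 + n_harmonics)]
    return 100 * np.sqrt(sum(h ** 2 for h in harmonics)) / fundamental

fs = 48_000
t = np.arange(fs) / fs                    # one second of samples
tone = np.sin(2 * np.pi * 40 * t)         # 40 Hz test tone

clean = tone                              # stays inside the "rails"
clipped = np.clip(1.5 * tone, -1, 1)      # driven past the rails: flat tops

print(f"clean THD:   {thd(clean, fs, 40):.2f}%")
print(f"clipped THD: {thd(clipped, fs, 40):.2f}%")
```

The clipped tone shows a much higher THD than the clean one, because the flat tops inject odd harmonics at 120 Hz, 200 Hz, and so on. Note this only tells you about distortion at the frequency you tested; it says nothing by itself about what happens at other frequencies or with bass-boosted program material, which is essentially the question above.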