Table 3. Frequency Accuracy.

  Test instrument        Frequency counter indication limits (MHz)
  frequency (MHz)             Min                      Max
      0.400                  0.399999                 0.400001
    111.11111              111.111109               111.111111
    122.22222              122.222219               122.222221
    133.33333              133.333329               133.333331
    144.44444              144.444439               144.444441
    155.55555              155.555549               155.555551
    166.66666              166.666659               166.666661
    177.77777              177.777769               177.777771
    188.88888              188.888879               188.888881
    500.00000¹             499.999999               500.000001
   1050.00000             1049.999999              1050.000001

¹ Move connection on frequency counter from A input to C input and set
  frequency counter for input C.
(11) Set TI output to minimum and disconnect equipment setup.
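The limits listed in Table 3 are consistent with a tolerance of ±0.000001 MHz (±1 Hz) about each test frequency. The short Python sketch below is not part of the TB procedure; it simply regenerates the Min/Max columns under that ±1 Hz assumption, with the nominal frequencies taken from the table itself.

```python
# Sketch only: reproduce the Table 3 indication limits assuming each limit
# is the nominal test frequency +/- 0.000001 MHz (+/- 1 Hz).  The +/- 1 Hz
# interpretation is inferred from the listed values, not stated in the TB.
from decimal import Decimal

NOMINALS_MHZ = [
    Decimal("0.400"), Decimal("111.11111"), Decimal("122.22222"),
    Decimal("133.33333"), Decimal("144.44444"), Decimal("155.55555"),
    Decimal("166.66666"), Decimal("177.77777"), Decimal("188.88888"),
    Decimal("500.00000"), Decimal("1050.00000"),
]
TOL_MHZ = Decimal("0.000001")  # 1 Hz expressed in MHz

for nominal in NOMINALS_MHZ:
    low = nominal - TOL_MHZ    # minimum acceptable counter indication
    high = nominal + TOL_MHZ   # maximum acceptable counter indication
    print(f"{nominal} MHz: min {low} MHz, max {high} MHz")
```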
b. Adjustments
(1) Press TI HELP SETUP, [Setup], [Calibrate] keys.
NOTE
If the message Locked appears adjacent to the [Calibrate]
key, the calibration function must be unlocked before access
can be obtained. Press [Calibrate]. The message ENTRY CODE
will appear. Key in the digits 2, 9, 4, 5, then press the
[dBm/ENTER] key. If the procedure has been carried out
correctly, the Locked message will be removed and the
[Calibrate] key will now be active. Press [Calibrate].
(2) Press TI [Freq Std].
(3) Connect the time/frequency workstation standard 10 MHz output to signal
generator frequency reference input.
(4) Connect signal generator output to TI N-type RF connector and set signal generator
for a 1 GHz, 0 dBm output.
(5) TI will display an offset reading at bottom of display.
(6) Use TI front panel variable control knob to adjust displayed offset as close as
possible to 0 Hz. (TI [ ], [ ] keys switch between coarse and fine adjustment).
(7) When TI offset indication is set as close as possible to 0 Hz, press [Return],
[Store Cal] (see the offset example following step (8)).
(8) Set all outputs to minimum and disconnect equipment setup.
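For context on the adjustment in steps (4) through (7): the offset displayed by the TI is measured against the 1 GHz input, so dividing the displayed offset by 1 GHz gives the fractional error of the TI frequency standard. The sketch below is illustrative only and not part of the TB procedure; the residual offset value used is an assumed example.

```python
# Sketch only: convert a residual offset left after step (7) into a
# fractional frequency error and show how it scales to another carrier.
REFERENCE_HZ = 1.0e9  # step (4) applies a 1 GHz, 0 dBm signal to the TI

def fractional_error(offset_hz: float, reference_hz: float = REFERENCE_HZ) -> float:
    """Fractional frequency error = displayed offset / reference frequency."""
    return offset_hz / reference_hz

residual_offset_hz = 0.1  # assumed example of a leftover offset
ffe = fractional_error(residual_offset_hz)
print(f"fractional error: {ffe:.1e}")             # 1.0e-10
print(f"error at 500 MHz: {ffe * 500e6:.3f} Hz")  # 0.050 Hz
```

At that fractional error, even the highest test frequency in Table 3 (1050 MHz) would shift by only about 0.1 Hz, well inside the ±1 Hz indication limits implied by the table.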