ADC Quantization Simulation
David Young edited this page Apr 30, 2025 · 3 revisions
This process simulates the effects of an Analog-to-Digital Converter (ADC) on the rendered complex signal data within a specific time window during HDF5 output generation. The simulation is primarily controlled by the `params::adcBits()` global parameter, which defines the number of quantization bits. Key aspects modeled include scaling/normalization of the signal to the ADC's dynamic range and clamping of values that exceed this range (representing saturation). Setting `params::adcBits()` to 0 disables this simulation step, resulting only in normalization.
- Assumes an ideal ADC model with perfectly uniform quantization steps.
- Assumes the `params::adcBits()` parameter correctly defines the desired quantization resolution.
- Assumes simple clamping of values outside the calculated dynamic range accurately represents ADC saturation behavior.
- Assumes the method used for estimating the ADC's full-scale range (based on the maximum absolute value `max_value` found within the current window) is appropriate for accurately representing the signal's dynamic range relative to the ADC limits.
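The scale/clamp/round sequence described above can be sketched as follows. This is an illustrative sketch only, not the actual `quantizeWindow` implementation; the function name, signature, and symmetric signed-level convention are assumptions (the `bits == 0` normalization-only case is assumed to be handled by the caller):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Hypothetical sketch of an ideal uniform quantizer with saturation.
// fullscale is the peak absolute value (max_value) of the current window.
// Assumes bits >= 1; bits == 0 (normalization only) is handled elsewhere.
std::vector<double> quantize_window(const std::vector<double>& samples,
                                    unsigned bits, double fullscale)
{
    // 2^(bits-1) - 1 positive levels for a signed, symmetric ADC model.
    const double levels = std::pow(2.0, bits - 1) - 1.0;
    std::vector<double> out;
    out.reserve(samples.size());
    for (double s : samples) {
        // Scale to the ADC range, clamp (saturation), round to the
        // nearest level, then rescale to normalized amplitude.
        double scaled = std::clamp(s / fullscale * levels, -levels, levels);
        out.push_back(std::round(scaled) / levels);
    }
    return out;
}
```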
- Ideal Model: The simulation models only an ideal uniform quantizer.
- Ignored Impairments: Does not account for common real-world ADC impairments such as:
- Differential Non-Linearity (DNL)
- Integral Non-Linearity (INL)
- ADC's own noise contribution (Noise Figure)
- Spurious signals generated by the ADC (Spurs)
- Effects of sampling clock jitter
- Peak-Based Scaling: The full-scale value for quantization is determined by the maximum absolute signal value found within the current processing window. For signals with a high Peak-to-Average Ratio (PAR), this can lead to suboptimal quantization of the lower-power portions of the signal, potentially degrading the effective signal-to-noise ratio (SNR).
- Processing Stage: This effect is applied during the HDF5 output generation phase, specifically after downsampling (if any) and after phase noise simulation (if enabled). See Output Generation Effects.
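The peak-based scaling limitation noted above can be demonstrated numerically. The following sketch (illustrative only, not FERS code) quantizes a unit-amplitude sine to 8 bits and measures the resulting quantization SNR for a given full-scale value, as if a single high-PAR spike in the window had inflated `max_value`:

```cpp
#include <cmath>

// Quantization SNR (dB) of a unit sine under an ideal 8-bit signed
// quantizer whose full scale is set to the given value. Illustrative
// sketch of the peak-based scaling effect, not the FERS implementation.
double quant_snr_db(double fullscale)
{
    const double kPi = 3.14159265358979323846;
    const double levels = std::pow(2.0, 7) - 1.0; // 8-bit signed model
    double sig = 0.0, err = 0.0;
    for (int n = 0; n < 1000; ++n) {
        const double s = std::sin(2.0 * kPi * n / 100.0);
        // Quantize against fullscale, then map back to signal units.
        const double q = std::round(s / fullscale * levels) / levels * fullscale;
        sig += s * s;
        err += (q - s) * (q - s);
    }
    return 10.0 * std::log10(sig / err);
}
```

Raising the effective full scale to 10x the signal's own peak coarsens the step size tenfold, costing roughly 20 dB of quantization SNR, which is the degradation a high-PAR window would impose on its lower-power content.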
- Normalization (`adcBits = 0`): If quantization is disabled (`adcBits = 0`), the output signal is normalized by the maximum absolute value in the window. This results in a loss of absolute amplitude information in the output data unless the calculated maximum value (`max_value`) is explicitly saved or otherwise accounted for in post-processing.
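A minimal sketch of this normalization-only path, assuming in-place division by the window peak (the function name and in-place convention are illustrative, not the actual code):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Sketch of the adcBits == 0 path: normalize the window by its peak
// absolute value. The returned max_value must be saved separately if
// absolute amplitude is to be recovered in post-processing.
double normalize_window(std::vector<double>& window)
{
    double max_value = 0.0;
    for (double s : window)
        max_value = std::max(max_value, std::abs(s));
    if (max_value > 0.0)
        for (double& s : window)
            s /= max_value;
    return max_value;
}
```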
- Code: `receiver_export.cpp` (specifically functions `quantizeWindow` and `adcSimulate`)
- Caller: `receiver_export.cpp::exportReceiverBinary`
- Parameter: `params::adcBits()` (global simulation setting)
- Needs Verification: The implementation requires specific testing to confirm its behavior matches the intended ideal model.
- Key Areas for Validation:
- Verify that the number of output levels corresponds correctly to the specified `params::adcBits`.
- Confirm that quantization steps are uniform.
- Test the clamping mechanism (saturation) for values exceeding the calculated full-scale range.
- Check the behavior when `adcBits` is 0 (normalization).
- Assess the impact of the peak-based scaling on signals with different PARs (e.g., comparing the quantization noise floor for a sine wave vs. a noise-like signal scaled to the same peak).
- Confirm the processing order relative to phase noise addition and downsampling within the HDF5 export workflow.
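The first validation item (level count) can be checked with a simple harness like the one below, which drives an ideal symmetric B-bit quantizer with a dense ramp and counts distinct output codes. The quantizer model here is a stand-in assumption, not the actual FERS code under test:

```cpp
#include <cmath>
#include <cstddef>
#include <set>

// Count distinct output codes produced by an ideal symmetric signed
// quantizer with the given bit depth, driven by a dense ramp spanning
// the full input range [-1, 1]. Expected: 2^bits - 1 codes for this
// symmetric model (e.g., 15 for 4 bits). Assumes bits >= 1.
std::size_t count_levels(unsigned bits)
{
    const double levels = std::pow(2.0, bits - 1) - 1.0;
    std::set<long> observed;
    for (int i = -1000; i <= 1000; ++i)
        observed.insert(std::lround(i / 1000.0 * levels));
    return observed.size();
}
```

The same pattern extends to the uniformity check: collect the sorted distinct reconstruction values and assert that consecutive differences are equal to within floating-point tolerance.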
- Priority: Medium