
AttributeError: 'TruncationOpManagerInference' object has no attribute 'quantizers' #8

Open
jinz2014 opened this issue Jan 8, 2020 · 2 comments

Comments

@jinz2014

jinz2014 commented Jan 8, 2020

Can you please explain what needs to be changed to fix the following error? Thank you.

python inference/inference_sim.py -a resnet50 -b 512
/home/user/anaconda3/lib/python3.7/site-packages/yaml/constructor.py:126: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3, and in 3.9 it will stop working
if not isinstance(key, collections.Hashable):
=> using pre-trained model 'resnet50'
Perform BN folding
Traceback (most recent call last):
  File "inference/inference_sim.py", line 381, in <module>
    im = InferenceModel(ml_logger)
  File "inference/inference_sim.py", line 194, in __init__
    QM().quantize_model(self.model)
  File "/home/user/ML/cnn-quantization/inference/../pytorch_quantizer/quantization/inference/inference_quantization_manager.py", line 365, in quantize_model
    weight_q = QMI().quantize_instant(m.weight, n + '.weight', "weight", override_att=('num_bits', 8), verbose=True)
  File "/home/user/ML/cnn-quantization/inference/../pytorch_quantizer/quantization/inference/inference_quantization_manager.py", line 343, in quantize_instant
    return self.op_manager.quantize_instant(tensor, id, tag, stat_id, half_range, override_att, verbose)
  File "/home/user/ML/cnn-quantization/inference/../pytorch_quantizer/quantization/inference/inference_quantization_manager.py", line 556, in quantize_instant
    q = self.get_quantizer(qtag)
  File "/home/user/ML/cnn-quantization/inference/../pytorch_quantizer/quantization/inference/inference_quantization_manager.py", line 510, in get_quantizer
    if tag in self.quantizers:
AttributeError: 'TruncationOpManagerInference' object has no attribute 'quantizers'
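
For what it's worth, here is a minimal Python sketch (not the repo's actual code) of the failure mode the traceback points at: get_quantizer does a membership test on self.quantizers, but that attribute apparently never gets created on TruncationOpManagerInference under this invocation. The class name, constructor flag, and placeholder quantizer below are hypothetical; the point is only that initializing the dict unconditionally in __init__ (or guarding with hasattr) avoids the AttributeError.

# Illustrative sketch only -- not the cnn-quantization implementation.
# It reproduces the pattern from the traceback: get_quantizer() reads
# self.quantizers, which may never have been assigned on some code paths.

class TruncationOpManagerSketch:
    def __init__(self, enable_quantization=False):
        # Hypothetical fix: always create the dict, even when quantization
        # is disabled, so lookups return None instead of raising.
        self.quantizers = {}
        if enable_quantization:
            # Placeholder "quantizer": identity function, for illustration only.
            self.quantizers['weight'] = lambda t: t

    def get_quantizer(self, tag):
        # Mirrors the failing line from the traceback: `if tag in self.quantizers:`
        if tag in self.quantizers:
            return self.quantizers[tag]
        return None


if __name__ == "__main__":
    mgr = TruncationOpManagerSketch(enable_quantization=False)
    print(mgr.get_quantizer('weight'))  # prints None instead of raising AttributeError

In the real code the attribute is presumably set up somewhere during manager initialization, so the error likely means that setup step was skipped for this configuration (or the environment it runs in), rather than that the lookup itself is wrong.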

@bangawayoo

I am having the same issue.
Did you find a solution?

@bangawayoo

I just found that the command works fine when I run it in a terminal.
Hope this helps.
