
Extent of purpose for quantization-aware-training for weights in aihwkit? #528

Answered by maljoras
arseniivanov asked this question in Q&A
Hi @arseniivanov,
many thanks for raising this interesting point. In the first paper, hardware-aware training is done with an analog representation of the weights in mind: each weight is directly encoded in the conductance values, using only one pair of resistive elements (one for the positive and one for the negative part of the weight). It therefore makes no sense to use QAT methods, since there is no quantization of the weights in this case. There are only noise and limited weight ranges, because writing and reading out the resistive values is subject to noise and non-idealities, as described in the first paper. This is very different from the QAT situation in digital, where weights…
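To make the contrast concrete, here is a minimal numpy sketch (not aihwkit code; all names and parameter values are illustrative assumptions). QAT-style fake quantization snaps weights to a discrete grid, whereas the hardware-aware perturbation described above keeps weights continuous but clips them to a limited conductance range and adds Gaussian programming/read noise:

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.uniform(-1.5, 1.5, size=5)  # illustrative weights

def fake_quantize(w, n_bits=4, w_max=1.0):
    """QAT-style fake quantization (digital): snap weights to a discrete grid."""
    levels = 2 ** (n_bits - 1) - 1
    step = w_max / levels
    return np.clip(np.round(w / step), -levels, levels) * step

def analog_perturb(w, w_max=1.0, noise_std=0.05, rng=rng):
    """Hardware-aware perturbation (analog): no discrete levels, but a
    limited weight (conductance) range plus additive programming/read noise."""
    w = np.clip(w, -w_max, w_max)
    return w + rng.normal(0.0, noise_std, size=w.shape)

wq = fake_quantize(w)
wa = analog_perturb(w)

# Quantized weights take only a few discrete values;
# analog-perturbed weights remain continuous (all distinct).
print(len(np.unique(wq)), len(np.unique(wa)))
```

During hardware-aware training the analog perturbation would be applied in the forward pass (so the network learns robustness to it), much as QAT applies fake quantization in the forward pass; the difference is that there are no discrete levels to learn around, only noise and range limits.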

Answer selected by arseniivanov