How to keep the HG weights non-negative: the truncated Perceptron reweighing rule

Giorgio Magri, UMR 7023 SFL (CNRS, University of Paris 8), France


The literature on error-driven learning in Harmonic Grammar (HG) has adopted the Perceptron reweighing rule. Yet this rule is ill-suited to HG, as it fails to ensure non-negative weights. A variant is thus considered that truncates the updates at zero, thereby keeping the weights non-negative. The convergence guarantees and error bounds of the original Perceptron are shown to extend to this truncated variant.
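The truncated update described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: it assumes violations are encoded as per-constraint counts for the intended winner and for the learner's current (incorrect) output, and the function name and learning rate are illustrative. The standard Perceptron step moves each weight in the direction of the loser's violations minus the winner's; truncation simply clips the result at zero.

```python
def truncated_perceptron_update(weights, winner_viols, loser_viols, rate=1.0):
    """One truncated Perceptron reweighing step (illustrative sketch).

    weights       -- current non-negative constraint weights
    winner_viols  -- violation counts of the intended winner
    loser_viols   -- violation counts of the learner's wrong output
    """
    updated = []
    for w, v_win, v_lose in zip(weights, winner_viols, loser_viols):
        # Promote constraints violated more by the loser, demote those
        # violated more by the winner; then truncate at zero so the
        # weight can never go negative.
        step = rate * (v_lose - v_win)
        updated.append(max(0.0, w + step))
    return updated


# Example: the second constraint is promoted; the first would be driven
# to -1.0 by the plain Perceptron rule, but truncation keeps it at 0.0.
new_weights = truncated_perceptron_update([1.0, 0.5], [2, 0], [0, 1])
```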


Keywords: Harmonic Grammar; error-driven learning; Perceptron; convergence




ISSN of the paper edition: 2299-856X