How to keep the HG weights non-negative: the truncated Perceptron reweighing rule


Giorgio Magri, UMR 7023 SFL (CNRS, University of Paris 8), France

Abstract


The literature on error-driven learning in Harmonic Grammar (HG) has adopted the Perceptron reweighing rule. Yet, this rule is not suited to HG, as it fails to ensure non-negative weights. A variant is therefore considered that truncates the updates at zero, keeping the weights non-negative. Convergence guarantees and error bounds for the original Perceptron are shown to extend to this truncated variant.
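
The abstract describes the core idea of the truncation step: apply the usual Perceptron-style update on an error and then clip any negative weight back to zero. The following Python sketch illustrates one such update under simple assumptions (constraint violations encoded as numeric vectors, a single learning rate); the function name and toy numbers are illustrative and not the paper's own notation.

```python
import numpy as np

def truncated_perceptron_update(weights, winner_violations, loser_violations, rate=1.0):
    """One error-driven update: raise the weights of constraints violated more by
    the learner's wrong output (the loser) than by the intended winner, lower the
    others, then clip at zero so every weight stays non-negative (the truncation)."""
    update = rate * (loser_violations - winner_violations)
    return np.maximum(0.0, weights + update)

# Hypothetical toy example with three constraints.
weights = np.array([2.0, 0.5, 1.0])
winner = np.array([0, 1, 0])   # violations incurred by the intended winner
loser  = np.array([1, 0, 0])   # violations incurred by the learner's wrong output
weights = truncated_perceptron_update(weights, winner, loser)
print(weights)  # [3. 0. 1.] -- the second weight is truncated at zero
```

Without the final clipping step this is just the standard Perceptron reweighing rule; the clipping is what keeps the weight vector inside the non-negative region required by HG.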


Keywords


Harmonic Grammar; error-driven learning; Perceptron; convergence



DOI: http://dx.doi.org/10.15398/jlm.v3i2.115

ISSN of the paper edition: 2299-856X