How to keep the HG weights non-negative: the truncated Perceptron reweighing rule
Keywords:
Harmonic Grammar, error-driven learning, Perceptron, convergence

Abstract
The literature on error-driven learning in Harmonic Grammar (HG) has adopted the Perceptron reweighing rule. Yet this rule is not well suited to HG, because it fails to ensure non-negative weights. A variant is therefore considered that truncates the updates at zero, keeping the weights non-negative. Convergence guarantees and error bounds for the original Perceptron are shown to extend to its truncated variant.
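The update rule the abstract refers to can be illustrated with a minimal Python sketch, assuming a representation of candidates as constraint-violation vectors; the function and variable names (truncated_perceptron_update, winner_violations, loser_violations, rate) are hypothetical and not taken from the paper. The idea is the standard Perceptron update on the violation difference between loser and winner, with each resulting weight truncated at zero so it never goes negative.

```python
def truncated_perceptron_update(weights, winner_violations, loser_violations, rate=1.0):
    """One error-driven update: promote constraints violated more by the
    (wrongly produced) loser, demote those violated more by the intended
    winner, and truncate each weight at zero to keep it non-negative."""
    new_weights = []
    for w, win_v, lose_v in zip(weights, winner_violations, loser_violations):
        # Standard Perceptron-style update on the violation difference ...
        updated = w + rate * (lose_v - win_v)
        # ... truncated at zero, keeping the HG weights non-negative.
        new_weights.append(max(0.0, updated))
    return new_weights

# Example: three constraints; the winner violates the first,
# the loser violates the last two.
weights = [1.0, 0.5, 0.0]
print(truncated_perceptron_update(weights, [1, 0, 0], [0, 1, 1]))
# -> [0.0, 1.5, 1.0]
```

Without the call to max, the first weight would become negative after this update; the truncation is the only change relative to the plain Perceptron rule.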
DOI:
https://doi.org/10.15398/jlm.v3i2.115
License
Copyright (c) 2015 Giorgio Magri
This work is licensed under a Creative Commons Attribution 3.0 Unported License.