How to keep the HG weights non-negative: the truncated Perceptron reweighing rule

Authors

  • Giorgio Magri, UMR 7023 SFL (CNRS, University of Paris 8)

Keywords

Harmonic Grammar, error-driven learning, Perceptron, convergence

Abstract

The literature on error-driven learning in Harmonic Grammar (HG) has adopted the Perceptron reweighing rule. Yet this rule is not suited to HG, as it fails to ensure non-negative weights. A variant is thus considered which truncates the updates at zero, keeping the weights non-negative. Convergence guarantees and error bounds for the original Perceptron are shown to extend to its truncated variant.
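The truncation idea described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the paper's own code: it assumes the common HG conventions that constraint violations are counted as non-negative numbers, that an error-driven update promotes constraints violated more by the loser and demotes those violated more by the winner, and that truncation simply clamps each resulting weight at zero. The function name and learning rate are hypothetical.

```python
def truncated_perceptron_update(weights, winner_viols, loser_viols, rate=1.0):
    """One error-driven update on the constraint weights.

    The plain Perceptron rule adds rate * (loser - winner) violation
    differences to each weight; the truncated variant clamps each
    updated weight at zero so the weights stay non-negative.
    """
    return [max(0.0, w + rate * (l - wv))
            for w, wv, l in zip(weights, winner_viols, loser_viols)]

# Example: the second constraint would go negative under the plain
# Perceptron update; truncation clamps it at zero instead.
weights = [2.0, 0.5, 1.0]
updated = truncated_perceptron_update(weights,
                                      winner_viols=[0, 2, 1],
                                      loser_viols=[1, 1, 1])
print(updated)  # [3.0, 0.0, 1.0]
```

Without the `max(0.0, ...)` clamp, the second weight would become -0.5, which is not a legitimate HG weight; the truncated rule keeps every weight in the admissible non-negative region after each update.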

DOI

https://doi.org/10.15398/jlm.v3i2.115

Published

2015-12-07

How to Cite

Magri, G. (2015). How to keep the HG weights non-negative: the truncated Perceptron reweighing rule. Journal of Language Modelling, 3(2), 345–375. https://doi.org/10.15398/jlm.v3i2.115

Section

Articles