
Lund University Publications

LUND UNIVERSITY LIBRARIES

When Errors Become the Rule: Twenty Years with Transformation-Based Learning

Uneson, Marcus (2014) In ACM Computing Surveys 46(4). p. 50-51
Abstract
Transformation-based learning (TBL) is a machine learning method for, in particular, sequential classification, invented by Eric Brill [Brill 1993b, 1995a]. It is widely used within computational linguistics and natural language processing, but surprisingly little in other areas.



TBL is a simple yet flexible paradigm, which achieves competitive or even state-of-the-art performance in several areas and does not overtrain easily. It is especially successful at catching local, fixed-distance dependencies and seamlessly exploits information from heterogeneous discrete feature types. The learned representation—an ordered list of transformation rules—is compact and efficient, with clear semantics. Individual rules are interpretable and often meaningful to humans.



The present article offers a survey of the most important theoretical work on TBL, addressing a perceived gap in the literature. Because the method should be useful also outside the world of computational linguistics and natural language processing, a chief aim is to provide an informal but relatively comprehensive introduction, readable also by people coming from other specialities.
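To make the method concrete for readers from other specialities, the transformation-rule mechanism the abstract describes can be sketched in a few lines: start from a baseline labeling, then greedily learn rules of the form "change label X to Y in context C" that most reduce error against the gold labels, and apply the learned rules in order. The sketch below is not from the surveyed article; the single rule template (condition on the previous tag), the tag names, and the function names are all illustrative, and real Brill-style learners use many templates and far more efficient rule scoring.

```python
# Minimal sketch of transformation-based learning (Brill-style).
# Illustrative only: rules have the form
#   (FROM, TO, PREV) = "change tag FROM -> TO when the previous tag is PREV".

def apply_rule(tags, rule):
    """Apply one transformation rule left-to-right over a tag sequence."""
    frm, to, prev = rule
    out = list(tags)
    for i in range(1, len(out)):
        if out[i] == frm and out[i - 1] == prev:
            out[i] = to
    return out

def errors(tags, gold):
    """Count positions where the current tagging disagrees with the gold tags."""
    return sum(t != g for t, g in zip(tags, gold))

def learn_rules(tags, gold, max_rules=10):
    """Greedily pick the rule that most reduces error; stop when no rule helps."""
    tags = list(tags)
    learned = []
    for _ in range(max_rules):
        base = errors(tags, gold)
        # Candidate rules are read off the current mistakes (a position i > 0
        # that is wrongly tagged suggests "fix it, conditioned on its left tag").
        candidates = {
            (tags[i], gold[i], tags[i - 1])
            for i in range(1, len(tags))
            if tags[i] != gold[i]
        }
        best, best_err = None, base
        for rule in candidates:
            e = errors(apply_rule(tags, rule), gold)
            if e < best_err:
                best, best_err = rule, e
        if best is None:
            break
        learned.append(best)          # the ordered rule list IS the model
        tags = apply_rule(tags, best)
    return learned, tags

# Toy example: a baseline tagger mislabels one verb as a noun.
gold     = ["DT", "NN", "VB", "DT", "NN"]
baseline = ["DT", "NN", "NN", "DT", "NN"]
rules, fixed = learn_rules(baseline, gold)
# rules -> [("NN", "VB", "NN")], i.e. "NN becomes VB after an NN", fixing the error.
```

The learned model is just the ordered list `rules`; each entry is directly readable, which is the interpretability property the abstract highlights.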
author
Uneson, Marcus
organization
publishing date
2014
type
Contribution to journal
publication status
published
subject
keywords
Artificial intelligence, Knowledge Representation Formalisms and Methods, Computational Linguistics, Natural Language Processing, Rule learning
in
ACM Computing Surveys
volume
46
issue
4
pages
50 - 51
publisher
Association for Computing Machinery (ACM)
external identifiers
  • wos:000336405900008
  • scopus:84901234598
ISSN
0360-0300
DOI
10.1145/2534189
language
English
LU publication?
yes
additional info
The information about affiliations in this record was updated in December 2015. The record was previously connected to the following departments: Linguistics and Phonetics (015010003)
id
b0e281ff-7ee2-4610-bb32-e6aa01c6bc68 (old id 4406798)
alternative location
http://dl.acm.org/authorize?6905043
date added to LUP
2016-04-01 10:53:05
date last changed
2023-08-31 13:59:47
@article{b0e281ff-7ee2-4610-bb32-e6aa01c6bc68,
  abstract     = {{Transformation-based learning (TBL) is a machine learning method for, in particular, sequential classification, invented by Eric Brill [Brill 1993b, 1995a]. It is widely used within computational linguistics and natural language processing, but surprisingly little in other areas.

TBL is a simple yet flexible paradigm, which achieves competitive or even state-of-the-art performance in several areas and does not overtrain easily. It is especially successful at catching local, fixed-distance dependencies and seamlessly exploits information from heterogeneous discrete feature types. The learned representation—an ordered list of transformation rules—is compact and efficient, with clear semantics. Individual rules are interpretable and often meaningful to humans.

The present article offers a survey of the most important theoretical work on TBL, addressing a perceived gap in the literature. Because the method should be useful also outside the world of computational linguistics and natural language processing, a chief aim is to provide an informal but relatively comprehensive introduction, readable also by people coming from other specialities.}},
  author       = {{Uneson, Marcus}},
  issn         = {{0360-0300}},
  keywords     = {{Artificial intelligence; Knowledge Representation Formalisms and Methods; Computational Linguistics; Natural Language Processing; Rule learning}},
  language     = {{eng}},
  number       = {{4}},
  pages        = {{50--51}},
  publisher    = {{Association for Computing Machinery (ACM)}},
  series       = {{ACM Computing Surveys}},
  title        = {{When Errors Become the Rule: Twenty Years with Transformation-Based Learning}},
  url          = {{http://dx.doi.org/10.1145/2534189}},
  doi          = {{10.1145/2534189}},
  volume       = {{46}},
  year         = {{2014}},
}