Desktop Survival Guide
by Graham Williams
Lazy Learning: While traditional learning algorithms compile data into abstractions in the process of inducing concept descriptions, lazy learning algorithms (also known as instance-based, memory-based, exemplar-based, experience-based, or case-based learning) delay this process and represent concept descriptions with the data itself. Lazy learning has its roots in disciplines such as pattern recognition, cognitive psychology, statistics, cognitive science, robotics, and information retrieval. It has received increasing attention in several AI disciplines as researchers have explored massively parallel approaches, cost sensitivity, matching algorithms for symbolic and structured data representations, formal analyses, rule extraction, variable selection, interaction with knowledge-based systems, integration with other learning and reasoning approaches, and numerous application-specific issues. Many reasons exist for this level of activity: lazy learning algorithms are relatively easy to present and analyse, are easily applied, perform promisingly on some measures, and are the basis for today's commercially popular case-based reasoning systems.
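A minimal sketch of the idea, using a 1-nearest-neighbour classifier (a common lazy learner, offered here as an illustration rather than as the book's own method): no abstraction is built from the data; the stored instances themselves are the concept description, and all computation is deferred to prediction time. The toy data and the use of Euclidean distance are illustrative assumptions.

```python
def nn_classify(train, query):
    """Return the label of the training instance closest to `query`.
    `train` is a list of ((feature, ...), label) pairs; the training
    data is stored as-is, with no model induced from it."""
    def sq_dist(a, b):
        # squared Euclidean distance between two feature tuples
        return sum((x - y) ** 2 for x, y in zip(a, b))
    # all the "learning" happens here, at query time
    _, label = min(train, key=lambda inst: sq_dist(inst[0], query))
    return label

# illustrative training instances: two labelled clusters
train = [((1.0, 1.0), "a"), ((1.2, 0.8), "a"), ((5.0, 5.0), "b")]
print(nn_classify(train, (0.9, 1.1)))  # query near the "a" cluster
```

Note that "training" costs nothing beyond storing the list; the price is paid at prediction time, when every stored instance is examined.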
Learning Curve: A plot of accuracy against training set size, often used to determine the smallest training set size for which adding extra instances leads to little, if any, improvement in accuracy.
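Reading the smallest adequate training set size off a learning curve can be sketched as follows; the sizes, accuracies, and tolerance below are illustrative assumptions, not values from the book.

```python
def plateau_size(curve, tol=0.01):
    """`curve` is a list of (training_size, accuracy) pairs sorted by
    size. Return the smallest size beyond which no later measurement
    improves accuracy by more than `tol`."""
    for i, (size, acc) in enumerate(curve):
        if all(later_acc - acc <= tol for _, later_acc in curve[i + 1:]):
            return size
    return curve[-1][0]

# an illustrative learning curve: accuracy flattens out after 400 instances
curve = [(100, 0.70), (200, 0.80), (400, 0.86), (800, 0.865), (1600, 0.868)]
print(plateau_size(curve))
```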
Leverage: A measure of interestingness in terms of the difference between the observed frequency of occurrence of items and the expected frequency if the items were independent.
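The definition above can be computed directly from transaction data: leverage is the observed joint frequency of the items minus the frequency expected if they occurred independently. The toy transactions below are illustrative assumptions.

```python
def leverage(transactions, a, b):
    """Leverage of itemsets `a` and `b` over a list of transactions,
    each transaction being a set of items:
    P(a and b) - P(a) * P(b)."""
    n = len(transactions)
    p_a = sum(a <= t for t in transactions) / n   # observed freq of a
    p_b = sum(b <= t for t in transactions) / n   # observed freq of b
    p_ab = sum((a | b) <= t for t in transactions) / n  # joint freq
    return p_ab - p_a * p_b

transactions = [{"milk", "bread"}, {"milk", "bread", "eggs"},
                {"milk"}, {"bread"}, {"eggs"}]
# joint freq 0.4 versus expected 0.6 * 0.6 = 0.36 under independence
print(leverage(transactions, {"milk"}, {"bread"}))
```

A positive leverage indicates the items co-occur more often than independence would predict; zero or negative leverage suggests no interesting association.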
Lift: A measure of interestingness capturing the increase in the likelihood of an item occurring within a defined sub-population (for example, the transactions containing some other item) compared to its likelihood in the full population.
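As a sketch of this definition on transaction data: lift compares the likelihood of an item within the sub-population of transactions containing another item against its likelihood over all transactions. The toy transactions are illustrative assumptions.

```python
def lift(transactions, a, b):
    """Lift of `b` given `a` over a list of transactions, each a set
    of items: P(b | a) / P(b)."""
    with_a = [t for t in transactions if a <= t]          # sub-population
    p_b_given_a = sum(b <= t for t in with_a) / len(with_a)
    p_b = sum(b <= t for t in transactions) / len(transactions)
    return p_b_given_a / p_b

transactions = [{"milk", "bread"}, {"milk", "bread"}, {"milk"},
                {"bread"}, {"eggs"}]
# bread appears in 2 of 3 milk transactions versus 3 of 5 overall
print(lift(transactions, {"milk"}, {"bread"}))
```

A lift greater than 1 indicates the sub-population makes the item more likely than it is in the population at large; a lift of 1 indicates independence.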
Link Analysis: Explores associations among entities. For example, a law enforcement application might examine familial relationships among suspects and victims, the addresses at which those persons reside, and the telephone numbers that they called during a specified period. The ability of link analysis to represent relationships and associations among entities of different types has proven crucial in helping human investigators comprehend complex webs of evidence and draw conclusions that are not apparent from any single piece of information. Computer-based link analysis is increasingly used in law enforcement investigations, fraud detection, telecommunications network analysis, pharmaceuticals research, epidemiology, and many other specialised applications. Also referred to as Social Network Analysis. (Contributed by David Jensen, 30 Jan 98.)
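The idea can be sketched as graph traversal: entities of different types (people, addresses, phone numbers) are nodes, observed associations are edges, and the entities reachable from a starting node form one web of evidence. The entities and links below are illustrative assumptions.

```python
from collections import deque

def linked_entities(edges, start):
    """Breadth-first search over an undirected edge list, returning
    every entity reachable from `start`."""
    graph = {}
    for u, v in edges:
        graph.setdefault(u, set()).add(v)
        graph.setdefault(v, set()).add(u)
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nbr in graph.get(node, ()):
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return seen

# illustrative associations of mixed entity types
edges = [("suspect:smith", "addr:12 Elm St"),
         ("victim:jones", "addr:12 Elm St"),
         ("suspect:smith", "phone:555-0101"),
         ("witness:lee", "phone:555-0199")]
print(linked_entities(edges, "suspect:smith"))
```

Here the suspect and victim are linked only through a shared address, the kind of indirect connection that is not apparent from any single piece of information.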
Copyright © 2004-2006 Graham.Williams@togaware.com Support further development through the purchase of the PDF version of the book.