Machine learning
From Wikipedia, the free encyclopedia
As a broad subfield of artificial intelligence, machine learning is concerned with the development of algorithms and techniques that allow computers to "learn". At a general level, there are two types of learning: inductive and deductive. Inductive machine learning methods create computer programs by extracting rules and patterns from large data sets. Although pattern identification is important to machine learning, a process that identifies patterns without extracting rules falls more accurately within the field of data mining.
Machine learning overlaps heavily with statistics; many machine learning algorithms have been found to have direct counterparts in statistics. For example, boosting is now widely understood to be a form of stagewise regression using a specific type of loss function.
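This connection can be illustrated with a minimal sketch of boosting under squared-error loss: each stage fits a simple base learner (here a one-variable threshold "stump") to the residuals of the model built so far, which is exactly stagewise additive regression. The data and helper names below are illustrative, not from any library.

```python
# Boosting viewed as stagewise regression: each stage fits a weak
# learner (a threshold "stump") to the residuals of the current
# additive model, under squared-error loss.

def fit_stump(xs, residuals):
    """Find the threshold split minimizing squared error on the residuals."""
    best = None
    for t in xs:
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        lm = sum(left) / len(left) if left else 0.0
        rm = sum(right) / len(right) if right else 0.0
        err = sum((r - (lm if x <= t else rm)) ** 2
                  for x, r in zip(xs, residuals))
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def boost(xs, ys, n_stages=20, lr=0.5):
    """Stagewise additive modeling: accumulate shrunken stumps."""
    stumps = []
    def predict(x):
        return sum(lr * s(x) for s in stumps)
    for _ in range(n_stages):
        residuals = [y - predict(x) for x, y in zip(xs, ys)]
        stumps.append(fit_stump(xs, residuals))
    return predict

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]   # a step function
model = boost(xs, ys)                  # recovers the step closely
```

Each stage's stump plays the role of one regression term; the learning rate acts as the shrinkage factor typical of stagewise fitting.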
Machine learning has a wide spectrum of applications including natural language processing, search engines, medical diagnosis, bioinformatics and cheminformatics, detecting credit card fraud, stock market analysis, classifying DNA sequences, speech and handwriting recognition, object recognition in computer vision, game playing and robot locomotion.
Human interaction
Some machine learning systems attempt to eliminate the need for human intuition in the analysis of the data, while others adopt a collaborative approach between human and machine. Human intuition cannot be entirely eliminated since the designer of the system must specify how the data are to be represented and what mechanisms will be used to search for a characterization of the data. Machine learning can be viewed as an attempt to automate parts of the scientific method. Some machine learning researchers create methods within the framework of Bayesian statistics.
Algorithm types
Machine learning algorithms are organized into a taxonomy, based on the desired outcome of the algorithm. Common algorithm types include:
- supervised learning --- where the algorithm generates a function that maps inputs to desired outputs. One standard formulation of the supervised learning task is the classification problem: the learner is required to learn (to approximate the behavior of) a function which maps a vector into one of several classes by looking at several input-output examples of the function.
- unsupervised learning --- which models a set of inputs: labeled examples are not available.
- semi-supervised learning --- which combines both labeled and unlabeled examples to generate an appropriate function or classifier.
- reinforcement learning --- where the algorithm learns a policy of how to act given an observation of the world. Every action has some impact in the environment, and the environment provides feedback that guides the learning algorithm.
- transduction --- similar to supervised learning, but does not explicitly construct a function; instead, it tries to predict new outputs based on the training inputs, training outputs, and new inputs.
- learning to learn --- where the algorithm learns its own inductive bias based on previous experience.
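The supervised-learning formulation above can be made concrete with a minimal nearest-neighbour classifier, which learns the mapping from input vectors to classes simply by memorizing labelled examples and assigning each new input the class of its closest training point. The data and function names here are illustrative.

```python
import math

def classify(train, x):
    """Return the label of the training point nearest to x (1-NN)."""
    nearest = min(train, key=lambda pair: math.dist(pair[0], x))
    return nearest[1]

# Labelled input-output examples: two small clusters in the plane.
train = [((0.0, 0.0), "A"), ((0.1, 0.2), "A"),
         ((1.0, 1.0), "B"), ((0.9, 1.1), "B")]

label = classify(train, (0.2, 0.1))   # a point near cluster "A"
```

This is one of the simplest ways to "approximate the behavior of" the unknown input-to-class function from input-output examples; it generalizes only by proximity, with no explicit rule extraction.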
The performance and computational analysis of machine learning algorithms is a branch of theoretical computer science known as computational learning theory.
Machine learning topics
This list represents the topics covered in a typical machine learning course.
- Modeling conditional probability density functions: regression and classification
- Inductive transfer and learning to learn
- Reinforcement learning
  - Temporal-difference learning
  - Monte Carlo methods
- Modeling probability density functions through generative models
- Approximate inference techniques
- Meta-learning (ensemble methods)
- Optimization: most of the methods listed above either use optimization or are instances of optimization algorithms.
- Multi-objective machine learning: an approach that explicitly addresses multiple, often conflicting, learning objectives using Pareto-based multi-objective optimization techniques.
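The reinforcement-learning entries above (temporal-difference and Monte Carlo methods) can be illustrated with a minimal TD(0) value-estimation sketch on a toy chain of states; the environment, rewards, and parameters are all illustrative.

```python
# TD(0) sketch: estimate state values on a toy chain 0 -> 1 -> 2 -> goal,
# where entering the goal yields reward 1 and all other transitions
# yield reward 0. After each step the estimate V[s] is nudged toward
# the observed reward plus the discounted estimate of the next state
# (the "bootstrapped" target).

def td0(episodes=2000, alpha=0.1, gamma=0.9):
    V = [0.0, 0.0, 0.0]      # value estimates for states 0, 1, 2
    for _ in range(episodes):
        s = 0
        while s < 3:         # state 3 is the terminal goal
            s_next = s + 1
            reward = 1.0 if s_next == 3 else 0.0
            v_next = V[s_next] if s_next < 3 else 0.0
            # TD(0) update rule
            V[s] += alpha * (reward + gamma * v_next - V[s])
            s = s_next
    return V

values = td0()
# With gamma = 0.9, the true values are 0.81, 0.9, and 1.0.
```

Unlike a Monte Carlo estimate, which would wait for the end of each episode and average complete returns, the TD(0) update learns from each individual transition by bootstrapping off the current estimate of the next state.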
See also
- Artificial intelligence
- Computational intelligence
- Data mining
- Predictive analytics
- Bioinformatics
- Pattern recognition
- Important publications in machine learning (computer science)
- Important publications in machine learning (statistics)
- Autonomous robot
- Computer vision
- Inductive logic programming
- Neural network software
Bibliography
- Ryszard S. Michalski, Jaime G. Carbonell, Tom M. Mitchell (1983), Machine Learning: An Artificial Intelligence Approach, Tioga Publishing Company, ISBN 0-935382-05-4
- Ryszard S. Michalski, Jaime G. Carbonell, Tom M. Mitchell (1986), Machine Learning: An Artificial Intelligence Approach, Volume II, Morgan Kaufmann, ISBN 0-934613-00-1
- Yves Kodratoff, Ryszard S. Michalski (1990), Machine Learning: An Artificial Intelligence Approach, Volume III, Morgan Kaufmann, ISBN 1-55860-119-8
- Ryszard S. Michalski, George Tecuci (1994), Machine Learning: A Multistrategy Approach, Volume IV, Morgan Kaufmann, ISBN 1-55860-251-8
- Bhagat, P. M. (2005). Pattern Recognition in Industry, Elsevier. ISBN 0-08-044538-1
- Bishop, C. M. (1995). Neural Networks for Pattern Recognition, Oxford University Press. ISBN 0-19-853864-2
- Richard O. Duda, Peter E. Hart, David G. Stork (2001) Pattern classification (2nd edition), Wiley, New York, ISBN 0-471-05669-3
- Huang T.-M., Kecman V., Kopriva I. (2006), Kernel Based Algorithms for Mining Huge Data Sets, Supervised, Semi-supervised, and Unsupervised Learning, Springer-Verlag, Berlin, Heidelberg, 260 pp., 96 illus., Hardcover, ISBN 3-540-31681-7
- Vojislav Kecman (2001), Learning and Soft Computing: Support Vector Machines, Neural Networks and Fuzzy Logic Models, The MIT Press, Cambridge, MA, 608 pp., 268 illus., ISBN 0-262-11255-8
- MacKay, D. J. C. (2003). Information Theory, Inference, and Learning Algorithms, Cambridge University Press. ISBN 0-521-64298-1
- Mitchell, T. (1997). Machine Learning, McGraw Hill. ISBN 0-07-042807-7
- Ian H. Witten and Eibe Frank (2005), Data Mining: Practical Machine Learning Tools and Techniques, Morgan Kaufmann. ISBN 0-12-088407-0
- Sholom Weiss and Casimir Kulikowski (1991). Computer Systems That Learn, Morgan Kaufmann. ISBN 1-55860-065-5
External links
General resources
- UCI description
- MLnet Mailing List
- Index of Machine Learning Courses
- Kmining List of machine learning, data mining and KDD scientific conferences
- Book "Intelligent Systems and their Societies" by Walter Fritz
- Links from Open Directory Project
- MLpedia – wiki dedicated to machine learning.
- Data Mining Tutorials, Resources Eruditionhome
- The Encyclopedia of Computational Intelligence
Journals and Conferences
- Journal of Machine Learning Research
- Machine Learning (journal)
- Neural Computation (journal)
- Neural Information Processing Systems (NIPS) (conference)
- International Workshop on Artificial Intelligence & Statistics (AISTATS)
- ICML: International Conference on Machine Learning
- Uncertainty in Artificial Intelligence (conference)
- The "Snowbird" Learning Workshop
- Learning Inquiry: an academic journal centered on learning
- Machine Learning papers @ CiteSeer
Research groups
- Machine Learning @ the Iowa State University Artificial Intelligence Research Laboratory, USA
- Machine Learning and Biological Computation Group @ University of Bristol, UK
- Alberta Ingenuity Centre for Machine Learning @ University of Alberta, CA
- Statistical Multimedia Learning Group @ University of British Columbia, CA
- Machine Learning @ Cornell University, USA
- Machine Learning Group @ Edinburgh University, UK
- Intelligent Data Analysis Group @ Fraunhofer FIRST, Berlin, Germany
- Machine Learning and Data Mining @ Artificial Intelligence Unit @ University of Dortmund, Dortmund, Germany
- Machine Learning and Natural Language Processing @ University of Freiburg, Freiburg, Germany
- Machine Learning @ The Hebrew University, Israel
- Center for Computational Intelligence, Learning and Discovery @ Iowa State University, USA
- Machine Learning Systems Group @ the Jet Propulsion Laboratory, California Institute of Technology, USA
- Department of Knowledge Technologies @ Jozef Stefan Institute, Slovenia
- Knowledge Engineering Group @ TU Darmstadt, Darmstadt, Germany
- Knowledge Management Group @ HU Berlin, Berlin, Germany
- Machine Learning Group @ Université Libre de Bruxelles, Belgium
- Department of Empirical Inference @ Max Planck Institute for Biological Cybernetics, Tübingen, Germany
- Machine Learning and Data Mining in Bioinformatics Group @ TU München, Germany
- Machine Learning and Applied Statistics @ Microsoft Research
- Machine Learning Group @ University of Toronto, CA
- Machine Learning: Probabilistic and Statistical Inference Group @ University of Toronto, CA
- Machine Learning Group @ Université catholique de Louvain, Belgium
- Machine Learning Department @ Carnegie Mellon University, Pittsburgh, PA, USA
- Computational Learning Laboratory @ Stanford University, Stanford, CA, USA
- Cognitive Computation Group @ the University of Illinois at Urbana-Champaign, Urbana, IL, USA
- Machine Learning @ the University of Illinois at Urbana-Champaign (UIUC), Urbana, IL, USA
Software
- SPIDER is a complete machine learning toolbox for MATLAB.
- PRTools is another complete package similar to SPIDER and implemented in MATLAB. SPIDER seems to have more native support and functions for kernel methods, but PRTools has a slightly larger variety of other machine learning tools. PRTools has an accompanying textbook and much better documentation. Both SPIDER and PRTools are available freely for non-commercial applications.
- Computer Manual to Accompany Pattern Classification contains a Matlab implementation of many pattern classification algorithms. It is especially suitable for students and novices in the area of pattern classification.
- Orange is a machine learning suite with Python scripting and a visual programming interface.
- YALE (http://yale.sf.net/) is a powerful and freely available open-source environment for data pre-processing, intelligent data analysis, knowledge discovery, data mining, machine learning, and visualization. Implemented in Java, it features more than 350 operators, a graphical user interface (GUI), and an XML-based scripting language for data mining, and it fully integrates the Weka machine learning software.
- Weka Machine Learning Software provides machine learning operators for pattern classification, regression, clustering, and association rule learning, as well as meta-operators such as ensemble learners.
- MATLAB, by The MathWorks, has toolbox support for many machine learning tools. The Bioinformatics toolbox includes Support Vector Machines and KNN classifiers. The Statistics toolbox includes linear discriminant and decision tree classification. The Neural Network toolbox is a complete set of tools for implementing Neural Networks (PRTools relies on it for its neural network classifiers). New methods for classifier performance evaluation and cross validation make MATLAB more attractive for machine learning.
- Synapse by Peltarion supports the development of a wide range of machine learning systems and the integration of different types of machine learning into hybrid systems.
- MLC++ is a library of C++ classes for supervised machine learning.
- MDR is an open-source software package for detecting attribute interactions using the multifactor dimensionality reduction (MDR) method.
- questsin is an add-in for Microsoft Excel that uses machine learning to expand a selection, similar to the popular Fill Data feature.
- SemiL is the world's first efficient software for solving large-scale semi-supervised learning or transductive inference problems using graph-based approaches when faced with unlabeled data. It implements various semi-supervised learning approaches.
- PCP is a free program for feature selection and supervised pattern classification, written in C. Supports interactive and batch modes.
- The AQ21 program seeks different types of patterns in data and represents them in human-oriented forms resembling natural language descriptions. It integrates several novel capabilities, such as discovering different types of attributional patterns depending on the parameter settings, optimizing patterns according to a large number of different pattern quality criteria, learning rules with exceptions, determining optimized sets of alternative hypotheses that generalize the same data, and handling data with missing, irrelevant, and/or not-applicable meta-values.
- The iAQ program demonstrates natural induction, that is, the ability of a computer program to learn knowledge from data in forms natural to people and therefore easy to understand and interpret. In iAQ, discovered rules are expressed verbally and also as natural language text.
- The LEM3 system implements a novel, non-Darwinian methodology for evolutionary computation, called the Learnable Evolution Model (LEM). LEM employs a learning program to guide the evolutionary computation: instead of conventional random mutations and recombinations, it employs hypothesis formation and generation operators to create new populations of individuals. LEM3 can handle very complex, non-linear, multi-modal optimization problems with hundreds of controllable multi-type variables, and it is particularly advantageous for problems in which computing the evaluation (fitness) function is costly or time-consuming.
- SNoW is a learning architecture tailored for learning in the presence of a very large number of features. SNoW learns linear functions via regularized variations of Perceptron and Winnow. The package contains a large number of options and also a good sparse implementation of naive Bayes.