Phase transitions typically occur in combinatorial computational problems and have important consequences, especially with the current spread of statistical relational learning and sequence learning methodologies. In Phase Transitions in Machine Learning the authors begin by describing this phenomenon in detail, together with the extensive experimental investigation that supports its presence. They then turn their attention to its possible implications and explore appropriate methods for tackling them. Weaving together fundamental aspects of computer science, statistical physics, and machine learning, the book provides sufficient mathematics and physics background to make the subject intelligible to researchers in AI and other computer science communities. Open research issues are also discussed, suggesting promising directions for future research.
Abstraction is a fundamental mechanism underlying both human and artificial perception, representation of knowledge, reasoning, and learning. This mechanism plays a crucial role in many disciplines, notably Computer Programming, Natural and Artificial Vision, Complex Systems, Artificial Intelligence and Machine Learning, Art, and Cognitive Sciences. The book first provides the reader with an overview of the notions of abstraction proposed in various disciplines, examining both their commonalities and their differences. After discussing the characterizing properties of abstraction, it presents a formal model, the KRA model, that captures them. This model makes the notion of abstraction easily applicable through a set of abstraction operators and abstraction patterns, reusable across different domains and applications. The impact of abstraction in Artificial Intelligence, Complex Systems, and Machine Learning forms the core of the book. A general framework, based on the KRA model, is presented, and its pragmatic power is illustrated with three case studies: model-based diagnosis, cartographic generalization, and learning Hierarchical Hidden Markov Models.