Despite their apparently divergent accounts of higher cognition, cognitive theories based on neural computation and those employing symbolic computation can in fact strengthen one another. To substantiate this controversial claim, this landmark work develops in depth a cognitive architecture based in neural computation but supporting formally explicit higher-level symbolic descriptions, including new grammar formalisms. Detailed studies in both phonology and syntax provide arguments that these grammatical theories and their neural network realizations enable deeper explanations of early acquisition, processing difficulty, cross-linguistic typology, and the possibility of genomically encoding universal principles of grammar. Foundational questions concerning the explanatory status of symbols for central problems such as the unbounded productivity of higher cognition are also given proper treatment. The work is made accessible to scholars in different fields of cognitive science through tutorial chapters and numerous expository boxes providing background material from several disciplines. Examples common to different chapters facilitate the transition from more basic to more sophisticated treatments. Details of method, formalism, and foundation are presented in later chapters, offering a wealth of new results to specialists in psycholinguistics, language acquisition, theoretical linguistics, computational linguistics, computational neuroscience, connectionist modeling, and philosophy of mind.
Highlighting the close relationship between linguistic explanation and learnability, Bruce Tesar and Paul Smolensky examine the implications of Optimality Theory (OT) for language learnability. They show how the core principles of OT lead to the learning principle of constraint demotion, the basis for a family of algorithms that infer constraint rankings from linguistic forms. Of primary concern to the authors are the ambiguity of the data received by the learner and the resulting interdependence of the core grammar and the structural analysis of overt linguistic forms. The authors argue that iterative approaches to interdependencies, inspired by work in statistical learning theory, can be successfully adapted to address the interdependencies of language learning. Both OT and constraint demotion play critical roles in their adaptation. The authors support their findings both formally and through simulations. They also illustrate how their approach could be extended to other language learning issues, including subset relations and the learning of phonological underlying forms.
This book is the final version of the widely circulated 1993 Technical Report that was the seminal work in Optimality Theory, never before available in book format. It introduces a conception of grammar in which well-formedness is defined as optimality with respect to a ranked set of universal constraints. Serves as an excellent introduction to the principles and practice of Optimality Theory, and offers proposals and analytic commentary that suggest many directions for further development for the professional.