Professor Howard Lasnik is one of the world's leading theoretical linguists. He has produced influential and important work in areas such as syntactic theory, logical form, and learnability. This collection of essays draws together some of his best work from his substantial contribution to linguistic theory.
This collection of essays presents an up-to-date overview of research in the minimalist program of linguistic theory. The book includes a new essay by Noam Chomsky as well as original contributions from other renowned linguists. Contributors: Andrew Barss, Željko Bošković, Noam Chomsky, Hamida Demirdache, Hiroto Hoshi, Kyle Johnson, Roger Martin, Keiko Murasugi, Javier Ormazabal, Mamoru Saito, Daiko Takahashi, Juan Uriagereka, Myriam Uribe-Etxebarria, Ewa Willim.
With Marcela Depiante and Arthur Stepanov. This book provides an introduction to some classic ideas and analyses of transformational generative grammar, viewed both on their own terms and from a more modern, or minimalist, perspective. The major focus is on the set of analyses treating English verbal morphology. The book shows how the analyses in Chomsky's classic Syntactic Structures actually work, filling in underlying assumptions and often unstated formal particulars. From there the book moves to successive theoretical developments and revisions—both in general and in particular as they pertain to inflectional verbal morphology. After comparing Chomsky's economy-based account with his later minimalist approach, the book concludes with a hybrid theory of English verbal morphology that includes elements of both Syntactic Structures and A Minimalist Program for Linguistic Theory. Current Studies in Linguistics No. 33
Natural phenomena, including human language, are not just series of events but are organized quasi-periodically; sentences have structure, and that structure matters. Howard Lasnik and Juan Uriagereka “were there” when generative grammar was being developed into the Minimalist Program. In this presentation of the universal aspects of human language as a cognitive phenomenon, they rationally reconstruct syntactic structure. In the process, they touch upon structure dependency and its consequences for learnability, nuanced arguments (including global ones) for structure presupposed in standard linguistic analyses, and a formalism to capture long-range correlations. For practitioners, the authors assess whether “all we need is Merge,” while for outsiders, they summarize what needs to be covered when attempting to have structure “emerge.” Reconstructing the essential history of what is at stake when arguing for sentence scaffolding, the authors cover a range of larger issues, from the traditional computational notion of structure (the strong generative capacity of a system) and how far down into words it reaches to whether its variants, as evident across the world’s languages, can arise from non-generative systems. While their perspective stems from Noam Chomsky’s work, it does so critically, separating rhetoric from results. They consider what they do to be empirical, with the formalism being only a tool to guide their research (of course, they want sharp tools that can be falsified and have predictive power). Reaching out to skeptics, they invite potential collaborations that could arise from mutual examination of one another’s work, as they attempt to establish a dialogue beyond generative grammar.
This major contribution to modern syntactic theory elaborates a principles-and-parameters framework in which the differences and similarities among languages with respect to WH-questions can be captured. Move α is part of an overall program, initiated by Noam Chomsky, to create a global theory in which the entire transformational component can be reduced to a single process, Move α. Lasnik and Saito are concerned particularly with bounding requirements on movement (Subjacency) and proper government requirements on traces (the Empty Category Principle). The first two chapters present and extend the ideas proposed in the authors' earlier article, "On the Nature of Proper Government." Included are detailed discussions of γ-marking, the general rule Affect α, and the definition of proper government, particularly as these relate to WH constructions. The next two chapters propose a modification of Chomsky's Barriers theory on the basis of a close examination of topicalization and examine the consequences of the modified theory. The discussion extends to restrictions on possible antecedent governors and the implications of these restrictions for quantifier raising and NP-movement. Consequences for Superiority are also considered, and a modified version of this condition is proposed, as is an extension of Chomsky's Uniformity Condition. The final chapter takes up further theoretical issues and alternative approaches. Howard Lasnik is Professor and Mamoru Saito is Associate Professor, both at the University of Connecticut.
Restoring the Human Context to Literary and Performance Studies argues that much of contemporary literary theory is still predicated, at least implicitly, on outdated linguistic and psychological models such as post-structuralism, psychoanalysis, and behaviorism, which significantly contradict current dominant scientific views. By contrast, this monograph promotes an alternative paradigm for literary studies, namely Contextualism, and in so doing highlights the similarities and differences among the sometimes-conflicting contemporary cognitive approaches to literature and performance, arguing not in favor of one over the other but for Contextualism as their common ground.
This book attempts to marry truth-conditional semantics with cognitive linguistics in the church of computational neuroscience. To this end, it examines the truth-conditional meanings of coordinators, quantifiers, and collective predicates as neurophysiological phenomena that are amenable to a neurocomputational analysis. Drawing inspiration from work on visual processing, and especially the simple/complex cell distinction in early vision (V1), we claim that a similar two-layer architecture is sufficient to learn the truth-conditional meanings of the logical coordinators and logical quantifiers. As a prerequisite, much discussion is given over to what a neurologically plausible representation of the meanings of these items would look like. We eventually settle on a representation in terms of correlation, so that, for instance, the semantic input to the universal operators (e.g., and, all) is represented as maximally correlated, while the semantic input to the universal negative operators (e.g., nor, no) is represented as maximally anticorrelated. On the basis of this representation, the hypothesis can be offered that the function of the logical operators is to extract an invariant feature from natural situations, that of the degree of correlation between parts of the situation. This result sets up an elegant formal analogy to recent models of visual processing, which argue that the function of early vision is to reduce the redundancy inherent in natural images. Computational simulations are designed in which the logical operators are learned by associating their phonological form with some degree of correlation in the inputs, so that the overall function of the system is as a simple kind of pattern recognition. Several learning rules are assayed, especially those of the Hebbian sort, which are the ones with the most neurological support. Learning vector quantization (LVQ) is shown to be a perspicuous and efficient means of learning the patterns that are of interest.
We draw a formal parallelism between the initial, competitive layer of LVQ and the simple cell layer in V1, and between the final, linear layer of LVQ and the complex cell layer in V1, in that the initial layers are both selective, while the final layers both generalize. It is also shown how the representations argued for can be used to draw the traditionally recognized inferences arising from coordination and quantification, and why the inference of subalternacy breaks down for collective predicates. Finally, the analogies between early vision and the logical operators allow us to advance the claim of cognitive linguistics that language is not processed by proprietary algorithms, but rather by algorithms that are general to the entire brain. Thus in the debate between objectivist and experiential metaphysics, this book falls squarely into the camp of the latter. Yet it does so by means of a rigorous formal, mathematical, and neurological exposition, in contradiction of the experiential claim that formal analysis has no place in the understanding of cognition. To make our own counter-claim as explicit as possible, we present a sketch of the LVQ structure in terms of mereotopology, in which the initial layer of the network performs topological operations, while the final layer performs mereological operations. The book is meant to be self-contained, in the sense that it does not assume any prior knowledge of any of the many areas that are touched upon. It therefore contains mini-summaries of biological visual processing, especially the retinocortical and ventral ("what") parvocellular pathways; computational models of neural signaling, and in particular the reduction of the Hodgkin-Huxley equations to the connectionist and integrate-and-fire neurons; Hebbian learning rules and the elaboration of learning vector quantization; the linguistic pathway in the left hemisphere; memory and the hippocampus; truth-conditional vs. image-schematic semantics; and objectivist vs. experiential metaphysics.
A Course in GB Syntax is a new kind of linguistics textbook. It presents the fundamental concepts of the Government-Binding approach to syntax in a lecture-dialogue format that conveys the sense of a changing field, with live issues under debate. Students and professionals seeking a lucid introduction to the complexities of GB syntax will have the experience of participating in an actual course taught by a major practitioner. The presentation of fundamentals is followed by further examples, easily understandable discussion of technical questions, and alternative analyses within the same basic framework. The book fits well between a more general introduction like van Riemsdijk and Williams' Introduction to the Theory of Grammar and the major GB literature. While it has been designed for use by graduate students in a second semester syntax course, it can serve as a reader's companion to important but sometimes forbidding texts like Noam Chomsky's Lectures on Government and Binding and Some Concepts and Consequences of the Theory of Government and Binding. The informal tone makes the subject more approachable; examples are worked out more slowly and in greater detail than is possible in the primary sources; and the definitions and notational devices are carefully explained. Finally, many of the questions that the student might want to raise are raised (in fact, by students), and answers and alternatives are explored. The lectures give an overview of the modular GB model and cover in detail Case theory; Binding Theory; the determination of "empty categories," parasitic gaps, and the Empty Category Principle; extensions and alternatives, such as Aoun's "Generalized Binding Theory" and Higginbotham's "linking" analysis; and various open questions, such as the nature of the Case filter, tough movement, weak crossover, illicit NP-movement, and topicalization. Howard Lasnik is Professor of Linguistics at the University of Connecticut.
Juan Uriagereka, one of his graduate students, transcribed and did the initial editing of the tapes of the original lectures. Current Studies in Linguistics.
"There is something old-fashioned and sage-like in Walter Howard's poetic voice. I can imagine him reading from a mountaintop-- with the raging elements a backdrop to his words. Howard is a learned man-- and has been an academic for many years-- but his poetry is in the tradition of a true romantic. He uses nature and emotion to find spiritual truth. He embraces beauty-- with all its allure, but is not afraid to reveal its frightening and dark side as well. Howard uses ample doses of levity to pull the fly down on our most cherished traditions and notions, but by the same token he shows a deep respect and affinity for all the things this world has to offer." - Doug Holder, Publisher of Ibbetson Street Press
This is a collection of ideas stated over Taft's lifetime of service as administrator, diplomat, president, and Chief Justice. It singles out the essence of his convictions regarding government, diplomacy, and the law.
Interviews with the director of Scarface, Only Angels Have Wings, His Girl Friday, Sergeant York, Bringing Up Baby, The Big Sleep, Red River, Gentlemen Prefer Blondes, and Rio Bravo