This book is a sequel to Geoffrey Sampson’s well-received textbook Schools of Linguistics. Linguistics changed around the millennium; the advent of cheap air travel and the internet meant that geographical distance ceased to be a barrier to scholarly interaction, so new developments are no longer grouped into separate “schools” located in different places. Consequently, the best way to show how linguistics is flowering in our time is through a sampler displaying individual examples of recent advances. Sampson offers such a sampler, describing two dozen of the most interesting innovations in the subject to have emerged in the present century. And he includes a few looks back at how the approaches described in Schools of Linguistics panned out in the closing years of the old century, before they evolved into—or made way for—today’s more realistic and more diverse linguistics.
This book records a unique attempt over a ten-year period to use stochastic optimization in the natural language processing domain. Setting the work against the background of the logical rule-based approach, the author provides a context for understanding the differences in assumptions about the nature of language and cognition.
A thousand years ago, someone called Anselm decided that people should not believe things just because the Bible said they were so. To his delight, he proved the most important proposition of all, the existence of God, as a pure logical theorem. Ever since, people have argued about his proof. The atheist Bertrand Russell found it much easier to say it was fallacious than to identify the fallacy, and others have produced independent God proofs. This book brings these proofs to life in the context of the people who created them. It assumes no technical knowledge, and invites readers to decide how convincing they find the arguments.
Are we creatures who learn new things? Or does human mental development consist of awakening structures of thought? A view has gained ground, advocated, for example, in Steven Pinker's book The Language Instinct, that language in much of its detail is hard-wired in our genes. Others add that this holds too for much of the specific knowledge and understanding expressed in language. When the first human evolved from apes (it is claimed), her biological inheritance comprised not just a distinctive anatomy but a rich structure of cognition. This book examines the various arguments for instinctive knowledge, with the author arguing that each one rests on false premises or embodies a logical fallacy. A different picture of learning is suggested by Karl Popper's account of knowledge growing through conjectures and refutations. The facts of human language are best explained, Sampson contends, by taking language acquisition to be a case of Popperian learning. In this way, we are not born know-alls; we are born knowing nothing but able to learn anything, and this is why we can find ways to think and talk about a world that goes on changing.
Linguistics has become an empirical science again after several decades when it was preoccupied with speakers' hazy "intuitions" about language structure. With a mixture of English-language case studies and more theoretical analyses, Geoffrey Sampson gives an overview of some of the new findings and insights about the nature of language which are emerging from investigations of real-life speech and writing, often (although not always) using computers and electronic language samples ("corpora"). Concrete evidence is brought to bear to resolve long-standing questions such as "Is there one English language or many Englishes?" and "Do different social groups use characteristically elaborated or restricted language codes?" Sampson shows readers how to use some of the new techniques for themselves, giving a step-by-step "recipe-book" method for applying a quantitative technique that was invented by Alan Turing during the World War II code-breaking work at Bletchley Park and was rediscovered and widely applied in linguistics fifty years later.
This revised, updated and expanded new edition of Geoffrey Till's acclaimed textbook provides an invaluable guide for anyone interested in the changing and crucial role of seapower in the twenty-first century.
How do you trap someone in a lie? For centuries, all manner of truth-seekers have used the lie detector. In this eye-opening book, Geoffrey C. Bunn unpacks the history of this device and explores the interesting and often surprising connection between technology and popular culture. Lie detectors and other truth-telling machines are deeply embedded in everyday American life. Well-known brands such as Isuzu, Pepsi Cola, and Snapple have advertised their products with the help of the “truth machine,” and the device has also appeared in countless movies and television shows. The Charles Lindbergh “crime of the century” in 1935 first brought lie detectors to the public’s attention. Since then, they have factored into the Anita Hill–Clarence Thomas sexual harassment controversy, the Oklahoma City and Atlanta Olympics bombings, and one of the most infamous criminal cases in modern memory: the O. J. Simpson murder trial. The use of the lie detector in these instances brings up many intriguing questions that Bunn addresses: How did the lie detector become so important? Who uses it? How reliable are its results? Bunn reveals just how difficult it is to answer this last question. A lie detector expert concluded that O. J. Simpson was “one hundred percent lying” in a video recording in which he proclaimed his innocence; a tabloid newspaper subjected the same recording to a second round of evaluation, which determined Simpson to be “absolutely truthful.” Bunn finds fascinating the lie detector’s ability to straddle the realms of serious science and sheer fantasy. He examines how the machine emerged as a technology of truth, transporting readers back to the obscure origins of criminology itself, ultimately concluding that the lie detector owes as much to popular culture as it does to factual science.
Originally published in 1977, this book deals with the social psychological factors which influence the process of bargaining. It examines the structure behind the process, by which it can be analysed and better understood. Particular attention is paid to the character of negotiations in which agreements are obtained.