The Handbook of Contemporary Semantic Theory

Shalom Lappin

Description

The second edition of The Handbook of Contemporary Semantic Theory presents a comprehensive introduction to cutting-edge research in contemporary theoretical and computational semantics. * Features completely new content relative to the first edition of The Handbook of Contemporary Semantic Theory * Features contributions by leading semanticists, who introduce the core areas of contemporary semantics while discussing current research * Suitable for graduate students in courses on semantic theory, and for advanced researchers as an introduction to current theoretical work


Page count: 1949

Publication year: 2015




Table of Contents

Cover

Series Page

Title Page

Copyright

Dedication

Notes on Contributors

Preface

Introduction

Part I: Quantifiers, Scope, Plurals, and Ellipsis

Chapter 1: Generalized Quantifiers in Natural Language Semantics

1 Introduction

2 Definitions

3 Determiner Phrases (DPs) and Quantifiers

4 Meaning the Same on Every Universe

5 Domain Restriction

6 Boolean Operations on Quantifiers

7 Quantifiers in the Number Triangle

8 Basic Properties

9 Definiteness

10 Decomposition

11 Questions of Expressive Power

Notes

References

Chapter 2: Scope

1 Scope Basics

2 Theories of Scope

3 Continuations, Scope, and Binding

4 Kinds of Scope Taking

5 Indefinites

6 Dynamic Semantics

7 Hamblin Semantics

8 Computational Processing

Notes

References

Chapter 3: Plurals

1 Introduction

2 Basic Facts and Terminology

3 The Denotation of Referential Plurals

4 Distributivity

5 Plurals and Quantification

6 Conclusion

Notes

References

Chapter 4: Ellipsis

1 Ellipsis: A Window on Context?

2 Meeting the Ellipsis Challenge

3 Dynamic Syntax

4 Reflections

Notes

References

Part II: Modification, Presupposition, Tense, and Modality

Chapter 5: Adjectival Modification and Gradation

1 Introduction

2 Adjective-Noun Combination

3 Gradation and Degrees

4 Adjectives and Scales

5 Comparatives and Degree Operator Scope

6 Conclusion

Notes

References

Chapter 6: Presupposition and Implicature

1 Introduction

2 Presupposition

3 Conversational Implicature

4 Conventional Implicature

5 Conclusion

Notes

References

Chapter 7: The Semantics of Tense and Aspect: A Finite-State Perspective

1 Introduction: Prior and Beyond

2 Within a Timeline

3 Between Timelines

4 Behind Timelines

Notes

References

Chapter 8: Conditionals and Modality

1 Introduction

2 Formal Frameworks

3 Conditionals

4 Current Debates and Open Issues

Acknowledgments

Appendix: Proofs

Notes

References

Part III: Nondeclaratives

Chapter 9: Semantics of Questions

1 Introduction

2 Setting the Field

3 Theories of Questions

4 Minimal Erotetic Semantics: Basics and Tools

5 Minimal Erotetic Semantics: Questions

6 Erotetic Inferences and How Questions Arise

7 Other Developments

8 Further Readings

Acknowledgments

Notes

References

Chapter 10: The Semantics of Imperatives

1 Introduction

2 Examples of Imperatives

3 Problematic Cases

4 Survey of Proposals

5 A Judgmental Approach

6 Conclusions

Notes

References

Part IV: Type Theory and Computational Semantics

Chapter 11: Constructive Type Theory

1 Introduction

2 A Brief History

3 Type Theory in a Nutshell

4 Computability and Constructive Logic

5 Semantics of Natural Language

6 Related Semantic Theories

7 Type Theory as a Logical Framework

8 The Syntax-Semantics Interface

9 Type Theory and Interaction

References

Chapter 12: Type Theory with Records for Natural Language Semantics

1 Introduction

2 A Theory of Types and Situations

3 Grammar in TTR

4 A Theory of Abstract Entities

5 Interaction on Dialogue Gameboards

6 Unifying Metacommunicative and Illocutionary Interaction

7 Traditional Semantic Concerns in a Dialogue Perspective

8 Grammar in Dialogue

9 Conclusions and Future Directions

Notes

References

Chapter 13: Curry Typing, Polymorphism, and Fine-Grained Intensionality

1 Introduction

2 Higher-Order Intensional Logic

3 Property Theory with Curry Typing

4 Fine-Grained Intensionality

5 Probabilistic Semantics

6 Conclusions and Future Work

Notes

References

Chapter 14: Semantic Complexity in Natural Language

1 Introduction

2 Fragments of Language

3 Technical Background

4 Syllogistic Proof Systems

5 Basic Syllogistic Fragments: Complexity

6 Relative Clauses

7 Noun-Level Negation

8 Numerical Determiners

9 Bound-Variable Anaphora

References

Chapter 15: Implementing Semantic Theories

1 Introduction

2 Direct Interpretation or Logical Form?

3 Model Checking Logical Forms

4 Example: Implementing Syllogistic Inference

5 Implementing Fragments of Natural Language

6 Extension and Intension

7 Implementing Communicative Action

8 Resources

9 Appendix

References

Chapter 16: Vector Space Models of Lexical Meaning

1 Introduction

2 Vector Space Models for Document Retrieval

3 Representing Word Meanings as Vectors

4 Compositional Vector Space Models

5 Conclusion

6 Acknowledgements

Notes

References

Chapter 17: Recognizing Textual Entailment

1 Introduction

2 Task Definition

3 Knowledge/Inference Phenomena in Textual Entailment

4 Two Contrasting Models for RTE Inference

5 Theoretical Models for RTE Inference

6 Compromise Approaches to RTE

7 The State of the Art/Future Directions

8 Conclusions

Notes

References

Part V: Interfaces

Chapter 18: Natural Logic

1 Introduction: Logic for Natural Language, Logic in Natural Language

2 Extended Syllogistic Inference

3 Logics with Individual Variables

4 Inference with Monotonicity and Polarity

5 Conclusion

Acknowledgments

References

Chapter 19: The Syntax-Semantics Interface: Semantic Roles and Syntactic Arguments

1 Introduction

2 Types of Lexical Semantic Representation

3 Isolating Semantically Relevant Facets of Meaning

4 Mapping between Lexical Semantics and Syntax

5 Conclusion

Notes

References

Chapter 20: Reference in Discourse

1 Introduction

2 Fundamentals

3 Taking Inventory

4 Form of Reference, Cognitive Status, and Conversational Implicature

5 Complexities in the Interpretation of a- and the-NPs

6 Complexities in the Interpretation of Pronouns

7 Pronouns as a Window into Referential Processing

8 Conclusion

Notes

References

Chapter 21: Probabilistic Semantics and Pragmatics: Uncertainty in Language and Thought

1 Probabilistic Models of Commonsense Reasoning

2 Meaning as Condition

3 Pragmatic Interpretation

4 Semantic Indices

5 Conclusion

6 Acknowledgements

Notes

References

Chapter 22: Semantics and Dialogue

1 Introduction

2 Background: A Naïve Model of Dialogue and the Semantics/Pragmatics Interface in Dialogue

3 Some Dialogue Phenomena that Challenge the Naïve Model

4 The Semantics/Pragmatics Interface in Current Theories of Dialogue Meaning

5 Conclusions

Notes

References

Chapter 23: Semantics and Language Acquisition

1 What are Words For?

2 Starting Points

3 Early Word Use: Overextension and Restriction

4 Semantic Relations and New Words

5 Semantic Fields

6 Approaches to Word Learning

7 Constraint-Based Approaches

8 Sociopragmatic Approaches

9 Crosslinguistic Studies

10 What Children Learn about Meaning in their First Few Years

11 Negotiating Meanings in Conversation

Notes

References

Author Index

Subject Index

End User License Agreement



List of Illustrations

Chapter 1: Generalized Quantifiers in Natural Language Semantics

Figure 1.1 The number triangle.

Figure 1.2 Some quantifiers in the number triangle.

Figure 1.3 Symmetry.

Figure 1.4 Mon and Mon.

Figure 1.5 Mon and Mon.

Figure 1.6 Mon and Mon.

Figure 1.7 Smoothness.

Figure 1.8 Some but not all.

Figure 1.9 Mon.

Chapter 2: Scope

Figure 2.1 Schematic representation of the scope-taking in example (1): “John said [Mary called [everyone] yesterday] with relief.”

Figure 2.2 Schematic picture of scope taking.

Chapter 4: Ellipsis

Figure 4.1 Processing John upset Mary.

Figure 4.2 Parsing Mary, John upset.

Figure 4.3 Result of parsing John, who smokes, left.

Figure 4.4 Incremental development of Mary's/Bob's context via processing words.

Figure 4.5 Re-use of structure from context: Short Answers to WH-questions.

Chapter 11: Constructive Type Theory

Figure 11.1 The historical background of constructive type theory.

Figure 11.2 Elimination rules for logical constants.

Figure 11.3 Proof tree normalization and corresponding function definition. The small letters in the trees stand for subtrees.

Figure 11.4 A natural deduction tree and the corresponding proof term.

Figure 11.5 Montague's PTQ architecture.

Chapter 12: Type Theory with Records for Natural Language Semantics

Figure 12.1 Conversation from (BNC, G4K).

Chapter 14: Semantic Complexity in Natural Language

Figure 14.1 Meaning derivation in Syl.

Figure 14.2 Meaning derivation in TV.

Figure 14.3 Meaning derivation in Syl+Rel.

Figure 14.4 Meaning derivation in TV+Rel.

Figure 14.5 Parsing in TV+Rel+RA.

Chapter 15: Implementing Semantic Theories

Figure 15.1 Example structure tree.

Chapter 16: Vector Space Models of Lexical Meaning

Figure 16.1 Simple example of document and query similarity using the dot product, with term-frequency providing the vector coefficients. The documents have been tokenized, and word matching is performed between lemmas (so wickets matches wicket).

Figure 16.2 Simple example of document and query similarity using the dot product, with term frequency, inverse document frequency providing the coefficients for the documents, using the same query and documents as Figure 16.1.

Figure 16.3 Term-document matrix for the simple running example, using tf-idf weights but without length normalization.

Figure 16.4 A small example corpus and term vocabulary with the corresponding term-term matrix, with term frequency as the vector coefficients. Each sentence provides a contextual window, and the sentences are assumed to have been lemmatized when creating the matrix.

Figure 16.5 Example sentence with part-of-speech tags from the Penn Treebank tagset (Marcus et al., 1993) and grammatical relations from the Briscoe and Carroll (2006) scheme. Contextual elements for the target word goal are shown for various definitions of context.

Figure 16.6 The effect of IDF on a simple example vector space.

Figure 16.7 Ranked lists of synonyms for the target words in bold from Curran's system.

Figure 16.8 Example vector space for sentence meanings. Source: Clark (2013). © Oxford University Press. By permission of Oxford University Press.

Figure 16.9 Example pregroup derivation. Source: Clark (2013). © Oxford University Press. By permission of Oxford University Press.

Figure 16.10 The tensor product of two vector spaces; is the coefficient of on the basis vector. Source: Clark (2013). © Oxford University Press. By permission of Oxford University Press.

Figure 16.11 An example “plausibility space” for sentences. Source: Clark (2013). © Oxford University Press. By permission of Oxford University Press.

Figure 16.12 Example pregroup derivation with semantic types. Source: Clark (2013). © Oxford University Press. By permission of Oxford University Press.

Chapter 17: Recognizing Textual Entailment

Figure 17.1 Representative non-entailing and entailing RTE examples.

Figure 17.2 Histogram of inference phenomena identified for positive entailment examples from RTE 2, from Garoufi (2007).

Figure 17.3 Histogram of number of inference steps needed to resolve entailment against number of examples from RTE 5 subset requiring that number, from Sammons et al. (2010).

Figure 17.4 Figure sketching formal proof of an entailment pair.

Figure 17.5 A sample “proof” from a simple shallow lexical RTE system.

Figure 17.6 Syntactic transformation-based model: initial entailment pair and a relevant rule.

Figure 17.7 Syntactic transformation-based model: result of first rule application and a second relevant rule.

Figure 17.8 Syntactic transformation model: final state after all rules are exhausted.

Figure 17.9 Sample alignment for entailing example.

Figure 17.10 Sample alignment for non-entailing example.

Chapter 18: Natural Logic

Figure 18.1 Some systems of natural logic. In this section, we discuss , and in section 2.2 we shall see . In section 3 we shall see .

Source: Moss (2014). With kind permission from Springer Science and Business Media.

Figure 18.3 Proof rules. See the text for the side conditions in the () and () rules. In the (trans) rule, is a comparative adjective. Source: Adapted from Moss (2010a). With kind permission from Springer Science and Business Media.

Figure 18.4 Derivations in Example 12. On the top is a derivation in our system, and on the bottom we have the same derivation rendered in Fitch-style.

Figure 18.5 The derivation in Example 15. Source: Adapted from Moss (2010a). With kind permission from Springer Science and Business Media.

Chapter 21: Probabilistic Semantics and Pragmatics: Uncertainty in Language and Thought

Figure 21.1 The collected Church definitions forming our simple intuitive theory (or conceptual lexicon) for the tug-of-war domain.

Figure 21.2 An example of explaining away. Lines show the distribution on Jane's inferred strength after (a) no observations; (b) observing that Jane beat Bob, whose strength is unknown; (c) learning that Bob is very weak, with strength ; (d) learning that Jane and Bob are different genders.

Figure 21.3 A linguistic example of explaining away, demonstrating that the literal listener makes non-monotonic inferences about the answer to the QUD “How strong is Jane?” given the utterances described in the main text. Lines show the probability density of answers to this QUD after (a) utterances 1–3; (b) utterances 1–4; (c) utterances 1–5.

Figure 21.4 The probability of the listener interpreting the utterance Most players played in some match according to the two possible quantifier scope configurations depends in intricate ways on the interpreter's beliefs and observations about the number of matches and the number of players on each team (a). This, in turn, influences the total information conveyed by the utterance (b). For this simulation there were 10 teams.

Figure 21.5 Normalized probability that the speaker will utter “Jane played in no/some/every match” in each situation, generated by reasoning about which utterance will most effectively bring the literal listener to select the correct answer to the QUD “How many matches did Jane play in?” (The parameter alpha is set to 5.)

Figure 21.6 Interpretation of “Jane played in some match” by the literal and pragmatic listeners, assuming that the only relevant alternatives are “Jane played in no/every match.” While the literal listener (left pane) assigns a moderate probability to the “all” situation given this utterance, the pragmatic listener (right pane) assigns this situation a very low probability. The difference is due to the fact that the pragmatic listener reasons about the utterance choices of the speaker (Figure 21.5 above), taking into account that the speaker is more likely to say “every” than “some” if “every” is true.

Figure 21.7 The literal listener's interpretation of an utterance containing a free threshold variable , assuming an uninformative prior on this variable. This listener's exclusive preference for true interpretations leads to a tendency to select extremely low values of (“degree posterior”). As a result the utterance conveys little information about the variable of interest: the strength posterior is barely different from the prior.

Figure 21.8 The pragmatic listener's interpretation of an utterance such as “Bob is strong,” containing a free threshold variable that has been lifted to the pragmatic level. Joint inference of the degree and the threshold leads to a “significantly greater than expected” meaning. (We assume that the possible utterances are to say nothing (cost 0) and “Bob is strong/weak” (cost 6), and alpha, as before.)

Figure 21.9 With prior distributions and parameters as above, the probability of the second premise of the sorites paradox is close to 1 when the inductive gap is small, but decreases as the size of the gap increases.

Chapter 22: Semantics and Dialogue

Figure 22.1 Discourse (31) as a segmented discourse representation structure.

List of Tables

Chapter 2: Scope

Table 2.1 Cooper Storage example.

Chapter 6: Presupposition and Implicature

Table 6.1 A typology of meaning classes.

Chapter 7: The Semantics of Tense and Aspect: A Finite-State Perspective

Table 7.1 From Russell–Wiener to Allen.

Chapter 8: Conditionals and Modality

Table 8.1 Common systems of modal logic.

Table 8.2 Some properties of accessibility relations referred to in this chapter.

Chapter 9: Semantics of Questions

Table 9.1 From implying question to implied question.

Chapter 11: Constructive Type Theory

Table 11.1 (a) The four standard types. (b) Propositions defined as types. (c) Introduction rules of natural deduction.

Table 11.2 The elimination operators and their definitions.

Table 11.3 Haskell notations for some type-theoretical concepts.

Chapter 14: Semantic Complexity in Natural Language

Table 14.1 Encoding Horn-clause satisfiability in DTV.

Chapter 16: Vector Space Models of Lexical Meaning

Table 16.1 Example noun vectors in .

Table 16.2 Example transitive verb vectors in .

Table 16.3 Vector for chases (order 3 tensor) together with subject dog and object cat.

Chapter 17: Recognizing Textual Entailment

Table 17.1 A sample of RTE pairs. Entries marked with a “*” could reasonably be considered unclear.

Table 17.2 Lexico-syntactic inference phenomena. Entries with no counts were identified as important by one or more of the works cited, but were not individually reported in any of the analyses.

Table 17.3 Occurrences of phenomena identifying non-entailing/contradictory entailment examples. Counts marked with an asterisk represent lower bounds.

Table 17.4 Types of world knowledge required for inference in entailment examples with frequency counts from various studies. Counts marked with an asterisk represent lower bounds.

Table 17.5 Augmented syntactic transformation-based model: proof and “cost.” If the summed cost of the operations is less than some threshold, the system will label the example as “entails,” otherwise as “not entails.”

Chapter 18: Natural Logic

Table 18.1 The syntax of .

Table 18.2 Syntax of terms and sentences of .

Chapter 19: The Syntax-Semantics Interface: Semantic Roles and Syntactic Arguments

Table 19.1 Feature definitions of semantic roles in theta theory.

Blackwell Handbooks in Linguistics

This outstanding multi-volume series covers all the major subdisciplines within linguistics today and, when complete, will offer a comprehensive survey of linguistics as a whole.

Recent Titles Include:

The Handbook of Language and Speech Disorders

Edited by Jack S. Damico, Nicole Müller, Martin J. Ball

The Handbook of Computational Linguistics and Natural Language Processing

Edited by Alexander Clark, Chris Fox, and Shalom Lappin

The Handbook of Language and Globalization

Edited by Nikolas Coupland

The Handbook of Hispanic Sociolinguistics

Edited by Manuel Díaz-Campos

The Handbook of Language Socialization

Edited by Alessandro Duranti, Elinor Ochs, and Bambi B. Schieffelin

The Handbook of Intercultural Discourse and Communication

Edited by Christina Bratt Paulston, Scott F. Kiesling, and Elizabeth S. Rangel

The Handbook of Historical Sociolinguistics

Edited by Juan Manuel Hernández-Campoy and Juan Camilo Conde-Silvestre

The Handbook of Hispanic Linguistics

Edited by José Ignacio Hualde, Antxon Olarrea, and Erin O'Rourke

The Handbook of Conversation Analysis

Edited by Jack Sidnell and Tanya Stivers

The Handbook of English for Specific Purposes

Edited by Brian Paltridge and Sue Starfield

The Handbook of Spanish Second Language Acquisition

Edited by Kimberly L. Geeslin

The Handbook of Chinese Linguistics

Edited by C.-T. James Huang, Y.-H. Audrey Li, and Andrew Simpson

The Handbook of Language Emergence

Edited by Brian MacWhinney and William O'Grady

The Handbook of Korean Linguistics

Edited by Lucien Brown and Jaehoon Yeon

The Handbook of Speech Production

Edited by Melissa A. Redford

The Handbook of Contemporary Semantic Theory, Second Edition

Edited by Shalom Lappin and Chris Fox

The Handbook of Classroom Discourse and Interaction

Edited by Numa Markee

Full series title list available at www.blackwellreference.com

The Handbook of Contemporary Semantic Theory

Second Edition

Edited by

 

Shalom Lappin and Chris Fox

 

This second edition first published 2015

© 2015 John Wiley & Sons, Inc

Edition History: Blackwell Publishing Ltd (1e, 1996)

Registered Office

John Wiley & Sons Ltd, The Atrium, Southern Gate, Chichester, West Sussex, PO19 8SQ, UK

Editorial Offices

350 Main Street, Malden, MA 02148-5020, USA

9600 Garsington Road, Oxford, OX4 2DQ, UK

The Atrium, Southern Gate, Chichester, West Sussex, PO19 8SQ, UK

For details of our global editorial offices, for customer services, and for information about how to apply for permission to reuse the copyright material in this book please see our website at www.wiley.com/wiley-blackwell.

The right of Shalom Lappin and Chris Fox to be identified as the authors of the editorial material in this work has been asserted in accordance with the UK Copyright, Designs and Patents Act 1988.

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, except as permitted by the UK Copyright, Designs and Patents Act 1988, without the prior permission of the publisher.

Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic books.

Designations used by companies to distinguish their products are often claimed as trademarks. All brand names and product names used in this book are trade names, service marks, trademarks or registered trademarks of their respective owners. The publisher is not associated with any product or vendor mentioned in this book.

Limit of Liability/Disclaimer of Warranty: While the publisher and authors have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. It is sold on the understanding that the publisher is not engaged in rendering professional services and neither the publisher nor the author shall be liable for damages arising herefrom. If professional advice or other expert assistance is required, the services of a competent professional should be sought.

Library of Congress Cataloging-in-Publication Data

The Handbook of Contemporary Semantic Theory / edited by Shalom Lappin and Chris Fox.—Second Edition.

pages cm

Includes bibliographical references and index.

ISBN 978-0-470-67073-6 (cloth)

1. Semantics – Handbooks, manuals, etc. 2. Semantics (Philosophy) – Handbooks, manuals, etc. I. Lappin, Shalom, editor. II. Fox, Chris, 1965- editor.

P325.H28 2015

401′.43 – dc23

2015015323

A catalogue record for this book is available from the British Library.

Cover image: Otto Freundlich, Untitled, c. 1930. © OTTO FREUNDLICH / AKG Images

For Ray.

Notes on Contributors

Chris Barker

Professor and Chair, Department of Linguistics, New York University. Chris Barker's current research program applies insights from the theory of programming languages to natural language semantics and the philosophy of language.

Ronnie Cann

Ronnie Cann has a long-established interest in research at the syntax/semantics interface, ranging over a number of theories. In recent years, his interests have focussed on the development of Dynamic Syntax, of which he is a core developer with Ruth Kempson, a collaboration that has resulted in two coauthored books, and a coedited book along with numerous journal articles and book chapters. He took a B.A. degree in classics from University College London before converting to Linguistics, receiving a diploma from UCL in 1979 and a D.Phil. from the University of Sussex in 1984. He has been teaching at the University of Edinburgh since 1984 where he is now Professor of Linguistic Semantics.

Eve V. Clark

Eve V. Clark is the Richard Lyman Professor in Humanities and Professor of Linguistics at Stanford University. She has done extensive crosslinguistic observational and experimental research on children's semantic and pragmatic development. Her books include Psychology and Language (with H. H. Clark, 1977), The Ontogenesis of Meaning (1979), The Acquisition of Romance, with Special Reference to French (1985), The Lexicon in Acquisition (1993), and First Language Acquisition (2nd edn., 2009).

Stephen Clark

Stephen Clark is Reader in Natural Language Processing at the University of Cambridge. Previously he was a member of Faculty at the University of Oxford and a postdoctoral researcher at the University of Edinburgh. He holds a Ph.D. in computer science and artificial intelligence from the University of Sussex and a philosophy degree from Cambridge. His main research interest is the development of data-driven models for the syntactic and semantic analysis of natural language. He is the recipient of a 1M five-year ERC Starting Grant (2012–17) to work on integrating distributional and compositional models of meaning, as well as the coordinator of a 1.5M five-site EPSRC grant (2012–15) in this area.

Robin Cooper

Robin Cooper is Senior Professor at the University of Gothenburg. He was previously Professor of Computational Linguistics at the University of Gothenburg and Director of the Swedish National Graduate School of Language Technology (GSLT). His present work centers on developing and promoting TTR (Type Theory with Records) as a foundational tool for the analysis of cognition and language. He is currently collaborating on this with Ellen Breitholtz, Simon Dobnik, Jonathan Ginzburg, Shalom Lappin and Staffan Larsson.

Jan van Eijck

Jan van Eijck is a senior researcher at CWI (Centre for Mathematics and Computer Science), Amsterdam, and part-time professor of computational semantics at the Institute for Logic, Language and Computation (ILLC), Amsterdam. From 1990 until 2011 he was part-time professor of computational linguistics at Uil-OTS (Research Institute for Language and Speech), Utrecht. Jan van Eijck teaches applied logic in the Master of Logic curriculum and software specification and testing in the Master of Software Engineering curriculum, both at the University of Amsterdam. He is former scientific director of the Dutch Research School in Logic (1997–2002), and a former employee of SRI International (Cambridge UK Laboratory), where he was involved in the design of the Core Language Engine, an industrial-scale natural-language processing project. Before that, he held an associate professorship at the University of Tilburg. He has a Ph.D. from the University of Groningen (1985).

Arash Eshghi

Arash Eshghi is a Research Fellow at Heriot-Watt University. He received his Ph.D. in human interaction from Queen Mary University of London. A computer scientist by training, his research has combined linguistics, computational linguistics, and psychology, with a growing interest in statistical models. The main theme of his research is that of building viable computational and psychological models of meaning and context in conversation. He has over 20 peer-reviewed publications in this area.

Tim Fernando

Tim Fernando has been a lecturer in the Computer Science Department of Trinity College Dublin since 1999. He was a postdoc of Hans Kamp, and a Ph.D. student of Solomon Feferman and Jon Barwise. He is interested in finite-state methods for knowledge representation: how far they reach, and where they break down.

Chris Fox

Chris Fox's research is located at the intersection of linguistics, computer science, and philosophy. His main interest is in the formal interpretation of language, and foundational issues in semantics. He has authored or coauthored numerous publications in this area, including two books: The Ontology of Language (CSLI, 2000) and Foundations of Intensional Semantics (Blackwell, 2005). These works explore axiomatic and proof-theoretic accounts of meaning. He also coedited the Handbook of Natural Language Processing and Computational Linguistics (Wiley-Blackwell, 2010). His current work is focused on foundational issues in the formal interpretation of language, in addition to an interest in the analysis of imperatives and deontic statements. Before his appointment as Reader at the University of Essex, Fox taught at Goldsmiths College, University of London, and King's College London. He was also a visiting fellow at the Computational Linguistics Institute in Saarbrücken. He holds a B.Sc. in computer science, an M.Sc. in cognitive science, and a Ph.D. from the Cognitive Science Centre, University of Essex.

Jonathan Ginzburg

Jonathan Ginzburg is Professor of Linguistics at Université Paris-Diderot (Paris 7). He is one of the founders and editor-in-chief (emeritus) of the journal Dialogue and Discourse. His research interests include semantics, dialogue, language acquisition, and musical meaning. He is the author of Interrogative Investigations (CSLI Publications, 2001, with Ivan A. Sag) and The Interactive Stance: Meaning for Conversation (Oxford University Press, 2012).

Noah D. Goodman

Noah D. Goodman is Assistant Professor of Psychology, Linguistics (by courtesy), and Computer Science (by courtesy) at Stanford University. He studies the computational basis of human thought, merging behavioral experiments with formal methods from statistics and logic. His areas of research include pragmatics, lexical semantics, social cognition, concept learning, and probabilistic programming languages. He received his Ph.D. in mathematics from the University of Texas at Austin in 2003. In 2005 he entered cognitive science, working as postdoc and research scientist at MIT. In 2010 he moved to Stanford, where he runs the Computation and Cognition Lab.

Eleni Gregoromichelaki

Eleni Gregoromichelaki holds an M.Sc. in computational linguistics and formal grammar and a Ph.D. in linguistics from King's College London. She is currently a research associate at King's College London, working within the dynamic syntax research group (http://www.kcl.ac.uk/research/groups/ds/). She has worked on the Dynamics of Conversational Dialogue (DynDial) ESRC project and the Leverhulme-funded Dialogue Matters, an interdisciplinary, international network set up to encourage collaboration on the study of dialogue. Her principal research interests lie in the syntax-semantics/pragmatics interface, in particular anaphora and ellipsis. She has also done work on conditionals, relative clauses, quantification, and clitics. In addition, she has published on the philosophical and psychological issues that arise for theories of language.

Magdalena Kaufmann

Magdalena Kaufmann is Assistant Professor at the Department of Linguistics at the University of Connecticut. She graduated from the University of Frankfurt with a doctoral dissertation on imperative clauses (published in Springer's SLAP series, 2012), and has since been working on various aspects of clause types and their relation to modality, as well as various semantic and pragmatic aspects of attitude ascriptions.

Stefan Kaufmann

Stefan Kaufmann is Associate Professor of Linguistics at the University of Connecticut. He works on various topics in semantics and pragmatics, including conditionals and modality, tense and aspect, discourse particles, and probabilistic approaches in natural language semantics and pragmatics. He also has active research interests in computational linguistics, especially in the extraction of semantic information from large text corpora.

Andrew Kehler

Andrew Kehler is Professor of Linguistics at the University of California, San Diego. His primary research foci are discourse interpretation and pragmatics, studied from the perspectives of theoretical linguistics, psycholinguistics, and computational linguistics.

Ruth Kempson

Ruth Kempson's work has spanned syntax, semantics and pragmatics, with special focus on their interface. She is best known in recent years for leading the development of the Dynamic Syntax framework, with many collaborative papers and books with Ronnie Cann, Eleni Gregoromichelaki, Matthew Purver, and others. She worked at the School of Oriental and African Studies (linguistics) 1970–1999, moving to King's College London (philosophy) 1999–2009. She is now an Emeritus Professor of King's College London and research associate at both the School of Oriental and African Studies (linguistics) and Queen Mary University of London (cognitive science group).

Shalom Lappin

Shalom Lappin is Professor of Computational Linguistics at King's College London. His current research focuses on probabilistic type theory for natural language semantics, and on stochastic models of grammaticality. He is working with Robin Cooper, Simon Dobnik, and Staffan Larsson of the University of Gothenburg on the development of a probabilistic version of Type Theory with Records as the basis for semantic representation and learning. Lappin is also PI of an ESRC research project on the stochastic representation of grammaticality at King's (which includes Alexander Clark and Jey Han Lau) that is constructing enriched language models and testing them against speakers' grammaticality judgments.

Daniel Lassiter

Daniel Lassiter is an assistant professor of linguistics at Stanford University. He works on modality, gradation, presupposition, implicature, and other topics in semantics and pragmatics, and is interested in using Bayesian tools to integrate formal semantics and pragmatics with cognitive models of language understanding and use.

Beth Levin

Beth Levin is the William H. Bonsall Professor in the Humanities and Professor in the Department of Linguistics at Stanford University. Her work investigates the lexical semantic representation of events and the ways in which English and other languages morphosyntactically express events and their participants.

Lawrence S. Moss

Lawrence S. Moss is Director of the Indiana University Program in Pure and Applied Logic. He is Professor of Mathematics, and Adjunct Professor of Computer Science, Informatics, Linguistics, and Philosophy, and a member of the Program in Cognitive Science and the Program in Computational Linguistics. His research interests include natural logic and other areas of interaction of logic and linguistics; coalgebra and its relation to circularity and category theory in theoretical computer science, and dynamic epistemic logic.

Christopher Potts

Christopher Potts is Associate Professor of Linguistics at Stanford and Director of the Center for the Study of Language and Information (CSLI) at Stanford. In his research, he uses computational methods to explore how emotion is expressed in language, and how linguistic production and interpretation are influenced by the context of utterance. He earned his B.A. from New York University in 1999 and his Ph.D. from the University of California, Santa Cruz in 2003.

Ian Pratt-Hartmann

Ian Pratt-Hartmann is Senior Lecturer in Computer Science at the University of Manchester and Professor of Computer Science at the University of Opole. He read mathematics and philosophy at Brasenose College, Oxford, and philosophy at Princeton University, receiving his Ph.D. there in 1987. Dr. Pratt-Hartmann has published widely in logic, cognitive science, and artificial intelligence. His current research interests include computational logic, spatial logic and natural language semantics.

Matthew Purver

Matthew Purver is a senior lecturer in the School of Electronic Engineering and Computer Science, Queen Mary University of London. He holds a B.A. and M.Phil. from the University of Cambridge, and a Ph.D. from King's College London (2004), and has held research positions at King's, Queen Mary and Stanford University. His research focus is on computational linguistics as applied to conversational interaction, both face-to-face and online, and he has published over 80 peer-reviewed papers in journals and conference proceedings in this area.

Aarne Ranta

Aarne Ranta received his Ph.D. from the University of Helsinki in 1990 with the thesis “Studies in Constructive Semantics,” supervised by Per Martin-Löf, during a period Ranta spent at the University of Stockholm. He continued working in constructive type theory and published the monograph Type-Theoretical Grammar in 1994 (Oxford University Press). As his work gradually focused on the computational aspects of type theory, he wrote grammar implementations that were first used as natural language interfaces to interactive proof editors. From this work, the Grammatical Framework (GF) emerged in 1998, as part of a project on Multilingual Document Authoring at Xerox Research Centre Europe in Grenoble. GF has grown into an international community with the mission of formalizing the grammars of the world and making them usable in computer applications. Grammatical Framework grammars have been written for over 30 languages, sharing a type-theoretical abstract syntax. Ranta's monograph Grammatical Framework: Programming with Multilingual Grammars appeared in 2011 (CSLI, Stanford; Chinese translation in 2014 at Shanghai Jiao Tong University Press). Since 1999, Ranta has been Associate Professor, and since 2005 full Professor, of Computer Science at the University of Gothenburg. From 2010 to 2013 he was the coordinator of the European Multilingual Online Translation (MOLTO) project, and in 2014 he became cofounder and CEO of Digital Grammars, a startup company with the mission of creating reliable language-technology applications.

Malka Rappaport Hovav

Malka Rappaport Hovav holds the Henya Sharef Chair in Humanities and is Professor of Linguistics and Head of the School of Language Sciences at the Hebrew University of Jerusalem. Her research focuses on the lexical semantic representation of argument-taking predicates and the morphosyntactic realization of their arguments.

Mark Sammons

Mark Sammons is a principal research scientist working with the Cognitive Computation Group at the University of Illinois. His primary interests are in natural language processing and machine learning, with a focus on textual entailment and information extraction. Mark received his M.Sc. in computer science from the University of Illinois in 2004, and his Ph.D. in mechanical engineering from the University of Leeds, England, in 2000.

Remko Scha

Remko Scha is Emeritus Professor of Computational Linguistics at the Institute of Logic, Language and Computation of the University of Amsterdam. He has worked on Natural Language Interface Systems at Philips' Research Laboratories in Eindhoven and was head of the Artificial Intelligence Department of BBN Laboratories in Cambridge, MA. His theoretical work has been concerned with formal semantics, discourse structure, Gestalt perception, and probabilistic syntax.

David Schlangen

David Schlangen is Professor of Applied Computational Linguistics at the Faculty of Linguistics and Literary Studies, Bielefeld University. His research interest is in the process by which interlocutors in a dialogue create shared understanding. He explores this by trying to build machines that understand what is being said to them, and that mean what they say. He has worked on the theory and practical implementation of incremental processing in dialogue, and more recently, on integrating gesture interpretation into dialogue systems.

Dag Westerståhl

Dag Westerståhl is Professor of Theoretical Philosophy and Logic in the Department of Philosophy, Stockholm University. His current research focuses on generalized quantifiers in language and logic, compositionality, consequence relations, and logical constants.

Yoad Winter

Yoad Winter's research focuses on problems in formal semantics, computational linguistics and African drum languages. He was an associate professor in computer science at the Technion, Israel Institute of Technology, and since 2009 he has been an associate professor in linguistics and artificial intelligence at Utrecht University.

Andrzej Wiśniewski

Andrzej Wiśniewski is Professor of Logic at the Department of Logic and Cognitive Science, Institute of Psychology, Adam Mickiewicz University in Poznań, Poland. He is the author of The Posing of Questions: Logical Foundations of Erotetic Inferences (Kluwer, 1995), Questions, Inferences, and Scenarios (College Publications, 2013), and Essays in Logical Philosophy (LiT Verlag, 2013), as well as of various articles published, inter alia, in Erkenntnis, Journal of Logic and Computation, Journal of Logic, Language and Information, Journal of Philosophical Logic, Logique et Analyse, Studia Logica, and Synthese. His major research interests are the logic of questions, epistemic logic, and proof theory.

Preface

We have been working on the second edition of The Handbook of Contemporary Semantic Theory for the past four years. When we started this project we thought that we would produce an update of the first edition. It quickly became apparent to us that we needed a more radical restructuring and revision in order to reflect the very substantial changes that much of the field has experienced in the time since the first edition was published. We think that it is fair to say that the current edition is, in almost all respects, an entirely new book. Most of the authors have changed, the topics have been substantially modified, and much of the research reported employs new methods and approaches.

Editing the Handbook has been a highly instructive and enriching experience. It has given us a clear sense of the depth and the vitality of work going on in the field today. We are grateful to the contributors for the enormous amount of thought and effort that they have invested in their chapters. The results are, in our view, of very high quality. We also appreciate their patience and cooperation over the long process of producing and revising the volume. It is their work that has ensured the success of this venture.

We owe a debt of gratitude to our respective families for accepting the distractions of our work on the Handbook with understanding and good humor. Their support has made it possible for us to complete this book.

Finally, we are grateful to our editors at Wiley-Blackwell, Danielle Descoteaux and Julia Kirk for their help. We have been privileged to work with them on this and previous projects. We greatly value their professionalism, their support, and their encouragement.

Shalom Lappin and Chris Fox

London and Wivenhoe

Introduction

This second edition of The Handbook of Contemporary Semantic Theory is appearing close to 20 years after the first edition was published in 1996. Comparing the two editions offers an interesting perspective on how significantly the field has changed in this time. It also points to elements of continuity that have informed semantic research throughout these years. Many of the issues central to the first edition remain prominent in the second edition. These include, inter alia, generalized quantifiers, the nature of semantic and syntactic scope, plurals, ellipsis and anaphora, presupposition, tense, modality, the semantics of questions, the relation between lexical semantics and syntactic argument structure, the role of logic in semantic interpretation, and the interface between semantics and pragmatics.

While many of the problems addressed in the second edition are inherited from the first, the methods with which these problems are formulated and investigated in some areas of the field have changed radically. This is clear from the fact that computational semantics, which took up one chapter in the first edition, has grown into a section of seven chapters in the current edition. Moreover, many of the chapters in other sections apply computational techniques to their respective research questions. As part of this development the investigation of rich type theories of the kind used in the semantics of programming languages has become a major area of interest in the semantics of natural language. Related to the emergence of such type theories for natural language semantics, we see a renewed interest in proof theory as a way of encoding semantic properties and relations.

Another interesting innovation is the development of probabilistic theories of semantics that model interpretation as a process of reasoning under uncertainty. This approach imports into semantic theory methods that have been widely used in cognitive science and artificial intelligence to account for perception, inference, and concept formation.

The rise of computational approaches and alternative formal methods has facilitated the development of semantic models that admit of rigorous examination through implementation and testing on large corpora. This has allowed researchers to move beyond small fragments that apply to a limited set of constructed examples. In this respect semantics has kept pace with other areas of linguistic theory in which computational modeling, controlled experiments with speakers, and corpus application have become primary tools of research.

The current edition of the Handbook is organized thematically into five sections, where each section includes chapters that address related research issues. For some sections the connections among the chapters are fairly loose, bundling together issues that have often been associated with each other in the formal semantics literature. In others, the sections correspond to well defined subfields of research. We have been relaxed about this organizational structure, using it to provide what we hope are useful signposts to clusters of chapters that deal with a range of connected research problems.

Part I is concerned with generalized quantifiers (GQs), scope, plurals, and ellipsis. In his chapter on generalized quantifiers, Dag Westerståhl provides a comprehensive discussion of the formal properties of generalized quantifiers in logic and in natural language. He gives us an overview of research in this area since the late 1980s, with precise definitions of the major classes of GQs, and their relations to the syntactic categories and semantic types of natural language. Particularly useful is his very clear treatment of the expressive power required to characterize different GQ classes. The chapter concludes with a brief discussion of the complexity involved in computing distinct types of GQ.

Chris Barker's chapter analyzes the relationship between semantic scope and syntactic structure. Barker gives us a detailed study of the intricate connections between different sorts of scope interaction and scope ambiguity, and the syntactic environments in which these phenomena occur. He surveys alternative formal and theoretical frameworks for representing the semantic properties of scope-taking expressions. He suggests computational models of scope interpretation. This chapter complements the preceding one on GQs, and it provides an illuminating discussion of central questions concerning the nature of the syntax-semantics interface.

Yoad Winter and Remko Scha examine the semantics of plural expressions. A core issue that they address is the distinction between distributive and collective readings of plural noun phrases and verbs. They look at the algebra and the mereology of collective objects, which some plural expressions can be taken to denote. They analyze the relations between different types of quantification and plurality. They consider a variety of theoretical approaches to the problems raised by plural reference. This chapter extends and develops several of the themes raised in the preceding two chapters.

The last chapter in Part I is devoted to ellipsis. Ruth Kempson et al. consider several traditional ellipsis constructions, such as verb phrase ellipsis, bare argument structures, and gapping. They also take up “incomplete” utterances in dialogue. These are constructions that have not generally been handled by the same mechanisms that are proposed for ellipsis resolution. They review the arguments for and against syntactic reconstruction and semantic theories of ellipsis. They consider the application of these theories to dialogue phenomena, and they examine whether a theory of ellipsis can be subsumed under a general theory of anaphora. They propose a unified account of ellipsis within the framework of dynamic syntax, which relies on underspecified linguistic input and informational update procedures for the specification of an incrementally applied “syntax.” As in the previous chapters, the role of syntactic mechanisms in determining semantic scope, and the interaction of quantification and scope, are important concerns.

Part II consists of chapters on modification, presupposition, tense, and modality. In his chapter on adjectival modification, Dan Lassiter discusses several types of intersective and intensional adjectives, observing that the differences between these classes of modifiers do not constitute a simple binary distinction. An important phenomenon, to which he devotes a considerable amount of attention, is the class of gradable adjectives and the vagueness involved in their application. Lassiter considers leading accounts of gradation, critically discussing theories that posit degrees of modification. In this part of his chapter he describes a probabilistic view of predication, which is further developed in his coauthored chapter with Noah Goodman in Part V.

Chris Potts addresses the nature of presupposition and implicature. He surveys semantic presuppositions, encoded in the meanings of lexical items, and pragmatic presuppositions, which derive from the conditions of successful discourse. He considers the devices for projecting, filtering, and blocking presuppositions through composition of meaning in larger syntactic constructions. Potts gives us a detailed discussion of the relationship between presupposition and pragmatic implicature. He takes up the question of how speakers accommodate both presupposition and implicature in discourse. He critically examines several influential formal theories of the role of presupposition in semantic interpretation.

Tim Fernando's chapter is devoted to tense and aspect. Fernando surveys a variety of temporal logics and semantic theories for representing the structure of time, as it is expressed in natural language. He suggests that this structure corresponds to strings of situations (where situations include the class of events). He proposes the hypothesis that the semantically significant properties and relations that hold among the temporal strings required to interpret tense and aspect can be computed by finite state automata. Fernando offers a detailed discussion of phenomena associated with tense and aspect to motivate his hypothesis.

In the final chapter in Part II, Magdalena and Stefan Kaufmann examine the problems involved in representing different sorts of modal terms. They begin with an overview of modal logic and Kripke frame semantics. Within this framework modal operators are quantifiers over the set of possible worlds, constrained by an accessibility relation. They go on to look at extensions of this system designed to capture the properties of different modal expressions in natural language. A main feature of the system that is subject to revision is the accessibility relation on worlds. It is specified to restrict accessible worlds to those in which the propositions that hold express the common ground of assumptions on which coherent discourse depends. One of the Kaufmanns' central concerns in this chapter is to clarify the relationship between the semantics of modality and the interpretation of conditional sentences.

Part III of the Handbook is concerned with the semantics of nondeclarative sentences. In the first chapter in this part, Andrzej Wiśniewski explores the interpretation of questions. A major issue in this area has been the relationship between a question and the set of possible answers in terms of which it is interpreted. Wiśniewski examines this topic in detail. He focuses on the problem of how questions, given that they do not have truth values, can nonetheless be sound or unsound, and can sustain inferences and implications. He proposes an account of the semantics of questions within the tradition of erotetic logic, whose historical background he describes.

In the second chapter of this part, Chris Fox discusses the semantics of imperatives. He notes that, like questions, imperatives have logical properties and support entailments, although they lack truth values. He also cites several of the apparent paradoxes that have been generated by previous efforts to model the semantic properties of these sentences. Fox suggests that the logical properties of imperatives are best modelled by a logic in which certain judgement patterns constitute valid inferences, even when their constituent sentences are imperatives rather than propositional assertions. He proposes a fragment of such a logic, which implements an essentially proof-theoretic approach to the task of formalising the semantics of imperatives.

Part IV is devoted to type theory and computational semantics. Aarne Ranta's chapter provides an introduction to the basic concepts of constructive type theory and their applications in logic, mathematics, programming, and linguistics. He demonstrates the power of this framework for natural language semantics with the analysis of donkey anaphora through dependent types. He traces the roots of type theory in earlier work in logic, philosophy, and formal semantics. Ranta illustrates the role of type theory in functional programming through the formalisation of semantically interesting examples in Haskell. He offers an overview of his own system for computational linguistic programming, Grammatical Framework (GF), in which both the syntactic and semantic properties of expressions are represented in an integrated type-theoretical formalism. He goes on to indicate how GF can also be used to capture aspects of linguistic interaction in dialogue.

Robin Cooper and Jonathan Ginzburg present a detailed account of type theory with records (TTR) as a framework for modeling both compositional semantic interpretation and dynamic update in dialogue. They show how TTR achieves the expressive capacity of typed feature structures while sustaining the power of functional application, abstraction, and variable binding in the λ-calculus. A key element of the TTR approach to meaning is the idea that interpretation consists in judging that a situation is of a certain type. Cooper and Ginzburg illustrate how record types and subtyping permit us to capture fine-grained aspects of meaning that elude the classical type theories that have traditionally been used within formal semantics. They also ground TTR in basic types that can be learned through observation as classifiers of situations. In this way TTR builds compositional semantics bottom up from the acquisition of concepts applied in perceptual judgement.