Optimization in Engineering Sciences

Dan Stefanoiu
Description

The purpose of this book is to present the main metaheuristics and the approximate and stochastic methods for the optimization of complex systems in the engineering sciences. It was written within the framework of the European Union project ERRIC (Empowering Romanian Research on Intelligent Information Technologies), funded by the EU's FP7 Research Potential program, and was developed in cooperation between French and Romanian teaching researchers. Through the principles of the various proposed algorithms (with additional references), this book allows the reader to explore several families of methods, such as metaheuristics, local search and population-based methods. It examines multi-objective and stochastic optimization, as well as methods and tools for computer-aided decision-making and simulation for decision-making.




Contents

List of Figures

List of Tables

List of Algorithms

List of Acronyms

Preface

Acknowledgments

1 Metaheuristics – Local Methods

1.1. Overview

1.2. Monte Carlo principle

1.3. Hill climbing

1.4. Taboo search

1.5. Simulated annealing

1.6. Tunneling

1.7. GRASP methods

2 Metaheuristics – Global Methods

2.1. Principle of evolutionary metaheuristics

2.2. Genetic algorithms

2.3. Hill climbing by evolutionary strategies

2.4. Optimization by ant colonies

2.5. Particle swarm optimization

2.6. Optimization by harmony search

3 Stochastic Optimization

3.1. Introduction

3.2. Stochastic optimization problem

3.3. Computing the repartition function of a random variable

3.4. Statistical criteria for optimality

3.5. Examples

3.6. Stochastic optimization through games theory

4 Multi-Criteria Optimization

4.1. Introduction

4.2. Introductory examples

4.3. Multi-criteria optimization problems

4.4. Model solving methods

4.5. Two objective functions optimization for advanced control systems

4.6. Notes and comments

5 Methods and Tools for Model-based Decision-making

5.1. Introduction

5.2. Introductory examples

5.3. Decisions and decision activities. Basic concepts

5.4. Decision analysis

5.5. Notes and comments

5.6. Other remarks/comments

6 Decision-Making – Case Study Simulation

6.1. Decision problem in uncertain environment

6.2. Problem statement

6.3. Simulation principle

6.4. Case studies

Appendix 1: Uniformly Distributed Pseudo-Random Generators

A1.1. Hardware algorithm

A1.2. Software algorithm

A1.3. Properties of (B)PRS

Appendix 2: Prescribed Distribution Pseudo-Random Generators

A2.1. Principle of stochastic universal sampling

A2.2. Baker’s genuine algorithm

A2.3. Baker’s generalized algorithm

A2.4. Examples of generated PRS

Bibliography

Index

First published 2014 in Great Britain and the United States by ISTE Ltd and John Wiley & Sons, Inc.

Apart from any fair dealing for the purposes of research or private study, or criticism or review, as permitted under the Copyright, Designs and Patents Act 1988, this publication may only be reproduced, stored or transmitted, in any form or by any means, with the prior permission in writing of the publishers, or in the case of reprographic reproduction in accordance with the terms and licenses issued by the CLA. Enquiries concerning reproduction outside these terms should be sent to the publishers at the undermentioned address:

ISTE Ltd
27-37 St George's Road
London SW19 4EU
UK

www.iste.co.uk

John Wiley & Sons, Inc.
111 River Street
Hoboken, NJ 07030
USA

www.wiley.com

© ISTE Ltd 2014

The rights of Dan Stefanoiu, Pierre Borne, Dumitru Popescu, Florin Gh. Filip and Abdelkader El Kamel to be identified as the authors of this work have been asserted by them in accordance with the Copyright, Designs and Patents Act 1988.

Library of Congress Control Number: 2014947882

British Library Cataloguing-in-Publication Data
A CIP record for this book is available from the British Library.
ISBN 978-1-84821-498-9

List of Figures

List of Tables

1.1. Expense rates for computer network maintenance missions
1.2. Taboo search leading to blocking
1.3. Taboo search leading to optimum solution
2.1. List of configuring parameters in GA design
2.2. List of monitored ecological parameters from a greenhouse
2.3. Analogy between musical composition and optimization
2.4. Example of initial harmonic memory
2.5. Example of harmonic memory improvement
2.6. Example of optimal harmonic memory
3.1. Matrix of a stochastic game
4.1. Consequences table for the problem of choosing a job (deterministic case)
4.2. Decision table for the problem of choosing a job (deterministic case)
4.3. Initial data and results obtained by applying TOPSIS method to the problem of choosing a job (deterministic case)
4.4. Consequences table for the problem of selecting an IT product
4.5. Matrix of ranks for the problem of selecting an IT product
4.6. Performance and results for the problem of selecting an IT product
4.7. Decision table for the problem of selecting an IT product
4.8. Condorcet method applied to the problem of selecting an IT product: $C_{ik}$ sets
4.9. Condorcet method applied to the problem of selecting an IT product: $d_{ik}$ indicators
5.1. Table of consequences in the problem of job selection (probabilistic case)
5.2. Numerical data for the problem of starting a business
5.3. Decision table in the problem of choosing a job (probabilistic case)
5.4. Consequences table in the problem of IT engineer selection
6.1. Preliminary analysis by ordered quantity, Q (stock management)
6.2. Refined analysis by ordered quantity, Q (stock management)

List of Algorithms

1.1. Basic Monte Carlo procedure for local heuristic optimization
1.2. Basic hill climbing procedure
1.3. Improved hill climbing procedure, by using the Cauchy compass
1.4. Greedy descent procedure
1.5. Taboo search procedure
1.6. Simulated annealing procedure
1.7. Stochastic tunneling procedure
2.1. Genetic procedure for solving the matching pursuit problem in a time-frequency-scale dictionary
2.2. Hill climbing procedure by approaching the steepest ascent
2.3. Hill climbing procedure by approaching the next ascent
2.4. Hill climbing by group of alpinists
2.5. Basic procedure for the optimization by ant colony
2.6. Systemic optimization procedure by means of ant colony
2.7. Optimization procedure by ant colony, to solve the traveling salesman problem
2.8. Standard procedure of PSO
2.9. Adaptive procedure of PSO, with evolutionary strategy
2.10. Standard optimization procedure by using fireflies
2.11. Standard optimization procedure by using bats
2.12. Standard optimization procedure through bee colony
2.13. Standard optimization procedure by harmony search
A2.1. Main steps of Baker’s genuine procedure
A2.2. Main steps of Baker’s generalized procedure

List of Acronyms

ACA: Ant colony algorithm
AHP: Analytic hierarchy process
AI: Artificial intelligence
ANP: Analytic network process
AP: Achieved performance
APSOA: Adaptive particle swarm optimization algorithm
AR: Autoregressive (model(s), class, component, etc.)
ARMA: Autoregressive model (class) with moving average
ARMAX: Autoregressive model (class) with moving average and exogenous control
BA: Baker's (genuine) algorithm
BatA: Bats algorithm
BeeA: Bees algorithm
BGA: Baker's generalized algorithm
BPRS: Binary pseudo-random sequence
DM: Decision model
DNA: Deoxyribonucleic acid
DSS: Decision support system
EV: Expected value (statistical mean)
FA: Fireflies algorithm
GA: Genetic algorithm
GRASP: Greedy randomized adaptive search procedure
GT: Games theory
HITA: "How is this (objective) attained?" (decision test)
HSA: Harmony search algorithm
IPH: Implicit parallelism hypothesis (of Holland)
IT: Information technology
LSB: Least significant bit
LSM: Least squares method
MA: Moving average (model(s), class, component, etc.)
MAP: Multi-attribute problem
MAUT: Multi-attribute utility theory
MCDA: Multicriteria decision analysis
MCDM: Multicriteria decision-making
MCP: Multi-criteria problem
MOP: Multi-objective problem
MSB: Most significant bit
N-PRS: Pseudo-random sequence with normal (Gaussian) distribution
N-PRSG: Normal (Gaussian) pseudo-random sequences generator
NC: Nominal controller
NM: Nominal model
NP: Nominal performance
NS: Nominal system
OWA: Ordered weights averaging
P-PRSG: Prescribed probability distribution pseudo-random sequences generator
PQ: Prediction quality
PRS: Pseudo-random (numerical) sequence
PRSG: Pseudo-random sequence generator
PSO: Particle swarm optimization
PSOA: Particle swarm optimization algorithm
RS: Real system
SACA: Systemic ant colony algorithm
SI: System identification
SNR: Signal-to-noise ratio
SPSOA: Standard particle swarm optimization algorithm
ST: System theory
SUS: Stochastic universal sampling
s.t.: subject to (restrictions or constraints)
TOPSIS: Technique for ordering by similarity to ideal solution
U-PRS: Uniformly distributed pseudo-random sequence of numbers
U-PRSG: Uniformly distributed pseudo-random sequences generator
WDTM: "What does this mean?" (decision test)
WITI: "Why is this (objective) important?" (decision test)

Preface

This book is the result of a collaboration between teachers and researchers from France and Romania, with the support of the European FP7 project ERRIC. Its goal is to complete the work presented in Optimization in Engineering Sciences: Exact Methods, published by ISTE and John Wiley & Sons in 2013. That first volume was concerned with exact optimization methods at the service of engineers and decision-makers.

In the case of uncertain or poorly defined problems, possibly subject to random perturbations, or for which the search for solutions may lead to combinatorial explosion, exact methods are very unlikely to provide solutions within an acceptable period of time.

The methods presented in this volume allow us to find, if not the optimal solution of a problem, at least a good solution in acceptable run times.

The methods presented here are:

– metaheuristics: local approaches (such as simulated annealing, taboo search, etc.) and global approaches (such as evolutionary algorithms, ant colonies and particle swarms); these methods are mainly inspired by nature;
– stochastic approaches, which take into account the uncertainties of various data sources and include some aspects of game theory;
– multi-objective optimization;
– methods and techniques for decision-making, which also include some aspects of game theory and for which numerous references to existing software are made;
– the use of simulation for decision-making.

Throughout the book, various examples are presented in order to illustrate the proposed methods, and the possible application fields of the algorithms concerned are indicated.

Dan STEFANOIU
Pierre BORNE
Dumitru POPESCU
Florin Gheorghe FILIP
Abdelkader EL KAMEL

Bucharest and Lille
September 2014

Acknowledgments

The authors are very grateful to Dr. L. Popescu, Dr. J. Culita, Dr. F. Popa and Miss D. Gherghina, who kindly agreed to take part in translating this book from French into English and/or to perform professional English proofreading of the whole book.

1 Metaheuristics – Local Methods

1.1. Overview

The term heuristic in the expression heuristic optimization originates from ancient Greek, where heuriskein (εὑρίσκειν) means "to discover" or "to learn". There is an even more subtle meaning of this word, as revealed by the following example: assume that the readers of this book are challenged to find and mark in the text the positions of the words metaheuristic or metaheuristics. Each of us has a specific strategy to meet the challenge. Nevertheless, in general, there are two categories of readers: readers who systematically analyze the text, trying not to miss any occurrence, and readers who scan the text "diagonally", relying on their visual capacity for pattern recognition, without actually looking at every word. We say that the first category of readers performs an exhaustive search (like a computer), while the second category performs a heuristic search (like a living entity, not necessarily consciously). Obviously, the search duration for the first type of reader is usually longer than for the second. Moreover, it is very likely that the first type of reader will find almost all occurrences, while the second type could miss quite a few of them. Nevertheless, when comparing the performance of the two categories of readers, it can be seen that the number of occurrences found by the first category is surprisingly close to the number marked by the second, despite the big difference between the search durations.
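To make the contrast concrete, here is a small sketch of the two reading strategies on a synthetic text: an exhaustive scan that inspects every word versus a heuristic scan that samples only a fraction of the word positions and extrapolates. The text, the sampling rate and the rescaling are our illustrative assumptions, not taken from the book.

    import random

    # Synthetic "book": 10,000 words, of which 1,000 are the target word.
    text = ("metaheuristic " + "word " * 9) * 1000
    words = text.split()

    # Exhaustive search: every word is examined (like a computer).
    exhaustive_count = sum(w == "metaheuristic" for w in words)

    # Heuristic search: only a random fifth of the words is examined,
    # then the count is rescaled to the whole text.
    sample = random.sample(range(len(words)), k=len(words) // 5)
    sampled_hits = sum(words[i] == "metaheuristic" for i in sample)
    heuristic_estimate = 5 * sampled_hits

    print(exhaustive_count, heuristic_estimate)  # close values, ~20% of the work

Typically the two counts differ by only a few percent, while the heuristic reader has examined five times fewer words; this trade-off between search effort and solution quality is precisely what the methods of this chapter exploit.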

Chapters 1 and 2 of this book are devoted to the description of a collection of optimization methods that mimic the second type of reader's approach. Such methods are actually inspired either by the behavior of biological entities/systems or by the evolution of some natural phenomena. Next, the discussion focuses on a special class of optimization problems in engineering, more specifically on the class of granular optimization problems. This concept is explained in the following.

The optimization problem of concern is formulated in the context of a cost function (or criterion), as defined below:

$$f : S \subseteq \mathbb{R}^{n_x} \to \mathbb{R} \qquad [1.1]$$

where the search space $S$ is usually a bounded subset of $\mathbb{R}^{n_x}$. (Sometimes, $f$ is referred to as fitness.) What makes the criterion $f$ so special? On the one hand, it has a fractal variation, with many ruptures (and thus with many local extrema). On the other hand, it is quite difficult (if not impossible) to evaluate its derivatives in closed form (provided that they exist). In terms of regularity, the criterion $f$ is locally continuous or differentiable, but this property does not extend to its global variation. (The criterion could be globally smooth, but this very seldom happens.) However, we can evaluate the criterion $f$ at each point $\mathbf{x}$ of the search space $S$, according to some a priori known mathematical expression. Thus, in order to solve the optimization problem:

$$\operatorname*{opt}_{\mathbf{x} \in S} f(\mathbf{x}) \qquad [1.2]$$

which means to find the global optimum $f^\star$ of $f$ and the optimum point $\mathbf{x}^\star$, it is only possible to compare various criterion values at points of the search space. As the search has to end after a reasonable duration, only a finite discrete subset of $S$, say $\bar{S} \subset S$, can be used for this purpose. We aim not to miss the global optimum, and therefore the subset $\bar{S}$ has to include a very large number of points to test.
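As a minimal sketch of this brute-force comparison (essentially the Monte Carlo idea developed in section 1.2), the following Python fragment samples a finite subset of $S$ uniformly and keeps the best criterion value found. The criterion (the Rastrigin test function), the box bounds and the sample size are our illustrative assumptions, not taken from the book.

    import numpy as np

    def f(x):
        # A criterion with many local extrema (the Rastrigin test function);
        # its global minimum over the box [-5.12, 5.12]^n is f(0) = 0.
        return 10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

    rng = np.random.default_rng(0)
    n_x = 2                        # dimension of the search space S
    lo, hi = -5.12, 5.12           # S assumed to be a bounded box
    n_points = 10_000              # size of the finite subset of S

    # Draw a finite discrete subset of S and keep the point with the best
    # (here: smallest) criterion value, as in problem [1.2].
    best_x, best_f = None, np.inf
    for _ in range(n_points):
        x = rng.uniform(lo, hi, size=n_x)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx

    print(best_x, best_f)

Even in two dimensions, thousands of points are needed before the best value approaches the global minimum; in higher dimensions the required subset grows dramatically, which is why the metaheuristics of this chapter guide the sampling instead of drawing blindly.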

Read on in the full edition!