Machine Learning for Cloud Management

Jitendra Kumar
Ashutosh Kumar Singh
Anand Mohan
Rajkumar Buyya
First edition published 2022
by CRC Press
6000 Broken Sound Parkway NW, Suite 300, Boca Raton, FL 33487-2742

and by CRC Press


2 Park Square, Milton Park, Abingdon, Oxon, OX14 4RN

© 2022 Jitendra Kumar, Ashutosh Kumar Singh, Anand Mohan, Rajkumar Buyya

CRC Press is an imprint of Taylor & Francis Group, LLC

Reasonable efforts have been made to publish reliable data and information, but the author and publisher cannot as-
sume responsibility for the validity of all materials or the consequences of their use. The authors and publishers have
attempted to trace the copyright holders of all material reproduced in this publication and apologize to copyright holders
if permission to publish in this form has not been obtained. If any copyright material has not been acknowledged please
write and let us know so we may rectify in any future reprint.

Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced, transmitted, or
utilized in any form by any electronic, mechanical, or other means, now known or hereafter invented, including pho-
tocopying, microfilming, and recording, or in any information storage or retrieval system, without written permission
from the publishers.

For permission to photocopy or use material electronically from this work, access www.copyright.com or contact the
Copyright Clearance Center, Inc. (CCC), 222 Rosewood Drive, Danvers, MA 01923, 978-750-8400. For works that are
not available on CCC please contact [email protected]

Trademark notice: Product or corporate names may be trademarks or registered trademarks and are used only for iden-
tification and explanation without intent to infringe.

Library of Congress Cataloging-in-Publication Data

Names: Kumar, Jitendra, 1975- author. | Singh, Ashutosh Kumar, author. |


Mohan, Anand (Of Indian Institute of Technology), author. | Buyya,
Rajkumar, 1970- author.
Title: Machine learning for cloud management / Jitendra Kumar, Ashutosh
Kumar Singh, Anand Mohan, Rajkumar Buyya.
Description: First edition. | Boca Raton : CRC Press, 2022. | Includes
bibliographical references and index.
Identifiers: LCCN 2021027713 | ISBN 9780367626488 (hardback) | ISBN
9780367622565 (paperback) | ISBN 9781003110101 (ebook)
Subjects: LCSH: Cloud computing. | Machine learning.
Classification: LCC QA76.585 .K85 2022 | DDC 004.67/82--dc23
LC record available at https://lccn.loc.gov/2021027713

ISBN: 978-0-367-62648-8 (hbk)


ISBN: 978-0-367-62256-5 (pbk)
ISBN: 978-1-003-11010-1 (ebk)

DOI: 10.1201/9781003110101

Publisher’s note: This book has been prepared from camera-ready copy provided by the authors.
Dedicated to,
My wife: Gita, daughter: Aru, and Parents

∼Jitendra Kumar

Anushka, Aakash, Akankshya, and Parents

∼Ashutosh Kumar Singh

My wife: Sudha Mohan, son: Ashish Mohan, daughter: Amrita Mohan, and Late parents

∼Anand Mohan

My international collaborators and team members in Melbourne CLOUDS Lab!

∼Rajkumar Buyya
Contents

List of Figures xi

List of Tables xvii

Preface xix

Author xxi

Abbreviations xxv

Chapter 1  Introduction 1

1.1 CLOUD COMPUTING 1


1.2 CLOUD MANAGEMENT 2
1.2.1 Workload Forecasting 3
1.2.2 Load Balancing 4
1.3 MACHINE LEARNING 5
1.3.1 Artificial Neural Network 5
1.3.2 Metaheuristic Optimization Algorithms 6
1.3.3 Time Series Analysis 7
1.4 WORKLOAD TRACES 7
1.5 EXPERIMENTAL SETUP & EVALUATION METRICS 8
1.6 STATISTICAL TESTS 9
1.6.1 Wilcoxon Signed-Rank Test 10
1.6.2 Friedman Test 10
1.6.3 Finner Test 10

Chapter 2  Time Series Models 13

2.1 AUTOREGRESSION 14
2.2 MOVING AVERAGE 14
2.3 AUTOREGRESSIVE MOVING AVERAGE 15


2.4 AUTOREGRESSIVE INTEGRATED MOVING AVERAGE 15


2.5 EXPONENTIAL SMOOTHING 17
2.6 EXPERIMENTAL ANALYSIS 17
2.6.1 Forecast Evaluation 17
2.6.2 Statistical Analysis 21

Chapter 3  Error Preventive Time Series Models 25

3.1 ERROR PREVENTION SCHEME 25


3.2 PREDICTIONS IN ERROR RANGE 27
3.3 MAGNITUDE OF PREDICTIONS 28
3.4 ERROR PREVENTIVE TIME SERIES MODELS 29
3.4.1 Error Preventive Autoregressive Moving Average 29
3.4.2 Error Preventive Autoregressive Integrated Moving Average 35
3.4.3 Error Preventive Exponential Smoothing 38
3.5 PERFORMANCE EVALUATION 47
3.5.1 Comparative Analysis 47
3.5.2 Statistical Analysis 53

Chapter 4  Metaheuristic Optimization Algorithms 59

4.1 SWARM INTELLIGENCE ALGORITHMS IN PREDICTIVE MODEL 59


4.1.1 Particle Swarm Optimization 60
4.1.2 Firefly Search Algorithm 61
4.2 EVOLUTIONARY ALGORITHMS IN PREDICTIVE MODEL 62
4.2.1 Genetic Algorithm 63
4.2.2 Differential Evolution 63
4.3 NATURE INSPIRED ALGORITHMS IN PREDICTIVE MODEL 64
4.3.1 Harmony Search 64
4.3.2 Teaching Learning Based Optimization 66
4.4 PHYSICS INSPIRED ALGORITHMS IN PREDICTIVE MODEL 67
4.4.1 Gravitational Search Algorithm 68
4.4.2 Blackhole Algorithm 69
4.5 STATISTICAL PERFORMANCE ASSESSMENT 71

Chapter 5  Evolutionary Neural Networks 75

5.1 NEURAL NETWORK PREDICTION FRAMEWORK DESIGN 75


5.2 NETWORK LEARNING 77
5.3 RECOMBINATION OPERATOR STRATEGY LEARNING 78
5.3.1 Mutation Operator 78
5.3.1.1 DE/current to best/1 78
5.3.1.2 DE/best/1 78
5.3.1.3 DE/rand/1 78
5.3.2 Crossover Operator 79
5.3.2.1 Ring Crossover 79
5.3.2.2 Heuristic Crossover 79
5.3.2.3 Uniform Crossover 80
5.3.3 Operator Learning Process 80
5.4 ALGORITHMS AND ANALYSIS 83
5.5 FORECAST ASSESSMENT 86
5.5.1 Short Term Forecast 87
5.5.2 Long Term Forecast 88
5.6 COMPARATIVE ANALYSIS 92

Chapter 6  Self Directed Learning 97

6.1 NON-DIRECTED LEARNING-BASED FRAMEWORK 97


6.1.1 Non-Directed Learning 98
6.2 SELF-DIRECTED LEARNING-BASED FRAMEWORK 99
6.2.1 Self-Directed Learning 100
6.2.2 Cluster-Based Learning 100
6.2.3 Complexity analysis 102
6.3 FORECAST ASSESSMENT 103
6.3.1 Short Term Forecast 103
6.3.1.1 Web Server Workloads 104
6.3.1.2 Cloud Workloads 104
6.4 LONG TERM FORECAST 106
6.4.0.1 Web Server Workloads 106
6.4.0.2 Cloud Workloads 107
6.5 COMPARATIVE & STATISTICAL ANALYSIS 108

Chapter 7  Ensemble Learning 121

7.1 EXTREME LEARNING MACHINE 121


7.2 WORKLOAD DECOMPOSITION PREDICTIVE FRAMEWORK 122
7.2.1 Framework Design 122
7.3 ELM ENSEMBLE PREDICTIVE FRAMEWORK 125
7.3.1 Ensemble Learning 126
7.3.2 Expert Architecture Learning 127
7.3.3 Expert Weight Allocation 129
7.4 SHORT TERM FORECAST EVALUATION 130
7.5 LONG TERM FORECAST EVALUATION 133
7.6 COMPARATIVE ANALYSIS 137

Chapter 8  Load Balancing 141

8.1 MULTI-OBJECTIVE OPTIMIZATION 141


8.2 RESOURCE-EFFICIENT LOAD BALANCING FRAMEWORK 142
8.3 SECURE AND ENERGY-AWARE LOAD BALANCING FRAMEWORK 146
8.3.1 Side-Channel Attacks 147
8.3.2 Ternary Objective VM Placement 148
8.4 SIMULATION SETUP 151
8.5 HOMOGENEOUS VM PLACEMENT ANALYSIS 151
8.6 HETEROGENEOUS VM PLACEMENT ANALYSIS 152

Chapter 9  Summary 155

Bibliography 159

Index 171
List of Figures

1.1 Service model view of cloud computing 2


1.2 Cloud resource management view 2
1.3 Cloud resource management approaches 3
1.4 Schematic representation of workload forecasting 4
1.5 Load balancing 5
1.6 Artificial neural network 6
1.7 An arbitrary optimization function with multiple local optima 7

2.1 Autoregression process 14


2.2 Moving average process 15
2.3 Autoregressive moving average process 15
2.4 Autoregressive integrated moving average process 16
2.5 Exponential smoothing process 17
2.6 Autoregression forecast results on MAE 18
2.7 Autoregression forecast results on MASE 18
2.8 Moving average forecast results on MAE 19
2.9 Moving average forecast results on MASE 19
2.10 Autoregressive moving average forecast results on MAE 20
2.11 Autoregressive moving average forecast results on MASE 20
2.12 Autoregressive integrated moving average forecast results on MAE 21
2.13 Autoregressive integrated moving average forecast results on MASE 21
2.14 Exponential smoothing forecast results on MAE 22
2.15 Exponential smoothing forecast results on MASE 22

3.1 Error preventive workload forecasting model 26


3.2 An example of error preventive and non-error preventive forecast 27
3.3 Prediction error range and segments 28
3.4 Error preventive ARMA forecast analysis for 10-minute prediction
interval 30


3.5 Error preventive ARMA forecast analysis for 20-minute prediction


interval 31
3.6 Error preventive ARMA forecast analysis for 30-minute prediction
interval 32
3.7 Error preventive ARMA forecast analysis for 60-minute prediction
interval 33
3.8 R-score comparison of non-error preventive and error preventive ARMA 34
3.9 SEI comparison of non-error preventive and error preventive ARMA 35
3.10 MPE comparison of non-error preventive and error preventive ARMA 35
3.11 PER comparison of non-error preventive and error preventive ARMA 36
3.12 Positive magnitude comparison of non-error preventive and error
preventive ARMA 37
3.13 Negative magnitude comparison of non-error preventive and error
preventive ARMA 37
3.14 Error preventive ARIMA forecast analysis for 10-minute prediction
interval 39
3.15 Error preventive ARIMA forecast analysis for 20-minute prediction
interval 40
3.16 Error preventive ARIMA forecast analysis for 30-minute prediction
interval 41
3.17 Error preventive ARIMA forecast analysis for 60-minute prediction
interval 42
3.18 R-score comparison of non-error preventive and error preventive ARIMA 43
3.19 SEI comparison of non-error preventive and error preventive ARIMA 43
3.20 MPE comparison of non-error preventive and error preventive ARIMA 44
3.21 Positive magnitude comparison of non-error preventive and error
preventive ARIMA 45
3.22 Negative magnitude comparison of non-error preventive and error
preventive ARIMA 45
3.23 PER comparison of non-error preventive and error preventive ARIMA 46
3.24 Error preventive ES forecast analysis for 10-minute prediction interval 47
3.25 Error preventive ES forecast analysis for 20-minute prediction interval 48
3.26 Error preventive ES forecast analysis for 30-minute prediction interval 49
3.27 Error preventive ES forecast analysis for 60-minute prediction interval 50
3.28 R-score comparison of non-error preventive and error preventive ES 51
3.29 SEI comparison of non-error preventive and error preventive ES 51
3.30 MPE comparison of non-error preventive and error preventive ES 52
3.31 Positive magnitude comparison of non-error preventive and error
preventive ES 52

3.32 Negative magnitude comparison of non-error preventive and error


preventive ES 53
3.33 PER comparison of non-error preventive and error preventive ES 54
3.34 R-score performance relative improvement 55
3.35 SEI performance relative improvement 55
3.36 MPE performance relative improvement 56
3.37 Friedman test ranks of non-error preventive and error preventive models 56

4.1 Population-based metaheuristic optimization algorithms’ taxonomy 59


4.2 Forecast accuracy comparison of swarm intelligence based prediction
models 62
4.3 Wilcoxon test statistics of swarm intelligence based prediction models 62
4.4 Forecast accuracy comparison of evolutionary algorithms based pre-
diction models 65
4.5 Wilcoxon test statistics of evolutionary algorithms based prediction
models 65
4.6 Forecast accuracy comparison of nature-inspired algorithms based
prediction models 67
4.7 Wilcoxon test statistics of nature-inspired algorithms based prediction
models 68
4.8 Forecast accuracy comparison of physics-inspired algorithms based
prediction models 70
4.9 Wilcoxon test statistics of physics-inspired algorithms based prediction
models 71
4.10 Friedman test ranks of metaheuristic algorithms based prediction
models 71

5.1 Neural network-based workload prediction model 76


5.2 Ring crossover 79
5.3 Heuristic crossover 80
5.4 Uniform crossover 81
5.5 Learning period effect on forecast accuracy of self-adaptive differential
evolution algorithm based workload prediction model 86
5.6 Short term forecast assessment of self-adaptive differential evolution
algorithm based prediction model on 1-minute prediction interval 87
5.7 Short term forecast residuals of self-adaptive differential evolution
algorithm based prediction model on 1-minute prediction interval 88
5.8 Short term forecast assessment of biphase adaptive differential evolu-
tion algorithm based prediction model on 1-minute prediction interval 89

5.9 Short term forecast residual of biphase adaptive differential evolution


algorithm based prediction model on 1-minute prediction interval 89
5.10 Comparing short term forecast errors of SaDE and BaDE based pre-
dictive frameworks 90
5.11 Long term forecast assessment of self-adaptive differential evolution
algorithm based prediction model on 60-minute prediction interval 91
5.12 Long term forecast residuals of self-adaptive differential evolution
algorithm based prediction model on 60-minute prediction interval 91
5.13 Long term forecast assessment of biphase adaptive differential evolu-
tion algorithm based prediction model on 60-minute prediction interval 92
5.14 Long term forecast residuals of biphase adaptive differential evolution
algorithm based prediction model on 60-minute prediction interval 92
5.15 Comparing long term forecast errors of SaDE and BaDE based pre-
dictive frameworks 93
5.16 Comparing forecast accuracy of Maximum, Average, BPNN, SaDE,
BaDE based predictive frameworks on NASA Trace 93
5.17 Comparing forecast accuracy of Maximum, Average, BPNN, SaDE,
BaDE based predictive frameworks on Saskatchewan Trace 94

6.1 Non-directed predictive framework 98


6.2 Self-directed predictive framework 99
6.3 Self-directed learning process 100
6.4 Position update procedures in blackhole algorithm (standard vs. mod-
ified) 101
6.5 Web server workload prediction results of self-directed learning pre-
dictive framework on 5-minute prediction interval 105
6.6 Web server workload prediction residuals auto-correlation of self-
directed learning predictive framework on 5-minute prediction interval 106
6.7 Mean squared error of non-directed and self-directed predictive frame-
works on short term forecasts of web server workloads 107
6.8 Training time (min) of non-directed and self-directed predictive frame-
works on short term forecasts of web server workloads 108
6.9 Cloud server workload prediction results of self-directed learning pre-
dictive framework on 5-minute prediction interval 109
6.10 Cloud server workload prediction residuals auto-correlation of self-
directed learning predictive framework on 5-minute prediction interval 110
6.11 Web server workload prediction results of self-directed learning pre-
dictive framework on 60-minute prediction interval 111
6.12 Web server workload prediction residuals auto-correlation of self-
directed learning predictive framework on 60-minute prediction interval 112

6.13 Mean squared error of non-directed and self-directed predictive frame-


works on long term forecasts of web server workloads 113
6.14 Training time (min) of non-directed and self-directed predictive frame-
works on long term forecasts of web server workloads 114
6.15 Cloud server workload prediction results of self-directed learning pre-
dictive framework on 60-minute prediction interval 115
6.16 Cloud server workload prediction residuals auto-correlation of self-
directed learning predictive framework on 60-minute prediction interval 116

7.1 Decomposition based predictive framework 122


7.2 Decomposition of CPU requests trace 123
7.3 A conceptual view of ensemble stacking 125
7.4 An ensemble of ELMs in workload prediction 127
7.5 Input node selection 128
7.6 Network architecture analysis for short term forecast of decomposition
predictive framework 131
7.7 CPU and Memory data-trace auto-correlation for 5-minute prediction
interval 132
7.8 Short term forecast accuracy of ensemble prediction framework on
CPU Trace 133
7.9 Short term forecast accuracy of ensemble prediction framework on
Memory Trace 134
7.10 Network architecture analysis for long term forecast of decomposition
predictive framework 135
7.11 Long term forecast accuracy of ensemble prediction framework on
CPU Trace 136
7.12 Long term forecast accuracy of ensemble prediction framework on
Memory Trace 136

8.1 An illustration of virtual machine placement scenarios 142


8.2 Resource-efficient load balancing framework design 143
8.3 Chromosome encoding for VM placement 144
8.4 Single point crossover operator for VM placement 145
8.5 Swapping based mutation operator for VM placement 145
8.6 Three cases of VM allocation 149
8.7 Secure and energy-aware load balancing framework design 150
8.8 Power consumption (W) for homogeneous VM requests 152
8.9 Resource utilization for homogeneous VM requests 152
8.10 Presence of conflicting servers (%) for homogeneous VM requests 153

8.11 Power consumption (W) for heterogeneous VM requests 153


8.12 Resource utilization for heterogeneous VM requests 154
8.13 Presence of conflicting servers (%) for heterogeneous VM requests 154
List of Tables

2.1 Friedman test statistics for time series forecasting models 23


2.2 Friedman test ranks for time series forecasting models 23
2.3 Finner test post-hoc analysis of time series forecasting models 23

3.1 Wilcoxon test statistics for error preventive and non-error preventive
time series forecasting model 57
3.2 Finner test post-hoc analysis of error preventive and non-error pre-
ventive time series forecasting models 57

4.1 Friedman test statistics of metaheuristic algorithms based prediction


models 71
4.2 Finner test post-hoc analysis statistics of metaheuristic algorithms
based prediction models (ℵ = 0.05) 73
4.3 Finner test post-hoc analysis results on the null hypothesis of meta-
heuristic algorithms based prediction model 74

5.1 Number of iterations and time elapsed in the training of differential


evolution based predictive models for short term forecasts 90
5.2 Number of iterations and time elapsed in the training of differential
evolution based predictive models for long term forecasts 90
5.3 Iterations elapsed for the training of predictive models using back-
propagation, self-adaptive differential evolution, and biphase adaptive
differential evolution 94
5.4 Training time (sec) elapsed in the training of predictive models us-
ing backpropagation, self-adaptive differential evolution, and biphase
adaptive differential evolution 95

6.1 Mean squared error of non-directed and self-directed predictive frame-


works on short term forecasts of cloud server workloads 110
6.2 Training time (min) of non-directed and self-directed predictive frame-
works on short term forecasts of cloud server workloads 111
6.3 Mean squared error of non directed and self-directed predictive frame-
works on long term forecasts of cloud server workloads 117


6.4 Training time (min) of non-directed and self-directed predictive frame-


works on long term forecasts of cloud server workloads 117
6.5 Mean squared error comparison of non-directed and self-directed learn-
ing based models’ NASA Trace forecasts with state-of-art models 117
6.6 Mean squared error comparison of non-directed and self-directed learn-
ing based models’ Calgary Trace forecasts with state-of-art models 117
6.7 Mean squared error comparison of non-directed and self-directed learn-
ing based models’ Saskatchewan Trace forecasts with state-of-art
models 118
6.8 Mean squared error comparison of non-directed and self-directed learn-
ing based models’ CPU Trace forecasts with state-of-art models 118
6.9 Mean squared error comparison of non-directed and self-directed learn-
ing based models’ Memory Trace forecasts with state-of-art models 118
6.10 Mean squared error comparison of self-directed learning-based model’s
PlanetLab Trace forecasts with deep learning model 118
6.11 Friedman test ranks of non-directed, self-directed, LSTM, SaDE, and
backpropagation based predictive models 119
6.12 Friedman test statistics of non-directed and self-directed learning
predictive frameworks 119
6.13 Wilcoxon signed test ranks for self-directed learning predictive frame-
work 119

7.1 ARIMA analysis orders for cloud resource demand traces 124
7.2 Network configuration parameter choices for decomposition predictive
framework 124
7.3 List of experiments selected by D-Optimal Design 124
7.4 Mean squared error of short term forecast of ELM based predictive
framework 134
7.5 Mean squared error of long term forecast of ELM based predictive
framework 137
7.6 Forecast accuracy comparison of ELM based predictive models on
CPU trace with state-of-art models 138
7.7 Forecast accuracy comparison of ELM based predictive models on
Memory trace with state-of-art models 138
7.8 Forecast accuracy comparison of ELM based predictive models on
Google cluster trace and PlanetLab Trace with state-of-art models 139

8.1 Virtual machine details for illustration 148


Preface

Cloud computing has become one of the most revolutionary technologies in the history
of computing. It offers subscription-based on-demand services and has
emerged as the backbone of the computing industry. It has enabled us to share
resources among multiple users through virtualization by the means of creating a
virtual instance of a computer system running in an abstracted hardware layer. Unlike
early distributed computing models, it assures limitless computing resources through
its large-scale cloud data centers. It has gained wide popularity over the past few
years, with an ever-increasing infrastructure, number of users, and amount of hosted
data. The large and complex workloads hosted on these data centers introduce several
challenges: resource utilization, power consumption, scalability, operational cost, and
many others. Therefore, a practical resource management scheme is essential to bring
operational efficiency with improved elasticity. The elasticity of a system depends on
several factors, such as the accuracy of the anticipated workload information, the performance
behavior of applications in different scenarios, the communication of forecast results, the use
of the anticipated information, and many others.
Effective resource management can be achieved through workload prediction,
resource scheduling and provisioning, virtual machine placement, or a combination
of these approaches. Workload prediction has been widely explored and a number
of methods have been presented. However, the existing methods suffer from various issues,
including an inability to capture the non-linearity of workloads and iterative
training that consumes huge computing resources and time. This book discusses
the machine learning-based approaches to address the above-mentioned issues. The
highlights of the discussed models are continuous learning from error feedback, adaptive
nature, decomposition of workload traces, and ensemble learning. Detailed analysis of
predictive methods on different workload traces is also included and their performance
is compared with state-of-art models. Virtual machine placement is another aspect
that is explored to achieve efficient resource management. In general, virtual machine
placement is a multiobjective problem that involves multiple conflicting objectives to
be optimized simultaneously. The frameworks discussed in this book address the issues
of resource utilization, power consumption, and security while placing the workloads
on servers.
The remainder of the book is organized as follows: Chapter 1 briefs the basic cloud
computing concepts. The discussion on the workload prediction mechanisms begins
in Chapter 2. First, the basic time series forecasting models are discussed with their
performance on different workload traces. Chapter 3 discusses the error preventive time
series forecasting models which significantly improve the performance over classical
time series models. Then, a discussion on various nature-inspired algorithms is included


in Chapter 4. It also evaluates the performance of neural network-based forecasting


models trained by these algorithms. Next, the forecasting models trained by adaptive
differential evolution are presented in Chapter 5. The first learning algorithm allows
learning the best suitable mutation strategy and crossover rate. In contrast, the second
algorithm allows learning both crossover and mutation strategies along with mutation
and crossover rates. Chapter 6 discusses the blackhole neural network-based forecasting
scheme and evaluates its performance on several workload traces. It also introduces
the concept of self-directed learning and discusses a self-directed workload
forecasting model inspired by the error preventive scheme, along with a modification
of the blackhole learning algorithm that improves the learning capability of the model.
Chapter 7 introduces the decomposition and ensemble learning-based models. The
decomposition-based model trains one network for each component extracted from the
decomposition of workload trace whereas the second approach creates an ensemble
of extreme learning machines and weights their opinions using a blackhole learning
algorithm. Chapter 8 introduces two multi-objective load balancing frameworks. The
first framework considers resource utilization and power consumption as the objectives
to be optimized, whereas the second framework also takes the security aspect
into consideration while assigning the VMs to servers. The latter framework deals with
side-channel attacks only and minimizes the likelihood of an attack occurring. It also
ensures that the number of victim users is reduced if any attack occurs. Finally, Chapter 9
summarizes the work discussed in the book.
Author

Dr. Jitendra Kumar is an assistant professor in machine learning at the National


Institute of Technology Tiruchirappalli, Tamilnadu, India. He obtained his doctorate
in 2019 from the National Institute of Technology Kurukshetra, Haryana, India. He is
also a recipient of the Director’s medal for the first rank in the University examination
at Dayalbagh Educational Institute, Agra, Uttar Pradesh in 2011. He has experience
of three years in academia. He has published several research papers in international
journals and conferences of high repute, including IEEE Transactions on Parallel and
Distributed Systems, Information Sciences, Future Generation Computer Systems,
Neurocomputing, Soft Computing, Cluster Computing, IEEE-FUZZ, etc. He has
also obtained the best paper awards in two international conferences. His research
interests are machine learning, cloud computing, healthcare, parallel algorithms, and
optimization. He is also a review board member of several journals, including IEEE
Transactions on Computers, IEEE Transactions on Parallel and Distributed Systems,
IEEE Access, Journal of Parallel and Distributed Computing, and more.
Prof. Ashutosh Kumar Singh is an esteemed researcher and academician in
the domain of Electrical and Computer Engineering. Currently, he is working as a
Professor in the Department of Computer Applications, National Institute of Technology,
Kurukshetra, India. He has more than 20 years of research, teaching, and admin-
istrative experience in various university systems of India, the UK, Australia, and
Malaysia. Dr. Singh obtained his Ph.D. degree in Electronics Engineering from the Indian
Institute of Technology-BHU, India; a Post Doc from the Department of Computer Science,
University of Bristol, UK; and is a Chartered Engineer from the UK. He is the recipient of
the Japan Society for the Promotion of Science (JSPS) fellowship for a visit to the
University of Tokyo and other universities of Japan. His research area includes Verifi-
cation, Synthesis, Design, and Testing of Digital Circuits, Predictive Data Analytics,
Data Security in Cloud, and Web Technology. He has more than 250 publications to
date, which include peer-reviewed journals, books, conferences, book chapters, and
news magazines in these areas. He has co-authored eight books including ‘‘Web Spam
Detection Application using Neural Network’’, ‘‘Digital Systems Fundamentals’’ and
‘‘Computer System Organization & Architecture’’. Prof. Singh has worked as principal
investigator/investigator for six sponsored research projects and was a key member on
a project from EPSRC (United Kingdom) entitled ’’Logic Verification and Synthesis
in New Framework’’.
Dr. Singh has visited several countries including Australia, United Kingdom, South
Korea, China, Thailand, Indonesia, Japan, and the USA for collaborative research
work, invited talks, and to present his research work. He has received 15 awards
such as Merit Awards-2003 (Institute of Engineers), Best Poster Presenter-99 in 86th


Indian Science Congress held in Chennai, INDIA, Best Paper Presenter of NSC’99
INDIA and Bintulu Development Authority Best Postgraduate Research Paper Award
for 2010, 2011, 2012.
He has served as an Editorial Board Member of International Journal of Networks
and Mobile Technologies, International Journal of Digital Content Technology and
its Applications. Also, he has shared his experience as a Guest Editor for Pertanika
Journal of Science and Technology, Chairman of CUTSE International Conference
2011, Conference Chair of series of International Conference on Smart Computing and
Communication (ICSCC), and as an editorial board member of UNITAR e-journal.
He is involved in reviewing processes in different journals and conferences of repute
including IEEE transaction of computer, IET, IEEE conference on ITC, ADCOM,
etc.
Prof. Anand Mohan has nearly 44 years of experience in teaching and research
and the administration and management of higher educational institutions. He is
currently an institute professor in the Department of Electronics Engineering, Indian
Institute of Technology (BHU), Varanasi, India. Besides his present academic assign-
ment, Prof. Mohan is a Member of the Executive Council of Banaras Hindu University
and Vice-Chairman of the Board of Governors of Indian Institute of Technology
(BHU), Varanasi, India. Prof. Mohan served as Director (June 2011-June 2016) of
the National Institute of Technology (NIT), Kurukshetra, Haryana, India, and was
also Mentor Director of the National Institute of Technology, Srinagar, Uttarakhand,
India. For his outstanding contributions in the field of Electronics Engineering, Prof.
Mohan was conferred the ’’Lifetime Achievement Award’’ (2016) by Kamla Nehru
Institute of Technology, Sultanpur, India.
Prof. Rajkumar Buyya is a Redmond Barry Distinguished Professor and Direc-
tor of the Cloud Computing and Distributed Systems (CLOUDS) Laboratory at the
University of Melbourne, Australia. He is also serving as the founding CEO of Manjra-
soft Pty Ltd., a spin-off company of the University, commercializing its innovations in
Cloud Computing. He served as a Future Fellow of the Australian Research Council
during 2012-2016. He is serving/has served as an Honorary/Visiting Professor for several
elite Universities including Imperial College London (UK), University of Birmingham
(UK), University of Hyderabad (India), and Tsinghua University (China). He received
B.E and M.E in Computer Science and Engineering from Mysore and Bangalore
Universities in 1992 and 1995 respectively; and a Doctor of Philosophy (Ph.D.) in
Computer Science and Software Engineering from Monash University, Melbourne,
Australia in 2002. He was awarded Dharma Ratnakara Memorial Trust Gold Medal
in 1992 for his academic excellence at the University of Mysore, India. He received
Richard Merwin Award from the IEEE Computer Society (USA) for excellence in
academic achievement and professional efforts in 1999. He received Leadership and
Service Excellence Awards from the IEEE/ACM International Conference on High-
Performance Computing in 2000 and 2003. He received the ‘‘Research Excellence
Awards’’ from the University of Melbourne for productive and quality research in
computer science and software engineering in 2005 and 2008. With over 112,400
citations, a g-index of 322, and an h-index of 145, he is the highest cited computer
scientist in Australia and one of the world’s Top 30 cited authors in computer science

and software engineering. He received the Chris Wallace Award for Outstanding
Research Contribution 2008 from the Computing Research and Education Association
of Australasia, CORE, which is an association of university departments of computer
science in Australia and New Zealand. Dr. Buyya received the ‘‘2009 IEEE TCSC
Medal for Excellence in Scalable Computing’’ for pioneering the economic paradigm for
utility-oriented distributed computing platforms such as Grids and Clouds. He served
as the founding Editor-in-Chief (EiC) of IEEE Transactions on Cloud Computing
(TCC). Dr. Buyya is recognized as a ‘‘Web of Science Highly Cited Researcher’’ for
five consecutive years since 2016, a Fellow of IEEE and Scopus Researcher of the
Year 2017 with Excellence in Innovative Research Award by Elsevier, and ‘‘Lifetime
Achievement Award’’ from two Indian universities for his outstanding contributions
to Cloud computing and distributed systems. He has been recently recognized as the
‘‘Best of the World’’, in the Computing Systems field, by The Australian 2019 Research
Review.
Dr. Buyya has authored/co-authored over 850 publications. Since 2007, he received
twelve ‘‘Best Paper Awards’’ from international conferences/journals including a
‘‘2009 Outstanding Journal Paper Award’’ from the IEEE Communications Society,
USA. He has co-authored five text books: Microprocessor x86 Programming (BPB
Press, New Delhi, India, 1995), Mastering C++ (McGraw Hill Press, India, 1st
edition in 1997 and 2nd edition in 2013), Object Oriented Programming with Java:
Essentials and Applications (McGraw Hill, India, 2009), Mastering Cloud Computing
(Morgan Kaufmann, USA; McGraw Hill, India, 2013; China Machine Press, 2015),
and Cloud Data Centers and Cost Modeling (Morgan Kaufmann, USA, 2015). The
books on emerging topics that he edited include High Performance Cluster Computing
(Prentice Hall, USA, 1999), High Performance Mass Storage and Parallel I/O (IEEE
and Wiley Press, USA, 2001), Content Delivery Networks (Springer, Germany, 2008),
Market Oriented Grid and Utility Computing (Wiley Press, USA, 2009), and Cloud
Computing: Principles and Paradigms (Wiley, USA, 2011). He also edited proceedings
of over 25 international conferences published by prestigious organizations, namely
the IEEE Computer Society Press (USA) and Springer Verlag (Germany). He served
as Associate Editor of Elsevier’s Future Generation Computer Systems Journal (2004-
2009) and currently serving on editorial boards of many journals including Software:
Practice and Experience (Wiley Press). Dr. Buyya served as a speaker in the IEEE
Computer Society Chapter Tutorials Program (from 1999-2001), Founding Co-Chair
of the IEEE Task Force on Cluster Computing (TFCC) from 1999-2004, and member
of the Executive Committee of the IEEE Technical Committee on Parallel Processing
(TCPP) from 2003-2011. He served as the first elected Chair of the IEEE Technical
Committee on Scalable Computing (TCSC) during 2005-2007 and played a prominent
role in the creation and execution of several innovative community programs that
propelled TCSC into one of the most successful TCs within the IEEE Computer
Society. In recognition of these dedicated services to the computing community
over a decade, the President of the IEEE Computer Society presented Dr. Buyya a
Distinguished Service Award in 2008.
Dr. Buyya has contributed to the creation of high-performance computing and
communication system software for PARAM supercomputers developed by the Centre

for Development of Advanced Computing (C-DAC), India. He has pioneered Economic


Paradigm for Service-Oriented Distributed Computing and demonstrated its utility
through his contribution to conceptualization, design, and development of Grid and
Cloud Computing technologies such as Aneka, GridSim, Libra, Nimrod-G, Gridbus,
and Cloudbus that power the emerging eScience and eBusiness applications. He has
been awarded, over $8 million, competitive research grants from various national and
international organizations including the Australian Research Council (ARC), Sun
Microsystems, StorageTek, IBM, and Microsoft, CA Australia, Australian Department
of Innovation, Industry, Science, and Research (DIISR), and European Council. Dr.
Buyya has been remarkably productive in a research sense and has converted much
of that knowledge into linkages with industry partners (such as IBM, Sun, and Mi-
crosoft), into software tools useful to other researchers in a variety of scientific fields,
and into community endeavors. Software technologies for Grid and Cloud computing
developed under Dr. Buyya’s leadership have gained rapid acceptance and are in use
at several academic institutions and commercial enterprises in 50+ countries around
the world. In recognition of this, he received Vice Chancellor’s inaugural ‘‘Knowledge
Transfer Excellence (Commendation) Award’’ from the University of Melbourne in
Nov 2007. Manjrasoft’s Aneka technology for Cloud Computing developed under
Dr. Buyya’s leadership has received the ‘‘2010 Asia Pacific Frost & Sullivan New Prod-
uct Innovation Award’’. Recently, Dr. Buyya received the ‘‘Bharath Nirman Award’’
and the ‘‘Mahatma Gandhi Award’’ along with Gold Medals for his outstanding and
extraordinary achievements in Information Technology field and services rendered to
promote greater friendship and India-International cooperation.
Abbreviations

SLA Service Level Agreement


QoS Quality of Service
DE Differential Evolution
PSO Particle Swarm Optimization
EA Evolutionary Algorithm
FSA Firefly Search Algorithm
HS Harmony Search
TLBO Teaching Learning Based Optimization
GSA Gravitational Search Algorithm
BhOA Blackhole Algorithm
BhNN Blackhole Neural Network
MSE Mean Squared Error
RMSE Root Mean Squared Error
MAE Mean Absolute Error
RMAE Relative Mean Absolute Error
CoC Correlation Coefficient
SEI Sum of Elasticity Index
EP Error Preventive
EPS Error Prevention Score
NEP Non Error Preventive
PER Predictions in Error Range
MoP Magnitude of Prediction
SaDE Self Adaptive Differential Evolution
BaDE Biphase Adaptive Differential Evolution
SDL Self Directed Learning
ELM Extreme Learning Machine
RELB Resource Efficient Load Balancing
SCA Side Channel Attack
SEALB Secure and Energy Aware Load Balancing
PWS Prediction Window Size
WPBPNN BPNN based Workload Prediction Model
WPBhNN BhNN based Workload Prediction Model
WPSDL-BhNN BhNN and SDL based Workload Prediction Model
WPSaDE SaDE based Workload Prediction Model
WPBaDE BaDE based Workload Prediction Model
ELMNN ELM based Neural Network


eELMNN Ensemble of ELM based Neural Networks


WPELMNN ELMNN based Workload Prediction Model
WPeELMNN eELMNN based Workload Prediction Model
NDS Non Dominated Sorting
CHAPTER 1

Introduction

Cloud computing paradigm enables the delivery of computing resources and applications to users across the globe as subscription-oriented services. Virtualization is the technique behind the scene that helps in resource sharing among multiple users in this cloud computing environment.

1.1 CLOUD COMPUTING


Cloud computing is a form of distributed computing environment where multiple
virtual instances of a computer system run on an abstracted hardware level and every user
experiences the system as if owning it entirely. The cloud infrastructure may be private
(serving a single organization), public (shared among multiple organizations), or
hybrid (a combination of both). A cloud system provides on-demand services at
three different levels, referred to as Infrastructure as a Service (IaaS), Platform as a
Service (PaaS), and Software as a Service (SaaS), as shown in Fig. 1.1. In IaaS, the
infrastructure components such as servers, networking and storage, and operating ser-
vices hosted by service providers are delivered to the consumers through virtualization.
These components are provided with various services including monitoring, security,
log access, backup and recovery, and load balancing. While in PaaS, users get required
and associated infrastructure to develop, run, and manage their applications. The
service provider is responsible for providing the servers, operating system, storage,
database, and middleware such as Java and .NET runtime. In the case of SaaS,
which is a software distribution model, the software or applications are hosted in the
data centers, and users access these applications over the Internet. The applications
delivered through SaaS eliminate the requirement of hardware, installation, support,
and maintenance as they do not need any installation on local computers and can be
accessed through web browsers.
In the last decade, cloud systems have gained wide popularity due to ever-growing
services, infrastructure, clients, and the ability to host big data [17]. A
survey conducted in 2017 reported that organizations would shift 90% of their enterprise
workload to the cloud by 2021 [29]. The cloud infrastructure is growing very fast, and
the cloud industry is expected to grow at a 14.6% compound annual growth rate
to reach the $300 billion mark by 2022 [62, 63]. Modern cloud systems are equipped
with characteristics such as on-demand service, reliability, scalability, elasticity,


FIGURE 1.1 Service model view of cloud computing

disaster recovery, accessibility, measured services, and many others [25, 73, 75, 98].
However, various challenges and limitations are still open including inefficient resource
management, security and privacy, heterogeneity, elasticity, usability, response time,
and many more [18, 19, 21, 52–54, 90, 109, 124, 125].

FIGURE 1.2 Cloud resource management view

1.2 CLOUD MANAGEMENT


Resource management is one of the core functions of cloud systems and must be
improved for better system performance [66, 102, 119]. The inefficiency in resource
management directly affects the system performance and operational cost. The poor
resource utilization degrades the overall system performance and may increase the
service cost as well. A simple resource management block diagram in a cloud system is
depicted in Fig. 1.2. It can be seen that clients are connected to a cloud server through
a web portal. Users send their workloads to the cloud server for execution. In turn,
a modern cloud system tries to assign the workloads to one of the server machines
based on different criteria including resource utilization, system performance, user
priorities, operational cost, quality of service, etc. Typically, the complete process of
workload placement over time to improve different variables of a system is referred
to as cloud resource management. As depicted in Fig. 1.2, the major tasks of a cloud
resource management application are workload analysis and forecasting, resource
provisioning, and scheduling the workloads on hardware. The workload analysis
module is responsible for analyzing the upcoming workload and for forecasting the
expected workload in the near future. This information is used by the resource
provisioning module to allocate the physical resources. The resource scheduler places
the workloads on the servers based on the input from the resource provisioning module
and current resource usage information. Typically, resource management is achieved
through prediction, scaling, provisioning, and load balancing, as shown in Fig. 1.3.
However, this book concentrates on workload forecasting using different approaches
of regression analysis and artificial neural networks, and load balancing.

FIGURE 1.3 Cloud resource management approaches

1.2.1 Workload Forecasting


The workload prediction is a mechanism that estimates the future workload on the
servers and can be classified as either a homeostatic or history-based method [89].
The first class of methods detects a trend in previous actual values and adds it to or subtracts
it from the current value to forecast the next value. It could be a static or dynamic
value that is to be added/subtracted. On the other hand, the second class of models
analyzes the workload history and extracts a pattern to forecast the next value. A
homeostatic method attempts to follow the mean of the previous values, while the
history-based approach uses the behavior of previous workload information to forecast
the next instance [70, 114, 116–118].

FIGURE 1.4 Schematic representation of workload forecasting

\hat{x}_{t+1} = f(x_t, x_{t-1}, \ldots, x_1)   (1.1)


Let f be a function of X which determines the value of x̂t+1, i.e., the estimated
workload at time t + 1. In order to forecast the upcoming workload, f analyses the
historical workloads over the length of the learning window (eq. 1.1). For instance, the
function analyses the previous 10 instances from history if the length of the learning
window is 10. Since the forecast function of a real data-trace is generally non-linear
and complex, it becomes a challenging task to find the set of optimal parameters,
and machine learning becomes the natural choice to optimize the model parameters
to forecast the dynamic and non-linear workloads. A typical workload forecasting
model is depicted in Fig. 1.4. The learning window defines the number of recent past
workload instances to be analyzed for anticipation of the next value.
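To make the learning-window idea concrete, the sketch below (an illustrative Python example, not code from the book) builds (window, target) pairs from a workload history per eq. (1.1) and applies a naive window-mean forecaster as a baseline; the window length and the toy trace values are assumed purely for illustration.

```python
import numpy as np

def make_windows(history, window=10):
    """Build supervised pairs per eq. (1.1): each input holds the `window`
    most recent workload values and the target is the next observed value."""
    X, y = [], []
    for t in range(window, len(history)):
        X.append(history[t - window:t])   # x_{t-w}, ..., x_{t-1}
        y.append(history[t])              # x_t, the value to be forecast
    return np.asarray(X), np.asarray(y)

# toy workload history (e.g., requests observed per interval) -- illustrative only
trace = np.array([120, 135, 128, 150, 160, 158, 170, 165, 180, 175, 190, 200])
X, y = make_windows(trace, window=4)

# a naive forecaster f: the mean of the learning window (a simple baseline,
# not one of the predictive models developed in later chapters)
predictions = X.mean(axis=1)
print(predictions[:3], y[:3])
```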
The prediction models have been explored and developed for various applica-
tions [13, 37, 46, 96, 97, 99]. The workload predictive resource management approaches
are tailored with estimations of demand and utilization of resources. The workload
of cloud services is dynamic and varies over time [30, 100]. Therefore a robust pre-
diction model is required to produce reasonably accurate forecasts. On the other
hand, the resource utilization prediction helps in assessing the free resources and
also in assessing the impact of allocating the free resources to individual workloads
[35, 76, 77, 79, 80, 115, 121, 123, 134].

1.2.2 Load Balancing


The task of a load balancing process is to distribute the workloads uniformly among
servers. The load balancers are responsible for identifying the most suitable servers
or computing resources that meet the application requirements. They ensure that a
high volume of network traffic is not diverted to a single server. The schematic
representation of load balancing in a distributed computing environment is shown in
Fig. 1.5. The load balancer receives the traffic of users’ requests through the Internet
and distributes it among accessible and eligible servers. For instance, when seven users
send their workloads, the load balancer balances the load distribution by assigning
the load to three of the four servers.

FIGURE 1.5 Load balancing

Effective load balancing is another approach that helps in achieving better
usage of resources and their management. The efficiency of load balancing approaches
has been an issue for cloud systems since their inception [84, 129]. Efficiency in
load balancing can be achieved using different approaches such as optimal scheduling
and placement of workloads or virtual machines (VMs). The optimal mapping of
VMs is a complex and challenging task as it involves multiple objectives to be optimized
at the same time and belongs to the NP-complete class of problems [16, 88]. Generally,
the existing VM placement algorithms consider the different dimensions of resource
utilization and power consumption in the data centers [4, 138]. We will focus on
load balancing approaches that also deal with security while balancing the load on
cloud servers, as it is one of the most important issues in cloud architectures, and
various approaches have been proposed, including [43, 85].
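As a deliberately simplified illustration of the balancing idea in Fig. 1.5, the following sketch assigns each incoming workload to the currently least-loaded server; the greedy least-loaded rule and the numeric loads are illustrative assumptions, not the multi-objective placement frameworks discussed in Chapter 8.

```python
def assign_workloads(loads, servers=4):
    """Greedy load balancing: place each workload on the currently
    least-loaded server and return the assignment and per-server load."""
    server_load = [0.0] * servers
    assignment = []
    for load in loads:
        target = min(range(servers), key=lambda s: server_load[s])
        server_load[target] += load
        assignment.append(target)
    return assignment, server_load

# seven user workloads distributed over four servers (cf. Fig. 1.5)
assignment, per_server = assign_workloads([3, 1, 2, 5, 2, 1, 4], servers=4)
print(assignment)   # server index chosen for each workload
print(per_server)   # resulting load on each server
```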

1.3 MACHINE LEARNING


Machine learning allows a computer to master a specific task without being explicitly
programmed. A computer can extract the underlying rules to perform the given task
from a bunch of data points. In this book, we will discuss the cloud management
models which are developed using the following techniques:

1.3.1 Artificial Neural Network


An artificial neural network (ANN) is composed of a huge number of interconnected nodes called
neurons, as shown in Fig. 1.6, that process information in a similar way to the
human brain. Typically, a neural network is trained to solve a specific complex
problem such as recognition, classification, forecasting, and others. Similar to the
brain, a neural network adjusts the connections and their weights during the learning
phase. The neural networks are capable of analyzing the complex and large amount
of data and extracting the patterns from it.

FIGURE 1.6 Artificial neural network

The key difference between a traditional computing approach and a neural network
is that the traditional approach follows a set of rules that must be known to the
computer in advance, while a neural network can learn from the data itself to draw
insightful inferences using some specific rules. Let $\kappa_1 = [x_1, x_2, \ldots, x_t]$ be an input
vector, and the network, as shown in Fig. 1.6, is applied to estimate the value of $x_{t+1}$.
Assume that $\omega_{i,j}^{k}$ represents the weight of the synaptic connection between the $i$th
node of the $k$th layer and the $j$th node of the next layer, and that $\zeta_k$ denotes the activation
function applied on the $k$th layer nodes. The output of the $j$th node of layer $k+1$ can then be
computed as $z_j = \sum_{i=1}^{t} \zeta_{k+1}(x_i \cdot \omega_{i,j}^{k})$, which acts as the input to the next layer's nodes.
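The following minimal sketch illustrates the layer-by-layer computation described above; the sigmoid activation, the random weights, and the layer sizes are illustrative assumptions, and the sketch follows the common convention of applying the activation to each node's aggregated weighted sum.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def forward(x, weights, activation=sigmoid):
    """Propagate an input vector through fully connected layers.
    weights[k][i, j] plays the role of omega_{i,j}^k: the weight between
    node i of layer k and node j of layer k+1."""
    z = x
    for W in weights:
        z = activation(z @ W)   # each node aggregates its weighted inputs
    return z

rng = np.random.default_rng(0)
x = rng.random(10)                          # a learning window of 10 workload values
layer_sizes = [10, 8, 6, 1]                 # input, two hidden layers, output
weights = [rng.standard_normal((m, n)) * 0.1
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
print(forward(x, weights))                  # the network's estimate of x_{t+1}
```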

1.3.2 Metaheuristic Optimization Algorithms


Optimization has become an integral part of solving real-world problems which are
multi-modal and highly non-linear in nature. These problems can be represented
as a constrained optimization problem with one or more decision variables. Some
examples of real-world optimization problems are routing, engineering design, and resource
assignment, and most of these problems are NP-hard [16, 86]. Consider Fig. 1.7, which
shows an arbitrary function farb of decision variable x that needs to be minimized.
The solution space has an infinite number of solutions along with multiple local
optima such as Sl1 , Sl2 , Sl3 , and many others. If the solution space is multidimensional,
complex, and large enough, it becomes a challenging task to find the global optimum
(Sg∗ ) in a reasonable time. Metaheuristic optimization algorithms help in solving
such problems. The term metaheuristic was first used by Fred Glover [47] for an
approach that has the capability of guiding and modifying the other heuristics to
produce the solutions beyond their ability [48]. A metaheuristic algorithm does not
guarantee to produce an optimal solution, but it generates an approximated solution
in a reasonable amount of time.

FIGURE 1.7 An arbitrary optimization function with multiple local optima

These algorithms can be classified into two major categories, i.e., trajectory-based
and population-based approaches. A trajectory-based algorithm such as Simulated
Annealing works around a single solution to find an optimal solution for the problem
under consideration. On the other hand, a population-based algorithm uses a set
of solutions to search for an optimal solution. A detailed study on metaheuristic
optimization can be seen in [14].
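The sketch below illustrates the generate-evaluate-update pattern shared by population-based metaheuristics on an assumed multimodal one-dimensional objective; it is a toy example, not one of the algorithms (PSO, DE, blackhole, etc.) analyzed in later chapters, and the population size and perturbation scale are arbitrary choices.

```python
import numpy as np

def f_arb(x):
    """An arbitrary multimodal objective with several local optima."""
    return np.sin(3 * x) + 0.3 * (x - 2) ** 2

def population_search(objective, bounds=(-2.0, 6.0), pop_size=20,
                      iterations=100, sigma=0.3, seed=0):
    """Toy population-based metaheuristic: keep the better half of the
    population and refill it with Gaussian perturbations of the survivors."""
    rng = np.random.default_rng(seed)
    low, high = bounds
    pop = rng.uniform(low, high, pop_size)
    for _ in range(iterations):
        fitness = objective(pop)
        survivors = pop[np.argsort(fitness)[: pop_size // 2]]
        children = survivors + rng.normal(0.0, sigma, survivors.size)
        pop = np.clip(np.concatenate([survivors, children]), low, high)
    best = pop[np.argmin(objective(pop))]
    return best, objective(best)

x_best, f_best = population_search(f_arb)
print(x_best, f_best)   # an approximate optimum; optimality is not guaranteed
```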

1.3.3 Time Series Analysis


A sequence of data points obtained at regular intervals and indexed in time is referred
to as time series data. Time series analysis started long ago, in 1927 [128], and has
a range of applications including finance, signal processing, astronomy, forecasting,
the stock market, statistics, defense, politics, etc. In general, the task of a time series
analysis model is to extract meaningful data characteristics to predict the data
trends. Time series models predict a future event after analyzing the historical
events, and a number of models have been introduced. Time series analysis is applicable to
any kind of data including numeric, symbolic, continuous, and real-valued data.
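As a small preview of the history-based models covered in Chapter 2, the sketch below fits a linear autoregression of order p to a synthetic series by ordinary least squares and produces a one-step-ahead forecast; the synthetic trace and the order p = 3 are assumptions made only for illustration.

```python
import numpy as np

def fit_ar(series, p=3):
    """Least-squares fit of x_t ~ c + a_1*x_{t-1} + ... + a_p*x_{t-p}."""
    n = len(series)
    lags = np.column_stack([series[p - k - 1:n - k - 1] for k in range(p)])
    X = np.column_stack([np.ones(n - p), lags])   # intercept + lagged values
    y = series[p:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def predict_next(series, coeffs):
    """One-step-ahead forecast using the fitted coefficients."""
    p = len(coeffs) - 1
    recent = series[-1:-p - 1:-1]                 # x_t, x_{t-1}, ..., x_{t-p+1}
    return coeffs[0] + coeffs[1:] @ recent

# synthetic daily-seasonal workload sampled hourly -- illustrative only
rng = np.random.default_rng(1)
t = np.arange(300)
series = 50 + 10 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 1, t.size)
print(predict_next(series, fit_ar(series, p=3)))
```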

1.4 WORKLOAD TRACES


The analysis reported in this book is carried out on various data traces. The workload
data traces belong to two different categories, namely web server workloads and cloud
server workloads.

HTTP-Web Server Logs: The HTTP traces of the NASA, Calgary, and Saskatchewan web servers are used [1]. In this book, these data traces are referred to as NASA Trace (D1), Calgary Trace (D2), and Saskatchewan Trace (D3), respectively. The D1 trace is composed of two months of HTTP web requests obtained from the WWW server of the NASA Kennedy Space Center in Florida. Similarly, the D2 data trace contains HTTP requests of one-year duration obtained from the WWW server located at the University of Calgary, Alberta, Canada. On the other hand, D3 is a data trace that contains seven months of HTTP server requests obtained from a WWW server of the University of Saskatchewan. Every data trace stores the records in ASCII files, and every line stores one record. Every record is composed of five fields, i.e. host, timestamp, request, HTTP reply code, and bytes in the reply.
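For reference, a record of this kind can be parsed with a few lines of Python, as sketched below; the regular expression and the sample line assume the Common Log Format layout of such traces and are given only for illustration.

import re

# host, timestamp, request, HTTP reply code, and bytes in the reply
LOG_PATTERN = re.compile(
    r'^(?P<host>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<code>\d{3}) (?P<bytes>\S+)$'
)

def parse_record(line):
    """Split one ASCII log line into its five fields (returns None if malformed)."""
    match = LOG_PATTERN.match(line.strip())
    return match.groupdict() if match else None

# Hypothetical line in the assumed format
sample = '199.72.81.55 - - [01/Jul/1995:00:00:01 -0400] "GET /history/ HTTP/1.0" 200 6245'
print(parse_record(sample))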

Google Cluster Trace: It contains the data collected from a cluster cell of Google for a duration of 29 days. The workload trace was released in 2011, and it contains the data from 10,388 servers, 20 million tasks, and more than 0.67 million jobs [112]. A job is a set of one or more tasks, and tasks are further decomposed into one or more processes. In this book, the CPU and memory resource demands are used and referred to as CPU Trace (D4) and Memory Trace (D5).

PlanetLab Trace: It is a collection of mean CPU utilization data collected from 11,746 virtual machines. These virtual machines are scattered across 500 different locations around the world. The data was collected on 10 randomly selected days during March and April of 2011 and was sampled at five-minute intervals. The CPU utilization data of 22 randomly selected virtual machines is used in this book and referred to as PlanetLab Trace (D6).

1.5 EXPERIMENTAL SETUP & EVALUATION METRICS


The experiments are carried out on a machine equipped with two Intel® Xeon® E5-2630 v4 processors, both running at a clock speed of 2.20 GHz. The machine is equipped with 128 GB of main memory, and it operates on 64-bit Windows Server 2012 R2. The predictive frameworks discussed in this book are evaluated using the metrics mentioned below:

Mean Squared Error: The mean squared error (MSE) measures the forecast accuracy,
and it is one of the popular metrics used in the literature. This method heavily penalizes
the large error terms. Mathematically, it is denoted as given in eq. (1.2), where m
represents the size of data in a given trace. The term MSE and MPE (mean squared
prediction error) are interchangeably used in the book. Moreover, the square root of
MSE (RMSE) may also be used as an error metric.
MSE = \frac{1}{m} \sum_{t=1}^{m} (x_t - \hat{x}_t)^2    (1.2)

Mean Absolute Error: A small number of very large errors may unduly influence the accuracy measured using mean squared error. Mean absolute error, in contrast, weights every error term equally; it computes the mean of the absolute differences between predicted and actual workloads as given in eq. (1.3). The forecasts are close to the actual workload values if the measured score is close to zero.

MAE = \frac{1}{m} \sum_{t=1}^{m} |x_t - \hat{x}_t|    (1.3)

Relative Mean Absolute Error: A scale-free error metric is required to compare forecast models on different data sets, and relative mean absolute error (RelMAE) is one such metric. The score can be calculated using eq. (1.4), which represents the mean absolute error of the algorithm (MAE_A) normalized by the mean absolute error of a base or state-of-the-art model (MAE_{BM}).

RelMAE = \frac{MAE_A}{MAE_{BM}}    (1.4)

Mean Absolute Scaled Error: Rob J. Hyndman and Anne B. Koehler introduced this metric as a substitute for percentage error metrics [61]. The prediction errors are scaled by the training mean absolute error of a naïve forecast method. The measured score is computed using eq. (1.5), where m_s denotes the seasonal term. This metric is a good choice for accuracy measurement when the prediction model is compared across a number of different scales.

MASE(x, \hat{x}) = \frac{1}{m} \sum_{t=1}^{m} \frac{|x_t - \hat{x}_t|}{\frac{1}{m - m_s} \sum_{t=m_s+1}^{m} |x_t - x_{t-m_s}|}    (1.5)

Correlation Coefficient: The correlation coefficient (CoC) evaluates the statistical relationship between two variables by measuring the degree to which they move together. The CoC score can be computed as given in eq. (1.6), where \bar{x} and \bar{\hat{x}} denote the mean values of the actual and predicted workloads, respectively.

CoC_{x\hat{x}} = \frac{\sum_t (x_t - \bar{x})(\hat{x}_t - \bar{\hat{x}})}{\sqrt{\sum_t (x_t - \bar{x})^2 \sum_t (\hat{x}_t - \bar{\hat{x}})^2}}    (1.6)

Sum of Elasticity Index: Messias et al. proposed the sum of elasticity index (SEI) as a measure of forecast accuracy [92]. This metric favors a forecast model that performs best most of the time. As opposed to MAE and RMSE, it is much less sensitive to outliers. The SEI score is computed as given in eq. (1.7); each term of the sum lies between zero and one, where zero and one define the worst and best accuracy of the model.

SEI = \sum_{t=1}^{m} \frac{\min(x_t, \hat{x}_t)}{\max(x_t, \hat{x}_t)}    (1.7)
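For reference, eqs. (1.2)-(1.7) can be implemented compactly with NumPy as sketched below. The tiny arrays, the seasonal term m_s = 1, and the summed form of the SEI score follow the equations above and are meant purely for illustration.

import numpy as np

def mse(x, x_hat):                          # eq. (1.2)
    return np.mean((x - x_hat) ** 2)

def mae(x, x_hat):                          # eq. (1.3)
    return np.mean(np.abs(x - x_hat))

def rel_mae(mae_algorithm, mae_baseline):   # eq. (1.4)
    return mae_algorithm / mae_baseline

def mase(x, x_hat, m_s=1):                  # eq. (1.5), naive in-sample scaling
    scale = np.mean(np.abs(x[m_s:] - x[:-m_s]))
    return np.mean(np.abs(x - x_hat) / scale)

def coc(x, x_hat):                          # eq. (1.6), Pearson correlation
    return np.corrcoef(x, x_hat)[0, 1]

def sei(x, x_hat):                          # eq. (1.7), assumes strictly positive workloads
    return np.sum(np.minimum(x, x_hat) / np.maximum(x, x_hat))

x     = np.array([10.0, 12.0, 14.0, 13.0])
x_hat = np.array([11.0, 12.5, 13.0, 13.5])
print(mse(x, x_hat), mae(x, x_hat), mase(x, x_hat), coc(x, x_hat), sei(x, x_hat))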

1.6 STATISTICAL TESTS


Statistical techniques are used to analyze the forecasting behavior of different approaches. Non-parametric tests are used because they are not highly restrictive and can be applied to small sample sizes [44]. The significance tests help in finding whether significant differences exist between two or more forecasting models.

1.6.1 Wilcoxon Signed-Rank Test


The Wilcoxon signed-rank test is a non-parametric test that compares two samples to find out whether they represent the same population or not [130]. It assumes a null hypothesis (H_0^{WC}) that the means of both samples are the same. If a significant difference is detected, H_0^{WC} is rejected.
Consider that two algorithms are being evaluated on k problems and d_i denotes the difference in their performance scores on the i-th problem. The method ranks the absolute differences and assigns ranks accordingly. Ties can be addressed using one of the available approaches; in this book, the rank of the tied problems is divided equally between the two algorithms. The total rank of the first algorithm on the problems where it outperforms the second is computed using eq. (1.8), whereas eq. (1.9) computes the sum of ranks for the problems where the second algorithm gives better results than the first.

R^{+}_{WC} = \sum_{d_i > 0} rank(d_i) + \frac{1}{2} \sum_{d_i = 0} rank(d_i)    (1.8)

R^{-}_{WC} = \sum_{d_i < 0} rank(d_i) + \frac{1}{2} \sum_{d_i = 0} rank(d_i)    (1.9)
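In practice, the test can be run with SciPy as sketched below; the two score arrays are hypothetical per-problem errors of two forecasting models, and note that SciPy's default treatment of zero differences differs slightly from the tie-splitting convention described above.

from scipy.stats import wilcoxon

# Hypothetical MSE scores of two models on the same set of problems
model_a = [0.42, 0.35, 0.51, 0.29, 0.44, 0.38, 0.47]
model_b = [0.45, 0.36, 0.55, 0.31, 0.43, 0.41, 0.52]

statistic, p_value = wilcoxon(model_a, model_b)
print(statistic, p_value)
# Reject H0 (equal performance) when the p-value falls below the chosen significance level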

1.6.2 Friedman Test


It is a non-parametric test developed by M. Friedman [40, 41] that provides an alternative to one-way ANOVA with repeated measures. It conducts multiple tests that aim to detect the presence of differences between the performance behavior of two or more models [34].
Let H_0^{FR} be the null hypothesis of the Friedman test, which states that the mean results of all prediction models are equal. The alternative hypothesis (H_1^{FR}) of the test is the negation of H_0^{FR}. First, the test converts the original results of each algorithm into ranks. Let R^{j}_{FR_i} be the Friedman rank of the j-th algorithm on the i-th problem; the final rank of the j-th algorithm is obtained by averaging R^{j}_{FR_i} as shown in eq. (1.10), where j = {1, 2, . . . , k} and i = {1, 2, . . . , |D|} denote the algorithms and datasets respectively, and |D| is the number of datasets. The minimum rank value represents the best algorithm.

R^{j}_{FR} = \frac{1}{|D|} \sum_{i=1}^{|D|} R^{j}_{FR_i}    (1.10)
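A sketch of the test on hypothetical error scores is given below, together with the average ranks of eq. (1.10); it assumes one row of results per data set and one column per algorithm.

import numpy as np
from scipy.stats import friedmanchisquare, rankdata

# Hypothetical error scores: rows = data sets, columns = algorithms
scores = np.array([
    [0.31, 0.28, 0.35],
    [0.44, 0.40, 0.47],
    [0.25, 0.27, 0.30],
    [0.52, 0.49, 0.55],
])

statistic, p_value = friedmanchisquare(*scores.T)
print("Friedman statistic:", statistic, "p-value:", p_value)

# Friedman ranks per data set (lower error gives a lower, i.e. better, rank)
ranks = np.apply_along_axis(rankdata, 1, scores)
print("Average ranks, eq. (1.10):", ranks.mean(axis=0))   # minimum average rank = best algorithm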

1.6.3 Finner Test


The Friedman test conducts a multiple comparison test and detects a significant difference over the whole population. However, it is unable to conduct pairwise comparisons to detect the difference between particular algorithms. Post-hoc analysis tests serve this purpose and allow detecting the presence of a difference in the performance of two algorithms on the basis of a control method [38]. The Finner test adjusts the value of the significance level (ℵ) in a step-down manner [34]. Consider that the generated p-values are sorted in increasing order such that p_i ≤ p_{i+1}, ∀i = {1, 2, . . . , k − 2}, and let H_i^{FN} be the corresponding hypotheses. The Finner test rejects the hypotheses from H_1^{FN} to H_{i-1}^{FN}, provided i is the smallest integer that satisfies p_i > 1 − (1 − ℵ)^{(k−1)/i} [34].
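The step-down rule can be sketched in a few lines, as shown below; the p-values are hypothetical results of pairwise comparisons against a control method, and the relation between k and the number of p-values is an assumption for illustration.

def finner_rejections(p_values, alpha=0.05):
    """Step-down Finner procedure: reject H_1..H_{i-1}, where i is the smallest
    index whose p-value exceeds 1 - (1 - alpha)^((k - 1) / i)."""
    k = len(p_values) + 1                 # assumes k - 1 comparisons against a control
    p_sorted = sorted(p_values)
    rejected = []
    for i, p in enumerate(p_sorted, start=1):
        threshold = 1.0 - (1.0 - alpha) ** ((k - 1) / i)
        if p > threshold:
            break                          # first failing index stops the procedure
        rejected.append(p)
    return rejected

# Hypothetical p-values from pairwise comparisons against a control method
print(finner_rejections([0.001, 0.012, 0.030, 0.200]))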
CHAPTER 2

Time Series Models

Time series analytical models have been used in forecasting since as long ago as 1927 [128]. A time series-based model forecasts trends after analyzing the
various characteristics of data indexed in time. Since their first usage, they have been
widely used in scientific research and industry-oriented applications. This chapter
concentrates on univariate time series-based workload forecasting. A univariate time
series can be defined as a collection of measurements of the same variable over time
(typically, at regular time intervals). The essential characteristic of any time series
data is that the order of observation matters and change in order may alter the
significance of the data.
The time series analysis is typically associated with the process of finding a model
to fit the time series data. The observed model can be used to extract the pattern,
forecast future events, and explain the effects of past events on the future. Some of
the essential characteristics of a time series are:

• The trend depicts the direction of the data, i.e. whether it increases or decreases. The direction need not remain the same for a long period of time. According to the Organisation for Economic Co-operation and Development (OECD), ‘‘The trend is the component of a time series that represents variations of low frequency in a time series, the high and medium frequency fluctuations having been filtered out.’’
• Seasonality depicts characteristics similar to the trend; the difference between the two terms is that seasonality shows repetitive patterns.
• Noise refers to the component depicting neither trend nor seasonality in the data.
• Outliers are data points that lie far away from the rest of the data.
• Other common characteristics of time series data are long-run cycles, constant variance over time, and spikes.

This chapter discusses the five basic time series analysis models and uses them
to forecast the different types of workloads on cloud servers. A detailed analysis is
conducted to validate their performance on real-world data traces.


2.1 AUTOREGRESSION
An autoregressive model is used to represent a phenomenon where the future values of a variable are a function of its historical values. It is used to depict a random process in real-world applications of statistics, signal processing, data analysis, etc. Formally, an autoregressive model can be defined as a process that combines the historical data with a white noise term to generate the future outcome of a variable. The model gets its name from its functioning, as it regresses the variable on its own past values. Considering that the cloud workload x is indexed at equally spaced time intervals 1, 2, . . . , t as x1, x2, . . . , xt, the autoregressive model of order p can be defined as eq. (2.1) [91].

x̂t = φ1 × xt−1 + φ2 × xt−2 + . . . + φp × xt−p + ℵt (2.1)


The x̂t and xt are the predicted and actual workloads at time t, respectively; the model parameters are represented as φi (i = 1, 2, . . . , p), each of which can take any value from −1 to +1. The term ℵt denotes white random noise. The working of an autoregressive model is graphically shown in Fig. 2.1.

FIGURE 2.1 Autoregression process
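A minimal least-squares sketch of eq. (2.1) is given below; the synthetic workload series and the order p = 3 are assumptions for illustration, and a library routine (for example, statsmodels' AutoReg) could equally be used. The white noise term ℵt is omitted from the point forecast.

import numpy as np

def fit_ar(x, p):
    """Estimate phi_1..phi_p of eq. (2.1) by ordinary least squares."""
    # Each row holds the p most recent observations preceding the target value
    rows = [x[t - p:t][::-1] for t in range(p, len(x))]
    X = np.array(rows)
    y = x[p:]
    phi, *_ = np.linalg.lstsq(X, y, rcond=None)
    return phi

def ar_forecast(history, phi):
    """One-step-ahead forecast x_hat_t from the last p observations."""
    p = len(phi)
    recent = np.asarray(history[-p:])[::-1]      # x_{t-1}, x_{t-2}, ..., x_{t-p}
    return float(np.dot(phi, recent))

# Synthetic workload series for illustration
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=200)) + 50.0
phi = fit_ar(x, p=3)
print("phi:", phi, "next value:", ar_forecast(x, phi))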

2.2 MOVING AVERAGE


The moving average (MA) model uses the errors in the previous forecasts rather than the previous actual values. According to Box et al., a time series can be modeled using its error series (Ξ) if successive actual values of the series are highly correlated [15]. These error terms can be considered as a zero-mean white noise series generated from a fixed distribution. A typical moving average model of order q can be written as eq. (2.2), where θj represents the weight of the j-th model term. These weights are required neither to be positive nor to sum to unity [15]. The graphical illustration of a moving average model with q model terms is shown in Fig. 2.2, where fMA denotes the function that learns the moving average model.

x̂t = θ1 × ξt−1 + θ2 × ξt−2 + . . . + θq × ξt−q + ℵt (2.2)


FIGURE 2.2 Moving average process
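Given estimates of the weights θ and the q most recent forecast errors ξ, a one-step point forecast per eq. (2.2) reduces to a weighted sum (the white noise term ℵt is omitted), as sketched below; the numeric values are hypothetical, and in practice the weights and the error series are themselves estimated iteratively or with a library routine.

import numpy as np

def ma_forecast(theta, recent_errors):
    """One-step forecast of eq. (2.2): x_hat_t = sum_j theta_j * xi_{t-j}."""
    theta = np.asarray(theta)
    xi = np.asarray(recent_errors)[:len(theta)]   # xi_{t-1}, xi_{t-2}, ..., xi_{t-q}
    return float(np.dot(theta, xi))

# Hypothetical weights and the q most recent forecast errors
theta = [0.6, 0.3, 0.1]
recent_errors = [1.2, -0.4, 0.8]
print(ma_forecast(theta, recent_errors))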

2.3 AUTOREGRESSIVE MOVING AVERAGE


The autoregressive moving average (ARMA) model is a combination of two different forecasting models. It uses the actual historical values along with the errors associated with the previous forecasts. The ARMA process can be represented mathematically as shown in eq. (2.3). The ARMA model parsimoniously represents time series data using two different polynomial fits associated with the autoregression and moving average models, respectively. For complicated time series data, the ARMA process is preferred over pure autoregression and moving average models. The graphical representation of the ARMA model is shown in Fig. 2.3, with p and q model terms for the autoregression and moving average parts.

x̂t = φ1 ×xt−1 +φ2 ×xt−2 +. . .+φp ×xt−p +ℵt +θ1 ×ξt−1 +θ2 ×ξt−2 +. . .+θq ×ξt−q (2.3)

FIGURE 2.3 Autoregressive moving average process

2.4 AUTOREGRESSIVE INTEGRATED MOVING AVERAGE


The more general form of an ARMA process is the autoregressive integrated moving average, commonly referred to as ARIMA. This model is composed of three components, viz. Autoregression (AR), Integration (I), and Moving Average (MA). The ARIMA process is a good choice to model non-stationary time series. It transforms the non-stationary data into stationary data by applying the difference operator. The difference operator is denoted as δ^d, where d is the number of difference terms. For instance, the first-order difference operation can be defined as δ^1 = xt − xt−1. The graphical representation of an ARIMA(p, d, q) is depicted in Fig. 2.4, where p, d, and q are the model terms associated with autoregression, differencing, and moving average respectively. The workload value at time t obtained after applying the difference is denoted as x̃t. The first-order ARIMA is one of the simplest models and can be represented as shown in eq. (2.4), where the term B denotes the backward shift operator (B × xt = xt−1) [91].

\underbrace{(1 - \phi_1 B)}_{AR(1)} \; \underbrace{(1 - B) x_t}_{First\ Difference} = \underbrace{(1 - \theta_1 B) \xi_t}_{MA(1)}    (2.4)

FIGURE 2.4 Autoregressive integrated moving average process
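The ARMA and ARIMA processes of eqs. (2.3) and (2.4) are rarely fitted by hand; the sketch below uses statsmodels on a synthetic series, where order = (p, d, q) covers AR (d = q = 0), MA (p = d = 0), ARMA (d = 0), and ARIMA. The chosen order (1, 1, 1) and the synthetic data are assumptions for illustration.

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Synthetic non-stationary workload series for illustration
rng = np.random.default_rng(1)
series = np.cumsum(rng.normal(loc=0.2, size=300)) + 100.0

# ARIMA(1, 1, 1): one AR term, first-order differencing, one MA term, cf. eq. (2.4)
model = ARIMA(series, order=(1, 1, 1))
fitted = model.fit()
print(fitted.summary())

# One-step-ahead forecast of the next workload value
print(fitted.forecast(steps=1))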



2.5 EXPONENTIAL SMOOTHING


As opposed to the regression-based time series analysis models, the exponential smoothing (ES) model uses the historical actual values together with their corresponding forecasts. This model suggests that the most recent predictions are the most important to consider for better modeling of time series data. The term exponential comes from the principle of exponentially reducing the weights associated with the previous forecasts. The mathematical representation of the model is shown in eq. (2.5), where α is the smoothing constant. Exponential smoothing is graphically illustrated in Fig. 2.5. One obvious characteristic of the model is that, as opposed to the regression-based time series models, it does not need to store a large number of historical values.

x̂t = αxt−1 + (1 − α)x̂t−1 (2.5)

FIGURE 2.5 Exponential smoothing process
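A direct implementation of eq. (2.5) needs only the previous observation and the previous forecast, as in the sketch below; the smoothing constant α = 0.3 and the short workload series are arbitrary illustrative choices.

import numpy as np

def exponential_smoothing(x, alpha=0.3):
    """Apply eq. (2.5): x_hat_t = alpha * x_{t-1} + (1 - alpha) * x_hat_{t-1}."""
    forecasts = np.empty(len(x))
    forecasts[0] = x[0]                      # common convention: seed with the first observation
    for t in range(1, len(x)):
        forecasts[t] = alpha * x[t - 1] + (1 - alpha) * forecasts[t - 1]
    return forecasts

workload = np.array([52.0, 55.0, 50.0, 58.0, 61.0, 57.0])
print(exponential_smoothing(workload))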

2.6 EXPERIMENTAL ANALYSIS


The performance of these models in forecasting the cloud server workloads is assessed with a number of experiments. The workload is forecast with different values of the prediction window size (PWS). The PWS defines the time interval between two consecutive forecasts; for instance, if a model estimates the upcoming workload every 60 minutes, then the length of the prediction window is 60 minutes. This study is conducted with prediction windows of 5, 10, 20, 30, and 60 minutes. Also, the model parameters are estimated using 60% of the data, and the remaining 40% of the data is used to observe the accuracy of the time series models with the estimated parameter values.
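One possible realization of this setup is sketched below: a raw trace is aggregated to the prediction window size and split 60/40 into estimation and evaluation parts. The one-minute synthetic trace and the aggregation by mean are assumptions, since the aggregation function is not fixed here.

import numpy as np

def prepare_trace(trace, pws_minutes, sample_minutes=1, train_fraction=0.6):
    """Aggregate a trace to the prediction window size and split it 60/40."""
    per_window = pws_minutes // sample_minutes
    usable = (len(trace) // per_window) * per_window
    aggregated = trace[:usable].reshape(-1, per_window).mean(axis=1)
    split = int(train_fraction * len(aggregated))
    return aggregated[:split], aggregated[split:]

# Synthetic one-minute workload trace for illustration
rng = np.random.default_rng(2)
minute_trace = rng.uniform(20.0, 80.0, size=6000)

for pws in (5, 10, 20, 30, 60):
    train, test = prepare_trace(minute_trace, pws)
    print(pws, len(train), len(test))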

2.6.1 Forecast Evaluation


The forecast accuracy is measured using mean absolute error (MAE) and mean absolute scaled error (MASE). The data sets fall into two categories, viz. web server workloads and cloud server workloads. The forecast accuracy on web server workloads is shown in Figs. 2.6a and 2.7a using MAE and MASE respectively. The model forecasts the Calgary Trace with the least MAE for prediction windows of length 5, 10, 20, and 30 minutes. For the prediction window of 60 minutes duration, the least mean absolute error is obtained on the forecasts of the Saskatchewan Trace. Similarly, the MASE-based forecast results are depicted in Figs. 2.7a and 2.7b for web and cloud server workloads respectively. It is evident that the D2 forecasts are the most accurate for prediction windows of 5, 10, 30, and 60 minutes as per the MASE. For the 20-minute window, the best forecast accuracy measured using MASE is obtained for D1. Based on the results, the first-order autoregressive process models the web server workloads with better accuracy than the cloud server workloads. For the cloud server workloads, the memory trace (D5) obtained better results than the CPU trace. In general, the first-order autoregressive process learns the pattern of the Calgary trace best.

FIGURE 2.6 Autoregression forecast results on MAE: (a) web workloads, (b) cloud workloads

FIGURE 2.7 Autoregression forecast results on MASE: (a) web workloads, (b) cloud workloads


You might also like