This presentation offers insight into the use of artificial neural networks (ANN) and machine learning for various applications in solar energy. Prepared and presented by Dr. Ali H. A. Alwaeli.
14. PV/T
• The electric power (P) is expressed as:

P_mp = I_mp × V_mp

where I_mp is the current and V_mp is the voltage at the maximum power point. The power unit is watts (W).

• The electrical efficiency (η_e) of a conventional PV module is calculated using the formula below:

η_e = P_mp / (G × A_module)

where G is the solar irradiance in W/m² and A_module is the PV module area in m².
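As a quick numerical illustration, the two formulas above can be coded directly. The sample values for current, voltage, irradiance, and module area below are assumptions for illustration, not data from the slides:

```python
# Sketch of the slide's formulas: P_mp = I_mp * V_mp and
# eta_e = P_mp / (G * A_module). All numeric values are illustrative.

def electrical_power(i_mp, v_mp):
    """Electric power P (W) at the maximum power point."""
    return i_mp * v_mp

def electrical_efficiency(p_mp, g, a_module):
    """Electrical efficiency of a conventional PV module.
    g: solar irradiance (W/m^2), a_module: module area (m^2)."""
    return p_mp / (g * a_module)

p = electrical_power(i_mp=8.0, v_mp=30.0)               # 240.0 W
eta = electrical_efficiency(p, g=1000.0, a_module=1.6)  # 0.15 (15 %)
```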
15. PV/T
• The total of the efficiencies, known as the total (combined) efficiency η_combined, is used to evaluate the overall performance of the system:

η_combined = η_th + η_el

• The thermal efficiency (η_th) of the conventional flat-plate solar collector is calculated using the Hottel–Whillier equations:

η_th = Q_u / (I_t × A_c)

Q_u = A_c F_R [S − U_L (T_i − T_a)]

where Q_u is the useful heat gain, I_t the incident irradiance, A_c the collector area, F_R the heat removal factor, S the absorbed solar radiation, U_L the overall loss coefficient, and T_i and T_a the inlet and ambient temperatures.

• The temperature-dependent electrical efficiency of the PV module, η_el, is given as:

η_el = η_r [1 − β (T_pm − T_r)]

where η_r is the reference efficiency, β the temperature coefficient, T_pm the mean panel temperature, and T_r the reference temperature.
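A minimal sketch of these relations in code, assuming standard Hottel–Whillier symbol names; all numeric values are illustrative assumptions, not measurements from the slides:

```python
# Hedged sketch of the PV/T efficiency relations on this slide.
# Symbol names and the sample values are assumptions for illustration.

def useful_heat(a_c, f_r, s, u_l, t_i, t_a):
    """Q_u = A_c * F_R * [S - U_L * (T_i - T_a)]  (Hottel-Whillier)."""
    return a_c * f_r * (s - u_l * (t_i - t_a))

def thermal_efficiency(q_u, i_t, a_c):
    """eta_th = Q_u / (I_t * A_c)."""
    return q_u / (i_t * a_c)

def electrical_efficiency_temp(eta_r, beta, t_pm, t_r):
    """eta_el = eta_r * [1 - beta * (T_pm - T_r)]."""
    return eta_r * (1.0 - beta * (t_pm - t_r))

q_u = useful_heat(a_c=2.0, f_r=0.8, s=800.0, u_l=6.0, t_i=40.0, t_a=25.0)
eta_th = thermal_efficiency(q_u, i_t=1000.0, a_c=2.0)
eta_el = electrical_efficiency_temp(eta_r=0.15, beta=0.0045, t_pm=55.0, t_r=25.0)
eta_combined = eta_th + eta_el  # total efficiency of the PV/T system
```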
19. PV/T
OUTDOOR TESTING
Alzaabi, A. A., Badawiyeh, N. K., Hantoush, H. O., & Hamid, A. K. (2014).
Electrical/thermal performance of hybrid PV/T system in Sharjah, UAE. International
Journal of Smart Grid and Clean Energy, 3(4), 385-389. C
20. PV/T
Rahman, S., Sarker, M. R. I., Mandal, S., & Beg, M. R. A. (2018). Experimental and Numerical Analysis of a
Stand-Alone PV/T System to Improve its Efficiency. Sch J Appl Sci Res, 1, 28-33. C
22. PV/T
Performance evaluation criteria:
• Electrical: electrical efficiency & power; I–V curve characteristics; electrical exergy & exergy efficiency
• Thermal: thermal efficiency & power; optical efficiency & mass flow rate; thermal exergy & exergy efficiency
• Cost: life cycle costs; cost of energy; payback period
31. Artificial Neural Networks (ANN)
Machine perception with ANN:
• Data types: numerical data, images, video, text, time-series
• Tasks: labeling (classification), clustering
C
32. Artificial Neural Networks (ANN)
Classification
SUPERVISED LEARNING
Example tasks: face detection, object identification, text classification, voice detection, gesture identification, speech-to-text transcription.

Workflow: create numerical data → assign labels to the data → train the ANN → create a correlation.
33. Artificial Neural Networks (ANN)
Ni, D. X. (2007). Application of neural networks to character recognition. Proceedings
of students/faculty research day, CSIS, Pace University, May 4th.
Classification
C
34. Artificial Neural Networks (ANN)
Classification
https://www.analyticsindiamag.com/how-to-create-your-first-artificial-neural-network-in-python/ C
35. Artificial Neural Networks (ANN)
Regression
SUPERVISED LEARNING
Example prediction targets: hardware breakdown, employee turnover, customer churn, health breakdown, machine output.
C
36. Artificial Neural Networks (ANN)
Regression
https://dataaspirant.com/2017/03/02/how-logistic-regression-model-works/ C
37. Artificial Neural Networks (ANN)
Regression
https://dataaspirant.com/2017/03/02/how-logistic-regression-model-works/ C
38. Artificial Neural Networks (ANN)
Regression
https://dataaspirant.com/2017/03/02/how-logistic-regression-model-works/ C
39. Artificial Neural Networks (ANN)
Clustering
UNSUPERVISED
LEARNING
• Comparing documents, images, etc. to surface similar items
• Detecting anomalies or unusual behavior
C
43. Artificial Neural Networks (ANN)
Learning paradigms
Kaplan, S. (2017). Deep generative models for synthetic retinal image generation. C
44. Artificial Neural Networks (ANN)
Learning paradigms
Unsupervised learning vs. supervised learning (clustering). From "An Introduction to Supervised Machine Learning and Pattern Classification: The Big Picture" by Sebastian Raschka. C
50. Artificial Neural Networks (ANN)
MULTILAYER PERCEPTRON (MLP)
An MLP consists of at least three layers of nodes: an input layer, a hidden layer, and an output layer. Except for the input nodes, each node is a neuron that uses a nonlinear activation function. An MLP uses a supervised learning technique called backpropagation for training. Its multiple layers and nonlinear activations distinguish the MLP from a linear perceptron: it can distinguish data that is not linearly separable.
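The structure described above can be sketched as a forward pass in NumPy. The layer sizes, random weights, and sigmoid activation are illustrative choices, not values from the slides:

```python
import numpy as np

# Minimal forward pass of a three-layer MLP (input -> hidden -> output).
# Hidden and output nodes apply a nonlinear (sigmoid) activation, matching
# the description above; input nodes just pass their values through.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mlp_forward(x, w_hidden, b_hidden, w_out, b_out):
    h = sigmoid(x @ w_hidden + b_hidden)   # hidden layer activations
    return sigmoid(h @ w_out + b_out)      # output layer activation

x = np.array([0.5, -1.2, 3.0])            # 3 input nodes
w_h = rng.normal(size=(3, 4))             # 4 hidden neurons
b_h = rng.normal(size=4)
w_o = rng.normal(size=(4, 1))             # 1 output neuron
b_o = rng.normal(size=1)
y = mlp_forward(x, w_h, b_h, w_o, b_o)    # output lies in (0, 1)
```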
51. Artificial Neural Networks (ANN)
MULTILAYER PERCEPTRON (MLP)
Learning occurs by changing connection weights after each piece of data is processed, based on the amount of error in the output compared to the expected result.
52. Artificial Neural Networks (ANN)
MULTILAYER PERCEPTRON (MLP)
MLPs are useful in research for their ability to solve problems stochastically, which often
allows approximate solutions for extremely complex problems like fitness approximation.
MLPs are universal function approximators as shown by Cybenko's theorem, so they can
be used to create mathematical models by regression analysis. As classification is a
particular case of regression when the response variable is categorical, MLPs make good
classifier algorithms.
MLPs were a popular machine learning solution in the 1980s, finding applications in
diverse fields such as speech recognition, image recognition, and machine
translation software, but thereafter faced strong competition from much simpler (and
related) support vector machines. Interest in backpropagation networks returned due to
the successes of deep learning.
54. Artificial Neural Networks (ANN)
FEED-FORWARD
Connections between the nodes do not form a cycle.
The first and simplest type of artificial neural network
devised.
57. Artificial Neural Networks (ANN)
BACKPROPAGATION
• Supervised learning
• Used at each layer to minimize the error between the layer's response and the actual data
• The error at each hidden layer is an average of the evaluated error
58. Artificial Neural Networks (ANN)
BACKPROPAGATION
• N is a neuron.
• N_w is one of N's input weights.
• N_out is N's output.
• N_w = N_w + ΔN_w
• ΔN_w = N_out × (1 − N_out) × N_ErrorFactor
• N_ErrorFactor = N_ExpectedOutput − N_ActualOutput
• This works only for the last layer, as there we know both the actual output and the expected output.
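The output-layer update rule listed above can be sketched directly. The function and variable names, and the numeric values, are assumptions for illustration:

```python
# Sketch of the slide's output-layer weight update:
#   error_factor = expected - actual
#   delta_w      = n_out * (1 - n_out) * error_factor
#   n_w          = n_w + delta_w
# As the slide notes, this form applies only to the last layer,
# where the expected output is known.

def update_weight(n_w, n_out, expected, actual):
    error_factor = expected - actual
    delta_w = n_out * (1.0 - n_out) * error_factor
    return n_w + delta_w

# delta = 0.8 * 0.2 * 0.2 = 0.032, so the weight moves from 0.5 to 0.532
w_new = update_weight(n_w=0.5, n_out=0.8, expected=1.0, actual=0.8)
```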
62. PV/T modeling and performance prediction
โข Why use artificial neural networks (ANN) in solar energy
technologies?
โข How to use artificial neural networks (ANN) in solar energy
technologies?
โข How to utilize regression and classification to invest in solar energy?
63. PV performance time-series prediction
• Short-term power forecasting
• Estimating power loss due to environment
• Estimating the energetic performance of PV/T
• PV fault detection
• Optimizing PV array inclination
• Solar energy forecasting
• Optimizing maximum power point tracking
64. PV performance time-series prediction
C
Senkal, O. & Kuleli, T. (2009). Estimation of solar radiation over Turkey using
artificial neural network and satellite data. Applied Energy, Vol. 86, pp. 1222โ
65. PV performance time-series prediction
Almonacid, F., Rus, C., Hontoria, L., Fuentes, M. & Nofuentes, G. (2009). Characterisation of Si-crystalline PV modules by artificial neural networks. Renewable Energy, Vol. 34. C
66. PV performance time-series prediction
EXAMPLE 1: PREDICTIVE
PERFORMANCE OF SOLAR ENERGY
SYSTEM
Yaïci, W., Longo, M., Entchev, E., & Foiadelli, F. (2017).
Simulation study on the effect of reduced inputs of artificial
neural networks on the predictive performance of the solar
energy system. Sustainability, 9(8), 1382.
Multi-Layer Perceptron (MLP) Feed Forward (FF) and back propagation
(BP) neural network structure
Comparison with experimental data from a solar energy system tested in
Ottawa, Canada during two years under different weather conditions
70. PV performance time-series prediction
Firefly algorithm (FA) and Particle Swarm Optimization (PSO) applied
to train NN
EXAMPLE 2: SHORT TERM POWER
FORECASTING
Demirdelen, T., Aksu, I. O., Esenboga, B., Aygul, K.,
Ekinci, F., & Bilgili, M. (2019). A new method for
generating short-term power forecasting based on artificial
neural networks and optimization methods for solar
photovoltaic power plants. In Solar Photovoltaic Power
Plants (pp. 165-189). Springer, Singapore.
Multi-Layer Feed Forward (MLFF) neural network structure
Data from a 1 MW PV power plant in Turkey are used to estimate output power by real-time data mining for short-term prediction.
72. PV performance time-series prediction
Example 2
Network structure:
• Inputs: ambient temperature [°C], solar radiation [W/m²], PV panel temperature [°C]
• Input layer → 24 interconnections → hidden layer → 8 interconnections → output layer
• Output: PV power (W)
73. PV performance time-series prediction
Example 2
During network training, 41 parameters are trained. The algorithms are run for 300 iterations, and during training, 20 individuals are used for each optimization method. A total of 100,479 data sets are sent to the network. The trained parameters are:
• Weights for the 24 interconnections between the input and the hidden layer
• Bias values for the 8 neurons in the hidden layer
• Weights for the 8 interconnections between the hidden and the output layer
• The bias value for the single output neuron
• The training coefficients during network training
Procedure
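The 41-parameter count described above can be verified quickly, assuming (as the interconnection counts imply) a fully connected 3-8-1 network:

```python
# Parameter count for a fully connected 3-8-1 network, matching the
# slide's breakdown: 24 + 8 + 8 + 1 = 41 trained parameters
# (the "training coefficients" are counted separately by the paper).

n_in, n_hidden, n_out = 3, 8, 1
input_to_hidden_weights = n_in * n_hidden     # 24 interconnections
hidden_biases = n_hidden                      # 8 bias values
hidden_to_output_weights = n_hidden * n_out   # 8 interconnections
output_bias = n_out                           # 1 bias value
total = (input_to_hidden_weights + hidden_biases
         + hidden_to_output_weights + output_bias)
print(total)  # 41
```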
75. PV/T performance time-series prediction
Particle Swarm Optimization (PSO) applied to train NN
EXAMPLE 3: ESTIMATING ENERGETIC
PERFORMANCE
Alnaqi, A. A., Moayedi, H., Shahsavar, A., & Nguyen, T.
K. (2019). Prediction of energetic performance of a
building integrated photovoltaic/thermal system thorough
artificial neural network and hybrid particle swarm
optimization models. Energy Conversion and
Management, 183, 137-148.
Multi-Layer Feed-Forward Back-Propagation (FFBP) neural network
structure
Applied to a building-integrated photovoltaic/thermal (BIPV/T) system.
76. PV/T performance time-series prediction
Example 3
• Inputs: channel length, channel depth, channel width, air mass flow rate
• Output: Performance Evaluation Criteria (PEC), the total rate of useful thermal and electrical energy available from the BIPV/T system to the heating load of the fresh air
83. PV/T modeling and performance prediction
Al-Waeli et al. (2019). Artificial neural network modeling and analysis of
photovoltaic/thermal system based on the experimental study.
84. PV/T modeling and performance prediction
Al-Waeli et al. (2019). Artificial neural network modeling and analysis of
photovoltaic/thermal system based on the experimental study.
85. Statistical methods
• Coefficient of determination: how well a regression model is capable of describing a data set
• Mean Absolute Percentage Error: a measure of prediction accuracy of a forecasting method in statistics
• Root Mean Square Error: a measure of the differences between values predicted by an estimator and the values observed
• Mean Absolute Error: a measure of the difference between two continuous variables
• Mean Square Error: a measure of the quality of an estimator
86. Statistical methods
COEFFICIENT OF DETERMINATION
A measure of how well a regression model is capable of describing a data set; used to evaluate the validity of the predicted results compared to the actual (experimental) results.

R² = 1 − Σᵢ (yᵢ − ŷᵢ)² / Σᵢ (yᵢ − ȳ)²

where:
yᵢ = experimental value of y
ŷᵢ = predicted value of y
ȳ = the mean of the experimental values
N = the number of values
87. Statistical methods
MEAN ABSOLUTE ERROR
A measure of the difference between two continuous variables:

MAE = (1/N) Σᵢ |yᵢ − ŷᵢ|

MAE is the average vertical distance between each point and the identity line; it is also the average horizontal distance between each point and the identity line.
89. Statistical methods
MEAN SQUARE ERROR
Measures the average of the squares of the errors, that is, the average squared difference between the estimated values and what is estimated. The MSE is a measure of the quality of an estimator: it is always non-negative, and values closer to zero are better.
90. Statistical methods
ROOT MEAN SQUARE ERROR
A measure of the difference between a data set and a corresponding fit; the RMSE asymptotically converges to the standard deviation from the model's predicted value for sufficiently large data sets. It measures the differences between values (sample or population values) predicted by a model or an estimator and the values observed.
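The metrics defined on the preceding slides can be sketched in plain Python. The sample series y (experimental) and y_hat (predicted) are illustrative assumptions:

```python
import math

# Plain-Python versions of the error metrics defined on these slides,
# with y the experimental series and y_hat the model predictions.

def mae(y, y_hat):
    return sum(abs(a - b) for a, b in zip(y, y_hat)) / len(y)

def mse(y, y_hat):
    return sum((a - b) ** 2 for a, b in zip(y, y_hat)) / len(y)

def rmse(y, y_hat):
    return math.sqrt(mse(y, y_hat))

def mape(y, y_hat):
    return 100.0 * sum(abs((a - b) / a) for a, b in zip(y, y_hat)) / len(y)

def r_squared(y, y_hat):
    y_mean = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, y_hat))
    ss_tot = sum((a - y_mean) ** 2 for a in y)
    return 1.0 - ss_res / ss_tot

y = [10.0, 12.0, 14.0, 16.0]        # illustrative "experimental" values
y_hat = [9.5, 12.5, 13.0, 16.5]     # illustrative "predicted" values
```

For MAE, MSE, and RMSE, lower is better; for R², values closer to 1 indicate a better fit.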
91. Statistical methods
Al-Waeli et al. [2018]. Comparison of prediction methods of PV/T nanofluid and nano-PCM system
using a measured dataset and artificial neural network
92. Statistical methods
Al-Waeli et al. [2018]. Comparison of prediction methods of PV/T nanofluid and nano-PCM system
using a measured dataset and artificial neural network
93. Statistical methods
Al-Waeli et al. [2018]. Comparison of prediction methods of PV/T nanofluid and nano-PCM system
using a measured dataset and artificial neural network
94. Statistical methods
Al-Waeli et al. [2018]. Comparison of prediction methods of PV/T nanofluid and nano-PCM system
using a measured dataset and artificial neural network
LOWER IS BETTER
95. Food for Thought
โข What is the difference between Artificial Intelligence and
computational intelligence?
โข What is Particle Swarm Optimization?
โข What is Firefly Algorithm?
โข Which is a better algorithm for optimization? Why?
โข How to interpret skewness and kurtosis?
โข How to select the appropriate number of hidden layers?
96. The How?
• Initial weights (small random values ∈ [−1, 1])
• Transfer function (how the inputs and the weights are combined to produce output)
• Error estimation
• Weight adjustment
• Number of neurons
• Data representation
• Size of training set
97. Textbook in PV/T
https://www.amazon.com/Photovoltaic-Thermal-Systems-Principles-Applications/dp/3030278239/ref=sr_1_1?keywords=PV%2FT+al-
waeli&qid=1563014343&s=gateway&sr=8-1
101. Follow me on social media
www.linkedin.com/in/ali-al-waeli-
a76a33124
https://www.researchgate.net/profile/Ali_Al-Waeli
https://www.facebook.com/AliHWaeliAR/
https://www.facebook.com/AliHwaeli/
102. Disclaimer
Some of the figures/tables in this presentation are not owned by the presenter; they are material copyrighted to their rightful owners. This presentation is intended for non-profit educational purposes. Slides with copyrighted material (images/tables) contain the letter C in the bottom right corner.

The actual presentation contains elements that are not mentioned in the PowerPoint, and even edits to the PowerPoint. Still, this presentation contains useful information and figures with regard to machine learning, artificial neural networks, and photovoltaic/thermal (PV/T) collectors.
This presentation was prepared independently by the presenter and is
owned by: Dr. Ali H. A. Alwaeli
Editor's Notes
Neuron (unit / node / cell)
Activation function
All classification tasks depend upon labeled datasets; that is, humans must transfer their knowledge to the dataset in order for a neural network to learn the correlation between labels and data. This is known as supervised learning.
Hardware breakdowns (data centers, manufacturing, transport)
Health breakdowns (strokes, heart attacks based on vital stats and data from wearables)
Customer churn (predicting the likelihood that a customer will leave, based on web activity and metadata)
Employee turnover (ditto, but for employees)
The agent learns to behave in an environment based on these rewards. The goal in reinforcement learning is to understand the limitations and merits of an algorithm and to develop efficient learning algorithms.
One or more hidden layers
Influence of the number of layers on the pattern recognition ability of MLP
Stage (1) and (2) same input / (3) different input
y = m·x + b, where m is the slope and b the intercept
interpretation is clear
of an estimator (of a procedure for estimating an unobserved quantity)