1. APPLICATION OF STATISTICAL LEARNING TECHNIQUES AS PREDICTIVE TOOLS FOR MACHINING PROCESSES
by
Shibaprasad Bhattacharya
ROLL: 001911702013
Under the guidance of
Prof. Shankar Chakraborty
Department of Production Engineering
4. Objectives
Integration of statistical learning techniques into machining processes
Incorporating non-parametric methods to bypass the rigid assumptions of parametric methods
Laying down a framework for selecting the right model while keeping different trade-offs in mind
Selecting the right parameter values for the different models
8. Difference between Statistical Learning and Machine Learning
Different communities and buzzwords: Statistics/Operations Research/Mathematics vs Computer Science
Size of the dataset
Number of variables
Emphasis on why and what vs how
9. Steps involved in Statistical Learning
Selecting the dataset
Dividing it into training and testing sets
Training the model with the training data
Testing it with the testing data
Validating the testing results with error estimators
Accepting or rejecting the model based on those results
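The steps above can be sketched end to end on synthetic data; the dataset, the 18/9 split and the least-squares model below are illustrative assumptions, not results from the slides.

```python
import numpy as np

# Illustrative train/test workflow (synthetic data; all values assumed)
rng = np.random.default_rng(0)

# 1. Select the dataset: 27 points, 3 input parameters
X = rng.uniform(0, 1, size=(27, 3))
y = 5 + 2.0 * X[:, 0] + 1.5 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 0.05, 27)

# 2. Divide it into training (18) and testing (9) sets
idx = rng.permutation(27)
train, test = idx[:18], idx[18:]

# 3. Train the model (ordinary least squares with an intercept term)
A = np.column_stack([np.ones(18), X[train]])
coef, *_ = np.linalg.lstsq(A, y[train], rcond=None)

# 4. Test it with the held-out testing data
A_test = np.column_stack([np.ones(9), X[test]])
pred = A_test @ coef

# 5. Validate the testing results with an error estimator (MAPE, in percent)
mape = 100 * np.mean(np.abs((y[test] - pred) / y[test]))
```

The final accept/reject decision then compares `mape` (or another estimator) against the values obtained by competing models.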
11. Statistical Learning Techniques
Supervised: Linear Regression, KNN, Trees, Naïve Bayes, Logistic Regression
Unsupervised: Clustering, PCA, LDA, Apriori, FP-growth
Reinforcement: Hidden Markov model
13. Underfit
When the model is too simple to capture the pattern in the data
Performs poorly on both the training and the testing set
Can be recognized by looking at the training error
Switching to a more complex model will help solve it
Adding more features will also help
14. Overfit
When the model memorises the training data too closely
It captures noise instead of the underlying pattern
Performs well on the training set
Performs poorly on the testing set
Reducing the number of features will help
Adding more data can also solve this problem
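A minimal way to see both failure modes is to fit polynomials of different degrees to the same noisy data; the function, noise level and degrees below are illustrative assumptions.

```python
import numpy as np

# Under- vs overfitting on synthetic data (all values illustrative)
rng = np.random.default_rng(1)
x_train = np.linspace(0, 1, 10)
x_test = np.linspace(0.05, 0.95, 10)
f = lambda x: np.sin(2 * np.pi * x)
y_train = f(x_train) + rng.normal(0, 0.1, 10)
y_test = f(x_test) + rng.normal(0, 0.1, 10)

def rmse(x, y, coeffs):
    """Root mean squared error of a fitted polynomial on (x, y)."""
    return np.sqrt(np.mean((np.polyval(coeffs, x) - y) ** 2))

# Degree 1 is too simple (underfit): high error on train AND test
underfit = np.polyfit(x_train, y_train, 1)

# Degree 9 interpolates all 10 training points (overfit):
# near-zero training error, but it has memorised the noise
overfit = np.polyfit(x_train, y_train, 9)
```

Comparing `rmse(x_train, y_train, ...)` against `rmse(x_test, y_test, ...)` for the two fits shows the signatures described above: the underfit model has a large training error, the overfit model a tiny one.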
15. Trade-off in Statistical Learning
Trade-off between the Bias and Variance
Trade-off between Prediction accuracy and Interpretability
16. Bias-Variance Trade-off
Bias
Introduced by oversimplified model assumptions
The difference between the average prediction of the built model and the actual value it is trying to predict
Variance
Error induced by the randomness of the training data
A high-variance model fails to generalize and hence gives a higher testing error
19. Error estimators
Mean absolute percentage error or MAPE: MAPE = (100/n) Σ |Aᵢ − Pᵢ| / Aᵢ
Root mean squared percentage error or RMSPE: RMSPE = 100 √[(1/n) Σ ((Aᵢ − Pᵢ)/Aᵢ)²]
Root mean squared logarithmic error: RMSLE = √[(1/n) Σ (log(Pᵢ + 1) − log(Aᵢ + 1))²]
20. Error estimators
Correlation coefficient or R: R = Σ (Aᵢ − Ā)(Pᵢ − P̄) / √[Σ (Aᵢ − Ā)² Σ (Pᵢ − P̄)²]
Root relative squared error or RRSE: RRSE = √[Σ (Pᵢ − Aᵢ)² / Σ (Aᵢ − Ā)²]
A = Actual value
P = Predicted value
Ā = Mean of the actual values
P̄ = Mean of the predicted values
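The five estimators above, written out for numpy arrays using the standard definitions (A = actual values, P = predicted values):

```python
import numpy as np

def mape(A, P):
    """Mean absolute percentage error, in percent."""
    return 100 * np.mean(np.abs((A - P) / A))

def rmspe(A, P):
    """Root mean squared percentage error, in percent."""
    return 100 * np.sqrt(np.mean(((A - P) / A) ** 2))

def rmsle(A, P):
    """Root mean squared logarithmic error (log1p(x) = log(x + 1))."""
    return np.sqrt(np.mean((np.log1p(P) - np.log1p(A)) ** 2))

def corr_r(A, P):
    """Pearson correlation coefficient between actual and predicted values."""
    a, p = A - A.mean(), P - P.mean()
    return np.sum(a * p) / np.sqrt(np.sum(a ** 2) * np.sum(p ** 2))

def rrse(A, P):
    """Root relative squared error (1.0 means no better than predicting the mean)."""
    return np.sqrt(np.sum((P - A) ** 2) / np.sum((A - A.mean()) ** 2))
```

A perfect predictor gives MAPE = RMSPE = RMSLE = RRSE = 0 and R = 1; predicting the mean of the actual values gives RRSE = 1.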
21. Statistical Learning Techniques as Predictive Tools for Machining Processes
1. Prediction of Responses in a Dry Turning Operation: A
Comparative Analysis
2. Prediction of Responses in a CNC Milling Operation using
Random Forest Regressor
3. Predicting Responses for Turning High Strength Steel
Grade H with XGBoost
23. Experimental details
Dry turning operation using a heavy duty lathe
Input parameters: Cutting speed, Feed rate and Depth of cut
Output responses: Surface roughness, Cutting force and Material removal rate
Number of training points: 18
Number of testing points: 9
Statistical learning techniques used:
Multivariate regression analysis
Artificial neural network (ANN)
Fuzzy logic
Adaptive neuro-fuzzy inference system (ANFIS)
25. Multivariate regression analysis
It generally follows the form:
Y = β0 + Σ βᵢXᵢ + Σ βᵢⱼXᵢXⱼ + ε
where β0 is the Y-intercept coefficient, β1-βn are the main-effect coefficients and the βij are the interaction coefficients
These coefficients are initially unknown and are computed by fitting the data.
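Fitting such a model reduces to ordinary least squares once the interaction columns are added to the design matrix; the data below are synthetic and the coefficient values assumed.

```python
import numpy as np

# Second-order regression with interactions on synthetic data
# (three inputs x1..x3, 18 training points, as in the turning study;
#  the true coefficients here are illustrative assumptions)
rng = np.random.default_rng(2)
X = rng.uniform(1, 2, size=(18, 3))
y = 3 + 1.2 * X[:, 0] - 0.8 * X[:, 1] + 0.5 * X[:, 0] * X[:, 1] \
    + rng.normal(0, 0.01, 18)

# Design matrix: intercept, main effects, pairwise interaction terms
D = np.column_stack([
    np.ones(18),                                          # beta_0
    X,                                                    # beta_1 .. beta_3
    X[:, 0] * X[:, 1], X[:, 0] * X[:, 2], X[:, 1] * X[:, 2],  # beta_ij
])

# Compute the initially unknown coefficients by fitting the data
beta, *_ = np.linalg.lstsq(D, y, rcond=None)
residual = y - D @ beta
```

Predictions for new inputs are obtained by building the same design-matrix row and taking its dot product with `beta`.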
27. ANN
Designed to imitate the way biological neurons process information
The processing units/nodes/neurons are the building blocks of an ANN
Type of neural network considered: Feedforward neural network
Algorithm used: Backpropagation (Levenberg-Marquardt)
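A one-hidden-layer feedforward network trained by backpropagation can be written out in a few lines; note that the slides use the Levenberg-Marquardt algorithm, whereas this sketch substitutes plain gradient descent to stay short, and the data are synthetic.

```python
import numpy as np

# Minimal feedforward network: 2 inputs -> 8 tanh units -> 1 output
# (architecture, data and learning rate are illustrative assumptions)
rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(50, 2))
y = (X[:, 0] ** 2 + X[:, 1]).reshape(-1, 1)

W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)   # input -> hidden
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)   # hidden -> output
lr, n = 0.1, len(X)

mse0 = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))

for _ in range(3000):
    # Forward pass through the network
    h = np.tanh(X @ W1 + b1)
    out = h @ W2 + b2
    # Backward pass: propagate the squared-error gradient layer by layer
    err = out - y
    gW2 = h.T @ err / n; gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)       # tanh'(z) = 1 - tanh(z)^2
    gW1 = X.T @ dh / n; gb1 = dh.mean(0)
    # Gradient-descent update (LM would instead solve a damped normal system)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

Levenberg-Marquardt replaces the last update with a damped Gauss-Newton step, which is why it converges in far fewer iterations on small networks like the one in the slides.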
28. Fuzzy Logic
Deals with imprecise information to arrive at logical conclusions
Input values are generally converted to linguistic terms like High or Low
A fuzzy logic unit contains a fuzzifier, membership functions, a fuzzy rule base, an inference engine and a defuzzifier
Fuzzy logic is generally based on if-then rules of the form:
Rule 1: If x1 is A1 and x2 is B1 and x3 is C1 and x4 is D1, Then output (O) is E1, else
Rule 2: If x1 is A2 and x2 is B2 and x3 is C2 and x4 is D2, Then output (O) is E2, else
…
Rule n: If x1 is An and x2 is Bn and x3 is Cn and x4 is Dn, Then output (O) is En.
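The fuzzify → fire rules → defuzzify pipeline can be illustrated with a single input; the triangular membership functions, speed ranges and rule outputs below are toy assumptions, and defuzzification uses a zero-order Sugeno-style weighted average rather than the full Mamdani scheme a fuzzy unit may implement.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def predict_roughness(speed):
    # Fuzzifier: degree to which speed (assumed range, m/min) is Low or High
    low = tri(speed, 0, 50, 150)
    high = tri(speed, 50, 150, 200)
    # Rule base (illustrative):
    # Rule 1: If speed is Low,  Then roughness is 3.0 um
    # Rule 2: If speed is High, Then roughness is 1.0 um
    # Defuzzifier: weighted average of rule outputs by firing strength
    num = low * 3.0 + high * 1.0
    den = low + high
    return num / den if den else None
```

At speed 100 both rules fire at strength 0.5, so the defuzzified roughness is the midpoint 2.0; at the peaks of the membership functions only one rule fires.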
39. Experimental details
CNC Milling Process
Input parameters: Cutting speed, Feed rate, Depth of cut and Width of cut
Output responses: Surface roughness, Material removal rate and Active energy consumption
Number of training points: 21
Number of testing points: 6
Statistical learning techniques used: Random Forest
41. Random Forest
Bagging technique: a parallel ensemble method
Multiple weak learners are combined
This brings more stability
An RF is built by combining multiple decision trees
The final output is obtained by aggregating the predictions of all the trees
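A sketch with scikit-learn's implementation, using synthetic data sized like the milling study (21 training and 6 testing points); the response function and tree count are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic milling-style data: speed, feed, depth of cut, width of cut
rng = np.random.default_rng(4)
X = rng.uniform(0, 1, size=(27, 4))
y = X[:, 0] + 2 * X[:, 1] * X[:, 2] + 0.1 * X[:, 3]

X_train, X_test = X[:21], X[21:]
y_train, y_test = y[:21], y[21:]

# 100 trees grown in parallel on bootstrap samples (bagging);
# the forest's prediction is the average over all trees
rf = RandomForestRegressor(n_estimators=100, random_state=0)
rf.fit(X_train, y_train)
pred = rf.predict(X_test)
```

Averaging many deep, individually unstable trees is what gives the ensemble the stability mentioned above.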
51. Experimental details
CNC Turning Process
Input parameters: Cutting speed, Feed rate, and Depth of cut
Output responses: Surface roughness and Material removal rate
Number of training points: 100
Number of testing points: 25
Statistical learning techniques used: XGBoost
54. XGBoost
Boosting technique: a sequential ensemble method
Another tree-based method
Like Random Forest, it combines weak learners
Instead of building them in parallel, it builds them sequentially
56. Process Parameters
nrounds: the number of trees in the model
eta: the learning rate; range: 0 to 1
max_depth: the greatest depth to which a tree can grow
early_stopping_rounds: the number of rounds after which training stops if the predictions show no improvement
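The parameters above are XGBoost's names; the sketch below uses scikit-learn's gradient boosting as a stand-in with the analogous knobs, on synthetic data sized like the turning study (100 training, 25 testing points).

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Synthetic turning-style data (response function is an assumption)
rng = np.random.default_rng(5)
X = rng.uniform(0, 1, size=(125, 3))
y = X[:, 0] + X[:, 1] * X[:, 2]

gb = GradientBoostingRegressor(
    n_estimators=300,        # analogue of nrounds: number of trees
    learning_rate=0.1,       # analogue of eta: shrinkage per tree, in (0, 1]
    max_depth=3,             # greatest depth to which a tree can grow
    n_iter_no_change=10,     # analogue of early_stopping_rounds
    validation_fraction=0.2, # held-out fraction used to detect no-improvement
    random_state=0,
)
gb.fit(X[:100], y[:100])
pred = gb.predict(X[100:])
```

Early stopping caps the sequential growth of the ensemble: once 10 successive trees fail to improve the validation score, no further trees are added even if `n_estimators` has not been reached.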
63. Statistical Index   Ra      MRR
MAPE                    2.34    13.88
RMSPE                   3       18.33
RMSLE                   0.014   0.16
R                       0.99    0.99
RRSE                    0.035   0.115
64. Conclusions
Selecting the best technique by comparison is important
Hybrid learning approaches outclass their standalone counterparts
Parametric methods are straightforward to model
Parametric methods work well even with smaller datasets
Non-parametric methods bypass the rigid assumptions of parametric methods
Non-parametric methods can easily accommodate all kinds of variables
Selection of the right parameter values is important for non-parametric methods
65. Future Scope of Work
Extending the current work by incorporating different types of variables
Example: coolant type, tool diameter, etc.
Including several conditions as input variables
Example: skill level of the worker, tool quality, etc.
Including budget constraints as an output variable to predict the expected cost