Q1:
Ans:
data <- read.table("http://fisher.stat.wmich.edu/joe/Stat2600/GA/xmaty.text", sep=" ", header=TRUE)
attach(data)
head(data)
y <- data[,7]
xmat <- data[,1:6]
> head(data)
x1 x2 x3 x4 x5 x6 y
1 38.7 13.5 70.2 62.6 60.8 58.6 -12.7
2 69.0 75.2 48.7 43.2 68.6 64.7 -36.4
3 55.1 76.5 51.5 67.5 58.5 87.9 -53.9
4 77.3 84.7 48.5 79.5 74.2 93.9 -56.5
5 62.4 27.3 112.8 63.0 55.9 73.9 -83.8
6 49.5 47.8 42.9 79.4 53.0 83.7 -20.2
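As a side note (a sketch, not part of the assignment): `attach()` can silently mask workspace variables, so the same split can be done by indexing columns by name instead. The rows below are copied from the `head(data)` output above so the snippet runs without the network.

```r
# Rebuild the six rows printed by head(data), then split the
# response and predictors without attach().
df <- data.frame(
  x1 = c(38.7, 69.0, 55.1, 77.3, 62.4, 49.5),
  x2 = c(13.5, 75.2, 76.5, 84.7, 27.3, 47.8),
  x3 = c(70.2, 48.7, 51.5, 48.5, 112.8, 42.9),
  x4 = c(62.6, 43.2, 67.5, 79.5, 63.0, 79.4),
  x5 = c(60.8, 68.6, 58.5, 74.2, 55.9, 53.0),
  x6 = c(58.6, 64.7, 87.9, 93.9, 73.9, 83.7),
  y  = c(-12.7, -36.4, -53.9, -56.5, -83.8, -20.2)
)
y    <- df$y                      # column 7, the response
xmat <- df[, paste0("x", 1:6)]    # columns 1:6, the predictors
```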
Q2:
Ans:
lmbest2 <- function(y, xmat){
  # Returns sigmahat and R-squared for every linear model with
  # exactly 2 predictors (2 x's), sorted in decreasing order of R-squared.
  p <- ncol(xmat)
  coll <- c()
  for(i in 1:(p-1)){
    for(j in (i+1):p){
      fit <- lm(y ~ xmat[,i] + xmat[,j])
      sfit <- summary(fit)
      tmp <- c(i, j, sfit$sigma, sfit$r.squared)
      coll <- rbind(coll, tmp)
    }
  }
  ind <- order(coll[,4], decreasing=TRUE)
  collf <- coll[ind,]
  colnames(collf) <- c("i", "j", "sigmahat", "R-squared")
  rownames(collf) <- rep("Model", nrow(collf))
  return(collf)
}
Q3:
Ans:
lmbest2(y,xmat)
Top 3 models are:
A better model has a smaller sigmahat and a larger R-squared. Model 1 has the smallest sigmahat and the largest R-squared, followed by Model 2 and then Model 3. Hence, among these three, Model 1 is the best fit, Model 3 is the worst, and Model 2 lies in between.
Even the best two-predictor model only reaches R-squared = 0.479; since a good fit has R-squared close to 1, better-fitting models likely exist, so I would consider models with more variables involved.
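The suggestion to try models with more variables can be sketched by generalizing lmbest2 to an arbitrary subset size via combn(). This is a hedged sketch, not part of the original solution: the name lmbestk and its interface are hypothetical, and it assumes xmat has no column literally named "y".

```r
# Sketch: sigmahat and R-squared for all linear models with exactly
# k predictors, sorted by decreasing R-squared (generalizes lmbest2).
lmbestk <- function(y, xmat, k){
  p <- ncol(xmat)
  combos <- combn(p, k)                  # each column is one size-k subset
  coll <- t(apply(combos, 2, function(idx){
    df   <- data.frame(xmat[, idx, drop = FALSE])
    sfit <- summary(lm(y ~ ., data = df))  # y found by lexical scoping
    c(idx, sfit$sigma, sfit$r.squared)
  }))
  coll <- coll[order(coll[, k + 2], decreasing = TRUE), , drop = FALSE]
  colnames(coll) <- c(paste0("i", 1:k), "sigmahat", "R-squared")
  rownames(coll) <- rep("Model", nrow(coll))
  coll
}
```

If this sketch is correct, lmbestk(y, xmat, 2) should reproduce the ordering of lmbest2, and lmbestk(y, xmat, 3) would rank all three-predictor models.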