STPGA (version 5.2.1)

GenAlgForSubsetSelectionNoTest: Genetic algorithm for subset selection with no given test set

Description

Uses a genetic algorithm to select \(n_{Train}\) individuals so that the optimality criterion is minimized. The test set is taken as the complement of the selected training individuals.

Usage

GenAlgForSubsetSelectionNoTest(P, ntoselect, npop = 100, nelite = 5, keepbest = TRUE,
  tabu = TRUE, tabumemsize = 1, mutprob = .8, mutintensity = 1,
  niterations = 500, minitbefstop = 200, niterreg = 5,
  lambda = 1e-06, plotiters = FALSE, plottype = 1, errorstat = "PEVMEAN2",
  C = NULL, mc.cores = 1, InitPop = NULL,
  tolconv = 1e-07, Vg = NULL, Ve = NULL, Fedorov = FALSE)

Arguments

P

Depending on the criterion, this is either a numeric data matrix or a symmetric similarity matrix. When it is a data matrix, the identifiers of the candidate individuals should be supplied as its row names (and also as column names in the case of a similarity matrix). For criteria that use relationships, this is the inverse of the relationship matrix, with row and column names set to the identifiers of the candidate individuals.

ntoselect

\(n_{Train}\): the number of individuals to select for the training set.

npop

genetic algorithm parameter, number of solutions at each iteration

nelite

genetic algorithm parameter, number of solutions selected as elite parents which will generate the next set of solutions.

keepbest

genetic algorithm parameter, TRUE or FALSE. If TRUE then the best solution is always kept in the next generation of solutions (elitism).

tabu

genetic algorithm parameter, TRUE or FALSE. If TRUE then the solutions that are saved in tabu memory will not be retried.

tabumemsize

genetic algorithm parameter, an integer > 0: the number of generations to hold in tabu memory.

mutprob

genetic algorithm parameter, probability of mutation for each generated solution.

mutintensity

mean of the Poisson variable used to decide the number of mutations for each cross.

niterations

genetic algorithm parameter, number of iterations.

minitbefstop

genetic algorithm parameter, number of iterations before stopping if no change is observed in criterion value.

niterreg

genetic algorithm parameter, number of iterations in which regressions are used; an integer with a minimum value of 1.

lambda

scalar shrinkage parameter (\(\lambda>0\)).

plotiters

plot the convergence: TRUE or FALSE. Default is FALSE.

plottype

type of plot; default is 1. Possible values are 1, 2 and 3.

errorstat

the optimality criterion to minimize; one of the built-in criteria. Default is "PEVMEAN2". User-defined criterion functions can also be supplied, as shown in the examples.

mc.cores

number of cores to use.

InitPop

a list of initial solutions

tolconv

if the algorithm cannot improve errorstat by more than tolconv over the last minitbefstop iterations, it stops.

C

contrast matrix.

Vg

covariance matrix between traits generated by the relationship K (only for multi-trait version of PEVMEANMM).

Ve

residual covariance matrix for the traits (only for multi-trait version of PEVMEANMM).

Fedorov

whether Fedorov's exchange algorithm from the AlgDesign package should be used to generate initial solutions.
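
A hedged sketch of what a criterion such as "DOPT" measures for a candidate training set may help when writing a user-defined errorstat. The function name `dopt_value` and its argument list below are illustrative only; the exact signature STPGA expects for user-defined criteria should be taken from the package examples.

```r
# Illustrative D-optimality-style criterion (hypothetical helper, not the
# function STPGA calls internally). The algorithm minimizes the criterion,
# so "maximize det(X'X)" becomes "minimize -log det(X'X + lambda * I)".
dopt_value <- function(Train, X, lambda = 1e-9) {
  Xt <- X[rownames(X) %in% Train, , drop = FALSE]        # rows of the candidate set
  M  <- crossprod(Xt) + lambda * diag(ncol(Xt))          # regularized information matrix
  -as.numeric(determinant(M, logarithm = TRUE)$modulus)  # smaller value = better design
}

# Small named design matrix like the one built in the Examples
grid <- expand.grid(i = -1:1, j = -1:1)
X <- as.matrix(cbind(1, grid$i, grid$j, grid$i^2, grid$j^2, grid$i * grid$j))
rownames(X) <- paste0("x", 1:9)
dopt_value(c("x1", "x3", "x7", "x9"), X)  # criterion value for the four corner points
```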

Value

A list of length nelite + 1. The first nelite elements of the list are the optimized training samples of size \(n_{Train}\), listed in increasing order of the optimization criterion. The last element is a vector that stores the minimum value of the objective function at each iteration.
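
As a minimal sketch of consuming this return value (the list `out` below is a mock with the same shape, not output from an actual run; in a real call the candidate identifiers would come from `rownames(P)`):

```r
# Mock result with the documented structure (here nelite = 2):
# nelite training sets in increasing order of the criterion,
# followed by the vector of per-iteration minimum criterion values.
out <- list(c("x1", "x3", "x7", "x9"),     # best training set
            c("x1", "x2", "x8", "x9"),     # second-best training set
            c(5.2, 4.1, 3.8, 3.8, 3.7))    # objective minimum at each iteration

best_train <- out[[1]]                       # sample with the lowest criterion
conv       <- out[[length(out)]]             # convergence trace, e.g. plot(conv, type = "l")
candidates <- paste0("x", 1:9)               # stands in for rownames(P)
test_set   <- setdiff(candidates, best_train)  # the test set is the complement
test_set                                     # "x2" "x4" "x5" "x6" "x8"
```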

Examples

###################### Example: three-level designs for the
# second-order model in two factors with a square design region
X<-matrix(0,nrow=3^2,ncol=5)
ij=0

for (i in -1:1){
  for (j in -1:1){
    ij=ij+1
    X[ij,]<-c(i,j, i^2,j^2, i*j)
    
  }
}
X<-cbind(1,X)
D<-as.matrix(dist(X))
K<-tcrossprod(X)
rownames(K)<-colnames(K)<-rownames(D)<-colnames(D)<-rownames(X)<-paste("x",1:3^2, sep="")
X
library(STPGA)
ListTrain1 <- GenAlgForSubsetSelectionNoTest(P=X, ntoselect=4, InitPop=NULL,
             npop=100, nelite=5, mutprob=.5, mutintensity = 1,
             niterations=200, minitbefstop=20, tabu=FALSE, tabumemsize = 0, plotiters=FALSE,
             lambda=1e-9, errorstat="DOPT", mc.cores=1)

ListTrain2 <- GenAlgForSubsetSelectionNoTest(P=solve(K+1e-6*diag(ncol(K))), ntoselect=4,
             InitPop=NULL, npop=100, nelite=5, mutprob=.5, mutintensity = 1,
             niterations=200, minitbefstop=20, tabu=FALSE, tabumemsize = 0, plotiters=FALSE,
             lambda=1, errorstat="CDMEANMM", mc.cores=1)

par(mfrow=c(1,2), mar=c(1,1,1,1))
labelling1 <- rownames(X) %in% ListTrain1[[1]] + 1
plot(X[,2], X[,3], col=labelling1, pch=2*labelling1, cex=2*(labelling1-1),
     xlab="", ylab="", main="DOPT", cex.main=.7, xaxt='n', yaxt='n')
for (i in -1:1){
  abline(v=i, lty=2)
  abline(h=i, lty=2)
}
labelling2 <- rownames(X) %in% ListTrain2[[1]] + 1
plot(X[,2], X[,3], col=labelling2, pch=2*labelling2, cex=2*(labelling2-1),
     xlab="", ylab="", main="CDMEANMM", cex.main=.7, xaxt='n', yaxt='n')
for (i in -1:1){
  abline(v=i, lty=2)
  abline(h=i, lty=2)
}

######################## D-optimal three-level designs for the second-order
# model in two factors with a square design region

par(mfrow=c(2,2), mar=c(1,1,1,1))
for (ntoselect in c(5,6,7,8)){
  ListTrain <- GenAlgForSubsetSelectionNoTest(P=X, ntoselect=ntoselect, InitPop=NULL,
             npop=10, nelite=3, mutprob=.5, mutintensity = 1,
             niterations=200, minitbefstop=200, tabu=FALSE, tabumemsize = 0, plotiters=FALSE,
             lambda=1e-9, errorstat="DOPT", mc.cores=1)

  labelling <- rownames(X) %in% ListTrain[[1]] + 1
  plot(as.numeric(X[,2]), as.numeric(X[,3]), col=labelling, pch=2*labelling,
       cex=2*(labelling-1), xlab="", ylab="", main="DOPT",
       cex.main=.7, xaxt='n', yaxt='n')
  for (i in -1:1){
    abline(v=i, lty=2)
    abline(h=i, lty=2)
  }
}

par(mfrow=c(1,1))

