The work you have conducted in the laboratory so far has focused on computing and visualising density and discriminant functions. You can use the programs you have made in these exercises to make a classifier.

Make a program to be called by giving the command classify. Let the input be the training data (Y) and the data to be classified (X).

- It might be convenient to let Y be a list with M cells, where Y[i] corresponds to the training data from class ω_i, i = 1, ..., M.
- It might also be convenient to let X be a data matrix containing the feature vectors to be classified.

Upon calling the function, it shall return:

- g, containing the discriminant function values for each feature vector in X. It might be convenient to let g be a matrix so that row number i in g, g[i,:], corresponds to the discriminant function values g_i(x), where i = 1, ..., M.
- C, indicating the classification for each feature vector in X.

c) Use an ML-classifier:

- Classify the training set (reclassification).
- Classify the test set.
- Compute the error rate (P(error) = 1 - A) and the ratio of correct classifications for each of the classes, P(correct | ω_i) = R_i, i = 1, 2, ..., M. It is recommended to determine the confusion matrix and use this as a basis to determine these performance metrics.

The following listing shows suggested code for the reclassification and the testing:

    for k in range(0, M):
        gx[k], Cx[k], pxwx[k], Pwx = classify(Y, X[k], met, discr, prm)
        gy[k], Cy[k], pxwy[k], Pwy = classify(Y, Y[k], met, discr, prm)

The variables generated in these iterations are available in the file lab4_data.p. If you want to check your pdfs, discriminant functions and classifications, the variables can be loaded according to the following code:

    import pickle

    pfile = 'lab4_data.p'
    with open(pfile, "rb") as fp:
        pxwx, Pwx, gx, Cx, CNx, pxwy, Pwy, gy, Cy, CNy = pickle.load(fp)
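As a rough illustration of the classify interface described above, the following is a minimal sketch of an ML classifier with Gaussian class-conditional densities and equal priors, using the log-likelihood as discriminant. The function name classify_ml and its reduced argument list are assumptions for this sketch; the lab's actual classify takes additional arguments (met, discr, prm) and returns the density values and priors as well.

```python
import numpy as np

def classify_ml(Y, X):
    """Sketch of an ML classifier with assumed Gaussian class densities.
    Y : list of M arrays, Y[i] of shape (n_i, d) -- training data for class i
    X : array of shape (n, d)                    -- feature vectors to classify
    Returns g (M x n discriminant values) and C (class index per vector)."""
    X = np.atleast_2d(np.asarray(X, dtype=float))
    M = len(Y)
    g = np.empty((M, X.shape[0]))
    for i, Yi in enumerate(Y):
        Yi = np.atleast_2d(np.asarray(Yi, dtype=float))
        mu = Yi.mean(axis=0)                      # ML estimate of the class mean
        S = np.atleast_2d(np.cov(Yi, rowvar=False, bias=True))  # ML covariance
        d = X - mu
        Sinv = np.linalg.inv(S)
        # log-likelihood discriminant g_i(x); constants kept for clarity
        g[i] = (-0.5 * np.einsum('nj,jk,nk->n', d, Sinv, d)
                - 0.5 * np.log(np.linalg.det(S))
                - 0.5 * X.shape[1] * np.log(2 * np.pi))
    C = np.argmax(g, axis=0)  # assign each x to the class with the largest g_i(x)
    return g, C
```

Because the priors are assumed equal, maximising the log-likelihood here is equivalent to maximising the posterior; with unequal priors a log-prior term would be added to each g_i.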
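The recommended route via the confusion matrix can be sketched as follows. This is an illustrative helper, not part of the handout: the function name and the use of NumPy are assumptions, and the labels are taken to be integer class indices 0, ..., M-1.

```python
import numpy as np

def performance_metrics(true_labels, predicted_labels, M):
    """Build an M x M confusion matrix and derive the error rate
    P(error) = 1 - A and the per-class correct ratios R_i from it."""
    N = np.zeros((M, M), dtype=int)
    for t, p in zip(true_labels, predicted_labels):
        N[t, p] += 1                      # row = true class, column = assigned class
    A = np.trace(N) / N.sum()             # overall ratio of correct classifications
    P_error = 1.0 - A                     # error rate
    R = np.diag(N) / N.sum(axis=1)        # R_i = P(correct | w_i)
    return N, P_error, R

# Example: three classes with one misclassified sample in class 0 and one in class 2
N, P_error, R = performance_metrics([0, 0, 1, 1, 2, 2], [0, 1, 1, 1, 2, 0], 3)
```

Rows of N sum to the number of samples in each true class, so both the overall and the per-class metrics fall directly out of the matrix.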