Perform the training step of the localised multiple kernel k-means.

lmkkmeans_missingData(Km, parameters, missing = NULL, verbose = FALSE)

Arguments

Km

Array of size N x N x M containing M different N x N kernel matrices.

parameters

A list of parameters containing the desired number of clusters, cluster_count, and the number of iterations of the algorithm to be run, iteration_count.

missing

Matrix of size N x M containing missingness indicators, i.e. missing[i,j] = 1 (or = TRUE) if observation i is missing in dataset j, and missing[i,j] = 0 (or = FALSE) otherwise. A sketch of how such a matrix can be built follows the argument list.

verbose

Boolean flag. If TRUE, the iteration number is printed at each iteration. Defaults to FALSE.
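
For illustration, here is a minimal sketch of how the missing indicator matrix could be derived from the data. The list datasets and its contents are hypothetical (not part of the klic package), and a missing observation is assumed to be stored as a row of NAs.

# A hypothetical list of M = 2 data matrices over the same N = 100 observations,
# where a missing observation is stored as a row of NAs
set.seed(1)
datasets <- list(matrix(rnorm(100 * 5), 100, 5),
                 matrix(rnorm(100 * 5), 100, 5))
datasets[[1]][76:80, ] <- NA # observations 76 to 80 are missing in dataset 1
# Flag an observation as missing in a dataset when its whole row is NA
missing <- sapply(datasets, function(d) apply(is.na(d), 1, all))
# "missing" is now a 100 x 2 logical matrix of the form expected by
# lmkkmeans_missingData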

Value

This function returns a list containing:

clustering

the cluster labels for each element (i.e. row/column) of the kernel matrix.

objective

the value of the objective function for the given clustering.

parameters

the same parameters as in the input.

Theta

N x M matrix of weights, where each row corresponds to an observation and each column to one of the kernels.
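
In the output of the Examples below, each row of Theta sums to one and a kernel in which an observation is flagged as missing receives zero weight for that observation. A minimal sketch of these two sanity checks (assuming state and missing as defined in the Examples; not part of the package API):

# Each row of Theta lies on the simplex, up to numerical tolerance
stopifnot(all(abs(rowSums(state$Theta) - 1) < 1e-6))
# Missing (observation, kernel) pairs receive zero weight
stopifnot(all(state$Theta[missing] == 0))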

References

Gonen, M. and Margolin, A.A., 2014. Localized data fusion for kernel k-means clustering with application to cancer biology. In Advances in Neural Information Processing Systems (pp. 1305-1313).

Examples

if (requireNamespace("Rmosek", quietly = TRUE) &&
    (!is.null(utils::packageDescription("Rmosek")$Configured.MSK_VERSION))) {

  # Initialise a 100 x 100 x 3 array containing M kernel matrices
  # representing three different types of similarities between 100 data points
  km <- array(NA, c(100, 100, 3))

  # Load kernel matrices
  km[, , 1] <- as.matrix(read.csv(system.file('extdata', 'kernel_matrix1.csv',
                                              package = 'klic'), row.names = 1))
  km[, , 2] <- as.matrix(read.csv(system.file('extdata', 'kernel_matrix2.csv',
                                              package = 'klic'), row.names = 1))
  km[, , 3] <- as.matrix(read.csv(system.file('extdata', 'kernel_matrix3.csv',
                                              package = 'klic'), row.names = 1))

  # Introduce some missing data
  km[76:80, , 1] <- NA
  km[, 76:80, 1] <- NA

  # Define missingness indicators
  missing <- matrix(FALSE, 100, 3)
  missing[76:80, 1] <- TRUE

  # Initialise the parameters of the algorithm
  parameters <- list()
  # Set the number of clusters
  parameters$cluster_count <- 4
  # Set the number of iterations
  parameters$iteration_count <- 10

  # Perform training
  state <- lmkkmeans_missingData(km, parameters, missing)

  # Display the clustering
  print(state$clustering)
  # Display the kernel weights
  print(state$Theta)
}
#>  [1] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 3 3 3 3 3 3 3 3 3 3 3 3
#> [38] 3 3 3 3 3 3 3 3 3 3 3 3 3 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4
#> [75] 4 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
#>              [,1]      [,2]      [,3]
#>   [1,] 0.40672275 0.1865545 0.4067227
#>   [2,] 0.40666028 0.1866794 0.4066603
#>   [3,] 0.40695669 0.1860866 0.4069567
#>   [4,] 0.40660061 0.1867988 0.4066006
#>   [5,] 0.40649808 0.1870038 0.4064981
#>   [6,] 0.40670968 0.1865806 0.4067097
#>   [7,] 0.40648906 0.1870219 0.4064891
#>   [8,] 0.40702505 0.1859499 0.4070250
#>   [9,] 0.40697294 0.1860541 0.4069729
#>  [10,] 0.40662856 0.1867429 0.4066286
#>  [11,] 0.40659128 0.1868174 0.4065913
#>  [12,] 0.40669909 0.1866018 0.4066991
#>  [13,] 0.40646423 0.1870715 0.4064642
#>  [14,] 0.40656469 0.1868706 0.4065647
#>  [15,] 0.41512584 0.1697483 0.4151258
#>  [16,] 0.40662235 0.1867553 0.4066223
#>  [17,] 0.40772361 0.1845528 0.4077236
#>  [18,] 0.40647364 0.1870527 0.4064736
#>  [19,] 0.40623828 0.1875234 0.4062383
#>  [20,] 0.40638003 0.1872399 0.4063800
#>  [21,] 0.44384103 0.1123179 0.4438410
#>  [22,] 0.40645954 0.1870809 0.4064595
#>  [23,] 0.44392508 0.1121498 0.4439251
#>  [24,] 0.40649913 0.1870017 0.4064991
#>  [25,] 0.40629904 0.1874019 0.4062990
#>  [26,] 0.33597988 0.2823297 0.3816904
#>  [27,] 0.33606356 0.2822568 0.3816796
#>  [28,] 0.33470201 0.2853028 0.3799952
#>  [29,] 0.33511353 0.2840370 0.3808495
#>  [30,] 0.33557284 0.2830752 0.3813519
#>  [31,] 0.33570767 0.2830492 0.3812431
#>  [32,] 0.33478880 0.2851379 0.3800733
#>  [33,] 0.33442529 0.2851717 0.3804030
#>  [34,] 0.33637741 0.2820708 0.3815518
#>  [35,] 0.33456387 0.2853118 0.3801243
#>  [36,] 0.33409531 0.2858317 0.3800730
#>  [37,] 0.33484952 0.2847761 0.3803744
#>  [38,] 0.33500073 0.2848965 0.3801028
#>  [39,] 0.33461109 0.2850556 0.3803333
#>  [40,] 0.33503618 0.2840236 0.3809402
#>  [41,] 0.33516537 0.2837793 0.3810554
#>  [42,] 0.33628993 0.2823677 0.3813424
#>  [43,] 0.33595344 0.2827744 0.3812722
#>  [44,] 0.34417225 0.2670324 0.3887953
#>  [45,] 0.33462735 0.2850593 0.3803134
#>  [46,] 0.34066339 0.2739647 0.3853719
#>  [47,] 0.29911401 0.3066074 0.3942785
#>  [48,] 0.33510738 0.2842602 0.3806325
#>  [49,] 0.33527472 0.2837491 0.3809761
#>  [50,] 0.33696613 0.2809975 0.3820364
#>  [51,] 0.40017926 0.2511098 0.3487110
#>  [52,] 0.39983296 0.2515151 0.3486519
#>  [53,] 0.39926834 0.2522220 0.3485097
#>  [54,] 0.40174457 0.2502892 0.3479662
#>  [55,] 0.39930014 0.2520298 0.3486701
#>  [56,] 0.45379996 0.1420377 0.4041623
#>  [57,] 0.40069282 0.2511160 0.3481911
#>  [58,] 0.39969985 0.2514075 0.3488927
#>  [59,] 0.39923190 0.2517919 0.3489762
#>  [60,] 0.40322722 0.2462954 0.3504774
#>  [61,] 0.37777090 0.2649744 0.3572547
#>  [62,] 0.40011483 0.2528958 0.3469894
#>  [63,] 0.39911760 0.2516675 0.3492149
#>  [64,] 0.39817772 0.2523312 0.3494911
#>  [65,] 0.42235882 0.3024376 0.2752036
#>  [66,] 0.39528993 0.2540063 0.3507037
#>  [67,] 0.39678192 0.2534449 0.3497732
#>  [68,] 0.42384482 0.2027065 0.3734487
#>  [69,] 0.39995334 0.2520894 0.3479572
#>  [70,] 0.39499630 0.2548522 0.3501515
#>  [71,] 0.39644258 0.2537548 0.3498026
#>  [72,] 0.40007657 0.2515158 0.3484076
#>  [73,] 0.40092179 0.2509848 0.3480935
#>  [74,] 0.39992877 0.2515546 0.3485166
#>  [75,] 0.40012708 0.2513228 0.3485501
#>  [76,] 0.00000000 0.4737076 0.5262924
#>  [77,] 0.00000000 0.4747041 0.5252959
#>  [78,] 0.00000000 0.4743787 0.5256213
#>  [79,] 0.00000000 0.4791501 0.5208499
#>  [80,] 0.00000000 0.4662747 0.5337253
#>  [81,] 0.11267146 0.4204069 0.4669217
#>  [82,] 0.12191505 0.4107110 0.4673739
#>  [83,] 0.11427966 0.4192859 0.4664344
#>  [84,] 0.09726776 0.4258822 0.4768500
#>  [85,] 0.11402477 0.4199445 0.4660307
#>  [86,] 0.11362549 0.4202083 0.4661662
#>  [87,] 0.11466709 0.4190349 0.4662981
#>  [88,] 0.11365349 0.4198557 0.4664908
#>  [89,] 0.11361817 0.4201960 0.4661859
#>  [90,] 0.11350702 0.4199459 0.4665471
#>  [91,] 0.11683701 0.4210964 0.4620666
#>  [92,] 0.11324444 0.4197055 0.4670500
#>  [93,] 0.12036851 0.4123777 0.4672538
#>  [94,] 0.11442195 0.4190429 0.4665352
#>  [95,] 0.11447061 0.4194038 0.4661256
#>  [96,] 0.11388355 0.4199394 0.4661770
#>  [97,] 0.11368076 0.4203100 0.4660093
#>  [98,] 0.11384421 0.4200016 0.4661542
#>  [99,] 0.11354357 0.4203640 0.4660924
#> [100,] 0.11398819 0.4197987 0.4662131
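
Note that the first column of Theta is exactly zero for observations 76 to 80, which were flagged as missing in the first kernel: their weight is redistributed across the two remaining kernels.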