Surprisal Driven k-NN for Robust and Interpretable Nonparametric Learning
Abstract
Nonparametric learning is a fundamental concept in machine learning that aims to capture complex patterns and relationships in data without making strong assumptions about the underlying data distribution. Owing to its simplicity and familiarity, one of the most well-known algorithms under this paradigm is the k-nearest neighbors (k-NN) algorithm. Driven by the use of machine learning in safety-critical applications, in this work we shed new light on the traditional nearest neighbors algorithm from the perspective of information theory and propose a robust and interpretable framework for tasks such as classification, regression, density estimation, and anomaly detection using a single model. We can determine data point weights as well as feature contributions by calculating the conditional entropy of adding a feature, without the need for explicit model training. This provides detailed data point influence weights with perfect attribution and can be used to query counterfactuals. Instead of using a traditional distance measure, which needs to be scaled and contextualized, we use a novel formulation of surprisal (the amount of information required to explain the difference between the observed and expected result). Finally, our work showcases the architecture's versatility by achieving state-of-the-art results in classification and anomaly detection while also attaining competitive results for regression across a statistically significant number of datasets.
1 Introduction
Nonparametric methods, such as k-Nearest Neighbors (k-NN), have been studied and applied in various domains of statistics and machine learning. Unlike parametric models, nonparametric methods do not rely on a fixed number of parameters or make strict distributional assumptions about the underlying data. This allows such algorithms to flexibly adapt to different types of data and capture intricate structures. First proposed by Fix & Hodges (1951) and Cover & Hart (1967), k-NN has seen several modifications and evolutions over the past decades (Aha et al., 1991; Wang et al., 2009; Hastie et al., 2009; Alpaydın, 1999). Despite these advancements, k-NN still has some disadvantages. For example, the curse of dimensionality (Hastie et al., 2009; Indyk & Motwani, 1998; Schuh et al., 2013; Tao et al., 2009), the selection of a distance metric (Prasath et al., 2017), and imbalanced datasets (He & Garcia, 2009) all present significant challenges to k-NN.
In this work, we propose methods to enhance k-NN to address these issues, derive new concepts driven by entropy, and then demonstrate the performance of this enhanced k-NN on various applications. Using our methodology, we are able to improve the performance of k-NN while retaining its natural interpretability. Additionally, these improvements allow us to understand the importance of features and weigh them accordingly in the model's decision making, thereby improving interpretability further. First, we derive the Łukaszyk–Karmowski (LK) distance (Łukaszyk, 2003, 2004) for Laplace distributions to prevent distances of zero based on uncertainty. To our knowledge, this is the first published derivation of that result. Second, we show how Inverse Residual Weighting (IRW) can be used to move our distance measurements into surprisal space. Then, we introduce the concept of conviction: a ratio of expected surprisal to observed surprisal. This is further broken down into familiarity conviction, similarity conviction, and residual conviction. Finally, we show how these methods and concepts can be used to achieve near or above state-of-the-art results on classification, regression, and anomaly detection tasks.
2 Related Work
In recent years, k-NN-based methods have grown in popularity in Natural Language Processing (NLP) and Computer Vision (CV). In NLP, variants of k-NN have been used for machine translation tasks (Khandelwal et al., 2021; Jiang et al., 2022; Meng et al., 2022). In (Khandelwal et al., 2021), the authors propose carrying out translation using a large database of pre-translated sentences or phrases as a reference; during translation, the system searches for the most similar sentences in the database to produce the translation output. (Meng et al., 2022) built on top of this work by proposing an efficient indexing scheme to organize the reference database, enabling faster search and retrieval of the nearest neighbors. This indexing scheme reduces the computational complexity of the translation process and improves overall efficiency. Another line of work that has gained attention in the recent past focuses on low-resource text classification using k-NN on compressed text data (Jiang et al., 2023). Other methods use k-NN as an auxiliary model on intermediate representations of neural networks for filtering samples. (Bahri et al., 2020) proposed a version of the k-NN algorithm called Deep k-NN. This algorithm incorporates the principles of k-NN into deep learning architectures, specifically convolutional neural networks (CNNs), to effectively handle noisy labels. The work of (Papernot & McDaniel, 2018) proposes a new layer, referred to as the confidence layer, which captures the confidence of the network's predictions. This layer measures the agreement of the predictions of the deep neural network based on the nearest neighbors to detect nonconformal, out-of-distribution instances. (Papernot & McDaniel, 2018) highlights the need for interpretable deep learning models, especially in domains where model transparency, explainability, and robustness are critical. Beyond its popularity in CV and NLP, k-NN continues to be a favored approach for classification and regression tasks on tabular and categorical datasets.
2.1 Anomaly Detection
Anomaly detection, also known as outlier detection, is a field in data analysis and machine learning that focuses on identifying instances that deviate significantly from the expected behavior of a given dataset. In recent years, the utilization of anomaly detection techniques has expanded across a diverse range of domains. These methods have found application in detecting fraudulent activities within the credit card, insurance, and healthcare sectors, as well as identifying intrusions in cyber-security and pinpointing faults in safety-critical systems. Some of the well-known methods for anomaly detection on tabular data include (Liu et al., 2008; Li et al., 2022; He et al., 2003; Li et al., 2003; Breunig et al., 2000) and (Ruff et al., 2018).

Traditional methods like Isolation Forests randomly select features and split points to recursively partition data points into subsets, marking points with short path lengths, which are effectively 'isolated', as anomalies. This approach is an ensemble method. Isolation Forests assume that anomalies are sparse and can be isolated easily; if the dataset contains clusters of anomalies that are not well separated from normal instances, the algorithm's performance may suffer. There are also probabilistic methods: ECOD (Unsupervised Outlier Detection Using Empirical Cumulative Distribution Functions) uses empirical distribution functions to rank the data points and assigns probabilities based on their order, while CBLOF (Clustering-Based Local Outlier Factor) assesses the local density deviation of a data point with respect to its neighbors to identify outliers and is a proximity-based outlier detection method. While ECOD performs well on data with unimodal distributions, it is sensitive to noise in the feature dimensions and may encounter difficulties in accurately identifying outliers in datasets with multiple modes. Additionally, it may face challenges in high-dimensional scenarios, as it focuses on one-dimensional projections. CBLOF, on the other hand, focuses on local clusters, potentially missing the global context of the dataset. It necessitates a priori specification of the number of clusters, posing a challenge, and unevenly sized clusters may impede its ability to distinguish between normal and outlier instances, particularly within smaller clusters.

Recently there have been deep learning based approaches such as DeepSVDD, which projects high-dimensional data into a latent space using an autoencoder architecture. Anomalies are detected by measuring the distance of data points to the center of a constructed hypersphere in the latent space, employing a threshold to flag anomalous cases based on this distance. Methods that rely on training deep neural networks may exhibit sensitivity to noisy labels when mislabeled instances are present in the training data. This sensitivity can have adverse effects on the model's generalization and its accuracy in detecting outliers. Additionally, these approaches often lack interpretability, making it challenging to understand the rationale behind classifying certain instances as outliers. Furthermore, their performance is often contingent on careful tuning of the neural network's hyperparameters.
3 Methods
In this section, we introduce the methods through which we enhance k-NN, incorporating a novel distance measure and a feature weighting approach, enabling the utilization of innovative techniques and contributing to enhanced performance. Through the application of these methods to an instance-based k-NN foundation, we leverage the inherent interpretability of the architecture, augmenting it with the following methods and concepts. For instance, using the formulae provided below, each decision that a model makes can be traced to the individual cases that influenced it. Additionally, the concepts that are introduced are human oriented in terms of both the simplicity of the math involved and the relationship of the various measures to the model itself.
3.1 Distance Metric
Like many other approaches to k-NN, we use the Minkowski distance as a starting point:
$$ d(x, y) = \left( \sum_{f \in F} w_f \, |x_f - y_f|^{p} \right)^{\frac{1}{p}} \tag{1} $$
where $p$ is the parameter for the Lebesgue space, $F$ is the feature set, and $w_f$ is the weight for each feature $f$. One problem with this distance metric, however, is that distinguishing points becomes more and more difficult in higher dimensions. One proposed solution is to use a fractional norm heading towards zero to enable points to be distinguished more easily in high dimensional space (Aggarwal et al., 2001). Motivated by this, we take the limit of the Minkowski distance as $p \to 0$, expressed over the feature set as
$$ d(x, y) = \prod_{f \in F} |x_f - y_f|^{w_f}, \tag{2} $$
assuming that the weights sum to 1.
The above is a geometric mean, which has the useful property of being scale invariant. This derivation presents a problem, however: if any of the feature differences are zero, the entire distance metric becomes zero. In order to solve this problem, we use the Łukaszyk–Karmowski metric as a distance term rather than the absolute error. Given two random variables $X$ and $Y$ with probability density functions $f(x)$ and $g(y)$ respectively, the LK metric is defined as
$$ d_{LK}(X, Y) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} |x - y| \, f(x) \, g(y) \, dx \, dy. \tag{3} $$
We assume that if both points (say $x$ and $y$) are near enough to be worth determining the distance between them, then the distributions and parameters for the probability density functions should represent the local data. The two simple maximum entropy distributions given a point and an expected distance around the point are the Laplace distribution (double exponential), where the distance is represented as the mean absolute error (MAE), and the Gaussian distribution (normal), where the distance is represented as the standard deviation. We choose the Laplace distribution and derive a closed-form solution of Equation 3 for it. Letting $X \sim \mathrm{Laplace}(x, b)$ and $Y \sim \mathrm{Laplace}(y, b)$, with $b$ being the expected deviation,
$$ d_{LK}(x, y) = |x - y| + e^{-\frac{|x - y|}{b}} \left( \frac{3b}{2} + \frac{|x - y|}{2} \right), \tag{4} $$
the full derivation of which can be found in the appendix.
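To make the closed form concrete, below is a minimal sketch of the per-feature LK distance term and its use inside a weighted Minkowski distance (Equations 1, 4, and 5). This is an illustrative implementation rather than the one used in our experiments, and the function and variable names are our own.

```python
import numpy as np

def lk_laplace(delta, b):
    """Per-feature LK distance term (Equations 4 and 5): the expected distance between
    two Laplace distributions whose means are |delta| apart and whose expected
    deviation is b. Strictly positive for b > 0, even when delta == 0."""
    delta = np.abs(delta)
    return delta + np.exp(-delta / b) * (1.5 * b + 0.5 * delta)

def minkowski_lk(x, y, residuals, weights, p=2.0):
    """Weighted Minkowski distance (Equation 1) built from the LK terms of Equation 5."""
    terms = lk_laplace(np.asarray(x) - np.asarray(y), np.asarray(residuals))
    return np.sum(np.asarray(weights) * terms ** p) ** (1.0 / p)

print(lk_laplace(0.0, 2.0))   # 3.0: identical values still differ by 3b/2
print(lk_laplace(5.0, 2.0))   # ~5.45
```

The nonzero floor of $\frac{3b}{2}$ for identical values is exactly what keeps the geometric-mean formulation of Equation 2 well defined.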
In order to employ this measure for our method, we need a value for $b$. Measurement error may not always be readily available, and it does not take into account the additional error among the relationships within the model. Hence, residuals are calculated for each prediction. The MAE is calculated for each observation using a leave-one-out approach, where instances are removed from the model and each of the held-out instance's features is predicted using the rest of the data; the idea is to quantify the uncertainty in the model. These errors can be aggregated locally or across the entire model to obtain the expected residual, $r_f$, for predicting each feature $f$. This results in a Minkowski distance metric which uses the derived distance term
$$ d_f(x, y) = |x_f - y_f| + e^{-\frac{|x_f - y_f|}{r_f}} \left( \frac{3 r_f}{2} + \frac{|x_f - y_f|}{2} \right). \tag{5} $$
We have found that using the residuals in the k-NN system with the above distance metric, calculating new residuals, and then feeding these back in generally yields convergence of the residual values, with notable convergence after only 3 or 4 iterations. Measuring a distance value for each feature further enables parameterization regarding the type of data a feature holds. For example, nominal data can result in a distance of 1 if the values are not equal and 0 if they are equal. Thus, one-hot encoding, the expansion of nominal values into multiple features, is not needed. Ordinal data can use a distance of 1 between each ordinal type.
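The iterative residual-convergence procedure described above can be sketched as follows; this simplified version uses an ordinary Euclidean k-NN (via scikit-learn, which we assume is available) for the leave-one-out feature predictions instead of the full LK-based metric.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors  # assumed available

def feature_residuals(X, k=5, iterations=4):
    """Iteratively estimate per-feature expected residuals r_f via leave-one-out
    k-NN predictions, re-weighting features by 1/r_f between iterations."""
    r = X.std(axis=0) + 1e-8                       # initial guess for the residuals
    for _ in range(iterations):
        Xw = X / r                                 # inverse-residual feature weighting
        _, idx = NearestNeighbors(n_neighbors=k + 1).fit(Xw).kneighbors(Xw)
        preds = X[idx[:, 1:]].mean(axis=1)         # predict each held-out point from its neighbors
        r = np.abs(X - preds).mean(axis=0) + 1e-8  # updated per-feature MAE residuals
    return r
```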
3.2 Inverse Residual Weighting
Having established a distance metric, we can determine distances between points, but the units of measurement and scales of each feature may be entirely different. We propose Inverse Residual Weighting (IRW), a maximum entropy method of transforming each feature difference into surprisal space so that the entire distance itself becomes an expected surprisal.
Using our assumption that absolute prediction residuals follow an exponential distribution whose mean is the feature residual $r_f$, we can describe the probability of a single prediction residual $R$ being within an observed value $r$ as
$$ P(R \le r) = 1 - e^{-\frac{r}{r_f}}. \tag{6} $$
It then follows that the probability of the prediction residual falling outside this value is
$$ P(R > r) = e^{-\frac{r}{r_f}}, \tag{7} $$
and the surprisal of observing a prediction residual of at least $r$ is
$$ I(r) = -\ln P(R > r) = -\ln e^{-\frac{r}{r_f}} = \frac{r}{r_f}. \tag{8} $$
In light of this, we observe that to find the surprisal of an observed residual, we can simply divide it by the expected feature residual; for example, an observed residual of 2.0 against an expected feature residual of 0.5 carries 4 nats of surprisal. This is the motivation for using IRW, where the inverses of the feature residuals are used as feature weights when computing distances.
As previously described, we are able to compute a residual for each feature as the mean absolute deviation between the observed values and predicted values for the feature. We can express the feature residual as
$$ r_f = \frac{1}{n} \sum_{i=1}^{n} \left| x_{i,f} - \hat{x}_{i,f} \right|, \tag{9} $$
where $x_{i,f}$ represents the value of feature $f$ for case $i$ and $\hat{x}_{i,f}$ represents the prediction for that specific value. This feature residual can then be used to determine the feature weights, $w_f$, which can be expressed as
$$ w_f = \frac{1}{r_f}. \tag{10} $$
Using the inverse of the residual as the weight for each feature allows the distance contributed by each feature to be in the same space as one another. This gives the distance function scale invariance across varying feature types and scales, which solves one common challenge of using nearest neighbors approaches. Additionally, using IRW allows the model to emphasize features with strong relationships and reduce the influence of features that appear to be significantly noisy or generally unpredictable. For models with a designated target feature, feature weights can be further augmented using Mean Decrease in Accuracy (MDA) or similar techniques that attempt to capture the predictive power of a feature. Additionally, we are actively researching methods of incorporating MDA techniques alongside IRW in targetless applications of our methods.
Furthermore, scaling by the inverse residual feature weights enables the system to interpret distances in surprisal space. Being in surprisal space allows us to utilize a maximum entropy assumption and the Laplace distribution to measure observed residuals in terms of surprisal. These surprisal values can then be utilized for various metrics and downstream tasks. Specifically we use these surprisal values to compute surprisal ratios that we refer to as convictions, which is covered in detail in the concepts section.
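A minimal sketch of IRW as described above, with illustrative names of our own choosing: the inverse of each feature's expected residual becomes its weight (Equation 10), and dividing an observed deviation by the expected residual yields its surprisal in nats (Equation 8).

```python
import numpy as np

def irw_weights(expected_residuals):
    """Inverse Residual Weighting (Equation 10): w_f = 1 / r_f."""
    return 1.0 / np.asarray(expected_residuals)

def surprisal(observed_deviation, expected_residual):
    """Surprisal in nats of an observed deviation under the exponential assumption (Equation 8)."""
    return np.abs(observed_deviation) / expected_residual

print(irw_weights([0.5, 2.0, 10.0]))   # [2.  0.5  0.1]
print(surprisal(2.0, 0.5))             # 4.0 nats
```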
4 Concepts
In this section, we introduce human-oriented concepts which enable or enhance interpretable analyses and applications of the above methods to common tasks including classification, regression, feature selection, and anomaly detection. Many of these concepts are naturally understandable, being ratios. Additionally, they provide insight that lends itself naturally to strong performance on many difficult machine learning tasks.
4.1 Distance Contribution
The distance contribution reflects how much distance a point contributes to a graph connecting the nearest neighbors, which is the inverse of the density of points over a unit of distance in the Lebesgue space. The harmonic mean of the distance contribution reflects the inverse of the inverse distance weighting often employed with k-NN, though other techniques may be substituted if inverse distance weighting is not employed. We define the distance contribution as:
$$ \phi(x) = \left( \frac{1}{|K|} \sum_{k \in K} \frac{1}{d(x, k)} \right)^{-1}, \tag{11} $$
where $K$ is the set of nearest neighbors to point $x$ and $d(\cdot, \cdot)$ is the distance function. This is a harmonic mean over the distances to each nearest neighbor. Note that the properties of the previously defined distance metric are useful here to prevent divisions by zero.
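The distance contribution of Equation 11 is simply a harmonic mean, as the short sketch below illustrates; it assumes the distances come from a metric, such as the LK-based one above, that never returns exactly zero.

```python
import numpy as np

def distance_contribution(neighbor_distances):
    """Harmonic mean of a point's distances to its nearest neighbors (Equation 11)."""
    d = np.asarray(neighbor_distances, dtype=float)
    return len(d) / np.sum(1.0 / d)

print(distance_contribution([1.0, 1.0, 2.0]))  # 1.2  (dense region)
print(distance_contribution([4.0, 4.0, 8.0]))  # 4.8  (sparse region)
```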
We can quantify the information needed to express a distance contribution by transforming it into a probability. We begin by selecting the exponential distribution to describe the distribution of residuals as it is the maximum entropy distribution constrained by the first moment. We represent this in typical nomenclature for the exponential distribution using norms.
$$ p(x; \lambda) = \lambda e^{-\lambda x}, \qquad \lambda = \frac{1}{\|r\|_p}. \tag{12} $$
We can directly compare the distance contribution and the p-normed magnitude of the residual. This is because the distance contribution and the norm of the residual are both on the same scale, with the distance contribution being the expected distance of new information that the point adds to the model, and the norm of the residual being the expected distance of deviation. Given the entropy maximizing assumption of the exponential distribution of the distances, we can then determine the probability of a deviation at least as large as the distance contribution, in the form of cumulative residual entropy (Rao et al., 2004), as
$$ P\big(R \ge \phi(x)\big) = e^{-\frac{\phi(x)}{\|r\|_p}}. \tag{13} $$
We then convert the probability to self-information as
$$ I(x) = -\ln P\big(R \ge \phi(x)\big), \tag{14} $$
which simplifies to
$$ I(x) = \frac{\phi(x)}{\|r\|_p}. \tag{15} $$
4.2 Conviction
If we have some form of prior distribution of the data given all of the information observed up to that point, the surprisal is the amount of information gained when we observe a new sample, event, case, or state change and update the prior distribution to form a new posterior distribution after the event. The surprisal of observing an event $x$ of a random variable is defined as $I(x) = -\ln P(x)$. Thus, the conviction, $\pi$, can be expressed as
$$ \pi = \frac{\mathbb{E}[I]}{I(x)}. \tag{16} $$
By computing this ratio for different types of information, we derive several different types of conviction with different uses in various applications: familiarity conviction, similarity conviction, and residual conviction.
4.2.1 Familiarity Conviction
Familiarity conviction is a metric for describing surprisal of points in a model relative to the training data. Consider a data set that has data points at regular intervals, such as a data point for each corner in a grid. Now consider a new point is added that is very close to one of the existing corner points. This new point should be quite easy to predict as it is close to an existing point, making it unsurprising. However, given this grid data, familiarity conviction would indicate a higher surprisal for such a point even though it is easy to label because the point is unusual with regard to the even distribution of the rest of the data points. This new point does not form another corner of the grid. These properties make familiarity conviction valuable for sanitizing data and reducing data as well as extracting patterns and anomalies, as is discussed in other sections.
Familiarity conviction is based on the distance metric described previously. As long as a low or zero value of $p$ is used in the Lebesgue space metric for similarity, familiarity conviction is independent of the scale of the data provided and does not overreact to feature dominance based on feature scale and range. Given a set of points $X$ and an integer $k$, we define the distance contribution probability distribution, $L$, of $X$ to be the set
$$ L = \left\{ \frac{\phi(x_1)}{\sum_{i=1}^{|X|} \phi(x_i)}, \; \ldots, \; \frac{\phi(x_{|X|})}{\sum_{i=1}^{|X|} \phi(x_i)} \right\} \tag{17} $$
for a function $\phi$ that returns the distance contribution. Note that because multiple points may be identical under some circumstances, such points may need special consideration, such as splitting the distance contribution among them. Clearly $L$ is a valid probability distribution; we will use this fact to compute the amount of information in $L$. The point probability of a point $x_i$ is
$$ p(i) = \frac{\phi(x_i)}{\sum_{j} \phi(x_j)}, \tag{18} $$
where the index $i$ is assigned the probability of the indexed point's distance contribution.
We assume the set of random variables that characterizes the discrete distribution of point probabilities, $L$, is the set of $p(i)$. Because we have no additional knowledge of the distribution of points other than that they follow the distribution of the data, we assume $L$ is uniform in expectation, as the distance probabilities have no trend or correlation. Then, the familiarity conviction of a point $x_i \in X$ is defined as
$$ \pi_f(x_i) = \frac{\mathbb{E}_j\!\left[ \mathrm{KL}\!\left( L \,\middle\|\, L_{-j} \right) \right]}{\mathrm{KL}\!\left( L \,\middle\|\, L_{-i} \right)}, \tag{19} $$
where $\mathrm{KL}$ is the Kullback–Leibler divergence and $L_{-i}$ denotes the point-probability distribution in which $x_i$'s contribution is replaced by the expected probability. Since we assume $L$ is uniform, we have that the expected probability is $\mathbb{E}[l_i] = 1/|X|$.
Familiarity conviction is well suited for anomaly detection, particularly at detecting inliers, which would have familiarity conviction significantly smaller than 1. This performance comes at the cost of computational complexity.
4.2.2 Similarity Conviction
Similarity Conviction is another method to evaluate the surprisal of a point in the data relative to the distribution of data that make up the point’s nearest neighbors. Similarity conviction is defined as the expected distance contribution of the point divided by the point’s observed distance contribution. To get the expected distance contribution of a point, the distance contributions of its nearest neighbors are computed and then averaged. Using the local model of the point to get an expected distance contribution gives us a measure of conviction that leverages the contextual information about the sparsity in the local model.
Similarity conviction can be used as a tool to identify anomalies in the data, whether looking for inliers or outliers. Inliers will have an uncharacteristically low distance contribution and consequently high values of similarity conviction. Similarly, outliers will have higher distance contributions than their local model, which gives them low values of similarity conviction. Non-anomalous data should be expected to have similarity conviction values around 1.0, since their observed distance contribution should be close to the expected one. Similarity conviction is less computationally expensive than familiarity conviction, but may not perform as well at identifying certain inliers.
Similarity conviction, $\pi_s$, can be expressed as:
$$ \pi_s(x) = \frac{\mathbb{E}[\phi(x)]}{\phi(x)}. \tag{20} $$
Using the average distance contribution of the local model as the expected distance contribution, $\mathbb{E}[\phi(x)]$ can be expressed as:
$$ \mathbb{E}[\phi(x)] = \frac{1}{|K|} \sum_{k \in K} \phi(k), \tag{21} $$
where $K$ is the set of nearest neighbors of $x$.
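The following sketch computes similarity conviction (Equations 20 and 21) for every point from a precomputed pairwise distance matrix. It is our own simplified illustration, not the implementation used in the experiments.

```python
import numpy as np

def similarity_conviction(D, k=5):
    """Similarity conviction (Equation 20) for every point, given a full pairwise
    distance matrix D with zeros on the diagonal and strictly positive distances
    elsewhere (as guaranteed by the LK-based metric)."""
    neighbors = np.argsort(D, axis=1)[:, 1:k + 1]    # k nearest neighbors, excluding self
    nn_dists = np.take_along_axis(D, neighbors, axis=1)
    phi = k / np.sum(1.0 / nn_dists, axis=1)         # distance contributions (Equation 11)
    expected_phi = phi[neighbors].mean(axis=1)       # local expectation (Equation 21)
    return expected_phi / phi                        # values well below 1 suggest outliers
```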
4.2.3 Residual Conviction
Examining residual conviction provides insight into the model's uncertainty for a feature prediction. Residual conviction is calculated as the expected model residual for a feature divided by the computed prediction residual for that feature. The expected model residual is calculated by taking the mean of the residuals over the local model of nearest neighbors around the predicted feature; thus the residual conviction for feature $f$ of point $x$ is
$$ \pi_r(x, f) = \frac{\frac{1}{|K(x)|} \sum_{k \in K(x)} r_{k,f}}{r_{x,f}}, \tag{22} $$
where $K(x)$ is the set of points in the local model around point $x$, $r_{k,f}$ is the residual of feature $f$ for point $k$, and $r_{x,f}$ is the computed prediction residual of feature $f$ for point $x$. This ratio quantifies the difficulty of an individual case's feature prediction, with prediction certainty decreasing as the conviction approaches 0. In more practical terms, residual conviction serves to characterize how uncertain one or more predictions are relative to how uncertain they are expected to be. This can be used to explain model decisions: if a decision is incorrect but has a residual conviction close to or above 1, then this uncertainty is likely due to uncertainty in the data rather than the model.
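Residual conviction reduces to a simple ratio, as the sketch below shows; the inputs are assumed to be the per-feature residuals of a case's nearest neighbors and the case's own prediction residual.

```python
import numpy as np

def residual_conviction(neighbor_residuals, observed_residual):
    """Residual conviction (Equation 22) for one feature of one case."""
    return np.mean(neighbor_residuals) / observed_residual

# Neighbors' residuals for the feature average 0.8, but this prediction missed by 2.0,
# so the prediction is more uncertain than expected (conviction 0.4).
print(residual_conviction([0.5, 1.0, 0.9], 2.0))
```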
5 Applications
In this section we demonstrate the performance of the above methods and concepts on various machine learning tasks, namely classification, regression, and anomaly detection. In general, we see that k-NN using these enhancements consistently performs near or above the state of the art while maintaining strong interpretability and flexibility.
5.1 Classification and Regression
We conducted a comprehensive series of experimental comparisons on a diverse set of algorithms. We first perform classification and regression across 308 PMLB datasets (Romano et al., 2021), 146 for classification and 162 for regression (kindly refer to the appendix for more information on the datasets used and for additional experiments and details), and compare our approach against gradient boosted trees, traditional k-nearest neighbors, logistic regression (for classification), regularized least squares (for regression), neural networks, random forests, and Light-GBM. Datasets were selected via stratified sampling based on their total number of cells (rows × columns). To ensure robustness and reliability, each classification and regression experiment was repeated 30 times with varying random seeds, and the resulting metrics were averaged for statistical significance. For classification tasks, we present mean accuracy, precision, recall, and Matthews Correlation Coefficient (MCC) as evaluation metrics. For regression, mean $R^2$, mean absolute error (MAE), mean squared error (MSE), and the Spearman coefficient were calculated. The consolidated results are detailed in Table 1 and Table 2, providing a comparative perspective against a diverse set of algorithms. It is worth noting that our proposed method consistently outperforms all other classification algorithms in terms of accuracy and precision, while also demonstrating competitive results in regression.
Table 1: Classification results averaged across the 146 PMLB classification datasets. (Blue values indicate the best performance; brown values indicate the second-best performance.)
Classification | Ours | GB | KNN | LR | NN | RF | LGBM |
---|---|---|---|---|---|---|---|
Mean Accuracy | 82.2278 | 81.9668 | 79.3017 | 79.0799 | 79.7512 | 81.4218 | 81.9154 |
Mean Precision | 0.786554 | 0.774766 | 0.746573 | 0.743946 | 0.732757 | 0.772808 | 0.782817 |
Mean Recall | 0.770464 | 0.764163 | 0.719889 | 0.731404 | 0.736894 | 0.756036 | 0.779027 |
Mean MCC | 0.644526 | 0.628351 | 0.562876 | 0.575105 | 0.584838 | 0.618480 | 0.653397 |
Table 2: Regression results averaged across the 162 PMLB regression datasets. (Blue values indicate the best performance; brown values indicate the second-best performance.)
Regression | Ours | GB | KNN | Linear | NN | RF | LGBM |
---|---|---|---|---|---|---|---|
Mean $R^2$ | 0.857244 | 0.864342 | 0.724857 | 0.509989 | 0.727337 | 0.855368 | 0.818680 |
MAE | 0.841702 | 0.851328 | 1.038063 | 1.975748 | 1.008824 | 0.827424 | 1.069768 |
MSE | 10.168815 | 10.459248 | 12.222883 | 36.569369 | 11.598598 | 9.640429 | 20.453173 |
Spearman coeff. | 0.916272 | 0.925263 | 0.832777 | 0.719865 | 0.821594 | 0.917626 | 0.913498 |
5.2 Anomaly Detection
Using the defined conviction metrics, we can judge whether or not a data point is an anomaly on a standardized scale. To evaluate the accuracy of this method, we present results on anomaly detection on 20 datasets from the Outlier Detection Datasets (ODDS) library (Rayana, 2016); kindly refer to the appendix for dataset-related details. These datasets have ground truth labels indicating which data points are anomalous, which makes them ideal for this analysis. We utilize the previously established method of evaluating the conviction values of each point and compare to the results of many of the popular anomaly detection methods, as shown in Table 3. Specifically, we trained our model by splitting each dataset into two parts: a training set comprising solely inliers, and a test set encompassing both inliers and outliers, with a notable prevalence of inliers. Since the ODDS datasets have ground truth labels for both inliers and outliers, we used these labels to compute F1 scores to measure the performance of the anomaly detection benchmark routine. For our methods, we simply computed the conviction (similarity conviction or familiarity conviction) and compared it to a threshold of 0.7; if the conviction fell below the threshold, the point was classified as an anomaly. In practice we would recommend tuning this threshold per dataset, but here we show that, picking a conviction level of 0.7 for all datasets (without choosing it in a dataset-specific manner), our method achieves the highest scores on 12 of the 20 datasets, surpassing the performance of all other outlier detection methods.
In Table 3, we show the average F1 score for each method across the 20 ODDS datasets. To see the results per dataset, please refer to the appendix.
Table 3: Mean F1 scores across the 20 ODDS datasets. (Bold values indicate the best performance.)
Method | Mean F1 Score |
---|---|
Ours (Familiarity Conviction) | 0.32 |
Ours (Similarity Conviction) | 0.49 |
One Class SVM (Li et al., 2003) | 0.22 |
Isolation Forest (Liu et al., 2008) | 0.38 |
CBLOF (He et al., 2003) | 0.37 |
Local Outlier Factor (Breunig et al., 2000) | 0.19 |
ECOD (Li et al., 2022) | 0.32 |
DeepSVDD (Ruff et al., 2018) | 0.45 |
It is worth noting that certain methodologies, such as CBLOF (Clustering-Based Local Outlier Factor), LOF (Local Outlier Factor), and ECOD (Empirical-Cumulative-distribution-based Outlier Detection), usually incorporate a distinct partition exclusively composed of inliers during the training phase. Though this is not necessary, these methods can benefit from inlier-based training partitions. In contrast, our approach, which harnesses the notion of familiarity conviction, can identify anomalies without necessitating an explicit 'inlier' dataset. This enables us to gauge the uncertainty inherent in our model and promptly identify anomalous instances in real time.
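As a usage example, the sketch below mirrors the evaluation protocol described in this section: fit on inliers, score test points by similarity conviction, and flag those whose conviction falls below the fixed 0.7 threshold. For brevity it uses plain Euclidean distances (via SciPy's cdist) rather than the surprisal-space metric, so it should be read as an illustration of the procedure, not a reproduction of our results.

```python
import numpy as np
from scipy.spatial.distance import cdist  # assumed available

def similarity_conviction_scores(X_train, X_test, k=5):
    """Score test points by similarity conviction against an inlier-only training set."""
    def contributions(D):                        # harmonic mean of the k smallest distances per row
        nn = np.sort(D, axis=1)[:, :k]
        return k / np.sum(1.0 / np.maximum(nn, 1e-12), axis=1)

    D_train = cdist(X_train, X_train)
    np.fill_diagonal(D_train, np.inf)            # exclude self-distances
    phi_train = contributions(D_train)

    D_test = cdist(X_test, X_train)
    phi_test = contributions(D_test)
    neighbors = np.argsort(D_test, axis=1)[:, :k]
    expected_phi = phi_train[neighbors].mean(axis=1)
    return expected_phi / phi_test               # conviction per test point

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 4))                                  # inliers only
X_test = np.vstack([rng.normal(size=(95, 4)),                        # mostly inliers ...
                    rng.normal(loc=6.0, scale=1.0, size=(5, 4))])    # ... plus a few outliers
flagged = similarity_conviction_scores(X_train, X_test) < 0.7        # fixed threshold from above
print(int(flagged.sum()), "points flagged as anomalies")
```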
We demonstrate this on a toy dataset as shown in Figure 1.
Figure 1: A toy training dataset with two observed points (blue and green) used to illustrate anomaly detection via familiarity conviction.
Consider the toy training dataset in Figure 1, sampled from a random variable, in which we observe two points, one blue and one green. Given this data, familiarity conviction allows us to measure how close a point is to the existing data. Using our notion of surprisal, the more isolated of the two points has high surprisal and therefore low conviction; moreover, its distance contribution is higher than the mean distance contribution of the entire dataset. This allows us to detect it as anomalous without requiring a separate inlier partition.
6 Limitations and Future Work
The present methodology, while promising, exhibits certain limitations in terms of the scale of data it can effectively handle. As an extension of the foundational k-Nearest Neighbors framework, this approach requires that the data used for model fitting fit within the memory capacity of a machine. Secondly, the challenge lies in determining the conviction threshold beforehand, as it depends on factors such as the contamination level within the anomalous dataset or the absence of an inlier training set (for similarity conviction). Furthermore, placing additional emphasis on interpretability may introduce a trade-off in classification and regression performance, as the model becomes less reliant on spurious correlations within the data. To address the issue of scale, we have implemented techniques that make querying the dataset more efficient than many standard methods. Specifically, in practice we use an efficient branch-and-bound implementation that makes efficient use of bit vectors to reduce the total compute required for all but the most pathological datasets. Furthermore, we are looking into sampling strategies that best represent the variance of our original data, as well as data ablation techniques which could allow us to store more information in less space by intelligently adjusting the weights of trained cases. We have also begun to probe the robustness of our method against inference-time adversarial attacks. Owing to the lack of gradient-based optimization in our approach and its adeptness at outlier detection tasks, early results have shown significant promise for our method's robustness to out-of-distribution data across tabular as well as image datasets. Although this facet remains part of our future work, early indications of its resilience against such challenges substantiate the potential for our approach to thrive in safety-critical contexts.
7 Discussion and Conclusion
In summary, we propose several enhancements to the traditional k-NN algorithm from the perspective of information theory. In particular, our method utilizes the Łukaszyk–Karmowski (LK) distance tailored to Laplace distributions, effectively mitigating the problem of zero distances predicated on data uncertainty. Furthermore, by leveraging Inverse Residual Weighting (IRW), we convert our distance measurements into surprisal space. Using the notion of surprisal, we define a new concept of conviction, with which we are able to compute interpretable measures of the importance and surprisal of each data point. Finally, these enhancements have increased the effectiveness of k-NN while maintaining its natural interpretability. Since our method utilizes nearest neighbors, it can effectively estimate the underlying density of the data, contributing to its versatility in various statistical and machine learning applications. Unlike traditional methods that rely on post-hoc interpretability tools, our approach directly addresses data and feature uncertainty. By leveraging the aforementioned tools, we can determine data point weights as well as feature contributions by calculating the conditional entropy of adding a feature, without the need for explicit model training. This also allows us to compute MDA and MAE along with feature contributions. Our method provides detailed data point influence weights with perfect attribution and can be used to query counterfactuals. Moreover, by calculating SHAP values over feature sets sampled from the entire feature space, we can obtain a more reliable estimate of SHAP that is robust to multicollinearity and feature order. Lastly, we conducted an extensive analysis on 308 datasets for classification and regression, alongside an additional 20 ODDS datasets for anomaly detection. In conclusion, we see that this approach aligns with human understanding of decision-making from data, such as similarity, difference, and causality, and facilitates a clearer understanding of a model's decision-making process.
References
- Aggarwal et al. (2001) Aggarwal, C. C., Hinneburg, A., and Keim, D. A. On the surprising behavior of distance metrics in high dimensional space. In Database Theory—ICDT 2001: 8th International Conference London, UK, January 4–6, 2001 Proceedings 8, pp. 420–434. Springer, 2001.
- Aha et al. (1991) Aha, D. W., Kibler, D. F., and Albert, M. K. Instance-based learning algorithms. Mach. Learn., 6:37–66, 1991. doi: 10.1023/A:1022689900470. URL https://doi.org/10.1023/A:1022689900470.
- Alpaydın (1999) Alpaydın, E. Voting over multiple condensed nearest neighbors. Artificial Intelligence Review, 11, 09 1999. doi: 10.1023/A:1006563312922.
- Bahri et al. (2020) Bahri, D., Jiang, H., and Gupta, M. Deep k-nn for noisy labels, 2020.
- Breunig et al. (2000) Breunig, M. M., Kriegel, H.-P., Ng, R. T., and Sander, J. Lof: identifying density-based local outliers. In Proceedings of the 2000 ACM SIGMOD international conference on Management of data, pp. 93–104, 2000.
- Cover & Hart (1967) Cover, T. and Hart, P. Nearest neighbor pattern classification. IEEE Transactions on Information Theory, 13(1):21–27, 1967. doi: 10.1109/TIT.1967.1053964.
- Fix & Hodges (1951) Fix, E. and Hodges, J. Discriminatory Analysis: Nonparametric Discrimination: Consistency Properties. USAF School of Aviation Medicine, 1951.
- Hastie et al. (2009) Hastie, T., Tibshirani, R., and Friedman, J. The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Second Edition (Springer Series in Statistics). 02 2009. ISBN 0387848576.
- He & Garcia (2009) He, H. and Garcia, E. A. Learning from imbalanced data. IEEE Transactions on Knowledge and Data Engineering, 21(9):1263–1284, 2009. doi: 10.1109/TKDE.2008.239.
- He et al. (2003) He, Z., Xu, X., and Deng, S. Discovering cluster-based local outliers. Pattern recognition letters, 24(9-10):1641–1650, 2003.
- Indyk & Motwani (1998) Indyk, P. and Motwani, R. Approximate nearest neighbors: Towards removing the curse of dimensionality. In Proceedings of the Thirtieth Annual ACM Symposium on Theory of Computing, STOC ’98, pp. 604–613, New York, NY, USA, 1998. Association for Computing Machinery. ISBN 0897919629. doi: 10.1145/276698.276876. URL https://doi.org/10.1145/276698.276876.
- Jiang et al. (2022) Jiang, H., Lu, Z., Meng, F., Zhou, C., Zhou, J., Huang, D., and Su, J. Towards robust k-nearest-neighbor machine translation. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pp. 5468–5477, Abu Dhabi, United Arab Emirates, December 2022. Association for Computational Linguistics. URL https://aclanthology.org/2022.emnlp-main.367.
- Jiang et al. (2023) Jiang, Z., Yang, M., Tsirlin, M., Tang, R., Dai, Y., and Lin, J. “low-resource” text classification: A parameter-free classification method with compressors. In Findings of the Association for Computational Linguistics: ACL 2023, pp. 6810–6828, Toronto, Canada, July 2023. Association for Computational Linguistics. doi: 10.18653/v1/2023.findings-acl.426. URL https://aclanthology.org/2023.findings-acl.426.
- Khandelwal et al. (2021) Khandelwal, U., Fan, A., Jurafsky, D., Zettlemoyer, L., and Lewis, M. Nearest neighbor machine translation. In International Conference on Learning Representations, 2021. URL https://openreview.net/forum?id=7wCBOfJ8hJM.
- Li et al. (2003) Li, K.-L., Huang, H.-K., Tian, S.-F., and Xu, W. Improving one-class svm for anomaly detection. In Proceedings of the 2003 international conference on machine learning and cybernetics (IEEE Cat. No. 03EX693), volume 5, pp. 3077–3081. IEEE, 2003.
- Li et al. (2022) Li, Z., Zhao, Y., Hu, X., Botta, N., Ionescu, C., and Chen, G. Ecod: Unsupervised outlier detection using empirical cumulative distribution functions. IEEE Transactions on Knowledge and Data Engineering, 2022.
- Liu et al. (2008) Liu, F. T., Ting, K. M., and Zhou, Z.-H. Isolation forest. In 2008 eighth ieee international conference on data mining, pp. 413–422. IEEE, 2008.
- Meng et al. (2022) Meng, Y., Li, X., Zheng, X., Wu, F., Sun, X., Zhang, T., and Li, J. Fast nearest neighbor machine translation. In Findings of the Association for Computational Linguistics: ACL 2022, pp. 555–565, Dublin, Ireland, May 2022. Association for Computational Linguistics. doi: 10.18653/v1/2022.findings-acl.47. URL https://aclanthology.org/2022.findings-acl.47.
- Papernot & McDaniel (2018) Papernot, N. and McDaniel, P. Deep k-nearest neighbors: Towards confident, interpretable and robust deep learning, 2018.
- Prasath et al. (2017) Prasath, V. B. S., Alfeilat, H. A. A., Lasassmeh, O., and Hassanat, A. B. A. Distance and similarity measures effect on the performance of k-nearest neighbor classifier - A review. CoRR, abs/1708.04321, 2017. URL http://arxiv.org/abs/1708.04321.
- Rao et al. (2004) Rao, M., Chen, Y., Vemuri, B., and Wang, F. Cumulative residual entropy: a new measure of information. IEEE Transactions on Information Theory, 50(6):1220–1228, 2004. doi: 10.1109/TIT.2004.828057.
- Rayana (2016) Rayana, S. Odds library, 2016. URL https://odds.cs.stonybrook.edu.
- Romano et al. (2021) Romano, J. D., Le, T. T., La Cava, W., Gregg, J. T., Goldberg, D. J., Chakraborty, P., Ray, N. L., Himmelstein, D., Fu, W., and Moore, J. H. Pmlb v1.0: an open source dataset collection for benchmarking machine learning methods. arXiv preprint arXiv:2012.00058v2, 2021.
- Ruff et al. (2018) Ruff, L., Vandermeulen, R., Goernitz, N., Deecke, L., Siddiqui, S. A., Binder, A., Müller, E., and Kloft, M. Deep one-class classification. In International conference on machine learning, pp. 4393–4402. PMLR, 2018.
- Schuh et al. (2013) Schuh, M. A., Wylie, T., and Angryk, R. A. Improving the performance of high-dimensional knn retrieval through localized dataspace segmentation and hybrid indexing. In Catania, B., Guerrini, G., and Pokorný, J. (eds.), Advances in Databases and Information Systems, pp. 344–357, Berlin, Heidelberg, 2013. Springer Berlin Heidelberg.
- Tao et al. (2009) Tao, Y., Yi, K., Sheng, C., and Kalnis, P. Quality and efficiency in high dimensional nearest neighbor search. In Proceedings of the 2009 ACM SIGMOD International Conference on Management of Data, SIGMOD ’09, pp. 563–576, New York, NY, USA, 2009. Association for Computing Machinery. ISBN 9781605585512. doi: 10.1145/1559845.1559905. URL https://doi.org/10.1145/1559845.1559905.
- Wang et al. (2009) Wang, Q., Kulkarni, S., and Verdú, S. Divergence estimation for multidimensional densities via k-nearest-neighbor distances. IEEE Transactions on Information Theory, 55:2392–2405, 06 2009. doi: 10.1109/TIT.2009.2016060.
- Łukaszyk (2003) Łukaszyk, S. Probability metric, examples of approximation applications in experimental mechanics. PhD thesis, 01 2003.
- Łukaszyk (2004) Łukaszyk, S. A new concept of probability metric and its applications in approximation of scattered data sets. Computational Mechanics, 33:299–304, 03 2004. doi: 10.1007/s00466-003-0532-2.
Appendix A Derivation of the Łukaszyk–Karmowski (LK) Distance with Laplace Distributions
To prove Equation 4 of our work, we begin with the expected distance between two random variables $X$ and $Y$ given two probability density functions, $f(x)$ and $g(y)$, as
$$ d_{LK}(X, Y) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} |x - y| \, f(x) \, g(y) \, dx \, dy. \tag{23} $$
Using two Laplace distributions with means $\mu_x$ and $\mu_y$ and expected distances from the mean $b_x$ and $b_y$, we can express the probability density functions as
$$ f(x) = \frac{1}{2 b_x} e^{-\frac{|x - \mu_x|}{b_x}} \tag{24} $$
and
$$ g(y) = \frac{1}{2 b_y} e^{-\frac{|y - \mu_y|}{b_y}}, \tag{25} $$
respectively.
Substituting the Laplace distributions into the expected distance, and further assuming that $b_x = b_y$ and using $b$ in place of both (which assumes that the error is the same throughout the space), we can simplify to
$$ d_{LK}(X, Y) = \frac{1}{4 b^2} \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} |x - y| \, e^{-\frac{|x - \mu_x|}{b}} \, e^{-\frac{|y - \mu_y|}{b}} \, dx \, dy. \tag{26} $$
Because we only have one value for $b$, we can assume that $\mu_x \le \mu_y$ without loss of generality, because we can simply exchange the values if this is not true; at the end we will adjust the formula to remove this assumption. For each integration variable, there exist 3 regions of the space, delimited by $\mu_x$ and $\mu_y$, over which the absolute values in Equation 26 can be rewritten without the absolute value signs.
A.1 Evaluating Equation 26 by Region
Equation 26 is evaluated piecewise over each combination of the regions of the $(x, y)$ space delimited by $\mu_x$ and $\mu_y$: in each region the absolute values are rewritten without the absolute value signs, the resulting integral is evaluated, and the region results are then summed.
A.2 Combining the Parts
Combining each of the probability-weighted distances, under the assumption $\mu_x \le \mu_y$, gives
$$ d_{LK}(X, Y) = (\mu_y - \mu_x) + e^{-\frac{\mu_y - \mu_x}{b}} \left( \frac{3b}{2} + \frac{\mu_y - \mu_x}{2} \right). $$
To remove the assumption that $\mu_x \le \mu_y$, we can rewrite this result as
$$ d_{LK}(X, Y) = |\mu_x - \mu_y| + e^{-\frac{|\mu_x - \mu_y|}{b}} \left( \frac{3b}{2} + \frac{|\mu_x - \mu_y|}{2} \right). \tag{27} $$
This completes the derivation of LK distance with Laplace Distributions.
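As an independent sanity check of Equation 27 (our own verification, not part of the derivation above), the closed form can be compared against a Monte Carlo estimate of $\mathbb{E}|X - Y|$ for Laplace-distributed samples:

```python
import numpy as np

def lk_laplace_closed_form(mu_x, mu_y, b):
    """Closed-form LK distance for two Laplace distributions sharing scale b (Equation 27)."""
    delta = abs(mu_x - mu_y)
    return delta + np.exp(-delta / b) * (1.5 * b + 0.5 * delta)

rng = np.random.default_rng(0)
mu_x, mu_y, b = 1.0, 2.5, 0.8
x = rng.laplace(mu_x, b, size=1_000_000)
y = rng.laplace(mu_y, b, size=1_000_000)
print(np.mean(np.abs(x - y)))                  # Monte Carlo estimate of E|X - Y|, ~1.80
print(lk_laplace_closed_form(mu_x, mu_y, b))   # closed form, ~1.80
```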
Appendix B Information about the Benchmarked Algorithms
- Gradient Boosted Trees: K-fold cross-validation was carried out. A grid search was performed over the number of estimators.
- Traditional KNN: K-fold cross-validation was carried out. A grid search was performed over the number of neighbors $k$ and the value of $p$ in the $L^p$ norm. Note that the search space for the number of neighbors was picked according to the Fibonacci sequence since it grows at a slower rate than some other sequences (e.g., exponential), which is advantageous when exploring hyperparameter values. It provides a versatile set of values that can adapt to different datasets and problem characteristics, leading to a more diverse exploration of the search space and helping to identify a wider range of potential optimal values.
- Regularized Least Squares: Elastic Net was used in the case of regression. K-fold cross-validation was carried out. A grid search was performed over the scaling ratio of the $\ell_1$ and $\ell_2$ penalties.
- Logistic Regression: K-fold cross-validation was carried out. A grid search was performed over the inverse of the regularization strength on a logarithmic scale. The optimization problem was solved using the stochastic average gradient solver.
- Neural Network: For both classification and regression datasets, the Adam optimizer was used with a batch size of 128 and a fixed learning rate. An internal validation set held out from the training data was used for the early stopping criterion with a capped number of epochs, using swish activations for each hidden layer and dropout with Layer Norm after each hidden layer. The details of the architecture can be found in the table below.
Layer # | Parameter Type | Parameter Size |
---|---|---|
layers 1 | 0.weight | (512, Input Size) |
layers 1 | 0.bias | (512,) |
dropout 1 | 1.weight | (512,) |
dropout 1 | 1.bias | (512,) |
layers 2 | 4.weight | (512, 512) |
layers 2 | 4.bias | (512,) |
dropout 2 | 5.weight | (512,) |
dropout 2 | 5.bias | (512,) |
layers 3 | 8.weight | (512, 512) |
layers 3 | 8.bias | (512,) |
dropout 3 | 9.weight | (512,) |
dropout 3 | 9.bias | (512,) |
Output Layer | weight | (Output Size, 512) |
Output Layer | bias | (Output Size,) |
- Random Forests: K-fold cross-validation was carried out. A grid search was performed over the number of estimators.
- Light-GBM: K-fold cross-validation was carried out. The number of estimators used was 100, with the number of leaves set to 31.
Appendix C Detailed Information about the PMLB Classification Datasets
No. | Dataset Name | Rows | Columns | Rows × Columns |
---|---|---|---|---|
1 | GAMETES_Epistasis_2_Way_20atts_0.1H_EDM_1_1 | 1600 | 21 | 33600 |
2 | GAMETES_Epistasis_2_Way_20atts_0.4H_EDM_1_1 | 1600 | 21 | 33600 |
3 | GAMETES_Epistasis_3_Way_20atts_0.2H_EDM_1_1 | 1600 | 21 | 33600 |
4 | GAMETES_Heterogeneity_20atts_1600_Het_0.4_0.2_… | 1600 | 21 | 33600 |
5 | GAMETES_Heterogeneity_20atts_1600_Het_0.4_0.2_… | 1600 | 21 | 33600 |
6 | Hill_Valley_with_noise | 1212 | 101 | 122412 |
7 | Hill_Valley_without_noise | 1212 | 101 | 122412 |
8 | agaricus_lepiota | 8145 | 23 | 187335 |
9 | allbp | 3772 | 30 | 113160 |
10 | allhyper | 3771 | 30 | 113130 |
11 | allhypo | 3770 | 30 | 113100 |
12 | allrep | 3772 | 30 | 113160 |
13 | analcatdata_aids | 50 | 5 | 250 |
14 | analcatdata_asbestos | 83 | 4 | 332 |
15 | analcatdata_authorship | 841 | 71 | 59711 |
16 | analcatdata_bankruptcy | 50 | 7 | 350 |
17 | analcatdata_boxing1 | 120 | 4 | 480 |
18 | analcatdata_boxing2 | 132 | 4 | 528 |
19 | analcatdata_creditscore | 100 | 7 | 700 |
20 | analcatdata_cyyoung8092 | 97 | 11 | 1067 |
21 | analcatdata_cyyoung9302 | 92 | 11 | 1012 |
22 | analcatdata_dmft | 797 | 5 | 3985 |
23 | analcatdata_fraud | 42 | 12 | 504 |
24 | analcatdata_germangss | 400 | 6 | 2400 |
25 | analcatdata_happiness | 60 | 4 | 240 |
26 | analcatdata_japansolvent | 52 | 10 | 520 |
27 | analcatdata_lawsuit | 264 | 5 | 1320 |
28 | ann_thyroid | 7200 | 22 | 158400 |
29 | appendicitis | 106 | 8 | 848 |
30 | australian | 690 | 15 | 10350 |
31 | auto | 202 | 26 | 5252 |
32 | backache | 180 | 33 | 5940 |
33 | balance_scale | 625 | 5 | 3125 |
34 | banana | 5300 | 3 | 15900 |
35 | biomed | 209 | 9 | 1881 |
36 | breast | 699 | 11 | 7689 |
37 | breast_cancer | 286 | 10 | 2860 |
38 | breast_cancer_wisconsin | 569 | 31 | 17639 |
39 | breast_w | 699 | 10 | 6990 |
40 | buggyCrx | 690 | 16 | 11040 |
41 | bupa | 345 | 6 | 2070 |
42 | calendarDOW | 399 | 33 | 13167 |
43 | car | 1728 | 7 | 12096 |
44 | car_evaluation | 1728 | 22 | 38016 |
45 | cars | 392 | 9 | 3528 |
46 | chess | 3196 | 37 | 118252 |
47 | churn | 5000 | 21 | 105000 |
48 | clean1 | 476 | 169 | 80444 |
49 | cleve | 303 | 14 | 4242 |
50 | cleveland | 303 | 14 | 4242 |
51 | cleveland_nominal | 303 | 8 | 2424 |
52 | cloud | 108 | 8 | 864 |
53 | cmc | 1473 | 10 | 14730 |
54 | colic | 368 | 23 | 8464 |
55 | collins | 485 | 24 | 11640 |
56 | confidence | 72 | 4 | 288 |
57 | contraceptive | 1473 | 10 | 14730 |
58 | corral | 160 | 7 | 1120 |
59 | credit_a | 690 | 16 | 11040 |
60 | credit_g | 1000 | 21 | 21000 |
61 | crx | 690 | 16 | 11040 |
62 | dermatology | 366 | 35 | 12810 |
63 | diabetes | 768 | 9 | 6912 |
64 | dis | 3772 | 30 | 113160 |
65 | ecoli | 327 | 8 | 2616 |
66 | flags | 178 | 44 | 7832 |
67 | flare | 1066 | 11 | 11726 |
68 | german | 1000 | 21 | 21000 |
69 | glass | 205 | 10 | 2050 |
70 | glass2 | 163 | 10 | 1630 |
71 | haberman | 306 | 4 | 1224 |
72 | hayes_roth | 160 | 5 | 800 |
73 | heart_c | 303 | 14 | 4242 |
74 | heart_h | 294 | 14 | 4116 |
75 | heart_statlog | 270 | 14 | 3780 |
76 | hepatitis | 155 | 20 | 3100 |
77 | horse_colic | 368 | 23 | 8464 |
78 | house_votes_84 | 435 | 17 | 7395 |
79 | hungarian | 294 | 14 | 4116 |
80 | hypothyroid | 3163 | 26 | 82238 |
81 | ionosphere | 351 | 35 | 12285 |
82 | iris | 150 | 5 | 750 |
83 | irish | 500 | 6 | 3000 |
84 | kr_vs_kp | 3196 | 37 | 118252 |
85 | krkopt | 28056 | 7 | 196392 |
86 | labor | 57 | 17 | 969 |
87 | led24 | 3200 | 25 | 80000 |
88 | led7 | 3200 | 8 | 25600 |
89 | lupus | 87 | 4 | 348 |
90 | lymphography | 148 | 19 | 2812 |
91 | magic | 19020 | 11 | 209220 |
92 | mfeat_fourier | 2000 | 77 | 154000 |
93 | mfeat_karhunen | 2000 | 65 | 130000 |
94 | mfeat_morphological | 2000 | 7 | 14000 |
95 | mfeat_zernike | 2000 | 48 | 96000 |
96 | mofn_3_7_10 | 1324 | 11 | 14564 |
97 | molecular_biology_promoters | 106 | 58 | 6148 |
98 | monk1 | 556 | 7 | 3892 |
99 | monk2 | 601 | 7 | 4207 |
100 | monk3 | 554 | 7 | 3878 |
101 | movement_libras | 360 | 91 | 32760 |
102 | mushroom | 8124 | 23 | 186852 |
103 | mux6 | 128 | 7 | 896 |
104 | new_thyroid | 215 | 6 | 1290 |
105 | nursery | 12958 | 9 | 116622 |
106 | page_blocks | 5473 | 11 | 60203 |
107 | parity5 | 32 | 6 | 192 |
108 | parity5+5 | 1124 | 11 | 12364 |
109 | pendigits | 10992 | 17 | 186864 |
110 | phoneme | 5404 | 6 | 32424 |
111 | pima | 768 | 9 | 6912 |
112 | postoperative_patient_data | 88 | 9 | 792 |
113 | prnn_crabs | 200 | 8 | 1600 |
114 | prnn_fglass | 205 | 10 | 2050 |
115 | prnn_synth | 250 | 3 | 750 |
116 | profb | 672 | 10 | 6720 |
117 | ring | 7400 | 21 | 155400 |
118 | saheart | 462 | 10 | 4620 |
119 | satimage | 6435 | 37 | 238095 |
120 | schizo | 340 | 15 | 5100 |
121 | segmentation | 2310 | 20 | 46200 |
122 | solar_flare_1 | 315 | 13 | 4095 |
123 | solar_flare_2 | 1066 | 13 | 13858 |
124 | sonar | 208 | 61 | 12688 |
125 | soybean | 675 | 36 | 24300 |
126 | spambase | 4601 | 58 | 266858 |
127 | spect | 267 | 23 | 6141 |
128 | spectf | 349 | 45 | 15705 |
129 | splice | 3188 | 61 | 194468 |
130 | tae | 151 | 6 | 906 |
131 | texture | 5500 | 41 | 225500 |
132 | threeOf9 | 512 | 10 | 5120 |
133 | tic_tac_toe | 958 | 10 | 9580 |
134 | tokyo1 | 959 | 45 | 43155 |
135 | twonorm | 7400 | 21 | 155400 |
136 | vehicle | 846 | 19 | 16074 |
137 | vote | 435 | 17 | 7395 |
138 | vowel | 990 | 14 | 13860 |
139 | waveform_21 | 5000 | 22 | 110000 |
140 | waveform_40 | 5000 | 41 | 205000 |
141 | wdbc | 569 | 31 | 17639 |
142 | wine_quality_red | 1599 | 12 | 19188 |
143 | wine_quality_white | 4898 | 12 | 58776 |
144 | wine_recognition | 178 | 14 | 2492 |
145 | xd6 | 973 | 10 | 9730 |
146 | yeast | 1479 | 9 | 13311 |
Appendix D Detailed Information about the PMLB Regression Datasets
No. | Dataset Name | Rows | Columns | Rows × Columns |
---|---|---|---|---|
1 | 1027_ESL | 488 | 5 | 2440 |
2 | 1028_SWD | 1000 | 11 | 11000 |
3 | 1029_LEV | 1000 | 5 | 5000 |
4 | 1030_ERA | 1000 | 5 | 5000 |
5 | 1089_USCrime | 47 | 14 | 658 |
6 | 1096_FacultySalaries | 50 | 5 | 250 |
7 | 1199_BNG_echoMonths | 17496 | 10 | 174960 |
8 | 192_vineyard | 52 | 3 | 156 |
9 | 197_cpu_act | 8192 | 22 | 180224 |
10 | 210_cloud | 108 | 6 | 648 |
11 | 225_puma8NH | 8192 | 9 | 73728 |
12 | 227_cpu_small | 8192 | 13 | 106496 |
13 | 228_elusage | 55 | 3 | 165 |
14 | 229_pwLinear | 200 | 11 | 2200 |
15 | 294_satellite_image | 6435 | 37 | 238095 |
16 | 4544_GeographicalOriginalofMusic | 1059 | 118 | 124962 |
17 | 503_wind | 6574 | 15 | 98610 |
18 | 505_tecator | 240 | 125 | 30000 |
19 | 519_vinnie | 380 | 3 | 1140 |
20 | 522_pm10 | 500 | 8 | 4000 |
21 | 523_analcatdata_neavote | 100 | 3 | 300 |
22 | 529_pollen | 3848 | 5 | 19240 |
23 | 547_no2 | 500 | 8 | 4000 |
24 | 560_bodyfat | 252 | 15 | 3780 |
25 | 562_cpu_small | 8192 | 13 | 106496 |
26 | 573_cpu_act | 8192 | 22 | 180224 |
27 | 579_fri_c0_250_5 | 250 | 6 | 1500 |
28 | 581_fri_c3_500_25 | 500 | 26 | 13000 |
29 | 582_fri_c1_500_25 | 500 | 26 | 13000 |
30 | 583_fri_c1_1000_50 | 1000 | 51 | 51000 |
31 | 584_fri_c4_500_25 | 500 | 26 | 13000 |
32 | 586_fri_c3_1000_25 | 1000 | 26 | 26000 |
33 | 588_fri_c4_1000_100 | 1000 | 101 | 101000 |
34 | 589_fri_c2_1000_25 | 1000 | 26 | 26000 |
35 | 590_fri_c0_1000_50 | 1000 | 51 | 51000 |
36 | 591_fri_c1_100_10 | 100 | 11 | 1100 |
37 | 592_fri_c4_1000_25 | 1000 | 26 | 26000 |
38 | 593_fri_c1_1000_10 | 1000 | 11 | 11000 |
39 | 594_fri_c2_100_5 | 100 | 6 | 600 |
40 | 595_fri_c0_1000_10 | 1000 | 11 | 11000 |
41 | 596_fri_c2_250_5 | 250 | 6 | 1500 |
42 | 597_fri_c2_500_5 | 500 | 6 | 3000 |
43 | 598_fri_c0_1000_25 | 1000 | 26 | 26000 |
44 | 599_fri_c2_1000_5 | 1000 | 6 | 6000 |
45 | 601_fri_c1_250_5 | 250 | 6 | 1500 |
46 | 602_fri_c3_250_10 | 250 | 11 | 2750 |
47 | 603_fri_c0_250_50 | 250 | 51 | 12750 |
48 | 604_fri_c4_500_10 | 500 | 11 | 5500 |
49 | 605_fri_c2_250_25 | 250 | 26 | 6500 |
50 | 606_fri_c2_1000_10 | 1000 | 11 | 11000 |
51 | 607_fri_c4_1000_50 | 1000 | 51 | 51000 |
52 | 608_fri_c3_1000_10 | 1000 | 11 | 11000 |
53 | 609_fri_c0_1000_5 | 1000 | 6 | 6000 |
54 | 611_fri_c3_100_5 | 100 | 6 | 600 |
55 | 612_fri_c1_1000_5 | 1000 | 6 | 6000 |
56 | 613_fri_c3_250_5 | 250 | 6 | 1500 |
57 | 615_fri_c4_250_10 | 250 | 11 | 2750 |
58 | 616_fri_c4_500_50 | 500 | 51 | 25500 |
59 | 617_fri_c3_500_5 | 500 | 6 | 3000 |
60 | 618_fri_c3_1000_50 | 1000 | 51 | 51000 |
61 | 620_fri_c1_1000_25 | 1000 | 26 | 26000 |
62 | 621_fri_c0_100_10 | 100 | 11 | 1100 |
63 | 622_fri_c2_1000_50 | 1000 | 51 | 51000 |
64 | 623_fri_c4_1000_10 | 1000 | 11 | 11000 |
65 | 624_fri_c0_100_5 | 100 | 6 | 600 |
66 | 626_fri_c2_500_50 | 500 | 51 | 25500 |
67 | 627_fri_c2_500_10 | 500 | 11 | 5500 |
68 | 628_fri_c3_1000_5 | 1000 | 6 | 6000 |
69 | 631_fri_c1_500_5 | 500 | 6 | 3000 |
70 | 633_fri_c0_500_25 | 500 | 26 | 13000 |
71 | 634_fri_c2_100_10 | 100 | 11 | 1100 |
72 | 635_fri_c0_250_10 | 250 | 11 | 2750 |
73 | 637_fri_c1_500_50 | 500 | 51 | 25500 |
74 | 641_fri_c1_500_10 | 500 | 11 | 5500 |
75 | 643_fri_c2_500_25 | 500 | 26 | 13000 |
76 | 644_fri_c4_250_25 | 250 | 26 | 6500 |
77 | 645_fri_c3_500_50 | 500 | 51 | 25500 |
78 | 646_fri_c3_500_10 | 500 | 11 | 5500 |
79 | 647_fri_c1_250_10 | 250 | 11 | 2750 |
80 | 648_fri_c1_250_50 | 250 | 51 | 12750 |
81 | 649_fri_c0_500_5 | 500 | 6 | 3000 |
82 | 650_fri_c0_500_50 | 500 | 51 | 25500 |
83 | 651_fri_c0_100_25 | 100 | 26 | 2600 |
84 | 653_fri_c0_250_25 | 250 | 26 | 6500 |
85 | 654_fri_c0_500_10 | 500 | 11 | 5500 |
86 | 656_fri_c1_100_5 | 100 | 6 | 600 |
87 | 657_fri_c2_250_10 | 250 | 11 | 2750 |
88 | 658_fri_c3_250_25 | 250 | 26 | 6500 |
89 | 663_rabe_266 | 120 | 3 | 360 |
90 | 665_sleuth_case2002 | 147 | 7 | 1029 |
91 | 666_rmftsa_ladata | 508 | 11 | 5588 |
92 | 678_visualizing_environmental | 111 | 4 | 444 |
93 | 687_sleuth_ex1605 | 62 | 6 | 372 |
94 | 690_visualizing_galaxy | 323 | 5 | 1615 |
95 | 695_chatfield_4 | 235 | 13 | 3055 |
96 | 712_chscase_geyser1 | 222 | 3 | 666 |
97 | feynman_III_12_43 | 100000 | 3 | 300000 |
98 | feynman_III_15_12 | 100000 | 4 | 400000 |
99 | feynman_III_15_14 | 100000 | 4 | 400000 |
100 | feynman_III_15_27 | 100000 | 4 | 400000 |
101 | feynman_III_17_37 | 100000 | 4 | 400000 |
102 | feynman_III_7_38 | 100000 | 4 | 400000 |
103 | feynman_III_8_54 | 100000 | 4 | 400000 |
104 | feynman_II_10_9 | 100000 | 4 | 400000 |
105 | feynman_II_11_28 | 100000 | 3 | 300000 |
106 | feynman_II_13_23 | 100000 | 4 | 400000 |
107 | feynman_II_13_34 | 100000 | 4 | 400000 |
108 | feynman_II_15_4 | 100000 | 4 | 400000 |
109 | feynman_II_15_5 | 100000 | 4 | 400000 |
110 | feynman_II_24_17 | 100000 | 4 | 400000 |
111 | feynman_II_27_16 | 100000 | 4 | 400000 |
112 | feynman_II_27_18 | 100000 | 3 | 300000 |
113 | feynman_II_34_2 | 100000 | 4 | 400000 |
114 | feynman_II_34_29a | 100000 | 4 | 400000 |
115 | feynman_II_34_2a | 100000 | 4 | 400000 |
116 | feynman_II_37_1 | 100000 | 4 | 400000 |
117 | feynman_II_38_14 | 100000 | 3 | 300000 |
118 | feynman_II_3_24 | 100000 | 3 | 300000 |
119 | feynman_II_4_23 | 100000 | 4 | 400000 |
120 | feynman_II_8_31 | 100000 | 3 | 300000 |
121 | feynman_II_8_7 | 100000 | 4 | 400000 |
122 | feynman_I_10_7 | 100000 | 4 | 400000 |
123 | feynman_I_12_1 | 100000 | 3 | 300000 |
124 | feynman_I_12_4 | 100000 | 4 | 400000 |
125 | feynman_I_12_5 | 100000 | 3 | 300000 |
126 | feynman_I_14_3 | 100000 | 4 | 400000 |
127 | feynman_I_14_4 | 100000 | 3 | 300000 |
128 | feynman_I_15_10 | 100000 | 4 | 400000 |
129 | feynman_I_16_6 | 100000 | 4 | 400000 |
130 | feynman_I_18_12 | 100000 | 4 | 400000 |
131 | feynman_I_25_13 | 100000 | 3 | 300000 |
132 | feynman_I_26_2 | 100000 | 3 | 300000 |
133 | feynman_I_27_6 | 100000 | 4 | 400000 |
134 | feynman_I_29_4 | 100000 | 3 | 300000 |
135 | feynman_I_30_3 | 100000 | 4 | 400000 |
136 | feynman_I_30_5 | 100000 | 4 | 400000 |
137 | feynman_I_34_1 | 100000 | 4 | 400000 |
138 | feynman_I_34_14 | 100000 | 4 | 400000 |
139 | feynman_I_34_27 | 100000 | 3 | 300000 |
140 | feynman_I_37_4 | 100000 | 4 | 400000 |
141 | feynman_I_39_1 | 100000 | 3 | 300000 |
142 | feynman_I_39_11 | 100000 | 4 | 400000 |
143 | feynman_I_43_31 | 100000 | 4 | 400000 |
144 | feynman_I_47_23 | 100000 | 4 | 400000 |
145 | feynman_I_48_2 | 100000 | 4 | 400000 |
146 | feynman_I_6_2 | 100000 | 3 | 300000 |
147 | feynman_I_6_2b | 100000 | 4 | 400000 |
148 | nikuradse_1 | 362 | 3 | 1086 |
149 | strogatz_bacres1 | 400 | 3 | 1200 |
150 | strogatz_bacres2 | 400 | 3 | 1200 |
151 | strogatz_barmag1 | 400 | 3 | 1200 |
152 | strogatz_barmag2 | 400 | 3 | 1200 |
153 | strogatz_glider1 | 400 | 3 | 1200 |
154 | strogatz_glider2 | 400 | 3 | 1200 |
155 | strogatz_lv1 | 400 | 3 | 1200 |
156 | strogatz_lv2 | 400 | 3 | 1200 |
157 | strogatz_predprey1 | 400 | 3 | 1200 |
158 | strogatz_predprey2 | 400 | 3 | 1200 |
159 | strogatz_shearflow1 | 400 | 3 | 1200 |
160 | strogatz_shearflow2 | 400 | 3 | 1200 |
161 | strogatz_vdp1 | 400 | 3 | 1200 |
162 | strogatz_vdp2 | 400 | 3 | 1200 |
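The dataset names in this appendix (e.g., the fri_c*, feynman_*, and strogatz_* families above, as well as the classification sets in Appendix E) match entries in the Penn Machine Learning Benchmarks (PMLB) collection. Assuming that provenance — the retrieval path is not stated in the table itself, so this is only an assumption — the row and column counts reported here can be cross-checked with a short script such as the sketch below; column counts include the target.

```python
# Minimal sketch, assuming the datasets are drawn from PMLB (an assumption,
# not stated in the table itself). Requires the `pmlb` package.
from pmlb import fetch_data

for name in ["663_rabe_266", "690_visualizing_galaxy", "strogatz_vdp1"]:
    df = fetch_data(name)   # pandas DataFrame: feature columns plus a 'target' column
    rows, cols = df.shape   # the column count includes the target
    print(f"{name}: {rows} rows, {cols} columns, {rows * cols} cells")
```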
Appendix E Detailed Results: Classification
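The tables that follow report, for each dataset-algorithm pair, the mean accuracy, precision, recall, and Matthews correlation coefficient (MCC). As a point of reference, these quantities can be computed from true and predicted labels with standard tooling; the sketch below assumes scikit-learn and macro averaging of precision and recall on multi-class data, which is an assumption about the aggregation rather than a statement of the exact evaluation protocol.

```python
# Minimal sketch of the reported metrics (assumes scikit-learn is available).
# y_true and y_pred are hypothetical label arrays for a single evaluation split;
# macro averaging is an assumption about how per-class scores were aggregated.
import numpy as np
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, matthews_corrcoef)

y_true = np.array([0, 1, 1, 0, 2, 2, 1])
y_pred = np.array([0, 1, 0, 0, 2, 1, 1])

metrics = {
    "accuracy": accuracy_score(y_true, y_pred),
    "precision": precision_score(y_true, y_pred, average="macro", zero_division=0),
    "recall": recall_score(y_true, y_pred, average="macro", zero_division=0),
    "mcc": matthews_corrcoef(y_true, y_pred),
}
print(metrics)
```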
No. | Dataset Name | Algorithm | Mean Accuracy [0-1] | Mean Precision | Mean Recall | Mean MCC |
---|---|---|---|---|---|---|
1 | GAMETES_Epistasis_2_Way_20atts_0.1H_EDM_1_1 | Ours | 0.653854 | 0.669697 | 0.655529 | 0.32489 |
2 | GAMETES_Epistasis_2_Way_20atts_0.4H_EDM_1_1 | Ours | 0.759479 | 0.777411 | 0.761283 | 0.538387 |
3 | GAMETES_Epistasis_3_Way_20atts_0.2H_EDM_1_1 | Ours | 0.659583 | 0.662087 | 0.660329 | 0.322405 |
4 | GAMETES_Heterogeneity_20atts_1600_Het_0.4_0.2_… | Ours | 0.691042 | 0.692929 | 0.691785 | 0.384708 |
5 | GAMETES_Heterogeneity_20atts_1600_Het_0.4_0.2_… | Ours | 0.671354 | 0.674906 | 0.672593 | 0.347478 |
6 | Hill_Valley_with_noise | Ours | 0.555693 | 0.56096 | 0.555904 | 0.116711 |
7 | Hill_Valley_without_noise | Ours | 0.634979 | 0.63918 | 0.635376 | 0.274511 |
8 | agaricus_lepiota | Ours | 0.999959 | 0.999961 | 0.999957 | 0.999918 |
9 | allbp | Ours | 0.96287 | 0.607804 | 0.511241 | 0.486371 |
10 | allhyper | Ours | 0.98 | 0.514222 | 0.470816 | 0.562103 |
11 | allhypo | Ours | 0.956366 | 0.787001 | 0.737647 | 0.698695 |
12 | allrep | Ours | 0.967285 | 0.573905 | 0.476258 | 0.385945 |
13 | analcatdata_aids | Ours | 0.496667 | 0.575589 | 0.528433 | 0.127752 |
14 | analcatdata_asbestos | Ours | 0.747059 | 0.756768 | 0.759368 | 0.514386 |
15 | analcatdata_authorship | Ours | 0.998817 | 0.9992 | 0.99921 | 0.998268 |
16 | analcatdata_bankruptcy | Ours | 0.83 | 0.829187 | 0.813175 | 0.641827 |
17 | analcatdata_boxing1 | Ours | 0.834722 | 0.826879 | 0.801451 | 0.625709 |
18 | analcatdata_boxing2 | Ours | 0.771605 | 0.778168 | 0.769354 | 0.547147 |
19 | analcatdata_creditscore | Ours | 0.971667 | 0.947156 | 0.981818 | 0.925927 |
20 | analcatdata_cyyoung8092 | Ours | 0.741667 | 0.642799 | 0.584171 | 0.224909 |
21 | analcatdata_cyyoung9302 | Ours | 0.864912 | 0.755211 | 0.758342 | 0.505288 |
22 | analcatdata_dmft | Ours | 0.206875 | 0.209936 | 0.210983 | 0.053707 |
23 | analcatdata_fraud | Ours | 0.677778 | 0.631362 | 0.640972 | 0.246998 |
24 | analcatdata_germangss | Ours | 0.321667 | 0.389592 | 0.339092 | 0.129106 |
25 | analcatdata_happiness | Ours | 0.491667 | 0.493704 | 0.51127 | 0.30306 |
26 | analcatdata_japansolvent | Ours | 0.706061 | 0.72914 | 0.702196 | 0.4281 |
27 | analcatdata_lawsuit | Ours | 0.978616 | 0.913018 | 0.915771 | 0.82225 |
28 | ann_thyroid | Ours | 0.949676 | 0.872635 | 0.755052 | 0.664545 |
29 | appendicitis | Ours | 0.862121 | 0.749118 | 0.752361 | 0.492146 |
30 | australian | Ours | 0.853623 | 0.852566 | 0.852809 | 0.705318 |
31 | auto | Ours | 0.847967 | 0.857068 | 0.8448 | 0.802945 |
32 | backache | Ours | 0.819444 | 0.615342 | 0.562372 | 0.165026 |
33 | balance_scale | Ours | 0.890667 | 0.593734 | 0.642615 | 0.806576 |
34 | banana | Ours | 0.89978 | 0.902141 | 0.895135 | 0.79724 |
35 | biomed | Ours | 0.965079 | 0.968146 | 0.955684 | 0.923524 |
36 | breast | Ours | 0.959286 | 0.956566 | 0.953688 | 0.910165 |
37 | breast_cancer | Ours | 0.698851 | 0.622068 | 0.579049 | 0.194912 |
38 | breast_cancer_wisconsin | Ours | 0.961696 | 0.966384 | 0.952519 | 0.918735 |
39 | breast_w | Ours | 0.959286 | 0.955439 | 0.955047 | 0.910433 |
40 | buggyCrx | Ours | 0.849275 | 0.848633 | 0.850343 | 0.698912 |
41 | bupa | Ours | 0.589372 | 0.623767 | 0.597582 | 0.219039 |
42 | calendarDOW | Ours | 0.594583 | 0.575761 | 0.563412 | 0.489319 |
43 | car | Ours | 0.941522 | 0.863757 | 0.874365 | 0.874208 |
44 | car_evaluation | Ours | 0.950193 | 0.872229 | 0.916856 | 0.895672 |
45 | cars | Ours | 0.995359 | 0.995025 | 0.99148 | 0.991464 |
46 | chess | Ours | 0.985729 | 0.985873 | 0.985597 | 0.971471 |
47 | churn | Ours | 0.884433 | 0.780368 | 0.654586 | 0.414033 |
48 | clean1 | Ours | 0.881944 | 0.883391 | 0.891644 | 0.774906 |
49 | cleve | Ours | 0.808197 | 0.808921 | 0.805756 | 0.614373 |
50 | cleveland | Ours | 0.535519 | 0.289752 | 0.288754 | 0.246545 |
51 | cleveland_nominal | Ours | 0.531694 | 0.281539 | 0.282674 | 0.23382 |
52 | cloud | Ours | 0.837879 | 0.845554 | 0.845053 | 0.788497 |
53 | cmc | Ours | 0.537401 | 0.520275 | 0.505161 | 0.276955 |
54 | colic | Ours | 0.820721 | 0.814725 | 0.801837 | 0.616045 |
55 | collins | Ours | 1 | 1 | 1 | 1 |
56 | confidence | Ours | 0.793333 | 0.796278 | 0.802759 | 0.760388 |
57 | contraceptive | Ours | 0.456384 | 0.438475 | 0.436322 | 0.156417 |
58 | corral | Ours | 1 | 1 | 1 | 1 |
59 | credit_a | Ours | 0.851449 | 0.851033 | 0.853532 | 0.704528 |
60 | credit_g | Ours | 0.727333 | 0.666042 | 0.640079 | 0.304229 |
61 | crx | Ours | 0.846377 | 0.846859 | 0.847505 | 0.694331 |
62 | dermatology | Ours | 0.954505 | 0.952888 | 0.954587 | 0.944266 |
63 | diabetes | Ours | 0.750866 | 0.732876 | 0.692291 | 0.422677 |
64 | dis | Ours | 0.984592 | 0.761389 | 0.645043 | 0.37364 |
65 | ecoli | Ours | 0.867677 | 0.852846 | 0.818471 | 0.814701 |
66 | flags | Ours | 0.425 | 0.413098 | 0.392699 | 0.254926 |
67 | flare | Ours | 0.806231 | 0.594014 | 0.542021 | 0.122528 |
68 | german | Ours | 0.723 | 0.667077 | 0.634947 | 0.299559 |
69 | glass | Ours | 0.69187 | 0.663835 | 0.65902 | 0.575001 |
70 | glass2 | Ours | 0.80303 | 0.813394 | 0.80243 | 0.615146 |
71 | haberman | Ours | 0.734946 | 0.656579 | 0.588955 | 0.228624 |
72 | hayes_roth | Ours | 0.741667 | 0.807642 | 0.723313 | 0.593912 |
73 | heart_c | Ours | 0.808197 | 0.808419 | 0.801124 | 0.609371 |
74 | heart_h | Ours | 0.79661 | 0.791894 | 0.765024 | 0.555181 |
75 | heart_statlog | Ours | 0.828395 | 0.832279 | 0.818623 | 0.650413 |
76 | hepatitis | Ours | 0.815054 | 0.716643 | 0.635302 | 0.335239 |
77 | horse_colic | Ours | 0.82027 | 0.814153 | 0.801003 | 0.614658 |
78 | house_votes_84 | Ours | 0.95249 | 0.948284 | 0.952441 | 0.900652 |
79 | hungarian | Ours | 0.812429 | 0.803013 | 0.781021 | 0.582977 |
80 | hypothyroid | Ours | 0.972617 | 0.889952 | 0.795092 | 0.675621 |
81 | ionosphere | Ours | 0.901878 | 0.895478 | 0.892366 | 0.787616 |
82 | iris | Ours | 0.946667 | 0.948496 | 0.946156 | 0.921595 |
83 | irish | Ours | 1 | 1 | 1 | 1 |
84 | kr_vs_kp | Ours | 0.985573 | 0.985682 | 0.985419 | 0.971101 |
85 | krkopt | Ours | 0.695747 | 0.729317 | 0.62841 | 0.659415 |
86 | labor | Ours | 0.913889 | 0.906515 | 0.916243 | 0.818887 |
87 | led24 | Ours | 0.723698 | 0.722053 | 0.723278 | 0.693398 |
88 | led7 | Ours | 0.733542 | 0.733753 | 0.731413 | 0.704506 |
89 | lupus | Ours | 0.731481 | 0.719018 | 0.713359 | 0.427516 |
90 | lymphography | Ours | 0.801111 | 0.696942 | 0.711982 | 0.620561 |
91 | magic | Ours | 0.835156 | 0.849248 | 0.787159 | 0.633245 |
92 | mfeat_fourier | Ours | 0.843417 | 0.84611 | 0.844234 | 0.8265 |
93 | mfeat_karhunen | Ours | 0.973667 | 0.974178 | 0.973739 | 0.970786 |
94 | mfeat_morphological | Ours | 0.730333 | 0.731324 | 0.72966 | 0.701354 |
95 | mfeat_zernike | Ours | 0.825667 | 0.823871 | 0.826379 | 0.806484 |
96 | mofn_3_7_10 | Ours | 1 | 1 | 1 | 1 |
97 | molecular_biology_promoters | Ours | 0.833333 | 0.85465 | 0.83638 | 0.689479 |
98 | monk1 | Ours | 0.999107 | 0.999 | 0.999231 | 0.99823 |
99 | monk2 | Ours | 0.81157 | 0.867278 | 0.736535 | 0.587969 |
100 | monk3 | Ours | 0.980781 | 0.980822 | 0.980718 | 0.961539 |
101 | movement_libras | Ours | 0.814815 | 0.830163 | 0.823724 | 0.803783 |
102 | mushroom | Ours | 1 | 1 | 1 | 1 |
103 | mux6 | Ours | 0.974359 | 0.975281 | 0.976794 | 0.951989 |
104 | new_thyroid | Ours | 0.94186 | 0.944832 | 0.905906 | 0.874122 |
105 | nursery | Ours | 0.965509 | 0.961679 | 0.882496 | 0.949436 |
106 | page_blocks | Ours | 0.963866 | 0.852621 | 0.794459 | 0.79896 |
107 | parity5 | Ours | 0.952381 | 0.956111 | 0.97 | 0.924597 |
108 | parity5+5 | Ours | 1 | 1 | 1 | 1 |
109 | pendigits | Ours | 0.993285 | 0.993397 | 0.993287 | 0.99254 |
110 | phoneme | Ours | 0.892969 | 0.877483 | 0.860792 | 0.738053 |
111 | pima | Ours | 0.745887 | 0.726364 | 0.688215 | 0.412293 |
112 | postoperative_patient_data | Ours | 0.75 | 0.37791 | 0.495556 | -0.007968 |
113 | prnn_crabs | Ours | 0.9675 | 0.967838 | 0.96769 | 0.935501 |
114 | prnn_fglass | Ours | 0.687805 | 0.653144 | 0.655095 | 0.569007 |
115 | prnn_synth | Ours | 0.876667 | 0.879281 | 0.876959 | 0.756139 |
116 | profb | Ours | 0.640741 | 0.558732 | 0.535917 | 0.091385 |
117 | ring | Ours | 0.711644 | 0.817582 | 0.709742 | 0.515948 |
118 | saheart | Ours | 0.716846 | 0.692179 | 0.635139 | 0.321519 |
119 | satimage | Ours | 0.910282 | 0.897146 | 0.890226 | 0.889301 |
120 | schizo | Ours | 0.556373 | 0.426085 | 0.380525 | 0.077767 |
121 | segmentation | Ours | 0.970491 | 0.970609 | 0.970437 | 0.965598 |
122 | solar_flare_1 | Ours | 0.695767 | 0.673568 | 0.650753 | 0.607893 |
123 | solar_flare_2 | Ours | 0.716511 | 0.609754 | 0.585798 | 0.637871 |
124 | sonar | Ours | 0.810317 | 0.817667 | 0.810397 | 0.627877 |
125 | soybean | Ours | 0.908642 | 0.951553 | 0.940519 | 0.90012 |
126 | spambase | Ours | 0.916793 | 0.919352 | 0.90635 | 0.825531 |
127 | spect | Ours | 0.769753 | 0.677874 | 0.740606 | 0.412112 |
128 | spectf | Ours | 0.865238 | 0.83142 | 0.861399 | 0.690856 |
129 | splice | Ours | 0.913584 | 0.896234 | 0.932801 | 0.868379 |
130 | tae | Ours | 0.603226 | 0.619652 | 0.608631 | 0.418806 |
131 | texture | Ours | 0.987909 | 0.988061 | 0.987948 | 0.986709 |
132 | threeOf9 | Ours | 0.984142 | 0.984099 | 0.984287 | 0.968383 |
133 | tic_tac_toe | Ours | 0.989062 | 0.990975 | 0.985149 | 0.976097 |
134 | tokyo1 | Ours | 0.925 | 0.920875 | 0.915029 | 0.83581 |
135 | twonorm | Ours | 0.975518 | 0.975525 | 0.97553 | 0.951054 |
136 | vehicle | Ours | 0.714706 | 0.704578 | 0.724239 | 0.623572 |
137 | vote | Ours | 0.959004 | 0.956313 | 0.959582 | 0.915865 |
138 | vowel | Ours | 0.969529 | 0.97055 | 0.971676 | 0.966636 |
139 | waveform_21 | Ours | 0.853967 | 0.860253 | 0.853314 | 0.785179 |
140 | waveform_40 | Ours | 0.8584 | 0.86213 | 0.85963 | 0.790698 |
141 | wdbc | Ours | 0.960819 | 0.964085 | 0.952579 | 0.916542 |
142 | wine_quality_red | Ours | 0.66625 | 0.378358 | 0.337689 | 0.458883 |
143 | wine_quality_white | Ours | 0.625544 | 0.463491 | 0.361405 | 0.429912 |
144 | wine_recognition | Ours | 0.984259 | 0.982731 | 0.986844 | 0.97621 |
145 | xd6 | Ours | 0.999829 | 0.999863 | 0.999775 | 0.999638 |
146 | yeast | Ours | 0.590991 | 0.5583 | 0.536017 | 0.467647 |
No. | Dataset Name | Algorithm | Mean Accuracy [0-1] | Mean Precision | Mean Recall | Mean MCC |
---|---|---|---|---|---|---|
1 | GAMETES_Epistasis_2_Way_20atts_0.1H_EDM_1_1 | GB | 0.623229 | 0.624565 | 0.623842 | 0.248402 |
2 | GAMETES_Epistasis_2_Way_20atts_0.4H_EDM_1_1 | GB | 0.726458 | 0.728732 | 0.726848 | 0.455558 |
3 | GAMETES_Epistasis_3_Way_20atts_0.2H_EDM_1_1 | GB | 0.538438 | 0.538718 | 0.538705 | 0.077424 |
4 | GAMETES_Heterogeneity_20atts_1600_Het_0.4_0.2_… | GB | 0.681458 | 0.682047 | 0.6818 | 0.363846 |
5 | GAMETES_Heterogeneity_20atts_1600_Het_0.4_0.2_… | GB | 0.699792 | 0.700784 | 0.700226 | 0.401008 |
6 | Hill_Valley_with_noise | GB | 0.554321 | 0.555855 | 0.555435 | 0.111285 |
7 | Hill_Valley_without_noise | GB | 0.592044 | 0.592688 | 0.592331 | 0.185017 |
8 | agaricus_lepiota | GB | 0.999939 | 0.999944 | 0.999932 | 0.999876 |
9 | allbp | GB | 0.95766 | 0.590159 | 0.443807 | 0.342181 |
10 | allhyper | GB | 0.970508 | 0.423912 | 0.353124 | 0.206518 |
11 | allhypo | GB | 0.953714 | 0.811196 | 0.641956 | 0.644284 |
12 | allrep | GB | 0.969581 | 0.609249 | 0.364862 | 0.302243 |
13 | analcatdata_aids | GB | 0.583333 | 0.589636 | 0.580516 | 0.182331 |
14 | analcatdata_asbestos | GB | 0.747059 | 0.747547 | 0.751964 | 0.49769 |
15 | analcatdata_authorship | GB | 0.984615 | 0.985233 | 0.969084 | 0.977613 |
16 | analcatdata_bankruptcy | GB | 0.83 | 0.824993 | 0.822897 | 0.649452 |
17 | analcatdata_boxing1 | GB | 0.813889 | 0.812901 | 0.779798 | 0.588689 |
18 | analcatdata_boxing2 | GB | 0.777778 | 0.78448 | 0.777737 | 0.561874 |
19 | analcatdata_creditscore | GB | 0.968333 | 0.961483 | 0.960753 | 0.91991 |
20 | analcatdata_cyyoung8092 | GB | 0.775 | 0.717926 | 0.682384 | 0.398269 |
21 | analcatdata_cyyoung9302 | GB | 0.822807 | 0.675207 | 0.653659 | 0.328368 |
22 | analcatdata_dmft | GB | 0.178542 | 0.183872 | 0.180861 | 0.015109 |
23 | analcatdata_fraud | GB | 0.648148 | 0.500728 | 0.543459 | 0.10025 |
24 | analcatdata_germangss | GB | 0.379583 | 0.389557 | 0.387283 | 0.183886 |
25 | analcatdata_happiness | GB | 0.416667 | 0.416698 | 0.421005 | 0.175526 |
26 | analcatdata_japansolvent | GB | 0.784848 | 0.792169 | 0.779848 | 0.568687 |
27 | analcatdata_lawsuit | GB | 0.97673 | 0.90845 | 0.913457 | 0.80917 |
28 | ann_thyroid | GB | 0.996366 | 0.975408 | 0.988119 | 0.974825 |
29 | appendicitis | GB | 0.859091 | 0.77631 | 0.739596 | 0.499832 |
30 | australian | GB | 0.849034 | 0.847615 | 0.84868 | 0.696252 |
31 | auto | GB | 0.769919 | 0.773981 | 0.757548 | 0.701018 |
32 | backache | GB | 0.841667 | 0.460661 | 0.509057 | 0.029151 |
33 | balance_scale | GB | 0.865333 | 0.606516 | 0.631755 | 0.758318 |
34 | banana | GB | 0.900314 | 0.90167 | 0.89636 | 0.798009 |
35 | biomed | GB | 0.894444 | 0.892326 | 0.87181 | 0.762962 |
36 | breast | GB | 0.960476 | 0.955636 | 0.957545 | 0.913107 |
37 | breast_cancer | GB | 0.705172 | 0.6248 | 0.560431 | 0.175343 |
38 | breast_cancer_wisconsin | GB | 0.963158 | 0.964401 | 0.957024 | 0.921303 |
39 | breast_w | GB | 0.961429 | 0.957154 | 0.958785 | 0.915855 |
40 | buggyCrx | GB | 0.858696 | 0.859042 | 0.86369 | 0.722678 |
41 | bupa | GB | 0.583575 | 0.591633 | 0.589798 | 0.181393 |
42 | calendarDOW | GB | 0.61625 | 0.62223 | 0.584366 | 0.517025 |
43 | car | GB | 0.994509 | 0.978554 | 0.981028 | 0.988196 |
44 | car_evaluation | GB | 0.993064 | 0.979688 | 0.974963 | 0.985009 |
45 | cars | GB | 0.994515 | 0.997182 | 0.989806 | 0.989731 |
46 | chess | GB | 0.995104 | 0.995157 | 0.995047 | 0.990204 |
47 | churn | GB | 0.9474 | 0.938357 | 0.832441 | 0.763161 |
48 | clean1 | GB | 0.987153 | 0.985786 | 0.988452 | 0.974225 |
49 | cleve | GB | 0.791803 | 0.793232 | 0.793268 | 0.586226 |
50 | cleveland | GB | 0.546448 | 0.270395 | 0.275676 | 0.22961 |
51 | cleveland_nominal | GB | 0.537705 | 0.238366 | 0.26069 | 0.193917 |
52 | cloud | GB | 0.818182 | 0.828609 | 0.829335 | 0.762896 |
53 | cmc | GB | 0.557966 | 0.54335 | 0.527052 | 0.31305 |
54 | colic | GB | 0.826126 | 0.824393 | 0.806089 | 0.629612 |
55 | collins | GB | 0.998282 | 0.998984 | 0.99794 | 0.998115 |
56 | confidence | GB | 0.737778 | 0.750556 | 0.75937 | 0.699787 |
57 | contraceptive | GB | 0.521921 | 0.499421 | 0.489867 | 0.253712 |
58 | corral | GB | 0.997917 | 0.998039 | 0.998039 | 0.996078 |
59 | credit_a | GB | 0.851208 | 0.851167 | 0.854755 | 0.705898 |
60 | credit_g | GB | 0.751833 | 0.701701 | 0.66482 | 0.363336 |
61 | crx | GB | 0.848792 | 0.849765 | 0.851219 | 0.700959 |
62 | dermatology | GB | 0.972072 | 0.967065 | 0.970117 | 0.965051 |
63 | diabetes | GB | 0.762987 | 0.744642 | 0.716934 | 0.460197 |
64 | dis | GB | 0.981987 | 0.65732 | 0.579602 | 0.214954 |
65 | ecoli | GB | 0.870707 | 0.849951 | 0.806586 | 0.819111 |
66 | flags | GB | 0.450926 | 0.394487 | 0.385558 | 0.280164 |
67 | flare | GB | 0.826791 | 0.473673 | 0.515531 | 0.047305 |
68 | german | GB | 0.751833 | 0.710151 | 0.664039 | 0.370144 |
69 | glass | GB | 0.760163 | 0.70551 | 0.667744 | 0.662973 |
70 | glass2 | GB | 0.853535 | 0.851578 | 0.853562 | 0.704992 |
71 | haberman | GB | 0.72043 | 0.466626 | 0.525402 | 0.059925 |
72 | hayes_roth | GB | 0.814583 | 0.851083 | 0.849469 | 0.715987 |
73 | heart_c | GB | 0.803825 | 0.801057 | 0.7984 | 0.599376 |
74 | heart_h | GB | 0.810734 | 0.802262 | 0.788048 | 0.589686 |
75 | heart_statlog | GB | 0.809259 | 0.810886 | 0.801569 | 0.611764 |
76 | hepatitis | GB | 0.809677 | 0.679517 | 0.615463 | 0.297586 |
77 | horse_colic | GB | 0.825676 | 0.824893 | 0.803777 | 0.627761 |
78 | house_votes_84 | GB | 0.956322 | 0.952215 | 0.956129 | 0.908305 |
79 | hungarian | GB | 0.822599 | 0.812434 | 0.796415 | 0.608083 |
80 | hypothyroid | GB | 0.960295 | 0.821943 | 0.690758 | 0.492934 |
81 | ionosphere | GB | 0.930047 | 0.934373 | 0.912484 | 0.846345 |
82 | iris | GB | 0.941111 | 0.942046 | 0.940897 | 0.912551 |
83 | irish | GB | 1 | 1 | 1 | 1 |
84 | kr_vs_kp | GB | 0.995833 | 0.995906 | 0.995753 | 0.991659 |
85 | krkopt | GB | 0.741625 | 0.760553 | 0.740988 | 0.711356 |
86 | labor | GB | 0.858333 | 0.853776 | 0.853226 | 0.698621 |
87 | led24 | GB | 0.72375 | 0.72816 | 0.723058 | 0.693332 |
88 | led7 | GB | 0.735573 | 0.743535 | 0.73404 | 0.707107 |
89 | lupus | GB | 0.711111 | 0.709199 | 0.67349 | 0.377305 |
90 | lymphography | GB | 0.841111 | 0.807208 | 0.820962 | 0.698158 |
91 | magic | GB | 0.882404 | 0.881367 | 0.858048 | 0.739042 |
92 | mfeat_fourier | GB | 0.8295 | 0.834299 | 0.829938 | 0.810929 |
93 | mfeat_karhunen | GB | 0.9455 | 0.945762 | 0.946128 | 0.939481 |
94 | mfeat_morphological | GB | 0.725333 | 0.723968 | 0.726472 | 0.698335 |
95 | mfeat_zernike | GB | 0.78575 | 0.793694 | 0.786749 | 0.762353 |
96 | mofn_3_7_10 | GB | 1 | 1 | 1 | 1 |
97 | molecular_biology_promoters | GB | 0.868182 | 0.868368 | 0.876079 | 0.743991 |
98 | monk1 | GB | 1 | 1 | 1 | 1 |
99 | monk2 | GB | 0.990909 | 0.989906 | 0.991143 | 0.981029 |
100 | monk3 | GB | 0.989189 | 0.989524 | 0.988837 | 0.97836 |
101 | movement_libras | GB | 0.694444 | 0.708348 | 0.704736 | 0.674975 |
102 | mushroom | GB | 0.999795 | 0.999802 | 0.999787 | 0.99959 |
103 | mux6 | GB | 0.997436 | 0.997778 | 0.997436 | 0.995212 |
104 | new_thyroid | GB | 0.945736 | 0.941511 | 0.924322 | 0.886921 |
105 | nursery | GB | 0.999961 | 0.999969 | 0.99997 | 0.999943 |
106 | page_blocks | GB | 0.972664 | 0.868281 | 0.827087 | 0.850277 |
107 | parity5 | GB | 0.090476 | 0.067937 | 0.117778 | -0.796165 |
108 | parity5+5 | GB | 0.418667 | 0.365513 | 0.436051 | -0.148251 |
109 | pendigits | GB | 0.992345 | 0.992358 | 0.992296 | 0.991493 |
110 | phoneme | GB | 0.875948 | 0.851969 | 0.848601 | 0.700499 |
111 | pima | GB | 0.761255 | 0.74057 | 0.714152 | 0.453575 |
112 | postoperative_patient_data | GB | 0.753704 | 0.378159 | 0.497421 | -0.010038 |
113 | prnn_crabs | GB | 0.900833 | 0.899298 | 0.902222 | 0.801455 |
114 | prnn_fglass | GB | 0.752846 | 0.707223 | 0.675052 | 0.652292 |
115 | prnn_synth | GB | 0.850667 | 0.853065 | 0.852173 | 0.705186 |
116 | profb | GB | 0.681975 | 0.63178 | 0.597035 | 0.220191 |
117 | ring | GB | 0.969257 | 0.969738 | 0.96917 | 0.938908 |
118 | saheart | GB | 0.70681 | 0.67291 | 0.639235 | 0.309307 |
119 | satimage | GB | 0.914193 | 0.901445 | 0.893554 | 0.894136 |
120 | schizo | GB | 0.593137 | 0.201384 | 0.331625 | -0.011261 |
121 | segmentation | GB | 0.981097 | 0.981188 | 0.980893 | 0.977969 |
122 | solar_flare_1 | GB | 0.726455 | 0.683291 | 0.670855 | 0.647261 |
123 | solar_flare_2 | GB | 0.750623 | 0.645701 | 0.601199 | 0.682929 |
124 | sonar | GB | 0.842063 | 0.852871 | 0.841717 | 0.694243 |
125 | soybean | GB | 0.926914 | 0.961583 | 0.947355 | 0.920555 |
126 | spambase | GB | 0.952805 | 0.951705 | 0.949151 | 0.900847 |
127 | spect | GB | 0.825309 | 0.723452 | 0.687107 | 0.404084 |
128 | spectf | GB | 0.885238 | 0.85755 | 0.861073 | 0.717264 |
129 | splice | GB | 0.966092 | 0.959544 | 0.965494 | 0.945171 |
130 | tae | GB | 0.570968 | 0.576691 | 0.571238 | 0.362031 |
131 | texture | GB | 0.987394 | 0.987468 | 0.987381 | 0.98614 |
132 | threeOf9 | GB | 0.999029 | 0.998991 | 0.999088 | 0.998079 |
133 | tic_tac_toe | GB | 1 | 1 | 1 | 1 |
134 | tokyo1 | GB | 0.924132 | 0.919607 | 0.914589 | 0.834114 |
135 | twonorm | GB | 0.9725 | 0.972497 | 0.97251 | 0.945006 |
136 | vehicle | GB | 0.757843 | 0.760925 | 0.766135 | 0.677476 |
137 | vote | GB | 0.955556 | 0.95252 | 0.956258 | 0.908732 |
138 | vowel | GB | 0.904209 | 0.907806 | 0.907502 | 0.894871 |
139 | waveform_21 | GB | 0.849167 | 0.849025 | 0.848961 | 0.773966 |
140 | waveform_40 | GB | 0.856067 | 0.856348 | 0.856601 | 0.784493 |
141 | wdbc | GB | 0.963743 | 0.965021 | 0.957761 | 0.922668 |
142 | wine_quality_red | GB | 0.669896 | 0.37233 | 0.355688 | 0.474829 |
143 | wine_quality_white | GB | 0.655816 | 0.467276 | 0.390307 | 0.47736 |
144 | wine_recognition | GB | 0.941667 | 0.944792 | 0.945446 | 0.912963 |
145 | xd6 | GB | 1 | 1 | 1 | 1 |
146 | yeast | GB | 0.597973 | 0.558935 | 0.518458 | 0.47637 |
No. | Dataset Name | Algorithm | Mean Accuracy [0-1] | Mean Precision | Mean Recall | Mean MCC |
---|---|---|---|---|---|---|
1 | GAMETES_Epistasis_2_Way_20atts_0.1H_EDM_1_1 | KNN | 0.575208 | 0.580815 | 0.576832 | 0.157566 |
2 | GAMETES_Epistasis_2_Way_20atts_0.4H_EDM_1_1 | KNN | 0.700521 | 0.707414 | 0.702078 | 0.409406 |
3 | GAMETES_Epistasis_3_Way_20atts_0.2H_EDM_1_1 | KNN | 0.54875 | 0.550881 | 0.549925 | 0.100793 |
4 | GAMETES_Heterogeneity_20atts_1600_Het_0.4_0.2_… | KNN | 0.635208 | 0.637862 | 0.636591 | 0.274444 |
5 | GAMETES_Heterogeneity_20atts_1600_Het_0.4_0.2_… | KNN | 0.641563 | 0.644375 | 0.642843 | 0.287202 |
6 | Hill_Valley_with_noise | KNN | 0.58011 | 0.58036 | 0.580044 | 0.160401 |
7 | Hill_Valley_without_noise | KNN | 0.625652 | 0.626107 | 0.625637 | 0.251741 |
8 | agaricus_lepiota | KNN | 1 | 1 | 1 | 1 |
9 | allbp | KNN | 0.960839 | 0.594804 | 0.410202 | 0.357546 |
10 | allhyper | KNN | 0.973201 | 0.356974 | 0.296536 | 0.074057 |
11 | allhypo | KNN | 0.922016 | 0.381923 | 0.335883 | 0.030621 |
12 | allrep | KNN | 0.967329 | 0.291937 | 0.260474 | 0.052416 |
13 | analcatdata_aids | KNN | 0.296667 | 0.310278 | 0.291746 | -0.38327 |
14 | analcatdata_asbestos | KNN | 0.764706 | 0.761949 | 0.763459 | 0.524178 |
15 | analcatdata_authorship | KNN | 0.99645 | 0.996997 | 0.995303 | 0.994826 |
16 | analcatdata_bankruptcy | KNN | 0.84 | 0.830575 | 0.839841 | 0.670505 |
17 | analcatdata_boxing1 | KNN | 0.715278 | 0.706486 | 0.667477 | 0.364821 |
18 | analcatdata_boxing2 | KNN | 0.681481 | 0.697911 | 0.688397 | 0.385563 |
19 | analcatdata_creditscore | KNN | 0.975 | 0.959535 | 0.976108 | 0.933492 |
20 | analcatdata_cyyoung8092 | KNN | 0.738333 | 0.677489 | 0.598784 | 0.265392 |
21 | analcatdata_cyyoung9302 | KNN | 0.859649 | 0.75266 | 0.739636 | 0.488029 |
22 | analcatdata_dmft | KNN | 0.182292 | 0.197581 | 0.185948 | 0.019608 |
23 | analcatdata_fraud | KNN | 0.688889 | 0.576025 | 0.591276 | 0.199142 |
24 | analcatdata_germangss | KNN | 0.30125 | 0.308871 | 0.310429 | 0.079609 |
25 | analcatdata_happiness | KNN | 0.458333 | 0.469555 | 0.460833 | 0.22387 |
26 | analcatdata_japansolvent | KNN | 0.712121 | 0.744517 | 0.701263 | 0.438671 |
27 | analcatdata_lawsuit | KNN | 0.977358 | 0.906733 | 0.913347 | 0.810517 |
28 | ann_thyroid | KNN | 0.940208 | 0.832906 | 0.571052 | 0.463327 |
29 | appendicitis | KNN | 0.860606 | 0.741393 | 0.755686 | 0.486307 |
30 | australian | KNN | 0.72343 | 0.728803 | 0.709709 | 0.4379 |
31 | auto | KNN | 0.579675 | 0.594855 | 0.540941 | 0.449717 |
32 | backache | KNN | 0.848148 | 0.615213 | 0.553968 | 0.167803 |
33 | balance_scale | KNN | 0.8944 | 0.596403 | 0.645229 | 0.813959 |
34 | banana | KNN | 0.901006 | 0.902585 | 0.896997 | 0.799558 |
35 | biomed | KNN | 0.957143 | 0.961794 | 0.944795 | 0.906047 |
36 | breast | KNN | 0.661905 | 0.613687 | 0.586445 | 0.197624 |
37 | breast_cancer | KNN | 0.721264 | 0.666939 | 0.592926 | 0.247492 |
38 | breast_cancer_wisconsin | KNN | 0.929825 | 0.934065 | 0.916267 | 0.849919 |
39 | breast_w | KNN | 0.965476 | 0.961363 | 0.9628 | 0.924101 |
40 | buggyCrx | KNN | 0.757729 | 0.760932 | 0.748297 | 0.50896 |
41 | bupa | KNN | 0.572464 | 0.59432 | 0.581677 | 0.175034 |
42 | calendarDOW | KNN | 0.610833 | 0.592366 | 0.584055 | 0.510121 |
43 | car | KNN | 0.927168 | 0.895108 | 0.703662 | 0.838492 |
44 | car_evaluation | KNN | 0.922158 | 0.919812 | 0.711223 | 0.825496 |
45 | cars | KNN | 0.722363 | 0.640956 | 0.617134 | 0.483621 |
46 | chess | KNN | 0.964323 | 0.964682 | 0.963998 | 0.928678 |
47 | churn | KNN | 0.8914 | 0.868898 | 0.628793 | 0.434725 |
48 | clean1 | KNN | 0.857292 | 0.858204 | 0.865504 | 0.723617 |
49 | cleve | KNN | 0.683607 | 0.684074 | 0.682067 | 0.365891 |
50 | cleveland | KNN | 0.527869 | 0.161106 | 0.21256 | 0.070687 |
51 | cleveland_nominal | KNN | 0.545902 | 0.290312 | 0.289269 | 0.248132 |
52 | cloud | KNN | 0.771212 | 0.781427 | 0.776051 | 0.699808 |
53 | cmc | KNN | 0.52226 | 0.505838 | 0.494587 | 0.260738 |
54 | colic | KNN | 0.806306 | 0.80382 | 0.782485 | 0.585441 |
55 | collins | KNN | 0.991409 | 0.989001 | 0.990664 | 0.990539 |
56 | confidence | KNN | 0.788889 | 0.804741 | 0.799648 | 0.753815 |
57 | contraceptive | KNN | 0.479322 | 0.457819 | 0.448306 | 0.183288 |
58 | corral | KNN | 0.9375 | 0.939173 | 0.936509 | 0.875434 |
59 | credit_a | KNN | 0.755072 | 0.759579 | 0.743276 | 0.502428 |
60 | credit_g | KNN | 0.7035 | 0.59722 | 0.518064 | 0.082173 |
61 | crx | KNN | 0.764493 | 0.769839 | 0.754682 | 0.524168 |
62 | dermatology | KNN | 0.966216 | 0.962508 | 0.961136 | 0.958376 |
63 | diabetes | KNN | 0.741558 | 0.724068 | 0.685483 | 0.406733 |
64 | dis | KNN | 0.984459 | 0.492425 | 0.499798 | -0.001237 |
65 | ecoli | KNN | 0.874747 | 0.855106 | 0.83126 | 0.824807 |
66 | flags | KNN | 0.437037 | 0.400088 | 0.381754 | 0.268685 |
67 | flare | KNN | 0.815576 | 0.620187 | 0.550175 | 0.153951 |
68 | german | KNN | 0.694167 | 0.605427 | 0.51138 | 0.066478 |
69 | glass | KNN | 0.708943 | 0.660893 | 0.665182 | 0.598934 |
70 | glass2 | KNN | 0.836364 | 0.838123 | 0.832347 | 0.669981 |
71 | haberman | KNN | 0.74086 | 0.658995 | 0.592716 | 0.237863 |
72 | hayes_roth | KNN | 0.703125 | 0.744172 | 0.672182 | 0.532058 |
73 | heart_c | KNN | 0.693443 | 0.693642 | 0.682456 | 0.375656 |
74 | heart_h | KNN | 0.79096 | 0.785242 | 0.758757 | 0.542648 |
75 | heart_statlog | KNN | 0.701235 | 0.699152 | 0.692409 | 0.391261 |
76 | hepatitis | KNN | 0.823656 | 0.777807 | 0.633933 | 0.371221 |
77 | horse_colic | KNN | 0.8 | 0.796235 | 0.773937 | 0.569391 |
78 | house_votes_84 | KNN | 0.918774 | 0.912232 | 0.919218 | 0.831325 |
79 | hungarian | KNN | 0.811299 | 0.802042 | 0.776375 | 0.577313 |
80 | hypothyroid | KNN | 0.957609 | 0.890294 | 0.604244 | 0.38972 |
81 | ionosphere | KNN | 0.899531 | 0.918884 | 0.866985 | 0.783441 |
82 | iris | KNN | 0.96 | 0.959955 | 0.961187 | 0.940858 |
83 | irish | KNN | 0.895333 | 0.893324 | 0.899321 | 0.792576 |
84 | kr_vs_kp | KNN | 0.962708 | 0.963357 | 0.962118 | 0.925472 |
85 | krkopt | KNN | 0.694393 | 0.723243 | 0.616437 | 0.657941 |
86 | labor | KNN | 0.902778 | 0.88994 | 0.904609 | 0.790458 |
87 | led24 | KNN | 0.719375 | 0.71828 | 0.719201 | 0.688519 |
88 | led7 | KNN | 0.734844 | 0.735903 | 0.732953 | 0.706234 |
89 | lupus | KNN | 0.707407 | 0.697309 | 0.678856 | 0.370492 |
90 | lymphography | KNN | 0.828889 | 0.665212 | 0.669381 | 0.663572 |
91 | magic | KNN | 0.820128 | 0.836148 | 0.766126 | 0.598151 |
92 | mfeat_fourier | KNN | 0.843167 | 0.845374 | 0.844022 | 0.826185 |
93 | mfeat_karhunen | KNN | 0.974167 | 0.974626 | 0.974237 | 0.971342 |
94 | mfeat_morphological | KNN | 0.49625 | 0.503202 | 0.499322 | 0.442156 |
95 | mfeat_zernike | KNN | 0.80375 | 0.803931 | 0.804011 | 0.782129 |
96 | mofn_3_7_10 | KNN | 0.94956 | 0.942055 | 0.905287 | 0.845842 |
97 | molecular_biology_promoters | KNN | 0.787879 | 0.807907 | 0.796693 | 0.603439 |
98 | monk1 | KNN | 0.977976 | 0.978604 | 0.978628 | 0.957228 |
99 | monk2 | KNN | 0.819284 | 0.888553 | 0.741561 | 0.611311 |
100 | monk3 | KNN | 0.967568 | 0.967501 | 0.967727 | 0.935225 |
101 | movement_libras | KNN | 0.834722 | 0.846461 | 0.846434 | 0.824496 |
102 | mushroom | KNN | 1 | 1 | 1 | 1 |
103 | mux6 | KNN | 0.964103 | 0.965422 | 0.967609 | 0.93287 |
104 | new_thyroid | KNN | 0.952713 | 0.950851 | 0.926477 | 0.899944 |
105 | nursery | KNN | 0.971052 | 0.97872 | 0.831748 | 0.957694 |
106 | page_blocks | KNN | 0.960974 | 0.854309 | 0.745411 | 0.778077 |
107 | parity5 | KNN | 0.295238 | 0.302262 | 0.331111 | -0.353682 |
108 | parity5+5 | KNN | 0.564148 | 0.568952 | 0.564415 | 0.133119 |
109 | pendigits | KNN | 0.993406 | 0.993464 | 0.993404 | 0.992673 |
110 | phoneme | KNN | 0.899414 | 0.884585 | 0.870135 | 0.754534 |
111 | pima | KNN | 0.749351 | 0.73247 | 0.688519 | 0.41817 |
112 | postoperative_patient_data | KNN | 0.694444 | 0.371555 | 0.459955 | -0.107881 |
113 | prnn_crabs | KNN | 0.9575 | 0.956101 | 0.959952 | 0.916017 |
114 | prnn_fglass | KNN | 0.708943 | 0.661658 | 0.675863 | 0.600603 |
115 | prnn_synth | KNN | 0.880667 | 0.883485 | 0.88098 | 0.764359 |
116 | profb | KNN | 0.665679 | 0.609744 | 0.527343 | 0.107218 |
117 | ring | KNN | 0.74464 | 0.827009 | 0.743116 | 0.563877 |
118 | saheart | KNN | 0.666667 | 0.579849 | 0.550582 | 0.141674 |
119 | satimage | KNN | 0.910256 | 0.896759 | 0.892094 | 0.889304 |
120 | schizo | KNN | 0.594118 | 0.199057 | 0.331733 | -0.00342 |
121 | segmentation | KNN | 0.970635 | 0.970616 | 0.970609 | 0.965743 |
122 | solar_flare_1 | KNN | 0.710582 | 0.662313 | 0.649605 | 0.626002 |
123 | solar_flare_2 | KNN | 0.735981 | 0.609737 | 0.583493 | 0.66378 |
124 | sonar | KNN | 0.834127 | 0.83663 | 0.832012 | 0.668502 |
125 | soybean | KNN | 0.912346 | 0.945973 | 0.934624 | 0.90453 |
126 | spambase | KNN | 0.929099 | 0.929205 | 0.921804 | 0.850957 |
127 | spect | KNN | 0.803704 | 0.682675 | 0.67267 | 0.3521 |
128 | spectf | KNN | 0.902857 | 0.909448 | 0.858244 | 0.761771 |
129 | splice | KNN | 0.915569 | 0.898316 | 0.932485 | 0.870437 |
130 | tae | KNN | 0.595699 | 0.607383 | 0.598616 | 0.401498 |
131 | texture | KNN | 0.990879 | 0.990947 | 0.990867 | 0.989972 |
132 | threeOf9 | KNN | 0.973463 | 0.974155 | 0.97316 | 0.947306 |
133 | tic_tac_toe | KNN | 0.973785 | 0.974429 | 0.968572 | 0.942939 |
134 | tokyo1 | KNN | 0.914062 | 0.905875 | 0.90773 | 0.813534 |
135 | twonorm | KNN | 0.976171 | 0.976173 | 0.976199 | 0.952372 |
136 | vehicle | KNN | 0.683529 | 0.678035 | 0.694529 | 0.579958 |
137 | vote | KNN | 0.924521 | 0.919829 | 0.926626 | 0.846383 |
138 | vowel | KNN | 0.988215 | 0.988325 | 0.988808 | 0.987057 |
139 | waveform_21 | KNN | 0.855633 | 0.862016 | 0.85494 | 0.787715 |
140 | waveform_40 | KNN | 0.858433 | 0.861729 | 0.859645 | 0.790496 |
141 | wdbc | KNN | 0.930117 | 0.932585 | 0.917786 | 0.850077 |
142 | wine_quality_red | KNN | 0.645208 | 0.400918 | 0.317696 | 0.418081 |
143 | wine_quality_white | KNN | 0.629558 | 0.617262 | 0.340278 | 0.424703 |
144 | wine_recognition | KNN | 0.962037 | 0.965207 | 0.966592 | 0.943086 |
145 | xd6 | KNN | 0.999316 | 0.999478 | 0.999029 | 0.998505 |
146 | yeast | KNN | 0.607545 | 0.565962 | 0.516628 | 0.488806 |
No. | Dataset Name | Algorithm | Mean Accuracy [0-1] | Mean Precision | Mean Recall | Mean MCC |
---|---|---|---|---|---|---|
1 | GAMETES_Epistasis_2_Way_20atts_0.1H_EDM_1_1 | Linear | 0.480938 | 0.458951 | 0.48729 | -0.028143 |
2 | GAMETES_Epistasis_2_Way_20atts_0.4H_EDM_1_1 | Linear | 0.473333 | 0.496988 | 0.479564 | -0.039612 |
3 | GAMETES_Epistasis_3_Way_20atts_0.2H_EDM_1_1 | Linear | 0.505729 | 0.507804 | 0.507221 | 0.014985 |
4 | GAMETES_Heterogeneity_20atts_1600_Het_0.4_0.2_… | Linear | 0.478437 | 0.48032 | 0.485211 | -0.030165 |
5 | GAMETES_Heterogeneity_20atts_1600_Het_0.4_0.2_… | Linear | 0.484792 | 0.479447 | 0.490373 | -0.01969 |
6 | Hill_Valley_with_noise | Linear | 0.750754 | 0.822602 | 0.752891 | 0.570721 |
7 | Hill_Valley_without_noise | Linear | 0.707133 | 0.810475 | 0.709951 | 0.510171 |
8 | agaricus_lepiota | Linear | 0.999795 | 0.999807 | 0.999782 | 0.999589 |
9 | allbp | Linear | 0.961943 | 0.583252 | 0.435259 | 0.409765 |
10 | allhyper | Linear | 0.974923 | 0.487764 | 0.368148 | 0.297414 |
11 | allhypo | Linear | 0.96702 | 0.83573 | 0.798871 | 0.772537 |
12 | allrep | Linear | 0.970508 | 0.683944 | 0.352076 | 0.311429 |
13 | analcatdata_aids | Linear | 0.496667 | 0.512831 | 0.534504 | 0.114762 |
14 | analcatdata_asbestos | Linear | 0.752941 | 0.764759 | 0.771761 | 0.53506 |
15 | analcatdata_authorship | Linear | 0.997041 | 0.997884 | 0.997759 | 0.995666 |
16 | analcatdata_bankruptcy | Linear | 0.89 | 0.900066 | 0.871944 | 0.772301 |
17 | analcatdata_boxing1 | Linear | 0.838889 | 0.831318 | 0.809579 | 0.638054 |
18 | analcatdata_boxing2 | Linear | 0.77284 | 0.77663 | 0.775514 | 0.551979 |
19 | analcatdata_creditscore | Linear | 0.915 | 0.888418 | 0.923768 | 0.806581 |
20 | analcatdata_cyyoung8092 | Linear | 0.748333 | 0.71625 | 0.618471 | 0.310801 |
21 | analcatdata_cyyoung9302 | Linear | 0.859649 | 0.760837 | 0.770128 | 0.520873 |
22 | analcatdata_dmft | Linear | 0.19 | 0.164537 | 0.194179 | 0.033217 |
23 | analcatdata_fraud | Linear | 0.67037 | 0.481918 | 0.551085 | 0.111244 |
24 | analcatdata_germangss | Linear | 0.372917 | 0.374384 | 0.377683 | 0.17133 |
25 | analcatdata_happiness | Linear | 0.469444 | 0.499339 | 0.480794 | 0.265205 |
26 | analcatdata_japansolvent | Linear | 0.787879 | 0.795813 | 0.777249 | 0.569436 |
27 | analcatdata_lawsuit | Linear | 0.978616 | 0.916424 | 0.921551 | 0.828755 |
28 | ann_thyroid | Linear | 0.955069 | 0.868876 | 0.70503 | 0.635564 |
29 | appendicitis | Linear | 0.872727 | 0.788722 | 0.761552 | 0.537643 |
30 | australian | Linear | 0.849758 | 0.848129 | 0.849725 | 0.697812 |
31 | auto | Linear | 0.752033 | 0.758928 | 0.751128 | 0.678864 |
32 | backache | Linear | 0.839815 | 0.438873 | 0.500904 | 0.005885 |
33 | balance_scale | Linear | 0.985067 | 0.944644 | 0.989285 | 0.974323 |
34 | banana | Linear | 0.574811 | 0.642197 | 0.526298 | 0.113951 |
35 | biomed | Linear | 0.736508 | 0.71901 | 0.706934 | 0.423777 |
36 | breast | Linear | 0.96119 | 0.958669 | 0.955371 | 0.913977 |
37 | breast_cancer | Linear | 0.69023 | 0.535635 | 0.53289 | 0.101867 |
38 | breast_cancer_wisconsin | Linear | 0.974269 | 0.97617 | 0.969331 | 0.945425 |
39 | breast_w | Linear | 0.964048 | 0.96164 | 0.959009 | 0.920576 |
40 | buggyCrx | Linear | 0.849517 | 0.84847 | 0.84804 | 0.696465 |
41 | bupa | Linear | 0.613043 | 0.620401 | 0.618705 | 0.239021 |
42 | calendarDOW | Linear | 0.624583 | 0.61399 | 0.605714 | 0.530132 |
43 | car | Linear | 0.917341 | 0.872273 | 0.863776 | 0.820822 |
44 | car_evaluation | Linear | 0.921291 | 0.855849 | 0.854512 | 0.827701 |
45 | cars | Linear | 0.992405 | 0.991843 | 0.9892 | 0.986014 |
46 | chess | Linear | 0.972656 | 0.972628 | 0.972708 | 0.945335 |
47 | churn | Linear | 0.864333 | 0.809637 | 0.513192 | 0.125185 |
48 | clean1 | Linear | 0.971181 | 0.969412 | 0.972937 | 0.942318 |
49 | cleve | Linear | 0.851366 | 0.849208 | 0.851395 | 0.700411 |
50 | cleveland | Linear | 0.572678 | 0.282709 | 0.299559 | 0.293684 |
51 | cleveland_nominal | Linear | 0.563388 | 0.279341 | 0.293658 | 0.255622 |
52 | cloud | Linear | 0.342424 | 0.353508 | 0.358721 | 0.142428 |
53 | cmc | Linear | 0.501356 | 0.485464 | 0.472785 | 0.218378 |
54 | colic | Linear | 0.812162 | 0.808513 | 0.788018 | 0.595843 |
55 | collins | Linear | 1 | 1 | 1 | 1 |
56 | confidence | Linear | 0.791111 | 0.773889 | 0.793056 | 0.759626 |
57 | contraceptive | Linear | 0.540565 | 0.518234 | 0.509752 | 0.28441 |
58 | corral | Linear | 0.882292 | 0.880782 | 0.883999 | 0.764551 |
59 | credit_a | Linear | 0.852415 | 0.851813 | 0.850519 | 0.702298 |
60 | credit_g | Linear | 0.757333 | 0.710864 | 0.665865 | 0.372272 |
61 | crx | Linear | 0.85 | 0.850795 | 0.846674 | 0.697414 |
62 | dermatology | Linear | 0.969369 | 0.966814 | 0.96621 | 0.961977 |
63 | diabetes | Linear | 0.768831 | 0.753791 | 0.717294 | 0.469189 |
64 | dis | Linear | 0.984857 | 0.492428 | 0.5 | 0 |
65 | ecoli | Linear | 0.885859 | 0.87216 | 0.85102 | 0.841187 |
66 | flags | Linear | 0.462037 | 0.348462 | 0.377677 | 0.287972 |
67 | flare | Linear | 0.829283 | 0.595478 | 0.535652 | 0.12783 |
68 | german | Linear | 0.753167 | 0.708126 | 0.670058 | 0.375508 |
69 | glass | Linear | 0.621138 | 0.535314 | 0.528464 | 0.461073 |
70 | glass2 | Linear | 0.684848 | 0.693779 | 0.689655 | 0.383066 |
71 | haberman | Linear | 0.731183 | 0.603774 | 0.540267 | 0.131531 |
72 | hayes_roth | Linear | 0.521875 | 0.555751 | 0.568605 | 0.265495 |
73 | heart_c | Linear | 0.847541 | 0.846791 | 0.841921 | 0.688653 |
74 | heart_h | Linear | 0.819774 | 0.81881 | 0.789051 | 0.606259 |
75 | heart_statlog | Linear | 0.842593 | 0.84248 | 0.835538 | 0.677814 |
76 | hepatitis | Linear | 0.825806 | 0.718776 | 0.661382 | 0.373473 |
77 | horse_colic | Linear | 0.813964 | 0.8109 | 0.789035 | 0.599212 |
78 | house_votes_84 | Linear | 0.952107 | 0.948088 | 0.951188 | 0.899222 |
79 | hungarian | Linear | 0.837853 | 0.835924 | 0.803843 | 0.638102 |
80 | hypothyroid | Linear | 0.972828 | 0.879405 | 0.809378 | 0.683784 |
81 | ionosphere | Linear | 0.881221 | 0.897518 | 0.84736 | 0.742733 |
82 | iris | Linear | 0.961111 | 0.96375 | 0.963009 | 0.943526 |
83 | irish | Linear | 1 | 1 | 1 | 1 |
84 | kr_vs_kp | Linear | 0.97151 | 0.971492 | 0.971469 | 0.94296 |
85 | krkopt | Linear | 0.404936 | 0.417892 | 0.412894 | 0.332323 |
86 | labor | Linear | 0.925 | 0.91797 | 0.926321 | 0.84014 |
87 | led24 | Linear | 0.725365 | 0.724639 | 0.725013 | 0.695144 |
88 | led7 | Linear | 0.739688 | 0.741364 | 0.736986 | 0.711586 |
89 | lupus | Linear | 0.764815 | 0.755433 | 0.746526 | 0.499081 |
90 | lymphography | Linear | 0.835556 | 0.723817 | 0.725014 | 0.684257 |
91 | magic | Linear | 0.790124 | 0.781407 | 0.745224 | 0.525377 |
92 | mfeat_fourier | Linear | 0.8195 | 0.820409 | 0.820496 | 0.799844 |
93 | mfeat_karhunen | Linear | 0.948167 | 0.948802 | 0.948749 | 0.942459 |
94 | mfeat_morphological | Linear | 0.74125 | 0.747294 | 0.743323 | 0.71533 |
95 | mfeat_zernike | Linear | 0.821 | 0.817336 | 0.821383 | 0.801516 |
96 | mofn_3_7_10 | Linear | 1 | 1 | 1 | 1 |
97 | molecular_biology_promoters | Linear | 0.919697 | 0.924414 | 0.92116 | 0.845227 |
98 | monk1 | Linear | 0.738393 | 0.826608 | 0.741431 | 0.561117 |
99 | monk2 | Linear | 0.65427 | 0.327135 | 0.5 | 0 |
100 | monk3 | Linear | 0.97958 | 0.979676 | 0.979446 | 0.959118 |
101 | movement_libras | Linear | 0.741204 | 0.756347 | 0.754199 | 0.724956 |
102 | mushroom | Linear | 0.999754 | 0.999763 | 0.999745 | 0.999508 |
103 | mux6 | Linear | 0.619231 | 0.62953 | 0.630525 | 0.25987 |
104 | new_thyroid | Linear | 0.95814 | 0.952671 | 0.940826 | 0.912416 |
105 | nursery | Linear | 0.924884 | 0.903061 | 0.874039 | 0.889699 |
106 | page_blocks | Linear | 0.962861 | 0.869096 | 0.741621 | 0.790378 |
107 | parity5 | Linear | 0.295238 | 0.161111 | 0.411111 | -0.177778 |
108 | parity5+5 | Linear | 0.465037 | 0.260692 | 0.491501 | -0.017747 |
109 | pendigits | Linear | 0.955556 | 0.954989 | 0.955076 | 0.950612 |
110 | phoneme | Linear | 0.746068 | 0.690885 | 0.660528 | 0.350007 |
111 | pima | Linear | 0.774026 | 0.761064 | 0.721232 | 0.480235 |
112 | postoperative_patient_data | Linear | 0.751852 | 0.377968 | 0.496337 | -0.011313 |
113 | prnn_crabs | Linear | 0.991667 | 0.990527 | 0.992572 | 0.983082 |
114 | prnn_fglass | Linear | 0.625203 | 0.53561 | 0.532143 | 0.46585 |
115 | prnn_synth | Linear | 0.872 | 0.872465 | 0.873961 | 0.74639 |
116 | profb | Linear | 0.742963 | 0.713757 | 0.694255 | 0.406987 |
117 | ring | Linear | 0.76268 | 0.775659 | 0.762389 | 0.537863 |
118 | saheart | Linear | 0.704659 | 0.695624 | 0.601297 | 0.278692 |
119 | satimage | Linear | 0.86001 | 0.827536 | 0.814275 | 0.827081 |
120 | schizo | Linear | 0.594118 | 0.198789 | 0.33184 | -0.00623 |
121 | segmentation | Linear | 0.950866 | 0.95124 | 0.950425 | 0.942725 |
122 | solar_flare_1 | Linear | 0.742857 | 0.68659 | 0.672345 | 0.669937 |
123 | solar_flare_2 | Linear | 0.761215 | 0.640604 | 0.604137 | 0.695362 |
124 | sonar | Linear | 0.777778 | 0.781779 | 0.779092 | 0.560707 |
125 | soybean | Linear | 0.92642 | 0.957506 | 0.948757 | 0.919781 |
126 | spambase | Linear | 0.903294 | 0.901447 | 0.894805 | 0.796205 |
127 | spect | Linear | 0.843827 | 0.766715 | 0.710397 | 0.468957 |
128 | spectf | Linear | 0.803333 | 0.758452 | 0.74046 | 0.496567 |
129 | splice | Linear | 0.960293 | 0.953551 | 0.957961 | 0.935626 |
130 | tae | Linear | 0.52043 | 0.536375 | 0.530949 | 0.299079 |
131 | texture | Linear | 0.997697 | 0.997742 | 0.997653 | 0.997468 |
132 | threeOf9 | Linear | 0.804207 | 0.804279 | 0.803193 | 0.607446 |
133 | tic_tac_toe | Linear | 0.984201 | 0.988054 | 0.977674 | 0.965661 |
134 | tokyo1 | Linear | 0.919792 | 0.917604 | 0.906875 | 0.824321 |
135 | twonorm | Linear | 0.97795 | 0.977926 | 0.977993 | 0.955919 |
136 | vehicle | Linear | 0.797843 | 0.802044 | 0.804879 | 0.730841 |
137 | vote | Linear | 0.954789 | 0.952864 | 0.953859 | 0.906684 |
138 | vowel | Linear | 0.845118 | 0.847877 | 0.848793 | 0.830133 |
139 | waveform_21 | Linear | 0.8675 | 0.868035 | 0.867198 | 0.801874 |
140 | waveform_40 | Linear | 0.868067 | 0.868539 | 0.868586 | 0.802597 |
141 | wdbc | Linear | 0.973977 | 0.976152 | 0.968829 | 0.9449 |
142 | wine_quality_red | Linear | 0.596458 | 0.317116 | 0.279905 | 0.339421 |
143 | wine_quality_white | Linear | 0.538095 | 0.338107 | 0.238882 | 0.264965 |
144 | wine_recognition | Linear | 0.936111 | 0.943046 | 0.93633 | 0.904157 |
145 | xd6 | Linear | 0.82188 | 0.816704 | 0.767 | 0.581403 |
146 | yeast | Linear | 0.594482 | 0.554987 | 0.513609 | 0.475075 |
No. | Dataset Name | Algorithm | Mean Accuracy [0-1] | Mean Precision | Mean Recall | Mean MCC |
---|---|---|---|---|---|---|
1 | GAMETES_Epistasis_2_Way_20atts_0.1H_EDM_1_1 | NeuralNet | 0.585729 | 0.585261 | 0.585838 | 0.171095 |
2 | GAMETES_Epistasis_2_Way_20atts_0.4H_EDM_1_1 | NeuralNet | 0.708542 | 0.708453 | 0.709424 | 0.417870 |
3 | GAMETES_Epistasis_3_Way_20atts_0.2H_EDM_1_1 | NeuralNet | 0.517500 | 0.517558 | 0.517636 | 0.035193 |
4 | GAMETES_Heterogeneity_20atts_1600_Het_0.4_0.2_… | NeuralNet | 0.644687 | 0.645142 | 0.646459 | 0.291590 |
5 | GAMETES_Heterogeneity_20atts_1600_Het_0.4_0.2_… | NeuralNet | 0.650833 | 0.651469 | 0.652644 | 0.304104 |
6 | Hill_Valley_with_noise | NeuralNet | 0.545405 | 0.550122 | 0.610026 | 0.146337 |
7 | Hill_Valley_without_noise | NeuralNet | 0.554458 | 0.560038 | 0.622925 | 0.168091 |
8 | agaricus_lepiota | NeuralNet | 0.985820 | 0.985703 | 0.985919 | 0.971621 |
9 | allbp | NeuralNet | 0.956777 | 0.359956 | 0.360469 | 0.074941 |
10 | allhyper | NeuralNet | 0.973466 | 0.275000 | 0.267759 | 0.000000 |
11 | allhypo | NeuralNet | 0.922060 | 0.333333 | 0.307353 | 0.000000 |
12 | allrep | NeuralNet | 0.967241 | 0.250000 | 0.241810 | 0.000000 |
13 | analcatdata_aids | NeuralNet | 0.470000 | 0.502937 | 0.524160 | 0.040197 |
14 | analcatdata_asbestos | NeuralNet | 0.766667 | 0.785112 | 0.779735 | 0.563728 |
15 | analcatdata_authorship | NeuralNet | 0.995266 | 0.996488 | 0.994588 | 0.993077 |
16 | analcatdata_bankruptcy | NeuralNet | 0.813333 | 0.829782 | 0.822063 | 0.659108 |
17 | analcatdata_boxing1 | NeuralNet | 0.688889 | 0.598661 | 0.652703 | 0.243174 |
18 | analcatdata_boxing2 | NeuralNet | 0.669136 | 0.668420 | 0.694571 | 0.361372 |
19 | analcatdata_creditscore | NeuralNet | 0.805000 | 0.793736 | 0.774231 | 0.556686 |
20 | analcatdata_cyyoung8092 | NeuralNet | 0.776667 | 0.710353 | 0.738116 | 0.439706 |
21 | analcatdata_cyyoung9302 | NeuralNet | 0.873684 | 0.815863 | 0.795422 | 0.597854 |
22 | analcatdata_dmft | NeuralNet | 0.205208 | 0.211099 | 0.221483 | 0.052217 |
23 | analcatdata_fraud | NeuralNet | 0.648148 | 0.584841 | 0.613591 | 0.207258 |
24 | analcatdata_germangss | NeuralNet | 0.311250 | 0.316450 | 0.324626 | 0.090982 |
25 | analcatdata_happiness | NeuralNet | 0.491667 | 0.486905 | 0.473810 | 0.259076 |
26 | analcatdata_japansolvent | NeuralNet | 0.781818 | 0.809458 | 0.805020 | 0.614055 |
27 | analcatdata_lawsuit | NeuralNet | 0.961006 | 0.897202 | 0.824873 | 0.707293 |
28 | ann_thyroid | NeuralNet | 0.967685 | 0.809154 | 0.826367 | 0.743097 |
29 | appendicitis | NeuralNet | 0.851515 | 0.767637 | 0.752789 | 0.501828 |
30 | australian | NeuralNet | 0.843720 | 0.842648 | 0.841904 | 0.684523 |
31 | auto | NeuralNet | 0.688618 | 0.667752 | 0.695618 | 0.595821 |
32 | backache | NeuralNet | 0.821296 | 0.536126 | 0.545019 | 0.091842 |
33 | balance_scale | NeuralNet | 0.928000 | 0.795545 | 0.799846 | 0.875203 |
34 | banana | NeuralNet | 0.864403 | 0.854274 | 0.876375 | 0.730268 |
35 | biomed | NeuralNet | 0.891270 | 0.879832 | 0.882356 | 0.761581 |
36 | breast | NeuralNet | 0.961190 | 0.964554 | 0.952326 | 0.916753 |
37 | breast_cancer | NeuralNet | 0.722414 | 0.615661 | 0.678091 | 0.284733 |
38 | breast_cancer_wisconsin | NeuralNet | 0.960234 | 0.959006 | 0.957697 | 0.916536 |
39 | breast_w | NeuralNet | 0.964762 | 0.967518 | 0.956630 | 0.924005 |
40 | buggyCrx | NeuralNet | 0.853382 | 0.852646 | 0.852663 | 0.705261 |
41 | bupa | NeuralNet | 0.582609 | 0.587326 | 0.590296 | 0.177548 |
42 | calendarDOW | NeuralNet | 0.591250 | 0.564882 | 0.572987 | 0.485498 |
43 | car | NeuralNet | 0.886705 | 0.645797 | 0.678965 | 0.743524 |
44 | car_evaluation | NeuralNet | 0.957707 | 0.760487 | 0.730270 | 0.909268 |
45 | cars | NeuralNet | 0.764557 | 0.656525 | 0.689043 | 0.562526 |
46 | chess | NeuralNet | 0.990937 | 0.990997 | 0.990887 | 0.981884 |
47 | churn | NeuralNet | 0.905467 | 0.705715 | 0.855562 | 0.539217 |
48 | clean1 | NeuralNet | 0.965625 | 0.966918 | 0.964145 | 0.931036 |
49 | cleve | NeuralNet | 0.781421 | 0.789767 | 0.785457 | 0.575013 |
50 | cleveland | NeuralNet | 0.556284 | 0.292572 | 0.284658 | 0.274622 |
51 | cleveland_nominal | NeuralNet | 0.546995 | 0.277967 | 0.268059 | 0.259451 |
52 | cloud | NeuralNet | 0.251515 | 0.268453 | 0.263668 | 0.014947 |
53 | cmc | NeuralNet | 0.552768 | 0.526055 | 0.537211 | 0.305936 |
54 | colic | NeuralNet | 0.800901 | 0.784415 | 0.794422 | 0.578367 |
55 | collins | NeuralNet | 0.631959 | 0.603885 | 0.575876 | 0.596168 |
56 | confidence | NeuralNet | 0.748889 | 0.775926 | 0.774537 | 0.723127 |
57 | contraceptive | NeuralNet | 0.549605 | 0.521371 | 0.531815 | 0.300551 |
58 | corral | NeuralNet | 0.990625 | 0.993262 | 0.989018 | 0.982066 |
59 | credit_a | NeuralNet | 0.855314 | 0.853688 | 0.854426 | 0.708084 |
60 | credit_g | NeuralNet | 0.718500 | 0.633586 | 0.655008 | 0.286997 |
61 | crx | NeuralNet | 0.847585 | 0.846356 | 0.846279 | 0.692615 |
62 | dermatology | NeuralNet | 0.958559 | 0.957059 | 0.952279 | 0.948802 |
63 | diabetes | NeuralNet | 0.745238 | 0.704854 | 0.721165 | 0.425190 |
64 | dis | NeuralNet | 0.984857 | 0.500000 | 0.492428 | 0.000000 |
65 | ecoli | NeuralNet | 0.866162 | 0.822123 | 0.835700 | 0.814340 |
66 | flags | NeuralNet | 0.454630 | 0.410999 | 0.418892 | 0.292760 |
67 | flare | NeuralNet | 0.820093 | 0.568488 | 0.642486 | 0.193622 |
68 | german | NeuralNet | 0.721833 | 0.644865 | 0.664890 | 0.308723 |
69 | glass | NeuralNet | 0.634959 | 0.556708 | 0.535536 | 0.490254 |
70 | glass2 | NeuralNet | 0.754545 | 0.764611 | 0.763772 | 0.528184 |
71 | haberman | NeuralNet | 0.721505 | 0.581981 | 0.616532 | 0.193974 |
72 | hayes_roth | NeuralNet | 0.725000 | 0.773298 | 0.741867 | 0.597668 |
73 | heart_c | NeuralNet | 0.812022 | 0.815221 | 0.812394 | 0.627410 |
74 | heart_h | NeuralNet | 0.774576 | 0.745714 | 0.767058 | 0.511602 |
75 | heart_statlog | NeuralNet | 0.801235 | 0.799560 | 0.798810 | 0.598298 |
76 | hepatitis | NeuralNet | 0.816129 | 0.691785 | 0.717233 | 0.404004 |
77 | horse_colic | NeuralNet | 0.791441 | 0.767537 | 0.787856 | 0.554577 |
78 | house_votes_84 | NeuralNet | 0.949425 | 0.948397 | 0.945097 | 0.893421 |
79 | hungarian | NeuralNet | 0.793785 | 0.770260 | 0.781005 | 0.550523 |
80 | hypothyroid | NeuralNet | 0.950395 | 0.500000 | 0.475197 | 0.000000 |
81 | ionosphere | NeuralNet | 0.927230 | 0.916240 | 0.927374 | 0.843267 |
82 | iris | NeuralNet | 0.916667 | 0.918241 | 0.924952 | 0.884396 |
83 | irish | NeuralNet | 0.976667 | 0.977974 | 0.975260 | 0.953207 |
84 | kr_vs_kp | NeuralNet | 0.990000 | 0.989985 | 0.990051 | 0.980036 |
85 | krkopt | NeuralNet | 0.745432 | 0.478336 | 0.429315 | 0.717063 |
86 | labor | NeuralNet | 0.883333 | 0.892945 | 0.868836 | 0.755731 |
87 | led24 | NeuralNet | 0.705365 | 0.705415 | 0.703774 | 0.672870 |
88 | led7 | NeuralNet | 0.735469 | 0.733776 | 0.739016 | 0.706936 |
89 | lupus | NeuralNet | 0.712963 | 0.710248 | 0.705527 | 0.413480 |
90 | lymphography | NeuralNet | 0.836667 | 0.742958 | 0.734830 | 0.693317 |
91 | magic | NeuralNet | 0.874658 | 0.847898 | 0.874354 | 0.721732 |
92 | mfeat_fourier | NeuralNet | 0.818500 | 0.819016 | 0.815613 | 0.798834 |
93 | mfeat_karhunen | NeuralNet | 0.961833 | 0.962581 | 0.962295 | 0.957653 |
94 | mfeat_morphological | NeuralNet | 0.737667 | 0.738714 | 0.730848 | 0.712418 |
95 | mfeat_zernike | NeuralNet | 0.827750 | 0.828335 | 0.823932 | 0.809025 |
96 | mofn_3_7_10 | NeuralNet | 0.999874 | 0.999921 | 0.999691 | 0.999612 |
97 | molecular_biology_promoters | NeuralNet | 0.745455 | 0.769183 | 0.767587 | 0.536550 |
98 | monk1 | NeuralNet | 0.900893 | 0.903577 | 0.911824 | 0.815310 |
99 | monk2 | NeuralNet | 0.915152 | 0.903421 | 0.910310 | 0.818805 |
100 | monk3 | NeuralNet | 0.976276 | 0.976377 | 0.976514 | 0.952887 |
101 | movement_libras | NeuralNet | 0.778704 | 0.795320 | 0.795058 | 0.766881 |
102 | mushroom | NeuralNet | 0.993415 | 0.993650 | 0.993265 | 0.986915 |
103 | mux6 | NeuralNet | 0.943590 | 0.949145 | 0.945417 | 0.894395 |
104 | new_thyroid | NeuralNet | 0.916279 | 0.914935 | 0.890192 | 0.838054 |
105 | nursery | NeuralNet | 0.970473 | 0.746161 | 0.728799 | 0.957097 |
106 | page_blocks | NeuralNet | 0.956195 | 0.599961 | 0.622472 | 0.754415 |
107 | parity5 | NeuralNet | 0.128571 | 0.125833 | 0.120278 | -0.749556 |
108 | parity5+5 | NeuralNet | 0.901481 | 0.901240 | 0.905031 | 0.806210 |
109 | pendigits | NeuralNet | 0.992102 | 0.992029 | 0.992150 | 0.991228 |
110 | phoneme | NeuralNet | 0.858125 | 0.819394 | 0.834342 | 0.653326 |
111 | pima | NeuralNet | 0.757576 | 0.722562 | 0.737934 | 0.459277 |
112 | postoperative_patient_data | NeuralNet | 0.616667 | 0.430137 | 0.422865 | -0.134581 |
113 | prnn_crabs | NeuralNet | 0.970833 | 0.974441 | 0.969353 | 0.943719 |
114 | prnn_fglass | NeuralNet | 0.642276 | 0.573701 | 0.555254 | 0.503537 |
115 | prnn_synth | NeuralNet | 0.863333 | 0.869550 | 0.867224 | 0.736697 |
116 | profb | NeuralNet | 0.660000 | 0.569709 | 0.599023 | 0.165549 |
117 | ring | NeuralNet | 0.967320 | 0.967348 | 0.967515 | 0.934863 |
118 | saheart | NeuralNet | 0.698208 | 0.654281 | 0.661048 | 0.314901 |
119 | satimage | NeuralNet | 0.894380 | 0.865557 | 0.882729 | 0.869960 |
120 | schizo | NeuralNet | 0.560784 | 0.416122 | 0.438102 | 0.134946 |
121 | segmentation | NeuralNet | 0.955772 | 0.955819 | 0.956235 | 0.948619 |
122 | solar_flare_1 | NeuralNet | 0.711111 | 0.641237 | 0.634488 | 0.630834 |
123 | solar_flare_2 | NeuralNet | 0.743458 | 0.595600 | 0.617318 | 0.672688 |
124 | sonar | NeuralNet | 0.844444 | 0.847896 | 0.848892 | 0.696714 |
125 | soybean | NeuralNet | 0.885679 | 0.908994 | 0.903971 | 0.875395 |
126 | spambase | NeuralNet | 0.933623 | 0.929705 | 0.931154 | 0.860840 |
127 | spect | NeuralNet | 0.822840 | 0.717052 | 0.719485 | 0.433855 |
128 | spectf | NeuralNet | 0.855714 | 0.818914 | 0.829624 | 0.645212 |
129 | splice | NeuralNet | 0.868025 | 0.859335 | 0.855673 | 0.786508 |
130 | tae | NeuralNet | 0.535484 | 0.543220 | 0.545302 | 0.311698 |
131 | texture | NeuralNet | 0.997182 | 0.997211 | 0.997186 | 0.996905 |
132 | threeOf9 | NeuralNet | 0.988673 | 0.988757 | 0.988827 | 0.977576 |
133 | tic_tac_toe | NeuralNet | 0.910417 | 0.886942 | 0.916250 | 0.802401 |
134 | tokyo1 | NeuralNet | 0.919965 | 0.910703 | 0.914601 | 0.825215 |
135 | twonorm | NeuralNet | 0.973649 | 0.973740 | 0.973626 | 0.947366 |
136 | vehicle | NeuralNet | 0.795098 | 0.804081 | 0.800503 | 0.729591 |
137 | vote | NeuralNet | 0.948276 | 0.947088 | 0.947381 | 0.894389 |
138 | vowel | NeuralNet | 0.963300 | 0.964049 | 0.962838 | 0.959821 |
139 | waveform_21 | NeuralNet | 0.855033 | 0.855107 | 0.856384 | 0.783666 |
140 | waveform_40 | NeuralNet | 0.846667 | 0.847626 | 0.848218 | 0.771493 |
141 | wdbc | NeuralNet | 0.961404 | 0.958843 | 0.959549 | 0.918218 |
142 | wine_quality_red | NeuralNet | 0.599792 | 0.286262 | 0.284769 | 0.351061 |
143 | wine_quality_white | NeuralNet | 0.562789 | 0.251547 | 0.252105 | 0.313237 |
144 | wine_recognition | NeuralNet | 0.969444 | 0.973278 | 0.966759 | 0.954492 |
145 | xd6 | NeuralNet | 1.000000 | 1.000000 | 1.000000 | 1.000000 |
146 | yeast | NeuralNet | 0.588401 | 0.512226 | 0.507737 | 0.467686 |
No. | Dataset Name | Algorithm | Mean Accuracy [0-1] | Mean Precision | Mean Recall | Mean MCC |
---|---|---|---|---|---|---|
1 | GAMETES_Epistasis_2_Way_20atts_0.1H_EDM_1_1 | RF | 0.592813 | 0.593805 | 0.593487 | 0.18729 |
2 | GAMETES_Epistasis_2_Way_20atts_0.4H_EDM_1_1 | RF | 0.716042 | 0.717502 | 0.716862 | 0.434362 |
3 | GAMETES_Epistasis_3_Way_20atts_0.2H_EDM_1_1 | RF | 0.569167 | 0.570024 | 0.569777 | 0.1398 |
4 | GAMETES_Heterogeneity_20atts_1600_Het_0.4_0.2_… | RF | 0.65 | 0.650981 | 0.650655 | 0.301635 |
5 | GAMETES_Heterogeneity_20atts_1600_Het_0.4_0.2_… | RF | 0.676042 | 0.676914 | 0.676607 | 0.35352 |
6 | Hill_Valley_with_noise | RF | 0.566667 | 0.567162 | 0.566994 | 0.134155 |
7 | Hill_Valley_without_noise | RF | 0.591358 | 0.592646 | 0.592079 | 0.18472 |
8 | agaricus_lepiota | RF | 1 | 1 | 1 | 1 |
9 | allbp | RF | 0.961192 | 0.602601 | 0.412937 | 0.363661 |
10 | allhyper | RF | 0.973289 | 0.348426 | 0.280262 | 0.059236 |
11 | allhypo | RF | 0.942175 | 0.803611 | 0.509489 | 0.499523 |
12 | allrep | RF | 0.966976 | 0.344607 | 0.268917 | 0.076523 |
13 | analcatdata_aids | RF | 0.353333 | 0.387989 | 0.361032 | -0.235453 |
14 | analcatdata_asbestos | RF | 0.747059 | 0.751587 | 0.757251 | 0.507501 |
15 | analcatdata_authorship | RF | 0.990533 | 0.99233 | 0.984122 | 0.986218 |
16 | analcatdata_bankruptcy | RF | 0.84 | 0.830198 | 0.836468 | 0.666737 |
17 | analcatdata_boxing1 | RF | 0.804167 | 0.789302 | 0.774962 | 0.562319 |
18 | analcatdata_boxing2 | RF | 0.746914 | 0.753484 | 0.750487 | 0.503755 |
19 | analcatdata_creditscore | RF | 0.968333 | 0.949908 | 0.973839 | 0.921332 |
20 | analcatdata_cyyoung8092 | RF | 0.808333 | 0.817405 | 0.70781 | 0.502572 |
21 | analcatdata_cyyoung9302 | RF | 0.857895 | 0.764068 | 0.721255 | 0.46788 |
22 | analcatdata_dmft | RF | 0.184792 | 0.19342 | 0.18849 | 0.022857 |
23 | analcatdata_fraud | RF | 0.7 | 0.609319 | 0.612659 | 0.209653 |
24 | analcatdata_germangss | RF | 0.161667 | 0.158693 | 0.161677 | -0.116714 |
25 | analcatdata_happiness | RF | 0.216667 | 0.247496 | 0.232063 | -0.121818 |
26 | analcatdata_japansolvent | RF | 0.842424 | 0.859372 | 0.841978 | 0.699035 |
27 | analcatdata_lawsuit | RF | 0.978616 | 0.937273 | 0.906068 | 0.828542 |
28 | ann_thyroid | RF | 0.996111 | 0.974214 | 0.985039 | 0.97328 |
29 | appendicitis | RF | 0.862121 | 0.77616 | 0.761159 | 0.516606 |
30 | australian | RF | 0.858696 | 0.856531 | 0.857277 | 0.713777 |
31 | auto | RF | 0.8 | 0.82009 | 0.792282 | 0.741249 |
32 | backache | RF | 0.837963 | 0.487991 | 0.519393 | 0.054565 |
33 | balance_scale | RF | 0.841333 | 0.564501 | 0.606117 | 0.712345 |
34 | banana | RF | 0.89522 | 0.895011 | 0.892572 | 0.787576 |
35 | biomed | RF | 0.903968 | 0.913905 | 0.873959 | 0.785684 |
36 | breast | RF | 0.96881 | 0.963213 | 0.969114 | 0.932255 |
37 | breast_cancer | RF | 0.722989 | 0.663864 | 0.616834 | 0.275302 |
38 | breast_cancer_wisconsin | RF | 0.955848 | 0.95673 | 0.949492 | 0.906034 |
39 | breast_w | RF | 0.970714 | 0.965086 | 0.971185 | 0.9362 |
40 | buggyCrx | RF | 0.87029 | 0.868878 | 0.87036 | 0.739192 |
41 | bupa | RF | 0.555072 | 0.55896 | 0.55824 | 0.117196 |
42 | calendarDOW | RF | 0.6225 | 0.601339 | 0.598965 | 0.523132 |
43 | car | RF | 0.962813 | 0.90768 | 0.891529 | 0.920316 |
44 | car_evaluation | RF | 0.964162 | 0.90028 | 0.9145 | 0.923467 |
45 | cars | RF | 0.95443 | 0.945243 | 0.934689 | 0.916105 |
46 | chess | RF | 0.990938 | 0.991027 | 0.990844 | 0.981871 |
47 | churn | RF | 0.917 | 0.938249 | 0.707636 | 0.602217 |
48 | clean1 | RF | 0.904861 | 0.905297 | 0.900208 | 0.80544 |
49 | cleve | RF | 0.821858 | 0.820319 | 0.823499 | 0.643642 |
50 | cleveland | RF | 0.554098 | 0.284665 | 0.287316 | 0.257013 |
51 | cleveland_nominal | RF | 0.538251 | 0.29455 | 0.298114 | 0.255859 |
52 | cloud | RF | 0.348485 | 0.35541 | 0.37207 | 0.149374 |
53 | cmc | RF | 0.506667 | 0.482102 | 0.477236 | 0.228526 |
54 | colic | RF | 0.840991 | 0.844669 | 0.815516 | 0.658974 |
55 | collins | RF | 0.999313 | 0.999313 | 0.999038 | 0.999242 |
56 | confidence | RF | 0.782222 | 0.815463 | 0.797389 | 0.743977 |
57 | contraceptive | RF | 0.510847 | 0.484453 | 0.477114 | 0.233082 |
58 | corral | RF | 0.99375 | 0.993687 | 0.993942 | 0.987625 |
59 | credit_a | RF | 0.861836 | 0.86093 | 0.861414 | 0.722311 |
60 | credit_g | RF | 0.762667 | 0.727824 | 0.65338 | 0.372782 |
61 | crx | RF | 0.862319 | 0.861499 | 0.860795 | 0.722277 |
62 | dermatology | RF | 0.972973 | 0.971601 | 0.968303 | 0.966415 |
63 | diabetes | RF | 0.758442 | 0.736482 | 0.716768 | 0.452329 |
64 | dis | RF | 0.985033 | 0.573135 | 0.508874 | 0.050971 |
65 | ecoli | RF | 0.879293 | 0.859011 | 0.833699 | 0.832325 |
66 | flags | RF | 0.469444 | 0.446879 | 0.428278 | 0.306985 |
67 | flare | RF | 0.804206 | 0.606957 | 0.562855 | 0.162851 |
68 | german | RF | 0.7615 | 0.735306 | 0.656369 | 0.382524 |
69 | glass | RF | 0.793496 | 0.768921 | 0.737416 | 0.709925 |
70 | glass2 | RF | 0.866667 | 0.868614 | 0.86623 | 0.734613 |
71 | haberman | RF | 0.706452 | 0.597622 | 0.57313 | 0.167371 |
72 | hayes_roth | RF | 0.81875 | 0.849932 | 0.854616 | 0.718032 |
73 | heart_c | RF | 0.823497 | 0.821805 | 0.821258 | 0.642999 |
74 | heart_h | RF | 0.812429 | 0.803651 | 0.791494 | 0.594451 |
75 | heart_statlog | RF | 0.824074 | 0.822665 | 0.816857 | 0.639373 |
76 | hepatitis | RF | 0.8 | 0.729485 | 0.625241 | 0.325763 |
77 | horse_colic | RF | 0.843243 | 0.847947 | 0.817756 | 0.664626 |
78 | house_votes_84 | RF | 0.954023 | 0.951141 | 0.952488 | 0.903554 |
79 | hungarian | RF | 0.833333 | 0.823948 | 0.807803 | 0.631089 |
80 | hypothyroid | RF | 0.961664 | 0.871592 | 0.661778 | 0.486384 |
81 | ionosphere | RF | 0.934272 | 0.933031 | 0.923157 | 0.855937 |
82 | iris | RF | 0.948889 | 0.950367 | 0.949603 | 0.924185 |
83 | irish | RF | 0.999 | 0.999006 | 0.998942 | 0.997946 |
84 | kr_vs_kp | RF | 0.990938 | 0.991119 | 0.990747 | 0.981865 |
85 | krkopt | RF | 0.750249 | 0.760036 | 0.747196 | 0.721422 |
86 | labor | RF | 0.925 | 0.919127 | 0.920991 | 0.836918 |
87 | led24 | RF | 0.71724 | 0.716457 | 0.716799 | 0.685937 |
88 | led7 | RF | 0.732656 | 0.737603 | 0.730783 | 0.703823 |
89 | lupus | RF | 0.677778 | 0.662729 | 0.668674 | 0.328853 |
90 | lymphography | RF | 0.861111 | 0.727772 | 0.73409 | 0.729926 |
91 | magic | RF | 0.880967 | 0.879621 | 0.856553 | 0.735806 |
92 | mfeat_fourier | RF | 0.832417 | 0.83273 | 0.8336 | 0.814312 |
93 | mfeat_karhunen | RF | 0.962333 | 0.962286 | 0.962723 | 0.958194 |
94 | mfeat_morphological | RF | 0.7005 | 0.699192 | 0.700314 | 0.667651 |
95 | mfeat_zernike | RF | 0.781583 | 0.777452 | 0.782013 | 0.757641 |
96 | mofn_3_7_10 | RF | 0.997484 | 0.998403 | 0.994355 | 0.992711 |
97 | molecular_biology_promoters | RF | 0.907576 | 0.907391 | 0.912776 | 0.819971 |
98 | monk1 | RF | 1 | 1 | 1 | 1 |
99 | monk2 | RF | 0.832782 | 0.82718 | 0.79573 | 0.621766 |
100 | monk3 | RF | 0.98048 | 0.980418 | 0.980483 | 0.960899 |
101 | movement_libras | RF | 0.790278 | 0.797014 | 0.798489 | 0.777306 |
102 | mushroom | RF | 1 | 1 | 1 | 1 |
103 | mux6 | RF | 0.939744 | 0.944234 | 0.9448 | 0.888858 |
104 | new_thyroid | RF | 0.949612 | 0.949849 | 0.923227 | 0.895017 |
105 | nursery | RF | 0.990856 | 0.992722 | 0.969001 | 0.98661 |
106 | page_blocks | RF | 0.974795 | 0.885377 | 0.856833 | 0.862329 |
107 | parity5 | RF | 0.061905 | 0.054444 | 0.067778 | -0.87586 |
108 | parity5+5 | RF | 0.596889 | 0.598563 | 0.598106 | 0.196663 |
109 | pendigits | RF | 0.991617 | 0.991755 | 0.991702 | 0.990689 |
110 | phoneme | RF | 0.907524 | 0.891557 | 0.884286 | 0.775755 |
111 | pima | RF | 0.759957 | 0.73681 | 0.715334 | 0.451304 |
112 | postoperative_patient_data | RF | 0.640741 | 0.389869 | 0.431934 | -0.154795 |
113 | prnn_crabs | RF | 0.92 | 0.919792 | 0.921353 | 0.841109 |
114 | prnn_fglass | RF | 0.788618 | 0.760492 | 0.731584 | 0.703691 |
115 | prnn_synth | RF | 0.846 | 0.851007 | 0.845242 | 0.696126 |
116 | profb | RF | 0.668395 | 0.607657 | 0.559177 | 0.158267 |
117 | ring | RF | 0.95205 | 0.952935 | 0.952216 | 0.905151 |
118 | saheart | RF | 0.70681 | 0.674563 | 0.62945 | 0.29905 |
119 | satimage | RF | 0.916887 | 0.908773 | 0.890325 | 0.897563 |
120 | schizo | RF | 0.587745 | 0.199647 | 0.329257 | -0.021156 |
121 | segmentation | RF | 0.979509 | 0.979535 | 0.979382 | 0.97612 |
122 | solar_flare_1 | RF | 0.730688 | 0.714603 | 0.688479 | 0.654436 |
123 | solar_flare_2 | RF | 0.740187 | 0.632358 | 0.610569 | 0.668219 |
124 | sonar | RF | 0.820635 | 0.832932 | 0.818413 | 0.650891 |
125 | soybean | RF | 0.931358 | 0.965342 | 0.959584 | 0.925446 |
126 | spambase | RF | 0.954542 | 0.956436 | 0.948223 | 0.904614 |
127 | spect | RF | 0.821605 | 0.714415 | 0.695816 | 0.407145 |
128 | spectf | RF | 0.90381 | 0.888917 | 0.867458 | 0.754996 |
129 | splice | RF | 0.969592 | 0.966513 | 0.965407 | 0.950437 |
130 | tae | RF | 0.621505 | 0.626625 | 0.623188 | 0.439679 |
131 | texture | RF | 0.979061 | 0.979271 | 0.979157 | 0.976976 |
132 | threeOf9 | RF | 0.992233 | 0.99182 | 0.992679 | 0.984498 |
133 | tic_tac_toe | RF | 0.988194 | 0.990204 | 0.983946 | 0.974115 |
134 | tokyo1 | RF | 0.927778 | 0.922725 | 0.919434 | 0.842092 |
135 | twonorm | RF | 0.974077 | 0.974071 | 0.974099 | 0.94817 |
136 | vehicle | RF | 0.747059 | 0.743724 | 0.755626 | 0.664236 |
137 | vote | RF | 0.957854 | 0.956191 | 0.956952 | 0.913103 |
138 | vowel | RF | 0.970707 | 0.971521 | 0.972488 | 0.967922 |
139 | waveform_21 | RF | 0.854867 | 0.855389 | 0.854524 | 0.783095 |
140 | waveform_40 | RF | 0.856033 | 0.857514 | 0.856918 | 0.785608 |
141 | wdbc | RF | 0.958187 | 0.959796 | 0.951737 | 0.911386 |
142 | wine_quality_red | RF | 0.699062 | 0.398787 | 0.358419 | 0.514626 |
143 | wine_quality_white | RF | 0.68932 | 0.571892 | 0.399087 | 0.522513 |
144 | wine_recognition | RF | 0.969444 | 0.969947 | 0.973605 | 0.953864 |
145 | xd6 | RF | 0.999316 | 0.999479 | 0.999034 | 0.998511 |
146 | yeast | RF | 0.621396 | 0.565004 | 0.529681 | 0.506088 |
# | Dataset Name | Algorithm | Mean Accuracy [0-1] | Mean Precision | Mean Recall | Mean MCC
---|---|---|---|---|---|---
1 | GAMETES_Epistasis_2_Way_20atts_0.1H_EDM_1_1 | LGBM | 0.600625 | 0.601217 | 0.600913 | 0.20213 |
2 | GAMETES_Epistasis_2_Way_20atts_0.4H_EDM_1_1 | LGBM | 0.725521 | 0.727364 | 0.72608 | 0.453438 |
3 | GAMETES_Epistasis_3_Way_20atts_0.2H_EDM_1_1 | LGBM | 0.546354 | 0.54645 | 0.546371 | 0.092821 |
4 | GAMETES_Heterogeneity_20atts_1600_Het_0.4_0.2_… | LGBM | 0.665 | 0.665684 | 0.665486 | 0.33117 |
5 | GAMETES_Heterogeneity_20atts_1600_Het_0.4_0.2_… | LGBM | 0.682604 | 0.683213 | 0.682965 | 0.366177 |
6 | Hill_Valley_with_noise | LGBM | 0.580247 | 0.580742 | 0.580552 | 0.161293 |
7 | Hill_Valley_without_noise | LGBM | 0.594513 | 0.596329 | 0.59553 | 0.191853 |
8 | agaricus_lepiota | LGBM | 1 | 1 | 1 | 1 |
9 | allbp | LGBM | 0.974525 | 0.804793 | 0.678941 | 0.67448 |
10 | allhyper | LGBM | 0.988565 | 0.717225 | 0.650713 | 0.76277 |
11 | allhypo | LGBM | 0.985853 | 0.923067 | 0.895773 | 0.902718 |
12 | allrep | LGBM | 0.984768 | 0.796444 | 0.735259 | 0.750835 |
13 | analcatdata_aids | LGBM | 0.35 | 0.175 | 0.483333 | 0 |
14 | analcatdata_asbestos | LGBM | 0.77451 | 0.782152 | 0.786 | 0.566685 |
15 | analcatdata_authorship | LGBM | 0.990335 | 0.991843 | 0.982204 | 0.985941 |
16 | analcatdata_bankruptcy | LGBM | 0.676667 | 0.613571 | 0.710516 | 0.421595 |
17 | analcatdata_boxing1 | LGBM | 0.668056 | 0.645938 | 0.595761 | 0.233457 |
18 | analcatdata_boxing2 | LGBM | 0.780247 | 0.794117 | 0.779873 | 0.573408 |
19 | analcatdata_creditscore | LGBM | 0.986667 | 0.973684 | 0.990276 | 0.962419 |
20 | analcatdata_cyyoung8092 | LGBM | 0.798333 | 0.765406 | 0.732137 | 0.492857 |
21 | analcatdata_cyyoung9302 | LGBM | 0.847368 | 0.735986 | 0.747946 | 0.47124 |
22 | analcatdata_dmft | LGBM | 0.200208 | 0.208616 | 0.203876 | 0.041659 |
23 | analcatdata_fraud | LGBM | 0.7 | 0.366667 | 0.516667 | 0 |
24 | analcatdata_germangss | LGBM | 0.373333 | 0.380059 | 0.376709 | 0.169657 |
25 | analcatdata_happiness | LGBM | 0.444444 | 0.426318 | 0.43869 | 0.237565 |
26 | analcatdata_japansolvent | LGBM | 0.815152 | 0.821257 | 0.809405 | 0.627538 |
27 | analcatdata_lawsuit | LGBM | 0.986792 | 0.955017 | 0.938743 | 0.8882 |
28 | ann_thyroid | LGBM | 0.99669 | 0.978271 | 0.986658 | 0.97706 |
29 | appendicitis | LGBM | 0.859091 | 0.753043 | 0.758115 | 0.496495 |
30 | australian | LGBM | 0.855314 | 0.853435 | 0.854054 | 0.707463 |
31 | auto | LGBM | 0.823577 | 0.843036 | 0.822394 | 0.771473 |
32 | backache | LGBM | 0.835185 | 0.586381 | 0.533229 | 0.112935 |
33 | balance_scale | LGBM | 0.861333 | 0.64243 | 0.646857 | 0.753419 |
34 | banana | LGBM | 0.896918 | 0.897183 | 0.893869 | 0.79104 |
35 | biomed | LGBM | 0.901587 | 0.895857 | 0.884865 | 0.779757 |
36 | breast | LGBM | 0.964524 | 0.958979 | 0.963574 | 0.922501 |
37 | breast_cancer | LGBM | 0.712644 | 0.660383 | 0.615009 | 0.266742 |
38 | breast_cancer_wisconsin | LGBM | 0.964912 | 0.966561 | 0.958654 | 0.925122 |
39 | breast_w | LGBM | 0.965476 | 0.959964 | 0.96463 | 0.924528 |
40 | buggyCrx | LGBM | 0.872947 | 0.871543 | 0.871865 | 0.743374 |
41 | bupa | LGBM | 0.559903 | 0.562541 | 0.561954 | 0.12449 |
42 | calendarDOW | LGBM | 0.600833 | 0.5825 | 0.574801 | 0.497894 |
43 | car | LGBM | 0.994316 | 0.98707 | 0.986767 | 0.987641 |
44 | car_evaluation | LGBM | 0.993738 | 0.985846 | 0.985837 | 0.98658 |
45 | cars | LGBM | 0.972152 | 0.965391 | 0.963578 | 0.948032 |
46 | chess | LGBM | 0.994062 | 0.994134 | 0.993976 | 0.98811 |
47 | churn | LGBM | 0.960233 | 0.952108 | 0.876124 | 0.824569 |
48 | clean1 | LGBM | 0.997917 | 0.998125 | 0.997707 | 0.995829 |
49 | cleve | LGBM | 0.800546 | 0.799896 | 0.800955 | 0.600648 |
50 | cleveland | LGBM | 0.548087 | 0.314911 | 0.304379 | 0.277223 |
51 | cleveland_nominal | LGBM | 0.539891 | 0.30768 | 0.302665 | 0.263412 |
52 | cloud | LGBM | 0.465152 | 0.490845 | 0.477279 | 0.293241 |
53 | cmc | LGBM | 0.531751 | 0.510656 | 0.505896 | 0.272571 |
54 | colic | LGBM | 0.823874 | 0.818533 | 0.804626 | 0.622744 |
55 | collins | LGBM | 0.990378 | 0.988866 | 0.989741 | 0.989378 |
56 | confidence | LGBM | 0.777778 | 0.791537 | 0.79613 | 0.744283 |
57 | contraceptive | LGBM | 0.531751 | 0.510656 | 0.505896 | 0.272571 |
58 | corral | LGBM | 1 | 1 | 1 | 1 |
59 | credit_a | LGBM | 0.862319 | 0.8612 | 0.860714 | 0.721891 |
60 | credit_g | LGBM | 0.760667 | 0.712023 | 0.6798 | 0.389833 |
61 | crx | LGBM | 0.862319 | 0.861024 | 0.860974 | 0.721986 |
62 | dermatology | LGBM | 0.966667 | 0.957741 | 0.962793 | 0.958454 |
63 | diabetes | LGBM | 0.743506 | 0.7179 | 0.708364 | 0.425803 |
64 | dis | LGBM | 0.990552 | 0.902409 | 0.759103 | 0.637092 |
65 | ecoli | LGBM | 0.866162 | 0.825708 | 0.800146 | 0.81292 |
66 | flags | LGBM | 0.47963 | 0.451151 | 0.432137 | 0.324978 |
67 | flare | LGBM | 0.816822 | 0.64016 | 0.573647 | 0.202066 |
68 | german | LGBM | 0.757667 | 0.713125 | 0.680203 | 0.39152 |
69 | glass | LGBM | 0.762602 | 0.712419 | 0.68008 | 0.666747 |
70 | glass2 | LGBM | 0.872727 | 0.870638 | 0.873207 | 0.743677 |
71 | haberman | LGBM | 0.70914 | 0.604596 | 0.576643 | 0.176236 |
72 | hayes_roth | LGBM | 0.80625 | 0.824961 | 0.826361 | 0.701695 |
73 | heart_c | LGBM | 0.811475 | 0.809545 | 0.807614 | 0.617022 |
74 | heart_h | LGBM | 0.784746 | 0.770482 | 0.769739 | 0.539905 |
75 | heart_statlog | LGBM | 0.812346 | 0.811347 | 0.807426 | 0.618626 |
76 | hepatitis | LGBM | 0.811828 | 0.718382 | 0.685423 | 0.393035 |
77 | horse_colic | LGBM | 0.82027 | 0.816101 | 0.801575 | 0.617131 |
78 | house_votes_84 | LGBM | 0.955172 | 0.951344 | 0.953965 | 0.905267 |
79 | hungarian | LGBM | 0.79548 | 0.779852 | 0.771316 | 0.550672 |
80 | hypothyroid | LGBM | 0.982096 | 0.919059 | 0.887098 | 0.803823 |
81 | ionosphere | LGBM | 0.941315 | 0.94698 | 0.926107 | 0.872606 |
82 | iris | LGBM | 0.945556 | 0.947153 | 0.946102 | 0.919454 |
83 | irish | LGBM | 1 | 1 | 1 | 1 |
84 | kr_vs_kp | LGBM | 0.994167 | 0.99427 | 0.994067 | 0.988337 |
85 | krkopt | LGBM | 0.669435 | 0.575325 | 0.617445 | 0.632213 |
86 | labor | LGBM | 0.830556 | 0.819286 | 0.827496 | 0.639012 |
87 | led24 | LGBM | 0.689531 | 0.689134 | 0.689131 | 0.65508 |
88 | led7 | LGBM | 0.733333 | 0.737842 | 0.731273 | 0.704584 |
89 | lupus | LGBM | 0.746296 | 0.740188 | 0.730262 | 0.465089 |
90 | lymphography | LGBM | 0.831111 | 0.74156 | 0.743542 | 0.669946 |
91 | magic | LGBM | 0.880827 | 0.880668 | 0.855238 | 0.73546 |
92 | mfeat_fourier | LGBM | 0.844167 | 0.844966 | 0.845643 | 0.827265 |
93 | mfeat_karhunen | LGBM | 0.957833 | 0.957751 | 0.958593 | 0.953202 |
94 | mfeat_morphological | LGBM | 0.681667 | 0.680566 | 0.681202 | 0.646662 |
95 | mfeat_zernike | LGBM | 0.789167 | 0.797656 | 0.789418 | 0.766043 |
96 | mofn_3_7_10 | LGBM | 0.998742 | 0.999192 | 0.997327 | 0.99649 |
97 | molecular_biology_promoters | LGBM | 0.892424 | 0.891829 | 0.894953 | 0.786658 |
98 | monk1 | LGBM | 1 | 1 | 1 | 1 |
99 | monk2 | LGBM | 0.906887 | 0.906045 | 0.886986 | 0.792558 |
100 | monk3 | LGBM | 0.98018 | 0.980204 | 0.980092 | 0.960293 |
101 | movement_libras | LGBM | 0.749537 | 0.770434 | 0.761865 | 0.733694 |
102 | mushroom | LGBM | 1 | 1 | 1 | 1 |
103 | mux6 | LGBM | 0.861538 | 0.867661 | 0.865525 | 0.732914 |
104 | new_thyroid | LGBM | 0.942636 | 0.937915 | 0.916178 | 0.87871 |
105 | nursery | LGBM | 0.999987 | 0.99999 | 0.99999 | 0.999981 |
106 | page_blocks | LGBM | 0.972785 | 0.867976 | 0.843822 | 0.85222 |
107 | parity5 | LGBM | 0.366667 | 0.183333 | 0.5 | 0 |
108 | parity5+5 | LGBM | 0.858815 | 0.859047 | 0.859071 | 0.718117 |
109 | pendigits | LGBM | 0.991572 | 0.991622 | 0.991519 | 0.990634 |
110 | phoneme | LGBM | 0.895035 | 0.874754 | 0.872023 | 0.746735 |
111 | pima | LGBM | 0.748052 | 0.721769 | 0.710611 | 0.431994 |
112 | postoperative_patient_data | LGBM | 0.740741 | 0.376373 | 0.488783 | -0.032547 |
113 | prnn_crabs | LGBM | 0.948333 | 0.946881 | 0.950147 | 0.89697 |
114 | prnn_fglass | LGBM | 0.762602 | 0.712419 | 0.68008 | 0.666747 |
115 | prnn_synth | LGBM | 0.852 | 0.854008 | 0.851682 | 0.705631 |
116 | profb | LGBM | 0.64716 | 0.587869 | 0.573848 | 0.160865 |
117 | ring | LGBM | 0.970248 | 0.970265 | 0.970235 | 0.9405 |
118 | saheart | LGBM | 0.677419 | 0.635696 | 0.623404 | 0.258328 |
119 | satimage | LGBM | 0.922145 | 0.912742 | 0.901401 | 0.903979 |
120 | schizo | LGBM | 0.49902 | 0.38156 | 0.374371 | 0.048628 |
121 | segmentation | LGBM | 0.985786 | 0.985921 | 0.985762 | 0.983441 |
122 | solar_flare_1 | LGBM | 0.767196 | 0.746387 | 0.718967 | 0.701043 |
123 | solar_flare_2 | LGBM | 0.740343 | 0.640344 | 0.618496 | 0.668787 |
124 | sonar | LGBM | 0.852381 | 0.856818 | 0.853737 | 0.710409 |
125 | soybean | LGBM | 0.929877 | 0.958896 | 0.95042 | 0.923242 |
126 | spambase | LGBM | 0.957691 | 0.955772 | 0.955507 | 0.911273 |
127 | spect | LGBM | 0.827778 | 0.724311 | 0.699483 | 0.420503 |
128 | spectf | LGBM | 0.882857 | 0.850989 | 0.858873 | 0.708961 |
129 | splice | LGBM | 0.962644 | 0.954979 | 0.96381 | 0.93975 |
130 | tae | LGBM | 0.504301 | 0.517239 | 0.507952 | 0.264958 |
131 | texture | LGBM | 0.987879 | 0.987909 | 0.98791 | 0.986671 |
132 | threeOf9 | LGBM | 1 | 1 | 1 | 1 |
133 | tic_tac_toe | LGBM | 0.999653 | 0.999733 | 0.999517 | 0.99925 |
134 | tokyo1 | LGBM | 0.926736 | 0.920958 | 0.919076 | 0.839989 |
135 | twonorm | LGBM | 0.96982 | 0.96981 | 0.969834 | 0.939645 |
136 | vehicle | LGBM | 0.762941 | 0.76602 | 0.770803 | 0.684356 |
137 | vote | LGBM | 0.949425 | 0.946945 | 0.949192 | 0.896069 |
138 | vowel | LGBM | 0.938889 | 0.942074 | 0.941 | 0.933086 |
139 | waveform_21 | LGBM | 0.850967 | 0.850802 | 0.850766 | 0.776635 |
140 | waveform_40 | LGBM | 0.8539 | 0.85423 | 0.854521 | 0.781403 |
141 | wdbc | LGBM | 0.966959 | 0.968536 | 0.961098 | 0.929554 |
142 | wine_quality_red | LGBM | 0.684479 | 0.393475 | 0.360834 | 0.493296 |
143 | wine_quality_white | LGBM | 0.671395 | 0.521475 | 0.399357 | 0.497333 |
144 | wine_recognition | LGBM | 0.980556 | 0.977947 | 0.982913 | 0.970321 |
145 | xd6 | LGBM | 0.999658 | 0.999479 | 0.999749 | 0.999227 |
146 | yeast | LGBM | 0.579167 | 0.479634 | 0.457764 | 0.44834 |
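For reference, the following is a minimal sketch, assuming scikit-learn is available, of how the per-dataset means of accuracy, precision, recall, and MCC reported in the classification tables could be aggregated from repeated cross-validation predictions. The function name `classification_summary`, the `(y_true, y_pred)` split structure, and the macro averaging of precision/recall for multiclass targets are illustrative assumptions, not the paper's stated pipeline.

```python
import numpy as np
from sklearn.metrics import (accuracy_score, matthews_corrcoef,
                             precision_score, recall_score)


def classification_summary(splits):
    """Average per-split classification metrics, as in the tables above.

    `splits` is an iterable of (y_true, y_pred) label arrays, one pair per
    cross-validation repeat. Macro averaging for multiclass precision and
    recall is an assumption, not the paper's stated choice.
    """
    acc, prec, rec, mcc = [], [], [], []
    for y_true, y_pred in splits:
        acc.append(accuracy_score(y_true, y_pred))
        prec.append(precision_score(y_true, y_pred, average="macro", zero_division=0))
        rec.append(recall_score(y_true, y_pred, average="macro", zero_division=0))
        mcc.append(matthews_corrcoef(y_true, y_pred))
    return {
        "Mean Accuracy": float(np.mean(acc)),
        "Mean Precision": float(np.mean(prec)),
        "Mean Recall": float(np.mean(rec)),
        "Mean MCC": float(np.mean(mcc)),
    }


# Example: two hypothetical repeats on a 3-class problem.
splits = [
    ([0, 1, 2, 2, 1], [0, 1, 2, 1, 1]),
    ([0, 0, 1, 2, 2], [0, 1, 1, 2, 2]),
]
print(classification_summary(splits))
```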
Appendix F Detailed Results: Regression
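The regression tables below report, per dataset and algorithm, a mean goodness-of-fit score (read here as the coefficient of determination R², since its values are at most 1 and occasionally negative), mean absolute error, mean squared error, and Spearman's rank correlation. The sketch below, assuming scikit-learn and SciPy, shows how such per-dataset summaries could be computed from repeated cross-validation predictions; the function name `regression_summary`, the split structure, and the R² reading of the first metric column are assumptions for illustration.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score


def regression_summary(splits):
    """Average per-split regression metrics, matching the columns below.

    `splits` is an iterable of (y_true, y_pred) arrays, one pair per
    cross-validation repeat. Reading the first table column as R^2 is an
    assumption based on its observed range.
    """
    r2, mae, mse, rho = [], [], [], []
    for y_true, y_pred in splits:
        r2.append(r2_score(y_true, y_pred))
        mae.append(mean_absolute_error(y_true, y_pred))
        mse.append(mean_squared_error(y_true, y_pred))
        corr, _pvalue = spearmanr(y_true, y_pred)
        rho.append(corr)
    return {
        "Mean R2": float(np.mean(r2)),
        "Mean MAE": float(np.mean(mae)),
        "Mean MSE": float(np.mean(mse)),
        "Mean Spearman coeff.": float(np.mean(rho)),
    }


# Example: two hypothetical repeats.
splits = [
    (np.array([1.0, 2.0, 3.0, 4.0]), np.array([1.1, 1.9, 3.2, 3.8])),
    (np.array([2.0, 4.0, 6.0, 8.0]), np.array([2.2, 3.9, 5.7, 8.1])),
]
print(regression_summary(splits))
```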
# | Dataset Name | Algorithm | Mean R² | MAE | MSE | Spearman coeff.
---|---|---|---|---|---|---
1 | 1027_ESL | Ours | 0.814821 | 0.32415 | 0.364286 | 0.915777 |
2 | 1028_SWD | Ours | 0.242729 | 0.453667 | 0.488667 | 0.547945 |
3 | 1029_LEV | Ours | 0.441602 | 0.4215 | 0.503833 | 0.693376 |
4 | 1030_ERA | Ours | 0.326374 | 1.253 | 2.581 | 0.533985 |
5 | 1089_USCrime | Ours | 0.761386 | 13.656667 | 343.763333 | 0.85516 |
6 | 1096_FacultySalaries | Ours | 0.554916 | 1.917929 | 6.254915 | 0.881548 |
7 | 1199_BNG_echoMonths | Ours | 0.457349 | 8.99762 | 135.514966 | 0.642297 |
8 | 192_vineyard | Ours | 0.48772 | 2.319091 | 9.080212 | 0.733184 |
9 | 197_cpu_act | Ours | 0.954674 | 2.228371 | 15.159589 | 0.933337 |
10 | 210_cloud | Ours | 0.724912 | 0.327201 | 0.320947 | 0.899352 |
11 | 225_puma8NH | Ours | 0.655114 | 2.554851 | 10.963756 | 0.802862 |
12 | 227_cpu_small | Ours | 0.962504 | 2.425178 | 12.472524 | 0.915629 |
13 | 228_elusage | Ours | 0.668637 | 9.680586 | 168.506463 | 0.800816 |
14 | 229_pwLinear | Ours | 0.778315 | 1.596004 | 4.202897 | 0.889358 |
15 | 294_satellite_image | Ours | 0.903787 | 0.243849 | 0.469801 | 0.960742 |
16 | 4544_GeographicalOriginalofMusic | Ours | 0.647891 | 0.42381 | 0.365307 | 0.804204 |
17 | 503_wind | Ours | 0.764658 | 2.485348 | 10.531015 | 0.876728 |
18 | 505_tecator | Ours | 0.990744 | 0.83921 | 1.839203 | 0.984015 |
19 | 519_vinnie | Ours | 0.714124 | 1.248684 | 2.573246 | 0.8475 |
20 | 522_pm10 | Ours | 0.231741 | 0.590352 | 0.589455 | 0.521897 |
21 | 523_analcatdata_neavote | Ours | 0.936044 | 0.508333 | 0.911667 | 0.867378 |
22 | 529_pollen | Ours | 0.736858 | 1.254833 | 2.573448 | 0.852636 |
23 | 547_no2 | Ours | 0.451527 | 0.433395 | 0.306515 | 0.675206 |
24 | 560_bodyfat | Ours | 0.967746 | 0.797175 | 2.210489 | 0.986474 |
25 | 562_cpu_small | Ours | 0.96248 | 2.425747 | 12.479398 | 0.91551 |
26 | 573_cpu_act | Ours | 0.954674 | 2.228371 | 15.159589 | 0.933337 |
27 | 579_fri_c0_250_5 | Ours | 0.747212 | 0.378935 | 0.234794 | 0.867755 |
28 | 581_fri_c3_500_25 | Ours | 0.880145 | 0.274256 | 0.119619 | 0.927998 |
29 | 582_fri_c1_500_25 | Ours | 0.862309 | 0.287687 | 0.138304 | 0.92724 |
30 | 583_fri_c1_1000_50 | Ours | 0.874207 | 0.275637 | 0.125888 | 0.933612 |
31 | 584_fri_c4_500_25 | Ours | 0.862294 | 0.278452 | 0.137559 | 0.922143 |
32 | 586_fri_c3_1000_25 | Ours | 0.922484 | 0.217843 | 0.07711 | 0.950582 |
33 | 588_fri_c4_1000_100 | Ours | 0.833808 | 0.317965 | 0.168277 | 0.916987 |
34 | 589_fri_c2_1000_25 | Ours | 0.864343 | 0.28611 | 0.133228 | 0.914812 |
35 | 590_fri_c0_1000_50 | Ours | 0.777288 | 0.367084 | 0.215868 | 0.886631 |
36 | 591_fri_c1_100_10 | Ours | 0.79392 | 0.355943 | 0.205423 | 0.891629 |
37 | 592_fri_c4_1000_25 | Ours | 0.910138 | 0.230571 | 0.088991 | 0.939652 |
38 | 593_fri_c1_1000_10 | Ours | 0.918017 | 0.223762 | 0.081748 | 0.95081 |
39 | 594_fri_c2_100_5 | Ours | 0.698965 | 0.404108 | 0.277513 | 0.822206 |
40 | 595_fri_c0_1000_10 | Ours | 0.836771 | 0.320007 | 0.163036 | 0.917968 |
41 | 596_fri_c2_250_5 | Ours | 0.855819 | 0.288189 | 0.137918 | 0.908562 |
42 | 597_fri_c2_500_5 | Ours | 0.902917 | 0.242846 | 0.096258 | 0.930817 |
43 | 598_fri_c0_1000_25 | Ours | 0.807318 | 0.348594 | 0.190418 | 0.904452 |
44 | 599_fri_c2_1000_5 | Ours | 0.928995 | 0.205654 | 0.070798 | 0.953063 |
45 | 601_fri_c1_250_5 | Ours | 0.874268 | 0.260361 | 0.117707 | 0.931272 |
46 | 602_fri_c3_250_10 | Ours | 0.852551 | 0.287054 | 0.139902 | 0.915365 |
47 | 603_fri_c0_250_50 | Ours | 0.550945 | 0.522681 | 0.427257 | 0.765852 |
48 | 604_fri_c4_500_10 | Ours | 0.903939 | 0.2345 | 0.095063 | 0.936377 |
49 | 605_fri_c2_250_25 | Ours | 0.776405 | 0.366254 | 0.219222 | 0.866843 |
50 | 606_fri_c2_1000_10 | Ours | 0.895583 | 0.251408 | 0.102217 | 0.92564 |
51 | 607_fri_c4_1000_50 | Ours | 0.904882 | 0.233976 | 0.093041 | 0.9444 |
52 | 608_fri_c3_1000_10 | Ours | 0.928383 | 0.201985 | 0.070179 | 0.955772 |
53 | 609_fri_c0_1000_5 | Ours | 0.883291 | 0.272286 | 0.119157 | 0.945173 |
54 | 611_fri_c3_100_5 | Ours | 0.843342 | 0.290307 | 0.13585 | 0.878346 |
55 | 612_fri_c1_1000_5 | Ours | 0.929314 | 0.204331 | 0.067296 | 0.961312 |
56 | 613_fri_c3_250_5 | Ours | 0.879142 | 0.269598 | 0.123383 | 0.919196 |
57 | 615_fri_c4_250_10 | Ours | 0.821346 | 0.326316 | 0.17978 | 0.879216 |
58 | 616_fri_c4_500_50 | Ours | 0.860252 | 0.288322 | 0.138104 | 0.915084 |
59 | 617_fri_c3_500_5 | Ours | 0.902494 | 0.233656 | 0.095288 | 0.944492 |
60 | 618_fri_c3_1000_50 | Ours | 0.89088 | 0.254606 | 0.105341 | 0.924638 |
61 | 620_fri_c1_1000_25 | Ours | 0.896452 | 0.25679 | 0.103944 | 0.943626 |
62 | 621_fri_c0_100_10 | Ours | 0.514948 | 0.504627 | 0.420885 | 0.719398 |
63 | 622_fri_c2_1000_50 | Ours | 0.868326 | 0.285977 | 0.130393 | 0.908663 |
64 | 623_fri_c4_1000_10 | Ours | 0.910817 | 0.225429 | 0.085886 | 0.942258 |
65 | 624_fri_c0_100_5 | Ours | 0.680657 | 0.409647 | 0.264711 | 0.828772 |
66 | 626_fri_c2_500_50 | Ours | 0.81207 | 0.339832 | 0.185641 | 0.868545 |
67 | 627_fri_c2_500_10 | Ours | 0.870395 | 0.265378 | 0.120164 | 0.90524 |
68 | 628_fri_c3_1000_5 | Ours | 0.935337 | 0.19653 | 0.064458 | 0.959792 |
69 | 631_fri_c1_500_5 | Ours | 0.887184 | 0.251644 | 0.111008 | 0.937762 |
70 | 633_fri_c0_500_25 | Ours | 0.755927 | 0.38867 | 0.23952 | 0.870913 |
71 | 634_fri_c2_100_10 | Ours | 0.681313 | 0.430383 | 0.299145 | 0.775288 |
72 | 635_fri_c0_250_10 | Ours | 0.747336 | 0.383817 | 0.229537 | 0.858411 |
73 | 637_fri_c1_500_50 | Ours | 0.756928 | 0.384564 | 0.240337 | 0.873713 |
74 | 641_fri_c1_500_10 | Ours | 0.903447 | 0.248122 | 0.096938 | 0.947524 |
75 | 643_fri_c2_500_25 | Ours | 0.815767 | 0.340522 | 0.183284 | 0.861509 |
76 | 644_fri_c4_250_25 | Ours | 0.831484 | 0.298289 | 0.159565 | 0.899765 |
77 | 645_fri_c3_500_50 | Ours | 0.853164 | 0.292239 | 0.14205 | 0.911955 |
78 | 646_fri_c3_500_10 | Ours | 0.904141 | 0.237812 | 0.093509 | 0.941917 |
79 | 647_fri_c1_250_10 | Ours | 0.844011 | 0.30014 | 0.153291 | 0.902739 |
80 | 648_fri_c1_250_50 | Ours | 0.731215 | 0.40498 | 0.267496 | 0.853423 |
81 | 649_fri_c0_500_5 | Ours | 0.825379 | 0.322537 | 0.173292 | 0.912715 |
82 | 650_fri_c0_500_50 | Ours | 0.731406 | 0.409569 | 0.265992 | 0.86738 |
83 | 651_fri_c0_100_25 | Ours | 0.464538 | 0.616188 | 0.580735 | 0.723058 |
84 | 653_fri_c0_250_25 | Ours | 0.714149 | 0.420427 | 0.275512 | 0.840435 |
85 | 654_fri_c0_500_10 | Ours | 0.763131 | 0.375268 | 0.230507 | 0.875866 |
86 | 656_fri_c1_100_5 | Ours | 0.67904 | 0.380913 | 0.25083 | 0.859699 |
87 | 657_fri_c2_250_10 | Ours | 0.845575 | 0.294064 | 0.144781 | 0.850391 |
88 | 658_fri_c3_250_25 | Ours | 0.813686 | 0.323878 | 0.188254 | 0.904333 |
89 | 663_rabe_266 | Ours | 0.986557 | 3.859722 | 34.615278 | 0.990176 |
90 | 665_sleuth_case2002 | Ours | 0.30437 | 5.34 | 59.04 | 0.363676 |
91 | 666_rmftsa_ladata | Ours | 0.571491 | 1.314268 | 3.314505 | 0.619187 |
92 | 678_visualizing_environmental | Ours | 0.2157 | 2.418534 | 9.419031 | 0.525967 |
93 | 687_sleuth_ex1605 | Ours | 0.304948 | 8.969231 | 130.948718 | 0.584203 |
94 | 690_visualizing_galaxy | Ours | 0.975347 | 10.611795 | 221.629231 | 0.982753 |
95 | 695_chatfield_4 | Ours | 0.827115 | 13.107853 | 363.147952 | 0.923092 |
96 | 712_chscase_geyser1 | Ours | 0.743774 | 5.067407 | 40.359259 | 0.726834 |
97 | feynman_III_12_43 | Ours | 0.999991 | 0.001792 | 0.000006 | 0.999995 |
98 | feynman_III_15_12 | Ours | 0.995147 | 0.243424 | 0.12736 | 0.996878 |
99 | feynman_III_15_14 | Ours | 0.997892 | 0.000219 | 0.000001 | 0.999857 |
100 | feynman_III_15_27 | Ours | 0.998984 | 0.036029 | 0.007256 | 0.999844 |
101 | feynman_III_17_37 | Ours | 0.999504 | 0.072285 | 0.012494 | 0.999817 |
102 | feynman_III_7_38 | Ours | 0.999296 | 0.44518 | 0.910481 | 0.999843 |
103 | feynman_III_8_54 | Ours | 0.983559 | 0.028682 | 0.002052 | 0.989254 |
104 | feynman_II_10_9 | Ours | 0.99925 | 0.003165 | 0.000046 | 0.999862 |
105 | feynman_II_11_28 | Ours | 0.999989 | 0.000632 | 0.000001 | 0.999986 |
106 | feynman_II_13_23 | Ours | 0.999854 | 0.007621 | 0.000216 | 0.999949 |
107 | feynman_II_13_34 | Ours | 0.999318 | 0.031981 | 0.003025 | 0.999782 |
108 | feynman_II_15_4 | Ours | 0.999476 | 0.0789 | 0.014319 | 0.999783 |
109 | feynman_II_15_5 | Ours | 0.999482 | 0.07829 | 0.014174 | 0.999788 |
110 | feynman_II_24_17 | Ours | 0.999776 | 0.0083 | 0.000166 | 0.999914 |
111 | feynman_II_27_16 | Ours | 0.999722 | 0.879617 | 2.226821 | 0.999874 |
112 | feynman_II_27_18 | Ours | 0.999991 | 0.05168 | 0.006177 | 0.999996 |
113 | feynman_II_34_2 | Ours | 0.999737 | 0.104771 | 0.024674 | 0.999839 |
114 | feynman_II_34_29a | Ours | 0.999321 | 0.002835 | 0.000036 | 0.999847 |
115 | feynman_II_34_2a | Ours | 0.999342 | 0.005635 | 0.000138 | 0.999846 |
116 | feynman_II_37_1 | Ours | 0.999751 | 0.252268 | 0.137842 | 0.999861 |
117 | feynman_II_38_14 | Ours | 0.999986 | 0.000496 | 0.000001 | 0.999995 |
118 | feynman_II_3_24 | Ours | 0.999958 | 0.000135 | 0.0 | 0.999996 |
119 | feynman_II_4_23 | Ours | 0.99888 | 0.000461 | 0.000001 | 0.999842 |
120 | feynman_II_8_31 | Ours | 0.999991 | 0.025679 | 0.001567 | 0.999996 |
121 | feynman_II_8_7 | Ours | 0.998964 | 0.001101 | 0.000008 | 0.999871 |
122 | feynman_I_10_7 | Ours | 0.999848 | 0.007695 | 0.000223 | 0.999948 |
123 | feynman_I_12_1 | Ours | 0.999991 | 0.011145 | 0.000234 | 0.999995 |
124 | feynman_I_12_4 | Ours | 0.998486 | 0.000332 | 0.000001 | 0.99987 |
125 | feynman_I_12_5 | Ours | 0.999991 | 0.011157 | 0.000232 | 0.999995 |
126 | feynman_I_14_3 | Ours | 0.999736 | 0.209823 | 0.098519 | 0.99984 |
127 | feynman_I_14_4 | Ours | 0.99999 | 0.025775 | 0.001563 | 0.999996 |
128 | feynman_I_15_10 | Ours | 0.999485 | 0.031841 | 0.002294 | 0.999806 |
129 | feynman_I_16_6 | Ours | 0.999825 | 0.009806 | 0.000227 | 0.999923 |
130 | feynman_I_18_12 | Ours | 0.999752 | 0.079259 | 0.013642 | 0.999844 |
131 | feynman_I_25_13 | Ours | 0.999975 | 0.001993 | 0.000015 | 0.999994 |
132 | feynman_I_26_2 | Ours | 0.999975 | 0.001382 | 0.000006 | 0.999993 |
133 | feynman_I_27_6 | Ours | 0.999587 | 0.0044 | 0.000053 | 0.999807 |
134 | feynman_I_29_4 | Ours | 0.999931 | 0.003808 | 0.000121 | 0.999993 |
135 | feynman_I_30_3 | Ours | 0.994476 | 0.105824 | 0.036724 | 0.996563 |
136 | feynman_I_30_5 | Ours | 0.999392 | 0.001463 | 0.000009 | 0.999856 |
137 | feynman_I_34_1 | Ours | 0.999445 | 0.018951 | 0.001767 | 0.999906 |
138 | feynman_I_34_14 | Ours | 0.999759 | 0.014131 | 0.000633 | 0.999936 |
139 | feynman_I_34_27 | Ours | 0.999991 | 0.001779 | 0.000006 | 0.999995 |
140 | feynman_I_37_4 | Ours | 0.999656 | 0.035048 | 0.002833 | 0.999879 |
141 | feynman_I_39_1 | Ours | 0.999991 | 0.016769 | 0.000526 | 0.999995 |
142 | feynman_I_39_11 | Ours | 0.999296 | 0.039029 | 0.006582 | 0.999853 |
143 | feynman_I_43_31 | Ours | 0.999746 | 0.209029 | 0.095828 | 0.99984 |
144 | feynman_I_47_23 | Ours | 0.999546 | 0.008663 | 0.000206 | 0.999846 |
145 | feynman_I_48_2 | Ours | 0.999926 | 0.575089 | 0.766039 | 0.999952 |
146 | feynman_I_6_2 | Ours | 0.999986 | 0.000099 | 0.0 | 0.999994 |
147 | feynman_I_6_2b | Ours | 0.999519 | 0.000746 | 0.000002 | 0.999776 |
148 | nikuradse_1 | Ours | 0.997497 | 0.004895 | 0.000064 | 0.989449 |
149 | strogatz_bacres1 | Ours | 0.997635 | 0.04384 | 0.016487 | 0.987143 |
150 | strogatz_bacres2 | Ours | 0.995697 | 0.031386 | 0.018891 | 0.997317 |
151 | strogatz_barmag1 | Ours | 0.988846 | 0.005775 | 0.000823 | 0.998904 |
152 | strogatz_barmag2 | Ours | 0.997224 | 0.005008 | 0.000216 | 0.99758 |
153 | strogatz_glider1 | Ours | 0.985167 | 0.061423 | 0.009102 | 0.991621 |
154 | strogatz_glider2 | Ours | 0.977176 | 0.073873 | 0.019675 | 0.978445 |
155 | strogatz_lv1 | Ours | 0.467779 | 0.227348 | 7.977268 | 0.994459 |
156 | strogatz_lv2 | Ours | 0.627065 | 0.087129 | 0.584009 | 0.992025 |
157 | strogatz_predprey1 | Ours | 0.931857 | 0.127998 | 0.776514 | 0.991957 |
158 | strogatz_predprey2 | Ours | 0.990725 | 0.060801 | 0.022346 | 0.996018 |
159 | strogatz_shearflow1 | Ours | 0.98088 | 0.019286 | 0.006435 | 0.996114 |
160 | strogatz_shearflow2 | Ours | 0.990451 | 0.007028 | 0.000487 | 0.993458 |
161 | strogatz_vdp1 | Ours | 0.956707 | 0.100081 | 0.144906 | 0.946581 |
162 | strogatz_vdp2 | Ours | 0.999936 | 0.000357 | 0.000001 | 0.999972 |
# | Dataset Name | Algorithm | Mean R² | MAE | MSE | Spearman coeff.
---|---|---|---|---|---|---
1 | 1027_ESL | GB | 0.843825 | 0.411648 | 0.306425 | 0.933898 |
2 | 1028_SWD | GB | 0.40216 | 0.500758 | 0.386582 | 0.629681 |
3 | 1029_LEV | GB | 0.544847 | 0.487292 | 0.41125 | 0.741157 |
4 | 1030_ERA | GB | 0.354067 | 1.265373 | 2.476943 | 0.561748 |
5 | 1089_USCrime | GB | 0.735817 | 15.812908 | 388.658039 | 0.829873 |
6 | 1096_FacultySalaries | GB | 0.655666 | 1.692329 | 5.266757 | 0.867306 |
7 | 1199_BNG_echoMonths | GB | 0.459803 | 9.042931 | 134.895122 | 0.647074 |
8 | 192_vineyard | GB | 0.414016 | 2.540394 | 10.447293 | 0.70301 |
9 | 197_cpu_act | GB | 0.985098 | 1.577219 | 4.927499 | 0.966054 |
10 | 210_cloud | GB | 0.744946 | 0.315625 | 0.247581 | 0.911323 |
11 | 225_puma8NH | GB | 0.666022 | 2.541262 | 10.616178 | 0.810667 |
12 | 227_cpu_small | GB | 0.977728 | 1.927811 | 7.364241 | 0.95139 |
13 | 228_elusage | GB | 0.662711 | 9.845451 | 185.548119 | 0.769757 |
14 | 229_pwLinear | GB | 0.866772 | 1.230784 | 2.497797 | 0.929715 |
15 | 294_satellite_image | GB | 0.891701 | 0.427456 | 0.528769 | 0.944031 |
16 | 4544_GeographicalOriginalofMusic | GB | 0.735437 | 0.370617 | 0.272636 | 0.8508 |
17 | 503_wind | GB | 0.792332 | 2.344334 | 9.286841 | 0.890397 |
18 | 505_tecator | GB | 0.994561 | 0.720312 | 1.089333 | 0.990542 |
19 | 519_vinnie | GB | 0.677588 | 1.331072 | 2.918281 | 0.835808 |
20 | 522_pm10 | GB | 0.40249 | 0.52595 | 0.458839 | 0.643265 |
21 | 523_analcatdata_neavote | GB | 0.946617 | 0.587192 | 0.766019 | 0.86266 |
22 | 529_pollen | GB | 0.766195 | 1.194947 | 2.284538 | 0.866195 |
23 | 547_no2 | GB | 0.579428 | 0.376816 | 0.23371 | 0.744537 |
24 | 560_bodyfat | GB | 0.96948 | 0.447498 | 2.047946 | 0.989411 |
25 | 562_cpu_small | GB | 0.977697 | 1.928112 | 7.378158 | 0.951374 |
26 | 573_cpu_act | GB | 0.985101 | 1.574965 | 4.926176 | 0.966219 |
27 | 579_fri_c0_250_5 | GB | 0.826108 | 0.310522 | 0.159872 | 0.905514 |
28 | 581_fri_c3_500_25 | GB | 0.901667 | 0.241635 | 0.097505 | 0.942909 |
29 | 582_fri_c1_500_25 | GB | 0.896002 | 0.247032 | 0.104206 | 0.945404 |
30 | 583_fri_c1_1000_50 | GB | 0.92859 | 0.206142 | 0.070994 | 0.959217 |
31 | 584_fri_c4_500_25 | GB | 0.892647 | 0.248087 | 0.105783 | 0.93504 |
32 | 586_fri_c3_1000_25 | GB | 0.934247 | 0.197888 | 0.06533 | 0.958883 |
33 | 588_fri_c4_1000_100 | GB | 0.91912 | 0.221115 | 0.081493 | 0.94965 |
34 | 589_fri_c2_1000_25 | GB | 0.932203 | 0.201979 | 0.066672 | 0.962761 |
35 | 590_fri_c0_1000_50 | GB | 0.887233 | 0.261766 | 0.109249 | 0.940085 |
36 | 591_fri_c1_100_10 | GB | 0.764455 | 0.35971 | 0.233366 | 0.858847 |
37 | 592_fri_c4_1000_25 | GB | 0.917861 | 0.217352 | 0.080926 | 0.95125 |
38 | 593_fri_c1_1000_10 | GB | 0.950663 | 0.172583 | 0.049184 | 0.970845 |
39 | 594_fri_c2_100_5 | GB | 0.680408 | 0.406565 | 0.294247 | 0.809223 |
40 | 595_fri_c0_1000_10 | GB | 0.905908 | 0.240498 | 0.093701 | 0.950441 |
41 | 596_fri_c2_250_5 | GB | 0.885681 | 0.255633 | 0.108273 | 0.927558 |
42 | 597_fri_c2_500_5 | GB | 0.932894 | 0.199733 | 0.066737 | 0.953171 |
43 | 598_fri_c0_1000_25 | GB | 0.906981 | 0.241539 | 0.091942 | 0.952676 |
44 | 599_fri_c2_1000_5 | GB | 0.956357 | 0.162371 | 0.043482 | 0.971351 |
45 | 601_fri_c1_250_5 | GB | 0.903114 | 0.23158 | 0.090293 | 0.941958 |
46 | 602_fri_c3_250_10 | GB | 0.819989 | 0.302515 | 0.170535 | 0.912131 |
47 | 603_fri_c0_250_50 | GB | 0.779161 | 0.359235 | 0.209984 | 0.892731 |
48 | 604_fri_c4_500_10 | GB | 0.911427 | 0.228038 | 0.087185 | 0.938375 |
49 | 605_fri_c2_250_25 | GB | 0.841342 | 0.308476 | 0.154068 | 0.903712 |
50 | 606_fri_c2_1000_10 | GB | 0.942107 | 0.186805 | 0.056795 | 0.962095 |
51 | 607_fri_c4_1000_50 | GB | 0.920848 | 0.214492 | 0.076891 | 0.953161 |
52 | 608_fri_c3_1000_10 | GB | 0.933739 | 0.191745 | 0.064433 | 0.960649 |
53 | 609_fri_c0_1000_5 | GB | 0.925105 | 0.218052 | 0.07637 | 0.960385 |
54 | 611_fri_c3_100_5 | GB | 0.787867 | 0.332415 | 0.184403 | 0.874185 |
55 | 612_fri_c1_1000_5 | GB | 0.950666 | 0.167881 | 0.0469 | 0.971885 |
56 | 613_fri_c3_250_5 | GB | 0.888793 | 0.255482 | 0.112866 | 0.930164 |
57 | 615_fri_c4_250_10 | GB | 0.825742 | 0.309544 | 0.175729 | 0.899653 |
58 | 616_fri_c4_500_50 | GB | 0.870546 | 0.266212 | 0.126715 | 0.918094 |
59 | 617_fri_c3_500_5 | GB | 0.91477 | 0.212436 | 0.083035 | 0.952741 |
60 | 618_fri_c3_1000_50 | GB | 0.917416 | 0.217763 | 0.079267 | 0.941898 |
61 | 620_fri_c1_1000_25 | GB | 0.930267 | 0.20674 | 0.070272 | 0.959332 |
62 | 621_fri_c0_100_10 | GB | 0.673515 | 0.409779 | 0.282499 | 0.833133 |
63 | 622_fri_c2_1000_50 | GB | 0.928422 | 0.208101 | 0.070624 | 0.951736 |
64 | 623_fri_c4_1000_10 | GB | 0.9289 | 0.198136 | 0.068375 | 0.952496 |
65 | 624_fri_c0_100_5 | GB | 0.775367 | 0.344426 | 0.185308 | 0.86807 |
66 | 626_fri_c2_500_50 | GB | 0.89591 | 0.252418 | 0.103325 | 0.934982 |
67 | 627_fri_c2_500_10 | GB | 0.913013 | 0.21429 | 0.080541 | 0.946179 |
68 | 628_fri_c3_1000_5 | GB | 0.945819 | 0.180783 | 0.053809 | 0.967147 |
69 | 631_fri_c1_500_5 | GB | 0.914616 | 0.224312 | 0.083882 | 0.951624 |
70 | 633_fri_c0_500_25 | GB | 0.866778 | 0.28306 | 0.130135 | 0.927416 |
71 | 634_fri_c2_100_10 | GB | 0.707546 | 0.403375 | 0.27576 | 0.802657 |
72 | 635_fri_c0_250_10 | GB | 0.795412 | 0.341222 | 0.183931 | 0.883243 |
73 | 637_fri_c1_500_50 | GB | 0.891905 | 0.25252 | 0.106855 | 0.944885 |
74 | 641_fri_c1_500_10 | GB | 0.92871 | 0.207534 | 0.071374 | 0.961263 |
75 | 643_fri_c2_500_25 | GB | 0.900465 | 0.247269 | 0.098138 | 0.933107 |
76 | 644_fri_c4_250_25 | GB | 0.784022 | 0.333082 | 0.209017 | 0.878636 |
77 | 645_fri_c3_500_50 | GB | 0.861811 | 0.277696 | 0.133277 | 0.907534 |
78 | 646_fri_c3_500_10 | GB | 0.911923 | 0.220252 | 0.085503 | 0.945606 |
79 | 647_fri_c1_250_10 | GB | 0.887172 | 0.252802 | 0.109562 | 0.928647 |
80 | 648_fri_c1_250_50 | GB | 0.837642 | 0.303371 | 0.159559 | 0.905972 |
81 | 649_fri_c0_500_5 | GB | 0.903218 | 0.241655 | 0.096167 | 0.949057 |
82 | 650_fri_c0_500_50 | GB | 0.862419 | 0.289007 | 0.135707 | 0.930885 |
83 | 651_fri_c0_100_25 | GB | 0.567311 | 0.556682 | 0.462681 | 0.804561 |
84 | 653_fri_c0_250_25 | GB | 0.775675 | 0.36665 | 0.215326 | 0.87306 |
85 | 654_fri_c0_500_10 | GB | 0.868828 | 0.277005 | 0.127472 | 0.931783 |
86 | 656_fri_c1_100_5 | GB | 0.693063 | 0.372854 | 0.241055 | 0.859148 |
87 | 657_fri_c2_250_10 | GB | 0.883046 | 0.255819 | 0.110549 | 0.905226 |
88 | 658_fri_c3_250_25 | GB | 0.809951 | 0.331847 | 0.192954 | 0.88508 |
89 | 663_rabe_266 | GB | 0.99757 | 1.6694 | 6.247533 | 0.997376 |
90 | 665_sleuth_case2002 | GB | 0.301035 | 5.605296 | 59.782738 | 0.463527 |
91 | 666_rmftsa_ladata | GB | 0.531554 | 1.35438 | 3.540895 | 0.612809 |
92 | 678_visualizing_environmental | GB | 0.182869 | 2.393865 | 9.85989 | 0.524717 |
93 | 687_sleuth_ex1605 | GB | 0.424127 | 8.230798 | 109.769692 | 0.664333 |
94 | 690_visualizing_galaxy | GB | 0.972577 | 11.648016 | 246.035465 | 0.980595 |
95 | 695_chatfield_4 | GB | 0.811831 | 13.754273 | 394.666729 | 0.928151 |
96 | 712_chscase_geyser1 | GB | 0.708235 | 5.390796 | 46.218939 | 0.72483 |
97 | feynman_III_12_43 | GB | 0.999777 | 0.009144 | 0.000146 | 0.999862 |
98 | feynman_III_15_12 | GB | 0.891242 | 1.271665 | 2.854668 | 0.952377 |
99 | feynman_III_15_14 | GB | 0.99803 | 0.000417 | 0.000001 | 0.997153 |
100 | feynman_III_15_27 | GB | 0.998772 | 0.061388 | 0.00875 | 0.999203 |
101 | feynman_III_17_37 | GB | 0.998614 | 0.146078 | 0.03494 | 0.999072 |
102 | feynman_III_7_38 | GB | 0.998941 | 0.823206 | 1.369195 | 0.999285 |
103 | feynman_III_8_54 | GB | 0.303233 | 0.250471 | 0.086956 | 0.53352 |
104 | feynman_II_10_9 | GB | 0.998956 | 0.005424 | 0.000064 | 0.999375 |
105 | feynman_II_11_28 | GB | 0.999746 | 0.00345 | 0.000022 | 0.999751 |
106 | feynman_II_13_23 | GB | 0.999942 | 0.006635 | 0.000085 | 0.999973 |
107 | feynman_II_13_34 | GB | 0.999747 | 0.025141 | 0.001124 | 0.999878 |
108 | feynman_II_15_4 | GB | 0.998739 | 0.145999 | 0.034424 | 0.999112 |
109 | feynman_II_15_5 | GB | 0.998703 | 0.148046 | 0.035469 | 0.999086 |
110 | feynman_II_24_17 | GB | 0.999786 | 0.009717 | 0.000159 | 0.999882 |
111 | feynman_II_27_16 | GB | 0.999038 | 2.006489 | 7.70378 | 0.999233 |
112 | feynman_II_27_18 | GB | 0.99981 | 0.258533 | 0.124766 | 0.999873 |
113 | feynman_II_34_2 | GB | 0.999065 | 0.224284 | 0.087908 | 0.999368 |
114 | feynman_II_34_29a | GB | 0.998903 | 0.005328 | 0.000058 | 0.999263 |
115 | feynman_II_34_2a | GB | 0.99894 | 0.010497 | 0.000223 | 0.999292 |
116 | feynman_II_37_1 | GB | 0.999126 | 0.526923 | 0.483049 | 0.999449 |
117 | feynman_II_38_14 | GB | 0.99978 | 0.002308 | 0.00001 | 0.999871 |
118 | feynman_II_3_24 | GB | 0.999799 | 0.0005 | 0.000001 | 0.999807 |
119 | feynman_II_4_23 | GB | 0.998778 | 0.000766 | 0.000001 | 0.999227 |
120 | feynman_II_8_31 | GB | 0.999808 | 0.130123 | 0.031723 | 0.999874 |
121 | feynman_II_8_7 | GB | 0.998564 | 0.002103 | 0.000012 | 0.999034 |
122 | feynman_I_10_7 | GB | 0.999942 | 0.006637 | 0.000085 | 0.999973 |
123 | feynman_I_12_1 | GB | 0.999779 | 0.057045 | 0.005667 | 0.999862 |
124 | feynman_I_12_4 | GB | 0.998729 | 0.000545 | 0.000001 | 0.99867 |
125 | feynman_I_12_5 | GB | 0.999781 | 0.056772 | 0.005631 | 0.999864 |
126 | feynman_I_14_3 | GB | 0.999038 | 0.452357 | 0.35937 | 0.999355 |
127 | feynman_I_14_4 | GB | 0.999809 | 0.128623 | 0.031144 | 0.999872 |
128 | feynman_I_15_10 | GB | 0.999736 | 0.025599 | 0.001176 | 0.999875 |
129 | feynman_I_16_6 | GB | 0.999426 | 0.021044 | 0.000744 | 0.999705 |
130 | feynman_I_18_12 | GB | 0.999048 | 0.177748 | 0.052272 | 0.999372 |
131 | feynman_I_25_13 | GB | 0.999765 | 0.008456 | 0.000144 | 0.999842 |
132 | feynman_I_26_2 | GB | 0.999807 | 0.005237 | 0.000048 | 0.999892 |
133 | feynman_I_27_6 | GB | 0.999183 | 0.007485 | 0.000104 | 0.999555 |
134 | feynman_I_29_4 | GB | 0.999734 | 0.01411 | 0.000467 | 0.999786 |
135 | feynman_I_30_3 | GB | 0.944169 | 0.456395 | 0.371263 | 0.949052 |
136 | feynman_I_30_5 | GB | 0.999247 | 0.002193 | 0.000011 | 0.999604 |
137 | feynman_I_34_1 | GB | 0.999499 | 0.027293 | 0.001592 | 0.999795 |
138 | feynman_I_34_14 | GB | 0.999718 | 0.019595 | 0.00074 | 0.999877 |
139 | feynman_I_34_27 | GB | 0.999779 | 0.009122 | 0.000145 | 0.999862 |
140 | feynman_I_37_4 | GB | 0.999358 | 0.055282 | 0.005287 | 0.999542 |
141 | feynman_I_39_1 | GB | 0.999782 | 0.085379 | 0.01271 | 0.999865 |
142 | feynman_I_39_11 | GB | 0.998941 | 0.071568 | 0.009894 | 0.999318 |
143 | feynman_I_43_31 | GB | 0.999045 | 0.452485 | 0.359654 | 0.999359 |
144 | feynman_I_47_23 | GB | 0.999262 | 0.013819 | 0.000336 | 0.99958 |
145 | feynman_I_48_2 | GB | 0.999781 | 1.116941 | 2.259847 | 0.999851 |
146 | feynman_I_6_2 | GB | 0.999765 | 0.000483 | 0.0 | 0.999888 |
147 | feynman_I_6_2b | GB | 0.996133 | 0.002474 | 0.000014 | 0.99801 |
148 | nikuradse_1 | GB | 0.996922 | 0.006384 | 0.000078 | 0.989391 |
149 | strogatz_bacres1 | GB | 0.998239 | 0.045395 | 0.01087 | 0.972479 |
150 | strogatz_bacres2 | GB | 0.993985 | 0.047292 | 0.026821 | 0.9961 |
151 | strogatz_barmag1 | GB | 0.981286 | 0.009798 | 0.001255 | 0.992695 |
152 | strogatz_barmag2 | GB | 0.991133 | 0.011211 | 0.000591 | 0.984098 |
153 | strogatz_glider1 | GB | 0.979589 | 0.079676 | 0.012392 | 0.988568 |
154 | strogatz_glider2 | GB | 0.96412 | 0.092327 | 0.031138 | 0.976937 |
155 | strogatz_lv1 | GB | 0.340567 | 0.223821 | 6.248533 | 0.938462 |
156 | strogatz_lv2 | GB | -0.898451 | 0.150845 | 0.729317 | 0.759218 |
157 | strogatz_predprey1 | GB | 0.940588 | 0.146252 | 0.646596 | 0.988357 |
158 | strogatz_predprey2 | GB | 0.989898 | 0.077975 | 0.024433 | 0.994818 |
159 | strogatz_shearflow1 | GB | 0.940203 | 0.030389 | 0.013452 | 0.987928 |
160 | strogatz_shearflow2 | GB | 0.978294 | 0.012743 | 0.001115 | 0.994161 |
161 | strogatz_vdp1 | GB | 0.966731 | 0.115695 | 0.090597 | 0.953146 |
162 | strogatz_vdp2 | GB | 0.999862 | 0.000618 | 0.000001 | 0.999838 |
# | Dataset Name | Algorithm | Mean R² | MAE | MSE | Spearman coeff.
---|---|---|---|---|---|---
1 | 1027_ESL | KNN | 0.807879 | 0.41613 | 0.37531 | 0.913604 |
2 | 1028_SWD | KNN | 0.343418 | 0.510123 | 0.424094 | 0.600515 |
3 | 1029_LEV | KNN | 0.513662 | 0.493256 | 0.439421 | 0.722121 |
4 | 1030_ERA | KNN | 0.345674 | 1.26361 | 2.507198 | 0.552753 |
5 | 1089_USCrime | KNN | 0.703708 | 14.772808 | 412.108062 | 0.826278 |
6 | 1096_FacultySalaries | KNN | 0.53663 | 1.953896 | 6.991502 | 0.868185 |
7 | 1199_BNG_echoMonths | KNN | 0.334282 | 10.376265 | 166.239817 | 0.569137 |
8 | 192_vineyard | KNN | 0.373565 | 2.628693 | 10.633359 | 0.689823 |
9 | 197_cpu_act | KNN | 0.888774 | 4.247011 | 36.847611 | 0.795425 |
10 | 210_cloud | KNN | 0.658076 | 0.36535 | 0.381462 | 0.854528 |
11 | 225_puma8NH | KNN | 0.616011 | 2.766587 | 12.207019 | 0.78323 |
12 | 227_cpu_small | KNN | 0.886621 | 4.300175 | 37.553102 | 0.791214 |
13 | 228_elusage | KNN | 0.67304 | 9.506251 | 174.1098 | 0.784325 |
14 | 229_pwLinear | KNN | 0.70406 | 1.855126 | 5.603834 | 0.857754 |
15 | 294_satellite_image | KNN | 0.908938 | 0.257462 | 0.444755 | 0.957693 |
16 | 4544_GeographicalOriginalofMusic | KNN | 0.388877 | 0.554609 | 0.637352 | 0.600053 |
17 | 503_wind | KNN | 0.745951 | 2.583735 | 11.36322 | 0.866253 |
18 | 505_tecator | KNN | 0.987731 | 1.073242 | 2.476996 | 0.981551 |
19 | 519_vinnie | KNN | 0.569558 | 1.526279 | 3.896591 | 0.775169 |
20 | 522_pm10 | KNN | 0.180996 | 0.626031 | 0.634458 | 0.442965 |
21 | 523_analcatdata_neavote | KNN | 0.924445 | 0.611024 | 1.116954 | 0.845462 |
22 | 529_pollen | KNN | 0.695374 | 1.337789 | 2.97923 | 0.832418 |
23 | 547_no2 | KNN | 0.16756 | 0.528659 | 0.466909 | 0.431959 |
24 | 560_bodyfat | KNN | 0.594354 | 4.289598 | 27.442293 | 0.768671 |
25 | 562_cpu_small | KNN | 0.88669 | 4.299562 | 37.534212 | 0.791279 |
26 | 573_cpu_act | KNN | 0.888807 | 4.246408 | 36.835197 | 0.795559 |
27 | 579_fri_c0_250_5 | KNN | 0.757979 | 0.370055 | 0.226623 | 0.88502 |
28 | 581_fri_c3_500_25 | KNN | 0.406513 | 0.614419 | 0.594702 | 0.641756 |
29 | 582_fri_c1_500_25 | KNN | 0.299068 | 0.671775 | 0.702257 | 0.590878 |
30 | 583_fri_c1_1000_50 | KNN | 0.273415 | 0.695145 | 0.725051 | 0.52937 |
31 | 584_fri_c4_500_25 | KNN | 0.398744 | 0.628441 | 0.600895 | 0.633638 |
32 | 586_fri_c3_1000_25 | KNN | 0.424535 | 0.602009 | 0.574932 | 0.671286 |
33 | 588_fri_c4_1000_100 | KNN | 0.16538 | 0.731393 | 0.843612 | 0.446299 |
34 | 589_fri_c2_1000_25 | KNN | 0.430646 | 0.60814 | 0.561002 | 0.663878 |
35 | 590_fri_c0_1000_50 | KNN | 0.361098 | 0.639875 | 0.622629 | 0.655584 |
36 | 591_fri_c1_100_10 | KNN | 0.389413 | 0.631294 | 0.603467 | 0.625965 |
37 | 592_fri_c4_1000_25 | KNN | 0.406186 | 0.615368 | 0.590089 | 0.604161 |
38 | 593_fri_c1_1000_10 | KNN | 0.661404 | 0.458173 | 0.338014 | 0.817027 |
39 | 594_fri_c2_100_5 | KNN | 0.502196 | 0.518513 | 0.462946 | 0.755388 |
40 | 595_fri_c0_1000_10 | KNN | 0.698886 | 0.441745 | 0.301583 | 0.863617 |
41 | 596_fri_c2_250_5 | KNN | 0.719608 | 0.395027 | 0.26364 | 0.853442 |
42 | 597_fri_c2_500_5 | KNN | 0.850351 | 0.289368 | 0.147574 | 0.906158 |
43 | 598_fri_c0_1000_25 | KNN | 0.463055 | 0.593894 | 0.531968 | 0.742612 |
44 | 599_fri_c2_1000_5 | KNN | 0.890694 | 0.248991 | 0.108977 | 0.934203 |
45 | 601_fri_c1_250_5 | KNN | 0.793778 | 0.324212 | 0.193876 | 0.884709 |
46 | 602_fri_c3_250_10 | KNN | 0.455146 | 0.563242 | 0.516874 | 0.658824 |
47 | 603_fri_c0_250_50 | KNN | 0.235183 | 0.68487 | 0.728843 | 0.557173 |
48 | 604_fri_c4_500_10 | KNN | 0.609512 | 0.480458 | 0.38484 | 0.777255 |
49 | 605_fri_c2_250_25 | KNN | 0.271431 | 0.696507 | 0.713535 | 0.526729 |
50 | 606_fri_c2_1000_10 | KNN | 0.650717 | 0.45814 | 0.343237 | 0.809788 |
51 | 607_fri_c4_1000_50 | KNN | 0.281967 | 0.663103 | 0.700438 | 0.538739 |
52 | 608_fri_c3_1000_10 | KNN | 0.696365 | 0.420994 | 0.296792 | 0.824605 |
53 | 609_fri_c0_1000_5 | KNN | 0.884304 | 0.270184 | 0.118083 | 0.945111 |
54 | 611_fri_c3_100_5 | KNN | 0.771813 | 0.358065 | 0.198926 | 0.83584 |
55 | 612_fri_c1_1000_5 | KNN | 0.889513 | 0.246193 | 0.105308 | 0.942656 |
56 | 613_fri_c3_250_5 | KNN | 0.826467 | 0.322477 | 0.175191 | 0.893465 |
57 | 615_fri_c4_250_10 | KNN | 0.530654 | 0.543859 | 0.468519 | 0.708213 |
58 | 616_fri_c4_500_50 | KNN | 0.210018 | 0.706435 | 0.783715 | 0.476024 |
59 | 617_fri_c3_500_5 | KNN | 0.859534 | 0.273003 | 0.137508 | 0.922143 |
60 | 618_fri_c3_1000_50 | KNN | 0.311386 | 0.653644 | 0.664081 | 0.557806 |
61 | 620_fri_c1_1000_25 | KNN | 0.375528 | 0.65647 | 0.631844 | 0.608514 |
62 | 621_fri_c0_100_10 | KNN | 0.340703 | 0.582888 | 0.578592 | 0.623509 |
63 | 622_fri_c2_1000_50 | KNN | 0.30276 | 0.681954 | 0.691447 | 0.546624 |
64 | 623_fri_c4_1000_10 | KNN | 0.639109 | 0.454332 | 0.34839 | 0.782942 |
65 | 624_fri_c0_100_5 | KNN | 0.673186 | 0.407632 | 0.274295 | 0.828027 |
66 | 626_fri_c2_500_50 | KNN | 0.271654 | 0.69571 | 0.725827 | 0.533412 |
67 | 627_fri_c2_500_10 | KNN | 0.512372 | 0.521552 | 0.452715 | 0.723344 |
68 | 628_fri_c3_1000_5 | KNN | 0.91219 | 0.222372 | 0.08751 | 0.949091 |
69 | 631_fri_c1_500_5 | KNN | 0.79166 | 0.328605 | 0.205153 | 0.885727 |
70 | 633_fri_c0_500_25 | KNN | 0.429249 | 0.614029 | 0.565441 | 0.726973 |
71 | 634_fri_c2_100_10 | KNN | 0.387796 | 0.588892 | 0.585368 | 0.667469 |
72 | 635_fri_c0_250_10 | KNN | 0.567253 | 0.509036 | 0.396825 | 0.780319 |
73 | 637_fri_c1_500_50 | KNN | 0.190612 | 0.726046 | 0.802992 | 0.454106 |
74 | 641_fri_c1_500_10 | KNN | 0.587863 | 0.509599 | 0.414111 | 0.768669 |
75 | 643_fri_c2_500_25 | KNN | 0.274152 | 0.678898 | 0.723671 | 0.520608 |
76 | 644_fri_c4_250_25 | KNN | 0.258696 | 0.673705 | 0.715887 | 0.551216 |
77 | 645_fri_c3_500_50 | KNN | 0.256191 | 0.680995 | 0.726216 | 0.473903 |
78 | 646_fri_c3_500_10 | KNN | 0.583347 | 0.496202 | 0.410688 | 0.736341 |
79 | 647_fri_c1_250_10 | KNN | 0.509284 | 0.555339 | 0.478257 | 0.700949 |
80 | 648_fri_c1_250_50 | KNN | 0.239375 | 0.73117 | 0.763044 | 0.527385 |
81 | 649_fri_c0_500_5 | KNN | 0.825017 | 0.323343 | 0.173445 | 0.91166 |
82 | 650_fri_c0_500_50 | KNN | 0.329263 | 0.653483 | 0.668729 | 0.665897 |
83 | 651_fri_c0_100_25 | KNN | 0.273143 | 0.722003 | 0.791156 | 0.576842 |
84 | 653_fri_c0_250_25 | KNN | 0.362009 | 0.629356 | 0.618463 | 0.648122 |
85 | 654_fri_c0_500_10 | KNN | 0.624287 | 0.478936 | 0.368239 | 0.821772 |
86 | 656_fri_c1_100_5 | KNN | 0.615747 | 0.408661 | 0.303597 | 0.832531 |
87 | 657_fri_c2_250_10 | KNN | 0.52039 | 0.524871 | 0.457067 | 0.705652 |
88 | 658_fri_c3_250_25 | KNN | 0.244797 | 0.690289 | 0.760943 | 0.501852 |
89 | 663_rabe_266 | KNN | 0.986017 | 3.951143 | 36.343619 | 0.990686 |
90 | 665_sleuth_case2002 | KNN | 0.319907 | 5.308863 | 57.668582 | 0.364329 |
91 | 666_rmftsa_ladata | KNN | 0.543656 | 1.347936 | 3.553963 | 0.594365 |
92 | 678_visualizing_environmental | KNN | 0.216664 | 2.405815 | 9.418845 | 0.509588 |
93 | 687_sleuth_ex1605 | KNN | 0.362567 | 8.921237 | 126.531474 | 0.616783 |
94 | 690_visualizing_galaxy | KNN | 0.974681 | 10.682558 | 227.37364 | 0.98197 |
95 | 695_chatfield_4 | KNN | 0.790414 | 14.388273 | 433.325924 | 0.907828 |
96 | 712_chscase_geyser1 | KNN | 0.680253 | 5.669016 | 50.502146 | 0.714195 |
97 | feynman_III_12_43 | KNN | 0.999991 | 0.001777 | 0.000006 | 0.999995 |
98 | feynman_III_15_12 | KNN | 0.994051 | 0.270764 | 0.156133 | 0.996194 |
99 | feynman_III_15_14 | KNN | 0.997769 | 0.000253 | 0.000001 | 0.999808 |
100 | feynman_III_15_27 | KNN | 0.999043 | 0.038169 | 0.006827 | 0.999828 |
101 | feynman_III_17_37 | KNN | 0.99911 | 0.096243 | 0.022444 | 0.999635 |
102 | feynman_III_7_38 | KNN | 0.999347 | 0.477064 | 0.844295 | 0.999832 |
103 | feynman_III_8_54 | KNN | 0.983681 | 0.02859 | 0.002037 | 0.989199 |
104 | feynman_II_10_9 | KNN | 0.99927 | 0.003401 | 0.000045 | 0.999839 |
105 | feynman_II_11_28 | KNN | 0.99999 | 0.000628 | 0.000001 | 0.999989 |
106 | feynman_II_13_23 | KNN | 0.999866 | 0.010269 | 0.000198 | 0.999946 |
107 | feynman_II_13_34 | KNN | 0.999176 | 0.043096 | 0.003654 | 0.999666 |
108 | feynman_II_15_4 | KNN | 0.999175 | 0.098492 | 0.022525 | 0.999618 |
109 | feynman_II_15_5 | KNN | 0.999182 | 0.097892 | 0.02238 | 0.999623 |
110 | feynman_II_24_17 | KNN | 0.999337 | 0.015251 | 0.000493 | 0.99979 |
111 | feynman_II_27_16 | KNN | 0.999712 | 0.943228 | 2.311306 | 0.999871 |
112 | feynman_II_27_18 | KNN | 0.99999 | 0.05433 | 0.006645 | 0.999995 |
113 | feynman_II_34_2 | KNN | 0.999754 | 0.104578 | 0.023089 | 0.999851 |
114 | feynman_II_34_29a | KNN | 0.999327 | 0.003067 | 0.000035 | 0.999831 |
115 | feynman_II_34_2a | KNN | 0.999376 | 0.00603 | 0.000131 | 0.999834 |
116 | feynman_II_37_1 | KNN | 0.999759 | 0.25402 | 0.133153 | 0.999863 |
117 | feynman_II_38_14 | KNN | 0.999988 | 0.000494 | 0.000001 | 0.999995 |
118 | feynman_II_3_24 | KNN | 0.999937 | 0.000166 | 0.0 | 0.999995 |
119 | feynman_II_4_23 | KNN | 0.999001 | 0.000481 | 0.000001 | 0.999828 |
120 | feynman_II_8_31 | KNN | 0.99999 | 0.027024 | 0.001644 | 0.999996 |
121 | feynman_II_8_7 | KNN | 0.999104 | 0.001143 | 0.000007 | 0.999852 |
122 | feynman_I_10_7 | KNN | 0.999864 | 0.010321 | 0.000201 | 0.999946 |
123 | feynman_I_12_1 | KNN | 0.999991 | 0.011106 | 0.000228 | 0.999995 |
124 | feynman_I_12_4 | KNN | 0.997699 | 0.000432 | 0.000002 | 0.99983 |
125 | feynman_I_12_5 | KNN | 0.999991 | 0.011098 | 0.000227 | 0.999995 |
126 | feynman_I_14_3 | KNN | 0.999752 | 0.209515 | 0.092489 | 0.99985 |
127 | feynman_I_14_4 | KNN | 0.99999 | 0.02702 | 0.001652 | 0.999995 |
128 | feynman_I_15_10 | KNN | 0.999192 | 0.043141 | 0.0036 | 0.999669 |
129 | feynman_I_16_6 | KNN | 0.999751 | 0.012993 | 0.000322 | 0.999905 |
130 | feynman_I_18_12 | KNN | 0.999626 | 0.093342 | 0.020564 | 0.99971 |
131 | feynman_I_25_13 | KNN | 0.999976 | 0.002032 | 0.000015 | 0.999994 |
132 | feynman_I_26_2 | KNN | 0.999978 | 0.001406 | 0.000005 | 0.999994 |
133 | feynman_I_27_6 | KNN | 0.999582 | 0.004684 | 0.000053 | 0.999834 |
134 | feynman_I_29_4 | KNN | 0.999932 | 0.004091 | 0.00012 | 0.999992 |
135 | feynman_I_30_3 | KNN | 0.99288 | 0.119795 | 0.047354 | 0.995613 |
136 | feynman_I_30_5 | KNN | 0.999445 | 0.00152 | 0.000008 | 0.99985 |
137 | feynman_I_34_1 | KNN | 0.999636 | 0.021039 | 0.001158 | 0.999894 |
138 | feynman_I_34_14 | KNN | 0.999778 | 0.016803 | 0.000582 | 0.999919 |
139 | feynman_I_34_27 | KNN | 0.999991 | 0.001764 | 0.000006 | 0.999995 |
140 | feynman_I_37_4 | KNN | 0.999138 | 0.05649 | 0.007095 | 0.999686 |
141 | feynman_I_39_1 | KNN | 0.999991 | 0.01667 | 0.000513 | 0.999995 |
142 | feynman_I_39_11 | KNN | 0.999384 | 0.04124 | 0.005749 | 0.999838 |
143 | feynman_I_43_31 | KNN | 0.999757 | 0.209387 | 0.091568 | 0.99985 |
144 | feynman_I_47_23 | KNN | 0.999595 | 0.008944 | 0.000184 | 0.999845 |
145 | feynman_I_48_2 | KNN | 0.999928 | 0.589222 | 0.745672 | 0.999961 |
146 | feynman_I_6_2 | KNN | 0.999987 | 0.000101 | 0.0 | 0.999994 |
147 | feynman_I_6_2b | KNN | 0.999483 | 0.000834 | 0.000002 | 0.999805 |
148 | nikuradse_1 | KNN | 0.997674 | 0.004855 | 0.000059 | 0.989526 |
149 | strogatz_bacres1 | KNN | 0.997879 | 0.053442 | 0.013004 | 0.972305 |
150 | strogatz_bacres2 | KNN | 0.996195 | 0.03314 | 0.016602 | 0.99762 |
151 | strogatz_barmag1 | KNN | 0.992561 | 0.004546 | 0.000537 | 0.998896 |
152 | strogatz_barmag2 | KNN | 0.997663 | 0.003313 | 0.000188 | 0.995749 |
153 | strogatz_glider1 | KNN | 0.985515 | 0.056925 | 0.008836 | 0.992129 |
154 | strogatz_glider2 | KNN | 0.966362 | 0.074246 | 0.030366 | 0.976969 |
155 | strogatz_lv1 | KNN | 0.50793 | 0.217824 | 7.069078 | 0.990073 |
156 | strogatz_lv2 | KNN | 0.717427 | 0.073722 | 0.390647 | 0.986078 |
157 | strogatz_predprey1 | KNN | 0.928159 | 0.129424 | 0.802643 | 0.991541 |
158 | strogatz_predprey2 | KNN | 0.989005 | 0.071363 | 0.027123 | 0.996229 |
159 | strogatz_shearflow1 | KNN | 0.982855 | 0.013993 | 0.006231 | 0.998894 |
160 | strogatz_shearflow2 | KNN | 0.99096 | 0.005983 | 0.000436 | 0.997075 |
161 | strogatz_vdp1 | KNN | 0.954852 | 0.101241 | 0.147223 | 0.942659 |
162 | strogatz_vdp2 | KNN | 0.999066 | 0.000899 | 0.000007 | 0.998801 |
# | Dataset Name | Algorithm | Mean R² | MAE | MSE | Spearman coeff.
---|---|---|---|---|---|---
1 | 1027_ESL | Linear | 0.850755 | 0.40513 | 0.292207 | 0.935449 |
2 | 1028_SWD | Linear | 0.378101 | 0.511349 | 0.401739 | 0.620222 |
3 | 1029_LEV | Linear | 0.557796 | 0.488484 | 0.399588 | 0.749403 |
4 | 1030_ERA | Linear | 0.369205 | 1.243903 | 2.418982 | 0.579253 |
5 | 1089_USCrime | Linear | 0.812045 | 13.428465 | 265.112604 | 0.843669 |
6 | 1096_FacultySalaries | Linear | 0.001744 | 1.531526 | 10.476934 | 0.902399 |
7 | 1199_BNG_echoMonths | Linear | 0.439689 | 9.255462 | 139.918052 | 0.614086 |
8 | 192_vineyard | Linear | 0.454057 | 2.328044 | 9.162236 | 0.735941 |
9 | 197_cpu_act | Linear | 0.723461 | 6.035876 | 92.707762 | 0.775688 |
10 | 210_cloud | Linear | 0.822553 | 0.268041 | 0.16666 | 0.917689 |
11 | 225_puma8NH | Linear | 0.372844 | 3.661451 | 19.939988 | 0.657494 |
12 | 227_cpu_small | Linear | 0.711964 | 6.121275 | 96.531839 | 0.766317 |
13 | 228_elusage | Linear | 0.713334 | 9.274947 | 152.617535 | 0.847493 |
14 | 229_pwLinear | Linear | 0.740945 | 1.75077 | 4.883573 | 0.865478 |
15 | 294_satellite_image | Linear | 0.698746 | 1.017789 | 1.47091 | 0.881093 |
16 | 4544_GeographicalOriginalofMusic | Linear | 0.798082 | 0.33812 | 0.206275 | 0.874131 |
17 | 503_wind | Linear | 0.758431 | 2.523946 | 10.804286 | 0.87057 |
18 | 505_tecator | Linear | 0.995984 | 0.601854 | 0.810324 | 0.993431 |
19 | 519_vinnie | Linear | 0.690202 | 1.309108 | 2.799677 | 0.842285 |
20 | 522_pm10 | Linear | 0.141843 | 0.63939 | 0.665433 | 0.391699 |
21 | 523_analcatdata_neavote | Linear | 0.936059 | 0.646718 | 0.917932 | 0.858835 |
22 | 529_pollen | Linear | 0.790313 | 1.133391 | 2.047134 | 0.875056 |
23 | 547_no2 | Linear | 0.490508 | 0.410635 | 0.283465 | 0.719061 |
24 | 560_bodyfat | Linear | 0.976067 | 0.491859 | 1.56724 | 0.990758 |
25 | 562_cpu_small | Linear | 0.711911 | 6.116167 | 96.550215 | 0.766791 |
26 | 573_cpu_act | Linear | 0.723469 | 6.036166 | 92.704662 | 0.775619 |
27 | 579_fri_c0_250_5 | Linear | 0.660825 | 0.434944 | 0.311665 | 0.816355 |
28 | 581_fri_c3_500_25 | Linear | 0.264277 | 0.720032 | 0.733977 | 0.498327 |
29 | 582_fri_c1_500_25 | Linear | 0.257659 | 0.713191 | 0.743551 | 0.50301 |
30 | 583_fri_c1_1000_50 | Linear | 0.281638 | 0.718036 | 0.716633 | 0.519465 |
31 | 584_fri_c4_500_25 | Linear | 0.281899 | 0.714623 | 0.713502 | 0.513447 |
32 | 586_fri_c3_1000_25 | Linear | 0.279202 | 0.705495 | 0.718378 | 0.509394 |
33 | 588_fri_c4_1000_100 | Linear | 0.267338 | 0.708425 | 0.738718 | 0.46675 |
34 | 589_fri_c2_1000_25 | Linear | 0.276266 | 0.707726 | 0.712125 | 0.528804 |
35 | 590_fri_c0_1000_50 | Linear | 0.699727 | 0.413517 | 0.290709 | 0.83999 |
36 | 591_fri_c1_100_10 | Linear | 0.178503 | 0.785305 | 0.799211 | 0.465915 |
37 | 592_fri_c4_1000_25 | Linear | 0.246001 | 0.727384 | 0.748549 | 0.445209 |
38 | 593_fri_c1_1000_10 | Linear | 0.287045 | 0.712731 | 0.712397 | 0.541301 |
39 | 594_fri_c2_100_5 | Linear | 0.041581 | 0.76735 | 0.901306 | 0.366065 |
40 | 595_fri_c0_1000_10 | Linear | 0.704631 | 0.420814 | 0.294328 | 0.837616 |
41 | 596_fri_c2_250_5 | Linear | 0.277518 | 0.688422 | 0.685956 | 0.540818 |
42 | 597_fri_c2_500_5 | Linear | 0.276542 | 0.70663 | 0.720696 | 0.528983 |
43 | 598_fri_c0_1000_25 | Linear | 0.703118 | 0.4217 | 0.293105 | 0.834738 |
44 | 599_fri_c2_1000_5 | Linear | 0.28865 | 0.712516 | 0.710183 | 0.545248 |
45 | 601_fri_c1_250_5 | Linear | 0.289959 | 0.682671 | 0.662813 | 0.530686 |
46 | 602_fri_c3_250_10 | Linear | 0.180244 | 0.742043 | 0.779903 | 0.40493 |
47 | 603_fri_c0_250_50 | Linear | 0.712692 | 0.407259 | 0.274058 | 0.841514 |
48 | 604_fri_c4_500_10 | Linear | 0.248273 | 0.728724 | 0.744357 | 0.480085 |
49 | 605_fri_c2_250_25 | Linear | 0.24698 | 0.717057 | 0.736453 | 0.513876 |
50 | 606_fri_c2_1000_10 | Linear | 0.308385 | 0.690089 | 0.680073 | 0.557761 |
51 | 607_fri_c4_1000_50 | Linear | 0.220702 | 0.714655 | 0.758194 | 0.458222 |
52 | 608_fri_c3_1000_10 | Linear | 0.278023 | 0.707177 | 0.706622 | 0.475936 |
53 | 609_fri_c0_1000_5 | Linear | 0.728649 | 0.408021 | 0.276379 | 0.853735 |
54 | 611_fri_c3_100_5 | Linear | 0.194105 | 0.703612 | 0.689233 | 0.559348 |
55 | 612_fri_c1_1000_5 | Linear | 0.229149 | 0.728327 | 0.735948 | 0.470516 |
56 | 613_fri_c3_250_5 | Linear | 0.263946 | 0.724662 | 0.746828 | 0.505514 |
57 | 615_fri_c4_250_10 | Linear | 0.276361 | 0.708448 | 0.71733 | 0.471958 |
58 | 616_fri_c4_500_50 | Linear | 0.205509 | 0.728739 | 0.788715 | 0.453025 |
59 | 617_fri_c3_500_5 | Linear | 0.216635 | 0.735879 | 0.76765 | 0.449461 |
60 | 618_fri_c3_1000_50 | Linear | 0.265698 | 0.69623 | 0.7071 | 0.472572 |
61 | 620_fri_c1_1000_25 | Linear | 0.250131 | 0.738213 | 0.757734 | 0.478474 |
62 | 621_fri_c0_100_10 | Linear | 0.602227 | 0.44443 | 0.341376 | 0.774586 |
63 | 622_fri_c2_1000_50 | Linear | 0.263696 | 0.715922 | 0.729433 | 0.507265 |
64 | 623_fri_c4_1000_10 | Linear | 0.267106 | 0.710196 | 0.709202 | 0.452199 |
65 | 624_fri_c0_100_5 | Linear | 0.603562 | 0.462771 | 0.321688 | 0.771278 |
66 | 626_fri_c2_500_50 | Linear | 0.217673 | 0.728059 | 0.777863 | 0.472951 |
67 | 627_fri_c2_500_10 | Linear | 0.243343 | 0.694881 | 0.702719 | 0.532132 |
68 | 628_fri_c3_1000_5 | Linear | 0.266377 | 0.7262 | 0.731738 | 0.472058 |
69 | 631_fri_c1_500_5 | Linear | 0.241078 | 0.733383 | 0.74539 | 0.469951 |
70 | 633_fri_c0_500_25 | Linear | 0.690121 | 0.432924 | 0.302853 | 0.828156 |
71 | 634_fri_c2_100_10 | Linear | 0.299368 | 0.636631 | 0.675419 | 0.605714 |
72 | 635_fri_c0_250_10 | Linear | 0.603044 | 0.476327 | 0.356965 | 0.774646 |
73 | 637_fri_c1_500_50 | Linear | 0.217729 | 0.739307 | 0.775907 | 0.463339 |
74 | 641_fri_c1_500_10 | Linear | 0.258236 | 0.729422 | 0.743619 | 0.532342 |
75 | 643_fri_c2_500_25 | Linear | 0.147409 | 0.755978 | 0.850757 | 0.386658 |
76 | 644_fri_c4_250_25 | Linear | 0.184237 | 0.725411 | 0.784909 | 0.448349 |
77 | 645_fri_c3_500_50 | Linear | 0.235281 | 0.69956 | 0.744604 | 0.461882 |
78 | 646_fri_c3_500_10 | Linear | 0.311535 | 0.690855 | 0.674736 | 0.474915 |
79 | 647_fri_c1_250_10 | Linear | 0.258723 | 0.7324 | 0.721684 | 0.523531 |
80 | 648_fri_c1_250_50 | Linear | 0.309327 | 0.717734 | 0.685753 | 0.570247 |
81 | 649_fri_c0_500_5 | Linear | 0.722406 | 0.390243 | 0.275346 | 0.846715 |
82 | 650_fri_c0_500_50 | Linear | 0.745277 | 0.389934 | 0.252129 | 0.862324 |
83 | 651_fri_c0_100_25 | Linear | 0.565772 | 0.554044 | 0.460616 | 0.784511 |
84 | 653_fri_c0_250_25 | Linear | 0.700389 | 0.41904 | 0.287578 | 0.822729 |
85 | 654_fri_c0_500_10 | Linear | 0.676016 | 0.43907 | 0.31314 | 0.829225 |
86 | 656_fri_c1_100_5 | Linear | 0.144937 | 0.704332 | 0.691867 | 0.486115 |
87 | 657_fri_c2_250_10 | Linear | 0.19816 | 0.719233 | 0.771387 | 0.515301 |
88 | 658_fri_c3_250_25 | Linear | 0.1306 | 0.755051 | 0.86765 | 0.41002 |
89 | 663_rabe_266 | Linear | 0.968315 | 6.917994 | 81.304113 | 0.98143 |
90 | 665_sleuth_case2002 | Linear | 0.292627 | 5.592663 | 59.912708 | 0.426888 |
91 | 666_rmftsa_ladata | Linear | 0.526431 | 1.38051 | 3.611345 | 0.633792 |
92 | 678_visualizing_environmental | Linear | 0.309626 | 2.215614 | 8.264205 | 0.596919 |
93 | 687_sleuth_ex1605 | Linear | 0.445392 | 8.476614 | 106.578743 | 0.691451 |
94 | 690_visualizing_galaxy | Linear | 0.897886 | 25.120121 | 922.100554 | 0.96885 |
95 | 695_chatfield_4 | Linear | 0.858441 | 12.031508 | 291.075407 | 0.935352 |
96 | 712_chscase_geyser1 | Linear | 0.758773 | 4.990649 | 38.312636 | 0.759 |
97 | feynman_III_12_43 | Linear | 0.93145 | 0.159014 | 0.044833 | 0.983169 |
98 | feynman_III_15_12 | Linear | 0.233931 | 3.736997 | 20.107373 | 0.423118 |
99 | feynman_III_15_14 | Linear | 0.508537 | 0.007291 | 0.000148 | 0.975434 |
100 | feynman_III_15_27 | Linear | 0.724087 | 0.910702 | 1.967749 | 0.979986 |
101 | feynman_III_17_37 | Linear | 0.139988 | 3.57535 | 21.677414 | 0.376456 |
102 | feynman_III_7_38 | Linear | 0.786887 | 11.464593 | 275.623411 | 0.979451 |
103 | feynman_III_8_54 | Linear | 0.014734 | 0.314506 | 0.12296 | 0.111697 |
104 | feynman_II_10_9 | Linear | 0.781668 | 0.07861 | 0.013397 | 0.982423 |
105 | feynman_II_11_28 | Linear | 0.818812 | 0.093008 | 0.015337 | 0.95077 |
106 | feynman_II_13_23 | Linear | 0.994665 | 0.057811 | 0.007887 | 0.998096 |
107 | feynman_II_13_34 | Linear | 0.962289 | 0.299731 | 0.167243 | 0.991382 |
108 | feynman_II_15_4 | Linear | 0.212968 | 3.567289 | 21.483525 | 0.428445 |
109 | feynman_II_15_5 | Linear | 0.210697 | 3.561805 | 21.58674 | 0.430193 |
110 | feynman_II_24_17 | Linear | 0.970472 | 0.118832 | 0.021962 | 0.997035 |
111 | feynman_II_27_16 | Linear | 0.802556 | 29.222483 | 1581.939504 | 0.983116 |
112 | feynman_II_27_18 | Linear | 0.878876 | 6.927826 | 79.418532 | 0.988165 |
113 | feynman_II_34_2 | Linear | 0.866166 | 2.617153 | 12.578403 | 0.981266 |
114 | feynman_II_34_29a | Linear | 0.786954 | 0.072809 | 0.011172 | 0.979481 |
115 | feynman_II_34_2a | Linear | 0.784677 | 0.146543 | 0.045263 | 0.979331 |
116 | feynman_II_37_1 | Linear | 0.886671 | 5.839289 | 62.60404 | 0.983136 |
117 | feynman_II_38_14 | Linear | 0.912099 | 0.048427 | 0.003988 | 0.989152 |
118 | feynman_II_3_24 | Linear | 0.637401 | 0.02403 | 0.001155 | 0.988254 |
119 | feynman_II_4_23 | Linear | 0.725582 | 0.01138 | 0.000309 | 0.979904 |
120 | feynman_II_8_31 | Linear | 0.879779 | 3.471949 | 19.902419 | 0.9882 |
121 | feynman_II_8_7 | Linear | 0.685304 | 0.032074 | 0.002524 | 0.97742 |
122 | feynman_I_10_7 | Linear | 0.994747 | 0.057704 | 0.007724 | 0.998101 |
123 | feynman_I_12_1 | Linear | 0.93089 | 0.999035 | 1.7756 | 0.983305 |
124 | feynman_I_12_4 | Linear | 0.556217 | 0.011239 | 0.000322 | 0.984439 |
125 | feynman_I_12_5 | Linear | 0.930938 | 0.998005 | 1.776479 | 0.983035 |
126 | feynman_I_14_3 | Linear | 0.865788 | 5.214275 | 50.142761 | 0.98086 |
127 | feynman_I_14_4 | Linear | 0.8794 | 3.437868 | 19.629161 | 0.987973 |
128 | feynman_I_15_10 | Linear | 0.961738 | 0.301978 | 0.170416 | 0.991385 |
129 | feynman_I_16_6 | Linear | 0.889599 | 0.286332 | 0.143153 | 0.944657 |
130 | feynman_I_18_12 | Linear | 0.505163 | 4.022895 | 27.178362 | 0.74989 |
131 | feynman_I_25_13 | Linear | 0.827937 | 0.242161 | 0.105885 | 0.983643 |
132 | feynman_I_26_2 | Linear | 0.68287 | 0.215662 | 0.079607 | 0.881601 |
133 | feynman_I_27_6 | Linear | 0.879933 | 0.089215 | 0.015361 | 0.971672 |
134 | feynman_I_29_4 | Linear | 0.688935 | 0.514013 | 0.545926 | 0.967886 |
135 | feynman_I_30_3 | Linear | 0.207866 | 1.646033 | 5.267315 | 0.398884 |
136 | feynman_I_30_5 | Linear | 0.788509 | 0.03806 | 0.003043 | 0.989051 |
137 | feynman_I_34_1 | Linear | 0.931204 | 0.309854 | 0.218704 | 0.986952 |
138 | feynman_I_34_14 | Linear | 0.966801 | 0.202536 | 0.08714 | 0.992547 |
139 | feynman_I_34_27 | Linear | 0.931503 | 0.158808 | 0.044902 | 0.983234 |
140 | feynman_I_37_4 | Linear | 0.13927 | 2.24074 | 7.086374 | 0.283763 |
141 | feynman_I_39_1 | Linear | 0.931472 | 1.499868 | 3.9914 | 0.983478 |
142 | feynman_I_39_11 | Linear | 0.822776 | 0.899057 | 1.655431 | 0.98159 |
143 | feynman_I_43_31 | Linear | 0.866215 | 5.246956 | 50.412573 | 0.981239 |
144 | feynman_I_47_23 | Linear | 0.916652 | 0.14283 | 0.037889 | 0.981369 |
145 | feynman_I_48_2 | Linear | 0.897016 | 25.196579 | 1063.490439 | 0.989542 |
146 | feynman_I_6_2 | Linear | 0.758179 | 0.016475 | 0.00044 | 0.918804 |
147 | feynman_I_6_2b | Linear | 0.658535 | 0.022427 | 0.001232 | 0.846275 |
148 | nikuradse_1 | Linear | 0.638503 | 0.083324 | 0.009218 | 0.910944 |
149 | strogatz_bacres1 | Linear | 0.988248 | 0.216058 | 0.06944 | 0.856053 |
150 | strogatz_bacres2 | Linear | 0.984511 | 0.216704 | 0.06989 | 0.993564 |
151 | strogatz_barmag1 | Linear | 0.834231 | 0.034753 | 0.009313 | 0.944294 |
152 | strogatz_barmag2 | Linear | 0.04003 | 0.150235 | 0.067818 | 0.335486 |
153 | strogatz_glider1 | Linear | 0.108586 | 0.669491 | 0.542974 | 0.32942 |
154 | strogatz_glider2 | Linear | 0.687994 | 0.359401 | 0.264734 | 0.747661 |
155 | strogatz_lv1 | Linear | -5.097275 | 0.700792 | 5.861402 | -0.445025 |
156 | strogatz_lv2 | Linear | 0.11747 | 0.186607 | 0.1198 | 0.602734 |
157 | strogatz_predprey1 | Linear | 0.329958 | 1.0811 | 4.190106 | 0.719844 |
158 | strogatz_predprey2 | Linear | 0.898527 | 0.290779 | 0.234859 | 0.968529 |
159 | strogatz_shearflow1 | Linear | 0.071259 | 0.251107 | 0.256582 | 0.231905 |
160 | strogatz_shearflow2 | Linear | 0.220299 | 0.125655 | 0.041106 | 0.615767 |
161 | strogatz_vdp1 | Linear | 0.039449 | 0.775422 | 2.897968 | 0.654878 |
162 | strogatz_vdp2 | Linear | 0.999999 | 0.000078 | 0.0 | 1 |
# | Dataset Name | Algorithm | Mean | MAE | MSE | Spearman coeff. |
---|---|---|---|---|---|---|
1 | 1027_ESL | NeuralNet | 0.852823 | 0.410395 | 2.893702e-01 | 0.937654 |
2 | 1028_SWD | NeuralNet | 0.366961 | 0.506931 | 4.093295e-01 | 0.617072 |
3 | 1029_LEV | NeuralNet | 0.545263 | 0.484831 | 4.109138e-01 | 0.742203 |
4 | 1030_ERA | NeuralNet | 0.365520 | 1.244535 | 2.433973e+00 | 0.568572 |
5 | 1089_USCrime | NeuralNet | 0.740107 | 15.647898 | 3.739526e+02 | 0.804915 |
6 | 1096_FacultySalaries | NeuralNet | 0.784680 | 1.450788 | 3.476780e+00 | 0.883562 |
7 | 1199_BNG_echoMonths | NeuralNet | 0.442570 | 9.068394 | 1.392056e+02 | 0.631720 |
8 | 192_vineyard | NeuralNet | 0.600099 | 1.991144 | 6.537614e+00 | 0.726718 |
9 | 197_cpu_act | NeuralNet | 0.975503 | 2.025710 | 8.246882e+00 | 0.954770 |
10 | 210_cloud | NeuralNet | 0.747172 | 0.329699 | 2.759520e-01 | 0.888264 |
11 | 225_puma8NH | NeuralNet | 0.662979 | 2.507061 | 1.071329e+01 | 0.810947 |
12 | 227_cpu_small | NeuralNet | 0.968858 | 2.311537 | 1.036657e+01 | 0.938835 |
13 | 228_elusage | NeuralNet | 0.784584 | 8.330419 | 1.142678e+02 | 0.810997 |
14 | 229_pwLinear | NeuralNet | 0.798872 | 1.529199 | 3.789153e+00 | 0.898095 |
15 | 294_satellite_image | NeuralNet | 0.902530 | 0.374750 | 4.760028e-01 | 0.951210 |
16 | 4544_GeographicalOriginalofMusic | NeuralNet | 0.690027 | 0.408876 | 3.214000e-01 | 0.827174 |
17 | 503_wind | NeuralNet | 0.795892 | 2.320634 | 9.126839e+00 | 0.892835 |
18 | 505_tecator | NeuralNet | 0.918205 | 2.942551 | 1.807219e+01 | 0.945851 |
19 | 519_vinnie | NeuralNet | 0.734397 | 1.232201 | 2.396868e+00 | 0.854593 |
20 | 522_pm10 | NeuralNet | 0.235241 | 0.599024 | 5.889134e-01 | 0.514193 |
21 | 523_analcatdata_neavote | NeuralNet | 0.914215 | 0.796473 | 1.220032e+00 | 0.850378 |
22 | 529_pollen | NeuralNet | 0.783172 | 1.153324 | 2.117322e+00 | 0.874001 |
23 | 547_no2 | NeuralNet | 0.542080 | 0.399243 | 2.557379e-01 | 0.730360 |
24 | 560_bodyfat | NeuralNet | 0.961306 | 1.117585 | 2.658125e+00 | 0.981898 |
25 | 562_cpu_small | NeuralNet | 0.969328 | 2.303671 | 1.022009e+01 | 0.938424 |
26 | 573_cpu_act | NeuralNet | 0.976396 | 1.977254 | 7.861949e+00 | 0.954865 |
27 | 579_fri_c0_250_5 | NeuralNet | 0.858902 | 0.284880 | 1.311714e-01 | 0.925673 |
28 | 581_fri_c3_500_25 | NeuralNet | 0.215711 | 0.702565 | 7.842227e-01 | 0.490247 |
29 | 582_fri_c1_500_25 | NeuralNet | 0.140094 | 0.735744 | 8.611706e-01 | 0.469505 |
30 | 583_fri_c1_1000_50 | NeuralNet | 0.129949 | 0.748683 | 8.678001e-01 | 0.435474 |
31 | 584_fri_c4_500_25 | NeuralNet | 0.260464 | 0.672682 | 7.394183e-01 | 0.519414 |
32 | 586_fri_c3_1000_25 | NeuralNet | 0.594551 | 0.468416 | 4.028868e-01 | 0.764326 |
33 | 588_fri_c4_1000_100 | NeuralNet | 0.151236 | 0.740214 | 8.556538e-01 | 0.374383 |
34 | 589_fri_c2_1000_25 | NeuralNet | 0.492925 | 0.539173 | 4.984187e-01 | 0.707398 |
35 | 590_fri_c0_1000_50 | NeuralNet | 0.649386 | 0.456280 | 3.401131e-01 | 0.805546 |
36 | 591_fri_c1_100_10 | NeuralNet | -0.020986 | 0.782549 | 1.005607e+00 | 0.401704 |
37 | 592_fri_c4_1000_25 | NeuralNet | 0.562524 | 0.494321 | 4.340235e-01 | 0.742200 |
38 | 593_fri_c1_1000_10 | NeuralNet | 0.873999 | 0.257171 | 1.261005e-01 | 0.930841 |
39 | 594_fri_c2_100_5 | NeuralNet | 0.383327 | 0.569770 | 6.085111e-01 | 0.667519 |
40 | 595_fri_c0_1000_10 | NeuralNet | 0.906380 | 0.243312 | 9.331033e-02 | 0.953951 |
41 | 596_fri_c2_250_5 | NeuralNet | 0.646729 | 0.427951 | 3.347499e-01 | 0.817476 |
42 | 597_fri_c2_500_5 | NeuralNet | 0.863733 | 0.274547 | 1.379375e-01 | 0.928026 |
43 | 598_fri_c0_1000_25 | NeuralNet | 0.819596 | 0.335204 | 1.778779e-01 | 0.904785 |
44 | 599_fri_c2_1000_5 | NeuralNet | 0.921722 | 0.205985 | 7.961486e-02 | 0.956788 |
45 | 601_fri_c1_250_5 | NeuralNet | 0.697394 | 0.390633 | 2.808596e-01 | 0.854169 |
46 | 602_fri_c3_250_10 | NeuralNet | 0.471516 | 0.542086 | 5.032179e-01 | 0.667044 |
47 | 603_fri_c0_250_50 | NeuralNet | 0.548627 | 0.519214 | 4.271940e-01 | 0.751120 |
48 | 604_fri_c4_500_10 | NeuralNet | 0.774700 | 0.342309 | 2.168632e-01 | 0.871671 |
49 | 605_fri_c2_250_25 | NeuralNet | 0.243484 | 0.693581 | 7.410648e-01 | 0.488701 |
50 | 606_fri_c2_1000_10 | NeuralNet | 0.875064 | 0.259213 | 1.219226e-01 | 0.934479 |
51 | 607_fri_c4_1000_50 | NeuralNet | 0.103459 | 0.743767 | 8.717138e-01 | 0.393737 |
52 | 608_fri_c3_1000_10 | NeuralNet | 0.892355 | 0.234260 | 1.056955e-01 | 0.941152 |
53 | 609_fri_c0_1000_5 | NeuralNet | 0.936046 | 0.201667 | 6.523782e-02 | 0.968197 |
54 | 611_fri_c3_100_5 | NeuralNet | 0.578785 | 0.474465 | 3.541383e-01 | 0.744211 |
55 | 612_fri_c1_1000_5 | NeuralNet | 0.901813 | 0.232382 | 9.276487e-02 | 0.951857 |
56 | 613_fri_c3_250_5 | NeuralNet | 0.687361 | 0.412556 | 3.149878e-01 | 0.831824 |
57 | 615_fri_c4_250_10 | NeuralNet | 0.497194 | 0.537168 | 4.991084e-01 | 0.693061 |
58 | 616_fri_c4_500_50 | NeuralNet | 0.091603 | 0.764015 | 9.019947e-01 | 0.375784 |
59 | 617_fri_c3_500_5 | NeuralNet | 0.769612 | 0.338511 | 2.308483e-01 | 0.885101 |
60 | 618_fri_c3_1000_50 | NeuralNet | 0.198645 | 0.696363 | 7.710063e-01 | 0.445806 |
61 | 620_fri_c1_1000_25 | NeuralNet | 0.242152 | 0.686101 | 7.655733e-01 | 0.523801 |
62 | 621_fri_c0_100_10 | NeuralNet | 0.564102 | 0.461186 | 3.743486e-01 | 0.780952 |
63 | 622_fri_c2_1000_50 | NeuralNet | 0.152152 | 0.733028 | 8.396476e-01 | 0.436914 |
64 | 623_fri_c4_1000_10 | NeuralNet | 0.840967 | 0.283853 | 1.582716e-01 | 0.912371 |
65 | 624_fri_c0_100_5 | NeuralNet | 0.786115 | 0.338335 | 1.760822e-01 | 0.881955 |
66 | 626_fri_c2_500_50 | NeuralNet | 0.071821 | 0.768397 | 9.230840e-01 | 0.395946 |
67 | 627_fri_c2_500_10 | NeuralNet | 0.638732 | 0.419177 | 3.381697e-01 | 0.810315 |
68 | 628_fri_c3_1000_5 | NeuralNet | 0.905089 | 0.227557 | 9.340655e-02 | 0.949961 |
69 | 631_fri_c1_500_5 | NeuralNet | 0.771559 | 0.351438 | 2.325012e-01 | 0.886770 |
70 | 633_fri_c0_500_25 | NeuralNet | 0.690277 | 0.429273 | 3.037191e-01 | 0.832301 |
71 | 634_fri_c2_100_10 | NeuralNet | 0.329548 | 0.613479 | 6.359580e-01 | 0.580100 |
72 | 635_fri_c0_250_10 | NeuralNet | 0.781054 | 0.349538 | 1.974406e-01 | 0.881223 |
73 | 637_fri_c1_500_50 | NeuralNet | 0.050588 | 0.788616 | 9.418987e-01 | 0.358172 |
74 | 641_fri_c1_500_10 | NeuralNet | 0.734171 | 0.380561 | 2.662472e-01 | 0.863063 |
75 | 643_fri_c2_500_25 | NeuralNet | -0.047030 | 0.803269 | 1.042054e+00 | 0.334503 |
76 | 644_fri_c4_250_25 | NeuralNet | -0.098817 | 0.811740 | 1.055398e+00 | 0.307765 |
77 | 645_fri_c3_500_50 | NeuralNet | 0.119247 | 0.736821 | 8.586971e-01 | 0.380650 |
78 | 646_fri_c3_500_10 | NeuralNet | 0.750305 | 0.363450 | 2.430099e-01 | 0.846999 |
79 | 647_fri_c1_250_10 | NeuralNet | 0.306954 | 0.633942 | 6.688702e-01 | 0.557775 |
80 | 648_fri_c1_250_50 | NeuralNet | 0.211333 | 0.728166 | 7.866983e-01 | 0.493093 |
81 | 649_fri_c0_500_5 | NeuralNet | 0.911711 | 0.232881 | 8.730737e-02 | 0.956813 |
82 | 650_fri_c0_500_50 | NeuralNet | 0.664720 | 0.451717 | 3.326932e-01 | 0.818101 |
83 | 651_fri_c0_100_25 | NeuralNet | 0.547258 | 0.555754 | 4.817018e-01 | 0.733383 |
84 | 653_fri_c0_250_25 | NeuralNet | 0.620513 | 0.473479 | 3.620537e-01 | 0.789404 |
85 | 654_fri_c0_500_10 | NeuralNet | 0.858299 | 0.296183 | 1.380846e-01 | 0.931270 |
86 | 656_fri_c1_100_5 | NeuralNet | 0.426208 | 0.526906 | 4.847716e-01 | 0.728221 |
87 | 657_fri_c2_250_10 | NeuralNet | 0.379947 | 0.587696 | 5.957274e-01 | 0.641508 |
88 | 658_fri_c3_250_25 | NeuralNet | 0.039462 | 0.762112 | 9.574166e-01 | 0.388744 |
89 | 663_rabe_266 | NeuralNet | 0.975598 | 5.949144 | 6.214602e+01 | 0.991636 |
90 | 665_sleuth_case2002 | NeuralNet | 0.137285 | 6.182009 | 7.395436e+01 | 0.352857 |
91 | 666_rmftsa_ladata | NeuralNet | 0.600076 | 1.276154 | 2.988638e+00 | 0.613317 |
92 | 678_visualizing_environmental | NeuralNet | 0.300701 | 2.304259 | 8.481555e+00 | 0.545703 |
93 | 687_sleuth_ex1605 | NeuralNet | 0.499403 | 7.711171 | 9.685532e+01 | 0.719613 |
94 | 690_visualizing_galaxy | NeuralNet | 0.949464 | 16.315438 | 4.578051e+02 | 0.976084 |
95 | 695_chatfield_4 | NeuralNet | 0.823126 | 13.605473 | 3.618121e+02 | 0.924415 |
96 | 712_chscase_geyser1 | NeuralNet | 0.758728 | 4.979575 | 3.836666e+01 | 0.746288 |
97 | feynman_III_12_43 | NeuralNet | 0.999754 | 0.009627 | 1.609622e-04 | 0.999973 |
98 | feynman_III_15_12 | NeuralNet | 0.994964 | 0.257667 | 1.323041e-01 | 0.996570 |
99 | feynman_III_15_14 | NeuralNet | 0.997831 | 0.000405 | 6.528335e-07 | 0.999321 |
100 | feynman_III_15_27 | NeuralNet | 0.998860 | 0.058674 | 8.163461e-03 | 0.999868 |
101 | feynman_III_17_37 | NeuralNet | 0.998990 | 0.113484 | 2.548044e-02 | 0.999682 |
102 | feynman_III_7_38 | NeuralNet | 0.999198 | 0.659432 | 1.038934e+00 | 0.999906 |
103 | feynman_III_8_54 | NeuralNet | 0.973600 | 0.027549 | 3.294157e-03 | 0.985185 |
104 | feynman_II_10_9 | NeuralNet | 0.999302 | 0.004321 | 4.284257e-05 | 0.999918 |
105 | feynman_II_11_28 | NeuralNet | 0.999608 | 0.004045 | 3.319564e-05 | 0.999887 |
106 | feynman_II_13_23 | NeuralNet | 0.999734 | 0.015028 | 3.925164e-04 | 0.999982 |
107 | feynman_II_13_34 | NeuralNet | 0.999698 | 0.026780 | 1.341865e-03 | 0.999974 |
108 | feynman_II_15_4 | NeuralNet | 0.999088 | 0.115679 | 2.489679e-02 | 0.999704 |
109 | feynman_II_15_5 | NeuralNet | 0.999063 | 0.117654 | 2.562177e-02 | 0.999707 |
110 | feynman_II_24_17 | NeuralNet | 0.999730 | 0.010473 | 2.008074e-04 | 0.999984 |
111 | feynman_II_27_16 | NeuralNet | 0.999488 | 1.392176 | 4.111538e+00 | 0.999896 |
112 | feynman_II_27_18 | NeuralNet | 0.999613 | 0.360378 | 2.543454e-01 | 0.999967 |
113 | feynman_II_34_2 | NeuralNet | 0.999546 | 0.147771 | 4.276658e-02 | 0.999939 |
114 | feynman_II_34_29a | NeuralNet | 0.999319 | 0.003836 | 3.560954e-05 | 0.999918 |
115 | feynman_II_34_2a | NeuralNet | 0.999368 | 0.008007 | 1.326014e-04 | 0.999930 |
116 | feynman_II_37_1 | NeuralNet | 0.999598 | 0.338736 | 2.221163e-01 | 0.999941 |
117 | feynman_II_38_14 | NeuralNet | 0.999705 | 0.002604 | 1.338456e-05 | 0.999974 |
118 | feynman_II_3_24 | NeuralNet | 0.999165 | 0.000894 | 2.658922e-06 | 0.999905 |
119 | feynman_II_4_23 | NeuralNet | 0.998947 | 0.000658 | 1.181480e-06 | 0.999882 |
120 | feynman_II_8_31 | NeuralNet | 0.999722 | 0.165760 | 4.603600e-02 | 0.999971 |
121 | feynman_II_8_7 | NeuralNet | 0.998873 | 0.001809 | 9.074480e-06 | 0.999819 |
122 | feynman_I_10_7 | NeuralNet | 0.999815 | 0.012788 | 2.725433e-04 | 0.999984 |
123 | feynman_I_12_1 | NeuralNet | 0.999730 | 0.060529 | 6.957336e-03 | 0.999961 |
124 | feynman_I_12_4 | NeuralNet | 0.998552 | 0.000566 | 1.052145e-06 | 0.999400 |
125 | feynman_I_12_5 | NeuralNet | 0.999754 | 0.059734 | 6.328985e-03 | 0.999961 |
126 | feynman_I_14_3 | NeuralNet | 0.999627 | 0.270023 | 1.399226e-01 | 0.999927 |
127 | feynman_I_14_4 | NeuralNet | 0.999756 | 0.148655 | 3.972920e-02 | 0.999971 |
128 | feynman_I_15_10 | NeuralNet | 0.999703 | 0.027272 | 1.321566e-03 | 0.999974 |
129 | feynman_I_16_6 | NeuralNet | 0.999708 | 0.015255 | 3.794285e-04 | 0.999960 |
130 | feynman_I_18_12 | NeuralNet | 0.999239 | 0.151583 | 4.176557e-02 | 0.999755 |
131 | feynman_I_25_13 | NeuralNet | 0.999491 | 0.011609 | 3.132732e-04 | 0.999956 |
132 | feynman_I_26_2 | NeuralNet | 0.999546 | 0.007840 | 1.140345e-04 | 0.999932 |
133 | feynman_I_27_6 | NeuralNet | 0.999589 | 0.005413 | 5.256776e-05 | 0.999935 |
134 | feynman_I_29_4 | NeuralNet | 0.999101 | 0.024454 | 1.573681e-03 | 0.999931 |
135 | feynman_I_30_3 | NeuralNet | 0.995185 | 0.119984 | 3.204372e-02 | 0.995882 |
136 | feynman_I_30_5 | NeuralNet | 0.999058 | 0.002344 | 1.356262e-05 | 0.999919 |
137 | feynman_I_34_1 | NeuralNet | 0.999599 | 0.025631 | 1.276745e-03 | 0.999964 |
138 | feynman_I_34_14 | NeuralNet | 0.999574 | 0.025699 | 1.120748e-03 | 0.999979 |
139 | feynman_I_34_27 | NeuralNet | 0.999768 | 0.009631 | 1.522168e-04 | 0.999972 |
140 | feynman_I_37_4 | NeuralNet | 0.999066 | 0.065067 | 7.688136e-03 | 0.999745 |
141 | feynman_I_39_1 | NeuralNet | 0.999704 | 0.099840 | 1.730699e-02 | 0.999972 |
142 | feynman_I_39_11 | NeuralNet | 0.999561 | 0.045397 | 4.087927e-03 | 0.999941 |
143 | feynman_I_43_31 | NeuralNet | 0.999625 | 0.277835 | 1.415122e-01 | 0.999930 |
144 | feynman_I_47_23 | NeuralNet | 0.999516 | 0.010906 | 2.208181e-04 | 0.999953 |
145 | feynman_I_48_2 | NeuralNet | 0.999736 | 1.187750 | 2.731035e+00 | 0.999964 |
146 | feynman_I_6_2 | NeuralNet | 0.999627 | 0.000603 | 6.789286e-07 | 0.999945 |
147 | feynman_I_6_2b | NeuralNet | 0.999403 | 0.001030 | 2.152602e-06 | 0.999781 |
148 | nikuradse_1 | NeuralNet | 0.917551 | 0.035698 | 2.117774e-03 | 0.975503 |
149 | strogatz_bacres1 | NeuralNet | 0.969642 | 0.252176 | 1.666494e-01 | 0.913027 |
150 | strogatz_bacres2 | NeuralNet | 0.974906 | 0.241574 | 1.112301e-01 | 0.993503 |
151 | strogatz_barmag1 | NeuralNet | 0.902097 | 0.036250 | 5.435290e-03 | 0.940557 |
152 | strogatz_barmag2 | NeuralNet | 0.680461 | 0.084798 | 1.988241e-02 | 0.643502 |
153 | strogatz_glider1 | NeuralNet | 0.118384 | 0.662337 | 5.375853e-01 | 0.306711 |
154 | strogatz_glider2 | NeuralNet | 0.797370 | 0.283208 | 1.690502e-01 | 0.765416 |
155 | strogatz_lv1 | NeuralNet | 0.439642 | 0.570618 | 5.150794e+00 | 0.163208 |
156 | strogatz_lv2 | NeuralNet | 0.607377 | 0.160983 | 2.700868e-01 | 0.694346 |
157 | strogatz_predprey1 | NeuralNet | 0.804045 | 0.589049 | 1.026594e+00 | 0.908491 |
158 | strogatz_predprey2 | NeuralNet | 0.948876 | 0.243723 | 1.185541e-01 | 0.983712 |
159 | strogatz_shearflow1 | NeuralNet | 0.467181 | 0.211420 | 1.315940e-01 | 0.444693 |
160 | strogatz_shearflow2 | NeuralNet | 0.462558 | 0.111373 | 2.764929e-02 | 0.729255 |
161 | strogatz_vdp1 | NeuralNet | 0.752528 | 0.523250 | 7.033177e-01 | 0.675935 |
162 | strogatz_vdp2 | NeuralNet | 0.980716 | 0.008563 | 1.598424e-04 | 0.992766 |
# | Dataset Name | Algorithm | Mean | MAE | MSE | Spearman coeff. |
---|---|---|---|---|---|---|
1 | 1027_ESL | RF | 0.828191 | 0.401064 | 0.337169 | 0.926962 |
2 | 1028_SWD | RF | 0.351163 | 0.509689 | 0.419149 | 0.603557 |
3 | 1029_LEV | RF | 0.513733 | 0.494058 | 0.439321 | 0.721755 |
4 | 1030_ERA | RF | 0.344498 | 1.263994 | 2.511468 | 0.551032 |
5 | 1089_USCrime | RF | 0.787284 | 14.124231 | 307.610681 | 0.825673 |
6 | 1096_FacultySalaries | RF | 0.646406 | 1.674051 | 5.056379 | 0.86207 |
7 | 1199_BNG_echoMonths | RF | 0.466759 | 8.845958 | 133.148714 | 0.658935 |
8 | 192_vineyard | RF | 0.436878 | 2.431581 | 9.063079 | 0.677302 |
9 | 197_cpu_act | RF | 0.98191 | 1.698497 | 5.986437 | 0.960177 |
10 | 210_cloud | RF | 0.749484 | 0.309371 | 0.26005 | 0.909837 |
11 | 225_puma8NH | RF | 0.665886 | 2.50206 | 10.62004 | 0.810112 |
12 | 227_cpu_small | RF | 0.975433 | 1.976695 | 8.121429 | 0.947948 |
13 | 228_elusage | RF | 0.738112 | 8.737147 | 144.746379 | 0.792774 |
14 | 229_pwLinear | RF | 0.836583 | 1.346275 | 3.055031 | 0.913973 |
15 | 294_satellite_image | RF | 0.901088 | 0.356574 | 0.482946 | 0.948016 |
16 | 4544_GeographicalOriginalofMusic | RF | 0.709701 | 0.391354 | 0.299695 | 0.832358 |
17 | 503_wind | RF | 0.782594 | 2.380768 | 9.724192 | 0.886184 |
18 | 505_tecator | RF | 0.988714 | 1.058452 | 2.278102 | 0.985042 |
19 | 519_vinnie | RF | 0.65268 | 1.386581 | 3.146539 | 0.819722 |
20 | 522_pm10 | RF | 0.38052 | 0.536843 | 0.476258 | 0.635714 |
21 | 523_analcatdata_neavote | RF | 0.946035 | 0.584734 | 0.773968 | 0.859502 |
22 | 529_pollen | RF | 0.753629 | 1.222244 | 2.407963 | 0.859562 |
23 | 547_no2 | RF | 0.590758 | 0.372885 | 0.227522 | 0.737098 |
24 | 560_bodyfat | RF | 0.967986 | 0.437872 | 2.131262 | 0.989025 |
25 | 562_cpu_small | RF | 0.975361 | 1.978607 | 8.143903 | 0.947786 |
26 | 573_cpu_act | RF | 0.98196 | 1.698095 | 5.969805 | 0.960181 |
27 | 579_fri_c0_250_5 | RF | 0.747678 | 0.384239 | 0.233941 | 0.870979 |
28 | 581_fri_c3_500_25 | RF | 0.864109 | 0.289336 | 0.135995 | 0.926033 |
29 | 582_fri_c1_500_25 | RF | 0.825411 | 0.323105 | 0.174802 | 0.910122 |
30 | 583_fri_c1_1000_50 | RF | 0.862768 | 0.284879 | 0.136845 | 0.926156 |
31 | 584_fri_c4_500_25 | RF | 0.846209 | 0.302163 | 0.152214 | 0.909411 |
32 | 586_fri_c3_1000_25 | RF | 0.892224 | 0.251459 | 0.107145 | 0.940938 |
33 | 588_fri_c4_1000_100 | RF | 0.876883 | 0.274382 | 0.124012 | 0.923148 |
34 | 589_fri_c2_1000_25 | RF | 0.875381 | 0.274231 | 0.122588 | 0.932206 |
35 | 590_fri_c0_1000_50 | RF | 0.770152 | 0.376476 | 0.22257 | 0.879496 |
36 | 591_fri_c1_100_10 | RF | 0.688072 | 0.424705 | 0.307654 | 0.831629 |
37 | 592_fri_c4_1000_25 | RF | 0.886852 | 0.258927 | 0.111977 | 0.934837 |
38 | 593_fri_c1_1000_10 | RF | 0.904628 | 0.242149 | 0.095112 | 0.944207 |
39 | 594_fri_c2_100_5 | RF | 0.653107 | 0.431547 | 0.323924 | 0.785614 |
40 | 595_fri_c0_1000_10 | RF | 0.825544 | 0.328153 | 0.173882 | 0.910602 |
41 | 596_fri_c2_250_5 | RF | 0.835352 | 0.310415 | 0.156812 | 0.902047 |
42 | 597_fri_c2_500_5 | RF | 0.894523 | 0.249443 | 0.104584 | 0.93249 |
43 | 598_fri_c0_1000_25 | RF | 0.810615 | 0.343871 | 0.187126 | 0.90705 |
44 | 599_fri_c2_1000_5 | RF | 0.92493 | 0.208877 | 0.074809 | 0.951968 |
45 | 601_fri_c1_250_5 | RF | 0.855096 | 0.2867 | 0.135888 | 0.916325 |
46 | 602_fri_c3_250_10 | RF | 0.795581 | 0.329389 | 0.19609 | 0.901573 |
47 | 603_fri_c0_250_50 | RF | 0.659874 | 0.449403 | 0.323374 | 0.833863 |
48 | 604_fri_c4_500_10 | RF | 0.882066 | 0.260423 | 0.116341 | 0.930626 |
49 | 605_fri_c2_250_25 | RF | 0.725294 | 0.403836 | 0.266039 | 0.850122 |
50 | 606_fri_c2_1000_10 | RF | 0.900382 | 0.244165 | 0.097612 | 0.942411 |
51 | 607_fri_c4_1000_50 | RF | 0.880011 | 0.263028 | 0.116614 | 0.931114 |
52 | 608_fri_c3_1000_10 | RF | 0.903813 | 0.233693 | 0.093653 | 0.946613 |
53 | 609_fri_c0_1000_5 | RF | 0.868003 | 0.290612 | 0.134467 | 0.933037 |
54 | 611_fri_c3_100_5 | RF | 0.714641 | 0.402382 | 0.24842 | 0.841404 |
55 | 612_fri_c1_1000_5 | RF | 0.922967 | 0.208122 | 0.073467 | 0.958037 |
56 | 613_fri_c3_250_5 | RF | 0.849498 | 0.301571 | 0.152947 | 0.906948 |
57 | 615_fri_c4_250_10 | RF | 0.76537 | 0.36884 | 0.235968 | 0.871497 |
58 | 616_fri_c4_500_50 | RF | 0.835236 | 0.309777 | 0.16219 | 0.90138 |
59 | 617_fri_c3_500_5 | RF | 0.879998 | 0.255318 | 0.117882 | 0.933057 |
60 | 618_fri_c3_1000_50 | RF | 0.883478 | 0.264537 | 0.112224 | 0.923763 |
61 | 620_fri_c1_1000_25 | RF | 0.874473 | 0.282563 | 0.126939 | 0.931109 |
62 | 621_fri_c0_100_10 | RF | 0.569101 | 0.473485 | 0.378732 | 0.776792 |
63 | 622_fri_c2_1000_50 | RF | 0.874072 | 0.278149 | 0.124303 | 0.922776 |
64 | 623_fri_c4_1000_10 | RF | 0.897811 | 0.237144 | 0.098251 | 0.936852 |
65 | 624_fri_c0_100_5 | RF | 0.704611 | 0.393722 | 0.245705 | 0.847368 |
66 | 626_fri_c2_500_50 | RF | 0.823008 | 0.321812 | 0.175854 | 0.903049 |
67 | 627_fri_c2_500_10 | RF | 0.872773 | 0.259418 | 0.117884 | 0.925824 |
68 | 628_fri_c3_1000_5 | RF | 0.923906 | 0.212655 | 0.075839 | 0.955071 |
69 | 631_fri_c1_500_5 | RF | 0.866227 | 0.275814 | 0.131199 | 0.92734 |
70 | 633_fri_c0_500_25 | RF | 0.765071 | 0.386825 | 0.229933 | 0.875553 |
71 | 634_fri_c2_100_10 | RF | 0.638825 | 0.458935 | 0.335267 | 0.770125 |
72 | 635_fri_c0_250_10 | RF | 0.703241 | 0.413184 | 0.266993 | 0.838454 |
73 | 637_fri_c1_500_50 | RF | 0.797855 | 0.345077 | 0.19983 | 0.890665 |
74 | 641_fri_c1_500_10 | RF | 0.885827 | 0.265389 | 0.114417 | 0.939647 |
75 | 643_fri_c2_500_25 | RF | 0.828081 | 0.3241 | 0.170461 | 0.895178 |
76 | 644_fri_c4_250_25 | RF | 0.729438 | 0.386083 | 0.26263 | 0.869125 |
77 | 645_fri_c3_500_50 | RF | 0.828978 | 0.314482 | 0.165812 | 0.892874 |
78 | 646_fri_c3_500_10 | RF | 0.880021 | 0.266839 | 0.116977 | 0.926986 |
79 | 647_fri_c1_250_10 | RF | 0.811044 | 0.331034 | 0.185019 | 0.881623 |
80 | 648_fri_c1_250_50 | RF | 0.751043 | 0.383354 | 0.247733 | 0.861605 |
81 | 649_fri_c0_500_5 | RF | 0.82724 | 0.317792 | 0.171945 | 0.911659 |
82 | 650_fri_c0_500_50 | RF | 0.739479 | 0.404096 | 0.257992 | 0.870969 |
83 | 651_fri_c0_100_25 | RF | 0.474601 | 0.620371 | 0.56962 | 0.741504 |
84 | 653_fri_c0_250_25 | RF | 0.691073 | 0.435891 | 0.297447 | 0.840384 |
85 | 654_fri_c0_500_10 | RF | 0.787069 | 0.355176 | 0.207396 | 0.892637 |
86 | 656_fri_c1_100_5 | RF | 0.622067 | 0.420105 | 0.29974 | 0.826065 |
87 | 657_fri_c2_250_10 | RF | 0.841526 | 0.296253 | 0.150434 | 0.88202 |
88 | 658_fri_c3_250_25 | RF | 0.716241 | 0.411146 | 0.28833 | 0.851425 |
89 | 663_rabe_266 | RF | 0.988249 | 3.699082 | 29.70961 | 0.990613 |
90 | 665_sleuth_case2002 | RF | 0.21152 | 5.729268 | 66.67892 | 0.393441 |
91 | 666_rmftsa_ladata | RF | 0.534759 | 1.339757 | 3.505726 | 0.620323 |
92 | 678_visualizing_environmental | RF | 0.167423 | 2.409752 | 9.850855 | 0.519201 |
93 | 687_sleuth_ex1605 | RF | 0.478443 | 7.47383 | 97.048396 | 0.713633 |
94 | 690_visualizing_galaxy | RF | 0.972833 | 10.963723 | 244.234181 | 0.980709 |
95 | 695_chatfield_4 | RF | 0.822858 | 13.143151 | 367.921677 | 0.928416 |
96 | 712_chscase_geyser1 | RF | 0.681945 | 5.679489 | 50.057784 | 0.715691 |
97 | feynman_III_12_43 | RF | 0.999994 | 0.001455 | 0.000004 | 0.999997 |
98 | feynman_III_15_12 | RF | 0.996401 | 0.218054 | 0.094469 | 0.997972 |
99 | feynman_III_15_14 | RF | 0.998693 | 0.000186 | 0.0 | 0.999891 |
100 | feynman_III_15_27 | RF | 0.999309 | 0.032186 | 0.004928 | 0.999866 |
101 | feynman_III_17_37 | RF | 0.99973 | 0.056548 | 0.006815 | 0.999917 |
102 | feynman_III_7_38 | RF | 0.999573 | 0.409599 | 0.552861 | 0.999868 |
103 | feynman_III_8_54 | RF | 0.965408 | 0.039261 | 0.004317 | 0.97834 |
104 | feynman_II_10_9 | RF | 0.999448 | 0.002955 | 0.000034 | 0.999874 |
105 | feynman_II_11_28 | RF | 0.999992 | 0.000481 | 0.000001 | 0.999997 |
106 | feynman_II_13_23 | RF | 0.999951 | 0.004556 | 0.000073 | 0.99998 |
107 | feynman_II_13_34 | RF | 0.999861 | 0.015029 | 0.000616 | 0.999954 |
108 | feynman_II_15_4 | RF | 0.999696 | 0.062073 | 0.008304 | 0.999894 |
109 | feynman_II_15_5 | RF | 0.999697 | 0.062022 | 0.00828 | 0.999894 |
110 | feynman_II_24_17 | RF | 0.999885 | 0.006752 | 0.000086 | 0.999944 |
111 | feynman_II_27_16 | RF | 0.99971 | 0.891547 | 2.322599 | 0.999906 |
112 | feynman_II_27_18 | RF | 0.999992 | 0.044954 | 0.005057 | 0.999997 |
113 | feynman_II_34_2 | RF | 0.999737 | 0.104984 | 0.024754 | 0.999887 |
114 | feynman_II_34_29a | RF | 0.999531 | 0.002667 | 0.000025 | 0.999868 |
115 | feynman_II_34_2a | RF | 0.999604 | 0.005112 | 0.000083 | 0.999869 |
116 | feynman_II_37_1 | RF | 0.999754 | 0.250366 | 0.135856 | 0.999893 |
117 | feynman_II_38_14 | RF | 0.999991 | 0.000411 | 0.0 | 0.999996 |
118 | feynman_II_3_24 | RF | 0.999981 | 0.000101 | 0.0 | 0.999997 |
119 | feynman_II_4_23 | RF | 0.999287 | 0.000406 | 0.000001 | 0.999869 |
120 | feynman_II_8_31 | RF | 0.999993 | 0.02147 | 0.001142 | 0.999997 |
121 | feynman_II_8_7 | RF | 0.999148 | 0.001033 | 0.000007 | 0.999891 |
122 | feynman_I_10_7 | RF | 0.99995 | 0.00454 | 0.000073 | 0.999981 |
123 | feynman_I_12_1 | RF | 0.999994 | 0.009182 | 0.000162 | 0.999997 |
124 | feynman_I_12_4 | RF | 0.999006 | 0.000286 | 0.000001 | 0.999896 |
125 | feynman_I_12_5 | RF | 0.999994 | 0.009267 | 0.000166 | 0.999997 |
126 | feynman_I_14_3 | RF | 0.999739 | 0.209379 | 0.097592 | 0.999884 |
127 | feynman_I_14_4 | RF | 0.999993 | 0.02161 | 0.001161 | 0.999997 |
128 | feynman_I_15_10 | RF | 0.999872 | 0.01471 | 0.000571 | 0.999957 |
129 | feynman_I_16_6 | RF | 0.999897 | 0.007634 | 0.000133 | 0.999946 |
130 | feynman_I_18_12 | RF | 0.9998 | 0.073355 | 0.010974 | 0.999927 |
131 | feynman_I_25_13 | RF | 0.999987 | 0.001564 | 0.000008 | 0.999996 |
132 | feynman_I_26_2 | RF | 0.999993 | 0.00082 | 0.000002 | 0.999998 |
133 | feynman_I_27_6 | RF | 0.999747 | 0.003801 | 0.000032 | 0.999894 |
134 | feynman_I_29_4 | RF | 0.999979 | 0.00267 | 0.000038 | 0.999995 |
135 | feynman_I_30_3 | RF | 0.997875 | 0.07419 | 0.014129 | 0.998623 |
136 | feynman_I_30_5 | RF | 0.999473 | 0.001381 | 0.000008 | 0.999891 |
137 | feynman_I_34_1 | RF | 0.999739 | 0.015217 | 0.000829 | 0.999932 |
138 | feynman_I_34_14 | RF | 0.999841 | 0.012027 | 0.000418 | 0.99995 |
139 | feynman_I_34_27 | RF | 0.999993 | 0.001505 | 0.000004 | 0.999997 |
140 | feynman_I_37_4 | RF | 0.999741 | 0.030484 | 0.002131 | 0.999903 |
141 | feynman_I_39_1 | RF | 0.999993 | 0.014017 | 0.000382 | 0.999997 |
142 | feynman_I_39_11 | RF | 0.999621 | 0.034505 | 0.003542 | 0.999879 |
143 | feynman_I_43_31 | RF | 0.999744 | 0.209035 | 0.096614 | 0.999887 |
144 | feynman_I_47_23 | RF | 0.999705 | 0.007894 | 0.000134 | 0.999871 |
145 | feynman_I_48_2 | RF | 0.999949 | 0.5204 | 0.529267 | 0.999975 |
146 | feynman_I_6_2 | RF | 0.99999 | 0.000082 | 0.0 | 0.999996 |
147 | feynman_I_6_2b | RF | 0.999608 | 0.000729 | 0.000001 | 0.999819 |
148 | nikuradse_1 | RF | 0.998261 | 0.004636 | 0.000043 | 0.990062 |
149 | strogatz_bacres1 | RF | 0.99768 | 0.051648 | 0.014431 | 0.971473 |
150 | strogatz_bacres2 | RF | 0.993031 | 0.050096 | 0.031067 | 0.99526 |
151 | strogatz_barmag1 | RF | 0.959565 | 0.01236 | 0.00262 | 0.995512 |
152 | strogatz_barmag2 | RF | 0.990584 | 0.009236 | 0.000698 | 0.991259 |
153 | strogatz_glider1 | RF | 0.971766 | 0.088604 | 0.017235 | 0.986356 |
154 | strogatz_glider2 | RF | 0.950929 | 0.114753 | 0.043182 | 0.969765 |
155 | strogatz_lv1 | RF | 0.600604 | 0.217083 | 8.024616 | 0.993335 |
156 | strogatz_lv2 | RF | 0.596278 | 0.089651 | 0.526566 | 0.971766 |
157 | strogatz_predprey1 | RF | 0.926847 | 0.148677 | 0.806437 | 0.989732 |
158 | strogatz_predprey2 | RF | 0.984844 | 0.08214 | 0.036595 | 0.993528 |
159 | strogatz_shearflow1 | RF | 0.954483 | 0.026438 | 0.012076 | 0.992629 |
160 | strogatz_shearflow2 | RF | 0.978411 | 0.010467 | 0.00115 | 0.995866 |
161 | strogatz_vdp1 | RF | 0.877137 | 0.162954 | 0.317669 | 0.937715 |
162 | strogatz_vdp2 | RF | 0.999721 | 0.000652 | 0.000002 | 0.999759 |
# | Dataset Name | Algorithm | Mean | MAE | MSE | Spearman coeff. |
---|---|---|---|---|---|---|
1 | 1027_ESL | LGBM | 0.844445 | 0.413833 | 0.305865 | 0.931222 |
2 | 1028_SWD | LGBM | 0.359984 | 0.506554 | 0.413474 | 0.609287 |
3 | 1029_LEV | LGBM | 0.524891 | 0.490583 | 0.429172 | 0.728595 |
4 | 1030_ERA | LGBM | 0.346505 | 1.262428 | 2.504109 | 0.552466 |
5 | 1089_USCrime | LGBM | -0.160501 | 36.23973 | 1727.36776 | NaN |
6 | 1096_FacultySalaries | LGBM | -0.103898 | 3.356827 | 20.915339 | 0.760385 |
7 | 1199_BNG_echoMonths | LGBM | 0.476546 | 8.791663 | 130.713328 | 0.666369 |
8 | 192_vineyard | LGBM | -0.063398 | 3.319048 | 19.168705 | 0.715896 |
9 | 197_cpu_act | LGBM | 0.98535 | 1.555198 | 4.839184 | 0.966604 |
10 | 210_cloud | LGBM | 0.607391 | 0.401731 | 0.409442 | 0.873838 |
11 | 225_puma8NH | LGBM | 0.667638 | 2.499582 | 10.564509 | 0.810515 |
12 | 227_cpu_small | LGBM | 0.979111 | 1.856602 | 6.902363 | 0.954275 |
13 | 228_elusage | LGBM | 0.536545 | 10.575459 | 245.965478 | 0.658258 |
14 | 229_pwLinear | LGBM | 0.849223 | 1.3139 | 2.845389 | 0.923151 |
15 | 294_satellite_image | LGBM | 0.901703 | 0.366712 | 0.479844 | 0.947657 |
16 | 4544_GeographicalOriginalofMusic | LGBM | 0.725209 | 0.382052 | 0.281681 | 0.84111 |
17 | 503_wind | LGBM | 0.794789 | 2.322967 | 9.178021 | 0.891768 |
18 | 505_tecator | LGBM | 0.980731 | 1.27209 | 4.103341 | 0.987641 |
19 | 519_vinnie | LGBM | 0.703883 | 1.31389 | 2.669673 | 0.839341 |
20 | 522_pm10 | LGBM | 0.452875 | 0.509969 | 0.419688 | 0.684123 |
21 | 523_analcatdata_neavote | LGBM | 0.913691 | 0.620019 | 1.24188 | 0.870881 |
22 | 529_pollen | LGBM | 0.7574 | 1.215903 | 2.369972 | 0.862246 |
23 | 547_no2 | LGBM | 0.589485 | 0.373971 | 0.228427 | 0.753452 |
24 | 560_bodyfat | LGBM | 0.95801 | 0.896907 | 2.929498 | 0.988279 |
25 | 562_cpu_small | LGBM | 0.979111 | 1.856602 | 6.902363 | 0.954275 |
26 | 573_cpu_act | LGBM | 0.98535 | 1.555198 | 4.839184 | 0.966604 |
27 | 579_fri_c0_250_5 | LGBM | 0.817921 | 0.310909 | 0.168316 | 0.899617 |
28 | 581_fri_c3_500_25 | LGBM | 0.895226 | 0.245636 | 0.104176 | 0.93901 |
29 | 582_fri_c1_500_25 | LGBM | 0.889654 | 0.254534 | 0.110448 | 0.941618 |
30 | 583_fri_c1_1000_50 | LGBM | 0.919194 | 0.21828 | 0.080416 | 0.956086 |
31 | 584_fri_c4_500_25 | LGBM | 0.879668 | 0.256692 | 0.118972 | 0.929843 |
32 | 586_fri_c3_1000_25 | LGBM | 0.929915 | 0.200981 | 0.069801 | 0.959121 |
33 | 588_fri_c4_1000_100 | LGBM | 0.911337 | 0.223146 | 0.089196 | 0.945709 |
34 | 589_fri_c2_1000_25 | LGBM | 0.918465 | 0.219992 | 0.080058 | 0.956587 |
35 | 590_fri_c0_1000_50 | LGBM | 0.874126 | 0.277299 | 0.121789 | 0.932544 |
36 | 591_fri_c1_100_10 | LGBM | 0.472097 | 0.570042 | 0.512804 | 0.684612 |
37 | 592_fri_c4_1000_25 | LGBM | 0.917119 | 0.215887 | 0.081934 | 0.951337 |
38 | 593_fri_c1_1000_10 | LGBM | 0.937294 | 0.190624 | 0.062492 | 0.962283 |
39 | 594_fri_c2_100_5 | LGBM | 0.486797 | 0.518448 | 0.468199 | 0.732305 |
40 | 595_fri_c0_1000_10 | LGBM | 0.893363 | 0.255197 | 0.106264 | 0.943872 |
41 | 596_fri_c2_250_5 | LGBM | 0.849539 | 0.284548 | 0.142847 | 0.908598 |
42 | 597_fri_c2_500_5 | LGBM | 0.925056 | 0.207562 | 0.074506 | 0.948969 |
43 | 598_fri_c0_1000_25 | LGBM | 0.899637 | 0.24992 | 0.099142 | 0.9502 |
44 | 599_fri_c2_1000_5 | LGBM | 0.943596 | 0.179893 | 0.056196 | 0.962465 |
45 | 601_fri_c1_250_5 | LGBM | 0.885572 | 0.246478 | 0.106492 | 0.933273 |
46 | 602_fri_c3_250_10 | LGBM | 0.796471 | 0.306398 | 0.194118 | 0.908101 |
47 | 603_fri_c0_250_50 | LGBM | 0.805041 | 0.338462 | 0.185065 | 0.903421 |
48 | 604_fri_c4_500_10 | LGBM | 0.911491 | 0.224825 | 0.087365 | 0.943372 |
49 | 605_fri_c2_250_25 | LGBM | 0.801806 | 0.329571 | 0.193584 | 0.889578 |
50 | 606_fri_c2_1000_10 | LGBM | 0.933535 | 0.196463 | 0.0653 | 0.959411 |
51 | 607_fri_c4_1000_50 | LGBM | 0.915084 | 0.213321 | 0.082569 | 0.950092 |
52 | 608_fri_c3_1000_10 | LGBM | 0.923461 | 0.201144 | 0.074706 | 0.958283 |
53 | 609_fri_c0_1000_5 | LGBM | 0.919816 | 0.225162 | 0.081722 | 0.957937 |
54 | 611_fri_c3_100_5 | LGBM | 0.608191 | 0.465827 | 0.340244 | 0.739156 |
55 | 612_fri_c1_1000_5 | LGBM | 0.943877 | 0.178135 | 0.053381 | 0.969261 |
56 | 613_fri_c3_250_5 | LGBM | 0.854812 | 0.281067 | 0.147232 | 0.91031 |
57 | 615_fri_c4_250_10 | LGBM | 0.815332 | 0.309522 | 0.188706 | 0.895958 |
58 | 616_fri_c4_500_50 | LGBM | 0.856199 | 0.277982 | 0.140655 | 0.908752 |
59 | 617_fri_c3_500_5 | LGBM | 0.895849 | 0.225265 | 0.102203 | 0.944819 |
60 | 618_fri_c3_1000_50 | LGBM | 0.915046 | 0.217885 | 0.081729 | 0.942882 |
61 | 620_fri_c1_1000_25 | LGBM | 0.928287 | 0.209569 | 0.072189 | 0.959867 |
62 | 621_fri_c0_100_10 | LGBM | 0.725334 | 0.374063 | 0.235699 | 0.849173 |
63 | 622_fri_c2_1000_50 | LGBM | 0.915263 | 0.222149 | 0.08352 | 0.946356 |
64 | 623_fri_c4_1000_10 | LGBM | 0.9176 | 0.206872 | 0.079421 | 0.946893 |
65 | 624_fri_c0_100_5 | LGBM | 0.663385 | 0.414732 | 0.279963 | 0.807866 |
66 | 626_fri_c2_500_50 | LGBM | 0.881852 | 0.262801 | 0.116937 | 0.927061 |
67 | 627_fri_c2_500_10 | LGBM | 0.895066 | 0.229627 | 0.097222 | 0.937239 |
68 | 628_fri_c3_1000_5 | LGBM | 0.939786 | 0.185646 | 0.060039 | 0.96369 |
69 | 631_fri_c1_500_5 | LGBM | 0.894575 | 0.235954 | 0.103217 | 0.94264 |
70 | 633_fri_c0_500_25 | LGBM | 0.867758 | 0.28009 | 0.128961 | 0.928568 |
71 | 634_fri_c2_100_10 | LGBM | 0.554713 | 0.489593 | 0.412479 | 0.755689 |
72 | 635_fri_c0_250_10 | LGBM | 0.808018 | 0.325511 | 0.173748 | 0.888413 |
73 | 637_fri_c1_500_50 | LGBM | 0.88029 | 0.262 | 0.117988 | 0.935986 |
74 | 641_fri_c1_500_10 | LGBM | 0.920707 | 0.21553 | 0.079372 | 0.957008 |
75 | 643_fri_c2_500_25 | LGBM | 0.886997 | 0.256788 | 0.111016 | 0.929579 |
76 | 644_fri_c4_250_25 | LGBM | 0.763602 | 0.343271 | 0.229947 | 0.88179 |
77 | 645_fri_c3_500_50 | LGBM | 0.861184 | 0.269186 | 0.134287 | 0.910007 |
78 | 646_fri_c3_500_10 | LGBM | 0.895953 | 0.229111 | 0.101617 | 0.938216 |
79 | 647_fri_c1_250_10 | LGBM | 0.871382 | 0.272282 | 0.12434 | 0.920986 |
80 | 648_fri_c1_250_50 | LGBM | 0.826962 | 0.309924 | 0.169927 | 0.896567 |
81 | 649_fri_c0_500_5 | LGBM | 0.894272 | 0.251709 | 0.104998 | 0.946873 |
82 | 650_fri_c0_500_50 | LGBM | 0.847083 | 0.302548 | 0.150672 | 0.923984 |
83 | 651_fri_c0_100_25 | LGBM | 0.553901 | 0.551767 | 0.477772 | 0.769774 |
84 | 653_fri_c0_250_25 | LGBM | 0.790537 | 0.34996 | 0.201084 | 0.879091 |
85 | 654_fri_c0_500_10 | LGBM | 0.871514 | 0.272861 | 0.125055 | 0.934404 |
86 | 656_fri_c1_100_5 | LGBM | 0.523393 | 0.464672 | 0.378993 | 0.794858 |
87 | 657_fri_c2_250_10 | LGBM | 0.864705 | 0.273345 | 0.126912 | 0.894812 |
88 | 658_fri_c3_250_25 | LGBM | 0.768969 | 0.348552 | 0.235693 | 0.872487 |
89 | 663_rabe_266 | LGBM | 0.951179 | 8.545605 | 128.061611 | 0.970164 |
90 | 665_sleuth_case2002 | LGBM | 0.158356 | 6.037853 | 70.656119 | 0.396523 |
91 | 666_rmftsa_ladata | LGBM | 0.427431 | 1.453889 | 4.351936 | 0.561823 |
92 | 678_visualizing_environmental | LGBM | 0.192604 | 2.398792 | 9.545897 | 0.512608 |
93 | 687_sleuth_ex1605 | LGBM | 0.149332 | 10.431353 | 171.115995 | 0.501394 |
94 | 690_visualizing_galaxy | LGBM | 0.970978 | 12.036517 | 261.18156 | 0.980453 |
95 | 695_chatfield_4 | LGBM | 0.816784 | 13.612425 | 379.114996 | 0.91925 |
96 | 712_chscase_geyser1 | LGBM | 0.719652 | 5.295068 | 44.129887 | 0.724247 |
97 | feynman_III_12_43 | LGBM | 0.999807 | 0.008734 | 0.000126 | 0.999884 |
98 | feynman_III_15_12 | LGBM | 0.943683 | 0.877025 | 1.478416 | 0.966702 |
99 | feynman_III_15_14 | LGBM | 0.998361 | 0.000355 | 0.0 | 0.998577 |
100 | feynman_III_15_27 | LGBM | 0.99867 | 0.063433 | 0.009479 | 0.999169 |
101 | feynman_III_17_37 | LGBM | 0.998974 | 0.123542 | 0.025872 | 0.999485 |
102 | feynman_III_7_38 | LGBM | 0.998856 | 0.857281 | 1.47939 | 0.999199 |
103 | feynman_III_8_54 | LGBM | 0.55409 | 0.186592 | 0.055649 | 0.787142 |
104 | feynman_II_10_9 | LGBM | 0.998836 | 0.005788 | 0.000071 | 0.999288 |
105 | feynman_II_11_28 | LGBM | 0.99982 | 0.002903 | 0.000015 | 0.999845 |
106 | feynman_II_13_23 | LGBM | 0.999909 | 0.008282 | 0.000135 | 0.999959 |
107 | feynman_II_13_34 | LGBM | 0.99967 | 0.0289 | 0.001463 | 0.999854 |
108 | feynman_II_15_4 | LGBM | 0.999007 | 0.123948 | 0.027096 | 0.99948 |
109 | feynman_II_15_5 | LGBM | 0.998989 | 0.125219 | 0.027641 | 0.999477 |
110 | feynman_II_24_17 | LGBM | 0.999691 | 0.011978 | 0.00023 | 0.999832 |
111 | feynman_II_27_16 | LGBM | 0.999096 | 1.917405 | 7.245248 | 0.999355 |
112 | feynman_II_27_18 | LGBM | 0.999833 | 0.246181 | 0.109237 | 0.999893 |
113 | feynman_II_34_2 | LGBM | 0.998982 | 0.234243 | 0.095705 | 0.999271 |
114 | feynman_II_34_29a | LGBM | 0.998847 | 0.005464 | 0.00006 | 0.999204 |
115 | feynman_II_34_2a | LGBM | 0.998851 | 0.010951 | 0.000242 | 0.999194 |
116 | feynman_II_37_1 | LGBM | 0.999066 | 0.546615 | 0.515954 | 0.999381 |
117 | feynman_II_38_14 | LGBM | 0.999788 | 0.002343 | 0.00001 | 0.999883 |
118 | feynman_II_3_24 | LGBM | 0.999771 | 0.000514 | 0.000001 | 0.999797 |
119 | feynman_II_4_23 | LGBM | 0.998652 | 0.0008 | 0.000002 | 0.999158 |
120 | feynman_II_8_31 | LGBM | 0.999833 | 0.124132 | 0.027658 | 0.999891 |
121 | feynman_II_8_7 | LGBM | 0.99865 | 0.001993 | 0.000011 | 0.999217 |
122 | feynman_I_10_7 | LGBM | 0.999906 | 0.008359 | 0.000138 | 0.999958 |
123 | feynman_I_12_1 | LGBM | 0.999805 | 0.054983 | 0.005011 | 0.999884 |
124 | feynman_I_12_4 | LGBM | 0.998701 | 0.000526 | 0.000001 | 0.998989 |
125 | feynman_I_12_5 | LGBM | 0.999803 | 0.055327 | 0.005072 | 0.999883 |
126 | feynman_I_14_3 | LGBM | 0.99897 | 0.468719 | 0.384822 | 0.999263 |
127 | feynman_I_14_4 | LGBM | 0.99983 | 0.123958 | 0.027743 | 0.999889 |
128 | feynman_I_15_10 | LGBM | 0.999675 | 0.028835 | 0.00145 | 0.999852 |
129 | feynman_I_16_6 | LGBM | 0.99934 | 0.021985 | 0.000856 | 0.999655 |
130 | feynman_I_18_12 | LGBM | 0.999291 | 0.150274 | 0.038946 | 0.99961 |
131 | feynman_I_25_13 | LGBM | 0.999774 | 0.008472 | 0.000139 | 0.999854 |
132 | feynman_I_26_2 | LGBM | 0.999856 | 0.004474 | 0.000036 | 0.999931 |
133 | feynman_I_27_6 | LGBM | 0.999205 | 0.007532 | 0.000102 | 0.999539 |
134 | feynman_I_29_4 | LGBM | 0.999725 | 0.013899 | 0.000482 | 0.999811 |
135 | feynman_I_30_3 | LGBM | 0.970219 | 0.332563 | 0.197979 | 0.965192 |
136 | feynman_I_30_5 | LGBM | 0.99902 | 0.00253 | 0.000014 | 0.999463 |
137 | feynman_I_34_1 | LGBM | 0.999405 | 0.029987 | 0.00189 | 0.999756 |
138 | feynman_I_34_14 | LGBM | 0.999609 | 0.022975 | 0.001025 | 0.999836 |
139 | feynman_I_34_27 | LGBM | 0.999808 | 0.008736 | 0.000126 | 0.999886 |
140 | feynman_I_37_4 | LGBM | 0.999146 | 0.06299 | 0.007028 | 0.999475 |
141 | feynman_I_39_1 | LGBM | 0.999804 | 0.083128 | 0.011441 | 0.999882 |
142 | feynman_I_39_11 | LGBM | 0.998859 | 0.074227 | 0.010657 | 0.999252 |
143 | feynman_I_43_31 | LGBM | 0.998984 | 0.468055 | 0.382724 | 0.999274 |
144 | feynman_I_47_23 | LGBM | 0.998968 | 0.016486 | 0.000469 | 0.999409 |
145 | feynman_I_48_2 | LGBM | 0.999779 | 1.147807 | 2.279839 | 0.999849 |
146 | feynman_I_6_2 | LGBM | 0.99978 | 0.000467 | 0.0 | 0.999882 |
147 | feynman_I_6_2b | LGBM | 0.997631 | 0.00187 | 0.000009 | 0.998732 |
148 | nikuradse_1 | LGBM | 0.992314 | 0.008647 | 0.000192 | 0.984635 |
149 | strogatz_bacres1 | LGBM | 0.978434 | 0.139073 | 0.138135 | 0.918411 |
150 | strogatz_bacres2 | LGBM | 0.980091 | 0.086343 | 0.089204 | 0.985652 |
151 | strogatz_barmag1 | LGBM | 0.945527 | 0.017393 | 0.003195 | 0.98347 |
152 | strogatz_barmag2 | LGBM | 0.774422 | 0.034429 | 0.017593 | 0.930467 |
153 | strogatz_glider1 | LGBM | 0.935193 | 0.132137 | 0.039287 | 0.962894 |
154 | strogatz_glider2 | LGBM | 0.908837 | 0.1514 | 0.07748 | 0.939587 |
155 | strogatz_lv1 | LGBM | -2.331265 | 0.39982 | 8.297826 | 0.445691 |
156 | strogatz_lv2 | LGBM | -0.336276 | 0.170791 | 0.591586 | 0.795204 |
157 | strogatz_predprey1 | LGBM | 0.711828 | 0.340823 | 2.44352 | 0.93433 |
158 | strogatz_predprey2 | LGBM | 0.983054 | 0.096986 | 0.041181 | 0.991001 |
159 | strogatz_shearflow1 | LGBM | 0.812026 | 0.085183 | 0.059188 | 0.909604 |
160 | strogatz_shearflow2 | LGBM | 0.946065 | 0.023819 | 0.002921 | 0.978591 |
161 | strogatz_vdp1 | LGBM | 0.759776 | 0.291957 | 0.62354 | 0.852349 |
162 | strogatz_vdp2 | LGBM | 0.998405 | 0.001758 | 0.000013 | 0.999232 |
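For reference, the sketch below shows how the per-dataset error columns reported above (MAE, MSE, and Spearman coefficient) can be computed from held-out predictions. It is an illustrative snippet using scikit-learn and SciPy, not the evaluation harness used to produce these tables; the function name `regression_metrics` and the example values are ours.

```python
# Minimal sketch (not the authors' evaluation code): compute the error metrics
# reported in the regression tables for one dataset/algorithm pair.
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import mean_absolute_error, mean_squared_error

def regression_metrics(y_true, y_pred):
    """Return (MAE, MSE, Spearman coefficient) for one set of held-out predictions."""
    mae = mean_absolute_error(y_true, y_pred)
    mse = mean_squared_error(y_true, y_pred)
    rho, _ = spearmanr(y_true, y_pred)  # rank correlation, as in the last column
    return mae, mse, rho

# Example with synthetic values
y_true = np.array([3.1, 2.4, 5.0, 4.2])
y_pred = np.array([2.9, 2.7, 4.8, 4.0])
print(regression_metrics(y_true, y_pred))
```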
Appendix G Detailed Results for Anomaly Detection
Our methods, Familiarity Conviction (FC) and Similarity Conviction (SC), are compared with six other popular anomaly detection methods. DeepSVDD was trained for 20 epochs on the inlier portion of the training data. Using a single conviction threshold of 0.7 for all datasets (i.e., without tuning it per dataset), our method achieves the highest score on 12 of the 20 datasets.
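The following is a minimal sketch of the thresholding step described above, assuming per-point conviction values are already available from the model and that points with conviction below the fixed 0.7 threshold are flagged as anomalies; the scoring metric shown (F1) is only one possible choice and may differ from the metric reported in the results table.

```python
# Minimal sketch, not the paper's implementation: flag test points whose
# conviction falls below a fixed threshold (0.7, as in the experiments above)
# and score the flags against ground-truth anomaly labels.
import numpy as np
from sklearn.metrics import f1_score

def flag_anomalies(conviction, threshold=0.7):
    """Return a 0/1 anomaly label per point: 1 if conviction < threshold."""
    return (np.asarray(conviction) < threshold).astype(int)

# Example with made-up conviction values and labels
conviction = np.array([1.3, 0.95, 0.41, 1.1, 0.62])
y_true     = np.array([0,   0,    1,    0,   1  ])
y_pred = flag_anomalies(conviction, threshold=0.7)
print(f1_score(y_true, y_pred))  # illustrative scoring only
```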
Dataset | Dataset Size [samples, features] | % Anomalies |
---|---|---|
wine.mat | [129, 13] | 7.70% |
wbc.mat | [278, 30] | 5.60% |
vowels.mat | [456, 12] | 3.40% |
vertebral.mat | [240, 6] | 12.50% |
thyroid.mat | [3772, 6] | 2.50% |
speech.mat | [3686, 400] | 1.65% |
shuttle.mat | [49097, 9] | 7% |
satimage-2.mat | [5803, 36] | 1.20% |
satellite.mat | [6435, 36] | 32% |
pima.mat | [768, 8] | 35% |
optdigits.mat | [5216, 64] | 3% |
musk.mat | [3062, 166] | 3.20% |
mnist.mat | [7603, 100] | 9.20% |
lympho.mat | [148, 18] | 4.10% |
letter.mat | [1600, 32] | 6.25% |
ionosphere.mat | [351, 33] | 36% |
glass.mat | [214, 9] | 4.20% |
cardio.mat | [1831, 21] | 9.60% |
breastw.mat | [683, 9] | 35% |
arrhythmia.mat | [452, 274] | 15% |
(Blue values indicate the best performance; brown values indicate the second-best performance.)

Dataset | Ours (FC) | Ours (SC) | OCSVM | IForest | CBLOF | LOF | ECOD | DeepSVDD
---|---|---|---|---|---|---|---|---
wine | 0.44 | 0.18 | 0.31 | 0.1 | 0.87 | 0.95 | 0.24 | 0.53
wbc | 0.57 | 0.65 | 0.54 | 0.61 | 0.51 | 0.40 | 0.44 | 0.49
vowels | 0.20 | 0.75 | 0.21 | 0.21 | 0.37 | 0.10 | 0.17 | 0.42
vertebral | 0.28 | 0.19 | 0.05 | 0.04 | 0.04 | 0.00 | 0.14 | 0.05
thyroid | 0.26 | 0.64 | 0.29 | 0.54 | 0.30 | 0.19 | 0.56 | 0.33
speech | 0.06 | 0.10 | 0.02 | 0.00 | 0.03 | 0.00 | 0.06 | 0.06
shuttle | 0.26 | 0.60 | 0.33 | 0.89 | 0.82 | 0.10 | 0.75 | 0.60
satimage-2 | 0.10 | 0.94 | 0.39 | 0.41 | 0.22 | 0.14 | 0.34 | 0.26
satellite | 0.58 | 0.75 | 0.14 | 0.51 | 0.48 | 0.04 | 0.29 | 0.66
pima | 0.52 | 0.01 | 0.12 | 0.29 | 0.25 | 0.06 | 0.22 | 0.52
optdigits | 0.10 | 0.00 | 0.03 | 0.08 | 0.18 | 0.00 | 0.03 | 0.26
musk | 0.22 | 0.78 | 0.14 | 0.71 | 0.48 | 0.00 | 0.20 | 0.48
mnist | 0.34 | 0.24 | 0.19 | 0.39 | 0.34 | 0.01 | 0.20 | 0.43
lympho | 0.35 | 0.73 | 0.38 | 0.36 | 0.29 | 0.00 | 0.09 | 0.26
letter | 0.17 | 0.43 | 0.18 | 0.10 | 0.22 | 0.08 | 0.13 | 0.31
ionosphere | 0.59 | 0.85 | 0.27 | 0.67 | 0.43 | 0.78 | 0.32 | 0.90
glass | 0.20 | 0.14 | 0.13 | 0.14 | 0.13 | 0.30 | 0.19 | 0.30
cardio | 0.35 | 0.51 | 0.24 | 0.44 | 0.54 | 0.10 | 0.48 | 0.61
breastw | 0.33 | 0.86 | 0.17 | 0.90 | 0.42 | 0.09 | 0.32 | 0.94
arrhythmia | 0.47 | 0.51 | 0.23 | 0.11 | 0.41 | 0.37 | 0.43 | 0.51
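The count of 12 out of 20 datasets quoted above can be tallied directly from this table; the small sketch below illustrates one way to do so, under the assumption that a tie for the top score counts as a win for our method. Only the first two table rows are reproduced here for brevity, and the variable names are ours.

```python
# Sketch of the per-dataset win tally from the comparison table above
# (ties for the top score count toward our method).
import pandas as pd

scores = pd.DataFrame(
    {"Ours (FC)": [0.44, 0.57], "Ours (SC)": [0.18, 0.65],
     "OCSVM": [0.31, 0.54], "IForest": [0.10, 0.61],
     "CBLOF": [0.87, 0.51], "LOF": [0.95, 0.40],
     "ECOD": [0.24, 0.44], "DeepSVDD": [0.53, 0.49]},
    index=["wine", "wbc"],  # first two rows of the table only
)

row_max = scores.max(axis=1)                                    # best score per dataset
ours_best = scores[["Ours (FC)", "Ours (SC)"]].max(axis=1) >= row_max
print(int(ours_best.sum()), "of", len(scores), "datasets")      # prints "1 of 2" here
```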