Ordinal Pattern Kernel for Brain Connectivity Network Classification
Abstract
Brain connectivity networks, which characterize the functional or structural interactions of brain regions, have been widely used for brain disease classification. Kernel-based methods, such as graph kernels (i.e., kernels defined on graphs), have been proposed to measure the similarity of brain networks and yield promising classification performance. However, most graph kernels are built on unweighted graphs (i.e., networks) in which an edge is either present or absent, neglecting the valuable weight information of edges in brain connectivity networks, where edge weights convey the strengths of temporal correlation or fiber connection between brain regions. Accordingly, in this paper, we present an ordinal pattern kernel for brain connectivity network classification. Different from existing graph kernels that measure the topological similarity of unweighted graphs, the proposed ordinal pattern kernels calculate the similarity of weighted networks by comparing their ordinal patterns. To evaluate the effectiveness of the proposed ordinal pattern kernel, we further develop a depth-first-based ordinal pattern kernel and perform extensive experiments on a real brain disease dataset from the ADNI database. The results demonstrate that our proposed ordinal pattern kernel achieves better classification performance than state-of-the-art graph kernels.
1 Introduction
Brain connectivity networks characterize the abstract connection structure of the human brain, where brain regions correspond to nodes and functional or anatomical associations between nodes are considered as edges. Brain networks are widely applied to the classification of brain diseases, including Alzheimer’s disease (AD) [1], attention deficit hyperactivity disorder (ADHD) [2], major depressive disorder (MDD) [3] and schizophrenia [4]. In these studies, various network measures, e.g., degree and clustering coefficient [5, 6], are first extracted from connectivity networks as features for classification.
Graph kernels, which measure the topological similarity of brain networks, have shown promising performance on a variety of classification problems [7, 8]. Existing graph kernels differ in the topological structures they compare, including paths, walks, trees and subgraphs. The shortest-path kernel [9] is based on paths. Random walk graph kernels [10, 11] and the return probability graph kernel [12] are based on walks. Cyclic pattern kernels [13, 14], tree pattern kernels [15, 16], the Weisfeiler-Lehman graph kernel [17] and its variant [18] are based on trees. Subgraph matching kernels [19] are based on subgraphs, and the pyramid match kernel [20, 21] is based on a pyramid structure. These graph kernels are widely used to classify network-structured data such as molecules. However, they are defined on unweighted graphs in which an edge is either present or absent, and thus neglect the valuable weight information of edges in brain connectivity networks, where edge weights convey the strengths of temporal correlation or fiber connection between brain regions.
To address this problem, we develop an ordinal pattern kernel for measuring brain network similarities. In this work, we first introduce our proposed ordinal pattern kernels and provide their theoretical foundations. We then show that computing ordinal pattern kernels is NP-hard. To avoid this NP-hard computation, we propose a depth-first-based ordinal pattern kernel. Finally, we perform classification and ordinal sub-structure mining experiments on brain disease network data. Specifically, our work has the following advantages:
• Our ordinal pattern kernel makes full use of the edge weight information in brain networks and outperforms existing state-of-the-art graph kernels in classification accuracy.

• Our ordinal pattern kernel is robust. When brain networks have missing data, our method still achieves the best classification accuracy.

• Our proposed depth-first-based ordinal pattern can capture discriminative sub-structures for seeking biomarkers of brain disease.
2 Background
2.1 Graph kernels
Graph kernels are a class of kernel functions measuring the similarities between graphs. Let $\phi: \mathcal{G} \to \mathcal{H}$ be a map that implicitly embeds the original graph data set $\mathcal{G}$ into a Hilbert space $\mathcal{H}$. A graph kernel $k: \mathcal{G} \times \mathcal{G} \to \mathbb{R}$ is a function associated with $\phi$ such that, given two graphs $G_1, G_2 \in \mathcal{G}$, the kernel is interpreted as a dot product in the high-dimensional space, $k(G_1, G_2) = \langle \phi(G_1), \phi(G_2) \rangle_{\mathcal{H}}$. If $\mathcal{H}$ is a reproducing kernel Hilbert space (RKHS), then $k$ is a positive definite kernel.
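To make the dot-product view concrete, the toy sketch below (not from the paper) uses a simple degree-histogram feature map as $\phi$; any kernel defined as such an inner product yields a positive semidefinite Gram matrix, which we verify via its eigenvalues.

```python
import numpy as np

def degree_histogram(adj, max_deg=4):
    # Illustrative feature map phi: embed a graph as its node-degree histogram.
    degs = (np.asarray(adj) > 0).sum(axis=0)
    return np.bincount(degs, minlength=max_deg + 1).astype(float)

def graph_kernel(adj1, adj2):
    # k(G1, G2) = <phi(G1), phi(G2)>: a dot product in the embedding space.
    return float(degree_histogram(adj1) @ degree_histogram(adj2))

# Three toy unweighted graphs given as adjacency matrices.
triangle = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
path3    = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
star4    = [[0, 1, 1, 1], [1, 0, 0, 0], [1, 0, 0, 0], [1, 0, 0, 0]]
graphs = [triangle, path3, star4]

# The Gram matrix of an inner-product kernel is positive semidefinite.
K = np.array([[graph_kernel(a, b) for b in graphs] for a in graphs])
assert np.all(np.linalg.eigvalsh(K) >= -1e-9)
```

The degree histogram is only a stand-in for the (implicit) embedding; real graph kernels avoid computing $\phi$ explicitly.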
2.2 Isomorphism
Let $A$ and $B$ be two nonempty sets and $\phi$ a map from $A$ to $B$, with $\circ$ and $\ast$ the algebraic operations on $A$ and $B$, respectively. If $\phi(a_1 \circ a_2) = \phi(a_1) \ast \phi(a_2)$ for all $a_1, a_2 \in A$, then $\phi$ is called a homomorphic mapping from $A$ to $B$. If $\phi$ is a homomorphic and onto mapping from $A$ to $B$, we call $A$ and $B$ homomorphic. If $\phi$ is a homomorphic and bijective mapping from $A$ to $B$, then $\phi$ is called an isomorphic mapping from $A$ to $B$, and we call $A$ and $B$ isomorphic, written $A \cong B$.
3 Ordinal pattern
The ordinal pattern was recently introduced as a new descriptor for brain connectivity networks [22], providing ordinal edge sequences for each node. Here, we extend the ordinal pattern to graphs and redefine it in graph-theoretic terms.
3.1 Ordinal pattern
A weighted network or graph $G = (V, E, W)$ consists of a set of nodes $V$, a set of edges $E$ and a weight vector $W$, with the $i$-th element of $W$ representing the connection strength of the edge $e_i$, $e_i \in E$. The ordinal pattern (OP) defined in graph $G$ is a set $OP = (V_{op}, E_{op})$ including ordinal nodes $V_{op}$ and ordinal edges $E_{op}$. $E_{op} = \{e_1, e_2, \dots, e_m\}$ is an ordinal edge set in which every two consecutive edges share a common node and the edge weights strictly decrease, $w(e_1) > w(e_2) > \dots > w(e_m)$; $e_1, \dots, e_m$ are called ordinal edges. $V_{op}$ is the vertex set whose vertices appear in the ordinal edges included in $E_{op}$. An illustration of ordinal patterns is given in Figure 1, where OP1, OP2 and OP3 are ordinal patterns from a weighted network.
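As a sketch of this definition, the snippet below enumerates the ordinal patterns of a toy weighted chain, assuming (following [22]) that an ordinal pattern is a sequence of adjacent edges with strictly decreasing weights; the edge encoding and function name are illustrative, not the paper's.

```python
def ordinal_patterns(edges):
    # Enumerate all ordinal patterns: sequences of edges in which consecutive
    # edges share a node and the edge weights strictly decrease (cf. [22]).
    # `edges` maps a frozenset node pair to its weight (a toy encoding).
    patterns = []
    def extend(seq):
        patterns.append(seq)
        last_edge, last_w = seq[-1]
        for e, w in edges.items():
            used = [p for p, _ in seq]
            if e not in used and (last_edge & e) and w < last_w:
                extend(seq + [(e, w)])
    for e, w in edges.items():
        extend([(e, w)])
    return patterns

# Toy weighted network: a chain a-b-c-d with decreasing weights.
edges = {frozenset('ab'): 0.9, frozenset('bc'): 0.7, frozenset('cd'): 0.4}
ops = ordinal_patterns(edges)
# Patterns found: [ab], [bc], [cd], [ab,bc], [bc,cd], [ab,bc,cd].
assert len(ops) == 6
```

The recursion mirrors how a pattern grows edge by edge: each extension must both touch the previous edge and carry a strictly smaller weight.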

3.2 Ordinal pattern isomorphism
A graph or network can be decomposed into multiple ordinal patterns. The set consisting of all ordinal patterns of a graph is called its ordinal pattern set $OPs$. Let $OPs_1$ and $OPs_2$ be the ordinal pattern sets of graphs $G_1$ and $G_2$, and let $OP_1 = (V_{op}^1, E_{op}^1)$ and $OP_2 = (V_{op}^2, E_{op}^2)$ be two ordinal patterns, $OP_1 \in OPs_1$, $OP_2 \in OPs_2$. An ordinal pattern isomorphism between $OP_1$ and $OP_2$ is a bijective mapping $\phi: V_{op}^1 \to V_{op}^2$ that preserves ordinal edges, i.e., $(u, v) \in E_{op}^1 \Leftrightarrow (\phi(u), \phi(v)) \in E_{op}^2$ with the edge order preserved. If such a mapping exists, $OP_1$ and $OP_2$ are isomorphic, written $OP_1 \simeq OP_2$. Let $V' \subseteq V_{op}^1$ and $V'' \subseteq V_{op}^2$ be subsets of ordinal pattern vertices. An ordinal pattern isomorphism of the induced patterns $OP_1[V']$ and $OP_2[V'']$ is called a sub-ordinal pattern isomorphism (SOPI) of $OP_1$ and $OP_2$.
4 Ordinal pattern kernel
4.1 Ordinal pattern kernel
We suppose that a graph $G$ has $N$ ordinal patterns; hence the ordinal pattern set of graph $G$ is $OPs = \{OP_1, OP_2, \dots, OP_N\}$, where $OP_i$ is the $i$-th ordinal pattern, $1 \le i \le N$. Two graphs $G_1$ and $G_2$ have their own ordinal pattern sets, $OPs_1$ and $OPs_2$. Let $\phi$ be an isomorphism mapping from ordinal pattern $OP_i$ to ordinal pattern $OP_j$, $OP_i \in OPs_1$, $OP_j \in OPs_2$. Let $\Phi(OP_i, OP_j)$ denote the set of all sub-ordinal pattern isomorphisms (SOPIs) of $OP_i$ and $OP_j$, and let $\lambda: \Phi \to \mathbb{R}^{+}$ be a weight function. The sub-ordinal pattern isomorphism kernel is defined as:
$$k_{SOPI}(OP_i, OP_j) = \sum_{\phi \in \Phi(OP_i, OP_j)} \lambda(\phi) \tag{1}$$
Theorem 1. $k_{SOPI}$ is a positive semidefinite (p.s.d.) kernel.
Proof. The kernel $k_{SOPI}$ counts the number of isomorphisms between sub-ordinal patterns in ordinal patterns $OP_i$ and $OP_j$. It is known that a kernel counting the number of isomorphisms between graph sub-structures is p.s.d. [19]. In $k_{SOPI}$, sub-ordinal patterns can be treated as special sub-structures whose nodes and edges come from ordinal patterns. Hence, $k_{SOPI}$ is p.s.d.
The sub-ordinal pattern isomorphism kernel measures the similarity between two ordinal patterns by counting the number of sub-ordinal pattern isomorphisms. From it, we obtain the ordinal pattern kernel between two graphs $G_1$ and $G_2$, defined as:
$$k_{OP}(G_1, G_2) = \sum_{OP_i \in OPs_1} \sum_{OP_j \in OPs_2} \text{iso-count}(OP_i, OP_j) \tag{2}$$
where $OPs_1$ and $OPs_2$ are the ordinal pattern sets of $G_1$ and $G_2$, and iso-count$(\cdot,\cdot)$ is a function counting isomorphisms:
$$\text{iso-count}(OP_i, OP_j) = \begin{cases} \lambda(OP_i, OP_j), & OP_i \simeq OP_j \\ 0, & \text{otherwise} \end{cases} \tag{3}$$
where $\lambda: OPs_1 \times OPs_2 \to \mathbb{R}^{+}$ is a weight function that counts the number of nodes when two ordinal patterns are isomorphic.
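A minimal sketch of formulas (2)-(3) follows, under the simplifying assumption that two unattributed ordinal patterns are isomorphic exactly when they contain the same number of edges, with the weight function counting nodes; both the representation and this simplification are mine, not the paper's.

```python
def iso_count(op1, op2):
    # Eq. (3), simplified for unattributed patterns: two ordinal patterns are
    # taken as isomorphic iff they have the same number of edges; the weight
    # function then counts the number of nodes (edges + 1).
    return len(op1) + 1 if len(op1) == len(op2) else 0

def ordinal_pattern_kernel(ops1, ops2):
    # Eq. (2): sum iso-count over all pairs of ordinal patterns.
    return sum(iso_count(p, q) for p in ops1 for q in ops2)

# Two toy ordinal pattern sets; each pattern is given as a list of edges.
ops1 = [['ab'], ['ab', 'bc']]
ops2 = [['xy'], ['xy', 'yz'], ['xy', 'yz', 'zw']]
# One length-1 match (2 nodes) and one length-2 match (3 nodes): 2 + 3 = 5.
assert ordinal_pattern_kernel(ops1, ops2) == 5
```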
4.2 Ordinal pattern attribute kernel
If the ordinal patterns have node and edge attributes, the above formulation cannot define a mapping that preserves these attributes. We therefore generalize formula (2) to ordinal patterns with attributes; the new kernel is called the ordinal pattern attribute (OPA) kernel.
$$k_{OPA}(G_1, G_2) = \sum_{OP_i \in OPs_1} \sum_{OP_j \in OPs_2} k_{iso}(OP_i, OP_j) \tag{4}$$

$$k_{iso}(OP_i, OP_j) = \sum_{\phi \in \Phi(OP_i, OP_j)} \prod_{v \in V_{op}^i} K_V\big(v, \phi(v)\big) \prod_{e \in E_{op}^i} K_E\big(e, \phi(e)\big) \tag{5}$$
where KV and KE are two positive semidefinite kernel functions defined on ordinal pattern node and edge attribute features.
Theorem 2. The ordinal pattern attribute kernel is p.s.d
Before proving Theorem 2, we introduce the relation $R$ on ordinal patterns (see supplement) and the R-ordinal pattern convolution. Suppose $OPs_1$ and $OPs_2$ are the ordinal pattern sets of graphs $G_1$ and $G_2$, with ordinal patterns $OP_1 = (V_{op}^1, E_{op}^1) \in OPs_1$ and $OP_2 = (V_{op}^2, E_{op}^2) \in OPs_2$, whose decompositions consist of their vertex and edge parts. We define a function iso-count$_i(\cdot,\cdot)$ on $OPs_1 \times OPs_2$ that measures the similarity of ordinal patterns $OP_1$ and $OP_2$, and two positive semidefinite kernels $K_V$ and $K_E$ that measure the similarity of the ordinal nodes and ordinal edges, respectively. Then we define the kernel measuring the similarity between ordinal pattern sets $OPs_1$ and $OPs_2$ as the following ordinal pattern convolution.
$$K(OPs_1, OPs_2) = \sum_{OP_1 \in OPs_1} \sum_{OP_2 \in OPs_2} k(OP_1, OP_2) \tag{6}$$

$$k(OP_1, OP_2) = \sum_{\substack{(x_1,\dots,x_d) \in R^{-1}(OP_1) \\ (y_1,\dots,y_d) \in R^{-1}(OP_2)}} \prod_{i=1}^{d} k_i(x_i, y_i) \tag{7}$$
where $R$ is a symmetric relation on the decompositions, and $K$ is the kernel built from iso-count$_i(\cdot,\cdot)$, $K_V$ and $K_E$ through the relation $R$. Hence $K$ is called the R-ordinal pattern convolution, which is the zero extension of $K$ to $OPs_1 \times OPs_2$. If $R$ is finite, then $K$ is a finite convolution.
Theorem 3. The R-ordinal pattern convolution is a kernel on $OPs_1 \times OPs_2$.
Proof. Let $U$ denote the set of all decomposition parts. Since iso-count$_i(\cdot,\cdot)$, $K_V$ and $K_E$ are kernels, their tensor product is also a kernel, by the closure of kernels under tensor products:
$$k\big((x_1, \dots, x_d), (y_1, \dots, y_d)\big) = \prod_{i=1}^{d} k_i(x_i, y_i) \tag{8}$$
Since $R$ is finite, according to Lemma 1, $\hat{K}$ is a kernel on the product of the set of all nonempty finite subsets of $U$ with itself.
$$\hat{K}(A, B) = \sum_{x \in A} \sum_{y \in B} k(x, y) \tag{9}$$
Since the R-ordinal pattern convolution is the zero extension of $\hat{K}$, it follows that it is a kernel on $OPs_1 \times OPs_2$. The OPA kernel is an R-ordinal pattern convolution kernel in which each component kernel is p.s.d.; hence the OPA kernel is p.s.d.
Lemma 1. Let $K$ be a kernel on a set $U$. For all nonempty finite subsets $A, B \subseteq U$, define the function $\hat{K}(A, B) = \sum_{x \in A} \sum_{y \in B} K(x, y)$. Then $\hat{K}$ is a kernel on the product of the set of all nonempty finite subsets of $U$ with itself.
Proof. For any finite nonempty subset $A \subseteq U$, let $\Phi(A) = \sum_{x \in A} \phi(x)$, where $\phi$ maps $U$ into the pre-Hilbert space associated with $K$. For nonempty finite $A, B \subseteq U$, we then have $\hat{K}(A, B) = \langle \Phi(A), \Phi(B) \rangle$ by Equation (1) in the supplement. Because an inner product is a kernel, it follows that $\hat{K}$ is a kernel on the product of the set of all nonempty finite subsets of $U$ with itself.
The ordinal pattern kernel in formula (2) is a special case of the ordinal pattern attribute kernel in formula (4). When node and edge attributes are not taken into account, the kernels on node and edge attributes are identically equal to 1, $K_V \equiv 1$ and $K_E \equiv 1$; hence formula (4) degenerates to formula (2).
These constant kernels guarantee that exactly the conditions of ordinal pattern isomorphism are fulfilled. Hence the ordinal pattern kernel is a special case of the OPA kernel, and we obtain the following corollary.
Corollary 1. The ordinal pattern kernel is p.s.d
Although we now know how to calculate the ordinal pattern kernel and the ordinal pattern attribute kernel, a further problem remains: computing kernels based on ordinal pattern isomorphism, including both kernels above, is NP-hard.
Theorem 4. Computing the kernel based on ordinal pattern isomorphism is NP-hard.
Proof. Let $OP_n \in OPs$ be an ordinal pattern with $n$ edges and weight sequence $w_1 > w_2 > \dots > w_n$, and let $e_{op}$ be a vector in the ordinal pattern feature space, which is defined by a mapping $\phi$ from $OPs$ into a Hilbert space with one feature for each ordinal pattern. In $e_{op}$, the feature corresponding to $OP_n$ equals 1 and all others equal 0. Let $G$ be any graph with $m$ vertices. According to [11], for linearly independent $\{\phi(OP_1), \dots, \phi(OP_m)\}$ we can always find coefficients $\alpha_1, \dots, \alpha_m$ in polynomial time such that $\alpha_1 \phi(OP_1) + \dots + \alpha_m \phi(OP_m) = e_{op}$. Then $\langle \alpha_1 \phi(OP_1) + \dots + \alpha_m \phi(OP_m), \phi(G) \rangle > 0$ if and only if $G$ has a Hamiltonian path. Since finding a Hamiltonian path is NP-complete, computing the kernel based on ordinal pattern isomorphism is NP-hard.
5 Modified ordinal pattern kernel
The above computation suffers from two problems. One is that deciding whether an ordinal pattern has a Hamiltonian path is NP-complete. The other is that one ordinal pattern $OP_1$ may be a sub-ordinal pattern of another ordinal pattern $OP_2$. For example, in Figure 1, $OP_1 = (V_1, E_1)$ with $V_1 = \{a, b, c\}$ and $E_1 = \{e_{ab}, e_{bc}\}$, and $OP_2 = (V_2, E_2)$ with $V_2 = \{a, b, c, d\}$ and $E_2 = \{e_{ab}, e_{bc}, e_{cd}\}$, so $OP_1$ is a sub-ordinal pattern of $OP_2$. This brings redundant calculations for the ordinal pattern kernel. To overcome these two problems, we propose a modified ordinal pattern kernel based on depth-first search. We adopt the depth-first search (DFS) algorithm to seek the deepest ordinal pattern for each node in the graph and then design the relevant ordinal pattern kernel on them. The ordinal pattern constructed by depth-first search is called the depth-first-based ordinal pattern (DOP). The DOP is a linear structure, so the isomorphism problem on DOPs can be regarded as a matching problem.
5.1 Depth-first-based ordinal pattern (DOP)
Here, we describe the construction of the depth-first-based ordinal pattern in detail. A weighted network or graph $G = (V, E, W)$ consists of a set of nodes $V$, edges $E$ and a weight vector $W$. For a vertex $u \in V$, let $N(u)$ denote its neighborhood vertex set and $W(u) = \{w_{uv} \mid v \in N(u)\}$ the set of weights of the edges between $u$ and its neighbors. In graph $G$, we arbitrarily choose a node as the start node $v_0$ and use the depth-first search algorithm to seek the deepest ordinal pattern of node $v_0$. The detailed process of constructing the depth-first-based ordinal pattern for node $v_0$ is given in Algorithm 1.
Input: Weight matrix W of graph G, start node v0
Output: The deepest ordinal pattern of node v0
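Since the body of Algorithm 1 is not reproduced here, the following sketch reconstructs one plausible implementation: a depth-first search for the longest strictly-decreasing-weight path starting at v0. Function and variable names are illustrative, not the authors'.

```python
def deepest_ordinal_pattern(W, v0):
    # DFS for the deepest ordinal pattern of start node v0: the longest path
    # [v0, v1, ...] whose consecutive edge weights strictly decrease.
    # W is a symmetric weight matrix with 0 meaning "no edge".
    n = len(W)
    best = [v0]
    def dfs(path, last_w):
        nonlocal best
        u = path[-1]
        for v in range(n):
            # Extend only along unvisited neighbors with a smaller edge weight.
            if W[u][v] > 0 and W[u][v] < last_w and v not in path:
                dfs(path + [v], W[u][v])
        if len(path) > len(best):
            best = path
    dfs([v0], float('inf'))
    return best

# Toy 4-node weighted network (symmetric weight matrix).
W = [[0.0, 0.9, 0.0, 0.0],
     [0.9, 0.0, 0.7, 0.5],
     [0.0, 0.7, 0.0, 0.4],
     [0.0, 0.5, 0.4, 0.0]]
assert deepest_ordinal_pattern(W, 0) == [0, 1, 2, 3]
```

The exhaustive DFS above explores every decreasing-weight branch; a greedy variant (always taking the heaviest admissible edge) would be faster but can miss the deepest pattern.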
5.2 Depth-first-based ordinal pattern attribute kernel
We use Algorithm 1 to calculate the DOP for each node in a graph $G$. Subsequently, we utilize these ordinal patterns to construct the depth-first-based ordinal pattern kernel $k_{DOP}$ between graphs $G_1$ and $G_2$ with attributes. Since the ordinal pattern kernel is a special case of the ordinal pattern attribute kernel, we only describe the depth-first-based ordinal pattern attribute kernel here.
$$k_{DOP}(G_1, G_2) = \sum_{u \in V_1} \sum_{v \in V_2} k(DOP_u, DOP_v) \tag{10}$$

$$k(DOP_u, DOP_v) = \sum_{\phi \in \Phi(DOP_u, DOP_v)} \prod_{v' \in V_{op}^u} K_V\big(v', \phi(v')\big) \prod_{e \in E_{op}^u} K_E\big(e, \phi(e)\big) \tag{11}$$
where $DOP_u$ and $DOP_v$ are the depth-first-based ordinal patterns, i.e., the deepest ordinal patterns, of nodes $u \in V_1$ and $v \in V_2$. $K_V$ and $K_E$ are the kernels defined on the node and edge attributes in $DOP_u$ and $DOP_v$.
$$k(DOP_u, DOP_v) = \text{iso-count}(DOP_u, DOP_v) \tag{12}$$
where each DOP is determined by its sequence $w_1, w_2, \dots, w_m$ of edge weights, $w_i \in W$. Obviously, formula (12) is a special case of formula (11). Here, $k(DOP_u, DOP_v)$ can also be calculated from the number of matched nodes between $DOP_u$ and $DOP_v$.
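One plausible reading of formulas (10)-(12) can be sketched as follows: the similarity of two DOPs is the number of matched leading nodes (a delta kernel on discrete node attributes), and the graph-level kernel sums this over all node pairs. The data layout and the prefix-matching rule are assumptions of this sketch.

```python
def dop_similarity(dop_u, dop_v):
    # Matched node number between two DOPs: count aligned positions from the
    # start that agree. DOPs here carry discrete node attributes, compared
    # with a delta kernel (an assumed instantiation of Eqs. (11)-(12)).
    matched = 0
    for a, b in zip(dop_u, dop_v):
        if a != b:
            break
        matched += 1
    return matched

def k_dop(dops1, dops2):
    # Eq. (10): sum DOP similarities over all node pairs of the two graphs.
    return sum(dop_similarity(p, q) for p in dops1 for q in dops2)

# Toy DOPs (sequences of hypothetical region labels) for two small graphs.
dops_g1 = [['hip', 'amy', 'tha'], ['amy', 'tha']]
dops_g2 = [['hip', 'amy', 'ins'], ['tha']]
assert k_dop(dops_g1, dops_g2) == 2  # only the 'hip','amy' prefix matches
```

Because every DOP is a linear structure, this comparison is a simple sequence match rather than a general (NP-hard) isomorphism test.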
Theorem 5. The depth-first-based ordinal pattern attribute kernel is p.s.d.
Proof. According to the definitions of the sub-ordinal pattern isomorphisms and the corresponding sub-ordinal pattern isomorphism kernel, we know that iso-count$(\cdot,\cdot)$ is p.s.d. $K_V$ and $K_E$ are p.s.d. kernels defined on node and edge attributes. $k_{DOP}$ is an R-ordinal pattern convolution kernel; hence the depth-first-based ordinal pattern attribute kernel is p.s.d.
Kernel selection. In real applications such as brain neuroimaging, each brain structure is abstracted into a network (or graph) in which each node and edge may carry multi-dimensional attributes. When these attributes come from Euclidean spaces, we can use the Gaussian RBF kernel or the linear kernel to compute the kernels $K_V$ and $K_E$ on node and edge attributes. If the attributes are discrete, we use the delta kernel [12].
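These standard choices can be sketched as follows (a minimal illustration, not the paper's implementation; the `gamma` parameter is an assumed hyperparameter of the RBF kernel):

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    # Gaussian RBF kernel for continuous (Euclidean) attribute vectors.
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.exp(-gamma * np.sum((x - y) ** 2)))

def delta_kernel(a, b):
    # Delta kernel for discrete attributes: 1 if equal, 0 otherwise.
    return 1.0 if a == b else 0.0

assert rbf_kernel([1.0, 2.0], [1.0, 2.0]) == 1.0
assert delta_kernel('hippocampus', 'amygdala') == 0.0
```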
6 Experiments
In this section, we perform classification, robustness and discriminative sub-structure mining experiments on the brain network data of brain disease patients and normal controls to verify the effectiveness of the ordinal pattern kernel. All experiments are performed on a server with an Intel Core CPU (6 cores, 12 threads) and 32GB RAM.
Table 1: Classification accuracies (%) of different graph kernels on the brain disease datasets.

| Graph kernel method | MCI vs. NC | AD vs. NC | EMCI vs. LMCI | EMCI vs. AD | LMCI vs. AD |
|---|---|---|---|---|---|
| SP | 67.79 | 76.19 | 56.57 | 76.67 | 77.92 |
| WL-ST | 67.79 | 75.00 | 73.74 | 73.33 | 71.43 |
| WL-SP | 75.83 | 77.38 | 74.75 | 77.78 | 72.73 |
| RW | 66.44 | 73.80 | 72.73 | 76.67 | 77.92 |
| PM | 79.87 | 72.62 | 75.76 | 74.44 | 75.32 |
| WWL | 73.83 | 67.86 | 76.77 | 71.11 | 80.52 |
| GH | 71.81 | 64.29 | 56.57 | 65.56 | 61.04 |
| Tree++ | 74.50 | 60.71 | 57.58 | 65.56 | 63.64 |
| DOP | 81.21 | 86.90 | 83.84 | 80.00 | 83.12 |
6.1 Datasets
The brain network data used in the experiments come from brain disease patients and normal controls, and are constructed from resting-state functional magnetic resonance imaging (fMRI) data from the online ADNI database (http://adni.loni.usc.edu/). The brain diseases are Alzheimer’s disease (AD), early mild cognitive impairment (EMCI) and late mild cognitive impairment (LMCI); MCI consists of EMCI and LMCI. Normal controls (NC) matched to the brain disease patients are used to classify brain diseases and to seek discriminative sub-structures in brain disease networks. The fMRI data are preprocessed with statistical parametric mapping (SPM, http://www.fil.ion.ucl.ac.uk/spm) and the resting-state fMRI analysis toolkit (REST, http://www.restfmri.net). The detailed fMRI preprocessing steps can be found in the supplement.
6.2 Brain network construction
After preprocessing the fMRI data of brain disease patients and normal controls, we transform the preprocessed fMRI data into brain networks. For each subject, the whole-brain cortical and subcortical structures are subdivided into 90 brain regions based on the AAL atlas. The linear correlation between the mean time series of each pair of brain regions is then calculated to measure their functional connectivity. Finally, a fully-connected weighted functional network is constructed for each subject. The details of brain network construction can be found in the supplement.
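The construction above can be sketched with NumPy; the function name and toy data are illustrative, and in the actual pipeline the input would be the 90 AAL region time series.

```python
import numpy as np

def build_brain_network(time_series):
    # Build a fully-connected weighted functional network from fMRI data.
    # time_series: array of shape (n_timepoints, n_regions), e.g. the mean
    # BOLD signals of the 90 AAL regions. Returns an (n_regions, n_regions)
    # matrix of pairwise linear (Pearson) correlations with zero diagonal.
    W = np.corrcoef(np.asarray(time_series).T)  # corrcoef wants rows=variables
    np.fill_diagonal(W, 0.0)                    # drop self-connections
    return W

# Toy example: 100 timepoints for 5 regions of synthetic signal.
rng = np.random.default_rng(0)
ts = rng.standard_normal((100, 5))
W = build_brain_network(ts)
assert W.shape == (5, 5) and np.allclose(W, W.T)
```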
6.3 Experimental setup
In the experiments, the uniform weight is chosen from a candidate set. We compare our kernel with state-of-the-art graph kernels including the shortest path kernel (SP) [9], Weisfeiler-Lehman subtree kernel (WL-ST) [17], Weisfeiler-Lehman shortest path kernel (WL-SP) [17], random walk kernel (RW) [10], pyramid match kernel (PM) [21], Wasserstein Weisfeiler-Lehman graph kernel (WWL) [18], GraphHopper kernel (GH) [23] and truncated tree based graph kernels (Tree++) [24]. In the robustness experiment, we randomly discard part of the data in each brain network at a 25% missing rate.
A Support Vector Machine (SVM) [25] is exploited as the final classifier in the classification experiments. We perform leave-one-out cross-validation for all classifications, using one sample for testing and the remaining samples for training. The tradeoff parameter C of the SVM is selected from a candidate set.
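This protocol can be sketched with scikit-learn's SVC on a precomputed Gram matrix (an assumed implementation; the paper uses LIBSVM [25], and the toy linear-kernel data below merely stands in for a graph-kernel Gram matrix):

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.svm import SVC

def loocv_accuracy(K, y, C=1.0):
    # Leave-one-out cross-validation with a precomputed kernel (Gram) matrix:
    # one subject is held out for testing, the rest train the SVM.
    K, y = np.asarray(K), np.asarray(y)
    correct = 0
    for train, test in LeaveOneOut().split(y):
        clf = SVC(C=C, kernel='precomputed')
        clf.fit(K[np.ix_(train, train)], y[train])
        correct += int(clf.predict(K[np.ix_(test, train)])[0] == y[test][0])
    return correct / len(y)

# Toy data: two well-separated classes; K = X X^T is a valid Gram matrix.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.5, (10, 2)), rng.normal(3, 0.5, (10, 2))])
y = np.array([0] * 10 + [1] * 10)
K = X @ X.T
assert loocv_accuracy(K, y) > 0.9
```

Using `kernel='precomputed'` lets any p.s.d. graph-kernel matrix plug directly into the SVM without explicit feature vectors.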
6.4 Results and discussion
Classification accuracy, robustness and discriminative sub-structures are all important in brain network analysis. We report the classification accuracies in Table 1, investigate the robustness of our depth-first-based ordinal pattern kernel under a fixed missing rate in Figure 2, and examine the discriminative ordinal patterns in Figure 3.
6.4.1 Classification results
We conduct classification experiments on the brain functional networks of brain disease patients (EMCI, LMCI and AD) and matched normal controls; MCI consists of EMCI and LMCI. We compare our ordinal pattern kernel with state-of-the-art graph kernels on these datasets (see Table 1). On all of them, our method outperforms the state-of-the-art graph kernels.
6.4.2 Robustness
Missing information commonly exists in brain fMRI data [26, 27], which brings challenges for brain network classification. Here, we randomly discard a specific percentage (25%) of the data in each brain network, as shown in Figure 2 (A)-(B). Then, we measure the similarities among these degraded brain networks with our ordinal pattern kernel. The experimental results indicate that our ordinal pattern kernel achieves robust and excellent classification accuracies on the missing brain network data, as shown in Figure 2 (C).
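A sketch of this degradation step, under the assumption that "discarding data" means symmetrically zeroing a random 25% of edge weights (function and parameter names are illustrative):

```python
import numpy as np

def discard_edges(W, missing_rate=0.25, seed=0):
    # Randomly zero out a fraction of edge weights (symmetrically) to
    # simulate missing data in a weighted brain network.
    W = np.array(W, float)
    rng = np.random.default_rng(seed)
    iu = np.triu_indices_from(W, k=1)             # upper-triangle edge slots
    n_edges = len(iu[0])
    drop = rng.choice(n_edges, int(missing_rate * n_edges), replace=False)
    W[iu[0][drop], iu[1][drop]] = 0.0
    W[iu[1][drop], iu[0][drop]] = 0.0             # keep the matrix symmetric
    return W

# A fully-connected 90-node network, as in the AAL-based construction.
W = np.ones((90, 90)) - np.eye(90)
W_missing = discard_edges(W, 0.25)
assert np.allclose(W_missing, W_missing.T)
```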

6.4.3 Discriminative sub-structures
We can also use our proposed depth-first-based ordinal pattern to seek discriminative sub-structures [28] in the brain networks of patients and normal controls. We plot the top six discriminative ordinal patterns identified in the classification tasks in Figure 3 (A)-(D). The start node of the depth-first-based ordinal pattern is the right hippocampus.
We find that the depth-first-based ordinal patterns in brain diseases differ from those in normal controls. The first two ordinal nodes in the depth-first-based ordinal patterns are the same across EMCI, LMCI, AD and NC. The first four ordinal nodes in EMCI are the same as those in LMCI (see Figure 3 (B) and (C)). From Figure 3 (B)-(D), we can also observe that the ordinal pattern structures change gradually from EMCI to AD. The top discriminative depth-first-based ordinal patterns starting from the left hippocampus can be found in the supplement.

7 Conclusion
In this paper, we propose the ordinal pattern kernel, which makes full use of ordinal edge weight information to measure the similarities between brain networks. We perform classification, robustness and discriminative sub-structure mining experiments on brain network data from brain disease patients and normal controls. The classification results indicate that our method outperforms existing state-of-the-art graph kernels in both accuracy and robustness. Our proposed depth-first-based ordinal patterns capture discriminative ordinal sub-structures in the brain networks of EMCI, LMCI and AD.
Broader Impact
The ordinal pattern kernel is a method for measuring the similarities between brain networks, which can be used to diagnose brain diseases. It can also be extended to other networks with weight information. This work uses public datasets from the Alzheimer’s Disease Neuroimaging Initiative (ADNI).
References
[1] Shuaizong Si, Xiao Liu, Jinfa Wang, et al. Brain networks modeling for studying the mechanism underlying the development of Alzheimer’s disease. Neural Regeneration Research, 14(10):1805, 2019.
[2] Amir Hossein Ghaderi, Mohammad Ali Nazari, Hassan Shahrokhi, et al. Functional Brain Connectivity Differences Between Different ADHD Presentations: Impaired Functional Segregation in ADHD-Combined Presentation but not in ADHD-Inattentive Presentation. Basic & Clinical Neuroscience,8(4):267-278, 2017.
[3] Corey Fee, Mounira Banasr, Etienne Sibille. Somatostatin-Positive Gamma-Aminobutyric Acid Interneuron Deficits in Depression: Cortical Microcircuit and Therapeutic Perspectives. Biological psychiatry, 82(8):549-559, 2017.
[4] Qingbao Yu, Jing Sui, Kent A. Kiehl, et al. State-related functional integration and functional segregation brain networks in schizophrenia. Schizophrenia Research, 150(2–3):450–458, 2013.
[5] Mikail Rubinov, Olaf Sporns. Complex network measures of brain connectivity: Uses and interpretations. Neuroimage 52(3):1059–1069, 2010.
[6] Kai Ma, Jintai Yu, Wei Shao, et al. Functional Overlaps Exist in Neurological and Psychiatric Disorders: A Proof from Brain Network Analysis. Neuroscience 425:39–48, 2020.
[7] Christopher Morris, Nils M. Kriege, Kristian Kersting, et al. Faster kernels for graphs with continuous attributes via hashing. In Proceedings of the 16th IEEE International Conference on Data Mining (ICDM), 1095–1100, 2016.
[8] Yu Tian, Long Zhao, Xi Peng, et al. Rethinking Kernel Methods for Node Representation Learning on Graphs. In Advances in Neural Information Processing Systems (NeurIPS), 11686-11697, 2019.
[9] Karsten M. Borgwardt, Hans-Peter Kriegel. Shortest-path kernels on graphs. In Fifth IEEE International Conference on Data Mining (ICDM), 74-81, 2005.
[10] S V N Vishwanathan, Nicol N. Schraudolph, Risi Kondor, et al. Graph kernels. Journal of Machine Learning Research (JMLR), 11(Apr):1201–1242, 2010.
[11] Thomas Gartner, Peter Flach, Stefan Wrobel. On graph kernels: Hardness results and efficient alternatives. In Learning Theory and Kernel Machines, 129–143, 2003.
[12] Zhen Zhang, Mianzhi Wang, Yijian Xiang, et al. RetGK: Graph kernels based on Return Probabilities of Random Walks. In Advances in Neural Information Processing Systems (NeurIPS), 3964-3974, 2018.
[13] Tamas Horvath, Thomas Gartner, Stefan Wrobel. Cyclic Pattern Kernels for Predictive Graph Mining. Advances in Knowledge Discovery & Data Mining (KDD), 158-167, 2004.
[14] Tamas Horvath. Cyclic Pattern Kernels Revisited. Advances in Knowledge Discovery & Data Mining (KDD), 791-801, 2005.
[15] Pierre Mahe, Jeanphilippe Vert. Graph kernels based on tree patterns for molecules. Machine Learning, 75(1):3-35, 2009.
[16] Giovanni Da San Martino, Nicolò Navarin, Alessandro Sperduti. Tree-based kernel for graphs with continuous attributes. IEEE transactions on neural networks and learning systems (TNNLS), 29(7):3270-3276, 2018.
[17] Nino Shervashidze, Pascal Schweitzer, Erik Jan van Leeuwen, et al. Weisfeiler-lehman graph kernels. Journal of Machine Learning Research (JMLR),12(Sep):2539–2561, 2011.
[18] Matteo Togninalli, Elisabetta Ghisu, Felipe Llinares Lopez, et al. Wasserstein Weisfeiler-lehman graph kernels. In Advances in Neural Information Processing Systems (NeurIPS), 6439-6449, 2019.
[19] Nils M. Kriege, Petra Mutzel. Subgraph matching kernels for attributed graphs. International Conference on Machine Learning (ICML), 291-298, 2012.
[20] Kristen Grauman, Trevor Darrell. The Pyramid Match Kernel: Efficient Learning with Sets of Features. Journal of Machine Learning Research (JMLR), 725-760, 2007.
[21] Giannis Nikolentzos, Polykarpos Meladianos, Michalis Vazirgiannis. Matching Node Embeddings for Graph Similarity. Association for the Advance of Artificial Intelligence (AAAI), 2429-2435, 2017.
[22] Daoqiang Zhang, Jiashuang Huang, Biao Jie, et al. Ordinal Pattern: A New Descriptor for Brain Connectivity Networks. IEEE Transactions on Medical Imaging, 37(7):1711-1722, 2018.
[23] Aasa Feragen, Niklas Kasenburg, Jens Petersen, et al. Scalable kernels for graphs with continuous attributes. In Advances in Neural Information Processing Systems (NeurIPS), 216-224, 2013.
[24] Wei Ye, Zhen Wang, Rachel Redberg, et al. Tree++: Truncated Tree Based Graph Kernels. IEEE Transactions on Knowledge and Data Engineering (TKDE), 2019.
[25] Chih-Chung Chang, Chih-Jen Lin. LIBSVM: A library for support vector machines. ACM Transactions on Intelligent Systems and Technology, 2011.
[26] Hien M Nguyen, Gary H Glover. A Modified Generalized Series Approach: Application to Sparsely Sampled fMRI. IEEE Transactions on Biomedical Engineering, 60(10): 2867-2877, 2013.
[27] Jutta de Jong, Serge Dumoulin, Barrie Klein, et al. Hand position modulates visually-driven fMRI responses in premotor cortex. Society for Neuroscience. 2016.
[28] Fei Fei, Biao Jie, Daoqiang Zhang. Frequent and Discriminative Subnetwork Mining for Mild Cognitive Impairment Classification. Brain Connectivity, 4(5):347-360, 2014.