
Hybrid Method Based on NARX models and Machine Learning for Pattern Recognition

Pedro H. O. Silva, Augusto S. Cerqueira, Erivelton G. Nepomuceno

Department of Electrical Engineering, Federal University of Juiz de Fora (UFJF), Juiz de Fora, MG, Brazil (e-mail: [email protected], [email protected]).
Department of Electrical Engineering, Federal University of São João del-Rei (UFSJ), São João del-Rei, MG, Brazil (e-mail: [email protected]).
Abstract

This work presents a novel technique that integrates machine learning and system identification methodologies to solve multiclass problems. The approach makes it possible to extract and select sets of representative features with reduced dimensionality, as well as to predict categorical outputs. The efficiency of the method is assessed on case studies commonly investigated in machine learning, achieving better absolute results than classical classification algorithms.


Keywords: machine learning; system identification; NARX model; feature extraction; dimensionality reduction.
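For reference, a standard polynomial NARX formulation from the system identification literature (the orders $n_y$ and $n_u$ and the nonlinearity degree $\ell$ are the same quantities passed as inputs to the algorithm below) can be written as

\[
y(k) = F^{\ell}\!\left[\,y(k-1), \dots, y(k-n_y),\; u(k-1), \dots, u(k-n_u)\,\right] + e(k),
\]

where $F^{\ell}[\cdot]$ is a polynomial function of degree $\ell$ in the lagged outputs and inputs, and $e(k)$ accounts for noise and unmodelled dynamics. The candidate regressors $\phi_i$ in the set $\mathcal{M}$ are built from combinations of these lagged terms.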
1:  Input: $\{y(k), k=1,\dots,N\}$, $\mathcal{M}=\{\phi_{i}, i=1,\dots,m\}$, $\ell$, $n_{y}$, $n_{u}$, $k$
2:  Output: $\boldsymbol{\alpha}=\{\alpha_{i}, i=1,\dots,k\}$, $\boldsymbol{\theta}=\{\theta_{i}, i=1,\dots,k\}$
3:  for $i = 1:m$ do
4:      $w_{i} \leftarrow \dfrac{\phi_{i}}{\left\|\phi_{i}\right\|_{2}}$
5:      $r_{i} \leftarrow$ logistic regression accuracy between $w_{i}$ and $y$
6:  end for
7:  $j \leftarrow \arg\max_{1 \leq i \leq m}\{r(w_{i}, y)\}$
8:  $q_{1} \leftarrow w_{j}$
9:  $\alpha_{1} \leftarrow \phi_{j}$
10: Train logistic model with $\alpha_{1}$ and $y$
11: Compute cross-validation
12: Remove $\phi_{j}$ from $\mathcal{M}$
13: for $s = 2:k$ do
14:     for $i = 1:m$ do
15:         $w_{i}^{(s)} \leftarrow$ orthogonalize $\phi_{i}$ with respect to $[q_{1},\dots,q_{(s-1)}]$
16:         if $w_{i}^{\mathsf{T}} w_{i} < 10^{-10}$ then
17:             Remove $\phi_{i}$ from $\mathcal{M}$
18:             Next iteration
19:         end if
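A minimal Python sketch of this greedy selection loop (not the authors' code): it assumes the candidate regressors are the columns of a matrix Phi built from the NARX candidate set and that y holds the class labels, and it uses scikit-learn's LogisticRegression and cross_val_score in place of the logistic model and cross-validation steps. The name select_regressors and the n_terms and tol arguments are illustrative.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score


def select_regressors(Phi, y, n_terms, tol=1e-10):
    """Greedily pick n_terms columns of Phi, scoring each candidate by the
    accuracy of a logistic regression fitted on the candidate after it has
    been orthogonalized against the regressors already selected."""
    remaining = list(range(Phi.shape[1]))
    selected, Q = [], []            # chosen column indices, orthonormal basis
    cv_acc = None

    for _ in range(n_terms):
        best_idx, best_w, best_acc = None, None, -np.inf
        to_drop = []
        for i in remaining:
            w = Phi[:, i].astype(float).copy()
            for q in Q:             # Gram-Schmidt against selected regressors
                w -= (q @ w) * q
            if w @ w < tol:         # candidate is numerically redundant
                to_drop.append(i)   # mirrors "remove phi_i from M"
                continue
            w /= np.linalg.norm(w)
            clf = LogisticRegression(max_iter=1000).fit(w.reshape(-1, 1), y)
            acc = clf.score(w.reshape(-1, 1), y)
            if acc > best_acc:
                best_idx, best_w, best_acc = i, w, acc
        for i in to_drop:
            remaining.remove(i)
        if best_idx is None:        # nothing usable left to select
            break
        selected.append(best_idx)
        Q.append(best_w)
        remaining.remove(best_idx)
        # cross-validated accuracy of the model on the regressors chosen so far
        cv_acc = cross_val_score(LogisticRegression(max_iter=1000),
                                 Phi[:, selected], y, cv=5).mean()
    return selected, cv_acc

Called as selected, cv_acc = select_regressors(Phi, y, n_terms=5), the sketch returns the indices of the chosen regressors and the cross-validated accuracy of the logistic model built on them, which corresponds to the selection and validation steps of the pseudocode above.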