diff --git a/text/thesis/02MaterialsAndMethods.tex b/text/thesis/02MaterialsAndMethods.tex
index 543381d..33b2c78 100644
--- a/text/thesis/02MaterialsAndMethods.tex
+++ b/text/thesis/02MaterialsAndMethods.tex
@@ -32,7 +32,7 @@
 To gain these from the continuous signal there are different methods. The intuitive approach would be to use Fourier transformation however the Fourier transform does not need to exists for a continuous signal. So we used power spectral density (PSD) estimation.
 \subsubsection{Power spectral density estimation}
 The PSD is the power per frequency. Power here refers to the square of the amplitude. %TODO: formulation, additional explanation?, fft
- If the Fourier transform is existing, PSD can be calculated from it e.g. as periodogram. If not it has to be estimated. One way to do so is parametrised with an Autoregressive model. Here one assumes that the there is a correlation between $p$ consecutive samples and the one following of the spectral density. This leads to an equation with only $p$ parameters which can be estimated in different ways. We used Burg's method (\texttt{pburg} from MATLAB library).
+ If the Fourier transform exists, the PSD can be calculated from it, e.g. as a periodogram. If not, it has to be estimated. One way to do so is a parametric estimate based on an autoregressive (AR) model of order $p$: one assumes that each sample of the signal is approximately a linear combination of the $p$ preceding samples. This leads to an equation with only $p$ parameters, which can be estimated in different ways. We used Burg's method (\texttt{pburg} from the \matlab{} library).
 \subsubsection{Burg's method}
 \label{mat:burg}
 Burg's method (\cite{Burg75}) is a special case of parametric PSD estimation. It interprets the Yule-Walker-Equations as least squares problem and iteratively estimates solutions.\\
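+ As an illustration, algorithm \ref{alg:burgSketch} sketches the recursion in a simplified form; it is only a sketch of the standard formulation (cf. \cite{Burg75}), not a description of the exact \texttt{pburg} implementation, and the symbols $f$, $b$, $k_m$ and $E$ are introduced for this sketch only.\\
+ \begin{algorithm}
+ \begin{algorithmic}
+ \State $f\gets(x_1,\dots,x_N)$, $b\gets(x_1,\dots,x_N)$ \Comment{forward and backward prediction errors}
+ \State $a\gets$ empty coefficient list, $E\gets\frac{1}{N}\sum_{n=1}^{N}x_n^2$
+ \For{$m=1,\dots,p$}
+ \State drop the first element of $f$ and the last element of $b$
+ \State $k_m\gets\frac{-2\sum_n f_n b_n}{\sum_n f_n^2+\sum_n b_n^2}$ \Comment{reflection coefficient}
+ \State $(f,b)\gets(f+k_m b,\;b+k_m f)$ \Comment{update both error sequences simultaneously}
+ \State $a\gets(a_1+k_m a_{m-1},\,\dots,\,a_{m-1}+k_m a_1,\,k_m)$ \Comment{Levinson-type update, $a$ now has $m$ entries}
+ \State $E\gets(1-k_m^2)E$ \Comment{remaining prediction error power}
+ \EndFor
+ \State \Return $a$, $E$
+ \end{algorithmic}
+ \caption{Sketch of Burg's recursion for an AR($p$) model of a signal $x_1,\dots,x_N$}
+ \label{alg:burgSketch}
+ \end{algorithm}
+ The estimated coefficients then yield the PSD estimate, up to normalisation, as $E/\vert 1+\sum_{k=1}^{p}a_k e^{-\mathrm{i}\omega k}\vert^2$ at angular frequency $\omega$.\\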
@@ -62,14 +62,40 @@
 \label{mat:pca}
 \subsection{NMF}
 \label{mat:nmf}
+ In some applications Non-negative Matrix Factorisation (NMF) is preferred over PCA (cf. \cite{Lee99}). This is because it does not learn eigenvectors but decomposes the input into parts that can all be present in the input. Seen as a matrix factorisation, PCA yields matrices of arbitrary sign, where one represents the eigenvectors and the other the specific mixture of them. Because an entry may be negative, cancellation is possible. This leads to an unintuitive representation in the first matrix.\\
+ NMF, in contrast, only allows non-negative entries. This leads to \qq{what is in, is in}: there is no cancellation, which in turn yields more intuitive matrices. The first contains possible parts of the data, the second how strongly they are represented in the current input.\\
+ The formula for NMF is
+ $$\mathbf{A}\approx \mathbf{WH}$$
+ where the input $\mathbf{A}$ is $n\times m$, $\mathbf{W}$ is $n\times r$ and $\mathbf{H}$ is $r\times m$ with $r\ll\min\{m,n\}$. So $\mathbf{WH}$ is only an approximation of the input, but one of much lower rank $r$, the number of synergies used.\\
+ The factorisation is learnt with an update rule that may be chosen. We used the \matlab{} default, an alternating least squares (ALS) algorithm. It can be described as in algorithm \ref{alg:als} (cf. \cite{Berry07}).\\
+ \begin{algorithm}
+ \begin{algorithmic}
+ \State initialize $\mathbf{W}$ randomly
+ \State $i\gets0$
+ \While{$i <$ numberOfIterations}
+ \State Solve $\mathbf{W^TWH=W^TA}$ for $\mathbf{H}$
+ \State set all negative values in $\mathbf{H}$ to zero
+ \State Solve $\mathbf{HH^TW^T=HA^T}$ for $\mathbf{W}$
+ \State set all negative values in $\mathbf{W}$ to zero
+ \State $i\gets i+1$
+ \EndWhile
+ \State \Return $\mathbf{W}$
+ \end{algorithmic}
+ \caption{Alternating Least Squares in NMF}
+ \label{alg:als}
+ \end{algorithm}
+ This version uses some simplifications (such as setting negative values to zero to enforce non-negativity); a slightly improved form is used in \matlab{}.\\
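+ Written out, and assuming that $\mathbf{W^TW}$ and $\mathbf{HH^T}$ are invertible, one iteration of this simplified scheme amounts to
+ $$\mathbf{H}\gets\max\left(0,(\mathbf{W}^T\mathbf{W})^{-1}\mathbf{W}^T\mathbf{A}\right),\qquad\mathbf{W}\gets\max\left(0,\mathbf{A}\mathbf{H}^T(\mathbf{H}\mathbf{H}^T)^{-1}\right)$$
+ where the maximum is taken element-wise; this is only meant to make the least squares steps of algorithm \ref{alg:als} explicit, not to describe the exact update \matlab{} performs.\\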
Plemmons", + title = "Algorithms and Applications for Approximate Nonnegative Matrix Factorization", + journal = "Computational Statistics and Data Analysis", + year = "2007", + volume = "51", + pages = "155-173" +} @article{Ting07, diff --git a/text/thesis/thesis.tex b/text/thesis/thesis.tex index 00cc5f8..467cdfe 100644 --- a/text/thesis/thesis.tex +++ b/text/thesis/thesis.tex @@ -17,6 +17,8 @@ \usepackage{helvet} \usepackage{pdfpages} \usepackage[official]{eurosym} +\usepackage[chapter]{algorithm} +\usepackage{algpseudocode} %\renewcommand{\familydefault}{\sfdefault} \newcommand{\qq}[1]{``#1''} @@ -41,6 +43,9 @@ \renewcommand{\author}{Jan-Peter Hohloch} \date{\today} + +\newcommand{\matlab}{\textsc{Matlab}} + \begin{document} %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% @@ -138,11 +143,26 @@ %%% List of tables %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% +% \renewcommand{\baselinestretch}{1.3} +% \small\normalsize + +% \addcontentsline{toc}{chapter}{List of Tables} +% \listoftables + +% \renewcommand{\baselinestretch}{1} +% \small\normalsize + +% \cleardoublepage + +%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% +%%% List of algorithms +%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% + \renewcommand{\baselinestretch}{1.3} \small\normalsize -\addcontentsline{toc}{chapter}{List of Tables} -\listoftables +\addcontentsline{toc}{chapter}{List of Algorithms} +\listofalgorithms \renewcommand{\baselinestretch}{1} \small\normalsize @@ -166,6 +186,7 @@ \textbf{NMF}\> non-Negative Matrix Factorisation \\ \textbf{ANN}\> Artificial Neural Network \\ \textbf{PSD}\> Power Spectral Density \\ +\textbf{ALS}\> Alternating Least Squares \\ \end{tabbing} \cleardoublepage @@ -187,15 +208,15 @@ \cleardoublepage %% Results -% \input{sec3} +% \input{03Results} % \cleardoublepage %% Discussion -% \input{sec4} +% \input{04Discussion} % \cleardoublepage %% future work -% \input{sec5} +% \input{05Future} % \cleardoublepage