Thought Control System

The purpose of this example is to demonstrate how to design a classifier system using ANNHub to classify human brain waveforms. This involves designing a feature extraction system to obtain features from human brain waveforms, also known as Electroencephalography (EEG) signals. These features are then used as inputs to a Neural Network, designed in ANNHub, to classify EEG mental tasks.

Tuan Nghia Nguyen · 3/12/2019 4:10:34 AM


The EEG dataset was recorded by Zak Keirn at Purdue University for his Master of Science thesis. The dataset is in binary Matlab format. In this work, EEG data were recorded from 7 subjects using 7 channels, located at C3, C4, P3, P4, O1, O2, and EOG, as defined by the 10-20 system. Subjects were required to perform 5 mental tasks: a baseline, mental multiplication, mental figure rotation, mental letter composing, and visual counting, over 10 trials. In each trial, EEG signals were recorded at 250 Hz for 10 seconds (2500 samples). Figure 1 shows the EEG signals for the 5 mental tasks in 1 trial (10 s).



Figure 1: The 10-second EEG signals of five mental tasks


The purpose of this example is to design a mental task classifier that can interpret human brainwave signals (shown in Figure 1) as mental tasks. In other words, the classifier will be able to recognize when the user performs the baseline task or the multiplication task, as shown in Figure 2, based on the user's EEG signals.





Figure 2: System diagram of the mental task classifier

As shown in Figure 2, to recognize the user's thought (EEG signal), the raw signals are first processed to extract relevant information before being fed to the classifier. This procedure of processing and extracting important information from raw signals is called feature extraction. In this example, the autoregression technique is used to extract features from the EEG signals; Independent Component Analysis (ICA) is then applied to separate blind sources from the autoregression (AR) coefficients, yielding significant features and reducing the dataset dimension. These significant features are then used as the input of the neural network classifier to determine which mental task the EEG signals belong to.

1. Data preparation and feature extraction

The process of using ICA to extract features from AR coefficients is shown in Figure 3.



Figure 3: Feature extraction for five mental tasks using AR and ICA combination.


For a given mental task, the raw EEG data is first segmented into 0.5-second time-domain segments. The autoregression technique is then applied to fit each segment to an AR model (order 6) using the Burg method. Each electrode channel produces the 6 AR coefficients of its AR model. Combining the 6 electrode channels, a 36-element AR coefficient vector forms one data sample. ICA is then used to extract significant features from this AR vector, forming a 6-element ICA vector that serves as an input for the Neural Network classifier.
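The per-segment AR step described above can be sketched in a few lines. The following is an illustrative Python sketch (the example itself uses Matlab), with a hand-rolled Burg recursion and random noise standing in for a real 6-channel EEG segment; the subsequent ICA reduction to 6 elements is omitted here.

```python
import numpy as np

def burg_ar(x, order):
    """AR coefficients of a 1-D signal via Burg's method, using the
    convention x[n] + a1*x[n-1] + ... + ap*x[n-p] = e[n]."""
    f = np.asarray(x, dtype=float).copy()   # forward prediction errors
    b = f.copy()                            # backward prediction errors
    a = np.array([1.0])                     # polynomial [1, a1, ..., ak]
    n = len(f)
    for k in range(order):
        fk, bk = f[k+1:], b[k:n-1]
        # reflection coefficient minimizing forward + backward error power
        mu = -2.0 * np.dot(fk, bk) / (np.dot(fk, fk) + np.dot(bk, bk))
        a = np.concatenate([a, [0.0]])
        a = a + mu * a[::-1]                # Levinson order update
        f[k+1:], b[k:n-1] = fk + mu * bk, bk + mu * fk
    return a[1:]

# One synthetic 0.5 s "segment": 6 channels x 125 samples of noise.
rng = np.random.default_rng(0)
segment = rng.standard_normal((6, 125))

# Order-6 Burg fit per channel, concatenated into one 36-element data
# sample, mirroring the 6 channels x 6 coefficients described above.
features = np.concatenate([burg_ar(ch, 6) for ch in segment])
print(features.shape)   # (36,)
```

On real data the ICA stage would then map each 36-element sample down to a 6-element feature vector before it reaches the classifier.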

The Matlab code for feature extraction is shown below.

%% 1. Get EEG mental task data
clear all;
load eegdata;
Fs = 125;           % segment length: 0.5 second window = 125 samples at 250 Hz
baseline_task         = [];
multiplication_task   = [];
letter_composing_task = [];
rotation_task         = [];
counting_task         = [];

%% 2. Extract EEG data from subject 1
for i = 1:length(data)
    subjectID = data{i}{1};
    if strcmp(subjectID,'subject 1')
        taskname = data{i}{2};
        eeg = data{i}{4};   % trial data (assumed channels x samples matrix)
        switch taskname
            case 'baseline'
                baseline_task = [baseline_task; eeg];
            case 'multiplication'
                multiplication_task = [multiplication_task; eeg];
            case 'letter-composing'
                letter_composing_task = [letter_composing_task; eeg];
            case 'rotation'
                rotation_task = [rotation_task; eeg];
            case 'counting'
                counting_task = [counting_task; eeg];
        end
    end
end
%% 3. Calculate Autoregressive model coefficients using the Burg method with order 6
[N,M] = size(multiplication_task);
L = round(M/Fs);
for i = 1:N
    temp = reshape(baseline_task(i,:),Fs,L)';
    baselineTask{i} = ARCoefs(temp,Fs);
    temp = reshape(multiplication_task(i,:),Fs,L)';
    multiplicationTask{i} = ARCoefs(temp,Fs);
    temp = reshape(letter_composing_task(i,:),Fs,L)';
    lettercomposingTask{i} = ARCoefs(temp,Fs);
    temp = reshape(rotation_task(i,:),Fs,L)';
    rotationTask{i} = ARCoefs(temp,Fs);
    temp = reshape(counting_task(i,:),Fs,L)';
    countingTask{i} = ARCoefs(temp,Fs);
end

%% 4. Combine coefficients from 6 EEG channels into a single source

baselineBands           = CombineChannels(baselineTask);
multiplicationBands     = CombineChannels(multiplicationTask);
lettercomposingBands    = CombineChannels(lettercomposingTask);
rotationBands           = CombineChannels(rotationTask);
countingBands           = CombineChannels(countingTask);

%% Apply ICA

icabaselineBands            = ApplyICA(baselineBands);
icamultiplicationBands      = ApplyICA(multiplicationBands);
icalettercomposingBands     = ApplyICA(lettercomposingBands);
icarotationBands            = ApplyICA(rotationBands);
icacountingBands            = ApplyICA(countingBands);

%% 5. Construct the mental task dataset for baseline and multiplication tasks
% Map targets to mental tasks
baselineTargets         = [1 0 0 0 0];
multiplicationTargets   = [0 1 0 0 0];
lettercomposingTargets  = [0 0 1 0 0];
rotationTargets         = [0 0 0 1 0];
countingTargets         = [0 0 0 0 1];

% Adding targets
baselineData        = AddTargets(icabaselineBands,baselineTargets);
multiplicationData  = AddTargets(icamultiplicationBands,multiplicationTargets);
lettercomposingData = AddTargets(icalettercomposingBands,lettercomposingTargets);
rotationData        = AddTargets(icarotationBands,rotationTargets);
countingData        = AddTargets(icacountingBands,countingTargets);

% Forming dataset
MentalTaskData =[baselineData;multiplicationData;lettercomposingData;rotationData;countingData];

%% 6. Separate the dataset into a training set and a testing set, and save them to files
[~,inputLength] =size(icabaselineBands);
newdataset = shuffleRow(MentalTaskData);
[N, ~] = size(newdataset);
M = floor(0.7*N);
trainingset = newdataset(1:M,:);
testset = newdataset(M+1:end,:);
input = trainingset(:,1:inputLength);
target = trainingset(:,inputLength+1:end);
edata =[input target];
[N,ip] = size(input);
[N,op] = size(target);
textHeader= getTextHeader(ip,op);

% Write the training set to file (header, then data)
fid = fopen('MentalTaskTrainingICAAR.csv','w');
fprintf(fid,'%s\n',textHeader);
fclose(fid);
dlmwrite('MentalTaskTrainingICAAR.csv',edata,'-append');

input = testset(:,1:inputLength);
target = testset(:,inputLength+1:end);
edata = [input target];
[N,ip] = size(input);
[N,op] = size(target);
textHeader = getTextHeader(ip,op);

% Write the test set to file (header, then data)
fid = fopen('MentalTaskTestingICAAR.csv','w');
fprintf(fid,'%s\n',textHeader);
fclose(fid);
dlmwrite('MentalTaskTestingICAAR.csv',edata,'-append');

%% Utility functions used in the main script
% (Other helpers such as ARCoefs, CombineChannels, AddTargets, shuffleRow
%  and getTextHeader are omitted here.)
function icaData = ApplyICA(dataTask)
    Mdl = rica(dataTask,6,'IterationLimit',100);   % reconstruction ICA, 6 features
    icaData = transform(Mdl,dataTask);
end

After the feature extraction process, two datasets are generated: one for training (MentalTaskTrainingICAAR.csv) and one for testing (MentalTaskTestingICAAR.csv).

2. Design Neural Network Classifier.

After the feature extraction procedure, the obtained EEG features form a five-mental-task training dataset (MentalTaskTrainingICAAR.csv) and a test dataset (MentalTaskTestingICAAR.csv). The training dataset is further divided into three subsets according to the training data ratio defined in Step 2: the training set is used for the training procedure, the validation set is used to prevent over-fitting, and the test set is used to evaluate the trained Neural Network. The training dataset (MentalTaskTrainingICAAR.csv) is only used in the design and evaluation stage (Step 1 to Step 4). After the trained Neural Network is verified via the evaluation procedure, the test dataset (MentalTaskTestingICAAR.csv) is loaded to check how the trained Neural Network responds to a completely new dataset. Note the distinction: the test dataset refers to the completely new dataset (MentalTaskTestingICAAR.csv), while the test set refers to the fraction of the training dataset, equal to 0.5*(100 - training data ratio) percent as defined in Step 2, used in the design procedure in Steps 1 to 4.
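The ratio rule just described can be made concrete with a small sketch. `split_sizes` below is a hypothetical helper (not part of ANNHub), written in Python for illustration, that splits the non-training remainder evenly between the validation and test sets.

```python
def split_sizes(n_samples, training_ratio_pct):
    """Subset sizes for the Step 2 split described above: the samples
    left after the training set are shared equally by the validation and
    test sets (each gets 0.5 * (100 - training ratio) percent)."""
    n_train = round(n_samples * training_ratio_pct / 100)
    remainder = n_samples - n_train
    n_validation = remainder // 2
    n_test = remainder - n_validation
    return n_train, n_validation, n_test

# A 700-sample training dataset with a 75% training data ratio:
print(split_sizes(700, 75))   # (525, 87, 88)
```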


Step 1: Load training data into ANNHUB



Figure 4: Load five mental task training dataset into ANNHUB

Step 2: Configure Neural Network Classifier in ANNHUB



Figure 5: Configure five mental task classifier in ANNHub


In this example, a Bayesian Neural Network is configured to classify the five mental tasks. The dataset contains only 1000 data samples, divided into a training dataset (MentalTaskTrainingICAAR.csv) of 700 samples and a test dataset (MentalTaskTestingICAAR.csv) of 300 samples. The training dataset is further divided into a training set (75% * 700 = 525 samples) for training, with the remaining 175 samples shared by the validation and test sets. The training dataset is therefore small, and it may not be sufficient for the training procedure. The Bayesian Neural Network, however, has excellent generalization properties and does not require a validation set to avoid over-fitting, making it the best candidate when the dataset is small. As shown in Figure 5, the Bayesian Neural Network is configured so that 525 samples of the training dataset are used for training and 175 samples are used for testing (the test set).

Get the optimal structure for the Neural Network


Figure 6: Determine the optimal number of hidden nodes for five mental task classifier in ANNHUB


When designing a Neural Network classifier, it is important to know which structure leads to the best classification result. Usually, a design relies on a trial-and-error method to determine the number of hidden nodes, and this method cannot guarantee the best classification result. Thanks to the evidence framework, based on probability theory, the optimal number of hidden nodes can be obtained by scanning through possible neural network structures and comparing their log evidence. Figure 6 shows that a five-mental-task classifier with 17 hidden nodes is the best structure for this application.

Step 3: Train Neural Network Classifier in ANNHUB


Figure 7: Train five mental task classifier in ANNHUB


Since the Bayesian Regularization training algorithm is used, the early stopping technique is not required to avoid over-fitting. Figure 7 shows that after two training attempts, the best performance index is achieved with 50 training epochs.

Step 4: Evaluate Neural Network Classifier in ANNHUB


Figure 8: Evaluate five mental task classifier in ANNHUB using confusion matrix technique



Figure 9: Evaluate five mental task classifier in ANNHUB using ROC curve technique


The confusion matrix and ROC curves are used to evaluate the trained Neural Network, as shown in Figures 8 and 9; the classification accuracy exceeds 90% for both the training and test sets.
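The accuracy that the confusion matrix in Figure 8 reports can be reproduced by hand. The following is an illustrative Python sketch (ANNHub computes this internally; the labels below are made up) that tallies a 5-class confusion matrix and derives the overall accuracy from its diagonal.

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    """Rows: actual class, columns: predicted class."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

# Made-up labels for the five mental tasks (0 = baseline ... 4 = counting).
y_true = [0, 0, 1, 1, 2, 2, 3, 3, 4, 4]
y_pred = [0, 0, 1, 2, 2, 2, 3, 3, 4, 1]

cm = confusion_matrix(y_true, y_pred, 5)
accuracy = np.trace(cm) / cm.sum()   # correct predictions lie on the diagonal
print(accuracy)   # 0.8
```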

3. Test Neural Network classifier with a new dataset.

The new dataset (MentalTaskTestingICAAR.csv) is used to test the trained Neural Network classifier. Figure 10 shows that the five-mental-task classifier responds to this new dataset with an accuracy of around 85%.



Figure 10: Test of the trained classifier in ANNHUB with a new dataset.




Figure 11: ROC curves of the trained classifier in ANNHUB with a new dataset.


In this example, a new feature extraction method based on the combination of Autoregression and Independent Component Analysis is presented to reveal key features of raw EEG signals, helping a Neural Network classifier achieve a better classification accuracy. Although this technique is applied to EEG signals here, the principle can be used in many other applications.



This example also provides design tips for dealing with classification problems that have few data samples. The optimal structure of the classifier can be obtained thanks to the evidence framework.

4. Thought Control Wheelchair application

This mental task classifier can be used as the core classifier for thought-controlled wheelchair technology.


