Tuesday, April 2, 2013

Day 24: Numb-er from Numbers

I just finished compiling another set of data, resulting in this pretty graph! In addition, I've been working on my final product, which will take the form of either a lab report or a research paper. I plan for it to cover many of the topics I've covered in this blog (including TMS), but mainly, it will discuss the conclusions of this project.

So for this post, let me briefly explain what I do with the data. (Don't forget: you can click on images that seem illegible.)


Step 1: Run Spike code
During testing, we collect data using the software Spike. Spike also lets us write scripts that do something with that data; in this case, we told Spike to export every value it recorded to an Excel sheet. But what those values tell us is still unclear at this point, so to make sense of them, we run them through a Matlab script.
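If you're curious, here's a rough sketch of what loading that exported sheet could look like. This isn't our actual pipeline: I'm using Python instead of Matlab to keep the examples short, and the filename and columns are made up.

    import pandas as pd

    # Hypothetical filename; imagine Spike exported one sheet per block.
    raw = pd.read_excel("block1.xlsx")
    print(raw.head())  # peek at the raw trigger codes and MEP values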
Step 2: Run Matlab code
The Matlab code figures out which condition we cued the subject with (Column C: condition) and when we stimulated the subject with TMS (Column D: timings). The first two columns are what Matlab uses to make sense of the data and create the "condition" and "timings" columns, but after that they become irrelevant to me when finding the averages.
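To give a flavor of what that decoding step does (the real version is our Matlab code; this is just a Python sketch, and every code, label, and column name here is invented for illustration):

    import pandas as pd

    # Hypothetical mappings from raw trigger codes to readable values.
    CONDITION_CODES = {1: "left", 2: "right"}     # made-up cue codes
    TMS_TIME_CODES = {10: 50, 20: 100, 30: 150}   # made-up TMS times (ms)

    def decode(raw: pd.DataFrame) -> pd.DataFrame:
        # Build the "condition" and "timings" columns from the raw codes
        # in the first two columns, which we won't need afterwards.
        out = raw.copy()
        out["condition"] = out["cue_code"].map(CONDITION_CODES)
        out["timings"] = out["tms_code"].map(TMS_TIME_CODES)
        return out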
Step 3: Find averages
This image shows one block for one subject. Here, I find the average MEP value for each condition at each TMS time. Then I divide each of those averages by the Baseline (in the image below).
Dividing the values by the Baseline gives the normalized MEP values. The Baseline value is the threshold MEP value needed for the action, so we express all of our data relative to it. As a result, I produce a table like the one above for each block of each subject. Then I average the normalized MEP values across each subject's blocks to end up with this monstrosity (below)!
The average normalized MEP values of each subject.
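In code, the averaging and normalization might look something like this rough sketch (again Python rather than our Matlab, and the column names are hypothetical):

    import pandas as pd

    def normalized_block_means(trials: pd.DataFrame, baseline: float) -> pd.DataFrame:
        # Mean MEP for each condition at each TMS time within one block,
        # divided by that block's Baseline MEP to normalize it.
        means = trials.groupby(["condition", "timings"])["mep"].mean()
        return (means / baseline).reset_index(name="norm_mep")

    def subject_means(block_tables: list[pd.DataFrame]) -> pd.DataFrame:
        # Average the normalized means across all of one subject's blocks.
        stacked = pd.concat(block_tables)
        return (stacked.groupby(["condition", "timings"])["norm_mep"]
                       .mean()
                       .reset_index())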
Step 4: Make it look pretty
I take the average across all of the subjects' values and end up with this: a beautiful, simplified set of data and graphs.
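A minimal sketch of that last step (Python with matplotlib, reusing the hypothetical names from above) would average the per-subject tables and draw one line per condition:

    import pandas as pd
    import matplotlib.pyplot as plt

    def plot_grand_average(subject_tables: list[pd.DataFrame]) -> None:
        # Average the normalized MEPs across subjects...
        grand = (pd.concat(subject_tables)
                   .groupby(["condition", "timings"])["norm_mep"]
                   .mean()
                   .reset_index())
        # ...and draw one line per condition across the TMS times.
        for cond, grp in grand.groupby("condition"):
            plt.plot(grp["timings"], grp["norm_mep"], marker="o", label=str(cond))
        plt.axhline(1.0, linestyle="--")  # the normalized Baseline sits at 1
        plt.xlabel("TMS time (ms)")
        plt.ylabel("Normalized MEP")
        plt.legend()
        plt.show()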
Later, I'll be analyzing the data and taking standard deviations to find the error margin of all the data I collected.
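Here's a rough sketch of one way that could go: computing the standard error of the mean (the standard deviation divided by the square root of the number of subjects) per condition and TMS time. Same caveats as before: Python, not our actual analysis, hypothetical names.

    import pandas as pd

    def grand_average_with_error(subject_tables: list[pd.DataFrame]) -> pd.DataFrame:
        # Mean and standard error of the normalized MEPs across subjects,
        # per condition and TMS time; "sem" gives the error-bar half-width.
        stacked = pd.concat(subject_tables)
        return (stacked.groupby(["condition", "timings"])["norm_mep"]
                       .agg(mean="mean", sem="sem")
                       .reset_index())

Thanks again for reading my blog!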
