I am collecting RT data following the presentation of an object (a target appears on the screen for 250 ms followed by a blank screen, and RTs are calculated from the onset of the target). What I end up with is the RT coded in the blank-screen object, but I need to add the duration of the target (nominally 250 ms) to the total RT, since this is part of the processing time of the target. My issue is that the onset-to-onset time for the target is not always 250 ms but occasionally longer, e.g. 267 ms, which I interpreted as meaning that the blank screen following the target was delayed by 17 ms (so the target was on the screen for longer than expected). I hope all of that is clear.
My question is: if the onset-to-onset time of the target is 267 ms rather than 250 ms, is this the actual time that the target was displayed on the screen? I need to determine the true duration of the target so that I can add it to the RT coded in the next object.
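If it helps to make the intended correction concrete, here is a minimal sketch of the arithmetic I have in mind, assuming the onset-to-onset time is indeed the actual target duration (the function name and values are hypothetical, just for illustration):

```python
# Hypothetical sketch: correct each trial's RT using the measured
# onset-to-onset time of the target instead of the nominal 250 ms.
NOMINAL_TARGET_MS = 250  # intended target duration

def corrected_rt(blank_rt_ms, target_onset_to_onset_ms):
    """RT measured from target onset = actual target duration
    (target onset to blank-screen onset) + RT coded in the blank object."""
    return target_onset_to_onset_ms + blank_rt_ms

# Example trial: target stayed up 267 ms instead of 250 ms,
# and the RT coded in the blank-screen object was 412 ms.
print(corrected_rt(412, 267))  # 679, not the naive 412 + 250 = 662
```

So the question reduces to whether using the per-trial onset-to-onset value (267) rather than the nominal constant (250) is the right choice here.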
Many thanks for your help.