What is Block Size?

A block of data is a group of measurements, readings, or values that are housed together as a single packet or array. For devices with slow sample rates (below 2 Hz), the block size is typically one. For devices sampling above 100 Hz, on the other hand, the block size must grow to keep up with the incoming data.
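As a minimal sketch of the idea (the numbers here are hypothetical, not DASYLab defaults): a block is just an array of consecutive samples that travels as one unit, so its duration in seconds is the block size divided by the sample rate.

```python
# Hypothetical figures for illustration; DASYLab lets you set these per worksheet.
sample_rate_hz = 1000   # samples per second per channel
block_size = 100        # samples grouped into one block

# One block spans block_size / sample_rate_hz seconds of signal:
block_duration_s = block_size / sample_rate_hz

# Simulate one block of readings (e.g. from a data-acquisition device):
block = [0.0] * block_size   # 100 values housed together as a single packet

print(block_duration_s)   # 0.1 -> a new block arrives ten times per second
```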

DASYLab processes all its data on a block-by-block basis and passes these blocks around the worksheet from module to module. Every module in a worksheet is entered into a module administration list. While DASYLab runs, a Dispatcher assigns CPU time to each module in the list for data processing. In assigning CPU time, there is a practical speed limit that depends on the size of your worksheet and the capability of your computer. This is because the Dispatcher must process all modules on the list within a specific reaction time, defined by the overall sampling rate and block size.
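This block-by-block flow can be sketched as follows. This is not DASYLab's actual code; the two modules and their processing steps are hypothetical, and only the pattern matters: each module in the administration list is given a turn to process the current block before it is passed on.

```python
# Hypothetical worksheet modules; each one transforms a whole block at a time.
def scale(block):    # e.g. a "Scaling" module doubling every value
    return [2.0 * v for v in block]

def offset(block):   # e.g. an "Offset" module adding a constant
    return [v + 1.0 for v in block]

module_list = [scale, offset]   # the module administration list

def dispatch(block):
    """Pass one block from module to module, in list order.

    The real Dispatcher must get through this entire list before the
    next block of data arrives.
    """
    for module in module_list:
        block = module(block)
    return block

print(dispatch([1.0, 2.0, 3.0]))   # [3.0, 5.0, 7.0]
```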

Suppose a worksheet contains 100 modules and the sampling rate/block size (A/B ratio) relationship is 1000/1. All of the data must then be processed within 1 ms (1/1000 s). With 100 modules, each module is assigned only 10 µs of CPU time to process its data, which is not enough to keep up with the data stream. The degree of precision you require determines the sampling rate (resolution) of your measurement. The block size determines the cycle time of the Dispatcher, and thus the time available to process the collected data. The recommended A/B ratio is 1/10.

DASYLab 6.0 and later include a View»Status Bar option that shows DASYLab's system utilization, that is, the CPU time it requires as a fraction of the maximum available. Using this, you can determine whether your computer can process your worksheet fast enough. If more than half of the bar turns red, you may be approaching an overrun condition that requires a larger block size or a lower sample rate.
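The utilization figure can be understood as the ratio of the time a Dispatcher pass actually needs to the time one block makes available. A rough sketch under assumed numbers (the measured processing time here is hypothetical, not something DASYLab reports through an API):

```python
# Assumed figures for illustration only.
sample_rate_hz = 1000
block_size = 10
processing_time_per_block_s = 0.004   # assumed cost of one Dispatcher pass

available_s = block_size / sample_rate_hz               # 0.01 s between blocks
utilization = processing_time_per_block_s / available_s  # 0.4, i.e. 40%

# Mirroring the "more than half red" rule of thumb from the status bar:
if utilization > 0.5:
    print("close to overrun: increase block size or lower the sample rate")
else:
    print(f"utilization {utilization:.0%}")
```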