Measuring Resource Consumption

Analysis of algorithms is about measuring how resource consumption grows as the size of the data increases. A resource can be anything that has a limited supply. The two most common are time and memory, but they are not the only ones.

Time Resource

Each operation that your program performs requires a small slice of time... time to load the instruction, time to perform the operation, time to store the result... The more operations you perform, the more time it takes. Thus, one way to measure the amount of time an algorithm requires is to count how many operations it performs.

When doing this, we make the assumption that every operation has the same time cost. Under this assumption, an addition and a division take the same amount of time. This is not the case in reality, but it is a good enough approximation for analysis because we care about how the operation count grows, not about exact timings.
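To make operation counting concrete, here is a minimal sketch (the function and its ops counter are illustrative, not from the original text): a summation loop tallies one unit per counted operation, and under the equal-cost assumption that tally is proportional to the running time.

```python
def sum_list(values):
    """Sum a list while counting the abstract operations performed."""
    ops = 0

    total = 0              # one assignment
    ops += 1

    for v in values:
        total += v         # one addition-and-store, counted as one operation
        ops += 1

    return total, ops

# Doubling the amount of data roughly doubles the operation count.
print(sum_list(list(range(10))))   # (45, 11)
print(sum_list(list(range(20))))   # (190, 21)
```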

Memory Resource

The other big resource is memory. Memory is not measured by counting operations, because performing more operations does not necessarily consume more memory. Instead, we can estimate it by accounting for things like variable declarations and dynamically allocated memory.
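As an illustration (a hypothetical sketch, not part of the original), the two functions below do roughly the same amount of work, but only the first one's memory consumption grows with the size of the input, because it stores one result per input element.

```python
def squares(n):
    """Stores one result per input element, so memory grows with n."""
    result = []                # a list that will eventually hold n values
    for i in range(n):
        result.append(i * i)   # one extra stored value per iteration
    return result

def sum_of_squares(n):
    """Keeps only a running total, so memory stays (roughly) constant."""
    total = 0
    for i in range(n):
        total += i * i
    return total

# Similar operation counts, very different memory footprints.
print(len(squares(1000)))      # 1000 stored values
print(sum_of_squares(1000))    # a single number, however large n gets
```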

Measuring Resources

To perform an analysis, the first thing we do is work out mathematically how much of a resource we consume. However, as stated in the introduction, this is not a single number... it is a mathematical expression that typically involves a variable for the amount of data. This is because we can expect that the more data we have, the more resources we consume... it takes more time to process more data, and it takes more memory to store more data. The real question is how much more.
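For example (a hypothetical walk-through, not from the original), counting the comparisons made by a simple pairwise duplicate check gives an expression in terms of the data size n rather than a single number.

```python
def has_duplicate(values):
    """Check every pair of elements for equality."""
    n = len(values)
    for i in range(n):
        for j in range(i + 1, n):
            if values[i] == values[j]:   # one comparison per pair
                return True
    return False

# In the worst case (no duplicates) every pair is compared exactly once,
# so the number of comparisons is T(n) = n * (n - 1) / 2...
# an expression involving n, not a single number.
```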

Once we have this expression, we can then look at how the resource consumption grows with respect to the size of the data.
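As a sketch of that step (reusing the hypothetical pairwise example above), evaluating the expression at increasing data sizes makes the growth pattern visible.

```python
def worst_case_comparisons(n):
    """Worst-case comparison count for the pairwise check: T(n) = n*(n-1)/2."""
    return n * (n - 1) // 2

for n in (10, 20, 40, 80):
    print(n, worst_case_comparisons(n))
# 10 45
# 20 190
# 40 780
# 80 3160
# Doubling the data roughly quadruples the work: the consumption grows quadratically.
```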
