Hick's Law Experiment

Introduction

Hick's Law, also known as the Hick–Hyman law, is named after psychologists William Edmund Hick and Ray Hyman. It describes how the time it takes to make a decision increases logarithmically with the number and complexity of choices.

Hick's Law Definition & Principle

Hick's Law, derived from cognitive experiments, measures the 'rate of gain of information': the time required to process each additional bit of information. Contrary to common intuition, doubling the number of choices does not double decision time. Because the relationship is logarithmic, each doubling of the choice set adds a roughly constant increment to decision time, so the cost of extra choices diminishes as the set grows.


Hick's Law Formulas

The average reaction time 𝑇 is calculated using the formula:

T = b \cdot \log_{2}(n + 1)

The constant 𝑏 is fitted from experimental data, and the logarithm reflects the depth of the decision-making hierarchy: much as in a binary search, a person can rule out a whole category of options at each step rather than evaluating every choice one by one.
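To make the logarithmic growth concrete, the short Python sketch below evaluates the formula for a few set sizes; the slope b = 0.2 seconds per bit is an illustrative assumption here, not an empirically fitted value.

    import math

    def reaction_time(n: int, b: float = 0.2) -> float:
        """Average reaction time T = b * log2(n + 1), in seconds.

        b = 0.2 s/bit is an illustrative value, not a fitted constant.
        """
        return b * math.log2(n + 1)

    # Doubling the number of choices adds a roughly constant increment;
    # it does not double the reaction time.
    for n in (2, 4, 8, 16):
        print(f"n = {n:2d}  ->  T = {reaction_time(n):.3f} s")

Going from 4 to 8 choices adds about 0.17 s here, and going from 8 to 16 adds a similar increment, rather than doubling the total time.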

When choices have unequal probabilities, the formula can be generalized as:

T = b \cdot H

where 𝐻 is closely related to the information-theoretic entropy of the decision and is calculated as:

H = \sum_{i=1}^{n} p_i \log_{2}\left(\frac{1}{p_i} + 1\right)

where p_i is the probability of the i-th alternative. With equal probabilities p_i = 1/n, the sum reduces to log2(n + 1), recovering the first formula.
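As a quick check of the unequal-probability form, the sketch below (using the same illustrative b = 0.2 s/bit as above) computes 𝐻 for an equal and a skewed distribution.

    import math

    def entropy(probs: list[float]) -> float:
        """Hick-Hyman entropy term: H = sum_i p_i * log2(1/p_i + 1)."""
        return sum(p * math.log2(1.0 / p + 1.0) for p in probs)

    def reaction_time_unequal(probs: list[float], b: float = 0.2) -> float:
        """Average reaction time T = b * H for unequal probabilities."""
        return b * entropy(probs)

    equal = [0.25] * 4              # four equally likely choices
    skewed = [0.7, 0.1, 0.1, 0.1]   # one dominant choice

    # With equal probabilities, H equals log2(4 + 1), about 2.32 bits.
    print(f"equal : H = {entropy(equal):.3f} bits, T = {reaction_time_unequal(equal):.3f} s")
    print(f"skewed: H = {entropy(skewed):.3f} bits, T = {reaction_time_unequal(skewed):.3f} s")

The skewed distribution yields a lower 𝐻, reflecting the fact that a dominant, predictable choice speeds up the average decision.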


Hick's Law, much like Fitts's Law, emerges because decision-making naturally involves categorizing choices, which logarithmically reduces the set of possibilities rather than requiring a linear evaluation of each option.


Hick's Law Experiment

Hick's Law describes the time it takes for a person to make a decision as a function of the number of possible choices: the more choices there are, the longer it takes to decide.
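As a rough illustration of how such an experiment can be run, here is a minimal console sketch in Python that times simple key-press choices for several set sizes. It is only a sketch of the idea, with a hypothetical run_trial helper, not the implementation behind this page's experiment.

    import math
    import random
    import time

    def run_trial(n_choices: int) -> float:
        """One choice reaction-time trial: announce a target number
        among n_choices and time how long the response takes."""
        target = random.randint(1, n_choices)
        print(f"Choices 1..{n_choices}. Type the number shown: {target}")
        start = time.perf_counter()
        while input("> ").strip() != str(target):
            print("Wrong key, try again.")
        return time.perf_counter() - start

    # Average a few trials per set size; Hick's Law predicts the mean
    # times grow roughly linearly in log2(n + 1).
    for n in (2, 4, 8):
        times = [run_trial(n) for _ in range(3)]
        mean = sum(times) / len(times)
        print(f"n = {n}: mean T = {mean:.3f} s  (log2(n+1) = {math.log2(n + 1):.2f})")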