Observing the queue of people being checked out at a large grocery or retail store can be fascinating. You can see shoppers deciding which line to enter – based on the number of people, how many items are in each cart, how quickly the cashier is ringing up each sale, whether someone is assisting with the bagging process, and so on. Although most people don’t realize it, they are estimating how long it will take them to get through the checkout line, applying a principle known as “Little’s Law”.
This fundamental concept, developed over 50 years ago, enables us to predict the behavior of a queuing process. It is named after John Little, a professor at MIT’s Sloan School of Management.
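Little’s Law relates the average number of items in a system (L), the average arrival rate (λ), and the average time an item spends in the system (W) as L = λW. The shopper’s mental arithmetic amounts to rearranging this into W = L / λ. A minimal sketch, using illustrative numbers of my own choosing rather than figures from the text:

```python
def expected_wait(people_in_line, checkouts_per_minute):
    """Estimate time spent in the queue (minutes) via Little's Law.

    Rearranging L = lambda * W gives W = L / lambda, where L is the
    number of people in line and lambda is the cashier's throughput.
    """
    return people_in_line / checkouts_per_minute

# Six shoppers ahead of you, and the cashier finishes about
# two checkouts per minute: expect roughly a three-minute wait.
print(expected_wait(6, 2))  # -> 3.0
```

The same rearrangement explains why shoppers weigh both the length of the line (L) and the cashier’s speed (λ): a longer line with a faster cashier can beat a shorter one.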