A New Algorithm Beats Dijkstra's Time For The Shortest Path To Any Point In A Network
When Edsger W. Dijkstra published his algorithm in 1959, computer networks were barely a thing. The algorithm finds the shortest path between two nodes on a graph, and a variant extends it to find the shortest routes from a single source node to every other node. This would prove incredibly useful as computer networks expanded and evolved into the internet we all know and love. It says a lot about the algorithm that it's still widely used in modern networks: the Open Shortest Path First (OSPF) routing protocol relies on Dijkstra's Shortest Path First algorithm to map efficient routes through complex networks.
Despite its longevity, the algorithm has one notable problem: a built-in speed limit. The limit, known as the sorting barrier, stems from a processing bottleneck in which the algorithm must continually determine which unexplored node is currently closest to the source. Keeping the frontier of the search effectively sorted by distance at every step is what caps the algorithm's performance.
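To see where that bottleneck lives, here is a minimal sketch of the textbook heap-based formulation of Dijkstra's algorithm in Python (an illustration, not any particular router's implementation; the function name and toy graph are made up for the example). The `heappop` call is the repeated "find the closest unexplored node" step that keeps the frontier effectively sorted:

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths on a graph with non-negative
    edge weights. `graph` maps a node to a list of (neighbor, weight)."""
    dist = {source: 0}
    heap = [(0, source)]  # frontier as a priority queue: (distance, node)
    while heap:
        # The bottleneck: every step extracts the frontier node closest
        # to the source, so the frontier is kept effectively sorted by
        # distance. This is the source of the "sorting barrier".
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry; u was already settled closer
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Toy network: weights are link costs, as in OSPF metrics.
network = {"A": [("B", 1), ("C", 4)], "B": [("C", 2), ("D", 5)], "C": [("D", 1)]}
print(dijkstra(network, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 4}
```

Every node extraction goes through that priority queue, and on a graph with n nodes and m edges the queue work adds up to the O(m + n log n) running time that defines the sorting barrier.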
Now, however, a new algorithm has managed to bypass this barrier and remove the resulting speed limit. A 2025 paper by a team of researchers from Stanford University and Tsinghua University, Beijing, outlines a combination of Dijkstra's algorithm with the Bellman-Ford algorithm that allows even more efficient computation of shortest paths. Rather than focusing on individual nodes, the new algorithm essentially groups nodes, which speeds up the search by shrinking the set of nodes it has to sort through.
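The paper's actual construction is considerably more intricate, but the flavor of "grouping instead of sorting" shows up in a much older technique called delta-stepping (due to Meyer and Sanders), which buckets frontier nodes by distance range and relaxes a whole bucket at a time, Bellman-Ford style. The sketch below is that classic bucketing idea, not the 2025 algorithm; `delta` and the toy graph are illustrative choices:

```python
import math
from collections import defaultdict

def delta_stepping(graph, source, delta):
    """Single-source shortest paths (non-negative weights) that groups
    frontier nodes into distance buckets of width `delta` and settles a
    whole bucket at a time, instead of keeping one fully sorted queue."""
    dist = defaultdict(lambda: math.inf)
    dist[source] = 0.0
    buckets = defaultdict(set)   # bucket index -> nodes awaiting relaxation
    buckets[0].add(source)

    while buckets:
        i = min(buckets)         # lowest non-empty distance bucket
        # Relax within bucket i until no node falls back into it
        # (a bounded, Bellman-Ford-style relaxation loop).
        while buckets.get(i):
            for u in buckets.pop(i):
                for v, w in graph.get(u, []):
                    nd = dist[u] + w
                    if nd < dist[v]:
                        old = dist[v]
                        if old < math.inf:   # drop v from its old bucket
                            j = int(old // delta)
                            if v in buckets.get(j, ()):
                                buckets[j].discard(v)
                                if not buckets[j]:
                                    del buckets[j]
                        dist[v] = nd
                        buckets[int(nd // delta)].add(v)
    return dict(dist)

network = {"A": [("B", 1), ("C", 4)], "B": [("C", 2), ("D", 5)], "C": [("D", 1)]}
print(delta_stepping(network, "A", delta=2.0))  # {'A': 0.0, 'B': 1.0, 'C': 3.0, 'D': 4.0}
```

The payoff of the grouping is that nodes inside one bucket never need to be ordered relative to each other; only the buckets themselves are processed in order.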
Network algorithms: A long route to the shortest path
The "sorting barrier" in Dijkstra's algorithm has been challenging researchers for decades. By 1984, researchers at Princeton University had improved the original algorithm to the point where the sorting barrier speed limit was hit. And, although researchers did manage to break this barrier throughout the next couple of decades, doing so relied on making assumptions about the "weight" of each node (in Dijkstra's algorithm, each potential node is assigned a weight; the less a node weighs, the closer it is). Because these techniques depended on assumptions, the speed gain was achieved at the cost of accuracy.
Counterintuitively, the breakthrough came when the researchers integrated principles from the Bellman-Ford algorithm into their work. The Bellman-Ford algorithm also computes shortest paths, but it does so without ever producing a sorted list; its main disadvantage is that it's substantially slower than Dijkstra's algorithm. The researchers discovered that using Bellman-Ford selectively allowed them to target the most influential nodes in the network, which can be thought of as the key highways that many routes connect to. Using this method, the algorithm could expand outward more efficiently and skip unnecessary calculations.
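For contrast with the heap-based sketch above, here is a minimal Bellman-Ford in the same style: no priority queue and no sorted frontier, just repeated relaxation of every edge until distances stop improving, which is why it sidesteps sorting but is slower overall (up to n - 1 passes over all m edges, i.e. O(nm) time):

```python
def bellman_ford(graph, nodes, source):
    """Single-source shortest paths by blind, repeated relaxation.
    No sorted frontier is ever maintained: every edge is re-examined
    each round until distances stop improving."""
    dist = {n: float("inf") for n in nodes}
    dist[source] = 0
    for _ in range(len(nodes) - 1):   # n - 1 rounds always suffice
        changed = False
        for u in nodes:
            for v, w in graph.get(u, []):
                if dist[u] + w < dist[v]:
                    dist[v] = dist[u] + w
                    changed = True
        if not changed:               # distances stabilized early
            break
    return dist

network = {"A": [("B", 1), ("C", 4)], "B": [("C", 2), ("D", 5)], "C": [("D", 1)]}
print(bellman_ford(network, ["A", "B", "C", "D"], "A"))
# {'A': 0, 'B': 1, 'C': 3, 'D': 4}
```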
A later refinement removed a randomization aspect of the process and added a layering system that organized the search more intelligently. The result was a network pathfinding algorithm that finally broke through the sorting barrier speed limit.
Why pure algorithms still matter
Most people will have heard of Moore's Law, which predicts how quickly hardware performance will improve. The prediction was originally based on transistor counts doubling annually, and although the figure was later revised to roughly every two years, it's still used to measure the development of technology. However, algorithms are the unsung heroes of computing. Below the surface, they do everything from amplifying negative emotions on platforms like X to compressing data in file formats like JPEG and MP3.
Research done by the Massachusetts Institute of Technology in 2021 found that improvements in algorithms often outpace advances in hardware. Around 40% of the algorithm families studied improved at a rate that rivaled or exceeded Moore's Law, while a quarter improved by orders of magnitude more than hardware did.
This context makes the latest breakthrough in shortest path algorithms all the more remarkable. The fact that Dijkstra's algorithm went unimproved for so long after reaching its theoretical top speed shows just how difficult the task was: since 1984, computer scientists had considered the so-called sorting barrier the de facto speed limit for any algorithm built on a sorting technique.
The breakthrough made by the joint team from Stanford and Tsinghua University has proven that even the most mature areas of computing still have room for improvement. And while this latest discovery may not make the internet faster overnight or miraculously stop your YouTube video from taking too long to upload, it does remind us of the importance of algorithms, the unsung heroes of the digital age.