Moore's Law
The number of transistors on a chip doubles roughly every two years.
Tiny Summary
Moore's Law: "The number of transistors on a chip doubles roughly every 2 years." Hardware gets exponentially cheaper and faster over time—until it doesn't.
The Law
Gordon Moore observed in 1965 that transistor counts on integrated circuits were doubling every year; in 1975 he revised the rate to roughly every two years. Result: Computing power doubles on a steady cadence, cost per transistor falls, performance improves exponentially.
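The doubling rule is plain exponential growth and can be sketched in a few lines of Python. The starting count below is the Intel 4004's (1971); the projection itself is illustrative, not historical data:

```python
def transistor_count(initial, years, doubling_period=2):
    """Project transistor count under a fixed doubling period (Moore's Law)."""
    return initial * 2 ** (years / doubling_period)

# Illustrative projection: start from the Intel 4004's ~2,300 transistors
# and apply 40 years of doubling every 2 years (20 doublings).
projected = transistor_count(2_300, 40)
print(f"{projected:,.0f}")  # → 2,411,724,800 (≈2.4 billion, in line with ~2011-era chips)
```

Twenty doublings is a factor of about a million, which is why a constant-looking rule produced such dramatic change.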
Why It Mattered
1970-2010: Moore's Law held. Computers got exponentially faster. Software could be lazy, because the next hardware generation would catch up. Enabled personal computers, made smartphones possible, and drove the entire tech industry.
Why It's Slowing Down
Physics limits: Transistor features approaching atomic scale; heat dissipation (the breakdown of Dennard scaling around 2006 capped clock speeds); quantum effects such as leakage and tunneling at small geometries
The end: Density scaling has slowed since ~2010. Transistors can't shrink indefinitely. Physical limits are being hit.
Implications
Past (Moore's Law active): Software got faster for free: just wait for the next CPU generation. Optimization was less critical. Hardware solved performance problems.
Future (post-Moore): Performance requires better algorithms. Can't rely on hardware improvements. Software optimization matters more.
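The "better algorithms" point can be made concrete with a minimal sketch (function names are illustrative) contrasting a quadratic and a linear duplicate check:

```python
def has_duplicate_quadratic(items):
    """O(n^2): compare every pair. Doubling CPU speed only halves this cost."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_linear(items):
    """O(n): a single pass with a set. An algorithmic gain no CPU upgrade matches."""
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False
```

For a million items the quadratic version performs on the order of 5 × 10^11 comparisons versus about 10^6 set lookups for the linear one; waiting two years for transistors to double would merely cut the quadratic cost in half.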
Modern Replacements
Specialized hardware: GPUs for parallel workloads, TPUs for machine learning, ASICs for specific tasks
Architectural changes: Multi-core processors (parallelism), heterogeneous computing, distributed systems
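Why multi-core parallelism is not a drop-in replacement for frequency scaling can be sketched with Amdahl's law (a standard result, not stated in this note): speedup = 1 / ((1 - p) + p/n) for parallel fraction p on n cores.

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: the serial fraction caps speedup no matter how many cores."""
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

# Code that is 90% parallelizable on 16 cores speeds up ~6.4x, not 16x.
print(round(amdahl_speedup(0.90, 16), 1))  # → 6.4
```

The serial 10% dominates as core counts grow, which is one reason specialized hardware (GPUs, TPUs, ASICs) targets workloads that are almost entirely parallel.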
Key Insights
Moore's Law is slowing/ending. Can't rely on hardware to solve performance problems. Algorithmic improvements matter more now. Future gains come from architecture (GPUs, specialized chips), not just transistor density.