What Happened in 2023 in Computing?
Some Key Takeaways
Every week, the Future of Computing newsletter features our interviews with founders and industry experts, alongside computing news – breakthroughs, funding rounds, and more. Here are some key takeaways about what made the headlines in 2023:
- Small language models started matching much larger, closed-source models in performance while remaining compact and (cost-)efficient enough to run locally (a minimal local-inference sketch follows this list).
- We got strong signs that machine learning will help not only with writing poems but also with developing better materials, computer chips, and batteries – assisting humans in shaping the world of atoms.
- Memory, not GPU compute, has become the bottleneck holding back machine learning, so we need faster, higher-bandwidth memory (see the back-of-the-envelope calculation after this list).
- GPUs are better than CPUs for machine learning tasks, but people started exploring what’s next – maybe chips leveraging in-memory computing or fully analog AI chips.
- In a similar spirit, people are looking more than ever into what lies beyond silicon – perhaps another semiconductor material, or even entirely new physical principles for computing (more on optical computing below).
- Extreme ultraviolet (EUV) lithography came to fruition as TSMC's 3 nm node reached volume production, among others, for Apple's M3 chip series.
- While larger and larger quantum computers were announced, 2023 also brought the first quantum processors with many logical qubits – an important milestone.
- Access to quantum computers for experimentation has become fairly easy through cloud providers. Meanwhile, quantum-inspired classical algorithms showed value in AI, e.g., for compressing machine learning models (a toy compression sketch closes this section).
- The photonics industry shifted its focus from optical AI accelerators to photonic interconnects as a potential remedy for the AI memory bottleneck.
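As a taste of how accessible small open-weight models have become, here is a minimal sketch of running one locally with the Hugging Face transformers library. The checkpoint name is only an example of a small (~1B-parameter) open model; any similarly sized model that fits your machine would do.

```python
# Minimal sketch of running a small open-weight language model locally with
# the Hugging Face transformers library. The checkpoint below is only an
# example; substitute any small open model that fits your machine.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # example ~1.1B-parameter model

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "In 2023, the most important trend in computing was"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```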
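To see why memory bandwidth sets the pace, consider autoregressive text generation: producing each token requires streaming essentially all model weights from memory once, while the arithmetic per parameter is tiny. The numbers below (a 7B-parameter fp16 model on an accelerator with roughly 2 TB/s of memory bandwidth and 1 PFLOP/s of compute) are illustrative assumptions, not measurements.

```python
# Rough roofline-style estimate of token generation speed for a large
# language model. All numbers are illustrative assumptions: a 7B-parameter
# model stored in fp16, ~2 TB/s of HBM bandwidth, ~1 PFLOP/s of fp16 compute.

PARAMS = 7e9                     # model parameters
WEIGHT_BYTES = PARAMS * 2        # fp16 = 2 bytes per parameter
HBM_BANDWIDTH = 2e12             # bytes per second
PEAK_FLOPS = 1e15                # fp16 FLOP/s

# Each generated token streams (roughly) all weights once and performs
# about 2 FLOPs per parameter (one multiply, one add).
time_memory = WEIGHT_BYTES / HBM_BANDWIDTH
time_compute = 2 * PARAMS / PEAK_FLOPS

print(f"time to move weights per token : {time_memory * 1e3:.2f} ms")
print(f"time to do the math per token  : {time_compute * 1e3:.3f} ms")
print(f"=> at most ~{1 / time_memory:.0f} tokens/s per sequence, "
      f"bounded by memory bandwidth rather than compute")
```

Even with these generous assumptions, the weights take hundreds of times longer to move than to multiply, which is why faster memory and smarter memory placement matter more than raw FLOPS here.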
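On the quantum-inspired side, one family of methods represents a network's weight matrices as tensor networks and truncates them. The toy sketch below shows the simplest instance of that idea – a truncated SVD splitting one dense layer into two thin factors. The matrix size, target rank, and noise level are made-up illustrative values, not taken from any specific paper.

```python
# Toy illustration of the idea behind quantum-inspired model compression:
# factor a dense weight matrix into a small tensor network. The simplest
# case, shown here, is a truncated SVD that splits one layer into two thin
# matrices. Matrix size, rank, and noise level are made-up values.
import numpy as np

rng = np.random.default_rng(0)

# A 1024x1024 "weight matrix" with approximately low-rank structure
# (as trained layers often have), plus a little noise.
W = rng.standard_normal((1024, 64)) @ rng.standard_normal((64, 1024))
W += 0.01 * rng.standard_normal((1024, 1024))

rank = 64                                   # how aggressively we compress
U, s, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :rank] * s[:rank]                  # 1024 x 64
B = Vt[:rank, :]                            # 64 x 1024

compression = (A.size + B.size) / W.size
error = np.linalg.norm(W - A @ B) / np.linalg.norm(W)

print(f"kept {compression:.1%} of the original parameters")
print(f"relative reconstruction error: {error:.4f}")
```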