What Happened in 2023 in Computing?

Subscribe to our newsletter below and get access to our review
about what made the headlines 👇



Some Key Takeaways

Every week, the Future of Computing newsletter features our interviews with founders and industry experts and computing news – breakthroughs, funding rounds, and more. Here are some key takeaways about what made the headlines in 2023:

  1. Small language models started matching larger, closed-source models in performance while remaining compact and cost-efficient enough to run locally.
  2. We got strong signs that machine learning will help not only with writing poems but also with developing better materials, computer chips, and batteries – assisting humans in shaping the world of atoms.
  3. Memory, not GPUs, has become the bottleneck holding back machine learning, so we need faster, high-bandwidth memory.
  4. GPUs are better than CPUs for machine learning tasks, but people started exploring what’s next – maybe chips leveraging in-memory computing or fully analog AI chips.
  5. Similar in spirit, people are looking more than ever into what’s next beyond silicon – maybe another semiconductor material or even entirely new physical principles for computing – more on optical computing below.
  6. Extreme ultraviolet lithography (EUV) came to fruition as TSMC’s 3nm node reached volume production, powering, among others, Apple’s M3 chip series.
  7. While ever-larger quantum computers were announced, 2023 also brought the first quantum processors with many logical qubits – an important milestone.
  8. Access to quantum computers for experimentation has become fairly easy through cloud providers. Meanwhile, quantum-inspired, classical algorithms showed value in AI, e.g., for compressing machine learning models.
  9. The photonics industry shifted its focus from optical AI accelerators to photonic interconnects as a potential remedy for the AI memory bottleneck.
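The memory-bottleneck point (takeaway 3) can be made concrete with a back-of-envelope roofline estimate. The sketch below is illustrative only: the model size, HBM bandwidth, and peak throughput are assumed round numbers for a 7B-parameter fp16 model on an A100-class GPU, not figures from the newsletter.

```python
# Back-of-envelope roofline check: is single-batch LLM token generation
# limited by compute or by memory bandwidth? All numbers are assumptions.

params = 7e9            # model parameters (assumed 7B model)
bytes_per_param = 2     # fp16
hbm_bandwidth = 2e12    # bytes/s (~2 TB/s HBM, assumed)
peak_flops = 312e12     # fp16 FLOP/s (assumed accelerator peak)

# Generating one token at batch size 1 streams every weight from memory once...
weight_bytes = params * bytes_per_param
t_memory = weight_bytes / hbm_bandwidth    # time to move the weights

# ...and performs roughly 2 FLOPs per parameter (one multiply, one add).
flops_per_token = 2 * params
t_compute = flops_per_token / peak_flops   # time to do the arithmetic

print(f"memory-limited:  {t_memory * 1e3:.2f} ms/token")
print(f"compute-limited: {t_compute * 1e3:.3f} ms/token")
print(f"memory is the bottleneck by ~{t_memory / t_compute:.0f}x")
```

Under these assumptions, moving the weights takes two orders of magnitude longer than the arithmetic itself, which is why faster, high-bandwidth memory (and the photonic interconnects in takeaway 9) matter more than raw GPU FLOPS for this workload.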
