8 Trends Shaping the Future of Computing in 2023
Since starting this blog in the spring of 2022, I have interviewed more than 50 startup founders about their journey, technology, and how they’re shaping the future of computing.
Here are eight trends that are already shaping the future of computing today:
The Age of Machine Learning
Since ChatGPT, it is no longer just researchers and startup founders who are aware that machine learning is going to change the world. Before ChatGPT, you could get fired for implementing machine learning at your company; now, you could get fired for not implementing it.
Machine learning is not only going to unlock a huge number of use cases, from writing poems and marketing ads to making personalized education and personalized medicine a reality. It also makes developers more productive at writing code.
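As a small illustration of that productivity boost, here is a minimal, hypothetical sketch of asking an LLM-based assistant to draft a function via the OpenAI Python client (v1.x); the model name and prompt are placeholders, not recommendations.

```python
# Minimal sketch: asking an LLM to draft a utility function.
# Assumes the official `openai` Python package (v1.x) and an API key in the environment;
# the model name and prompt are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": "You are a helpful coding assistant."},
        {"role": "user", "content": "Write a Python function that parses ISO-8601 dates."},
    ],
)

print(response.choices[0].message.content)  # the generated draft, ready for human review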










Synthetic Data & Computer Vision
Generative models are getting really good: good enough to amaze humans, but not yet good enough to reliably generate synthetic training data for other models, e.g., for computer vision tasks. We will get there, though.
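To make the workflow concrete, here is a minimal, hypothetical sketch: a stand-in "generator" produces plentiful labeled synthetic samples, a simple classifier is trained on them, and it is then evaluated on a small, slightly shifted "real" set. In practice the generator would be a diffusion model or GAN; here it is just noisy class prototypes, and all numbers are illustrative.

```python
# Minimal sketch of training on synthetic data (stand-in generator, not a real diffusion model).
# Assumes numpy and scikit-learn; all parameters are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
DIM = 64
protos = rng.normal(size=(2, DIM))  # two class "concepts" shared by both domains

def sample(n_per_class, noise, shift=0.0):
    """Noisy samples around the class prototypes; `shift` models the synthetic-to-real gap."""
    X = np.vstack([
        protos[c] + shift + noise * rng.normal(size=(n_per_class, DIM))
        for c in (0, 1)
    ])
    y = np.repeat([0, 1], n_per_class)
    return X, y

# "Synthetic" training set: cheap and plentiful, produced by the stand-in generator.
X_syn, y_syn = sample(n_per_class=500, noise=0.5)
# "Real" test set: scarce and slightly shifted, standing in for actual camera data.
X_real, y_real = sample(n_per_class=50, noise=0.5, shift=0.2)

clf = LogisticRegression(max_iter=1000).fit(X_syn, y_syn)
print("accuracy on real-ish data:", clf.score(X_real, y_real))
```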




AI & Quantum for Computational Modeling
As computing power becomes ubiquitous, startups can build larger AI models: not just large language models like ChatGPT, but also physics-informed models for predicting, e.g., materials properties, fluid flow, or weather. Machine learning keeps spreading into every use case it can plausibly be applied to.
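The core idea behind physics-informed models is to bake the governing equation into the training loss. Here is a minimal sketch for a toy ODE (du/dx = -u with u(0) = 1, whose exact solution is exp(-x)) using PyTorch; the architecture and hyperparameters are illustrative and not tied to any particular startup's model.

```python
# Minimal physics-informed neural network sketch for the toy ODE du/dx = -u, u(0) = 1.
# Assumes PyTorch; architecture and training settings are illustrative.
import torch
import torch.nn as nn

torch.manual_seed(0)

net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 32), nn.Tanh(), nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

x = torch.linspace(0.0, 2.0, 100).reshape(-1, 1).requires_grad_(True)

for step in range(2000):
    u = net(x)
    # Physics residual: du/dx + u should vanish everywhere on the domain.
    du_dx = torch.autograd.grad(u, x, grad_outputs=torch.ones_like(u), create_graph=True)[0]
    physics_loss = ((du_dx + u) ** 2).mean()
    # Boundary condition: u(0) = 1.
    bc_loss = ((net(torch.zeros(1, 1)) - 1.0) ** 2).mean()
    loss = physics_loss + bc_loss
    opt.zero_grad()
    loss.backward()
    opt.step()

print("u(1) ~", net(torch.ones(1, 1)).item(), "vs exact", torch.exp(torch.tensor(-1.0)).item())
```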





Quantum Hardware
Every year, a few quantum startups launch, attempting to build quantum computers. Still, no one has figured out how to build a practically useful quantum computer, yet an entire industry has been built on this promise. Quantum control, and anything else that shortens algorithms, prevents errors, or enables error correction, might be the unlock.
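To see why error correction is the crux, here is a toy, purely classical simulation of the redundancy idea behind the 3-qubit bit-flip code: encode one logical bit into three physical bits and recover it by majority vote. Real quantum error correction (stabilizers, syndrome measurements) is far more involved and cannot simply copy states; the sketch only shows how redundancy pushes the logical error rate below the physical one.

```python
# Toy classical simulation of the redundancy idea behind the 3-qubit bit-flip code.
# Each physical bit flips independently with probability p; majority vote recovers the logical bit.
# The point: logical error ~ 3p^2 beats physical error p when p is small.
import random

random.seed(0)

def logical_error_rate(p, trials=100_000):
    errors = 0
    for _ in range(trials):
        logical = 0                                           # encode logical 0 as (0, 0, 0)
        physical = [logical] * 3
        noisy = [b ^ (random.random() < p) for b in physical]  # independent bit flips
        decoded = 1 if sum(noisy) >= 2 else 0                 # majority vote
        errors += (decoded != logical)
    return errors / trials

for p in (0.01, 0.05, 0.1):
    theory = 3 * p**2 - 2 * p**3   # probability of two or more flips
    print(f"physical error {p}: logical error ~ {logical_error_rate(p):.4f} (theory {theory:.4f})")
```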







Quantum Algorithms
Meanwhile, quantum algorithm startups are sitting on the sidelines, waiting for quantum computers to finally become available. But the timeline is uncertain. In the meantime, many quantum algorithm startups turn out not to do quantum at all; they are smart people developing smart classical algorithms.
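A common pivot of that kind is quantum-inspired optimization: take the problem formats a quantum annealer would target (QUBO/Ising formulations) and attack them with good classical heuristics. Here is a minimal, hypothetical sketch of simulated annealing on a tiny random Ising model; the couplings and cooling schedule are purely illustrative.

```python
# Minimal "quantum-inspired" classical sketch: simulated annealing on a small random Ising model.
# Spins s_i in {-1, +1}; energy E(s) = -sum_{i<j} J_ij * s_i * s_j. Couplings are illustrative.
import math
import random

random.seed(1)
N = 20
J = {(i, j): random.choice([-1.0, 1.0]) for i in range(N) for j in range(i + 1, N)}

def energy(s):
    return -sum(J[i, j] * s[i] * s[j] for (i, j) in J)

s = [random.choice([-1, 1]) for _ in range(N)]
E = energy(s)

STEPS = 20_000
for step in range(STEPS):
    T = max(0.01, 2.0 * (1 - step / STEPS))   # linear cooling schedule
    i = random.randrange(N)
    s[i] *= -1                                # propose a single spin flip
    E_new = energy(s)
    if E_new <= E or random.random() < math.exp(-(E_new - E) / T):
        E = E_new                             # accept the move
    else:
        s[i] *= -1                            # reject: undo the flip

print("final energy:", E)
```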







Optical Computing
And while the quantum hype is still in full swing, a new generation of optical computing startups has set out to enter the data center. The first wave focused on high-value use cases like AI accelerators but struggled to integrate with existing electronic systems. The second wave might move entirely into the optical domain.
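The core pitch of most photonic AI accelerators is doing the matrix-vector multiplications of neural networks in the analog optical domain, at the cost of noise and limited precision at the electro-optic interface. Here is a hypothetical numpy toy model of that trade-off; the bit depth and noise level are made-up parameters, not figures from any real chip.

```python
# Toy model of an analog optical matrix-vector multiply: the math is an ordinary matmul,
# but inputs/outputs pass through low-precision converters and the analog core adds noise.
# Bit depth and noise level are illustrative assumptions, not real hardware figures.
import numpy as np

rng = np.random.default_rng(0)

def quantize(x, bits=8):
    """Uniform quantization, standing in for the DAC/ADC at the electro-optic interface."""
    scale = np.max(np.abs(x)) or 1.0
    levels = 2 ** (bits - 1) - 1
    return np.round(x / scale * levels) / levels * scale

def optical_matvec(W, x, noise=0.01, bits=8):
    y = quantize(W, bits) @ quantize(x, bits)          # the "free" analog multiply
    return quantize(y + noise * rng.normal(size=y.shape), bits)

W = rng.normal(size=(64, 64))
x = rng.normal(size=64)
exact = W @ x
approx = optical_matvec(W, x)
print("relative error:", np.linalg.norm(approx - exact) / np.linalg.norm(exact))
```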




Specialized Chips and Custom Boards
AI has created such large market opportunities that specialized chips are being developed. Neuromorphic computing, in preparing for large AI models, might not only make things a lot more efficient but also go beyond classical neural networks, e.g., with spiking neural networks that allow for temporal processing.
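To show what "temporal processing" means at the lowest level, here is a minimal leaky integrate-and-fire neuron in numpy: the membrane potential leaks over time, integrates incoming pulses, and emits a spike when it crosses a threshold. The time constants are illustrative, and real neuromorphic chips implement these dynamics in hardware rather than in a Python loop.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: the membrane potential leaks over time,
# integrates incoming current, and emits a spike when it crosses a threshold.
# Time constants and threshold are illustrative values.
import numpy as np

rng = np.random.default_rng(0)

dt, tau, v_thresh, v_reset = 1.0, 20.0, 1.0, 0.0      # ms, ms, arbitrary units
steps = 200
input_current = 0.1 * (rng.random(steps) < 0.5)       # random input pulses as current

v = 0.0
spikes = []
for t in range(steps):
    v += dt / tau * (-v) + input_current[t]           # leak toward 0, integrate input
    if v >= v_thresh:                                 # threshold crossing -> spike
        spikes.append(t)
        v = v_reset                                   # reset after spiking

print("spike times (ms):", spikes)
```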








Everything Becomes Cloud-First and Realtime






