My tryst with computers
Reflecting on the place and meaning of personal computers in human lives
It was 2001. I was four years old, living in a small township named Donimalai in the state of Karnataka, India, and I got a toy that year that would change my life: an HCL 2001 Ezeebee, rocking a speedy Intel Pentium 4 with 256 MB of RAM. This is what it looked like. Those 5 holes on the chassis would light up blue when switched on.
My gateway computing experience on this machine was playing around with the Eyewitness Children's Encyclopedia which was released in ‘97. I added some screenshots of the interface below.
I fondly recall the elements of its interface: chunky, persistent cuboidal buttons that opened windows into thousands of sub-topics you could learn about. In one memorable section, the "Map" offered immersive visual worlds where I could roam and interact with digital landscapes, while a peculiar, animated eye on the right panel playfully followed my every click. I remember being captivated by it, feeling as if I were commanding a living spaceship, a personal spacecraft for cyberspace.
That early experience of clicking through immersive worlds became the foundation for my lifelong relationship with technology. Over the following years, I dove deep into the digital realm - not merely by surfing the web or collecting media, but by absorbing the evolving language of computers. I grew up alongside the technology, witnessing each upgrade, every leap in interface clarity, and the miniaturization that reshaped our daily lives.
While this medium immersed and scaled itself into our reality, nothing truly foundational has changed since then: the interface is still a list of “applications” arranged on a screen. Everything just got clearer, tastier and snappier. Below you can see a comparison of the interface from the 70s (at Xerox PARC) vs now.
It wasn’t until I found myself heads-down in Steve Jobs’s biography by Walter Isaacson that my curiosity was piqued about the origin of these machines and how they made their way into our hands. I started this exploration1 way back with scales and the abacus around 500 BC, to understand the oldest tools that externalized thought, but for this discussion let’s start from the 60s and 70s, when computers got miniaturized enough to sit atop a desk.
The earliest Personal Computers
The Kenbak-1 is often considered the first personal computer. Designed by John Blankenbaker in 1970 (you can see him holding it below) and first sold in 1971, it was built primarily for educational purposes, enabling users to learn programming concepts without prior experience.
It had just 256 bytes of memory, 8 input switches and 8 output lights, but its rich instruction set and capabilities made it similar to the "big computers" of the time, and actually easier to program than some of the microprocessors which followed.
According to John Blankenbaker, some of his first sales were made to educational institutions: "The first two Kenbak-1 computers were sold to a private girl's school in Hawaii.”
A few years later, in 1976, the Apple I entered the scene with slight but significant strides toward usability. It had an ASCII keyboard for direct input and composite video output, which could connect to a television to display text at 40 characters per line across 24 lines. Yet, without an operating system, it relied on a barebones “monitor program,” where users typed machine code to manage memory and processes. Loading software like BASIC required connecting a cassette interface, which read programs from magnetic tapes.
These early machines were like strange, unfriendly tools requiring you to think like a computer just to use them. Still, they marked a profound shift—laying the groundwork for computers to evolve from specialist instruments into the personal, immersive worlds we know today.
Later that decade came three machines collectively known as the ‘77 trinity, pictured above. The Apple II, released in June 1977, became one of the first successful mass-market personal computers. The TRS-80, launched by Tandy Corporation in August 1977, sold over 100,000 units. The Commodore PET, announced in January 1977, was another key player in making personal computers accessible to a broader audience.
The Pivotal Moment
The late 70s and early 80s saw the dawn of a radical shift in how humans and computers would communicate. No place embodied this revolution better than Xerox PARC. It wasn’t just a research lab; it was a space where the most brilliant minds in design, science, engineering, philosophy and the arts came together to dream. And that was the explicit mandate. Visionaries like Alan Kay and others2 at PARC, building on Doug Engelbart’s pioneering work at SRI, were driven by one question: How do we make computing accessible, intuitive, and deeply integrated into everyday life? They understood the potential of visual metaphors and affordances, design principles that allow users to interact with complex systems as naturally as turning a page or opening a door.
The shift began with a simple yet profound realization: humans don't think in strings of code or binary instructions. If computers were to truly become personal, their interfaces had to reflect how our minds worked. PARC pioneers drew inspiration from the everyday physical world, leading to the birth of visual metaphors and affordances that remain central to user interfaces today.
And their crowning achievement? The “gooey,” or Graphical User Interface (GUI). It was at PARC that the computer screen transformed from a barren field of monochrome text into a rich canvas filled with windows, icons, and graphical elements. Clicking a folder icon became synonymous with accessing a set of files, much like pulling open a drawer in the real world. The act of dragging a document to the trash mimicked tidying up a physical desk.
If you haven't seen their demo you should totally watch it below.
The genius of PARC lay in how they bridged human intuition with computing power. They saw that the key to unlocking technology's potential wasn't just about more powerful processors but about creating interactions that felt natural and meaningful. This thinking would go on to inspire countless innovations, Apple's Macintosh being one of the most famous examples. This ethos still echoes in every click, swipe, and tap we make today.
Personal computers of today and tomorrow
As groundbreaking as Xerox PARC was, their biggest insight was simple yet profound: computers should meet us where our minds are. They dreamed up a future where interacting with a machine felt intuitive, like flipping through a book or shuffling papers on a desk. Alan Kay and his team believed computing could be as natural as thinking, and they turned that vision into something real, the graphical user interface. Suddenly, computing wasn't about knowing cryptic commands; it was about pointing, clicking, and seeing possibilities unfold in front of you.
But here's the thing: despite all the progress, that foundational dream only got halfway there. Sure, our interfaces got sleeker, buttons turned into icons, mice gave way to touchscreens, but they still revolve around isolated apps and rigid workflows. They expect us to think in compartments: "open this app for writing, that one for meetings, this one for tasks." Our minds don’t work that way. We have messy, overlapping intentions: "plan a trip," "get healthy," "explore some art," "find focus." Each of those cuts across multiple apps and tools, none of which really talk to each other.
The revolution we were all waiting for didn’t come from VR headsets or AR glasses. It came from something subtler but far more transformative: AI, specifically large language models. For the first time, machines could meet us where our minds are. This shift brings to life exciting possibilities for computing interfaces. Instead of being bound by static metaphors like apps, files and folders, we might interact with our computers through dynamic contexts that understand and adapt to our needs. Rather than switching between applications, we might work in fluid spaces that bring together all the tools and information relevant to our current task in shapes and forms unique to each one of us. The computer might become less of a tool we use and more of an environment we inhabit, much like how my childhood self experienced those worlds in the encyclopedia.
Extending Alan Kay's thinking, AI offers a way to break free from those silos of apps and functions. It's about creating a computing experience that adapts to our intentions, not the other way around.
We're standing on the edge of this shift. LLMs have given us a taste of what’s possible, but the architecture around them still needs to catch up. We finally have new companies rethinking what personal computers could look like, companies that weren’t possible before, like the ones shown below.
This experimentation, I think, is the real future. The dream that started at PARC, making computers even more intuitive and deeper extensions of our unique minds, is fully realizable now. All we have to do is build the right bridges and infrastructure to make it a reality.
Footnotes
These are some of the books and other media about this that I’ve read since 2014:
"Doing with Images Makes Symbols" by Alan Kay
"Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age" by Michael Hiltzik
"The Second Self: Computers and the Human Spirit" by Sherry Turkle
"What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer" by John Markoff
General Magic documentary
Pirates of Silicon Valley Movie
There were many more giants whose work I did not mention, about which I almost feel guilty: Ted Nelson, J.C.R. Licklider, David Liddle, and Engelbart with his [mother of all demos].