The impact of personal computers on our culture was profound, and not just because anyone could have a powerful computer in their home — raw computing power was nothing new. The real revolutionary changes came from the innovators who advanced computer accessibility, making machines usable for people without a computer science degree.
Larry Tesler, who passed away on Monday, may not be a household name like Bill Gates or Steve Jobs, but his work to make computers and mobile devices more accessible is just one of many serious contributions his career made to the world of modern computing.
Larry Tesler co-invented "cut," "copy," and "paste"
Larry Tesler was born in 1945 in New York and later studied computer science at Stanford University. After graduating, he worked in artificial intelligence research (before it became controversial) and was active in the anti-war and anti-corporate-monopoly movements that targeted major companies like IBM.
In 1973, Tesler landed a job at Xerox's Palo Alto Research Center (PARC), where he stayed until 1980. Xerox PARC is renowned for creating the mouse-driven graphical user interface we have known and loved for decades. While there, Tesler worked with Tim Mott to create a word processor called Gypsy, which coined the everyday terms "cut," "copy," and "paste" — used today as commands for removing, repositioning, or duplicating snippets of text.
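The semantics behind those three commands can be illustrated with a toy clipboard model. This is a hypothetical sketch written for this article, not code from Gypsy or any real editor:

```python
# Toy sketch of the cut/copy/paste clipboard model (hypothetical example).

def cut(text, start, end):
    """Remove the selection text[start:end]; return (remaining text, clipboard)."""
    return text[:start] + text[end:], text[start:end]

def copy(text, start, end):
    """Leave the text untouched; return the selected span as the clipboard."""
    return text[start:end]

def paste(text, pos, clipboard):
    """Insert the clipboard contents at position pos."""
    return text[:pos] + clipboard + text[pos:]

doc = "hello world"
doc, clip = cut(doc, 5, 11)   # cut " world": doc is now "hello"
doc = paste(doc, 0, clip)     # paste at the front: doc is now " worldhello"
```

The key idea — select once, then remove, keep, or re-insert via a shared clipboard — is exactly what made these commands feel natural across different applications.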
Xerox PARC is also remembered for failing to capitalize on the revolutionary personal computing research it completed. In 1980 Tesler moved to Apple Computer, where he stayed until 1997. Over those years, he held numerous positions at the company, including Vice President of AppleNet (an in-house local area networking system that was later canceled), and even served as Chief Scientist of Apple, a position once held by Steve Wozniak, before eventually leaving the company.
Beyond these major contributions, Tesler also made waves in software and interface accessibility. In addition to the "cut," "copy," and "paste" terminology, Tesler pushed for an approach to UI design called modeless computing, an idea he championed on his personal website. In a nutshell, it ensures that user actions remain consistent across the disparate functions and apps of an operating system. Once a user opens a word processor, for example, they can simply assume that pressing any alphanumeric key will make that character appear on-screen, wherever the cursor is located.
This might sound obvious in 2020, but there was a time when word processors had to be switched between separate modes, and typing on the keyboard could either insert characters into a document or trigger commands, depending on the active mode.
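The difference between the two designs can be sketched in a toy example. This is a hypothetical illustration written for this article, not code from any real editor:

```python
# Toy sketch contrasting modal and modeless keystroke handling
# (hypothetical example).

class ModalEditor:
    """Keys mean different things depending on the active mode (vi-style)."""

    def __init__(self):
        self.mode = "command"  # start in command mode
        self.text = []

    def press(self, key):
        if self.mode == "command":
            if key == "i":            # 'i' switches to insert mode
                self.mode = "insert"
            elif key == "x":          # 'x' deletes the last character
                if self.text:
                    self.text.pop()
            # any other key is silently treated as an (unknown) command
        else:                         # insert mode
            if key == "ESC":          # ESC returns to command mode
                self.mode = "command"
            else:
                self.text.append(key)

class ModelessEditor:
    """Tesler's ideal: a keystroke always does the same thing."""

    def __init__(self):
        self.text = []

    def press(self, key):
        self.text.append(key)  # an alphanumeric key always inserts itself

modal = ModalEditor()
modal.press("h")   # ignored: "h" is not a command in this toy editor
modal.press("i")   # enters insert mode, inserts nothing
modal.press("h")
modal.press("i")   # modal.text is now ["h", "i"]

modeless = ModelessEditor()
modeless.press("h")
modeless.press("i")  # modeless.text is now ["h", "i"]
```

In the modal editor, what happens when you press "h" depends entirely on hidden state; in the modeless editor, the same keystroke always produces the same result — the consistency Tesler campaigned for.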
Tesler's dynamic career in the world of modern computing
Today, we still find many software applications where tools and functionality change depending on the mode they're in. These include apps like Photoshop, where, for example, selecting a different tool changes what the same mouse actions do. However, most modern operating systems — including Apple's macOS and Microsoft's Windows — have thoroughly embraced user-friendly interfaces built on simpler, modeless systems.
After Tesler left Apple in 1997, he co-founded a new company called Stagecast Software, which created applications that made programming concepts more accessible to children. He joined Amazon in 2001, where he eventually rose to the rank of Vice President of Shopping Experience. In 2005 he moved to Yahoo, where he led the company's user experience and design group, and in 2008 he became a product fellow at 23andMe. Tesler's CV says he parted ways with 23andMe in 2009 and mostly stuck to consulting in his final years.
Tesler's contributions to modern computing are vast. His work at Xerox and Apple spawned so many innovations that much of it will likely remain forever unknown to the world. Regardless, his visionary approach to modern computing is one of the essential reasons computers made the move from bulky research centers into our daily lives.