Why the Apple II Didn’t Support Lowercase Letters

Tuesday, September 8th, 2020

1977 Apple II Advertisement

[Editor’s Note: I recently asked Steve Wozniak via email about why the original Apple II did not support lowercase letters. I could have guessed the answer, but it’s always good to hear the reason straight from the source. Woz’s response was so long and detailed that I asked him if I could publish the whole thing on VC&G. He said yes, so here we are. –Benj]

----------

In the early 1970s, I was very poor, living paycheck to paycheck. While I worked at HP, any spare change went into the digital projects I did on my own in my apartment. I was an excellent typist, proficient at typing by touch on keypunches with unusual and awkward special characters, even ones that required two fingers of one hand.

I saw a friend typing on a teletype connected to the six computers on the early ARPAnet. I had to have this power over distant computers too. After building many arcade games on computers, how to build it was obvious to me instantly. I’d create a video generator (as with the arcade games) and display text using a character generator chip. But I needed a keyboard.

I’d show up at HP every morning around 6 AM to peruse engineering magazines and journals to see what new chips and products were coming. I found an offer for a $60 keyboard modeled after the upper-case-only ASR-33 teletype.

That $60 for the keyboard is probably like $500 today [About $333 adjusted for inflation — Benj]. This $60 was the single biggest price obstacle in the entire development of the early Apple computers. I had to gulp just to come up with $60, and I think my apartment rental check bounced that month — they put me on cash payment from then on. Other keyboards you could buy back then cost around $200, which might be $1000 or more now. There just wasn’t any mass manufacturing of digital keyboards in 1974.

So my TV Terminal, for accessing the ARPAnet, was uppercase only.

----------

The idea for my own computer came into my head the first day of the Homebrew Computer Club.

Maybe a year prior, I had looked at the 4-bit Intel 4004 microprocessor and determined that it could never be used to build the computer I wanted for myself — based on all the minicomputers that I’d designed on paper and desired since 1968-1970. But at the Homebrew Computer Club, they were talking about the 8008 and 8080 microprocessors, which I had not kept up with after my 4004 disappointment. I took home a data sheet for the 8008, based on a version of it from a Canadian company. That night, I discovered that this entire processor was capable of being a computer.

I already had my input and output, my TV Terminal. With that terminal, I’d type to a computer in Boston, for example, and that far-away computer, on the ARPAnet, would type back to my TV. I now saw that all I had to do was connect the microprocessor, with 4K of RAM (I’d built my tiny computer with the capability of the Altair, 5 years prior, in 1970, with my own TTL chips as the processor). 4K was the amount of RAM allowing you to type in a program on a human keyboard and run it.

My computer wasn’t designed from the ground up. I just added the 6502 microprocessor and 4K DRAMs (introduced that summer of 1975 and far less costly than Intel static RAMs) to have a complete computer with input and output.

So the uppercase keyboard was not designed as part of a computer. It already existed as my TV Terminal.

I truly would have wanted lower case on a keyboard, but I was still totally cash-strapped, with no spare money. Having already started a BASIC interpreter for my computer, I would have had to re-assemble all my code. But here again, I did not have the money for an account on a timeshare service with a 6502 assembler. The BASIC was handwritten and hand-assembled: I’d write the source code and then write out, by hand, the binary that an assembler would have turned my code into. To implement a major change like lower case (using 6 bits per character in my syntax table instead of 5 bits) would have been a horrendous and risky job to do by hand. If I’d had a time-share assembler, it would have been quick and easy. Hence, the Apple I wound up with uppercase only.
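To make the 5-bit versus 6-bit trade-off concrete, here is a minimal Python sketch. It is purely illustrative and assumes nothing about the actual layout of Woz’s syntax table beyond the bit widths he mentions: an uppercase-only alphabet fits within a 5-bit code, while adding lowercase forces a sixth bit.

```python
import string

def bits_needed(symbol_count: int) -> int:
    # Smallest n such that 2**n >= symbol_count
    return max(1, (symbol_count - 1).bit_length())

# 26 uppercase letters fit in 5 bits (2**5 = 32 codes, with a few
# codes to spare); adding the 26 lowercase letters makes 52 symbols,
# which no longer fits in 5 bits and requires 6 (2**6 = 64 codes).
upper_only = set(string.ascii_uppercase)               # 26 symbols
mixed_case = upper_only | set(string.ascii_lowercase)  # 52 symbols

print(bits_needed(len(upper_only)))  # 5
print(bits_needed(len(mixed_case)))  # 6
```

Because the code was hand-assembled, widening every table entry by one bit would have rippled through hand-computed addresses and offsets throughout the binary, which is why the change was so risky without a real assembler.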

I discussed the alternatives with Steve Jobs. I was for lower case, but not enough to pay for it. Steve had little computer experience, and he said that uppercase was just fine. We both had our own reasons for not changing it before the computers were out. Even with the later Apple II (as with the Apple I), the code was again hand-written and hand-assembled because I had no money. All 8 KB of code in the Apple II was written entirely by my own hand, including the binary object code. That made it impossible to add lower case into it easily.

So, in the end, the basic reason for no lowercase on the Apple I and Apple II was my own lack of money. Zero checking. Zero savings.

Larry Tesler (1945-2020)

Wednesday, February 19th, 2020

In Memoriam: Lawrence G. “Larry” Tesler (1945-2020),
inventor of Copy/Paste at Xerox PARC, member of Apple Lisa team,
human-computer interaction expert

Tesler was a giant in the field of human-computer interaction, having pioneered modeless interfaces at Xerox PARC and carried those ideas over to Apple as part of the Lisa team. While at PARC, he and Timothy Mott created a text editor called Gypsy that included the first implementation of the now-common Copy and Paste features for moving blocks of information easily within a document. According to Robert Scoble, Tesler was also on the committee at Apple that decided to re-hire Steve Jobs in the mid-1990s. He will be missed.

VC&G Anthology Interview: Trip Hawkins on 30 Years of Electronic Arts (2012)

Monday, November 9th, 2015

Trip Hawkins Interview on EDGE-online.com
10 DAYS OF VINTAGE: Day 8

[ This interview I conducted was originally published on Edge.com in June 2012 to roughly coincide with Electronic Arts’ 30th Anniversary. Since then, the interview has disappeared from the web. A few people have asked me to make it available again, and since I retained the rights to the interview, I am free to publish it on VC&G for everyone to enjoy. ]

Originally Published on Edge.com in June 2012:

Electronic Arts is 30 years old, and there is no denying that the behemoth game publisher casts a long shadow of influence over the entire industry. The company, founded in May 1982, pioneered a business model that treated game designers like rock stars and software publishers like record labels. It pushed the use of big names and big licenses in sports (think Madden NFL) and soon grew to gobble up many renowned development studios to become a massive entertainment conglomerate.

These days, that conglomerate catches lots of flak from gamers on various issues, including employee treatment, content milking, premature server termination, and more. Whether or not those criticisms have any merit, there is no denying that Electronic Arts was once revered as a top corporate impresario for identifying and cultivating the world’s best game design talent (although one would have to admit that time was very long ago).

The man behind the early, creatively-rich image of EA is Trip Hawkins, an Apple veteran who founded the company with a simple dream: to bring his sports simulations to life. Hawkins, now 58, left EA in 1991 to start The 3DO Company, which folded in 2003. He then launched mobile game developer Digital Chocolate that same year. Just recently, Hawkins announced he was stepping down as CEO of Digital Chocolate to face an as-yet unrevealed future.

In late May of this year [2012 — Ed.], on the occasion of EA’s 30th anniversary, I spoke with Hawkins over the telephone and via email about the creation of Electronic Arts, the design of its early games, and, at some length, about the negative criticism the company tends to attract today. Along the way, we touched on the personal source of his creative spirit and his heady days as a close friend of Apple co-founder Steve Jobs.

[ Continue reading VC&G Anthology Interview: Trip Hawkins on 30 Years of Electronic Arts (2012) » ]

Benj’s Recent Macworld Adventures

Monday, November 26th, 2012

Macworld Logo

As long-time readers of VC&G know, I usually post short entries about my non-blog writing activities on this blog so you can enjoy them.

Recently, I’ve been so engrossed in writing Macworld articles that I have neglected to mention them. Consider that remedied with this handy digest of pieces I’ve written over the past two months for said Mac-related publication. Conveniently, they all have history angles to them (or else I wouldn’t list them here):

There’s more on the way, so stay tuned to see whether I neglect to mention those here as well. The excitement is palpable!

Trip Hawkins Interview: 30 Years of Electronic Arts

Friday, June 29th, 2012

Trip Hawkins Interview on EDGE-online.com

Electronic Arts turned 30 on May 28th, and I thought it would be a good opportunity to check in with its founder, Trip Hawkins, on how he feels about Electronic Arts today. It’s no secret that EA, while a massively successful company, takes a lot of heat from gamers on a number of issues (see this Retro Scan and its comments for more on that).

In an interview published at Edge Online, Hawkins and I spoke at length about Electronic Arts, including the founding of EA, finding early EA developers, his time at Apple, his friendship with Steve Jobs, and yes, how he feels about Electronic Arts today.

The resulting interview was so long that Edge decided to split it into five parts. It just published the last part today, so I thought I’d collect all the links here so you can read it.

06/25/2012 Trip Hawkins: The inspiration for EA
06/26/2012 Trip Hawkins on Apple and Steve Jobs
06/27/2012 Trip Hawkins: Founding Electronic Arts
06/28/2012 Trip Hawkins: The EA Days
06/29/2012 Trip Hawkins on the EA of today

Interestingly, there has been no mention of the company’s 30th anniversary from Electronic Arts itself. Its staff was probably too busy revising its own history to notice.

Macintosh II 25th Anniversary

Friday, June 8th, 2012

Macintosh II 25th Anniversary at Macworld

25 years ago this March (1987), Apple released the Macintosh II, the first open architecture Macintosh. Naturally, I’ve written a short feature about this pioneering machine over at Macworld.

While speaking with Michael Dhuey, the Apple engineer who conceived the Mac II, I learned that Apple patterned the Mac II after the 1977 Apple II, which sported the same sort of flexibility and expandability as the Mac II. That self-referential influence amazed me — especially coming from a company that had recently institutionalized the practice of ignoring its own history.

But only two years after Steve Jobs resigned from Apple, the company had no problem making the un-Jobs move of both looking backward and opening up the Macintosh. The result changed the course of Macintosh history.

[ Continue reading Macintosh II 25th Anniversary » ]

The Beleaguered IBM PC in History

Friday, August 12th, 2011

The IBM PC 5150

From the 1990s until very recently, the press has been generally unkind to the achievements of the first IBM PC. Due to the PC platform’s utter dominance of the personal computer market, popular accounts of personal computer history commonly paint IBM as the slow, lumbering, clueless enemy while cheering on spunky underdogs like Apple. I’m not even going to cite specific examples: Google “computer history.” Read. You will see it.

But that perspective is not fair at all. IBM truly pulled off something smart, savvy, and remarkable in designing the IBM PC 5150 (and the machines that followed it, into the PS/2 era). With the 5150, a team of 12 people took the machine from concept to shipping product in less than a year. And yet many focus on how IBM supposedly lost its way.

Much ballyhoo has been made, for example, about how IBM lost its grip on the PC’s direction as clones flooded the market. From a different perspective, that runaway-freight-train-of-a-platform is a success story for IBM.

While Big Blue lost market share to clone manufacturers, you have to keep in mind that IBM’s percentage shrank as the market size exploded. IBM fostered a rich PC standard that it kept reaping until it sold its PC division to Lenovo in 2004. IBM may not have kept steering the ship, but they sure made a lot of money in the cargo hold.

And if you think IBM’s influence on the PC standard ended in the early 1980s, think again. Real history is not so cut-and-dried. The PS/2 era (which dawned in 1987) gave us stalwarts like the PS/2 mouse and keyboard ports and, ah yes, that minor display technology called VGA. You can also thank the 1990s ThinkPad line for its part in streamlining the modern laptop.

Apple vs. IBM

The popular narrative of IBM vs. Apple in the 1980s, with its strong contrasts of Good vs. Evil and Hero vs. Villain, was largely a creation of Apple’s marketing department. The image of Apple’s David versus IBM’s Goliath got repeated so many times that the press started using the supposed rivalry as the basis of dramatic stories. Humans need narratives to make sense of history, and writers have forced the story of the PC market into that archetypal mold.

Sure, IBM and Apple competed for dollars — and they may have even done it vigorously — but business is business. It’s not swashbuckling. The first thing you learn when actually studying computer history (i.e. interviewing folks) is that just about no one involved in creating these products thinks they were doing something so incredibly amazing that it should be turned into a movie. They were just doing their jobs, developing good products, and trying to make money like everyone else. When the project was over, they moved on to other things. That story is incredibly boring if you don’t dramatize it.

By using the IBM PC for a week for a recent article, I learned firsthand that the original PC really was an amazing machine for its time. It wasn’t just a generic box that happened to have an IBM logo on it, as some people argue. Sure, it didn’t have flashy graphics or a GUI, but it was solid, reliable, well-designed, and it was definitely the most qualified personal computer for getting work done in 1981. There is a reason it became a standard, after all — everyone imitated it, and they imitated it because it was amazing.

What Computer Nerds Should Be Thankful For

Wednesday, November 22nd, 2006

Tomorrow is Thanksgiving in the United States, which means we cook a lot, eat a lot, sleep a lot, feel uncomfortable around somewhat estranged relatives a lot, prepare to spend a lot, officially start Christmas a lot, and generally take it all for granted, despite the title of the holiday. In order to break with American tradition, I thought I’d offer a personal list of things that I think we — vintage computer and video game enthusiasts — should be thankful for. After all, these things let us enjoy our hobbies. Without them, we’d be collecting dirt and not even know what it’s called. Pay attention, my friends, as we start off serious-ish and degrade into something resembling silliness — but it’s all in the name of holiday fun.

[ Continue reading What Computer Nerds Should Be Thankful For » ]