
Graphical User Interfaces: Crash Course Computer Science #26

Jan 14, 2022
Hi, I'm Carrie Anne and welcome to CrashCourse Computer Science! We ended the last episode with the 1984 release of Apple's Macintosh personal computer. It was the first computer that a normal person could buy with a graphical user interface and a mouse to interact with it. This was a radical evolution from the command line interfaces found on all other personal computers of the time. Instead of having to remember... or guess... the correct commands to type, a graphical user interface shows you what functions are possible. You just have to look around the screen to see what you want to do. It is a "point and click" interface. Computers were suddenly much more intuitive. Anyone, not just hobbyists or computer scientists, could figure things out for themselves.

INTRO

The Macintosh is credited with bringing graphical user interfaces, or GUIs, into the mainstream, but they were actually the result of many decades of research. In previous episodes, we discussed some of the first interactive graphics applications, like Sketchpad and Spacewar!, both created in 1962. But these were stand-alone programs, not fully integrated computing experiences. Arguably the true forefather of modern GUIs was Douglas Engelbart. Let's go to the thought bubble!

During World War II, while Engelbart was stationed in the Philippines as a radar operator, he read Vannevar Bush's article on the Memex.

These ideas inspired him, and when he finished his service in the Navy, he went back to school and completed a doctorate in 1955 at U.C. Berkeley. Heavily involved in the emerging computing scene, he compiled his thoughts in a seminal 1962 report, titled "Augmenting Human Intellect." Engelbart believed that "the complexity of the problems facing humanity grows faster than our ability to solve them. So finding ways to augment our intellect would seem to be both a necessary and a desirable goal." He saw that computers could be useful beyond automation, becoming essential interactive tools for future knowledge workers to tackle complex problems.
Inspired by Ivan Sutherland's recently demonstrated Sketchpad, Engelbart set out to make his vision a reality, recruiting a team to build his oN-Line System. He recognized that a keyboard alone was insufficient for the kinds of applications he hoped to enable. In his words: "We envision problem solvers using computer-aided working stations to augment their efforts. They require the ability to interact with displays of information using some sort of device to move around the screen." And in 1964, working with his colleague Bill English, he created the first computer mouse. The cord came out of the bottom of the device and looked very rodent-like, so the team nicknamed it a "mouse", and the nickname stuck.
Thanks, thought bubble! In 1968, Engelbart demonstrated his entire system at the Fall Joint Computer Conference, in what is often referred to as "the mother of all demos." The demo lasted 90 minutes and demonstrated many features of modern computing: bitmap graphics, video conferencing, word processing, and real-time collaborative document editing. There were also precursors to modern GUIs, such as the mouse and multiple windows, although they could not overlap. It was ahead of its time and, like many products with that label, it ultimately failed, at least commercially. But its influence on computing researchers of the day was huge. Engelbart was recognized for this watershed moment in computing with a Turing Award in 1997.
Federal funding began to dwindle in the early 1970s, which we talked about two episodes ago. At that point, many of Engelbart's team, including Bill English, left and went to Xerox's Palo Alto Research Center, better known as Xerox PARC. It was here that the first true GUI computer was developed: the Xerox Alto, completed in 1973. For the computer to be easy to use, it needed more than just fancy graphics. It had to be built around a concept that people were already familiar with, so they could immediately recognize how to use the interface with little to no training. Xerox's response was to treat the 2D screen like the top of a desk...or desktop.
Just like you can have lots of papers laid out on a desk, a user could have several computer programs open at once. Each was contained in its own frame, which offered a view of the application, called a window. Also like papers on a desk, these windows could overlap, blocking the items behind them. And there were desktop accessories, such as a calculator and a clock, that the user could place on the screen and move around. However, it was not an exact copy of a desk; instead, it was a metaphor for a desk. For this reason, not surprisingly, it is called the Desktop Metaphor.
There are many ways to design an interface like this, but the Alto team did it with windows, icons, menus, and a pointer, which is called a WIMP interface. It's what most desktop GUIs use today. It also offered a basic set of widgets, reusable graphical building blocks…things like buttons, checkboxes, sliders, and tabs that were also pulled from real-world objects to make them familiar. GUI applications are built from these widgets, so let's try coding a simple example using this new programming paradigm. First, we need to tell the operating system that we need a new window to be created for our application.
We do this through a GUI API. We need to specify the name of the window and also its size. Let's say 500 by 500 pixels. Now, let's add some widgets: a text box and a button. These require a few parameters to create. First, we need to specify which window they should appear in, because applications can have multiple windows. We also need to specify the default text, the X and Y location in the window, and the width and height. Ok, now we have something that looks like a GUI application, but has no functionality. If you click the "roll" button, nothing happens.
In previous examples we've discussed, the code pretty much runs from top to bottom. GUIs, on the other hand, use what's called event-driven programming; the code can fire at any time and in different orders, in response to events. In this case, these are user-driven events, such as clicking a button, selecting a menu item, or scrolling through a window. Or if a cat runs across your keyboard, that's a bunch of events at once! Let's say that when the user clicks the "roll" button, we want to randomly generate a number between 1 and 20, and then display that value in our text box.
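To make the contrast with top-to-bottom code concrete, here is a minimal sketch of the event-driven idea with no GUI toolkit at all. The handler registry, event names, and function names are all illustrative inventions, not part of any real GUI API:

```python
# A tiny event dispatcher: code doesn't just run top to bottom; instead,
# handler functions are registered and fire whenever their event occurs.
handlers = {}  # maps an event name to the list of functions handling it

def on(event_name, handler):
    """Register a handler function to run when the named event fires."""
    handlers.setdefault(event_name, []).append(handler)

def fire(event_name):
    """Simulate an event occurring, e.g. the user clicking a button."""
    for handler in handlers.get(event_name, []):
        handler()

log = []
on("click", lambda: log.append("button was clicked"))

# Events arrive at unpredictable times and in any order; here we fake two clicks.
fire("click")
fire("click")
print(log)
```

A real GUI toolkit works the same way at heart: an event loop waits for input, then dispatches each event to whatever handlers were registered for it.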
We can write a function that does just that. We can even get a little fancy and say that if we get the number 20, we set the background color of the window to blood red! The last thing we need to do is wire this code up so that it fires every time our button is clicked. To do this, we need to specify that our function "handles" this event for our button, by adding a line to our initialization function. The event type, in this case, is a click event, and our function is the event handler for that event.
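The episode doesn't name a particular toolkit or API, so here is one way the whole walkthrough could look using Python's Tkinter as a stand-in. The widget names, positions, and the `roll_d20` helper are illustrative choices, not the video's actual code:

```python
import random

def roll_d20():
    """Randomly generate a number between 1 and 20, like rolling a d20."""
    return random.randint(1, 20)

def build_gui():
    # Tkinter is imported inside the function so the dice logic above can
    # be used without a display; tk.Tk() needs a windowing system to run.
    import tkinter as tk

    # Ask the OS, through the GUI API, for a new window: a name and a size.
    window = tk.Tk()
    window.title("D20 Roller")
    window.geometry("500x500")

    # Widgets need a parent window, default text, an x/y position, and a size.
    text_box = tk.Entry(window)
    text_box.insert(0, "click roll!")
    text_box.place(x=150, y=200, width=200, height=30)

    def on_roll_clicked():
        # Event handler: runs each time the button's click event fires.
        value = roll_d20()
        text_box.delete(0, tk.END)
        text_box.insert(0, str(value))
        if value == 20:
            window.configure(bg="red")  # blood red on a natural 20!

    # Wire the handler up to the button's click event.
    roll_button = tk.Button(window, text="Roll", command=on_roll_clicked)
    roll_button.place(x=200, y=250, width=100, height=30)

    window.mainloop()  # hand control over to the event loop

# Call build_gui() to open the window and start handling events.
```

Note that after `mainloop()` is called, our code never "runs" again on its own; the toolkit's event loop calls `on_roll_clicked` for us whenever the click event occurs.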
Now we are done. We can click that button all day long, and each time, our "roll d20" function is dispatched and executed. This is exactly what happens behind the scenes when you press the little bold button in a text editor, or select an option from a dropdown: a function tied to that event is triggered. I hope I don't get a 20. Ahhhh! Ok, back to the Xerox Alto! Approximately 2,000 Altos were made, used at Xerox, and given to university laboratories. They were never sold commercially. Instead, the PARC team continued to refine the hardware and software, culminating in the Xerox Star system, released in 1981.
Xerox Star expanded on the desktop metaphor. Files now looked like pieces of paper and could be stored in little folders, all of which could be placed on your desktop or kept in digital filing cabinets. It's a metaphor that sits on top of the underlying file system. From the user's perspective, this is a new level of abstraction! Xerox, being in the business of printing machines, also advanced text and graphics creation tools. For example, they introduced the terms: cut, copy and paste. This metaphor was drawn from the way people went about making edits to documents written on typewriters.
You would literally cut the text out with scissors and then glue it anywhere you wanted on another document. You would then photocopy the page to flatten it back into a single layer, making the edit invisible. Thank goodness for computers! This manual process was made obsolete by the advent of word processing software, which existed on platforms like the Apple II and Commodore PET. But Xerox went well beyond the competition with the idea that everything you do on the computer should look exactly like the real-world version if you printed it out. They called this What-You-See-Is-What-You-Get, or WYSIWYG.
Unfortunately, like Engelbart's oN-Line System, the Xerox Star was ahead of its time. Sales were slow, because it was priced at the equivalent of nearly $200,000 today for an office setup. It also didn't help that the IBM PC was released that same year, followed by a tsunami of cheap "IBM-compatible" PC clones. But the great ideas that PARC researchers had been cultivating and building on for nearly a decade did not go to waste. In December 1979, a year and a half before the Xerox Star shipped, Xerox PARC got a visit from a guy you may have heard of: Steve Jobs. There is a lot of lore surrounding this visit, with many suggesting that Steve Jobs and Apple stole Xerox's ideas.
But that is simply not true. In fact, Xerox approached Apple, hoping to partner with them. Ultimately, Xerox was able to buy a million-dollar stake in Apple ahead of its highly anticipated IPO, but the deal came with an additional provision: that Xerox "disclose all the cool stuff going on at Xerox PARC." Steve knew they had some of the brightest minds in computing, but he wasn't prepared for what he saw. PARC demonstrated its graphical user interface, running on a crisp bitmap display, all controlled with intuitive mouse input. Steve later said: "It was like the veil was lifted from my eyes. I was able to see the future of what computing was meant to be."

Steve returned to Apple with his entourage of engineers, and they went to work inventing new features, like the menu bar, and a trash can to store files that needed to be deleted; it even bulged when it was full. Again with the metaphors! Apple's first product with a graphical user interface and a mouse was the Apple Lisa, released in 1983. It was a super advanced machine with a super advanced price: almost $25,000 in today's money. That made it significantly cheaper than the Xerox Star, but it turned out to be just as much of a flop in the market. Luckily, Apple had another project up its sleeve: the Macintosh, released a year later, in 1984.
It was priced at about $6,000 in today's money, a quarter of the cost of the Lisa. And it hit the mark, selling 70,000 units in the first 100 days. But after the initial craze, sales began to falter, and Apple was soon selling more Apple IIs than Macs. A big problem was that no one was creating software for this new machine with its radical new interface. And it got worse: the competition caught up quickly. Soon other personal computers had graphical user interfaces that were primitive by comparison, but usable, and at a fraction of the cost. Consumers ate it up, as did PC software developers.
With Apple's finances becoming increasingly desperate and tensions rising with new Apple CEO John Sculley, Steve Jobs was forced out. A few months later, Microsoft released Windows 1.0. It may not have been as pretty as Mac OS, but it was the first salvo in what would become a bitter rivalry and near-total dominance of the industry by Microsoft. Within ten years, Microsoft Windows was running on nearly 95% of personal computers. Initially, Mac OS fans could rightfully claim superior graphics and ease of use: those early versions of Windows were built on top of DOS, which was never designed to run GUIs. But after Windows 3.1, Microsoft began developing a new consumer-oriented operating system with an updated GUI, called Windows 95.
This was a significant rewrite that offered much more than polished graphics. It also had advanced features that Mac OS did not, such as program multitasking and protected memory. Windows 95 introduced many GUI elements still seen in versions of Windows today, such as the Start menu, the taskbar, and the Windows Explorer file manager. However, Microsoft was not infallible. Seeking to make the desktop metaphor even easier and friendlier, it worked on a product called Microsoft Bob, which took the idea of using metaphors to an extreme. Now you had a whole virtual room on your screen, with applications represented as objects you could put on tables and shelves.
It even came with a roaring fireplace and a virtual dog to offer help. And see those doors on the sides? Yes, those went to different rooms on your computer where different applications were available. As you may have guessed, it was not a success. This is a great example of how the user interfaces we enjoy today are the product of what is essentially natural selection. Whether you're running Windows, Mac, Linux, or some other desktop GUI, it's almost certainly an evolved version of the WIMP paradigm first introduced in the Xerox Alto. Along the way, many bad ideas were tried and failed.
Everything had to be invented, tested, refined, adopted, or abandoned. GUIs are everywhere these days, and while they're good, they're not always great. No doubt you've experienced design-related frustrations after downloading an app, using someone else's phone, or visiting a website. And for this reason, computer scientists and interface designers continue to work hard to craft computing experiences that are both simpler and more powerful, ultimately working towards Engelbart's vision of augmenting the human intellect. I'll see you next week.
