It’s 1969 on this season’s *Mad Men*, and a glass-enclosed, climate-controlled room is being built to house Sterling Cooper’s first computer — a soon-to-be-iconic IBM System/360 — in the space where the copywriters used to meet.

That same year, in an article entitled “Computer Graphics for Decision Making,” IBM engineer Irvin Miller introduced HBR’s readers to a potent new computing technology that was part of the 360 — the interactive graphical display terminal.

Punch cards and tapes were being replaced by virtual data displays on glass-screened teletypes, but those devices still displayed mainly text. Now the convergence of long-standing cathode-ray-tube and light-pen hardware with software that would accept English-language commands was about to create a revolution in data analysis.

Previously, if executives had wanted to investigate, say, the relationship of plant capacity to the cost of production, marginal costs to quantity produced, or marginal revenues to quantities sold, they’d have had to fill out a requisition, wait for a data analyst to run a query through the machine using a computer language like Fortran, and then wait for a written report. That could take months.

But interactive graphics offered the possibility of providing realistic answers quickly and directly. As Miller explains: “With such a console in his office, an executive can call for the curves that he needs on the screen; then, by touching the screen with the light pen, he can order the computer to calculate new values and redraw the graphs, which it does almost instantaneously.”

To read Miller’s tutorial is to return to some first principles that may still be worth bearing in mind, even in today’s world of vastly greater amounts of data and computing power (the largest mainframe Miller refers to has a capacity of two megabytes). The first is his almost offhand initial stipulation that the only factors affecting a business that a computer can process are quantitative ones.

The second is his explanation (or, for us, reminder) of what the computer does when it delivers up the graphs: “To solve business problems requiring executive decisions, one must define the total problem and then assign a mathematical equation to each aspect of the problem. A composite of all equations yields a mathematical model representing the problem confronting the executive.” Miller suggests, as an example, that a system programmed with data on quantities produced and sold, plant capacity, marginal cost, marginal revenues, total cost, total revenue, price, price for renting, and price for selling could enable businesspeople to make informed decisions about whether to hold inventory; expand plant production; rent, buy, or borrow; increase production; and examine the effects of anomalies on demand or the effects of constraints.
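Miller’s “composite of equations” idea translates naturally into modern terms. The sketch below is a hypothetical illustration, not Miller’s actual system: it encodes total cost, total revenue, and profit as simple functions (all numbers and functional forms invented), then answers one what-if question of the kind he describes — whether pushing production beyond plant capacity pays off.

```python
# A minimal what-if model in the spirit of Miller's "composite of equations".
# Every figure and functional form here is invented for illustration.

def total_cost(q, fixed=50_000, unit_cost=12.0, capacity=8_000, overtime_penalty=5.0):
    """Total cost of producing q units; units beyond plant capacity cost extra."""
    over = max(0, q - capacity)
    return fixed + unit_cost * q + overtime_penalty * over

def total_revenue(q, price=25.0):
    """Total revenue from selling q units at a flat price."""
    return price * q

def profit(q, **kwargs):
    """Profit is the composite of the two equations above."""
    return total_revenue(q) - total_cost(q, **kwargs)

# What-if: is it worth producing 2,000 units beyond plant capacity?
print(f"profit at capacity:      {profit(8_000):,.0f}")
print(f"profit 2,000 units over: {profit(10_000):,.0f}")
```

Each what-if amounts to re-evaluating the composite under new inputs — exactly the recomputation the light pen triggered on the 360’s display.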

Even in this simple example it’s easy to see how hard it is to “define the total problem” — how, for instance, decisions might be skewed by the absence of, say, information on interest rates (which in 1969 were on the threshold of skyrocketing to epic proportions) or of any data on competitors, or on substitutes (a concept Michael Porter wouldn’t introduce until 1979).

Miller is hardly oblivious to the dangers (the term “garbage in, garbage out” had been coined in 1963); and in answer to the question of why an executive should rely on the differential calculus and linear programming that underpin the models (interestingly, Miller assumes senior business executives haven’t had calculus), he replies that the point of the equations is only to *“anticipate and verify intuitive guesses which are expected to be forthcoming from the businessman”* [italics original]. In other words, the mathematics is essentially meant to serve as an amplification of the executive’s judgment, not as a substitute for it.

Intuition-support is, in fact, the point for Miller. For him, the real benefit of the new technology isn’t just the ability to perform what-if analyses on current data, as powerful as that is, but that executives could do it in the privacy of their own offices, which would afford them the time for the private reflection from which intuition springs. “The executive needs a quiet method whereby he alone can anticipate, develop, and test the consequences of following various of his intuitive hunches before publicly committing himself to a course of action,” Miller says, before he even begins to explain how the technology works.

Here it’s enlightening to revisit Miller’s estimates of how much time the entire process was supposed to take: a few weeks to construct the model, five minutes to conduct each what-if scenario — and then two full hours for the executive to consider the implications of the answers. In this, HBR’s first examination of data visualization, it is in those two hours of solitary quiet time that the real value of interactive computing lies.