Okay, I'm just going to admit it and stop being in denial: I'm a bad blogger! There, I feel better now. With that said, I offer the following gem.
How Security Became an Issue
It is interesting to pick up various computer books and see that there is usually a history section that sets the stage for where society is today pertaining to computing and data processing. Unlike histories that tell of times long past, the history of computing typically begins in the 1960s. A lot has happened in a short period of time, and computer security is only now getting its time in the limelight.
Roughly twenty-five years ago, the only computers were mainframes. They were few and far between and used for specialized tasks, usually running large batch jobs, one at a time, and carrying out complex computations. If users were connected to the mainframes, it was through “dumb” terminals that had limited functionality and were totally dependent on the mainframe for their operations and processing environment. This was a closed environment with little threat of security breaches or vulnerabilities being exploited. This does not mean that things were perfect, that security vulnerabilities did not exist, or that people were living in a computing utopia. Rather, it meant that a handful of people working in a “glass house” knew how to operate the computer, and they decided who could access the mainframe and when. Because of its simplicity, this provided a much more secure environment than what we see in today’s distributed and interconnected world.
In the days of mainframes, web sites describing the steps to break into a specific application or operating system did not exist. The network stacks and protocols in use were understood by very few people, compared with the vast number of people who understand them today. Point-and-click utilities that can overflow buffers or interrogate ports did not exist. This was a truly closed environment that only a select few understood.
If networks were connected, it was done in a crude fashion for specific tasks, and corporations did not depend on data processing the way they do today. The operating systems of that era had problems, software bugs, and vulnerabilities, but not many people were interested in taking advantage of them. Computer operators worked at the command line, and if they encountered a software problem, they usually just went in and manually changed the programming code. All of this was not that long ago, considering where we are today.
As companies became more dependent on the computing power of mainframes, the functionality of the systems grew and various applications were developed. It was clear that giving employees only small time slices of access to the mainframes was not as productive as it could be. Processing and computing power was brought closer to the employees, enabling them to run small jobs on their desktop computers while the big jobs still took place within the “glass house.” This trend continued and individual computers became more independent and autonomous, only needing to access the mainframe for specific functionality.
As individual personal computers became more capable, they continually took on more tasks and responsibilities. It became clear that having many users access a single mainframe was inefficient, and that some major components needed to be more readily available so users could perform their tasks efficiently and effectively. This thinking led to the birth of the client/server model. Although many individual personal computers had the processing power to run their own calculations and perform their own logic operations, it did not make sense for each computer to hold information needed by all the other computers. Thus, programs and data were centralized on servers, with individual computers accessing them when necessary and accessing the mainframes less frequently.