Computer Learning Centers, Inc. Business Information, Profile, and History
11350 Random Hills Rd., Ste. 240
Fairfax, Virginia 22030
U.S.A.
Company Perspectives:
Computer Learning Centers grew up with the computer field. We started training technical professionals more than 40 years ago--long before computers became a way of life at home and in business.
We have been at the forefront of technology for a long time, but most importantly, we have developed the most advanced methods for teaching people how to use it. Tens of thousands of our graduates have joined the work force during the past four decades.
CLC is one of the country's oldest and largest school systems devoted solely to the training of computer professionals.
History of Computer Learning Centers, Inc.
Computer Learning Centers, Inc. (CLC) provides information technology and computer-related education and training to adults seeking entry-level jobs in information technology. CLC designs programs and courses to meet current information technology education needs, offering instruction in rapidly growing technologies such as client/server programming, databases, network engineering, information technology support, and computer systems. Through its accredited career programs, CLC offers associate degrees and non-degree diplomas in several primary areas of study, including business applications; electronics, systems and hardware; programming; networking; information technology support; and business applications with networking. CLC's Advantec Institute division offers customized ongoing training programs to corporate clients that focus on current and emerging technologies in information technology.
The Early Years--Acquisitions and Mergers
Located in Fairfax, Virginia, the original Computer Learning Centers was founded in 1967 and taught systems management, data entry, and computer operations to computer center operations personnel. Its first 20 years were ones of repeated acquisitions and mergers. In 1968, it acquired International Tabulation Institutes, a Los Angeles-based training school founded in 1957. In 1970, the company merged with the Washington School for Secretaries. Three years later, MCD Enterprises, a construction company located in Maryland, bought CLC and combined the Learning Center institutions with its own schools. In 1976 Airco, Inc., a business whose primary interests were in welding gases, medical products, and alloy production, acquired CLC from MCD.
By 1983, Airco had increased the number of CLC schools to 25. It was about that time that Airco merged with British Oxygen to form the $3 billion British Oxygen Group, whose core interests were in medical products and industrial gases. Then, in 1987, British Oxygen Group decided to divest six of the schools to Connecticut-based General Atlantic Partners. Reid Bechtle was retained by the new owners of CLC in 1991 to review the business for possible sale. Bechtle instead advised General Atlantic to invest in the fledgling information technology business and became Computer Learning Centers' chief executive officer and president in 1991. Charles L. Cosgrove, who came with Bechtle from Planning Research Corporation, became vice-president and chief financial officer in 1992, while Harry H. Gaines became chairman. By this time, the company also had operations in the United Kingdom, where it was known as Comprehensive Learning Concepts.
Expansion in the Mid-1990s
Bechtle, Cosgrove, and Gaines proved good leaders for CLC in the highly fragmented post-secondary adult education and training market of the 1990s. The market for such programs and services was characterized by rapidly changing requirements, with no single institution or company holding a dominant share. CLC thus had to compete for students not just with other vocational and technical training schools, but also with degree-granting colleges and universities and with continuing education and commercial training programs. CLC met this challenge by designing a series of programs intended to meet the needs of adult learners pursuing information technology careers, offering programs that could be completed in seven to 18 months; flexible class schedules; monthly start dates; and financial aid eligibility for qualified students.
Steady enrollment increases fueled the company's growth throughout the 1990s. In 1995, CLC had about $40 million in annual revenue and eight schools serving about 6,500 students in California, Illinois, Pennsylvania, and Virginia. Yet CLC was still small fry compared with companies such as National Education Corporation, with $241 million in annual sales, and DeVry, with $228 million. (Other competitors included Sylvan Learning Systems, Apollo Group, and ITT Educational Services.) When the company went public on May 31, 1995, it sold 2.2 million shares of common stock and brought in net proceeds of about $14.9 million. This left it relatively debt free, although still with reported losses of $1.3 million from discontinued operations. After that point, CLC grew steadily, proving Chairman Gaines correct when he said in 1995 that the market for information technology training was due to "explode."
By 1996, the Washington Post listed CLC among the paper's top-rated 100 companies, with 1995 revenues of $46.1 million and profits of $1.9 million. The Post reported that CLC continued to increase enrollment, especially in its newer associate degree programs, and had received approval from state regulators. The paper again featured CLC as "one of the area's hottest stock performers" in 1996, and in order to raise capital to open new facilities and expand programs, the company offered 1.3 million shares in its second public offering in October 1996. In 1997, CLC had revenues of $64 million, profits of $5.6 million, and earnings per share of a little more than a dollar. The original eight schools had grown to 19, enrollment was up 32 percent over the previous year, and the company started the Advantec Institute, a program of specialized courses geared to working adults.
By 1998, CLC had 12,000 students, up from about 8,500 in 1997, at its 23 schools in the United States and three in Canada, along with $25 million in new computers and facilities. The company grossed $97 million in the fiscal year ending January 31 and netted $9.6 million. Stock prices reached a record of almost $39 per share in March after adjusting for two stock splits, prompting traders to speculate that the company was due for a correction and leading critics to sell short four million shares of CLC stock. Bechtle, in an act of some bravado, publicly taunted the "shorts," saying, "Every dollar the stock goes up is $4 million [they] take out of their bank accounts," and continued CLC's acquisition spree. The company had acquired Boston Education Corporation, a privately held Boston, Massachusetts provider of information technology education and training serving nearly 650 students, in late December 1997, and, in January 1998, Markerdowne Corporation, a privately held provider of information technology education and training based in Paramus, New Jersey, serving approximately 800 students a year. In March 1998, it closed the deal on Delta College, a privately held, Montreal, Quebec-based information technology training firm serving 800 students.
Legal and Regulatory Problems
Yet, while the company was reveling in its steady growth in enrollment, stock prices, and earnings (up more than 50 percent in its latest fiscal year), trouble was brewing on several CLC campuses. From 1991 through 1993, student default rates on federally financed student loans had reached "unacceptable" levels of 25 percent, according to the U.S. Department of Education, prompting the department to suspend CLC's eligibility for some of its Title IV programs from September 1996 until October 1997, when the company showed a reduced student loan default rate.
In December 1997, 11 students at the company's Alexandria, Virginia campus filed complaints with the Virginia Council of Higher Education that CLC had misrepresented students' future career prospects. In March 1998, the attorney general of the state of Illinois sued CLC in a civil lawsuit and asked the judge to shut down operations on the school's Schaumburg campus, alleging that the company had violated seven provisions of the state's consumer fraud and private vocational school laws by misrepresenting course offerings and employment prospects to students at its Schaumburg school. CLC officials responded to the Illinois attorney general's allegations by signing an agreement which "affirms that there were legitimate grounds for certain students to voice complaints," but they denied any violation of state laws. A few months later, the company agreed to a settlement that entailed creating a program to promptly address student complaints; establishing a program to provide $95,000 worth of software to nonprofit institutions; and contributing $90,000 to the attorney general's consumer education fund. CLC also agreed to hire more faculty, install new computers, and revamp its student recruiting efforts. Finally, the company said that it would hire an independent arbitrator whose role would be to determine whether a student's restitution would be in the form of cash or free classes.
To complicate matters further, the Illinois State Board of Education (BOE) ordered the Schaumburg school to suspend marketing and to stop enrolling students on its campus. Thirty days later, the state lifted the Schaumburg suspension after CLC agreed to change its advertising, admissions, and student complaint procedures, and to tighten faculty qualifications--but not before a two-day sell-off by investors and speculators which slashed the company's share price by more than 49 percent. In the wake of these actions, seven lawsuits were filed in federal courts on behalf of stockholders who had lost money because of the dip in stock prices, accusing CLC executives of violating securities laws and making millions of dollars by short selling stock before it nosedived.
On the heels of the Illinois BOE's action, the U.S. Department of Education launched an investigation of CLC schools. In a public letter, DOE officials notified the company that the department was tightening oversight of federal student aid programs and ordered the schools to produce the names of all students who had received federal aid in the past two years. This move carried potentially serious ramifications, since approximately 75 percent of CLC's revenue came from federal student loans. According to the Washington Post, the Federal Trade Commission, which has jurisdiction over the marketing practices of private career schools, also began gathering information on CLC's testing and recruiting of potential students and the quality of its classes. According to CLC, however, the FTC had never contacted the company about an investigation.
Rebounding from Difficulty
Although these regulatory and legal problems cut CLC's first-quarter 1998 profits by roughly 25 percent, stock prices climbed again in the wake of the settlement, reaching almost $29 in July 1998. The company's rollercoaster ride, however, was not over. That month, a private detective working for shareholders' plaintiff attorneys announced that he had found thousands of pages of discarded student records in a dumpster outside the Virginia campus, allegedly including some of those sought by the DOE. Stock prices plunged once more as a result. They turned upward again after Bechtle dismissed the discarded documents as "waste in the normal operations of our business" and two independent analysts likewise downplayed their importance. Still, the Washington Post reported in late August that the FTC investigation into the marketing of the school's computer courses was ongoing and had been expanded based on allegations that CLC threw out records just before the federal review began.
As the company headed into 1999, it sought to recover from a drop in earnings attributed to a sharp falloff in enrollment at campuses in the Washington and Chicago areas and to the cost of settling the consumer fraud lawsuit filed by the state of Illinois. By August 1998, the Illinois settlement had cost CLC more than $300,000 in penalties and another $500,000 in legal fees and lost student fees. More importantly, it had left CLC struggling to attract students. That month, the company disclosed expected second-quarter profits of only four to five cents a share, rather than the 16 cents that analysts had earlier projected.
CLC predicted that its ability to rebound from its difficulties and meet its future operating and financial goals would depend upon its ability to shed its bad publicity and to implement a successful growth strategy, which included the establishment of new learning centers in new locations; the development of new programs and the enhancement of existing ones; the expansion of the Advantec Institute; the improvement of student outcomes through academic services and job placement assistance; the increased availability of associate degree programs at its various centers; and the acquisition of assets and programs complementary to the company's operations. The company's overall objective at the end of 1998 was to strengthen and expand its position as one of the leading providers of information technology education and training programs for adults in the upcoming years.
Related information about computers
The modern electronic digital computer is the result of a long
series of developments, which started some 5000 years ago with the
abacus. The first mechanical adding device was developed in 1642 by
the French scientist-philosopher Pascal. His ‘arithmetic machine’
was followed by the ‘stepped reckoner’ invented by Leibniz in
1671, which could also perform multiplication, division, and
the evaluation of square roots by a series of stepped additions,
not unlike the methods used in modern digital computers. In 1835,
Charles Babbage formulated his concept of an ‘analytical machine’
which combined arithmetic processes with decisions based on the
results of the computations. This was really the forerunner of the
modern digital computer, in that it combined the principles of
sequential control, branching, looping, and storage units.
In the mid-19th-c, George Boole developed the symbolic binary
logic which led to Boolean algebra and the binary switching
methodology used in modern computers. Herman Hollerith (1860–1929),
a US statistician, developed punched card techniques, mainly to aid
with the 1890 US census; this advanced
the concept of automatic processing, but major developments awaited
the availability of suitable electronic devices. J Presper Eckert
(1919–95) and John W Mauchly (1907–80) produced the first
all-electronic digital computer, ENIAC (Electronic Numerical
Integrator and Calculator), at the University of Pennsylvania in
1946, which was 1000 times faster than the mechanical computers.
Their development of ENIAC led to one of the first commercial
computers, UNIVAC I, in the early 1950s, which was able to handle
both numerical and alphabetical information. Very significant
contributions were made around this time by John von Neumann, who
converted the ENIAC principles to give the EDVAC computer
(Electronic Discrete Variable Automatic Computer) which could
modify its own programs in much the same way as suggested by
Babbage.
The first stored program digital computer to run an actual
program was built at Manchester University, UK, and first performed
successfully in 1948. This computer was later developed into the
Ferranti Mark I computer, which was widely sold. The first digital
computer able to be offered as a service to users, EDSAC, was
developed at Cambridge University, UK, and ran in the spring of
1949. The EDSAC design was used as the basis of the first business
computer system, the Lyons Electronic Office. Advances followed
rapidly from the 1950s, and were further accelerated from the
mid-1960s by the successful development of miniaturization
techniques in the electronics industry. The first microprocessor,
which might be regarded as a computer on a chip, appeared in 1971,
and nowadays the power of even the most modest personal computer
can equal or outstrip the early electronic computers of the 1940s.
The key elements in computing today are miniaturization and
communications. Hand-held computers, with input via a stylus, can
be linked to central systems through a mobile telephone.
A computer is a machine for manipulating data according to a list of instructions known as a program.
Computers are extremely versatile. According to the Church–Turing
thesis, a computer with a certain minimum threshold capability
is in principle capable of performing the tasks of any other
computer. Therefore, computers with capabilities ranging from those
of a personal digital assistant to a supercomputer may all
perform the same tasks, as long as time and memory capacity
are not considerations. Thus the same computer designs may be
adapted for tasks ranging from processing company payrolls to controlling unmanned
spaceflights. Due to technological advancement, modern electronic computers
are exponentially more capable than those of preceding generations
(a phenomenon partially described by Moore's Law).
Computers take numerous physical forms. Early electronic computers
were the size of a large room, while entire modern embedded
computers may be smaller than a deck of playing cards. Even today,
enormous computing facilities still exist for specialized scientific
computation and for the transaction
processing requirements of large organizations. Smaller
computers designed for individual use are called personal computers.
Along with its portable equivalent, the laptop computer, the
personal computer is the ubiquitous information processing and
communication
tool, and is usually what is meant by "a computer". However, the
most common form of computer in use today is the embedded computer.
Embedded computers control machines ranging from fighter aircraft
to industrial robots to digital cameras.
History of computing
Originally, the term "computer" referred to a person who performed
numerical calculations, often with the aid of a mechanical
calculating device or analog computer. Examples of these early devices, the
ancestors of the computer, included the abacus and the Antikythera
mechanism, an ancient Greek device for calculating the movements of
planets which dates from
about 87 BC. The end of the Middle Ages saw a reinvigoration of European mathematics
and engineering, and Wilhelm Schickard's 1623 device was the first of a
number of mechanical calculators constructed by European
engineers.
In 1801, Joseph Marie
Jacquard made an improvement to existing loom designs that used
a series of punched paper cards as a program to weave intricate
patterns. The resulting Jacquard loom is not considered a true computer but it
was an important step in the development of modern digital
computers.
Charles Babbage
was the first to conceptualize and design a fully programmable
computer, as early as 1835, but due to a combination of the limits
of the technology of the time, limited finance, and an inability to
resist tinkering with his design, the device was never actually
constructed in his lifetime. By the end of the 19th century a
number of technologies that would later prove useful in computing
had appeared, such as the punch card and the vacuum tube, and large-scale automated data processing
using punch cards was performed by tabulating machines designed by
Herman
Hollerith.
During the first half of the 20th century, many scientific
computing needs were met by increasingly sophisticated
special-purpose analog computers, which used a direct mechanical or
electrical model of
the problem as a basis for computation (they became increasingly
rare after the development of the programmable digital computer). A
succession of steadily more powerful and flexible computing devices
was constructed in the 1930s and 1940s, gradually adding the key
features of modern computers.
The use of digital electronics was introduced by Claude Shannon in
1937; he came up with the idea while studying the relay circuits of
Vannevar Bush's Differential Analyzer. This point marked the
beginning of binary digital circuit design and the use of logic
gates. Precursors of this idea were Almon Strowger, who patented a
device containing a logic gate switch circuit; Nikola Tesla, who
filed for patents on devices containing logic gate circuits in 1898
(see List of Tesla patents); and Lee De Forest, who in 1907
replaced relays with vacuum tubes.
Defining one point along this road as "the first digital electronic
computer" is exceedingly difficult.
On 12 May 1941, Konrad Zuse completed his electromechanical Z3,
the first working machine featuring automatic binary arithmetic and
feasible programmability (and therefore the first operational
programmable digital computer, although not electronic). Other
notable achievements include the Atanasoff-Berry Computer (shown
working around summer 1941), a special-purpose machine that used
valve-driven (vacuum tube) computation, binary numbers, and
regenerative memory; the Harvard Mark I, a large-scale
electromechanical computer with limited programmability (shown
working around 1944); and ENIAC (1946), which was the first general
purpose electronic computer, but originally had an inflexible
architecture that meant reprogramming it essentially required it to
be rewired.
The team who developed ENIAC, recognizing its flaws, came up with a
far more flexible and elegant design, which has become known as the
Von Neumann
architecture (or "stored program architecture"). The first to
be up and running was the Small-Scale
Experimental Machine, but the EDSAC was perhaps the first practical
version to be developed.
Valve (tube) driven computer designs were in use throughout the
1950s, but were eventually replaced with transistor-based computers,
which were smaller, faster, cheaper, and much more reliable, thus
allowing them to be commercially produced in the 1960s. By the
1970s, the adoption of integrated circuit technology had enabled computers to
be produced at a low enough cost to allow individuals to own
personal
computers.
How computers work: the stored program architecture
[Diagram of a personal computer's major components: display, motherboard, CPU (microprocessor), primary storage (RAM), expansion cards, power supply, optical disc drive, secondary storage (hard disk), keyboard, and mouse.]
While the technologies used in computers have changed dramatically
since the first electronic, general-purpose computers of the 1940s, most
still use the stored program architecture (sometimes called the von
Neumann architecture).
The architecture describes a computer with four main sections: the
arithmetic
and logic unit (ALU), the control circuitry, the memory, and the input
and output devices (collectively termed I/O). These parts are
interconnected by bundles of wires (called "buses" when the same bundle
supports more than one data path) and are usually driven by a timer
or clock (although
other events could drive the control circuitry).
Conceptually, a computer's memory can be viewed as a list of
cells, each of which can hold a small, fixed amount of information.
This information can either be an instruction, telling the computer
what to do, or data, the information which the computer is to
process using the instructions that have been placed in the memory.
In principle, any cell can be used to store either instructions or
data.
The ALU is in many senses the heart of the computer: it is the
unit that carries out the basic arithmetic and logic operations on
data. On a typical personal computer, input devices include the
keyboard and mouse, and output devices include computer monitors,
printers, and the like; but as will be discussed later, a huge
variety of devices can be connected to a computer and serve as I/O
devices.
The control system ties this all together: it fetches instructions
from memory and directs the other units. A key part of the control
system is a counter holding the address of the current instruction;
typically, this is incremented each time an instruction is
executed, unless the instruction itself indicates that the next
instruction should be at some other location (allowing the computer
to repeatedly execute the same instructions).
Since the 1980s the ALU and control unit (collectively called a
central
processing unit or CPU) have typically been located on a single
integrated
circuit called a microprocessor.
The functioning of such a computer is in principle quite
straightforward.
Instructions, like data, are represented within the computer as
binary code. The particular instruction set that a specific
computer supports is known as that computer's machine language.
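To make the memory, control, and machine-language ideas above concrete, here is a minimal sketch of a stored-program machine in Python. Everything in it is invented for illustration: the (opcode, operand) instruction format, the opcode names, and the sample program correspond to no real machine language.

```python
# A toy stored-program machine. Memory is a single list of cells
# holding both instructions (as tuples) and data (as numbers), and a
# program counter steps through the instruction cells one by one.
def run(memory):
    acc = 0   # accumulator register
    pc = 0    # program counter: address of the current instruction
    while True:
        op, addr = memory[pc]
        pc += 1                    # normally move to the next cell...
        if op == "LOAD":           # acc <- contents of cell addr
            acc = memory[addr]
        elif op == "ADD":          # acc <- acc + contents of cell addr
            acc += memory[addr]
        elif op == "STORE":        # cell addr <- acc
            memory[addr] = acc
        elif op == "JUMP":         # ...unless control is redirected
            pc = addr
        elif op == "HALT":
            return acc

# Cells 0-3 hold the program; cells 4 and 5 hold the data.
memory = [
    ("LOAD", 4),    # 0: acc = memory[4]
    ("ADD", 5),     # 1: acc = acc + memory[5]
    ("STORE", 4),   # 2: memory[4] = acc
    ("HALT", 0),    # 3: stop and report the accumulator
    2,              # 4: data
    3,              # 5: data
]
print(run(memory))  # prints 5
```

A real CPU performs the same fetch, increment, and execute cycle, but with instructions and addresses encoded as binary numbers rather than Python tuples.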
More powerful computers such as minicomputers, mainframe computers
and servers may differ from the model above by dividing
their work between more than one main CPU. Multiprocessor and
multicore personal and laptop computers are also
becoming available.
Supercomputers
often have highly unusual architectures significantly different
from the basic stored-program architecture, sometimes featuring
thousands of CPUs, but such designs tend to be useful only for
specialized tasks. At the other end of the size scale, some
microcontrollers
use the Harvard
architecture that ensures that program and data memory are
logically separate.
Digital circuits
The conceptual design above could be implemented using a variety
of different technologies. As previously mentioned, a stored
program computer could be designed entirely of mechanical
components like Babbage's devices or the Digi-Comp I. However,
digital circuits allow Boolean logic and arithmetic using binary
numerals to be implemented using relays (electrically operated
switches) and, later, vacuum tubes, each acting as a switch: when a
control voltage is applied to one of the pins, current can flow
between the other two.
Through arrangements of logic gates, one can build digital
circuits to do more complex tasks: for instance, an adder, which
implements in electronics the same method (in computer terminology,
an algorithm) that children are taught for adding numbers on paper.
Vacuum tubes, however, were bulky and unreliable; by the 1960s they
were replaced by the transistor, a new device which performed the
same task as the tube but was much smaller, faster operating, more
reliable, used much less power, and was far cheaper.
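As a sketch of how such gates compose into the adder mentioned above, here is a one-bit full adder written with Boolean operations in Python; the two-XOR, two-AND, one-OR structure mirrors the standard textbook circuit, and the four-bit adder simply chains it, carrying column by column as in paper addition.

```python
# One-bit full adder built from Boolean "gates": given two input
# bits and a carry-in, produce a sum bit and a carry-out.
def full_adder(a, b, carry_in):
    s1 = a ^ b                           # XOR gate: partial sum
    total = s1 ^ carry_in                # XOR gate: final sum bit
    carry = (a & b) | (s1 & carry_in)    # AND/OR gates: carry-out
    return total, carry

# Chain full adders to add two 4-bit numbers bit by bit,
# just as one adds digits column by column on paper.
def add4(x, y):
    result, carry = 0, 0
    for i in range(4):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result | (carry << 4)

print(add4(0b0101, 0b0011))  # 5 + 3 = 8
```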
In the 1960s and 1970s, the transistor itself was gradually
replaced by the integrated circuit, which placed multiple transistors
(and other components) and the wires connecting them on a single,
solid piece of silicon. By the 1970s, the entire ALU and control
unit, the combination becoming known as a CPU, were being
placed on a single "chip" called a microprocessor. As of 2006, the
Intel Core Duo processor contained 151 million transistors.
Tubes, transistors, and transistors on integrated circuits can be
used as the "storage" component of the stored-program architecture,
using a circuit design known as a flip-flop, and indeed flip-flops
are used for small amounts of very high-speed storage (a small
simulation of such a storage element follows below). Few designs,
however, have used flip-flops for the bulk of their storage;
instead, the earliest computers stored data in Williams tubes or
mercury delay lines, and later machines used magnetic core and then
semiconductor memory.
The results of a computer's processing can either be viewed
directly by a user, or they can be sent to another machine, whose
control has been assigned to the computer: in a robot, for
instance, the controlling computer's major output device is the
robot itself.
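The flip-flop mentioned above can be illustrated with a minimal simulation: the sketch below models an SR latch built from two cross-coupled NOR gates in Python. It is an illustrative model only, not a description of any particular hardware design.

```python
# One bit of storage from two cross-coupled NOR gates (an SR latch).
# s=True sets the stored bit, r=True resets it; with both False the
# latch simply holds its previous value.
def sr_latch(s, r, q=False):
    qb = not q                 # complementary output
    for _ in range(3):         # let the feedback loop settle
        q = not (r or qb)      # NOR gate 1
        qb = not (s or q)      # NOR gate 2
    return q

q = sr_latch(s=True, r=False)         # set: q becomes True
q = sr_latch(s=False, r=False, q=q)   # hold: q stays True
q = sr_latch(s=False, r=True, q=q)    # reset: q becomes False
print(q)  # False
```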
The first generation of computers was equipped with a fairly
limited range of input devices. A punch card reader, or something similar, was used
to enter instructions and data into the computer's memory, and some
kind of printer, usually a modified teletype, was used to record the results. For the
personal computer, for instance, keyboards and mice are the primary ways
people directly enter information into the computer; and monitors are the
primary way in which information from the computer is presented
back to the user, though printers, speakers, and headphones are
common, too. Two other classes of devices deserve mention. The
first class is that of secondary storage devices, such as hard
disks, CD-ROMs, key drives, and the like: comparatively slow but
high-capacity devices where information can be stored for later
retrieval. The second class is that of devices, such as network
adapters, which allow information to be transmitted between
computers.
Programs
Computer programs are simply lists of instructions for the
computer to execute. Computers do not usually carry out a task with
a single sophisticated instruction; rather, they execute millions
of simple instructions arranged by people known as programmers.
In practice, people do not normally write the instructions for
computers directly in machine language. Instead, programmers
describe the desired actions in a "high level" programming
language which is then translated into the machine language
automatically by special computer programs (interpreters
and compilers). The
language chosen for a particular task depends on the nature of the
task, the skill set of the programmers, tool availability and,
often, the requirements of the customers (for instance, projects
for the US military were often required to be in the Ada programming
language).
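As one concrete illustration of this translation, the sketch below uses Python's standard dis module to display the lower-level instructions that CPython's virtual machine executes for a one-line function. CPython compiles to bytecode for a virtual machine rather than to native machine language, but the principle of automatic translation from a high-level description is the same.

```python
import dis

# A "high level" description of the desired action...
def add(a, b):
    return a + b

# ...and the list of simpler, lower-level instructions it is
# compiled into. The exact opcode names (LOAD_FAST, BINARY_ADD or
# BINARY_OP, RETURN_VALUE) vary between Python versions.
dis.dis(add)
```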
Computer
software is an alternative term for computer programs. A
computer
application is a piece of computer software provided to many
computer users, often in a retail environment. The stereotypical
modern example of an application is perhaps the office suite, a set of
interrelated programs for performing common office tasks.
Going from the extremely simple capabilities of a single machine
language instruction to the myriad capabilities of application
programs means that many computer programs are extremely large and
complex. A typical example is Windows XP, created from roughly 40 million lines of computer
code in the C++
programming
language. The discipline of software engineering has attempted,
with some success, to make the process quicker and more productive
and to improve the quality of the end product.
A problem or a model is computational if it is formalized in such
a way that it can be transformed into a computer program. As
programs grew larger, operating systems emerged: supervisory
programs that manage the computer's resources and run other
programs. The classic example of this type of early operating
system was IBM's OS/360. The next major development in operating
systems was timesharing, which allowed several users to share one
machine. Security access controls, allowing computer users access
only to files, directories, and programs they had permission to
use, were also common.
Perhaps the last major addition to the operating system was tools
to provide programs with a standardized graphical user interface.
Modern operating systems also bundle application software: for
instance, Apple's Mac OS X ships with a digital video editor
application.
Operating systems for smaller computers may not provide all of
these functions. The operating systems for early microcomputers with
limited memory and processing capability did not, and embedded
computers typically have specialized operating systems or no
operating system at all, with their custom application programs
performing the tasks that might otherwise be delegated to an
operating system.
Computers have been put to an enormous variety of uses. The ENIAC
was originally designed to calculate ballistic firing tables for
artillery, but it was also
used to calculate neutron cross-sectional densities to help in the
design of the hydrogen
bomb, significantly speeding up its development. (Many of the
most powerful supercomputers available today are also used for
nuclear weapons
simulations.) The
CSIR Mk I, the first Australian stored-program computer, was,
amongst many other tasks, used for the evaluation of rainfall
patterns for the catchment area of the Snowy Mountains Scheme, a
large hydroelectric generation project (The Last of the First:
CSIRAC, Australia's First Computer, Doug McCann and Peter Thorne,
ISBN 0-7340-2024-4). Others were used in cryptanalysis, for example
the first programmable (though not general-purpose) digital
electronic computer, Colossus, built in 1943 during World War II.
The LEO, a stored-program computer built by J. Lyons and Co. in the
United Kingdom, was operational and being used for inventory
management and other purposes three years before IBM built their
first commercial
stored-program computer. In the 1980s, personal computers
became popular for many tasks, including book-keeping, writing and
printing documents, calculating forecasts and other repetitive
mathematical tasks involving spreadsheets.
As computers have become less expensive, they have been used
extensively in the creative arts as well. Sound, still pictures,
and video are now routinely created (through synthesizers, computer graphics and
computer
animation), and near-universally edited by computer. They have
also been used for entertainment, with the video game
becoming a huge industry.
Computers have been used to control mechanical devices since they
became small and cheap enough to do so; indeed, a major spur for
integrated circuit technology was building a computer small enough
to guide the Apollo spacecraft and the Minuteman missile, two of
the first major applications for embedded computers. Industrial
robots have become commonplace in mass production, but
general-purpose human-like robots have not lived up to the promise
of their fictional counterparts and remain either toys or research
projects.
Robotics, indeed, is the physical expression of the field of
artificial
intelligence, a discipline whose exact boundaries are fuzzy but
to some degree involves attempting to give computers capabilities
that they do not currently possess but humans do.
Networking and the Internet
Computers have been used to coordinate information in multiple
locations since the 1950s, with the US military's SAGE
system the first large-scale example of such a system, which led to
a number of special-purpose commercial systems like Sabre.
In the 1970s, computer engineers at research institutions
throughout the US began to link their computers together using
telecommunications technology. This effort was funded by ARPA,
and the computer
network that it produced was called the ARPANET. In the phrase of
John Gage and Bill Joy (of Sun Microsystems), "the
network is the computer". Initially these facilities were available
primarily to people working in high-tech environments, but in the
1990s the spread of applications like e-mail and the World Wide Web, combined with the development of
cheap, fast networking technologies like Ethernet and ADSL, saw
computer networking become almost ubiquitous. A very large
proportion of personal computers
regularly connect to the Internet to communicate and receive information.
"Wireless" networking, often utilizing mobile phone networks, has
meant networking is becoming increasingly ubiquitous even in mobile
computing environments. Therefore, there has been research interest
in some computer models that use biological processes, or the
oddities of quantum
physics, to tackle these types of problems. However, such a
system is limited by the maximum practical mass of DNA that can be
handled.
Quantum
computers, as the name implies, take advantage of the unusual
world of quantum physics.
These alternative models for computation remain research projects
at the present time, and will likely find application only for
those problems where conventional computers are inadequate.
See also Unconventional computing.
Terminology for the different computing professional disciplines is
still somewhat fluid, and new fields emerge from time to time;
however, some of the major groupings are as follows:
- Computer
engineering is the branch of electrical
engineering that focuses both on hardware and software
design, and the interaction between the two.
- Computer science is the traditional name for the academic study
of the processes related to computers and computation, such as
developing efficient algorithms to perform specific classes of
tasks. Experts in many other fields also apply computer technology
to their own domains; one of many examples is specialists in
geographical information systems, who apply computer technology to
problems of managing geographical information.
There are three major professional societies dedicated to
computers: the British Computer Society, the Association for
Computing Machinery, and the IEEE Computer Society.
See also
- Association for Computing Machinery
- The British Computer Society
- IEEE
Computer Society
- Operating
system
- Computer
hardware
- Computability theory
- Computer
datasheet
- Computer
expo
- Computer
science
- Computer types: analog computer, hybrid computer, supercomputer (along with the minisupercomputer),
mainframe computer,
workstation
computers, laptop,
roll-away
computer, embedded computer, cart computer, tablet pc, handheld computer, subnotebook, thin client, minicomputer (and the supermini), microcomputer, computer terminal,
and server
- Computing
- Computers
in fiction
- Computer
music
- Computer
security and Computer insecurity challenges such as: malware, phishing, spam
(electronic), and how to solve them, such as firewall,
computer
security audit
- Digital
- History
of computing
- List of computer term etymologies
- List
of computing topics
- Personal
computer
- Word
processing
- Internet
- Computer
programming
Other computers
- Analog
computer
- Chemical
computer
- DNA
computer
- Human
computer
- Molecular
computer
- Optical
computer
- Quantum
computer
- Wetware
computer