1 Curie Court
Rockville, Maryland 20850
U.S.A.
History of Computer Data Systems, Inc.
Computer Data Systems, Inc. does 90 percent of its computer service business with federal and state governments, primarily in setting up and operating data processing systems for its client agencies and selling its own prepackaged software to them. Computer Data Systems (CDSI) grew from humble beginnings in 1968 with four employees to become one of the nation's top 25 government contractors in 1995 with a staff of 3,400 at 22 locations across the United States. In the 1990s, the company ranked among Forbes magazine's 200 best small U.S. companies for four years in a row.
In providing computer systems, products, and expertise to its clients, CDSI contracts for the establishment and day-to-day operation of data processing systems. Projects range from developing a prototype online fingerprint identification system for the U.S. Immigration and Naturalization Service, to implementing and operating the Federal Direct Student Loan Program for the U.S. Department of Education, to developing a computer system for the state of Georgia that responds to 25,000 calls a month from travelers seeking information.
In an industry where the people change almost as fast as the technology, most of CDSI's respected senior management has been with the company for more than ten years, and its top executives have been at CDSI for decades. Clifford M. Kendall, one of the four CDSI founders in 1968 and the company's first vice-president of finance, served as president and chief executive officer for 20 years before moving to chairman of the board. Gordon S. Glenn, his successor as president and CEO, joined the company back in 1971 as a computer programmer for a U.S. Navy contract and worked his way up to the company's top operating position. Glenn took pride in the continuity of senior management. "We like to get them, work with them and promote from within," he told Washington Technology in 1995. By that year, all 25 of the company's officers had been promoted from within its ranks.
In the brutally competitive government contracting arena, CDSI achieved strong revenue and earnings growth over the years by sticking close to its target markets and leveraging its expertise in financial and management computer systems from one project to another.
CDSI was founded and incorporated in 1968, in Rockville, Maryland, just outside of Washington, D.C. One year after its founding, it became the first Washington, D.C.-area company to be listed on the National Stock Exchange. In 1970, the young firm opened its first regional office in Florida and formed a subsidiary, Computer Data Systems International, Ltd., to support its new clients in Western Europe. The following year, CDSI developed automatic information management systems for the state of Florida, the U.S. Navy, and the National Science Foundation.
By 1971, the staff had grown to 100 and profits exceeded $9,000. And the following year, with new clients that included the U.S. Environmental Protection Agency, the U.S. Department of Health and Human Services, and several labor unions, revenues were up 27 percent and profits tenfold, to $98,000.
New clients in 1973 included the World Bank and the U.S. Department of Agriculture. Revenues reached $2.3 million and the company formed another subsidiary, the National Institute for Public Services, Inc., to focus on the information-processing requirements of credit unions. The next year, revenues were up by 40 percent and the project backlog exceeded $3.8 million. In 1975, CDSI acquired Electronic Composition, Inc., an automated typesetting and photocomposition firm.
The company paid its first cash dividend in 1976, as its business volume increased for the eighth consecutive year, and it established a full-service corporate data center. In 1977 CDSI acquired Forlines and Associates, a firm specializing in financial systems, and its software products that included the Builders Information System, the Law Firm Accounting System and the Financial Accounting and Reporting System (FARS) which became one of CDSI's mainstay software offerings in the years ahead. The company celebrated its tenth anniversary in 1978 with 100 clients and a nine percent rise in revenues.
CDSI began its second decade by adding two mainframe computers to its corporate data center, as revenues again increased to more than $8 million. In 1980, the company tailored its FARS accounting software so that it could be used by federal government agencies and recorded its best year in its history, with revenues up 71 percent to $14.8 million.
CDSI won a three-year, $40-million contract with the General Services Administration (GSA) in 1983 and formed Computer Data Systems Sales, Inc., to compete in the expanding turnkey systems marketplace. Its Debt Management and Collection System software was implemented for the Department of Housing and Urban Development's Title 1 program. In its 15th consecutive year of revenue increases and 13th of profitability, CDSI had 1,500 employees supporting 200 clients at 29 sites around the world.
In 1984, CDSI completed construction of a new corporate headquarters in Rockville. Other major projects included the operation of a 70,000-square-foot fulfillment warehouse for the Federal Emergency Management Agency. The next year, CDSI signed two new GSA contracts with a total value of $54 million over four years. In 1986, it licensed its FARS accounting software to the Interstate Conference of Employment Security Agencies. In 1987, it won a $22.8-million Navy project and an $11-million project for the Department of the Interior, both lasting three years.
During its early years, CDSI focused primarily on the professional services side of its business: providing technological expertise and specialized software to its clients, mostly in the area of financial management and accounting. In 1987, the fast-growing company set up a new division to pursue larger projects in which it would integrate its own software and services with hardware and software from other vendors.
The company marked its 20th anniversary in 1988. That year it acquired Group Operations, Inc., in a deal that added a suite of software "productivity tools" to CDSI's offerings. The so-called tools, actually specialized software for analyzing and writing computer programs, were used to re-engineer and restructure old programs, making them more efficient and easier to keep current.
The year 1989 was a blockbuster for CDSI. Contract awards totaled $500 million, paced by a $158-million, five-year contract to support the Department of Energy's Office of Information Technology Services and Operations. Company revenues were up by 59 percent to more than $105 million.
In 1990, CDSI demonstrated its capability to handle large, multidisciplined projects with the addition of the Defense Department's civilian medical claims processing system, which would grow in three years to encompass 55 separate computer systems running on a network that linked six regional data centers which processed more than 18 million health claims a year. Another contract win involved work for the U.S. Naval Weapons Center, and the company's Transportation Management System was deployed during Operation Desert Storm in the Persian Gulf.
By 1991, CDSI had become GSA's largest information services contractor, and its financial and management offerings supported 20 federal and 24 state government agencies. That year, the company centralized its sales and marketing efforts, previously handled by senior managers in each of its specialized areas, into a single business development group.
CDSI celebrated its 25th anniversary in 1993, its best year yet with record revenues of $180.9 million, up 27.7 percent, and net income of $5.5 million, an increase of 56.8 percent. More than 3,600 CDSI employees supported 185 contracts in 42 states.
In December of that year, CDSI won its largest contract ever, a $376 million project to handle the data processing for the Education Department's Federal Direct Student Loan program. Although the profitability of the contract got off to a slow start because of up-front investments in hardware, it began to improve as the number of schools participating in the loan program headed upwards from 105 to a projected 1,500. And as the volume of student loans increased, so did CDSI's revenue and profit from the program.
Other new business in 1993 included contracts with the Justice and Housing and Urban Development departments totaling $28.1 million. And the company sustained its excellent record in recompeting for its contracts that reopened to bidding as their terms expired, winning awards from the Defense, Justice, Transportation and Army departments.
As the company moved into its second quarter century, it entered a pivotal period in the evolution of its technology, its marketplace and its business. In terms of technology, the large, expensive mainframe and midrange computers that had dominated government data processing operations since the 1960s were being challenged by networks of low-cost personal computers that could perform many of the same tasks. The software for mainframe computers, in which federal automation contractors like CDSI had invested so much effort and money over the years, had to be adapted to run on the new "client/server" local area networks of personal computers. The new PC networks were very attractive to government agencies because they cost less than the big mainframes and were easier to use and maintain.
The government contracting marketplace of the early 1990s, meantime, was in turmoil. Defense expenditures flattened in the post-Cold War period as the armed services downsized ranks, triggering consolidations in the defense aerospace industry. And political, fiscal and downsizing pressures constrained spending by civilian agencies, as well.
CDSI sustained some short-term business setbacks itself in 1994, losing its bids to continue servicing three contracts that it originally had won in 1988 when its competitors in the new bidding cut their profit margins to wrest the projects away. But the company regarded these as the normal ups and downs of the government contracting business, and CDSI's net income for 1994 was a record $7.73 million, an increase of 40.3 percent over the previous year on revenue of $205.9 million.
Despite cost pressures on government, the outlook for the federal automation industry was healthy in the early 1990s, according to computer industry analyst William Loomis, who followed CDSI. "If the government is going to cut back employees, the thinking is that they'll need more computers to increase productivity," he told Warfield's Business Record, noting that "If so, growth in that area will continue, even if the government does downsize."
Changing times provide profitable opportunities for businesses able to exploit them, and in the early 1990s CDSI began positioning itself to capitalize on the downsizing trends in computer technology and government by aggressively investing for the future.
The company revamped its proprietary FARS financial software, which was originally developed in the 1980s for big IBM mainframe computers, to run on the client/server networks of the 1990s that typically mixed hardware from different manufacturers. The new version was designed to be portable between different brands of hardware and easily tailored to different computing environments.
It upgraded its corporate data processing facility to increase its capacity to handle the processing work from clients. It established internal research and development organizations, called "centers for excellence" to focus on its core technologies of financial management, networking, quality, software development methodology and imaging technology.
Like other government contractors, CDSI sought to broaden its services to other markets, but the federal government continued to be its bread-and-butter business. "We want to expand in the commercial state and local markets," Glenn told Washington Technology in 1995, adding, "we don't want to rely on the federal government too much, but it still will be the biggest player in information technology."
Principal Subsidiaries: Computer Data Systems Sales, Inc.
Related information about Computer
The modern electronic digital computer is the result of a long
series of developments, which started some 5000 years ago with the
abacus. The first mechanical adding device was developed in 1642 by
the French scientist-philosopher Pascal. His ‘arithmetic machine’
was followed by the ‘stepped reckoner’ invented by Leibniz in
1671, which was capable of also doing multiplication, division, and
the evaluation of square roots by a series of stepped additions,
not unlike the methods used in modern digital computers. In 1835,
Charles Babbage formulated his concept of an ‘analytical machine’
which combined arithmetic processes with decisions based on the
results of the computations. This was really the forerunner of the
modern digital computer, in that it combined the principles of
sequential control, branching, looping, and storage units.
In the later 19th-c, George Boole developed the symbolic binary
logic which led to Boolean algebra and the binary switching
methodology used in modern computers. Herman Hollerith (1860–1929),
a US statistician, developed punched card techniques, mainly to aid
with the US census at the beginning of the 20th-c; this advanced
the concept of automatic processing, but major developments awaited
the availability of suitable electronic devices. J Presper Eckert
(1919–95) and John W Mauchly (1907–80) produced the first
all-electronic digital computer, ENIAC (Electronic Numerical
Integrator and Calculator), at the University of Pennsylvania in
1946, which was 1000 times faster than the mechanical computers.
Their development of ENIAC led to one of the first commercial
computers, UNIVAC I, in the early 1950s, which was able to handle
both numerical and alphabetical information. Very significant
contributions were made around this time by Johann von Neumann, who
converted the ENIAC principles to give the EDVAC computer
(Electronic Discrete Variable Automatic Computer) which could
modify its own programs in much the same way as suggested by
Babbage.
The first stored program digital computer to run an actual
program was built at Manchester University, UK, and first performed
successfully in 1948. This computer was later developed into the
Ferranti Mark I computer, widely sold. The first digital computer
(EDSAC) to be able to be offered as a service to users was
developed at Cambridge University, UK, and ran in the spring of
1949. The EDSAC design was used as the basis of the first business
computer system, the Lyons Electronic Office. Advances followed
rapidly from the 1950s, and were further accelerated from the
mid-1960s by the successful development of miniaturization
techniques in the electronics industry. The first microprocessor,
which might be regarded as a computer on a chip, appeared in 1971,
and nowadays the power of even the most modest personal computer
can equal or outstrip the early electronic computers of the 1940s.
The key elements in computing today are miniaturization and
communications. Hand-held computers, with input via a stylus, can
be linked to central systems through a mobile telephone.
A computer is a machine for manipulating data according to a list of instructions known as a program.
Computers are extremely versatile. According to the Church-Turing
thesis, a computer with a certain minimum threshold capability
is in principle capable of performing the tasks of any other
computer. Therefore, computers with capabilities ranging from those
of a personal digital assistant to a supercomputer may all
perform the same tasks, as long as time and memory capacity
are not considerations. Therefore, the same computer designs may be
adapted for tasks ranging from processing company payrolls to controlling unmanned
spaceflights. Due to technological advancement, modern electronic computers
are exponentially more capable than those of preceding generations
(a phenomenon partially described by Moore's Law).
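To make that growth rate concrete, the following back-of-the-envelope sketch (in Python) projects a transistor count forward under the commonly cited assumption of a doubling roughly every two years; the starting figure and time spans are illustrative assumptions, not numbers taken from this article.

# Back-of-the-envelope sketch of exponential growth in transistor counts,
# assuming (illustratively) a doubling period of about two years.
def transistors(start_count: int, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count forward under a fixed doubling period."""
    return start_count * 2 ** (years / doubling_period)

if __name__ == "__main__":
    start = 2_300            # roughly the scale of an early-1970s microprocessor
    for elapsed in (10, 20, 30):
        print(f"after {elapsed} years: about {transistors(start, elapsed):,.0f} transistors")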
Computers take numerous physical forms. Early electronic computers
were the size of a large room, while entire modern embedded
computers may be smaller than a deck of playing cards. Even today,
enormous computing facilities still exist for specialized scientific
computation and for the transaction
processing requirements of large organizations. Smaller
computers designed for individual use are called personal computers.
Along with its portable equivalent, the laptop computer, the
personal computer is the ubiquitous information processing and
communication
tool, and is usually what is meant by "a computer". However, the
most common form of computer in use today is the embedded computer.
Embedded computers may control machines ranging from fighter aircraft
to industrial robots to digital cameras.
History of computing
Originally, the term "computer" referred to a person who performed
numerical calculations, often with the aid of a mechanical
calculating device or analog computer. Examples of these early devices, the
ancestors of the computer, included the abacus and the Antikythera
mechanism, an ancient Greek device for calculating the movements of
planets which dates from
about 87 BC. The end of the Middle Ages saw a reinvigoration of European mathematics
and engineering, and Wilhelm Schickard's 1623 device was the first of a
number of mechanical calculators constructed by European
engineers.
In 1801, Joseph Marie
Jacquard made an improvement to existing loom designs that used
a series of punched paper cards as a program to weave intricate
patterns. The resulting Jacquard loom is not considered a true computer but it
was an important step in the development of modern digital
computers.
Charles Babbage
was the first to conceptualize and design a fully programmable
computer as early as 1820, but due to a combination of the limits
of the technology of the time, limited finance, and an inability to
resist tinkering with his design, the device was never actually
constructed in his lifetime. By the end of the 19th century a
number of technologies that would later prove useful in computing
had appeared, such as the punch card and the vacuum tube, and large-scale automated data processing
using punch cards was performed by tabulating machines designed by
Herman Hollerith.
During the first half of the 20th century, many scientific
computing needs were met by increasingly sophisticated
special-purpose analog computers, which used a direct mechanical or
electrical model of
the problem as a basis for computation (they became increasingly
rare after the development of the programmable digital computer). A
succession of steadily more powerful and flexible computing devices
were constructed in the 1930s and 1940s, gradually adding the key
features of modern computers.
The use of digital electronics was introduced by Claude Shannon in
1937 (Shannon, Claude Elwood, 1940). He came up with the idea while
studying the relay circuits of Vannevar Bush's Differential Analyzer
(see the biography of Claude Elwood Shannon at
scienceworld.wolfram.com/biography/Shannon.html, retrieved September 26, 2006).
This point marked the
beginning of binary digital circuit design and the use of logic gates. Precursors of
this idea were Almon Strowger, who patented a device containing a
logic gate switch circuit; Nikola Tesla, who filed for patents on
devices containing logic gate circuits in 1898 (see List of Tesla
patents); and Lee De Forest, who in 1907 replaced relays with vacuum tubes.
Defining one point along this road as "the first digital electronic
computer" is exceedingly difficult.
On 12 May 1941, Konrad Zuse completed his electromechanical Z3, the
first working machine featuring automatic binary arithmetic and
feasible programmability (and therefore the first operational
programmable digital computer, although not an electronic one). Other
notable achievements include the Atanasoff-Berry Computer (shown
working around summer 1941), a special-purpose machine that used
valve-driven (vacuum tube) computation, binary numbers, and
regenerative memory; the Harvard Mark I, a large-scale
electromechanical computer with limited programmability (shown
working around 1944); and the ENIAC of 1946, which was the first
general-purpose electronic computer, but originally had an inflexible
architecture that meant reprogramming it essentially required it to
be rewired.
The team who developed ENIAC, recognizing its flaws, came up with a
far more flexible and elegant design, which has become known as the
Von Neumann
architecture (or "stored program architecture"). The first machine of
this design to be up and running was the Small-Scale Experimental
Machine, but the EDSAC was perhaps the first practical version to be
developed.
Valve (tube) driven computer designs were in use throughout the
1950s, but were eventually replaced with transistor-based computers,
which were smaller, faster, cheaper, and much more reliable, thus
allowing them to be commercially produced in the 1960s. By the
1970s, the adoption of integrated circuit technology had enabled computers to
be produced at a low enough cost to allow individuals to own
personal
computers.
How computers work: the stored program architecture
(Components of a typical personal computer: display, motherboard, CPU (microprocessor), primary storage (RAM), expansion cards, power supply, optical disc drive, secondary storage (hard disk), keyboard, and mouse.)
While the technologies used in computers have changed dramatically
since the first electronic, general-purpose computers of the 1940s, most
still use the stored program architecture (sometimes called the von
Neumann architecture).
The architecture describes a computer with four main sections: the
arithmetic
and logic unit (ALU), the control circuitry, the memory, and the input
and output devices (collectively termed I/O). These parts are
interconnected by bundles of wires (called "buses" when the same bundle
supports more than one data path) and are usually driven by a timer
or clock (although
other events could drive the control circuitry).
Conceptually, a computer's memory can be viewed as a list of cells.
Each cell has a numbered "address" and can store a small, fixed
amount of information. This information can either be an instruction,
telling the computer what to do, or data, the information which the
computer is to process using the instructions that have been placed
in the memory. In principle, any cell can be used to store either
instructions or data.
The ALU is in many senses the heart of the computer. It carries out
two classes of basic operations: arithmetic, such as adding or
subtracting two numbers, and logic, such as comparing two numbers or
testing whether a value is zero. The input and output devices are the
computer's means of receiving information from, and reporting results
to, the outside world. On a typical personal computer, input devices
include objects like the keyboard and mouse, and output devices
include computer monitors, printers and the like, but as will be
discussed later a huge variety of devices can be connected to a
computer and serve as I/O devices.
The control system ties this all together. It reads instructions and
data from memory, directs the ALU to carry out the operations they
call for, and keeps track of its place in the program by means of a
counter that holds the address of the current instruction. Typically,
this counter is incremented each time an instruction is executed,
unless the instruction itself indicates that the next instruction
should be at some other location (allowing the computer to repeatedly
execute the same instructions).
Since the 1980s the ALU and control unit (collectively called a
central
processing unit or CPU) have typically been located on a single
integrated
circuit called a microprocessor.
The functioning of such a computer is in principle quite
straightforward: on each cycle the machine fetches an instruction
from memory, carries it out, and moves on to the next. Instructions,
like data, are represented within the computer as binary code; the
particular instruction set that a specific computer supports is known
as that computer's machine language.
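As a rough illustration of this fetch-and-execute cycle, here is a minimal toy stored-program machine sketched in Python. The opcodes, memory layout, and single accumulator register are invented for this example only and do not correspond to any real machine language.

# A toy stored-program machine, sketched in Python for illustration only.
# Memory is a single list of numbered cells; each cell holds either an
# instruction (a tuple) or a data value. The opcodes below are invented
# for this example and do not belong to any real instruction set.
def run(memory):
    acc = 0        # a single accumulator standing in for the ALU's working value
    pc = 0         # program counter: address of the next instruction to execute
    while True:
        op, arg = memory[pc]          # fetch
        pc += 1                       # by default, step to the next cell
        if op == "LOAD":              # decode and execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "JUMP_IF_POS":     # branching: override the default increment
            if acc > 0:
                pc = arg
        elif op == "HALT":
            return memory

# Program: add the data in cells 5 and 6, store the result in cell 7.
memory = [
    ("LOAD", 5),          # cell 0
    ("ADD", 6),           # cell 1
    ("STORE", 7),         # cell 2
    ("HALT", None),       # cell 3
    ("HALT", None),       # cell 4 (unused)
    2,                    # cell 5: data
    3,                    # cell 6: data
    0,                    # cell 7: result goes here
]
print(run(memory)[7])     # prints 5

The point of the sketch is only that instructions and data share the same memory, and that the program counter normally advances by one cell unless a branch instruction overrides it.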
More powerful computers such as minicomputers, mainframe computers
and servers may differ from the model above by dividing
their work between more than one main CPU. Multiprocessor and
multicore personal and laptop computers are also
beginning to become available.
Supercomputers
often have highly unusual architectures significantly different
from the basic stored-program architecture, sometimes featuring
thousands of CPUs, but such designs tend to be useful only for
specialized tasks. At the other end of the size scale, some
microcontrollers
use the Harvard
architecture that ensures that program and data memory are
logically separate.
Digital circuits
The conceptual design above could be implemented using a variety of
different technologies. As previously mentioned, a stored program
computer could be built entirely of mechanical components like
Babbage's devices or the Digi-Comp I. However, digital circuits allow
Boolean logic and arithmetic using binary numerals to be implemented
with relays, which are essentially electrically controlled switches:
when electricity is applied to the control pin, current can flow
between the other two.
Through arrangements of logic gates, one can build digital circuits
to do more complex tasks: for instance, an adder, which implements in
electronics the same carry-based method of addition that children are
taught (in computer terminology, an algorithm). Relays, however, are
slow and wear out, and the vacuum tubes that replaced them were
bulky, power-hungry, and failure-prone. Therefore, by the 1960s the
tube was replaced by the transistor, a new device which performed the
same task as the tube but was much smaller, faster operating, more
reliable, used much less power, and was far cheaper.
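Returning to the adder mentioned above, the sketch below simulates logic gates and a ripple-carry adder in Python; the gates are modelled with Boolean operators purely for illustration, standing in for the relays, tubes, or transistors a real circuit would use.

# Simulating logic gates and a ripple-carry adder in Python.
# In hardware these gates would be built from relays, vacuum tubes,
# or transistors as described above.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    """Add three single bits; return (sum_bit, carry_out)."""
    s1 = XOR(a, b)
    sum_bit = XOR(s1, carry_in)
    carry_out = OR(AND(a, b), AND(s1, carry_in))
    return sum_bit, carry_out

def add_bits(x_bits, y_bits):
    """Add two equal-length bit lists (least significant bit first)."""
    carry = 0
    result = []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    return result + [carry]

# 6 (110) + 3 (011), least significant bit first: prints [1, 0, 0, 1], i.e. 9
print(add_bits([0, 1, 1], [1, 1, 0]))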
In the 1960s and 1970s, the transistor itself was gradually replaced
by the integrated circuit, which placed multiple transistors (and
other components) and the wires connecting them on a single, solid
piece of silicon. By the 1970s, the entire ALU and control unit, the
combination becoming known as a CPU, were being placed on a single
"chip" called a microprocessor. The number of transistors per chip
has grown enormously since; as of 2006, the Intel Core Duo processor
contained 151 million transistors.
Tubes, transistors, and transistors on integrated circuits can be
used as the "storage" component of the stored-program architecture,
using a circuit design known as a flip-flop, and indeed flip-flops
are used for small amounts of very high-speed storage. Few designs,
however, use flip-flops for bulk storage: the earliest computers
stored data in Williams tubes and delay lines, later machines used
magnetic core memory, and modern computers use large arrays of
semiconductor RAM.
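As a simplified illustration of how a flip-flop holds a bit, the following Python sketch models an SR (set-reset) latch built from two cross-coupled NOR gates; a real circuit settles through continuous electrical feedback and has timing constraints that this toy model ignores.

# A simplified model of an SR (set-reset) latch built from two cross-coupled
# NOR gates, the basic idea behind a flip-flop. Real circuits settle via
# continuous feedback; here we simply iterate until the outputs stabilise.
def NOR(a, b):
    return 0 if (a or b) else 1

def sr_latch(set_in, reset_in, q=0, q_bar=1):
    """Return the stable (q, q_bar) outputs for the given inputs."""
    for _ in range(4):                 # a few passes are enough to settle here
        q_new = NOR(reset_in, q_bar)
        q_bar_new = NOR(set_in, q_new)
        q, q_bar = q_new, q_bar_new
    return q, q_bar

q, q_bar = sr_latch(set_in=1, reset_in=0)                     # "set": store a 1
print(q)                                                      # 1
q, q_bar = sr_latch(set_in=0, reset_in=0, q=q, q_bar=q_bar)   # inputs removed: the bit is held
print(q)                                                      # still 1
q, q_bar = sr_latch(set_in=0, reset_in=1, q=q, q_bar=q_bar)   # "reset": store a 0
print(q)                                                      # 0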
A computer's output devices present the results of its computations.
These results can either be viewed directly by a user, or they can be
sent to another machine, whose control has been assigned to the
computer: in a robot, for instance, the controlling computer's major
output device is the robot itself.
The first generation of computers was equipped with a fairly limited
range of input devices. A punch card reader, or something similar,
was used to enter instructions and data into the computer's memory,
and some kind of printer, usually a modified teletype, was used to
record the results. For the personal computer, for instance,
keyboards and mice are the primary ways people directly enter
information into the computer, and monitors are the primary way in
which information from the computer is presented back to the user,
though printers, speakers, and headphones are common too. Beyond
these everyday devices, two further classes of I/O device deserve
mention. The first class is that of secondary storage devices, such
as hard disks, CD-ROMs, key drives and the like, which are
comparatively slow but offer high capacity and allow information to
be stored for later retrieval; the second class comprises devices,
such as modems and network cards, that connect computers to one
another for networking, which is discussed further below.
Programs
Computer programs are simply lists of instructions for the computer
to execute. These may range from just a few instructions to many
millions, and a computer does not generally carry out sophisticated
instructions directly. Rather, it executes millions of simple
instructions arranged by people known as programmers.
In practice, people do not normally write the instructions for
computers directly in machine language. Instead, programmers
describe the desired actions in a "high level" programming
language which is then translated into the machine language
automatically by special computer programs (interpreters
and compilers). The
language chosen for a particular task depends on the nature of the
task, the skill set of the programmers, tool availability and,
often, the requirements of the customers (for instance, projects
for the US military were often required to be in the Ada programming
language).
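As a small, concrete illustration of the gap between a high-level statement and the simpler steps a machine actually performs, Python's standard dis module can display the bytecode its compiler produces for a one-line function. Bytecode is an intermediate form run by a virtual machine rather than true machine language, but the flavour, many small instructions per statement, is similar (the exact instruction names vary between Python versions).

# Show how one high-level line expands into several simple instructions.
import dis

def average(a, b):
    return (a + b) / 2

dis.dis(average)   # prints instructions such as LOAD_FAST, BINARY_OP (or BINARY_ADD), RETURN_VALUE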
Computer
software is an alternative term for computer programs. A
computer
application is a piece of computer software provided to many
computer users, often in a retail environment. The stereotypical
modern example of an application is perhaps the office suite, a set of
interrelated programs for performing common office tasks.
Going from the extremely simple capabilities of a single machine
language instruction to the myriad capabilities of application
programs means that many computer programs are extremely large and
complex. A typical example is Windows XP, created from roughly 40 million lines of computer
code in the C++
programming
language (Tanenbaum, Andrew S.); the discipline of software
engineering has attempted, with some success, to make the
process quicker and more productive and improve the quality of the
end product.
A problem or a model is computational if it can be formalized in such
a way that it can be transformed into a computer program.
Operating systems, the system software that manages the running of
programs and the sharing of a machine's resources among its users,
emerged as computers grew in scale; the classic example of this type
of early operating system was OS/360 by IBM. The next major
development in operating systems was timesharing, which allowed many
users to run programs on the same machine apparently simultaneously.
Security access controls, allowing computer users access only to
files, directories and programs they had permission to use, also
became common.
Perhaps the last major addition to the operating system was tools to
provide programs with a standardized graphical user interface. Modern
operating systems also commonly bundle application software with the
system itself; for instance, Apple's Mac OS X ships with a digital
video editor application.
Operating systems for smaller computers may not provide all of these
functions. The operating systems for early microcomputers with
limited memory and processing capability did not, and embedded
computers typically have specialized operating systems or no
operating system at all, with their custom application programs
performing the tasks that might otherwise be delegated to an
operating system.
Applications of computers
The ENIAC was originally designed to calculate ballistics firing
tables for artillery, but it was also used to calculate neutron
cross-sectional densities to help in the design of the hydrogen bomb,
significantly speeding up its development. (Many of the
most powerful supercomputers available today are also used for
nuclear weapons
simulations.) The
CSIR Mk I, the first Australian stored-program computer, was, amongst
many other tasks, used to evaluate rainfall patterns for the
catchment area of the Snowy Mountains Scheme, a large hydroelectric
generation project (The Last of the First: CSIRAC: Australia's First
Computer, Doug McCann and Peter Thorne, ISBN 0-7340-2024-4). Others
were used in cryptanalysis; for example, Colossus, the first
programmable (though not general-purpose) digital electronic
computer, was built in 1943 during World War II. The LEO, a
stored-program computer built by J. Lyons and Co. in the United
Kingdom, was operational and being used for inventory management and
other purposes three years before IBM built its first commercial
stored-program computer. In the 1980s, personal computers
became popular for many tasks, including book-keeping, writing and
printing documents, calculating forecasts and other repetitive
mathematical tasks involving spreadsheets.
As computers have become less expensive, they have been used
extensively in the creative arts as well. Sound, still pictures,
and video are now routinely created (through synthesizers, computer graphics and
computer
animation), and near-universally edited by computer. They have
also been used for entertainment, with the video game
becoming a huge industry.
Computers have been used to control mechanical devices since they
became small and cheap enough to do so; indeed, a major spur for
integrated circuit technology was building a computer small enough
to guide the Apollo missions, among the first major applications for
embedded computers. Industrial robots have become commonplace in mass
production, but
general-purpose human-like robots have not lived up to the promise
of their fictional counterparts and remain either toys or research
projects.
Robotics, indeed, is the physical expression of the field of
artificial
intelligence, a discipline whose exact boundaries are fuzzy but
to some degree involves attempting to give computers capabilities
that they do not currently possess but humans do.
Networking and the Internet
Computers have been used to coordinate information in multiple
locations since the 1950s, with the US military's SAGE
system the first large-scale example of such a system, which led to
a number of special-purpose commercial systems like Sabre.
In the 1970s, computer engineers at research institutions
throughout the US began to link their computers together using
telecommunications technology. This effort was funded by ARPA,
and the computer
network that it produced was called the ARPANET. The technologies
that made the ARPANET possible spread and evolved, and the network
grew into the Internet. In the phrase of John Gage and Bill Joy (of
Sun Microsystems), "the network is the computer". Initially these
facilities were available
primarily to people working in high-tech environments, but in the
1990s the spread of applications like e-mail and the World Wide Web, combined with the development of
cheap, fast networking technologies like Ethernet and ADSL saw computer networking become ubiquitous almost
everywhere. A very large proportion of personal computers
regularly connect to the Internet to communicate and receive information.
"Wireless" networking, often utilizing mobile phone networks, has
meant networking is becoming increasingly ubiquitous even in mobile
computing environments. Therefore, there has been research interest
in some computer models that use biological processes, or the
oddities of quantum
physics, to tackle these types of problems. However, such a
system is limited by the maximum practical mass of DNA that can be
handled.
Quantum
computers, as the name implies, take advantage of the unusual
world of quantum physics.
These alternative models for computation remain research projects
at the present time, and will likely find application only for
those problems where conventional computers are inadequate.
See also Unconventional computing.
Computing professions and disciplines
Terminology for the different professional disciplines involved with
computers is still somewhat fluid, and new fields emerge from time to
time; however, some of the major groupings are as follows:
- Computer engineering is the branch of electrical engineering that focuses both on hardware and software design, and on the interaction between the two.
- Computer science is the traditional name for the academic study of the processes related to computers and computation, such as developing efficient algorithms to perform specific classes of tasks.
Beyond these, a large body of practitioners apply computing technology to particular domains; one of many examples is experts in geographical information systems, who apply computer technology to problems of managing geographical information.
There are three major professional societies dedicated to computers:
the British Computer Society, the Association for Computing
Machinery, and the IEEE Computer Society.
See also
- Association for Computing Machinery
- The British Computer Society
- IEEE Computer Society
- Operating system
- Computer hardware
- Computability theory
- Computer datasheet
- Computer expo
- Computer science
- Computer types: analog computer, hybrid computer, supercomputer (along with the minisupercomputer), mainframe computer, workstation computers, laptop, roll-away computer, embedded computer, cart computer, tablet pc, handheld computer, subnotebook, thin client, minicomputer (and the supermini), microcomputer, computer terminal, and server
- Computing
- Computers in fiction
- Computer music
- Computer security and Computer insecurity challenges such as malware, phishing, spam (electronic), and how to solve them, such as firewall and computer security audit
- Digital
- History of computing
- List of computer term etymologies
- List of computing topics
- Personal computer
- Word processing
- Internet
- Computer programming
Other computers
- Analog computer
- Chemical computer
- DNA computer
- Human computer
- Molecular computer
- Optical computer
- Quantum computer
- Wetware computer
This web site and its associated pages are not associated with, endorsed by, or sponsored by Computer Data Systems, Inc., and have no official or unofficial affiliation with Computer Data Systems, Inc.