History of the Development of Computers (Silicon Valley, its history, the best companies)

his engineers, whom he called traitors and who are now known as "the Traitorous Eight".

Although Shockley was not very successful with his firm in Palo Alto, he "deserves credit for starting the entrepreneurial chain-reaction that launched the semiconductor industry in Silicon Valley," since he brought together excellent scientists there, such as Robert Noyce, without whom there might never have been a Silicon Valley on the San Francisco Peninsula at all. Or, as M. Malone puts it, "Shockley put the last stone in place in the construction of Silicon Valley."

The father of one of those young men who left Shockley had contacts at a New York investment firm, which sent a young executive named Arthur Rock to secure financing for their new enterprise. Rock asked many companies whether they were interested in backing the project, but at first without success. The concept of investing money in new technology ventures was largely unknown then; indeed, the term "venture capital" itself would not be coined until 1965 - by Arthur Rock, who would later become Silicon Valley's first and most famous venture capitalist.

Finally, due to Rock's efforts, the "Traitorous Eight" managed to obtain

financial support from industrialist Sherman Fairchild to start Fairchild

Semiconductor in 1957.

Fairchild Semiconductor grew directly out of Shockley's firm, and as the "still existing granddaddy of them all" it has itself spawned scores of other companies in Silicon Valley: most semiconductor firms' roots can be traced back to Fairchild. The most famous of them are National Semiconductor, Intel, and Advanced Micro Devices (AMD); and many well-known Valley leaders have worked at Fairchild, e.g. Charlie Sporck (National Semiconductor), Jerry Sanders (AMD's founder), Jean Hoerni, and last but not least Robert Noyce, who is considered the "Mayor of Silicon Valley" due to his overwhelming success.

Robert Noyce was born in southeastern Iowa in 1927. His father was a preacher in the Congregational Church and thus was "perpetually on the move to new congregations, his family in tow." When the Noyces, after many years of moving, decided to stay in the college town of Grinnell, Iowa, for a longer period, this place meant stability in young Bob's life. It became his first and only real home, which he would later regard as important for his eventual success.

After high school, Robert studied at Grinnell College. His physics professor had been in contact with John Bardeen (one of the three inventors of the transistor) and obtained two of the first transistors in 1948, which he presented to his students, including Bob Noyce. This aroused young Robert's interest in semiconductors and transistors and made him try to learn everything he could about this fascinating field of solid-state physics.

Having graduated from Grinnell College, he continued his studies at "the premier school of science on the East Coast, MIT," where he met famous scientists like Shockley. He received his doctorate and worked at Philco until 1955, when he was invited by William Shockley to join a new firm named "Shockley Semiconductor" in Santa Clara County - together with seven other splendid scientists.

When the so-called "Shockley Eight" started a new venture with Fairchild Semiconductor, Robert Noyce began "his own transformation from engineer to business manager": he was chosen to lead the new company because he seemed best suited for the job.

Fairchild Semiconductor focused on building a marketable silicon transistor using a new manufacturing process called "mesa". Despite being the smallest company in the electronics business at the time, it attracted public attention, particularly in 1958, when "Big Blue" - as the dominant IBM is nicknamed - ordered the "first-ever mesa silicon transistors" for memory drivers in its computers.

This order contributed to the early success of Fairchild Semiconductor and marked the beginning of a long relationship between IBM and Silicon Valley.

Importance of military funding

Before turning to the events at Intel, the aspect of military funding deserves attention, since it played an important role in the early days of Silicon Valley.

During World War II, after the Japanese attack on Pearl Harbor in 1941, a great deal of the U.S. military forces and military production was moved to California. Within a few years, California - formerly an agricultural state - became a booming industrial state and the military center of the USA.

After the war - in the era of the Cold War and the arms race, the Korean conflict, the "missile gap" and the space program - the Pentagon kept ordering high-technology products from the armament factories in California. Many companies established R&D departments and production facilities in Santa Clara County near Stanford University, which provided them with bright engineers and scientists; these firms were largely sustained by the Pentagon's demand for electronic high-tech products.

Examples of such firms are FMC, GTE, Varian Associates, Westinghouse, and finally Lockheed, which opened its R&D department in the Stanford Research Park in 1956 and started Lockheed Missiles and Space Company (LMSC) in Sunnyvale. Lockheed's move to Northern California was crucial for the developments in Santa Clara County; today the company is Silicon Valley's largest employer, with more than 24,000 people.

Military funding for high-tech products was responsible for the early growth of Silicon Valley in the 1950s and 1960s. The U.S. Department of Defense was the biggest buyer of these products; its purchases represented about 70 percent of the total production of ICs in 1965. While this share of chip demand has dropped to 8 percent today, the Pentagon remains the biggest supporter of new technologies and accounts for most of the purchases of the latest developments.

Intel Corp.

After the transistor and the integrated circuit, the invention of the

microprocessor in the early 1970s represents the next step towards the

modern way of computing, providing the basis for the subsequent personal

computer revolution.

It was at Intel that the first microprocessor - the key to modern personal computers - was designed. With its logic and memory chips, the company provides the basic components for microcomputers. Intel is regarded as Silicon Valley's flagship and its most successful semiconductor company, owing its worldwide leading role to perpetually high spending on research and development (R&D).

Foundation in 1968

It all started in 1968, when Bob Noyce resigned as head of Fairchild Semiconductor, taking along Gordon Moore and Andy Grove to embark on a new venture. They had decided to leave the company because they wanted "to regain the satisfaction of research and development in a small, growing company," since Fairchild had become big, with lots of bureaucratic work to be done. Gordon Moore had belonged to the famous Shockley Eight and was in charge of the R&D team at Fairchild. Andy Grove, a young Hungarian émigré who had earned a doctorate in chemical engineering at U.C. Berkeley, had joined Fairchild in the early 1960s.

Intel (short for Integrated Electronics), a typical Fairchild spin-off, was financially backed by venture capital from Arthur Rock, who had been in contact with Noyce since 1957. The company was founded upon the idea of integrating many transistors on a chip of silicon, after Noyce had developed a new photochemical process. The three engineers initially focused on building the first semiconductor chips used for computer memory, which were to replace the dominant memory storage technology of the time, "magnetic core". Intel's task was to drive down the cost per bit by dramatically increasing the capacity of memory chips.

First products - Moore's Law

Within a year, Intel developed its first product - the 3101 Schottky bipolar 64-bit static random access memory (SRAM) - which was soon followed by the 1101. This chip was a 256-bit SRAM and had been developed with Intel's new "silicon gate metal-oxide semiconductor (MOS) process," which would become the "industry's process technology of choice." With these first two products, the young company - which had started with 12 employees and net revenues of $2,672 in 1968 - had already gained the technological lead in the field of memory chips.

Intel's first really successful product was the 1103 dynamic random access memory (DRAM), which was manufactured in the MOS process. Introduced in 1970, this chip was the "first merchant market LSI (large-scale integrated) DRAM," and received broad acceptance because it was superior to magnetic core memories. So, by the end of 1971, the 1103 became "the world's largest-selling semiconductor device" and provided the capital for Intel's early growth.

To this day, semiconductors have adhered to "Moore's Law," framed by Gordon Moore, cofounder of Fairchild and Intel, when the first commercial DRAMs appeared in the early 1970s. This law predicts that the price per bit (the smallest unit of memory) drops by 30 percent every year. It implies that you receive 30 percent more power (speed/capacity) at the same price, or that the "price of a certain power is 30 percent less."

Moore's Law applies to both memory chips and microprocessors, and it shows the unprecedentedly rapid progress of microelectronics. Such an "astonishing ratio" had never occurred in "the history of manufacturing" before. Applied to automobiles, it means that "a Cadillac would have a top speed of 500 miles per hour, get two hundred miles to a gallon of gas and cost less than a dollar" - almost incredible.

1971 was a crucial year at Intel. The company's revenues surpassed

operating expenses for the first time, and the company went public, raising

$6.8 million.

Moreover, the company introduced a new memory chip - the first erasable, programmable read-only memory (EPROM). Invented by Intel's Dov Frohman, the new memory could store data permanently like existing ROMs, but in addition it could be erased simply by a beam of ultraviolet light and used again. The EPROM was initially viewed as a "prototyping device" for R&D. The invention of the microprocessor in the same year, however, revealed the real significance of the EPROM, which could be used by original equipment manufacturer (OEM) customers (who build the end products) to store microprocessor programs in a "flexible and low-cost way." The "unexpected synergy" between the EPROM and the microprocessor resulted in a growing market for both chips and contributed a great deal to Intel's early success.

"Ted" Hoff's first microprocessor

The invention of the microprocessor marked a turning point in Intel's history. This development "changed not only the future of the company, but much of the industrial world."

The story of this technological breakthrough began in 1969, when a Japanese calculator manufacturer called Busicom asked Intel to design a set of chips for a family of programmable calculators. Marcian "Ted" Hoff, a young and "very bright ex-Stanford research associate" who had joined Intel as employee number 12, was put in charge of this project. However, he did not like the Japanese design, which called for 12 custom chips, each assigned a distinct task. Hoff thought that designing so many different chips would make the calculators as expensive as minicomputers such as DEC's PDP-8, although they could be used merely for calculation. His idea was to develop a four-chip set with a general-purpose logic device at its center, which could be programmed by instructions stored on a semiconductor memory chip. This was the theory behind the first microprocessor.

With the help of new employee Stan Mazor, Hoff perfected the design of what would become the 4004 arithmetic chip. After Busicom had accepted Hoff's chip set, Federico Faggin, one of the best chip design experts, who had been hired recently, began transforming the design into silicon. The 4004 microprocessor, a 4-bit chip (it processes 4 bits - a string of four ones or zeroes - at a time), contained 2,300 MOS transistors and was as powerful as the legendary first electronic computer, ENIAC.
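
To make the "4-bit" figure concrete: a word of four binary digits can represent 2^4 = 16 distinct values, so the 4004 handled data in very small slices. A minimal illustration (in Python, purely for exposition; it does not reflect the 4004's actual instruction set):

    # A 4-bit word: four binary digits, 2**4 = 16 possible values (0 through 15).
    for value in range(2 ** 4):
        print(f"{value:2d} -> {value:04b}")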

Soon after the first 4004s had been delivered to Busicom, Intel realized the market potential of the chip and successfully renegotiated with the Japanese to regain the exclusive rights, which had been sold to Busicom.

In November 1971, Intel introduced the 4004 to the public in an Electronic News ad. It announced not just a new product, but "a new era of integrated electronics [...], a micro programmable computer on a chip." The microprocessor is - as Gordon Moore calls it - "one of the most revolutionary products in the history of mankind," and it ranks as one of 12 milestones of American technology in a 1982 survey by U.S. News and World Report. This chip is the actual computer itself: it is the central processing unit (CPU) - the computer's brains. The microprocessor made possible the microcomputer, which is "as big as it is only to accommodate us." For "we'd have a hard time getting information into or out of a microprocessor without a keyboard, a printer and a terminal," as Th. Mahon puts it.

However significant Hoff's invention was, it was hardly noticed by the public until early 1973. The microprocessor had its own instruction set and had to be programmed in order to execute specific tasks. So Ted Hoff had to inform the public and the engineers about the capabilities of the new device and how to program it.

Cooperation with IBM in the 1980s

Intel's measures in the late 1970s, taken in reaction to increasing competition from other chip manufacturers, paid off greatly and resulted in a remarkable technological lead over its competitors. The most significant consequence, a landmark in the company's development, was IBM's decision in 1980 to rely on the Intel 8088 microprocessor for its PCs.

IBM (short for International Business Machines) has been the world's leading company in big mainframe computers since the 1950s. Due to its dominance, it was often compared to a giant and referred to as "Big Blue." Surprisingly, it was not before 1981 (the PC revolution had already been under way for a few years) that IBM introduced its own Personal Computer.

Because of IBM's dominance and worldwide reputation, its PCs soon became the industry standard and penetrated the office market: other established computer companies followed and introduced their own PCs - the so-called
