More notes on the usual stuff, plus URL's and books.

The people at FindMail have set up a new Web-based archive of RRE
messages.  You can find the link to it through the RRE home page.

Although the FindMail archive has some nice features, neither it nor
the Utopia archive is complete.  You may have to search them both.

If you have established a mail filter that diverts RRE messages into
a separate folder for later viewing, can you please check to make sure
that the filter diverts messages "To: rre" and not "From: Phil Agre"?
Messages that I send to people who happen to be RRE subscribers are
sometimes not read for weeks or months because they get sidetracked by
a mail filter.
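If it helps, here is a toy sketch in Python of the kind of rule I mean
-- route on the "To:" header, not the "From:" line.  The folder names
and the list address are my own invented examples, not the real thing:

```python
# Hypothetical filtering rule: divert mail addressed To: the list,
# not mail From: a particular person.  Address and folders invented.
from email.parser import Parser

def choose_folder(raw_message: str) -> str:
    msg = Parser().parsestr(raw_message, headersonly=True)
    recipients = (msg.get("To") or "") + " " + (msg.get("Cc") or "")
    if "rre" in recipients.lower():   # real filters should match the full list address
        return "RRE"                  # list traffic, fine to read later
    return "INBOX"                    # personal mail, even if From: Phil Agre

print(choose_folder("To: rre@lists.example.edu\nFrom: Phil Agre\n\nhi"))
print(choose_folder("To: you@example.com\nFrom: Phil Agre\n\nhi"))
```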

Some people pointed out that last week's message advertising MIS Review
nowhere explained what MIS is.  It stands for Management Information
Systems, i.e., the people who run the computers in large organizations,
as well as the academics who study and advise them.  Is it an accident
that they assumed that the whole world knows what MIS means?  Your call.

More about David Isenberg, whose Stupid Networks newsletter I sent out
to the list a few weeks ago.  David was one of the small furry animals
at AT&T -- an old-time telephone guy who nonetheless saw and mapped
the evolutionary path forward during the reign of the dreaded-yet-doomed
Allenosaurus.  He got less than no thanks for this, and he is now out
on his own.  His "stupid networks" paper, long circulating in samizdat
form and now seeing the light of day, is an instant classic.  It is
the most straightforward explanation I've seen of the reasons why the
phone companies ought to fear the Internet.  And I'm glad it's out,
because I got a lot of absolute heck when I mentioned the issue here
a while back.  One person, for example, told me that I must be unaware
that the phone system has long been digital.  Now I can direct the
doubters to David's paper.

The bottom line is that the Internet distinguishes different service
layers in a parsimonious way, so that each layer can be applied in the
widest possible variety of contexts.  Clean functional differentiation
among service layers, however, means that simple data transport now
threatens to become a commodity business -- a disaster for the phone
companies, which are addicted to the extra revenue that they obtain
from the voice-specific "intelligence" that they wire into the phone
system.  In any case, the Wall Street Journal wrote up David's article
recently.  The best reference I have is Gautam Naik, "Internet Threat:
Will Technology Trip Up the Telecom Titans?", The Wall Street Journal
Europe's Convergence Supplement 3(4), Winter 1997 (11/17/1997), but
it may also be available on the Journal's subscription Web site, to
which I don't subscribe.

Perhaps this is a commonplace, but I see a rough analogy between
the phone companies and Apple Computer.  Telecommunications networks
and operating systems both exhibit two important economic phenomena:
network effects -- the benefit of using a given system increases
as other people use it -- and economies of scale -- the unit cost of
the software can decrease rapidly as more people use it.  This combination
dramatically favors whichever system has the most users, and the best
strategy somehow combines a steady flow of capital to finance software
upgrades with a steady increase in the size of the network -- that is,
in the number of people who are using the system.
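To make the arithmetic concrete, here is a toy model with invented
numbers: per-user value rises with the size of the network (a
Metcalfe-style assumption), while per-user cost falls as the fixed
software investment is spread over more users.  The larger system wins
on both terms at once:

```python
# Toy model; every number here is invented for illustration.
FIXED_COST = 1_000_000      # one-time software development cost
MARGINAL_COST = 5           # cost to serve one more user

def per_user_value(users: int) -> float:
    # Network effect: each user can reach every other user.
    return 0.01 * (users - 1)

def per_user_cost(users: int) -> float:
    # Economy of scale: fixed cost is spread over all users.
    return FIXED_COST / users + MARGINAL_COST

for n in (10_000, 100_000, 1_000_000):
    surplus = per_user_value(n) - per_user_cost(n)
    print(f"{n:>9} users: value {per_user_value(n):10.2f}  "
          f"cost {per_user_cost(n):10.2f}  surplus {surplus:10.2f}")
```

At the small end the system loses money on every user; past a critical
mass, value outruns cost and keeps widening -- which is the whole point.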

Now, in each case, the product in question consists of an underlying
commodity that can easily be standardized in its outward functionality
-- i.e., the personal computer hardware or the transport of digital
data -- with software built on top -- the operating system or the
various value-added phone system functionalities.  Apple and the
phone companies have both depended upon what is increasingly called
a "stovepipe" strategy: they sell the underlying commodity and the
added software functionality as an integrated unit, and they get
their cash flow from the high profit margin they obtain by selling the
whole package.  By conscious design or not, Microsoft and the Internet
depend upon a different strategy: they break apart the underlying
commodity from the added software.

Why does this strategy work better?  The strategy pursued by Apple
and the phone companies is stable and even highly profitable in the
absence of competition.  But in the long run, as I suggested above, it
resembles an addiction.  In particular, Apple and the phone companies
are getting their cash flow from the wrong source.  The strategy
pursued by Microsoft and the Internet produces value through network
effects and economies of scale, and so once their system gains a
critical mass of users, it can combine constantly lowered prices
with high profit margins.  Why can't Apple achieve the same goal --
lowering prices to expand the market, then capturing back the investment
through economies of scale -- by pushing for production efficiencies
on its hardware?  It can try, but it is competing against a lot of
companies that are totally focused on reducing overhead and gaining
incremental production efficiencies in a commodity marketplace -- the
market for hardware that conforms to the IBM-PC standard.  This is the
accidental genius of the IBM-PC regime: one company (Intel) captures
big profit margins by controlling the standard for the processor,
another company (Microsoft) captures big profit margins by controlling
the standard for the operating system, and all of the other companies
work like dogs to obtain economic margins on all of the other components,
none of them encumbered by strong intellectual property protections.

What is the analogy, then, to the Internet?  Isenberg's analysis is
that the phone companies are shafted because the phone system is too
specialized to a certain range of functionalities.  The Internet, by
defining its service layers more parsimoniously, has the potential of
being applied to a much wider range of applications, including all of
the applications of the current phone system.  As a result, the fixed
costs of developing the Internet can be distributed across a larger
number of customers.  Once the Internet achieves a subscriber base
that's anywhere in the ballpark of the phone system, therefore, we
are likely to see the same vicious spiral that consumed Apple: those
who control standards (if, in fact, anybody ends up controlling
any major Internet standards) will be able to lower prices while
also enjoying high profit margins through the economies of scale in
software, while everyone else will have to make their money the old-
fashioned way, as companies like Cisco (in the case of the Internet)
and Compaq (in the case of PC's) do now.

Beyond this, of course, it's hard to argue in a principled way about
whether the phone companies will wake up in time.  Yes, the phone
companies handle most of the long-distance Internet traffic -- but
then, on one level an Internet backbone is basically a permanent
telephone connection.  And yes, even AT&T is announcing something that
is said to resemble Internet telephony.  The issue is not whether the
phone companies will handle a lot of Internet traffic -- that seems
certain.  The issue is whether they will make any money doing it.
This is what the Internet people are looking forward to: once TCP/IP
has finally cleaved the transport layer from the applications layer
(or layers), the stovepipe model no longer applies.  Because the
layers are no longer integrated, it is no longer certain that the
same company will operate them.  Transport will become a commodity
business, and a hundred applications flowers will bloom.
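The layering point can be caricatured in a few lines of Python: one
generic byte-moving primitive, knowing nothing about what the bytes
mean, serves any number of independently defined applications.  The
function names are my own invention:

```python
# Caricature of the transport/application split.  The transport layer
# is a commodity: it moves bytes and knows nothing about their meaning.
def transport(payload: bytes) -> bytes:
    return payload            # stand-in for actual datagram delivery

# Application layers, each defined independently on the same pipe:
def send_mail(text: str) -> str:
    return transport(text.encode()).decode()

def send_voice(samples: list) -> list:
    return list(transport(bytes(samples)))

print(send_mail("hello"))
print(send_voice([1, 2, 3]))
```

Nothing voice-specific or mail-specific lives in `transport`, which is
exactly why the transport business is so hard to charge a premium for.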

That's the theory.  And it's an appealing theory.  So let me try to
generalize it.  What doesn't work, according to this theory, is the
stovepipe strategy.  This strategy is based on what we might call
"cross-layer bundling" -- in other words, bundling of functionalities
across service layers.  The problem with cross-layer bundling is that
you're
vulnerable to the emergence of a new de facto standard that replaces
the functionality on the lowest layer.  When this happens, your users
risk becoming stranded and it becomes difficult to recruit new users
who wish to comply with the standard.  Faced with such a dilemma,
one approach is to rapidly port your system to the new standard --
for example, when Lotus Notes was ported to the Internet.  Another
approach is to wither and die -- for example, General Magic's
Telescript, or the whole lot of "push" technologies for the Internet.

Looking at the big picture, every software vendor that specializes
in a particular vertical market -- medical software, CAD, etc. -- is
probably engaged in cross-layer bundling.  This is obvious in the case
of systems that depend on any networking protocol except TCP/IP, but
it is true on other layers as well.  A vendor who establishes a
de facto standard that cuts across vertical markets -- for example,
Microsoft's Windows, Office, and Back Office -- can spread development
costs across all of those markets.  Applications vendors in those
markets may then be compelled, often at great expense, to port their
systems to the new platform.

When this dynamic works right, the result resembles the Internet:
each layer is defined in a clean, parsimonious way, and the de facto
standards are all open.  When this dynamic works wrong, the layers
are defined badly or somebody owns them.  However it works, I call
this dynamic the "platform cycle", and I think it is one of the really
important patterns in the development of information technology.

Is it my imagination, or has the flow of pyramid-scheme get-rich-quick
spam fallen off dramatically in the last few weeks?  Can it possibly
be a coincidence that this drop-off followed immediately after the US
Federal Trade Commission sent out a thousand letters to the people who
were circulating this kind of spam?  Might this action from the FTC
have resulted from the torrent of complaints that responsible Internet
users have filed with various government agencies in recent times?
Could it be that the Internet community is able to defend itself from
at least certain categories of antisocial parasites?  I think so, yes.
Warnings from the cops, of course, only alleviate certain kinds of
antisocial behavior.  We're in this for the long haul, and the answer
will eventually involve some thus-far-hard-to-comprehend combination
of technical fixes and legislation.  But let us at least stop our
despairing about spam.

The next front in the spam wars, it seems, is the pernicious idea that
spam is okay if it is "targeted".  Both bad people and good people
have contributed to confusion on this matter.  Spam is bad in large
part because many spammers have no economic incentive to target their
mailings.  It does not follow from this, however, that all mailings
are okay if they are targeted.  For one thing, the word "targeted" has
no very precise meaning, so anybody can claim that their mailings are
"targeted" simply because they harvested their mailing list from a Web
crawler that grabs e-mail addresses from pages that include vaguely
relevant keywords.  And in any case, targeted or not, unsolicited mail
still imposes a cost on its recipient.

One problem here is an ambiguity in the word "targeted".  A "targeted"
mailing list can be a list that is somehow filtered to include people
who seem somewhat statistically likely to be interested in the matter
being advertised.  It can also be a list of people who have taken a
specific action to expressly solicit such advertisements.  The former
case is spam; the latter is not.  I suspect that some spam-sending
services are fooling their would-be customers by saying, "it's okay
because it's targeted", thus leading ignorant but otherwise innocent
businesses into actions that cause them great shame.  And I know that
some foolish people within reputable businesses, such as for example
the always-clueful AT&T, have initiated spam messages out of honest
confusion on the matter.  So our next goal, it seems to me, should
be to establish clear distinctions.  When someone uses a word like
"targeted" in an ambiguous fashion, set them straight.  The principle
is clear: the Internet makes it easy for people to solicit
advertising, and so there is no excuse for unsolicited advertising.
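The distinction can be put in one line of set arithmetic: of a
keyword-harvested list, only the addresses that took an explicit
action to subscribe are legitimate recipients.  The addresses here are
made up:

```python
# "Targeted" (harvested) list versus expressly opted-in list.
harvested = {"a@example.com", "b@example.com", "c@example.com"}
opted_in = {"b@example.com"}      # expressly solicited the mailing

# Mailing the intersection is advertising; mailing the rest is spam.
legitimate_recipients = harvested & opted_in
print(sorted(legitimate_recipients))
```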

We should also take a stand against spammers' spurious assertions
that you have "solicited" their mail.  There's nothing worse than a
righteous spammer, and the righteous spammers that I've encountered
have generally been gripped by a delusion that their victims had asked
for it, for example by having an e-mail address, running a newsletter,
contributing to a newsgroup, or what-have-you.  This is obviously a
dangerous form of thinking, and not something to be taken lightly.

Another bit of spammer sophistry concerns the word "customer".  One
will hear spam apologists saying things like, "We shouldn't worry
about abuses of direct e-mail because it doesn't make sense that a
company would go out and offend its own customers".  Observe that,
according to this very special version of the English language, you
can be someone's "customer" even if you have never done any business
with them.  The correct response is that many business models do not
depend on the firm building or maintaining a good reputation among
the vast majority of people, so long as they grab a profit from the
narrow few.

Also about spam...  Since I started circulating "How to Complain About
Spam" on the net, I have started receiving regular calls and messages
from people whose businesses have been attacked by spammers.  Most of
the attacks take the form of forged message headers: if someone sends
out spam that mentions your site, your reputation will suffer and you
will incur costs to clean up the mess.  Other attacks seem to be more
intentional campaigns to shut down a business, whether by forging its
name to spam or by messing with its Web site, etc.  What's needed is a
how-to for businesses and individuals who find themselves in this sort
of situation.  Frequently asked questions include, How do I figure out
who's doing it?, How do I find a lawyer who has dealt with this sort
of thing before?, Has a crime been committed?, Do I have a cause of
action for a lawsuit?, Should I contact the individuals whose names
or addresses appear in the spam messages?, and Do I have any way of
finding out who owns the PO Box that is almost invariably part of such
messages?

I certainly don't know the answers to these questions myself, and
so my advice is pretty generic: document everything including the
costs you've incurred, solicit help from your ISP, write letters to
all potentially relevant law enforcement agencies, hire a computer
security consultant if you really want to find out what's happening
technically, and talk to a lawyer if you really are mad enough to
contemplate legal action.  Beyond that, I see an opportunity here
for some lawyers to advertise themselves: start an association, write
a good solid FAQ, explain what services you can provide, and include
your e-mail address.  If you prepare something that's credible, I'll
advertise the URL on my list, and I'll bet that you can get some news
coverage by issuing a press release on PR Newswire.

It seems that many people are unclear about why I invested so much
effort during 1997 assembling a graduate course on compatibility
standards.  The Senate hearings on Microsoft -- excuse me, on
competitive issues in the computer industry -- that were convened
by Orrin Hatch (R-Novell) might have been more helpful if they
had addressed the issues on that level.  So in case you're still
wondering, I honestly think that the bizarre institutional dynamics
of compatibility standards are the key to understanding information
infrastructure.  Here, for what it's worth, is the advertisement for
my course that I sent out to the people at UC Irvine in the fall:

  Information infrastructure evolves and interacts with institutions
  largely through the construction and contestation of standards.
  Prominent examples of contentious standards include Java, key
  escrow, CORBA versus DCOM, DVD, and several Internet and Web
  standards.  This course is an interdisciplinary survey of the
  literature on compatibility standards and standardization processes.
  Issues include:

     * the changing role of formal standards organizations and
       government standards-setting;
     * the nature and dynamics of standards conflicts;
     * strategy in standards-driven markets;
     * the design and evaluation of successful standards;
     * the economic analysis of standards in antitrust law; and
     * the relationship of technical standards to larger questions of
       social structure such as (de)centralization, globalization, and
       the evolution of markets.

The syllabus for the course, including an extensive reading list, can
be found on the Web.  Don Weightman is organizing a reading group
based on this syllabus, with face-to-face meetings in Washington and
discussion on the net.  Contact him if you're interested.

The March issue of First Monday includes
some articles about the cooperative development model of Linux. I've
never thought much about software engineering, but I finally became
interested through my thinking about standards. If you look at the
software industry as the politics of standards, as Ted Nelson puts
it, then software development and standards development are two aspects
of the same thing. The key is to broaden one's perspective: software
development isn't just something that happens within one building,
and standards development isn't just something that happens within
a formal standards committee. Software tends to congeal into a small
number of de facto standards for each given functionality, and that
congealing happens upon a big, fast-moving, often conflictual global
stage. Modern software development regimens like Microsoft's, and
modern standards development regimens like the IETF's, both operate
on the principle of frequent, incremental milestones. That way the
trajectory of development can be influenced by reality -- that is, by
users, by the market, and by the endless, intense consensus-building
that goes on among developers.

As the First Monday articles explain, Linux radicalizes this
insight. It's easy to romanticize the Linux development process using
the language of self-organization, but the key is really to maintain
an astute discipline about what aspects of the process are centralized
and what aspects are decentralized. Linux is not the perfect gas
of idealized physics but something more hierarchical. It's not a
hierarchy in the command-and-control sense, of course, but rather a
hierarchy, like that of the IETF, in which centrality is determined
by knowledge and initiative. Somebody should do a comparative study
of the individuals like Linus Torvalds and Tim Berners-Lee who somehow
hold very dynamic standards processes together through little more
than hard work, expertise, and charisma.

These matters have helped me to think about some of the experiments
that I've been doing on RRE lately. These experiments aren't the
result of hard work, expertise, or charisma, but they seem vaguely
analogous anyway. Long-timers might have noticed that I have not
announced any "Projects" lately: those experiments in gathering
everyone's opinion about a question and then sending the whole
batch out to the list. I stopped because a couple of those projects
didn't work -- for example, the "Instructive Web Sites" project from
September 1996, which turned out to be a lot more effort than I had
bargained for.

The newer projects -- lower case p, and not advertised as part of
a series -- are designed more conservatively. The older Projects
had been based on the principle that the list subscribers were doing
the work and I was simply reflecting it back to the group. The newer
projects -- "How to Complain About Spam", my commentaries on cheap
pens, and the bibliography on social aspects of computing -- are all
based on the principle that I'm doing the work and inviting others to
contribute.

Framed that way, of course, the newer regime sounds harder, not
easier. The key is that I don't initiate a project unless I'm willing
to do the whole thing myself. That's pretty much what happened with
the "Bibliography on Social Aspects of Computing" project (available
on the Web).  The bibliography
currently includes well over 300 books (see below), and while I do
certainly appreciate the contributions from people on the net, the
fact remains that I gathered about 85% of those citations myself. With
"How to Complain About Spam", on the other hand, a pretty good project
became immeasurably more excellent because about 100 outstanding
people contributed to it -- anywhere from vague comments to repeated
close readings of the text. Their contributed effort certainly added
up to something greater than my own effort, and yet it was easy to
integrate all of their comments into the successive revisions.

None of these projects remotely approaches the magnificence of Linux
or any of the other cooperative software development projects that
ferment on the Internet as we speak. I don't have the talent or
attention span for such things; I just want to explore some new models
of cooperative thinking and creating on the net. Thousands of other
people, including lots of people on this list, are doing the same
thing on larger or smaller scales. And hopefully someone will talk to
us all and write down whatever it is we've learned.

The New York Times has started a weekly Thursday section called
"Circuits" about the social aspects of computing. You can find it
on the Times Web site (free, but you have to register).
I've found reading it quite strange.  "Circuits" is, to be honest and
straightforward about it, a computer lifestyle section that resembles
the Food or Weekend section much more than, say, the excellent Monday
Business section on the Information Industries. And in fact, I have
yet to learn much from the two "Circuits" sections that I have
read. This is not a criticism, but rather an expression of the
dissonance one experiences when encountering something genuinely new
in the world. The "Circuits" reporters are all first-rate -- for
example Amy Harmon, who as far as I am aware single-handedly invented
the computer lifestyle beat when she was at the LA Times. Computers
are now officially a taken-for-granted, downright intimate component
of everybody's life, at least in industrial countries. The computer
world may talk to itself in the harsh, testosterone-fueled language of
objects and heroes, but now other, more sensible languages are
starting to emerge.  And that's okay.

Speaking of the Times, I have been reading the daily paper online
lately, and I have noticed one worrisome effect of the medium: just
as the Internet critics would predict, I read less international news
on the Web. A page of type and a picture about the nomads of Mali is
enough to persuade me to stop and read as I'm flipping through the
newsprint edition, but an article title about the nomads on a Web page
is not enough to persuade me to sit through the excruciatingly painful
download times. Perhaps this will change.

Learned RRE readers Paul Jonusaitis and Dave McArthur provided
the correct answer to my request for the original source of Nathan
Myhrvold's comments on economies of scale in software production. They
appeared in an interview in Wired in September 1995, available online.

Pens.  Jessica Hekman sent me a Zebra J-Roller Medium (0.7 mm), a
relatively conservative pen that outwardly resembles a traditional
transparent Bic. It's not a traditional transparent Bic, though, and
the key is a much-improved formulation of the traditional viscous ink
that flows more smoothly and resists clotting. I matched it against
a similar pen, the Pentel Hybrid, which howled at me from the shelves
of WalMart that I should buy it because of its utterly revolutionary
technology.  After comparisons on several surfaces, the J-Roller won
out. It is a little smoother to write with, and its cap clicks home
more convincingly. It also makes a cleaner line on the page. The
Hybrid doesn't so much make a line as a long, straight puddle that
someone has run their finger through.

My comprehensive treatise on highlighter pens still lies far in the
future, but I did want to mention the Micro Brushliter from Korea.
It is, what shall I say?, a whole new paradigm in highlighter pens.
Whereas other highlighter pens are meant to be used at a 60-degree
angle from the page, the Brushliter tapers to a point and is used
at a 15-degree angle. What's more, it is remarkably easy to vary the
width of the highlighting by adjusting the precise angle between pen
and page. This is ideal if you read while reclining.  It is therefore
definitely not for use at the office, but telecommuters will like it.

Jim Berry and Amy Unruh have informed me that Office Depot carries
cost-effective packages of both the Staedtler pens that Ralph Laube
had sent me from Australia and the Pilot Vball pens that I had
recommended (with slight reservations) in my original message.

Web sites (again, mostly from alert RRE subscribers):

FTC Staff to Survey Consumer Privacy on the Internet

The Yuckiest Site on the Internet

National Science Foundation Societal Dimensions of Engineering, Science, and Technology

Anywho reverse phone directory

Collection of materials from the CFP'98 conference

Good article on the ridiculous domain name service controversy

Debunking the Myth of a Desperate Software Labor Shortage

Association of Personal Computer User Groups

Berkman Center for Internet and Society at Harvard

Top Cyberspace Law Cases of 1997 by Jerry Kang

Newly Released CIA Report on the Bay of Pigs

Ars Electronica Festival of Art, Technology and Society: "Info War"

FCC Clarifies Customer Privacy Provisions of 1996 Act;
Customers To Control Use and Disclosure of Personal Information

The Digital Libraries Initiative Phase 2 announcement

List of conferences

Canadian privacy -- data protection bill now in the legislature

CSA (Canadian Standards Association) privacy standard

1995 European Privacy directive

HURIDOCS (Human Rights Documents)

EASST '98 General Conference, "Cultures of Science and Technology:
Europe and the Global Context", Lisbon, 9/30 to 10/3, 1998.

Markle Foundation's E-mail for All Outreach Campaign

International Society for the History of Behavioral and Social Sciences,
San Diego, 18-21 June 1998

Article on spam by Roger Clarke

Course on "Privacy in Cyberspace" by Arthur Miller

Smithsonian's "Revealing Things" exhibit

Opera -- faster, less bloated Web browser

Mnemonic -- free Linux browser

New books for my bibliography continue to pour in at an alarming
rate. The list now includes over 300 books. Putting the list together
has been an interesting experience. For example, go to Amazon.com,
look up a book (say, Agre and Rotenberg's "Technology and Privacy: The
New Landscape") and start following the links that are marked with
"Readers who bought [such-and-such] also bought". You'll find yourself
engaged in a spontaneous sociological experiment in mapping people's
interests. In any case, here are the books that I have added since I
last sent the whole thing out:

Daniel J. Barrett, Bandits on the Information Superhighway, O'Reilly,

Joseph E. Behar, ed, Mapping Cyberspace: Social Research on the
Electronic Frontier, Dowling College Press, 1997.

James D. Best, The Digital Organization: AlliedSignal's Success With
Business Technology, New York: Wiley, 1997.

Carl F. Cargill, Open Systems Standardization: A Business Approach,
second edition, Prentice Hall, 1996.

Kenneth W. Dam and Herbert S. Lin, eds, Cryptography's Role in
Securing the Information Society, National Academy Press, 1996.

Candace Deans and Jaak Jurison, Information Technology in a Global
Business Environment: Readings and Cases, Boyd and Fraser, 1996.

M. David Ermann, Mary B. Williams, and Michele S. Shauf, eds,
Computers, Ethics, and Society, second edition, New York: Oxford
University Press, 1997.

Martin Fransman, Japan's Computer and Communications Industry: The
Evolution of Industrial Giants and Global Competitiveness, Oxford
University Press, 1996.

John O. Green, The New Age of Communications, New York: Holt, 1997.

Douglas Groothuis, The Soul in Cyberspace, Baker, 1997.

Andrew S. Grove, Only the Paranoid Survive: How to Exploit the Crisis
Points That Challenge Every Company and Career, New York: Currency
Doubleday, 1996.

Jean Guisnel, Cyberwars: Espionage on the Internet, Plenum Press,

Peter Hall, Cities of Tomorrow, revised edition, Blackwell, 1996.

Michael Hauben and Ronda Hauben, Netizens: On the History and Impact
of Usenet and the Internet, IEEE Computer Society, 1997.

J.C. Herz, Joystick Nation: How Videogames Ate Our Quarters, Won Our
Hearts, and Rewired Our Minds, Boston: Little, Brown, 1997.

Steven Holtzman, Digital Mosaics: The Aesthetics of Cyberspace, Simon
and Schuster, 1997.

Alan Irwin and Brian Wynne, eds, Misunderstanding Science? The Public
Reconstruction of Science and Technology, Cambridge University Press,

Rama Dev Jager and Rafael Ortiz, In the Company of Giants: Candid
Conversations With the Visionaries of the Digital World, McGraw-Hill,

Margo Komenar, Electronic Marketing, New York: Wiley, 1997.

Kenneth C. Laudon, Carol Guercio Traver, and Jane Price Laudon,
Information Technology and Society, second edition, Course Technology,

Michael Lesk, Practical Digital Libraries: Books, Bytes, and Bucks,
Morgan Kaufmann, 1997.

T.G. Lewis, The Friction-Free Economy: Marketing Strategies for a
Wired World, New York: HarperBusiness, 1997.

Jonathan Littman, The Watchman: The Twisted Life and Crimes of Serial
Hacker Kevin Poulsen, Boston: Little, Brown, 1997.

Paul W. MacAvoy, The Failure of Antitrust and Regulation to Establish
Competition in Long-Distance Telephone Services, Cambridge: MIT Press,

Martin Mayer, The Bankers: The Next Generation, Dutton, 1997.

Hamid Mowlana, Global Information and World Communication: New
Frontiers in International Relations, second edition, Sage, 1997.

Cleo Odzer, Virtual Spaces: Sex and the Cyber Citizen, Berkley, 1997.

Sanjaya Panth, Technological Innovation, Industrial Evolution, and
Economic Growth, Garland, 1997.

John V. Pavlik, New Media Technology: Cultural and Commercial
Perspectives, Allyn and Bacon, 1996.

Charles Platt, Anarchy Online: Net Sex, Net Crime, Harper, 1997.

Lynnette R. Porter, Creating the Virtual Classroom: Distance Learning
with the Internet, Wiley, 1997.

Kenneth G. Robinson, ed, Building a Global Information Society: Report
of the Aspen Institute Roundtable on International Telecommunications,
Aspen Institute, 1996.

Richard S. Rosenberg, The Social Impact of Computers, second edition,
Academic Press, 1997.

Susan M. Ryan, Downloading Democracy: Government Information in an
Electronic Age, Hampton Press, 1996.

Harold Sackman, Biomedical Information Technology: Global Social
Responsibilities for the Democratic Information Age, Academic Press,

Pier Paolo Saviotti, Technological Evolution, Variety and the Economy,
Elgar, 1996.

Paul M. Schwartz and Joel R. Reidenberg, Data Privacy Law, Michie,

Rob Shields, ed, Cultures of Internet: Virtual Spaces, Real Histories,
Living Bodies, Sage, 1996.

Daniel E. Sichel and Marilyn Whitmore, eds, The Computer Revolution:
An Economic Perspective, Brookings, 1997.

J. Gregory Sidak, Foreign Investment in American Telecommunications,
University of Chicago Press, 1997.

Robert Ellis Smith, Compilation of State and Federal Privacy Laws,
Privacy Journal, 1997.

Joseph Straubhaar and Robert LaRose, Communications Media in the
Information Society, Wadsworth, 1996.

Andrew S. Targowski, Global Information Infrastructure: The Birth,
Vision, and Architecture, Idea Group, 1996.

Robin Thornes, Protecting Cultural Objects in the Global Information
Society: The Making of Object ID, Getty Information Institute, 1997.

Luke Uka Uche, ed, North-South Information Culture: Trends in Global
Communications and Research Paradigms, Longman Nigeria, 1996.

Paul Virilio, Open Sky, translated by Julie Rose, Verso Books, 1997.

Jonathan Wallace, Sex, Laws, and Cyberspace: Freedom and Censorship on
the Frontiers of the Online Revolution, Holt, 1996.

Charles B. Wang, Techno Vision II: Every Executive's Guide to
Understanding and Mastering Technology and the Internet, New York:
McGraw-Hill, 1997.

Brad Wieners and David Pescovitz, eds, Reality Check, Hardwired, 1996.

J. MacGregor Wise, Exploring Technology and Social Space, Sage, 1997.

Phillip Woolfson and Angus Murray, Information Society in the European
Union, Wiley, 1996.

Kaoru Yamaguchi, ed, Sustainable Global Communities in the Information
Age: Visions from Futures Studies, Praeger, 1997.

G. Pascal Zachary, Endless Frontier: Vannevar Bush, Engineer of the
American Century, New York: Free Press, 1997.