

Web site for documentation of the Cocoa class library.

Web site for releases and documentation for the GNUstep library, an open
source version of Cocoa.

A searchable archive that combines several Cocoa developer mailing lists.
Cocoa Programming for Mac OS X
An introduction to both Objective-C and the Cocoa framework. By Aaron
Hillegass. Addison-Wesley.
Learning Cocoa with Objective-C
An introduction to both Objective-C and the Cocoa framework. By James
Duncan Davidson. O'Reilly & Associates.
Cocoa Programming
A more advanced guide to the Cocoa framework. By Scott Anguish, Erik
Buck, and Donald A. Yacktman. SAMS.
ISBN: 0-13-977430-0
Copyright 1999 by Prentice-Hall PTR
Copyright 2000 by Chrysalis Software Corporation

Would you like to discuss interesting topics with intelligent people? If so,
you might be interested in Colloquy, the world's first internet-based high-IQ
society. I'm one of the Regents of this society, and am responsible for the
email list and for membership applications. Application instructions are
available here.

Imagine that you are about to finish a relatively large program, one that has taken a
few weeks or months to write and debug. Just as you are putting the finishing
touches on it, you discover that it is either too slow or runs out of memory when
you feed it a realistic set of input data. You sigh, and start the task of optimizing it.


But why optimize? If your program doesn't fit in memory, you can just get more
memory; if it is too slow, you can get a faster processor.
I have written Optimizing C++ because I believe that this common attitude is
incorrect, and that a knowledge of optimization is essential to a professional
programmer. One very important reason is that we often have little control over the
hardware on which our programs are to be run. In this situation, the simplistic
approach of adding more hardware is not feasible.
Optimizing C++ provides working programmers and those who intend to be
working programmers with a practical, real-world approach to program
optimization. Many of the optimization techniques presented are derived from my
reading of academic journals that are, sadly, little known in the programming
community. This book also draws on my nearly 30 years of experience as a
programmer in diverse fields of application, during which I have become
increasingly concerned about the amount of effort spent in reinventing
optimization techniques rather than applying those already developed.
The first question you have to answer is whether your program needs optimization
at all. If it does, you have to determine what part of the program is the culprit, and
what resource is being overused. Chapter 1 indicates a method of attack on these
problems, as well as a real-life example.
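As a rough illustration of measuring where the time actually goes, here is a minimal timing sketch using the standard std::chrono facilities. This is my own assumption of a reasonable harness, not the book's method (the timings in the book come from the profiler bundled with Visual C++), and it requires a C++11 or later compiler, which postdates the book.

#include <chrono>
#include <iostream>
#include <numeric>
#include <vector>

// Minimal timing harness (an assumption on my part, not the book's code):
// runs a callable once and returns the elapsed wall-clock time in milliseconds.
template <typename Fn>
double millisecondsFor(Fn&& fn)
{
    auto start = std::chrono::steady_clock::now();
    fn();
    auto stop = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(stop - start).count();
}

int main()
{
    std::vector<int> data(1000000, 1);

    // Time one suspect section in isolation to see whether it is the culprit.
    double ms = millisecondsFor([&] {
        long long sum = std::accumulate(data.begin(), data.end(), 0LL);
        std::cout << "sum = " << sum << '\n';
    });
    std::cout << "suspect section took " << ms << " ms\n";
    return 0;
}

Timing one suspect region at a time in this way is the crudest form of profiling, but it is often enough to confirm or rule out a suspected culprit before reaching for a full profiler.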
All of the examples in this book were compiled with both Microsoft's Visual C++
5.0 and the DJGPP compiler, written and copyrighted by DJ Delorie. The latter
compiler is available here. The source code for the examples is available here. If
you want to use DJGPP, I recommend that you also get RHIDE, an integrated
development environment for the DJGPP compiler, written and copyrighted by
Robert Hoehne, which is available here.
All of the timings and profiling statistics, unless otherwise noted, were the result of
running the corresponding program compiled with Visual C++ 5.0 on my Pentium
II 233 Megahertz machine with 64 megabytes of memory.
I am always happy to receive correspondence from readers. If you wish to contact
me, the best way is to visit my WWW home page.

If you prefer, you can email me.
In the event that you enjoy this book and would like to tell others about it, you
might want to write an on-line review on Amazon.com, which you can do here.
I should also tell you how the various typefaces are used in the book.
HelveticaNarrow is used for program listings, for terms used in programs, and
for words defined by the C++ language. Italics are used primarily for technical
terms that are found in the glossary, although they are also used for emphasis in
some places. The first time that I use a particular technical term that you might not
know, it is in bold face.
Now, on with the show!
Dedication
Acknowledgements
Prologue
A Supermarket Price Lookup System
A Mailing List System
Cn U Rd Ths (Qkly)? A Data Compression Utility
Free at Last: An Efficient Method of Handling Variable-Length Records
Heavenly Hash: A Dynamic Hashing Algorithm
Zensort: A Sorting Algorithm for Limited Memory
Mozart, No. Would You Believe Gershwin?
About the Author

This book is dedicated to Susan Patricia Caffee Heller, the light of my life.
Without her, this book would not be what it is; even more important, I would not
be what I am: a happy man.

Acknowledgements
I'd like to thank all those readers who have provided feedback on the first two
editions of this book, especially those who have posted reviews on Amazon.com;
their contributions have made this a better book.

I'd also like to thank Jeff Pepper, my editor at Prentice-Hall, for his support and
encouragement. Without him, this third edition would never have been published.
Finally, I would like to express my appreciation to John P. Linderman at AT&T
Labs Research for his help with the code in the chapter on sorting immense files.
Prologue
Introduction to Optimization
What is optimization anyway? Clearly, we have to know this before we can discuss
how and why we should optimize programs.
Definition
Optimization is the art and science of modifying a working computer program so
that it makes more efficient use of one or more scarce resources, primarily
memory, disk space, or time. This definition has a sometimes overlooked but very
important corollary (The First Law of Optimization): The speed of a nonworking
program is irrelevant.
Algorithms Discussed
Radix40 Data Representation, Lookup Tables[1]

Deciding Whether to Optimize
Suppose you have written a program to calculate mortgage payments; the yearly run takes ten minutes. Should you spend two hours to double its speed? Probably not, since it will take twenty-four years to pay back the original investment of time at five minutes per year.[2] On the other hand, if you run a program for three hours every working day, even spending thirty hours to double its speed will pay for itself in only twenty working days, or about a month. Obviously the latter is a much better candidate for optimization. Usually, of course, the situation is not nearly so unambiguous: even if your system is overloaded, it may not be immediately apparent which program is responsible.[3]

My general rule is not to optimize a program that performs satisfactorily. If you (or the intended users) don't become impatient while waiting for it to finish, don't bother. Of course, if you just feel like indulging in some recreational optimization, that's another matter.
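To make the payback arithmetic above concrete, here is a minimal sketch; the helper name and the figures are mine, taken from the two examples in the text, and this is not code from the book.

#include <iostream>

// Illustrative helper (not from the book): number of runs needed before the
// time invested in optimizing is repaid by the time saved on each run.
double runsToBreakEven(double minutesInvested, double minutesSavedPerRun)
{
    return minutesInvested / minutesSavedPerRun;
}

int main()
{
    // Mortgage program: 2 hours (120 minutes) invested, 5 minutes saved per yearly run.
    std::cout << runsToBreakEven(120.0, 5.0) << " yearly runs to break even\n";   // 24

    // Daily program: 30 hours (1800 minutes) invested, 90 minutes saved per daily run.
    std::cout << runsToBreakEven(1800.0, 90.0) << " daily runs to break even\n";  // 20
    return 0;
}

With the figures from the text, this reproduces the twenty-four-year and twenty-working-day break-even points.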
Why Optimization Is Necessary
Assuming that our programs are too big, or too slow, why don't we just add more
memory or a faster processor? If that isn't possible today, then the next generation
of processors should be powerful enough to spare us such concerns.
Let's examine this rather widely held theory. Although the past is not an infallible
guide to the future, it is certainly one source of information about what happens
when technology changes. A good place to start is to compare the computers of the
late 1970's with those of the late 1990's.
The first diskette-based computer I ever owned was a Radio Shack TRS-80 Model III™, purchased in 1979.[4] It had a 4 MHz Z80™ processor, 48 Kbytes of memory, and Basic™ in ROM. The diskettes held about 140 Kbytes apiece. Among the programs that were available for this machine were word processors, assemblers, debuggers, data bases, and games. While none of these were as advanced as the ones that are available today on 80x86 or 680x0 machines, most of the basic functions were there.
The Pentium II™ machines of today have at least 1000 times as much memory and 20000 times as much disk storage and are probably 1000 times as fast. Therefore, according to this theory, we should no longer need to worry about efficiency.
Recently, however, several of the major microcomputer software companies have
had serious performance problems with new software releases of both
application
