13 Posts in Category 'Book Reviews'
I recently reviewed Surely You're Joking, Mr. Feynman!. It was good enough that I had to get the sequel.
What Do You Care What Other People Think? is another collection of stories and anecdotes written by and/or about Richard Feynman. In contrast to the first book, which was a roughly chronological series of anecdotes, this one focuses on a few main topics.
Feynman discusses his first wife in some detail. Of particular interest, he describes his and his wife's brutal devotion to honesty in their relationship, even in the face of highly unpleasant truths (terminal disease, in this case). It's the honesty of a scientist, carried into "everyday" life. This was bittersweet for me to read, because the story has a sad ending.
There is also a short series of letters from Feynman to others, where he discusses the silliness of pomp and circumstance, e.g. his foibles and breaches of protocol when meeting some king or other. As someone who hates ceremony, I got a huge kick out of these.
A large part of the book is devoted to discussing the Presidential Commission which investigated the cause of the Challenger shuttle disaster. Feynman's full report is included in the book as well.
As someone interested in astronomy and space flight (and who isn't interested in those?) I found this fascinating. There's a lot of behind-the-scenes stuff. Engineers are painted in a good light, managers and politicians not so much. (Software engineers come out looking especially good, which made me feel (unjustifiably) good about myself by proxy.) There are some diagrams and a lot of technical discussion of the shuttle. Not so much that it drowns the narrative, but enough that I'm probably going to spend the next week reading Wikipedia on the subject now.
Feynman explains his simple methods for getting at the truth in the investigation. Go talk to the guys who put things together. Get your hands on some O-ring rubber and test its resistance to temperature yourself in a glass of ice water. Cut to the heart of the matter. It's good stuff.
Ultimately, as you know if you've read the report, Feynman rips NASA apart, showing that they were fooling themselves into believing the shuttle was safer than it really was. The last sentence of the report says everything: "Nature cannot be fooled."
The last section of the book discusses the value of science. More specifically, Feynman discusses the value of doubt. I very much liked how the chapter ends:
It is our responsibility as scientists, knowing the great progress which comes from a satisfactory philosophy of ignorance, the great progress which is the fruit of freedom of thought, to proclaim the value of this freedom; to teach how doubt is not to be feared but welcomed and discussed; and to demand this freedom as our duty to all coming generations.
If there's one trait I had to pick to separate good people from bad, it would be the ability to admit being wrong. And if I had to separate the good from the excellent, it would be not just the ability to admit being wrong, but the eagerness to be proved wrong.
There's a certain kind of devotion to the truth that not many people achieve, and maybe not many people even want to achieve. There's comfort in thinking that you know things. It's very tempting. I think it's probably partly why most people are religious. I suspect it's a big reason why so many people are so stubbornly wrong about so many things in general. I suspect this comfort is an enormous source of suffering in the world.
But there's another kind of comfort that people miss out on. It's the comfort of knowing that although you're probably wrong about a lot of things, you're trying your hardest to be right. You pay the price of being aware of your own state of ignorance, but you can rest a bit easier knowing that you're maybe, hopefully, inching towards the truth. I never heard the word "freedom" used to describe this feeling before, as Feynman does above, but it fits.
That's why I like reading about Feynman and reading Feynman's words. He seemed to live this philosophy as well as anyone could hope to.
I'm probably the last person on earth to read "Surely You're Joking, Mr. Feynman!", a collection of stories and anecdotes from the life of Richard Feynman. But better late than never.
I'll keep this short. Feynman was the kind of nerd every nerd wishes he was, in one way or another. He was socially awkward. He was blunt and tactless. He felt out of place much of the time. And it was surprising to see how often Feynman expressed feelings of inadequacy. He wrote about being highly intimidated when talking in front of big names in physics, and jealous of the math abilities of some other people.
But none of that stopped him from having an exciting life. In fact he turned these traits to his advantage. He had romantic success, often via highly unconventional means (e.g. walking up to girls and asking them for sex outright; they often said yes). His bluntness was seen as an admirable trait: where others might be intimidated by a famous physicist, Feynman would give his honest opinion, and that was often appreciated.
And Feynman went far outside his comfort zone. He did a stint in biology, even though he knew nothing of biology at the time. He went to Brazil and joined a samba band. He sold drawings and paintings for a while. He played drums for a ballet.
I picked up at least three lessons from this book.
Try new things, even if you suck at them. Life is boring if you stick to what you're good at.
Be intellectually honest. Brutally so. It's the only way to do good science, and arguably the only way to live a good life. I've always believed this, and Feynman hammers the point home well.
Even the best and brightest of us feel insecure at times. You shouldn't let it stop you.
On top of all of that, reading of Feynman's time at Los Alamos working on The Bomb was a fascinating piece of history. I highly recommend this book.
And watch as much of Feynman on Youtube as you can find. It never ceases to be fascinating.
Vim is an open-source text editor with a power and flexibility matched only by the steepness of its learning curve. As the author of this book states, "Vim Can Do Everything", but configuring it to do so is sometimes daunting. Hacking Vim 7.2 aims to help the average Vimmer get the most out of customizing Vim, for fun and productivity.
For Christmas this year, I received a shiny hardback copy of Parsing Techniques: A Practical Guide by Grune and Jacobs. It's a thrilling book, if you want to learn parsing, which I do.
Where most books proceed in a sort of linear fashion, this book teaches parsing in layers. First you learn what a grammar is. Then you learn what it means to parse: what's a parse tree? What's bottom-up vs. top-down? What's a leftmost vs. rightmost derivation?
Next you get some general ideas and methods for parsing, e.g. CYK and Unger, and then you dive into the implementations of parsers (in pseudocode and in C) in great detail. This is about as far as I've gotten so far, before having to go back and figure out what the heck I just read. But it's an interesting progression. Reading the book, I feel like I'm constantly revisiting things I learned a few chapters ago, but this time in more detail. The book kind of does a breadth-first traversal of the world of parsing.
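To give a flavor of the CYK method, here's my own rough sketch in Python (not the book's pseudocode or C, and simplified: the grammar has to already be in Chomsky Normal Form):

```python
def cyk_recognize(word, grammar, start="S"):
    """CYK recognizer. `grammar` maps a nonterminal to a list of
    productions: either a terminal string or a (B, C) pair of
    nonterminals (Chomsky Normal Form). Returns True if `word`
    can be derived from `start`."""
    n = len(word)
    if n == 0:
        return False
    # table[i][l] = set of nonterminals deriving the substring of
    # length l+1 starting at position i
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, ch in enumerate(word):
        for head, bodies in grammar.items():
            if ch in bodies:
                table[i][0].add(head)
    for length in range(2, n + 1):        # substring length
        for i in range(n - length + 1):   # substring start
            for k in range(1, length):    # split point
                for head, bodies in grammar.items():
                    for body in bodies:
                        if (isinstance(body, tuple)
                                and body[0] in table[i][k - 1]
                                and body[1] in table[i + k][length - k - 1]):
                            table[i][length - 1].add(head)
    return start in table[0][n - 1]

# A CNF grammar for the language a^n b^n (n >= 1):
balanced = {"S": [("A", "B"), ("A", "X")],
            "X": [("S", "B")],
            "A": ["a"], "B": ["b"]}
```

The O(n^3) triple loop over start, length, and split point is the whole algorithm; the book builds up to this and then goes far past it.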
Be warned however: this book is not easy reading. It's dense, heavy on the info, light on the entertainment. Unless you really get a kick out of parsing, this will probably put you to sleep if taken in large doses. But it is a trove of information, and I couldn't put the book down during certain chapters.
In fact there's so much information in this book that it's almost depressing. The bibliography alone takes up 1/4 of the book, and lists 1,500(!) authors. It'd take me a week to read the bibliography, and probably many years to read every book listed there. Parsing could easily consume a lifetime of study, and I'm saddened that I'm probably never going to find the time to master all there is to know. But such is life.
If I had one quibble with this book, it'd be the same quibble I have with most math papers. The notation is horrible. Say what you will about programmers, most of us know that code is written for humans, not for machines, and we give our variables descriptive names. In math it's all single-letter variable names.
When the authors of this book run out of single letters, they use letters with bars over them, or bold letters vs. normal typeface letters, or they do things like this:
...whenever a non-terminal A is entered into entry R_i,l of the recognition table because there is a rule A -> BC and B is in R_i,k, and C is in R_i+k,l-k, the rule A_i,l -> B_i,k C_m,n is added to the parse forest grammar, where m = i + k and n = i + l - k.
This is the first paragraph of a section. Those variables are not mentioned before this sentence. This is certainly not a style of writing that I'm used to reading. It took me a good dozen tries to understand it. (Using lowercase i's and l's right next to each other should be prohibited by law.)
In any case, this book is good. One of my favorite tools has always been Perl-style regular expressions, and I feel like this book has expanded my understanding of how they work. Learning to write a recognizer, learning how things are implemented under the hood, you couldn't ask for a more interesting topic. I can't wait to try writing a toy parser generator or regex recognizer in Clojure once I've solidified my understanding of some of these concepts.
Recently I received a preview copy of Peter Seibel's newest book, Coders at Work.
This is a wonderful book if you are a programmer and care at all about the art, craft, and/or science of programming. If there is any wisdom to be had in this fledgling field of ours, this book contains buckets of it.
The book consists entirely of long interviews with some big names in the world of programming: Donald Knuth, Ken Thompson, Jamie Zawinski, Guy Steele, Peter Norvig, the list goes on. There are also some names I wasn't quite so familiar with (but maybe should have been), like Brad Fitzpatrick, the creator of LiveJournal.
But everyone interviewed for the book has produced some grand, successful project in their time. These are tried-and-true, battle-tested programmers and in this book they share their war stories and advice.
I went on a reading binge over Thanksgiving break. Read on for reviews.
Test-Driven Development By Example
First I read Test-Driven Development By Example. This is a very short book, especially given the hefty price tag. I would probably not have bought this if I'd seen it on a shelf at the store rather than online.
That said, it's a good book if you want to understand the mindset behind the whole test-driven development fad. The book is heavy on mindset and light on mechanics; it doesn't tell you how to set up an environment, how to compile and run tests, or any such thing. It just goes through some examples and explains what the programmer was thinking, and how to tackle the problem from a TDD point of view. The intended audience therefore is someone who has plenty of experience programming and wants to learn a new way to look at problem solving.
The book goes through writing some terrible code (terrible by the author's own admission) to solve a simple problem, then refactoring the code to be less and less terrible by means of tests. The writing style is engaging and casual, and he describes mistakes as well as successes, which is a nice way to write a tutorial book. The alternative style, belting out the correct answer the first time, every time, is not as enlightening, and I appreciated its absence here.
The book places a heavy emphasis on writing tests FIRST, making small incremental changes, and refactoring as you go. It also explains how TDD can be related to various OOP design patterns, which I didn't find all that helpful. And it describes some "test patterns", which I found even less helpful. But it's mostly an aesthetic objection; something about "design patterns" sits less and less well with me the more I get away from Java-like languages. The information is still good, straightforward and to the point, take it or leave it.
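The test-first rhythm looks something like this (my sketch in Python, loosely modeled on the book's multi-currency example; the Dollar class here is illustrative, not Beck's actual code):

```python
import unittest

# Step 1: write the test first, against code that doesn't exist yet.
# Running it at this point fails: that's the "red" phase.
class TestDollar(unittest.TestCase):
    def test_multiplication(self):
        five = Dollar(5)
        self.assertEqual(five.times(2).amount, 10)
        self.assertEqual(five.times(3).amount, 15)

# Step 2: write the simplest thing that could possibly work ("green").
class Dollar:
    def __init__(self, amount):
        self.amount = amount

    def times(self, multiplier):
        return Dollar(self.amount * multiplier)

# Step 3: refactor freely, with the test suite as a safety net.
```

The point isn't the trivial code, it's the order: the test pins down the behavior before any implementation exists.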
I have been unable to drink the TDD Kool-Aid, but I can see where it'd probably result in an improvement in some of my code for certain specific kinds of problems. I'll probably try it out again next time I have to write a little library at work.
This book is expensive and short, but the alternatives for learning about TDD are half-baked blog posts, which is why I bought a book. I don't know of a better book because this is the only one I have, and I'm unsure at this point whether I really want to buy another book on this topic. But this book is highly recommended by many TDD fanatics, er, enthusiasts, so I don't know.
The Little Schemer
Next I read The Little Schemer. The style of this book can best be described as "cute". Examples all use food, and it instructs you at various points to go make yourself a sammich (even leaving room in the book for "jelly stains"). This somehow makes the topics, which can be rather intimidating, less scary. The book is very short, and not very dense (not many words on a page), but it crams a lot of information in anyway, via the many examples. It's a very focused and methodical book.
This book goes through developing a simplified dialect of Scheme, with a focus on recursion and building up parts of the language from very simple primitives. There are some neat little things along the way, like using s-expressions to represent numbers, and building up a full set of arithmetic functions using nothing but "add 1", "subtract 1", and recursion. It'd be insane to do in real life because of performance, but it's really neat from a "hey look what you can do, isn't this cool!" point of view.
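The arithmetic-from-almost-nothing trick translates directly out of Scheme. Here's the idea in Python (my transliteration, not the book's code), using only add1, sub1, a zero test, and recursion:

```python
def add1(n): return n + 1
def sub1(n): return n - 1
def is_zero(n): return n == 0

def plus(n, m):
    # n + m: peel m down to zero, bumping n up as you go
    return n if is_zero(m) else plus(add1(n), sub1(m))

def times(n, m):
    # n * m is n added to itself m times
    return 0 if is_zero(m) else plus(n, times(n, sub1(m)))

def power(n, m):
    # n ** m is n multiplied by itself m times
    return 1 if is_zero(m) else times(n, power(n, sub1(m)))
```

Hopelessly slow for real work, as noted above, but it shows how little you need to bootstrap arithmetic.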
The first half of the book is pretty simplistic (at least for me, with some basic knowledge of Lisp and Scheme), but the second half really starts delving into some crazy things, passing lambdas around to lambdas and writing evaluators and stuff. But you almost don't even notice because of how solidly the foundations are built up to that point, and how well the examples are explained. (You probably will notice once you hit the y-combinator, on account of your brains detonating.)
This is not a good book for "learning Lisp" in the sense of learning the gritty details of how to use a real-life implementation of Lisp. But it's a good book for learning some of the concepts that make Lisp and functional languages powerful. This is a very interesting and unique book, also highly recommended by many in the Lisp world. I'm already planning to try The Seasoned Schemer next.
Real World Haskell
Finally I started plowing through the recently-released Real World Haskell. This book is freely available online, which is great. It also allows readers to make comments on every paragraph in the book, which is a highly collaborative and probably very intelligent way to write a book, and also can provide insight for readers who don't understand the text at any given point. I may buy a hard copy, I haven't decided. (The online version has some FIXME notes and typos that I can only hope are fixed in the real thing.)
I'm only up to chapter 11, but it's good so far. It gives a ton of examples and goes slow enough for people completely new to Haskell to pick the language up. And it includes some of the mechanics of compiling and/or running programs in GHC, which is extremely helpful. I did start getting lost around the time monads were introduced, but that's to be expected. It usually takes me two or three good reads of this kind of book before I grok it all, which is no fault of the book.
Haskell. Last time I used it, I wrote Pong in Haskell in one of my college classes. It was an immensely painful experience. I'm going back to re-learn Haskell now because I find myself edging more and more toward functional languages and away from the C/Java world.
Haskell is far too much to swallow if you have no experience with functional programming, which is probably why I hated it in college. (That said, a smarter person than I may have fewer problems.) After graduating school, I got my first whiff of this world again via Perl's grep and friends. Then Ruby pushed it home for me with its widespread use of block parameters (lambdas in disguise). And after learning Lisp (and especially later, Clojure) I can't live without higher-order functions. If I had to write a classic for-loop and manually keep track of a counter variable I'd probably vomit at this point. Haskell takes function manipulation to an even greater extreme, which I still haven't fully wrapped my head around, but I like what I've learned so far.
Laziness is awesome, function currying is awesome, and this is the first time I've read about folding, apart from brief struggles with Ruby's (poorly named) Enumerable#inject. Pattern matching strikes me as vaguely similar to Common Lisp's / Clojure's destructuring binding, which is one of the nicest features I've seen in any language when it comes to making a programmer's life easier.
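For anyone who hasn't met folding: it collapses a list with a binary function and a seed value. A quick illustration in Python (my sketch, not Haskell; Python's built-in flavor is functools.reduce, which is a left fold):

```python
from functools import reduce

def foldr(f, z, xs):
    # Right fold: foldr(f, z, [a, b, c]) == f(a, f(b, f(c, z)))
    return z if not xs else f(xs[0], foldr(f, z, xs[1:]))

total = foldr(lambda x, acc: x + acc, 0, [1, 2, 3, 4])        # 10
rebuilt = foldr(lambda x, acc: [x] + acc, [], [1, 2, 3])      # [1, 2, 3]
left_total = reduce(lambda acc, x: acc + x, [1, 2, 3, 4], 0)  # also 10

# And the destructuring flavor: bind names to positions in one
# step, roughly what Python calls sequence unpacking.
(x, y), rest = (1, 2), [3, 4]
```

Folding with cons as the function and the empty list as the seed is the identity on lists, which is a nice sanity check that the seed and function really do generalize the loop.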
Maybe it's all just interesting because it's still new to me. But I like the whole idea of functional programming. I like the idea of functions that always produce the same output from a given input. How and when to handle side-effects and object mutation is a problem that's always nagged at the back of my mind even when writing Ruby code.
That said, Haskell still (at this point in the book) does not strike me as a practical or real-world language in the slightest. You've got to jump through some crazy hoops to get a lot of things to work, especially when it comes to I/O. The book describes how to write a pretty-printer, and how to parse a binary file, both of which require some acrobatics to write concisely in Haskell. In particular the parser library lost me entirely; functions were being chained to functions in all directions and I couldn't follow it.
I think one reason Java is so popular is that you don't have to be a genius to write it. To write Haskell (or even Lisp) well, to take advantage of the language and use it smoothly, you really do need to do some deeper thinking. The abstractions are more powerful and more "abstract". It's not too hard to understand a world made of objects with a bunch of knobs you turn via method calls. Lambdas and recursive functions and monads and typeclasses are a more ephemeral thing.
That said I plan to stick with the book. Practice makes perfect.
It's been a while since I acquired PAIP. I can't say I've read all of it over the past couple months, but I've read most of it.
The initial chapters that give an intro to Common Lisp seem largely useless to me. There are better books to introduce Common Lisp (especially now that we have PCL). Norvig admits as much himself, and thankfully doesn't devote much of the book to intro material.
Once you get past those, this book is densely packed with information. Just tracing through and understanding the source code would probably take me a couple of months, let alone grasping all the concepts the source code is trying to exemplify. It's a wonder to me that one person can produce this much material. But the downside is that it's a bit difficult to read at times; it's kind of dry and reads like a text book. In spite of that, I was glued to this book for quite some time; the information inside is engaging enough to make up for the lack of presentation.
Some of the examples of code in the book really are quite startling. For example Norvig writes a pattern-matcher that implements some subset of Perlish regular expressions, and it only takes a couple pages of code. Later in another couple pages of code, he implements a variant of Prolog. At one point he writes some pattern-matchers that can parse a surprisingly large subset of the English language. But a lot of other examples of code weren't quite so interesting to me personally; a lot of the material is dedicated to tree and graph search algorithms, which brought to mind long boring lectures from my college days. (As a side note, it's very sad how much of the history of AI can be reduced to "clever graph search algorithms". I don't know much about AI today but I hope it's advanced a bit past that.)
This book is from 1992 and some of it does feel a bit dated. This book was apparently written just after Common Lisp was standardized. Object oriented programming was apparently relatively new back then, and Norvig glosses over CLOS. Most of the code is written in functional style. However Common Lisp today still doesn't force OO on you by any means, and a lot of Lispers today apparently still don't care much for OO, so the code is more relevant than you'd imagine.
And as you can't help realizing after reading an old CS book or two, CS as a science hasn't seen all that many breakthroughs in the short couple of decades it's been around. It's really quite sad in a way that a book written over 15 years ago can still be so relevant; we have all these huge advances in computer hardware, yet it still feels like we're in the Stone Age of computer programming.
Not being an Artificial Intelligence person myself, I can't judge how well this book serves as a guide to AI. But Norvig's stated goal was in large part to show the history of AI, and he does that very well. (This was history back in 1992 when the book was written, so it's even older now, but still interesting.) He walks through implementing a lot of famous AI programs from the past, and explains their limitations and how they can be improved. Want to write ELIZA or other chat-bots? He does that. Want to write an Othello-playing program? He does that too.
One of the best things about this book is that Norvig's code is just about the Lispiest code I've seen. He does many things that take full advantage of all the nooks and crannies of Common Lisp and wouldn't make much sense in other languages. This is a good book to help you think in Lisp.
So I recommend this book. Though I'm unsure I'll ever manage to absorb most of the information in it.
I got through the first four chapters of PAIP yesterday and today. It's good so far. I had reservations about spending $80 on a book, but it's 950 pages and almost all of it is good stuff, from what I can see at a glance. Little space is wasted on introductory material or tutorials for people who know nothing about programming. He dives right into the good stuff in chapter four.
Already I've picked up a few nice bits of info: interesting standard functions and macros like TRACE, which is good for debugging, and SUBLIS, which takes an association list of (key . value) pairs and replaces each key with its value throughout another list. Common Lisp is such a freaking huge language, in terms of its standard library of functions. Good thing or bad thing? It might be a bad thing if not for the fact that you load Lisp once and it persists forever.
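Roughly what SUBLIS does, sketched in Python (my analogue, not the real thing: actual SUBLIS walks cons cells and takes an association list, not a dict):

```python
def sublis(bindings, tree):
    """Replace every occurrence of a key from `bindings` with its
    value, everywhere in a nested list. A loose Python analogue of
    Common Lisp's SUBLIS."""
    if isinstance(tree, list):
        return [sublis(bindings, item) for item in tree]
    return bindings.get(tree, tree)

sublis({"x": 1, "y": 2}, ["x", ["y", "z"]])  # → [1, [2, "z"]]
```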
Norvig makes an interesting point about the REPL. What do most programs need to do? Take input from the user, do something with it, and give output back to the user.
To get input, your program can read from command-line args, or prompt and read from STDIN, or take it out of text fields in a Qt app, or read it from a config file. But it's generally always strings. So you get some strings, and then what? You have to figure out what the strings say, and either turn them into something usable or use them to decide what action to perform. So you may have a dedicated library to parse command-line args and a dispatch table to map them to function calls. Or you may match the strings against a regex and, depending what they look like, turn them into your language's representation of an integer or float using some "parseInt" or "parseFloat" function. Or if your strings are XML, you may parse them into some big funky XML structure, then access bits of that structure to figure out what to do.
In Lisp on the other hand, you happen already to have a powerful reader/parser at your disposal: the REPL itself. It already knows how to read string representations of lists, numbers, pathnames, and many other things, and it can turn them into Lisp objects for you with no effort on your part. If you were using Ruby or Perl, short of eval, you wouldn't get anything close to that capability in your program.
What's more, the REPL knows how to read string representations of Lisp code (new functions, function calls, etc.) and evaluate that code, ending up with Lisp objects. It would be as if you wrote a C program which prompted you for input, and the input could be a string representing a new C program; your main program would then call GCC to compile and run that input, and use the results (strings, or in-memory C structures) as its own data. Except even that wouldn't be nearly as powerful as what Lisp is doing.
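The closest safe thing I know of in a mainstream language is Python's ast.literal_eval, which parses a string into real data structures without evaluating code. It only covers literals, though, where the Lisp reader handles code and data alike:

```python
import ast

# Strings become real Python objects with no hand-written parser:
config = ast.literal_eval('{"depth": 3, "paths": ["/tmp", "/var"]}')
point = ast.literal_eval("(1.5, 2.5)")

# config["paths"] is a real list and point is a real tuple. Plain
# eval() would also work, but it happily executes arbitrary code,
# which the Lisp *reader* (as opposed to the evaluator) does not.
```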
So if you can figure out how to shape your input into a form that the Lisp reader can read, you're set. And it so happens sexps are a great representation for a great many things. Maybe this is part of why Lispers seem to worship the REPL and shun compiling their apps into command-line, standalone executables? Maybe this is part of what's meant by the oft-quoted-to-death Greenspun's 10th Rule?
Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp.
Much more PAIP to follow. It should be enjoyable. There's nothing like a good book.
Paul Graham's On Lisp is available in PDF form (and Postscript) for free download, so I thought I'd give it a read. It's an older book (early 90's) and you can tell. It has a lot of good knowledge if you can make it past some of the things that obscure the more relevant parts.
A lot of the code given in the book favors efficiency over simplicity and readability, which is probably a product of the book being 15 years old and written by someone who's clearly been programming for a lot longer than that. It's something I notice a lot in older books. Of course there was a good reason for it: you had to worry about efficiency if you wanted your program to finish before next year.
Given today's computers and the kind of programming I tend to do personally, I don't care about efficiency beyond making my code "fast enough". Very rarely does this ever mean manipulating my code to make it tail-recursive so my compiler can optimize away some of my function calls, which Graham devotes a lot of effort to in the book. Someone once said something about "premature optimization" which I'm sure everyone is familiar with and may apply here.
(But I have a ton of respect for people who programmed computers back when every clock cycle and every byte of RAM was precious. I'm not sure how I could've ever managed in that kind of world. Sometimes it seems like you had to be a genius to get anything done back then. Maybe necessity was the mother of invention though.)
Common Lisp was apparently just being standardized or heavily revised when this book was written, judging by much of what Graham writes. A lot of stuff I read in Practical Common Lisp that Seibel glossed over as "something the old-school Lispers used to do" is in full effect in Graham's book, and a lot of what Graham mentions as "new" is probably very solidly standard nowadays. Graham uses lambdas for EVERYTHING: data storage, algorithm optimization, sometimes even just to introduce a new lexical scope. And our primitive friends the CAR and CDR and their many cousins make plenty of guest appearances in this book, to the detriment of my poor brain.
Graham also glosses over a lot of the code in his book and leaves it to speak for itself, making this probably a book intended for far more advanced Lispers than I. For an intro to Lisp I still highly recommend Seibel's book; his code and prose are so transparent and easy to understand that it's a joy to read. Graham's book reads more like a math textbook.
If you can get past all of those things, there's clearly a lot to learn from the book. It's really amazing some of the things you can pull off in Lisp, and Graham gives a great demonstration. Many of the design principles that Graham lists as goals for his new language Arc are explained really well in this book, so it's worth a read if you're interested in that. It's funny that a book about Lisp often blurs into a book about programming language design, probably because as Graham says, when you write Lisp you often build the language up to your program at the same time you build your program down to the language.
I got K&R (the ANSI C version) as one of the going-away presents from my current / last job, and I just finished reading it.
A lot of what's in the book is sort of review for me now, but there were some things I'd never seen before. For example, in the book they like to put function declarations inside other functions instead of globally at the top of the file. I never even thought to do that, for whatever reason. I was also unfamiliar with some of the more obscure corners of the C language, like register variables. Apparently the book was written just as ANSI was becoming (or had just become) the standard, so it's also fun to get a small glimpse at how things were before that. (They looked to be a mess.)
It's definitely the kind of book I wish I'd had back in high school when I was just learning to program. For example the very straightforward explanation of how file handles work in Unix systems is nice and clear, and is one of those things I wondered about for years in my youth: How is it that I can access stdout, stdin and stderr without doing anything? Where are those handles coming from? They seemed to be there like magic, and I never knew if it was always safe to assume that I'd have those available in every programming language I used on every OS I tried, or what. Likewise, I remember in the vast majority of programs I wrote in high school, I would trip up over I/O buffering. K&R explains that really well also.
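The punchline, sketched in Python rather than K&R's C: the three standard streams are nothing more than file descriptors 0, 1, and 2, opened for the process before your code ever runs.

```python
import os
import sys

# The "magic" handles are just these three descriptors, inherited
# from the shell; that's why every language on every Unix has them.
STDIN_FILENO, STDOUT_FILENO, STDERR_FILENO = 0, 1, 2

# Writing through the raw descriptor bypasses stdio buffering
# entirely, which is why stderr output tends to appear immediately:
os.write(STDERR_FILENO, b"unbuffered\n")

# sys.stdout goes through a buffer; flush to force it out now.
# Forgetting this is the classic I/O-buffering trip-up.
sys.stdout.write("buffered\n")
sys.stdout.flush()
```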
Things that everyone "just knows" and assumes that you also know are the biggest pitfall in learning computers, to me. Unix / Linux seems chock-full of that kind of crap. How do you know to try man to figure out how to use a command? How do you know that ~ means "home directory"? How do you know that you start X via startx? If you don't know those things, short of asking someone else, it's very hard to discover how to do them. For people who know how to do it, it's such common knowledge that no one bothers to write it down. It's a huge leap that people have to make when first learning Linux or programming in general. A book like K&R is great because it gives you a starting point, and they explain pretty much everything. "Comprehensive" is probably the word I'm looking for.
In any case, as a rule, I detest low-level programming, of the "bit-twiddling" form. But there's obviously a need to know how to do it at times, and I think if I was more in the practice of falling back to C or other low-level languages when the problem at hand demanded it, I would probably be a better programmer. So I'm glad to try to learn more of it.