Computer science, to judge by how it is represented in the popular media, has an image problem. It seems that whenever a theme of deep computational need lurches into the plot of a blockbuster movie it is addressed by a heavy-set, socially awkward individual in an unusually loud shirt slouching behind a wildly customised workstation surrounded by junk-food debris and beverage cans. This sweaty, bespectacled figure saves the world by means of 120-word-per-minute typing and rapidly delivered techno-babble, while green pseudo-code from the screen inexplicably scrolls across his face. (Yes, “his” face.)
If we are to wean folk away from this surreal representation of computer science, we need a way of introducing people to the elegant structures and processes that define the subject in real life. Thankfully, this engaging book by Martin Erwig does just that, providing a thoughtful and approachable guide to the fundamentals of computer science as an intellectual discipline. Turning the tables on traditional approaches, he strips away the usual reliance on the hardware model to concentrate on how the essential processes of computation are defined and expressed. He does this by taking apart well-known stories, from Hansel and Gretel to Harry Potter, to show how they are logically constructed and how you would express them in the form of algorithms.
Starting at the simplest level, we are led through the process of decomposing problems into smaller units that can be more easily addressed. From this basis, the narrative builds quickly to embrace concepts such as representation and data structures, control structures and recursion. In each case, an appropriate scene from a popular story provides a walking tour through the learning process, giving the reader a clear trail to follow and helping to bring out the essential links between well-understood scenarios and the world of computation.
While the cultural themes are popular and accessible, there is nothing lightweight about the content. Erwig has worked hard to make sure the book gives readers with little or no technical knowledge access to a fundamental understanding of computer science. It also delivers an agreeable guide to problem-solving in the broadest sense. Splitting the text into two sections, the author teases out the roles of algorithms and languages. The discussion of language is especially interesting as it uses the example of musical notation as a representational language, which helps dispel some of the mystique of software syntax and rules. The musical score is not used in isolation but is woven into the story of The Wizard of Oz.
Sometimes, you just need to take a fresh approach to a subject. Erwig helps to disperse the fog generated by fictional representations of computation and brings a sense of the obvious delight he takes in his subject.
John Gilbey teaches in the department of computer science at Aberystwyth University.
Once Upon an Algorithm: How Stories Explain Computing
By Martin Erwig
MIT Press, 336pp, £22.95
ISBN 9780262036634 and 9780262341684 (e-book)
Published 29 September 2017