Monthly Archives: October 2010

Reading list for techies

Inspired by the Atlantic Monthly Tech Canon – books every techie ought to have read – I have volunteered to put together a reading list for computer science students. This is not a list of textbooks but a list of more general reading on tech topics (computing and others) that will help them broaden their understanding of technology and science. This is a ‘work in progress’ – some of my colleagues are making suggestions and these will be added when they give me more information about them. We want perhaps 20 books in total that we think every computer science student should have read.

Perrow, C. (1984). Normal Accidents: Living with High-Risk Technologies. New York: Basic Books. Reprinted 1999. [Ian S.]

Case studies of a number of accidents and a discussion of why accidents are inevitable in technically complex systems. Looks at risk from a social perspective and discusses why we can’t eliminate risk by technology alone. In fact, adding technology that is supposed to reduce risk can have the opposite effect, as it increases complexity and so increases the likelihood of unexpected couplings between elements of the system.

Nick Carr. (2008). The Big Switch: Rewiring the World from Edison to Google. W.W. Norton & Co. [Ian S.]

An introduction to cloud computing and how it will change the way we use computing systems. All students should know something about the cloud even if it isn’t a course module.

Clay Shirky. (2009). Here Comes Everybody: How Change Happens when People Come Together. Penguin. [Ian S.]

All about social media and how its use is fundamentally changing the way people interact and how it can empower group action. Gets across the importance of social media and how it isn’t just about interactions with ‘friends’.

Henry Petroski. (1992). To Engineer is Human: The Role of Failure in Successful Design. Vintage Books. [Ian S.]

Discusses how engineering design is all about learning from failure and changing design practice to avoid the same failures. It’s important that CS students have an understanding of design in other disciplines, constrained by the laws of physics. It is worrying that in software engineering, we haven’t really learned these lessons.

Douglas Coupland. (1995). Microserfs. Harper Collins. Reprinted, 2004. [Ian S.]

Fictional life of a group of geeks (and their search for a life) at Microsoft and at a startup in Silicon Valley. Good fun; I suspect it’s a pretty accurate portrayal of those times, and Coupland seems to have learned enough about the technology to make it believable.

Don Norman (1988) The Design of Everyday Things [Aaron Q.]

This is a very easy-to-read yet insightful book on what makes for good and bad design in the everyday objects around us. Some of the examples will make you laugh, some will make you cry, and some will make you think: could I have designed that? Originally titled The Psychology of Everyday Things, the book walks us through elements of psychology, knowledge and knowing, and concludes with a discussion of the design challenges we face and how “User Centred Design” can be employed. After reading it, any computer scientist will start to look at the physical and digital objects in their work and life differently.

Ed Tufte (1990) Envisioning Information [Aaron Q.]

This book is a beautiful exposition of various visual design concepts, drawing on rich examples from both our recent and more distant past. Computer scientists interested in Information Visualisation might want a text on algorithms, methods and visual toolkits. This is not that book; there are many other books which address those topics. Instead, Tufte demonstrates how elements of graphic design and information display can alter our thinking, argumentation, confidence, resolve and aesthetic sense. Importantly, he draws on a range of physical printed graphical displays (maps, charts, diagrams etc.) to offer practical advice on how to explain complex material by visual means. This book challenges us to start looking at how both desktop and mobile graphical displays might better present information.

Neal Stephenson. Cryptonomicon [Al D.]

A fictional epic featuring everything for a computer scientist: Bletchley Park, encryption, Van Eck phreaking, protecting electronic assets, Internet banking (did I mention gold, submarines and the Second World War?). If you like this, you should also read Snow Crash by the same author.

Charles Stross Accelerando [Al D.]

This is another amazing book in the cyberpunk genre. It features a myriad of ideas, including the world being turned into computronium, digitised humans being sent to the other side of the Universe in a Coke can, and the idea that no advanced civilisation would want to invade the Earth because we are too far away from the centre of the Universe and consequently there would not be enough bandwidth! Once you read this one, read his other novels – brilliant.

T. DeMarco and T. Lister. Peopleware [Al D.]

This is a classic – everything you need to know about how to work in a team (and how not to). Lots of advice about the working environment, which I try to follow whenever I can.

William Poundstone. How Would You Move Mount Fuji? [Al D.]

This book is subtitled “Microsoft’s Cult of the Puzzle – How the World’s Smartest Company Selects the Most Creative Thinkers”. Most Computer Science students will be looking to get a job sometime. This book contains what are essentially interview questions – many of them based on what we teach in CS2001 – such as: how do you find out if a linked list has a loop in it…
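
(Aside: for anyone who hasn’t met that last question, one classic answer is Floyd’s ‘tortoise and hare’ cycle detection. The sketch below is a minimal Python illustration – the Node class is just a hypothetical helper, not something taken from the book or from CS2001.)

    # Floyd's cycle detection: advance one pointer by one step and another
    # by two; if the list contains a loop, the two pointers must eventually meet.
    class Node:
        def __init__(self, value):
            self.value = value
            self.next = None

    def has_loop(head):
        slow = fast = head
        while fast is not None and fast.next is not None:
            slow = slow.next
            fast = fast.next.next
            if slow is fast:
                return True
        return False  # the fast pointer fell off the end, so there is no loop

    # Example: a three-node list whose tail points back to its head.
    a, b, c = Node(1), Node(2), Node(3)
    a.next, b.next, c.next = b, c, a
    assert has_loop(a)

It runs in O(n) time and O(1) extra space, which is usually the answer the interviewer is fishing for.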

Charles Murray and Catherine Bly Cox. Apollo: The Race to the Moon

This book is amazing but very hard to find (so I include this link: http://www.amazon.co.uk/gp/product/0436302241/). It is all about how the Americans put a man on the Moon, but it is really about projects – how to run them and also how not to. I would really recommend this book to anyone, but especially to anyone who thinks they can manage projects (or wants to).

Goldacre, B. (2009). Bad Science. Harper Perennial. [Saleem B.]

Describes how data can be misused to present what appear to be “facts”. Includes many examples from real stories reported in the press and in scientific publications, covering topics from medicine to the public misunderstanding of science. It is scary how much completely wrong stuff people can believe to be true.

Hannam, J. (2010). God’s Philosophers: How the Medieval World Laid the Foundations of Modern Science. Icon Books Ltd. [Saleem B.]

Most people think that science is a relatively modern phenomenon and that the Dark Ages were an intellectual void. This book discusses some key inventions, such as spectacles, the mechanical clock, the compass and gunpowder, and places them in their historical context in the Middle Ages of Europe. (Shortlisted for the Royal Society Prize for Science Books 2010)

du Sautoy, M. (2010). The Number Mysteries: A Mathematical Odyssey Through Everyday Life. Fourth Estate [Saleem B.]

An explanation of how mathematics impacts our everyday lives, from playing football to shopping on the Internet. The examples from everyday life, presented in a very accessible manner, are what make this book such a good read. (Prof Marcus du Sautoy delivered the 2006 Royal Institution Christmas Lectures)

Stevens, W. R., Rago, S. A. (2005). Advanced Programming in the UNIX Environment. Addison Wesley; 2nd Ed. [Saleem B.]

The title says it all – it’s the Daddy.

Eco, U. (1980). The Name of the Rose. Vintage Classics; New edition (1 May 2008) [Saleem B.]

First published in 1980, this is a medieval murder mystery involving coded manuscripts and a brotherhood of conspiratorial monks. A really good whodunnit.

 

3 Comments

Filed under Reading

What is complexity?

I’m part of the LSCITS project where LSCITS stands for ‘Large Scale Complex IT Systems’ and we have been having discussions about what is meant by ‘complexity’. Some argue that the term complexity should be reserved for complex adaptive systems, systems which are dominated by emergent behaviour. Others argue that ‘conventionally’ engineered systems can also be complex in that their non-functional characteristics and (sometimes) their functional behaviour cannot be predicted. This is particularly likely when we create systems by integrating different parts (often other systems) which are independently developed and managed. In such cases, it is practically impossible to predict how the characteristics of one system will interfere with the characteristics of others.

We suggested that ‘complex’ and ‘complicated’ were not the same. A complicated system is one that is understandable in principle although, in practice, the effort involved may be so great that it cannot be understood. This was my own view at one time, but I’ve now changed my mind and think that there is no practical difference between a complex and a very complicated system. This position has emerged from musings on the roots of complexity.

Some systems are inherently complex – we cannot deduce their properties by studying their components, and we cannot predict the consequences of changes to these systems. System behaviour and properties are emergent and non-deterministic. I believe that such inherent complexity stems from the fact that there are dynamic, dependent relationships between the parts of the system. These relationships evolve over time and in response to stimuli from the system’s environment. New relationships may be created and existing relationships may change. As a consequence, deterministic modelling techniques cannot be used to make predictions about such systems, although statistical approaches may be used in some cases.

As soon as you consider the people who use a system to be part of the system, you have dynamic, dependent relationships between components of the system, so I argue that all large socio-technical systems can be considered to be complex systems.

There is also another aspect to complexity – what might be called epistemic complexity. This relates to the predictability of system properties when changes are proposed. If you don’t have enough knowledge about a system’s components and their relationships, you cannot make predictions about it, even if, in principle, that system does not have dynamic dependent relationships between its components. Therefore, I argue that large complicated systems are also complex systems when it is practically impossible to acquire the necessary knowledge to understand the system.

This means, of course, that complexity is not just a property of the system but also of the system observer. We have all encountered system experts who know about some system and can make changes to it in a reliable way. Their knowledge is hard to articulate and when they are no longer available, someone taking over the system may find it impossible to develop the same level of understanding. Therefore, what was a complicated system has become a complex system.

Where is all this leading – who cares?

Well, I think it is important to emphasise that complexity isn’t simple. Striving for a simple, universal definition of complexity isn’t really going to get us anywhere.

If we are to try to manage complexity, we need a toolbox of theories and methods to do so. To give an example, if you think about dynamic dependencies between components, the formal methods of computer science don’t really help. However, if you think about epistemic complexity, they may be very useful indeed, as they allow us to state ‘truths’ about a system – filling in our knowledge about that system.

The notion of dynamic, dependent relationships may also be useful in helping us manage complexity. By developing a better understanding of such relationships (e.g. through socio-technical analysis of organisations), we may be able to change the type of these relationships from dynamic to static and hence reduce the complexity of the system.

As I said, complexity isn’t simple so there’s lots of scope for disagreement here.

5 Comments

Filed under LSCITS

The Shallows, Nick Carr

I’ve just finished reading The Shallows by Nick Carr, his latest book after The Big Switch, which ‘sold’ cloud computing. This book has quite a different tone from the technological optimism of The Big Switch – here, Carr has turned against the Internet that feeds him and suggests that the use of the Internet, with its multi-channel, multi-modal communication, is having serious effects on our ability to think deeply and concentrate for extended periods. In essence, his argument is:

1. Our brains dynamically reconfigure themselves to allow us to do common activities better, but an inevitable consequence of this reconfiguration is that we get worse at things we don’t do so often.

2. Electronic communication through the Internet is essentially shallow and is based on superficial impressions rather than deep reading and analysis.

3. Therefore, our brains are changing so that we can cope with many channels and many modes of communication, but we are losing our capacity for contemplation, reflection and analysis.

Carr bases the argument on how he feels that his own thinking has changed, and I can sympathise with this. Certainly, I read less in depth than I used to and am much more likely to skim technical material than to spend a lot of time thinking it through carefully. I had put this down to the nature of my job(s), where I do lots of different things, but maybe Carr has a point and this is a real physiological change.

Is Carr right or wrong here? Hard to say, but the consequences of his being right are quite profound (especially for authors of textbooks!) and the precautionary principle should apply. If we do something to encourage deep reading and he is right, then we win; if he is wrong, we lose very little. So, I guess that there are two courses of action:

1. From a personal perspective, it seems pretty important to me that I spend time away from the computer reading and thinking about what I read.

2. From an educational perspective, we need to design our courses so that deep reading and reflection are required of our students. They simply should not be able to get a degree without demonstrating this ability to read, analyse and critique. This is expensive in teaching time and, sadly, budget and other pressures mean that many institutions are going the opposite way – courses are constructed from fragments and students may never have to read a book or research paper.

Oh, and it seems that reading a paper book is qualitatively different from reading an electronic book. So, don’t rush to buy an iPad or Kindle.

Having said all this, the book itself is, like The Big Switch, rather repetitive. He makes the same point several times in not-so-different ways. Really, an extended article rather than a book would have been better. Or is it just that I can’t cope with books any longer?

1 Comment

Filed under Book review