Cloud Street

Monday, January 30, 2006

Home again

So, I'm a researcher. (At least until the money runs out next year; hopefully I'll have something similar lined up by then.) Before I was a researcher I was a freelance journalist for about six years, while I did my doctorate; before that I was a full-time journalist for three years; and before that I worked in IT. Which is a whole other dark and backward abysm of time - I was a Unix sysadmin, and before that I was an Oracle DBA, and before that... database design, data analysis, Codasyl[1] database admin, a ghastly period running a PC support team, and before that systems analysis and if you go back far enough you get to programming, and frankly I still don't trust any IT person who didn't start in programming. (I'm getting better - at one time I didn't trust anyone who didn't start in programming.)

Now, there's an odd kind of intellectual revelation which you sometimes get, when you're a little way into a new field. It's not so much a Eureka moment as a homecoming moment: you get it, but it feels as if you're getting it because you knew it already. You feel that you understand what you've learnt so fully that you don't need to think about it, and that everything that's left to learn is going to follow on just as easily. Which usually turns out to be the case. The way it feels is that the structures you're exploring are how your mind worked all along - or, perhaps, how your mind would have been working all along if you'd had these tools to play with. (Or: "It's Unix! I know this!")

I had that feeling a few times in my geek days - once back at the start, when I was loading BASIC programs off a cassette onto an Acorn Atom (why else would I have carried on?); once when I was introduced to Codasyl databases; and once (of course) when I met Unix, or rather when I understood piping and redirection. But the strongest homecoming moment was when, after being trained in data analysis, I saw a corporate information architecture chart (developed by my employer's then parent company, with a bit of help from IBM). Data analysis hadn't come naturally, but once I'd got it it was there - and, now that I had got it, just look what you could do with it! It was a sheet of A3 covered with lines and boxes, expressing propositions such as "a commercial transaction takes place between two parties, one of which is an organisational unit while the other may be an individual or an organisational unit"; propositions like that, but mostly rather more complex. I thought it was wonderful.

Fast forward again: database design, DBA, sysadmin, journalism, freelancing, PhD, research. Research which, for the last month or so, has involved using OWL (the ontology language formerly known as DAML+OIL) and the Protégé logical modelling tool - which has enabled me to produce stuff like this.
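For the curious: a proposition like the one on that chart ("a commercial transaction takes place between two parties, one of which is an organisational unit...") comes out in OWL something like this. A rough sketch in Turtle syntax, with the namespace and all the class and property names invented for the purpose:

```turtle
@prefix owl:  <http://www.w3.org/2002/07/owl#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix :     <http://example.org/commerce#> .

# An organisational unit is a party; so is an individual.
:Party              a owl:Class .
:OrganisationalUnit a owl:Class ; rdfs:subClassOf :Party .
:Individual         a owl:Class ; rdfs:subClassOf :Party .

:CommercialTransaction a owl:Class .

# A transaction takes place between parties...
:hasParty a owl:ObjectProperty ;
    rdfs:domain :CommercialTransaction ;
    rdfs:range  :Party .

# ...at least one of which must be an organisational unit.
:hasOrganisationalParty a owl:ObjectProperty ;
    rdfs:subPropertyOf :hasParty ;
    rdfs:range :OrganisationalUnit .
```

Lines and boxes, in other words, but lines and boxes a machine can reason over.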

It's not finished - boy, is it not finished. But it is rather lovely. (Perhaps I just like lines and boxes...)

[1] If you don't know what this means, don't worry about it. (And if you do, Hi!)

Thursday, January 26, 2006

The hippies were evil

Like a lot of people, I've been playing around with the Chinese version of Google. If my searches are anything to go by, they don't seem bothered about whether you know about the Dalai Lama, but they do seem to be concerned that you get the right sources on Falun Gong (which is a very bad thing) and the Taishi Village incident (which probably never happened). The nastiest piece of censorship I've seen so far concerns the BBC news site, which seems to be blocked in its entirety. But I've only scratched the surface, and obviously I can't speak for the results of Chinese-language searches.

Here in Britain, Google redirects 'google.com' invocations to google.co.uk, from where you can choose to go to google.com if you really want to. Danny explains: "Google routinely redirects those outside the US to a country-specific version of Google. Those who want to reach Google.com can do so by selecting the "Google.com in English" link on the home page of these versions." (The link on the .co.uk page doesn't specify 'in English'.) So in China it's still possible to use google.com as well as google.cn. Similarly in France and Germany, where google.fr and google.de search results are silently censored to comply with legislation banning neo-Nazism. But it gets worse: thanks to the Digital Millennium Copyright Act there is (visible) censorship on google.com, which appears to be replicated on all the country-specific Googles. (Try searching for 'kazaalite'.) And, of course, we don't know whether there is any silent censorship. Considering that we know of effective public campaigns to bring pressure to bear on Google, it seems unlikely that no pressure has been exerted behind the scenes. (More on google.fr and google.de here; these are some of the sites that are hidden, and here's how it's done.)

David's argument is typical of the bloggers who are calling for Google to be shown some leniency on this one. I don't share his conclusions, primarily because I don't agree with his premise. His argument seems to start from identification with the people at Google who have had a hard choice to make and have done, in their judgment, the best they could: "It's a tough world. Most of what we do is morally mixed", and so forth. But for me the operative metric is not the relative quality of service Google can provide the people of China - which is certainly higher under the google.cn regime - but Google's relative complicity with restrictions on the free flow of information. This matters because of Google's extraordinary position. The company does one thing; the one thing it does - provide information - is an unqualified ethical good; and it does it well. Any complicity with censorship tarnishes the company's ethical reputation; it also threatens its reputation for delivering its service well, since it suggests that this can be compromised by external considerations. The google.cn story threatens Google on both these grounds. Previously, Google was guilty of tolerating censorship; now, it's guilty of assisting censorship.

David concedes that this story "shows once and for all that Google's motto is just silly in a world as complex as this one". I'd go further. Unless you take the (Lutheran?) view that obedience to the government - any government - is a pious duty, Google's co-operation with the Chinese government has made nonsense of their proclaimed commitment to avoid evil. But I'd also add that Google crossed that line some time ago, when they doctored search results to comply with French, German and US law - and, in the case of France and Germany, did so without any indication that results were incomplete.

Ultimately this is a lesson about what Will (in a completely different context) calls digital exuberance - and about the enthusiasm for big business (as long as it's a cool big business) which I identified as a growing element of "Web 2.0". (Hang on to those double-quotes, you'll be glad of them later.) Owen sums it up:
I now notice that the corporate philosophy illustrates "don’t be evil" with the example that advertisements should be unobtrusive; and [Eric Schmidt and Hal Varian, writing in Newsweek] interpreted it to mean that management should not throw chairs. Google never actually said they would not cut a deal with an undemocratic regime to deny information and access to news to hundreds of millions of repressed people. But that was the kind of thing that "don’t be evil" implied to me.

I have some sympathy with Google’s dilemma - they are, after all, a shareholder-owned company, not a branch of Reporters sans frontières. But companies that say one thing and do another eventually get themselves into trouble.

Google was once the underdog; a quirky startup, doing one thing (search) really well: and quickly without all those annoying ads. We got cool free gizmos, like Google Earth and webmail with big storage. And it seemed to have a corporate philosophy that hackers and the internet generation could relate to. Today Google seems a lot more like Microsoft, AOL or any other large corporation. It buys companies to get their technology (what exactly has Google invented, since PageRank?). It introduces Digital Rights Management systems for video. And now it cuts deals with the Chinese government to expand its market, instead of standing up for uncensored access to the internet.

Monday, January 09, 2006

Could do better

"Yes, I know, I know. 2005's over already, we're more than halfway through the decade... you're not the only one who's embarrassed, how do you think I feel? Yes, I know it was supposed to be all m-computing and always-on GPS and ubiquitous wifi jetpacks by now, but it's been difficult, I mean, look at the state of the economy... OK, OK, I shouldn't have said 'jetpacks', forget I said that, no, I am taking this seriously, really I am... but come on, apart from anything else there's been this war going on, that hasn't helped... yes, I know military spending is supposed to be a major driver of high-tech R&D... maybe it just isn't driving R&D the way we'd like it to, have you thought of that? look at Segways, they were going to be the next big thing at one stage... OK, OK, I'll stop trying to change the subject... Look, what can I say? We'll do better this year. I'll do better this year. Trust me. OK? OK."

- Web promises to become more pervasive in 2006

Friday, January 06, 2006

Soft enough for you

Anne:
Is it reasonable to have to learn to ride a bike but expect a computer to be as simple to figure out as a toaster? (Not the perfect analogy I know, but you know what I'm getting at...) Some days I think that user-friendliness was/is a really bad idea, not least because it's obdurate, so hard to change.
If you have to work at using a technology, in other words, you necessarily end up working with it and through it. You work to adapt it to your needs - and you adapt it. Technologies which offer ease of use, by contrast, make it easy to work in certain pre-defined ways - and resist adaptation by the individual user. (There are, of course, technologies which are both easy to use and flexible - ask any Flickr user. But I think the 'user-friendliness' Anne is talking about here is more like the comment a tutor of mine once made on the BBC and 'open access' broadcasting: "They say they'll come and help you, show you how to do it. They don't, of course - what they do is show you how to do what you do because that's how you do it." User-friendliness is very often a matter of HTDWYDBTHYDI.)

But there's more to it than that. What is this thing called obduracy? Anne again:
[Anique Hommels] argues that one way to emphasise the material aspects [of technologies in society] is to focus on their obduracy or resistance to change. (Imagine what it would *actually* take to replace the infrastructure that currently provides our electricity with something more sustainable.) The notion of obduracy is inextricably connected to embeddedness - a matter of interest to any kind of computing that seeks to become part of something else, be it an event, a habit, a skirt, a chair, a building, a street, a city. As Hommels reminds us, obduracy (or embeddedness) is a relational concept:

"Because the elements of a network are closely interrelated, the changing of one element requires the adaptation of other elements. The extent to which an artifact has become embedded determines its resistance to efforts aimed at changing it."
An embedded technology, then, would be one which has behind it a community of people who do a certain thing in a certain way. Becoming a user entails enrolment in that community. In short, the technology adapts you.

Where does this leave user-friendliness? Perhaps we could think of the embedding of a new technology as a process, which can continue to the point of the collapse of the possible ends and uses inherent in the technology and its reduction to the status of tool: a toaster, not a bicycle. And perhaps a 'user-friendly' technology - at least in the HTDWYD sense - is one designed to enlist a tool-using community and collapse its own potential into instrumentality.

(Relatedly, from Dan Hill's essential critique of digital music: "there is a powerful necessity to think long term; to not take such short cuts which may inadvertently delete possible outcomes; to enable the flexibility and endless modifications seen in previous generations of music devices". Dan has a lovely quote from William Gibson: "That which is overdesigned, too highly specific, anticipates outcome; the anticipation of outcome guarantees, if not failure, the absence of grace.")

More broadly, what all this highlights is the value of difficulty, incompatibility, misunderstanding. Dan also led me (indirectly) to this quote from the late Derek Bailey:
There has to be some degree, not just of unfamiliarity, but incompatibility [with a partner]. Otherwise, what are you improvising for? What are you improvising with or around? You've got to find somewhere where you can work. If there are no difficulties, it seems to me that there's pretty much no point in playing. I find that the things that excite me are trying to make something work. And when it does work, it's the most fantastic thing.
One of the great frustrations in my work with ontologies and e-social science is the recurrent assumption that the concepts used in social science data can be documented cleanly and consistently - or, conversely, that if they can't be documented cleanly and consistently they're not worth documenting. The point, surely, is to find ways of recording both the logic of individual classifications and the incompatibilities between them - and the (qualified, partial) correspondences between them. And, of course, to make this documentation changeable over time, without effacing the historical traces which contribute to its meaning. Parenthetically, it's worth noting here that preservation of historical data has nothing to do with obduracy. History is not obdurate, having no power to resist and (by and large) no enrolled community; the erasure of history can facilitate embeddedness and instrumentality, while the preservation of an artifact's history may actually preserve resources of flexibility. (That's enough abstractions - Ed.)
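As it happens, the SKOS vocabulary (still a W3C draft) gestures in this direction with its mapping properties - closeMatch, broadMatch and so on. A sketch in Turtle, with both concept schemes and the scope note invented for illustration, of a correspondence recorded as partial rather than asserted as an identity:

```turtle
@prefix skos: <http://www.w3.org/2004/02/skos/core#> .
@prefix :     <http://example.org/mapping#> .

# Two occupational classifications, each documented on its own terms...
:a-Teacher a skos:Concept ;
    skos:prefLabel "Teaching professional"@en ;
    skos:inScheme :SchemeA .

:b-Teacher a skos:Concept ;
    skos:prefLabel "Teaching professional"@en ;
    skos:inScheme :SchemeB .

# ...and a correspondence recorded as close but not exact,
# with a note preserving the logic of the mismatch.
:a-Teacher skos:closeMatch :b-Teacher ;
    skos:scopeNote "Unlike SchemeB, excludes classroom assistants."@en .
```

The point being that the mismatch itself is documented, not papered over: a later reader can see both what the correspondence claims and what it excludes.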

Thursday, January 05, 2006

We're never together

Back here, I wrote:
Social software may start with connecting data, but what it's really about is connecting people - and connecting them in dialogue, on a basis of equality. If this goal gets lost, joining the dots may do more harm than good.
It's not about connecting machines, either - and the same caveat applies. Via Thomas, I recently read this item about location-based services (which, I remember, were going to be quite the thing a couple of years ago, although they seem to have faded since people started actually getting their hands on 3G technology). Anyway, here are the quotes:
This project focuses on [location-based technology's] collaborative uses: how group of people benefits from knowing others' whereabouts when working together on a joint activity ... we set up a collaborative mobile environment called CatchBob! in which we will test how a location awareness tool modifies the group interactions and communications, the way they perform a joint task as well as how they rely on this spatial information to coordinate.
And how did that work out?
"We found that players who were automatically aware of their partners’ location did not perform the task better than other participants. In addition, they communicated less and had troubles reminding their partners' whereabouts (which was surprising). These results can be explained by the messages exchanged. First the amount of messages is more important in the group without the location-awareness tool: players had then more traces to rely on in order to recall the others’ trails. And when we look at the content, we see that players without the location-awareness tool sent more messages about position, direction or strategy. They also wrote more questions."
Really, we're back with 'push' technology - which was going to be quite the thing round about 1998, as I remember. Give people a device for talking to each other: works. Give people a device which feeds them a constant stream of information: doesn't work.

The trouble is, we've got the technology. The problems with social software are social; see this deeply depressing Register story.
Alongside video on demand TV services from Homechoice, the SDB [Shoreditch Digital Bridge] will offer a "Community Safety Channel" which will allow residents "to monitor estate CCTV cameras from their own living rooms, view a 'Usual Suspects' ASBO line up, and receive live community safety alerts."
...
Other aspects of the Shoreditch Digital Bridge are less controversial, but likely to be considerably harder to execute. The SDB proposes an education channel, "allowing children and adults to take classes, complete on-line homework assignments and log-on to 'virtual tutors'", a "Health Channel" allowing patients to book GP appointments, and providing "virtual Dr/Nurse consultations and on-line health and diagnosis information", a "Consumer Channel, allowing on-line group buying of common services such as gas, electricity and mobile phone tariffs", and an "Employment Channel, providing on-line NVQ courses, local jobs website and virtual interview mentoring."

So within that little lot, the educational aspects will require substantial input from, and involvement of, existing schools and colleges, the Health Channel will need a whole new interface to NHS systems that are already struggling to implement their own new electronic booking systems, and the Consumer Channel will merely have to reinvent the co-operative movement electronically.
But CCTV - ah, now, we've got CCTV...

Will:
Yet again, the technology arrives promising us a vibrant civic and economic future ... then beds down as a means of protecting us from each other.
Or rather, as a means of protecting us from Them (caution - sweary link).

If we're talking about social software or social networks, let's be clear that we're talking about connecting people rather than dividing them. Connecting machines doesn't necessarily help connect people.