I’ve posted a trip report for PerCom 2007 over on RickyRobinson.id.au. Some of you might be interested.
A Climate Change Reality Check?
Anyone who’s interested in the climate change debate (and I’m still of the opinion that there is a debate [update: this was clearly a bonkers statement since the basic science of climate change was settled decades ago; the debate is on what action to take and how to implement that action, as it’s not going well]) should read this two-part paper published in the World Economics Journal at the end of last year. It’s a critical analysis of the Stern Review on the Economics of Climate Change, a document that has arguably done more than any other (save, perhaps for the various IPCC papers) to convince governments of the need to act on global warming. The first part covers the science (written by several climate change experts), and the second covers the economics. Readers of this weblog might be particularly interested in the following excerpt from the critique:
Section 3 is concerned with fundamental issues of scientific conduct and procedure that the Review fails to consider. Professional contributions to the climate change debate very largely take the form of published peer-reviewed articles and studies. It is widely assumed, in particular by governments and the Intergovernmental Panel on Climate Change (IPCC), that the peer review process provides a guarantee of quality and objectivity. This is not so. We note that the process as applied to climate science has tolerated gross failures in due disclosure and archiving, and that peer review is both too inbred and insufficiently thorough to serve any audit purpose, which we believe is now essential for science studies that are to be used to drive trillion-dollar policies.
I think this observation about peer-review processes in the climate science community probably holds true for many, if not most, scientific communities. I’ve certainly seen evidence of inbreeding and insufficient thoroughness within the small subset of the computer science community with which I’m involved. And, in my (still fairly limited) experience, due disclosure barely gets a look-in: authors are rarely asked whether they’ve disclosed all their funding sources and correctly cited all their sources.
New York, NY
I’ve just returned from New York, where I was attending the PerCom 2007 conference in White Plains. The conference was okay. There were some interesting papers on using RFID to do clever things (one I remember in particular used the weaknesses of RFID to do intrusion detection). This year, a few HCI-type papers were accepted, one of which was about interacting with wall-sized video panels. The PerWare and CoMoRea workshops ended with some fairly lively discussions, which is a sign of continued interest in those workshops. My presentations at CoMoRea went well, though I was totally out of it by the end of the second presentation due to a cold or flu I picked up somewhere and am still recovering from. Next year’s PerCom will be held in Hong Kong.
I had the weekends on either side of the conference to explore Manhattan. It was the first time I’d visited New York, so there was a lot I wanted to do. My first notable experience of New York was the freezing cold temperatures and the snow. I happened to arrive on the day that a severe snow storm blanketed large parts of the north-eastern United States, to the extent that JFK, La Guardia and Newark were shut down. When my plane landed at JFK, they’d already grounded most other flights. I ended up sitting around the baggage carousel at JFK with my fellow passengers for more than an hour because – get this – the luggage bay doors had frozen shut. Then, once I’d retrieved my suitcase, I found myself waiting outside in -2° Celsius temperatures for another hour because there was a severe shortage of taxis. Presumably this was because outgoing flights had been grounded, which meant there were no passengers being dropped off by cabs, which meant there were no cabs to convey passengers from my flight. Eventually, it was my turn to jump into a cab, and boy, was I in for a wild ride…
As we drove along an expressway (probably Van Wyck) from JFK towards Manhattan, I noticed there were traffic accidents and bogged vehicles everywhere. The snow and sleet were causing absolute chaos on the roads. Little did I realise that the cab I was in was about to get sideways, too. We were driving along, and all of a sudden, the cab fishtailed and slid across three lanes of expressway towards the centre barrier. Unbelievably, at that moment, there were no cars to the left of us, so we avoided an accident on that count, but there was still the centre barrier to deal with. Somehow, at the last moment, the cab driver managed to straighten the vehicle, and narrowly avoided colliding with the barrier. I’m still not quite sure how he managed to pull it off without even grazing the barrier; I was sure that physics dictated the front left headlight would get smashed, but it was like the cab turned on a pivot at the last second, rather than doing a normal arc turn. At this point the driver said: “Whoah! Do you have your seatbelt on?” I put my seatbelt on as soon as I sat down in the cab!
All the way to my hotel on Central Park South, I was trying to put some life back into my poor frozen fingers by holding them in front of the heater in the back seat. I was glad to finally arrive at the hotel (after our little incident, the driver stayed below 50km/h for the rest of the trip, so it took a while) and retire to my nice warm room.
The next day (Saturday, March 17), I strolled (or rather, trudged) through Central Park, taking a roundabout route to the Metropolitan Museum of Art. The gallery has a number of Vermeers, Rembrandts and Rubens, so it kept me amused for the rest of the day. Saturday was St Patrick’s Day, and New York has a famous St Patrick’s Day parade, a fair bit of which I watched. Apparently it’s quite rare for the parade to take place on snow-covered streets. As I was walking from 5th Avenue back towards my hotel after watching the parade pass, a massive chunk of ice fell from some skyscraper into the street a few feet away from me with an almighty crash. All I can say is I’m glad nobody was underneath it (I’m especially glad I wasn’t underneath it), because that person would not have lived to see another St Patrick’s Day. For the rest of the day I kept looking up at the sky, watching for falling blocks of ice.
On Sunday, I checked out Bloomingdales and then walked to the UN building, which was closed to visitors, and then over to the Rockefeller Center and Times Square. I ended up going to the “Top of the Rock” (that is, the top of the Rockefeller Center), which had an amazing view of Manhattan and surrounding regions. After spending a bit of time browsing various shops, including the Sony shop at the bottom of the Sony Building, I headed back to the hotel to pick up my suitcase and laptop, then hauled everything 21 blocks to Grand Central Station where I caught the train to White Plains.
Upon returning to Manhattan the following Saturday, I went to Battery Park with Matthias, a new friend I’d met at the conference, to get a view of the Statue of Liberty. We walked along the Hudson to the World Financial Center and Ground Zero, the World Trade Center site. It’s unbelievable to think that there used to be two massive towers standing at this site, and although construction on some new buildings seems to be getting underway, there’s still a sadness hanging about the place, and I got a kind of eerie feeling while looking over the site. Matthias needed to fly back to Germany that day, so at 3pm or so, we started to head back to my hotel, where Matthias had left his bags for a few hours.
The next day I took a boat cruise around Manhattan Island, which was very worthwhile. There were some awesome views of the Manhattan skyline and the various bridges connecting Manhattan with New Jersey and Long Island. The guide was a fountain of knowledge about New York. Then I caught a 3 o’clock showing of the Broadway musical The Producers, which was hilarious.
On Monday, I took the subway downtown to Wall Street to photograph the New York Stock Exchange and some other buildings. Trinity Church, between Trinity Place and Broadway at Wall Street, is a beautiful building in the neo-Gothic style (at least, I’d say it’s neo-Gothic from what little I know about architecture). Then I headed back uptown to the Museum of Modern Art, to while away the final hours of my New York trip. I’m not the hugest fan of most kinds of modern art. I dig many of Cézanne’s paintings and a lot of Picasso’s work, but I fail to appreciate anything as abstract as a Pollock or a Mondrian. I was hoping that seeing some of the paintings by Pollock, Mondrian and company up close would give me a new perspective; but, alas, they still didn’t do anything for me. Nevertheless, it was well worth the $20 entrance fee.
One of the highlights of a trip to New York is the food, from the street vendors to the delis to the upmarket restaurants. My modest travel budget didn’t allow me to try any cuisine from the last category, but I did sample the sidewalk fare. One subtly interesting fact is that a large proportion of the street vendors use halal meat, while some of the others are kosher. This was good news for me, as I still refrain from eating pork. In Australia, you might, if you’re lucky, find chicken-based hotdogs at Woolworths or Coles; but from what I hear, they’re a pretty poor substitute for the real thing. Meanwhile, New Yorkers seem to devour these halal hotdogs by the truck-load. I ate at a few different burger joints (Burger Heaven was great), and dined at a few delis. I also tried the Italian restaurant across the street from where I was staying for my second weekend in New York: good food, good wine, good service.
I had a good time in New York. I was a bit wasted during the second half of the trip due to the stupid ailment I picked up, but other than that it was a blast. Really glad to be back home though!
Schulze & Webb: Awesome
Before I go any further with this post, I want to thank Ben for imploring the readers of his blog to check out this presentation from some guys called Schulze & Webb. These days, you get pointers to so much stuff out there on the web, a lot of it interesting, but a lot of it only so-so. Then, occasionally, you’ll come across a gem that truly was worth reading, and the presentation by Schulze & Webb, for me at least, is one of those gems. A word of advice if you do decide to read it, though: read it right through, as there’s a lot of good stuff in it.
I can relate to the presentation, titled The Hills are Alive with the Sound of Interaction Design, and its authors, Schulze & Webb, on a number of different levels. For starters, they use the example of football, specifically that magical goal Argentina scored against Serbia and Montenegro in the 2006 World Cup, to illustrate the concept that the means or the experience is more important to most people than the end result. In scoring that beautiful goal, Argentina strung together 24 passes before Cambiasso struck the ball into the back of the net. Football fans all over the world appreciate that goal because of the lead up to it, not the goal itself. This is also one of the reasons why football lovers can tolerate nil-all draws, and indeed why it can be truthfully said that some nil-all draws are more enjoyable than games in which five goals are scored: there’s so much more to the game than the goals. But football is only the most obvious example. The same can be said of other sports from cricket (the innings-long battle between batsman and bowler, rather than the fall of the wicket) to tennis (the rallies, rather than the rally-ending shot). Anyway, using football to illustrate a neat concept is a sure way to get me on side!
Their presentation also resonates with my recently written “About” page. They both speak of thresholds, boundaries and tipping points. They both talk about figuring out how to develop new things that harmonise with human experience and the human cognitive model (I love their bumptunes hack for the Powerbook; I wonder if the MacBook Pro has an accelerometer?).
Several months before submitting my Ph.D. thesis, I made the decision that I wanted to refocus my subsequent pervasive computing research more towards the user, or at the very least, to ensure that if I was going to be developing middleware to support pervasive computing applications, I would lobby hard to have some time and resources set aside to build decent, cool applications to exercise that middleware. It turns out I didn’t have to lobby that hard! But the point I’m trying to make here is that the Schulze & Webb presentation has provided a timely reminder of why I made that decision to think more about the user in the work that I do: it’s because in the research space I work in, that’s where the rubber hits the road. You can build middleware, context management systems and so forth, but in the end, it’s all in support of what the user wants to do, and it’s a fun challenge figuring out neat applications that people actually want to use because they’re a joy to use.
The challenge in my particular line of work is this: how do you create applications for emergency and disaster prediction, response and recovery which are “fun” to use? How do you design an application for the emergency services sector which creates an experience as pleasurable as watching Argentina’s second goal against Serbia & Montenegro in the World Cup? Is it even appropriate to create fun applications for an industry that, by definition, regularly deals with human tragedy? I hope the answer to the third question is a resounding “yes” if the applications help to save more lives than would otherwise be the case. Perhaps the word I’m looking for isn’t “fun” but “rewarding”. An application that makes its user feel rewarded for using it is a successful application because, presumably, the user will want to continue using it. An emergency worker feels rewarded if they are saving lives and making snap decisions that turn out to be good ones. Therefore, I think a good reformulation of my goal while I remain part of the SAFE project at NICTA is this: to develop rewarding applications (and supporting infrastructure) for the emergency services sector. This isn’t far off my official job description, but what it does is bring into sharp focus the importance of considering the users’ experiences as they interact with the application and system.
Thank you Ben. Thank you Schulze & Webb.
Against academic hermits
In writing about the misconceptions of collaboration, I hadn’t expected that anyone would interpret my article as arguing against any kind of grouping of researchers, but it’s come to my attention that at least one reader has interpreted it that way, and a re-reading of my post tells me that’s a fair enough interpretation of what was written. I’m happy to take the blame for this, as my writing can be kind of loose at times. On the other hand, I’ve received some e-mails indicating that other readers knew exactly what I was trying to get at, as they’d experienced some of the things I was talking about. Nevertheless, I’d like to clarify that I’m all for research teams! I hadn’t considered that research teams of the sort one might find within a single research organisation could be classified as a collaborative arrangement; to me, such a grouping is something more than a collaboration. Rather naively, I had not even thought about the possibility that academics can still get away with working in complete isolation, so I was somewhat stupidly relying on the reader to make some assumptions about what was written. When I spoke of “loosely coupled interactions”, I meant interactions between two or more groups of academics at different institutions, rather than between individual academics. When I said “history is littered with hundreds of examples where this is the case, and very few in which close collaboration between teams of researchers yielded a scientific breakthrough”, I was talking about interactions between multiple teams rather than intra-team dynamics. What I’m arguing against are top-down directives that mandate collaboration between research teams from different organisations with little or no forethought. There’s pressure to form these kinds of collaborations because they attract funding, either directly or indirectly, by serving as a kind of metric that can be counted when seeking future funding.
It has also occurred to me that if the article is read in a certain way, it could be seen as demeaning Australian CRCs. While I do believe it is getting harder for researchers who work in CRCs (and other government funded institutions) to focus on fundamental research, I certainly am not asking for the end of CRCs! Good grief! A certain CRC with which I have had the privilege to be associated will probably always be one of the best, coolest, most awesome places I will ever have worked at, and that’s precisely because of the people who worked there and the creative buzz one felt when interacting with those great people.
I think Kerry actually pinpointed the point I was trying to make:
Yes, but would not a formal collaboration of the correct range of skills in the one lab with a common goal have found it sooner?
The key point is “the one lab”. This is exactly right. But again, as soon as a cohesive unit is formed, to me it ceases to be what I think of as collaboration, and I think this is where Kerry and I were getting our wires crossed. When a research team is formed within a lab, the unit is no longer the researcher: it’s the research team. When collaboration does happen in this context, it is not between individual researchers but between entire research teams. And the fact that the researchers have gathered in the one lab indicates that they’ve all been drawn there of their own accord. It’s a bottom-up process, not a command-and-control one. Furthermore, collaboration between one research team and another one will happen of its own accord where the majority of individuals involved see the benefit. There’s no need for other (politically motivated) incentives.
For what it’s worth, I personally can’t even imagine being some kind of academic hermit working in isolation. I need to be part of a team so I can feed off the vibes and ideas of my co-workers. Hell, I’ve never even written a paper on my own (barring my thesis)! I hope that takes care of any misunderstandings. But knowing my luck, I’ve just dug a deeper hole! Perhaps you’ll be better off reading Kerry’s concise clarification.
RickyRobinson.id.au has served as my personal website for many years now, although I think that it has been rather usurped by The Thin Line in recent times. To address this concern, RickyRobinson.id.au will now serve as my professional website, documenting my research and listing my publications. The Thin Line weblog will now be the place to find anything to do with my personal life, although I’m still wondering whether to host photos on this weblog or to upload them to Flickr. RickyRobinson.id.au has been given a facelift to coincide with this separation of professional and personal content. I hope you like its new look.
Don Norman, respected usability guru, wrote an article on the demise of simplicity as a selling point, and it’s caused reverberations all around the world. In fact, his article has been so controversial that he’s found it necessary to write a clarifying addendum for the essay (added to the bottom of the article), fearing that many of his readers interpreted his article as concluding that simplicity should no longer be a design goal. Norman’s point is that a product with a greater number of features is more appealing than a similar product with fewer features. The “more complicated” product is therefore more likely to sell. In other words, feature creep is driven by the knowledge that consumers will be suckered into paying for a product that looks more complicated, even though, in many cases, they might complain about the difficulty of using the product when they get home.
I think there’s a difference between giving a user too many choices and too many features. Confusion and frustration arise when the user is presented with an array of subtly different choices. Joel Spolsky provides an excellent example: the Windows Vista shutdown menu. Windows Vista provides the user with umpteen slightly different ways of shutting down the computer. Why? On the other hand, providing lots of features that do different things need not frustrate the user, because they are for accomplishing distinct tasks, and the user can clearly separate them in their mind. Take those Japanese toilets, for instance. These toilets have an integrated bidet, dryer, seat warmer, massage options, automatic flushing and so on and so forth. The existence of these features does not mean that the toilet isn’t simple to use, per se. If, however, each of those features had a confusing list of subtly different settings, then that could be a problem!
Norman’s essay would have been easier to read, and caused less confusion, had it been written more carefully. The following is just the most confusing of a number of errors that can be found in his article:
Notice the question: “pay more money for a washing machine with less controls.” An early reviewer of this paper flagged the sentence as an error: “Didn’t you mean ‘more money’?” the reviewer asked.
But it already says “more money”. Somehow Norman and his reviewer have conspired to introduce an error that is similar to the one they were seeking to avoid. If that’s not irony, I don’t know what is.
NOTE: A clarification of this article has been posted here.
In the latest edition of The Australian’s Higher Education Supplement, Julian Cribb (Adjunct Professor of Science Communication, UTS) voices his dissatisfaction with current scientific research policy in this country (The Australian, HES, page 23, 20/12/2006) by drawing on the findings of a Productivity Commission report. He has written what hundreds, if not thousands, of researchers must surely be thinking: Australian science policy is a failure. Rather than simply reiterate the arguments of Professor Cribb and the Productivity Commission, in this article I wish to highlight what I believe to be an unjustified emphasis on research collaboration, particularly in the formative stages of a research project.

History shows, emphatically, that the most important scientific discoveries and theories and the greatest inventions have come about, not through intense collaboration between organisations, but rather as a result of the ingenuity of one or two (often brilliant) minds. It is in the application of these discoveries, theories and inventions that collaboration is of most benefit, not in the forming of the ideas in the first place. History is littered with hundreds of examples where this is the case, and very few in which close collaboration between teams of researchers yielded a scientific breakthrough. Certainly, loosely coupled forms of cooperation are a mainstay of scientific advancement; after all, isn’t academic communication via peer review and publishing a form of cooperation whereby ideas are circulated throughout particular research communities? But this kind of cooperation is clearly different from the kinds of collaboration that researchers in Australia and other parts of the world are being forced to engage in by (government) funding bodies. Where successful research collaboration does occur, it happens in a bottom-up fashion whereby the benefits of collaboration are immediately obvious to the individual researchers involved.
The movement of the Earth around the Sun, the model of the structure of an atom, evolution, DNA and relativity – these are all theories or discoveries which have had a profound effect on the way we see the world we live in. None of them were the result of collaboration between research organisations, and they certainly were not born of a collaboration between researchers and industry. Even if we consider a less fundamental breakthrough of the modern day, such as Google’s PageRank™ algorithm, which signalled a substantial leap forward in how the World Wide Web is searched, we can see that the algorithm was the result of a convergence of the ideas of two university students who happened to meet more or less by chance.
Why, then, do so many Australian government funded research organisations emphasize the need for research collaboration, when all the evidence shows that few significant scientific breakthroughs have come through such collaboration? Granted, there needs to be some kind of collaboration between research institutions and industry when it comes to exploiting the results of research, but this is a completely different thing, and it comes at a later stage in the development of a research idea. In the technology sector, for instance, the mean time between the conception of a new technology and the formation of a billion dollar industry is twenty years [1]. Take the computer mouse, conceived by Douglas Engelbart in 1963 and refined by researchers at the fabled Xerox PARC lab through the 1970s. Although the first commercial computer mice were shipped with Xerox workstations from 1981, it wasn’t until the Apple Macintosh came onto the scene in 1984 that the point-and-click paradigm really took off. Other computer and communications technologies, from relational databases to local area networks, followed similar trajectories [2]. I include this information only to show that research-industry collaboration might be important for commercialisation of research, but it has not been shown that this sort of collaboration is important for coming up with good ideas in the first place. To maintain consistency with Professor Cribb’s article and the findings of the Productivity Commission, I should add that the current trend towards requiring researchers to seek out commercial support for their research, in order to keep their projects afloat for longer than the typical three year funding term, is counter-productive.
It means that either the research being conducted in our government funded research organisations is not as cutting edge as it should be (and that, in effect, taxpayers are subsidising incremental industry research), or that the government is flushing taxpayers’ money down the toilet because researchers can’t see their cutting edge research through to a mature state. The government must account for and value the massive positive externalities generated by fundamental research, and, if anything, the onus ought to be on the Australian industry sector to seek out research that it can exploit commercially. After all, it seems that the technology underlying the billion dollar industry of ten years from now was conceived ten years ago. In other words, the next commercial success stemming from fundamental research is already here; industry just needs to find it.
As alluded to earlier, loosely coupled interaction, as it naturally exists in academia, is probably a much better model for research cooperation. Schemes that tie funding to certain levels of collaboration do a disservice to research in this country. Such funding arrangements do nothing but compel researchers to enter into meaningless, time- and money-wasting relationships, and distract them from their core task of doing good research. Where it is beneficial to do so, researchers will enter into collaborative arrangements of their own accord (unless they are masochistic, which is a possibility that we can disregard in most cases), without the need for funding incentives that only serve to distract researchers from the main game.
Middleware 2006
The week before last, Karen and I attended Middleware 2006 along with Jaga and a couple of our students. I attended the Middleware for Sensor Networks workshop (MidSens 2006) to present a paper that Karen and I wrote, and Karen was running the Middleware Doctoral Symposium (MDS 2006). MidSens and MDS were on the same day, which is why we both got to go to Melbourne to attend the conference. We spent the first day at the Advanced Data Processing in Ubiquitous Computing (ADPUC 2006) workshop.
From all accounts, MDS was a real success, and I’m a little disappointed that I couldn’t attend. Karen was able to get some pretty well-known people, including Maarten van Steen and Michi Henning, to play the role of mentors/panelists for the day. I think MidSens was also successful. It had a kind of buzz around it. ADPUC could have done with a few controversial papers to get some discussions going. The Middleware conference itself, which ran for the last three days of the week, was a mixed bag. The papers, in general, were of high quality (apparently many high quality papers were rejected), but as would be expected of a conference with such a broad theme, not all the papers appealed to my interests. One thing I did learn is that the Middleware conference might not be a bad place to try to submit my own papers, since there were a couple of papers in the area of pervasive computing.
I still find it unbelievable that The University of Queensland, one of Australia’s largest universities and a member of the Group of 8, does not appear on Google’s list of universities. Apparently, the way a university gets onto the list is for large numbers of its students and faculty to ask Google to add it using a web form. Once you’ve completed the form, you’re asked to tell other people from your university to complete it too. Does the absence of UQ from Google’s list reflect an underlying apathy of UQ students and staff towards their university? Surely not. I have blogged on this topic before, but nothing seems to have changed in the meantime. I am hereby starting the campaign to get UQ on Google’s list. UQ students, staff and alumni, do your bit!