Archive

Interface innovations

Ian’s experience of using Android Wear echoes my own in large part, especially this paragraph:

It’s also much, much less intrusive in social situations. Glancing at your wrist for a second to check an alert lets you stay more present in the conversation which is happening around you than ferreting around in your pocket, dragging out your phone, switching it on, checking whatever and putting it back. And of course with the phone, you’ve got the temptation to keep it on the table in front of you, glance at it, maybe see what Twitter is talking about… all of which breaks the social contact you’re having in the real world.

To which I’d add that the physical gesture of glancing at one’s watch is something we’re pretty much globally comfortable with in social situations, unlike, say, getting your phone out and trying to maintain a conversation…

Umeå

I got invited to northern Sweden by the lovely folks at Umeå Institute of Design and Tellart.

Umeå Design School

It was a fantastic couple of days, where ideas were swapped, things were made and fine fun was had late into the sub-arctic evening…

Umeå

It was their first (and hopefully not the last) Spring Summit at the Umeå Institute of Design, entitled “Sensing and sensuality”.

Umeå Institute of Design Spring Summit, "Sensing and Sensuality"

I tried to come up with something on that theme – mainly half-formed thoughts that I hope I can explore some more here and elsewhere in the coming months.

It’s called “Data as seductive material” and the presentation with notes is on slideshare, although I’ve been told that there will be video available of the entire day here with great talks from friends old and new.

Thank you so much to the faculty and students of Umeå Institute of Design, and mighty Matt Cottam of Tellart for the invitation to a wonderful event.

Apple’s iPhone 3.0 announcements caused a kerfuffle today, but it seems to me insane that the thing that’s being talked about most is… Cut and Paste?

At the time the event was running I summed my feelings up in <140 chars thusly:

Twitter / Matt Jones: of course, while I'm shaki ...

I mean – they’d announced that you could create custom UIs that worked with physical peripherals – they’d had someone from Johnson &amp; Johnson on stage to show a diabetes sensor companion to the iPhone – the nearest thing to AP’s Charmr you could imagine!

Then my friend Josh said:

“Am now wondering whether a bluetooth/serial module and arduino will be able to talk with iPhone. And, pachube”

A rapid prototyping platform for physical/digital interactions? A mobile sensor platform for personal and urban informatics that’s going mainstream?

Imagine – AppleStores with shelves of niche, stylish sensor products for sale in a year’s time – pollution sensors, particulates analysis, spectroscopy, soil analysis, cholesterol? All for the price of a Nike+ or so?

Come on, that’s got to be more exciting than cut and paste?

—–
UPDATE

Tom Igoe points out in his comment correctly that I have been remiss in not mentioning Tellart’s NadaMobile project from late last year – which allows you to easily prototype physical/digital/sensor apps on the iPhone through a cable that cleverly connects to the audio jack. It’s also totally open-source.


Warning – this is a collection of half-formed thoughts, perhaps even more than usual.

I’d been wanting for a while to write something about Google Latitude and the other location-sharing services that we (Dopplr) often get lumped in with. First of all, there was the PSFK Good Ideas Salon, where I was thinking about it (not very articulately); then shortly after that Google Latitude was announced, in a flurry of tweets.

At the time I myself blurted:

Twitter / Matt Jones: “i still maintain perhaps…”

My attitude to most Location-Based Services (or LBS in the ancient three-letter-acronymicon of the Mobile Industry) has been hardened by sitting through umpty-nine presentations by the white-men-in-chinos who maintain a fortune can be made by the first company to reliably send a passer-by a voucher for a cheap coffee as they drift past *bucks.

It’s also been greatly informed by working and talking with my esteemed erstwhile colleague Christopher Heathcote, who gave a great presentation at Etech (5 years ago!!! Argh!) called “35 ways to find your location”, and who, at both Orange and Nokia, has sat through many of the same be-chino’d presentations.

He’s often pointed out, quite rightly, that location is a matter of routine. We’re at work, at college, at home, at our corner shop, in our favourite pub. These patterns are worn into our personal maps of the city, and usually it’s the exceptions that we record or share – a special excursion, or perhaps an unexpected diversion, pleasant or otherwise, that we want to broadcast for companionship or assistance.

Also, most of the time, if I broadcast my location to trusted parties such as my friends, they may have limited opportunity to act on that information – they, after all, are probably absorbed in their own routines, and by the time we rendezvous it would be too late.

Location-based services that have worked with this have had limited success – Dodgeball was perhaps situated software after all, thriving in a walkable bar-hopping subculture like that of Manhattan or Brooklyn, but probably not going to meet with the same results worldwide.

This attitude carried through to late 2006/early 2007 and the initial thinking for Dopplr – that by focussing on (a) nothing more granular than cities-as-place and days-as-time and (b) broadcasting future intention, we could find a valuable location-based service for a certain audience – surfacing coincidence for frequent travellers.

Point (a): taking cities and days as the grain of your service was, we thought, the sweet spot. Once that ‘bit’ of information about the coincidence has been highlighted and injected into whichever networks you’re using, you can use those networks or other established communications methods to act on it: Facebook, Twitter, email, SMS or even voice…

“Cities-and-days” also gave a fuzziness that allowed for flexibility and perhaps plausible deniability – ameliorating some of the awkwardness that social networks can unintentionally create (we bent over backwards to try and avoid that in our design decisions, with perhaps partial success).

In the latest issue of Wired, there’s a great example of the awkward situations broadcasting your current exact location could create:

“I explained that I wasn’t actually begging for company; I was just telling people where I was. But it’s an understandable misperception. This is new territory, and there’s no established etiquette or protocol.

This issue came up again while having dinner with a friend at Greens (37.806679 °N, 122.432131 °W), an upscale vegetarian restaurant. Of course, I thought nothing of broadcasting my location. But moments after we were seated, two other friends—Randy and Cameron—showed up, obviously expecting to join us. Randy squatted at the end of the table. Cameron stood. After a while, it became apparent that no more chairs would be coming, so they left awkwardly. I felt bad, but I hadn’t really invited them. Or had I?”

It also seemed like a layer in a stack of software enhancing the social use and construction of place and space – which we hoped would ‘hand over’ to other, more appropriate tools and agents at other scales of the stack. This hope was reinforced when we saw a few people taking to prefacing tweets broadcasting where they were about to go in the city with ‘microdopplr’. We were also pleased to see the birth of more granular intention-broadcasting services such as Mixin and Zipiko, also from Finland.

This is also a reason that we were keen to connect with FireEagle (aside from the fact that Tom Coates is a good friend of both myself and Matt B.), in that it has the potential to act as a broker between elements in the stack, and in fact help weave the stack in the first place. At the moment, it’s a bit like being a hi-fi nerd connecting high-specification separates with expensive cabling (for instance, this example…), but hopefully an open and simple way to control the sharing of your whereabouts for useful purposes will emerge from the FE ecosystem or something similar.

Point (b), though, still has me thinking that sharing your precise whereabouts – where you are right now – has limited value.

This is a slide I’ve used a lot when giving presentations about Dopplr (for instance, this one last year at IxDA).

It’s a representation of an observer moving through space and time, with the future represented by the ‘lightcone’ at the top, and the past by the one at the bottom.

I’ve generally used it to emphasise that Dopplr is about two things – primarily optimising the future via the coincidences surfaced by people sharing their intended future location with people they trust, and secondly, increasingly – allowing you to reflect on your past travels with visualisations, tips, statistics and other tools, for instance the Personal Annual Reports we generated for everyone.

It also points out that the broadcasting of intention is something that necessarily involves human input – it can’t be automated (yet) – more on which later.

By concentrating on the future lightcone, sharing one’s intentions and surfacing the potential coincidences, you have enough information to make the most of them – perhaps changing plans slightly in order to maximise your overlap with a friend or colleague. It’s about wiggling that top lightcone around based on information you wouldn’t normally have in order to make the most of your time – at the grain of spacetime Dopplr operates at.
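For the curious: one nice thing about the cities-and-days grain is that surfacing coincidences becomes almost trivially computable. Here’s a back-of-the-envelope sketch in Python – the trip data is invented, and this is emphatically not Dopplr’s actual code:

```python
from datetime import date

# Hypothetical trips at Dopplr's grain: whole cities, whole days.
trips = [
    ("alice", "Helsinki", date(2009, 3, 2), date(2009, 3, 6)),
    ("bob",   "Helsinki", date(2009, 3, 5), date(2009, 3, 9)),
    ("carol", "Berlin",   date(2009, 3, 4), date(2009, 3, 7)),
]

def coincidences(trips):
    """Find pairs of travellers whose trips overlap in the same city."""
    found = []
    for i, (p1, city1, s1, e1) in enumerate(trips):
        for p2, city2, s2, e2 in trips[i + 1:]:
            if p1 != p2 and city1 == city2:
                # The overlap is the later start to the earlier end.
                start, end = max(s1, s2), min(e1, e2)
                if start <= end:
                    found.append((p1, p2, city1, start, end))
    return found

print(coincidences(trips))
# alice and bob coincide in Helsinki, 5–6 March
```

The fuzziness is the point – matching on whole cities and whole days means even a one-day overlap is enough to surface, and then humans take over the wiggling of lightcones.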

Google Latitude, Brightkite and, to an extent, FireEagle have made me think a lot about the grain of spacetime in such services, and how best to work with it in different contexts. I’ve also been thinking about cities a lot, in preparation for my talk at Webstock this week – inspired by Adam’s new book, Dan’s ongoing mission to informationally refactor the city and the street, Anne Galloway and Rob Shields’ excellent “Space and Culture” blog, and the work of many others, including neogeographers par excellence Stamen.

I’m still convinced that hereish-and-soonish/thereish-and-thenish are the grain we need to be exploring rather than just connecting a network of the pulsing ‘blue-dot’.

Tom Taylor gave voice to this recently:

“The problem with these geolocative services is that they assume you’re a precise, rational human, behaving as economists expect. No latitude for the unexpected; they’re determined to replace every unnecessary human interaction with the helpful guide in your pocket.

Red dot fever enforces a precision into your design that the rest must meet to feel coherent. There’s no room for the hereish, nowish, thenish and soonish. The ‘good enough’.

I’m vaguely tempted to shutdown iamnear, to be reborn as iamnearish. The Blue Posts is north of you, about five minutes walk away. Have a wander around, or ask someone. You’ll find it.”

My antipathy to the here/now fixation in LBS led me to remix the lightcone diagram and post it to Flickr, ahead of writing this ramble.

The results of doing so delighted and surprised me.

Making the most of hereish and nowish

In retrospect, it wasn’t the most nuanced representation of what I was trying to convey – but it got some great responses.

There was a lot of discussion around whether the cones themselves were the right way to visualise spacetime/informational futures-and-pasts, including my favourite from the ever-awesome Ben Cerveny:

“I think I’d render the past as a set of stalactites dripping off the entire hypersurface, recording the people and objects with state history leaving traces into the viewers knowledgestream, information getting progressively less rich as it is dropped from the ‘buffers of near-now”

Read the entire thread at Flickr - it gets crazier.

But, interwoven in the discussion of the Possibility Jellyfish, came comments about the relative value of place-based information over time.

Chris Heathcote pointed out that sometimes that pulsing blue dot is exactly what’s needed to collapse all the ifs-and-buts-and-wheres-and-whens of planning to meet up in the city.

Blaine pointed out that

“we haven’t had enough experience with the instantaneous forms of social communication to know if/how they’re useful.”

but also (I think?) supported my view about the grain of spacetime that feels valuable:

“Precise location data is past its best-by date about 5-10 minutes after publishing for moving subjects. City level location data is valuable until about two hours before you need to start the “exit city” procedures.”

Tom Coates, similarly:

“Using the now to plan for ten minutes / half an hour / a day in the future is useful, as is plotting and reflecting on where you’ve been a few moments ago. But on the other hand, being alerted when someone directly passes your house, or using geography to *trigger* things immediately around you (like for example actions in a gaming environment, or tool-tips in an augmented reality tool, or home automation stuff) requires that immediacy.”

He also pointed out my prejudice towards human-to-human sharing in this scenario:

“Essentially then, humans often don’t need to know where you are immediately, but hardware / software might benefit from it — if only because they don’t find the incoming pings distracting and can therefore give it their full and undivided attention..”

Some great little current examples of software acting on exact real-time location (other than the rather banal and mainstream satnav car navigation) are Locale for Android – a little app that changes the settings of your phone based on your location, or iNap, that attempts to wake you up at your rail or tube stop if you’ve fallen asleep on the commute home.
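The Locale idea boils down, at heart, to a geofence check: am I within some radius of a known place, and if so, which settings apply? A rough sketch – the place, radius and profile names here are all made up, and this is my guess at the gist, not Locale’s implementation:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
         * math.sin(dlon / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical Locale-style rules: a place, a radius, a settings profile.
rules = [
    {"name": "office", "lat": 51.5255, "lon": -0.0861,
     "radius_km": 0.2, "profile": "silent"},
]

def profile_for(lat, lon, rules, default="loud"):
    """Return the profile of the first geofence containing this point."""
    for rule in rules:
        if haversine_km(lat, lon, rule["lat"], rule["lon"]) <= rule["radius_km"]:
            return rule["profile"]
    return default
```

Note that this is exactly Tom’s point: the check needs the pulsing blue dot’s precision, but it’s software consuming it, not a distractable human.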

But to return to Mr. Coates.

Tom’s been thinking and building in this area for a long time – from UpMyStreet Conversations to FireEagle, and his talk at KiwiFoo on building products from the affordances of real-time data really made me think hard about here-and-now vs hereish-and-nowish.

Tom at Kiwifoo

Tom presented some of the thinking behind FireEagle, specifically about the nature of dealing with real-time data in products and services.

In the discussion, a few themes emerged for me. One was the relative value of different types of data waxing and waning over time, and how examining these patterns can give rise to product and service ideas.

Secondly, it occurred to me that we often find value in the second-order combination of real-time data, especially when visualised.

I need to think more about this, certainly, but for example a service such as Paul Mison’s “Above London” astronomical event alerts would become much more valuable if combined with live weather data for where I am.

Thirdly, bumping the visualisation up or down a scale. In the discussion at KiwiFoo I cited Citysense as an example of this – which Adam Greenfield turned me on to – where the aggregate real-time location of individuals within the city gives a live heatmap of which areas are hot-or-not, at least in the eyes of those who participate in the service.
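Citysense-style aggregation is, at its simplest, binning blue dots into bigger cells. A toy sketch – the grid size is arbitrary and the data invented; this is the gist of scale-bumping, not their method:

```python
from collections import Counter

def heatmap(points, cell=0.01):
    """Aggregate individual (lat, lon) pings into grid-cell counts --
    bumping the blue dot up a scale, from people to neighbourhoods."""
    counts = Counter()
    for lat, lon in points:
        # Integer cell indices: every ping in the same ~1km square
        # (at this latitude, with cell=0.01 degrees) lands together.
        counts[(int(lat // cell), int(lon // cell))] += 1
    return counts

pings = [(51.501, -0.141), (51.502, -0.142), (40.75, -74.0)]
print(heatmap(pings))
```

Individually these pings are just people; summed per cell they become a live picture of where the city is pooling.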

From the recent project I worked on at The Royal College of Art, Hiromi Ozaki’s Tribal Search Engine also plays in this area – but almost from the opposite perspective: creating a swarming simulation based on parameters you and your friends control to suggest a location to meet.

I really want to spend more time thinking about bumping things up-and-down the scale: it reminds me of one of my favourite quotes by the Finnish architect Eliel Saarinen:

“Always design a thing by considering it in its next larger context – a chair in a room, a room in a house, a house in an environment, an environment in a city plan.”

And one of my favourite diagrams:

Stewart Brand’s pace-layering diagram

It seems to me that a lot of the data being thrown off by personal location-based services is in the ‘fashion’ stratum of Stewart Brand’s stack. What if we combined it with information from the lower levels, and represented it back to ourselves?

Let’s try putting jumper wires across the strata – circuit-bending spacetime to create new opportunities.

Finally, I said I’d come back to the claim that you can’t automate the future – yet.

In the Kiwifoo discussion, the group referenced the burgeoning ability of LBS systems to aggregate patterns of our movements.

One thing LBS could do is build predictive models of our daily and weekly routines from our past movements – as has been investigated by Nathan Eagle et al. in the MIT Reality Mining project.
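The intuition can be caricatured in a few lines: count where each place tends to lead, and routine does the rest. A toy first-order Markov sketch, with invented sightings – nothing like the sophistication of Eagle et al.’s actual work:

```python
from collections import defaultdict, Counter

# A hypothetical few days of coarse place sightings, in order.
history = ["home", "cafe", "work", "work", "cafe", "home",
           "home", "cafe", "work", "work", "pub", "home"]

def transition_model(history):
    """Count where each place tends to lead: a first-order Markov chain."""
    model = defaultdict(Counter)
    for here, there in zip(history, history[1:]):
        model[here][there] += 1
    return model

def predict_next(model, here):
    """Most likely next place, given where we are now."""
    return model[here].most_common(1)[0][0]

model = transition_model(history)
print(predict_next(model, "cafe"))  # "work" -- the worn groove of routine
```

Exactly because routine dominates, a model this crude gets surprisingly far – which is also why it’s the exceptions, not the routine, that are worth broadcasting.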

I’ve steered clear of the privacy implications of all of this, as it’s such a third-rail issue; but, as I somewhat bluntly put it in my lightcone diagram, the aggregation of real-time location information is currently of great interest to spammers, scammers and spooks. Hopefully those developing in this space will follow the principles of privacy, agency and control of such information expounded by Coates in the development of FireEagle, and referenced in our joint talk “Polite, pertinent and pretty” last year.

The downsides are being discussed extensively, and they are certainly there: imagined and unimagined, intended and unintended.

But, I can’t help but wonder – what could we do if we are given the ability to export our past into our future…?


Wikidashboard

Wikidashboard is a fantastic project from PARC, which I found today via Waxy.

It displays and visualises the change history of an article in situ. As Waxy says, it would be great to have a Greasemonkey script which placed this information on the pages of Wikipedia proper.

Except – it’s too much for me.

I want it to be glanceable, not pore-over-able, at this level – just giving me a swift indication of the volatility of the entry.

In early 2005, I proposed tiny sparklines based on the HistoryFlow visualisation project that could be seen in-situ to give a very quick feel of the change history of an article.

Proposal for 'History Flow' sparkline graphics for Wikipedia

Since then, sparklines have become almost part of the furniture in services like Flickr, Google Analytics and *ahem* Dopplr. There are libraries and refinements galore for such things. Putting something like this together using Wikidashboard, Greasemonkey and one of those libraries would be well within the reach of an enterprising geek, I would have thought…
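As a hint of how cheap the glanceable version could be: a few lines of Python turn per-week edit counts into a unicode sparkline. (This is just a flat edits-over-time strip – not History Flow’s authorship view, and the counts are invented.)

```python
# Eight block characters, from lowest to highest.
BARS = "\u2581\u2582\u2583\u2584\u2585\u2586\u2587\u2588"

def sparkline(counts):
    """Render per-week edit counts as a tiny glyph strip --
    glanceable, not pore-over-able."""
    top = max(counts) or 1  # avoid dividing by zero on an all-quiet article
    return "".join(
        BARS[min(len(BARS) - 1, c * len(BARS) // top)] for c in counts
    )

# A hypothetical article: quiet, then an edit war, then quiet again.
print(sparkline([0, 1, 2, 9, 12, 3, 1, 0]))
```

One string per article, inlined next to the title: a swift read on volatility before you trust a word of it.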

So, can I have it?



WALKING CITY, originally uploaded by blackbeltjones.

Jonathan Feinberg emailed me and said “Inspired by your typographically sophisticated “hand-tooled” cloud, I came up with a novel way of cramming a bunch of words together.” – which is undeserved praise for me, and dramatically undersells what he’s achieved with Wordle.

It does the simple and much-abused thing of creating a tag cloud, and executes it playfully and beautifully. There are loads of options for type and layout, and it’s enormous fun to fiddle with.

As I said back when Kevan Davies did his delicious phrenology visualiser, there is some apophenic pleasure in scrying your tag cloud and seeing the patterns there – so I was very pleased when my playing with Wordle returned me an Archigram-esque walking city of things I’ve found interesting.

Congrats to Jonathan on building and finally releasing Wordle!
