Journal tags: web2.0


Plumbing

On Monday, I linked to Tom’s latest video. It uses a clever trick whereby the title of the video is updated to match the number of views the video has had. But there’s a lot more to the video than that. Stick around and you’ll be treated to a meditation on the changing nature of APIs, from a shared open lake to a closed commercial drybed.

It reminds me of (other) Tom’s post from a couple of years ago called Pouring one out for the Boxmakers, wherein he talks about Twitter’s crackdown on fun bots:

Web 2.0 really, truly, is over. The public APIs, feeds to be consumed in a platform of your choice, services that had value beyond their own walls, mashups that merged content and services into new things… have all been replaced with heavyweight websites to ensure a consistent, single experience, no out-of-context content, and maximising the views of advertising. That’s it: back to single-serving websites for single-serving use cases.

A shame. A thing I had always loved about the internet was its juxtapositions, the way it supported so many use-cases all at once. At its heart, a fundamental one: it was a medium which you could both read and write to. From that flow others: it’s not only work and play that coexisted on it, but the real and the fictional; the useful and the useless; the human and the machine.

Both Toms echo the sentiment in Anil’s The Web We Lost, written back in 2012:

Five years ago, if you wanted to show content from one site or app on your own site or app, you could use a simple, documented format to do so, without requiring a business-development deal or contractual agreement between the sites. Thus, user experiences weren’t subject to the vagaries of the political battles between different companies, but instead were consistently based on the extensible architecture of the web itself.

I know, I know. We’re a bunch of old men shouting at The Cloud. But really, Anil is right:

This isn’t our web today. We’ve lost key features that we used to rely on, and worse, we’ve abandoned core values that used to be fundamental to the web world. To the credit of today’s social networks, they’ve brought in hundreds of millions of new participants to these networks, and they’ve certainly made a small number of people rich.

But they haven’t shown the web itself the respect and care it deserves, as a medium which has enabled them to succeed. And they’ve now narrowed the possibilities of the web for an entire generation of users who don’t realize how much more innovative and meaningful their experience could be.

In his video, Tom mentions Yahoo Pipes as an example of a service that has been shut down for commercial and ideological reasons. In many ways, it was the epitome of what Anil was talking about—a sort of meta-API that allowed you to connect different services together. Kinda like IFTTT but with a visual interface that made it as empowering as something like the Scratch programming language.

There are services today that provide some of that functionality, but they’re more developer-focused. Trys pointed me to Pipedream, which looks good but you need to know how to write Node.js code and import npm packages. I’m sure it’s great if you’re into serverless Jamstack lambda thingamybobs but I don’t think it’s going to unlock the potential for non-coders to create cool stuff.

On the more visual pipes-esque Scratchy side, Cassie pointed me to Cables:

Cables is a tool for creating beautiful interactive content.

It isn’t about making mashups, but it does look like something that non-coders could potentially use to make something that looks cool. It reminds me a bit of Bret Victor and his classic talk on Inventing On Principle—always worth revisiting!

Why You Should Have a Web Site

The enigmatic Steven Pemberton is at XTech to tell us Why you should have a Web site: it’s the law! (and other Web 3.0 issues). God, I hope he’s using Web 3.0 ironically.

Steven has heard many predictions in his time: that we will never have LCD screens, that digital photography could never replace film, etc. But the one he wants to talk about is Moore’s Law. People have been saying that it hasn’t got long to go since 1977. Steven is going to assume that Moore’s Law is not going to go away in his lifetime.

In the 1980s the most powerful computers were the Crays. People used to say “One day we will all have a Cray on our desk.” In fact most laptops are about 120 Craysworth and mobile phones are about 35 Craysworth.

There is actually an LED correlation to Moore’s Law (brighter and cheaper faster). Steven predicts that within our lifetime all lighting will be LEDs.

Bandwidth follows a similar trend. Jakob Nielsen likes to claim this law: that bandwidth will double every year. In fact the timescale is closer to 10.5 months.

Following on from Moore’s and Nielsen’s laws, there’s Metcalfe’s Law: the value of a network is proportional to the square of the number of nodes. This is why it’s really good that there is only one email network and bad that there are so many instant messenger networks.
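
To make those two growth laws a bit more concrete, here’s a back-of-the-envelope sketch. It’s my own illustration, not anything from the talk: the starting bandwidth and the user counts are invented, and only the ~10.5-month doubling period and the n-squared rule come from the notes above.

```python
# Back-of-the-envelope illustration of Nielsen's and Metcalfe's laws.
# The starting figures are made up; only the doubling period and the
# n-squared rule come from the talk notes above.

def nielsen_bandwidth(start_mbps, months, doubling_period=10.5):
    """Bandwidth doubling roughly every 10.5 months."""
    return start_mbps * 2 ** (months / doubling_period)

def metcalfe_value(nodes):
    """Metcalfe's Law: network value is proportional to nodes squared."""
    return nodes ** 2

# One email network of a million users versus five walled-off
# messaging networks of 200,000 users each:
unified = metcalfe_value(1_000_000)
fragmented = 5 * metcalfe_value(200_000)
print(unified / fragmented)   # 5.0 -- the single network is worth five times more

# A 1 Mbps connection after five years of Nielsen-style growth:
print(round(nielsen_bandwidth(1, 60), 1))   # roughly 52.5 Mbps
```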

Let’s define the term Web 2.0 using Tim O’Reilly’s definition: sites that gain value by their users adding data to them. Note that these kinds of sites existed before the term was coined. There are some dangers to Web 2.0. When you contribute data to a web site, you are locking yourself in. You are making a commitment just like when you commit to a data format. This was actually one of the justifications for XML — data portability. But there are no standard ways of getting your data out of one Web 2.0 site and into another. What if you want to move your photos from one website to another? How do you choose which social networking sites to commit to? What about when a Web 2.0 site dies? This happened with MP3.com and Stage6. Or what about if your account gets closed down? There are documented cases of people whose Google accounts were hacked so those accounts were subsequently shut down — they lost all their data.

These are examples of Metcalfe’s law in action. What should really happen is that you keep all your data on your website and then aggregators can distribute it across the Web. Most people won’t want to write all the angle brackets but software should enable you to do this.
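
As a rough sketch of that model (the canonical data lives on your own site, and aggregators pull it together), here’s what a minimal aggregator might look like. This is my illustration rather than anything proposed in the talk: the site URLs are placeholders, and feedparser is just one real library that could do the fetching.

```python
# A minimal sketch of the aggregation model described above: everyone
# keeps the canonical copy of their data on their own site, and an
# aggregator merely pulls the feeds together.
# The URLs below are placeholders; feedparser is a third-party library
# (pip install feedparser).
import feedparser

PERSONAL_SITES = [
    "https://example.com/alice/feed.xml",  # hypothetical personal sites
    "https://example.org/bob/feed.xml",
]

def aggregate(feed_urls):
    """Fetch each personal feed and merge the entries into one list."""
    entries = []
    for url in feed_urls:
        feed = feedparser.parse(url)
        for entry in feed.entries:
            entries.append({
                "source": url,
                "title": entry.get("title", ""),
                "link": entry.get("link", ""),
            })
    return entries

if __name__ == "__main__":
    for item in aggregate(PERSONAL_SITES):
        print(item["source"], "->", item["title"])
```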

What do we need to realize this vision? First and foremost, we need machine-readable pages so that aggregators can identify and extract data. They can then create the added value by joining up all the data that is spread across the whole Web. Steven now pimps RDFa. It’s like microformats but it will invalidate your markup.

Once you have machine-readable semantics, a browser can do a lot more with the data. If a browser can identify something as an event, it can offer to add it to your calendar, show it on a map, look up flights and so on. (At this point, I really have to wonder… why do the RDFa examples always involve contact details or events? These are the very things that are more easily solved with microformats. If the whole point of RDFa is that it’s more extensible than microformats, then show some examples of that instead of showing examples that apply equally well to hCalendar or hCard)
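
For what it’s worth, here’s a small sketch of the kind of extraction being described, using the standard hCalendar class names with an ordinary HTML parser. The markup and the event are invented for illustration; a browser or aggregator doing this for real would be far more thorough.

```python
# A sketch of how a browser extension or aggregator could lift an event
# out of hCalendar-annotated markup. The snippet of HTML is invented;
# the class names (vevent, summary, dtstart, location) are the standard
# hCalendar ones.
from bs4 import BeautifulSoup  # pip install beautifulsoup4

html = """
<div class="vevent">
  <span class="summary">Web geek meet-up</span> on
  <abbr class="dtstart" title="2008-06-12">June 12th</abbr>
  at <span class="location">Brighton</span>
</div>
"""

soup = BeautifulSoup(html, "html.parser")
for event in soup.select(".vevent"):
    summary = event.select_one(".summary").get_text(strip=True)
    dtstart = event.select_one(".dtstart")["title"]
    location = event.select_one(".location").get_text(strip=True)
    # With this much structure, "add to calendar" or "show on a map"
    # becomes a one-liner for the browser.
    print(summary, dtstart, location)
```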

So rather than putting all your data on other people’s Web sites, put all your data on your Web site and then you get the full Metcalfe value. But where can you store all this stuff? Steven is rather charmed by routers that double up as web servers, complete with FTP. For a personal site, you don’t need that much power and bandwidth. In any case, just look at all the power and bandwidth we do have.

To summarise, Web 2.0 is damaging to the Web. It divides the Web into topical sub-webs. With machine-readable pages, we don’t need those separate sites. We can reclaim our data and still get the value. Web 3.0 sites will aggregate your data (Oh God, he is using the term unironically).

Questions? Hell, yeah!

Kellan kicks off. Flickr is one of the world’s largest providers of RDFa. He also maintains his own site. Even he had to deal with open source software that got abandoned; he had to hack to ensure that his data survived. How do we stop that happening? Steven says we need agreed data formats like RDFa. So, Kellan says, first we have to decide on formats, then we have to build the software and then we have to build the aggregators? Yes, says Steven.

Dan says that Web 2.0 sites like Flickr add the social value that you just don’t get from building a site yourself. Steven points to MP3.com as a counter-example. Okay, says Dan, there are bad sites. Simon interjects, didn’t Flickr build their API to provide reassurance to people that they could get their data out? Not quite, says Kellan, it was created so that they could build the site in the first place.

Someone says they are having trouble envisioning Steven’s vision. Steven says I’m not saying there won’t be a Flickr — they’ll just be based on aggregation.

Someone else says that far from being worried about losing their data on Flickr, they use Flickr for backup. They can suck down their data at regular intervals (having written a script on hearing of the Microsoft bid on Yahoo). But what Flickr owns is the URI space.

Gavin Starks asks about the metrics of energy usage increases. No, it drops, says Steven.

Ian says that Steven hit on a bug in social websites: people never read the terms of service. If we encouraged best practices in EULAs we could avoid worst-case scenarios.

Someone else says that our focusing on Flickr is missing the point of Steven’s presentation.

Someone else agrees. The issue here is where the normative copy of your data exists. So instead of the normative copy living on Flickr, it lives on your own server. Flickr can still have a copy though. Steven nods his head. He says that the point is that it should be easy to move data around.

Time’s up. That was certainly a provocative and contentious talk for this crowd.

Speaking at South by Southwest

This was my third year attending South by Southwest and also my third year speaking.

It seems to have become a tradition that I do a “bluffing” presentation every year. I did How to Bluff Your Way in CSS two years ago with Andy. Last year I did How to Bluff Your Way in DOM Scripting with Aaron. This year, Andy was once again my partner in crime and the topic was How to Bluff Your Way in Web 2.0.

It was a blast. I had so much fun and Andy was on top form. I half expected him to finish with “Thank you, we’ll be here all week, try the veal, don’t forget to tip your waiter.”

As soon as the podcast is available, I’ll have it transcribed. In the meantime, Robert Sandie was kind enough to take a video of the whole thing. It’s posted on Viddler which looks like quite a neat video service: you can comment, tag or link to any second of a video. Here, for instance, Robert links to the moment when I got serious and called for the abolition of Web 2.0 as a catch-all term. I can assure you this moment of gravity is the exception. Most of the presentation was a complete piss-take.

My second presentation was a more serious affair, though there were occasional moments of mirth. Myself and Derek revisited and condensed our presentation from Web Directions North, Ajax Kung Fu meets Accessibility Feng Shui. This went really well. I gave a quick encapsulation of the idea of Hijax and Derek gave a snappy run-through of accessibility tips and tricks. We wanted to make sure we had enough time for questions and I’m glad we did; the questions were excellent and prompted some great discussion.

Again, once the audio recording is available, I’ll be sure to get it transcribed.

That was supposed to be the sum of my speaking engagements but Tantek had other ideas. He arranged for me to rush the stage during his panel, The Growth and Evolution of Microformats. The panel was excellent with snappy demos of the Operator plug-in and Glenn’s backnetwork app. I tried to do a demo of John McKerrell’s bluetooth version of the Tails extension using a volunteer from the audience but that didn’t work out too well and I had to fall back on just using a localhost example. Still, it was good to be on-hand to answer some of the great questions from the audience.

And yes, once the audio is available, I’ll get it transcribed. Seeing a pattern here? Hint, hint, other speakers.

As panels go, the microformats one was pretty great, in my opinion. Some of the other panels seem to have been less impressive according to the scuttlebutt around the blogvine.

Khoi isn’t keen on the panel format. It’s true enough that they probably don’t entail as much preparation as full-blown presentations but then my expectations are different going into a panel than going into a presentation. So, for something like Brian’s talk on the Mobile Web, I was expecting some good no-nonsense practical advice and that’s exactly what I got. Whereas for something like the Design Workflows panel, I was expecting a nice fireside chat amongst top-notch designers and that’s exactly what I got. That’s not to say the panel wasn’t prepared. Just take one look at the website of the panel, which is a thing of beauty.

The panelists interviewed some designers in preparation for the discussion and you can read the answers given by the twenty interviewees. Everyone gave good sensible answers… except for me.

Anyway, whether or not you like panels as a format, there’s always plenty of choice at South by Southwest. If you don’t like panels, you don’t have to attend them. There’s nearly always a straightforward presentation on at the same time. So there isn’t much point complaining that the organisers haven’t got the format right. They’re offering every format under the sun—the trick is making it to the panels or presentations that you’ll probably like.

In any case, as everyone knows, South by Southwest isn’t really about the panels and presentations. John Gruber wasn’t keen on all the panels but he does acknowledge that the real appeal of the conference lies elsewhere:

At most conferences, the deal is that the content is great and the socializing is good. At SXSWi, the content is good, but the socializing is great.

The Web 2.0 show

This is quite possibly the best thing I’ve seen since breakfast: Cerado’s Web 2.0 or Star Wars Quiz. The premise is simple: decide whether a silly-sounding word is the name of an over-hyped web company or whether it’s really the name of a character from Star Wars.

I scored a reasonably good 41 thanks to my knowledge of Star Wars, I’m glad to say… I’d hate to have scored well by actually recognising half of the companies listed:

31-40: As your doctor, I recommend moving out of your parents’ basement.

Ex-tech

XTech 2006 is over and with it, my excursion to Amsterdam.

All in all, it was a good conference. A lot of the subject matter was more techy than I’m used to, but even so, I found a lot to get inspired by. I probably got the most out of the “big picture” discussions rather than presentations of specific technology implementations.

Apart from my outburst during Paul Graham’s keynote, I didn’t do any liveblogging. Suw Charman, on the other hand, was typing like a demon. Be sure to check out her notes.

The stand-out speaker for me was Steven Pemberton of the W3C. He packed an incredible amount of food for thought into a succinct, eloquently delivered presentation. Come to think of it, a lot of the best stuff was delivered by W3C members. Dean Jackson gave a great report of some of the most exciting W3C activities, like the Web API Working Group, for instance.

I had the pleasure of chairing a double-whammy of road-tested presentations by Tom Coates and Thomas Vander Wal. I knew that their respective subject matters would gel well together but the pleasant surprise for me was the way that the preceding presentation by Paul Hammond set the scene perfectly for the topic of open data and Web Services. Clearly, a lot of thought went into the order of speakers and the flow of topics.

Stepping back from the individual presentations, some over-arching themes emerge:

  • The case for declarative languages was strongly made. Steven Pemberton gave the sales pitch while the working example came in an eye-opening presentation of Ajax delivered via XForms.

  • Tim O’Reilly is right: data is the new Intel Inside. Right now, there’s a lot of excitement to do with access to data via APIs but I think in the near future, we might see virtual nuclear war fought around control of people’s data (events, contacts, media, etc.). I don’t know who would win such a war but, based on Jeffrey McManus’s presentation, Yahoo really “gets it” when it comes to wooing developers. On the other hand, Jeff Barr showed that Amazon can come up with APIs for services unlike any others.

  • Standards, standards, standards. From the long-term vision of the W3C right down to microformats, it’s clear that there’s a real hunger for standardised, structured data.

Put all that together and you’ve got a pretty exciting ecosystem: Web Services as the delivery mechanism, standardised structures for the data formats and easy-to-use declarative languages handling the processing. Apart from that last step — which is a longer-term goal — that vision is a working reality today. Call it Web 2.0 if you like; it doesn’t really matter. The discussion has finally moved on from defining Web 2.0 to just getting on with it (much like the term “information architecture” before it). The tagline of XTech 2006 — Building Web 2.0 — was well chosen.

But the presentations were only one part of the conference. Just like every other geek gathering, the real value comes from meeting and hanging out with fellow web junkies who invariably turn out to be not only ludicrously smart but really, really nice people too. It helps that a city like Amsterdam is a great place to eat, drink and talk about matters nerdy and otherwise.

Darwinian webolution

Odeo have released an embedded recorder that you can add to your own webpages.

Del.icio.us now offers private bookmarks.

Flickr now marks up profiles using the hCard microformat.

(Screenshot: viewing source on my Flickr profile)

Something that became very clear — both at the Carson Workshops Summit and at the many web app panels at South by Southwest — is that websites like these are never finished. Instead, the site evolves, growing (and occasionally dropping) features over time.

Traditionally, the mental model for websites has been architectural. Even the term itself, website, invites a construction site comparison. Plans are drawn up and approved, then the thing gets built, then it’s done.

That approach doesn’t apply to the newer, smarter websites that are dominating the scene today. Heck, it doesn’t even apply to older websites like Amazon and Google who have always been smart about constantly iterating changes.

Steve Ballmer was onto something when he said “developers, developers, developers, ad nauseam”. Websites, like Soylent Green, are people. Without the people improving and tweaking things, the edifice of the site structure will crack.

I’m going to make a conscious effort to stop thinking about the work I do on the Web in terms of building and construction: I need to find new analogies from the world of biology.

Update: Paul Hammond told me via IM about a book called “How Buildings Learn: What Happens After They’re Built”. Maybe I don’t need to abandon the architectural analogies completely.