What I don't get about Wave: Robots

(Originally written as a comment on Writing a Wave Robot @ Science in the open.)

I think that having applications in the cloud edit what I wrote is wrong on many levels.

Obviously, trust in the robot provider is the biggest issue here.

Then, bugs in the robot implementation run the risk of completely destroying our data (which is balanced somewhat by Wave’s unlimited undo/playback, IIUC).

Then, this makes auditing content much more difficult, since robot contributions aren’t clearly marked (unless all robots use some common annotations…). Wave proposes the playback feature to discover the trail of changes that led to the current state, but I think that’s a bad idea: I don’t want to watch a possibly very long animation just to see who wrote what. (Note: all of this is based on my limited understanding of Wave, which I haven’t tried yet.)

Then, there’s the engineering question of what happens to the robot-generated content when I edit what I wrote. The robot will have to reparse the text and update its inserted content. But I may have moved or edited that content, on purpose or accidentally. I think that’s just a big stupid mess altogether.


Introduction to Content-Centric Networking

By Van Jacobson:

There's also a Google talk, A New Way to Look at Networking, which is well worth watching:

Tip o' the hat to neuraxon77.


Zero-Latency Internet GUIs Using "Multiple Worlds"

I have discovered an interesting way to write zero-latency user interfaces for internet services.

It's similar in spirit to Alice ML's first-class futures and promises.

The basic idea is that every user operation that hasn't yet been acked by the server results in a new "possible world" in some part of the user interface, one that may have to be rolled back (if the server fails, for example).

The model (if you're an MVC person) is represented as a tree of operands, which are basically futures: containers for variables that reside on the server. GUI widgets subscribe to operands as listeners, and GUI operations are represented as first-class objects that change an operand.

import java.util.Set;

/** A usually high-latency action that changes the states of one or
    more operands. */
interface Operation {
    public String getDescription();
}

interface OperandListener<OP extends Operand<?>> {
    public void onOperandChange(OP operand);
    public void onOperandFailure(OP operand);
}

/** An observable variable with a last known state (usually retrieved
    from the server), and optionally a pending operation and a
    tentative state (a state that the pending operation would like the
    operand to have, but that hasn't been acked by the server yet).

    Clients of the operand should usually just call getState(), which
    returns the operand's tentative state if there is one, or the last
    known state otherwise. This guarantees that the user sees an
    up-to-date image of the operation's progress. (There is a danger,
    however, in that the tentative state of an operand may be invalid
    from the client's perspective.)

    There can be only one pending operation on an operand at a time,
    and trying to start a new operation on an operand with a pending
    operation will result in an error. */
abstract class Operand<T> {
    protected T lastKnownState;
    protected T tentativeState;
    protected Throwable currentFailure;
    protected Operation pendingOperation;
    protected Set<OperandListener<?>> listeners;
}

A GUI operation simply puts an operand in a tentative state, which locks the operand, so that it cannot be changed (by another operation) until the server has acked the change.

This means that a user can edit multiple objects on the screen; each change results in a zero-latency screen update to the new tentative state, while a background thread tries to update the state on the server.
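Here's a minimal sketch of that flow for a concrete String-valued operand. (The class and its method names are my own invention for illustration, not part of any published API.)

```java
// Sketch of the "multiple worlds" flow: a GUI operation sets a
// tentative state immediately (zero latency), and the server's ack
// later commits it, while a failure rolls it back.
class StringOperand {
    private String lastKnownState;
    private String tentativeState; // meaningful only while pending
    private boolean pending;

    StringOperand(String initial) { this.lastKnownState = initial; }

    // What the GUI renders: the tentative state wins if present.
    String getState() { return pending ? tentativeState : lastKnownState; }

    // Called by a GUI operation; the operand is now locked against
    // other operations until the server acks or fails.
    void setTentativeState(String s) {
        if (pending) throw new IllegalStateException("operation pending");
        tentativeState = s;
        pending = true;
    }

    // Server acked: the tentative world becomes the real one.
    void ack() { lastKnownState = tentativeState; pending = false; }

    // Server failed: the possible world is discarded.
    void rollBack() { pending = false; }
}
```

A widget would call getState() after every change notification, so the user sees the tentative state immediately while the server round-trip happens in the background.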

(I haven't implemented this yet, but it would also be possible to merge commutative operations.)

Storing JSON as Protocol Buffers

I love JSON (and I think that my HyperJSON format is the best data model in existence) but for some reason or other, I always have trouble with JSON libraries for mainstream programming languages.

As a way out, I have defined my own data model, similar to JSON's, but I'm serializing it as Protocol Buffers using this .proto file:

message Data {
  optional Text text = 1;
  optional Link link = 2;
  optional Entry entry = 3;
  optional Feed feed = 4;
  optional Date date = 5;
  optional Bool bool = 6;
}

message Text {
  required string string = 1;
}

message Link {
  required string href = 1;
  optional string rel = 2 [default = ""];
  optional bool rev = 3 [default = false];
}

message Feed {
  repeated Data element = 1;
}

message Property {
  required string key = 1;
  required Data value = 2;
}

message Entry {
  repeated Property property = 1;
}

message Date {
  required int64 millis = 1;
}

message Bool {
  required bool bool = 1;
}

There are a lot of different ways to store JSON in PB, and the details are pretty much irrelevant. An important point, though, is how JSON objects (Entry) are stored: simply as a list of Property objects.
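For illustration, a JSON-style object such as {"title": "Hi", "starred": true} would map onto these messages roughly like this, written as a Data message in protocol buffer text format (field values made up):

```
entry {
  property {
    key: "title"
    value { text { string: "Hi" } }
  }
  property {
    key: "starred"
    value { bool { bool: true } }
  }
}
```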

It's the same as with object-oriented programming: you never want to model extensible records (business objects, documents, etc.) directly as objects (Protocol Buffers in this case). Instead, you want to compose your business objects from your programming language's native objects.
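To make the composition point concrete, here's what such an open record looks like when built from native Java objects. (A sketch; the Entry and Property classes below are my own stand-ins, not the protoc-generated ones.)

```java
import java.util.ArrayList;
import java.util.List;

// An extensible record composed from native objects: just a list of
// key/value properties, so any key can be added at runtime without
// schema changes or new classes.
class Entry {
    static final class Property {
        final String key;
        final Object value;
        Property(String key, Object value) { this.key = key; this.value = value; }
    }

    private final List<Property> properties = new ArrayList<Property>();

    Entry put(String key, Object value) {
        properties.add(new Property(key, value));
        return this;
    }

    // Returns the first value stored under key, or null if absent.
    Object get(String key) {
        for (Property p : properties)
            if (p.key.equals(key)) return p.value;
        return null;
    }
}
```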

See Being poor at modeling is essential to OOP.


Microsoft Announces Food Search Cooperation With Cuil

Redmond, WA - Microsoft (MSFT) today announced a cooperation with Google killer Cuil Inc. The next major release of Microsoft's search engine, Bing 2012, will incorporate the special Cuil Theory developed by Cuil, along with recipe search technology by Google killer Powerset, which Microsoft acquired earlier in a $100 million deal.

In an interview with the press, Microsoft CEO Steve Ballmer told reporters that Cuil Theory will further enhance Bing's standing as a so-called decision engine, especially when it comes to food-related decisions. "Others do nonfood searches quite OK, but the next big thing is food decisions," Mr. Ballmer said. "There is no better way to decide for, say, a grilled-meat restaurant, than Cuil's prized technology."

Mr. Ballmer declined to give any new information about .EAT, Microsoft's rumored major entry into the food decision market, but predicted that ".EAT will be bigger than Java" in an obvious blow to Oracle CEO Larry Ellison. He also told the press that this will be an opportunity to sell some of the tablet computers stacked in Washington warehouses. "What's better for food decisions than a table—t," Mr. Ballmer jokingly asked the reporters. "Get it, you dumb _____?"

Analysts were cautious, however. While both Cuil and Powerset were "quite effective" as Google killers, they weren't as successful as Wolfram|Alpha and other newcomers to the scene, a Silicon Valley analyst noted. "Hunting and killing Google will remain a favorite pastime of startup companies for some time. There's still more than enough Google to go around. Happy hunting!"


Gadgets o' the Third Week of August '09

iPhone Soap


VTech KidiLook


Weird "Burn After Reading"-Inspired Sony Patent (source?)

Bonus Graphic (from this magnificent thread)



Woobius Turns Security on its Head, and Makes it Right

Woobius is an innovative company that produces communication software for architects and engineers.

I think a lot about their approach to security, which could be called real-world security:
The reality is that if you have a file, either one that you’ve produced or one that someone has sent to you, you can do whatever you want with that file.

If the system doesn’t allow you to do that, you will simply circumvent the system and get the file across.
This is exactly the opposite of every security system I know.

Ordinary security software builds its own network of trust, separate from the real network of trust among the participants. And then it creates a false sense of security by pretending that there are no out-of-band channels between participants (e.g. taking a photo of a laptop screen).

We have to accept that people will forward information in their network of trust, whether our software believes that's OK or not. Woobius shows that embracing the real network of trust leads to an intuitive and useful security system.

Instead of playing silly games with ACLs, where we try to restrict the circle of participants that can view/edit/... an item, we need to give participants the tools to expand that circle to include their network of trust, transitively.


The Comeback of the Wrist Watch

(IBM Research Linux Watch, August 2000)

I think wrist watches will make a comeback as "first responder" computers, connected to our phones and laptops.

Display tweets on your watch, some buttons, a scrollwheel, WebKit, done.


Microsoft and Nokia Announce Copy & Paste

Keilaniemi, Espoo - In a joint statement, Microsoft and Nokia today presented their plan to take the throne of mobile innovation from incumbent Apple Inc. Their latest effort, called Blebi, is designed to beat Apple with its own weapon – high-octane, shock-and-awe innovation.

Blebi is an effort to cross-license intellectual property and technical know-how regarding the coveted "copy & paste" feature of modern supercomputers.

The technological difficulty of introducing copy & paste in the mobile form factor (it took Apple years to "port" this technology from its desktop range to mobile) was underscored by an expletive-laden meeting with Microsoft CEO Steve Ballmer, in which he threatened to keelhaul the programmers if they didn't come up with the feature fast enough. "I've done it before," Mr. Ballmer said.

Nokia appears to be in even more trouble, having to implement this far-reaching innovation across its range of operating systems: Maemo, Nokia WebLinux, MobileLinux, Symbian, UIQ, Symbian69, Symbian92, Symbian08, Moblin, UIQ2, and UIQ3. A Nokia spokesperson was unable to comment on a specific timeline for the introduction, and also declined to answer which of these operating systems were actually in use across Nokia's phones.


Comments on the Internet Notebook

For some reason, Blogger won't let me comment on this very blog (a win for the cloud!), so I'm answering your comments on the Internet Notebook here.

Gavin, I didn't mention ubiquitous access, because I take that for granted in an internet tool. I was just listing the kind of features that describe the tool's "mental universe".

Craig, you're referring to content-centric networking when you ask for an internet that never forgets? Well, we'll have to make do without that for the moment.


10 Must-Have Features of an Internet Notebook

  • typed, bidirectional hyperlinks = structured tags
  • wiki namespace
  • ubiquitous reverse chronological access
  • search
  • outlining
  • asymmetric follow
  • unlimited undo
  • instantaneousity
  • views
  • secret feature
Dig we must!

Quoth Dave Winer

It's time to use the web again to store our ideas, and instead of relying on Silicon Valley companies to link our stuff together, let's just use the Internet. That's what it was designed for.

We'll go back to basics now, take what we learned from this round of innovation, and build it for real this time.

Apple News for 20090811

ap.pl URL Shortening Store

Unlike other shortening schemes where URLs are quickly shortened via a web input page, Apple’s scheme will first require “URL developers” to submit their URLs for approval. ...

URLs can be rejected for any reason, based on the understanding of a 400-page specification document by Gordy, the one intern who will be reviewing all URLs.

iPhone development in decline, analysts say

Suspicious diagram


Japanese Innovations to Be Aware Of

Haptic ultrasound feedback for holographic displays

Ramen robots

Infinite Soybean

What I don't get about Wave: Shared State

State. Big issue. Wave is teh shared state. Shared state = anti-scale.

Some of the most successful collaborative processes in the world (music, the blogosphere, Linux) are completely shared-nothing.
  • When Bob Dylan changes an old folk song, his changes don't somehow magically rewrite existing prints of the song in textbooks. Instead, Bob commits his changes to the world through his albums and concerts, and if interesting, the changes will be included in the canon by Bob's social network.
  • Bloggers don't edit each other's posts. They write their ideas on their own blogs, and through various orthogonal flows, these ideas do or don't end up in interesting places, and feed back to their authors.
  • When somebody develops a patch for the Linux kernel, he doesn't go ahead and edit Linus' code repository. The patch is sent to the LKML for discussion, and eventually included by Linus or other upstream people (or not).
In all of these hugely successful cases, we have two properties:

1. Every participant has his own state, completely separate from others' states.

2. Changes to somebody else's state go through an orthogonal, social channel that enforces authority.

Google Wave has neither of these properties. In Wave, all participants edit the same state, and authority to change something is not a social, but an administrative property (i.e. if you're in the ACL you can do it, without community moderation).

(Rohit Khare, whose writings are among the most inspiring in all of the intarwebs, has identified latency and agency as the two fundamental forces in internetworking. IMO, Wave takes a worst-of-class approach to dealing with both of these forces: Wave's user experience depends on low latency, and Wave's model of agency is simply boring, as shown in this post.)

The Star Wars Platform

The genius of George Lucas is the way in which this platform he established allows others to imagine and build upon his original ideas. Everyone knows that the force exists—and roughly what it can do. Everyone knows that faster than light travel is possible. Everyone knows there is the "Empire" and the "Rebel Alliance," and everyone knows that Leia is Luke's sister. In a Star Wars Universe, the force would never disappear in the same way that Luke would never sleep with Leia.

What Brands Can Learn From Luke and Leia, by Wolff Olins head of strategy Paul Worthington.

(Star Wars' introductory sequences also show that proper kerning is not a necessary condition for success.)


Sagmeister presents Logo Generator for Koolhaas' CdM

Nice one, Sagi!

Note to up-and-coming designers: don't start to wear hot women's clothes on stage until you're famous.

(PC-Note: Unless you really are a hot woman. Duh.)

Gadgets o' the First Week of August '09

Samsung R450 Mister Cartoon

iBuyPower Chimera Killer

Sony Vaio P91

Thanks, ÜberGizmo!

Interesting infographic on fixie bikes


Clearing up Confusion About John Gruber Note App

I'm sorry to have caused confusion with my post about the John Gruber Special Edition Note App and for the record I want to say that it was a prank.

(God, this so made my day!)

Update: It appears that I've been reverse-pranked. Even better!

I'm not bitter, I'm better

Alex Payne characterized this blog as "bitter internet technology humor".

I don't want to come across as bitter.

I want to rid the internet of stupid stuff and make it better.

What I don't get about Wave: General Architecture

What happens when you take the single dumbest data format in existence, extend it (I won't even go into that), stuff it into a centralized server, and let people manipulate it concurrently, using a TECO-ish differential editing protocol?

Answer: Google Wave.

Of course, to make that work, you have to look into CSCW, which hasn't produced any interesting results in 40 years, and adopt silly bean-counting technology (ha!) like Operational Transformation. All that just so text editing works in your system.

Joel doesn't understand that stuff. Heck, not even Manuel does.


Hypermedia as Objectified Thought

From Lev Manovich's On Totalitarian Interactivity:
Interactive computer media perfectly fits in this trend. Mental processes of reflection, problem solving, memory and association are externalized, equated with following a link, moving to a new image, choosing a new scene or a text. In fact, the very principle of new media – links – objectifies the process of human thinking which involves connecting ideas, images, memories. Now, with interactive media, instead of looking at a painting and mentally following our own private associations to other images, memories, ideas, we are asked to click on the image on the screen in order to go to another image on the screen, and so on. Thus we are asked to follow pre-programmed, objectively existing associations. In short, in what can be read as a new updated version of Althusser's "interpellation," we are asked to mistake the structure of somebody else's mind for our own.

This is a new kind of identification appropriate for the information age of cognitive labor. The cultural technologies of an industrial society – cinema and fashion – asked us to identify with somebody's bodily image. The interactive media asks us to identify with somebody else's mental structure.

Leak: Apple Tablet to be Operated by Foot?


The Web Killfile Plugin

I'd love a browser plugin that removes all links and paragraphs mentioning killfiled people or companies. You'd never have to read about Microsoft or 37signals again. Imagine that for a change!

With an additional server-side component, governments could compute a list of the most annoying people on the internet, and automatically block access to their web sites. This would provide the population with freedom from annoyance during their daily internet use, and prevent annoying people in chat rooms from contacting little children with the goal of annoying them.


Apple Releases John Gruber Special Edition Note App

Cupertino - In a rare move, Apple Inc. (AAPL) today released a special edition of its highly prized and cleverly innovative Notes application for the iPhone, in honor of the well-known evangelist John Gruber.

At a cozy friends-and-family-only reception, CEO Steve Jobs personally introduced and presented the application to Gruber on a small stage; Gruber was visibly moved by the occasion.

The guests, who included Bertrand Serlet and Joan Baez, gave a long standing ovation to the laughing, chatting Jobs and Gruber, who left the stage arm in arm, suggesting a long night out in downtown Palo Alto.

The application, called simply Note, consists of a single virtual page of paper, onto which a user may write small sentences, almost tweet-like. We have reached Gruber, who has by now had time to play with the application, for comment. In his classic style, Gruber writes to the readers of this blog:
First of all, I need to strongly state that although Steve Jobs has bestowed upon me the special honor of being the first and only independent blogger with a Special Edition iPhone App, this doesn't in any way make me "more special" than the other equal and like-minded individuals that make up the harmonious community that we call the Applebubble.
But let's talk about this technical wonder. The Special Edition Note app must be understood as the newest in a long series of insanely great applications designed by Steve, starting with the original calculator.

The brilliance of Note is so stark that it almost hurts the brain. This is innovation in classic Apple style: bold, fearless, different. I can literally feel the Zen-like atmosphere that must have been present in the board room, right after Steve had received the crucial insight in a lightning flash out of the cloud of limitless creativity, and spoke to his savants: "There is only one page."

Look at the diamond-like beauty forged by The Man's sheer mental power. There are no buttons anymore: you don't need a trash button, because why would you want to delete the only page that you have? Marvellous! There's no send button either, because why would you want to send your single page of notes to somebody else? It's probably not interesting to them anyway. And of course, there are no forward and backward buttons, because now we are freed from the folder-like tyranny of having to manage our multiple (ugh!) pages by hand.

There's also no way to enter more text than fits on a single screen. This radical design gives birth to a new paradigm of note-taking – about which I'll blog at length once I've immersed myself fully in this new experience – just like the 140-character limitation of a tweet forces you to compress your complex, subtle, and interlinked thoughts into pearls of clarity, wisdom, and poignancy. If your screen is full, there is probably something on it that's no longer interesting. Delete that piece and you have more room for your precious notes.

Also, Note has no synchronization. This turns out to be yet another grandiose achievement of the single-page design: why would you want to sync a single page, when you probably know what's written on it anyway? Brilliant!

All in all, I want to thank Steve and the whole Applebubble for making this kind of advancement in computing possible in this bright new millennium.
Thanks, John, you've been as fantastically insightful as always, and really, Apple continues to surprise and awe us with marvellous innovations like this!

My, what a cute tank under that surface


“Ecks Emm Ell – In Ze Claud!!”


I, for one, welcome Shanghai as our new creative world capital

I've lived in Shanghai for 8 months, and what I most remember about this ravishing city is the incredible energy, the buzz, the high speed, the striving for excellence in all matters, coupled with a certain gaudiness in going about it (yes, I am ignoring all troubling aspects of the Chinese polity for the purposes of this post).

It is my belief that Shanghai will become the world capital for visual design in the next 5-10 years.

Just look at this flyer for a recent visit of Jan Chipchase to W&K Shanghai:

Would that flyer have looked as cool in New York, Berlin, Paris, or Los Angeles? Nope! Maybe Tokyo could represent, but soon, Shanghai's designers will influence the whole world.

Thoughts about design by Martí Guixé

transformation to ex-industrial society happening right now 0:00-1:00

parallels to futurist period (conservative, violent) 1:30

Sloterdijk: designers are new revolutionaries (good and free living for the people) 2:00

pragmatics, efficiency, no romantics of destruction 2:30

build the thing and the context at the same time 4:00

Wolff Olins


WO is one of the most out-there companies.

You really need to check them out in depth for yourself.

Such are the changes upon us, and objectives = plans and Dubai's fake desert yellow ties it all together with the name of this blog, from Frank Herbert's Dune.

The awesome Helma and the lacking JavaScript

I've written about the impressive Helma server-side JavaScript tool before.

Helma has been under continuous development for years by a group of nice people striving for excellence, mostly around Vienna and in various Alpine valleys.

Last time I wrote about the pros of Helma. This time I'm going to focus on JavaScript, and why I think it sucks qua programming language.

JavaScript: The GOOD
  • Closures and proper lexical scope
  • Everything is an object (for most purposes)
  • Limited, but standardized library of data objects: numbers, strings, booleans, dictionaries, lists, null
  • Every variable and function can be changed at runtime
  • Varargs
  • Exceptions
  • No continuations
  • One of the most important and heavily used languages in the world
JavaScript: The DEBATABLE
  • No keyword parameters
  • No checking of number of arguments
JavaScript: The UGLY
  • Prototype system: Just read the spec. The mind boggles. What were they thinking? I don't think anyone can describe how the prototype stuff works in under one page.
  • this: I think the way JavaScript tries to keep things "simple" (haha!) and avoid a class system, via the weird contraption of constructor functions and this inside functions, should be rejected outright.
  • undefined: What's this for?
  • Unreadable spec
JavaScript: The BAD
  • No macros. You can't program seriously without them.
  • No class system. I think JavaScript should go the full way and offer a multiple-inheritance, runtime-redefinable class hierarchy. The topic of class linearization seems to be a solved problem, and one doesn't need OOP too much anyway. For all prototype weenies out there, such a dynamic class system would mean that you can still do all fancy tricks with extensible objects and mutable inheritance chains.
  • No module system. SML's module system is the best, and would probably work well for JavaScript. Its mechanics are beautiful and simple.
  • No big numbers. Serious damage for a dynamic language.
JavaScript: My ideas for making it better for SSJS
  • Add macros. Shouldn't be too hard, given that Dylan did it, and JavaScript has a quite regular syntax.
  • Add keyword and rest arguments, like Common Lisp.
  • Add a class system, and have method declarations outside of classes, like Common Lisp. Make exceptions objects.
  • Add SML's module system. Well, OK, that's probably overkill, but it would for once solve JavaScript's namespace issues. Or, even more radical, add Alice ML's packages.
  • Make exceptions restartable. Once you go restartable, you don't go back.
  • PlasticsLibraries
I, for one, will abstain from using the wicked language that is JavaScript. GWT is wonderful for the client side, and on the server Java is not too shabby, either.

Yes, Java is crap, but at least they brought in Guy Steele seconds before the point of no return. Python, Perl, and Ruby have a substantially higher crappiness factor than Java (and JavaScript). As Dave Moon would say, “they don't even suck”.

(BTW, my new Lisp will have most of the features noted above.)