
April 27, 2006

The Long Tail of Advertising Getting Longer

Microsoft acquired an in-game advertising company, Massive, yesterday.

No kidding - I think it's really going to be like that old Saturday Night Live skit, the Change Bank.

Change Bank: "We make change for you. You got a $1, we can give you 20 nickels. Or we can give you 5 dimes, a quarter, and 25 pennies - its all up to you."

Customer/Narrator: "How do you make money?"

Change Bank: [deadpan] "Volume".

See my earlier post, also from yesterday.

April 26, 2006

The Long Tail (the book) and Remnant Inventory Management

Hypothesis: When we look at the monetization opportunities in the Long Tail that the Internet promises and is beginning to deliver, remnant inventory optimization is the business model, not an adjunct in the margins.

Optimization has always been of interest to me, and not just the optimization of code. Now, to be clear, I tend to think of myself more as an algorithms-type person than an optimization specialist (for example, I abhor Assembly programming for anything and everything), because living in the margins just doesn't seem that fun. In the business world, for example, this usually takes the form of arbitrage of some kind - there's gold in them thar hills, to be sure, but I don't know - it just doesn't generally draw me in, intellectually.

All that said, I do believe that having a mental framework for optimization - understanding the "why's" of performance (code or business) - is important in designing the right strategy for execution. And that definitely does get me going. (And, yes, when absolutely required, I will advocate Assembly :))

In that vein, one interesting "emerging" category of business optimization is remnant inventory management. It's not that it's new - supply chain operationalists understand the opportunities well, and waste management (er... productivity/production/inventory/etc. optimization, not Sopranos-style) is a lucrative "hidden" field. As I said, mostly this is in the margins, though it's a core part of operational excellence for any major company: building to order, material expenditures, lean inventory management, etc., because those margins can be large.

And, of course, people in the advertising business (online and otherwise, especially Print Publishing) know all about remnant inventory - those ads in random spots that no one really wants. But somebody's got to buy the ad on the bottom half of page 34, after all - the space is already allocated.


I mention all of this because I came across a new book on Amazon, by Chris Anderson (of The Long Tail fame), called (appropriately): "The Long Tail: Why the Future of Business Is Selling Less of More".

I haven't read the book yet (it's not out yet :)), but it's sure to be a good read, and I agree with at least the title - how's that for a ringing endorsement? :P


My belief is that this New Media Economy will be primarily won or lost by applications, not content. However, I'll posit that in any case, deriving value from the resultant traffic/eyeballs/attention will be about tail optimization - and that, in turn, will require scale to be effective.

April 24, 2006

Sree's got a brand new bag

In all the activities and goings-on around here at AOL, I forgot to mention that I have a new job (sort of).

More than a year ago now, AOL reorganized itself into 4 Business Units (BUs): Access, Audience, Digital Services, and AOLE (Europe - the other three are US-focused). The big reason for that move was to enable focus and prioritization on value drivers that were/are fundamentally different (getting online, advertising, digital commerce, etc.). As a part of that, we've also been working to move the front-end application development teams into the BUs to be integrated parts of the respective businesses.

We (finally) finished that some weeks ago, and I'm proud to say I now run what we've christened the "Open Services" group at AOL, reporting to the CTO. While the application front-ends are in the BUs - focused on the next releases of those applications and the BUs' immediate priorities - my team is responsible for the enabling technologies and infrastructure development that powers the entire business, today and (more importantly) into the future.

Included in my group are: Desktop Client infrastructure (OCP), Host infrastructure (SOA), Instant Messaging and Realtime communications, Publishing systems, Search development, Mail Host infrastructure, and the new Billing and Business Systems.

That "Open" part of the group is very important to our future strategy, and you'll be hearing more about that shortly.

April 23, 2006

P2P: Peer to Peer and Network costs

I'm not sure I was clear on the root issue that prompted my Growing-Pains-of-the-Video-Web article (though reading the links from that post will help...).

The short (short) version: although P2P technologies relieve scalability and cost problems for the originating server(s) - which is good for scaling the Internet - they don't help at all with the network infrastructure congestion and cost issues, which is not so good.

There are some extensions and exceptions to that (for example, P2P on a corporate LAN sort of helps shift some of the costs), but by and large, the reason that some ISPs, university sysadmins, et al., are crying foul is that it costs them time and money to deliver value for someone else.

And these are only the leading indicators...

April 20, 2006

SPAS: Search Spam - an RSS/SEO Mashup Love Story

I don't know about you, but when performing unfamiliar searches, I increasingly run into sites like these two:

http://ptbhkzj.jhloan.com/issue/subtlety.html
http://memory.urnf.com/memoryram/

If you haven't yet, I predict you will.

Welcome to the new world of unwanted information: content spam that targets search engines. By combining relevant auto-categorized (real) RSS feed data with good cross-linking to (real) sites, clean(-ish) URL structures, and semi-static perma-linking concepts - well, these pseudo-sites show up as "relevant" in search engines for the terms they're targeting. Usually not on the first page of search results, but high enough that they get a little bit of click-through (and revenue). Rinse and repeat a few million times - these are machine-generated pages after all - and, heck, the business case just writes itself. And I'm probably not helping by linking to them...

This isn't Search Engine Optimization (SEO) gone wrong, which is how most people think of Search Engine Spam (SES) - that is, benign or malicious conscious manipulation of search relevancy. This is what I call SPAS (pronounced "spaz" :)) - SPAm for Search - feed-aggregated, auto-editorial content that, for all intents and purposes, mirrors a modern content publishing system - except it's editorial gibberish and Internet pollution.

Or is it?

How are these sites any different from Google News? Reselling someone else's content, by organizing it, is the entire premise of the Search Engine business model.

SPAM (unwanted mail).
SPIM (unwanted instant messaging).
SPAS (unwanted search [results]).

Flickerlish Nosescums, indeed, Jack.

April 19, 2006

JavaScript Applications, pt 3: Optimization and Compilers

Note: Benchmarks with JavaScript compilers below.

There was some small amount of confusion regarding my point about chained calls at the end of my last article on JavaScript. Although nearly every JavaScript optimization article suggests the same optimization (explicitly caching nested property access into a local variable) - and I agree with it, especially for DOM access - my main point was NOT about the optimization itself (per se), but rather about understanding why it is an optimization.
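
To make the optimization itself concrete, here's a generic sketch (the element id "list" and the variable names are mine, purely illustrative):

// Re-resolving a chained/nested lookup on every iteration:
for (var i = 0; i < document.getElementById("list").childNodes.length; i++) {
    // ... work with document.getElementById("list").childNodes[i] ...
}

// Caching the nested property access in local variables instead:
var items = document.getElementById("list").childNodes;
var count = items.length;
for (var i = 0; i < count; i++) {
    // ... work with items[i] ...
}

Same result, but the second version resolves the lookup chain once instead of on every pass through the loop.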

There are two distinct reasons that JavaScript can be slow(er), and it's useful to separate them in attempting to achieve optimal performance, and, more importantly, in understanding how best to use JavaScript in large-scale application designs:

1) JavaScript is (usually) an interpreted language
2) JavaScript is a dynamic language

There's been a fair amount of work around JavaScript compilers and compilation, most notably from Microsoft (JScript.NET and the CLR) and Adobe/Macromedia (Flex beta 2 and the Flash 8.5 Player). And while compilation (and just-in-time compilation) can help with some class of things (not to mention Moore's Law helping out over time), the very dynamism that makes JavaScript useful as a rapid-application-development and prototyping language continues to make it necessary to internalize the trade-offs that dynamism causes.

Basically, because (among other things) "dynamic" essentially translates into "late binding" for property discovery, even compiled JavaScript won't address many of the Programming-in-the-Large problems - those are better handled by classes, prototypes, strict typing, co-routines/generators and the like (which I'll cover in a future JavaScript post) - some of which are really future JavaScript features. For now, if you want to scale your JavaScript applications, properly leveraging instantiation and the binding model is really the key. Compilers won't really help, and the interpreter isn't (really) the issue.


By way of example, it's often said that the Microsoft .NET CLR isn't good for dynamic languages.

It's not so much that it's not true (IMHO) as that it's misleading. The Microsoft CLR (VMs, JIT-ing, and compilation generally) just doesn't particularly help for dynamic languages.

Let's look at a simple benchmark (JScript.NET version shown):

function benchmarkmath() {
    var x1 : double = 0;
    var x2 : double = 0;
    var x3 : double = 0;
    // 10,000 x 10,000 = 100,000,000 iterations of a multiply, a sqrt, and an add
    for (x1 = 1; x1 <= 10000; x1++)
        for (x2 = 1; x2 <= 10000; x2++)
            x3 += Math.sqrt(x1 * x2);
}


[I got this from a site that I can't find anymore, so apologies to the author - I'll link to it as soon as I dig it up...]

The benchmark does some very simple floating point math (multiplication and addition) in a tight loop (100,000,000 times) that also invokes a complex "native" math function, Math.sqrt. Though I wasn't able to replicate the timings listed on the site (which showed that C, C#, and JScript.NET were basically the same), here's what I saw on my 1.2GHz Pentium 4:

C (Visual Studio): 5.6 secs (provided only as a baseline)
JScript.NET (MS CLR): 10.8 secs
Flash Player 8.5 (Flex): 20.7 secs
JavaScript (SpiderMonkey): 193.75 secs


Cool!

The JavaScript compilers really did well - and so JavaScript is slow because it's just an interpreter in the browser, right? This would seem to be further validated: when I performed the same benchmark with previous Flash Players (which had not only a "regular" interpreter, but a SLOOOOOOOW interpreter), the test takes over 1000 seconds (I stopped it because I got bored).

However, I then leveraged the basic "optimization" I mentioned above and removed the chained call. I basically added a variable (var f:function = Math.sqrt), and invoked that instead of the Math.sqrt function directly. New times:

JScript.NET: 114.4 secs
Flash Player 8.5: 116.7 secs
JavaScript: 178.9 secs
(note the small win for the chained call removal here)
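
For concreteness, the modified benchmark looks roughly like this (JScript.NET version again; the function name is mine):

function benchmarkmath2() {
    var x1 : double = 0;
    var x2 : double = 0;
    var x3 : double = 0;
    // Cache the method in a local variable, per the change described above -
    // the call target is now "dynamic" from the compiler's point of view
    var f : function = Math.sqrt;
    for (x1 = 1; x1 <= 10000; x1++)
        for (x2 = 1; x2 <= 10000; x2++)
            x3 += f(x1 * x2);
}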

Oops.

Apparently, the JavaScript compilers can inline (or at least reduce to a jump) the call of a function on an immutable object. Once it becomes a little dynamic... not so much. You'd hope the virtual machines would do some caching of property look-ups or something, but this is indeed harder than it first sounds because of the (potential) volatility in dynamic programming contexts.

So if your design scales beyond the trivial, not much help is coming from this direction.

Still, the dynamic nature of the language does afford a fair amount of power - one just needs to understand (how to avoid some of) the costs. In particular with Object Oriented Programming (OOP) paradigms and JavaScript, it's important to remember that nested functions are NOT classes, though they look like them, i.e. they emulate some of their behaviours.

I meant to delve into that more this time, but it'll have to wait I guess.

I can't recall exactly where I was going with this series, but next time I'll (finally) cover some specific JavaScript optimization and performance tips and recommendations, for JavaScript in the browser as well as compiled, with some time tests/benefits. Again, this isn't about performance, exactly - just that understanding the "why's" of the performance is useful in understanding the strengths and weaknesses of the language itself.

April 18, 2006

Burst-ing Apple's Bubble: Patent axe falls on iTunes

The patent fun continues: Apple is now being sued by Burst technologies - they want 2% of iTunes revenues.

I'm familiar with Burst (and iTunes/QuickTime), though I don't know the details of the case, so everything I say should be taken with the view that I'm an armchair critic.

In my view, Burst may have been the first to suggest their ideas on media streaming, but mostly (and likely only) because they were (among) the first to look at those problems. That does not make their work novel Intellectual Property. I hope you don't read this as sour grapes or NIH - when anyone (and pretty much everyone) else looked at the same problems, they arrived at (basically) the same solutions.

Patents are supposed to reward innovation, not discovery - being Einstein, not being Columbus.

A big part of the problem is that challenges really seem to stand on date and prior art, not technical validity - i.e., is it actually innovative? As long as that's the case, we're in the IP equivalent of the Wild West.

Make sure you're armed when you travel.

April 17, 2006

Growing Pains of the (Video) Web

I'm seeing more and more articles about Network costs and possible Internet tiering - ranging from P2P costs and BitTorrent (courtesy of slashdot) to the "free ride" the "New Media" conglomerates of the Internet are getting on other people's backbones.

In general, the Internet has thrived on a settlement-free model across the interconnected nodes of the 'Net (remember why they called it the "Web"?), and I'm generally an egalitarian, meritocratic technologist kind of fellow, so may the best products and services win - an Internet where content and application providers pay to access consumers seems like a barrier to scale and success. That is, I think a rising tide lifts all boats, so let's not open the drain :)

That said, I also think these discussions are not just the result of "old media" greedily eyeing the profit structures of companies like eBay, Amazon, Yahoo, Google, etc., and wanting a piece of the pie. One of the ugly truths of the Internet is that we (royal Internet "We") really don't know how to scale it (yet) cost-effectively (read: pay for and maintain the cost structure for high quality media delivery through advertising alone).

Robert Cringely (of PBS fame) discusses this quite a bit, in terms of technology implications - we're not even close to Broadcast TV at scale, for prime-time numbers, and we're falling further behind as HDTV and the like catch hold. Cable pipes are shared (traditional cable works because it's essentially a multicast model), and DSL pipes don't/won't hit the scale required. Peer to Peer technologies (P2P) are a promising answer, but there's a lot of work there to pay off the Long Tail for high quality rich media content and applications, especially when you consider streaming, and not just delivery.

Of course, this also leads me to wonder (no point here - just thinking aloud):

(a) Why do most videos I watch on my laptop look like QuickTime demos from 1993? Conversely, when I buy or download a "DVD"-quality video, it takes forever - I can get it from my cable company's on-demand service in real time... and is being a better TV really where this ends? Which then makes me think...

(b) It's not clear to me that the Internet is ultimately a content medium - only, or even primarily. Perhaps this trend towards tiering is the first thrashings of separating "tools" from "distribution" (or something) - or maybe we're running headlong into "economy of scale" meets "economy of specialization".

The good news is, it's all just getting started - heck, we're only up to 2.0. Everybody knows it won't be any good until version 3.

April 14, 2006

Blog-flog

I've been receiving a fair amount of grief at the office for the Google Ads that accompany my site. One could say that I've been getting blog-flogged :) (an obviously inspired choice of words, perhaps, but I'm trying to coin a new phrase again :))

Examples of AdSense enticements to visitors of my blog:
- Uninstall AOL
- Offshoring and Outsourcing
- Netzero ISP
- Careers in DC and VA

Not as bad as this, but still...

Ah, well.

I, for one, welcome our new thought overlords. God bless, and please pass the PageRank.

April 13, 2006

Go, Go, S-E-O! Go, Go - Uh oh...

I saw this article in the Times when it first came out, but it really stuck with me and I thought it worth sharing: This Boring Headline Is Written for Google

(Read it and come back - I'll wait.)

As they say: He who has the gold, makes the rules.

Increasingly, content is being designed to be found and well placed in search engine results. It's called SEO (Search Engine Optimization). This isn't a hyped media concept: it's really happening, and it translates to real business. From a technology perspective, no big deal - this has always been the case (or the idea at least) - there's a give and take to make sure that discovery and presentation technologies stay in synch.

But having to profoundly adjust editorial policy to ensure that content is easily found in Internet search engines? Yikes. Talk about the tail wagging the dog.

Or is it all that bad?

I mean, you're not supposed to judge a book by its cover (*ahem* an article by its title), and editors make those kinds of editorial decisions ALL the time to drive up readership. One could argue that this uniformity helps better organize the world's information.

And this isn't the search engines' fault, per se - there is metadata (semantic content) augmenting the presented content that should be able to inform classification and rating. The problem is that too many people, over the years, have abused it for their own gain.

So search engines have had to figure out how to judge "honesty", and the best answer seems to be "speak plainly". And we (royal Internet "we") did elect to put those search engines in charge of everything (metaphorically), because, well, they really did work better at finding stuff - and heck, they're free.

Still, I'm reminded of Ben Franklin's famous phrase, "Those who would sacrifice liberty for security deserve neither."

Ah, what did he know? The guy was an extremist rebel kook.

I, for one, welcome our new thought overlords. God bless, and please pass the PageRank.

More on this soon.

April 11, 2006

Microformats: the new, new XML

It's interesting to watch the wheel of reincarnation at work in Computer Science. Everything old is new again (again), but just a little bit better. That, in and of itself, is no big deal - it's the way of all history in all human endeavors.

The amusing part is the swing of arguments (and arguers) as various design patterns come in and out of vogue (remember this story?).

For a while (late 80's, early 90's), proceduralism was all the rage: everything was an application. In the mid-to-late 90's, of course, we just knew better. HTML and JavaScript (among other code hairball paradigms) taught us very clearly why we need to separate data, business logic, and presentation (the old model/view/controller thing).

And now we have microformats.

I've been spending a fair amount of time discussing microformats lately (we're using them in a to-be-announced product that you can play with here and learn about here), and I thought it worth elucidating on a few of the basic concepts.

First and foremost, microformats are really simple. Simple in the same way that XML is simple - the format merely codifies a few "good ideas" so that everybody doesn't have to reinvent the wheel, differently.

In the case of microformats, the "good idea" is to overlay your presentation format/data with a tagging structure that allows you to extract semantic information - i.e., mix presentation and data, but in a way that allows you to still separate the two when required.

It sounds complicated every time I try to explain it (and I'm not the only one that makes it harder than it should be), but it's just an evolution of XML data techniques.

With XML application systems (data-driven designs), you separate the data that drives the system from the application itself. XML might be used as an input or output, and it standardizes the grammar of I/O into simple text blocks annotated with tags and attributes that have a specific syntax (read more about XML) to enable trivial interchange.

You can, for example, take XML data and apply what are called "stylesheet transforms" (XSLT) to convert that raw, tagged text data into another XML grammar (for example, to convert to a different tagging structure for a different application), or to a presentation format like HTML or XHTML. XHTML is a forced clean-up of the mess HTML became because of the permissive parsing in early Web browsers, but it's really basically the same as HTML - just more machine/parser friendly.

So microformats are essentially reversible XSLT transforms applied to XML data. Microformatted content is XHTML, so a browser (or other HTML display technology) can present it nicely, as the originator of that content intended, but the XHTML is tagged with "span" tags of specific "classes" to enable the extraction of the data from the display format.
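
To make that concrete, here's a tiny hCard-style sketch - hCard being the microformat for contact/address card info (the values are made up; the class names "vcard", "fn", "org", and "email" come from the hCard vocabulary):

<div class="vcard">
    <span class="fn">Jane Example</span>,
    <span class="org">Example Corp</span> -
    <a class="email" href="mailto:jane@example.com">jane@example.com</a>
</div>

A browser just renders a name, a company, and a mail link, but a parser that knows the class vocabulary can pull the structured contact record right back out of the page.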

Neat, hunh?

So there are microformats for address card info, mail, calendar entries, etc. - all manner of data that you might want to interchange, but where presentation is still important.

Tastes great, and less filling.

It seems overly complex to even call them "microformats", but I guess you have to call them something.

April 10, 2006

Dodgeball! Tournament

As promised in my previous post, the short sweet summary:

We came, We saw, We got our @##es handed to us.

Turns out other groups at AOL LLC also entered (content groups), so of about 10 or 12 total teams playing, AOL fielded four. Not one made it past the first round of the single-elimination tournament.

Ouch.

We did assemble an AOL Technologies team from the rubble to enter the "losers bracket". Heroes were made, villains were crushed - we beat Google, yay! (I feel 5% bad about that) - and damsels were saved, and we emerged victorious: We won the "losers bracket".

The real final, meanwhile, was an all MSN final (they had two teams playing also) - the MCP was in ascendance and Alan-1 was nowhere to be found...

Looking around, I'd say that ours was the most senior team, career-wise and age-wise, though I have to say (thankfully) we were NOT the only geeks out on the courts.

As we watched the final and were discussing (from the sidelines :)) the strategies of game play and the attributes that made for fine dodgeball playing - was it a good arm? speed? good catching? - it became really clear that the single most important quality for a player was "youth" :)

It was a lot of fun (though there was a surprising amount of unsportsmanlike conduct for a "fun" game - yeesh, relax, people!), and I'd do it again - it's surprisingly satisfying to bean someone with a rubber ball.

April 7, 2006

A True Underdog Story

Updated: The result.

At the behest of our CTO, I've assembled a crack team of technologists to hold the line against the dark tide that seeks to overwhelm. We will be competing against the eyeball-stealing axis of evil (Google, Yahoo, MSN and others) in a no-holds-barred, knuckle-baring, flat-out fight-to-the-finish battle match of... Dodgeball!

Yes, Dodgeball.
As in, dodge, duck, dip, dive, and, uh, dodge.

Seriously, I'm fielding my own AOL team in an intercompany Dodgeball tournament (there will be one other AOL team playing also). We play today, and it should be an interesting tourney - I expect we'll be the only company fielding a tech team.

And... I expect it'll be pretty ugly (less doughy tech types over in Marketing :P), but it matters not, for...

I am Sree. I fight for the users.

(I'll let you know how it goes... :))

April 6, 2006

The Uncanny Valley of CPUs and Moore's Law

In the Computer Graphics industry, there's a concept called the "uncanny valley". The idea is that there's a major visual plateau that you hit when things get VERY realistic looking. For a while, things are more and more convincing as you get more photo-realistic, and more and more pleasing, until the graphics get so realistic that every little thing that's just off jumps out at you.

And because it's otherwise so realistic (and most people see these defects only subconsciously), this can create disbelief, and even revulsion. The gap in belief actually WIDENS in this "uncanny valley" as you approach photorealism, at least until one can iron out these previously unimportant kinks.

I think that's more or less where we are with compute power on the desktop.


The trivial summation of Moore's Law is: "Computers get twice as fast every 18 months". There's more subtlety there (something about empirical observations in the trending of transistor counts on integrated circuits), but it's a fair summation, I think.

Unfortunately, we're not seeing the product and consumer experiences that really benefit from Moore's Law anymore (games aside). We're in the "uncanny valley" of application experiences, at least from a desktop compute power perspective. (OK, so it's more of a plateau than a valley, but you get the idea...)

What's interesting is this: It's not clear to me if this current experience gap, and our (industry "our") ability to clear it, is a failure of sufficient compute power growth to enable these new experiences, or a failure of imagination.

I suspect the latter.

April 5, 2006

America Online, Inc: R.I.P.

The company once known as America Online, Inc. is no more.

AOL is dead. Long live AOL (*ahem* AOL LLC). The "LLC" part was really driven by the deal with Google (hey, we're 5% "Not evil" now!). The "AOL" part? Well, it was time...

One other bit of "trivia" news from Q1(-ish :)), along with the price increase for Dial-up members and the move to "AOL Hi-speed" (real Broadband, broadly available), well, all those "3,099 FREE HOURS!!!" offers?

Gone. Finito.

I think any CD offers that are out there will continue to be honored, but the new, new deal is a 90 day money-back trial (or some such).

We're still crapping out lots of CDs (and so it may be hard to tell), but there were some significant shifts in our business practices (finally) in 2005, and those ripples are, as we're seeing, cascading into the marketplace. These changes to the core business are but indicators.

Time will tell if it all plays out as one would hope, if indeed it will be AOL's "third act", but it sure beats standing still while the world passes by...

April 4, 2006

AOL and the Offshoring question

I mentioned the AOLT All Hands yesterday. It was the first "public" showing for our new CTO (and her senior management team, myself included), and I thought she came off quite well: Talked the talk very crisply.

Now its time to walk the walk :)

However, there was a question that came up during the Q&A which I felt was very important, emotionally, and under-answered during the session. It was a tough question, and my post here is in no way intended as a slight on the answer given at the All Hands (which I did not deliver) - but with some time to reflect, I wanted to try and address it again. "Answering" it is probably beyond my reach.

The gist of the question was:
Q: There is a lot of fear and a lot of talk about outsourcing [and offshoring]. There have been a lot of initiatives to start outsourcing [and offshoring] certain jobs to Bangalore. But what's not clear is if outsourcing is working for AOL. We have several examples in AOLT now where we have successfully outsourced functions or systems, and it appears that we should have enough data to be able to quantify those results. What are the key business metrics any organization should be looking at before they say, this is working, or no, it's not working? Is AOL really saving money from these outsourced initiatives, and if so, can we get some information or data on that?

I think the underlying question can be fairly summarized as: "I understand that offshoring saves us money, on paper - i.e. cost per hour - but do we know (and how do we know) that it saves us money really, i.e. cost per unit of productivity?"


Or, perhaps, even more succinctly: "If its not working, let's stop doing it." (which isn't really a question :))

Here's the problem (and I realize this will seem like I'm dodging, ducking, dipping, diving and, uh, dodging): its just the wrong question.


If "they" are not as productive in our execution (which, of course, "they" are, mostly, though not always and not uniformly - just as anywhere) then that's STILL not a reason to not continue offshoring where it makes sense on paper. We should, naturally, know and deal with productivity issues, such as they may be - but it doesn't change the idea that we, as a Company, and as a managment team, should pursue offshoring and outsourcing.

The issue is, of course, that this is a really touchy subject - and for good reason - it triggers patriotic sensibilities, as well as real job security sensitivities.


However (and this is a big "However", because I don't want to or mean to dismiss these concerns lightly) we have to recognize that there are smart people ALL over the world. It behooves us to figure out how to be advantaged by them - this was certainly a major point that Eric Schmidt, CEO of Google, made when he visited, and he's right.

I'm not saying people are smarter elsewhere, but, tough though it may be to admit, all over the world there are people JUST as smart, and JUST as talented, and JUST as effective. To compete at the scale that AOL does (and needs to) - to compete at a global level - we need to be able to leverage smart resources all over the world.

No one asks, "Why do we offshore [sic] to Mt. View, California?" after all. As AOL, we need to be wherever the talent is, and that includes Bangalore.

April 3, 2006

Tim O'Reilly visits AOL

Timely, given recent posts on this blog about the Web:

Tim O'Reilly (of O'Reilly Media and Web 2.0 fame) was here last week to speak at the AOLT (AOL Technologies, for the uninitiated) All Hands. Jon Miller, AOL CEO, "interviewed" him about Web 2.0 and technology trends in our industry - which was, I thought, a reasonably engaging way to NOT lecture to a bunch of technology people (*cough*skeptics*cough*) while still lecturing them.

AOL is a company that trends toward inward focus, so I think Tim's perspectives were generally quite illuminating and well received - I hope he comes back, and that we (AOL) continue to build a relationship with him. Though not quite prescriptive, he certainly set a tone about the role that big companies play in the Web 2.0 world (in his view), and AOL's opportunities to retain (regain? :)) relevance.

I'm always a little skeptical myself when it comes to pundits (talkers, not do-ers) talking to do-ers, but Tim certainly does not fall into that category. He's got some street cred, and it showed.

Still, the Web 2.0 pitch felt a tad dot-com-y (to me) in that the commerce aspects seem to follow the "get eyeballs/sell ads or get bought" model. I have a hard time believing the Internet is just another media outlet (one-to-many, many-to-many, or otherwise). There's more there than just re-selling your eyeballs to advertisers, but Tim's time was short and the forum was a TECHNOLOGY All Hands, so the omission was probably appropriate.

Tim made a very compelling case that the Web is every bit as disruptive as people anticipated, and the next generation - who, practically speaking, never knew life without the Web - is upon us already.

Naturally, the usual suspects of Web 2.0 were mentioned: Google Maps, Del.icio.us, and Flickr - Tim mentioned that at one point Flickr was pushing builds out live every 30 minutes (to which the crowd applauded). I have to admit - I'd have been more impressed if they were pushing FEATURES out every 30 minutes :P

It may be unkind, but builds every 30 minutes sounds SLOPPY to me, not impressive. I remember pushing out builds of the AIBO 2.0 launch website live (for Sony, obviously) back in 2000 (pre-AOL days) - it wasn't because we were that GOOD. Tim did make a specific and spot-on point about the value of operational excellence in the new Internet world.

He also quoted author William Gibson: "The future is here. It's just not evenly distributed yet".

Please pardon my language, but that's just f***ing brilliant. How is it that I don't already use that one all the time?