
January 29, 2007

Reading online

I've been trying to do more long-form reading online, and it kind of sucks. Anything more than a page, and it gets annoying. I've tried online books, PDFs, technical specs, long-form comics, and it's just not comfortable.

Short stuff is fine - though even long magazine articles are taxing...

I don't think it's a tech issue, per se (resolution, rendering, or the like) - I think it's a device form factor thing. One of the things I always really liked about the swivel/arm generation of iMac was that it entered your plane... it wasn't distant in the way most monitor configurations are.

It's also very possible that it's simply an age/comfort thing - enough effort and I'll get used to it. Nevertheless, I think it contributes to the cycle of ever-shortening attention spans.

On the flipside, it encouraged me to get a wireless printer, which I love and use regularly to read anything more than a few pages. I think somebody will get this right at some point - PDAs/phones are too small, and computers (even laptops) are too far and too ... not right.

January 25, 2007

Sampling theory, pt 1: WPFE in pictures

I got a few questions and comments about my comment that WPFE "botched" the sampling rules for bitmaps/images. I tried to explain in the comments, but it's clear I wasn't clear.

So I'll post a clearer text/mathematical explanation shortly, but here's the issue, by visual example.

First a simple "linework" image:


Next, the same image if I apply a 400% scale to the canvas, as rendered by WPFE (I cropped the result to the upper left corner):


And now, the same image, with a 400% scale as produced by my (new) rasterizer - this is also the same image that Flash 8+ and OpenGL will produce:


See the problem?

I'll explain in a follow-up why this isn't a black-and-white issue, but also why I think MS landed on the wrong side of it. The problem is covered well in the seminal computer graphics memo from Alvy Ray Smith, "A Pixel is NOT a little square..."

Alvy Ray, of course (being Alvy Ray), is not wrong in his final conclusion, but that doesn't make his point right for all applications either, and therein lies the rub... most UI designers, I'll posit, think of images as exactly little bags of rectangles...
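
In the meantime, here's a minimal sketch of the underlying question - my own illustration, assuming the disagreement comes down to the usual "where do the samples sit" convention from Alvy Ray's memo, and emphatically not WPFE's actual code:

```python
# Two common conventions for mapping a destination pixel back to a source
# coordinate during a 4x upscale. (Illustrative only - not WPFE's algorithm.)
SCALE = 4

def corner_mapping(dst_x):
    # Treat pixel indices as coordinates of the pixel's corner ("little square").
    return dst_x / SCALE

def center_mapping(dst_x):
    # Treat each pixel as a point sample located at its center.
    return (dst_x + 0.5) / SCALE - 0.5

# Where do the first few destination pixels sample the source row?
for dst_x in range(8):
    print(f"dst {dst_x}: corner -> {corner_mapping(dst_x):6.3f}, "
          f"center -> {center_mapping(dst_x):6.3f}")
```

With a one-pixel-wide line in the source, that fraction-of-a-pixel disagreement (and how you handle samples that land outside the image at the edges) is the kind of difference the renders above are meant to show: crisp linework in one convention, smeared or shifted linework in the other.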

January 19, 2007

MLK, Jr. Reflections: Name that Source

Monday was the official US holiday in remembrance of Dr. Martin Luther King, Jr. and his contributions to the Civil Rights movement.

So in honour of that, I thought I'd share three quotes from, let's say, karmically connected sources - I'm not going to provide attribution. It's easy enough to find out on any search engine, but before you look it up - think about who you THINK said it; you might be surprised (OK, *I* was surprised - you might not be...)

Quote 1: "Hate the sin, not the sinner"
Why this was interesting: I've heard this forever and always, and had always assumed that it was a theologically derived philosophy. Not so much - turns out it has a very secular origin, and the source provided much of MLK's inspiration. And I really should have known this one.

Quote 2: "Good fences make good neighbors"
Why this was interesting: Like "chuffed" or "droll" (or "absence makes the heart grow fonder..."), this phrase is often used in support of the thing that was intended as its antithesis. By that I mean, the originator meant that fences make boundaries, not neighbors, but it's often taken literally, which is quite ironic. I think Dr. King would have agreed with the original sentiment - being a "together" guy and all.

Quote 3: "If we don't stand for something, we may fall for anything."
Why this was interesting: It's kind of a "philosophical" gimme.... you know, like you said something important, but you didn't really say anything at all. Still makes you feel good saying it!

January 12, 2007

CES 2007: Quick Trip Report

As I mentioned, the CES show floor was mostly uninteresting (especially with Apple stealing a lot of the thunder), but it was great to connect with everyone.

There was one thing I saw that I thought was pretty startlingly amazing, and deserving of another mention: Powercast. It's essentially (odd as this sounds) wireless power. They're expecting it to show up in consumer devices by the end of the calendar year, and I can only imagine how this technology evolves in the future.

The company appropriately won an "Emerging Tech" Best-in-Show award from CNET - the only award, I thought, that didn't seem like a triumph of volume over substance.

January 9, 2007

Apple iPhone: Wow

Everything is software - the rest is just wiring.

I've mentioned the idea of the specific trumping the general for UI, and the value of general-purpose computing environments for enabling more specialized applications and eliminating distribution barriers (and mentioned Jeff Han's multi-touch work before).

And Apple's built something real - much sooner than I expected. Lots and lots of details to follow I'm sure, but the key innovations are:
  • Ubiquitous connectivity,
  • Portable form factor, and
  • "Hardware as software" (most importantly multi-touch)
I think this is disruptive for reasons we don't even know yet. I'm sure many will predict failure, but they're wrong, because the killer app won't be your cell phone replacement - though that'll be the excuse to buy it.

Read about Steve's Keynote here (and the iPhone here).

There's plenty of Jobs-ian hyperbole ("the first fully usable browser on a cellphone?" the Opera guys, or the Nokia S60 browser team, or the mobile Internet Explorer team might disagree - for very good reason), but it's a doozy of an event and signals a significant transformation: Apple wants to become the last mile (the UI) for everything you do. Let everyone fight over the increasing commoditization of infrastructure and networks; Apple just wants the users.

That doesn't mean they're going to get them, but I do think this is a real effort... it's no iWork (like that's a real replacement for Microsoft Office) or (*cough*) Zune... in fact...

...this is what the Origami should have been...

January 8, 2007

Bill Gates and my bathroom

Update 1: I've been informed (by my wife) that, um, shoving a whole chicken in the garbage disposal is NOT a good idea. Go figure.

Update 2: Bwa-ha-ha - my diabolical plan (humour-by-association) appears to be working (You're welcome, Carl!). We now return you to your (ir)regularly scheduled post.

What the heck do they make garbage disposals from? Those things are nigh indestructible. I shoved pretty much a whole (leftover) chicken down my sink last night, along with various other food-ish substances, and that thing just kept chugging along.

And that was just last night. I wonder if that's what all the people who used to make samurai swords are doing now...

And speaking of funny (see what I did just there?), if you're not reading Scott Adams' blog, you should. Much has been made of Bill Gates' prognostications in SciAm last month, about robots and home automation, but I thoroughly enjoyed Scott's take. He puts pretty much the same concepts into more, um, relatable territory: the bathroom.

Scott Adams, of course, is the author of Dilbert, but his blog (and especially his book, The Religion War) is to Dilbert as Office Space is to Beavis and Butthead - all they share is the author (Mike Judge in the latter case) and a skewed world view. I enjoy all four, but in very different ways.

Plus, I'm just hoping you'll think *I'm* funnier when you read Scott's blog; humour-by-association, or something.

And (ironically for this post), Scott Adams is also a (serious) proponent of the Bill-Gates-for-President movement. It's a strangely compelling argument: "For my president I want a mixture of Mother Teresa, Carl Sagan, Warren Buffet, and Darth Vader."

Indeed.

Just keep him away from my toilet...

January 4, 2007

CES 2007 is coming

I'll be out at CES next week, during the middle of the week. It really is the oddest show. The main reason to go, and indeed the reason I'm going, is that everyone else goes.

So it's a great place to connect with a whole lot of people from all over in a very short amount of time. I hear there's some exhibition thing there, too... though, if previous years are any indication, I might not even make it to the show floor (at least, not for very long) - and it's certainly NOT a compelling reason to show up.

Those ouroborosistic sophists would be proud.

January 3, 2007

How to make software faster (with hardware)

Over the holidays I also read some nice posts from a Photoshop architect (and engineer) about software performance with regard to both 64-bit computing and the broader proliferation of multi-core processors (with Photoshop specifically, in this case). I've said this many (many) times, but this is a good excuse to repeat it: the best way for hardware designers and vendors (*cough* Intel *cough*) to improve the utility (and therefore value) of the CPU is to improve memory bandwidth from the CPU. Clock speeds and cores and more processors and fancier instructions will make micro-benchmarks perform better, but RAM access is what will provide BY FAR the most improvement for well-optimized software (read: where algorithms trump assembly).

For example, the biggest advantage that NVidia and ATI, with their GPUs, provide isn't the specialized HW instructions for rendering graphics (though those help): it's that they can access RAM, oh, about 50x to 100x faster than it can be addressed by the CPU.
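
To put rough numbers on the claim, here's a back-of-the-envelope sketch - the figures are my own illustrative, roughly 2007-era assumptions, not anything from the Photoshop posts:

```python
# A single streaming pass over a big image: the time to move bytes through
# RAM dwarfs the time to do the per-pixel arithmetic.
pixels          = 4096 * 4096   # a 16-megapixel image
bytes_per_pixel = 4             # 32-bit RGBA
ops_per_pixel   = 4             # a simple blend: a handful of adds/multiplies

ram_bandwidth   = 6e9           # ~6 GB/s sustained (assumed desktop figure)
core_throughput = 10e9          # ~10 billion simple ops/s (assumed, optimistic)

time_memory  = (pixels * bytes_per_pixel * 2) / ram_bandwidth   # read + write
time_compute = (pixels * ops_per_pixel) / core_throughput

print(f"time moving data through RAM: {time_memory * 1000:6.1f} ms")
print(f"time doing the arithmetic:    {time_compute * 1000:6.1f} ms")

# More cores or fancier instructions only shrink the second number; the
# first one - the memory traffic - is what actually bounds the pass.
```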

That said, it is clear that the dramatic increase in multi-core processors in computers (and units per CPU) will create a new class of software and algorithms that take advantage of them. Nobody's arguing that their sh!# doesn't stink...

January 2, 2007

Review: Michael Crichton's Next

I read (among other things) Michael Crichton's new novel Next over the holidays. As is always the case with Crichton, the book is a meticulously researched (not to be confused with well-researched :P) escapist thriller about the social, ethical, legal (three terms also not to be confused), and technical dangers of some common or emerging disruptive technology. His last two novels, Prey and State of Fear, were about nanotechnology and global warming, respectively. Next is about transgenics.

Part of the joy of Crichton is that he does provide a lot of background education and detail about his field of attack, which is always fun. He is to science and technology what Jon Stewart is to network news - there's real information, but fact and fiction are sometimes more than a little entwined.

But make no mistake: the science and plot are riddled with holes, even more so than in the average novel (Crichton or not), and though most thrillers and sci-fi give short shrift to characterization in favor of plot advancement, Next carries that stereotype to a new level. There are waaaaay too many characters and plotlines, including two characters with the same name, just (as far as I can tell) in order to serve one minor, slightly silly plot point. And the plot depends on a (literally) unbelievable cacophony of coincidences to string it together. But still.... somehow the novel does compel you to turn from one page to the next until the end. The man knows how to keep you reading.

In short, if you're a fan of the Michael Crichton house style - procedural thrillers exploring intellectual ambiguities - chances are you'll enjoy the novel, but not that much. It passes the time, but you won't remember or re-read this book.

If you're not a fan, read something else.

Speaking of which... my favourite Michael Crichton novel, though still adhering to that procedural thriller modus operandi, is The Great Train Robbery. Highly recommended, whether you like Crichton or not - it's a very well-executed "caper" story with just enough social commentary and period detail to engage you in its Victorian-era setting. The characters are engaging if a bit archetypal, but that works in favour of the narrative, which builds steadily to a crescendo of a conclusion that will have you marveling at its elegance (IMHO, of course). It's a page-turner I'd read again and again...