Jackson attends Lakers' loss to Rockets

FOXSports.com - NBA- Jackson attends Lakers' loss to Rockets:

Ah the Lakers... so much sadness. I think Kobe's a great player, but he's such an ass! If you're playing basketball, and want to win championships, why WOULDN'T you want Shaq? Why why why?

Additionally:


Phil Jackson (R) watches the game with Kato Kaelin (C) and Lakers owner Jerry Buss. (Mark J. Terrill / FOXSports.com)

Um, what the FUCK? Kato is still around? How does one become a professional hanger-on, anyway?

Foldershare and Google Desktop Search

The Shared Folder

I've been a big fan of Foldershare for a while now, and this feature absolutely kicks ass. I have used Foldershare to keep documents and favorites in synch no matter where I am, but now I can synch up all my indexes and find not only where a document is on my computer, but on which of my computers it resides! That is so cool! Next step? If the document it finds is not in a shared folder, let me share it automatically, based on my permissions.
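
To make the idea concrete, here's a rough sketch of what searching synched indexes across machines might look like. The names and structure are entirely made up by me; this has nothing to do with Foldershare's or Google Desktop Search's actual internals.

    # Hypothetical sketch: search several machines' synched desktop-search
    # indexes and report which machine actually holds each hit.
    # None of this reflects Foldershare's or Google Desktop Search's real APIs.

    from dataclasses import dataclass

    @dataclass
    class IndexEntry:
        machine: str   # which computer the file lives on
        path: str      # path on that machine
        keywords: set  # terms extracted by the local indexer

    def search(indexes, term):
        """Return (machine, path) pairs for every index entry matching term."""
        term = term.lower()
        return [(e.machine, e.path) for e in indexes if term in e.keywords]

    # Pretend these entries were pulled together by keeping each machine's
    # index files in a shared, synched folder.
    merged_index = [
        IndexEntry("work-laptop", r"C:\docs\q2-budget.xls", {"budget", "q2"}),
        IndexEntry("home-desktop", r"C:\stuff\mixtape.txt", {"mixtape", "music"}),
    ]

    for machine, path in search(merged_index, "budget"):
        print(f"{path} (on {machine})")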

Google hires another Firefox engineer via News.Com

Google hires another Firefox engineer via News.Com

The Google news coverage continues here at Iron Yuppie, this time focusing on the Firefox engineer hire as proof that they're creating a browser. Create a browser, don't create a browser, I promise that's not why they're hiring him. They're hiring him because they happen to do a lot of web pages, are pushing browsers further and faster than they've ever been pushed before, and having someone on board who knows a little bit about browsers is a Good Thing(TM). Jeez, people, it's not always a huge conspiracy.

Google's Maps and Redacting

I thought this was an interesting little comparison of the mapping data between TerraServer and Google's new maps.

Google’s White House

TerraServer's White House

For those that can't tell, Google appears to have blocked out the White House buildings with neutral pixels. I certainly do not blame Google; I am sure it is not their doing. I'm not exactly clear on how a picture at significantly worse than 3-meter resolution could be that big a threat; you could get that accuracy from binoculars. I am sure someone out there can let me in on that. The only takeaway is that if you're planning to reshingle the roof of the White House, you'd better use something besides these maps to plan it.


The Pin Clock

The Pin Clock via Engadget

In the continuing clock series...

This may be the first time in recorded history that someone has done something useful with these pin pads, rather than making an impression of their nose or middle finger.

Also on the site:


"We're already hard at work trying to figure out way to hack this thing to
display the time in binary format instead of regular numerals..."

Do you think that this from ThinkGeek would meet the bar?
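
For reference, here's roughly the kind of thing they're after, sketched in Python. This is purely illustrative on my part, not anything from the actual hack.

    # Tiny sketch: show the current time the way a binary clock would,
    # one binary number per field (hours, minutes, seconds).
    from datetime import datetime

    def binary_time(now=None):
        now = now or datetime.now()
        return " ".join(format(part, "06b") for part in (now.hour, now.minute, now.second))

    print(binary_time())   # e.g. "010101 100011 000111" for 21:35:07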

One Minute Rechargeable Battery

One Minute Recharge For Toshiba's Nano-Particle Coated Lithium-Ion Battery

I'm such a huge fan of this, not only because it's super cool, but because it takes a completely different look at a very common problem. Everyone gets annoyed at how little time you get on battery power, but very few people are really away from power outlets for 4 straight hours (barring a plane ride, I suppose). Having a 1-minute rechargeable battery makes it simple to recharge and still remain on the go. It solves 70% of the problem, rather than 100%, but the 70% of people who now no longer have a problem will be pretty happy about it.

Datablogging

John Robb's Weblog

I totally agree with how cool this is. I was having a conversation with someone the other day... basically THE requirement for the next step of the web is to take all the crappy data that's out there and make it structured. To the extent that RSS provides a nice and easy way to do that, I think it'll really advance things. Whether Siebel or SAP decide to RSS-enable their huge databases, or someone else writes the layer that sits on top, is still an open question. But it's absolutely a great idea... imagine being able to keep two directories or two databases entirely in synch through RSS? I love it!
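
As a back-of-the-napkin sketch of the idea (my own toy example, not anything Siebel, SAP, or anyone else actually ships): publish one side's records as a feed, then let the other side pull the feed and notice what it's missing.

    # Toy sketch of "sync through RSS": publish one side's records as a feed,
    # then diff the feed against the other side. Entirely hypothetical.
    from xml.etree import ElementTree as ET

    def to_feed(records):
        """Turn {guid: title} records into a minimal RSS 2.0 document."""
        rss = ET.Element("rss", version="2.0")
        channel = ET.SubElement(rss, "channel")
        for guid, title in records.items():
            item = ET.SubElement(channel, "item")
            ET.SubElement(item, "guid").text = guid
            ET.SubElement(item, "title").text = title
        return ET.tostring(rss, encoding="unicode")

    def missing_items(feed_xml, local_guids):
        """Return items in the feed that the local side hasn't seen yet."""
        channel = ET.fromstring(feed_xml).find("channel")
        return [item.find("title").text
                for item in channel.findall("item")
                if item.find("guid").text not in local_guids]

    source = {"doc-1": "Q2 budget", "doc-2": "Mixtape notes"}
    feed = to_feed(source)
    print(missing_items(feed, local_guids={"doc-1"}))  # -> ['Mixtape notes']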

Burning Bed Kills Queens Man

Gothamist: Burning Bed Kills Queens Man

Not funny, but very interesting from a resource allocation perspective. A man was burned to death under a burning mattress, and the fire engine took 7 minutes to get there. Is that a long time? According to "Firefighters, Chiefs Respond to Survey on Boston Globe Series" (Firehouse.com News), the NFPA guideline is that fire departments should be able to reach 90% of building fires within 6 minutes, yet only 35% of departments actually met that goal in 2002. Ok, so let's say the firehouse that closed down could have gotten the time back to 6 minutes. You could put a firehouse on every street corner and get the time down to 1 minute... does that improve the average welfare of your city? It's a bit like planning for the next tsunami or meteor... sure you can do it, but it'll cost you, and it'll likely have a very rare payoff. The second Mr. Bloomberg started closing firehouses to save money and putting that money into other causes, you knew something like this was going to happen at least once. Note to self: bring this post up when my condo is burning to the ground because the fire engine got to my house in 8 minutes instead of 6.

Traffic in NYC

The New Yorker: Life in the Slow Lane

I love this article... as you may know, I'm a long-time fan of understanding traffic; it's such a fascinating system. The article covers exactly how difficult it is to manage the traffic in New York. Why not build more roads? I'm glad you asked! There's an amazing theory called Braess's Paradox which explains that, believe it or not, adding more roads may actually INCREASE average traffic time! Not because more cars are on the road, but because cars that previously took long routes whose travel times barely change with congestion now switch to shorter routes, where congestion makes things much worse. Click through the link for a detailed explanation, but imagine the following:

You have two ways home, each made of two sections of road. The first way takes 25 minutes + 1 minute for every car on that road for the first section, then 1 minute + 5 minutes for every car on the road for the second section. The second way is the exact reverse: the first section takes 1 minute + 5 minutes for every car on the road, then 25 minutes + 1 minute for every car on the road. It works out like this…

Average length of time to get home for 1 car = 32 minutes
Average length of time to get home for 9 cars (all taking the same route) = 80 minutes

So the city comes along and puts a short cut in between the two shortest roads, each of which only takes six minutes with one car on it… and the short cut is so good it only takes one minute to cover no matter how many cars are on it.

Average length of time to get home for 1 car (taking the shortest roads via the short cut) = 13 minutes!

Good times, right? Wrong!

Average length of time to get home for 9 cars (all taking the shortest roads via the short cut) = 93 minutes… ACK!

Though you can end up load balancing for a while, where certain cars continue to take longer routes, it turns out that once all the roads reach capacity, it's likely that you've INCREASED the average drive home. This basically results from putting more load on the less scalable roads, the ones that congest very quickly, which can slow drive time even more than if you had just taken the long way home. Now imagine having to do a calculation like this for NYC where, rather than one or two routes home, you have a BILLION routes home. I'll leave solving that problem as an exercise for the reader.
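
If you want to check the arithmetic yourself, here's the toy network above in a few lines of Python, using the same simplification as the numbers above, namely that every car piles onto the same route.

    # Travel times for the toy Braess network above, assuming all cars
    # take the same route (the simplification used for the numbers above).

    def long_then_short(cars):
        # 25 min + 1 min/car, then 1 min + 5 min/car
        return (25 + 1 * cars) + (1 + 5 * cars)

    def via_shortcut(cars):
        # short road, 1-minute shortcut, short road again
        return (1 + 5 * cars) + 1 + (1 + 5 * cars)

    for n in (1, 9):
        print(n, "car(s): original route", long_then_short(n),
              "min, shortcut route", via_shortcut(n), "min")
    # 1 car(s): original route 32 min, shortcut route 13 min
    # 9 car(s): original route 80 min, shortcut route 93 min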

But wait, the article gets even better!

Just as the curve of maximum "throughput"—moving as many cars between two points on a road as efficiently as possible—reaches its peak, it abruptly falls off the cliff and is squashed flat against the baseline of the graph.

Traffic engineering is the science of maximizing throughput. What makes traffic jams hard to understand, at least within traditional traffic-engineering practice, is that they tend to occur around the time that the road is performing according to the engineers' peak specification. One important development in understanding this "nonlinear" phenomenon came in 1992, when Kai Nagel and Michael Schreckenberg, two physicists at the University of Cologne, in Germany, began to apply a computational technique known as "cellular automata" (or C.A.) to traffic. In a C.A. model, highway capacity is represented as a two-dimensional grid. Each cell in the grid has one of two "states": empty or occupied by a particle, which in this case is a car. Unlike traditional mathematical models used by traffic engineers, where it is assumed that all drivers are the same, in a C.A. model the particles can be assigned values intended to represent different types of drivers: fast drivers, slow drivers, tailgaters, and lane changers can all be represented in the model. The result is virtual traffic.

Um, by better of course I mean more geeky. BUT boy is it geeky! I love this... where else can you find a system where, one second before it starts failing, it's operating at absolutely peak efficiency? And modeling traffic based on different driver aggressiveness levels? Yummy! If you're designing SimCity 5, please build this in... I'll be indebted to you forever!
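
If you want to play with the idea yourself, the Nagel-Schreckenberg model really is just a handful of rules. Here's a bare-bones sketch; the parameters are picked arbitrarily by me, not taken from their paper.

    # Bare-bones Nagel-Schreckenberg cellular automaton: a one-lane ring road
    # where each cell is empty or holds a car with an integer speed.
    import random

    def step(road, v_max=5, p_slow=0.3):
        """Advance every car one time step. road[i] is None or the car's speed."""
        n = len(road)
        new_road = [None] * n
        for i, v in enumerate(road):
            if v is None:
                continue
            # 1. accelerate toward the speed limit
            v = min(v + 1, v_max)
            # 2. don't hit the car ahead: gap = empty cells in front
            gap = next((d for d in range(1, n) if road[(i + d) % n] is not None), n) - 1
            v = min(v, gap)
            # 3. random slowdown (the "different kinds of drivers" knob)
            if v > 0 and random.random() < p_slow:
                v -= 1
            # 4. move
            new_road[(i + v) % n] = v
        return new_road

    # 30-cell ring, roughly a third full, watched for a few steps
    road = [0 if random.random() < 0.3 else None for _ in range(30)]
    for _ in range(10):
        print("".join("." if v is None else str(v) for v in road))
        road = step(road)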