As a lifelong EV fan I just love watching these two videos:
The electric car beats the Ferrari and the Porsche
The electric car beats the Lamborghini and the NASCAR
Even when you set aside the mega-geek factor and the bragging rights, I believe fast and powerful electric cars and trucks are the way to change the American perception of EVs for the better.
For the record, my first ride in an EV was in 1971, before some readers of this page were born, and it was not a demo or a prototype. It was a commercial vehicle in daily use, a British milk delivery truck to be exact (you may have a hard time finding info specific to these EVs on the web unless you know that the Brit term for them is "milk float"). Being a 'milkman' was a great way to earn money between high school and university, and I was in good company (Sean Connery worked as a milkman in Edinburgh, although he drove a horse-drawn cart, not an electric 'float').
In techno-speak and biz-think, the role of the electric milk float meshes perfectly with the traditional characteristics of an electric vehicle. The range was 30 miles, plenty for the inner city delivery route I covered. The speed topped out at 30 mph, the highest speed limit of any of the roads on the route. The float pictured on the right is pretty much the same as the one I drove. It is even in the livery of the Unigate company, the same dairy I worked for, owned by food giant Unilever. The image is from the amazing milkfloats.org.uk web site. Amazing because yes, there is a whole web site devoted to these vehicles.
The awesome torque of electric motors was perfectly suited to getting a loaded truck off the mark and up to speed in a hurry. The crates back then were metal. The milk bottles were glass, and a full load of 750 Imperial pints weighed, well, it weighed a whole...a big...well a heck of a lot (if anyone happens to know how much, I'd love to hear from them). The point about the weight is, heavy loads are easy for an electric motor to handle (as most EV fans know, electric motors drive locomotives and cruise ships). Furthermore, the weight declined during the seven to eight hours that I spent dropping off full milk bottles and picking up empties, even as the batteries were being discharged. Back at the depot I would plug it in to recharge overnight and it would be ready to go the next morning.
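Lacking the dairy's own figures, a back-of-envelope estimate is still possible. Every number below is an assumption on my part (milk density, bottle and crate weights), not a figure from Unigate, but it puts a full load somewhere around 800 kg before you even count the driver:

```python
# Rough estimate of a full milk float load: 750 Imperial pints in glass bottles.
# Assumed figures: milk density ~1.03 kg/L, empty pint bottle ~0.4 kg, metal
# crate ~2 kg holding 20 bottles. All are ballpark values, not dairy records.
IMPERIAL_PINT_L = 0.568      # litres per Imperial pint
MILK_DENSITY = 1.03          # kg per litre (assumed)
BOTTLE_MASS = 0.4            # kg per empty glass bottle (assumed)
CRATE_MASS = 2.0             # kg per metal crate (assumed)
BOTTLES_PER_CRATE = 20       # assumed crate capacity

pints = 750
milk = pints * IMPERIAL_PINT_L * MILK_DENSITY        # ~439 kg of milk
glass = pints * BOTTLE_MASS                          # ~300 kg of bottles
crates = (pints / BOTTLES_PER_CRATE) * CRATE_MASS    # ~75 kg of crates

total_kg = milk + glass + crates
print(f"~{total_kg:.0f} kg, i.e. about {total_kg * 2.2046:.0f} lb")
```

In other words, the morning load was on the order of four-fifths of a ton, which makes the off-the-mark torque all the more impressive.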
Remember folks, those EVs have been working like that, efficiently and pollution-free, since the 1960s. This was not a reaction to the oil crisis of the 1970s. What do you bet that more than 80 percent of all U.S. Postal Service delivery vehicles fit the 30/30 operational parameters of that old milk wagon? We could have had four decades of great gas-saving and emission-reduction from the postal service rather than a sweetheart deal for a petroleum-based government contractor (Grumman seems to make most of the postal vans I see in Florida--and I think the USPS ordered them in 1986).
China Sentences Former Drug Regulator to Death: Accountability indeed
China’s former top drug regulator has been sentenced to death for taking bribes to approve untested medicines. This was announced as the country’s main quality control agency started its first recall system targeting unsafe food products.
One can't help but wonder if there is something America can learn here about accountability. We see Bush appointees departing office, after making dismal and disastrous decisions, loaded with medals on their chests and cheered by pats on their backs. In contrast, the Chinese are executing a public official for taking bribes. A meaner-spirited person than I might be tempted to wonder just how many people would be left alive in the White House if we applied the same rule here.
Money Handling Lags Behind Technology [in big chunks]
Finally, on 5/22, I got the payment through for cobb.com, sold 8 days earlier. As I suspected, the buyer has positioned the domain as a parking site. Whether or not the owners will now try to sell it to a "Cobb" business, I don't know. I contacted as many of those as I could ahead of the auction and they were obviously outbid by the new owner.
The transaction took a surprisingly long time to complete. What we have right now in the transaction field is a strange mix of models and technologies. Some transactions seem fast. For example, deposits to, and debit purchases from, my Bank of XXXX account seem almost immediate, although the 'posting cycle' may not always match what you see when monitoring online. Paypal seems to happen fast and top-ups from my bank account are pretty quick.
But try moving a lot of money and things slow down. When you go from moving hundreds of dollars to shifting thousands, your choices start shrinking. At the same time, confidence in the system and trust in the customer seems to decline (as anyone knows who has heard the dreaded words come through the drive-thru speaker: "There'll be a hold on these funds").
As with all things commerce-related, it's all about trust and so far there is little evidence that the new forms of trust enabled by technology have outpaced the new forms of trust-abuse, a.k.a. fraud, that technology has engendered.
Anyway, cobb.com has gone, long live cobbsblog.com!
The Intuitive Interface Myth: The fault of gurus and experts
Okay, so I am officially fed up with the notion that graphical user interfaces are "intuitive" and "easy to use." There is nothing inherently intuitive or easy in a GUI. It all comes down to the design. Moving a mouse pointer over an icon and clicking it may look cool, may feel cool, but how easy is it for the average person? The answer depends on a variety of factors, like hand-eye coordination and icon design. Half the time my screen has a bunch of icons on it whose meaning is less than obvious. In other words, I have to learn what the icon means, I cannot simply intuit the meaning. Surely a word would be better? Yes, I know that you can turn on words for some icons, but this is inconsistent between applications and operating systems. And when you get to the web all bets are off. Some sites underline links, others don't. Some use rollovers, others don't. The same function is given different names on different sites, and so on and so forth.
How did we arrive at this situation, where computers and software are designed with interfaces that are non-obvious? Obstacles and not enablers? There are several parties to blame. Let's start with the industry giants and the wars between them that did not help (a great case study for MBA students--how the free market influences interface design--does the iPod dominate MP3 players due to interface? Did the windows wars between Apple and Microsoft help or hinder the interface evolution?).
Competition is great for some things, but when companies get fixated on one-upping the competition (in order to sell more product) there is a tendency to force software and hardware developers to add bells and whistles and do things differently, even when an unadorned standard config is working fine. There is a whole book in this phenomenon, but consider one example, an interface issue that may well be the single greatest cause of lost productivity in the late nineties and early oughties (or whatever this current decade is called).
I'm talking about the way File Save works. Back in the old days, somewhere between the Pterodactyls and the 386 chip, it was "standard" for the File Save command to require confirmation, much the same way that the File Save As command does today. Suppose you had opened up the spreadsheet of weekly sales figures and updated them. When you selected File Save the spreadsheet application would ask you to confirm the overwrite: Yes or No? The reason for this was obvious: You might want both versions of the spreadsheet, the one that you opened and the edited one. The latter might be very different. For example, the original might be the Megabank proposal which you had edited to become the Ultrabank proposal. You might have deleted a lot of information from the original on the way to the new version.
Obviously the File Save As command is for just such situations, but if there was one instruction that was drummed into the brains of early adopters of PC technology, back in the days when PCs were prone to disk crashes and brownouts and OS flakiness, it was this: Save now and save often. At that time, saving was not a destructive process. But it became one. And the Apple Mac was where it started. The Mac introduced "File Save with no overwrite confirmation." This meant you could have a problem if you opened a 10 page report, spent an hour re-writing the last 5 pages, hit File Save, then changed your mind about the changes. Even worse, open the document, perform Select All, Cut, File Save, and think about what happens if the machine hiccups before you Paste.
In all these scenarios there were workarounds that prevented them from being problematic, but they required a significant change in work flow. And for what? To make it easier to save work, a goal not necessarily accomplished without some hard lessons and tough data losses in the interim. Arguably things got worse when Microsoft Windows apps aped this style of File Save. (I well recall long distance arguments as a beta-tester with Borland as it struggled to choose the file save style for Quattro Pro--go with the new Excel/Mac "overwrite" style or stick with the traditional "confirm overwrite" style of Lotus 1-2-3.)
Windows aspired to be like Mac only different. That led to several File Save issues. One of the benefits of a graphical OS is the ability to convey more information in the same space. For example, an application could show if File Save was necessary by graying out and disabling the File Save command when the version of the document in memory was the same as that stored on the hard drive. But that feature has never been implemented consistently. That's a pity because it is really handy to know if changes have been made. Consider the task of editing a large image where the File Save command can take a long time to execute; performing unnecessary file saves in this situation is a real waste of time. The Canvas graphics program is one application that conforms to the "gray=saved" convention.
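Under the hood, the "gray=saved" convention boils down to a single dirty flag: the menu item is enabled only when the in-memory document differs from the copy on disk. Here is a minimal sketch in Python (the class and method names are mine for illustration, not taken from Canvas, Dreamweaver, or any other application mentioned here):

```python
# Minimal sketch of the "gray=saved" convention: File > Save is enabled only
# when the in-memory document differs from what was last written to disk.
class Document:
    def __init__(self, text=""):
        self.text = text           # the in-memory version
        self.saved_text = text     # the version "on disk"

    @property
    def dirty(self):
        """True when there are unsaved changes."""
        return self.text != self.saved_text

    def save_menu_state(self):
        # A GUI would use this to draw File > Save black (enabled) or gray.
        return "enabled" if self.dirty else "grayed"

    def edit(self, new_text):
        self.text = new_text       # any edit makes the document dirty

    def save(self):
        self.saved_text = self.text  # after saving, Save grays out again

doc = Document("draft")
print(doc.save_menu_state())   # grayed: nothing to save
doc.edit("draft v2")
print(doc.save_menu_state())   # enabled: unsaved changes exist
doc.save()
print(doc.save_menu_state())   # grayed again
```

The point of the sketch is how cheap the feature is: one comparison per menu redraw buys the user a reliable answer to "do I need to save?", which is exactly why its inconsistent adoption is so frustrating.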
The current "saved" status of a document is particularly important when you are dealing with files that exist in two places, such as a web site you are editing locally before uploading. Fittingly, Dreamweaver MX is another app that uses the "gray=saved" convention.
I like the "gray=saved" convention but like a lot of interface conventions one cannot rely on it being there across apps or platforms. Why is this a problem? Because better and more consistent interfaces improve productivity and safety. We're all familiar with steering wheels. They allow us to jump behind the wheel of any car and navigate through traffic with a high level of expectation of success. They are a convention that car makers mess with at their peril, however much they want to "out-innovate" or "one-up" the competition. And we don't teach our kids to drive cars by telling them clockwise for starboard, anti-clockwise for port, because those are not the conventions used in driving cars. Port and starboard are for boats, where steering is sometimes a matter of push the tiller right to go left and so on. But in the early days of automobiles, some used tillers. Most people agree the wheel thing was a step forward and it has been the automotive interface standard for navigation for nearly a century. Maybe computers could use a similar period of interface standardization and stability.
Pew Survey Finds Most Knowledgeable Americans Watch 'Daily Show' and 'Colbert'
Yes! I am officially a "most knowledgeable American" according to a Pew Survey. And you know dem Pew guys is smart.
Carter Calls Bush Worst in History: What's wrong with that?
So, former President Carter was quoted by the Arkansas Democrat-Gazette as saying the Bush administration "has been the worst in history." Fair enough. It is an opinion that I happen to share. But the Bush White House, still determined to undermine the cornerstone of the open society, namely criticism, expressed outrage and called Carter "increasingly irrelevant." Yeah right. The opinion of a former President and Nobel Prize winner is irrelevant. Sadly, Carter felt he had to back-pedal as reported in this story.
Personally, I don't hold with this whole "past presidents don't criticize sitting presidents" thing. After all, Reagan saw fit to criticize Clinton and took several "cheap" shots (remember "I may not be a Rhodes scholar but..."). Consider this:
Eisenhower was critical of John F. Kennedy's domestic policies, the first President Bush pounded on Bill Clinton, now his pal, for his Haiti policy, and Nixon chided the first President Bush (for comparing himself to Harry Truman in his 1992 re-election campaign). Theodore Roosevelt was brutal in his assaults on Taft and Woodrow Wilson. (Media Matters)
Media Matters - ABC, CBS still have not reported on Comey's revelation of wiretapping "hospital drama"
An interesting media watch-dog site reports that ABC and CBS still have not reported on Comey's revelation of the wiretapping "hospital drama." This is the bizarre-but-true story of how, in 2004, White House counsel [now Attorney General] Alberto R. Gonzales and White House chief of staff Andrew Card attempted to pressure then-Attorney General John Ashcroft "at his [hospital] bedside" to approve an extension of the secret NSA warrantless eavesdropping program over strong Justice Department objections. I would never have thought I could feel sympathy for John Ashcroft, but it just shows how a really bad job performance [Gonzales as AG] can make a mediocre job performance [Ashcroft as AG] look positively stellar by comparison.
The Comey referred to above is James Comey, who was then the number-two man at the Justice Department but temporarily in charge because his boss, Attorney General John Ashcroft, was seriously ill, hospitalized with pancreas trouble. According to Brian Williams, reporting for NBC, Comey was on his way home from work when he got an urgent call and sped to the hospital:
"...he ran up the stairs hoping to get there before Alberto Gonzales, then White House counsel, and Andy Card, White House chief of staff. He [Comey] says when they arrived, they tried to get Ashcroft's approval for an extension of the eavesdropping program despite strong Justice Department objections. He [Comey] says Ashcroft lifted his head off the pillow and adamantly refused to sign."
Was it Gonzales' willingness to sink this low that got him Ashcroft's job?
Domain Name Selling Not Cool for Quick Cash
Okay, so the auction of cobb.com ended a little after noon on May 14. Some 5 days have passed without payment actually arriving in my bank account. The domain 'broker' assures me this is normal.
Ergo, don't look to a domain name sale to raise really quick cash. I will let you know when the money does come through, and share a few tips on how to speed up the process (hint: you'll need to know your EPP code to get paid for your domain). I will also share some thoughts on the final price and tell you who bought it.
BTW, at this point I have not been told who bought cobb.com but my guess is: A domain name speculator.
Mitt Romney Wants To Re-Tool Washington
Apparently Mitt Romney Wants To Re-Tool Washington, according to a Mike Wallace interview with the contender for the GOP presidential nomination. (That page also has links to several video interviews with Romney).
Now, a lot of people would agree that DC needs a good re-tooling. But I don't think Romney is the person to do it. It's not just that I rarely vote Republican, and it has nothing to do with the fact that he is a Mormon (dare I say "some of my best friends are Mormons"?). And I don't have a problem with politicians changing their stand on issues. How else are we going to get change? If we insist that every politician who changes his or her mind be discarded because of it, we are not going to have a democracy for very long. A free and open society must leave room for criticism and change. I just happen to disagree with him. Consider:
"Among the things he wants to do as president is increase U.S. troop strength overall by at least 100,000 and modernize military equipment."
We need less military, not more. We have more and better equipment than any other standing army of comparable size. We just use the stuff wrong. No equipment overhaul or troop increase is going to put a stop to terrorism. You defeat terrorism with humint and diplomacy, not laser-guided bullets.
"He wants to secure the Mexican border and decrease U.S. dependence on foreign oil."
Well actually you can't secure the Mexican border (or the Canadian). Immigration is a problem best solved with economic policy, not unworkable gestures like fences. But besides that, no politician is actually against securing the border, so you are hardly standing out by saying that. And most people on the planet want America to decrease its dependence on foreign oil. It is a position so obvious that it wins Romney no points with me.
"He’s against gay marriage and civil unions..."
Sorry, a politician holding that view has to be very special in every other department before they get my vote.
"...and says that he'll hold the line on taxes."
That strikes me as code for leaving in place the tax breaks for big business and the super-rich. Not something I agree with. Some of those businesses are oil companies--whose interests are not the same as those of the American people. It is we the people who will win the energy war, not politicians or oil companies.
Can You Believe Your Own Google?
Do you Google yourself? It sounds like rather a personal question so let me break the ice here: I Google myself, about once a week. In other words, I enter my name into the Google search box to see what comes up. Why? Because I can. Because I'm a techie. And because my ability to get new and interesting consulting assignments depends, to some admittedly unquantifiable extent, on those Google results.
But lately I've become concerned that results you get IF you are logged into Google when you Google yourself are different from those that a stranger would get.
In other, hopefully less clumsy, words: the results that Google returns about you could be different on a stranger's computer from those you get on your own computer (if you are logged into Google on that computer).
I don't know this for a fact and it is a hard fact to check because the results that Google returns can change each time you plug in the same search term (at least that is my experience). So, does anyone know the answer to this one? Does Google slant the results to you if you are logged in? This is not a trivial question and in my next post I will explain why.
P.S. My hat, indeed all three of my hats, is off to Stephen Euin Cobb who often tops me in the Stephen Cobb results. Nicely done Sir!