How Many Economists Does It Take . . .

How Many Economists Does It Take To Change A Lightbulb, America?


They say it’s always good to start with a joke. Here’s another. There’s this married couple. The husband says, “Robots, that sounds like something out of science fiction.” His wife replies, “You live in a spaceship, dear.”

You may be cringing a little, either at the joke or because science fiction isn’t your thing. But that’s the point of the joke. Whether you like it or not, you have been living out a science fiction story, or, more accurately, competing science fiction stories, all your life.

            The authors of these stories aren’t names you would generally associate with science fiction or any type of fiction, for that matter -- they are economists.  (Insert your own joke here.)  But one of the many “jobs” economists have is to tell stories, based on economic theory and methodology, about our future.  The really funny thing is people in power believe these stories, and they shape, or try to shape, our economies and lives around them.  

If you have gone to college, odds are it is in part a result of the science fiction story told by the economist, philosopher, and management guru Peter Drucker. In The Landmarks of Tomorrow (1959), a title that just begs for a cover with vintage ’50s sci-fi art, Drucker coined the term "knowledge workers" to describe a new class of workers who get paid to think for a living.

Think for a living? These ideas may sound as clunky and retro as Robby the Robot, but they were actually some of the guiding economic stories for Americans in the twentieth century. Search the phrase "knowledge worker" and you will find it used not just by economists, but by authors in every field with anything to say about the modern workplace.

In 2003 younger economists were still finding huge success just by riffing off Drucker's work. Take economist Richard Florida, whose The Rise of the Creative Class was a national best seller. (Trust me, the words "economist" and "national best seller" don't show up in the same sentence all that often.) But the people comprising Florida's creative class are really just knowledge workers with sexier jobs: think programmer, graphic designer, audio engineer, etc. In other words, Florida has made a really lucrative career just by updating the graphics in Drucker's original knowledge worker sci-fi classic.

            It’s a classic that has spawned not just creative imitations, but millions of college degrees and related careers. Ask most people why they went to college and they will say something like, "to get a better job" or "I needed the degree for my career." Whether they know it or not, what these people mean is they went to college to become knowledge workers.   Until very recently, the college + knowledge worker = upward mobility calculation was considered more of a social science fact than an economic science fiction.  But increasingly obvious economic realities have called the validity of this story into question. 

            Like all science fiction, The Landmarks of Tomorrow sets its story in a world with a specific set of technologies.  Any science fiction writer will tell you today’s futuristic technology often looks old and busted just a few years down the road because of actual technological advances.  For example, had Rufus just given Bill and Ted iPhones and skipped the whole bogus phone booth cliche, I dare say their adventure would have been most triumphantly more excellent!

A year before The Landmarks of Tomorrow was published, the technology that would eventually destroy Drucker's visionary tomorrowland, and the reason so many of you went to college, was already at the working prototype stage. By 1961 the first microchips were being sold commercially.

Given its starting speed and growth rate, it took the microchip decades to become seriously world-changing. Even as it approached a point of power where it began to gain broad public attention (think video games or the personal computers of the 1980s), most saw it only as a long overdue bit of the shiny future the television had been promising them since birth. There were no flying cars. We were expecting flying cars!

Explaining the larger implications of emerging technologies often falls to two groups of writers: economists, who usually explain it to old people in formal wear, and science fiction writers, who explain it to teenagers in trendier clothes. The teenagers, usually years later, explain it to the old people in formal wear. Rarely is it explained by an economist using science fiction, but it will be this time.

Paul Krugman is perhaps the best-known economist in America. In addition to winning the 2008 Nobel Prize in Economics, he teaches at Princeton, writes a twice-weekly column for the New York Times, is a best-selling author, and is the only economist I know of who is the subject of a viral music video. What you may not know about Paul Krugman is that he is very open about his love of science fiction.

In fact, he has credited his love of science fiction as the reason he eventually became an economist. “I read [Isaac Asimov's] Foundation back when I was in high school, when I was a teenager . . . and thought about the psychohistorians, who save galactic civilization through their understanding of the laws of society, and I said ‘I want to be one of those guys.’ And economics was as close as I could get.”

It must then have been something of an adolescent psychohistorian dream come true when the New York Times Magazine commissioned Krugman to write for its hundredth anniversary edition. As requested, "White Collars Turn Blue" (September 29, 1996) is a speculative piece set a hundred years in the future. Krugman uses his opportunity as a professional economist/science fiction writer to answer the question of "why pundits of the time (1996) completely misjudged the consequences" of globalization and digital technology.

The future, everyone insisted, would bring an "information economy" that would mainly produce intangibles. The good jobs would go to "symbolic analysts," who would push icons around on computer screens; knowledge, rather than traditional resources like oil or land, would become the primary source of wealth and power. . . . But even in 1996 it should have been obvious that this was silly.

For Krugman, two core problems made the information economy a bit of tomorrowland that would never be fully realized.

            First, as the global standard of living rose, the third world's inhabitants would want the same traditional resources, oil and land, as well as the same standard of living, eating meat and owning a home, long enjoyed by those privileged enough, by chance, to inhabit the first world.  Even the staunchest vegetarian has to concede that a nice steak is really more satiating than a nice icon--even one on a retina display iPad.  

Second, the computers that powered the screens and icons these symbolic analysts (a.k.a. knowledge workers) would be manipulating for a living were, by necessity, going to be wildly powerful. The more powerful the computers got, the fewer knowledge workers a company would need. There was a bright future for you in this information economy -- if you were a computer.

Which brings us to another misjudged consequence Krugman illuminates -- the decline of higher education. Drucker's knowledge worker mythology is what inspired the vast majority of people with college degrees to consider applying in the first place. To be sure, socialization, football, finding a mate, etc. were also drivers, but they were intended to be the icing on the knowledge worker cake. According to Krugman writing from the future, without the promise of knowledge worker jobs, and more specifically the salary and benefits that came with them, interest in colleges and universities soon dried up, which meant a decreased need for institutions of higher education and their professors and staff.

It took a lot less than a hundred years for the information economy, knowledge workers, and higher education to begin mirroring the fates outlined in Krugman's science fiction. In fact, it took less than twenty. In 2013, economists Paul Beaudry, David A. Green, and Ben Sand authored The Great Reversal. Among the more interesting findings of their paper is that demand for knowledge workers actually peaked in the year 2000 -- just four years after Krugman's sci-fi story. The ever-increasing demand for knowledge workers died when the dot-com bubble burst, though no one seemed to notice.

After 2000, those who did notice began championing the idea that there was plenty of work for people with STEM (Science, Technology, Engineering, Math) degrees. You were safe, they suggested; the college + knowledge worker = upward mobility calculation was still valid as long as you had a STEM degree and not one of those “useless” liberal arts degrees. But in 2013 the Economic Policy Institute published a paper showing there was no STEM shortage; only half of all students graduating with a STEM degree were able to find a STEM job. STEM degree ≠ STEM job.

It's not that there were no knowledge worker jobs or no STEM jobs, just not enough to meet the global supply being generated by colleges and universities. So the knowledge worker bubble kept on growing, and belief in Drucker's mythology kept leading people to attend college. But the crash of 2008 should have been something of a cultural wake-up call. When the head of Microsoft, a knowledge worker STEM mecca, laid off 5,000 people and said publicly, "Our model is things go down . . . and they reset. The economy shrinks and then it doesn't rebound, it builds from a lower base effectively," maybe that would have been a good time for a little intervention, a “come to Jesus moment” about the knowledge worker bubble. But college costs kept rising and students kept coming.

In 2010 Edward Tenner wrote a piece for The Atlantic asking if Krugman's prophecy was coming true. But no one seemed to notice, and law school costs kept rising and law school students kept coming. This year (2013), Krugman pointed to fellow economist Nancy Folbre, who also suggests parts of his vision of the future are coming true. But college costs keep rising and students keep coming.

What happens to all these college-educated people who graduate only to find themselves without a choice knowledge worker career? Are they whisked off by UFOs? Are they filed away in the X-Files? No, they cascade. Cascading is Beaudry, Green, and Sand's term for how these college-educated non-knowledge workers influence the economy. Essentially, they take whatever work they can find, which means they are now competing with non-college-educated workers for entry-level positions, Starbucks gigs, or other jobs they would never have considered in the more vibrant information economy they had been counting on.

Even a casual look at unemployment data from the last decade will show you that the more educated you are, the more likely you are to be employed -- just not necessarily in a job that required all that education. The less education you have, the more likely you are to be forced out of the workforce altogether, because everyone is pushing down a rung on the employment ladder, and, as many near the bottom of the ladder have discovered, there are no rungs left for them.

The final joke is that no one knows for certain where this science fiction story, the one we are living in, goes next. In the 1950s, when Drucker was writing The Landmarks of Tomorrow, Americans had a cultural certainty that the future was going to be like The Jetsons or Disney's Tomorrowland. But for those of us living in that actual future, it's not nearly as wonderful as we were told it would be. In the year 2000, a date that already feels like saying back in the year 1900, Avery Brooks asked in an IBM commercial, “Where are the flying cars? I was promised flying cars!” In 2013 college graduates are asking the same question about middle-class jobs.

 

I’d Say, “Well Played,” But He Already Knows That


As you have probably heard, Al Jazeera recently bought Al Gore’s Current TV for “a reported $500 million.” Before his personal taxes, that should garner Gore $100 million. This sum alone makes him worth more than Mitt Romney, and that makes me smile. But it also makes me raise an eyebrow. Why would Gore sell Current TV now? Why not last year or next year?
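For what it’s worth, those two reported figures imply Gore held roughly a one-fifth stake in Current TV. Here is a quick back-of-the-envelope sketch, taking both reported numbers at face value:

```python
# Rough back-of-the-envelope: what ownership stake do the reported
# figures imply? (Both numbers are the reported ones above, taken
# at face value; nothing here is confirmed.)
sale_price = 500_000_000         # reported Al Jazeera purchase price, USD
gore_pre_tax_take = 100_000_000  # Gore's reported pre-tax share, USD

implied_stake = gore_pre_tax_take / sale_price
print(f"Implied stake: {implied_stake:.0%}")  # -> Implied stake: 20%
```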

What you may not know is that Al Gore also sits on Apple’s Board of Directors, or that almost a month to the day before the Current TV announcement, Tim Cook, Apple’s CEO, told NBC’s Rock Center that television is “a market that we have intense interest in, and it’s a market that we see that has been left behind.”

The Current TV deal no doubt took months to orchestrate, but the timing is interesting in light of the Tim Cook interview. It makes me wonder what Al Gore knows about the future of television that Al Jazeera doesn't.

UPDATE: Jan 18th, Gore just used some of his new wealth to buy $29 Million Worth Of Apple Stock

Is Android Ready For Its Kodak Moment?

I keep thinking about Google and Kodak.

Mainly about how their fates may be similar. See, they have a lot in common. Kodak and Google are both epic American success stories. Both companies have used the classic “razor and blades” business model to make billions. Kodak built its business on selling inexpensive, dependable, handheld cameras, but made billions by selling film to the camera customers. Google built its business by offering -- for free -- the most trusted search on the World Wide Web, but makes billions from selling its AdWords.

Some of you are too young to know this, but Kodak was revolutionary! In 1900 Kodak introduced the Brownie camera, which sold for one dollar, and its proprietary roll film sold for 15 cents a roll. (Trust me, they made bank on the film.) The Brownie democratized photography, allowing everyone to become a photographer and making photographic technology “as convenient as the pencil.”

That was just the start for Kodak. The company would spend the rest of its life succeeding through innovation. Just a few major examples to prove the point: in 1923 Kodak offered its first consumer-grade CINE-KODAK Motion Picture Camera, as well as its first consumer-grade KODASCOPE Projector. The immediate popularity of these products resulted in a global network of Kodak processing laboratories to meet demand. In 1935 Kodak introduced KODACHROME, the first successful consumer color film. Because of these and other innovations, it is fair to say that in many ways Kodak defined consumer-grade photography as we know it.

Nearly four decades later, singer-songwriter Paul Simon would still be begging us not to “take (his) Kodachrome away,” but I suspect that not long after the song hit #2 on Billboard's Hot 100, Kodak’s board of directors was already beginning to “read the writing on the wall.” In 1975 Kodak prototyped the first digital camera. In 1994 Apple would release the Apple QuickTake 100 and 150, both built by Kodak. Kodak would not release its own digital camera until 1995.

Much like Xerox's, Kodak’s brief romance with Apple may have been the start of its unraveling. Less than 20 years later, Kodak would exist in name and patent rights only. The digital image, one of its many revolutionary inventions, put it out of business.

I think something similar might be happening with Google and its Android effort. Let’s start with what should be good news for Android. There are vastly more phones running some flavor of Android than anything else. (Here are the iPhone numbers for comparison.) This is a trend that has been going on for some time and shows no sign of stopping.

What all these people are doing with their Android phones is unclear. They are rooting and ROMing and modding them. Like everyone, they’re playing a lot of Angry Birds. They have lots of apps to choose from now. One thing is clear, though: they are not spending a lot of time surfing the web with them.

Know who is surfing the web via mobile devices? iOS users! The iPhone represents only 5.5% of all mobile phone sales, but according to the online advertising network Chitika, iOS represents almost 50% of all mobile web traffic. If we look at the October 2012 Net Applications numbers for mobile browser share, iOS controls 60% while Android controls only a 27% share of this market. This is another trend that does not appear to be changing, and the tablet numbers are even worse for Android.
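To put that sales-versus-traffic mismatch in perspective, here is a quick back-of-the-envelope sketch using the Chitika and sales figures cited above, both taken at face value:

```python
# Rough ratio of iOS web-traffic share to iOS sales share,
# using the figures cited above (taken at face value).
ios_sales_share = 0.055    # iPhone share of all mobile phone sales
ios_traffic_share = 0.50   # iOS share of mobile web traffic (Chitika)

ratio = ios_traffic_share / ios_sales_share
print(f"iOS generates roughly {ratio:.0f}x the web traffic its sales share would predict")
# -> roughly 9x
```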

Who cares which device has the most web traffic? Smartphones are about so much more than just web access. Web access is so 1990s. You can do so much more with a smartphone than just surf the web. There are maps, and apps, and mail, and SMS, and Kodak’s gift of digital photography, and much, much more. And that’s the problem for Google.

Google stole from Apple -- sorry, “developed” -- Android because it saw that mobile web browsing was going to be huge. Google’s main revenue stream is still web-based advertising, largely delivered through its search engine. Google owns search. They own it so much that the FTC keeps investigating and fining them. They continue to own it on mobile devices, but it is a lot less profitable there. Google search is also a lot less visible on mobile devices. On an iPhone/iPad you usually don’t even see Google’s AdWords ads. Plus, there are apps on iOS and Android and (probably) now Microsoft’s mobile products that make use of Google’s search API but never even show you Google AdWords.

If mobile software nibbles into Google’s immediate Android goals, mobile hardware bites significantly deeper. First, devices like the Kindle Fire and NOOK use their versions of Android primarily to sell users Amazon or Barnes & Noble content. But then there is Samsung. Samsung makes bank off the Android OS! The reason Apple and Samsung are in a full-on thermonuclear patent war is that they are the only big winners. Google’s Nexus line is sold at or near cost and is still not a big seller. So the biggest hardware winner from all of Google’s investment of time, energy, and money isn’t Google -- it’s Samsung.

These are not insignificant investments of time, energy, and money on Google’s part. Some interesting “back of the envelope” calculations suggest that Google has spent 15-20 billion dollars on Android. Google first developed Android to bring people to the web so it could sell more AdSense deals. Sadly, all of the above suggests that is not happening. In fact, Google admits that mobile advertising is “decelerating” its ad revenue.

Some have argued Google made the Android investment to ensure competition. Investing 15-20 billion dollars to “ensure competition” that is effectively decelerating your revenue is not a successful strategy. Spending 15-20 billion dollars to create a platform which you then give to Amazon and Barnes & Noble, allowing them to sell content to their customers while simultaneously undercutting your own content deals, is not wise corporate governance. Spending 15-20 billion dollars to create a platform which you give to Samsung so that a competing corporation can make wild profits is just foolish. (Note to Samsung: where’s your app-connected media store?)

You might be tempted to think: “Hey, it’s Google. What’s a few billion?” Well, a billion here, a billion there, and pretty soon you are talking real money. But I wonder if Android isn’t getting Google ready for its Kodak moment. Kodak was at the forefront of digital imaging. They saw it coming and decided to be the ones that made it happen. And so they did. Every time you use your smartphone to take a picture you can thank Kodak. Unfortunately, while Kodak focused intensely on developing digital imaging technology, they never figured out how to make that technology into a business. Kodak developed the technology that put them out of business.

Google obviously has a greater opportunity for income streams than Kodak ever had, and that is a good thing, because, in the long run, I suspect Android is not going to be one of them. Personally, I eagerly await the day Google gives me my self-driving car. Hear me, Google! Perfect the self-driving technology, then let Apple design the car.

Is The New iPad The New Normal?


During the last week of October Apple announced a host of new products: a new Mac mini, a new iMac, a new MacBook Pro with Retina display, and the thing everyone watching the keynote was waiting for -- the iPad mini. No big surprise there, as the rumor mill had predicted all these products with varying degrees of certainty. But there was a surprise, one more thing if you will: the new iPad, or should I say the new new iPad. Less than seven months after releasing the third-generation iPad, Apple has released the fourth-generation iPad. This update was understandably disconcerting for some owners of the now “old” iPad.

But picture this: You make the best-selling desktop computer in the nation. You make the best-selling laptop computer in the nation. You make the best-selling smartphone in the nation. You own the tablet market, which, it could be said, you created. Pretty sweet deal, huh?

But success brings unintended consequences, and you have to deal with lots of those. Not the least of which is that your corporation has fans. Yeah, your products, they amaze and delight, they create a childlike sense of wonder in your users, and as a result, your corporation is followed like a rock star 24/7. In part, you like this, because it means billions of dollars of free advertising. The downside? All of your fans want backstage passes -- all the time.

They want to know everything! They want to know what’s coming out next, what’s in it, what it’s going to cost. This used not to be such a big deal, until the web, another blessing/curse you have to deal with, made everyone a “journalist.” If you let them, these “journalists” would storm the gates of your not-so-secret Willy Wonka HQ and tell all. But you can’t allow that to happen.

Creating digital devices that amaze, delight, and imbue their users with a childlike sense of wonder is -- by necessity -- a super-secretive process. Once the fans see the man behind the curtain, they lose The Great and Powerful Oz, and you can’t have that. But the fans hunger, and the “journalists” get fed by feeding them. The web also makes this harder because it makes everyone in your global supply chain, or anyone with casual contact with them, a potential tipster to the “journalists.” The twisted part is that your marvelous mobile devices make sharing these tips all the easier.

All this “sharing” is creating another problem for you. It screws with your earnings, and you love earnings. Once the fans and the “journalists” get wise to the idea that a new magical device is coming out, they stop buying the existing magical device because it is about to become the old magical device. And an old magical device is, well, just less magical. Back in the day when the fan base was small, this was not such a big deal. Plus, it made the fans feel special to be in the know. But now you are the Beatles of the tech industry, and if anyone ever questions you on that, remind them you bought the trademarks of the Beatles’ label. So now it’s not just the fans and the poser fans, it’s everybody who knows this. That’s a problem.

Another way the “sharing” creates a problem for you is that one of the groups watching is your competition. All this sharing means they get crowd-sourced industrial espionage. As a result, even before you bring out the shiny new magical device, they have it copied or have taken the bits they like best.

In a case like this there is only one thing to do. Stop rolling the magical devices out on an orderly annual cycle. When you have the chance to iterate, you take it. New, faster chips are ready -- so put ‘em in. Better Gorilla Glass comes off the line -- make it so. Oh sure, every year or so you may tweak the design, but really, we are talking rectangles here, so it’s not like we are talking about big aesthetic changes. You’ve been doing something like this with the laptops and desktops for years now, so it shouldn’t come as a surprise.

But it will. The fans will freak a bit the first time you do this because their state-of-the-art magical device very quickly becomes the "old" magical device. But they will get over it, and in a year, this will be the new normal. Everyone will have a magical device, and people will be less freaked as new new magical devices keep on coming. There will be a few blessed souls who will always have to have the newest magical device, but they’ll find their way.

This change is good for you because you have seen the numbers and know that people are upgrading a little less frequently. The tablets and phones are starting to mature as products, and suddenly last year’s model is still surprisingly useful, and some would dare say just as magical. This change will bode well for you: the fans will happily continue to consume media from the magical device, and the competition is less likely to know where you are headed in six months. If they do, at least you will have made them invest some of their own capital in some good old-fashioned industrial espionage.

Get the picture? Good, because I have got something to tell you. Rumor has it the iPhone 5s may already be in the works.

UPDATE: 1/28/13 Apple Will Start Selling A 128GB iPad Next Week.  Just as predicted above.

21,000,000 More Reasons To Fire Scott Forstall


Q: Who was in charge of iOS 6? A: Scott Forstall

Q: Maps? A: Scott Forstall

Q: Siri? A: Scott Forstall

Q: Skeuomorphic design on iCal, iBooks, Game Center, et al.? A: Scott Forstall

Q: So, this SBB clock theft issue? A: Scott Forstall

All of that said, it would be foolish to disregard the numerous contributions Scott Forstall has made to NeXT, OS X, and Apple. Some very great minds have been booted from Apple and gained wisdom from their days in the desert, only to make triumphant returns. You have taken a large bite out of AAPL's stock price, Scott, but I do thank you for all the good work you have done.