By Jay Rao and Jim Watkinson

We live in an age dominated by machines, sensors, software, and automation, so it is easy to lose sight of the fact that all great innovations have a common thread running through them – people. Whatever the discoveries and technologies involved, nothing new and important has ever been created without people who had a passion to find new solutions.

The importance of this has become increasingly clear to us as we work to complete our upcoming book examining the early innovation years at Pixar. So it is fitting that we continue our blog previews of the book’s concepts by focusing on several of the key people who influenced, or were directly involved in, starting and building Pixar during its early, difficult struggle to become a maker of great animated films.

One very famous entertainment pioneer had a strong influence on several early Pixar people, and it was the work and inspiration of this man, and his company, that moved them toward animation and computer graphics. As a consequence, the story of how Pixar’s founders came to the field of animation begins many years earlier. To understand the source of that inspiration, and how he sparked the thinking and motivation of those who followed, we must wind our story way back, to a time when there was no recorded entertainment.

The Beginning of Cinema

The birth of cinema and the film industry as we know it began with a wealthy man, a photographer, and the legend of a wager.

While at his horse ranch, former California Governor and future founder of Stanford University Leland Stanford often admired the beauty of his horses in motion as they galloped. To his eye they appeared at certain points to be flying across the ground, no hooves touching the earth. So convinced was he that, in 1872, he made a large bet with a friend on this point and engaged the well-known photographer Eadweard Muybridge to find a way to photograph and capture proof of his belief.

Muybridge conducted his first effective test in January 1873, and his photos did indeed show that horses have all four feet in the air between strides, but the initial images were blurry because of the speed of the horses and the technical limitations of the camera. To compensate for the blurring, Muybridge had an artist create hand-painted versions of the photos, and then photographed these images for showing to the public. Viewers, however, recognized that they were painted and dismissed the work as proof of Stanford’s premise.

Some years passed, and in 1878, with additional money from Stanford, Muybridge took up the project again. This time, the photographer used a faster electro-mechanical shutter he had developed, capable of freezing the horse in mid-flight for a sharp picture. He combined this new shutter with a system of twelve cameras capable of taking photos in quick succession. The new process worked, producing a sequence of shots clearly showing a horse throughout its running stride, and now all could see that horses did fly between strides.

Much acclaim followed, and Muybridge next adapted a child’s toy, the zoetrope, to put the images in motion. He applied photos to a glass disk that, when spun, showed the images in quick sequence, giving the illusion that the horse was moving right before the eyes of his audience. To correct the image distortion that occurred in the process, Muybridge had an artist draw the pictures, and it was these hand-drawn images, in a sense an early form of animation, that were used in the show. Muybridge called his device the zoopraxiscope.

In February of 1888, Muybridge was in Orange, New Jersey, to give a lecture about his techniques. In attendance was the well-known inventor Thomas Edison. Seeing an opportunity at hand, Muybridge proposed that the two partner to join his motion picture device with Edison’s phonograph, resulting in a machine that would play motion pictures and sound together. After some examination, Edison decided his own staff could produce a much better device, and Muybridge’s zoopraxiscope concept did not advance any further.

With his interest sparked, Edison’s engineers and photo experts worked to design an effective motion picture device, and over the next two and a half years they invented the Kinetograph, the first successful motion picture camera, along with the Kinetoscope, a peephole viewer for showing the films it produced. Many other innovations would come along to power the growth of cinema, and the list of those who contributed to the development of the film industry is long. But despite his lack of commercial success, Muybridge is still called the Father of Cinema, and some would say that his use of hand-drawn images in his first short film marks animation as the true starting point for what would become the Hollywood film industry.

Through the first 30+ years of film, animation was relegated to a minor role in cinema, providing short, humorous cartoons to warm up the audience before the feature film was shown. But that began to change when a man whose company had created one of the most popular cartoon characters leapt into the unknown, spending a previously unheard-of sum of money to produce what would become the first successful animated feature film.

The Inspiration of Walt Disney

I do what I do because of Walt Disney – his films and his
theme park and his characters and his joy in entertaining.
– John Lasseter

People may find it hard to believe that the man who transformed short, humorous cartoons into highly popular and profitable feature films was not a very good artist, but it is true. Walt Disney was, however, gifted with the ability to see stories, scenes and characters in his mind, and then act them out with such convincing drama that people who could draw were able to transform his ideas into characters and films filled with human emotion, and loved by millions of people of all generations.

Nearly broke after his first animation business failed in Kansas City, Disney decided to aim high, and in 1923, at age 21, he moved to California, where his older brother Roy and an uncle lived. He hoped to get a job directing live-action films, but after two months talking with film studios, all Disney had gained was frustration. Struggling without money, but still possessing some sample animations he had prepared in his old business, he sent a note and a copy of the film to a New York cartoon distributor, proposing that he could turn this story about a girl named Alice into a series. To his surprise, he quickly received a note back accepting his proposal at the then astounding amount of $1,500 per short film. With the letter in hand as evidence, Walt visited his brother Roy, who was in the hospital recovering from the effects of tuberculosis. Walt told him a story of how they could create great cartoons and make a lot of money if Roy joined him in the business. The next day, without his doctor’s permission, Roy checked himself out of the hospital, and soon the two brothers had a place to set up their new business, the Disney Brothers Studio… in their uncle’s garage.

When ideas for new episodes of the Alice series dried up in 1927, the Disneys created a new rabbit character that proved very successful. They were expecting big things from their rabbit, but the brothers learned a hard lesson when, through some fine print in their contract, their distributor took over production of the rabbit series, cut the Disneys out, and hired away most of their staff. Left with no character series to sell and no other source of revenue, the Disneys’ future looked grim. In desperation, Walt came up with an idea for a new mouse character. Working over the following weeks with one of his few remaining employees, Ub Iwerks, he turned the mouse into what we now know as Mickey Mouse.

Though famous now, the first two Mickey Mouse cartoons were rejected by every film distributor because there were already many animal-character cartoons on the market. Walt needed a way to make his mouse stand out. Sound had only recently been added to live-action films, and sensing an opportunity, Walt looked for a way to add sound to his next Mickey cartoon. He hired a full orchestra and found a way to time the music to the beat of the cartoon and the actions of its character, a first in cartoons. Distributors found the new Mickey very entertaining but were still reluctant to buy, and with bills to pay, Roy Disney had to sell Walt’s car to raise cash. Having no other prospects at hand, Walt broke with accepted practices and had Mickey shown in a single theater in New York City, hoping that a strong audience reaction and news coverage would build credibility and draw in a distributor. The ploy worked. The little mouse received great applause and acclaim in the local press, and within the year he was a national hit, making the Disneys famous.

Despite the worldwide fame of Mickey Mouse, the cartoon business was very competitive, and pricing for these short films was falling while production costs were rising. As a result, by 1933 Disney was losing money on his Mickey Mouse films. Recognizing that they needed a new film product that could generate more revenue than short cartoons, Walt decided to make another big jump, this time to create the first successful feature-length animated film, Snow White. The film took more than three years to make, and its cost ballooned to over $1.5 million, then a record for any film, forcing Disney to borrow huge sums of money. Before it was released, some industry experts predicted that no one would sit through a ninety-minute cartoon, but Disney proved them all wrong when thousands of people waited in long lines to see Snow White, which generated $10 million in ticket sales and more in merchandise.

Walt Disney would go on to create many more firsts in innovative entertainment, most of them dismissed at first by the experts as impossibly crazy and sure to fail. A full list would fill most of this page, but his best-known ideas include the first popular educational nature films, Disney’s True-Life Adventures, which his distributor refused to sell, certain that no one would pay to see short films about nature and animals; the Walt Disney TV show, which continued for many years under many names; Disneyland, the first theme park, which amusement park experts told him would fail; Walt Disney World, the world’s first destination resort; and lifelike, mechanically controlled animal and humanoid figures called audio-animatronics.

Each of these ideas moved the Disney Company not with a single step but with a giant leap into areas where it had no prior experience, in fact, into areas where no one had ever gone. Unlike the characters in his films, which have remained perpetually alive, Walt’s time on earth ended in 1966, and with his passing came the end of the long string of giant-leap innovations by his company.

By the time of his passing, the theme park business had grown larger than all of the company’s film operations and generated the lion’s share of its profits. As a consequence, the always risky, hit-or-bust business of films received far less attention from company leaders and began to stagnate. The Disney Studio’s next generation of films all had new titles, of course, but they were often just repeats of old themes and characters. This was particularly true for the animation division, from which all of the Disneyland attraction ideas had sprung.

Company leaders could not see that the creation of new films and characters was the source of new attractions in the park, which in turn drew visitors back year after year and drove park, hotel and merchandise revenues and profits. Thus Disney films, winners of 48 Academy Awards during Walt’s time, began a two-decade decline. Despite the many successes of his company and a large pool of very creative people, once Walt was gone, so too was the Disney magic for films and characters loved by children, teens and adults alike. Without an opportunity to advance and make their own mark on the world, good people left Disney, and other good people would never come.

One of the people who never came was a young graduate student named Edward Catmull. Visiting Disney to recruit its involvement in computer graphics research while he was a graduate student at the University of Utah, Catmull was instead offered an internship applying his computer knowledge to the design of a new ride at Disney World. He declined the offer because it had nothing to do with his main interest – making computer animation.

Next in our series: The Early Years for Pixar Co-founder Ed Catmull

The Often Long Journey to Radical Innovation

With Concepts Drawn from our Upcoming Book:

Innovator’s Grit: Pixar’s Perilous Innovation Journey

By Jay Rao and Jim Watkinson

If you have read any business periodicals over the last two decades, you might easily conclude that the path to successful innovation is normally a very short journey. But the world we see around us has been largely shaped by ideas that took many years to develop, and often even longer to gain wide market acceptance.

As with all maxims, there are of course exceptions, and we can certainly point to cases where success came very quickly. Take the mobile photo-sharing application Instagram. It was begun in October 2010 by twenty-something entrepreneurs Kevin Systrom and Mike Krieger, who had met as students at Stanford seven years earlier and had taken the same work-study program for entrepreneurs. Through his work at Google and through personal contacts, Systrom had already come to know a wide range of entrepreneurs, angels and venture capitalists before starting Instagram. Once launched, the service’s ability to connect people pushed it out to millions of younger internet users eager to reach out to the world, creating a strong viral effect and quickly building a large base of users. Within weeks, Instagram was carrying the images and messages of millions of people. In April 2012, Facebook bought Instagram for a billion dollars.[1]

This and other similar stories have spurred an entire generation of wannabe entrepreneurs armed with the vocabulary of get-rich-quick speak and incessantly looking for their insta-million, or even better, insta-billion opportunities. They network furiously, make rocket pitches, do hackathons, and have exit strategies before winning a single paying customer. This hunger for fast riches has also infused the thinking and decision-making at medium and large enterprises, along with their shareholders, shortening and narrowing their view of acceptable innovation ideas.

Thankfully, the epidemic of innovation near-sightedness has not killed off all entrepreneurial courage and toughness, even in the notoriously short-term internet space. Rovio, the maker of the famous game Angry Birds, was founded in 2003, and for eight years it labored through 51 game releases without a significant hit. Finally, on its 52nd attempt, it had a hit with Angry Birds, which reached 1 billion downloads. Instagram makes Rovio look like an old and tired way of winning the innovation race. But is innovation really a sprinter’s race, or is it more often the longest of life’s marathons?

The reality is that out beyond the world of software and virally driven adoption, innovators must travel a long and difficult road before reaching even the beginning stage of success. Let’s consider some noteworthy examples.

John Harrison and the Longitude Problem

Exactly 300 years ago, in 1714, the British Government established a board to solve a difficult problem. The Board of Longitude was seeking a way to determine a ship’s exact location at sea. Ships often got lost during their long transoceanic voyages, sometimes tragically. While determining a ship’s latitude was relatively easy, there were no practical tools for finding its longitude. So the Board offered £20,000 (£2.45 million in 2014 terms) for a practical method of determining a ship’s longitude to within 30 nautical miles.

Several clockmakers and scientists had tried to solve this problem with little luck. In 1730, John Harrison, an English carpenter and clockmaker, decided to compete for the prize with some financial help from another clockmaker who believed in his skills. Harrison set out to build an accurate sea clock. The first sea trial took place after five years of work, and the clock proved not accurate enough. Impressed with the general direction, however, the Board granted Harrison £5,000 to continue development. Harrison abandoned his second attempt after another five years of work when a serious design flaw was discovered. The third attempt lasted yet another 17 years, at which point Harrison concluded that a clock would not work and that the solution should instead be a much smaller watch. After another six years of work, Harrison’s first Sea Watch was tested in 1761, when he was 68 years old. The watch proved accurate to within 1 nautical mile. The Board deemed the test luck and demanded another trial. Following another successful demonstration, the Board still balked. Finally, the King had to intervene, and the 80-year-old Harrison was paid the remainder of the award money in 1773, just three years before his death. The persistence, testing and failures were not in vain. Harrison contributed several valuable inventions to clock and watch making, and his Sea Watches changed naval exploration forever. Fast forward to today, and Harrison’s work on accurate timekeeping for travel and location is echoed in the Global Positioning System (GPS) that guides planes, boats and people driving their cars.
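
Why an accurate clock solves the longitude problem is simple arithmetic. The Earth turns 360° in 24 hours, so a navigator who compares local noon (the sun at its highest) against a clock still keeping home-port time can read longitude directly from the time difference:

\[
\frac{360^\circ}{24\ \text{h}} = 15^\circ \text{ per hour} \quad\Longrightarrow\quad 1^\circ \text{ of longitude} = 4\ \text{minutes of time}.
\]

At the equator, 1° of longitude spans about 60 nautical miles, so the Board’s 30-nautical-mile target allowed only about half a degree of error, roughly 2 minutes of accumulated clock error over a voyage of many weeks – which is why a sea clock drifting only a few seconds per day was such a breakthrough.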

Soichiro Honda and Honda Motor Company

In 1922, fifteen-year-old Soichiro Honda left school to work at a motorcycle and car repair shop. While the repair shop gave him tremendous knowledge about the workings of motorcycles and cars, Honda was more interested in manufacturing. In 1936, with a friend, he set up Tokai Seiki, a piston-ring manufacturing firm. After several tries, Tokai Seiki was finally selected by Toyota as a supplier, but it soon lost the contract when only 3 of 50 piston rings met the required standards. Honda attended engineering school but did not graduate. He travelled extensively in Japan to understand Toyota’s production processes, and by 1941 he was reliably supplying Toyota. Then came the war. In 1942, Toyota took 40% control of the firm, and Honda was downgraded from president to senior manager.[2] Toward the end of the war, one Tokai Seiki plant was reduced to rubble by air raids, and in January 1945 the other was destroyed by an earthquake. Honda picked himself up from these setbacks and tried to manufacture a rotary weaving machine for the textile industry, which failed for lack of capital. He then tried to make frosted glass with floral patterns, and then roofing sheets of woven bamboo set in mortar. Uncharacteristically, he never pursued any of these in earnest; his heart did not seem to be in them. In late 1946, he was visiting an old friend from Tokai Seiki, who happened to show him a generator engine designed for a wireless radio. Honda was immediately inspired to use it for something very different – to power a bicycle. The Honda motorcycle company was born!

George Mitchell and Mitchell Energy

By 1980, George Mitchell was already a Texas natural-gas baron, but his wells were drying up. So he turned to a decades-old technique called hydraulic fracturing – fracking. In fracking, water, sand and chemicals are pumped into a well at high pressure to crack the rock of the deep underground shale layer and dislodge the trapped gas.[3] Though demonstrated in the 1940s, the method had been abandoned as commercially unviable. Mitchell did not invent fracking, but starting in 1981 his firm, Mitchell Energy, drilled well after well in the Barnett Shale of Texas. For the next 15 years the firm struggled to demonstrate, through trial and error, that fracking could be an economically viable and reliable source of natural gas. Finally, in 1997, one technique involving water, sand, and inexpensive foams and gels worked spectacularly well. In 2002, at the age of 82, Mitchell sold his company for $3.2 billion. By 2012, shale gas accounted for nearly 35% of U.S. natural gas production. While fracking remains environmentally controversial, one industry historian hailed Mitchell’s technique as the biggest and most important innovation of this century.[4]

Elizabeth Holmes and Theranos

In 2003, 19-year-old Elizabeth Holmes dropped out of Stanford and started a revolutionary blood-diagnostics firm called Theranos. Over the last decade, Theranos has raised $400 million, grown to 500 employees, and reached a valuation of $9 billion. The $73 billion blood-testing industry performs nearly 10 billion tests a year, and those tests inform nearly 70% of all medical decisions. Historically, most of these tests have been run in hospitals and large free-standing labs, where the work is time-consuming. Emergency labs are faster but can perform only about 40 different tests. Theranos’s process performs nearly 70 different tests in the same time, using just 25 to 50 microliters of blood – nearly 100 times less than most blood tests require. Finally, Theranos charges 50-75% less than independent labs and about 10% of what hospital labs charge. The firm posts all of its prices online and aims never to charge more than half the published Medicare rate. Today, Theranos operates in only a few Walgreens drugstores in California. Not bad for an 11-year-old firm. But Elizabeth Holmes’ goal is to have a testing facility within five miles of every American.[5]
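
The “100 times” claim is easy to sanity-check. Assuming a typical venous draw fills a tube of roughly 5 mL (a common tube size – our assumption, not a figure from Theranos):

\[
\frac{5\ \text{mL}}{50\ \mu\text{L}} = \frac{5000\ \mu\text{L}}{50\ \mu\text{L}} = 100\times.
\]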

History shows us that in most industries the path to success is a long journey. Some ideas, because of the nature of their science, application, and acceptance, experience long cycle times with no shortcuts. On average, it takes about a billion dollars and 10 years to bring a new drug to market. Coca-Cola’s internal estimate is that it takes about 10 years to build a $1 billion new beverage franchise. When looking out this far, the road is always fraught with uncertainties, ambiguities, setbacks, failures, losses and heartaches. The best innovators persist on this journey with sheer resilience, patience and an indomitable spirit that we call INNOVATOR’S GRIT.

Lately, the concept of personal grit has been getting a lot of media attention, especially the work of Prof. Angela Duckworth at the University of Pennsylvania. Her research shows that people who achieve outstanding success tend to have a high level of personal passion and unwavering dedication toward their mission, whatever the obstacles and however long it takes. In fact, she has found personal grit to be a strong predictor of long-term success in a variety of situations, from the school performance of kids with difficult backgrounds, to cadets surviving the demands of West Point, to the performance of students at Ivy League schools and in National Spelling Bee competitions.

There is still much that we need to learn about innovation and the unique form of grit that powers those who succeed at it. But rather than falsely suggesting that innovation can be approached as a simple, repeatable formula, we will try to help would-be innovators understand and prepare for what they will face by bringing to life the true story of the more-than-twenty-year journey of Pixar Animation Studios and its people as they struggled through a long series of transformations. The journey begins in 1974, when one of Pixar’s founders, Ed Catmull, took a position running a computer graphics research lab at a school on Long Island. In 1979, Catmull, co-founder Alvy Ray Smith, and their team moved to Lucasfilm, where they began work on computerized film-production hardware and special visual effects for films. Because of changes at Lucasfilm, the team was spun out as Pixar in 1986, and the new business focused on making a unique graphics computer and related software. The team included a small group of animators who produced short animated films to demonstrate the hardware and software. By 1989 it was clear that the hardware and software business would never produce significant income, so they turned their animation and software skills to making TV advertising. Finally, after knocking on Disney’s door for 16 years, in 1991 Disney called back, suggesting they partner to make the first fully computer-generated animated film, Toy Story, which itself took four years to complete. The voyage to this first real success was a continuous struggle for survival, and every step was charged with remarkable Innovator’s Grit.

We were drawn to the story of Pixar because of the length of time they struggled before reaching success, the many technical innovations they had to develop along the way, the dramatic transformation they helped create in film-making, and the many remarkable people involved. Although several books have been written about Pixar, all focus solely on history. For the first time, our research, interviews, and writing will tell the story of Pixar and its people through an innovator’s eyes. In the coming months we will preview material from the book, along with other stories providing important insights for all executives and enterprises into personal, innovator’s, and enterprise grit.

Some of the topics we’ll cover from the book will include:

The birthplace of Pixar – the New York Institute of Technology

The challenges of Pixar’s early film days as part of Industrial Light & Magic (ILM) & Lucasfilm

The innovation history of Walt Disney and George Lucas

Early computer innovations in film-making: Tron, Star Wars, Star Trek II, Jurassic Park.

Pixar’s first project with Disney – The Computer Animation Production System (CAPS)

How culture within the film industry impacted early Pixar and innovation adoption

How Disney moved slowly to accept computer animation

The Pixar Team: Ed Catmull, Alvy Ray Smith, John Lasseter, Steve Jobs

The Disney Corp. Folks: Roy Disney, Jeffrey Katzenberg, Michael Eisner

Pixar’s famous projects: Star Trek II’s Genesis sequence, The Adventures of André & Wally B., Luxo Jr., Red’s Dream, Tin Toy.

As well as general discussion and true-life innovation stories on:

Evolution and Development of Technology – Hardware and Software

Industry Life Cycles and Innovation Life Cycles

The Loss of Competitiveness among Incumbents

The Rise of the Disruptors

The Coming Together of Technology and Art

How your enterprise’s culture impacts innovation

[1] “Behind Instagram’s Success, Networking the Old Way,” New York Times, April 13, 2012

[2] http://world.honda.com/history/, accessed June 25, 2014

[3] “Exxon’s Big Bet on Shale Gas,” by Brian O’Keefe, Fortune, April 16, 2012

[4] “He Fracked Until It Paid Off,” by Jon Gertner, “The Lives They Lived,” New York Times Magazine, Dec. 21, 2013

[5] “This CEO Is Out for Blood,” Fortune, June 12, 2014

Key Words: Innovation; Innovator’s DNA – Networking, Observing, Questioning, Associating, Experimenting; Organic Growth, Culture of Innovation – Purpose, Mastery, Autonomy; Knowledge Work Productivity; Manual Work Productivity

Recently, Lydia Dishman, an innovation and entrepreneurship contributor to Fast Company, asked me to comment on a trend in the workplace – tracking of employee collaboration and productivity using wearable technology devices. You can read my comments in her Fast Company article titled: “Can Performance Be Quantified? Wearable Tech In The Office.” In this blog, I will elaborate on several of the comments I made for the article.

Problem: All of the developed countries today are predominantly service/knowledge-based economies, with upwards of 70% of employees working in these sectors. While this has been true for more than 20 years, productivity in the service sector has never reached the levels achieved in the manufacturing and agricultural sectors. Quantifying, capturing, tracking and improving productivity in the knowledge sector has been even more difficult; hence the interest in this topic. Please note: I make a clear distinction between low-wage service jobs and relatively higher-wage knowledge work.

Solution: Wearable technology that tracks employees. For example, Hitachi’s Business Microscope is a device that employees wear around their necks at work. It measures and analyzes the employees’ interactions and activities. When employees come within a specified distance of one another, the devices recognize each other and record face time, body movement and behavior-rhythm data to a server. Executives can then analyze which groups tend to interact and cooperate. So where are we heading with these sophisticated “dog tags”?
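
To make the analysis concrete, here is a minimal sketch of what executives might do with such badge data. The event format, names and threshold below are illustrative assumptions, not Hitachi’s actual data model or API:

```python
from collections import defaultdict

# Hypothetical badge events: (employee_a, employee_b, seconds of face time).
# A real device would stream these to a server; we use a static sample.
events = [
    ("ana", "ben", 120), ("ana", "ben", 300),
    ("ben", "carla", 60), ("ana", "carla", 15),
]

def face_time_matrix(events):
    """Aggregate total face time per pair, ignoring badge order."""
    totals = defaultdict(int)
    for a, b, seconds in events:
        totals[frozenset((a, b))] += seconds
    return totals

def label_employees(totals, threshold=300):
    """Crudely label each employee by total face time, the way firms
    label 'connectors' vs. 'loners' (the cutoff is illustrative)."""
    per_person = defaultdict(int)
    for pair, seconds in totals.items():
        for person in pair:
            per_person[person] += seconds
    return {p: "connector" if s >= threshold else "loner"
            for p, s in per_person.items()}

print(label_employees(face_time_matrix(events)))
# e.g. {'ana': 'connector', 'ben': 'connector', 'carla': 'loner'}
```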

Trend: Over the last 5-7 years, using several data-collection techniques, enterprises have been labeling employees as knowledge “spreaders” or “bottlenecks,” as “loners” or “connectors,” as “influencers” or “followers.” Why are firms doing this?

Challenge: Innovation that spurs organic growth has been the most difficult challenge facing large firms over the last 15+ years. Specifically, firms realize that they need a cadre of seasoned innovators and internal entrepreneurs (intrapreneurs) to spur innovation and organic growth. Unfortunately, except for a few, most firms are struggling both in their innovation efforts and in fostering a culture of innovation where these innovators and entrepreneurs can thrive and flourish.

Innovator’s DNA: What makes innovators different? How do they routinely come up with great ideas? How do they think and act? What is their mindset? What are their behaviors? Research shows that great innovators and successful serial entrepreneurs demonstrate five key skills – networking, observing, questioning, associating and experimenting.[1] First, they are great at networking – meeting people from diverse backgrounds with diverse skills. They immerse themselves in situations that expose them to a variety of perspectives, which in turn sharpens their observation, questioning and association skills. When we are thrown into uncomfortable and unknown situations, most of our senses enter a state of heightened awareness; hence, intense networking helps innovators and entrepreneurs become good at observing and listening, and at doing so without prejudice. Immersion in a diversity of situations propels them to constantly question the status quo within their own areas of expertise. They are constantly trying to improve and change things for the better. This questioning leads them to associate, borrow and relate ideas and experiences across functions, industries and arenas, leading to possible new ideas and solutions. Finally, innovators are great at experimenting, exploring and testing their new ideas and solutions. They don’t just talk; they take the initiative to test whether their ideas are in fact opportunities.

Innovation and organic growth within large firms is about routinely identifying great opportunities, shaping and developing them, and then capturing them. For large firms, these opportunities lie at the intersection of disciplines, functions and/or geographies. As the Innovator’s DNA discussion above suggests, great ideas and creativity happen by associating and merging disparate streams of knowledge. But association and new opportunities emerge only when there is a lot of networking among the different disciplines and functions of a large enterprise. Networking leads to better observation and listening, which in turn drive curiosity and questioning of the status quo. Creativity can be highly individualistic, but organic growth, the result of innovation, still requires a great deal of collaboration within large enterprises. So firms are desperately trying to foster networking and collaboration among employees, and trying to measure it.

Innovation is knowledge work, and unfortunately knowledge work cannot be treated or captured the way we have captured manual work. The traditional way of measuring manual productivity is more than 100 years old. It goes back to Frederick Taylor’s scientific management of manual work: defining the task, setting standards, measuring against those standards, focusing on quantity, and minimizing worker costs through command-and-control structures. Today, however, we live in Peter Drucker’s knowledge world. Drucker’s knowledge worker, as against Taylor’s manual worker, is much more focused on understanding the task, continuously learning, teaching others and innovating. Ideally, such employees focus on the quality of their work, are treated as assets rather than costs, and work in environments with great autonomy.[2]

Further, there are more differences between manual work and knowledge work. Manual work is visible; knowledge work is invisible. Manual work is highly specialized, quite stable, and structured, with a definite process and outcome; it is about running known tasks with the right processes and fewer decisions. Knowledge work, on the other hand, is holistic, always changing, and has no defined boundaries of process or outcome; it is about uncovering the unknown by asking the right questions and making a lot of decisions.[3]

Hence, it will be quite difficult to capture knowledge-work productivity using manual-work productivity tools and methodologies. We need to invent new ways of capturing knowledge-worker productivity. Innovative firms have found multiple ways to harness the knowledge worker: 3M has been doing this for nearly five decades, W.L. Gore for the last 40 years, and Google more recently. They energize and engage their knowledge workers with a sense of purpose, and enable them to master creativity and innovation in a climate with a great deal of autonomy.

Some questions to ponder: Will these high-tech wearable tracking devices help firms become more creative and innovative? Do they foster networking, observing, questioning, associating and experimenting? Do they transmit a sense of purpose, provide autonomy and enhance mastery?

Cheers!

Jay

[1] “The Innovator’s DNA,” Harvard Business Review, Dec. 2009

[2] Reinvent Your Enterprise, by Jack Bergstrand

[3] Ibid.

Key Words: Strategic Change, Innovation, Risk, Uncertainty, Ambiguity, Prediction Logic, Creation Logic, Planning vs. Testing, Project Management, Agile SCRUM vs. Waterfall, Lean Startup, healthcare.gov, JC Penney, Lululemon, Georgia Tech, Coursera.

How Executives Get Fired

On Oct. 1, 2013, the much-anticipated healthcare.gov went live. Almost immediately, it crashed. An unanticipated surge in web traffic was blamed for most of the problems. Even those who got through faced a multitude of issues and errors – confusing instructions, missing drop-down tools, unexpected hang-ups and puzzling design. Those who gave up and called the customer service reps didn’t fare any better: the reps couldn’t access the online marketplace either.

On Nov. 29, 2013, JC Penney (JCP) – an original member of the S&P 500 since 1957 – was kicked off the list for its sharp decline in market value. While JCP still has more than 1,000 stores and 2012 revenues of $17B, the historic, 100+ year-old U.S. mid-range department store chain has fallen on hard times.

In Oct. 2004, Myron Ullman, a former executive at Macy’s and LVMH, was named CEO. Ullman brought in brands like Liz Claiborne and introduced mini-shops within the department store. By Feb. 2007, JCP’s shares had doubled to nearly $80, a 10-year high. The economic downturn hit JCP hard, and by March 2009 the stock was trading at $14. In Jan. 2011, William Ackman, a hedge-fund manager who had built up a sizable position in JCP stock, was appointed to the board. Amid Ackman’s push for new leadership, in June that year former Apple retail star Ron Johnson was named JCP’s new CEO, replacing Ullman. Johnson arrived at JCP in Nov. 2011.

In Dec. 2011, JCP acquired 16% of Martha Stewart Living Omnimedia and planned to put “mini-Martha Stewart shops” in many of its stores by 2013. In Jan. 2012, Johnson introduced a strategy built on in-store boutiques and a pricing plan that eliminated the popular JCP coupons. In their place would be “Every Day,” “Monthly Value,” and “Best Price” pricing, and prices would end in whole numbers rather than 9s or 7s. In May 2012, JCP announced a 20% drop in sales and a $163M loss for Q1. Then, suddenly, in June, the head of marketing, Michael Francis, who had come from Target only 8 months earlier, “left the company.” He was blamed for marketing messages that were not resonating with customers.

In Aug. 2012, JCP began rolling out the “Shops” strategy in stores, along with an overhaul of the home department in 500 stores. In Nov. 2012, JCP reported a Q3 loss of $123M as sales fell by another 27%; still, CEO Johnson said the firm would not diverge from the strategy he had laid out. By the end of the first year of Johnson’s turnaround strategy, JCP had amassed nearly a billion dollars in losses and a 25% drop in revenues. In April 2013, Johnson was fired, and Ullman rejoined the firm as CEO.

In Feb. 2013, an online course offered by Georgia Tech and hosted by the leading online-learning firm Coursera promised to teach 40,000 students how to create their own massive open online course. The platform asked participants to sign up using Google Docs, and when the crush of students tried to sign up, the system crashed. According to Google, Google Docs only allows 50 people to edit a document simultaneously – a small detail that seems to have been overlooked by the planners.[i]

In March 2013, the high-flying Canadian yoga apparel maker and retailer Lululemon had to recall more than $60 million worth of women’s yoga pants for being too see-through. Within a month came an announcement that the “product chief [was] to exit.” The following month, the CEO announced that she was “stepping down.”

As we all know, expressions like “to exit,” “stepping down,” and “spending more time with family,” are just euphemisms for getting fired. Why do CEOs and executives get fired? The #1 reason is Mismanaging Change.[ii]

Analytical vs. Emergent Strategies for Growth, Innovation and Change

In a press conference after Johnson was let go from JC Penney, Bill Ackman – who had pushed for Johnson’s hiring – said that Johnson deserved criticism for unleashing a series of pricing and merchandising changes without first testing consumer views. Other Penney insiders criticized Johnson for eliminating the company’s sales and coupons the previous year without a broad market test, a move that led to a sales slump. The key words to focus on in these two statements are “testing consumer views” and “broad market test.”

In March 2013, six months before the healthcare.gov website went live, McKinsey was asked to do a risk analysis and develop mitigation strategies. McKinsey submitted a 14-slide presentation to the White House by early April. I have pasted two key slides from that deck below. The first slide is about “complexity,” and the second is about how to manage “complex projects.”

A website like healthcare.gov is a massive and complex undertaking – too many variables and too many unknowns. When you are dealing with unknowns, you are talking about uncertainty and/or ambiguity. Risk, on the other hand, is about the “known” world – known variables with data from the past. You can calculate and estimate risk using analytical tools. When you know the variables and have data from the past, you can analyze, predict, plan and then take action. Broadly, we can go into the future in two ways: (1) analysis before action and (2) analysis after action.

The traditional approach to minimizing and managing risk in innovation and change-management projects is to do a lot of analysis before taking action. BHAGs (Big Hairy Audacious Goals) are announced with much fanfare. The future is then approached through environmental scanning (SWOT, STEP, value chain analysis), followed by a project plan to execute the strategy. Trend lines are predicted based on IRRs and WACC or projected cost-benefits; KPIs and milestones are set, and budgets are allocated. When project performance does not meet projections, money and energy are spent to get the project back onto the predicted trend line. Unfortunately, heads roll when the predicted future fails to materialize after a couple of tries.

Below: McKinsey’s exhibit demonstrating the magnitude & complexity of the healthcare.gov website project.

[McKinsey slide 1]

This approach to change and project management makes a bunch of assumptions: (1) all process and outcome variables are known and can be accounted for ex-ante, (2) existing data from past projects can be used to predict the process and outcome of this project, (3) some variation to projections can be accommodated along the way using managerial judgment, and (4) failure is not an option.

This concept of going into the future is called “predictive logic,” and the method is called “analytical strategy.” I just call it the BIG BANG approach to change. Most large firms, governments and institutions still predominantly prefer this mode of going into the future. I call the firms, organizations and individuals who principally use this strategy “PLANNERS.”

On the other hand, all innovative and complex change-management projects carry a number of unknowns. Specifically, there are two types: known unknowns and unknown unknowns. Uncertainty is about known unknowns: you know which variables may impact the process and outcome of the project, but there is no data from the past with which to assign probabilities. Ambiguity is second-order uncertainty: you cannot even surmise what variables may be lurking in the background; they appear only once the project is underway. Unfortunately, analytical strategies do not account for these unknowns ex-ante. So when there are many unknown variables, most analysis, and hence a priori prediction of outcomes, becomes a futile exercise.

In the presence of unknowns, the way to manage projects is drastically different. Seasoned entrepreneurs, innovators and VCs test their ideas for potential opportunities predominantly through analysis after action. They think big but start small. They start several small projects to test their hypotheses. They prototype rapidly and try to establish proof of concept through quick feedback from the market – voice of customer, voice of technology, voice of supply and voice of demand. They try to fail fast, fail cheap and fail smart. In doing so, they learn quickly by uncovering hitherto unknown variables and/or creating data where there is none. With this new knowledge they refine their hypotheses and business models. They iterate this process of prototyping, failing, uncovering unknowns and refining until a viable business model is established. They pour in more resources only after a positive proof of concept, and the successful business model is then replicated and scaled slowly. I call this approach to going into the future START SMALL, as against the BIG BANG approach described previously, and I call the firms that employ it “TESTERS.”

In 2009, Rita McGrath termed this method of going into the future “discovery-driven growth.” Decades earlier, in 1978, Henry Mintzberg termed it “emergent strategy,” as against “intended or deliberate strategy” (the analytical kind). In the mid-1990s, software developers began using Agile Scrum (iterative, emergent techniques) instead of the traditional Waterfall methodology (sequential, analytical techniques), building on the work of Takeuchi and Nonaka in the mid-1980s. Most of today’s “Lean Startup” concepts, principles and frameworks (Steve Blank 2012, Eric Ries 2011, Ash Maurya 2012) profess this very same emergent strategy. At Babson, our entrepreneurship and innovation faculty has been teaching this stuff for decades.

To summarize, PLANNERS usually follow the traditional BIG BANG approach, characterized by the sequence Analyze > Predict > Plan > Act > Full-Scale Launch. TESTERS, on the other hand, follow the START SMALL approach, characterized by the sequence Design > Build > Test > Learn > Redesign > Scale Slowly.
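
The structural difference between the two sequences can be caricatured in a few lines of code. This is only a toy illustration under made-up numbers (a hidden customer preference, a fixed trial cost), not a model from the McKinsey deck: the Planner stakes the whole budget on one up-front prediction, while the Tester spends it in small trials, learning after each one.

```python
TRUE_PREFERENCE = 0.73   # what customers actually want; unknown to the innovator

def market_feedback(prototype):
    """How far a prototype is from real customer demand."""
    return abs(prototype - TRUE_PREFERENCE)

def big_bang(prediction, budget):
    """Planner: Analyze > Predict > Plan > Act > Full-Scale Launch.
    Everything rides on the single up-front prediction."""
    return budget if market_feedback(prediction) < 0.05 else -budget  # hit or bust

def start_small(guess, budget, trial_cost=1.0):
    """Tester: Design > Build > Test > Learn > Redesign > Scale Slowly.
    Small, cheap trials; each failure refines the hypothesis."""
    spent = 0.0
    while spent + trial_cost <= budget:
        spent += trial_cost                                  # fail cheap
        if market_feedback(guess) < 0.05:
            return budget - spent                            # proof of concept: scale the rest
        guess += 0.1 if guess < TRUE_PREFERENCE else -0.1    # fail smart: learn and redesign
    return -spent                                            # budget gone, no viable model

print(big_bang(prediction=0.2, budget=10))   # -10: the big prediction missed
print(start_small(guess=0.2, budget=10))     # 4.0: iteration found the market with budget left
```

Neither path is free; the point is that the Tester’s failures are small, survivable and informative, while the Planner’s single failure consumes everything.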

Below: McKinsey’s slide contrasting the Start Small “emergent strategy” technique predominantly used by “Testers” (on the left) with the Big Bang “analytical strategy” technique usually used by “Planners” (on the right).

[McKinsey slide 2]

I am not suggesting that one approach is good and the other bad. The right question is: when do you use analytical strategies, and when do you use emergent strategies? The Big Bang approach to change or project management works very well for version 2 or 3 of a product. For incremental innovations and for the known world – known technology, known products, known customers, known business models – when we have a lot of data and prior experience, the Big Bang approach still works very well. In the unknown world, however – unproven technologies, unidentified customers, untested business models – and for radical innovation or major organizational change, the Big Bang approach fails miserably. The Start Small approach works much better.

Unfortunately, healthcare.gov chose the Big Bang approach. At least five months prior to launch, McKinsey’s warnings were quite clear: “…there was scant time to test the system before launch;…there wasn’t enough testing and revision;…create a Version 1.0 before full launch….”

I have elaborated on the Start Small concept in a previous blog as well:

http://innovationatwork.wordpress.com/2012/07/

Have a great holiday season!

Jay


[i] “Crash Sinks Course on Online Teaching,” WSJ, Feb. 4, 2013

[ii] “Why CEOs Get Fired,” by Mark Murphy, Leadership Excellence, Vol. 22, No. 9, 2005

This blog is the English version (in prose form) of my Aug. 14th, 2013 interview in Spanish by Estrategia, a leading Chilean business magazine. The original interview can be seen at:

http://www.estrategia.cl/detalle_noticia.php?cod=84245

BlackBerry contemplates selling itself!

Except for a few hard-core customer fans in the corporate world, this should come as no surprise to anyone in the field of IT and innovation. Most of us have seen this movie before, several times. While it is impossible to predict the future of any one firm, executives and experts routinely anticipate these changes at the aggregate industry level. The current predicament of BlackBerry, and to some extent Apple, can be attributed to at least three factors: market position, open/closed systems, and ecosystems and network effects. Finally, I will also offer a historical precedent.

Market Position
First, there are only a few hundred million people on earth willing to pay $50-$75 per month for instant access to their email and news anywhere, at any time. BlackBerry started out as a luxury item and stayed there, never going down the pyramid, thus limiting itself to a specific market. Unfortunately for BlackBerry, two formidable competitors showed up on the horizon – Apple and Samsung – going after the same market. Palm fell victim to this dynamic, and Apple will too if it doesn’t start going down the pyramid.

Closed and Open Systems
In the early stages of most industries, firms tend to be vertically integrated. Businesses have to provide complete solutions – hardware and software – to all the problems of their niche customer segments, so firms tend to develop proprietary hardware and proprietary software that work in tandem, i.e., closed systems. However, as industries grow and become attractive, more competitors move in, and one strategy for new entrants is to change the game by breaking “open” the “closed” systems. We have seen this happen repeatedly: Sony Betamax vs. JVC VHS, Apple vs. the IBM PC, Microsoft NT vs. Linux. Closed systems are better for the user experience, but open systems are better for industry players and thus for innovation. BlackBerry and Apple are more closed compared with Samsung and Google. Nokia used to be closed, and only recently swapped its Symbian system for Microsoft’s Windows Phone. The recent rise in popularity of the open-system offerings has had a detrimental effect on both BlackBerry and Apple. The primary reason for that rise is explained next.

Emergence of Ecosystems and Network Effects
One year after the iPhone’s 2007 launch, Apple opened the App Store, ushering in the era of apps and third-party content developers. Within three months, Google launched the open-source Android operating system. Together, these two firms created an entirely new industry of app developers for their smartphones. BlackBerry, Nokia and Palm all followed with their own app stores within the next nine months, but their smartphones were still non-touch-screen, outdated and industrial. As more of BlackBerry’s traditional customer base decided it cared less about the high security BlackBerry claimed to offer and switched to Apple or Samsung devices, app developers had to choose where to put their time and effort. As more app developers created content for the Apple and Android operating systems, more customers flocked to the devices offering the latest captivating apps – content and information. So the game is no longer just about hardware and software; the new battle is one of ecosystems – app developers, platform providers (Apple, Samsung) and social networks (customers).

Within two years, Android overtook Symbian as the most widely used smartphone operating system. Palm was sold to HP, and Motorola Mobility to Google. Today, BlackBerry seems set to meet a similar fate. It shouldn’t be a surprise if Microsoft buys BlackBerry and/or Nokia. Because of network effects, it will be very difficult for more than three ecosystems to survive simultaneously.
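
One common, if simplified, way to formalize this winner-take-most dynamic is Metcalfe’s law, which values a network in proportion to the number of possible connections among its n users:

\[
V(n) \propto \binom{n}{2} = \frac{n(n-1)}{2} \approx \frac{n^2}{2}.
\]

On this rough account, an ecosystem with twice the users and developers is about four times as valuable, so developers and customers keep tipping toward the leaders while marginal ecosystems starve.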

Google seems to have done much better over the last year. Google or Apple – who will be the reigning king for the next 5 years? And where is Samsung in this picture? This is where we need to take a look at history. Long history.

Historical Precedent
Information always follows technology. Decades ago, Peter Drucker commented on the term “Information Technology.” He argued that throughout history, “technology” has always preceded “information.” Let me clarify what he meant. When Gutenberg invented the printing press, in the early days of the print world, it was the machine manufacturers who made the most money in the value chain. As the machines became ubiquitous, it was the creators of content and publishers of information who became richer than the manufacturers of the hardware, i.e., the presses. Similarly, before Mergenthaler invented the Linotype machine in 1884, no newspaper had more than 8 pages: typesetting was a time-consuming and costly activity. With the advent of the 90-character-keyboard Linotype, the media revolution was born, and with it the newspaper and publishing magnates. Again, the “technology” preceded the “information.” During the PC era, the hardware firms Apple and IBM were the first money makers, followed (in terms of market valuation) by Microsoft and PC-mogul Bill Gates. During the WWW boom, for a brief period, Cisco was the most valuable firm on earth. But then came the web tycoons who built their empires on information and content – Amazon (books), Skype (voice), Apple (music), Netflix (movies) and Facebook (social media). So the money in the value chain first goes to the hardware or technology firms and then shifts to the software/information/content firms. Hence, “Information Technology” should actually be read as “Technology Information.” While very strong on the “technology” side, Samsung is still far behind in the “information” game.

A brief glimpse into the future
A similar battle is brewing in the TV industry. The recent battle between CBS and Time Warner should not be surprising either. Content bundlers and distributors (cable companies) have had a monopolistic chokehold on the customer end. The rise of new internet-based content distributors – Netflix, Apple, Hulu, Roku, Amazon – is giving content creators other options for reaching their end customers by bypassing the cable firms. Much like the millions who cut their landlines over the last decade, younger viewers are bypassing expensive cable operators and customizing their own content on their computers and TVs. The cable companies, with their hardware (cables and set-top boxes), can be seen as the “technology” firms, and the content creators as the “information” firms. The tide is shifting in this very traditional business model as well.

The Tech section of the Wall Street Journal’s website on June 11, 2013 read: “Apple’s streak of game-changing devices has stalled and the iPhone and iPad seem stale, compared with new offerings from Samsung Electronics and others. Software blunders, like Apple’s widely panned mapping app, have raised doubts about the company’s ability to build cutting-edge mobile services.”

Apple was again in the spotlight during its week-long developers conference. So far, announcements about its digital radio service and the overdue iOS 7 overhaul have received negative or, at best, neutral reactions from analysts and experts. While investors, analysts and gurus get all the attention, I would pay more attention to what the 6 million registered app developers are saying, or not saying.

There is no doubt that Apple is playing catch-up to Pandora and Spotify in the radio/streaming space, and upgrading its operating system is just part of its routine incremental innovation introductions. On the other hand, the announcement of “all-day battery life” on the MacBook Air has not received much attention. It now boasts 5 more hours of battery life for the same price. Is that incremental innovation or radical innovation? Or is Apple over-shooting the needs of consumers? Remember the dismal performance of Nokia’s 41-megapixel Lumia 1020 smartphone? The answer will be revealed in a few months.

In any case, the world seems to be screaming for an unending string of radical innovations from Apple. Radical innovations are about the “unknown” world – unproven technologies, unknown markets, untested business models. Unfortunately, history shows us that radical innovations are few and far between, and highly risky. Incremental innovations, on the other hand, are about the “known” world – existing technologies, existing products and existing business models. Incremental innovations tend to be quite predictable, and firms keep working on them and introducing them at regular intervals. So they tend to be quite boring for the general populace, investors and the hype-seeking media.

All great innovators – GE, P&G, 3M, Google, Apple, Samsung, Singapore Airlines – have to carefully manage their radical and incremental innovations. Incremental innovations tend to have a certain ritual, rhyme and rhythm. Radical innovations, on the other hand, have no time-tested formulas – for their share of the innovation portfolio, for the proportion of time and money set aside for them, or for the timing of their introduction.

Unfortunately, pleasing impatient, attention-deficit, excitement-seeking consumers, investors and media is very challenging. I guess we need some radical innovations in expectations management!

Cheers!
Jay

This blog is the English translation of a recent article of mine that appeared in Estrategia, a leading strategy magazine in Chile. You can find the original article at:

http://www.estrategia.cl/detalle_noticia.php?cod=72739

Stop the Nonsense! Innovation is a Discipline.

A recent article in the Wall Street Journal highlighted the problem we have today with innovation. Indeed, this once worthy term is being degraded by CEOs, consultants, marketers, and journalists for whom it is the buzzword du jour. The term “innovation” has been deeply devalued, to the point of being a slogan or an aspiration. And for all the use of the term, it is clear that firms are having lots of problems managing innovation: they are spending heavily and complaining about how little they get back. Abuse of the term is producing dismal outcomes, cynicism, and wasted money. Studies show that, despite huge sums spent on ideation software, stage-gate systems and consultants, a majority of executives are dissatisfied with the results. Dissatisfaction among employees is even higher. The reasons are many.

First, most people today still associate innovation with R&D and invention. For example, in many Spanish-speaking countries, governments, large enterprises and executives commonly use the expression I+D+i (investigación + desarrollo + innovación). In fact, some governments offer special tax incentives for R&D spending, but none for spending on innovation. Read carefully, the expression I+D+i, with its capital letters for research and development and a lowercase “i” for innovation, implies that innovation is less important than R&D. This is wrong.

Inventions and patents that are not commercialized have very little value. Since 2005, consulting firms – Booz Allen, McKinsey, BCG, Capgemini – have been publishing yearly data about innovation practices within large global firms. Data from Booz Allen shows that there is no correlation between R&D spending and financial performance among the top 1,000 global firms. Based on this, they created two categories of firms: innovators and spenders. Innovators spend much less on R&D than their peers yet deliver better financial performance, while “spenders” spend heavily on R&D but perform worse financially.

Innovation is more than R&D. It encompasses R&D, products, processes, services, supply chains, marketing, business models and more – these are all opportunities for innovation activity. In fact, innovation is a discipline, one that can be managed and mastered like other management disciplines.

Many disciplines operate in the world of business, and their evolutions provide insights into the development of innovation as a body of knowledge and field of practice. Marketing, for example, has a conceptual framework (the “4Ps”) and a unique vocabulary. It has developed practical methods (e.g., segmentation) and tools (e.g., conjoint analysis) that practitioners master through formal study. Subfields of marketing such as advertising and consumer behavior have broadened the discipline. Academic departments have formed to increase the body of marketing knowledge and to pass it on to others. Journals, professional associations, and conferences dedicated to marketing have emerged over the years.

We have witnessed a similar evolution with the quality movement. Corporations that took quality seriously made it part of their cultures—embedding the “discipline” in their thinking, planning, and behaving. Today, quality is no longer an empty buzzword or organizational aspiration, but a solid and respected discipline that produces measurable benefits for companies and their customers.

Like other disciplines, innovation can also be mastered. The good news is that the road to mastery in any discipline is the same:

1. Mastery is the result of leadership desire, choice and commitment.

2. Mastery requires years of effort.

3. Mastery requires a cadre of experts to lead the way.

4. Mastery requires a broad-based understanding of the principles and methods of the discipline among employees.

Like marketing and quality, innovation has been following an evolutionary path. As a discipline, it is perhaps midway along its evolution path—where the quality movement was twenty or so years ago.

Disciplined corporate innovation efforts can be traced back to Thomas Edison’s first “invention factory” at Menlo Park, New Jersey. Bell Labs and the R&D centers of DuPont and others were its offspring. By the 1980s, other tools of the innovator’s craft were being adopted by new-product developers. It wasn’t until the 1990s, however, that academics began publishing thoughtful and practical books explaining innovation as more than R&D or invention – as a process – and telling executives how to harness it in service of corporate strategy. The explosive emergence of eBay, Google, Facebook, and Amazon made it clear that innovation is not simply about physical products and technologies, but extends to services and business models. So today the word innovation is sizzling hot and on every executive’s lips. Academics are studying and writing about it, and today’s employees have some useful principles, methods and tools to work with.

Despite the availability of principles, methods and tools, several obstacles impede progress in treating innovation as a discipline. The biggest obstacle of all is the failure of most executives to recognize and support the “soft” side of innovation. Executives invest substantial time, money and energy in resources, processes, and metrics, but most ignore the values, behaviors and workplace climate – the aspects of culture – that make those investments pay off. Several studies support the conclusion that enterprise culture is the primary driver of innovation. Successful innovation is one part principles, methods and tools, and another part human creativity and insight; the failure to bring these two very different parts together is the greatest impediment to innovation in large companies. To capture the potential of innovation, leaders must join them.

Below are some questions to think about:

Is innovation mostly ad hoc, or is it treated as a business discipline within your firm? If innovation is not disciplined, how can you educate your executives about this? Does your firm’s culture support innovation? What are your specific challenges?
