Box Slams Microsoft While Introducing New iPad Cloud Service - Yahoo! Finance


Excerpt:

Essentially, the new service, called Box OneCloud, gives iPad and iPhone users an app store of business productivity apps where all the files created by those apps are stored in the Box cloud. Microsoft-bashing aside, this is a pretty good idea.



Posted via email from Pete's posterous

Pinterest Is Scared Of Having A 'Twitter Problem' - Yahoo! Finance


Excerpt:

Pinterest has an API done and ready for developers, but it hasn't released it to the public yet.

And it might not release it for a while, says an industry source familiar with Pinterest's plans. This source says that Pinterest fears having a "Twitter problem."

An API, or application programming interface, allows developers to build apps using Pinterest data. 
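
To make the abstraction concrete, here is a minimal sketch of what building on such an API typically looks like. Since Pinterest's API was unreleased, the endpoint, parameters, and response fields below are invented purely for illustration:

    # Hypothetical example only: Pinterest had not published its API,
    # so this endpoint and the response fields are invented to show the
    # general shape of building an app on someone else's data.
    import requests

    def fetch_recent_pins(username, limit=10):
        """Fetch a user's recent pins from an imagined REST endpoint."""
        resp = requests.get(
            f"https://api.pinterest.example/v1/users/{username}/pins",
            params={"limit": limit},
        )
        resp.raise_for_status()
        return resp.json()["pins"]  # assumed response shape

    # A third-party app could then layer features on top of this data:
    for pin in fetch_recent_pins("alice"):
        print(pin["board"], "-", pin["link"])  # assumed fields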

Twitter released its API when it was still an immature company, allowing developers to build applications with features that it was missing. When Twitter matured and wanted to control its platform, it began adding those features itself, damaging those developers.

For instance, Twitpic, which built a way to share photos, is now threatened because Twitter added native photo sharing. HootSuite and Twitterrific, which built great mobile apps, were crushed when Twitter started putting more effort into its own mobile apps.

As Twitter began competing with its developers, the developer community turned against Twitter. It didn't really trust Twitter.

Pinterest doesn't want that to happen. It's a very young company and it's just getting started, says our source. It doesn't want developers to build features or applications that it plans on building itself, only to alienate those developers later by shipping similar features.



Posted via email from Pete's posterous

[INFOGRAPHIC] How Much Does A One-Second Page Load Delay Cost?

http://www.readwriteweb.com/archives/infographic_how_much_does_a_one-second_page_load_d.php

Excerpt:

Google has long been telling us how long it takes to perform whatever search we send its way. That little note may seem self-congratulatory to the average Internet user, but it's vitally important.

Slowing that number by just 4/10ths of a second, for example, would cut 8 million searches from Google's daily total of 3 billion. And if its pages took one second longer to load, Amazon could lose as much as $1.6 billion in annual revenue.
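
Those raw numbers are easier to judge as proportions. Here's a quick back-of-the-envelope check, a sketch using only the figures quoted above:

    # Back-of-the-envelope check on the search figures quoted above.
    daily_searches = 3_000_000_000   # Google's daily search volume
    lost_searches = 8_000_000        # searches lost to a 0.4-second slowdown

    share_lost = lost_searches / daily_searches
    print(f"A 0.4s slowdown costs {share_lost:.2%} of daily searches")
    # -> A 0.4s slowdown costs 0.27% of daily searches

Small in percentage terms, but at Google's scale even fractions of a percent translate into real money.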

These and other findings are included in a smart new infographic (see below) from OnlineGraduatePrograms that sheds light on the need for speed when it comes to Web page design. All of this emphasis on instant gratification comes at a time when the Internet is trending towards being more visual, meaning Web designers need to find ways to create image-heavy sites that still load quickly.

Posted via email from Pete's posterous

Don't Let 'Corporate Antibodies' Kill Your Best Ideas, Warns Ex-HP Exec - Business Insider


Excerpt:

The outspoken McKinney is a cult figure at HP thanks to his Killer Innovations podcast, which attracts 30,000 listeners. He also just published the book Beyond the Obvious: Killer Questions that Spark Game-Changing Innovation.

Business Insider spoke to McKinney to find out how employees at large companies can get their ideas into action and grow their careers.

1. Beware the corporate antibodies: "The frustration employees feel is because they run headlong into the corporate antibodies," he quips. "Corporate antibodies aren't just obstructionists (though some are). You need to understand why they are pushing back." For instance, some of them are "ego antibodies" which means "they view themselves as the idea person. You pitching an idea is threatening to them," McKinney says.

Deal with an ego antibody by coming to that person with a rough draft and taking whatever feedback the person gives you -- verbatim -- and adding it to your presentation. Credit the person generously. This makes the person feel ownership of the idea.

McKinney names several other types of antibodies in his book, such as the naysayers who automatically veto everything. These folks are afraid of change, and you work with them by making them more afraid of not changing: point out why your idea will beat your competitors.

2. Perform "the bar room test" on your idea. "Go to your local pub. Go to the bar and order a drink. Then pick three random people, introduce yourself and, assuming they're not blotto drunk, give them the three-minute pitch. Tell them the problem you want to solve."

If your bar buddies are interested, you've got a good idea. If not, either your idea isn't great, your presentation isn't -- or both.

3. Practice "strategic story telling." Most people's biggest problem is that "they don't know how to do the pitch. They are enthralled with speeds and feeds," he says. A good pitch tells an emotional story. Think of it like pitching a "Hollywood movie" where the business problem is the villain and your idea is the hero.

4. Don't fight the "rule of 18": "Any senior executive inside an organization can suffer 18 months of pain. But if that idea isn't a booming success within 18 months, it gets killed," he warns. It is best to pitch ideas that should work within 18 months so you can build some cred. But if your idea will take longer than 18 months to mature, you'll need to teach your executives to be patient. For that you will need a mentor and a champion.

5. Don't expect your boss to be your idea mentor. When finding a mentor, you probably need to look outside your direct chain of command. "If you find an executive who's passionate about the area you are working on, they tend to be very receptive -- unless the guy's a jerk," he says. These folks will be willing to have coffee with you. If they like your idea, they'll advise you and give "air cover," he says. "In some cases I actually gave budget dollars."

Find these folks by looking for execs talking about the idea your area covers -- maybe they are blogging about it or chatting on LinkedIn groups. Maybe they are working on other, similar projects.

6. Be prepared to work, fail, and work some more. "The challenge for a lot of people is that they think the idea is fully formed," he says. But chances are at least some of it will need work. "So fix it and stick it out again. Most people want the fun, the glory and the press coverage," he says. But if the first time you run into a roadblock you walk away, you'll never succeed. "Innovation is a skill you can learn. It's not magic. It's not a special gift from God. It's just plain hard work."

Posted via email from Pete's posterous

Study Finds Developer Interest In Android Is Slowly ‘Eroding’


Excerpt:

Late last year, Google chairman Eric Schmidt announced that he firmly believed Android would overtake iOS as the first choice for mobile application developers within six months. But according to a new study conducted by IDC and Appcelerator, that’s not going to happen any time soon, as Android developers are slowly beginning to lose interest in the platform.

The companies surveyed over 2,000 of the 280,000 developers on Appcelerator’s mobile development platform this year, and found that there has been “a steady erosion of interest” in Google’s Android operating system.

Interest in Android development for smartphones has dropped from 86% to just 78%, while interest in Android development for tablets has dropped from 75% to 67%. In comparison, interest in developing for Apple's iOS operating system has remained stable.

Posted via email from Pete's posterous

And Here's The Secret Reason Apple Is Crushing Google...


Excerpt:

It's no secret that Google's products often fail to win the hearts and minds of mass-market consumers the way Apple's do.

Importantly, this failure generally has nothing to do with the technology that powers Google's products, which is often amazing.

Rather, it's the result of weaker product design.

Google TV, for example, was an absurdly complex flop that was apparently designed for consumers who have been dying to buy a TV that is as complicated as a computer (all four of them).

Google's email system, Gmail, for years forced consumers to use a "conversations" format that geeks raved about but that confused normal people who liked good old email.

Google's Android operating system, meanwhile, despite having many technological advantages over Apple's iOS, is still harder and more complicated to use than Apple's offering.

The common thread of these anecdotes is that Google designs its products for geeky technologists, while Apple designs for normal humans.

And it turns out that geeky technologists are a small, weird niche of the broader consumer market, which is making it harder for Google to become a beloved mass-market brand.

The difference between Google's product design and Apple's product design starts with the difference between the types of people each company places the highest value on.

Google has an engineering culture, in which brilliant technologists are the rock stars.

Apple, meanwhile, has a product-design and marketing culture, in which "technology" merely serves to support a product's function and form.

Posted via email from Pete's posterous

Mobile and the news media's imploding business model

Pew Research has a new survey showing that tablets and smartphones are now the primary news source for 27% of Americans. The overwhelming share of this is phones, not tablets; and a reasonable view says this will rise to 50% in three years.

Makes sense: just as radio became one of the big purveyors of news because it was the medium that traveled with you, so should mobile.

But it is also a depressing development, portending, once again, the end of the world as we know it: the news business has been plunged into a crisis because web advertising dollars are a fraction of old media money. And mobile is now a fraction of web: the approximate conversion rate is $100 offline = $10 on the web = $1 in mobile.

In part, the reasons are purely mechanical: you can cram three or four ads on a web page, meaning an average web CPM (cost per thousand impressions) of $1.00 (if you're lucky) can become a revenue per thousand page views (RPM) of $4.00 (versus $20-$40 CPMs in traditional media). Mobile CPMs are running at something closer to $0.25 – and we're only able to fit one ad on those miniature pages.
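
To make that arithmetic explicit, here is a minimal sketch (the dollar figures are the ones quoted above; the helper function is just for illustration):

    # Page RPM (revenue per thousand page views) is roughly the ad CPM
    # multiplied by the number of ad slots that fit on the page.
    def page_rpm(cpm, ad_slots):
        return cpm * ad_slots

    web = page_rpm(cpm=1.00, ad_slots=4)     # three or four ads per web page
    mobile = page_rpm(cpm=0.25, ad_slots=1)  # one ad per mobile page
    print(f"web ${web:.2f} vs mobile ${mobile:.2f} per 1,000 page views")
    # -> web $4.00 vs mobile $0.25 per 1,000 page views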

And to some extent, the problem defines the medium: who wants to pay for inattention and a cursory scrawl? (How much of mobile news is consumed by people behind the wheel, even?)

And yet, this resistance, or lack of interest, on the part of advertisers challenges the bedrock logic of marketing: follow your customers.

Brands and big agencies continue to announce their commitment to and excitement about the revolution at hand. They surely want to be seen as players and cool people. I don't know anybody in consumer marketing who isn't gaga about digital and mobile. But these are the same people and big brands who have doubled down on television, your dad's medium, making 2011 a golden TV advertising year.

Now, there's an optimistic and stubborn view that this must change, that agencies and brands can't hold on to the past for ever. Almost every digital executive and booster will, at this point in the conversation, outline his or her children's media consumption habits as anecdotal evidence in support of the coming digital ad boom. But, as it happens, the patterns seem to get only worse. According to a recent report by Kantar Media, broadcast TV was up 7.7% in the fourth quarter of 2011 (up 2.4% for the year), while internet search advertising was down 6.4% (down 2.8% for the year); and internet display advertising was down 5.9% in the fourth quarter (and down 5.5% for the year). Mobile does not even get its own break-out category.

There is another bleak element here: a basic shift in how advertising is bought and sold. More and more digital space, both web and mobile, is moved through a real-time auction process: audiences (or demographic segments) are sold like soy beans. Curiously, for all other commodities, the auction process raises prices. In a virtually unlimited world of digital advertising space, it lowers them.

If the news business on the web is depressing, contributing to the existential angst that has gripped every established news organization, mobile turns the story apocalyptic: there is no foreseeable basis on which the news establishment can support itself. There is no way even a stripped-down, aggregation-based, unpaid citizen-journalist staffed newsroom can support itself in a mobile world.

Posted via email from Pete's posterous

The Idea Factory: Insights on Creativity from Bell Labs and the Golden Age of Innovation

At the turn of the twentieth century, Thomas Edison was the most famous inventor in the world. He hoarded useful materials, from rare metals to animal bones, and through careful, methodical testing, he made his new inventions work, and previous inventions work better. Churning out patent after patent, Edison practiced a form of innovation that was about the what, and not about the how — the latter he could outsource and hire for.

“In 1910, few Americans knew the difference between a scientist, an engineer, and an inventor,” explains Jon Gertner at the beginning of his lively book about a place that fostered a home for all three, The Idea Factory: Bell Labs and the Great Age of American Innovation. The difference was clear to Edison, who was generally uninterested in the theory behind his inventions, filling his Menlo Park complex with specialists to do the work he’d rather not. “I can always hire mathematicians,” he said, “but they can’t hire me.”

Posted via email from Pete's posterous

Why Would Anyone Ever Use Siri...

To try to understand a bit more about how people decide whether they want to use a voice-activated or touchscreen interface, I spoke with Alex Rudnicky, an expert in human-computer interaction at Carnegie Mellon University. The process he described is something like a marketplace, with voice and touchscreen interfaces competing for a user’s attention: The user weighs a host of factors, some circumstantial (am I in a crowded bar that will make it hard for Siri to hear me?) and others about the inherent nature of the task (Siri may prove better at “find a cheap Chinese restaurant within a mile” than Yelp’s interface, which requires you to sort through many options to nail that request).

This description fits in well with some of the Parks Associates qualitative findings, in which many people said they use Siri when driving or when their hands are otherwise full, John Barrett of Parks Associates told me. Calculating the costs and benefits of each, a user will go with whatever’s easier, even if, Rudnicky says, they find the AI “annoying.” (Of course, this raises the question of just what it means for AI to be “annoying.” It seems that the robotic qualities people find annoying might cease to be so if the AI were to just do its job effectively and quickly.)

Beyond these sorts of functional concerns, users may also weigh cultural norms. A study done by human-computer interaction experts at Carnegie Mellon and SUNY Buffalo found that people preferred to give feedback to a robot via text rather than vocally (pdf), perhaps because they became sheepish talking to a robot around strangers. But if you’ve ever taken a public bus or subway in recent years, you know how fragile such inhibitions are. Rudnicky said we can expect self-consciousness about talking to robots to melt away over the next few years, much as it has for talking on a cellphone.

Posted via email from Pete's posterous

For Google, to Play Is to Fight the Commoditization of Android

Excerpt:

The most profound thing I've heard lately was from a guy standing behind the counter of an empty RadioShack on St. Patrick's Day in Boston. Smartphones are like the cereal aisle in the grocery store, he said: there are a thousand options, but really only a couple of varieties. Applied to Google's relationship with Android, what we see is an ecosystem that has become increasingly commoditized. The Android brand has been diluted, and while its core features come from Google, the search giant is not the first company people think of when the platform is mentioned.


Posted via email from Pete's posterous

Lender's frozen bagels: Easier is better than better

http://mobile.slate.com/articles/business/moneybox/2012/03/murray_lender_and_...

Excerpt:

Innovation is often thought of as producing better products. But sometimes the most successful innovations involve coming up with inferior products and making them cheaper and more convenient.


Posted via email from Pete's posterous

Codecademy Challenge

Dear Fake Geek Girls: Please Go Away - Forbes


Excerpt:

The Venn diagram made by Matthew Mason (from Great White Snark) depicts geek as the intersection between intelligence and obsession.

As someone married to an obsessive deep-diver, I find those definitions ring true. My husband Sean Bonner is a coffee geek, an art geek, a meme geek, and a punk-rock geek. He is super passionate and obsessive about the things that he is interested in. He has been that way his entire life, and it’s unreal watching him get hooked on something new and watching his knowledge about it grow non-stop, each and every day. It’s also why he has such a strong following: people count on him to share his deep knowledge of their favorite subject matter. He does all the heavy lifting and they get the CliffsNotes.


Posted via email from Pete's posterous

News Desk: Do Chinese Factory Workers Dream of iPads?

A counter view on low wage factory workers in China.  

I recall my Dad, himself an immigrant, telling me something vaguely similar years ago.  Different context, different time -- but remarkably resonant insight.


Excerpt:

The simple narrative equating American demand and Chinese suffering is appealing, especially at a time when many Americans feel guilty about their impact on the world. It’s also inaccurate and disrespectful. We must be peculiarly self-obsessed to imagine we have the power to drive tens of millions of people on the other side of the world to migrate and suffer in terrible ways. China produces goods for markets all over the world, including for its own consumers, thanks to low costs, a large and educated workforce, and a flexible manufacturing system that responds rapidly to market demands. To imagine that we have willed this universe into being is simply solipsistic. It is also demeaning to the workers. We are not at the center of this story—we are minor players in theirs. By focussing on ourselves and our gadgets, we have reduced the human beings at the other end to invisibility, as tiny and interchangeable as the parts of a mobile phone.

Chinese workers are not forced into factories because of our insatiable desire for iPods. They choose to leave their farming villages for the city in order to earn money, to learn new skills, to improve themselves, and to see the world.  

Posted via email from Pete's posterous

Berkeley/Stanford rankings

Yay, Berkeley!  Go Cal Bears!  Go... Tree?!

Excerpt:

BERKELEY — UC Berkeley’s reputation shines internationally, and the campus’s graduate schools retain high marks nationally, as well, according to two separate rankings made public this week.

Berkeley placed fifth in the world in reputation rankings issued Thursday by the Times Higher Education, the United Kingdom’s leading educational news publication. The rankings are based on a survey of more than 17,500 academics worldwide.

According to the Times' analysis of the reputation rankings, Berkeley is part of “an elite Anglo-American cadre of six global university ‘super-brands’” that has emerged. Rounding out the sweet six are Harvard, MIT, Cambridge, Stanford and Oxford.

“The six occupy what one expert describes as ‘a special zone beyond ordinary competition,’ riding well ahead of the chasing pack and reaping the multiple rewards associated with being the world’s best in teaching and research,” the analysis continued.


http://www.timeshighereducation.co.uk/world-university-rankings/2011-2012/reputation-rankings.html


Posted via email from Pete's posterous

Plantronics Savi 440 Review

Business grade wireless headset

http://www.pcmag.com/article2/0,2817,2390297,00.asp



Posted via email from Pete's posterous

The Real Leadership Lessons of Steve Jobs - Harvard Business Review


His saga is the entrepreneurial creation myth writ large: Steve Jobs cofounded Apple in his parents’ garage in 1976, was ousted in 1985, returned to rescue it from near bankruptcy in 1997, and by the time he died, in October 2011, had built it into the world’s most valuable company. Along the way he helped to transform seven industries: personal computing, animated movies, music, phones, tablet computing, retail stores, and digital publishing. He thus belongs in the pantheon of America’s great innovators, along with Thomas Edison, Henry Ford, and Walt Disney. None of these men was a saint, but long after their personalities are forgotten, history will remember how they applied imagination to technology and business.

In the months since my biography of Jobs came out, countless commentators have tried to draw management lessons from it. Some of those readers have been insightful, but I think that many of them (especially those with no experience in entrepreneurship) fixate too much on the rough edges of his personality. The essence of Jobs, I think, is that his personality was integral to his way of doing business. He acted as if the normal rules didn’t apply to him, and the passion, intensity, and extreme emotionalism he brought to everyday life were things he also poured into the products he made. His petulance and impatience were part and parcel of his perfectionism.

One of the last times I saw him, after I had finished writing most of the book, I asked him again about his tendency to be rough on people. “Look at the results,” he replied. “These are all smart people I work with, and any of them could get a top job at another place if they were truly feeling brutalized. But they don’t.” Then he paused for a few moments and said, almost wistfully, “And we got some amazing things done.” Indeed, he and Apple had had a string of hits over the past dozen years that was greater than that of any other innovative company in modern times: iMac, iPod, iPod nano, iTunes Store, Apple Stores, MacBook, iPhone, iPad, App Store, OS X Lion—not to mention every Pixar film. And as he battled his final illness, Jobs was surrounded by an intensely loyal cadre of colleagues who had been inspired by him for years and a very loving wife, sister, and four children.

So I think the real lessons from Steve Jobs have to be drawn from looking at what he actually accomplished. I once asked him what he thought was his most important creation, thinking he would answer the iPad or the Macintosh. Instead he said it was Apple the company. Making an enduring company, he said, was both far harder and more important than making a great product. How did he do it? Business schools will be studying that question a century from now. Here are what I consider the keys to his success.

Focus

When Jobs returned to Apple in 1997, it was producing a random array of computers and peripherals, including a dozen different versions of the Macintosh. After a few weeks of product review sessions, he’d finally had enough. “Stop!” he shouted. “This is crazy.” He grabbed a Magic Marker, padded in his bare feet to a whiteboard, and drew a two-by-two grid. “Here’s what we need,” he declared. Atop the two columns, he wrote “Consumer” and “Pro.” He labeled the two rows “Desktop” and “Portable.” Their job, he told his team members, was to focus on four great products, one for each quadrant. All other products should be canceled. There was a stunned silence. But by getting Apple to focus on making just four computers, he saved the company. “Deciding what not to do is as important as deciding what to do,” he told me. “That’s true for companies, and it’s true for products.”

After he righted the company, Jobs began taking his “top 100” people on a retreat each year. On the last day, he would stand in front of a whiteboard (he loved whiteboards, because they gave him complete control of a situation and they engendered focus) and ask, “What are the 10 things we should be doing next?” People would fight to get their suggestions on the list. Jobs would write them down—and then cross off the ones he decreed dumb. After much jockeying, the group would come up with a list of 10. Then Jobs would slash the bottom seven and announce, “We can only do three.”

Focus was ingrained in Jobs’s personality and had been honed by his Zen training. He relentlessly filtered out what he considered distractions. Colleagues and family members would at times be exasperated as they tried to get him to deal with issues—a legal problem, a medical diagnosis—they considered important. But he would give a cold stare and refuse to shift his laserlike focus until he was ready.

Near the end of his life, Jobs was visited at home by Larry Page, who was about to resume control of Google, the company he had cofounded. Even though their companies were feuding, Jobs was willing to give some advice. “The main thing I stressed was focus,” he recalled. Figure out what Google wants to be when it grows up, he told Page. “It’s now all over the map. What are the five products you want to focus on? Get rid of the rest, because they’re dragging you down. They’re turning you into Microsoft. They’re causing you to turn out products that are adequate but not great.” Page followed the advice. In January 2012 he told employees to focus on just a few priorities, such as Android and Google+, and to make them “beautiful,” the way Jobs would have done.

Simplify

Jobs’s Zenlike ability to focus was accompanied by the related instinct to simplify things by zeroing in on their essence and eliminating unnecessary components. “Simplicity is the ultimate sophistication,” declared Apple’s first marketing brochure. To see what that means, compare any Apple software with, say, Microsoft Word, which keeps getting uglier and more cluttered with nonintuitive navigational ribbons and intrusive features. It is a reminder of the glory of Apple’s quest for simplicity.

Jobs learned to admire simplicity when he was working the night shift at Atari as a college dropout. Atari’s games came with no manual and needed to be uncomplicated enough that a stoned freshman could figure them out. The only instructions for its Star Trek game were: “1. Insert quarter. 2. Avoid Klingons.” His love of simplicity in design was refined at design conferences he attended at the Aspen Institute in the late 1970s on a campus built in the Bauhaus style, which emphasized clean lines and functional design devoid of frills or distractions.

When Jobs visited Xerox’s Palo Alto Research Center and saw the plans for a computer that had a graphical user interface and a mouse, he set about making the design both more intuitive (his team enabled the user to drag and drop documents and folders on a virtual desktop) and simpler. For example, the Xerox mouse had three buttons and cost $300; Jobs went to a local industrial design firm and told one of its founders, Dean Hovey, that he wanted a simple, single-button model that cost $15. Hovey complied.

Jobs aimed for the simplicity that comes from conquering, rather than merely ignoring, complexity. Achieving this depth of simplicity, he realized, would produce a machine that felt as if it deferred to users in a friendly way, rather than challenging them. “It takes a lot of hard work,” he said, “to make something simple, to truly understand the underlying challenges and come up with elegant solutions.”

In Jony Ive, Apple’s industrial designer, Jobs met his soul mate in the quest for deep rather than superficial simplicity. They knew that simplicity is not merely a minimalist style or the removal of clutter. In order to eliminate screws, buttons, or excess navigational screens, it was necessary to understand profoundly the role each element played. “To be truly simple, you have to go really deep,” Ive explained. “For example, to have no screws on something, you can end up having a product that is so convoluted and so complex. The better way is to go deeper with the simplicity, to understand everything about it and how it’s manufactured.”

During the design of the iPod interface, Jobs tried at every meeting to find ways to cut clutter. He insisted on being able to get to whatever he wanted in three clicks. One navigation screen, for example, asked users whether they wanted to search by song, album, or artist. “Why do we need that screen?” Jobs demanded. The designers realized they didn’t. “There would be times when we’d rack our brains on a user interface problem, and he would go, ‘Did you think of this?’” says Tony Fadell, who led the iPod team. “And then we’d all go, ‘Holy shit.’ He’d redefine the problem or approach, and our little problem would go away.” At one point Jobs made the simplest of all suggestions: Let’s get rid of the on/off button. At first the team members were taken aback, but then they realized the button was unnecessary. The device would gradually power down if it wasn’t being used and would spring to life when reengaged.

Likewise, when Jobs was shown a cluttered set of proposed navigation screens for iDVD, which allowed users to burn video onto a disk, he jumped up and drew a simple rectangle on a whiteboard. “Here’s the new application,” he said. “It’s got one window. You drag your video into the window. Then you click the button that says ‘Burn.’ That’s it. That’s what we’re going to make.”

In looking for industries or categories ripe for disruption, Jobs always asked who was making products more complicated than they should be. In 2001 portable music players and ways to acquire songs online fit that description, leading to the iPod and the iTunes Store. Mobile phones were next. Jobs would grab a phone at a meeting and rant (correctly) that nobody could possibly figure out how to navigate half the features, including the address book. At the end of his career he was setting his sights on the television industry, which had made it almost impossible for people to click on a simple device to watch what they wanted when they wanted.

Take Responsibility End to End

Jobs knew that the best way to achieve simplicity was to make sure that hardware, software, and peripheral devices were seamlessly integrated. An Apple ecosystem—an iPod connected to a Mac with iTunes software, for example—allowed devices to be simpler, syncing to be smoother, and glitches to be rarer. The more complex tasks, such as making new playlists, could be done on the computer, allowing the iPod to have fewer functions and buttons.

Jobs and Apple took end-to-end responsibility for the user experience—something too few companies do. From the performance of the ARM microprocessor in the iPhone to the act of buying that phone in an Apple Store, every aspect of the customer experience was tightly linked together. Both Microsoft in the 1980s and Google in the past few years have taken a more open approach that allows their operating systems and software to be used by various hardware manufacturers. That has sometimes proved the better business model. But Jobs fervently believed that it was a recipe for (to use his technical term) crappier products. “People are busy,” he said. “They have other things to do than think about how to integrate their computers and devices.”

Part of Jobs’s compulsion to take responsibility for what he called “the whole widget” stemmed from his personality, which was very controlling. But it was also driven by his passion for perfection and making elegant products. He got hives, or worse, when contemplating the use of great Apple software on another company’s uninspired hardware, and he was equally allergic to the thought that unapproved apps or content might pollute the perfection of an Apple device. It was an approach that did not always maximize short-term profits, but in a world filled with junky devices, inscrutable error messages, and annoying interfaces, it led to astonishing products marked by delightful user experiences. Being in the Apple ecosystem could be as sublime as walking in one of the Zen gardens of Kyoto that Jobs loved, and neither experience was created by worshipping at the altar of openness or by letting a thousand flowers bloom. Sometimes it’s nice to be in the hands of a control freak.

When Behind, Leapfrog

The mark of an innovative company is not only that it comes up with new ideas first. It also knows how to leapfrog when it finds itself behind. That happened when Jobs built the original iMac. He focused on making it useful for managing a user’s photos and videos, but it was left behind when dealing with music. People with PCs were downloading and swapping music and then ripping and burning their own CDs. The iMac’s slot drive couldn’t burn CDs. “I felt like a dope,” he said. “I thought we had missed it.”

But instead of merely catching up by upgrading the iMac’s CD drive, he decided to create an integrated system that would transform the music industry. The result was the combination of iTunes, the iTunes Store, and the iPod, which allowed users to buy, share, manage, store, and play music better than they could with any other devices.

After the iPod became a huge success, Jobs spent little time relishing it. Instead he began to worry about what might endanger it. One possibility was that mobile phone makers would start adding music players to their handsets. So he cannibalized iPod sales by creating the iPhone. “If we don’t cannibalize ourselves, someone else will,” he said.

Put Products Before Profits

When Jobs and his small team designed the original Macintosh, in the early 1980s, his injunction was to make it “insanely great.” He never spoke of profit maximization or cost trade-offs. “Don’t worry about price, just specify the computer’s abilities,” he told the original team leader. At his first retreat with the Macintosh team, he began by writing a maxim on his whiteboard: “Don’t compromise.” The machine that resulted cost too much and led to Jobs’s ouster from Apple. But the Macintosh also “put a dent in the universe,” as he said, by accelerating the home computer revolution. And in the long run he got the balance right: Focus on making the product great and the profits will follow.

John Sculley, who ran Apple from 1983 to 1993, was a marketing and sales executive from Pepsi. He focused more on profit maximization than on product design after Jobs left, and Apple gradually declined. “I have my own theory about why decline happens at companies,” Jobs told me: They make some great products, but then the sales and marketing people take over the company, because they are the ones who can juice up profits. “When the sales guys run the company, the product guys don’t matter so much, and a lot of them just turn off. It happened at Apple when Sculley came in, which was my fault, and it happened when Ballmer took over at Microsoft.”

When Jobs returned, he shifted Apple’s focus back to making innovative products: the sprightly iMac, the PowerBook, and then the iPod, the iPhone, and the iPad. As he explained, “My passion has been to build an enduring company where people were motivated to make great products. Everything else was secondary. Sure, it was great to make a profit, because that was what allowed you to make great products. But the products, not the profits, were the motivation. Sculley flipped these priorities to where the goal was to make money. It’s a subtle difference, but it ends up meaning everything—the people you hire, who gets promoted, what you discuss in meetings.”

Don’t Be a Slave to Focus Groups

When Jobs took his original Macintosh team on its first retreat, one member asked whether they should do some market research to see what customers wanted. “No,” Jobs replied, “because customers don’t know what they want until we’ve shown them.” He invoked Henry Ford’s line “If I’d asked customers what they wanted, they would have told me, ‘A faster horse!’”

Caring deeply about what customers want is much different from continually asking them what they want; it requires intuition and instinct about desires that have not yet formed. “Our task is to read things that are not yet on the page,” Jobs explained. Instead of relying on market research, he honed his version of empathy—an intimate intuition about the desires of his customers. He developed his appreciation for intuition—feelings that are based on accumulated experiential wisdom—while he was studying Buddhism in India as a college dropout. “The people in the Indian countryside don’t use their intellect like we do; they use their intuition instead,” he recalled. “Intuition is a very powerful thing—more powerful than intellect, in my opinion.”

Sometimes that meant that Jobs used a one-person focus group: himself. He made products that he and his friends wanted. For example, there were many portable music players around in 2000, but Jobs felt they were all lame, and as a music fanatic he wanted a simple device that would allow him to carry a thousand songs in his pocket. “We made the iPod for ourselves,” he said, “and when you’re doing something for yourself, or your best friend or family, you’re not going to cheese out.”

Bend Reality

Jobs’s (in)famous ability to push people to do the impossible was dubbed by colleagues his Reality Distortion Field, after an episode of Star Trek in which aliens create a convincing alternative reality through sheer mental force. An early example was when Jobs was on the night shift at Atari and pushed Steve Wozniak to create a game called Breakout. Woz said it would take months, but Jobs stared at him and insisted he could do it in four days. Woz knew that was impossible, but he ended up doing it.

Those who did not know Jobs interpreted the Reality Distortion Field as a euphemism for bullying and lying. But those who worked with him admitted that the trait, infuriating as it might be, led them to perform extraordinary feats. Because Jobs felt that life’s ordinary rules didn’t apply to him, he could inspire his team to change the course of computer history with a small fraction of the resources that Xerox or IBM had. “It was a self-fulfilling distortion,” recalls Debi Coleman, a member of the original Mac team who won an award one year for being the employee who best stood up to Jobs. “You did the impossible because you didn’t realize it was impossible.”

One day Jobs marched into the cubicle of Larry Kenyon, the engineer who was working on the Macintosh operating system, and complained that it was taking too long to boot up. Kenyon started to explain why reducing the boot-up time wasn’t possible, but Jobs cut him off. “If it would save a person’s life, could you find a way to shave 10 seconds off the boot time?” he asked. Kenyon allowed that he probably could. Jobs went to a whiteboard and showed that if five million people were using the Mac and it took 10 seconds extra to turn it on every day, that added up to 300 million or so hours a year—the equivalent of at least 100 lifetimes a year. After a few weeks Kenyon had the machine booting up 28 seconds faster.

When Jobs was designing the iPhone, he decided that he wanted its face to be a tough, scratchproof glass, rather than plastic. He met with Wendell Weeks, the CEO of Corning, who told him that Corning had developed a chemical exchange process in the 1960s that led to what it dubbed “Gorilla glass.” Jobs replied that he wanted a major shipment of Gorilla glass in six months. Weeks said that Corning was not making the glass and didn’t have that capacity. “Don’t be afraid,” Jobs replied. This stunned Weeks, who was unfamiliar with Jobs’s Reality Distortion Field. He tried to explain that a false sense of confidence would not overcome engineering challenges, but Jobs had repeatedly shown that he didn’t accept that premise. He stared unblinking at Weeks. “Yes, you can do it,” he said. “Get your mind around it. You can do it.” Weeks recalls that he shook his head in astonishment and then called the managers of Corning’s facility in Harrodsburg, Kentucky, which had been making LCD displays, and told them to convert immediately to making Gorilla glass full-time. “We did it in under six months,” he says. “We put our best scientists and engineers on it, and we just made it work.” As a result, every piece of glass on an iPhone or an iPad is made in America by Corning.

Impute

Jobs’s early mentor Mike Markkula wrote him a memo in 1979 that urged three principles. The first two were “empathy” and “focus.” The third was an awkward word, “impute,” but it became one of Jobs’s key doctrines. He knew that people form an opinion about a product or a company on the basis of how it is presented and packaged. “Mike taught me that people do judge a book by its cover,” he told me.

When he was getting ready to ship the Macintosh in 1984, he obsessed over the colors and design of the box. Similarly, he personally spent time designing and redesigning the jewellike boxes that cradle the iPod and the iPhone and listed himself on the patents for them. He and Ive believed that unpacking was a ritual like theater and heralded the glory of the product. “When you open the box of an iPhone or iPad, we want that tactile experience to set the tone for how you perceive the product,” Jobs said.

Sometimes Jobs used the design of a machine to “impute” a signal rather than to be merely functional. For example, when he was creating the new and playful iMac, after his return to Apple, he was shown a design by Ive that had a little recessed handle nestled in the top. It was more semiotic than useful. This was a desktop computer. Not many people were really going to carry it around. But Jobs and Ive realized that a lot of people were still intimidated by computers. If it had a handle, the new machine would seem friendly, deferential, and at one’s service. The handle signaled permission to touch the iMac. The manufacturing team was opposed to the extra cost, but Jobs simply announced, “No, we’re doing this.” He didn’t even try to explain.

Push for Perfection

During the development of almost every product he ever created, Jobs at a certain point “hit the pause button” and went back to the drawing board because he felt it wasn’t perfect. That happened even with the movie Toy Story. After Jeff Katzenberg and the team at Disney, which had bought the rights to the movie, pushed the Pixar team to make it edgier and darker, Jobs and the director, John Lasseter, finally stopped production and rewrote the story to make it friendlier. When he was about to launch Apple Stores, he and his store guru, Ron Johnson, suddenly decided to delay everything a few months so that the stores’ layouts could be reorganized around activities and not just product categories.

The same was true for the iPhone. The initial design had the glass screen set into an aluminum case. One Monday morning Jobs went over to see Ive. “I didn’t sleep last night,” he said, “because I realized that I just don’t love it.” Ive, to his dismay, instantly saw that Jobs was right. “I remember feeling absolutely embarrassed that he had to make the observation,” he says. The problem was that the iPhone should have been all about the display, but in its current design the case competed with the display instead of getting out of the way. The whole device felt too masculine, task-driven, efficient. “Guys, you’ve killed yourselves over this design for the last nine months, but we’re going to change it,” Jobs told Ive’s team. “We’re all going to have to work nights and weekends, and if you want, we can hand out some guns so you can kill us now.” Instead of balking, the team agreed. “It was one of my proudest moments at Apple,” Jobs recalled.

A similar thing happened as Jobs and Ive were finishing the iPad. At one point Jobs looked at the model and felt slightly dissatisfied. It didn’t seem casual and friendly enough to scoop up and whisk away. They needed to signal that you could grab it with one hand, on impulse. They decided that the bottom edge should be slightly rounded, so that a user would feel comfortable just snatching it up rather than lifting it carefully. That meant engineering had to design the necessary connection ports and buttons in a thin, simple lip that sloped away gently underneath. Jobs delayed the product until the change could be made.

Jobs’s perfectionism extended even to the parts unseen. As a young boy, he had helped his father build a fence around their backyard, and he was told they had to use just as much care on the back of the fence as on the front. “Nobody will ever know,” Steve said. His father replied, “But you will know.” A true craftsman uses a good piece of wood even for the back of a cabinet against the wall, his father explained, and they should do the same for the back of the fence. It was the mark of an artist to have such a passion for perfection. In overseeing the Apple II and the Macintosh, Jobs applied this lesson to the circuit board inside the machine. In both instances he sent the engineers back to make the chips line up neatly so the board would look nice. This seemed particularly odd to the engineers of the Macintosh, because Jobs had decreed that the machine be tightly sealed. “Nobody is going to see the PC board,” one of them protested. Jobs reacted as his father had: “I want it to be as beautiful as possible, even if it’s inside the box. A great carpenter isn’t going to use lousy wood for the back of a cabinet, even though nobody’s going to see it.” They were true artists, he said, and should act that way. And once the board was redesigned, he had the engineers and other members of the Macintosh team sign their names so that they could be engraved inside the case. “Real artists sign their work,” he said.

Tolerate Only “A” Players

Jobs was famously impatient, petulant, and tough with the people around him. But his treatment of people, though not laudable, emanated from his passion for perfection and his desire to work with only the best. It was his way of preventing what he called “the bozo explosion,” in which managers are so polite that mediocre people feel comfortable sticking around. “I don’t think I run roughshod over people,” he said, “but if something sucks, I tell people to their face. It’s my job to be honest.” When I pressed him on whether he could have gotten the same results while being nicer, he said perhaps so. “But it’s not who I am,” he said. “Maybe there’s a better way—a gentlemen’s club where we all wear ties and speak in this Brahmin language and velvet code words—but I don’t know that way, because I am middle-class from California.”

Was all his stormy and abusive behavior necessary? Probably not. There were other ways he could have motivated his team. “Steve’s contributions could have been made without so many stories about him terrorizing folks,” Apple’s cofounder, Wozniak, said. “I like being more patient and not having so many conflicts. I think a company can be a good family.” But then he added something that is undeniably true: “If the Macintosh project had been run my way, things probably would have been a mess.”

It’s important to appreciate that Jobs’s rudeness and roughness were accompanied by an ability to be inspirational. He infused Apple employees with an abiding passion to create groundbreaking products and a belief that they could accomplish what seemed impossible. And we have to judge him by the outcome. Jobs had a close-knit family, and so it was at Apple: His top players tended to stick around longer and be more loyal than those at other companies, including ones led by bosses who were kinder and gentler. CEOs who study Jobs and decide to emulate his roughness without understanding his ability to generate loyalty make a dangerous mistake.

“I’ve learned over the years that when you have really good people, you don’t have to baby them,” Jobs told me. “By expecting them to do great things, you can get them to do great things. Ask any member of that Mac team. They will tell you it was worth the pain.” Most of them do. “He would shout at a meeting, ‘You asshole, you never do anything right,’” Debi Coleman recalls. “Yet I consider myself the absolute luckiest person in the world to have worked with him.”

Engage Face-to-Face

Despite being a denizen of the digital world, or maybe because he knew all too well its potential to be isolating, Jobs was a strong believer in face-to-face meetings. “There’s a temptation in our networked age to think that ideas can be developed by e-mail and iChat,” he told me. “That’s crazy. Creativity comes from spontaneous meetings, from random discussions. You run into someone, you ask what they’re doing, you say ‘Wow,’ and soon you’re cooking up all sorts of ideas.”

He had the Pixar building designed to promote unplanned encounters and collaborations. “If a building doesn’t encourage that, you’ll lose a lot of innovation and the magic that’s sparked by serendipity,” he said. “So we designed the building to make people get out of their offices and mingle in the central atrium with people they might not otherwise see.” The front doors and main stairs and corridors all led to the atrium; the cafĂ© and the mailboxes were there; the conference rooms had windows that looked out onto it; and the 600-seat theater and two smaller screening rooms all spilled into it. “Steve’s theory worked from day one,” Lasseter recalls. “I kept running into people I hadn’t seen for months. I’ve never seen a building that promoted collaboration and creativity as well as this one.”

Jobs hated formal presentations, but he loved freewheeling face-to-face meetings. He gathered his executive team every week to kick around ideas without a formal agenda, and he spent every Wednesday afternoon doing the same with his marketing and advertising team. Slide shows were banned. “I hate the way people use slide presentations instead of thinking,” Jobs recalled. “People would confront a problem by creating a presentation. I wanted them to engage, to hash things out at the table, rather than show a bunch of slides. People who know what they’re talking about don’t need PowerPoint.”

Know Both the Big Picture and the Details

Jobs’s passion was applied to issues both large and minuscule. Some CEOs are great at vision; others are managers who know that God is in the details. Jobs was both. Time Warner CEO Jeff Bewkes says that one of Jobs’s salient traits was his ability and desire to envision overarching strategy while also focusing on the tiniest aspects of design. For example, in 2000 he came up with the grand vision that the personal computer should become a “digital hub” for managing all of a user’s music, videos, photos, and content, and thus got Apple into the personal-device business with the iPod and then the iPad. In 2010 he came up with the successor strategy—the “hub” would move to the cloud—and Apple began building a huge server farm so that all a user’s content could be uploaded and then seamlessly synced to other personal devices. But even as he was laying out these grand visions, he was fretting over the shape and color of the screws inside the iMac.

Combine the Humanities with the Sciences

“I always thought of myself as a humanities person as a kid, but I liked electronics,” Jobs told me on the day he decided to cooperate on a biography. “Then I read something that one of my heroes, Edwin Land of Polaroid, said about the importance of people who could stand at the intersection of humanities and sciences, and I decided that’s what I wanted to do.” It was as if he was describing the theme of his life, and the more I studied him, the more I realized that this was, indeed, the essence of his tale.

He connected the humanities to the sciences, creativity to technology, arts to engineering. There were greater technologists (Wozniak, Gates), and certainly better designers and artists. But no one else in our era could better firewire together poetry and processors in a way that jolted innovation. And he did it with an intuitive feel for business strategy. At almost every product launch over the past decade, Jobs ended with a slide that showed a sign at the intersection of Liberal Arts and Technology Streets.

The creativity that can occur when a feel for both the humanities and the sciences exists in one strong personality was what most interested me in my biographies of Franklin and Einstein, and I believe that it will be a key to building innovative economies in the 21st century. It is the essence of applied imagination, and it’s why both the humanities and the sciences are critical for any society that is to have a creative edge in the future.

Even when he was dying, Jobs set his sights on disrupting more industries. He had a vision for turning textbooks into artistic creations that anyone with a Mac could fashion and craft—something that Apple announced in January 2012. He also dreamed of producing magical tools for digital photography and ways to make television simple and personal. Those, no doubt, will come as well. And even though he will not be around to see them to fruition, his rules for success helped him build a company that not only will create these and other disruptive products, but will stand at the intersection of creativity and technology as long as Jobs’s DNA persists at its core.

Stay Hungry, Stay Foolish

Steve Jobs was a product of the two great social movements that emanated from the San Francisco Bay Area in the late 1960s. The first was the counterculture of hippies and antiwar activists, which was marked by psychedelic drugs, rock music, and antiauthoritarianism. The second was the high-tech and hacker culture of Silicon Valley, filled with engineers, geeks, wireheads, phreakers, cyberpunks, hobbyists, and garage entrepreneurs. Overlying both were various paths to personal enlightenment—Zen and Hinduism, meditation and yoga, primal scream therapy and sensory deprivation, Esalen and est.

An admixture of these cultures was found in publications such as Stewart Brand’s Whole Earth Catalog. On its first cover was the famous picture of Earth taken from space, and its subtitle was “access to tools.” The underlying philosophy was that technology could be our friend. Jobs—who became a hippie, a rebel, a spiritual seeker, a phone phreaker, and an electronic hobbyist all wrapped into one—was a fan. He was particularly taken by the final issue, which came out in 1971, when he was still in high school. He took it with him to college and then to the apple farm commune where he lived after dropping out. He later recalled: “On the back cover of their final issue was a photograph of an early morning country road, the kind you might find yourself hitchhiking on if you were so adventurous. Beneath it were the words: ‘Stay Hungry. Stay Foolish.’” Jobs stayed hungry and foolish throughout his career by making sure that the business and engineering aspect of his personality was always complemented by a hippie nonconformist side from his days as an artistic, acid-dropping, enlightenment-seeking rebel. In every aspect of his life—the women he dated, the way he dealt with his cancer diagnosis, the way he ran his business—his behavior reflected the contradictions, confluence, and eventual synthesis of all these varying strands.

Even as Apple became corporate, Jobs asserted his rebel and counterculture streak in its ads, as if to proclaim that he was still a hacker and a hippie at heart. The famous “1984” ad showed a renegade woman outrunning the thought police to sling a sledgehammer at the screen of an Orwellian Big Brother. And when he returned to Apple, Jobs helped write the text for the “Think Different” ads: “Here’s to the crazy ones. The misfits. The rebels. The troublemakers. The round pegs in the square holes…” If there was any doubt that, consciously or not, he was describing himself, he dispelled it with the last lines: “While some see them as the crazy ones, we see genius. Because the people who are crazy enough to think they can change the world are the ones who do.”


Posted via email from Pete's posterous

Make An iPhone, Android App Without Knowing A Line Of Code

ISoBusy!

Great example of Slideology:

Posted via email from Pete's posterous

User Experience Vision For Startups - TechCrunch


Excerpt:

How can you achieve such focus though? Let’s look at two examples that will help explain the concept of a User Experience Vision.

Evernote is a note-taking tool. Every computer today ships with at least two or three free, pre-installed note-taking applications. How come Evernote is so successful? One reason for that is Phil Libin’s grand vision for the product. He wants it to replace your brain. Seriously. This is beautifully captured by Evernote’s perfectly crafted tagline — “Remember Everything”. So simple. So powerful. The beauty of this tagline is that it touches a real pain (you forget stuff), offers a compelling vision (you will now remember everything), and even more importantly — gives the Evernote team a beautiful User Experience Vision to optimize for.

When Dropbox was founded, there were probably more than 100 companies offering some sort of cloud storage or backup. How did Dropbox grow so fast? Phenomenal virality aside, the key to Dropbox’s success was a fantastic product. But what did Dropbox build? When they started, their message was: “Your files everywhere”. Simple. Powerful. Clear. Now, as they move beyond that, it has changed to “Simplify your life”. It relates to Dropbox’s plan for the future.

The Ingredients of a Great User Experience Vision

It’s very hard to capture the UXV of your product in such a meaningful and concise manner. To make it easier, consider the four critical elements that make a great User Experience Vision:

  1. It addresses a real need — if you don't know what need you are solving for, I suggest that you take time and think it through. Now. It will also give you a good starting point for defining the UXV and help you focus on what is meaningful for the user.
  2. It is simple — keeping the UXV simple is critical so you can communicate it effectively to your customers, team, partners or any other stakeholder. If it is not simple, you probably didn’t figure out the right UXV yet.
  3. It serves as a guiding light — a successful UXV provides guidance to your team as to what to build next. It can help you think through your roadmap and identify whether the next feature you are building will be useful or not.
  4. It is unique — it does not apply to every other startup on earth. Don’t have as your UXV something like “Great User Experience”. The more unique it is, the more meaningful it will be.

It is not easy to come up with a UXV. It takes time. You have to intimately understand the needs of your users. It might take weeks to come up with a good one, and either way you will keep developing and refining it. The time to start is now.



Posted via email from Pete's posterous