December 2009


This simple question is at the crux of social media. After all, why should anyone bother spending their increasingly limited time and resources trying to influence what a blogger writes if what they say has no impact?

Which raises two more important questions…

  • Firstly, is there any proof that a blog can influence behaviour (specifically where it counts for many firms, i.e. procurement)?
  • And secondly, how do I identify which blogs have the power to influence?

Jeremiah Owyang of Altimeter Group ran a survey to find out what people thought of his blog. Anecdotally, you might presume that because Jeremiah has a huge audience that regularly comments on his posts, his blog does indeed influence. Unfortunately for me, firms do not commit budget to presumptions, so research like this is fantastic.

These are some of the results:

[Image: survey results]

What can we learn from this?

  • Jeremiah’s blog influences action internally to a large degree
  • The correlation to actual buying is roughly equal for and against.

The fact that there is any link between the blog and procurement is a massive validation point. Obviously we are taking people’s word for this, and it would be excellent to have credible evidence to back it up, but it is a significant finding in itself.

The point that I keep coming back to is that in the buying cycle, the people who are closest to the money have the greatest level of importance. Some argue that the media and analyst industry help to create buzz about a need or even a shortlist, but it is the (very few) people at the top who carry a significant amount of weight. This is why influencing these influencers is of paramount importance.


To my second point, regarding how to identify which blogs are important – there are tools (such as BlogLevel), but this is part of a greater theme that I cannot do justice to in this post alone.

A few additional facts about the survey methodology:

  • Survey fielded through Web Strategy blog, using SurveyMonkey
  • Conducted from Nov 30 – Dec 12, 2009
  • 19 questions total – 15 multiple choice, 4 open-ended. Specific questions about Altimeter Group and contact info are not included in the attached sample data.
  • 195 respondents – 100% of data clean

The survey turned up several other interesting points that are not directly related to this post’s title. However, as I have often said:

be interested, not interesting

…what Jeremiah has also found regarding technical know-how fascinates me. Take it with the caveat that the people who read his blog have an interest in what he says, so the sample is skewed and not representative of the market en masse. Even so, it is fascinating to find that many firms’ social media and mobile strategies are both in their infancy. I guess 2010 may see a change to that model.

I am sure that Jeremiah will publish his own take on this survey, but my thanks to him for giving me an advance copy of the results so I could draw my own conclusions.

Originally posted on Technobabble 2.0

by @jonnybentwood

GSMA causes confusion with a strange choice of taxis at Congress 2010

“The reports of my death are greatly exaggerated” or so said Mark Twain, once upon a time in a country far far away. Isn’t it great how a little bit of information (or misinformation) can have such a profound effect on the way the world is viewed?

Do you remember where you were when you heard that Nokia was not exhibiting at the Mobile World Congress 2010? Cue the collective gasps of horror, the ‘tsks’ and ‘tuttings’ and the inevitable barrage of rhetorical questions. Was this the end of the show as we knew it? Would the congress go the way of the likes of Comdex? Did anybody care?

Well no, I don’t think they did. Why? Because the mobile industry has grown up. Its players are no longer the ‘kids in hoodies’ hassling the big fixed-line communication providers as those providers try to hang onto shrinking marketplaces and margins.

The mobile boys dumped their ‘enfant terrible’ image years ago and they are ready to do business with the new masters of the mobile world: the companies that offer the holy grail of the mobile marketplace – “Content Monetization” … if you are allowed to utter such an awful phrase.

Those new masters are the media companies. Disney, Google, News International, Electronic Arts, Sony Pictures etc; the list goes on. As does their list of premium interactive content which we, the viewing, button bashing public, just can’t get enough of.

So what if Nokia hasn’t spent millions of Euros building a booth?  No doubt they will have several hundred of their elite Finnish shock troopers going into battle all over the Fira and up and down “Las Ramblas” doing deals, pressing flesh and generally making their presence felt.

I wouldn’t bet my house on whether the Congress is going to be around forever, but in my opinion, as long as the operators want to keep selling airtime and m-commerce services, handset manufacturers want to shift volume and the whole world wants to watch “The Simpsons” on the move, Mobile World Congress ain’t going nowhere.

@MarkCasey

NEWS JUST IN: Nokia’s microsite went live today announcing its rumoured presence in Barcelona. Still – you get our point about Disney and stuff. http://events.nokia.com/mwc/

Regular visitors to The Pheasant will be familiar with Edelman’s Twitter influence measurement tool TweetLevel. While it’s by no means a perfect measure of a person’s influence on Twitter, it’s arguably the best tool on the market and something that, as an agency, we’re rightly proud of.

 

Over the month since its launch (and the beta launch of its companion service BlogLevel), we’ve looked at a number of influencer rankings, including MPs, analysts and Edelman staff, and today we’re launching our most ambitious project to date: we want to find out who the top technology influencers are on Twitter.

 

We can all probably guess what the top ten will look like, but what about the top 20, 200 or 2,000? And will some of the assumptions around who holds influence be challenged by the findings? Will marketing officers rank higher than technology journalists? Will analysts rank as highly as their profession would suggest? While the TweetLevel data won’t hold all the answers, it will provide solid, reliable statistics on which to make credible – and perhaps controversial – assertions and assumptions.

 

To build the database of technology tweeters, we’ve developed a criteria checklist that looks at the profile description and the most popular words used in tweets over a given time frame, as well as at followers and who people are following. We’re keeping the entry bar low, however, as who’s to say that a geeky father of four from Wigan isn’t more influential on a technology buying decision than a reviewer for a big consumer tech magazine?
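
Out of curiosity, here is a minimal sketch of what that kind of screening pass might look like. The keyword lists, thresholds and field names are my own illustrative assumptions, not the actual TweetLevel criteria.

    # Minimal sketch of a screening checklist like the one described above.
    # The keyword lists, thresholds and field names are illustrative assumptions,
    # not the actual TweetLevel criteria.
    from collections import Counter

    TECH_TERMS = {"cloud", "mobile", "software", "gadget", "android", "iphone", "enterprise"}
    SEED_ACCOUNTS = {"techcrunch", "wired", "guardiantech"}  # hypothetical seed list

    def looks_like_tech_tweeter(profile: dict, min_hits: int = 3) -> bool:
        """Score an account on bio keywords, frequent tweet terms and who it follows."""
        hits = 0
        bio_words = set(profile.get("bio", "").lower().split())
        hits += len(bio_words & TECH_TERMS)

        # Most popular words used in tweets over the sample window
        tweet_words = Counter(w for t in profile.get("tweets", []) for w in t.lower().split())
        hits += sum(1 for w, _ in tweet_words.most_common(20) if w in TECH_TERMS)

        # Overlap between who they follow and a seed list of known tech accounts
        hits += len(set(profile.get("following", [])) & SEED_ACCOUNTS)
        return hits >= min_hits

    candidate = {
        "bio": "Geeky father of four from Wigan, loves android gadgets",
        "tweets": ["new android handset looks great", "cloud backups saved me again"],
        "following": ["techcrunch", "wired"],
    }
    print(looks_like_tech_tweeter(candidate))  # True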

 

That’s the beauty and unpredictability of the new digital world: old models and assumptions can no longer go unchallenged, and tools like TweetLevel are helping us all find out who the new influencers are.

 

So if you think you should be included in our round-up, or you’d like to nominate a colleague, client or a friend, send a tweet with the person’s Twitter ID followed by the hashtag #2010techinfluencer. Or send their Twitter profile in the subject line of an email to techinfluencers@edelman.com and keep an eye out for the results early in 2010. For an explanation of the complex algorithms used to create TweetLevel rankings, please go here.

 

I must confess that I was never much interested in graphs and statistics; I’ve always found them necessary but boring, dull and static, not to mention that they’ve always seemed to lack what we call usability.

To my surprise, a couple of years ago I was browsing content on TED when I found this great talk by doctor and researcher Hans Rosling on his new data presentation software, which is really amazing (it’s a 20-minute video, but I strongly recommend you take the time to watch it whenever you get a chance).

I’ve shared the link with many friends and even talked about it during my classes at university, but since then, no other guru had really grabbed my attention until last week, when the December issue of Yorokobu magazine landed on my desk. It features a two-page interview with London-based writer and designer David McCandless, who recently launched his book The Visual Miscellaneum, a colourful guide that helps readers like me make sense of the countless statistics and random facts that constantly bombard us.

According to David, people are gradually moving from text to images when consuming information, especially in the media-saturated landscape we are experiencing today. McCandless draws on his knowledge of usability – the result of years of experience in web design – to condense his research into clear, concise and neat graphics.

Besides collaborating with The Guardian and Wired, David also shares his work on his blog Information is Beautiful as well as on his Flickr, where you can find all kinds of information, ranging from the probability of dying in a plane accident to the hierarchy of digital distractions.

I’ve selected some of his work here – I truly wish I could borrow some of his talent when drafting my next proposal or results report.

from @vaneribeiro

Some interesting stats about the much-trumpeted Twitter community – visualized!

  

 

Billions spent on this. Billions spent on that. What does it all look like?

A concept-map exploring the Left vs Right political spectrum. A collaboration between David McCandless and information artist Stefanie Posavec.

I recently stumbled on a rather fascinating article in the December issue of Wired that really delves into the intricacies of privacy in today’s digital age. Well, that’s not really the main objective of the story, "Vanish: Finding Evan Ratcliff". As the title suggests, Evan Ratcliff, a freelance journalist, decides to throw himself into an experiment which, at first thought, doesn’t seem that hard: disappear. “I told no one of my plans, not my girlfriend, or my parents, or my friends. No one knew where I was going, or my new name. I left no hints. If anyone found me, it would be because of my own mistakes.”

“….And?” was my initial reaction. I mean, how hard could it be? We see this kind of stuff depicted in films and TV series all the time. Change your appearance, don a disguise even, invent a new identity, pay for everything in cash, keep a low profile, and keep moving. Right?

Except for the fact that Evan continues to use his bank card on occasion, his real name when there’s no other way around it, and – here’s the real kicker – the Internet, and even social networking sites such as Facebook and Twitter.

Ah… yeah, probably hard to keep a low profile with the above considered.

Granted, when on the Net, he uses aliases and technology to cover up his digital footprints. But nonetheless, the story brings up a really interesting question: Is it possible to entirely “shed your identity in the digital age” today? Have you ever thought about what would happen if you passed away? Who takes care of deleting your Twitter, Facebook, or LinkedIn accounts? Do they just hang around in cyberspace until someone contacts the site(s) to flag your demise to the administrators? And what if no one ever does flag them?

We all know how easy, sometimes a little too easy, it can be to dig up information about an individual on the internet, even if he or she has never logged on themselves (ahem – I can tell you, for example, exactly which politicians and campaigns the grandparents of my friends gave contributions to, just by a quick search on the web – so much for keeping your political beliefs private…). But as I was reading this article, I was not only fascinated by the sheer wealth of information on this guy that people were finding in order to track him down, but I was also entirely amused by the lengths that people went to in order to (legally) dig it up. It’s actually quite scary, really, which made me think of this article again a few nights ago…

I was speaking to a friend about Facebook, and he explained that his employer forbids him to “friend” any people at the company where he is currently contracted to work. Even if he is really friends in real life with such individuals, his company doesn’t want the lines between his professional life and personal life to be blurred. This started a lively discussion, of course, in which we debated both the merits and downsides of using Facebook today, especially when it comes to work.

At one point he opined that he doesn’t understand all the hoopla regarding people who claim that Facebook is a dangerous tool which can be detrimental to one’s privacy. “In the end, you’re the one who gets to decide what you put out there. If you’re not comfortable with the idea of a professional acquaintance happening upon your page and seeing that your status reads how drunk you got last night at a friend’s party, then don’t put it out there.” [As a side note, here are a few hilarious cases which demonstrate this point perfectly…]

But I thought of the Wired article during the course of our conversation. Everyone is quick to point the finger at Facebook these days when it comes to the news swirling around about the sharing of personal information with others who may not know you. But really, this debate goes far beyond Facebook, and Evan Ratcliff’s experiment demonstrates this beautifully. The amount of information hovering out there in cyberspace about your average Joe is staggering. And with a little know-how, it’s almost creepy how easy it is to gain access to this personal information.

God forbid you ever posted a comment in the forum of a website entitled “I like to dress my dog up in cute clown costumes and dye his fur pink” on a dare from your friends 8 years back. I’d hate to see the look on your potential employer’s face when he Googles your name before deciding whether to invite you in for a formal interview.

by @Jjpisces

You can run, but you can’t hide

So Christmas parties are renowned for being embarrassing.  It wouldn’t be December without indiscreet inter-colleague relations, bad dancing and copious amounts of alcohol, of course.

As avid readers of this blog might know – we at Edelman are quite passionate about going the extra mile.  True to form – we couldn’t settle on just some nibbles and drinks to celebrate the passing of another year.  So some of our colleagues, indeed many posters to this blog, ran the gauntlet of embarrassment last night for the inaugural Edelman Christmas Panto.

Of course there were nay-sayers who thought that the Panto was the epitome of corporate cringe behaviour.  But in the grand Panto tradition we all screamed “Oh no it isn’t” and the Scrooges were proved wrong.  In fact the panto wasn’t embarrassing at all.  The plot was preposterous (who knew a mash-up of Macbeth and The X Factor would be quite so entertaining), the cast were tremendous, the lighting professional, the set changes slick, the front of house staff generous with vodka jelly, the band could have actually won The X Factor, and the costumes were incredible.  Particularly the Dog costume.  The Dog costume was my favourite.  It was a production created by colleagues for colleagues.  It was, in two words, fully awesome.  Well done everyone.

*APPLAUDS*

But we thought it would only be appropriate to share some photos.  Rumour has it a DVD will be available through all major retail outlets in early 2010.

There’s been substantial interest in the news this week about the Stateside launch of Vevo – an online music player that is being dubbed MTV for the 2.0 generation, and perhaps rightly so. Firstly, the service has the buy-in of three of the major labels (at present EMI, Universal and Sony), and has secured it by novel means; EMI has gone down the tried and tested licensing route, but interestingly the other two majors have taken equity in the business. This equity approach shows robust confidence in the service and suggests that licensing may wane across the entertainment industries if major labels can instead take a share of the profits outright.
 
Secondly, Vevo looks to have solved what was perhaps the fundamental flaw in Google’s high-value acquisition of YouTube, with many analysts and industry commentators at a loss as to where the return on investment was really coming from, given that the vast majority of content on YouTube is poor quality, grainy and often filmed from another medium in the first place, such as a TV – or is a skateboarding cat. Would record labels want their brand next to a poor-quality music video? Pretty much no, and YouTube continues to flatter to deceive with regard to giving Google back the billions spent to acquire it. That YouTube is powering Vevo, however, could resolve this; Vevo will be a branded, dedicated player with high-quality content that will interest advertisers much more than today’s video quality does – its CEO has suggested phenomenally strong rates as high as $25 – $40 per 1,000 views, an incredible jump from today’s norm of $3 – $8.
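
To put those rates in context, here is a rough back-of-the-envelope comparison; the 10 million monthly views figure is purely an illustrative assumption, not a Vevo number.

    # Back-of-the-envelope comparison of ad revenue at the quoted CPM rates.
    # The 10 million monthly views figure is an illustrative assumption, not a Vevo figure.
    views = 10_000_000
    revenue_today_norm = views / 1000 * 5.5     # midpoint of today's $3–$8 per 1,000 views
    revenue_vevo_claimed = views / 1000 * 32.5  # midpoint of the claimed $25–$40 per 1,000 views

    print(f"At today's norm:        ${revenue_today_norm:,.0f}")    # ~$55,000
    print(f"At the claimed rates:   ${revenue_vevo_claimed:,.0f}")  # ~$325,000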
 
What’s more, if this content really is as high quality as promised – and in the long term potentially exclusive or streamed live – it will encourage more people to share it and thus drive traffic even further: a solution to monetising peer-to-peer sharing (in the friends sense, not the technological sense). So is Vevo the saviour of the entertainment industry? Initial reaction has been very positive and it will be interesting to see how it rolls out in the States before hitting the UK sometime next year. Fingers crossed.

@wonky_donky

You may have heard or read about two seemingly rather dull announcements from the UK government in the last day or so relating to Met Office and Ordnance Survey data. What has actually been announced, though, is quite interesting – and perhaps even revolutionary.

The government has decided to make data from these organisations (or at least a certain amount of it) freely available to the public. What they are hoping to do is encourage entrepreneurs to develop new businesses through the inventive use of this data. It is hoped this will generate tax revenue greater than could have been realised by selling that data for commercial use.

This is a very interesting move on the part of the government that could result in the creation of a wide range of new businesses.

But surely this is a bit too forward-thinking for a government that is most likely approaching the end of its days? Well, yes, it is. The idea was actually seeded by Sir Tim Berners-Lee, creator of the web, and Professor Nigel Shadbolt from the University of Southampton. Both were recently appointed as government advisers on technology.

For Sir Tim and Professor Shadbolt, the real motivation here will be to encourage the growth of the semantic web, which has been long talked about but painfully slow in realisation.

Essentially, the semantic web is an ongoing effort to make the web more “intelligent” by allowing it to "understand" and satisfy user requests (including requests from machines) to a greater degree. At the heart of the semantic web is linked data, and because much of the data held by the Met Office and Ordnance Survey can be classified as linked, it is essentially semantic web ready, making it ideally suited to the purpose of encouraging the next stage in the Internet’s evolution.
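
For the curious, here is a toy sketch of what “linked data” means in practice: facts stored as subject–predicate–object triples that machines can follow from one dataset to another. The URIs and facts below are invented for illustration; they are not real Met Office or Ordnance Survey identifiers.

    # Toy illustration of linked data: facts stored as subject-predicate-object triples.
    # The URIs and values below are invented for illustration; they are not real
    # Met Office or Ordnance Survey identifiers.
    triples = [
        ("http://example.org/place/Southampton", "locatedIn", "http://example.org/place/Hampshire"),
        ("http://example.org/place/Southampton", "hasForecast", "http://example.org/forecast/2009-12-10"),
        ("http://example.org/forecast/2009-12-10", "maxTempCelsius", "7"),
    ]

    def query(subject=None, predicate=None, obj=None):
        """Return every triple matching the given pattern (None acts as a wildcard)."""
        return [t for t in triples
                if (subject is None or t[0] == subject)
                and (predicate is None or t[1] == predicate)
                and (obj is None or t[2] == obj)]

    # Because the datasets share identifiers, a machine can follow the links:
    # find Southampton's forecast, then look up its maximum temperature.
    forecast = query("http://example.org/place/Southampton", "hasForecast")[0][2]
    print(query(forecast, "maxTempCelsius"))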

This article from the FT provides more detail on the government’s announcement and is worth reading: http://www.ft.com/cms/s/0/cdcc60a2-e399-11de-9f4f-00144feab49a.html

@AndyRobertson

A Star Wars example of how the semantic web works – taken from the excellent folks at http://www.howstuffworks.com/semantic-web.htm

Last week a friend of mine suggested that I get involved with a social media experiment  being run by a gallery in Brighton.  @Fabrica had tweeted that they were looking for people to take part in a social media ‘game’ of sorts.  Jumping at the chance to ‘play’ whilst passing it off as ‘work’ – I got in touch…

You can read the official write up, but I figured I’d replay the tweets and let you know what I thought as well.

The game was called “Broken Whispers”.  I was told that I would receive a message from a stranger via tweet; I then had to change two words and send the altered message on to another stranger. This happened three times.  Essentially a Chinese Whispers 2.0, the game was looking to explore themes of how stories evolve, to tie into an exhibition at the gallery.  As far as I can tell the players enjoyed it a lot – gaining a cheap thrill out of knowing what these odd messages in their tweet stream were about, tapping into new communities and, crucially, having fun.
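
For the curious, the mechanics are simple enough to simulate. The sketch below is a toy version with invented players, an invented message and random word swaps – the real game, of course, relied on human imagination rather than a random number generator.

    # Toy simulation of the "Broken Whispers" mechanic: each player swaps two words
    # before passing the message on. Players, message and replacement words are
    # invented for illustration; the real game relied on human whimsy, not randomness.
    import random

    random.seed(1)
    players = ["@player_one", "@player_two", "@player_three"]
    message = "the gallery opens its doors to strangers at midnight"
    replacements = ["pheasant", "whispers", "brighton", "pixels", "teacups"]

    for player in players:
        words = message.split()
        for i in random.sample(range(len(words)), 2):  # pick two words to change
            words[i] = random.choice(replacements)
        message = " ".join(words)
        print(f"{player} passes on: {message}")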

The old lit student within me found it interesting from a narrative point of view.  With my tweets I was trying to somehow continue a sense of story – only for an irregular word to be added further down the chain and really trip things up. I quickly realised, though, that as with all crowdsourced content, the merit isn’t in how the message and story finish, but in watching and participating in how they evolve.  Like Bowie’s best lyrics, the chain was fragmented nonsense, but taking part in the process – listening to the whole band play, to continue the analogy – was where the fun could be had.

This obviously had some sort of artistic purpose, but thinking about it in the context of our brand-focused work, it offered a couple of lessons for our own campaigns.  The first was a way of looking beyond the obvious “engage a community” tactics.  The game uses an existing community (the Fabrica Gallery’s followers – but it could equally be any brand’s) to build and create touch-points in other pools of influence and micro-communities – associated groups of people who are ultimately not related (my micro-community of followers, and the followers of those I was messaging).

Recently we have been talking about the future of the press release and how companies can’t expect to fully own messages, only steer audiences in the right direction.  The game acted as an example of this – albeit with a heavily involved catalyst and moderator to steer the way.  It showed that if you give the community the right tools  they can play with the sentiment without totally destroying the message.  In today’s world, where brands are concerned, it should be about getting people involved, getting people to think, getting people to play.  It doesn’t have to be about repurposing the party line.

So who’s up for a game of “Broken Key Messages”?

@LukeMackay

Edit: Now includes predictions from: IDC, Gartner, Screen Digest, Elemental Links, Freeform Dynamics, Quocirca, CCS Insight, RedMonk, Juniper Research, CMS Watch, Hurwitz Group, Interpret, IDEAS International

Predicting what will be big is a fun game that all the analysts like to play each year. The technology industry, often so full of hype, frequently drinks its own Kool-Aid and genuinely believes that it has the next big thing. Obviously I am a cynic, but I often doff my cap to the analyst world, which makes its living (as IDC say) analysing the future.

Consequently I have listed below some of the predictions that our analysts have come out with, in the hope of looking back at this post in a year’s time to see how accurate their prophetic skills are.

IDC’s two big guesses are an “Apple iPad” and “Battles in the Cloud.” They have an excellent download detailing their full views here, which I would recommend everyone get. Their summary is as follows:

  1. Growth will return to the IT industry in 2010. We predict 3.2% growth for the year, returning the industry to 2008 spending levels of about $1.5 trillion.
  2. 2010 will also see improved growth and stability in the worldwide telecommunications market, with worldwide spending predicted to increase 3%.
  3. Emerging markets will lead the IT recovery, with BRIC countries growing 8–13%.
  4. Cloud computing will expand and mature as we see a strategic battle for cloud platform leadership, new public cloud hot spots, private cloud offerings, cloud appliances, and offerings that bridge public and private clouds.
  5. It will be a watershed year in the ascension of mobile devices as strategic platforms for commercial and enterprise developers as over 1 billion access the Internet, iPhone apps triple, Android apps quintuple, and Apple’s "iPad" arrives.
  6. Public networks — more important than ever — will continue their aggressive evolution to fiber and 3G and 4G wireless. 4G will be overhyped, more wireless networks will become "invisible," and the FCC will regulate over-the-top VoIP.
  7. Business applications will undergo a fundamental transformation — fusing business applications with social/collaboration software and analytics into a new generation of "socialytic" apps, challenging current market leaders.
  8. Rising energy costs and pressure from the Copenhagen Climate Change Conference will make sustainability a source of renewed opportunity for the IT industry in 2010.
  9. Other industries will come out of the recession with a transformation agenda and look to IT as an increasingly important lever for these initiatives. Smart meters and electronic medical records will hit important adoption levels.
  10. The IT industry’s transformations will drive a frenetic pace of M&A activity.

Gartner, always one to beat the rush, tends to name its strategic technologies for the following year in October. In fairness, it does have many bespoke Predict 2010 reports, but they are behind its firewall so I cannot share the full content with you. However, below is its list of strategic technologies for 2010, which is in the public domain. I have copied the extract below in full from the aforementioned release.

  1. Cloud Computing. Cloud computing is a style of computing that characterizes a model in which providers deliver a variety of IT-enabled capabilities to consumers. Cloud-based services can be exploited in a variety of ways to develop an application or a solution. Using cloud resources does not eliminate the costs of IT solutions, but does re-arrange some and reduce others. In addition to consuming cloud services, enterprises will increasingly act as cloud providers and deliver application, information or business process services to customers and business partners.
  2. Advanced Analytics. Optimization and simulation is using analytical tools and models to maximize business process and decision effectiveness by examining alternative outcomes and scenarios, before, during and after process implementation and execution. This can be viewed as a third step in supporting operational business decisions. Fixed rules and prepared policies gave way to more informed decisions powered by the right information delivered at the right time, whether through customer relationship management (CRM) or enterprise resource planning (ERP) or other applications. The new step is to provide simulation, prediction, optimization and other analytics, not simply information, to empower even more decision flexibility at the time and place of every business process action. The new step looks into the future, predicting what can or will happen.
  3. Client Computing. Virtualization is bringing new ways of packaging client computing applications and capabilities. As a result, the choice of a particular PC hardware platform, and eventually the OS platform, becomes less critical. Enterprises should proactively build a five to eight year strategic client computing roadmap outlining an approach to device standards, ownership and support; operating system and application selection, deployment and update; and management and security plans to manage diversity.
  4. IT for Green. IT can enable many green initiatives. The use of IT, particularly among the white collar staff, can greatly enhance an enterprise’s green credentials. Common green initiatives include the use of e-documents, reducing travel and teleworking. IT can also provide the analytic tools that others in the enterprise may use to reduce energy consumption in the transportation of goods or other carbon management activities.
  5. Reshaping the Data Center. In the past, design principles for data centers were simple: Figure out what you have, estimate growth for 15 to 20 years, then build to suit. Newly-built data centers often opened with huge areas of white floor space, fully powered and backed by an uninterruptible power supply (UPS), water- and air-cooled and mostly empty. However, costs are actually lower if enterprises adopt a pod-based approach to data center construction and expansion. If 9,000 square feet is expected to be needed during the life of a data center, then design the site to support it, but only build what’s needed for five to seven years. Cutting operating expenses, which are a nontrivial part of the overall IT spend for most clients, frees up money to apply to other projects or investments either in IT or in the business itself.
  6. Social Computing. Workers do not want two distinct environments to support their work – one for their own work products (whether personal or group) and another for accessing “external” information. Enterprises must focus both on use of social software and social media in the enterprise and participation and integration with externally facing enterprise-sponsored and public communities. Do not ignore the role of the social profile to bring communities together.
  7. Security – Activity Monitoring. Traditionally, security has focused on putting up a perimeter fence to keep others out, but it has evolved to monitoring activities and identifying patterns that would have been missed before. Information security professionals face the challenge of detecting malicious activity in a constant stream of discrete events that are usually associated with an authorized user and are generated from multiple network, system and application sources. At the same time, security departments are facing increasing demands for ever-greater log analysis and reporting to support audit requirements. A variety of complementary (and sometimes overlapping) monitoring and analysis tools help enterprises better detect and investigate suspicious activity – often with real-time alerting or transaction intervention. By understanding the strengths and weaknesses of these tools, enterprises can better understand how to use them to defend the enterprise and meet audit requirements.
  8. Flash Memory. Flash memory is not new, but it is moving up to a new tier in the storage echelon. Flash memory is a semiconductor memory device, familiar from its use in USB memory sticks and digital camera cards. It is much faster than rotating disk, but considerably more expensive; however, this differential is shrinking. At the rate of price declines, the technology will enjoy more than a 100 percent compound annual growth rate during the next few years and become strategic in many IT areas including consumer devices, entertainment equipment and other embedded IT systems. In addition, it offers a new layer of the storage hierarchy in servers and client computers that has key advantages including space, heat, performance and ruggedness.
  9. Virtualization for Availability. Virtualization has been on the list of top strategic technologies in previous years. It is on the list this year because Gartner emphasizes new elements such as live migration for availability that have longer term implications. Live migration is the movement of a running virtual machine (VM), while its operating system and other software continue to execute as if they remained on the original physical server. This takes place by replicating the state of physical memory between the source and destination VMs, then, at some instant in time, one instruction finishes execution on the source machine and the next instruction begins on the destination machine.
    However, if replication of memory continues indefinitely, but execution of instructions remains on the source VM, then if the source VM fails, the next instruction would take place on the destination machine. If the destination VM were to fail, just pick a new destination to start the indefinite migration, thus making very high availability possible.
    The key value proposition is to displace a variety of separate mechanisms with a single “dial” that can be set to any level of availability from baseline to fault tolerance, all using a common mechanism and permitting the settings to be changed rapidly as needed. Expensive high-reliability hardware, with fail-over cluster software and perhaps even fault-tolerant hardware could be dispensed with, but still meet availability needs. This is key to cutting costs, lowering complexity, as well as increasing agility as needs shift.
  10. Mobile Applications. By year-end 2010, 1.2 billion people will carry handsets capable of rich, mobile commerce providing a rich environment for the convergence of mobility and the Web. There are already many thousands of applications for platforms such as the Apple iPhone, in spite of the limited market and need for unique coding. It may take a newer version that is designed to flexibly operate on both full PC and miniature systems, but if the operating system interface and processor architecture were identical, that enabling factor would create a huge turn upwards in mobile application availability.

Edit: Now with mobile predictions from Julien Theys at Screen Digest (his view, not the company’s)

  1. Facebook’s careful and progressive foray into Location-Based services will deal a deadly blow to many mobile LBS startups
  2. Sony Ericsson’s and Motorola’s handset businesses will face very serious existential crises and possibly split up.
  3. Sony is very likely to take another solo shot at mobile (especially if it wants to leverage the Playstation branding)
  4. Palm will be acquired by a bigger fish
  5. Sync services, Address Book/Social Network consolidation and cloud backups are going to be ubiquitous
  6. Apple’s app store will remain an outlier in terms of success. Developers of crappy apps will blame everyone but themselves for not making any money. App rejection horror stories will keep tech pundits busy yet customers won’t care.
  7. Open source mobile software will keep tech pundits busy yet customers won’t care
  8. A very bad year for Windows Mobile, Microsoft to (try to) unify all its scattered efforts in portable media (WinMo, Zune, Sidekick…)
  9. A surprisingly high number of people will keep buying simple phones and not care about mobile web
  10. Wholesale B2B mobile data deals (like Kindle content delivery) to bring in extra cash for operators (big for eBook readers, navigation and automotive industries)
  11. Don’t hold your breath: Google hardware, iPhone nano, Android-based Nokias, decent mobile broadband speed & coverage.
  12. Tempted to say no Apple Tablet in 2010 because, really, where’s the rush?


Edit: Now with Clive Longbottom from Quocirca

  1. Process needs come to the fore (again). BPM players need to go to the cloud, feel the FaaS
  2. Budgets do improve, but only for point projects.
  3. Few platform projects outside of virtualisation…
  4. Licence management issues force more vendors to subscription agreements – but small print still problematic

Edit: Now with CCS Insight

  1. Motorola will be acquired by Google. Although this is an unlikely scenario, it would deliver Google a portfolio of Android-powered devices. We would expect Motorola’s Devices headquarters to shift to Mountain View in California.
  2. Apple will make major moves into mapping, location and augmented reality. Such moves would pitch Apple into direct competition with Google and Nokia. They would also end Apple’s reliance on Google for mapping services.
  3. Amazon will make a larger play in mobile in 2010. It already sells phones, accessories, netbooks and music, and has a well-established and trusted payment system. The Kindle device and its connectivity agreement with AT&T are expanding worldwide. Amazon will exploit its strength in cloud services, either by offering recommendations as a cloud service to software stores, or by setting up a world-scale application store.
  4. Most application stores will not be profitable during 2010. Like store owners, developers will be frustrated by the gap between the promise of easy income and the reality of needing careful marketing to make consumers aware of applications.
  5. At least one European country will introduce a mobile telecom tax in 2010. Governments are looking to bridge national budget gaps, and mobile network operators’ substantial revenues will be an attractive target.
  6. Some countries will see a fall in subscriber numbers. As prepaid users switch to contracts they will discard their two or three old prepaid SIMs. In high-penetration markets such as Italy we expect leading operators to report a decline in customers.
  7. At least two major European operators will stop subsidising phones in 2010. They will switch to a SIM-only strategy, and offer nothing but SIM-only contracts to new customers and to people renewing or upgrading contracts.
  8. Microsoft will port Windows 7 to the ARM architecture. The move will be a hedge against burgeoning growth in sub-netbook mobile computing, but Microsoft will not acknowledge the port before the end of 2010. Any ARM products running Windows 7 would not be available until 2011.

Edit: Now with Governor, Coté and SOG from the RedMonk massive. My apologies to RedMonk for combining and editing your predictions together to form a top 20

  1. Cloud API proliferation will become a serious problem
  2. Data as revenue – we’re going to see datasets increasingly recognized as a serious, balance sheet-worthy asset
  3. Developer target fragmentation will accelerate
  4. It’s all about the analytics – metrics can be immensely important in maximizing returns, and to an extent, profits. In 2010 business intelligence will become less about the power user, and more about democratised access to the ad hoc query. In-memory databases will underpin the trend.
  5. Marketplaces will be table stakes (Jonny edit: my first client was Commerce One)
  6. New languages to watch: Clojure and Go
  7. NoSQL will bid for mainstream acceptance
  8. Location, location, location: the new frontier in app dev is location-aware applications and services
  9. Augmented Reality will begin to make a mark in the mobile space.
  10. Greener business processes through deeper instrumentation, more effective automation and orchestration
  11. Google will significantly ramp up enterprise efforts
  12. Hybrid Cloud and On Premise models for the enterprise – the Big Cloud Backlash will be in full effect in 2010, after all the hype in 2009.
  13. SOA without the SOA
  14. A big upswing in enterprise demerger activity
  15. New devices: Smart phones, tablets, toys, TVs, and other devices are now on the Internet. Software goes here.
  16. Users no longer tolerate slow and dumb computers.
  17. Technology every where and at all times changes how people go about their daily work and lives.
  18. New technology actually seems to work; but it’s not as open as we’re used to.
  19. Identity management standards
  20. The consumerization of IT, or whatever you like: the core difference with these new platforms is that end-users expect more out of their “computers” and the related software.

Edit: Now with Juniper Research

  1. Mobile Data Traffic Explosion to strain 3G Networks, spur data pricing overhaul
  2. Mobile Ecosystem starts to go green
  3. Mobile Heads for the Cloud
  4. New category of Smartbooks to Emerge
  5. Apps Stores All Round
  6. Mobile Social Networking to Integrate with other Applications including M-Commerce
  7. NFC phones appear in the shops
  8. At least 10 LTE networks to be launched into service
  9. Smartphones to Get Augmented Reality Makeover
  10. Christmas Kindle sales expected to herald the rise of the connected embedded consumer devices

Edit: Now with CMS Watch

  1. Enterprise Content Management and Document Management will go their separate ways
  2. Faceted search will pervade enterprise applications
  3. Digital Asset Management vendors will focus on SharePoint integration over geographic expansion
  4. Mobile will come of age for Document Management and Enterprise Search
  5. WCM vendors will give more love to Intranets
  6. Enterprises will lead thick client backlash
  7. Cloud alternatives will become pervasive
  8. Document Services will become an integrated part of ECM
  9. Gadgets and Widgets will sweep the Portal world
  10. Records Managers face renewed resistance
  11. Internal and external social and collaboration technologies will diverge
  12. Multi-lingual requirements will rise to the fore

Edit: Now with Hurwitz Group about the SMB market

  1. Pent-Up SMB Demand Will Be There – But Won’t Be Easy to Capture
  2. SMBs Accelerate Their Shift to Digital Marketing Media
  3. The Collaboration Battle Heats Up
  4. The New Face of Small Business
  5. Savvy SMB Vendors Get Strategic About Social Media Analysis
  6. SMBs Drive the Mobile Internet Tsunami
  7. Virtualization Boosts Cloud Computing Adoption
  8. SMBs’ Appetite for Managed Services Grows
  9. Beyond Excel – Targeted Workflow and Analytic Tools Take Flight
  10. 2009 Acquisitions Drive New Value for SMB customers in 2010
  11. Time to Get Paid for Selling a Free Lunch
  12. Vendors Scramble for SMB Developer Loyalty – and New Integration Needs Arise
  13. SaaS Computing Lifts Off in New Areas

Edit: Now with Interpret, LLC

  1. Mobile services will gain importance over applications
  2. The smartphone market will fragment
  3. Mobile platforms will consolidate
  4. We will hit the intersection of mobile and social networks
  5. Microsoft will not retreat from the mobile market

Edit: Now with IDEAS International

  1. More mergers, acquisitions and alliances will occur. IDEAS expects to see some surprising moves, many of which will be opportunistic, but others due to strategic shifts in response to competitive realignments in the industry. Acquisitions could include hardware companies buying large system software companies, and more high-profile software companies buying hardware companies.
  2. Demand for server systems will rebound. Server deployment will pick up in 2010 due to pent-up demand, and as customers take advantage of new technology to streamline their infrastructure. Companies that do not make their IT more efficient will find themselves at a competitive disadvantage. As the economy recovers, companies that optimize their business processes to maximize use of IT will gain an advantage over those that have concentrated only on cutting costs.
  3. Green marketing hype will subside as the issue matures. Corporate executives will remain highly focused on environmental goals for the overall corporation, but for departmental-level IT managers, "efficient IT" will gradually replace "green computing" as a priority.
  4. x86 servers will continue their drive into the datacenter. The sharp increase in performance of Intel’s Nehalem processor will accelerate the use of x86 servers for hosting enterprise-level workloads, but x86 servers will not take on the same characteristics as the "big-iron" systems that traditionally made up the heart of datacenters. Instead, servers and storage systems will employ various forms of clustering software on inexpensive hardware to achieve scalability and reliability.
  5. Server form factors will progress from towers and racks to integrated appliance-like systems. Integrated solutions that combine multiple layers of IT infrastructure, including servers, storage, networking, and software, will become increasingly appealing to organizations that have the need for next-generation datacenter capabilities, but lack the depth of personnel to manage complex new technology such as virtual infrastructure.
  6. Server virtualization will converge with storage and network virtualization. I/O infrastructure virtualization, which applies virtualization functions to storage and networking infrastructure, will continue to evolve very rapidly throughout 2010, with high stakes for vendors to chart out positions in next-generation datacenter infrastructures.
  7. System software will fragment, and heterogeneous virtualization management will become a requirement. As virtualization takes hold across the industry, the operating system will begin to lose its grip on IT as the center of the software universe. Users will apply virtualization to workloads that span multiple departments or business units, and as a result, virtualization management tools will have to support multiple virtualization platforms.
  8. Organizations will build their own "secure" clouds. Despite continued fascination with the prospect of tapping into third-party computing infrastructures, the most pressing concern for the majority of users will be to virtualize as much as possible of their internal infrastructure into "secure" or "private" clouds.
  9. Public clouds will draw startups, and some experimental use by larger organizations. Startups will use cloud-based services to minimize costs and accelerate time to market, and some small companies will move routine workloads to Software as a Service (SaaS) for cost savings. Mid-sized and large enterprise companies will experiment with third-party cloud services by temporarily deploying some production workloads on them.
  10. IT administrator roles will take on a broader scope. With the rise of virtual infrastructure in next-generation datacenters, skill sets that traditionally fell into separate silos for servers, storage, software, and networking, will have to converge. Few people have such converged skills today, and those that do have them will become more valuable.

Edit: single predictions from analysts:

  1. The biggest thing in IT in 2010 will be datacentres, but thanks to virtualisation they will be a tad lighter.
    Martin Atherton from Freeform Dynamics (well, I found this very funny)
  2. 2010: Event Processing transcends niche status, to well-recognized & adopted business technique for real-time visibility & responsiveness.
    Brenda Michelson from Elemental Links
  3. The real hot mobile topic in 2010 will shift from mobile apps to mobile services.
    Michael Gartenberg from Interpret

I will update this post with additional content as it becomes available from other analyst houses in the hope of having a one-place summary.

Additional note: Edelman have had their own stab at gazing into the crystal ball looking at what 2010 and beyond may hold for the Economy, Business, Government and the Media. You can read the full content here.
