Tag: Facebook

  • The changing face of search

    The last week saw some frenetic activity in the online space – a few events that are not just going to cause a shift in the way we search and share online, but could possibly impact the direction in which the web develops henceforth. These are very interesting developments, and not just from a technology standpoint. Apparently, if we go by this, our brains are being hard-wired to love Google, Twitter and texting. That will change the way we evolve as a species. But meanwhile…

    Facebook began the week by acquiring Friendfeed (FB, FF – BFF), something I’d hoped that Twitter would do. For those not familiar with the service, it’s a neat aggregator of most of your activities online (blogs, Twitter, Facebook, delicious, Flickr, YouTube…) and allows others to comment, share, like, search. Yes, most of those features that Facebook has been adding have been lifted from Friendfeed. For several reasons, the service, though extremely useful, has remained geeky.

    The integration is bound to be tricky. While Friendfeed is used mostly as an aggregator (though some publish content exclusively there), Facebook thrives on ‘original’ content. Also, there are features on FF that don’t have a parallel on FB, and perhaps users too. I have different user names in both places, and there are very few who are friends of mine on both networks, and for a reason. I wouldn’t want to import my network on FF to FB. I also don’t import all of my content to FB. In many ways FF was my ‘private’ aggregator, a place where I could aggregate without making it too public. Adapting that on FB would require a lot of work on settings. FF’s stream and its approach to updates are also different from FB’s. So it is quite possible that integration will not happen. But the Friendfeed ‘brain bank’ – people who had earlier built Gmail and co-founded Google Maps – is unquestionably an asset, and one part of me won’t mind the fact that the acquisition will perhaps ensure that the innovations reach a wider audience, and perhaps speed up the learning curve of casual social media users. The other part hopes that they will leave this version of FF intact too, even if it is as FB Labs.

    Facebook’s ‘Lite’ also caused a stir. Several users saw an announcement that they were the chosen ones to test it out – it turned out to be an accident, but it meant that all of us got a preview. It is a lighter, faster-loading version of Facebook, designed to give new users (especially from countries with limited broadband access) a simple experience to begin with.

    Facebook also launched real time search around the same time, and the ability to search shared (friends and public shared) news feeds (of the past 30 days) – status updates, photos, links, updates, Fan pages, with the option of filters – is quite a huge step. In many ways, FB is ‘forcing’ people to be more public to derive the maximum advantage out of the service. As Steve Rubel correctly points out, it has major implications for our consumption of content, making us ‘source agnostic’, which we already are, to a certain extent. Also, as he mentions, the impact of Facebook Connect in this equation means that the net is cast wider. The important factor in this, and the reason why I feel Google needs to take a long hard look at it, is that there is a people filter here, in addition to the algorithm – news feeds of friends, and of people who have chosen to share their FB content publicly, work as a kind of endorsement, a personally tested good source. That might potentially be better than Google’s spiders. I am not even bringing Twitter Search into the equation, because if FB uses FF correctly and gets a majority of Twitter users to bring their tweets into FB (store all but display selectively), then the uniqueness of Twitter Search is gone. Besides, FB has a much larger user base anyway.

    Yes, Google is watching, flexing its muscles, and developing a few new ones too. On the day that Facebook dropped its big news, Google also unveiled the next generation of its own search – Caffeine. According to them, “It’s the first step in a process that will let us push the envelope on size, indexing speed, accuracy, comprehensiveness and other dimensions.” More than an upgrade, it seems like a completely new architecture that will change the way Google indexes pages, and these changes also cover real time. Meanwhile, it’s also playing with new forms of product ads.

    Google is also getting a bit more serious about ‘social’, and that is perhaps the reason behind iGoogle getting a facelift with 18 new widgets on the homepage. I’m not much of a user of this service, but according to RWW, Google is slowly unleashing the services built on OpenSocial, and trying to make iGoogle the hub of a user’s Google activities – and sigh, there are quite a lot of them. There are Facebook-like update feeds (of friends), a shareable To Do gadget, and a Scrabble gadget (hmm, that’s appealing), among other things. But the integration is not complete, as shown by the YouTube widget and the absence of a Reader widget. But as I always say, the potential, if they actually manage to integrate all of this and then add Wave features on top of it, is scary. But perhaps (since the social graph – i.e. who sees your comments and shares – is different) iGoogle is not meant to be connected with others.

    The last announcement from Google was on the subject of Reader. In addition to the recent social developments, Reader items can now be shared easily to other networks including Twitter, Facebook, Digg, MySpace, Blogger etc. Also, some tweaks to the ‘Mark all as read’ feature make it a lot more useful now. You can read the details here. But hey, Google, how about bringing Reader closer to real time?

    Meanwhile, in the midst of these killer shark wars, the ‘whale’ boys have their own bogeymen. In addition to the wave of DDoS attacks, and the fact that Facebook grew twice as fast in July, the Gartner Hype Cycle white paper for 2009 has stated that microblogging has tipped over the peak and is about to enter the ‘Trough of Disillusionment’. But I am not sure I agree with that. Microblogging, as Seth Godin once stated (about Twitter), is a protocol (nailed it brilliantly!!); what gets transmitted across it is a variable. It’s news and links now, and who knows, a smart user/set of users might figure out something else tomorrow that would cause yet another disruption. Perhaps Gartner meant it only in the current context of usage. Twitter has just announced phase one of Project Retweet, which is aimed at changing the way the RT format works and looks. While it does pose some inconvenience – we are used to the current RT @ format and will perhaps take some time to get used to seeing just the original tweet with a small ‘RT by’ (reminds me of Friendfeed’s ‘Like’) – I am hoping that the open API means developers will deliver some useful stuff to us (retweets by/to me, timelines of retweets of my tweets, the lessening of clutter, as Mashable points out). But honestly, these seem to be small efforts when compared to those of Google and FB.

    Interesting indeed. Rather than conspicuous face-offs, Facebook and Google are warily circling each other, launching and tweaking services that probe each other’s strongholds. An elaborate game of chess that doesn’t look like it will end anytime soon. Stalemate? Though it could be argued that there is space for both, I am inclined to think that the margin of advantage between the leader and the second best will be very high. The battle is for understanding consumer intent and making a revenue stream out of it. Google did that without much competition in search, until specific competition (Bing), real time and social media made threatening noises. Facebook’s appeal was on both those fronts, and now Google is making advances there. But Google is rich and now even has a browser with which it can define the starting point and direction of a user’s web experience, while Facebook’s revenues are still iffy. Facebook users have shared so much content inside the ‘walled garden’ that it’ll be difficult to get out even if they desire it. Not that Google is an angel on that count. (You must see this hilarious Onion video – Google’s opt out village) And now with Friendfeed, FB can lay its hands on Google content too – YouTube, Blogger etc can all be pulled into Facebook. But one can never say what will happen if they rub users the wrong way while trying to accelerate revenues.

    What would I like to see? Microsoft buying out Facebook. Perhaps then, we’ll have a fight that’s really too difficult to call.

    until next time, which service is your BFF? 🙂

    Bonus Read: John Borthwick’s ‘The rise of social distribution networks’.

  • Paper Capers

    It’s been almost 2 months since we last discussed newspapers, so I thought it was a good time for an update. Rumour is that Murdoch plans to sue Google and Yahoo over news services. Fact is that he’s going to charge for news, something he’s been doing for a while with WSJ, and the ‘experiment’ is going to start with The Sunday Times. Others are set to follow his example. “Quality journalism is not cheap,” said Murdoch. “The digital revolution has opened many new and inexpensive distribution channels but it has not made content free. We intend to charge for all our news websites.”

    I, for one, am happy, because the keywords for me are ‘quality journalism’. It’s perhaps a prelude to a shakeout, and the survival of only those who can adapt to a world with the internet. With the width and depth of content available, the debate of ‘free vs paid’ has been going on for a while now. But perhaps the time has come to end it. Build the wall, and let’s see if people want to pay to enter. (That link is an excellent read, detailed and thought through – check it out.) Opinions are bound to vary, and to be in extremes. Most people feel that it is flawed. Chris Anderson feels that at some point in the future, “maybe media will be a hobby rather than a job”; Vivian Schiller, former senior vice president and general manager of NYTimes.com, believes that “people will not in large numbers pay for news content online”, but that there’s still space for an NYT to cut expenses and survive. Murdoch obviously believes he can get the audience to pay.

    Meanwhile, the Associated Press is planning to charge $2.50 per word if 5 words or more are quoted from its articles, with the help of a microformat. Not surprisingly, it has been widely criticised, in varying tones, all over the web. Jeff Jarvis even has a post on ‘How (and why) to replace the AP‘, and illustrates the interesting concept of ‘reverse syndication’. Chris Ahearn, at Thomson Reuters, implores entities that are declaring war on the link economy to stop whining, and stands ready to help those who wish for an alternative to AP.

    Interestingly, Google had recently quadrupled its newspaper archives. (Locally, Dainik Jagran is now part of Google’s News Archive Partner Programme, and has a strategic deal with Google to help the group archive its bilingual daily, Inext) The average newspaper’s stance on Google is understandably ambivalent. On one hand, it is happy to get the traffic from Google, but it’s not happy that it’s only one among the websites shown, or with the amount of content that Google shows. (That might prevent a reader from visiting the site.) Sometime back, Google had posted its views, and how any publication can block search engines with a slight change in code.
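
    (For reference, that ‘slight change in code’ is essentially the standard robots exclusion mechanism – a couple of lines in robots.txt, or a robots meta tag on individual articles, keeps Googlebot out. A minimal sketch using the generic, well-known directives rather than whatever Google’s post actually showed; the /news/ path is just an example:)

    ```
    # robots.txt at the publication's site root
    # Keep only Google's crawler out of the news section...
    User-agent: Googlebot
    Disallow: /news/

    # ...or keep every crawler out of the whole site
    User-agent: *
    Disallow: /
    ```

    Individual articles can do the same with a <meta name="robots" content="noindex, noarchive"> tag in the page head.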

    The reactions to this obvious ‘transition stage‘ for the newspaper industry have been taking many forms. Paywalls and boycotts are only one kind. Alternative methods of news collection like crowdsourcing + crowdfunding, public collaboration (an interesting case, for more than this reason), nichepapers, and ways in which journalists can use tools like Facebook and Twitter are being discussed, as well as radical ideas like making the newspaper a gateway for participative experiences, even as technological developments – a touch-screen ‘intelligent plastic’ roll-up reader, and experiments from NYT (‘What we are reading‘) – continue.

    While it would be easy to say that these are trends in the West that are not very relevant to India at this stage, I’d still say that these are trends that media in India, especially newspapers, should be closely watching and learning from. There’s a good read from Pradyuman Maheshwari at e4m on the same subject. While the Nielsen Online Global Survey on trust, value and engagement in advertising shows that newspapers are the most trusted form of paid advertising (in India), the TCS study on Indian urban school children shows that they are extremely technology savvy and totally at ease with the web and social media.

    As stated in the TCS study, “This societal trend has important implications for parents, educators, policy makers, as future employers as well as companies and brands that want to sell to tomorrow’s generation.” Some understand this, and have started experimenting with new forms of distribution. I just got a mail asking me to check out Star Player!! The point is that one can never be sure whether the trends in the US will be replicated in India, though I’d say it’s more a ‘when’ question than an ‘if’, even though India’s version of the trends would be mutated, thanks to its own socio-cultural and economic peculiarities. But it helps to be prepared. I read at Medianama a few days back that The Hindu is taking Ergo, its 5-day-a-week publication aimed at young professionals in Chennai, online. Though the motive might have been cost saving, I’m sure it will be a great lesson in understanding consumption patterns and figuring out revenue streams. I quite liked the site – powered by WordPress, with a very casual ‘About’ page, and covering some interesting stuff. It looks like an online news site, not the website of a newspaper.

    In hindsight, the collision was bound to happen. Newspapers subsidised news for the reading audience by making advertising pay for it. Google aggregated content and served ads in context. They had to meet somewhere, and disagree on who makes how much. The concern areas for newspapers are manifold – news consumption has changed, quantitatively and qualitatively; modes of creation and distribution have changed; and Google has developed a much better advertising model. In essence, all entities in the publishing business have changed – producers, consumers, advertisers. Isn’t it inevitable that the publisher has to find a new business model? Newspapers in India still have some time on their hands, and some good tools too. With most publishing houses having multiple products that cater to specific audiences, they can actually experiment in different directions. It does cost money to create good content; the trick obviously is to figure out ways to minimise the cost and work out how much each stakeholder – reader and advertiser – is willing to pay for it. Now would definitely be a good time to start, unless you want to sound like the (as usual) hilarious Onion story – “Why did no one inform us of the imminent death of the American Newspaper industry” 🙂

    until next time, think about the link economy

  • Reading beyond the obvious

    As a regular user of Google Reader, I was happy to see that, a couple of weeks back, Google deemed it important enough to carry out a few changes – a ‘like’ button, the ability to follow specific people (using Reader Search), and friend groups (with customisation options for who sees what content). The public nature of the ‘Like’ button meant that sharing on Reader got a lot more social, though it had its share of detractors too. Many complained about not wanting to see “likes from the unwashed masses”, and Google corrected it by adding an option in the Settings, so that if you so desired, you could see only the ‘Likes’ from people you followed.

    As a regular user, I’d say that people who give only partial feeds stand to lose out a bit on the ‘Like’ part. It would also be great if the time lag between publishing and the post appearing on Reader could be reduced. As a publisher, I wish Google would tie these social features in Reader to Google Analytics, so that I can know who shared/liked my posts. One way to know the number of ‘Likes’ is to subscribe to your own blog, but I’m sure that Google can make it easier if they want. Then maybe a plugin that can show these details on my post (at the site), much like the Tweetmeme plugin I have installed on my other blog. Speaking of Tweetmeme, according to VentureBeat, the button is now shown more than 50 million times a day across the web. It has its share of competitors, and is even threatening to sue one.

    That number gives a rough idea of why Google wants a piece of the sharing pie. In fact, this chart, created by AddToAny (the same guys who gave us that awesome widget at the bottom of my posts) shows how sharing happens on the web. Facebook leads, followed by email and Twitter. Google, though dominant in search, would be looking closely at specific competition – the Yahoo-MS deal and how Bing’s interesting games shape up. But more importantly, it also has to keep an eye on how generic search and sharing (social) are changing and shaping each other’s future. Twitter just got itself a new homepage, and “Share and discover what’s happening right now, anywhere in the world” clearly shows the intent. I thought it even answered, to a certain extent, the oft-heard question – “But what do I do on Twitter?” Call it discovery/recommendation/trend, but it is just a different perspective on search. And it’s not just Twitter – Friendfeed recently added a ‘recommend friends’ feature. No, silly, not the Orkut/LinkedIn type – if you feel a subscriber would also like the feed of someone you subscribe to, you can share it easily. Though it’s nothing radical, it’s helpful for new folk.

    The Nielsen Global Online Consumer Survey shows recommendations (from known people) as the most trusted source of advertising, at 90%, with consumer opinions posted online next at 70%. Among Indian audiences, recommendations top the list, but editorial is placed second. A post on the Six Pixels of Separation blog talks about how the next ‘Google’ will be a referral engine, which ranks websites not on the basis of text optimisation, but on what people have said and done there, and how the information there has been used by people. But there are challenges there too, as such a system needs to incorporate relevance, immediacy and trustworthiness, and have an interface that will display it in the most intuitive, easy manner possible. This post on RWW discusses the concept of Social Relevancy Rank, with five layers, where search results on streams (like Twitter, which already has real time) will be arranged on the basis of relevance to your social graph. Friendfeed does this, and provides more options in Advanced Search. The post also suggests ‘friends of friends’ as the next layer of results, and a concept of ‘taste neighbours’ (a mining of ‘people who liked this also liked’) after that. The last two layers are made of influencers and the crowd aggregate. In fact, I thought a possible visualisation would be to actually have all five layers arranged vertically side-by-side and a thumbs up/down button by the side of each search result, so that each user can contribute to filtering. Is this a perfect method? No, but then neither is Google’s PageRank, as the author says. Which perhaps is why Google, while it is the master of algorithmic search, needs to experiment with Reader and see if it can create a social layer on top of its PageRank-based search system. A new system that also incorporates the data from likes and shares beyond the optimised keywords, and is able to operate in real time too. Possible? That would be fun, and would even take AdSense to a whole new level. 🙂
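
    Just to make that five-layer idea a bit more concrete, here’s a rough Python sketch of how such a social re-ranking might look. The layer names follow the RWW post, but the weights, data structures and function names are entirely my own invention for illustration – this is not how any of these services actually implement it:

    ```python
    # Illustrative only: re-rank search results by "social distance" layers,
    # loosely following the Social Relevancy Rank idea discussed above.
    # Weights and structures are invented for the sketch, not taken from any real system.

    LAYER_WEIGHTS = {
        "friends": 1.0,            # people in my social graph
        "friends_of_friends": 0.6,
        "taste_neighbours": 0.4,   # 'people who liked this also liked'
        "influencers": 0.3,
        "crowd": 0.1,              # aggregate likes/shares from everyone else
    }

    def social_score(item, viewer):
        """Combine the item's base relevance with who (socially) endorsed it."""
        score = item["relevance"]  # the usual keyword/recency relevance
        for endorser in item["endorsed_by"]:
            layer = viewer["layer_of"].get(endorser, "crowd")
            score += LAYER_WEIGHTS[layer]
        return score

    def rank(results, viewer):
        return sorted(results, key=lambda item: social_score(item, viewer), reverse=True)

    # Example: a link liked by a friend outranks one liked only by an influencer.
    viewer = {"layer_of": {"asha": "friends", "rob": "influencers"}}
    results = [
        {"url": "a.com", "relevance": 0.5, "endorsed_by": ["rob"]},
        {"url": "b.com", "relevance": 0.5, "endorsed_by": ["asha"]},
    ]
    print([r["url"] for r in rank(results, viewer)])  # -> ['b.com', 'a.com']
    ```

    The thumbs up/down idea from the visualisation above would simply feed back into those per-layer weights over time.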

    So what does this mean for brands and marketing? Beyond mastering the algorithm, optimising all the queries, mining all the data and connecting it, how does differentiation happen, other than the obvious product possibilities? This very interesting article (via @vijaysankaran) discusses the battle between art and algorithm. Amidst the quest for perfect targeting, and the smoothing out of our search experience, we might be losing out on serendipity. The author goes on to say that in this ‘end of surprise’ lies the opportunity for marketing – to deliver revelation along with relevance. The perfect blend of left-brained analytics and right-brained creativity and emotion, which seemed to have been lost somewhere in between.

    until next time, search and socialise 🙂

  • Revenge of the corporate website?

    A few weeks back, there was a discussion on one of the LinkedIn groups I’m part of, on whether the corporate website is becoming irrelevant, and whether there is a tendency to make it more social. It was based on a post by Jeremiah Owyang from a couple of years back, on how to evolve the corporate website. Coincidentally, I also caught a post by Jeremiah on the same topic a couple of weeks back, which talked about brand websites becoming aggregators of the conversations happening around the web.

    This is a topic I have written about earlier, but with the rapid evolution of tools in the last few months, this would be a good time for an update. The tools have been evolving – Facebook, Twitter, Flickr, YouTube – and on each of them are built communities, which are finding newer ways and more mechanisms to express themselves on topics, and that includes brands. The aggregation is happening within the networks themselves, and there are ways to take the conversation outside the networks. I’d written last week about Facebook’s Live Stream Box, which allows updates to be streamed on external sites. Center Networks has an interesting post that talks about how Friendfeed can take over the forum/bulletin board world. I also read about one of the pioneers in the user generated content space – MouthShut – planning to tap into social media marketing to reach out to customers, and giving free accounts to brands. On an aside, they are also planning to hire a couple of folks to handle this, so SMEs (Social Media Experts now 😉 ) might want to check it out.

    Meanwhile, AdAge has a very interesting post on how, even though Twitter and Facebook have grown as feedback and customer-service channels, the product review has also been growing in importance thanks to its more structured nature. The post also rightly points out that in addition to listening skills, it is important for brands to develop a culture that can respond to the feedback that’s now perhaps coming in torrents.

    In my earlier post, I had wondered if the reason behind brands’ reluctance to join conversations on networks, and their sticking to their own, often static, websites, was their liking for control. The other reason I had thought of was the ability to ‘measure’. Things have moved on, and we now see many brands creating Twitter accounts and Facebook pages. While many of us bemoan the lack of a concrete plan behind such efforts, it is still a step forward. Even the Skittles episode, which many people ridicule, was a significant experiment to me. They tried something, they learned, they moved on. Measurement is still a much debated subject in the social media space. There’s nothing stopping brands from utilising traditional measurable methods of web marketing and also having ‘unmeasurable’ conversations on the side.

    If brand websites are guilty of missing the bus on involving existing/potential consumers when the conversations on social media platforms were still at a nascent stage, this perhaps is the time they can redeem themselves. Indeed, brands have started listening to, and acting on the basis of, consumer feedback. As newer and better monitoring tools crop up on a regular basis, this is becoming easier. But for now, all these communities perhaps prefer the conversations to happen on the ‘unofficial networks’, as opposed to the corporate website.

    Perhaps brands could try to figure out why that is so; this would help them evolve objectives and a strategy for the website. Going further, it would also give them an understanding of how they could tweak their internal structures to create sustainable processes that can tackle the challenges an evolving web throws at them. This is perhaps even an imperative if the mob justice I’d written about last week becomes a trend. But that would be a negative way of looking at it. An interactive website that (without bias) pulls in ‘relevant’ conversations from around the web and gives customers more perspective would definitely be appreciated. By treating consumer feedback with the respect it deserves, brands would not only be giving more credibility to their website, and increasing the number of conversations that happen there, but perhaps even creating evangelists who would help the brand by proactively giving it relevant feedback and even standing up for the brand in case of bad PR, or at the very least, considering issues objectively. But then this is as much a culture and process change as it is a web design change.

    until next time, homepage with branch offices..

  • Mob bile

    Facebook recently launched the Live Stream Box, which allows webmasters to stream relevant real time status updates on their site. Users can log in with Facebook Connect and post updates that will appear on Facebook (on their own profile, as well as to friends, depending on their settings) as well as on the site. It means that if, say, I’m watching a live stream of an event on a site that has this installed, I can use it to get my friends on FB to join the conversation. Two things struck me – one, it makes for a whole new way of connecting friends around their topic of interest (context); and two (a question), is this a step aimed at bettering Twitter’s common lifestream and hashtag-based way of aggregating conversations? (Something that Facebook has lacked so far.)
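
    (For the curious, embedding it is essentially a matter of dropping Facebook’s XFBML tag into the page along with the Connect JavaScript library. The snippet below is from memory of the 2009-era docs, so treat the exact file names and attribute names as illustrative rather than gospel:)

    ```html
    <!-- Rough, from-memory sketch of a Live Stream Box embed, circa 2009; illustrative only -->
    <script src="http://static.ak.connect.facebook.com/js/api_lib/v0.4/FeatureLoader.js.php"
            type="text/javascript"></script>

    <!-- the stream itself; event_app_id is the Facebook application id for the site/event -->
    <fb:live-stream event_app_id="YOUR_APP_ID" width="400" height="500"></fb:live-stream>

    <script type="text/javascript">
      // initialise Facebook Connect so visitors can log in and post updates
      FB.init("YOUR_API_KEY", "/xd_receiver.htm");
    </script>
    ```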

    As all the services increase their focus on real time, I couldn’t help but think of the impact it has had on usage. Are the users of these services becoming increasingly trigger-happy? TC had an article recently titled “Friendfeed, syphilis and the perfection of online mobs“, which talked about the service being a hotbed of mob justice enthusiasts (because of its ability to aggregate conversations in one place). It’s a subject that I have discussed here earlier – once in the Hasbro-Scrabulous context, and then collating 3 separate incidents. I must say that we have moved on since then – to places closer to home – the latest being The Kiruba Incident involving Cleartrip (The Kiruba version). In many cases, the mob doesn’t even pause to check the facts or look at the issue objectively/rationally before it reacts. With all kinds of people out there, I wonder how long it will be before someone decides to use more than just the keyboard and look at real-world justice options. (Actually, it has happened before.)

    So, what would the effect of all this be on brands? Would they be able to keep up? Would they be able to deal with an angry mob? Real time is a reality, and it would be more of a loss if brands decided not to use Twitter. It’s a different matter whether they choose to engage or are content with listening. There are quite a few tools out there which can help monitor the conversations, but what if the brands are not wired enough to respond effectively to the fires that happen? In this context, I read an interesting article on AdAge that talks about Slow Marketing. It talks about going back to the basics, and a need to focus on human, one-to-one connections.

    The responsibility is on both sides. In their eagerness to cash in on the next big thing and create buzz, brands (and the agencies that advise brands) set expectations that may be way beyond what the organisation behind the brand can actually meet – in this context, perhaps turnaround speed, and response to all communication directed at it. From the article:

    Pick your battles: The social-media feeding frenzy puts a premium on responding to all conversation. You don’t need to respond to everything. Take a step back before diving in. In some cases, not engaging is the best form of engagement.

    The responsibility lies with users too. Long before there were brands on the real time platforms, there were people. And people used to help newbies learn the protocols of communicating on the network. If you were a user, you wouldn’t want to be in a place where people were only out to make fun of you or do you harm. Maybe we should extend that courtesy to brands too, and allow some leeway, at least in terms of reaction time. In many cases, the person behind the handle will be just another enthusiast like you, with hardly any support from the organisation, trying to show his bosses the value that these services can provide. All of us have favourite brands which, if they use social media effectively, will end up being more useful to us. By making witch-hunts a standard operating procedure, we might be doing more harm than we realise.

    There is an interesting discussion online that talks about company websites and their return to favour, but more on that next week 😉

    until next time, see you later