Stupid question, but I’m going to ask it anyway: have you Googled anything lately? Of course you have. Everyone has. I mean, it’s Google. Duh.
At bloomfield knoble, we don’t just use Google, we study Google. We ponder, pontificate, process and a bunch of other fancy-sounding “p” words about it. Why? Because we have to. Google is an essential component of our integrated marketing efforts at bloomfield knoble. We’re always worried about SEO, and more often than not we’re using SEM as well (in addition to everything else we do that makes up integrated marketing). So when we hear something about Google – specifically, that Google is doing something new that could affect the way we do things – our ears perk up and we pay attention.
As such, my ears (well, more my eyes, since I was reading at the time) perked up when I saw an article by Hal Hodson in a recent issue of New Scientist. According to Hodson, Google is adapting its search model.
The Internet is stuffed with garbage. Anti-vaccination websites make the front page of Google, and fact-free “news” stories spread like wildfire. Google has devised a fix – rank websites according to their truthfulness. Google’s search engine currently uses the number of incoming links to a web page as a proxy for quality, determining where it appears in search results. So pages that many other sites link to are ranked higher. This system has brought us the search engine as we know it today, but the downside is that websites full of misinformation can rise up the rankings, if enough people link to them.
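To see why link counting rewards popularity rather than accuracy, here is a toy sketch (a drastically simplified stand-in for Google’s actual link analysis, with made-up site names): rank pages purely by how many other pages link to them.

```python
# Toy link-count ranking: a page's rank is just its number of inbound
# links. All site names here are hypothetical.
from collections import Counter

links = [  # (from_page, to_page) pairs
    ("a.com", "news.com"),
    ("b.com", "news.com"),
    ("c.com", "hoax.com"),
    ("d.com", "hoax.com"),
    ("e.com", "hoax.com"),
]

# Count inbound links per destination page.
inbound = Counter(to for _, to in links)

# Sort pages by inbound-link count, highest first.
ranking = sorted(inbound, key=inbound.get, reverse=True)
print(ranking)  # ['hoax.com', 'news.com'] -- popularity wins, not accuracy
```

The misinformation-heavy page outranks the accurate one simply because more sites point at it, which is exactly the downside the article describes.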
A Google research team is adapting that model to measure the trustworthiness of a page, rather than its reputation across the web. Instead of counting the incoming links, the system – which is not yet live – counts the number of incorrect facts within a page. “A source that has few false facts is considered to be trustworthy,” says the team. The score they compute for each page is its Knowledge-Based Trust score. The software works by tapping into the Knowledge Vault, the vast store of facts that Google has pulled off the internet. Facts the web unanimously agrees on are considered a reasonable proxy for truth. Web pages that contain contradictory information are bumped down the rankings.
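The idea can be sketched in a few lines. This is only an illustrative toy, not Google’s actual algorithm: the mini “knowledge vault” and the facts below are invented, and the score is simply the share of a page’s checkable facts that agree with the reference store.

```python
# Toy Knowledge-Based Trust-style score (illustrative only, not Google's
# actual system): score = fraction of a page's checkable facts that
# match a reference store of accepted facts.

# Hypothetical mini "knowledge vault": (subject, attribute) -> value
KNOWLEDGE_VAULT = {
    ("Eiffel Tower", "city"): "Paris",
    ("Water", "formula"): "H2O",
    ("Earth", "shape"): "oblate spheroid",
}

def trust_score(page_facts):
    """Return the fraction of a page's checkable facts that are correct."""
    # Only facts the vault knows about can be checked at all.
    checkable = [(s, a, v) for s, a, v in page_facts
                 if (s, a) in KNOWLEDGE_VAULT]
    if not checkable:
        return None  # nothing to verify
    correct = sum(1 for s, a, v in checkable
                  if KNOWLEDGE_VAULT[(s, a)] == v)
    return correct / len(checkable)

page = [
    ("Eiffel Tower", "city", "Paris"),  # agrees with the vault
    ("Water", "formula", "HO2"),        # false fact: drags the score down
]
print(trust_score(page))  # 0.5
```

A page with few false facts scores near 1.0 and would be treated as trustworthy; pages contradicting the vault sink, regardless of how many inbound links they have.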
There are already lots of apps that try to help Internet users unearth the truth. LazyTruth is a browser extension that skims inboxes to weed out the fake or hoax emails that do the rounds. Emergent, a project from the Tow Center for Digital Journalism at Columbia University, New York, pulls in rumors from trashy sites, then verifies or rebuts them by cross-referencing them with other sources. LazyTruth developer Matt Stempeck, now the director of civic media at Microsoft New York, wants to develop software that exports the knowledge found in fact-checking services such as Snopes, PolitiFact and FactCheck.org so that everyone has easy access to it. He says tools like LazyTruth are useful online, but challenging the erroneous beliefs underpinning that information is harder. “How do you correct people’s misconceptions? People get very defensive,” Stempeck says. “If they’re searching for the answer on Google they might be in a much more receptive state.”
It’s immediately obvious that establishing trustworthiness will become an integral (if not central) part of marketing campaigns in the future. Or (for those of you old enough to remember) you can go this way:
A STEM (Science / Technology / Engineering / Math) graduate and COO of bloomfield knoble, Thomas exemplifies the view that advertising is becoming an engineering discipline. He leads the integrated insights and strategic planning group in a way consistent with bloomfield knoble’s goal of bringing a strong analytical foundation to uncover fresh and innovative insights and business opportunities.