I want to ask you a question and I want you to really consider it. Prepare yourself . . . Ready?
Here it is: Who is smarter? A Chimpanzee or a Dog?
If you picked the dog, raise your hand. What? Nobody? Even the dog owners out there, you betrayed your best friend?
Well, yes, chimps have far more raw brainpower than dogs. They can use sign language, operate simple machines and perform complex tasks. But are they actually smarter?
That depends on how you define smart. There is one thing a dog will do that even the most highly trained chimpanzee will never do: read its environment for context and act on that information. If you put two containers in front of a dog, one of them hiding a treat, the dog will pause and look to you for a clue. If you point to the container hiding the treat, the dog will head right over to that container and get it.
As this study shows, if you do the same thing with a chimp, the chimp will totally ignore your cues.
Dogs look for context and put that context into action. They do it in the same way humans do – reading your face for telltale emotional clues to see if information can be gleaned. From there they look for signals like a pointed hand or, as researchers discovered, the direction of your gaze. Yup – you can give a dog information with just your eyes.
So is a dog smarter? For social intelligence and context, absolutely. And isn’t that really what matters? No matter how much raw brainpower you have, if you apply it without context, it’s really not meaningful.
Same thing applies for Big Data and search. Big Data is like the chimp — a lot of capacity with big volumes of data and content, server farms for massive processing and storage. But without context all that data isn’t really that smart.
Then how do we make that data smart? The first step is to put intelligence into the data and content. What is the content about? And how do the pieces relate to each other? With a mix of tagging, text mining and data processing, the data can be enriched. And with semantic models it can be linked to taxonomies and ontologies to give each item relevance to an expert domain or individual context.
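To make the enrichment step concrete, here is a minimal sketch of the idea: tag a piece of content by matching terms from a tiny hand-built taxonomy, then expand each tag with its broader concepts. Everything here (the taxonomy, the function names) is a hypothetical illustration, not any particular product's API.

```python
# Hypothetical, hand-built taxonomy: each term maps to its broader concept.
TAXONOMY = {
    "beagle": "dog",
    "retriever": "dog",
    "dog": "animal",
    "chimpanzee": "animal",
}

def broader_terms(tag):
    """Walk up the taxonomy from a tag to all of its broader concepts."""
    terms = []
    while tag in TAXONOMY:
        tag = TAXONOMY[tag]
        terms.append(tag)
    return terms

def enrich(doc_text, known_tags):
    """Naive tagging: find known taxonomy terms in the text, then
    expand each match with its broader concepts."""
    found = [t for t in known_tags if t in doc_text.lower()]
    expanded = set(found)
    for t in found:
        expanded.update(broader_terms(t))
    return sorted(expanded)

tags = enrich("Our beagle ignored the chimpanzee entirely.",
              ["beagle", "retriever", "chimpanzee"])
print(tags)  # ['animal', 'beagle', 'chimpanzee', 'dog']
```

Real systems use statistical text mining and formal ontology languages rather than string matching, but the payoff is the same: a document about beagles can now answer a query about dogs, or animals, because the relationships are explicit.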
But this is all still just raw brainpower – the content and data now have even more capacity – but what questions can they answer?
Traditional search and BI tools rely on the user to find their way through this data and content. There are lots of advances to let you see the shape of the data, and navigators to help you refine your search. But you are always starting at step 1: do a search, run some analysis. And then you can use all that big data horsepower any way you want to get to your answer. And, by the way, if you get the wrong answer, or it takes you five times as long to get the answer, it's not the machine's fault! It just processes away, ignoring the clues about what you are really trying to achieve. Sounds like a certain chimp we were discussing . . .
What if, instead, the big data systems knew what you wanted to do? What if they were sensitive to your location, to your device and your movements and, from these clues, were always looking to understand the task you are trying to accomplish?
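What might "knowing what you want to do" look like in practice? Below is a hypothetical sketch of a context-aware query: the same search string gets reshaped by whatever the system knows about the user's device, location and task. All field names are illustrative assumptions, not a real search API.

```python
def contextual_query(text, context):
    """Combine a raw query with contextual signals.
    All field names here are illustrative, not from any real API."""
    query = {"text": text}
    if context.get("device") == "mobile":
        query["format"] = "short"             # prefer skimmable results
    if "location" in context:
        query["boost_near"] = context["location"]
    if context.get("task") == "validate_findings":
        query["scope"] = "sources_like_mine"  # narrow, confirmatory search
    else:
        query["scope"] = "broad"              # exploratory search
    return query

q = contextual_query("canine social cognition",
                     {"device": "mobile", "task": "validate_findings"})
print(q)
```

The point of the sketch is the shape of the interaction: the user typed one thing, but the system, like the dog watching your eyes, folded in everything else it could observe before acting.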
Searching for information is taking up more and more of the information worker's time. Outsell reports that the amount has grown 13 percent since 2002, and others put the total at between 20 and 30 percent of an information worker's time. Thirty percent! That's a full day a week spent not producing work, just churning through tools to find what you need to do your work.
This is because the nuance of each individual information task is lost on traditional tools. There is a difference between research for new ideas and research to validate your findings. One might look at the topics and trends in research – who is writing what – how do various disciplines intersect with each other? The other is narrow – who is doing things just like me, using the same sources as me and what did they find?
The same is true for almost every task: gathering data for a report versus competitive intelligence, preparing an industry presentation versus tracking a trend. Each has its own pattern of information access and steps … and with traditional tools all you get is a blank slate every time. It's up to you to build up the information to get your answer. And then, when you switch to a new task, you have to do it all over again.
What if the tools were more like the dog? What if they took into account your context?
Well they can!
- Springer Images – built to aid researchers who are creating new materials, it gives you access to just the components you need, i.e., images and tables. Then it does part of your job for you -> it will make the PowerPoint with the references all filled in, because this is (sadly) what most information workers are actually creating.
- CQ Floor Video – instead of sending you links to videos, it lays out the whole day of congressional activity so you can get to the part you want to see. It's targeted to that task.
- BBC iPlayer is location and device sensitive – it gets you the show (or not) dynamically. It’s a big leap -> every request to a major public site is custom!
These applications are tailored to the end-user’s role — and provide access to the content and data tailored to that role. This is the big leap ahead for us: The leap from search and BI to information applications.
It’s almost like these applications have a dog inside them – watching what we want to do and tailoring their reactions to us, the users.
So let's not be like the chimpanzee, all horsepower, all big data. It's context that really matters, so let's look to the dog, the champion of social awareness, for a lesson in what actually counts as smart.
Footnote for the study:
Scheider, L., Kaminski, J., Call, J., & Tomasello, M. (2012). Do domestic dogs interpret pointing as a command? Animal Cognition, 1–12. Published November 9, 2012.