By some definitions, "Web 3.0" will be characterized by semantic mapping of data. Unlike a regular search, which mines information based on the keywords you type in, a semantic search looks for the information you want by connecting the meanings of words. Say, for example, you type in the word "cold." The way search engines like Google and Yahoo run now, you would get results based on the word alone. But "cold," like many words in the English language, is ambiguous and could refer to anything from an illness to the temperature. A semantic map would give computers the ability to understand words in context, through tenses and sentence structure, much like the human brain does. One semantic map released this week claims to teach computers the meaning of words through more than 10 million semantic connections.
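As a rough illustration only (not Cognition's actual method, and far simpler than a real semantic map), a toy disambiguator might pick a sense of "cold" by checking which sense's related words show up in the surrounding text:

```python
# Toy word-sense disambiguation: choose the sense of an ambiguous word
# whose related terms overlap most with the surrounding context.
# (Illustrative sketch -- a real semantic map uses millions of connections.)

SENSES = {
    "cold_illness": {"flu", "sneeze", "fever", "symptom", "medicine"},
    "cold_temperature": {"winter", "freezing", "snow", "weather", "jacket"},
}

def disambiguate(context_words):
    context = {w.lower() for w in context_words}
    # Score each sense by how many of its related words appear in context.
    return max(SENSES, key=lambda sense: len(SENSES[sense] & context))

print(disambiguate(["I", "caught", "a", "cold", "and", "have", "a", "fever"]))
# -> cold_illness
```

A keyword engine, by contrast, would treat both uses of "cold" identically, which is exactly the ambiguity the article describes.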
Companies like Microsoft have invested in such technologies already (in July it bought three-year-old start-up Powerset, which also specializes in searches based on meaning). But the map released this week is the world's largest, according to its creator, Cognition Technologies, Inc. That's not hard to believe, considering that it is built on more than 20 years of research and offers mappings for 99.9 percent of the standard English dictionary. The company has a staff dedicated to adding new words, slang, and colloquialisms to the catalog.
At the moment, Cognition doesn't offer a search engine for the general consumer market, preferring to target industry-specific clients, and the company is in talks to license its map for specific search tools. In the meantime, those who are curious can test out its technology at Cognition.com through one of three search portals: legal, health, and Wikipedia.
It's important to make a distinction here between "semantic" as in "Semantic Web" and semantics, the study or use of meaning in language itself. Cognition, Powerset, and other such services are actually about Natural Language Processing. This is a complementary area of development to the Semantic Web, but by no means its main proponent.
The Semantic Web is about making information machine-readable through RDF (Resource Description Framework), Microformats, and other methods that wrap information in machine-readable bundles, so the software dealing with the user's request is able to use the information as data. Language processing is a parallel area of development, about allowing people to interact better with machines through more naturalistic linguistic processes.
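To see what "machine-readable bundles" buys you, here is a minimal sketch using plain Python tuples to stand in for RDF's subject-predicate-object triples (the facts and predicate names below are taken from this article for illustration; a real system would use proper RDF vocabularies and a serialization such as Turtle or RDF/XML):

```python
# RDF models information as subject-predicate-object triples, so software
# can query facts directly as data rather than parsing prose for meaning.
# Toy in-memory triple store for illustration only.

triples = [
    ("Powerset", "acquiredBy", "Microsoft"),
    ("Powerset", "specializesIn", "semantic search"),
    ("Cognition", "released", "semantic map"),
]

def query(subject=None, predicate=None, obj=None):
    # Return every triple matching the pattern; None acts as a wildcard.
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

print(query(subject="Powerset"))
# -> [('Powerset', 'acquiredBy', 'Microsoft'),
#     ('Powerset', 'specializesIn', 'semantic search')]
```

The point of the Semantic Web is that once facts are structured this way, software can answer a request like "who acquired Powerset?" by pattern matching over data, with no natural-language understanding required, which is why NLP and the Semantic Web are parallel rather than identical efforts.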