Throw a huge amount of information into a computer and let it figure out the problems and solutions on its own. By plotting and analyzing data points, computers can find correlations that no human could ever discover. It sounds Big Brother-like, but I was surprised to find out that this is how Google’s translator works:

Google uses the same philosophy of learning via massive data for their translation programs. They can translate from English to French, or German to Chinese by matching up huge datasets of humanly translated material. For instance, Google trained their French/English translation engine by feeding it Canadian documents which are often released in both English and French versions. The Googlers have no theory of language, especially of French, no AI translator. Instead they have zillions of datapoints which in aggregate link “this to that” from one language to another.

~Kevin Kelly
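The data-driven approach Kelly describes can be sketched in miniature. The toy corpus, function names, and scoring rule below are all invented for illustration (real systems like Google's use vastly larger corpora and far more sophisticated alignment models), but the principle is the same: no grammar rules, no theory of language, just statistics over sentence-aligned pairs that link "this to that" across languages.

```python
# A toy sketch of learning translation from a parallel corpus:
# count which French words co-occur with which English words
# across aligned sentence pairs, then pick the strongest association.
from collections import Counter, defaultdict

# Hypothetical miniature English/French parallel corpus, standing in
# for the bilingual Canadian documents mentioned in the quote.
corpus = [
    ("the house", "la maison"),
    ("the blue house", "la maison bleue"),
    ("the flower", "la fleur"),
    ("the blue flower", "la fleur bleue"),
]

# Co-occurrence counts: cooccur[english_word][french_word]
cooccur = defaultdict(Counter)
fr_count = Counter()
for en, fr in corpus:
    for f in fr.split():
        fr_count[f] += 1
    for e in en.split():
        for f in fr.split():
            cooccur[e][f] += 1

def translate_word(word):
    """Pick the French word most strongly associated with the English word:
    co-occurrences divided by the French word's overall frequency, so that
    very common words like 'la' don't win on raw counts alone."""
    scores = {f: c / fr_count[f] for f, c in cooccur[word].items()}
    return max(scores, key=scores.get)

print(translate_word("house"))   # maison
print(translate_word("flower"))  # fleur
```

Even this crude word-by-word scheme recovers sensible mappings from four sentence pairs; with billions of pairs, the same aggregate-statistics idea goes a very long way.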




  1. THE END OF THEORY: Will the data deluge make the scientific method obsolete? By Chris Anderson, editor-in-chief of Wired. Says:

    […] that others can build on? We don't know yet. The technical term for this scientific approach is Data Intensive Scalable Computation (DiSC). Other terms are «Grid Datafarm Architecture» or «Petascale Data Intensive Computing […]
