What would a robot see in TED Talks? …beautiful TED Talk maps

What would happen if we fed a robot all the TED Talks given by the world's most inspiring leaders over the last 10 years, and then asked it key questions? Review the beautiful TED Talk maps generated by the Noggle knowledge assistant.

Would the robot answer similarly to how we as humans would? Are current machine learning and cognitive artificial intelligence able to learn about our world and teach it to a "dumb" robot? No theoretical talk about the future of AI: let's see what we can get out of this technology today.

So I took the publicly available TED Talks from their website (ted.com, 2,224 talks as of today, 2006-2016) and used the summaries and transcripts to feed our robot, a state-of-the-art machine-learning AI algorithm.

After the robot had gathered all the content, which took only seconds, we asked it key questions. The robot applies a cognitive pattern-detection algorithm across all talks, looking for common patterns in the speakers' presentations. It then presents its findings as key topic clusters in a colorful knowledge map: the TED Talk maps. The size of each cluster represents the number of talks assigned to that thematic cluster, so behind each cluster name stands the respective number of TED Talks that deal with it; the most important clusters are shown in the center.
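The actual Noggle algorithm is not public, but the clustering idea described above can be sketched in a few lines: assign each talk to its dominant theme, then count how many talks land in each theme, which is what the map renders as bubble size. The talk texts and theme vocabulary below are hypothetical stand-ins for illustration.

```python
from collections import Counter, defaultdict

# Toy transcripts standing in for the 2,224 TED Talk texts (hypothetical data).
talks = {
    "talk_1": "education school learning students teachers education",
    "talk_2": "brain neurons memory brain cognition",
    "talk_3": "climate energy carbon climate emissions",
    "talk_4": "school students education classroom",
}

# A tiny vocabulary of candidate cluster themes (hypothetical).
themes = {"education", "brain", "climate"}

def dominant_theme(text):
    """Pick the theme word that occurs most often in a talk's text."""
    counts = Counter(w for w in text.split() if w in themes)
    return counts.most_common(1)[0][0] if counts else None

# Group talks under their dominant theme.
clusters = defaultdict(list)
for talk_id, text in talks.items():
    theme = dominant_theme(text)
    if theme:
        clusters[theme].append(talk_id)

# Cluster size = number of talks assigned to that theme,
# i.e. what the knowledge map draws as the bubble's size.
sizes = {theme: len(ids) for theme, ids in clusters.items()}
print(sizes)  # → {'education': 2, 'brain': 1, 'climate': 1}
```

A production system would of course use richer features (TF-IDF vectors, learned topic models) rather than a fixed keyword list, but the mapping from "talks per theme" to "cluster size" is the same.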

Our natural-language-processing "robot" works much like a natural human response would: our brains use cognitive shortcuts to make sense of an increasingly complicated world, and the shortcuts used by our machine-learning algorithm seem to have the same effect. Out of these 520 hours of video, the robot was able to extract the important shortcuts through cognitive text processing.

Please visit and read the full article on LinkedIn here:

Download the spreadsheet with all details here: goo.gl/sWSSBr
