NeuralTalk and Walk – Taking a Neural Network for a Walk

While walking around Amsterdam, Kyle McDonald fed the live feed of a webcam into a neural network and had it describe the surroundings in real time. Pretty impressive:


Direct link: NeuralTalk and Walk (via prosthetic knowledge)

Andrej Karpathy’s “NeuralTalk” code github.com/karpathy/neuraltalk2 slightly modified to run from a webcam feed. I recorded this live while walking near the bridge at Damstraat and Oudezijds Voorburgwal in Amsterdam. All processing is done on my 2013 MacBook Pro with the NVIDIA 750M and only 2GB of GPU memory. I’m walking around with my laptop open pointing it at things, hence the shaky footage and people staring at themselves. The openFrameworks code for streaming the webcam and reading from disk is available at gist.github.com/kylemcdonald/b02edbc33942a85856c8
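The description mentions an openFrameworks app that streams the webcam and "reads from disk", which suggests the camera app and the Torch captioner exchange data through files. The exact hand-off isn't spelled out here, so the following is only a minimal sketch of such a disk-based exchange; the file names and helper functions are hypothetical, not taken from McDonald's gist:

```python
import os

def write_frame(frame_bytes, frame_path):
    # Hypothetical producer side: the camera app dumps the latest JPEG frame.
    # Write to a temp file first and rename, so the captioning process
    # never reads a half-written frame (os.replace is atomic).
    tmp_path = frame_path + ".tmp"
    with open(tmp_path, "wb") as f:
        f.write(frame_bytes)
    os.replace(tmp_path, frame_path)

def read_caption(caption_path):
    # Hypothetical consumer side: the display app polls for the latest
    # caption the neural network wrote back to disk.
    if not os.path.exists(caption_path):
        return None  # captioner hasn't produced output yet
    with open(caption_path) as f:
        return f.read().strip()
```

Each side simply overwrites one well-known file and polls the other's, which keeps the two processes (C++/openFrameworks and Lua/Torch in the video's setup) decoupled at the cost of some latency.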