The goal of retinal prosthetics is to build an electronic device that replaces a diseased or damaged eye. One challenge in building such a system is translating visual information into the electrical signals sent from the eye to the brain. All electronic devices for interfacing with biology are imprecise, and even when the desired electrical signal is known, the device will most likely be unable to elicit it exactly. This discrepancy requires a method for measuring how close a given pair of neural firing patterns is, in order to determine the stimulation pattern that best approximates the external visual world. In this work we propose to learn such a metric from recordings of populations of retinal ganglion cells (RGCs) in the primate retina. We demonstrate that this metric outperforms existing techniques, and we demonstrate its application and utility in real stimulation experiments mimicking a retinal prosthetic device.
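To make the idea of a learned response metric concrete, the following is a minimal sketch, not the method of this paper: it assumes responses are binned spike-count vectors over an RGC population and learns a Mahalanobis-style distance d(r1, r2) = ||A(r1 - r2)||^2 with a triplet objective, so that two responses evoked by the same stimulus are scored closer than responses to different stimuli. All names, dimensions, and the synthetic data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: responses are spike-count vectors over n_cells RGCs.
# We learn a linear map A defining d(r1, r2) = ||A (r1 - r2)||^2.
n_cells, dim, n_triplets = 20, 8, 500
A = rng.normal(scale=0.1, size=(dim, n_cells))  # metric parameters

def dist(A, r1, r2):
    """Squared distance between two responses under the learned metric."""
    z = A @ (r1 - r2)
    return z @ z

# Synthetic triplets (anchor, positive, negative): the positive is a
# second noisy response to the anchor's stimulus; the negative is a
# response to a different stimulus.
stim = rng.normal(size=(n_triplets, n_cells))
anchor = stim + rng.normal(scale=0.3, size=stim.shape)
pos = stim + rng.normal(scale=0.3, size=stim.shape)
neg = rng.permutation(stim) + rng.normal(scale=0.3, size=stim.shape)

lr, margin = 1e-3, 1.0
for _ in range(200):
    grad = np.zeros_like(A)
    for a, p, n in zip(anchor, pos, neg):
        # Hinge: penalize triplets where the positive is not closer
        # than the negative by at least the margin.
        if dist(A, a, p) - dist(A, a, n) + margin > 0:
            dp, dn = a - p, a - n
            # Gradient of ||A dp||^2 - ||A dn||^2 with respect to A.
            grad += 2 * (np.outer(A @ dp, dp) - np.outer(A @ dn, dn))
    A -= lr * grad / n_triplets

# Same-stimulus pairs should be closer under the trained metric.
d_pos = np.mean([dist(A, a, p) for a, p in zip(anchor, pos)])
d_neg = np.mean([dist(A, a, n) for a, n in zip(anchor, neg)])
```

A prosthesis could then score each achievable stimulation pattern by its learned distance to the desired response and select the minimizer.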
Neural prosthetic systems require measuring the distance between neural codes. We build such a system.