To run our circadian clocks, regulate sleep, and control hormone levels, we rely on light-sensing neurons known as M1 ganglion cell photoreceptors. Distinct from the retina’s rods and cones, M1 cells specialize in “non-image” vision and can continue to function even in people who are blind.
Reporting in today’s Cell, neuroscientists at Boston Children’s Hospital describe an unexpected system that M1 cells use to sense changing amounts of environmental illumination. They found that the cells divvy up the job, with particular neurons tuned to different ranges of light intensity.
“As the earth turns, the level of illumination ranges across many orders of magnitude, from starlight to full daylight,” says Michael Do, PhD, of the F.M. Kirby Neurobiology Center at Boston Children’s Hospital, senior author on the paper. “How do you build a sensory system that covers such a broad range? It seems like a straightforward problem, but the solution we found was a lot more complex than expected.”
Sunrise, sunset: Tag teaming to relay light information
Elliott Milner, a PhD student in Harvard’s Program in Neuroscience and the paper’s lead author, established new methods to study M1 cells’ electrical outputs. This allowed a better understanding of the signals these cells send from the eye to areas throughout the brain.
“The expectation from prior work was that signaling from these cells would simply increase with brightness, and that averaging across them would provide a measure of the overall light intensity,” says Milner.
Instead, Milner and Do found that while the cells appear identical, they are tuned to respond to different light levels and take turns signaling the brain as those levels change. As a result, the brain gets information about light intensity from the identities of the cells that are active, not just from the size of their signals.
“Some cells signal vigorously in twilight and others in full daylight,” says Milner. “Together, they cover a broad range of light intensities in the environment.”
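The turn-taking scheme described above can be sketched as a toy model. Everything here is invented for illustration (the tuning curves, preferred levels, and rates are not from the paper): each simulated cell responds best over its own window of log light intensity, so the identity of the most active cell, rather than any one cell's firing rate, encodes the overall light level.

```python
import math

def firing_rate(log_intensity, preferred, width=1.0, max_rate=20.0):
    """Toy Gaussian tuning curve: spikes/s vs. log light intensity (invented)."""
    return max_rate * math.exp(-((log_intensity - preferred) ** 2) / (2 * width ** 2))

# Hypothetical population: cells tuned from starlight (-4) to full daylight (+4),
# together spanning many orders of magnitude of intensity.
preferred_levels = [-4, -2, 0, 2, 4]

def most_active_cell(log_intensity):
    """A downstream reader could recover light level from WHICH cell fires most."""
    rates = [firing_rate(log_intensity, p) for p in preferred_levels]
    return rates.index(max(rates))

print(most_active_cell(-3.5))  # dim light: the dimmest-tuned cell dominates
print(most_active_cell(3.5))   # bright light: the brightest-tuned cell dominates
```

In this sketch the population covers a broad intensity range even though each individual cell responds well over only a narrow slice of it, mirroring the division of labor the study reports.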
Leveraging a “pathological” phenomenon
Interestingly, the M1 cells’ turn-taking system uses a mechanism that is usually considered abnormal or pathological, known as depolarization block.
As the light level goes up, a protein called melanopsin in the M1 cells captures more and more photons of light. This causes the voltage across the cell membrane to become more positive — that is, it “depolarizes.” As the voltage becomes more positive, the cell generates more electrical spikes, also known as action potentials, which are sent to the brain.
In depolarization block, typically observed in certain disorders like epilepsy, the cell cannot fire spikes when the membrane voltage gets too positive. “There’s so much excitation that the cell can’t keep up and it goes silent,” says Do.
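The shape of that response can be captured in a minimal numeric cartoon. This is not the paper's model; the threshold, block point, and gain below are invented for illustration: the spike rate grows as the membrane depolarizes, then collapses to zero once excitation pushes the voltage past the block point.

```python
def spike_rate(voltage_mv, threshold=-50.0, block=-20.0, gain=2.0):
    """Cartoon firing-rate curve with depolarization block (invented parameters).

    Below threshold: silent. Between threshold and block: rate grows with
    depolarization. Above block: the cell is too depolarized to fire.
    """
    if voltage_mv < threshold or voltage_mv > block:
        return 0.0
    return gain * (voltage_mv - threshold)

for v in (-60, -40, -25, -10):
    print(v, spike_rate(v))
# A moderately depolarized cell fires briskly; a strongly depolarized one goes silent.
```

The key feature is that silence is ambiguous on its own (too little light or too much?), which is why reading out the identities of the cells that are still firing becomes informative.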
The M1 cells seem to be using this feature to their advantage. Milner and Do think the system may have evolved to help the brain distinguish light levels more precisely, based on which cells are “talking” and not only on how loudly. It may also conserve energy.
“Spikes are expensive metabolically for a cell to produce,” Do explains. “Because some cells are silenced as others activate, this system provides information at a lower energetic cost.”
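The energy argument comes down to simple arithmetic, shown here with invented numbers rather than figures from the study: if every cell in a small population fired near its maximum rate at every light level, the total spike output would be several times what a turn-taking scheme produces.

```python
n_cells = 5          # hypothetical population size
max_rate = 20.0      # hypothetical peak firing rate, spikes/s

# If all cells signaled at every light level:
all_active = n_cells * max_rate

# If roughly one cell dominates at any given level (turn-taking):
turn_taking = 1 * max_rate

print(all_active, turn_taking)  # the turn-taking population spikes far less
```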
The brain’s division of labor
In future work, Milner and Do hope to explore the following questions:
- How does the brain extract information about the light level from these cells? Do particular regions of the brain listen to some M1 cells and not others?
- How are the different M1 cells distributed across the retina? Different parts of the retina receive light from different directions, so how M1 cells are distributed spatially could have implications for designing light therapy systems for conditions like seasonal affective disorder.
- Do other cells involved in sensory perception — such as those that allow us to sense odors or touch — also use depolarization block?
“The bottom line is that nerve cells have more in their toolkit than we previously thought, and divide labor in ways we didn’t expect,” says Do. “We’re finding surprises even in systems that are considered to be quite simple, like the one used to sense light intensity.”
The study was supported by the National Institutes of Health (F31 EY025466, R01 EY023648, U54 HD090255, P30 EY012196), the Whitehall Foundation (2011-05-15), and the Karl Kirchgessner Foundation.