Neuroscience

Interactions between stimulus and response types are more strongly represented in the entorhinal cortex than in its upstream regions in rats

  1. Eun-Hye Park
  2. Jae-Rong Ahn
  3. Inah Lee (corresponding author)

  Seoul National University, Korea
Research Advance
Cite as: eLife 2017;6:e32657 doi: 10.7554/eLife.32657
4 figures and 1 additional file

Figures

Figure 1
Cannula implantation in the PER and POR.

(A) Cannula tracks in thionin-stained sections in the PER and POR. (B) Cannula-tip positions are indicated by dots (color-coded for rats) for the nonspatial response tasks (top), SCN-SR task (middle), and OBJ-SR task (bottom).

https://doi.org/10.7554/eLife.32657.002
Figure 2
Scene-memory tasks.

(A) Scene-cued nonspatial response task. Rats made a nonspatial response (push or dig) depending on the visual scene displayed on monitors. (B) Behavioral performance in the SCN-NSR task (Mean ± SEM). Both the PER-MUS and POR-MUS conditions significantly impaired performance compared to control conditions. (C) Scene-cued spatial response task. Rats made a spatial choice (left or right turn) depending on visual scenes. (D) Behavioral performance in the SCN-SR task (Mean ± SEM). Both the PER-MUS and POR-MUS conditions resulted in significantly impaired performance compared to control conditions. **p<0.01.

https://doi.org/10.7554/eLife.32657.003
Figure 2—source data 1

Performance in the SCN-NSR task.

Percent correct performance for individual rats under different drug conditions in the SCN-NSR task.

https://doi.org/10.7554/eLife.32657.004
Figure 2—source data 2

Performance in the SCN-SR task.

Percent correct performance for individual rats under different drug conditions in the SCN-SR task.

https://doi.org/10.7554/eLife.32657.005
Figure 3 with 1 supplement
Object-memory tasks.

(A) Object-cued nonspatial response task. Rats made a nonspatial choice (push or dig) toward the sand-filled jar depending on the object cue attached to the jar. (B) Behavioral performance in the OBJ-NSR task (Mean ± SEM). The PER-MUS condition resulted in significant deficits in performance compared to the ACSF and POR-MUS conditions. (C) Object-cued spatial response task. Rats made a spatial choice depending on the toy object attached to the intersection wall. (D) Behavioral performance in the OBJ-SR task (Mean ± SEM). A significant difference in performance was found between the ACSF and PER-MUS conditions. **p<0.01.

https://doi.org/10.7554/eLife.32657.006
Figure 3—source data 1

Performance in the OBJ-NSR task.

Percent correct performance for individual rats under different drug conditions in the OBJ-NSR task.

https://doi.org/10.7554/eLife.32657.009
Figure 3—source data 2

Performance in the OBJ-SR task.

Percent correct performance for individual rats under different drug conditions in the OBJ-SR task.

https://doi.org/10.7554/eLife.32657.010
Figure 3—figure supplement 1
Multimodal versus visual OBJ-NSR tasks.

(A) Cumulative performance of the PER-MUS group across trials in the multimodal OBJ-NSR (mOBJ-NSR) task and the visual OBJ-NSR (vOBJ-NSR) task. In the mOBJ-NSR task, rats were allowed to sample the object stimulus with multiple sensory modalities, such as vision and touch. The use of olfactory information, however, was controlled by cleaning the objects with diluted ethanol (70%) every 10 trials and replacing each object with its replica. Behavioral testing in the mOBJ-NSR task was followed by the vOBJ-NSR task, in which the object stimuli were encased in a transparent acrylic box (6 × 6 × 4 cm) with a sliding door to restrict perceptual access to the object to the visual modality only. The general task procedures and object-response contingencies were the same in the two versions of the task. The performance of the PER-MUS group in the mOBJ-NSR task improved in later trials, possibly reflecting a switch in the animals' reliance on a specific sensory modality (e.g., from visual to tactile) during object recognition. No such trend was observed when objects could be sampled only visually in the vOBJ-NSR task, and there was a significant difference between the slopes of the performance curves for the two tasks. (B) Behavioral performance in the vOBJ-NSR task (Mean ± SEM). Performance differed significantly among all groups (F(2,12) = 40.477, p<0.0001; one-way repeated-measures ANOVA), with significant differences between the ACSF and PER-MUS groups (p<0.0001), between the ACSF and POR-MUS groups (p=0.001), and between the PER-MUS and POR-MUS groups (p<0.001; Bonferroni-Dunn). **p<0.01, ***p<0.0001.

https://doi.org/10.7554/eLife.32657.007
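The one-way repeated-measures ANOVA reported above compares each rat's performance across the three drug conditions, with rats as the repeated (subject) factor; the degrees of freedom F(2,12) follow from k = 3 conditions and n = 7 rats. As a rough illustration of how such an F statistic is computed, the sketch below uses hypothetical percent-correct scores (illustrative numbers only, not the study's data):

```python
# Hypothetical percent-correct scores for 7 rats under three drug
# conditions (ACSF, PER-MUS, POR-MUS). Illustrative values only.
scores = {
    "ACSF":    [92, 88, 90, 95, 91, 89, 93],
    "PER-MUS": [55, 60, 58, 52, 57, 61, 54],
    "POR-MUS": [78, 75, 80, 77, 74, 79, 76],
}

def repeated_measures_anova(groups):
    """One-way repeated-measures ANOVA for a balanced design.

    Returns (F, df_condition, df_error)."""
    conds = list(groups.values())
    k = len(conds)        # number of conditions
    n = len(conds[0])     # number of subjects (rats)
    grand = sum(sum(c) for c in conds) / (k * n)
    # Between-condition sum of squares
    ss_cond = n * sum((sum(c) / n - grand) ** 2 for c in conds)
    # Between-subject sum of squares (each rat's mean across conditions)
    subj_means = [sum(conds[j][i] for j in range(k)) / k for i in range(n)]
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    # Residual (error) sum of squares
    ss_total = sum((x - grand) ** 2 for c in conds for x in c)
    ss_err = ss_total - ss_cond - ss_subj
    df_cond, df_err = k - 1, (k - 1) * (n - 1)
    f = (ss_cond / df_cond) / (ss_err / df_err)
    return f, df_cond, df_err

f, df1, df2 = repeated_measures_anova(scores)
print(f"F({df1},{df2}) = {f:.2f}")  # same df structure as the reported F(2,12)
```

With 3 conditions and 7 subjects the degrees of freedom come out to (2, 12), matching the reported statistic; the F value itself depends on the actual data.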
Figure 3—figure supplement 1—source data 1

Performance in the visual OBJ-NSR task.

https://doi.org/10.7554/eLife.32657.008
Figure 4 with 1 supplement
Lack of stimulus-response interaction in the PER and POR and a theoretical model.

(A) Performance deficits (calculated by subtracting the ACSF-based performance from the MUS-based performance) in the nonspatial response tasks (Mean ± SEM). The POR-MUS condition resulted in significant deficits in performance when scenes were used as cues, but not when objects were used. The PER-MUS condition produced deficits with both scene and object cues in making nonspatial choices, with larger deficits for object cues. (B) Performance deficits in the spatial response tasks (Mean ± SEM). Both the PER-MUS and POR-MUS conditions produced levels of impairment similar to those observed in the nonspatial tasks (A), suggesting a lack of scene-response interaction at the level of the PER and POR. Furthermore, the more prominent role of the PER, but not the POR, in the object-cued task was also observed in the spatial response tasks. *p<0.025. (C) A working model for information processing in the medial temporal lobe. Multimodal sensory inputs (VIS: visual, OLF: olfactory, AUD: auditory, SOM: somatosensory) are provided to the PER, whereas only visual inputs are fed to the POR. The PER and POR process these inputs to recognize objects and scenes, respectively. The LEC is involved in remembering choice responses associated with objects, whereas the MEC represents navigation-related variables using visual scene information from the POR. The LEC is reciprocally connected with the PER, hippocampus, insular cortex, and frontal areas (Burwell and Amaral, 1998a), and also projects to the basal ganglia, medial prefrontal cortex, somatosensory cortex, and motor areas (Swanson and Köhler, 1986). The MEC has reciprocal connections with the POR, hippocampus, cingulate cortex, and parietal cortex (Burwell and Amaral, 1998b), and receives projections from the parasubiculum and postsubiculum (Canto et al., 2008). In this model, the PER-LEC networks process information for interacting with objects, whereas the POR-MEC networks process information for navigating in space.
In the hippocampus, the neural representations from these two channels are temporally structured with relative values in a goal-directed manner to generate rich episodic memories.

https://doi.org/10.7554/eLife.32657.011
Figure 4—figure supplement 1
Comparisons of the PER-POR networks with the LEC-MEC networks.

(A) Performance deficits in the nonspatial response tasks (Mean ± SEM). The POR-MUS condition showed significant deficits when scenes and vOBJ were used as cues. The PER-MUS condition produced deficits with both scene and object cues in making nonspatial choices, with larger deficits for object cues. Between the vOBJ-NSR and SCN-NSR tasks, significant effects of task (F(1,6) = 8.794, p<0.05) and a task-region interaction (F(1,6) = 7.556, p<0.05) were found. A post hoc test showed a significantly larger deficit in the PER in the vOBJ-NSR task than in the SCN-NSR task (p<0.025), a difference that was not observed in the POR (F(1,6) = 3.36, p>0.1). (B) Double dissociation between the LEC and MEC in the SCN-SR and SCN-NSR tasks, as reported in Yoo and Lee (2017). (C) Performance deficits of the PER-MUS and POR-MUS groups in the SCN-SR and SCN-NSR tasks. *p<0.05, **p<0.001.

https://doi.org/10.7554/eLife.32657.012
