A biologically grounded computational model, built to mimic real neural circuits rather than trained on animal data, learned a visual categorization task much as lab animals do, matching their accuracy, variability, and underlying neural rhythms. By integrating fine-scale synaptic plasticity rules with a large-scale architecture spanning cortex, striatum, brainstem, and acetylcholine-modulated systems, the model reproduced hallmark signatures of learning, including strengthened beta-band synchrony between regions during correct decisions.
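To make the synchrony result concrete: beta-band synchrony refers to coordinated oscillations in roughly the 13–30 Hz range between two regions' signals. The sketch below is only an illustration of how such inter-regional synchrony can be quantified with magnitude-squared coherence; it is not the authors' model or analysis pipeline, and the signal names, sampling rate, and oscillation frequency are assumptions chosen for demonstration.

```python
# Illustrative sketch: quantifying beta-band (~13-30 Hz) synchrony between
# two simulated regional signals. Not the authors' model or pipeline;
# all signals and parameters here are assumed for demonstration.
import numpy as np
from scipy.signal import coherence

fs = 1000                      # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)   # 10 s of simulated activity
rng = np.random.default_rng(0)

# A shared 20 Hz (beta) component plus independent noise stands in for two
# interacting regions (e.g., a "cortical" and a "striatal" signal).
shared_beta = np.sin(2 * np.pi * 20 * t)
region_a = shared_beta + rng.normal(scale=1.0, size=t.size)
region_b = 0.8 * shared_beta + rng.normal(scale=1.0, size=t.size)

# Magnitude-squared coherence as a simple inter-regional synchrony measure.
f, cxy = coherence(region_a, region_b, fs=fs, nperseg=1024)
beta_mask = (f >= 13) & (f <= 30)
print(f"mean beta-band coherence: {cxy[beta_mask].mean():.2f}")
```

In a setup like this, stronger coupling between the regions (a larger shared beta component) yields higher coherence in the 13–30 Hz band, which is the kind of signature the reported model strengthened during correct decisions.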