In temperate songbirds, song is traditionally considered a reproductive and territorial signal produced by males. However, previous research has described the production of both male and female song in the black-capped chickadee, Poecile atricapillus, a temperate songbird. Statistical classification revealed that the frequency decrease within the first (fee) note of the song is a potential acoustic mechanism that would allow birds to distinguish between the sexes. Here we used an operant discrimination task to examine whether this statistical difference is perceived by black-capped chickadees in a manner that would allow them to quickly assess the sex of a singing conspecific. To better understand the perceptual mechanisms underlying this sex-based discrimination, we also presented birds with untrained, manipulated songs. In experiments 1 and 2, we tested black-capped chickadees using a true-category/pseudo-category task. Birds in the true-category group performed similarly to birds in the pseudo-category group, suggesting that categorization (true-category group) conferred no discrimination advantage over rote memorization (pseudo-category group), possibly because the heightened biological salience of the songs influenced the chickadees’ performance. However, responses to untrained songs suggest that birds learned a sex-based category rule when discriminating among songs. In experiment 3, we trained artificial neural networks (ANNs) on an analogous task to examine responding in the absence of experiential or biological factors. Results from the ANNs suggest that male and female songs are acoustically distinct, that they can be discriminated using categorization, and that acoustic features within the fee note are an important mechanism for this sex-based discrimination. Overall, the results suggest that the biological salience of the songs affected the birds’ responses.