Medical Definition of social science
1: a branch of science that deals with the institutions and functioning of human society and with the interpersonal relationships of individuals as members of society
2: a science (as anthropology or social psychology) dealing with a particular phase or aspect of human society