social work

noun

Definition of SOCIAL WORK

: any of various professional services, activities, or methods concretely concerned with the investigation, treatment, and material aid of the economically, physically, mentally, or socially disadvantaged
