(no subject)
Jun. 13th, 2001 01:48 pm
Why do novels always have to say they're novels? Like, on the cover of American Gods. It says this in tiny tiny print above the title. Is this a leftover from when novels were considered tawdry entertainment not fit for respectable people, so they'd know what not to read? It just seems to me a very odd practice, because almost everything I find at bookstores nowadays is a novel.
A stupid thing to ponder when I should be studying for my sociology final, I know.