(Yes, we know, MOOCs are free. This will be too, if you add it to your holiday wish list, or insist that your local library orders a copy.)
Barry’s marching orders are always to be executed on paper, even when they have been retrieved on smartphones, tablets, and a variety of other screens. They are the antithesis of dry. A less accidental professor might have dispensed with the doodle-encrusted, lined yellow legal paper after privately outlining her game plan. Barry’s choice to preserve and share the method behind her madness is a gift to students, and to herself.
The decontextualization of cheap, common, or utilitarian paper (which also harkens back to the historical avant-garde) may be understood as a transvaluation of the idea of working on “waste”: a knowing, ironic acknowledgment on Barry’s part that her life narrative, itself perhaps considered insignificant, is visualized in an accessible popular medium, comics, that is still largely viewed as “garbage.”
I got screamed at a lot for using up paper. The only blank paper in the house was hers, and if she found out I touched it she’d go crazy. I sometimes stole paper from school and even that made her mad. I think it’s why I hoard paper to this day. I have so much blank paper everywhere, in every drawer, on every shelf, and still when I need a sheet I look in the garbage first. I agonize over using a “good” sheet of paper for anything. I have good drawing paper I’ve been dragging around for twenty years because I’m not good enough to use it yet. Yes, I know this is insane.
Sample assignments from “The Unthinkable Mind” are above and below, and you will find many more in Syllabus: Notes from an Accidental Professor. Let us know if Professor Chewbacca’s neurological assumptions are correct. Do drawing and writing by hand release the monsters from the id and squelch the internal editor who is the enemy of art?
Schools like Harvard, Oxford, and the Sorbonne surely have qualities to recommend them, but to my mind, nothing would feel quite as cool as saying your degree comes from the Jack Kerouac School of Disembodied Poetics. If you aspire to say it yourself, you’ll have to apply to Naropa University, which Tibetan Buddhist teacher (and, incidentally, Oxford scholar) Chögyam Trungpa established in Boulder, Colorado in 1974. This rare, accredited, “Buddhist-inspired” American school has many unusual qualities, as you’d expect, but, as many of us remember from our teenage years, your choice of university has as much to do with who has passed through its halls before as with what you think you’ll find when you pass through them. Naropa, besides naming a school after the late Kerouac, has hosted the likes of Allen Ginsberg, Anne Waldman, William S. Burroughs, Gregory Corso, Philip Whalen, and Lawrence Ferlinghetti.
But you don’t actually have to attend Naropa to partake of its Beat legacy. At the Naropa Poetics Audio Archives, freely browsable at the Internet Archive, you can hear over 5000 hours of readings, lectures, performances, seminars, panels, and workshops recorded at the school and featuring the aforementioned luminaries and many others. “The Beat writers had intervened on the culture,” says Waldman in an interview about her book Beats at Naropa. “It wasn’t just a matter of simply offering the usual kind of writing workshops, but reading and thinking lectures, panels, presentations as well. The Beat writers have been exceptional as political and cultural activists, investigative workers, translators, Buddhists, environmental activists, feminists, seers. There’s so much legendary history here.” Emphasis — I repeat, 5000 hours — on so much.
To help you dive into this legendary history, we’ve rounded up today some previously featured highlights from Naropa. Begin here, and if you keep going, you’ll discover varieties of Beat experience even we’ve never had — and maybe you’ll even consider putting in a Kerouac School application, and doing some cultural intervention of your own.
The term “creative nonfiction” has picked up a great deal of traction over the past decade — perhaps too much, depending upon how valid or invalid you find it. Meaningful or not, the label has come into its current popularity in part thanks to the essays of novelist David Foster Wallace: whether writing nonfictionally about the Illinois State Fair, David Lynch, professional tennis, or a seven-night Caribbean cruise, he did it in a way unlike any other man or woman of letters. While nobody can learn to write quite like him — this we’ve seen when Wallace-imitators write pastiches of their own — he did spend time teaching the art of creative nonfiction as he saw it:
a broad category of prose works such as personal essays and memoirs, profiles, nature and travel writing, narrative essays, observational or descriptive essays, general-interest technical writing, argumentative or idea-based essays, general-interest criticism, literary journalism, and so on. The term’s constituent words suggest a conceptual axis on which these sorts of prose works lie. As nonfiction, the works are connected to actual states of affairs in the world, are “true” to some reliable extent. If, for example, a certain event is alleged to have occurred, it must really have occurred; if a proposition is asserted, the reader expects some proof of (or argument for) its accuracy. At the same time, the adjective creative signifies that some goal(s) other than sheer truthfulness motivates the writer and informs her work. This creative goal, broadly stated, may be to interest readers, or to instruct them, or to entertain them, to move or persuade, to edify, to redeem, to amuse, to get readers to look more closely at or think more deeply about something that’s worth their attention… or some combination(s) of these.
In some ways, Wallace’s syllabi themselves count as pieces of creative nonfiction. What other professor ever had the prose chops to make you actually want to read anything under the “Class Rules & Procedures” heading? In the ninth of its thirteen points, he lays out the workshop’s operative belief:
that you’ll improve as a writer not just by writing a lot and receiving detailed criticism but also by becoming a more sophisticated and articulate critic of other writers’ work. You are thus required to read each of your colleagues’ essays at least twice, making helpful and specific comments on the manuscript copy wherever appropriate. You will then compose a one-to-three-page letter to the essay’s author, communicating your sense of the draft’s strengths and weaknesses and making clear, specific suggestions for revision.
But whatever the rigors of English 183D, Wallace would have succeeded, to my mind, if he’d instilled nothing more than this in the minds of his departing students:
In the grown-up world, creative nonfiction is not expressive writing but rather communicative writing. And an axiom of communicative writing is that the reader does not automatically care about you (the writer), nor does she find you fascinating as a person, nor does she feel a deep natural interest in the same things that interest you.
True to form, DFW’s syllabus comes complete with footnotes.
1 (A good dictionary and usage dictionary are strongly recommended. You’re insane if you don’t own these already.)
If you feel the need for tips on developing a writing style, you probably don’t look first to the Institute of Electrical and Electronics Engineers’ journal Transactions on Professional Communication. You certainly don’t open such a publication expecting such tips from novelist Kurt Vonnegut, a writer with a style of his own if ever there was one.
But in a 1980 issue, the author of Slaughterhouse-Five, Jailbird, and Cat’s Cradle does indeed appear with advice on “how to put your style and personality into everything you write.” What’s more, he does it in an ad, part of a series from the International Paper Company called “The Power of the Printed Word,” ostensibly meant to address the need, now that “the printed word is more vital than ever,” for “all of us to read better, write better, and communicate better.”
This arguably holds much truer now, given the explosion of textual communication over the internet, than it did in 1980. And so which of Vonnegut’s words of wisdom can still help us convey our words of wisdom? You can read the full PDF of this two-page piece of ad-ucation here, but some excerpted points follow:
Find a subject you care about. “Find a subject you care about and which you in your heart feel others should care about. It is this genuine caring, and not your games with language, which will be the most compelling and seductive element in your style. I am not urging you to write a novel, by the way — although I would not be sorry if you wrote one, provided you genuinely cared about something. A petition to the mayor about a pothole in front of your house or a love letter to the girl next door will do.”
Keep it simple. “As for your use of language: Remember that two great masters of language, William Shakespeare and James Joyce, wrote sentences which were almost childlike when their subjects were most profound. ‘To be or not to be?’ asks Shakespeare’s Hamlet. The longest word is three letters long. Joyce, when he was frisky, could put together a sentence as intricate and as glittering as a necklace for Cleopatra, but my favorite sentence in his short story ‘Eveline’ is this one: ‘She was tired.’ At that point in the story, no other words could break the heart of a reader as those three words do.”
Sound like yourself. “English was Conrad’s third language, and much that seems piquant in his use of English was no doubt colored by his first language, which was Polish. And lucky indeed is the writer who has grown up in Ireland, for the English spoken there is so amusing and musical. I myself grew up in Indianapolis, where common speech sounds like a band saw cutting galvanized tin, and employs a vocabulary as unornamental as a monkey wrench. [ … ] No matter what your first language, you should treasure it all your life. If it happens to not be standard English, and if it shows itself when you write standard English, the result is usually delightful, like a very pretty girl with one eye that is green and one that is blue. I myself find that I trust my own writing most, and others seem to trust it most, too, when I sound most like a person from Indianapolis, which is what I am. What alternatives do I have?”
Say what you mean. “My teachers wished me to write accurately, always selecting the most effective words, and relating the words to one another unambiguously, rigidly, like parts of a machine. They hoped that I would become understandable — and therefore understood. And there went my dream of doing with words what Pablo Picasso did with paint or what any number of jazz idols did with music. If I broke all the rules of punctuation, had words mean whatever I wanted them to mean, and strung them together higgledy-piggledy, I would simply not be understood. Readers want our pages to look very much like pages they have seen before. Why? This is because they themselves have a tough job to do, and they need all the help they can get from us.”
While easy to remember, Vonnegut’s plainspoken rules could well take an entire career to master. I’ll certainly keep writing on the subjects I care most about — many of them on display right here on Open Culture — keeping it as simple as I can bear, saying what I mean, and sounding like… well, a rootless west-coaster, I suppose, but one question sticks in my mind: which corporation will step up today to turn out writing advice from our most esteemed men and women of letters?
Though the term “weird fiction” came into being in the 19th century—originally used by Irish gothic writer Sheridan Le Fanu—it was picked up by H.P. Lovecraft in the 20th century as a way, primarily, of describing his own work. Lovecraft produced copious amounts of the stuff, as you can see from our post highlighting online collections of nearly his entire corpus. He also wrote in depth about writing itself. He did so in generally prescriptive ways, as in his 1920 essay “Literary Composition,” and in ways specific to his chosen mode—as in the 1927 “Supernatural Horror in Literature,” in which he defined weird fiction very differently than Le Fanu or modern authors like China Miéville. For Lovecraft,
The true weird tale has something more than secret murder, bloody bones, or a sheeted form clanking chains according to rule. A certain atmosphere of breathless and unexplainable dread of outer, unknown forces must be present; and there must be a hint, expressed with a seriousness and portentousness becoming its subject, of that most terrible conception of the human brain–a malign and particular suspension or defeat of those fixed laws of Nature which are our only safeguard against the assaults of chaos and the daemons of unplumbed space.
Here we have, broadly, the template for a very Lovecraftian tale indeed. Ten years later, in a 1937 essay titled “Notes on Writing Weird Fiction,” Lovecraft would return to the theme and elaborate more fully on how to produce such an artifact.
Weird Fiction, wrote Lovecraft in that later essay, is “obviously a special and perhaps a narrow” kind of “story-writing,” a form in which “horror and the unknown or the strange are always closely connected,” and one that “frequently emphasize[s] the element of horror because fear is our deepest and strongest emotion.” Although Lovecraft self-deprecatingly calls himself an “insignificant amateur,” he nonetheless situates himself in the company of “great authors” who mastered horror writing of one kind or another: “[Lord] Dunsany, Poe, Arthur Machen, M.R. James, Algernon Blackwood, and Walter de la Mare.” Even if you only know the name of Poe, it’s weighty company indeed.
But be not intimidated—Lovecraft wasn’t. As our traditional holiday celebration of fear approaches, perhaps you’d be inclined to try your hand at a little weird fiction of your own. You should certainly, Lovecraft would stress, spend some time reading these writers’ works. But he goes further, and offers us a very concise, five-point “set of rules” for writing a weird fiction story that he says might be “deduced… if the history of all my tales were analyzed.” See an abridged version below:
Prepare a synopsis or scenario of events in the order of their absolute occurrence—not the order of their narrations.
Prepare a second synopsis or scenario of events—this one in order of narration (not actual occurrence), with ample fullness and detail, and with notes as to changing perspective, stresses, and climax.
Write out the story—rapidly, fluently, and not too critically—following the second or narrative-order synopsis. Change incidents and plot whenever the developing process seems to suggest such change, never being bound by any previous design.
It may be that the second rule is made just to be broken, but it provides the weird fiction practitioner with a beginning. The third stage here brings us back to a process every writer on writing, such as Stephen King, will highlight as key—free, unfettered drafting, followed by…
Revise the entire text, paying attention to vocabulary, syntax, rhythm of prose, proportioning of parts, niceties of tone, grace and convincingness of transitions…
And finally….
Prepare a neatly typed copy—not hesitating to add final revisory touches where they seem in order.
You will notice right away that these five “rules” tell us nothing about what to put in our weird fiction, and could apply to any sort of fiction at all, really. This is part of the admirably comprehensive quality of the otherwise succinct essay. Lovecraft tells us why he writes, why he writes what he writes, and how he goes about it. The content of his fictional universe is entirely his own, a method of visualizing “vague, elusive, fragmentary impressions.” Your mileage, and your method, will indeed vary.
Lovecraft goes on to describe “four distinct types of weird story” that fit “into two rough categories—those in which the marvel or horror concerns some condition or phenomenon, and those in which it concerns some action of persons in connection with a bizarre condition or phenomenon.” If this doesn’t clear things up for you, then perhaps a careful reading of Lovecraft’s complete “Notes on Writing Weird Fiction” will. Ultimately, however, “there is no one way” to write a story. But with some practice—and no small amount of imagination—you may find yourself joining the company of Poe, Lovecraft, and a host of contemporary writers who continue to push the boundaries of weird fiction past the sometimes parochial, often profoundly bigoted, limits that Lovecraft set out.
But if this seems out of bounds, wait until you hear what he suggests. Instead of issuing even more seemingly arbitrary, burdensome commands, Pinker aims to free us from the tyranny of the senseless in grammar—or, as he calls it in an article at The Guardian, from “folklore and superstition.” Below are five of the ten “common issues of grammar” Pinker selects “from those that repeatedly turn up in style guides, pet-peeve lists, newspaper language columns and irate letters to the editor.” In each case, he explains the absurdity of strict adherence and offers several perfectly reasonable exceptions that require no correction to clarify their meaning.
Beginning sentences with conjunctions
We have almost certainly all been taught in some fashion or another that this is a no-no. “That’s because teachers need a simple way” to teach children “how to break sentences.” The “rule,” Pinker says, is “misinformation” and “inappropriate for adults.” He cites only two examples here, both using the conjunction “because”: Johnny Cash’s “Because you’re mine, I walk the line,” and the stock parental non-answer, “Because I said so.” And yet (see what I did?), other conjunctions, like “and,” “but,” “yet,” and “so” may also “be used to begin a sentence whenever the clauses being connected are too long or complicated to fit comfortably into a single megasentence.”
Dangling modifiers
Having taught English composition for several years, and thus having read several hundred scrambled student essays, I find this one difficult to concede. The dangling modifier—an especially easy error to make when writing quickly—too easily creates confusion or downright unintelligibility. Pinker does admit that, since the subjects of dangling modifiers “are inherently ambiguous,” they might sometimes “inadvertently attract a reader to the wrong choice, as in ‘When a small boy, a girl is of little interest.’” But, he says, this is not a grammatical error. Here are a few “danglers” he suggests as “perfectly acceptable”:
“Checking into the hotel, it was nice to see a few of my old classmates in the lobby.”
“Turning the corner, the view was quite different.”
“In order to contain the epidemic, the area was sealed off.”
Who and Whom
I once had a student ask me if “whom” was an archaic affectation that would make her writing sound forced and unnatural. I had to admit she had an excellent point, no matter what our overpriced textbook said. In most cases, even if correctly used, whom can indeed sound “formal verging on pompous.” Though they seem straightforward enough, “the rules for its proper use,” writes Pinker, “are obscure to many speakers, tempting them to drop ‘whom’ into their speech whenever they want to sound posh,” and to generally use the word incorrectly. Despite “a century of nagging by prescriptive grammarians,” the distinction between “who” and “whom” seems anything but simple, and so one’s use of it—as with any tricky word or usage—should be carefully calibrated “to the complexity of the construction and the degree of formality” the writing calls for. Put plainly, know how you’re using “whom” and why, or stick with the unobjectionable “who.”
Very unique
Oftentimes we find the most innocuous-sounding, common-sense usages called out by uptight pedants as ungrammatical when there seems to be no reason why they should be. The phrase “very unique,” a description that may not strike you as excessively weird or backward, happens to be “one of the commonest insults to the sensibility of the purist.” This is because, such narrow thinkers claim, as with other categorical expressions like “absolute” or “incomparable,” something either is or it isn’t, in the same way that one either is or isn’t pregnant: “referring to degrees of uniqueness is meaningless,” says the logic, in the case of absolute adjectives. Of course, it seems to me that one can absolutely refer to degrees of pregnancy. In any case, writes Pinker, “uniqueness is not like pregnancy […]; it must be defined relative to some scale of measurement.” Hence “very unique” makes sense, he says. But you should avoid it on aesthetic grounds. “‘Very,’” he says, “is a soggy modifier in the best of circumstances.” How about “rather unique”? Too posh-sounding?
That and which
I breathed an audible sigh on encountering this one, because it’s a rule I find particularly irksome. Of note is that Pinker, an American, is writing in The Guardian, a British publication, where things are much more relaxed for these two relative pronouns. In U.S. usage, “which” is reserved for nonrestrictive (or optional) clauses: “The pair of shoes, which cost five thousand dollars, was hideous.” For restrictive clauses, those “essential to the meaning of the sentence,” we use “that.” Pinker takes the example of a sentence in a documentary on “Imelda Marcos’s vast shoe collection.” In such a case, of course, we would need that bit about the price; hence, “The pair of shoes that cost £5,000 was hideous.”
It’s a reasonable enough distinction, and “one part of the rule,” Pinker says, “is correct.” We would rarely find someone writing “The pair of shoes, that cost £5,000…” after all. It probably looks awkward to our eyes (though I’ve seen it often enough). But there’s simply no good reason, he says, why we can’t use “which” freely, as the Brits already do, to refer to things both essential and non-. “Great writers have been using it for centuries,” Pinker points out, citing whoever (or “whomever”) translated that “render unto Caesar” bit in the King James Bible and Franklin Roosevelt’s “a day which will live in infamy.” QED, I’d say. And anyway, “which” is so much lovelier a word than “that.”
See Pinker’s Guardian piece for his other five anti-rules and free yourself up to write in a more natural, less stilted way. That is, if you already have some mastery of basic English. As Pinker rightly observes, “anyone who has read an inept student paper [um-hm], a bad Google translation, or an interview with George W. Bush can appreciate that standards of usage are desirable in many areas of communication.” How do we know when a rule is useful and when it impedes “clear and graceful prose?” It’s really no mystery, Pinker says. “Look it up.” It sounds like his book might help put things into better perspective than most writing guides, however. You can also hear him discuss his accessible and intuitive writing advice in the KQED interview with Michael Krasny above.
We’ve brought you a wealth of Haruki Murakami lately, and for good reason. Not only does the wildly popular Japanese novelist have a new novel out, he also has an upcoming novella, The Strange Library, a 96-page story about, well, a “strange trip to the library,” due from Knopf on December 2nd. Admirably prolific, writing roughly 3–4 novels per decade since his first in 1979, and a few collections of stories and essays, the notoriously shy Murakami took to writing somewhat late in life at age 30, and to running even later at 33. The latter pursuit gave him a great deal of material for his essay collection What I Talk About When I Talk About Running.
Like other authors who write nonfiction pieces on their avocations—Jamaica Kincaid on gardening, Hemingway on hunting—in his running book, Murakami can’t help but turn his passion for fitness into a metaphor for reading and writing. Given his natural reticence, he begins with a disclaimer: “a gentleman shouldn’t go on and on about what he does to stay fit.”
Nevertheless, the ultra-marathoner can’t help but indulge. At one point, the writing on running turns to writing on writing, and a summary of the qualities the good novelist must have. Read his thoughts condensed below.
Talent:
Like Flannery O’Connor, whose thoughts on the MFA degree we quoted a few days ago, Murakami frames talent as an attribute that can’t be taught or bought. For the writer, talent is “more of a prerequisite than a necessary quality […] No matter how much enthusiasm and effort you put into writing, if you totally lack literary talent you can forget about being a novelist.” One feels this should go without saying, but for whatever reason, people seem to entertain the idea of becoming a writer longer into life than the idea of becoming, say, a musician or a painter. Maybe this is why Murakami then makes an analogy to music as a pursuit in which, ideally, natural aptitude is indispensable. But in mentioning two of his favorite composers, Schubert and Mozart, Murakami makes the point that these are examples of artists “whose genius went out in a blaze of glory.” He is quick to point out that “for the vast majority of us this isn’t the model we follow.” The novelist as runner, we might say, should train for a career running marathons.
Focus:
Murakami-as-runner, an Economist review muses, is “if not a madman […] a very focused man.” One would have to be to finish 27 marathons, including a 62-mile monster in Hokkaido, and several triathlons. The qualities that serve him in his physical discipline are also those he identifies as necessary in the novelist. Murakami defines focus as “the ability to concentrate all your limited talents on whatever’s critical at the moment. Without that you can’t accomplish anything of value.” He “generally concentrate[s] on work for three or four hours every morning. I sit at my desk and focus totally on what I’m writing. I don’t see anything else, I don’t think about anything else.” Murakami’s running memoir may contain “long descriptions of training schedules and diet,” but when it comes to writing, there seems to be one overwhelmingly singular way to go about things. Just sit down and do it.
Endurance:
Consider yourself more of a sprinter? Maybe stick to short stories. “If you concentrate on writing three or four hours a day and feel tired after a week of this,” Murakami chides, “you’re not going to be able to write a long work. What’s needed of the writer of fiction—at least one who hopes to write a novel—is the energy to focus every day for half a year, or a year, or two years. Fortunately, these two disciplines—focus and endurance—are different from talent, since they can be acquired and sharpened through training.” The act of acquisition, Murakami writes, “is a lot like the training of muscles I wrote of a moment ago. [It] involves the same process as jogging every day to strengthen your muscles and develop a runner’s physique.”
Clearly there’s little room for spacing out or waiting around for inspiration. To extend the analogy, this might be likened to the rare desire one gets to try a new, challenging routine, an impulse that wanes pretty quickly once things get painful and dull. But in writing, Murakami suggests, sometimes it’s enough just to show up. He refers to the discipline of Raymond Chandler, who “made sure he sat down at his desk every single day and concentrated” even if he wrote not a word. It’s a fitting image for what Murakami describes as the writer’s need to “transmit the object of your focus to your entire body.” I wonder if it’s not going too far to claim that this sentence betrays the real subject of Murakami’s running book.
Flannery O’Connor once wrote, “because fine writing rarely pays, fine writers usually end up teaching, and the [MFA] degree, however worthless to the spirit, can be expected to add something to the flesh.” That phrase “worthless to the spirit” contains a great deal of the negative attitude O’Connor expressed toward the institutionalization of creative writing in MFA programs like the one she helped make famous at the University of Iowa. The verbiage comes from an essay she wrote for the alumni magazine of the Georgia College for Women after completing her degree in 1947, quoted in the Chad Harbach-edited collection of essays MFA vs. NYC. Although fresh from the program, O’Connor was already on her way to literary success, having published her first story, “The Geranium,” the year previous and begun work on her first novel, Wise Blood. Nevertheless, her insights on the MFA are not particularly sanguine.
On the one hand, she writes with characteristic dark humor, writing programs can serve as alternatives to “the poor house and the mad house.” In graduate school, “the writer is encouraged or at least tolerated in his odd ways.” An MFA program may offer some small respite from the loneliness and hardship of the writing life, and ultimately provide a credential to be “pronounced upon by his future employers should they chance to be of the academy.” But the time and effort (not to mention the expense, unless one is fully funded) may not be worth the cost, O’Connor suggests. Her own program at Iowa was “designed to cover the writer’s technical needs […], and to provide him with a literary atmosphere which he would not be able to find elsewhere. The writer can expect very little else.”
Later, in her collection of essays Mystery and Manners, O’Connor expressed similar sentiments. At the end of a lengthy discussion of the very limited role of the teacher of creative writing, she concludes that “the teacher’s work is largely negative […] a matter of saying ‘This doesn’t work because…’ or ‘This does work because….’” Remarking on the common observation that universities stifle writers, O’Connor writes, “My opinion is that they don’t stifle enough of them. There’s many a best-seller that could have been prevented by a good teacher.” Creative writing teachers may nod their heads in agreement, and shake them in frustration. But we should return to that phrase “worthless to the spirit,” for while MFA programs may turn out “competent” writers of fiction, O’Connor admits, they cannot produce “fine writing”:
In the last twenty years the colleges have been emphasizing creative writing to such an extent that you almost feel that any idiot with a nickel’s worth of talent can emerge from a writing class able to write a competent story. In fact, so many people can now write competent stories that the short story as a medium is in danger of dying of competence. We want competence, but competence by itself is deadly. What is needed is the vision to go with it, and you do not get this from a writing class.
O’Connor probably overestimates the degree to which “any idiot” can learn to write with competence, but her point is clear. She wrote these words in the mid-fifties, in an essay titled “The Nature and Aim of Fiction.” As Harbach’s new essay collection demonstrates, the debate about the value of MFA programs—which have expanded exponentially since O’Connor’s day—has not by any means been settled. And while there are certainly those writers, she notes wryly, who can “learn to write badly enough” and “make a great deal of money,” the true artist may be in the same position after the MFA as they were before it, compelled to “chop a path in the wilderness of his own soul; a disheartening process, lifelong and lonesome.”