Wed, 1 April 2015
Transcript available upon request.
Mon, 23 March 2015
MOST mornings, I wake up, put on some stretchy pants and a very bright t-shirt and strap on my phone for a run, because for some reason you need a phone to go running. Why do I do that? Is it because I am the master of my fate, and I choose my fate to be sweaty and singing along to Shakira through the wilderness trails near my home? Or is it because I am being influenced by the institutions of the beauty industry, the fitness industry, the nature industry and the Spandex industry to conform to a certain predictable type, which happens to include skipping over rocks and dirt while a GPS tracks my every step?
Pierre Bourdieu—not to be confused with Bordeaux—is convinced that it's neither just my free will nor entirely social structures that make me go for a run, but a continuous give-and-take between them. What I think I want to do is shaped by past events and institutions that are in turn influenced by what I choose to do. By choosing to wake up and run, I get feedback from structures that reinforce what I think of as my choice to wake up and run. This combination of choice and social structure is what Bourdieu calls habitus, and it's his most famous contribution to rhetoric and to sociology.
Habitus is a combination of deep-rooted, even unconscious, desires and what we choose to desire, formed from childhood on. It is, as Bourdieu often described it, "the feel for the game." I don't know how to articulate how and why I run, but I know it's something I do, because it's also something that my society does.
Sometimes it’s hard to see how institutions support a habitus unless you see the opposite, something that happened to my sister when she was doing medical surveys in a very remote village in Tanzania. She woke up one morning and went for a run—and flummoxed the villagers. “What are you running from?” said one person, huffing up beside her. “Nothing,” she said. “I’m just running.” “Why?” “I don’t know—to burn calories maybe?” The villager, who had been working with her on, among other things, questions of nutrition, paused a moment and then asked incredulously, “You want to burn calories?”
The feel for the game that my sister had was for a totally different game than the one that made sense in a small fishing village struggling to get and keep calories rather than burn them. The feel for the game wasn't something that my sister consciously set out to learn, and it was only when she bumped against a different set of rules that she noticed she was doing something wrong. Habitus is created and reproduced unconsciously, "without any deliberate pursuit of coherence," as Bourdieu says, "without any conscious concentration" (ibid: 170).
Although our habitus can cause embarrassing mismatches when we're in a different culture, it is adept at taking us through our native environments, letting us play the game around us like insiders.
Playing the game like an insider was a really important thing for young Pierre Bourdieu, who came from a working-class family in southern France. Southern France is, like, the sticks for French people, and his family had a strong accent, both in the lilt of their speech and in the things they believed were important. Going to study in Paris certainly would have highlighted the differences between his home culture and the elite intellectual world. That elite intellectual world became a sociological phenomenon for Bourdieu, as exotic and interesting as the Algerian tribes he did his fieldwork with. The ways that elite intellectuals used language, used taste, used culture became the basis of his landmark book Distinction: A Social Critique of the Judgment of Taste, published in 1979. Distinction highlights the way that the elite create an insider habitus. As Bourdieu says, "symbolic goods, especially those regarded as the attributes of excellence, [are] the ideal weapon in strategies of distinction." Whether you do or don't like opera, whether you do or don't see running as recreation, whether you do or don't value certain foods, cultures or presentations: each distinction creates the social fractures that distinguish the upper class from the middle, or from the very upper. Are snails a pest or a delicacy? That sort of thing. People don't even ask why they think opera is just good music or snails are just plain tasty, because it's been deeply ingrained in their lives almost from birth.
This so-called social capital is learned from a very young age as part of your habitus, and if you grow up knowing that you don't kick a dressage horse after a failed pas de deux, you live in a very different world than one where you know you don't kick a man when he's down.
But none of this is to say that once you're in (or out) you're stuck. According to one commentator on Bourdieu, habitus "is not fixed or permanent, and can be changed under unexpected situations or over a long historical period" (Navarro). Somewhere along the way, for example, the elite picked up jazz as the preferred music over opera, and snails gave way to craft beer and high-end cupcakes and—going for runs? Not only can the cultures and institutions switch, but people can switch, too: Bourdieu, for all of his criticism of the elite, ended up in the in-group of often-cited scholars in rhetoric, philosophy and sociology. And I used to hate running. But here I am, waking up most mornings to put on some stretchy pants and a very bright t-shirt and strap on my phone for a run, because for some reason you need a phone to go running.
If you have a deeply ingrained habitus, why not tell us about it at email@example.com? We'd love to hear from you, along with any other comments or ideas you may have, including distance running tips, because there seem to be a disproportionate number of distance runners in higher education. Must be something about the habitus…
Mon, 16 March 2015
A few weeks ago I was at an excellent lecture by Collin Brooke here at the University of Texas, where he talked about applying the master tropes to different models of networks. Then I thought--by Jove, the master tropes! What a brilliant idea for a podcast! So with all deference to Dr. Brooke, let's dive into these four beauties of the world of tropes.
A trope, you may or may not know, is a way of presenting thought in language. A trope is different from what's called a figure because it doesn't deal with arranging words, but rather with arranging thought. For example, a figure might be something like hyperbaton, which is the way that Yoda talks: "Patience you must have" just means "you must have patience." There's no change in the thought behind the words, but the refiguring of the words creates interest. So Yoda says things like "Miss them do not" instead of "do not miss them," but the ideas aren't changed at all. That's figures.
Occasionally, though, Yoda will use a trope. For example, once he said, "In a dark place we find ourselves, and a little more knowledge lights our way." This is, as it turns out, a metaphor: knowledge doesn't actually cast a glow, but it does make things metaphorically clear. The words transform the ideas: light equals knowledge. It's not that Yoda changed the words around--all things considered, this is pretty syntactically straightforward for the sage-green sage--but he's presented the ideas in a different way. This is a trope, not a figure.
It is, as a matter of fact, one of the four master tropes: metaphor, synecdoche, metonymy and irony. It's possible that these terms aren't familiar to you, or only in a vague, AP English sort of way, so let me provide examples and definitions. Metaphor is the trope that is most familiar to us: knowledge is light, the Force is a river, many stormtroopers are a wall. So I'm going to skip over that. Synecdoche is--aside from being difficult to pronounce--using the part to represent the whole. I always think of that movie Synecdoche, New York, where the guy builds a replica of New York for a play. The standard examples include things like "earning your bread and butter," when you're hopefully earning much more than that, or "putting boots on the ground," when the military often needs soldiers, too, to fill those boots. I used to joke with my Mormon comedy group that since everyone prays to "bless the hands that prepared this food," if there were a terrible accident in the kitchen and everyone died, at least the hands would be preserved. So you get the idea. Metonymy can sometimes be a little more confusing because it, like synecdoche, involves using a word associated with the idea to stand in for the idea itself. We say things like "the White House has issued a statement" when the building itself has done no such thing, or "Hollywood is corrupt" to represent the movie business generally. Some people will say that synecdoche is just a specific kind of metonymy, like how simile is a specific kind of metaphor. Finally, irony may seem like a simple, straightforward trope, but it can be notoriously complex, as Wayne Booth describes in greater detail in A Rhetoric of Irony. How do we know when someone is being ironic? How much is irony dependent on understanding cultural cues? Why do we say the opposite of what we mean as a way to say what we want? Tricky stuff all around.
The four master tropes are probably most familiar to rhetoricians from the essay found way in the back of Kenneth Burke's A Grammar of Motives, way, way back as an appendix. There, Burke equates these over-arching tropes with different epistemic perspectives: metaphor correlates with perspective, metonymy with reduction, synecdoche with representation, and irony with dialectic. The way that we construct thought depends on how we use these four master tropes.
Remember when we talked about the Metaphors We Live By? Well, Burke says that we don't just live by metaphors individually, but also by the idea of metaphor, or by reduction, representation or dialectic. The tropes, instead of just being a way to make your writing more flowery, can be a critical part of invention, and of how you see the world more generally. Are you inclined to think inductively, looking at a couple of examples of Sith lords and thereafter making generalizations about the group as a whole and their capacity to run a competent daycare? It's possible to think in terms of irony, transposing one view of truth with an antithetical perspective: can Anakin be both on the dark side and not on the dark side? Can you both do and do not if you only try? These master tropes are not just ways of expressing ideas about the world, but of coming to make ideas as well.
I'm a huge fan of Burke, but I'm afraid that I can't give him credit for coming up with the idea of four master tropes that encompass all the other ways of figuring ideas. I'm sorry to say that that distinction goes to--ew--Petrus Ramus. Yes, Ramus, the mustache-twirling villain of rhetoric himself. Back when we did our series on the villains of rhetoric, Ramus was public enemy number one, removing invention from rhetoric and diminishing the whole affair to a series of branching "yes and no" questions and needless ornamentation. And yet it was Ramus, in his eagerness to classify everything into categories and subcategories, who coined the idea of the master tropes back in 1549. Fortunately, the idea was taken up by a more palatable figure of rhetorical history, Giambattista Vico, who in the 18th century identified the master tropes as basic, or fundamental, tropes: those to which all others are reducible.
Since Burke, though, others have taken up the idea that these tropes for arranging ideas might become ways to think about the world in general. Hayden White, for instance, saw the master tropes as representing something fundamental about literature and the writing of history.
He constructed a table where each trope has its own genre, worldview and ideology. Metaphor, for instance, was about romance--or we might say fantasy--and was associated with formism and an ideology of anarchism, because anything might apply as a metaphor. Metonymy was associated with comedy, organicism and conservatism--presumably because if you assume that "the White House" speaks for the country, you're putting a lot of stock in the traditional power that dominates. Conversely, synecdoche was associated with tragedy, mechanism and radicalism. Irony, naturally enough, was the trope of satire and its worldview of contextualism and liberalism. Once White had come up with this tidy table, he began to think about the tropes not just statically but temporally--how they might evolve both in an individual child's development and in a civilization.
Metaphor was the earliest stage, corresponding to infants up to two years old and aligned with Foucault's conceptualization of the Renaissance. Then metaphor gives way to metonymy, the domain of children from 2 to 7, which White lines up with the Classical period and the Enlightenment--very conservative and fond of straightforward comedy. Next comes the synecdoche of tweens and the modernist period, radically breaking from the past. And finally, as a crowning achievement, irony: the stage of teenagers and adults, corresponding to the postmodernist era, with its love of counterintuitive and contradictory thought.
Others have highlighted the philosophical or historiographical possibilities of the master tropes, including Jakobson and Foucault himself. Which brings me back to that fascinating, exploratory lecture by Collin Brooke.
Brooke suggested another correlation for the master tropes: not ways of thinking or periods of time, but networks of connection. Networks are a big stinking deal for digital humanists and new media rhetoricians like Brooke, and some of the different types of networks, Brooke proposes, may correlate to the master tropes. Hierarchies, for instance, are like metaphors, which correspond across groups: the Padawan learner doesn't really tell us much about the Jedi master who trains her, but you expect the role of that Padawan learner to be similar to the role of another Padawan who studies under another master. Synecdoche, though, can be seen in truly random networks. A network of 200 that is truly random is representative of a network of 2,000, or of 2 million. Some networks are neither analogous like metaphor nor random like synecdoche. In situations that produce what's been called the long tail--citations, for example--some groups or people are more popular because they are more popular. The more people who fear Jabba the Hutt--peons, bounty hunters--the more he is feared. It creates a snowball effect that is similar to metonymy. Brooke's ideas are inchoate, and he admits that he's not sure what network might correlate to irony--it's all a work in progress, after all--but it goes to show that the organizational appeal of the master tropes continues. The idea of tropes that rule all the other tropes and say something meaningful about the ways in which we construct and understand the world around us has a timeless appeal that goes all the way back to Vico--er, let's just say Vico, okay? Until next week--miss us you must not, because patience you must have.
Thu, 5 March 2015
Welcome to Mere Rhetoric, a podcast for beginners and insiders about the ideas, terms and movements that have shaped rhetorical history. I'm Mary Hedengren, and today we're talking about two influential chapters from one book: Richard Weaver's The Ethics of Rhetoric.
The Ethics of Rhetoric was written in 1953, and it definitely feels like it; Weaver was Southern, and it definitely feels like that, too. Even though he spent most of his career at the University of Chicago, with Wayne Booth, he kept his summers free to go down to a farm he kept, where he lived an agrarian dream of plowing the family vegetable garden with a mule. He definitely believed in the Jeffersonian ideal of the gentleman farmer, connected to the earth.
Somehow, in the middle of all that plowing, Weaver managed to be one of the most important of the "new conservative" branch of thinkers and the leading neo-Platonist rhetorician of the 20th century. Weaver also thought somewhat idealistically about rhetoric. He said that rhetoric "instills belief and action" through "intersect[ing] possibility with the plane of actuality and hence of the imperative" (28). Rhetoric is "a process of coordination and subordination [...] very close to the essential thought process" (210). Thought and rhetoric were interwoven, and rhetoric couldn't be ignored.
There are two chapters in The Ethics of Rhetoric that have had especially lasting influence. The first is a reading of the Phaedrus, because Weaver loved him some Plato. Remember when we talked about the Phaedrus? For those of you who weren't here, it's a story about Socrates giving two opposite speeches about love: in the first, he tweaks an existing speech about the importance of choosing someone who doesn't love you as your lover; in the second, he repents of the first and gives a speech about how good it is to have a lover who loves you; and at the end, he ends up talking about rhetoric. Some people may say, "What? What's the connection?" Not Weaver. Weaver says that, "beginning with something simple," Plato's dialogue "passes to more general levels of application" and then ends up in allegory (4). The lovers are like rhetoric: you can have good, bad and impotent rhetoric. The non-lover is a lie, like "semantically purified speech" (7). Bad rhetoric, like a bad lover, seeks to keep the recipient weak and passive (11). If we have impure motives toward our audience, we'll keep them dependent on us, weak and passive, instead of empowering them the way that a true lover would. Ultimately, Weaver believed in an ideal of rhetoric, rhetoric that would make people "better versions of themselves" (Young 135).
Another of Weaver's chapters to have lasting influence classifies the very words we use, most famously into "god terms" and "devil terms." God terms are those words that, for a specific audience, are so positive and influential that they can overpower a lot of other language and ideas. Writing in 1953, Weaver uses "American" as one of the key political god terms. In contrast to god terms are devil terms, and for Weaver, again writing in 1953, the ultimate devil term is "communist." From here, he can set up the language of the McCarthy era nicely, right? The Committee on Un-American Activities uses a powerful god term. These god terms and devil terms are ultimate terms that "impart to the other [terms] their lesser degree of force" and fix "the scale by which degrees of comparison are understood" (212), either positively or negatively (222). When you hear a god term or devil term, the defensive rhetorician must "hold a dialectic with himself" to see if he buys the word as it's being used.
There are also what Weaver calls "charismatic terms": terms that have "broken loose [from] referential connections" and whose users simply will that "they shall mean something" (e.g., "freedom") (227-8). These terms don't mean anything in particular, just "happy feeling." An "uncontested term," meanwhile, seems to invite a contest but isn't contested in its context (e.g., appealing to "illustrious Rome") (166). Such terms aren't really disputed. Ultimate terms like these are often "a single term [awaits] coupling with another term" (211).
Weaver was also influential in the rhetoric of poetics because he saw that, like poetry, rhetoric relies on the connotation of words as well as their denotation. That is to say, not just what the words mean in the dictionary, but what they mean to a community: "communist" to a group of 1953 American politicians is a far more fearful thing than its dictionary definition. Like poetry, too, rhetoric relies on the enthymeme, a truncated syllogism where the audience fills in the blanks, or as Weaver puts it, supplies "the missing proposition [...] 'in their hearts'" (174).
Good rhetoricians, he claimed, use poetic analogies to relate abstract ideas directly to the listeners (Young 132). Specifically focusing on metaphor, he found that comparison should be an essential part of the rhetorical process (Johannesen 23).
Weaver didn't produce more than a handful of books, possibly because he died quite suddenly in his fifties, but he had a lasting influence on the Chicago school and elsewhere. Weaver certainly wasn't a perfect person (for instance, he disliked jazz, and that is just plain wrong), and he's kind of gone out of favor, but reading The Ethics of Rhetoric, you see how crucial his ideas have been to the 20th-century revival of rhetoric.
Tue, 24 February 2015
Transcript on request.
Tue, 17 February 2015
Welcome to Mere Rhetoric. Or maybe welcome back, because last week we talked about John Dewey, and today we're talking about John Dewey again. You don't have to go back and listen to last week's episode on Dewey and aesthetics, but if you like this, Dewey Part the Deuce, then you might want to go check out the previous episode on Dewey and the artful life. Today, though, we get to talk about Dewey's political and educational contributions.
Dewey was a huge fan of democracy and of education for democracy. He said, “Democracy and the one, ultimate, ethical ideal of humanity are to my mind synonymous."
One scholar summarized Dewey's politics in this way: "First, Dewey believed that democracy is an ethical ideal rather than merely a political arrangement. Second, he considered participation, not representation, the essence of democracy. Third, he insisted on the harmony between democracy and the scientific method: ever-expanding and self-critical communities of inquiry, operating on pragmatic principles and constantly revising their beliefs in light of new evidence, provided Dewey with a model for democratic decision making…Finally, Dewey called for extending democracy, conceived as an ethical project, from politics to industry and society." Dewey was big on democracy. This idea, especially about participation in democracy instead of just representation, inspired much of his writing on education. The kind of progressive education that Dewey endorsed was education for democracy, education that focused on making students empathetic and engaged citizens.
Dewey's most articulate thinking about engaged democracy came about the way most good thinking does: in response to an interlocutor whose ideas make our blood boil. For Dewey, this was Walter Lippmann. The famous Lippmann-Dewey debates began in 1922, when Walter Lippmann wrote a book called Public Opinion. In Public Opinion, Lippmann says that democracy is demo-crazy: public opinion is actually shaped by advertisers and demagogues who can manipulate the public into thinking whatever they want. The people as a whole can't make any decision that hasn't already been made for them by sleazy Madison Ave. types. So Lippmann says that instead the government should be led by experts, preferably scientific and objective types who would be immune to propaganda. Instead of democracy romantically conceived, he suggested representation and political experts.
Well, this got Dewey's goat, and in The Public and Its Problems he responded to Lippmann's view of democracy. Instead of relying on experts for democracy, Dewey recommends that "it is not necessary that the many should have the knowledge and skill to carry on the needed investigations; what is required is that they have the ability to judge of the bearing of the knowledge supplied by others upon common concerns." Sure, he admitted, there could be ignorant publics swayed by propaganda, but the solution was not to toss the baby out with the sludgewater: education was what the populace needed if they were to engage in participatory democracy.
The Dewey-Lippmann debate has gotten a lot of press from recent rhetoricians. Search for it on Google Scholar and you'll find over a thousand entries since 2011. At the 2008 meeting of the Rhetoric Society of America, a "lively panel" discussion took place where, according to one witness, "Jean Goodwin effectively advanced journalist Walter Lippmann's critique of the 'omnicompetent' citizen against Robert Asen's John Dewey, who represented hope for collaborative dialogue." And at the most recent meeting of the Modern Language Association, another scholar pointed out how the Lippmann-Dewey debate relates to the current expert-laden political rhetoric. A recent collection of essays called Trained Capacities: John Dewey, Rhetoric, and Democratic Practice (Brian Jackson and Gregory Clark, eds.) also reminds us of the perennial importance of asking ourselves, "Are our citizens trained for democracy? Can they be?" The debate, so it seems, continues.
The kind of education you would need to participate in democracy includes not just information about the value of nuclear energy or the political history of the Middle East: you need to have some sense of how you fit into a democracy, what moral obligations you have, and what society can provide you.
For Dewey, America's ideal model of civic engagement wasn't a selfish, me-first mentality, but neither was it entirely collective and socialist. In Individualism Old and New, Dewey says it's time to move past the old, rugged, wild-west homesteader kind of individualism that the Americans he was writing to could possibly remember, or at least could remember stories of from their parents and grandparents. While his audience of early 20th-century Americans idealized that kind of independence, they were also increasingly aware of how connected they were. The experience of world war had taught them that "most social unifications come about in response to external pressure" (11) and through "personal participation in the development of a shared culture" (17). Defining that interconnectivity against the struggles and hardships of war and poverty may seem intuitive, but the move from frontier rugged individualism to an individualism that recognizes our interconnectivity is at the core of Dewey's political philosophy: "Each of us needs to cultivate his own garden. But there is no fence around this garden" (82).
Now, just so you know that last week's episode on the aesthetics of Dewey wasn't totally separate from this sort of thing, Dewey also talked about how that "shared culture" happens through art, and how that art educates, cultivating the skills necessary for democracy: "The art which our times needs in order to create a new type of individuality is the art which, being sensitive to the technology and science that are the moving force of our time, will envisage the expansive, the social culture which they may be made to serve" (49). Or, put another way, "The work of art is the truly individual thing" (81).
Sat, 7 February 2015
Today on Mere Rhetoric, we talk about John Dewey. John Dewey was a big ol' deal, even back in his day. Just after his death in 1952, Hilda Neatby wrote, "Dewey has been to our age what Aristotle was to the later Middle Ages, not just a philosopher, but the philosopher."
And what does a person have to do to be compared to Aristotle? I mean, to be compared in a serious way to Aristotle, because I'm like Aristotle in that, you know, I enjoy olive oil on occasion, not in that I'm the philosopher. I think one thing Neatby means is that Dewey was involved in everything. Just like Aristotle had a huge impact on politics, theology, science and rhetoric, John Dewey seemed to have a finger in every pie. By the time he died at age 92, he had written significantly on education, politics, art, ethics and sociology. But it's not enough to be a big freakin' deal a hundred years ago; Dewey is a big deal in rhetoric today. It's rare to search too many issues back in Rhetoric Review, Rhetoric Society Quarterly or Rhetoric and Public Affairs without hitting an article that is either directly about Dewey or draws on him, and books about Dewey are popping up all over the map. John Dewey is hot real estate.
So because John Dewey is such an important thinker for rhetoricians today, we have to take more time than just today to talk about him. That's right--a Mere Rhetoric two-parter. A to-be-continued. A cliffhanger. If a cliff can be carefully divided, I guess, and that division is this: today we'll talk about John Dewey's contribution to aesthetics, his book Art as Experience, and responses to that book from contemporary rhetoricians. Next week we'll talk more about his politics, the dream of his pragmatism, what he means by Individualism Old and New, and the famous Dewey-Lippmann debate. So that's what we'll be doing for the next two weeks. Let's get started on the first part of this Dewey-twoey.
Like many great thinkers, Dewey started his career by realizing that what he thought he wanted to do, he really, really didn't. In Dewey's case, it was education. It's ironic that Dewey became one of the 20th century's most important voices in education, because he did not teach secondary or primary school for longer than a couple of years each. Good thing he had a back-up plan as a major philosopher. He joined the ground floor of the University of Chicago and became one of the defining voices of the University of Chicago style of thinking, although he eventually left, somewhat acrimoniously, and taught at Columbia for the rest of his career. Somewhere along the way, he became president of the American Philosophical Association and published Art as Experience.
The title kind of gives away Dewey's claim: he situates art in the experience you have with art. As he says, "the actual work of art is what the product does with and in experience" (1). But he also means the opposite, that experience can be art. Instead of thinking of art as something that happens in rarefied situations behind glass and velvet ropes, Dewey opens up "art" to include popular culture, experiences with nature and even just a way of living.
Being in the moment is a big part of this artful living. If you're experiencing, or rather, to use the particular philosophical parlance Dewey insists on, "having an experience," then you are totally being in the moment: "only when the past ceases to trouble and anticipations of the future are not perturbing is a being wholly united with his environment and therefore fully alive. Art celebrates with peculiar intensity the moments in which the past reinforces the present and in which the future is a quickening of what now is" (17). In such a view, any time we live the moment artfully, in full presence of being, we're having an artful experience.
In having an experience, you have some sort of awareness and some kind of form.
As Dewey says, “art is thus prefigured in the very processes of life” (25).
This idea may sound radical. How can sitting in a crowded bus be art the way that the Mona Lisa is art? But Dewey is insistent. He sighs, “the hostility to association of fine art with normal processes of living is a pathetic, even a tragic, commentary on life as it is ordinarily lived” (27-28).
That's not to say that there can't be objects of art that concentrate the sensation of having an experience. But it's the whole experience that matters. For example, "Tintern Abbey" isn't really about Tintern Abbey any more than it's about Wordsworth and evenings and homecomings and 1798 and that sycamore and all of it. It expresses a complete experience of Wordsworth's. And that expression is always changing as times change. "The very meaning," Dewey writes, "of an important new movement in any art is that it expresses something new in human experience" (316). Meanwhile, the art remains after the moment passes and the movement becomes cliché. "Art is the great force in effecting [...] consolidation. The individuals who have minds pass away one by one. The works in which meanings have received objective expression endure. [...] every art in some manner is a medium of this transmission while its products are no inconsiderable part of the saturating matter" (340).
And the value of art is moral. First off, Dewey says that "the moral function of art itself is to remove prejudice, do away with the scales that keep the eye from seeing, tear away the veils due to wont and custom, perfect the power to perceive. The critic's office is to further this work, performed by the object of art" (338).
Pretty cool stuff, huh? But wait, there’s more. The process of having an experience, that complete being, has its own moral value, or so argues Scott Stroud in John Dewey and the Artful Life: Pragmatism, Aesthetics, and Morality. There he claims, “I want to examine how art can be seen as a way of moral cultivation” (3), because “At various places, Dewey’s work provides us with tantalizing clues to his real project--the task of making more of life aesthetic or artful” (5). Put in other words: “art can show individuals how certain value schemes feel, how behaviors affect people, etc.--in other words, art can force the reflective instatement (creation) of moral values” (9).
Stroud connects the pragmatists like Dewey with mysticism in Eastern philosophy and medieval monastic Christianity. Remember how Dewey is all about having an experience, really being in the moment? So Stroud says, “The way to substantially improve our experience is not by merely waiting for the material setup of the world to change, but instead lies in the intelligent altering of our deep-seated habits (orientations) toward activity and toward other individuals” (11).
“The important point,” writes Stroud, “is that attentiveness to the present is a vital way to cultivate the self toward the goal of progressive adjustment and it is also a vital means in the present to do so” (69).
For Stroud, as for Dewey, “the art object [...] [is] imbued with meaning partially by the actions of the artist, but also because of the crucial contributions of meaning that a common cultural background contributes to the activity of producing and receiving art objects” (97)--the way that the artistic object is received popularly and by critics. And to that aim, “criticism does more than merely tell one what an important work of art is or what impression was had; instead, it gives one a possible orientation that is helpful in ordering and improving one’s past and future experiences” (122). And in that, criticism, or even appreciation, is also a moral act.
Stroud’s argument has immediate application to the artful life. He ponders, “How can we render everyday communication, such as that experienced in mundane conversations with friends, cashiers, and so on, as aesthetic?” (170). To answer this, he draws on Dewey to suggest that we avoid focusing on a remote goal, cultivate habits of attending to the demands of the present communication situation, and fight against the idea of a reified, separate self (186-7).
Mon, 2 February 2015
I want you to do a little experiment for me. Think back to what you were writing five years ago. If you happen to be at your computer or the scrapbook of everything you’ve ever written, you can even pull up your writing. If not, just go ahead and meditate. Do you need a moment? It’s okay, I’ll wait. Now then—has your writing gotten better? Have you become a better writer?
If you’re like me, when you look at the things you were writing five years, or even a year, ago, you might say “yes” in very enthusiastic tones. If you’re like me, you might, in fact, have a hard time reading the work you did five years ago. How could I have been so stupid? How was I such a bad writer?
Lee Ann Carroll, in her book Rehearsing New Roles, has a shocking proposition for you: maybe you weren’t a bad writer; maybe you were just inexpert in writing the sort of things you write today. Carroll gathered up some college students and performed a longitudinal study, which means that she followed the same subjects around through their entire time at school and beyond. She had them sit down in interviews with her and fill out time logs detailing how much studying they did outside of class (in case you’re curious, the amount ranged from five hours a week to forty). They brought in their writing assignments, and their outside writing, to talk about. It was very thorough. And do you want to know her takeaway?
First off, that “students in college do not necessarily learn to write ‘better,’ but that they learn to write differently—to produce new more complicated forms addressing challenging topics with greater depth, complexity and rhetorical sophistication” (xiv). “Wait a moment,” you might say, “greater depth, complexity and rhetorical sophistication? Isn’t that just a fancy way of saying ‘better writing’?” Maybe it is, but it’s not that they’re getting better at this vague genre of “academic writing.” As Carroll puts it, “Their writing gets better in that they do learn to write differently but they do not fulfill the fantasy of mastering one kind of literacy, an idealized version of academic writing” (60).
This is the real-life experience of writing in the disciplines. A student gets into one class, learns the genres and expectations of that class, and then, right when she gets the hang of it, heads into another class. “Students’ literacy develops because students must take on new and difficult roles that challenge their abilities as writers. In fact, student writing may sometimes need to get ‘worse’ before it can become ‘better.’ Because many college writing tasks are essentially new to students, they will need repeated practice to become proficient” (9). How much do professors take this into consideration? Not very much.
The writing assignments that Carroll’s participants navigated were complex and sophisticated, but also very, very different from each other. She claims that “Faculty are likely to underestimate how much writing tasks differ from course to course, from discipline to discipline, and from professor to professor” (9). Put another way, “students must learn to write differently but have few opportunities to develop one particular type of writing over any extended period of time” (55).
And where does this leave first year composition? Carroll writes that we should take the work of first year composition seriously, but not “too seriously. A first-year composition course can serve students by helping them make connections between what they have already learned about writing in their k-12 education and ways they might learn to write differently both in the academy and as citizens of the larger society. On the other hand, first-year composition cannot succeed as a course that will teach students how to write for contexts they have not yet encountered. A one-semester writing course is best viewed as just one step in a long process of development that extends from children’s first encounters with literacy on through their adult lives” (27-28).
Carroll does have some practical recommendations. She suggests that first year composition focus on metacognitive awareness and students’ own writing as much as possible. You know, asking students things like “what do you do when you get an assignment prompt?” and discussing their own writing practice. She also recommends focusing on portfolios—in classes as well as in departments and programs. As much as possible, those portfolios should provide opportunities to return to similar genres as well as challenging students to try new things—remembering that the results will be less than perfect and that students will need plenty of specific feedback.
I find Carroll’s argument very persuasive, and as I’ve written and recorded more of these podcasts, I’ve noticed that this weird literary genre is becoming more comfortable for me. But it’s been more than a year! How many literacy projects do average undergraduates get to revisit over and over again? Is there a project you’ve mastered, or a project you thought you mastered (like the so-called reading response), only to discover that a different teacher had a different expectation? If so, drop us a line at firstname.lastname@example.org. I’d love to hear about it. Now go back and check out your writing from five years ago, because if it’s anything like mine, it’s pretty darn amusing and well worth a reread.
Wed, 21 January 2015
Welcome to Mere Rhetoric, a podcast for beginners and insiders about the ideas, people and movements who have shaped rhetorical history. I’m Mary Hedengren and, ah, here I am in my newly redecorated research cube. I’ve taped grey and yellow chevron wrapping paper over the old horrific 90s wallpaper and the books that completely fill my bookshelf are organized—somewhat. The tiny red and green Loeb editions look like Christmas decorations among the others and one whole shelf of books is tattooed with library barcodes. My door is propped open by the extra hard wood chair and is scrubbed clean—you almost can’t see the faint traces of pen from all of the strange graffiti, including one sloppy invitation for a previous occupant to get sushi. I’ve hung an orange-and-white abstract painting on the outside of the door and you can just see the corner of it from my seat. Why am I telling you about my cube in such detail? Because today we’re talking about Ekphrasis. Ekphrasis is the Greek term for description, a rich description that makes you see a scene before you in such detail that you feel like you’re actually there. Did it work? Did you imagine yourself in my cozy little cube?
Last week I talked about how there was a sculpture of kairos that someone had written a poem about, and I called it ekphrasis, but I may have given a very short definition of just what ekphrasis is. I’ve been thinking about ekphrasis for a long time, largely because of a 2009 book called Ekphrasis, Imagination and Persuasion in Ancient Rhetorical Theory and Practice. In this book, Ruth Webb seeks to rehabilitate ekphrasis from its long misuse. We think of ekphrasis as describing a subject matter—art—in poetic practice rather than as a method—bringing something “vividly before the eyes”—used for a variety of rhetorical purposes (1). When I first learned of ekphrasis, it was in a poetry class. The teacher showed us several poems that were written to describe pictures and then challenged us to find works of art that we could transfer into words. There are several famous poems that are ekphrasis. For example, do you remember Keats’ Ode to a Grecian Urn? Or William Carlos Williams’ poem about Landscape with Fall of Icarus? Perhaps one of the most famous examples of ekphrasis, for ancient and modern students, is the description of Achilles’ shield in Homer. In fact, Webb figures that shield led to this confusion of describing an artifact rather than just describing something.
Webb doesn’t just tell us what ekphrasis is not; she describes how the Progymnasmata, a series of educational exercises, and other student handbooks influenced the use and understanding of this tool, which permeated rhetorical life from the arts (168) to the law courts (89) to the forum (131). Ekphrasis, then, isn’t just an ornament or a figure of speech—Webb claims that it is a “quality of language” (105), something that allows listeners and readers to become what she calls “virtual witnesses” of people, places, and events (95). You can imagine how it would be useful to bring your listeners in to become “virtual witnesses” if you were, say, a lawyer painting a picture of the crime, or if you were a politician petitioning for more military spending by describing a pitiful defeat. Through ekphrasis, your listeners become shared participants in an experience. You recreate an experience so we’re all together for a moment, seeing the same thing, feeling—maybe—the same way. Ekphrasis brings people in with you.
Because ekphrasis is more than just an occasional strategy, Webb has to cover a lot of ground in her book. She begins by describing the context in which ekphrasis was named, admired and taught, back in ancient Greece where memory was always connected with imagery (25). “Seeing” something was critically connected with how you think and remember. For example, do you remember in a previous episode on canons, where we talked about how classical rhetors would create a place, say a palace, and then place facts around that palace so that they could visualize walking around to encounter the facts? It’s the same practice that popped up recently in an episode of the BBC series Sherlock. When you have a clear visual reminder of a place, an object, you can better remember the abstract principles or facts. Another reason why ekphrasis was central to the Greeks was because of the way people encountered composition: whether or not a speech was written down, it was almost always spoken aloud (26). When you’re listening rather than reading, it can be difficult to pay attention to long abstracts, but being invited into a visual scene is refreshing and entertaining. No TV, remember? This understanding of literacy may seem alien to modern readers, so Webb has to explain it explicitly.
Then she introduces ekphrasis to us the same way it was introduced to Greeks and Romans: through the Progymnasmata and other handbooks of instruction. In the pedagogical explanation, Webb emphasizes that ekphrasis was seen as formative for young learners, a tool to advance socially, and an absolutely transferable skill (47-51). Remember when we talked about the progymnasmata? The exercises that young Greek students went through? Well, ekphrasis was part of the progymnasmata exercises, and Webb says it was “the exercise which taught students how to use vivid evocation and imagery in their speeches” as “an effect which transcends categories and normal expectations of language” (53). She then gives readers a complete chapter discussing the subjects of ekphrasis that go beyond just descriptions of works of art and, in fact, often focus on narrative aspects (68-70). She really has to define the term because we have several hundred years of misdefinition of the term as only associated with art.
Webb also introduces us to two versions of ekphrasis: enargia, which makes “absent things present,” and phantasia, which she links to “memory, imagination and the gallery of the mind” (v). Here’s an example of enargia from Theon: “When I am lamenting a murdered man will I not have before my eyes all the things which might believably have happened in the case under consideration? […] Will I not see the blow and the victim falling to the ground? Will his blood, his pallor, his dying groans not be impressed on my mind? This gives rise to enargia, […] by which we seem to show what happened rather than to tell it and this gives rise to the same emotions as if we were present at the event itself” (qtd 94). Phantasia, on the other hand, is creation, which might include “mythical and fantastic beasts […] imagined through a process of synthesis, putting together man and horse” (119), for example, or it might just be creatively expanding on the details of what we aren’t told. Quintilian describes this in terms of a quote from Cicero: “Is there anyone so incapable of forming images of things that, when he read the passage in [Cicero’s] Verrines ‘the praetor of the Roman people stood on the shore dressed in slippers, wearing a purple cloak and long tunic, leaning on this worthless woman’ he does not only seem to see them, the place [..] but even imagines for himself some of those things which are not mentioned. I for my part certainly seem to see his face, his eyes, the unseemly caresses of both” (qtd 108). So there you have it. Ekphrasis can be about things that were or things that can be imagined. To use an example, enargia would describe a scene that was distant, like a visit to Disneyland, while phantasia would create a scene that was fictional, like developing a new Disney movie adaptation.
Webb’s book is certainly readable and her argument is very thorough, taking in a very large range of Classical civilization, spanning several hundred years and including both Eastern and Western Roman Empires. She also makes the convincing argument that ekphrasis was a little bit of the sublime that could be made an effective argument in almost any situation. Many texts that talk about rhetoric or poetics make the “audacious” claim that poetics can be rhetorical; Webb’s book seems to claim that the rhetorical was often poetic.
I’m especially interested in this ancient idea that one thing a rhetor needs to do is make the audience see it, to be there and experience the event or object—existing, historical, hypothetical, or fantastic—to be “virtual witnesses” of it for themselves. This seems to be an interesting link between a logos-centered viewpoint that admits only one clear interpretation of objective facts and the obvious realization that the audience was being brought into “worlds […] not real” (169). The audience readily gives itself up to the “willing suspension of disbelief” in order to feel, and experience, the fictive world (and no matter its veracity, the ekphrasis is always fictive, even when the object is before the audience) that the rhetor carefully creates through word choice and selective description. There’s something potentially deceptive about ekphrasis. And to make a clean breast of it, I’ve bamboozled you, because when I’m writing this, I’m not actually in my cube—I’m flying in a window seat with an orange sunset lighting up the cabin from over the north Pacific Ocean. Even worse, I haven’t even redecorated my research cube—yet. And I’m not sure where I’ll be when I actually record this episode. Right now, the scene I described so convincingly was a bald-faced…phantasia. But I made you a witness with me. Ekphrasis is so immersive that it can be hard to challenge it. It’s too bad that we don’t know more about how audiences were trained to read these ekphrases: the handbook information is wonderful for describing the theory and practice from the rhetor’s side, but what might be the equivalent for readers? How does an audience respond to ekphrasis? Should they be skeptical, or allow themselves to be swept away in the description and become willing witnesses? Hey, I don’t have the answer to this question. If you have thoughts on the proper way to respond to the ways that words create worlds, drop us a line at email@example.com.
Until then, I’ll be enjoying my nicely redecorated research cube. Maybe.
Wed, 14 January 2015
Remember back when we talked about kairos? Just to remind you, here’s a poem, a Greek poem translated by Jeffrey Walker, to explain. This poem is ekphrasis, a piece of writing that describes a piece of art, in this case a sculpture of Kairos done by Lysippos of Sicyon. The rest explains itself.
From where is your sculptor? Sicyon. What is his name?
Lysippos. And who are you? Kairos the all-subduer.
Why do you go on tiptoes? I’m always running. Why do you have
Double wings on your feet? I fly like the wind.
Why do you have a razor in your right hand? As a sign to men
That I’m sharper than any razor’s edge.
Why does your hair hang down in front? For him that meets me to grab,
By God. Why is the back part bald?
None that I have once passed by on my winged feet
May seize me, even if he wishes to.
Thus the artist fashioned me, for your sake,
Stranger, and placed me at the entrance as a lesson.
So here we have this figure of kairos, with a haircut that is party in the front and business in the back, and if you don’t grab him, too bad. It’s done. Game over, chance lost.
But then what? When you’ve missed your chance, what’s even left? Are you all alone as Kairos flits away?
Not really. The ancient Greeks created another figure, named Metanoia, to describe the deep regret that comes when there’s something you could have done and you missed the chance. Metanoia literally means afterthought, or after-mind, I guess, if you want to get picky about it. It’s similar to regret. As Kelly A. Myers put it in her Rhetoric Society Quarterly article, Metanoia was a figure that “resides in the wake of opportunity, sowing regret and inspiring repentance in the missed moment” (1). It is “a reflective act in which a person returns to a past event in order to see it anew” (8).
In Roman poetry, metanoia accompanies the god of opportunity in Ausonius’s epigrams. The first part of the epigram sounds very similar to the ekphrasis of kairos poem “who are you” “I’m opportunity” “why do you look so weird?” “seize the moment” etc. etc. but then the questioner turns to metanoia “please tell me who you are.” “I am a goddess to whom even Cicero himself did not give a name. I am the goddess who exacts punishment for what has and has not been done, so that people regret it. Hence my name is Metanoea.”
There’s something weirdly compensatory in this accusation against Cicero. Metanoia is such an important concept, Ausonius seems to say, that Cicero must have known it, must have felt it, but neglected to name it. Metanoia is out there, but understudied and ignored.
But we’ve all felt that regret, haven’t we? Me, personally, I get that feeling in the shower, when dumb things I’ve said, or witty comebacks I should have said come sweeping in on me. I’ve also heard people getting hit with metanoia when they’re trying to sleep or when they’re driving or when they’re staring into a beautiful tropical sunset. It makes you want to stab your eyes out.
So what was the purpose of metanoia? What did it accomplish to feel such crippling regret? Hopefully such reflection and regret means that next time around you do something different. Hopefully you change. This became a big deal as Christianity burst onto the scene. Metanoia became associated with a step of repentance, reflecting on the mistake you made before you can move forward. The New Testament uses metanoia as an “act of repentance that leads to spiritual conversion” (9).
As Kittel et al. describe it, metanoia “affects the whole man,” not just the brain.
It’s important that this emotional aspect of metanoia exists. Some sources point out that metanoia is always emotional as well as mental: it is a “change of mind and heart” (Liddell and Scott 1115), a “profound transformation of the epistemic orientation of the whole person” (Torrance 10). Myers points out that “metanoia is the affective dimension of kairos” (2).
Metanoia as a rhetorical figure really hit its stride in the Middle Ages and beyond. Visual representations of metanoia became as common as those of kairos. Metanoia stuck with kairos, showing up in the Middle Ages and the Renaissance, sometimes as a beautiful young woman and sometimes as a vengeful hag. Here’s the thing: there’s a moment of indecision, a Schrödinger’s Cat moment where you don’t know whether you will seize the moment or live to regret it. As Myers says, “once a decision has been made or missed, the two part ways, but before that crucial moment they stand together” (4).
So when you think of kairos, think of the inverse as well, the potential for deep abiding regret that makes you want to burn your high school yearbook. But remember metanoia in the moment that comes: not just regretting what is past, but looking at where you are now and making sure you make the right choice right now, so that you don’t have to regret it later.