Thu, 16 April 2015
Show Notes (transcript available upon request):
Why and how does rhetoric break down? For Wayne Booth the issue is that there has been a loss of faith in the idea of good reasons--that we can, indeed, persuade each other to change minds. The crucial assertion is that we are able to change minds.
Modern dogmas--either scientism or romanticism--assert that "the purpose of offering reasons ... cannot be to change men's minds in the sense of showing that one view is genuinely superior to another" but that it all must be trickery (87). Because of the dogmas of modernism, "what had once been a domain with many grades of dubiety and credibility now becomes simply the dubious (for scientism) or the arena of conflicting faiths (for irrationalism)" (89). The poster boy for these--conflicting enough--positions is Bertrand Russell, or--rather--Bertrand Russells. Booth splits Russell's work into three parts: Russell I, "the genius of mathematical logic" who was all into proof and facts; Russell II, who "tried to disestablish certain past beliefs and establish the more adequate beliefs" of science; and Russell III, who was "the man of action and passion, the poet and mystic" (46-7). Both the completely, sterilely rational and the impassioned romantic are part of the modernist perspective that can undermine rhetoric.
The crazy thing is that "Not only do we talk and write and create art and mathematical systems and act as if we shared them: we really do share them, sometimes. Sometimes we understand each other" (113).
Booth can take it a step further and say that not only do we understand each other, but we actually make each other. We "successfully infer other human beings' states of mind from symbolic clues," but also we "characteristically, in all societies, build each other's minds" (114). This is, in fact, "the supreme purpose of persuasion"--to "engage in mutual inquiry or exploration"--and rhetoricians should be committed to learning "whatever conditions make such mutual inquiry possible" (137). "Rhetoric is a supremely self-justifying activity for man only when those engaged in it fully respect the rules and the steps of inquiry" (138). In the realm of rhetorical inquiry, "we can add value fields that modernism would exclude: in love by lovers, in gastronomy by gourmets, in every kind of value by those who have come to know a good reason from a bad" (143)--in short, what I have called, before, untenetable claims.
The way to do this is through--surprise--thoughtful dialogue. "As I do so I will know that the justice of my action is determined by whether what looks like good reasons" are, in fact, good ones (149). We must "somehow constitute [society] as a rhetorical field" (149). Ultimately, "it is not a comfortable community nor a stable one. Even those who join it consciously and systematically, as we all do by talking together here, cannot provide a convenient list of gods and devils, friends and enemies. But at the same time it can give us some ease in whatever subcommunity we have already assented to" (203).
There's also a great part on the rhetoric of poetics and narrative, which I could include in a rhetoric of poetics course--"story as reasons": "Every kind of argument that anyone could ever use in real life might be used in a narrative work and it could presumably carry as much force one place as another" (181).
"if there are good reasons for confidence in the values of discoursing together, then we can get about our business, whatever that may be" (100)
"truth is not always on the side of the rebel"..."simply to say no when everyone else is saying no is just another form of group compliance, a disguised and therefore feeble yes" (195)
Motivism is a dogma "not because I think that all or most value choices are made on the basis of fully conscious and 'scientifically cogent reasoning' but because I find many people assuming, without argument, that none of them ever can be. 'Look for the secret motive'" (25). "In practice, motivism has often led to a cutting down of man's aspirations and capacities to the 'merely animal' or, in a natural further step, to the chemical or physical" (29)
Fri, 10 April 2015
Transcript available upon request. firstname.lastname@example.org
Wed, 1 April 2015
Transcript available upon request.
Mon, 23 March 2015
MOST mornings, I wake up, put on some stretchy pants and a very bright t-shirt and strap on my phone for a run, because for some reason you need a phone to go running. Why do I do that? Is it because I am a master of my fate, and I choose my fate to be sweaty and singing along to Shakira through the wilderness trails near my home? Or is it because I am being influenced by the institutions of the beauty industry, the fitness industry, the nature industry and the Spandex industry to conform to a certain predictable type, which happens to include skipping over rocks and dirt while a GPS tracks my every step?
Pierre Bourdieu—not to be confused with Bordeaux—is convinced that it’s neither just my free will nor entirely just social structures that make me go for a run, but a continuous give-and-take between them. What I think I want to do is shaped by past events and institutions that are in turn influenced by what I choose to do. Because I choose to wake up and run, I get feedback from structures that reinforces what I think of as my choice to wake up and run. This combination of choice and social structure Bourdieu calls habitus, and it’s his most famous contribution to rhetoric and to sociology.
Habitus is a combination of deep-rooted, even unconscious, desires and what we choose to desire, which has been formed from childhood. It is, as Bourdieu often described it, “the feel for the game.” I don’t know how to articulate how and why I run, but I know it’s something I do, because it’s also something that my society does.
Sometimes it’s hard to see how institutions support a habitus unless you see the opposite, something that happened to my sister when she was doing medical surveys in a very remote village in Tanzania. She woke up one morning and went for a run—and flummoxed the villagers. “What are you running from?” said one person, huffing up beside her. “Nothing,” she said. “I’m just running.” “Why?” “I don’t know—to burn calories maybe?” The villager, who had been working with her on, among other things, questions of nutrition, paused a moment and then asked incredulously, “You want to burn calories?”
The feel for the game that my sister had was for a totally different game than the one that made sense in a small fishing village struggling to get and keep calories rather than burn them. The feel for the game wasn’t something that my sister consciously set out to learn, and it was only when she bumped against a different rule that she noticed she was doing something wrong. Habitus is created and reproduced unconsciously, “without any deliberate pursuit of coherence,” as Bourdieu says, “without any conscious concentration” (ibid: 170).
Although our habitus can cause embarrassing mismatches when we’re in a different culture, it is adept at taking us through our native environments, as we play the game around us like insiders.
Playing the game like an insider was a really important thing for young Pierre Bourdieu, who came from a working-class family in southern France. Southern France is, like, the sticks for French people, and his family had a strong accent, both in the lilt of their speech and in the things they believed were important. Going to study in Paris certainly would have highlighted the differences between his home culture and the elite intellectual world. That elite intellectual world became a sociological phenomenon for Bourdieu, as exotic and interesting as the Algerian tribes he did his fieldwork with. The ways that elite intellectuals used language, used taste, used culture became the basis of his landmark book Distinction: A Social Critique of the Judgment of Taste, which was published in 1979. Distinction highlights the way that the elite create an insider habitus. As Bourdieu says, “symbolic goods, especially those regarded as the attributes of excellence, [are] the ideal weapon in strategies of distinction.” Whether you do or don’t like opera, whether you do or don’t see running as recreation, whether you do or don’t value certain foods, cultures, presentations or any other type of distinction--each of these creates the social-class fracture that distinguishes the upper class from the middle, or from the very upper. Are snails a pest or a delicacy? That sort of thing. People don’t even ask why they think opera is just good music or snails are just plain tasty, because it’s been deeply ingrained in their lives almost from birth.
This so-called social capital is learned from a very young age as part of your habitus, and if you grow up knowing that you don't kick a dressage horse after a failed pas de deux, you live in a very different world than someone who grew up knowing that you don’t kick a man when he’s down.
But none of this is to say that once you’re in (or out) you’re stuck. According to one commentator on Bourdieu, habitus “is not fixed or permanent, and can be changed under unexpected situations or over a long historical period” (Navarro). Somewhere along the way, for example, the elite picked up jazz over opera as the preferred music, and snails gave way to craft beer and high-end cupcakes and—going for runs? Not only can the cultures and institutions switch, but people can switch, too—Bourdieu, for all of his criticism of the elite, ended up in the in-group of often-cited scholars in rhetoric, philosophy and sociology. And I used to hate running. But here I am, waking up most mornings to put on some stretchy pants and a very bright t-shirt and strap on my phone for a run, because for some reason you need a phone to go running.
If you have a deeply ingrained habitus, why not tell us about it at email@example.com? We’d love to hear from you and any other comments or ideas you may have, including distance running tips, because there seem to be a disproportionate number of distance runners in higher education. Must be something about the habitus…
Mon, 16 March 2015
A few weeks ago I was at an excellent lecture by Collin Brooke here at the University of Texas, and he was talking about applying the master tropes to different models of networks. Then I thought--by Jove, the Master Tropes! What a brilliant idea for a podcast! So with all deference to Dr. Brooke, let’s dive into these four beauties of the world of tropes.
A trope, you may or may not know, is a way of presenting thought in language. A trope is different from what’s called a figure because it doesn’t deal with arranging words, but rather with arranging thought. For example, a figure might be something like hyperbaton, which is the way that Yoda talks: “Patience you must have” just means “you must have patience.” There’s no change in the thought behind the words, but the refiguring of the words creates interest, so Yoda says things like “Miss them do not” instead of “do not miss them,” but the ideas aren’t changed at all. That’s figures.
Occasionally, though, Yoda will use a trope. For example, once he said, “In a dark place we find ourselves, and a little more knowledge lights our way.” This is, as it turns out, a metaphor: knowledge doesn’t actually cast a glow, but it does make things metaphorically clear. The words transform the ideas: light equals knowledge. It’s not that Yoda changed the words around--all things considered, this is pretty syntactically straightforward for the sage-green sage--but he’s presented the ideas in a different way. This is a trope, not a figure.
It is, as a matter of fact, one of the four master tropes: metaphor, synecdoche, metonymy and irony. It’s possible that these terms aren’t familiar to you, or only in a vague, AP English sort of way, so let me provide examples and definitions. Metaphor is the trope that is most familiar to us: knowledge is light, the Force is a river, many stormtroopers are a wall. So I’m going to skip over that. Synecdoche is--aside from being difficult to pronounce--using the part to represent the whole. I always think of that movie Synecdoche, New York, where the guy builds a replica of New York for a movie. The standard examples include things like “earning your bread and butter,” when you’re hopefully earning much more than that, or “putting boots on the ground,” when the military often needs soldiers, too, to fill those boots. I used to joke with my Mormon comedy group that since everyone prays to “bless the hands that prepared this food,” if there were a terrible accident in the kitchen and everyone died, at least the hands would be preserved. So you get the idea. Metonymy can sometimes be a little more confusing because it, like synecdoche, involves using a word associated with the idea to stand in for the idea itself. We say things like “the White House has issued a statement,” when the building itself has done no such thing, or “Hollywood is corrupt,” to represent the movie business generally. Some people will say that synecdoche is just a specific kind of metonymy, like how simile is a specific kind of metaphor. Finally, irony may seem like a simple, straightforward trope, but it can be notoriously complex, as Wayne Booth describes in greater detail in The Rhetoric of Irony. How do we know when someone is being ironic? How much is irony dependent on understanding cultural cues? Why do we say the opposite of what we mean as a way to say what we want? Tricky stuff all around.
The four master tropes are probably most familiar to rhetoricians from the essay found way in the back of Kenneth Burke’s A Grammar of Motives, way way back as an appendix. There, Burke equates these over-arching tropes with different epistemic perspectives: metaphor correlates with perspective, metonymy with reduction, synecdoche with representation, and irony with dialectic. The way that we construct thought depends on how we use these four master tropes.
Remember when we talked about the metaphors we live by? Well, Burke says that we don’t just live by metaphors individually, but also by the idea of metaphor, or by reduction, representation or dialectic. The tropes, instead of just being a way to make your writing more flowery, can be a critical part of invention, and of how you see the world more generally. Are you inclined to think inductively, looking at a couple of examples of Sith lords and thereafter making generalizations about the group as a whole and their capacity to run a competent daycare? It’s possible to think in terms of irony, transposing one view of truth with an antithetical perspective: can Anakin be both on the dark side and not on the dark side? Can you both do and do not if you only try? These master tropes are not just ways of expressing ideas about the world, but of coming to make ideas as well.
I’m a huge fan of Burke, but I’m afraid that I can’t give him credit for coming up with the idea of four master tropes that encompass other ways of figuring ideas. I’m sorry to say that that distinction goes to--ew--Petrus Ramus. Yes, Ramus, the mustache-twirling villain of rhetoric himself. Back when we did our series on the villains of rhetoric, Ramus was public enemy number one, removing invention from rhetoric and diminishing the whole affair to a series of branching “yes and no” questions and needless ornamentation. And yet it was Ramus, in his eagerness to classify everything into categories and subcategories who coined the idea of the master tropes back in 1549. Fortunately the idea was taken up by a more palatable figure of rhetorical history, Giambattista Vico, who in the 18th century, identified the master tropes as basic tropes, or fundamental tropes, being those to which all others are reducible.
Since Burke, though, others have taken up the idea that these tropes of arranging ideas might become ways to think about the world in general. Hayden White, for instance, saw the master tropes as representing something about literature.
He constructed a table where each trope has its own genre, worldview and ideology. Metaphor, for instance, was about romance--or we might say fantasy--and was associated with formism and an ideology of anarchism, because anything might apply as a metaphor. Metonymy was associated with comedy, organicism and conservatism--presumably because if you assume that “the White House” speaks for the country, you’re putting a lot of stock in the traditional power that dominates. Conversely, synecdoche was associated with tragedy, mechanism and radicalism. Irony, naturally enough, was the trope of satire and its worldview of contextualism and liberalism. Once White had come up with this tidy table, he began to think about the tropes not just statically, but about how they might evolve temporally, both in terms of an individual child’s development and in a civilization.
Metaphor was the earliest stage, corresponding to infants up to two years old and aligned with Foucault’s conceptualization of the Renaissance. Then metaphor gives way to metonymy, the domain of children from 2-7, which White lines up with the Classical period and the Enlightenment--very conservative and fond of straightforward comedy. Next comes the synecdoche of tweens and the modernist period--radically breaking from the past--and finally, in a crowning achievement, irony, the stage of teenagers and adults, corresponding to the postmodernist era, with its love of counterintuitive and contradictory thought.
Others have highlighted the philosophical or historiographical possibilities of the master tropes, including Jakobson and Foucault himself. Which brings me back to this fascinating, exploratory lecture by Collin Brooke.
Brooke suggested another correlation for the master tropes: not ways of thinking or periods of time, but networks of connection. Networks are a big stinking deal for digital humanists and new media rhetoricians like Brooke, and some of the different types of networks, Brooke proposes, may correlate to the master tropes. Hierarchies, for instance, are like metaphors, which correspond across groups--the Padawan learner doesn’t really tell us much about the Jedi master who trains her, but you expect the role of that Padawan learner to be similar to the role of another Padawan who studies under another master. Synecdoche, though, can be seen in truly random networks: a network of 200 that is truly random is representative of a network of 2,000, or of 2 million. Some networks are neither analogous like metaphor nor random like synecdoche. In situations that produce what’s been called the long tail--citations, for example--some groups or people are more popular because they are more popular. The more people who fear Jabba the Hutt--peons, bounty hunters--the more he is feared. It creates a snowball effect that is similar to metonymy. Brooke’s ideas are inchoate, and he admits that he’s not sure what network might correlate to irony--it’s all a work in progress, after all--but it goes to show that the organizational appeal of the master tropes continues. The idea of tropes that rule all the other tropes and say something meaningful about the ways in which we construct and understand the world around us has a timeless appeal that goes all the way back to Vico--er, let’s just say Vico, okay. Until next week--miss us you must not, because patience you must have.
Thu, 5 March 2015
Welcome to Mere Rhetoric, a podcast for beginners and insiders about the ideas, terms and movements that have shaped rhetorical history. I’m Mary Hedengren, and today we’re talking about two influential chapters from one book: Richard Weaver’s The Ethics of Rhetoric.
The Ethics of Rhetoric was written in 1953, and it definitely feels like it; Weaver was Southern, and it definitely feels like that, too. Even though he spent most of his career at the University of Chicago, with Wayne Booth, he kept his summers free to go down to a farm he kept, where he lived an agrarian dream of plowing the family vegetable garden with a mule. He definitely believed in the Jeffersonian ideal of the gentleman farmer, connected to the earth.
Somehow in the middle of all that plowing, Weaver was able to be one of the most important of the “new conservative” branch of thinkers and the leading neo-Platonist rhetorician of the 20th century. Weaver also thought about rhetoric somewhat idealistically. Rhetoric, he said, “instills belief and action” through “intersect[ing] possibility with the plan of actuality and hence of the imperative” (28). Rhetoric is “a process of coordination and subordination […] very close to the essential thought process” (210). Thought and rhetoric were interwoven, and rhetoric couldn’t be ignored.
There are two chapters in The Ethics of Rhetoric which have had especially lasting influence. The first is a reading of the Phaedrus, because Weaver loved him some Plato. Remember when we talked about the Phaedrus? For those of you who weren’t here, it’s a dialogue in which Socrates gives two opposite speeches about love: in the first, he tweaks an existing speech about the importance of choosing someone who doesn’t love you as your lover; in the second, he repents of the first and gives a speech about how it is good to have a lover who loves you; and at the end, he ends up talking about rhetoric. Some people may say, “What? What’s the connection?” Not Weaver. Weaver says that, “beginning with something simple,” Plato’s dialogues “pass to more general levels of application” and then end up in allegory (4). The lovers are like rhetoric—you can have good, bad and impotent rhetoric. The non-lover is a lie, like “semantically purified speech” (7). Bad rhetoric, like a bad lover, seeks to keep the recipient weak and passive (11). If we have impure motives towards our audience, we’ll keep them dependent on us, weak and passive, instead of empowering them the way that a true lover would. Ultimately, Weaver believed in an ideal of rhetoric, rhetoric that would make people “better versions of themselves” (Young 135).
Another one of Weaver’s chapters to have lasting influence classifies the very words we use, most famously into “god terms” and “devil terms.” God terms are those words that, for a specific audience, are so positive and influential that they can overpower a lot of other language or ideas. For Weaver, writing in 1953, “American” is one of the key political god terms. In contrast to god terms are devil terms, and for Weaver, writing in 1953, the ultimate devil term is “communist.” From here, he can set up the language of the McCarthy era nicely, right? The Committee on Un-American Activities uses a powerful god term. Most famously, Weaver introduces god terms and devil terms as ultimate terms that “impart to the other [terms] their lesser degree of force and fix the scale by which degrees of comparison are understood” (212), either positively or negatively (222). When you hear a god or devil term, the defensive rhetorician must “hold a dialectic with himself” to see if he buys the word as it’s being used.
Additionally, charismatic terms are those terms that have “broken loose [from] referential connections” and for which we simply will that “they shall mean something” (e.g. “freedom”) (227-8). These terms don’t mean anything in particular, just “happy feeling.” An uncontested term, meanwhile, seems to invite a contest, but not in its context (e.g. appealing to “illustrious Rome”) (166); it isn’t really disputed. Ultimate terms like these often appear as “a single term [awaiting] coupling with another term” (211).
Weaver was also influential in the rhetoric of poetics because he saw that “Like poetry, rhetoric relies on the connotation of words as well as their denotation.” That is to say, not just what the words mean in the dictionary, but what they mean to a community—“communist” to a group of 1953 American politicians is a far more fearful thing than its dictionary definition. Like poetry, too, there must be an enthymeme, a truncated syllogism where the audience fills in the blanks, or as Weaver puts it, “The missing proposition […] ‘in their hearts’” (174).
Good rhetoricians, he claimed, use poetic analogies to relate abstract ideas directly to the listeners (Young 132). Specifically focusing on metaphor, he found that comparison should be an essential part of the rhetorical process (Johannesen 23).
Weaver didn’t produce more than a handful of books, possibly also because he died quite suddenly in his fifties, but he had a lasting influence on the Chicago school and elsewhere. Weaver certainly wasn’t a perfect person—for instance, he disliked jazz, and that is just plain wrong—and he’s kind of gone out of favor, but reading The Ethics of Rhetoric, you see how crucial his ideas have been to the 20th century revival of rhetoric.
Tue, 24 February 2015
Transcript on request.
Tue, 17 February 2015
Welcome to Mere Rhetoric. Or maybe welcome back, because last week we talked about John Dewey, and today we’re talking about John Dewey again. You don’t have to go back and listen to last week’s episode on Dewey and aesthetics, but if you like this, Dewey part the Deuce, then you might want to go check out the previous episode on Dewey and the artful life. Today, though, we get to talk about Dewey’s political and educational contributions.
Dewey was a huge fan of democracy and of education for democracy. He said, “Democracy and the one, ultimate, ethical ideal of humanity are to my mind synonymous."
One scholar summarized Dewey’s politics in this way: “First, Dewey believed that democracy is an ethical ideal rather than merely a political arrangement. Second, he considered participation, not representation, the essence of democracy. Third, he insisted on the harmony between democracy and the scientific method: ever-expanding and self-critical communities of inquiry, operating on pragmatic principles and constantly revising their beliefs in light of new evidence, provided Dewey with a model for democratic decision making…Finally, Dewey called for extending democracy, conceived as an ethical project, from politics to industry and society.” Dewey was big on democracy. This idea, especially about participation in democracy instead of just representation, inspired much of his writing on education. The kind of progressive education that Dewey endorsed was education for democracy, education that focused on making students empathetic and engaged citizens.
Dewey’s most articulate thinking about engaged democracy comes as most good thinking does: in response to an interlocutor whose ideas make our blood boil. For Dewey this was Walter Lippmann. The famous Lippmann-Dewey debates began in 1922, when Walter Lippmann wrote a book called Public Opinion. In Public Opinion, Lippmann says that democracy is demo-crazy--public opinion is actually shaped by advertisers and demagogues who can manipulate the public into thinking whatever they want. The people as a whole can’t make any decision that hasn’t already been made by sleazy Madison Ave. types. So Lippmann says that instead the government should be led by experts, preferably scientific and objective types who would be immune to propaganda. Instead of democracy romantically conceived, he suggested representation and political experts.
Well, this got Dewey’s goat, and in The Public and its Problems, he responded to Lippmann’s view of democracy. Instead of relying on experts for democracy, Dewey recommends that “it is not necessary that the many should have the knowledge and skill to carry on the needed investigations; what is required is that they have the ability to judge of the bearing of the knowledge supplied by others upon common concerns.” Sure, he admitted, there could be ignorant publics swayed by propaganda, but the solution was not to toss the baby with the sludgewater--education was what the populace needed if they were to engage in participatory democracy.
The Dewey-Lippmann debate has gotten a lot of press from recent rhetoricians. Search for it on Google Scholar and you’ll find over a thousand entries since 2011. In the 2008 meeting of the Rhetoric Society of America, a “lively panel” discussion took place where, according to one witness, “Jean Goodwin effectively advanced journalist Walter Lippmann’s critique of the “omnicompetent” citizen against Robert Asen’s John Dewey, who represented hope for collaborative dialogue.” And at the most recent meeting of the Modern Language Association, another scholar pointed out how the Lippmann-Dewey debate relates to the current expert-laden political rhetoric. A recent collection of essays called Trained Capacities: John Dewey, Rhetoric, and Democratic Practice (Brian Jackson and Gregory Clark, eds.) also reminds us of the perennial importance of asking ourselves, “Are our citizens trained for democracy? Can they be?” The debate, so it seems, continues.
The kind of education you would need to participate in democracy includes not just information about the value of nuclear energy or the political history of the Middle East: you need to have some sense of how you fit into a democracy, what moral obligations you have and what the society can provide you.
For Dewey, America’s ideal model of civic engagement wasn’t a selfish, me-first mentality, but neither was it entirely collective and socialist. In Individualism Old and New, Dewey says it’s time to move past the old, rugged, wild-west homesteader kind of individualism that the Americans he was writing to could possibly remember, or at least could remember stories of from their parents and grandparents. While his audience of early 20th century Americans idealized that kind of independence, they were also increasingly aware of how to connect. The experience of world war had taught them that “Most social unifications come about in response to external pressure” (11) and through “personal participation in the development of a shared culture” (17). Defining that interconnectivity against the struggles and hardships of war and poverty may seem intuitive, but the move from frontier rugged individualism to an individualism that recognizes our interconnectivity is at the core of Dewey’s political philosophy. “Each of us needs to cultivate his own garden. But there is no fence around this garden” (82).
Now, just so you know that last week’s episode on the aesthetics of Dewey wasn’t totally separated from this sort of thing, Dewey also talked about how that “shared culture” happens through art, and how this art educates, cultivating the skills that are necessary for democracy: “The art which our times needs in order to create a new type of individuality is the art which, being sensitive to the technology and science that are the moving force of our time, will envisage the expansive, the social culture which they may be made to serve” (49). Or, another way, “The work of art is the truly individual thing” (81).
Sat, 7 February 2015
Today on Mere Rhetoric, we talk about John Dewey. John Dewey was a big ol’ deal, even back in his day. Just after his death in 1952, Hilda Neaby wrote, “Dewey has been to our age what Aristotle was to the later Middle Ages, not just a philosopher, but the philosopher.”
And what does a person have to do to be compared to Aristotle? I mean, to be compared in a serious way to Aristotle, because I’m like Aristotle in that, you know, I enjoy olive oil on occasion, not in that I’m the philosopher. I think one thing Neaby means is that Dewey was involved in everything. Just as Aristotle had a huge impact on politics, theology, science and rhetoric, John Dewey seemed to have a finger in every pie. By the time he died at age 92, he had written significantly on education, politics, art, ethics and sociology. But it’s not enough to be a big freakin’ deal a hundred years ago--Dewey is a big deal in rhetoric today. It’s rare to search too many issues back in Rhetoric Review, Rhetoric Society Quarterly or Rhetoric and Public Affairs without hitting on an article that is either directly about Dewey or draws on him, and books about Dewey are popping up all over the map. John Dewey is hot real estate.
So because John Dewey is such an important thinker for rhetoricians today, we have to take more time than today to talk about him. That’s right-- a Mere Rhetoric two-parter. A to-be-continued. A cliffhanger. If that cliff is carefully divided, I guess, and that division is this: today we’ll talk about John Dewey’s contribution to aesthetics, his book Art as Experience and responses to that book from contemporary rhetoricians. Next week we’ll talk more about his politics, the dream of his pragmatism, what he means by Individualism Old and New and the famous Dewey-Lippmann debate. So that’s what we’ll be doing the next two weeks. So let’s get started on the first part of this Dewey-twoey.
Like many great thinkers, Dewey started his career by realizing that what he thought he wanted to do, he really, really didn’t. In Dewey’s case it was education. It’s ironic that Dewey became one of the 20th century’s most important voices in education because he did not teach secondary or primary school for longer than a couple of years each. Good thing he had a back-up plan as a major philosopher. He joined the ground floor of the University of Chicago and became one of the defining voices of the University of Chicago style of thinking, although he eventually left, somewhat acrimoniously, and taught at Columbia for the rest of his career. Somewhere along the way, though, he became president of the American Philosophical Association and published Art as Experience.
The title kind of gives away Dewey’s claim--he situates art in the experience which you have with art. As he says “the actual work of art is what the product does with and in experience” (1). But he also means the opposite, that experience can be art. Instead of thinking of art as something that happens in rarified situations behind glass and velvet ropes, Dewey opens up “art” to mean popular culture, experiences with nature and even just a way of living.
Being in the moment is a big part of this artful living. If you’re experiencing or rather, to use the particular philosophical parlance Dewey insists on, “having an experience,” then you are totally being in the moment: “only when the past ceases to trouble and anticipations of the future are not perturbing is a being wholly united with his environment and therefore fully alive. Art celebrates with peculiar intensity the moments in which the past reenforces the present and in which the future is a quickening of what now is” (17). In such a view, any time we live the moment artfully, in full presence of being, we’re having an artful experience.
In having an experience, you have some sort of awareness and some kind of form.
As Dewey says, “art is thus prefigured in the very processes of life” (25).
This idea may sound radical. How can sitting in a crowded bus be art the way that the Mona Lisa is art? But Dewey is insistent. He sighs, “the hostility to association of fine art with normal processes of living is a pathetic, even a tragic, commentary on life as it is ordinarily lived” (27-28).
That’s not to say that there can’t be objects of art that concentrate the sensation of having an experience. But it’s the whole experience. For example, “Reflections on Tintern Abbey” isn’t really about Tintern Abbey any more than it’s about Wordsworth and evenings and homecomings and 1798 and that sycamore and all of it. It expresses a complete experience of Wordsworth. And that expression is always changing as times change. “The very meaning,” Dewey writes, “of an important new movement in any art is that it expresses something new in human experience” (316). Meanwhile, the art remains after the moment passes and the movement becomes cliche: “Art is the great force in effecting [...] consolidation. The individuals who have minds pass away one by one. The works in which meanings have received objective expression endure. [...] every art in some manner is a medium of this transmission while its products are no inconsiderable part of the saturating matter” (340).
And the value of art is moral. First off, Dewey says that “The moral function of art itself is to remove prejudice, do away with the scales that keep the eye from seeing, tear away the veils due to wont and custom, perfect the power to perceive. The critic’s office is to further this work, performed by the object of art” (338).
Pretty cool stuff, huh? But wait, there’s more. The process of having an experience, that complete being, has its own moral value, or so argues Scott Stroud in John Dewey and the Artful Life: Pragmatism, Aesthetics, and Morality. There he claims, “I want to examine how art can be seen as a way of moral cultivation” (3), because “At various places, Dewey’s work provides us with tantalizing clues to his real project--the task of making more of life aesthetic or artful” (5). Put in other words: “art can show individuals how certain value schemes feel, how behaviors affect people, etc.--in other words, art can force the reflective instatement (creation) of moral values” (9).
Stroud connects the pragmatists like Dewey with mysticism in Eastern philosophy and medieval monastic Christianity. Remember how Dewey is all about having an experience, really being in the moment? So Stroud says, “The way to substantially improve our experience is not by merely waiting for the material setup of the world to change, but instead lies in the intelligent altering of our deep-seated habits (orientations) toward activity and toward other individuals” (11).
“The important point,” writes Stroud, “is that attentiveness to the present is a vital way to cultivate the self toward the goal of progressive adjustment and it is also a vital means in the present to do so” (69).
For Stroud, as for Dewey, “the art object [...] [is] imbued with meaning partially by the actions of the artist, but also because of the crucial contributions of meaning that a common cultural background contributes to the activity of producing and receiving art objects” (97)--the way that the artistic object is received popularly and by critics. And for that aim, “criticism does more than merely tell one what an important work of art is or what impression was had; instead, it gives one a possible orientation that is helpful in ordering and improving one’s past and future experiences” (122). And in that, criticism, or even appreciation, is also a moral act.
Stroud’s argument has immediate application to the artful life. He ponders, “How can we render everyday communication, such as that experienced in mundane conversations with friends, cashiers, and so on, as aesthetic?” (170). To answer this, he draws on Dewey to suggest that we avoid focusing on a remote goal, cultivate habits of attending to the demands of the present communication situation, and fight against the idea of a reified, separate self (186-7).
Mon, 2 February 2015
I want you to do a little experiment for me. Think back to what you were writing five years ago. If you happen to be at your computer or the scrapbook of everything you’ve ever written, you can even pull up your writing. If not, just go ahead and meditate. Do you need a moment? It’s okay, I’ll wait. Now then—has your writing gotten better? Have you become a better writer?
If you’re like me, when you look at the things you were writing five years, or even a year, ago, you might say “yes” in very enthusiastic tones. If you’re like me, you might, in fact, have a hard time reading the work you did five years ago. How could I have been so stupid? How was I such a bad writer?
Lee Ann Carroll, in her book Rehearsing New Roles, has a shocking proposition for you: maybe you weren’t a bad writer, maybe you were just inexpert in writing the sort of things you write today. Carroll gathered up some college students and performed a longitudinal study, which means that she followed the same subjects around through their entire time at school and beyond. She had them sit down in interviews with her and fill out time logs detailing how much studying they did outside of class (in case you’re curious, the amount ranged from five hours a week to forty). They brought in their writing assignments and their outside writing to talk about. It was very thorough. And do you want to know her takeaway?
First off, that “students in college do not necessarily learn to write ‘better,’ but that they learn to write differently—to produce new more complicated forms addressing challenging topics with greater depth, complexity and rhetorical sophistication” (xiv). “Wait a moment,” you might say, “greater depth, complexity and rhetorical sophistication? Isn’t that just a fancy way of saying ‘better writing’?” Maybe it is, but it’s not that they’re getting better at this vague genre of “academic writing.” As Carroll puts it, “Their writing gets better in that they do learn to write differently but they do not fulfill the fantasy of mastering one kind of literacy, an idealized version of academic writing” (60).
This is the real-life experience of writing in the disciplines. A student gets into one class, learns the genres and expectations of that class and then, right when she gets the hang of it, heads into another class. “Students’ literacy develops because students must take on new and difficult roles that challenge their abilities as writers. In fact, student writing may sometimes need to get ‘worse’ before it can become ‘better.’ Because many college writing tasks are essentially new to students, they will need repeated practice to become proficient” (9). How much do professors take this into consideration? Not very much.
The writing assignments that Carroll’s participants navigated were complex and sophisticated, but also very, very different from each other. She claims that “Faculty are likely to underestimate how much writing tasks differ from course to course, from discipline to discipline, and from professor to professor” (9). Put another way, “students must learn to write differently but have few opportunities to develop one particular type of writing over any extended period of time” (55).
And where does this leave first year composition? Carroll writes that we should take the work of first year composition seriously, but not “too seriously. A first-year composition course can serve students by helping them make connections between what they have already learned about writing in their k-12 education and ways they might learn to write differently both in the academy and as citizens of the larger society. On the other hand, first-year composition cannot succeed as a course that will teach students how to write for contexts they have not yet encountered. A one-semester writing course is best viewed as just one step in a long process of development that extends from children’s first encounters with literacy on through their adult lives” (27-28).
Carroll does have some practical recommendations. She suggests that first year composition focus on metacognitive awareness and students’ own writing as much as possible. You know, asking students things like “what do you do when you get an assignment prompt?” and discussing their own writing practice. She also recommends focusing on portfolios—in classes as well as in departments and programs. As much as possible, those portfolios should provide opportunities to return to similar genres as well as challenging students to try new things—remembering that the results will be less than perfect and that students will need plenty of specific feedback.
I find Carroll’s argument very persuasive, and as I’ve written and recorded more of these podcasts, I’ve noticed that this weird literary genre is becoming more comfortable for me. But it’s taken more than a year! How many literacy projects do average undergraduates get to revisit over and over again? Is there a project you’ve mastered, or a project you thought you mastered (like the so-called reading response) only to discover that a different teacher had a different expectation? If so, drop us a line at firstname.lastname@example.org I’d love to hear about it. Now go back and check out your writing from five years ago, because if it’s anything like mine, it’s pretty darn amusing and well worth a reread.