Concept of Knowledge Revisited

Discussion on R&R from all regions

Post by kmaherali »

What Is Your Purpose?

Every reflective person sooner or later faces certain questions: What is the purpose of my life? How do I find a moral compass so I can tell right from wrong? What should I do day by day to feel fulfillment and deep joy?

As late as 50 years ago, Americans could consult lofty authority figures to help them answer these questions.

Some of these authority figures were public theologians. Reinhold Niebuhr was on the cover of Time magazine. Rabbi Abraham Joshua Heschel wrote about everything from wonder to sin to civil rights. Harry Emerson Fosdick wrote a book called “On Being a Real Person” on how to live with integrity.

Other authority figures were part of the secular priesthood of intellectuals.

Public discussion was awash in philosophies about how to live well. There was a coherent moral ecology you could either go along with or rebel against.

All of that went away over the past generation or two. It is hard to think of any theologian with the same public influence that Niebuhr and Heschel had. Intellectuals are given less authority and are more specialized. They write more for each other and are less likely to volley moral systems onto the public stage.

These days we live in a culture that is more diverse, decentralized, interactive and democratized. The old days when gray-haired sages had all the answers about the ultimate issues of life are over. But new ways of having conversations about the core questions haven’t yet come into being.

Public debate is now undermoralized and overpoliticized. We have many shows where people argue about fiscal policy but not so many on how to find a vocation or how to measure the worth of your life. In fact, we now hash out our moral disagreement indirectly, under the pretense that we’re talking about politics, which is why arguments about things like tax policy come to resemble holy wars.

Intellectual prestige has drifted away from theologians, poets and philosophers and toward neuroscientists, economists, evolutionary biologists and big data analysts. These scholars have a lot of knowledge to bring, but they’re not in the business of offering wisdom on the ultimate questions.

The shift has meant there is less moral conversation in the public square. I doubt people behave worse than before, but we are less articulate about the inner life. There are fewer places in public where people are talking about the things that matter most.

As a result, many feel lost or overwhelmed. They feel a hunger to live meaningfully, but they don’t know the right questions to ask, the right vocabulary to use, the right place to look or even if there are ultimate answers at all.

As I travel on a book tour, I find there is an amazing hunger to shift the conversation. People are ready to talk a little less about how to do things and to talk a little more about why ultimately they are doing them.

This is true among the young as much as the old. In fact, young people, raised in today’s hypercompetitive environment, are, if anything, hungrier to find ideals that will give meaning to their activities. It’s true of people in all social classes. Everyone is born with moral imagination — a need to feel that life is in service to some good.

The task now is to come up with forums where these sorts of conversations can happen in a more modern, personal and interactive way.

I thought I’d do my part by asking readers to send me their answers to the following questions: Do you think you have found the purpose to your life, professional or otherwise? If so, how did you find it? Was there a person, experience, book or sermon that decisively helped you get there?

If you have answers to these questions, go to the website for my book, “The Road to Character,” click on First Steps and send in your response. We’ll share as many as we can on the site’s blog called The Conversation, and I’ll write a column or two reporting on what I’ve learned about how people find purpose these days.

I hope this exercise will be useful in giving people an occasion to sit down and spell out the organizing frame of their lives. I know these essays will help others who are looking for meaning and want to know how to find more of it.

Mostly the idea is to use a community of conversation as a way to get somewhere: to revive old vocabularies, modernize old moral traditions, come up with new schools and labels so that people have more concrete building blocks and handholds as they try to figure out what life is all about.

http://www.nytimes.com/2015/05/05/opini ... pe=article

Post by kmaherali »

What’s the Point of a Professor?

ATLANTA — IN the coming weeks, two million Americans will earn a bachelor’s degree and either join the work force or head to graduate school. They will be joyous that day, and they will remember fondly the schools they attended. But as this unique chapter of life closes and they reflect on campus events, one primary part of higher education will fall low on the ladder of meaningful contacts: the professors.

That’s what students say. Oh, they’re quite content with their teachers; after all, most students receive sure approval. In 1960, only 15 percent of grades were in the “A” range, but now the rate is 43 percent, making “A” the most common grade by far.

Faculty members’ attitudes are kindly, too. In one national survey, 61 percent of students said that professors frequently treated them “like a colleague/peer,” while only 8 percent heard frequent “negative feedback about their academic work.” More than half leave the graduation ceremony believing that they are “well prepared” in speaking, writing, critical thinking and decision-making.

But while they’re content with teachers, students aren’t much interested in them as thinkers and mentors. They enroll in courses and complete assignments, but further engagement is minimal.

One measure of interest in what professors believe, what wisdom they possess apart from the content of the course, is interaction outside of class. It’s often during incidental conversations held after the bell rings and away from the demands of the syllabus that the transfer of insight begins and a student’s emulation grows. Students email teachers all the time — why walk across campus when you can fire a note from your room? — but those queries are too curt for genuine mentoring. We need face time.

Here, though, are the meager numbers. For a majority of undergraduates, beyond the two and a half hours per week in class, contact ranges from negligible to nonexistent. In their first year, 33 percent of students report that they never talk with professors outside of class, while 42 percent do so only sometimes. Seniors lower that disengagement rate only a bit, with 25 percent never talking to professors, and 40 percent sometimes.

It hasn’t always been this way. “I revered many of my teachers,” Todd Gitlin said when we met at the New York Public Library last month. He’s a respected professor of journalism and sociology at Columbia, but in the 1960s he was a fiery working-class kid at Harvard before becoming president of Students for a Democratic Society.

I asked if student unrest back then included disregard of the faculty. Not at all, he said. Nobody targeted professors. Militants attacked the administration for betraying what the best professors embodied, the free inquisitive space of the Ivory Tower.

I saw the same thing in my time at the University of California, Los Angeles, in the early 1980s, when you couldn’t walk down the row of faculty offices without stepping over the outstretched legs of English majors lining up for consultations. First-year classes could be as large as 400, but by junior year you settled into a field and got to know a few professors well enough to chat with them regularly, and at length. We knew, and they knew, that these moments were the heart of liberal education.

In our hunger for guidance, we were ordinary. The American Freshman Survey, which has followed students since 1966, proves the point. One prompt in the questionnaire asks entering freshmen about “objectives considered to be essential or very important.” In 1967, 86 percent of respondents checked “developing a meaningful philosophy of life,” more than double the number who said “being very well off financially.”

Naturally, students looked to professors for moral and worldly understanding. Since then, though, finding meaning and making money have traded places. The first has plummeted to 45 percent; the second has soared to 82 percent.

I earned my bachelor's degree over 40 years ago, but I still feel a deep sense of gratitude to my university faculty mentors from that time...

I returned to U.C.L.A. on a mild afternoon in February and found the hallways quiet and dim. Dozens of 20-year-olds strolled and chattered on the quad outside, but in the English department, only one in eight doors was open, and barely a half dozen of the department’s 1,400 majors waited for a chance to speak.

When college is more about career than ideas, when paycheck matters more than wisdom, the role of professors changes. We may be 50-year-olds at the front of the room with decades of reading, writing, travel, archives or labs under our belts, with 80 courses taught, but students don’t lie in bed mulling over what we said. They have no urge to become disciples.

Sadly, professors pressed for research time don’t want them, either. As a result, most undergraduates never know that stage of development when a learned mind enthralled them and they progressed toward a fuller identity through admiration of and struggle with a role model.

Since the early 2000s, I have made students visit my office every other week with a rough draft of an essay. We appraise and revise the prose, sentence by sentence. I ask for a clearer idea or a better verb; I circle a misplaced modifier and wait as they make the fix.

As I wait, I sympathize: So many things distract them — the gym, text messages, rush week — and often campus culture treats them as customers, not pupils. Student evaluations and ratemyprofessor.com paint us as service providers. Years ago at Emory University, where I work, a campus-life dean addressed new students with a terrible message: Don’t go too far into coursework — there’s so much more to do here! And yet, I find, my writing sessions help diminish those distractions, and by the third meeting students have a new attitude. This is a teacher who rejects my worst and esteems my best thoughts and words, they say to themselves.

You can’t become a moral authority if you rarely challenge students in class and engage them beyond it. If we professors do not do that, the course is not an induction of eager minds into an enlarging vision. It is a requirement to fulfill. Only our assistance with assignments matters. When it comes to students, we shall have only one authority: the grades we give. We become not a fearsome mind or a moral light, a role model or inspiration. We become accreditors.

http://www.nytimes.com/2015/05/10/opini ... pe=article

Post by kmaherali »

The Greatest Generation of Scientists

A great man died late last month, one of the last of the pioneering biologists who essentially created the modern science of molecular biology. His name was Alexander Rich; he was 90 years old; and he had spent the past 57 years as a professor at the Massachusetts Institute of Technology, where he was still going to work right up until the last two months of his life.

In her obituary of Rich in The Times last week, Denise Gellene recounted some of his scientific achievements: In 1973, some 20 years after James Watson and Francis Crick worked out the structure of DNA — theoretically at least — Rich proved them right, using “X-rays to produce a distinct image of the famous double helix.” He then went on to make important discoveries about the structure of ribonucleic acid, or RNA, and the way RNA translated genetic information in DNA. His work on RNA and DNA is one of the foundations of biotechnology and the biotech industry.

Rich was a scientist with a wide-ranging curiosity. In a video that was put together when he won an important award in 2008, Rich talked about “the excitement associated with a discovery. You see something new,” he said. “When somebody else makes a discovery and I read about it, I get that same boost of awe and wonder about nature.”

In his lifetime, he met Albert Einstein, worked under Linus Pauling, and knew everyone from Crick and Watson to the great physicists Leo Szilard and Richard Feynman. Rich co-founded several companies. He mentored scientists who would go on to make their own important discoveries. He helped establish Israel’s Weizmann Institute of Science, which is today one of that country’s leading scientific institutions. He was a member of the Pugwash Conferences on Science and World Affairs, whose founding mission was to eliminate nuclear weapons. “Alex lived a rich and important life,” wrote Phillip Sharp, a scientist at M.I.T. who was close to him. That he did.

But as I looked into his life this week, three things struck me, all of which could be said not only of Rich, but of most members of that remarkable generation.

The first was how maddeningly difficult — and painstaking — it was to create the building blocks of molecular biology. In the early 1950s, recalled Jack Strominger, a scientist at Harvard who was Rich’s roommate in college, the federal government did not hand out grants to scientists the way it does today; so just getting the money to do science was hard.

“You walk into a lab today and you see the rows of equipment,” Strominger said. Equipment was much harder to come by, and not as precise. “As late as the 1980s,” he says, “X-ray crystallography was a difficult field. Now a young person can do it quickly.” It took many years — of trial and error — before Rich proved, through X-ray crystallography, that the Crick and Watson double helix structure was correct. “Persistence is luck,” Rich was known to say. That certainly was true of his generation of biologists.

The second insight was how collaborative the scientists were. One good example was something called the RNA Tie Club, of which Rich was a member. Founded by the physicist George Gamow, it was a collaborative effort to figure out the structure of RNA. (Each member was given a tie with a green-and-yellow RNA helix; hence the name of the group.) Although the members met infrequently, they circulated papers among each other and talked freely about their ideas, not fearing that their ideas would be stolen or misused.

That kind of collaboration is something that has been largely lost in the scientific world. “Everything is so much more competitive now,” says Strominger. Scientists today are more likely to look for niches they can dominate. They compete to get their discoveries published ahead of rivals. Or to start a biotech company and make millions. “It would be pretty hard to capture the flavor of that period again,” says Strominger.

Finally, one gets the sense — and this is especially true of Rich — that they didn’t get into science to get rich or become famous. They loved science for its own sake. Rich would call other scientists, often in the evening, just to talk shop. “I never saw Alex bad-mouth another scientist, even when he had reason to,” said Robert Gallo, the director of the Institute of Human Virology at the University of Maryland School of Medicine. “I don’t know anybody who was more interested in science for its own sake than that man. He lived it and loved it.”

In the course of talking to Strominger, I discovered that, at 89, he is also still going to work every day. Why didn’t scientists like him and Rich retire? I asked.

“Why should we?” he replied. “It’s too much fun.”

http://www.nytimes.com/2015/05/16/opini ... pe=article

*******
It Is, in Fact, Rocket Science

THE other week I was working in my garage office when my 14-year-old daughter, Olivia, came in to tell me about Charles Darwin. Did I know that he discovered the theory of evolution after studying finches on the Galápagos Islands? I was steeped in what felt like the 37th draft of my new book, which is on the development of scientific ideas, and she was proud to contribute this tidbit of history that she had just learned in class.

Sadly, like many stories of scientific discovery, that commonly recounted tale, repeated in her biology textbook, is not true.

The popular history of science is full of such falsehoods. In the case of evolution, Darwin was a much better geologist than ornithologist, at least in his early years. And while he did notice differences among the birds (and tortoises) on the different islands, he didn’t think them important enough to make a careful analysis. His ideas on evolution did not come from the mythical Galápagos epiphany, but evolved through many years of hard work, long after he had returned from the voyage. (To get an idea of the effort involved in developing his theory, consider this: One byproduct of his research was a 684-page monograph on barnacles.)

The myth of the finches obscures the qualities that were really responsible for Darwin’s success: the grit to formulate his theory and gather evidence for it; the creativity to seek signs of evolution in existing animals, rather than, as others did, in the fossil record; and the open-mindedness to drop his belief in creationism when the evidence against it piled up.

The mythical stories we tell about our heroes are always more romantic and often more palatable than the truth. But in science, at least, they are destructive, in that they promote false conceptions of the evolution of scientific thought.

Of the tale of Newton and the apple, the historian Richard S. Westfall wrote, “The story vulgarizes universal gravitation by treating it as a bright idea ... A bright idea cannot shape a scientific tradition.” Science is just not that simple and it is not that easy.

Still, you might ask, so what? What happens when we misjudge the scientific process, when we underestimate its complexity?

The oversimplification of discovery makes science appear far less rich and complex than it really is. In the film “The Theory of Everything,” Stephen Hawking is seen staring at glowing embers in a fireplace when he has a vision of black holes emitting heat. In the next scene he is announcing to an astonished audience that, contrary to prior theory, black holes will leak particles, shrink and then explode. But that is not how his discovery happened.

In reality, Mr. Hawking had been inspired not by glowing embers, but by the work of two Russian physicists.

According to their theory, rotating black holes would give off energy, slowing their rotation until they eventually stopped. To investigate this, Mr. Hawking had to perform difficult mathematical calculations that carefully combined the relevant elements of quantum theory and Einstein’s theory of gravity — two mainstays of physics that, in certain respects, are known to contradict each other. Mr. Hawking’s calculations showed, to his “surprise and annoyance,” that stationary black holes also leak.

To a physicist that was a shocking result, as it contradicted the idea that black holes devour matter and energy, but never regurgitate it. To Mr. Hawking, it was especially dismaying, for it lent support to a Princeton physicist’s theory about black hole entropy that he had great disdain for.

So Mr. Hawking attacked his own work, trying to poke holes in it. In the end, after months of calculations, he was forced to accept that his conclusion was correct, and it changed the way physicists think about black holes.

Two thousand years ago, Aristotle’s “Physics” was a wide-ranging set of theories that were easy to state and understand. But his ideas were almost completely wrong. Newton’s “Principia” ushered in the age of modern science, but remains one of the most impenetrable books ever written. There is a reason: The truths of nature are subtle, and require deep and careful thought.

Over the past few centuries we have invested that level of thought, and so while in the 19th century the Reuters news service used carrier pigeons to fly stock prices between cities, today we have the Internet.

Even if we are not scientists, every day we are challenged to make judgments and decisions about technical matters like vaccinations, financial investments, diet supplements and, of course, global warming. If our discourse on such topics is to be intelligent and productive, we need to dip below the surface and grapple with the complex underlying issues. The myths can seduce one into believing there is an easier path, one that doesn’t require such hard work.

But even beyond issues of science, there is a broader lesson to learn, and that was the crux of my reply to my daughter. We all run into difficult problems in life, and we will be happier and more successful if we appreciate that the answers often aren’t quick, or easy.

The Pew Research Center’s Internet and American Life Project summed up a recent study by saying that the negative effects of today’s ubiquitous media “include a need for instant gratification.” The Darwin, Newton and Hawking of the myths received that instant gratification. The real scientists did not, and real people seldom do.

http://www.nytimes.com/2015/05/16/opini ... 05309&_r=0

Post by kmaherali »

Dusting the Furniture of Our Minds

Dust is everywhere. We contribute to its multiplication through our polluting industries, by wearing clothes and using things around us, and in the course of merely living — shedding skin cells, hair, and other byproducts of our life.

But we also are it. Both the Bible and William Shakespeare would have us believe as much. “Dust thou art, and unto dust shalt thou return,” Adam and Eve are told in Genesis. Hamlet, in his nihilistic soliloquy, asks rhetorically about the human, “What is this quintessence of dust?” Science, of course, has provided some actual basis for this notion in findings indicating that the most fundamental material of life on earth originated in the “dust” of long-dead stars.

In any case, be it prophetic, poetic or scientific, the message is clear: We are in a continual flux of growth and decay, but the latter will win out, and each human body’s end state will be the same — a collection of mere particles, dispersed.

So what is the relation between the dust outside us and the dust that we are?

Most of us lack the courage to examine ourselves with Biblical or Shakespearean frankness. We fail to understand that, as we clash with external dust, we displace existential anxieties and confront our mortal, rootless, restless selves, albeit no longer discernible as such.

Let me give you an example of this strange displacement. Since household dust is composed, in large part, of the material traces of our bodily existence, the endeavor to eliminate it strives, quite unconsciously, to expunge vestiges of ourselves. Cloth in hand, we erase proofs of our mortality and of our posthumous blending with the environment. With the penchant for putting the house in order, desiring to bring it back to a pristine condition, dusting makes the places where we sojourn a little sterile, a tad dead, all in the name of life.

In a way, this activity symbolizes our inability or our unwillingness to deal in a constructive manner with our lives, to accept their entanglement with finitude, death, and the others archived in the dust.

My encounter with dust is a face-to-face (indeed, surface-to-surface) meeting with myself, with parts of me that, though already dead, lead an uncanny afterlife in combination with other fragmentary and whole entities, be they threads of fabric or dust mites. The very possibility of our mingling harkens back to the shared source of finite existence, against which the act of dusting rebels. There is no depth in the encounter, save for the reciprocal mirroring effect of two surfaces: the dusting duster and the dusted dust. In most cases, nonetheless, the mirror is broken, and the former does not recognize her- or himself in the latter. A tremendous psychic investment is necessary to inhibit the flashes of this traumatic recognition and, perversely, to identify vitality with shining, dust-free, lifeless exteriors.

If the dust outside me does not send me back to myself, to the dust that I am, then the drudgery of wiping away the traces of past life will be strictly external, as well. I will not be guided by the question of how to dust within myself, to bring my mind to a spotless state of wonder about the world, while taking care not to sterilize either the mind or the world.

Along similar lines, Henry David Thoreau writes in “Walden”: “I had three pieces of limestone on my desk, but I was terrified to find that they required to be dusted daily, when the furniture of my mind was undusted still … How, then, could I have a furnished house? I would rather sit in the open air, for no dust gathers on the grass, unless where man has broken ground.”

A psychoanalyst might say that we avoid precisely this sort of hard self-analysis when we earnestly dust our houses, without giving any thought to our psychic dwellings. The tremendous difficulty of what, following Thoreau, we might call “dusting the furniture of our minds” far exceeds that of passing a cloth over an actual filing cabinet. Inner dusting entails a constant interrogation of the suppositions we either hold dear or fail to notice because of their obviousness; it draws its inspiration from the ancient Greek injunction “Know thyself!” Without revisiting, exposing, and oft-times brushing away these assumptions, as ubiquitous as dust, the knowledge of everything else in the world — including the concept world itself — would be worthless.

In turn, dusting the contents of our houses and apartments, we come back to the things that populate them — a coffee table, a bookcase, a lamp, windowsills. We touch upon them lightly, with care, and in doing so, release them from the functions they are supposed to serve, which allows us to become reacquainted with them, to consider them in a new light. In much the same way, stooping over a word before enunciating or writing it, we see or hear it for what it is and feel like we are experiencing it for the first time. A poet or a philosopher dusts words before utilizing them, exploring their usability and un-usability.

It bears noting that dusting rituals can be as thoughtless as any other routine of daily life — and that is their mixed blessing. On the one hand, the temporary paralysis of deep thought they occasion could be beneficial. Thanks to it, we might be able, finally, to see and touch what readily offers itself, rather than try to penetrate its essence. We would, then, eschew the tendency to “overthink” existence, to dig out its buried causes and disclose its intrinsic constitution. On the other hand, dusting our houses, we seek, unbeknownst to ourselves, a sense of certainty among things, their exteriors peering from underneath the dust. We try to restore the original colors and shapes to this universe in miniature by removing whatever occludes them. In a word, we aim to disclose the things themselves, as they are, in what amounts to a household version of naïve realism.

Epistemologically, the notion of truth as lucidity, expressed in “clear and distinct ideas,” has been a hallmark of the European Enlightenment, at least ever since the early modern period in the history of philosophy. Dusting takes the metaphor of clarity back to unmitigated literalness. At the same time, a dusted object presents itself not to a disembodied mind but to the senses: above all, to vision, but also to touch. Through its recovered luminosity, the shades and textures of things show themselves as they are. Dusting is material enlightenment. It discloses the façades of furniture and of everything else it cares for, so as to get to the kernel of reality, which is not some hidden inner essence of things but their outward countenance, their first look.

What I like about the allegory of dusting is that it elucidates how critique cannot achieve its objectives once and for all. Just as dust will continue accumulating after every attempt to get rid of it, so prejudices and preconceptions will keep accruing after analysis (no matter how radical) shakes received ideas to the core. The labor of enlightenment and critical thinking will have to recommence every now and then in order to reorganize our mental dwelling and to unblock the neglected points of access to what is. Trivial as it may sound, dusting, mired in finitude, is an infinite task.

Michael Marder is a research professor at the University of the Basque Country, Vitoria-Gasteiz, Spain. He is the author of the recent books “The Philosopher’s Plant: An Intellectual Herbarium,” and “Pyropolitics: When the World Is Ablaze,” and the forthcoming “Dust.” Twitter @michael_marder.

http://opinionator.blogs.nytimes.com/20 ... our-minds/

Post by kmaherali »

Nadim Pabani: The Intellectual Tradition of Shia Ismāʿīlī Islam: The Fatimids and their Approaches to Knowledge

1. Introduction

“We ought not to be ashamed of appreciating the truth and of acquiring it wherever it comes from, even if it comes from races distant and nations different from us...” –Abū Yūsuf Yaʿqūb ibn Isḥāq al-Kindī (c. 800-866)

[Image: detail of a Fatimid artifact, 11th century CE]

These words, uttered by one of the greatest Islamic philosophers of the medieval period, speak volumes. His statement appears as a calling: a calling to the truth, a truth which is enshrined within the knowledge of those who may indeed have been, and may continue to be, ‘races distant and nations different from us’. What strikes the reader about al-Kindi’s statement, however, is the uncompromising attitude of openness towards knowledge which seems to underpin it, as he admits that truth can indeed be found in places far and wide.

More....

http://www.academia.edu/7849900/The_Int ... _Knowledge

Post by kmaherali »

The Case for Teaching Ignorance

When we present knowledge as more certain than it is, we discourage curiosity.

IN the mid-1980s, a University of Arizona surgery professor, Marlys H. Witte, proposed teaching a class entitled “Introduction to Medical and Other Ignorance.” Her idea was not well received; at one foundation, an official told her he would rather resign than support a class on ignorance.

Dr. Witte was urged to alter the name of the course, but she wouldn’t budge. Far too often, she believed, teachers fail to emphasize how much about a given topic is unknown. “Textbooks spend 8 to 10 pages on pancreatic cancer,” she said some years later, “without ever telling the student that we just don’t know very much about it.” She wanted her students to recognize the limits of knowledge and to appreciate that questions often deserve as much attention as answers. Eventually, the American Medical Association funded the class, which students would fondly remember as “Ignorance 101.”

Classes like hers remain rare, but in recent years scholars have made a convincing case that focusing on uncertainty can foster latent curiosity, while emphasizing clarity can convey a warped understanding of knowledge.

In 2006, a Columbia University neuroscientist, Stuart J. Firestein, began teaching a course on scientific ignorance after realizing, to his horror, that many of his students might have believed that we understand nearly everything about the brain. (He suspected that a 1,414-page textbook may have been culpable.)

As he argued in his 2012 book “Ignorance: How It Drives Science,” many scientific facts simply aren’t solid and immutable, but are instead destined to be vigorously challenged and revised by successive generations. Discovery is not the neat and linear process many students imagine, but usually involves, in Dr. Firestein’s phrasing, “feeling around in dark rooms, bumping into unidentifiable things, looking for barely perceptible phantoms.” By inviting scientists of various specialties to teach his students about what truly excited them — not cold hard facts but intriguing ambiguities — Dr. Firestein sought to rebalance the scales.

Presenting ignorance as less extensive than it is, knowledge as more solid and more stable, and discovery as neater also leads students to misunderstand the interplay between answers and questions.

People tend to think of not knowing as something to be wiped out or overcome, as if ignorance were simply the absence of knowledge. But answers don’t merely resolve questions; they provoke new ones.

Michael Smithson, a social scientist at Australian National University who co-taught an online course on ignorance this summer, uses this analogy: The larger the island of knowledge grows, the longer the shoreline — where knowledge meets ignorance — extends. The more we know, the more we can ask. Questions don’t give way to answers so much as the two proliferate together. Answers breed questions. Curiosity isn’t merely a static disposition but rather a passion of the mind that is ceaselessly earned and nurtured.

The borderland between known and unknown is also where we strive against our preconceptions to acknowledge and investigate anomalous data, a struggle Thomas S. Kuhn described in his 1962 classic, “The Structure of Scientific Revolutions.” The center of the island, by contrast, is safe and comforting, which may explain why businesses struggle to stay innovative. When things go well, companies “drop out of learning mode,” Gary P. Pisano, a professor at Harvard Business School, told me. They flee uncertainty and head for the island’s interior.

The study of ignorance — or agnotology, a term popularized by Robert N. Proctor, a historian of science at Stanford — is in its infancy. This emerging field of inquiry is fragmented because of its relative novelty and cross-disciplinary nature (as illustrated by a new book, “Routledge International Handbook of Ignorance Studies”). But giving due emphasis to unknowns, highlighting case studies that illustrate the fertile interplay between questions and answers, and exploring the psychology of ambiguity are essential. Educators should also devote time to the relationship between ignorance and creativity and the strategic manufacturing of uncertainty.

The time has come to “view ignorance as ‘regular’ rather than deviant,” the sociologists Matthias Gross and Linsey McGoey have boldly argued. Our students will be more curious — and more intelligently so — if, in addition to facts, they are equipped with theories of ignorance as well as theories of knowledge.

Jamie Holmes is a fellow at New America and the author of the forthcoming book “Nonsense: The Power of Not Knowing.”

http://www.nytimes.com/2015/08/24/opini ... 05309&_r=0

Post by kmaherali »

How I Learned to Stop Worrying and Love A.I.

The distinction between man and machine is under siege. The technology wizard Ray Kurzweil speaks with casual confidence of achieving electromagnetic immortality with our once-human selves eternally etched onto universal servers. For me, the possibility that machines will acquire the equivalent of human feelings and emotions is pure fantasy. And yet, as a neurologist, I cannot ignore advancing machine intelligence’s implications about the human mind.

To begin, think back to IBM’s Deep Blue defeat of Garry Kasparov in 1997. One pivotal move that shifted the match in favor of Deep Blue prompted Kasparov to accuse IBM of human intervention. In so doing, he highlighted the essential cognitive dissonance that we will face as machines get smarter.

Kasparov couldn’t believe that he had been beaten by a computer because he felt the play was a sign of superior intelligence. But he was wrong — years later it was revealed by Deep Blue’s co-creator that the triumphant move had been a result of a software bug. When presented with several options, Deep Blue could not make a definitive decision, so it made a random move that rattled Kasparov.

Uncovering the so-called biology of creativity is big business. Aficionados of fMRI scans tell us which brain areas light up when someone has a novel idea. Brain wave experts propose electrical patterns specific to originality. Even if these observations pan out, they cannot tell us how to interpret a brilliant chess move arising out of a software glitch. If we are forced to expand our notion of creativity to include random electrical firings, what does that tell us about our highly touted imaginative superiority over a mindless machine?

For Kasparov, Deep Blue was an enigmatic black box, his opinions shaped by his biases as to what constitutes human versus machine intelligence. He’s not alone. We have a strong personal sense of how humans think because we experience thoughts. We know what it means to understand something because we experience the sensation of understanding. This sense of understanding requires both consciousness and awareness of one’s thoughts. We cannot conceive of understanding without consciousness.

What is overlooked in this equation is the quality or accuracy of the actual decision. A standard move by a chess player is evidence of understanding, but a superior move by an inanimate collection of wires and transistors is considered rote machine learning, not understanding. To put this in perspective, imagine a self-proclaimed chess novice making the same pivotal move as Deep Blue. I doubt that any of us would believe that she didn’t know anything about chess.

Yet neuroscience is revealing that understanding isn’t a result of conscious deliberation. From the hunch to the “aha,” various degrees of the feeling of knowing are involuntary mental sensations that arise from subliminal brain mechanisms. They are the brain’s way of telling us the likelihood that a subliminal thought is correct. More akin to bodily sensations than emotions, they can occur spontaneously, with certain psychoactive drugs and with direct brain stimulation in the absence of any conscious thought. We can’t think up an aha — the ultimate sense of understanding. It just happens to us in the same way that we experience love and surprise.

Conversely, we can know things without any sense of knowing (as in the classic example of blindsight, where patients with cortical blindness can point out in which visual field a light is flashing even when they consciously see nothing and are entirely unaware of this knowledge).

If we accept that the feeling of understanding is an involuntary sensation that machines don’t experience, you would think that we would stop worrying about what machines “understand.” Not so. In 1980 the philosopher John Searle introduced the Chinese Room argument to show that it is impossible for digital computers to understand language or think. His 1999 summary of the argument goes as follows:

Imagine a native English speaker who knows no Chinese locked in a room full of boxes of Chinese symbols (a database) together with a book of instructions for manipulating the symbols (the program). Imagine that people outside the room send in other Chinese symbols which, unknown to the person in the room, are questions in Chinese (the input). And imagine that by following the instructions in the program the man in the room is able to pass out Chinese symbols which are correct answers to the questions (the output). The program enables the person in the room to pass the Turing Test for understanding Chinese but he does not understand a word of Chinese.

Back in 1980, when we knew little about the brain and artificial intelligence hadn’t yet flexed much practical muscle, the argument felt reasonable; absent the ability to understand, you wouldn’t expect A.I. to make sophisticated decisions that are the equivalent of smart human thought. Thirty-five years later, though, the argument seems outdated. At bottom, it is a convoluted way of saying that machines don’t have consciousness and feelings. Denying machine understanding tells us nothing about the potential limits of machine intelligence. Even so, according to the Stanford Encyclopedia of Philosophy, the Chinese Room argument remains the most widely discussed philosophical argument in cognitive science since the Turing Test.

It’s as though our self-proclaimed position of superiority and uniqueness is constantly threatened, and we seem constitutionally compelled to compare ourselves to all other potentially thinking entities. A few hundred years ago, Descartes assumed that animals were automatons. Now we know that crows use tools and chimpanzees wage territorial war. Still, we aren’t worried about crow and chimpanzee takeover of our planet, or that they are going to replace us as the highest life form on earth. But machines, well that’s a different story.

To his credit, Kasparov saw the future. As the story goes, his close friends tried to console him for his loss by arguing that the computer had enormously greater computational power than the human brain, but did not understand chess. Kasparov’s prescient response, referring to the vast amount of information the computer had processed: “More quantity creates better quality.”

Most of us now know this to be true in our own lives. As a practicing neurologist, I took great pride in my clinical reservoir of obscure information. Now, any hand-held device with a modicum of memory has a greater and more accurate database. And it isn’t just neurology. I have had a lifelong fascination with poker, and have managed a reasonable skill set based on practice, study and a bit of math. For me, the pleasure of the game is figuring out what an opponent has, what a bet means, when he’s bluffing. In essence I have loved neurology and poker because they allow me to use my wits. No longer.

In the last several years, a poker-playing program (Cepheus) developed by the computer science department at the University of Alberta has consistently outplayed the world’s best heads-up limit hold ’em players. What makes this conquest so intriguing is that the computer isn’t programmed in advance to play in any particular style or have any knowledge of the intricacies of poker theory. Instead it is an artificial neural network with a huge memory capacity (4000 terabytes). It plays and records the outcome of millions of trial-and-error simulations, eventually learning the optimal strategy for any given situation. It does so without any knowledge of the game or its opponent, or any of the subtleties that inform the best human players. It is completely in the dark as to why it does anything. (If you want, you can test your skills against the program at poker.srv.ualberta.ca.)
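
The essay describes how Cepheus works only at this high level: it learns a strategy purely from self-play, recording how its choices pan out, with nothing about the game built in. For a concrete feel for that idea, here is a minimal, hypothetical sketch of one classic way to learn a strategy from self-play alone — regret matching on rock-paper-scissors. It is not Cepheus’s architecture or code (the real system tackled heads-up limit hold ’em with a far larger regret-minimization computation over enormous storage), and every function and variable name below is invented for illustration.

```python
# Toy illustration (not Cepheus): two "players" repeatedly play rock-paper-scissors,
# track how much better each untried action would have done (the "regret"),
# and mix future actions in proportion to that regret. The learner is told
# nothing beyond the payoff of each outcome.
import random

ACTIONS = ["rock", "paper", "scissors"]
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def payoff(mine, theirs):
    """+1 if my action wins, -1 if it loses, 0 on a tie."""
    if mine == theirs:
        return 0
    return 1 if BEATS[mine] == theirs else -1

def strategy_from_regrets(regrets):
    """Play each action with probability proportional to its positive regret."""
    positive = [max(r, 0.0) for r in regrets]
    total = sum(positive)
    if total == 0:
        return [1.0 / len(ACTIONS)] * len(ACTIONS)
    return [p / total for p in positive]

def train(iterations=100_000):
    regrets = [[0.0] * len(ACTIONS) for _ in range(2)]        # one table per player
    strategy_sums = [[0.0] * len(ACTIONS) for _ in range(2)]  # running average of play
    for _ in range(iterations):
        strategies = [strategy_from_regrets(r) for r in regrets]
        picks = [random.choices(range(len(ACTIONS)), weights=s)[0] for s in strategies]
        for me in range(2):
            opp = 1 - me
            earned = payoff(ACTIONS[picks[me]], ACTIONS[picks[opp]])
            for alt in range(len(ACTIONS)):
                # Regret: how much better the alternative action would have scored.
                regrets[me][alt] += payoff(ACTIONS[alt], ACTIONS[picks[opp]]) - earned
            for a, p in enumerate(strategies[me]):
                strategy_sums[me][a] += p
    # The *average* strategy over all iterations is what converges to equilibrium.
    total = sum(strategy_sums[0])
    return {action: s / total for action, s in zip(ACTIONS, strategy_sums[0])}

if __name__ == "__main__":
    print(train())  # approaches {'rock': ~0.33, 'paper': ~0.33, 'scissors': ~0.33}
```

Run long enough, the averaged strategy drifts toward the game’s equilibrium (one-third each), in the same spirit in which Cepheus’s accumulated self-play converges toward an essentially unbeatable strategy — without the program ever “knowing” what any of its moves mean.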

So if we are to accept reality, and acknowledge this sort of relative superiority in machines, how should we adapt? I like the perspective of a young friend of mine, a top-flight professional hold ’em player, who has spent considerable time playing (but rarely winning) against Cepheus. He is hoping to improve his game by observing and unraveling the presumed reasons behind the computer’s often counterintuitive plays. He doesn’t care whether or not Cepheus understands anything about the game of poker. “Hey, I’m practical, not a philosopher. If it knows something that I don’t, I’m all for learning it.”

Rather than burden ourselves with biological biases as to what constitutes understanding, let me suggest adopting a new taxonomy. Let’s give machines the status of a separate species with a distinctly different type of intellect — one that is superior in data crunching but is devoid of emotional reasoning. Not better, not worse, just different. No more condescension based on animistic beliefs. No more machine worship based on one’s love of technology. Let’s avoid using words like thinking and understanding when talking about machine intelligence; they add nothing to our understanding of their understanding (see what I mean?). We are slowly learning the myriad ways that animals and plants exhibit their own forms of intelligence; the same criteria should apply to machines.

The division is straightforward. For data that can be quantified, wisdom will become collective, not personal. We will ask our smart machines to tell us which will be the best treatment for an illness, the best move for a chess match or poker game, the optimal rush hour traffic flow, the likelihood of climate change. We cannot compete at this level.

The ultimate value added of human thought will lie in our ability to contemplate the non-quantifiable. Emotions, feelings and intentions — the stuff of being human — don’t lend themselves to precise descriptions and calculations. Machines cannot and will not be able to tell us the best immigration policies, whether or not to proceed with gene therapy, or whether or not gun control is in our best interest. Computer modeling can show us how subtle biases can lead to overt racism and bigotry but cannot factor in the flood of feelings one experiences when looking at a photograph of a lynching.

Most of this seems too obvious for words. We have emotional intelligence; machines don’t. Rather than fretting over what sources of pride machines will take from us, we should focus on those areas where man alone can make a difference.

In all fairness, this essay contains a long-festering personal agenda. My real concern is that, in keeping with our growing obsession with creating and using smart machines, we are on the way to losing the cognitive skills that won’t be replaced by machines. Witness the decline in university enrollment in the humanities, the demise of the literary novel and the seeming obsession with information over contemplation. Of course nothing is black or white. Trends are in the eye of the beholder.

I confess to a bias for those minds that rely on scientific evidence and critical reasoning for those questions that can be answered empirically while simultaneously retaining a deep appreciation for the inexplicable, mysterious and emotionally complex — the indescribable yet palpable messiness that constitutes a life. For the latter, our value added isn’t in any specific answer, but in the deeply considered question. In the end, it will be the quality of the questions that will be the measure of a man.

Robert A. Burton, a former chief of neurology at the University of California, San Francisco Medical Center at Mt. Zion, is the author of “On Being Certain: Believing You Are Right Even When You’re Not,” and “A Skeptic’s Guide to the Mind: What Neuroscience Can and Cannot Tell Us About Ourselves.”

http://opinionator.blogs.nytimes.com/20 ... d=45305309

Post by kmaherali »

What Art Unveils

I think a lot about art. As a philosopher working on perception and consciousness, and as a teacher and writer, maybe more than most. It’s part of my work, but it is a pleasure, too. The task of getting a better sense of what art is — how it works, why it matters to us and what it can tell us about ourselves — is one of the greatest that we face, and it is also endlessly rewarding. But at times it also seems just endless, because art itself can be so hard to grasp. And so is the question of how to approach it. Is there a way of thinking about art that will get us closer to an understanding of its essential nature, and our own?

These days, as I’ve discussed here before, the trend is to try to answer these questions in the key of neuroscience. I recommend a different approach, but not because I don’t think it is crucial to explore the links between art and our biological nature. The problem is that neuroscience has yet to frame an adequate conception of our nature. You look in vain in the writings of neuroscientists for satisfying accounts of experience or consciousness. For this reason, I believe, we can’t use neuroscience to explain art and its place in our lives. Indeed, if I am right, the order of explanation may go in the other direction: Art can help us frame a better picture of our human nature.

This may be one of the sources of art’s abiding value. Art is a way of learning about ourselves. Works of art are tools, but they have been made strange, and that is the source of their power.

I begin with two commonplaces. First, artists make stuff. Pictures, sculptures, performances, songs; art has always been bound up with manufacture and craft, with tinkering and artifice. Second, and I think this is equally uncontroversial, the measure of art, the source of its value, is rarely how well it is made, or how effective it is in fulfilling this or that function. In contrast with mere technology, art doesn’t have to work to be good.

I don’t deny that artists sometimes make stuff that does work. For example, Leonardo’s portrait of Duke Ludovico’s teenage mistress, “The Lady With an Ermine,” works in the sense that it, well, it shows her. The same could be said of a photograph on a shopping website: it shows the jacket and lets you decide whether to order it. I only mean that the value of the artwork never boils down to this kind of application.

Why do artists make stuff if the familiar criteria of success or failure in the domain of manufacture are not dispositive when it comes to art? Why are artists so bent on making stuff? To what end?

My hypothesis is that artists make stuff not because the stuff they make is special in itself, but because making stuff is special for us. Making activities — technology, for short — constitute us as a species. Artists make stuff because in doing so they reveal something deep and important about our nature, indeed, I would go so far as to say, about our biological nature.

One of the reasons I’m skeptical of the neuroscientific approach is that it is too individualist, and too concerned with what goes on in the head alone, to comprehend the way social activities of making and doing contribute in this way to making us.

Human beings, I propose, are designers by nature. We are makers and consumers of technologies. Knives, clothing, dwellings, but also language, pictures, email, commercial air travel and social media. Tools and technologies organize us; they do so individually — think of the way chairs and doorknobs mold your posture and the way you move; and they do so collectively — think of the way the telephone or email have changed how we communicate. Technologies solve problems, but they also let us frame new problems. For example, there would be no higher mathematics without mathematical notations. Tools like the rake extend our bodies; tools like writing extend our minds.

Technologies organize us, but they do so only insofar as they are embedded in our lives. This is a crucial idea. Take a doorknob, for example. A simple bit of technology, yes, but one that presupposes a vast and remarkable social background. Doorknobs exist in the context of a whole form of life, a whole biology — the existence of doors, and buildings, and passages, the human body, the hand, and so on. A designer of doorknobs makes a simple artifact but he or she does so with an eye to its mesh with this larger cognitive and anthropological framework.

When you walk up to a door, you don’t stop to inspect the doorknob; you just go right through. Doorknobs don’t puzzle us. They do not puzzle us just to the degree that we are able to take everything that they presuppose — the whole background practice — for granted. If that cultural practice were strange to us, if we didn’t understand the human body or the fact that human beings live in buildings, if we were aliens from another planet, doorknobs would seem very strange and very puzzling indeed.

This brings us to art. Design, the work of technology, stops, and art begins, when we are unable to take the background of our familiar technologies and activities for granted, and when we can no longer take for granted what is, in fact, a precondition of the very natural-seeming intelligibility of such things as doorknobs and pictures, words and sounds. When you and I are talking, I don’t pay attention to the noises you are making; your language is a transparency through which I encounter you. Design, at least when it is optimal, is transparent in just this way; it disappears from view and gets absorbed in application. You study the digital image of the shirt on the website; you don’t contemplate the image itself.

Art, in contrast, makes things strange. You do contemplate the image, when you examine Leonardo’s depiction of the lady with the ermine. You are likely, for example, to notice her jarringly oversized and masculine hand and to wonder why Leonardo draws our attention to that feature of this otherwise beautiful young person. Art disrupts plain looking and it does so on purpose. By doing so it discloses just what plain looking conceals.

Art unveils us to ourselves. Art is a making activity because we are by nature and culture organized by making activities. A work of art is a strange tool. It is an alien implement that affords us the opportunity to bring into view everything that was hidden in the background.

If I am right, art isn’t a phenomenon to be explained. Not by neuroscience, and not by philosophy. Art is itself a research practice, a way of investigating the world and ourselves. Art displays us to ourselves, and in a way makes us anew, by disrupting our habitual activities of doing and making.

Alva Noë is a philosopher at the University of California, Berkeley, and the author, most recently, of “Strange Tools: Art and Human Nature.”

http://opinionator.blogs.nytimes.com/20 ... ef=opinion

Post by kmaherali »

The Big University

Many American universities were founded as religious institutions, explicitly designed to cultivate their students’ spiritual and moral natures. But over the course of the 20th century they became officially or effectively secular.

Religious rituals like mandatory chapel services were dropped. Academic research and teaching replaced character formation at the core of the university’s mission.

Administrators and professors dropped spiritual language and moral prescription either because they didn’t know what to say or because they didn’t want to alienate any part of their diversifying constituencies. The humanities departments became less important, while parents ratcheted up the pressure for career training.

Universities are more professional and glittering than ever, but in some ways there is emptiness deep down. Students are taught how to do things, but many are not forced to reflect on why they should do them or what we are here for. They are given many career options, but they are on their own when it comes to developing criteria to determine which vocation would lead to the fullest life.

But things are changing. On almost every campus faculty members and administrators are trying to stem the careerist tide and to widen the system’s narrow definition of achievement. Institutes are popping up — with interdisciplinary humanities programs and even meditation centers — designed to cultivate the whole student: the emotional, spiritual and moral sides and not just the intellectual.

Technology is also forcing change. Online courses make the transmission of information a commodity. If colleges are going to justify themselves, they are going to have to thrive at those things that require physical proximity. That includes moral and spiritual development. Very few of us cultivate our souls as hermits. We do it through small groups and relationships and in social contexts.

In short, for the past many decades colleges narrowed down to focus on professional academic disciplines, but now there are a series of forces leading them to widen out so that they leave a mark on the full human being.

The trick is to find a way to talk about moral and spiritual things while respecting diversity. Universities might do that by taking responsibility for four important tasks.

First, reveal moral options. We’re the inheritors of an array of moral traditions. There’s the Greek tradition emphasizing honor, glory and courage, the Jewish tradition emphasizing justice and law, the Christian tradition emphasizing surrender and grace, the scientific tradition emphasizing reason and logic, and so on.

Colleges can insist that students at least become familiar with these different moral ecologies. Then it’s up to the students to figure out which one or which combination is best to live by.

Second, foster transcendent experiences. If a student spends four years in regular and concentrated contact with beauty — with poetry or music, extended time in a cathedral, serving a child with Down syndrome, waking up with loving friends on a mountain — there’s a good chance something transcendent and imagination-altering will happen.

Third, investigate current loves and teach new things to love. On her great blog, Brain Pickings, Maria Popova quotes a passage from Nietzsche on how to find your identity: “Let the young soul survey its own life with a view of the following question: ‘What have you truly loved thus far? What has ever uplifted your soul, what has dominated and delighted it at the same time?’ ” Line up these revered objects in a row, Nietzsche says, and they will reveal your fundamental self.

To lead a full future life, meanwhile, students have to find new things to love: a field of interest, an activity, a spouse, community, philosophy or faith. College is about exposing students to many things and creating an aphrodisiac atmosphere so that they might fall in lifelong love with a few.

Fourth, apply the humanities. The social sciences are not shy about applying their disciplines to real life. But literary critics, philosophers and art historians are shy about applying their knowledge to real life because it might seem too Oprahesque or self-helpy. They are afraid of being prescriptive because they idolize individual choice.

But the great works of art and literature have a lot to say on how to tackle the concrete challenges of living, like how to escape the chains of public opinion, how to cope with grief or how to build loving friendships. Instead of organizing classes around academic concepts — 19th-century French literature — more could be organized around the concrete challenges students will face in the first decade after graduation.

It’s tough to know how much philosophical instruction anybody can absorb at age 20, before most of life has happened, but seeds can be planted. Universities could more intentionally provide those enchanted goods that the marketplace doesn’t offer. If that happens, the future of the university will be found in its original moral and spiritual mission, but secularized, and in an open and aspiring way.

http://www.nytimes.com/2015/10/06/opini ... d=71987722
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Ghosts in a Secular Age

During Pope Francis’s stateside visit I wrote a post arguing that his rapturous reception from our ostensibly secular media was partial evidence for secularism’s relative weakness, notwithstanding certain recent de-Christianizing trends in the United States. For a very different sort of evidence, I recommend this personal essay in the latest Elle Magazine, in which Lisa Chase, the widow of the late Peter Kaplan — the beloved New York Observer editor, who died untimely of cancer in 2013 — describes her experiences communicating with what she thinks (not unreasonably, on the evidence presented) is Peter’s “discarnate” spirit.

The essay is a dispatch from the heart of what we think of as hyper-secular America: Not just New York City, not just upper-middle class white liberal New York City, but literary/journalistic New York City. But it’s clear as the story progresses that the author’s experience is not some outlying intrusion of the pre-modern into a thoroughly materialistic milieu; from the beginning her experiences are informed and steered and ratified by a social network (friends, psychiatrists, doctors) in which encounters with the numinous are accepted, however quietly and slightly nervously, as a part of the normal run of human life.

I wrote a little bit about these kinds of experiences, and what they tell us about the nature of our allegedly-secular age, in a post last year about ghost sightings in post-tsunami Japan. In that post, I invoked the Canadian philosopher Charles Taylor to suggest two possible ways of understanding the secular world-picture’s influence on religious/numinous/supernatural experience. In the first scenario, people in a secular society have the same basic kinds of mystical experiences as their ancestors, but the secular “immanent frame” imposes a particular interpretation on those experiences, encouraging people to interpret them as strictly internal/psychological events. (In that post I cited the example of how the Dutch filmmaker Paul Verhoeven reacted to what felt like a divine incursion in his youth; another example would be the aside in this interview with the atheist-intellectual couple Steven Pinker and Rebecca Goldstein in which Goldstein mentions how hard she had to work to “reason away” an experience where she felt like she was being contacted by the dead.) In the second scenario, though, the secular frame somehow changes the very nature of numinous experience, so that it feels more attenuated and unreal, and the human self is more “buffered” against its enchantments, terrors, and pull.

As I said in that post, this distinction is important for how we estimate the durability of secularism:

To the extent that the buffered self is a reading imposed on numinous experience after the fact, secularism looks weaker (relatively speaking), because no matter how much the intellectual assumptions of the day tilt in its favor, it’s still just one possible interpretation among many: On a societal level, its strength depends on the same mix of prejudice, knowledge, fashion and reason as any other world-picture, and for the individual there’s always the possibility that a mystical experience could come along … that simply overwhelms the ramparts thrown up to keep alternative interpretations at bay.

But if the advance of the secular world-picture actually changes the nature of numinous experience itself, by making it impossible to fully experience what Taylor calls “enchantment” in the way that people in pre-secular contexts did and do, then the buffered self is a much more literal reality, and secularism is self-reinforcing in a much more profound way. It doesn’t just close intellectual doors, it closes perceptual doors as well.

But the Elle essay suggests yet another understanding of how secularism interacts with spiritual experience. In this scenario, the key feature of the secular world-picture isn’t that it requires people to reinterpret their numinous experiences as strictly psychological events; it’s simply that it discourages people who have such experiences from embracing any kind of systematic (that is, religious/theological) interpretation of what’s happened to them, and then as a corollary discourages them from seeking out a permanent communal space (that is, a religious body) in which to further interact with these ultimate realities. Under secularism, in other words, most people who see a ghost or have a vision or otherwise step into the supernatural are still likely to believe in the essential reality of their encounter with the otherworldly or transcendent; they’re just schooled to isolate the experience, to embrace it as an interesting (and often hopeful) mystery without letting it call them to the larger conversion of life that most religious traditions claim that the capital-S Supernatural asks of us in return.

What secularism really teaches people, in this interpretation, isn’t that spiritual realities don’t exist or that spiritual experiences are unreal. It just privatizes the spiritual, in a kind of theological/sociological extension of church-state separation, and discourages people from organizing either intellectual systems (those are for scientists) or communities of purpose (that’s what politics is for) around their sense, or direct experience, that Something More exists.

This interpretation — which I think is clearly part of the truth of our time — has interesting implications for the future of religion in the West. One of the big religious questions going forward is whether the large swathe of people who have drifted from traditional faith but remain dissatisfied (for excellent reasons!) with strict neo-Darwinian materialism constitute a major market for religious entrepreneurs. Is there a version of theologically liberal Christianity that could actually bring these drifters back to church and keep them in the pews? Is there some new synthesis — pantheist, deist, syncretistic — that could seem plausible and nourishing and intellectually satisfying enough to plant an actual new religion in “spiritual, but not religious” territory? Is there enough residual Christian orthodoxy knocking around in the West’s cultural subconscious to make a revival or Great Awakening not only possible but likely? Etc.

My suspicion is that eventually someone will figure out a new or refashioned or revivalist message that resonates with the fallen-away but still spiritually-inclined; man is a religious animal, nature abhors a vacuum, people want community and common purpose, and above all people keep having metaphysical experiences and it’s only human to want to make sense out of them and not just compartmentalize them away from the remainder of your life.

But what you see in the Elle piece is that in the absence of strong institutions and theological systems dedicated to the Mysteries, human beings and human society can still make sense of these experiences through informal networks, private channels, personalized interpreters. And to the extent that these informal networks succeed in satisfying the human hunger for interpretation, understanding and reassurance — as they seem to have partially satisfied Peter Kaplan’s widow — then secularism might be more resilient, more capable of dealing effectively with the incorrigibility of the spiritual impulse, than its more arid and strictly materialist manifestations might suggest.

http://douthat.blogs.nytimes.com/2015/1 ... d=45305309
****
Lisa Chase's experience is given at:

http://www.elle.com/life-love/news/a309 ... -a-medium/
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Schools for Wisdom

Friends of mine have been raving about the documentary “Most Likely to Succeed,” and it’s easy to see what the excitement is about. The film is a bold indictment of the entire K-12 educational system.

Greg Whiteley’s documentary argues that the American school system is ultimately built on a Prussian model designed over 100 years ago. Its main activity is downloading content into students’ minds, with success or failure measured by standardized tests. This lecture and textbook method leaves many children bored and listless.

Worse, it is unsuited for the modern workplace. Information is now ubiquitous. You can look up any fact on your phone. A computer can destroy Ken Jennings, the world’s best “Jeopardy!” contestant, at a game of information retrieval. Computers can write routine news stories and do routine legal work. Our test-driven schools are training kids for exactly the rote tasks that can be done much more effectively by computers.

The better approach, the film argues, is to take content off center stage and to emphasize the relational skills future workers will actually need: being able to motivate, collaborate, persevere and navigate through a complex buffet of freelance gigs.

Whiteley highlights one school he believes is training students well. This is High Tech High, a celebrated school in San Diego that was started by San Diego business and tech leaders. This school takes an old idea, project-based learning, and updates it in tech clothing.

There are no textbooks, no bells marking the end of one period or start of the next. Students are given group projects built around a driving question. One group studied why civilizations rise and fall and then built a giant wooden model, with moving gears and gizmos, to illustrate the students’ theory. Another group studied diseases transmitted through blood, and made a film.

“Most Likely to Succeed” doesn’t let us see what students think causes civilizational decline, but it devotes a lot of time to how skilled they are at working in teams, demonstrating grit and developing self-confidence. There are some great emotional moments. A shy girl blossoms as a theater director. A smart but struggling boy eventually solves the problem that has stumped him all year.

The documentary is about relationships, not subject matter. In the school, too, teachers cover about half as much content as in a regular school. Long stretches of history and other subject curriculums are effectively skipped. Students do not develop conventional study habits.

The big question is whether such a shift from content to life skills is the proper response to a high-tech economy. I’d say it’s at best a partial response.

Ultimately, what matters is not only how well you can collaborate in groups, but the quality of the mind you bring to the group. In rightly playing up soft skills the movie underemphasizes intellectual virtues. For example, it ignores the distinction between information processing, which computers are good at, and knowledge, which they are not.

If we want to produce wise people, what are the stages that produce wisdom? First, there is basic factual acquisition. You have to know what a neutron or a gene is, and that the Civil War came before the Progressive Era. Research shows that students with a concrete level of core knowledge are better at remembering advanced facts and concepts as they go along.

Second, there is pattern formation, linking facts together in meaningful ways. This can be done by a good lecturer, through class discussion, through unconscious processing or by going over and over a challenging text until it clicks in your head.

Third, there is mental reformation. At some point while studying a field, the student realizes she has learned a new language and way of seeing — how to think like a mathematician or a poet or a physicist.

At this point information has become knowledge. It is alive. It can be manipulated and rearranged. At this point a student has the mental content and architecture to innovate, to come up with new theses, challenge others’ theses and be challenged in turn.

Finally, after living with this sort of knowledge for years and exposing it to the rigors of reality, wisdom dawns. Wisdom is a hard-earned intuitive awareness of how things will flow. Wisdom is playful. The wise person loves to share, cajole, guide and wonder at what she doesn’t know.

The cathedrals of knowledge and wisdom are based on the foundations of factual acquisition and cultural literacy. You can’t overleap that, which is what High Tech High is in danger of doing.

“Most Likely to Succeed” is inspiring because it reminds us that the new technology demands new schools. But somehow relational skills have to be taught alongside factual literacy. The stairway from information to knowledge to wisdom has not changed. The rules have to be learned before they can be played with and broken.

http://www.nytimes.com/2015/10/16/opini ... ef=opinion
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Academia’s Rejection of Diversity

ONE of the great intellectual and moral epiphanies of our time is the realization that human diversity is a blessing. It has become conventional wisdom that being around those unlike ourselves makes us better people — and more productive to boot.

Scholarly studies have piled up showing that race and gender diversity in the workplace can increase creative thinking and improve performance. Meanwhile, excessive homogeneity can lead to stagnation and poor problem-solving.

Unfortunately, new research also shows that academia has itself stopped short in both the understanding and practice of true diversity — the diversity of ideas — and that the problem is taking a toll on the quality and accuracy of scholarly work. This year, a team of scholars from six universities studying ideological diversity in the behavioral sciences published a paper in the journal Behavioral and Brain Sciences that details a shocking level of political groupthink in academia. The authors show that for every politically conservative social psychologist in academia there are about 14 liberal social psychologists.

Why the imbalance? The researchers found evidence of discrimination and hostility within academia toward conservative researchers and their viewpoints. In one survey cited, 82 percent of social psychologists admitted they would be less likely to support hiring a conservative colleague than a liberal scholar with equivalent qualifications.

This has consequences well beyond fairness. It damages accuracy and quality. As the authors write, “Increased political diversity would improve social psychological science by reducing the impact of bias mechanisms such as confirmation bias, and by empowering dissenting minorities to improve the quality of the majority’s thinking.”

One of the study’s authors, Philip E. Tetlock of the University of Pennsylvania, put it to me more bluntly. Expecting trustworthy results on politically charged topics from an “ideologically incestuous community,” he explained, is “downright delusional.”

Are untrustworthy academic findings really a problem? In a few high-profile cases, most definitely. Take, for example, Prof. Diederik Stapel of Tilburg University in the Netherlands, who in 2011 faked experiments to show, among other things, that eating meat made people selfish. (He later said that his work was “a quest for aesthetics, for beauty — instead of the truth”).

This kind of ideologically motivated fraud is mercifully rare. As a social scientist working in universities and think tanks, I have never met a colleague who I believe has engaged in this sort of misconduct.

But even honest researchers are affected by the unconscious bias that creeps in when everyone thinks the same way. Certain results — especially when they reinforce commonly held ideas — tend to receive a lower standard of scrutiny. This might help explain why, when the Open Science Collaboration’s Reproducibility Project recently sought to retest 100 social science studies, the group was unable to confirm the original findings more than half the time.

These concerns aren’t a modern innovation. In one classic experiment from 1975, a group of scholars was asked to evaluate one of two research papers that used the same statistical methodology to reach opposite conclusions. One version “found” that liberal political activists were mentally healthier than the general population; the other paper, otherwise identical, was set up to “prove” the opposite conclusion. The liberal reviewers rated the first version significantly more publishable than its less flattering twin.

The World Bank has found a similar phenomenon at work among its own staff. In a recent exercise, the organization presented identical data sets to employees under two different pretexts. Some employees were told the data were measuring the effectiveness of a skin rash cream, while others were told the same data measured the effects of minimum wage laws on poverty. The politicized context of the second question led to more erroneous analyses, and the accuracy of left-leaning respondents plummeted when the data conflicted with their worldview.

Improving ideological diversity is not a fundamentally political undertaking. Rather, it is a question of humility. Proper scholarship is based on the simple virtues of tolerance, openness and modesty. Having people around who think differently thus improves not only science, but also character.

Many academics and intellectuals see their community as a major force for diversity and open-mindedness throughout American society, and take justifiable pride in this image. Now they can be consistent and apply those values to their own profession, by celebrating ideological diversity.

http://www.nytimes.com/2015/10/31/opini ... d=45305309
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Moral Dispute or Cultural Difference?

The word “relativism” tends to generate strong reactions. This is odd, given that the word is not generally used with a clear and agreed upon meaning. I want to offer a specific proposal about what it means, with a view to navigating the following “real-world” problem, discussed by Alex Rosenberg here at The Stone in July: What should we do when we face what are often described as irresolvable moral disagreements?

In a disagreement, two parties affirm and deny the same thing; because the parties contradict each other, they cannot both be right; because they cannot both be right, there is something to be resolved between them by figuring out which of them is mistaken; a disagreement remains unresolved so long as both parties continue to think the other is mistaken; it is irresolvable when there is no method by which to resolve it.

Are there any such irresolvable disagreements?

A moral relativist will most likely argue that moral disputes are irresolvable because moral beliefs are not strictly true or false, because there are no facts in the world that would make them so — no moral facts. Therefore, such “truth” as they may be said to have is grounded in subjective elements — states of mind such as our desires and emotions. Since these subjective elements will vary among persons and cultures, they may generate conflicts that no possible appeal to objective facts could ever resolve.

Because the term “moral relativism” is closely associated with this subjectivist picture of morality, it elicits understandable hostility. How can we earnestly hold our moral commitments if we give up on the aspiration to objectivity regarding morals, to getting them right rather than wrong?

I think there is another way to understand what moral relativism involves, which does not require us to give up our aspiration to objectivity. Let me use an example.

Imagine me to be a middle-aged woman of middle-class origin who grew up in middle America. I went to college, graduated, then went on to get a master’s degree in business, after which I worked on Wall Street and made a lot of money — so much that I retired early. I never married or had children, which was a source of regret to my parents. But they are proud of me. We are all committed to the ideals of liberal individualism, and agree that each of us is responsible for his or her own life, financially and otherwise.

Shortly after retirement, I decide to travel, and during a visit to a rural village in the Punjab I meet a woman my age named Anjali. The main facts of her life are: her parents arranged her marriage when she was a very young girl, she was married in her early teens and since then she has had many children. She is already a grandmother. Her life has been organized entirely around family responsibilities.

Initially, Anjali finds my decision not to marry or have children repugnant, especially since my parents clearly wished it. She tells me, through an interpreter, that we are all morally obliged to defer to our parents’ wishes. I initially take myself to have a moral disagreement with her, for I believe that I was not morally obliged to defer to my parents’ wish that I marry and have children.

Many Westerners may think this moral disagreement can easily be resolved in my favor: I have done no wrong in seeking my fortune, and Anjali should have the right to do the same. But this Western attitude overlooks an important fact: it does not generally lie within human power to re-make a whole culture at will.

In her actual cultural circumstances, it isn’t an option for Anjali to set off to seek her fortune on her own, apart from her family network, any more than it is an option for me to take up the various traditional duties that befall females in extended families in rural Punjab. Owing to these differences in our cultural circumstances, Anjali and I need very different moral truths to live by, in order to navigate the specific moral options that we face. Does this mean that she and I are bound to live by conflicting values — that we face an irresoluble moral disagreement, about whether it is morally obligatory to defer to our parents’ wishes?

No. When Anjali recognizes her moral obligation to defer to her parents’ wishes, she conceives it as one among many special duties that she bears to her parents, which sit alongside other special duties to other members of her extended family, all of which go by the name katarvya. When I deny that I have any moral obligation to defer to my parents’ wishes, I am not thinking of katarvya, and in fact before I got to know Anjali I had no conception of katarvya at all. I was thinking in terms of the sorts of obligations that are recognized within the framework of liberal individualism; I was not violating my parents’ rights when I fashioned my life plan according to my own wishes rather than theirs. So Anjali and I never really contradicted each other concerning what we owe our parents. She had been affirming that she owes her parents the special duties of katarvya, while I had been denying that my parents’ rights include dictating my major life decisions.

What we really confront here, then, is a kind of difference which is not a disagreement. We come to see that we are each right to live by our respective moral beliefs, due to the way in which they speak to our respective circumstances, and the specific moral issues that arise within them. Yet although we each come to regard the other’s moral beliefs as true, neither of us adopts the other’s moral beliefs for herself, as truths to live by. When this occurs, we have occasion to adopt a distinctively relativist stance, which is a stance of disengagement rather than disagreement. As we learn about one another’s moral beliefs, we do not thereby gain any moral insight into how we should live our own moral lives, nor do we try to instruct others about how they should live theirs.

The moral relativism I am proposing makes sense of this situation by concluding that while moral truths hold objectively, they do not hold universally, only locally. If Anjali and I require different moral beliefs to live by, this shows that we live in different moral worlds, in which different moral truths hold.

This conception of moral relativism is not without its problems. It seems more plausible for some cases, such as the one I just gave, than for others. Take, for example, such practices as sati (widow burning), female genital mutilation and honor killing. Our immediate response is likely to be: We deem these practices wrong; they must be stopped, preferably by convincing those who participate in them that they are wrong. If we cannot convince them, then we take ourselves to face irresoluble moral disagreements, in the face of which we should remain true to our moral beliefs, by continuing to insist that the practices are wrong, and opposing them by all the usual means available in a polity — legislation, state intervention, etc.

The response I just described is essentially anti-relativist. And note that it opposes relativism on both of the conceptions I have been discussing. It insists that the parties to a disagreement cannot both be right, that there are objective matters to be right or wrong about. It also insists that we should not disengage from those who morally differ from us, but should retain a sense of disagreement with them, by putting forward the moral truths by which we live as universal truths that hold for everyone.

But before we reject moral relativism, we should explore one other possibility. While the relativist does want to say, in a general way, that people with moral differences probably are responding to very different cultural circumstances, she does not have to say that those who participate in the specific practices of sati, female genital mutilation and honor killing are right to do so. She may also say that they are wrong by their own standards. For there may be local moral truths, which hold in the very cultural conditions in which those practices have arisen, in the light of which they are wrong. If that is so, then the participants in these practices misunderstand what their own moral principles entail. This is one perfectly plausible way of understanding how American society came to realize that it was wrong to give only white males full civil rights. And it would be a particularly parochial form of self-congratulation to say that what was true of America is not feasible for other societies.

Of course, it is conceivable that there are no local truths in the light of which sati, female genital mutilation and honor killing would count as wrong. But, the point is, only then — that is, only if there are no local truths that stand in tension with these practices — would the moral relativist have to conclude that our moral differences over them are like the case I described above about Anjali and me, in the respect that both parties are actually right.

Moral relativism, as I propose understanding it, calls for an exploratory approach to encounters with moral difference. It discourages us from taking the appearance of irresoluble moral disagreements at face value, without first exploring the possibility that the parties might not actually be disagreeing, but addressing quite different moral circumstances, for which they need quite different moral truths to live by. If we arrived at this relativist conclusion, which gives up on the universality of moral truth, we would not then abandon the idea of moral objectivity. It is still possible for the parties involved to be in error, insofar as they misunderstand their own moral principles.

Carol Rovane is a professor of philosophy at Columbia University, and the author of “Metaphysics and Ethics of Relativism” and “The Bounds of Agency: An Essay in Revisionary Metaphysics.”

http://opinionator.blogs.nytimes.com/20 ... ef=opinion
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Communities of Character

We live in an individualistic age. As Marc J. Dunkelman documents in his book “The Vanishing Neighbor,” people tend to have their close group of inner-ring family and friends and then a vast online outer-ring network of contacts, but they are less likely to be involved in middle-ring community organizations.

But occasionally I stumble across a loving, charismatic and super-tight neighborhood organization. Very often it’s a really good school.

You’d think that schools would naturally nurture deep community bonds. But we live in an era and under a testing regime that emphasizes individual accomplishments, not community cohesion. Even when schools talk about values, they tend to talk about individualistic values, like grit, resilience and executive function, not the empathy, compassion and solidarity that are good for community and the heart.

Researchers at the Harvard Graduate School of Education asked 10,000 middle and high school students if their parents cared more about their personal achievement or whether they were kind. Eighty percent said their parents cared more about achievement — the individual over the group.

But there are some schools that nurture achievement precisely by building tight communities.

The Denver School of Science and Technology has an intense values-centered culture, emphasizing values like respect and responsibility. Four days a week everybody gathers for a morning meeting. Those who contribute to the community are affirmed. When students have strained the community, by being rude to cafeteria workers, for example, the rift is recognized, discussed and healed.

Last week I visited the Leaders School in Bensonhurst, Brooklyn, which is a glowing example of community cohesion. This is a school with roughly 300 students who, among them, speak 22 languages. Eighty-five percent are on free or reduced-price lunch. Last year the graduation rate was an amazing 89 percent and every single graduate went to college. The average SAT score was 411 math and 384 verbal.

The school’s approach and curriculum is organized by Outward Bound. (This newspaper’s publisher, Arthur Sulzberger Jr., once was chairman of the NYC Outward Bound Schools chapter.)

When the students arrive at Leaders as freshmen they are assigned to a crew, a group of 12-15 students with an adviser. Right at the start they go on a wilderness adventure, and go through a process of “storming, norming and performing.” As they learn to cook for each other and deal with outdoor challenges, first they fight, then they come up with community norms, and then they perform. The crew stays together for the next four years, supporting each other with family, romantic and academic issues.

Students are given tremendous responsibility, and are put in challenging social circumstances that call forth compassion, judgment, sensitivity and mercy. If one student writes something nasty about another on social media, then the two get together with two student mediators and together they work out a resolution. If there’s a serious infraction that would merit a suspension at another school, the guilty party meets with a Harm Circle, and they figure out some proper act of contrition and restorative justice.

One day each December the community gathers outside the school and the seniors march as a unit with their college application letters through cheering crowds and to a waiting mail truck.

Most classes are conducted through Socratic dialogue. Students learn to negotiate disagreements. They get academic grades, but also leadership grades that measure their character. The students lead their own parent-teacher conferences. They stand up before their parents, a teacher and other observers and they give a presentation on their successes, failures and how they might improve.

I was amazed by how well the students had been trained at group discussion, using a talking and listening method they call “Step Up/Step Back.” “Let me build on what Shazzarda was saying…” one student would say. If a member of the group had been silent for a few minutes, somebody would pull her in: “Maybe Essence is the best person to explain that…”

Most of all I was struck by their kindness toward one another. No student could remember any racial or ethnic conflict. Many upperclassmen serve as peer mentors to the underclassmen. There’s a palpable sense of being cared for. That’s in part because the school has a wide definition of student achievement.

Kurt Hahn, the founder of Outward Bound, once wrote, “It is the foremost task of education to insure the survival of these qualities: an enterprising curiosity, an undefeatable spirit, tenacity in pursuit, readiness for sensible denial, and above all, compassion.”

All over the country there are schools and organizations trying to come up with new ways to cultivate character. The ones I’ve seen that do it best, so far, are those that cultivate intense, thick community. Most of the time character is not an individual accomplishment. It emerges through joined hearts and souls, and in groups.

http://www.nytimes.com/2015/11/27/opini ... inion&_r=0
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Memes, Dreams and Themes

We have ideas, many of them, every day. We have them, but we don’t often reflect on them. Mostly they just come and go. How many ideas did you have today? What was their character? Some you might describe as big or small, simple or complex. Is it possible to gain a better understanding of ideas, their types and value to us? Is it possible to establish a taxonomy of ideas?

I’d like to try. I want to propose a taxonomy of ideas that invokes a three-way division: memes, dreams and themes.

Let’s start with a basic fact: It is characteristic of ideas to be shared by many minds. Why is this?

One reason is that ideas spread from one mind to another. Here, the concept of a meme comes in: A meme spreads like a virus from one mind to another, duplicating itself, colonizing new minds.

In modern life, we are immersed in memes — jingles, catchphrases, fads, fashions, crazes, religions, ideologies, mannerisms and accents. They spread by imitation and natural credulity, exploiting the receptivity of the human mind to new information and influence, forged in childhood. Uncritical copying is favored. People just can’t help picking stuff up, willy-nilly. Memes may also mutate and be subject to natural selection, sometimes proliferating wildly, before possibly going extinct. Thus ideas (in a broad sense) exist in many minds because they are memes: They have arrived there from somewhere else by means of meme transfer.

Memes are like computer viruses — they trade on the architecture of the system to insert themselves into the software. Once inside they can vary from mild mental nuisance to dangerous ideology. In some respects they work like a drug: They trigger reactions in our brains that take over our minds. That annoying jingle in your head is a meme playing with your brain chemistry. (Here I have paraphrased a concept originated by Richard Dawkins, who coined the term “meme.”)

The concept of the meme can be taken more or less widely. Some people take it to provide a general theory of human culture and idea transmission. I want to distinguish the meme from two other sorts of idea that are importantly different from it.

First, dreams. Dream ideas, like memes, are widely shared, with the same kinds of dream cropping up in widely different communities and cultures. And as with memes, these dream contents often seem arbitrary and pointless — despite being widely shared. Though the variety of dreams is essentially unlimited, there are types of dreams that most of us have regularly: dreams of falling, flying, being pursued, being embarrassed, missing trains or buses, being inadequately prepared, being incapacitated and finding an extra room in the house. (The last item is particularly peculiar: Why should so many people dream of that?) Dream ideas are not shared because they are transmitted like memes. They don’t spread like a virus from one mind to another; they are not the result of automatic imitation.

No one knows for sure why people dream as they do, though theories abound. But one thing is clear: It is not by means of imitation. Dreamers do not transmit their dreams to others by recounting them or otherwise making them public (say, by making a film embodying the dream). You cannot plant a dream in the mind of another. Yet people still tend to spontaneously have the same sorts of dream.

It’s possible that dream life can be influenced to some degree by shared culture in memelike fashion, but that does not explain shared dream content. Dreams seem to grow from within, like bits of anatomy. Memes are externally formed; dreams are internally formed. So dreams are not memes. They don’t spread from mind to mind by imitation or manipulation.

I call the third category themes — mainly for the sake of the rhyme, but also because it has a breadth that I want to emphasize. One of the salient features of memes is that they do not spread by rational persuasion — they spread by nonrational or irrational manipulation. But the spread of scientific ideas, to take the most obvious example, is not like that: They spread because they have been found to be true, or at least empirically confirmed. Thus Darwinism is accepted because of the overwhelming evidence in its support. There is no psychological exploitation at work here. The explanation for the spread of scientific ideas is simply the power of scientific method.

I hope what I have just said is completely uncontroversial, because now I want to court controversy. It would surely be wrong to restrict the nonmeme type of idea transmission to science: Many other disciplines involve shared beliefs, where these beliefs are shared for good rational reasons. Thus, history, geography, literature, philosophy, mathematics, music theory, engineering and cookery.

There is a large range of human cognitive activities in which ideas are shared by something other than meme propagation — not all of them counting as “science.” We clearly need to expand the notion of rationality so as to incorporate these areas. And there is no difficulty in doing so: There are standards of evidence and argument and intellectual rigor that characterize all these areas — it isn’t all jingles and ideology (despite what post-modernists may claim).

But, as always, matters get a bit more interesting when it comes to morals and aesthetics. Moral ideas spread, as do aesthetic ideas — is this kind of spread more like meme transmission or scientific communication? Compare an advertising jingle to the opening bars of Beethoven’s Fifth. Both may lodge in the mind against one’s will, there to repeat themselves endlessly; and they may be transmitted to others, for example by whistling. Are they both therefore memes? I would say not. There is a different explanation of the musical spread in each case: In the jingle case we have a meme, a worthless cultural trope that insidiously takes over the mind; but in the Beethoven’s Fifth case we have an aesthetically appealing and valuable musical theme. In the case of morals, one might cite the difference between the proclamation of universal human rights, a theme, and the spread by propaganda of racist ideology, a meme — one rational, the other irrational. And that is why I call my third category “themes”: Themes are cultural units with intrinsic value, which deserve to be spread and replicated. The reason they spread is that they are inherently good — meritorious, worthwhile — and are generally recognized to be so.

Notice here that we can distinguish themes from memes only by employing evaluative language, and by assuming that values play a role in cultural transmission. Themes spread because they have value (notably truth or rational justification), while memes spread despite having no value. It is the same with other aesthetic products, like art and literature. Famous lines from Shakespeare don’t spread because they are memes — worthless cultural viruses — but because they are judged to be aesthetically valuable, and rightly so judged. Nor is this a matter of high culture versus low culture: A good Beatles song is a completely different animal from a commercial jingle. The point is that the mechanism of transmission is quite different in the two cases, being more like science in the theme case, in contrast to your typical meme.

None of this is to say that memes and themes cannot get mixed up in practice, or that it is always easy to tell the difference. There can be fads and fashions in science — memes masquerading as themes (the idea of a “paradigm shift” comes to mind); and not all art that is popular is good art (popularity as measured by the number of reproductions of a given picture adorning the walls of suburban homes). But there is a deep difference of principle here — there are two very different kinds of idea transmission.

Memes may disguise themselves as themes in order to gain a stronger hold, as with certain “scientific” ideologies, or kitsch art. The difference lies in the psychological means of transmission. Themes may spread from mind to mind in an epidemiological manner, even mutating as they spread, but the reason for their exponential spread is not the same as for memes. In the latter case it is brute susceptibility, but in the former case it is appreciation of merit. This is why we don’t resent the transmission of themes into our minds, while we do resent the insertion of memes. Theme transmission is genuine learning or improvement, but meme transmission involves no learning or improvement, merely mental infection.

One of the central questions of civilized life is which of one’s existing ideas are memes and which are themes: Which are absorbed because of mental manipulation, usually conformity and imitation, and which represent genuine value — scientific, moral, or aesthetic? That is, does one accept certain ideas for the right reasons? Does one’s personal culture consist of memes or themes? Is it merit or manipulation that explains the contents of one’s mind? That vital question can be answered only if we refuse to extend the meme concept beyond its legitimate domain.

Colin McGinn has taught at various universities in England and the United States. His most recent book is “Prehension: The Hand and the Emergence of Humanity.”

http://opinionator.blogs.nytimes.com/20 ... 87722&_r=0
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

The Lie About College Diversity

Once a campus of different races and ethnicities is assembled, then what? Schools haven't mastered that part

Excerpt:

There’s a profusion of affinity groups. There are themed living arrangements that allow students with similar backgrounds and overlapping hobbies to huddle together. In terms of curriculum, there’s enormous freedom, which can translate into the ability to chart and stick to a narrow path with fellow travelers whose perspectives are much the same as yours.

So even if a school succeeds in using its admissions process to put together a diverse student body, it often fails at the more important goal that this diversity ideally serves: meaningful interactions between people from different backgrounds, with different scars and different ways of looking at the world.

A given college may be a heterogeneous archipelago. But most of its students spend the bulk of their time on one of many homogeneous islands.

That’s consistent with the splintered state of America today, but it’s a betrayal of education’s mission to challenge ingrained assumptions, disrupt entrenched thinking, broaden the frame of reference.

In that sense it’s a betrayal as well of affirmative action, which isn’t merely a matter of cultural and economic redress and isn’t just about social mobility (though those are plenty worthy aims). It’s about an optimal learning environment for all students: white as well as black, privileged as well as underprivileged.

That environment hinges on what happens after admissions.

More...
http://www.nytimes.com/2015/12/13/opini ... d=71987722
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

MHI Farman: modern knowledge
Darkhana Jamatkhana
Karachi, Pakistan
October 26, 2000 (A.M.)

"The wherewithal to use modern knowledge is a central issue for the FUTURE of the Jamat.

Today, here, there are many people from outside Karachi, and I say this: I GIVE YOU THIS FARMAN TODAY because I do not want it to be believed that, in order to have the benefit of this up-to-date knowledge, you have to live in Karachi. The opposite is true, that knowledge has to be brought to you wherever you are. As REMOTE as the Jamat is, that knowledge MUST come to you."
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

When Philosophy Lost Its Way

The history of Western philosophy can be presented in a number of ways. It can be told in terms of periods — ancient, medieval and modern. We can divide it into rival traditions (empiricism versus rationalism, analytic versus Continental), or into various core areas (metaphysics, epistemology, ethics). It can also, of course, be viewed through the critical lens of gender or racial exclusion, as a discipline almost entirely fashioned for and by white European men.

Yet despite the richness and variety of these accounts, all of them pass over a momentous turning point: the locating of philosophy within a modern institution (the research university) in the late 19th century. This institutionalization of philosophy made it into a discipline that could be seriously pursued only in an academic setting. This fact represents one of the enduring failures of contemporary philosophy.

Take this simple detail: Before its migration to the university, philosophy had never had a central home. Philosophers could be found anywhere — serving as diplomats, living off pensions, grinding lenses, as well as within a university. Afterward, if they were “serious” thinkers, the expectation was that philosophers would inhabit the research university. Against the inclinations of Socrates, philosophers became experts like other disciplinary specialists. This occurred even as they taught their students the virtues of Socratic wisdom, which highlights the role of the philosopher as the non-expert, the questioner, the gadfly.

Philosophy, then, as the French thinker Bruno Latour would have it, was “purified” — separated from society in the process of modernization. This purification occurred in response to at least two events. The first was the development of the natural sciences, as a field of study clearly distinct from philosophy, circa 1870, and the appearance of the social sciences in the decade thereafter. Before then, scientists were comfortable thinking of themselves as “natural philosophers” — philosophers who studied nature; and the predecessors of social scientists had thought of themselves as “moral philosophers.”

The second event was the placing of philosophy as one more discipline alongside these sciences within the modern research university. A result was that philosophy, previously the queen of the disciplines, was displaced, as the natural and social sciences divided the world between them.

This is not to claim that philosophy had reigned unchallenged before the 19th century. The role of philosophy had shifted across the centuries and in different countries. But philosophy in the sense of a concern about who we are and how we should live had formed the core of the university since the church schools of the 11th century. Before the development of a scientific research culture, conflicts among philosophy, medicine, theology and law consisted of internecine battles rather than clashes across yawning cultural divides. Indeed, these older fields were widely believed to hang together in a grand unity of knowledge — a unity directed toward the goal of the good life. But this unity shattered under the weight of increasing specialization by the turn of the 20th century.

Early 20th-century philosophers thus faced an existential quandary: With the natural and social sciences mapping out the entirety of both theoretical as well as institutional space, what role was there for philosophy? A number of possibilities were available: Philosophers could serve as 1) synthesizers of academic knowledge production; 2) formalists who provided the logical undergirding for research across the academy; 3) translators who brought the insights of the academy to the world at large; 4) disciplinary specialists who focused on distinctively philosophical problems in ethics, epistemology, aesthetics and the like; or 5) as some combination of some or all of these.

There might have been room for all of these roles. But in terms of institutional realities, there seems to have been no real choice. Philosophers needed to embrace the structure of the modern research university, which consists of various specialties demarcated from one another. That was the only way to secure the survival of their newly demarcated, newly purified discipline. “Real” or “serious” philosophers had to be identified, trained and credentialed. Disciplinary philosophy became the reigning standard for what would count as proper philosophy.

This was the act of purification that gave birth to the concept of philosophy most of us know today. As a result, and to a degree rarely acknowledged, the institutional imperative of the university has come to drive the theoretical agenda. If philosophy was going to have a secure place in the academy, it needed its own discrete domain, its own arcane language, its own standards of success and its own specialized concerns.

Having adopted the same structural form as the sciences, it’s no wonder philosophy fell prey to physics envy and feelings of inadequacy. Philosophy adopted the scientific modus operandi of knowledge production, but failed to match the sciences in terms of making progress in describing the world. Much has been made of this inability of philosophy to match the cognitive success of the sciences. But what has passed unnoticed is philosophy’s all-too-successful aping of the institutional form of the sciences. We, too, produce research articles. We, too, are judged by the same coin of the realm: peer-reviewed products. We, too, develop sub-specializations far from the comprehension of the person on the street. In all of these ways we are so very “scientific.”

Our claim, then, can be put simply: Philosophy should never have been purified. Rather than being seen as a problem, “dirty hands” should have been understood as the native condition of philosophic thought — present everywhere, often interstitial, essentially interdisciplinary and transdisciplinary in nature. Philosophy is a mangle. The philosopher’s hands were never clean and were never meant to be.

There is another layer to this story. The act of purification accompanying the creation of the modern research university was not just about differentiating realms of knowledge. It was also about divorcing knowledge from virtue. Though it seems foreign to us now, before purification the philosopher (and natural philosopher) was assumed to be morally superior to other sorts of people. The 18th-century thinker Joseph Priestley wrote “a Philosopher ought to be something greater and better than another man.” Philosophy, understood as the love of wisdom, was seen as a vocation, like the priesthood. It required significant moral virtues (foremost among these were integrity and selflessness), and the pursuit of wisdom in turn further inculcated those virtues. The study of philosophy elevated those who pursued it. Knowing and being good were intimately linked. It was widely understood that the point of philosophy was to become good rather than simply to collect or produce knowledge.

As the historian Steven Shapin has noted, the rise of disciplines in the 19th century changed all this. The implicit democracy of the disciplines ushered in an age of “the moral equivalence of the scientist” to everyone else. The scientist’s privileged role was to provide the morally neutral knowledge needed to achieve our goals, whether good or evil. This put an end to any notion that there was something uplifting about knowledge. The purification made it no longer sensible to speak of nature, including human nature, in terms of purposes and functions. By the late 19th century, Kierkegaard and Nietzsche had proved the failure of philosophy to establish any shared standard for choosing one way of life over another. This is how Alasdair MacIntyre explained philosophy’s contemporary position of insignificance in society and marginality in the academy. There was a brief window when philosophy could have replaced religion as the glue of society; but the moment passed. People stopped listening as philosophers focused on debates among themselves.

Once knowledge and goodness were divorced, scientists could be regarded as experts, but there are no morals or lessons to be drawn from their work. Science derives its authority from impersonal structures and methods, not the superior character of the scientist. The individual scientist is no different from the average Joe; he or she has, as Shapin has written, “no special authority to pronounce on what ought to be done.” For many, science became a paycheck, and the scientist became a “de-moralized” tool enlisted in the service of power, bureaucracy and commerce.

Here, too, philosophy has aped the sciences by fostering a culture that might be called “the genius contest.” Philosophic activity devolved into a contest to prove just how clever one can be in creating or destroying arguments. Today, a hyperactive productivist churn of scholarship keeps philosophers chained to their computers. Like the sciences, philosophy has largely become a technical enterprise, the only difference being that we manipulate words rather than genes or chemicals. Lost is the once common-sense notion that philosophers are seeking the good life — that we ought to be (in spite of our failings) model citizens and human beings. Having become specialists, we have lost sight of the whole. The point of philosophy now is to be smart, not good. It has been the heart of our undoing.

Robert Frodeman and Adam Briggle teach in the department of philosophy and religion at the University of North Texas. They are co-authors of the forthcoming “Socrates Tenured: The Institutions of 21st-Century Philosophy.”

http://opinionator.blogs.nytimes.com/20 ... ef=opinion

*****
There is a counterview of the above post in the article below:

Philosophy’s True Home

We’ve all heard the argument that philosophy is isolated, an “ivory tower” discipline cut off from virtually every other progress-making pursuit of knowledge, including math and the sciences, as well as from the actual concerns of daily life. The reasons given for this are many. In a widely read essay in this series, “When Philosophy Lost Its Way,” Robert Frodeman and Adam Briggle claim that it was philosophy’s institutionalization in the university in the late 19th century that separated it from the study of humanity and nature, now the province of social and natural sciences.

This institutionalization, the authors claim, led it to betray its central aim of articulating the knowledge needed to live virtuous and rewarding lives. I have a different view: Philosophy isn’t separated from the social, natural or mathematical sciences, nor is it neglecting the study of goodness, justice and virtue, which was never its central aim.

More....
http://opinionator.blogs.nytimes.com/20 ... ef=opinion
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

If a man does not know what port he is steering for,
no wind is favorable to him.
- Seneca

Efforts and courage are not enough without purpose and direction.
- John F. Kennedy

The tragedy of life doesn't lie in not reaching your goal.
The tragedy lies in having no goal to reach.
- Benjamin Mays

Strategy without tactics is the slowest route to victory.
Tactics without strategy is the noise before defeat.
- Sun Tzu

Two roads diverged in a wood, and I ...
I took the one less traveled by,
and that has made all the difference.
- Robert Frost
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

What a Million Syllabuses Can Teach Us

COLLEGE course syllabuses are curious documents. They represent the best efforts by faculty and instructors to distill human knowledge on a given subject into 14-week chunks. They structure the main activity of colleges and universities. And then, for the most part, they disappear.

Some schools archive them, some don’t. Some syllabus archives are public, some aren’t. Some faculty members treat their syllabuses as trade secrets, others are happy to post them online. Despite the bureaucratization of higher education over the past few decades, syllabuses have escaped systematic treatment.

Until now. Over the past two years, we and our partners at the Open Syllabus Project (based at the American Assembly at Columbia) have collected more than a million syllabuses from university websites. We have also begun to extract some of their key components — their metadata — starting with their dates, their schools, their fields of study and the texts that they assign.
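The article does not describe the Open Syllabus Project's actual pipeline, but the kind of metadata extraction it mentions can be sketched in a few lines. The toy Python example below is my own illustration, not the project's real schema: the field names, the "Required Texts" heading and the patterns are all assumptions, and real syllabus parsing is far messier. It simply pulls a year and a list of assigned texts out of plain syllabus text with pattern matching.

import re

# Illustrative only: a toy extractor for two metadata fields the article
# mentions (the year and the assigned texts). The heading and patterns
# here are assumptions, not the Open Syllabus Project's actual schema.

def extract_metadata(syllabus_text):
    metadata = {"year": None, "assigned_texts": []}

    # Grab the first plausible four-digit year (e.g. "Fall 2015").
    year_match = re.search(r"\b(19|20)\d{2}\b", syllabus_text)
    if year_match:
        metadata["year"] = int(year_match.group(0))

    # Treat the lines under a "Required Texts" heading as assigned readings.
    in_texts_section = False
    for line in syllabus_text.splitlines():
        stripped = line.strip()
        if re.match(r"(?i)^required texts?:?$", stripped):
            in_texts_section = True
            continue
        if in_texts_section:
            if not stripped:          # a blank line ends the section
                break
            metadata["assigned_texts"].append(stripped)

    return metadata


sample = """PHIL 101: Introduction to Philosophy, Fall 2015

Required Texts
Plato, Republic
Aristotle, Nicomachean Ethics

Week 1: What is philosophy?
"""

print(extract_metadata(sample))
# {'year': 2015, 'assigned_texts': ['Plato, Republic', 'Aristotle, Nicomachean Ethics']}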

This past week, we made available online a beta version of our Syllabus Explorer, which allows this database to be searched. Our hope and expectation is that this tool will enable people to learn new things about teaching, publishing and intellectual history.

More....
http://www.nytimes.com/2016/01/24/opini ... d=71987722
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Marvin Minsky, Pioneer in Artificial Intelligence, Dies at 88

Marvin Minsky, who combined a scientist’s thirst for knowledge with a philosopher’s quest for truth as a pioneering explorer of artificial intelligence, work that helped inspire the creation of the personal computer and the Internet, died on Sunday night in Boston. He was 88.

His family said the cause was a cerebral hemorrhage.

Well before the advent of the microprocessor and the supercomputer, Professor Minsky, a revered computer science educator at M.I.T., laid the foundation for the field of artificial intelligence by demonstrating the possibilities of imparting common-sense reasoning to computers.

“Marvin was one of the very few people in computing whose visions and perspectives liberated the computer from being a glorified adding machine to start to realize its destiny as one of the most powerful amplifiers for human endeavors in history,” said Alan Kay, a computer scientist and a friend and colleague of Professor Minsky’s.

Fascinated since his undergraduate days at Harvard by the mysteries of human intelligence and thinking, Professor Minsky saw no difference between the thinking processes of humans and those of machines. Beginning in the early 1950s, he worked on computational ideas to characterize human psychological processes and produced theories on how to endow machines with intelligence.

Professor Minsky, in 1959, co-founded the M.I.T. Artificial Intelligence Project (later the Artificial Intelligence Laboratory) with his colleague John McCarthy, who is credited with coining the term “artificial intelligence.”

More....
http://www.nytimes.com/2016/01/26/busin ... at-88.html
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Don’t Turn Away From the Art of Life

In a letter written in 1871, the Symbolist poet Arthur Rimbaud uttered a phrase that announces the modern age: “‘Je’ est un autre” (“‘I’ is someone else”). Some 69 years later I entered the world as an identical twin, and Rimbaud’s claim has an uncanny truth for me, since I grew up being one of a pair. Even though our friends and family could easily tell us apart, most people could not, and I began life with a blurrier, more fluid sense of my contours than most other folks.

My brother and I live in different cities, but I have never lost my conviction that one’s outward form — the shape of people, but also of surfaces and things — may not be what it seems.

That personal intuition is of a piece with my career as a professor of literature, since I am convinced that great works of art tell us about shape-shifting, about both the world and ourselves as more mobile, more misperceived, more dimensional beings, than science or our senses would have us believe.

Enthusiasm for the Humanities, though, is much diminished in today’s educational institutions. Our data-driven culture bears much of the blame: The arts can no longer compete with the prestige and financial payoffs promised by studying the STEM fields — a curriculum integrating science, technology, engineering and mathematics. These are all worthy disciplines that offer precise information on practically everything. But, often and inadvertently, they distort our perceptions; they even shortchange us.

The regime of information may well sport its specific truths, but it is locked out of the associations — subjective but also moral and philosophical — that bathe all literature. A new technology like GPS provides us with the most efficient and direct route to a destination, but it presupposes we know where we are going. Finding an address is one thing; finding one’s way in life is another. Even our smartest computers or most brilliant statisticians are at a loss when it comes to mapping our psychic landscapes.

When and how do you take your own measure? And what are you measuring? Both Oedipus and Lear could initially subscribe to Shakespeare’s notation, “every inch a king,” but by play’s end, something different, varied and terrifying has come to light: for one, an unknown history of parricide and incest, for the other, an opening into a moral vision of such force that it wrecks all prior frames, leading to madness, as Lear suffers his kinship with all “bare, fork’d animal[s].” Life’s actual hurdy-gurdy often explodes our labels and preconceptions.

“How much do you know about Shakespeare?” I once asked a friend who has committed much of her life to studying the Bard. She replied, “Not as much as he knows about me.” Remember this the next time someone tells you literature is useless.

Why does this matter? The humanities interrogate us. They challenge our sense of who we are, even of who our brothers and sisters might be. When President Obama said of Trayvon Martin, “this could have been my son,” he was uttering a truth that goes beyond compassion and reaches toward recognition. “It could have been me” is the threshold for the vistas that literature and art make available to us.

Art not only brings us news from the “interior,” but it points to future knowledge. A humanistic education is not about memorizing poems or knowing when X wrote Y, and what Z had to say about it. It is, instead, about the human record that is available to us in libraries and museums and theaters and, yes, online. But that record lives and breathes; it is not calculable or teachable via numbers or bullet points. Instead, it requires something that we never fail to do before buying clothes: Trying the garment on.

Art and literature are tried on. Reading a book, seeing a painting or a play or a film: Such encounters are fueled by affect as well as intelligence. Much “fleshing out” happens here: We invest the art with our own feelings, but the art comes to live inside us, adding to our own repertoire. Art obliges us to “first-personalize” the world. Our commerce with art makes us fellow travelers: to other cultures, other values, other selves. Some may think this both narcissistic and naïve, but ask yourself: What other means of propulsion can yield such encounters?

This humanistic model is sloppy. It has no bottom line. It is not geared for maximum productivity. It will not increase your arsenal of facts or data. But it rivals rockets when it comes to flight and the visions it enables. And it will help create denser and more generous lives, lives aware that others are not only other, but are real. In this regard, it adds depth and resonance to what I regard as the shadowy, impalpable world of numbers and data: empirical notations that have neither interest nor purchase in interiority, in values; notations that offer the heart no foothold.

The world of information is more Gothic than its believers believe, because it is ghostly, silhouette-like, deprived of human sentience. If we actually believe that the project of education is to enrich our students’ lives, then I submit that the Humanities are on the right side of the aisle, whatever paychecks they do or do not deliver.

At a time when the price of a degree from elite institutions is well over six figures, fields such as literature and the arts may seem like a luxury item. But we may have it backwards. They are, to cite Hemingway’s title for his Paris memoir, “a moveable feast,” and they offer us a kind of reach into time and space that we can find nowhere else.

We enter the bookstore, see the many volumes arrayed there, and think: so much to read, so little time. But books do not take time; they give time, they expand our resources of both heart and mind. It may sound paradoxical, but they are, in the last analysis, scientific, for they trace the far-flung route by which we come to understand our world and ourselves. They take our measure. And we are never through discovering who we are.

Arnold Weinstein is a professor of comparative literature at Brown University, and the author, most recently, of “Morning, Noon and Night: Finding the Meaning of Life’s Stages.”

http://www.nytimes.com/2016/02/24/opini ... 87722&_r=0
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Should All Research Papers Be Free?

DRAWING comparisons to Edward Snowden, a graduate student from Kazakhstan named Alexandra Elbakyan is believed to be hiding out in Russia after illegally leaking millions of documents. While she didn’t reveal state secrets, she took a stand for the public’s right to know by providing free online access to just about every scientific paper ever published, on topics ranging from acoustics to zymology.

Her protest against scholarly journals’ paywalls has earned her rock-star status among advocates for open access, and has shined a light on how scientific findings that could inform personal and public policy decisions on matters as consequential as health care, economics and the environment are often prohibitively expensive to read and impossible to aggregate and datamine.

“Realistically only scientists at really big, well-funded universities in the developed world have full access to published research,” said Michael Eisen, a professor of genetics, genomics and development at the University of California, Berkeley, and a longtime champion of open access. “The current system slows science by slowing communication of work, slows it by limiting the number of people who can access information and quashes the ability to do the kind of data analysis” that is possible when articles aren’t “sitting on various siloed databases.”

More...
http://www.nytimes.com/2016/03/13/opini ... d=45305309

*******
Some responses to the article above at:

Sharing Knowledge, for a Price

http://www.nytimes.com/2016/03/21/opini ... ef=opinion
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Where Computers Defeat Humans, and Where They Can’t

ALPHAGO, the artificial intelligence system built by the Google subsidiary DeepMind, has just defeated the human champion, Lee Se-dol, four games to one in a tournament of the strategy game Go. Why does this matter? After all, computers surpassed humans in chess in 1997, when IBM’s Deep Blue beat Garry Kasparov. So why is AlphaGo’s victory significant?

Like chess, Go is a hugely complex strategy game in which chance and luck play no role. Two players take turns placing white or black stones on a 19-by-19 grid; when stones are completely surrounded by those of the other color they are removed from the board, and the player who controls more of the board at the game’s end wins.

Unlike the case with chess, however, no human can explain how to play Go at the highest levels. The top players, it turns out, can’t fully access their own knowledge about how they’re able to perform so well. This self-ignorance is common to many human abilities, from driving a car in traffic to recognizing a face. This strange state of affairs was beautifully summarized by the philosopher and scientist Michael Polanyi, who said, “We know more than we can tell.” It’s a phenomenon that has come to be known as “Polanyi’s Paradox.”

Polanyi’s Paradox hasn’t prevented us from using computers to accomplish complicated tasks, like processing payrolls, optimizing flight schedules, routing telephone calls and calculating taxes. But as anyone who’s written a traditional computer program can tell you, automating these activities has required painstaking precision to explain exactly what the computer is supposed to do.

This approach to programming computers is severely limited; it can’t be used in the many domains, like Go, where we know more than we can tell, or other tasks like recognizing common objects in photos, translating between human languages and diagnosing diseases — all tasks where the rules-based approach to programming has failed badly over the years.

Deep Blue achieved its superhuman performance almost by sheer computing power: It was fed millions of examples of chess games so it could sift among the possibilities to determine the optimal move. The problem is that there are many more possible Go games than there are atoms in the universe, so even the fastest computers can’t simulate a meaningful fraction of them. To make matters worse, it’s usually far from clear which possible moves to even start exploring.
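As a back-of-the-envelope check on that scale claim (my own arithmetic, not the authors'): each of the 361 points on a 19-by-19 board can be empty, black or white, so the number of board positions alone is bounded by 3^361, and the number of possible games, which the article is referring to, is vastly larger still:

3^{361} \approx 1.7 \times 10^{172} \;\gg\; 10^{80} \;(\text{a common estimate of the number of atoms in the observable universe}).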

What changed? The AlphaGo victories vividly illustrate the power of a new approach in which instead of trying to program smart strategies into a computer, we instead build systems that can learn winning strategies almost entirely on their own, by seeing examples of successes and failures.

Since these systems don’t rely on human knowledge about the task at hand, they’re not limited by the fact that we know more than we can tell.

AlphaGo does use simulations and traditional search algorithms to help it decide on some moves, but its real breakthrough is its ability to overcome Polanyi’s Paradox. It did this by figuring out winning strategies for itself, both by example and from experience. The examples came from huge libraries of Go matches between top players amassed over the game’s 2,500-year history. To understand the strategies that led to victory in these games, the system made use of an approach known as deep learning, which has demonstrated remarkable abilities to tease out patterns and understand what’s important in large pools of information.

Learning in our brains is a process of forming and strengthening connections among neurons. Deep learning systems take an analogous approach, so much so that they used to be called “neural nets.” They set up billions of nodes and connections in software, use “training sets” of examples to strengthen connections among stimuli (a Go game in process) and responses (the next move), then expose the system to a new stimulus and see what its response is. AlphaGo also played millions of games against itself, using another technique called reinforcement learning to remember the moves and strategies that worked well.
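The stimulus-and-response training the authors describe can be made concrete with a toy sketch. The Python example below (assuming PyTorch as the library) is a minimal illustration of the training-set idea, not AlphaGo's actual architecture, data or scale: it trains a small network to map a flattened 19-by-19 board (the stimulus) to a score over the 361 points (the response, i.e. a suggested next move), using made-up (board, expert move) pairs in place of a real library of recorded games.

import torch
import torch.nn as nn

# Toy illustration of the "stimulus -> response" training the article
# describes (assuming PyTorch; this is not AlphaGo's real architecture).
# Stimulus: a flattened 19x19 board, each point -1 (white), 0 (empty), +1 (black).
# Response: a score over the 361 points, read as "where to move next".

BOARD_POINTS = 19 * 19

policy_net = nn.Sequential(
    nn.Linear(BOARD_POINTS, 256),
    nn.ReLU(),
    nn.Linear(256, BOARD_POINTS),   # one logit per board point
)

optimizer = torch.optim.Adam(policy_net.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# A fake "training set": random boards paired with random expert moves.
# In the real setting these pairs would come from recorded games.
boards = torch.randint(-1, 2, (512, BOARD_POINTS)).float()
expert_moves = torch.randint(0, BOARD_POINTS, (512,))

for epoch in range(10):
    optimizer.zero_grad()
    logits = policy_net(boards)              # expose the net to the stimuli
    loss = loss_fn(logits, expert_moves)     # compare responses to expert moves
    loss.backward()                          # strengthen the helpful connections
    optimizer.step()

# After training, ask for a response to a new stimulus (a new board).
new_board = torch.randint(-1, 2, (1, BOARD_POINTS)).float()
predicted_move = policy_net(new_board).argmax(dim=1).item()
print(f"suggested point: {predicted_move}")  # an index into the 361 points

Reinforcement learning from self-play, which the article turns to next, would replace the fixed expert labels above with rewards earned by the network's own games.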

Deep learning and reinforcement learning have both been around for a while, but until recently it was not at all clear how powerful they were, and how far they could be extended. In fact, it’s still not, but applications are improving at a gallop, with no end in sight. And the applications are broad, including speech recognition, credit card fraud detection, and radiology and pathology. Machines can now recognize faces and drive cars, two of the examples that Polanyi himself noted as areas where we know more than we can tell.

We still have a long way to go, but the implications are profound. As when James Watt introduced his steam engine 240 years ago, technology-fueled changes will ripple throughout our economy in the years ahead, but there is no guarantee that everyone will benefit equally. Understanding and addressing the societal challenges brought on by rapid technological progress remain tasks that no machine can do for us.


Andrew McAfee is a principal research scientist at M.I.T., where Erik Brynjolfsson is a professor of management. They are the co-founders of the M.I.T. Initiative on the Digital Economy and the authors of “The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies.”

http://www.nytimes.com/2016/03/16/opini ... d=71987722
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

"It is through the creation of such a new elite, inspired by and widely read in everything related to our heritage, that there must come about a revival in Muslim thought. The whole approach to education, without becoming archaic, should begin now to re-introduce, as widely as possible, the work and thought of our great Muslim writers and philosophers. Thus, from the nursery school to the university, the thoughts of the young will be inspired by our own heritage and not that of some foreign culture. Again, let there be no misunderstanding: I am not in any way opposed to the literature or the art or the thought of the West. I simply maintain that the Islamic heritage is just as great and that it is up to us to bring it to the forefront again. When our nursery school children first begin to read, why should they not let their imaginations build upon the prowess of the Great Khaled rather than Wellington or Napoleon? And if the student of philosophy seeks a degree, should he not be encouraged to read about even Al-Hallaj rather than Hegel or Kierkegaard?"( His Highness the Aga Khan speaking after receiving an Honorary Degree of Doctorate of Laws conferred by the University of Sindh, Jamshoro, during a special convocation.)

http://www.akdn.org/speech/his-highness ... sity-sindh


If Philosophy Won’t Diversify, Let’s Call It What It Really Is

The vast majority of philosophy departments in the United States offer courses only on philosophy derived from Europe and the English-speaking world. For example, of the 118 doctoral programs in philosophy in the United States and Canada, only 10 percent have a specialist in Chinese philosophy as part of their regular faculty. Most philosophy departments also offer no courses on Africana, Indian, Islamic, Jewish, Latin American, Native American or other non-European traditions. Indeed, of the top 50 philosophy doctoral programs in the English-speaking world, only 15 percent have any regular faculty members who teach any non-Western philosophy.

Given the importance of non-European traditions in both the history of world philosophy and in the contemporary world, and given the increasing numbers of students in our colleges and universities from non-European backgrounds, this is astonishing. No other humanities discipline demonstrates this systematic neglect of most of the civilizations in its domain. The present situation is hard to justify morally, politically, epistemically or as good educational and research training practice.

More...
http://www.nytimes.com/2016/05/11/opini ... ef=opinion

******
Should Philosophy Departments Change Their Names? Readers Join the Debate

Extract:

Many readers said they support a continued focus on the traditional philosophy curriculum in American universities.

“The western philosophical canon formed the foundation of our concepts of human rights, civil liberties, and government. As such, it deserves our focus and priority, especially for undergrads,” Steve Misuta wrote from the United States.

A professor who teaches Islamic philosophy said defenders of the traditional canon would benefit by expanding their studies.

“I suggest that these commentators who are so quick to dismiss the relevance of non-Western philosophy take a year or two off to read Chinese philosophy with its distinctive concerns and its logic and metaphysics grounded in a non-Indo-European language, Islamic logic with its concern for philosophy of law, or the rich metaphysical traditions of India,” wrote John Walbridge, a professor of Near East language and culture at Indiana University. “And see what’s happening in Africa.”

More...
http://takingnote.blogs.nytimes.com/201 ... 87722&_r=0
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Having Eyes for Beauty

Everything has beauty,
but not everyone sees it.
- Confucius

Beauty is simply reality seen with the eyes of love.
- Evelyn Underhill

Those who find beauty in all of nature
will find themselves at one with the secrets of life itself.
- L. Wolfe Gilbert

We live in a wonderful world that is
full of beauty, charm and adventure.
There is no end to the adventures that we can have
if only we seek them with our eyes open.
- Jawaharlal Nehru

Look closely... See with new eyes...
Don't just pass by what is familiar without a thought.
Pay attention and look closely.
There is beauty - there is discovery -
there is a whole new world hiding
beneath the face of the familiar.
- Jonathan Lockwood Huie
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Clinging to Our ‘Roots’

Not long ago I came across a commercial for the genealogical research website Ancestry.com, in which a man recounts how he’d always believed his roots to be German. He had proudly celebrated this heritage — for example, by wearing Lederhosen and joining a German dance group — until a DNA test revealed the hidden reality of his origins: 52 percent of his ancestors were from Scotland and Ireland, he learned; there were no Germans in his family tree. “So I traded in my Lederhosen for a kilt,” the man says. And just like that, he replaces one set of roots with another, swinging like Tarzan from one vine to the next.

The commercial is silly in its reduction of identity to a wardrobe choice, but it does reveal something about the sense of self we derive from our roots, which are both troubled and enriched by the new global technological reality of our time. We can know more about distant ancestors than ever before. But what purpose does this knowledge serve? Why is the root such a compelling metaphor for thinking about our connection to ancestors, homelands, and the earth itself?

Humans are context-seeking creatures, and this need to feel woven into the world takes many forms: research into family history; pride about one’s hometown, state, or country and the specificities of these places that have marked one’s character, behavior, and speech; nostalgia for a past when people appeared to have stable destinies, when gender roles, social hierarchies, and the “order of things” seemed clearer, and when inherited categories went uncontested; and the pastoral longing to restore a lost communion with the earth itself. Rootedness is the most flexible metaphor for talking about the contextualized human being. Often, it compels us to such an extent that we forget it is a metaphor.

The History Channel’s current remake of the mini-series “Roots,” based on Alex Haley’s 1976 novel and its first televised iteration in 1977, has to some extent returned the idea to the contemporary American consciousness. Here, the cultural heritage transmitted from generation to generation in the context of the American slave trade gives voice to the legacy of those torn from their homes and sent toward an irrevocable and tragic future. This is a distinctly American narrative, but the language of rootedness can be found in a remarkable range of contexts, across cultures and in all kinds of writing — including poetry, nature writing and ecocritical philosophy — to describe the human’s self-extraction from the earth. How could the same image be used in such varied settings?

The philosophical novelist and essayist Michel Tournier, who died in January, believed that nearly all human conflicts could be traced to the tensions between rootless and rooted peoples. He offers many examples in an essay called “Nomad and Sedentary” in his book “The Mirror of Ideas”: the fratricide in Genesis involving the sedentary farmer Cain’s murder of his nomadic brother Abel, a shepherd; the invention of barbed wire in America in the 1800s, which marked the sedentarization of pioneers and bloodshed over the rightful ownership of land; the conflicts between the nomadic Tuareg and the settled Saharan peoples; and the Nazis’ demonization of the Jews, imagined as rootless and thus unrighteous transients.

It is often pointed out that tracing our lineage far back enough would show that we all came from the same place. But this primordial root is usually disregarded. For some reason, each collective, whether it be a nation, ethnic group or tribe, adopts a distinct conception of its own roots that tends to ignore this most fundamental idea of human connectedness.

We’ve arrived at a strange juncture in history, one that puts two world systems at odds: the first, an older root system that privileged “vertical” hierarchy, tradition, and national sovereignty; and the second, the “horizontal” globalized latticework of cybernetic information transfer and economic connectivity.

Perhaps this is what the French philosophers Gilles Deleuze and Félix Guattari anticipated in the introduction to their 1980 book “A Thousand Plateaus” when they described the rhizome, a figure for systems that begin in medias res with no discernible beginning or end and that operate on a principle of horizontal, unpredictable proliferation. The fact that our current moment, with its proliferation of technological networks, is more rhizomatic, doesn’t mean that rootedness no longer appeals to people. On the contrary, perhaps now more than ever, people have legitimate reasons for feeling alienated from the world and from one another — the greater the level of alienation, the more precious roots become.

Today, people across the world face an array of uprooting and alienating forces: the Syrian refugee crisis, Islamist terrorism, immigration, the identity-dissolving tendency of the European Union, global competition, capitalist uniformization and immersive, digital loneliness. Not coincidentally, each of these has been answered in its own way with an appeal to rootedness or, rather, re-enrootment. One hears in Donald Trump and Marine Le Pen’s speeches the same longing for a rerooting of their national cultures. A celebration of roots is a central motivation for those who wish to keep Confederate symbols and emblems of a specific white heritage in the public space. Paradoxically, this longing can also be heard in calls for the recuperation of minority histories, of suppressed voices and of the dignity of history’s victims.

While patriotic nationalism is usually imagined as the polar opposite of diversity-focused multiculturalism, the proponents of each actually have very similar motivations and desires. Each group hopes to preserve or recuperate a sense of rootedness in something. Given the great confusion about how to celebrate one’s own roots without insulting someone else’s, this struggle will certainly continue in the coming decades.

The nation, of course, is still a meaningful unit. For centuries, people have died, and continue to die, for their nations. No one, on the other hand, will ever be willing to die for “global,” as a friend of mine wisely put it. In fact, globalism seems to challenge the very possibility of rootedness, at least the kind that once relied on nation-states for its symbolic power. How will people be rooted in the future if global networks replace nations? Through bloodlines? Ideologies? Shared cultural practices? Elective affinities? Will we become comfortable rooting ourselves in rootlessness?

These questions go hand in hand with uncertainties about the health of the planet. Our relationship with the earth is fraught with anxiety, if the sustained interest in the Anthropocene and other manifestations of humanity’s self-excision from the natural world is any indication.

Efforts to repair a broken circuit with the planet include the push for outdoor preschools, which seek to restore the umbilical connection between children and Mother Earth by delivering them back to their natural habitat. It is telling that Rudolf Steiner, the founder of Waldorf education, often made analogies between children and plants in his writings. In his view, plants model the cosmic embeddedness necessary for human happiness and thus offer children a living example of resistance to the uprooting forces that constitute modernity.

Throughout history, many philosophers — including, for example, Montesquieu, Rousseau, and Heidegger — have believed that the land and climate of a particular region impart certain characteristics to its inhabitants, whose temperaments, language, and cultural production are heavily influenced by the topographical, meteorological, and botanical features of the place. This bioregionalism resembles the French concept of terroir, a term used in agriculture and gastronomy to describe the relationship between flavor and place. But does the same hold true for humans?

A desire for roots and rootedness may be acquiring a new importance in the new global tangle, where certainties are hard to come by. But I wonder sometimes if this root-oriented thinking actually causes many of the problems whose solutions we can’t seem to find. Think of your own roots and how much of your identity relies on them. How many things that trouble or anger you relate in some way, if only peripherally, to this rootedness? If you were to suddenly discover that you were mistaken about your roots, would you trade in your Lederhosen for a kilt? How negotiable is your sense of self? How much do your roots determine your actions? What if you’d been born with someone else’s roots, say, those of your enemy?

Each person will have different answers to these questions. And yet there is something universal about rootedness as well. All people seek a context into which they may enfold themselves. If we truly are wired for connectedness, we’ve gotten our wish in a sense; our unprecedented system of networks has shrunk the globe and at least offered the possibility for new kinds of continuity and growth. But it remains to be seen how this connectivity will be reconciled with individual identities, with old brands of embeddedness, and with nostalgia for the first garden.

Christy Wampole is an assistant professor in the department of French and Italian at Princeton, and the author of “Rootedness: The Ramifications of a Metaphor” and “The Other Serious: Essays for the New American Generation.”

http://www.nytimes.com/2016/05/30/opini ... d=71987722

******
There is a video on a related issue at:

momondo – The DNA Journey
https://www.youtube.com/watch?v=tyaEQEmt5ls
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

This article is an insightful reflection about the role of history at present...

No, He’s Not Hitler. And Yet ...

History may not repeat itself, but it does yield warnings.

Extract:

"History seems to present us with a choice between two undesirable options: If it is just one singular thing after another, then we can derive no general laws or regularities from it, and so we would seem to have no hope of learning from it; but when we do try to draw lessons from it, we lapse all too easily into such a simplified version of the past, with a handful of stock types and paradigm events, that we may as well just have made it up. History seems to be a pointless parade of insignificant events until we shape it into something that has significance for us, until we build myths out of it, until we begin using it to make up stories.

This is what makes it so easy and tempting to weaponize history, to forgo any interest in “how it actually was” — to use the 19th-century historian Leopold von Ranke’s definition of the true goal of the study of history — and to bend it toward our own present ends."

More...
http://www.nytimes.com/2016/06/05/opini ... d=71987722
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

The Wrong Questions

When you keep getting the wrong answers,
try asking better questions.
- Jonathan Lockwood Huie

There are no right answers to wrong questions.
- Ursula K. Le Guin

When we have arrived at the question,
the answer is already near.
- Ralph Waldo Emerson

Well, you ask a silly question, and you get a silly answer.
- Tom Lehrer

We thought that we had the answers,
it was the questions we had wrong.
- Bono

Often, we get stuck on a question
for which we can't find an acceptable answer.
That is a good time to consider asking a different question.
Every question is based on some assumptions -
usually invisible assumptions that we don't see, unless we go looking.
- Jonathan Lockwood Huie

*******
6 in 10 of you will share this link without reading it, a new, depressing study says

On June 4, the satirical news site the Science Post published a block of “lorem ipsum” text under a frightening headline: “Study: 70% of Facebook users only read the headline of science stories before commenting.”

Nearly 46,000 people shared the post, some of them quite earnestly — an inadvertent example, perhaps, of life imitating comedy.

Now, as if it needed further proof, the satirical headline’s been validated once again: According to a new study by computer scientists at Columbia University and the French National Institute, 59 percent of links shared on social media have never actually been clicked: In other words, most people appear to retweet news without ever reading it.

[The fastest-growing ‘news’ site on the Web is an obscure meme farm for moms]

Worse, the study finds that these sorts of blind peer-to-peer shares are really important in determining what news gets circulated and what just fades off the public radar. So your thoughtless retweets, and those of your friends, are actually shaping our shared political and cultural agendas.

“People are more willing to share an article than read it,” study co-author Arnaud Legout said in a statement. “This is typical of modern information consumption. People form an opinion based on a summary, or a summary of summaries, without making the effort to go deeper.”

More....
https://www.washingtonpost.com/news/the ... ing-study/
kmaherali
Posts: 25106
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

The Nature of Learning

We Learn...
10% of what we read,
20% of what we hear,
30% of what we see,
50% of what we see and hear,
70% of what we discuss,
80% of what we experience,
95% of what we teach others.
- William Glasser

Learning is the beginning of wealth.
Learning is the beginning of health.
Learning is the beginning of spirituality.
Searching and learning is where the miracle process all begins.
- Jim Rohn

Education is not the answer to the question.
Education is the means to the answer to all questions.
- William Allin

True learning is not about facts,
but about conscious appreciation of the experience of living.
- Jonathan Lockwood Huie
Post Reply