Friday, January 30
Stay dirty
“Children should be allowed to go barefoot in the dirt, play in the dirt, and not have to wash their hands when they come in to eat,” [Dr. Joel V. Weinstock, the director of gastroenterology and hepatology at Tufts Medical Center in Boston] said. He and [Dr. David Elliott, a gastroenterologist and immunologist at the University of Iowa] pointed out that children who grow up on farms and are frequently exposed to worms and other organisms from farm animals are much less likely to develop allergies and autoimmune diseases.
The pursuit of taste encourages competition and conformity
In modern times, there is nothing which more exactly defines social differences than personal taste, whether in food or music or wallpaper or the choice of children’s names. The choices that people make in these areas of life may seem spontaneous and genuine, but, without any apparent pressure or coercion, they usually conform to class lines. The possessions which we place in our living spaces and the way we decorate those spaces instantly reveal our sensibilities, our preoccupations, and our social milieux. That is why they will evoke the admiration of some observers and the disdain of others. This state of affairs was already in evidence in the early modern period.
Great article, with the exception of the emphasized sentence.
...
Possessions were symbols of refinement and politeness. They helped to define individual identity. They even shaped their owners’ physical deportment and behaviour, for knives and forks, cups and teapots, fragile porcelain and increasingly delicate furniture imposed a distinctively mannered way of eating, drinking, moving and sitting. In this way the consumption of goods created social differences as well as expressing them.
The competition...shifted away from the conspicuous display of opulence to a more restrained demonstration of elegance, refinement and fastidious discrimination. ...The purchasing power of the middling and lower classes might rise, but the elite could hold on to its monopoly of cultural capital by asserting that wealth was not enough.
Meanwhile, the consumption of new goods was unleashing a torrent of contemporary criticism. From the 16th century onwards, there were denunciations of ‘immoderate purchasings’, ‘unlawful spending and consuming’, and what one Protestant divine called ‘the inordinate and unsatiable desire of having’. Moralists pointed to the waste of resources which could have been better employed in relieving the poor; to the adverse consequences for the balance of trade of the import of foreign commodities; and to the ruinous effects of self-indulgence upon an individual’s health and finances.
They attacked ‘wasters’ and ‘spenders’ and were contemptuous of what they called ‘superfluities’, ‘needless toys’, ‘vain trifles’, ‘fantasies’, ‘new fangles’ and ‘trumpery trash’. Adam Smith was in this tradition when he said that it was the desire of great lords for ‘frivolous and useless’ objects which brought down the feudal system: they bartered their authority ‘for the gratification of the most childish, the meanest and the most sordid of all vanities’. Luxuries were ‘contemptible and trifling’, ‘trinkets and baubles, fitter to be the play-things of children than the serious pursuits of men’.
Distaste for frivolities was reinforced by the classical notion that luxury weakened the state, undermined civic virtue and led eventually to despotism. The ‘civic humanist’ belief was that comfortable living had an enervating effect, sapping the martial spirit and reducing military effectiveness; the very word ‘luxury’ had connotations of excessive fleshly indulgence. Republican virtue required frugality, whereas abundance produced ‘effeminacy’; and the lure of private comforts distracted citizens from a commitment to public service.
Throughout the early modern period, commentators repeatedly harked back to ‘the plainness and hardness of the people of old’, contrasting the military vigour of ‘our plain forefathers’ with the soft and luxurious habits of their descendants. The exact location of this age of primitive simplicity was variously put at any point between the Old Testament patriarchs and the reign of Queen Anne, but its virile austerity remained constant. It was asserted that effeminacy was generated by the new objects of expenditure, from coaches, which robbed men of their riding skills, to tea, which emasculated them as they sat sipping in female company. This objection to ‘foreign, effeminate and superfluous commodities’ was not just a classical theme: it reflected the central role of women in the purchase of objects for domestic consumption and the association of the new goods with domesticity and an unmilitary style of life.
...
In fact, of course, it was not only the female sex who engaged in this competitive shopping. Well-to-do men spent heavily on horses, carriages, clothes, paintings, watches, plate, wigs, books and other luxury objects without incurring the same odium. Besides, much female expenditure was of a vicarious kind, designed to bolster the husband’s position: as a Jacobean moralist remarked, ‘there cometh credit and praise to the man by the comely apparel of his wife’.
Nevertheless, women spent more time in shops because they were usually responsible for provisioning the household; and middle-class housewives were coming to think of houses as places in which goods were displayed. The wives of better-off farmers were notoriously house-proud, keeping their floors spotless, and polishing and scouring their pewter dishes until they shone. More goods meant more cooking, more washing up and more housework. In 18th-century London, even modest rented rooms had copper kettles, walnut-framed looking-glasses, curtains, and white cotton counterpanes.
...In the absence of probate inventories for the poor, it is hard to generalise, but it is clear that furnishings were sparse and over half of the domestic budget went on food. The emphasis was more on immediate consumption than on enduring possessions; though, by the later 18th century, cotton and linen, pewter, pottery, tea sets, and decorative household items would reach even labourers’ cottages. Carpets and curtains arrived rather later. ‘Best’ clothes had always been important.
An early 18th-century observer thought that ‘the poorest labourer’s wife in the parish’ would half-starve herself and her husband in order to buy a second-hand gown and petticoat, rather than ‘a strong wholesome frieze’, because it was more ‘genteel’. Even when undernourished and poorly housed, the lower classes were prepared to devote some of their limited resources to goods which boosted their self-esteem and helped them to create social relationships with others.
For the consumption of goods was, as it remains, as much conformist in spirit as competitive. Most people bought commodities out of a desire to keep in line with the accepted standards of their own peer group rather than to emulate those of the one above: similarity in living styles was an important source of social cohesion; and anxiety to do the right thing was more common than the urge to stand out. As Josiah Wedgwood remarked: ‘Few ladies, you know, dare venture at anything out of the common style ’till authoris’d by their betters – by the ladies of superior spirit who set the ton.’
...as Lord Halifax sardonically observed, ‘we call all ‘‘necessary’’ that we have a mind to’. So-called ‘artificial’ wants soon came to seem natural. Adam Smith would rule that necessities were not just those goods which were indispensable for the support of life, but ‘whatever the custom of the country renders it indecent for creditable people, even of the lowest order, to be without’. By relating the dividing line to prevailing conceptions of decency, he accepted that it was a shifting boundary.
For standards constantly changed; and, as Mandeville had observed, one could not tell what words like ‘decency’ and ‘conveniency’ meant until one knew the quality of the person using them; what the poor saw as intolerable luxury was regarded by the gentry merely as ‘decency’. In this way the values of civility, respectability, refinement and politeness were invoked to legitimise the unceasing acquisition of goods.
...
What we see during the 17th and 18th centuries is the gradual emergence of a new ideology, accepting the pursuit of consumer goods as a valid object of human endeavour and recognising that no limit could, or should, be put to it. Consumption was justified in terms of the opportunities it brought for human fulfilment. The growth of a consumer market, unrestricted by the requirements of social hierarchy, offered increasing possibilities for comfort, enjoyment and self-realisation. Poverty was no longer to be regarded as a holy state; and there was no need to feel guilt about envying the rich; one should try to emulate them. Or so the advocates of laissez-faire commerce would argue. Goods were prized, for themselves, for the esteem they brought with them, for the social relationships they made possible. To interfere with the process of acquisition by sumptuary laws was what Adam Smith would call ‘the highest impertinence and presumption’; it threatened liberty and personal happiness. The labourer had the right ‘to spend his own money himself and lay out the produce of his labour his own way’. The sovereignty of consumer choice triumphed over the notion that consumption should be regulated to fit social status; and the distribution of goods was left to the working of the market. No one yet foresaw that monopolistic capitalism might one day do as much to restrict choice as to enlarge it.
...
...independent spirits asserted their individuality by perversely not consuming, rather like a modern middle-class family extolling the virtues of ‘good plain cooking’ or boasting that it does not possess a television set. Social competition can take many different forms.
Not all drivers take responsibility for their actions
...encourages driver anarchy by removing traffic lights, street markings, and boundaries between the street and sidewalk. Studies conducted in northern Europe, where shared streets are common, point to improved safety and traffic flow.
This wouldn't work in China or most places in Taiwan, where drivers regularly ignore traffic lights, street markings, and boundaries between the street and sidewalk.
The idea is that the absence of traffic regulation forces drivers to take more responsibility for their actions. “The more uncomfortable the driver feels, the more he is forced to make eye contact on the street with pedestrians, other drivers and to intuitively go slower,” explains Chris Conway, a city engineer with Montgomery, Ala. Last April the city converted a signalized downtown intersection into a European-style cobblestone plaza shared by cars, bikes and pedestrians—one of a handful of such projects that are springing up around the country.
Generalizing this to the libertarian love of freedom, it suggests to me that for freedom to work, people need to be educated into believing in the rights of others. So will I have to concede that we need a minimum amount of moral brainwashing? Ugh.
Tuesday, January 27
Rationality
...would be far more effective if they took into account not only mental "brightness" but also rationality — including such abilities as "judicious decision making, efficient behavioral regulation, sensible goal prioritization ... [and] the proper calibration of evidence."
But there's such a thing as too much rationality.
Friday, January 9
Assume that a tiger is there
Yeah, I'm not magical.
Magical thinking can be plotted on a spectrum, with skeptics at one end and schizophrenics at the other. People who endorse magical ideation, ranging from the innocuous (occasional fear of stepping on sidewalk cracks) to the outlandish (TV broadcasters know when you're watching), are more likely to have psychosis or develop it later in their lives. People who suffer from obsessive-compulsive disorder also exhibit elevated levels of paranoia, perceptual disturbances, and magical thinking, particularly "thought-action fusion," the belief that your negative thoughts can cause harm. These people are compelled to carry out repetitive tasks to counteract their intrusive thoughts about unlocked doors or loved ones getting cancer. But more magical thinking does not necessarily mean more emotional problems—what counts is whether such thinking interferes with everyday functioning.
You wouldn't want to be at the skeptic end of the spectrum anyway. "To be totally 'unmagical' is very unhealthy," says Peter Brugger, head of neuropsychology at University Hospital Zurich. He has data, for example, strongly linking lack of magical ideation to anhedonia, the inability to experience pleasure. "Students who are 'not magical' don't typically enjoy going to parties and so on," he says. He's also found that there's a key chemical involved in magical thinking. Dopamine, a neurotransmitter that the brain uses to tag experiences as meaningful, floods the brains of schizophrenics, who see significance in everything, but merely trickles in many depressives, who struggle to find value in everyday life. In one experiment, paranormal believers (who are high in dopamine) were more prone than nonbelievers to spot nonexistent faces when looking at jumbled images and also were less likely to miss the faces when they really were there. Everyone spotted more faces when given dopamine-boosting drugs. Brugger argues that the ability to see patterns and make loose associations enhances creativity and also serves a practical function: "If you're on the grassland, it's always better to assume that a tiger is there."