Tag Archives: best

The posts that aren’t all that bad…

Are women more honest than men?

The journal Science recently published a fascinating article by Alain Cohn et al. that looked at cultural proclivities for civic honesty around the globe. They employed a rather ingenious method: they “lost” wallets all over the world and recorded whether the recipient of each lost wallet attempted to return it to its rightful owner. The wallets were fake and included a false ID for a person who appeared to be local to the country in which the wallet was lost, with fake contact info that actually belonged to the researchers. The ingenious element of the research was that instead of leaving the wallets out in the open, the research assistants pretended to have found them in nearby local businesses and turned them in to somebody working at each business, enabling the researchers to record interesting ancillary data on the “subject,” such as their age, whether they had a computer on their desk, and whether they appeared to be local to the country. Clearly, the researchers were hoping to engage in a little bit of data mining to ensure their not insignificant efforts returned some publishable results regardless of the main outcome.

As it turns out, they needn’t have been concerned. The level of civic honesty, as measured by wallet return rates, varied significantly between cultures. In addition, there was an interesting effect whereby the likelihood of a wallet being returned increased if there was more money in it, an effect that persisted across regions and that was evidently not predicted by most economists. I encourage you to read the original article, which is fascinating. On the top end of the civic honesty scale are the Scandinavian and Northern European countries, with rates at around 75%. On the bottom end of the curve is China, at about 14%. In the case of China, all the study did was confirm what anybody who does business there knows, and something that has been well covered by journalists and completely ignored by our politicians: to the Chinese, not cheating is a sign you’re not trying hard enough.

Here’s where things get interesting: in keeping with modern scientific publishing standards, the researchers made their entire dataset available in an online data repository so that others could reproduce their work. There are a lot of interesting conclusions one can make beyond what the authors were willing to point out in their paper, perhaps due to the political implications and the difficulty of doing a proper accounting for all the possible biases. However, unburdened by the constraints of an academic career in the social sciences, I was more than happy to dig into the data to see what it could turn up…

Perhaps the most interesting thing I found is that women appear to be more honest than men. Over the entire worldwide dataset, women returned the wallets about 51% of the time, versus 42% for men. It is tempting to look at individual countries, but the male-versus-female difference does not reach statistical significance at the individual-country level, so I chose to only look at the aggregate data. The data is not weighted by country population, so one should take the absolute magnitude of the difference with a bit of skepticism. However, looking at the individual country data, it appears a proper accounting for population bias would likely maintain or increase the difference. (Some of the most populous countries had the largest differences between women and men.)

Worldwide, women appear to be statistically significantly more honest than men. Standard error was less than 1% for both cases.

Here is the full dataset of men versus women broken down by country. You can see that the most populous countries are those where women appear to be more honest than men, so fixing the chart above to account for sample bias would likely still find a significant difference.

Women appear to be more honest than men in most cultures, though the individual country results are not usually statistically significant.

Another interesting question to ask of the data is whether there is a generational difference in honesty. Surprisingly, there turns out to be no statistically significant difference:

Age doesn’t appear to be a statistically significant predictor of honesty. Standard error was roughly 1%, so the difference shown is not meaningful.

Looking at the breakdown by country, we see that there are no big differences between the generations, with one exception that I’m not even going to try to explain:

It’s possible that the young are more honest than the old, but it doesn’t appear to be statistically significant except in one country.

One interesting set of issues that always comes up with population studies like this is what, if anything, should we do with this information? It is true that a Swedish woman is about eight times more civically honest, on average, than a Chinese man. That’s interesting, but also pretty dangerous information. Should this inform our immigration policy, where population statistics might actually be valid? Is it better to not even ask these questions given the abuse of the information that might result? Or, is it good to have this information, especially when it flies in the face of our image of ourselves and others? I suspect in the case of the US, most would be surprised to find out that the average US citizen is as honest as the average Russian. We may be surprised by both halves of that statement, and both might be good to think about.

Getting the most data speed out of your cell phone

You may have noticed there have been very few posts here. There’s a reason for that. First and foremost, sending my rants into the void has not been as personally cathartic as I’d hoped. My other goal for the blog, which actually has been somewhat successful, was simply to provide a vehicle for putting information out on the web that I thought might be useful for people, and that I couldn’t find elsewhere. Based on the traffic stats, those posts have actually been worthwhile, and my only reason for not doing more of this kind of post is that I’ve been too busy playing with my son, finishing up my projects at MIT, and trying to get a job (in that order).

So, going forward, I’m just going to focus on the second category of posts (though I reserve the right to devolve to the first occasionally). This blog was getting too negative, anyway. In that spirit, here’s a particularly useful trick I just figured out while sitting in a coffee shop working remotely.

I recently gave up my nice window office since I was feeling guilty about taking up a nice spot but only working part time. So, I’ve been doing a lot of work remotely, usually from a coffee shop given that working at home just isn’t very productive when there’s an adorable toddler running around begging to be hugged. So, I splurged and decided to start paying the extra $20 a month to use my phone as an internet connection for my computer. This is becoming a pretty common thing, and Sprint even offers phones that will create a WiFi network on the fly (I use Bluetooth with my iPhone). I expect this will become even more common once the iPhone hits Verizon, as Apple will reportedly allow this version of their phone to create WiFi hotspots, too.

I would typically just leave my phone lying flat on the table next to my laptop. However, after giving it a minute of thought, I realized this is actually pretty dumb, for two reasons. First, having the phone so close to the laptop is probably not smart, as computers are notorious spewers of electromagnetic interference at pretty much every frequency imaginable. In theory, they should be shielded, but nothing is perfect, and between the memory data rates and the processor clock speeds, a computer pretty much has the cell phone spectrum covered directly, if not with overtones. So, keep the cell phone away from the computer, at least a foot or so.

Most importantly, however, leaving the cell phone flat on a table is a bad idea because it puts the antenna horizontal, whereas cell phone signals are polarized vertically. If you’re not a fan of electromagnetics, what this means is that the electrons in the cell tower antenna are being shaken up and down, not side-to-side. Radio waves are really just a way of keeping track of how electrons interact with each other: without anything interfering, the electrons in your cell phone’s antenna will be wiggled at the same orientation and frequency as those in the cell tower antenna. However, an antenna is designed for its electrons to be wiggled in a particular direction (almost always along the long axis of the antenna), and a cell phone’s antenna is oriented with the assumption that the user is holding the phone upright against their ear. Once I realized this, I propped my phone up against a nearby wall so that it was standing straight up and down (as if somebody were holding it), and my data rates nearly doubled.

So, if you’re using your cell phone as an internet connection, keep it a bit away from the computer and prop it up so it’s vertical. Keeping it vertical in your pocket probably isn’t a great idea, since your body is pretty good at blocking radio. If you find this helps, please let me know in the comments. Right now my experience alone isn’t very statistically significant, to say the least.

Accelerating code using GCC’s prefetch extension

I recently started playing with GCC’s prefetch builtin, which allows the programmer to explicitly tell the processor to load given memory locations into cache. You can optionally inform the compiler of the locality of the data (i.e., how much priority the CPU should give to keeping that piece of data around for later use) as well as whether or not the memory location will be written to. Remarkably, the extension is very straightforward to use (if not to use correctly) and simply requires calling the __builtin_prefetch function with a pointer to the memory location to be loaded.
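For reference, the builtin takes up to three arguments: the address, an optional read/write flag (0 for read, 1 for write), and an optional locality hint from 0 (no temporal locality) to 3 (keep in all cache levels); the last two must be compile-time constants. Here’s a trivial illustration of the arguments only; the loop and the look-ahead distance of 8 are made up for demonstration, and a simple forward loop like this rarely benefits anyway, since the hardware prefetcher already detects the access pattern:

```c
/* Illustrates __builtin_prefetch's argument meanings. The bounds
 * check keeps the prefetched addresses inside the arrays. */
void scale(double *dst, const double *src, double k, int n)
{
    for (int i = 0; i < n; i++) {
        if (i + 8 < n) {
            __builtin_prefetch(dst + i + 8, 1, 3); /* will write, high locality */
            __builtin_prefetch(src + i + 8, 0, 3); /* read only */
        }
        dst[i] = k * src[i];
    }
}
```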

It turns out that in certain situations, tremendous speed-ups of several factors can be obtained with this facility. In fact, I’m amazed that I haven’t read more about this. In particular, when memory is being loaded “out of sequence” in a memory bandwidth-constrained loop, you can often benefit a great deal from explicit prefetch instructions. For example, I am currently working on a program which has two inner loops in sequence. First, an array is traversed one way, and then it is traversed in reverse. The details of why this is done aren’t important (it’s an optical transfer matrix computation, if you’re interested), but the salient aspect of the code is that the computation at each iteration is not that great, and so memory bandwidth is the main issue. Here is the relevant section of code where the arrays are accessed in reverse:

/*
 * Step backward through structure, calculating reverse matrices.
 */
for (dx = n-1; dx > 0; dx--)
{
    Trev1[dx] = Trev1[dx+1]*Tlay1[dx] + Trev2[dx+1]*conj(Tlay2[dx]);
    Trev2[dx] = Trev1[dx+1]*Tlay2[dx] + Trev2[dx+1]*conj(Tlay1[dx]);
    dTrev1[dx] = dTrev1[dx+1]*Tlay1[dx] + dTrev2[dx+1]*conj(Tlay2[dx]) +
                 Trev1[dx+1]*dTlay1[dx] + Trev2[dx+1]*conj(dTlay2[dx]);
    dTrev2[dx] = dTrev1[dx+1]*Tlay2[dx] + dTrev2[dx+1]*conj(Tlay1[dx]) +
                 Trev1[dx+1]*dTlay2[dx] + Trev2[dx+1]*conj(dTlay1[dx]);
}

Despite the forward and reverse loops having exactly the same number of operations, it turns out that the vast majority of the time was being spent in this second (reverse) loop!

Why? Well, I can’t be entirely certain, but I assume that when memory is accessed, the chip loads not just the single double-precision value being requested, but the entire cache line containing that address. Thus, when you’re iterating forward in address space, the data for the next couple of iterations has always been loaded into L1 cache ahead of time. However, in the reverse loop, the chip isn’t smart enough to notice that I’m going backwards (nor should it be), and so it has to wait for the data to come from either L2 or main memory every single iteration. By adding a few simple prefetch statements to the second loop, however, the time spent in this section of code went way down. Here is the new code for the second loop:

/*
 * Step backward through structure, calculating reverse matrices.
 */
for (dx = n-1; dx > 0; dx--)
{
    Trev1[dx] = Trev1[dx+1]*Tlay1[dx] + Trev2[dx+1]*conj(Tlay2[dx]);
    Trev2[dx] = Trev1[dx+1]*Tlay2[dx] + Trev2[dx+1]*conj(Tlay1[dx]);
    __builtin_prefetch(Trev1+dx-1, 1);
    __builtin_prefetch(Trev2+dx-1, 1);
    __builtin_prefetch(Tlay1+dx-1);
    __builtin_prefetch(Tlay2+dx-1);
    dTrev1[dx] = dTrev1[dx+1]*Tlay1[dx] + dTrev2[dx+1]*conj(Tlay2[dx]) +
                 Trev1[dx+1]*dTlay1[dx] + Trev2[dx+1]*conj(dTlay2[dx]);
    dTrev2[dx] = dTrev1[dx+1]*Tlay2[dx] + dTrev2[dx+1]*conj(Tlay1[dx]) +
                 Trev1[dx+1]*dTlay2[dx] + Trev2[dx+1]*conj(dTlay1[dx]);
}

The prefetch instructions tell the processor to request the next loop’s data, so that the data is making its way through the memory bus while the current computation is being done in parallel. In this case, this section of code ran over three times as fast with the prefetch instructions! About the easiest optimization you’ll ever make. (The second argument given to the prefetch instruction indicates that the memory in question will be written to.)

When playing around with prefetch, you just have to experiment with how much to fetch and how far in advance you need to issue the fetch. Too far in advance and you increase overhead and run the risk of having the data drop out of cache before you need it (L1 cache is very small). Too late and the data won’t have arrived on the bus by the time you need it.
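One way to run that experiment is to parameterize the prefetch distance and time the loop at several values. Here is a minimal sketch of the idea, using a hypothetical backward-summation loop rather than the transfer-matrix code above:

```c
#include <stddef.h>

/* How many iterations ahead to prefetch; rebuild with different
 * values and time the loop to find the sweet spot for your memory
 * subsystem. */
#define PREFETCH_DISTANCE 4

/* Hypothetical backward traversal: the access pattern the hardware
 * prefetcher tends to handle poorly. */
double sum_reverse(const double *a, size_t n)
{
    double sum = 0.0;
    for (size_t i = n; i-- > 0; ) {
        if (i >= PREFETCH_DISTANCE)
            __builtin_prefetch(a + i - PREFETCH_DISTANCE, 0);
        sum += a[i];
    }
    return sum;
}
```

Wrapping the loop in a timer and sweeping PREFETCH_DISTANCE from 1 to, say, 16 usually makes the sweet spot obvious; too small and the data hasn’t arrived, too large and it may be evicted before use.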

Why did I not prefetch the dTrev1 and dTrev2 memory locations? Well, I tried and it didn’t help. I really have no idea why. Maybe I exceeded the memory bandwidth, and so there was no point in loading it in. I then tried loading it in even earlier (two loops ahead) and that didn’t help. Perhaps in that case the cache got overloaded. Who knows? Cache optimization is a black art. But when it works, the payoff can be significant. It’s a technique that is worth exploring whenever you are accessing memory in a loop, especially out of order.

Zen and the Art of Linux Maintenance

As I sat watching the Ubuntu upgrade work its way through the packages, at some point the computer became unresponsive to mouse clicks. I ended up having to do a hot shutdown in the middle. As you might imagine, this completely and utterly hosed my Linux partition.

You might wonder why I keep banging my head against the wall of Linux, despite my rantings about it. So did I. As I sat staring at the kernel panic message, however, I realized something:

As much as I complain, part of me enjoys putting up with this stupid operating system, even though it long ago exhausted its utility, costing me so much of my time that it was no longer worth any amount of avoided software cost.

As an engineer, I like to tinker and fix things, and Linux gave me the opportunity (or rather, forced me) to delve into the workings of the OS in order to manage it. Linux provided me with the illusion of feeling useful and productive on a regular basis, as it required me to put my knowledge to work fixing a never-ending litany of problems.

But as I sat looking at a hosed partition, I had the embarrassed, hollow feeling that I’d really wasted an extraordinary amount of time focused on my computer as an object of inherent interest, as opposed to an expedient for actual useful work. My Linux machine had become a reflexive endeavor, largely existing for its own purpose, like a little bonsai garden that I tended with wearying patience.

And now what do I have for it? I have some profoundly uninteresting knowledge of the particulars of one operating system, and a munged disk that’s about as practically useful as a bonsai tree. (Yes, my actual work is backed up, but it’s never trivial getting everything exactly the way you had it with a new system install, no matter how much you backed up.)

This was all good, though, because it ripped from my hands something I didn’t have the good sense to throw away. Rather than huddle down with an install CD and try to fix my little Linux partition, I just let it go and started to get back to work, actual work in the outside world, using Windows.*

It feels good. I’m done with operating systems as a hobby, tired of indulging technology for its own sake. One must not get too attached to things.

*I’m not trying to insult OS X, which I think is probably better than Windows. I just don’t have a Mac at work. (I can only fight one holy war at a time.)

How to make a far left progressive media statement

In the interest of giving fair time to all opinions, I’ve decided to step aside and table my regularly scheduled rabid wall-punching diatribe. Instead, today’s post has been guest written by a member of the Green Party in Cambridge, on the topic of how to give a proper media statement.

How to make a left-wing progressive media statement

by Sheila Baldwin-Cooper-Oscar-Meyer

Are you planning to attend a protest against a G7 convention? Going to picket outside of an oil company? Just planning to throw a brick through some deserving corporate window? If there’s any chance that you might be interviewed by a reporter, especially on camera, you should brush up on the following official advice for progressive media statements.

  1. Make sure your voice goes up—preferably a dissonant interval like a half-tone or a diminished fifth (“The Maria”)—at the end of every sentence. Otherwise, you’ll sound offensively declarative and patriarchal. Kind of like a Republican.
  2. Shrill monotone nasal intonation! I can’t emphasize this enough. A low, calm voice does NOT get the message across. You want to aim for something between a child’s whine and a cat being ingested in a jet engine. You know who have creepy-low, calm voices? Republicans.
  3. Use the word “shocked” or “outraged” at least five times. Per sentence. If you’re not shocked, you’re probably a Republican.
  4. Use the phrase “the current administration” in a smugly mocking tone in every other sentence. Republicans!!!

Despite this advice, you may find yourself flustered in the heat of the moment. The best of us do (especially with all the great weed that one tends to find at a protest). If all else fails, chant something that rhymes. It will be hard, so fortunately the research and development wing of the progressive movement has discovered that “ho” and “go” rhyme, even if–and this is crucial–you put other words in between them. An example: “Hey hey, ho ho, lateral extraction drilling has got to go.” Does it mean anything? No. But did you actually learn anything about economics or environmental science while you were majoring in gender studies at Brown? Exactly. Stick to the playbook; it’s time tested by a generation who managed to dismantle an entire culture while higher than a roadie at an Allman Brothers concert.

And just remember: when all else fails, call somebody a “fascist”.

The Great Hudson Arc: A 250-mile-wide mystery

Annotated satellite photo of Hudson Bay arc.
Great Arc of Hudson Bay. (Click for a larger view.)

It’s nice to find out that there are still mysteries left in this world, let alone ones that are visible from space. On the southeast corner of Hudson Bay, the coastline traces a near-perfect arc, roughly concentric with another ring of islands in the bay. So, what caused it? The obvious answer, proposed in the 1950s, is that it’s the remnant of a large impact crater. Apparently, however, there is none of the usual geologic evidence for this, and over the past 50 years there has been debate over its origin. From other sites I’ve read, many geologists seem to have concluded that it is a depression caused by glacial loading during the ice age, though a recent conference paper (2006) argues that it may indeed be a crater. The current thinking is summarized nicely on this web page:

There is fairly extensive information on this in Meteorite Craters by Kathleen Mark, University Press, isbn 0-8165-1568-9 (paperback). The feature is known as the Nastapoka Arc, and has been compared to Mare Crisium on the Moon. There is “missing evidence,” which suggests that it isn’t an impact structure, however: “Negative results were . . . reached by R. S. Dietz and J. P. Barringer in 1973 in a search for evidence of impact in the region of the Hudson Bay arc. They found no shatter cones, no suevite or unusual melt rocks, no radial faults or fractures, and no metamorphic effects. They pointed out that these negative results did not disprove an impact origin for the arc, but they felt that such an origin appeared unlikely.” (p. 228)

I know next to nothing about geology, but in the spirit of the rank amateur naturalists who came before me, I won’t let that stop me from forming an opinion. In physics, whenever you see something that is symmetric about a point, you have to wonder what is so special about the center of that circle; could it really be chance that roughly 800 miles of coastline all aim at the same point? If not, what defined that point? One explanation for how large circular formations are created is that they start as small, point-like features that get expanded over eons by erosion. In other words, the original sinkhole that started to erode is what defines the center of the improbable circle. There are also lots of physical phenomena that make circles, such as deposition and flow of viscous materials from a starting point, assuming isotropic (spatially uniform) physical conditions everywhere. However, the planet is not isotropic. In fact, you can see plenty of arc-like features on coastlines and basins in satellite photos, and I can’t find a single one that is even close to as geometrically perfect as the Hudson Bay arc. If you overlay a perfect circle on Hudson Bay, as I’ve done in the picture, you can see just how closely the coastline follows it. How would erosion, or a glacial depression, manage to yield such perfect geometry over such a large scale? Is it possible for the earth to be that homogeneous over such a large distance, and over the geologic span of time required to create it? To my untrained eye, at least, it screams single localized event.

If so, it would’ve been a major event, on par (at least based on size) with the impact site that is credited with putting a cap on the Cretaceous Period and offing the dinosaurs. On the other hand, this fact only serves to heighten the mystery, as you’d think there would be global sedimentary evidence for it. Whether the arc is the result of one of the biggest catastrophic events in earth’s history, or an example of nature somehow managing to create a near perfect circle the size of New York State by processes acting over unimaginably long spans of time, its mere existence is absolutely fascinating.

The Boston Symphony on a weeknight: Death is gaseous and awesome

One of the nicest things about being a student in Boston is the $25 “BSO Student Card,” which lets you attend certain Thursday night performances of the Boston Symphony Orchestra for free. Of course, Thursday night is not the big night for the Boston intelligentsia to attend the symphony, and tickets for the cheap seats are actually cheap, even if you’re not a student. Thus, it’s fair to conjecture that you get a different crowd at the Thursday night performances, to put it politely, and it’s clear that many of us “far in the back” are not taking the experience as seriously as those paying $150 for the privilege. I fear that the musicians probably think of Thursday night as riff-raff night, and regard it as a rehearsal for the weekend’s benefactor show. If they don’t, they probably will from now on.

This week the orchestra played Edward Elgar’s “The Dream of Gerontius,” which is a huge piece for full chorus and orchestra with pipe organ. It is a setting of a poem of the same name, which deals with the death of a man and his transport beside his guardian angel to His final Judgement and on to Purgatory. (Too much capitalization there? Well, better safe than sorry, I say. The grammarian version of Pascal’s wager.)

The beginning of “The Dream…” is a somber orchestral prelude, setting the mood using perhaps the quietest tone in which I’ve ever heard an orchestra play. (For the first time I’ve seen, the concert notes are printed with the admonition “Please turn the page quietly.”) The hall is hushed, and this beautiful string adagio begins to wax quietly, creating a hallowed, church-like atmosphere. But it does not last long, this being Bingo night at Symphony Hall. An older gentleman in the balcony starts to go into a comical, high-pitched coughing fit that sounds like an asthmatic cat being repeatedly gut punched. They are probably looking frantically for this guy in whatever ICU he wandered out of. Going out in public was probably a poor call, but he clearly has a health problem and can surely be forgiven, if not lauded for his thematic complement to the subject matter. Jesu, Maria–I am near to death, And Thou art calling me; I know it now, sings the tenor. But there are others for whom Judgement will not be so kind…

Continue reading