
11 November 2008

They can take our labs, but they can't take our brains!

Chemistry hobbyists face a labyrinth of local and state regulations -- an article about how increasing regulation makes amateur chemistry more and more difficult. Via slashdot.

Fortunately mathematics is not so easily regulated. If mathematics comes to rely increasingly on computers to do the "dirty work", and high-powered computers themselves become regulated, one can imagine a future where amateur mathematics is concentrated in the fields that don't require computation. But in the end mathematics is pure thought, and they can't regulate what goes on inside our brains.

(I studied chemistry as well as mathematics in college. I would never have laboratory equipment in my home. But this is because whenever I touched laboratory equipment it broke, which is why I got out of chemistry. In the hands of a competent chemist, there's nothing to fear.)

09 April 2008

You mean there are people who *don't* write everywhere?

Harnessing Biology, and Avoiding Oil, for Chemical Goods, today's New York Times. I studied a fair bit of chemistry as an undergrad, so this is of interest to me academically. Basically, a lot of synthetic goods are made out of compounds with lots of carbon, which can eventually be traced back to petroleum; as you may have noticed, petroleum and its derivatives have gotten more expensive recently. So even if you were never a chemist you should still care.

The photo at the top of the article, though, is what got my attention. It's captioned "The scientists use the glass shield as a board on which to write chemical formulas", and, judging from the captions on the other photos, the implication seems to be that they do this to conserve scarce resources. No! It's just that scientists of any sort write things everywhere -- every chemistry lab I was ever in had this property. I wonder what they'd think of mathematics departments. (One professor I know often has about four different calculations going on simultaneously on the whiteboard of his office; they overlap each other, but they're in different colors, so he can tell them apart. I can't do that.)

In the hall of the dormitory floor I lived on as an undergrad we had several blackboards. They were often filled with mathematics of one sort or another. Of course, they were also often filled with transcriptions of the silly or obscene things some of us had said. I kind of wish I'd written them down... but let's face it, they were probably pretty embarrassing and are best left where posterity can't see them.

It might be interesting to see pictures of well-known mathematicians' blackboards...

10 October 2007

The mathematician's "you"

Ertl Wins: Down With Witchcraft, by Derek Lowe at In The Pipeline, on this year's Nobel winner in Chemistry. Most chemical reactions take place in some sort of bulk liquid or gas; Ertl's work considers chemistry that occurs on the surface of a solid. The most important example is probably the Haber-Bosch process for creating ammonia from nitrogen and hydrogen.
Lowe writes:
You can Haber-Bosch yourself some ammonia simply enough – just take iron powder, mix it with some drain cleaner (potassium hydroxide) and stir that up with some alumina and finely ground sand (silica). Heat it up to several hundred degrees and blow nitrogen and hydrogen across it; ammonia gas comes whiffing out the other end. Now, bacteria do this at room temperature in water, down around the roots of bean plants, but bacteria can do a lot of things we can’t do. For human civilization, this is a major achievement, because nitrogen does not want to do this reaction at all.

Megan McArdle's response to this: "Derek Lowe has a highly exaggerated notion of my abilities."

For a moment this struck me as the way a mathematician might use "you" (although the preferred second-person pronoun in mathematical texts is "the reader", as in "This is left as an exercise for the reader." or "The reader can show that..."). But it's not quite the same thing. If I'm sitting there reading some mathematics, and I come across something that "the reader" should do, I probably can do it, sitting there in my chair, if I have a large enough supply of paper and coffee. (Whether I will is a different matter; how badly do I want to understand what I'm reading?) But the reader of Lowe's post -- or the reader of a chemistry paper -- can't actually do that. The chemist has to take the author's word for it, in most cases, because it would be prohibitively expensive to check everything they read.

Vladimir Arnold, in an essay "On Teaching Mathematics", said that "Mathematics is the part of physics where experiments are cheap." This is an example of that phenomenon, although I would argue that "physics" should be replaced with some broader term, as more and more areas of knowledge are becoming mathematizable and computing power becomes cheaper and cheaper.

02 September 2007

Another shot at the Doomsday argument

Robin Hanson critiques the Doomsday Argument. This is an argument about the lifespan of the human species, which begins from the following principle: there is nothing special about present-day humans. "Therefore" we can treat the number of humans who have lived so far, as a fraction of the number of humans who will ever live, as uniformly distributed: the probability that this fraction lies between p and q is q - p. I put "therefore" in quotes because the implication is tempting, but one could equally well conclude that the amount of time there have been humans, as a fraction of the amount of time there will ever be humans, has this same distribution. (Indeed, I've heard both versions of the argument.) The first version says, for example, that the probability that there will be at least sixty billion more humans is one-half; the second says that the probability that we as a species will survive for at least another two hundred thousand years is one-half. (I'm assuming that sixty billion people have ever lived and that our species is 200,000 years old.)

And indeed there are other classes of beings that you can use as the reference class here. Living things, for example. Or vertebrates, or living cells, or humans, or even such classes as "humans who have lived after the year X", which get kind of ridiculous. That last one is particularly prone to abuse: we can simultaneously say that humanity has a fifty percent chance of surviving past 2114 (if we take X = 1900) and past 3014 (if we take X = 1000).
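To make the arithmetic concrete, here is a minimal sketch in Python; the framing and names are mine, and the only inputs are the figures quoted above plus 2007 as the present year:

```python
# Doomsday-style arithmetic: assume the fraction f of all humans ever
# (or of all of human history) that has already occurred is uniform on
# [0, 1], so P(p <= f <= q) = q - p.  Then P(f >= 1/2) = 1/2: with
# probability one-half, the total is at most twice what we've seen.

def median_total(observed_so_far):
    """Median of the total under the uniform-fraction assumption."""
    return 2 * observed_so_far

# Version 1: counting humans.  Sixty billion born so far.
print(median_total(60e9))      # 1.2e11 total, i.e. sixty billion more

# Version 2: counting years.  The species is about 200,000 years old.
print(median_total(200_000))   # 400,000 total, i.e. 200,000 more years

# The reference-class abuse: "humans living after year X", as of 2007.
for x in (1900, 1000):
    print(x, "->", x + 2 * (2007 - x))   # 1900 -> 2114, 1000 -> 3014
```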

The name "Doomsday argument" is rather misleading, too. "Doomsday" is usually seen as a bad thing. But what comes after humanity might be the "posthumans" that the people who believe in a technological singularity talk about; is that really doom? Hanson gives a quantitative version of this where there are several "toy universes".

I've talked before about how I'm not entirely comfortable with the "Copernican principle" from which this is derived. For some reason I am much more uncomfortable with this than I would be with the equivalent line of reasoning applied to non-human objects. If I had an urn containing balls labeled from 1 to N, and I didn't know N, and I reached in and grabbed a ball marked 100, I'd say in a heartbeat that the urn probably contained around 200 balls. But the difference is that in the Doomsday Argument we don't even know what the urn is.
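Here's a quick simulation of the urn version (the parameters are mine) showing where the factor of two comes from:

```python
import random

# Balls labeled 1..N; draw one uniformly and guess N ~ 2 * label.
# The median label is about N/2, so the median guess is about N,
# and the true N exceeds the guess about half the time.

random.seed(0)
N = 200                                   # the "unknown" truth
trials = 100_000
guesses = sorted(2 * random.randint(1, N) for _ in range(trials))
print("median guess:", guesses[trials // 2])                    # close to 200
print("P(guess >= N):", sum(g >= N for g in guesses) / trials)  # about 1/2
```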

The Doomsday argument is supposedly only provisional, until such time as we have better knowledge of how long societies tend to last. This is, to my mind, one of the most useful reasons for trying to find extraterrestrial intelligence; the knowledge that other intelligent species exist (or even a thorough search that doesn't turn up anything) would give us substantial information about how long we might expect to last.

When I studied biochemistry I thought something similar. Essentially all known life forms on Earth share similar biochemistry, because we all evolved from the same ancestors. So an introductory biochemistry class essentially consists of memorizing those shared mechanisms. What I would have wanted is, say, a dozen or so independently evolved biochemistries; then we could see which features of our own biochemistry are just accidents of evolution and which are essential to any complex, self-replicating system.

30 June 2007

drug + drug = better drug

Old Drugs In, New Ones Out -- from today's New York Times.

A field known as combinatorial chemistry has recently emerged. Many molecules have similar "backbones" and differ only in, say, a few groups of atoms hanging off the end; the canonical example is proteins, which are built up from just twenty different amino acids. The amino acids all share the same basic skeleton -- a central carbon bonded to an amino group, a carboxylic acid group, and a hydrogen -- and differ only in the fourth substituent, the side chain called "R". The actual protein is made by sticking these molecules together via peptide bond formation, which eliminates the -OH group from one amino acid's carboxylic acid end and a hydrogen atom from the next one's amino end, bonding the carbon and nitrogen of adjacent amino acids directly together.
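To see why "combinatorial" is the right word, here's a small illustrative count; the one-letter codes are the standard amino-acid abbreviations, and the rest is my own framing:

```python
from itertools import product

# Chains of n residues drawn from 20 amino acids: 20**n possibilities.
print([20 ** n for n in (2, 3, 10)])    # [400, 8000, 10240000000000]

# Enumerating the 400 dipeptides explicitly, by one-letter code:
codes = "ACDEFGHIKLMNPQRSTVWY"          # the twenty standard amino acids
dipeptides = ["".join(pair) for pair in product(codes, repeat=2)]
print(len(dipeptides))                  # 400
```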

In drug design, it seems that what's often considered is the pharmacophore -- basically, the "business end" of a molecule. If you synthesize a bunch of molecules that are the same at one end but different at the other end, well, that means that the "business end" won't be exactly the same in each instance, and some might be better than others.

But what they're doing now takes this to a new level. Drugs that have already been created are now being combined with other drugs -- not chemically, just put in the same pill. (Though sometimes the combination is subtler than just throwing both in, which means you can't reproduce it by taking the two pills separately.) And of course, there are a lot of combinations you can get this way. What's more, the combinations aren't what a mathematician would call "linear" -- if you take a drug that does A, and a drug that does B, and stick them together, you don't always get a drug that does A-and-B. For example, one drug mentioned by the article -- Avanir's Zenvia -- takes a cough suppressant and a drug used to treat heart rhythm disturbances, and gets out a drug to stop uncontrolled laughing and crying. Predicting which combinations of drugs will have effects like this is tricky, and a lot of the work is in screening the combinations. But synthesizing all those combinations is also hard. Here's a patent for robotic synthesis.

One company, CombinatoRx, got my attention because their name is pronounced like the word "combinatorics". The article states that their current research program is to take two thousand generic drugs, make all possible pairs, and screen them to see if they do anything interesting; then develop the interesting drugs. There are two million possible pairs of drugs. They test "several thousand pairs of medicines a day". How long can this last? Well, if you assume "several thousand" means "two thousand", then it can last a thousand days. (Presumably they could expand their library of generic drugs, though.)

The next step would then be to try three-part drugs -- with the same library, you'd have about 1.3 billion of them. At 2000 combinations tested a day, that would take nearly two thousand years to test.
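Both counts check out; here's the arithmetic, reading "several thousand" a day as two thousand (the variable names are mine):

```python
from math import comb  # Python 3.8+

DRUGS = 2000
PER_DAY = 2000                      # "several thousand pairs ... a day"

pairs = comb(DRUGS, 2)              # 1,999,000: about two million
triples = comb(DRUGS, 3)           # 1,331,334,000: about 1.3 billion

print(pairs / PER_DAY, "days")           # ~1000 days of pair screening
print(triples / PER_DAY / 365, "years")  # ~1800 years for triples
```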

The article notes: "For a triple combination, the F.D.A. might want evidence that the trio is better than not only the individual parts but also better than any of the possible pairs. Showing that would require huge and costly clinical trials."

One wonders if it would be as huge and costly as implied here. My instinct is that combinations of three, four, or more drugs would come from adding a single drug to an already existing combination -- or, in the case of four-part drugs, taking two two-part drugs and putting them together. So some of the testing would already be done. From what I've heard about the FDA, though, they're likely not to care.