You Inherit Part of Your Fingerprint from Your Parents

2 minute read
Originally posted here:

Our fingerprints are a one-of-a-kind pattern, so unique to an individual that even identical twins don’t share them. And yet I’m here to tell you that you inherit part of your fingerprint from your parents. Huh?

If you look closely at your fingerprints, you’ll notice that their patterns are one of three main types: loops, whorls or arches.

If you were to look at your fingerprint under a microscope though you’d see that while the ridges on your fingers follow one of the patterns, there are small variations in them, like breaks, forks and islands.

While the general shape of your fingerprints is heritable, these small details, often called minutiae, are not. Why that is comes down to how fingerprints are formed.

When a fetus is about 7 weeks old, they begin to form pads on their hands and feet called volar pads. These pads only exist for a few weeks: at around 10 weeks they start to be reabsorbed into the palms of the hands and the soles of the feet.

Around this time, the very bottom layer of the epidermis begins to form folds due to pressures from the growing skin. These folds are the precursors to your finger ridges, or fingerprints, and the pattern they take depends on how much of the volar pad has been absorbed when they begin to form. If the volar pad is still very present, then you’ll develop a whorl pattern. If the volar pad is partially absorbed, you’ll form a loop pattern, and if it’s almost entirely absorbed, you’ll form an arch pattern.

So how do genetics come into this? Well, the rate of volar pad reabsorption and the specific timing of the creases in the epidermis appearing are genetically linked. However, these events only determine the general shape of the fingerprint. The minutiae are influenced by things such as the density of the amniotic fluid, where the fetus is positioned and what the fetus touches while in utero. Since every fetus will grow in a different environment, their minutiae will differ. Even twins that share a uterus will interact with their surroundings differently. So even if your fingerprint shape matches that of your parents, if you look closer, you’ll see the differences that make your prints uniquely yours.

Did you know that fingerprints aren’t only a human feature? To read about fingerprints in koalas, click here!

Koalas Have Fingerprints Just like Humans

2 minute read
Originally posted here:

In 1975 police took fingerprints from six chimpanzees and two orangutans housed at zoos in England. They weren’t just looking for a unique souvenir; they were testing to see if any unsolved crimes could be the fault of these banana-eating miscreants.

While these primates ended up being as innocent as they seemed, the police did determine that their fingerprints were indistinguishable from a human’s without careful inspection.

Two decades later, in 1996, a different type of mammal came under police suspicion: a koala!

While it makes sense that orangutans and chimpanzees would have fingerprints like us, being some of our closest relatives, koalas are evolutionarily distant from humans. It turns out that fingerprints are an excellent example of convergent evolution, or different species developing similar traits independently from each other.

Another example of convergent evolution is seen in the bony structure supporting both birds’ and bats’ wings.

Fingerprints are thought to serve two purposes. First, they aid in grip, allowing an animal to better hold onto rough surfaces like branches and tree trunks. Second, they increase the sensitivity of our touch and allow us a finer level of perception regarding the textures and shapes of the things we hold.

Why this is useful for humans is obvious. Our hands are made to grasp, hold and manipulate objects. Whether it’s some nuts we foraged for or our Xbox controller, we humans spend all day every day relying on our sensitive sense of touch.

For koalas, it’s not really so different. They are incredibly picky eaters, showing strong preferences for eucalyptus leaves of a certain age. It seems that their fingerprints allow them to thoroughly inspect their food before they chow down.

Police aren’t exactly worried about koala bank robbers, but it is possible that koala fingerprints could be found incidentally at a crime scene and be mistaken for a human’s, making it pretty difficult to find a match.

To read about how fingerprints form, how parts of them are genetic, and why identical twins have different ones, click here!

You’re probably storing leftovers wrong (especially if it’s rice)

3 minute read
Originally posted here:

If, like me, you aim to cook dinners that provide both your next day’s lunch as well as a freezer portion to be thawed at some future date, you may want to stop. At least with rice.

Uncooked rice can contain spores of Bacillus cereus, a bacterium that can cause two different types of food poisoning. The first type is characterized by vomiting (and thus is called the emetic form). It results from consuming a toxin produced by the bacteria while they’re growing in your food and has a short incubation time of 1-5 hours. The second is characterized by diarrhea (and is unsurprisingly called the diarrhoeal form). It results from a toxin that is produced in your small intestine as the bacteria grow there and has a longer incubation time of 6-15 hours.

The two forms are commonly associated with different types of foods. The diarrhoeal form has been linked with foodstuffs like soups, meats, vegetables and milk products, including formula. The emetic form comes from a more limited list of culprits, as it’s mostly associated with starchy foods that have been improperly stored, like rice, pasta, pastries or sauces.

But what does “improperly stored” actually mean?

If a raw food is contaminated with B. cereus (as much rice is) and then cooked, some spores will remain in the cooked product (unless you’re in the habit of heating your rice to above 100 ˚C for extended periods of time). These spores, if left standing in temperatures between 10 ˚C and 50 ˚C, such as on your stove or countertop, find themselves in their ideal environment (wet and warm) to germinate, grow and produce the toxin that will make you sick.

It doesn’t take long for the bacteria to multiply either. A colony of B. cereus can double in size within 20 minutes if kept at 30 ˚C. Routinely reheating your food will not deactivate the toxin or kill the bacteria. Since the bacterium and its toxin are so resistant to heat, your only hope of dodging food poisoning is to stop the spores from germinating in the first place.
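That 20-minute doubling time compounds quickly. Here’s a minimal sketch of the arithmetic; the starting count and the four-hour window are hypothetical, chosen just to illustrate the growth rate mentioned above:

```python
def colony_size(initial: float, minutes: float, doubling_minutes: float = 20) -> float:
    """Size of a bacterial colony after `minutes` of unchecked exponential growth,
    given a fixed doubling time (20 min for B. cereus at ~30 degrees C)."""
    return initial * 2 ** (minutes / doubling_minutes)

# A single surviving spore left on the counter for 4 hours (240 min = 12 doublings):
print(colony_size(1, 240))  # 4096.0
```

Twelve doublings turn one surviving spore into thousands of toxin-producing cells, which is why how quickly you cool the rice matters so much.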

To sidestep a nasty bout of illness caused by B. cereus you should aim to eat your food as soon as possible after it is cooked. If you can’t do that, then hot foods should be kept above 60˚C and cold foods, below 5˚C. Meats and vegetables should be cooked to an internal temperature of 60˚C and kept there for at least 15 seconds. Frozen foods should ideally be thawed in the fridge or as a part of the cooking process.

If storing leftovers for later, they should be cooled and refrigerated as quickly as possible (according to the NHS, within 1 hour is best). Avoid storing hot leftovers in deep dishes or stacking containers together, as both slow down cooling. When reheating leftovers, make sure they reach an internal temperature of at least 74˚C, and don’t keep them for more than seven days, even in the fridge.

When dealing with high-risk ingredients (like rice, grains and other starchy foods) it’s best to not keep leftovers at all. But if you do, try not to keep them for more than one day, and never reheat them more than once. Even freezing doesn’t kill bacteria but rather just stops them from multiplying, so, by all means, freeze your leftover curry, but make fresh rice when it’s time to eat it again.

Considering the amount of improperly stored rice I now know I’ve eaten it seems almost a miracle that I haven’t gotten sick yet. Then again, food poisoning with B. cereus is often confused with the 24-hour flu, so I may have already paid for my mistakes without even knowing it.

Let’s all learn from my mistakes and start storing our leftovers properly.

Cell Phones and Wi-Fi are Perfectly Safe

2 minute read

The idea that cell phones, routers, wireless heart rate monitors, alarm clocks or pretty much any other electronic device will give you cancer is one of the most persistent fears around. The good news is, it’s also one of the most baseless.

Read the entire article here:

7 Up: Originally an Antidepressant

1 minute read
Originally posted:

When 7 Up was originally placed on the market in 1929, it was named Bib-Label Lithiated Lemon-Lime Soda, a much less catchy, though more descriptive, name. The ‘lithiated’ in the name came from the soda’s ingredient lithium citrate, a compound used to treat patients with mental health problems like bipolar disorder, depression or mania.

The soda went through a name change to 7 Up Lithiated Lemon Soda before finally settling on just 7 Up, and a formula with no added lithium. The 7 in the name has no confirmed source, but there are several theories about its origin. Some soda fans claim that it is derived from the 7 ingredients used in the original recipe, others from the soda having a pH of 7 (which is not true), and others think that the 7 originates from the lithium in the original formula, as this element has an atomic mass of ~7.

Sulfates in Shampoo

2 minute read
Originally posted here:

As someone who likes to routinely dye my hair bright pinks, blues and purples, I’m often told by my hairdresser to use sulfate-free shampoos. He often talks to me about how multiple rounds of bleaching and dyeing will leave my hair damaged and brittle, and how sulfate-free shampoo will be gentler, both on my damaged hair and on the colour. It seems like every time I take a shower it occurs to me to look into why that is, and whether or not it’s true, but somehow by the time I’m dry, dressed and sitting at the computer I’ve forgotten again. Finally, though, here is what I’ve found about sulfates in shampoos.

Shampoo as we know it was invented around the 1920s and 1930s. It was in 1930 that Procter & Gamble made the first sulfate-based shampoo, and since then the formulations haven’t changed all that much. It’s important to remember that ‘sulfate’ isn’t one compound; it’s a common name for any compound containing a sulfate group. The ones commonly used in shampoo (historically and currently) are sodium laureth sulfate, sodium lauryl sulfate and ammonium laureth sulfate.

So what do sulfates do anyway? Well, a couple of things. Primarily they are surfactants, which means they can attract both water and oil molecules, and it’s this property that makes them good for cleaning. They attract the oil on your scalp, then carry it away down the drain. It’s also their surfactant status that allows sulfates to create the lather we all know and love in our shampoos.

The problem with sulfates is really that they’re a bit too good at their job as surfactants. Their ability to effectively strip dirt and oil out of our hair means that we also lose a lot of the natural oils that protect our hair and scalp, which can leave our heads feeling dry, or even irritated and red if you have sensitive skin. Sulfates are also irritants, so if you get shampoo in your eyes a lot (like me), you may notice that sulfate-containing shampoos sting a bit more. And if you dye your hair (again, like me), you’ll likely want to use sulfate-free shampoos, as the same efficacy at stripping oils will also strip colour.

Outside of being a bit intense, there are no other problems with sulfates. The myth that they cause cancer is just that, a myth, and they have been studied and approved many times for use in hair products. Shampoos need surfactants to work (they’re made up of 5-30% surfactant), and sulfate-based surfactants are the most effective option for getting the job done, but there are other options if you find these shampoos drying out your skin or causing rashes. Do not, however, buy into the media hype that normal shampoos are dangerous or unhealthy.

Rust Doesn’t Give You Tetanus

1 minute read
Originally posted here:

Ever step on a rusty nail? It was, in all likelihood, rapidly followed by your parents dragging you to the doctor’s office for a painful (but safe!) tetanus shot. The memory of my first tetanus shot is preceded by the memory of exploring an abandoned barn and getting cut by a stray wire fence. If it had happened in my own home it wouldn’t have even deserved a band-aid, but the threat of rust sent us to the doctor’s office.

But it turns out that injuries caused by rusty objects aren’t any worse than injuries caused by any other discarded object.

Tetanus, or lockjaw, is a bacterial infection caused by Clostridium tetani, an extremely hardy rod-shaped bacterium found in animal digestive tracts and soil worldwide. Tetanus is fatal in about 10% of cases, and in all cases causes muscle spasms, fever and trouble swallowing.

We associate tetanus with rust because the bacterium is often found in soil that’s rich in organic material like manure or dead leaves. Old houses, cars or other discarded items left in nature for long enough will rust (if they’re metal) and collect bacteria like Clostridium tetani, but the relationship between rust and tetanus-causing bacteria is purely correlative, not causative. Humans can be exposed to Clostridium tetani in a variety of non-rusty ways, such as when cleaning animal cages, when bitten by infected animals, or if exposed to contaminated heroin.

So if your skin is pierced by anything from your own kitchen knife to a rusty, gnarled screw, or if you begin working on a farm, it’s worth making sure that your tetanus shot is up to date. After all, (in Canada at least) it’s free and lasts an entire decade.

The Truth Behind “Beer Before Liquor”

2 minute read
Originally posted here:

Have you ever heard the saying “beer before liquor, never been sicker”? Or “liquor before beer, you’re in the clear”? What about “grape or grain but never the twain”? Well, it turns out that there might be some truth to at least some of these adages.

There are a few factors to consider here.

First, there’s the absolute volume of alcohol you are consuming. Looking at the Manhattan as our example cocktail, it contains roughly 28% alcohol by volume (ABV), which makes it seem much less potent than, say, straight whiskey, with its ABV of 40%. But it’s not really fair to compare these drinks on their ABVs since the amounts consumed tend to be different.

What matters isn’t the ABV of a drink, but the true amount of pure alcohol (ethanol) in a drink. In the chart below you can see a comparison of drinks’ ABVs, volumes, and actual amounts of ethanol.

Drink            ABV (%)   Volume of 1 Drink (mL)   Absolute Amount of Alcohol in 1 Drink (oz)
Bloody Mary      12        220                      0.9
Straight vodka   40        45                       0.6

So you can see that, even though we tend to consider one glass of wine, cocktail, or can of beer equal to “one drink”, the actual amount of alcohol you’re consuming can vary wildly by what kind of drink you are having.
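The chart’s last column is just ABV multiplied by volume. A quick sketch of that arithmetic, assuming the chart’s volumes are in millilitres (1 oz ≈ 29.57 mL):

```python
ML_PER_OZ = 29.5735  # millilitres in one US fluid ounce

def absolute_alcohol_oz(abv_percent: float, volume_ml: float) -> float:
    """Ounces of pure ethanol in a drink: ABV fraction times volume, converted to oz."""
    return (abv_percent / 100) * volume_ml / ML_PER_OZ

print(round(absolute_alcohol_oz(12, 220), 1))  # Bloody Mary -> 0.9
print(round(absolute_alcohol_oz(40, 45), 1))   # Straight vodka -> 0.6
```

The same formula lets you compare any two drinks fairly: a tall, weak drink and a short, strong one can easily contain the same amount of ethanol.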

The volume difference in drinks also influences how quickly we drink them. A beer tends to take longer to drink than a cocktail, or especially a shot, simply because it’s much larger. Purely based on volume, you could drink 2.5 Manhattans in the time it takes to drink one bottle of beer. So, by drinking beer, you essentially give yourself a lower alcohol per minute rate of consumption than when drinking cocktails.

If your options are only to drink cocktails and then beer, or beer and then cocktails, it makes sense to keep your heavier drinking for the beginning of your night. When you’re more sober you’ll be better able to pace yourself, evaluate how you’re feeling, and make changes to your rate of consumption if need be. Later in the evening, when your decision-making process is already compromised, beer is a safer option that won’t contribute as much to making you more intoxicated.

There is, however, another factor at play here: how well your body absorbs alcohol in different preparations. A 2007 study found that vodka served diluted (with carbonated or still water) was absorbed faster than vodka served neat. This means that even if the same amount of time is taken to drink straight liquor or a glass of wine (two drinks which contain about the same absolute amount of alcohol), the wine may still leave you more intoxicated, as it is better absorbed into your blood.

As for the grape or grain advice? Feel free to ignore it. A 2019 study compared the hangover severities of subjects who drank only beer, only wine, beer and then wine, or wine and then beer, and found that “neither type nor order of consumed alcoholic beverages significantly affected hangover intensity.”

Your Pet Cat May Be a Bit More Dangerous Than You Think

2 minute read
Originally posted here:

Cat scratch disease (CSD) is an infection resulting from a scratch or bite of a cat (or, in rarer cases, dogs or other animals). It is not the same thing as Cat Scratch Fever, an album by Ted Nugent, although CSD can cause a fever, as well as swollen lymph nodes, lethargy, neuroretinitis and headaches.

CSD is the result of an infection by Bartonella henselae, a bacterium commonly transmitted to cats via the cat flea (yes, cats and dogs usually have different fleas). Rarely, ticks and spiders can also carry the bacterium, and transmit it directly to humans.

Kittens are more likely to carry Bartonella henselae than adult cats due to their underdeveloped immune systems, and are much more likely to bite or scratch their owners while learning how to play gently. But anyone who is exposed to cats of any age should take care to clean any wounds well to avoid risk. Bartonella henselae can also be transmitted to humans via cats’ saliva, so as sweet as it may seem that Fluffy is licking your wounds for you, it’s probably best to wash them and wear a Band-Aid.

For veterinarians, CSD is actually considered an occupational hazard. Vets are frequently in close proximity to many cats, oftentimes cats that are acting aggressively and are more likely to bite or scratch. One study found Bartonella DNA in 32 of the 114 veterinarians it tested.

CSD is diagnosed via blood test, or simply by considering the symptoms of the patient, the most obvious of which is a swollen blister or sore and red area surrounding the infected bite or cut. Those who are immunocompromised (such as patients with HIV), very young or very old are more likely to be infected, and rates of infection generally increase during spring in North America, likely due to the birth of many new kittens.

So while they may be as cute as anything, cats do still pose a risk to their owners, and not only because they may destroy your favourite furniture.

The kitty in the picture is named Jean-Charles and he is available for adoption from the Réseau Secours Animal in Montreal now!

Leafcutter Ants are Farmers Who Grow Fungi

2 minute read
Originally posted here:

Leafcutter ants can strip as much as 17% of the leaf biomass from plants in their ecosystem and can clear entire trees in under a day. Next to our own, leafcutter ant societies are the most complex on earth. They build nests that can contain thousands of rooms and cover up to 0.5 km², a feat that is necessary since a mature colony can contain more than eight million individuals.

But if they’re not eating the leaves that they carry home, what are they doing with them?

Farming. Leafcutter ants use leaves as their fertilizer to grow their crop: fungus.

They cultivate their fungal gardens by providing them with freshly cut leaves, protecting them from pests and molds, and clearing them of decayed material and garbage. In return, the fungus acts as a food source for the ants’ larvae. The ants are so sensitive to the fungi’s needs that they can detect how the fungi are responding to a certain food source and adjust accordingly. This symbiotic relationship also benefits from a bacterium that grows on the ants’ bodies and secretes antimicrobials, which the ants use to protect their fungi.

Adult ants don’t feed on the fungus, but rather get their nutrients from leaf sap. Smaller adults often hitchhike on leaves being carried back to the nest to opportunistically feed on the sap, as well as to protect the carrier from flies and check that the leaf isn’t contaminated with other fungi.

Leafcutter ant society is divided into castes, with each group having a different role to play. The largest ants, called Majors, act as soldiers and heavy lifters. They guard the nest and help to clear out the highways between the nest and a food source. The next smallest caste, the Mediae, is made up of generalists, cutting and transporting the bulk of the leaves for their colony. Next in size are the Minors, who protect the foraging path and food source, and the smallest ants, the Minims, work exclusively at home, tending to the larvae and fungus garden.

Some Minims work exclusively as garbage collectors, removing decaying organic matter from their fungal gardens and transporting it to dedicated garbage rooms placed well below the rest of the nest. After becoming garbage collectors, these ants will never interact with the fungus or the queen, to prevent any disease from being passed onto them.

Leafcutter ants are often presented as a single species of ant, but in reality, there are 250 species of ants which practice fungus farming. Besides their agrarian tendencies, these ants have something else in common: queens. When it comes time to establish a new colony, winged virgin queens-to-be take part in their nuptial flight and mate with many different males to collect sperm. They then set out to find an appropriate place for a new colony, bringing with them a piece of the fungus to seed their new fungal gardens.