Lab Write-Up: Fruit Flies

In this lab, we sought to produce a pure-breeding strain of white-eyed, ebony-bodied, vestigial-winged fruit flies. The challenge was designed to demonstrate the principles of genetic inheritance. Before beginning work on the flies, we needed to settle on a clear plan, and after a few hours’ worth of research into fruit fly genetics, the plan was finalized. The traits we were after are listed below:

Fly Trait            Dominant/Recessive   Extra Notes
Vestigial wings      Recessive
Ebony body           Recessive
White-apricot eyes   Recessive            X-linked
Very early in the challenge, we decided to run two separate strains of flies simultaneously, both to speed the experiment up and to reduce the number of generations it would take. We decided that the most useful beginning crosses would be White-Eyed Female x Ebony Male and Ebony Female x Vestigial Male.

The White-Eyed Female x Ebony Male cross produced, as expected, apricot-eyed males with grey bodies and normal-eyed females with grey bodies. White eyes, being X-linked and recessive, are guaranteed to be passed on to any sons of a white-eyed female. Because the female offspring carried both the recessive white-apricot allele and the recessive ebony-body allele, they were chosen for the third cross.
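To double-check that reasoning, here is a quick Python sketch (not part of the lab work itself) that enumerates the gametes of the first cross and tallies the expected offspring. It assumes the white-eyed mother carried no ebony allele, which we never verified directly:

    from collections import Counter
    from itertools import product

    # First cross: white-apricot-eyed female (Xw Xw; assumed +/+ for ebony)
    #            x ebony male (X+ Y; e/e)
    mother_x    = ["Xw", "Xw"]     # both maternal X's carry white-apricot
    father_sex  = ["X+", "Y"]      # the paternal gamete decides offspring sex
    mother_body = ["+", "+"]       # assumption: mother homozygous non-ebony
    father_body = ["e", "e"]       # father homozygous ebony

    offspring = Counter()
    for mx, fs, mb, fb in product(mother_x, father_sex, mother_body, father_body):
        sex  = "male" if fs == "Y" else "female"
        # White-apricot is X-linked recessive: a son's single Xw is enough to show it.
        eyes = "apricot" if sex == "male" else "normal (carrier)"
        body = "ebony" if (mb == "e" and fb == "e") else "grey (ebony carrier)"
        offspring[(sex, eyes, body)] += 1

    for (sex, eyes, body), count in offspring.items():
        print(f"{count}/16 {sex}: {eyes} eyes, {body} body")

Running it gives 8/16 apricot-eyed, grey-bodied sons and 8/16 normal-eyed, grey-bodied carrier daughters, which is exactly what the vials showed.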

The Ebony Female x Vestigial Male cross produced grey, non-vestigial flies. Because both traits are recessive and neither is X-linked, neither was expressed in these heterozygous offspring; the alleles were still present, however. Mating these males with the carrier females from the first cross would bring all of the needed alleles into one group of flies.

The third cross, Apricot-Ebony x Vestigial-Ebony, produced a few different types of flies. The ones we were interested in keeping were those carrying the apricot, ebony, and vestigial alleles. We transferred the females without ebony bodies, along with any males that didn’t show both ebony bodies and apricot eyes, to a holding tube. What remained was a collection of males with apricot eyes and ebony bodies, and females with ebony bodies, which we put into yet another new tube. There was a slight risk involved in making this move: there was no way to guarantee that the vestigial allele was still in the pool. The chances of that were low, however, and we’d be able to tell in the next generation.
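To put a rough number on that risk: every fly from the third cross had a vestigial-carrier father and a non-carrier mother, so each kept fly carries the vestigial allele with probability 1/2, independently of the others. A quick sketch (the group size of 10 below is a made-up illustration, not our actual head count):

    # Chance that the vestigial allele is missing from the entire kept group.
    # Each kept fly is a vg carrier with probability 1/2; the group size is a placeholder.
    kept_flies = 10
    p_vg_lost = 0.5 ** kept_flies
    print(f"P(vestigial allele absent from all {kept_flies} kept flies) = {p_vg_lost:.4%}")
    # prints roughly 0.10%, i.e. about 1 in 1000

With even a modest number of flies kept, the odds of losing the allele entirely are tiny, which is why we accepted the risk.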

The fourth cross, produced by inbreeding a selection of the third cross, contained a lot more variety than the third. This time, every fly had an ebony body, but very few had both apricot eyes and vestigial wings. It took a few days to gather enough of these to begin the final cross.

Unfortunately, at the time of writing, the final cross has not yet begun to emerge, so it is impossible to say whether the venture was a success. There shouldn’t be, however, much (if any) variation in the flies, since their parents all exhibited recessive traits.

Fruit flies have relatively simple, well-mapped genetics. Even so, controlling their phenotypes through selective breeding alone is a huge hassle. If the genes we wanted weren’t on different chromosomes (and therefore free to assort independently), the complexity of the task would have made it nearly impossible to finish this assignment. Fortunately, Drosophila melanogaster is a great critter for this specific topic.

Playing with Pi

I recently picked up a Raspberry Pi from Adafruit Industries (http://adafruit.com/). While the Pi itself ran for less than $40, I couldn’t help but buy a few thousand accessories to complement it. The final order came out to around $110, but it’s proven its worth.

At the moment, I’m doing nothing but experiments with it. Recently, I’ve been contemplating building a miniature cluster of Pi servers. Pis are relatively cheap as far as electricity goes, and could easily do the job of my web server (a very low-traffic system) at a much lower cost. While a Pi cluster might generate enough heat to require ventilation, the single Pi I’m currently working with barely produces noticeable heat unless it’s doing something resource-intensive, such as compiling the webserver package I use for all my hosting needs.

All in all, power consumption is low. I’m able to power my Pi from either a wall socket or my computer’s USB port (which provides enough power, despite the USB specifications saying it shouldn’t). As with all but two of my computers (my laptop and main desktop), I operate the Pi headlessly. While it’s perfectly equipped for TV-out, I already have a machine that handles that job. Its most useful video output is probably its native HDMI-out, which that existing machine doesn’t support; my main desktop and laptop have HDMI, but, due to software errors, can’t actually use it.

I’ve thought up many different devices to build with the Pi, but I won’t be able to implement many of them without further planning and resources. The Pi is, without a doubt, an awesome tool, toy, platform, and computational resource. Despite its awesome capabilities, however, it is not able to do anything about Taco Bell’s inability to produce food that doesn’t make me ill. You can’t kill me!

Adsense Sucks

The title says it all. I’m removing the advertisements from my webpage. I’ll keep my AdSense account active, though, in case I decide to add it to a later project. Hopefully that’s okay with Google’s guidelines. If not, I’ll use another advertising network.

Horizon

Thoughtless, I sit looking out upon an empty horizon.
Waves crash at the rocks below my feet.
An empty horizon is a lonely thing.
Over the horizon, a lone ship fights to meet my eyes.
Its burning sail steals my attention.
The Warpath overpowers the way of peace, and is darkened by it.
Restoration of ancient ways will not provide ancient innocence.
What was lost cannot be found.
All logical assertions depend upon mutual truth.
The enkindled ship announces future events, but not resolutions.
An empty horizon is a lonely thing, but this is something else.
This is the Warpath.

I have no idea what this means. This is partially because it was written while I was purposely trying to waste time, but also because I was trying to write it from the standpoint of a figure who only exists in my personal mythology.

The Cave: A Brief Overview

I’ve been computer-obsessed for a number of years (a very large number that’s nearly equal to my age). This obsession has led to my amassing a small battalion of machines. The majority of them, being completely incapable of doing anything, have been carelessly tossed into a closet and forgotten about. Those lucky enough to be worth keeping alive, however, were given places atop the desks and shelves that line my cave (also known as my office (which also happens to function as a bedroom (unless it’s the other way around))).

All of the machines in my cave run either a GNU/Linux distribution or OpenBSD. I don’t often use OpenBSD, but I completely agree with the project’s mission of building a secure system. The two GNU/Linux distributions that I prefer are Debian GNU/Linux and Slackware Linux.

I realize that my Slackware machines are horrendously out of date (Slackware 14.1 was released a few days ago and I’m still running Slackware 13.37 like a pathetic weirdo), but I keep them around because I find them useful for development-related tasks. Debian is fully capable of replacing Slackware on these boxes, but Debian is a mind-numbing system that takes the pain out of manually editing text files to configure software packages. Because of the pain and devotion that it takes to get anything to run properly in Slackware, I’ve long thought that anything capable of compiling and running under a Slackware environment deserves to consider itself ‘stable’. This is entirely superstition, however, and I do not base my code-maturity ratings on whether or not said code is capable of being compiled under Slackware. At the same time, I develop things on Slackware because I feel that its KISS (Keep It Simple, Stupid) attitude is contagious.

Despite the fact that Debian is mind-numbing, I use it on the rest of my boxes (I actually have more Debian boxes now than I do Slackware boxes) because Slackware is hard to work with when deadlines are involved. Spending four hours trying to get an office suite to compile and having a huge English project due the next day (I’m such a horrible procrastinator in this example which I’ll claim is not based on reality) is unreasonably painful.

My actual hardware is a collection of (mostly) recycled, self-refurbished machines. I’ll now go on to list them, along with their functions.

My main desktop is named ‘valerie’. I chose the name because, at the time, my brother was (alright, so he still is) attracted to a girl by the same name. I succeeded in pissing him (and said girl (probably)) off, but I later came to realize that I’d broken my cardinal rule of computer-naming: not to name them after people. However, since I have only spoken to the girl a few times, and because she’s forked my lawn more times than I’ve talked to her, I decided that I would rename her within my own mind (to ‘Morris’ (I always give people (especially girls) names that are as far from my computer-naming scheme as possible)), therefore mitigating the harm done to my naming scheme.

Getting back to the topic, valerie runs Debian GNU/Linux. I use her (boxen are feminine, I won’t back down from this position, especially if someone asks me nicely) as a desktop machine, as well as a network bridge. She provides Internet connectivity to the rest of the boxes in the cave. When I first got her, her BIOS needed to be restored. Sadly, the company that built her (Gateway) never released the ROM image I needed to flash her. I searched for a few hours and discovered that eMachines had released a single model that had the same motherboard as valerie. I downloaded the eMachines ROM and flashed it. Thankfully it worked. Now when she turns on (which rarely happens; she hasn’t been shut down in forty days), she displays an eMachines logo.

The second box on my network is called ‘vixen’. She needs a new name, unfortunately, as animal-related names are reserved for Slackware boxes, and she’s been running Debian since the beginning of this summer. vixen acts as a very hackish NAS (Network Attached Storage). She has a 3TB hard disk attached to her, which I use to store random things, such as pictures, backups, and my humongous tarball collection. She is also responsible for handling my Debian repo, which is my local copy of Debian 7.1.0 x86. vixen is a refurbished machine. The majority of her parts were originally used by my old school district, and I bought them for $1 at a sale. I added the parts from a few other machines I picked up at the same sale, as well as a TV-Out card that I picked up a few months later. Now she uses a TV as a monitor whenever she has need for such a thing.

The third box on my network is called ‘wolf’. This one is a Slackware box, and she is my main test environment. She is my only unmodified box: a Dell Dimension 3000 with a Celeron (ick). While I don’t throw any resource-intensive tasks her way, I do use her to test my Slackware Autoconfiguration Script, host one of my two development webservers (the one that’s configured to support CGI execution), and burn CDs. I’ve only had her for about a year, and she’s one of the few boxes I allow to be shut down regularly.

The fourth box on the network is ‘rose’, a Dell OptiPlex with a Celeron (ick). rose doesn’t really do anything right now, as wolf has basically taken over as the development webserver.

The fifth and final box (outside of laptops) in the cave is ‘allison’. allison has been in the cave longer than any of the other machines. She used to be my desktop machine, and I never moved any of the files from that era to any of my newer machines. Because of that, the only times I fire her up are to dig out old files (mostly music that vixen doesn’t have). She’s more powerful than wolf, and she’s one of the few machines left with irreplaceable data on her. If there were a machine on my network that I’d store secret information on, it would be her, because she’s unplugged when not in use.

Anyway, that’s a hopelessly overcomplicated review of my boxes. I’d go further into what they actually do by talking about the software they run, but I doubt a casual reader would care. (If you’re reading this as a casual reader, I urge you to consider whether or not the time you spent reading this article was ‘valuable’. I apologize in advance for wasting your time.)

Lab Write-Up: Osmosis

This study aimed to determine the molarity of cytoplasm by measuring water movement into and out of potato cells.

When a cell is not isotonic with its surroundings, it adjusts to them through a process called osmosis. Osmosis is the movement of water through a membrane from a less-concentrated solution to a more-concentrated solution. This process equalizes the concentrations on both sides of the cell membrane, making the cell isotonic. That is to say, a cell that is hypertonic relative to its surroundings will gain water, and a cell that is hypotonic will lose water.

As osmosis moves water into (and out of) cells, the change shows up in their mass. Because of this, a pair of mass measurements (one taken before a long soak in sucrose solutions of varying concentration, the other taken after) is sufficient for determining the movement of water through a cell’s membrane.

Unfortunately, because our lab is currently unable to take these measurements for a single cell, we decided to use chunks of potato, which are quite a bit easier to observe.

For our experiment, we extracted 21 2cm-long cylinders from a potato, using a scalpel to cut them to an exact length. We then dried and weighed the cylinders, recorded the weights, and immersed them in their solutions, three to a solution. Three were put into deionized water, and the rest were put into 0.1, 0.2, 0.3, 0.4, and 0.5 M sucrose solutions. Due to an error, the cylinders for the 0.5 M solution were 0.5cm long rather than 2cm long. Regardless, we opted to see what effect the decreased cylinder length would have on the collected data.

After an hour, the potatoes were removed from their solutions, dried, and weighed again, and the weights were recorded.

Upon conducting this experiment, these values were determined:

Molarity of Sucrose (M)   Beginning Weight (g)   Ending Weight (g)   Change in Weight (g)   Percentage Change
0                         1.358                  1.448               +0.090                 +6.6%
0.1                       1.326                  1.377               +0.051                 +3.8%
0.2                       1.336                  1.384               +0.048                 +3.5%
0.3                       1.300                  1.289               -0.011                 -0.4%
0.4                       1.233                  1.122               -0.111                 -9.0%
0.5                       0.327                  0.275               -0.052                 -15.9%

The potato cylinders which changed in weight underwent osmosis. Those which gained weight had been immersed in hypotonic solutions (the potatoes being hypertonic relative to the solution), and those which lost weight had been immersed in hypertonic solutions (the potatoes being hypotonic relative to the solution). Had there been no transfer of water (which is to say, no change in weight), the potato would have been sitting in an isotonic solution. By these data, the molarity of the potatoes is between 0.2 M and 0.3 M. As always, a binary search between those numbers could be used to find a more accurate molarity range, supposing anybody were willing to sit around immersing potatoes in minutely different sucrose solutions all day.
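For what it’s worth, a rough estimate of the isotonic point can be had without any more soaking by linearly interpolating between the two bracketing rows of the table. This is only a sketch, and it assumes the percent-change curve is roughly straight between 0.2 M and 0.3 M:

    # Linear interpolation between the molarities that bracket zero weight change
    # (percent-change values taken from the table above).
    m_low, change_low   = 0.2,  3.5    # percent change at 0.2 M
    m_high, change_high = 0.3, -0.4    # percent change at 0.3 M

    # Molarity at which the interpolated percent change crosses zero.
    isotonic = m_low + (0 - change_low) * (m_high - m_low) / (change_high - change_low)
    print(f"Estimated isotonic sucrose molarity: {isotonic:.2f} M")
    # prints roughly 0.29 M

That puts the estimate near 0.29 M, pending the hypothetical all-day potato-soaking session.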

As for the 0.5cm-long potato cylinders: they lost about 16% of their mass, a greater share of their water than any of the 2cm-long cylinders lost. Part of that is simply that they sat in the most concentrated solution, but their higher surface-area-to-mass ratio may have contributed as well, since it exposes a larger fraction of their cells directly to the solution. Despite being the product of a mistake, the 0.5cm-long cylinders provided especially interesting data.

This study uses procedures posted by Dr. Bunch on Clatsop CC’s Blackboard under the title “Determining molarity of cytoplasm (working draft Oct. 2013)”, reproduced as a PDF file here: (http://spenceradams.org/wp-content/uploads/2013/12/molarity.pdf).

Lab Write-Up: Enzymes

This study focused on the effects of environmental conditions on the rate of the reaction between amylase (an enzyme) and potato starch (its substrate), in order to further understand the functionality and working conditions of enzymes. The environmental conditions tested were: concentration of amylase, buffer solution pH, and solution temperature. To determine the reaction rate of each solution, Lugol’s iodine (I2KI) was used to detect the presence of starch in a small sample of each solution at set time intervals over the course of each run.

The first set of experiments dealt with the effect of amylase concentration on the reaction time of the solution. We hypothesized that reaction rate is directly related to amylase concentration (that is, a positive or negative change in amylase concentration would cause the same change in reaction rate), because higher levels of amylase should be able to deal with a fixed level of starch more quickly than lower levels can. To test this, 1ml of each of the test concentrations (1/2, 1/4, 1/20, 1/200) of amylase was mixed with 250 microliters of potato starch. A 25 microliter sample was extracted from the solution after 10 seconds (sample 1) and, for later samples, at 25+20(n-2) seconds, where n is the sample number. If this sample, when mixed with Lugol’s iodine, caused a discoloration in the iodine solution, the sample still contained starch. Once a sample was produced that did not cause discoloration, the solution’s reaction time was known to be less than or equal to that sample’s time value.
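To make the sampling schedule concrete, the times described above work out like this (a small Python helper, just for illustration):

    # Extraction times: sample 1 at 10 s, then 25 + 20*(n - 2) seconds for n >= 2.
    def sample_time(n: int) -> int:
        """Return the extraction time, in seconds, for sample number n (n >= 1)."""
        return 10 if n == 1 else 25 + 20 * (n - 2)

    print([sample_time(n) for n in range(1, 9)])
    # -> [10, 25, 45, 65, 85, 105, 125, 145]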

Upon conducting this experiment, these values were determined:

Concentration   Reaction Time
1/2             ≤10 seconds
1/4             ≤10 seconds
1/20            ≤25 seconds
1/200           ≤135 seconds

The 1/200 concentration was, of the four tested concentrations, the easiest to observe. Because of this, we chose it as the standard concentration for the rest of the experiments. The results of this experiment make sense on a fundamental level: more processors can handle a given amount of input more quickly than fewer processors can. When conducting this experiment, we also tried using a mere 100 microliters of amylase solution without buffer for our 1/2 concentration. When combined with 250 microliters of starch, this solution took a significantly longer time to react. We eventually reasoned that this was because of the different amylase-solution-to-starch-solution ratio. Though incidental, this lends a small amount of support to the idea that a higher processor count leads to a higher amount of processing. It would be interesting to determine whether this behavior continues at this amylase/starch ratio.

The second set of experiments dealt with the effect of buffer pH on the reaction time of the solution. The research team hypothesized that reaction rate is directly related to buffer pH (that is, an acid would decrease the reaction rate and a base would increase it), because hydrogen ions would interfere with the reaction. To test this, 1ml of the 1/200 amylase-buffer solution (with buffer pHs of 6.4, 5.4, and 7.6) was mixed with 250 microliters of potato starch. A 25 microliter sample was then extracted from the solution after 10 seconds (sample 1) and, for later samples, at 25+20(n-2) seconds, where n is the sample number. If this sample, when mixed with Lugol’s iodine, caused a discoloration in the iodine solution, the sample still contained starch. Once a sample was produced that did not cause discoloration, the solution’s reaction time was known to be less than or equal to that sample’s time value.

Upon conducting this experiment, these values were determined:

pH    Reaction Time
7.6   ≤245 seconds
6.4   ≤135 seconds
5.4   ≤45 seconds

These results indicated that the hypothesis needed rethinking, as more hydrogen ions seem to produce a faster reaction. This suggests that hydrogen ions play a part in letting this enzyme function properly. As most enzymes are proteins, it is possible that hydrogen ions alter the shape (and therefore the function) of the enzyme, leading to increased or decreased reactivity. It is therefore possible that a still more acidic solution could reach a breaking point, at which the enzyme loses its ability to function properly.

The third set of experiments dealt with the effect of temperature on the reaction time of the solution. Our team hypothesized that reaction rate is directly related to temperature (that is, a positive or negative change in temperature would cause the same change in reaction rate), because an energy-rich environment should be better suited to allowing reactions to take place. To test this, 1ml of the 1/200 amylase-buffer solution at different temperatures (room, cold, and hot) was mixed with 250 microliters of potato starch. A 25 microliter sample was extracted from the solution after 10 seconds (sample 1) and, for later samples, at 25+20(n-2) seconds, where n is the sample number. If this sample, when mixed with Lugol’s iodine, caused a discoloration in the iodine solution, the sample still contained starch. Once a sample was produced that did not cause discoloration, the solution’s reaction time was known to be less than or equal to that sample’s time value.

Upon conducting this experiment, these values were determined:

Temperature   Reaction Time
Cold          ≤225 seconds
Room          ≤135 seconds
Hot           (result to be borrowed from another group)

These results indicated that the hypothesis was incorrect, to an extent. The fact that the room-temperature run was faster than either the cold or the hot run seems to indicate that the effect of temperature on reaction rate follows a nonlinear model. This is, perhaps, because the shape of the amylase is drastically altered once its temperature exceeds a certain limit. If it is altered in this way, a binary-search-based experiment could lead to the discovery of that limit. If the relationship is smooth, however, a parabola should describe the curve reasonably well. A more detailed exploration of temperature’s effect on enzyme reaction rate is needed to properly identify the relationship’s exact shape.
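Once the borrowed hot-water result is in hand, fitting that parabola is straightforward. The sketch below uses made-up temperatures and a placeholder hot-water rate, so its output means nothing; it only shows the procedure:

    import numpy as np

    # All three temperatures and the hot-water rate are placeholders, NOT measured values.
    temps_c = np.array([4.0, 22.0, 60.0])        # assumed cold, room, and hot temperatures
    rates   = np.array([1/225, 1/135, 1/300])    # rate ~ 1 / reaction time; hot value is invented

    # Fit a degree-2 polynomial (a parabola) through the three points.
    a, b, c = np.polyfit(temps_c, rates, deg=2)

    # If the parabola opens downward, its vertex is the fitted optimum temperature.
    optimum_temp = -b / (2 * a)
    print(f"Fitted optimum temperature: {optimum_temp:.1f} degrees C (placeholder data)")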

From these data, it can be seen that higher enzyme concentrations, lower pH levels, and mid-range temperatures all improve amylase’s ability to react with starch. Though we hypothesized that the opposite relationship would exist in the case of pH, the research question was still answered. The hypothesis concerning temperature was also incorrect: while cold environments were less reactive than warm ones, hot environments were also less reactive than warm ones, which shows that the relationship is nonlinear.

This study uses procedures posted by Dr. Bunch on Clatsop CC’s Blackboard under the title “Protocol for Enzyme Experiment 10/16”, reproduced as a PDF file here: (http://spenceradams.org/wp-content/uploads/2013/10/enzyme.pdf).

It’s Friday, and I’m still alive.

What a week, eh? The Feds clocked out, I went back to sitting through lectures, I got a new job, the power was out for several hours on Thursday, and now I’m sitting in my office listening to Woody Guthrie for no easily explained reason. Since I can’t explain it, I’ll just claim I’m doing it for no reason whatsoever.

Those things are all unimportant, though. What is important is my absolutely crazy obsession with anthropomorphizing my collection of computers. For the last week, I’ve been building a collection of scripts that we’ll call “L” for now, to preserve their collective privacy. Bashing in a short command like “l vpn” saves me from typing out long strings of commands; that one command alone saves me about four minutes each time I use it.

Other commands, like “l reg”, function as bookmarks. “reg”, in particular, launches my favorite news site, The Register (http://theregister.co.uk/). “goog” searches Google for whatever its parameters are, “wiki” loads a Wikipedia page, and “pud” loads power outage information for Pacific County’s PUD.
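Just to illustrate the idea (the real L scripts are shell; aside from iceweasel and The Register’s address, everything below is made up for the example):

    #!/usr/bin/env python3
    # Illustrative bookmark-style dispatcher, not the actual L scripts.
    import subprocess
    import sys

    def reg(_args):                  # "l reg" -> open The Register
        subprocess.run(["iceweasel", "http://theregister.co.uk/"])

    def goog(args):                  # "l goog some search terms" -> Google search
        subprocess.run(["iceweasel", "https://www.google.com/search?q=" + "+".join(args)])

    COMMANDS = {"reg": reg, "goog": goog}

    if __name__ == "__main__":
        if len(sys.argv) < 2 or sys.argv[1] not in COMMANDS:
            sys.exit("usage: l <command> [args...]")
        COMMANDS[sys.argv[1]](sys.argv[2:])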

All through the process of serving me, L gives “human” feedback, which impresses other people more than it impresses me. In fact, I don’t even know why I decided to have L say things like “Alright, you’re on the virtual network.” It just seemed like the thing I wanted to do. In any case, it makes it a lot easier for other people to figure out what I’m doing.

For some reason, when I’m sitting in front of a computer screen with a few hundred little terminals filling it, people decide that I must be doing something scary, interesting, or insane. Most of the time, I’m just using them because I don’t like having to access programs through a menu-driven interface. That’s some people’s game, sure, but it’s not mine. I’d rather type “iceweasel” than have to click on a (to use Windows terminology) “start button” and navigate to the appropriate entry. Not only is it faster, but I hate having to deal with something as analog as a mouse.

I realize that, once iceweasel is loaded, I choose to use a mouse to interact with it. This is because some websites use JavaScript to grab keys that I’d otherwise be using for keyboard-based navigation. Because I’d rather learn one standardized interface than a few hundred different ones that vary with the sites being visited, I opt to simply use the mouse.

Getting back to the topic, L is a useful thing. While writing its sudo-ish features, I’ve had to put my knowledge of Linux filesystem security to the test. When a user has write access to a folder containing a script that’s executed by a setuid program, it doesn’t matter how much security the rest of the system has; it’ll be broken into the second that user realizes he can execute programs as root.

That being said, the directory structure had to be considered carefully. L usually searches for extensions in the cmds directory, but when no command can be found there, it searches the root-owned, read-only (to the user) secure directory. As of yet, there is only one actual program in that directory; the rest of the directory is composed of links to it. The program, “rootscript”, learns the name it was invoked by (shutdown, for instance), uses its setuid flag to escalate its privileges, and finally executes the script of that name in the root-owned, inaccessible (to the user) rcmds directory.
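For illustration, here is the dispatch logic as Python pseudocode. The real rootscript would presumably be a small compiled setuid binary (the setuid bit does nothing for interpreted scripts on Linux), and the rcmds path below is a placeholder:

    #!/usr/bin/env python3
    # Sketch of the rootscript dispatch idea, not my actual implementation. A script
    # like this runs with whatever privileges it was started with; the privilege
    # escalation itself is the setuid binary's job. The rcmds path is a placeholder.
    import os
    import subprocess
    import sys

    RCMDS_DIR = "/opt/l/rcmds"   # hypothetical root-owned, user-inaccessible directory

    def main() -> None:
        # Learn the name we were invoked by (e.g. a symlink named "shutdown"
        # in the secure directory that points at this dispatcher).
        invoked_as = os.path.basename(sys.argv[0])

        target = os.path.join(RCMDS_DIR, invoked_as)
        if not os.path.isfile(target):
            sys.exit(f"rootscript: no such privileged command: {invoked_as}")

        # Hand the request off to the root-owned script of the same name.
        result = subprocess.run([target] + sys.argv[1:])
        sys.exit(result.returncode)

    if __name__ == "__main__":
        main()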

As of yet, I haven’t focused on pentesting this setup. I’m sure any cracker worth their nuts would be able to exploit my relative lack of security knowledge to gain complete control of my laptop, my house, my car, and the Internet. From my own perspective, however, it seems secure.

Please leave a comment if you see a blatant, major security flaw. I doubt I’ll receive any, but that’s my own problem for operating a website that nobody reads.