Raspberry Pi Phone Home Script

For working with headless Raspberry Pis, I thought it would be useful to have the Pi report its address(es) when it boots. Originally I wrote it as a one-liner, but then I realized that it would be useful to grab the MAC address for debugging porpoises. But to get that, you need to know which interface is active. And this was starting to sound less like a one-liner and more like a script. So I hacked one together in bash.

The script just iterates through the interfaces, checks to see if they're up, and then reports the info back to a script on the server.

#! /bin/bash
# rpi phone home script 20200228 sbrewer
# Reports the MAC and IP of whichever interface is up to a script on
# the server. REPORT_URL is a placeholder: the original URL wasn't
# preserved in this post, so point it at your own server-side script.
REPORT_URL="http://example.com/phonehome"
_HN=$(hostname)
for i in $(ls -1 /sys/class/net); do
  #echo "Working on $i interface..."
  if [ "$(cat /sys/class/net/$i/operstate)" = "up" ]; then
    echo "Reporting $i interface..."
    _IP=$(hostname -I)
    _HW=$(cat /sys/class/net/$i/address)
    /usr/bin/wget -O /dev/null -q "$REPORT_URL?hw=$_HW&pi=$_HN&ip=$_IP"
  fi
done
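On the server side, any script that logs the query parameters will do. Mine isn't shown here, but a stand-in might look like this (the handler and log format are illustrative, not what I actually run):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

def format_report(path):
    """Turn the request path from the Pi's wget into one log line."""
    params = parse_qs(urlparse(path).query)
    return " ".join("%s=%s" % (k, v[0]) for k, v in sorted(params.items()))

class PhoneHomeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Append each report (hw, pi, ip) to a log file
        with open("phonehome.log", "a") as log:
            log.write(format_report(self.path) + "\n")
        self.send_response(200)
        self.end_headers()

# To run it: HTTPServer(("", 8000), PhoneHomeHandler).serve_forever()
```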

BluetoothLE Project

Last spring, a faculty member and student met with me to discuss building an instrument, and I agreed as a kind of "dry run" in preparation for my new class. It sounded like a relatively simple project: a small, wearable device that collects two temperature readings. After a cursory bit of research, I recommended we try using the Adafruit Feather with BluetoothLE and two Onewire temperature sensors. It sounds simple, but getting it to work proved way more complicated than I had imagined.

First, we were hampered by trying to connect to the Feather using the student's Windows laptop. She didn't know anything about using serial ports via USB. And the USB cable she had was power-only. And the Bluetooth interface on her laptop couldn't do BluetoothLE at all. So everything felt like two steps forward and one step back.

Getting first one and then two temperature sensors working over the Onewire interface was not too bad. But then we confronted the fact that her laptop couldn't do BluetoothLE -- and I realized that even if we could, we would end up wanting a data logger that would remain after she completed her project. So I recommended we get a Raspberry Pi.

I had expected BluetoothLE to be relatively straightforward. Boy was I surprised. There were a few examples provided with the Adafruit Feather: we tried the heart-rate monitor example to start with, but it turned out not to be a good fit for what we were trying to do. The implementation is tied to proprietary protocols, and there were complicated twists at every level. The heart-rate monitor example uses a predefined set of characteristics that is not readily generalizable. After struggling with it for a long time, I was on the point of giving up.

After much experimentation, I decided that a generic UART connection would make more sense. But there were a bunch of tricks to getting it to work at all. It turned out I couldn't connect to the device without first doing a "scan" with the adaptor. The Python bindings that came from Adafruit didn't work. Eventually, I found pygatt, a different set of Python libraries that (by using a deprecated set of connection tools) I could get to connect more-or-less reliably and receive data. But it could only send text, so on the Feather I multiplied each float by 100, rounded to an integer, and transmitted the integers as CSV; on the Pi I parsed the CSV with Python, divided by 100, and rebuilt the CSV together with a time stamp.
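Stripped of the pygatt and sensor plumbing, that scaling round-trip amounts to this (a sketch, with both ends shown in Python for illustration; the real sending side runs on the Feather, and the function names are mine):

```python
import time

def encode_readings(temps):
    """Sender side: scale floats to ints (x100) and pack as one CSV line."""
    return ",".join(str(round(t * 100)) for t in temps)

def decode_readings(line):
    """Receiver side: parse the scaled ints back to floats, prepend a timestamp."""
    values = [int(v) / 100 for v in line.strip().split(",")]
    stamp = time.strftime("%Y-%m-%d %H:%M:%S")
    return ",".join([stamp] + ["%.2f" % v for v in values])

# e.g. encode_readings([21.374, 36.52]) gives "2137,3652"
```

Two decimal places of precision survive the trip, which is plenty for a temperature sensor.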

It reminded me a bit of the Galileos I worked with a couple of years ago. As an expert, I could work with them and ultimately get them to roll-over and bark. But it wasn't something that I would expect a novice to be able to take on and get anywhere with.

Overall, the experience was very useful to me for the kinds of projects I'm hoping students will do. And I learned a bunch of useful tricks for working with these tools, although there are still a number of hurdles to overcome.

Help desk woes

This morning my frustration with the help desk at my university boiled over and I (perhaps unwisely) sent this note:

Thank you very much for the prompt and complete response to my concerns. This is perhaps the fourth or fifth time in the past year that I have reported a problem to the help desk and have gotten an inappropriate reply back. I could go back in my email and find them all, but that's not the point I'd like to make.

In the past, Dreamhost had you indicate with a menu option how familiar you were with technology with options from "I have no idea" to "I probably know more about this than you do". (I always picked the option before that.) I don't know if that's why I always got prompt, excellent service from them, but I'd like to imagine that they paid attention to that and used it like a filter to interpret what I was saying.

Similarly, in xkcd, Randall Munroe dreams of having a "code word" you can use that will automatically transfer your call to a person who knows a minimum of two programming languages.

I'm not sure what the best solution is, but perhaps something like printing out a small note and putting it next to each person's phone that says "Steve Brewer is not a moron." I don't need people to tell me to check whether it's plugged in. If I say they should do something, I'd like them to actually check with someone who knows what they're talking about before replying. I don't report stuff that isn't broken.

What I've been doing is what I did this time: I take a deep sigh and I write a polite note to [the director] asking him to fix it. But it's a PITA, wastes my time, wastes his time, and makes everything take longer.

Perhaps, instead of a note with only my name, we could have a flag that shows up for any of the technical staff. Or something.

I want to use the system. I just want the system to work.

Thanks again for the prompt, courteous follow up to my concerns.

It probably won't help, but I've already been told by someone else that I'm "very pissy" this morning and needed to vent.

BCRC Server Update

We updated the BCRC server over spring break. We've known we needed to do it for a long time now, but had been putting it off because we have old Drupal 6 sites that need to be migrated, and I've suspected that they (or some of their modules) won't work with the newer version of PHP in this version of Ubuntu. During the fall, we updated the server in the ISB and, to prepare, we tested and I rewrote some of the old legacy code that I had written, so we knew that much. But this server was enormously more complicated.

It turned out there were only two real problems with my old code. Originally, PHP used its own regular-expression library (ereg) but, at some point, started including functions built on the Perl-compatible regular-expression library (preg). I've probably been using PHP since before it had either, but I had to make minor API changes and update my functions to use the preg functions. I had also started using the mysql functions very early, and those had since been deprecated, so I needed to make minor changes to all of those as well.

The first serious problem was that our MySQL databases didn't get updated properly. The updater is supposed to be able to ask mysql to update the databases, but something went wrong. At first, mysql wouldn't start at all, I think because it was confused about which my.cnf it should be reading; this was probably why the update didn't work correctly. This server has existed since ~1995 and was originally a Sparc 10. Then we got an E250, then a T5220, and most recently a Supermicro running Ubuntu. But a lot -- too much -- of that history is still there. Once I cleaned up the my.cnf files, we got mysql running, but it couldn't read the database of users, and so nothing could get access to its data. It turns out that to update the database you need to log in as "root" -- so if you can't log in, you can't update the database. So I shut down the service and started it manually with --skip-grant-tables, and then I was able to run the upgrade script. Then mysql came up. (Note: this story skips several epicycles I went through trying to sort out the problems, but it is what I did in the end.)

Before the migration, I had started rebuilding a new web-tree around Drupal 8 (rather than the old tree, which was built around Drupal 6), hosted at a different CNAME. I had migrated the key pieces of what I wanted going forward without taking down the old web-tree. After the update, I switched to the new web-tree and then migrated stuff out of the old tree into the new one. I figured this was potentially more disruptive, but safer: there was a lot of old history that was infested with bitrot. It seemed better to start with known-good things and then rescue stuff that people identified as missing after-the-fact.

There are still a lot of rough edges: broken paths, missing images, stuff that needs to be reconfigured. But basically good to go. The two most complicated updates are yet to come. But this was a good test-bed to help us prepare for what we need to do for those.

Re-mystifying Technology

As someone who lived through the heady days of the Internet revolution, it's been hugely discouraging to see big corporations gradually stuffing the internet genie back into the bottle. Today's edition is that Google wants to kill the URL.

The early internet was amazing because it was something people could aspire to actually understand. There was a time when many people were interested in learning HTML and, as part of that, learning how URLs worked. Nobody does this anymore. Partly, this is because the technology has become so complicated. But a big part of this complexity is actually unnecessary -- and contributes to empowering corporations to create interfaces that conceal the complexity behind a "consumer" experience.

URLs have become a problem because people don't understand how they work. And because corporations have chosen to make really complicated URLs, it can be hard to tell a "real" URL from a fake one cooked up by identity thieves or malware authors.

As it turns out, however, URLs mostly don't have to be complicated. Google *could* instead undertake an effort to punish sites that use complicated URLs (or "link shorteners") and encourage other technology companies to do the same. Instead, however, we see a continuing effort to conceal the complexity and "re-mystify" the technology. It's rather like the annoying "check engine" idiot-light in your car. There's no reason why they couldn't tell you exactly what the light means. But, instead, auto manufacturers have created a system that requires an expensive, proprietary tool to be connected to the car's computer to read-out the code.

Why? More money for them.


Syncthing

I was an early Dropbox user, but was never happy about using a "cloud" service. The "cloud" is just someone else's computer, and I've always wanted to use my own computer. I used SparkleShare for several years, but there was never a client for mobile, which limited its utility. But recently, I found out about Syncthing.

Syncthing provides a web-based graphical user interface for setting up end-to-end relationships among devices. It tries to be something a non-technical person could use, but I'm not sure it's quite there yet: it's a somewhat uneasy compromise between the two, and I sorta wish it just had plain-text configuration files that I could edit with vim. (Its configuration file is plain text, but XML.) In the end, I found that the easiest way to configure things was to use an ssh tunnel so I could configure both ends at the same time in different browser tabs.

ssh [hostname] -L 8333:localhost:8384

This forwards local port 8333 to port 8384 (Syncthing's default GUI port) on the remote machine, so the remote GUI shows up at localhost:8333.

I set it up among 4 devices: my Ubuntu server, a Mac desktop, a Mac laptop, and an android phone. I could install via packages on Ubuntu. Simple. On the Mac, I had to move files by hand and missed that I needed to edit one file before starting it. And then that I needed to replace more than one instance of USERNAME in the file. But eventually I got it working. You could fix the directions, but you couldn't fix my pattern of only reading the directions once everything else has failed. It was easy to get it installed on Android except for figuring out how to create folders in the file-system. I could see the DCIM, Music, etc, folders, but didn't see how to create a folder at that level. Eventually, I saw you could select the /storage/emulated/0 folder and THEN create the folder. Very tricksy.

I was worried it wouldn't work properly across my broken NAT gateway, but it was fine. It uses local discovery as well as a distributed network of "discovery" servers to exchange information about where nodes are. It's just a little creepy, but seems to work OK.

One thing Syncthing has taught me is patience. A couple of times, I would set up something at one end, go to the other end and try to set it up there too, only to (eventually) have Syncthing simply ask me if I didn't want to set it up, with the setup already done. Sometimes it takes longer than I think it's going to, but just a little patience—getting a cup of coffee—does the trick.

I still haven't figured out all the configuration options. My brain wants to think in client-server terms, but Syncthing is more peer-to-peer in orientation. But it is highly configurable. It has four or five different options for versioning, including an "external" option, so you can write a script to manage versioning just how you like it.
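For what it's worth, a minimal "external" versioning script might look like the sketch below. Going from the Syncthing documentation as I understand it: the configured command is given the folder path and the file's relative path (the %FOLDER_PATH% and %FILE_PATH% placeholders) and is expected to move the file out of the way. The timestamp-suffix policy here is just my example:

```python
#!/usr/bin/env python3
"""Minimal external-versioning script for Syncthing (illustrative).

Syncthing invokes the configured command with the folder path and the
file's relative path; the command must move the file out of the way."""
import os
import shutil
import sys
import time

def version_file(folder, relpath):
    """Archive folder/relpath into .stversions with a timestamp suffix."""
    src = os.path.join(folder, relpath)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    dest = os.path.join(folder, ".stversions", relpath + "~" + stamp)
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    shutil.move(src, dest)
    return dest

if __name__ == "__main__" and len(sys.argv) == 3:
    version_file(sys.argv[1], sys.argv[2])
```

You could just as easily rsync the old version off to another machine, which is the appeal of the "external" option.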

I still haven't used it long enough to be sure I'm ready to migrate away from Dropbox and Sparkleshare, but initial results are very encouraging—encouraging enough I've given them some money. Now I just need to persuade Phil to set up a peer so we can provide off-site backups for each other. I mean, he's got that Raspberry Pi JUST SITTING THERE…

Poem Window Prototype

I decided it was a Really Good Idea, having proposed to make infrastructure for the Poem Windows, to try out the system to make sure it would actually work like I expected. I bought a GeeekPi 7" 1024x600 display with acrylic stand and assembled it.

It arrived without instructions of any kind. There were 5 acrylic pieces, a PCB, an LCD, and a bunch of little nuts and screws and other things. I went to the Amazon page, where there were several pictures that were actually sufficient for most of the assembly. I discovered the hard way that you needed to get all the cables connected before trying to assemble the stand, but I had only been hooking pieces together very loosely, so it was not much work to disassemble and then reassemble the stand after everything was hooked up. The only really frustrating part was trying to attach the ribbon-cable video connector, which was not described anywhere nor clearly visible in the pictures.

Eventually I got it assembled and was genuinely amazed when I hooked everything together, powered it up, and it lit right up. And then turned off. I checked the connections a couple more times and then logged into the pi via ssh and un-commented the hdmi_safe directive to see if that would make it work. And it did! So from that point on, it was just a matter of getting the HDMI configuration right. Eventually, I found that there was a page with configuration information. The magic recipe (from that page) was
hdmi_cvt 1024 600 60 3 0 0 0
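If I'm reading the Raspberry Pi configuration documentation right, the seven values map onto the hdmi_cvt parameters like so (annotations mine; note that in /boot/config.txt the directive is normally written with an equals sign, hdmi_cvt=...):

```
# hdmi_cvt <width> <height> <framerate> <aspect> <margins> <interlace> <rb>
#
#   1024 600   panel resolution in pixels
#   60         refresh rate in Hz
#   3          aspect ratio code (3 = 16:9)
#   0 0 0      no margins, progressive (non-interlaced), normal blanking
hdmi_cvt 1024 600 60 3 0 0 0
```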

Voila! One Poem Window prototype, suitable for testing and display.

Google's App Specific Passwords in Calendar

At some point, the only way to use Google Calendar with Apple's Calendar application was to set up two-factor authentication and then create an "app-specific password" to use in the application. On my laptop, running Mavericks, this quit working a couple of days ago. Trying to create and enter a new password didn't work either. Eventually, I used this as an excuse to install El Capitan, where the Internet Accounts pane in System Preferences now directs you to authenticate directly via Shibboleth to configure a Google Account. But I suspect this means that Mavericks (and previous) systems will find it hard to work with Google anything going forward and will need to be updated to Yosemite (or El Capitan).

Talk at 4VA: Putting the Horse Before the Cart

Over the past 15 years, I've tried to help faculty use technology thoughtfully, to empower students and facilitate learning. I'm unusual in this role, since I'm first-and-foremost a science educator. I would like to outline briefly what I think the research in science education tells us about designing learning environments, provide some examples where I think we have been able to build successful environments and make a difference, and finally, to reflect on where current trends in technology are taking us — and where I think we should go instead.

I was asked to speak briefly about my experience as a consultant to faculty on the implementation of technology in support of education. Thank you for the opportunity to address you and to reflect on my experiences. I would also like to acknowledge my community of collaborators: Randy Phillis, Elizabeth Connor, Zane Barlow, Tom Hoogendyk, George Drake, and the Morrill Science Center Technical Staff — I couldn't have done this without their help and participation. I should also acknowledge the financial support of the Howard Hughes Medical Institute that initially funded the Biology Computer Resource Center and brought me to UMass; the Pew Charitable Trust and the Center for Academic Transformation, that funded our work on revising Introductory Biology; and the National Science Foundation, that funded our more recent work on model-based reasoning.

I saw a program once that was considered very successful at helping older faculty adopt technology. Faculty were given a laptop, a partial release from teaching, and some technical support to adopt technology for a course. What almost inevitably resulted was the faculty member starting to use PowerPoint for lectures. (Today it might be "class capture" or "Camtasia".) That is, the events in the classroom were largely the same, but now, instead of yellowing viewgraphs, the faculty member presented their slides using a computer (or did what they were doing before, but now "captured" it using technology). And that was considered "progress", because the faculty member had adopted technology. I always thought it was putting the cart before the horse. The goal shouldn't be to mindlessly adopt technology because technology is better: we should be mindful of current research regarding how people learn and then look at how technology can uniquely enable and support that. I think part of the issue has been the practice of coupling a faculty member with a technologist. What often results is modern technology implementing traditional education.

Seymour Papert used the analogy of putting a jet engine on a stage coach "to assist the horses", but throttled down enough to not hurt them. In this way, you could cross the great plains in just a week — rather than transformative change, that would allow you to fly across in just hours. (You'll notice that Seymour is more ambitious than I am: I'm not suggesting ditching the horse all together — let's just make sure we get the horse in front of the cart, and save flying for later.)

Bloom's Taxonomy

I believe there has never been a more misused educational tool than Bloom's Taxonomy. I assume everyone has seen it at some point. Frequently it is cited as though it represented developmental steps or prescriptive activities, and as though students must pass through each lower phase before they can reach the upper phases. But nothing could be farther from the truth. Bloom himself described his book as "the most cited, yet least read book in American Education." In fact, there is no good reason for people to be encouraged to do the lower activities at all unless, as part of trying to achieve the higher ones, they see the lower ones serving some proximate goal. In other words, there's no reason not to have every student activity be part of striving for the upper levels of application, analysis, synthesis, or evaluation.

How People Learn and Constructivism

The National Academy report "How People Learn" developed a model for thinking about effective learning environments that described them as "learner-centered", "knowledge-centered", "assessment-centered", and "community-centered" — pretty abstract. A current buzzword is the "flipped classroom". Like "active learning" or "blended learning", it's not a particularly specific term. I have real problems with how teaching practice often gets disseminated, as "teaching tips". What needs to happen is a paradigm shift in terms of how people think about teaching and learning.

The key insight in this paradigm shift is that you can't transmit knowledge from one person to another. Knowledge must be built, or "constructed" as an extension of existing knowledge. Students are not blank slates upon which instructors write their wisdom: students' prior knowledge is essential. And for learning to be meaningful, students must understand their own knowledge and intentionally extend it, which is somewhat threatening and risky: it's hard to admit that what you think is wrong. But this is critical, because students' alternate conceptions can interfere with the conceptions you want them to learn.

Briefly, returning to the language of the National Academy, learning environments need to be "learner-centered" — learning takes place in the learner and has to be built around their unique needs and prior knowledge. By "knowledge-centered" they mean that learning goes beyond remembering: knowledge implies not just individual facts and ideas, but how those ideas are interrelated into a (hopefully) coherent whole. "Assessment-centered" means that learners need many opportunities to get feedback on whether the knowledge they're building holds up and is consistent with the goals. And finally, "community-centered" means recognizing that knowledge is embedded in a social context: how you talk, how you relate to other people, being part of a community of scientists with accepted behaviors, practices, etc. It's a tall order to expect to achieve those learning goals with a lecture — even if you're using PowerPoint.


One additional aspect of designing learning environments requires considering what motivates students to participate. Last year, I taught a course for honors students. At first, it was a pleasure: all the students but one completed every activity required to get an "A". They came to class, they participated, and did everything that was required. But then I offered a special activity outside of class: a visitor from Germany who could talk about the subject, but who couldn't attend during the hours of the class. Not a single student came. Why not? It wasn't required. I ultimately came to be a bit creeped out by the honors students because they were totally uninterested in the subject -- they seemed, in fact, uninterested in anything except maintaining their honors status. I expect they each had a long list of everything they had to do to maintain honors and tried to do everything on that list, which left little or no time for anything else. And they didn't do anything out of interest or passion -- just checking things off a list.

Alfie Kohn wrote an excellent book a number of years ago called "Punished by Rewards". It turns out there's a huge literature showing that rewards (like punishments) don't motivate people to do anything except get the reward. And any kind of higher-order skill you want to name -- learning or customer service or ditch-digging -- will be undermined by trying to "reward" people to do it. If you want people to do a good job, ask them to do a good job and turn them loose with as much support and feedback as you can give them. But don't try to reward particular behaviors because what you'll get is the mindless pursuit of those behaviors rather than people using all their faculties to try to do the right things. Unfortunately, our students come to us already programmed to focus on "points". It's a problem -- and not one that's easily solved. I'll return to this just a bit at the end, but I raise it now as something to think about as you try to create environments that will foster genuine engagement and meaningful learning. The more you try to reward students for doing particular things, the less likely they are to actually try to do what you want and the more likely they are to focus on how to get the points with minimum effort.

Learning Goals and Technology

I co-chaired two committees in the UMass Biology Department where we established a set of learning goals for the major. These aren't content-based — we assumed each course and curriculum will establish content goals. What we wanted was to describe the capabilities and perspectives that we thought a Biology Major should come away with. They include things like "observe and describe nature accurately", "ability to construct logical arguments in biology", and "ability to communicate ideas and arguments effectively both orally and in writing". Coming together as a faculty and building a set of goals has been incredibly important in terms of establishing a dialog about the outcomes we're looking for in our classes. One of my favorite quotes about education is attributed to Jerome Bruner. He said, "Learning is not the product of teaching — learning is the product of the activity of the learner". To know what students are learning, then, you need to look at what students are doing. And to produce meaningful learning, you need to create environments where students' activity is aligned with your learning goals.

As someone who tries to facilitate and support updating pedagogy and implementing technology, I think what I do most can be summarized by this simple question: "But what are the students going to do?" A good model for thinking about this is "what does a scientist do?" What do you do? It's probably a complex mix of trying to figure stuff out by making your own observations, using the observations of others, struggling with data, writing and discussing with peers, consulting a mix of information sources, and maybe working with models and simulations. Are there other things you do? Students should be doing this too, and technology can support a lot of it. Technology can play three fundamental roles. It can be a tool to collect, analyze, and synthesize data: think spreadsheets, statistical packages, and image-manipulation software. It can support communication: enabling people to share ideas, collaborate on writing, and find sources of information. Finally, it is fantastic for creating models and simulations that can provide realistic data consistent with a set of assumptions: you can explore what data might look like under a simple set of assumptions to see how closely it matches real data.

Molecular visualization

Shortly after I arrived at UMass, a colleague was developing a set of teaching resources for the RasMol molecular visualization software. Currently, he's still doing much the same kind of thing with Jmol. He was building tools so that instructors could develop presentations and tutorials of the features of particular molecules. An instructor could make a script that would load a molecule, rotate it a certain number of times, highlight a particular site or bond, zoom in on that, etc., with little buttons to step through the presentation. He was really excited about putting the software into the hands of faculty. "But what are the students going to do?" I asked. He stopped and sort of scratched his head. I pointed at the screen and said, "The students are just clicking 'next'. All of the 'thinking' is built-in." Over several months, we came up with a number of ways that students could be engaged with using the software to explore and make inferences about molecules, not just watch canned presentations created by someone else. In one, we had students color the bases according to how conserved each base is in the sequence and make inferences about which parts of the molecule are most highly conserved. In another, we looked at which parts of a protein are polar and non-polar to make inferences about the structure and function of the molecule.

Intro Biology Labs

When I arrived at UMass, the Intro Biology laboratories were identified as a place that needed updating. The labs were primarily confirmational, that is, students had a set of directions that each student, sitting in neat rows, followed individually, that said things like "Step 14: add the reagent and observe the blue color. Write 'blue' in your lab notebook." That's an exaggeration of course, but it's not very wide of the mark. If you asked "What are the students going to do?" it was mostly "follow directions". And it turns out that "follow directions" is not actually one of the Biology Department learning goals. The lab coordinator wanted the labs to run smoothly — often so smoothly that students didn't learn very much. When the students used software, she would prepare "click-by-click" directions that told the students exactly what to do to analyze data or generate a figure. We redesigned the laboratory environment to put students naturally into groups of three with a computer workstation and a space for "wet" activities. And we began to "uncook" the activities.

We tried to build in routine opportunities for collaboration and a focus on higher-order goals. One way that we built in collaboration was to have students work in groups. Group work is incredibly valuable: students can provide feedback to each other much more quickly than the instructor can. Students often have complementary knowledge so a group gets stuck less often than an individual. In this way, the instructor is needed only to resolve the most difficult and complicated problems. Another way was to build large datasets by having each group submit data to an online form that aggregated the data across the entire class. Previously, each student would have only their single record of data to analyze: now we could have a spreadsheet with 1000 rows. This had two effects: first, it really required using technology: nobody wants to add up 1000 values with a calculator (although we've had students try). But, more importantly, with 1000 values, you have real statistical power, so that students were more likely to be describing what the effect was, rather than why their particular data wasn't very good. We could also build interesting data sets, like time-series, that leveraged having students collect data for a whole week, and provided an additional dimension to the data being collected.

Higher Order Skills

Click-by-click directions don't require higher-order thinking. Rather than click-by-click directions, I'm more inclined to pose problems to students by offering them a spreadsheet of data (perhaps by having them jointly collect a body of data that they can each download) and giving them the general task of analyzing the data and presenting a figure, or figures, that will summarize the analysis -- and letting them figure out how to do it. In one exercise with juniors, I made a spreadsheet (CSV) of data with columns for Gender, GPA, Hours Studied per week, and Hours Slept per week. I made about 9 different versions, so that each group of 3 students could have a different version. Each version was biased differently: in some, males showed an effect on GPA for hours studied; in some it was females; in some it was reversed; and so on. I also added some odd values: a line with an "O" for gender, and an outlier for one of the other values (e.g. 520 for hours slept per week).
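For the curious, one of those biased CSVs could be cooked up along these lines (a sketch: the column names follow the exercise, but the bias term, ranges, and file name are made up for illustration):

```python
import csv
import random

def make_dataset(path, biased_gender="M", n=60, seed=1):
    """Write a CSV where hours studied boosts GPA for only one gender,
    plus a deliberately odd final row for students to catch."""
    rng = random.Random(seed)
    rows = [["Gender", "GPA", "Hours Studied", "Hours Slept"]]
    for _ in range(n):
        gender = rng.choice(["M", "F"])
        studied = rng.uniform(5, 40)     # hours studied per week
        slept = rng.uniform(35, 70)      # hours slept per week
        gpa = 2.0 + rng.uniform(-0.3, 0.3)
        if gender == biased_gender:
            gpa += 0.05 * studied        # the planted effect
        rows.append([gender, round(min(gpa, 4.0), 2),
                     round(studied, 1), round(slept, 1)])
    rows.append(["O", 3.1, 20.0, 520.0]) # odd gender code, impossible sleep
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(rows)

make_dataset("gpa_study_sleep.csv")
```

Varying the biased_gender (or negating the effect) with a different seed gives each group its own version.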

It was fascinating (though a bit discouraging) to watch the students try to analyze the data with no guidance. I had intentionally put the GPA column first, since I know that spreadsheets assume that the dependent variable will be second. Many of the students generated plots that implied GPA was causing students to study or sleep. Very few recognized that, with a spreadsheet, you need to sort by gender and then plot and analyze each series of points separately. But they recognized pretty quickly that they were using the wrong tool for the job.

Subsequently, we used R Commander (Rcmdr) to analyze the data, which has tools for doing a multivariate analysis, identifying independent and dependent variables, and controlling for groups. Just a couple of minutes of summary was enough to get them started and, in a short time, most groups were able to generate a figure that illustrated how their dataset had been biased.


One of the most transformative technologies enabled by the Internet was a surprise to nearly everyone. How many of you have used Wikipedia? When wikis were first created, and when Wikipedia began, most people sneered: how could a system where random people on the internet could write anything possibly produce anything of value? But I don't believe it is controversial any longer to say that Wikipedia is the best source for general information on the internet. If you want a good, basic presentation of nearly anything: "Balsamic Vinegar" or "Houston, Texas" or "Natsume Yunjincho", you can find it at Wikipedia. Moreover, most scientific concepts are also well represented there, like "Mitosis" or "Oxidative Phosphorylation".

One feature that few people understand is the ability to look at the history of a document and easily compare versions. With another source, like a biology textbook, you often have little information about how authoritative the text is: the textbook has an "author", although much of the writing is often done by others as a "work for hire". And the book has a publisher, probably some giant corporation. And whatever authority the book has derives from them. But with a Wikipedia article, you can see who contributed all of the changes and when. This provides a lot of context for making informed decisions about how reliable a particular piece of text is. Was it written once 5 years ago and never edited? Or was it begun 5 years ago and subsequently edited 10,000 times? Have the changes been a process of gradual refinement toward a final polished piece? Or does it lurch back and forth based on differing viewpoints among authors? You have vastly more information on which to base a decision. People who object to Wikipedia because a page might be vandalized when you look at it fail to recognize that what Wikipedia really represents is a different conception of what "knowledge" is.

Most students, and many courses, approach knowledge like it is received fact. Students believe they are expected to master a vast array of information, commit it to memory, and demonstrate they can answer questions in a particular way. Fewer approach knowledge as being able to construct a good answer based on what they know and what they can find out. We have used wikis (and wiki-like systems) for student collaborative writing. I generally have each group of students create a page to contain their writing for a project. At the end, I have each student take the text from the page and use it to independently create a correctly-formatted manuscript (conforming to manuscript-submission guidelines similar to ones we've looked at for journal articles). When I evaluate the papers, however, I use the revision history of the wiki: when was the document created? Who contributed which pieces of text? Did all of the authors make meaningful edits and comments on the rest of the text? And, in particular, did the entire document magically spring into existence the night before the deadline? Using a wiki gives me much greater confidence regarding the originality of my students' work. Furthermore, it enables me to monitor documents as they're being created and provide feedback in real time. I can follow up with authors that aren't contributing and encourage them to do more. I can make suggestions.

If you're asking students questions (in class, on an exam, ie, ever) that can be answered by Google, you're asking the wrong questions. Students don't need to know stuff that they can look up in Google. Remembering stuff isn't what students should be doing.


Early on, a number of faculty wanted a way to ask students questions via the web. I looked at a variety of the available options at the time (which were much more limited than today) and decided to just write something. In particular, though, I wanted to ameliorate what I saw as the biggest problems with these systems. One problem is that many of the systems required options to be marked as "right" or "wrong": I wanted to have the instructor simply provide feedback. I also wanted a student to be able to interact with a question as long as they wanted to, to explore other options. And I wanted to make it difficult to extract a score from the system: it could show you how a class had approached a question (how many people had chosen each option first and how many total), and you could see any particular student's transcript from a particular quiz session, but it was hard to simply put a number on the performance, to encourage faculty to use it for formative, and not summative, assessment. I was satisfied that the project met its goals: it was very popular with students, students obviously tried to select better answers first (ie, they didn't just explore the options in order), and would often explore a second or third option, even if they'd found the "best" option first. But I was particularly pleased with a somewhat unintended outcome: faculty who used the tool discovered, in writing the feedback for options, that some questions were not very interesting. One said, "If you ask 'how many membranes does the mitochondrion have?' what feedback do you write for '1' or '3' or '4'? 'Wrong'? It's just not a good question." You might ask whether a quizzing system meets my "But what will the students do?" criterion for meaningful activity. It was a compromise.
At the same time, I'll argue that, by not having the computer tell students whether they're "right" or "wrong" and, instead, saying "Here's what your instructor thought", it becomes a communication tool between students and faculty, and not a drilling system.
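A minimal sketch of the design (all names and feedback text here are hypothetical, not the actual system): each option carries instructor-written feedback rather than a right/wrong flag, and the session transcript just records which options a student explored, in order.

```python
# Hypothetical data model: options map to instructor feedback,
# never to a "correct" flag or a point value.
question = {
    "prompt": "Which experiment would best distinguish the two models?",
    "options": {
        "A": "This controls for temperature, but not for light.",
        "B": "Good: this varies one factor while holding the others constant.",
        "C": "Think about what this design would leave confounded.",
    },
}

transcript = []  # (option, feedback) pairs, in the order explored

def choose(option):
    """Show the instructor's feedback and log the choice. Nothing is
    marked right or wrong, and no score is computed: a student can
    keep exploring options as long as they like."""
    feedback = question["options"][option]
    transcript.append((option, feedback))
    return feedback

choose("B")
choose("A")  # exploring a second option even after finding the best one
print(len(transcript))
```

The class-level view the text describes would then just aggregate transcripts (how many chose each option first, and how many in total) without ever reducing a session to a number.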


When I arrived at UMass, the Physics Education Research Group was using a technology called "ClassTalk". It was a Rube Goldberg-esque system that connected Texas Instruments calculators using a wired network and ran some flaky software on them to allow students to log in (up to 4 per calculator) and respond to questions presented in class. This system has generally been replaced with special-purpose devices called "clickers" — in the past year, UMass has discontinued a second-generation system called PRS and adopted a third-generation one called "iClickers", but they're all pretty much the same.

These systems get used in a variety of ways. One fellow would ask students questions and then smack the screen with a stick as he ticked through the items: "Option 1: WRONG! Option 2: WRONG!" This is probably a pretty sub-optimal way to use the technology. What you *can* do is ask much richer and more nuanced questions in class and have students work in small groups to figure out why some items are better answers than others. Since we started using this approach in our intro biology classes, the instructor only speaks for about half the time that class is going on: the rest of the time, students are speaking, either to a small group or to the full class. And attendance is way up. Some faculty, seeing the high attendance, believed it was the result of using the clickers to take attendance, but that's not what's happening — at least in our case. Students appear to attend class because the events in the class are closely aligned with the exams. Faculty that use clickers to take attendance discover that it makes students angry: they don't like just being spied on. They also find it relatively easy to work around: students send one person with a bunch of clickers to click in for friends. Most importantly, they find that none of the clicker systems we've used are reliable enough to actually record every click: they do sampling, not voting. If you try to use one to record every event, you create a headache for yourself: you have to allow much longer for students to reply and deal with record-keeping for students whose "clicks" weren't recorded. We've had a lot more success just using it as a sampling event to drive discussion, and not giving more than token credit for participating. The student attendance isn't driven by the technology: it's the pedagogical approach and the fact that students find their time well-spent.

The key issue, however, is the nature of the tasks you give students. They need to require real problem-solving skills and be difficult enough that most students can't solve them independently. They're really quite different from the kinds of questions you usually ask on an exam. At the same time, they need to be aligned closely with the exam questions. It's a real trick to construct good questions. We could spend a whole weekend just practicing how to write good clicker questions. Our experience has been that questions that require model-based reasoning lead to the best outcomes. Most test-bank questions, when they refer to models at all, tend to be questions about the model. Is it "anaphase, prophase, metaphase and telophase" or "metaphase, telophase, prophase, and anaphase"? Those are questions *about* the model. Or you can have model-using, model-elaborating, model-revising, or multi-model problems. For example, if you introduce colchicine, which inhibits spindle-fiber formation, where will the cell cycle be interrupted? Now it's not enough to just know the order: you have to know what's happening in the different phases. Using clickers has been transformative in how our biology courses work, but it's not just using clickers: we've transformed the nature of the tasks that students do, and that has been what actually made the difference.

Course Websites

Shortly after I arrived in the Biology Department, I began providing ways for faculty to build course websites. This was during the period when, at technology conferences, faculty would give talks about how they had built a website for their class. When we started, there was no centralized support at the University for course websites. At first, we just provided a directory and a few other resources. Later, we built a home-grown templating, or early content-management, system. It provided an attractive "wrapper" and navigation system, a system for preparation pages using duck for pre-class quizzes, wikis, and more. More recently, we've been building Drupal websites that can support image galleries, online forms to collect and aggregate student data, and many other features people need, although increasingly, I'm encouraging faculty to use the learning-management system provided centrally for all the basic functionality. They've migrated to a new system (starting this fall) that actually works.

UMass selected several "learning-management systems", seeming to bet wrong each time. First, they picked an outfit called "Prometheus". Ever heard of them? It was that good. Prometheus was bought by WebCT, so they picked WebCT next, just before it was bought by Blackboard. I made a point of trying out all these systems to be familiar with them and to be able to provide support. But I really couldn't recommend any of them. First, because they were shockingly poor pieces of software. They simply, fundamentally, didn't work. There were constant issues from students who couldn't make the system work. One example: there was an internal messaging feature that you were supposed to be able to configure to forward messages to your email, but I found that about 30% of those emails were never received. Many students could make most of the site work, but the widget to upload files, for example, wouldn't show up for them.

Even more fundamentally problematic is how conservatively they chose to configure the systems. To explain what I mean, let me use a metaphor: if the physical environment of the University were configured like the LMS, there would only be one door into the University and it would be locked: you'd need to enter your account information to get in. Once you got in, the only doors that would be visible were the ones you were allowed to go through: ie, the two or three courses you're taking this semester. All of the other doors would be invisible. Inside the classroom, the blackboards would be covered with glass, the chalk would be locked in a box, and all of the students would be hooded and gagged so they couldn't see one another or say anything. Some faculty would go to the effort to unlock everything and unhood and ungag the students, but many didn't. By default, the system creates a very oppressive environment that may be OK for didactic teaching, but is not really suitable for implementing a constructivist learning environment.

Most recently, UMass has adopted Moodle, which is Free Software and is designed to enable much more co-created environments with students. (I'll talk more about Free Software later.) Moodle has a number of limitations, but I'm encouraging our faculty to explore the features it has that can enable students to work collaboratively (including wikis). I'm very hopeful that Moodle can provide most basic needs and I'll be able to focus on providing support for more interesting and specialized features that faculty and students want.

Technology Trends

Now I want to get out a crystal ball and a soap box and spend a few minutes talking about current trends in technology and where they're taking us. There was a period of time, maybe 5 or 10 years long, when the most common complaint faculty made about students was that their cell phones would ring in class. How rude! Can you believe those students? How hard can it be to turn off your cell phone when you come into class! Now, you don't hear faculty say that anymore. How many of you have had your cell phone ring while you were teaching? It's happened to me. It turns out it's really easy to forget to silence your cell phone when you go to class. And once faculty routinely had cell phones, the complaints stopped.

How many of you ever used a modem to dial up a computer? There was a time when your computer mostly wasn't connected to the internet. To use the Internet, you had to run a special program, listen to some funny noises, and then you could use the internet (slowly) for a while. Then you'd hang up, or get timed out, and go back to being disconnected. That sucked. More importantly, when that ended — when your computer was connected all the time — it changed how you used the internet. It changed the kinds of things you'd use it for and how you thought about it. Now, with smartphones, it isn't just your computer that's connected all the time. You can use the internet from very nearly anywhere: while eating dinner or on the train or even in the bathroom. If this hasn't happened to you, it will. And when it does, it will change how you use the Internet.

One key change is having a network connection all the time, everywhere. A faculty member in my department sent an email to the faculty saying that he didn't like students using laptops in class, so he was planning to submit a petition to the faculty senate requesting that the university administration only turn on the wireless connectivity in a classroom if a faculty member requested it. I pointed out that this was a fatally flawed plan on several levels. First, because the university-provided wifi is part of the general infrastructure of the building — not something associated with a particular classroom. Second, because many students were using their own 3G wireless devices that didn't depend on the University's connectivity anyway. But most of all, because if your pedagogy is so uninteresting that students are surfing the net instead, you should fix your pedagogy — not cripple the infrastructure. Why not have the students do something with the connectivity? Why not get them engaged using it to do something productive?

A current trend related to this issue is sometimes called BYOD: Bring Your Own Device. Universities are currently evaluating whether to maintain their own computer labs or to expect students to bring their own computers (or tablets or smartphones) to provide those services in support of education. At the moment, we still see value in providing computer labs with workstations. Student computers tend to be highly variable and it's difficult to teach when students have different versions of software or operating systems. Apple's move to create a closed platform has the potential of reducing this variability but, at the same time, is such a bad idea that it's hard to be encouraged by it. One faculty member solved the problem in a unique way: as part of the class, each student creates a separate partition on their hard drive and installs a particular version of the linux operating system. In this way, he can be assured that the students are all using the same operating system and software. And, although that could work for almost anyone, most people would probably find it daunting.

Free Software

When I was a graduate student, I used a package called "SuperCard". Have any of you ever heard of SuperCard? It was a program you could use to build something that looked like a Macintosh application. It was pretty cool. I built a program for teaching phylogenetic inquiry as part of my doctoral research and spent several years building it. SuperCard was modelled on HyperCard — an early multimedia system built by Apple — and only worked on Macs. The SuperCard developers very much wanted to build a cross-platform system that would work on both Macs and PCs. This was in the era when Microsoft was working hard to thwart anything that could be cross-platform — around the same time they were working to kill Netscape (for which they were ultimately convicted of illegally using their monopoly). The company making SuperCard went bankrupt trying to build their Windows version just as I was finishing my PhD. As I was getting ready for interviews, Apple released a system update that caused SuperCard to stop working. For a while, it looked like I would be unable to get anyone to use my software and I was really stuck: the package I had used was proprietary, and the only people who could fix it were in the bankrupt company. Luckily, they got a little infusion of venture capital, resuscitated enough to release another version, and I was able to show my software. And I got a job.

What I learned from this, however, was never to become dependent on proprietary software. I sometimes use proprietary software, when it's convenient, but I make sure that everything I do is based on open standards and can be done with Free Software. Does everyone know what Free Software is? You've probably heard of Open Source software and Freeware — those are both different. Free Software is software that preserves your freedom to use the software, not just as a consumer, but as a participant.

The Free Software Foundation was started by Richard Stallman, a programmer at MIT. They had a shared printer that was often pretty busy, and he had hacked the software so that it would send an email to people with queued jobs if the printer was jammed, so someone could go fix it. It was a simple hack and was possible because the software was provided not only in executable form, but also in source form. When the next version of the software came out, however, the sources weren't provided and Stallman couldn't make the fix. Even worse was when Stallman found that a colleague had the source, but couldn't provide it, because he'd been required to sign a non-disclosure agreement to get it. Stallman was livid that the corporations were undermining the potential for collaboration among programmers. He began the process of building a free version of the Unix computer operating system, which ultimately led to what most of you think of as "linux".

There is, however, a whole ecosystem of Free Software. The backbone of the Internet, the Domain Name System, is based on Free Software. Most webpages are served by the Apache webserver package. Many websites use the PHP Hypertext Preprocessor. All Free Software. And Free Software isn't just on the server: LibreOffice is a full office suite with word processor, presentation tool, and spreadsheet. The GNU Image Manipulation Program can do most of the things you would want to do with Photoshop. Inkscape can do most of what you'd want to do with Illustrator. And so on. Unfortunately, the corporations are colluding to try to take your freedom away.

How many people have an iPhone or an iPad? I have an iPhone. It's amazing. One concern I have about Apple, however, is their effort to exert greater and greater control over your device. Your iPhone is a computer, but it really isn't your computer. Apple controls what you can do with it — they let you use it for things that they want you to be able to do, but increasingly they are monetizing your ability to use its capabilities. For example: my phone contains a wifi chip and a cell-phone chip. Therefore, it has the ability to act as a wifi base station and share the connectivity from the cell network with my other devices. And Apple will let me do that — if I pay someone $15/month. How long until Apple wants to charge you a penny to increase (or decrease) the volume? Or the brightness? Who owns the device? Me or them?

In the early days of the Internet, people were amazed to discover that you could just send email for free. Or look at webpages. Or download files. This wasn't an accident. The people who built the Internet had made a key insight: the network should do nothing but efficiently route packets of data between endpoints, with all of the smarts of the system put out at the ends. The middle just moved data. (There's an elegant essay about this thinking called the "Cluetrain Manifesto".) This was all in dramatic contrast to the telephone and later cell-phone networks, where the smarts were all in the middle. They had been built that way to serve corporate and government interests that wanted to monetize, control, and spy on people. A corporation would never have allowed something like the Internet to get built in the first place and, little by little, the freedom it offers is being whittled away.

The government too is uneasy with the ability of the Internet to allow people to organize, and some, like Joe Lieberman, have been working to craft legislation that would give the government an "off switch" so they could turn off the internet if it were inconvenient — ie, if people were using it in ways the government didn't like. You should consider joining the Electronic Frontier Foundation to help preserve your freedom.


When I teach, I spend most of my effort trying to convince students to take charge of their own learning. I challenge them to think about what they need to learn and to use every course they take as an opportunity to meet their needs. I tell them "Don't ask the instructor what you need to do: ask yourself what you need most and use your time in every course to make sure it meets your needs." Few undergraduates approach school this way: it's not what they've been rewarded for in their prior experiences with education.

In fact, one of the key insights I've had is that students will very quickly revert to their prior behavior if you provide any experiences that reify their original conceptions of teaching and learning. For example, if you ask interesting thought-questions in class, but use recall questions on your exams, students will quickly recognize that the thought questions don't matter, and will quit participating.

There's a huge issue of trust: students know how education works. They're experts. They think teachers are trying to trick them, trip them up, and cheat them out of points. And they approach classes by trying to figure out what's the least amount of work they can do to "get through the class" rather than to approach the class as an opportunity for personal transformation.

It requires constant effort and maintenance of the relationship to persuade students to try to satisfy themselves, rather than you: to use their time in your class to do something that they think is genuinely worth the time and effort they're going to put into it. When you can make that happen, you will be astonished at what students can do. Similarly, it takes a big effort to shift your teaching practice. As someone who was very successful in the existing paradigm of teaching and learning (or else you wouldn't be here), you'll find transmissionist ideas embedded throughout your thinking and it requires time, effort, and patience to expunge them all. And you may find that the intoxicating allure of being the center of the classroom is hard to give up, in order to create space where students can use your class for their ends, rather than yours. But I think the outcome is worth it.

I often think of traditional teaching as Sisyphean work: pushing the boulder up the hill every day, while the students watch. I'm suggesting that, if you create the environment the right way, it will be the students pushing the boulder up the hill while you watch — and give helpful suggestions. ("Psst! Go that way!") And technology can be very helpful — if you start off on the right foot. And put the horse before the cart.

ContactCon and conversations worth continuing

When I heard about ContactCon, I signed up almost immediately. The issues being raised have been of interest to me since I started using the Internet: how to make sure the net can be used for empowerment rather than oppression. The net is clearly useful for both, but the trend has been shifting in the wrong direction for years.

A corporation would never have made something like the Internet in the first place. When I was a kid, we still had the old AT&T and Bell Telephone network. You weren't allowed to own a telephone: you were required to rent one from the phone company. And everything was monetized. Now, I suspect that the most expensive thing about current cell-phone operations is the overhead necessary for administration, metering, and billing. And that's the direction we've been going: give users a dumbed-down box that only enables what the monetizers want you to be able to do.

There were a lot of interesting people at ContactCon, most of whom I'd never met before. The demographic was mostly white, largely male, and somewhat younger than me. There were some folks my age or older, but we were the exception. Many were young entrepreneurs and freelancers looking to network to support their projects. It reminded me of the luxury of my current circumstances: I have a steady job and don't need to spend half my time trying to market myself or bill people. I don't have to work on spec or limit what I do to what people are willing to pay for. I get to spend most of my time actually just working and being creative. I lament for this generation that is so circumscribed and limited in its choices -- and will probably end up permanently stunted by the economic conditions that have been imposed on us by the 1%. Or, if you prefer, by my generation's lack of engagement, through which we have allowed ourselves to be disempowered.

I wore my "Official Red Hat" red hat and took my ubuntu netbook to demonstrate my free-software street cred. I actually met the guy who'd ordered the stock of red hats when he worked at Red Hat in that period. I had completely borked my install of Ubuntu a few days earlier (or maybe the update from Easy Peasy had never really worked right), so I completely wiped the netbook the night before and re-installed everything. I've started using Dropbox to maintain the rough drafts of my writing, so it was easy to get my data back. I could have just taken my macbook, but it wouldn't have been as fun. In point of fact, I hardly used it, but it was nice to know it was there.

I met dozens of people, learned about many new projects, and also touched base with projects I've known about but haven't had time to explore. I've been interested in the Freedom Box since I first heard about it: it's consistent with my vision for people having their own server. And it's also the only way to have any assurance of privacy: you can't trust third parties not to reveal all of your private information to the government or corporations.

I organized a discussion about education and unschooling. The audience was very receptive to the ideas, and there were a number of people working on interesting things. The most interesting was probably Be You, but there were many, many others. ContactCon reminded me of what John Jungck used to say about the goals of BioQUEST: to begin conversations worth continuing. I suspect I will continue to interact with some of these people going forward.
