Watch a demo video of the machine.
"Rarely is the question asked: Is our children learning?"
--President Bush, Florence, S.C., Jan. 11, 2000
The 'Is our machines learning?' machine is a networked art installation in which an automated test-taking machine answers real U.S. standardized test questions based on input from visitors to a website. The physical installation consists of a machine that is mechanically capable of marking a Scantron-brand standardized test form with a pencil. In a separate online space, visitors to the website determine which multiple-choice answers the machine in the installation fills in. The website presents a testing interface that delivers standardized test questions drawn from the National Assessment of Educational Progress (NAEP), a U.S. government testing program. When a visiting user answers a question, her response is sent to the remote machine, and she can watch the machine respond to her input in real time via a streaming video feed from the installation. To provide an incentive for interaction, the user can also explore dynamic statistics about fellow users' answers and generate a student profile based on how her answers align with actual NAEP statistics.
The physical installation consists of a custom-designed machine mounted on an antique school desk. The machine is built from laser-cut acrylic pieces, aluminum rods, and electric motors. A PIC microcontroller controls the movement and timing of the motors, driving a pencil to fill in multiple-choice bubbles. The microcontroller is connected over a serial line to a PC in the installation, which relays commands from the project website specifying which test bubble to fill in.
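The wire format between the PC and the PIC is not documented here. As a rough sketch, a bubble-fill command could be framed as a short ASCII line (question number plus option letter) before being written to the serial port; the `Q<number><letter>` protocol and the `encode_bubble_command` helper below are assumptions for illustration, not the installation's actual format:

```python
def encode_bubble_command(question, option):
    """Encode a fill-this-bubble command as a short ASCII line.

    The 'Q<number><letter>\\n' framing is a hypothetical protocol,
    not the installation's actual wire format.
    """
    option = option.upper()
    if option not in "ABCD":
        raise ValueError("unknown option: " + option)
    return "Q{}{}\n".format(question, option).encode("ascii")

# On the installation PC, the encoded command would then be written to
# the serial port connected to the PIC, e.g. with pyserial:
#   import serial
#   port = serial.Serial("/dev/ttyUSB0", 9600)
#   port.write(encode_bubble_command(3, "B"))
```

A line-oriented ASCII framing like this keeps the PIC-side parser trivial: read until newline, then decode the question index and option.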
A network IP camera positioned next to the machine sends video images of its movement to the project website.
The website for this project presents the user with a testing environment in which she can participate by answering multiple-choice questions (see Figure 4). Each time a user responds to a question, a streaming video window shows the machine in the installation marking a multiple-choice option. The site is designed as a multi-user experience: if several people are answering one of the website's questions simultaneously, the system determines the most common response among the current set of users, and that consensus answer is sent to the machine to pencil in. The site keeps each user informed of the activity of all other users via custom status icons on its interface. If one person answers a question before everyone else, she sees who is still answering while the remaining time for that question elapses.
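The consensus step amounts to taking the mode of the answers submitted so far. A minimal sketch, using a hypothetical `pick_consensus` helper and assuming ties break toward the earliest-seen answer (the installation's actual tie-breaking rule is not specified):

```python
from collections import Counter

def pick_consensus(answers):
    """Return the most common answer among the current users' submissions.

    Ties are broken by the earliest-seen answer, an assumption made
    here for illustration.
    """
    if not answers:
        raise ValueError("no answers submitted")
    counts = Counter(answers)
    # Counter preserves insertion order, and most_common's sort is
    # stable, so equal counts resolve to the first-seen answer.
    return counts.most_common(1)[0][0]
```

For example, `pick_consensus(["A", "C", "C", "B"])` yields `"C"`, which would then be sent to the machine to pencil in.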
Before launching into a testing session, the site takes users through a registration page of profiling questions. This login survey begins with standard age and race questions and progresses to more ridiculous ones, such as whether the user prefers Coke or Pepsi. After registering, multiple-choice questions are presented to the user one at a time. Once the time for a question expires, the website's backend tallies the answers of all visitors on the site at that instant. Beside the test question, the user sees a graph visualizing the number of users who chose each multiple-choice option. Drawing on information gathered from the registration survey, users can interact with this graph to see how different categories of users stack up on each question. Above the graph, the user sees the machine come to life in a streaming video window.
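Slicing the tally by registration categories amounts to grouping each user's answer under a profile field before counting. A sketch of that grouping, assuming hypothetical response records with `profile` and `answer` fields (not the site's actual data model):

```python
from collections import Counter, defaultdict

def tally_by_category(responses, field):
    """Count answers per value of one profile field (e.g. "age" or "soda").

    `responses` is a list of dicts shaped like
    {"profile": {"soda": "Coke"}, "answer": "B"} -- a hypothetical
    record layout used here only for illustration.
    """
    buckets = defaultdict(Counter)
    for r in responses:
        buckets[r["profile"][field]][r["answer"]] += 1
    return dict(buckets)
```

The resulting per-category counts are exactly what the interactive graph needs: one bar group per multiple-choice option, filterable by any survey field.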
The multiple-choice questions used on the website come from official standardized tests administered by the National Assessment of Educational Progress, a U.S. government testing program. The NAEP maintains a website of example questions from past tests, each accompanied by a statistical breakdown of achievement scores. The site uses these statistics to "grade" users on the questions they answer. After every few questions, the site prompts the user with the option of seeing her grade or continuing to answer questions. If she chooses to see her grade, she is taken to a dynamic webpage that matches her answers against the statistics taken from the NAEP website. Instead of a simple numeric score, the site grades the user with a probable student profile: likely gender, ethnicity, and school type (i.e. public, private, or Catholic), based on how her answers match the percentages the NAEP has recorded from its standardized tests.
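One simple way to turn per-question percentages into a "probable student profile" is to score each demographic group by summing the NAEP-reported rate at which that group gave the user's answer, then pick the highest-scoring group for each dimension. This is a guess at the mechanism, sketched with a hypothetical `likely_group` helper and made-up statistics:

```python
def likely_group(user_answers, group_rates):
    """Pick the group whose answer rates best match the user's answers.

    `group_rates[group][question][option]` is the fraction of that group
    choosing each option, e.g. {"public": {1: {"A": 0.6, "B": 0.4}}}.
    Scoring by summed rates is an illustrative assumption, not the
    site's actual grading algorithm.
    """
    scores = {}
    for group, per_question in group_rates.items():
        scores[group] = sum(
            per_question[q].get(ans, 0.0) for q, ans in user_answers.items()
        )
    return max(scores, key=scores.get)
```

Run once per profile dimension (gender, ethnicity, school type), this yields the kind of composite student profile the site reports in place of a numeric score.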