We set up our bidirectional LED circuit according to the diagram shown in the last post. We followed all the instructions stated in the paper on the implementation of a simple bidirectional LED demo, but it just doesn't work!! There are, I think, two possibilities why this is not working:
- We did something seriously wrong
- Something is wrong with the microcontroller
At the moment, I really don't know how to fix this, but here is a picture of the setup, followed by the source code:
#define digi_fwd 2  // pin driving the forward-bias direction
#define digi_rev 3  // pin driving the reverse-bias direction
int count = 0;      // discharge-time counter
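The fragment above only shows the pin definitions and counter. For reference, a fuller read cycle along the lines of the Dietz paper might look like the sketch below. This is our own reconstruction, not the actual sketch we ran, and the Arduino calls are stubbed out so the logic can be compiled and traced on a desktop machine; on real hardware the stubs would be the genuine pinMode/digitalWrite/digitalRead.

```cpp
#include <cassert>

#define digi_fwd 2  // pin on the LED's anode side
#define digi_rev 3  // pin on the LED's cathode side

// --- stubbed hardware layer (simulation only) ---
// The fake "charge" models the LED's stored charge after reverse biasing.
enum PinDir { INPUT, OUTPUT };
static int charge = 0;

void pinMode(int /*pin*/, PinDir dir) {
    if (dir == INPUT) charge = 500;        // pin floats with the LED charged
}
void digitalWrite(int /*pin*/, int /*level*/) {}
int digitalRead(int /*pin*/) { return --charge > 0 ? 1 : 0; }

// One sensing cycle: reverse bias the LED, float the cathode-side pin, and
// count how long the pin stays high while photocurrent bleeds charge away.
long readLed() {
    pinMode(digi_fwd, OUTPUT);
    pinMode(digi_rev, OUTPUT);
    digitalWrite(digi_fwd, 0);             // anode side low...
    digitalWrite(digi_rev, 1);             // ...cathode side high: reverse bias
    pinMode(digi_rev, INPUT);              // float the pin
    long count = 0;
    while (digitalRead(digi_rev)) ++count; // brighter light -> smaller count
    return count;
}
```

One debugging idea this suggests: if the count never terminates or is always zero, the pin may not actually be floating in input mode, which is worth checking before blaming the microcontroller.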
The Arduino Sketch can be downloaded here
Filed under: Electronics
We finally received the parts we ordered from SparkFun Electronics. Here are some pictures of the microcontroller, Bluetooth module, and accelerometer!! At the moment we are hoping to familiarize ourselves with the microcontroller first, and hopefully we will be able to implement a prototype of the multitouch LED display as soon as possible.
The Arduino Stamp ships with an on-board ATmega168 microcontroller, and it comes with a free, open-source, multi-platform IDE (currently available for Mac OS X, Windows, and Linux). The Arduino Stamp connects to the computer via the serial port; due to the compact design of the stamp, an additional adapter is required. The following is a picture of the Arduino Serial USB Board. Using the FT232RL, the Arduino Serial USB Board emulates a COM port on the computer, allowing communication between the computer and the Arduino Stamp. This is especially convenient since my MacBook doesn't come with a serial port =)
We also bought the BlueSMiRF Bluetooth module from SparkFun. The Bluetooth module connects to the Tx/Rx pins on the Arduino Stamp and shows up as a Bluetooth serial port on the computer side. It is capable of transmitting a serial stream at 9600 to 115200 bps from as far as 160 m.
And here is our last piece of goodies... the IMU 5 Degrees of Freedom from SparkFun. The "Inertial Measurement Unit" is only one square inch in size, and it ships with an on-board 3-axis accelerometer from Analog Devices and a dual-axis gyro from InvenSense. By combining these sensors, this board will allow us to sense five degrees of freedom (roll, pitch, X, Y, Z). Mounting this in our device will hopefully allow us to sense its rotational and translational movements.
- Arduino Stamp $38
- Arduino USB Board $21
- BlueSMiRF $65
- IMU 5 DOF $110
Since there isn't really much information on multitouch LED displays, we started off with some background research on the subject. The paper by Paul Dietz, William Yerazunis, and Darren Leigh talks in detail about using LEDs as both input and output devices.
Part A of the above diagram illustrates the normal operating condition of a typical LED driver. Current flows from the anode to the cathode of the LED, and the LED emits light. However, an LED is also a photodiode, and it is therefore sensitive to light.
Part B of the diagram illustrates an LED that is reverse biased. Under this condition, the LED acts as a capacitor, and the reverse bias charges up its capacitance.
In Part C of the diagram, the I/O pin that was previously at VCC is switched to input mode. This allows the photocurrent to discharge the capacitance built up in the LED; by timing how long it takes the LED to discharge, we can measure the amount of photocurrent (Dietz, Yerazunis, and Leigh). If we extend this to an LED matrix, then it is possible to create a multitouch LED display.
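The timing measurement in Part C can be made concrete with a small numerical model. The component values below are assumptions for illustration only (the paper does not give them); the point is just that the discharge count is inversely proportional to the photocurrent, i.e. to the incident light:

```cpp
#include <cassert>

// Assumed values, for illustration only.
const double C_LED = 50e-12;   // LED junction capacitance, ~50 pF
const double V_CC  = 5.0;      // reverse-bias voltage (VCC)
const double V_TH  = 2.5;      // logic threshold of the input pin
const double DT    = 1e-6;     // one polling step = 1 us

// Count polling steps until the reverse-biased LED discharges below the
// threshold. Each step drops the voltage by dV = I * dt / C, so the count
// is roughly C * (Vcc - Vth) / (I * dt).
long stepsToDischarge(double photocurrentAmps) {
    double v = V_CC;
    long steps = 0;
    while (v > V_TH) {
        v -= photocurrentAmps * DT / C_LED;  // photocurrent bleeds charge away
        ++steps;
    }
    return steps;
}
```

With these numbers, ten times more light (ten times the photocurrent) discharges the LED in roughly one tenth the number of steps, which is exactly the quantity the timing loop measures.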
- E. Fred Schubert, Light-Emitting Diodes
- Paul Dietz, William Yerazunis, and Darren Leigh, Very Low-Cost Sensing and Communication Using Bidirectional LEDs
- Jonathan Pak, The Light Matrix: An Interface for Musical Expression and Performance
- Scott E. Hudson, Using Light Emitting Diode Arrays as Touch-Sensitive Input and Output Devices
Now it really seems like the original idea of using the Force Sensing Resistors isn't going to work. After talking to some other people, we came up with a few other possible ideas.
We can try to construct a 2D multitouch surface using the Force Sensing Resistors. This sounds like a workable solution, but we would like to try something new.
We can try to construct a 2D multitouch surface using the novel webcam/projector method by Jeff Han, a researcher at NYU.
We can also try to construct a multitouch LED display, also by Jeff Han. We like this idea a lot, especially after seeing this really cool YouTube clip. There is no documentation whatsoever on the web concerning the construction of such a device. Professor Anind suggested that we seek advice from Johnny. This does seem to be a risky choice since there are a lot of potential problems with such a device, but we like the idea and want to give it a shot.
Filed under: Uncategorized
During the past week, we did some research on the force sensing resistors and evaluated the feasibility of implementing such a device. The final verdict: expensive... hard to implement... messy... might not work. Yes, we did indeed come to the above consensus.
- The force sensing resistors shown below cost at least $6 apiece, and we need at least 40 to 50 pieces to cover the surface of the entire sphere. We did take this into consideration when writing up the proposal, but there are three people in the group, and it is hard to have three people working on just one device.
- In the picture, you can see that the lead (the long plastic connector) extending from the force sensing resistor is quite long, and we need to take special care in bending it. Then comes the problem of how to fit all 40 to 50 leads inside a sphere and how to connect all of them to the microcontroller. We could certainly use multiplexers, but the whole setup would be messy and tedious, and we have very little time to do this...
- There is a good chance that this will not work. We want a 3D multitouch device, and the capacitance-based touch-sensitive devices we know of do not support multitouch. The only capacitance-based multitouch device we know of is the iPhone, which can sense up to two fingers. There are obvious technical problems with this concept, and we are not sure we will have the time or the financial support to experiment with such a design.
A NOVEL MULTIMEDIA INPUT DEVICE: THE ELECTRO-SPHERE
Students: Jie Jin, Yen-Wen Liu, Peter Foon-Wang Pong
Faculty Advisor: Prof. Anind Dey
Motivated by a frustration with the limitations of conventional electronic input devices (such as the ubiquitous mouse and keyboard), we propose a novel input device, the “Electro-sphere”, that allows users to input and control data through the use of a touch-sensitive sphere. This sphere takes the shape of a stress ball padded with force sensitive resistors underneath the surface, allowing users to manipulate data directly with their hands. This input device will extend the workspace environment to three dimensions, and will allow for intuitive and immediate data manipulation.
In this project, we will explore the usability and practicality of the "Electro-sphere" as a novel input device. Computer input devices on the market today only allow users to move on a 2-d plane on a screen; we would like to extend this limited interaction to a 3-dimensional interface. We hypothesize that the proposed device will greatly reduce the usability constraints imposed by conventional input devices. We propose that the 3-dimensionality of the sphere is ideal for this purpose: it will allow for direct, perceptive, and user-friendly data management and manipulation.
The electro-sphere will allow users to use it both as a 3-dimensional modeling agent and as an intuitive navigational device. Both functions are served well by the device, whose shape is ergonomic in its ability to be easily manipulated (in both orientation and position). Other common 3-dimensional input devices (such as force- and movement-sensitive gloves, or camera-aided tracking systems which follow one's movements) are either unwieldy and overly constraining or exorbitantly expensive to implement (computationally and materially). The electro-sphere is an intuitive and elegant solution that is also low-cost and easy to implement.
We propose that the electro-sphere has potential applications in 3-d modeling, electronic art and media authoring, entertainment (as pioneered by the Wii controller), and scientific/mathematical modeling. We will explore some of these potential applications in our software visualizations and demonstrations.
On the hardware level, we will build the device by placing force-sensitive resistors underneath the surface of the sphere. These sensors, commonly used in cell phone touch pads and touch sensitive applications, will send analog signals to the microcontroller placed further inside the sphere. Other sensors, including the accelerometer and the digital gyroscope placed in the core of the sphere, will allow us to sense the movement and angular velocity of the sphere, letting us interpolate its position and orientation. These sensors will also communicate with the microcontrollers, which will then relay signals to the computer via the Bluetooth module installed underneath the sensors. Finally, we will install a pager motor, which will provide feedback to the user in the form of small vibrations.
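As a sanity check on the scanning part of this design, here is a sketch of how the microcontroller might cycle through the FSR array via multiplexers each sampling cycle. The channel count, pin numbers, and mux/ADC calls are stubbed-out assumptions for illustration, not the final firmware:

```cpp
#include <cassert>

const int NUM_SENSORS = 40;  // FSRs padded under the sphere's surface

// --- stubbed hardware layer (simulation only) ---
// On real hardware these would drive the mux address pins and the ADC.
static int selectedChannel = 0;
void selectMuxChannel(int ch) { selectedChannel = ch; }
int  analogRead(int /*pin*/)  { return selectedChannel * 10; }  // fake pressure

// One sampling cycle: route each FSR to the ADC input in turn and record
// its value. The resulting frame is what the microcontroller would relay
// to the computer over the Bluetooth module.
void scanSensors(int readings[NUM_SENSORS]) {
    for (int ch = 0; ch < NUM_SENSORS; ++ch) {
        selectMuxChannel(ch);
        readings[ch] = analogRead(0);
    }
}
```

Scanning through multiplexers like this keeps the pin count manageable: a handful of address lines plus one ADC input can cover all 40 sensors.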
On the software level, we will develop a library of APIs (Application Programming Interfaces) for the device. This will allow other programmers to add new functionality to our software packages without needing to know the specific implementation details.
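To make the idea concrete, a first cut of such an API might look like the following. Every name here (`ElectroSphere`, `Touch`, `onTouch`, and so on) is a placeholder of our own invention, not a finalized interface; the point is that client code registers callbacks and never touches the serial or Bluetooth layer directly:

```cpp
#include <cassert>
#include <functional>
#include <utility>
#include <vector>

struct Touch { int sensorId; int pressure; };  // one FSR reading
struct Orientation { double roll, pitch; };    // derived from the IMU

// Callback-based interface: applications subscribe to touch frames and
// the library hides how those frames arrive from the hardware.
class ElectroSphere {
public:
    using TouchHandler = std::function<void(const std::vector<Touch>&)>;

    void onTouch(TouchHandler h) { handler_ = std::move(h); }

    // In the real library this would be fed by the Bluetooth stream; it is
    // exposed directly here so the dispatch logic can be exercised.
    void injectFrame(const std::vector<Touch>& frame) {
        if (handler_) handler_(frame);
    }

private:
    TouchHandler handler_;
};
```

A plug-in author would then only implement a handler, e.g. rotating a Google Earth view when two pressure points move apart, without knowing anything about the mux scanning or serial framing underneath.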
One anticipated problem will be filtering the noisy sensor signals. We will also create several visualization and demonstration software packages in order to truly test the functionality and usability of the device. These will include plug-ins for Google Earth and for popular 3-d modeling software such as 3D Studio and Maya. We will also create software packages that allow users to truly experience the potential of the sphere to manipulate multimedia data.
The following diagram illustrates the hardware design implementation:
Impact on CREU
This project will further the goal of the CREU program (which is, as the website states, to “increase the numbers of women and minorities who continue on to graduate school in computer science and engineering.”) in that it will give all the members involved a great deal of research experience, and will further their academic goals. It will spark (and has sparked) a great deal of interest in the multi-disciplinary fields of computer science, electrical and computer engineering, human computer interaction, and electronic design. Finally, it will help its members gain experience working both collaboratively and independently as part of a team, in synthesizing different skill sets and technical abilities in order to create one cohesive product.
Student Activity and Responsibility
It will be the student researchers' responsibility to maintain proper documentation of progress on the project, both in the form of a written log and an up-to-date website. Students will also meet weekly with the advisor to discuss the group's progress and any further collaborative work needed. Each student, having worked in different disciplines and research areas (such as HCI, robotics, and machine learning), will work collaboratively and effectively in bringing these varied but necessary skill sets together.
Faculty Activity and Responsibility
It will be the responsibility of the faculty mentor to provide technical assistance, as well as general guidance on the project.
I Final Design Sept 2007
We will have finished our final design for the physical layout of the “Electro-Sphere” and ordered the appropriate supplies. The design will be laid out in SolidWorks and will be subjected to small modifications.
II Hardware Portion Nov 2007
The hardware is the part of the project that we anticipate to be the most difficult, and the schedule may change; however, we plan to have it completed by December at the latest. The device will be fully assembled: equipped with a microcontroller and connected to the force sensing resistors. The microcontroller will be programmed to take simple commands and to output sensor data to the computer via Bluetooth.
III Software Portion Feb 2008
The software portion will be done in Processing. We will create a GUI that displays various visualizations on a screen, allowing users to manipulate objects and data using the device.
IV Testing April 2008
The project should be completed by April and we plan to refine our design and prepare for the poster presentation in Meetings of the Minds in May.
Microcontroller – Arduino Stamp $35
Serial to USB module $20
Bluetooth Module $60
Force Sensing Resistors (40) $200
Accelerometer and Gyroscope $100
Materials for physical structure $50
Power Source $20
Total : $485
A PDF version of the proposal can be downloaded here!
Filed under: Uncategorized