University of Massachusetts Amherst

ECE Places Two Teams in the National Cornell Cup Competition

The Electrical and Computer Engineering Department placed two teams of student innovators in the final field of 24 teams from across the nation that competed in the Cornell Cup on May 4 and 5 at Walt Disney World in Orlando, Fla. One ECE team designed and built an “Automated Aero-Painting System,” a model “quadrocopter” (a helicopter with four propellers) that can spray-paint. The other team created an “Augmented Reality Head-Up Display,” a wearable augmented-reality system displayed on the lenses of goggles and capable of creating an immersive 3-D environment. Cornell Cup USA, presented by Intel, is a college-level competition created to challenge student teams to design and build innovative applications of embedded technology.

An embedded system is a computer designed to perform specific control functions within a larger system, and such systems are used in everything from digital watches to nuclear power plants. The technology is embedded as part of a complete device, often including hardware and mechanical parts. It is predicted that there will be 50 billion of these devices worldwide within four years.

"We’re asking the top 24 teams nationwide to showcase their latest inventions in embedded design,” explains David Schneider, academic coordinator of Cornell Cup USA. “We’re asking them what would be the one product they can come up with that will really move embedded design forward?”

At the Cornell Cup, students have the opportunity to enhance their resumes and demonstrate their professional design skills, highly sought by today’s companies, as they transform their ideas into robust reality. The competition is an academic year-long experience culminating in the two-day summit event at Walt Disney World, where finalists attend talks, network with leading engineering company sponsors, and ultimately showcase their original entries.

After an application round during the fall 2011 semester, the 24 finalists were awarded funding and equipment to help them develop their entries. At the finals, the top three finalists win grand prizes of $10,000, $5,000, and $2,500. In addition, approximately half of the finalists receive a first-, second-, or third-place award as formal recognition of their efforts. Sponsors may also offer special awards to teams beyond the three official Cornell Cup awards.

Under the leadership of ECE faculty advisors Csaba Andras Moritz and Roderic A. Grupen, the Automated Aero-Painting System team equipped a quadrocopter with a spray-paint canister and programmed it to paint a figure autonomously on a vertical surface.

“What’s exciting and unique about this project is that not only does it make an approach to unmanned aerial vehicle automation, but also proposes a completely new application,” the team notes in its project description. “The challenge is to design an autonomous system that involves real-time actuator control and constant feedback evaluation during flight.”

To meet the needs of that design, this project consists of three main components: a Base Processing Unit (BPU); a Quadrocopter Intel Atom Processor (CPU); and an Actuator Control System (ACS). The BPU guides and commands the quadrocopter during task execution. The quadrocopter’s CPU relays information regarding its position and stabilization to the BPU. The ACS interfaces with the spray paint canister, as coordinated by the BPU and the quadrocopter’s CPU.
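
As a rough illustration of how the three components might coordinate, here is a minimal Python sketch of the base unit’s control loop. The names (PoseUpdate, ActuatorControl, BasePU) and the waypoint logic are hypothetical stand-ins, not the team’s actual software.

    # Illustrative sketch only: class names and waypoint logic are hypothetical,
    # not taken from the team's implementation.
    from dataclasses import dataclass

    @dataclass
    class PoseUpdate:              # position/stabilization data relayed by the quadrocopter's CPU
        x: float
        y: float
        z: float
        stable: bool

    class ActuatorControl:         # ACS: interfaces with the spray-paint canister
        def set_spray(self, on: bool) -> None:
            print("spray", "ON" if on else "OFF")

    class BasePU:                  # BPU: guides the quadrocopter through the painting task
        def __init__(self, waypoints):
            self.waypoints = list(waypoints)   # points that trace the figure to be painted

        def step(self, pose: PoseUpdate, acs: ActuatorControl):
            """Process one pose update and return the next setpoint for the flight CPU."""
            if not self.waypoints:
                acs.set_spray(False)           # figure finished
                return None
            target = self.waypoints[0]
            on_target = abs(pose.x - target[0]) < 0.05 and abs(pose.y - target[1]) < 0.05
            acs.set_spray(on_target and pose.stable)   # paint only when stable and on target
            if on_target:
                self.waypoints.pop(0)          # advance to the next point of the figure
            return target

    # Example: paint a short horizontal stroke.
    bpu = BasePU([(0.0, 1.0), (0.2, 1.0)])
    bpu.step(PoseUpdate(x=0.0, y=1.0, z=1.5, stable=True), ActuatorControl())

In this sketch, the base unit steps through the waypoints that trace the figure and enables the sprayer only when the quadrocopter reports a stable pose at the current target.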

The team members are Adib Khozouee, Christopher Brennan, Edmar Gonçalves, and Ejiroghene Urhiafe.

The Augmented Reality Head-Up Display aims high. “Our goal,” reads the project description, “is to step up [from] the infantile stage of Augmented Reality to build a portable system that is capable of capturing live sensing data of the users and then use this information in real time application.”

The design consists of a sensor unit, a computational Intel Atom Processor, and a goggle head display. The sensor unit, with four Micro-Electro-Mechanical Systems (MEMS) sensors (a GPS, a compass, an accelerometer, and a gyroscope), captures the user’s live position and movement data and sends that information to the processor over a USB cable. The processor feeds these data into a custom graphical application, renders the result through an integrated GPU, and projects the output onto the goggle display. For the prototype, the team is generating a virtual world suitable for live-action gaming.
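
To make that data flow concrete, the following minimal Python sketch mirrors the sense-process-display loop described above. The class and function names (SensorUnit, read_pose, render_overlay) and the fixed sample values are hypothetical placeholders, not the team’s code.

    # Illustrative sketch only: names and values here are hypothetical placeholders.
    import time
    from dataclasses import dataclass

    @dataclass
    class Pose:
        lat: float       # from the GPS
        lon: float
        heading: float   # from the compass
        pitch: float     # from the gyroscope/accelerometer
        roll: float

    class SensorUnit:
        """Stands in for the USB-attached sensor board."""
        def read_pose(self) -> Pose:
            # A real implementation would parse packets arriving over the USB cable.
            return Pose(lat=42.39, lon=-72.53, heading=90.0, pitch=0.0, roll=0.0)

    def render_overlay(pose: Pose) -> None:
        # Placeholder for the graphical application rendered on the goggle display.
        print(f"frame: heading={pose.heading:.1f} pitch={pose.pitch:.1f} roll={pose.roll:.1f}")

    def main_loop(frames: int = 3, hz: float = 30.0) -> None:
        sensors = SensorUnit()
        for _ in range(frames):           # a real HUD would run until shutdown
            pose = sensors.read_pose()    # live position and movement data
            render_overlay(pose)          # feed it to the graphics pipeline in real time
            time.sleep(1.0 / hz)

    main_loop()

A real system would parse the packets arriving from the sensor board over USB and hand each pose to a GPU-rendered scene rather than printing it.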

The ECE faculty advisor is Tilman Wolf, and the student team is made up of To Chong, Ryan Offir, James Kestyn, and Matt Ferrante. (May 2012)